Artificial Intelligence Accurately Diagnoses Skin Cancers

A new study conducted by an international team of researchers suggests that artificial intelligence (AI) may be better than highly trained humans at detecting certain skin cancers...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machine learning, meet quantum computing

Back in 1958, in the earliest days of the computing revolution, the US Office of Naval Research organized a press conference to unveil a device invented by a psychologist named Frank Rosenblatt at...

Japan’s Grand Strategy and Military Forces (I)

[Source: Consulate-General of Japan, Sydney]

This is the first in a series of Orders of Battle (OOB) posts, which will cover Japan, the neighboring and regional powers in East Asia, as well as the major global players, with a specific viewpoint on their military forces in East Asia and the Greater Indo-Pacific. The idea is to provide a catalog of forces and capabilities, but also to provide some analysis of how those forces are linked to the nation’s strategy.

The geographic term “Indo-Pacific” is a relatively new one, and is referred to by name in the grand strategy detailed by the Japanese Ministry of Foreign Affairs (MOFA) in April 2017. It also aligns with the strategy and terminology used by US Defense Secretary James Mattis at the Shangri-La conference in June 2018. Dr. Michael J. Green has a good primer on the evolution of Japan’s grand strategy, along with a workable definition of the term:

What is “grand strategy”? It is the integration of all instruments of national power to shape a more favorable external environment for peace and prosperity. These comprehensive instruments of power are diplomatic, informational, military and economic. Successful grand strategies are most important in peacetime, since war may be considered the failure of strategy.

Nonetheless, the seminal speech by Vice President Pence regarding China policy on 4 October 2018 offered an articulation of Chinese grand strategy: “Beijing is employing a whole-of-government approach, using political, economic, and military tools, as well as propaganda, to advance its influence and benefit its interests in the United States.” The concept of grand strategy is not new; Thucydides is often credited with the first discussion of it in his History of the Peloponnesian War (431-404 BCE). It is fundamentally about the projection of power in all its forms.

With the Focus on the Indo-Pacific Strategy, What About the Home Islands? 

[Source: Japanese Ministry of Defense (MOD) ]

The East Asian region has some long-simmering conflicts, legacies of past wars such as World War II (or the Great Pacific War) (1937-1945), the Korean War (1950-1953), and the Chinese Civil War (1927-1949). These conflicts led to static and stable borders, across which a “military balance” is often referred to; think-tank publications often use the term, and the International Institute for Strategic Studies (IISS), for example, offers a publication with this title. The points emphasized by the IISS in the 2018 edition are “new arms orders and deliveries graphics and essays on Chinese and Russian air-launched weapons, artificial intelligence and defence, and Russian strategic-force modernisation.”

So the Japanese military has two challenges. The first is maintaining the balance of power at home, that is, playing defense, against neighbors who are developing and deploying new capabilities that have a material effect on this balance. The second, as seen above, is building an offense as part of the new grand strategy, in which military forces play a role.

Given the size and capability of the Japanese military forces, it is possible to project power at great distances from the Japanese home waters. Yet, as a legacy of the Great Pacific War, the Japanese do not technically have armed forces: the constitution, imposed by the Americans, officially renounces war as a sovereign right of the nation.

In July 2014, the constitution was officially “re-interpreted” to allow collective self-defense. Previously, if the American military came under attack, for example in Guam, nearby Japanese military units could not legally engage the attacking forces, even though the two are allied nations that conduct numerous training exercises together; that is, they train to fight together. The re-interpretation caused significant policy debate in Japan.

More recently, as was an item of debate in the national election in September 2018, the legal status of the SDF is viewed as requiring clarification, with some saying the forces are altogether illegal. “It’s time to tackle a constitutional revision,” Abe said in a victory speech.

The original defense plan was for the American military to defend Japan. The practical realities of the Cold War and the Soviet threat to Japan ended up creating what are technically “self-defense forces” (SDF) in three branches:

  • Japan Ground Self-Defense Forces (JGSDF)
  • Japan Maritime Self-Defense Forces (JMSDF)
  • Japan Air Self-Defense Forces (JASDF)

In the next post, these forces will be cataloged, with specific capabilities linked to Japanese strategy. As a quick preview, the map below illustrates the early warning radar sites, airborne early warning aircraft, and fighter-interceptor aircraft, charged with the mission to maintain a balance of power in the air, as Russian and Chinese air forces challenge the sovereignty of Japanese airspace. With the Russians, this is an old dance from the Cold War, but recently the Chinese have gotten into this game as well.

[Source: J-Wings magazine, December 2018]

Data and IoT: The Secret Sauce

Data and IoT: Discussion about the Internet of Things often centers around sensors and hardware, the additions to our physical environments. They may be built into street lights and bus stops, or in...

6 Business Intelligence Trends to Watch For in 2019

The goal of business intelligence (BI) is to thoughtfully and purposefully collect and analyze past information to support an organization and make better decisions about it. As the new year...

The State of Analytics Degrees in Universities

If you want to hire students from universities with strong analytical skills, you need to know the landscape of available programs and skills. Some believe that universities move slowly, but many...

How to future-proof your IT job in the age of AI

Could a robot do your job? Could you help a robot do its job? If you are thinking about your career development and where you’d like to be a year from now, it’s time to ask yourself these questions....

Accelerating Data Sharing

Figshare and Digital Science published the 2018 State of Open Data report this week. Based on an annual survey run with us at Springer Nature since 2016, the report tracks changes in...

Datascience; more bang for the buck

Bang

'I made some Python code that really rocks, Ronald'

'It extracts data from various sources, validates it, does some cleansing, codes xxx business rules, lots of integration of course, executes some kind of predictive model, outputs the data and visualizes it'.

And then the magical words are uttered: let’s deploy it to production…

Alas, the magic of datascience ends abruptly. IT is probably blamed for not being agile, and architects are scorned for being too restrictive, even for killing innovation in the process.

Datascience has a problem: it fails to operationalize its brilliant models, and it therefore fails to deliver value to the business. There, I said it. I know it is pretty polarizing, but I encounter it on a daily basis now. Datascience needs to grow up…

It’s all about: (1) the required quality of services and (2) separating concerns. Neither seems to be that important in datascience. They should be!

Let me clarify:

Quality of services

Two use cases:

(a) deploying a risk model at scale (let’s say 500K transactions per day) that evaluates a transaction (from a customer) in real time based on contra-information, determining in the process the level of supervision needed. Oh, and by the way: one has to take ‘equality of rights’ into account, since the organization is publicly owned.

(b) doing a one-time analysis on various sources, using advanced machine learning, where the output is used for a one-time policy-influencing decision.

The quality of services between (a) and (b) is like night and day. (a) needs to run at scale, in real time (direct feedback), using contra-information; the provenance is hugely important; it is subject-based, so there are privacy concerns; it is an automated decision (there is heavy legal shit here); equality of rights requires metadata (what model did we use on what transaction, what data did we evaluate, …); and many more.

(b) is a one-off… its output influences new policy or contributes to some insight. Your quality of services might be that the model is versioned, properly annotated, and that the dataset is archived to ensure repeatability.
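The contrast between the two use cases can be made explicit up front. A minimal sketch, assuming hypothetical field names (nothing here is prescribed by any standard or tool), might look like this:

```python
from dataclasses import dataclass

# Hypothetical checklist for stating the required quality of services
# before the analytic work starts. All field names are illustrative.
@dataclass
class QualityOfServices:
    scale_per_day: int        # expected transaction volume
    realtime: bool            # direct feedback needed?
    provenance: bool          # must every decision be traceable?
    subject_based: bool       # do privacy concerns apply?
    automated_decision: bool  # legal constraints on automated decisions?
    repeatable: bool          # versioned model plus archived dataset?

# Use case (a): risk model at scale, heavy legal and privacy constraints.
case_a = QualityOfServices(500_000, True, True, True, True, True)

# Use case (b): one-off analysis feeding a single policy decision.
case_b = QualityOfServices(1, False, False, False, False, True)
```

Writing the requirements down this explicitly makes the night-and-day difference between (a) and (b) visible to everyone in the portfolio process.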

My point is that, whenever you start on an analytic journey, you should establish the quality of services you require beforehand, as much as possible. For that to happen you need a clear, explicit statement of how the required information product contributes to the bottom line. So yes: a proper portfolio management process, a risk-based impact assessment (!) and deployment patterns (architecture!) that are designed in advance!

With regard to datascience it is vital to make a conscious choice, before you start, about the required quality of services. If these services are high, you might want to work closely with system engineers, datamodelling experts, rule experts, legal experts, etc. Only then might you be able to deploy stuff and generate the value the field of datascience promises us.

Separation of concerns

For those who do not know what ‘separation of concerns’ means, start with Wikipedia or google Edsger Dijkstra, one of the greatest (Dutch) computer scientists…

Anything IT-related suffers from the ‘overloading concerns’ issue. Some examples:

  • XBRL is a great standard, but suffers from overloading: integrity, validation, structure, meaning and presentation concerns are all bundled into one technical exchange format.
  • Datavault is a great technical modeling paradigm, but it does not capture logical, linguistic or semantic concerns, and yet the data modelling community keeps trying.
  • Archimate is a great modeling notation in the Enterprise Architecture arena, so why is it overloaded with process concerns? BPMN is a much better choice.

And of course we see it in code, as we have for ages in all programming languages: the human tendency to solve every challenge with the tool one is most familiar with or trained in. Datascience is no different. Failing to separate concerns lies at the root of many software-related problems: maintainability, transparency, changeability, performance, scalability and many, many more.

Like the example I started this blog with:

A brilliant Python script in which a staggering number of concerns have all been dealt with. This might not be a problem when the required quality of services is not that high. But when it is, it becomes painfully clear that ‘deploying’ this code to production is a fantasy.

Extraction, validation, cleansing and integration concerns might be better dealt with by tools and techniques from the information (modeling) arena.

Business rules might be better designed (for example) by means of RuleSpeak, making them more transparent for legal people and domain experts (which is, by the way, a huge concern, especially in AI!).

Visualization and presentation might be better served by tools your organization has already purchased, be it Tableau, SAS Visual Analytics, Qlik, Tibco or whatever.
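As a sketch of what separating these concerns can look like in code (every function name here is hypothetical, and in practice each stage could be delegated to a dedicated tool):

```python
# Hypothetical pipeline: each concern lives in its own function, so any
# stage can later be replaced by a dedicated tool (an ETL product, a rules
# engine, a BI tool) without touching the others.
def extract(sources):
    # Extraction concern: gather raw records from the various sources.
    return [row for source in sources for row in source]

def validate(rows):
    # Validation concern: drop records that fail basic checks.
    return [r for r in rows if r.get("amount") is not None]

def apply_rules(rows):
    # Business-rule concern: in real life, specified in e.g. RuleSpeak.
    return [dict(r, flagged=r["amount"] > 10_000) for r in rows]

def present(rows):
    # Presentation concern: in real life, handed off to Tableau/Qlik/etc.
    return sum(1 for r in rows if r["flagged"])

sources = [[{"amount": 25_000}, {"amount": None}], [{"amount": 500}]]
flagged = present(apply_rules(validate(extract(sources))))
print(flagged)  # 1 transaction flagged for closer supervision
```

The point is not the code itself but the seams: each seam is a place where a concern can be lifted out of the script and given the quality of services it requires.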

 

Finally

Dear datascientist: stop blaming IT, architects or whoever else for your brilliant code not being deployed in production. Instead, reflect on your own role and your own part in the total supply chain of things that need to happen to actually get code working in production at the required quality of services.

Dear organization: stop blaming datascientists for not delivering the value that was promised. Start organizing, consciously, the operationalization of datascience. It is not a walk in the park; it is an assignment that requires an extremely broad skillset, a holistic view, cooperation and, of course, attention to human behavior/nature.

And the majority of these challenges fall with management!

Starting a datascience team or department without organizing the operationalization is a waste of resources.

Operationalization of datascience IS NOT a technical problem!

For my dataquadrant fans: it is all about transitioning from quadrant IV to II.

Mastering data governance initiatives in the age of IIoT

The Industrial Internet of Things encompasses internet-connected devices that companies use within their organizations. Improved data governance is a necessary goal to strive for regarding the...

From DevOps to DataOps

Over the past 10 years, many of us in technology companies have experienced the emergence of “DevOps.” This new set of practices and tools has improved the velocity, quality, predictability and scale...

How Big Data Is Redefining Ad Networks

The rapid rise of big data in virtually every realm of economic activity has caused a flurry of activity across the market, as business owners and hard-working professionals try to cash in on this exciting trend. Despite the renewed attention being paid to big data lately, however, many commentators are still paying pitifully little attention to how big data is set to redefine ad networks, which could be the most important way that it reshapes our material lives.

Here are some of the ways that continued innovations in big data could redefine ad networks as we know them, and how our consumer lifestyles will soon never be the same.

Big data is already changing advertising

To comprehensively understand how big data is redefining ad networks, you need to have a basic understanding of how it’s already changed the advertising industry. Despite the fact that many proponents and critics of big data alike frequently talk about it as though it were some forthcoming innovation, making use of software to sort through dizzying sums of information has been a vital part of the market for years now. The way that companies like Amazon link hands with Madison Avenue to deliver you enticing content like never ...


Read More on Datafloq

How should CIOs manage data at the edge?

The ubiquity of popular buzzwords or phrases in the technology community brings a certain kind of pressure. If everyone seems to be talking about the importance and transformative potential of an...

What is Data Portability?

In May of 2018 the European Union tightened regulations about the customer’s right to data portability as part of the GDPR (General Data Protection Regulation). But what do these changes mean, and how will...

With blockchain asset tracking, Walmart pushes supplier tech adoption

Walmart Inc.’s recent mandate that suppliers of leafy, green vegetables use blockchain by September 2019 faces two important hurdles that other companies should consider: the adoption rate of a...

6 Ways AR Plays a Role in Your Company’s Digital Transformation

The Augmented Reality revolution has reached a new point, with enterprise adoption overshadowing the consumer world. Market leaders have started directing their focus from niche offerings to...

Inside SAP’s digital transformation strategy

SAP has been at the forefront of business digital transformation, primarily by selling technology and applications… that enable companies to create new digital models and opportunities. But SAP...

Facial recognition’s failings: Coping with uncertainty in the age of machine learning

Deep learning is a technology with a lot of promise: helping computers “see” the world, understand speech, and make sense of language. But away from the headlines about computers...

Monetizing IoT Data Is The Next Step In A Connected Economy

We are moving at breakneck speed into the digitized world, where data has been king for several years. With the Internet of Things (IoT) becoming more prominent in our daily lives, data will...

36th Panzer Regiment Tank Losses, January 1944

The 36th Panzer Regiment was the tank component of the 14th Panzer Division, which had been destroyed at Stalingrad. When recreated, the division was supposed to have a three-battalion panzer regiment. However, it only received the I. and III. battalions before transferring to the eastern front in the autumn of 1943. As losses accumulated, its remaining tanks and assault guns were concentrated in the III. battalion and the I. battalion was sent out of the theatre to replenish.

On 1 January 1944, the regiment had the following vehicles operational: 10 StuG, 11 Pz III, 11 Pz IV. In short-term repair were: 7 StuG, 1 Pz III and 8 Pz IV. However, there is some uncertainty regarding the Pz III tanks, as they are not to be found in the organization chart, except for 6 command tanks in the battalion and the regiment. This is according to the monthly report to the Inspector-General of Panzer Troops (BA-MA RH 10/152).

The battalion war diary can be found in file BA-MA RH39/380. It is not as detailed as the war diary I have used for previous posts on I./Pz.Rgt. 26 Panther battalion. It just contains a narrative and I don’t have the kind of detailed annexes included in the file on I./Pz.Rgt. 26.

From the war diary, I conclude the following losses during January 1944:

StuG: 10 complete losses. One of them was only damaged by enemy fire, but could not be recovered due to nearby enemy units and was fired upon by other German StuG until it caught fire. Another 10 StuG were damaged, either by enemy fire or by technical breakdowns.

Pz IV: 3 complete losses, 6 damaged. As there have been some posts on this blog about the effectiveness of artillery versus armour, it is worth mentioning that one of the damaged Pz IV was hit by artillery fire.

Pz III tanks are not mentioned at all in the battalion war diary.

There are two comments on repairs in the war diary. On 14 January, it is said that one repaired tank returned, and on 27 January, it is reported that 3 Pz IV and 1 StuG returned from workshops. However, this cannot be all the repairs. On 1 February, the battalion had 5 operational StuG and 2 in short-term repair. As it started out with 10+7 StuG and had 5+2 on 1 February, while reporting 10 destroyed and 10 damaged during January, there must have been more repairs. The figures would suggest that 15 StuG were repaired, as there were no shipments of new StuG from the factories, according to the records in BA-MA RH 10/349 (list of deliveries of new AFV). Neither is any transfer of AFV from other units mentioned in the war diary.

It seems that the number of repaired Pz IV is 5, given the number on hand on 1 February.

All in all, this would mean that the battalion started out with 10 StuG and 11 Pz IV operational on 1 January, irretrievably lost 10 StuG and 3 Pz IV, had 10 StuG and 6 Pz IV damaged, while 15 StuG and 5 Pz IV were repaired. It should be noted that these figures are less certain than those given for the I./Pz.Rgt. 26 in previous posts, as the war diary of the III./Pz.Rgt. 36 is not as detailed.
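The StuG bookkeeping above can be checked with a little arithmetic (figures as given in the reports; the repair count is the inferred quantity):

```python
# StuG of III./Pz.Rgt. 36, January 1944 (operational + short-term repair).
start_operational, start_repair = 10, 7
complete_losses = 10                # written off during January
damaged = 10                        # entered the repair pool during January
end_operational, end_repair = 5, 2  # strength reported on 1 February

# Fleet balance: everything not written off is still on hand on 1 February.
assert start_operational + start_repair - complete_losses \
    == end_operational + end_repair

# Vehicles leaving the repair pool = (initial pool + newly damaged) - final pool.
repaired = start_repair + damaged - end_repair
print(repaired)  # 15, matching the inferred figure
```

The same balance cannot be closed for the Pz IV, since the 1 February repair-pool figure is not given; the 5 repaired Pz IV rests on the on-hand number alone.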

Admittedly, it is problematic to compare loss rates between units fighting different enemy formations, but it is still tempting to compare the III./Pz.Rgt. 36 with the I./Pz.Rgt. 26. After all, they fought in the same general area (Ukraine south of Kiev) in similar conditions against similar Soviet units. Clearly, the StuG and Pz IV were far more often directly destroyed by hits from enemy units. On the other hand, there seems to have been significantly fewer cases of mechanical breakdown among StuG and Pz IV. Ten such cases are explicitly mentioned, but in many cases the war diary just says that a tank was out of action, without giving a cause. Most likely, in those cases the cause was enemy action.

Clearly the proportions between destroyed by enemy fire, damaged by enemy fire and lost due to other causes differ considerably between the III./Pz.Rgt 36 and I./Pz.Rgt. 26.

[Picture: the SS Panzer Corps, July 1943]

Understanding the role of automation in data management strategies

“The bigger the better,” so the saying goes. However, when it comes to data, it’s not so simple. We’ve ended up with bigger data, but have we really got better data? In my experience, businesses are...

How to Make Your Business Smarter with IBM Watson Studio

A smart business is one that runs on the numbers. Today, open communication means more access to customers and competition than ever. Even small companies have world-wide customer lists, in part due to the increasingly streamlined logistics companies that help with shipping. In a business world of static customers and increasing competition, staying ahead of the curve often means staying on the leading edge of technology. Smart businesses leverage technology for direct gains, and that all comes down to the data. Efficiency improvements, trend tracking, smooth workflows and dozens of other internal operational improvements depend on the data you can collect and interpret.

All Data is Part of Big Data

Big Data is a term that gets thrown around a lot, and all it means is that you now have access to an incredible amount of data points. Some data points are important, while others just clutter up the landscape. In a Big Data approach to collection, you take it all in and start sorting through to gain valuable business insights. The sheer volume of information collected can make this approach seem inefficient. But, when done well, data collection can lead to some serious business benefits.

Benefits of Adopting a Data-Centric Approach

Data collection ...


Hitachi’s Bill Schmarzo: ‘IoT avalanche will open up security vulnerabilities’

Known as the ‘Dean of Big Data’, Hitachi Vantara CTO Bill Schmarzo reveals his thoughts on AI, blockchain and the internet of things. Bill Schmarzo is CTO in charge of internet of things (IoT) and...

Top 4 Artificial Intelligence Applications in Financial Institutions

Artificial intelligence has become very important in financial institutions and banking. Many banking applications and software products are embracing AI extensively to compete in a very intense atmosphere. They are using applications like virtual assistants, chatbots and AI debt collection assistants.

It is believed that more than 85% of bank customer interactions will be solely managed by artificial intelligence by 2020. TechEmergence, the AI market research specialist, believes that chatbots are going to become the primary consumer AI apps in the future, as banks need to engage with customers who seek help and information.

AI Solutions in Banks & Financial Institutions

The use of chatbots and virtual assistants reduces the expensive and tedious tasks in call centres, easing the work of customer service agents to a great extent. Financial institutions and banks have no alternative but artificial intelligence if they are to provide quick responses and effective solutions to the thousands of customers who contact them every day.

Here are the top 4 Artificial Intelligence solutions that are currently being used in banks and financial institutions:

Sales Assistant

Artificial intelligence provides real-time assistance to fill forms in banks. This can increase the bank's conversion rate from 2% to 12%. It ...


Automatic Clustering, Materialized Views and Automatic Maintenance in Snowflake 

Boy are things going bananas at Snowflake these days. The really big news a few weeks back was another round of funding! This week we announced two new major features.

Connected Vehicles – Are Commuters in The Privacy Driving Seat?

We seem to be living in a world that is barreling headlong towards a science fiction reality that was the dream of writers in the 1950’s. Our parents were shown images and told stories of an ever...

Drug Barons, Rogue States and Terror Groups Use Banks – Can Blockchain Stop Them?

Scathing reports by regulators have accused traditional banks of inadvertently helping “drug kingpins and rogue nations” – enabling them to commit money laundering, make questionable transfers and...

Can the Blockchain Make AI-based Systems Free of Monopolization?

The potential of blockchain technology is thought by many to be huge - the technology could not only affect the economy, but also medicine, scientific research, government, education, and several other fields. The same is thought about artificial intelligence. If both technologies have so much potential, what could happen if the two are combined? According to some computer scientists, venture capitalists, and entrepreneurs, decentralization of AI-based systems is one of the expected outcomes.

Some people in the AI sector have raised concerns about the extent to which companies such as Google or Facebook have taken control over online data, and how this control could limit the training of machine learning programs. Dawn Song, a computer science professor at UC Berkeley, believes that blockchain technology could significantly limit the control that Internet giants have over online information. Song defends the importance of having machine learning capabilities under users’ control, and she believes blockchain could be the answer. More specifically, blockchain technology could be used to give AI networks access to stores of online data without involving third parties in the process.

Song also found a way to test her beliefs, as she is currently developing a blockchain named Oasis, ...


Peacekeeping Institute?

It appears that the Army is looking at shuttering the rather small Peacekeeping and Stability Operations Institute: https://www.yahoo.com/news/army-push-end-peacekeeping-institute-sparks-wider-debate-100019830.html

Now, I have never had any intersection with this organization, so I have no idea how effective, productive or useful it is. I gather they are looking at renaming it (because peacekeeping is a bad word?) and cutting it by more than two-thirds. The savings are minimal.

Now, my experience is that DOD, being command-driven and mission-oriented, tends to forget about missions that are not currently getting “command attention.” I discussed this problem in some depth in my book Modern American Wars. We have seen parts of DOD go from ignoring the study of insurgencies before 2001 to recently not being able to properly model conventional combat for training exercises. As outrageous as this last sentence sounds, I can back it up with real-world examples, except I really don’t want to embarrass anyone. But let us say that we have seen multiple examples over the years of DOD being overly focused on the mission du jour at the expense of its other missions. DOD missions range from conventional wars, to counterinsurgencies, to irregular operations, to peacekeeping, nation building, and even border protection. These missions come and go, but they always show back up. That has been the case for over 200 years. The DOD always needs to be ready to conduct all its missions. The failures in Iraq, which cost American lives, drive home that point in blood.

DATAx: Why and how blockchain is going to disrupt marketing

In a lot of ways, modern marketing is broken. Speaking at this year’s Digital Marketing Summit in London, Wayne Lloyd of the EOS Nation opened his presentation by outlining why the systems of...

How AI Directly Influences The Bike-Sharing Program In Smart Cities?

Bike-Sharing system or bicycle-sharing system has been around since 1965 when a group called Provo introduced it in the bicycle-loving Amsterdam. But the idea of bike-sharing created a buzz only...

Data as jet fuel: An interview with Boeing’s CIO

It isn’t always comfortable, but data analytics is helping Boeing reach new heights. Boeing CIO Ted Colbert is something of an evangelist for the power of data analytics. He recently spoke with...
Global Business via Blockchain Is Inevitable. Crypto Isn’t

For anyone familiar with the benefits of blockchain, it’s reasonable to expect global enterprises to convert, at least partially, to a distributed ledger in the coming years. Global enterprises have a ton of data and red tape to deal with, and all essential data and documents could find a secure, accessible home with blockchain technology. Furthermore, businesses can conduct all manner of transactions via blockchain.

Out of the 2,000 biggest companies in the world, at least 50 enterprises are exploring ways to implement blockchain technology. These include Berkshire Hathaway, whose chairman and CEO Warren Buffett has railed against bitcoin and cryptocurrency; nonetheless, Berkshire subsidiaries Richline and BNSF are exploring blockchain solutions to verify the sourcing of diamonds and to track railroad freight.

Similarly, JP Morgan Chase CEO Jamie Dimon doubts bitcoin, but his company created the Quorum blockchain platform and made it open source. IHS Markit and Pfizer are both interested in Quorum. JP Morgan and Berkshire Hathaway are part of a trend: about 12 percent of financial institutions are already using blockchain, and 24 percent plan to use it in the coming year.

Yet there’s no indication that global enterprises are interested in cryptocurrency. Given the fact that cryptos operate on the blockchain, is the ...


How artificial intelligence is transforming the insurance sector

The following is an opinion piece written by Carlos Somohano from WHISHWORKS who shares his insights into how big data can bring benefits to insurers, and to the sector as a whole. The views...
How drones and artificial intelligence can be used together

Artificial Intelligence, or AI, has existed for some time now. If we could develop an AI that could operate drones without humans, what could this mean? What sort of new opportunities could come from...
5 ways artificial intelligence is transforming health care

Artificial Intelligence (AI) is slowly but surely becoming a part of the health-care industry and is bringing forth some revolutionary changes in the medical field. The introduction of artificial...
Mine Effectiveness (Mines at Kursk III)

There is an interesting statement in Zamulin’s book on Kursk (Demolishing the Myth, page 43) that says:

If in front of the line of defenses it required 350-400 anti-tank mines on average to damage or destroy one tank, then in the depths of the defense that number fell to 150-200 anti-tank mines. Such a difference is explained by the fact that mine emplacement in the depths of the defense occurred along lines of advance already revealed by the enemy.

I think I am reading that correctly, in that it takes 350-400 mines to damage or destroy one tank in front of the line of defenses. There is no footnote to this passage, so I do not know whether such a figure comes from studies done during World War II or after it, or is just some rule of thumb. But I have the data to test it here:

Mines at Kursk I

and here:

Tank Losses to Mines (Mines at Kursk II)

So, for the first day of the offensive in the south I count 131 to 154 German tanks lost to mines. In the first echelon of the Sixth Guards Army there were 68,987 anti-tank mines (see my Kursk book, page 200). In the first echelon of the Seventh Guards Army there were 32,194 anti-tank mines in front of the III Panzer Corps (the 81st GRD and the 78th GRD; see page 201). This is a total of 101,181 anti-tank mines in the first echelon, opposite the three attacking German panzer corps.

So….101,181/154 = 657 and 101,181/131 = 772. Therefore, based upon this data, it appears that it was more like 657-772 mines per tank damaged or destroyed.

Now maybe I should only count half of the 71st Guards Rifle Division (GRD) mines, because the 332nd Infantry Division was opposite half of the division (Kursk, page 378), and maybe half of the 67th GRD mines, because both the 11th PzD and the 167th ID were opposite it (Kursk, page 388). This reduces the mines counted against the German armor by 17,756. This is probably not entirely correct, as the mines are going to be biased towards the most obvious avenues of attack (which is where the German armor went), but still: (101,181-17,756)/154 = 541 and (101,181-17,756)/131 = 637.

So, it appears we are looking at a figure ranging from 541 to 772 anti-tank mines per tank damaged or destroyed.
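The arithmetic above can be checked mechanically. The figures below are simply the ones quoted in the text (Kursk book, pages 200-201); the script just reproduces the division:

```python
# Mines-per-tank arithmetic for the first day of the southern offensive.
# All figures are those quoted in the text (Kursk book, pages 200-201).
first_echelon_mines = 68_987 + 32_194          # Sixth + Seventh Guards Army = 101,181
tanks_lost_low, tanks_lost_high = 131, 154     # estimated German tanks lost to mines

# Unadjusted: every first-echelon mine counted against the attackers.
print(f"{first_echelon_mines / tanks_lost_high:.1f} to "
      f"{first_echelon_mines / tanks_lost_low:.1f} mines per tank")

# Adjusted: discount the 17,756 mines in sectors partly opposite German infantry.
adjusted_mines = first_echelon_mines - 17_756
print(f"{adjusted_mines / tanks_lost_high:.1f} to "
      f"{adjusted_mines / tanks_lost_low:.1f} mines per tank (adjusted)")
```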

Now……I can break it down by division attacking sector:

                         Estimated Tanks
                         Lost to Mines

Division             Low        High       Mines                        Range

3rd PzD               —            7       19,530 or less               383 or more per tank

GD PzGrD              —           25       as above                     as above

Panther Rgt          15           19       as above                     as above

11th PzD              —            8       15,981 or less               1,998 or more per tank

LSSAH PzGrD          15           20       16,476                       515 to 687 per tank

DR SS PzGrD           9           12       as above                     as above

T SS PzGrD            9           12       17,000/2                     708 to 944 per tank

6th PzD        .69 x 20     .79 x 20       17,000/2 + 20,266/2          1,165 to 1,331 per tank

19th PzD       .69 x 19     .79 x 19       20,266/2 + 11,928/2          1,073 to 1,238 per tank

7th PzD        .69 x 21     .79 x 21       11,928/2                     351 to 426 per tank

Now, we never attempted to estimate the number of tanks damaged or destroyed by mines after the first day (5 July 1943) because we did not have the data. But this does give us some idea of how many anti-tank mines need to be laid to damage or destroy a tank. I have not done a “literature search” to determine if anyone else has done any other in-depth analysis of this.

Brian Krzanich: Intel’s AI Commitments to Deliver a Better World

Artificial intelligence (AI) is not only the next big wave in computing – it’s the next major turning point in human history. Similar to how machine tools, factory systems and steam power ushered in...
4 Startups Using AI to Solve 4 Totally Different Problems

AI is one of the biggest buzzwords in tech (and in general) these days, and there’s no question AI gets a lot of hype, both for better and for worse. But the latest round of machine learning—which...
The Amazing Ways Toyota Is Using Artificial Intelligence, Big Data & Robots

industrial revolution as a result of its investments and innovation in artificial intelligence, big data and robots. With initial funding of $100 million, invests in tech start-ups and entrepreneurs...
Want to Become a Data Engineer? Here’s a Comprehensive List of Resources to get Started

Before a model is built, before the data is cleaned and made ready for exploration, even before the role of a data scientist begins – this is where data engineers come into the picture. Every...
How the Internet of Things Makes a Significant Difference in Healthcare

The provision of healthcare services in hospitals has evolved tremendously, mainly because of technological breakthroughs such as MRI machines, pacemakers, and other medical devices. This major transformation in the healthcare segment has been made possible by the introduction of the internet of things. This innovative information technology solution has made complex processes simpler and improved the delivery of healthcare services. The management of workflow, which used to be a cumbersome task, has also become a smooth process, thanks to the adoption of internet technology and smart devices, which have significantly changed the way traditional healthcare is delivered.

Role of IoT in Healthcare

The integration of IoT in healthcare serves two crucial purposes:


Improving the diagnosis of disease by communicating healthcare information such as heart rate and blood sugar levels in a timely manner.
Providing seamless contact between doctors and patients by removing wide demographic barriers.


IoT to enhance healthcare delivery

While the role that medical devices and smart machines play in providing healthcare services is clear, it is worth emphasizing how these smart devices improve the sector. Some of the key points are outlined below.

Data insights

As patients use smart devices at any given point of time, the data ...


Automated data management is a crucial part of IT’s future

Readers of this column know I’m preoccupied with the idea of automated data management. Data management is where the proverbial rubber meets the road when it comes to the future of IT. You can...
U.S. versus China (GDP)

Right now (as of 2017) the U.S. GDP is $19.391 Trillion according to the World Bank. The Chinese economy is $12.238 Trillion. This is 63% of the U.S. economy. No economy has been that close to the U.S. economy since Japan leading up to 1995.

It is rather amazing growth on the part of China. Back in the bad old days, after we had fought a war with them over Korea, when they were threatening to invade Taiwan, supporting North Vietnam against our ally South Vietnam, and allied with the Soviet Union as part of the Communist Bloc, the difference was much greater. The U.S. GDP in 1960 was $543.3 billion, while China’s was $59.716 billion, meaning the U.S. economy was nine times greater. Now it is only 1.6 times greater.

Of course, the two economies are intertwined, with the United States being China’s largest trading partner. This leads to the odd situation where some in the U.S. and China consider the other to be a rival. But I can’t think of too many cases where major trading partners were opposing, hostile players on the world stage. Still, it is a very uncomfortable arrangement, with the U.S. nominally the leader of the free world, while China has been known to run over its people with tanks. They are still very much a dictatorship. So the two nations seem to exist as trading partners who are not really friends and not really enemies. They may be rivals in the long run, or may not. There is, of course, an ongoing trade dispute between the two nations.

Now….if we extend those lines on the graph out….it does look like they will cross at some point around 2050 or so. This, of course, leads me back to this post:

Demographics of China

It is projected that the Chinese population will decline to 1.36 billion by 2050 (it is currently 1.51 billion), while the U.S. population will grow to 402 million (it is currently 328 million). For a number of reasons, I don’t think we will see the Chinese economy exceed the U.S. economy by 2050.
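A rough sense of when the lines might cross can be had from compound growth. The growth rates below are purely illustrative assumptions, not forecasts; with them, the crossover lands near the date the graph suggests:

```python
import math

# 2017 GDP figures from the text (trillions of USD, World Bank).
us_gdp, china_gdp = 19.391, 12.238

# Illustrative compound annual growth rates -- assumptions for this
# sketch only, not forecasts: 2.0% for the U.S., 3.5% for China.
us_growth, china_growth = 0.020, 0.035

# Solve china_gdp * (1 + g_c)^t = us_gdp * (1 + g_u)^t for t.
years = math.log(us_gdp / china_gdp) / math.log((1 + china_growth) / (1 + us_growth))
print(f"Crossover after ~{years:.0f} years, i.e. around {2017 + round(years)}")  # around 2049
```

A larger growth differential pulls the crossover earlier; a smaller one pushes it past mid-century, which is why the demographic question matters so much.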

Drones, data analytics, smart seeds: How to reforest x1,000 faster after wildfires

Juan Carlos Sesma wants to reverse climate change by planting 10 billion trees in 10 years. That would be the equivalent of giving every inhabitant of the planet his or her own tree to help create...
TIBCO Spotfire named a Leader in The Forrester Wave™: Data Preparation Solutions, Q4 2018

There is a golden rule for analytics and data science. Your results are only as good as the data that you work with. Essentially, it’s garbage in, garbage out. That’s why it is so important to have...
The Biggest Benefits of IoT and Enterprise Mobility Management Integration

The world of technology keeps producing new concepts: artificial intelligence, machine learning, the Internet of Things (IoT), and the list goes on. Then there is the impact such ingenious technologies have had on practically every aspect of how the world runs. Take the IoT and enterprise mobility, for example. Enterprise mobility is the newest trend taking over the market, wherein companies allow employees to work unbound from the office, performing their jobs using cloud services and mobile devices.

IoT is all about an extensive network of interconnected devices, and it has had an impact on the concept of enterprise mobility. It has united not only devices but also users in the context of enterprise mobility. This, in turn, has resulted in enhanced cost-effectiveness and better revenues for businesses. Furthermore, it has allowed employees to stay connected with their organizations at all times and made working with remotely located colleagues exceptionally easy.

Moreover, we are only getting started: IoT has also made it quite simple to send and receive data, which enables employees to make crucial decisions rapidly through fast and easy access to essential data. Another way IoT ...


What’s Driving the Cloud Data Warehouse Explosion

The advent of powerful data warehouses in the cloud is changing the face of big data analytics, as companies move their workloads into the cloud. According to analysts and cloud executives, the...
Which Is The Best JavaScript Framework For Your Big Data Project

When developing for big data, we have two main options for our JavaScript framework: Angular or React. Since JavaScript is mostly a consideration for front-end developers, back-end developers don't directly interact with the JavaScript framework; however, their processes and decisions are shaped by the framework the related front-end developer uses. At the front end, the choice of JavaScript framework is very important, as it can shape how the developer thinks about solving the problems presented to him or her. In the world of web development, the Angular vs. React discussion finds its way into almost every developer conference. The truth is that, depending on what you're developing, you'll go with whichever one suits your solution better.

Angular Development Highlights

Angular is a strict JavaScript framework and serves as a great choice for the development of enterprise-level applications. Because of how many functions and associated applications it provides out of the box, it spares developers from overthinking the project, allowing them to get cracking without needing to make major decisions early on. A wide range of developer environments can function alongside Angular, making it flexible. Infoworld states that Angular ...


Top Data Privacy and Security Scandals

Data privacy and security is one of the main topics across the globe today. More and more people are aware of the potential risks their online activities pose, with 90 percent of respondents to a recent study expressing concern about internet privacy overall.

A number of scandals have brought the potential dangers of data privacy and security breaches to wider attention, highlighting the demand for stronger protective measures. Below, we explore four of the biggest scandals of recent years, all of which affected some of the world’s most well-known brands.

Yahoo: Risking the Privacy of 3 Billion People  

Yahoo has been subject to multiple large-scale breaches in the past five years. The biggest is believed to have compromised three billion user accounts in 2013, though the company originally admitted that only one-third of these accounts had been affected.

Names, email addresses, passwords, dates of birth, and security questions with answers could all have been in danger of falling into the wrong hands. The reason for this? Experts accused Yahoo’s ‘outdated’ encryption of being too easy to crack.

Yahoo was quick to reassure users that payment card and bank account details had not been compromised in the breach. They further claimed that of the full three billion accounts, ...


3 Surprising Ways Data Analytics Could Save Your Life

People typically talk about big data in the context of increasing efficiency or revenue for businesses, but it has applications across virtually every sector.

Some of those applications could save lives. Here are three surprising ways in which data analytics could help save your life.

1. Preventing Chronic Diseases

Every year, the U.S. spends around $30 billion on preventable hospitalizations. About half of those hospitalizations are due to chronic diseases, namely heart disease and diabetes. These diseases are two of the most prominent health problems Americans face.

Preventing these diseases and the hospital visits they lead to is a significant challenge, but it's one data analytics may be able to help address.

Boston University’s College of Engineering and Boston Medical Center are teaming up to tackle this issue by developing machine learning algorithms that can identify patients at high risk of developing these two diseases, allowing doctors to intervene early and keep them healthy through personalized health plans.

Data analytics technologies enable the use of many more variables than medical professionals typically use when determining risk. For this project, the researchers plan to use data from electronic health records, implantable devices, wearables and home-based networked diagnostic devices.
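As a toy illustration of the kind of model involved (the features, weights, and patients below are invented for illustration, not taken from the Boston study), a logistic risk score over a few record variables might look like:

```python
import math

# Hypothetical weights for a handful of health-record features.
# A positive weight means the feature raises the predicted risk.
WEIGHTS = {"age": 0.04, "bmi": 0.08, "systolic_bp": 0.02, "daily_steps": -0.0003}
BIAS = -9.0

def risk_score(patient: dict) -> float:
    """Logistic risk score in (0, 1) from a flat feature dict."""
    z = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Two invented patients of the same age with different profiles.
sedentary = {"age": 62, "bmi": 33, "systolic_bp": 150, "daily_steps": 1500}
active = {"age": 62, "bmi": 24, "systolic_bp": 120, "daily_steps": 9000}
print(f"{risk_score(sedentary):.2f} vs {risk_score(active):.2f}")
```

In a real system the weights would be learned from the record data rather than hand-set, but the shape of the output is the same: a per-patient probability that can trigger early intervention.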

2. Improving EMS Efficiency

It's essential for ...


Three challenges facing blockchain technology

Nearly five years ago, Overstock.com became the first major retailer to accept bitcoin as a form of payment. It now accepts many top cryptocurrencies. As a member of the senior executive team and...
Dupuy’s Verities: Initiative

German Army soldiers advance during the Third Battle of Kharkov in early 1943. This was the culmination of a counteroffensive by German Field Marshal Erich von Manstein that blunted the Soviet offensive drive following the recapture of Stalingrad in late 1942. [Photo: KonchitsyaLeto/Reddit]

The fifth of Trevor Dupuy’s Timeless Verities of Combat is:

Initiative permits application of preponderant combat power.

From Understanding War (1987):

The importance of seizing and maintaining the initiative has not declined in our times, nor will it in the future. This has been the secret of success of all of the great captains of history. It was as true of MacArthur as it was of Alexander the Great, Grant or Napoleon. Some modern Soviet theorists have suggested that this is even more important now in an era of high technology than formerly. They may be right. This has certainly been a major factor in the Israeli victories over the Arabs in all of their wars.

Given the prominent role initiative has played in warfare historically, it is curious that it is not a principle of war in its own right. However, it could be argued that it is sufficiently embedded in the principles of the offensive and maneuver that it does not need to be articulated separately. After all, the traditional means of seizing the initiative on the battlefield is through a combination of the offensive and maneuver.

Initiative is a fundamental aspect of current U.S. Army doctrine, as stated in ADP 3-0 Operations (2017):

The central idea of operations is that, as part of a joint force, Army forces seize, retain, and exploit the initiative to gain and maintain a position of relative advantage in sustained land operations to prevent conflict, shape the operational environment, and win our Nation’s wars as part of unified action.

For Dupuy, the specific connection between initiative and combat power is likely why he chose to include it as a verity in its own right. Combat power was the central concept in his theory of combat, and initiative was not just the basic means of achieving a preponderance of combat power through superior force strength (i.e. numbers), but also of harnessing the effects of the circumstantial variables of combat that multiply combat power (i.e. surprise, mobility, vulnerability, combat effectiveness). It was precisely the exploitation of this relationship between initiative and combat power that allowed inferior numbers of German and Israeli combat forces to succeed time and again against superior numbers of Soviet and Arab opponents.

Using initiative to apply preponderant combat power in battle is the primary way the effects of maneuver (to “gain and maintain a position of relative advantage”) are abstracted in Dupuy’s Quantified Judgement Model (QJM)/Tactical Numerical Deterministic Model (TNDM). The QJM/TNDM itself is primarily a combat attrition adjudicator that determines combat outcomes through calculations of relative combat power. The numerical force strengths of the opposing forces engaged as determined by maneuver can be easily inputted into the QJM/TNDM and then modified by the applicable circumstantial variables of combat related to maneuver to obtain a calculation of relative combat power. As another of Dupuy’s verities states, “superior combat power always wins.”
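The relationship described here, numerical strength scaled by multiplicative circumstantial variables, can be caricatured in a few lines of code. The factor names and values below are invented for illustration; Dupuy's actual QJM/TNDM formulation is far more elaborate:

```python
from math import prod

def combat_power(strength: float, factors: dict) -> float:
    """Toy combat power: numerical strength scaled by multiplicative
    circumstantial variables (surprise, mobility, effectiveness, ...)."""
    return strength * prod(factors.values())

# A numerically inferior attacker that holds the initiative...
attacker = combat_power(600, {"surprise": 1.4, "mobility": 1.2, "effectiveness": 1.3})
# ...against a larger but surprised, less effective defender.
defender = combat_power(1000, {"surprise": 1.0, "mobility": 1.0, "effectiveness": 1.0})
print(attacker > defender)  # True: 600 x 2.184 = 1310.4 beats 1000
```

The point of the multiplicative form is exactly the one the verity makes: initiative lets a smaller force stack favorable multipliers until its effective combat power exceeds the larger force's.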

How data cataloging helps analytics cultures evolve

According to anthropologists, around 10,000 years ago, human civilization began to undergo a dramatic change. Until that point, most human societies had been hunter-gatherers, living in small,...
How automation is changing data science and machine learning

Almost any article you read about how automation will affect our future can be classified into one of two narratives. The first one is that it will definitely lead to a better future, as it always...
The Top 6 Data Visualization Tools for 2019

Business intelligence (BI) has long gone from industry buzzword to absolute necessity for organizations in almost any field. The ability to understand your company’s and industry’s most impactful data points in real time is paramount to keeping up with an ever-changing business landscape.

When you can quickly and effortlessly display that data in an easy-to-understand visual format, access it from anywhere and share it with the people who stand to gain most from its insights – that’s when you’re cooking with power. As such, data visualizations have become increasingly vital to business intelligence and analytics. They enhance online dashboards and reports by offering a more effective way to tackle decision-making processes. Even so, it’s important to understand what the various top data visualization tools offer, and how they can improve your operations. 

Read on for our top six data visualization platforms to consider using in the coming year.

1. Sisense

One of the fastest-rising products on the market, Sisense offers companies an expansive and comprehensive business intelligence suite with some of the most powerful visualization tools available. The company’s platform includes an AI-enhanced analytics engine, natural language querying, and the ability to create individualized and completely customizable dashboards for a variety ...


AI is not “magic dust” for your company, says Google’s Cloud AI boss

Andrew Moore is the new head of Google’s Cloud AI business, a unit that is striving to make machine learning tools and techniques more accessible and useful for ordinary businesses. To that end, his...
BI vs Data Mining: What’s the Difference and How Can They Be Used?

Both business intelligence and data mining can be extremely valuable to your business. However, because the two terms are often used interchangeably, it can be confusing to understand exactly what they are, how they're different and how they can be used. With this in mind, we've put together a guide to demonstrate what each process is and involves.

What is Business Intelligence?

BI is used to provide insights about both your own company and others such as rivals or business partners. It involves collecting and often processing large volumes of data, whether it be through your own internal metrics or third-party resources as specific as an Australian business directory database.

Ultimately it is used to help make more informed and therefore better business decisions, as well as making cost savings and finding new prospects. It can also be used to identify which processes and systems aren't performing well enough, so managers can alter them accordingly.

Thanks to the wide range of tools now readily available, as well as the rise of big data and an increase in open data initiatives, BI has become much more accessible to companies of all sizes. Whereas it once relied on BI professionals to make sense of the data, ...


Ethics in the world of artificial intelligence

As we approach artificial general intelligence, questions of ethics and responsibility for decision-making come up more and more often.

Several companies’ algorithms have recently turned out to be sexist or racist.

As long as a learning algorithm makes its decisions based on information supplied by humans, it will be just as biased as humans are.

May models be configured, and if so, how?

Chris Stucchio gave a talk titled AI Ethics, Impossibility Theorems and Tradeoffs at this year’s Crunchconf. He presented the field impartially, though one could guess that he has a position on the topic.

He brought up two very strong examples:

  1. The COMPASS algorithm used in the American justice system, which produces predictions used to decide whether people who have served the minimum portion of their prison sentence can be released back into society. As it turned out, the model was biased against black offenders. The question is whether we act correctly when we feed our machine learning system data on which we ourselves would be reluctant to judge people. Which is the ethically correct decision: to reduce the number of crimes by keeping in prison, from among convicts with identical attributes, the person with dark skin, or to disregard this and risk crime proliferating?
  2. Stucchio brought his second example from the financial sector. According to surveys, Asian people are the most likely to repay their mortgages, while black people are the least likely. Would it be ethical to pass attributes such as a person’s skin color or other sensitive data to the machine learning model that performs the necessary predictions for banks? For better or worse, everyone decides for themselves where the line is that they will not cross for a little more profit. While in the financial sector only money is at stake, in the justice system human lives and fates can hinge on the question.

The main message of the talk was that we should try, as far as possible, to formalize everything and make fairness measurable with a defined metric.

If you have your own opinion on the topic, we are curious to hear it; write it in a comment.
And if the description interested you, you can read more about the topic here: Delayed Impact of Fair Machine Learning
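The suggestion to formalize fairness as a defined metric can be illustrated with one of the simplest such metrics, the demographic parity gap. The example data below is invented:

```python
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Difference between the highest and lowest positive-prediction
    rate across groups; 0.0 means equal treatment by this metric."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values())

# Invented example: 1 = model recommends release / grants the loan.
preds  = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_gap(preds, groups))  # 0.75 - 0.25 = 0.5
```

Demographic parity is only one of several competing fairness definitions, and the impossibility theorems in the talk's title refer precisely to the fact that such metrics cannot all be satisfied at once.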


Will GraphQL Become a Standard for the New Data Economy?

Don’t look now but a new language called GraphQL is emerging that could radically simplify how developers use APIs to get data into applications, and potentially provide a graph-like alternative to...
Digital transformation: Guide to become a smart company

The data revolution is gaining pace at breakneck speed, and we are finally towards the latter end of its implementation. Many inroads have been made by important stakeholders, and numerous...
From Preventing Evidence Tampering to Crucial Weapons Systems: Applications of Blockchain in Security

Since the “creation” of Bitcoin, engineers have been working on ways to implement the technology behind it in other sectors. Blockchain technology can be applied to almost anything: applications range across the financial sector, health, the military, government, and more. Even though the technology was initially intended for the financial sector, today it is used in other sectors, such as security. Over the past several years, governments and private companies have been working hard on implementing the blockchain in as many sectors as possible.

The idea of the blockchain is not something that was created just a few years ago. Back in 1991, scientists were working on a system of timestamps on documents to prevent tampering. A year later, the hash tree, or Merkle tree, was implemented, which improved efficiency. Things then went somewhat quiet for almost two decades. The first real-life use of blockchain technology was for Bitcoin in 2008. Since then, experts have been looking for ways to implement the technology in other sectors. Today we are going to talk about three security applications of the blockchain.
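The Merkle tree mentioned above hashes each document, then repeatedly hashes adjacent pairs until a single root hash remains, so tampering with any one document changes the root. A minimal sketch:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(documents: list[bytes]) -> bytes:
    """Hash each document, then repeatedly hash adjacent pairs until
    a single root remains (an odd leftover node is hashed alone)."""
    level = [sha256(doc) for doc in documents]
    while len(level) > 1:
        pairs = [level[i:i + 2] for i in range(0, len(level), 2)]
        level = [sha256(b"".join(pair)) for pair in pairs]
    return level[0]

docs = [b"deed", b"invoice", b"contract", b"receipt"]
root = merkle_root(docs)
# Tampering with any single document changes the root.
print(root != merkle_root([b"deed", b"invoice", b"contract", b"forged"]))  # True
```

This is why a timestamped root is enough to prove later that none of the underlying documents were altered, without republishing the documents themselves.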

Blockchain applications

Cybersecurity

Data protection is one of the biggest concerns in modern times. The biggest problem occurs when certain data ...


Read More on Datafloq
Applying Big Data Streaming Analytics in the Real World

Applying Big Data Streaming Analytics in the Real World

IoT, the Internet of Things, has been a buzzword for the past five years. Literally everyone across all industries – business executives, line of business owners, operation staff, mechanical...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How restaurants will use artificial intelligence to boost sales

How restaurants will use artificial intelligence to boost sales

Tracking technology can help restaurants plan efficiencies around staffing and serving, as well as help them market their restaurant effectively. Artificial Intelligence is going to impact us in many...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
AI And Automation 2019 Predictions From Forrester

AI And Automation 2019 Predictions From Forrester

It’s difficult to make predictions, especially about the future, but we can be certain that “AI Washing” will continue to rise and flourish in 2019. That’s what market research firm Forrester calls...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
These are the practical uses for artificial intelligence in business

These are the practical uses for artificial intelligence in business

Schneider Electric Chief Digital Officer Herve Coureil sat down with TechRepublic’s Dan Patterson and talked about practical uses for AI in business. The following is an edited transcript of the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Researchers Are Using Restaurant Reviews And Data Analytics To Predict Health Risks

How Researchers Are Using Restaurant Reviews And Data Analytics To Predict Health Risks

All of us rely on online restaurant reviews before we try a new restaurant. These reviews, written by ordinary people like us, do more than just help people recognise the good and bad restaurants or dishes...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Can AI Address Health Care’s Red-Tape Problem?

Can AI Address Health Care’s Red-Tape Problem?

Productivity in the United States’ health care industry is declining – and has been ever since World War II. As the cost of treating patients continues to rise, life expectancies in America are...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The world’s best playground for AI and blockchain

The world’s best playground for AI and blockchain

Imagine a country with an army of techies, a government that supports AI and blockchain by setting a mandate and investing billions, large scale tech companies that are rapidly experimenting and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Our lack of interest in data ethics will come back to haunt us

Our lack of interest in data ethics will come back to haunt us

When was the last time you saw a creepy ad on Facebook, which seemed to know about a product you were discussing with a coworker? Or when was the last time you noticed that your Google search had...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Behavioral Health Providers are Failing: Why Data Analytics is the Solution

Behavioral Health Providers are Failing: Why Data Analytics is the Solution

Today, behavioral health is more important than ever. Major news outlets, from The New York Times to The Wall Street Journal, constantly cover the mental health epidemics wracking the nation, be it...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Women Are Needed in Data Science

Why Women Are Needed in Data Science

The field of data science continues to grow quickly, and organizations of all sizes and in all industries need people who can analyze the numbers, dive into new theories, and innovate. Data scientist was the #1 job in the U.S. in 2017, but finding qualified applicants is proving to be a challenge for companies. Jobs in data require knowledge of programming, algorithms, statistics, and more—skills that take a lot of time to develop.

Like many tech jobs, the data science industry is currently male-dominated. According to one study, 70% of data scientists are men. With the shortage of qualified candidates in the field, however, this is a great time for women to join the industry and move into these roles. Here’s why closing the data science gender gap is so important.

Women Can Excel In Data When Given the Opportunity

We’ve lived with damaging stereotypes about women in tech for decades. Although most early programmers were women, today many people hold the opinion that women aren’t good with numbers, analytics, or other STEM topics. Those false stereotypes hold women back from pursuing data careers and reinforce the gender gap. Currently, women make up just 26% of data professionals.

Women are ambitious, skilled, and adaptable, ...


Read More on Datafloq
5 Predictions for 2019: Business Value From Data

5 Predictions for 2019: Business Value From Data

You’ll get more value out of your data projects and programs in 2019, according to five predictions from Forrester Research. The year 2018 is almost a wrap. Your enterprise may have moved from...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Data Catalog Comes of Age

The Data Catalog Comes of Age

Nowadays, it isn’t just banks and multinational corporations who have to be rigorous about data. Even modest organizations who would previously have been unable to afford the storage, tooling,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Crunch Conference 2018 – A Professional Mirror

Crunch Conference 2018 – A Professional Mirror

The Crunch conference is in a special position, because the speakers’ intentions are fairly pure: they aren’t trying to explain something complicated to us, as at an academic conference; we aren’t at a PR event where everything must carry the marketing message of the main sponsor and solution vendor; nor is it a closed professional community’s meetup where everyone, speaking for the benefit of clients or competitors, talks themselves up ever more.

What I saw in the Crunch speakers was that they have a sound line of thinking and an acceptable, evolving data-driven system at their company that they are simply happy to talk about. The honesty that flows from this means the conference can serve as a kind of mirror for the audience: looking into it, we can learn quite a lot about ourselves.

[Photo: the Dmlab team at CrunchConf.]

Since in 2018 big data, AI and the solutions built on them pour at us from essentially everywhere, while behind most of the news, material and success stories lie rather dubious intentions, clever marketing tricks or journalistic exaggeration, many people may get the feeling that everywhere else fantastic data-driven processes are building the successful businesses of the future, which leaves us with a sense of having fallen behind.

That is why it was refreshing to hear the data analysts of big, renowned companies and discover that even the largest players work with very similar problems and very similar proposed solutions to the ones we run into at home. It was exciting to see the data scientists of Runtastic, Slack, LinkedIn and Uber, among others, talk about the processes and systems they have built, and about their challenges and difficulties.

Seeing how internationally significant companies operate, the main lesson of Crunch for me was that the dilemmas, problems, difficulties, and the solutions built around them, that we ourselves work with hold up at an international level too. We are not the only ones who struggle to operate a model that has to be rebuilt again and again; elsewhere they have the same headaches with heterogeneous environments, and others face the same dilemma of what makes an analysis a one-off versus a recurring one. Obviously the Hungarian data world lags behind the developed economies, but on the professional side this lag is not noticeable. It was good to see that in the mirror.

Alongside Crunch, the same ticket gave us access to two other conferences: Amuse, on UX, and Impact, on product management, shared the venue with us. I went over to one promising talk at each, and Impact, true to its name, made a very big impression on me. The good hour I spent there hit me with elemental force and made me see how blindly we had been going about productizing Dmlab’s solutions until now. In a way this too functioned as a mirror for me, only here, instead of a satisfied smile, my face showed the astonishment you see when a six-month-old first realizes what it is looking at in the mirror.


Blockchain Big Data Options will Change Entire Industries

Blockchain Big Data Options will Change Entire Industries

Big data has been big news for businesses in recent years. The rapid rise in internet use and smartphone technology means we are now generating vast quantities of data every minute. According to one...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Panther Breakdowns in February-March 1944

Panther Breakdowns in February-March 1944

In a previous post, I presented information on the actions of the I./Pz.Rgt. 26 Panther battalion in January 1944.

Panther Breakdowns In January 1944

During late January and February, the battalion had usually been attacking. In March, it found itself embroiled in defensive fighting. This would show up in its losses. On 1 February, it reported 30 operational Panthers and 8 in workshops. No less than 24 had been sent off by train for more extensive engine repairs. Eleven Panthers had been irrevocably lost in combat (five due to hits that caused permanent damage, five had been damaged but could not be recovered, and one had suffered an accident and could not be recovered). Two Panthers had been lost to engine room fires while on the march.

The battles fought 1-20 February only cost the battalion one Panther destroyed by enemy action (burned out due to an AT round hit). Also, one more Panther spontaneously caught fire in the engine room and burned out. In the same period, no less than 97 Panthers were repaired by the workshop units, a very high damaged-to-destroyed ratio. Note that these repairs covered combat damage as well as non-combat damage.

On 12 February, 20 Panthers were operational, 25 in workshop and 14 were to be recovered from various locations.

From 21 February, the battalion shifted to defensive operations and would retreat. This would lead to an increasing number of tanks being irrevocably lost. On 6 and 7 March, eight Panthers were blown up by the battalion. Of these only two had been damaged by enemy fire. The remaining six suffered from mechanical breakdowns.

The report (I./Pz.Rgt. 26 “Zusammengefasster Bericht über Panzerlage”) gives causes for each Panther being put out of action these days. It is clear that of 15 Panthers put out of action 5-7 March, only three had been hit by enemy fire and it seems that none received irreparable damage.

The battalion continued to retreat and on 8 March two Panthers were cannibalized for parts and subsequently blown up. These two (numbers 132 and 332) had not been knocked out by the enemy. They had simply got stuck in the terrain, one of them had also damaged a final drive. Lack of towing vehicles meant that they could not be recovered.

Late on 8 March it was decided to blow up another three Panthers, all of which had technical problems but could not be recovered in time, due to lack of towing vehicles and mounting enemy pressure.

On 9 March, another six Panthers were blown up. Three had been hit by enemy fire, but since they had to be blown up, it seems unlikely that they had received terminal damage beforehand. Finally, on 14 March two more Panthers were blown up. Both had technical damage.

After the actions in the Uman–Zvenigorodka area, the battalion retreated southwest, to the Kishinev–Balta area. During the retreat, another 19 Panthers were blown up, none of which had been damaged by enemy fire. Instead, demolition was carried out because vehicles had crashed off bridges, suffered technical damage or got stuck in the terrain, and could not be recovered in time before enemy pressure got too strong.

Despite these problems, the repair services repaired 41 Panthers in the period, which indicates that they worked hard.

During the first three months of 1944, the I./Pz.Rgt. 26 lost 60 Panthers irrevocably. Of these, 37 were blown up without being damaged by the enemy. Four destroyed themselves through engine room fires. This leaves 19 hit by enemy fire, and of these, it seems only 7 were actually destroyed by the hits received.

It seems clear that enemy fire was not the main cause of losses. As long as the Germans could recover damaged tanks, and had spare parts, few total losses occurred. Also, it is obvious that tanks were put out of action mainly by other causes than enemy fire. However, advancing Soviet ground units had much to do with the German tank losses, as such action could prevent recovery and force the Germans to blow up otherwise repairable tanks.

I have previously encountered claims that the Germans kept destroyed tanks on the rosters and thereby their true losses would appear smaller than they actually were. That notion finds no support in the very detailed war diary of the I./Pz.Rgt 26. I cannot find one single such case in the three months I have studied. Instead, it is clear that they far more often had to blow up perfectly repairable tanks.

During the three months discussed here, the workshops of the battalion repaired well over 200 Panthers, perhaps as many as 300, which can be contrasted to only about 7 being directly destroyed by enemy fire. This shows that it can be very problematic to infer tank losses from changes in the number of operational tanks from one time to another. Also, it shows the importance of controlling the terrain after the action has been fought.

All information is from the war diary, with annexes, of the I./Pz.Rgt. 26 (Bundesarchiv-Militärarchiv, RH 39/599).

Why Microservices Will Become a Core Business Strategy for Most Organizations

Why Microservices Will Become a Core Business Strategy for Most Organizations

As an industry, we have collectively returned to that eternal debate about what constitutes a largely technical evolution versus when an important digital idea becomes a full-blown business trend....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Robots in the Workplace: Pros and Cons

Robots in the Workplace: Pros and Cons

The use of automation to benefit business is nothing new. I always think back to the conveyor belt model thought up by Henry Ford when the first mass-produced car, the Model T, was introduced. The idea was straightforward enough and made production infinitely easier; a car would be put together piece by piece by various people along a conveyor belt. This meant that each employee only had to perform one aspect of the production and was therefore very good at it. This led to decreased production time and a more consistent product.

The next step to this, which you see in most car manufacturers, was to replace the human aspect with machinery which automatically performed tasks along the conveyor belt, removing a great deal of human error and cutting wage costs. Now, with technology becoming more advanced, the near-future offers a further level of automation that will not be exclusive to manufacturing. Robots are set to invade the workplace and take over many jobs in the coming years.

Surprisingly, the generation who were introduced to Skynet and the Terminator seem less worried about this than younger people. Research shows that 16-24-year-olds are most worried about robots in the workplace, ...


Read More on Datafloq
Why the Artificial Intelligence Era Requires New Approaches to Create AI Talent

Why the Artificial Intelligence Era Requires New Approaches to Create AI Talent

We live in the world of AI, and there is plenty of talk about what the future holds. Despite widespread doubts about that future, one can predict that AI will soon be an integral part of our lives. Considering its likely impact, there is a need to ensure that the right AI talent comes up and gets to the top to lead this wave of change. According to several key leaders from Huawei, this can only be achieved by changing certain approaches.

As a Huawei partner and a member of Huawei’s Key Opinion Leader Program, I joined three other experts conducting several keynotes at Huawei Connect in Shanghai all related to the question of how to develop talent in the AI era; Dr. Hao Lu, who is the Chief Innovation Officer at Yitu, Huang Weiwei, who is the Senior Management Consultant for Huawei, and Qian Wang, who is the Co-Founder of Mai Mai.

Weiwei mentioned that, “The adoption of AI technologies will not only make its products more intelligent and improve internal management efficiency; it could contribute as much as 90 per cent of total income even though AI headcount would be relatively small compared to ...


Read More on Datafloq
9 Ways to Incorporate Analytics Into Your Organisation

9 Ways to Incorporate Analytics Into Your Organisation

Descriptive, diagnostic, predictive and prescriptive analytics can each provide insights into the business and as such improve and optimise your performance and increase your competitive advantage. Descriptive and diagnostic analytics enable organisations to learn, sense, filter, shape and calibrate opportunities by providing insights as to what has happened in their environment. This will allow your organisation to better sense opportunities than the competition. Predictive analytics can improve your decision-making across your organisation to help you understand which opportunities are best to be seized depending on their future outcome.

As an example, predictive analytics can predict future customer demand based on detailed customer profiles, which will build loyalty and commitment if carried out correctly. When predictive analytics is successfully incorporated into your organisation, you can start to predict a lot more, including:


Customer churn; when is your customer about to leave you, and for what reason? Knowing this information will enable you to take proactive action to prevent your customer from eventually leaving you.
Sentiment; what do your customers think of your new product, service, campaign or commercial? Knowing this information enables you to change it before or shortly after launch to ensure that it matches your customers’ needs.
Customer support; when can you expect an ...


Read More on Datafloq
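A churn predictor of the kind described above can be sketched as a simple logistic regression trained on historical customer records. The features, sample data and test customers below are invented for illustration, not taken from any particular product:

```python
import math

def sigmoid(z: float) -> float:
    # Clamp the score to avoid overflow in exp() for extreme values.
    z = max(-60.0, min(60.0, z))
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.1, epochs=2000):
    """Fit weights and bias so sigmoid(w.x + b) approximates P(churn)."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss for this sample
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def churn_risk(w, b, x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Features per customer: [normalised monthly spend, support tickets, tenure in years]
customers = [[0.9, 0, 5.0], [0.8, 1, 4.0], [0.2, 5, 1.0], [0.1, 6, 0.5]]
churned = [0, 0, 1, 1]  # in this toy data, low-spend, high-ticket customers left

w, b = train(customers, churned)
at_risk = churn_risk(w, b, [0.15, 4, 1.0])  # profile resembles past churners
loyal = churn_risk(w, b, [0.85, 0, 4.5])    # profile resembles retained customers
```

The "proactive action" then follows from ranking customers by this probability and intervening, for instance with a retention offer, before the predicted departure occurs.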
3 crucial factors when applying Artificial Intelligence in business

3 crucial factors when applying Artificial Intelligence in business

Artificial Intelligence is really happening — and it has big implications. There’s now a collective fascination with the potential for scaling AI in both public and private sectors, and, in the last...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Achilles’ Heel Of Artificial Intelligence

The Achilles’ Heel Of Artificial Intelligence

In a previous blog, I discussed how Artificial Intelligence (AI) today merely has specific intelligence as opposed to generalized intelligence. This means that an AI process can appear quite intelligent within very specific bounds yet fall apart if the context in which the process was built is changed. In this blog I will discuss why adding an awareness of context into an AI process – and dealing with that context – may prove to be the hardest part of succeeding with AI. In fact, handling context may be the Achilles’ heel of AI! This discussion will expand upon some points made within a recent IIA research brief.

That's Impressive!... Or Is It? 

Consider a picture of a tennis match.



It is possible to build AI processes to recognize a wide range of things about the image. Examples include:


This is a tennis game
There are two females playing
They are wearing tennis skirts
One of them has on a cap or visor
One player just hit the tennis ball


Seeing all of those facts identified in an image automatically is certainly impressive, and I saw a gentleman at a conference discuss a scenario just like this. However, there were examples discussed where what at first seemed impressive did not hold up under scrutiny.

Imagine that ...


Read More on Datafloq
Really the End of Stealth?

Really the End of Stealth?

This blog reported in July that the end of stealth might be near.  Further evidence comes from Aviation Week’s interview with Fred Kennedy, head of the Tactical Technology Office (TTO) at the Defense Advanced Research Projects Agency (DARPA), which reveals key elements of the debates within the US defense community, specifically about stealth.

“We have been doubling down on the miracle of stealth for forty years. … There are diminishing returns to using the same tactic.  I don’t think there is a lot of advantage to going further into this particular tactic of stealth.”  Rather, DARPA suggests what it calls “undeterrable air presence. … You’re going to see me coming, since I won’t be stealthy, and you’re going to shoot at me, but you’re not going to hit anything.  An example is hypersonics.”

Meanwhile, Air Force Chief of Staff General Dave Goldfein is looking at the network approach, sometimes called combat cloud.  “When you look at — through the lens of the network — and you look at air superiority as a mission, as a family-of-systems approach, you can see why you don’t hear me talking a lot about a replacement, A for B.”

This indicates that several programs now underway to develop new stealth aircraft might face an uphill battle in convincing the Air Force, DARPA and the Department of Defense (DoD):

  • Next Generation Air Dominance (NGAD)
  • Penetrating Counter Air (PCA)
  • F/A-XX

So, does this mean that stealth is near its end? A few key facts illustrate otherwise:

  1. Significant investment in new stealthy platforms, worldwide.
    1. US F-117 – in service from 1983 to 2008
    2. US B-2 – in service since 1997
    3. US F-22 – in service 2005-Dec, first combat 2014-Sep
    4. F-35 Program – first combat by Israeli Air Force, 2018-May
    5. US B-21 Program – expected to enter service by 2025
    6. British Tempest – concept announced 2018-July
    7. Franco-German Future Combat Air System (FCAS)
    8. Japanese X-2 Shinshin – costly, but may proceed with partners
    9. Korean & Indonesian KF-X – expected by 2032
    10. Turkish & British TF-X – first flight by 2023 ?
    11. Chinese J-20 – in serial production since 2017-Oct
    12. Chinese J-31 – improved version, first flight 2016-Dec
    13. Chinese H-20 – strategic stealth bomber, planned for 2025
    14. Russian Su-57 – in service, combat evaluation in Syria
    15. Russian PAK DA Program – bomber planned for 2025-2030
  2. Research projects by DARPA that leverage existing stealthy platforms.
    1. Gremlins – semi-disposable, air launch and recovery UAVs
    2. Software – System of Systems approach (SoSITE)
  3. Evidence that stealth capabilities by potential adversaries are overstated.
    1. India Air Force claims Su-30MKI tracked Chinese J-20
    2. Russia cancels mass production of Su-57

Clearly then, stealth is a capability that is here to stay, and many new aircraft will incorporate it into their design. The point that DARPA’s Kennedy makes is that potential adversaries know this tactic, and they are investing in ways to counter it.  Stealth is no longer a source of technological surprise; it is mainstream.  It was original, and likely a source of significant surprise, back in 1983.

The Unlikely Marriage of Data Warehousing & Marketing

The Unlikely Marriage of Data Warehousing & Marketing

Traditionally, CIO and CMO organizations have operated separately with different mandates. One was responsible for technologies that enabled company operations, while the other was responsible for...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Does Your Data Center Have A White Box Future?

Does Your Data Center Have A White Box Future?

In the world of networking, a brand carries with it strength and dependability, and because of that, consumers usually opt for a brand name over something they've never heard of before. At least, that's what marketing believes. While hyperscale data center businesses like Amazon and Google might have depended on names like Dell and Cisco in the past, now, according to Network World, more and more of them are leaning towards white box servers.

White box servers tend to be cheaper and more easily upgradable than the current industry market leaders, and the scariest part (at least if you're in marketing for Dell or Cisco) is that there isn't a single company that these white box servers are linked to, and so they don't even have an advertising campaign that can be studied and copied. The real question is: are white box servers really that great an investment?

Enterprises are Wary

Enterprise-level data centers haven't taken the same enthusiastic approach to white box servers. The reason white box servers tend to be more prevalent in massive data centers is the availability of these systems for bulk production. Additionally, another important distinction between brand-name manufacturers and white box server installs ...


Read More on Datafloq
You Already Have the Customer Data You Need. Here’s What to Do With It.

You Already Have the Customer Data You Need. Here’s What to Do With It.

I’m going to tell you a secret. You already have all the data you need to make a difference in your business. Sure, you might have to dust it off, organize it and put it to work, but the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Ethereum Energy Project Now Powers 700 Households in 10 Cities

Ethereum Energy Project Now Powers 700 Households in 10 Cities

Launched earlier this year, Lition is already a licensed energy supplier in Germany with clients in 12 major cities (including Berlin, Hamburg and Munich) who are now using its decentralized energy...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Missing Field Guide to Industrial IoT Platform Development Choices

The Missing Field Guide to Industrial IoT Platform Development Choices

You’re an industrial equipment manufacturer. You design, produce and sell equipment used in manufacturing, agriculture, life sciences, food & beverage production, transportation, oil & gas,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How the Internet of Things Affects Your Data Center?

How the Internet of Things Affects Your Data Center?

The Internet of Things is very much a disruptive technology from a data center perspective. When it was initially announced, the IoT sounded as though it could change the face of the industry and make computers more integrated into things. This was great news for data centers since it meant they could look forward to deeper incorporation into the industry and wider use cases across a larger number of areas.

While data centers have anticipated that the IoT was likely to change their hardware, few expected that the IoT would lead to such massive, sweeping changes in data center management paradigms as we see today. Information Week notes how technology like edge computing has impacted data centers, showing that these impacts are widespread and potentially industry-changing. But that's what we expect from disruptive technology, right?

Connectivity to Aid The IoT

The IoT depends heavily on interconnectivity between devices and servers. This is one of the core elements behind the establishment and maintenance of a data center - the ability to create and maintain connections between servers. The IoT's demands, however, are far more massive than anything data centers have experienced in the past. No one user, no matter how big, could generate as ...


Read More on Datafloq
5 Advanced Analytics Algorithms for Your Big Data Initiatives

5 Advanced Analytics Algorithms for Your Big Data Initiatives

Getting started with your advanced analytics initiatives can seem like a daunting task, but these five fundamental algorithms can make your work easier. There is a fervor in the air when it comes to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Role of Robotics in Smart Warehousing

The Role of Robotics in Smart Warehousing

Today's warehouses are arguably smarter than ever. They feature equipment that works without constant human intervention and even gives alerts before breakdowns happen.

And robotics equipment features prominently in these smart warehouses, making operations safer and more efficient.

Robots Help Prepare Items for Shipment

People are accustomed to not having to wait very long when they order things online. Some providers, such as Amazon, fulfill orders within hours. Also, companies in the online grocery industry deal with perishable items and must figure out the best ways to get those products from warehouses to customers' doorsteps before they spoil.

As such, robots are an integral part of the workforces at distribution and fulfillment centers. They handle tasks such as loading, unloading and picking products from different areas of a warehouse. In Amazon's case, the company uses more than 100,000 robots in its facilities around the world.

The company primarily needs the robots to supplement human work, not replace it. Without the reliance on robots, though, Amazon wouldn't be able to ship items out so quickly and keep its customers satisfied.

Robots Can Lift and Carry Things — but Shouldn't Replace Humans in All Cases

Alibaba is another well-known e-commerce company. In its smart warehouse in Huiyang, ...


Read More on Datafloq
Databases vs data lakes: Which should you be using?

Databases vs data lakes: Which should you be using?

As the transformational power of data is realised, the debate around whether to choose databases or data lakes has intensified. Businesses large and small, along with data scientists, IT professionals...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Growing Significance Of DevOps For Data Science

The Growing Significance Of DevOps For Data Science

DevOps involves infrastructure provisioning, configuration management, continuous integration and deployment, testing and monitoring.  DevOps teams have been closely working with the development...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Tank Losses to Mines (Mines at Kursk II)

Tank Losses to Mines (Mines at Kursk II)

This is a follow up post to this post:

Mines at Kursk I

This is excerpted directly from my Kursk book, page 423.

The German tank losses due to mines can be estimated for this day [5 July 1943]. There is a report that appears to record all the German tanks lost to mines on 5 July. The 3rd Panzer Division reported six tanks damaged by mines. One other was destroyed, and while the records do not explicitly state it, the assumption is that this one was also lost to mines. The Gross Deutschland reported that it lost five assault guns and about 20 tanks to mines. This was the worst loss to mines suffered by any division. The 11th Panzer Division reported losing eight tanks on 5 July. It is unclear, but it appears that all of these were also lost to mines. The Adolf Hitler SS Division reported losing 20 tanks on 5 July, including six Tiger tanks. The Totenkopf SS Division lost 12 tanks, including five Tigers. The record notes that these were “mostly from mines.” The Das Reich SS Division reported 12 tanks lost, including two Tigers. It did not report the cause of loss, but it is also assumed that most were from mines. A comparison of the calculated losses for these divisions (generated by subtracting the number of tanks ready-for-action on the evening of the 5th from the number ready-for-action on the evening of the 4th) with the reported losses due to mines shows the following:

                   Calculated losses    Reported losses    Mine loss
                   (from all causes),   (mine losses),     as a percent
                   5 July               5 July             of total loss
3rd PzD                  10                   7                 70%
GD PzGrD                 30                  25                 83%
11th PzD                 12                   8                 67%
LSSAH PzGrD              20                  20                100%
DR SS PzGrD              19                  12                 63%
T SS PzGD                15                  12                 80%
                        ——                  ——                  ——
Total                   106                  84                 79%
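The arithmetic behind the table is easy to check with a short script. This is just a sketch reproducing the figures quoted above (the division abbreviations and loss counts are taken from the text; nothing here comes from the original records themselves):

```python
# Per-division losses on 5 July 1943, as quoted in the text:
# (calculated losses from all causes, reported losses to mines)
losses = {
    "3rd PzD":     (10, 7),
    "GD PzGrD":    (30, 25),
    "11th PzD":    (12, 8),
    "LSSAH PzGrD": (20, 20),
    "DR SS PzGrD": (19, 12),
    "T SS PzGD":   (15, 12),
}

for name, (all_causes, mines) in losses.items():
    # Mine losses as a share of each division's total calculated losses
    print(f"{name}: {mines}/{all_causes} = {mines / all_causes:.0%}")

total_all = sum(a for a, _ in losses.values())
total_mines = sum(m for _, m in losses.values())
print(f"Total: {total_mines}/{total_all} = {total_mines / total_all:.0%}")
# Total: 84/106 = 79%
```

The per-division shares come out to 70%, 83%, 67%, 100%, 63% and 80%, matching the table.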

 

It is estimated that the Panther regiment lost 19 tanks to mines on this day. The III Panzer Corps showed a decline in strength of 64 tanks on the first day. Assuming the same percentage was lost to mines as in the other two panzer corps, another 51 tanks were lost to mines. We do know that at least nine Tigers were lost to mines on this day, and that a total of 16 were lost to mines in the first three days of battle.

Therefore, it is estimated that on 5 July the attacking Germans lost as many as 154 tanks to mines out of 249 damaged or destroyed that day. This amounts to 62 percent of the tanks lost that day and 10 percent of the total German tanks lost between 4 and 18 July. This is the upper estimate, as the records often report that “most” of the tank losses were to mines. If one assumes that “most” means 75 percent lost to mines, then the figures are lower. The XLVIII Panzer Corps loss to mines would remain at 40, the SS Panzer Corps loss would drop to 33, Panzer Regiment von Lauchert to 15, and the III Panzer Corps and Corps Raus to 44, for a total of 131 tanks lost to mines on 5 July.

Of those tanks lost to mines, only one or two, including one Panther, were clearly destroyed. Most were damaged without loss to the crew, and in many cases the tanks were repaired and put back into action within a few days. There were also at least four Soviet tanks lost to mines on this day, probably to their own mines, and we know of two German Tiger tanks that were lost to their own mines.

One more mine related post to follow.

Does Synthetic Data Hold The Secret To Artificial Intelligence?

Does Synthetic Data Hold The Secret To Artificial Intelligence?

Could synthetic data be the solution to rapidly training artificial intelligence (AI) algorithms? There are advantages and disadvantages to synthetic data; however, many technology experts believe that...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Your Team Doesn’t Need a Data Scientist for Simple Analytics

Your Team Doesn’t Need a Data Scientist for Simple Analytics

Data analytics is a powerful and promising source of competitive advantage. But organizations are often hobbled by the lack of the requisite skills in the marketplace. To cope with the shortfall in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
China versus India

China versus India

In a blog post on Indian demographics, someone asked: “The question is why India (per capita income $6,490 in 2016) is still so poor compared to China (per capita income $15,000 in 2016).” I have never really examined this, but it is an interesting enough question that I wanted to take a closer look at it.

In 1960, the World Bank put China’s GDP at $60 billion and India’s at $37 billion, making India’s GDP around 61% of China’s. Considering that India’s population was smaller, the two countries were clearly at similar levels of development. This was, of course, back in the day when China and India were having border fights in the Himalayas (as in 1962).

Over the next couple of decades, India actually closed in on China. In 1970, India’s GDP was 67% of China’s GDP. In 1980 it was 96%. In 1990, they had separated a little with India’s GDP being 88% of China’s. So for three decades they grew at similar rates. And then as you can see rather clearly from the chart below, China’s economy took off.

By the year 2000, China’s GDP was 2.6 times larger than India’s. India was at 38% of China. By 2010, the disparity widened, with China’s GDP now 3.7 times larger than India’s (27%). As of 2017, the disparity continued to grow with China’s GDP now 4.7 times larger (or 21% for India). This is a pretty significant change over time, with clearly most of the difference developing from 1990 to the present. It has been an amazing three decades for China.
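The ratios quoted above follow directly from the figures given. A quick sketch (GDP values in billions of current US dollars as stated in the text; for the later years only China-to-India multiples are given, so India's share is recovered by inverting them):

```python
# 1960 World Bank GDP figures from the text, in billions of US$
china_1960, india_1960 = 60, 37
print(f"1960: India at {india_1960 / china_1960:.1%} of China's GDP")
# 1960: India at 61.7% of China's GDP

# For later years the text gives China's GDP as a multiple of India's;
# inverting each multiple gives India's share of China's GDP.
china_to_india_multiple = {2000: 2.6, 2010: 3.7, 2017: 4.7}
for year, multiple in china_to_india_multiple.items():
    print(f"{year}: India at {1 / multiple:.0%} of China's GDP")
# 2000: 38%, 2010: 27%, 2017: 21%
```

These reproduce the 38%, 27% and 21% shares cited in the paragraph above.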

Of course, the last time we saw such amazing growth was Japan’s, up through 1995. Is China’s growth permanent and sustainable (like U.S. growth tends to be), or is it a bubble?

See:

Where Did Japan Go?

 

Copyright © 2018 BBBT - All Rights Reserved