All You Wanted to Know About AWS Certifications but Didn’t Know Whom to Ask | Simplilearn webinar starts 02-11-2017 23:00

Should you get an AWS certification? What are the different AWS certifications and which one is right for you? Whether you’re new to cloud computing or are planning a move into the world of cloud computing  - here’s your chance to find answers to some of the most frequently asked questions about AWS certification. Join Albert Anth...Read More.
All You Wanted to Know About AWS Certifications but Didn’t Know Whom to Ask | Simplilearn webinar starts 26-10-2017 21:00

Should you become AWS certified? What are the different AWS certifications and which one is right for you? Whether you’re new to cloud computing or are planning a move into the world of cloud computing  - here’s your chance to find answers to some of the most frequently asked questions about AWS certification. Join Albert Anthony,...Read More.
The Effects Of Dispersion On Combat

[The article below is reprinted from the December 1996 edition of The International TNDM Newsletter. A revised version appears in Christopher A. Lawrence, War by Numbers: Understanding Conventional Combat (Potomac Books, 2017), Chapter 13.]

The Effects of Dispersion on Combat
by Christopher A. Lawrence

The TNDM[1] does not play dispersion. But it is clear that dispersion has continued to increase over time, and this must have some effect on combat. This effect was identified by Trevor N. Dupuy in his various writings, starting with The Evolution of Weapons and Warfare. His graph in Understanding War of battle casualty trends over time is presented here as Figure 1. As dispersion changes (dramatically) over time, one would expect casualties to change over time as well. I therefore went back to the Land Warfare Database (the 605-engagement version[2]) and proceeded to look at casualties over time and dispersion from every angle that I could.

I eventually realized that I was going to need some better definition of the time periods I was measuring, as measuring by year scattered the data, measuring by century assembled the data in too gross a manner, and measuring by war left a confusing picture due to the number of small wars with only two or three battles in them in the Land Warfare Database. I eventually defined the wars into 14 categories, so I could fit them onto one readable graph:

To give some idea of how representative the battles listed in the LWDB were for covering the period, I have included a count of the number of battles listed in Michael Clodfelter’s two-volume book Warfare and Armed Conflict, 1618-1991. In the case of WWI, WWII and later, battles tend to be defined as a divisional-level engagement, and there were literally tens of thousands of those.

I then tested my data again looking at the 14 wars that I defined:

  • Average Strength by War (Figure 2)
  • Average Losses by War (Figure 3)
  • Percent Losses Per Day By War (Figure 4)
  • Average People Per Kilometer By War (Figure 5)
  • Losses per Kilometer of Front by War (Figure 6)
  • Strength and Losses Per Kilometer of Front By War (Figure 7)
  • Ratio of Strength and Losses per Kilometer of Front by War (Figure 8)
  • Ratio of Strength and Losses per Kilometer of Front by Century (Figure 9)

A review of average strengths over time by century and by war showed no surprises (see Figure 2). Up through around 1900, battles were easy to define: they were one- to three-day affairs between clearly defined forces at a locale. The forces had clear left and right flanks that were not bounded by other friendly forces. After 1900 (and in a few cases before), warfare was fought on continuous fronts, with a ‘battle’ often being a large multi-corps operation. It is no longer clearly understood what is meant by a battle, as the forces, area covered, and duration can vary widely. For the LWDB, each battle was defined as the analyst wished. In the case of WWI, there are a lot of very large battles which drive the average battle size up. In the case of WWII, there are a lot of division-level battles, which bring the average down. In the case of the Arab-Israeli Wars, there are nothing but division- and brigade-level battles, which bring the average down.

The interesting point to notice is that the average attacker strength in the 16th and 17th century is lower than the average defender strength. Later it is higher. This may be due to anomalies in our data selection.

Average losses by war (see Figure 3) suffer from the same battle definition problem.

Percent losses per day (see Figure 4) is a useful comparison through the end of the 19th century. After that, the battles get longer and the definition of the duration of a battle is up to the analyst. Note the very clear and definite downward pattern of percent losses per day from the Napoleonic Wars through the Arab-Israeli Wars. Here is a very clear indication of the effects of dispersion. It would appear that from the 1600s to the 1800s the pattern was effectively constant and level, then declines in a very systematic pattern. This partially contradicts Trevor Dupuy’s writing and graphs (see Figure 1). It does appear that after this period of decline the percent losses per day settle at a new, much lower plateau. Percent losses per day by war is attached.
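The measure being charted here is simple enough to sketch: casualties divided by starting strength and by battle duration. The engagement names and figures below are invented for illustration; they are not drawn from the LWDB.

```python
def percent_losses_per_day(strength, casualties, days):
    """Average percent of starting strength lost per day of battle."""
    return 100.0 * casualties / strength / days

# Hypothetical engagement records, purely illustrative.
engagements = [
    {"name": "Example A", "strength": 60_000, "casualties": 9_000, "days": 1},
    {"name": "Example B", "strength": 250_000, "casualties": 30_000, "days": 10},
]

for e in engagements:
    rate = percent_losses_per_day(e["strength"], e["casualties"], e["days"])
    print(f'{e["name"]}: {rate:.1f}% per day')
```

Note how the longer battle, despite higher absolute losses, shows a far lower daily rate, which is exactly why battle duration matters so much to this comparison.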

Turning to the actual subject, the dispersion of people (measured in people per kilometer of front) remained relatively constant from 1600 through the American Civil War (see Figure 5). Trevor Dupuy defined dispersion as the number of people in a box-like area. Unfortunately, I do not know how to measure that. I can clearly identify the left and right of a unit, but it is more difficult to tell how deep it is. Furthermore, density of occupation of this box is far from uniform, with a very forward bias. By the same token, fire delivered into this box is also not uniform, with a very forward bias. Therefore, I am quite comfortable measuring dispersion based upon unit frontage, more so than front multiplied by depth.

Note, when comparing the Napoleonic Wars to the American Civil War, that the dispersion remains about the same. Yet, if you look at the average casualties (Figure 3) and the average percent casualties per day (Figure 4), it is clear that the rate of casualty accumulation is lower in the American Civil War (this again partially contradicts Dupuy’s writings). There is no question that with the advent of the Minié ball, allowing for rapid-fire rifled muskets, the ability to deliver accurate firepower increased.

As you will also note, the average people per linear kilometer between WWI and WWII differs by a factor of a little over 1.5 to 1. Yet the actual difference in casualties (see Figure 4) is much greater. While one can just postulate that the difference is the change in dispersion squared (basically Dupuy’s approach), this does not seem to explain the complete difference, especially the difference between the Napoleonic Wars and the Civil War.

Instead of discussing dispersion, we should be discussing “casualty reduction efforts.” This basically consists of three elements:

  • Dispersion (D)
  • Increased engagement ranges (R)
  • More individual use of cover and concealment (C&C).

These three factors together result in a reduced chance to hit. They are also partially interrelated, as one cannot make more individual use of cover and concealment unless one is allowed to disperse. So, therefore, the need for cover and concealment increases the desire to disperse, and the process of dispersing allows one to use more cover and concealment.

Command and control is integrated into this construct as something that allows dispersion, and dispersion creates the need for better command and control. Therefore, improved command and control in this construct does not operate as a force modifier, but enables a force to disperse.

Intelligence becomes more necessary as the opposing forces use cover and concealment and the ranges of engagement increase. By the same token, improved intelligence allows you to increase the range of engagement and forces the enemy to use better concealment.

This whole construct could be represented by the diagram at the top of the next page.

Now, I may have said the obvious here, but this construct is probably provable in each individual element, and the overall outcome is measurable. Each individual connection between these boxes may also be measurable.

Therefore, to measure the effects of reduced chance to hit, one would need to measure the following formula (assuming these formulae are close to being correct):

(K * ΔD) + (K * ΔC&C) + (K * ΔR) = ΔH

(K * ΔC2) = ΔD

(K * ΔD) = ΔC&C

(K * ΔW) + (K * ΔI) = ΔR

K = a constant
Δ = the change in… (alias “Delta”)
D = Dispersion
C&C = Cover & Concealment
R = Engagement Range
W = Weapon’s Characteristics
H = the chance to hit
C2 = Command and control
I = Intelligence or ability to observe
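Taken literally, the formulas above can be transcribed as code. The constants K are placeholders to which the article assigns no values, so the function names and the numbers in the worked chain below are purely illustrative assumptions.

```python
# Literal transcription of the formulas above. Each K is a distinct
# placeholder constant; the article assigns no values to any of them.

def delta_range(k_w, k_i, d_weapons, d_intel):
    """(K * ΔW) + (K * ΔI) = ΔR"""
    return k_w * d_weapons + k_i * d_intel

def delta_dispersion(k_c2, d_c2):
    """(K * ΔC2) = ΔD -- better command and control enables dispersion."""
    return k_c2 * d_c2

def delta_cover(k_d, d_dispersion):
    """(K * ΔD) = ΔC&C -- dispersing allows more cover and concealment."""
    return k_d * d_dispersion

def delta_hit_chance(k_d, k_cc, k_r, d_dispersion, d_cover, d_range):
    """(K * ΔD) + (K * ΔC&C) + (K * ΔR) = ΔH"""
    return k_d * d_dispersion + k_cc * d_cover + k_r * d_range

# Illustrative chain with every constant set to 1: a unit improvement in
# command and control propagates into dispersion, cover, and hit chance.
d_d = delta_dispersion(1.0, 1.0)        # ΔD
d_cc = delta_cover(1.0, d_d)            # ΔC&C
d_r = delta_range(1.0, 1.0, 1.0, 1.0)   # ΔR
d_h = delta_hit_chance(1.0, 1.0, 1.0, d_d, d_cc, d_r)
print(d_h)  # 4.0
```

The chaining makes the interrelations explicit: command and control feeds dispersion, dispersion feeds cover and concealment, and all three reduced-chance-to-hit elements sum into the change in hit chance.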

Also, certain actions lead to a desire for certain technological and system improvements. This includes the effect of increased dispersion leading to a need for better C2 and increased range leading to a need for better intelligence. I am not sure these are measurable.

I have also shown in the diagram how the enemy impacts upon this. There is also an interrelated mirror image of this construct for the other side.

I am focusing on this because I really want to come up with some means of measuring the effects of a “revolution in warfare.” The last 400 years of human history have given us more revolutionary inventions impacting war than we can reasonably expect to see in the next 100 years. In particular, I would like to measure the impact of increased weapon accuracy, improved intelligence, and improved C2 on combat.

For the purposes of the TNDM, I would very specifically like to work out an attrition multiplier for battles before WWII (and theoretically after WWII) based upon reduced chance to be hit (“dispersion”). For example, Dave Bongard is currently using an attrition multiplier of 4 for the WWI engagements he is running for the battalion-level validation database.[3] No one can point to a piece of paper saying this is the value that should be used. Dave picked this value based upon experience and familiarity with the period.

I have also attached Average Losses per Kilometer of Front by War (see Figure 6 above), and a summary chart showing the two on the same chart (see Figure 7 above).

The values from these charts are:

The TNDM sets the WWII dispersion factor at 3,000 (which I gather translates into 30,000 men per square kilometer). The above data shows a linear dispersion of 2,992 men per kilometer of front, so this number parallels Dupuy’s figures.
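As a quick arithmetic check on that parallel (the factor-of-ten translation from the TNDM dispersion factor to men per square kilometer is taken from the text above):

```python
# The TNDM's WWII dispersion factor and the linear density computed from
# the data, both as stated in the text above.
tndm_dispersion_factor = 3_000
men_per_square_km = 10 * tndm_dispersion_factor   # 30,000 men per sq km
observed_linear_density = 2_992                   # men per km of front

# The observed linear density sits within about 0.3% of the TNDM figure,
# which is the sense in which the two numbers "parallel" each other.
relative_gap = abs(observed_linear_density - tndm_dispersion_factor) / tndm_dispersion_factor
print(f"{relative_gap:.4f}")  # 0.0027
```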

The final chart I have included is the Ratio of Strength and Losses per Kilometer of Front by War (Figure 8). Each line on the bar graph measures the average ratio of strength over casualties for either the attacker or defender. Being a ratio, unusual outcomes resulted in some really unusually high ratios. I took the liberty of taking out six data points because they appeared unusually lop-sided. Three of these points are from the English Civil War and were way out of line with everything else. These were the three Scottish battles where you had a small group of mostly sword-armed troops defeating a “modern” army. Also, Walcourt (1689), Front Royal (1862), and Calbritto (1943) were removed. I also have included the same chart, except by century (Figure 9).
Again, one sees a consistency in results over 300+ years of war, in this case going all the way through WWI, and then an entirely different pattern with WWII and the Arab-Israeli Wars.

A very tentative set of conclusions from all this is:

  1. Dispersion was relatively constant, and driven by factors other than firepower, from 1600 to 1815.
  2. Since the Napoleonic Wars, units have increasingly dispersed (found ways to reduce their chance to be hit) in response to increased lethality of weapons.
  3. As a result of this increased dispersion, casualties in a given space have declined.
  4. The ratio of this decline in casualties over area has been roughly proportional to the strength over an area from 1600 through WWI. Starting with WWII, it appears that people have dispersed faster than weapons lethality, and this trend has continued.
  5. In effect, people dispersed in direct relation to increased firepower from 1815 through 1920, and then after that time dispersed faster than the increase in lethality.
  6. It appears that since WWII, people have gone back to dispersing (reducing their chance to be hit) at the same rate that firepower is increasing.
  7. Effectively, there are four patterns of casualties in modern war:

Period 1 (1600 – 1815): Period of Stability

  • Short battles
  • Short frontages
  • High attrition per day
  • Constant dispersion
  • Dispersion decreasing slightly after late 1700s
  • Attrition decreasing slightly after mid-1700s.

Period 2 (1816 – 1905): Period of Adjustment

  • Longer battles
  • Longer frontages
  • Lower attrition per day
  • Increasing dispersion
  • Dispersion increasing slightly faster than lethality

Period 3 (1912 – 1920): Period of Transition

  • Long Battles
  • Continuous Frontages
  • Lower attrition per day
  • Increasing dispersion
  • Relative lethality per kilometer similar to past, but lower
  • Dispersion increasing slightly faster than lethality

Period 4 (1937 – present): Modern Warfare

  • Long Battles
  • Continuous Frontages
  • Low Attrition per day
  • High dispersion (perhaps constant?)
  • Relative lethality per kilometer much lower than in the past
  • Dispersion increased much faster than lethality going into the period.
  • Dispersion increased at the same rate as lethality within the period.

So the question is whether warfare of the next 50 years will see a new “period of adjustment,” where the rate of dispersion (and other factors) adjusts in direct proportion to increased lethality, or will there be a significant change in the nature of war?

Note that when I use the word “dispersion” above, I often mean “reduced chance to be hit,” which consists of dispersion, increased engagement ranges, and use of cover & concealment.

One of the reasons I wandered into this subject was to see if the TNDM can be used for predicting combat before WWII. I then spent the next few days attempting to find some correlation between dispersion and casualties. Using the data on historical dispersion provided above, I created a mathematical formulation and tested it against the actual historical data points, but could not get any type of fit.

I then looked at the length of battles over time, at one-day battles, and attempted to find a pattern. I could find none. I also looked at other permutations, but did not keep a record of my attempts. I then looked through the work done by Dean Hartley (Oak Ridge) with the LWDB and called Paul Davis (RAND) to see if anyone had found any correlation between dispersion and casualties, and they had not noted any.

It became clear to me that if there is any such correlation, it is buried so deep in the data that it cannot be found by any casual search. I suspect that I can find a mathematical correlation between weapon lethality, reduced chance to hit (including dispersion), and casualties. This would require some improvement to the data, some systematic measure of weapons lethality, and some serious regression analysis. I unfortunately cannot pursue this at this time.
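The kind of regression analysis described might look something like the following sketch. Everything here is an assumption for illustration: the lethality measure, the hit-reduction figures, and the multiplicative model are synthetic, not LWDB data.

```python
# Sketch of a regression relating casualties to weapon lethality and
# reduced chance to hit. All data below is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
lethality = rng.uniform(1, 10, n)         # hypothetical lethality measure
hit_reduction = rng.uniform(0.1, 1.0, n)  # reduced chance to hit (incl. dispersion)

# Assume casualties scale with lethality times chance to hit, plus noise.
casualties = 50 * lethality * hit_reduction + rng.normal(0, 10, n)

# Fit log(casualties) = a*log(lethality) + b*log(hit_reduction) + c
# by least squares; exponents near 1 would support a multiplicative model.
X = np.column_stack([np.log(lethality), np.log(hit_reduction), np.ones(n)])
y = np.log(np.clip(casualties, 1, None))
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)
```

With real data the interesting question would be whether the fitted exponents stay stable across the four casualty periods identified below, or shift at the WWI/WWII boundaries.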

Finally, for reference, I have attached two charts showing the duration of the battles in the LWDB in days (Figure 10, Duration of Battles Over Time, and Figure 11, A Count of the Duration of Battles by War).


[1] The Tactical Numerical Deterministic Model, a combat model developed by Trevor Dupuy in 1990-1991 as the follow-up to his Quantified Judgement Model. Dr. James G. Taylor and Jose Perez also contributed to the TNDM’s development.

[2] TDI’s Land Warfare Database (LWDB) was a revised version of a database created by the Historical Evaluation Research Organization (HERO) for the then-U.S. Army Concepts and Analysis Agency (now known as the U.S. Army Center for Army Analysis (CAA)) in 1984. Since the original publication of this article, TDI expanded and revised the data into a suite of databases.

[3] This matter is discussed in Christopher A. Lawrence, “The Second Test of the TNDM Battalion-Level Validations: Predicting Casualties,” The International TNDM Newsletter, April 1997, pp. 40-50.

How to Get The Most Out of Data Warehouses

Currently, small and medium-sized businesses have started embracing Big Data. Data warehouses give organizations a window into their historical performance and enable data analysis that surfaces customer behavior and business trends on a quarterly and annual basis. The need to capture and analyze data has helped expand the growth of the data warehouse industry. However, it's important to analyze your need for a data warehouse platform and determine its benefits before you decide whether to invest in it.

Investing in a data warehouse platform depends on your organization's specific needs. A warehouse platform deployment can be either a departmental or an enterprise-wide platform. For your big data analytics, you should also decide whether you'll integrate traditional data warehousing with online analytical processing (OLAP) uses. It's important to match the usage of data warehousing with the most appropriate data warehouse mediums.

Why your organization or business requires data warehousing

The general reason your organization needs data warehousing is that data is extracted in a continuous process and then copied to a specialized system, such as a data warehouse, for analysis in dashboards, business intelligence tools, and portals. ...

Read More on Datafloq
The Role of Big Data in Black Friday and Cyber Monday Shopping

Have you ever wondered how companies can afford to sell off their products at such low prices during the massive sales events, such as Black Friday or Cyber Monday? As you can suppose, none of their decisions is made randomly. To prepare for such events, retailers need to build a detailed pricing strategy, conduct customer segmentation and, most importantly, constantly collect the data on the purchases and customer engagement.

Here is how big data influences Black Friday and Cyber Monday.

How Has the Use of Data Changed over Time?

Retail has always relied heavily on big numbers. Traditionally, to prepare for the next season, stores focused on historical sales data to develop discounting strategies. If they saw that the sales could not beat the previous year’s digits, they would simply drop the prices. They would set the prices lower than those offered by their competitors, or address the needs of the consumers who could not afford their latest products by giving them huge discounts on the last-season models. Even though they still rely on these practices, the way retailers analyze their market and choose their prices has radically changed.

However, with the growth of advanced tech solutions, these businesses have taken some more intuitive factors ...

Read More on Datafloq
Student Spotlight: Paint Franchisee to Digital Marketing Entrepreneur | Simplilearn

Christina Stormson is the founder and partner at Pink Iguana, a company that offers website management, lead and review generation services for franchised painting contractors. What started as a quest to master fundamental skills in website development and SEO paved Christina’s way into the world of entrepreneurship. Simplilearn - Market Moti...Read More.
This Is Why We Need To Regulate AI

It’s time we start talking about AI regulation.

As the technology progresses at a rapid pace, it is a critical time for governments and policymakers to think about how we can safeguard against the effects of Artificial Intelligence on a social, economic and political scale. Artificial Intelligence is not inherently good or bad, but the way we use it could well be one or the other.

Unfortunately, such governing bodies have as yet paid little attention to the impact of this technology. We’re going to see huge changes to employment, privacy, and arms, to name a few, that if managed incorrectly or not at all could spell disaster. Handled correctly, with forward planning and proper regulation, the technology has the potential to better the future of our societies.

Elon Musk’s warnings have made headlines in recent months, as he urges the regulation of AI to be proactive rather than reactive for fears the latter would be much too late.  Whether you’re in the Musk or Zuckerberg camp, it’s undeniable that we need to consider all outcomes for society. 

It’s been a year since giants in the field of deep learning, Amazon, Facebook, Google, IBM and Microsoft, announced the launch of non-profit Partnership ...

Read More on Datafloq
How to Become a Digital Marketing Specialist – Learning Paths Explored | Simplilearn

The Digital Revolution is Here! The world is becoming increasingly digitized. More than two-thirds of the world’s population is now online, and the numbers are growing every day. With more than three billion global users, digital media now has a significant impact in our everyday lives. Organizations around the world are waking up to the op...Read More.
How Businesses Short on Time and Budget Invest in Team Training | Simplilearn

The maturity of a business, its size, leadership, financial stability and the industry it’s in all come into play when considering whether training internal staff is a worthy investment. Very few industries are excluded from the impact of digital transformation, and while most business owners, stakeholders, and executives are aware tha...Read More.
How to Solve the Metadata Challenge for Cross-industry Semantic Interoperability

This is part one of a multi-part series that addresses the need for a single semantic data model supporting the Internet of Things (IoT) and the digital transformation of buildings, businesses, and consumers. Such a model must be simple and extensible to enable plug-and-play interoperability and universal adoption across industries. All parts of the series are accessible at

IoT network abstraction layers and degrees of interoperability

Interoperability, or the ability of computer systems or software to exchange or make use of information[1], is a requirement of all devices participating in today’s information economy. Traditionally, interoperability has been defined mostly in the context of network communications. But with millions of devices being connected in industries ranging from smart home and building automation to smart energy and retail to healthcare and transportation, a broader definition is now required that considers the cross-domain impact of interoperability on system-to-system performance.

Perhaps the best-known example of a framework for defining network interoperability is provided by the Open System Interconnection (OSI) model, which serves as the foundation of telecommunications networks worldwide. The OSI model provides a framework for interoperability through seven distinct abstraction layers that isolate and define various communications functions, from the transmission of bits in physical media ...

Read More on Datafloq
What Happens When a Tesla S Reaches Very High Mileage?

Tesla cars might someday become the premier vehicles on the road. The world always moves forward and electric vehicles most definitely reflect where the car industry is headed. Electric vehicles are not novelties. The cars bring with them a great many positive benefits that traditional gasoline-fueled cars lack. Since electric vehicles are so new, not everyone knows what to expect from the cars. Questions about high-mileage Teslas may not get easy answers. After all, Teslas are quite new so there probably won't be too many models out there with significant mileage. Manufacturers do have an idea how to answer such questions. Among the more intriguing questions to ask is "What happens to a Tesla S once it hits 300,000 miles?"

The Tesloop Car

A company called Tesloop actually did put 300,000 miles on a Tesla Model S. The results were somewhat shocking. The vehicle rarely required any serious maintenance. Overall maintenance costs were only in the $10,000 range. Even at 300,000 miles, the vehicle operates well. A standard car would be unlikely to reach mileage that high on the original engine. Combustion engines are not designed to last that long.

Of course, an electric car does not run on a traditional engine. ...

Read More on Datafloq
Gear Up for the Smart Video Revolution

The technology revolution promises to deliver a lot of rewards for organizations that take it by the horns and implement it. The spread of the Internet across every facet of society has meant that the traditional approach to watching videos has changed. Research shows that the average American is still fond of watching videos on traditional television, to the tune of around five hours per day and a whopping 149 hours per month.

But are these stats a true representation of what is happening in American society? The average American citizen has cut down on TV viewing by six to eight hours per month. However, the interest in videos has not dwindled. Instead, it is now being directed towards what we come across on the Internet. During the last quarter of last year, the average American spent four extra hours per month streaming and watching videos over the Internet. The formats and the source might change, but the love for videos and what they offer will tend to remain constant over time.

The changing trends do suggest that we are indeed a part of the video revolution. Understanding the needs of ...

Read More on Datafloq
Raqqa Has Fallen

It would appear that Raqqa has fallen:

  1. This announcement comes from U.S.-backed militias.
  2. It was only a four-month battle (in contrast to Mosul).
  3. “A formal declaration of victory in Raqqa will be made soon”

This does appear to end the current phase of the Islamic State, which exploded out of the desert to take Raqqa and Mosul in the first half of 2014. It lasted less than 4 years. It was an interesting experiment for a guerilla movement to suddenly try to seize power in several cities and establish a functioning state in the middle of a war. Sort of gave conventional forces something to attack. You wonder if this worked to the advantage of ISIL in the long run or not.

I gather now that the stateless Islamic State will go back to being a guerilla movement. I am not sure what its long-term prognosis is. There is still a far-from-resolved civil war going on in Syria.

How Website Optimization Can Help to Manage Constant Data Analytics

In recent years, it’s become patently obvious that website optimization is the best way to jump up the rankings and put your website in the best position possible for the future. With the likes of Google Analytics, we can check the areas that need improvement and keep optimizing the website until a good number of people are visiting and clicking through the pages on a daily basis.

For a long time, data analytics have provided website owners with a great way to progress, but now we’re reaching a time where we have too much data. As strange as it sounds, we’re able to pull data from every interaction and every process, which makes it hard to know where to focus our attention.

Today, we’re going to be discussing the role website optimization has to play in managing constant data analytics.

Problems with Comparing

When we compare current data analytics, this can give good insight into how things have changed from one point in time to the next. Let’s say you’re using old data to assess how things have changed, due to website optimization, from February to March. What are you really comparing?

For both months, the data has come from a single moment ...

Read More on Datafloq
3 Big Data Privacy Risks and What They Mean to Everyday Users

When the internet was conceived, many people believed it was the pinnacle of digital communications. It enabled users to share information seamlessly despite being continents away. And as the online repository of unstructured data grew to a massive scale, technology pioneers began connecting the dots and took digital information-sharing to a whole new level.

Today, big data has become one of the most promising concepts in the tech space. You can find it everywhere — from music streaming services to hospitals that store medical records digitally. Big data analytics also enable businesses to refine their lead and customer engagement strategies as well as execute data-driven marketing campaigns.

But what if you’re an everyday user who’s never even heard of big data before? What if you’re simply an employee who’s in no position to worry about big data analytics?

Chances are, you might consider giving up some of your information to certain brands in exchange for personalized services and experiences. This, however, could open up gaping holes in your online security and privacy. After all, the World Wide Web is no stranger to embarrassing data breaches that endanger the information of users such as yourself.

Without further ado, here is a closer look at three ...

Read More on Datafloq
How to become a Big Data Hadoop Architect – Learning Paths Explored | Simplilearn

What does a Big Data Hadoop Architect do? Big Data Hadoop architects have evolved to become vital links between businesses and technology. They’re responsible for planning and designing next-generation big-data systems and managing large-scale development and deployment of Hadoop applications. Hadoop architects are among the highest paid p...Read More.
Full Scale Data Architects at DMZ 2017

As already mentioned in my previous blog post, I will give a talk on the first day of the Data Modeling Zone 2017 about temporal data in the data warehouse.

Another interesting talk will take place on the third day of the DMZ 2017: Martijn Evers will give a full day session about Full Scale Data Architects.

Ahead of this session there will be a Kickoff Event sponsored by I-Refact, data42morrow and TEDAMOH: at 6 pm on Tuesday, 24 October, after the second day of the Data Modeling Zone 2017, all interested people can meet up and join the launch of the German chapter of Full Scale Data Architects.

What Are The Potential Dangers Of Quantum Computing?

The development of quantum computers may create serious cyber-security threats. The NSA has recently released statements expressing concern over the potential of quantum computing to foil the cryptography protecting all data to date. Furthermore, the use of quantum computers may become widespread sooner than many people expect.

What Are The Potential Dangers Of Quantum Computing?

A powerful quantum computer could crack the cryptographic algorithms that keep our data safe. While managed detection and response services are highly effective at keeping today's data safe, these services wouldn't be able to protect data from a quantum computer. Quantum computers could even break the algorithms that are used by New York's stock exchange. This could lead to the collapse of the stock market.

How Do Quantum Computers Work?

Quantum computers use quantum bits (qubits), realized in subatomic-scale quantum systems, rather than bits made of silicon transistors. Because these systems are among the smallest in nature, quantum computers may be able to pack far more computational state into a device than conventional machines.
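
A rough way to see why qubit counts matter so much: an n-qubit register is described by 2^n complex amplitudes, whereas a classical n-bit register holds only one of 2^n values at a time. The sketch below is illustrative only and is not drawn from the article:

```python
# Illustrative sketch (not from the article): a register of n qubits is
# described by 2**n complex amplitudes, so its state space grows
# exponentially, while a classical n-bit register stores exactly one of
# 2**n possible values at any moment.

def classical_states(n_bits):
    """Number of values a classical register can take (one at a time)."""
    return 2 ** n_bits

def quantum_amplitudes(n_qubits):
    """Length of the amplitude vector describing an n-qubit state."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, classical_states(n), quantum_amplitudes(n))
```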

Quantum computers could have drives that contain hundreds of millions of terabytes or more. It's even possible that mobile devices could be this powerful!

Also, quantum computers use a code that is different from binary code. In fact, quarks ...

Read More on Datafloq
Blade Runner Rule: Public Perception and A.I. in Marketing and Brand Representation

The role of A.I. in business is ever-expanding, with companies increasingly relying on the nascent technology for everything from analysis of data streams to brand representation. However, Markets Insider reports that consumers might not be too keen on A.I. in business — at least, not if they don’t know about them.

A digital agency group called SYZYGY recently produced research titled “Sex, Lies, and A.I.”, which yielded some interesting findings:

“Research… reveals that 79% of Americans believe a new ‘Blade Runner rule’ is needed to make it illegal for A.I. applications such as social media bots, chatbots and virtual assistants to conceal their identity and pose as humans,” writes Markets Insider. “Nine in 10 (89%) of Americans believe that the use of A.I. in marketing should be regulated with a legally-binding code of conduct and almost three-quarters (71%) think that brands should need their explicit consent before using A.I. when marketing to them.”

This is an interesting find, considering that advertising professionals who are turning to A.I. are generally looking at how they can make these machines seem more human. Neil Davidson of HeyHuman argues that software that imbues machines with emotional intelligence (EI), a trait increasingly valued in the human workplace, is “the ...

Read More on Datafloq
Captured Records: World War I

When Shawn Woodford sent me that article on the Captured Iraqi Records, it reminded me of a half-dozen examples I had dealt with over the years. This will keep me blogging for a week or more. Let me start a hundred years ago, with World War I.

The United States signed a separate treaty with Germany after the end of World War I. We were not part of the much maligned Versailles Treaty (although we had to come back to Europe to help clean up the mess they made of the peace). As part of that agreement, we required access to the German military archives.

We used this access well. We put together a research team that included Lt. Colonel Walter Krueger. The United States plopped this team of researchers in Germany for a while and carefully copied all the records of the German units that the United States faced. This really covered only the last year of a war that lasted over four years (28 July 1914 – 11 November 1918). Krueger was in Germany with the team in 1922. This was a pretty significant effort back in the days before Xerox machines, microfilm, scanners, and other such tools.

These records were later deposited in the U.S. National Archives. So one can access those records now, and will find the records of the U.S. units involved and the German units involved (much of the latter translated), along with maps and descriptions of the fighting. It is all nicely assembled for researchers. It is a very meticulous and nicely done collection.

Just to add to the importance of this record collection, the German archives in Potsdam were bombed by the RAF in April 1945, destroying most of their World War I records (this raid on 14/15 April: apr45 and also: bundesarchiv). So, one cannot now go back and look up these German records. For many German units from World War I, the only surviving primary-source combat records are the copies (translated and typed from the originals) made by the U.S. researchers and held in the U.S. Archives.

Lt. Colonel Krueger was fluent in German because of his family background (he was born in Prussia). During World War II, he rose to be a full general (four-star general) in command of the Sixth Army in the Pacific.

This effort, almost a hundred years ago, set the standard for what could and should be done with captured records. A later post will discuss the World War II effort.

The Sad Story Of The Captured Iraqi DESERT STORM Documents


U.S. Army Updates Draft Multi-Domain Battle Operating Concept

The U.S. Army Training and Doctrine Command has released a revised draft version of its Multi-Domain Battle operating concept, titled “Multi-Domain Battle: Evolution of Combined Arms for the 21st Century, 2025-2040.” Clearly a work in progress, the document is listed as version 1.0, dated October 2017, and as a draft and not for implementation. Sydney J. Freedberg, Jr. has an excellent run-down on the revision at Breaking Defense.

The update is the result of the initial round of work between the U.S. Army and U.S. Air Force to redefine the scope of the multi-domain battlespace for the Joint Force. More work will be needed to refine the concept, but it shows remarkable cooperation in forging a common warfighting perspective between services long-noted for their independent thinking.

On a related note, Albert Palazzo, an Australian defense thinker and one of the early contributors to the Multi-Domain Battle concept, has published the first of a series of articles at The Strategy Bridge offering constructive criticism of the U.S. military’s approach to defining the concept. Palazzo warns that the U.S. may be over-emphasizing countering potential Russian and Chinese capabilities and paying too little attention to the broad general implications of long-range fires with global reach.

What difference can it make if those designing Multi-Domain Battle are acting on possibly the wrong threat diagnosis? Designing a solution for a misdiagnosed problem can result in the inculcation of a way of war unsuited for the wars of the future. One is reminded of the French Army during the interwar period. No one can accuse the French of not thinking seriously about war during these years, but, in the doctrine of the methodical battle, they got it wrong and misread the opportunities presented by mechanisation. There were many factors contributing to France’s defeat, but at their core was a misinterpretation of the art of the possible and a singular focus on a particular way of war. Shaping Multi-Domain Battle for the wrong problem may see the United States similarly sow the seeds for a military disaster that is avoidable.

He suggests that it would be wise for U.S. doctrine writers to take a more considered look at potential implications before venturing too far ahead with specific solutions.

No, Not Everyone Needs To Understand Analytics

The breadth of analytics has certainly increased in recent years. So, too, has the pool of people who dip their toe into creating analytics of one sort or the other. The trends toward democratization of data and self-service analytical capabilities are powerful and both have driven a lot of value for organizations in recent years. At the same time, it is possible to go too far. I get concerned when I hear the suggestion that everyone in the organization needs to create, use, and understand analytics. Many people don’t (and shouldn’t!) understand analytics at all.

Who Needs to Understand Analytics?

There are absolutely people within an organization who must understand how analytics work. Many of those people already have the need and the skills to create their own analytics. In most companies today, the number of people who have analytical toolsets of some kind and data access beyond standardized reports is growing rapidly. However, in most cases, it is still a relatively small number as a percentage of all employees. This is how it should be.

The fact is that many people have no training in, understanding of, or interest in analytics. It makes no sense to try to get them deeply involved. ...

Read More on Datafloq
Top Four Competencies a Data Scientist Should Have

Jobs in data science are quite lucrative in terms of pay packages and job exposure, but the role of a data scientist may vary slightly from one company to another. For example, in the Indian job market, almost every Java software development company that deals with data science and data management recruits data scientists, but the role varies depending on the nature of the data the company handles. This year unveils enormous job prospects for data scientists, provided they acquire these four competencies.

Proficiency in quantitative analysis

Quantitative analysis is one of the most pivotal skills of a data scientist, helping them draw knowledge from a data set in three major areas. These are:

Experimental design and related analysis: This mode of analysis works for consumer market-related data, where a data scientist can help a lot.
Machine learning: This is one of the most intricate roles a data scientist plays. A data scientist helps create prototypes for testing assumptions, selects and engineers the needed features, and identifies areas of strength and opportunity in existing machine learning systems.
Simulation of complex economic or growth systems: In this area, a data scientist checks ...

Read More on Datafloq
How Blockchain Will Bring Back Data Ownership to Consumers

Blockchain is set to change data ownership. It will help restore data control to users by empowering them to determine who has access to their information online. It is a paradigm shift in how we deal with data, and it will offer consumers much-needed control over their own data.

Blockchain Will Drive the Need for Change

Change is required for various reasons, including security and privacy concerns. A 2016 Pew Research study revealed that 74 percent of its participants ranked control over who can access their information online as a primary concern. The study also revealed that transparency of data collection is a concern. Often, consumers do not even realize that they grant companies permission to use their data simply by using their applications. Websites and applications served as some of the first collectors of personal data. Now data can be collected in many ways, including via smart devices and vehicles.

The issues of privacy, security, and transparency can grow when third-party vendors who create accessories or supplemental services have access to consumer information. Moreover, the amount of data produced every day is increasing exponentially. Every person is expected to produce 1.7 megabytes of new data per second ...
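
Taking the 1.7-megabytes-per-second figure at face value, a quick back-of-envelope calculation (assuming decimal megabytes) gives the implied daily volume per person:

```python
# Back-of-envelope check of the oft-quoted per-person data rate above,
# assuming decimal (SI) megabytes: 1.7 MB of new data per second.
MB_PER_SECOND = 1.7
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400 seconds in a day

mb_per_day = MB_PER_SECOND * SECONDS_PER_DAY
gb_per_day = mb_per_day / 1000          # decimal gigabytes

print(f"{mb_per_day:,.0f} MB/day ≈ {gb_per_day:.1f} GB per person per day")
```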

Read More on Datafloq
3 Ways CRM Real-time Data Can Improve Your Business

Customer relationship management (CRM) platforms are an important modern business asset. The software automates many internal operations, from lead generation to sales tracking and client management. CRM applications integrate data from devices, products, forms, and other business apps to enable data insights for more centralized management. Here are three key benefits of adopting CRM software.

Organize customer data

As a business' client base grows, so does the need for a centralized management platform. It's hard to imagine modern corporations operating without an enabling customer and data management tool. CRM platforms store data in backend tables and allow employees to access them via a user-friendly interface. For example, clients might customize their CRM dashboard to display relevant graphs on leads, sales, and monthly summaries.

CRM reporting tools also allow companies to chart data and measure their performance. Employees can create reports that organize sales by product and client, for instance, or reports that populate forms with relevant data to present to partners and clients. Even if a default CRM application doesn't have an embedded reporting tool, users can integrate the CRM database with a third-party reporting tool like Microsoft SQL Server Reporting Services.
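
A minimal sketch of the kind of report described above, grouping raw sales rows by product and by client. The field names and figures are invented for illustration and do not come from any particular CRM product:

```python
# Minimal sketch: totaling CRM sales rows by product and by client.
# All field names and amounts are hypothetical.
from collections import defaultdict

sales = [
    {"client": "Acme", "product": "Widget", "amount": 1200.0},
    {"client": "Acme", "product": "Gadget", "amount": 450.0},
    {"client": "Globex", "product": "Widget", "amount": 800.0},
]

def totals_by(rows, key):
    """Sum the 'amount' field of each row, grouped by the given key."""
    out = defaultdict(float)
    for row in rows:
        out[row[key]] += row["amount"]
    return dict(out)

print(totals_by(sales, "product"))  # {'Widget': 2000.0, 'Gadget': 450.0}
print(totals_by(sales, "client"))   # {'Acme': 1650.0, 'Globex': 800.0}
```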

Get actionable data insights

CRM applications can incorporate machine learning algorithms ...

Read More on Datafloq
TPC-DS at 100TB and 10TB Scale Now Available in Snowflake’s Samples

We are happy to announce that a full 100 TB version of the TPC-DS data, along with samples of all 99 of the benchmark’s queries, is now available to all Snowflake customers for exploration and testing.
New Snowflake features released in Q2’17 

It has been an incredible few months at Snowflake with the introduction of self-service, the data sharehouse, and numerous other features added in the last quarter.
Why Blockchain-as-a-Service Should Replace Servers and the Cloud

Blockchain and big data are very similar in one big way: it’s all about how you use them. Much like the Force in Star Wars, blockchain and big data are powerful tools that can be used to serve the dark or the good side. You could say they’re two sides of the same coin. While blockchain is still emerging into the light, you might not think of big data in this way because plenty of very legitimate organizations use it for legitimate purposes; but you can be sure plenty of dark web denizens view big data as a dark tool.

The Dark Side of the Coin   

There’s a huge question mark hovering over the fate of millions of people. Without blockchain and big data, that question mark wouldn’t be quite as big as it is.

Consumer identity data from the Equifax breach is already popping up for sale on the dark web. This is lifetime data—Social Security numbers, names, addresses, credit card numbers—from 143 million Americans and 67 percent of UK citizens. Terbium Labs reports that a Tor Hidden Service on the dark web claimed to have the entire cache of data less than 12 hours after the Equifax breach, and demanded ...

Read More on Datafloq
TDI Friday Read: Principles Of War & Verities Of Combat


Trevor Dupuy distilled his research and analysis on combat into a series of verities, or what he believed were empirically-derived principles. He intended for his verities to complement the classic principles of war, a slightly variable list of maxims of unknown derivation and provenance, which describe the essence of warfare largely from the perspective of Western societies. These are summarized below.

What Is The Best List Of The Principles Of War?

The Timeless Verities of Combat

Trevor N. Dupuy’s Combat Attrition Verities

Trevor Dupuy’s Combat Advance Rate Verities

Military History and Validation of Combat Models

Soldiers from Britain’s Royal Artillery train in a “virtual world” during Exercise Steel Sabre, 2015 [Sgt Si Longworth RLC (Phot)/MOD]

Military History and Validation of Combat Models

A Presentation at MORS Mini-Symposium on Validation, 16 Oct 1990

By Trevor N. Dupuy

In the operations research community there is some confusion as to the respective meanings of the words “validation” and “verification.” My definition of validation is as follows:

“To confirm or prove that the output or outputs of a model are consistent with the real-world functioning or operation of the process, procedure, or activity which the model is intended to represent or replicate.”

In this paper the word “validation” with respect to combat models is assumed to mean assurance that a model realistically and reliably represents the real world of combat. Or, in other words, given a set of inputs which reflect the anticipated forces and weapons in a combat encounter between two opponents under a given set of circumstances, the model is validated if we can demonstrate that its outputs are likely to represent what would actually happen in a real-world encounter between these forces under those circumstances.

Thus, in this paper, the word “validation” has nothing to do with the correctness of computer code, or the apparent internal consistency or logic of relationships of model components, or with the soundness of the mathematical relationships or algorithms, or with satisfying the military judgment or experience of one individual.

True validation of combat models is not possible without testing them against modern historical combat experience. And so, in my opinion, a model is validated only when it will consistently replicate a number of military history battle outcomes in terms of: (a) Success-failure; (b) Attrition rates; and (c) Advance rates.

“Why,” you may ask, “use imprecise, doubtful, and outdated history to validate a modern, scientific process? Field tests, experiments, and field exercises can provide data that is often instrumented, and certainly more reliable than any historical data.”

I recognize that military history is imprecise; it is only an approximate, often biased and/or distorted, and frequently inconsistent reflection of what actually happened on historical battlefields. Records are contradictory. I also recognize that there is an element of chance or randomness in human combat which can produce different results in otherwise apparently identical circumstances. I further recognize that history is retrospective, telling us only what has happened in the past. It cannot predict, if only because combat in the future will be fought with different weapons and equipment than were used in historical combat.

Despite these undoubted problems, military history provides more, and more accurate information about the real world of combat, and how human beings behave and perform under varying circumstances of combat, than is possible to derive or compile from any other source. Despite some discrepancies, patterns are unmistakable and consistent. There is always a logical explanation for any individual deviations from the patterns. Historical examples that are inconsistent, or that are counter-intuitive, must be viewed with suspicion as possibly being poor or false history.

Of course absolute prediction of a future event is practically impossible, although not necessarily so theoretically. Any speculations which we make from tests or experiments must have some basis in terms of projections from past experience.

Training or demonstration exercises, proving ground tests, field experiments, all lack the one most pervasive and most important component of combat: Fear in a lethal environment. There is no way in peacetime, or non-battlefield, exercises, tests, or experiments to be sure that the results are consistent with what would have been the behavior or performance of individuals or units or formations facing hostile firepower on a real battlefield.

We know from the writings of the ancients (for instance Sun Tze—pronounced Sun Dzuh—and Thucydides) that have survived to this day that human nature has not changed since the dawn of history. The human factor, the way in which humans respond to stimuli or circumstances, is the most important basis for speculation and prediction. What about the “scientific” approach of those who insist that we can have no confidence in the accuracy or reliability of historical data, that it is therefore unscientific, and therefore that it should be ignored? These people insist that only “scientific” data should be used in modeling.

In fact, every model is based upon fundamental assumptions that are intuitive and unprovable. The first step in the creation of a model is a step away from scientific reality in seeking a basis for an unreal representation of a real phenomenon. I have shown that the unreality is perpetuated when we use other imitations of reality as the basis for representing reality. History is less than perfect, but to ignore it, and to use only data that is bound to be wrong, assures that we will not be able to represent human behavior in real combat.

At the risk of repetition, and even of protesting too much, let me assure you that I am well aware of the shortcomings of military history:

The record which is available to us, which is history, only approximately reflects what actually happened. It is incomplete. It is often biased, it is often distorted. Even when it is accurate, it may be reflecting chance rather than normal processes. It is neither precise nor consistent. But, it provides more, and more accurate, information on the real world of battle than is available from the most thoroughly documented field exercises, proving ground tests, or laboratory or field experiments.

Military history is imperfect. At best it reflects the actions and interactions of unpredictable human beings. We must always realize that a single historical example can be misleading for either of two reasons: (1) The data may be inaccurate, or (2) The data may be accurate, but untypical.

Nevertheless, history is indispensable. I repeat that the most pervasive characteristic of combat is fear in a lethal environment. For all of its imperfections, military history and only military history represents what happens under the environmental condition of fear.

Unfortunately, and somewhat unfairly, the reported findings of S.L.A. Marshall about human behavior in combat, which he reported in Men Against Fire, have been recently discounted by revisionist historians who assert that he never could have physically performed the research on which the book’s findings were supposedly based. This has raised doubts about Marshall’s assertion that 85% of infantry soldiers didn’t fire their weapons in combat in World War II. That dramatic and surprising assertion was first challenged in a New Zealand study which found, on the basis of painstaking interviews, that most New Zealanders fired their weapons in combat. Thus, either Americans were different from New Zealanders, or Marshall was wrong. And now American historians have demonstrated that Marshall had had neither the time nor the opportunity to conduct his battlefield interviews which he claimed were the basis for his findings.

I knew Marshall moderately well. I was fully as aware of his weaknesses as of his strengths. He was not a historian. I deplored the imprecision and lack of documentation in Men Against Fire. But the revisionist historians have underestimated the shrewd journalistic assessment capability of “SLAM” Marshall. His observations may not have been scientifically precise, but they were generally sound, and his assessment has been shared by many American infantry officers whose judgments I also respect. As to the New Zealand study, how many people will, after the war, admit that they didn’t fire their weapons?

Perhaps most important, however, in judging the assessments of SLAM Marshall, is a recent study by a highly-respected British operations research analyst, David Rowland. Using impeccable OR methods Rowland has demonstrated that Marshall’s assessment of the inefficient performance, or non-performance, of most soldiers in combat was essentially correct. An unclassified version of Rowland’s study, “Assessments of Combat Degradation,” appeared in the June 1986 issue of the Royal United Services Institution Journal.

Rowland was led to his investigations by the fact that soldier performance in field training exercises, using the British version of MILES technology, was not consistent with historical experience. Even after allowances for degradation from theoretical proving ground capability of weapons, defensive rifle fire almost invariably stopped any attack in these field trials. But history showed that attacks were, in fact, usually successful. He therefore began a study in which he made both imaginative and scientific use of historical data from over 100 small unit battles in the Boer War and the two World Wars. He demonstrated that when troops are under fire in actual combat, there is an additional degradation of performance by a factor ranging between 7 and 10. A degradation virtually of an order of magnitude! And this, mind you, on top of a comparable built-in degradation to allow for the difference between field conditions and proving ground conditions.
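
To make the arithmetic of that degradation chain concrete, here is a hedged sketch. The proving-ground hit rate and the field-trial degradation factor are invented numbers; only the roughly 7x-10x additional combat degradation comes from Rowland's finding as described above:

```python
# Illustrative arithmetic for the degradation chain: proving ground ->
# field trial -> actual combat. The first two numbers are hypothetical.
proving_ground_rate = 0.50          # hypothetical hits per round on the range
field_trial_factor = 3.0            # hypothetical range-to-field degradation
combat_factor_range = (7.0, 10.0)   # Rowland's additional combat degradation

field_rate = proving_ground_rate / field_trial_factor
combat_rates = [field_rate / f for f in combat_factor_range]

print(f"field-trial rate ≈ {field_rate:.3f}")
print(f"combat rate ≈ {combat_rates[1]:.4f} to {combat_rates[0]:.4f}")
```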

Not only does Rowland’s study corroborate SLAM Marshall’s observations, it showed conclusively that field exercises, training competitions and demonstrations, give results so different from real battlefield performance as to render them useless for validation purposes.

Which brings us back to military history. For all of the imprecision, internal contradictions, and inaccuracies inherent in historical data, at worst the deviations are generally far less than a factor of 2.0. This is at least four times more reliable than field test or exercise results.

I do not believe that history can ever repeat itself. The conditions of an event at one time can never be precisely duplicated later. But, bolstered by the Rowland study, I am confident that history paraphrases itself.

If large bodies of historical data are compiled, the patterns are clear and unmistakable, even if slightly fuzzy around the edges. Behavior in accordance with this pattern is therefore typical. As we have already agreed, sometimes behavior can be different from the pattern, but we know that it is untypical, and we can then seek the reason, which invariably can be discovered.

This permits what I call an actuarial approach to data analysis. We can never predict precisely what will happen under any circumstances. But the actuarial approach, with ample data, provides confidence that the patterns reveal what is to happen under those circumstances, even if the actual results in individual instances vary to some extent from this “norm” (to use the Soviet military historical expression).

It is relatively easy to take into account the differences in performance resulting from new weapons and equipment. The characteristics of the historical weapons and the current (or projected) weapons can be readily compared, and adjustments made accordingly in the validation procedure.

In the early 1960s an effort was made at SHAPE Headquarters to test the ATLAS Model against World War II data for the German invasion of Western Europe in May, 1940. The first excursion had the Allies ending up on the Rhine River. This was apparently quite reasonable: the Allies substantially outnumbered the Germans, they had more tanks, and their tanks were better. However, despite these Allied advantages, the actual events in 1940 had not matched what ATLAS was now predicting. So the analysts did a little “fine tuning” (a splendid term for fudging). After the so-called adjustments, they tried again, and ran another excursion. This time the model had the Allies ending up in Berlin. The analysts (may the Lord forgive them!) were quite satisfied with the ability of ATLAS to represent modern combat. (Or at least they said so.) Their official conclusion was that the historical example was worthless, since weapons and equipment had changed so much in the preceding 20 years!

As I demonstrated in my book, Options of Command, the problem was that the model was unable to represent the German strategy, or to reflect the relative combat effectiveness of the opponents. The analysts should have reached a different conclusion. ATLAS had failed validation because a model that cannot with reasonable faithfulness and consistency replicate historical combat experience, certainly will be unable validly to reflect current or future combat.

How then, do we account for what I have said about the fuzziness of patterns, and the fact that individual historical examples may not fit the patterns? I will give you my rules of thumb:

  1. The battle outcome should reflect historical success-failure experience about four times out of five.
  2. For attrition rates, the model average of five historical scenarios should be consistent with the historical average within a factor of about 1.5.
  3. For the advance rates, the model average of five historical scenarios should be consistent with the historical average within a factor of about 1.5.
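
These rules of thumb can be expressed as a simple programmatic check. The five-scenario sample numbers below are invented; only the thresholds (four of five outcomes, factor-of-1.5 agreement on averages) come from the rules above:

```python
# Sketch of the three validation rules of thumb as a yes/no check.

def within_factor(model_avg, hist_avg, factor=1.5):
    """True if the two (positive) averages agree within the given factor."""
    hi, lo = max(model_avg, hist_avg), min(model_avg, hist_avg)
    return hi / lo <= factor

def validate(outcome_matches, model_attrition, hist_attrition,
             model_advance, hist_advance):
    """Rule 1: >= 4 of 5 outcomes match; rules 2-3: averages within 1.5x."""
    outcomes_ok = sum(outcome_matches) / len(outcome_matches) >= 0.8
    return (outcomes_ok
            and within_factor(model_attrition, hist_attrition)
            and within_factor(model_advance, hist_advance))

# Hypothetical five-scenario run: 4 of 5 outcomes match; the attrition
# averages agree within 1.2x and the advance averages within 1.34x.
print(validate([True, True, True, False, True], 4.2, 3.5, 12.0, 9.0))  # True
```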

Just as the heavens are the laboratory of the astronomer, so military history is the laboratory of the soldier and the military operations research analyst. The scientific basis for both astronomy and military science is the recording of the movements and relationships of bodies, and then analysis of those movements. (In the one case the bodies are heavenly, in the other they are very terrestrial.)

I repeat: Military history is the laboratory of the soldier. Failure of the analyst to use this laboratory will doom him to live with the scientific equivalent of Ptolemaic astronomy, whereas he could use the evidence available in his laboratory to progress to the military science equivalent of Copernican astronomy.

20 Sample PMP® Questions and Answers | Simplilearn

The PMP®, or Project Management Professional, exam is conducted by the Project Management Institute (PMI)® and is a globally recognized certification. The exam consists of 200 multiple-choice questions covering the five process groups (Initiating, Planning, Executing, Monitoring and Controlling, and Closing) and nine knowledge areas (In...Read More.
McMaster vs Spector on Vietnam

Lt. General H. R. McMaster, the U.S. National Security Advisor, wrote a doctoral dissertation on Vietnam that was published in 1997 as Dereliction of Duty: Lyndon Johnson, Robert McNamara, the Joint Chiefs of Staff, and the Lies That Led to Vietnam. Ronald Spector, a former Marine, Vietnam veteran, and historian, just published this article that caught my interest: What McMaster Gets Wrong About Vietnam

What caught my interest was the discussion by Spector, very brief, that the Vietnamese had something to do with the Vietnam war. Not an earthshaking statement, but certainly a deserved poke at the more American-centric view of the war.

In my book, America’s Modern Wars, I do have a chapter called “The Other Side” (Chapter 18). As I note in the intro to that chapter (page 224):

Warfare is always a struggle between at least two sides. Yet, the theoretical study of insurgencies always seems to be written primarily from the standpoint of one side, the counterinsurgents. We therefore briefly looked at what the other side was saying to see if there were any theoretical constructs that were proposed or supported by them. They obviously knew as much about insurgencies as the counterinsurgents.

We then examined the writings and interview transcripts of eight practitioners of insurgency and ended up trying to summarize their thoughts in one barely “easy-to-read” table (pages 228-229), the same as we did for ten counterinsurgent theorists (pages 187-201). The conclusion to this discussion was (pages 235-236):

The review of the insurgents shows an entirely different focus as to what is important in an insurgency than one gets from reading the “classical” counterinsurgent theorists. In the end, the insurgent is primarily focused on the cause. The military aspects of the insurgency seem to be secondary concerns… On the other hand, the majority of the insurgents we reviewed actually won or managed favorable results from their wars in the long run (this certainly applies to Grivas and Itote). Perhaps their focus on the political cause, with the military aspects secondary, is an indication of the correct priorities.

I do have a chapter on Vietnam in the book also (Chapter 22).

How MIT Researchers Use Drone Fleets to Track Warehouse Inventory

How MIT Researchers Use Drone Fleets to Track Warehouse Inventory

What should you know about MIT Researchers’ use of Drones?

Many companies are becoming aware of the need to automate warehouse inventory tracking in their businesses. Tracking warehouse stock is one of the major challenges every business faces, including the best-run enterprises. Reports indicate that businesses suffer losses totaling more than $45 billion due to lost items. The MIT researchers seek to ensure that businesses can cut down losses related to lost items by incorporating drone technology into inventory tracking. They are optimistic that drone fleets could be the solution for businesses that struggle to track their warehouse inventories.

What are the advantages of incorporating drones in tracking warehouse inventories?

It makes locating stock in warehouse inventories easy. The drones have the capacity to fly around without any security concern and read the RFID tags on warehouse inventories from a distance, with a margin of error of about 19 cm, according to the report by the MIT researchers. The findings indicate that incorporating drones into warehouse management will make it easy to locate items. The MIT ...

Read More on Datafloq
8 Key Steps in Reacting to a Server Outage

8 Key Steps in Reacting to a Server Outage

One of the worst things that can happen to a data manager is an incident that causes server downtime. When your server is down, you may not be able to access the apps or the information you need to do your job, your clients and customers may not be able to reach your app, and worst of all, your data may become temporarily vulnerable.

Fortunately, there are some key steps you can take to restore service as quickly as possible, identify the root cause of the problem, and prevent future outages.

The Incident Management Methodology

Responding to and correcting server-related problems falls under the umbrella of technical incident management. Incident management is a system designed to be consistent and repeatable, so you follow the same protocols each time to address the problem as quickly as possible. This is beneficial because it reduces the possibility of improvisation (which is dangerous when you're in panic mode) and gives you a template for improvement with each incident it's used for. With 98 percent of organizations reporting that a single hour of downtime costs more than $100,000, every minute you save with incident management is valuable.
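As a back-of-the-envelope check on that claim, the hourly figure cited above translates into a per-minute cost like so:

```python
# If an hour of downtime costs at least $100,000, each minute saved by a
# consistent, repeatable incident-management process is worth roughly:
cost_per_hour = 100_000
cost_per_minute = cost_per_hour / 60
print(round(cost_per_minute))  # → 1667
```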

Actionable Steps to Take

Now, let’s look at the individual steps you’ll ...

Read More on Datafloq
NATO’S Black Sea Force

NATO’S Black Sea Force

This article caught my attention: NATO launches Black Sea force as latest counter to Russia

It consists of a Romanian brigade of up to 4,000 soldiers plus troops from nine other NATO countries (including Poland, Bulgaria, Italy, Portugal, Germany, Britain, and Canada). In addition, there is a separate deployment of 900 U.S. troops in the area.

During the cold war, there was only one NATO member on the Black Sea, Turkey, but there were three Warsaw Pact members (Soviet Union, Romania and Bulgaria). Now there are three NATO members (Turkey, Romania and Bulgaria), several countries who have a Russian-supported separatist enclave or two in them  (Ukraine, Georgia, Moldova) and, of course, Russia. It has become an interesting area.


Semi-Autonomous Cars: Worries and Fears Driving Tesla-S and Similar Cars

Semi-Autonomous Cars: Worries and Fears Driving Tesla-S and Similar Cars

Science-fiction movies love to show off cars that drive themselves. In the original Total Recall, self-driving cars were even used to squeeze a little humor into the action film. For Tesla owners, there might not be much to laugh about when it comes to the semi-autonomous features of the vehicle. Not everyone feels comfortable with the automatic driving capabilities found in Teslas and other cars. In time, however, such attitudes may change. Understanding the hows and whys of semi-autonomous cars could make them less frightening.

The Concept of Semi-Autonomous Cars

Semi-autonomous cars present a very valuable set of features, capable of helping drivers avoid mishaps and calamities on the road. Semi-autonomous is not the same thing as self-driving. Semi-autonomous cars do handle some of the tasks associated with driving, but they require the driver to do some work as well. So the vehicle does not take on all the responsibilities of driving; rather, a semi-autonomous car lends an assist. For those nervous about allowing a Tesla to run "on its own," the semi-autonomous features give them some control.

The Purpose of Semi-Autonomous Cars

The main purpose of semi-autonomous cars is not just to make driving these vehicles more leisurely. One huge ...

Read More on Datafloq
The BBBT Sessions: Outlier, and the Importance of Being One

The BBBT Sessions: Outlier, and the Importance of Being One

It has been some time since my last write-up about my briefings with the Boulder Business Intelligence Brain Trust (BBBT); multiple business engagements and, yes, perhaps a bit of laziness can be blamed for it.

Now I firmly intend to get back to covering this series of great analyst sessions on a more regular basis, hoping, of course, that my hectic life will not stand in my way.

So, to resume my coverage of this great series of sessions with software vendors and analysts, I have picked one that, while not that recent, was especially significant for the BBBT group and the vendor itself. I’m talking about a new addition to the analytics and BI landscape called Outlier.

Members of the BBBT and I had the pleasure of witnessing the official launch of this new analytics and business intelligence (BI) company and its solution to the market.

Outlier presented its solution to our analyst gathering in an appealing session. So here is a summary of the session and some info about this newcomer to the BI and analytics space.

About Outlier

Outlier, the company, was founded in 2015 in Oakland, CA by seasoned tech entrepreneur Sean Byrnes (CEO) and experienced data scientist Mike Kim (CTO), with funding from First Round Capital, Homebrew, and Susa Ventures.

After devoting more than a year to developing the new solution, Outlier kept it in beta through most of 2016 before finally releasing it in February 2017, aiming to offer users a unique approach to BI and analytics.

With its product named after the company, Outlier aims to be, well, precisely that, by offering a different approach to analytics, so that it:

“Monitors your business data and notifies you when unexpected changes occur.”

This means that, rather than taking a reactive approach in which the system waits for the business user to launch the analytics process, the system takes a proactive approach and signals or alerts when these changes occur, triggering action from analysts.

Now, to be honest, this is not the first time I have heard this claim from a vendor. Frankly, as many modern BI solutions incorporate more sophisticated alerting mechanisms and functionality, I am less concerned with hearing the claim and more with discovering how each software provider addresses the issue of making analytics and BI solutions proactive.

During the session, Sean Byrnes and Doug Mitarotonda, CEO and Head of Customer Development respectively, gave us a great overview of Outlier's new approach to BI and analytics. Here is a summary of this great briefing.

Outlier AI and a New Analytics Value Chain

Being data scientists themselves, Outlier's team understands the hardships, complexities, and pains data scientists and business analysts undergo to design, prepare, and deploy BI and analytics solutions. With this in mind, and aiming to take a fresh approach, Outlier was born to provide a new approach to business intelligence.

Rather than creating dashboards or running queries against business data, Outlier's approach is to consistently and automatically watch business data and alert when unexpected changes occur.

To do this, Outlier connects directly to a number of business data sources such as Google Analytics, Adobe Cloud, Salesforce, Stripe, SQL databases, and many others, then automatically monitors the data and alerts on unexpected behavior.
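Outlier has not published how its monitoring works, but purely as an illustration of the general idea of "alert when a metric changes unexpectedly," a crude z-score check over a daily metric might look like this (the metric name and threshold are invented for the example):

```python
import statistics

def unexpected_changes(series, threshold=3.0):
    """Return the indices of points that deviate from the series mean by
    more than `threshold` standard deviations -- a crude stand-in for the
    proactive monitoring described above."""
    mean = statistics.mean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # a perfectly flat series has no surprises
    return [i for i, x in enumerate(series)
            if abs(x - mean) / stdev > threshold]

daily_signups = [100, 104, 98, 101, 103, 99, 102, 180]  # sudden spike at the end
print(unexpected_changes(daily_signups, threshold=2.0))  # → [7]
```

A real monitoring product would also model trend and seasonality before flagging anything; this sketch treats every point as drawn from one static distribution.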

Along with the ability to proactively monitor business data and alert on changes, Outlier can sift through metrics and dimensions to understand and identify business cycles, trends, and patterns, automating the business analysis process and, consequently, positioning itself within a new generation of BI solutions (Figure 1).

Figure 1. Outlier’s positioning themselves as new generation BI (Courtesy of Outlier)

During the BBBT session, one key thing Sean Byrnes brought up was that the company's leadership understands the analytics and business intelligence (BI) market is changing, and yet many companies are still struggling, not with the availability of data but with the questions themselves, as the analytics formulation process becomes increasingly complex.

According to the company, as part of a process aimed at automating monitoring and analytics and easing users' regular monitoring work, once deployed Outlier can provide daily headlines from key business dimensions, letting users ask critical questions knowing there will be a regular answer, while still formulating new ones to keep discovering what is important (Figure 2).

Figure 2. Outlier’s positioning themselves as new generation BI (Courtesy of Outlier)

Interestingly, I find this process useful, especially to:
  • Carry out common data analysis and reporting tasks and, above all, truly automate the analytics process so it can detect when a significant change occurs.
  • Take a proactive approach that encapsulates the complexities of data management and presents insights in a proper way for users to make business decisions (act on data).
  • Filter data to recognize what is important to know when making a decision.

Outlier: It is not Just About the Common, but the Uncommon

Today, many organizations can know how much they sold last month or how much they spent in the last quarter. Those are relevant yet common questions that can be answered with relative ease. But today it is also about discovering not just answers but new questions that can unveil key insights, opportunities, and risks.

Outlier identified this as a key need and acted upon it, knowing that building the infrastructure to achieve it can be far from a trivial task, as it often forces organizations to radically modify existing traditional BI platforms to accommodate new or additional analytics capabilities (predictive, mining, etc.) that may or may not fit easily with the BI solutions already in place.

Outlier aims to automate this process by making it possible for organizations to connect directly to the various sources a business analyst takes data from, and by guiding them through automating the monitoring process.

One key aspect of Outlier worth mentioning is how the company strives to augment rather than replace the capabilities of existing analytics and data management solutions, fitting within a specific point of what the company calls the analytics value chain (Figure 3).

Figure 3. Outlier’s Analytics Value Chain Proposition (Courtesy of Outlier)

The demo session highlighted other relevant aspects of Outlier, including headlines and dashboards or scorecards that nicely combine graphical and textual information (Figure 4), as well as a large set of connectors for different data sources, including traditional databases and social media sources.

Also worth mentioning is the effort Outlier is making to educate potential users in the field of BI and analytics, and of course to promote the potential use of Outlier in different industries and lines of business, by making available a section of its portal with helpful information ranging from how to analyze customer acquisition cost to performing customer segmentation.

Figure 4. Outlier’s Screencap (Courtesy of Outlier)

Outlier and a New Generation of BI and Analytics Solutions

As part of a new wave of solutions providing analytics and BI services, Outlier is constantly working to introduce new technologies and techniques into the common portfolio of data analysis tasks, and it seems to have plenty of appealing functions and features to modernize the current state of analytics.

Of course, Outlier will face significant competition from incumbents already in the market, such as Yellowfin, Board, AtScale, and Pyramid Analytics. But if you are searching for, or just want to know about, new analytics and BI offerings, it might be a good idea to check out this new solution, especially if you think your organization requires an innovative and agile approach to analytics with full monitoring and alerting capabilities.

Finally, you can start by checking, aside from its website, some additional information right from the BBBT, including a nice podcast and the session's video trailer.
How Big Data is Helping Predict Heart Disease

How Big Data is Helping Predict Heart Disease

Heart disease is the leading cause of death in America, accounting for one out of every four deaths.

Thanks to big data though, doctors and scientists are making progress on being able to predict heart disease and find which treatments are the most effective.

Our Current Fight Against Heart Disease

As it stands right now, diagnosing heart disease requires a person to take a variety of medical tests. There are often minimal symptoms that can clue a person in to the fact that they have heart disease, short of a massive event like a heart attack.

Because there are so few noticeable symptoms of heart disease, doctors have to look for clues at every checkup, like high blood pressure, excess weight, or difficulty breathing.

When it comes to treating heart disease, we have methods for decreasing the chances of a dangerous incident, like a heart attack or stroke, but no clear way to cure it. Treatment methods can include medication to lower blood pressure or thin the blood to decrease the chance of clotting and stroke, getting a pacemaker, and more.

Since we have no clear way to cure heart disease, it comes down to predicting and preventing it from becoming a problem ...

Read More on Datafloq
The Sad Story Of The Captured Iraqi DESERT STORM Documents

The Sad Story Of The Captured Iraqi DESERT STORM Documents

Iraqi soldiers cross a highway carrying white surrender flags on Feb. 25, 1991, in Kuwait City. The U.S.-led coalition overwhelmed the Iraqi forces and swiftly drove them out of Kuwait. [Christophe Simon/AFP/Getty Images]

The fundamental building blocks of history are primary sources, i.e., artifacts, documents, diaries and memoirs, manuscripts, or other contemporaneous sources of information. It has been the availability and accessibility of primary source documentation that allowed Trevor Dupuy and The Dupuy Institute to build the large historical combat databases that much of their analyses have drawn upon. It took uncounted man-hours of time-consuming, painstaking research to collect and assemble two-sided data sufficiently detailed to analyze the complex phenomena of combat.

Going back to the Civil War, the United States has done a commendable job collecting and organizing captured military documentation and making that material available for historians, scholars, and professional military educators. TDI has made extensive use of captured German documentation from World War I and World War II held by the U.S. National Archives in its research, for example.

Unfortunately, that dedication faltered when it came to preserving documentation recovered from the battlefield during the 1990-1991 Gulf War. As related by Douglas Cox, an attorney and Law Library Professor at the City University of New York School of Law, millions of pages of Iraqi military paper documents collected during Operation DESERT STORM were destroyed by the Defense Intelligence Agency (DIA) in 2002 after they were contaminated by mold.

As described by the National Archives,

The documents date from 1978 up until Operation Desert Storm (1991). The collection includes Iraq operations plans and orders; maps and overlays; unit rosters (including photographs); manuals covering tactics, camouflage, equipment, and doctrine; equipment maintenance logs; ammunition inventories; unit punishment records; unit pay and leave records; handling of prisoners of war; detainee lists; lists of captured vehicles; and other military records. The collection also includes some manuals of foreign, non-Iraqi weapons systems. Some of Saddam Hussein’s Revolutionary Command Council records are in the captured material.

According to Cox, DIA began making digital copies of the documents shortly after the Gulf War ended. After the State Department requested copies, DIA subsequently determined that only 60% of the digital tapes the original scans had been stored on could be read. It was during an effort to rescan the lost 40% of the documents that it was discovered that the entire paper collection had been contaminated by mold.

DIA created a library of the scanned documents stored on 43 compact discs, which remain classified. It is not clear if DIA still has all of the CDs; none had been transferred to the National Archives as of 2012. A set of 725,000 declassified pages was made available for a research effort at Harvard in 2000. That effort ended, however, and the declassified collection was sent to the Hoover Institution at Stanford University. The collection is closed to researchers, although Hoover has indicated it hopes to make it publicly available sometime in the future.

While the failure to preserve the original paper documents is bad enough, the possibility that any or all of the DIA's digital collection might be permanently lost would constitute a grievous and baffling blunder. It also makes little sense for this collection to remain classified a quarter of a century after the end of the Gulf War. Yet, it appears that failures to adequately collect and preserve U.S. military documents and records are becoming more common in the Information Age.

How Big Data Is Putting Better Food On Your Plate

How Big Data Is Putting Better Food On Your Plate

There are few industries where scale is more important than in agriculture.

Farms that stretch for as far as the eye can see might contain millions of individual plants. We might lovingly tend each potato plant in our allotment, but on an industrial scale, it is impossible to understand what is happening on a micro scale.

That is until Big Data made its mark in the agricultural industry.

There are so many challenges that might be better solved with a closer understanding of the underlying situation. Over a third of food produced is lost or wasted each year. Globally, that amounts to $940bn in lost revenue. Crops are planted inefficiently, harvested haphazardly and watered ineffectively. Unexpected weather events can wreak havoc, and the consumer is often a fickle beast. And that is before we get to the more involved issues of genetics and nutrition.

Big Data is hitting the fields

Sensors can provide data on soil conditions, irrigation, and fertilizer requirements while keeping an eye on the weather. GPS trackers and drones can direct the farmers to use their resources in the optimal fashion and track the growth of their crops. All the while, the data can be crunched to allow farmers to optimise their output ...

Read More on Datafloq
RELX Group: The Transformation to a Leading Global Information & Analytics Company

RELX Group: The Transformation to a Leading Global Information & Analytics Company

When we talk about taking the changes in technology and implementing them within an organisation, one name jumps to mind - RELX Group. The transformation of the FTSE 100 (and FTSE 15) RELX Group from a media company to a leading global information and analytics company, with a market capitalization of about $44bn, is indeed inspirational and somewhat surprising.

With a heritage in publishing, RELX Group has now successfully transformed its revenue streams. Over the past decade, print sales have been managed down from 50% to just 10% and the vast majority of revenues are now derived from digital. The company spends $1.3bn on technology annually and employs 7,000 technologists across the company’s four global divisions. Notably, MSCI re-categorized RELX as a business services company rather than a media group last year.  

I recently had the pleasure of interviewing Kumsal Bayazit, Chairwoman of the RELX Technology Forum at RELX Group. Ms Bayazit has been at the group for more than 14 years and played a major role in devising the pathway that dictated the company's transformation during the last decade.

The Transformation

Every transformation within an organization requires firm belief and perseverance. Without either of these factors, the transformation will either be left a void or will propel the ...

Read More on Datafloq
Augmented Reality – Recreating Reality in Our Own Image

Augmented Reality – Recreating Reality in Our Own Image

It is true: there is nothing new under the sun. But we sure find the most incredible ways to do nothing new under the sun. Once upon a time, there was reality; and we made reality more interesting through storytelling and the imagination. Lately, Augmented Reality is breaking the boundary between what is imagined and what is experienced as real.

Again, there is nothing new under the sun. We have spent over a century in search of new ways to bring our imaginations to life. Carlo Collodi gave us The Adventures of Pinocchio (1883)--stories of Geppetto's puppet, Pinocchio, dreaming of being a real boy. Pinocchio was very much alive, though made of wood. Caryn Bailey reminds us of The Enchanted Drawing (1900), the first live-action/animated silent film that would kick off the Twentieth Century.

Who can forget Mary Poppins, with Dick Van Dyke dancing with a group of animated penguins, or The Incredible Mr. Limpet, with Don Knotts’ adventures as a cartoon fish? Those films amazed audiences. Cartoonists were able to create the impossible. Their very creations--what was once thought to forever be confined to the imagination--now appeared to be interacting with the real world elements in the motion picture. It added ...

Read More on Datafloq
How Does Analytics Enable Better Lives?

How Does Analytics Enable Better Lives?

Imagine the parents of a small child who is going to turn two in the next few weeks. And instead of planning for his birthday as most parents would, the parents of this child are just hoping that he will live to see his birthday. With a heart defect that keeps the tiny heart from functioning normally, the poor kid has seen more nurses and hospital operating rooms than cakes and balloons.

What if there were an AI-enabled device that could automatically monitor all the key health indicators and alert his parents to any abnormalities, immediately helping them take action instead of just waiting and watching?

What if that device turns out to be the surprise birthday gift, helping the child reach his birthday milestone with a smile?

We often feel data and analytics are esoteric entities that are only traded between companies and used to make even more profits. Either they do not impact us -- the common consumers -- directly, or they are like a creepy shadow that we could live well without, given the danger of privacy and information leaks.

With increased access to technology and data, can analytics insights actually become the invisible forces that can help improve human lives?

Do we wonder how ...

Read More on Datafloq
The Evolution of the Commercial Lending Industry Due to Big Data Utility

The Evolution of the Commercial Lending Industry Due to Big Data Utility

There is no doubt that small businesses act as the foundation for the entire U.S. economy, and nearly half of the entire private workforce is made up of small businesses. Despite the large portion of the economy that constitutes these businesses, they have been classically ignored when it comes to commercial lending practices. This is primarily due to the fact that lending systems weren't modernized until fairly late in the game, and even now there are still issues that could be resolved with the proper tools.

The First Step Toward Modernization

A digital commercial lending infrastructure was created in 1970 by the Fair Credit Reporting Act, which focused on using large amounts of data to promote the advancement of the financial industry. By adding transparency to the process and defining the rights of all consumers, the FCRA sought to balance the system such that all businesses would have equal access to commercial funding.

Once the FCRA was put into action, the regulations that it included allowed for a number of new and helpful financial products to enter the market. It was at this time that the idea of a credit score was introduced, which is nothing more than a measurement of an individual's ...

Read More on Datafloq
How Artificial Intelligence Impacts Financial Services

How Artificial Intelligence Impacts Financial Services

Artificial intelligence is using structured and unstructured data in financial services to improve customer experience and engagement, detect outliers and anomalies, increase revenues, reduce costs, find predictability in patterns, and increase forecast reliability... but isn't that true in every other industry? We all know this story, right? So what is really peculiar about AI in financial services?

First of all, FS is an industry full of data. You might expect this data to be concentrated in big financial institutions' hands, but much of it is actually public, and thanks to the new EU payment directive (PSD2), larger datasets are available to smaller players as well. AI can then be easily developed and applied because the barriers to entry are lower than in other sectors.

Second, many of the underlying processes are relatively easy to automate, while many others can be improved by either brute-force computation or speed. And historically, finance is one of the sectors that has needed this type of innovation the most; it is incredibly competitive and always looking for new sources of ROI. Bottom line: the marginal impact of AI is greater than in other sectors.

Third, the transfer of wealth across different generations makes the field really fertile for AI. AI needs ...

Read More on Datafloq
How Artificial Intelligence Affects and Changes the Insurance Industry

How Artificial Intelligence Affects and Changes the Insurance Industry

There are plenty of startups out there working at the intersection of AI and insurance, and it is essential to look at least at some of them to understand the future direction of the industry, as well as the kind of improvements AI is bringing to the insurtech space. An interesting thing to notice is that most of the innovation is happening in the UK rather than in other countries, across all the segments proposed below.

Claim processing

Shift Technology skims the valid claims from the ones that deserve further validation; Tractable instead is trying to automate experts' tasks for insurers; ControlExpert has a specific focus on car claims; Cognotekt optimizes internal business processes, as does Snapsheet; Motionscloud offers mobile claim management solutions; and finally RightIndem aims to help insurers deliver on-premise by smoothing the claims flow.

Virtual Agents & Chatbots

Spixii is an automated insurance agent that helps you buy any insurance coverage you might want; Cognicor is a virtual assistant that offers customer care services; Conversica identifies which leads intend to purchase, while Your.MD is a personal health assistant that analyzes symptoms and produces advice. MedWhat instead uses EMR (medical records) to assist the patient as if it were a virtual doctor, and Babylon gives medical advice while taking care of tight budget constraints. Insurify is another personal insurance agent that works as a comparator for car ...

Read More on Datafloq
TDI Friday Read: Tank Warfare In World War II

TDI Friday Read: Tank Warfare In World War II

American troops advance under the cover of M4 Sherman tank ‘Lucky Legs II’ during mop up operations on Bougainville, Solomon Islands, March 1944. [National Archives/ww2dbase]

In honor of Tony Buzbee, who has parked a fully-functional vintage World War II era M-4 Sherman tank in front of his house in Houston, Texas (much to the annoyance of his home owner’s association), here is a selection of posts addressing various aspects of tank warfare in World War II for you weekend leisure reading.

Counting Holes in Tanks in Tunisia

U.S. Tank Losses and Crew Casualties in World War II

Tank Loss Rates in Combat: Then and Now

Was Kursk the Largest Tank Battle in History?

A2/D2 Study

Against the Panzers

And, of course, Chris Lawrence has written the largest existing book on the largest tank battle in history, Kursk.

Trust Is Why Blockchains Will Soon Be Everywhere

Trust Is Why Blockchains Will Soon Be Everywhere

The business world runs on trust, but trust is hard to come by. A staggering amount of time and money is spent searching, validating, verifying, checking, auditing, certifying, and worrying — trust is an expensive proposition. Blockchains make trust easier and less expensive, which is why half a billion dollars was invested in blockchain technology last year. Analysts expect that the blockchain industry will be worth billions of dollars in just a few years.

Blockchains first came to prominence as the underlying protocol of the Bitcoin cryptocurrency. But blockchains aren’t limited to financial transactions and there are many types of blockchain, of which Bitcoin is just one example.

So what do blockchains have to do with trust? A blockchain can be thought of as a type of database or ledger. Information can be added to a blockchain and retrieved from it later. But no single individual has complete control over a blockchain.

To understand blockchains, you need to understand blocks, chains, and networks.

A block is a unit of information. Each records a set of transactions. Transactions are changes to the information stored in the “database”. The size of the blocks varies depending on the protocol, but every blockchain joins blocks together ...
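The hash-linking that joins blocks into a chain, and that makes tampering evident, can be sketched in a few lines (a toy illustration, not any production blockchain protocol; the transaction strings are invented):

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions, prev_hash):
    # A block records transactions plus the hash of the previous block.
    return {"transactions": transactions, "prev_hash": prev_hash}

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
second = make_block(["bob pays carol 2"], prev_hash=block_hash(genesis))

# The chain holds: block two commits to block one's exact contents...
assert second["prev_hash"] == block_hash(genesis)

# ...so altering history breaks the link, which the network would reject.
genesis["transactions"][0] = "alice pays bob 500"
assert second["prev_hash"] != block_hash(genesis)
```

Because no single party controls the chain, every participant can rerun exactly this check, which is where the trust comes from.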

Read More on Datafloq
TechCast: How Enterprises Identify Potential Digital Talent | Simplilearn webinar starts 31-10-2017 20:00

Technology and automation are constantly shifting the recruitment goalposts. The advent of the digital era has resulted in a massive shift in hiring policies, with organizations inevitably preferring the near-perfect efficiency of machine learning over humans. As the market becomes a tight squeeze, how do you ensure that you stand out and g...Read More.
Size of U.S. Defense Budget

If you have kids, the conversations sometimes wander into strange areas. I was told yesterday that the U.S. Defense budget was 54% of the U.S. budget. I said that was not right, even though Siri was telling him otherwise.

It turns out that in 2015 the U.S. Defense budget was 54% of U.S. discretionary spending, according to Wikipedia. This is a significant distinction. In 2015 the U.S. defense budget was $598 billion, while the U.S. Federal budget was $3.688 trillion actual (compared to $3.9 trillion requested). That makes defense 16% of the U.S. budget. As always, one has to read carefully.

Just to complete the math, the U.S. GDP in 2015 was $18.037 trillion (United Nations figures). So, the federal budget is 20% of GDP (or 22% if the requested budget figure is used) and the defense budget is 3.3% of GDP.

The latest figures are $583 billion for the U.S. Defense budget (requested for 2017); $3.854 trillion estimated expenditures for the U.S. Federal budget for 2016 and $4.2 trillion requested for 2017; and $18.56 trillion for U.S. GDP (2016) and $19.3 trillion (preliminary for 2017).
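The percentages above are easy to verify with simple arithmetic. The 2015 discretionary-spending figure below is my own assumption (roughly $1.11 trillion), included only to reproduce the 54% number; the other figures are from the post.

```python
# 2015 figures in billions of USD (discretionary total is an assumed value).
defense = 598.0
federal_budget = 3688.0   # actual federal outlays
discretionary = 1110.0    # assumption: approximate 2015 discretionary spending
gdp = 18037.0             # United Nations figure

share_of_budget = defense / federal_budget * 100         # ~16%
share_of_gdp = defense / gdp * 100                       # ~3.3%
share_of_discretionary = defense / discretionary * 100   # ~54%
```

The same $598 billion is 54% of one denominator and only 16% of the other, which is the whole distinction at issue.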






You Look Like A Criminal! Predicting Crime With Algorithms

Can you really predict if someone is going to commit a crime?

Some authorities are using facial recognition, predictive analytics, and machine learning to predict who will commit a crime. Even if you can use an algorithm to deduce the likelihood of an individual’s future behavior, apprehending a suspect before a crime is even committed surely cannot lead to a conviction, as no offence will actually have taken place. Yes, this is all very Minority Report.

Nevertheless, companies are currently working on these technologies to catch the bad guys before they even strike.

Cloud Walk tracks people’s locations to note where they go and uses this location data to rate how likely they are to commit a crime. According to a spokesperson talking to the Financial Times, authorities in China can track location data and purchases; a person buying just a kitchen knife would not be considered suspicious, but if that person later goes on to buy a hammer and a sack, their rating goes up and they are flagged as a potential offender.
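In spirit, the knife-plus-hammer-plus-sack example resembles a rule-based score over combinations of purchases. The sketch below is entirely hypothetical; CloudWalk's actual model, rules, and data are not public.

```python
# Entirely hypothetical toy rating; any single item is innocuous, but a
# complete "suspicious combination" of purchases raises the score.
SUSPICIOUS_COMBOS = [
    {"kitchen knife", "hammer", "sack"},
]

def risk_rating(purchases):
    items = set(purchases)
    score = 0
    for combo in SUSPICIOUS_COMBOS:
        # Only the full combination triggers; subsets score nothing.
        if combo <= items:
            score += len(combo)
    return score
```

The point of the toy is that the rating is driven by co-occurrence, not by any one purchase, which matches the spokesperson's description.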

Researchers at Shanghai Jiao Tong University have carried out a study linking criminality and facial images. Training algorithms with headshots of over 1000 faces from government IDs, with 700 of ...

Read More on Datafloq
TechCast : The Future of Work and The Workforce of the Digital Era | Simplilearn webinar starts 20-09-2017 16:00

What are the skills that can help you get noticed as enterprises seek to assemble a digital savvy workforce? Join Dr. Somdutta Singh, Director at Center for Entrepreneurial Excellence and Avnish Sabharwal, Managing Director - Growth and Strategy at Accenture India, as they elaborate on how to devise innovative strateg...Read More.
Why Prescriptive Analytics Is the Future of Big Data

Big Data has ushered in an era of data analytics that is taking different forms, including prescriptive analytics. This type of business analytics helps you find the best approach for a specific circumstance. It is also considered the third or final part of business analytics, which also encompasses descriptive analytics and predictive analytics. Prescriptive analytics leverages predictive and descriptive analytics to derive ideal outcomes or solutions, helping you solve business problems, and it is driving the future of Big Data. Here's how:

Differences Between Prescriptive Analytics and Predictive Analytics

Raw data is plentiful in today's digital age. Approximately 90 percent of today's online data was generated in just the past few years, and the total is projected to grow rapidly. Consumers send billions of messages via instant messaging apps and social networking sites, such as Facebook and Twitter, and generate upwards of six billion Google searches every day via their mobile devices and desktops. However, this raw data does not create value on its own. It must be processed in a way that delivers valuable insight to your enterprise before it becomes useful. With raw data, you can identify patterns, build models based on these ...

Read More on Datafloq
Will Big Data Revive the Economy?

You'd be forgiven for feeling a little jittery about a world economy that is driven first and foremost by data. As news headlines regularly confirm, we're climbing a pretty steep learning curve right now, and lots of people are worried about how safe their data is once it's floating around "out there."

But the truth is that big data stands a good chance of reviving and even completely remaking the global economy. It's true that the world's cryptographers have yet to build hackproof encryption and the internet itself might be due for an upgrade, but data is here to stay — and the global economy won't be the same after we get a better handle on storing and managing it. Here's a quick look at what that world could look like in a data-driven economy.

Data Is the Lifeblood of the Global Economy

When most of us think about the collection of big data, we usually think about marketing first. It seems like the only purpose of amassing data is to more effectively part consumers from their money — right?

Not so fast. Data is the lifeblood of the modern global economy for far more reasons than its usefulness in targeting marketing content to consumers. ...

Read More on Datafloq
Data Modeling Zone 17

So, I am very happy to be a speaker at this year's Data Modeling Zone in Düsseldorf. As at the Global Data Summit, I'm talking about one of my favorite topics: temporal data in the data warehouse, especially in connection with Data Vault and dimensional modeling.

Human Factors In Warfare: Combat Effectiveness

An Israeli tank unit crosses the Sinai, heading for the Suez Canal, during the 1973 Arab-Israeli War [Israeli Government Press Office/HistoryNet]

It has been noted throughout the history of human conflict that some armies have consistently fought more effectively on the battlefield than others. The armies of Sparta in ancient Greece, for example, have come to epitomize the warrior ideal in Western societies. Rome’s legions have acquired a similar legendary reputation. Within armies too, some units are known to be better combatants than others. The U.S. 1st Infantry Division, the British Expeditionary Force of 1914, Japan’s Special Naval Landing Forces, the U.S. Marine Corps, the German 7th Panzer Division, and the Soviet Guards divisions are among the many superior fighting forces from history.

Trevor Dupuy found empirical substantiation of this in his analysis of historical combat data. He discovered that in 1943-1944 during World War II, after accounting for environmental and operational factors, the German Army consistently performed more effectively in ground combat than the U.S. and British armies. This advantage—measured in terms of casualty exchanges, terrain held or lost, and mission accomplishment—manifested whether the Germans were attacking or defending, or winning or losing. Dupuy observed that the Germans demonstrated an even more marked effectiveness in battle against the Soviet Army throughout the war.

He found the same disparity in battlefield effectiveness in combat data on the 1967 and 1973 Arab-Israeli wars. The Israeli Army performed uniformly better in ground combat over all of the Arab armies it faced in both conflicts, regardless of posture or outcome.

The clear and consistent patterns in the historical data led Dupuy to conclude that superior combat effectiveness on the battlefield was attributable to moral and behavioral (i.e. human) factors. Those factors he believed were the most important contributors to combat effectiveness were:

  • Leadership
  • Training or Experience
  • Morale, which may or may not include
  • Cohesion

Although the influence of human factors on combat effectiveness was identifiable and measurable in the aggregate, Dupuy was skeptical whether all of the individual moral and behavioral intangibles could be discretely quantified. He thought this particularly true for a set of factors that also contributed to combat effectiveness, but were a blend of human and operational factors. These include:

  • Logistical effectiveness
  • Time and Space
  • Momentum
  • Technical Command, Control, Communications
  • Intelligence
  • Initiative
  • Chance

Dupuy grouped all of these intangibles together into a composite factor he designated as relative combat effectiveness value, or CEV. The CEV, along with environmental and operational factors (Vf), comprise the Circumstantial Variables of Combat, which when multiplied by force strength (S), determines the combat power (P) of a military force in Dupuy’s formulation.

P = S x Vf x CEV
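As a numerical illustration of the formula (the strengths and CEV below are hypothetical values chosen for arithmetic convenience, not Dupuy's measured figures):

```python
# Dupuy's combat power formulation: P = S x Vf x CEV.
# All inputs here are illustrative assumptions.
def combat_power(strength, vf, cev):
    # strength: force strength S; vf: environmental/operational factors;
    # cev: relative combat effectiveness value.
    return strength * vf * cev

# A force of 10,000 with neutral circumstances (Vf = 1.0) and a CEV of 1.2
# versus a force of 12,000 with a CEV of 1.0:
p_attacker = combat_power(10000, 1.0, 1.2)
p_defender = combat_power(12000, 1.0, 1.0)
```

In this example the two combat powers come out equal: a CEV of 1.2 exactly offsets a 20% numerical disadvantage, which is the sense in which human factors enter the calculation multiplicatively.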

Dupuy did not believe that CEVs were static values. As with human behavior, they vary somewhat from engagement to engagement. He did think that human factors were the most substantial of the combat variables. Therefore any model or theory of combat that failed to account for them would invariably be inaccurate.


This post is drawn from Trevor N. Dupuy, Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles (Indianapolis; New York: The Bobbs-Merrill Co., 1979), Chapters 5, 7 and 9; Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), Chapters 8 and 10; and Trevor N. Dupuy, “The Fundamental Information Base for Modeling Human Behavior in Combat,” presented at the Military Operations Research Society (MORS) Mini-Symposium, “Human Behavior and Performance as Essential Ingredients in Realistic Modeling of Combat – MORIMOC II,” 22-24 February 1989, Center for Naval Analyses, Alexandria, Virginia.

Designing Effective AI Public Policies

There’s growing noise around ‘regulating’ AI. Some claim it’s too early, citing that precautionary regulations could impede technical developments; others call for action, advocating measures that could mitigate the risks of AI.

It’s an important problem. And both ends of the debate make compelling arguments. AI applications have the potential to improve output, productivity, and quality of life. Forestalling the AI developments that facilitate these advancements carries big opportunity costs. Equally, the risks of broad-scope AI applications shouldn’t be dismissed. There are near-term implications, like job displacement and autonomous weapons, and longer-term risks, like values misalignment and the control problem.

Regardless of where one sits on the ‘AI regulation’ spectrum, few would disagree that policymakers should have a firm grasp on the development and implications of AI. It’s unsurprising, given the rapid developments, that most do not.


Policymakers are still very much at the beginning of learning about AI. The US government held public hearings late last year to ‘survey the current state of AI’. Similarly, the UK House of Commons undertook an inquiry to identify AI’s ‘potential value’ and ‘prospective problems’.

While these broad inquiries signify positive engagement, they also highlight policymakers’ relatively poor understanding, particularly compared to industry. This is understandable given the majority of AI development and ...

Read More on Datafloq
Big Data, not Oil, is King in Today’s Economy

Big Data is not just about volumes of data; it is the ability to both collect information and extract meaning from it. The businesses that are prospering in today's economy are those able to take advantage of Big Data. In the next ten years, it will be the ability to use Big Data that will make or break a business. According to Forbes contributor Daniel Newman, "it is not what you know, it is what you do with what you know". Intelligent big data is changing all industries, from supply-side management with smart cameras and intelligent stock systems that automatically place future orders, to large cost reductions in the healthcare industry.

What is all the bustle about?

The data in Big Data has become a key commodity in all business sectors, not just in technologically driven markets. Although Uber and Tesla are reaping billions of dollars in what seem to be the traditional transportation and car industries, their true success will be measured by their ability to collect massive amounts of data. Uber now owns the supply/demand (i.e., drivers/passengers) match for personal transportation, while Tesla has gathered 1.3 billion miles’ worth of driving data to optimize its algorithms for self-driving cars. Other examples abound. Sears ...

Read More on Datafloq
How to Overcome the Hurdles of Big Data in Marketing

There are many things to look out for when it comes to big data and how advertisers will use this information in the future. Its impact will be omnipresent. Big data will shape the ways advertisers use, process, and gather information, and that in turn will affect the lives of consumers. This information will give advertisers personal details about consumers and their lifestyles, providing the data needed to create unique, personalized experiences.

Understand Privacy Rights

Nowadays, most advertisers are willing to reward you in exchange for a peek at your information. Many of us are all too willing to allow free access to this information as long as the dangling carrot is compelling enough. However, how much information is too much and is the compensation offered really adequate? This is the question. The advertising world is fraught with situations where a lack of transparency may be an issue on the part of many marketing companies. This raises the question of ...

Read More on Datafloq
How Quantum Computers Will Revolutionize Artificial Intelligence and Big Data

It goes without saying that we are living in a digital age. Technology has drastically changed the way we carry out our day to day activities. Currently, computers produce large volumes of data most of which is fed to open-source streaming platforms such as Apache Kafka, data banks, and social media platforms. Although computers are at the peak of their data processing power, the amount of data keeps growing.

The situation has spawned a race among competing firms to launch a suitable quantum computer, which is more powerful than current computers. This computer will have the ability to process the large volumes of data that we generate every day at a faster rate besides solving increasingly complex problems. The main talking point of this mini digital revolution is how it will influence big data and artificial intelligence.

The Role of Quantum Computers in Artificial Intelligence

Previously, it was possible to key in big data problems to desktop computers. Nonetheless, it is becoming harder to crunch larger volumes of data into computers that we currently use. Quantum computers come with advanced algorithms that cannot fit into the memory of traditional computers. This means that addressing storage capacity challenges that come with big data and ...

Read More on Datafloq
Using Big Data, AI Can Predict the Next Bestseller

Less than one-half of one percent of all aspiring authors reach the best-seller list. Publishers receive thousands of manuscripts a year, and even a literary titan like T.S. Eliot erred when he rejected the now-famous novel "Animal Farm" by George Orwell.

Now, two researchers, Jodie Archer and Matthew Jockers, claim to have created a new artificial intelligence (AI) algorithm, called the "bestseller-ometer", that uses Big Data to predict the next best-seller with more than 80% accuracy. For a struggling publishing industry where the economic stakes couldn't be higher for selecting the next winning novel, such a quantitative tool may be a windfall.

In their new book, "The Bestseller Code: Anatomy of the Blockbuster Novel," from St. Martin's Press, these researchers from the Stanford Literary Lab describe how they analyzed over 25,000 novels and identified some 2,799 relevant features for training their algorithms to detect bestsellers. Despite excitement over the predictive success of their algorithm, it is not without its critics. Skeptics argue that such a code may not detect truly new ideas and would create a stale "literary painting by numbers".

Literary DNA: Theme and Plotline

Despite the criticism, Archer and Jockers' work is one of many attempts to quantify the humanities. Their ...

Read More on Datafloq
Data Testing: Why Traditional Approaches fail in the Era of Big Data

The contemporary business environment is characterized by proliferating data, growing customer demands, and shrinking budgets. Remaining competitive therefore requires organizations to make the right decisions at the right time.

The business world has witnessed a paradigm shift over the past several years. Business leaders can no longer rely on their judgment alone to make the right strategic decisions. Successful leaders have to be equipped with as much information as possible to enable better decisions. The insights that enable better-informed decisions come from combining past data, responding to existing business needs in real time, and using predictive modeling to design a roadmap for future growth. Thus the need for big data!

What is Big Data?

Big data refers to datasets too large or complex to be gathered, stored, managed, and analyzed with standard software tools. These datasets generate plenty of value for businesses of all types and sizes. Organizations that can harness the power of big data benefit from improved quality and operational efficiency, leading to labor and cost savings and ensuring a competitive edge. Leveraging big data also helps companies reduce errors, fight fraud, and streamline processes.

Testing: Big Data ...

Read More on Datafloq
How Big Data and Analytics Can Help Business Workflow

Major organizations use Big Data and Data Analytics to improve their business and make informed strategic and operational decisions. But how do they do it? It is said that one can derive meaningful insights from data and convert them into actionable knowledge, but that is easier said than done. As you know, data is invaluable; it is the foundation of every successful venture in the world. That makes it imperative for you to learn how to use it effectively to see growth in your business.

Out of the various ways to analyze your data, there are some simple and effective tools that can help you get more out of it. Let’s focus on six ways to analyze data more efficiently:

1. Find a Database Engine

A database engine sounds like a lengthy thing to implement, but it is simple to set up and use. Every company has a lot of unused data that could be used to serve customers. A database engine can manage your entire data and content, along with archived data for future use. It ensures the security of your data and keeps it all in one place for easier review. You can easily work on areas that need ...

Read More on Datafloq
Is It a Good Idea to Quantify Unquantifiable Concepts?

In the world of big data, some things are easy to objectively measure, such as how much money a person spends, or how much time it takes to accomplish something. Other things are notoriously difficult to quantify, such as moods, subjective opinions, and beliefs.

As our technology enables us to make and store more measurements, and as demand for big data analysis continues to grow, we’re going to have the option to quantify these “unquantifiable” metrics. There are some major advantages to this approach, but are they worth the potential costs?

The Plus Side of Quantification

These are some of the biggest advantages this kind of data quantification can offer:

Tools for decision making. When you’re making a major decision on behalf of a company, staking thousands to millions of dollars on your conclusions, you can’t cite your instincts or beliefs as hard evidence in favor of your chosen position. It’s better to have something evidence-backed and provable on your side. For example, you can calculate a priority score for projects in your project portfolio management (PPM) strategy, or rely on average user ratings to gauge satisfaction.
Avoiding blind speculation. Numerical values also hold us to some degree of objectivity; without a figure to point ...
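The priority-score idea mentioned above can be sketched as a simple weighted sum. The criteria and weights here are illustrative assumptions, not a standard PPM method.

```python
# Hypothetical weighted priority score for project portfolio management.
# Criteria are rated 0-10; risk carries a negative weight, so riskier
# projects score lower. Weights are illustrative assumptions only.
WEIGHTS = {"strategic_fit": 0.5, "expected_return": 0.3, "risk": -0.2}

def priority_score(project):
    # Sum of each criterion's rating times its weight.
    return sum(WEIGHTS[k] * project[k] for k in WEIGHTS)
```

A score computed this way gives decision-makers a single comparable number instead of an instinct, which is the "evidence-backed and provable" point the bullet makes.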

Read More on Datafloq
Big Data Can Help Us Evolve to the Internet of People

Right now the biggest craze in tech is big data and its role in powering the Internet of Things (IoT). However, many have said that more smart devices amount to consumerism on steroids. We can see this in a variety of IoT products, from umbrellas that predict the chance of rain to vacuums that clean your entire home. Currently, there are a variety of security issues, as hackers can easily break through the software installed on many of these devices, leaving users vulnerable to attack.

Big data is also being utilized in just about every sector right now, and this is to be expected. Integrating technology can be a slow process, but older systems are gradually replaced with new ones. NBN plans are a prime example of the internet infrastructure upgrades happening through 2020, when fibre optic connections become the norm.

However, people as a collective need to stop thinking about how much junk they can collect, and utilize big data with nanotechnology to create an Internet of People (IoP). The concept sounds far-fetched, but this idea is not that distant, and today we will explore this a bit further.

IoT and Wearable Technology

With IoT, we've ...

Read More on Datafloq
3 Ways Blockchain Will Transform the Internet of Things

There is no denying the power of the Internet of Things (IoT). IoT devices are already in 60 percent of U.S. homes with a broadband connection, and an estimated 200 million vehicles will be connected to the internet by 2020, standing to transform entire industries. By the end of 2017, approximately 8.4 billion devices are projected to be connected to the IoT, more than the human population. The IoT is only going to expand as time moves on, as these smart devices create more data and technology continues to advance. These numbers are expected to increase exponentially to 24 billion IoT devices by 2020 and as many as 75.44 billion by 2025. However, this growth also brings concerns that you should be aware of, including security, reliability, and the validity of transactions.

However, blockchain technology stands to have a huge impact on the Internet of Things. Data from connected devices that cannot be altered but can be traced and verified will drive the birth of transactions among those devices. With the power of blockchain technology, connected devices will be able to network to conduct ...

Read More on Datafloq
TDI Friday Read: Mike Spagat’s Economics of Warfare Lectures & Commentaries

Below is an aggregated list of links to Dr. Michael Spagat’s E3320: Economics of Warfare lecture series at Royal Holloway, University of London, and Chris Lawrence’s commentary on each. Spagat is a professor of economics, and the course addresses quantitative research on war.

The aim of the course is to:

  • Introduce students to the main facts about conflict.
  • Apply theoretical and empirical economic tools to the study of conflict.
  • Give students an appreciation of the main questions at the research frontier in the economic analysis of conflict.
  • Draw some policy conclusions on how the international community should deal with conflict.
  • Study data issues that arise when analysing conflict.
Mike’s lectures and Chris’s commentaries:

Economics of Warfare 1: Commentary
Economics of Warfare 2: Commentary
Economics of Warfare 3: Commentary
Economics of Warfare 4: Commentary
Economics of Warfare 5: Commentary
Economics of Warfare 6: Commentary
Economics of Warfare 7: Commentary
Economics of Warfare 8: Commentary
Economics of Warfare 9: Commentary
Economics of Warfare 10: Commentary
Economics of Warfare 11: Commentary 1, Commentary 2
Economics of Warfare 12: Commentary
Economics of Warfare 13: Commentary 1, Commentary 2, Commentary 3
Economics of Warfare 14: Commentary
Economics of Warfare 15: Commentary 1, Commentary 2
Economics of Warfare 16: Commentary
Economics of Warfare 17: Commentary 1, Commentary 2, Commentary 3
Economics of Warfare 18: Commentary
Economics of Warfare 19: Commentary 1, Commentary 2, Commentary 3, Commentary 4
Economics of Warfare 20: Commentary
A Return To Big Guns In Future Naval Warfare?

The first shot of the U.S. Navy Office of Naval Research’s (ONR) electromagnetic railgun, conducted at Naval Surface Warfare Center, Dahlgren Division in Virginia on 17 November 2016. [ONR’s Official YouTube Page]

Defense One’s Patrick Tucker reported last month that the U.S. Navy Office of Naval Research (ONR) had achieved a breakthrough in capacitor design, an important step toward the use of electromagnetic railguns in future warships. The new capacitors are compact yet capable of delivering 20-megajoule bursts of electricity. ONR plans to increase this to 32 megajoules by next year.

Railguns use such bursts of energy to drive powerful electromagnets that accelerate projectiles to hypersonic speeds. ONR’s goal is a railgun capable of firing 10 rounds per minute to a range of 100 miles.
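A back-of-envelope calculation shows what those energy figures imply for muzzle velocity. The 10 kg projectile mass below is an assumption for illustration, and real launchers lose energy to resistance and heat, so treat these as upper bounds.

```python
import math

# If the full electrical burst became projectile kinetic energy:
# KE = (1/2) m v^2  =>  v = sqrt(2 KE / m)
def muzzle_velocity(energy_joules, mass_kg):
    return math.sqrt(2 * energy_joules / mass_kg)

# Assumed 10 kg projectile; burst energies from the article.
v20 = muzzle_velocity(20e6, 10.0)  # 20 MJ burst
v32 = muzzle_velocity(32e6, 10.0)  # planned 32 MJ burst
```

Under these assumptions the 20 MJ shot works out to about 2,000 m/s, roughly Mach 6 at sea level, which is consistent with the "hypersonic" description.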

The Navy initiated railgun development in 2005, intending to mount them on the new Zumwalt class destroyers. Since then, the production run of Zumwalts was cut from 32 to three. With the railguns still under development, the Navy has mounted 155mm cannons on them in the meantime.

Development of the railgun and a suitable naval power plant continues. While the Zumwalts can generate 78 megawatts of power and the Navy’s current railgun design needs only 25 megajoules to fire, the Navy still wants advanced capacitors capable of powering 150-kilowatt lasers for drone defense, as well as new generations of radars and electronic warfare systems.

While railguns are a huge improvement over chemically powered naval guns, there are still doubts about their effectiveness in combat compared to guided anti-ship missiles. Railgun projectiles are currently unguided, and the Navy’s existing design delivers less destructive power than the 1,000-pound warhead on the new Long Range Anti-Ship Missile (LRASM).

The U.S. Navy remains committed to railgun development nevertheless. For one idea of the role railguns and the U.S.S. Zumwalt might play in a future war, take a look at P. W. Singer and August Cole’s Ghost Fleet: A Novel of the Next World War, which came out in 2015.

4 Ways Artificial Intelligence Is Disrupting the Web Hosting Industry

Artificial intelligence is transforming many industries both on and offline. One that has seen significant change is web hosting. AI is improving the website experience for businesses around the world.

Cloud hosting presents an excellent platform for integrating AI software. It helps to speed learning, discover improvements, and streamline services. AI solutions can keep up with the growing amounts of data coming from internet activity. Below are just some of the ways AI can make website processes easier.

Improved Security

A chief concern is protecting your website and customer information against cyber threats. Sophisticated AI applications can identify and evaluate patterns to discover threats, and instantly alert you when problems appear. This way even the latest hacker tricks can be blocked before any damage is done.

With fast predictive analytics and adaptive machine learning, you can get warnings and reports every time your online sites are targeted by malware. Advanced software such as IBM Watson takes cyber security to the next level. Instead of relying on hardware or spotting known threats, it will analyze large amounts of network data and continue improving its ability to "learn" the differences between network issues and cyber attacks.

Increased Productivity

Keeping up with changing technologies, installation, and support requires considerable ...

Read More on Datafloq
No Russian Chemical Weapons

Russia claims that it has destroyed all of its chemical weapons (I gather there is no reason to doubt this; the effort has been going on for a while). The U.S. has destroyed most of its weapons, with complete elimination planned for 2023.


How Augmented Reality Can Improve The Real Estate Sector

The real estate industry is developing successfully, and real estate agents are making use of new technologies to provide potential customers with an innovative experience when they are looking for new homes to buy. The use of AR app development in this sector is on the increase, with the hope of helping potential home buyers find their desired home.

With the help of an AR app development company, both buyers and real estate agents benefit equally from augmented reality. Augmented reality brings products to life, and there can’t be a better way of displaying homes that have not been built yet.

How can AR be used in Real Estate

Augmented reality works in the real estate sector by projecting information about a property that is up for sale over images of the property. The information is visible to clients when their device is pointed at the property. In some scenarios, buyers will not only see the overlaid information but can also see the interior of the house, or what it will look like once completed, just by looking at the building from outside with ...

Read More on Datafloq
Selling BioMetric Data as Big Data

Are you reading this on your phone? Did you just use your fingerprint to sign in? Remember when that seemed futuristic?

It wasn’t that long ago, in truth. It seems that in just a few years biometrics has gone from a niche concern to being woven into many of the products we take for granted. Increasingly, our devices take measurements, from our fingerprints to our facial expressions, and use them for a variety of purposes.

Smartphones are just the most obvious example, of course. Perhaps the biggest innovator in this space has been Apple, which has integrated biometric data in the form of fingerprint scanning and now facial recognition.

Many stores now use facial recognition technology, and even some gun safes use biometrics. Using your face or finger to unlock your safe, or your phone, is certainly convenient. However, the rise of biometrics raises some tricky questions about how this data should be used and shared.

The Uses Of Biometric Data

Biometric data is now collected by a vast variety of companies and devices. A survey of British retail chains found that more than 25% of these stores are now using facial recognition software to log when customers visit, and what they buy.

Some companies are ...

Read More on Datafloq
OOW is just around the corner

Originally posted on HeliFromFinland:
I cannot believe how fast time flies! It has been an exceptionally busy year and that is the reason I have not had time to write blog posts either. I have been speaking in amazing events like BIWA Summit, APEX Connect, Riga DevDays 2017, OTNEMEA Tour in Baku and in Madrid,…
Human Factors In Combat: Interaction Of Variable Factors

The Second Battle of Ypres, 22 April to 25 May 1915 by Richard Jack [Canadian War Museum]

Trevor Dupuy thought that it was possible to identify and quantify the effects of some individual moral and behavioral (i.e. human) factors on combat. He also believed that many of these factors interacted with each other and with environmental and operational (i.e. physical) variables in combat as well, although parsing and quantifying these effects was a good deal more difficult. Among the combat phenomena he considered to be the result of interaction with human factors were:

Dupuy was critical of combat models and simulations that failed to address these relationships. The prevailing approach to the design of combat modeling used by the U.S. Department of Defense is known as the aggregated, hierarchical, or “bottom-up” construct. Bottom-up models generally use the Lanchester equations, or some variation on them, to calculate combat outcomes between individual soldiers, tanks, airplanes, and ships. These results are then used as inputs for models representing warfare at the brigade/division level, the outputs of which are then fed into theater-level simulations. Many in the American military operations research community believe bottom-up models to be the most realistic method of modeling combat.
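For readers unfamiliar with the Lanchester equations mentioned above, a toy numerical integration of the Lanchester square law gives the flavor of how such attrition calculations work. This is purely illustrative; the force sizes and attrition coefficients are made up and this is not any specific Defense Department model.

```python
# Toy Euler integration of the Lanchester square law:
#   dA/dt = -b * B    dB/dt = -a * A
# a and b are per-unit attrition coefficients (illustrative values only).
def lanchester_square(A, B, a, b, dt=0.01, steps=20000):
    for _ in range(steps):
        A, B = A - b * B * dt, B - a * A * dt
        if A <= 0 or B <= 0:
            break
    return max(A, 0.0), max(B, 0.0)

# With equal effectiveness the square law predicts the larger force wins
# with roughly sqrt(A0**2 - B0**2) survivors: sqrt(1000**2 - 800**2) = 600.
survivors_a, survivors_b = lanchester_square(1000.0, 800.0, a=0.01, b=0.01)
print(round(survivors_a), round(survivors_b))  # → 600 0
```

The square law's deterministic, duel-like structure is exactly what Dupuy objected to: nothing in it represents friction, suppression, or any other human factor.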

Dupuy criticized this approach for many reasons (including the inability of the Lanchester equations to accurately replicate real-world combat outcomes), but mainly because it failed to represent human factors and their interactions with other combat variables.

It is almost undeniable that there must be some interaction among and within the effects of physical as well as behavioral variable factors. I know of no way of measuring this. One thing that is reasonably certain is that the use of the bottom-up approach to model design and development cannot capture such interactions. (Most models in use today are bottom-up models, built up from one-on-one weapons interactions to many-on-many.) Presumably these interactions are captured in a top-down model derived from historical experience, of which there is at least one in existence [by which, Dupuy meant his own].

Dupuy was convinced that any model of combat that failed to incorporate human factors would invariably be inaccurate, which put him at odds with much of the American operations research community.

War does not consist merely of a number of duels. Duels, in fact, are only a very small—though integral—part of combat. Combat is a complex process involving interaction over time of many men and numerous weapons combined in a great number of different, and differently organized, units. This process cannot be understood completely by considering the theoretical interactions of individual men and weapons. Complete understanding requires knowing how to structure such interactions and fit them together. Learning how to structure these interactions must be based on scientific analysis of real combat data.[1]

While this unresolved debate went dormant some time ago, bottom-up models became the simulations of choice in Defense Department campaign planning and analysis. It should be noted, however, that the Defense Department disbanded its campaign-level modeling capabilities in 2011 because the use of the simulations in strategic analysis was criticized as “slow, manpower-intensive, opaque, difficult to explain because of its dependence on complex models, inflexible, and weak in dealing with uncertainty.”


[1] Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), p. 195.

For AI to Change Business, It Needs to Be Fueled with Quality Data

There’s no doubt that AI has usurped big data as the enterprise technology industry’s favorite new buzzword. After all, it’s on Gartner’s 2017 Hype Cycle for emerging technologies for a reason.

While progress was slow during the first few decades, AI advancement has accelerated rapidly over the last decade. Some people say AI will augment humans and maybe even make us immortal; more pessimistic voices say AI will lead to conflict and may automate our society out of jobs. Despite the differences in opinion, the fact is that only a few people can identify what AI really is. Today we are surrounded by modest forms of AI, like the voice assistants in our smartphones, often without noticing or appreciating the efficiency of the service. From Siri to self-driving cars, AI has already shown a lot of promise, along with the benefits it can bring to our economy, our personal lives and society at large. The question now turns to how enterprises will benefit from AI. But before companies or people can obtain the numerous improvements AI promises to deliver, they must first start with good quality, clean data. Having accurate, cleansed and verified information is critical to ...

Read More on Datafloq
Understanding SDR-Based Attacks on IoT

IoT devices are ubiquitous today. Many of these devices use RF-based communication techniques to connect to other devices or to receive commands from remote controls in the near field. Many devices, like smart tube lights, have a dual-mode operation, in which they connect over the internet to interface with a mobile app, or via RF with a local remote. An RF-based communication interface introduces an entirely new attack surface on IoT devices.

Software-defined radios (SDRs) are versatile pieces of hardware that can change reception and transmission profiles based on software configuration. SDRs are available in half-duplex (reception or transmission, one at a time) or full-duplex (reception and transmission simultaneously) modes. Affordable SDRs like the HackRF have given rise to the recent SDR revolution among radio enthusiasts. These SDRs can be used to analyze the signals transmitted between IoT devices and to transmit rogue messages. In this article we discuss some of the common attacks that work by exploiting signal transmission, and how some new devices mitigate those attacks.

Replay Attacks

The most common type of attack is based on capturing a command sequence and retransmitting it later. This is fairly easy to do using an SDR. The first step is to find out ...
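The capture-and-replay idea, and the rolling-code defence that some newer devices use against it, can be modelled in a few lines. This is a conceptual sketch only: the shared key, code format, and receiver classes are invented for illustration, and no real SDR hardware or radio protocol is involved.

```python
# Illustrative model of a fixed-code vs rolling-code receiver facing a
# replay attacker who records one transmission and sends it again.
import hashlib
import hmac

SECRET = b"shared-device-key"  # hypothetical key paired at manufacture

def rolling_code(counter: int) -> bytes:
    # Each button press derives a fresh code from the key and a counter.
    return hmac.new(SECRET, counter.to_bytes(4, "big"), hashlib.sha256).digest()[:4]

class FixedReceiver:
    CODE = b"\x12\x34\x56\x78"          # same code every transmission
    def accept(self, msg):
        return msg == self.CODE

class RollingReceiver:
    def __init__(self):
        self.counter = 0
    def accept(self, msg):
        if msg == rolling_code(self.counter):
            self.counter += 1            # each code is valid only once
            return True
        return False

# Attacker captures one legitimate message from each device and replays it.
fixed, rolling = FixedReceiver(), RollingReceiver()
captured_fixed = FixedReceiver.CODE
captured_rolling = rolling_code(0)

print(fixed.accept(captured_fixed), fixed.accept(captured_fixed))  # → True True
print(rolling.accept(captured_rolling))  # → True  (first, legitimate use)
print(rolling.accept(captured_rolling))  # → False (replay rejected)
```

The fixed-code receiver accepts the replayed message indefinitely, while the rolling-code receiver invalidates each code after one use, which is why replay attacks fail against such devices.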

Read More on Datafloq
Global Data Summit

I am very pleased to be speaking at the Global Data Summit in Golden, Colorado this year. I am talking about one of my favorite topics: temporal data in the data warehouse, especially in connection with data vault and dimensional modeling. The title is:

Bitemporal modeling for the Agile Data Warehouse

The talk is a 5x5 presentation, that is, 5 slides in 5 minutes. Afterwards, participants have the opportunity to discuss the topic in depth with me in a 90-minute whiteboard session.
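For readers new to the topic of the talk: bitemporal modeling tracks two independent time axes per row, valid time (when a fact holds in the real world) and transaction time (when the warehouse recorded it). The schema and query below are my own minimal illustration of the idea, not material from the talk itself.

```python
# A bitemporal fact table distinguishes:
#   valid time       - when the fact is true in the real world
#   transaction time - when the warehouse learned and recorded it
from dataclasses import dataclass
from datetime import date

@dataclass
class Row:
    key: str
    value: str
    valid_from: date
    valid_to: date        # exclusive; date.max means "until further notice"
    recorded_at: date     # transaction time: rows are appended, never updated
    superseded_at: date   # date.max while this row is the current assertion

rows = [
    # Recorded 2015-01-05: customer lives in Bergen, no known end date.
    Row("cust1", "Bergen", date(2015, 1, 1), date.max, date(2015, 1, 5), date(2016, 3, 1)),
    # Recorded 2016-03-01: we learn the move to Oslo happened on 2016-02-01,
    # so the old assertion is superseded and restated with a closed period.
    Row("cust1", "Bergen", date(2015, 1, 1), date(2016, 2, 1), date(2016, 3, 1), date.max),
    Row("cust1", "Oslo",   date(2016, 2, 1), date.max, date(2016, 3, 1), date.max),
]

def as_of(rows, key, valid, recorded):
    """What did the warehouse believe, as of `recorded`, about `valid`?"""
    return [r for r in rows
            if r.key == key
            and r.valid_from <= valid < r.valid_to
            and r.recorded_at <= recorded < r.superseded_at]

# Before the correction arrived, the warehouse still believed "Bergen":
print([r.value for r in as_of(rows, "cust1", date(2016, 2, 10), date(2016, 2, 15))])  # ['Bergen']
# After the correction, the same real-world date correctly reads "Oslo":
print([r.value for r in as_of(rows, "cust1", date(2016, 2, 10), date(2016, 4, 1))])   # ['Oslo']
```

Because rows are only ever appended and superseded, never overwritten, the warehouse can always reproduce exactly what it reported at any earlier date, which is the property that makes bitemporal designs attractive for auditing.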

How the Insurance Industry Uses Mobility Data

Barely recovering from the financial crisis and facing disruption from new technology, the insurance industry has to leverage information from outside the organization to compete in today’s economy.

With the global economy still struggling to recover from the financial crisis, insurers have been fighting to maintain policy numbers amidst low interest rates. Life insurance products have become less attractive due to low rates, and interest in investment-linked policies has declined as a result of the uncertain economy. In addition, the widespread adoption of new technology across all industries has raised customer expectations of insurance solutions and interaction channels. To maintain their competitive edge, insurers are looking to mobility data to help them stay competitive in today’s market.

What is mobility data?

Mobility data refers to the trajectories of people and objects. In the insurance industry, the movement patterns of people in a city are of particular interest. This is commonly referred to as footfall, which involves measuring the number of people in and around an area within a period of time. With mobility data, insurers can explore new ways to optimize advertising, understand the needs of their customers and create new products.
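As a toy illustration of the footfall idea described above, the sketch below counts distinct devices whose location pings fall inside a bounding box during each hour. The coordinates, device IDs, and bounding box are fabricated for the example; real footfall pipelines work at much larger scale with anonymized data.

```python
# Toy footfall count: distinct devices observed inside an area per hour.
from collections import defaultdict

pings = [  # (device_id, hour, lat, lon) - made-up sample data
    ("a", 9, 52.370, 4.895), ("b", 9, 52.371, 4.896),
    ("a", 10, 52.380, 4.900), ("c", 10, 52.3705, 4.8952),
]
BOX = (52.369, 52.372, 4.894, 4.897)  # lat_min, lat_max, lon_min, lon_max

def footfall(pings, box):
    lat0, lat1, lon0, lon1 = box
    seen = defaultdict(set)
    for dev, hour, lat, lon in pings:
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
            seen[hour].add(dev)          # count each device once per hour
    return {h: len(devs) for h, devs in seen.items()}

print(footfall(pings, BOX))  # → {9: 2, 10: 1}
```

Counting distinct devices rather than raw pings avoids over-counting a single person whose phone reports its position repeatedly.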

Redefining the insurance industry with mobility data

Here are three examples ...

Read More on Datafloq
The Power of Predictive Analytics in Hiring

In an increasingly competitive corporate landscape, making quality hires is critically important for organizations looking to improve their bottom line. The U.S. Department of Labor estimates that the cost of a bad hire is in excess of 30% of that employee’s first-year earnings.

To reduce the occurrence of bad hires, a growing number of businesses are turning to predictive analytics and big data. Using algorithms to analyze past and current data, these businesses can more effectively predict and adapt to future trends. From the sports world to big-box retailers, predictive analytics in hiring is shifting the paradigm of hiring decisions away from resumes and traditional metrics and towards data-driven analysis and advanced simulations.
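As a caricature of this kind of algorithmic screening, the sketch below scores a candidate by majority vote among the most similar past hires (a nearest-neighbour predictor). The features, scores, and outcomes are entirely made up for illustration; real hiring models use far richer data and more sophisticated methods.

```python
# Toy nearest-neighbour hiring screen: predict whether a candidate stays
# past year one from two fabricated features.
past = [  # (interview_score, work_sample_score, stayed_past_year_one)
    (0.9, 0.8, True), (0.8, 0.9, True), (0.4, 0.3, False),
    (0.3, 0.5, False), (0.7, 0.7, True), (0.2, 0.4, False),
]

def predict(interview, work_sample, k=3):
    # Rank past hires by squared distance to the candidate, vote among top k.
    nearest = sorted(past, key=lambda p: (p[0] - interview) ** 2
                                       + (p[1] - work_sample) ** 2)[:k]
    votes = [stayed for _, _, stayed in nearest]
    return votes.count(True) > k // 2

print(predict(0.85, 0.75))  # → True  (resembles past successful hires)
print(predict(0.30, 0.40))  # → False (resembles past bad hires)
```

Even this toy shows the paradigm shift the article describes: the prediction comes from measured outcomes of comparable past hires, not from a subjective read of a resume.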

The Moneyball Phenomenon

The 2001 Oakland Athletics baseball team enjoyed one of the most successful regular seasons in franchise history. Led by Tim Hudson, Mark Mulder, and Barry Zito, a trio of excellent starting pitchers, the team won 102 games. Their 63-18 record in the second half of the season, following the All-Star break, remains the best in MLB history.

Despite the momentum going into the post-season, the A’s faltered in a five-game series against the New York Yankees in the American League Divisional Series. The stage was then set for ...

Read More on Datafloq
The One Board Wargame To Rule Them All

The cover of SPI’s monster wargame, The Campaign For North Africa: The Desert War 1940-43 [SPI]

Even as board gaming appears to be enjoying a resurgence in the age of ubiquitous computer gaming, it appears, sadly, that table-top wargaming continues its long, slow decline in popularity from its 1970s-80s heyday. Pockets of enthusiasm remain however, and there is new advocacy for wargaming as a method of professional military education.

Luke Winkie has written an ode to that bygone era through a look at the legacy of The Campaign For North Africa: The Desert War 1940-43, a so-called “monster” wargame created by designer Richard Berg and published by Simulations Publications, Inc. (SPI) in 1979. It is a representation of the entire North African theater of war at the company/battalion level, played on five maps which extend over 10 feet and include 70 charts and tables. The rule book encompasses three volumes. There are over 1,600 cardboard counter playing pieces. As befits the real conflict, the game places a major emphasis on managing logistics and supply, which can either enable or inhibit combat options. The rule book recommends that each side consist of five players, an overall commander, a battlefield commander, an air power commander, one dedicated to managing rear area activities, and one devoted to overseeing logistics.

The game map. [BoardGameGeek]

Given that to complete a full game requires an estimated 1,500 hours, actually playing The Campaign For North Africa is something that would appeal to only committed, die-hard wargame enthusiasts (known as grognards, i.e. Napoleonic era slang for “grumblers” or veteran soldiers.) As the game blurb suggests, the infamous monster wargames were an effort to appeal to a desire for a “super detailed, intensive simulation specially designed for maximum realism,” or as realistic as war on a tabletop can be, anyway. Berg admitted that he intentionally designed the game to be “wretched excess.”

Although The Campaign For North Africa was never popular, it did acquire a distinct notoriety not entirely confined to those of us nostalgic for board wargaming’s illustriously nerdy past. It retains a dedicated fanbase. Winkie’s article describes the recent efforts of Jake, a 16-year-old Minnesotan who, unable to afford a secondhand edition of the game priced at $400, printed out the maps and rule book for himself. He and a dedicated group of friends intend to complete a game before Jake heads off to college in two years. Berg himself harbors few romantic sentiments about wargaming or his past work, having sold his own last copy of the game several years ago because a “whole bunch of dollars seemed to be [a] more worthwhile thing to have.” The greatness of SPI’s game offerings has been tempered by the realization that the company died for its business sins.

However, some folks of a certain age relate more to Jake’s youthful enthusiasm and the attraction to a love of structure and complexity embodied in The Campaign For North Africa‘s depth of detail. These elements led many of us on to a scholarly study of war and warfare. Some of us may have discovered the work of Trevor Dupuy in an advertisement for Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles in the pages of SPI’s legendary Strategy & Tactics magazine, way back in the day.

Five Ways Healthcare Data Analytics Can Help You

Big data analytics is emerging as a promising field in healthcare, providing valuable insights from extremely large data sets and enhancing outcomes while reducing costs. Intelligent decisions regarding treatment options and interventions can be driven by analyzing data. The ability to track patterns and trends from multiple sources provides the accessibility and insights required to deliver improved outcomes, quality care and better-managed decisions.

Benefits of Data Analytics

Using data analytics, healthcare providers can take charge of their information and convert it into meaningful insights that lead to timely and strategic decisions. Administrators of healthcare enterprises can control and reduce costs through analytics, as it improves operational efficiency without compromising the quality of outcomes and care. Combining clinical and financial data allows for more efficient diagnosis and treatment than the alternatives, since big data analytics is accountable and transparent in how it functions.

Let’s take a look at how data sources can help deliver next-level insights to patients using big data analytics:

Advance Patient Care

Big data analytics helps administrators and clinical providers fill gaps in the services currently offered. The availability of all patient information on one platform facilitates ...

Read More on Datafloq
Book Commentary: Predictive Analytics by Eric Siegel

As much as we’d like to imagine that the deployment and use of predictive analytics has become a commodity for every organization and a given in every “modern” business, the reality is that many small, medium and even large organizations are still not using predictive analytics and data mining solutions as part of their core business software stack.

Reasons can be plenty: insufficient time, budget or human resources, as well as a dose of inexperience and ignorance of its real potential benefits. These and other reasons came to mind when I had the opportunity to read Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die, written by former Columbia University professor and founder of the Predictive Analytics World conference series, Eric Siegel.

Aside from being a clear, well-written book filled with examples and bits of humor to make it enjoyable, what makes this book stand out in my view is that it is written for a general audience, in plain English, which makes it a great option for those new to the field to fully understand what predictive analytics is and its potential effects and benefits for any organization.

With plenty of industry examples and use cases, Mr. Siegel neatly introduces the reader to the world of predictive analytics: what it is, and how this discipline and its tools are currently helping an increasing number of organizations, including Facebook, HP, Google, Pfizer and other big players in their fields, to discover hidden trends, and to predict and plan for better decisions with data.

Another great aspect of the book is its clear and easy explanation of important current topics, including data mining and machine learning, as keys to more advanced topics such as artificial intelligence and deep learning. It also does a good job of mentioning some caveats and the dangers of making wrong assumptions when using predictive analytics.

I especially enjoyed the central section of the book, filled with examples and use cases of predictive analysis in different industries and lines of business, such as healthcare, finance and law enforcement, as well as the list of resources at the end of the book.

Of course for me, having been a practitioner for many years, there was a small sense of wanting a bit more technical and theoretical detail. Still, the book is a great introductory reference both for novices who need to grasp the full potential of predictive analytics and for those familiar with the topic who want to know what their peers are doing, to expand their view of how predictive analysis is applied in their organizations.

If you are still struggling to understand what predictive analysis is, what benefits it can offer your organization, and how it can improve your decision-making and planning abilities, or if you want a fresh view of the new use cases for this discipline and its software solutions, Predictive Analytics by Eric Siegel is certainly a reference you should consider having on your physical or virtual bookshelf.

Have you read the book? About to do it? Don’t be shy, share your comments right below…

Used Kursk Books

Kursk: The Battle of Prokhorovka: used copies of the Kursk book are now selling for $118.80. This is the first time I have seen the book selling for below $200. There are eight “used-acceptable” copies at $118.80. Another seller has a “used-acceptable” copy for $114.82. There are also six “used-good” copies for $120.04, seven “used-very good” for $121.28, and three “used-like new” for $122.51.

Kind of mystified how they ended up with 24 used books.

Why Digital Innovation Starts with a Digital Core

A lot of times when the prevalent industry trends are discussed among industry folks, there are usually two directions in which the conversation goes. It is either varying states of disbelief at the rate of change within the business and IT landscapes; or it is enthusiastic agreement on the importance of moving with the times and adopting a digital infrastructure. The former is born of a dedication to supposedly “tried and tested” methods, which arguably are worth next to nothing in this day and age. And the latter is what more and more business leaders and executives need to be doing at the moment.

Relevance of Digital Transformation

The economy, for the most part, has begun to undergo a massive change in the way its entire infrastructure and various other components function. This is due to the onset of the digital age, which has made the idea of a “traditional” economy effectively obsolete; and in good time too, since the relevance of digital transformation is growing by leaps and bounds each day, with some industry experts predicting that the transformation will be complete around 2020.

It is estimated that a whopping 212 billion (yes, Billion!) sources will be in ...

Read More on Datafloq
How Edge Computing will Give a New Life to Health Care

With the advent of edge computing, the health care industry has transformed considerably, and hospitals and clinics are gearing up to take better and faster care of their patients.

In fact, edge computing has permeated the industry so powerfully that clinicians and doctors heavily rely on it to treat patients.

As more and more devices get connected in the health care industry, networking among them has become a huge task, because the data that keeps coming in is never going to slow down.

What is Edge Computing?

Edge computing, or fog computing, is a technology by which the time between data capture and analytics is considerably reduced. It works by upending the usual system: the devices themselves are configured to handle critical analysis on their own, so that only filtered results reach the user’s device.
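A minimal sketch of that filtering step: the device inspects each reading locally and forwards only the out-of-range ones upstream. The heart-rate band and the readings are illustrative assumptions, not clinical thresholds.

```python
# Sketch of edge-side filtering: analyse readings on the device and
# forward only anomalies to the cloud. Thresholds are illustrative only.
NORMAL_HR = range(50, 110)   # assumed resting heart-rate band, bpm

def edge_filter(readings):
    """Yield only the readings that need cloud-side attention."""
    for timestamp, bpm in readings:
        if bpm not in NORMAL_HR:
            yield timestamp, bpm   # anomaly: forward upstream

stream = [(1, 72), (2, 68), (3, 131), (4, 70), (5, 45)]
print(list(edge_filter(stream)))  # → [(3, 131), (5, 45)]
```

Of five readings, only the two anomalies leave the device, which is how edge computing cuts both latency and upstream bandwidth.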

Through this technology, patient management, remote monitoring, in-patient care and health information management all receive a burst of speed. As it adds a secure layer of computing power between the cloud and the device, there is no need to worry about information falling into the wrong hands.

Health information management can come in various forms, like wearables, home scales, telehealth tools, healthcare apps, blood glucose monitors, heart rate monitors and plenty more, so while devices get smaller and data gets ...

Read More on Datafloq
How can Blockchain Revolutionise Mobile App Security?

Have you heard of Blockchain? That's already an obsolete question for many of us who have closely followed the way data-centric security measures have evolved in the past few years. And yes, you have guessed it right: the blockchain is about data, and it is also about security. The blockchain is also about collaboration and transactions.

But a blockchain is never a server full of data locked down to prevent unwanted access. With blockchain, the opposite is the case: an open-for-all database that cannot be rewritten or tampered with, while remaining open for any subsequent addition. That's blockchain for you.
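That tamper-evidence property can be illustrated with a minimal hash chain, the core data structure behind a blockchain: each entry commits to the hash of the previous one, so rewriting any earlier entry breaks every later link. This is a toy sketch only, with no consensus protocol or distribution.

```python
# Minimal hash chain: each block commits to the previous block's hash,
# so rewriting any earlier entry invalidates every later link.
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def append(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"data": data, "prev": prev, "hash": block_hash(prev, data)})

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["data"]):
            return False
        prev = block["hash"]
    return True

chain = []
for tx in ["alice->bob:5", "bob->carol:2"]:
    append(chain, tx)
print(verify(chain))                   # → True
chain[0]["data"] = "alice->bob:500"    # attempted rewrite of history
print(verify(chain))                   # → False (tampering is detected)
```

Appending new blocks is always possible, but silently editing old ones is not: exactly the "open for addition, closed to rewriting" behaviour described above.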

But what relation does it have to mobility and the app market? As its core competence suggests, blockchain can safeguard data from misuse or tampering while keeping it accessible to all.

It all started with the cryptocurrency Bitcoin, followed by popularity in the financial sector as a high-tech measure to protect transactions, and now it is knocking on the door of mobile apps. With such promising features for data protection, blockchain could well revolutionise mobile app security.

Distributed database powers blockchain

The concept of a distributed open source database was coined long ago, but it was shaped by a relevant ...

Read More on Datafloq
Secretary of the Army, take 3

On 19 July 2017 Mark Thomas Esper was nominated to be the new Secretary of the Army. This is the third nomination for this position, as the first two nominees, Vincent Viola and Mark E. Green, withdrew. Not sure when Congress will review and approve this nomination; I am guessing it won’t happen in September. The acting Secretary of the Army is Ryan McCarthy (approved as Undersecretary of the Army in August 2017).

Mr. Esper’s background:

  1. Graduate of USMA (West Point) in 1986 with a BS in Engineering.
  2. Masters degree from Harvard in 1995.
  3. PhD from GWU in 2008.
  4. Served as an infantry officer with the 101st Airborne Division during the Gulf War (1990-1991).
  5. Over ten years of active duty (1986-1996?). I gather he is still in the Army Reserve as a lieutenant colonel.
  6. Chief of Staff of the Heritage Foundation, 1996-1998.
  7. Senior staffer for Senate Foreign Relations Committee and Senate Governmental Affairs Committee, 1998-2002.
  8. Policy Director, House Armed Services Committee, 2001-2002.
  9. Deputy Assistance Secretary of Defense for Negotiations Policy, 2002-2004.
  10. Director of National Security Affairs for U.S. Senate, 2004-2006.
  11.  Executive Vice President at Aerospace Industries Association, 2006-2007.
  12. National Policy Director for Senator Fred Thompson’s 2008 Presidential campaign, 2007-2008.
  13. Executive Vice President of the Global Intellectual Property Center, and Vice President for Europe and Eurasia at U.S. Chamber of Commerce, 2008-2010.
  14. Vice President of Government Relations at Raytheon, 2010 to present.

How Leading Organizations are Leveraging Big Data and Analytics

“Data will talk to you if you’re willing to listen” — Jim Bergeson.

Few can dispute that.

However, the challenge comes when data transforms into bundles and stacks of unorganized and unstructured data sets. The challenge comes with listening to big data and making sense of what it says.

With big data, the conversing data becomes loud and noisy. You don’t hear the voice; you hear the cacophony. This is where organizations struggle.

And, amidst a struggle, you look up to the leaders to see how they are rising to the challenge. You observe, you learn, you implement and you adapt.

This is the first article of my “Under the Spotlight” series, where we will look at how leading organizations are leveraging big data and analytics, filtering out white noise from the cacophony in the process—to closely follow and benefit from what data has to say.

These organizations are spaced among different industry verticals, including aerospace industry, sports industry and life sciences industry, along with government agencies.

Airbus Leveraging Big Data and Analytics to Improve Customer Experience

Airbus has been a global leader in the aerospace industry for the last four decades, specializing in the design and manufacture of aerospace products, services and solutions.

Operating in a complex and highly-competitive industry ...

Read More on Datafloq
Rocket Man

One of the two best versions of this song, from 2011:

The other great version, from 1978:


Human Factors In Warfare: Diminishing Returns In Combat

[Jan Spousta; Wikimedia Commons]

One of the basic problems facing military commanders at all levels is deciding how to allocate available forces to accomplish desired objectives. A guiding concept in this sort of decision-making is economy of force, one of the fundamental and enduring principles of war. As defined in U.S. Army’s Field Manual FM 100-5, Field Service Regulations, Operations (which Trevor Dupuy believed contained the best listing of the principles):

Economy of Force

Minimum essential means must be employed at points other than that of decision. To devote means to unnecessary secondary efforts or to employ excessive means on required secondary efforts is to violate the principle of both mass and the objective. Limited attacks, the defensive, deception, or even retrograde action are used in noncritical areas to achieve mass in the critical area.

How do leaders determine the appropriate means for accomplishing a particular mission? The risk of assigning too few forces to a critical task is self-evident, but is it possible to allocate too many? Determining the appropriate means in battle has historically involved subjective calculations by commanders and their staff advisors of the relative combat power of friendly and enemy forces. Most often, it entails a rudimentary numerical comparison of numbers of troops and weapons and estimates of the influence of environmental and operational factors. An exemplar of this is the so-called “3-1 rule,” which holds that an attacking force must achieve a three to one superiority in order to defeat a defending force.

Through detailed analysis of combat data from World War II and the 1967 and 1973 Arab-Israeli wars, Dupuy determined that combat appears subject to a law of diminishing returns and that it is indeed possible to over-allocate forces to a mission.[1] By comparing the theoretical outcomes of combat engagements with the actual results, Dupuy discovered that a force with a combat power advantage greater than double that of its adversary seldom achieved proportionally better results than a 2-1 advantage. A combat power superiority of 3 or 4 to 1 rarely yielded additional benefit when measured in terms of casualty rates, ground gained or lost, and mission accomplishment.

Dupuy also found that attackers sometimes gained marginal benefits from combat power advantages greater than 2-1, though less proportionally and economically than the numbers of forces would suggest. Defenders, however, received no benefit at all from a combat power advantage beyond 2-1.

Two human factors contributed to this apparent force limitation, Dupuy believed: Clausewitzian friction and breakpoints. As described in a previous post, friction accumulates on the battlefield through the innumerable human interactions between soldiers, degrading combat performance. This phenomenon increases as the number of soldiers increases.

A breakpoint represents a change of combat posture by a unit on the battlefield, for example, from attack to defense, or from defense to withdrawal. A voluntary breakpoint occurs due to mission accomplishment or a commander’s order. An involuntary breakpoint happens when a unit spontaneously ceases an attack, withdraws without orders, or breaks and routs. Involuntary breakpoints occur for a variety of reasons (though contrary to popular wisdom, seldom due to casualties). Soldiers are not automatons and will rarely fight to the death.

As Dupuy summarized,

It is obvious that the law of diminishing returns applies to combat. The old military adage that the greater the superiority the better, is not necessarily true. In the interests of economy of force, it appears to be unnecessary, and not really cost-effective, to build up a combat power superiority greater than two-to-one. (Note that this is not the same as a numerical superiority of two-to-one.)[2] Of course, to take advantage of this phenomenon, it is essential that a commander be satisfied that he has a reliable basis for calculating relative combat power. This requires an ability to understand and use “combat multipliers” with greater precision than permitted by U.S. Army doctrine today.[3] [Emphasis added.]


[1] This section is drawn from Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), Chapter 11.

[2] This relates to Dupuy’s foundational conception of combat power, which is clearly defined and explained in Understanding War, Chapter 8.

[3] Dupuy, Understanding War, p. 139.

How European Organizations Increase Their Innovation Speed with Smart Data Application

For the third consecutive year, GoDataDriven collaborated with Big Data Expo to conduct the Big Data Survey. Over 800 professionals participated in this third edition of the Big Data Survey and shared their insights and experiences on topics like data strategy, implementing data science, technology, cloud, GDPR, and how to become attractive as an employer.

What Makes a Data Strategy Successful?

As in the first two editions of the Big Data Survey, vision (87%) remains the most important aspect of a successful Big Data strategy, followed by talent (54%) and last year’s number two, support from management (50%).

Improving data quality is the largest challenge when it comes to implementing data infrastructure, followed by making data available and implementing data governance.

“Our management has a clear vision of the type of data products our organization needs to realize. We are constantly working to collect more data that supports us in executing our Big Data Strategy,” says Erik van Osenbruggen, Operational Manager at Promovendum/CAK Group.

Although 9 out of every 10 organizations see lots of potential with data, data are primarily applied in dashboards and not so much to develop predictive models. The number of websites that are personalized in real-time using artificial intelligence is still very limited.

Becoming ...

Read More on Datafloq
5 Benefits and Applications of Internet of Things Technology

Interconnected devices that serve the purpose of enhancing our everyday life are nothing new on the market. For years we have been using smartphones, electrical home devices and the World Wide Web to make our lives easier and more streamlined.

One factor that connects all of these together is the Internet of Things, a technology that aims to automate and further simplify the way we live by taking over many of our everyday processes. But what are the positives of including such advanced technology as IoT in everything we do, and how can we benefit from using it?

1. Autonomous vehicles

There’s no denying that self-driving cars are no longer a fantasy. Google has reigned over the autonomous vehicle market for several years, offering a very real and practical use of IoT to streamline our commuting.

You can now buy a car that drives itself to and from work, suggests shortcuts, notifies you of weather changes and possible malfunctions, and even detects the type of music you’d like to hear during your drive. These cars are few and far between, costing upwards of $1 billion, but they are real and powered by IoT, which connects them to their central ...

Read More on Datafloq
How Artificial Intelligence is Making Mobile Apps Ever Smarter?

Artificial intelligence, one of the most talked-about technologies in recent times, actually started its journey back in the 1950s, when the concept and term were first coined. Ever since, users worldwide have believed that machines would gradually start performing tasks once reserved for humans. There was huge enthusiasm concerning robotics that lasted for decades.

Finally, we have Artificial Intelligence on the mobile interface and in connected environments, which have become the biggest and most influential proponents of the concept. The launch of IBM’s Watson at the beginning of the previous decade was appreciated around the world and became the hallmark of a new beginning.

But it was only with smartphones and voice-enabled digital assistants that AI took over mobile interfaces.

Now, we have AI in almost every handheld and connected device. From digital assistants that quickly reply to queries with immediate answers and results, to chatbots capable of guiding web traffic and app users, to devices responsive to voice commands at home and in the workplace, machines are behaving like humans and learning from user actions more than ever before. It is needless to mention that mobile apps have been the biggest beneficiaries of Artificial ...

Read More on Datafloq
How Big Data Analytics Boosts Cyber Security

As more and more parts of our lives become connected to the internet, and more of our daily transactions take place online, cybersecurity is becoming an increasingly important topic. Just as modern technology changes more quickly than ever before, cybercriminals create newer and faster ways to target and rip off organizations. New malware is difficult to detect using older strategies, which means we need new cybersecurity strategies to ensure commercial security.

One such new strategy is to use big data analytics. Big data analytics is an automated process by which a computer system examines large and varied sets of data to find patterns and trends. It is currently used to help companies track customer preferences and therefore better target their products and advertisements to specific users. However, with some reprogramming, those same big data analytics could be used to detect, respond to, and ultimately prevent cybercrime.
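As a toy illustration of the kind of automated pattern-finding involved (a generic z-score outlier check in Python; real security analytics pipelines are of course far more sophisticated, and the login data here is invented):

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.5):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu = mean(values)
    sigma = stdev(values)  # sample standard deviation; needs >= 2 values
    if sigma == 0:
        return []  # no variation, nothing can be anomalous
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily login counts for one account; the last day stands out.
logins = [12, 9, 11, 10, 13, 8, 11, 12, 10, 240]
print(zscore_anomalies(logins))  # → [240]
```

The same idea, applied across millions of events rather than ten, is what lets an automated system surface the handful of alerts worth a human analyst's time.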

Here are some ways that big data analytics could help in the fight against cyber criminals.

1. Identifying Anomalies in Device and Employee Behavior

It is nigh-on impossible for a human user to manually analyze the millions of alerts that internet customers generate each month and pick out the valid ones from the threats. A ...

Read More on Datafloq
Economics of Warfare 20


This is the twentieth lecture from Professor Michael Spagat’s Economics of Warfare course that he gives at Royal Holloway University. It is posted on his blog Wars, Numbers and Human Losses at:

This lecture continues the discussion of terrorism, looking at whether poverty or poor education causes terrorism. The conventional wisdom, supported by a book by Alan Krueger, is that they do not. The lecture presents four studies. Of those, one study (Krueger) argues that they do not (see pages 4-5), while three of the studies (Enders and Hoover, de Mesquita, and Benmelech) find a limited association (see pages 6-14, 15, and 21). Some of these three studies have to work pretty hard to make their point. One is left to conclude that while poverty may have some impact on the degree of terrorism and the recruitment of terrorists, it is probably not the main or determining factor. We probably need to look elsewhere for the root causes.

The link to his lecture is here:


Big Data and Healthcare – It’s a Great Match

A patient has arrived at his primary care physician’s office. His symptoms are unusual – some fit one diagnosis but others do not. Traditionally, that doctor has had to rely on his training and experience, his medical journals, and, occasionally, consultation with colleagues.

Big data has changed all of that. With access to a huge database of global medical information, that doctor can now input all of the symptoms and receive potential diagnoses based upon patient histories from everywhere. Not only will he obtain potential diagnoses but also treatments that have proved successful.

Big Data Disruption

Big data has disrupted many industries. It has drastically altered the way in which financial services, insurance, and even investment enterprises do business. But a key industry that is less discussed, one in which big data has made a huge impact, has been health care.

Exactly What is “Big Data”?

We now have the technology to gather huge amounts of information from multiple sources. That gathering is useless, however, unless that information can be categorized and synthesized so that it makes sense to users. Fortunately, we have the technology to do that too.

So, in the healthcare industry, for example, we can gather ...

Read More on Datafloq
Thoughts on Looker JOIN 2017 and the Looker Product Roadmap

Anyone who follows me on Twitter will probably know that I’ve been at the Looker JOIN 2017 product conference in San Francisco this week, if only because I’ve been tweeting constantly for the past two days about the new features coming with the Looker 5 release, the vibrant partner ecosystem and customers coming-up on stage and talking about how Looker had transformed their organization and helped them become more data-driven.

I’d initially registered to go to the event on my own behalf but ended up attending in my new role as Product Manager responsible for an analytics service that used Looker as its data analysis component, speaking about our experiences and meeting our contacts in their partner and product engineering teams.

But I was also interested in what Looker were planning on doing next, given the very interesting position they were now in: having ridden the wave of the new analytic databases introduced by Google and Amazon, they had started to displace Tableau and other standalone data visualization tools in the types of data-driven digital businesses and tech startups that value Looker’s ability to query and analyze all the data they collect, rather than just the subsets tools like Tableau can work with.

Looker started out as a back-end technology play, with its tech-focused initial set of customers willing to overlook more primitive front-end features; an example was Qubit when I arrived, whose engineering team loved the LookML modeling language and the efficient way in which it accessed their BigQuery data layer. As Looker’s customer base becomes more mainstream and sales grow in the enterprise market, they’ll need to close the feature gap with tools like Tableau and, more generally, work on the front-end user experience, particularly the initial experience when users first log in and, from my experience, aren’t really sure what to do next.

Looker are by no means the first BI software vendor to face this challenge, and I remember a similar evolution taking place with Oracle’s BI platform in the years following the Siebel acquisition. Features such as a user homepage that surfaced recent and relevant content, integration links into other applications and business processes, and a general UI makeover were introduced with the 11g release, as I wrote about in a blog post at the time for my old company …but at the same time Oracle introduced deep integration with their Oracle Fusion Middleware platform that greatly increased the complexity of the product, added primarily to enable BI functionality to be built into their upcoming Fusion Applications business applications suite.

None of this is particularly meant as criticism of Oracle in particular, and I only use it as an example because I wrote the book on it back in the day and built a business providing consulting and training for customers moving onto this platform from earlier versions. And of course this is in part an argument for delivering BI as cloud-based software-as-a-service, as Oracle and others have since done, but even with a move to the cloud much of the product complexity remains, and you can’t help noticing that development priority is now around enabling, and in some cases requiring, customers to adopt other products from that supplier rather than building out core BI functionality.

What’s interesting to me is the opportunity Looker has to similarly transition from a niche to the mainstream, but with a blank sheet of paper when creating the product’s architecture and, more importantly, without the “strategy tax” of having to integrate and drive sales of unrelated business and infrastructure software. Just as interesting, though, is the opportunity this focus provides for Looker to evolve their product, which just happens to be built on a modern, scalable and standards-based platform, into a platform for running data applications of which Looker the BI tool is just one example, just as Salesforce did with their platform some while ago.

To get some sense of how successful this strategy has been for Salesforce, consider how busy San Francisco is when Oracle OpenWorld comes to town each year: 60,000 customers, vendors and salespeople descend on downtown San Francisco, fill all the bars and hotel rooms, and make crossing the road down by the Moscone a major logistical exercise. Two weeks later, though, Salesforce roll into town with their Dreamforce event and 180,000 turn up; now it’s 2nd and 3rd Street that are turfed over, and not just the gap between Moscone North and South that the city’s only just opened up after OpenWorld.

So what new features were announced at Looker JOIN 2017 with the launch of Looker 5, the latest release of their platform? Well, if you followed my tweets at the time you’ll have seen the UI had a refresh, with a new initial homepage surfacing content users had recently worked on and content commonly used within their group, together with the Action Hub, an evolution of the integrations feature recently added to the Looker 4 platform and used by Segment to let users route newly-discovered user cohorts to their platform and onwards to other marketing applications. The similarity to features Oracle added to their BI tool back in 2007 was noted by me at the time (and several times subsequently).

Other UI improvements included a feature in which statistical analysis is used to automatically suggest which other columns are commonly used alongside the ones you add to a Look, a feature added to help new users navigate what can often be fairly complex and lengthy explores with no particular indication of which columns are useful for a particular report.

Basic query federation will also come with this new release, giving users the ability to join data from two (or more?) explores in a query and have the front-end environment join them together. It’s not clear whether this join happens in-memory in the web browser (I think so) or server-side, but it’s clear that these joins aren’t defined once-only in the LookML model and instead seem to be transitory, defined as needed by the user as part of an individual Look.

More importantly, as with all query federation features in tools going back to the original nQuire Query Server that provided the core of Oracle’s subsequent BI Server query federation technology, there’s a limit to the size of tables you can join, both from a network bandwidth perspective and because the server running the federation engine is typically much less powerful than the database engines it sources the data from. That’s something to bear in mind when thinking about joining BigQuery datasets to other sources once this feature becomes available.
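A minimal sketch of what such a front-end federation layer has to do (a generic in-memory hash join in Python; Looker’s actual implementation details weren’t disclosed, and the row data is invented). It also shows why result-set size, not source-database capacity, becomes the practical limit: both fetched result sets must fit in the joining process’s memory.

```python
def federated_join(left_rows, right_rows, key):
    """Hash-join two already-fetched result sets on a shared key."""
    # Build an index over the right-hand rows for O(1) lookups per key.
    index = {}
    for row in right_rows:
        index.setdefault(row[key], []).append(row)
    # Probe with the left-hand rows and merge matching pairs.
    joined = []
    for row in left_rows:
        for match in index.get(row[key], []):
            joined.append({**row, **match})
    return joined

orders = [{"user_id": 1, "total": 40}, {"user_id": 2, "total": 15}]
users = [{"user_id": 1, "country": "US"}]
print(federated_join(orders, users, "user_id"))
# → [{'user_id': 1, 'total': 40, 'country': 'US'}]
```

Nothing here pushes work down to the source databases: every row must first be shipped over the network to the federation process, which is exactly the bandwidth-and-memory constraint described above.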

A very welcome development from my work perspective was the addition of what Looker are calling “Analytic Blocks”, “Data Blocks” and “Viz Blocks” to their existing Looker Blocks method of packaging up best-practice metadata models and analysis techniques shared by Looker and partner developers. Analytic Blocks seem to be a further development of existing Looker Blocks, with announced examples covering web analysis and cohort analysis, whilst Viz Blocks extend Looker’s capability to add custom data visualizations to the hosted (and more commonly used) version of Looker. Data Blocks, meanwhile, package up LookML patterns for public datasources and commonly used SaaS data services.

My thoughts on Looker evolving into a broader platform for hosting data-driven applications were confirmed when Looker announced new analytic apps. At first I thought I’d heard these positioned as having accompanying ETL routines, which as we know from bitter experience usually end up being more hassle than they’re worth, but they’re actually just examples of data, analytic and viz LookML blocks brought together to create a specific industry or horizontal application.

Finally, Looker announced one feature due out soon and another due later in 2018 that address a problem I’d certainly identified myself but didn’t expect Looker to try to solve, and I was pleased they’re making it a focus. The breakthrough that new cloud-based, massively-distributed query engines such as Google BigQuery gave users is their ability to query petabyte-size datasets with response times typically within a minute or less, which is fantastic and was the topic of my presentation in the deep-dive sessions.

But users aren’t impressed that you’ve managed to query petabytes of data in under a minute … they want response times of under a second, and BigQuery just isn’t optimized for that. What is optimized for it is in-memory cache technology such as Druid, used by Superset and a few other cutting-edge BI tools, and Looker announced imminent support for Druid, with the proviso that functionality would be limited at first and evolve over time as the engineering team got to grips with Druid’s different approach to data access and querying.

For myself, though, the real prize in developing out this query optimisation would be making Looker aggregate-aware, so that pre-built aggregates in Druid, MySQL or other fast-response datastores could be mapped into Looker’s LookML model and used automatically by the query engine when data is requested at summary level. As far as I know no similar feature is on Google’s or Amazon Web Services’ roadmap, and therefore investment by Looker to solve this “last mile” query problem would not be wasted.
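To make concrete what aggregate-awareness would mean in practice, here is a hypothetical sketch in Python (the table and dimension names are invented, and this is illustration rather than LookML): a query router picks a pre-built rollup when it covers the requested grouping, and falls back to the raw fact table otherwise.

```python
# Hypothetical catalog: each rollup table and the dimensions it preserves.
AGGREGATES = [
    ("sales_by_day_region", {"day", "region"}),
    ("sales_by_month", {"month"}),
]
FACT_TABLE = "sales_raw"

def pick_table(requested_dims):
    """Route to the first pre-aggregated table whose dimension set covers
    the requested grouping; fall back to the raw fact table otherwise."""
    candidates = [name for name, dims in AGGREGATES
                  if set(requested_dims) <= dims]
    return candidates[0] if candidates else FACT_TABLE

print(pick_table(["region"]))          # covered by the day/region rollup
print(pick_table(["day", "product"]))  # no rollup covers product → raw table
```

The payoff is exactly the “last mile” described above: summary-level dashboards hit small, fast rollup tables in under a second, while only genuinely detailed queries pay the cost of scanning the raw data.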

Another potential solution, shown in the final roadmap session at the end of the event, uses memory in the user’s desktop browser to cache data returned by a Look, so that all subsequent front-end interactions run at split-second speed.

So, based on what I saw at Looker JOIN 2017, do Looker the BI tool and Looker the company have the potential to grow beyond their current super-enthusiastic market of tech startups and VC-funded retail and other high-growth companies? Will they end up being bought by Google to complement its more basic and free-to-use Google Data Studio tool (Alphabet Ventures led the most recent round of investment in Looker back in May this year)?

Or will usage grow rapidly at first and then plateau, with further growth limited by the very platform choices that were innovative at the time but now limit their ability to take advantage of the next wave of IT innovation, as happened with Qlik and their eventual buyout by private equity?

Or will they suffer perhaps the worst outcome in this story and be bought by one of the software mega-vendors, to be adapted thereafter into the analysis and reporting solution for that company’s applications, with more general analytics innovation mostly ending as engineers depart and customers move on to the next point solution? Well, that may be the fate of BeyondCore, but having met and talked with their founder, Arijit Sengupta, last year just after their acquisition by Salesforce, I think they’ll end up subsumed as a distinct product within Einstein but play a big role in Salesforce’s drive to add AI to CRM and sales automation.

My opinion, from what I saw at JOIN 2017 and from speaking with their product teams, customers and founder Lloyd Tabb on a couple of occasions (one of which was at the main customer service desk, where I saw him regularly speaking with attendees and helping with support questions just after delivering the product keynote), is that they’ve got the vision, market positioning and ability to execute that give them a shot at the big prize, and enough VC funding to go for it without an immediate need to sell to a trade buyer and dilute their current focus. Good luck to them; I’m already looking forward to next year’s JOIN event and a trip back to San Francisco.

Thoughts on Looker JOIN 2017 and the Looker Product Roadmap was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Why Bitcoin Will Ultimately Fail and What Will Come Next

We live in exciting times, where it has become possible to send money across the globe nearly instantaneously, where you can create value out of nothing, and where we are working towards a decentralised future. The first application that kickstarted this revolution was Bitcoin, launched in 2009. Since then, the price of Bitcoin has increased dramatically, reaching $5,000 for the first time in September 2017. The news of the Chinese government banning ICOs resulted in a brief dip in the value of the coin, as well as of almost all other cryptocurrencies, but it recovered within two days. Bitcoin seems to continue rising in value, with multiple people predicting that it will reach $10,000 by the end of 2017, $250,000 in ten years, $500,000 by 2030, or even $1 million in 5-10 years. Obviously, there is a hype going on if such enormous returns are predicted.

However, I believe that Bitcoin will not reach those valuations. In fact, I believe that Bitcoin will ultimately fail and be worth nothing or close to nothing. This will probably not happen in the short term. Most likely the value will keep ...

Read More on Datafloq
BOARD International: Cognitive, Mobile, and Collaborative

Business Intelligence (BI) and Enterprise Performance Management (EPM) software provider BOARD International recently released version 10.1 of its all-in-one BI and EPM solution. This release includes new user experience, collaboration, and cognitive capabilities, which will enable BOARD to enter into the cognitive computing field.

By incorporating all these new capabilities into its single BI/EPM offering, BOARD continues to uphold its philosophy of offering powerful capabilities within a single platform.

With version 10.1, BOARD aims to significantly improve the way users interact with data. The new version’s interface introduces new user interaction functionality in areas such as user experience and storytelling and is a major improvement over that of the previous version.

BOARD gave me an exclusive overview of the main features of version 10.1 and the company's product roadmap. Read on for details.

Getting On Board with Cognitive Technologies

With version 10.1, BOARD seems to be making its solution fit for a new era centered on machine learning. The solution uses natural language recognition (NLR) and natural language generation (NLG) capabilities to offer users new ways to interact with data (see Figure 1).

Figure 1. BOARD’s assistant (image courtesy of Board International)

For instance, users can now create an entire report in a drag-and-drop interface. They can also ‘talk’ directly to the system through spoken and written language. The system uses search-like strings and automatically translates human speech into words, words into queries, and queries into reports that include the most important insights from the source information.

One key aspect of these features is that users can create a report by simply writing a search string or request. Specifically, BOARD uses a fuzzy search mechanism that matches character sequences that are similar, not just identical, to the query term in order to transform this request into a machine-generated report (Figure 2).
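As a rough sketch of how this kind of fuzzy matching can work (using Python’s standard-library difflib as a stand-in; BOARD’s actual algorithm isn’t public, and the report names are invented):

```python
import difflib

def fuzzy_match(query, candidates, cutoff=0.6):
    """Return candidates whose similarity to `query` meets `cutoff`,
    best matches first. The ratio is 1.0 for identical strings, so exact
    matches rank first and near-misses (typos, plurals) still surface."""
    scored = [(difflib.SequenceMatcher(None, query.lower(), c.lower()).ratio(), c)
              for c in candidates]
    return [c for score, c in sorted(scored, reverse=True) if score >= cutoff]

reports = ["Sales by Region", "Sale by Regions", "Inventory Aging", "Salaries"]
print(fuzzy_match("sales by region", reports))
# → ['Sales by Region', 'Sale by Regions']
```

The point is that a request doesn’t have to match a stored report name character for character: a close-enough string still finds the existing work, which is what makes reuse of other users’ reports practical.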

Figure 2. BOARD’s machine-generated report analysis (image courtesy of Board International)

BOARD can also identify, recover, and list reports that match the search criteria, such as reports generated by other users. This capability speeds up the solution development process by enabling users to identify existing work that can be used for a new purpose.

In-context Collaboration

BOARD has also improved its collaboration strategy, specifically by facilitating communication between users. The vendor has introduced an in-context collaboration feature that enables users to share their analyses and communicate via live chat, and allows multiple users to edit and author reports in a single interface. Embedded security (Figure 3) ensures users have the right level of access and defines user groups. This enables users to share analytics securely and should improve the overall analysis of data and the development of analytics apps.

Figure 3. BOARD’s embedded collaboration features (Courtesy of Board International)

User Experience and Storytelling

BOARD is also continuing to focus heavily on customer experience and functional efficiency.

The latest version of BOARD’s BI and EPM platform has redesigned user interfaces, including a color-coded tile menu with icons to improve hierarchy management and touchscreen usability. In addition, the configuration panel now offers more time and analytics functions.

10.1 also introduces Presentations—a new storytelling capability that enables users to personalize their reports and save them as a live presentation. This enables users to share presentations that incorporate live information rather than static images and graphs with other users and groups, improving user collaboration.

This new feature lets BOARD stay up to date with current trends in BI and compete with other players in the field that already offer similar capabilities, such as Tableau and Yellowfin.

Mobility, Cognitive Capabilities, and Collaboration: BOARD’s Bet for the Future

BOARD also explained that it‘s paving the way for medium- and long-term product advancements.

In its latest release, BOARD has ensured its HTML5-based client will replicate all the functionality of its existing Windows client interface in the future. This will enable users to choose between mobile and desktop devices.

10.1 also introduces new mobile apps and add-ons, which widen BOARD’s intrinsic analytics and data management capabilities and the solution’s mobile functions and features. The company is also currently reinforcing the product’s interaction with the Microsoft Office software stack in a continuous effort to help users increase productivity. This will help users conduct BI and EPM analysis more easily, as they will have access to embedded analytics services within standard Office applications such as Word and Excel.

Lastly, 10.1 also includes more features for accessing big data sources and cloud-based technologies, and BOARD has partnered with a leading cloud CRM and business software vendor. It’s also worth noting that BOARD is now expanding its North American presence. Specifically, the vendor is increasing the number of its human and material resources to reinforce its marketing and sales efforts and support and services capabilities.

BOARD 10.1 offers a good balance of analytics and enterprise performance management capabilities. It could be a solution for those looking to start using analytics or enhance their existing analytics capabilities.

(Originally published on TEC's Blog)

Copyright © 2017 BBBT - All Rights Reserved