Choose the right AI method for the job

It’s hard to remember the days when artificial intelligence seemed like an intangible, futuristic concept. Today, AI is everywhere. This has been decades in the making, however, and the past 90 years...

If Your Data Is Bad, Your Machine Learning Tools Are Useless

Poor data quality is enemy number one to the widespread, profitable use of machine learning. The quality demands of machine learning are steep, and bad data can rear its ugly head twice both in the...

The IoT Challenge for Local Governments as Data Stewards

The ‘Internet of Things,’ (IoT) is all around us, every day and everywhere. The boardroom, courtroom, classroom, and coffee shop; wherever you go, people are sporting a smart watch they use as a...

It’s Time to Prepare for GDPR

Recent high-profile data breaches mean that the issue of protecting customers’ information is rarely out of the headlines. Several major companies worldwide have suffered data breaches with...

7 Trends That Will Drive Digital Transformation

There’s only one thing we’re sure of, and that is that change will continue to happen. Below are the top seven trends that will drive digital transformation in 2018. Blockchain continues...

Improve Your SMB’s Data Security

As a small business, you might believe you’re not at risk for a data security breach. However, you’re just as vulnerable as larger corporations. Learn how to safeguard your data with these...

What Facebook’s Data Scandal Really Means for Regulators

China is racing against the U.S. to dominate the development of Artificial Intelligence. The scandal over the alleged abuse of Facebook Inc.’s user data is unfolding a long way from China, and yet...

The Dupuy Institute Air Model Historical Data Study

British Air Ministry aerial combat diagram that sought to explain how the RAF had fought off the Luftwaffe. [World War II Today]

[The article below is reprinted from the April 1997 edition of The International TNDM Newsletter.]

Air Model Historical Data Study
by Col. Joseph A. Bulger, Jr., USAF, Ret

The Air Model Historical Data Study (AMHDS) was designed to lead to the development of an air campaign model for use by the Air Command and Staff College (ACSC). This model, never completed, became known as the Dupuy Air Campaign Model (DACM). It was a team effort led by Trevor N. Dupuy and included the active participation of Lt. Col. Joseph Bulger, Gen. Nicholas Krawciw, Chris Lawrence, Dave Bongard, Robert Schmaltz, Robert Shaw, Dr. James Taylor, John Kettelle, Dr. George Daoust and Louis Zocchi, among others. After Dupuy’s death, I took over as the project manager.

At the first meeting of the team Dupuy assembled for the study, it became clear that this effort would be a serious challenge. In his own style, Dupuy was careful to provide essential guidance while, at the same time, cultivating a broad investigative approach to the unique demands of modeling air combat. It would have been no surprise if the initial guidance had established a focus on the analytical approach, level of aggregation, and overall philosophy of the QJM [Quantified Judgment Model] and TNDM [Tactical Numerical Deterministic Model]. But it was clear that Trevor had no intention of steering the study into an air combat modeling methodology based directly on QJM/TNDM. To the contrary, he insisted on a rigorous derivation of the factors that would permit the final choice of model methodology.

At the time of Dupuy’s death in June 1995, the Air Model Historical Data Study had reached a point where a major decision was needed. The early months of the study had been devoted to developing a consensus among the TDI team members with respect to the factors that needed to be included in the model. The discussions tended to highlight three areas of particular interest—factors that had been included in models currently in use, the limitations of these models, and the need for new factors (and relationships) peculiar to the properties and dynamics of the air campaign. Team members formulated a family of relationships and factors, but the model architecture itself was not investigated beyond the surface considerations.

Despite substantial contributions from team members, including analytical demonstrations of selected factors and air combat relationships, no consensus had been achieved. On the contrary, there was a growing sense of need to abandon traditional modeling approaches in favor of a new application of the “Dupuy Method” based on a solid body of air combat data from WWII.

The Dupuy approach to modeling land combat relied heavily on the ratio of force strengths (largely determined by firepower as modified by other factors). After almost a year of investigations by the AMHDS team, it was beginning to appear that air combat differed in a fundamental way from ground combat. The essence of the difference is that in air combat, the outcome of the maneuver battle for platform position must be determined before the firepower relationships may be brought to bear on the battle outcome.

At the time of Dupuy’s death, it was apparent that if the study contract was to yield a meaningful product, an immediate choice of analysis thrust was required. Shortly before Dupuy’s death, other members of the TDI team and I had recommended that we adopt the overall approach, level of aggregation, and analytical complexity that had characterized Dupuy’s models of land combat. We had also agreed on the time-sequenced predominance of the maneuver phase of air combat. When I was asked to take the analytical lead for the contract in Dupuy’s absence, I was reasonably confident that there was overall agreement.

In view of the time available to prepare a deliverable product, it was decided to prepare a model using the air combat data we had been evaluating up to that point—June 1995. Fortunately, Robert Shaw had developed a set of preliminary analysis relationships that could be used in an initial assessment of the maneuver/firepower relationship. In view of the analytical, logistic, contractual, and time factors discussed, we decided to complete the contract effort based on the following analytical thrust:

  1. The contract deliverable would be based on the maneuver/firepower analysis approach as currently formulated in Robert Shaw’s performance equations;
  2. A spreadsheet formulation of outcomes for selected (Battle of Britain) engagements would be presented to the customer in August 1995;
  3. To the extent practical, a working model would be provided to the customer with suggestions for further development.

During the following six weeks, the demonstration model was constructed. The model (programmed for a Lotus 1-2-3 style spreadsheet formulation) was developed, mechanized, and demonstrated to ACSC in August 1995. The final report was delivered in September 1995.

The working model demonstrated to ACSC in August 1995 suggests the following observations:

  • A substantial contribution to the understanding of air combat modeling has been achieved.
  • While relationships developed in the Dupuy Air Combat Model (DACM) are not fully mature, they are analytically significant.
  • The approach embodied in DACM derives its authenticity from the famous “Dupuy Method,” thus ensuring its strong correlations with actual combat data.
  • Although demonstrated only for air combat in the Battle of Britain, the methodology is fully capable of incorporating modern technology contributions to sensor, command and control, and firepower performance.
  • The knowledge base, fundamental performance relationships, and methodology contributions embodied in DACM are worthy of further exploration. They await only the expression of interest and a relatively modest investment to extend the analysis methodology into modern air combat and the engagements anticipated for the 21st Century.

One final observation seems appropriate. The DACM demonstration provided to ACSC in August 1995 should not be dismissed as a perhaps interesting, but largely simplistic, approach to air combat modeling. It is a significant contribution to the understanding of the air combat relationships that will prevail in the 21st Century. The Dupuy Institute is convinced that further development of DACM makes eminent good sense. An exploitation of the maneuver and firepower relationships already demonstrated in DACM will provide a valid basis for modeling air combat with modern technology sensors, control mechanisms, and weapons. It is appropriate to include the Dupuy name in the title of this latest in a series of distinguished combat models. Trevor would be pleased.

Slay the big data ‘Swamp Thing’ with these governance protips

Now that many companies find themselves with expansive data lakes in this era of big data, what should they do to keep these information reservoirs from coagulating into sticky swamps? Scratch that —...

This Data Startup Uses Artificial Intelligence To Figure Out If Your Roof Is In Decent Shape

When you first buy a house, your insurance company doesn’t know very much about it or how much insuring it will cost. That’s because it first has to send out an inspector to look at the...

Imperatives of digital transformation for Knowledge Management

Traditional Knowledge Management (KM) models focus on codifying knowledge nuggets derived from processes and projects and encourage stakeholders to share tacit and explicit knowledge, thus building...

3 Ways How Businesses Can Use Data from Smart Homes

Smart homes are getting popular. People are investing heavily in smart homes to save money and to gain control over their home and all the devices in it. Statistics by Ooma show that there are between 6.4 billion and 13 billion smart home devices in use, and smart home adoption in the US is projected to reach 28% by 2021.



These smart homes are powered by the Internet of Things (IoT), and according to Gartner, it is a $235 billion industry, and on average, 5.5 million new devices are connected each day.

Understanding a smart home and its data

A smart home uses several IoT devices. All the devices are connected wirelessly to your home’s network and can communicate with each other to provide consumers with a seamless user experience.

These devices have sensors and software that let them communicate with the network and with other devices. Data collection is at the heart of smart homes, because that is how these devices work: they constantly collect data from several sources and use it to give users a better experience.

According to Daniel Knight, the technical director of Fibaro UK:

“All the data we leave behind us is owned by those who collect it. ...


Data Marketplace: Returning Control to Personal Data’s Rightful Owners

Data leaks permeate discussions at the cultural level. One that’s currently making headlines is Cambridge Analytica’s harvesting of private information from the Facebook profiles of 50 million users. The firm extrapolated voter behaviour from the US electorate’s social media activities. Setting politics aside, regardless of who benefitted during the campaign, the most important point is that Cambridge Analytica did it without the permission of the users.

At a time when the likes of Facebook and Google hold the key to real people’s online activities, it is crucial not only to report data breaches but also to remind users of their ownership rights. It’s the same principle that underlies what also seems an unfair practice in the offline world: making people pay to get access to their own personal information, such as an individual’s medical history in the healthcare sector.

Amidst such issues, the business sector is the likely place to look for solutions. After all, it is in this sector that consumer behaviours are tracked, presenting a potential dilemma for society if left unchecked. It is also in this sector that innovations sparking accountability are expected to spring up.

Here ...


Harnessing data for improved customer service and smart urban planning

The EDH was used to combine network topology (GIS) data with terabytes of DSL performance (time series) and electrical line test data to grade the quality of every line in the network. This helped...

4 Things Every Leader Should Know Before Starting A Big Data Project

We were told that “data is the new oil.” The Internet of Things combined with the ability to store massive amounts of data and powerful new analytical techniques like machine learning...

Smart Farming Is the Future Thanks to the IoT

The world’s population stands at about 7.5 billion, and it’s expected to grow to around 9.8 billion by 2050. To feed that many people, we’ll need to produce 70 percent more food. One part of the solution to this growing demand could be smart farming powered by the Internet of Things.

The industry has quickly adopted this advanced technology. Between 2016 and 2022, the smart agriculture market is forecast to grow from $5.18 billion to $11.23 billion, a 13.27 percent compound annual growth rate. How are farmers using this smart technology? Here are a few examples.

Crop Monitoring

By using sensors powered by the Internet of Things, meaning they connect to the internet and each other through that connection, farmers can get precise information about various factors that impact their farming operations. Depending on the sensors used, they can gather data on crop health and maturity, soil condition, water and fertiliser use and more.

Farmers can place these sensors in fields, underground and on drones that fly over farmlands. Sensors placed in the soil can measure characteristics such as soil moisture, acidity and air permeability.

Optical sensors placed on tractors, drones or satellites can analyse soil condition as well. They can also provide information ...


Reconciling Blockchain Technology With Europe’s GDPR

Over twenty years of evolving data privacy legislation from across Europe has culminated in a restructuring of individual data privacy rights for the entire European Union. The sweeping new...

Why it is difficult to withdraw from (Syria, Iraq, Afghanistan….)

Leaving an unstable country in some regions is an invitation to further international problems. This was the case with Afghanistan in the 1990s, which resulted in Al-Qaeda being hosted there. This was the case with Somalia, which not only hosted elements of Al-Qaeda, but also spawned rampant piracy. This was the case with Iraq/Syria, which gave the Islamic State a huge opening and resulted in their seizing the second largest city in Iraq. It seems a bad idea to ignore these areas, even though there is a cost to not ignoring them.

The cost of not ignoring them is that one must maintain a presence of something like 2,000 to 20,000 or more support troops, Air Force personnel, trainers, advisors, special operations forces, etc. And they must be maintained for a while. It will certainly result in the loss of a few American lives, perhaps even dozens. It will certainly cost hundreds of millions to pay for deployment, conduct security operations, develop the local forces, and re-build and re-vitalize these areas. In fact, the bill usually ends up costing billions. Furthermore, these operations go on for a decade or two or more. The annual cost times 20 years gets considerable. We have never done any studies of “security operations” or “advisory missions.” The focus of our work was on insurgencies, but we have no doubt that these things tend to drag on a while before completion.
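To make the orders of magnitude concrete (the annual figure here is an illustrative assumption, not a study result): at, say, $500 million a year,

$500M/year × 20 years = $10B

which is how “hundreds of millions” annually quietly becomes billions over the life of one of these missions.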

The cost of ignoring these countries may be nothing. If there is no international terror threat and no direct threat to our interests, then there may not be a major cost to withdrawing. On the other hand, the cost of ignoring Somalia was a piracy campaign that started around 2005, in which pirates attacked at least 232 ships and captured over 3,500 seafarers. At least 62 of them died. The cost of ignoring Afghanistan in the 1990s? Well, was it 9-11? Would 9-11 have occurred anyway if Al-Qaeda was not free to reside, organize, recruit and train in Afghanistan? I don’t know for sure…but I think it was certainly an enabling factor.

I have never seen a study that analyzes/estimates the cost of these interventions (although some such studies may exist).  Conversely, I have never seen a study that analyzes/estimates the cost of not doing these interventions (and I kind of doubt that such a study exists).

It is hard to analyze the cost of the trade-off if we really don’t know the costs.

 

Blockchains aren’t just tech, they’re new economic systems

Forget for a moment about the value of the cryptocurrencies that you may or may not own. Instead of thinking of blockchains as investment bets or just cool technology, think of them as entirely new,...

How Your Marketing Team Can Use Big Data To Increase Productivity

Big data is one of the most pivotal aspects of marketing success today, but recent data breaches and a public row over privacy have left many marketing experts baffled when it comes to properly leveraging big data analytics without drawing the public’s ire. Is it even possible to rely on the power of big data to increase your marketing team’s productivity anymore?

The answer is a definitive yes, and marketing pros who follow the tips below will quickly come to see that harnessing the power of big data to optimise their day-to-day operations has never been easier. Here’s your blueprint for success in the 21st-century marketplace.

Understand the importance of privacy

It’s a shame that it even needs to be said, but the first thing marketers trying to harness the power of big data should understand is that privacy is a paramount concern, and should always be one of the chief blips on your radar. A failure to take the information you’ll be handling seriously could land you in hot water, much as Facebook is presently enduring a global outcry centred on its misuse of user data.

Assuming that your marketing team is savvy enough to understand ...


3 Ways How Blockchain Could Disrupt the Telecom Industry

The Telecom industry is known for reinventing itself. In the early days of telecom, fixed landlines were the key product of telcos; with the advent of the mobile phone this moved to mobile subscriptions and the massive cash cow of SMS. In recent years, however, the main revenue for many Telecom organisations is no longer calls but data, which required another change in their business model. With every organisation turning into a data organisation, there are significant opportunities for the Telecom industry to reinvent itself once again, thanks to Blockchain.

 

Since telephony is so pervasive, the telecom industry is huge and often complex, with over 2,000 mobile network operators, many different players, and billions of consumers and devices that interchangeably use a variety of networks. In addition, the Internet of Things will add tens of billions of devices to the global telecom networks, providing a huge opportunity for the Telecom industry. All these connections and billions of transactions have to be managed, and Blockchain can help. Blockchain’s core attributes of making data immutable, verifiable and traceable enable the Telecom industry to create an ecosystem where consumers are in full control, and there is trust, security and transparency in the participating ecosystem. The ...


Blockchain’s big potential in Africa

At the beginning of March, some of the brightest minds from the blockchain world joined representatives from the financial, legal, and global technology industries to discuss widespread blockchain...

Data Protection by Design, a future enabled by Privacy Enhancing Technologies

We have made it our mission to improve how computers and software systems work with data. Systems that process personal data have to change fundamentally and this is one of the things that Data...

How to Become a Data Scientist?

In this tutorial we will discuss the skills and steps for becoming a Data Scientist. More important, we will discuss whether “Scientist” should be part of the title or not, and if yes, why. We...

Blockchain is the key to fair distribution of wealth in the sharing economy

Early sharing economy enthusiasts had a clear vision for the peer-to-peer marketplace: a path towards sustainability, empowerment of individuals, and new job opportunities for the disadvantaged....

Syrian Disengagement

The United States has struggled with what to do in Syria. We never had good relations with the dictatorial Assad family. Their civil war started with civil protests on 15 March 2011 as part of the Arab Spring. The protests turned bloody, with over a thousand civilians dead (I have no idea how accurate this number is) and thousands arrested. It had turned into a full civil war by late July 2011. Our initial response was to remain disengaged.

It was only when Assad used chemical weapons against his own population, much as Saddam Hussein of Iraq had, that we finally considered intervening. President Obama announced a “red line” on 20 August 2012 against the use of chemical weapons. Assad’s forces violated this on 17 October 2012 in Salqin, on 23 December 2012 at Al-Bayadah, most notably on 19 March 2013 in Aleppo and in several other locations during March and April, on 29 April 2013 in Saraqib and in a couple more incidents in May, and on 21 August 2013 in Ghouta and in several other incidents in August. All attacks used the nerve agent Sarin. Instead of responding militarily, this turned into a coordinated international effort, conducted in conjunction with Russia, to eliminate all of Syria’s chemical weapons. This was not entirely successful, as repeated later incidents would demonstrate.

In my opinion, the United States should have intervened with considerable force in March 2013, if not before. This would have included a significant air campaign, extensive aid to the rebels, and a small number of advisors. It would have certainly entailed some American casualties. Perhaps the overall results would have been no better than Libya (which has also been in civil war since 2011). But at least in Libya we did get rid of Muammar Gaddafi in October 2011. Gaddafi had most likely organized a terrorist attack against the United States: the 1988 Lockerbie bombing of Pan Am Flight 103, which killed 270 people, including 190 Americans (and was most likely conducted in response to Reagan’s 1986 U.S. bombing of Libya).

Still, an intervention in Syria at that point may well have ended Assad’s regime and empowered a moderate Sunni Arab force that could control the government. It may also have forestalled the rise of ISIL. Or it may not have…it is hard to say. But what happened over the next eight years, with the rise of ISIL, their seizure of Mosul in Iraq, and the extended civil war, was probably close to a worst-case scenario. This was a case where an early intervention may have led to a more favorable result for us. I suspect that our intervention in Libya probably created a more favorable result than if we had not intervened.

The problem in Syria is that Assad represents a minority government of Shiite Arabs. They make up around 13% of the population (the largest group being Alawites). This minority lords over a population that is 69-74% Sunni (mostly Arabs, but including Kurds and Turcoman). In the end, given enough decades and enough violence, the majority will eventually rule. It is hard to imagine in this day and age that a minority can continue to rule forever, although Bashar Assad and his father have now ruled over Syria for almost 49 years. Part of what makes that possible is that around 10% of the population of Syria is Christian and 3% Druze. They tend to side with and support the Alawites, as a dominant, non-democratic Sunni rule would be extremely prejudiced against them. Needless to say, something like an Islamic State would be a nightmare scenario for them. So, for all practical purposes, Assad tends to have the support of at least a quarter of the population. With their central position, and armed by Russia, this makes them a significant force.

So, the question becomes: should the United States now disengage from Syria, now that the Islamic State is gone (though as many as 3,000 of their fighters remain)? Right now, we have at least 2,000 troops in and around Syria, with most of them outside of Syria (mostly based with our fellow NATO member Turkey). We have lost a total of two people since this affair started. We are allied with and supporting small moderate Sunni Arab groups and some Kurdish groups (which Turkey is opposed to and sometimes engages in combat with). Turkey is supporting some of its own moderate Sunni Arab groups. Also in Syria are the radical groups Al-Qaeda, Al-Nusrah and, of course, the Islamic State (whose leader is still at large). So, is it time to leave?

What are the possible outcomes if we leave?

  1. Assad will win the civil war and we will have “peace in our time” (written with irony).
    1. As the moderate Sunni groups are primarily based in Turkey they may not disappear anytime soon, especially if they are still being given support from Saudi Arabia and other Arab nations, even if the U.S. withdraws support.
    2. The Kurdish groups are still in Syria and probably not going away soon. They have some support from the Kurds in Iraq.
    3. Al-Qaeda and ISIL and other radical groups are probably not going away as long as Syria is ruled by the Alawites.
    4. There is a border with Iraq that facilitates flow of arms and men in both directions.
  2. The civil war will continue at a low level.
    1. A pretty likely scenario given the points above.
    2. Will this allow for the resurgence of radical Islamist groups?
  3. The civil war will continue at significant intensity for a while.
    1. Hard to say how long people can maintain a civil war, but the war in Lebanon went on for a while (over 15 years, from 1975 to 1990).
    2. This will certainly allow for the resurgence of radical Islamist groups.
  4. We will have a period of relative peace and then there will be a second civil war later.
    1. The conditions that led to the first revolt have not been corrected in any manner.
    2. Syria is still a minority ruled government.
    3. This could allow for the resurgences of radical Islamist groups.
  5. There is a political compromise and joint or shared rule.
    1. I don’t think this was ever on Assad’s agenda before, and it certainly will not be now.
  6. Assad is overthrown.
    1. This is extremely unlikely, but one cannot rule out an internal Alawite coup by a leadership with a significantly different view and approach.
    2. As it is, it does not look like he is going to be defeated militarily any time soon.

So, where does continued U.S. engagement or disengagement help or hinder in these scenarios?

A few related links:

  1. Map of situation in Syria (have no idea how accurate it is): https://www.aljazeera.com/indepth/interactive/2015/05/syria-country-divided-150529144229467.html
  2. Comments by Lindsey Graham on Syria: https://www.yahoo.com/news/republican-senator-graham-warns-against-syria-troop-withdrawal-165314872.html
  3. More Maps: http://www.newsweek.com/russia-says-syria-war-nearly-over-trump-claims-us-leave-very-soon-866770

 

 

How Machine Learning Uncovers the Best Customer Insights

Data is the new oil. Insights derived from massive amounts of information can go beyond planning, budgeting, and forecasting; they can define a brand’s competitive advantage. Big data processed using...

Top 5 Perplexing CMO Challenges That Can Be Solved With Data

Today’s CMO wears more hats than ever before, often having to play the role of technologist, data analyst, culture advocate, strategist, and the list goes on. They’re on the hook for not...

Response

A fellow analyst posted an extended comment to two of our threads:

C-WAM 3

and

Military History and Validation of Combat Models

Instead of responding in the comments section, I have decided to respond with another blog post.

As the person points out, most Army simulations exist to “enable students/staff to maintain and improve readiness…improve their staff skills, SOPs, reporting procedures, and planning….”

Yes, this is true, but I argue that this does not obviate the need for accurate simulations. Assuming no change in complexity, I cannot think of a single scenario where having a less accurate model is more desirable than having a more accurate model.

Now what is missing from many of these models that I have seen? Often a realistic unit breakpoint methodology, a proper comparison of force ratios, a proper set of casualty rates, addressing human factors, and many other matters. Many of these things are being done in these simulations already, but are being done incorrectly. Quite simply, they do not realistically portray a range of historical or real combat examples.

He then quotes the 1997-1998 Simulation Handbook of the National Simulations Training Center:

“The algorithms used in training simulations provide sufficient fidelity for training, not validation of war plans. This is due to the fact that important factors (leadership, morale, terrain, weather, level of training of units) and a myriad of human and environmental impacts are not modeled in sufficient detail….”

Let’s take their list, made around 20 years ago. In the last 20 years, what significant quantitative studies have been done on the impact of leadership on combat? Can anyone list them? Can anyone point to even one? The same goes for morale or level of training of units. The Army has TRADOC, the Army Staff, Leavenworth, the War College, CAA and other agencies, and I have not seen in the last twenty years a quantitative study done to address these issues. And what of terrain and weather? They have been around for a long time.

Army simulations have been around since the late 1950s. So by the time these shortfalls were noted in 1997-1998, 40 years had passed. By their own admission, they had not been adequately addressed in the previous 40 years. I gather they have not been adequately addressed in the last 20 years either. So the clock is ticking: 60 years of Army modeling and simulation, and no one has yet fully and properly addressed many of these issues. In many cases, they have not even gotten a good start in addressing them.

Anyhow, I have little interest in arguing these issues. My interest is in correcting them.

What chief data officers can learn from Facebook about building better big data security practices

The harvesting of millions of voter profiles by Cambridge Analytica in order to exploit personal fears and influence the outcome of the 2016 US presidential election has not made life easy for...

Who Needs A Data Model Anyway?

Will AI eliminate the need for data models? With data lakes offering to store raw data and promising schema-on-read access, data warehouses moving in-memory for vastly enhanced query performance, and...

How Industrial IoT is Influenced by Cognitive Anomaly Detection

There are about 6,000 sensors on an A350 aeroplane. The average Airbus flight generates 2.5 petabytes of data, and there are over 100,000 flights per day!

Industrial Internet of Things, or IIoT, is a massive market.

It includes aeroplane and car manufacturers, power plants, oil rigs, and assembly lines, all of which contain sensors measuring thousands of different attributes. But most IIoT companies let 80% of their data go unused. And this is a big challenge for businesses. 

But there are other challenges too, like latency issues that affect the results from real-time data, the failure to predict when parts will break down, and the expense of hiring data scientists.

A cognitive approach to anomaly detection, powered by Machine Learning and excellent data and analytics, is providing IIoT businesses with solutions and helping them to overcome the limitations of traditional statistical approaches. Machine Learning is becoming a commonplace tool for businesses, accelerating root cause analysis. Anomaly detection refers to the problem of finding patterns in data that don’t conform to expected behaviour. There are many different types of anomalies, and determining which are benign and which are harmful is challenging.
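For a sense of the traditional statistical baseline these cognitive approaches improve on, here is a minimal three-sigma outlier query in BigQuery-style SQL; the table and column names (sensor_readings, device, ts, value) are hypothetical stand-ins, not from the article:

SELECT device, ts, value
FROM (
  SELECT
    device,
    ts,
    value,
    -- per-device mean and standard deviation across the window
    AVG(value) OVER (PARTITION BY device) AS mean_value,
    STDDEV_SAMP(value) OVER (PARTITION BY device) AS sd_value
  FROM sensor_readings
)
-- flag readings more than three standard deviations from the device mean
WHERE ABS(value - mean_value) > 3 * sd_value

A fixed threshold like this treats every deviation as equally suspicious, which is exactly the limitation that context-aware, machine-learned detectors are meant to overcome.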

In Industrial IoT, one main objective is the automatic monitoring and detection of these abnormal events, or changes and shifts in the collected ...


4 Developments to Expect From Cloud Computing During the Coming Year

While businesses these days are finally becoming more aware of the benefits that cloud computing can offer, the truth is that the development of cloud computing is far from complete.

The technology still has plenty to offer businesses of all types and is still going through many changes that will continue to influence and revolutionise the way that companies operate and what they can accomplish in the cloud. 

While many of the changes that will come to cloud computing can't be predicted, there are a few things that are expected to happen in the coming year. 

Below are a few of the main things that can be expected from cloud computing during 2018 and beyond.

1. An Increase in Cloud Security

One of the main things to expect from cloud computing in the coming year is a major increase in focus on cloud security. 

Recent high-profile security breaches have businesses worried about the safety of their important data, and cyber-security threats are continuing to increase. Many businesses are sceptical about using cloud services due to worries about security and the safety of their data.

Over the coming year, cloud computing service providers will ensure that they tighten their security to the greatest extent ...


Assessing The Assessments Of The Military Balance In The China Seas

“If we maintain our faith in God, love of freedom, and superior global airpower, the future [of the US] looks good.” — U.S. Air Force General Curtis E. LeMay (Commander, Strategic Air Command, 1948-1957)

Curt LeMay was involved in the formation of the RAND Corporation after World War II. RAND created several models to measure the dynamics of the US-China military balance over time. Since 1996, this has been computed for two scenarios, differing by range from mainland China: one over Taiwan and the other over the Spratly Islands. The model results for selected years can be seen in the graphic below.

The capabilities listed in the RAND study are interesting, notably that in the air superiority category rough parity exists as of 2017. Also, the ability to attack air bases has given an advantage to the Chinese forces.

Investigating the methodology used does not yield any precise quantitative modeling examples, as would be expected in a rigorous academic effort, although there is some mention of statistics, simulation and historical examples.

The analysis presented here necessarily simplifies a great number of conflict characteristics. The emphasis throughout is on developing and assessing metrics in each area that provide a sense of the level of difficulty faced by each side in achieving its objectives. Apart from practical limitations, selectivity is driven largely by the desire to make the work transparent and replicable. Moreover, given the complexities and uncertainties in modern warfare, one could make the case that it is better to capture a handful of important dynamics than to present the illusion of comprehensiveness and precision. All that said, the analysis is grounded in recognized conclusions from a variety of historical sources on modern warfare, from the air war over Korea and Vietnam to the naval conflict in the Falklands and SAM hunting in Kosovo and Iraq. [Emphasis added].

We coded most of the scorecards (nine out of ten) using a five-color stoplight scheme to denote major or minor U.S. advantage, a competitive situation, or major or minor Chinese advantage. Advantage, in this case, means that one side is able to achieve its primary objectives in an operationally relevant time frame while the other side would have trouble in doing so. [Footnote] For example, even if the U.S. military could clear the skies of Chinese escort fighters with minimal friendly losses, the air superiority scorecard could be coded as “Chinese advantage” if the United States cannot prevail while the invasion hangs in the balance. If U.S. forces cannot move on to focus on destroying attacking strike and bomber aircraft, they cannot contribute to the larger mission of protecting Taiwan.

“All of the dynamic modeling methodology (which involved a mix of statistical analysis, Monte Carlo simulation, and modified Lanchester equations) is publicly available and widely used by specialists at U.S. and foreign civilian and military universities.” [Emphasis added].

As TDI has contended before, the problem with using Lanchester’s equations is that, despite numerous efforts, no one has been able to demonstrate that they accurately represent real-world combat. So, even with statistics and simulation, how good are the results if they have relied on factors or force ratios with no relation to actual combat?
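For reference, the classic aimed-fire (“square law”) form of Lanchester’s equations, from which such modified versions descend, couples the strengths of two forces B and R through effectiveness coefficients b and r:

dB/dt = −r·R
dR/dt = −b·B

This yields the square-law condition that the fight is even when b·B² = r·R²: doubling your numbers quadruples your effective strength. The validation question is precisely whether historical engagement data actually behave this way.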

What about new capabilities?

As previously posted, the Kratos Mako Unmanned Combat Aerial Vehicle (UCAV), marketed as the “unmanned wingman,” has recently been cleared for export by the U.S. State Department. This vehicle is specifically oriented towards air-to-air combat and is stated to have unparalleled maneuverability, as it need not abide by the limits imposed by human physiology. The Mako “offers fighter-like performance and is designed to function as a wingman to manned aircraft, as a force multiplier in contested airspace, or to be deployed independently or in groups of UASs. It is capable of carrying both weapons and sensor systems.” In addition, the Mako can be launched independently of a runway, as illustrated below. The price for these vehicles is $3 million each, dropping to $2 million each for an order of at least 100 units. Assuming a cost of $95 million for an F-35A, we can imagine a hypothetical combat scenario pitting two F-35As against 100 of these Mako UCAVs in a drone swarm, at roughly equal cost; a great example of the famous phrase that quantity has a quality all its own.
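The back-of-the-envelope arithmetic behind that pairing:

2 F-35As × $95M = $190M ≈ 100 Makos × $2M = $200M

So a swarm of 100 UCAVs costs about the same as the two fighters it would face.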

A battery of Kratos aerial target drones ready for takeoff. One of the advantages of the low-cost Kratos drones is their ability to get into the air quickly. [Kratos Defense]

How to evaluate the effects of these possible UCAV drone swarms?

In building up towards the analysis of all of these capabilities in a full theater, campaign-level conflict, some supplemental wargaming may be useful. One game that takes a good shot at modeling these dynamics is Asian Fleet. This is part of the venerable Fleet Series, published by Victory Games and designed by Joseph Balkoski to model modern (that is, Cold War) naval combat. This game system has been extended in recent years, originally by Command Magazine Japan, and then later by Technical Term Gaming Company.

Screenshot of Asian Fleet module by Bryan Taylor [vassalengine.org]

More to follow on how this game transpires!

From the IoT Evolution to the Revolution of the Cognitive Age

IoT Evolution or IoT Revolution

During all these years evangelising on the Internet of Things (IoT), I have been explaining to customers, partners and friends that IoT can positively change the way we do business and the way we live our lives. I have been asked if IoT is a new revolution in our society, or if it is just one more step in the technological evolution of the digital revolution. Today, the debate continues but whether evolution or revolution, the Internet of Things is here to stay.

If you have read AIG’s whitepaper entitled “Internet of Things: Evolution or Revolution?” you learned about IoT, from its origins to its applications in business, the risks associated with its inevitable arrival, and how the Internet of Things brings dramatic change. In the whitepaper, we discover that IoT is often presented as a revolution that is changing the face of society or industry in a profound manner. In fact, it is an evolution that has its origins in technologies and functionalities developed by visionary automation suppliers more than 15 years ago.

I definitely think it’s an evolution

The development of the Internet of Things is a bold move. IoT is not just a leap from the Internet. The Internet of Things brings ...


Artificial intelligence in oncology

There is no denying the presence of computers in our everyday life, whether it’s through phones, personal virtual assistants such as Apple’s Siri and Amazon’s Alexa, or video games. Lately, the...

Blockchain will make AI smarter by feeding it better data

We’ve heard — probably too much — about how cryptocurrencies like Bitcoin shift financial power away from governments and big banks to individuals. Blockchain technology is also democratizing...

IoT: Pushing manufacturers to the edge

Tego CEO Tim Butler speaks about the company’s focus on the ‘T’ in IoT (Internet of Things) in order to make mobile assets smart, enabling embedded intelligence at source, ready to analyse and...

Two Overlooked Factors Cryptocurrency Newbies Need to Pay Attention To

Words and terminology are sometimes created to better understand a particular world. Cryptocurrency created a whole new financial world and with it came a number of terms like 'market cap.'

The important thing is not to let the mystery of these words intimidate investors, especially newcomers. The market cap refers to the amount of fiat money supporting a cryptocurrency. Remember that fiat money comes in many forms, like GBP or USD, to name a few worth knowing.

It is easy to get fixated on the price of a single coin, especially as a newcomer. The following are two factors that are sometimes overlooked by investors but are pretty important.

Paying Attention to the Market Cap

Too often, a person will attempt to convince investors that a coin can reach a specified amount without taking into account how vital the market cap can be.

Say a coin is worth $2,000 and its coin supply stands at 16.4 million. Then its supply is spiked up to 164 million. Assuming the market cap stays the same, the price of the coin is going to drop to $200, which would be a real shame.
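The arithmetic, from the definition of market cap:

market cap = price × circulating supply = $2,000 × 16.4M coins = $32.8B
new price = $32.8B ÷ 164M coins = $200

A tenfold increase in supply at a fixed market cap means a tenfold drop in price.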

Investors are sometimes enticed by the price of the coin and its potential for growth without detailing the cryptocurrency market ...


Timeline Charts, Derived Tables and Analytic Functions in Looker 5

One of the data streams that comes into my personal Google BigQuery data warehouse is a record of all the Hue light switch settings and illuminance levels, Nest thermostat settings and motion sensor readings coming in via the Samsung Smart Things hub under the stairs back home. For the Hue lightbulbs I get a stream of 1s and 0s that initially get stored in a four-column table in BigQuery.

I then create a SQL view over that table that joins to a lookup table containing device types and the room allocated for each device in the data stream, and sets a metric type for each reading so that I can analyze all the thermostat readings together, bring together all the light switch settings, and so on.
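Roughly, a minimal sketch of that view; the raw table (hue_events), its device_id and raw_value columns, and the device_lookup table are hypothetical stand-ins, since the post doesn’t show the underlying schemas:

SELECT
  e.date_time,
  d.device,
  d.room,
  -- assign a metric type per device type so thermostat readings,
  -- light switch settings and so on can each be analyzed together
  CASE d.device_type
    WHEN 'thermostat' THEN 'temperature'
    WHEN 'hue light'  THEN 'switch'
    ELSE d.device_type
  END AS metric,
  SAFE_CAST(e.raw_value AS FLOAT64) AS value,  -- numeric reading where one exists
  e.raw_value
FROM `aerial-vehicle-148023.personal_metrics.hue_events` e
JOIN `aerial-vehicle-148023.personal_metrics.device_lookup` d
  ON e.device_id = d.device_id

The output columns (date_time, device, metric, value, raw_value) match the ones the later queries select from smartthings_readings.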

It’s then quite straightforward to bring this SQL view into Looker and add it to the rest of the data streams I’ve brought together there, so I can see, for example, whether the kitchen lights get turned off after everyone’s gone upstairs after breakfast, or whether I should try and automate those as well as my kettle.

Looker 5 introduced another way to visualize data with defined start and end events in the form of the new Timeline Chart, a feature I tweeted about the other day when I brought data on tasks from Asana into my BigQuery data warehouse via a Fivetran integration.

It’d be interesting to use this type of chart to show how one particular light switch changed from on to off and then on again over a period of time, or take all the Hue lights for a given part of the house and see whether anybody ever turns the lights off … or whether they need to be linked to the motion sensors and automatically turned off after a period of inactivity in each room.

To use this type of chart I first need to create an “end date” field for each of my event records, so each timeline has a start and end date, with other fields being optional for label and value. Looking at the list of events for the light switch I looked at earlier, ordering the events by time recorded, you can see the information I need is there:

SELECT
*
FROM
`aerial-vehicle-148023.personal_metrics.smartthings_readings`
WHERE
device = 'kitchen seating area spot 1'
AND raw_value IN ('off','on')
ORDER BY
date_time

What this calls for is the LEAD BigQuery Standard SQL analytic function that returns a value based on a subsequent row to the one we’re working with, and I can combine that with the TIMESTAMP_DIFF function to compute the time between the start and end event dates as that would be useful to know too. The SQL expressions for the two new fields would therefore be:

LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time) end_date_time,
timestamp_DIFF(LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time),date_time,second) as value_duration_seconds

I could add these column expressions into the SQL view definition in BigQuery and order the rows by device, metric and timestamp, like this:

SELECT
date_time as date_time,
device as device,
metric,
value,
raw_value,
LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time) as end_date_time,
timestamp_DIFF(LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time),date_time,second) as value_duration_seconds
from `aerial-vehicle-148023.personal_metrics.smartthings_readings`
where device = 'kitchen seating area spot 1'
and metric = 'switch'
and raw_value in ('on','off')
order by 2,3,1

Executing that SQL shows the logic is working, but I’d then have to maintain that view within BigQuery and that might not be the most convenient place to add new code.

Instead, I could just go into Looker and create a new view there based on a derived table SQL expression and do the work there. Right now my LookML model looks like the excerpt below, where you can see the sensor readings view joined into the rest of the explore so all my data can be analyzed together.

connection: "rittman_bigquery"
include: "*.view.lkml" # include all views in this project
include: "*.dashboard.lookml" # include all dashboards in this project
explore: date_dim {
case_sensitive: no
label: "Data Warehouse"
join: fluentd_transactions {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${fluentd_transactions.date_minute5} ;;
relationship: many_to_many
}
join: smartthings_readings {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${smartthings_readings.date_minute5} ;;
relationship: one_to_many
}
join: fluentd_uber_rides {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${fluentd_uber_rides.date_minute5} ;;
relationship: many_to_many
}
}

I now create a new LookML view that uses a derived table SQL query as the view definition rather than simply referencing an existing BigQuery table. Note how I’ve used the same view_label as the main LookML view containing my event data, so that the dimensions and measure I define here appear alongside all the other smart device fields in the same explore view.

view: device_event_end_and_timespan {
view_label: "6 - Smart Devices"
derived_table: {
sql: SELECT
date_time as date_time,
device as device,
metric as metric,
LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time) as end_date_time,
timestamp_DIFF(LEAD(date_time, 1) OVER (PARTITION BY device, metric ORDER BY date_time),date_time,second) as value_duration_seconds
from `aerial-vehicle-148023.personal_metrics.smartthings_readings`
order by 2,3,1;;
}
dimension: date_time {
type: date_time
hidden: yes
sql: ${TABLE}.date_time ;;
}
dimension: device {
type: string
hidden: yes
sql: ${TABLE}.device ;;
}
dimension: metric {
type: string
hidden: yes
sql: ${TABLE}.metric ;;
}
dimension_group: end_date_time {
group_label: "End Date"
label: "End"
type: time
timeframes: [
raw,
time,
hour,
hour3,
hour4,
hour6,
hour12,
hour_of_day,
time_of_day,
minute,
minute5,
minute10,
minute15,
minute30,
day_of_week,
day_of_month,
day_of_year,
date,
week,
week_of_year,
month,
month_name,
month_num,
quarter,
year
]
sql: ${TABLE}.end_date_time ;;
}
dimension: pk {
primary_key: yes
hidden: yes
type: string
sql: concat(cast(${TABLE}.date_time as string), ${TABLE}.device) ;; # the derived table exposes date_time, not start_date_time
}
measure: value_duration_seconds {
type: average
sql: ${TABLE}.value_duration_seconds ;;
}
}

Then I join this derived table view back into the explore within my LookML model, as below:

connection: "rittman_bigquery"
include: "*.view.lkml" # include all views in this project
include: "*.dashboard.lookml" # include all dashboards in this project
explore: date_dim {
case_sensitive: no
label: "Data Warehouse"
join: fluentd_transactions {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${fluentd_transactions.date_minute5} ;;
relationship: many_to_many
}
join: smartthings_readings {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${smartthings_readings.date_minute5} ;;
relationship: one_to_many
}
join: device_event_end_and_timespan {
type: left_outer
sql_on: ${smartthings_readings.date_time} = ${device_event_end_and_timespan.date_time} and
${smartthings_readings.device} = ${device_event_end_and_timespan.device} and
${smartthings_readings.metric} = ${device_event_end_and_timespan.metric};;
relationship: one_to_one
}
join: fluentd_uber_rides {
type: left_outer
sql_on: ${date_dim.date_minute5} = ${fluentd_uber_rides.date_minute5} ;;
relationship: many_to_many
}
}

and now I can create timeline charts that show me which lights were on over a particular period of time, like this:

or more importantly, work out why the bathroom never seems to be free when you’ve got two teenage kids in the house.


Timeline Charts, Derived Tables and Analytic Functions in Looker 5 was originally published in Mark Rittman’s Personal Blog on Medium.

IoT platform market: growth, fragmentation, trends, forecasts, vendors

One of the many components of an IoT solution is the IoT platform (other components of the IoT stack are connectivity, devices, sensors and actuators, IoT gateways etc.). Although it’s only a few...

Five Surprising Reasons to Invest in Better Security Training

The conventional wisdom about security training needs an update — and for reasons that may surprise you. Cyberattacks are rising in frequency, severity and the damage they cause. Since the weakest...

C-WAM 3

Now, in the article by Michael Peck introducing C-WAM, there was a quote that got our attention:

“We tell everybody: Don’t focus on the various tactical outcomes,” Mahoney says. “We know they are wrong. They are just approximations. But they are good enough to say that at the operational level, ‘This is a good idea. This might work. That is a bad idea. Don’t do that.’”

Source: https://www.govtechworks.com/how-a-board-game-helps-dod-win-real-battles/#gs.ifXPm5M

I am sorry, but this line of argument has always bothered me.

While I understand that no model is perfect, perfection is the goal that modelers should always strive for. If the model is a poor representation of combat, or parts of combat, then what are you teaching the user? If the user is professional military, then is this negative training? Are you teaching them an incorrect understanding of combat? Will that understanding only be corrected after real combat and the loss of American lives? This is not being melodramatic…you fight as you train.

We have seen the argument made elsewhere that some models are only being used for training, so…….

I would like to again bring your attention to the “base of sand” problem:

Wargaming Multi-Domain Battle: The Base Of Sand Problem

As always, making the models more accurate seems to take lower precedence than whatever else is going on. Model validation tends never to be done. JICM has never been validated. COSAGE and ATCAL as used in JICM have never been validated. I don’t think C-WAM has ever been validated.

Just to be annoyingly preachy, I would like to again bring your attention to the issue of validation:

Military History and Validation of Combat Models


Qualitative Visualization: Chart choosing and the design process

Qualitative Visualization: Chart choosing and the design process

In order for data to be used for learning and adapting, the data itself needs to be easily accessible. Evaluators and researchers have been hungry for resources on how to effectively present...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Questions to ask cloud services providers about security

5 Questions to ask cloud services providers about security

You’ve finally decided to move your storage and apps to the cloud. Great, but it is not just a question of uploading files to a new server. What do you know about your cloud provider? Is their...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Blockchain use cases where IoT and distributed ledger technology meet

Blockchain use cases where IoT and distributed ledger technology meet

As mentioned on our blockchain for businesses page, there are several blockchain use cases, as among others found in blockchain research and forecasts on blockchain spending, that are moving faster...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
IBM Watson with blockchain boost adds visibility to supply chain disruptions

IBM Watson with blockchain boost adds visibility to supply chain disruptions

IBM Corp. has made big bets on Watson, its artificial intelligence platform, and the revolutionary ledger system blockchain. So, it only makes sense that the company would ultimately combine the two...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
These 6 Industries Are Using Blockchain to Gain a Competitive Advantage

These 6 Industries Are Using Blockchain to Gain a Competitive Advantage

Real estate. Law firms. Information services. Banking: Where else will this technology raise its profile? While people debate the benefits and pitfalls of cryptocurrency, the blockchain technology...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Entrepreneurs Need to Know About Facial Recognition Technology

What Entrepreneurs Need to Know About Facial Recognition Technology

As the Fourth Industrial Revolution unfolds with billions of people sharing a wide and deep array of data — texts, tweets, GPS coordinates, all manner of photos, videos, environmental data,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Is it smart to have artificial intelligence?

Is it smart to have artificial intelligence?

This is unsettling. The help is getting surly. We were in Brooklyn heading to a favorite Mexican dive when a pal, Demetri, asked about a movie we’d seen recently. My wife, Wink, happened to be using...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
AI-powered ambient computing is just getting started

AI-powered ambient computing is just getting started

It’s hard to believe that not even 75 years have passed since ENIAC, that room-sized, 30-ton calculating machine, was the number-crunching hero of World War II. Since then, technology has evolved and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Artificial Intelligence Cannot Survive Without Big Data

Why Artificial Intelligence Cannot Survive Without Big Data

It may come as no surprise that the internet has been swelling up with an increasing amount of data, so much so that it’s become difficult to keep track of. If in 2005 we were barely dealing with 0.1...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The 4 Laws of Digital Transformation

The 4 Laws of Digital Transformation

My discussions with organizations looking to “digitally transform” themselves are yielding some interesting observations. I expect that when these discussions move into the execution phase, we will...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machine Learning and Its Algorithms to Know

Machine Learning and Its Algorithms to Know

Algorithms in Machine Learning – MLAlgos. Machine Learning at a Glance: Machine learning is a subset of Artificial Intelligence that borrows principles from computer science. It is not AI itself, though;...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Here’s What the Media Should Really be Focusing on With Blockchain Technology

Here’s What the Media Should Really be Focusing on With Blockchain Technology

It’s not an exaggeration to say that, from a technological framework perspective, within 30 years blockchain technology is going to form the foundation of practically every global sector. Bitcoin...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
3 Essentials for Using Big Data for Financial Forecasting

3 Essentials for Using Big Data for Financial Forecasting

At this point, big data’s impact on the way companies operate is no longer up for debate. Proving itself to be a key player in everything from marketing to retail, big data’s prevalence in business grows every day. Information about exactly how populations spend their time and money is more available than ever before, and it’s up to individual companies to decide how they want to use that vast knowledge.

In 2018, big data stands to make the leap from vague, intangible concept to accessible resource. As new tools store, organize and decipher big data, it becomes increasingly usable, and the way businesses conduct financial forecasting could be significantly affected. If you’re considering using big data in your own financial forecasting, make sure you have these three things in place to get the most out of your data.

1. The Right Technology

Trying to harness the power of big data without the proper systems in place will leave you lacking a clear direction. While that might seem obvious, deciding on the tool that will work best for your needs isn’t always easy. Fortunately, many software companies know of big data’s potential and are creating dashboards for companies to track analytics more efficiently. ...


Read More on Datafloq
6 Technology Trends Shaking Up the Business World

6 Technology Trends Shaking Up the Business World

Technology will always be changing and adapting to the needs of the world we live in. For anybody doing business today, technology is definitely something that should be kept in mind. These days, staying current with tech trends and finding practical ways to apply innovations, concepts, and processes to daily operations is no longer optional for most business owners. But why? For starters, technology is changing at a rapid pace, so much so that many of the innovations that drive business today didn't even exist when the millennium was new. Take a moment to consider tech trends that are currently shaking up the business world.

1. Remote Storage

Referring to remote data storage in a centralised information database, cloud computing can do more than free up valuable hard drive space. Because cloud data is accessible via the Internet, employees can access info from any PC or device from multiple locations. As a result, internal communications can be more efficient. It's also a useful technology for test and development environments, data backup that can minimise or eliminate downtime in the event of an emergency, and software-as-a-service companies looking for an easier way to make their products accessible.

2. Big Data Collection and Processing

The increase ...


Read More on Datafloq
Blockchain is on a collision course with EU privacy law

Blockchain is on a collision course with EU privacy law

Those who have heard of “blockchain” technology generally know it as the underpinning of the Bitcoin virtual currency, but there are myriad organizations planning different kinds of applications for...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
3 steps to getting started with supply chain AI

3 steps to getting started with supply chain AI

The modern global supply chain is defined by scale — billions of transactions and terabytes of data across multiple systems, with businesses generating more every moment. Traditional supply chain...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
TDI Friday Read: Links You May Have Missed, 30 March 2018

TDI Friday Read: Links You May Have Missed, 30 March 2018

This week’s list of links is an odds-and-ends assortment.

David Vergun has an interview on the Army website with General Stephen J. Townsend, commander of the U.S. Army Training and Doctrine Command (TRADOC), about the need for smaller, lighter, and faster equipment for future warfare.

Defense News’s apparently inexhaustible Jen Judson details the Army’s newest forthcoming organization, “US Army’s Futures Command sets groundwork for battlefield transformation.”

At West Point’s Modern War Institute, Lionel Beehner, Liam Collins, Steve Ferenzi, Robert Person and Aaron Brantly have a very interesting analysis of the contemporary Russian approach to warfare, “Analyzing the Russian Way of War: Evidence from the 2008 Conflict with Georgia.”

Also at the Modern War Institute, Ethan Olberding examines ways to improve the planning skills of the U.S. Army’s junior leaders, “You Can Lead, But Can You Plan? Time to Change the Way We Develop Junior Leaders.”

Kyle Mizokami at Popular Mechanics takes a look at the state of the art in drone defenses, “Watch Microwave and Laser Weapons Knock Drones Out of the Sky.”

Jared Keller at Task & Purpose looks into the Army’s interest in upgunning its medium-weight armored vehicles, “The Army Is Eyeing This Beastly 40mm Cannon For Its Ground Combat Vehicles.”

And finally, MeritTalk, a site focused on U.S. government information technology, has posted a piece, “Pentagon Wants An Early Warning System For Hybrid Warfare,” looking at the Defense Advanced Research Projects Agency’s (DARPA) ambitious Collection and Monitoring via Planning for Active Situational Scenarios (COMPASS) program, which will incorporate AI, game theory, modeling, and estimation technologies to attempt to decipher the often subtle signs that precede a full-scale attack.

AI-driven data could be the music industry’s best marketing instrument

AI-driven data could be the music industry’s best marketing instrument

The music industry is learning a new rhythm through the instrument of artificial intelligence. AI is revolutionizing insights and business strategies and fine-tuning the way we work, connect, learn,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The BBBT Sessions: Zoomdata and The New Generation of BI Solutions

The BBBT Sessions: Zoomdata and The New Generation of BI Solutions

As mentioned right at the end of my Look Back Into 2017 And Forward To 2018 post, I started this year looking forward to an exciting 2018 and, well, it seems my dream has come true.

Right in January, the BBBT group, which I’m proudly part of, hosted a briefing with Zoomdata, a visual analytics provider and one of what I like to call the new generation of analytics solutions.

As usual, this briefing was a great opportunity to learn what the company is and what it’s up to, in the present and for the future.

So, here is a brief report on what happened during this insightful encounter with Zoomdata and my fellow BBBT members.

About Zoomdata

The innovative company Zoomdata develops what it describes as:

“The world’s fastest visual analytics solution for big and streaming data. Using patented data sharpening and micro-query technologies, Zoomdata empowers business users to visually consume data in seconds, even across tens of billions of rows of data.”

According to the software provider, its offering enables interactive analytics across disparate data sources and helps bridge modern and legacy data architectures, enabling effective blending of real-time data streams with historical data from both on-premises and cloud sources.

Zoomdata’s offering uses a microservices architecture to provide elastic scalability and the ability to run on-premises, in the cloud, or embedded within a third-party application.

Another thing that makes Zoomdata such an appealing solution is its ability to sit right in the middle of the action (Figure 1) and, by doing so, to act as an ideal broker for the full analytics and business intelligence process, especially for analytics performed over big data sources.

Figure 1. Zoomdata in the center of the analytics process (Courtesy of Zoomdata)

Some features/advantages offered by Zoomdata include:
  • No data movement
  • Native support for widest breadth of modern data sources, including streaming and search
  • Modern elastically scalable microservices based architecture
    • Secure – in context of underlying data platform
    • Easy to embed and extend
  • Fastest time to insight – modern user experience, minimal IT intervention, no data modeling required
Zoomdata is venture-backed by Accel, Columbus Nova Technology Partners, Comcast Ventures, Goldman Sachs, NEA and Razor’s Edge.

The company currently has offices in Chicago, New York, San Mateo, CA and Reston, VA.

Zoomdata: New Generation BI

Presented by the current leadership team of Nick Halsey, president and CEO; Justin Langseth, founder and chairman; Ruhollah Farchtchi, Zoomdata’s CTO; and Ian Fyfe, senior director of product management, the session spent some time on the fundamental question of why Zoomdata needed to be built, gave an update on Zoomdata’s new features and, of course, provided the company’s current figures.

Founded in 2012, the company has grown steadily since, currently counting more than 50 customer accounts while achieving consistent 100% year-over-year sales growth and 200% growth internationally.

Additionally, Zoomdata has been able to engage in relevant partnerships with companies such as Infosys, Deloitte and Atos, aiming to further expand its presence across the big data and data management landscape.

Currently, the company has extended its presence internationally to places including Tokyo, Singapore and Sydney, and has more than 80 employees in its Reston and San Mateo offices.

OK, So Why Zoomdata?

Formed by an experienced team of executives from the business intelligence market, it is no surprise that Zoomdata originated in an effort to develop a solution that addresses the gaps left by traditional BI tools.

While most traditional BI solutions are fundamentally designed to work on structured transactional datasets, with the advent of the Big Data revolution a growing number of companies needed to deal with unstructured or multi-structured data sets as well, and often to address them all within a single solution rather than with a combination of analytics tools.

The Zoomdata team realized early on that while structured data sets are not going away, companies need to incorporate new data sets such as interactions (click-streams, social) and observations (IoT and sensors) into the analytics mix (Figure 2).

Figure 2. Zoomdata in the Evolution of the Big Data Landscape (Courtesy of Zoomdata)

Realizing that “legacy” BI systems were not designed to take on these emerging types of data sets, Zoomdata took on the challenge of providing a solution able to process both traditional structured and new big data sources within a single environment, and to provide a consistent analytics framework for all of them.

According to Zoomdata, this put them in a position to better address the challenges that other BI systems face, such as insufficient support for streaming and/or unstructured data and scalability limitations, among others.

Throughout the session, Zoomdata pinpointed, both during the presentation and the demo, the main features and benefits it offers to provide its users with a reliable new generation analytics solution.

Rather than changing companies’ existing data frameworks, Zoomdata aims to ensure organizations can deal as neatly as possible with the complete stack of “old” and “new” data sources using a single analytics solution that bonds them.

One noteworthy aspect of Zoomdata is its holistic approach to data: while providing strong support for Hadoop sources, the company focuses on being a connecting hub for a plethora of other data sources through native connectivity, and on providing access to real-time data through a native streaming architecture.

About the Session

The Zoomdata team gave us a nice company and product presentation, as well as a good opportunity to have interesting discussions ranging from the most effective role of Zoomdata within a company’s data and analytics strategy, to the role and importance of real-time data within a big data initiative.

The demo session also gave us an opportunity to check some of the most important functionality claimed by Zoomdata, including:


  • User experience and empowerment. From what we witnessed in the demo, Zoomdata’s user interface looks friendly and easy to operate, with a nice look and feel.
  • Wide connectivity to data sources. Zoomdata connects to both traditional structured and modern data sources via a set of native connectors for Hadoop and NoSQL, as well as directly to streaming and cloud data sources, avoiding data movement (Figure 3).
  • Embedding capabilities. The team at Zoomdata stressed the solution’s rich set of APIs for easily embedding it into third-party applications, including an SDK for developing custom analytics extensions.
  • Real-time streaming data analysis capabilities. Here, Zoomdata emphasized the core capabilities it relies upon to connect to and work with streaming infrastructures and to effectively visualize real-time information, going beyond the traditional business intelligence approach of working only with historical data.

Figure 3. Zoomdata’s Screenshot (Courtesy of Zoomdata)

Other relevant aspects of Zoomdata’s capabilities include the fact that no data movement is needed, and a flexible micro-services architecture featuring micro-querying and a scalable in-memory caching configuration to increase processing speeds.

An interesting discussion erupted in the session regarding where Zoomdata fits best by design: within large data volumes with low processing complexity or, on the contrary, within smaller yet more complex data sets.

While the discussion did not reach a conclusive answer, my guess is that, due to its micro-service and embedded nature, Zoomdata naturally fits deployments with large and less complex data sets. Yet there is no reason, in my view, not to consider Zoomdata a good candidate for more complex deployments as well, acting as an ideal “intelligent connector”: especially with data infrastructures that are by nature decomposed and fragmented, Zoomdata can be the right interface to reconcile analysis coming from different sources. In some ways, this reminds me of an Enterprise Information Integration configuration with additional features.

So what?

Aside from providing a nice briefing and demo full of examples and case studies, the Zoomdata team gave us a wide view of the solution and where it might, or might not, fit. It seems Zoomdata plays well within scenarios where traditional and big data sources need to be placed jointly to work together.

Zoomdata is a solution to consider especially when big data initiatives are already acting together with traditional structured sources, and where customization and embedding within third-party systems play a relevant role in a project. This might mean Zoomdata is not necessarily designed with non-expert users in mind; certain data management expertise might be required to take full advantage of Zoomdata’s capabilities. Yet this learning effort might pay good dividends in the end.

Want to know more about Zoomdata or the BBBT session?

You can watch the video trailer below, visit Zoomdata’s web page, or leave a comment in the space below.

Three Ways Machine Learning Is Improving The Hiring Process

Three Ways Machine Learning Is Improving The Hiring Process

Technology’s advance into all industries and jobs tends to send ripples of worry with each evolution. It started with computers and continues with artificial intelligence, machine learning, IoT, big...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why the Internet of Things Plays an Essential Role in Digital Marketing

Why the Internet of Things Plays an Essential Role in Digital Marketing

It's no secret that technology is an essential part of our everyday lives. People simply enjoy the benefits that the online world and technology have to offer. That being said, modern businesses are aware of where their customers are, and they'll leverage any advantage they can to capture their customers’ interests and outrun their competitors. The Internet of Things (IoT) is proof of technology's ever-growing presence in people's lives, with over 6 billion connected devices across the world.

Simply put, IoT is a network of devices that can connect with each other over the internet and gather, as well as broadcast data via sensors or beacons. But what does that mean for online businesses and digital marketing? Even now, businesses and digital marketers are gathering data about online consumers and their behavior, preferences, needs, and demands, to win them over. With IoT being present in consumers' lives, access to more in-depth data about consumers is available to businesses. It's safe to say that the data provided from IoT devices will, and already is, reshaping digital marketing. Let's have a look.

The importance of big data

As mentioned before, online businesses and digital marketers are gathering data about their customers, by tracking various metrics and ...


Read More on Datafloq
The ethics of AI in the shadow of GDPR

The ethics of AI in the shadow of GDPR

Days later my ears are still ringing from the booming baritone of the public-address announcer in the keynote session on the first morning at the IBM Think 2018 conference in Las Vegas. I keep...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Saving lives with big data analytics that predict patient outcomes

Saving lives with big data analytics that predict patient outcomes

Cerner’s Enterprise Data Hub allows data to be brought together from an almost unlimited number of sources, and that data can be used to build a far more complete picture of any patient,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Internet of things is re-defining the way business is done: Here’s why it’s time for total transformation

Internet of things is re-defining the way business is done: Here’s why it’s time for total transformation

The internet has undeniably impacted the lives of nearly everyone across the globe; be it a modern-day millennial or a baby boomer, everyone today feels the need to be connected and part of a network...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
‘Love’s Tables’: U.S. War Department Casualty Estimation in World War II

‘Love’s Tables’: U.S. War Department Casualty Estimation in World War II

The same friend of TDI who asked about ‘Evett’s Rates,’ the British casualty estimation methodology during World War II, also mentioned that the work of Albert G. Love III was now available on-line. Rick Atkinson also referenced “Love’s Tables” in The Guns At Last Light.

In 1931, Lieutenant Colonel (later Brigadier General) Love, then a Medical Corps physician in the U.S. Army Medical Field Services School, published a study of American casualty data in the recent Great War, titled “War Casualties.”[1] This study was likely the source for tables used for casualty estimation by the U.S. Army through 1944.[2]

Love, who had no advanced math or statistical training, undertook his study with the support of the Army Surgeon General, Merritte W. Ireland, and initial assistance from Dr. Lowell J. Reed, a professor of biostatistics at Johns Hopkins University. Love’s posting in the Surgeon General’s Office afforded him access to an array of casualty data collected from the records of the American Expeditionary Forces in France, as well as data from annual Surgeon General reports dating back to 1819, the official medical history of the U.S. Civil War, and U.S. general population statistics.

Love’s research was likely the basis for rate tables for calculating casualties that first appeared in the 1932 edition of the War Department’s Staff Officer’s Field Manual.[3]

Battle Casualties, including Killed, in Percent of Unit Strength, Staff Officer’s Field Manual (1932).

The 1932 Staff Officer’s Field Manual estimation methodology reflected Love’s sophisticated understanding of the factors influencing combat casualty rates. It showed that both the resistance and combat strength (and all of the factors that comprised it) of the enemy, as well as the equipment and state of training and discipline of the friendly troops had to be taken into consideration. The text accompanying the tables pointed out that loss rates in small units could be quite high and variable over time, and that larger formations took fewer casualties as a fraction of overall strength, and that their rates tended to become more constant over time. Casualties were not distributed evenly, but concentrated most heavily among the combat arms, and in the front-line infantry in particular. Attackers usually suffered higher loss rates than defenders. Other factors to be accounted for included the character of the terrain, the relative amount of artillery on each side, and the employment of gas.

The 1941 iteration of the Staff Officer’s Field Manual, now designated Field Manual (FM) 101-10[4], provided two methods for estimating battle casualties. It included the original 1932 Battle Casualties table, but the associated text no longer included the section outlining factors to be considered in calculating loss rates. This passage was moved to a note appended to a new table showing the distribution of casualties among the combat arms.

Rather confusingly, FM 101-10 (1941) presented a second table, Estimated Daily Losses in Campaign of Personnel, Dead and Evacuated, Per 1,000 of Actual Strength. It included rates for front line regiments and divisions, corps and army units, reserves, and attached cavalry. The rates were broken down by posture and tactical mission.

Estimated Daily Losses in Campaign of Personnel, Dead and Evacuated, Per 1,000 of Actual Strength, FM 101-10 (1941)

Neither the source for this table nor the method by which it was derived is known. No explanatory text accompanied it, but a footnote stated that “this table is intended primarily for use in school work and in field exercises.” The rates in it were weighted toward the upper range of the figures provided in the 1932 Battle Casualties table.

The October 1943 edition of FM 101-10 contained no significant changes from the 1941 version, except for the caveat that the 1932 Battle Casualties table “may or may not prove correct when applied to the present conflict.”

The October 1944 version of FM 101-10 incorporated data obtained from World War II experience.[5] While it also noted that the 1932 Battle Casualties table might not be applicable, the experiences of the U.S. II Corps in North Africa and one division in Italy were found to be in agreement with the table’s division and corps loss rates.

FM 101-10 (1944) included another new table, Estimate of Battle Losses for a Front-Line Division (in % of Actual Strength), meaning that it now provided three distinct methods for estimating battle casualties.

Estimate of Battle Losses for a Front-Line Division (in % of Actual Strength), FM 101-10 (1944)

Like the 1941 Estimated Daily Losses in Campaign table, the sources for this new table were not provided, and the text contained no guidance as to how or when it should be used. The rates it contained fell roughly within the span for daily rates for severe (6-8%) to maximum (12%) combat listed in the 1932 Battle Casualty table, but would produce vastly higher overall rates if applied consistently, much higher than the 1932 table’s 1% daily average.
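A rough compounding illustration (my own arithmetic, not the manual’s) makes the difference concrete: a constant daily loss rate $r$ sustained for $n$ days yields cumulative losses of

\[ 1 - (1 - r)^n \]

so ten days at even the lower “severe” rate of 6% per day implies roughly 1 − 0.94^10 ≈ 46% of strength lost, against about 1 − 0.99^10 ≈ 10% at the 1932 table’s 1% daily average.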

FM 101-10 (1944) included a table showing the distribution of losses by branch for the theater based on experience to that date, except for combat in the Philippine Islands. The new chart was used in conjunction with the 1944 Estimate of Battle Losses for a Front-Line Division table to determine daily casualty distribution.

Distribution of Battle Losses–Theater of Operations, FM 101-10 (1944)

The final World War II version of FM 101-10, issued in August 1945,[6] contained no new casualty rate tables, nor any revisions to the existing figures. It did finally effectively invalidate the 1932 Battle Casualties table by noting that “the following table has been developed from American experience in active operations and, of course, may not be applicable to a particular situation.” (original emphasis)

NOTES

[1] Albert G. Love, War Casualties, The Army Medical Bulletin, No. 24, (Carlisle Barracks, PA: 1931)

[2] This post is adapted from TDI, Casualty Estimation Methodologies Study, Interim Report (May 2005) (Altarum) (pp. 314-317).

[3] U.S. War Department, Staff Officer’s Field Manual, Part Two: Technical and Logistical Data (Government Printing Office, Washington, D.C., 1932)

[4] U.S. War Department, FM 101-10, Staff Officer’s Field Manual: Organization, Technical and Logistical Data (Washington, D.C., June 15, 1941)

[5] U.S. War Department, FM 101-10, Staff Officer’s Field Manual: Organization, Technical and Logistical Data (Washington, D.C., October 12, 1944)

[6] U.S. War Department, FM 101-10 Staff Officer’s Field Manual: Organization, Technical and Logistical Data (Washington, D.C., August 1, 1945)

How (And Why) To Get The Data That You Need

How (And Why) To Get The Data That You Need

One of the questions I hear often is, “Where can I get data?” I wish I heard it a lot more often. The question means different things to different people. Some are on a quest for information that...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Do we take data visualisation too seriously?

Do we take data visualisation too seriously?

It’s been a while since my last post – there’s a good reason for this. Well, a reason anyway. I have a 90% written post which I have been mulling over for a long time. Because I haven’t felt...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
An In Depth Look Into Blockchain Technology

An In Depth Look Into Blockchain Technology

Blockchain, a brainchild of the mysterious, pseudonymous Satoshi Nakamoto, is an indisputably ingenious innovation. The technology allows digital information to be distributed to users without...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Data Security: The Importance of End-to-End Quantum-Resistant Encryption

Data Security: The Importance of End-to-End Quantum-Resistant Encryption

In a world that increasingly revolves around data, security is key. Unfortunately, too often organisations do not take security seriously. Big tech giants like Facebook allow firms such as Cambridge Analytica to syphon away 50 million user profiles, while the average Internet of Things device is so easy to hack that a kid can do it, even if it is meant to be a highly-secure crypto wallet. In the years to come, data will only increase in importance and as such in value. With that will come increased attention by hackers to steal data or hack your products, services or servers. More than ever, data security is vital if we wish to benefit from it.

Data security comes in many flavours, which roughly speaking can be divided into three different streams:


Processes and organisational solutions;
Technical and hardware solutions;
Data and software solutions.


Let’s briefly discuss the first two, which should be obvious to all of us by now, and then take a deep dive into the third stream:

Processes and organisational solutions

Security processes and organisational solutions are very straightforward, or at least they should be. By now, every organisation should enforce hard-to-guess passwords, preventing users and employees from using passwords such as 123456 or qwertyui, which, unfortunately, ...


Read More on Datafloq
How Big Data Sector Changed the Banking Sector

How Big Data Sector Changed the Banking Sector

Unlike the early days when everything was done manually, the evolution of technology has changed how different sectors perform. One of the industries that has been impacted by this growth in technology is banking. In the early days, the sector handled communication manually, which greatly limited it. It was difficult for financial institutions to efficiently transfer data from one point to another, and hence holding an account in those days was ridiculously expensive.

As time went by, growth in technology reached the data sector and everything started to change. Banks could now use technology to gather and save clients’ data, a move that improved the efficiency with which the sector operates. As technology continues to advance, the banking sector is offering its services online, making it easy for anyone to access those services from anywhere. Explained below are some major changes we are experiencing in the banking sector.

Delivering Services in the Sector has improved

Data in the banking sector is so substantial that it is difficult to handle manually. Just think: what would happen if an account number had to be searched for manually in files? We could have unending ...


Read More on Datafloq
GDPR and Privacy Shield—Are They Compatible?

GDPR and Privacy Shield—Are They Compatible?

Moving and protecting sensitive data has faced a rocky road in the European Union. This is evidenced by the various schemes and regulations set up—and in some cases struck down—relating to rules and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
JSON. To ETL or to NoETL? The big data question.

JSON. To ETL or to NoETL? The big data question.

NoETL: the little brother of NoSQL. You have probably come across the term NoSQL. It was coined a few years back to describe a class of database systems that can scale across a large number of nodes...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Rogue Data in the Present, Internet Regulation in the Future

Rogue Data in the Present, Internet Regulation in the Future

“Data gone AWOL” — According to DeVry, it’s one of the top cybersecurity threats that could end up affecting your life. As it turns out, it may already have.

Recent revelations that the Trump-hired political marketing firm Cambridge Analytica illegally harvested user data from Facebook to influence the 2016 US elections have shocked the world, but the Washington Post reports that it might be worse than we realize.

While Cambridge Analytica wrongfully obtained data on 50 million unconsenting users in 2014 and 2015, Paul-Olivier Dehaye, a privacy expert and co-founder of PersonalData.IO, believes that the data has already spread to other groups, other databases, and across the dark web.

“It is the whole nature of this ecosystem,” Dehaye said to Washington Post reporters. “This data travels. And once it has spread, there is no way to get it back.”

For anybody who isn’t up on the headlines, here’s a quick (and comedic) summation of what’s happened so far, as broken down by Stephen Colbert.

All of this has led to comparisons to the Obama campaign’s use of social media, as well as increased calls for data, social network, and internet regulations.

Comparisons With Obama

Conservatives seem keen to point out a double-standard in this affair. Politifact points to ...


Read More on Datafloq
Drink It Up: Coca-Cola Is Using Blockchain to Improve Workers’ Rights

Drink It Up: Coca-Cola Is Using Blockchain to Improve Workers’ Rights

Even with the latest hiccup in cryptocurrency valuations, I don’t think there’s been a faster-appreciating asset on the planet. Since the beginning of 2017, the aggregate cryptocurrency...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Digital Identity is the key to the Blockchain Economy

Digital Identity is the key to the Blockchain Economy

This is Part 1/chapter 12 in The Blockchain Economy serialised book. For the index please go here. On Thursday I gave a talk about Digital Cooperatives at an event near Geneva in France that was...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Cloud + Streaming Analytics + Data Science = Five Big Data Trends Now

Cloud + Streaming Analytics + Data Science = Five Big Data Trends Now

How streaming analytics, the rise of data science and the growth of cloud could change the digital transformation path for enterprise in the coming months. This year will be when real-time big...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Strategies From Top Firms on How to Use Machine Learning

5 Strategies From Top Firms on How to Use Machine Learning

With machine learning making disruptive innovation easier than ever before, it’s up to entrepreneurs to show the big kids how it’s done. Machine learning is headed for a major growth...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Keys to Data Monetization

Keys to Data Monetization

Data analytics professionals have toiled for years in relative obscurity in the back office of their organizations. They’ve created data warehouses and data marts, delivered reports and dashboards,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
10 Steps to Detect Lateral Movement in a Data Breach

10 Steps to Detect Lateral Movement in a Data Breach

Many enterprises spend millions of dollars on solutions that promise to bolster their security. However, much less focus is placed on the ability to detect lateral movement during a breach. We’ve...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Top Benefits of IoT for Hospitals and Healthcare

Top Benefits of IoT for Hospitals and Healthcare

Undoubtedly, the Internet of Things technology has been significantly transforming the healthcare industry by revamping the way devices, apps, and users connect and interact with each other to deliver healthcare services. IoT is continuously introducing innovative tools and capabilities, such as IoT-enabled medical app development, that build up an integrated healthcare system with the vision of assuring better patient care at reduced costs.

Consequently, it presents an accumulation of opportunities that hospitals and wellness promoters can consider as they optimize resources with automated workflows. For example, many hospitals utilize IoT for controlling humidity and managing assets and temperature within operating areas. Moreover, IoT applications are offering enormous perks to health care providers and patients, considerably improving health care services.

The Impact of IoT on Healthcare Industry

Check Out Some of the Best IoT Applications That Are Impacting Healthcare Services:

Real-Time Remote Monitoring

IoT enables connecting multiple monitoring devices and thus monitoring patients in real time. Further, these connected devices can send out signals from home as well, thereby decreasing the time required for patient care in hospitals.

Blood Pressure Monitoring

A sensor-based intelligent system, such as a Bluetooth-enabled coagulation system, can be utilized to monitor the blood pressure levels of patients who suffer from hypertension. ...


Read More on Datafloq
C-WAM 2

C-WAM 2

Here are two C-WAM documents: their rule book and a CAA briefing, both from 2016:

C-WAM’s rule book: https://paxsims.files.wordpress.com/2016/10/c-wam-rules-version-7-29-jul-2016.docx

CAA briefing on C-WAM: https://paxsims.files.wordpress.com/2016/10/mors-wargame-cop-brief-20-apr-16.pptx

A few highlights (rule book):

  1. Grid size from 2 to 10 km, depending on terrain (section 2.2)
    1. Usually 5 km to a grid.
  2. There is an air-to-air combat table based upon force ratios (section 3.6.4).
  3. There is a naval combat table based upon force ratios (section 3.9.4).
  4. There are combat values of ground units (section 3.11.5.B)
  5. There is a ground combat table based upon force ratios (section 3.11.5.E)
  6. There is a “tactics degrade multiplier” which effectively divides one side’s combat power by up to 4 (section 3.11.5.P).
  7. These tables use different types of dice for probability generation (showing the influence of Gary Gygax on DOD M&S).

A few highlights (briefing):

  1. Executes in 24- or 72-hour time steps (slide 3)
  2. Brigade-level (slide 18)
  3. Breakpoint at 50% strength (can only defend), removed at 30% strength (slide 18 and also rule book, section 5.7.2).

Anyhow, interesting stuff, but still basically an old-style board game, like those from Avalon Hill or SPI.


An Inside Look at Today’s Smart Office: Tools and Gadgets Shaping the Modern Workforce

An Inside Look at Today’s Smart Office: Tools and Gadgets Shaping the Modern Workforce

A mere five to eight years ago, having a forward-thinking and tech-savvy workforce meant equipping everyone with a smartphone so all employees, from in-house accountants to off-site field workers, could communicate instantly across time zones and geographical barriers. Then, the adoption of Video Teleconferencing (VTC) became more prevalent to facilitate smarter, more effective meetings. As the cloud grew in popularity and scope, data migration took centre stage as companies around the world began the slow and meticulous process of weeding through years of data stored on legacy systems.

Now, the face of the modern office is rapidly changing. On the one hand, many of the technologies that were so prevalent a few years ago are still adopted in full swing today. Smart devices are still affixed to almost every hip, meetings are even more impressive as VTC technology advances, and Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP) platforms help departments leverage cloud computing capabilities.

Still, thanks to the emergence of some impressive tools and more advanced IT services, today’s office worker has access to more capability than ever before, and it appears we’re just scratching the surface. Here are three of the most promising trends that are helping to redefine ...


Read More on Datafloq
Saudi Missile Defense

Saudi Missile Defense

The Houthis in Yemen are lobbing missiles at Saudi Arabia. Saudi Arabia does have a missile defense system (I assume made in America). Apparently they are missing the incoming missiles: http://www.businessinsider.com/saudi-missile-defense-failed-video-2018-3

A few other points:

  1. One interceptor appears to have “pulled a u-turn” and exploded over Riyadh.
    1. This interceptor may have been the source of the Saudi casualties (one dead, two injured)
  2. This could be the largest barrage of missiles fired at Saudi Arabia by the Houthis yet.

I wonder what interceptor Saudi Arabia was using. I wonder if failure is common with most missile defense systems (the situation with North Korea comes to mind here).

Transparency, responsibility and accountability in the age of IoT

Transparency, responsibility and accountability in the age of IoT

The Internet of Things market resembles the wild west with its rapid, chaotic growth and lack of effective oversight or security. Gartner estimates there will be 26 billion IoT devices connecting to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The reasons U.S. blockchain adoption has stalled

The reasons U.S. blockchain adoption has stalled

Enthusiasm for blockchain technology in the financial services industry seems to be ebbing. JPMorgan Chase, which developed its own open-source distributed ledger, Quorum, was rumored on Thursday to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Ghent is using the semantic web to connect data and meaning

How Ghent is using the semantic web to connect data and meaning

Linked open data on a semantic website might seem distant from the daily operations of local government, but in Ghent we believe the benefits outweigh the possible pitfalls of using this pioneering...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Things to Think About When Considering Monetizing Data

5 Things to Think About When Considering Monetizing Data

Trey Stephens is the Director of Audience Monetization at Acxiom, where he specializes in connecting offline with online data and audiences, marketing technology and creating and growing strategic...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machine Learning can transform education

Machine Learning can transform education

Futurist Arthur C. Clarke wrote, “Any sufficiently advanced technology is indistinguishable from magic.” The magic of software (giving data and rules to get answers) is often confused with the magic...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Ways IoT will Help Banking in 2018 and Beyond

5 Ways IoT will Help Banking in 2018 and Beyond

Wi-Fi, sensors, mobile, and other digital technologies are changing how we interact with the world around us. They have introduced a new era of connectivity that’s dubbed the Internet of Things (IoT). This enhanced connectivity opens up a chance to tap into and make use of any data collected, thereby providing boundless opportunities to various industries. Due to its potential, IoT has gained a lot of traction in the business world, and different industries such as healthcare, retail, and banking are all trying to figure out how to leverage this potential to generate greater revenues from their activities.

As far as the banking industry is concerned, IoT is still in the planning stage, but it is expected to help banks considerably in 2018 and beyond. Some of the top innovations to be expected are:

Tailored marketing

These days, customers across all industries have begun to demand personalized solutions for their different needs, and this is also applicable to the banking industry. However, a bank can only tailor solutions for a client if it has the necessary information about the client’s buying behavior, current economic condition and individual needs. But, with IoT, it has become possible for banks to ...


Read More on Datafloq
How GDPR Drives Real-Time Analytics

How GDPR Drives Real-Time Analytics

New reforms under the General Data Protection Regulation (GDPR) started as an attempt to standardise data protection regulations in 2012. The European Union intends to make Europe “fit for the digital age.” It took four years to finalise the agreements and reach a roadmap on how the laws will be enforced.

The GDPR presents new opportunities as well as difficulties for businesses, digital companies, data collectors, and digital marketers. On the one hand, these regulations will make it more difficult for businesses and data mining firms to collect and analyse customer data for marketers, while on the other, they will present an opportunity for data collectors to innovate and enhance their techniques. This will lead to a better collection of more meaningful data, as customers will be directly involved.

Understanding GDPR

The GDPR will go into effect on May 25, 2018. It will apply to all organisations and businesses that process personal and marketing data from European residents.

There are six underlying principles of GDPR.


Organizations must ensure that the personal data of users is processed transparently, lawfully, and fairly.
Personal data of users must only be collected for explicitly specified and legitimate purposes.
Data collectors must only gather limited amounts of personal information that is adequate ...


Read More on Datafloq
The Current Hype Cycle in Artificial Intelligence

The Current Hype Cycle in Artificial Intelligence

Prologue

Every decade seems to have its technological buzzwords: we had personal computers in the 1980s; Internet and worldwide web in 1990s; smartphones and social media in 2000s; and Artificial Intelligence (AI) and Machine Learning in this decade. However, the field of AI is 67 years old, and this is the fourth in a series of five articles wherein:


The first article discusses the genesis of AI and the first hype cycle, during 1950-1982
The second article discusses a resurgence of AI and its achievements during 1983-2010
The third article discusses the domains in which AI systems are already rivalling humans
This article discusses the current hype cycle in Artificial Intelligence
The fifth article discusses what 2018-2035 may portend for brains, minds and machines


The Timeline



Introduction

Over the past decade, the field of artificial intelligence (AI) has seen striking developments. As surveyed in [141], there now exist over twenty domains in which AI programs are performing at least as well as (if not better than) humans. These advances have led to a massive burst of excitement in AI that is highly reminiscent of the one that took place during the 1956-1973 boom phase of the first AI hype cycle [56]. Investors are funding billions of ...


Read More on Datafloq
How-to Guide to Handling Missing Data in AI/ML Datasets

How-to Guide to Handling Missing Data in AI/ML Datasets

Artificial Intelligence and Machine Learning are noble pursuits that depend largely on the data they are fed. With this data, systems figure out the future path and learn to handle complex scenarios. The applications of Machine Learning and Artificial Intelligence make sense only when the supplied data is complete and rich.

But, in the real world, the data is not perfect, just like everything else. Fortunately, there are steps to fix the data when it is incomplete, incoherent, or unsuitable. Today, we discuss methods to treat missing data when comprehensive data is required for ML and AI applications.

Whether to ignore the missing values or to treat them effectively depends on several factors: the percentage of missing values in the dataset, the variables those values affect, and whether the missing values belong to a dependent or an independent variable.

The performance of your predictive analytics depends on the accuracy, integrity, and completeness of the data. Therefore, it becomes necessary to treat missing data when the need arises.

Treatment by Deletion

The simplest, though best avoided, method of dealing with missing data is to delete the record. This can be done either ...
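As a minimal sketch of what deletion, and its usual alternative, imputation, can look like in practice (in SQL for illustration; the training_data table and its columns are hypothetical):

-- Listwise deletion: keep only records with no missing values
-- in the feature columns the model depends on.
SELECT *
FROM training_data
WHERE age IS NOT NULL
  AND income IS NOT NULL;

-- Mean imputation as an alternative to deletion: fill each missing
-- value with the column average computed across the whole table.
SELECT
  COALESCE(age, AVG(age) OVER ()) AS age,
  COALESCE(income, AVG(income) OVER ()) AS income
FROM training_data;

Deletion is only safe when the discarded records are few and missing at random; otherwise it biases whatever the model learns, which is why imputation is often preferred.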


Read More on Datafloq
9 Prospects of Big Data Based iPhone Apps

9 Prospects of Big Data Based iPhone Apps

Among the most user-friendly and appealing technology innovations of the current century, big data and cloud-based iOS app development solutions stand out as robust, secure, scalable, and universally accepted. Big data represents the extremely complex and large sets of data that may be structured, semi-structured, or unstructured.

Useful information can be mined from this data by computationally analysing it to reveal human behaviour and interactions from the patterns, trends, and associations of data elements. Some elements of this big data are web traffic logs, customers’ transactional histories, software logs, production databases, online videos, social media interactions, and much more. In short, behaviour patterns, interests, desires, pain points and more are now stored in the treasure chest of the iPhone device.

The need for big data based iPhone apps

Big data can be used to augment human judgement in useful ways. It is growing larger with each passing moment as people are constantly using mobiles and their data is being stored constantly and is ever-increasing. As the data sets of big data are so large, traditional applications cannot collect and evaluate the data. So, iOS mobile applications that are developed based on big data bridge the gap between technology and consumer for ...


Read More on Datafloq
Domains in Which Artificial Intelligence is Rivalling Humans

Domains in Which Artificial Intelligence is Rivalling Humans

Prologue

Every decade seems to have its technological buzzwords: we had personal computers in the 1980s; Internet and worldwide web in 1990s; smartphones and social media in 2000s; and Artificial Intelligence (AI) and Machine Learning in this decade. However, the field of AI is 67 years old, and this is the third in a series of five articles wherein:


The first article discusses the genesis of AI and the first hype cycle, during 1950-1982
The second article discusses a resurgence of AI and its achievements during 1983-2010
This article discusses the domains in which AI systems are already rivalling humans
The fourth article discusses the current hype cycle in Artificial Intelligence
The fifth article discusses what 2018-2035 may portend for brains, minds and machines


The Timeline




Domains where AI Systems are Rivalling Humans

As mentioned in a previous article [56], the 1950-82 era saw a new field of Artificial Intelligence (AI) being born, a lot of pioneering research being done, massive hype being created but eventually fizzling out. The 1983-2004 era saw research and development in AI gradually picking up and leading to a few key accomplishments (e.g., Deep Blue beating Kasparov in Chess) and commercialized solutions (e.g., Cyberknife), but its pace really picked up ...


Read More on Datafloq
C-WAM 1

C-WAM 1

Linked here is an article about a wargame called C-WAM, the Center for Army Analysis (CAA) Wargaming Analysis Model: https://www.govtechworks.com/how-a-board-game-helps-dod-win-real-battles/#gs.ifXPm5M

A few points:

  1. It is an old-style board game.
  2. Results are fed into RAND’s JICM (Joint Integrated Contingency Model).
    1. Battle attrition is done using CAA’s COSAGE and ATCAL.
  3. Ground combat is brigade-level.

More to come.

Big data in the travel industry: How can travel companies do more to collect and use customer data?

Big data in the travel industry: How can travel companies do more to collect and use customer data?

People leave lengthy trails of data when they travel. Purchases are made online, itineraries are stored in digital calendars, and GPS co-ordinates are shared every step of the way. Data-based...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
