This Machine Learning System Thinks About Music Like You Do — NOVA Next

If you’ve ever let Spotify DJ your party and then found yourself asking, half an hour in, “Spotify, what are you thinking?”—well, it actually may be thinking a lot like you. Scientists reported in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How AI, Machine Learning & Data Visualization are Defining the Ultimate Customer Experience

The words artificial intelligence (AI), machine learning (ML) and data visualization are everywhere right now. Both AI and ML have gained an immense role in defining the business world and have...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Blockchain trends influencing the future of social media marketing

Whether you are a social media marketer offering your services to businesses or a business owner using social media to reach customers, it is imperative you understand that social media is about to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
AI-powered personalization starts with the data layer

In 2018, machine learning, with its game-changing capacity for automated predictions of visitor behavior and purchasing trends, dominates digital personalization. Machine learning’s flame is...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machine Learning’s ‘Amazing’ Ability to Predict Chaos

Half a century ago, the pioneers of chaos theory discovered that the “butterfly effect” makes long-term prediction impossible. Even the smallest perturbation to a complex system (like the weather,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Human Factors In Combat: Syrian Strike Edition

Missile fire lit up the Damascus sky last week as the U.S. and allies launched an attack on chemical weapons sites. [Hassan Ammar, AP/USA Today]

Even as pundits and wonks debate the political and strategic impact of the 14 April combined U.S., British, and French cruise missile strike on Assad regime chemical warfare targets in Syria, it has become clear that the effort was a notable tactical success.

Despite ample warning that the strike was coming, the Syrian regime’s Russian-made S-200 surface-to-air missile defense system failed to shoot down a single incoming missile. The U.S. Defense Department claimed that all 105 cruise missiles fired struck their targets. It also reported that the Syrians fired 40 interceptor missiles, but that nearly all were launched after the incoming cruise missiles had already struck their targets.

Although cruise missiles are difficult to track and engage even with fully modernized air defense systems, the dismal performance of the Syrian network was a surprise to many analysts, given the wary respect paid to it by U.S. military leaders in the recent past. Although the S-200 dates from the 1960s, many surmise that an erosion in the combat effectiveness of the personnel manning the system is the real culprit.

[A] lack of training, command and control and other human factors are probably responsible for the failure, analysts said.

“It’s not just about the physical capability of the air defense system,” said David Deptula, a retired three-star Air Force general. “It’s about the people who are operating the system.”

The Syrian regime has become dependent upon assistance from Russia and Iran to train, equip, and maintain its military forces. Russian forces in Syria have deployed the more sophisticated S-400 air defense system to protect their air and naval bases, which reportedly tracked but did not engage the cruise missile strike. The Assad regime is also believed to field the Russian-made Pantsir missile and air-defense artillery system, but it likely was not deployed near enough to the targeted facilities to help.

Despite the pervasive role technology plays in modern warfare, the human element remains the most important factor in determining combat effectiveness.

How to Design an Efficient Data Quality Management Strategy

In this age of digital revolution, organizations deal with large volumes of data and information emanating from various sources. To maintain a competitive advantage, gain insights and make informed...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Some reasons to think about big data as data commons

The role of data in the current digitally soaked society is so important that data has been repeatedly defined in several and different spheres as the new oil, a new factor of production, and the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
In the IoT world, general-purpose databases can’t cut it

We live in an age of instrumentation, where everything that can be measured is being measured so that it can be analyzed and acted upon, preferably in real time or near real time. This...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The U.S. Needs a New Paradigm for Data Governance

The U.S. Senate and House hearings last week on Facebook’s use of data and foreign interference in the U.S. election raised important challenges concerning data privacy, security, ethics,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Data Governance is Crucial for Big Data Environments

Emily Washington, Senior Vice President of Product Management at Infogix, writes about the importance of data governance. The most significant obstacle preventing organizations from realizing the full...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Meet SemSpect: A Different Approach to Graph Visualization

Understanding large graphs is challenging. Sure, a proper Cypher query can retrieve valuable information. But how do you find the pivotal queries when the structure of your graph is not known? In...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
6 ways to attain top benefits from artificial intelligence & machine learning

Data is the new strategic asset, the biggest business asset today. Data is to today’s digital economy what electricity was to the industrial economy. Organizations that understand the value of their...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to keep your business data trustworthy

Today, virtually every business is increasingly reliant on data to drive critical decision-making about the strategies that will deliver sustained growth. This puts the issue of data veracity – the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How DataOps Is Transforming Data Management Practices

NewVantage Partners’ 2018 Big Data Executive Survey demonstrates that culture and organizational impediments are the leading barriers to harnessing Big Data. Over half of executives surveyed reported...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How can European automakers thrive in the new mobility ecosystem?

What does the future of mobility hold for European carmakers? We see a new value chain emerging, shaped by regulations, consumer attitudes, and the pace of technological change. Key to thriving in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What is data wrangling and how can you leverage it for your business?

The term “wrangling” evokes images of cowboys lassoing runaway cattle, getting them under control, and taking them back to the herd. Data wrangling is also like lassoing data and putting it to work...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The tech behind cryptocurrency could save lives by fixing medical records

Your medical record grows longer with each visit to your doctor. Your weight, blood pressure, symptoms, and other data gradually builds up in your electronic medical records, or EMRs. It’s a neat,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Can blockchain technology live up to the hype? Barclays analysts say no

To some, blockchain is a potential game-changing innovation that could disrupt and replace traditional payment and information-recording systems. Created around 2009 by Satoshi Nakamoto, the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Connecting Looker to Oracle Autonomous Data Warehouse Cloud

Earlier in the week I wrote up my first impressions of Oracle Autonomous Data Warehouse Cloud (ADWC) on this blog, and said at the end I’d follow with another post on how to connect ADWC to Looker, the cloud-based BI tool I usually use to query Google BigQuery in my role as Analytics Product Manager at Qubit.

For this initial example I uploaded a few tables of workout and other data into ADWC using SQL Developer, exporting the relevant tables out of BigQuery in CSV format and then uploading them into a user account I created for the purpose in ADWC. The table I’ll concentrate on for this example is the STRAVA_RIDES table, with each row detailing an individual cycle workout as recorded through the Strava smartphone app.

The first step in connecting up Looker to ADWC, or indeed any Oracle Database, is to run some scripts that set up Oracle as Looker expects to find it. Unlike most other BI tools I’ve used with Oracle, Looker expects to connect through a specific user login (“LOOKER”) that is then granted SELECT access to any tables and views you want to report on in other schemas. This user login also needs some views and synonyms created to give it access to the V$ system views within Oracle that report on active sessions, and the ability to kill long-running sessions through a PL/SQL package that calls ALTER SYSTEM … KILL SESSION.

The commands to run for regular Oracle databases are detailed on the Looker website, but you need to alter them slightly to use ADWC’s superuser account name (“ADMIN”) instead of SYS when initially connecting and when creating the LOOKER_HASH function for symmetric aggregate handling, along with various other changes due to differences in how various objects are named in ADWC vs. regular Oracle Database.

I’ve listed the commands I ran on my ADWC instance below; they should work for you too, but if not, check out the “Autonomous Data Warehouse Cloud for Experienced Oracle Database Users” section in Using Autonomous Data Warehouse Cloud, which explains the differences between the new autonomous and regular Oracle Database server versions.

First, create the Looker account and grant the relevant roles and privileges:

connect ADMIN/<<your_admin_password>>

create user LOOKER identified by <<new_looker_account_password>>;

alter user LOOKER
  default tablespace DATA
  temporary tablespace TEMP
  account unlock;

alter user LOOKER quota unlimited on DATA;

grant RESOURCE to LOOKER;  -- the role must be granted before it can be set as the default
alter user LOOKER default role RESOURCE;
grant CREATE SESSION to LOOKER;
grant UNLIMITED TABLESPACE to LOOKER;
grant CREATE TABLE to LOOKER;
grant SELECT on <schema>.<table> to LOOKER;  -- repeat for all tables that will be used by Looker

Now create the views that Looker uses to understand what sessions are active and the SQL that’s currently being executed to provide data for looks and dashboard tiles:

create or replace view LOOKER_SQL as
select SQL.SQL_ID, SQL.SQL_TEXT
from   V$SQL sql, V$SESSION sess
where  SESS.SQL_ADDRESS = SQL.ADDRESS
and    SESS.USERNAME = 'LOOKER';

create or replace synonym LOOKER.LOOKER_SQL for LOOKER_SQL;
grant SELECT on LOOKER.LOOKER_SQL to LOOKER;

create or replace view LOOKER_SESSION as
select SID, USERNAME, TYPE, STATUS, SQL_ID, "SERIAL#", AUDSID
from   V$SESSION
where  USERNAME = 'LOOKER';

create or replace synonym LOOKER.LOOKER_SESSION for LOOKER_SESSION;
grant SELECT on LOOKER.LOOKER_SESSION to LOOKER;

Next, create the Oracle PL/SQL function that Looker uses as part of symmetric aggregate handling, and a procedure that Looker can use to “kill” runaway database queries that are taking too long to return results back to you.

create or replace function LOOKER_HASH(bytes raw, prec number) return raw as
begin
  return(DBMS_CRYPTO.HASH(bytes, prec));
end;
/

create or replace synonym LOOKER.LOOKER_HASH for LOOKER_HASH;
grant execute on LOOKER.LOOKER_HASH to LOOKER;
grant execute on ADMIN.LOOKER_HASH to LOOKER;
create or replace procedure LOOKER_KILL_QUERY(P_SID in VARCHAR2,
                                              P_SERIAL# in VARCHAR2)
is
  CURSOR_NAME pls_integer default dbms_sql.open_cursor;
  IGNORE      pls_integer;
begin
  select COUNT(*)
  into   IGNORE
  from   V$SESSION
  where  USERNAME = USER
  and    SID = P_SID
  and    SERIAL# = P_SERIAL#;

  if (IGNORE = 1) then
    dbms_sql.parse(CURSOR_NAME,
                   'alter system kill session '''
                   || P_SID || ',' || P_SERIAL# || '''',
                   dbms_sql.native);
    IGNORE := dbms_sql.execute(CURSOR_NAME);
  else
    raise_application_error(-20001,
      'You do not own session ''' || P_SID || ',' || P_SERIAL# || '''');
  end if;
end;
/

create or replace synonym LOOKER.LOOKER_KILL_QUERY for ADMIN.LOOKER_KILL_QUERY;
grant execute on ADMIN.LOOKER_KILL_QUERY to LOOKER;
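
Before moving on it’s worth sanity-checking the kill procedure from the LOOKER account. A quick sketch, with made-up SID and SERIAL# values; in practice you’d take these from the LOOKER_SESSION view created above:

-- Hypothetical test call: '367' and '12409' are placeholder values taken,
-- in practice, from LOOKER_SESSION; a session you don't own raises ORA-20001.
begin
  LOOKER_KILL_QUERY('367', '12409');
end;
/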

Next, over to the Looker configuration. As of the time of writing, you’ll need to be on Looker release 5.12.12 or higher, with an instance hosted in the US, to get the integration working, so that “ADWC” is listed as a connection type and the ADWC wallet integration works; if you’re running Looker as a hosted instance you’ll also need to speak with support to have them copy the wallet files across to the correct location on the Looker server.

To create the connection, enter the following details:

  • Name : Name of your connection, e.g. “rittman_adwc”
  • Dialect : Oracle ADWC (only appears with Looker 5.12.12+)
  • Host:Port : from the TNSNAMES.ORA file in your ADWC wallet zip file
  • Username : LOOKER (as per the account setup in previous steps)
  • Password : password of LOOKER account
  • Temp Database : LOOKER (as per previous steps)
  • Persistent Derived Tables : checked
  • Service Name : From TNSNAMES.ORA in your ADWC wallet zip file
  • Additional Database Params : TNSNAMES.ORA SSL Server Cert DN

To show the Service Name and Additional Database Params fields you first have to save the connection, then tick the “Use TNS” checkbox to reveal them. To find your host:port, service name and SSL Server Cert DN values, first download the wallet zip file for your ADWC instance from the ADWC Service Console, unzip the archive and then locate the details you need in the TNSNAMES.ORA file, as shown below. In my case I chose the “medium” ADWC instance type for my connection settings.
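
For illustration, here’s roughly what a wallet TNSNAMES.ORA entry looks like; every value below is a made-up placeholder rather than a real instance, but it shows where each Looker field comes from:

# Hypothetical "medium" entry from an ADWC wallet TNSNAMES.ORA; all values are placeholders.
rittman_adwc_medium =
  (description =
    (address = (protocol = tcps)(port = 1522)(host = adwc.example.oraclecloud.com))
    (connect_data = (service_name = abcd1234_rittmanadwc_medium.adwc.oraclecloud.com))
    (security = (ssl_server_cert_dn = "CN=adwc.example.oraclecloud.com,O=Oracle Corporation,L=Redwood City,ST=California,C=US"))
  )

Host:Port comes from the address clause, Service Name from the connect_data clause, and the Additional Database Params field takes the ssl_server_cert_dn string.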

Then, save and test your connection. The check for persistent derived tables will probably fail if you try this around the time of writing, as there’s a known bug in that step; it’ll no doubt be fixed soon, and if the rest of the checks pass you should be good.

Finally, it’s then just a case of importing your table metadata into Looker and creating explores and a model as you’d do with any other data source, like this:

In this instance I’ve updated the Strava Rides LookML view to turn the relevant metric fields into measures, define a primary key for the view and remove or hide fields that aren’t relevant to my analysis, like this:

Now I can start to analyze the Strava workout data I previously uploaded to Oracle Autonomous Data Warehouse, starting with average cadence and speed along with total distance and Strava’s “suffer score” for each of my workouts:

and then looking to see how much correlation there is between distance and personal Strava records being broken on my five longest rides.

In the background, Looker is sending Oracle- and ADWC-specific SQL to Oracle Autonomous Data Warehouse Cloud, with the SQL tab in the Explore interface showing me the actual SQL for each query as it’s sent.
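
To give a flavour of that SQL, here’s a sketch of the kind of aggregate statement Looker generates for an explore like the one above; the column names come from my STRAVA_RIDES example rather than Looker’s verbatim output, so treat it as illustrative:

-- Roughly the sort of query Looker sends for the explore above; the exact
-- SQL, aliases and limit clause Looker emits will differ.
select strava_rides.RIDE_DATE         as "strava_rides.ride_date",
       avg(strava_rides.AVG_CADENCE)  as "strava_rides.average_cadence",
       avg(strava_rides.AVG_SPEED)    as "strava_rides.average_speed",
       sum(strava_rides.DISTANCE_KM)  as "strava_rides.total_distance",
       sum(strava_rides.SUFFER_SCORE) as "strava_rides.total_suffer_score"
from   STRAVA_RIDES strava_rides
group  by strava_rides.RIDE_DATE
order  by 1 desc
fetch first 500 rows only;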

Should I wish to check how much of the storage and CPU capacity available for my ADWC instance is being used, I can do this from ADWC’s Service Console.

So there you have it — Looker powered by Oracle Autonomous Data Warehouse Cloud, and no need for an Oracle DBA to get it all running for you.


Connecting Looker to Oracle Autonomous Data Warehouse Cloud was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

From the doctor to the DMV, blockchain can make governments swift and secure

Although most commonly associated with cryptocurrencies like Bitcoin, blockchain technology is also being used all over the world for many intriguing purposes. One of the most impactful could be the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The creator of Javascript is using blockchain to save the internet from ads

Online ads are a special kind of terrible. They’re invasive, they’re distracting, and worst of all — they’re necessary. Despite the fact they’re almost universally hated, ads are what make the World...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Blockchain will turn gaming into a career, and give power to the players

Video games are more than a game. They are, at different times for different people, a challenge, a business, a lifestyle, or all the above. While professional gamers fight for titles, and the money...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
U.S. Army Invests In Revitalizing Long Range Precision Fires Capabilities

U.S. Marines from the 11th MEU fire their M777 Lightweight 155mm Howitzer during Exercise Alligator Dagger, Dec. 18, 2016. (U.S. Marine Corps/Lance Cpl. Zachery C. Laning/Military.com)

In 2016, Michael Jacobson and Robert H. Scales amplified a warning that, after years of neglect during the counterinsurgency wars in Iraq and Afghanistan, the U.S. was falling behind potential adversaries in artillery and long range precision fires capabilities. The U.S. Army had already taken note of the performance of Russian artillery in Ukraine, particularly the strike at Zelenopillya in 2014.

Since then, the U.S. Army and Marine Corps have started working on a new Multi-Domain Battle concept aimed at countering the anti-access/area denial (A2/AD) capabilities of potential foes. In 2017, U.S. Army Chief of Staff General Mark Milley made rapid improvement in long range precision fires capabilities the top priority for the service’s modernization effort. It currently aims to field new field artillery, rocket, and missile weapons capable of striking at distances from 70 to 500 kilometers – double the existing ranges – within five years.

The value of ground-based long-range precision fires has been demonstrated recently by the effectiveness of U.S. artillery support, particularly U.S. Army and Marine Corps 155mm howitzers, for Iraqi security forces retaking Mosul, for Syrian Democratic Forces assaulting Raqqa, and in the protection of Syrian Kurds being attacked by Russian mercenaries and Syrian regime forces.

According to Army historian Luke O’Brian, the Fiscal Year 2019 Defense budget includes funds to buy 28,737 XM1156 Precision Guided Kit (PGK) 155mm howitzer munitions, including replacements for the 6,269 rounds expended during Operation INHERENT RESOLVE. O’Brian also notes that the Army will buy 2,162 M982 Excalibur 155mm rounds in 2019 and several hundred each in following years.

In addition, in an effort to reduce the dependence on potentially vulnerable Global Positioning System (GPS) satellite networks for precision fires capabilities, the Army has awarded a contract to BAE Systems to develop Precision Guided Kit-Modernization (PGK-M) rounds with internal navigational capacity.

While the numbers appear large at first glance, data on U.S. artillery expenditures in Operations DESERT STORM and IRAQI FREEDOM (also via Luke O’Brian) shows just how much the volume of long-range fires has changed since 1991. For the U.S. at least, precision fires have indeed replaced mass fires on the battlefield.

The Role of Data Analytics in Digital Business

When someone tells me they want to do analytics, I say that it is easy. I can explain it in five minutes. However, if someone says that they want to transform an organization using analytics, my...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Power of the Algorithms: Pedro Domingos on the Arms Race in Artificial Intelligence

It’s a quiet hallway in the computer science department at the University of Washington in Seattle. To the right, young software engineers sit in front of their laptops in the windowless,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Leveraging AI and Blockchain to Transform Healthcare

Medicine is ripe for disruption. As David Lawrence, former Chairman and CEO of the Kaiser Foundation Health Plan, wrote in his 2005 book chapter Bridging the Quality Chasm: The costs that result from...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Privacy, Ethics and Regulation in Our New World of Artificial Intelligence

New applications with technology are simply amazing. From new uses of data analytics to artificial intelligence and machine learning, our world is rapidly transforming before our eyes. But what are...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Top-10 French Artificial Intelligence Startups

As France’s youngest president at 40 years old, Emmanuel Macron is known for his strong handshake, boyish good looks, and his controversial method of selecting a mate. Recently, he set his sights on...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Response 3 (Breakpoints)

This is in response to a long comment by Clinton Reilly about Breakpoints (Forced Changes in Posture) on this thread:

Breakpoints in U.S. Army Doctrine

Reilly starts with a very nice statement of the issue:

Clearly breakpoints are crucial when modelling battlefield combat. I have read extensively about it using mostly first hand accounts of battles rather than high level summaries. Some of the major factors causing it appear to be loss of leadership (e.g. Harald’s death at Hastings), loss of belief in the units capacity to achieve its objectives (e.g. the retreat of the Old Guard at Waterloo, surprise often figured in Mongol successes, over confidence resulting in impetuous attacks which fail dramatically (e.g. French attacks at Agincourt and Crecy), loss of control over the troops (again Crecy and Agincourt) are some of the main ones I can think of off hand.

The break-point crisis seems to occur against a background of confusion, disorder, mounting casualties, increasing fatigue and loss of morale. Casualties are part of the background but not usually the actual break point itself.

He then states:

Perhaps a way forward in the short term is to review a number of first hand battle accounts (I am sure you can think of many) and calculate the percentage of times these factors and others appear as breakpoints in the literature.

This has been done. In effect this is what Robert McQuie did in his article and what was the basis for the DMSI breakpoints study.

Battle Outcomes: Casualty Rates As a Measure of Defeat

Mr. Reilly then concludes:

Why wait for the military to do something? You will die of old age before that happens!

That is distinctly possible. If this really were a simple issue, one that a single person working for a year could produce a nice definitive answer for… it would have already been done!

Let us look at the 1988 Breakpoints study. There was some effort leading up to that point. Trevor Dupuy and DMSI had already looked into the issue. This included developing a database of engagements (the Land Warfare Data Base, or LWDB) and using that to examine the nature of breakpoints. The McQuie article was developed from this database, and his article was closely coordinated with Trevor Dupuy. This was part of the effort that led the U.S. Army’s Concepts Analysis Agency (CAA) to issue an RFP (Request for Proposal). It was competitive. I wrote the proposal that won the contract award, but the contract was given to Dr. Janice Fain to lead. My proposal was more quantitative in approach than what she actually did. Her effort was more of an intellectual exploration of the issue. I gather this was done with the assumption that there would be a follow-on contract (there never was). Now, up until that point at least a man-year of effort had been expended, and if you count the time to develop the databases used, it was several man-years.

Now, the Breakpoints study was headed up by Dr. Janice B. Fain, who worked on it for the better part of a year. Trevor N. Dupuy worked on it part-time. Gay M. Hammerman conducted the interviews with the veterans. Richard C. Anderson researched and created an additional 24 engagements that had clear breakpoints in them for the study (that is DMSI report 117B). Charles F. Hawkins was involved in analyzing the engagements from the LWDB. There were several other people also involved to some extent. Also, 39 veterans were interviewed for this effort; many were brought into the office to talk about their experiences (that was truly entertaining). There were also a half-dozen other staff members and consultants involved, including Lt. Col. James T. Price (USA, ret), Dr. David Segal (sociologist), Dr. Abraham Wolf (a research psychologist), Dr. Peter Shapiro (social psychology) and Col. John R. Brinkerhoff (USA, ret). There were consultant fees, travel costs and other expenses related to that. So, the entire effort took at least three “man-years”. This was what was needed just to get to the point where we were able to take the next step.

This is not something that a single scholar can do. That is why funding is needed.

As to dying of old age before that happens… that may very well be the case. Right now, I am working on two books, one of them under contract. I sort of need to finish those up before I look at breakpoints again. After that, I will decide whether to work on a follow-on to America’s Modern Wars (called Future American Wars) or work on a follow-on to War by Numbers (called War by Numbers II… being the creative guy that I am). Of course, neither of these books is selling well… so perhaps my time would be better spent writing another Kursk book, or any number of other interesting projects on my plate. Anyhow, if I do War by Numbers II, then I do plan on investing several chapters into addressing breakpoints. This would include using the 1,000+ cases that now populate our combat databases to do some analysis. This is going to take some time. So… I may get to it next year or the year after that, but I may not. If someone really needs the issue addressed, they really need to contract for it.

5G to become the catalyst for innovation in IoT

5G represents a fundamental shift in communication network architectures. It promises to accelerate future revenue generation through innovative services facilitated via 5G-enabled devices, including...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Securing the Internet of Things with Digital Holograms

Securing the Internet of Things is a phrase that is on everyone’s lips these days, but what exactly will it take? And why should device manufacturers care? Here are some of the major concerns:...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Start with Conversational AI

Chatbots have been around for a few years now, and they will not go away any time soon. Facebook popularised the chatbot with Facebook Messenger Bots, but the first chatbot was already developed in the 1960s: MIT professor Joseph Weizenbaum developed a chatbot called ELIZA. The chatbot was developed to demonstrate the superficiality of communication between humans and machines, and it used very simple natural language processing. Of course, since then we have progressed a lot and, nowadays, it is possible to have lengthy conversations with a chatbot. For an overview of the history of chatbots, you can read this article.

Chatbots are a very tangible example where humans and machines work together to achieve a goal. A chatbot is a communication interface that helps individuals and organisations have conversations, and many organisations have developed a chatbot. There are multiple reasons for organisations to develop a chatbot, including obtaining experience with AI, engaging with customers and improving marketing, reducing the number of employees required for customer support, disseminating information and content in a way that users are comfortable with and, of course, increasing sales.

Chatbots offer a lot of opportunities for organisations, and they can be fun to interact with if developed correctly. But ...


Read More on Datafloq
Face Detection with Intel® Distribution for Python*

This advertorial is sponsored by Intel®.

Artificial Intelligence (AI) can be used to solve a wide range of problems, including those related to computer vision, such as image recognition, object detection, and medical imaging. In the present paper we show how to integrate OpenCV* (Open Source Computer Vision Library) with a neural network backend. In order to achieve this aim, we first explain how the video stream is manipulated using a Python* programming interface and we also provide guidelines on how to use it. Finally, we discuss a working example of an OpenCV application. OpenCV is one of the packages that ship with Intel® Distribution for Python* 2018.

Introduction

Today, the possibilities of artificial intelligence (AI) are accessible to almost everyone. There are a number of artificial intelligence applications, and many of them require the use of computer vision techniques. One of the libraries most commonly used for detection and matching, motion estimation, and tracking is OpenCV. OpenCV is a library of programming functions mainly aimed at real-time computer vision. Originally developed by Intel, it was later supported by Willow Garage and is now maintained by Itseez. The library is cross-platform and free for use under the open-source BSD license.

Usually, the OpenCV ...


Read More on Datafloq
How Blockchain is Reinventing the RegTech Market

Regulatory technology (RegTech) is one of the fastest growing sectors of the software and technology industry, for good reason. Banks, private equity, hedge funds, and other financial institutions are often able to reduce compliance costs and expenses by up to 50 percent with proper implementation of RegTech solutions. The question is, where is the RegTech market headed next, and how will funds, managers, and administrators potentially benefit?

RegTech spending is already generating significant momentum, set to increase from 4.8 percent of regulatory spending in 2017 to 34.4 percent by 2022. A large part of this investment stems from the sheer volume of people, finances, and resources that financial institutions devote to regulatory compliance on an annual basis. Financial institutions like Citi often have upwards of 30,000 people employed on their compliance staff, and that’s just one of the tens of thousands of major financial institutions that deal with regulatory issues on a consistent basis. Just imagine if this staffing level could be reduced by 50 percent: a single large bank could save somewhere around $1.2 billion per year (based on the average wages of regulatory staff; cutting 15,000 positions saves $1.2 billion a year at roughly $80,000 per employee).

In an effort to ease much of the cost and complications associated with regulatory compliance, banks, ...


Read More on Datafloq
Looking under the hood at Amazon Neptune

A few weeks back, my Big on Data bro George Anadiotis provided the deep dive on why graph platforms may become the next database in your portfolio, regardless of whether you know what they are. At...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Can Data Scientists Become IoT Experts?

Gartner predicts that the number of IoT devices will surpass 11.2 billion this year, the majority of which are in the consumer sector. The same report forecasts that the endpoint spending will exceed...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Customer data analytics and the empowered organization

Data, whether in a data lake, enterprise data warehouse or an operational data store, is not useful. More accurately, data are not useful in and of themselves. Data only become useful when used, and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
GPU databases are coming of age

GPUs are that obscure object of desire right now. Originally created to provide better performance for gamers, now everyone from crypto miners to deep-learning experts wants a piece of them....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Breakpoints in U.S. Army Doctrine

U.S. Army prisoners of war captured by German forces during the Battle of the Bulge in 1944. [Wikipedia]

One of the least studied aspects of combat is battle termination. Why do units in combat stop attacking or defending? Shifts in combat posture (attack, defend, delay, withdrawal) are usually voluntary, directed by a commander, but they can also be involuntary, as a result of direct or indirect enemy action. Why do involuntary changes in combat posture, known as breakpoints, occur?

As Chris pointed out in a previous post, the topic of breakpoints has only been addressed by two known studies since 1954. Most existing military combat models and wargames address breakpoints in at least a cursory way, usually through some calculation based on personnel casualties. Both of the breakpoints studies suggest that involuntary changes in posture are seldom related to casualties alone, however.

Current U.S. Army doctrine addresses changes in combat posture through discussions of culmination points in the attack, and transitions from attack to defense, defense to counterattack, and defense to retrograde. But these all pertain to voluntary changes, not breakpoints.

Army doctrinal literature has little to say about breakpoints, either in the context of friendly forces or potential enemy combatants. The little it does say relates to the effects of fire on enemy forces and is based on personnel and material attrition.

According to ADRP 1-02 Terms and Military Symbols, an enemy combat unit is considered suppressed after suffering 3% personnel casualties or material losses, neutralized by 10% losses, and destroyed upon sustaining 30% losses. The sources and methodology for deriving these figures are unknown, although these specific terms and numbers have been a part of Army doctrine for decades.

The joint U.S. Army and U.S. Marine Corps vision of future land combat foresees battlefields that are highly lethal and demanding on human endurance. How will such a future operational environment affect combat performance? Past experience undoubtedly offers useful insights but there seems to be little interest in seeking out such knowledge.

Trevor Dupuy criticized the U.S. military in the 1980s for its lack of understanding of the phenomenon of suppression and other effects of fire on the battlefield, and its seeming disinterest in studying it. Not much appears to have changed since then.

What is a graph database? A better way to store connected data

Key-value, document-oriented, column family, graph, relational… Today we seem to have as many kinds of databases as there are kinds of data. While this may make choosing a database harder, it...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Can AI Fight Fake Content?

Fake content is a big problem. It’s not just fake news — it’s fake websites, social media profiles, and ads. There are plenty of uses for this stuff, ranging from subversive to downright malicious intent. At its core, fake content threatens the foundation of trust in society, a foundation necessary for societies to function properly, particularly democratic societies. Russia knows this, and it’s the reason why the Kremlin has an army of trolls disseminating fake content. The ultimate goal is to undermine American and European democracies.

In American homes, families are becoming increasingly concerned about fake news. A study by Panda Security found that fake news is right up there on the list of parents’ concerns next to sexual predators. This concern affects parents’ views towards media sites: nearly 50 percent view alt-right site Breitbart as “unsafe for children” and 20 percent feel the same about CNN. Nearly 6 percent of the parents surveyed block Facebook, which has been accused of unwittingly propagating fake news that may have influenced the 2016 presidential election. In comparison, only 2.5 percent of those parents block PornHub. Sexual predators oftentimes set up fake profiles on Facebook through which they lure kids.

Fake news isn’t the only ...


Read More on Datafloq
Why Knowledge Graphs Are Foundational to Artificial Intelligence

AI is poised to drive the next wave of technological disruption across industries. Like previous technology revolutions in Web and mobile, however, there will be huge dividends for those...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How GDPR may help drive blockchain usage for content

The application of blockchain to content authentication has been gaining traction, and it may get an extra push from the zeitgeist brought about by the EU General Data Protection Regulation (GDPR)....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Blockchain In Health Care: The Good, The Bad And The Ugly

We hear a lot of talk about blockchain in healthcare and for good reason. Distributed ledger and/or blockchain technology is a hot topic in innumerable fields, ranging from finance to law to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Could Your Website Stand Up Against a Cyber Attack?

The world of cybercrime is as active as ever and cybercriminals are always looking for a new target. It’s easy to think you’re safe because you’re a single person or a small company with a website, but that’s not true at all.

In fact, many cybercriminals prey exclusively on smaller websites and companies because they have weaker security. Large corporations can afford to pay for massive security measures, but it’s quite likely you can’t.

The question is: could your current website survive against a cyber attack? No matter who you are, your website is constantly at risk of being attacked for a variety of reasons. It could be a criminal trying to steal sensitive data or a group of people you offended. There are new types of attacks becoming popular every year, meaning you need to stay vigilant. Here are some things to consider for your website’s security and where you could improve.

DDoS Attacks

A very common type of cyber attack for websites is a Distributed Denial of Service (DDoS) attack. This is when a website receives so much fake traffic so quickly that the hosting server can’t handle it and shuts down. This can prevent real visitors from reaching the website, ...


Read More on Datafloq
Why Data Is HR’s Most Important Asset

There’s recruitment data, career progression data, training data, absenteeism figures, productivity data, personal development reviews, competency profiles and staff satisfaction data, just for...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Separating the hype from reality in healthcare AI

Artificial intelligence (AI) and machine learning technology are sweeping most tech sectors and industries, and healthcare is no exception. In fact, at HIMSS18, no technology was hotter than AI....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Six Challenges of Big Data Analytics for Retail

Retail is one of the oldest platforms for a buyer to interact with a seller. Long before e-Commerce and m-Commerce hit it big, the retail industry relied on great product displays, competitive pricing and effective salesmanship to get the job done. With the growth of technology and its numerous flavours and avatars, the retail industry pulled in what it needed.

Today, with sales in the US alone over $5 trillion in 2017, and with the pressure to create a superior experience (86% of customers say that they will pay more for a superior experience), the retail industry is turning to niche technology to get a better job done.

Big data and retail

The niche technology here is big data. It is niche, but it is no longer a technology that is in the labs. The spending on big data was projected to reach USD 57 billion in 2017 with 6 million developers around the globe.

The retail industry is the perfect candidate for big data as it generates data with great volume, variety and velocity. From the products that are bought by the customers to the modes of payment used to the trends in buying in a particular festival to their buying decision factors ...


Read More on Datafloq
How Artificial Intelligence Is Revolutionizing Home Security

If you would like to protect your home and family from the unthinkable, it's time for you to learn how artificial intelligence is changing home security. According to the U.S. Department of Justice, 3.7 million burglaries take place each year on average. You are likely looking to the latest home security technology so that you can safeguard yourself from the threat.

Security systems of the past used simple sensors and alarms to detect intruders, and they would contact a team of security professionals if something went wrong. If you were away from home when the alarm went off, you would have needed to wait until you returned to survey the damage. Many people came back from vacation or business meetings early to find that it was only a false alarm. The ability to integrate artificial intelligence with home security combats that problem and offers several benefits, creating complete home protection on which you can depend.

Wi-Fi Connectivity

Modern security systems now let you connect your cameras to your home's Wi-Fi connection. You can then download and install a smartphone app so that you can view your security footage from anywhere with a cell signal or internet access. Your home will notify you when ...


Read More on Datafloq
Data Loss Prevention – It’s All About Decentralization

The only realistic solution to the enterprise data loss problem is data decentralization. I once heard a talk by a great CSO and his comment really stuck with me: hackers can fail 999 times out of 1,000 and still be very successful, but we have to succeed every single time or we have failed. The reality is that the best and brightest companies get hacked, and it’s not because they aren’t skilled enough to prevent it. It’s simply a numbers game. Eventually, systems get so large and have so many entry points that even the best and brightest can’t stop the 1 in 1,000. Once a hacker gets access to any data, they likely get all of it. Why is this the case? It’s because most data is stored in a single database or in a single file repository. The model is flawed. The rise of blockchain technologies and the model of decentralization should be a guide on how to do this better.

What does it mean to have your data decentralized? It means that your processing and data reside across possibly many thousands of machines. If this sounds a little scary, let’s not forget what’s already happening in the ...


Read More on Datafloq
First Impressions of Oracle Autonomous Data Warehouse Cloud

Regular readers of this blog will know that my day-to-day work involves using Google BigQuery, one of a new generation of fully-managed, cloud-hosted data warehouse platforms-as-a-service, to ingest, store and analyze petabytes of data. Services like BigQuery take care of all the database infrastructure management and capacity planning, leaving you just to define tables, load data into them and then run queries using SQL or your favourite BI tool.

Oracle recently announced Oracle Autonomous Data Warehouse Cloud (ADWC), which competes in this market, and given my background in Oracle technology I was interested to see how it stacked up against BigQuery and similar services from the likes of Amazon AWS and Snowflake. So I took advantage of the free $300 service credit offer and signed up for an ADWC instance. Note that at the time of writing the ADWC service is only available in Oracle Public Cloud’s North American datacenter region, not the European ones, so make sure you choose the former when creating your Oracle Cloud account if you try this out yourself in the next few days.

Setting up an Oracle on-premises database was something I’d once spent months, and thousands in course fees, studying for certifications to be able to do confidently, so it was nice to see how much simpler things have become: setting up an ADWC instance starts with selecting ADWC from the Oracle Cloud Services menu, like this:

Then answer a few simple questions: the name to give the instance, number of CPUs and amount of storage to set aside initially, and the password for the administrator account.

Setup of an instance takes a few minutes, then it’s ready. Not bad.

The next big change you notice compared to Oracle’s customer-managed Database Cloud Service is the replacement of Enterprise Manager Database Express 12c and DBaaS Console with a single Service Console, accessed not through the usual SYS and SYSTEM superuser accounts but through a new superuser account called ADMIN, presumably adopting a naming standard more familiar to users of other SaaS and PaaS platforms.

The Service Console contains a couple of performance-tracking pages, with the emphasis on read-only viewing of those stats vs. giving you controls to fine-tune database parameters or table structures, and a page where you can download a zip file containing a wallet directory with TNSNAMES.ORA and encrypted connection details to your ADWC instance.

I seem to remember that at the time of Autonomous Data Warehouse Cloud’s launch, many in the Oracle community (myself included, most probably) said that all the “autonomous” part of the product did was automate the Oracle Database maintenance tasks that more recent data warehousing platforms-as-a-service already handled for you, with another common critique being that under the covers ADWC was more scripting and use of existing “advisors” than true autonomous, artificial intelligence-style database management. To be honest, though, it doesn’t matter: compared to running a regular Oracle database, you just fill in a short form and the platform works without any further intervention from you. You don’t need to know how it works, and that’s job done in terms of competing with BigQuery and Athena for ease of use and maintenance.

Connecting to ADWC is a bit different to regular Oracle Database connections in that it’s easier (no need to specify host, port and SID/service name), but it involves downloading a ZIP file wallet, and the tool you wish to connect needs support for that wallet; Oracle SQL Developer in its most recent 18c version has that support, making it the obvious tool to define new users, create tables, upload data into them and then run simple SQL queries against your data.
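
The same wallet approach also works for other Oracle clients; a minimal sketch, assuming the wallet zip has been extracted to a hypothetical /home/mark/adwc_wallet directory:

# sqlnet.ora, pointing at the extracted wallet files (directory path is hypothetical):
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY = "/home/mark/adwc_wallet")))
SSL_SERVER_DN_MATCH = yes

# Set TNS_ADMIN to the same directory so the client picks up sqlnet.ora and
# TNSNAMES.ORA, then connect with one of the wallet's service entries:
#   export TNS_ADMIN=/home/mark/adwc_wallet
#   sqlplus ADMIN/<<your_admin_password>>@<service_name_from_tnsnames>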

Oracle ADWC is based on Oracle Exadata Database Server technology and supports PL/SQL, full INSERT/UPDATE/DELETE support in SQL as well as SELECT, parallel query (enabled by default) and hybrid columnar compression.

What you don’t get are indexes and materialized views, as these generally aren’t considered necessary when running on Exadata hardware that’s optimized for full table scans of data arranged in compressed columnar format; and as ADWC charges per CPU per hour, there’s no penalty for unnecessarily scanning lots of detail-level data, as there is with BigQuery.

Also missing are features such as the Oracle OLAP Option, Oracle Spatial and Graph and Oracle Text that I’m guessing might get implemented at a later date, along with management features such as Oracle Tuning Pack that are no longer needed when the autonomous features of the platform do the database tuning for you.

Price-wise, although as I said a moment ago you do get charged by the amount of data you scan (and a bit for storage) with Google BigQuery, Oracle ADWC seems to be more expensive to run, at least at the data volumes I’d be using as a hobbyist developer. If you commit to monthly charging, ADWC costs around $1,200/CPU/month, whereas Google charges $5/TB for queries, with the first 1TB free each month, plus a token amount for storage and streaming inserts; my bill last month for BigQuery itself was a total of $1.53, with my total bill for all Google Cloud Platform services, including multiple VMs and extensive usage of NLP and Geocoding APIs, coming to just over $150. Clearly Google is aiming more at the startup market whereas Oracle is serving enterprise customers, but don’t assume that just because ADWC is elastic it’s also super-cheap for hobbyist use long-term.

So now it’s a case of loading some sample data into ADWC using SQL Developer and then running a couple of test queries to see it working in action with my data.
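
As a sketch of what those steps look like in SQL (the table and sample rows below are illustrative, mirroring the Strava workout data I’ll use in the follow-up post):

-- Illustrative sample table, loaded via SQL Developer; names and values are my own.
create table STRAVA_RIDES (
  RIDE_ID      number primary key,
  RIDE_DATE    date,
  DISTANCE_KM  number,
  AVG_SPEED    number,
  SUFFER_SCORE number
);

insert into STRAVA_RIDES values (1, date '2018-04-01', 52.4, 24.1,  77);
insert into STRAVA_RIDES values (2, date '2018-04-08', 81.2, 22.8, 131);
commit;

-- A simple test query: monthly totals and average "suffer score".
select trunc(RIDE_DATE, 'MM') as RIDE_MONTH,
       sum(DISTANCE_KM)       as TOTAL_KM,
       avg(SUFFER_SCORE)      as AVG_SUFFER_SCORE
from   STRAVA_RIDES
group  by trunc(RIDE_DATE, 'MM')
order  by 1;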

Tomorrow, I’ll show how Looker can be connected using its newly announced support for Oracle ADWC, and later in the week I’ll do the same for Oracle Analytics Cloud and DV Desktop.


First Impressions of Oracle Autonomous Data Warehouse Cloud was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Why the Future of Social Media Will Depend on Artificial Intelligence

With an estimated 2.77 billion social network users around the globe, social networking platforms possess an unimaginable amount of data. The reach of social networks is expanding at an alarming rate...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Five Surprising Uses of Virtual Reality in Business

Ah, virtual reality. Huge potential, but it always seems that next year will be the big breakout. There are many interesting case studies, more and more people trying VR for the first time and more hardware being sold every day, but in terms of practical, everyday applications in business, virtual reality is viewed by many as being in the “nice to have but not essential” category.

But that’s already changing and is going to change rapidly in the near future. Virtual reality will offer significant advantages for professionals working in a host of different industries. It will allow industries from manufacturing to automotive, logistics, retail, hospitality and more to better plan for the future, access and see real-time data and hardware anywhere in the world, and collaborate on new designs, to name a few applications. In totality, estimates place the worth of the VR industry at $28.3 billion by 2020 (Source).

Below are five areas where there’s big potential for VR in business applications.

Automotive

As far back as 2012 (WAY back in VR terms!), reports (example) were showing how manufacturers like Jaguar Land Rover (JLR) and Ford were using VR for visual prototyping, manufacturing, virtual assembly, testing and training. Ford’s VR testing center is well ...


Read More on Datafloq
IoT trends that will keep IT leaders busy throughout 2018 — and beyond

Last year was one of excitement and transformation in the Internet of Things space. Nearly every enterprise invested in an IoT ecosystem to drive the bottom line. Connected gadgets, wearables, and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Rethinking Data Marts in the Cloud

Many of us are all too familiar with the traditional way enterprises operate when it comes to on-premises data warehousing and data marts: the enterprise data warehouse (EDW) is often the center of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
C-WAM 4 (Breakpoints)

A breakpoint, or involuntary change in posture, is an essential part of modeling, and there is a breakpoint methodology in C-WAM. According to slide 18 and rule book section 5.7.2, a ground unit below 50% strength can only defend, and it is removed at below 30% strength. I gather this is a breakpoint for a brigade.
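
For concreteness, that rule boils down to two fixed strength thresholds. Below is a minimal sketch of how a model’s data layer might encode it, written as SQL against a hypothetical ground_units table; the table and column names are my own illustration, not anything from C-WAM itself:

-- Hypothetical encoding of the C-WAM posture rule described above:
-- below 30% strength a unit is removed from play, below 50% it may only defend.
select unit_id,
       strength_pct,
       case
         when strength_pct < 30 then 'REMOVED'
         when strength_pct < 50 then 'DEFEND ONLY'
         else 'ANY POSTURE'
       end as allowed_posture
from   ground_units;

As the rest of this post argues, the ease of writing such a rule is exactly the problem: fixed thresholds like these are trivial to code but have never been empirically validated.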

C-WAM 2

Let me just quote from Chapter 18 (Modeling Warfare) of my book War by Numbers: Understanding Conventional Combat (pages 288-289):

The original breakpoints study was done in 1954 by Dorothy Clark of ORO.[1] It examined forty-three battalion-level engagements where the units “broke,” including measuring the percentage of losses at the time of the break. Clark correctly determined that casualties were probably not the primary cause of the breakpoint and also declared the need to look at more data. Obviously, forty-three cases of highly variable social science-type data with a large number of variables influencing them are not enough for any form of definitive study. Furthermore, she divided the breakpoints into three categories, resulting in one category based upon only nine observations. Also, as should have been obvious, this data would apply only to battalion-level combat. Clark concluded “The statement that a unit can be considered no longer combat effective when it has suffered a specific casualty percentage is a gross oversimplification not supported by combat data.” She also stated “Because of wide variations in data, average loss percentages alone have limited meaning.”[2]

Yet, even with her clear rejection of a percent loss formulation for breakpoints, the 20 to 40 percent casualty breakpoint figures remained in use by the training and combat modeling community. Charts in the 1964 Maneuver Control field manual showed a curve with the probability of unit break based on percentage of combat casualties.[3] Once a defending unit reached around 40 percent casualties, the chance of breaking approached 100 percent. Once an attacking unit reached around 20 percent casualties, the chance of it halting (type I break) approached 100% and the chance of it breaking (type II break) reached 40 percent. These data were for battalion-level combat. Because they were also applied to combat models, many models established a breakpoint of around 30 or 40 percent casualties for units of any size (and often applied to division-sized units).

To date, we have absolutely no idea where these rule-of-thumb formulations came from and despair of ever discovering their source. These formulations persist despite the fact that in fifteen (35%) of the cases in Clark’s study, the battalions had suffered more than 40 percent casualties before they broke. Furthermore, at the division-level in World War II, only two U.S. Army divisions (and there were ninety-one committed to combat) ever suffered more than 30% casualties in a week![4] Yet, there were many forced changes in combat posture by these divisions well below that casualty threshold.

The next breakpoints study occurred in 1988.[1] There was absolutely nothing of any significance (meaning providing any form of quantitative measurement) in the intervening thirty-five years, yet there were dozens of models in use that offered a breakpoint methodology. The 1988 study was inconclusive, and since then nothing further has been done.[2]

This seemingly extreme case is a fairly typical example. A specific combat phenomenon was studied only twice in the last fifty years, both times with inconclusive results, yet this phenomenon is incorporated in most combat models. Sadly, similar examples can be pulled for virtually each and every phenomena of combat being modeled. This failure to adequately examine basic combat phenomena is a problem independent of actual combat modeling methodology.

Footnotes:

[1] Dorothy K. Clark, Casualties as a Measure of the Loss of Combat Effectiveness of an Infantry Battalion (Operations Research Office, Johns Hopkins University, 1954).

[2] Ibid, page 34.

[3] Headquarters, Department of the Army, FM 105-5 Maneuver Control (Washington, D.C., December, 1967), pages 128-133.

[4] The two exceptions were the U.S. 106th Infantry Division in December 1944, which incidentally continued fighting in the days after suffering more than 40 percent losses, and the Philippine Division, which upon its surrender at Bataan on 9 April 1942 had suffered 100 percent losses in one day, in addition to very heavy losses in the days leading up to the surrender.

[1] This was HERO Report number 117, Forced Changes of Combat Posture (Breakpoints) (Historical Evaluation and Research Organization, Fairfax, VA., 1988). The intervening years between 1954 and 1988 were not entirely quiet. See HERO Report number 112, Defeat Criteria Seminar, Seminar Papers on the Evaluation of the Criteria for Defeat in Battle (Historical Evaluation and Research Organization, Fairfax, VA., 12 June 1987) and the significant article by Robert McQuie, “Battle Outcomes: Casualty Rates as a Measure of Defeat” in Army, issue 37 (November 1987). Some of the results of the 1988 study were summarized in the book by Trevor N. Dupuy, Understanding Defeat: How to Recover from Loss in Battle to Gain Victory in War (Paragon House Publishers, New York, 1990).

[2] The 1988 study was the basis for Trevor Dupuy’s book: Col. T. N. Dupuy, Understanding Defeat: How to Recover From Loss in Battle to Gain Victory in War (Paragon House Publishers, New York, 1990).

Also see:

Battle Outcomes: Casualty Rates As a Measure of Defeat

Blockchain is facing a backlash. Can it survive?

Blockchain is facing a backlash. Can it survive?

Not so long ago, the internet was hailed as the solution to humanity’s ills. It would shine a light on all corners of the globe, bringing new knowledge and exchange. But growing concerns about fake...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
4 things executives should know about AI and data science

4 things executives should know about AI and data science

Artificial intelligence, machine learning, and deep learning are buzzworthy terms in the world of business, ranging across channels from customer service to finance and beyond. Because big data is...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Data Warrior #HappyDance! Guess who joined @Snowflakedb

Data Warrior #HappyDance! Guess who joined @Snowflakedb

It’s going to be a big day at Snowflake. Two of my good friends are joining my team.
Monetize your IoT investments with Connected Field Service

Monetize your IoT investments with Connected Field Service

“Customer experience will overtake price and product as the key brand differentiator by the year 2020,” according to consulting firm Walker. From bolts and ball bearings to machine tools and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Blockchain Can Solve Big Data Barriers

How Blockchain Can Solve Big Data Barriers

The tech industry is ranting and raving about blockchain, and for good reason; few technologies seem set to upend modern ways of doing business like blockchain does. When it comes to big data in particular, tech investors are coming to realise all of blockchain’s potential – but for blockchain to become a truly transformative force in the market, it needs to establish itself as more than a passing trend.

Here’s how the tech world could be relying upon blockchain when it comes to bypassing some of the biggest big data barriers of the contemporary marketplace, and why this exciting new technology likely isn’t going anywhere anytime soon.

Investors are going all in on blockchain

The first thing that needs to be established in any discussion about blockchain is that the technology will likely be shaping our market for some time: investors everywhere are pouring money into blockchain-based initiatives and funding blockchain startups in greater numbers than ever before. It’s fair to say that the current blockchain investment boom is only just getting started, too; as more and more entrepreneurs come to realise the big data potential squared away within blockchain, they’re going to see ...


Read More on Datafloq
Implementing Deep Learning Methods and Feature Engineering for Text Data: The Skip-gram Model

Implementing Deep Learning Methods and Feature Engineering for Text Data: The Skip-gram Model

Just like we discussed in the CBOW model, we need to model this Skip-gram architecture now as a deep learning classification model such that we take in the target word as our input and try to predict...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Mastering the critical steps to being a data-driven organization

Mastering the critical steps to being a data-driven organization

The organizational view of data has evolved from often being an afterthought to that of a fundamental currency that drives decisions. A key element in this data evolution has been the exponential...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Elephant in the room: Hortonworks CEO thinks Hadoop software will keep driving big data

Elephant in the room: Hortonworks CEO thinks Hadoop software will keep driving big data

Since its founding in 2011, Hortonworks Inc. has fought battles on two fronts: Persuade corporations to adopt an entirely new, open-source data platform called Hadoop for a novel type of analytic...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
6 tricky obstacles security teams face in GDPR compliance

6 tricky obstacles security teams face in GDPR compliance

The European Union’s General Data Protection Regulation (GDPR) takes effect May 25 and the penalties are stiff for failing to comply. Many are still unsure whether their companies are safely out of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Open Data is Driving Innovation in Government

How Open Data is Driving Innovation in Government

The amount of useful data that most governments have access to is impressive by any standard. But what’s more impressive is the potential that data has to improve the way that governments operate. That’s why one of the biggest trends in the GovTech sector today is the push for open data.

The movement towards open data in the public sector is significant for a couple of primary reasons. On the one hand, transparency lets citizens know what data the government has and how it is potentially being used, so they can hold the right people accountable. On the other, open data helps governments by fostering greater interdepartmental collaboration and driving innovation through sharing data with third parties.

Public sector utilization of data transparency and analytics is evolving at a rapid pace on the state, local and federal levels to catalyze innovation and create a more citizen-centric way of operating. Here’s how the landscape of data transparency has evolved over recent years, how the public sector is putting open data and analytics to use, and why we should all be optimistic about the benefits of data transparency for both governments and citizens.

1. The Emergence of Open Data

One of the most significant moments in the ...


Read More on Datafloq
Making Data Work For You – Think Big Start Small

Making Data Work For You – Think Big Start Small

If you work in digital media there’s no escaping data. It underpins pretty much everything we do and fuels strategic decisions from campaign to board level. But since big data took centre stage you’d...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Top 10 Things We’re Doing at TeleSign to Comply With the GDPR

The Top 10 Things We’re Doing at TeleSign to Comply With the GDPR

The right to privacy is included as a fundamental right in the EU Charter. The GDPR’s sanctions-based regime was created to protect that right, and TeleSign firmly believes that full compliance with...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
10 reasons to shift your data buying to next generation data marketplaces

10 reasons to shift your data buying to next generation data marketplaces

With the changes in both the nature and perception of data into something now seen as agile, real-time accurate and accessible, it has become a tool to help redefine customer experience and drive...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Can graph databases fulfill the hype? We won’t know for sure until AWS’ Neptune arrives at year-end

Can graph databases fulfill the hype? We won’t know for sure until AWS’ Neptune arrives at year-end

Graph technology has been the next big thing in the enterprise database market for longer than I can remember. It certainly doesn’t lack for dogged evangelists who tout graph as the silver bullet for...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Virtual Reality for Data Centers – A Competitive Advantage

Virtual Reality for Data Centers – A Competitive Advantage

We work with a host of clients, helping them bring massive spaces, data and physical assets into the digital world. The technology makes it possible for anyone, located anywhere in the world, to explore their space in both 3D and virtual reality (VR). Tens to hundreds of thousands of square feet - and all of the details and assets inside and out - can be at your fingertips online, with varied levels of access to provide any audience the amount of detail they require. 

It was exciting to see Flexential (formerly Peak 10 + Viawest) using VR at a trade show, where the company was able to “bring” its Gen 4 data center facility along for the ride. A conference attendee could enter the Flexential booth, strap on a pair of Oculus Rift VR goggles, and take an in-depth tour of a Flexential data center, complete with an in-tour quiz to help people learn how the facility operates, or simply stay fully immersed in the 360° panoramas.

No kidding - there was a line of people waiting to take the virtual tour. And it generated buzz and interest - exactly what you want at a tradeshow. VR opens up the door to ...


Read More on Datafloq
3 Essentials for Using Big Data for Financial Forecasting

3 Essentials for Using Big Data for Financial Forecasting

At this point, big data’s impact on the way companies operate is no longer up for debate. Proving itself to be a key player in everything from marketing to retail, big data’s prevalence in business...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Top 25 Artificial Intelligence Companies

Top 25 Artificial Intelligence Companies

Artificial intelligence has exploded in the past few years with dozens of startup companies and major AI initiatives by big name firms alike. The New York Times estimates there are 45 AI startups...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Analytics Of Things Standards Are Needed

Why Analytics Of Things Standards Are Needed

As the Internet of Things (IoT) continues to explode, so does the need for the analysis of IoT data. At IIA, we call the analysis of IoT data the Analytics of Things (AoT). There are many success stories from the world of AoT. However, without some attention to standards of both IoT data itself and the analysis of it, organizations will struggle to achieve AoT’s potential. In this post, we’ll dig into several different areas where standardization must be pursued.

IoT Communications Standards

Many devices need to communicate with one another. Sometimes, this communication is a simple exchange of sensor readings, while in other cases it is the passing of complex analytics results. Without standards around how these communications occur, issues will arise. Some scenarios are fine with localized standards, while others require global standards.

To illustrate a local standard, consider a smart kitchen. That kitchen might have a wide range of appliances that all communicate to help keep everything running smoothly. As long as each brand of kitchen appliance follows its own standardized protocol, my kitchen works perfectly, provided I stick to a single brand. The fact that my neighbor has a different brand with a totally different set of standards doesn’t matter ...


Read More on Datafloq
The Cold Start Problem with AI

The Cold Start Problem with AI

If you have become a Data Scientist in the last three or four years, and you haven’t experienced the 1990’s or the 2000’s or even a large part of the 2010’s in the workforce, it is sometimes hard to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Industry 4.0 In Internet Of Things And The Benefits Of IoT

Industry 4.0 In Internet Of Things And The Benefits Of IoT

Industry 4.0, or the Fourth Industrial Revolution itself, is the vision and realization of "Intelligent Factories": modular structures and cyber-physical systems that oversee physical processes and build a virtual copy of the real world in order to make decentralized decisions. The Internet of Things and cyber-physical systems communicate constantly with each other, and with humans, through cloud computing.

This industry connects machines, assets and systems, enabling companies to create intelligent networks along the entire value chain and control production autonomously. In this way, companies can schedule maintenance on their own, anticipate process failures and adjust to changing requirements.

Principles of Industry 4.0

As Industry 4.0 is introduced and developed, several principles may come to define intelligent production systems:

Real-Time Operation Capability

The acquisition and conversion of data in an instant, allowing solutions in real time.

Virtualization

The existence of a virtual copy of the intelligent factory, enabling remote monitoring and tracking of processes through sensors scattered throughout the site.

Decentralization

Decisions and solutions can be made by the cyber-physical systems themselves, based on production needs in real time. The machines will not only receive commands but will also be able to provide information about the work ...


Read More on Datafloq
Future of Work: How Using AI Creates An Enhanced Candidate and Employee Experience

Future of Work: How Using AI Creates An Enhanced Candidate and Employee Experience

Artificial Intelligence (AI) is penetrating industries from healthcare to advertising, transportation, finance, legal, education, and hospitality. Many of us may have already interacted with a...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Pompeo: A couple hundred Russians were killed

Pompeo: A couple hundred Russians were killed

We usually stay away from the news of the day, but it is hard to ignore this one, as we were recently blogging about the topic:

Story: https://www.yahoo.com/news/u-military-killed-apos-couple-181324480.html

Video: https://www.usatoday.com/videos/news/nation/2018/04/12/pompeo-%e2%80%9c-couple-hundred-russians-were-killed%e2%80%9d-syria-shootout/33770113/

In testimony to the Senate Foreign Relations Committee, Mike Pompeo, currently the CIA Director and nominee to serve as Secretary of State, stated that “a couple hundred Russians were killed” by U.S. forces in Syria.

Our discussion of this:

Russian Body Count: Update

More on Russian Body Counts

More Russian Body Counts

Russian Body Counts

New SAS® Viya® enhancements embed transparent AI technology and offer better data governance and improved user productivity

New SAS® Viya® enhancements embed transparent AI technology and offer better data governance and improved user productivity

SAS, the leader in analytics, is embedding more artificial intelligence (AI) and automation in the SAS Platform, making it easier for customers to build AI solutions based on proven and trusted...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
WTF is Machine Learning Anyway?

WTF is Machine Learning Anyway?

In a world we might think is ruled and controlled by tech geeks and data scientists, I am still often hit, during meetings and phone calls with customers, with honest and candid questions about data and analytics, and asked to give my personal take on them.

By virtue of this, I’ve decided to take a shot at a series of posts that answer, as plainly as I possibly can, common questions I receive in my day-to-day life as a consultant and analyst.

Starting with my most popular question nowadays: WTF is machine learning?
So, here we go...

Machine Learning in a Tiny Nutshell

The discipline of machine learning evolved as part of larger fields such as data mining and artificial intelligence (AI), and in many ways it has grown side by side with traditional statistics and other mathematical disciplines.

So, simply put, machine learning is concerned with the development of mathematical models and algorithms that can “learn” from data input, adapt, and subsequently improve their outcomes. The concept of “learning” in machine learning, while far from simple in practice, starts with a simple definition:
  • Learning = Representation + Evaluation + Optimization
In which:

  • Representation: a classifier expressed in a formal language that a computer can handle and interpret;
  • Evaluation: a function that distinguishes good classifiers from bad ones; and
  • Optimization: the method used to search among these classifiers within the language to find the highest-scoring one.
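To make the decomposition concrete, here is a toy sketch in which the representation is a one-dimensional threshold rule, evaluation is plain accuracy, and optimization is a brute-force search over candidates; every name and number is invented for illustration:

```python
# Toy illustration of Learning = Representation + Evaluation + Optimization.
data = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]  # (feature, label) pairs

# Representation: classifiers are threshold rules of the form "x >= t -> 1".
def classify(threshold, x):
    return 1 if x >= threshold else 0

# Evaluation: score a candidate classifier by its accuracy on the data.
def accuracy(threshold):
    return sum(classify(threshold, x) == y for x, y in data) / len(data)

# Optimization: brute-force search for the highest-scoring classifier.
candidates = [0.5, 1.5, 2.5, 3.5]
best = max(candidates, key=accuracy)
print(best, accuracy(best))  # -> 2.5 1.0
```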
From this definition, machine learning proceeds by applying specific learning strategies, including:
  • Supervised learning, which maps data inputs and models them against desired outputs
  • Unsupervised learning, which maps data inputs and models them to find new trends
Of course, derivatives that combine these have appeared, such as semi-supervised learning, opening the door to a multitude of new approaches and to the incorporation of diverse data analysis disciplines into machine learning’s arsenal, as in the case of predictive analytics and pattern recognition.
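A minimal sketch of the two basic strategies, assuming scikit-learn is available; the tiny dataset and every value in it are invented for illustration:

```python
# Supervised vs. unsupervised learning in miniature (requires scikit-learn).
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X = [[1.0], [2.0], [3.0], [4.0]]  # four one-feature samples

# Supervised: inputs are modeled against known, desired outputs.
y = [0, 0, 1, 1]
clf = LogisticRegression().fit(X, y)
print(clf.predict([[3.5]]))  # -> [1]

# Unsupervised: only inputs are given; the model finds structure on its own.
km = KMeans(n_clusters=2, n_init=10).fit(X)
print(km.labels_)  # two groups discovered without any labels
```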
As approaches and algorithms have emerged, they have frequently been organized into taxonomies and classified according to different criteria, including the type of input and output required and their suitability for different situations and use-case scenarios.

Some of these approaches include (in alphabetical order):
  • Association rule learning
  • Artificial neural networks
    • Deep learning
  • Bayesian networks
  • Clustering
  • Decision tree learning
  • Genetic algorithms
  • Inductive logic programming
  • Reinforcement learning
  • Representation learning
  • Rule-based machine learning
    • Learning classifier systems
  • Similarity and metric learning
  • Sparse dictionary learning
  • Support vector machines

Then, What is a Machine Learning Software Solution?

A combination of factors, including the evolution of machine learning approaches and algorithms as well as continuous improvements in software and hardware, has enabled machine learning software to be applied to more types of problems and adopted in an increasing number of business processes.

In essence, a machine learning software solution is simply a piece of software ingrained with specific machine learning features, aimed at solving specific or general problems where machine learning is applicable.

So today it is likely that we, as information workers or as ordinary users of a given piece of software, are in one way or another consuming software that uses some form of machine learning technique.

Then, How Can I Use Machine Learning in My Organization?

As the adoption of machine learning increases, so do the use cases. A brief list describes some uses of machine learning in different industries and lines of business:
  • Recommendation systems. Probably the most common use case: machine learning algorithms analyze the online activity of an organization’s customer base to determine individual and collective preferences, so the system keeps learning about customer behavior and improves its prediction accuracy. Companies including Amazon, Netflix and Best Buy rely on such systems (a minimal sketch follows this list).
  • Marketing personalization. Some organizations apply machine learning techniques to better understand their customers and consequently improve their marketing campaigns. By learning customer behavior, an organization can personalize, for example, which email campaigns a customer should receive, or which direct mailings, coupons or offerings will likely have more impact when shown as “recommended”.
  • Fraud detection. Companies like PayPal use machine learning solutions that analyze all their transactions and learn to distinguish fraudulent transactions from legitimate ones, increasing accuracy over time.
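A toy sketch of the collaborative idea behind recommendation systems, as promised above; the ratings matrix, the user names and the deliberately crude similarity measure are all invented (real systems use cosine or Pearson similarity over much larger matrices):

```python
# Tiny collaborative-filtering flavour: recommend what similar users liked.
ratings = {
    "ana":  {"item_a": 5, "item_b": 4},
    "ben":  {"item_a": 5, "item_b": 4, "item_c": 5},
    "carl": {"item_x": 2},
}

def similarity(u, v):
    """Crude similarity: how many items two users have both rated."""
    return len(set(ratings[u]) & set(ratings[v]))

def recommend(user):
    """Suggest the first unseen item from the most similar other user."""
    peers = sorted((o for o in ratings if o != user),
                   key=lambda o: similarity(user, o), reverse=True)
    seen = set(ratings[user])
    for peer in peers:
        for item in ratings[peer]:
            if item not in seen:
                return item
    return None

print(recommend("ana"))  # -> "item_c", borrowed from the most similar user
```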

These, of course, are just a few examples from a wide set of use cases in industries including healthcare, data security and many others.

So...?

On one hand, today it is not hard to find use cases for machine learning, and the list keeps growing, so if you are looking into adopting a machine learning solution, there is a good chance you will find one that fits your current needs for improving your organization’s analysis capabilities. Also, given the many types of machine learning solutions on the market, both commercial and open source, it might not be cost-prohibitive to at least evaluate some of the available options and get a sense of the benefits of having machine learning capabilities within your organization.

On the other, it is important to note that, as with any other type of software, you will need to do the legwork and assemble a coherent approach for adopting a machine learning initiative in your organization, including a clear definition, scoping and evaluation of your actual needs, which will help you choose the best solution on the market.

One small piece of advice: don’t look for a vanilla solution; look for the one most convenient for your organization.

You can find another example (pun intended) of the use of machine learning and other technologies in Google’s latest product: the bad joke detector.

Finally, you are welcome to leave a comment in the box below or download our very first DoT Industry Note report here.



Data Models: Beauty Is in the Eye of the Implementer

Data Models: Beauty Is in the Eye of the Implementer

The data vault model and data warehouse automation are worth investigating if you are about to embark on a new data warehouse project. In a recent TDWI Upside article, I suggested that data models...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Abstraction and Aggregation in Wargame Modeling

Abstraction and Aggregation in Wargame Modeling

[IPMS/USA Reviews]

“All models are wrong, some models are useful.” – George Box

Models, no matter what their subjects, must always be an imperfect copy of the original. The term “model” inherently has this connotation. If the subject is exact and precise, then it is a duplicate, a replica, a clone, or a copy, but not a “model.” The most common dimension to be compromised is generally size, or more literally the three spatial dimensions of length, width and height. A good example of this would be a scale model airplane, generally available in several ratios from the original, such as 1/144, 1/72 or 1/48 (which are, interestingly, all multiples of 12 … there are also 1/100 kits for the more decimal-minded). These ratios mean that a model airplane at 1/72 scale would be 72 times smaller: take the length, width and height measurements of the real item and divide by 72 to get the model’s values.

If we take the real item’s weight and divide by 72, however, we should not expect the model to weigh that much: weight scales with volume, so even if built from the same materials a 1/72 model would weigh on the order of 72³ (roughly 373,000) times less than the original. Generally, the model has a different purpose than replicating the subject’s functionality. It is helping to model the subject’s qualities, or to mimic them in some useful way. In the case of the 1/72 plastic model airplane of the F-15J fighter, this might be replicating the sight of a real F-15J, to satisfy the desire of the youth to look at the F-15J and to imagine themselves taking flight. Or it might be for pilots at a flight school to mimic air combat with models instead of ha...
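The arithmetic, worked through with illustrative numbers (the length and weight below are only rough, real-world-flavored figures for the sake of the example):

```python
# Worked scale arithmetic for a 1/72 model.
scale = 72

length_real_m = 19.4                                  # roughly an F-15's length
print(round(length_real_m / scale * 100, 1), "cm")    # ~26.9 cm model length

# Weight scales with volume, i.e. with the cube of the linear scale:
weight_real_kg = 12_700                               # illustrative empty weight
print(round(weight_real_kg / scale, 1), "kg")         # naive /72 gives ~176 kg, clearly wrong
print(round(weight_real_kg / scale**3 * 1000, 1), "g")  # same-material model: ~34 g
```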

The model aircraft is a simple physical object; once built, it does not change over time (unless you want to count dropping it and breaking it…). A real F-15J, however, is a dynamic physical object, which changes considerably over the course of its normal operation. It is loaded with fuel and ordnance, both of which have a huge effect on its weight, and thus its performance characteristics. Also, it may be crewed by different pilots, whose experience and skills may vary considerably. These qualities of the unit need to be taken into account if the purpose of the model is to represent the aircraft. The classic example of this is a flight envelope model of an F-15A/C:

[Quora]

This flight envelope is itself a model: it represents the flight characteristics of the F-15 using two primary quantitative axes, altitude and speed (in Mach number), along with throttle setting. Perhaps the most interesting thing about it is the realization that an F-15 slows down as it descends. Are these particular qualities of an F-15 required to model air combat involving such an aircraft?

How to Apply This Modeling Process to a Wargame?

The purpose of the war game is to model or represent the possible outcome of a real combat situation, played forward in the model at whatever pace and scale the designer has intended.

As mentioned previously, my colleague and I are playing Asian Fleet, a war game that covers several types of naval combat, including those involving air units, surface units and submarine units. This was published in 2007, and updated in 2010. We’ve selected a scenario that has only air units on either side. The premise of this scenario is quite simple:

The Chinese air force, in trying to prevent the United States from intervening in a Taiwan invasion, will carry out an attack on the SDF as well as the US military base on Okinawa. Forces around Shanghai consisting of state-of-the-art fighter bombers and long-range attack aircraft have been placed for the invasion of Taiwan, and an attack on Okinawa would be carried out with a portion of these forces. [Asian Fleet Scenario Book]

Of course, this game is a model of reality. The infinite geospatial and temporal possibilities of space-time so familiar to us have been replaced by highly aggregated discrete buckets, such as turns that may last a day, or eight hours. Latitude, longitude and altitude are replaced with a two-dimensional hexagonal “honey comb” surface. Hence, distance is no longer computed in miles or meters, but rather in “hexes”, each of which is about 50 nautical miles. Aircraft are effectively aloft, or on the ground, although a “high mission profile” will provide endurance benefits. Submarines are considered underwater, or may use “deep mode” attempting to hide from sonar searches.
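As an aside, distance on such a hex grid is easy to compute in “cube” coordinates, a standard technique for hexagonal maps; the positions below are invented, and only the ~50 nautical miles per hex comes from the game:

```python
# Hex-grid distance in cube coordinates (x + y + z == 0 for every hex).
NM_PER_HEX = 50  # the game's approximate scale

def hex_distance(a, b):
    """Number of hexes between two cube-coordinate positions."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]), abs(a[2] - b[2]))

base = (0, 0, 0)            # invented positions, for illustration only
strike_group = (-6, 8, -2)
print(hex_distance(base, strike_group) * NM_PER_HEX, "nm")  # 400 nm
```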

Maneuver units are represented by “counters” or virtual chits to be moved about the map as play progresses. Their level of aggregation varies: large and powerful ships and subs are represented individually, smaller surface units and weaker subs are grouped and represented by a single counter (a “flotilla”), and squadrons or regiments of aircraft are represented by a single counter. Depending upon the nation and the military branch, this may be as few as 3-5 aircraft in a maritime patrol aircraft (MPA) detachment (“recon” in this game), roughly 10-12 aircraft in a bomber unit, or 24 or even 72 aircraft in a fighter unit (“interceptor” in this game).

Enough Theory, What Happened?!

The Chinese Air Force mobilized its H6H bombers, escorted by large numbers of Flankers (J11 and Su-30MK2 fighters), from the Shanghai area and headed east towards Okinawa. The US Air Force F-15Cs, supported by an airborne warning and control system (AWACS), detected this inbound force and delayed engagement until their Japanese F-15J unit on combat air patrol (CAP) could support them, then engaged the Chinese force about 50 miles from the AWACS orbits. In this game, air combat is broken down into two phases: long-range air-to-air (LRAA) combat (aka beyond visual range, BVR), and “regular” air combat, or within visual range (WVR) combat.

In BVR combat, only units marked as equipped with BVR capability may attack:

  • 2 x F-15C units have a factor of 32, scoring a hit in 5 out of 10 cases, or 50%.
  • The Su-30MK2 unit has a factor of 16, scoring a hit in 4 out of 10 cases, or 40%.

To these numbers a +2 modifier applies when the attacker is supported by AWACS, so the odds of scoring a hit increase to 70% for the F-15Cs … but in our example they miss, and the Chinese shot misses as well. Thus, the combat proceeds to WVR.
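A sketch of that BVR hit check as code; the 5-in-10 and 4-in-10 base chances and the +2 AWACS modifier come from the text above, while reading the table as “roll at or under the target on a d10” is an assumption about the game’s mechanics:

```python
import random

def bvr_shot(base_hits_in_ten: int, awacs: bool = False) -> bool:
    """Illustrative d10 hit check: 5-in-10 for the F-15C pair, 4-in-10
    for the Su-30MK2, +2 when supported by AWACS (per the text).
    Treating this as roll-at-or-under is an assumption."""
    target = base_hits_in_ten + (2 if awacs else 0)
    return random.randint(1, 10) <= target

# F-15Cs with AWACS support should hit about 70% of the time.
trials = 100_000
hits = sum(bvr_shot(5, awacs=True) for _ in range(trials))
print(hits / trials)  # ~0.70
```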

In WVR combat, each opposing side sums their aerial combat factors:

  • 2 x F-15C (32) + F-15J (13) = 45
  • Su-30MK2 (15) + J11 (13) + H6H (1) = 29

These two numbers are then expressed as an attacker-to-defender ratio (45:29) and rounded down in favor of the defender (1:1), and then a ten-sided die (d10) is rolled to consult the Air-to-Air Combat Results Table, on the “CAP/AWACS Interception” line. The die was rolled, and a result of “0/0r” was achieved, which basically says that neither side takes losses, but the defender is turned back from the mission (“r” being code for “return to base”). Given the +2 modifier for the AWACS, the worst outcome for the Allies would be a mutual return-to-base result (“0r/0r”). The best outcome would be inflicting two “steps” of damage and sending the rest home (“0/2r”). A step of loss is about one half of an air unit, represented by flipping over the counter or chit and operating with the combat factors at about half strength.
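A sketch of the WVR bookkeeping; the combat factors and the round-down-in-the-defender’s-favor rule come from the text, but the combat results table here is a one-entry stand-in, since the real table is not reproduced in the post:

```python
def odds_column(attacker: int, defender: int) -> str:
    """Integer odds ratio, rounded down in the defender's favor."""
    if attacker >= defender:
        return f"{attacker // defender}:1"   # e.g. 45:29 -> "1:1"
    return f"1:{defender // attacker}"

# Stand-in for the Air-to-Air CRT's "CAP/AWACS Interception" line;
# only the die roll actually made in the game is filled in.
crt_line = {("1:1", 0): "0/0r"}  # no losses; defender returns to base

attacker = 32 + 13        # 2 x F-15C + F-15J = 45
defender = 15 + 13 + 1    # Su-30MK2 + J11 + H6H = 29
column = odds_column(attacker, defender)
print(column, crt_line[(column, 0)])  # -> 1:1 0/0r
```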

To sum this up, as the Allied commander, my conclusion was that the Americans were hung-over or asleep for this engagement.

I am encouraged by some similarities between this game and the fantastic detail that TDI has just posted about the DACM model, here and here. Thus, I plan not only to dissect this Asian Fleet game (VGAF), but also to do a gap analysis between VGAF and DACM.

How the Internet of Things Changes Big Data Analytics

How the Internet of Things Changes Big Data Analytics

The internet of things is going to have a dramatic and far-reaching impact on the world that we can’t even imagine. By the year 2020, there will be somewhere in the vicinity of 28 billion sensors...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Tokenomics: Why and How Tokens Fuel the Decentralised Economy

Tokenomics: Why and How Tokens Fuel the Decentralised Economy

The decentralised economy is booming, despite heavy losses on the crypto market in recent months. In the first quarter of 2018, venture fundraising for blockchain startups already amounted to more than 40% of the total VC funding raised in 2017, promising a good year for crypto startups where VC investment is concerned. On the other hand, nearly 50% of the companies that did an ICO in 2017 have already failed, despite having raised over $104 million.

Despite differences in industry, location, and the product or service on offer, these startups all have one thing in common: they use some sort of token as the key enabler of their platform. Startups have multiple options when selecting the type of crypto token, and the token economics they opt for influences the likelihood of success for the crypto startup. Therefore, let’s dive into token economics to understand how this new component of the economy works.

Four Types of Tokens

Tokens are the fuel of the decentralised economy and a token has been described as:

“A unit of value that an organisation creates to self-govern its business model, and empower its users to interact with its products while facilitating the distribution and sharing of rewards and benefits to all of its stakeholders.” – ...


Read More on Datafloq
GDPR and protecting data privacy with cryptographic pseudonyms

GDPR and protecting data privacy with cryptographic pseudonyms

Within two years, most of today’s cybersecurity technologies will be obsolete. Since the beginning of 2016, hackers have stolen more than 8 billion records — more than double the two previous years...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Big Data Allows Pre-emptive Healthcare to Prevent Disease

How Big Data Allows Pre-emptive Healthcare to Prevent Disease

Big data, robotics and Artificial Intelligence (AI) are radically changing the way clinicians diagnose and treat disease. Gone are the days of “one size fits all” treatment protocols. Instead,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Data Preparation and Data Wrangling Best Practices – Part 1

Data Preparation and Data Wrangling Best Practices – Part 1

Rekha Sree is a Customer Success Architect, using her expertise in Data Integration, Data Warehouse and Big Data to help drive customer success at Talend. Prior to joining Talend, Rekha worked at...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Will Blockchain and Smart Tech Disrupt the Energy Market?

Will Blockchain and Smart Tech Disrupt the Energy Market?

Energy is power. That doesn’t sound like a very controversial statement, but if Solar Bankers gains momentum through its revolutionary use of blockchain, solar energy could disrupt the current infrastructure. Should power companies be concerned? That depends on just how disruptive Solar Bankers’ model becomes. And it could become very, very disruptive. 

As it stands, utilities benefit a great deal from blockchain and cryptocurrencies like Bitcoin and Ether because mining consumes a lot of energy. The nature of the cryptocurrency means miners are continually adding new blocks to the chain. This process creates an ever-increasing demand for energy worldwide.
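The mechanism behind that demand is proof-of-work: miners grind through hashes until one falls under a difficulty target, so hashing, and therefore electricity, is the price of adding a block. A toy sketch of the idea; real Bitcoin double-SHA-256-hashes 80-byte block headers against a vastly harder target:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 digest starts with
    `difficulty` zero hex digits. Illustrative only; it merely shows why
    mining burns CPU time (and therefore energy)."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

# Each extra required zero multiplies the expected work (and energy) by 16.
print(mine("example block", 4))  # thousands of hashes even at this toy level
```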

Bitcoin’s Energy Problem

Bitcoin alone consumes more energy than Bangladesh and sits just below Israel on the energy consumption ladder. Because it’s a distributed system, Bitcoin’s energy consumption isn’t confined to a single data centre in a single country. Every miner contributes to skyrocketing energy usage, and the incentive is worldwide: the more energy you use, the more Bitcoin you earn. This burdens utility infrastructures worldwide and taxes grids so that energy companies have to generate more power. More often than not, the energy isn’t clean; it comes from fossil fuels such as coal and natural gas. The Guardian’s Alex Hern points out that Bitcoin uses as ...


Read More on Datafloq
Deep Learning vs Classical Machine Learning

Deep Learning vs Classical Machine Learning

Over the past several years, deep learning has become the go-to technique for most AI type problems, overshadowing classical machine learning. The clear reason for this is that deep learning has...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Employee Training and Big Data Should Work Together

Why Employee Training and Big Data Should Work Together

The flood of data available today is growing by leaps and bounds as expanding networks are able to capture real-time user decisions in an instant. The ability to analyze this data for trends and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How GDPR Drives Real-Time Analytics

How GDPR Drives Real-Time Analytics

New reforms under the General Data Protection Regulation (GDPR) started as an attempt to standardise data protection regulations in 2012. The European Union intends to make Europe “fit for the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Here’s how the US needs to prepare for the age of artificial intelligence

Here’s how the US needs to prepare for the age of artificial intelligence

Politicians worldwide are stealing one of the US government’s best ideas by drawing up ambitious plans to make the most of advances in artificial intelligence. These AI manifestos, penned in Paris,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Does Spotify Know You So Well? – Member Feature Stories – Medium

How Does Spotify Know You So Well? – Member Feature Stories – Medium

Let’s dive into how each of these recommendation models works! Recommendation Model #1: Collaborative Filtering First, some background: When people hear the words “collaborative filtering,” they...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Tapping The Power Of Data Analytics To Boost Business

Tapping The Power Of Data Analytics To Boost Business

Continuum Analytics develops systems for open source analytics based on Python (a programming language). For the most part, Travis spends a lot of his time turning massive...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
