The Machine Learning Algorithms Used in Self-Driving Cars

Machine Learning applications include evaluation of driver condition or driving scenario classification through data fusion from different external and internal sensors. We examine different...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Marketing Needs Quality Data before Big Data and Predictive Analytics

Recent marketing hype has been about new analytics and big data, and becoming marketing technologists. However, there are some fundamentals which must first be addressed, and a key stumbling block to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Major AI Trends of 2018

Humans have always been thrilled with the concept of human-like robots and Artificial Intelligence (A.I.). Hollywood movies and science fiction have perhaps inspired several scientists to start...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Measuring the Effects of Combat in Cities, Phase III – part 2.1

I forgot to include a cool graphic with the charts I posted in the Phase III – part 2 discussion:

This is on page 61 of the Phase III report. It is also on page 260 of my book War by Numbers.

There is some explanatory text for this chart on pages 60-61 (and pages 259-261 of my book War by Numbers). The text from the report is below:

Over time one may note that the average weighted percent-loss-per-day in urban operations from 1943 to 2003 – a 60-year time span – ranges from 0.50 to 0.71 if Soviet attacks are excluded. In contrast, the average weighted percent-loss-per-day in non-urban terrain ranges from 0.76 to 1.27 if the Soviet attacks and Tet are excluded.

These data can be plotted over time by simply inserting the various percentage-loss-per-day figures for each of the engagements under the appropriate year. To do so we have eliminated the Eastern Front Soviet attacks (urban and non-urban) and the Tet Offensive non-urban outliers and have normalized the intervening years where there are no data points. The result is interesting and clearly establishes that over the last 60 years urban warfare has remained less intense than non-urban warfare (at least at the division level, and as measured by percent-loss-per-day).
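
The plotting procedure described in this paragraph is straightforward to reproduce in outline. Below is a minimal sketch, assuming a flat table of engagements with year, terrain type, attacker strength per day, and attacker casualties per day; the column names and sample rows are illustrative placeholders rather than values taken from the report.

```python
# A minimal sketch (not TDI's actual code) of the plotting procedure described
# above: compute percent-loss-per-day per engagement, average by year and
# terrain, exclude the outliers beforehand, and interpolate the years with no
# data points. Sample rows are placeholders, not values from the report.
import pandas as pd
import matplotlib.pyplot as plt

engagements = pd.DataFrame([
    # year, terrain, attacker strength per day, attacker casualties per day
    (1943, "urban",     30000, 150),
    (1943, "non-urban", 20000, 230),
    (1968, "urban",      6000,  40),
    (1968, "non-urban", 13000,  25),
], columns=["year", "terrain", "strength_per_day", "cas_per_day"])

engagements["pct_loss_per_day"] = (
    100 * engagements["cas_per_day"] / engagements["strength_per_day"]
)

# Average by year and terrain, then fill ("normalize") the intervening years
# that have no engagements by linear interpolation.
trend = (engagements
         .groupby(["terrain", "year"])["pct_loss_per_day"].mean()
         .unstack("terrain")
         .reindex(range(1943, 2004))
         .interpolate())

ax = trend.plot(title="Average percent-loss-per-day, urban vs. non-urban")
ax.set_ylabel("% loss per day")
plt.show()
```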

It is notable that the sole point at which the two lines intersect – during the 1973 Arab-Israeli War – may actually shed some light upon why the belief exists that urban warfare is more costly and/or intense than combat in other types of terrain. Quite simply, the urban case in the 1973 War – the Battle of Suez City – is a unique engagement, the only one of the 32 engagements from that war fought in urban terrain. And it is one of the few cases that we have found where division-level urban combat was as intense as the average non-urban combat during the same campaign. Overall, in just seven of the 31 non-urban engagements in the 1973 War was the attacker percent-loss-per-day higher than the 1.57 percent found at Suez City, and in only two of those were the attackers Israeli. Nor were the Israeli armor losses at Suez City extraordinary; they amounted to only about 11 tanks, for a loss rate of just 4.6 percent-per-day. This may be contrasted to the 11.43 percent-per-day armor loss that the Israelis averaged in the nine non-urban attacks they made against the Egyptians in the 1973 War.[1]

That Suez City stands out as unique should hardly be surprising. What is surprising is that it – and the few other possible outliers we have found – has become identified as the “typical” urban battle rather than as a unique case. In that respect Suez City and the other outliers may provide copious lessons to be learned for future battles in urban terrain, but they should not be accepted as the norm. On that note, however, it is somewhat depressing to see that many lessons of urban warfare apparently learned by the different combatants in World War II had to be forcibly relearned in later wars. That the mistakes made in earlier urban battles – such as sending unsupported armor into built-up areas – are repeated over and over again in later wars is more than somewhat perplexing. Worse, we have been unable to find any example in World War II of the misemployment of armor in an urban environment that mirrors the foolishness exhibited by the attackers at Suez City or Grozny. Thus it could be supposed that any benefit of technological evolution in warfare over time might be counterbalanced in part by the simple failure to draw adequate lessons from the past.

[1] The highest rate was at Chinese Farm I, when the Israeli armor loss was 24.40 percent-per-day.

The 5 Clustering Algorithms Data Scientists Need to Know

Clustering is a Machine Learning technique that involves the grouping of data points. Given a set of data points, we can use a clustering algorithm to classify each data point into a specific group....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The 10 Coolest Business Intelligence and Data Analytics CEOs

The chief executive officer (CEO) is the highest-ranked executive in a company. The CEO has many responsibilities, ranging from setting strategy and direction to configuring the company’s culture,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Python Is The Top Programming Language For Big Data

Everybody is well aware of the fact that we’re now in the era of big data, where virtually all decisions made by major businesses and even government entities are being facilitated with the help of a...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Criminals are Using Big Data for Their Crimes

For all the advantages big data has given to organizations, one that has proven especially beneficial is its use in tracking down and capturing criminals. The utilization of massive sets of data to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Ways of Big Data in the Business Marketing Process

Big data is more than a buzzword today. It represents great amounts of data that could well change all facets of life, from boosting healthcare outcomes to helping manage traffic levels in metro areas and,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Will artificial intelligence replace humans?

We have entered the “second machine age.” The first machine age began with the industrial revolution, which was driven primarily by technology innovation. The ability to generate massive amounts of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Cognitive Analytics Answers the Question: What’s Interesting in Your Data?

Dimensionality reduction is a critical component of any solution dealing with massive data collections. Being able to sift through a mountain of data efficiently in order to find the key descriptive,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Most People Get Wrong About Data Lakes

The technology industry has continued to find new ways to interpret big data, to develop artificial intelligence, create backup solutions, and expand the cloud into a platform for businesses. One...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The rise of modern architectures based on change data capture technology

Data is creating massive waves of change and giving rise to a new data-driven economy that is only in its infancy. Organizations in all industries are shifting their business models to monetize data,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Measuring the Effects of Combat in Cities, Phase III – part 2

Another part of our Phase III effort was to look at post-World War II cases. This is, by its nature, invariably one-sided data. Maybe at some point we will get the Chinese, North Koreans, Vietnamese, Syrians, etc. to open up their archives to us researchers, but, except for possibly Vietnam, I don’t think that is going to happen any time in the near future. So, we ended up building our post-World War II cases primarily from U.S. data.

We added 10 engagements from the Inchon/Seoul operation in 1950. For Vietnam we added 65 division-level urban engagements from the Tet Offensive in 1968 and 57 division-level non-urban engagements. We also added 56 battalion-level urban engagements from the Tet Offensive (all in Hue). We had 14 division-level urban engagements and 65 division-level non-urban engagements from various contingencies and conventional operations from 1944 to 2003. This included the ELAS Insurgency, the Arab-Israeli Wars, Panama, Mogadishu, the 1991 Gulf War, and Baghdad in 2003. We also added 9 battalion-level urban cases, mostly from Beirut 1982-1984.

To add it all up, this was:

                             Urban    Non-urban
Phase I (ETO)                   46           91
Phase II (Kharkov/Kursk)        51           65
Phase III (Manila/PTO)          53           41
Post-WWII – Division-level      89          123
Post-WWII – Battalion-level     65            0
                             -----        -----
Total cases                    304          319

This is a lot of cases for comparisons.

Just to show how they match up (from page 28 of the report):

Attackers in Division-Level Engagements:

Urban

                     PTO      Kor      Tet      Oth      ETO  EF (Ger Atk)  EF (Sov Atk)
Avg Str/day       12,099   28,304    6,294   10,903   34,601        17,080        17,001
Avg Cas               78       30       94      254      178            86           371
Avg Cas/day           78       30       39       59      169            86           371
Avg % Loss/day      0.63     0.71     0.78     0.56     0.50          0.49          1.95
Wgt % Loss/day      0.65     0.71     0.62     0.54     0.49          0.50          2.18

Non-urban

                     PTO      Tet      Oth      ETO  EF (Ger Atk)  EF (Sov Atk)
Avg Str/day       17,445   13,232   18,991   21,060        27,083        27,044
Avg Cas              663       44      377      469           276           761
Avg Cas/day          221       22      191      237           206           653
Avg % Loss/day      0.83     0.19     1.56     1.09          1.00          2.39
Wgt % Loss/day      1.27     0.17     1.01     1.13          0.76          2.41
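
For readers wondering how the two bottom rows of these tables differ, a plausible reading (an assumption on my part, not a definition quoted from the report) is that the average figure is an unweighted mean of per-engagement daily loss rates, while the weighted figure divides total casualties per day by total strength per day, so that large engagements count for more. A short sketch with made-up numbers:

```python
# Sketch of the presumed distinction between "Avg % Loss/day" and
# "Wgt % Loss/day" in the tables above. The definitions are my assumption,
# and the input numbers are made up for illustration.
def loss_rates(engagements):
    """engagements: list of (avg_strength_per_day, avg_casualties_per_day) pairs."""
    rates = [100 * cas / strength for strength, cas in engagements]
    avg_pct_loss = sum(rates) / len(rates)  # simple mean of per-engagement rates
    wgt_pct_loss = 100 * sum(c for _, c in engagements) / sum(s for s, _ in engagements)
    return avg_pct_loss, wgt_pct_loss

print(loss_rates([(12000, 80), (30000, 150), (6000, 60)]))
```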

I will pick up more on the Phase III effort in a subsequent posting (a part 3 to this series). These charts are also on page 238 of War by Numbers.

P.S. The image was taken from this website: https://vulep-photo.blogspot.com/2013/01/hue-1968-tet-mau-than_3410.html

It says image by Bettmann/CORBIS.

How Big Data Is Impacting E-Commerce In 2018

The world of e-commerce is growing larger at a staggering rate, and few things are driving its rise as much as big data. Certain companies have leveraged the power of data analytics operations to rise to the forefront of modern markets, and it’s only thanks to the power of today’s computers that we can appreciate the modern luxuries provided to us by e-commerce operations.

These are some of the ways that savvy companies are using big data to pioneer the field of e-commerce, and the coming changes we can expect as the rest of 2018 pans out.

A new way of doing business

Brick-and-mortar retailers are a thing of the past now that big data has finally come into its own. Across the US, e-commerce sales continue to spike upwards at a dizzying rate, and the current pivot to digital services we’re seeing in the retail industry doesn’t look like it’s going to come to an end anytime soon. Few members of the public seem to appreciate just how seriously big data has changed their lives when it comes to e-commerce, however, warranting a review of just how important big data is for today’s leading e-commerce giants.

Take Amazon, an international market behemoth that ...


Read More on Datafloq
Why Data Collaboration is the Next Revolution

If you believe, as I do, in the wisdom of the crowd, then it’s time for a wholesale shift in how people think about data discovery and analytics. The current working model in the Big Data space typically has companies keeping all of their work within their walls. When they talk with others in their industry, there’s little collaboration for fear of destroying the value they have created. Naturally companies need to protect their intellectual property and maintain their value. That being said, even companies concerned about protecting their intellectual property can take advantage of a crowd-based model. There are many options when it comes to sharing data and analytics that don’t require exposing privileged information, and not all companies have the same restrictions. What’s right for a Fortune 500 company might not work for a company of 50, or a university researcher, or a not-for-profit.

Large companies have resources that put them at a competitive advantage. Amazon can hire the best Big Data technologists and data scientists. This means that smaller companies have to find ways to be competitive by doing things more creatively. Just because access to tools and people is limited doesn’t mean they ...


Read More on Datafloq
How technology can help tackle the plastics pollution crisis

Retailers are under increasing pressure to cut down on unnecessary packaging, and some shops are showing what a plastic-free, tech-enabled future could look like. It’s a sad fact that plastic...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Recent Developments In “Game Changing” Precision Fires Technology

Nammo’s new 155mm Solid Fuel Ramjet projectile [The Drive]

From the “Build A Better Mousetrap” files come a couple of new developments in precision fires technology. The U.S. Army’s current top modernization priority is improving its long-range precision fires capabilities.

Joseph Trevithick reports in The Drive that Nammo, a Norwegian/Finnish aerospace and defense company, recently revealed that it is developing a solid-fueled, ramjet-powered, precision projectile capable of being fired from the ubiquitous 155mm howitzer. The projectile, which is scheduled for live-fire testing in 2019 or 2020, will have a range of more than 60 miles.

The Army’s current self-propelled and towed 155mm howitzers have a range of 12 miles using standard ammunition, and up to 20 miles with rocket-powered munitions. Nammo’s ramjet projectile could effectively double that, but the Army is also looking into developing a new 155mm howitzer with a longer barrel that could fully exploit the capabilities of Nammo’s ramjet shell and other new long-range precision munitions under development.

Anna Ahronheim has a story in The Jerusalem Post about a new weapon developed by the Israeli Rafael Advanced Defense Systems Ltd. called the FireFly. FireFly is a small, three-kilogram, loitering munition designed for use by light ground maneuver forces to deliver precision fires against enemy forces in cover. Similar to a drone, FireFly can hover for up to 15 minutes before delivery.

In a statement, Rafael claimed that “Firefly will essentially eliminate the value of cover and with it, the necessity of long-drawn-out firefights. It will also make obsolete the old infantry tactic of firing and maneuvering to eliminate an enemy hiding behind cover.”

Nammo and Rafael have very high hopes for their wares:

“This [155mm Solid Fuel Ramjet] could be a game-changer for artillery,” according to Thomas Danbolt, Vice President of Nammo’s Large Caliber Ammunitions division.

“The impact of FireFly on the infantry is revolutionary, fundamentally changing small infantry tactics,” Rafael has asserted.

Expansive claims for the impact of new technology are not new, of course. Orbital ATK touted its XM25 Counter Defilade Target Engagement (CDTE) precision-guided grenade launcher along familiar lines, claiming that “The introduction of the XM25 is akin to other revolutionary systems such as the machine gun, the airplane and the tank, all of which changed battlefield tactics.”

The Army cancelled its contract for the XM25 – which promised a battlefield effect similar to the FireFly’s – in 2017 after disappointing results in field tests.

4 Pitfalls to Avoid When Choosing Tech for Your Business

When you’re chasing new tech for your business, consider these four tips to finding the right technology to propel your business forward. Technology is often thought of as the antidote to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Will an Artificial Intelligence-Enabled Workspace Be Like?

The application of artificial intelligence is still at a nascent stage, not because co-working spaces are reluctant to apply this sophisticated technology but because the potential of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Getting to Trusted Data via AI, Machine Learning, and Blockchain

Establishing trust in data is an essential requirement for businesses and entities for whom credible, reliable information is the lifeblood. As enterprises seek to manage data as an asset, it becomes...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Walking With AI: How to Spot, Store and Clean the Data You Need

Last August, data science leader Monica Rogati unveiled a new way for entrepreneurs to think about artificial intelligence. Modeled after psychologist Abraham Maslow’s five-tier hierarchy of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Deep Learning as a Service: Welcome IBM Watson Studio


  • The next phase of machine learning, deep learning, relies on large amounts of data to train the algorithms. Normally, this would incur high costs, but thanks to IBM’s new Watson Studio costs can be kept low, resulting in a positive ROI for organisations.
  • IBM Watson Studio allows you to train your models and easily add functionality thanks to their APIs using a pay-as-you-go model.
  • IBM Watson Studio solves the main issues of developing neural networks: complexity, standardisation and skills gaps, making deep learning also available to small organisations.


We have entered the data age, where streams of data can offer organisations valuable insights. Obtaining these insights is done using machine learning, and the next generation in machine learning, deep learning, offers businesses the ability to streamline operational processes and cut costs. As long as development costs remain under control, organisations can achieve a positive ROI on their investments.

The challenge lies in effectively leveraging this newly available technology. Smarter machines require more in-depth programming, and that takes time, expertise and tremendous computing power. Now, the introduction of IBM’s deep learning as a service (DLaaS) brings the benefits to companies of all sizes, from larger enterprises all the way to small businesses. Driving the ...


Read More on Datafloq
The four levels of cloud maturity: determining the right path

As a CIO or IT leader, if cloud adoption isn’t at the top of your priority list, then it should be. According to Gartner, cloud adoption strategies will influence more than 50% of IT outsourcing...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Measuring the Effects of Combat in Cities, Phase III – part 1

Now comes Phase III of this effort. The Phase I report was dated 11 January 2002 and covered the European Theater of Operations (ETO). The Phase II report [Part I and Part II] was dated 30 June 2003 and covered the Eastern Front (the three battles of Kharkov). Phase III was completed on 31 July 2004 and covered the Battle of Manila in the Pacific Theater, post-WWII engagements, and battalion-level engagements. It was a pretty far-ranging effort.

In the case of Manila, this was the first time that we based our analysis on one-sided data (U.S. only). In this case, the Japanese tended to fight almost to the last man. We occupied the field of combat after the battle and picked up their surviving unit records. Among the Japanese, almost all died and only a few were captured by the U.S. So, we had fairly good data from the U.S. intelligence files. Regardless, the U.S. battle reports were the best available data on the Japanese. This allowed us to work with one-sided data. The engagements were based upon the daily operations of the U.S. Army’s 37th Infantry Division and the 1st Cavalry Division.

Conclusions (from pages 44-45):

The overall conclusions derived from the data analysis in Phase I were as follows, while those from this Phase III analysis are in bold italics.

  1. Urban combat did not significantly influence the Mission Accomplishment (Outcome) of the engagements. Phase III Conclusion: This conclusion was further supported.
  2. Urban combat may have influenced the casualty rate. If so, it appears that it resulted in a reduction of the attacker casualty rate and a more favorable casualty exchange ratio compared to non-urban warfare. Whether or not these differences are caused by the data selection or by the terrain differences is difficult to say, but regardless, there appears to be no basis to the claim that urban combat is significantly more intense with regards to casualties than is non-urban warfare. Phase III Conclusion: This conclusion was further supported. If urban combat influenced the casualty rate, it appears that it resulted in a reduction of the attacker casualty rate and a more favorable casualty exchange ratio compared to non-urban warfare. There still appears to be no basis to the claim that urban combat is significantly more intense with regards to casualties than is non-urban warfare.
  3. The average advance rate in urban combat should be one-half to one-third that of non-urban combat. Phase III Conclusion: There was strong evidence of a reduction in the advance rates in urban terrain in the PTO data. However, given that this was a single extreme case, TDI still stands by its original conclusion that the average advance rate in urban combat should be about one-half to one-third that of non-urban combat.
  4. Overall, there is little evidence that the presence of urban terrain results in a higher linear density of troops, although the data does seem to trend in that direction. Phase III Conclusion: The PTO data shows the highest densities found in the data sets for all three phases of this study. However, it does not appear that the urban density in the PTO was significantly higher than the non-urban density. So it remains difficult to tell whether or not the higher density was a result of the urban terrain or was simply a consequence of the doctrine adopted to meet the requirements found in the Pacific Theater.
  5. Overall, it appears that the loss of armor in urban terrain is the same as or less than that found in non-urban terrain, and in some cases is significantly lower. Phase III Conclusion: This conclusion was further supported.
  6. Urban combat did not significantly influence the Force Ratio required to achieve success or effectively conduct combat operations. Phase III Conclusion: This conclusion was further supported.
  7. Nothing could be determined from an analysis of the data regarding the Duration of Combat (Time) in urban versus non-urban terrain. Phase III Conclusion: Nothing could be determined from an analysis of the data regarding the Duration of Combat (Time) in urban versus non-urban terrain.

So, in Phase I we compared 46 urban and conurban engagements in the ETO to 91 non-urban engagements. In Phase II, we compared 51 urban and conurban engagements in and around Kharkov to 49 non-urban Kursk engagements. In Phase III, we compared 53 urban and conurban engagements from Manila to 41 non-urban engagements, mostly from Iwo Jima, Okinawa and Manila. The next blog post on urban warfare will discuss our post-WWII data.

P.S. The picture is an aerial view of the destroyed walled city of Intramuros, taken in May 1945.

The Brilliant Ways UPS Uses Artificial Intelligence, Machine Learning And Big Data

savings of up to $50 million per year, UPS has plenty of incentive to incorporate technology to drive efficiencies in every area of its operations. According to, “Our business drives technology at...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Analyze Massive Data Graphs On Your PC With This Device

We like our computers to be general-purpose, but specialty hardware has been creeping in since the beginning. Remember when Intel introduced the 287 math coprocessor, a dedicated chip that only did...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machines learn language better by using a deep understanding of words

Computer systems are getting quite good at understanding what people say, but they also have some major weak spots. Among them is the fact that they have trouble with words that have multiple or...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Death of Supply Chain Management

The supply chain is the heart of a company’s operations. To make the best decisions, managers need access to real-time data about their supply chain, but the limitations of legacy technologies can...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
6 Top Entrepreneurs In Business Intelligence

Business intelligence can be defined as a technology-driven process for analyzing data in order to present tools and resources to directors, managers and others so they are able to make the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Protect Your Privacy Online

Data breaches are common occurrences these days, with personal information routinely stolen or misappropriated from social media, banks, retail outlets and other online sites. From the Cambridge Analytica scandal, which affected an estimated 87 million Facebook users, to the data breach discovered by credit-reporting agency Equifax - which notified 143 million consumers in 2017 that their personal information (including Social Security Numbers, birth dates, addresses, and in some cases drivers' license numbers) and, in a small number of cases, credit card information was exposed - we are witnessing with distressing regularity the insecurity of private, confidential information.

The general rule is to treat your online privacy as you would your money — with the highest level of security to ensure your information does not fall into the wrong hands. Here are some tips to strengthen the safety and security of your online accounts to avoid being the victim of a data breach.

Nothing is Truly Deleted

Every time you're online, you are leaving a trail of activity. From the websites you search for information and online shopping sites to social media posts, including photos and videos, your information is being stored. Even when you delete a social media post or photo, chances are that it's still living ...


Read More on Datafloq
Snowflake: The New Cloud-based Data Warehousing Paradigm

I fly to Boston regularly for business and often sit in its notorious traffic. The city is old and the roads are narrow with cars trying to zip down streets built for horses and carriages. There’s no...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
U.S. and Russian Troops Fight

Just wanted to post up this article by The National Interest….as they linked to our blog in the article: Did U.S. and Russian Troops Fight Their Bloodiest Battle Since World War I in February

We had no idea they were linking to us….I just noticed a few hits from their site, so decided to check. Our link is on the first line of the second page, under “ill-judged attack”: http://nationalinterest.org/blog/the-buzz/did-us-russian-troops-fight-their-bloodiest-battle-since-26280?page=2

Artificial Intelligence and A2P Messaging: Transforming the App Revolution

As marketers seek to improve and expand their customer communication, traditional SMS messaging is quickly giving way to the more dynamic, feature-rich Application to Person (A2P) platform. While these messages are delivered in much the same way, they are significantly more interactive, personal and customizable than a standard text message alone. As the industry expands, it’s become integrated with other technologies as well, chief among them Artificial Intelligence, or AI.

A Brief Look at the History of A2P

Before its use became more widespread, A2P was a platform primarily reserved as a billing mechanism for recurring payments, or to facilitate one-off services, such as voting. While companies used the services for marketing initiatives to some extent, it was primarily reserved for alerts or to deliver simple content.

Yet, as our economy continues to lean toward becoming more app-based than ever before, it comes as no surprise that mobile messaging is following this trend. Now, customers don’t only expect to receive mobile alerts from the companies and brands they follow and patronize. In addition, they also desire that those messages be as comprehensive, relevant and content-rich as possible. Thus, A2P has proliferated throughout the mobile app industry and helped drive the change from SMS to ...


Read More on Datafloq
Business analytics: The essentials of data-driven decision-making

Having big data is not enough: tips to turn it into a business advantage. When it comes to using data to drive business, organizations such as Google or Facebook are iconic. Although a lot can be said...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Public Data Analytics Companies

A long, long time ago (maybe 10 years) the data analytics industry was fairly easy to define and track. Back in that pre-historic era SAS was considered the gold standard of analytics companies with...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
A Developer’s Guide to Salesforce:DX and Custom Code

In the fall of 2017, Salesforce became the “first enterprise cloud software company to break the $10 billion revenue run rate” according to CEO Marc Benioff, cracked the Fortune 500, and topped out at number one on two other significant Fortune rankings: Future 50 Leaders and 100 Best Companies to Work For.

What makes Salesforce so successful? Aside from their unique work environment and community giveback initiatives, Salesforce provides a customer relationship management (CRM) experience that is powerful, highly customizable, and easily accessible.

If you attend any Salesforce event, you will undoubtedly hear the phrase “click, not code.” This is the mantra of developing in the Salesforce environment. Salesforce has shifted the power of customization from the developer to the administrator by providing multiple avenues for “declarative programming” — step-based processes that allow someone with no software development experience to create sufficiently complex functionality with a few clicks.

The Problem With “Click, Not Code”

While this declarative programming model is great for those without software development experience, the reality is that developers are using Salesforce. And for many of us, the limitations of a click-to-code framework are cumbersome and frustrating. We need to write custom code; we need to integrate with external systems; we need the ability to track our changes. While Salesforce provides some capabilities to develop in a more traditional manner, their solutions are tedious, insecure, and mostly incomplete—until now.

As part of their Spring 2018 release, Salesforce brought something new to the table: Developer Experience, more commonly referred to as DX. This powerful set of tools is bringing Salesforce up to speed with the world of custom development, allowing users to approach challenges with an agile methodology and solving many of the platform’s previous problems.

In this blog series, we will discuss four of the most common problems developers have had with Salesforce development, and how DX has improved them:

  1. Custom software development
  2. Single source of truth
  3. Metadata management
  4. Continuous integration and deployment

This blog post will address the first topic: How custom software has historically been done in Salesforce and how DX has dramatically improved the process.

The Problem: Apex Development

Salesforce has had the ability to do custom development in Apex, their own programming language, for a long time, but the methodology around that development left much to be desired. Before DX, developers had two possible approaches: write code in the Salesforce Developer Console or develop locally in your IDE of choice and copy/paste changes into the organization (org).

Salesforce’s Developer Console has its merits, but it lacks much of the functionality developers expect from their IDEs, like keyboard shortcuts, autocomplete, and a visible file structure. It is only accessible through a Salesforce org, so it provides no way to develop locally and offline. Additionally, the debugging capabilities of Apex via the Developer Console are lacking, and the logs provided are confusing and ineffective. To its credit, the Developer Console does have fairly intuitive methods for running tests and test suites, as well as the ability to execute “anonymous Apex� in order to quickly test specific snippets of code.

The copy/paste methodology has some obvious flaws, most notably the risks of overwriting someone else’s changes or directly impacting the production org. While this provides you with the ability to develop offline and in a familiar IDE or text editor, it is insecure and time-consuming. Additionally, at the end of the day, you are still constricted by using the Developer Console to validate and save your code. And no matter which approach you choose, you are left without any history of your changes.

The Solution: Salesforce DX

DX centers around the new Salesforce command line interface (CLI); from retrieving metadata, to creating a project, to exporting and importing data, the CLI is a necessary tool to improve the speed and efficiency of a Salesforce developer. Using the Salesforce CLI, developers now have a solution to some of the major Apex development drawbacks previously mentioned. First and foremost, this effectively eliminates the need to develop directly in the Developer Console, as you are able to create a project with a familiar structure, populate the project with desired Apex classes, and deploy the code directly to a Salesforce org. This ability to deploy to an org also means that developers no longer need to use the copy/paste methodology that is all too common in the Salesforce world.

Not only does DX make deploying code much easier, but it also provides the ability to retrieve code from a Salesforce org and store it locally on your own machine. This allows a developer to work on and update previously written code locally before directly making changes in the org itself.

Finally, with the ability to create projects, DX introduces structure to custom development in Salesforce. Though this may seem like a small improvement, it is significant. In a Salesforce org, all Apex code is lumped into one place, making it difficult to distinguish which classes are supposed to be grouped together. DX gives the developer the ability to modularize the Salesforce org, splitting it up into discrete projects that provide order to what has previously been chaos.
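
To make the workflow just described concrete, here is a rough sketch of the create/deploy/retrieve cycle, wrapped in Python purely for illustration. The sfdx command names and flags are quoted from memory of the CLI of roughly this period and may differ between versions, and the project name, org alias, and paths are placeholders, so treat this as a sketch rather than Salesforce’s canonical recipe.

```python
# A rough sketch of the DX workflow described above, driven from Python via
# the sfdx CLI. Command names and flags (force:project:create,
# force:auth:web:login, force:source:deploy, force:source:retrieve) are
# assumptions based on the CLI of this era; "my_dx_project" and "MyOrg" are
# placeholders.
import subprocess

def sfdx(*args):
    """Run an sfdx CLI command and fail loudly if it returns non-zero."""
    subprocess.run(["sfdx", *args], check=True)

# 1. Create a local project with the familiar DX directory structure.
sfdx("force:project:create", "--projectname", "my_dx_project")

# 2. Authorize the target org once (opens a browser login) and give it an alias.
sfdx("force:auth:web:login", "--setalias", "MyOrg")

# 3. Deploy local Apex classes straight to the org, replacing the old
#    copy/paste step in the Developer Console.
sfdx("force:source:deploy", "--sourcepath", "my_dx_project/force-app",
     "--targetusername", "MyOrg")

# 4. Pull existing code down from the org to work on it locally and offline.
sfdx("force:source:retrieve", "--sourcepath", "my_dx_project/force-app",
     "--targetusername", "MyOrg")
```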

The Drawbacks to Salesforce DX

With the introduction of DX, Salesforce has narrowed the gap between Apex development and standard development; unfortunately, a gap still remains. One of the most noticeable drawbacks is the lack of consistent and comprehensive autocompletion capabilities. DX alone does not provide context awareness, which can make development a tedious trial-and-error process. To mitigate this, Salesforce also released free, supported plug-ins for VSCode that provide much of the functionality and convenience expected from a modern-day IDE. However, while the plug-ins assist with context awareness, it is still difficult to maintain an accurate list of terms for autocompletion, as each Salesforce org has a unique set of metadata.

Another serious drawback is the limited ability to debug Apex code. There currently exists the “checkpoint” method—the Salesforce version of breakpoints—but this is not debugging so much as it is extensive logging. This means that you cannot actually step through your code, but can only look back on code that has already run and view variable logging at the specified checkpoints. Apex development could be greatly improved if more robust debugging capabilities were available.

Finally, with DX alone there is no way to manage any sort of code conflict, so the most recently deployed code always wins. This is probably the biggest drawback of the traditional org-based development model used in the Salesforce world to date. However, DX opens the door to transition development from the org to a source control repository. If you are an experienced developer, this may seem like a given; but for Salesforce developers, this is huge.

Next Time

We’ve focused here on how DX has improved writing custom Apex code, but what about managing those changes via source control? And wouldn’t it be nice if you could also track the changes made to an org through declarative programming? For our next installment in this series, we’re going to dive into the Salesforce methodologies for the source of truth: How it has been done historically, and how DX has changed the game. We’ll also introduce Scratch Orgs, one of the biggest features of DX, and discuss how they give developers the control to effectively manage metadata.

Looking for help with Salesforce development? Reach out to us at findoutmore@credera.com. We are practitioners, not salespeople, so you’ll speak with an actual Credera consultant. We look forward to chatting to see how we might be able to help.

The post A Developer’s Guide to Salesforce:DX and Custom Code appeared first on www.credera.com.

How To Ensure Your Marketers Become Evangelists For Digital Transformation

They say you have to lead by example — and this rings especially true when it comes to digital transformation. However, more than half of those questioned for my company’s 2018 Salesforce Digital...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Ribbentrop Memoirs – 1943

I have been back to doing a lot of work lately on events in July 1943. This led me to the memoirs of Joachim von Ribbentrop, Hitler’s foreign minister. He wrote his memoirs while he was in prison after World War II. In 1946 he was the first Nazi leader to be executed. Below is a very interesting passage covering much of what he had to say about events in late 1942 and all of 1943. It is from pages 168-171. It can be found at: https://archive.org/stream/in.ernet.dli.2015.183521/2015.183521.The-Ribbentrop-Memoirs_djvu.txt

When the Anglo-American landing in North Africa took place in November, 1942, I happened to be in Berlin. The very first reports showed the remarkable tonnage employed — four millions were mentioned. Clearly, an operation of such vast dimensions was very serious, and we had apparently been very wrong in our estimates of enemy tonnage. Indeed, Hitler later admitted as much. Since fortunes in the African theatre had always swayed backwards and forwards, I now feared the worst concerning the Axis position in the Mediterranean.

After contacting the Fuhrer I invited Count Ciano to come to Munich immediately for a conference; the Duce could not be spared to leave Italy. I flew to Bamberg, where I boarded the Fuhrer’s special train, which arrived there from the East.

I briefly reported as follows: The Anglo-American landing was serious, for it showed that our estimates of enemy tonnage, and therefore of the prospects of our U-boat war, had been radically wrong. Unless we could expel the British and Americans from Africa, which seemed very doubtful in view of our transport experiences in the Mediterranean, Africa and the Axis army there were lost, the Mediterranean would be open to the enemy, and Italy, already weak, would be confronted with the gravest difficulties. In this situation the Fuhrer needed a decisive reduction of his war commitments, and I asked for authority to make contact with Stalin through Mme Kollontay, the Soviet Ambassadress in Stockholm; I suggested that, if need be, most of the conquered territories in the East would have to be given up.

To this the Fuhrer reacted most strongly. He flushed, jumped to his feet and told me with indescribable violence that all he wanted to discuss was Africa — nothing else. His manner forbade me to repeat my proposal. Perhaps my tactics should have been different, but I was so seriously worried that I had aimed straight at my target.

Since the previous spring my power of resistance in face of such scenes had declined. It struck me then, as it did on subsequent occasions, that any two men who had had so violent a quarrel as mine with Hitler simply had to part company. Our personal relations had been so shattered that genuine co-operation seemed no longer possible.

There was nothing left for me but to discuss a few details concerning Count Ciano’s visit, and then the Fuhrer curtly ended the interview.

The next few days brought no further opportunity to mention my proposed contact with Stalin, although at that time — before the Stalingrad catastrophe — our negotiating position with regard to Moscow was incomparably stronger than it became soon afterwards. A week later the Russians attacked, our allies on the Don front collapsed, and our Sixth Army’s catastrophe at Stalingrad followed. For the time being, negotiations with Russia were ruled out — especially in the opinion of Hitler.

During the sad days which followed the end of the battle of Stalingrad I had a very revealing talk with Hitler. He spoke, as he often did, of his great admiration for Stalin. In him, he said, one could perceive what one man could mean to a nation. Any other nation would have broken down under the blows of 1941 and 1942. Russia owed her victory to this man, whose iron will and heroism had rallied the people to renewed resistance. Stalin was his great opponent, ideologically and militarily. If he were ever to capture Stalin he would respect him and assign to him the most beautiful palace in Germany. He added, however, that he would never release such an opponent. Stalin had created the Red Army, a grandiose feat. He was undeniably a historic personality of very great stature.

On this occasion and in a later memorandum I again suggested peace feelers to Moscow, but the memorandum, which I asked Ambassador Hewel to present, suffered an inglorious end. Hewel told me that the Fuhrer would have nothing to do with it and had thrown it away. I mentioned the subject once again during a personal conversation, but Hitler replied that he must first be able to achieve a decisive military success; then we could see. Then and later he regarded any peace feeler as a sign of weakness.

Nevertheless, I did make contact with Mme Kollontay in Stockholm through my intermediary, Kleist, but without authority I could do nothing decisive.

After the treachery of the Badoglio Government in September, 1943, I again acted very energetically. This time Hitler was not as obstinate as in the past. He walked over to a map and drew a line of demarcation on which, he said, he might compromise with the Russians. When I asked for authority, Hitler said he would have to think the matter over until the following morning. But when the next day came, nothing happened. The Fuhrer said he would have to consider this more thoroughly. I was very disappointed, for I felt that strong forces had again strengthened Hitler’s inflexible attitude against an understanding with Stalin.

When Mussolini arrived at the Fuhrer’s headquarters after his liberation, the Fuhrer told him, to my surprise, that he wanted to settle with Russia, but when I thereupon asked for instructions I again received no precise answer, and on the following day the Fuhrer once more refused permission for overtures to be made. He must have noticed how dejected I was, for later he visited me in my quarters, and on leaving said suddenly: ‘You know, Ribbentrop, if I settled with Russia today I would only come to grips with her again tomorrow — I just can’t help it.’ I was disconcerted and replied: ‘This is not the way to conduct a foreign policy, unless you want to forfeit confidence.’ My helplessness made me regard the future with gloom.

Source of the picture is: http://andrewvanz.blogspot.com/2012/08/ribbentrop-and-hitler.html

The person who originally posted that picture guesses that the picture was from 1943 taken at Rastenburg Station, East Prussia (which is 5 miles west of Hitler’s headquarters, the Wolf’s Lair).

Why Python Is The Top Programming Language For Big Data

Everybody is well aware of the fact that we’re now in the era of big data, where virtually all decisions made by major businesses and even government entities are being facilitated with the help of a big data analytics program. The rise of the big data age has ignited a fierce fight amongst programmers regarding which language is the best to work with, too, and it’s becoming overwhelmingly clear that a consensus is forming around Python.

Here’s why Python is the top programming language for all things having to do with big data projects, and how you can tap into it to master your data projects in the future.

Experts rely on Python

Let’s be clear about big data – you can rely on a myriad of programming languages, including Java and R, for your big data projects. Nonetheless, experts are overwhelmingly picking Python as their programming language of choice for their data projects, precisely because it offers them the clear and concise language they need to tackle their projects without any hassle. According to a recent developer skills report, Python is becoming more and more commonly accepted across the market, in part because its general-purpose nature helps developers across industries tackle their ...


Read More on Datafloq
The digital transformation and the importance of humans

What role will we as human beings play in the digital world? A world in which robots perform monotonous physical labor faster and with fewer errors than humans. A world in which artificial...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
3 Lessons the Big Brands Hopping on to the Blockchain Can Teach Entrepreneurs

By mirroring strategies undertaken by big brands, startups can find ways to offer unique services taking advantage of the blockchain’s beneficial attributes. With the blockchain bandwagon...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Leadership Traits Required For Digital Transformation Success

Businesses are changing right before our eyes as the digital transformation takes place around the world. And yet many dinosaur leaders, as I like to call them, are still in these businesses risking...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What is performance management? A super simple explanation for everyone

When properly designed and implemented, performance management techniques and processes enable an organisation to monitor, manage and improve strategy execution and the delivery of results....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
“You Can’t Ban Blockchain. It’s Math”: a Talk with Jimmy Wales

This interview has been edited and condensed. ‘Jimmy Wales is good at failure.’ Who would think that? Apparently, Jimmy would, as this is the message he was spreading during his speech at the biggest...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Digitization and IoT – Making the Data Work

One of the big goals of digitization and IoT is to use new data driven insights to create outcomes like improved efficiency and enhanced customer experience. That goal requires a huge jump from raw...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What CIOs need to know about getting started with blockchain

Blockchain has been one of the buzziest tech words of 2018, with companies scrambling to figure out what use cases actually make sense for the distributed ledger technology. In a panel discussion at...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Imagining A Blockchain University

A couple of Oxford faculty imagine a different kind of university, one that is distributed and democratic. Joshua Broggi, Faculty of Philosophy, is the founder of Woolf Development, a platform startup...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Here’s How The UK Government Is Using Big Data For Tax Collection

Governments around the world are stepping up their use of big data for tax. They are depending on it more heavily than ever to ensure compliance and eliminate waste. The UK government has recently...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Is It Better to Manage Your Data Exchanges In-House or With an External Firm?

Data management is a sensitive operation at every stage of the process. And with big data becoming increasingly important to 75 percent of businesses, corporate decision makers and data officers need to think carefully before they decide how they manage their incoming information; is it better to keep things in-house, or work with an external firm?

The Key Concerns

Ultimately, data exchange management comes with the following challenges and concerns:


  • Privacy and security. The global average cost of a data breach for a company is $3.6 million, making privacy and security top concerns for any company that deals with data.
  • Costs. Companies also need to acquire, organise, and manage their data as cost-efficiently as possible. All forms of data management require an investment, but some cost more than others.
  • Time. There’s also a time component to consider; how long does it take to manage your system upkeep with each option?


The Value of External Management

Let’s start by looking at the value of using an external data partner:


  • Existing infrastructure. By working with an external data partner, you can leverage the infrastructure that already exists, instead of building your own. This is important for everything from electronic data interchanges (EDIs) to long-term customer data. Investing in all ...


Read More on Datafloq
How Big Data and Analytics Are Changing Manufacturing for the Better

When most people think about big data and analytics in the business world, their minds immediately jump to ecommerce businesses, websites, and social media. But the truth is, big data is changing...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Four C’s Of Winning With IoT

Executives today need to be experts in both business execution and data curation. According to McKinsey, 90% of all data in the world today has been created in just the past two years. Given the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Windows 10 Mobile Device Management with Intune: Part 1

Corporate laptops on Windows 10 can now be managed and secured more easily. Instead of requiring machines to join an Active Directory domain or to run a separate management client, Microsoft leveraged mobile device management (MDM) based on the Open Mobile Alliance (OMA) specification. With Windows 10 MDM based on an open standard, there are a number of tools that can manage Windows 10 computers, including AirWatch, MobileIron, and Microsoft’s own Intune. This series will demonstrate a Windows 10 MDM implementation with Intune, but the same concepts can be translated to any number of enterprise mobility management systems.

Intune is an integrated part of the Office 365 suite of products and therefore makes a great choice for existing Office 365 customers looking to deploy an MDM solution. For this blog post, we will assume a scenario with an Office 365 customer who currently manages Windows 10 machines with Group Policy in an Active Directory domain that is syncing to Azure AD. Our Windows 10 MDM implementation needs to meet the following objectives to replace the controls in the current imaging and Group Policy implementation:

  • Run on any Windows 10 PC out of the box
  • Add local admin rights for desktop administrators
  • Require BitLocker encryption and backup BitLocker keys
  • Manage Windows Updates setting by user group
  • Upgrade the Windows license to Enterprise
  • Deploy a SCEP certificate for WiFi access
  • Deploy Office 365 ProPlus
  • Deploy a set of custom fonts
  • Allow users to access several office printers

In this section we will cover how to:

  • Run on any Windows 10 PC out of the box
  • Add local admin rights for desktop administrators
  • Require BitLocker encryption and backup BitLocker keys

Configure Azure AD and Intune for Windows MDM

First, we need to tell Azure AD that we want to use Intune for MDM by going to the Azure Active Directory blade in the Azure portal, selecting Mobility, then Microsoft Intune and setting the MDM user scope to Some or All. If you are setting up Windows 10 MDM in a production tenant, it is recommended that you set up a test group to apply MDM policy to while testing.

Users could now use the URLs listed on the Configure page to register devices with Intune, but we can also set up CNAME DNS records that will automate the enrollment process. Details for both of these steps can be found in the Microsoft documentation, Set up enrollment for Windows devices.

Add Local Administrators

While we are still in the Azure AD blade, we can go to Devices, then Device settings, to add additional administrators to the Local Administrators group on Azure AD joined devices. Select the Selected button next to Additional local administrators on Azure AD joined devices and choose the additional users who should be set up as local administrators. The reason “additional” is used in this setting is that Global Administrators and device Owners are set up as Local Administrators by default. If your needs require that device Owners not be administrators, take a look at Windows Autopilot deployment profiles, which just came out of preview as I was writing this.

BitLocker

If the device supports it, Azure AD will automatically enable BitLocker and back up the BitLocker key in Azure AD. However, a number of devices don’t support automatic BitLocker encryption, so we will also include an Intune compliance policy that requires BitLocker encryption and a password for all users on the device.

Figure 1. Compliance Policy – Require BitLocker

Figure 2. Compliance Policy – Require a Password

Users whose devices are not automatically encrypted are prompted to encrypt their device after it is joined to Azure AD and the Intune Compliance policy is applied. This interactive process prevents users from accidentally having their disk encrypted and locked without having the BitLocker key backed up.

Figure 3. End User Encryption Walk-through

Once the BitLocker key is backed up in Azure AD, users can find their own keys in the Profile section of the myapp.microsoft.com portal. Administrators can see the BitLocker key in the Device blade in Azure AD.
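
Everything above is done through the portal and requires no code, but as a point of reference, the same device and compliance information can also be read programmatically through Microsoft Graph. The sketch below is purely illustrative Swift, not part of the original walkthrough; it assumes you already hold an OAuth 2.0 access token granted the appropriate Intune device-read permission, and the function and field handling are placeholders.

  import Foundation

  // Illustrative sketch: list Intune-managed devices via Microsoft Graph.
  // Assumes a valid OAuth 2.0 access token with Intune device-read permission.
  func listManagedDevices(accessToken: String) {
      let url = URL(string: "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices")!
      var request = URLRequest(url: url)
      request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")

      URLSession.shared.dataTask(with: request) { data, _, error in
          guard let data = data, error == nil else { return }
          // Each device entry includes fields such as deviceName and complianceState.
          if let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
             let devices = json["value"] as? [[String: Any]] {
              for device in devices {
                  print(device["deviceName"] ?? "?", device["complianceState"] ?? "?")
              }
          }
      }.resume()
  }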

Moving Forward With MDM

The second part of this series will cover how to:

  • Manage Windows Update settings by user group
  • Upgrade the Windows license to Enterprise
  • Deploy a SCEP certificate for WiFi access

Do you have questions about creating a Windows 10 MDM deployment to meet your needs? Credera has expertise helping clients achieve modern deployment patterns that can streamline the process and free up resources from time-consuming image deployments. We would love to discuss potential cloud and infrastructure solutions with you—contact us at findoutmore@credera.com.

The post Windows 10 Mobile Device Management with Intune: Part 1 appeared first on www.credera.com.

Four Digital Transformation Trends Driving Industry 4.0

Four Digital Transformation Trends Driving Industry 4.0

When you hear the word ‘industry’ you may think of factories or tall smokestacks. And even though technology grows in leaps and bounds daily, this is still our mindset. For manufacturers that want to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Are There Only Three Ways of Assessing Military Power?

Are There Only Three Ways of Assessing Military Power?

military-power[This article was originally posted on 11 October 2016]

In 2004, military analyst and academic Stephen Biddle published Military Power: Explaining Victory and Defeat in Modern Battle, a book that addressed the fundamental question of what causes victory and defeat in battle. Biddle took to task the study of the conduct of war, which he asserted was based on “a weak foundation” of empirical knowledge. He surveyed the existing literature on the topic and determined that the plethora of theories of military success or failure fell into one of three analytical categories: numerical preponderance, technological superiority, or force employment.

Numerical preponderance theories explain victory or defeat in terms of material advantage, with the winners possessing greater numbers of troops, populations, economic production, or financial expenditures. Many of these involve gross comparisons of numbers, but some of the more sophisticated analyses involve calculations of force density, force-to-space ratios, or measurements of quality-adjusted “combat power.” Notions of threshold “rules of thumb,” such as the 3-1 rule, arise from this. These sorts of measurements form the basis for many theories of power in the study of international relations.

The next most influential means of assessment, according to Biddle, involve views on the primacy of technology. One school, systemic technology theory, looks at how technological advances shift balances within the international system. The best example of this is how the introduction of machine guns in the late 19th century shifted the advantage in combat to the defender, and the development of the tank in the early 20th century shifted it back to the attacker. Such measures are influential in international relations and political science scholarship.

The other school of technological determinacy is dyadic technology theory, which looks at relative advantages between states regardless of posture. This usually involves detailed comparisons of specific weapons systems, tanks, aircraft, infantry weapons, ships, missiles, etc., with the edge going to the more sophisticated and capable technology. The use of Lanchester theory in operations research and combat modeling is rooted in this thinking.

Biddle identified the third category of assessment as subjective assessments of force employment based on non-material factors including tactics, doctrine, skill, experience, morale or leadership. Analyses on these lines are the stock-in-trade of military staff work, military historians, and strategic studies scholars. However, international relations theorists largely ignore force employment and operations research combat modelers tend to treat it as a constant or omit it because they believe its effects cannot be measured.

The common weakness of all of these approaches, Biddle argued, is that “there are differing views, each intuitively plausible but none of which can be considered empirically proven.” For example, no one has yet been able to find empirical support substantiating the validity of the 3-1 rule or Lanchester theory. Biddle notes that the track record for predictions based on force employment analyses has also been “poor.” (To be fair, the problem of testing theory to see if it applies to the real world is not limited to assessments of military power; it afflicts security and strategic studies generally.)

So, is Biddle correct? Are there only three ways to assess military outcomes? Are they valid? Can we do better?

Enhancing Customer Insights with Public Location Data

Enhancing Customer Insights with Public Location Data

The increasing availability of location-based data presents marketers with more intricate opportunities for customer service. Integrating location information with social media posts can help firms...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How major retailers are using AI to keep brick-and-mortar alive

How major retailers are using AI to keep brick-and-mortar alive

While the age of the internet began in the last millennium, the ecommerce boom is just reaching its peak. Online retail behemoths like Amazon are easing the shopping process by making products just...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Manage Big Data More Efficiently

How to Manage Big Data More Efficiently

Data management entails governance, organization, and administration of large data files. You may want to archive or send large files of both structured and unstructured data and documents to make sure every department of your company is on the same page. Companies are using big data solutions to keep up with the rapid growth of data pools. Effective management of data enables an organization to locate both structured and unstructured data with ease. Companies collect big data from sources such as social media sites, website, and system logs. However, it's often difficult for starters to manage everything regarding big data with perfection.

Data environments in today's businesses go beyond relational database platforms and traditional data warehouse. You may require technologies to store and process data in non-transactional forms. Big data analytics is slowly changing the way entities function. As such, it's critical to learn to analyze and store data to improve business processes. In fact, effective data management gives a business a more competitive advantage. Here are a few tips to help an enterprise manage its databases.

Outline Your Goals

You need to define your goals to know the data that a business needs to thrive. Otherwise, you might end up with massive ...


Read More on Datafloq
Why Google Duplex Did Not Pass The Turing Test

Why Google Duplex Did Not Pass The Turing Test

Recently, a lot of press was given to Google’s Duplex Artificial Intelligence (AI) bot. Watch a video of the demonstration here. The Duplex bot calls businesses, such as a restaurant, on your behalf and makes a reservation. It does this by actually having a conversation with the human on the other end of the line. The voice is totally realistic, and in test cases, the humans receiving the call were not aware that they had been talking to an AI bot.

We’re going to save for another day the potential ethical issues raised by AI bots fooling people into believing they are talking to a person. Instead, let’s focus on whether or not Duplex has successfully passed the Turing test as some have suggested.

A Quick Review of the Turing Test

The Turing test is a famous set of criteria that assess whether or not AI has become realistic enough to fool humans into thinking that they are really interacting with another person. You can find information on the Turing test here. The crux of the test is the ability for an AI process to seem human to a human.

By the precise letter of the rules, one could argue that the Turing test was passed. ...


Read More on Datafloq
How artificial intelligence is transforming the recruitment process

How artificial intelligence is transforming the recruitment process

“The job search is hard,” says Angela Payne, general manager at Monster Canada. “It’s difficult and nobody really loves the process. It can be very lonely, so it’s imperative that organizations like...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Image Analysis in Marketing: See How it Can Make a Difference

Image Analysis in Marketing: See How it Can Make a Difference

Until recently, the growing popularity of images on social media and in consumer communication has been widening a gap in marketers’ capability to analyze consumer sentiment and preferences.

Now this gap is soon to close as image analysis technology finally gives digital platforms the gift of sight, enabling marketers to monitor not only consumer chatter, but also huge volumes of visual intelligence that was largely hidden from the analytical radar.

In this article, you’ll find a handful of suggestions on incorporating image analysis into your marketing activity, once you have integrated the appropriate software into your marketing, ecommerce, CRM, or ERP solutions.

Hear the Unspoken Sentiments

As you read this article, there is a cacophony of chatter taking place, all about your brand and products. The problem is, no matter how much effort you put into social media listening, you are only capturing a fraction of the sentiment expressed by your existing and future customers.

That’s because people are visually sharing their interactions with your products through the pictures they post on Twitter, Facebook, Pinterest, Instagram and other high-profile platforms. The text accompanying these images often contains no mention of the brand or product. In fact, according to Brandwatch, this is true in 80% of ...


Read More on Datafloq
Should The Marines Take Responsibility For Counterinsurgency?

Should The Marines Take Responsibility For Counterinsurgency?

United States Marines in Nicaragua with the captured flag of Augusto César Sandino, 1932. [Wikipedia]

Sydney J. Freedberg, Jr recently reported in Breaking Defense that the Senate Armed Services Committee (SASC), led by chairman Senator John McCain, has asked Defense Secretary James Mattis to report on progress toward preparing the U.S. armed services to carry out the recently published National Defense Strategy oriented toward potential Great Power conflict.

Among a series of questions that challenge existing service roles and missions, Freedberg reported that the SASC wants to know if responsibility for carrying out “low-intensity missions,” such as counterinsurgency, should be the primary responsibility of one service:

Make the Marines a counterinsurgency force? The Senate starts by asking whether the military “would benefit from having one Armed Force dedicated primarily to low-intensity missions, thereby enabling the other Armed Forces to focus more exclusively on advanced peer competitors.” It quickly becomes clear that “one Armed Force” means “the Marines.” The bill questions the Army’s new Security Force Assistance Brigades (SFABs) and suggests shifting that role to the Marines. It also questions the survivability of Navy-Marine flotillas in the face of long-range sensors and precision missiles — so-called Anti-Access/Area Denial (A2/AD) systems — and asks whether the Marines’ core mission, “amphibious forced entry operations,” should even “remain an enduring mission for the joint force” given the difficulties. It suggests replacing large-deck amphibious ships, which carry both Marine aircraft and landing forces, with small aircraft carriers that could carry “larger numbers of more diverse strike aircraft” (but not amphibious vehicles or landing craft). Separate provisions of the bill restrict spending on the current Amphibious Assault Vehicle (Sec. 221) and the future Amphibious Combat Vehicle (Sec. 128) until the Pentagon addresses the viability of amphibious landings.

This proposed change would drastically shift the U.S. Marine Corps’ existing role and missions, something that will inevitably generate political and institutional resistance. Deemphasizing the ability to execute amphibious forced entry operations would be both a difficult strategic choice and an unpalatable political decision to fundamentally alter the Marine Corps’ institutional identity. Amphibious warfare has defined the Marines since the 1920s. It would, however, be a concession to the reality that technological change is driving the evolving character of warfare.

Perhaps This Is Not A Crazy Idea After All

The Marine Corps also has a long history with so-called “small wars”: contingency operations and counterinsurgencies. Tasking the Marines as the proponents for low-intensity conflict would help alleviate one of the basic conundrums facing U.S. land power: the U.S. Army’s inability to optimize its force structure due to the strategic need to be prepared to wage both low-intensity conflict and conventional combined arms warfare against peer or near-peer adversaries. The capabilities needed for waging each type of conflict are diverging, and continuing to field a general purpose force runs an increasing risk of creating an Army dangerously ill-suited for either. Giving the Marine Corps responsibility for low-intensity conflict would permit the Army to optimize most of its force structure for combined arms warfare, which poses the most significant threat to American national security (even if it is less likely than potential future low-intensity conflicts).

Making the Marines the lead for low-intensity conflict would also play to another bulwark of its institutional identity, as the world’s premier light infantry force (“Every Marine is a rifleman”). Even as light infantry becomes increasingly vulnerable on modern battlefields dominated by the lethality of long-range precision firepower, its importance for providing mass in irregular warfare remains undiminished. Technology has yet to solve the need for large numbers of “boots on the ground” in counterinsurgency. The crucial role of manpower in counterinsurgency makes it somewhat short-sighted to follow through with the SASC’s suggestions to eliminate the Army’s new Security Force Assistance Brigades (SFABs) and to reorient Special Operations Forces (SOF) toward support for high-intensity conflict. As recent, so-called “hybrid warfare” conflicts in Lebanon and the Ukraine have demonstrated, future battlefields will likely involve a mix of combined arms and low-intensity warfare. It would be risky to assume that Marine Corps’ light infantry, as capable as they are, could tackle all of these challenges alone.

Giving the Marines responsibility for low-intensity conflict would not likely require a drastic change in force structure. Marines could continue to emphasize sea mobility and littoral warfare in circumstances other than forced entry. Giving up the existing large-deck amphibious landing ships would be a tough concession, admittedly, one that would likely reduce the Marines’ effectiveness in responding to contingencies.

It is not likely that a change as big as this will be possible without a protracted political and institutional fight. But fresh thinking and drastic changes in the U.S.’s approach to warfare are going to be necessary to effectively address both near and long-term strategic challenges.

Where to Open a Business Bank Account in Europe as a Crypto Startup?

Where to Open a Business Bank Account in Europe as a Crypto Startup?

Many banks around the globe are not very crypto-friendly. Most of them simply refuse any company that has something to do with cryptocurrencies, whether it is a trading company or one running an ICO. These banks would rather stay away from cryptocurrencies, fearing criminal activities such as money laundering.

Although you could argue that the entire point of a crypto startup is not to have a bank account, you almost always still need one, for example, to pay your taxes or your employees. The question then arises: where in the world should you go if you want to open a bank account as a crypto startup? Which country has the most crypto-friendly banks, and which countries should you avoid? Fortunately, some of the smaller banks are open to servicing crypto companies, and they see a lot of opportunities. Here is an overview of where to go in Europe:

The Netherlands

Banking in The Netherlands is difficult if you are a crypto startup. Most of the big banks in The Netherlands refuse to do business with cryptocurrency startups. The main reasons are that the market is still unregulated, the money flows are not transparent, and the risks are too high. The only banks ...


Read More on Datafloq
Guns or hammers? Big-data firms struggle with their role in responsible data use

Guns or hammers? Big-data firms struggle with their role in responsible data use

Looker Inc.’s business intelligence software can be used to deploy education resources more effectively in inner-city neighborhoods, identify hot spots of violent crime and pinpoint income...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Beyond E-Commerce: Three Ways SSL Certificates Help Protect Your Organizational Data

Beyond E-Commerce: Three Ways SSL Certificates Help Protect Your Organizational Data

If you’re an avid online shopper, you may be familiar with SSL certificates. These are noted by the padlock that appears in your browser bar, particularly when you’re attempting to check out from your digital shopping cart. It’s a small icon, but one that packs a powerful punch. It lets you know that as the system intercepts your credit card data, it does so securely, encrypting the numbers so they’re not visible or obtainable by would-be hackers. Another way you can discern whether or not a site is running with an SSL certificate is if the URL prefix begins with “https” rather than “http.”

To date, the e-commerce connection is the most common association between SSL certificates and the consumer. Yet, business leaders should also be aware of the myriad benefits they can reap by incorporating them into their operations and processing. As the costs associated with implementing these certificates are normally minimal to non-existent, it only makes sense for forward-thinking executives to give them a second look, regardless of whether or not they’re in an e-commerce retail marketplace. Here are three reasons why.

1. They encrypt sensitive communications

Even if you aren’t dealing directly with consumers’ financial information, your organization likely transmits sensitive ...


Read More on Datafloq
OpenGive: Serving STEM Nonprofits in Denver

OpenGive: Serving STEM Nonprofits in Denver

On April 28, 2017, Credera brought a group of volunteers together to develop an application to help track student progress for three Denver-based science, technology, engineering, and math (STEM) nonprofits: KidsTek, Open World Learning (OWL), and the Colorado Association of Black Professional Engineers and Scientists (CABPES). It’s part of OpenGive, an open-source program created by Credera with the goal of building a community of socially conscious technology professionals, working together to help nonprofit organizations execute their missions through technology.

“The goal for this application is to create a repository of all the information we keep on our courses and our students,” says Richard Liner, executive director of KidsTek.

“This tool is one that none of the organizations have right now, but something that we all need,” says Mark Smith, a board member with CABPES.

OpenGive is a perfect representation of the heart and values of Credera. It is an opportunity for Credera employees as well as other technologists in the Denver area to use their technology expertise and business acumen to make a lasting difference by serving organizations in their local community.

“The people of Credera take servant leadership very seriously,” says Credera Vice President Derek Knudsen. “It’s what shapes our values and our culture. It’s what makes Credera, Credera.”

“It has been awe-inspiring working with Credera on this activity,” says Lucy Jayes, a development specialist with OWL. “I’m amazed by the dedication of so many people to spend their weekends on something that might not benefit them directly.”

The application is now being used by all three STEM nonprofits. It has significantly improved the way each group serves students by enabling better knowledge management and record keeping, which equips students to continue pursuing education in technology and science fields.

“We are using the platform as a repository for both our student’s work and as a tool for our instructors to be able to see what projects are being completed in other KidsTek classrooms,” says Liner. “The instructor tool aspect was really not contemplated at the outset of the project, but because of the flexibility of the platform, we were able to utilize it for this purpose as well. We’re really thankful to Credera and all of the partners who made this project possible and did such a great job with it.”

To learn more about OpenGive and other ways Credera employees give back to their local communities, visit our Careers page or our #LifeAtCredera blog series.

The post OpenGive: Serving STEM Nonprofits in Denver appeared first on www.credera.com.

Being on the Bleeding Edge – Why Blockchain Scepticism Sounds Just Like the Early Internet Scepticism of the 90s

Being on the Bleeding Edge – Why Blockchain Scepticism Sounds Just Like the Early Internet Scepticism of the 90s

The late 80s and early 90s were known for a lot of things, but universal adoption of the internet was not one of them. Internet technology was still in its infancy and was expensive and difficult to use. Many mainstream experts criticised it as a fad or accused it of having far less potential than what was being “parroted” by its early adopters.

Here is a Newsweek article from the 1990s about the various fallacies of the internet and why it was doomed to fail. The article serves as a great example of how predictions can sometimes turn out to be inaccurate in the realm of emerging technologies. In all fairness though, most of the things mentioned in the article about the internet in 1995 were indeed true.

Internet searches were time-consuming, computers were expensive, and not a lot of people saw online bulletins. However, the fatal flaw in the analysis was that the author was looking at the current state of a new and rapidly evolving technology rather than its future potential. What the author missed was that someone somewhere was aware of all the problems and was working hard to solve them. Much of the scepticism around blockchain is likewise the ...


Read More on Datafloq
How Vineyard Vines Uses Analytics to Win Over Customers

How Vineyard Vines Uses Analytics to Win Over Customers

In July 2016, the eCommerce team at Vineyard Vines set out to find a solution to help them keep pace with their dynamic customer base and stay true to their principles of authentic, relevant, and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Narrative Science: The Leader in Natural Language Generation Technology

Narrative Science: The Leader in Natural Language Generation Technology

Narrative Science helps people understand and communicate what is most important in their data. By transforming data into insightful, human-like language, the company’s natural language generation...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
New Malware Designed to Attack Wireless Routers: Here’s How to Protect Your Data

New Malware Designed to Attack Wireless Routers: Here’s How to Protect Your Data

The concept of malware is nothing new, but as technology grows more sophisticated, the methods that hackers are using to target, infiltrate, and compromise our most secure networks are rapidly changing. Case in point: high-tech malware is now being used to gain access to wireless routers, including those used to power personal home and office communication systems around the world.

On May 25, the FBI issued a statement urging all consumers to reset their routers in response to reports that a new malware attack, supposedly originating in Russia, could put them at risk. Specifically, it could block incoming web traffic from the routers, collect the information transported through them, and even disable the device entirely. At present, the specific malware, named VPNFilter, has affected more than half a million routers across at least 54 countries. As technical experts seek answers and a way to stop the attacks in their tracks, one course of action has become clear: router security must transition from an important initiative to a critical one.

Why Router Security Matters

Especially if you only use your wireless router for at-home, personal web surfing, you might believe its security isn’t all ...


Read More on Datafloq
Why Is Hadoop the Biggest Technology for Data Handling?

Why Is Hadoop the Biggest Technology for Data Handling?

Hadoop is by far the most mainstream execution of MapReduce, being a completely open source platform for working with Big Data. It is sufficiently adaptable to have the capacity to work with various...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
10 Open-Source Tools/Frameworks for Artificial Intelligence

10 Open-Source Tools/Frameworks for Artificial Intelligence

Here are 10 open-source tools/frameworks for today’s hot topic, AI. TensorFlow™ is an open-source software library, which was originally developed by researchers and engineers working on...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Senate Armed Service Committee Proposes Far-Reaching Changes To U.S. Military

Senate Armed Service Committee Proposes Far-Reaching Changes To U.S. Military

Senate Armed Services Committee members (L-R) Sen. James Inhofe (R-OK), Chairman John McCain (R-AZ) and ranking member Sen. Jack Reed (R-RI) listen to testimony in the Dirksen Senate Office Building on Capitol Hill July 11, 2017 in Washington, D.C. [CREDIT: Chip Somodevilla—Getty Images]

In an article in Breaking Defense last week, Sydney J. Freedberg, Jr. pointed out that the Senate Armed Services Committee (SASC) has requested that Secretary of Defense James Mattis report back by 1 February 2019 on what amounts to “the most sweeping reevaluation of the military in 30 years, with tough questions for all four armed services but especially the Marine Corps.”

Freedberg identified SASC chairman Senator John McCain as the motivating element behind the report, which is part of the draft 2019 National Defense Authorization Act. It emphasizes the initiative to reorient the U.S. military away from its nearly two-decade long focus on counterinsurgency and counterterrorism to prioritizing preparation for potential future Great Power conflict, as outlined in Mattis’s recently published National Defense Strategy. McCain sees this shift taking place far too slowly according to Freedberg, who hints that Mattis shares this concern.

While the SASC request addresses some technological issues, its real focus is on redefining the priorities, missions, and force structures of the armed forces (including special operations forces) in the context of the National Defense Strategy.

The changes it seeks are drastic. According to Freedberg, among the difficult questions it poses are:

  • Make the Marines a counterinsurgency force? [This would greatly help alleviate the U.S. Army’s current strategic conundrum]
  • Make the Army heavier, with fewer helicopters?
  • Refocus Special Operations against Russia and China?
  • Rely less on stealth aircraft and more on drones?

Each of these questions relates directly to trends associated with the multi-domain battle and operations concepts the U.S. armed services are currently jointly developing in response to threats posed by Russian, Chinese, and Iranian military advances.

It is clear that the SASC believes that difficult choices with far-reaching consequences are needed to adequately prepare to meet these challenges. The armed services have been historically resistant to changes involving trade-offs, however, especially ones that touch on service budgets and roles and missions. It seems likely that more than a report will be needed to push through changes deemed necessary by the Senate Armed Services Committee chairman and the Secretary of Defense.

Read more of Freedberg’s article here.

The draft 2019 National Defense Authorization Act can be found here, and the SASC questions can be found in Section 1041 beginning on page 478.

See How This Hospital Uses Artificial Intelligence To Find Kidney Disease

See How This Hospital Uses Artificial Intelligence To Find Kidney Disease

According to a June 2018 ABI Research report, the number of patient monitoring devices (which also includes AI for home-based preventative healthcare) that use data to train AI models for predictive...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Organizing an iOS Project

Organizing an iOS Project

When I first learned iOS development, no one taught file organization. I went through many tutorials and learned how to program in Swift, use storyboards and constraints, and other basic skills to get started in iOS. However, these resources didn’t provide any recommendations for file organization and left me unprepared to work in a full enterprise application. Now, after working on multiple iOS projects, I have developed a basic file structure that I use to make the application easier to grasp.

This blog post will walk you through the different directories and the purpose of each one. This approach is a great starting point to get your application off the ground, but remember to tailor it to your specific needs.

The organization in the root project folder consists of the following folders:

  • App
  • Constants
  • Components
  • Extensions
  • Resources
  • Storyboard
  • Models
  • API

App

This folder contains two files in a basic iOS project: “AppDelegate.swift” and “Info.plist”. These files affect the entire app, and are referenced many times throughout development. I place this folder at the top of the root folder to ensure that these files are always easy to find. This folder can also hold configuration files and files that are global to the entire app.
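
For reference, the “AppDelegate.swift” that lives here is essentially the standard project template shown below; the comment simply marks where app-wide setup would go.

  import UIKit

  @UIApplicationMain
  class AppDelegate: UIResponder, UIApplicationDelegate {
      var window: UIWindow?

      func application(_ application: UIApplication,
                       didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
          // App-wide setup (appearance, configuration, analytics) belongs here.
          return true
      }
  }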

Constants

This folder contains a base “Constants.swift” file, along with other “Constants_{feature}.swift” files that are used to keep the constants organized. The way this works is that the “Constants.swift” file holds the class “Constants” and contains any constants that are used app-wide. The other files could include things like “Constants_Error.swift”. This file would be an extension to the “Constants” class and include things like error codes. By using the extension feature in Swift, you can keep your constants organized in different files and still be able to call “Constants.ErrorCodes” throughout your application.
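
A minimal sketch of that pattern might look like the following (the specific constant names and error codes are placeholders, not taken from a real project):

  // Constants.swift - constants used app-wide
  class Constants {
      static let appName = "MyApp"
  }

  // Constants_Error.swift - feature-specific constants kept in an extension
  extension Constants {
      struct ErrorCodes {
          static let networkTimeout = 1001
          static let invalidResponse = 1002
      }
  }

  // Usage anywhere in the app:
  // let code = Constants.ErrorCodes.networkTimeout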

Components

This folder contains any reusable components that your application might need. This could include something like a “BaseViewController.swift” file that extends “UIViewController” and gives all your view controllers some custom base functionality. Depending on how many reusable components you have, this folder might need further organization.
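
As a hypothetical example of that kind of shared base functionality, a “BaseViewController.swift” along these lines gives every subclass a reusable error alert helper (the specifics here are illustrative, not from the original project):

  import UIKit

  // BaseViewController.swift - shared behavior for all view controllers
  class BaseViewController: UIViewController {

      // A helper every screen in the app can reuse.
      func showError(_ message: String) {
          let alert = UIAlertController(title: "Error", message: message, preferredStyle: .alert)
          alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
          present(alert, animated: true, completion: nil)
      }
  }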

Extensions

This folder holds extensions for built-in Swift classes. A file you might find in this folder would be “String.swift”, which would include functions to improve the “String” class, such as a “contains(s: String) -> Bool” function that would return true if a string contains another string. Many useful extensions for common Swift classes can be found with some quick Googling.
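
The “contains(s: String) -> Bool” helper mentioned above could be written roughly as follows. This is only a sketch to show the extension pattern; Foundation’s built-in “contains(_:)” already covers the same need.

  import Foundation

  // String.swift - small additions to the built-in String type
  extension String {

      // Returns true if this string contains the given substring.
      func contains(s: String) -> Bool {
          return range(of: s) != nil
      }
  }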

Resources

This folder contains the “Assets.xcassets” file. Usually that is the only thing in this folder, as it holds all your images; however, if you have any other resources for your project that can’t be added to the assets, they should go in this folder as well.

Storyboard

This folder contains folders for the different flows in your app. For a basic app, this folder could contain the following folders:

  • LaunchScreen
  • Main
  • Home
  • Login
  • Register

Launch Screen

This folder contains “LaunchScreen.storyboard”, which is used to create the screen shown when launching the app.

Main

This folder contains the main storyboard and view controller, which will most likely be used to set up your TabBarController and redirect to other storyboards, if necessary.
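
Redirecting from the main storyboard to a flow defined in another storyboard can be done in code roughly like this (a sketch that assumes a storyboard named “Home” with an initial view controller set):

  import UIKit

  // Presents the initial view controller of the Home storyboard.
  func showHomeFlow(from presenter: UIViewController) {
      let storyboard = UIStoryboard(name: "Home", bundle: nil)
      if let homeViewController = storyboard.instantiateInitialViewController() {
          presenter.present(homeViewController, animated: true, completion: nil)
      }
  }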

Home, Login, Register, etc.

These folders contain the storyboard files and view files responsible for displaying and controlling these different flows. The structure of these folders will vary greatly depending on the complexity of the flow.

Models

This folder contains any models that your app or API will require. If you have a large number of models, then you will need to implement an organization strategy for this folder as well. If you ever find yourself accessing elements in an object directly from JSON or XML, stop and consider creating a model for the object. Models will keep your code clean and maintainable.
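
A hypothetical model for a JSON payload might be as simple as the sketch below (the “Student” type and its fields are placeholders, not from the original project):

  import Foundation

  // Student.swift - a model instead of reading JSON fields directly
  struct Student: Codable {
      let id: Int
      let name: String
      let email: String?
  }

  // Decoding a payload:
  // let student = try JSONDecoder().decode(Student.self, from: jsonData)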

API

This folder will contain any code used to communicate with outside APIs. Also, consider having some base classes in here to take care of boilerplate code that is usually necessary when working with APIs.
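
One way to handle that boilerplate is a small base client like the sketch below; the class name, the generic GET helper, and the error handling are all assumptions for illustration rather than a prescribed design.

  import Foundation

  // APIClient.swift - shared plumbing for talking to an outside API
  class APIClient {
      let baseURL: URL
      private let session = URLSession.shared

      init(baseURL: URL) {
          self.baseURL = baseURL
      }

      // Generic GET that decodes any Codable model.
      func get<T: Codable>(_ path: String, completion: @escaping (Result<T, Error>) -> Void) {
          let url = baseURL.appendingPathComponent(path)
          session.dataTask(with: url) { data, _, error in
              if let error = error {
                  completion(.failure(error))
                  return
              }
              guard let data = data else {
                  completion(.failure(URLError(.badServerResponse)))
                  return
              }
              do {
                  completion(.success(try JSONDecoder().decode(T.self, from: data)))
              } catch {
                  completion(.failure(error))
              }
          }.resume()
      }
  }

For example, calling client.get("students") with a completion of type Result<[Student], Error> would fetch and decode a list of the Student model sketched earlier.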

The Value of Organization

This file structure will give you a clean and organized starting point for your iOS application. As your projects grow, you will need to add more organization in certain sections to keep the structure understandable, but it is worth the effort. The code may run the same with or without great organization, but with it, your projects will be easier for other developers to pick up, and you will be able to maintain your sanity a little bit longer.

In conclusion, next time you find yourself in need of organization for an iOS project, give this structure a try. It has helped me out with my projects, and it can help you too. Feel free to reach out with any questions by emailing us at findoutmore@credera.com.

The post Organizing an iOS Project appeared first on www.credera.com.

Building IoT Data Pipelines with Python and Talend Data Streams

Building IoT Data Pipelines with Python and Talend Data Streams

Benoît Barranco is a Customer Success Architect at Talend. He brings years of experience working with customers and partners across multiple industries, helping them in their solution architectures...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Reasons Why Data Management Is Essential for User Experience

5 Reasons Why Data Management Is Essential for User Experience

Delivering an excellent user experience is essential to attracting and retaining customers. And although data management may not be the first thing that comes to mind when you think about optimizing...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Artificial intelligence and the future of programming

Artificial intelligence and the future of programming

Artificial intelligence is here. It’s in our homes, our pockets and handbags, and it’s creeping into our workplaces. It’s been able to best humans at board games since the 90s, and it’s not half bad at quiz shows, either. As AI continues to grow, the limitations on what it can do are shrinking.

Recently, AI has grown capable of more creative pursuits, such as drawing pictures and composing music. It’s completing increasingly difficult tasks, and every advancement sees it encroaching further into the roles of human employees. Now, not even the programmers of this very AI are safe from automation anxiety. AI isn’t just getting better at recognising a photo of Susan from HR; it’s writing basic code, too.

But does this mean that AI is set to replace programmers completely?

Anxiety, not the truth

Artificial intelligence and automation have been causing anxiety in the workplace for a long time. The continued advancement of artificial intelligence and automation hasn’t done anything to quell these fears, either. Instead, advancements in the field have caused more and more workers to fall foul of automation anxiety – the fear that we humans will soon be replaced with robots.

The fear is understandable enough. After all, AI has steadily ...


Read More on Datafloq
10 Data Science Skills You Need to Improve Project Success

10 Data Science Skills You Need to Improve Project Success

Last week, I identified the top skills across different data science professionals. The results of our survey of 620+ data professionals showed that, while data scientists possess many different...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What is TensorFlow? The machine learning library explained

What is TensorFlow? The machine learning library explained

Machine learning is a complex discipline. But implementing machine learning models is far less daunting and difficult than it used to be, thanks to machine learning frameworks—such as Google’s...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Top 10 HR Analytics tools in 2018

Top 10 HR Analytics tools in 2018

Larger companies call for more employees. And more employees mean that there’s a whole lot of employee data to manage and use. So, your HR team is probably frazzled with a lot of extra work to do....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Influence Can Blockchain Technology Have On The Gaming Industry?

What Influence Can Blockchain Technology Have On The Gaming Industry?

Lately there has been a lot of talk about blockchain technology, which has become a technological phenomenon of rapidly growing proportions. It is said that it may become the most common way to carry out financial transactions in the very near future because of its advantages, although some do not forget the drawbacks of both this technology and the various cryptocurrencies created so far, such as the well-known Bitcoin.

The video game industry is one of the sectors with the highest volume of business in the world, with approximately 2.2 billion users and revenues of around 11 billion dollars, according to data collected by the consultancy Newzoo. In addition, that figure is expected to reach 12.8 billion dollars by 2020.

One of the main reasons this growth is expected is the entry of blockchain technology onto the playing field. Characteristics such as the ability to exchange value between strangers without intermediaries, along with transparency and traceability, are genuinely compelling. These advances could mean significant development, especially in two important respects: market efficiency and safety.

The concept of Blockchain

To better understand what this technology ...


Read More on Datafloq
Measuring the Effects of Combat in Cities, Phase II – part 2

Measuring the Effects of Combat in Cities, Phase II – part 2

There was actually supposed to be a part 2 to this Phase II contract, which was an analysis of urban combat at the army level based upon 50 operations, of which a half-dozen would include significant urban terrain. This effort was not funded.

On the other hand, the quantitative analysis of the battles of Kharkov only took up the first 41 pages of the report. A significant part of the rest of the report was a more detailed analysis and case study of the three fights over Kharkov in February, March, and August of 1943. Kharkov was a large city; according to the January 1939 census, it had a population of 1,344,200, although a Soviet-era encyclopedia gives the pre-war population as 840,000. We were never able to figure out why there was a discrepancy. The whole area was populated with many villages. The January 1939 census gives Kharkov Oblast (region) a population of 1,209,496. This is in addition to the city, so the region had a total population of 2,552,686. Soviet-era sources state that when the city was liberated in August 1943, the remaining population was only 190,000. Kharkov was a much larger city than any of the other ones covered in the Phase I effort (except for Paris, but the liberation of that city was hardly a major urban battle).

The report then does a day-by-day review of the urban fighting in Kharkov. Doing a book or two on the battles of Kharkov is on my short list of books to write, as I have already done a lot of the research. We do have daily logistical expenditures of the SS Panzer Corps for February and March (tons of ammo fired, gasoline used and diesel used). In March, when the SS Panzer Corps re-took Kharkov, we noted that the daily average for the four days of urban combat from 12 to 15 March was 97.25 tons of ammunition, 92 cubic meters of gasoline and 10 cubic meters of diesel. For the previous five days (7-11 March) the daily average was 93.20 tons of ammunition, 145 cubic meters of gasoline and 9 cubic meters of diesel. This does not produce a lot of support for the idea–as has sometimes been expressed (for example, in RAND’s earlier reports on the subject)–that ammunition and other supplies will be consumed at a higher rate in urban operations.

We do observe from the three battles of Kharkov that (page 95):

There is no question that the most important lesson found in the three battles of Kharkov is that one should just bypass cities rather than attack them. The Phase I study also points out that the attacker is usually aware that faster progress can be made outside the urban terrain, and that the tendency is to weight one or both flanks and not bother to attack the city until it is enveloped. This is indeed what happened in two of the three cases at Kharkov and was also the order given by the Fourth Panzer Army that was violated by the SS Panzer Corps in March.

One must also note that since this study began, the United States invaded Iraq and conducted operations in some major urban areas, albeit against somewhat desultory and ineffective opposition. In the southern part of Iraq, the two major port cities, Umm Qasr and Basra, were first enveloped before any forces were sent in to clear them. In the case of Baghdad, it could have been enveloped if sufficient forces were available. As it was, it was not seriously defended. The recent operations in Iraq again confirmed the observations made in the two phases of this study.

P.S. The picture is of Kharkov in 1942, when it was under German occupation.

Data provenance: Be a star of GDPR

Data provenance: Be a star of GDPR

Jean-Michel Franco is Director of Product Marketing for Talend. He has dedicated his career to developing and broadening the adoption of innovative technologies in companies. Prior to joining Talend,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Lack of budget, skills impedes big data analytics

Lack of budget, skills impedes big data analytics

Lack of in-house skills and adequate tools to address data quality issues are some of the main barriers to the adoption of big data by local organisations. This is according to a recent online Big...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Do Smart License Plates and Their Data Mean for Consumers?

What Do Smart License Plates and Their Data Mean for Consumers?

California is the first U.S. state to begin a trial of digital license plates and is focusing on Sacramento for the test. Statistics from late May found that only 116 cars in California have the high-tech plates so far. However, under the pilot program’s rules, that number could rise to approximately 175,000 — equaling one half of 1 percent of the total cars in the state.

The DMV Doesn’t Offer Them

The new plates reportedly include batteries and computer chips and use the same technology as Amazon Kindle devices. Also, instead of metal components, these car attachments have screens covered with a protective material. However, those are not the only characteristics that set these plates apart from conventional license plates.

Drivers will not be able to order the plates from their nearest Department of Motor Vehicles (DMV) locations. Instead, they’ll have to go to participating dealerships to buy the plates and get them installed.

The cost for the plate itself should be around $699, plus whatever dealers decide to charge for installation. Drivers who opt for the digital license plates will also pay a monthly subscription fee of approximately $7.

What Can the Plates Do?

The Sacramento government recently received 24 Chevrolet Bolt vehicles equipped with ...


Read More on Datafloq
Measuring the Effects of Combat in Cities, Phase II – part 1

Measuring the Effects of Combat in Cities, Phase II – part 1

Our first urban warfare report had a big impact. It clearly showed that the intensity of urban warfare was not what some of the “experts” out there were claiming. In particular, it called into question some of the claims being made by RAND. But the report was based upon Aachen, Cherbourg, and a collection of mop-up operations along the Channel Coast. Although this was a good starting point because of the ease of research and availability of data, we did not feel that this was a fully representative collection of cases. We also did not feel that it was based upon enough cases, although we had already assembled more cases than most “experts” were using. We therefore convinced CAA (Center for Army Analysis) to fund a similar effort for the Eastern Front in World War II.

For this second phase, we again assembled a collection of Eastern Front urban warfare engagements in our DLEDB (Division-level Engagement Data Base) and compared it to Eastern Front non-urban engagements. We had, of course, a considerable collection of non-urban engagements already assembled from the Battle of Kursk in July 1943. We therefore needed a good urban engagement nearby. Kharkov is the nearest major city to where these non-urban engagements occurred and it was fought over three times in 1943. It was taken by the Red Army in February, it was retaken by the German Army in March, and it was taken again by the Red Army in August. Many of the units involved were the same units involved in the Battle of Kursk. This was a good close match. It has the additional advantage that both sides were at times on the offense.

Furthermore, Kharkov was a big city. At the time it was the fourth biggest city in the Soviet Union, being bigger than Stalingrad (as measured by pre-war population). A picture of its Red Square in March 1943, after the Germans retook it, is above.

We did have good German records for 1943, and we were able to get access to Soviet division-level records from February, March, and August in the Soviet military archives in Podolsk. Therefore, we were able to assemble all the engagements based upon the unit records of both sides. No secondary sources were used; those that were available were incomplete, usually one-sided, sometimes biased, and often riddled with factual errors.

So, we ended up with 51 urban and conurban engagements from the fighting around Kharkov, along with 65 non-urban engagements from Kursk (we have more now).

The Phase II effort was completed on 30 June 2003. The conclusions of Phase II (pages 40-41) were similar to Phase I:

Phase II Conclusions:

  1. Mission Accomplishment: This [Phase I] conclusion was further supported. The data does show a tendency for urban engagements not to generate penetrations.
  2. Casualty Rates: This [Phase I] conclusion was further supported. If urban combat influenced the casualty rate, it appears that it resulted in a reduction of the attacker casualty rate and a more favorable casualty exchange ratio compared to nonurban warfare. There still appears to be no basis to the claim that urban combat is significantly more intense with regards to casualties than is nonurban warfare.
  3. Advance Rates: There is no strong evidence of a reduction in the advance rates in urban terrain in the Eastern Front data. TDI still stands by its original conclusion that the average advance rate in urban combat should be one-half to one-third that of nonurban combat.
  4. Linear Density: Again, there is little evidence that the presence of urban terrain results in a higher linear density of troops, but unlike the ETO data, the data did not show a tendency to trend in that direction.
  5. Armor Losses: This conclusion was further supported (Phase I conclusion was: Overall, it appears that the loss of armor in urban terrain is the same as or less than that found in nonurban terrain, and in some cases is significantly lower.)
  6. Force Ratios: The conclusion was further supported (Phase I conclusion was: Urban combat did not significantly influence the Force Ratio required to achieve success or effectively conduct combat operations).
  7. Duration of Combat: Nothing could be determined from an analysis of the data regarding the Duration of Combat (Time) in urban versus nonurban terrain.

There is a part 2 to this effort that I will pick up in a later post.

Healthcare CIO roles shift as IT becomes more complex

Healthcare CIO roles shift as IT becomes more complex

Healthcare IT executives and thought leaders are making a transition within their roles. In the past, they’ve been called upon to manage very technical responsibilities—in the last decade, that’s...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Is Wrong with Business Intelligence?

What Is Wrong with Business Intelligence?

I attended the IBM Business Analytics Analyst Summit in Ottawa and while I can’t tell you much about what was discussed there due to confidentiality restrictions that will be released shortly, I can...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How The Online Gaming Industry Uses Big Data Analytics To Grow

How The Online Gaming Industry Uses Big Data Analytics To Grow

Online casinos are fast becoming one of the most lucrative industries in the world. It seems that these venues are experiencing unstoppable growth, with more and more people getting into the games as...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How AI Will Alter the Digital Marketing Landscape

How AI Will Alter the Digital Marketing Landscape

Omnichannel, growth hacking, attribution, automation, micro-moments, gamification, agile and key performance indicators. There is no shortage of marketing buzzwords with short-term industry hype....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
4 reasons blockchain could improve data security

4 reasons blockchain could improve data security

It’s easy to lose sight of the potential benefits and weaknesses of any new technology when the hype reaches fever pitch in the way that it has with blockchain. The global blockchain technology...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Keeping data fresh for wireless networks

Keeping data fresh for wireless networks

For wireless networks that share time-sensitive information on the fly, it’s not enough to transmit data quickly. That data also need to be fresh. Consider the many sensors in your car. While it may...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Unified analytics underpins the modern digital estate for AI

Unified analytics underpins the modern digital estate for AI

AI has enormous promise – driving disruptive innovation in every industry.  Leaders will embrace AI and drastically change how they interact with their customers, their suppliers, their employees,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Visual Object Detection Can Transform Manufacturing Industries

How Visual Object Detection Can Transform Manufacturing Industries

Since the industrial revolution, humanity has made tremendous progress in manufacturing. With time, we have seen more and more of mundane manual work being replaced by automation through advanced...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
4 Ways Big Data is Changing the Auto Industry Forever

4 Ways Big Data is Changing the Auto Industry Forever

The auto industry is always at the forefront of technological trends. This is certainly the case when it comes to big data and some of the applications that stem from it.

Big Data Meets Auto Industry

Wearable devices, smart home appliances, sophisticated algorithms that power advanced B2B software…the internet of things (IoT), powered by big data, has many different application points. But when you study some of the more transformational intersections, it’s clear that big data has the power to fundamentally revolutionise the auto industry in unique ways.

In 2018, as things currently stand, we’re in a bit of an in-between phase. Much of the industry is still using the technology of yesterday. However, they are acutely aware of the innovations that are emerging and are strategically preparing to implement changes in the coming months. In other words, the technology is here, but the infrastructure for mass implementation and adoption is still being constructed.

As we stand in this transitional space, there are several exciting things happening. Let’s look at some of the hottest trends and expectations as they pertain to big data’s role in the auto industry.

1. Improved Car Buying

From the consumer side of things, the increased access to data in the car ...


Read More on Datafloq
3 tips to reduce bias in AI-powered chatbots

3 tips to reduce bias in AI-powered chatbots

AI-powered chatbots that use natural language processing are on the rise across all industries. A practical application is providing dynamic customer support that allows users to ask questions and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Use artificial intelligence to identify, count, describe wild animals

Use artificial intelligence to identify, count, describe wild animals

A new paper in the Proceedings of the National Academy of Sciences (PNAS) reports how a cutting-edge artificial intelligence technique called deep learning can automatically identify, count and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
