If you’re waiting for someone to tell you what to do next, you’re on the wrong side of innovation and disruption

What’s the future of business, industries, jobs, markets, technology? The answer is, “it’s up to you.” See, it’s a choice. The future is inevitable, and it is disruptive. We either choose to see it...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
This blockchain-based company got $30 million to build a ‘new internet’

Cloud computing has quickly become one of the biggest innovations introduced by the internet. Cloud storage has allowed many businesses to streamline their operations, while it lets families share...

Looking for Marketing Lessons in Unsuspecting Places

In this episode of the Unleash Possible podcast, Samantha talks with Len Herstein, CEO of Managecamp. Len has a really fascinating background as a longtime marketer because he is also a volunteer Sheriff’s Deputy. He has an interesting perspective on the key takeaways that have come from blending his two worlds: his profession (marketing) and his passion (law enforcement).

In the discussion, Len explains that he’s embraced a philosophy of looking for learnings in places you don’t expect to find them, and he shares some of the lessons gleaned from law enforcement that he has successfully applied in marketing.

The critical lessons he covers include:

  • complacency kills
  • look for learnings in places you don’t expect to find them
  • the advantages of being introspective and aware
  • benefits of frequently debriefing successes and failures
  • why and how you should use the OODA loop

It’s clear from our discussion with Len that even though our worlds can often seem unrelated, our experiences teach us lessons and skills that are integral for our overall success. Thanks to Len for sharing his words of wisdom! You can access the podcast with Len Herstein here.

Demographics of China

China is the most populous country in the world. In its unified and un-unified forms, it has held that position for what seems like forever, and certainly since the fall of the Roman Empire, although one can argue that the British Empire was briefly larger. Oddly enough, the pre-eminent position it has held for over 1,500 years is about to be surpassed by India. China, in its wisdom, brought its population under control decades ago by encouraging smaller families. This has allowed it to develop further and grow economically. Quite simply, if a country’s economic growth is 3% a year and its population growth is also 3% a year, then the average person is basically getting nowhere. This has been the case for many nations in the developing world. China has broken from that pattern.
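That arithmetic is worth making explicit. A minimal sketch in Python (the 7% and 0.5% growth figures below are illustrative, not sourced from the article):

```python
# Per-capita growth: how much better off the average person gets each year,
# given the economy's growth rate and the population's growth rate.
def per_capita_growth(gdp_growth: float, pop_growth: float) -> float:
    return (1 + gdp_growth) / (1 + pop_growth) - 1

# Economy and population both growing 3% a year: the average person gains nothing.
print(round(per_capita_growth(0.03, 0.03), 4))   # 0.0

# China's pattern: fast economic growth against slow population growth.
print(round(per_capita_growth(0.07, 0.005), 4))  # 0.0647
```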

The population of China (People’s Republic of China) for 2017 is estimated at 1,511 million, or 1.5 billion. This is a staggering figure, almost five times (4.6 times) as many people as the United States, and around three times China’s population in 1950. The population in the first official national census taken by the People’s Republic of China, in 1953, was 583 million. It was a little hard to determine the population of China until the post-war period; post-war in this case means post-Warlord period, post-Sino-Japanese War, post-World War II and post-Chinese Civil War. The Chinese population was almost four times larger than that of the United States in 1950/1953, back in the days when we were at war with China on the Korean peninsula. The Chinese population is now growing at 0.59% a year (about half a percent), which is very low.

The fertility rate in China is 1.62 children per woman (2016) according to the National Health and Family Planning Commission (NHFPC), and 1.29 in 2016 according to the National Bureau of Statistics (NBS). I am not sure why there is such a difference. Regardless, this is not replacement rate and is well below 2.1. It is a birth rate lower than what we see in many developed countries, although China is still a developing country. This low birth rate was a result of the one-child policy instituted by the Communist Party in 1979. It appears not only to have worked, but to have worked too well. In 2015, the government instituted a two-child policy. According to the NHFPC, they expect the fertility rate to grow to 1.8; I guess this is one of the goals of the 13th Five-Year Plan. That is still not replacement rate. China does have some emigration and immigration, but the population is so massive that these do not have a huge impact on population growth rates.

The census classifies 91.51% of the population of China as Han Chinese. Still, 8.49% of 1.5 billion creates some significant minorities. These include the Tibetans, with at least 2.8 million, and the Turkic Uyghurs, estimated at 3.6 million. I recently ate at a Uyghur restaurant in Crystal City, VA; I had never been to one of those before.

Most likely, the Chinese population will experience negative growth by 2030. The United Nations predicts the Chinese population will be 1.36 billion in 2050, compared to 402 million for the United States and 132 million for Russia. Predicting population over 30 years is not that difficult. On the other hand, there is a projection that the Chinese population will decline to 1.02 billion by 2100. I would not hang my hat on that last figure.

The population is aging, with the bottom of its demographic “pyramid” narrowing. The demographic pyramid from 2015 is below:

These figures do not include Taiwan (Republic of China) or Macau (Macao Special Administrative Region), but they do include the city of Hong Kong. Mainland China claims Taiwan is part of China and has had an army poised across the strait, ready to invade, for almost 70 years. I am guessing that if they have not invaded in the last 70 years, they are not going to invade in the next 70, especially as Taiwan is a major trading partner. I do not expect re-unification as long as Taiwan remains democratic (as it has been since 1991/1996) and China remains a communist dictatorship. Taiwan had a population of 23.1 million in 2010, and it is growing only very slowly. Macau, with a population of 552,300 in the 2010 census, is effectively under Chinese control, as is Hong Kong (7,097,600 in the 2010 census).

Transforming The Transformative: The CMO’s Role In Leading Digital Transformation

Whether you call it digital transformation, digitalization, or DX, it’s transforming companies of all sizes, across all industries. Within companies, DX is making its mark across disciplines, and...

What to do with the data? The evolution of data platforms in a post big data world

Note: I’ve had the eminent thought leader Esteban Kolsky, founder and managing principal of ThinkJar, doing guest posts before on this blog. Time and again, the guy simply nails what the core of...

Why Data Scientists Are Crucial For AI Transformation

Fast forward to 2018, and today every CEO, CIO, CDO and CMO is seeking answers to questions that don’t exist yet. Until a few years ago the work of data scientists was isolated and mattered...

Digital transformation readiness: 7 steps to prepare for IIoT and AI

Manufacturing today faces many challenges including rising operational costs, operational inefficiencies, reduced budgets and the requirement to release products faster. Digital technologies such as...

Analytics Translator – The Most Important New Role in Analytics

Summary:  The role of Analytics Translator was recently identified by McKinsey as the most important new role in analytics, and a key factor in the failure of analytic programs when the role is...

U.S. Army Mobile Protected Firepower (MPF) Program Update

BAE Systems has submitted its proposal to the U.S. Army to build and test the Mobile Protected Firepower (MPF) vehicle [BAE Systems/Fox News]

When we last checked in with the U.S. Army’s Mobile Protected Firepower (MPF) program—an effort to quickly field a new light tank, i.e. a lightweight armored vehicle with a long-range direct fire capability—Requests for Proposals (RFPs) were expected by November 2017 and the first samples by April 2018. It now appears the first MPF prototypes will not be delivered before mid-2020 at the earliest.

According to a recent report by Kris Osborn on Warrior Maven, “The service expects to award two Engineering Manufacturing and Development (EMD) deals by 2019 as part of an initial step to building prototypes from multiple vendors, service officials said. An Army statement said initial prototypes are expected within 14 months of a contract award.”

Part of the delay appears to stem from uncertainty about requirements. As Osborn reported, “For the Army, the [MPF] effort involves what could be described as a dual-pronged acquisition strategy in that it seeks to leverage currently available or fast emerging technology while engineering the vehicle with an architecture such that it can integrate new weapons and systems as they emerge over time.”

Among the technologies the Army will seek to integrate into the MPF are a lightweight, heavy caliber main gun, lightweight armor composites, active protection systems, a new generation of higher-resolution targeting sensors, greater computer automation, and artificial intelligence.

Osborn noted that

the Army’s Communications Electronics Research, Development and Engineering Center (CERDEC) is already building prototype sensors – with this in mind. In particular, this early work is part of a longer-range effort to inform the Army’s emerging Next-Generation Combat Vehicle (NGCV). The NGCV, expected to become an entire fleet of armored vehicles, is now being explored as something to emerge in the late 2020s or early 2030s.

These evolving requirements are already impacting the Army’s approach to fielding MPF. It originally intended to “do acquisition differently to deliver capability quickly.” MPF program director Major General David Bassett declared in October 2017, “We expect to be delivering prototypes off of that program effort within 15 months of contract award…and getting it in the hands of an evaluation unit six months after that — rapid!”

It is now clear the Army won’t be meeting that schedule after all. Stay tuned.

Introducing MJR Analytics … and How Two Years Go So Fast When You’re Learning Something New

Today I’m excited to be launching MJR Analytics, a new consulting company focusing on modern, cloud analytics projects using technology from Looker, Qubit, Fivetran, Oracle and Snowflake and development techniques learnt from my time working as an analytics product manager at a startup in London.

Our new website (and no that’s not me sitting in the chair)

So what have I been up to in the two years since I left my old consulting company, and how has that experience and the way I’ve been working with analytics technologies over that time inspired me to start another one?

Two years ago I announced on Twitter that I’d left the company I’d co-founded back in 2007 and intended to take on a new challenge. I then spent the rest of that week at Oracle Open World, cycling over the Golden Gate Bridge and coming back on the ferry, trying to decide what that challenge might actually be.

For most of my time in the IT industry I’d been working on projects implementing products from vendors such as Oracle, and I’d always been interested in how those products came to market, how software vendors came up with a strategy and roadmap for them, and how the team behind those products worked with the engineers who built them.

I’d also become increasingly interested in the startup world, and towards the end of my time at Rittman Mead I had taken on an informal role advising Gluent, Tanel Poder and Paul Bridger’s product startup, which was building software that enabled big enterprise customers to offload their data warehousing workloads from expensive proprietary databases onto cheap, flexible storage and processing running on Hadoop clusters.

What appealed to me about working more formally with Gluent was the opportunity it gave me to work with two smart founders and an even smarter development team developing a product built entirely on big data technology I’d until then only scratched the surface with on consulting gigs. The product marketing role I took on was all about establishing what market that product was primarily intended for, how we went about positioning the product to appeal to that market and how we then brought that product to market.

Understanding these four things is crucial if you’re going to actually get customers to buy your startup’s product:

  • who is the buyer
  • what problem does your product solve
  • what is the value of solving that problem, and
  • why yours is the first product to solve it for them

otherwise you’ll spend your time building a solution to a problem that nobody actually has, which is why the majority of tech startups end up failing. Solving a problem for a market that’s willing to pay you money to solve it is called “product/market fit”; if your product has it, and you’ve built your business such that it scales linearly as more customers discover your product, then you’re going to make a lot more money than a consultancy limited by how many hours in the week a consultant can work.

I also learnt the distinction between product marketing, product management and product development in my time at Gluent. Going back to my time as a consultant attending product roadmap sessions at conferences I never quite knew which parts of the product team those speakers came from, but in summary:

  • Product Marketing is about taking a product that’s typically already built and then deciding the product’s positioning and messaging, then launching the product and ensuring the sales team, sales engineering and customers understand how it works and what it does; as such, this is a marketing role with a bit of technical evangelism thrown in
  • Product Development is the actual building of the product you’re looking to sell, and requires an engineering skillset together with the inspiration that typically came up with the product idea in the first place, along with an entrepreneurial side that made you want to build a company around it
  • Product Management is more of a customer-facing role and is about understanding what your customers want and what their problems and use-cases are, and then creating a strategy, roadmap and feature definition for a product that will meet those needs

Despite my undoubted product marketing skills based around PowerPoint and internet memes:

In product marketing, it’s never too soon to put a Santa hat on a photo of the founder

in the end I realised that it was product management that interested me the most. After a couple of meetings with an old friend, who used to run product management for Oracle’s business analytics product line and who had recently moved to London to lead the product team at Qubit, a technology startup created by four ex-Googlers building marketing technology products based around Google’s big data and cloud technology, I joined their team later in 2016 as the product manager responsible for the analytics features on their platform.

I spoke about Qubit and the partnership we established with Looker in May last year, in a presentation at Looker’s JOIN 2017 conference in San Francisco; the slide deck below from that event goes into the background to the product and the problem it solves: helping customers using Qubit’s personalization platform make more effective use of the data we collected for them.

The product and data engineering teams at Qubit did an excellent job bringing together the features for this product and in hindsight, the bits I was most proud of included:

  • The business metadata layer we created on Google BigQuery and Google Cloud Platform to translate an event-level normalized many-to-many data model designed for fast data ingestion into a denormalized, dimensional data model designed for easy use with BI and ETL tools
  • Additional integration we created for the Looker BI tool including a set of industry vertical-specific Looker models and dashboards we then made available on the Looker Block Directory and in a Github public repo
Screenshot from Personalization Analytics Block for Looker by Qubit
  • The multi-tenant data warehouse and hosted Looker instance we then put together to enable customers without their own Looker instance to make use of their data in Google BigQuery, doing so in a way that supported per-tenant extensions and customizations by the customer or their implementation partner.
Technical Architecture for Live Tap as presented at Looker JOIN 2017
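To illustrate the first bullet above, here is a toy sketch of that normalized-to-denormalized translation in pure Python. The real layer was a business metadata model over Google BigQuery; all table and field names below are invented for illustration:

```python
# Event-level rows, normalized for fast ingestion: each event only carries
# foreign keys into its dimensions (visitor, product).
events = [
    {"event_id": 1, "visitor_id": "v1", "product_id": "p9", "type": "purchase"},
]
visitors = {"v1": {"segment": "returning", "country": "UK"}}
products = {"p9": {"category": "shoes", "price": 59.0}}

def denormalize(event: dict) -> dict:
    """Join an event row with its dimension attributes into one wide,
    flat row of the kind BI and ETL tools consume easily."""
    row = dict(event)
    row.update(visitors[event["visitor_id"]])
    row.update(products[event["product_id"]])
    return row

flat = [denormalize(e) for e in events]
print(flat[0]["category"], flat[0]["segment"])  # shoes returning
```

In the real system this flattening was expressed declaratively over the warehouse rather than in application code, but the shape of the transformation is the same.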

What I’ll take away from my time at Qubit, though, is the incredible amount I learnt about product management, product engineering, and how to build and run a successful startup and team who are still highly motivated seven years in, plus the introduction it gave me to the analytics and data-led world of digital marketing, eCommerce and modern data analytics platforms.

Consulting is a popular route into product management and the experience I brought to the role in areas such as business metadata models, analytical techniques and the needs of BI and ETL developers proved invaluable over the eighteen months I worked as part of Qubit’s product and engineering teams, but moving into product management within a young, technology-led startup founded by ex-Googlers and working with some of the smartest and most innovative people I’ve ever met involved learning a whole new set of skills including:

  • Developing on a new technology platform (Google Cloud Platform) within a new industry (eCommerce and digital marketing) and understanding a whole new set of analytics use-cases and customer roles (A/B testing, stats models and event-based analytics used by analysts and strategists within eCommerce businesses) that I described in a presentation at last year’s UK Oracle User Group Tech Conference in Birmingham:
  • Working as part of a team rather than directing that team, and managing up as well as down, a technique I had to relearn pretty quickly in my first few months in the role
  • Learning to achieve my goals through influence rather than in the top-down way I’d been used to getting things done leading customer projects, and as CTO and owner of the company that team worked for
  • Saying no to customers rather than yes as you did as a consultant, as your objective is to build a product that solves the most important customer needs but doesn’t burden you with so many features addressing niche use-cases that you end up with Homer’s car and can’t innovate the product in future releases
  • How to take a product through its lifecycle from identifying a need that makes sense for your company to meet, through prototyping, alpha and beta releases to successful first launch and then creating a strategy and roadmap to manage that product over its complete lifecycle
  • How to use a new generation of modern, cloud-native data analytics tools such as Looker together with products such as FiveTran, Google Cloud Platform, Qubit, Snowflake DB and Snowplow Analytics that were increasingly also being adopted by the FinTech, MarTech and B2C startups clustering in London and other European/North American tech hubs

I learnt so much from my colleagues at Qubit about products, engineering and building a successful and motivated team that put up with my jokes and built the most technologically-advanced marketing personalization platform on the market.

But what my time at Qubit also made clear to me was that, when it came down to it, what really motivated me to get up in the morning, learn all these new technologies and still be wildly excited to come into work twenty years into my career was:

  • using data and analytics to find new insights and uncover new opportunities in a customer’s data set
  • working with individual clients, over time, to enable them to find more of those insights and opportunities themselves
  • finding new innovations in analytics technologies, and in how we deliver projects, that make this process cheaper, faster and more likely to be successful
  • and building a team, and crucially a business, to do all of this at scale and offer a full set of analytics-related consulting services built around modern analytics tools and delivery techniques

Which is why after two years away from the consulting business and two enjoyable, rewarding and enlightening years working on the other side of the data and analytics industry I’m now launching my new consulting company, MJR Analytics; and I hope to be working with many of you as clients or members of our team over the coming months and years.


Introducing MJR Analytics … and How Two Years Go So Fast When You’re Learning Something New was originally published in The MJR Analytics Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

5 ways IoT device management differs from MDM

They say necessity is the mother of invention, and the accelerating pace of the mobile revolution is no exception. The proliferation of mobile devices affected countless industries, creating a...

The state of GDPR compliance is just dreadful, survey finds

In news that will probably surprise no one, a survey has found that few companies are complying with the requirements of the European Union’s new General Data Protection Regulation, which went into...

3 Ways To Improve Remote IT Support By Leveraging Data

Information Technology departments are increasingly turning to data to help them manage the growing complexity of their IT infrastructure. For Sysadmins and support engineers alike, maintaining a...

How to make a wise machine learning platforms comparison

Rash behavior can be costly if it leads to the wrong decisions. Organizations with eyes on the potential benefits of machine learning and artificial intelligence would be wise to heed this advice and...

The Ultimate Guide to Becoming an Information Company

The data revolution is gaining pace at breakneck speed, and we are finally towards the latter end of its implementation. Many inroads have been made by important stakeholders, and numerous...

Waterline Data Brings AI Based Data Catalog to MapR

To compete in today’s Data Economy, organizations need to convert their data into actionable intelligence immediately, accurately and in compliance with governmental and internal policies and...

Moving Towards Autonomous Driving Networks

Throughout history, we have never ceased in our pursuit of greater productivity. With each new industrial revolution, from industrialization and digitalization to today’s focus on robotics and...

How to improve manufacturing ROI with prescriptive analytics

Today’s manufacturing organizations operate in a dynamic environment characterized by increased complexity and uncertainty. The financial performance of manufacturers hinges on their ability to...

Why Python is so popular with developers: 3 reasons the language has exploded

Python is the fastest-growing programming language in the world, as it increasingly becomes used in a wide range of developer job roles and data science positions across industries. But how did it...

Top 10 Digital Transformation Trends For 2019

Over the past few years I have made the commitment to looking forward to the year ahead to predict some of the most significant digital transformation trends. Knowing that digital transformation is...

Unifying Big Data And Machine Learning, Cisco Style

It doesn’t take a machine learning algorithm to predict that server makers are trying to cash in on the machine learning revolution at the major nexus points on the global Internet. Many server...

“Quantity Has A Quality All Its Own”: How Robot Swarms Might Change Future Combat

Humans vs. machines in the film Matrix Revolutions (2003) [Screencap by The Matrix Wiki]

Yesterday, Paul Scharre, director of the Technology and National Security Program at the Center for a New American Security, and prolific writer on the future of robotics and artificial intelligence, posted a fascinating argument on Twitter regarding swarms and mass in future combat.

His thread was in response to an article by Shmuel Shmuel posted on War on the Rocks, which made the case that the same computer processing technology enabling robotic vehicles combined with old fashioned kinetic weapons (i.e. anti-aircraft guns) offered a cost-effective solution to swarms.

Scharre agreed that robotic drones are indeed vulnerable to such countermeasures, but made this point in response:

He then went on to contend that robotic swarms offer the potential to reestablish the role of mass in future combat. Mass, either in terms of numbers of combatants or volume of firepower, has played a decisive role in most wars. As the aphorism goes, usually credited to Josef Stalin, “quantity has a quality all its own.”

Scharre observed that the United States went in a different direction in its post-World War II approach to warfare, adopting instead “offset” strategies that sought to leverage superior technology to balance against the mass militaries of the Communist bloc.

While effective during the Cold War, Scharre concurs with the arguments that offset strategies are becoming far too expensive and may ultimately become self-defeating.

In order to avoid this fate, Scharre contends that

The entire thread is well worth reading.

Trevor Dupuy would have agreed with much of what Scharre asserts. He identified the relationship between increasing weapon lethality and battlefield dispersion that goes back to the 17th century. Dupuy believed the primary factor driving this relationship was the human response to fear in a lethal environment, with soldiers dispersing in depth and frontage on battlefields in order to survive weapons of ever-increasing destructiveness.

TDI Friday Read: Lethality, Dispersion, And Mass On Future Battlefields

Robots might very well change that equation. Whether autonomous or “human in the loop,” robotic swarms do not feel fear and are inherently expendable. Cheaply produced robots might very well provide sufficient augmentation to human combat units to restore the primacy of mass in future warfare.

Why Inconsistent Definitions Wreak Havoc On Analytics

Everyone who has lived within the world of analytics has seen cases where different parts of a business have made use of slightly differing definitions of core business metrics. Sometimes these differences lead to only minor and non-material disagreement. At other times, the differences in definition can cause massive divergence of reported results and related actions taken. Organizations must ensure that where differences exist in definitions those differences are either reconciled or clearly labeled and articulated to provide the proper context.

It Is Just a Length of Stay, Right?

I recently came across a terrific example when a healthcare provider was discussing the seemingly simple issue of determining the length of stay that patients have at a hospital. Not only is length of stay important in and of itself, but it is also a component of other important metrics such as cost per stay.

Given its importance, one might assume that the formula for length of stay was standardized within the organization. But, when there are a number of hospitals acquired over time that still run in a largely autonomous fashion, it is easy to have different definitions creep in. Worse, each definition can be defended and there may be no true “right” answer.
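As a toy illustration of how equally defensible formulas diverge (dates invented), here are three common ways to compute length of stay for the same patient:

```python
import math
from datetime import datetime

admit = datetime(2018, 3, 1, 22, 30)      # admitted late Thursday evening
discharge = datetime(2018, 3, 3, 9, 15)   # discharged Saturday morning

# Definition A: midnight census -- count the nights spent in the hospital.
los_nights = (discharge.date() - admit.date()).days

# Definition B: elapsed time in 24-hour blocks, rounded up.
los_blocks = math.ceil((discharge - admit).total_seconds() / 86400)

# Definition C: any part of a calendar day counts as a full day.
los_days = (discharge.date() - admit.date()).days + 1

print(los_nights, los_blocks, los_days)  # 2 2 3
```

The same stay reports as either two or three days depending on which definition a hospital inherited, and that divergence then propagates into derived metrics such as cost per stay.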

To ...


Read More on Datafloq
Digitally Transforming the Manufacturing Workforce

Yet, PwC’s recent Global Digital Operations Study found that UK firms are lagging behind the global average for technology adoption. While there may be a number of reasons for the low digital...

Blockchain and real estate: A global revolution in the making?

In 2008, bitcoin announced itself as the first blockchain application, introducing the world to distributed ledger technology —a secure and transparent peer-to-peer payment protocol. Fast-forward a...

Who is responsible for cyber security in the enterprise?

Uncertainty is widespread across companies over who takes the lead on cyber security, according to Willis Towers Watson. Different organisations place the responsibility of cyber security at the feet...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Shift Career from Data Analyst to Data Scientist

Data Science is a wonderful profession and many people, including those in their mid-careers, think of making the big move to the field. But let’s face it: making a move from one career to another is not so easy. There are so many factors to consider, and a lot of patience is required. But at the end of the day, it is doable. Being a lucrative field, data science is attracting a lot of attention from people looking to grow their careers.

But first things first: Who is a data scientist? And, what does the role entail?

A data scientist, simply put, extracts value from data using processes and tools that are machine-learning dependent. He or she uses scientific methods and algorithms to gain insights and extract knowledge from structured and unstructured data. For example, a data scientist can use a data analytics and visualization application to extract useful insights from data.

A data analyst, on the other hand, translates data from market research, sales reports, and other sources into plain language that can be easily understood for better decision making. Here are some important things you should take into consideration before making the career shift from a data analyst ...


Read More on Datafloq
5 Ways Robots are Revolutionizing Jobs

Are Americans safe from the coming robot workforce? Every day, robotic engineers are working on new, innovative designs to make the workplace safer and more efficient. Despite this benefit, many people are wondering whether they'll lose their jobs to machines and what impact robots will have on their lives.

Although they have no emotional capacity -- leaving them at a disadvantage -- robots can perform work precisely, safely and efficiently. According to a report issued by ManpowerGroup, called “The Skills Revolution,” robotic innovations will increase automation and take over many menial jobs, but the technology will also create new roles. However, according to the report, the nation's workforce must “upskill” to take on these emerging roles. As this scenario unfolds, professionals with the foresight to learn new skills will grow increasingly valuable in the workplace. Conversely, those who don't invest in themselves and prepare for the future will get left behind, an outcome that's bad for both workers and businesses and likely to widen the growing financial divide.

The following segments explore five ways that robots might revolutionize jobs.

1. Robotics Are Impacting Healthcare Procedures

Robots are helping to alleviate the healthcare tech crunch. As the technology matures, it's growing more cost-effective, accessible and beneficial. ...


Read More on Datafloq
Three Rules for Publishing High Impact Research

I recently read The Content Experience Report by Uberflip with much interest.  After all, content strategy is a big part of what we do here at the Marketing Advisory Network.  In summary, I think the report offers some good tips and is worth reading, particularly the section about navigation. However, the report also falls victim to some of the classic content blunders that frequently come when writing a research report.

Here are some tips to help you avoid these common missteps.

Tip #1: You can’t “over-review” a research document.  In a report like this, let’s face it, typos are a credibility killer.  Several people should review and edit the document, and the last review should be done by a grammar and spelling pro who has not seen the report before.  In the case of this report, they could have caught the spacing issues in the introduction (not a big deal) and some of the mislabelling of the graphs throughout (a bigger gotcha that will confuse readers).

Tip #2: Dig deep into your data to find insights. People read research to learn something new.  It is really important to give them that, since they’re dedicating their time to reading your content. There were a couple of places in the report that almost got there. For example, the report dedicates a section to “Putting content in more than one place can increase views by 8x on average!”  It’s great to quantify the benefit of being in more places, but this is not groundbreaking information for the reader. Deeper insight would have come had they dug a level deeper and uncovered a predictive model that shows how much larger the audience needs to be in order to generate different levels of viewership, or how much placement location impacts engagement for specific types of audiences.

Tip #3: Graphical representations are critical. Most people are visually oriented, and that means, in a report like this, how you display the data is critical to people’s comprehension.  This report has a variety of graphical representations; some are very effective, but there are a couple that could use improvement. Here are some specific rules of thumb related to graphs that might help:

  1. Use the right charts for the right purpose: When comparing data points, always put them on the same chart.  Asking a reader to compare 2 charts against one another introduces a risk of misinterpretation and is not a smooth reader experience. Specifically, in this case, the reader is asked to compare two column graphs that each have 2 points.  All 4 data points could have been placed on a graph with points plotted on 2 axes and labeled to make comparison much easier.
  2. Be consistent: If you must have readers compare charts, put them side by side (not stacked vertically) and ensure that the scales on both axes are identical in order to make comprehension as easy as possible.
  3. Put charts through a robust review process: Labels on graphs are the key to comprehension.  Make them part of the review process, as mislabelled axes or titles will cause confusion.
  4. Follow color norms: OK, this one didn’t come from the Uberflip report, but from a user interface I was studying from another vendor. In this case, they were showing saturation points on a spectrum. Color intensity can be very helpful here, but they made a mistake: green was used to signal low saturation, and yellow, orange and red were used to signal increasing amounts. Unfortunately, we have all been trained that red is bad and green is good, so visually the data told the opposite of the story they wanted to express.

I applaud anyone who takes on a research project such as this one.  It is a huge undertaking. And, because it is such a huge undertaking, there are high expectations for the results from it. Taking extra steps and time to ensure a high-quality report makes for better content products and is worth it in the end.

Why Is Cybersecurity So Hard for Healthcare?

Look into the matter at all, and you’ll see that the healthcare industry represents one of the most vulnerable sectors of the big data universe. That’s why, not long ago, I discussed blockchain for electronic health records, which is a good idea because about 70 percent of healthcare firms have no cybersecurity insurance. In comparison, about 24 percent of all firms lack cybersecurity coverage. Given the fact that healthcare data is some of the most valuable data for identity thieves, you would think healthcare firms would make cybersecurity priority number one.

At the risk of sounding cynical, perhaps healthcare firms don’t invest in cybersecurity insurance because it’s not where the money is. Quite simply, healthcare is mandatory for most, if not all Americans. Even the healthiest among us have to get a checkup once in a while. Healthcare firms may gamble with patient data because, even when there’s a breach, it doesn’t hurt their profits. And they need the profits a great deal because healthcare admin costs are the highest in the developed world, representing 8 percent of spend; overall, the U.S. spends the most on healthcare, at 16.9 percent of GDP. 

DeVry reports that “85% of healthcare organizations view security as ...


Read More on Datafloq
Understanding the power of Blockchain infused with AI

Blockchain and AI are arguably the hottest properties in the technology industry today.   The Blockchain market is forecast to expand from a value of $210 million in 2016 to over $2.3 billion by...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Getting Started with PostgreSQL Streaming Replication

In this blog post, we dive into the nuts and bolts of setting up Streaming Replication (SR) in PostgreSQL. Streaming replication is the fundamental building block for achieving high availability in your PostgreSQL hosting, and is implemented by running a master-slave configuration.

Read the original: Getting Started with PostgreSQL Streaming Replication

Master-Slave Terminology

Master/Primary Server

- The server that can take writes.
- Also called read/write server.

Slave/Standby Server

- A server where the data is kept in sync with the master continuously.
- Also called backup server or replica.
- A warm standby server is one that cannot be connected to until it is promoted to become a master server.
- In contrast, a hot standby server can accept connections and serves read-only queries. For the rest of this discussion, we will be focusing only on hot standby servers.


Data is written to the master server and propagated to the slave servers. If there is an issue with the existing master server, one of the slave servers will take over and continue to take writes, ensuring availability of the system.
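As a rough illustration of what that configuration involves (the parameter names come from PostgreSQL's documentation, but the values, hosts, and paths here are illustrative assumptions, not tuned recommendations), a minimal streaming-replication setup looks something like:

```ini
# postgresql.conf on the master
wal_level = replica        # generate enough WAL for a standby to replay
max_wal_senders = 3        # allow replication connections

# postgresql.conf on the standby
hot_standby = on           # serve read-only queries while replaying WAL

# pg_hba.conf on the master: permit the standby's replication user, e.g.
#   host  replication  repl_user  standby_ip/32  md5

# Seed the standby from a base backup; -R writes the standby's
# recovery settings automatically:
#   pg_basebackup -h master_host -U repl_user -D /path/to/standby/data -R
```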

WAL Shipping-Based Replication

What is WAL?


- WAL stands for Write-Ahead Logging.
- It is a log file where all the modifications to the database are written before they’re applied/written to data files.
- WAL is used for recovery after a database crash, ensuring data integrity.
- WAL is used in ...
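The log-then-apply ordering is the whole idea, and it can be sketched in a few lines (a conceptual toy only; real WAL records are binary, page-level changes, not key/value pairs):

```python
# Toy illustration of write-ahead logging: every change is appended to the
# log *before* it is applied to the data files, so a fresh or crashed copy
# can reach the same state by replaying the log in order.

class ToyDatabase:
    def __init__(self):
        self.wal = []    # append-only write-ahead log
        self.data = {}   # stand-in for the data files

    def write(self, key, value):
        self.wal.append((key, value))  # 1. log the change first...
        self.data[key] = value         # 2. ...then apply it

    def replay(self, wal):
        # Crash recovery or standby catch-up: apply logged changes in order.
        for key, value in wal:
            self.data[key] = value

master = ToyDatabase()
master.write("balance:alice", 100)
master.write("balance:alice", 75)

# A standby that receives the master's WAL converges on identical state.
standby = ToyDatabase()
standby.replay(master.wal)
print(standby.data)  # {'balance:alice': 75}
```

Streaming replication is essentially this loop run continuously: the standby consumes the master's WAL as it is generated rather than in batches.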


Read More on Datafloq
Hortonworks unveils roadmap to make Hadoop cloud-native

It would be pure understatement to say that the world has changed since Hadoop debuted just over a decade ago. Rewind the tape to 5 – 10 years ago, and if you wanted to work with big data,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What Is Fog Computing? And Why It Matters In Our Big Data And IoT World

Fog computing, also known as edge computing, solves the problem by keeping data closer “to the ground,” so to speak, in local computers and devices, rather than routing everything through a central data center in the cloud. In...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Demographics of the United States

The United States is the third most populous country in the world, and has been since the collapse of the Soviet Union. It is likely to remain the third most populous country for a long time to come, as it will certainly not catch up to the two countries with over a billion people (China and India) and will not be surpassed by anyone else any time soon (Indonesia, 4th on the list, has a population of around 264 million in 2017 and a growth rate of 1.1% compared to the U.S. growth rate of 0.7%). It appears that we will be the third most populous country in the world for decades to come.

The United States population for 2018 is estimated at 328.3 million, more than twice what it was in 1950. The population has grown by 9.7% or more in every decade since 1790, with the exception of the 1930s (7.3% for that decade). The annual growth rate in 2017 was 0.7%.

The fertility rate in the U.S. is 1.76 children per woman (2017). This is not much higher than the 1.61 figure for Russia (see my previous post on demographics). Almost all developed countries in the world now have a birth rate below 2; it needs to be around 2.1 to achieve replacement rate. The U.S. fertility rate dropped below 2.1 in 1972 and has remained below the replacement rate ever since, except for two years (2006 and 2007). But the U.S. population continues to grow, and that growth is due to immigration. If it were not for immigration, the U.S. population would be in decline.

The United States is currently 77.7% “white” (in 2013), which is defined by the census bureau as “having origins in any of the original peoples of Europe, the Middle East, or North Africa.” Non-Hispanic “whites” make up 62.6% of the country’s population. The non-Hispanic “white” population is expected to fall below 50% by 2045. Needless to say, this has become a political issue inside the United States, one that I have no interest in discussing on this blog. See: https://www.brookings.edu/blog/the-avenue/2018/03/14/the-us-will-become-minority-white-in-2045-census-projects/

The United Nations predicts the U.S. population will be 402 million in 2050 (compared to their prediction of 132 million for Russia in 2050). The U.S. Census Bureau projects the U.S. population will be 417 million in 2060.
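As a sanity check on those projections, compounding the 2018 estimate at the recent 0.7% annual rate (a simplifying assumption, since real growth rates drift) lands in the same range:

```python
# Compound the 2018 population estimate at a constant 0.7% a year.
population_2018 = 328.3e6
annual_growth = 0.007
years = 2050 - 2018

projection_2050 = population_2018 * (1 + annual_growth) ** years
print(round(projection_2050 / 1e6, 1))  # roughly 410 million by 2050
```

That naive figure sits between the UN's 402 million for 2050 and the Census Bureau's 417 million (the latter being for 2060), so the official projections are broadly consistent with recent growth simply continuing.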

The population has gotten older, with the median age now at 38.1 years; in 1970, at the height of the “youth culture,” it was 28.1 years. The demographic “pyramid” from 2015 for the United States is below. It is worth comparing to the Russian “pyramid” in a previous post.

The legal immigration rate of the United States has averaged around 1 million a year since 1989. It rose to 1.8 million in 1991 and was 1.2 million in 2016. This was very much driven by the Immigration Act of 1990, which raised the cap on immigration. See: https://www.dhs.gov/immigration-statistics/yearbook/2016/table1

The illegal immigration rate is harder to calculate, as some illegal immigrants are deported, some return home, and I gather a significant number later convert to legal immigrants. It is estimated that there are around 11 million illegal immigrants in this country (2016 estimate); in 1990, the estimate was 3.5 million. Does that mean that the actual immigration rate from illegal immigration is less than 300,000 a year (as many later become legal immigrants)? It is hard to say exactly, but it appears that our immigration rate is somewhere between 1,300,000 and 1,500,000 a year counting both legal and illegal immigrants.
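The back-of-the-envelope arithmetic behind that range, using only the rough estimates quoted above, can be laid out explicitly:

```python
# All figures are the rough estimates quoted in the text.
legal_avg = 1_000_000      # average legal immigration per year since 1989
legal_2016 = 1_200_000     # legal immigration in 2016
illegal_1990 = 3_500_000
illegal_2016 = 11_000_000

# Net growth of the illegal-immigrant population per year; this undercounts
# illegal inflow, since some people later legalize, leave, or are deported.
net_illegal = (illegal_2016 - illegal_1990) / (2016 - 1990)

print(round(net_illegal))               # about 288,000 a year
print(round(legal_avg + net_illegal))   # low end, ~1.29 million
print(round(legal_2016 + net_illegal))  # high end, ~1.49 million
```

The net-growth figure is why "less than 300,000 a year" is a floor rather than a measurement, and the two totals bracket the 1.3 to 1.5 million range given in the text.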

This is the primary source of our population growth and will probably be so for some time to come. The United States ceased reproducing at replacement rate almost 50 years ago. This is not going to change anytime soon. Some immigration is probably essential to maintain our labor force at current or growing levels (this is as close to a political statement as I am going to get on the subject).

Next up: China, India, Japan and Germany.

The new choice in digital transformation: become a platform or disappear

For decades, technology has been the first lever of transformation for modern companies willing to improve their effectiveness, optimize processes or automate heavy and repetitive tasks. This...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Self-Service Analytics Has Gone Backward–and What To Do About It

During the past decade, the assertion that the data warehouse is required to be the center of an enterprise data system started to break down in a variety of ways. Reasons were numerous; they...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
GDPR and application security: Why a holistic view is key

As unsettling as the European Union’s General Data Protection Regulation is for businesses around the world, it can be a blessing, too, because it will force many to take a more holistic view...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Ensuring Optimal Database Performance in the New Cloud World

Today, data management environments are highly complex and often span multiple vendors with deployments across on-premise data centers, clouds, and hybrid installations. In addition to the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
BDaaS: Taking the Pain Out of Big Data Deployment

There’s Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), even Infrastructure-as-a-Service (IaaS). Now, in the quest to make big data initiatives more accessible to mainstream...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Destroyed tanks in Das Reich

Still working on sorting out the events on 6 July 1943 with the Das Reich SS Division at Luchki.

Das Reich’s first reported destroyed tank was at 0235 on 7 July (T354, R605, page 556). It was a Panzer IV. The files then provide a map showing 6 tanks destroyed SSW of Luchki (1 Pz III, 2 Pz IVs, 1 Pz VI and 2 T-34s). The map is undated, but probably dates from between 6 and 9 July. Das Reich took Luchki (south) on 6 July. This map is shown here:

The Das Reich Valley of Death?

There is a 10-day report for 10 July; these are usually the best source for total-loss reports. It gives total losses as of 10 July as 1 Pz III long, 1 Pz IV long and 1 StuG (T313, R387, page 6831). There is also a corps quartermaster log, which reports that as of 11 July there were three tanks totally destroyed: 1 Pz III, 1 Pz IV and 1 StuG III (T354, R607, pg 507). It is hard to square these reports with the map of Luchki showing six tanks destroyed.

It is not until 28 July that we get a complete listing for Das Reich of total losses for the period of 5-18 July. There was no 10-day status report for 20 July. The 28 July report records for Das Reich total losses of 2 Pz III, 6 Pz IV, 1 Pz VI and 2 StuG (T354, R607, page 629). This is a total of 11 tanks, but does not include T-34s, of which at least two were totally destroyed. We ended up recording 18 as destroyed based upon multiple sources (see page 1336 of my Kursk book).

The same report also records LSSAH total losses as 1 Pz Ib, 1 Pz III, 9 Pz IV, 1 Pz VI, 3 StuG, and 3 “Pak Sf.” (Marders). Same for Totenkopf SS, where they report 6 Pz IIIs, 7 Pz IVs, 1 Pz VI, 1 StuG and 2 Marders.

Now, this was a clean-up report. There were other, earlier reports of total losses that indicate fewer losses. On 23 July there is a report of tanks destroyed by the corps (T354, R605, page 853). It reports for the II SS Panzer Corps 5 Pz III long, 23 Pz IV long, 3 Pz VI and 5 Sturmgeschutz (assault guns). This does not quite match the report of 28 July, which gives 9 Pz III (+4), 22 Pz IV long (-1), 3 Pz VI and 6 StuG (+1). There was no fighting between 23 and 28 July 1943, as the SS Panzer Corps was moving to conduct its next offensive.
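The mismatch between the two corps-level reports is easiest to see side by side (figures transcribed from the reports cited above):

```python
# II SS Panzer Corps total-loss reports, 23 vs 28 July 1943.
report_23_july = {"Pz III": 5, "Pz IV": 23, "Pz VI": 3, "StuG": 5}
report_28_july = {"Pz III": 9, "Pz IV": 22, "Pz VI": 3, "StuG": 6}

deltas = {t: report_28_july[t] - report_23_july[t] for t in report_23_july}
print(deltas)  # {'Pz III': 4, 'Pz IV': -1, 'Pz VI': 0, 'StuG': 1}
```

With no fighting in the interval, the +4/-1/+1 shifts can only come from late write-offs or reclassification, which is the point being made here.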

A few things come to mind here:

  1. According to our accounting, Das Reich lost 129 tanks damaged and 18 destroyed between 4 July and 18 July (see page 1337 of my Kursk book).
  2. They apparently lost only 13-18 tanks destroyed in that period.
  3. It does appear that tanks were being written off as destroyed several days after they were actually damaged, in some cases a week or more later.
  4. Clearly, looking at destroyed tanks only does not really give a full and proper accounting of the actual fighting.

If you really want to know what is going on with combat among the German armored divisions in WWII, you really need to compare and contrast the ready-for-action reports from day-to-day.

Anyhow, I am not sure I am any closer to determining what happened on 6 July, if anything significant did happen.

A Winning Game Plan For Building Your Data Science Team

One of the most exciting challenges I have at Hitachi as the Vice-Chairmen of Hitachi’s “Data Science 部会” is to help lead the development of Hitachi’s data science capabilities. We have a target...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Data virtualization: Coping With The Forthcoming Data Avalanche

With better ways of translating data, organisations become faster, more agile and more competitive. Data virtualization is an increasingly necessary tool in organisations and businesses that face an...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Does Public Cloud Storage Cost More? Yes and No

While public cloud storage costs more in terms of raw dollars, users generally look beyond the raw cost for value received.
Could Biometric Fingerprint Identification Replace Traditional Smart (Bank) Cards and Password Soon?

Unlike the traditional smartcards most people are familiar with, a biometric payment card is a credit or debit card that uses the holder's fingerprint to verify that a transaction is genuine. Biometric cards introduce a layer of security that is not available in traditional cards, and the biometric fingerprint identification card possesses more impressive features. Various data are stored in the card's secure elements, including a template of the holder's fingerprint, the personal account details, and matching engines that confirm that the fingerprint presented at payment is authentic. This card allows consumers to keep hold of their biometric data rather than giving a third party access. If holders misplace their cards, their data remains safely encrypted in the secure elements, and anyone who comes across the cards cannot use them. Biometric cards ultimately put the challenge of fraud in check.

In this century, there is no doubt that cards are the most popular way to make payments across the world among digital consumers. Cash payment is beginning to phase out as more transactions are being digitally processed. However, even traditional cards are now being threatened, as they may soon be replaced by biometrics, which have proved to be more reliable ...


Read More on Datafloq
How You Can Improve Customer Experience With Fast Data Analytics

In today’s constantly connected world, customers expect more than ever before from the companies they do business with. With the emergence of big data, businesses have been able to better meet and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
So You Think You Want to be a Chief Data Officer?

According to Gartner, 90% of enterprise organizations will have a Chief Data Officer by the end of 2019. For organizations focused on digital transformation initiatives, that’s good news. The bad...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Demographics of Russia

I am far from an expert on demographics, but it is something I do occasionally pay attention to. When it comes to measuring long-term military power and world influence, the basic measurement of power is wealth times population. Or, you can simply stack the countries up by GNP. This puts Russia 11th on the list, below Canada. But looking at Russia’s population alone is useful.

As of January 2018, the population of Russia is estimated at 146.9 million people, making it the 9th most populous country on the planet. This figure includes Crimea and Sevastopol, which have 2.4 million people.

The population of Russia was in steady decline from 1991 to 2013. Even in 2017, Russia was producing only 1.61 children per woman, below the replacement rate of 2.1. The last time the Russian fertility rate was above 2 was in 1989, and then only for 7 years; it was below 2 for most of the time before that, going back to the mid-1960s. Low birth rates and small families seem to be very much a part of the culture. I know a lot of Russians who are only children.

Their population is growing ever so slowly due to immigration. For 2017 they had net migration of 211,878 and a natural population loss of 135,818. This gave them a population gain of 76,060, a very small annual gain (0.05% a year). In 2010, ethnic Russians made up 77% of the total population.
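The arithmetic of those components is worth spelling out, since the migration gain and the natural loss nearly cancel:

```python
# Russia's 2017 population-change components, as quoted above.
net_migration = 211_878
natural_change = -135_818          # births minus deaths

net_gain = net_migration + natural_change
growth_rate = net_gain / 146_900_000   # against the ~146.9M population

print(net_gain)                        # 76060
print(round(growth_rate * 100, 2))     # 0.05 (percent per year)
```

In other words, without immigration the country would have shrunk by roughly 136,000 people that year.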

The population of Russia was 147.4 million in 1990 and 148.7 million in 1991. It then declined at a rate of 0.5% a year, dropping to 142.9 million in 2010, and has since increased to 146.9 million through immigration and the seizure of Crimea and Sevastopol. In 2006 the Russian government started simplifying immigration laws, encouraging immigration of ethnic Russians from former Soviet republics. There is probably a limit to how many more people this can draw in. Russia also has about 7 million temporary migrant workers (these are 2011 figures). The Russian population tends to be older than most; the demographic “pyramid” is anything but pyramidal in shape.

As of 2018, the UN is still predicting that Russia’s population will fall to 132 million by 2050. See: https://esa.un.org/unpd/wup/Country-Profiles/

Now, let’s compare the population of the Soviet Union/Russia to the United States over time:

            Soviet Union/Russia       United States

1951        182.3 million             151.3 (1950)
1959        209.0                     179.3 (1960)
1970        241.7                     203.2
1977        257.7                     226.5 (1980)
1982        270.0                     226.5 (1980)
1990        290.9                     248.7
1991        293.0                     248.7 (1990)
2002        145.2 (Russia only)       281.4 (2000)
2010        142.9                     308.7
2018        146.9                     328.3 (est.)

So, during the height of the “we will bury you” era, the Soviet Union had a population about 20% larger than that of the U.S. Russia now has a population less than half that of the U.S. (and a GDP smaller than Canada’s). It appears that their population will not be growing very fast and may well continue to decline.
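In ratio form, the comparison in the table reads:

```python
# Populations in millions, taken from the table above.
ussr_1982, us_1980 = 270.0, 226.5
russia_2018, us_2018 = 146.9, 328.3

print(round(ussr_1982 / us_1980, 2))    # 1.19 -- about 20% larger
print(round(russia_2018 / us_2018, 2))  # 0.45 -- less than half
```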

Fascinating Ways Big Data Is Reshaping The Future of Football

The world of football is big business, and the future of football is changing. It is an industry that saw a big boom 25 years ago, thanks to the amount of money that was spent by television companies...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Alibaba Cloud Arena: An Open-Source Tool for Deep Learning

Alibaba Cloud introduced the Deep Learning tool Arena to the open-source community in July 2018. Now, data scientists can run Deep Learning on the cloud without having to learn to manipulate...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Data-Centric Revolution: Governance in a Data-Centric Environment

A traditional data landscape has the advantage of being extremely silo-ed.  By taking your entire data landscape and dividing it into thousands of databases, there is the potential that each database...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Enhanced Analytics To Stabilize Cash Flow, Income Projections

Does your business regularly experience budget shortfalls? Do incoming funds seem unpredictable? If your business is struggling to find its financial stride, the problem could be too little data rather than too little money. By enhancing your company’s analytics infrastructure, you can gain greater insight into client payment patterns, stabilize your cash flow, and grow your business.

Beyond Profit And Loss

Profit and loss (P&L) statements are a standard part of all business plans, but these reports reflect only actual income and expenses. Unfortunately, when a business is still working to streamline its invoicing and payment practices, those P&L statements rarely reflect how the business will perform, and they’re often marred by the bumpy early months.

After the first few months of operation, all businesses should introduce analytics to their operations and to their business plan. No longer a supplemental element, analytics offer businesses the greatest opportunity to improve their performance and identify operational weaknesses. If you wait too long to introduce analytics software, you lose valuable time to elementary errors.
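As a small sketch of what such analytics can look like in practice (the client names, dates, and days-to-pay approach are all hypothetical illustrations, not a particular product's method):

```python
# From paid-invoice history, estimate each client's average days-to-pay,
# then project when an open invoice will actually turn into cash.
from datetime import date, timedelta
from statistics import mean

paid_invoices = [
    # (client, invoiced, paid) -- hypothetical data
    ("Acme", date(2018, 1, 5), date(2018, 2, 19)),
    ("Acme", date(2018, 3, 1), date(2018, 4, 12)),
    ("Birch", date(2018, 2, 10), date(2018, 2, 20)),
]

days_to_pay = {}
for client, invoiced, paid in paid_invoices:
    days_to_pay.setdefault(client, []).append((paid - invoiced).days)
avg_delay = {c: mean(ds) for c, ds in days_to_pay.items()}

# Project the cash-arrival date for an open invoice from client history.
open_invoice = ("Acme", date(2018, 5, 1))
expected = open_invoice[1] + timedelta(days=round(avg_delay["Acme"]))
print(avg_delay["Acme"], expected)
```

Even this crude per-client average turns an "unpredictable" receivable into a dated cash-flow line item; real analytics tools refine the same idea with more history and more variables.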

The Power Of Prediction

Not only can analytics software assess current operations and identify problems, new predictive programs can use past patterns to suggest future trends – and this is especially useful when it ...


Read More on Datafloq
Using Big Data to Excel in Sales

For any business to grow and succeed, finding ways to increase revenue through sales is very important. When you are looking to improve your sales efforts, there are many different tools that can be used. Today, one of the best tools that you can use to improve your sales is big data. While many industries use big data today, those in the sales field could use it in a number of different ways to be more efficient and improve revenue and outcomes.

Strategic Planning

One of the most important things that you can use big data for is the analysis of customers and information that has been provided. Today, through the use of a wide variety of data providing services, you can get a lot of information on customers. When you have a big data service at your disposal, you can use this data to create better plans for how to contact customers and when. The big data will be able to help you complete individual strategies for which customers to contact and what you need to do to win certain business.

Customer Analysis

When you have a big data system in place, you will also get to analyze customers to a greater ...


Read More on Datafloq
How Alternative DBs are Disrupting the Conventionals in 2018

After providing a viable alternative for enterprise IT systems for nearly two decades, are NoSQL databases finally making significant inroads against the conventional proprietary world of SQL...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What VMworld 2018 Said About VMware

Containers and public clouds are surfacing as potential threats to VMware’s dominating position in enterprise IT.
Big data tooling rolls with the changing seas of analytics

In the early days of big data that followed the invention of Hadoop at Yahoo, proponents emphasized its potential for replacing bulging enterprise data warehouses focused on business intelligence....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
400 Categorized Job Titles for Data Scientists

Job titles for data scientists, including details about the simple but powerful classifier used to categorize these job titles. This analysis provides a break down per job category, and granular...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How on-demand data officers can help fill the knowledge gap

Data is an invaluable resource for driving commercial decision-making yet a dearth of people with the requisite skills in the UK has prevented some businesses from exploiting their numbers to the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Data breach reports see 75% increase in last two years

Data breaches are up 75% in two years, finds a report from the Information Commissioner (ICO). The study, carried out by Kroll, took into account an array of personal data, including health,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Application of Technology and Tech Trends for Banking: 2019-2021

Banking has gone from standing in long queues to one-tap everything over the last decade. That most customers today like daily errands to be performed fast, easily and error-free doesn't require re-telling. If I had to write a monologue on my banking experience in the first half of this decade, it would include words like "terrible" and "not so great." But new technology, and the ways banks have adopted it, has been changing my banking experience in recent years.

A recent study in the US claims banking and related tasks are among the most frequently performed daily activities of adults aged 25-65. Online banking, mobile banking, apps, etc. have made the life of a super-busy adult super-easy. There are still many challenges, though, and you must be wondering what they are. Those challenges lead us to the NEXT BIG TRENDS in the world of BANKING. These trends will take your banking experience a notch higher and help you battle those challenges in the future.

Here comes Artificial Intelligence

We live in the era of face recognition, robotic encounters, sci-fi movies and machine learning. It's time to supplement human interaction with human-like interactions. Artificial Intelligence reduces human error and improves efficiency, speed ...


Read More on Datafloq
10 Industries Where Artificial Intelligence Has Caused a Disruption

Artificial Intelligence (AI) is redefining industries by offering personalization, automating processes, and disrupting how we work. In modern times, AI is embraced by every industry from healthcare...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Wargaming Thread on Combat Results Tables

Thanks to a comment made on one of our posts, I recently became aware of a 17-page discussion thread on combat results tables (CRTs) that is worth reading. It is here:

https://www.boardgamegeek.com/thread/1344914/crts-101/page/1

Much of their discussion of data centers on analysis based upon Trevor Dupuy's writing, the CBD90 database, the Ardennes Campaign Simulation Data Base (ACSDB), the Kursk Data Base (KDB) and my book War by Numbers. I was not aware of this discussion until yesterday, even though the thread was started in 2015 and continues to this year (War by Numbers was published in 2017, so it does not appear until the end of page 5).

The CBD90 was developed from a Dupuy research effort in the 1980s, eventually codified as the Land Warfare Data Base (LWDB). Dupuy's research was programmed, with errors, by the government to create the CBD90. A lot of the analysis in my book was based upon a greatly expanded and corrected version of the LWDB. I was the program manager for both the ACSDB and the KDB, and, of course, for the most updated version of our DuWar suite of combat databases.

http://www.dupuyinstitute.org/dbases.htm

There are about a hundred comments I could make on this thread, some in agreement and some in disagreement, but then I would not get my next book finished, so I will refrain. That does not stop me from posting a link:

Lanchester equations have been weighed….

 

Building competitive data advantage

Several years ago, my company faced a significant challenge: A large swath of small new entrants relying heavily on data and artificial intelligence provided services faster, cheaper, and more...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Crowdsourcing in the age of artificial intelligence: How the crowd will train machines

It was over 10 years ago that I was introduced to the concept of crowdsourcing. I was a student at London Business School when a professor one day came into the classroom with a jar of pennies. He...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
10 Ways Technology Will Transform the Human Body in the next Decade

Elon Musk has called it: you’re already a cyborg. Your smartphone enhances your mind, your spectacles enhance your vision, and your pacemaker (if you have one) regulates your heartbeat. Our...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The cloud engineers leveraging data transformation toward a cultural revolution

While tech continues operating at its familiar breakneck pace of innovation, today’s market seems to constantly be playing catchup with data. The central industry megatrends of cloud, mobile,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Connecting the dots: digital transformation and beyond

For government agencies thinking about journeying to the cloud, it’s all about data, according to IT cloud services veteran, Ahmed Hassan.  “Trying to connect the dots for government, and the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Transformer in chief: the newest member of the c-suite

It may be telling that one of the smash hit books of the past year has been on sleep. Berkeley professor of neuroscience Matthew Walker’s Why We Sleep was a comprehensive analysis of why humans need...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Telecom: Enterprise Messaging is the New Black

Enterprises love messaging. There is no evidence in this world that can deny the obviousness of the fact that enterprises just love reaching out to their customers. Be it regarding new deals, a new...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why new CMOs need to be CSOs (Chief Synthesis Officers)

From the moment a new chief marketing officer starts a job, the expectations for improved marketing performance are outrageously high. The CEO is looking for simultaneous upticks in brand metrics and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why You Should Pay Attention To EVPN In Modern Data Centers

From the inception of data centers, there has been a need to standardize data transmission. Today we use something termed Ethernet VPN, or EVPN for short. EVPN, according to Juniper, is a network technology that utilizes a Layer 2 bridge to connect disparate networks together. When data centers were first developed, the topology was based on a couple of central links, termed 'God boxes', that served as the point of entry and exit from the network into the wider world. The main issue was that this type of network setup had problems with cross-communication, usually because of its reliance upon Layer 2 networking. As things progressed, data centers started to move away from Layer 2 and adopt Layer 3 networking instead, which relies on IP addresses for routing. The problem here was that Layer 3 added complexity to a system that relied on simplicity to operate. But how could Layer 2 be used despite its massive shortcomings?

Exploring Layer 2 More Directly

Layer 2 was designed to be an easier-to-use system for moving traffic. It requires very little overhead, and companies usually charge more for Layer 3 routing since it utilizes ...


Read More on Datafloq
How Many Confederates Fought At Antietam?

Dead soldiers lying near the Dunker Church on the Antietam battlefield. [History.com]

Numbers matter in war and warfare. Armies cannot function effectively without reliable counts of manpower, weapons, supplies, and losses. Wars, campaigns, and battles are waged or avoided based on assessments of relative numerical strength. Possessing superior numbers, either overall or at the decisive point, is a commonly held axiom (if not a guarantor) for success in warfare.

These numbers of war likewise inform the judgements of historians. They play a large role in shaping historical understanding of who won or lost, and why. Armies and leaders possessing a numerical advantage are expected to succeed, and thus come under exacting scrutiny when they do not. Commanders and combatants who win in spite of inferiorities in numbers are lauded as geniuses or elite fighters.

Given the importance of numbers in war and history, however, it is surprising to see how often historians treat quantitative data carelessly. All too often, for example, historical estimates of troop strength are presented uncritically and rounded off, apparently for simplicity's sake. Otherwise careful scholars are not immune from the casual or sloppy use of numbers.

However, just as careless treatment of qualitative historical evidence results in bad history, the same goes for mishandling quantitative data. To be sure, like any historical evidence, quantitative data can be imprecise or simply inaccurate. Thus, as with any historical evidence, it is incumbent upon historians to analyze the numbers they use with methodological rigor.

OK, with that bit of throat-clearing out of the way, let me now jump into one of the greatest quantitative morasses in military historiography: strengths and losses in the American Civil War. Participants, pundits, and scholars have been arguing endlessly over these numbers since before the war ended. And since nothing gets folks more riled up when debating Civil War numbers than arguing about the merits (or lack thereof) of Union General George B. McClellan, I am eventually going to add him to the mix as well.

The reason I am grabbing these dual lightning rods is to illustrate the challenges of quantitative data and historical analysis by looking at one of Trevor Dupuy’s favorite historical case studies, the Battle of Antietam (or Sharpsburg, for the unreconstructed rebels lurking out there). Dupuy cited his analysis of the battle in several of his books, mainly as a way of walking readers through his Quantified Judgement Method of Analysis (QJMA), and to demonstrate his concept of combat multipliers.

I have questions about his Antietam analysis that I will address later. To begin, however, I want to look at the force strength numbers he used. On p. 156 of Numbers, Predictions and War, he provided figures for the opposing armies at Antietam. The sources he cited for these figures were R. Ernest Dupuy and Trevor N. Dupuy, The Compact History of the Civil War (New York: Hawthorn, 1960) and Thomas L. Livermore, Numbers and Losses of the Civil War (reprint, Bloomington: University of Indiana, 1957).

It is with Livermore that I will begin tracing the historical and historiographical mystery of how many Confederates fought at the Battle of Antietam.

Cybersecurity Misconceptions and Challenges Faced by Small Businesses

The digital and physical worlds share some eerie similarities, one of the main ones being the abundance of people with bad dispositions. Evildoers continually hunt for exploitable targets. This isn't something only big businesses should be concerned with; small to medium-sized businesses also need to be vigilant. There are many good reasons for this today.

The Growing Cybercriminal Culture

There are a few things that we’re seeing more of today. These are factors you should be on the lookout for. They include:

Ransomware is on the Rise

During a ransomware attack, criminals infect your computer or network with a virus that encrypts data. Once you're infected, the criminals demand payment in exchange for the return of your data, typically via a pop-up window. Businesses that do a good job of backing up their data can probably ignore the threat, wipe their systems clean, and start again from the last backup. Otherwise, you may need to pay the ransom. While you could end up paying anywhere from hundreds to thousands of dollars, this doesn't guarantee that you'll have your data returned. When you factor in the loss of productivity and the cost of recovering files, it's easy to see how cybercrime ...


Read More on Datafloq
Why Retailers Are Losing Ground by Failing to Utilise Big Data

If you aren't running a business named Amazon, you're likely to feel the pressure of big data pushing small retailers out of business at an astounding rate.

It's a bigger issue than the simple existence of Amazon. 2017 was a year to remember for all the wrong reasons as multiple retailers struggled to stay afloat despite already holding healthy market shares or a strong physical presence. If owning a storefront is no longer a guarantee of longevity, how can retailers adapt to the digital age and leverage big data to keep their livelihoods alive?

The digitized storefront era

While a large chunk of retail sales has shifted to digital markets, and everything seems to be moving to the cloud, there is still a need for physical storefronts offering goods without the burden of shipping time or the possibility of blindly receiving damaged products with no quick recourse. Even storefronts with a foothold in physical retail spaces tend to benefit from maintaining a digital presence, but it's no longer as simple as selling excess stock on eBay. Putting digital advantages to use means investing in analytics and big data, much as bigger online-only retailers have been doing for years.

An immediate ...


Read More on Datafloq
The Promise of Blockchain for Nonprofits: Challenges and Hopes

Individuals and companies all over the world are increasingly excited about the opportunities provided by blockchain technology. They appreciate how it offers transparency about transactions, finalizes them efficiently and is cheaper than other ways to handle money, such as transfers. People also discuss how blockchain could be promising for the nonprofit sector. However, along with its probable benefits are challenges that must be overcome for blockchain to become widely adopted in the nonprofit realm.

Blockchain Gives Nonprofits New Ways of Interacting With Donors

According to a poll published in 2015, one-third of Americans felt nonprofits fell short concerning spending money wisely. The amount spent on administrative costs was another point brought up in the survey, and half the respondents said it was very important for them to know a charity spends a low amount on things like salaries and fundraising and gives as much money as possible to the people in need.

A charity in the United Kingdom called English Heritage hopes to help donors know precisely where their money goes with the help of blockchain technology.

It partnered with a platform called Giftcoin that allows seeing what happens to funds from the time they get donated to when charities spend them. There are reportedly ...


Read More on Datafloq
How Web 3.0 Is Going to Change Data Access As We Know It

Web 2.0 is now fully entrenched in internet culture—it marked the shift from static, mostly read-only web pages to the interactive and social web, with increased user involvement, easily accessible...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What is Prescriptive Analytics and Why Should You Care

There is more data than ever in the world and in the coming years, it will only increase exponentially. However, just collecting data is not enough if you wish to become a data-driven business. For that, you need to analyse the data to gain insights, and as you are probably aware, there are multiple ways to do so, each increasing in complexity and the value they create: descriptive analytics, diagnostic analytics, predictive analytics and prescriptive analytics.

Descriptive and diagnostic analytics are similar to business intelligence; they help you understand what happened in the past, from one second ago to decades ago. Predictive analytics is all about the future and predicting what will happen. The final stage of understanding your business is prescriptive analytics, which is slowly gaining traction as the technology to support it becomes more available.
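The four stages can be illustrated with a toy sketch. Everything below is invented for illustration (real prescriptive systems use optimization engines, not a three-line rule), but it shows how each stage builds on the last:

```python
# Hypothetical weekly sales figures for a toy walk-through of the
# descriptive -> diagnostic -> predictive -> prescriptive ladder.
sales = [100, 110, 125, 138]  # past four weeks

# Descriptive: what happened?
average = sum(sales) / len(sales)

# Diagnostic: why? (here, just the week-over-week changes)
changes = [b - a for a, b in zip(sales, sales[1:])]

# Predictive: what will happen? (naive linear extrapolation)
trend = sum(changes) / len(changes)
forecast = sales[-1] + trend

# Prescriptive: what should we do? Pick the smallest stocking level
# that still covers the forecast.
stock_options = [120, 150, 180]
order = min(o for o in stock_options if o >= forecast)

print(average, forecast, order)
```

The point of the sketch is the progression: each stage consumes the previous stage's output, which is why prescriptive analytics is described as the final and most valuable step.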

Prescriptive analytics is about what to do (now) and why to do it, given a complex set of requirements, objectives and constraints. It offers recommendations on how to act upon predictions to take advantage of them and transform an organisation accordingly. It leverages predictive and descriptive analytics to derive ideal outcomes or solutions that help you solve business problems based on ...


Read More on Datafloq
10 questions machine learning engineers can expect in a job interview

Demand for machine learning engineers has exploded in the past two years, as AI development and adoption continue to grow across industries, according to a report from Indeed. These professionals are...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The human side of the data revolution

For over a decade, data has been at or near the top of the enterprise agenda. A robust ecosystem has emerged around all aspects of data (collection, management, storage, exploitation and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Behavioural biometrics: How will it improve the consumer experience?

Behavioural biometrics is a brave new world of seamless, hassle-free authentication that is set to revolutionise – and personalise – the life of consumers across the planet. In a world of increasing...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
As Real Time as it Gets – Location Data is Actionable Data

Location data has always been of interest to marketers, but before 2010 the technology to leverage it was rather limited. The last 8 years, however, have seen a boom in platforms that capture real-time location data and put it to use via geofencing – a service where an app, or similar software, uses GPS, Wi-Fi or cellular data to prompt an action (for example, sending advertisements to the user on social media) once the mobile device enters a virtual, pre-established geographical boundary.

Starbucks is a common user, with customers receiving latte coupons, for example, after crossing into virtual boundaries. This has worked quite well in many cases, but the limitations are evident. Notifying potential consumers when they cannot properly receive messages (while driving a car, for instance), as well as a limited audience reach, have been frequent critiques of geofencing.
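The check at the heart of a circular geofence is just a great-circle distance test against a radius. This is a minimal sketch; the coordinates, radius, and function names are all invented for illustration, and production systems add dwell time, hysteresis, and battery-aware location sampling on top:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))  # Earth radius ~6371 km

def inside_geofence(device, fence_center, radius_m):
    """True when the device's reported position is within the boundary."""
    return haversine_m(*device, *fence_center) <= radius_m

store = (47.6097, -122.3331)  # hypothetical shop location
print(inside_geofence((47.6099, -122.3330), store, 200))  # a few metres away
print(inside_geofence((47.7000, -122.3331), store, 200))  # roughly 10 km away
```

Crossing the boundary (the first check returning True after previously being False) is what would trigger the coupon push described above.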

Yet marketers have stayed true, understanding that while geofencing is not the be-all and end-all of pinpoint location-based marketing, it has opened the door to multipurpose, social-media-esque location-based initiatives, which are a growing segment and will eventually be a valuable resource for those marketers. The whole industry is beginning to rethink location-based data as ...


Read More on Datafloq
Das Reich Tank Losses on 6 July

On 6 July 1943, we estimate that the Das Reich SS Division lost 30 tanks damaged and destroyed. We have them starting the battle on the evening of 4 July with the following:

Pz I Command:            1
Pz II:                   0 (and 1 in repair)
Pz III short (Command):  8
Pz III short:            1
Pz III long:            52 (and 11 in repair)
Pz III Observation:      9 (not counted)
Pz III Command:          1
Pz IV long:             30 (and 2 in repair)
Pz VI:                  12 (and 2 in repair)
StuG III:               33
Marder II:               2 (and 1 in repair)
Marder III 76.2mm:       8
T-34:                   18 (and 9 in repair)

Total:                 166 (and 26 in repair)

This is from the Kursk Data Base (KDB).

Tank status report for 5 July from 4th PzA files (T313, R366, page 2209), no time given. It records:

Pz III short: 1
Pz III long: 45
Pz IV: 27
Pz VI: 12
Command: not reported
StuG: 32

This file was not used, as there are some errors in other parts of this report. The file we did use was the 5.7.43 19:40 hours report from the army files (T313, R368, page 4282), which states:

Pz III short: 1
Pz III long: 52
Pz IV long: 27
Pz VI: 11
T-34: 16
Command: 8
StuG: 21

It took me a while to find all these files, which is why this blog post was so late today.

The next report from Das Reich is at 0235 on 7 July. I am not sure why the delay, as the other divisions in the corps were submitting daily reports. They report:

Pz III short: 1
Pz III long: 47
Pz IV: 16
Pz VI: 7
Command: 6
StuG: 14

Total losses: 1 Pz IV.

We assume that this report at 0235 on 7 July is the end-of-day report for 6 July. That gives us a count of at least 47-48 tanks lost since the start of the offensive. One will note that they claim only one tank completely lost, yet there are six tanks listed as destroyed in the gully SSW of Luchki. Is this an indication that they may have been lost on subsequent days and towed there?

The Kursk Data Base records 19 tanks damaged/destroyed on the 5th and 30 tanks damaged/destroyed on the 6th. These counts include T-34 losses. On the 6th this includes 2 Pz III short (Command), 5 Pz III longs, 11 Pz IV longs (one listed as destroyed), 4 Pz VIs, 7 StuG IIIs, and 1 T-34.
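The day-to-day bookkeeping behind these counts amounts to differencing successive strength reports. A small sketch, using only the figures quoted above (the 19:40 5 July report versus the 0235 7 July report; T-34s are omitted because the later report does not list them):

```python
# Runner strengths as quoted in the two reports above.
report_5_july = {"Pz III short": 1, "Pz III long": 52, "Pz IV": 27,
                 "Pz VI": 11, "Command": 8, "StuG": 21}
report_7_july = {"Pz III short": 1, "Pz III long": 47, "Pz IV": 16,
                 "Pz VI": 7, "Command": 6, "StuG": 14}

# Decline per type over 6 July; losses here mean damaged or destroyed,
# not necessarily total write-offs.
drop = {t: report_5_july[t] - report_7_july[t] for t in report_5_july}

print(drop)
print(sum(drop.values()))  # 29 -- the KDB's 30 once the single T-34 is added
```

This is, of course, only the arithmetic skeleton; the real work was locating and reconciling the reports themselves.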

The status report for 7 July is probably the message of 8 July, timed at 0830, which lists tank status from 6.7.43. This could be a 7 July report and was used as such in the Kursk Data Base. It is from the 4th Panzer Army files (T313, R366, page 2251):

Pz III: 43
Pz IV: 25
Pz VI: 6
Command tank: 7
T-34: 14
StuG: 7 (!)

Status report for 8 July (from 4th Panzer Army files, T313, R366, page 2247):

Pz III long: 31
Pz IV long: 14
Pz VI: 0 (hard to read)
Command tank: 7
T-34: 12
StuG: 21

The report does have a handwritten figure of 45 next to it (31 + 14 = 45).

The next report of tank status we have for Das Reich is for 9 July. They report:

                 Division   Corps (1830)   Corps (1905)   4th PzA Report (2300)
Pz III short:        0
Pz III long:        31           33             31              38 (31)
Pz IV:              13           15             13              13
Pz VI:               1            1              1               1
Command tank:        7            7              7                  (7)
T-34:                7            7              7
StuG:               26           26             26              26

For the Kursk Data Base, these were the nuts-and-bolts calculations we did for all nine German panzer and panzer grenadier divisions for all 15 days of the operation. We also did the same for all 10 Soviet tank and mechanized corps. While we may have made an error here and there on a given day, we did try to count and track tank strengths and losses for every single day, even when the records were not cooperating. I believe the KDB is the most accurate accounting of tank losses at the Battle of Kursk.

Anyhow, this is related to this previous post, as I am still trying to sort out what might have occurred near the village of Luchki on 6 July 1943:

The Das Reich Valley of Death?

With six tanks reported destroyed to the SSW of Luchki, perhaps all on 6 July, and the Das Reich SS Division losing around 30 tanks on 6 July, there may have been a major fight there that is not otherwise documented.

How Companies Can Use Conversation Bots to Stop Click Fraud

Marketing is hard enough when there are no set rules. But when click fraud muddies the waters, marketers find themselves struggling to stay afloat. Click fraud is an insidious practice that has been a thorn in the sides of digital marketers for years. Click farms in countries like Russia and India hire workers to click on ads repeatedly, gaming the system and creating false data that dupes advertisers and ad companies alike. Many click farms don’t involve humans at all. Instead, click farms use software scripts to generate thousands of illegitimate clicks.
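One simplistic line of defence against the volume-based fraud described above is to flag sources whose click counts sit far above a robust baseline. This is only an illustrative sketch under invented data and an invented threshold, not how any particular ad platform actually works:

```python
from collections import Counter
from statistics import median

# Hypothetical click log: three organic visitors plus one source
# hammering an ad 500 times, the signature of a scripted click farm.
click_log = ["10.0.0.1", "10.0.0.2", "10.0.0.3"] + ["203.0.113.5"] * 500

counts = Counter(click_log)

# Use the median as the baseline: unlike the mean, it is not dragged
# upward by the very outlier we are trying to catch.
typical = median(counts.values())

# Flag any source clicking more than 10x the typical rate.
suspects = [ip for ip, n in counts.items() if n > 10 * typical]
print(suspects)  # → ['203.0.113.5']
```

Real detection pipelines layer on device fingerprints, timing patterns, and conversion behaviour, since click farms rotate IPs precisely to defeat per-source counting like this.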

Some operations take these bad deeds a step further and enter false information into forms on websites. For B2B companies with long sales cycles, this type of fraud is particularly harmful. Salespeople waste their time trying to contact nonexistent prospects while marketers are frustrated that they have to account for garbage leads in their ROI reports. This forces marketers to rejustify continued investment and experiments. Great campaigns have been cancelled because fake leads and form fills skewed conversion metrics.

When campaigns are abandoned, marketers have a hard time delivering results. Instead of growing their customer base and providing a return on investment, marketers often find themselves losing money because of click ...


Read More on Datafloq
Cognitive Electronic Warfare: Radio Frequency Spectrum Meets Machine Learning

The trend toward digital, programmable radio frequency (RF) equipment — epitomized by software-defined radio — means that radars can quickly change waveforms, creating unique signatures on the fly....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Implement AI in Your Business in the Next 6-9 Months

Artificial intelligence (AI) has made breathtaking strides in the past few years embedding itself into the consumer subconscious. Computers are using advanced data science, machine learning (ML) and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What do businesses do with the top machine learning platforms?

While the term connotes the high-tech future envisioned by sci-fi writers, the products, services and capabilities the top machine learning platforms facilitate are very common. For example,...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Transforming Master Data Management for Customer Experience

Remember when the iPhone was introduced eleven years ago? It was nothing short of a miracle. A revolutionary device that shipped with an easy to use interface and comprised of carefully crafted apps...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The 5 systemic challenges firms must overcome to derive value from data

It would be safe to say that organizations today have more data than ever before. But deriving actionable insights from the data and converting the insights into value can prove to be a daunting task...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Five Enterprise applications of Recurrent Neural Networks

Although not a new concept, Artificial Intelligence has gained significant popularity lately. The term Artificial Intelligence was coined back in the 1950s, and research around it was mostly confined to research institutions. Some of the earliest industrial applications of artificial intelligence were in optical character recognition, expert systems and industrial robots, which were mostly developed as rule-based AI systems or very basic machine learning.

What makes artificial intelligence interesting is machine learning, a subfield of AI that deals with pattern learning. An evolution of statistical methods, machine learning is a technique for teaching computers to recognize patterns within a given set of data so that they can generalize to future inputs.

Another notable method within AI is the neural network, a technique within machine learning derived from our understanding of how the human brain works. Although researchers haven't been able to fully mimic the human brain, the functioning of individual neurons has been the basis for the development of neural network techniques.
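Since the article's subject is recurrent networks, the core recurrent idea can be sketched in a few lines: the same small function is applied at every time step, carrying a hidden state forward so that earlier inputs influence later outputs. The weights below are arbitrary toy values, not anything from the article:

```python
from math import tanh

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    """One recurrent step: the new state mixes the current input
    with the previous state through a squashing nonlinearity."""
    return tanh(w_x * x + w_h * h + b)

h = 0.0                       # initial hidden state
for x in [1.0, 0.0, -1.0]:    # a toy input sequence
    h = rnn_step(x, h)        # state carries context between steps

print(h)
```

Real RNNs learn the weight values from data and use vectors rather than scalars, but the state-carrying loop is what distinguishes them from the feed-forward networks described above.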

Recently, deep learning has been gaining mainstream popularity compared to other machine learning techniques. Deep learning incorporates multiple hidden neural network layers and requires a large set of data for training. Information ...


Read More on Datafloq
How China Is Quickly Becoming an AI Superpower

Last year, China’s government put out its plan to lead the world in AI by 2030. As Eric Schmidt has explained, “It’s pretty simple. By 2020, they will have caught up. By 2025, they will be better...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Is Machine Learning Inevitable for Data Analytics?

One of the most watched developments in data analytics and business technology these days is that of machine learning. Becoming something of a business-critical technology, machine learning makes...

The 3-to-1 Rule in Recent History Books

This seems to be the rule that never goes away. I have a recent case of it being used in a history book. The book was published in English in 2017 (and in German in 2007). In discussing the preparation for the Battle of Kursk in 1943, the author states that:

A military rule of thumb says an attacker should have a superiority of 3 to 1 in order to have a chance of success. While this vague principle applies only at tactical level, the superiority could be even greater if the defender is entrenched behind fortifications. Given the Kursk salient’s fortress-like defences, that was precisely the case.

This was drawn from Germany and the Second World War, Volume VIII: The Eastern Front 1943-1944: The War in the East and on the Neighboring Fronts, page 86. This section was written by Karl-Heinz Frieser.

This version of the rule now says that you have to have a superiority of 3-to-1 in order to have a chance of success? We have done a little analysis of force ratios compared to outcomes; see Chapter 2: Force Ratios (pages 8-13) in War by Numbers. I had never heard the caveat in the second sentence that the “principle applies only at tactical level.”

This rule has been discussed by me in previous blog posts. Dr. Frieser made a similar claim in his book The Blitzkrieg Legend:

The 3-to-1 Rule in Histories

These books were written by a German author who was an officer in the Bundeswehr, so apparently this rule of thumb has spread to some of our NATO allies, or maybe it started in Germany. We really don’t know where this rule of thumb first came from. It ain’t from Clausewitz.

A Deep Dive Into Data Lakes

Data lakes are centralized storage and data repositories that allow you to work with a variety of different types of data. (Photo: Rich Miller) In the age of Big Data, we’ve had to come up with new...

Marketing Advisory Network Expands with the Addition of Maribeth Ross

Veteran marketer joins ranks of highly successful strategy firm

Boston (Sept. 4, 2018) – Marketing Advisory Network, the Boston-based marketing strategy advisory firm, today announced the addition of Maribeth Ross as Chief Operating Officer.

Maribeth is a veteran marketing executive who has long been fascinated with cracking the code of what makes customers engage with, buy from, and champion a brand. An expert in messaging for sales and demand gen, she has more than 15 years of award-winning experience driving marketing strategy and execution for high-growth companies. Prior to Marketing Advisory Network, Maribeth held global responsibility as SVP of Marketing for Monetate, a SaaS provider of personalization for consumer-facing brands. Before Monetate, Maribeth was Managing Director for the Aberdeen Group subsidiary of Harte Hanks, where she helped take Aberdeen private through its purchase by a private equity firm. Early in her career, she obtained a Six Sigma Black Belt certification, giving her a unique perspective on process and continuous improvement that she infuses into her work.

“I’m delighted that Maribeth has joined the Marketing Advisory Network team,” shares Samantha Stone, Marketing Advisory Network founder. “She brings the chops to help drive our business forward, and our clients will benefit from her no-nonsense approach to marketing.”

Maribeth is a frequent speaker on marketing best practices and a self-professed connoisseur of brick oven pizza. She lives in Massachusetts with her husband, son, and three dogs. When not in Massachusetts, she can be found enjoying all that the state of Maine has to offer.

Earlier this year, she was named among the 50 Women You Need to Know in MarTech by martechexec.com.

About Marketing Advisory Network

The Marketing Advisory Network brings together strategic planning and hands-on expertise to unleash the possible within organizations. Specializing in B2B, high-growth environments that need practical advice to drive results, the Marketing Advisory Network team combines analytical expertise with years of finely honed gut instinct to enable revenue growth, new opportunity identification, improved efficiencies, and stronger collaboration between marketing and sales functions.

Founded in 2012 by Samantha Stone, Marketing Advisory Network has been used by brands such as Monetate, Northern Light, and Click Software to develop and implement strategies and tactics that drive growth. The company’s philosophy is highlighted in Samantha’s book, “Unleash Possible: A Marketing Playbook that Drives Sales,” which has received 5-star reviews and was named a must-read book by LinkedIn and FlipMyFunnel.

###

 

Copyright © 2018 BBBT - All Rights Reserved