More on Russian Body Counts

Don’t have any resolution on the casualty counts for the fighting on 7 February, but do have a few additional newspaper reports of interest:

  1. The Guardian reports that the Russian foreign ministry says that dozens were killed or wounded.
    1. So, if 9 were killed (a figure that is probably the lowest possible count), then you would certainly get to dozens killed or wounded. As this is a conventional fight, I would be tempted to guess a figure of 3 or 4 wounded per killed, vice the 9 or 10 wounded per killed we have been getting from our operations in Iraq and Afghanistan (see War by Numbers, Chapter 15: Casualties).
    2. Guardian article is here:
    3. https://www.theguardian.com/world/2018/feb/20/russia-admits-several-dozen-its-citizens-killed-syria-fighting
  2. The BBC repeats these claims along with noting that “…at least 131 Russians died in Syria in the first nine months of 2017…”: http://www.bbc.com/news/world-europe-43125506
  3. Wikipedia does have an article on the subject that is worth looking at, even though its count halts on 3 February:
    1. https://en.wikipedia.org/wiki/Russian_Armed_Forces_casualties_in_Syria
  4. The original report was that about 100 Syrian soldiers had been killed. I still don’t know if this count of 100+ killed on 7 February is supposed to be all Russians, or a mix of Russians and Syrians. It is possible that there were 9 Russians killed among a total of more than 100 dead. On the other hand, it could also be an inflated casualty count. See: https://www.nytimes.com/2018/02/13/world/europe/russia-syria-dead.html
  5. Some counts have gone as high as 215 Russians killed: https://thedefensepost.com/2018/02/10/russians-killed-coalition-strikes-deir-ezzor-syria/

Conclusions: A significant fight happened on 7 February; at least 9 Russians were killed and clearly several dozen were wounded. There may have been over 100 killed in the fight, but we cannot find any clear confirmation of that. I am always suspicious of casualty claims, as anyone who has read my book on Kursk may note (and I think I provide plenty of examples in that book of claims that can be proven to be significantly in error).
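To put the arithmetic behind that ratio guess in concrete terms, here is a minimal back-of-envelope sketch in Python. The killed count and the wounded-per-killed ratios are the assumptions discussed above, not confirmed figures:

```python
# Back-of-envelope: implied totals for the 7 February fight, assuming the
# lowest plausible killed count and conventional-war WIA:KIA ratios.
killed = 9  # lowest plausible count

for wounded_per_killed in (3, 4):
    wounded = killed * wounded_per_killed
    print(f"{wounded_per_killed} wounded per killed -> "
          f"~{wounded} wounded, ~{killed + wounded} killed or wounded")

# 3 wounded per killed -> ~27 wounded, ~36 killed or wounded
# 4 wounded per killed -> ~36 wounded, ~45 killed or wounded
```

Either assumption comfortably clears the foreign ministry’s “dozens killed or wounded.”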

Crypto Regulations: How ICO Regulations Differ Across the Globe

For those of you who have been reading my recent articles, I have been focussing my attention on the necessity of regulations in the crypto world. The number of scams among ICOs seems to be growing, and on a daily basis I receive invitations to join another ICO as an ‘advisor’. I do not have to do anything, just give my name, and reputation, to this ‘startup’, and they will list me as an advisor on their website. Most of the time, they have no code, no community, barely a team, a badly written white paper, and hardly any information on their token or why they need a token. If I ask them several critical questions, I never hear back from them.

Still, many of these scams manage to raise millions, simply because the ‘investors’ in these ICOs do not care about the product, but simply want to make a quick buck. As may be clear, I do not believe in such ICOs and I certainly do not lend my name to these startups. Neither should anyone else, as it gives them a podium to raise money from poorly-informed ‘investors’.

Fortunately, there is some light at the end of the tunnel ...


Read More on Datafloq
Russian Army Experiments With Using Tanks For Indirect Fire

Russian Army T-90S main battle tanks. [Ministry of Defense of the Russian Federation]

Finnish freelance writer and military blogger Petri Mäkelä spotted an interesting announcement from the Ministry of Defense of the Russian Federation: the Combined-Arms Army of the Western Military District is currently testing the use of main battle tanks for indirect fire at the Pogonovo test range in the Voronezh region.

According to Major General Timur Trubiyenko, First Deputy Commander of the Western Military District Combined-Arms Army, in the course of company exercises, 200 tankers will test a combination of platoon direct and indirect fire tactics against simulated armored, lightly armored, and concealed targets up to 12 kilometers away.

Per Mäkelä, the exercise will involve T-90S main battle tanks using their 2A46 125 mm/L48 smoothbore cannons. According to the Ministry of Defense, more than 1,000 Russian Army soldiers, employing over 100 weapons systems and special equipment items, will participate in the exercises between 19 and 22 February 2018.

Tanks were used on occasion to deliver indirect fire in World War II and Korea, but it is not a common modern tactic. The use of modern fire control systems, guided rounds, and drone spotters might offer the means to make this more useful.

The Coming Intersection of HPC and the Enterprise Data Center

To extract better economic value from their data, enterprises can now more fully enable AI, machine learning and deep neural networks by integrating HPC technologies.
More Russian Body Counts

Interesting article from the Telegraph: Putin’s shock troops: how Russia’s secret mercenary army came up against the U.S. in Syria

That copy may be behind a paywall…so try this: https://firenewsfeed.com/news/1167339

A few highlights:

  1. Russian newspapers (which still maintain some independence from the government) have listed the names of 9 Russians who died in Syria on 7 February.
    1. This clearly contradicts the foreign ministry claim that 5 were killed.
  2. This is again claimed to be a battalion-sized action (500 Russians).
  3. There are some interesting conspiracy theories offered in the article as to why this Russian unit was sent in to be slaughtered. I am hesitant to explain by conspiracy something that can be explained by incompetence. There is no shortage of incompetence in warfare (or any other human affairs).
  4. Supposedly 3,000 Russians have fought for the Wagner Group in Syria since 2015.
    1. Before 7 February, there were 73 deaths (official figure is 46).
  5. It has been busy. In the last two weeks an Al-Qaeda affiliated rebel group shot down a Russian jet, Kurdish fighters downed a Turkish helicopter, Israel downed an Iranian drone and the Syrian army shot down an Israeli F-16.

This was a direct confrontation between U.S. forces and Russian-paid contractors. During the Vietnam War, some Russians were killed in our bombing of North Vietnam and other operations (16 or more Russians killed). During the Korean War, Russian pilots, posing as North Koreans, engaged in aerial combat with U.S. aircraft, in addition to manning anti-aircraft defenses (around 300 Russians killed total). But I suspect you have to go back to the Russian Civil War (1917-1921) to find a major ground action between U.S. and Russian forces. Not sure any of them were of this size.

Related articles: These 5 Proxy Battles Are Making Syria’s Civil War Increasingly Complicated

Russian Body Counts

Aerial Drone Tactics, 2025-2050

[Image: War On The Rocks.]

My previous post outlined the potential advantages and limitations of current and future drone technology. The real utility of drones in future warfare may lie in a tactic that is at once quite old and new: swarming. “‘This [drone swarm concept] goes all the way back to the tactics of Attila the Hun,’ says Randall Steeb, senior engineer at the Rand Corporation in the US. ‘A light attack force that can defeat more powerful and sophisticated opponents. They come out of nowhere, attack from all sides and then disappear, over and over.’”

In order to be effective, Mr. Steeb’s concept would require drones to be able to speed away from their adversary, or be able to hide. The Huns are described “as preferring to defeat their enemies by deceit, surprise attacks, and cutting off supplies. The Huns brought large numbers of horses to use as replacements and to give the impression of a larger army on campaign.” Long before the Huns under Attila (~400 CE) caused problems for the Roman Empire, the Scythians used similar tactics, as mentioned by Herodotus (writing c. 440 BCE). “With great mobility, the Scythians could absorb the attacks of more cumbersome foot soldiers and cavalry, just retreating into the steppes. Such tactics wore down their enemies, making them easier to defeat.” These tactics were also used by the Parthians, and resulted in the Roman defeat under Crassus at the Battle of Carrhae, 53 BCE. Clearly, maneuver is as old as warfare itself.

Indeed, others have their own ancient analogies.

Today, fighter pilots approach warfare like a questing medieval knight. They search for opponents with similar capabilities and defeat them by using technologically superior equipment or better application of individual tactics and techniques. For decades, leading air forces nurtured this dynamic by developing expensive, manned air superiority fighters. This will all soon change. Advances in unmanned combat aerial vehicles (UCAVs) will turn fighter pilots from noble combatants to small-unit leaders and drive the development of new aerial combined arms tactics.

Drone Swarms: A Game Changer?

New technologies come along, enable a fresh look at warfare, and often enable a new implementation of ancient tactics. There are some who claim that this changes the game, and indeed may change the fundamental nature of war.

Peter Singer, an expert on future warfare at the New America think-tank, is in no doubt. ‘What we have is a series of technologies that change the game. They’re not science fiction. They raise new questions. What’s possible? What’s proper?’ Mr. Singer is talking about artificial intelligence, machine learning, robotics and big-data analytics. Together they will produce systems and weapons with varying degrees of autonomy, from being able to work under human supervision to ‘thinking’ for themselves. The most decisive factor on the battlefield of the future may be the quality of each side’s algorithms. Combat may speed up so much that humans can no longer keep up. Frank Hoffman, a fellow of the National Defense University who coined the term ‘hybrid warfare’, believes that these new technologies have the potential not just to change the character of war but even possibly its supposedly immutable nature as a contest of wills. For the first time, the human factors that have defined success in war, ‘will, fear, decision-making and even the human spark of genius, may be less evident,’ he says.” (emphasis added).

Drones are highly capable, and with increasing autonomy, they themselves may be immune to fear. Technology has been progressing step by step to alter the character of war. Think of the Roman soldier and his personal experience of warfare up close vs. the modern sniper. They each have a different experience of warfare, and fear manifests itself in different ways. Unless we create and deploy fully autonomous systems, with no human in or on the loop, there will be an opportunity for fear and confusion in the human mind to creep into martial matters. And indeed, with so much new technology, friction of some sort is almost assured.

I’m not alone in this assessment. Secretary of Defense James Mattis has said “You go all the way back to Thucydides who wrote the first history and it was of a war and he said it’s fear and honor and interest and those continue to this day. The fundamental nature of war is unchanging. War is a human social phenomenon.”

Swarming and Information Dominance

Indeed, the notion of the importance of information dominance plays upon one of the most important fundamental aspects of warfare: surprise. There are many synonyms for surprise; one of the most popular these days is situational awareness (SA). In a recent assessment of trends in air-to-air combat for the Center for Strategic and Budgetary Assessments (CSBA), Dr. John Stillion described the impact of SA.

Aerial combat over the past two decades, though relatively rare, continues to demonstrate the importance of superior SA. The building blocks, however, of superior SA, information acquisition and information denial, seem to be increasingly associated with sensors, signature reduction, and networks. Looking forward, these changes have greatly increased the proportion of BVR [Beyond Visual Range] engagements and likely reduced the utility of traditional fighter aircraft attributes, such as speed and maneuverability, in aerial combat. At the same time, they seem to have increased the importance of other attributes.

Stillion, famous for his RAND briefing on the F-35, proposes an interesting concept of operations for air-to-air combat, centered on larger aircraft with bigger sensor apertures, and subsonic UCAS fighters in the “front line.” He’s got a good video to illustrate how this concept would work against an adversary.

[I]t is important to acknowledge that all of the foregoing discussion is based on certain assumptions plus analysis of past trends, and the future of aerial combat might continue to belong to fast, agile aircraft. The alternative vision of future aerial combat presented in Chapter 5 relies heavily on robust LoS [Line of Sight] data links to enable widely distributed aircraft to efficiently share information and act in concert to achieve superior SA and combat effectiveness. Should the links be degraded or denied, the concept put forward here would be difficult or impossible to implement.

Therefore, in the near term, one of the most important capabilities to enable is a secure battle network. This will be required for remotely piloted and autonomous systems alike, and it will be the foundation of information dominance – the acquisition of information for use by friendly forces, and the denial of information to an adversary.

How Blockchain is Making Data Predictions More Accessible

Thanks to developments in big data, artificial intelligence (AI), and machine learning (ML), predictive analytics is starting to become highly reliable. It’s easy to notice how Google’s search suggestions or Amazon’s recommendations seem to be reading users’ minds. Such a level of accuracy is made possible by developments in predictive technologies.

These are reaching more organisations too. Prediction tools that used to be exclusive to tech companies and research labs are now being offered as-a-service. Enterprise solutions providers like IBM and Microsoft have business intelligence offerings that include prediction features. IBM’s Cognos and Watson Analytics, for example, offer both big data and AI functionalities that enable enterprise users to generate better predictions.

The problem still is that these tools are mainly accessible to large enterprises that can invest in tools, time, and human resources. Effective predictions require well-trained data scientists who work and iterate on models, which can take weeks to accomplish. Given the resources needed, smaller businesses and organisations are often unable to benefit from the technology.

Fortunately, blockchain, as one of today’s key emerging technologies, offers the ability to democratize access to such capabilities. Ventures like Endor, Augur and Gnosis, seek to leverage blockchain’s strengths to create prediction platforms and ...


Read More on Datafloq
Five Essential Resources Needed for Big Data Analytics

Big data services run the gamut from data assessments to business strategy to implementation. These services are offered by various business organisations beyond systems integrators, VARs and IT consultants.

Originally, big data emerged as a term that describes sets of data whose size is beyond the ability of traditional databases to capture, store, manage and analyse. Nevertheless, the scope of the term has expanded considerably over the years. Big data refers not only to the data itself but also to a set of technologies that capture, store, manage and analyse big and varied data collections to solve complex issues.

Amid the real-time proliferation of data from sources like the web, mobile devices, sensors, social media, transactional apps and log files, big data has found a host of vertical market applications, ranging from fraud detection to scientific research and discovery. Despite challenges relating to privacy issues and organisational resistance, investment in big data continues to gain momentum, exceeding $57 billion in 2017 alone. That investment is further expected to grow at a CAGR of approximately ten percent over the next three years.
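As a rough check on those projections, compounding the article’s $57 billion 2017 base at a ten percent CAGR (both figures are the article’s claims, not independently verified) gives:

```python
# Compound the 2017 big data investment figure at the projected ~10% CAGR.
investment = 57e9  # 2017 base, per the article
CAGR = 0.10

for year in (2018, 2019, 2020):
    investment *= 1 + CAGR
    print(f"{year}: ~${investment / 1e9:.0f} billion")

# 2018: ~$63 billion
# 2019: ~$69 billion
# 2020: ~$76 billion
```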

Essential Resources Needed for Big Data Analytics

Big data consulting has become a viable option for software development ...


Read More on Datafloq
How Increasingly Intelligent Voice Search Is Changing SEO

While we're not quite to the point of routinely passing by Blade Runner-esque sentient beings on our way to work, AI is growing up. It's also bringing a lot of changes to our daily lives.

One of the places where our increased use of AI is most evident is in search engines. AI is changing the way search engines such as Google work - both when crawling the web and when determining what users want to find when they enter a search query.

Voice search is also becoming both more popular and more intelligent, helping users to find what they're looking for quicker and easier than ever before. 

Anyone who runs a website or a blog will need to be prepared. The way that users search for information on the web is changing, and along with it, new approaches will need to be taken when it comes to search engine optimisation (SEO). 

Below are a few of the main ways you can expect intelligent voice search to continue transforming search engines, and SEO, in 2018 and beyond.

1. Even More Mobile Searches

One of the main things you can expect from the rise of voice search and AI is that even more searches will originate from ...


Read More on Datafloq
Russian Body Counts

I found this article from Reuters to be particularly interesting: Russian toll in Syria battle was 300 killed and wounded

A few key points:

1. The clash was on 7 February near Khusham in Deir al-Zor province in Syria.

2. Last week 300 Russian contractors may have been killed or wounded in Syria.

   a. That would be 100 killed and 200 wounded according to one rumor.

   b. Or at least 80 killed

   c. Or 5 killed according to Russian officials.

3. It is probably more than 5.

   a. The wounded men have been sent to 4 Russian military hospitals.

   b. There were more than 50 patients at one hospital.

   c. There were three planeloads of injured fighters flown to Moscow.

   d. One ward contained 8 patients.

4. The unit numbered 550 men, and according to one source only about 200 of them were not casualties.

5. They were “contractors” employed by the “Wagner Group.” These guys: https://en.wikipedia.org/wiki/Wagner_Group

….
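As a quick consistency check, here is a minimal sketch splitting the rumored total under different assumed wounded-to-killed ratios; every input is an unverified press claim, not established data:

```python
# Split the rumored 300 total casualties under assumed WIA:KIA ratios.
total = 300

for wia_per_kia in (2, 3, 4):
    killed = round(total / (1 + wia_per_kia))
    print(f"{wia_per_kia}:1 wounded-to-killed -> "
          f"~{killed} killed, ~{total - killed} wounded")

# 2:1 -> ~100 killed, ~200 wounded  (the "100 and 200" rumor above)
# 3:1 -> ~75 killed, ~225 wounded
# 4:1 -> ~60 killed, ~240 wounded
```

Note that the “100 killed and 200 wounded” rumor implies a 2:1 ratio, lower than the 3:1 or 4:1 more typical of conventional fighting.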

Related article: Russia: 5 citizens probably killed by U.S. strike

Most Technology Projects Fail. What is Your Data Management Plan?

The failure rate for technology projects continues to be over 50%, and some estimate it’s even higher. While that statistic may or may not surprise you, what will surprise you is how much data management strategy (or lack thereof) contributes to that failure rate, directly or indirectly. People blame what they can see, and because poor data management happens behind the scenes, it is often a silent killer.

Many projects start out without any data management plan whatsoever. Once the system goes live, data management falls on either the application developers or the operations/support people, growing organically without any long-term plan. The project suffers as a result.

In a recent study, the Capability Maturity Model Institute (CMMI) found that data management played a part in 100% of the technology failures surveyed. It probably isn’t quite that high in reality, but poor data management does often masquerade as other problems, including scope creep, poor technology choices, poor planning, etc. Even if we play devil’s advocate and assume one of these other factors is the main cause of a technology project failure, further analysis will often reveal poor data management as a significant underlying contributor to that cause. Perhaps you could have done a better job planning if you had a better ...


Read More on Datafloq
Why AI-Powered Fake News is Very Dangerous

We live in unprecedented times where, unfortunately, things are increasingly not what they seem to be or what they should be. We have only been in this situation for less than 18 months, but it is rapidly affecting our lives on a daily basis. I am talking about fake news and how it has become one of the greatest threats to democracy, free debate and capitalism. Unfortunately, for many, fake news is not a problem at all. It is even Trump's favourite topic. But above all, there is disagreement about what constitutes fake news, how big the problem is and what to do about it. And that is a very dangerous situation to be in.

The reason for this week’s article was a piece I read on Monday about how Aviv Ovadya, Chief Technologist at the Center for Social Media Responsibility, fears the fake news crisis and is now worried about an information apocalypse. Unfortunately, the fake news crisis is a lot bigger than Russian propaganda during the US elections, which in itself is already a massive problem. Fake news is spreading to other domains, and stopping it is becoming increasingly difficult.

Disturbing Examples of Fake News

Since the inception ...


Read More on Datafloq
3 Ways Big Data Will Transform Marketing In 2018

The big data market has been anticipated to cross $203 billion by 2020, and progress so far is in complete sync with that prediction. While today's enterprises struggle to cement a stature in highly competitive setups, emerging technologies have finally earned their due and are being unlocked to their potential.

Marketing has always been a flexible area in which to experiment with newer practices, and the current wave of emerging technologies has its bets placed there.

Here are three ways marketing will change because of big data.

First Things First – Approaching Big Data Smartly

Too much data is hard to analyze, and a software-only approach is a thing of the past. Every business, no matter its scale and reach, has entirely digitalized itself, and, a step further, best-of-breed applications are the differentiator between ordinary products and products that stand out.

Enterprises are more than eager to invest in cutting-edge technology, save costs and boost their reach. Tapping Big Data is one thing; tapping it smartly is another. Thus, mastering the best AI practices to get the most out of Big Data is another parameter that will separate the best from the competition.

As ...


Read More on Datafloq
A Missing Ingredient in Alzheimer’s Disease Prevention May Lie in Brain Training Technology

At some point, many of us start to feel it: the reaching for a word, the wondering what we came for when we entered a room. It's probably not dementia; more likely a severe case of digital overload and multitasking mania.

Yet books and films like Still Alice, in which a Harvard linguistics professor descends into early onset Alzheimer's disease at age 50, can hit a little too close to home. Who are we if we can't remember?

Memories, Like the Corners of My Mind…

In Memory and Emotion: The Making of Lasting Memories, memory expert James L. McGaugh of the University of California/Irvine writes: "We are, after all, our memories. Our memory provides us with an autobiographical record and enables us to understand and react appropriately to changing experiences. Memory is the 'glue' of our personal existence."

There are some surprising causes of memory loss, including sleep apnea, vitamin B12 deficiency (which can lead to pernicious anaemia), medications, and urinary tract infections, which can mimic dementia in the elderly. But a lot of it comes down to mental fitness.

What's a senior to do? Brain training.

How to Train, and Retrain, Your Brain

Just as we join gyms and work with personal trainers to keep our ...


Read More on Datafloq
Why More Restaurants Are Turning to Big Data

Businesses are increasingly depending on big data to gain insights that would be too time-consuming or impossible to retrieve through other methods. It’s not surprising, then, that restaurant brands are following that lead. There are numerous reasons why they do so, and the following are some of the most prominent.

Boost Customer Experiences

Data gathered by the NPD Group warns of a two-percent decline in growth for full-service restaurants and no growth for quick-service restaurants.

Another survey carried out by Reuters/Ipsos at the beginning of 2017 found that a third of adults in the United States said they were eating out less frequently than during the three months before the poll. In most cases, the change was due to price increases.

Even if restaurants aren’t willing to make menu items cost less, they’re investing in big data that could make the overall experiences more pleasant for the people who choose to spend their hard-earned dollars dining out.

Chomp, a Rhode Island burger restaurant, extracts a wide variety of information from compiled data. All of it could help people have more favorable opinions of restaurants and what it’s like to eat at them.

For example, it’s possible to look at the occupancy of a restaurant per hour ...


Read More on Datafloq
The Changing Face of Social Media Marketing through Big Data

Big data and analytics are an integral part of social media marketing. Ever since millions of people started using networks, it became apparent to marketers that data is a valuable resource for advertising. On no network has this been more prevalent than Facebook, the network with the most users and the highest ad revenue in the world.

Hubspot pinpoints 2009 as the year Facebook realized the potential of targeted ads based on geographic and language data of its users. Businesses then gained the power to set up their own pages and manage their own ads, and Facebook’s revenue shot up to $777 million, while user count rose to 350 million. As of June 2017, that user count was 2 billion.

Since 2009, a host of advertising options have popped up on Facebook, including sponsored stories, mobile ads, and the Social Graph, which provides marketers with data when a user likes a post or engages in other ways. And since then, other networks have emerged as big sites for various forms of marketing, including Instagram, where influencer marketing is a huge draw.

Influencer marketing is one of the top social media marketing trends for 2018 (continue reading and you’ll see why this is very ...


Read More on Datafloq
Air Power and Drones, 2025-2050

[Credit: Financial Times]

In the recently issued 2018 National Defense Strategy, the United States acknowledged that “long-term strategic competitions with China and Russia are the principal priorities for the Department [of Defense], and require both increased and sustained investment, because of the magnitude of the threats they pose to U.S. security and prosperity today, and the potential for those threats to increase in the future.”

The strategy statement lists technologies that will be focused upon:

The drive to develop new technologies is relentless, expanding to more actors with lower barriers of entry, and moving at accelerating speed. New technologies include advanced computing, “big data” analytics, artificial intelligence, autonomy, robotics, directed energy, hypersonics, and biotechnology—the very technologies that ensure we will be able to fight and win the wars of the future… The Department will invest broadly in military application of autonomy, artificial intelligence, and machine learning, including rapid application of commercial breakthroughs, to gain competitive military advantages.” (emphasis added).

Autonomy, robotics, artificial intelligence and machine learning…these are all related to the concept of “drone swarms.” TDI has reported previously on the idea of drone swarms on land. There is indeed promise in many domains of warfare for such technology. In testimony to the Senate Armed Services Committee on the future of warfare, Mr. Bryan Clark of the Center for Strategic and Budgetary Assessments argued that “America should apply new technologies to four main areas of warfare: undersea, strike, air and electromagnetic.”

Drones have certainly transformed the way that the U.S. wages war from the air. The Central Intelligence Agency (CIA) innovated, deployed and fired weapons from drones first against the Taliban in Afghanistan, less than one month after the 9/11 attacks against the U.S. homeland. Most drones today are airborne, partly because it is generally easier to navigate in the air than it is on the land, due to fewer obstacles and more uniform and predictable terrain. The same is largely true of the oceans, at least the blue water parts.

Aerial Drones and Artificial Intelligence

It is important to note that the drones in active use today by the U.S. military are actually remotely piloted Unmanned Aerial Vehicles (UAVs). With the ability to fire missiles since 2001, one could argue that these crossed the threshold into Unmanned Combat Aerial Vehicles (UCAVs), but nonetheless, they have a pilot—typically a U.S. Air Force (USAF) member, who would very much like to be flying an F-16, rather than sitting in a shipping container in the desert somewhere safe, piloting a UAV in a distant theater of war.

Given these morale challenges, work on autonomy is clearly underway. Let’s look at a forecast from The Economist, which follows the development of artificial intelligence (AI) in both the commercial and military realms.

A distinction needs to be made between “narrow” AI, which allows a machine to carry out a specific task much better than a human could, and “general” AI, which has far broader applications. Narrow AI is already in wide use for civilian tasks such as search and translation, spam filters, autonomous vehicles, high-frequency stock trading and chess-playing computers… General AI may still be at least 20 years off. A general AI machine should be able to carry out almost any intellectual task that a human is capable of.” (emphasis added)

Thus, it is reasonable to assume that the U.S. military (or others) will not field a fully automated drone, capable of prosecuting a battle without human assistance, until roughly 2038. This means that in the meantime, a human will be somewhere “in” or “on” the loop, making at least some of the decisions, especially those involving deadly force.

[Credit: The Economist]

Future Aerial Drone Roles and Missions

The CIA’s initial generation of UAVs was armed in an ad-hoc fashion; further innovation was spurred by the drive to seek out and destroy the 9/11 perpetrators. These early vehicles were designed for intelligence, reconnaissance, and surveillance (ISR) missions. In this role, drones have some big advantages over manned aircraft, including the ability to loiter for long periods. They are not quick, not very maneuverable, and as such are suited to operations in permissive airspace.

The development of UCAVs has allowed their integration into strike (air-to-ground) and air superiority (air-to-air) missions in contested airspace. UCAV strike missions could target and destroy land and sea nodes in command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) networks in an attempt to establish “information dominance.” They might also be targeted against assets like surface to air missiles and radars, part of an adversary anti-access/area denial (A2/AD) capability.

Given the sophistication of Russian and Chinese A2/AD networks and air forces, some focus should be placed upon developing the more capable and advanced drones required to defeat these challenges. One example comes from Kratos, a drone maker, as reported in Popular Science.

Concept art for Mako combat drone. Based on the existing BQM-167 aerial target, this drone can maneuver at forces that could kill a human pilot [Image courtesy of Kratos/Popular Science]

The Mako drone pictured above has much higher performance than some other visions of future drone swarms, which look more like paper airplanes. Given their size and numbers, such drones might be difficult to shoot down entirely, and they might be able to operate reasonably well within contested airspace. But they are not well suited for air-to-air combat, as they will not have the weapons or the speed necessary to engage current manned aircraft in use with potential enemy air forces. Left unchecked, an adversary’s current fighters and bombers could easily avoid these types of drones and prosecute their own attacks on vital systems, installations and facilities.

The real utility of drones may lie in the unique tactic for which they are suited, swarming. More on that in my next post.

5 Things that Keep Your CTO Up at Night

It is not easy being a CTO, especially considering all the threats from hackers coming from every possible direction, trying to get their hands on that priceless data. Perhaps what is most nerve-racking is that even one security breach can turn a respectable company into a laughingstock. Here are some of the reasons CTOs around the country are having trouble sleeping at night.

1. DevOps Data Breaches

This past November, a massive data breach at a U.K. recruiting agency exposed the personal information of 780,000 clients. Hackers were able to access the data located on the agency’s development server, which was used by the agency’s IT provider. The American College of Cardiology as well as parenting retailer Kiddicare had similar incidents last year. And most notably, the Equifax data breach, which disclosed the personal information of 143 million Americans, could be the worst to date. Read also: What’s Wrong with Big Data



Such incidents have become more common and are giving CTOs everywhere nightmares, along with concerns that oversights in their DevOps implementations could be the newest route for data breaches in 2018. DevOps is quickly emerging as a weak link in the security chain and in the rush to continuously innovate, ...


Read More on Datafloq
Could You Fall In Love With A Robot?

Forget stealing your job... in the future, a robot may well steal your heart.

Technology is leaving no stone unturned when it comes to the impact it is having on all aspects of modern life, including relationships.

Long gone are the days of finding love in the traditional sense, with people opting to use apps like Tinder or websites like match.com, where algorithms work to pair you up with potential suitors given your personality data. No more is the spontaneity of meeting someone over coffee or through friends something that rouses excitement of the unknown, where you have to invest time in finding out what someone is really like. Thanks to social media, you can find out more about your potential partner, their history and more before even meeting them!

Relationships have become more of a commodity than ever before. With a simple swipe left or right dictated by looks and a bio, many might be missing out on the real thing. But, we’re seeing the beginnings of an even more dramatic shift in the way in which we find love. Film and TV have explored the subject, with chatbot love in Her, romantic interests expressed by robot Ava in Ex Machina, ...


Read More on Datafloq
3 Major Data Usage Cloud computing Trends For 2018

Just a couple of years back, cloud computing became one of the mainstream technologies in the world of business. A study conducted by the Harvard Business School found that over 85% of companies were using cloud technology in one way or another. However, the number of businesses that are making use of this technology is set to increase in 2018. Technological advancements popping up at almost breakneck speed are forcing business enterprises to adopt them to stay competitive or risk being pushed out of business. Cloud technology might be the trend that takes the business world of the 21st-century market economy by storm.

Technologies related to the cloud, such as DevOps, microservices architecture and containers, are set to change at a breakneck pace. Cloud computing enables government databases and business enterprises to eliminate the need to store their information and sensitive data on traditional storage devices such as flash and hard drives. With the cloud, data is stored in millions of virtual servers which can be accessed at any time and in any geographical location with internet access. This has improved the flexibility of government and business ...


Read More on Datafloq
Why We Will Love AI More Than Our Pets

We will love Artificial Intelligence ... love as in ‘bond emotionally’: an AI Love Affair.

Yes ... and you will say that I am crazy. Well, maybe. Time will tell. But I'm dead serious. In a few years' time, many of us will not be able to live without AI anymore, just like today we cannot do without our smartphones. And not only that. We will love it, him, her. Yes, love. As in 'amore'.

The first signs are here already. People are daydreaming about passionate romances with Siri and Alexa. People feel pain when they see robots hurt themselves. Elderly people with dementia tend to bond emotionally with social bots. Children play games and make jokes with Siri and Alexa as if they were friends. Young children easily become best friends with chatbots, often unaware that there is not a human inside.

It is how our brain works

Over time, smart AI assistants - think of Siri, Alexa, Jibo today - develop more and more complex behaviour that we, in turn, will familiarise ourselves with. We get used to them, we grow accustomed to them, just like we got used to our smartphones. AI will be part of our daily life. And it will be as if our smart assistants have a ...


Read More on Datafloq
Preparing Your Dataset for Machine Learning: 8 Basic Techniques That Make Your Data Better

There’s a good story about bad data told by Martin Goodson, a data science consultant. A healthcare project aimed to cut costs in the treatment of patients with pneumonia. It employed machine learning (ML) to automatically sort through patient records to decide who had the lowest death risk and could take antibiotics at home, and who was at high risk of death from pneumonia and should be in the hospital. The team used historical data from clinics, and the algorithm was accurate.

But there was an important exception. One of the most dangerous conditions that may accompany pneumonia is asthma, and doctors always send asthmatics to intensive care, resulting in minimal death rates for these patients. So, the absence of asthmatic death cases in the data made the algorithm assume that asthma isn’t that dangerous during pneumonia, and in all cases the machine recommended sending asthmatics home, even though they actually had the highest risk of pneumonia complications.
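The trap generalizes: when an intervention (here, intensive care for asthmatics) suppresses an outcome in the historical data, a naive model learns the effect of the intervention as if it were low underlying risk. A toy sketch with hypothetical numbers, not data from the actual project:

```python
# Toy illustration of treatment-induced bias: asthmatics were always
# hospitalized, so their *observed* mortality is low, and a naive
# risk-by-observed-mortality model ranks them as safe to send home.
records = (
    [("no_asthma", "died")] * 100 + [("no_asthma", "survived")] * 900 +
    [("asthma", "died")] * 2 + [("asthma", "survived")] * 198
)

def observed_mortality(group: str) -> float:
    outcomes = [outcome for g, outcome in records if g == group]
    return outcomes.count("died") / len(outcomes)

print(f"no asthma: {observed_mortality('no_asthma'):.1%}")  # 10.0%
print(f"asthma:    {observed_mortality('asthma'):.1%}")     # 1.0%
# The low asthmatic rate reflects the treatment, not the risk; a triage
# rule built on these numbers sends the highest-risk patients home.
```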

ML depends heavily on data. It’s the most crucial aspect that makes algorithm training possible and explains why machine learning became so popular in recent years. But regardless of your actual terabytes of information and data science expertise, if you can’t make sense of data records, a ...


Read More on Datafloq
Attrition In Future Land Combat

Soldiers with Battery C, 1st Battalion, 82nd Field Artillery Regiment, 1st Brigade Combat Team, 1st Cavalry Division maneuver their Paladins through Hohenfels Training Area, Oct. 26. Photo Credit: Capt. John Farmer, 1st Brigade Combat Team, 1st Cav

[This post was originally published on June 9, 2017]

Last autumn, U.S. Army Chief of Staff General Mark Milley asserted that “we are on the cusp of a fundamental change in the character of warfare, and specifically ground warfare. It will be highly lethal, very highly lethal, unlike anything our Army has experienced, at least since World War II.” He made these comments while describing the Army’s evolving Multi-Domain Battle concept for waging future combat against peer or near-peer adversaries.

How lethal will combat on future battlefields be? Forecasting the future is, of course, an undertaking fraught with uncertainties. Milley’s comments undoubtedly reflect the Army’s best guesses about the likely impact of new weapons systems of greater lethality and accuracy, as well as improved capabilities for acquiring targets. Many observers have been closely watching the use of such weapons on the battlefield in the Ukraine. The spectacular success of the Zelenopillya rocket strike in 2014 was a convincing display of the lethality of long-range precision strike capabilities.

It is possible that ground combat attrition in the future between peer or near-peer combatants may be comparable to the U.S. experience in World War II (although there were considerable differences between the experiences of the various belligerents). Combat losses could be heavier. It certainly seems likely that they would be higher than those experienced by U.S. forces in recent counterinsurgency operations.

Unfortunately, the U.S. Defense Department has demonstrated a tenuous understanding of the phenomenon of combat attrition. Despite wildly inaccurate estimates for combat losses in the 1991 Gulf War, only modest effort has been made since then to improve understanding of the relationship between combat and casualties. The U.S. Army currently does not have either an approved tool or a formal methodology for casualty estimation.

Historical Trends in Combat Attrition

Trevor Dupuy did a great deal of historical research on attrition in combat. He found several trends that had strong enough empirical backing that he deemed them to be verities. He detailed his conclusions in Understanding War: History and Theory of Combat (1987) and Attrition: Forecasting Battle Casualties and Equipment Losses in Modern War (1995).

Dupuy documented a clear relationship over time between increasing weapon lethality, greater battlefield dispersion, and declining casualty rates in conventional combat. Even as weapons became more lethal, greater dispersal in frontage and depth among ground forces led daily personnel loss rates in battle to decrease.

The average daily battle casualty rate in combat has been declining since 1600 as a consequence. Since battlefield weapons continue to increase in lethality and troops continue to disperse in response, it seems logical to presume the trend in loss rates continues to decline, although this may not necessarily be the case. There were two instances in the 19th century where daily battle casualty rates increased—during the Napoleonic Wars and the American Civil War—before declining again. Dupuy noted that combat casualty rates in the 1973 Arab-Israeli War remained roughly the same as those in World War II (1939-45), almost thirty years earlier. Further research is needed to determine if average daily personnel loss rates have indeed continued to decrease into the 21st century.

Dupuy also discovered that, as with battle outcomes, casualty rates are influenced by the circumstantial variables of combat. Posture, weather, terrain, season, time of day, surprise, fatigue, level of fortification, and “all-out” efforts affect loss rates. (The combat loss rates of armored vehicles, artillery, and other weapons systems are directly related to personnel loss rates, and are affected by many of the same factors.) Consequently, yet counterintuitively, he could find no direct relationship between numerical force ratios and combat casualty rates. Combat power ratios which take into account the circumstances of combat do affect casualty rates; forces with greater combat power inflict higher rates of casualties than less powerful forces do.

Winning forces suffer lower rates of combat losses than losing forces do, whether attacking or defending. (It should be noted that there is a difference between combat loss rates and numbers of losses. Depending on the circumstances, Dupuy found that the numerical losses of the winning and losing forces may often be similar, even if the winner’s casualty rate is lower.)

Dupuy’s research confirmed the fact that the combat loss rates of smaller forces are higher than those of larger forces. This is in part because smaller forces have a larger proportion of their troops exposed to enemy weapons; combat casualties tend to be concentrated in the forward-deployed combat and combat support elements. Dupuy also surmised that Prussian military theorist Carl von Clausewitz’s concept of friction plays a role in this. The complexity of interactions between increasing numbers of troops and weapons simply diminishes the lethal effects of weapons systems on real world battlefields.

Somewhat unsurprisingly, higher quality forces (that better manage the ambient effects of friction in combat) inflict casualties at higher rates than those with less effectiveness. This can be seen clearly in the disparities in casualties between German and Soviet forces during World War II, Israeli and Arab combatants in 1973, and U.S. and coalition forces and the Iraqis in 1991 and 2003.

Combat Loss Rates on Future Battlefields

What do Dupuy’s combat attrition verities imply about casualties in future battles? As a baseline, he found that the average daily combat casualty rate in Western Europe during World War II for divisional-level engagements was 1-2% for winning forces and 2-3% for losing ones. For a divisional slice of 15,000 personnel, this meant daily combat losses of 150-450 troops, concentrated in the maneuver battalions (The ratio of wounded to killed in modern combat has been found to be consistently about 4:1. 20% are killed in action; the other 80% include mortally wounded/wounded in action, missing, and captured).

It seems reasonable to conclude that future battlefields will be less densely occupied. Brigades, battalions, and companies will be fighting in spaces formerly filled with armies, corps, and divisions. Fewer troops mean fewer overall casualties, but the daily casualty rates of individual smaller units may well exceed those of WWII divisions. Smaller forces experience significant variation in daily casualties, but Dupuy established average daily rates for them as shown below.

For example, based on Dupuy’s methodology, the average daily loss rate unmodified by combat variables for brigade combat teams would be 1.8% per day, battalions would be 8% per day, and companies 21% per day. For a brigade of 4,500, that would result in 81 battle casualties per day, a battalion of 800 would suffer 64 casualties, and a company of 120 would lose about 25 troops. These rates would then be modified by the circumstances of each particular engagement.
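That baseline arithmetic can be reproduced directly. A minimal sketch using the unmodified rates and notional strengths from the paragraph above, with the ~20% killed-in-action share taken from the 4:1 wounded-to-killed verity noted earlier:

```python
# Dupuy's unmodified average daily loss rates applied to unit strengths.
units = {
    "brigade combat team": (4500, 0.018),
    "battalion":           (800,  0.08),
    "company":             (120,  0.21),
}

for name, (strength, daily_rate) in units.items():
    losses = strength * daily_rate
    killed = losses * 0.20  # ~20% of battle casualties are killed in action
    print(f"{name}: ~{losses:.0f} casualties/day (~{killed:.0f} KIA)")

# brigade combat team: ~81 casualties/day (~16 KIA)
# battalion: ~64 casualties/day (~13 KIA)
# company: ~25 casualties/day (~5 KIA)
```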

Several factors could push daily casualty rates down. Milley envisions that U.S. units engaged in an anti-access/area denial environment will be constantly moving. A low density, highly mobile battlefield with fluid lines would be expected to reduce casualty rates for all sides. High mobility might also limit opportunities for infantry assaults and close quarters combat. The high operational tempo will be exhausting, according to Milley. This could also lower loss rates, as the casualty inflicting capabilities of combat units decline with each successive day in battle.

It is not immediately clear how cyberwarfare and information operations might influence casualty rates. One combat variable they might directly impact would be surprise. Dupuy identified surprise as one of the most potent combat power multipliers. A surprised force suffers a higher casualty rate, and surprisers enjoy lower loss rates. Russian combat doctrine emphasizes using cyber and information operations to achieve surprise, and forces with degraded situational awareness are highly susceptible to it. As Zelenopillya demonstrated, surprise attacks with modern weapons can be devastating.

Some factors could push combat loss rates up. Long-range precision weapons could expose greater numbers of troops to enemy fires, which would drive casualties up among combat support and combat service support elements. Casualty rates historically drop during nighttime hours, but modern night-vision technology and persistent drone reconnaissance will likely enable continuous night and day battle, which could result in higher losses.

Drawing solid conclusions is difficult, but the question of future battlefield attrition is far too important not to be studied with greater urgency. Current policy debates over whether or not the draft should be reinstated and the proper size and distribution of manpower in the active and reserve components of the Army hinge on getting this right. The trend away from mass on the battlefield means that there may not be a large margin of error should future combat forces suffer higher combat casualties than expected.

Conversational UIs: Rise of Chatbots

Conversational interfaces such as voice interfaces and chatbots are replacing the customer care executives who used to solve our queries on long, unending calls. They are the new-age assistants who are revolutionising the way brands interact with consumers. Some businesses have chatbots or other conversational UIs as their primary mode of interaction with their online traffic due to the advantages they have over human employees.

Chatbots are certainly here to stay, as they make the process far easier with just a simple text message and save us the time we would otherwise spend figuring out complex apps and websites.

Following are some of the benefits of chatbots as conversational interfaces:

Gauging user attention

Unlike the usual customer care calls, where you have to wait while information is fed into the system manually and processed, chatbots give crisp and precise information instantly, eliminating all the unnecessary detail. Also, chatbots ask short, to-the-point questions, which prevents users from getting distracted, thereby saving overall turnaround time.

Chatbots are mostly designed with in-built options that are presented to users when they ask a certain question. This further reduces the probability of giving users any incorrect information.
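A minimal sketch of that in-built-options pattern (a hypothetical flow, not any particular vendor's product): constraining replies to a fixed menu keeps answers crisp and avoids misinterpreting free text.

```python
# Hypothetical menu-driven chatbot turn: free text is redirected to a
# fixed option list, so the bot never guesses at ambiguous input.
MENU = {
    "1": ("Track my order", "Your order is out for delivery."),
    "2": ("Update delivery address", "Please enter the new address."),
    "3": ("Talk to a human", "Connecting you to an agent..."),
}

def chatbot_turn(user_input: str) -> str:
    if user_input in MENU:
        return MENU[user_input][1]  # crisp, precise canned answer
    options = "\n".join(f"{key}. {label}" for key, (label, _) in MENU.items())
    return "Please choose an option:\n" + options

print(chatbot_turn("hello"))  # re-prompts with the fixed menu
print(chatbot_turn("1"))      # -> "Your order is out for delivery."
```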

Better cost effectiveness

With chatbots, major ...


Read More on Datafloq
The Stellar Rise and Murky Future of the XRP

The recent boom of Bitcoin has brought cryptocurrencies into the limelight of media and business attention. A previously little-known token called XRP has taken center stage and is now one of the front-runners. It is a cryptocurrency linked to Ripple’s blockchain, which is marketed as a global, real-time settlement network.

The increase in value was impressive in 2017, to say the least, and this rising tide has prompted some to claim that the sky is the limit for this altcoin. Others, however, see its Icarus-like downfall as inevitable.



Two sides of the coin

Bitcoin has paved the way for many other altcoins to enter the big game, and it’s natural that people compare XRP to the top dog. Namely, it’s clear that Ripple’s market capitalization (it peaked at $148 billion) is playing catch-up with Bitcoin. However, putting them in the same pot can be misleading.

Yes, both have undergone a surge and represent hot cryptocurrencies, but they don’t have that many similarities. In fact, XRP is different from Bitcoin in a few crucial ways. First of all, unlike its big brother, it’s not created in accordance with a particular timeline, via the process of mining on hubs like 2miners.

Instead of ...


Read More on Datafloq
Are Data Lakes Fake News?

The problem with the data lake

Are data lakes fake news? The quick answer is yes, and in this post I will show you why.

The biggest problem I have with data lakes is that the term has been overloaded with meanings by vendors and analysts. Sometimes it seems that anything that does not fit into the traditional data warehouse architecture falls under the catch-all phrase of the data lake. The result is an ill-defined and blurry concept. We all know that blurred terminology leads to blurred thinking, which in turn leads to poor decisions.

I have come across various definitions for a data lake. We will discuss all of them in this post. Sometimes people refer to only one of these ideas when talking about a data lake; other times they mix and match these concepts. Some people mean all of the things below when they refer to a data lake. Others are more selective.

The Data Lake as a Raw Data Reservoir

This is the original meaning of a data lake. In this definition, the data lake is not too dissimilar to a staging area in a data warehouse. In a staging area, we make a copy of the ...
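A minimal sketch of that raw-reservoir idea, assuming a simple file-based lake; the paths and source names are illustrative assumptions, not a reference architecture:

```python
# Land an immutable, date-partitioned copy of a source extract in the
# lake's raw zone before any cleansing or modelling happens.
import shutil
from datetime import date
from pathlib import Path

def land_raw_copy(source_file: str, lake_root: str, source_system: str) -> Path:
    partition = Path(lake_root) / "raw" / source_system / date.today().isoformat()
    partition.mkdir(parents=True, exist_ok=True)
    target = partition / Path(source_file).name
    shutil.copy2(source_file, target)  # byte-for-byte copy, no transformation
    return target

# land_raw_copy("orders.csv", "/data/lake", "erp")
# -> /data/lake/raw/erp/2018-02-21/orders.csv  (date at time of landing)
```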


Read More on Datafloq
FY2018-FY2019 Defense Budget

I gather we finally have a defense budget in place and it runs through September 2019 (there is no requirement to pass a budget for only one year). It is an $80 billion boost above spending caps for this year and $85 billion above spending caps for FY2019, a total of $165 billion above spending caps. The U.S. defense budget was already at least $35 billion over the spending cap. The budget request for FY2017 was initially $583 billion. I gather the budget for FY2018 is the BCA budget cap figure of $549 billion + $80 billion = $629 billion. Don’t quote me on this.

See:

  1. https://www.yahoo.com/news/fiscal-hawk-paul-delays-senate-vote-budget-deal-002451918–business.html
  2. https://www.yahoo.com/news/congress-passes-massive-spending-deal-103529004.html

I did note that Senator Rand Paul in his brief filibuster speech last night said that we were involved in seven wars. Off the top of my head, I only count six:

  1. Afghanistan
  2. Iraq
  3. Syria
  4. Libya
  5. Somalia
  6. “Trans-Sahara” (Mali and Niger)

Makes me wonder which war I am missing (perhaps he is counting Yemen).

Also of note was that we ended up halting a battalion-sized attack in Syria by the Syrian government. See: http://www.businessinsider.com/us-syria-killed-100-russian-syrian-backed-fighters-2018-2

A few highlights:

  1. More than 100 Syrian soldiers are claimed to have been killed.
    1. Syrians claim they lost 7 killed and 27 injured.
    2. One SDF (Syrian Democratic Forces) member was injured.
  2. The Syrian attack included 122mm Howitzers, multiple launch rocket systems, T-55 and T-72 tanks.
  3. U.S. responded with Air Force AC-130 gunships, F-15s, F-22s, Army Apache gunships, Marine Corps artillery, HIMARS (our Katyusha) and MQ-9 drones.
Is Real-Time Analytics A Money Pit?

Certainly, it is important to have analytics available in the timeframe needed for making decisions. For many years, it was too difficult and expensive to execute analytics anywhere near real-time and so everything was done using infrequent batch processes. As processing power has increased exponentially and costs have dropped to unprecedented levels, it is feasible to perform a wide array of enterprise analytics on a near real-time basis. However, many organizations today are vastly over-utilizing real-time analytics and are paying a price for it that, unfortunately, isn’t always recognized.

Forget Real Time. Focus on Decision Time!

Naturally, I am a proponent of ensuring that business decisions are made in a timely fashion. However, many decisions do not need to be made in anything approaching real time. Just because something can be analyzed in real-time does not mean that it should be analyzed in real-time. Creating real-time processes where they aren’t needed leads to a lot of additional complexity and cost. Careful consideration is needed to decide how fast or frequently a given analytics process should run.

The first step is to identify the true speed required to meet business needs. I like to call this required speed “Decision Time”. While you might be able to make a ...


Read More on Datafloq
Cognitive computing: Moving From Hype to Deployment

Cognitive computing: Moving From Hype to Deployment

Although cognitive computing, which is often referred to as AI or Artificial Intelligence, is not a new concept, the hype surrounding it and the level of interest in it are definitely new. The combination of talk of robot overlords, vendor marketing and concerns regarding job losses has fueled the hype to where we stand now.

But, behind the cloud of hype that currently surrounds the technology, there lies a potential for increased productivity, the ability to solve problems deemed too complex for the average human brain and better knowledge-based transactions and interactions with consumers. I recently got a chance to catch up with Dmitri Tcherevik, who is the CTO of Progress, about this disruption and we had a healthy discussion which led to the following insights.

Cognitive computing is considered marketing jargon by many, but in layman’s terms, it is used to define the ability of computers to replicate or simulate human thought processes. The processes behind cognitive computing may make use of the same principles as AI, including neural networks, machine learning, contextual awareness, sentiment analysis, and natural language processing. However, there is a minute difference between the two.

Difference between Cognitive Computing and AI

Both AI and Cognitive Computing ...


Read More on Datafloq
Virtual Reality and Future Of Gaming

Virtual Reality and Future Of Gaming

Virtual reality (VR) is the new big thing in gaming, offering a totally immersive experience, along with high definition, that gamers are not accustomed to. The saturation and functionality of VR have increased by leaps and bounds in recent years. The introduction of virtual reality headsets like the Oculus Rift, HTC Vive, and the PlayStation VR has created a huge interest in virtual reality’s potential.

VR applications are evolving at a fast pace. They’re moving from education to healthcare to pure entertainment. Contrary to what was previously believed, gaming is directly related to the social behaviours of a society and to technological advancements. When all three come together to connect, we’re more capable of analysing, then transforming, the way we behave and interact. Our minds are broadened and constantly challenged by the new techniques, allowing game developers to construct games with a better understanding of the gaming experience.

Virtual reality is going to open new horizons both for the gaming industry and for individual games. It is building upon existing phenomena and moving forward at a fast pace, creating amazing technological advancements along the way. VR and VR gaming have been around since the ’90s. Where before it had an ambiguous, uncertain future, we ...


Read More on Datafloq
Artificial Intelligence And Its Role In Mobile App Development And Businesses

Artificial Intelligence And Its Role In Mobile App Development And Businesses

The birth of artificial intelligence has brought a whole new era of great mobile app potential. For several years now, mobile app developers have adopted artificial intelligence in their innovations. For instance, Apple's Siri has been in use for quite a long time now and still has the potential to keep transforming in the near future. Machine learning is developing fast, and users now require flexible algorithms for an enhanced experience. The advancement and availability of machine learning and AI are causing major changes in the manner in which users, businesses and developers appreciate the interactions within mobile applications.

What is Artificial Intelligence?

This is a branch of science that focuses on developing and designing intelligent machines that react, work and think like humans. An artificial intelligence solution is a way of imparting intelligence to machines so they can solve problems on their own more accurately, efficiently and quickly.

Big companies like Uber, Amazon, eBay and others have successfully adopted AI into their business models. AI can boost a company's competence because it offers personalised, relevant and seamless customer services and experiences. Data collected by AI in mobile apps helps in understanding customer behaviour. This improves customer retention because it builds customer interaction. ...


Read More on Datafloq
How Big Data & Analytics Are Changing the Logistics Sector

How Big Data & Analytics Are Changing the Logistics Sector

Since its inception half a decade ago, Big Data has revolutionised everything from football to transportation, and logistics is one of the fields where big data brings incredible potential. The complex and dynamic nature of logistics makes it the perfect use-case for big data application, and it is changing the way a lot of organisations operate. It has changed the way we collect, process and analyse data, and its disruption is so essential that some even call it “the electricity of the 21st century”, a new kind of power that is transforming businesses, governments and private life.

Information Processing, the Rise of Analytics Platforms

Big data is characterised by the “3V’s” – volume, velocity and variety. The volume is rather self-explanatory: big data wouldn’t be “big” if it weren’t for the huge amount of data collected. The velocity stands for the speed at which data gets generated, while the variety refers to the diversity of sources the data is retrieved from. All this is what really disrupts the logistics sector, which had long been driven by statistics and performance indicators and is now operating with more real-time analytics.

The rise of the cloud platform storage made it much easier to store ...


Read More on Datafloq
How Big Data Is Improving Temperature Control in the Food Industry

How Big Data Is Improving Temperature Control in the Food Industry

Traceability is one of the most important factors or elements in the food and beverage industry. From the time a product or good leaves the source and is transported to a distributor, it undergoes many processes and separate handlers. Because of that, it’s important that all involved parties can track and monitor the status of an item or goods.

Why Is Monitoring Important?

You see, food or beverages can be contaminated at any point of the supply chain. Being able to pinpoint who, what, how and why is crucial to fixing major health and contamination issues. More importantly, monitoring is vital to maintaining the quality and safety of those goods. Keeping foods at the appropriate temperature no matter where they are — or how they’re being transported and stored — is necessary. With conventional systems, however, it’s difficult to track this information, especially when the goods are constantly on the move.

That’s where big data can come in handy.

When locations or goods are equipped with a myriad of sensors and data tracking tools, all players in the supply chain can help keep an eye on optimal conditions. This opens up the opportunity for many new scenarios, including the option to identify and remedy temperature control ...


Read More on Datafloq
Blockchain and the Internet of Things: A Winning Combination

Blockchain and the Internet of Things: A Winning Combination

Blockchain, the distributed ledger that eliminates any need for a central authority, has been gaining momentum over the last year. From being the technology that backed a cryptocurrency to becoming mainstream for secure and robust recording of transactions, Blockchain has come a long way.

Blockchain technology is changing all aspects of doing digital business. Another technology that has been altering business operations and giving birth to newer opportunities is the Internet of Things. Connecting all devices, or things, to the Internet and deriving knowledge from them, or building practical applications on them, is a trend that organisations have well embraced.

The billions of smart things getting connected with each other and with humans can transform cities, homes, and lives. But, this immense interconnection also brings us to the same old worry about the security of big data and information.

Imagine a botnet attacking a group of these devices. What are we to do then? Blockchain comes to the rescue. Security of transactions and operations is the one thing Blockchain technology promises, and the one thing the Internet of Things technology lacks!

Through a winning combination of IoT and Blockchain, a lot of challenges that face IoT can be reduced and a lot of ...


Read More on Datafloq
Why the Current Crash in the Crypto Market is Good for Cryptocurrencies

Why the Current Crash in the Crypto Market is Good for Cryptocurrencies

On October 8, 2017, the total market cap of the cryptocurrency market was $148 billion. Three months later, on January 8, 2018, the market cap reached its highest point of $813 billion: a staggering increase of 449% in just three months. However, one month later, as of February 6, 2018, the market cap had dropped to $308 billion, a drop of 62% in just one month. It is clear that the current cryptocurrency market is a bubble and we have only started to drop back to normal figures.

In a classic bull market, there are four phases: stealth phase, awareness phase, mania phase and blow-off phase. In the first years of cryptocurrencies, it clearly was a stealth phase. Let’s say from the beginning in 2009 until the start of 2017, when the total market cap of cryptocurrencies increased from $0 to $18 billion.

Then the awareness phase started, with cryptocurrencies getting more and more attention in the media. This awareness phase typically ends with the first bear trap, and this bear trap started at the end of August when the market cap dropped from $148 billion to $124 billion in less than three weeks.

The third phase is the mania phase, where FOMO ...


Read More on Datafloq
Top Security Measures To Keep Your IT Assets Safe

Top Security Measures To Keep Your IT Assets Safe

New security breaches are reported more frequently today than ever before. According to the ITRC mid-year report for 2017, the number of tracked data breaches in the US increased to 791, up by 29% from the figures in 2016. Inappropriate handling of IT assets and inadequate access control may expose critical information to an attacker. IT Asset Management has an important role in safeguarding an organization against security breaches. Many organizations do not realize this and keep ITAM confined to inventory functions for life-cycle management. In many organizations, especially the ones without dedicated teams looking over the security of IT assets, ITAM is the department that is tangentially responsible for this. We discuss the top security measures by which an organization can safeguard itself against security breaches by effectively using ITAM.

Establish IAM Based Access Control for Assets

The first step the ITAM department of an organization must take is to be integrated into the Identity and Access Management (IAM) system. Giving the ITAM department the ability to control which entity has access to which assets and under what constraints enables a tighter information flow. Only the right entity that has the right reasons to have access to an asset is allowed to ...


Read More on Datafloq
Are We Smart Enough For Smart Machines

Are We Smart Enough For Smart Machines

Since the invention of the wheel, humans have developed tools to make work easier. Over the last centuries, we focused on making physical labour lighter and less time-consuming. With the invention of the computer, we have started to let smart machines do our ‘thinking labour’ as well. Who will be skilled enough to operate the future generation of smart machines?

Spreadsheets on our PCs do calculations; navigation apps on our smartphones tell us how to get to where we want to go. Today’s wave of artificial intelligence (AI) will take this a giant step further.

In the not so far future, cars will do most of the driving. Intelligent machines will do most of the medical diagnosis, bookkeeping, journalism and legal work, to name just a few examples.

It is often said that artificial intelligence — and robots, for that matter — will take away our jobs. I don’t believe so. I do believe that AI and smart machines will change the content of our work dramatically.

It always starts with simple work

The nature of the change process is always the same. It is a sort of “skimming from below”. It starts with simple activities — often laughed at by critics and non-believers. Then, after a ...


Read More on Datafloq
Monzo, BigQuery, Looker and “Fintech”: The Other London Tech Startup Scene

Monzo, BigQuery, Looker and “Fintech”: The Other London Tech Startup Scene

I’ve written a number of blogs over the past few months about bringing my fitness, smart devices and other digital data sources into Google BigQuery and Looker to do interesting analysis with the combined dataset, whilst also keeping my tech skills up to date and relevant for my day job as Product Manager for Analytics at Qubit, a London tech startup in the hot new MarTech (“Marketing Technology”) space.

At Qubit we also use Google BigQuery and Looker to enable our customers to run fast, interactive queries against petabytes of consumer behavioural data: BigQuery abstracts away the hassles of running a production data warehouse, whilst Looker abstracts away the complexities of SQL, joining datasets together and trying to visualize it all in D3 or RStudio.

One big gap in the data I’d managed to collect up until recently was anything to do with what I actually spent each day. Whilst my current bank, along with most others, provides a means to manually download statement data in Excel or CSV format, or supports commercial services that load that data into your personal finance software for a monthly fee, I wanted to access my data in the same way that I accessed my health, timekeeping and cycling workout data … through a REST API, cURL and a cron job that loads new data into BigQuery on the hour, every day.

So that’s why I smiled when I heard about Monzo, another tech startup just a mile or so from my office in London that’s disrupting another industry: banking. The “FinTech” scene covers technology and app-based banking, lending and financial services startups taking advantage of new regulations such as the EU’s new Payment Services Directive (PSD2), which opens up existing bank accounts to authorised companies that aim to provide a much better user experience (UX) over your existing bank account. It also includes new startups such as Monzo, Starling and Atom that gained banking licenses and aim to turn the traditional banking arrangement around by focusing on customers and doing everything by app, an approach that appeals to consumers more familiar with Facebook than chequebooks today.

Image Courtesy of Medici, “Awe-Inspiring London FinTech Industry Is Firing on All Cylinders”

Me, I was more interested in the fact that Monzo also use Looker and Google BigQuery to run their analytics infrastructure, and that they also offer a developer-orientated REST API that provides exactly what I’ve been looking for: a way for myself, and increasingly others in the developer community, to download, analyze and visualize our spending data, and to start building the applications that, when the full Monzo API comes out, will revolutionize how we do our banking in the coming years.

To start analyzing my banking data through Monzo I first needed to do a batch download of all my historic transactions and top-ups and put that into a BigQuery table, and then set up a trigger on Monzo’s systems that sends across all subsequent transactions as they happen, to keep my data in sync with Monzo’s record of my bank account.

To set up the initial transaction history transfer I first registered at https://developers.monzo.com/ and then generated an access token for my original pre-paid Monzo card and the current account card that recently replaced it. Note that this access token only lasts for a short time, so you can’t generate it just once and use it in a script forever. More importantly, the developer API is meant just for personal use and can’t be used to access other people’s accounts, so you can’t build one of the new apps made possible by the UK’s new open banking regulations just yet (but this is coming soon through their interim AIS API, now in testing with a small number of third-party developers).

To find out the account IDs for your Monzo accounts, either use the API playground web app as I’ve done in the screenshot above, or use the Developer API for the first time to get them yourself, via the cURL tool:

curl -G "https://api.monzo.com/accounts" -H "Authorization: Bearer <<YOUR_ACCESS_TOKEN_HERE>>" > ./accounts.json

and then retrieve your account IDs from the JSON output using a tool such as jsonpp (brew install jsonpp if you’re on a Mac; similar tools exist for other OSs).
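As a minimal sketch, assuming the accounts.json file saved by the cURL command above, you can pretty-print the response like this:

jsonpp < accounts.json

The output should look something like the following: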

{
  "accounts": [
    {
      "id": "acc_<<MY_PREPAID_CARD_ID>>",
      "created": "2017-04-06T07:07:15.175Z",
      "description": "Mark Rittman",
      "type": "uk_prepaid"
    },
    {
      "id": "acc_<<MY_CURRENT_ACCOUNT_ID>>",
      "created": "2017-11-28T08:35:28.323Z",
      "description": "Mark Robert Rittman",
      "account_number": "<<MY_AC_NUMBER>>",
      "sort_code": "<<MY_SORTCODE>>",
      "type": "uk_retail"
    }
  ]
}

Then you can retrieve all your account transactions with another cURL request, like this one for my current account:

curl -G "https://api.monzo.com/transactions" -d account_id=acc_<<MY_CURRENT_ACCOUNT_ID>> -H "Authorization: Bearer <<YOUR_ACCESS_TOKEN_HERE>>" > ./transactions.json

and then you can use tools such as “sed” (to strip out the {"transactions": wrapper from the JSON output, leaving just the array) and one of the many json2csv converters out there (for example this one that has npm as a prerequisite, something I’m guessing most people attempting this sort of project will have already; if not, first install npm via brew and you’re ready):

sed -i -e 's/{"transactions"://g' ./transactions.json
sed -i -e 's/]}/]/g' ./transactions.json
brew install json2csv
json2csv -i ./transactions.json -f created,amount,description,currency,category,include_in_spending,<<ANY_OTHER_REQUIRED_FIELDS>> -o transactions.csv

There are many more fields available to extract from the JSON documents downloaded via the developer API, depending on whether you’re accessing a Monzo prepaid card or current account; the developer docs on the Monzo site go through some of them, but you’re best going through a handful of transactions to see what’s available and then creating your own custom -f extract list, as in the sketch below.
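To illustrate, a fuller extract list might look like the following; the extra field names here, such as merchant, notes, local_amount and local_currency, are examples to check against your own transaction JSON rather than a definitive list:

json2csv -i ./transactions.json -f created,amount,description,currency,category,include_in_spending,merchant,notes,local_amount,local_currency -o transactions.csv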

Then as the final step I use the bq command-line tool to load the data in the CSV file output in the previous step into a BigQuery table, like this:

bq load --source_format=CSV --replace=true --skip_leading_rows=1 --autodetect dw_staging.monzo_transactions_stg ./transactions.csv

and then go into the BigQuery Web UI to check my transaction data has come in as expected.
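If you’d rather check from the command line, a quick sanity query also works; this is a minimal sketch against the same dw_staging.monzo_transactions_stg table loaded above:

bq query --use_legacy_sql=false 'SELECT COUNT(*) AS txn_count, MIN(created) AS earliest, MAX(created) AS latest FROM dw_staging.monzo_transactions_stg'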

In practice, for my analysis exercise I brought in a lot more fields, including merchant name and map X/Y location, Foursquare metadata and other fields that are only available for the current account product, so check through the JSON elements available for your particular Monzo product and extract the ones that most interest you.

So what about capturing transactions after this date? How do we do that without some complicated process that downloads all the transactions again and then applies just the ones that are new to the BigQuery table? The way I bring all my other app, smart device and other event data into BigQuery is to either post it directly to a fluentd server running on Google Compute Engine, or use services such as IFTTT and Zapier that interface with my apps and then post to fluentd using outgoing webhook services (or their “Maker Channel”, as IFTTT used to call that service when I used it for a blog post back in 2016).

The Monzo Developer API also comes with a webhook feature that lets you register a URL, for example the one I use when posting all my other event data to fluentd, with all subsequent transaction data then being posted to that server URL and on to the various BigQuery tables that store my data for analysis.
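Registering the webhook is itself just another cURL call against the developer API; this is a sketch based on Monzo’s developer documentation, with the target URL being a placeholder for wherever your fluentd (or other) endpoint listens:

curl "https://api.monzo.com/webhooks" -H "Authorization: Bearer <<YOUR_ACCESS_TOKEN_HERE>>" -d account_id=acc_<<MY_CURRENT_ACCOUNT_ID>> -d url="https://<<YOUR_SERVER>>/monzo/transactions"

Monzo will then POST a JSON payload to that URL each time a new transaction is created on the account.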

In the case of the Monzo transactions I added an intermediate step and had the Monzo webhook post to a Zapier “zap” that made it easier to decode the nested JSON elements Monzo was sending over that held merchant and other payment processor-related information.

Zapier then sends over just those elements of the transaction I’m interested in to the fluentd server, which then streams them into BigQuery using the fluentd BigQuery plugin. From there I can write SQL to analyze this data in BigQuery’s web UI, remembering to divide the transaction amount by 100 as amounts are all stored as pence in integer datatypes within the Monzo system, and to remove any top-ups, declines and other non-spend transactions from the results by filtering on the “include_in_spending” column to exclude all records where the value is “false”.
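As an example of the kind of query that paragraph describes, here is a minimal sketch of a spend-by-category query against the same staging table, assuming include_in_spending was loaded as a BOOLEAN (adjust the filter if autodetect typed it as a string):

bq query --use_legacy_sql=false 'SELECT category, ROUND(-SUM(amount)/100, 2) AS spend_gbp FROM dw_staging.monzo_transactions_stg WHERE include_in_spending = true GROUP BY category ORDER BY spend_gbp DESC'

The sign flip on SUM(amount) assumes debits are stored as negative pence values; drop it if you want raw signed totals.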

Analyzing the data in a tool such as Looker makes things much more interesting as I can start to make use of the locational metadata for merchants that Monzo provide for current account transactions, showing me for example where I took out cash from ATMs on the way back home from the pub and then forgot I’d done so the morning after.

Or I can see how my spend on essentials like entertainment and eating out rises in the summer, especially when I appear to do away with non-essentials like food and other groceries.

I can see where it was I spent the most money when out on cycling trips in the summer, mostly in pubs around the New Forest and up near Bath, Frome and Warminster (and how I wish winter would end so I can get back out on my bike again).

I can even work out how my choice of restaurant and entertainment changed from lots of spend on music in August through to lots of eating out in October.

And more importantly, I can see I’m spending far too much money on coffees and breakfast on the way into work each day, but that’s the downside I guess of working just off Covent Garden in what must be one of the nicest work locations I’ve ever had the privilege to base myself out of.

The beauty of Looker is that I can define a data model in LookML that takes table structures and SQL expressions and turns them into more user-friendly measure and dimension definitions.

These LookML “views” can then be joined to other views that share a common join key — in my case, the truncated “date” part of a date and time timestamp field — which then makes it possible for me to analyze spend data in the context of where I was on that day and at that time: whether I was out on the bike and, if so, whether I’d had a good workout or not so good; what the weather was like on the day and what I’d been chatting about on Slack at the time; whether the spend was for an Uber ride and, if so, where to.

Analysis that shows me, now I’m finally able to join my financial and activity dots together for the first time, where all my money’s gone since the nights got darker and I’m clearly in the pub rather than out on my bike. Luckily spring is around the corner, or I’ll be applying for Monzo’s new overdraft service that’s just gone into review rather than registering new Strava personal records next month.


Monzo, BigQuery, Looker and “Fintech”: The Other London Tech Startup Scene was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Teradata Opens its Data Lake Management Strategy with Kylo: Literally

Teradata Opens its Data Lake Management Strategy with Kylo: Literally

Still distilling good results from the acquisition of former consultancy company Think Big Analytics, Teradata, a powerhouse in the data management market, took one step further to expand its data management stack and to make an interesting contribution to the open source community.

Fully developed by the team at Think Big Analytics, in March of 2017 the company launched Kylo – a full data lake management solution – but with an interesting twist: as a contribution to the open source community.

Offered as an open source project under the Apache 2.0 license, Kylo is, according to Teradata, a new enterprise-ready data lake management platform that enables self-service data ingestion and preparation, as well as the necessary functionality for managing metadata, governance and security.

One appealing aspect of Kylo is that it was developed over an eight-year period as the result of a number of internal projects with Fortune 1000 customers, which has enabled Teradata to incorporate several best practices within Kylo. This way, Teradata has given the project the necessary maturity and testing under real production environments to launch a mature product.

Using some of the latest open source capabilities, including Apache Hadoop, Apache Spark and Apache NiFi, Kylo was designed by Teradata to help organizations address the common challenges of a data lake implementation and to provide the common use cases that can reduce implementation cycles, which currently average 6 to 12 months.

Teradata’s decision to release Kylo through an open source model, instead of a traditional commercial one, also reflects an interesting shift.

Traditionally a fully commercial software provider, the company has undergone a core transformation in recent years, becoming increasingly open to new business models and approaches, including its Teradata Everywhere strategy to enable increasing access to Teradata solutions and services across all possible on-premises and cloud platforms.

This broad strategy includes increased support for the open source community, as is the case with the Hadoop community on different projects, with Presto, and now of course with Kylo.

Teradata’s business model for Kylo is based on the services its big data services company Think Big can offer on top of Kylo; these optional services include support and training, as well as implementation and managed services.

According to Teradata, Kylo will enable organizations to address specific challenges implied within common data lake implementation efforts, including:

  • Shortage of skilled and experienced software engineers and administrators
  • Implementation of best practices regarding data lake governance
  • Driving data lake adoption beyond engineers and specific IT teams

With Kylo, Teradata aims for a data lake platform that requires no code and enables self-service data ingest and preparation via an intuitive user interface, helping accelerate the development process through reusable templates that increase productivity.

From a functions and features perspective, Kylo has been designed to provide the necessary data management capabilities for the deployment of a data lake:

  • Data Ingestion. Self-service data ingest capabilities along with data cleansing, validation, and automatic profiling.
  • Data Preparation. Data handling capabilities through a visual SQL and interactive data transformation user interface.
  • Data Discovery. Data searching and exploration capabilities, as well as metadata views, lineage, and profile statistics.
  • Data Monitoring. Capabilities for monitoring the health of feeds and services across the complete data lake, as well as tracking service level agreements (SLAs) and troubleshooting performance.
  • Data Pipeline Design. Capabilities for designing batch and/or streaming pipeline templates in Apache NiFi to be registered with Kylo, allowing user self-service.

In the words of Oliver Ratzesberger, Executive Vice President and Chief Product Officer at Teradata:
“Kylo is an exciting first in open source data lake management, and perfectly represents Teradata’s vision around big data, analytics, and open source software. Teradata has a rich history in the development of many open source projects, including Presto and Covalent. We know how commercial and open source should work together. So we engineer the best of both worlds, and we pioneer new approaches to open source software as part of our customer-choice strategy, improving the commercial and open source landscape for everyone.”

With Kylo, Teradata aspires to play a leadership role in the data lake, governance, and stewardship market. That is a difficult goal, with niche vendors like Zaloni and Podium Data, and big vendors like Informatica with its Data Lake Management solution stack, already in the space. At first look, though, Kylo seems a solution to follow closely, especially considering the price point its business model implies versus the other commercial offerings.

Want more information?

Kylo software, documentation and tutorials can be found on the Kylo project website or at the project’s GitHub site; there is also an introductory video on YouTube.

Successfully Transitioning your Team from Data Warehousing to Big Data

Successfully Transitioning your Team from Data Warehousing to Big Data

You are planning to complement your traditional data warehouse architecture with big data technologies. Now what? Should you upskill your existing data warehouse team? Or do Big Data technologies require a completely different set of skills?

What do we mean by big data technologies anyway? For the purpose of this article, I define big data as any distributed technology that is not a relational database. According to this definition, a distributed relational database (MPP) such as Redshift, Vertica, or Teradata is not a big data technology. I know. Other definitions for big data exist, e.g. any data volume that we can’t easily fit on a single server.

Tackling big data projects with a data warehouse mindset?

Your company already has a data warehouse built on traditional data warehouse architecture: either a Kimball-style collection of conformed data marts or an Inmon-style Corporate Information Factory. Great! You already have a lot of experience running data projects successfully.

Let’s look at the various roles on your team and check if they are big data ready.

Project Managers

I presume you already run your data warehouse projects in an agile fashion. Not many changes there. It’s still scrum, user stories, spikes, and daily stand-ups. However, big data projects ...


Read More on Datafloq
FY2018 Defense Budget

FY2018 Defense Budget

In case you were not watching closely, we still don’t have a defense budget for FY2018…which started four months ago. Right now, it is looking like we may have something agreed to by February 8, and according to some rumors, it will be an increase of $80 billion.

  1. The initial requested budget (which is different than what is actually spent) for FY2017 was $582 billion.
  2. The president requested a $30 billion increase for FY2017.
  3. The president requested a $52 or $54 billion increase for FY2018, to $639 billion (source: Wikipedia, May 2017 DOD News article), or to $603 billion (source: AP). I have never been able to sort out the difference here. I still don’t understand why there seem to be two different figures regularly batted about, nor do I understand how this claimed 10% increase adds up to 10% (read this for an answer: https://www.csis.org/analysis/what-expect-fy-2018-defense-budget).
  4. Congress is looking at a deal that will increase the budget by $80 billion, or I gather to some figure around $662 billion or $629 billion.
  5. Not sure how that budget increase is assigned or implemented, as we are already a third of the way through the fiscal year.
  6. I gather this increase is for the next two years.
  7. I gather there will not be a government shut-down on the 8th and that we may have a defense budget by then.

Anyhow, maybe we will know something more by the end of the week.


Sources:

1. Wikipedia: https://en.wikipedia.org/wiki/Military_budget_of_the_United_States

2. AP Article: http://www.apnewsarchive.com/2018/The-era-of-trillion-dollar-budget-deficits-is-about-to-make-a-comeback-and-a-brewing-budget-deal-could-mean-their-return-comes-just-next-year/id-7d76e81cbea64e8fafe9d8a4576cfe6b

3. DOD article: https://www.defense.gov/News/News-Releases/News-Release-View/Article/1190216/dod-releases-fiscal-year-2018-budget-proposal/


Why Chatbots Should Be Part of Your Big Data Plan in 2018

Why Chatbots Should Be Part of Your Big Data Plan in 2018

Two buzzwords worth knowing in tech: chatbots and big data. These technologies are constantly improving, and their intersection is one of the most exciting places to be in tech in 2018. Read on to find out why.

Big Data Is the Future

At this point, you’re probably familiar with big data—the steady and gigantic flow of information businesses collect and leverage to improve marketing and customer service.

Big data is getting really big. Experts are predicting that we’ll be sitting on 35 zettabytes of data by 2020. A zettabyte is roughly one billion terabytes. So what do we do with all that information? We build better robots at the intersection of chatbots and big data.

Where Chatbots Come In

Chatbots are AI digital concierges for users. They answer questions, provide direction, and give users information. The biggest advantage of chatbots is that they’re automated and can handle mundane, repetitive, time-consuming tasks, leaving human workers free to do things only humans can do.

“Chatbots provide a natural way for customers to interact with companies,” says Harrison Brady, Communications Specialist for Frontier Communications. “The conversational nature of chatbots helps create a more intimate atmosphere than an app can, and for many uses, it’s also more intuitive.”

This means that the user ...


Read More on Datafloq
The Most Unexpected and Valuable Source of Security for Your Business

The Most Unexpected and Valuable Source of Security for Your Business

Data security is a growing concern for business owners, with several prominent breaches affecting the market in recent years. Alongside other data security measures your business is likely already taking, such as password-enabled tiered user access and two-step authentication, there's another step you can take to enhance the security and privacy of your sensitive financial information: network and system virtualization. Creating virtual networks and firewalls can exponentially increase the security of your data and protect your business and clients from breaches and leaks of sensitive payment and business information.

How Virtualization Can Protect Your Data

Virtualization has its place in both terminal machines and networking. Using a virtual machine on your desktop and laptop computers can protect the core system from malicious code and vulnerabilities, as the virtual operating system doesn't have access to the containing system. This level of security is achieved by using software like VMware and VirtualBox to create virtual instances of entire computing systems on a physical desktop. Virtual systems can contain their own operating system, which may differ from the operating system on the host machine. When employees and end-users in your business access the internet from such virtual machines, they don't risk exposing your physical drives ...


Read More on Datafloq
TDI Friday Read: Cool Maps Edition

TDI Friday Read: Cool Maps Edition

Today’s edition of TDI Friday Read compiles some previous posts featuring maps we have found to be interesting, useful, or just plain cool. The history of military affairs would be incomprehensible without maps. Without them, it would be impossible to convey the temporal and geographical character of warfare or the situational awareness of the combatants. Of course, maps are susceptible to the same methodological distortions, fallacies, inaccuracies, and errors in interpretation to be found in any historical work. As with any historical resource, they need to be regarded with respectful skepticism.

Still, maps are cool. Check these out.

Arctic Territories

Visualizing European Population Density

Cartography And The Great War

Classics of Infoporn: Minard’s “Napoleon’s March”

New WWII German Maps At The National Archives

As an added bonus, here are two more links of interest. The first describes the famous map based on 1860 U.S. Census data that Abraham Lincoln used to understand the geographical distribution of slavery in the Southern states.

The second shows the potential of maps to provide new insights into history. It is an animated, interactive depiction of the trans-Atlantic slave trade derived from a database covering 315 years and 20,528 slave ship transits. It is simultaneously fascinating and sobering.

Our Open Data Science Courses

Our Open Data Science Courses

This year we are again opening our course taught at the Technical University (Műszaki Egyetem) to external participants. This means that, alongside the university students, you can get an introduction to the world of data science and big data over 14 weeks starting in early February.

The initiative is very popular, but the number of places is limited by (1) the size of the available room and (2) our principle of not admitting more external participants to the course than the number of university students enrolled in the class.

If you are interested in the applications of data science and machine learning, along with the practical theoretical background, we welcome you to our Data Analysis Platforms (Adatelemzési platformok) course. The goal of the course is for students to be able to formulate data mining problems and solve them over real datasets at a practical skill level. To this end, the course presents not only the theoretical background of data mining, machine learning and data analysis, but also introduces data mining software and platforms that use visual programming methodology, with particular attention to the Hadoop platform as a solution for ‘big data’ analysis tasks.

The course’s official website
Time: Tuesdays from 8:30 to 10:00 and Wednesdays from 10:15 to 12:00. The first session is on February 6.
Location: the buildings of the Technical University’s Lágymányos campus

Sign up

Applications are handled essentially on a first-come, first-served basis, and we expect to announce the results at the end of the week. For those of you on the mailing list, we will send out the information a day earlier, so that you have an advantage over the other applicants arriving from the blog. There are no special prerequisites for participation, and taking the course is free. It is also possible to officially enroll at BME for this course, in which case we will officially examine you and you can take the course as a student.


Should we Watch out or Learn to Embrace Artificial Intelligence?

Should we Watch out or Learn to Embrace Artificial Intelligence?

Have you ever wondered what would happen if the movie Robot became a reality? Or whether, just like in the movie I, Robot, humans won't be able to deal with the power of their creations?

There has been a lot of debate over the past year about the pros and cons of Artificial Intelligence. Some argue that it’s going to take technological advancement to new heights, while others argue that it will result in extensive job losses because of automation. The media have not backed away from promoting AI news as a doomsday story in which robots will rule the world.

However, this argument has very little meat to it.

The human body and mind are still beyond our cognisance, as there’s much more to them than we know. The human body is wired and functions in such efficient ways that we still haven’t been able to understand some inexplicable phenomena. As for the human mind, it is the most beautiful and most profound mystery. The potential of a human mind is an untapped ocean, and the deeper we try to dig, the more astonishing discoveries shall be made.

Though AI will aid businesses to deal with data and to handle repetitive tasks, it is far from ...


Read More on Datafloq
Is Searching for the Killer IoT Application a Waste of Time? Part 1: IoT Horizontal Apps

Is Searching for the Killer IoT Application a Waste of Time? Part 1: IoT Horizontal Apps

While so many companies and people are searching for the “IoT killer horizontal app”, nobody has found it yet. You can be sure that I have not either; otherwise, I would not be writing this article and I would be furiously developing it. There are people who think pet care or fitness could be the “killer app” for IoT. Sorry, but I do not buy it.

For others, the killer IoT app winners will be in verticals like predictive maintenance in manufacturing, smart home or smart city solutions. But I agree with Daniel Elizalde who is arguing that there’s no true “killer app” for IoT and that any company can create the right killer app to solve a business need of a customer or a whole industry.

Claudio Carnevali, CEO of Iomote, posted the following comment on LinkedIn recently: “Why measuring Temperature and Humidity seems the killer application for most of the IoT industry? Are your customers rheumatic?” The comment is comical but at the same time ironic. In the absence of bright or innovative ideas, it seems we act as if we had discovered fire whenever we install sensors and can visualise temperature and humidity in real-time on the screen of our smartphone.

Luc Perard, ...


Read More on Datafloq
Python, Java or Scala? What to Use for Your next Spark Project?

Python, Java or Scala? What to Use for Your next Spark Project?

Apache Spark is a leading general data processing platform that runs programs up to 100 times faster in memory and 10 times faster on disk than the traditional choice for Big Data applications, Hadoop.

Spark is an Apache project billed as a “lightning fast cluster computing” platform. A dilemma amongst the developers and users of the Spark platform is which programming language is best for developing Apache Spark solutions. There are three languages that Apache Spark supports: Java, Python, and Scala.

Choosing a programming language out of the three is a subjective matter that depends on various factors, like the programmer's comfort and skills, the project's requirements, etc.

Why Leave out Java?

While Java has been programmers' favorite language for decades now, it lags behind in delivering the value that Scala and Python do.

First, it is verbose compared to Python and Scala. Second, while the latest versions of the Java programming language have added lambdas and the Streams API, they don't even compare to what Scala offers.

Java also does not support a REPL, the Read-Evaluate-Print Loop interactive shell that is crucial for all developers who work on Big Data analytics and Data science.
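To make the REPL point concrete, both the Scala and Python shells ship in the bin directory of a standard Spark download, with no Java equivalent:

./bin/spark-shell   # interactive Scala REPL with a ready-made Spark session
./bin/pyspark       # interactive Python REPL with the same conveniences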

Finally, any new features in Apache Spark ...


Read More on Datafloq
How Data Analytics Can Drive New Growth for Your Business

How Data Analytics Can Drive New Growth for Your Business

Any half-serious manager understands that acquiring massive amounts of business data doesn't guarantee you anything. Many even feel overwhelmed by it. Data analytics can offer insights into many business aspects, both relevant and irrelevant. But the real question is: how do you make those insights truly useful and profitable?

The challenge lies in the proper structure of the gathered data as well as correct usage of suitable analysis tools to understand that data correctly. Let’s examine how data analytics can help you boost revenue, improve client satisfaction and direct your business.

Understanding customer’s choice

If you offer a child a choice between two or three ice cream flavors, they will probably make a decision fairly quickly. However, if the choice is between ten different flavors, a decision-making fiasco ensues. The point being: sometimes having too many choices is just as bad as having no choice at all. This is a symptom of the modern world. Customers are often confused by the vast range of competing products and therefore have a problem deciding which one is the best.

Data analytics can help you better understand the habits of your customers. You can use that acquired knowledge to create/re-create your own products according to ...


Read More on Datafloq
Long-Term Blockchain Stability in a Speculator’s Market

Long-Term Blockchain Stability in a Speculator’s Market

Back when Bitcoin was first launched, it was met with a wide range of reactions. Some were hopeful for the future of cryptocurrencies and the potential for further development in blockchain technology, assuming the digital currency spurred enough interest to drive more eyes and development dollars to various projects. At the opposite end of the spectrum were those doubtful of this approach to decentralised wealth, to the point of parody: Dogecoin was initially launched as nothing more than a joke meant to poke fun at the apparent insanity of a currency backed by nothing but public faith.

The Surprise Rise of Cryptocurrency Value

Five years later, Dogecoin is now collectively worth several billion dollars. What started off as a jab at cryptocurrency has begun raising questions about the future of blockchain technology in the public eye. At some level, almost any cryptocurrency will hit a point of some infinitesimally small value over time. Hitting a penny per coin is slightly more unusual, let alone a combined total value higher than the GDP of some extremely small countries.

Speculation and investing into currencies on the off-chance they rise in value in a way not unlike Ethereum or ...


Read More on Datafloq
What Telegram’s Upcoming ICO Could Mean for the Crypto World

What Telegram’s Upcoming ICO Could Mean for the Crypto World

A few weeks ago, the Telegram white paper leaked. The white paper discusses Telegram's plans to build a decentralised ecosystem and launch a variety of Blockchain services on the Telegram Open Network (TON), which, of course, comes with its own cryptocurrency called Gram. There are various rumours that Telegram is looking to raise at least $500 million in a pre-ICO sale and another $500 million in the actual ICO. As such, it could become the largest crypto raise to date and have a significant impact. However, it all depends on how Telegram will approach their ICO.

After all, the current trend within the crypto market is to run an ICO with simply a white paper and raise millions to build the vision presented in that white paper. More often than not, these ICOs are a scam, as we saw last week when Ethereum startup Prodeum disappeared after having raised $11 million. Apart from scamming investors, they also committed identity theft, as the ‘founders’ of Prodeum claim they were not aware that their names had been used in the scam. Such scams are a threat to the crypto community and it is time for regulators to start regulating ICOs just as we have regulated IPOs, ...


Read More on Datafloq
7 Industries Where Augmented Reality Is Making a Mark

7 Industries Where Augmented Reality Is Making a Mark

Who knew that Augmented Reality turned 50 in 2018? When Ivan Sutherland created a head-mounted display back in 1968 that used cutting-edge computer graphics of the time to show users simple wireframe drawings, few could've predicted that it would be the first building block of what would become an industry worth over $1 billion.

But today technology has advanced to the point where AR can be utilised in a range of businesses. The gaming industry catapulted it into public awareness and captured the imaginations of kids (and big kids) worldwide. They ventured out into the brave new world to pursue 150 cute virtual monsters in Niantic's wildly successful Pokemon Go, which was released as an app in 2016 and downloaded by over 750 million people within its first year. 

Having digital images and information generated and superimposed onto reality has proved to be extremely appealing to users, while developers clamber to help the technology realise its benefits across a vast number of different industries. Here, we've identified seven areas where AR is soon to make its presence felt.

1. Healthcare

Augmented Reality has the potential to revolutionise the way diagnoses are conducted and how industry professionals are trained. 

Patients' conditions can be quickly found by ...


Read More on Datafloq
Initial SFAB Deployment To Afghanistan Generating High Expectations

Initial SFAB Deployment To Afghanistan Generating High Expectations

Staff Sgt. Braxton Pernice, 6th Battalion, 1st Security Force Assistance Brigade, is pinned with his Pathfinder Badge by a fellow 1st SFAB Soldier Nov. 3, 2017, at Fort Benning, Ga., following his graduation from Pathfinder School. Pernice is one of three 1st SFAB Soldiers to graduate from the school since the formation of the 1st SFAB. He, Sgt. 1st Class Rachel Lyons and Capt. Travis Lowe, all with 6th Bn., 1st SFAB, were among 42 students of Pathfinder School class 001-18 to earn their badge. (U.S. Army photo by Spc. Noelle E. Wiehe)

It appears that the political and institutional stakes associated with the forthcoming deployment of the U.S. Army’s new 1st Security Force Assistance Brigade (SFAB) to Afghanistan have increased dramatically. Amidst the deteriorating security situation, the performance of 1st SFAB is coming to be seen as a test of President Donald Trump’s vow to “win” in Afghanistan and his reported insistence that increased troop and financial commitments demonstrate a “quick return.”

Many will also be watching to see if the SFAB concept validates the Army’s revamped approach to Security Force Assistance (SFA)—an umbrella term for whole-of-government support provided to develop the capability and capacity of foreign security forces and institutions. SFA has long been one of the U.S. government’s primary responses to threats of insurgency and terrorism around the world, but its record of success is decidedly mixed.

Earlier this month, the 1st SFAB commander Colonel Scott Jackson reportedly briefed General Joseph Votel, who heads U.S. Central Command, that his unit had received less than eight months of training and preparation instead of an expected 12 months, that his personnel had been rushed through the six-week Military Advisor Training Academy curriculum in only two weeks, and that the command suffered from personnel shortages. Votel reportedly passed these concerns to U.S. Army Chief of Staff General Mark Milley.

Competing Mission Priorities

Milley’s brainchild, the SFABs are intended to improve the Army’s ability to conduct SFA and to relieve line Brigade Combat Teams (BCTs) of responsibility for conducting it. Committing BCTs to SFA missions has been seen as both keeping them from more important conventional missions and inhibiting their readiness for high-intensity combat.

However, 1st SFAB may be caught between two competing priorities: to adequately train Afghan forces and also to partner with and support them in combat operations. The SFABs are purposely optimized for training and advising, but they are not designed for conducting combat operations. They lack a BCT’s command, control, intelligence and combat assets. Some veteran military advisors have pointed out that BCTs are able to control battlespace and possess organic force protection, two capabilities the SFABs lack. While SFAB personnel will advise and accompany Afghan security forces in the field, they will not be able to support them in combat the way BCTs can. The Army will also have to deploy additional combat troops to provide sufficient force protection for 1st SFAB’s trainers.

Institutional Questions

The differing requirements for training and combat advising may be the reason the Army appears to be providing the SFABs with capabilities that resemble those of Army Special Forces (ARSOF) personnel and units. ARSOF’s primary mission is to operate “by, with and through” indigenous forces. While Milley has made clear in the past that the SFABs are not ARSOF, they do appear to include some deliberate similarities. While organized overall as a conventional BCT, the SFAB’s basic tactical teams include 12 personnel, like an ARSOF Operational Detachment A (ODA). Also like an ODA, the SFAB teams include intelligence and medical non-commissioned officers, and are apparently being assigned dedicated personnel for calling in air and fire support. (It is unclear from news reports if the SFAB teams include regular personnel trained in basic call-for-fire techniques or if they are being given highly-skilled joint terminal attack controllers (JTACs).)

SFAB personnel have been selected using criteria used for ARSOF recruitment and Army Ranger physical fitness standards. They are being given foreign language training at the Military Advisor Training Academy at Fort Benning, Georgia.

The SFAB concept has drawn some skepticism from the ARSOF community, which sees the train, advise, and assist mission as belonging to it. There are concerns that SFABs will compete with ARSOF for qualified personnel and the Army has work to do to create a viable career path for dedicated military advisors. However, as Milley has explained, there are not nearly enough ARSOF personnel to effectively staff the Army’s SFA requirements, let alone meet the current demand for other ARSOF missions.

An Enduring Mission

Single-handedly rescuing a floundering 16-year, $70 billion effort to create an effective Afghan army as well as a national policy that suffers from basic strategic contradictions seems like a tall order for a brand-new, understaffed Army unit. At least one veteran military advisor has asserted that 1st SFAB is being “set up to fail.”

Yet, regardless of how well it performs, the SFA requirement will neither diminish nor go away. The basic logic behind the SFAB concept remains valid. It is possible that a problematic deployment could inhibit future recruiting, but it seems more likely that the SFABs and Army military advising will evolve as experience accumulates. SFA may or may not be a strategic “game changer” in Afghanistan, but as a former Army combat advisor stated, “It sounds low risk and not expensive, even when it is, [but] it’s not going away whether it succeeds or fails.”

Aspiring Data Scientists – Get Hired!

Aspiring Data Scientists – Get Hired!

Working in Data Science recruitment, we’re no strangers to the mountains you have to climb and the pitfalls you face when getting into a Data Science career. Despite the mounting demand for Data Science professionals, it’s still an extremely difficult career path to break into.

The most common complaints we see from candidates who have faced rejection are lack of experience, education level requirements, lack of opportunities for Freshers, overly demanding and confusing job role requirements.

Experience

First of all, let’s tackle what seems to be the hardest obstacle to overcome: lack of experience.

This is a complex one and not just applicable to Data Science; across professions it’s a common complaint that entry-level jobs ask for years’ worth of experience. Every company wants an experienced data scientist, but with the extremely fast emergence of the field and growing demand for professionals, there are not enough to go around!

Our advice here for anyone trying to get into Data Science who is lacking experience is to try and get an internship by contacting companies directly. Sometimes, you will find these types of positions available with recruiters but you will no doubt have more luck going direct.

Another approach is to have a go at Kaggle competitions, ...


Read More on Datafloq
5 Myths about The World’s Most Favourite Cryptocurrencies Today

5 Myths about The World’s Most Favourite Cryptocurrencies Today

When you hear the words cryptocurrency or blockchain chances are the first thing that comes to mind is Bitcoin. Bitcoin is actually one of well over 700 different cryptocurrencies in the marketplace. It is also the most popular and the oldest one available. All other cryptocurrencies owe their existence to Bitcoin and the blockchain technology that was developed along with it. So it is only natural that something this popular would become a target of some kind. But what about the other stuff you hear about related to Bitcoin and the world of alternative pay platforms? Here’s a look at the top 5 myths about Bitcoin.

1. Bitcoin Is A Fad

Well, that’s just plain wrong. When you measure the overall volume of Bitcoin over the past few years you discover that, if anything, the cryptocurrency has actually gained in popularity. For example, in late 2013 there was about $154 million in Bitcoins moving per day. While that in itself is a pretty impressive rate, the volume in September 2017 was at well over $11 billion per day. Plus, the price itself tells a similar story.

The price per ‘coin’ has gone from $118 to well over $17,000 and ...


Read More on Datafloq
IoT and 5G New Technologies Create New Opportunities

IoT and 5G New Technologies Create New Opportunities

The world is adapting and shaping itself around IoT technologies. In combination with 5G network connectivity, companies are preparing themselves for the constant shift that the progression of these technologies brings. 

Network operators, telcos, and IoT companies are coming together to promote an ecosystem where customers can have great experiences with their brands. Companies can’t deliver this customer experience all by themselves; they cooperate with each other in an ecosystem of companies. A lot of the reliable, low-latency IoT connections run on telecom networks.  

With IoT, everything can be connected and measured. If you can measure something, then you can build a lucrative business model around it. So there will be a ton of new opportunities and business models developing to monetize these IoT opportunities. 

This new approach has to be supported by a BSS, or Business Support System. Think of a Business Support System as a program you use to manage your business, one in which all parties in the ecosystem can work together. From point of sale, to billing, to managing customer experiences and feedback, a Business Support System manages everything. 

Whether you are logging in as a Product Manager configuring and managing commercial aspects or a Business Configuration Engineer handling all technical ...


Read More on Datafloq
How to Disrupt Digital Advertising Through Blockchain?

How to Disrupt Digital Advertising Through Blockchain?

This article is sponsored by Pingvalue

How one open platform plans to transform the digital advertising market with its own cryptocurrency

Meet blockchain – the fintech newcomer that has taken the world by storm, redefining how businesses, governments, organizations and individuals interact. This decentralized ledger technology eliminates expensive third parties by providing an airtight verification process. It can authenticate any type of transaction, establishing trust and simplifying the movement of money, products and information worldwide.
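To make that verification process concrete, here is a toy hash-chained ledger in Node.js. This is a generic sketch of the underlying idea, not Pingvalue’s implementation; the entries are made up.

```js
// Toy hash-chained ledger: each block stores its predecessor's hash,
// so altering any past entry invalidates every hash that follows it.
const { createHash } = require('crypto');

const GENESIS = '0'.repeat(64); // placeholder "previous hash" for the first block

function addBlock(chain, data) {
  const prevHash = chain.length ? chain[chain.length - 1].hash : GENESIS;
  const hash = createHash('sha256').update(prevHash + data).digest('hex');
  chain.push({ data, prevHash, hash });
}

const ledger = [];
addBlock(ledger, 'Alice pays Bob 10');
addBlock(ledger, 'Bob pays Carol 4');

// Verification: recompute every hash; any tampering shows up as a mismatch.
const intact = ledger.every((block, i) =>
  block.prevHash === (i ? ledger[i - 1].hash : GENESIS) &&
  block.hash === createHash('sha256').update(block.prevHash + block.data).digest('hex'));
console.log('ledger intact:', intact); // true
```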

Several Initial Coin Offerings (ICOs) have appeared in recent years, and their number is growing exponentially. The urgent need for a direct and trustworthy mode of interaction elevated the blockchain conversation from crypto-fan chatrooms to the most influential boardrooms.

Blockchain’s collaboration-based network has already impacted several industries, generating benefits for all parties involved. The digital advertising industry is next in line, with various startups attempting to use this technology to transform processes currently dominated by middlemen, fraud and a lack of measurability.

Make advertising transparent, relevant & rewarding

Many initiatives emerging from the digital advertising industry aim to reinvent the current model by harnessing blockchain to deliver transparency, relevance and rewards. The key is a people-centric approach: an open platform improves the customer experience while allowing for ...


Read More on Datafloq
5 Predictions on the Future of AR

5 Predictions on the Future of AR

There’s no question that Augmented Reality (AR) technology – in the form of glasses, headsets, and connected software – has made rapid advances in recent years. In fact, AR is predicted by some to be the next trillion dollar industry due to the way it could potentially change the way we live, work, and play.

And if the past few years are any indication, the way AR headsets and glasses function is going to change drastically as the industry strives to integrate innovative technology, additional functionality, and enhanced ergonomic design into its products.

From integration with Artificial Intelligence to virtual advertising, here are Lucyd’s five predictions on what the future of AR will likely hold.

1. Access to AI

In June of 2017, Apple released ARKit, a toolkit for prospective developers to create AR-related applications that integrate with Apple devices and apps like Siri. And during a recent AR hackathon, a smart-home-integrated HoloLens with gesture control was developed. It all points to one of the key future trends of AR, which is integration with Artificial Intelligence (AI) software, technology, and platforms. Whether it’s displaying information about the temperature of another room in your house or providing additional information about the ...


Read More on Datafloq
How Big Data Is Changing Banking, Finance, and Credit

How Big Data Is Changing Banking, Finance, and Credit

Like most other businesses, banking and financial services organizations are fighting to adapt in this new, disruptive, digital world — and like most other businesses, big data analytics is at the top of the list of solutions to rein in. While those with the proper expertise and knowledge are finding great opportunity via big data analysis, unfortunately, not everybody is ready to deploy these solutions. Here’s how big data is changing the banking, finance, and credit industry.

Identity Theft, Credit Fraud, and Data Breaches

Beginning around 2014 or 2015, the world began to understand just how badly malicious actors wanted to profit from data breaches — as well as just how far these actors would go. According to the experts at the University of Illinois Chicago, “more than 750 data breaches occurred in 2015, the top seven of which opened over 193 million personal records to fraud and identity theft.”

While this first spate of cyberattacks generally targeted healthcare data, criminals also began stepping up their initiative to steal and sell credit card numbers on the black market, particularly the dark web. Mobile payments using secure systems have become more popular recently, used by 6 percent of adults in 2013 and rising to ...


Read More on Datafloq
What Digital Asset Investors Can Learn from the Recent Coincheck Hack

What Digital Asset Investors Can Learn from the Recent Coincheck Hack

Whether it’s Bitcoin, Ethereum, or a slew of industry peer competitors in between, the main appeal of cryptocurrency is that it’s unregulated by the government or central banking system. On the one hand, that makes it highly attractive, as it’s less affected by the ebb and flow of the economy and transactions can be made more quickly. On the other, it raises a security and regulatory issue that, until recently, flew somewhat under the radar.

With so much data stored on the internet, cryptocurrencies are inherently vulnerable to hacker attacks. At their core, they’re designed to be accessed by those with the computer knowledge to professionally and ethically crack the “code” protecting the assets, gaining access to the units. If you can solve the incredibly complex computational code, you’re rewarded with a cleared transaction equal to its worth. Though there has been some pushback and controversy surrounding this approach, with some critics claiming that the rich are the only ones who can afford the highly technical computer systems required for code-breaking, it remains the primary means of bitcoin wealth building outside of private buying, trading, and selling.
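What the author loosely calls cracking the “code” corresponds, in Bitcoin’s case, to proof-of-work mining: racing to find a nonce that gives the block a sufficiently rare hash. Here is a toy sketch in Node.js; the leading-zeros test is a simplification of Bitcoin’s real comparison against a 256-bit numeric target.

```js
// Toy proof-of-work miner: find a nonce such that SHA-256(data + nonce)
// starts with `difficulty` zero hex digits.
const { createHash } = require('crypto');

function mine(blockData, difficulty) {
  const prefix = '0'.repeat(difficulty);
  let nonce = 0;
  for (;;) {
    const hash = createHash('sha256').update(blockData + nonce).digest('hex');
    if (hash.startsWith(prefix)) return { nonce, hash };
    nonce += 1;
  }
}

// Difficulty 4 takes roughly 16^4 (about 65,000) attempts on average; real
// Bitcoin difficulty is astronomically higher, which is why hardware matters.
console.log(mine('block #1: Alice pays Bob 1 BTC', 4));
```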

Yet, a recent incident has shed some light on the vulnerabilities built into the cryptocurrency ...


Read More on Datafloq
The Biggest Challenges for Big Data Analytics in the Age of Artificial Intelligence

The Biggest Challenges for Big Data Analytics in the Age of Artificial Intelligence

It’s been a huge decade for big data and artificial intelligence (AI), two of the biggest tech trends we’ve seen this century. From data-driven manufacturing to self-driving cars, we’ve witnessed dozens of jaw-dropping, previously unimaginable feats, all thanks to advances in big data analytics and AI.  

Not so long ago, businesses across industries often sat on tons of useful, game-changing data, unsure about the many ways they could put it to use to gain competitive advantage. But as methods in machine learning, deep learning, and natural language processing became more advanced while computing power went up, seemingly useless data suddenly began to make sense.

For instance, businesses could use customer data to analyze demographic profiles, shopping habits, and other behaviors, which helped improve marketing campaigns and overall customer experience.

Still, despite all the good that comes with AI, its growth presents a myriad of challenges for big data, especially when you consider how data-hungry AI systems can be.

These challenges represent the biggest roadblock that must be addressed before we can fully realize the potential of AI and big data.

1. Data Privacy and Security

AI systems, even the most basic forms, are usually very complex, with tons of algorithms obscuring what the system is ...


Read More on Datafloq
Improving Your Company’s Efficiency with Internal Data

Improving Your Company’s Efficiency with Internal Data

Business owners and managers are always on the hunt to help both their employees and company become more efficient. It makes sense. Getting more work done for less money means you can do things like pay employees more, hire new workers, get better perks, and have less stress.

So, what do people do? They look to external sources for advice and guidance. They hunt online and in leadership magazines for tips on getting the most out of employees. Claims like “having meetings where everybody stands the whole time will make them 80 percent shorter” are found, and suddenly all meetings are done standing up.

It’s not bad to do outside research, but often managers and business owners fail to look at a more important source of guidance: instead of looking at what other people in the world are doing, they need to start by looking at the data their own business is producing for ways to improve.

Quantifying All Tasks

Before you can start analysing your business’s data, you need to be able to collect data from tons of different sources. Basically, every part of your business’s performance, from the CEO to the most junior employee, needs to be measured.

Find ways to collect data from ...


Read More on Datafloq
AI’s Impact on Retail: Examples of Walmart and Amazon

AI’s Impact on Retail: Examples of Walmart and Amazon

Artificial Intelligence or AI is expected to be in major demand by retail consumers due to its ability to make interactions in retail as flawless and seamless as possible. Many of us do realise the potential of AI and all that it is capable of, along with the support of Machine Learning or ML, but don’t realise that the implementation of AI in certain segments has already begun. 

AI in Retail 

The future of AI, and the complicated computer processes behind it, is really bright in the field of retail. AI currently has numerous datasets working along with computer vision methods to ensure that users get the most seamless experience possible. There are some interesting facts that pertain to the use of AI in retail. Here are some of them, to build insight into what you can expect in the future: 


It is expected that customers will manage 85% of their relationship with the enterprise without interacting with a human. 
According to a report by Business Insider, customers who interact with retailers through online opinions and reviews are 97 percent more likely to convert with that retailer during this ...


Read More on Datafloq
How Virtualisation Can Protect Your Data

How Virtualisation Can Protect Your Data

Data security is a growing concern for business owners, with several prominent breaches affecting the market in recent years. Alongside other data security measures your business is likely already taking, such as password-enabled tiered user access and two-step authentication, there's another step you can take to enhance the security and privacy of your sensitive financial information: network and system virtualisation. Creating virtual networks and firewalls can exponentially increase the security of your data and protect your business and clients from breaches and leaks of sensitive payment and business information.

How Virtualisation Can Protect Your Data

Virtualisation has its place in both terminal machines and networking. Using a virtual machine on your desktop and laptop computers can protect the core system from malicious code and vulnerabilities, as the virtual operating system doesn't have access to its containing system. This level of security is achieved by using software like VMware and VirtualBox to create virtual instances of entire computing systems on a physical desktop. Virtual systems can run their own operating system, which may differ from the operating system on the host machine. When employees and end-users in your business access the internet from such virtual machines, they don't risk exposing your physical drives ...


Read More on Datafloq
Wow! Snowflake lands massive $263 million investment

Wow! Snowflake lands massive $263 million investment

Have I said recently that I LOVE my job? Well I do, and I love working with the awesome technology that is the Snowflake DWaaS, as well as all the really smart and cool people that make up this innovative and ever-growing company. As you may have heard, we had a HUGE week with the […]
IoT As Open Target: 3 Ways Firewalls Make It Safer

IoT As Open Target: 3 Ways Firewalls Make It Safer

Since the introduction of enterprise IoT, businesses have grappled with related security issues – a problem that is particularly relevant today as they consider how to collect and manipulate IoT data to improve outcomes across all sectors.

What does it take to make IoT safer? Examining best practice protocols for IoT, IT teams should consider how firewalls function within their network and implement stronger borders around enterprise data. This is the surest way to protect key business interests as well as guard client information.

Mass Production, Mass Fragility

One of the biggest challenges to IoT security is the fact that most of these products are meant for the mass market. Think about Fitbits, smart home tech, and even water bottles and appliances – their mass market popularity means they are, in Jon Hedren’s words, “generally shoddy, insecure and easily breakable.” Securing these devices goes against their very nature.

If businesses are going to protect client data, they need to start at home, rather than at the enterprise level. When they purchase an IoT product, clients should be encouraged to guard their Wi-Fi carefully, renaming it, using two-factor authentication, and installing a smart firewall. This more advanced security framework can alert you to potential threats ...


Read More on Datafloq
Industry 4.0: How Automation and IoT Will Affect Jobs

Industry 4.0: How Automation and IoT Will Affect Jobs

The current trend of automation has been branded as ‘Industry 4.0’, a phase that many believe to be the next economic revolution. Michael E Porter of the Harvard Business Review said: “What is underway is perhaps the most substantial change in the manufacturing firm since the second industrial revolution, more than a century ago.”

The Internet of Things combines the power of machinery and computers, bridging the gap through connectivity and turning human-operated machines in the manufacturing, production, construction and agriculture industries into intelligent or ‘smart’ entities. There’s no doubt that robotics, drones and automated processes are going to change industry forever. But which jobs will be affected the most by the application of IoT machinery? Here are a few of the ways that ...

1. Jobs in Hybrid Verticals

One of the biggest fears is that IoT will lead to mass unemployment, but many experts believe the effect to be more of a shift in the job landscape. Roles in hybrid verticals (such as IT + energy, IT + healthcare, or IT + agriculture) will become key to driving businesses forward. This could mean that existing workers will need to be re-skilled at their jobs, or it could see the formation ...


Read More on Datafloq
Join Our Team

Join Our Team



If you would like to work on projects like the ones you read about in our posts.

If you are interested in the world of data science or data engineering.

If you would like to develop products and services powered by data analytics solutions. 

Write to us at job@dmlab.hu and let’s find a time to talk: we can tell you more about our team, our projects, and the opportunities we can offer if you join us. Of course, we are most curious about where you plan to take your career and what motivates you when looking for and choosing a job.

Let’s meet!


Where Moore’s Law Is Headed with Big Data

Where Moore’s Law Is Headed with Big Data

When measuring and testing computer applications, scientists and engineers collect huge amounts of data every second of the day. For instance, the world’s largest particle collider, the Large Hadron Collider, generates approximately 40 terabytes of data per second. A single Boeing jet engine creates approximately ten terabytes of data every thirty minutes. When a jumbo jet crosses the Atlantic Ocean, its four engines can produce approximately 640 terabytes of data. Multiply that by an average of 2,500 daily flights, and the amount of data produced per day is staggering; this is what is called Big Data.
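A quick back-of-the-envelope check of those figures:

```js
// Daily data volume from transatlantic jumbo-jet engines, per the figures above.
const tbPerFlight = 640;    // four engines, one Atlantic crossing
const flightsPerDay = 2500; // average daily flights

const tbPerDay = tbPerFlight * flightsPerDay;
console.log(`${tbPerDay.toLocaleString()} TB per day`); // 1,600,000 TB per day
console.log(`= ${tbPerDay / 1e6} exabytes per day`);    // = 1.6 exabytes per day
```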

It is a difficult task to draw conclusions and extract actionable insight from such large sums of data, and this challenge is at the heart of Big Data. Big Data has brought about new ways of processing data: we have deep data analysis tools, data integration tools, search tools, reporting tools and maintenance tools that help process big data to derive value from it.

The International Data Corporation (IDC) performed a study in which music, video files, and other data files were analyzed. The study indicated that the amount of data being produced by systems is ...


Read More on Datafloq
Why You Need To Keep Learning In Data Science

Why You Need To Keep Learning In Data Science

You’ve made it!

The years of hard work and gaining experience in your field have finally paid off. You’re a Data Scientist! You’re part of an innovative, forward-thinking team working on exciting, world-changing projects. You’ve got a great salary and a tasty benefits package, flexible working, and not an uncomfortable suit and tie in sight. Time to put down the books and enjoy the ride? Absolutely not! Quite the opposite, in fact, and here is why. 

The field of Data Science is a vast landscape where possibilities for professionals are endless.  

It is by no means a static profession: you can apply your skills to an array of different industries and domains, so knowledge of the area you work in is key to deriving the most valuable insights from your data. With so many businesses hiring for and building out data science teams, it’s important that you know the business you’re working in, or, if you’re looking to move jobs, the one you want to move into. The job role is so much more than numbers and big data sets; it’s about contributing to the bigger picture, solving problems for businesses ...


Read More on Datafloq
How to Build a Data Science Team for Your Business

How to Build a Data Science Team for Your Business

Choosing the right fit for a data science team can be a challenging task, since the field is still young and many businesses are still trying to identify what a data scientist should offer. Also, putting together a complete data science team is no cakewalk. That’s why we’ve come up with the best tips in this blog post to make the entire process easier for you. 

Choose the Right People

What roles need to be filled when building a data science team? You will need data science professionals who can work on large datasets and understand the theory behind the science. They should also be adept at developing predictive models. Data software developers and data engineers are important, too; they need to understand data infrastructure and architecture, and distributed programming. 

Some of the other important job roles that need to be filled in a data science team include full-stack developer, data platform administrator, data solutions architect, and designer. Those business enterprises that have large teams working on building products based on real-time data will need to hire product managers on the team because they can lead the team up the right ...


Read More on Datafloq
How Blockchain Could Create a Decentralised Reputation-Based Society

How Blockchain Could Create a Decentralised Reputation-Based Society

In the past, when you needed some work done, you would ask a local craftsman to do it. You could be confident in trusting the craftsman: it was a small village, and any betrayal by him or her would result in a loss of business. On the other hand, the craftsman knew he or she could trust you, because if you betrayed the craftsman, no one in the village would want to work for you again. It was a system that worked very well. Unfortunately, with the internet turning the world into a small global village, that system has pretty much disappeared. As a result, it has become increasingly difficult to trust someone you don’t know.

What is reputation?

To understand the challenges we face with reputation, it is important to understand what reputation is and how it is created. Reputation, or image, is an opinion about a person, company or device. A reputation is created over time, based on a set of criteria. It is a mechanism of control within societies, and increasingly also among connected devices. It is important to know that reputation exists on multiple levels, ranging from individuals, organisations and communities to countries and cultures. ...


Read More on Datafloq
How Big Data Plays a Big Role in Politics

How Big Data Plays a Big Role in Politics

Like it or not, politics today would not be what it is without big data and analytics. Our current president won the office through a close relationship with Cambridge Analytica, in which the analytics firm conducted analysis of raw data on the electorate. This analysis informed Trump’s strategy.

According to Trump campaign executive director Michael Glassner, the data came primarily from the Republican National Committee’s trove. Yet Wired reports that Cambridge Analytica does claim to possess “5,000 data points on every American.” These data go towards creating a personality profile of each American for the purposes of “psychographic targeting,” which is a fancy way of saying that the data inform speeches, ads, and decisions about which sections of the country a candidate should campaign in the hardest.

This does open up a conundrum. In politics, there’s a certain level of trust the electorate grants a candidate. But if everything the candidate says and does in the public eye is based on big data analysis that tells the candidate which words and actions will be popular, you can’t be sure the candidate’s ethical backbone is strong.

On the bright side of big data usage, Rutgers University reveals the role data analysis plays in ...


Read More on Datafloq
Can You Save 15 Hours of Meeting Time per Month?

Can You Save 15 Hours of Meeting Time per Month?


Can you really save 15 hours of meeting time, company-wide, per month?

The answer is, yes, potentially.

Properly designed Dashboards gather all the necessary information beforehand. Employees come to meetings ready to discuss the matters at hand. Instead of spending the first 15 minutes telling everyone what the issues are, you can start discussing the solutions quicker because everyone is on the same page.

Let’s do the math. Say, on average:

  • Saving of 10-15 minutes per meeting
  • Average of 3 meetings/day, company-wide
  • Average 20 working days/month

That equals 900 minutes, or 15 hours, of meeting time saved per month!
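The same arithmetic as a quick sketch, using the upper 15-minute estimate:

```js
// Company-wide meeting-time savings per month.
const minutesSavedPerMeeting = 15;
const meetingsPerDay = 3;       // company-wide average
const workingDaysPerMonth = 20;

const minutesSaved = minutesSavedPerMeeting * meetingsPerDay * workingDaysPerMonth;
console.log(`${minutesSaved} minutes = ${minutesSaved / 60} hours saved per month`);
// -> 900 minutes = 15 hours saved per month
```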

Of course, these numbers are estimates. However, if you are a medium or large company, think of how many meetings occur across your departments on any given day!

Simply put, Dashboards broadcast relevant and pertinent measurements throughout the office and facilities. When the relevant measurements are visible at all times, employees have higher motivation to correct the situation or improve the numbers.


Where do Driverless Cars Learn to Drive?

Where do Driverless Cars Learn to Drive?

Driverless cars are an exciting alternative to the modern domestic vehicle. The future of transport gets more interesting by the year, with many wondering what the next big thing in futuristic transport will be. One of the main innovations that has everyone talking is the driverless car: how it functions, how it knows, and how it learns.

In Michigan, a custom-made town has been created to give driverless vehicles the opportunity to test and develop their driving skills. The test site, called ‘Mcity’, sits just outside Ann Arbor. It was created to challenge driverless cars and put the technology and abilities of the cars to the test. Mcity opened in July 2015, cost over $10 million to build, and was constructed on land owned by the University of Michigan.

Mcity has an extensive list of industry partners and affiliates who invest in and utilise the town for testing purposes, including car manufacturers such as Honda, Toyota, Ford, and BMW. Ford announced that it was the first to use the Mcity facility.

Mcity is the first controlled environment specifically designed to test potential automated and driverless cars; it boasts ...


Read More on Datafloq
The Future of Digital Designing: The Role of AI in UX Design

The Future of Digital Designing: The Role of AI in UX Design

In this day and age, you have probably encountered a few topics about AI, or Artificial Intelligence, on the internet. You may also have experienced interacting with them on business websites through their chatbots.

AI is not exclusive to desktops and laptops, though. It is also deployed on smartphones, such as Siri, and in homes, such as Amazon’s Alexa. Additionally, it is utilized in self-driving cars and delivery drones.

With the widespread use of AI and its applications branching out into different industries, what can it do for User Experience, or UX, design?

The Role of Current AIs in UX Design

AI has plenty of benefits that improve how people perform everyday tasks. Moreover, it has already been deployed in the digital design world, where it can benefit and improve the user experience.

Here are some examples of how AI has been incorporated into UX design.

TheGrid.io

TheGrid’s concept was born in 2014 and spent years in beta before officially launching in 2016. Molly, the AI behind the web builder, lets you build your very own website that is both aesthetically pleasing and highly optimized.

For users who don’t know how to code or manage a site, the ...


Read More on Datafloq
How Artificial Intelligence Will Change the Retail Industry

How Artificial Intelligence Will Change the Retail Industry

The age of Artificial Intelligence is here, and automation technology is already being used across the globe by companies in every industry. From farming and warehouse operations to robots that assist customers in finding what they want, the evolution of robotics is game-changing, particularly within retail and B2C industries, where customer experience and customer journey are important.

A revolution is upon us, and here are four ways that A.I. is going to change the retail landscape forever:

Shift in Workforce Structure

With modern advancements often comes some sacrifice, and for the retail world, jobs could be at risk when automation becomes commonplace. It is believed that in some sectors, as many as half of the jobs could be replaced by robots. And two industries with the biggest risk factors are manufacturing and retail.

But as Gartner predicts, the introduction of A.I. could help to create more jobs in the future. Their forecast outlines: “By 2020, A.I. will generate 2.3 million jobs, exceeding the 1.8 million that it will remove.”

Improved Customer Experience

The addition of robots in retail could help to boost customer experience and satisfaction dramatically. The journey of shoppers is incredibly important today as consumers want more from retailers; the modern day shopper ...


Read More on Datafloq
Tips on How You Can Learn Coding Without Any Prior Training

Tips on How You Can Learn Coding Without Any Prior Training

Due to the advent of technological innovation, the digital age has had an impact on almost every aspect of human life. The 21st century is increasingly referred to as the tech century, with technology significantly changing the way we live and do things compared with the norm just a few decades back. The advent of computers in the 1980s revolutionized how office jobs are done, from inventory systems to the delivery of invoices to clients. Tech gurus have developed different systems and programs, such as low-code platforms, to solve human problems in a significant way. However, all this software is built by coding computer programs in different languages to perform particular tasks.

The growth of the internet has given rise to numerous coding languages used by IT specialists to develop programs that perform specified tasks, aided by the many open source codebases available online free of charge. This has created a need for more and more people to learn the basic concepts of coding a computer program to automatically perform a specified task ...


Read More on Datafloq
Ad Hoc Javascript Performance Benchmarking

Ad Hoc Javascript Performance Benchmarking

At Exago, we are always looking for ways to further improve the application’s user experience. While these advances are most noticeable in major enhancements such as our newly-released dashboard designer and formula code completion, upgrades to client- and server-side performance are just as important. For Exago’s v2017.2 release, I was tasked with speeding up our client-side application and so became familiar with a number of performance improvement tools. Web technologies such as Flexbox, better in-browser performance profiling, and particularly jsPerf.com have helped us reach our usability goals.

JsPerf is a free, open-source JavaScript benchmarking tool that encourages collaboration and the sharing of performance tests. The website uses Benchmark.js, along with an embedded Java applet, to record execution times of JavaScript code snippets with high precision. Benchmarks of many commonly used JS web functions have already been cataloged on jsPerf. Most of these tests show results from different browsers, revisions of the original test, and comments on the results and their validity. Although these publicly searchable benchmarks are unlikely to match your specific need exactly, they’re a great reference tool for comparing the performance of simple functions.
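For a flavour of what a jsPerf test case boils down to, here is a minimal Benchmark.js suite along the lines of the library’s documented usage; the two string-search approaches being compared are arbitrary examples.

```js
// A minimal Benchmark.js suite comparing two ways to search a string.
var Benchmark = require('benchmark');
var suite = new Benchmark.Suite();

suite
  .add('RegExp#test', function () {
    /o/.test('Hello World!');
  })
  .add('String#indexOf', function () {
    'Hello World!'.indexOf('o') > -1;
  })
  .on('cycle', function (event) {
    // Prints each case's name, ops/sec, and margin of error.
    console.log(String(event.target));
  })
  .on('complete', function () {
    console.log('Fastest is ' + this.filter('fastest').map('name'));
  })
  .run({ async: true });
```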

How can jsPerf help with more complex systems?

The first step to leveraging jsPerf for your own ...


Read More on Datafloq
Visualizing The Multidomain Battle Battlespace

Visualizing The Multidomain Battle Battlespace

In the latest issue of Joint Forces Quarterly, General David G. Perkins and General James M. Holmes, respectively the commanding generals of U.S. Army Training and Doctrine Command (TRADOC) and  U.S. Air Force Air Combat Command (ACC), present the results of the initial effort to fashion a unified, joint understanding of the multidomain battle (MDB) battlespace.

The thinking of the services proceeds from a basic idea:

Victory in future combat will be determined by how successfully commanders can understand, visualize, and describe the battlefield to their subordinate commands, thus allowing for more rapid decisionmaking to exploit the initiative and create positions of relative advantage.

In order to create this common understanding, TRADOC and ACC are seeking to blend the conceptualization of their respective operating concepts.

The Army’s…operational framework is a cognitive tool used to assist commanders and staffs in clearly visualizing and describing the application of combat power in time, space, and purpose… The Army’s operational and battlefield framework is, by the reality and physics of the land domain, generally geographically focused and employed in multiple echelons.

The mission of the Air Force is to fly, fight, and win—in air, space, and cyberspace. With this in mind, and with the inherent flexibility provided by the range and speed of air, space, and cyber power, the ACC construct for visualizing and describing operations in time and space has developed differently from the Army’s… One key difference between the two constructs is that while the Army’s is based on physical location of friendly and enemy assets and systems, ACC’s is typically focused more on the functions conducted by friendly and enemy assets and systems. Focusing on the functions conducted by friendly and enemy forces allows coordinated employment and integration of air, space, and cyber effects in the battlespace to protect or exploit friendly functions while degrading or defeating enemy functions across geographic boundaries to create and exploit enemy vulnerabilities and achieve a continuing advantage.

Despite having “somewhat differing perspectives on mission command versus C2 and on a battlefield framework that is oriented on forces and geography versus one that is oriented on function and time,” it turns out that the services’ respective conceptualizations of their operating concepts are not incompatible. The first cut on an integrated concept yielded the diagram above. As Perkins and Holmes point out,

The only noncommon area between these two frameworks is the Air Force’s Adversary Strategic area. This area could easily be accommodated into the Army’s existing framework with the addition of Strategic Deep Fires—an area over the horizon beyond the range of land-based systems, thus requiring cross-domain fires from the sea, air, and space.

Perkins and Holmes go on to map out the next steps.

In the coming year, the Army and Air Force will be conducting a series of experiments and initiatives to help determine the essential components of MDB C2. Between the Services there is a common understanding of the future operational environment, the macro-level problems that must be addressed, and the capability gaps that currently exist. Potential solutions require us to ask questions differently, to ask different questions, and in many cases to change our definitions.

Their expectation is that “Frameworks will tend to merge—not as an either/or binary choice—but as a realization that effective cross-domain operations on the land and sea, in the air, as well as cyber and electromagnetic domains will require a merged framework and a common operating picture.”

So far, so good. Stay tuned.

Book Commentary: Thank You for Being Late

Book Commentary: Thank You for Being Late

Not long ago I had the opportunity to read a book from my long reading list. "Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations" is a book by the famous author and journalist Thomas L. Friedman and, as you might expect, a best-seller.

Admittedly, I have a mild tendency to avoid best-sellers ―I’ve run into some disappointments when reading them― so I was a bit reluctant to read it, especially because this one was, according to the back cover, close to things I’m familiar with as an industry analyst and consultant in the technology market.
Yep, a typical case of “why should I read it if I know what it’s about.”

I was wrong. Right from the first pages the book was engaging, entertaining and yet quite insightful. It guides you through the recent and profound evolution of information technology in a fluent way.

Full of information from extensive interviews, and with a clear narrative ―essential, I think, for a book that addresses technology― it recounts the events that have shaped the evolution of technology and, from there, the huge effect this has had on our lives.

Moreover, it provides a structure and describes the connections between the different elements of technology that are dramatically changing the world: cloud computing, mobility, big data and, of course, the Internet of Things.

Mr. Friedman has truly done a good job of describing and connecting the dots: modern life has many benefits, yet it also contributes to our sense of unease and anxiety, as we are unable to keep pace with technological advances, with the increasing volumes of information to digest, or with the worries that come from having our information moving across networks or resting in public servers potentially vulnerable to attacks and security breaches.

The book also contains several accounts of direct conversations with those making history. Described with a good level of detail, it does not become just a gathering of facts but a nice compendium of their reflections and thinking, which gives us a glimpse of how the people transforming our lives think about the present and the future.

According to Mr. Friedman, despite how problematic life can be in a time of continuous acceleration, the book takes an optimistic approach: even when modern life can sometimes be daunting and overwhelming, much of this technology is here to make our lives better. That optimism is perhaps one of the few points where my own view differs.

I would have liked to find further exploration of the less pleasant side of technology (security concerns, and the sociological and societal issues it provokes) to get a more balanced view of what technology means in our lives today.

Finally, despite the fact that this book talks about technology in a general way, and contrary to those who would not recommend it for people in the tech field, I will dare to do so, as it presents a fresh perspective on the evolution of technology, the reality of the world, and its potential impact on our present and our future.

It is also a nice call to slow down, reflect and live through our time at our own speed.

Mr. Friedman's book contains good nuggets of information that can be entertaining and informative for those both in and out of the tech scene.
Do You Know How Big Data Is Helpful To Immigrants?

Do You Know How Big Data Is Helpful To Immigrants?

There have been calls in the international community for improved quality and availability of migration statistics for several decades. As with any other policy domain, data on migration ought to be the foundation for informed decisions, but that's not always the case.

That's crucial anytime there is a displacement crisis, like the ones happening across the European continent. Individuals and families from places like Afghanistan and Syria put their lives at risk as they flee violence and conflict, hoping for a better life.

Timely, accurate statistics are necessary to give these migrants effective assistance and to counter common misperceptions, which often just postpone the humanitarian and political responses that are so urgently needed.

How does migration data even matter? Migration statistics prove crucial to the design and implementation of sensible and practical migration policies. They're critical if migration is ever to be mainstreamed into planning for national development and growth, as the impact of migration on a particular country is assessed. It's also a chance to look at how development conversely impacts migration, as well as how policies might shape the behavior of migrants. If a nation doesn't have robust migration statistics, then it can't seriously plan ...


Read More on Datafloq
Smart Society: How to Trust Artificial Intelligence

Smart Society: How to Trust Artificial Intelligence

The concept of a smart society has been around for a long time, but the progress we have seen towards achieving it in the last decade has really been a giant leap for mankind. For those of us who are unaware, the smart society which looms over us is the future of mankind. We are about to enter a phase where living smart is the baseline, and everything else just falls into the jigsaw to complement that lifestyle. In smart societies, we are blessed with smart cities that run through the application of smart accessories and smart buildings. 

In smart societies, we have smart cars (also known as self-driving cars or autonomous vehicles). We expect a better flow of traffic, with traffic management propelled by extensive and authentic data provided by these vehicles and analysed with smart algorithms (e.g. based on AI). The most prominent feature of smart societies as we know them now is the pervasiveness of the Internet of Things (IoT) at the smallest level. The implementation of IoT at micro levels drives the need for self-learning algorithms, hence the emphasis on AI. Eventually, it all comes together to form the bigger picture of a smart society.

AI ...


Read More on Datafloq
How Enterprise Chatbots Platforms Will Change Customer Service

How Enterprise Chatbots Platforms Will Change Customer Service

The typical scenario of a customer calling up a customer service agent to answer a query is passé. The expectations of customers have risen with time, which requires customer service agents to be available round the clock for any enquiry, complaint, or other request.

As a result, enterprises are having to invest more in acquiring an adequate number of team members and training them to get the job done. The other possible solution is enterprise chatbots, which can make 24/7 customer service a reality. These chatbots can provide fast solutions. With the help of Artificial Intelligence, they can interpret data well and give more accurate and personalised answers.

They cut business costs and can take on repetitive work. Not only do they keep end users happy, but they also take workload off their customer service counterparts, allowing them to concentrate on more complex tasks.

4 Impressive Chatbot Statistics


According to a study by Gartner, by 2020, 85% of all customer interactions will be handled without a human agent.
80% of businesses want chatbots by 2020 — Oracle
The global chatbot market is set to grow at a CAGR of 37.11% between 2017 and 2021 — Orbis Research
Chatbots expected to cut business costs by $8 billion ...


Read More on Datafloq
How to Make the Most of Data Storage Services

How to Make the Most of Data Storage Services

As data storage options become exponentially more accessible and simultaneously more complex, business owners and individuals are left to wonder where to store their sensitive data. Cloud storage services, physical backup drive manufacturers and off-site data storage centres are all competing for your data storage business. With high competition in the data storage services market, consumers have several excellent options for storing large and sensitive files today. Read on to find out where you should be storing your data among the plethora of data storage solutions available this year.

Cloud-Based Data Storage Solutions

There are several benefits to storing your data in the cloud. Whether you're a business owner or an individual with lots of media, information and financial data to keep secure, you'll appreciate the newer cloud-based storage solutions from companies like Amazon and Microsoft. Amazon's cloud storage solution, AWS, is a scalable, pay-as-you-go option for uploading files of virtually any size from any number of terminals. You are charged on a per-usage basis, so you don't need to worry about paying for unused room, unlike traditional off-site data storage plans that often require renting an entire dedicated server.
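To give a sense of how little ceremony the pay-as-you-go model involves, here is a minimal sketch using the AWS SDK for JavaScript; the bucket and file names are hypothetical placeholders.

```js
// Minimal sketch: upload one file to Amazon S3; you pay only for storage used.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({ region: 'us-east-1' });

s3.putObject({
  Bucket: 'example-company-backups',    // hypothetical bucket name
  Key: 'finance/quarterly-report.xlsx', // hypothetical object key
  Body: fs.readFileSync('quarterly-report.xlsx'),
}, (err) => {
  if (err) console.error('Upload failed:', err);
  else console.log('Upload complete');
});
```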

Microsoft's Azure cloud storage system functions similarly, allowing users to disregard hardware ...


Read More on Datafloq
6 Risks with Using Predictive Analytics for Conversion Rate Optimization

6 Risks with Using Predictive Analytics for Conversion Rate Optimization

Conversion rate optimisation is very important, but it is also a very imprecise science. The biggest challenge is that customer behaviour is constantly evolving. The practices that customers responded to five years ago may not be effective today.

Marketers must use the latest tools and strategies to maintain a decent ROI. Predictive analytics models are highly effective, but they aren’t foolproof by any means. If you are planning on using predictive analytics to boost your online conversion rate, you will want to avoid making the following mistakes.

1. Be Wary of Your Ability to Understand Changes in Social Psychology at a Macro-Level

Some experts attempt to use predictive analytics models to identify future fads or predict changes in customer behaviour. You need to be cautious with these models because they are notoriously unreliable.

Even the most insightful predictive analytics model cannot account for unplanned variables. New events or product offerings can change customer perception overnight.

This doesn’t mean that predictive analytics tools are useless for conversion rate optimisation. However, there is a large margin of error, so you should only focus on variables that are easiest to analyse.

2. Don’t Extrapolate Data to Unrelated Campaigns

Many different variables affect your campaigns. These variables can include:


Marketing mediums or ...


Read More on Datafloq
Big Data and Privacy: The Consumer Paradox

Big Data and Privacy: The Consumer Paradox

Each minute Google records 3.8 million different queries on its search engine, and Facebook as many "likes".

For twenty years, new information technologies have increased the ability for businesses to collect, store, read, share and use the personal information of individuals. These activities have had many positive effects on the economy in general and some benefits for consumers, including greater personalization and better targeting of products and services.

However, undesirable effects arising from the exploitation of these data have also emerged, the most important being the invasion of privacy. Today, according to the annual intrusion barometer conducted for the fifth consecutive year by the Publicis ETO agency, nearly 78% of French people are bothered by the fact that their information is collected and stored in databases.

So what are the end-users' perceptions of the violation of their privacy in a Big Data environment?

The Challenges Of Big Data In Customer Knowledge

The growing collection of information related to consumer habits, preferences or expectations has given rise to Big Data. This mass of data represents a real competitive advantage today because it can be used by the company to better respond to its customers.

Indeed, Big Data is above all a great opportunity for ...


Read More on Datafloq
Big Data or Small Data? Here’s Why You Need Both

Big Data or Small Data? Here’s Why You Need Both

Small things often come in big packages, and this is something the enterprise world is starting to realise as it continues to adjust to the Big Data paradigm. Data analysis must be conducted before making strategic business decisions these days, and this analysis is likely to come from one of the many tools provided by the Big Data industry, a sector expected to generate sales of more than $187 billion next year.

It is safe to assume that Big Data is here to stay; however, the enterprise focus is likely to shift towards reducing the size of the massive data sets being collected and processed for the purpose of perfecting the analysis and making even better decisions.

The Problem With Big Data

In the world of Big Data, one size does not fit all. Over the last few years, the enthusiasm for being able to collect infinite amounts of information has overshadowed analytical implementation. The problem with Big Data is that it has grown out of proportion for many businesses. There is no question that computer science is progressing in the right direction with Big Data; however, only a small portion of the enterprise world stands to benefit from the technical aspects of ...


Read More on Datafloq
How Telcos can Fully Benefit from IoT Eco-Systems

How Telcos can Fully Benefit from IoT Eco-Systems

As the world progresses through an age of development in the Internet of Things (IoT) and related technologies, the concept of the digital consumer is on the rise. Consumers today want to experience the convenience this technology promises. From customer support resources that fulfil their needs to a seamless experience across platforms, consumers really want the full taste of this development in technology. The high expectations of the digital consumer mean that Telcos now have to understand consumer preferences and deliver a solution aligned with the needs of today’s consumer. Not only is this expected of Telcos, but meeting it can also increase their revenue streams. 

One way to fulfil the needs of consumers flawlessly is through the use of IoT ecosystems. Today, I will talk in depth about how Telcos can use these concepts to their benefit. I recently had the opportunity to join a few other notable names, such as Dez Blanchfield, Lillian Pierson and Ruven Cohen, on a tour of the Ericsson Studio in Kista, Sweden. At the facility, we had a conversation with Elias Blomqvist, Strategic Product Manager at Ericsson. The conversation with one of the leading members ...


Read More on Datafloq
Robert Work On Recent Chinese Advances In A2/AD Technology

Robert Work On Recent Chinese Advances In A2/AD Technology

An image of a hypersonic glider-like object broadcast by Chinese state media in October 2017. No known images of the DF-17’s hypersonic glide vehicle exist in the public domain. [CCTV screen capture via East Pendulum/The Diplomat]

Robert Work, former Deputy Secretary of Defense and one of the architects of the Third Offset Strategy, has a very interesting article up over at Task & Purpose detailing the origins of the People’s Republic of China’s (PRC) anti-access/area denial (A2/AD) strategy and the development of military technology to enable it.

According to Work, the PRC government was humiliated by the impunity with which the U.S. sailed its aircraft carrier task forces through the waters between China and Taiwan during the Third Taiwan Straits crisis in 1995-1996. Soon after, the PRC began a process of military modernization that remains in progress. Part of the modernization included technical development along three main “complementary lines of effort.”

  • The objective of the first line of effort was to obtain rough parity with the U.S. in “battle network-guided munitions warfare in the Western Pacific.” This included detailed study of U.S. performance in the 1990-1991 Gulf War; development of a Chinese version of a battle network that features ballistic and guided missiles;
  • The second line of effort resulted in a sophisticated capability to attack U.S. networked military capabilities through “a blend of cyber, electronic warfare, and deception operations.”
  • The third line of effort produced specialized “assassin’s mace” capabilities for attacking specific weapons systems used for projecting U.S. military power overseas, such as aircraft carriers.

Work asserts that “These three lines of effort now enable contemporary Chinese battle networks to contest the U.S. military in every operating domain: sea, air, land, space, and cyberspace.”

He goes on to describe a fourth technological line of effort: the fielding of hypersonic glide vehicles (HGVs). HGVs are winged re-entry vehicles boosted aloft by ballistic missiles. Moving at hypersonic speeds at near-space altitudes (below 100 kilometers) yet remaining maneuverable, HGVs carrying warheads would be exceptionally difficult to intercept even if the U.S. fielded ballistic missile defense systems capable of engaging such targets (which it currently does not). The Chinese have already deployed HGVs on the Dong Feng (DF) 17 intermediate-range ballistic missile and have begun operational testing of HGVs with intercontinental range.

Work concludes with a stark admonition: “An energetic and robust U.S. response to HGVs is required, including the development of new defenses and offensive hypersonic weapons of our own.”

Why Banning Bitcoin and other Cryptocurrencies is Not the Solution

Why Banning Bitcoin and other Cryptocurrencies is Not the Solution

Many government bodies, regulators, investors and banks see Bitcoin, or even the whole cryptocurrency market, as a speculative bubble. Some say it is a fraud, while others speculate that it will continue to rise in 2018. Earlier, I wrote an article on why I believe that Bitcoin will fail. Not because it is a fraud, but because it is flawed. Despite my belief that Bitcoin will eventually fail, blockchain and some, not all, cryptocurrencies will likely reach mass adoption and bring significant change to how we run organisations and societies. Because the innovation of blockchain and cryptocurrencies is too important, I do not believe that banning cryptocurrencies, as multiple countries are now considering, is a solution.

Cryptocurrencies are a completely new way of doing business, raising money, transferring money, making transactions, and so on. Bitcoin is only nine years old, and as such, we are still learning how the underlying technology, blockchain, works; how we should deal with it; and how we can implement it within our products and services, resulting in a plethora of new cryptocurrencies.

Cryptocurrencies and opportunism

Of course, with such a new technology, there is a lot of opportunism, which is what you see happening now. This is nothing ...


Read More on Datafloq
The Top 7 IoT Trends For 2018

The Top 7 IoT Trends For 2018

The Internet of Things (IoT) can be defined as the network of home appliances, vehicles and other physical devices embedded with network connectivity, actuators, sensors, software and electronics, which permit these objects to connect and exchange data. Each item is uniquely identifiable via its embedded computing system, yet it can interoperate within the existing Internet infrastructure.

Experts have projected that the IoT will comprise nearly 30 billion items by 2020. In addition, it has been forecast that the worldwide market value of IoT will surge to $7.1 trillion by 2020. These figures make it essential for entrepreneurs to keep an eye on the latest IoT trends and incorporate them into their business practices.

Keeping this scenario in mind, we present the list of the top 7 IoT trends for 2018.

1. The IoT Will Expand

The clearest and most widely shared forecast of the decade is that the IoT will keep growing in 2018, with additional devices and gadgets coming online every single day. However, it is also important to identify the areas where IoT growth will occur.

Three areas in particular (IoT in the supply chain, IoT in healthcare, and IoT in retail) will probably ...


Read More on Datafloq
What AI Has In Store for SEO

What AI Has In Store for SEO

A popular conception of AI is the robot that looks, acts, and talks like a human. Invariably, it seems this robot might be capable of developing emotions or overthrowing its master because, well, the robot is always learning and becoming smarter. We don’t know what it’s capable of (see the plot of the 2004 movie I, Robot). How close is this conception to the truth?

We live in a world increasingly dominated by those who know best how to use big data. Jeff Bezos is now the richest man in the world because Amazon beats everyone else at using big data for everything from recommendations to shipping practices. It’s not a big stretch to assume an artificially intelligent machine will dominate in a world where big data is king. Elon Musk, for one, is afraid artificial intelligence will render humans obsolete.

But for now, AI is working to consistently do things like help you find the right website. Search engine optimization is one field in which AI is making a big difference. Current iterations of AI in the SEO world focus on rendering results for people in real time. AI is changing the fundamental nature of the game. What will the game ...


Read More on Datafloq
Moving Data from the Basement to the Boardroom

Moving Data from the Basement to the Boardroom

According to IDG, 35% of companies that use their data effectively grow faster year-on-year. However, many still take little notice of data quality or ongoing data management, pushing both into the deepest corners to be eventually abandoned and forgotten.

If this sounds like you, it’s time to consider putting data at the top of your business agenda again…

Big Data – a neglected but important aspect of business survival

As reported on Forbes, more data has been created in the past two years than in the entire previous history of the human race. This has led to the phenomenon known as ‘big data’. However, while its importance is well understood today, things weren’t so rosy a few years back. According to a report from PricewaterhouseCoopers (PwC), even in 2015 only a small percentage of companies reported effective data management practices.

Moreover, the report found that while 75% of business owners claimed to be “making the most of their information assets”, only 4% were actually set up for success. This was due to a lack of tools and in-house knowledge of effective data management, as well as disparate data sources. Overall, 43% of companies surveyed “obtain little tangible benefit from their information”, while 23% “derived no benefit whatsoever”.

The dangers of poor data management 

Having no data strategy is an ...


Read More on Datafloq
Nuclear Buttons (continued)

Nuclear Buttons (continued)

Right now, I gather the President of the United States has the authority to unilaterally fire off the entire U.S. nuclear arsenal on a whim. Whether such an order would actually be carried out is hard to say, but I gather there is no real legal impediment to him waking up one morning and deciding to nuke some city, and no formal process in place that actually stops him from doing so.

This is a set of conditions that came into being during the Cold War for the sake of making our nuclear deterrent and strike and counterstrike capability more credible. The U.S. and Russia no longer have their nukes targeted at each other, but this is more a matter of good manners and is something that could be changed at a moment’s notice.

Is it time for the United States to consider placing the authority to launch nuclear weapons under the control of more than one person? Perhaps the authority of three people: the president, a senior military leader, and a representative of Congress?


There is a little technical difficulty here, for in the case of an emergency the President, Vice President and Speaker of the House would be shuttled off to separate locations. Still, there could be a designated representative for the military (the commanding general of United States Strategic Command or his representative) and one of our 535 representatives or senators appointed to act for Congress. There are any number of ways to make sure that three people would be required to authorize the launch of a nuclear weapon, rather than leaving a decision that could exterminate millions in seconds in the hands of one man. With the Cold War now in the distant past, and nuclear strike forces a fraction of their original size, maybe it is time to consider changing this.


How to Structure a Data Science Team: Key Models and Roles to Consider

How to Structure a Data Science Team: Key Models and Roles to Consider

If you’ve been following the direction of expert opinion in data science and predictive analytics, you’ve likely come across the resolute recommendation to embark on machine learning. As James Hodson in Harvard Business Review recommends, the smartest move is to reach for the “low-hanging fruit” and then scale for expertise in heavier operations.

Just recently we talked about machine-learning-as-a-service (MLaaS) platforms. The main takeaway from the current trends is simple. Machine learning becomes more approachable for midsize and small businesses as it gradually turns into a commodity. The leading vendors – Google, Amazon, and Microsoft – provide APIs and platforms to run basic ML operations without a private infrastructure and deep data science expertise. In the early stages, taking this lean and frugal approach would be the smartest move. As analytics capabilities scale, a team structure can be reshaped to boost operational speed and extend an analytics arsenal.
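
To make that concrete, here is a minimal sketch (my illustration, not the article's) of running a basic ML operation through a vendor API rather than a private infrastructure, using Amazon Comprehend's sentiment endpoint via boto3; it assumes AWS credentials and a region are already configured:

```python
# A minimal sketch of machine-learning-as-a-service: sentiment analysis
# through a managed API, with no model training or hosting of our own.
# Assumes AWS credentials are configured for boto3.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

response = comprehend.detect_sentiment(
    Text="The onboarding flow was confusing, but support fixed it quickly.",
    LanguageCode="en",
)

print(response["Sentiment"])       # e.g. "MIXED"
print(response["SentimentScore"])  # per-class confidence scores
```

The design point is the lean start: a few lines against a commodity API can validate a use case before any investment in a dedicated data science team.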

How do you implement this incremental approach? This time, we talk about data science team structures and their complexity.

Data science team structures

Embarking on data science and predictive analytics requires a clear understanding of how the initiative is going to be introduced, maintained, and further scaled regarding the team structure. We recommend considering three basic team structures ...


Read More on Datafloq
How to Find and Eliminate Duplicate Data in Your Database

How to Find and Eliminate Duplicate Data in Your Database

Duplicate data is a problem that plagues many businesses, but it’s relatively easy to spot and prevent—once you understand its nature, and what to do about it. You also have many potential options to explore when identifying and eliminating duplicate data, so you can find the best methodology for your business and needs.

But how do you get started if this is a new problem for you to resolve?
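
One common first pass, sketched here with pandas (a possible approach, not necessarily the article's own), is to normalise the fields most likely to hide duplicates and then flag rows that collide; the CSV file and column names are hypothetical:

```python
# A minimal sketch: find and remove exact duplicates in a customer table.
# The file name and the "name"/"email" columns are assumptions for the example.
import pandas as pd

df = pd.read_csv("customers.csv")

# Normalise fields so trivial variations ("Bob@X.com" vs "bob@x.com ")
# do not hide duplicates.
df["email_norm"] = df["email"].str.strip().str.lower()
df["name_norm"] = df["name"].str.strip().str.lower()

# Flag every row that shares a normalised email with at least one other row.
dupes = df[df.duplicated(subset=["email_norm"], keep=False)]
print(f"{len(dupes)} rows share an email with another row")

# Keep the first occurrence in each duplicate group and drop the rest.
deduped = df.drop_duplicates(subset=["email_norm"], keep="first")
deduped.to_csv("customers_deduped.csv", index=False)
```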

Why Duplicate Data Is an Issue

First, it pays to understand why duplicate data is a problem.

These are some of the biggest issues to note:


Recordkeeping issues. The first and possibly most obvious problem is with the accuracy and reliability of your recordkeeping. For example, let’s say you accidentally list the same business twice in your sales records; the sales figures for that company will double and, therefore, cause your revenue projections to spike unjustifiably. With a nagging dupe data issue, you’ll be far more prone to overestimations when looking at groups of data, and you may have greater difficulty tracking down the exact data you need when looking up specific instances.
System storage and bulk. Duplicate data also puts an increased burden on your tables, clogging up your system with unnecessary information. On a small ...


Read More on Datafloq
3 Advances in the Internet of Things That Will Change Our Daily Lives

3 Advances in the Internet of Things That Will Change Our Daily Lives

At the start of the new year, many tech-savvy individuals are looking to the Internet of Things to learn what's coming in the way of new advances. Technology is ever evolving, providing products and services that would have been unimaginable even five years ago. Here's a look at a few innovations that will change day-to-day living in 2018.

The Ways We Connect Will Become More Accessible

While satellite M2M technology has been able to provide network communications to remote areas, it is not the only impressive advance changing the way the world communicates. Even in populated areas, connecting to a Wi-Fi provider requires a number of optimum conditions. Your battery level, bandwidth allocation, and range are just a few of the factors that affect your ability to connect in a given area, but that's all about to change.

As we head into the new year, expect to see greater availability of low-power, short-range networks. In fact, these types of connections are expected to far exceed the number of longer-range connections over the next 8 to 10 years. While this is beneficial to businesses, it also means individual users will have a greater pool of choices for their networking needs. ...


Read More on Datafloq
1st Security Force Assistance Brigade To Deploy To Afghanistan In Spring

1st Security Force Assistance Brigade To Deploy To Afghanistan In Spring

Capt. Christopher Hawkins, 1st Squadron, 38th Cavalry Regiment, 1st Security Force Assistance Brigade, middle, and an interpreter speak with local national soldiers to gain information about a village during a military operations on urban terrain event at Lee Field, Oct. 23, 2017, on Fort Benning, Ga. (Photo Credit: Spc. Noelle E. Wiehe)

The U.S. Army recently announced that the newly-created 1st Security Force Assistance Brigade (SFAB) will deploy to Afghanistan under the command of Colonel Scott Jackson in the spring of 2018 in support of the ongoing effort to train and advise Afghan security forces. 1st SFAB personnel formed the initial classes at the Military Advisor Training Academy (MATA) in August 2017 at Fort Benning, Georgia; approximately 525 had completed the course by November.

The Army intends to establish five Regular Army and one Army National Guard SFABs. In December it stated that the 2nd SFAB would stand up in January 2018 at Fort Bragg, North Carolina.

The Army created the SFABs and MATA in an effort to improve its capabilities to resource and conduct Security Force Assistance (SFA) missions and to relieve line Brigade Combat Teams (BCTs) of these responsibilities. Each SFAB will be manned by 800 volunteer senior officers and noncommissioned officers with demonstrated experience training and advising foreign security forces.

Specialized training at MATA includes language, foreign weapons, and the Joint Fires Observer course. SFAB commanders and leaders have previous command experience, and enlisted advisors hold the rank of sergeant and above. As of August 2017, recruiting for the first unit had fallen short by approximately 350 personnel, though the shortfall appears to have been remedied. The Army is working to address policies and regulations with regard to promotion rates and boards, selection boards, and special pay in order to formalize an SFAB career path.

The Link between Good SEO and Big Data

The Link between Good SEO and Big Data

According to the latest statistics, there are currently around 7 zettabytes of data on the internet, and that number is increasing as we speak. So, where’s all of this data coming from? Simply put, we all create tons of data every day. For instance, we upload 576,000 hours’ worth of video to YouTube every day; Facebook has around 1.3 billion daily active users, who post pictures and videos almost hourly; and we are going to create a lot more data in the future.

Today, most internet traffic comes from mobile devices and according to the Pew Research Center, a staggering 77% of Americans own a smartphone. As smartphone usage rates continue to grow in the rest of the world, we can expect the volume of data to grow as well.

How does Big Data impact SEO?

In simple terms, Big Data describes the huge volumes of data – both unstructured and structured – that flood an organization on a day-to-day basis. If you want to use all of this information properly, you have to understand how Big Data relates to Search Engine Optimization.

For starters, Google is the biggest Big Data organization in the world. And as SEO ...


Read More on Datafloq
