3 Strategies to Wow Your Customers with Omnichannel Experiences


Did you know that companies with omnichannel customer engagement strategies retain, on average, 89% of their customers? This compares with a 33% customer retention rate for companies with weak omnichannel strategies. (Invesp)

What exactly does it mean to be omnichannel?

According to Wikipedia, omnichannel retailing uses a variety of channels in a customer’s shopping experience, including research before a purchase. Such channels include physical stores, online stores, mobile stores, mobile app stores, telephone sales, and any other method of transacting with a customer.

Whatis.com defines omnichannel as a multichannel approach to sales that seeks to provide the customer with a seamless shopping experience, whether the customer is shopping online from a desktop or mobile device, by telephone, or in a bricks-and-mortar store.



Image Source: the DMA

How do consumers shop across channels?

Here are 14 statistics that give further insights into the shopping behaviors of today’s typical consumer:

1. 98% of Americans switch between devices in the same day. (Google Research)

2. 15 years ago, the average consumer typically used two touch-points when buying an item, and only 7% regularly used more than four. Today, consumers use an average of almost six touch-points, with nearly 50% regularly using more than four. (Marketing Week)

3. Over 35% of customers expect to be able to ...


Read More on Datafloq
The Race is On: How Blockchain Will Change Governments


We are at a global crossing point, and if we manage to solve some of the problems we have created in the past decades, the world can become a better place. When the internet was developed 28 years ago, the objective was to create a decentralised internet. Somehow, however, we ended up with an internet that is in the hands of a few very powerful companies. As a result, the internet can go down for millions of people if one of those companies has a problem. This was the case a few months ago, when AWS experienced ‘a glitch’, better known as a typo, which brought down the internet for millions of users.

Sir Tim Berners-Lee explained it eloquently during the Decentralised Web Summit in 2016:

“The web was designed to be decentralised so that everybody could participate by having their own domain and having their own webserver, and this hasn’t worked out. Instead, we’ve got the situation where individual personal data has been locked up in these silos.” - Sir Tim Berners-Lee

Fortunately, a new technology has appeared that can solve the existing problems with the internet and bring it back to its origins; a decentralised web, where everyone can participate ...


Read More on Datafloq
Why Business Analysts Play a Crucial Role for an IT Company


The role of the business analyst is gaining prominence with each passing year, most significantly in the IT industry. Business analysts play a crucial role in shaping the future of many businesses and have become a crucial part of the functioning of any industry.

It’s been said that no business can reach its pinnacle of success without a business analyst.

Who exactly are business analysts?

Business analysts are the people responsible for diving deep into the unlimited insights offered by clients and for guiding the entire team towards meeting those requirements.

Business analysts use their vision and analytical capability to bring valuable experience to the team in handling sales and various business processes, and even in framing all-important documents.

Business analysts play a crucial role in an IT company

Business analysts play an all-important role in an IT company and are responsible for carrying out a variety of activities. Their role becomes even more important because they contribute to the better delivery of projects that meet clients’ requirements and demands.

With the below-mentioned functions of business analysts in an IT company, they are the ...


Read More on Datafloq
How Will Marketers Use Big Data in the Future?


With the advance of the Internet of Things and AI, the big data landscape for marketers is going to change. The IoT will be generating far more data than businesses are used to seeing. With that, marketing agencies and departments will need to decide which data they’ll use and which data is too sensitive to incorporate in campaigns.

In other words, as every aspect of our lives is increasingly converted into data, will anything be outside the scope of the marketing eye? And to what extent will marketers deliver messages through new, connected tech?

Digital marketing agencies are riding a wave of business trends directly related to big data:


Businesses are spending less on internal operations because they’re spending more on marketing technology
83 percent of businesses expect to see the demand for marketing analytics grow alongside an increase in data collection      


In part, the increase in data collection stems from the number of connected devices contributing to the IoT. It also comes from the fact that businesses are seeing an ROI from data collection and analytics. In the past, it was fairly normal to doubt that marketing personalization through data would increase revenue and profits. Now, there’s no doubt in anyone’s mind ...


Read More on Datafloq
How Big Data Brings Changes to Insurance & Health Care


Big data analysis is a revolutionary tool in the modern world, and it has brought significant changes to industries ranging from healthcare and politics to insurance and sport. In this article, a big data consulting firm discusses how big data as well as wearable gadgets are now transforming the health and insurance sectors in 2017. First, let's look at how big data is changing health care.

Wearable technology is the latest smart revolution in the healthcare sector. These are portable gadgets ranging from activity trackers and pacemakers to smartwatches. While we can’t really count the number of these gadgets on the market today, there are over a hundred thousand healthcare apps available to users, according to the report, and these can be downloaded from app stores.

How Big Data Brings Changes to Healthcare and Insurance Industries in 2017

Healthcare and pharma industries around the globe are experiencing significant challenges, including low access, high costs, little price transparency, over-prescription, declining trust, a weak R&D pipeline, concerns over medication effectiveness, etc. With big data, both industries can overcome most of these challenges by simply adopting the analysis technology.

The healthcare industry is vast and lucrative, especially for the technology sector to invest more resources in ...


Read More on Datafloq
Gauging The Effects Of Technological Advancements In The World Of Medicine


Technological advancements continue to take place on an almost daily basis, and they have begun to have an effect on a number of unexpected sectors of society. As mobile app developers continue to work around the clock, and Android app development and iPhone app development keep advancing, the medical community will also continue to experience a wide range of effects.

Thanks to the growth of mobile medical apps, patients and practitioners alike are both able to enjoy numerous benefits. With the advent of mobile monitoring and the increased number of telehealth systems, hospitalizations are expected to be reduced significantly, as well as emergency room visits.

This will benefit both sides of the equation, as patients who do not reside in areas that are close to hospitals and emergency rooms can now receive assistance without having to drive long distances, while the doctors and nurses who are on call are given the chance to focus their efforts on the patients who are already present.

Apps allow for improved communication between health care specialists, as well as better coordination of care. Patients can remain engaged with their doctors even when they are not on the premises, which leads to far better maintenance of any pre-existing ...


Read More on Datafloq
Why Mobility Data in the Retail Industry is Important


Just how do major retailers use mobility data to stay competitive in this fast-paced industry? Read on and adopt these methods for your retail store today.

Mobility data refers to the trajectories or locations of people and objects. In the retail industry, the movement of shoppers and prospective customers is of particular interest. This is commonly referred to as footfall, which involves measuring the number of people in and around a store or area within a period of time.

With footfall data, you can determine other key metrics that are crucial for survival in today’s retail climate such as shopper traffic, estimated conversion rates and estimated transaction values.
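As a quick illustration with made-up numbers, those derived metrics fall straight out of footfall counts (every figure here is hypothetical):

```python
# Hypothetical hourly figures for one store: footfall count,
# completed transactions, and total sales value.
footfall = 420          # people counted in and around the store this hour
transactions = 63       # purchases completed in the same hour
sales_value = 2835.00   # total value of those purchases

# Estimated conversion rate: the share of visitors who bought something.
conversion_rate = transactions / footfall

# Estimated average transaction value.
avg_transaction_value = sales_value / transactions

# Sales per visitor ties the two together: missed footfall is missed revenue.
sales_per_visitor = sales_value / footfall

print(f"Conversion rate:       {conversion_rate:.1%}")    # 15.0%
print(f"Avg transaction value: {avg_transaction_value:.2f}")  # 45.00
print(f"Sales per visitor:     {sales_per_visitor:.2f}")      # 6.75
```

Tracked hour by hour, these same three ratios reveal exactly when a store converts well and when purchasing opportunities are being missed.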

Why is mobility data important?

Calculating footfall in and around a store helps managers make business decisions such as how to optimize marketing strategies, when to launch promotions, or where to allocate labor. With footfall data, managers can identify, at any time of the day, how many purchasing opportunities they have in each store – and how many they are missing.

Knowing your weekly, daily, and hourly retail footfall can give you the extra edge you need to boost sales productivity, increase conversions, and make operational changes in real time.

Growing your retail business with mobility data 

Here ...


Read More on Datafloq
Blockchain Technology Explained: Powering Bitcoin


What’s The Big Deal With Bitcoin?

Like most good stories, the bitcoin saga begins with a creation myth. The open-source cryptocurrency protocol was published in 2009 by Satoshi Nakamoto, an anonymous developer (or group of developers) hiding behind this alias. The true identity of Satoshi Nakamoto has not been revealed, although the concept traces its roots back to the cypherpunk movement, and there’s no shortage of speculative theories across the web regarding Satoshi’s identity.

Bitcoin spent the next few years languishing, viewed as nothing more than another internet curiosity reserved for geeks and crypto-enthusiasts. Bitcoin eventually gained traction within several crowds that had little to nothing in common – ranging from early fans and black hat hackers to anarchists, libertarians, and darknet drug dealers – and was eventually accepted by legitimate entrepreneurs and major brands like Dell, Microsoft, and Newegg.

While it is usually described as a “cryptocurrency,” “digital currency,” or “virtual currency” with no intrinsic value, Bitcoin is a little more than that.

Bitcoin is a technology, and therein lies its potential value.


This is why we won’t waste much time on the basics – the bitcoin protocol, proof-of-work, the economics of bitcoin “mining,” or the way the bitcoin network functions. Plenty ...


Read More on Datafloq
The next line of defense—How new security leverages virtualization to counter sophisticated threats


When it comes to securing systems and data, the bad guys are constantly upping their game -- finding new ways to infiltrate businesses and users. Those who protect systems from these cascading threats must be ever vigilant for new technical advances in detection and protection. In fact, they must out-innovate their assailants.

The next BriefingsDirect security insights discussion examines the relationship between security and virtualization. We will now delve into how adaptive companies are finding ways to leverage their virtualization environments to become more resilient, more intelligent, and how they can protect themselves in new ways.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how to ensure that virtualized data centers do not pose risks -- but in fact prove more defensible -- we are joined by two security-focused executives, Kurt Roemer, Chief Security Strategist at Citrix, and Harish Agastya, Vice President for Enterprise Solutions at Bitdefender. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Kurt, virtualization has become widespread and dominant within data centers over the past decade. At that same time, security has risen to the very top of IT leadership’s concerns. What is it about the simultaneous rise of virtualization and the rise of security concerns? Is there any intersection? Is there any relationship that most people may miss?

Roemer: The rise of virtualization and security has been concurrent. A lot of original deployments for virtualization technologies were for remote access, but they were also for secure remote access. The apps that people needed to access remotely were usually very substantial applications for the organization -- things like order processing or partner systems; they might have been employee access to email or internal timecard systems. These were things that you didn’t really want an attacker messing with -- or arbitrary people getting access to.

Security has grown from just providing basic access to virtualization to really meeting a lot of the risks of these virtualized applications being exposed to the Internet in general, as well as now expanding out into the cloud. So, we have had to grow security capabilities to be able to not only keep up with the threat, but try to keep ahead of it as well.

Gardner: Hasn’t it historically been true that most security prevention technologies have been still focused at the operating system (OS)-level, not so much at the virtualization level? How has that changed over the past several years?

Roemer: That’s a good question. There have been a lot of technologies that are associated with virtualization, and as you go through and secure and harden your virtual environments, you really need to do it from the hardware level, through the hypervisor, through the operating system level, and up into the virtualization system and the applications themselves.

We are now seeing people take a much more rigorous approach at each of those layers, hardening the virtualization system and the OS and integrating in all the familiar security technologies that we’re used to, like antivirus, but also going through and providing for application-specific security.

So if you have an SAP system, or something else where you need to protect very sensitive company data and you don’t want that data to be accessed outside the office arbitrarily, you can provide very set interfaces into that system: controlling the clipboard or copy and paste, and what peripherals the application can interface with -- i.e., turning off the camera and microphone if they’re not needed -- and even getting down to the level of the browser, such as whether JavaScript is enabled or Flash is available.

So it helps to harden the overall environment and cut down on a lot of the vulnerabilities that would be inherent by just leaving things completely wide open. One of the benefits of virtualization is that you can get security to be very specific to the application.

Gardner: Harish, now that we are seeing this need for comprehensive security, what else is it that people perhaps don’t understand that they can do in the virtualization layer? Why is virtualization still uncharted territory as we seek to get even better security across the board?

Let’s get better than physical

Agastya: Customers often don’t realize the difference when they are dealing with security in physical versus virtual environments. The opportunity that virtual environments provide is the ability to take security to a higher level than physical-only. So better than physical is, I think, a key value proposition that they can benefit from -- and the technology innovation of today has enabled that.

There is a wave of innovation among security vendors in this space. How do we run resource-intensive security workloads in a way that does not compromise the service-level agreements (SLAs) that those information technology operations (IT Ops) administrators need to deliver?

There is a lot of work happening to offload security-scanning mechanisms onto dedicated security virtual appliances, for example. Bitdefender has been working with partners like Citrix to enable that.

Now, the huge opportunity is to take that story further in terms of being able to provide higher levels of visibility, detection, and prevention from the attacks of today, which are advanced persistent threats. We seek to detect how they manifest in the data center and -- in a virtual environment -- what you have the opportunity to do, and how you can respond. That game is really changing now.

Gardner: Kurt, is there something about the ability to spin up virtualized environments, and then take them down that provides a risk that the bad guys can target or does that also provide an opportunity to start fresh: To eliminate vulnerabilities, or learn quickly and adapt quickly? Is there something about the rapid change that virtualization enables that is a security plus?

Persistent protection anywhere

Roemer: You really hit on the two sides of the coin. On one side, virtualization does oftentimes produce an image of the application, or the applications plus OS, that could be fairly easy for a hacker to steal, spin up offline, and use to get access to secrets. So you want to protect your images, to make sure that they are not something that can be easily stolen.

On the other side, having the ability to define persistence -- what you want to persist between reboots versus what’s non-persistent -- allows you to have a constantly refreshed system. So when you reboot it, it’s exactly back to the golden image -- and everything is as it should be. As you patch and update, you are working with a known quantity, as opposed to the endpoint, where somebody might have administrative access and has installed personal applications, browser plug-ins, and other things that you may or may not want to have in place.

Layering also comes into play and helps to make sure that you can dynamically layer in applications or components of the OS, depending on what’s needed. So if somebody is accessing a certain set of functionality in the office, maybe they have 100% functionality. But when they go home, because they are no longer in a trusted environment or maybe not working on a trusted PC from their home system, they get a degraded experience, seeing fewer applications and having less functionality layered onto the OS. Maybe they can’t save to local drives or print to local printers. All of that’s defined by policy. The nice thing with virtualization is that it’s independent of the OS, the applications, the endpoints, and the varied situations that we all access our apps and data from.

Gardner: Harish, with virtualization there is a certain level of granularity as to how one can manage security environment parameters. Can you expand on why having that granular capability to manage parameters is such a strong suit, and why virtualization is a great place to make that happen?

On the move, virtually

Agastya: That is one of the opportunities and challenges that security solutions need to be able to cope with.

As workloads move across different subgroups and sub-networks, the virtual machine (VM) needs a security policy that moves with it. The policy depends on what type of application is running, not on the region or sub-network that the VM resides on. That is something security solutions designed to operate in virtual environments have the ability to do.

Security moves with the workload, as the workload is spawned off and new VMs are created. The same set of security policies associated with that workload now can protect that workload without needing to have a human step in and determine what security posture needs to belong to that VM. 


That is the opportunity that virtualization provides. But it’s also a challenge: previous generations of security solutions predate all of this, and we now need to address that.

We love the fact that virtualization is happening and that it has become a very elastic software-defined mechanism that moves around and gives the IT operations people so much more control. It allows an opportunity to be able to sit very well in that environment and provide security that works tightly integrated with the virtualization layer.
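The "security follows the workload" behavior described above can be sketched as a toy policy lookup (all names here are hypothetical, not any vendor's API): the policy is keyed to the workload tag a VM carries, not to the sub-network it happens to sit on, so a spawned or migrated VM keeps its posture with no human intervention.

```python
# Minimal sketch (hypothetical names): policy resolution keyed to the
# workload tag carried by each VM, independent of network placement.

POLICIES = {
    "web-frontend": {"memory_introspection": True, "file_scan": "light"},
    "database":     {"memory_introspection": True, "file_scan": "full"},
}

def policy_for(vm):
    """Resolve a VM's security policy from its workload tag alone."""
    return POLICIES[vm["workload"]]

vm = {"name": "db-01", "workload": "database", "subnet": "10.0.1.0/24"}
# A clone spawned on a completely different sub-network...
clone = {**vm, "name": "db-02", "subnet": "10.0.9.0/24"}

# ...automatically inherits the same security posture.
assert policy_for(vm) == policy_for(clone)
```

The design point is simply that placement (subnet) never appears in the lookup, so elasticity and migration cannot strip a workload of its protection.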

Gardner: I hear this so much these days that IT operations people are looking for more automation, and more control.

Kurt, I think it’s important to understand that when we talk about security within a virtualization layer, that doesn’t obviate the value of security that other technologies provide at the OS level or network level. So this isn’t either-or, this is an augmentation, isn’t that correct, when we talk about virtualization and security?

The virtual focus

Roemer: Yes, that’s correct. Virtualization provides some very unique assets that help extend security, but there are some other things that we want to be sure to focus on in terms of virtualization. One of them is Bitdefender Hypervisor Introspection (HVI). It’s the ability for the hypervisor to provide a set of direct inspect application programming interfaces (APIs) that allow for inspection of guest memory outside of the guest.

When you look at Windows or Linux guests that are running on a hypervisor, typically when you have tried to secure those it’s been through technology installed in the guest. So you have the guest that’s self-protecting, and they are relying on OS APIs to be able to effect security. Sometimes that works really well and sometimes the attackers get around OS privileges and are successful, even with security solutions in place.

One of the things that HVI does is it looks for the techniques that would be associated with an attack against the memory of the guest from outside the guest. It’s not relying on the OS APIs and can therefore catch attacks that otherwise would have slipped past the OS-based security functionality.
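As a conceptual illustration only (the real direct inspect APIs are hypervisor-level interfaces; every name below is invented), the out-of-guest idea can be mimicked by a monitor that hashes a protected region of guest memory from outside, never calling any API inside the guest:

```python
# Conceptual simulation: the monitor keeps its own view of guest memory
# and detects tampering without relying on anything running in the guest.

import hashlib

class Guest:
    """A toy 'guest' whose memory is just a mutable byte buffer."""
    def __init__(self, size=64):
        self.memory = bytearray(size)

class OutOfGuestMonitor:
    """Watches a protected region of guest memory from outside the guest."""
    def __init__(self, guest, start, length):
        self.guest, self.start, self.length = guest, start, length
        self.baseline = self._digest()  # snapshot of the clean state

    def _digest(self):
        region = bytes(self.guest.memory[self.start:self.start + self.length])
        return hashlib.sha256(region).hexdigest()

    def tampered(self):
        return self._digest() != self.baseline

guest = Guest()
monitor = OutOfGuestMonitor(guest, start=0, length=16)

assert not monitor.tampered()   # clean baseline
guest.memory[4] = 0x90          # attacker rewrites protected guest memory
assert monitor.tampered()       # caught without any in-guest agent
```

Because the check runs entirely outside the guest, malware that has subverted the guest OS has no hook through which to blind or disable it -- which is the core advantage Roemer describes.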

Gardner: Harish, maybe you can tell us about how Citrix and Bitdefender are working together?

Step into the breach, together

Agastya: The solution is Bitdefender HVI. It works tightly with Citrix’s XenServer hypervisor, and it has been available in a controlled release for the last several months. We have had some great customer traction on it. At Citrix Synergy this year we will be making that solution generally available.

We have been working together for the last four years to bring this groundbreaking technology to the market.

What is the problem we are trying to solve? It is the issue of advanced attacks that hit the data center when, as Kurt mentioned, advanced attackers are able to skirt past endpoint security defense mechanisms by having root access and operating at the same level of privilege as the endpoint security that may be running within the VM.

They can then essentially create a blind spot where the attackers can do anything they want while the endpoint security solution continues to run. 


These types of attacks stay in the environment and the customer suffers on average 200 days before a breach is discovered. The marketplace is filled with stories like this and it’s something that we have been working together with Citrix to address.

The fundamental solution leverages the power of the hypervisor to be able to monitor attacks that modify memory. It does that by looking for the common attack mechanisms that all these attackers use, whether it’s buffer overflows or it’s heap spraying, the list goes on.

They all result in memory modification that the endpoint security solution within the VM is blinded to. However, if you are leveraging the direct inspect APIs that Kurt talked about -- available as part of Citrix’s XenServer solution -- then we have the ability to look into that VM without having a footprint in there. It is a completely agentless solution that runs outside the guest, in the security virtual appliance. It monitors all of the VMs in the data center against these types of attacks. It allows you to take action immediately, reduces the time to detection, and blocks the attack.
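As a toy illustration of pattern-based detection (not how HVI actually works -- it leverages hypervisor-level memory monitoring), a heap spray tends to leave long runs of a repeated filler byte, such as 0x90 NOP sleds, which even a simple heuristic can flag:

```python
# Toy heuristic: flag a memory region dominated by long runs of a single
# repeated byte -- the telltale shape of a heap spray. A real introspection
# engine works at the hypervisor level and is far more sophisticated.

def looks_like_spray(region: bytes, run_len: int = 32, threshold: float = 0.5) -> bool:
    """Return True if repeated-byte runs of at least `run_len` bytes
    cover more than `threshold` of the region."""
    covered = i = 0
    while i < len(region):
        j = i
        while j < len(region) and region[j] == region[i]:
            j += 1                      # extend the current run
        if j - i >= run_len:
            covered += j - i            # count only suspiciously long runs
        i = j
    return len(region) > 0 and covered / len(region) > threshold

normal = bytes(range(256)) * 4               # varied, benign-looking data
sprayed = b"\x90" * 900 + bytes(range(124))  # NOP sled dominating the page

assert not looks_like_spray(normal)
assert looks_like_spray(sprayed)
```

The point of the sketch is only that spray-style attacks leave a structural fingerprint in memory, which is why monitoring memory modification from outside the VM is effective even when in-guest security is blinded.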

Gardner: Kurt, what are some of the major benefits for the end-user organization in deploying something like HVI? What is the payback in business terms?

Performance gains

Roemer: Hypervisor Introspection, which we introduced in XenServer 7.1, allows an organization to deploy virtualization with security technologies behind it at the hypervisor level. What that means for the business is that every guest you bring up has protection associated with it. Even if it’s a new version of Linux that you haven’t previously tested and you really don’t know which antivirus you would have integrated with it; or something that you are working on from an appliance perspective -- anything that can run on XenServer would be protected through these direct inspect APIs, and the Bitdefender HVI solution. That’s really exciting.

It also has performance benefits because you don’t have to run antivirus in every guest at the same level. By knowing what’s being protected at the hypervisor level, you can configure for a higher level of performance.

Now, of course, we always recommend having antivirus in guests, as you still have file-based access and so you need to look for malware; sometimes files get emailed in or out, or produced, so having access to the files from an anti-malware perspective is very valuable. But you may need to cut down some of the scanning functionality to meet much higher performance objectives.

So for the business, HVI gives you higher security, better performance, and the assurance that you are covered.

Gardner: Harish, it sounds like this ability to gain introspection into that hypervisor is wonderful for security and does it in such a way that it doesn’t degrade performance. But it seems to me that there are also other ancillary benefits in addition to security, when you have that ability to introspect and act quickly. Is there more than just a security benefit, that the value could go quite a bit further?
The benefits of introspection

Agastya: That’s true. The ability to introspect into memory has huge potential in the market. First of all, with this solution right now, we address the ability to detect advanced attacks, which is a very big problem in the industry -- where you have everything from nation-sponsored attacks to malicious attack components available on the deep, dark web to common citizens who can do bad things with them.

The capability to reduce that window to advanced attack detection is huge. But now, with the power of introspection, we also have the ability to inject into the VM, on the fly, additional tools that can do deep forensics and measure network operations, and the technology can expand to cover more. The future is bright for where we can take this between our companies.

Gardner: Kurt, anything to add on the potential for this memory introspection capability?

Specific, secure browsers

Roemer: There are a couple of things to add. One is taking a look at the technologies and rolling back through a lot of the exploits that we have seen, even throughout the last three months. There have been exploits against Microsoft Windows, exploits against Internet Explorer and Edge, exploits against hypervisors; there have been EternalBlue and the Server Message Block (SMB) exploits. You can go back and try these out against the solution, see exactly how it would catch them, and what would have happened to your system had those exploits actually taken effect.

If you have a team that is doing forensics and trying to go through and determine whether systems had previously been exploited, you are giving that team additional functionality to be able to look back and see exactly how the exploits would have worked. Then they can understand better how things would have happened within their environment. Because you are doing that outside of the guest, you have a lot of visibility and a lot of information you otherwise wouldn't have had.

One big expanded use-case here is to get the capability for HVI between Citrix and Bitdefender in the hands of your security teams, in the hands of your forensics teams, and in the hands of your auditors -- so that they can see exactly what this tool brings to the table.


Something else you want to look at is the use-case that allows users to expand what they are doing and makes their lives easier -- and that's secured browsing.

Today, when people go out and browse the Internet or hit a popular application like Facebook or Outlook Web Access -- or if you have an administrator who is hitting an administrative console for your Domain Name System (DNS) environment, your routers, your Cisco, Microsoft environments, et cetera, oftentimes they are doing that via a web browser.

The next line of defense—How new security leverages virtualization to counter sophisticated threats

The next line of defense—How new security leverages virtualization to counter sophisticated threats

When it comes to securing systems and data, the bad guys are constantly upping their games -- finding new ways to infiltrate businesses and users. Those who protect systems from these cascading threats must be ever vigilant for new technical advances in detection and protection. In fact, they must out-innovate their assailants.

The next BriefingsDirect security insights discussion examines the relationship between security and virtualization. We will now delve into how adaptive companies are finding ways to leverage their virtualization environments to become more resilient, more intelligent, and how they can protect themselves in new ways.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how to ensure that virtualized data centers do not pose risks -- but in fact prove more defensible -- we are joined by two security-focused executives, Kurt Roemer, Chief Security Strategist at Citrix, and Harish Agastya, Vice President for Enterprise Solutions at Bitdefender. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Kurt, virtualization has become widespread and dominant within data centers over the past decade. At that same time, security has risen to the very top of IT leadership’s concerns. What is it about the simultaneous rise of virtualization and the rise of security concerns? Is there any intersection? Is there any relationship that most people may miss?

Roemer: The rise of virtualization and security has been concurrent. A lot of original deployments for virtualization technologies were for remote access, but they were also for secure remote access. The apps that people needed to get access to remotely were usually very substantial applications for the organization -- things like order processing or partner systems; they might have been employee access to email or internal timecard systems. These were things that you didn’t really want an attacker messing with -- or arbitrary people getting access to.

Security has grown from just providing basic access to virtualization to really meeting a lot of the risks of these virtualized applications being exposed to the Internet in general, as well as now expanding out into the cloud. So, we have had to grow security capabilities to be able to not only keep up with the threat, but try to keep ahead of it as well.

Gardner: Hasn’t it historically been true that most security prevention technologies have been still focused at the operating system (OS)-level, not so much at the virtualization level? How has that changed over the past several years?

Roemer: That’s a good question. There have been a lot of technologies that are associated with virtualization, and as you go through and secure and harden your virtual environments, you really need to do it from the hardware level, through the hypervisor, through the operating system level, and up into the virtualization system and the applications themselves.

We are now seeing people take a much more rigorous approach at each of those layers, hardening the virtualization system and the OS and integrating in all the familiar security technologies that we’re used to, like antivirus, but also going through and providing for application-specific security.

So if you have an SAP system, or something else where you need to protect very sensitive company data and you don’t want that data accessed arbitrarily outside the office, you can provide very set interfaces into that system. You can control the clipboard, or copy and paste; decide what peripherals the application can interface with -- turning off the camera and the microphone if they aren’t needed; and even get down to the browser level, controlling whether things like JavaScript are enabled or Flash is available.
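The kind of per-application lockdown described here can be pictured as a deny-by-default capability policy. The sketch below is purely illustrative -- the policy fields and application names are hypothetical and do not correspond to actual Citrix policy settings:

```python
# Illustrative only: hypothetical policy fields, not real Citrix policy names.
# Each published application carries its own minimal set of allowed capabilities;
# anything not explicitly granted is off.

APP_POLICIES = {
    "sap-finance": {
        "clipboard": False,      # block copy/paste out of the session
        "camera": False,
        "microphone": False,
        "javascript": True,      # the app itself needs it
        "flash": False,
        "local_drives": False,
    },
    "admin-browser": {
        "clipboard": False,
        "camera": False,
        "microphone": False,
        "javascript": True,
        "flash": False,
        "local_drives": False,
    },
}

def is_allowed(app: str, capability: str) -> bool:
    """Deny by default: unknown apps and unlisted capabilities get nothing."""
    return APP_POLICIES.get(app, {}).get(capability, False)

print(is_allowed("sap-finance", "clipboard"))   # False
print(is_allowed("sap-finance", "javascript"))  # True
print(is_allowed("unknown-app", "camera"))      # False
```

The point of the deny-by-default lookup is that a newly published application exposes nothing until someone deliberately grants it a capability.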

So it helps to harden the overall environment and cut down on a lot of the vulnerabilities that would be inherent by just leaving things completely wide open. One of the benefits of virtualization is that you can get security to be very specific to the application.

Gardner: Harish, now that we are seeing this need for comprehensive security, what else is it that people perhaps don’t understand that they can do in the virtualization layer? Why is virtualization still uncharted territory as we seek to get even better security across the board?

Let’s get better than physical

Agastya: Customers often don’t realize the difference between dealing with security in physical versus virtual environments. The opportunity that virtual environments provide is the ability to take security to a higher level than physical-only. Better than physical is, I think, a key value proposition that they can benefit from -- and the technology innovation of today has enabled that.

There is a wave of innovation among security vendors in this space. How do we run resource-intensive security workloads in a way that does not compromise the service-level agreements (SLAs) that those information technology operations (IT Ops) administrators need to deliver?

There is a lot of work happening to offload security-scanning mechanisms onto dedicated security virtual appliances, for example. Bitdefender has been working with partners like Citrix to enable that.
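The offloading idea can be illustrated with a toy model. The class and behavior below are invented for illustration, not Bitdefender's actual appliance: many VMs booted from the same golden image share one scan verdict instead of each burning CPU on the same file:

```python
import hashlib

# Conceptual sketch (invented names): VMs submit a content hash to one shared
# appliance, which scans once and caches the verdict, cutting the duplicated
# CPU cost that hurts consolidation SLAs.

class ScanAppliance:
    def __init__(self):
        self.cache = {}   # content hash -> verdict
        self.scans = 0    # how many full scans actually ran

    def _full_scan(self, data: bytes) -> str:
        # Stand-in for an expensive signature/heuristic scan.
        self.scans += 1
        return "malicious" if b"EVIL" in data else "clean"

    def verdict(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        if key not in self.cache:
            self.cache[key] = self._full_scan(data)
        return self.cache[key]

appliance = ScanAppliance()
golden_image_file = b"common OS library bytes"

# Ten VMs booted from the same golden image ask about the same file:
verdicts = [appliance.verdict(golden_image_file) for _ in range(10)]
print(verdicts.count("clean"), appliance.scans)  # 10 1 -> scanned once, answered ten times
```

The design choice being illustrated: deduplicating scans across identical guests is exactly the kind of win that a dedicated appliance gets and per-guest agents cannot.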

Now, the huge opportunity is to take that story further in terms of being able to provide higher levels of visibility, detection, and prevention from the attacks of today, which are advanced persistent threats. We seek to detect how they manifest in the data center and -- in a virtual environment -- what you have the opportunity to do, and how you can respond. That game is really changing now.

Gardner: Kurt, is there something about the ability to spin up virtualized environments, and then take them down that provides a risk that the bad guys can target or does that also provide an opportunity to start fresh: To eliminate vulnerabilities, or learn quickly and adapt quickly? Is there something about the rapid change that virtualization enables that is a security plus?

Persistent protection anywhere

Roemer: You really hit on the two sides of the coin. On one side, virtualization does oftentimes produce an image of the application, or the application plus OS, that could be fairly easy for a hacker to steal, spin up offline, and mine for secrets. So you want to protect your images, to make sure that they are not something that can be easily stolen.

On the other side, having the ability to define persistence -- what you want to persist between reboots versus what’s non-persistent -- allows you to have a constantly refreshed system. When you reboot it, it’s exactly back to the golden image, and everything is as it should be. As you patch and update, you are working with a known quantity, as opposed to an endpoint where somebody might have administrative access and has installed personal applications, browser plug-ins, and other things that you may or may not want to have in place.

Layering also comes into play and helps to make sure that you can dynamically layer in applications or components of the OS, depending on what’s needed. So if somebody is accessing a certain set of functionality in the office, maybe they have 100% functionality. But when they go home, because they are no longer in a trusted environment or maybe not working on a trusted PC from their home system, they get a degraded experience, seeing fewer applications and having less functionality layered onto the OS. Maybe they can’t save to local drives or print to local printers. All of that’s defined by policy. The nice thing with virtualization is that it’s independent of the OS, the applications, the endpoints, and the varied situations that we all access our apps and data from.

Gardner: Harish, with virtualization there is a certain level of granularity in how one can manage security parameters. Can you expand on why having that granular capability is such a strong suit, and why virtualization is a great place to make that happen?

On the move, virtually

Agastya: That is one of the opportunities and challenges that security solutions need to be able to cope with.

As workloads move across different subgroups and sub-networks, the virtual machine (VM) needs a security policy that moves with it. The policy depends on what type of application is running; it is not specific to the region or sub-network that the VM happens to be resident on. That is something security solutions designed to operate in virtual environments have the ability to do.

Security moves with the workload, as the workload is spawned off and new VMs are created. The same set of security policies associated with that workload now can protect that workload without needing to have a human step in and determine what security posture needs to belong to that VM. 
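The "policy follows the workload" idea can be sketched as policy keyed to a workload tag rather than to a network location. The names and fields below are hypothetical, not any vendor's schema:

```python
# Conceptual sketch: the security policy is attached to the workload's tag,
# not to the network segment it currently runs on, so a VM keeps its policy
# when it migrates or is cloned -- with no human stepping in to classify it.

POLICY_BY_TAG = {
    "web-frontend": {"memory_introspection": True, "file_scan": "on-access"},
    "db-backend":   {"memory_introspection": True, "file_scan": "scheduled"},
}

class VM:
    def __init__(self, name: str, tag: str, network: str):
        self.name, self.tag, self.network = name, tag, network

    def policy(self) -> dict:
        # Lookup is by tag; self.network never enters the decision.
        return POLICY_BY_TAG[self.tag]

    def migrate(self, network: str) -> "VM":
        self.network = network
        return self

    def clone(self, name: str) -> "VM":
        # A spawned VM inherits the tag, hence the same policy, automatically.
        return VM(name, self.tag, self.network)

vm = VM("web-01", "web-frontend", "subnet-a")
moved = vm.migrate("subnet-b")
child = moved.clone("web-02")
print(moved.policy() == child.policy())  # True: policy traveled with the workload
```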


That is the opportunity that virtualization provides. But it’s also a challenge, because previous generations of security solutions predated all of this mobility. We now need to address that.

We love the fact that virtualization is happening, and that it has become a very elastic, software-defined mechanism that moves around and gives IT operations people so much more control. It also gives security an opportunity to sit very well in that environment, tightly integrated with the virtualization layer.

Gardner: I hear this so much these days that IT operations people are looking for more automation, and more control.

Kurt, I think it’s important to understand that when we talk about security within a virtualization layer, that doesn’t obviate the value of security that other technologies provide at the OS level or network level. So this isn’t either-or, this is an augmentation, isn’t that correct, when we talk about virtualization and security?

The virtual focus

Roemer: Yes, that’s correct. Virtualization provides some very unique assets that help extend security, but there are some other things that we want to be sure to focus on in terms of virtualization. One of them is Bitdfender Hypervisor Introspection (HVI). It’s the ability for the hypervisor to provide a set of direct inspect application programming interfaces (APIs) that allow for inspection of guest memory outside of the guest.

When you look at Windows or Linux guests running on a hypervisor, securing them has typically meant installing technology in the guest. So you have a guest that is self-protecting, relying on OS APIs to effect security. Sometimes that works really well, and sometimes attackers get around OS privileges and are successful, even with security solutions in place.

One of the things that HVI does is it looks for the techniques that would be associated with an attack against the memory of the guest from outside the guest. It’s not relying on the OS APIs and can therefore catch attacks that otherwise would have slipped past the OS-based security functionality.
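Conceptually, out-of-guest introspection can be modeled as a monitor that holds trusted fingerprints of guest memory regions and flags any deviation. This is a simplified illustration of the idea, not the direct inspect API itself:

```python
import hashlib

# Conceptual model of out-of-guest introspection: the monitor holds hashes of
# memory regions that should stay constant -- e.g. kernel code pages -- and
# flags any change. Because the check runs outside the guest, in-guest malware
# cannot tamper with it or with the OS APIs it would otherwise rely on.

def fingerprint(page: bytes) -> str:
    return hashlib.sha256(page).hexdigest()

class IntrospectionMonitor:
    def __init__(self):
        self.baseline = {}  # region name -> trusted hash

    def protect(self, region: str, contents: bytes) -> None:
        self.baseline[region] = fingerprint(contents)

    def check(self, region: str, contents: bytes) -> bool:
        """Return True if the region still matches its trusted baseline."""
        return self.baseline[region] == fingerprint(contents)

mon = IntrospectionMonitor()
kernel_page = b"\x55\x48\x89\xe5" * 1024          # stand-in for kernel code
mon.protect("guest1:kernel_text", kernel_page)

print(mon.check("guest1:kernel_text", kernel_page))  # True: untouched
tampered = b"\x90" * len(kernel_page)                # overwritten from inside the guest
print(mon.check("guest1:kernel_text", tampered))     # False -> raise an alert
```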

Gardner: Harish, maybe you can tell us about how Citrix and Bitdefender are working together?

Step into the breach, together

Agastya: The solution is Bitdefender HVI. It works tightly with Citrix’s XenServer hypervisor, and it has been available in a controlled release for the last several months. We have had some great customer traction on it. At Citrix Synergy this year we will be making that solution generally available.

We have been working together for the last four years to bring this groundbreaking technology to the market.

What is the problem we are trying to solve? It is the issue of advanced attacks that hit the data center when, as Kurt mentioned, advanced attackers are able to skirt past endpoint security defense mechanisms by having root access and operating at the same level of privilege as the endpoint security that may be running within the VM.

They can then essentially create a blind spot where the attackers can do anything they want while the endpoint security solution continues to run. 


These types of attacks stay in the environment for, on average, 200 days before a breach is discovered. The marketplace is filled with stories like this, and it’s something that we have been working with Citrix to address.

The fundamental solution leverages the power of the hypervisor to monitor attacks that modify memory. It does that by looking for the common attack mechanisms that all these attackers use, whether it’s buffer overflows or heap spraying -- the list goes on.

They all result in memory modification that the endpoint security solution within the VM is blind to. However, if you are leveraging the direct inspect APIs that Kurt talked about -- available as part of Citrix’s XenServer solution -- then we have the ability to look into that VM without having a footprint in there. It is a completely agentless solution that runs outside the guests, in the security virtual appliance. It monitors all of the VMs in the data center against these types of attacks, allows you to take action immediately, reduces the time to detection, and blocks the attack.
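One of the patterns mentioned, heap spraying, lends itself to a simple illustration: sprayed memory is abnormally repetitive. The heuristic and thresholds below are invented for illustration and are far cruder than a real detector:

```python
import os

# Toy heuristic for one memory-attack pattern mentioned above (heap spraying):
# an attacker fills memory with many copies of the same sled+shellcode block.
# A monitor with a raw view of guest memory can flag a region whose content
# is suspiciously repetitive. Chunk size and thresholds are invented.

def spray_score(memory: bytes, chunk: int = 64) -> float:
    """Fraction of fixed-size chunks that duplicate an earlier chunk."""
    seen, dupes, total = set(), 0, 0
    for i in range(0, len(memory) - chunk + 1, chunk):
        block = memory[i:i + chunk]
        total += 1
        if block in seen:
            dupes += 1
        seen.add(block)
    return dupes / total if total else 0.0

normal = os.urandom(4096)                    # varied, "organic" heap contents
sprayed = (b"\x90" * 60 + b"\xcc" * 4) * 64  # same 64-byte block repeated

print(spray_score(normal) < 0.5)    # True
print(spray_score(sprayed) > 0.9)   # True -> looks like a spray
```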

Gardner: Kurt, what are some of the major benefits for the end-user organization in deploying something like HVI? What is the payback in business terms?

Performance gains

Roemer: Hypervisor Introspection, which we introduced in XenServer 7.1, allows an organization to deploy virtualization with security technologies behind it at the hypervisor level. What that means for the business is that every guest you bring up has protection associated with it. Even if it’s a new version of Linux that you haven’t previously tested and you really don’t know which antivirus you would have integrated with it; or something that you are working on from an appliance perspective -- anything that can run on XenServer would be protected through these direct inspect APIs, and the Bitdefender HVI solution. That’s really exciting.

It also has performance benefits because you don’t have to run antivirus in every guest at the same level. By knowing what’s being protected at the hypervisor level, you can configure for a higher level of performance.

Now, of course, we always recommend having antivirus in guests: you still have file-based access and need to look for malware, and files get emailed in or out or produced, so having access to the files from an anti-malware perspective is very valuable. But you may be able to cut down some of the scanning functionality and meet much higher performance objectives.

So for the business, HVI gives you higher security, better performance, and the assurance that you are covered.

Gardner: Harish, it sounds like this ability to gain introspection into that hypervisor is wonderful for security and does it in such a way that it doesn’t degrade performance. But it seems to me that there are also other ancillary benefits in addition to security, when you have that ability to introspect and act quickly. Is there more than just a security benefit, that the value could go quite a bit further?
The benefits of introspection

Agastya: That’s true. The ability to introspect into memory has huge potential in the market. First of all, with this solution right now, we address the ability to detect advanced attacks, which is a very big problem in the industry -- where you have everything from nation-state-sponsored attacks to malicious attack components, available on the deep, dark web to common citizens who can do bad things with them.

The capability to reduce that window to advanced-attack detection is huge. But now, with the power of introspection, we also have the ability to inject additional tools into the VM on the fly -- tools that can do deep forensics and measure network operations -- and the technology can expand to cover more. The future is bright for where we can take this between our companies.

Gardner: Kurt, anything to add on the potential for this memory introspection capability?

Specific, secure browsers

Roemer: There are a couple of things to add. One is taking a look at the technologies and rolling back through a lot of the exploits we have seen, even throughout the last three months: exploits against Microsoft Windows, against Internet Explorer and Edge, against hypervisors; there have been EternalBlue and the Server Message Block (SMB) exploits. You can go back and try these out against the solution, and see exactly how it would catch them -- and what would have happened to your system had those exploits actually taken effect.

If you have a team that is doing forensics and trying to go through and determine whether systems had previously been exploited, you are giving that team additional functionality to be able to look back and see exactly how the exploits would have worked. Then they can understand better how things would have happened within their environment. Because you are doing that outside of the guest, you have a lot of visibility and a lot of information you otherwise wouldn't have had.

One big expanded use-case here is to get the capability for HVI between Citrix and Bitdefender in the hands of your security teams, in the hands of your forensics teams, and in the hands of your auditors -- so that they can see exactly what this tool brings to the table.


Something else you want to look at is the use-case that allows users to expand what they are doing and makes their lives easier -- and that's secured browsing.

Today, when people go out and browse the Internet or hit a popular application like Facebook or Outlook Web Access -- or if you have an administrator who is hitting an administrative console for your Domain Name System (DNS) environment, your routers, your Cisco, Microsoft environments, et cetera, oftentimes they are doing that via a web browser.

Well, if that's the same web browser that they use to do everything else on their PC, it's over-configured, it presents excessive risk, and you now have the opportunity with this solution to publish browsers that are very specific to each use.

For example, you publish one browser specifically for administrative access, and you know that you have advanced malware detection. Even if somebody is trying to target your administrators, you are able to thwart their ability to get in and take over the environments that the administrators are accessing.

As more things move to the browser -- and more very sensitive and critical applications move to the cloud -- it's extremely important to set up secured browsing. We strongly recommend doing this with XenServer and HVI along with Bitdefender providing security.

Agastya: The problem in the market with respect to the human who is sitting in front of the browser being the weakest link in the chain is a very important one. Many, many different technology approaches have been taken to address this problem -- and most of them have struggled to make it work.

The value of XenApp coming in with its secured browser model is this: You can stream your browser and you are just presenting, rendering an interface on the client device, but the browser is actually running in the backend, in the data center, running on XenServer, protected by Bitdefender HVI. This model not only allows you to shift the threat away from the client device, but also kill it completely, because that exploit which previously would have run on the client device is not on the client device anymore. It’s not even on the server anymore because HVI has gotten to it and stopped it.

Roemer: I bring up the browser benefit as an example because when you think of the lonely browser today, it is the interface to some of your most critical applications. A browser, at the same time, is also connected to your file system, your network, your Windows registry, your certificate chain and keys -- it’s basically connected to everything you do and everything you have access to in most OSes.

What we are talking about here is publishing a browser that is very specific to purpose and configured for an individual application. Just put an icon out there, users click on it and everything works for them silently in the background. By being able to redirect hyperlinks over to the new joint XenServer-Bitdefender solution, you are not only protecting against known applications and things that you would utilize -- but you can also redirect arbitrary links.

Even if you tell people, “Don’t click on any links,” you know every once in a while it’s going to happen. When that one person clicks on the link and takes down the entire network, it’s awful. Ransomware attacks happen like that all the time. With this solution, that arbitrary link would be redirected over to a one-time-use browser. Bitdefender would come up and say, “Hey, yup, there’s definitely a problem here, we are going to shut this down,” and the attack never would have had a chance to get anywhere.
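The redirection flow can be sketched as a small routing shim. The gateway URL, trusted-host list, and function below are all hypothetical, not part of the actual Citrix solution:

```python
from urllib.parse import quote, urlparse

# Sketch of the link-redirection idea (host names and gateway are invented):
# instead of opening an arbitrary link locally, a shim rewrites it to launch
# inside a disposable, server-hosted browser session, so any exploit
# detonates there -- not on the endpoint.

TRUSTED_HOSTS = {"intranet.example.com", "mail.example.com"}
GATEWAY = "https://secure-browse.example.com/session/new?url="

def route_link(url: str) -> str:
    host = urlparse(url).hostname or ""
    if host in TRUSTED_HOSTS:
        return url  # known-good destinations open normally
    # Everything else goes to a one-time-use remote browser session.
    return GATEWAY + quote(url, safe="")

print(route_link("https://mail.example.com/inbox"))
print(route_link("http://evil.example.net/payload"))
# The second link is rewritten to:
# https://secure-browse.example.com/session/new?url=http%3A%2F%2Fevil.example.net%2Fpayload
```

The design point: the endpoint never fetches the untrusted URL at all; only the disposable backend session does.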

The organization is notified and can take additional remediation actions. It’s a great opportunity to really change how people are working, and to take the arbitrary-link problem and the ransomware problem and neutralize them.

Gardner: It sounds revolutionary rather than evolutionary when it comes to security. It’s quite impressive. I have learned a lot in just the last week or two looking into this. Harish, you mentioned earlier that before the general availability announced for May, you had Bitdefender HVI on XenServer in beta. Do you have any results from that? Can you offer any metrics of what’s happened in the real world when people deploy this? Are the results as revolutionary as it sounds?

Real-world rollout

Agastya: The product was first in beta and then released in controlled availability mode, so the product is actually in production deployment at several companies in both North America and Europe. We have a few financial services companies, and we have some hospitals. We have put the product to use in production deployments for virtual desktop infrastructure (VDI) deployments where the customers are running XenApp and XenDesktop on top of XenServer with Bitdefender HVI.

We have server workloads running straight on XenServer, too. These are typically application workloads that the financial services companies or the hospitals need to run. We have had some great feedback from them. Some of them have become references as well, and we will be talking more about it at Citrix Synergy 2017, so stay tuned. We are very excited about the fact that the product is able to provide value in the real world.

Roemer: We have a very detailed white paper on how to set up the secured browsing solution, the joint solution between Citrix and Bitdefender. Even if you are running other hypervisors in your environment, I would recommend that you set up this solution and try redirecting some arbitrary hyperlinks over to it, to see what value you are going to get in your organization. It’s really straightforward to set up and provides a considerable amount of additional security visibility.
Bitdefender also has some really amazing videos that show exactly how the solution can block some of the more popular exploits from this year. They are really impressive to watch.

Gardner: Kurt, we are about out of time, but I was curious, what’s the low-hanging fruit? Harish mentioned government, VDI, healthcare. Is it the usual suspects with compliance issues hanging over their heads, or are there other organizations that would be ripe to enjoy the benefits?

Roemer: I would say compliance environments and anybody with regulatory requirements would very much be the low-hanging fruit for this, but so would anybody who has sensitive applications or very sensitive use cases. Oftentimes we hear outsourcing cited as one of the more sensitive use cases, because you have external third parties getting in and either developing code for you, administering part of the operating environment, or something else.

We have also seen a pretty big uptick in terms of people being interested in this for administering the cloud. As you move up to cloud environments and you are defining new operating environments in the cloud while putting new applications up in the cloud, you need to make sure that your administrative model is protected.

Oftentimes, you use a browser directly to provide all of the security interfaces for the cloud, and by publishing that browser and putting this solution in front of it, you can make sure that malware is not interrupting your ability to securely administer the cloud environment.

Gardner: Last question to you, Harish. What should organizations do to get ready for this? I hope we have enticed them to learn more about it. For those organizations that actually might want to deploy, what do they need to think about in order to be in the best position to do that?

A new way of life

Agastya: Organizations need to think about secure virtualization as a way of life within organizational behavior. As a result, I think we will start to see more people with titles like Security DevOps (SecDevOps).

As far as specifically using HVI, organizations should be worried about how advanced attacks could enter their data center and potentially result in a very, very dangerous breach and the loss of confidential intellectual property.

If you are worried about that, you are worried about ransomware because an end-user sitting in front of a client browser is potentially putting out your address. You will want to think about a technology like HVI. The first step for that is to talk to us and there is a lot of information on the Bitdefender website as well as on Citrix’s website.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Bitdefender.
IoT Testing Challenges and Legacy

IoT Testing Challenges and Legacy

We live in a world where your car will soon be a far better driver than you, and your fridge will be more responsible as well, even ordering groceries when supplies start to run low. This is made possible by the evolution of embedded systems into the Internet of Things (IoT). The quality and performance of such systems still require improvements to existing testing methods to ensure quality while taking speed and scalability into consideration.

Embedded systems have been around for decades now, while the IoT is still in its infancy. The difference between the two is that embedded systems are self-contained and isolated, while IoT gadgets are in permanent communication with the server and, soon, with each other. Such a degree of integration and interoperability poses significant problems regarding testing and robustness due to differences in protocols and security.

Old Tools, New Challenges

Traditional testing methods take time, which is no longer a viable option. The entire development process needs to be agile, detecting problems as early as possible. The new mantra is "test early, test often" and the result is continuous integration, which is even more important in an IoT environment. The sensors, communication channels, protocols, software-hardware interactions, and configurations determine an ...
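As a sketch of what "test early, test often" looks like in practice, the hypothetical snippet below validates a made-up `temp:<celsius>` sensor payload and ships with a unit test that a continuous-integration pipeline could run on every commit. The payload format, function names, and range limits are illustrative assumptions, not taken from any particular IoT stack.

```python
import unittest

def parse_reading(payload: str) -> float:
    """Parse a 'temp:<celsius>' payload, rejecting malformed or
    out-of-range values before they reach the rest of the pipeline."""
    kind, _, value = payload.partition(":")
    if kind != "temp":
        raise ValueError(f"unknown sensor type: {kind!r}")
    celsius = float(value)
    if not -40.0 <= celsius <= 125.0:  # assumed sensor operating range
        raise ValueError(f"reading out of range: {celsius}")
    return celsius

class ParseReadingTest(unittest.TestCase):
    def test_valid_reading(self):
        self.assertEqual(parse_reading("temp:21.5"), 21.5)

    def test_out_of_range_reading_is_rejected(self):
        with self.assertRaises(ValueError):
            parse_reading("temp:900")

# In CI this would run on every commit, e.g.: python -m unittest
```

Catching a bad payload at this stage, rather than after deployment to thousands of devices, is exactly the kind of early detection the agile approach calls for.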


Read More on Datafloq
How Big Data Transformed the Marketing World

How Big Data Transformed the Marketing World

We are now in the digital era, with thousands of gadgets at our fingertips, including tablets, smartphones, digital television, social networks, and e-commerce. Before digital marketing, marketers’ mission was hard and challenging: they were required to create captivating, catchy advertising for TV, print media, and billboards without any solid knowledge of consumer behavior. With the evolution of digital marketing, the marketers’ mission has become far easier and more efficient.

Marketers now have a different job description: they are not only coming up with innovative ideas but also collecting data, deriving trends, crunching statistics and numbers, and finding the right channel to reach their target audience. Though this might sound like a lot of work, the digital age has brought a wealth of data that marketers can use to create more efficient strategic marketing plans via more accurate insights.

Why should marketers consider using big data?

Today, many organizations, including Amazon, Zoopla, Propertyfinder, Rocket Internet and many others, are collecting data from different sources to improve and optimize their marketing campaigns. By taking the guesswork out of the equation, such campaigns have proven to be more ...


Read More on Datafloq
How the Blockchain Will Change Social Networks

How the Blockchain Will Change Social Networks

The blockchain is the new technology that people just can’t seem to stop talking about. It’s slated to change everything, from currency to digital property to even social media and the attention economy. Some have even taken to calling it the second significant overlay on top of the Internet, aka “the trust layer”.

William Mougayar’s The Business Blockchain: Premise, Practice, and Application of the Next Internet Technology describes it as such:

“The blockchain cannot be described just as a revolution. It is a tsunami-like phenomenon, slowly advancing and gradually enveloping everything along its way by the force of its progression,” he writes. “Plainly, it is the second significant overlay on top of the Internet, just as the Web was that first layer back in 1990. That new layer is mostly about trust, so we could call it the trust layer.”

Mougayar’s confidence is shared by more than a couple of other enthusiasts, such as Imogen Heap and others in the music industry, certain govtech CIOs, and even insurance tech startups. The real indicator that blockchain will touch everyone’s lives, however, is that it’s predicted to change the way we interact over social media.

A Quick Rundown on Blockchain Technology

For those unfamiliar with blockchain technology, ...


Read More on Datafloq
How Big Data Could Improve the Business Lending Industry

How Big Data Could Improve the Business Lending Industry

By now, most banks and financial institutions are already investing in big data. They’re pulling data from their customers and financial markets in new ways, and they’re hiring more data analysts and engineers to make the best use of that data. In fact, by the end of the decade, we could see the global existence of 45 zettabytes—that’s 45 trillion gigabytes—and an enormous shortfall of data analysis talent to keep up with demand.

So what exactly are banks using all this data for? In addition to making better investments and improving infrastructure and technology, many banks are putting this data to use in the lending sector—which could herald the changes necessary to spark a revolution in business lending.

The Business Lending Problem

Small business loans are an option available to aspiring entrepreneurs who have strong ideas for new businesses, but not enough capital to make those ideas a reality. Unfortunately, banks are notoriously strict about who they give loans to and how. From a logistical perspective, this is a necessity; if banks lent money to everyone, they’d never make a profit, and they’d cease to exist.

But current circumstances make it hard for entrepreneurs with little experience, or those with poor credit, to get ...


Read More on Datafloq
How Big Data Has Shifted Corporate Decision-Making

How Big Data Has Shifted Corporate Decision-Making

Why did businesses succeed in the past? A few factors: timing, intuition, know-how, and a little bit of luck. While many of these factors still influence corporate success and decision-making today, there are more tools available than ever to help predict success or failure. One of these tools is big data—which offers solid insights that are often as valuable as even the most experienced CEO’s intuition. Today, big data has shifted corporate decision-making in many companies, particularly startups. Does intuition still have a place in our data-driven world? Let’s take a look.

Mining the Past for Future Growth

While intuition is based on past experience (the past experience of a business professional), big data represents a concrete picture of the past. Mining past data of an organization (or the organization’s competition if the company is brand new) is key to predicting the future of new initiatives. While trends can change, it tends to be more reliable to make decisions based on the past to help spur future growth.

Changes in Business Strategy

Big data has become a popular tool in business strategy partially because it’s easy to see and communicate the reasons behind decision-making. When decision-making is based on intuition ...


Read More on Datafloq
Big Data’s Powerful Role in Gas Pipe Safety

Big Data’s Powerful Role in Gas Pipe Safety

Infrastructure is a major area of opportunity for big data, and it’s being explored more each year as opportunities to improve efficiency and safety present themselves. Smart grid systems are becoming more common as aging systems are upgraded, resulting in lower energy usage and savings. Gas pipelines, another important piece of our global energy infrastructure, are also being improved with big data—but unfortunately not all of them.

The “Pigs” That Clean Pipelines

Data collection in infrastructure relies on sensors, which can provide valuable information in many different contexts. Smart grid sensors detect what times of day demand more power, while sensors installed on “pigs” that clean pipelines offer insights that can improve safety and maintenance. The catch? These pigs, which look like cylindrical robots, can’t be used on all pipes worldwide. In fact, they can only be used on about a third of the world’s gas lines. That’s a shame, because pigs are a great way to use predictive analytics to ensure proper maintenance and cut costs for manufacturers.

Because keeping pipes in tip-top shape is an important safety issue, pipes that can’t be cleaned and monitored using pigs must be maintained using the old-fashioned methods of conservative prediction based on experience. ...


Read More on Datafloq
How Machine Learning Will Drive Autonomous Vehicles

How Machine Learning Will Drive Autonomous Vehicles

It's Saturday noon and you set out for a long drive with your friends and family. You get into your car, turn on the music, tune into your favorite song, and give the navigation system your preferred destination while it selects the best route. And that's not all: you get all the information you need through the cloud, right on your car's dashboard. Drivers around the world are used to the increasing amount of digitalization in their cars. However, this is not the end. The future is far more exciting than what you might have imagined.

Tomorrow's car will represent a step change in form and function from what is offered now. Original equipment manufacturers (OEMs) already have plans in the pipeline to build smart, connected, autonomous vehicles (AVs), or self-driving cars. According to IHS Markit, sales of AVs are expected to reach 600,000 units by 2025, and 21 million units by 2035. Geographically, Asia-Pacific (APAC) would hold the major share of 39%, followed by the Americas at 34.7%, in 2030. Growth in the APAC region would be driven by the technological competency of China, Japan, and South Korea.

AVs today are in a public testing phase with companies such as Ford, ...


Read More on Datafloq
The Latest Trends in Big Data Analytics to Watch Out For

The Latest Trends in Big Data Analytics to Watch Out For

Big data has become one of the hottest topics of discussion over the past few years. It plays a pivotal role in various aspects of business across industries and is a favorite subject of academics too. The ability to mine massive volumes of data from a myriad of sources to analyze and gain insight has radically altered the dynamics of business functions, marketing, and sales.

Not just the major corporations, but even start-ups can taste success by effectively uncovering the insights derived from data. Digging into, analyzing, managing, and manipulating big data is fairly easy now, and most companies can do it at minimal cost.

At the start of 2017, we saw that businesses had far outgrown the basic concept of simply converting data to insights; they can now use data to derive actionable and directive principles. The focus is now more on mining data effectively in light of actual organizational goals and by precisely targeting specific products and services.

This insight-driven approach will strengthen further by the third quarter of 2017, facilitating enhanced customer experience, market competitiveness, an advanced level of security, and increased operational efficiency.

For example, a ...


Read More on Datafloq
Data in Marketing – The Key to Future Strategy

Data in Marketing – The Key to Future Strategy

What do you think when you read the first line of an unsolicited email that begins with the words "Dear Buyer" or "Dear Manager"? If you have not heard from the company before, or you are not the manager of anything, it is even more peculiar. Whatever reputation that company was trying to develop, they've just lost it.

Many will delete the email and conclude that this is simply poor marketing - which is true. But what I think is that the company that sent me this email has not cleaned up its database lately. The reason behind these common marketing mistakes is simple: a deficit has developed between the marketing team and data expertise.

Every year, the quality of our lead generation data is improving. Accuracy, strategic planning, and conversion rates are all becoming more and more dependent on marketing where big data is having its impact. And yet, many organisations are not where they need to be regarding data management and application. This is most evident within the digital marketing sector.

Data and Marketing

A 2017 Data-driven Marketing Report by Jaywing bears this out. In their questioning of the marketing industry, 92% said that data management was a top priority for their business. However, 40% stated that ...


Read More on Datafloq
The Amazing Disruption in Business Intelligence

The Amazing Disruption in Business Intelligence

The world of business intelligence used to be rather basic and only included things like basic surveys and employee evaluations. However, the space has gotten a major makeover in recent years with the growth of new disruptive technologies and strategies. That disruption has had huge effects in a number of areas and is changing how companies across all industries do business.

Cloud-Based Analytics

At its core, business intelligence is all about using data to make strategic decisions. The amount of data companies could store and use was traditionally limited by human capabilities and storage functions. However, the growth of cloud services has allowed companies to tap into vast amounts of data that is more complex than ever before. Use of the cloud also allows businesses to share that data seamlessly between various departments and remote employees.

Instead of the old model of dividing up different aspects of business intelligence to various services and vendors, companies can now consolidate everything in the cloud to make it easy to store, analyze, and access whenever needed. This means the departments can share data and analytics to form a cohesive company goal and keep everyone on the same page.

Big Data-Driven Marketing

The use of the cloud and more ...


Read More on Datafloq
A Beginner’s Look at Big Data and its Benefits for Your Business

A Beginner’s Look at Big Data and its Benefits for Your Business

Not many people are familiar with the term "big data," even though big data affects everyone; what's surprising is that there is little education or training on its importance. In layman's terms, the phrase refers to large and complex sets of data that can be computationally analysed to reveal trends and patterns. Many people are not aware of the concept of big data, how it is used, or even what it is. Just to give a few examples, big data is collected on your supermarket loyalty card as you shop, on your social media account, and on your smartphone.

Uses of Big Data

Data collected from your supermarket loyalty card can be used by these stores to identify trends and to create personalized offers and deals providing benefits to both the organization and its customers. Data collected online from people's social media accounts can be anonymised and used to improve users' experience online, among many other good things. Data collected from satellites and smartphones can be used to determine the most popular routes in a city/town and help improve public transport. Local authorities can use big data to determine where social care is needed the ...
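As an illustration of one common technique behind such uses, the sketch below pseudonymises a user identifier with a keyed hash, so records can still be linked to each other for analysis but not traced back to a person without the key. The salt value and function name are hypothetical, and note that true anonymisation generally requires more than hashing identifiers:

```python
import hashlib
import hmac

# Hypothetical secret; in practice this would be stored and rotated securely
SECRET_SALT = b"rotate-me-regularly"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash: records remain
    linkable to each other, but not to the person without the salt."""
    digest = hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same input always maps to the same token, so travel patterns or
# shopping trends can still be analysed across a person's records
record = {"user": pseudonymise("alice@example.com"), "route": "bus 42"}
```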


Read More on Datafloq
DomoPalooza 2017: Flare, Stravaganza…and Effective Business Management

DomoPalooza 2017: Flare, Stravaganza…and Effective Business Management

Logo courtesy of DOMO, Inc.
When you decide to show up at Domopalooza, Domo’s big user event, you don’t know for sure what you will find, but from the very beginning you can feel that you’ll have a unique experience. From the individual sessions and training, the partner summit and the concert line-up, to what might come from Domo’s CEO/rock-star Josh James, who certainly is one of a kind in the software industry; you know that you’ll witness a delightful event.

This year, under the strings of Styx, Mr. James kicked off an event that amalgamated business, entertainment, fun and work in a unique way —a very Domo way.

With no more preambles, here is a summary of what happened during Domo’s 2017 DomoPalooza user conference.

Josh James at DomoPalooza 2017 (Photo courtesy of DOMO)
Key Announcements

Before entering the subjective domain of my opinion about Domo’s event and solutions, let’s take a minute to pinpoint some of the important announcements made prior to and during the event:
  • The first news came some days before the user event, when Domo announced its new model for rapid-deployment dashboards. This solution consists of a series of tools that accelerate and ease the dashboard deployment process. From its large number of connectors to diverse data sources, to a set of pre-installed, easy-to-configure dashboards, this model will enable developers to quickly deploy dashboards that decision makers can use effectively.
  • The next important announcement occurred during the conference. Domo came out with the release of Mr. Roboto —DOMO’s new set of capabilities for machine learning, predictive analytics and predictive intelligence. According to DOMO, the new offering will be fully integrated within DOMO’s business cloud, aiming for fast and non-disruptive business adoption. Two major features of Mr. Roboto are the Alerts Center, a personalized visual console powered by advanced analytics functionality to provide insights and improve decision making, and a data science interface that enables users to apply predictive analytics, machine learning and other advanced analytics algorithms to their data sets. This is for sure one product I’m looking forward to analyzing further!

The introduction of new features, especially those aimed at narrowing the technical-business gap within the C-suite of an organization and at giving decision makers easier, customized access to insights, will enable business management and monitoring using DOMO. Some of these features include the introduction of:
  • Annotations, so information workers and decision makers can highlight significant insights on top of a chart or data point.
  • Enhancements to its Analyzer tool, with the incorporation of a visual data lineage tool that enables users to track data from source to visualization.
  • Data slicing within DOMO’s cards to create more guided analysis paths that business users and decision makers can take advantage of. 
  • More than 60 chart families to enhance the rich set of visual options already within DOMO’s platform. 

DOMO’s new features seem to fit well within a renewed effort from the company to address bigger enterprise markets and increase presence within segments which traditionally are occupied by other enterprise BI contenders.

It may also signal DOMO’s necessary adaptation to a market currently racing to include advanced analytics features that address larger and newer user footprints within organizations, such as data scientists and a new, more tech-savvy generation of information workers.

There is much more behind Domo’s Curtains

Perhaps the one thing I did enjoy the most about the conference was having a continuous sense of discovery —different from previous interactions with DOMO, which somehow left me with a sense of incompletion. This time I had the chance to discover that there is much more about DOMO behind the curtains.

Having a luminary as CEO, such as Josh James, can be a two-edged sword. On one side, his glowing personality has served well to enhance DOMO’s presence in a difficult and competitive market. Josh has the type of personality that attracts, creates and sells the message, and without doubt drives the business.

On the other hand, however, if not backed and handled correctly, his strong message can create some scepticism, making some people think a company is all about a message and less about substance. But this year’s conference helped me discover that DOMO is much more than what can be seen on the surface.

Not surprising is the fact that Josh and Chris Harrington —savvy businessmen and smart guys— have been keen to develop DOMO’s business intelligence and analytics capabilities to achieve business efficiency, working towards translating technical complexity into business-oriented ease of use. To achieve this, DOMO has put together, on the technical side, a very knowledgeable team led by Catherine Wong and Daren Thayne, DOMO’s Chief Product Officer and Chief Technology Officer respectively, both with wide experience ranging from cloud platforms and information management to data visualization and analysis. On the business side, tech veterans Jay Heglar and Paul Weiskopf lead strategy and corporate development, respectively.

From a team perspective, this balance between tech experience and business innovation seems to be paying off: according to them, the company has been growing steadily and gaining the favour of big customers such as TARGET, Univision and Sephora, some of which were present during the event.


From an enterprise BI/Analytics perspective, it seems DOMO has achieved a good balance in at least two major aspects that ensure BI adoption and consumption:

  • The way BI services can be offered to different user groups —especially the C-level team— which requires a special degree of simplification but, at the same time, efficiency in the way the data is shown.
  • The way BI services can encapsulate complex data processing problems and hide them from the business user. 


On this topic, during the conference we had the chance to see examples of the aforementioned aspects, both onstage and offstage. One came from Christel Bouvron, Head of Business Intelligence at Sephora Southeast Asia, who commented the following regarding the adoption and use of DOMO:

“We were able to hook in our data sets really quickly. I had sketched out some charts of what I wanted. They didn’t do that, but what they did was even better. I really liked that it wasn’t simply what I was asking for – they were trying to get at the business problem, the outcomes we were trying to get from it, and think about the bigger picture.”

A good example of the shift DOMO wants to convey is the move from addressing a business problem with a technical perspective to addressing it with a business perspective, with a technical platform in the background to support it. Of course, this needs to come with the ability to effectively encapsulate technical difficulties in a way that is efficient and consumable for the business.

Christel Bouvron at DomoPalooza 2017 (Photo courtesy of DOMO)

It was also good to hear from the customers that they acknowledge that the process wasn’t always that smooth, but it helped to trigger an important cultural shift within their organization.

The takeaway

Attending Domopalooza 2017 was informative and very cool indeed. DOMO’s team showed me a thing or two about the true business of DOMO and its interaction with real customers; this includes the fact that DOMO is not a monolithic solution. Beyond its already rich set of features, it offers key customization capabilities so that individual customers can solve their problems in their own ways. While DOMO is a software company rather than a services company, customers expressed satisfaction with the degree of customization and services DOMO provides —this was especially true of large companies.

DOMO has done a great job of simplifying the data consumption process so that data feeds are digestible. The solution concentrates more on the business problem than the technical one, giving many companies the flexibility and time to make the development of business intelligence solutions more agile and effective. Although these results might not be fully achieved in all cases, DOMO’s approach can certainly help organizations benefit from a more agile and fast deployment process and, thus, become more efficient and productive.

Despite being a cloud-based software company, DOMO seems to understand quite well that a great number of companies are working, by necessity or by choice, in hybrid cloud/on-premises environments; its platform enables customers to easily connect and quickly interact with on-premises systems, whether through a simple connection to a database/table source or via more sophisticated data extraction and transformation specifications.

There is no way that in the BI and analytics market a company such as DOMO —or any other player— gets a free ticket to success. The business intelligence market is diversifying as an increasing number of companies seem to need its services, but DOMO’s offering is, by all means, one to consider when evaluating a new-generation BI solution to meet the increasing demand for insights and data analysis.

Finally, well... what better excuse could there be to watch Styx's Mr. Roboto than this?



(All photos credited to Domo, Inc.)
Why Data Governance is the Foundation of a Healthcare Big Data Strategy

Why Data Governance is the Foundation of a Healthcare Big Data Strategy

Big data is everywhere, and many businesses are using it to improve their processes and strategies with great success. But one area where big data seems to be lagging is in healthcare. Many healthcare institutions want to adopt and expand their usage of big data, but in order to do so, they’ll need to focus on data governance.

What is data governance?

In a nutshell, data governance is what keeps data safe, secure, and up to an organization’s standards. Data is everywhere around us, and patients and consumers are used to being able to access the information they need almost as soon as they need it. However, that accessibility comes with a cost that many people don’t see—it puts much of the data at a greater risk of being hacked and stolen.

Data governance works to fight that and to secure data by creating systems so that users can trust their data. In a comprehensive data governance program, users are responsible for creating quality data and using it in a secure and ethical manner with proper authorization. In healthcare, this often comes down to establishing data governance principles that ensure data is consistent and reliable.
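A minimal sketch of what one such governance rule might look like in code, checking records against a consistency standard before they enter a system of record. The field names, ranges, and rules here are entirely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    patient_id: str
    age: int
    blood_type: str

VALID_BLOOD_TYPES = {"A+", "A-", "B+", "B-", "AB+", "AB-", "O+", "O-"}

def validate(record: PatientRecord) -> list:
    """Return the list of governance-rule violations; an empty
    list means the record meets the consistency standard."""
    issues = []
    if not record.patient_id:
        issues.append("patient_id is required")
    if not 0 <= record.age <= 130:
        issues.append("age out of plausible range")
    if record.blood_type not in VALID_BLOOD_TYPES:
        issues.append("unrecognised blood type")
    return issues
```

In a real governance program, rules like these would be agreed on by data stewards and enforced automatically at every point where data enters the system.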

Why is it important?

The overall goal of data governance ...


Read More on Datafloq
What is the Future of the Internet of Things in Health?

What is the Future of the Internet of Things in Health?

The Internet of Things (IoT) is a broad term referring to all pieces of technology that connect to the Internet and each other. A subset of the Internet of Things is the Internet of Healthcare Things (IoHT). This refers to all pieces of Internet-connected technology that apply to the healthcare industry.

The IoHT and development of new technologies in the healthcare field has made significant strides in improving patient care. It is not just about maintaining records and communicating with patients. If household appliances and business technology can be added to the IoT, then medical devices can as well.  

A fully connected IoHT can enable practitioners to provide individually customized data-based treatments. If patient records are made readily available, the provider can view a comprehensive medical history cross-referenced with data-based treatment research successfully administered to similar patients. The more data available regarding current treatments, medications, and patient history, the better care the patient will receive.

The IoHT will not only benefit patients on a personal level, but it will streamline the healthcare process by allowing practitioners to communicate and instantly access patient information with the appropriate permissions. Let’s say, for instance, that a patient resides in Arizona. However, while vacationing in California, ...


Read More on Datafloq
How to Leverage AI for Cybersecurity Assurance

How to Leverage AI for Cybersecurity Assurance

In the game of cybersecurity, humans are the weakest link and usually the cause of unwanted breaches. Even highly educated and influential individuals can become unsuspecting targets of cyber-attacks, due to a lack of vigilance and taking security matters too lightly. Until now, this war has mainly been human versus human. Things are about to change with the introduction of AI as the future of cybersecurity. It will soon be a contest of supercomputers against each other, much as it already is in the world of automated trading.

Challenges of cybersecurity

Human nature

The biggest threat to cybersecurity at this time is people's careless attitude towards passwords. A password of adequate complexity would take computers years of continuous brute-force work to break under currently available algorithms. However, users jeopardize their online safety by using simple passwords such as "123456" or their pets' names. Most users release a significant amount of personal data through social media every day. It could take seconds for an AI-powered algorithm to break into sensitive accounts by leveraging user-generated content freely available online.
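A rough back-of-the-envelope sketch of why complexity matters, assuming a hypothetical fixed rate of 10 billion guesses per second (real cracking speeds vary widely with hardware and hashing scheme):

```python
def brute_force_years(alphabet_size: int, length: int,
                      guesses_per_second: float = 1e10) -> float:
    """Worst-case years needed to exhaust every password of the
    given length drawn from an alphabet of the given size."""
    search_space = alphabet_size ** length
    seconds = search_space / guesses_per_second
    return seconds / (365 * 24 * 3600)

# "123456": 6 characters from a 10-digit alphabet -- effectively instant
weak = brute_force_years(10, 6)

# 12 random characters from the 94 printable ASCII symbols --
# on the order of a million years at the same guess rate
strong = brute_force_years(94, 12)
```

The gap is exponential in password length and alphabet size, which is why a dictionary word or pet's name collapses the search space to something a machine exhausts in moments.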

Although awareness of cyber threats is high, there has been no notable change in people's behavior that would lead to better protection. We are afraid of ...


Read More on Datafloq
6 Signs Your Company Needs a New Data Strategy

6 Signs Your Company Needs a New Data Strategy

Big Data is not just the latest jargon to creep into executive meetings; it’s becoming an essential business practice used by most organisations today. Over the years, businesses have become aware of the insights they can gain from data analytics and are collecting increasing amounts of data. Yet many businesses do not have a proper data strategy in place and are simply collecting data in a frenzy. There is a difference between Big Data and having lots of data. Collecting data just for the sake of it, in hopes of using it in the future, is not only bad business practice; it also leads to potentially costly problems for your company.

Here is a list of issues that companies without a proper data strategy may face. If your company is experiencing any of these problems, it is a tell-tale sign that you need to review your company’s data strategy:

1. Storing data is starting to cost more

Even though the price of data storage has plummeted over the years, a poor data strategy will lead to high data storage costs. According to Experian, an information services company in the US, “77 percent of CIOs believe data is a valuable asset in their ...


Read More on Datafloq
4 Tech Innovations That Are Changing the Supply Chain Industry

4 Tech Innovations That Are Changing the Supply Chain Industry

Third-party logistics companies (3PLs) and the supply chain have always relied on technology to meet speed, cost, and quality demands; technology has always been the driver for meeting customer expectations. The emergence of railroad systems in the 19th century, as well as the proliferation of trucks and other automobiles in the 1900s, had a big impact on the supply chain. These cost-effective technologies helped companies make deliveries faster and in larger amounts. We are in the 21st century now, and the internet is on the verge of transforming the supply chain in ways that we couldn't have anticipated.

When Amazon announced its Prime Air services program two years ago, people were stunned -- which is understandable. Has logistics technology become so advanced that online shopping orders can be delivered by drone? Well, not yet, but this is no joke. Tests are underway on similar programs in Israel and the United Kingdom. The 21st century is the age of the internet, which is quickly transforming everything from customers' shopping experiences to how companies fulfill their orders. Here, we shed some light on the emerging supply chain technologies expected ...


Read More on Datafloq
Enterprise Journey to Becoming Digital

Enterprise Journey to Becoming Digital

Do you want to be a digital enterprise? Do you want to master the art of transforming yourself and be at the forefront of the digital realm?

How can you change your business to achieve this?

Derive new values for yourself, and find better and more innovative ways of working. Put customer experience above and beyond everything as you find methodologies to support the rapidly changing demands of the digital world.

Your transformation will be successful only when you identify and practice appropriate principles, embrace a dual strategy that enhances your business capabilities, and switch to agile methodologies if you have not done so already.

The journey to becoming a digital maestro and achieving transformation traverses four main phases.


Becoming a top-notch expert with industrialized IT services – by adopting six main principles
Switching to agile operations to achieve maximum efficiency – so that you enjoy simplicity, rationality and automation
Creating an engaging experience for your consumers using analytics, revenue and customer management – because your customers come first; their needs and convenience should be your topmost priority
Availing opportunities for digital services – assessing your security and managing your risks


Becoming a top-notch expert with industrialized IT services

There are five key transformation principles that can help you realize the ...


Read More on Datafloq
4 Ways Big Data Will Improve Road Safety

4 Ways Big Data Will Improve Road Safety

With over 40,000 deaths each year from traffic-related collisions and accidents, improving road safety is clearly a top priority across the nation. Advances in technology are helping reduce accidents and improve overall driver safety through a variety of methods. Here are the top 4 ways that technology will help improve road safety in the coming years.

#1 Data Collection

Along with computer-controlled vehicles, data collection is vital to understanding where and why accidents happen. The black box technology famously used to track airplanes and help identify the cause of crashes is now being used in other vehicles. Black box technology is fairly simple, inexpensive, and easy to deploy across a wide range of cars.

The benefit of the technology is that we will be able to track the exact time, speed, position, and other factors related to car collisions and accidents. As this data is studied, we will better understand the trends and causes behind car crashes and use that knowledge to prevent future incidents. South Korea was the first country to deploy black box technology in its taxi services and immediately noticed a 14 percent decrease in traffic accidents the following ...
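As a toy sketch of the kind of analysis such records enable (the field names and figures here are hypothetical, not any real black-box format), a natural first step is to tally incidents by contributing factor:

```python
from collections import Counter

# Hypothetical black-box event records: (timestamp, speed in km/h, contributing factor)
events = [
    ("2017-03-01T08:14", 92, "speeding"),
    ("2017-03-01T17:40", 45, "distraction"),
    ("2017-03-02T07:55", 88, "speeding"),
    ("2017-03-02T22:10", 60, "weather"),
]

# Count events by contributing factor to surface trends worth investigating
by_factor = Counter(factor for _, _, factor in events)
for factor, count in by_factor.most_common():
    print(factor, count)
```

At fleet scale, the same grouping idea applied over time, location, and speed is what lets analysts spot the patterns behind crashes.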


Read More on Datafloq
Using Online Reviews and Big Data for Positive Impact

Using Online Reviews and Big Data for Positive Impact

For years, corporations have used big data to make decisions and drive strategy. This is no longer a viable option on its own. Companies aren't using their data correctly, and while the concept remains popular, current methods of working with big data aren't meeting the need. Corporate researchers and marketing experts still use data without supplying the proper context. Saying that one data point is up or that another has decreased means nothing without the entire story. Changemakers need to know the goals and what changes will be required to get there. Visionary companies are using smart data instead of big data. Smart data is timely and is used to help transform business operations.
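As a toy illustration of that point (all figures hypothetical), a metric's direction alone says little; pairing each change with its magnitude and the surrounding volume begins to tell the story:

```python
# Hypothetical monthly review data: month -> (average score, review count)
reviews = {
    "Jan": (4.1, 120),
    "Feb": (3.6, 480),
    "Mar": (3.2, 950),
}

# "Scores are down" alone is not actionable; showing that the drop
# coincides with a surge in review volume gives the change its context.
months = list(reviews)
for prev, cur in zip(months, months[1:]):
    d_score = reviews[cur][0] - reviews[prev][0]
    d_count = reviews[cur][1] - reviews[prev][1]
    print(f"{prev}->{cur}: score {d_score:+.1f}, reviews {d_count:+d}")
```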

To see the difference, let's start with review data. Most of those in the service industry consider review data essential to successful operations. Reviews tend to shape a consumer's shopping experience. Today's consumers are savvy enough to look up a company's or product's reviews before purchasing. Positive reviews have a positive impact. Negative reviews have the opposite effect: consumers don't spend their money with that company or on that product. Additionally, negative reviews tend to attract more negative reviews. However, asking customers for reviews is part of the ...


Read More on Datafloq
Digital Transformation: What it is and Why It Matters

Digital Transformation: What it is and Why It Matters

Digital technologies play an ever-increasing role in our daily lives, and companies must follow suit to provide consumers with the digitally connected experiences they expect. Digital transformation takes commitment, but it is no longer optional for brands that want to stay competitive. Consumers will simply turn away from companies that aren't keeping pace, in favor of a brand that can provide a connected experience across digital channels.

More brands are renewing their commitment to digital strategies. According to Forrester, “A Digital Strategy allows you to understand the who, what, when and where of listening and responding to consumers, bridging brand experiences, iterating offerings, and collecting and activating consumer relationships in order to accomplish an actionable and measurable objective.”

Digital Transformation Objectives

The essence of digital strategy is to enhance your customers' experience and increase your organization's competitive advantage through investments in technology. According to a survey conducted by Altimeter, digital transformation is led by the CMO (Chief Marketing Officer) 54% of the time, by the CEO 42% of the time, by the CIO/CTO 29%, and by others (including the CDO) 20%.

Based on the research, survey respondents rated the following as Very Important:


80% – Improving processes that expedite changes to digital properties, such as website updates, new social platforms and new mobile platforms.
71% ...


Read More on Datafloq
How Big Data Changed Online Dating

How Big Data Changed Online Dating

Most young men would have considered happy hour at the Chainsaw Sisters Saloon a target-rich environment. The place was packed and the drinks were cheap. Yet the odds of “getting lucky” were very low. Millennials know from experience that bar crawling is for recreation, not for mating rituals -- as a way to meet a partner it is low-percentage, time-wasting, and archaic. There are many dating apps and sites available if you wish to meet someone.

Space is Crowded

The major players in dating include eHarmony, Chemistry.com, and Match.com for romance, and they all promise long-lasting relationships. Niche sites like JDate.com (for Jewish singles), BlackPeopleMeet.com (for African American singles), ChristianMingle.com (for Christians seeking partners with similar values) and OurTime.com (for serious daters over the age of 50) each offer their own consumer value proposition.

Tinder is the undisputed leader in the mobile-first arena. There are numerous other offerings, but not a single app comes close to Tinder's market share. Zoosk, OkCupid, and Hinge are all players, and niche apps like The League (“curated”: members must be chosen to join), Bumble (women must begin the conversation), Happn (dating based on location) and JSwipe (a Jewish Tinder) have all found an ...


Read More on Datafloq
SAP Ariba and MercadoLibre to consumerize business commerce in Latin America

SAP Ariba and MercadoLibre to consumerize business commerce in Latin America

The next BriefingsDirect global digital business panel discussion explores how the expansion of automated tactical buying for business commerce is impacting global markets, and what's in store next for Latin America.

We’ll specifically examine how “spot buy” approaches enable companies to make time-sensitive and often mission-critical purchases, even in complex and dynamic settings like Latin America.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the rising tide of such tactical business buying improvements, please join our guests, Karen Bruck, Corporate Sales Director at MercadoLibre.com in Buenos Aires, Argentina; Diego Cabrera Canay, Director of Financial Planning at MercadoLibre; and Tony Alvarez, General Manager of SAP Ariba's Spot Buy Business. The panel was recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas and is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: SAP Ariba Spot Buy has been in the market a few years. Tell us about where it has rolled out so far, why certain markets are being approached, and then about Latin America specifically.

Alvarez: The concept is a few years old, but we've been delivering SAP Ariba Spot Buy for about a year. We began in the US, and over the past 12 months the concept of Spot Buy has progressed because of our customer base. Our customer base has pushed us in a direction that is, quite frankly, even beyond Spot Buy -- and it’s getting into trusted, vetted content.

We are approaching the market with a two-pronged strategy of, yes, we have the breadth of content so that when somebody goes into an SAP Ariba application they can find what they are looking for, but we also now have parameters and controls that allow them to vet that content and to put a filter on it.

Over the last 12 months, we've come a long way. We are live in the US, and with early access in the UK and Germany. We just went live in Australia, and now we are very much looking forward to going live and moving fast into Latin America with MercadoLibre.

Gardner: Spot buying, or tactical buying, is different from strategic or more organized long-term buying. Tell us about this subset of procurement.

Alvarez: SAP Ariba is a 20-year-old company, and its roots are in that rigorous, sourced approach. We do hundreds of billions of dollars through contract catalogs on the Ariba Network, but there's a segment -- and we believe it's upward of 15% of spend -- that is spot buy spend. The procurement professional often has no idea what's being bought. And I think there are two responses to that: either ignorance is bliss and they are glad it's out of their purview, or it keeps them up at night.

SAP Ariba Spot Buy allows them to have visibility into that spend. By partnering with providers like MercadoLibre, they have content from trusted and vetted sellers to bring to the table – so it's a really nice match for procurement.

Liberating limits

Gardner: The trick is to allow for flexibility and being dynamic, but also putting in enough rules and policies so that things don’t go off-track.

Alvarez: Exactly. For example, it's like putting a filter on your kids' smartphone. You want them to be liberated so they can make phone calls as they please -- but not go off the guardrails.

Gardner: Karen, tell us about MercadoLibre and why Latin America might be a really interesting market for this type of Spot Buy service.

Bruck: MercadoLibre is a leading e-commerce platform in Latin America, where we provide the largest marketplaces in 16 different countries. Our main markets are Brazil, Mexico, and Argentina, and that's where we are going to start this partnership with SAP Ariba.

We have upward of 60 million items listed on our platform, and this breadth of supplies will make purchasing very exciting. Latin America is a complicated market -- and we like this complexity. We do very well.

It’s complicated because there are different rates of inflation in different countries, and so contracts can be hard to complete. What we bring to the table is an assortment of great payment and shipping solutions that make it easy for companies to purchase items. As Tony was saying, these are not under long-term contracts, but we still get to make use of this vast supply.

Gardner: Tony mentioned that maybe 15% of spend is in this category. Diego, do you think that that number might be higher in some of the markets that you serve?

Cabrera Canay: That’s probably the number -- but that is a big number in terms of the spend within companies. So we have to get there and see what happens.

Progressive partnership 

Gardner: Tony, tell us about the partnership. What is MercadoLibre.com bringing to the table? What is Ariba bringing to the table? How does this fit together for a whole that is greater than the sum of its parts?

Alvarez: It really is a well-matched partnership. SAP Ariba is the leading cloud procurement platform, period. When you look in Latin America, our penetration with SAP Enterprise Resource Planning (ERP) is even greater. We have a very strong installed base with SAP ERP.

Our plan is to take the SAP Ariba Spot Buy content and make it available to the SAP installed base. So this goes way beyond just SAP Ariba. And when you think about what Karen mentioned -- difficulties in Latin America with high inflation -- the catalog approach is not used as much in Latin America because everything is so dynamic.

For example, you might sign a contract, but in just a couple of weeks that contract may be obsolete, or unfavorable because of a change in pricing. But once we build controls and parameters into SAP Ariba Spot Buy, you can layer them on top of MercadoLibre content, which is super-broad. If you're looking for something, you're going to find it, and that content is constantly updated. You gain real-time access to the latest information, and the procurement person gets the benefit of control.

So I'm very optimistic. As Diego mentioned, I think 15% is really on the low-end in Latin America for this type of spend. I think this will be a really nice way to put digital catalog buying in the hands of large enterprise buyers.

Gardner: Speaking of large enterprise buyers, if I'm a purchasing official in one of your new markets, how should I be thinking about the ways this will benefit me?

Transparent, trusted transactions

Bruck: Let me talk about this from experience. As a country manager at MercadoLibre, I had to do a lot of the procurement, together with our procurement officers. It was really frustrating at times because all of these purchases had to be one-off engagements, with a different vendor every time. That takes a lot of time. You also have to bring in price comparisons, and that’s not always a simple process.

So what this platform gives you is the ability to be very transparent about prices across different suppliers. That makes it very easy to buy every time without having to call each vendor and bring them onto your own buying platform.

It saves a lot of time, it makes the comparison very transparent, and you are able to control the different options. Overall, it’s a win-win. So I do believe this is a partnership, a match made in heaven.

We were also very interested in business-to-business (B2B) industries. When Tony and SAP Ariba came to our offices to offer this partnership, we thought this would be a great way to leverage their needs with our supply and make it work.

Gardner: For sellers, this enables them to do repeat business more easily, in a more automated way, and at scale. For buyers, transparency gives them more insight into getting the best prices and the best terms of delivery. Let's expand on that win-win. Diego, tell us about the business benefits for all parties.

Big and small, meet at the mall 

Cabrera Canay: In the past few years, we have been working to make MercadoLibre the biggest “mall” in e-commerce. We have the most important brands and the most important retailers selling through MercadoLibre.

What differentiates us is that we are confident we have the best prices -- and also other great services such as free shipping, easy payments, and financing. We are sure that we can offer the buyers better purchasing.

Obviously, from the side of sellers, this all provides higher demand, it raises the bar in terms of having qualified buyers, and then giving the best services. That’s very exciting for us.

Gardner: Tony, we mentioned large enterprises, but this cuts across a great deal more of the economy, such as small- to medium-sized businesses (SMBs). Tell us how this works across diverse economies where there are large players but lots of small ones, too?

Alvarez: On the sales side, this gives really small businesses opportunity to reach large enterprise buyers that probably weren’t there before.

Diego was being modest, but MercadoLibre's payment structure, MercadoPago, is incredibly robust, and it's incredibly valuable to that end-seller, and also to the buyer.

Just having that platform and then connecting -- you are basically taking two populations, the large and small sellers, and the large and small buyers, and allowing them to commingle more than they ever had in the past.

Gardner: Karen, as you mentioned from your own experience, when you're dealing with paper, and you are dealing with one-offs, it's hard to just keep track of the process, never mind to analyze it. But when we go digital, when we have a platform, when we have business networks at work, then we can start to analyze things for companies -- and more broadly into markets.

How do you see this partnership accelerating the ability to leverage analytics and some of the back-end platform technologies with SAP HANA and SAP Ariba, and making more strides toward productivity for your customers?

Data discoveries

Bruck: Right. When everything is tracked, as this will be, because every single purchase will be inside their SAP Ariba platform, it is all part of your “big data.” So then you can actually drop it, control it, analyze it, and say, “Hey, maybe these particular purchases mean that we should have long-term contracts, or that our long-term contracts were not priced correctly,” and maybe that's an opportunity to save money and lower costs.

So once you can track data, you can do a lot of things, and discover new opportunities for either being more efficient or reducing costs – and that's ultimately what we all want in all the departments of our companies.
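As an illustrative sketch of that idea (hypothetical records and field names, not the SAP Ariba API), once purchases are tracked, flagging items bought repeatedly as candidates for long-term contracts is a simple aggregation:

```python
from collections import defaultdict

# Hypothetical spot-buy records from a tracked purchasing system: (item, unit price)
purchases = [
    ("laptop stand", 42.0),
    ("laptop stand", 45.0),
    ("laptop stand", 41.0),
    ("whiteboard", 120.0),
]

# Aggregate order count and total spend per item
stats = defaultdict(lambda: [0, 0.0])  # item -> [orders, total spend]
for item, price in purchases:
    stats[item][0] += 1
    stats[item][1] += price

# Items bought repeatedly are candidates for a negotiated long-term contract
candidates = [item for item, (orders, _) in stats.items() if orders >= 3]
print(candidates)
```

The same aggregation run over real purchasing data is what turns one-off spot buys into evidence for renegotiating contracts or consolidating suppliers.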

Gardner: And for those listeners and readers who are interested in taking advantage of these services, and ultimately that great ability to analyze, what should they be doing now to get ready? Are there some things they could do culturally, organizationally, in order to become that more digital business when these services are available to them?

Cabrera Canay: I can speak to our own case, where we are rebuilding our purchase processes. Paper is terrible for companies; you have to rethink your purchasing processes in a digital way. Once you do, SAP Ariba is a great solution, and with SAP Ariba Spot Buy we will have the best conditions for buyers.

Bruck: It’s a natural process. People are going digital and embracing these new trends and technologies. It will make them more efficient. If they get up to speed quickly, it will become less about controlling stuff that they don't need to control. They will really understand the benefits, so it will be a natural adoption.

Gardner: Tony, coming back full circle, as you have rolled SAP Ariba Spot Buy out from North America to Europe to Asia-Pacific, and now to Latin America -- what have you learned in the way people use it?

Alvarez: First, at a macro level, people have found this to be a useful tool to replace some of the contracts that were less important, and so they can rely on marketplaces.

Second, we’ve really found as we’ve deployed in the US that a lot of times multinational companies are like, “Hey, that's great, I love this, but I really want to use this in Latin America.” So they want to go and get visibility elsewhere.

Turn-key technique

Third, they want a tool that doesn't require any training. If I’m a procurement professional, I want my users to already be expert at using the tool. We've designed this in the process context, and in concert with the content partners. You can just walk up and start using it. You don’t have to be an expert, and it keeps you within the guardrails without even thinking about it.

Gardner: And being a cloud-based, software-as-a-service (SaaS) solution you're always analyzing how it's being used -- going after that ultimate optimized user experience -- and then building those improvements back in on a constant basis?

Alvarez: Exactly. Always.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

SAP Ariba and MercadoLibre to consumerize business commerce in Latin America

SAP Ariba and MercadoLibre to consumerize business commerce in Latin America

The next BriefingsDirect global digital business panel discussion explores how the expansion of automated tactical buying for business commerce is impacting global markets, and what's in store next for Latin America.

We’ll specifically examine how “spot buy� approaches enable companies to make time-sensitive and often mission-critical purchases, even in complex and dynamic settings, like Latin America.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the rising tide of such tactical business buying improvements, please join our guests, Karen Bruck, Corporate Sales Director at MercadoLibre.comin Buenos Aires, Argentina; Diego Cabrera Canay, Director of Financial Planning at MercadoLibre, and Tony Alvarez, General Manager of SAP Ariba's Spot Buy Business. The panel was recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas, and is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: SAP Ariba Spot Buy has been in the market a few years. Tell us about where it has rolled out so far, why certain markets are being approached, and then about Latin America specifically.

Alvarez
Alvarez: The concept is a few years old, but we've been delivering SAP Ariba Spot Buy for about a year. We began in the US, and over the past 12 months the concept of Spot Buy has progressed because of our customer base. Our customer base has pushed us in a direction that is, quite frankly, even beyond Spot Buy -- and it’s getting into trusted, vetted content.

We are approaching the market with a two-pronged strategy of, yes, we have the breadth of content so that when somebody goes into an SAP Ariba application they can find what they are looking for, but we also now have parameters and controls that allow them to vet that content and to put a filter on it.

Over the last 12 months, we've come a long way. We are live in the US, and with early access in the UK and Germany. We just went live in Australia, and now we are very much looking forward to going live and moving fast into Latin America with MercadoLibre.

Gardner: Spot buying, or tactical buying, is different from strategic or more organized long-term buying. Tell us about this subset of procurement.

Alvarez: SAP Ariba is a 20 year-old company, and its roots are in that rigorous, sourced approach. We do hundreds of billions of dollars through contract catalog on the Ariba Network, but there's a segment -- and we believe it's upward of 15% of spend -- that is spot buy spend. The procurement professional often has no idea what's being bought. And I think there are two approaches to that -- either ignorance is bliss and they are glad that it’s out of their purview, or it also keeps them up at night.

SAP Ariba Spot Buy allows them to have visibility into that spend. By partnering with providers like MercadoLibre, they have content from trusted and vetted sellers to bring to the table – so it's a really nice match for procurement.

Liberating limits

Gardner: The trick is to allow for flexibility and being dynamic, but also putting in enough rules and policies so that things don’t go off-track.

Alvarez:Exactly. For example, it’s like putting a filter on your kids’ smartphone. You want them to be able to be liberated so they can go and do as they please with phone calls -- but not to go off the guardrails.

Gardner: Karen, tell us about MercadoLibre and why Latin America might be a really interesting market for this type of Spot Buy service.

Bruck: MercadoLibre is a leading e-commerce platform in Latin America, where we provide the largest marketplaces in 16 different countries. Our main markets are Brazil, Mexico, and Argentina, and that’s where we are going the start this partnership with SAP Ariba.

Bruck
We have upward of 60 million items listed on our platform, and this breadth of supplies will make purchasing very exciting. Latin America is a complicated market -- and we like this complexity. We do very well.

It’s complicated because there are different rates of inflation in different countries, and so contracts can be hard to complete. What we bring to the table is an assortment of great payment and shipping solutions that make it easy for companies to purchase items. As Tony was saying, these are not under long-term contracts, but we still get to make use of this vast supply.

Gardner: Tony mentioned that maybe 15% of spend is in this category. Diego, do you think that that number might be higher in some of the markets that you serve?

Cabrera Canay: That’s probably the number -- but that is a big number in terms of the spend within companies. So we have to get there and see what happens.

Progressive partnership 

Gardner: Tony, tell us about the partnership. What is MercadoLibre.com bringing to the table? What is Ariba bringing to the table? How does this fit together for a whole that is greater than the sum of its parts?

Alvarez: It really is a well-matched partnership. SAP Ariba is the leading cloud procurement platform, period. When you look in Latin America, our penetration with SAP Enterprise Resource Planning (ERP) is even greater. We have a very strong installed base with SAP ERP.

Our plan is to take the SAP Ariba Spot Buy content and make it available to the SAP installed base. So this goes way beyond just SAP Ariba. And when you think about what Karen mentioned -- difficulties in Latin America with high inflation -- the catalog approach is not used as much in Latin America because everything is so dynamic.

For example, you might sign a contract but in just in a couple of weeks that contract may be obsolete, or unfavorable because of a change in pricing. But once we build controls and parameters in SAP Ariba Spot Buy, you can layer that on top of MercadoLibre content, which is super-broad. If you're looking for it you’re going to find it, and that content is constantly updated. You gain real-time access to the latest information, and then the procurement person gets the benefit of control.

So I'm very optimistic. As Diego mentioned, I think 15% is really on the low-end in Latin America for this type of spend. I think this will be a really nice way to put digital catalog buying in the hands of large enterprise buyers.

Gardner: Speaking of large enterprise buyers, if I'm a purchasing official in one of your new markets, what should I be thinking about how this is going to benefit me?

Transparent, trusted transactions

It saves a lot of time, it makes the comparison very transparent, and you are able to control the different options. Overall, it's a win-win ... a partnership, a match made in heaven.
Bruck: Let me talk about this from experience. As a country manager at MercadoLibre, I had to do a lot of the procurement, together with our procurement officers. It was really frustrating at times because all of these purchases had to be one-off engagements, with a different vendor every time. That takes a lot of time. You also have to bring in price comparisons, and that’s not always a simple process.

So what this platform gives you is the ability to be very transparent about prices and among different supplies. That makes it very easy to be able to buy every time without having to call and get the vendor to be in your own buying platform.

It saves a lot of time, it makes the comparison very transparent, and you are able to control the different options. Overall, it’s a win-win. So I do believe this is a partnership, a match made in heaven.

We were also very interested in business-to-business (B2B) industries. When Tony and SAP Ariba came to our offices to offer this partnership, we thought this would be a great way to leverage their needs with our supply and make it work.

Gardner: For sellers, this enables them to do repeated business more easily, more automated and so at scale. For buyers, with transparency they have more insight into getting the best prices, the best terms of delivery. Let's expand on that win-win. Diego, tell us about the business benefits for all parties.

Big and small, meet at the mall 

Cabrera Canay: In the past few years, we have been working to make MercadoLibre the biggest “mall� in e-commerce. We have the most important brands and the most important retailers selling through MercadoLibre.

Cabrera Canay
What differentiates us is that we are confident we have the best prices -- and also other great services such as free shipping, easy payments, and financing. We are sure that we can offer the buyers better purchasing.

Obviously, from the side of sellers, this all provides higher demand, it raises the bar in terms of having qualified buyers, and then giving the best services. That’s very exciting for us.

Gardner: Tony, we mentioned large enterprises, but this cuts across a great deal more of the economy, such as small- to medium sized (SMB) businesses. Tell us about how this works across diverse economies where there are large players but lots of small ones, too?

Alvarez: On the sales side, this gives really small businesses opportunity to reach large enterprise buyers that probably weren’t there before.

Diego was being modest, but MercadoLibre's payment structure, MercadoPago, is incredibly robust, and it's incredibly valuable to that end-seller, and also to the buyer.

Just having that platform and then connecting -- you are basically taking two populations, the large and small sellers, and the large and small buyers, and allowing them to commingle more than they ever had in the past.

Gardner: Karen, as you mentioned from your own experience, when you're dealing with paper, and you are dealing with one-offs, it's hard to just keep track of the process, never mind to analyze it. But when we go digital, when we have a platform, when we have business networks at work, then we can start to analyze things for companies -- and more broadly into markets.

How do you see this partnership accelerating the ability to leverage analytics, leverage some of the back-end platform technologies with SAP HANAand SAP Ariba, and making more strides toward productivity for your customers?

Data discoveries

Bruck:Right. When everything is tracked, as this will be, because every single purchase will be inside their SAP Ariba platform, it is all part of your “big data.� So then you can actually drop it, control it, analyze it, and say, “Hey, maybe these particular purchases mean that we should have long-term contracts, or that our long-term contracts were not priced correctly,� and maybe that's an opportunity to save money and lower costs.

So once you can track data, you can do a lot of things, and discover new opportunities for either being more efficient or reducing costs – and that's ultimately what we all want in all the departments of our companies.
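As a minimal sketch of that kind of spend analysis, using entirely hypothetical purchase records and thresholds, one might aggregate spend per item and flag items bought repeatedly as candidates for long-term contracts:

```python
from collections import defaultdict

# Hypothetical spot-buy records: (item, unit_price, quantity)
purchases = [
    ("laptop stand", 45.0, 10),
    ("laptop stand", 52.0, 8),
    ("laptop stand", 48.0, 12),
    ("whiteboard", 90.0, 2),
]

# Aggregate total spend and purchase frequency per item.
totals = defaultdict(lambda: {"spend": 0.0, "orders": 0})
for item, price, qty in purchases:
    totals[item]["spend"] += price * qty
    totals[item]["orders"] += 1

# Items bought again and again may be cheaper under a long-term contract.
candidates = [item for item, t in totals.items() if t["orders"] >= 3]
print(candidates)  # ['laptop stand']
```

The threshold of three orders is arbitrary here; in practice the cut-off would come from the organization's own contract economics.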

Gardner: And for those listeners and readers who are interested in taking advantage of these services, and ultimately that great ability to analyze, what should they be doing now to get ready? Are there some things they could do culturally, organizationally, in order to become that more digital business when these services are available to them?

Cabrera Canay: I can speak to our own case, where we are rebuilding our purchasing processes. Paper is terrible for companies; you have to rethink your purchase processing in a digital way. Once you do, SAP Ariba is a great solution, and with SAP Ariba Spot Buy we will have the best conditions for buyers.

Bruck: It’s a natural process. People are going digital and embracing these new trends and technologies. It will make them more efficient. If they get up to speed quickly, it will become less about controlling stuff that they don't need to control. They will really understand the benefits, so it will be a natural adoption.

Gardner: Tony, coming back full circle, as you have rolled SAP Ariba Spot Buy out from North America to Europe to Asia-Pacific, and now to Latin America -- what have you learned in the way people use it?

Alvarez: First, at a macro level, people have found this to be a useful tool to replace some of the contracts that were less important, and so they can rely on marketplaces.

Second, we’ve really found as we’ve deployed in the US that a lot of times multinational companies are like, “Hey, that's great, I love this, but I really want to use this in Latin America.” So they want to go and get visibility elsewhere.

Turn-key technique

Third, they want a tool that doesn't require any training. If I’m a procurement professional, I want my users to already be expert at using the tool. We've designed this in the process context, and in concert with the content partners. You can just walk up and start using it. You don’t have to be an expert, and it keeps you within the guardrails without even thinking about it.

Gardner: And being a cloud-based, software-as-a-service (SaaS) solution you're always analyzing how it's being used -- going after that ultimate optimized user experience -- and then building those improvements back in on a constant basis?

Alvarez: Exactly. Always.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

What Big Data Means for the Future of Health

What Big Data Means for the Future of Health

Doctors and scientists are always looking for ways to improve the health and wellness of society. The industry thrives on the introduction of new technology and medical techniques. It is this constant effort to improve medical outcomes that pushes doctors, nurses, and scientists to find the best healthcare solutions. One big technological breakthrough that has helped these solutions come to light is big data.

Big data, when used correctly, has improved technology, communication and now healthcare. From improving medical transcriptionist schools to advancing the technology used in a hospital, big data has made its way to the medical world. Data analytics has helped doctors improve the health of their patients and will continue doing so in the future. As data is aggregated, doctors have been able to use it to improve the overall health of the public, identify and provide treatment for patients, find the right program/procedure for a patient, and communicate with healthcare providers.

Improve Public Health

Big data has the ability to reveal disease patterns and record sudden outbreaks. It also allows public health to be monitored and tracked. When disease data and the public's medical information are recorded, these two pieces can improve the overall health of the population. ...
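A toy sketch of what "recording sudden outbreaks" can mean in practice: flag any week whose case count jumps well above a rolling baseline. The numbers and the threshold factor below are invented for illustration:

```python
# Weekly case counts for one disease in one region (hypothetical numbers).
weekly_cases = [12, 15, 11, 14, 13, 12, 40]

def flag_outbreaks(counts, window=4, factor=2.0):
    """Flag any week whose count exceeds `factor` times the
    average of the preceding `window` weeks."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = sum(counts[i - window:i]) / window
        if counts[i] > factor * baseline:
            flagged.append(i)
    return flagged

print(flag_outbreaks(weekly_cases))  # [6] -- the final week stands out
```

Real surveillance systems use far more robust statistics, but the core idea, comparing current counts against a historical baseline, is the same.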


Read More on Datafloq
How to Balance Analytics Agility and Stability

How to Balance Analytics Agility and Stability

There have been many science fiction stories (as well as video games!) that revolve around the tradeoffs between powerful, strong, hard to harm combatants and those that are small, nimble, but easy to harm. Both have their merits and both can be useful in different situations. However, the same profile doesn’t work best in every situation.

There are situations in a fight where being fast is more important than being strong. There are also cases where being able to stand your ground is more important than speed. The same is true with analytics. At times, agility is more important. At other times, stability is more important. The key is to know when you need which option.

The Flaws Of Targeting Only Stability

Many organizations struggle to progress effectively in their analytics journeys because of too large a focus on stability. During the exploration and discovery process, nearly all the stringent rules applied to a mission-critical production process are enforced from the start. This makes it very difficult to go in new directions and drives costs up much higher than they need to be. After all, if I just want to quickly see if an idea has merit, I don’t need my process to ...


Read More on Datafloq
Why Control Is Necessary For Explainable Artificial Intelligence

Why Control Is Necessary For Explainable Artificial Intelligence

Artificial intelligence is a powerful and useful technology that has taken the world by storm. As beneficial as it is, it also has its drawbacks. No matter how efficient they are, machines only work on algorithms, and those algorithms should not be over-trusted, for several reasons.

There may be a genuine error in an algorithm that leads to actions different from the expected or desired ones. A bug or malware may find its way into the code, leading to abnormal behavior in these machines. Programmers may even deliberately insert wrong code for ulterior motives. This is why explainable artificial intelligence is the way to go.

With explainable artificial intelligence, every user will be able to understand how a machine works. Besides, the machines will come with a high level of transparency and accountability. Every machine should be able to explain why certain actions need to be taken to its users.

It should also explain why that is the best option and why other alternatives may not work out for a particular situation. Explainable artificial intelligence also aims at making it obvious to users when a particular machine has failed on a particular task and when it has succeeded in ...


Read More on Datafloq
How the Mortgage Industry is Being Reshaped by Big Data

How the Mortgage Industry is Being Reshaped by Big Data

What goes into getting a mortgage or refinancing your home? A fairly individualized process that hinges on both the lending bank and the borrowers’ history, the mortgage industry is on the brink of transformation – and it’s all because of big data. From the initial application process to ongoing loan servicing, data banks are dictating lending in an unprecedented way.

What does big data mean for your mortgage? Here’s an inside look at a changing industry.

Faster Decisions

As noted, loan approvals used to be fairly individualized because isolated loan officers looked at isolated pieces of information and decided whether or not you qualified. The onboarding process took a long time and was prone to missing key factors because, while there was more data out there that could have been placed at their disposal, most of it was just warehoused and not accessible. Now computers can cross-reference information during onboarding to confirm that borrowers aren’t submitting conflicting information or disguising their financial history.

Following this high-speed onboarding process using immediately accessible data banks, loan officers are able to make immediate decisions about your qualifications. Actually, a computer makes the decision and feeds it back to the staff members, but this makes for ...
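The cross-referencing step described above can be sketched very simply. Everything here, the field names, the records, and the 15% income tolerance, is hypothetical, invented only to illustrate the idea of checking an application against an external record:

```python
# Hypothetical applicant data pulled from two sources during onboarding.
application = {"name": "J. Doe", "stated_income": 85000, "employer": "Acme"}
bureau_record = {"name": "J. Doe", "reported_income": 61000, "employer": "Acme"}

def find_conflicts(app, record, income_tolerance=0.15):
    """Return the fields where the application disagrees with
    the external record beyond a tolerance."""
    conflicts = []
    if app["employer"] != record["employer"]:
        conflicts.append("employer")
    stated, reported = app["stated_income"], record["reported_income"]
    if abs(stated - reported) / reported > income_tolerance:
        conflicts.append("income")
    return conflicts

print(find_conflicts(application, bureau_record))  # ['income']
```

A real underwriting system would check dozens of such fields and weigh them in a scoring model rather than a flat list.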


Read More on Datafloq
How Artificial Intelligence will Transform IT Operations and DevOps

How Artificial Intelligence will Transform IT Operations and DevOps

To state that DevOps and IT operations teams will face new challenges in the coming years sounds a bit redundant, as their core responsibility is to solve problems and overcome challenges. However, at the dramatic pace at which the current landscape of processes, technologies, and tools is changing, it has become quite problematic to cope. Moreover, the pressure business users have been putting on DevOps and IT operations teams is staggering, demanding that everything be solved with a tap on an app. However, at the backend, handling issues is a different ball game; users can't even imagine how difficult it is to find a problem and solve it.

One of the biggest challenges IT operations and DevOps teams face nowadays is being able to pinpoint the small yet potentially harmful issues in the large streams of Big Data being logged in their environment. Put simply, it is just like finding a needle in a haystack.
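One simple way to surface that needle, sketched here with made-up log lines and an arbitrary rarity threshold, is to count message signatures and flag the ones that almost never occur:

```python
from collections import Counter

# Hypothetical log stream: routine messages dominate, with one rare error buried inside.
logs = [
    "INFO request served",
    "INFO request served",
    "INFO cache hit",
    "ERROR payment gateway timeout",
    "INFO request served",
] * 1000 + ["ERROR disk quota exceeded"]

counts = Counter(logs)
total = len(logs)

# Messages that occur very rarely are often the interesting "needles".
rare = [msg for msg, n in counts.items() if n / total < 0.001]
print(rare)  # ['ERROR disk quota exceeded']
```

Production log-analytics tools cluster similar messages and learn baselines over time, but frequency-based rarity is the intuition they start from.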

If you work in the IT department of a company with an online presence that boasts 24/7 availability, here is a scenario that may sound familiar to you. Assume that you get a call in the middle of the night from an angry customer or ...


Read More on Datafloq
Big Data Continues Serving Patients in Increasingly More Ways

Big Data Continues Serving Patients in Increasingly More Ways

From marketing to healthcare, across all industries, big data has become the next disruptive technology. The healthcare industry in particular has consistently been a late technology adopter; however, it is starting to uncover new ways to optimize and serve patients using big data. These are the top five ways big data is helping improve patient lives.

Big data helps doctors determine the best treatment

Dr. Anil Jain, a doctor at the Cleveland Clinic, wished he could see and analyze diabetic patient data so he could determine the best treatment plan after he noticed diabetic patients often had the same two or three concurrent medical issues. Jain thought if he had access to aggregated patient data, he and other doctors would be able to create better outcomes for their patients.

This led the Cleveland Clinic to create a program called Explorys, which gave doctors the ability to put in data and then sift through the data to “identify patient risk factors, track outcomes and evaluate treatment success.” IBM purchased Explorys in an effort to improve its cloud offerings using the supercomputer Watson.

IBM’s Watson is also being used as a tool to help oncologists determine the best treatment route for their cancer patients by using big data ...


Read More on Datafloq
How Data Analytics is Transforming Healthcare Systems

How Data Analytics is Transforming Healthcare Systems

Big Data Analytics is entirely transforming business paradigms. Automated databases are enabling businesses to perform mundane tasks more efficiently. And, the commercial sector isn’t the only area to benefit from data analytics. Its impact is widespread and is being seen across many different sectors, including healthcare.

Access to healthcare facilities is a basic human need. However, healthcare in the United States is extremely expensive, even compared to other developed economies. The burden of the expense ultimately falls on the consumer, since the sector is mostly dominated by private companies. America, however, ends up spending more on its public healthcare system than countries where the system is fully publicly funded.

Under such circumstances, where people are paying a significantly higher price, they deserve a service that matches the price tag. The question is then: how can data analytics help increase the efficiency of healthcare systems in the United States and around the world?



Performance Evaluation

Keeping a tab on hospital activities by maintaining relevant databases can help administrators find inefficiencies in service provision. Based on the results found from data analysis, specific actions can be taken to reduce the overall costs of a healthcare facility. Reduced costs may be reflected in the ...


Read More on Datafloq
How Big Data And Logistics are Working Together

How Big Data And Logistics are Working Together

Logistics companies have initiated many project prototypes to exploit big data analysis, and several amazing projects will soon be part of our everyday lives. This includes using Spark on Hadoop for real-time analysis to assess large data volumes stored in registry logs, databases, Excel files, or HDFS, which has completely changed the business dynamics. Here are some of the big data projects related to the logistics sector:

Volume Analysis

Logistics companies seeking to optimize budgets and resource allocation have always grappled with the problem of predicting parcel volume on a given day of the year, month, or week. They are currently investing in this area to find patterns that help predict peak volumes. It is an ideal use case, since data scientists can generate recommendations by running batch analysis.
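A first cut at such a batch analysis, with invented daily volumes, is a naive seasonal forecast: average the volumes seen on the same weekday in previous weeks. This is only a sketch of the pattern-finding idea, not a production forecasting method:

```python
# Hypothetical daily parcel volumes for the last two weeks (Mon..Sun, twice).
volumes = [1200, 1150, 1300, 1250, 1400, 900, 800,
           1250, 1200, 1350, 1300, 1450, 950, 850]

def forecast_same_weekday(history, days_ahead=1, week=7):
    """Naive seasonal forecast: average the volumes observed on the
    same weekday in all previous weeks."""
    idx = (len(history) + days_ahead - 1) % week
    same_day = history[idx::week]
    return sum(same_day) / len(same_day)

print(forecast_same_weekday(volumes))  # 1225.0 -- mean of the two matching Mondays
```

Real peak-volume models would also fold in holidays, promotions, and trend, but weekday seasonality is usually the strongest single signal in parcel data.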

Parcel Health Data

It is important for the transportation of medicines and other commodities in general to be done in a controlled environment. For instance, some medications should be stored between 2 and 8 degrees Celsius. Some types of equipment are fragile and require extra care while handling. It is quite costly for logistics companies and even the end-consumer to manage the whole process. This is why companies are ...


Read More on Datafloq
Try Cassandra with This JVM for a Flawless Experience

Try Cassandra with This JVM for a Flawless Experience

While working with big data, professionals often encounter questions like which database is best. There is no definitive answer, because a professional considers many things when deciding on the right technology to work with. The choice is sometimes a single piece of software, and most of the time a combination that has worked well previously.

This blog is centered on the use of Apache Cassandra with Zing. Cassandra is a NoSQL database management system (DBMS) with some unique features that can improve your routine data management experience. Zing, on the other hand, is a JVM designed to deliver high performance. Let's look at the ways in which this combination can accelerate the performance of your systems.

Known Issues in Cassandra

Cassandra and Zing complement each other; that is to say, using Cassandra backed by Zing takes care of the pitfalls that traditional Cassandra users have been encountering. There are two prominent issues that affect performance.


Memory Settings: Databases are meant for storing data, and in Cassandra there are a few issues, such as setting limits for memtables to avoid crowding out other important data. ...
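As a sketch of the kind of tuning involved, memtable limits live in cassandra.yaml. The values below are purely illustrative, not recommendations; the right numbers depend on your heap size and write workload:

```yaml
# cassandra.yaml (excerpt) -- illustrative values only, tune for your workload
memtable_allocation_type: heap_buffers   # where memtable data is held
memtable_heap_space_in_mb: 2048          # cap for on-heap memtable usage
memtable_offheap_space_in_mb: 2048       # cap for off-heap memtable usage
memtable_cleanup_threshold: 0.11         # flush when this fraction is in use
```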


Read More on Datafloq
Ways Hackers Steal Your Data (And How to Defend Yourself)

Ways Hackers Steal Your Data (And How to Defend Yourself)

For non-technophiles, online communication is as simple as clicking “send” in an email client. But in reality, the entire process includes a series of precise mechanisms that took decades to develop.

Suppose you are to send a photo of your last trip to Panama. Upon sending, the picture’s data gets broken down into “packets� that are typically no bigger than 1,500 bytes each. Once these packets reach the intended recipient, a computer reassembles them back into an image – ready to be viewed by humans.
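The split-and-reassemble idea can be shown in a few lines. The payload bytes and the 1,500-byte packet size are stand-ins; real network stacks also add headers, sequence numbers, and checksums to each packet:

```python
# Stand-in for the photo's bytes (a short marker plus 5,000 filler bytes).
payload = b"\x89PNG..." + bytes(5000)

MTU = 1500  # typical maximum packet payload, as described above
packets = [payload[i:i + MTU] for i in range(0, len(payload), MTU)]

# The receiver puts the packets back together in order.
reassembled = b"".join(packets)
assert reassembled == payload
print(len(packets))  # 4 packets for this 5,007-byte payload
```

The hard parts of real protocols, reordering, retransmission, and loss detection, exist precisely because packets do not always arrive intact and in order.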

Today, internet technology has become so efficient that—on an average internet connection—up to 5.1 megabytes of data can be transferred in a second. The only problem is that data in transit is susceptible to digital eavesdroppers, more popularly known as hackers.

How Hackers Steal Data

Hackers have many tricks up their sleeves. If their goal is corporate sabotage, they can leverage a network of infected computers, or ‘botnet’, to launch a Distributed Denial of Service (DDoS) attack. They can also infiltrate networks by injecting malware, such as ‘keyloggers’ that track everything a user types.

Luckily, there is a straightforward solution that can prevent these common cyber threats. For everyday internet users, a free tool like Malwarebytes should ...


Read More on Datafloq
The Role of Big Data in IoT

The Role of Big Data in IoT

IoT (the Internet of Things) refers to the automated intelligent control and command of connected devices over vast regions via sensors and other computing capabilities. At its core, IoT is a fairly simple concept to grasp. It's all about making our products smarter. IoT is on its path to becoming one of the biggest technological revolutions the world has ever seen. By 2020, the revenue generated by IoT technology is expected to be north of $300 billion, and this is just the tip of the iceberg. One of the most critical components of the IoT process is data. For connected devices to perform commands, data has to be sent to a centralized location — say, a gateway or the cloud — where it's processed and sent back to the sensors of these devices.

It is, therefore, imperative to have an efficient way of collecting small amounts of data, transmitting this data to the centralized location for processing, and sending the results back to the sensors — all in real time. Taking into account the enormous explosion in the number, types, and capabilities of these devices and sensors, the size of the data that needs processing can be extremely large ...


Read More on Datafloq
Today’s Challenge for the OpenStack Foundation: Move Beyond the Complexity Conundrum

Today’s Challenge for the OpenStack Foundation: Move Beyond the Complexity Conundrum

The OpenStack Foundation is addressing complexity with "composability." Will that be enough to bring disgruntled early adopters back into the fold?
Protection for Your Business Data: The Must-Have of 21st Century

Protection for Your Business Data: The Must-Have of 21st Century

Why should you protect your data? After all, it’s only now seen as the most valuable resource in the world; even more valuable than oil, in fact. What’s more, while it can be a challenge to steal enough oil, stealing enough data is a cinch. After all, Snowden managed to steal 20,000 files from the NSA using nothing more than a few thumb drives. That’s the NSA we’re talking about!

So why should you protect your data? Because if you don’t, then you might well end up with a similar fallout when a disgruntled employee or outside hacker decides to get at your files and do serious harm with them.

The question, of course, isn’t if you should protect your data. That goes without saying. It is how to protect your data.  That’s what the rest of this article is going to be devoted to.

Know what you need to protect

Step one is identifying the data that actually needs protection. Some things do. Some things don’t. Some things that absolutely need to be secured are things like:


Customer data. This is stuff like transaction accounts, private information like names and addresses, personal data of any kind and anything else that might be sensitive.
The ...


Read More on Datafloq
Is Big Data Facilitating a Designer Society?

Is Big Data Facilitating a Designer Society?

Personalisation seems to be one of the big trends of the next decade.

With virtual assistants coordinating our every move and our lives held in the palm of our hand, it is undeniable that we are fast becoming slaves to our technology. What is slightly less obvious (at the moment) is the potential for that technology to learn about how we live and create a uniquely personal experience for us, every waking minute of the day (and even maybe marshal our dreams).

We are a product of our experiences, but the moment we plug an analytical and predictive companion into our lives, it can learn about us in ways that only our subconscious could fathom. Tech will be able to provide insights into our behaviour that we could only guess at – we will be able to “optimise” our lives, and I am sure that all sorts of solutions will appear to take the strain.

The Big Data behind these insights will help to guide us like our own personal SatNav, but instead of telling us to “turn left at the junction” it will remind us to be patient when dealing with a certain person (because of a previous experience) or ...


Read More on Datafloq
10 éves a dmlab

10 éves a dmlab

I am grateful. Variations of this simple thought swirl in my head as I reflect that today, on May 10, 2017, we celebrate the tenth anniversary of the founding of dmlab. As I replay ten years of experiences in my mind, I feel that these have been a very good ten years. The overall picture is so positive that I almost have to dig through the hidden corners of my mind in disbelief to collect the memories of difficulties and failures. And when I take stock of those as well, I see this period as even more complete and good. I am grateful that I can look back on it this way.

I am grateful to those with whom we started all of this ten years ago. When I think of dmlab's beginnings, a short TED talk comes to mind that uses the analysis of a short, funny video to show how a movement gets started (link). It points out that when a new initiative is launched, the most important person is not the leader who starts it, but the first one or two companions who, by being the first to join, turn that person into a leader. For this I am grateful to István Nagy and Csaba Főző, then to Ivó, Prekó, Attila, and Peti, then to Gergő, Csabi, and Simon, and I could go on listing everyone who believed that something new and great could, and should, be created within dmlab.

I am grateful for the courage, and also for the blindness and recklessness, that characterized this team. We were brave when we set out on new, untrodden, risky paths, and at times we were reckless, not even able to gauge how big a task we had taken on; and sometimes it served us well that this left us no chance to get frightened and back away. I am grateful that this experimental spirit, this mindset that wants to override rules and ingrained reflexes, this creative energy, permeates the team to this day.

I am grateful that over these ten years we kept asking ourselves, time and again, what we want to achieve together and how. I am grateful to Törő, because he helped us find an honest and forward-looking vision, and helped us understand that the kinds of workplaces a company creates, and the collegial relationships in which we work together, are just as much its results and its products; and we became conscious of which causes, goals, and companies we serve and support.

I am grateful for the many projects, pilots, and trainings, and I am grateful for the startups that have spun off from dmlab and for their successes. We are proud of you.

I am grateful for the changes that have woven through these ten years. Even knowing that not every change was progress, and not every bit of progress meant growth at dmlab. But we stood our ground, found the opportunity in each new situation, and almost without exception managed to seize it. A few days ago, a strategic plan of dmlab's written nine years ago came into my hands. It was striking to look into it and see how little the essential things have changed in ten years, even while everything changed: the profession, the market, and how much we ourselves have grown.

Thank you.

“This was good fun, it was man's work!”


The Jedox Roadshow comes to Brisbane

The Jedox Roadshow comes to Brisbane

Jedox is one of the best unknown data platforms I know of. Calling them unknown may sound a little harsh, because they are a major product; however, I say unknown because I don’t see them being considered for purchase by a large number of companies when they are clearly a fit-for-purpose product – […]
The Basics of Deep Learning and How It Is Revolutionizing Technology

The Basics of Deep Learning and How It Is Revolutionizing Technology

Machine learning refers to a type of Artificial Intelligence (AI) that allows computers to learn beyond their initial static programming. These new programs are developed to analyze patterns in past data sets in order to adapt. More advanced computer programs are even capable of altering their code in response to prior exposure to an unfamiliar set of inputs, which opens a whole set of possibilities for the future of AI.

Some of the recent applications of machine learning include Google’s self-driving car and the algorithm behind the success of its web search function, companies providing online recommendation offers based on users’ browsing history, and fraud detection.

Deep learning is a branch of machine learning that focuses on the neural network model inspired by our understanding of the biology of the human brain. The human brain contains billions of neurons that are capable of sending signals and connecting to each other within a certain physical distance. Programmers incorporated that structure by creating artificial neural networks that have discrete layers, connections, and directions in which the data propagates.
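The layered structure described above can be sketched in plain Python. The weights, biases, and layer sizes below are made up; a real network learns these values during training rather than having them written by hand:

```python
import math

def relu(x):
    # Common activation: pass positives through, clamp negatives to zero.
    return max(0.0, x)

def dense(inputs, weights, biases, activation):
    """One fully connected layer: each output neuron takes a weighted
    sum of all inputs, adds a bias, and applies an activation function."""
    return [
        activation(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

# A toy 2-input -> 3-hidden -> 1-output network with made-up weights.
x = [0.5, -1.2]
hidden = dense(x, [[0.4, -0.6], [0.3, 0.8], [-0.5, 0.2]], [0.1, 0.0, 0.05], relu)
out = dense(hidden, [[1.0, -0.7, 0.3]], [0.0],
            lambda z: 1 / (1 + math.exp(-z)))  # sigmoid output
print(out)  # a single probability-like value between 0 and 1
```

Data propagating forward through discrete layers in this way is exactly the "connections and directions" structure the paragraph describes; deep networks simply stack many more such layers.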

How Does Deep Learning Work?

Deep learning enables computer programs to process a lot of input data simultaneously, and use it to make decisions based on the ...


Read More on Datafloq
How Secure is Your Cloud Computing?

How Secure is Your Cloud Computing?

For many businesses, cloud storage has become the new norm for storing and sharing files across departments. It’s highly convenient and gives employees access to data across multiple devices wherever they are. As more and more options become available, more and more of our personal and confidential business data gets stored on the cloud. But just how secure is that information?

After the iCloud celebrity photo breach, use of and faith in cloud storage declined, which also meant that the demand for cloud storage security went up. It is important to note that computers, cloud storage, and hybrid cloud storage are always susceptible and there is no perfect system, but that doesn’t mean you shouldn’t use one of the best and most convenient pieces of technology out there. Instead, be smart about how you and your employees use it, and follow these guidelines to ensure that your data is as secure as it can get on the cloud.

First and foremost, ditch the easy passwords. Nearly every website or account you use needs a username and password, and keeping track of those can be extremely frustrating and daunting, which is why many resort to duplicating passwords or relying on easy info like ...


Read More on Datafloq
Why Marketing Automation Won’t Work Without Data Quality Measures

Why Marketing Automation Won’t Work Without Data Quality Measures

Yesterday, actually a normal day for me, I again experienced why marketing automation won’t work without data quality measures. I attended a webinar by a French marketing automation company that had a really nice tool to track customers on the website and collect leads for further processing. They have put much effort into a rule-based engine to segment leads and a state-of-the-art backend that provides a nice working environment. I then asked: What happens if a person mistypes their e-mail address? What happens if a person fills out their name in lowercase? What happens if the person has a typo in the postal address? First there was no answer. But then the webinar leader said: Why would someone do that? The answer is easy: because we are human! A certain percentage of people are not 100% concentrated when filling out a form, maybe because they are using their smartphone, where a typo can happen very easily, or they simply don’t know their correct e-mail address (I have often seen Austrian or German e-mail addresses with @gmail.at or @gmail.de – but, as we all know, there is only gmail.com).
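The data quality measures being asked for can be as simple as a normalization pass over each submitted lead. This sketch uses a hypothetical form record and a small hand-made table of known typo domains, including the @gmail.at/@gmail.de case mentioned above:

```python
import re

# Hypothetical raw form submission with typical human errors.
form = {"name": "john smith", "email": "john.smith@gmail.at"}

KNOWN_TYPO_DOMAINS = {"gmail.at": "gmail.com", "gmail.de": "gmail.com"}
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately loose check

def clean_lead(record):
    """Normalize casing and flag/repair common e-mail domain typos."""
    issues = []
    name = record["name"].strip().title()       # "john smith" -> "John Smith"
    email = record["email"].strip().lower()
    if not EMAIL_RE.match(email):
        issues.append("invalid e-mail")
    domain = email.split("@")[-1]
    if domain in KNOWN_TYPO_DOMAINS:
        issues.append(f"suspect domain {domain}")
        email = email.replace(domain, KNOWN_TYPO_DOMAINS[domain])
    return {"name": name, "email": email, "issues": issues}

print(clean_lead(form))
```

Running a pass like this before leads enter the segmentation engine is what keeps "we are human" mistakes from silently breaking the automation downstream.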

So this sophisticated ...


Read More on Datafloq
Awesome Procurement —Survey shows how business networks fuel innovation and business transformation

Awesome Procurement —Survey shows how business networks fuel innovation and business transformation

The next BriefingsDirect digital business insights interview explores the successful habits, practices, and culture that define highly effective procurement organizations.

We'll uncover unique new research that identifies and measures how innovative companies have optimized their practices to overcome the many challenges facing business-to-business (B2B) commerce.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the traits and best practices of the most successful procurement organizations, please join Kay Ree Lee, Director of Business Analytics and Insights at SAP Ariba. The interview was recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas, and is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Procurement is more complex than ever, supply chains stretch around the globe, regulation is on the rise, and risk is heightened across many fronts. Despite these challenges, innovative companies have figured out how to overcome them, and you have uncovered some of their secrets through your Annual Benchmarking Survey. Tell us about your research and your findings.

Lee: Every year we conduct a large benchmark program benefiting our customers that combines a traditional survey with data from the procurement applications, as well as the business network.

This past year, more than 200 customers participated, covering more than $400 billion in spend. We analyzed the quantitative and qualitative responses of the survey and identified the intersection between those responses for top performers compared to average performers. This has allowed us to draw correlations between what top performers did well and the practices that drove those achievements.

Gardner: What’s changed from the past, what are you seeing as long-term trends?

Lee: There are three things that are quite different from when we last talked about this a year ago.

The number one trend that we see is that digital procurement is gaining momentum quickly. A lot of organizations are now offering self-service tools to their internal stakeholders. These self-service tools enable the user to evaluate and compare item specifications and purchase items in an electronic marketplace, which allows them to operate 24x7, around-the-clock. They are also utilizing digital networks to reach and collaborate with others on a larger scale.

The second trend that we see is that while risk management is generally acknowledged as important and critical, for the average company a large proportion of spend is not managed. Our benchmark data indicates that an average company manages 68% of its spend, which leaves 32% unmanaged. If this spend is not managed, the average company is probably not managing the associated risk either. So, what happens when something unexpected occurs in that non-managed spend?

The third trend that we see is related to compliance management. We see compliance management as a way for organizations to deliver savings to the bottom line. Capturing savings through sourcing and negotiation is a good start, but at the end of the day, eliminating loopholes through a focus on implementation and compliance management is how organizations deliver and realize negotiated savings.

Gardner: You have uncovered some essential secrets -- or the secret sauce -- behind procurement success in a digital economy. Please describe those.

Five elements driving procurement processes

Lee: From the data, we identified five key takeaways. First, we see that procurement organizations continue to expand their sphere of influence to greater depth and quality within their organizations. This is important because it shows that the procurement organization and the work that procurement professionals are involved in matters and is appreciated within the organization.

The second takeaway is that – while cost reduction savings is near and dear to the heart of most procurement professionals -- leading organizations are focused on capturing value beyond basic cost reduction. They are focused on capturing value in other areas and tracking that value better.

The third takeaway is that digital procurement is firing on all cylinders and is front and center in people's minds. This was reflected in the transactional data that we extracted.

The fourth takeaway is related to risk management. Leading organizations treat this as a key focus area, rather than just tracking news related to their suppliers.

The fifth takeaway is that compliance management -- closing the purchasing loopholes -- is what will help procurement deliver bottom-line savings.

Gardner: Next, what are some of the best practices that drive procurement organizations to have a strategic impact at their companies, culturally?

Lee: To have a strategic impact in the business, procurement needs to be proactive in engaging the business. They should have a mentality of helping the business solve business problems as opposed to asking stakeholders to follow a prescribed procurement process. Playing a strategic role is a key practice that drives impact.

They should also focus on broadening the value proposition of procurement. We see leading organizations placing emphasis on contributing to revenue growth, or increasing their involvement in product development, or co-innovation that contributes to a more efficient and effective process.

Another practice that drives strategic impact is the ability to utilize and adopt technology to your advantage through the use of digital networks, system controls to direct compliance, automation through workflow, et cetera.

These are examples of practices and focus areas that are becoming more important to organizations.

Using technology to track technology usage

Gardner: In many cases, we see the use of technology having a virtuous adoption cycle in procurement. So the more technology used, the better they become at it, and the more technology can be exploited, and so on. Where are we seeing that? How are leading organizations becoming highly technical to gain an advantage?

Lee: Companies that adopt new technology capabilities are able to elevate their performance and differentiate themselves through their capabilities. This is also just a start. Procurement organizations are pivoting towards advanced and futuristic concepts, and leaving behind the single-minded focus on cost reduction and cost efficiency.

Digital procurement utilizing electronic marketplaces, virtual catalogs, gaining visibility into the lifecycle of purchase transactions, predictive risk management, and utilizing large volumes of data to improve decision-making – these are key capabilities that benefit the bold and the future-minded. This enables the transformation of procurement, and forms new roles and requirements for the future procurement organization.

Gardner: We are also seeing more analytics become available as we have more data-driven and digital processes. Is there any indication from your research that procurement people are adopting data-scientist-ways of thinking? How are they using analysis more now that the data and analysis are available through the technology?

Lee: You are right. The users of procurement data want insights. We are working with a couple of organizations on co-innovation projects. These organizations actively research, analyze, and use their data to answer questions such as:

  • How does an organization validate that the prices they are paying are competitive in the marketplace?
  • After an organization conducts a sourcing event and implements the categories, how do they actually validate that the price paid is what was negotiated?
  • How do we categorize spend accurately, particularly if a majority of spend is services spend where the descriptions are non-standard?
  • Are we using the right contracts with the right pricing?

As you can imagine, when people enter transactions in a system, not all of it is contract-based or catalog-based. There is still a lot of free-form text. But if you extract all of that data, cleanse it, mine it, and make sense out of it, you can then make informed business decisions and create valuable insights. This goes back to the managing compliance practice we talked about earlier.
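To make this concrete, here is a minimal sketch of the kind of price-compliance audit described above: comparing the price paid on each transaction against the negotiated contract price and flagging off-contract or over-priced lines. This is not SAP Ariba's actual logic; the field names, tolerance, and sample data are all illustrative assumptions.

```python
# Hypothetical sketch of a price-compliance audit.

TOLERANCE = 0.02  # allow 2% deviation before flagging a line

# Negotiated unit prices per SKU (the "contract" side).
contracts = {"SKU-001": 10.00, "SKU-002": 4.50}

# Purchase transactions as they land in the system.
transactions = [
    {"sku": "SKU-001", "paid_price": 10.00, "qty": 100},
    {"sku": "SKU-001", "paid_price": 11.20, "qty": 50},
    {"sku": "SKU-003", "paid_price": 7.00, "qty": 10},
]

def audit(transactions, contracts, tolerance=TOLERANCE):
    """Split lines into off-contract (maverick) spend and overpaid lines."""
    off_contract, overpaid = [], []
    for line in transactions:
        negotiated = contracts.get(line["sku"])
        if negotiated is None:
            off_contract.append(line)      # no contract covers this purchase
        elif line["paid_price"] > negotiated * (1 + tolerance):
            overpaid.append(line)          # paid above the negotiated price
    return off_contract, overpaid

off, over = audit(transactions, contracts)
# Savings leakage: amount paid above contract on the overpaid lines.
leakage = sum((l["paid_price"] - contracts[l["sku"]]) * l["qty"] for l in over)
print(len(off), len(over), round(leakage, 2))
```

In a real deployment the `transactions` list would come from extracting and cleansing the free-form purchasing data mentioned above.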

They are also looking to answer questions like, how do we scale supplier risk management to manage all of our suppliers systematically, as opposed to just managing the top-tier suppliers?

These two organizations are taking data analysis further in terms of creating advantages that begin to imbue excellence into modern procurement and across all of their operations.

Gardner: Kay Ree, now that you have been tracking this Benchmark Survey for a few years, and looking at this year's results, what would you recommend that people do based on your findings?

Future focus: Cost-reduction savings and beyond

Lee: There are several recommendations that we have. One is that procurement should continue to expand their span of influence across the organization. There are different ways to do this but it starts with an understanding of the stakeholder requirements.

The second is about capturing value beyond cost-reduction savings. From a savings perspective, the recommendation we have is to continue to track sourcing savings -- because cost-reduction savings are important. But there are other measures of value to track beyond cost savings. That includes things like contribution to revenue, involvement in product development, et cetera.

The third recommendation relates to adopting digital procurement by embracing technology. For example, SAP Ariba has recently introduced some innovations. I think users gain a real advantage by going out, evaluating what is available, trying it, and then seeing what works for them and their organization.

As organizations expand their footprint globally, the fourth recommendation focuses on transaction efficiency. The way procurement can support organizations operating globally is by offering self-service technology so that they can do more with less. With self-service technology, no one in procurement needs to be there to help a user buy. The user goes on the procurement system and creates transactions while their counterparts in other parts of the world may be offline.

The fifth recommendation is related to risk management. When a lot of organizations say “risk management,” they are really only tracking news related to their suppliers. But risk management includes things like predictive analytics, predictive risk measures beyond your strategic suppliers, looking deeper into supply chains, and across all your vendors. If you can measure risk for your suppliers, why not make it systematic? We now have the ability to manage a larger volume of suppliers -- in fact, to manage all of them. The ones that bubble to the top, the riskiest ones, are the ones you create contingency plans for. That helps organizations really prepare to respond to disruptions in their business.
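As a rough illustration of making risk measurement systematic, the sketch below scores every supplier on a few weighted risk factors and surfaces the riskiest ones for contingency planning. The factor names, weights, and data are assumptions for illustration, not any vendor's actual model.

```python
# Illustrative supplier risk scoring. Each factor is a 0-1 score;
# higher means riskier.

WEIGHTS = {"financial": 0.4, "geographic": 0.3, "single_source": 0.3}

suppliers = [
    {"name": "Acme",    "financial": 0.2, "geographic": 0.1, "single_source": 0.0},
    {"name": "Globex",  "financial": 0.7, "geographic": 0.5, "single_source": 1.0},
    {"name": "Initech", "financial": 0.4, "geographic": 0.9, "single_source": 0.0},
]

def risk_score(supplier):
    """Weighted sum of the supplier's risk factors."""
    return sum(WEIGHTS[f] * supplier[f] for f in WEIGHTS)

def riskiest(suppliers, top_n=1):
    """The suppliers that 'bubble to the top' for contingency planning."""
    return sorted(suppliers, key=risk_score, reverse=True)[:top_n]

for s in riskiest(suppliers, top_n=2):
    print(s["name"], round(risk_score(s), 2))
```

The same scoring loop works over all suppliers, not just the strategic tier, which is the point of making the measurement systematic.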

The last recommendation is around compliance management, which includes internal and external compliance. So, internal adherence to procurement policies and procedures, and then also external following of governmental regulations. This helps the organization close all the loopholes and ensure that sourcing savings get to the bottom line.

Be a leader, not a laggard

Gardner: When we examine and benchmark companies through this data, we identify leaders, and perhaps laggards -- and there is a delta between them. In trying to encourage laggards to transform -- to be more digital, to take upon themselves these recommendations that you have -- how can we entice them? What do you get when you are a leader? What defines the business value that you can deliver when you are taking advantage of these technologies, following these best practices?

Lee: Leading organizations see higher cost reduction savings, process efficiency savings and better collaboration internally and externally. These benefits should speak for themselves and entice both the average and the laggards to strive for improvements and transformation.

From a numbers perspective, top performers achieve 9.7% savings as a percent of sourced spend. This translates to approximately $20M higher savings per $B in spend compared to the average organization.

We talked about compliance management earlier. A 5% increase in compliance increases realized savings by $4.4M per $1B in spend. These are real hard-dollar savings that top performers are able to achieve.
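The arithmetic behind these figures can be checked directly. Note that the 7.7% average savings rate below is inferred from the quoted ~$20M-per-$1B delta, not stated in the interview.

```python
# Worked arithmetic behind the quoted figures (rates in basis points
# to keep the math exact in integers).
spend = 1_000_000_000            # $1B in sourced spend
top_bps = 970                    # 9.7% savings for top performers
avg_bps = 770                    # implied average rate (assumption, from the ~$20M delta)

delta = (top_bps - avg_bps) * spend // 10_000
print(delta)                     # $20,000,000 higher savings per $1B of spend

# A 5% compliance increase yields $4.4M per $1B, i.e. per percentage point:
per_point = 4_400_000 / 5
print(per_point)                 # $880,000 per point of compliance
```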

In addition, top performers are able to attract a talent pool that will help the procurement organization perform even better. If you look at some of the procurement research, industry analysts and leaders are predicting that there may be a talent shortage in procurement. But, as a top performer, if you go out and recruit, it is easier to entice talent to the organization. People want to do cool things and they want to use new technology in their roles.

Gardner: Wrapping up, we are seeing some new and compelling technologies here at Ariba LIVE 2017 -- more use of artificial intelligence (AI), and increased use of predictive tools brought into context so that they can be of value to procurement during the life-cycle of a process.

As we think about the future, and more of these technologies become available, what is it that companies should be doing now to put themselves in the best position to take advantage of all of that?

Curious org

Lee: It's important to be curious about the technology available in the market and perhaps structure the organization in such a way that there is a team of people on the procurement team who are continuously evaluating the different procurement technologies from different vendors out there. Then they can make decisions on what best fits their organization.

It takes people who can look ahead, evaluate, talk about the requirements, understand the architecture, and assess what's out there and what would make sense for them in the future. This is a complex role. He or she has to understand the current architecture of the business, the requirements from the stakeholders, and then evaluate what technology is available. They must then determine if it will assist the organization in the future, and if adopting these solutions provides a return on investment and ongoing payback.

So I think being curious, understanding the business really well, and then wearing a technology hat to understand what's out there are key. You can then be helpful to the organization and envision how adopting these newer technologies will play out.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

What Innovations Will Big Data Bring Us by 2020?

What Innovations Will Big Data Bring Us by 2020?

As we move more business, communication, and entertainment online, large amounts of data are generated exponentially with each passing year. There is so much data that it's becoming tough to keep track of it, let alone analyze or organize it.

Although the origins of big data are somewhat murky, its impacts are becoming crystal clear. More devices are connected to the internet than ever before, and this information is a potential goldmine for educational, commercial, and humanitarian efforts. The need to organize and analyze the massive chunks of data has become pressing, and more companies handling big data have emerged to meet these challenges. By utilizing advanced database management technology, companies, medical organizations, universities, and governments can now harvest information and improve efficiency.

Other than providing a chance to make significant economic impacts, big data manipulation will create innovations that will revolutionize our lives completely by 2020. Here is how:

Apps and sites will be more functional and safer

With big data, it will be easy to identify and track any fraudulent behavior so as to improve website security. Big data can bring new visibility into everything that is going on within a firm's network and assist in predicting upcoming attacks. Experts think that big data ...


Read More on Datafloq
Snowflake and Spark, Part 2: Pushing Spark Query Processing to Snowflake

Snowflake and Spark, Part 2: Pushing Spark Query Processing to Snowflake

This post provides the details of Snowflake’s ability to push query processing down from Spark into Snowflake.
10 Tips to Troubleshoot Security Concerns in IoT Wearables

10 Tips to Troubleshoot Security Concerns in IoT Wearables

From smartwatches to glasses and finger rings -- the range of wearable devices is steadily expanding. Combined with the might of the Internet of Things, wearable devices have a life-transforming effect. While it is fancy and often productive to carry these wearables around, the billion-dollar question is, “How safe are these wearables?”

What if your favourite wearable is just another door that hackers and cybercriminals can break into to make away with the control of your personal and professional life?

Security concerns in wearables are legitimate. A study by Auth0 has confirmed that more than 52% of wearable device users feel that they are provided with inadequate security measures.

VTech, a popular brand that sells wearables for kids, suffered a security breach that resulted in the leak of private information of more than 200,000 children and their parents.

There is no better time than right now to sit back and analyze these security concerns and the ways to negate them.

Security Concerns in IoT Wearables

Wearables are now being used for purposes that go far beyond calorie counting and fitness tracking. They are now even part of BYOD enterprise work philosophy and are used by remote employees to constantly collaborate and communicate with their peers. Though considered futuristic, the IoT ...


Read More on Datafloq
8 Ways IoT Can Improve Healthcare

8 Ways IoT Can Improve Healthcare

Over the past few decades, we’ve gotten used to the Internet and cannot imagine our lives without it. But now the Internet of Things (IoT) is changing the way we interact with the objects around us. The Internet of Things is a real-time connection and communication among all sorts of objects, gadgets, wearables, and devices. Essentially, it represents interoperability among all the things around us (excluding computers and phones). Needless to say, IoT is changing entire industries, as it reduces costs, boosts productivity, and improves quality. One of the areas where IoT is contributing the most is medicine. In this article, we will look at 8 ways IoT is improving the healthcare industry.

How is IoT Changing Healthcare?

With its advanced technologies, IoT gives a significant boost to healthcare development. Some forecasts even estimate that the market for IoT in healthcare will climb to $117 billion by 2020. How is that possible? Let’s discuss some of the key points!



Data management

IoT provides countless possibilities for hospitals to gather relevant information about their patients, both on-site and outside of the medical premises. Healthcare relies on telemetry here to capture data and communicate it automatically and remotely. This offers medical staff a chance to act promptly and provide patients with better ...


Read More on Datafloq
How Thick Data Can Unleash the True Power of Big Data

How Thick Data Can Unleash the True Power of Big Data

Not all data that is measurable is qualitative. While Big Data helps us find answers to well-defined questions, Thick Data connects the dots and gives us a more realistic picture.

For quite a few years we have been hearing that Big Data and Analytics are the next big waves. While these waves are already sweeping over us, we are missing the small things by going for the big. Big Data has emerged to be remarkably useful when it comes to finding answers to well-defined questions and addressing phenomena that are well understood. What it fails to recognize is the complexity of people’s lives, human connections, underlying emotions, changing cultural ecosystems, interesting stories, and other social ingredients.

For instance, it made big news when Nokia was acquired by Microsoft in 2013. While there could be many reasons behind Nokia’s downfall, one of the prominent reasons that Tricia Wang, a global tech ethnographer, describes is the overdependence on numbers. Sharing her story on Ethnography Matters, she mentioned how her recommendation that Nokia revise its product development strategy did not receive enough attention because the sample size used for her study was considered too small in comparison to millions of ...


Read More on Datafloq
Why Educational Systems Should Consider Big Data

Why Educational Systems Should Consider Big Data

Advances in technology have enabled good decision-making in most educational institutions through the increased use of big data. Policy makers in these institutions have been using big data to understand sentiment about the school and make systematic improvements to students' performance. The term big data refers to a large amount of information flowing through various channels that can only be analyzed by computers. Learning institutions generate an immense amount of student information that would be hard to capture and manage through conventional means. Therefore, big data comes in handy in improving the processing of data and increasing the storage capacity of institutional data.

Understanding Big Data

Students' experiences such as eating, social life, study, and sleep have a strong effect on their academic performance. Negative or traumatic experiences have a direct impact on students' retention. Therefore, most institutions now use big data to look into the various aspects affecting a student's performance. Academic institutions collect large quantities of data, but the problem lies in the analysis, making it harder for analysts to make data-based decisions and improve organizational effectiveness.

Why Big Data and Not Small Data

Academic institutions collect data for many ...


Read More on Datafloq
Augmenting the Brain is Set to Pioneer Alzheimer’s Treatment

Augmenting the Brain is Set to Pioneer Alzheimer’s Treatment

As artificial intelligence becomes more human, to co-exist, does human intelligence need to become more artificial?

We’ve spent a lot of time philosophizing about where Artificial Intelligence is going to take us, how close we are to achieving general AI, and the implications it will have on humanity -- not without the Skynet scenarios! Hype aside, there are companies out there focusing on how we can use artificially intelligent applications to improve the human experience, sustain life on our planet, and significantly boost the economy. This pioneering technology could well see the next world-changing scientific discovery hailing from Silicon Valley, especially considering the significant increase in investment over the past few years.

According to the Alzheimer’s Association, there are more than 5 million Americans living with Alzheimer’s today, with a predicted 16 million by 2050, and a further 850,000 people with dementia in the UK. The degenerative disease is currently the 6th leading cause of death in the US and has also been linked to poor health among caregivers, owing to the care responsibilities associated with the disease, compared with caregivers of elderly people without dementia. Having a neuroprosthetic could be the missing key to an improved quality of ...


Read More on Datafloq
How Big Data Can Reduce Building Injuries

How Big Data Can Reduce Building Injuries

Accidents are a part of life, but when we carefully analyze the data on unintended incidents and injuries, we often find that many of them could have been avoided through greater care and harm reduction strategies. Unfortunately, because many companies and individuals view each injury in a bubble, they miss the significance of certain occurrences and can’t effectively reform their behaviors. Only the big picture view can help – that’s where big data comes in.

When we use big data to analyze workplace injuries and individual accidents, we move from an individualized view of personal injury to a systemic one – and that’s how we can reduce injuries. But what does this look like in practice? By turning to risk management ecosystems, we can see what the future of safety looks like.

Analyzing Workplace Safety

Workplace safety is a significant national concern – it’s why organizations like OSHA exist – but just because there’s already oversight in the workplace doesn’t mean companies are maximizing their injury prevention strategies. Rather, many do the minimum required by OSHA and leave the rest to chance.

Some workplaces, however, are taking safety seriously by instituting company-wide injury analytics. These systems let all branches of a business, no matter ...


Read More on Datafloq
Are You Wasting Your Data or Consuming It?

Are You Wasting Your Data or Consuming It?

Last night I was in the checkout line at the grocery store. There was a woman behind me with a cart full of produce who told me, “I’ll probably end up throwing most of this away.” I asked her why. She said she knows she should eat healthy, but it takes too much time and effort to whip up a meal from scratch, and anyway, she wasn’t that great of a cook. Despite her best intentions, she ends up ordering in for the family most nights.

Unfortunate fact – over 40% of food produced is wasted, depleting resources like fresh water, electricity and human effort.

Wouldn’t it be great if raw ingredients could magically convert themselves into dishes for the family – no time, effort, or cooking skills needed? The family could be eating healthier meals, they’d be eating the food they spent money and time to procure, and it wouldn’t end up wasted in the trash anymore.

Companies are wasting data

Many companies are wasting data just like many people are wasting food.

Companies recognize how important data is. They know their workforce is hungry for data-driven solutions to their problems. They need it to thrive in the current landscape.

So they invest significantly in ...


Read More on Datafloq
What is the Best Programming Language for Machine Learning?

What is the Best Programming Language for Machine Learning?

Q&A sites and data science forums are buzzing with the same questions over and over again: I’m new to data science, what language should I learn? What’s the best language for machine learning?

There’s an abundance of articles attempting to answer these questions, either based on personal experience or on job-offer data. There’s much more activity in machine learning than job offers in the West can describe, however, and peer opinions, while valuable, are often conflicting and can confuse novices. We turned instead to our hard data from 2,000+ data scientists and machine learning developers who responded to our latest survey about which languages they use and what projects they’re working on – along with many other interesting things about their machine learning activities and training. Then, being data scientists ourselves, we couldn’t help but run a few models to see which are the most important factors correlated with language selection. We compared the top five languages, and the results prove that there is no simple answer to the “which language?” question. It depends on what you’re trying to build, what your background is, and why you got involved in machine learning ...


Read More on Datafloq
Bring Your Own Cyber Human (BYOCH) – Part 1: Self-connected Humans

Bring Your Own Cyber Human (BYOCH) – Part 1: Self-connected Humans

Perhaps some of my readers and followers played the “Rock, Paper, Scissors” game in their childhood. During each match, we simulated one of these three things with our hands, although in those years we could never have imagined that any of them could connect to the Internet.

A few years later, we are not surprised that someone somewhere in the world is designing connected rocks, connected papers, or connected scissors. Just read “The abuse of shocking headlines in IoT or how many stupid things will be connected?”. Thus we have arrived at the Internet of Things (IoT).

But far from settling for just connecting things, some visionaries like Elon Musk do not dream of electric sheep; they dream of building human-computer hybrids. The goal of Elon Musk’s Neuralink company is to explore technology that can make direct connections between a human brain and a computer. Mr. Musk floated the idea that humans will need a boost from computer-assisted artificial intelligence to remain competitive as our machines get smarter.

A Facebook engineer asserts that augmented reality will replace smartphones in five years. Facebook’s uber-secretive Building 8 (B8) division is currently working on a top-secret brain-computer interface (BCI) like Elon Musk’s Neuralink, but that BCI project ...


Read More on Datafloq
How to Access User Data from Third-party Apps on Android

How to Access User Data from Third-party Apps on Android

If you’re developing a DLP or parental control solution, chances are you want as much user data from apps as possible. However, most of the time, gathering such data can be fairly difficult. While you can try to reverse engineer an iOS or Android app, this method often proves difficult and time-consuming, and results are not guaranteed.

On Android in particular, apps usually store their data in a sandbox (the default and most secure option) where other apps cannot access it. If a developer decides not to store data in an easily accessible area (such as a memory card), and not to provide an API for accessing the data, then getting it without root can be very hard.

This means that there is seemingly no way to get Skype, Viber, or KIK messages, or browser history, which can be extremely frustrating if your solution depends on such data. However, there is actually a fairly elegant solution that allows you to get user data on Android without root and without much hassle. We will cover this solution below.

Idea behind a solution

The gist of the idea is very simple – each active page has a layout ...


Read More on Datafloq
The Advantages And Disadvantages of Using Django

The Advantages And Disadvantages of Using Django

If you are interested in running Django or considering making a transition to Python, let us help you explore the main virtues and vices of using this framework. But before we get started, let’s talk briefly about what Django is and why you should care.

Django came out in 2005 and has, indisputably, turned into one of the go-to web frameworks for a growing number of developers. It is built on the Python programming language. With the right set of functionality, Django reduces the amount of trivial code, which simplifies the creation of web applications and results in faster development.

In case you want to dive deeper into the framework, view a short introduction to django full text search.

Why Django?

You should definitely check out Django. It is written in Python, and Python is amazing, clean, easy to learn, and one of the most-taught programming languages. Python is also a popular choice for:



  • Industrial Light & Magic (Star Wars visual effects)
  • Game development
  • Services like Pinterest, Instagram, The Guardian and more



Without a doubt, the tech market is overflowing with frameworks, but Django is a good place to start, as it has some of the nicest documentation and tutorials in software development. Now, for the main attraction – ...


Read More on Datafloq
What Effect Will Deep Learning Have on Business?

What Effect Will Deep Learning Have on Business?

One thing that could have a deep impact on business is deep learning. Deep learning can be thought of as a subfield of machine learning. Specifically, this form of machine learning was inspired by the study of the human brain. The algorithms involved are designed to mimic how the human brain operates, allowing a machine to learn in the same way. This is done through a system known as an artificial neural network.
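As a toy illustration of the idea -- far simpler than any production deep learning system -- the sketch below trains a single artificial neuron by gradient descent to learn a trivial rule from data. The data, learning rate, and epoch count are made up for illustration.

```python
# Toy "artificial neural network": one neuron trained by gradient
# descent to learn the rule "output = first input".
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training examples: (inputs, target output).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 1), ([1, 1], 1)]

weights, bias, lr = [0.0, 0.0], 0.0, 0.5

for _ in range(2000):
    for inputs, target in data:
        out = sigmoid(sum(w * x for w, x in zip(weights, inputs)) + bias)
        grad = (out - target) * out * (1 - out)  # d(squared error)/d(pre-activation)
        for i, x in enumerate(inputs):
            weights[i] -= lr * grad * x
        bias -= lr * grad

# After training, the neuron should classify every example correctly.
for inputs, target in data:
    out = sigmoid(sum(w * x for w, x in zip(weights, inputs)) + bias)
    print(inputs, target, round(out))
```

Real deep learning stacks many layers of such neurons, but the learn-by-adjusting-weights loop is the same idea.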

The benefits of deep learning for businesses are obvious. It allows a computer system with access to a lot of data to make its own autonomous decisions about the data through this learning process. It can produce better decisions and improve efficiency. There are many applications that can help improve a business’s operations and profit potential. Below are some of the possibilities.

Deep Learning Can Increase Sales

One of the best things deep learning can provide for a company obviously is helping it increase its bottom line. Deep learning, for example, can be deployed for the purpose of lead generation. Deep learning is a form of artificial intelligence. That AI can sift through all the data and then use it to present you with leads at that right ...


Read More on Datafloq
Why AI is the Catalyst of IoT

Why AI is the Catalyst of IoT

Businesses across the world are rapidly leveraging the Internet-of-Things (#IoT) to create new products and services that are opening up new business opportunities and creating new business models. The resulting transformation is ushering in a new era of how companies run their operations and engage with customers. However, tapping into the IoT is only part of the story [6].

For companies to realize the full potential of IoT enablement, they need to combine IoT with rapidly-advancing Artificial Intelligence (#AI) technologies, which enable ‘smart machines’ to simulate intelligent behavior and make well-informed decisions with little or no human intervention [6].

Artificial Intelligence (AI) and the Internet of Things (IoT) are terms that project futuristic, sci-fi imagery; both have been identified as drivers of business disruption in 2017. But what do these terms really mean, and what is their relation? Let’s start by defining both terms:

IoT is defined as a system of interrelated Physical Objects, Sensors, Actuators, Virtual Objects, People, Services, Platforms, and Networks [3] that have separate identifiers and an ability to transfer data independently. Practical examples of #IoT application today include precision agriculture, remote patient monitoring, and driverless cars. Simply put, IoT is the network of “things” that collects and exchanges ...


Read More on Datafloq
Experts define new ways to manage supply chain risk in a digital economy

Experts define new ways to manage supply chain risk in a digital economy

The next BriefingsDirect digital business thought leadership panel discussion explores new ways that companies can gain improved visibility, analytics, and predictive responses to better manage supply chain risk in the digital economy.

The panel examines how companies such as Nielsen are using cognitive computing search engines, and even machine learning and artificial intelligence (AI), to reduce risk in their overall buying and acquisitions.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the exploding sophistication around gaining insights into advanced business commerce, we welcome James Edward Johnson, Director of Supply Chain Risk Management and Analysis at Nielsen; Dan Adamson, Founder and CEO of OutsideIQ in Toronto, and Padmini Ranganathan, Vice President of Products and Innovation at SAP Ariba.

The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Padmini, we heard at SAP Ariba LIVE that risk is opportunity. That stuck with me. Are the technologies really now sufficient that we can fully examine risks to such a degree that we can turn that into a significant business competitive advantage? That is to say, those who take on risk seriously, can they really have a big jump over their competitors?

Ranganathan: I come from Silicon Valley, so we have to take risks for startups to grow into big businesses, and we have seen a lot of successful entrepreneurs do that. Clearly, taking risks drives bigger opportunity.

But in this world of supplier and supply chain risk management, it’s even more important and imperative that the buyer and supplier relationships are risk-aware and risk-free. The more transparent that relationship becomes, the more opportunity for driving more business between those relationships.

That context of growing business -- as well as growing the trust and the transparent relationships -- in a supply chain is better managed by understanding the supplier base. Understanding the risks in the supplier base, and then converting them into opportunities, allows mitigating and solving problems jointly. By collaborating together, they form partnerships.

Gardner: Dan, it seems that what was once acceptable risk can now be significantly reduced. How do people in procurement and supply chain management know what acceptable risk is -- or maybe they shouldn’t accept any risk?

Adamson: My roots are also in Silicon Valley, and I think you are absolutely right that at times you should be taking risks -- but not unnecessarily. What the procurement side has struggled with -- and this comes from my having jumped from financial institutions, where they treat risk very differently, into procurement -- is risk versus the price point of avoiding that risk. That’s traditionally been the big problem.

If, for every vendor that you on-board, you have to pay $1,000 for a due-diligence report, it's really not cost-effective. But if you can maintain and monitor that vendor on a regular basis at an acceptable cost, then there's a real risk-versus-reward benefit in there.

What we are helping to drive are a new set of technology solutions that enable a deeper level of due diligence through technology, through cognitive computing, that wasn't previously possible at the price point that makes it cost-effective. Now it is possible to clamp down and avoid risk where necessary.

Gardner: James, as a consumer of some of these technologies, do you really feel that there has been a significant change in that value equation, that for less money output you are getting a lot less risk?

Knowing what you're up against  

Johnson: To some degree that value was always there; it was just difficult to help people see that value. Obviously tools like this will help us see that value more readily.

It used to be that in order to show the value, you actually had to do a lot of work, and it was challenging. What we are talking about here is that we can begin to boil the ocean. You can test these products, and you can do a lot of work just looking at test results.

And, it's a lot easier to see the value because you will unearth things that you couldn't have seen in the past.

Whereas it used to take a full-blown implementation to begin to grasp those risks, you can now just test your data and see what you find. Most people, once they have their eyes wide open, will be at least a little more fearful.  But, at the same time -- and this goes back to the opportunity question you asked -- they will see the opportunity to actually tackle these risks. It’s not like those risks didn't exist in the past, but now they know they are there -- and they can decide to do something about it, or not.

Gardner: So rather than avoid the entire process, now you can go at the process but with more granular tools to assess your risks and then manage them properly?

Johnson: That's right. I wouldn't say that we should have a risk-free environment; that would cost more money than we’re willing to pay. That said, we should be more conscious of what we're not yet willing to pay for.

Rather than just leaving the risk out there and avoiding business where you can’t access information about what you don't know -- now you'll know something. It's your choice to decide whether or not you want to go down the route of eliminating that risk, of living with that risk, or maybe something in between. That's where the sweet spot is. There are probably a lot of intermediate actions that people would be taking now that are very cheap, but they haven't even thought to do so, because they haven’t assessed where the risk is.

Gardner: Padmini, because we're looking at a complex landscape -- a supply chain, a global supply chain, with many tiers -- when we have a risk solution, it seems that it's a team sport. It requires an ecosystem approach. What has SAP Ariba done, and what is the news at SAP Ariba LIVE? Why is it important to be a team player when it comes to a fuller risk reduction opportunity?

Teamwork

Ranganathan: You said it right. The risk domain world is large, and it is specialized. The language that the compliance people use in the risk world is somewhat similar to the language that the lawyers use, but very different from the language that the information technology (IT) security and information security risk teams use.

The reason you can’t see many of the risks is partly because the data, the information, and the fragmentation have been too broad, too wide. It’s also because the type of risks, and the people who deal with these risks, are also scattered across the organization.

So a platform that supports bringing all of this together is number one. Second, the platform must support the end-to-end process of managing those supply chain relationships, managing the full supply chain, and gaining transparency across it. That’s where SAP Ariba is headed with Direct Materials Sourcing and with getting more into supply chain collaboration. That’s what you heard at SAP Ariba LIVE.

We all understand that supply chain much better when we are in SAP Ariba, and then you have this ecosystem of partners and providers. You have the technology with SAP and HANA to gain the ability to mash up big data and set it in context, and to understand the patterns. We also have the open ecosystem and the open source platform to allow us to take that even wider. And last but not least, there is the business network.

So it’s not just between one company and another company, it's a network of companies operating together. The momentum of that collaboration allows users to say, “Okay, I am going to push for finding ethical companies to do business with,” -- and then that's really where the power of the network multiplies.

Gardner: Dan, when a company nowadays buys something in a global supply chain, they are not just buying a product -- they are buying everything that's gone on with that product, such as the legacy of that product, from cradle to PO. What is it that OutsideIQ brings to the table that helps them get a better handle on what that legacy really is?

Dig deep, reduce risk, save time

Adamson: Yes, and they are not just buying from that seller, they are buying from the seller that sold it to that seller, and so they are buying a lot of history there -- and there is a lot of potential risk behind the scenes.

That’s why this previously has been a manual process, because there has been a lot of contextual work in pulling out those needles from the haystack. It required a human level of digging into context to get to those needles.

The exciting thing that we bring is a cognitive computing platform that’s trainable -- and it's been trained by FinCrime’s experts and corporate compliance experts. Increasingly, supply management experts help us know what to look for. The platform has the capability to learn about its subject, so it can go deeper. It can actually pivot on where it's searching. If it finds a presence in Afghanistan, for example, well then that's a potential risk in itself, but it can then go dig deeper on that.

And that level of deeper digging is something that a human really had to do before. This is the exciting revolution that's occurring. Now we can bring back that data, it can be unstructured, it can be structured, yet we can piece it together and provide some structure that is then returned to SAP Ariba.

The great thing about the supply management risk platform or toolkit that's being launched at SAP Ariba LIVE is that there’s another level of context on top of that. Ariba understands the relationship between the supplier and the buyer, and that's an important context to apply as well.

How you determine risk scores on top of all of that is very critical. You need to weed out all of the noise, otherwise it would be a huge data science exercise and everyone would be spinning his or her wheels.
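The scoring idea Adamson describes -- combine many signals but weed out the noise -- can be sketched as a weighted aggregation with a confidence floor. The signal names, weights, and threshold below are illustrative assumptions for the sketch, not OutsideIQ's or SAP Ariba's actual model:

```python
# Illustrative supplier risk score: weighted signals with a noise floor.
# Signal names and weights are hypothetical, chosen only for the example.
SIGNAL_WEIGHTS = {
    "watchlist_hit": 0.40,
    "adverse_media": 0.25,
    "high_risk_jurisdiction": 0.20,
    "opaque_ownership": 0.15,
}
NOISE_FLOOR = 0.05  # discard very weak signals instead of flagging noise

def risk_score(signals: dict[str, float]) -> float:
    """Combine per-signal confidences (0-1) into one 0-1 risk score."""
    total = 0.0
    for name, weight in SIGNAL_WEIGHTS.items():
        confidence = signals.get(name, 0.0)
        if confidence >= NOISE_FLOOR:  # weed out low-confidence matches
            total += weight * confidence
    return round(total, 3)

# A vendor with a strong watch-list hit and a weaker adverse-media signal:
score = risk_score({"watchlist_hit": 0.9, "adverse_media": 0.4})
```

The point of the floor is exactly the one made above: without it, every faint match becomes a data-science exercise for a human to chase down.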

This is now a huge opportunity for clients like James to truly get some low-hanging fruit value, where previously it would have been literally a witch-hunt or a huge mining expedition. We are now able to achieve this higher level of value.

Gardner: James, Dan just described what others are calling investigative cognitive computing brought to bear on this supply chain risk problem. As someone who is in the business of trying to get the best tools for their organization, where do you come down on this? How important is this to you?

Johnson: It's very important. I have done the kinds of investigations that he is talking about. For example, if I am looking at a vendor in a high-risk country -- particularly a small vendor that doesn't have an international presence -- that is problematic for most supplier investigations. What do I do? I will go and do some of the investigation that Dan is talking about.

Now I'm usually sitting at my desk in Chicago. I'm not going out in the world. So there is a heightened level of due diligence that I suspect neither of us is really talking about here. With that limitation, you want to look up not only the people, but all their connections. You might have had a due-diligence form completed, but that's an interested party giving you information. What do you do with it?

Well, I can run the risk search on more than just the entity that I'm transacting with.  I am going to run it on everyone that Dan mentioned. Then I am going to look up all their LinkedIn profiles, see who they are connected to. Do any of those people show any red flags? I’d look at the bank that they use. Are there any red flags with their bank?

I can do all that work, and I can spend several hours doing all that work. As a lawyer I might dig a little deeper than someone else, but in the end, it's human labor going into the effort.

Gardner: And that really doesn't scale very well.

Johnson: That does not scale at all. I am not going to hire a team of lawyers for every supplier. The reality here is that now I can do some level of that time-consuming work with every supplier by using the kind of technology that Dan is talking about.

The promise of OutsideIQ technology is incredible. It is an early and quickly expanding opportunity. It's because of relationships like the one between SAP Ariba and OutsideIQ that I see a huge opportunity between Nielsen and SAP Ariba. We are both on the same roadmap.

Nielsen has a lot of work to do, SAP Ariba has a lot of work to do, and that work will never end, and that’s okay. We just need to be comfortable with it, and work together to build a better world.

Gardner: Tell us about Nielsen. Then secondarily, what part of your procurement, your supply chain, do you think this will impact best first?

Automatic, systematic risk management

Johnson: Nielsen is a market research company. We answer two questions: what do people watch? And what do people buy? It sounds very simple, but when you cover 90% of the world’s population, which we do -- more than six billion people -- you can imagine that it gets a little bit more complicated.

We house about 54 petabytes of data. So the scale there is huge. We have 43,000 employees. It’s not a small company. You might know Nielsen for the set-top boxes in the US that tell what the overnight ratings were for the Super Bowl, for example, but it’s a lot more than that. And you can imagine, especially when you're trying to answer what people buy in developing countries with emerging economies, you are touching some riskier things.

In terms of what this SAP Ariba collaboration can solve for us, the first quick hit is that we will no longer have to leverage multiple separate sources of information. I can now leverage all the sources of information at one time through one interface. It is already being used to deliver information to people who are involved in the procurement chain. That's the huge quick win.

The secondary win is from the efficiency that we get in doing that first layer of risk management. Now we can start to address that middle tier that I mentioned. We can respond to certain kinds of risk that, today, we are doing ad-hoc, but not systematically. There is that systematic change that will allow us to not only target the 100 to 200 vendors that we might prioritize -- but the thousands of vendors that are somewhere in our system, too.

That's going to revolutionize things, especially once you fold in the environmental, social and governance (ESG) work that, today, is very focused for us. If I can spread that out to the whole supply chain, that's revolutionary. There are a lot of low-cost things that you can do if you just have the information.

So it’s not always a question of, “am I going to do good in the world and how much is it going to cost me?” It’s really a question of, “What is the good in the world that’s freely available to me, that I'm not even touching?” That's amazing! And, that's the kind of thing that you can go to work for, and be happy about your work, and not just do what you need to do to get a paycheck.

Gardner: It’s not just avoiding the bad things; it’s the false positives that you want to remove so that you can get the full benefit of a diverse, rich supplier network to choose from.

Johnson: Right, and today we are essentially wasting a lot of time on suspected positives that turn out to be false. We waste time on them because we go deeper with a human than we need to. Let’s let the machines go as deep as they can, and then let the humans come in to take over where we make a difference.

Gardner: Padmini, it’s interesting to me that he is now talking about making this methodological approach standardized, part of due-diligence that's not ad-hoc, it’s not exception management. As companies make this a standard part of their supply chain evaluations, how can we make this even richer and easier to use?

Ranganathan: The first step was the data. It’s the plumbing; we have to get that right. It’s about the way you look at your master data, which is suppliers; the way you look at what you are buying, which is categories of spend; and where you are buying from, which is all the regions. So you already have the metrics segmentation of that master data, and everything else that you can do with SAP Ariba.

The next step is then the process, because it’s really not a one-size-fits-all. It cannot be a one-size-fits-all, where every supplier that you on-board you are going to ask them the same set of questions, check the box and move on.

I am going to use the print-service vendor example again, which is my favorite. For marketing materials printing, you have a certain level of risk, and that's all you need to look at. But you still want, of course, to look at them for any adverse media incidents, or whether they suddenly got on a watch-list for something -- you do want to know that.

But when one of your business units begins to use them for customer-confidential data and statement printing -- the level of risk shoots up. So the intensity of risk assessments and the risk audits and things that you would do with that vendor for that level of risk then has to be engineered and geared to that type of risk.

So it cannot be one-size-fits-all; it has to go past the standard. The standardization is not in the process; the standardization is in the way you look at risk, so that you can determine how much of the process you need to apply and stay in tune.
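Ranganathan's print-vendor example -- the same supplier warrants deeper scrutiny once it handles confidential data -- can be expressed as a simple tiering rule. The tier names, categories, and triggers here are illustrative assumptions, not SAP Ariba's actual configuration:

```python
# Hypothetical sketch of risk-tiered due diligence: the same supplier gets
# a deeper assessment when the engagement itself becomes riskier.
def assessment_level(category: str, handles_confidential_data: bool) -> str:
    """Map an engagement to an illustrative due-diligence tier."""
    if handles_confidential_data:
        return "enhanced"      # security audit, on-site review, monitoring
    if category in {"marketing_print", "office_supplies"}:
        return "standard"      # watch-list and adverse-media screen only
    return "intermediate"      # standard screen plus financial-health check

# The print-vendor example: marketing materials vs. statement printing.
print(assessment_level("marketing_print", False))  # lower-risk engagement
print(assessment_level("marketing_print", True))   # risk shoots up
```

The design point is that the input is the engagement, not just the supplier: the function keys off what the vendor is used for, which is exactly why a fixed per-supplier checklist cannot work.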

Gardner: Dan, clearly SAP Ariba and Nielsen, they want the “dials,” they want to be able to tune this in. What’s coming next, what should we expect in terms of what you can bring to the table, and other partners like yourselves, in bringing the rich, customizable inference and understanding benefits that these other organizations want?

Constructing cognitive computing by layer

Adamson: We are definitely in early days on the one hand. But on the other hand, we have seen historically many AI failures, where we fail to commercialize AI technologies. This time it's a little different, because of the big data movement, because of the well-known use cases in machine learning that have been very successful, the pattern matching and recommending and classifying. We are using that as a backbone to build layers of cognitive computing on top of that.

And I think as Padmini said, we are providing a first layer, where it’s getting stronger and stronger. We can weed out up to 95% of the false-positives to start from, and really let the humans look at the thorny or potentially thorny issues that are left over. That’s a huge return on investment (ROI) and a timesaver by itself.

But on top of that, you can add in another layer of cognitive computing, and that might be at the workflow layer that recognizes that data and says, “Jeez, just a second here, there's a potential confidentiality issue here, let's treat this vendor differently and let's go as far as plugging in a special clause into the contract.” This is, I think, where SAP Ariba is going with that. It’s building a layer of cognitive computing on top of another layer of cognitive computing.

Actually, human processes work like that, too. There is a lot of fundamental pattern recognition at the basis of our cognitive thought, and on top of that we layer on top logic. So it’s a fun time to be in this field, executing one layer at a time, and it's an exciting approach.
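The two-layer idea Adamson outlines -- a pattern-recognition layer that filters out most false positives, with rule-based workflow logic applied to whatever survives -- can be sketched as a small pipeline. The data shapes, threshold, and workflow actions are illustrative assumptions for the sketch:

```python
# Illustrative two-layer pipeline: layer 1 drops low-confidence matches
# (the bulk of the false positives); layer 2 routes survivors to actions.
from dataclasses import dataclass

@dataclass
class Finding:
    vendor: str
    issue: str
    confidence: float  # 0-1, produced by the pattern-matching layer

def layer1_filter(findings: list[Finding], threshold: float = 0.8) -> list[Finding]:
    """Layer 1: keep only high-confidence findings."""
    return [f for f in findings if f.confidence >= threshold]

def layer2_workflow(finding: Finding) -> str:
    """Layer 2: apply workflow rules to the findings that remain."""
    if finding.issue == "confidentiality":
        return "add_confidentiality_clause"  # e.g. special contract clause
    return "escalate_to_analyst"

findings = [
    Finding("acme_print", "confidentiality", 0.92),
    Finding("acme_print", "adverse_media", 0.30),  # filtered out as noise
]
actions = [layer2_workflow(f) for f in layer1_filter(findings)]
```

A human analyst only ever sees what reaches layer 2, which is where the "up to 95% of false positives" savings mentioned above would come from.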

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

Experts define new ways to manage supply chain risk in a digital economy

Experts define new ways to manage supply chain risk in a digital economy

The next BriefingsDirect digital business thought leadership panel discussion explores new ways that companies can gain improved visibility, analytics, and predictive responses to better manage supply chain risk in the digital economy.

The panel examines how companies such as Nielsen are using cognitive computing search engines, and even machine learning and artificial intelligence (AI), to reduce risk in their overall buying and acquisitions.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the exploding sophistication around gaining insights into advanced business commerce, we welcome James Edward Johnson, Director of Supply Chain Risk Management and Analysis at Nielsen; Dan Adamson, Founder and CEO of OutsideIQ in Toronto, and Padmini Ranganathan, Vice President of Products and Innovation at SAP Ariba.

The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Padmini, we heard at SAP Ariba LIVE that risk is opportunity. That stuck with me. Are the technologies really now sufficient that we can fully examine risks to such a degree that we can turn that into a significant business competitive advantage? That is to say, those who take on risk seriously, can they really have a big jump over their competitors?

Ranganathan
Ranganathan:I come from Silicon Valley, so we have to take risks for startups to grow into big businesses, and we have seen a lot of successful entrepreneurs do that. Clearly, taking risks drives bigger opportunity.

But in this world of supplier and supply chain risk management, it’s even more important and imperative that the buyer and supplier relationships are risk-aware and risk-free. The more transparent that relationship becomes, the more opportunity for driving more business between those relationships.

That context of growing business -- as well as growing the trust and the transparent relationships -- in a supply chain is better managed by understanding the supplier base. Understanding the risks in the supplier base, and then converting them into opportunities, allows mitigating and solving problems jointly. By collaborating together, they form partnerships.

Gardner: Dan, it seems that what was once acceptable risk can now be significantly reduced. How do people in procurement and supply chain management know what acceptable risk is -- or maybe they shouldn’t accept any risk?

Adamson
Adamson:My roots are also from Silicon Valley, and I think you are absolutely right that at times you should be taking risks -- but not unnecessarily. What the procurement side has struggled with -- and this is from me jumping into financial institutions where they treat risk very differently through to procurement – is risk versus the price-point to avoid that risk. That’s traditionally been the big problem.

For every vendor that you on-board, you have to pay $1,000 for a due diligence report and it's really not price-effective. But, being able to maintain and monitor that vendor on a regular basis at acceptable cost – then there's a real risk-versus-reward benefit in there.

What we are helping to drive are a new set of technology solutions that enable a deeper level of due diligence through technology, through cognitive computing, that wasn't previously possible at the price point that makes it cost-effective. Now it is possible to clamp down and avoid risk where necessary.

Gardner: James, as a consumer of some of these technologies, do you really feel that there has been a significant change in that value equation, that for less money output you are getting a lot less risk?

Knowing what you're up against  

Johnson: To some degree that value was always there; it was just difficult to help people see that value. Obviously tools like this will help us see that value more readily.

It used to be that in order to show the value, you actually had to do a lot of work, and it was challenging. What we are talking about here is that we can begin to boil the ocean. You can test these products, and you can do a lot of work just looking at test results.

Johnson
And, it's a lot easier to see the value because you will unearth things that you couldn't have seen in the past.

Whereas it used to take a full-blown implementation to begin to grasp those risks, you can now just test your data and see what you find. Most people, once they have their eyes wide open, will be at least a little more fearful.  But, at the same time -- and this goes back to the opportunity question you asked -- they will see the opportunity to actually tackle these risks. It’s not like those risks didn't exist in the past, but now they know they are there -- and they can decide to do something about it, or not.

Gardner:So rather than avoid the entire process, now you can go at the process but with more granular tools to assess your risks and then manage them properly?

Johnson:That's right. I wouldn't say that we should have a risk-free environment; that would cost more money than we’re willing to pay. That said, we should be more conscious of what we're not yet willing to pay for.

Rather than just leaving the risk out there and avoiding business where you can’t access information about what you don't know -- now you'll know something. It's your choice to decide whether or not you want to go down the route of eliminating that risk, of living with that risk, or maybe something in between. That's where the sweet spot is. There are probably a lot of intermediate actions that people would be taking now that are very cheap, but they haven't even thought to do so, because they haven’t assessed where the risk is.

Gardner: Padmini, because we're looking at a complex landscape -- a supply chain, a global supply chain, with many tiers -- when we have a risk solution, it seems that it's a team sport. It requires an ecosystem approach. What has SAP Ariba done, and what is the news at SAP Ariba LIVE? Why is it important to be a team player when it comes to a fuller risk reduction opportunity?

Teamwork

Ranganathan:You said it right. The risk domain world is large, and it is specialized. The language that the compliance people use in the risk world is somewhat similar to the language that the lawyers use, but very different from the language that the information technology (IT) security and information security risk teams use.

The reason you can’t see many of the risks is partly because the data, the information, and the fragmentation have been too broad, too wide. It’s also because the type of risks, and the people who deal with these risks, are also scattered across the organization.
It’s not like those risks didn't exist in the past, but now they know they are there -- and they can decide to do something about it, or not.

So a platform that supports bringing all of this together is number one. Second, the platform must support the end-to-end process of managing those supply chain relationships, and managing the full supply chain and gain the transparency across it. That’s where SAP Ariba has headed with Direct Materials Sourcing and with getting more into supply chain collaboration. That’s what you heard at SAP Ariba LIVE.

We all understand that supply chain much better when we are in SAP Ariba, and then you have this ecosystem of partners and providers. You have the technology with SAP and HANA to gain the ability to mash up big data and set it in context, and to understand the patterns. We also have the open ecosystem and the open source platform to allow us to take that even wider. And last but not the least, there is the business network.

So it’s not just between one company and another company, it's a network of companies operating together. The momentum of that collaboration allows users to say, “Okay, I am going to push for finding ethical companies to do business with,� -- and then that's really where the power of the network multiplies.

Gardner: Dan, when a company nowadays buys something in a global supply chain, they are not just buying a product -- they are buying everything that's gone on with that product, such as the legacy of that product, from cradle to PO. What is it that OutsideIQ brings to the table that helps them get a better handle on what that legacy really is?

Dig deep, reduce risk, save time

Adamson: Yes, and they are not just buying from that seller, they are buying from the seller that sold it to that seller, and so they are buying a lot of history there -- and there is a lot of potential risk behind the scenes.

That’s why this previously has been a manual process, because there has been a lot of contextual work in pulling out those needles from the haystack. It required a human level of digging into context to get to those needles.

The exciting thing that we bring is a cognitive computing platform that’s trainable -- and it's been trained by FinCrime’s experts and corporate compliance experts. Increasingly, supply management experts help us know what to look for. The platform has the capability to learn about its subject, so it can go deeper. It can actually pivot on where it's searching. If it finds a presence in Afghanistan, for example, well then that's a potential risk in itself, but it can then go dig deeper on that.

And that level of deeper digging is something that a human really had to do before. This is the exciting revolution that's occurring. Now we can bring back that data, it can be unstructured, it can be structured, yet we can piece it together and provide some structure that is then returned to SAP Ariba.

The great thing about the supply management risk platform or toolkit that's being launched at SAP Ariba LIVE is that there’s another level of context on top of that. Ariba understands the relationship between the supplier and the buyer, and that's an important context to apply as well.

How you determine risk scores on top of all of that is very critical. You need to weed out all of the noise, otherwise it would be a huge data science exercise and everyone would be spinning his or her wheels.
SAP Ariba understands the relationship between the supplier and the buyer, and that's an important context to apply.

This is now a huge opportunity for clients like James to truly get some low-hanging fruit value, where previously it would have been literally a witch-hunt or a huge mining expedition. We are now able to achieve this higher level of value.

Gardner: James, Dan just described what others are calling investigative cognitive computing brought to bear on this supply chain risk problem. As someone who is in the business of trying to get the best tools for their organization, where do you come down on this? How important is this to you?

Johnson: It's very important. I have done the kinds of investigations that he is talking about. For example, if I am looking at a vendor in a high-risk country, particularly a small vendor that doesn't have an international presence  that is problematic for most supplier investigations. What do I do? I will go and do some of the investigation that Dan is talking about.

Now I'm usually sitting at my desk in Chicago. I'm not going out in the world. So there is a heightened level of due-diligence that I suspect neither of us are really talking about here. With that limitation, you want to look up not only the people, you want to look up all their connections. You might have had a due-diligence form completed, but that's an interested party giving you information, what do you do with it?

Well, I can run the risk search on more than just the entity that I'm transacting with. I am going to run it on everyone that Dan mentioned. Then I am going to look up all their LinkedIn profiles and see who they are connected to. Do any of those people show any red flags? I'd look at the bank that they use. Are there any red flags with their bank?

I can do all that work, and I can spend several hours doing all that work. As a lawyer I might dig a little deeper than someone else, but in the end, it's human labor going into the effort.

Gardner: And that really doesn't scale very well.

Johnson: That does not scale at all. I am not going to hire a team of lawyers for every supplier. The reality here is that now I can do some level of that time-consuming work with every supplier by using the kind of technology that Dan is talking about.

The promise of OutsideIQ technology is incredible. It is an early and quickly expanding opportunity. It's because of relationships like the one between SAP Ariba and OutsideIQ that I see a huge opportunity between Nielsen and SAP Ariba. We are both on the same roadmap.

Nielsen has a lot of work to do, SAP Ariba has a lot of work to do, and that work will never end, and that’s okay. We just need to be comfortable with it, and work together to build a better world.

Gardner: Tell us about Nielsen. Then secondarily, what part of your procurement, your supply chain, do you think this will impact best first?

Automatic, systematic risk management

Johnson: Nielsen is a market research company. We answer two questions: What do people watch? And what do people buy? It sounds very simple, but when you cover 90% of the world's population, which we do -- more than six billion people -- you can imagine that it gets a little bit more complicated.

We house about 54 petabytes of data, so the scale there is huge. We have 43,000 employees; it's not a small company. You might know Nielsen for the set-top boxes in the US that report overnight ratings -- for the Super Bowl, for example -- but it's a lot more than that. And you can imagine, especially when you're trying to answer what people buy in developing countries with emerging economies, you are touching some riskier things.

In terms of what this SAP Ariba collaboration can solve for us, the first quick hit is that we will no longer have to leverage multiple separate sources of information. I can now leverage all the sources of information at one time through one interface. It is already being used to deliver information to people who are involved in the procurement chain. That's the huge quick win.

The secondary win is from the efficiency that we get in doing that first layer of risk management. Now we can start to address that middle tier that I mentioned. We can respond to certain kinds of risk that, today, we are doing ad-hoc, but not systematically. There is that systematic change that will allow us to not only target the 100 to 200 vendors that we might prioritize -- but the thousands of vendors that are somewhere in our system, too.

That's going to revolutionize things, especially once you fold in the environmental, social and governance (ESG) work that, today, is very focused for us. If I can spread that out to the whole supply chain, that's revolutionary. There are a lot of low-cost things that you can do if you just have the information.

So it's not always a question of, "Am I going to do good in the world, and how much is it going to cost me?" It's really a question of, "What is the good in the world that's freely available to me, that I'm not even touching?" That's amazing! And that's the kind of thing that you can go to work for, and be happy about your work, and not just do what you need to do to get a paycheck.

Gardner: It’s not just avoiding the bad things; it’s the false positives that you want to remove so that you can get the full benefit of a diverse, rich supplier network to choose from.

Johnson: Right, and today we are essentially wasting a lot of time on suspected positives that turn out to be false. We waste time on them because we go deeper with a human than we need to. Let’s let the machines go as deep as they can, and then let the humans come in to take over where we make a difference.

Gardner: Padmini, it’s interesting to me that he is now talking about making this methodological approach standardized, part of due-diligence that's not ad-hoc, it’s not exception management. As companies make this a standard part of their supply chain evaluations, how can we make this even richer and easier to use?

Ranganathan: The first step was the data. It’s the plumbing; we have to get that right. It’s about the way you look at your master data, which is suppliers; the way you look at what you are buying, which is categories of spend; and where you are buying from, which is all the regions. So you already have the metrics segmentation of that master data, and everything else that you can do with SAP Ariba.

The next step is then the process, because it’s really not a one-size-fits-all. It cannot be a one-size-fits-all, where every supplier that you on-board you are going to ask them the same set of questions, check the box and move on.

I am going to use the print service vendor example again, which is my favorite. For marketing materials printing, you have a certain level of risk, and that's all you need to look at. But you still want, of course, to watch them for any adverse media incidents, or for suddenly landing on a watch-list for something; you do want to know that.

But when one of your business units begins to use them for customer-confidential data and statement printing -- the level of risk shoots up. So the intensity of risk assessments and the risk audits and things that you would do with that vendor for that level of risk then has to be engineered and geared to that type of risk.

So it cannot be a one-size-fits-all; it has to go past the standard. The standardization is not in the process; the standardization is in the way you look at risk, so that you can determine how much of the process you need to apply and stay in tune.

Gardner: Dan, clearly SAP Ariba and Nielsen want the "dials"; they want to be able to tune this in. What's coming next? What should we expect in terms of what you, and other partners like yourselves, can bring to the table in delivering the rich, customizable inference and understanding benefits that these organizations want?

Constructing cognitive computing by layer

Adamson: We are definitely in early days on the one hand. But on the other hand, we have seen many historical AI failures, where attempts to commercialize AI technologies fell short. This time it's a little different, because of the big data movement and because of the well-known, very successful use cases in machine learning: pattern matching, recommending, and classifying. We are using that as a backbone on which to build layers of cognitive computing.

And I think, as Padmini said, we are providing a first layer, and it's getting stronger and stronger. We can weed out up to 95% of the false positives to start with, and really let the humans look at the thorny or potentially thorny issues that are left over. That's a huge return on investment (ROI) and a timesaver by itself.

But on top of that, you can add in another layer of cognitive computing, and that might be at the workflow layer that recognizes that data and says, "Jeez, just a second here, there's a potential confidentiality issue; let's treat this vendor differently and let's go as far as plugging a special clause into the contract." This is, I think, where SAP Ariba is going with that. It's building a layer of cognitive computing on top of another layer of cognitive computing.

Actually, human processes work like that, too. There is a lot of fundamental pattern recognition at the basis of our cognitive thought, and on top of that we layer logic. So it's a fun time to be in this field, executing one layer at a time, and it's an exciting approach.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

Why Biotech Needs the Power of Data Analytics

Why Biotech Needs the Power of Data Analytics

The Human Genome Project, which aimed to map and sequence the entire human genome, began in 1990 and ended in 2003, with a budget of about $3 billion. It provided us, for the first time, with a means to access invaluable data through genes – evolution patterns, diseases and their treatments, gene mutations and their effects, anthropological information, etc. Now, powerful software and analysis tools are being built that can decode an entire genome in a matter of hours. Data analytics is quickly becoming one of the most important branches of science applied in the biotech industry.

Genomics

DNA sequencing generates a huge amount of data that needs to be analyzed with care, as the information and conclusions drawn are applicable in a whole range of industries from medicine to forensic science. It involves data science at various levels:

Storage

The first step is storage of DNA sequencing data. If we were to sequence the genome of every living thing, from a microbe to a human, we would need powerful data science tools to help us store, track, and retrieve the relevant information.
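As a toy illustration of that storage-and-retrieval problem (the class, sample names, and k value below are invented for this sketch; real genomics pipelines operate at vastly larger scale), one common trick is to index every k-mer, a substring of length k, so that sequences containing a given fragment can be looked up quickly:

```python
from collections import defaultdict

class SequenceStore:
    """Toy in-memory store: keeps named DNA sequences and a k-mer index."""

    def __init__(self, k=4):
        self.k = k
        self.sequences = {}            # name -> full sequence
        self.index = defaultdict(set)  # k-mer -> names of sequences containing it

    def add(self, name, sequence):
        self.sequences[name] = sequence
        # Slide a window of length k across the sequence and index each k-mer.
        for i in range(len(sequence) - self.k + 1):
            self.index[sequence[i:i + self.k]].add(name)

    def find(self, kmer):
        """Return the names of all stored sequences containing this k-mer."""
        return sorted(self.index.get(kmer, set()))

store = SequenceStore(k=4)
store.add("sample_a", "ACGTACGTGG")
store.add("sample_b", "TTGGACGTAA")
print(store.find("ACGT"))  # both samples contain the 4-mer ACGT
```

Real tools (BLAST-style aligners, k-mer counters) scale this idea to billions of bases with disk-backed and compressed indexes, but the store/track/retrieve shape is the same.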

Annotation

Annotation is the process of adding notes to specific genes in the sequence. Tools are being built to put ...


Read More on Datafloq
The DNA of a Data Scientist

The DNA of a Data Scientist

The role has been dubbed 'the sexiest job of the 21st century' by the Harvard Business Review, and there is good reason for it.

Data Science can be a highly rewarding career path. However, not everyone is cut out for the job. Being a great data scientist takes a certain set of skills.

We’ve compiled a list of all the things that make up the best in the business.

A high level of education

Not strictly speaking mandatory. But those working under the title without at least a master's degree are a very small minority (less than 20%). In fact, almost half of all data scientists have gone as far as completing a PhD.

The ideal education would be in the realm of mathematics, statistics, or computer science.

Is able to understand coding

There are a variety of different types of coding prevalent in the industry. A data scientist needs to be able to understand at least some of them.

Python, C/C++, Java, and Perl all regularly crop up in data science, with Python being the most prevalent.

Has a certain level of proficiency in statistics

If you think back to your stats classes, the words ‘statistics’ and ‘data’ go hand in hand. And it’s true that statistical knowledge is important for data ...


Read More on Datafloq
Five Skillsets Needed for Securing IoT Today

Five Skillsets Needed for Securing IoT Today

On October 21, 2016 a sophisticated Distributed Denial-of-Service (DDoS) attack was launched that left customers of Amazon, Netflix, Twitter, and more without service, multiple times throughout the day. TechTarget reported that the attack was leveled against Dyn, a Domain Name System (DNS) provider that services those brands, along with many others. One of the contributing factors to the attack was that the hackers were able to infect Internet of Things (IoT) devices with the Mirai botnet. They were able to identify IoT devices that used default usernames and passwords (such as username: "admin", password: "admin"), and turn them into drones in their DDoS cyberattack.

John Pironti, president of IP Architects, went on to explain to TechTarget, "The use of IoT devices for recent DDoS attacks has shown how fragile and insecure many of these devices currently are…. The first use was for DDoS, but these same devices are likely to be used as entry points to the internal networks they connect to as well as they become more pervasive."

Gartner projects that 20 billion IoT devices will be used by companies worldwide by 2020. This added mobility and productivity also brings the promise of multiplied threat vectors and vulnerabilities. If companies are ...


Read More on Datafloq
A D3 Image is Worth a Thousand Words: Interview with Morgane Ciot

A D3 Image is Worth a Thousand Words: Interview with Morgane Ciot

Many things have been said and done in the realm of analytics, but visualization remains at the forefront of the data analysis process, where intuition and correct interpretation help us make sense of data.

As an increasing number of tools emerge, current visualizations are far more than mere pictures on a screen, allowing for movement, exploration, and interaction.

One of these tools is D3, an open-source JavaScript data visualization library. D3 is perhaps the most popular tool for developing rich, interactive data visualizations, used by companies small and large, such as Google and the New York Times.

With the next Open Data Science Conference in Boston coming soon, we had the opportunity to talk with DataRobot's Morgane Ciot, an ODSC speaker, about her workshop session, "Intro to D3," the state of data visualization, and her own perspective on the analytics market.


Morgane Ciot is a data visualization engineer at DataRobot, where she specializes in creating interactive and intuitive D3 visualizations for data analysis and machine learning. Morgane studied computer science and linguistics at McGill University in Montreal. Previously, she worked in the Network Dynamics Lab at McGill, answering questions about social media behavior using predictive models and statistical topic models.

Morgane enjoys studying machine learning (ML), reading, writing, and staging unusual events.

Let's get to know more about Morgane and her views as a data visualization engineer.

Morgane, could you tell us a bit more about yourself, especially about your area of expertise, and what was your motivation to pursue a career in analytics and data science?

I went to school for computer science and linguistics. Those two fields naturally converge in Natural Language Processing (NLP)/Artificial Intelligence (AI), an intersection that was unfortunately not exploited by my program but that nonetheless got me interested in machine learning.

One of the computer science professors at my school was doing what essentially amounted to sociological research on social media behavior using machine learning techniques. Working with him furthered my interest in ML, NLP, and topic modeling, and I began to also explore how to visualize some of the unmanageable amounts of data we had (like, all of Reddit).

I’m probably indebted to that part of my life, and my professor, for my current position as a data viz engineer. Also, machine learning's practical ramifications are going to be game changing. I want to live closest to the eye of the storm when the singularity hits.

Based on your experience, which attributes or skills should every data scientist have to succeed, and what would be your recommendations for those looking for an opportunity in this career?

Stats, problem-solving skills, and engineering or scripting abilities all converge in the modern data scientist.

You have to be able to understand how to formulate a data science problem, how to approach it, and how to build the ad hoc tools you’ll need to solve it. At least some basic statistical knowledge is crucial. Elements of Statistical Learning by Hastie and Andrew Ng’s Coursera course both provide a solid foundational understanding of machine learning and require some statistical background.

Learn at least one programming language — Python or R are the most popular. R is the de facto language for statisticians, and Python has a thriving community and a ton of data science libraries like scikit-learn and pandas. It’s also great for writing scripts to scrape web data. If you’re feeling more adventurous, maybe look into Julia.

As usual, don’t just learn the theory. Find a tangible project to work on. Kaggle hosts competitions you can enter and has a community of experts you can learn from.

Finally, start learning about deep learning. Many of the most interesting papers in the last few years have come out of that area and we’re only just beginning to see how the theory that has been around for decades is going to be put into practice.

Talking about data visualization, what is your view of the role it plays within data science? How important is it in the overall data science process?

Data visualization is pretty fundamental to every stage of the data science process. I think how it’s used in data exploration — viewing feature distributions — is fairly obvious and well-practiced, but people often overlook how important visualizations can be even in the modeling process.

Visualizations should accompany not just how we examine our data, but also how we examine our models! There are various metrics that we can use to assess model performance, but what’s really going to convince an end user is a visualization, not a number. That's what's going to instill trust in model decisions.

Standard introductions to machine learning lionize the ROC curve, but there are plenty of other charts out there that can help us understand what and how a model is doing: plotting predicted vs. actuals, lift charts, feature importance, partial dependence, etc. — this was actually the subject of my ODSC talk last year, which should be accessible on their website.

A visualization that rank-orders the features that were most important to the predictive capacity of a model doesn’t just give you insight, it also helps you model better. You can use those top features to build faster and more accurate models. 
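As a hedged sketch of that rank-ordering idea (plain Python, not DataRobot's actual method; the feature names and data are invented), one simple proxy for a feature's importance is the absolute Pearson correlation between that feature and the target. Real model-based importances, such as those from tree ensembles, are more robust, but the rank-ordering idea is the same:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_features(features, target):
    """Return (name, |r|) pairs sorted from most to least correlated."""
    scores = {name: abs(pearson(col, target)) for name, col in features.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

features = {
    "ad_spend": [1.0, 2.0, 3.0, 4.0, 5.0],   # strongly related to the target
    "noise":    [5.0, 1.0, 4.0, 2.0, 2.5],   # essentially unrelated
}
target = [1.1, 2.0, 2.9, 4.2, 5.0]

for name, score in rank_features(features, target):
    print(f"{name}: {score:.2f}")
```

A chart built from `rank_features`' output is exactly the kind of feature-importance bar plot the passage describes: it both builds trust and tells you which features to keep for a faster model.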

What do you think will be the most important data visualization trend in the next couple of years?

Data is becoming ever more important basically everywhere, but popular and even expert understanding hasn't quite kept up.

Data is slowly consuming us, pressing down from all angles like that Star Wars scene where Luke Skywalker and Princess Leia get crushed by trash. But are people able to actually interpret that data, or are they going to wordlessly nod along to the magical incantations of "data" and "algorithms"?

As decisions and stories become increasingly data-driven, visualizations in the media are going to become more important. Visualizations are sort of inherently democratic.

Everyone who can see can understand a trend; math is an alien language designed to make us feel dumb. I think that in journalism, interactive storytelling — displaying data with a visual and narrative focus — is going to become even more ubiquitous and important than it already is. These visualizations will become even more interactive and possibly even gamified.

The New York Times did a really cool story where you had to draw a line to guess the trend for various statistics, like the employment rate, during the Obama years, before showing you the actual trend. This kind of quasi-gamified interactivity is intuitively more helpful than viewing an array of numbers.

Expert understanding will benefit from visualizations in the same way. Models are being deployed in high-stakes industries, like healthcare and insurance, where stakeholders need to know precisely why a model is making a decision. They'll need to either use simplified models that are inherently more intelligible, at the expense of accuracy, or have powerful tools, including visualizations, to persuade their stakeholders that model decisions can be interpreted.

The EU is working on "right to explanation" legislation, which would allow any AI-made decision to be challenged by a human. So visualizations focused on model interpretability will become more important.

A few other things: as more and more businesses integrate with machine learning systems, visualizations and dashboards that monitor large-scale ML systems and tell users when models need to be updated will become more prevalent. And of course, we're generating staggering amounts of new data every day, so visualizations that can accurately summarize that data while also allowing us to explore it in an efficient way — maybe also through unsupervised learning techniques like clustering and topic modeling — will be necessary.

Please tell us a bit about DataRobot, the company you work at.

We’re a machine learning startup that offers a platform data scientists of all stripes can use to build predictive models. I’m equal parts a fan of using the product and working on it, to be honest. The app makes it insanely easy to analyze your data, build dozens of models, use the myriad visualizations and metrics we have to understand which one will be the best for your use case, and then use that one to predict on new data.

The app is essentially an opinionated platform on how to automate your data science project. I say opinionated because it’s a machine that’s been well-oiled by some of the top data scientists in the world, so it’s an opinion you can trust. And as a data scientist, the automation isn’t something to fear. We’re automating the plumbing to allow you to focus on the problem-solving, the detective work. Don’t be a luddite! 

It’s really fun working on the product because you get to learn a ton about machine learning (both the theoretic and real-world applications) almost by osmosis. It’s like putting your textbook under your pillow while you sleep, except it actually works. And since data science is such a protean field, we’re also covering new ground and creating new standards for certain concepts in machine learning. There’s also a huge emphasis, embedded in our culture and our product, on — "democratizing" is abusing the term — really putting data science into as many hands as possible, through evangelism, teaching, workshops, and the product itself.

Shameless promotional shout-out: we are hiring! If you’re into data or machine learning or python or javascript or d3 or angular or data vis or selling these things or just fast-growing startups with some cool eclectic people, please visit our website and apply!

As a data visualization engineer at DataRobot, what are the key design principles the company applies for development of its visualizations?

The driving design principle is functionality. Above all, will a user be able to derive an insight from this visualization? Will the insight be actionable? Will that insight be delivered immediately, or is the user going to have to bend over backwards scrutinizing the chart for its underlying logic, trying to divine from its welter of hypnotic curves some hidden kernel of truth? We’re not in the business of beautiful, bespoke visualizations, like some of the stuff the NYTimes does.

Data visualization at DataRobot can be tricky because we want to make sure the visualizations are compatible with any sort of data that passes through — and users can build predictive models for virtually any dataset — which means we have to operate at the right level of explanatory and visual abstraction. And we want users of various proficiencies to immediately intuit whether or not a model is performing well, which requires thinking about how a beginner might be able to understand the same charts an expert might expect. So by "functionality" I mean the ability to quickly intuit meaning.

That step is the second in a hierarchy of insight: the first is looking at a single-valued metric, which is only capable of giving you a high-level summary, often an average. This could be obfuscating important truths. A visualization —the second step— exposes these truths a bit further, displaying multiple values at a time over slices of your data, allowing you to see trends and anomalous spots. The third step is actually playing with the visualization. An interactive visualization confirms or denies previous insights by letting you drill down, slice, zoom, project, compare — all ways of reformulating the original view to gain deeper understanding. Interactive functionality is a sub-tenet of our driving design principle. It allows users to better understand what they’re seeing while also engaging them in (admittedly) fun ways. 

During the ODSC in Boston, you will be presenting an intro to D3. Can you give us a heads-up? What is D3, and what are its main features and benefits?

D3 is a data visualization library built in Javascript. It represents data in a browser interface by binding data to a webpage’s DOM elements. It’s very low-level, but there are plenty of wrapper libraries/frameworks built around it that are easier to use, such as C3.js or the much more sophisticated Plot.ly. If you find a browser-rendered visualization toolkit, it’s probably using D3 under the hood. D3 supports transitions and defines a data update function, so you can create really beautiful custom and dynamic visualizations with it, such as these simulations or this frankly overwrought work of art.

D3 was created by Mike Bostock as a continuation of his graduate work at Stanford. Check out the awesome examples.

Please share with us some details about the session. What will attendees get from it?

Attendees will learn the basics of how D3 works. They’ll come away with a visualization in a static HTML file representing some aspect of a real-world dataset, and a vague sense of having been entertained. I’m hoping the workshop will expose them to the tool and give them a place to start if they want to do more on their own. 

What are the prerequisites attendees should have to take full advantage of your session?

Having already downloaded D3 4.0 (4.0!!!!!) will be useful, but really all you need is a working browser — I’ll be using Chrome — and an IDE or text editor of your choice. And a Positive Attitude™.

Finally, on a more personal tenor, what's the best book you've read recently? 

Story of O: a bildungsroman about a young French girl's spiritual growth. Very inspiring!

Thank you Morgane for your insights and thoughts.

Morgane's "Intro to D3" workshop session will be part of the Open Data Science Conference taking place in Boston, MA, from May 3 to 5.

A good excuse to visit beautiful Boston and have a great data science learning experience!


Cloud Analytics Conference – London!

Cloud Analytics Conference – London!

Join Snowflake and The Data Warrior in London on June 1st for a Cloud Analytics Conference
About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

The world is full of normal people like you and me, but I love to think that superheroes live among us, and I dream that maybe someday I could become one of them and make the world a better place with my super powers.

The universe of superheroes has room for gods, mutants, humans with special skills, and also special agents. I found it fun to look for similarities between this fantastic world and the world of IoT platforms. Comparing IoT platforms with superheroes or supervillains, and finding a reasonable resemblance between them, is the goal of this article. Opinions, as always, are personal and subject to all kinds of comments and appreciations. Enjoy the article.

About IoT Platforms

Many of my regular readers remember my article "It is an IoT Platform, stupid!". At that time, per Research and Markets, there were more than 260 IoT platforms; today some sources speak of 700 IoT platforms. I confess I have not been able to follow the birth, evolution, and in some cases death of all the IoT platforms out there. I think many enthusiasts like me have also given up keeping an updated list.

I cannot predict which IoT platforms will survive beyond 2020, or which will be ...


Read More on Datafloq
How to Implement a Successful Big Data and Data Science Strategy

How to Implement a Successful Big Data and Data Science Strategy

Big Data and Data Science are two of the most exciting areas in business today. While most decision makers understand the true potential of both fields, many companies remain unsure how to implement a successful big data strategy for their enterprises. This roadmap can help you define and implement the right big data strategy in your organization.

There are many ways to incorporate big data and the data science process into your company's operations, but the practices outlined here will help businesses draw up a blueprint for their big data implementation strategy.

Define the Big Data Analytics Strategy

Organizations first need to define a clear strategy, in synchronization with their core business objectives, for the big data implementation. A strategy may include improving operational efficiency, boosting marketing campaigns, analyzing consumers for prediction, or countering fraud to mitigate risk and drive business performance. The business strategy should adhere to the following points to effectively solve business problems.


The business strategy should align itself with the enterprise quality and performance goals.
It should focus on measurable outcomes.
It should transform your company’s capabilities through data-driven decision making.


Choosing the right data

With the voluminous increase in data, it has become problematic for organizations to ...


Read More on Datafloq
How to Improve Your Data Quality to Comply with the GDPR

How to Improve Your Data Quality to Comply with the GDPR

What data quality means to the GDPR

The General Data Protection Regulation (GDPR), which will come into effect on May 25, 2018, has strong implications for nearly every company and organization in Europe. Its principle of "privacy by design," first postulated by Canadian data protection expert Ann Cavoukian, could lead to a paradigm shift in how businesses develop their marketing campaigns and their customer service.

Many of the articles of the GDPR show the importance of data quality, especially Article 5 (Principles relating to processing of personal data) and Article 16 (Right to rectification). But it is obvious that other parts of the GDPR also demand a high level of data quality; in particular, duplicates should be avoided in order to properly fulfill data subjects' rights such as the "right of access" or the "right to object".
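A minimal sketch of one such data-quality measure, duplicate detection via key normalization (the record fields and values below are invented for the example; production matching engines use fuzzier logic such as phonetic or edit-distance matching):

```python
def normalize(record):
    """Build a matching key from name + email, ignoring case and extra spacing."""
    name = " ".join(record["name"].lower().split())
    email = record["email"].strip().lower()
    return (name, email)

def find_duplicates(records):
    """Return (kept, duplicate) pairs of records that share a normalized key."""
    seen = {}
    duplicates = []
    for record in records:
        key = normalize(record)
        if key in seen:
            duplicates.append((seen[key], record))
        else:
            seen[key] = record
    return duplicates

crm = [
    {"name": "Jane  Doe",  "email": "jane.doe@example.org"},
    {"name": "jane doe",   "email": " Jane.Doe@Example.org"},  # same person
    {"name": "John Smith", "email": "john@example.org"},
]
print(len(find_duplicates(crm)))  # the first two records collide
```

Merging such collisions into a single customer record is what makes a complete answer to a "right of access" request possible in the first place.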

The problem with this insight is that many businesses are struggling with their data quality. Studies and surveys show that a majority of companies are not satisfied with the data quality in their databases and think that it needs improvement.

But what are possible measures to improve data quality?

Technical and organizational measures

In the "good old days" there were dedicated employees called "data entry clerks". ...


Read More on Datafloq
Why We Need to Stop Using FTP for Media Data, Like Yesterday

Why We Need to Stop Using FTP for Media Data, Like Yesterday

It’s 2017, and it’s time to start making some serious changes around here. FTP, or the File Transfer Protocol, is one of the most popular methods for sending files to — and downloading from — the cloud. Users like FTP because it’s simple to use and efficient when you’re primarily working with local media servers.

But, the ease of FTP comes at a cost, and the security risks are simply not worth it.

According to a new report from Encoding.com, FTP and SFTP remain "a popular transit protocol for getting files to the cloud, primarily due to its ease and prevalence on local media servers."

Yet FTP and SFTP — governed by the TCP/IP protocol — were never designed to handle large data transfers. Worse, they are just not as secure as what you could be using, especially if you’re a media company with proprietary content and materials.

One of the most egregious issues with FTP is that its servers can only handle usernames and passwords in plain text. FTP’s inability to handle anything beyond plain-text usernames and passwords is exactly why you’re advised not to use root accounts for FTP access. If someone were to discern your username and password, they could ...
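The plain-text problem is easy to see at the protocol level: per RFC 959, an FTP client logs in by sending bare USER and PASS command lines over the control channel. A minimal illustration (not a working client — it only builds the bytes a client would send):

```python
# Illustration of why plain FTP credentials are readable on the wire:
# login is just two cleartext protocol lines.

def ftp_login_commands(username, password):
    """Build the byte sequences an FTP client sends to authenticate."""
    return [
        f"USER {username}\r\n".encode("ascii"),
        f"PASS {password}\r\n".encode("ascii"),
    ]

wire_traffic = b"".join(ftp_login_commands("alice", "s3cret"))
# Anyone able to observe this traffic reads the password directly:
assert b"s3cret" in wire_traffic
```

Encrypted alternatives (SFTP over SSH, or FTPS with TLS) never expose the password this way, which is the core of the security argument above.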


Read More on Datafloq
Bad Data that Changed the Course of History

Bad Data that Changed the Course of History

Data drives all the major decisions in the world today. Every business relies on data to make daily strategic decisions. Every decision, from attending to customer needs to gaining competitive advantage, is made thanks to data.

As individuals we rely on data for even the most basic daily activities, including navigating to and from work and communicating with friends and family. But what happens when the data we rely on to make our daily decisions is bad? It can have a drastic impact on our lives, whether for a small task like choosing where to eat or for deciding whether a job candidate is qualified to hire. Relying on bad data can also have a drastic impact on your bottom line.

Bad Data is Costly

We know that bad data is costly, but just how costly can it be? IBM estimates that bad data costs the US economy roughly $3.1 trillion each year. That’s a huge number. IBM also found that 1 in 3 business leaders don’t trust the information they use to make decisions. Not only do they not trust the data they are working with, but there is also a high level of ...


Read More on Datafloq
Do Self-driving Cars Hold the Key to a Widespread IoT?

Do Self-driving Cars Hold the Key to a Widespread IoT?

In 2014, Continental Tires developed tires that “talk to you”. The innovation, dubbed eTIS (electronic Tire Information System), consists of sensors embedded beneath the tire tread. The sensors relay information about when your tires are underinflated, when tread is too low, and when your car has too much weight in it from a heavy load. This new entry in the annals of IoT tech was relatively quiet and unglamorous. Yet, it forecasted what we’re seeing now. Car manufacturers and tire manufacturers are throwing millions of dollars into technology that will enable a widespread internet of things.

Call it necessity facilitating innovation; as I reported in an earlier post here, 1.2 million people die in auto-related accidents every year. That means safety is in high demand. One way to increase safety is to embed things like tires with sensors that can communicate data with a car’s onboard computer. Another way is to replace humans with AI to create self-driving cars, which will hopefully do a better job than we do at driving.

For self-driving cars to truly succeed by 2020, the IoT needs 4.5 million developers. That’s because a comprehensive IoT infrastructure—in which smart cities talk to smart cars—will help driverless vehicles navigate ...


Read More on Datafloq
Snowflake and Spark, Part 1: Why Spark? 

Snowflake and Spark, Part 1: Why Spark? 

Snowflake Computing is making great strides in the evolution of our Elastic DWaaS in the cloud. Here is a recent update from engineering and product management on our integration with Spark: This is the first post in an ongoing series describing Snowflake’s integration with Spark. In this post, we introduce the Snowflake Connector for Spark (package […]
Why Isn’t Big Data Called Small Data?

Why Isn’t Big Data Called Small Data?

Sometimes I think that Big Data has a branding problem.

You see, for data scientists to gain the trust and buy-in from their colleagues, they have to explain how their analysis can add value. They take a “data ocean” of information and distill it into highly-specific and actionable insights for every internal customer, refining and refreshing it along the way to ensure that it is as relevant as possible.

It is like they take the most powerful telescope imaginable and look for a speck of dust on the moon. “Here you go, this precise set of data will prove that you are right.”

The success of Big Data initiatives (to a large extent) comes in the ability to drill down from the planetary level to the sub-atomic level. It’s all about getting to those small insights that would never have appeared had you not started large and refocused, refocused and refocused. Of course, this doesn’t mean that the bigger trends are not relevant, but we have a tendency to view anything “large” with a certain amount of mistrust.

Somehow we naturally think that “big” things have a bigger margin for error, although the assumptions that we made on the way to the smaller insights could ...


Read More on Datafloq
How the GDPR will boost the Megatrend of Human Data Responsibility

How the GDPR will boost the Megatrend of Human Data Responsibility

First of all, some basic facts about the General Data Protection Regulation (GDPR). If you haven’t heard about it, you should pay attention now and get further information by visiting http://www.eugdpr.org/ or https://dsgvo.tips (for German readers). The GDPR will affect everybody working with personal data and is one of the major aspects of Human Data Responsibility (HDR).

The Facts:


The enforcement date of the GDPR is 25th May 2018, so you have a little over one year to introduce the new rules to your company.
There will be extremely heavy fines for organizations that don’t work within the law: up to 4% of global annual turnover or €20 million (whichever is greater).
The rules affect every organization working with the personal data of citizens of the European Union, so this is a worldwide topic.
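The "whichever is greater" rule means the fixed sum acts as a floor for large fines; a quick illustrative calculation (turnover figures invented):

```python
# How the GDPR's "4% of global annual turnover or EUR 20 million,
# whichever is greater" fine cap plays out at different company sizes.

def max_fine(global_annual_turnover_eur):
    """Upper bound of the fine for the most serious infringements."""
    return max(0.04 * global_annual_turnover_eur, 20_000_000)

# Turnover EUR 100M: 4% is only 4M, so the 20M floor applies.
assert max_fine(100e6) == 20_000_000
# Turnover EUR 1B: 4% is 40M, which exceeds the floor.
assert max_fine(1e9) == 40_000_000
```

In other words, the percentage cap only bites for companies whose global turnover exceeds €500 million.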


I also want to point out that, IMHO, the GDPR is a good thing. It is historically based on the Charter of Fundamental Rights of the European Union from the year 2000 (http://www.europarl.europa.eu/charter/pdf/text_en.pdf), where the protection of personal data (Article 8) is on the same level as Human Dignity (Article 1), the Right to Life (Article 2), or Freedom of Thought, Conscience and Religion (Article 10). ...


Read More on Datafloq
How to Avoid a Data Breach

How to Avoid a Data Breach

Even a modest data breach can create serious problems that business owners cannot afford to take lightly. By some estimates, small businesses are the target of more than 40 percent of all online attacks. From giving staff additional training to avoid bad habits that may lead to security concerns, to staying informed about new threats just over the horizon, business owners would be wise to take whatever steps are necessary to enhance their level of digital security.

Consequences of a Breach

From the largest and most high-profile breaches to situations where business owners may not even be aware that their accounts or information have been compromised, calculating the true cost of a breach can often be difficult. In addition to more tangible instances that may involve the theft of funds or loss of assets, long-term damage to the brand or image of a business can often be quite costly. Consumers who have reason for heightened concern regarding their personal, account or financial information are far more likely to take their business elsewhere. Failing to address digital security concerns could end up sending the wrong message to potential ...


Read More on Datafloq
What To Consider When Hiring Data Science Talent

What To Consider When Hiring Data Science Talent

The truth is that hiring for data science is in many ways more of an art than a science. That does sound oxymoronic, but that does not make it any less true. The reason is obvious. Data science is so new that it can be hard to know what you’re actually looking for. What is the set of skills and abilities that will make a data science team fly and which one will make them flounder?

If you don’t know, then you’re certainly not alone. Fortunately, we do have some years of company experience to draw on. What’s more, IT has been with us for nearly two decades, and there are plenty of valuable lessons there too that we can apply to data science hires.

So let’s see what we’ve learned so far.

It’s not only about the numbers

A lot of companies think that if they just get a couple of people who are incredibly good with numbers, then it will all sort itself out. That couldn’t be further from the truth because the numbers alone won’t get you anywhere.

In the data community, there’s a famous saying: Garbage In Garbage Out. When they say it, they’re mainly talking about the quality of the raw ...


Read More on Datafloq
Buzzing 2017 Trends That Will Affect Big Data Users

Buzzing 2017 Trends That Will Affect Big Data Users

The smartest way of predicting how 2017 will be for big data is to say that it will just get bigger and better. What will get bigger is the number of companies using big data, and what will get better is the way big data technologies are employed.

Technologies change so fast that it is almost impossible for organizations to keep up with the growth at times. This makes it imperative for organizations to be informed about what will trend and what is likely to shape the future, so that selections can be made appropriately. Here, we list the big data trends that will affect organizations in 2017, as well as the big data industry itself.

Boom in the Internet of Things (IoT)

In the past few years, we have seen glimpses of IoT being adopted in luxury goods. Some prominent researchers have predicted a revolution in IoT, which is sure to generate oodles of data, causing big data technologies to customize their offerings and center them around IoT.

Cloud for Everything Big Data

So far, there has been a mixed reaction to choosing the cloud for storing trivial data. But it seems that companies have finally found the right mix with hybrid ...


Read More on Datafloq
Do You Know How to Create a Dashboard?

Do You Know How to Create a Dashboard?

Chalk Board
Image by Travis Wise via flickr (https://flic.kr/p/MiM8yL)

What a question.

Of course, anyone can create a dashboard for their business, right? If I don’t have the time, my IT folks could whip up some for me in a matter of hours, right?

Even more tragically, some read the above question and thought it was “Do you know how to create a chart?” which is a different question entirely.

The truth is, not everyone can create a useful and always up-to-date business dashboard. And to those who think they have the best IT department in the whole industry, here is one little surprise: They may not have the necessary experience to build one for you either.

Really? Is it that difficult to create a business dashboard?

Everyone seems to have one or five displayed on LCD screens in their hallways or conference rooms nowadays; how can it be that hard to create?

In truth, it is not the creation of the dashboard that is difficult, but the useful and always up-to-date part.

Useful How?

Smart companies (such as our clients) with LCD screens throughout their offices use the information broadcast on those screens to disseminate important numbers that show the health of the company. By doing this, they engage all employees to think constantly about how their day-to-day tasks affect those numbers. (My last article discussed one of the benefits of this approach.)

Therefore, the numbers (and figures, and visualizations) on the screen had better be useful for everyone in the company to know about.

Here is the problem: these numbers are usually hiding inside multiple systems, several spreadsheets, and the heads of some key personnel.

And, they regularly — if not constantly — change.

So They Need to be Always Up-to-date?

Bingo.

Now you start to see that to design, build, maintain, and keep up with the changes even for a single dashboard is quite a bit of work. Are you sure now that your IT department has the bandwidth (not to mention the required skills and experience)?

It is really a full-time job for qualified personnel; in fact, in a lot of cases a single person is not enough, and it requires a team.

Okay, Mr. Smartypants, What Do You Suggest Then?

Let’s start by answering these questions:

  1. Why do I need a dashboard? What purpose does it serve in my company at the moment? One good answer: “I need a lot of visibility into what can give my company the best chance to not only survive, but excel in a fiercely-competitive industry.”
  2. What do I want on the dashboard? Do I know enough about the metrics (OK, KPIs if you have to use a buzzword) that affect the bottom line, as well as the ones that show me the pulse — or even more useful, the problem areas — within the company? If you don’t already have a project to find these metrics, now would be a good time to start one, because most likely your competitors are working on it as well.
  3. Who can help me design, build, and maintain these dashboards? Can my existing personnel do it? Or is it time to chat with folks whose day-to-day business is to design, build, and maintain other companies’ dashboards?

But Didn’t We Just Buy That BI Tool?

Maybe, but a BI suite of tools cannot automagically design, build, and maintain your dashboards. Someone still has to gather, clean up, and prepare the data so the tools can be used on it, and keep doing this as changes come and go.
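A toy sketch of that gather-clean-prepare work — the source systems, field names, and figures below are invented for illustration:

```python
# Two "systems" that report the same metric in incompatible shapes:
# a CRM with revenue as formatted strings, and billing with floats.
crm_rows = [{"cust": "Acme", "rev": "1,200"}, {"cust": "Bolt", "rev": "980"}]
billing_rows = [{"customer": "acme", "revenue": 300.0}]

def clean(rows, name_key, rev_key):
    """Map each source's rows onto one unified schema."""
    out = []
    for r in rows:
        rev = r[rev_key]
        if isinstance(rev, str):            # strip thousands separators
            rev = float(rev.replace(",", ""))
        out.append({"customer": r[name_key].strip().lower(), "revenue": rev})
    return out

# The "warehouse" table a dashboard can actually read: totals per customer.
warehouse = {}
for row in clean(crm_rows, "cust", "rev") + clean(billing_rows, "customer", "revenue"):
    warehouse[row["customer"]] = warehouse.get(row["customer"], 0.0) + row["revenue"]
```

Every new source, renamed field, or changed format means revisiting code like this, which is why the maintenance never really ends.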

One fallacy in the BI tool industry is the failure to mention the crucial part: without a well-designed and well-maintained data warehouse underneath, even the most sophisticated analytics and visualization tool is useless.

Why does a company like nextCoder exist?

Because we are very useful to our clients. It matters not if they have already paid for BI tools such as Power BI, Tableau, Domo, Birst, etc., because we actually help them design useful dashboards based on existing data, existing tools, and our experience from working on dashboards across different industries, thereby accelerating the process of making data analytics part of the company’s program to grow and compete in its industry.

“What if we don’t have a tool yet?” Then consider DW Digest(TM), which is designed to work seamlessly with our data warehouse designs and implementations. It is also competitively priced against the tools I mentioned above.

Our online platform is designed to keep these useful dashboards up-to-date and able to cope with changes. Using our services, our clients can concentrate on ironing out kinks, discovering more opportunities, and saving time and ultimately cost throughout the company and across departments, all without worrying about how to maintain those dashboards.

Last question

You may indeed be able to create a useful and always up-to-date dashboard. But since your business is probably not making dashboards, is it the best use of your time? Your team’s time?

If you have any questions on how to achieve measured acceleration for your business using dashboards, send them to: will.gunadi@dwdigest.com or call me at 214.436.3232.

What Can Security Analytics Give Your Team?

What Can Security Analytics Give Your Team?

With the constant changes happening in the technology industry and ever-increasing cyber security threats, businesses are now more eager than ever to protect their data and company assets. As more and more devices come equipped with internet capabilities, create data, and store personal information, there are more routes available for hackers to access this information and for a business to experience a security breach.

Cyber security is necessary for developing and conducting the appropriate safety measures that will ultimately protect an organization’s computer systems, networks, and confidential information. Having access to the best talent and technology is crucial for businesses and other institutions to keep up with and surpass the threats and constant efforts of cyber hackers. With these shifts in technology, whether they be data storage or video analytics, traditional perimeter protection tools are just not enough anymore. As businesses look for security solutions to invest in, they should consider funding a security analytics project as part of their cyber defense program.

Security monitoring and analytics proves to be one of the most fundamental services within a business’s information security system. Establishing your own in-house security operations center within your company to manage comprehensive monitoring and alerting services can ...


Read More on Datafloq
Data, Metadata, Algorithms & Ethics

Data, Metadata, Algorithms & Ethics

The topic of ethical big data use is one that will likely keep popping up in the headlines with increasing frequency in the coming years. As the IoT, AI, and other data-driven technologies become further integrated with our social identities, we will see more and more discussion of their regulation.

Recently, transparency advocates began pushing The Open, Public, Electronic and Necessary (or OPEN) Government Data Act, which aims to publish all non-federally restricted data in an open source format, allowing for standardized use by the government as well as the public.

“Our federal government is not just the largest organization in human history, it’s also the most complex,” said executive director of the Data Coalition, Hudson Hollister, in an article on the Federal Times. “To conduct oversight across such scale and complexity is a daunting challenge; fortunately, that is where transparency comes in. By giving Americans direct access to their government’s information, we can deputize millions of citizen inspectors general to help this committee fulfill its mission.”

This type of standardization, transparency, and ethical foresight aims to create a fair and balanced framework for the use of Big Data. Considering the pace of automation and IoT growth, these standards could begin affecting every industry ...


Read More on Datafloq
Why Marketing Needs Quality Data before Big Data and Predictive Analytics

Why Marketing Needs Quality Data before Big Data and Predictive Analytics

Recent marketing hype has been about new analytics and big data, and becoming marketing technologists. However, there are some fundamentals which must first be addressed, and a key stumbling block to effective marketing is the generally poor quality of data. Data quality is non-negotiable. In a recent study, Britain's Royal Mail Data Services found that the average impact of poor data on businesses was a cost of 6% of annual revenue. While there was some variance among respondents, clearly no company can afford to ignore this problem.

This concern with data quality is not limited to the United Kingdom. Experian's data quality arm, in their annual benchmark report on global data quality, reported that while most businesses globally (and 95% in the US) use data to meet their business objectives, less than 44% of them trust their data.

Customer Experience is Top of Mind for 2017

Some 56% of the respondents in Experian's report want to serve their customers better in 2017, and recognize that a key factor in achieving this is better data. Providing a rich customer experience is the name of the game, and poor or erroneous information about that customer could cause the end of that relationship. It has become apparent to most ...


Read More on Datafloq
Why VPNs Are Vital For Data-Driven Business

Why VPNs Are Vital For Data-Driven Business

Companies have been using Virtual Private Networks (VPNs) for years, typically so that workers could access their desktops remotely, but in the age of big data, these systems are more important than ever before. The fact is, remote working is on the rise, data use is multiplying, and hackers are more innovative than ever before. VPNs are one of the best tools at our disposal for protecting our businesses, our data, and our clients.

Is your company armed against the constant threat of data theft? Here’s what you need to know to keep your business safe.

Are You Out There?

Did you know that 80 percent of corporate professionals work outside the office at least once a week? That’s a lot of people operating outside the protection and constraints of the typical workplace, such as multi-level encryption, firewalls, and protected servers. Though digital attacks can take place anywhere, shifting away from the office puts workers at a unique risk.

Always remind your workers to take precautions when using free WiFi on the road, and invest in a VPN they can use no matter where work takes them. Yes, free WiFi is a must-have at hotels and cafes, but using it without ...


Read More on Datafloq
Top 5 ways to use Big Data to improve your Website Design

Top 5 ways to use Big Data to improve your Website Design

Big Data is a buzzword these days. Are you wondering what big data actually is? Let's first get the definition out of the way so that we can begin on the same page.

What is Big Data?

Big Data refers to huge volumes of data, both structured and unstructured. The volume of data is so massive that it is almost impossible to process using traditional means. As per Cloud Tweaks, 2.5 quintillion bytes of data are produced every single day, and experts predict that 40 zettabytes of data will be in existence by the end of 2020. So, basically, big data is everywhere, and it's shaping the internet and influencing the way we do business.
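A quick back-of-envelope check on those two figures (short-scale quintillion = 10**18 bytes, one zettabyte = 10**21 bytes) shows the projection implicitly assumes the daily rate keeps accelerating:

```python
# Sanity-checking the cited volumes: how long would it take to accumulate
# 40 zettabytes at a flat 2.5 quintillion bytes per day?
daily_bytes = 2.5e18        # 2.5 quintillion bytes produced per day
target_bytes = 40e21        # 40 zettabytes

daily_in_exabytes = daily_bytes / 1e18          # 2.5 exabytes per day
years_at_current_rate = target_bytes / daily_bytes / 365
# Roughly 44 years at a constant rate, far past 2020, so the forecast
# only works if daily production keeps growing sharply.
```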

How Big Data is Influencing Web Design

With the help of big data, businesses can create a data-driven web design that delivers the best user experience. A data-driven website design is not restricted to functionality and visual appeal; rather, it takes a more scientific approach to web design, focusing on how a company's website can gain more traffic and leads through its design. It has been observed that businesses who switch to data-driven web design enjoy ...


Read More on Datafloq
How to Capitalize on the Complex Modern Data Ecosystems

How to Capitalize on the Complex Modern Data Ecosystems

The data ecosystem serving today’s modern enterprises is a multi-platform architecture that attempts to embrace a variety of heterogeneous data sources. This modern data ecosystem (MDE) might include data lakes, traditional data warehouses, SaaS deployments and other cloud-based systems, data hubs, and distributed databases.

Multi-Platform Architecture

Reality of Modern Enterprise



The MDEs can potentially enable a wide variety of business goals as well as support data diversity, optimize costs, and support multiple systems of insight. However, MDEs will never be able to deliver these benefits unless enterprises can surmount a series of formidable challenges:


Data ownership. Who owns the data and with whom can it be shared?
Integration and unification. How will disparate data be integrated and unified to support reporting and analysis across the entire portfolio?
Data quality risks. How will an enterprise ensure adequate data quality given that different data systems will be characterized by different levels of data quality?
Skillset scarcity.  How will an enterprise fulfill the need for a diverse set of skills?
Optimization issues. How will an enterprise optimize the interaction among an MDE’s separate, poorly orchestrated components?
Multiple data models. How will an enterprise work with multiple data models that proliferate, reducing efficiency?
Holistic view. How will an enterprise establish a sustainable method for gaining ...


Read More on Datafloq
3 Fresh Approaches to Maximize Customer Value with Data

3 Fresh Approaches to Maximize Customer Value with Data

New customer acquisition is costly. And customers are increasingly demanding, fickle, and empowered with endless options — new and old — to spend their dollars. So brands are rightly focused on increasing retention and share of wallet to maximize customer value.

Brands know that data holds the key to making the customer value gains they want to see. But they struggle to leverage that data in the right way. Here are 3 fresh approaches many brands are not using, but should consider, to improve customer value.

1. The More Data, The Merrier

You are collecting some data — likely even a lot of data — about your customers. You’ve got some demographics, geography/location, and purchase history. You may have their customer service history and website behavior as well.

Don’t stop there. Do you know their marital status, education level, income? How about the words they said when talking to a customer service rep? How about their tweets? How loyal to your brand are their friends, family, and coworkers in their social networks?

And don’t stop with your customers. What about the enterprise itself? You’ve got a wealth of data about every aspect of the business, including sales data, ops data, and much more.

Why is this important?

Many ...


Read More on Datafloq
Connected Cars: Big Data’s Next Mining Ground

Connected Cars: Big Data’s Next Mining Ground

One of the exciting things about the future of big data is that it will likely start acting more like a living system when products like the self-driving car mature in the marketplace. Instead of needing to correlate the differences, the analytical infrastructure will allow data that is accruing to be analyzed and acted upon automatically. That type of future should be reassuring for most insurance companies that will now only have hacking to worry about when it comes to serious accidents and large payouts.

Data mining that makes sense

Most people have noticed that the amount of privacy that they have in their lives is continuously shrinking. Part of this is due to convenience, while another part is due to additional security that takes away individual freedoms in order to provide the entire neighborhood, town, or community with better protection.

Because the tech industry has already set up the kind of licensing that subjects users of smartphones and tablets in or connected to cars to wide-scale data gathering, auto manufacturers are scrambling to put together sophisticated programs that take advantage of the data they are allowed to gather.

Insurance industry faces a great deal of change

Another ...


Read More on Datafloq
Cloudera Analyst Event: Facing a New Data Management Era

Cloudera Analyst Event: Facing a New Data Management Era

I have to say that I attended this year’s Cloudera analyst event in San Francisco with a mix of excitement, expectation, and a grain of salt.

My excitement and expectation were fuelled by all that has been said about Cloudera and its close competitors in the last couple of years, and also by the fact that I am currently focusing my own research on big data and “New Data Platforms”. Moreover, when it comes to events hosted by vendors, I always recommend taking their statements with a grain of salt, because the information might logically be biased.


However, in the end, the event proved to be an enriching learning experience, full of surprises and discoveries. I learnt a lot about a company that is certainly contributing in a big way to the transformation of the enterprise software industry.

The event certainly fulfilled many of my “want-to-know-more” expectations about Cloudera and its offering stack; the path the company has taken; and their view of the enterprise data management market.

Certainly, it looks like Cloudera is leading and strongly paving the way for a new generation of enterprise data software management platforms.

So, let me share with you a brief summary and comments about Cloudera’s 2017 industry analyst gathering.

OK, Machine Learning and Data Science are Hot Today

One of the themes of the event was Cloudera’s keen interest and immersion into Machine Learning and Data Science. Just a few days before the event, the company made two important announcements:

The first one was about the beta release of Cloudera Data Science Workbench (Figure 1), the company’s new self-service environment for data science on top of Cloudera Enterprise. This new offering comes directly from the smart acquisition of machine learning and data science startup, Sense.io.

Figure 1. Cloudera Data Science Workbench (Courtesy of Cloudera)
Some of the capabilities of this product allow data scientists to develop in some of the most popular open source languages —R, Python and Scala— with native Apache Spark and Apache Hadoop integration, which in turn speeds project deployments from exploration to production.

In this regard, Charles Zedlewski, senior vice president of Products at Cloudera, mentioned that

“Cloudera is focused on improving the user experience for data science and engineering teams, in particular those who want to scale their analytics using Spark for data processing and machine learning. The acquisition of Sense.io and its team provided a strong foundation, and Data Science Workbench now puts self-service data science at scale within reach for our customers.”


One key approach Cloudera takes with the Data Science Workbench is that it aims to enable data scientists to work in a truly open space that can expand its reach to use, for example, deep learning frameworks such as TensorFlow, Microsoft Cognitive Toolkit, MXNet or BigDL, but within a secure and contained environment.

This is certainly a new offering with huge potential for Cloudera to increase its customer base, but also to reaffirm and grow its presence within existing customers, which can now expand their use of the Cloudera platform without needing to look for third-party options to develop on top of it.

The second announcement showcases the launch of the Cloudera Solution Gallery (Figure 2), which enables Cloudera to showcase its large partner base —more than 2,800 partners globally— and a storefront of more than 100 solutions.

This news should not be taken lightly, as it shows Cloudera’s capability to start building a complete ecosystem around its robust set of products, which in my view is a defining trait of companies that want to become a de facto industry standard.

Figure 2. Cloudera Solution Gallery (Courtesy of Cloudera)

Cloudera: Way More than Hadoop

During an intensive two-day event filled with presentations, briefings and interviews with Cloudera’s executives and customers, a persistent message prevailed. While the company recognizes its origin as a provider of a commercial distribution of Hadoop, it is now making it clear that its current offering has expanded well beyond the Hadoop realm to become a full-fledged open source data platform. Hadoop is certainly at the core of Cloudera as the main data engine but, with support for 25 open source projects, its platform currently offers much more than Hadoop distributed storage capabilities.
This is reflected in Cloudera’s offerings, from the full-fledged Cloudera Enterprise Data Hub, its comprehensive platform, to one of Cloudera’s special configurations:




Cloudera’s executives made it clear that the company strategy is to make sure they are able to provide, via open source offerings, efficient enterprise-ready data management solutions.

However, don’t be surprised if the message from Cloudera changes over time, especially if the company sets its sights on larger organizations, which most of the time rely on providers that center their IT services on the business and are not necessarily tied to any particular technology.

Cloudera is redefining itself so it can reposition its offering as a complete data management platform. This is a logical step considering that Cloudera wants to take a bigger piece of the large enterprise market, even when the company’s CEO stated that they “do not want to replace the Netezzas and Oracles of the world”.

Based on these events, it is clear to me that Cloudera will eventually end up competing head-on in specific segments of the data management market, especially with IBM, through its BigInsights offering, and Teradata, whose multiple products have left and keep leaving a very strong footprint in the data warehouse market. Whether we like it or not, big data incumbents such as Cloudera seem destined to enter the big fight.

The Future, Cloudera and IoT

During the event I also had a chance to attend a couple of sessions specifically devoted to showing Cloudera’s deployments in IoT projects. Another thing worth noting is that, even though Cloudera has some really good stories to tell about IoT, the company does not seem to be in a hurry to jump onto this wagon.

Perhaps it’s better to let this market become mature and consistent enough before devoting larger technical investments to it. It is always very important to know when and how to invest in an emerging market.

However, we should be very well aware that Cloudera, and the rest of the big data players, will be vital for the growth and evolution of the IoT market.

Figure 3. Cloudera Architecture for IoT (Courtesy of Cloudera)

It’s Hard to Grow Gracefully

Today it’s very hard, if not impossible, to deny that Hadoop is strongly immersed in the enterprise data management ecosystem of almost every industry. Cloudera’s analyst event was yet another confirmation: large companies are increasingly using Cloudera’s different options and configurations for mission-critical functions.

For Cloudera, then, the nub of the issue is no longer how to get to the top, but how to stay there, evolve, and leave its footprint at the top.

Cloudera has been very smart and strategic in getting to this position, yet it seems to have reached a place where the tide will get even tougher. From this point on, convincing companies to open the big wallet will take much more than a solid technical justification.

At the time of writing this post, I learnt that Cloudera has filed to go public and will trade on the New York Stock Exchange, and as an article in Fortune mentions:

“Cloudera faces tough competition in the data analytics market and cites in its filing several high-profile rivals, including Amazon Web Services, Google, Microsoft, Hewlett Packard Enterprise, and Oracle.”

It also mentions the case of Hortonworks, which:

“went public in late 2014 with its shares trading at nearly $28 during its height in April 2015. However, Hortonworks’ shares have dropped over 60% to $9.90 on Friday as the company has struggled to be profitable.”

In my opinion, for Cloudera to succeed in taking this critical step, it will have to show that it is well prepared business-wise, technically and strategically, and also ready for the unexpected; only then will it be able to grow gracefully and play big, with the big guys.

Always keep in mind that, as Benjamin Franklin said:

Without continual growth and progress, such words as improvement, achievement, and success have no meaning.

How SAP Ariba became a first-mover as Blockchain comes to B2B procurement

The next BriefingsDirect digital business thought leadership panel discussion examines the major opportunity from bringing Blockchain technology to business-to-business (B2B) procurement and supply chain management.

We will now explore how Blockchain’s unique capabilities can provide comprehensive visibility across global supply chains and drive simpler verification of authenticity, security, and ultimately control.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how Blockchain is poised to impact and improve supply chain risk and management, we're joined by Joe Fox, Senior Vice President for Business Development and Strategy at SAP Ariba, and Leanne Kemp, Founder and CEO of Everledger, based in London.

The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Joe, Blockchain has emerged as a network methodology, best known for running the cryptocurrency Bitcoin. It's a digitally shared record of transactions maintained by a network of computers, not necessarily with a centralized authority. What could this be used for powerfully when it comes to gaining supply chain integrity?

Fox: Blockchain did start in the Bitcoin area, as peer-to-peer consumer functionality. But a lot of the capabilities of Blockchain have been recognized as important for new areas of innovation in the enterprise software space.

Those areas of innovation are around “trusted commerce.” Trusted commerce allows buyers and sellers, and third parties, to gain more visibility into asset-tracking. Not just asset tracking in the context of the buyer receiving and the seller shipping -- but in the context of where is the good in transit? What do I need to do to protect that good? What is the transfer of funds associated with that important asset? There are even areas of other applications, such as an insurance aspect or some kind of ownership-proof.
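The shared, tamper-evident ledger described above can be sketched in a few lines. This is an illustrative toy, not Bitcoin's or any production chain's actual data structures: each record carries a hash of its predecessor, so altering any earlier entry breaks every later link, which is what makes posted asset records trustworthy to all parties.

```python
import hashlib
import json


def block_hash(block):
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


class Ledger:
    """A minimal append-only, hash-linked ledger (illustrative only)."""

    def __init__(self):
        self.chain = [{"index": 0, "prev_hash": "0" * 64, "data": "genesis"}]

    def append(self, data):
        block = {
            "index": len(self.chain),
            "prev_hash": block_hash(self.chain[-1]),  # link to predecessor
            "data": data,
        }
        self.chain.append(block)
        return block

    def verify(self):
        """Tampering with any earlier block breaks every later link."""
        return all(
            self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )


ledger = Ledger()
ledger.append({"asset": "container-42", "event": "shipped"})
ledger.append({"asset": "container-42", "event": "received"})
assert ledger.verify()

ledger.chain[1]["data"]["event"] = "lost"  # rewrite history...
assert not ledger.verify()                 # ...and verification fails
```

The real systems mentioned here add digital signatures and a consensus protocol on top of this linking idea, but the tamper-evidence itself comes from the chained hashes.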

Gardner: It sounds to me like we are adding a lot of metadata to a business process. What's different when you apply that through Blockchain than if you were doing it through a platform?

Inherit the trust

Fox: That's a great question. Blockchain is like the cloud in the sense that it's an innovation at the platform layer. But the chain is only as valuable as the external trust that it inherits. That external trust is the proof of what you have put on the chain digitally. And that includes proof of who has taken it off and in what way they have control.

As we associate a chain transaction, or a posting to the ledger, with its original transactions within the SAP Ariba Network, we are actually adding a lot of prominence to that single Blockchain record. That's the real key: marrying the transactional world and the B2B world with this new trusted-commerce capability that comes with Blockchain.

Gardner: Leanne, we have you here as a prime example of where Blockchain is being used outside of its original adoption. Tell us first about Everledger, and then what it was you saw in Blockchain that made you think it was applicable to a much wider business capability.

Kemp: Everledger is a fast-moving startup using the best of emerging technology to assist in the reduction of risk and fraud. We began in April of 2015, so it's actually our birthday this week. We started in the world of diamonds where we apply blockchain technology to bring transparency to a once opaque market.

And what did I see in the technology? At the very core of cryptocurrency, they were solving the problem of double-spend. They were solving the problem of transfer of value, and we could translate those two very powerful concepts into the diamond industry.

At the heart of the diamond industry, beyond the physical object itself, is certification, and certificates in the diamond industry are the currency of trade. Diamonds are listed on websites around the world, and they are mostly sold on the merit of the certification. We were able to see the potential of the cryptocurrency, but we could decouple the currency from the ledger, and we were then able to use the synthesis of the currency as a way to transfer value, or transfer ownership or custody. And, of course, diamonds are a girl's best friend, so we might as well start there.
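The double-spend problem Kemp mentions maps neatly onto certificates: the same paper certificate can back two sales, but a ledger that tracks current ownership cannot. A minimal sketch, with invented asset and party names, of why an append-only ownership registry rejects a second spend:

```python
class OwnershipLedger:
    """Toy registry showing how a ledger blocks double-spends.
    Illustrative only; real chains add signatures and consensus."""

    def __init__(self):
        self.owner = {}    # asset id -> current owner
        self.history = []  # append-only transfer log

    def register(self, asset, owner):
        if asset in self.owner:
            raise ValueError("asset already registered")
        self.owner[asset] = owner
        self.history.append(("register", asset, owner))

    def transfer(self, asset, seller, buyer):
        # Valid only if the seller is the current owner, so the same
        # certificate cannot be sold twice by the same party.
        if self.owner.get(asset) != seller:
            raise ValueError("seller does not own this asset")
        self.owner[asset] = buyer
        self.history.append(("transfer", asset, seller, buyer))


registry = OwnershipLedger()
registry.register("cert-0001", "mine-co")
registry.transfer("cert-0001", "mine-co", "dealer-a")
try:
    # mine-co tries to sell the same certificate a second time
    registry.transfer("cert-0001", "mine-co", "dealer-b")
except ValueError as err:
    print(err)  # seller does not own this asset
```

Decoupling the currency from the ledger, as Kemp describes, amounts to reusing exactly this ownership-transfer mechanism for certificates rather than coins.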

Dealing with diamonds

Gardner: What was the problem in the diamond industry that you were solving? What was not possible that now is?

Kemp: The diamond industry boasts some pretty impressive numbers. First, it's been around for 130 years. Most of the relationships among buyers and sellers have survived generation upon generation based on a gentleman's handshake and trust.

The industry itself has been bound tightly with those relationships. As time has passed and generations have passed, what we are starting to see is a glacial melt. Some of the major players have sold off entities into other regions, and now that gentleman's handshake needs to be transposed into an electronic form.

Some of the major players in the market, of course, still reside today. But most of the data under their control sits in a siloed environment. Even the machines that are on the pipeline that help provide identity to the physical object are also black-boxed in terms of data.

We are able to bring a business network to an existing market. It's global. Some 81 countries around the world trade in rough diamonds. And, of course, the value of the diamonds increases as they pass through their evolutionary chain. We are able to bring an aggregated set of data. Not only that, we transpose the human element of trust -- the gentleman's handshake, the chit of paper and the promise to pay -- that has largely existed and has built 130 years of trade.

We are now able to transpose that into a set of electronic-form technologies -- Blockchain, smart contracts, cryptography, machine vision -- and we are able to take forward a technology platform that will see transactional trust being embedded well beyond my lifetime -- for generations to come.

Gardner: Joe, we have just heard how this is a problem-solution value in the diamond industry. But SAP Ariba has its eyes on many industries. What is it about the way things are done now in general business that isn't good enough but that Blockchain can help improve?

Fox: As we have spent years at Ariba solving procurement problems, we identified some of the toughest. When I saw Everledger, it occurred to me that they may have cracked the nut on one of the toughest areas of B2B trade -- and that is true understanding, visibility, and control of asset movement.

It dawned on me, too, that if you can track and trace diamonds, you can track and trace anything. I really felt like we could team up with this young company and leverage the unique way they figured out how to track and trace diamonds and apply that across a huge procurement problem. And that is, how do a supplier and a buyer manage the movement of any asset after they have purchased it? How do we actually associate that movement of the asset back to its original transactions that approved the commit-to-pay? How do you associate a digital purchase order (PO) with a digital movement of the asset, and then to the actual physical asset? That's what we really are teaming up to do.

That receipt of the asset has been a dark space in the B2B world for a long time. Sure, you can get a shipping notice, but most businesses don't do goods receipts. And as the asset flows through the supply chain -- especially the more expensive the item is -- that lack of visibility and control causes significant problems. Maybe the most important one is overpaying for inventory to cover supply chain items actually lost in transit.

I talked to a really large UK-based telecom company, and they told me that with what we are going to do with Everledger, on just their fiber optics, they could cut their buying in half. Why? Because they overbuy fiber optics to make sure they are never short on fiber optic inventory.

That precision of buying and delivery applies across the board to all merchants and all supply chains, even middle of the supply chain manufacturers. Whenever you have disruption to your inbound supply, that’s going to disrupt your profitability.

Gardner: It sounds as if what we are really doing here is getting a highly capable means -- that’s highly extensible -- to remove the margin of error from the tracking of goods, from cradle to grave.

Chain transactions

Fox: That’s exactly right. And the Internet is the enabler, because Blockchain is everywhere. Now, as the asset moves, you have the really cool stuff that Everledger has done, and other things we are going to do together – and that’s going to allow anybody from anywhere to post to the chain the asset receipt and asset movement.

For example, with a large container coming from overseas, you will have the chain record of every place that container has been. If it doesn't show up at a dock, you now have visibility as the buyer that there is a supply chain disruption. That chain being out on the Internet, at a layer that’s accessible by everyone, is one of the keys to this technology.

We are going to be focusing on connecting the fabric of the chain together with Hyperledger. Everledger builds on the Hyperledger platform. The fabric that we are going to tie into is going to directly connect those block posts back to the original transactions, like the purchase order, the invoice, the ship notice. Then the companies can see not only where their asset is, but also view it in context of the transactions that resulted in the shipment.
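The linking Fox describes, tying each ledger posting back to the purchase order, invoice, and ship notice behind it, can be sketched as a chain event that carries document references. The field names here (po_id, invoice_id, ship_notice_id) are illustrative placeholders, not SAP Ariba's or Hyperledger's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class ChainEvent:
    """A ledger posting referencing the B2B documents that explain it."""
    asset_id: str
    location: str
    po_id: str                            # originating purchase order
    invoice_id: Optional[str] = None      # invoice, if issued
    ship_notice_id: Optional[str] = None  # advance ship notice, if sent


@dataclass
class AssetTrail:
    """Append-only trail of chain events for tracked assets."""
    events: List[ChainEvent] = field(default_factory=list)

    def post(self, event: ChainEvent) -> None:
        self.events.append(event)

    def where_is(self, asset_id: str) -> Optional[Tuple[str, str]]:
        """Latest known location plus the PO that puts it in context."""
        for ev in reversed(self.events):
            if ev.asset_id == asset_id:
                return ev.location, ev.po_id
        return None


trail = AssetTrail()
trail.post(ChainEvent("container-42", "port-of-shanghai",
                      po_id="PO-1001", ship_notice_id="ASN-77"))
trail.post(ChainEvent("container-42", "port-of-rotterdam", po_id="PO-1001"))
print(trail.where_is("container-42"))  # ('port-of-rotterdam', 'PO-1001')
```

The point of the design is the join: a buyer who sees no posting for an expected checkpoint learns of the disruption from the chain, and every posting that does appear is readable in the context of the transactions that caused the shipment.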

Gardner: So the old adage -- trust but verify -- we can now put that to work and truly verify. There's news taking place here at SAP Ariba LIVE between Everledger and SAP Ariba. Tell us about that, and how the two companies -- one quite small, one very large -- are going to work together.

Fox: Ariba is all-in on transforming the procurement industry, the procurement space, the processes of procurement for our customers, buyers and sellers, and we are going to partner heavily with key players like Everledger.

Part of the announcement is this partnership with Everledger around track and trace, but it is not limited to track and trace. We will leverage what they have learned across our platform of $1 trillion a year in spend, with 2.5 million companies trading assets with each other. We are going to apply this partnership to many other capabilities within that.

Kemp: I am very excited. It’s a moment in time that I think I will remember for years to come. In March we also made an important announcement with IBM on some of the work that we have done beyond identifying objects. And that is to take the next step around ensuring that we have an ethical trade platform, meaning one that is grounded in cognitive compliance.

We will be able to identify the asset, but also know, for example in the diamond industry, that a diamond has passed through the right channels, paid the dutiful taxes that are due as a part of an international trade platform, and ensure all compliance is hardened within the chain.

I am hugely excited about the opportunity that sits before me. I am sincerely grateful that such a young company has been afforded the opportunity to really show how we are going to shine.

Gardner: When it comes to open trade, removing friction from commerce, these have been goals for hundreds of years. But we really seem to be onto something that can make this highly scalable, very rich -- almost an unlimited amount of data applied to any asset, connected to a ledger that’s a fluid, movable, yet tangible resource.

Fox: That’s right.

Gardner: So where do we go next, Joe? If the sky is the limit, describe the sky for me? How big is this, and where can you take it beyond individual industries? It sounds like there is more potential here.

Reduced friction costs

Fox: There is a lot of potential. If you think about it, Blockchain is an evolution of the Internet; we are going to be able to take advantage of that.

The new evolution is that it's a structured capability across the Internet itself. It’s going to be open, and it’s going to be able to allow companies to ledger their interactions with each other. They are going to be able, in an immutable way, to track who owns which asset, where the assets are, and be able to then use that as an audit capability.

That's all very important to businesses, and until now the Internet itself has not really had a structure for business. It's been open, the Wild West. This structure for business is going to help with what I call trusted commerce because in the end businesses establish relationships because they want to do business with each other, not based on what technology they have.

Another key fact about Blockchain is that it’s going to reduce friction in global B2B. I always like to say that if you accelerated B2B payments by just a few days globally, you would unlock Gross Domestic Product (GDP) growth and economies would start growing dramatically. This friction around assets has a direct tie to how slowly money moves around the globe, and to the overall cost and friction that result.

So how big could it go? Well, I think that we are going to innovate together with Everledger and other partners using the Hyperledger framework. We are going to add every buyer and seller on the Ariba Network onto the chain. They are just going to get it as part of our platform.

Then we are going to begin ledgering all the transactions that they think make sense between themselves. We are going to release a couple of key functions, such as smart contracts, so their contract business rules can be applicable in the flow of commerce -- at the time commerce is happening, not locked up in some contract, or in some drawer or Portable Document Format (PDF) file. We are going to start with those things.
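A smart contract of the kind Fox describes applies contract business rules in the flow of commerce rather than leaving them in a PDF. A minimal sketch of that idea, where the rule set, field names, and thresholds are entirely hypothetical, validates each invoice against contracted terms as it passes through:

```python
# Contracted terms encoded as data a rule engine can check at
# transaction time (values are made up for illustration).
CONTRACT_RULES = {
    "max_unit_price": 120.0,
    "allowed_currency": "USD",
    "payment_terms_days": 30,
}


def validate_invoice(invoice, rules=CONTRACT_RULES):
    """Return a list of contract-rule violations (empty = compliant)."""
    violations = []
    if invoice["unit_price"] > rules["max_unit_price"]:
        violations.append("unit price exceeds contracted maximum")
    if invoice["currency"] != rules["allowed_currency"]:
        violations.append("currency not permitted by contract")
    if invoice["terms_days"] > rules["payment_terms_days"]:
        violations.append("payment terms longer than contracted")
    return violations


invoice = {"unit_price": 150.0, "currency": "USD", "terms_days": 30}
print(validate_invoice(invoice))  # ['unit price exceeds contracted maximum']
```

On an actual chain these checks would run as ledger-hosted code (chaincode, in Hyperledger Fabric's terms) so every party sees the same verdict, but the essential shift is the same: the contract's rules fire at the moment commerce happens.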

I don't know what applications we are going to build beyond that, but that's the excitement of it. I think the fact that we don't know is the big play.

Gardner: From a business person’s perspective, they probably don’t care too much that it’s Blockchain enabling this, just as 20 years ago a lot of people didn’t care that it was the Internet allowing them to shop online or send emails to anybody anywhere. Setting aside what the technology is, what is the business benefit that people should be thinking about?

Fox: Everybody wants digital trust, right? Leanne, why don’t you share some of the things you guys have been exploring?

Making the opaque transparent

Kemp: In the diamond industry, there is fraud related to document tampering. Typically paper certificates exist across the backbone, so it’s very easy to be able to transpose those into a PDF and make appropriate changes for self-gain.

Double-financing of the pipeline is a very real problem: with invoicing and accounts receivable, parties have been able to get banks to finance the same invoices two, three, four times.

We have issues with round-tripping of diamonds through countries, where transfer pricing isn't declared correctly, along with the avoidance of tax and duties.

All of these issues are the dark side of the market. But now we have the ability to bring transparency around any object, particularly in diamonds -- the one commodity that has yet to have true financial products wrapped around it. Now, what do I mean by that? It doesn’t have a futures market yet. It doesn’t have exchange-traded funds (ETFs). And yet diamonds have outperformed gold, platinum and palladium.

Now, what does this mean? It means we can bring transparency to the once opaque, have the ability to know if an object has gone through an ethical chain, and then realize the true value of that asset. This process allows us to start and think about how new financial products can be formed around these assets.

We are hugely interested in rising asset classes beyond just the commodity section of the market. This platform shift is like going from the World Wide Web to the World Wide Ledger. Joe was absolutely correct when he mentioned that the Internet hasn't been woven for transactional trust -- but we have the ability to do this now.

So from a business perspective, you can begin to really innovate on top of this exponential set of technology stacks. A lot of companies describe Everledger as a Blockchain company. I have to correct them: we are an emerging-technology company. We use the very best of Blockchain, smart contracts, machine vision, and sensor data points to form the identity of objects.

Now, why is that important? Most financial services companies have really been focused on Know Your Customer (KYC), but we believe that it's Know Your Object (KYO) that really creates an entirely new context around it.

Now, that transformation and the relationship of the object have already started to move. When you think about the Internet of Things (IoT), mobile phones, and autonomous cars -- these are largely devices connected to the fabric of the web. But are they connected to the fabric of the transactions and the identity around those objects?

Insurance companies have begun to understand this. My work in the last 10 years has been deeply involved in insurance. As you begin to build and understand the chain of trust and the chain of risk, then tectonic plate shifts in financial services begin to unfold.

Apps and assets, on and off the chain

Fox: It’s not just about the chain, it's about the apps we build on top, and it's really about what is the value to the buyer and the seller as we build those apps on top.

To Leanne’s point, it’s first going to be about the object. The funny thing is that we have struggled to provide visibility and control of an object in a digital way, and this is going to fix that. In the end, B2B, which is where SAP Ariba is, is about somebody getting something and paying for it. And that physical asset they are getting is being paid for with another asset. They are just two different forms. By digitizing both and keeping that in a ledger that really cannot be altered -- it will be the truth, but it's open to everyone, buyers and sellers.

Businesses will have to invent ways to control how frictionless this is going to be. I will give you a perfect example. In the past, if I told you I could make an international payment of $1 million to somebody in two minutes, you would have told me I was crazy. With Blockchain, one corporation can pay another corporation $1 million in two minutes, internationally.

And on the chain, companies like Everledger can build capabilities that do the currency translation on the fly, as it’s passing through, and that doesn’t disintermediate the banks, because how did the $1 million get onto the chain in the first place? Someone put it on the chain through a bank. The bank is backing that digital version. How does it get off the chain so you can actually do something with it? It goes through another bank. It’s actually going to make the banks more important. Again, Blockchain is only as good as the external trust that it inherits.

I really think we have to focus on getting the chain out there and really building these applications on top.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

How SAP Ariba became a first-mover as Blockchain comes to B2B procurement

How SAP Ariba became a first-mover as Blockchain comes to B2B procurement

The next BriefingsDirect digital business thought leadership panel discussion examines the major opportunity from bringing Blockchain technology to business-to-business (B2B) procurement and supply chain management.

We will now explore how Blockchain’s unique capabilities can provide comprehensive visibility across global supply chains and drive simpler verification of authenticity, security, and ultimately control.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how Blockchain is poised to impact and improve supply chain risk and management, we're joined by Joe Fox, Senior Vice President for Business Development and Strategy at SAP Ariba, and Leanne Kemp, Founder and CEO of Everledger, based in London.

The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Joe, Blockchain has emerged as a network methodology, running crypto currency Bitcoin, as most people are aware of it. It's a digitally shared record of transactions maintained by a network of computers, not necessarily with centralized authority. What could this be used for powerfully when it comes to gaining supply chain integrity?

Fox: Blockchain did start in the Bitcoin area, as peer-to-peer consumer functionality. But a lot of the capabilities of Blockchain have been recognized as important for new areas of innovation in the enterprise software space.

Fox
Those areas of innovation are around “trusted commerce.� Trusted commerce allows buyers and sellers, and third parties, to gain more visibility into asset-tracking. Not just asset tracking in the context of the buyer receiving and the seller shipping -- but in the context of where is the good in transit? What do I need to do to protect that good? What is the transfer of funds associated with that important asset? There are even areas of other applications, such as an insurance aspect or some kind of ownership-proof.

Gardner: It sounds to me like we are adding lot of metadata to a business process. What's different when you apply that through Blockchain than if you were doing it through a platform?

Inherit the trust

Fox: That's a great question. Blockchain is like the cloud from the perspective of it’s an innovation at the platform layer. But the chain is only as valuable as the external trust that it inherits. That external trust that it inherits is the proof of what you have put on the chain digitally. And that includes that proof of who has taken it off and in what way they have control.

As we associate a chain transaction, or a posting to the ledger with its original transactions within the SAP Ariba Network, we are actually adding a lot of prominence to that single Blockchain record. That's the real key, marrying the transactional world and the B2B world with this new trusted commerce capability that comes with Blockchain.

Gardner: Leanne, we have you here as a prime example of where Blockchain is being used outside of its original adoption. Tell us first about Everledger, and then what it was you saw in Blockchain that made you think it was applicable to a much wider businesscapability.

Kemp: Everledger is a fast-moving startup using the best of emerging technology to assist in the reduction of risk and fraud. We began in April of 2015, so it's actually our birthday this week. We started in the world of diamonds where we apply blockchain technology to bring transparency to a once opaque market.

Kemp
And what did I see in the technology? At the very core of cryptocurrency, they were solving the problem of double-spend. They were solving the problem of transfer of value, and we could translate those very two powerful concepts into the diamond industry.

At the heart of the diamond industry, beyond the physical object itself, is certification, and certificates in the diamond industry are the currency of trade. Diamonds are cited on web sites around the world, and they are mostly sold off the merit of the certification. We were able to see the potential of the cryptocurrency, but we could decouple the currency from the ledger and we were able to then use the synthesis of the currency as a way to transfer value, or transfer ownership or custody. And, of course, diamonds are a girl's best friend, so we might as well start there.

Dealing with diamonds

Gardner: What was the problem in the diamond industry that you were solving? What was not possible that now is?
Kemp: The diamond industry boasts some pretty impressive numbers. First, it's been around for 130 years. Most of the relationships among buyers and sellers have survived generation upon generation based on a gentleman's handshake and trust.

The industry itself has been bound tightly with those relationships. As time has passed and generations have passed, what we are starting to see is a glacial melt. Some of the major players have sold off entities into other regions, and now that gentleman's handshake needs to be transposed into an electronic form.

Some of the major players in the market, of course, still reside today. But most of the data under their control sits in a siloed environment. Even the machines that are on the pipeline that help provide identity to the physical object are also black-boxed in terms of data.

We are able to bring a business network to an existing market. It's global. Some 81 countries around the world trade in rough diamonds. And, of course, the value of the diamonds increases as they pass through their evolutionary chain. We are able to bring an aggregated set of data. Not only that, we transpose the human element of trust -- the gentleman's handshake, the chit of paper and the promise to pay that's largely existed and has built has built 130 years of trade.

We are now able to transpose that into a set of electronic-form technologies -- 
Blockchain, smart contracts, cryptography, machine vision -- and we are able to take forward a technology platform that will see transactional trust being embedded well beyond my lifetime -- for generations to come.

Gardner: Joe, we have just heard how this is a problem-solution value in the diamond industry. But SAP Ariba has its eyes on many industries. What is it about the way things are done now in general business that isn't good enough but that Blockchain can help improve?

Fox: As we have spent years at Ariba solving procurement problems, we identified some of the toughest. When I saw Everledger, it occurred to me that they may have cracked the nut on one of the toughest areas of B2B trade -- and that is true understanding, visibility, and control of asset movement.

It dawned on me, too, that if you can track and trace diamonds, you can track and trace anything. I really felt like we could team up with this young company and leverage the unique way they figured out how to track and trace diamonds and apply that across a huge procurement problem. And that is, how do a supplier and a buyer manage the movement of any asset after they have purchased it? How do we actually associate that movement of the asset back to its original transactions that approved the commit-to-pay? How do you associate a digital purchase order (PO) with a digital movement of the asset, and then to the actual physical asset? That's what we really are teaming up to do.

That receipt of the asset has been a dark space in the B2B world for a long time. Sure, you can get a shipping notice, but most businesses don't do goods receipts. And as the asset flows through the supply chain -- especially the more expensive the item is -- that lack of visibility and control causes significant problems. Maybe the most important one is: overpaying for inventory to cover actual lost supply chain items in transit.

I talked to a really large UK-based telecom company and they told me that what we are going to do with Everledger, with just their fiber optics, they could cut their buying in half. Why? Because they overbuy their fiber optics to make sure they are never short on fiber optic inventory.

That precision of buying and delivery applies across the board to all merchants and all supply chains, even middle of the supply chain manufacturers. Whenever you have disruption to your inbound supply, that’s going to disrupt your profitability.

Gardner: It sounds as if what we are really doing here is getting a highly capable means -- that’s highly extensible -- to remove the margin of error from the tracking of goods, from cradle to grave.

Chain transactions

Fox: That’s exactly right. And the Internet is the enabler, because Blockchain is everywhere. Now, as the asset moves, you have the really cool stuff that Everledger has done, and other things we are going to do together – and that’s going to allow anybody from anywhere to post to the chain the asset receipt and asset movement.

For example, with a large container coming from overseas, you will have the chain record of every place that container has been. If it doesn't show up at a dock, you now have visibility as the buyer that there is a supply chain disruption. That chain being out on the Internet, at a layer that’s accessible by everyone, is one of the keys to this technology.

We are going to be focusing on connecting the fabric of the chain together with Hyperledger. Everledger builds on the Hyperledger platform. The fabric that we are going to tie into is going to directly connect those block posts back to the original transactions, like the purchase order, the invoice, the ship notice. Then the companies can see not only where their asset is, but also view it in context of the transactions that resulted in the shipment.
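The linkage Fox describes, tying each asset-movement post on the chain back to its originating documents, can be illustrated with a minimal sketch. This is a toy hash-chained ledger, not Everledger's or Hyperledger's actual API; the class and field names are all hypothetical.

```python
import hashlib
import json

class AssetLedger:
    """Toy append-only ledger: each event is hash-chained to the previous
    one and carries a reference back to its source document (purchase
    order, invoice, or ship notice)."""

    def __init__(self):
        self.blocks = []

    def post(self, asset_id, event, source_doc):
        """Record an asset movement, linked to a source transaction."""
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        record = {
            "asset_id": asset_id,
            "event": event,            # e.g. "received at dock"
            "source_doc": source_doc,  # e.g. {"type": "PO", "id": "PO-1001"}
            "prev_hash": prev_hash,
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.blocks.append(record)
        return record["hash"]

    def history(self, asset_id):
        """Every recorded movement of one asset, in order."""
        return [b for b in self.blocks if b["asset_id"] == asset_id]

    def verify(self):
        """Check that no block has been altered and the chain is intact."""
        prev = "0" * 64
        for b in self.blocks:
            if b["prev_hash"] != prev:
                return False
            body = {k: v for k, v in b.items() if k != "hash"}
            if hashlib.sha256(json.dumps(
                    body, sort_keys=True).encode()).hexdigest() != b["hash"]:
                return False
            prev = b["hash"]
        return True
```

Because each block embeds the previous block's hash, altering any past record breaks `verify()`, which is the immutability property the interview keeps returning to; the `source_doc` field is what ties a movement back to the PO or ship notice that authorized it.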

Gardner: So the old adage -- trust but verify -- we can now put that to work and truly verify. There's news taking place here at SAP Ariba LIVE between Everledger and SAP Ariba. Tell us about that, and how the two companies -- one quite small, one very large -- are going to work together.

Fox: Ariba is all-in on transforming the procurement industry, the procurement space, the processes of procurement for our customers, buyers and sellers, and we are going to partner heavily with key players like Everledger.

Part of the announcement is this partnership with Everledger around track and trace, but it is not limited to track and trace. We will leverage what they have learned across our platform of $1 trillion a year in spend, with 2.5 million companies trading assets with each other. We are going to apply this partnership to many other capabilities within that.

Kemp: I am very excited. It’s a moment in time that I think I will remember for years to come. In March we also made an important announcement with IBM on some of the work that we have done beyond identifying objects. And that is to take the next step around ensuring that we have an ethical trade platform, meaning one that is grounded in cognitive compliance.

We will be able to identify the asset, but also know, for example in the diamond industry, that a diamond has passed through the right channels, paid the dutiful taxes that are due as a part of an international trade platform, and ensure all compliance is hardened within the chain.

I am hugely excited about the opportunity that sits before me. I am sincerely grateful that such a young company has been afforded the opportunity to really show how we are going to shine.

Gardner: When it comes to open trade, removing friction from commerce, these have been goals for hundreds of years. But we really seem to be onto something that can make this highly scalable, very rich -- almost an unlimited amount of data applied to any asset, connected to a ledger that’s a fluid, movable, yet tangible resource.

Fox: That’s right.

Gardner: So where do we go next, Joe? If the sky is the limit, describe the sky for me? How big is this, and where can you take it beyond individual industries? It sounds like there is more potential here.

Reduced friction costs

Fox: There is a lot of potential. If you think about it, Blockchain is an evolution of the Internet; we are going to be able to take advantage of that.

The new evolution is that it's a structured capability across the Internet itself. It’s going to be open, and it’s going to be able to allow companies to ledger their interactions with each other. They are going to be able, in an immutable way, to track who owns which asset, where the assets are, and be able to then use that as an audit capability.

That's all very important to businesses, and until now the Internet itself has not really had a structure for business. It's been open, the Wild West. This structure for business is going to help with what I call trusted commerce because in the end businesses establish relationships because they want to do business with each other, not based on what technology they have.

Another key fact about Blockchain is that it’s going to reduce friction in global B2B. I always like to say if you just accelerated B2B payments by a few days globally, you would open up Gross Domestic Product (GDP), and economies would start growing dramatically. This friction around assets has a direct tie to how slowly money moves around the globe, and the overall cost and friction from that.

So how big could it go? Well, I think that we are going to innovate together with Everledger and other partners using the Hyperledger framework. We are going to add every buyer and seller on the Ariba Network onto the chain. They are just going to get it as part of our platform.

Then we are going to begin ledgering all the transactions that they think make sense between themselves. We are going to release a couple of key functions, such as smart contracts, so their contract business rules can be applicable in the flow of commerce -- at the time commerce is happening, not locked up in some contract, or in some drawer or Portable Document Format (PDF) file. We are going to start with those things.
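Fox's point about smart contracts, contract business rules applied in the flow of commerce rather than locked in a PDF, can be sketched as a simple rule check that runs at payment time. This is an illustrative toy, not SAP Ariba's or Hyperledger's implementation; every name and parameter here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PurchaseOrder:
    po_id: str
    sku: str
    quantity: int
    unit_price: float

def smart_contract_check(po, goods_receipt, invoice, price_tolerance=0.02):
    """Toy 'smart contract': the contract's business rules encoded as code
    that runs at the moment commerce happens. Approves payment only when
    the goods receipt and invoice both match the purchase order."""
    if goods_receipt["po_id"] != po.po_id or invoice["po_id"] != po.po_id:
        return {"approved": False, "reason": "document mismatch"}
    if goods_receipt["quantity_received"] < po.quantity:
        return {"approved": False, "reason": "short shipment"}
    if invoice["unit_price"] > po.unit_price * (1 + price_tolerance):
        return {"approved": False, "reason": "price above contracted rate"}
    return {"approved": True,
            "amount": goods_receipt["quantity_received"] * po.unit_price}
```

The design point is that the rules fire automatically when the receipt and invoice land on the chain, so a short shipment or off-contract price blocks payment at transaction time instead of surfacing in a later audit.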

I don't know what applications we are going to build beyond that, but that's the excitement of it. I think the fact that we don't know is the big play.

Gardner: From a business person’s perspective, they probably don’t care too much that it’s Blockchain enabling this, just as a lot of people didn't care 20 years ago that it was the Internet allowing them to shop online or send emails to anybody anywhere. Setting the technology aside, what's the business benefit that people should be thinking about?

Fox: Everybody wants digital trust, right? Leanne, why don’t you share some of the things you guys have been exploring?

Making the opaque transparent

Kemp: In the diamond industry, there is fraud related to document tampering. Paper certificates typically exist across the backbone of the trade, so it’s very easy to transpose those into a PDF and make changes for self-gain.

Double-financing of the pipeline is a very real problem: with invoicing and accounts receivable, companies have the ability to have banks finance the same invoice two, three, four times.

We have issues with round-tripping of diamonds through countries, where transfer pricing isn't declared correctly, along with the avoidance of tax and duties.

All of these issues are the dark side of the market. But, now we have the ability to bring transparency around any object, particularly in diamonds -- the one commodity that’s yet to have true financial products wrapped around it. Now, what do I mean by that? It doesn’t have a futures market yet. It doesn’t have exchange traded funds (ETFs), but the performance of diamonds has outperformed gold, platinum and palladium.

Now, what does this mean? It means we can bring transparency to the once opaque, know whether an object has gone through an ethical chain, and then realize the true value of that asset. This allows us to start to think about how new financial products can be formed around these assets.

We are hugely interested in rising asset classes beyond just the commodity section of the market. This platform shift is like going from the World Wide Web to the World Wide Ledger. Joe was absolutely correct when he mentioned that the Internet hasn't been woven for transactional trust -- but we have the ability to do this now.

So from a business perspective, you can begin to really innovate on top of this exponential set of technology stacks. A lot of people describe Everledger as a Blockchain company. I have to correct them and say that we are an emerging technology company. We use the very best of Blockchain, smart contracts, machine vision, and sensor data points to be able to form the identity of objects.

Now, why is that important? Most financial services companies have really been focused on Know Your Customer (KYC), but we believe that it's Know Your Object (KYO) that really creates an entirely new context around it.

Now, that transformation and the relationship of the object have already started to move. When you think about the Internet of Things (IoT), mobile phones, and autonomous cars -- these are largely devices connected to the fabric of the web. But are they connected to the fabric of the transactions and the identity around those objects?

Insurance companies have begun to understand this. My work in the last 10 years has been deeply involved in insurance. As you begin to build and understand the chain of trust and the chain of risk, then tectonic plate shifts in financial services begin to unfold.

Apps and assets, on and off the chain

Fox: It’s not just about the chain, it's about the apps we build on top, and it's really about what is the value to the buyer and the seller as we build those apps on top.

To Leanne’s point, it’s first going to be about the object. The funny thing is we have struggled to provide visibility and control of an object in a digital way, and this is going to fix that. In the end, B2B, which is where SAP Ariba is, is about somebody getting something and paying for it. And that physical asset they are getting is being paid for with another asset. They are just two different forms. By digitizing both and keeping that in a ledger that really cannot be altered -- it will be the truth, and it's open to everyone, buyers and sellers.

Businesses will have to invent ways to control how frictionless this is going to be. I will give you a perfect example. In the past if I told you I could do an international payment of $1 million to somebody in two minutes, you would have told me I was crazy. With Blockchain, one corporation can pay another corporation $1 million in two minutes, internationally.

And on the chain companies like Everledger can build capabilities that do the currency translation on the fly, as it’s passing through, and that doesn’t disintermediate the banks, because how did the $1 million get onto the chain in the first place? Someone put it on the chain through a bank. The bank is backing that digital version. How does it get off the chain so you can actually do something with it? It goes through another bank. It’s actually going to make the banks more important. Again, Blockchain is only as good as the external trust that it inherits.

I really think we have to focus on getting the chain out there and really building these applications on top.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.


Copyright © 2017 BBBT - All Rights Reserved