The Race is On: How Blockchain Will Change Governments

We are at a global crossing point, and if we manage to solve some of the problems we have created in the past decades, the world can become a better place. When the internet was developed 28 years ago, the objective was to create a decentralised internet. Somehow, however, we ended up with an internet that is in the hands of a few very powerful companies. As a result, the internet can go down for millions of people if one of those companies has a problem. This was the case a few months ago, when AWS experienced ‘a glitch’, or, more accurately, a typo, which brought down the internet for millions of users.

Sir Tim Berners-Lee explained it eloquently during the Decentralised Web Summit in 2016:

“The web was designed to be decentralised so that everybody could participate by having their own domain and having their own webserver, and this hasn’t worked out. Instead, we’ve got the situation where individual personal data has been locked up in these silos.” - Sir Tim Berners-Lee

Fortunately, a new technology has appeared that can solve the existing problems with the internet and bring it back to its origins; a decentralised web, where everyone can participate ...


Read More on Datafloq
The Latest Trends in Big Data Analytics to Watch Out For

Big data has become one of the hottest topics of discussion over the past few years. It plays a pivotal role in various aspects of businesses across industries and is a favorite subject of academics as well. The ability to mine massive volumes of data from a myriad of sources to analyze and gain insight has radically altered the dynamics of business functions, marketing, and sales.

Not just major corporations but even start-ups can now taste success by effectively uncovering the insights hidden in their data. Digging into, analyzing, managing, and manipulating big data is fairly easy now, and most companies can do it at minimal cost.

Since the start of 2017, we have seen businesses far outgrow the basic concept of simply converting data into insights; they can now use data to derive actionable, directive principles. The focus is now on mining data effectively in light of actual organizational goals and on precisely targeting specific products and services.

This insight-driven approach will strengthen further by the third quarter of 2017, facilitating enhanced customer experience, market competitiveness, advanced security, and increased operational efficiency.

For example, a ...


Read More on Datafloq
Data in Marketing – The Key to Future Strategy

What do you think when you read the first line of an unsolicited email that begins with the words "Dear Buyer" or "Dear Manager"? If you have not heard from the company before, or you are not the manager of anything, it is even more peculiar. Whatever reputation that company was trying to develop, they've just lost it.

Many will delete the email and simply conclude that this is poor marketing - which is true. But what it tells me is that the company that sent the email has not cleaned up its database lately. The reason behind these common marketing mistakes is simple: a gap has developed between the marketing team and data expertise.

Every year, the quality of our lead generation data is improving. Accuracy, strategic planning and conversion rates are all becoming more and more dependent on marketing where big data is having its impact. And yet, many organisations are not where they need to be regarding data management and application. This is most evident within the digital marketing sector.

Data and Marketing

A 2017 Data-driven Marketing Report by Jaywing bears this out. In their survey of the marketing industry, 92% said that data management was a top priority for their business. However, 40% stated that ...


Read More on Datafloq
The Amazing Disruption in Business Intelligence

The world of business intelligence used to be rather basic and only included things like basic surveys and employee evaluations. However, the space has gotten a major makeover in recent years with the growth of new disruptive technologies and strategies. That disruption has had huge effects in a number of areas and is changing how companies across all industries do business.

Cloud-Based Analytics

At its core, business intelligence is all about using data to make strategic decisions. The amount of data companies could store and use was traditionally limited by human capabilities and storage functions. However, the growth of cloud services has allowed companies to tap into vast amounts of data that is more complex than ever before. Use of the cloud also allows businesses to share that data seamlessly between various departments and remote employees.

Instead of the old model of dividing up different aspects of business intelligence to various services and vendors, companies can now consolidate everything in the cloud to make it easy to store, analyze, and access whenever needed. This means the departments can share data and analytics to form a cohesive company goal and keep everyone on the same page.

Big Data-Driven Marketing

The use of the cloud and more ...


Read More on Datafloq
A Beginner’s Look at Big Data and its Benefits for Your Business

Not many people are familiar with the term "big data," yet big data affects everyone, and surprisingly there is little education or training on its importance. In layman's terms, the phrase refers to large and complex sets of data that can be computationally analysed to reveal trends and patterns. To give a few examples, big data is collected on your supermarket loyalty card as you shop, on your social media accounts, and on your smartphone.
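As a toy illustration of the loyalty-card example (the customer IDs, products, and counts below are invented), a few lines of Python can already surface purchasing trends; real retailers do the same thing at vastly larger scale with distributed tools:

```python
from collections import Counter

# Hypothetical loyalty-card records: (customer_id, product) pairs.
purchases = [
    ("c1", "milk"), ("c2", "bread"), ("c1", "bread"),
    ("c3", "milk"), ("c2", "milk"), ("c1", "milk"),
]

# Tally how often each product is bought to surface simple trends.
product_counts = Counter(product for _, product in purchases)

# The most popular products could drive personalised offers.
print(product_counts.most_common(2))  # [('milk', 4), ('bread', 2)]
```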

Uses of Big Data

Data collected from your supermarket loyalty card can be used by stores to identify trends and to create personalized offers and deals, benefiting both the organization and its customers. Data collected from people's social media accounts can be anonymised and used to improve users' online experience, among other things. Data collected from satellites and smartphones can be used to determine the most popular routes in a city or town and help improve public transport. Local authorities can use big data to determine where social care is needed the ...


Read More on Datafloq
DomoPalooza 2017: Flare, Stravaganza…and Effective Business Management

Logo courtesy of DOMO, Inc.
When you decide to show up at Domopalooza, Domo’s big user event, you don’t know for sure what you will find, but from the very beginning you can feel that you’ll have a unique experience. From the individual sessions and training, the partner summit and the concert line-up, to whatever might come from Domo’s CEO/rock star Josh James, who certainly is one of a kind in the software industry, you know that you’ll witness a delightful event.

This year, under the strings of Styx, Mr. James kicked off an event that amalgamated business, entertainment, fun and work in a unique way —a very Domo way.

With no more preambles, here is a summary of what happened during Domo’s 2017 DomoPalooza user conference.

Josh James at DomoPalooza 2017 (Photo courtesy of DOMO)
Key Announcements

Before entering the subjective domain of my opinion about Domo’s event and solutions, let’s take a minute to pinpoint some of the important announcements made before and during the event:
  • The first piece of news came some days before the user event, when Domo announced its new model for rapid-deployment dashboards. This solution consists of a series of tools that accelerate and ease the dashboard deployment process. From its large number of connectors to diverse data sources, to a set of pre-installed, easy-to-configure dashboards, this model will enable developers to quickly and easily deploy dashboards that decision makers can use effectively.
  • The next important announcement occurred during the conference: Domo released Mr. Roboto —DOMO’s new set of capabilities for machine learning, predictive analytics and predictive intelligence. According to DOMO, the new offering will be fully integrated within DOMO’s business cloud, aiming for fast and non-disruptive business adoption. Mr. Roboto includes two major features: Alerts Center, a personalized visual console powered by advanced analytics functionality to provide insights and improve decision making, and a data science interface that enables users to apply predictive analytics, machine learning and other advanced analytics algorithms to their data sets. This is certainly one product I’m looking forward to analyzing further!

The company also introduced new features aimed at narrowing the technical-business gap within an organization’s C-suite and giving decision makers easier, customized access to insights, enabling business management and monitoring using DOMO. These features include:
  • Annotations, so information workers and decision makers can highlight significant insights on top of a chart or data point.
  • An enhancement to its Analyzer tool: a visual data lineage tool that enables users to track data from source to visualization.
  • Data slicing within DOMO’s cards to create more guided analysis paths that business users and decision makers can take advantage of.
  • More than 60 chart families to enhance the rich set of visual options already within DOMO’s platform.

DOMO’s new features seem to fit well within a renewed effort from the company to address bigger enterprise markets and increase its presence in segments traditionally occupied by other enterprise BI contenders.

It may also signal DOMO’s necessary adaptation to a market currently racing to include advanced analytics features that address larger and newer user footprints within organizations, such as data scientists and a new, more tech-savvy generation of information workers.

There is much more behind Domo’s Curtains

Perhaps the thing I enjoyed most about the conference was a continuous sense of discovery —different from previous interactions with DOMO, which somehow left me with a sense of incompleteness. This time I had the chance to discover that there is much more to DOMO behind the curtains.

Having a luminary such as Josh James as CEO can be a double-edged sword. On one side, his glowing personality has served well to enhance DOMO’s presence in a difficult and competitive market. Josh has the type of personality that attracts, creates and sells the message, and without doubt drives the business.

On the other side, however, if not backed and handled correctly, his strong message can create some scepticism, making some people think a company is all about a message and less about substance. But this year’s conference helped me discover that DOMO is far more than what can be seen on the surface.

It is no surprise that Josh and Chris Harrington —savvy businessmen and smart guys— have been keen to develop DOMO’s business intelligence and analytics capabilities to achieve business efficiency, working to translate technical complexity into business-oriented ease of use. To achieve this, DOMO has put together, on the technical side, a very knowledgeable team led by Catherine Wong and Daren Thayne, DOMO’s Chief Product Officer and Chief Technology Officer respectively, both with wide experience ranging from cloud platforms and information management to data visualization and analysis. On the business side, an experienced team that includes tech veterans such as Jay Heglar and Paul Weiskopf leads strategy and corporate development, respectively.

From a team perspective, this balance between tech experience and business innovation seems to be paying off: according to the company, it has been growing steadily and gaining the favour of big customers such as Target, Univision and Sephora, some of which were present during the event.


From an enterprise BI/Analytics perspective, it seems DOMO has achieved a good balance in at least two major aspects that ensure BI adoption and consumption:

  • The way BI services can be offered to different user groups —especially the C-level team— which requires a special degree of simplification alongside efficiency in the way the data is shown.
  • The way BI services can encapsulate complex data processing problems and hide them from the business user. 


On this topic, during the conference we had the chance to see examples of both aspects, onstage and offstage. One came from Christel Bouvron, Head of Business Intelligence at Sephora Southeast Asia, who commented on the adoption and use of DOMO:

“We were able to hook in our data sets really quickly. I had sketched out some charts of what I wanted. They didn’t do that, but what they did was even better. I really liked that it wasn’t simply what I was asking for – they were trying to get at the business problem, the outcomes we were trying to get from it, and think about the bigger picture.”

A good example of the shift DOMO wants to convey is the change in approach: from addressing a business problem from a technical perspective to addressing it from a business perspective, with a technical platform in the background to support it. Of course, this needs to come with the ability to effectively encapsulate technical difficulties in a way that is efficient and consumable for the business.

Christel Bouvron at DomoPalooza 2017 (Photo courtesy of DOMO)

It was also good to hear from the customers that they acknowledge that the process wasn’t always that smooth, but it helped to trigger an important cultural shift within their organization.

The takeaway

Attending Domopalooza 2017 was informative and very cool indeed. DOMO’s team showed me a thing or two about the true business of DOMO and its interaction with real customers, including the fact that DOMO is not a monolithic solution. Beyond its already rich set of features, it enables key customization so individual customers can solve their problems in their own ways. While DOMO is a software company rather than a services company, customers expressed satisfaction with the degree of customization and services DOMO provides —this was especially true of large companies.

DOMO has done a great job of simplifying the data consumption process so that data feeds are digestible. The solution concentrates more on the business problem than on the technical one, giving many companies the flexibility and time to make the development of business intelligence solutions more agile and effective. Although these results might not be fully achieved in all cases, DOMO’s approach can certainly help organizations form a more agile and faster deployment process, and thus become more efficient and productive.

Despite being a cloud-based software company, DOMO seems to understand quite well that a great number of companies work, by necessity or by choice, in hybrid cloud/on-premises environments. Its platform lets customers easily connect to and quickly interact with on-premises systems, whether that means a simple connection to a database or table source or more sophisticated data extraction and transformation specifications.

In the BI and analytics market, no company —DOMO or any other player— gets a free ticket to success. The business intelligence market is diversifying as an increasing number of companies seem to need its services, but DOMO’s offering is, by all means, one to consider when evaluating a new-generation BI solution to meet the increasing demand for insights and data analysis.

Finally, well... what better excuse to watch Styx's Mr. Roboto than this?



(All photos credited to Domo, Inc.)
Why Data Governance is the Foundation of a Healthcare Big Data Strategy

Big data is everywhere, and many businesses are using it to improve their processes and strategies with great success. But one area where big data seems to be lagging is in healthcare. Many healthcare institutions want to adopt and expand their usage of big data, but in order to do so, they’ll need to focus on data governance.

What is data governance?

In a nutshell, data governance is what keeps data safe, secure, and up to an organization’s standards. Data is everywhere around us, and patients and consumers are used to being able to access the information they need almost as soon as they need it. However, that accessibility comes with a cost that many people don’t see—it puts much of the data at a greater risk of being hacked and stolen.

Data governance works to fight that and to secure data by creating systems so that users can trust their data. In a comprehensive data governance program, users are responsible for creating quality data and using it in a secure and ethical manner with proper authorization. In healthcare, this often comes down to establishing data governance principles that ensure data is consistent and reliable.
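Governance programs of this kind typically turn their standards into automated checks. As a minimal sketch (the field names and rules below are hypothetical, not any specific product's API), a quality gate might validate each record before it enters a shared store:

```python
# All required fields and rules here are illustrative examples only.
REQUIRED_FIELDS = {"patient_id", "timestamp", "source_system"}

def validate_record(record):
    """Return a list of governance violations found in one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing required fields: " + ", ".join(sorted(missing)))
    if "patient_id" in record and not str(record["patient_id"]).strip():
        errors.append("patient_id must not be empty")
    return errors

# A record that lacks its source system fails the gate.
print(validate_record({"patient_id": "P-001", "timestamp": "2017-05-01T10:00:00"}))
# ['missing required fields: source_system']
```

In a real program, checks like these would run automatically in the data pipeline, so that only consistent, reliable records reach downstream users.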

Why is it important?

The overall goal of data governance ...


Read More on Datafloq
What is the Future of the Internet of Things in Health?

The Internet of Things (IoT) is a broad term referring to all pieces of technology that connect to the Internet and each other. A subset of the Internet of Things is the Internet of Healthcare Things (IoHT). This refers to all pieces of Internet-connected technology that apply to the healthcare industry.

The IoHT and development of new technologies in the healthcare field has made significant strides in improving patient care. It is not just about maintaining records and communicating with patients. If household appliances and business technology can be added to the IoT, then medical devices can as well.  

A fully connected IoHT can enable practitioners to provide individually customized data-based treatments. If patient records are made readily available, the provider can view a comprehensive medical history cross-referenced with data-based treatment research successfully administered to similar patients. The more data available regarding current treatments, medications, and patient history, the better care the patient will receive.

The IoHT will not only benefit patients on a personal level, but it will streamline the healthcare process by allowing practitioners to communicate and instantly access patient information with the appropriate permissions. Let’s say, for instance, that a patient resides in Arizona. However, while vacationing in California, ...


Read More on Datafloq
How to Leverage AI for Cybersecurity Assurance

In the game of cybersecurity, humans are the weakest link and usually the cause of unwanted breaches. Even highly educated and influential individuals can become unsuspecting targets of cyber-attacks due to a lack of vigilance and taking security matters too lightly. Until now, this war was mainly human versus human. Things are about to change with the introduction of AI as the future of cybersecurity. It will soon be a contest of supercomputers against each other, much as it already is in the world of automated trading.

Challenges of cybersecurity

Human nature

The biggest threat to cybersecurity at this time is people's careless attitude towards passwords. The complexity of currently available algorithms means it would take computers years of continuous work to break them through brute force. However, users jeopardize their online safety by using simple passwords such as "123456" or their pet's name. Most users release a significant amount of personal data through social media every day. It could take seconds for an AI-powered algorithm to break into sensitive accounts by leveraging user-generated content freely available online.
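The gap between weak and strong passwords is easy to demonstrate. In this sketch (the word list is a tiny stand-in for the millions of entries a real attacker would use), a common password falls on the first guess while a random one is never found:

```python
# A tiny stand-in for the huge leaked-password lists real attackers use.
COMMON_PASSWORDS = ["123456", "password", "qwerty", "letmein", "rex"]

def dictionary_attack(target, wordlist):
    """Return the number of guesses needed to find target, or None."""
    for guesses, candidate in enumerate(wordlist, start=1):
        if candidate == target:
            return guesses
    return None

print(dictionary_attack("123456", COMMON_PASSWORDS))      # 1: found instantly
print(dictionary_attack("v8#Kq2!zLp", COMMON_PASSWORDS))  # None: not in the list
```

The same loop running against billions of candidates per second is why a password drawn from everyday words or personal details offers almost no protection.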

Although awareness of cyber threats is high, there has been no notable change in people's behavior that leads to better protection. We are afraid of ...


Read More on Datafloq
6 Signs Your Company Needs a New Data Strategy

Big Data is not just the latest jargon to creep into executive meetings; it is becoming an essential business practice for most organisations today. Over the years, businesses have become aware of the insights they can gain from data analytics and are collecting increasing amounts of data. Yet many businesses do not have a proper data strategy in place and are simply collecting data in a frenzy. There is a difference between Big Data and having lots of data. Collecting data just for the sake of it, in hopes of using it in the future, is not only bad business practice; it also leads to potentially costly problems for your company.

Here is a list of issues that companies without a proper data strategy may face. If your company is experiencing any of these problems, it is a tell-tale sign that you need to review your company’s data strategy:

1. Storing data is starting to cost more

Even though the price of data storage has plummeted over the years, a poor data strategy will lead to high data storage costs. According to Experian, an information services company in the US, “77 percent of CIOs believe data is a valuable asset in their ...


Read More on Datafloq
4 Tech Innovations That Are Changing the Supply Chain Industry

Third-party logistics companies (3PLs) and the supply chain have always relied on technology to meet speed, cost, and quality demands. Technology has always been the driver for meeting customer demands, especially when it comes to supply chains. The emergence of railroad systems in the 19th century, as well as the proliferation of trucks and other automobiles in the 1900s, had a big impact on the supply chain. These cost-effective technologies helped companies make deliveries faster and in larger quantities. Now, in the 21st century, the internet is on the verge of transforming the supply chain in ways that we couldn't have anticipated.

When Amazon announced its Prime Air services program two years ago, people were stunned, which is understandable. Has logistics technology become so advanced that online shopping orders can be delivered by drones? Well, not yet, but this is not a joke. Tests are underway on similar programs in Israel and the United Kingdom. The 21st century is the age of the internet, which is quickly transforming everything from customers' shopping experiences to how companies fulfill their orders. Here, we shed some light on emerging supply chain technologies expected ...


Read More on Datafloq
Enterprise Journey to Becoming Digital

Do you want to be a digital enterprise? Do you want to master the art of transforming yourself and be at the forefront of the digital realm?

How can you change your business to achieve this?

Derive new values for yourself, and find better and more innovative ways of working. Put customer experience above and beyond everything as you find methodologies to support the rapidly changing demands of the digital world.

Your transformation will be successful only when you identify and practice appropriate principles, embrace a dual strategy that enhances your business capabilities and switch to agile methodologies if you have not done it already.

The journey to becoming a digital maestro and achieving transformation traverses through four main phases.


  • Becoming a top-notch expert with industrialized IT services – by adopting six main principles
  • Switching to agile operations to achieve maximum efficiency – so that you enjoy simplicity, rationality and automation
  • Creating an engaging experience for your consumers using analytics, revenue and customer management – because your customers come first; their needs and convenience should be your topmost priority
  • Availing opportunities for digital services – assessing your security and managing your risks


Becoming a top-notch expert with industrialized IT services

There are five key transformation principles that can help you realize the ...


Read More on Datafloq
4 Ways Big Data Will Improve Road Safety

With over 40,000 deaths each year from traffic-related collisions and accidents, it is clear that improving road safety is a top priority across the nation. Advances in technology are helping reduce accidents and improving overall driver safety through a variety of methods. Here are the top 4 ways that technology will help improve road safety in the coming years.

#1 Data Collection

Along with computer-controlled vehicles, data collection is vital to understanding where and why accidents happen. The black box technology famously used to track airplanes and help identify the cause of crashes is now being used in other vehicles. Black box technology is fairly simple, inexpensive, and easy to deploy across a wide range of cars.
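As a rough sketch of what black box data looks like in practice (the readings and field layout below are invented for illustration), investigators might summarize a trace like this:

```python
from statistics import mean

# Invented black-box readings for the seconds before a collision:
# (timestamp_s, speed_kmh, braking).
readings = [
    (0.0, 62.0, False),
    (1.0, 64.0, False),
    (2.0, 58.0, True),
    (3.0, 31.0, True),
]

# Two things an investigator might extract: average speed
# over the trace, and the moment braking began.
avg_speed = mean(speed for _, speed, _ in readings)
first_brake = next(t for t, _, braking in readings if braking)

print(f"average speed {avg_speed:.2f} km/h; braking began at t={first_brake}s")
# average speed 53.75 km/h; braking began at t=2.0s
```

Aggregated across thousands of vehicles, summaries like these are what let analysts spot the trends behind crashes.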

The benefit of the technology is that we will be able to track the exact time, speed, position, and other factors related to car collisions and accidents. As this data is studied, we will better understand the trends and reasons behind car crashes and use the data to prevent future incidents. South Korea was the first country to deploy black box technology in its taxi services and immediately noticed a 14 percent decrease in traffic accidents the following ...


Read More on Datafloq
Using Online Reviews and Big Data for Positive Impact

For years, corporations have used big data to make decisions and drive strategy, but this is no longer enough. Companies aren't using their data correctly, and while the concept remains popular, current methods of using big data aren't meeting the need. Corporate researchers and marketing experts still use data without supplying the proper context. Saying that one data point is up or another has decreased makes no sense without the full story. Changemakers need to know the goals and what changes will be required to get there. Visionary companies are using smart data instead of big data: smart data is timely and is used to help transform business operations.

To see the difference, let's start with review data. Most of those in the service industry consider review data essential to successful operations. Reviews tend to shape a consumer's shopping experience. Today's consumers are savvy enough to look at a company's or product's reviews before purchasing. Positive reviews have a positive impact; negative reviews have the opposite effect, and consumers don't spend their money with that company or on that product. Additionally, negative reviews tend to attract more negative reviews. However, asking customers for reviews is part of the ...


Read More on Datafloq
Digital Transformation: What it is and Why It Matters

Digital technologies play an ever-increasing role in our daily lives. Companies must follow suit to provide consumers with the digitally connected experiences they expect. Digital transformation takes commitment, but it is no longer optional for brands that want to stay competitive. Consumers will simply turn away from companies that aren't keeping pace, in favor of a brand that can provide a connected experience across digital channels.

More brands are renewing their commitment to digital strategies. According to Forrester, “A Digital Strategy allows you to understand the who, what, when and where of listening and responding to consumers, bridging brand experiences, iterating offerings, and collecting and activating consumer relationships in order to accomplish an actionable and measurable objective.”

Digital Transformation Objectives

Through investments in technology, the essence of digital strategy is to enhance your customers' experience and increase your organization's competitive advantage. According to a survey conducted by Altimeter, digital transformation is led by the CMO (Chief Marketing Officer) 54% of the time, by the CEO 42%, by the CIO/CTO 29%, and by others (including the CDO) 20%.

Based on the research, survey respondents rated the following as Very Important:


80% – Improving processes that expedite changes to digital properties such as website updates, new social platforms and new mobile platforms.
71% ...


Read More on Datafloq
How Big Data Changed Online Dating

Most young men would have considered happy hour at Chainsaw Sisters Saloon a target-rich environment. The place was packed and the drinks were cheap. Still, the odds of “getting lucky” were very low. Millennials know empirically that bar crawling is for recreation, not for low-percentage, time-wasting, archaic mating rituals. There are many dating apps and sites available if you wish to meet someone.

Space is Crowded

The major players in dating include eHarmony, Chemistry.com, and Match.com for romance, and they all promise long-lasting relationships. Niche sites like JDate.com (for Jewish singles), BlackPeopleMeet.com (for African Americans looking to connect), ChristianMingle.com (for Christians seeking singles with similar values) and OurTime.com (for serious daters over the age of 50) provide value propositions their names make obvious.

Tinder is the undisputed leader in the mobile-first arena. There are numerous other offerings, but not a single app comes close to Tinder's market share. Zoosk, OkCupid, and Hinge are all players, and niche apps like The League (“curated” members must be chosen to join), Bumble (women must begin the conversation), Happn (dating based on location) and JSwipe (a Jewish Tinder) have all found an ...


Read More on Datafloq
What Big Data Means for the Future of Health

Doctors and scientists are always looking for ways to improve the health and wellness of society. The industry thrives on the introduction of new technology and medical techniques. It is this constant effort to improve medical outcomes that pushes doctors, nurses, and scientists to find the best healthcare solutions. One big technological breakthrough that has helped these solutions come to light is big data.

Big data, when used correctly, has improved technology, communication and now healthcare. From improving medical transcriptionist schools to advancing the technology used in a hospital, big data has made its way to the medical world. Data analytics has helped doctors improve the health of their patients and will continue doing so in the future. As data is aggregated, doctors have been able to use it to improve the overall health of the public, identify and provide treatment for patients, find the right program/procedure for a patient, and communicate with healthcare providers.

Improve Public Health

Big data has the ability to analyze disease patterns and record sudden outbreaks. It also allows public health to be monitored and tracked. When disease data and medical information of the public gets recorded, these two pieces can improve the overall health of the population. ...


Read More on Datafloq
How to Balance Analytics Agility and Stability

How to Balance Analytics Agility and Stability

There have been many science fiction stories (as well as video games!) that revolve around the tradeoffs between powerful, strong, hard to harm combatants and those that are small, nimble, but easy to harm. Both have their merits and both can be useful in different situations. However, the same profile doesn’t work best in every situation.

There are situations in a fight where being fast is more important than being strong. There are also cases where being able to stand your ground is more important than speed. The same is true with analytics. At times, agility is more important. At other times, stability is more important. The key is to know when you need which option.

The Flaws Of Targeting Only Stability

Many organizations struggle to progress effectively in their analytics journeys because of too large a focus on stability. During the exploration and discovery process, nearly all the stringent rules applied to a mission-critical production process are enforced from the start. This makes it very difficult to go in new directions and drives costs up much higher than they need to be. After all, if I just want to quickly see if an idea has merit, I don’t need my process to ...


Read More on Datafloq
Why Control Is Necessary For Explainable Artificial Intelligence

Why Control Is Necessary For Explainable Artificial Intelligence

Artificial Intelligence is a powerful and useful technology that has taken the world by storm. As beneficial as it is, it also has its drawbacks. No matter how efficient machines are, they only execute algorithms, and those algorithms should not be blindly trusted, for several reasons.

An algorithm may contain a genuine error that leads to actions different from those expected or desired. A bug or malware may find its way into the code, causing a machine to behave abnormally. Programmers may even deliberately write incorrect code for ulterior motives. This is why explainable artificial intelligence is the way to go.

With explainable artificial intelligence, every user will be able to understand how a machine works. Besides, the machines will come with a high level of transparency and accountability. Every machine should be able to explain why certain actions need to be taken to its users.

It should also explain why that is the best option and why other alternatives may not work out for a particular situation. Explainable artificial intelligence also aims at making it obvious to users when a particular machine has failed on a particular task and when it has succeeded in ...


Read More on Datafloq
How the Mortgage Industry is Being Reshaped by Big Data

How the Mortgage Industry is Being Reshaped by Big Data

What goes into getting a mortgage or refinancing your home? A fairly individualized process that hinges on both the lending bank and the borrowers’ history, the mortgage industry is on the brink of transformation – and it’s all because of big data. From the initial application process to ongoing loan servicing, data banks are dictating lending in an unprecedented way.

What does big data mean for your mortgage? Here’s an inside look at a changing industry.

Faster Decisions

As noted, loan approvals used to be fairly individualized because isolated loan officers looked at isolated pieces of information and decided whether or not you qualified. The onboarding process took a long time and was prone to missing key factors because, while there was more data out there that could have been placed at their disposal, most of it was just warehoused and not accessible. Now computers can cross-reference information during onboarding to confirm that borrowers aren’t submitting conflicting information or disguising their financial history.

Following this high speed onboarding process using immediately accessible data banks, loan officers are able to make immediate decisions about your qualifications. Actually, a computer makes the decision and feeds that decision back to the staff members, but this makes for ...


Read More on Datafloq
How Artificial Intelligence will Transform IT Operations and DevOps

How Artificial Intelligence will Transform IT Operations and DevOps

To state that DevOps and IT operations teams will face new challenges in the coming years sounds a bit redundant, as their core responsibility is to solve problems and overcome challenges. However, at the dramatic pace at which the current landscape of processes, technologies, and tools is changing, coping has become quite problematic. Moreover, the pressure business users have been putting on DevOps and IT operations teams is staggering: they demand that everything be solved with a tap on an app. At the backend, however, handling issues is a different ball game; users can’t even imagine how difficult it is to find a problem and solve it.

One of the biggest challenges IT operations and DevOps teams face nowadays is pinpointing the small yet potentially harmful issues in the large streams of Big Data being logged in their environment. Put simply, it is just like finding a needle in a haystack.

If you work in the IT department of a company with online presence that boasts 24/7 availability, here is a scenario that may sound familiar to you. Assume that you get a call in the middle of the night from an angry customer or ...


Read More on Datafloq
Big Data Continues Serving Patients in Increasingly More Ways

Big Data Continues Serving Patients in Increasingly More Ways

From marketing to healthcare, across all industries, big data has become the next disruptive technology. The healthcare industry specifically has consistently been a late technology adopter; however, it is starting to uncover new ways to optimize care and serve patients using big data. These are the top five ways big data is helping improve patients’ lives.

Big data helps doctors determine the best treatment

Dr. Anil Jain, a doctor at the Cleveland Clinic, wished he could see and analyze diabetic patient data so he could determine the best treatment plan after he noticed diabetic patients often had the same two or three concurrent medical issues. Jain thought if he had access to aggregated patient data, he and other doctors would be able to create better outcomes for their patients.

This led the Cleveland Clinic to create a program called Explorys, which gave doctors the ability to put in data and then sift through the data to “identify patient risk factors, track outcomes and evaluate treatment success.” IBM purchased Explorys in an effort to improve its cloud offerings using the supercomputer Watson.

IBM’s Watson is also being used as a tool to help oncologists determine the best treatment route for their cancer patients by using big data ...


Read More on Datafloq
How Data Analytics is Transforming Healthcare Systems

How Data Analytics is Transforming Healthcare Systems

Big Data Analytics is entirely transforming business paradigms. Automated databases are enabling businesses to perform mundane tasks more efficiently. And, the commercial sector isn’t the only area to benefit from data analytics. Its impact is widespread and is being seen across many different sectors, including healthcare.

Access to healthcare is a basic human need. However, the U.S. healthcare sector is extremely expensive, even when compared to other developed economies. In the United States, the burden of the expense ultimately falls on the consumer, since the sector is mostly dominated by private companies. America, however, ends up spending more on its public healthcare system than countries where the system is fully publicly funded.

Under such circumstances, where people are paying a significantly higher price, they deserve a service that matches the price tag. The question is then: how can data analytics help increase the efficiency of healthcare systems in the United States and around the world?



Performance Evaluation

Keeping a tab on hospital activities by maintaining relevant databases can help administrators find inefficiencies in service provision. Based on the results found from data analysis, specific actions can be taken to reduce the overall costs of a healthcare facility. Reduced costs may be reflected in the ...


Read More on Datafloq
How Big Data And Logistics are Working Together

How Big Data And Logistics are Working Together

Logistics companies have launched many prototype projects to exploit big data analysis, several of which will soon be part of our everyday lives. These include using Spark on Hadoop for real-time analysis of large data volumes stored in registers' logs, databases, Excel files, or HDFS, which has completely changed business dynamics. Here are some of the big data projects related to the logistics sector:

Volume Analysis

Logistics companies seeking to optimize budgets and resource allocation have always grappled with the problem of predicting parcel volume on a given day of the year, month, or week. Logistics companies are currently investing in this area to determine patterns that help predict peak volumes. It is an ideal use case, since data scientists are able to generate recommendations by running batch analysis.
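In its simplest form, a batch analysis like the one described might flag peak days by comparing each day's volume against a trailing average. The sketch below is a toy illustration under that assumption, not part of any real logistics product; the threshold factor is arbitrary:

```python
# Toy batch analysis: flag days whose parcel volume exceeds the trailing
# average by 50% -- a crude stand-in for the peak-volume patterns described.

def trailing_average(volumes, window=7):
    """Average of up to the last `window` observations before each day."""
    averages = []
    for i in range(len(volumes)):
        past = volumes[max(0, i - window):i]
        averages.append(sum(past) / len(past) if past else volumes[i])
    return averages

def flag_peaks(volumes, window=7, factor=1.5):
    """Return indices of days whose volume exceeds factor x trailing average."""
    avgs = trailing_average(volumes, window)
    return [i for i, (v, a) in enumerate(zip(volumes, avgs)) if v > factor * a]

daily_volumes = [100, 110, 95, 105, 102, 300, 98]  # day 5 is a spike
assert flag_peaks(daily_volumes) == [5]
```

A production system would of course learn seasonality from years of history rather than a fixed window, but the shape of the computation, batch over historical volumes, recommendations out, is the same.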

Parcel Health Data

It is important for the transportation of medicines and other commodities in general to be done in a controlled environment. For instance, some medications should be stored between 2 and 8 degrees Celsius. Some types of equipment are fragile and require extra care while handling. It is quite costly for logistics companies and even the end-consumer to manage the whole process. This is why companies are ...


Read More on Datafloq
Try Cassandra with This JVM for a Flawless Experience

Try Cassandra with This JVM for a Flawless Experience

While working with big data, professionals often encounter questions like which database is best. There is no single answer to a question like this, because a professional weighs many factors when deciding which technology to work with. The choice is sometimes a single piece of software and, more often, a combination that has worked well previously.

This blog is centered upon the use of Apache Cassandra with Zing. Cassandra is a NoSQL database management system (DBMS). It has some unique features that can ease your routine data management work. Zing, on the other hand, is a JVM designed to deliver high performance. Let’s look at the ways in which this combination can accelerate the performance of your systems.

Known Issues in Cassandra

Cassandra and Zing complement each other; in other words, using Cassandra backed by Zing takes care of the pain points that traditional Cassandra users have been encountering. Two prominent issues affect performance.


Memory Settings: Databases are meant for storing data, and in Cassandra there are a few issues, such as setting limits for memtables to avoid running over other important data. ...
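For illustration, the memtable limits mentioned above are controlled in `cassandra.yaml`. The values below are example settings to adapt to your own heap size, not recommendations:

```yaml
# cassandra.yaml -- example memtable memory limits (tune to your heap size)
memtable_heap_space_in_mb: 2048            # on-heap space memtables may use
memtable_offheap_space_in_mb: 2048         # off-heap space memtables may use
memtable_allocation_type: offheap_buffers  # move buffers off the Java heap
memtable_cleanup_threshold: 0.11           # flush when this fraction is dirty
```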


Read More on Datafloq
Ways Hackers Steal Your Data (And How to Defend Yourself)

Ways Hackers Steal Your Data (And How to Defend Yourself)

For non-technophiles, online communication is as simple as clicking “Send” in an email client. But in reality, the entire process involves a series of precise mechanisms that took decades to develop.

Suppose you are to send a photo of your last trip to Panama. Upon sending, the picture’s data gets broken down into “packets� that are typically no bigger than 1,500 bytes each. Once these packets reach the intended recipient, a computer reassembles them back into an image – ready to be viewed by humans.
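The split-and-reassemble step can be sketched in a few lines of Python. The 1,500-byte limit mirrors a typical Ethernet MTU; the function names are illustrative, and real TCP/IP adds headers, sequence numbers and checksums that this sketch omits:

```python
# Split a payload into packets of at most 1,500 bytes, then reassemble.
# Illustrative only; real protocols add headers, sequencing and checksums.

MTU = 1500  # typical Ethernet maximum transmission unit, in bytes

def packetize(data: bytes, mtu: int = MTU) -> list:
    """Break raw bytes into MTU-sized chunks."""
    return [data[i:i + mtu] for i in range(0, len(data), mtu)]

def reassemble(packets: list) -> bytes:
    """Concatenate packets back into the original payload."""
    return b"".join(packets)

photo = b"\x89PNG" + b"\x00" * 4000  # a pretend 4 KB image
packets = packetize(photo)
assert all(len(p) <= MTU for p in packets)
assert reassemble(packets) == photo
```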

Today, internet technology has become so efficient that, on an average internet connection, up to 5.1 megabytes of data can be transferred in a second. The only problem is that data in transit is susceptible to digital eavesdroppers, more popularly known as hackers.

How Hackers Steal Data

Hackers have many tricks up their sleeves. If their goal is corporate sabotage, they can leverage a network of infected computers, or ‘botnets’, to launch a Distributed Denial of Service (DDoS) attack. They can also infiltrate networks by injecting malware, such as ‘keyloggers’ that track everything a user types.

Luckily, there is a straightforward solution that can prevent these common cyber threats. For everyday internet users, a free tool like Malwarebytes should ...


Read More on Datafloq
The Role of Big Data in IoT

The Role of Big Data in IoT

IoT (the Internet of Things) refers to the automated, intelligent command and control of connected devices over vast regions via sensors and other computing capabilities. At its core, IoT is a fairly simple concept to grasp: it's all about making our products smarter. IoT is on its path to becoming one of the biggest technological revolutions the world has ever seen. By 2020, the revenue generated by IoT technology is expected to be north of $300 billion, and this is just the tip of the iceberg. One of the most critical components of the IoT process is data. For connected devices to perform commands, data has to be sent to a centralized location, say a gateway or the cloud, where it's processed and sent back to the sensors of these devices.

It is, therefore, imperative to have an efficient way of collecting small amounts of data and transmitting this data to the centralized location for processing, and sending it back to the sensors — all in real-time. Taking into account the type, the enormous explosion in numbers and capabilities of these devices and sensors, the size of the data that needs processing can be extremely large ...


Read More on Datafloq
Today’s Challenge for the OpenStack Foundation: Move Beyond the Complexity Conundrum

Today’s Challenge for the OpenStack Foundation: Move Beyond the Complexity Conundrum

The OpenStack Foundation is addressing complexity with "composability." Will that be enough to bring disgruntled early adopters back into the fold?
Protection for Your Business Data: The Must-Have of 21st Century

Protection for Your Business Data: The Must-Have of 21st Century

Why should you protect your data? After all, it’s only now seen as the most valuable resource in the world; even more valuable than oil, in fact. What’s more, while it can be a challenge to steal enough oil, stealing enough data is a cinch. After all, Snowden managed to steal 20,000 files from the NSA using nothing more than a few thumb drives. That’s the NSA we’re talking about!

So why should you protect your data? Because if you don’t, then you might well end up with a similar fallout when a disgruntled employee or outside hacker decides to get at your files and do serious harm with them.

The question, of course, isn’t if you should protect your data. That goes without saying. It is how to protect your data. That’s what the rest of this article is devoted to.

Know what you need to protect

Step one is identifying the data that actually needs protection. Some things do. Some things don’t. Some things that absolutely need to be secured are things like:


Customer data. This is stuff like transaction accounts, private information like names and addresses, personal data of any kind and anything else that might be sensitive.
The ...


Read More on Datafloq
Is Big Data Facilitating a Designer Society?

Is Big Data Facilitating a Designer Society?

Personalisation seems to be one of the big trends of the next decade.

With virtual assistants coordinating our every move and our lives held in the palm of our hand, it is undeniable that we are fast becoming slaves to our technology. What is slightly less obvious (at the moment) is the potential for that technology to learn about how we live and create a uniquely personal experience for us, every waking minute of the day (and maybe even marshal our dreams).

We are a product of our experiences, but the moment we plug an analytical and predictive companion into our lives, it can learn about us in ways that only our subconscious could fathom. Tech will be able to provide insights into our behaviour that we could only guess at; we will be able to “optimise” our lives, and I am sure that all sorts of solutions will appear that will take the strain.

The Big Data behind these insights will help to guide us like our own personal SatNav, but instead of telling us to “turn left at the junction” it will remind us to be patient when dealing with a certain person (because of a previous experience) or ...


Read More on Datafloq
dmlab Turns 10

dmlab Turns 10

I am grateful. Variations of this simple thought swirl in my head as I think about how today, on 10 May 2017, we celebrate the tenth anniversary of the founding of dmlab. Replaying ten years of experiences in my mind, I feel these have been a very good ten years. The overall picture is so positive that I almost have to dig the memories of difficulties and failures out of the hidden corners of my brain in disbelief. And when I take stock of those too, I see this period as even more complete and good. I am grateful that I can look back on it this way.


I am grateful for those with whom we started all this ten years ago. When I think of dmlab's beginnings, a short TED talk comes to mind which, through the analysis of a short funny video, shows how a movement starts (link). It points out that when a new initiative is launched, the most important person is not the leader who starts it but the first one or two companions whose decision to join turns that person into a leader. For this I am grateful to Nagy István and Főző Csaba, then to Ivó, Prekó, Attila, Peti, then Gergő, Csabi, Simon, and I could go on listing everyone who believed that it is possible and worthwhile to create something new and great within dmlab.

I am grateful for the courage, and for the blindness and recklessness, that characterized this team. We were brave when we set out on new, untrodden, risky paths, and at times we were reckless, unable even to gauge how big a task we had taken on; and sometimes it served us well that, because of this, we had no chance to get scared and back down. I am grateful that this experimental spirit, this drive to override rules and ingrained reflexes, this creative energy, permeates the team to this day.

I am grateful that over these ten years we kept asking ourselves what we want to achieve together, and how. I am grateful to Törő, because he helped us find an honest and forward-looking vision; he helped us understand that the workplaces a company creates, and the collegial relationships in which we work together, are just as much its product and result; and we became conscious of which causes, goals and companies we serve and help.

I am grateful for the many projects, pilots and trainings; I am grateful for the startups that spun off from dmlab and for their success. We are proud of you.

I am grateful for the changes that wove through these ten years. Even knowing that not every change was progress, and not all progress meant growth at dmlab. But we stood our ground, found the opportunity in new situations, and almost without exception managed to seize it. A few days ago a nine-year-old strategic plan of dmlab came into my hands. It was striking to look into it and see how little the essential things have changed in ten years, while everything changed nonetheless: the profession, the market, and how much we ourselves have grown.

Thank you.

“This was good fun; it was a man’s work!”


The Jedox Roadshow comes to Brisbane

The Jedox Roadshow comes to Brisbane

Jedox is one of the best unknown data platforms I know of. Calling them unknown may sound a little harsh because they are a major product; however, I say unknown because I don’t see them being considered for purchase in a large number of companies when they are clearly a fit-for-purpose product – […]
The Basics of Deep Learning and How It Is Revolutionizing Technology

The Basics of Deep Learning and How It Is Revolutionizing Technology

Machine learning refers to a type of Artificial Intelligence (AI) that allows computers to learn beyond their initial static programming. These new programs are developed to analyze patterns in past data sets in order to adapt. More advanced computer programs are even capable of altering their code in response to prior exposure to an unfamiliar set of inputs, which opens a whole set of possibilities for the future of AI.

Some of the recent applications of machine learning include Google’s self-driving car and the algorithm behind the success of its web search function, companies providing online recommendations based on users’ browsing history, and fraud detection.

Deep learning is a branch of machine learning that focuses on the neural network model inspired by our understanding of the biology of the human brain. The human brain contains billions of neurons that are capable of sending signals and connecting to each other within a certain physical distance. Programmers incorporated that structure by creating artificial neural networks that have discrete layers, connections, and directions in which the data propagates.
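The layered structure described above can be made concrete with a deliberately tiny sketch: two inputs, one hidden layer, a sigmoid activation, and hand-picked weights. Real deep learning frameworks learn the weights from data; every number here is purely illustrative:

```python
import math

def sigmoid(x):
    """Squash a signal into (0, 1), loosely like a neuron's firing rate."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One dense layer: each output neuron takes a weighted sum of all inputs."""
    return [
        sigmoid(sum(w * x for w, x in zip(neuron_w, inputs)) + b)
        for neuron_w, b in zip(weights, biases)
    ]

# A 2-input -> 2-hidden -> 1-output network with fixed, illustrative weights.
hidden = layer([0.5, -1.0], weights=[[1.0, 0.5], [-0.5, 1.0]], biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
assert 0.0 < output[0] < 1.0  # sigmoid keeps every activation in (0, 1)
```

Stacking many such layers, with the weights adjusted by training rather than chosen by hand, is what makes the network "deep."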

How Does Deep Learning Work?

Deep learning enables computer programs to process a lot of input data simultaneously, and use it to make decisions based on the ...


Read More on Datafloq
How Secure is Your Cloud Computing?

How Secure is Your Cloud Computing?

For many businesses, cloud storage has become the new norm for storing and sharing files across departments. It’s highly convenient and gives employees access to data across multiple devices wherever they are. As more and more options become available, more and more of our personal and confidential business data gets stored on the cloud. But just how secure is that information?

After the iCloud celebrity photo breach, use of and faith in cloud storage declined, which also meant that the demand for security on cloud storage went up. It is important to note that computers, cloud storage and hybrid cloud storage are always susceptible and there is no perfect system, but that doesn’t mean you shouldn’t use one of the best and most convenient pieces of technology out there. Instead, be smart about how you and your employees use it and follow these guidelines to ensure that your data is as secure as it can get on the cloud.

First and foremost, ditch the easy passwords. Nearly every website or account you use needs a username and password, and keeping track of those can be extremely frustrating and daunting, which is why many resort to duplicating passwords or relying on easy info like ...
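On the password point, a minimal sketch using Python's standard `secrets` module shows what a strong, random replacement for an easy password looks like; the length and character set here are arbitrary choices, not a policy recommendation:

```python
import secrets
import string

def generate_password(length=16):
    """Return a cryptographically random password of letters, digits, symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

password = generate_password()
assert len(password) == 16
```

Pairing generated passwords like these with a password manager removes the temptation to duplicate them across accounts.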


Read More on Datafloq
Why Marketing Automation Won’t Work Without Data Quality Measures

Why Marketing Automation Won’t Work Without Data Quality Measures

Yesterday (actually, it was a normal day for me) I again experienced why marketing automation won’t work without data quality measures. I attended a webinar by a French marketing automation company that has a really nice tool to track customers on a website and collect leads for further processing. They have put much effort into a rule-based engine to segment leads and a state-of-the-art backend that provides a nice working environment. I then asked: What happens if a person mistypes his e-mail address? What happens if a person fills out his name in lowercase? What happens if the person has a typo in the postal address? First there was no answer. But then the webinar leader said: Why would someone do that? The answer is easy: because we are human! A certain percentage of people are not 100% concentrated when filling out a form. Maybe because they are using their smartphone, where a typo can happen very easily, or they simply don’t know their correct e-mail address (I have often seen Austrian or German e-mail addresses with @gmail.at or @gmail.de; but, as we all know, there is only gmail.com).
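A few of the data quality checks the author asked about can be sketched in Python. The rules below, the lowercase-name fix and the gmail.at/gmail.de typo correction, mirror the examples in the text; they are a toy illustration, far from a complete lead-validation suite:

```python
import re

# Typo'd Gmail domains mentioned in the text; gmail.com is the only real one.
GMAIL_TYPOS = {"gmail.at": "gmail.com", "gmail.de": "gmail.com"}

def clean_email(email):
    """Normalize an address and fix known domain typos; None if clearly invalid."""
    email = email.strip().lower()
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        return None  # mistyped beyond repair; route to manual review
    local, domain = email.rsplit("@", 1)
    return "{}@{}".format(local, GMAIL_TYPOS.get(domain, domain))

def clean_name(name):
    """Capitalize an all-lowercase name a visitor typed in a hurry."""
    return " ".join(part.capitalize() for part in name.split())

assert clean_email("Max.Muster@GMAIL.AT") == "max.muster@gmail.com"
assert clean_email("not-an-address") is None
assert clean_name("max muster") == "Max Muster"
```

Even checks this crude catch a surprising share of the human errors described above before they poison a segmentation engine.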

So this sophisticated ...


Read More on Datafloq
What Innovations Will Big Data Bring Us by 2020?

What Innovations Will Big Data Bring Us by 2020?

As we move more business, communication, and entertainment online, large amounts of data are generated exponentially with each passing year. There is so much data that it's becoming tough to keep track of it, let alone analyze or organize it.

Although the origins of big data are somewhat murky, its impacts are becoming crystal clear. More devices are connected to the internet than ever before, and this information is a potential goldmine for educational, commercial, and humanitarian efforts. The need to organize and analyze the massive chunks of data has become pressing, and more companies for handling big data have emerged to meet these challenges. By utilizing advanced database management technology, companies, medical organizations, universities and governments can now harvest information and improve efficiency.

Other than providing a chance to make significant economic impacts, big data manipulation will create innovations that will revolutionize our lives completely by 2020. Here is how:

Apps and sites will be more functional and safer

With big data, it will be easy to identify and track fraudulent behavior so as to improve website security. Big data can bring new visibility into everything that is going on within a firm's network and assist in predicting upcoming attacks. Experts think that big data ...


Read More on Datafloq
Snowflake and Spark, Part 2: Pushing Spark Query Processing to Snowflake

Snowflake and Spark, Part 2: Pushing Spark Query Processing to Snowflake

This post provides the details of Snowflake’s ability to push query processing down from Spark into Snowflake.
10 Tips to Troubleshoot Security Concerns in IoT Wearables

10 Tips to Troubleshoot Security Concerns in IoT Wearables

From smartwatches to glasses and finger rings, the range of wearable devices is steadily expanding. Combined with the might of the Internet of Things, wearable devices have a life-transforming effect. While it is fancy and often productive to carry these wearables around, the billion-dollar question is, “How safe are these wearables?”

What if your favourite wearable is just another door that hackers and cybercriminals can break into to make away with the control of your personal and professional life?

Security concerns in wearables are legitimate. A study by Auth0 has confirmed that more than 52% of wearable device users feel that they are provided with inadequate security measures.

VTech, a popular brand that sells wearables for kids, suffered a security breach which resulted in the leakage of private information of more than 200,000 children and their parents.

There is no better time than right now to sit back and analyze these security concerns and the ways to negate them.

Security Concerns in IoT Wearables

Wearables are now being used for purposes that go far beyond calorie counting and fitness tracking. They are now even part of the BYOD enterprise work philosophy and are used by remote employees to constantly collaborate and communicate with their peers. Though considered futuristic, the IoT ...


Read More on Datafloq
8 Ways IoT Can Improve Healthcare

8 Ways IoT Can Improve Healthcare

Over the past few decades, we’ve gotten used to the Internet and cannot imagine our lives without it. But now the Internet of Things (IoT) is changing the way we operate the commodities around us. The Internet of Things is a real-time connection and communication among all sorts of objects, gadgets, wearables, and devices. Essentially, it represents interoperability between all the things around us (excluding computers and phones). Needless to say, IoT is changing entire industries, as it reduces costs, boosts productivity, and improves quality. One of the areas where IoT is contributing the most is medicine. In this article, we will look at 8 ways IoT is improving the healthcare industry.

How is IoT Changing Healthcare?

With its advanced technologies, IoT gives a significant boost to healthcare development. Some forecasts even estimate that the IoT healthcare market will climb to $117 billion by 2020. How is that possible? Let’s discuss some of the key points!



Data management



IoT provides countless possibilities for hospitals to gather relevant information about their patients, both on-site and outside of the medical premises. Healthcare relies on telemetry here to capture data and communicate it automatically and remotely. This offers medical staff a chance to act promptly and provide patients with better ...


Read More on Datafloq
How Thick Data Can Unleash the True Power of Big Data

How Thick Data Can Unleash the True Power of Big Data

Not all data that is measurable is qualitative. While Big Data helps us find answers to well-defined questions, Thick Data connects the dots and gives us a more realistic picture.

We have been hearing for quite a few years that Big Data and Analytics are the next big waves. While these waves are already sweeping over us, we are missing out on the small things by going for the big. Big Data has emerged to be remarkably useful when it comes to finding answers to well-defined questions and addressing phenomena that are well understood. What it fails to recognize is the complexity of people’s lives, human connections, underlying emotions, changing cultural ecosystems, interesting stories, and other social ingredients.

For instance, it made big news when Nokia was acquired by Microsoft in 2013. While there could be many reasons behind Nokia’s downfall, one of the prominent ones, as Tricia Wang, a global tech ethnographer, describes, is an overdependence on numbers. Sharing her story on Ethnography Matters, she mentioned how her recommendation that Nokia revise its product development strategy did not receive enough attention, as the sample size used for her study was considered too small in comparison to the millions of ...


Read More on Datafloq
Why Educational Systems Should Consider Big Data

Why Educational Systems Should Consider Big Data

Advances in technology have enabled good decision-making in most educational institutions, following the increased use of big data. Policy makers in these institutions have been using big data to understand sentiment about the school and make systematic improvements to students' performance. The term big data refers to a large amount of information flowing through various channels that can only be analyzed by computers. Learning institutions generate an immense amount of student information that would be hard to capture and manage through conventional means. Therefore, big data comes in handy in improving the processing of data and increasing the storage capacity of institutional data.

Understanding Big Data

Students' experiences, such as eating, social life, study, and sleep, have a strong effect on their academic performance. Negative or traumatic experiences have a direct impact on a student's retention abilities. Therefore, most institutions now use big data to look into the various aspects affecting a student's performance. Academic institutions collect large quantities of data, but the problem lies in the analysis, making it harder for analysts to make data-based decisions and improve organizational effectiveness.

Why Big Data and Not Small Data

Academic institutions collect data for many ...


Read More on Datafloq
Augmenting the Brain is Set to Pioneer Alzheimer’s Treatment

Augmenting the Brain is Set to Pioneer Alzheimer’s Treatment

As artificial intelligence becomes more human, to co-exist, does human intelligence need to become more artificial?

We’ve spent a lot of time philosophizing about where Artificial Intelligence is going to take us, how close we are to achieving general AI and the implications it will have on humanity - all not without the Skynet scenarios! Hype aside, there are companies out there who are focusing on how we can use artificially intelligent applications to improve the human experience, sustain life on our planet and significantly boost the economy. This pioneering technology could well see the next world-changing scientific discovery hailing from Silicon Valley, especially considering the significant increase in investment over the past few years.

According to the Alzheimer’s Association, there are more than 5 million Americans living with Alzheimer’s today, with 16 million predicted by 2050, and a further 850,000 people with dementia in the UK. The degenerative disease is currently the 6th leading cause of death in the US and has also been linked to poor health among caregivers, owing to the care responsibilities associated with the disease compared with caring for elderly people without dementia. Having a neuroprosthetic could be the missing key to an improved quality of ...


Read More on Datafloq
How Big Data Can Reduce Building Injuries

How Big Data Can Reduce Building Injuries

Accidents are a part of life, but when we carefully analyze the data on unintended incidents and injuries, we often find that many of them could have been avoided through greater care and harm reduction strategies. Unfortunately, because many companies and individuals view each injury in a bubble, they miss the significance of certain occurrences and can’t effectively reform their behaviors. Only the big picture view can help – that’s where big data comes in.

When we use big data to analyze workplace injuries and individual accidents, we move from an individualized view of personal injury to a systemic one – and that’s how we can reduce injuries. But what does this look like in practice? By turning to risk management ecosystems, we can see what the future of safety looks like.

Analyzing Workplace Safety

Workplace safety is a significant national concern – it’s why organizations like OSHA exist – but just because there’s already oversight in the workplace doesn’t mean companies are maximizing their injury prevention strategies. Rather, many do the minimum required by OSHA and leave the rest to chance.

Some workplaces, however, are taking safety seriously by instituting company-wide injury analytics. These systems let all branches of a business, no matter ...


Read More on Datafloq
Are You Wasting Your Data or Consuming It?

Are You Wasting Your Data or Consuming It?

Last night I was in the checkout line at the grocery store. There was a woman behind me with a cart full of produce who told me, “I’ll probably end up throwing most of this away.” I asked her why. She said she knows she should eat healthy, but it takes too much time and effort to whip up a meal from scratch, and anyway, she wasn’t that great of a cook. Despite her best intentions, she ends up ordering in for the family most nights.

Unfortunate fact – over 40% of food produced is wasted, depleting resources like fresh water, electricity and human effort.

Wouldn’t it be great if raw ingredients could magically convert themselves into dishes for the family – no time, effort, or cooking skills needed? The family could be eating healthier meals, they’d be eating the food they spent money and time to procure, and it wouldn’t end up wasted in the trash anymore.

Companies are wasting data

Many companies are wasting data just like many people are wasting food.

Companies recognize how important data is. They know their workforce is hungry for data-driven solutions to their problems. They need it to thrive in the current landscape.

So they invest significantly in ...


Read More on Datafloq
What is the Best Programming Language for Machine Learning?

What is the Best Programming Language for Machine Learning?

Q&A sites and data science forums are buzzing with the same questions over and over again: I’m new to data science, what language should I learn? What’s the best language for machine learning?

There’s an abundance of articles attempting to answer these questions, either based on personal experience or on job-offer data. However, there’s much more activity in machine learning than Western job offers can capture, and peer opinions, while valuable, are often conflicting and can confuse novices. We turned instead to our hard data from 2,000+ data scientists and machine learning developers who responded to our latest survey about which languages they use and what projects they’re working on – along with many other interesting things about their machine learning activities and training. Then, being data scientists ourselves, we couldn’t help but run a few models to see which factors are most strongly correlated with language selection. We compared the top five languages, and the results show that there is no simple answer to the “which language?” question. It depends on what you’re trying to build, what your background is and why you got involved in machine learning ...


Read More on Datafloq
Bring Your Own Cyber Human (BYOCH) – Part 1: Self-connected Humans

Bring Your Own Cyber Human (BYOCH) – Part 1: Self-connected Humans

Perhaps some of my readers and followers played the “Rock, Paper, Scissors” game in their childhood. During each match, we simulated one of these three things with our hands, although in those years we could never have imagined that any of them could connect to the Internet.

A few years later, we are not surprised that someone, somewhere in the world, is designing connected stones, connected papers or connected scissors. Just read “The abuse of shocking headlines in IoT or how many stupid things will be connected?”. To this end, we have arrived at the Internet of Things (IoT).

But far from contenting ourselves with just connecting things, some visionaries like Elon Musk do not dream of electric sheep; they dream of building human-computer hybrids. The goal of Elon Musk’s company Neuralink is to explore technology that can make direct connections between a human brain and a computer. Mr. Musk floated the idea that humans will need a boost from computer-assisted artificial intelligence to remain competitive as our machines get smarter.

A Facebook engineer asserts that augmented reality will replace smartphones in five years. Facebook’s uber-secretive Building 8 (B8) division is currently working on a top-secret brain-computer interface (BCI) like Elon Musk’s Neuralink, but that BCI project ...


Read More on Datafloq
How to Access User Data from Third-party Apps on Android

How to Access User Data from Third-party Apps on Android

If you’re developing a DLP or parental control solution, chances are you want as much user data from apps as possible. However, most of the time gathering such data can be fairly difficult. While you can try to reverse engineer an iOS or Android app, this method often proves difficult and time-consuming, and results are not guaranteed.

On Android in particular, apps usually store their data in a sandbox (the default and most secure option) where other apps cannot access it. If a developer decides not to store data in an easily accessible area (such as a memory card) and not to provide an API for accessing the data, then getting it without root can be very hard.

This means that there is seemingly no way to get Skype, Viber, or KIK messages, or a browser history, which can be extremely frustrating if your solution depends on such data. However, there is actually a fairly elegant solution that allows you to get user data on Android without root and without much hassle. We will cover this solution below.

The idea behind the solution

The gist of the idea is very simple – each active page has a layout ...


Read More on Datafloq
The Advantages And Disadvantages of Using Django

The Advantages And Disadvantages of Using Django

If you are interested in running Django or considering making a transition to Python, let us help you explore the main virtues and vices of using this framework. But before we get started, let’s talk briefly about what Django is and why you should care.

Django came out in 2005 and has indisputably turned into one of the go-to web frameworks for a growing number of developers. It is a framework built on the Python programming language. With the right set of functionality, Django reduces the amount of trivial code, which simplifies the creation of web applications and results in faster development.

In case you want to dive deeper into the framework, view a short introduction to django full text search.

Why Django?

You should definitely check out Django. It is written in Python, and Python is amazing, clean, easy to learn, and one of the most widely taught programming languages. Python is also a popular choice for:



Industrial Light & Magic (Star Wars visual effects)


Game development


Services like Pinterest, Instagram, The Guardian and more



Without a doubt, the tech market is overflowing with frameworks, but Django is a good place to start, as it has some of the best documentation and tutorials in software development. Now, for the main attraction – ...


Read More on Datafloq
What Effect Will Deep Learning Have on Business?

What Effect Will Deep Learning Have on Business?

One thing that could have a deep impact on business is deep learning. Deep learning can be thought of as a subfield of machine learning. Specifically, this form of machine learning was inspired by the study of the human brain. The algorithms involved are designed to mimic how the human brain operates, allowing a machine to learn in a similar way. This is done through a system known as an artificial neural network.
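A single artificial neuron can be sketched in a few lines of plain Python (a toy illustration, not any particular library's API): it computes a weighted sum of its inputs and passes the result through an activation function, and stacking such neurons gives a small network.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of the inputs,
    squashed through a sigmoid activation function."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid maps any value into (0, 1)

# Two hidden neurons feeding one output neuron form a tiny "network".
# The weights here are arbitrary; in real deep learning they are
# adjusted automatically during training.
hidden = [neuron([0.5, 0.8], [0.4, -0.2], 0.1),
          neuron([0.5, 0.8], [0.7, 0.3], -0.3)]
output = neuron(hidden, [1.2, -0.6], 0.0)
print(round(output, 3))
```

Real networks have millions of such weights and learn them from data; the structure, though, is just this pattern repeated at scale.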

The benefits of deep learning for businesses are obvious. It allows a computer system with access to a lot of data to make its own autonomous decisions about the data through this learning process. It can produce better decisions and improve efficiency. There are many applications that can help improve a business’s operations and profit potential. Below are some of the possibilities.

Deep Learning Can Increase Sales

One of the best things deep learning can provide for a company is obviously helping it increase its bottom line. Deep learning, for example, can be deployed for the purpose of lead generation. Deep learning is a form of artificial intelligence. That AI can sift through all the data and then use it to present you with leads at the right ...


Read More on Datafloq
Why AI is the Catalyst of IoT

Why AI is the Catalyst of IoT

Businesses across the world are rapidly leveraging the Internet-of-Things (#IoT) to create new products and services that are opening up new business opportunities and creating new business models. The resulting transformation is ushering in a new era of how companies run their operations and engage with customers. However, tapping into the IoT is only part of the story [6].

For companies to realize the full potential of IoT enablement, they need to combine IoT with rapidly-advancing Artificial Intelligence (#AI) technologies, which enable ‘smart machines’ to simulate intelligent behavior and make well-informed decisions with little or no human intervention [6].

Artificial Intelligence (AI) and the Internet of Things (IoT) are terms that project futuristic, sci-fi imagery; both have been identified as drivers of business disruption in 2017. But what do these terms really mean, and what is their relation? Let’s start by defining both terms first:

IoT is defined as a system of interrelated Physical Objects, Sensors, Actuators, Virtual Objects, People, Services, Platforms, and Networks [3] that have separate identifiers and an ability to transfer data independently. Practical examples of #IoT application today include precision agriculture, remote patient monitoring, and driverless cars. Simply put, IoT is the network of “things” that collects and exchanges ...


Read More on Datafloq
Why Biotech Needs the Power of Data Analytics

Why Biotech Needs the Power of Data Analytics

The Human Genome Project, which aimed to map and sequence the entire human genome, began in 1990 and ended in 2003, with a budget of roughly $3 billion. It provided us, for the first time, a means to access invaluable data through genes – evolution patterns, diseases and their treatments, gene mutations and their effects, anthropological information, etc. Now, powerful software and analysis tools are being built that can decode an entire genome in a matter of hours. Data analytics is quickly becoming one of the most important branches of science that can be applied in the biotech industry.

Genomics

DNA sequencing generates a huge amount of data that needs to be analyzed with care, as the information and conclusions drawn are applicable in a whole range of industries from medicine to forensic science. It involves data science at various levels:

Storage

The first step is storage of DNA sequencing data. If we were to sequence the genome of every living thing from a microbe to a human, we would need powerful data science tools that help us store, track and retrieve relevant information.

Annotation

Annotation is the process of adding notes to specific genes in the sequence. Tools are being built to put ...


Read More on Datafloq
The DNA of a Data Scientist

The DNA of a Data Scientist

The role has been called ‘the sexiest job of the 21st century’ by the Harvard Business Review, and there is good reason for it.

Data Science can be a highly rewarding career path. However, not everyone is cut out for the job. Being a great data scientist takes a certain set of skills.

We’ve compiled a list of all the things that make up the best in the business.

A high level of education

Not strictly speaking mandatory. But those working under the title without at least a master’s degree are a very small minority (less than 20%). In fact, almost half of all data scientists have gone as far as completing a PhD.

The ideal education would be in the realm of mathematics, statistics, or computer science.

Is able to understand coding

There are a variety of different types of coding prevalent in the industry. A data scientist needs to be able to understand at least some of them.

Python, C/C++, Java, and Perl all regularly crop up in data science, with Python being the most prevalent.

Has a certain level of proficiency in statistics

If you think back to your stats classes, the words ‘statistics’ and ‘data’ go hand in hand. And it’s true that statistical knowledge is important for data ...


Read More on Datafloq
Five Skillsets Needed for Securing IoT Today

Five Skillsets Needed for Securing IoT Today

On October 21, 2016 a sophisticated Distributed Denial-of-Service (DDoS) attack was launched that left customers of Amazon, Netflix, Twitter, and more without service, multiple times throughout the day. TechTarget reported that the attack was leveled against Dyn, a Domain Name System (DNS) provider that services those brands, along with many others. One of the contributing factors to the attack was that the hackers were able to infect Internet of Things (IoT) devices with the Mirai botnet. They were able to identify IoT devices that used default usernames and passwords (such as username: “admin,” password: “admin”), and turn them into drones in their DDoS cyberattack.
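The weakness the attackers exploited is easy to illustrate. The sketch below (plain Python, with made-up device names and only an illustrative subset of the credential pairs Mirai was reported to try) flags the devices in an inventory that still use a factory-default username/password pair:

```python
# Illustrative subset of common factory-default credential pairs.
DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "root"),
                       ("admin", "1234"), ("root", "default")}

def audit_devices(devices):
    """Return the names of devices whose credentials match a known default pair."""
    return [name for name, user, password in devices
            if (user, password) in DEFAULT_CREDENTIALS]

# Hypothetical device inventory: (name, username, password).
inventory = [("camera-01", "admin", "admin"),
             ("router-07", "admin", "S3cure!pass"),
             ("dvr-03",    "root",  "default")]
print(audit_devices(inventory))  # → ['camera-01', 'dvr-03']
```

Mirai automated essentially this check across the open internet; running the same audit against your own device inventory is a cheap first line of defense.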

John Pironti, president of IP Architects, went on to explain to TechTarget, "The use of IoT devices for recent DDoS attacks has shown how fragile and insecure many of these devices currently are…. The first use was for DDoS, but these same devices are likely to be used as entry points to the internal networks they connect to as well as they become more pervasive."

Gartner projects that 20 billion IoT devices will be used by companies worldwide by 2020. This added mobility and productivity also brings the promise of multiplied threat vectors and vulnerabilities. If companies are ...


Read More on Datafloq
A D3 Image is Worth a Thousand Words: Interview with Morgane Ciot

A D3 Image is Worth a Thousand Words: Interview with Morgane Ciot

Many things have been said and done in the realm of analytics, but visualizations remain at the forefront of the data analysis process, where intuition and correct interpretation help us make sense of data.

As an increasing number of tools emerge, current visualizations are far more than mere pictures on a screen, allowing for movement, exploration and interaction.

One of these tools is D3, an open-source JavaScript data visualization library. D3 is perhaps the most popular tool for developing rich and interactive data visualizations, used by small and large companies alike, such as Google and the New York Times.

With the next Open Data Science Conference in Boston coming soon, we had the opportunity to talk with DataRobot’s and ODSC speaker Morgane Ciot about her workshop session, “Intro to D3”, the state of data visualization and her very own perspectives on the analytics market.


Morgane Ciot is a data visualization engineer at DataRobot, where she specializes in creating interactive and intuitive D3 visualizations for data analysis and machine learning. Morgane studied computer science and linguistics at McGill University in Montreal. Previously, she worked in the Network Dynamics Lab at McGill, answering questions about social media behavior using predictive models and statistical topic models.

Morgane enjoys studying machine learning (ML), reading, writing, and staging unusual events.

Let's get to know more about Morgane and her views as a data visualization engineer.

Morgane, could you tell us a bit more about yourself, especially about your area of expertise, and what was your motivation to pursue a career in analytics and data science?

I went to school for computer science and linguistics. Those two fields naturally converge in Natural Language Processing (NLP)/Artificial Intelligence (AI), an intersection that was unfortunately not exploited by my program but that nonetheless got me interested in machine learning.

One of the computer science professors at my school was doing what essentially amounted to sociological research on social media behavior using machine learning techniques. Working with him furthered my interest in ML, NLP, and topic modeling, and I began to also explore how to visualize some of the unmanageable amounts of data we had (like, all of Reddit).

I’m probably indebted to that part of my life, and my professor, for my current position as a data viz engineer. Also, machine learning's practical ramifications are going to be game changing. I want to live closest to the eye of the storm when the singularity hits.

Based on your experience, which attributes or skills should every data master have if he/she wants to succeed, and what would be your recommendations for those looking for an opportunity at this career?

Stats, problem-solving skills, and engineering or scripting abilities all converge in the modern data scientist.

You have to be able to understand how to formulate a data science problem, how to approach it, and how to build the ad hoc tools you’ll need to solve it. At least some basic statistical knowledge is crucial. Elements of Statistical Learning by Hastie and Andrew Ng’s Coursera course both provide a solid foundational understanding of machine learning and require some statistical background.

Learn at least one programming language — Python or R are the most popular. R is the de facto language for statisticians, and Python has a thriving community and a ton of data science libraries like scikit-learn and pandas. It’s also great for writing scripts to scrape web data. If you’re feeling more adventurous, maybe look into Julia.
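To make that advice concrete, here is the kind of small exercise a beginner might try with nothing but the Python standard library, before reaching for pandas or scikit-learn: fitting a least-squares line to a handful of points (the data is made up for illustration).

```python
from statistics import mean

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    x_bar, y_bar = mean(xs), mean(ys)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]   # roughly y = 2x, with some noise
slope, intercept = fit_line(xs, ys)
print(round(slope, 2), round(intercept, 2))  # → 1.99 0.09
```

Libraries like scikit-learn wrap this same idea (and far more robust variants) behind a `fit`/`predict` interface, but working through it once by hand cements the statistics underneath.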

As usual, don’t just learn the theory. Find a tangible project to work on. Kaggle hosts competitions you can enter and has a community of experts you can learn from.

Finally, start learning about deep learning. Many of the most interesting papers in the last few years have come out of that area and we’re only just beginning to see how the theory that has been around for decades is going to be put into practice.

Talking about data visualization, what is your view of the role it plays within data science? How important is it in the overall data science process?

Data visualization is pretty fundamental to every stage of the data science process. I think how it’s used in data exploration — viewing feature distributions — is fairly obvious and well-practiced, but people often overlook how important visualizations can be even in the modeling process.

Visualizations should accompany not just how we examine our data, but also how we examine our models! There are various metrics that we can use to assess model performance, but what’s really going to convince an end user is a visualization, not a number. That's what's going to instill trust in model decisions.

Standard introductions to machine learning lionize the ROC curve, but there are plenty of other charts out there that can help us understand what and how a model is doing: plotting predicted vs. actuals, lift charts, feature importance, partial dependence, etc. — this was actually the subject of my ODSC talk last year, which should be accessible on their website.

A visualization that rank-orders the features that were most important to the predictive capacity of a model doesn’t just give you insight, it also helps you model better. You can use those top features to build faster and more accurate models. 
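As a rough illustration of such rank-ordering (a crude stand-in for real model-based importance measures such as permutation importance, with made-up data), one can score each feature by the absolute value of its Pearson correlation with the target:

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    x_bar, y_bar = mean(xs), mean(ys)
    cov = mean((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

def rank_features(features, target):
    """Rank feature names by |correlation| with the target, strongest first."""
    scores = {name: abs(pearson(vals, target))
              for name, vals in features.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical dataset: two informative features and one noise feature.
features = {"age":    [25, 32, 47, 51, 62],
            "income": [30, 42, 50, 61, 70],
            "noise":  [ 5,  3,  9,  1,  7]}
target = [1.1, 1.9, 2.8, 3.2, 4.0]
print(rank_features(features, target))
```

The ranking itself is just a sorted list; the visualization step is then a matter of drawing it as, say, a horizontal bar chart so a stakeholder can take it in at a glance.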

What do you think will be the most important data visualization trend in the next couple of years?

Data is becoming evermore important basically everywhere, but popular and even expert understanding hasn’t quite kept up.

Data is slowly consuming us, pressing down from all angles like that Star Wars scene where Luke Skywalker and Princess Leia get crushed by trash. But are people able to actually interpret that data, or are they going to wordlessly nod along to the magical incantations of “data” and “algorithms”?

As decisions and stories become increasingly data-driven, visualizations in the media are going to become more important. Visualizations are sort of inherently democratic.

Everyone who can see can understand a trend; math is an alien language designed to make us feel dumb. I think that in journalism, interactive storytelling — displaying data with a visual and narrative focus — is going to become even more ubiquitous and important than it already is. These visualizations will become even more interactive and possibly even gamified.

The New York Times did a really cool story where you had to draw a line to guess the trend for various statistics, like the employment rate, during the Obama years, before showing you the actual trend. This kind of quasi-gamified interactivity is intuitively more helpful than viewing an array of numbers.

Expert understanding will benefit from visualizations in the same way. Models are being deployed in high-stakes industries, like healthcare and insurance, that need to know precisely why they’re making a decision. They’ll need to either use simplified models that are inherently more intelligible, at the expense of accuracy, or have powerful tools, including visualizations, to persuade their stakeholders that model decisions can be interpreted.

The EU is working on “right to explanation” legislation, which allows any AI-made decision to be challenged by a human. So visualizations focused on model interpretability will become more important.

A few other things….as more and more businesses integrate with machine learning systems, visualizations and dashboards that monitor large-scale ML systems and tell users when models need to be updated will become more prevalent. And of course, we’re generating staggering amounts of new data every day, so visualizations that can accurately summarize that data while also allowing us to explore it in an efficient way — maybe also through unsupervised learning techniques like clustering and topic modeling— will be necessary. 

Please tell us a bit about DataRobot, the company you work at.

We’re a machine learning startup that offers a platform data scientists of all stripes can use to build predictive models. I’m equal parts a fan of using the product and working on it, to be honest. The app makes it insanely easy to analyze your data, build dozens of models, use the myriad visualizations and metrics we have to understand which one will be the best for your use case, and then use that one to predict on new data.

The app is essentially an opinionated platform on how to automate your data science project. I say opinionated because it’s a machine that’s been well-oiled by some of the top data scientists in the world, so it’s an opinion you can trust. And as a data scientist, the automation isn’t something to fear. We’re automating the plumbing to allow you to focus on the problem-solving, the detective work. Don’t be a luddite! 

It’s really fun working on the product because you get to learn a ton about machine learning (both the theoretic and real-world applications) almost by osmosis. It’s like putting your textbook under your pillow while you sleep, except it actually works. And since data science is such a protean field, we’re also covering new ground and creating new standards for certain concepts in machine learning. There’s also a huge emphasis, embedded in our culture and our product, on — “democratizing” is abusing the term, but really putting data science into as many hands as possible, through evangelism, teaching, workshops, and the product itself.

Shameless promotional shout-out: we are hiring! If you’re into data or machine learning or python or javascript or d3 or angular or data vis or selling these things or just fast-growing startups with some cool eclectic people, please visit our website and apply!

As a data visualization engineer at DataRobot, what are the key design principles the company applies for development of its visualizations?

The driving design principle is functionality. Above all, will a user be able to derive an insight from this visualization? Will the insight be actionable? Will that insight be delivered immediately, or is the user going to have to bend over backwards scrutinizing the chart for its underlying logic, trying to divine from its welter of hypnotic curves some hidden kernel of truth? We’re not in the business of beautiful, bespoke visualizations, like some of the stuff the NYTimes does.

Data visualization at DataRobot can be tricky because we want to make sure the visualizations are compatible with any sort of data that passes through — and users can build predictive models for virtually any dataset — which means we have to operate at the right level of explanatory and visual abstraction. And we want users of various proficiencies to immediately intuit whether or not a model is performing well, which requires thinking about how a beginner might be able to understand the same charts an expert might expect. So by “functionality” I mean the ability to quickly intuit meaning.

That step is the second in a hierarchy of insight: the first is looking at a single-valued metric, which is only capable of giving you a high-level summary, often an average. This could be obfuscating important truths. A visualization —the second step— exposes these truths a bit further, displaying multiple values at a time over slices of your data, allowing you to see trends and anomalous spots. The third step is actually playing with the visualization. An interactive visualization confirms or denies previous insights by letting you drill down, slice, zoom, project, compare — all ways of reformulating the original view to gain deeper understanding. Interactive functionality is a sub-tenet of our driving design principle. It allows users to better understand what they’re seeing while also engaging them in (admittedly) fun ways. 

During the ODSC in Boston, you will be presenting an intro to D3, can you give us a heads up? What is D3 and what are its main features and benefits?

D3 is a data visualization library built in Javascript. It represents data in a browser interface by binding data to a webpage’s DOM elements. It’s very low-level, but there are plenty of wrapper libraries/frameworks built around it that are easier to use, such as C3.js or the much more sophisticated Plot.ly. If you find a browser-rendered visualization toolkit, it’s probably using D3 under the hood. D3 supports transitions and defines a data update function, so you can create really beautiful custom and dynamic visualizations with it, such as these simulations or this frankly overwrought work of art.

D3 was created by Mike Bostock as a continuation of his graduate work at Stanford. Check out the awesome examples.

Please share with us some details about the session. What will attendees get from it?

Attendees will learn the basics of how D3 works. They’ll come away with a visualization in a static HTML file representing some aspect of a real-world dataset, and a vague sense of having been entertained. I’m hoping the workshop will expose them to the tool and give them a place to start if they want to do more on their own. 

What are the prerequisites attendees should have to take full advantage of your session?

Having already downloaded D3 4.0 (4.0!!!!!) will be useful, but really just a working browser — I’ll be using Chrome — and an IDE or text editor of your choice. And a Positive Attitude™.

Finally, on a more personal tenor, what's the best book you've read recently? 

Story of O: a bildungsroman about a young French girl's spiritual growth. Very inspiring!

Thank you Morgane for your insights and thoughts.

Morgane's “Intro to D3” workshop session will be part of the Open Data Science Conference taking place in Boston, MA, from May 3 to 5.

A good excuse to visit beautiful Boston and have a great data science learning experience!


Cloud Analytics Conference – London!

Cloud Analytics Conference – London!

Join Snowflake and The Data Warrior in London on June 1st for a Cloud Analytics Conference
About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

The world is full of normal people like you and me, but I love to think that superheroes live between us and I dream that maybe someday I could become one of them and make a better world with my super powers.

The universe of superheroes includes gods, mutants, humans with special skills, and also special agents. I found it fun to spot similarities between this fantastic world and the world of IoT platforms. Comparing and finding reasonable resemblances between IoT platforms and superheroes or supervillains is the goal of this article. Opinions, as always, are personal and subject to all kinds of comments and appreciations. Enjoy the article.

About IoT Platforms

Many of my regular readers remember my article “It is an IoT Platform, stupid!”. At that time, per Research and Markets, there were more than 260 IoT platforms; today some sources speak of 700. I confess I have not been able to follow the birth, evolution and, in some cases, death of all the IoT platforms out there. I think many enthusiasts like me have also given up keeping an updated list.

I cannot predict which IoT platforms will survive beyond 2020, or which will be ...


Read More on Datafloq
How to Implement a Successful Big Data and Data Science Strategy


Big Data and Data Science are two of the most exciting areas in business today. While most decision makers understand the true potential of both fields, companies remain unsure how to implement a successful big data strategy in their enterprises. This roadmap can help you define and implement the right big data strategy in your organization.

There are many ways to incorporate big data and data science processes into your company’s operations, but the practices outlined here will help businesses draw up a solid blueprint for their big data and implementation strategy.

Define the Big Data Analytics Strategy

Organizations first need to define a clear strategy, in sync with their core business objectives, for the big data implementation. A strategy may include improving operational efficiency, boosting marketing campaigns, analyzing consumers for prediction, or countering fraud to mitigate risk and drive business performance. The business strategy should adhere to the following points to effectively solve business problems.


The business strategy should align itself with the enterprise quality and performance goals.
It should focus on measurable outcomes.
It should transform your company’s capabilities through data-driven decision making.


Choosing the right data

With the voluminous increase in data, it has become problematic for organizations to ...


Read More on Datafloq
How to Improve Your Data Quality to Comply with the GDPR


What data quality means to the GDPR

The General Data Protection Regulation (GDPR) that will come into effect on May 25<sup>th</sup> 2018 has strong implications for nearly every company and organization in Europe. Its principle of “privacy by design”, first postulated by the Canadian privacy expert Ann Cavoukian, could lead to a paradigm shift in how businesses develop their marketing campaigns and their customer service.

Many articles of the GDPR show the importance of data quality, especially Article 5 (Principles relating to processing of personal data) and Article 16 (Right to rectification). But it is obvious that other parts of the GDPR also demand a high level of data quality; in particular, duplicates should be avoided in order to properly fulfill data subjects’ rights such as the “Right of access” or the “Right to object”.
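Deduplication at this scale is usually automated. As a toy illustration (not legal advice; the field names and the similarity threshold are my own hypothetical choices), a record-matching pass might combine exact e-mail matching with fuzzy name matching:

```python
from difflib import SequenceMatcher

def likely_duplicates(rec_a, rec_b, name_threshold=0.85):
    """Flag two customer records as probable duplicates.

    A record here is a dict with hypothetical 'name' and 'email' fields.
    Identical normalized e-mail addresses count as a definite match;
    otherwise we fall back on fuzzy name similarity.
    """
    email_a = rec_a.get("email", "").strip().lower()
    email_b = rec_b.get("email", "").strip().lower()
    if email_a and email_a == email_b:
        return True
    name_a = rec_a.get("name", "").strip().lower()
    name_b = rec_b.get("name", "").strip().lower()
    return SequenceMatcher(None, name_a, name_b).ratio() >= name_threshold

# Example: a typo'd first name paired with the same e-mail address
a = {"name": "Jon Smith", "email": "jon.smith@example.com"}
b = {"name": "John Smith", "email": "jon.smith@example.com"}
print(likely_duplicates(a, b))
```

Real-world matching engines add address normalization, phonetic encodings, and survivorship rules, but the principle is the same: a data subject exercising a right should trigger one consolidated answer, not several.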

The problem with this insight is that many businesses struggle with their data quality. Studies and surveys show that a majority of companies are not satisfied with the data quality in their databases and think it needs improvement.

But what are possible measures to improve data quality?

Technical and organizational measures

In the “good old days” there were dedicated employees called “data entry clerks”. ...


Read More on Datafloq
Why We Need to Stop Using FTP for Media Data, Like Yesterday


It’s 2017, and it’s time to start making some serious changes around here. FTP, the File Transfer Protocol, is one of the most popular methods for sending files to — and downloading from — the cloud. Users like FTP because it’s simple to use and efficient when you’re primarily working with local media servers.

But, the ease of FTP comes at a cost, and the security risks are simply not worth it.

According to a new report from Encoding.com, FTP and SFTP remain “a popular transit protocol for getting files to the cloud, primarily due to its ease and prevalence on local media servers.”

Yet FTP and SFTP — both built on TCP — were never designed to handle large data transfers. Worse, they are simply not as secure as the alternatives, especially if you’re a media company with proprietary content and materials.

One of the most egregious issues with FTP is that its servers handle usernames and passwords only in plain text. FTP’s inability to protect credentials is exactly why you’re advised not to use root accounts for FTP access. If someone were to discern your username and password, they could ...
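To make the plain-text problem concrete, here is a minimal sketch (the credentials are made up) of the login exchange RFC 959 defines; these are exactly the bytes that cross the network when a client logs in without TLS:

```python
def ftp_login_bytes(username, password):
    """Build the raw FTP login exchange as it appears on the wire.

    Per RFC 959, a client authenticates with USER and PASS commands
    sent as plain ASCII; without TLS, anyone sniffing the connection
    sees the password verbatim.
    """
    return f"USER {username}\r\nPASS {password}\r\n".encode("ascii")

payload = ftp_login_bytes("mediauser", "s3cret!")
print(payload)  # the password is right there in the clear
```

Encrypted alternatives protect this exchange, though as noted above neither FTP nor SFTP was designed with large media transfers in mind.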


Read More on Datafloq
Bad Data that Changed the Course of History


Data drives all the major decisions in the world today. Every business relies on data to make daily strategic decisions. Every decision, from attending to customer needs to gaining competitive advantage, is made thanks to data.

As individuals, we rely on data for even the most basic daily activities, from navigating to and from work to communicating with friends and family. But what happens when the data we rely on to make our daily decisions is bad? It can have a drastic impact on our lives, whether it’s a small task like choosing where to eat or deciding whether a job candidate is qualified to hire. Relying on bad data can also have a drastic impact on your bottom line.

Bad Data is Costly

We know that bad data is costly, but just how costly can it be? IBM estimates that bad data costs the US economy roughly $3.1 trillion each year. That’s a huge number. IBM also found that 1 in 3 business leaders don’t trust the information they use to make decisions. Not only do they not trust the data they are working with, but there is also a high level of ...


Read More on Datafloq
Do Self-driving Cars Hold the Key to a Widespread IoT?


In 2014, Continental Tires developed tires that “talk to you”. The innovation, dubbed eTIS (electronic Tire Information System), consists of sensors embedded beneath the tire tread. The sensors relay information about when your tires are underinflated, when the tread is too low, and when your car is carrying too much weight from a heavy load. This new entry in the annals of IoT tech was relatively quiet and unglamorous, yet it forecasted what we’re seeing now: car manufacturers and tire manufacturers are throwing millions of dollars into technology that will enable a widespread internet of things.

Call it necessity facilitating innovation; as I reported in an earlier post here, 1.2 million people die in auto-related accidents every year. That means safety is in high demand. One way to increase safety is to embed things like tires with sensors that can communicate data with a car’s onboard computer. Another way is to replace humans with AI to create self-driving cars, which will hopefully do a better job than we do at driving.

For self-driving cars to truly succeed by 2020, the IoT needs 4.5 million developers. That’s because a comprehensive IoT infrastructure—in which smart cities talk to smart cars—will help driverless vehicles navigate ...


Read More on Datafloq
Snowflake and Spark, Part 1: Why Spark? 


Snowflake Computing is making great strides in the evolution of our Elastic DWaaS in the cloud. Here is a recent update from engineering and product management on our integration with Spark: This is the first post in an ongoing series describing Snowflake’s integration with Spark. In this post, we introduce the Snowflake Connector for Spark (package […]
Why Isn’t Big Data Called Small Data?


Sometimes I think that Big Data has a branding problem.

You see, for data scientists to gain the trust and buy-in of their colleagues, they have to explain how their analysis can add value. They take a “data ocean” of information and distill it into highly specific and actionable insights for every internal customer, refining and refreshing it along the way to ensure that it is as relevant as possible.

It is like they take the most powerful telescope imaginable and look for a speck of dust on the moon. “Here you go, this precise set of data will prove that you are right.”

The success of Big Data initiatives (to a large extent) comes in the ability to drill down from the planetary level to the sub-atomic level. It’s all about getting to those small insights that would never have appeared had you not started large and refocused, refocused and refocused. Of course, this doesn’t mean that the bigger trends are not relevant, but we have a tendency to view anything “large� with a certain amount of mistrust.

Somehow we naturally think that “big� things have a bigger margin for error, although the assumptions that we made on the way to the smaller insights could ...


Read More on Datafloq
How the GDPR will boost the Megatrend of Human Data Responsibility


First of all, some basic facts about the General Data Protection Regulation (GDPR). If you haven’t heard about it, you should pay attention now and get further information by visiting http://www.eugdpr.org/ or https://dsgvo.tips (for German readers). The GDPR will affect everybody working with personal data and is one of the major aspects of Human Data Responsibility (HDR).

The Facts:


The enforcement date of the GDPR is 25th May 2018, so you have a little over a year to introduce the new rules to your company.
There will be extremely heavy fines for organizations that don’t comply: up to 4% of global annual turnover or €20 million (whichever is greater).
The rules affect every organization working with personal data of citizens of the European Union, so this is a worldwide topic.
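
The fine ceiling described above (the greater of 4% of global annual turnover or €20 million) is simple to compute; a quick sketch:

```python
def gdpr_max_fine(global_annual_turnover_eur):
    """Upper bound of a GDPR fine for the most serious infringements:
    the greater of 4% of global annual turnover or EUR 20 million."""
    return max(0.04 * global_annual_turnover_eur, 20_000_000)

# A company with EUR 1 billion turnover: 4% (EUR 40M) exceeds the floor
print(gdpr_max_fine(1_000_000_000))
# A small company: the EUR 20M floor applies
print(gdpr_max_fine(10_000_000))
```

So for any company turning over more than €500 million, the 4% rule dominates.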


I also want to point out that, IMHO, the GDPR is a good thing. It is historically based on the Charter of Fundamental Rights of the European Union from the year 2000 (http://www.europarl.europa.eu/charter/pdf/text_en.pdf), where the protection of personal data (Article 8) is on the same level as Human Dignity (Article 1), the Right to Life (Article 2), and Freedom of Thought, Conscience and Religion (Article 10). ...


Read More on Datafloq
How to Avoid a Data Breach


Even a modest data breach can create serious problems that business owners cannot afford to take lightly. By some estimates, small businesses are the target of more than 40 percent of all online attacks. From giving staff additional training to avoid bad habits that may lead to security concerns, to staying informed about new threats just over the horizon, business owners would be wise to take whatever steps are necessary to enhance their digital security.

Consequences of a Breach

From the largest and most high-profile breaches to situations where business owners may not even be aware that their accounts or information have been compromised, calculating the true cost of a breach can be difficult. In addition to tangible losses such as stolen funds or assets, long-term damage to the brand or image of a business can be quite costly. Consumers who have reason for heightened concern about their personal, account or financial information are far more likely to take their business elsewhere. Failing to address digital security concerns could end up sending the wrong message to potential ...


Read More on Datafloq
What To Consider When Hiring Data Science Talent


The truth is that hiring for data science is in many ways more of an art than a science. That sounds oxymoronic, but it is no less true. The reason is obvious: data science is so new that it can be hard to know what you’re actually looking for. What set of skills and abilities will make a data science team fly, and which will make it flounder?

If you don’t know, then you’re certainly not alone. Fortunately, we do have some years of company experience to draw on. What’s more, IT has been with us for nearly two decades, and there are plenty of valuable lessons there too that we can apply to data science hires.

So let’s see what we’ve learned so far.

It’s not only about the numbers

A lot of companies think that if they just hire a couple of people who are incredibly good with numbers, it will all sort itself out. That couldn’t be further from the truth, because the numbers alone won’t get you anywhere.

In the data community, there’s a famous saying: Garbage In, Garbage Out. When they say it, they’re mainly talking about the quality of the raw ...


Read More on Datafloq
Buzzing 2017 Trends That Will Affect Big Data Users


The smartest way of predicting how 2017 will turn out for big data is to say that it will just get bigger and better. What will get bigger is the number of companies using big data, and what will get better is the way big data technologies are employed.

Technologies change so fast that it is almost impossible for organizations to keep up with the growth at times. This makes it imperative for organizations to be informed about what will trend and what is likely to shape the future, so that selections can be made appropriately. Here, we list the big data trends that will affect organizations in 2017, and the big data industry as a whole.

Boom in the Internet of Things (IoT)

In the past few years, we have seen glimpses of IoT being adopted in luxury goods. Some prominent researchers have predicted a revolution in IoT that is sure to generate oodles of data, pushing big data technologies to customize their offerings and center them around IoT.

Cloud for Everything Big Data

So far, there has been a mixed reaction to choosing cloud for storage of trivial data. But, it seems that companies have finally found the right mix with hybrid ...


Read More on Datafloq
Do You Know How to Create a Dashboard?


Chalk Board
Image by Travis Wise via flickr (https://flic.kr/p/MiM8yL)

What a question.

Of course, anyone can create a dashboard for their business, right? If I don’t have the time, my IT folks could whip up some for me in a matter of hours, right?

Even more tragically, some read the above question and thought it was “Do you know how to create a chart?” which is a different question entirely.

The truth is, not everyone can create a useful and always up-to-date business dashboard. And to those who think they have the best IT department in the whole industry, here is one little surprise: They may not have the necessary experience to build one for you either.

Really? Is it that difficult to create a business dashboard?

Everyone seems to have one or five displayed on LCD screens in their hallways or conference room nowadays, how can it be that hard to create?

In truth, it is not the creation of the dashboard that is difficult, but the useful and always up-to-date part.

Useful How?

Smart companies (such as our clients) with LCD screens throughout their office use them to broadcast important numbers that show the health of the company. By doing this, they engage all employees to think constantly about how their day-to-day tasks affect those numbers. (My last article discussed one of the benefits of this approach.)

Therefore the numbers (and figures, and visualizations) on the screen had better be useful for everyone in the company.

Here is the problem: these numbers usually hide inside multiple systems, several spreadsheets, and the heads of a few key personnel.

And, they regularly — if not constantly — change.

So They Need to be Always Up-to-date?

Bingo.

Now you start to see that to design, build, maintain, and keep up with the changes even for a single dashboard is quite a bit of work. Are you sure now that your IT department has the bandwidth (not to mention the required skills and experience)?

It is really a full-time job for a qualified person; in a lot of cases a single person is not even enough, and it requires a team.

Okay, Mr. Smartypants, What do you suggest then?

Let’s start by answering these questions:

  1. Why do I need a dashboard? What purpose does it serve in my company at the moment? One good answer: “I need a lot of visibility into what can give my company the best chance to not only survive, but excel in a fiercely-competitive industry.”
  2. What do I want on the dashboard? Do I know enough about the metrics (ok, KPI if you have to use a buzzword) that affects the bottom line, also the ones that show me the pulse — or even more useful: Problem areas — within the company? If you don’t already have a project to find these metrics, now would be a good time to start one, because most likely your competitors are working on it as well.
  3. Who can help me design, build, and maintain these dashboards? Can my existing personnel do it? Or is it time to chat with folks whose day-to-day business is to design, build, and maintain other companies’ dashboards?

But Didn’t We Just Buy That BI Tool?

Maybe, but a BI suite of tools cannot automagically design, build, and maintain your dashboards. Someone still has to gather, clean up, and prepare the data so the tools can be used on it, and keep doing this as changes come and go.

One fallacy in the BI tool industry is the failure to mention the crucial part: without a well-designed and well-maintained data warehouse underneath, even the most sophisticated analytics and visualization tool is useless.

Why a company like nextCoder exists?

Because we are very useful to our clients. It matters not whether they have already paid for BI tools such as Power BI, Tableau, Domo, Birst, etc., because we help them design useful dashboards based on existing data, existing tools, and our experience from working on dashboards across different industries, thereby accelerating the process of making data analytics part of the company’s program to grow and compete in its industry.

“What if we don’t have a tool yet?” Then consider DW Digest(TM), which is designed to work seamlessly with our data warehouse designs and implementations. It is also competitively priced against the tools I mentioned above.

Our online platform is designed to keep these useful dashboards up-to-date and able to cope with changes. Using our services, our clients can concentrate on ironing out kinks, discovering more opportunities, and saving time and ultimately cost throughout the company and across departments, all without worrying about how to maintain those dashboards.

Last question

You may indeed be able to create useful and always up-to-date dashboards. But since your business is probably not making dashboards, is it the best use of your time? Of your team’s time?

If you have any questions on how to achieve measured acceleration for your business using dashboards, send them to: will.gunadi@dwdigest.com or call me at 214.436.3232.

What Can Security Analytics Give Your Team?


With the constant changes happening in the technology industry and ever-increasing cyber security threats, businesses are now more eager than ever to protect their data and company assets. As more and more devices come equipped with internet capabilities, create data, and store personal information, there are more routes available for hackers to access this information and for a business to experience a security breach.

Cyber security is necessary for developing and conducting the appropriate safety measures that will ultimately protect an organization’s computer systems, networks, and confidential information. Having access to the best talent and technology is crucial for businesses and other institutions to keep up with and surpass the threats and constant efforts of cyber hackers. With these shifts in technology, whether they be data storage or video analytics, traditional perimeter protection tools are just not enough anymore. As businesses look for security solutions to invest in, they should consider funding a security analytics project as part of their cyber defense program.

Security monitoring and analytics proves to be one of the most fundamental services within a business’s information security system. Establishing your own in-house security operations center within your company to manage comprehensive monitoring and alerting services can ...


Read More on Datafloq
Data, Metadata, Algorithms & Ethics


The topic of ethical big data use is one that will likely continue popping up in the headlines with increasing frequency in the coming years. As the IoT, AI, and other data-driven technologies become further integrated with our social identities, we will see more and more discussion regarding their regulation.

Recently, transparency advocates began pushing The Open, Public, Electronic and Necessary (or OPEN) Government Data Act, which aims to publish all non-federally restricted data in an open source format, allowing for standardized use by the government as well as the public.

“Our federal government is not just the largest organization in human history, it’s also the most complex,” said Hudson Hollister, executive director of the Data Coalition, in an article in the Federal Times. “To conduct oversight across such scale and complexity is a daunting challenge, fortunately, that is where transparency comes in. By giving Americans direct access to their government’s information, we can deputize millions of citizen inspectors general to help this committee fulfill its mission.”

This type of standardization, transparency, and ethical foresight aims to create a fair and balanced framework for the use of Big Data. Considering the pace of automation and IoT growth, these standards could begin affecting every industry ...


Read More on Datafloq
Why Marketing Needs Quality Data before Big Data and Predictive Analytics


Recent marketing hype has been about new analytics, big data, and becoming marketing technologists. However, there are some fundamentals which must first be addressed, and a key stumbling block to effective marketing is the generally poor quality of data. Data quality is non-negotiable. In a recent study, Britain's Royal Mail Data Services found that poor-quality data costs businesses an average of 6% of annual revenue. While there was some variance among respondents, clearly no company can afford to ignore this problem.

This concern with data quality is not limited to the United Kingdom. Experian's data quality arm, in their annual benchmark report on global data quality, reported that while most businesses globally (and 95% in the US) use data to meet their business objectives, less than 44% of them trust their data.

Customer Experience is Top of Mind for 2017

Some 56% of the respondents in Experian's report want to serve their customers better in 2017, and recognize that a key factor in achieving this is better data. Providing a rich customer experience is the name of the game, and poor or erroneous information about that customer could cause the end of that relationship. It has become apparent to most ...


Read More on Datafloq
Why VPNs Are Vital For Data-Driven Business


Companies have been using Virtual Private Networks (VPNs) for years, typically so that workers could access their desktops remotely, but in the age of big data, these systems are more important than ever before. The fact is, remote working is on the rise, data use is multiplying, and hackers are more innovative than ever before. VPNs are one of the best tools at our disposal for protecting our businesses, our data, and our clients.

Is your company armed against the constant threat of data theft? Here’s what you need to know to keep your business safe.

Are You Out There?

Did you know that 80 percent of corporate professionals work outside the office at least once a week? That’s a lot of people operating outside the protection and constraints of the typical workplace, such as multi-level encryption, firewalls, and protected servers. Though digital attacks can take place anywhere, shifting away from the office puts workers at a unique risk.

Always remind your workers to take precautions when using free WiFi on the road, and invest in a VPN they can use no matter where work takes them. Yes, free WiFi is a must-have at hotels and cafes, but using it without ...


Read More on Datafloq
Top 5 ways to use Big Data to improve your Website Design


Big Data is a buzzword these days. Are you wondering what big data is? So, firstly let's get the definition out of the way so that we can begin on the same page.

What is Big Data?

Big Data refers to huge volumes of data, both structured and unstructured. The volume is so massive that it is almost impossible to process with traditional means. Per Cloud Tweaks, 2.5 quintillion bytes of data are produced every single day, and experts predict that 40 zettabytes of data will exist by the end of 2020. So, basically, big data is everywhere, and it's shaping the internet and influencing the way we do business.
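A quick back-of-the-envelope check puts those two figures in proportion (a sketch using decimal SI units; 2.5 quintillion bytes is 2.5 exabytes):

```python
# Sanity-checking the quoted figures with decimal (SI) units
BYTES_PER_DAY = 2.5e18   # 2.5 quintillion bytes ~ 2.5 exabytes per day
TARGET_BYTES = 40e21     # 40 zettabytes = 40,000 exabytes

# How long would it take to accumulate 40 ZB at today's rate?
days_at_current_rate = TARGET_BYTES / BYTES_PER_DAY
print(days_at_current_rate)           # 16000.0 days
print(days_at_current_rate / 365.25)  # roughly 44 years
```

In other words, roughly 44 years at today's rate, so the 40-zettabyte prediction implicitly assumes the daily rate itself keeps accelerating.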

How Big Data is influencing the Web Design

With the help of big data, businesses can create a data-driven web design that delivers the best user experience. A data-driven website design is not restricted to functionality and visual appeal; rather, it takes a more scientific approach to web design, focusing on how design can help a company website gain more traffic and leads. It has been observed that businesses that switch to data-driven web design enjoy ...


Read More on Datafloq
How to Capitalize on the Complex Modern Data Ecosystems


The data ecosystem serving today’s modern enterprises is a multi-platform architecture that attempts to embrace a variety of heterogeneous data sources. This modern data ecosystem (MDE) might include data lakes, traditional data warehouses, SaaS deployments and other cloud-based systems, data hubs, and distributed databases.

Figure: a multi-platform architecture, the reality of the modern enterprise.



MDEs can potentially enable a wide variety of business goals, support data diversity, optimize costs, and support multiple systems of insight. However, MDEs will never deliver these benefits unless enterprises can surmount a series of formidable challenges:


Data ownership. Who owns the data and with whom can it be shared?
Integration and unification. How will disparate data be integrated and unified to support reporting and analysis across the entire portfolio?
Data quality risks. How will an enterprise ensure adequate data quality given that different data systems will be characterized by different levels of data quality?
Skillset scarcity.  How will an enterprise fulfill the need for a diverse set of skills?
Optimization issues. How will an enterprise optimize the interaction among an MDE’s separate, poorly orchestrated components?
Multiple data models. How will an enterprise work with multiple data models that proliferate, reducing efficiency?
Holistic view. How will an enterprise establish a sustainable method for gaining ...


Read More on Datafloq
3 Fresh Approaches to Maximize Customer Value with Data


New customer acquisition is costly. And customers are increasingly demanding, fickle, and empowered with endless options — new and old — to spend their dollars. So brands are rightly focused on increasing retention and share of wallet to maximize customer value.

Brands know that data holds the key to making the customer value gains they want to see. But they struggle to leverage that data in the right way. Here are 3 fresh approaches many brands are not using, but should consider, to improve customer value.

1. The More Data, The Merrier

You are collecting some data — likely even a lot of data — about your customers. You’ve got some demographics, geography/location, and purchase history. You may have their customer service history and website behavior as well.

Don’t stop there. Do you know their marital status, education level, income? How about the words they said when talking to a customer service rep? How about their tweets? How loyal to your brand are their friends, family, and coworkers in their social networks?

And don’t stop with your customers. What about the enterprise itself? You’ve got a wealth of data about every aspect of the business, including sales data, ops data, and much more.

Why is this important?

Many ...


Read More on Datafloq
Connected Cars: Big Data’s Next Mining Ground


One of the exciting things about the future of big data is that it will likely start acting more like a living system as products like the self-driving car mature in the marketplace. Instead of needing to correlate the differences manually, the analytical infrastructure will allow accruing data to be analyzed and acted upon automatically. That future should be reassuring for most insurance companies, which will then only have hacking to worry about when it comes to serious accidents and large payouts.

Data mining that makes sense

Most people have noticed that the amount of privacy that they have in their lives is continuously shrinking. Part of this is due to convenience, while another part is due to additional security that takes away individual freedoms in order to provide the entire neighborhood, town, or community with better protection.

Because the tech industry has already set up licensing that subjects users of smartphones and tablets in or connected to cars to wide-scale data gathering, auto manufacturers are scrambling to put together sophisticated programs that take advantage of the data they are allowed to collect.

Insurance industry faces a great deal of change

Another ...


Read More on Datafloq
Cloudera Analyst Event: Facing a New Data Management Era


I have to say that I attended this year’s Cloudera analyst event in San Francisco with a mix of excitement, expectation and a grain of salt also.

My excitement and expectation were fuelled by all that has been said about Cloudera and its close competitors in the last couple of years, and by the fact that I am currently focusing my own research on big data and “New Data Platforms”. Moreover, when it comes to events hosted by vendors, I always recommend taking the vendor's statements with a grain of salt, because the information might logically be biased.


However, in the end, the event was an enriching learning experience, full of surprises and discoveries. I learnt a lot about a company that is certainly contributing big time to the transformation of the enterprise software industry.

The event certainly fulfilled many of my “want-to-know-more” expectations about Cloudera and its offering stack; the path the company has taken; and their view of the enterprise data management market.

Certainly, it looks like Cloudera is leading and strongly paving the way for a new generation of enterprise data software management platforms.

So, let me share with you a brief summary and comments about Cloudera’s 2017 industry analyst gathering.

OK, Machine Learning and Data Science are Hot Today

One of the themes of the event was Cloudera’s keen interest and immersion into Machine Learning and Data Science. Just a few days before the event, the company made two important announcements:

The first one was about the beta release of Cloudera Data Science Workbench (Figure 1), the company’s new self-service environment for data science on top of Cloudera Enterprise. This new offering comes directly from the smart acquisition of machine learning and data science startup, Sense.io.

Figure 1. Cloudera Data Science Workbench (Courtesy of Cloudera)
Some of the capabilities of this product allow data scientists to develop in some of the most popular open source languages —R, Python and Scala— with native Apache Spark and Apache Hadoop integration, which in turn speeds up project deployments, from exploration to production.

In this regard, Charles Zedlewski, senior vice president of Products at Cloudera, mentioned that

“Cloudera is focused on improving the user experience for data science and engineering teams, in particular those who want to scale their analytics using Spark for data processing and machine learning. The acquisition of Sense.io and its team provided a strong foundation, and Data Science Workbench now puts self-service data science at scale within reach for our customers.”


One key approach Cloudera takes with the Data Science Workbench is that it aims to enable data scientists to work in a truly open space that can expand its reach to use, for example, deep learning frameworks such as TensorFlow, Microsoft Cognitive Toolkit, MXNet or BigDL, but within a secure and contained environment.

This is certainly a new offering with huge potential for Cloudera to increase its customer base, but also to reaffirm and grow its presence within existing customers, which can now expand their use of the Cloudera platform without the need to look for third-party options to develop on top of.

The second announcement was the launch of the Cloudera Solution Gallery (Figure 2), which enables Cloudera to showcase its large partner base, more than 2,800 partners globally, and a storefront of more than 100 solutions.

This news should not be taken lightly, as it shows Cloudera’s capability to start building a complete ecosystem around this robust set of products, which in my view is a defining trait of companies that want to become a de facto industry standard.

Figure 2. Cloudera Solution Gallery (Courtesy of Cloudera)

Cloudera: Way More than Hadoop

During an intensive two-day event filled with presentations, briefings and interviews with Cloudera’s executives and customers, a persistent message prevailed. While the company recognizes its origin as a provider of a commercial distribution of Hadoop, it is now making it clear that its current offering has expanded well beyond the Hadoop realm to become a full-fledged open source data platform. Hadoop is certainly at the core of Cloudera as the main data engine but, with support for 25 open source projects, its platform currently offers much more than Hadoop distributed storage capabilities.
This is reflected through Cloudera’s offerings, from the full-fledged Cloudera Enterprise Data Hub, its comprehensive platform, to one of Cloudera’s special configurations:




Cloudera’s executives made it clear that the company strategy is to make sure they are able to provide, via open source offerings, efficient enterprise-ready data management solutions.

However, don’t be surprised if the message from Cloudera changes over time, especially if the company sets its aim on larger organizations, which most of the time rely on providers that can center their IT services on the business and are not necessarily tied to any particular technology.

Cloudera is redefining itself so it can reposition its offering as a complete data management platform. This is a logical step considering that Cloudera wants to take a bigger piece of the large enterprise market, even when the company’s CEO stated that they “do not want to replace the Netezzas and Oracles of the world”.

Based on these events, it is clear to me that Cloudera will eventually end up competing head-on in specific segments of the data management market, especially with IBM, through its IBM BigInsights, and with Teradata, whose multiple products have left and keep leaving a very strong footprint in the data warehouse market. Whether we like it or not, big data incumbents such as Cloudera seem destined to enter the big fight.

The Future, Cloudera and IoT

During the event I also had a chance to attend a couple of sessions specifically devoted to showing Cloudera’s deployments in the context of IoT projects. Another thing worth noting is that, even though Cloudera has some really good stories to tell about IoT, the company seems not to be in a hurry to jump directly on this wagon.

Perhaps it’s better to let this market become mature and consistent enough before devoting larger technical investments to it. It is always very important to know when and how to invest in an emerging market.

However, we should be very well aware that Cloudera, and the rest of the big data players, will be vital for the growth and evolution of the IoT market.

Figure 3. Cloudera Architecture for IoT (Courtesy of Cloudera)

It’s Hard to Grow Gracefully

Today it’s very hard, if not impossible, to deny that Hadoop is strongly immersed in the enterprise data management ecosystem of almost every industry. Cloudera’s analyst event was yet another confirmation. Large companies are now increasingly using some of Cloudera’s different options and configurations for mission-critical functions.

For Cloudera, then, the nub of the issue now is not how to get to the top, but how to stay there, evolve and leave its footprint at the top.

Cloudera has been very smart and strategic to get to this position, yet it seems it has gotten to a place where the tide will get even tougher. From this point on, convincing companies to open the big wallet will take much more than a solid technical justification.

At the time of writing this post, I learnt that Cloudera has filed to go public and will trade on the New York Stock Exchange, and as an article in Fortune mentions:

“Cloudera faces tough competition in the data analytics market and cites in its filing several high-profile rivals, including Amazon Web Services, Google, Microsoft, Hewlett Packard Enterprise, and Oracle.”

It also mentions the case of Hortonworks, which:

“went public in late 2014 with its shares trading at nearly $28 during its height in April 2015. However, Hortonworks’ shares have dropped over 60% to $9.90 on Friday as the company has struggled to be profitable.”

In my opinion, in order for Cloudera to succeed while taking this critical step, it will have to show that it is well prepared business-wise, technically and strategically, and also ready for the unexpected, because only then will it be able to grow gracefully and play big, with the big guys.

Keep always in mind that, as Benjamin Franklin said:

“Without continual growth and progress, such words as improvement, achievement, and success have no meaning.”

Meet me in St. Louie, Louie.

Meet me in St. Louie, Louie.

Join me on May 2nd in St. Louis for the SilverLinings event where I will give three talks!
The Importance of Big Data in Real Estate Business

The Importance of Big Data in Real Estate Business

The concept of big data is not new; it has been around for years, and as it has worked wonders, organizations around the world have embraced this innovative means. As soon as they realized its true potential, profit-making institutions in today’s market took the opportunity to capture all the data streaming into their business, thereby strengthening their profit margins. Applying the required analytics and acquiring significant value from the data has now become common among top marketing honchos in search of ground-breaking methods to boost their business.

The biggest advantages big data analytics brings on board are effectiveness and speed. Gone are the days when it used to take a great deal of effort to gather information, run analytics and uncover insights that could be used for future decisions. With the advent of big data, it is no longer a challenge to draw out insights for instant decisions. With this smart technology in use, you can work faster, stay nimble and gain a competitive edge that you hardly had before.

Now, when it comes to the real estate business, big data again gives the sector ...


Read More on Datafloq
5 Platforms that Protect Your Startup from DDoS Attacks

5 Platforms that Protect Your Startup from DDoS Attacks

Cybersecurity may not have been on your list of priorities when you founded your startup. If so, then you’ve already committed one of your first mistakes in business. Looking at the big hacking incidents of 2016, it’s apparent that cyber criminals are getting more sophisticated and ruthless every year. Some of them even offer on-demand DDoS services, which competitors can leverage for as little as $5.

A DDoS attack utilizes a large network of infected computers, also known as a “botnet”, to flood the target website’s servers and deny access to real users, thus resulting in lost revenue. It may also lead to secondary damages such as lost data, high remediation costs, and a ruined brand reputation.

To help you better understand how DDoS attacks work, you can refer to the infographic below:



Infographic source: Incapsula – The Anatomy of a DDoS Attack

To protect your startup from such attacks, below are 5 platforms you should consider:

1. Cloudflare

The most straightforward way to protect against DDoS attacks is to leverage a comprehensive security platform like Cloudflare. It has everything you need to protect your startup from security threats, including but not limited to a web application firewall (WAF), a shared SSL certificate, and advanced DDoS ...


Read More on Datafloq
How Facial Recognition Can Help to Understand People

How Facial Recognition Can Help to Understand People

The facial recognition market is expected to grow to more than $2 Bn by 2020. While that’s a small figure compared to that of the analytics market, which is expected to grow to a whopping $200+ Bn around the same time, the demand for face analytics continues to grow in line with expectations and, therefore, so does the application of big data and analytics in many spheres of our lives.

Facebook, Google, Amazon, Microsoft, and a host of other technology majors have acquired, and continue to be on the lookout for, start-ups and companies delving deep into the area of facial recognition. This is testimony not only to the growing demand for facial recognition tools but also to the power they can equip organizations with to do many wonderful things that weren’t even imagined earlier, much less possible. One of the many premises is how companies and organizations can understand people (customers, prospects, visitors, strangers, patrons, commoners, suspects, etc.) beyond online footprints and other such touch-points.

Admittedly, facial recognition software has been in use for quite some time now. However, that was limited use by a select few such as the state and federal investigating agencies, security organizations and, perhaps, a handful of businesses, where ...


Read More on Datafloq
How Virtual Reality Apps Influences Small Businesses

How Virtual Reality Apps Influences Small Businesses

Virtual reality offers a thrilling experience to users. It engages their senses and stimulates feelings that 2D visuals cannot stir up. With ongoing advancement, this technology will offer users an ever more immersive and exciting experience. Though it has so far progressed mainly in gaming, healthcare, learning and other commercial areas, it will soon become a household tool. The acquisition of Oculus VR by Facebook guarantees that virtual reality is set to influence people’s daily living.

Virtual reality has not only affected users’ daily lives, but it is also transforming the operations of businesses. It has become a necessity that small business owners can use to boost their businesses. This technology can make rapid higher returns on investment possible. Besides, it is an effective marketing tool.

5 Ways Virtual Reality Can Influence Small Businesses

1. Changing Customers’ Perception

Virtual reality is not only used for improving sales but can also change consumers’ opinion of a brand favorably. Some businesses such as JCPenney and Marriott have used it to entertain their customers and promote their products. With exciting virtual tours of faraway places, these brands have used VR technology to win more approval from people. Business owners can use it to give a unique experience to ...


Read More on Datafloq
Big Data & the US Vote: Did Trump Change Their Mind?

Big Data & the US Vote: Did Trump Change Their Mind?

Two summers before the 2016 US Presidential election, I was sitting around a bonfire in the wilds of Kenya, lingering in the peace that comes from spending the day amongst the extraordinary wildlife of safari (that’s ordinary for Africa). An intimate gathering of around 15 guests from all continents, the conversation was friendly and centered on the day’s sightseeing. Eventually, though, it meandered into the typical conversational vernacular: who you are, where you’re from and what you do.

There was an extended pause as we all gazed into the fire, reminiscing on elephants, lions, and miles of wildebeest trekking the Mara. I was dreamily wondering about the potential of the universe when an unexpected verbal volley shot across the flames.

“So what do you think about Donald Trump becoming President?” Like a grandmother’s awkward question about a pregnant member at the holiday family dinner table, heads turned, and all eyes rolled toward me, the sole American representative.

I wish I could say I easily conjured an interesting and insightful and perhaps even clever reply to demonstrate my thorough comprehension of American politics but honestly the thought in my mind a year before primaries was “… Donald Trump is running for ...


Read More on Datafloq
How Big Data Personalization Influences the Banking Industry

How Big Data Personalization Influences the Banking Industry

It is a common belief today that banks and big banking corporations basically control the world. They deliver funds to the right place, handle big transactions, choose who they want to support, and, most of all, decide what the future holds for the world.

But, a person could find themselves wondering about other aspects of the banking industry, such as the data they gather, possible security risks, as well as personalized approach to every one of their clients. When taking into consideration how big data research affects the personalization process, things get a little more complicated.

Why is big data needed by the banks?

We see banks as large corporations that do their job behind closed doors, inaccessible to people who aren’t bank employees, except in some rare cases.

What is accessible to us though is the money transaction process which is restricted to our personal use. This is the part we do and should know well as it is our money that is in question. Yet, our knowledge of banks is limited, while they know a lot more about us, and make use of that information.

Simply by using your social security number and bank account, a bank can learn ...


Read More on Datafloq
IoT: An Open Ocean of Opportunities

IoT: An Open Ocean of Opportunities

The Internet of Things (IoT) is in the wild west phase of its development. There are no formal regulations that dictate a security protocol which is evenly employed by all manufacturers, yet. While there is a distinct lack of structure in the developmental process, there is no shortage of pioneers eagerly hitching up their wagon and venturing out to explore the vast new wilderness. What this means is there are ample opportunities within each industry to become early adopters who can make the most out of this new wave of technology as long as they are carefully vetting the providers with which they choose to work.

Fleet Management

Those companies that have large transportation fleets stand to benefit substantially from the growing integration of IoT technologies. IoT fleet management with companies like Fathym makes it possible to monitor drivers in real time, which can be beneficial to both the driver and the company.

Fleet managers have the ability to maintain far more control over their fleet than ever before. Cell phones and GPS technology have steadily increased the amount of influence and control they can exert. With the integration of the IoT even more is possible. Dispatchers can monitor speeding, unscheduled stops, vehicle maintenance issues, and accidents.

This ...


Read More on Datafloq
10 Tools for the Novice Data Scientist

10 Tools for the Novice Data Scientist

Data scientists harness their knowledge of statistics to convert collected data into potential ideas for product development, customer retention, and the generation of business opportunities. It could even help a dissertation writing service with its work. Recently, data science was dubbed the sexiest job of the 21st century, as demand for data scientists keeps increasing. In order to become one, you have to gain the necessary skills to enter the world of data science. And when you do, here are some tools you can use to practice on:

RapidMiner

It began in 2006 as an open-source program under the name Rapid-I. Over the years it was renamed RapidMiner, and the company secured 35 million US dollars in funding. Older versions of the tool remain open source; the latest version can be tried for 14 days, and a license can be bought after that. RapidMiner covers the whole life cycle of predictive modeling, including deployment and validation. The graphical user interface is designed using a block-diagram approach, similar to Matlab Simulink’s.

BigML

This is one more platform that provides a great graphical user interface, which can be used in 6 easy steps:

Sources – Makes use of various sources of data
Datasets – ...


Read More on Datafloq
Why Artificial Intelligence Has Its Negative Sides Too

Why Artificial Intelligence Has Its Negative Sides Too

The emergence of Artificial Intelligence as a technology has been welcomed with a lot of accolades and commendations. Everybody keeps thinking about the benefits and convenience it comes with. But only a very few people have thought about its negative aspects. This article seeks to compare the advantages of Artificial Intelligence to its disadvantages and leaves the reader to conclude if the technology is really worth it. 

No doubt, Artificial Intelligence comes with a level of convenience like no other technology, and it is being adopted in different industries. Nobody will encounter an application of Artificial Intelligence and not fall in love with the technology. Before the disadvantages of the technology are delved into, it is necessary to outline some of its most popular applications.

Automatic Lawn Mowers

A very good category is lawn mowers. Gone are the days when you needed to handle the mower and move about with it to manually mow every part of your lawn. Now there are several types of do-it-yourself remote control lawn mowers. You can now mow the lawn right from your living room. In short, mowing the lawn has just become fun.

Smart Refrigerator

Another wonderful smart home gadget that is worthy of mention is the ...


Read More on Datafloq
7 Basic Misconceptions about Cloud Technology in a Business

7 Basic Misconceptions about Cloud Technology in a Business

The cloud is growing larger every day, with more and more customers either storing their data there or using its other functions. In a time of rapid change, it's easy for the cloud to be seen by some as a panacea to cure their many IT ills, while for others it presents as a source of impending woe.

Countless people who want to find out more are asking "what is cloud computing?", while others want to get to the root of the misconceptions circulating. These have left Dublin business users and potential business users somewhat confused about the true facts.

Here are seven major misconceptions about cloud technology in a business that create hurdles to adopting it for better results.

1. Cloud is not safe

This is perhaps the most significant and least grounded belief about the cloud. Many managers claim that they wouldn’t want critical information about a company to float somewhere around the internet, stored in shared hardware and accessible by anyone ranging from regular users to the National Security Agency.

The truth is that cloud providers usually boast many more security protocols than any company could ...


Read More on Datafloq
Big Data and Risk Management in Financial Markets (Part II)

Big Data and Risk Management in Financial Markets (Part II)

I. Introduction to forecasting

If you missed the first part, I suggest you read it before going through this article. It gives a good introduction as well as an overview of traditional risk management and big data simulation. This article is instead more focused on big data forecasting.

There are nowadays several new techniques or methods borrowed from other disciplines which are used by financial practitioners with the aim of predicting future market outcomes.

Eklund and Kapetanios (2008) provided a good review of all the new predictive methods and I am borrowing here their classification of forecasting methods, which divides techniques into four groups: single equation models that use the whole datasets; models that use only a subset of the whole database, even though a complete set is provided; models that use partial datasets to estimate multiple forecasts averaged later on; and finally, multivariate models that use the whole datasets.


II. Single equation models

The first group is quite wide and includes common techniques used differently, such as ordinary least square (OLS) regression or Bayesian regression, as well as new advancements in the field, as in the case of factor models.

In the OLS model, when the time series dimension exceeds the number of observations, the generalized inverse has to be used in order to estimate the parameters.
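As a minimal sketch of this point (not taken from Eklund and Kapetanios; the data and dimensions here are invented for illustration), when the number of predictors exceeds the number of observations, X'X is singular and the usual OLS formula cannot be inverted, so the Moore-Penrose generalized inverse yields the minimum-norm least-squares estimate instead:

```python
import numpy as np

# Hypothetical example: more predictors (p) than observations (n),
# so (X'X)^{-1} does not exist and np.linalg.pinv is used instead.
rng = np.random.default_rng(0)
n, p = 20, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 1.0                    # only a few predictors matter
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Generalized-inverse OLS: minimum-norm solution of the least-squares problem
beta_hat = np.linalg.pinv(X) @ y
residuals = y - X @ beta_hat           # essentially zero when n < p and X has full row rank
```

Because the system is underdetermined (n < p), the fit is exact up to numerical precision; the generalized inverse picks the coefficient vector of smallest norm among the infinitely many that fit the data.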

Bayesian regression (De Mol, Giannone, ...


Read More on Datafloq
The Dirty Secret About Predictive Analytics

The Dirty Secret About Predictive Analytics

For some time now, predictive analytics has been hailed as the “next big thing.” A quick Google search for “predictive analytics” shows that everyone from Forbes to The Wall Street Journal and beyond has written about how predictive analytics is going to “transform business” and “turn analytics on its head” and even “make BI obsolete”.

However, despite years of optimism, the analytics market is still dominated by visualization and business intelligence software such as Tableau, Qlik, and Birst. If predictive analytics is the next best thing, why isn’t everyone using it?

Predictive analytics examines data and tells you what is likely to happen in the future. It promises to give you the power to “predict the future of your business” and to “know what will happen next.” And current technology is capable of making thousands, even millions, of predictions each second. Sounds pretty darn impressive.

But the dirty secret is that much of the automated predictive analytics technology on offer simply isn’t very useful. Why? Knowing what’s going to happen next is nice, but if you don’t know why, you won’t know what to do about it, and it will be of little value.

Consider the simple case of customer churn. With predictive analytics, you will know ...


Read More on Datafloq
What Most People Get Wrong About Data Lakes

What Most People Get Wrong About Data Lakes

The technology industry has continued to find new ways to interpret big data, develop artificial intelligence, create backup solutions, and expand the cloud into a platform for businesses. One issue many businesses face is finding ways to make data analysis easier in order to deliver faster and more insightful results. The data lake has helped businesses accomplish this. It has become a popular big data tool due to its ability to support the accumulation of data in its original format from a potentially infinite number of sources, as diverse as social media, ticketing systems, and automatic sensors.

The data lake is still a relatively new addition to the technology industry. Since it is still developing, there are often a large amount of misconceptions about what data lakes are and how they work. Below are the top six misconceptions.

The Data Lake is Considered Independent Technology

The data lake supports the big data endeavors of businesses by creating a path to the discovery of brand new insights. Many users would describe the data lake as another technological tool, but it can be more precisely defined as the aggregating of old tools. This misconception comes ...


Read More on Datafloq
Don’t Let Your Big Data Project Be a Failure

Don’t Let Your Big Data Project Be a Failure

Data collection and analysis is an up-and-coming industry that has grown extensively over the past few years. With the rise of the digital age and our increasing connection to the internet, we have created an immense amount of data. Reports show that as of 2013, 90 percent of the world's data had been collected in the previous two years. It’s anticipated that in 2020, the digital world will have as many digital bits as there are stars in our universe. Several prominent companies have learned to use this data and its benefits to improve their businesses. Companies such as Amazon, Facebook, Google, and Apple are among the biggest businesses leading the data collection industry, with Google being one of the largest.

Big data analysis can be used to improve any company in any industry by helping to direct business owners, executives, and other professionals in their important decision-making. Big data analysis works to collect and interpret a vast amount of information and then provides its findings to businesses. These discoveries allow business owners and executives to make better and more informed decisions. However, big data analysis can seem like a nebulous cloud for companies who are currently trying to incorporate ...


Read More on Datafloq
7 Key Questions You Should Ask When Building a Chatbot

7 Key Questions You Should Ask When Building a Chatbot

Conversational AI is positioned to be the next frontier of an engaging consumer experience. We have experienced Google Now and Siri in our day-to-day lives for some time now, and I must say these are very promising products. They will keep getting better in the coming years, and you can imagine a smartphone without any apps at all, just an assistant to help you do all your tasks. Really exciting!

Indian enterprises too have been warming up to the use of conversational AI products with Chatbots being the popular flavor. Prominent use cases have been


Customer Service: You can serve your customers’ requests and provide access to information at a fraction of the cost and at a larger scale compared to human capital. We can already find such use cases among e-commerce and financial services companies.
Transactional Purpose: You can perform transactions like booking a cab, ordering a pizza, recharging your mobile or buying a product. There are different B2C startups providing such platforms.


The decision to use a chatbot as another consumer touchpoint needs to be evaluated very carefully, rather than taken as just a “me too!” approach. The thing with such products is that either they work well and provide a great consumer experience, or they become part of troll ...


Read More on Datafloq
The Big Data Startup That’s Disrupting the Industry

The Big Data Startup That’s Disrupting the Industry

Big data has been completely disrupting the way we do business as a society for the last several years. Like many things that disrupt business as we know it, big data has also become mainstream recently—every company wants to leverage its power, and for good reason. Now that the industry is growing in popularity, however, some of its limitations are becoming apparent. Some startups are working to fix inherent problems within the industry and evolve the available technology to make big data even more valuable to companies in every industry. One of the biggest problems in big data today is analysis: it’s easy to collect and store data, but figuring out how to analyze and use it can be a major challenge. Taking on that challenge is Neo Technology’s mission—they’re ready to bring easy big data analysis to any company that wants it.

The Challenges of Big Data

Once the ability to collect, store and analyze large datasets efficiently became available, companies with large budgets began assembling data analytics teams and giant databases. These teams were made up of curious coders who were tasked with finding relevant answers to these companies’ biggest questions about customer behavior, efficiency, and many other queries, ...


Read More on Datafloq
Why Big Data Strategies Need DevOps

Why Big Data Strategies Need DevOps

Applying DevOps concepts can greatly benefit any big data initiative, yet many analytics teams still choose not to use these methodologies. Applications based on the components of the big data ecosystem need to be hardened in order to run in production, and DevOps can be an important part of that.

What is DevOps?

The idea behind DevOps is to tear down the barriers that stand between IT infrastructure administrators and software developers, in order to make sure that everyone’s focused on a singular goal. This requires a bit of cross-training from both sides so the used terminology is understood by everyone. After the completion of training, clear lines of direction and communication can be established, with a clear aim of continuous improvement. Both ends will be able to bring software features and fixes to end users faster, as DevOps enables them to work in tandem to tune production infrastructure components and test environments to meet new software requirements.

Big data analysts know how tough and complex it is to extract meaningful and accurate answers from big data. Big data software developers lack coordination in many enterprises, which often makes things more challenging and big data projects remain siloed for different ...


Read More on Datafloq
What is the Impact of Big Data on Mobile Marketing

What is the Impact of Big Data on Mobile Marketing

Every business in one way or the other has access to data corresponding to their customers, competition, and market. Naturally, to stand out from other businesses and to ensure competitive advantage they need insights from other areas.

A customer is equally a social person with a set of habits, preferences, constraints and empathies. So business-specific data about a customer’s buying habits, economic situation or demographic category simply fails to present the person in totality. These days, businesses are finding it extremely necessary to get hold of this multi-faceted data about their customers for better marketing output and decision making. Big Data analytics, which converges huge volumes of digital data from all niches and walks of life, allows them this exposure. Thanks to Big Data, businesses can now grab crucial business insights that the so-called business analytics of earlier times were not capable of delivering.

Mobile data to feed Mobile Marketing

What is the biggest source of digital data today? We all know it is mobile. Yes, the web is now accessed principally through mobile devices. A vast majority of digital interactions, starting from web browsing to social media interactions to using a wide range of apps for a variety ...


Read More on Datafloq
3 Companies That Crushed it With Big Data

3 Companies That Crushed it With Big Data

Many companies in the last few years have discovered the power of tiny changes. When you are dealing with a global population of buyers, and millions of people see your products every single day, changing a single small thing can have consequences in the millions of dollars.

The problem is that even though most companies realize this, they don't have access to the right information to make good decisions. This means that billions of dollars are being lost because of a lack of the right data.

Some companies, on the other hand, are doing this marvelously well. Here are a few examples.

T-Mobile

T-Mobile was recently able to significantly reduce one of the biggest problems in its industry: customer churn. Cell phone companies struggle to keep customers for more than a few years, for a variety of reasons. The challenge was discovering those reasons.

One of the biggest ways they did this was dropped-call analysis. T-Mobile was able to contact customers who were starting to experience more dropped calls in their current location and work with them to improve their service quality before they dropped T-Mobile as a provider.
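The idea behind this kind of analysis can be sketched in a few lines: compare each customer's dropped-call rate month over month and flag the ones whose rate is both high and rising, so a retention team can reach out proactively. The field names and thresholds below are purely illustrative assumptions, not T-Mobile's actual pipeline.

```python
# Hypothetical sketch of dropped-call churn-risk flagging.
# Thresholds (min_rate, min_increase) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CallStats:
    customer_id: str
    calls_this_month: int
    drops_this_month: int
    calls_last_month: int
    drops_last_month: int

def churn_risk(stats: CallStats,
               min_rate: float = 0.05,
               min_increase: float = 2.0) -> bool:
    """Flag a customer whose dropped-call rate exceeds min_rate and has
    grown by at least min_increase times month over month."""
    if stats.calls_this_month == 0 or stats.calls_last_month == 0:
        return False  # not enough activity to judge
    rate_now = stats.drops_this_month / stats.calls_this_month
    rate_before = stats.drops_last_month / stats.calls_last_month
    return rate_now >= min_rate and rate_now >= min_increase * max(rate_before, 1e-9)

customers = [
    CallStats("A", 200, 2, 210, 2),    # stable, low drop rate
    CallStats("B", 180, 18, 190, 3),   # drop rate jumped sharply
]
flagged = [c.customer_id for c in customers if churn_risk(c)]
print(flagged)  # ['B']
```

In a real deployment the flagged list would feed a retention workflow, and the rates would be broken down by cell tower or location to distinguish network problems from handset problems.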

Another interesting item they worked on was sentiment. There are multiple ways ...


Read More on Datafloq
A New Chapter In The Analytics Journey

A New Chapter In The Analytics Journey

Every individual and enterprise travels a unique journey in the pursuit of analytics. In my case, I could never have predicted how my journey would unfold when I first entered the workforce over 20 years ago. The rise of analytics as a strategic imperative and the explosion of career opportunities within the field far surpass what I expected coming out of school. I feel very lucky to have had such a terrific journey so far and will be interested to look back 20 years from now and see what happens from here.

A New Leg in My Journey

As I write this, I am finishing a major leg of my personal analytics journey. As many readers are likely aware, I will be leaving Teradata this month. I had a terrific 14-year run with Teradata where I made a lot of friends, worked with some amazing clients, and got to witness firsthand how the world’s largest organizations have dealt with the rise of big data and analytics. Teradata treated me well and I like to think that I, in turn, contributed a lot to the company. It wasn’t an easy decision to leave, but I came across a great opportunity and every good ...


Read More on Datafloq
6 Incredibly Costly Big Data Marketing Mistakes

6 Incredibly Costly Big Data Marketing Mistakes

Everybody keeps talking about how great big data is – particularly for marketing. And, of course, it is great. It offers some fantastic opportunities, as has been shown over and over again. This is particularly true as the Internet of Things comes online, which offers an ocean of new possibilities. The thing is, if you don’t handle your big data carefully, then it can do a lot of damage as well.

Here we’re going to explore some of the biggest big data marketing mistakes that can cost you a lot of money. That way, you’ll know what to look out for and be in a much better position to avoid them. Sound good? Let’s get started.

Low advertising conversions

The entire point of big data marketing is to convert lots of people. Of course, sometimes that doesn’t happen. Sometimes there are low conversion rates. That in and of itself isn’t a mistake. That’s just grounds for looking to improve your big data marketing strategy.

The mistake lies in the fact that some marketing teams don’t do anything about it. They carry on as they were, creating more ads and hoping that somehow the problem will fix itself. Which, of course, ...


Read More on Datafloq
What you need to know about GDPR

What you need to know about GDPR

Gartner predicts that by the end of 2018, over 50% of companies affected by the GDPR will not be in full compliance with its requirements. Here we explain the impact of the GDPR regulation and how you can prepare…

What is the EU data protection regulation?

Issued by the European Parliament, the Council of the European Union and the European Commission, the General Data Protection Regulation (GDPR) will replace the current Data Protection Directive 95/46/EC in spring 2018. Its main purpose is to protect the data privacy of EU citizens and harmonise the current data protection laws across EU countries.

Some of the key privacy and data protection requirements of the GDPR that will impact your business include:

Proven Consent: You need to obtain valid consent to hold and use any personal data, and be able to provide proof of this consent at any time.

Right to Erasure: You cannot repurpose data beyond the use for which it was originally collected. This means that if someone has agreed to receive your email newsletters, you need fresh consent before engaging in other forms of communication, such as event notifications. Individuals will also have the right to request the deletion of their details when the data is no longer used for its original purpose.

Privacy Impact ...


Read More on Datafloq


Copyright © 2017 BBBT - All Rights Reserved