Snowflake and Spark, Part 2: Pushing Spark Query Processing to Snowflake

This post provides the details of Snowflake’s ability to push query processing down from Spark into Snowflake.
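To picture what pushdown means: instead of Spark fetching every row and filtering locally, eligible operations are rewritten into SQL that Snowflake itself executes, so only matching rows cross the wire. A toy pure-Python sketch of that rewriting idea (hypothetical names, not the actual connector API):

```python
# Toy illustration of filter pushdown: a DataFrame-style query is
# rewritten into SQL that the remote warehouse executes, so only
# matching rows ever cross the wire.

def push_down(table, columns, predicates):
    """Build the SQL the warehouse would run instead of Spark."""
    sql = "SELECT {} FROM {}".format(", ".join(columns), table)
    if predicates:
        sql += " WHERE " + " AND ".join(predicates)
    return sql

query = push_down(
    table="orders",
    columns=["order_id", "amount"],
    predicates=["amount > 100", "region = 'EMEA'"],
)
print(query)
# SELECT order_id, amount FROM orders WHERE amount > 100 AND region = 'EMEA'
```

The real connector performs this kind of translation automatically behind the DataFrame API.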
10 Tips to Troubleshoot Security Concerns in IoT Wearables

From smartwatches to glasses and finger rings, the range of wearable devices is steadily expanding. Combined with the might of the Internet of Things, wearable devices can have a life-transforming effect. While it is fashionable and often productive to carry these wearables around, the billion-dollar question is, “How safe are these wearables?”

What if your favourite wearable is just another door that hackers and cybercriminals can break into to make away with the control of your personal and professional life?

Security concerns in wearables are legitimate. A study by Auth0 has confirmed that more than 52% of wearable device users feel that they are provided with inadequate security measures.

VTech, a popular brand that sells wearables for kids, suffered a security breach that resulted in the leakage of private information of more than 200,000 children and their parents.

There is no better time than now to sit back and analyze these security concerns and the ways to address them.

Security Concerns in IoT Wearables

Wearables are now being used for purposes that go far beyond calorie counting and fitness tracking. They are now even part of BYOD enterprise work philosophy and are used by remote employees to constantly collaborate and communicate with their peers. Though considered futuristic, the IoT ...

Read More on Datafloq
8 Ways IoT Can Improve Healthcare

Over the past few decades, we’ve gotten used to the Internet and cannot imagine our lives without it. But now the Internet of Things (IoT) is changing the way we operate the objects around us. The Internet of Things is a real-time connection and communication among all sorts of objects, gadgets, wearables, and devices. Essentially, it represents interoperability between all the things around us (excluding computers and phones). Needless to say, IoT is changing entire industries, as it reduces costs, boosts productivity, and improves quality. One of the areas where IoT is contributing the most is medicine. In this article, we will look at eight ways IoT is improving the healthcare industry.

How is IoT Changing Healthcare?

With its advanced technologies, IoT gives a significant boost to healthcare development. Some forecasts even estimate that the field of IoT will climb to $117 billion by 2020. How is that possible? Let’s discuss some of the key points!

Data management

IoT provides countless possibilities for hospitals to gather relevant information about their patients, both on-site and outside of the medical premises. Healthcare relies on telemetry here to capture data and communicate it automatically and remotely. This offers medical staff a chance to act promptly and provide patients with better ...

Read More on Datafloq
How Thick Data Can Unleash the True Power of Big Data

Not all data that is measurable is qualitative. While Big Data helps us find answers to well-defined questions, Thick Data connects the dots and gives us a more realistic picture.

We have been hearing for quite a few years that Big Data and Analytics are the next big waves. While these waves are already sweeping over us, we are missing out on the small things while going for the big. Big Data has proved remarkably useful when it comes to finding answers to well-defined questions and addressing phenomena that are well understood. What it fails to recognize is the complexity of people’s lives, human connections, underlying emotions, changing cultural ecosystems, interesting stories, and other social ingredients.

For instance, it made big news when Nokia’s devices business was acquired by Microsoft in 2013. While there could be many reasons behind Nokia’s downfall, one of the prominent reasons that Tricia Wang, a global tech ethnographer, describes is an overdependence on numbers. Sharing her story on Ethnography Matters, she mentioned how her recommendations to Nokia to revise its product development strategy did not receive enough attention because the sample size used for her study was considered too small in comparison to millions of ...

Read More on Datafloq
Why Educational Systems Should Consider Big Data

Advances in technology, and the growing use of big data in particular, have enabled better decision-making in many educational institutions. Policy makers in these institutions have been using big data to understand sentiment about the school and make systematic improvements to students’ performance. The term big data refers to amounts of information, flowing through various channels, so large that they can only be analyzed by computers. Learning institutions generate an immense amount of student information that would be hard to capture and manage through conventional means. Big data therefore comes in handy in improving the processing and storage of institutional data.

Understanding Big Data

Students' experiences, such as eating, social life, study, and sleep, have a strong effect on their academic performance. Negative or traumatic experiences have a direct impact on a student's retention abilities. Therefore, most institutions now use big data to look into the various aspects affecting a student's performance. Academic institutions collect large quantities of data, but the problem lies in the analysis, making it harder for analysts to make data-based decisions and improve organizational effectiveness.

Why Big Data and Not Small Data

Academic institutions collect data for many ...

Read More on Datafloq
Augmenting the Brain is Set to Pioneer Alzheimer’s Treatment

As artificial intelligence becomes more human, to co-exist, does human intelligence need to become more artificial?

We’ve spent a lot of time philosophizing about where Artificial Intelligence is going to take us, how far we are from achieving general AI, and the implications it will have on humanity - not without the Skynet scenarios! Hype aside, there are companies out there focusing on how we can use artificially intelligent applications to improve the human experience, sustain life on our planet, and significantly boost the economy. This pioneering technology could well see the next world-changing scientific discovery hailing from Silicon Valley, especially considering the significant increase in investment over the past few years.

According to the Alzheimer’s Association, there are more than 5 million Americans living with Alzheimer’s today, with a predicted 16 million by 2050, and a further 850,000 people with dementia in the UK. The degenerative disease is currently the 6th leading cause of death in the US and has also been linked to poorer health among caregivers, whose care responsibilities are heavier than those of caregivers to elderly people without dementia. Having a neuroprosthetic could be the missing key to an improved quality of ...

Read More on Datafloq
How Big Data Can Reduce Building Injuries

Accidents are a part of life, but when we carefully analyze the data on unintended incidents and injuries, we often find that many of them could have been avoided through greater care and harm reduction strategies. Unfortunately, because many companies and individuals view each injury in a bubble, they miss the significance of certain occurrences and can’t effectively reform their behaviors. Only the big picture view can help – that’s where big data comes in.

When we use big data to analyze workplace injuries and individual accidents, we move from an individualized view of personal injury to a systemic one – and that’s how we can reduce injuries. But what does this look like in practice? By turning to risk management ecosystems, we can see what the future of safety looks like.

Analyzing Workplace Safety

Workplace safety is a significant national concern – it’s why organizations like OSHA exist – but just because there’s already oversight in the workplace doesn’t mean companies are maximizing their injury prevention strategies. Rather, many do the minimum required by OSHA and leave the rest to chance.

Some workplaces, however, are taking safety seriously by instituting company-wide injury analytics. These systems let all branches of a business, no matter ...

Read More on Datafloq
Are You Wasting Your Data or Consuming It?

Last night I was in the checkout line at the grocery store. There was a woman behind me with a cart full of produce who told me, “I’ll probably end up throwing most of this away.” I asked her why. She said she knows she should eat healthy, but it takes too much time and effort to whip up a meal from scratch, and anyway, she wasn’t that great of a cook. Despite her best intentions, she ends up ordering in for the family most nights.

Unfortunate fact – over 40% of food produced is wasted, depleting resources like fresh water, electricity and human effort.

Wouldn’t it be great if raw ingredients could magically convert themselves into dishes for the family – no time, effort, or cooking skills needed? The family could be eating healthier meals, they’d be eating the food they spent money and time to procure, and it wouldn’t end up wasted in the trash anymore.

Companies are wasting data

Many companies are wasting data just like many people are wasting food.

Companies recognize how important data is. They know their workforce is hungry for data-driven solutions to their problems. They need it to thrive in the current landscape.

So they invest significantly in ...

Read More on Datafloq
What is the Best Programming Language for Machine Learning?

Q&A sites and data science forums are buzzing with the same questions over and over again: I’m new to data science, what language should I learn? What’s the best language for machine learning?

There’s an abundance of articles attempting to answer these questions, either based on personal experience or on job-offer data. There’s far more activity in machine learning than job offers in the West can capture, however, and peer opinions, while valuable, are often conflicting and may confuse novices. We turned instead to our hard data from 2,000+ data scientists and machine learning developers who responded to our latest survey about which languages they use and what projects they’re working on, along with many other interesting things about their machine learning activities and training. Then, being data scientists ourselves, we couldn’t help but run a few models to see which factors are most strongly correlated with language selection. We compared the top five languages, and the results prove that there is no simple answer to the “which language?” question. It depends on what you’re trying to build, what your background is, and why you got involved in machine learning ...

Read More on Datafloq
Bring Your Own Cyber Human (BYOCH) – Part 1: Self-connected Humans

Perhaps some of my readers and followers played the “Rock, Paper, Scissors” game in their childhood. During each match, we simulated one of these three things with our hands, although in those years we could never have imagined that any of them could connect to the Internet.

A few years later, we are not surprised that someone, somewhere in the world, is designing connected rocks, connected paper, or connected scissors. Just read “The abuse of shocking headlines in IoT or how many stupid things will be connected?”. Thus we have arrived at the Internet of Things (IoT).

But far from settling for just connecting things, some visionaries like Elon Musk do not dream of electric sheep; they dream of building human-computer hybrids. The goal of Elon Musk’s Neuralink company is to explore technology that can make direct connections between a human brain and a computer. Mr. Musk has floated the idea that humans will need a boost from computer-assisted artificial intelligence to remain competitive as our machines get smarter.

Meanwhile, a Facebook engineer has asserted that augmented reality will replace smartphones in five years. Facebook’s uber-secretive Building 8 (B8) division is currently working on a top-secret brain-computer interface (BCI) like Elon Musk’s Neuralink, but that BCI project ...

Read More on Datafloq
How to Access User Data from Third-party Apps on Android

If you’re developing a DLP or parental control solution, chances are you want as much user data from apps as possible. However, most of the time, gathering such data can be fairly difficult. While you can try to reverse engineer an iOS or Android app, this method often proves difficult and time-consuming, and results are not guaranteed.

For Android in particular, most of the time apps store their data in a sandbox (the default and most secure option) where other apps cannot access it. If a developer decides not to store data in an easily accessible area (such as a memory card) and not to provide an API for accessing the data, then getting it without root can be very hard.

This means that there is seemingly no way to get Skype, Viber, or KIK messages, or a browser history, which can be extremely frustrating if your solution depends on such data. However, there is actually a fairly elegant solution that allows you to get user data on Android without root and without much hassle. We will cover this solution below.

The idea behind the solution

The gist of the idea is very simple – each active page has a layout ...

Read More on Datafloq
The Advantages And Disadvantages of Using Django

If you are interested in running Django or considering making a transition to Python, let us help you explore the main virtues and vices of using this framework. But before we get started, let’s talk briefly about what Django is and why you should care.

Django came out in 2005 and has indisputably turned into one of the go-to web frameworks for a growing number of developers. It is a framework built on the Python programming language. With the right set of functionality, Django reduces the amount of trivial code, which simplifies the creation of web applications and results in faster development.

In case you want to dive deeper into the framework, view a short introduction to django full text search.

Why Django?

You should totally check out Django. It is written in Python, and Python is amazing: clean, easy to learn, and one of the most widely taught programming languages. Python is also a popular choice for:

Industrial Light & Magic (Star Wars visual effects)

Game development

Services like Pinterest, Instagram, The Guardian and more

Without a doubt, the tech market is overflowing with frameworks, but Django is a good place to start, as it has some of the nicest documentation and tutorials in software development. Now, for the main attraction – ...

Read More on Datafloq
What Effect Will Deep Learning Have on Business?

One thing that could have a deep impact on business is deep learning. Deep learning can be thought of as a subfield of machine learning. Specifically, this form of machine learning was influenced by the study of the human brain. The algorithms involved are designed to mimic how the human brain operates, allowing a machine to learn in a similar way. This is done through a system known as an artificial neural network.
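As a minimal illustration (a single artificial neuron rather than the deep, many-layered networks the article refers to), here is a perceptron learning the logical AND function; the data and training loop are illustrative only:

```python
# A single artificial neuron (a perceptron) trained on the logical AND
# function. Deep networks stack many layers of such units; this sketch
# shows only the core loop: predict, compare with the label, adjust.

def predict(weights, bias, x):
    activation = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if activation > 0 else 0

def train(samples, epochs=10):
    weights, bias = [0, 0], 0
    for _ in range(epochs):
        for x, label in samples:
            error = label - predict(weights, bias, x)  # -1, 0, or 1
            bias += error
            weights = [w + error * xi for w, xi in zip(weights, x)]
    return weights, bias

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(data)
print([predict(weights, bias, x) for x, _ in data])  # [0, 0, 0, 1]
```

After a few epochs the weights settle so that only the (1, 1) input crosses the activation threshold, which is exactly the "learning from examples" the paragraph describes, in miniature.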

The benefits of deep learning for businesses are obvious. It allows a computer system with access to a lot of data to make its own autonomous decisions about the data through this learning process. It can produce better decisions and improve efficiency. There are many applications that can help improve a business’s operations and profit potential. Below are some of the possibilities.

Deep Learning Can Increase Sales

One of the best things deep learning can provide for a company is, obviously, helping it increase its bottom line. Deep learning, for example, can be deployed for the purpose of lead generation. Deep learning is a form of artificial intelligence. That AI can sift through all the data and then use it to present you with leads at the right ...

Read More on Datafloq
Why AI is the Catalyst of IoT

Businesses across the world are rapidly leveraging the Internet-of-Things (#IoT) to create new products and services that are opening up new business opportunities and creating new business models. The resulting transformation is ushering in a new era of how companies run their operations and engage with customers. However, tapping into the IoT is only part of the story [6].

For companies to realize the full potential of IoT enablement, they need to combine IoT with rapidly-advancing Artificial Intelligence (#AI) technologies, which enable ‘smart machines’ to simulate intelligent behavior and make well-informed decisions with little or no human intervention [6].

Artificial Intelligence (AI) and the Internet of Things (IoT) are terms that project futuristic, sci-fi imagery; both have been identified as drivers of business disruption in 2017. But what do these terms really mean, and what is their relation? Let’s start by defining both terms first:

IoT is defined as a system of interrelated Physical Objects, Sensors, Actuators, Virtual Objects, People, Services, Platforms, and Networks [3] that have separate identifiers and an ability to transfer data independently. Practical examples of #IoT application today include precision agriculture, remote patient monitoring, and driverless cars. Simply put, IoT is the network of “things” that collects and exchanges ...

Read More on Datafloq
Why Biotech Needs the Power of Data Analytics

The Human Genome Project, which aimed to map and sequence the entire human genome, began in 1990 and ended in 2003 with a starting budget of roughly $3 billion. It provided us, for the first time, a means to access invaluable data through genes – evolution patterns, diseases and their treatments, gene mutations and their effects, anthropological information, etc. Now, powerful software and analysis tools are being built that can decode an entire genome in a matter of hours. Data analytics is quickly becoming one of the most important branches of science that can be applied in the biotech industry.


DNA sequencing generates a huge amount of data that needs to be analyzed with care, as the information and conclusions drawn are applicable in a whole range of industries from medicine to forensic science. It involves data science at various levels:


The first step is storage of DNA sequencing data. If we were to sequence the genome of every living thing from a microbe to a human, then we need to have powerful data science tools that help us store, track and retrieve relevant information.
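A toy sketch of that storage level: an in-memory store that files sequences by sample and retrieves a simple per-sequence statistic such as GC content (purely illustrative; real pipelines use specialized file formats and databases):

```python
# Toy in-memory store for DNA sequences: store, retrieve, and compute
# a simple per-sequence statistic (GC content, the fraction of bases
# that are G or C).

class SequenceStore:
    def __init__(self):
        self._sequences = {}

    def add(self, sample_id, sequence):
        self._sequences[sample_id] = sequence.upper()

    def get(self, sample_id):
        return self._sequences[sample_id]

    def gc_content(self, sample_id):
        seq = self._sequences[sample_id]
        return (seq.count("G") + seq.count("C")) / len(seq)

store = SequenceStore()
store.add("sample-1", "ATGCGC")
print(store.get("sample-1"))        # ATGCGC
print(store.gc_content("sample-1"))
```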


Annotation is the process of adding notes to specific genes in the sequence. Tools are being built to put ...

Read More on Datafloq
The DNA of a Data Scientist

The role has been dubbed ‘the sexiest job of the 21st century’ by the Harvard Business Review, and there is good reason for it.

Data Science can be a highly rewarding career path. However, not everyone is cut out for the job. Being a great data scientist takes a certain set of skills.

We’ve compiled a list of all the things that make up the best in the business.

A high level of education

Not strictly speaking mandatory, but those working under the title without at least a master’s degree are a very small minority (less than 20%). In fact, almost half of all data scientists have gone as far as completing a PhD.

The ideal education would be in the realm of mathematics, statistics, or computer science.

Is able to understand coding

There are a variety of programming languages prevalent in the industry. A data scientist needs to be able to understand at least some of them.

Python, C/C++, Java, and Perl all regularly crop up in data science, with Python being the most prevalent.

Has a certain level of proficiency in statistics

If you think back to your stats classes, the words ‘statistics’ and ‘data’ go hand in hand. And it’s true that statistical knowledge is important for data ...

Read More on Datafloq
Five Skillsets Needed for Securing IoT Today

On October 21, 2016, a sophisticated Distributed Denial-of-Service (DDoS) attack was launched that left customers of Amazon, Netflix, Twitter, and more without service multiple times throughout the day. TechTarget reported that the attack was leveled against Dyn, a Domain Name System (DNS) provider that services those brands, along with many others. One of the contributing factors to the attack was that the hackers were able to infect Internet of Things (IoT) devices with the Mirai botnet. They were able to identify IoT devices that used default usernames and passwords (such as username: “admin”, password: “admin”) and turn them into drones in their DDoS cyberattack.
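The root cause is mundane enough to check for automatically. A hedged sketch of auditing a device inventory for known factory-default credentials (the device names and the credential list here are invented):

```python
# Flag IoT devices that still use known factory-default credentials,
# the weakness the Mirai botnet exploited.

DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

devices = [
    {"host": "camera-01", "user": "admin", "password": "admin"},
    {"host": "router-07", "user": "admin", "password": "s7r0ng-unique!"},
    {"host": "dvr-03", "user": "root", "password": "root"},
]

# Any device whose (user, password) pair matches a known default
# should be re-provisioned before it can be conscripted into a botnet.
vulnerable = [
    d["host"] for d in devices
    if (d["user"], d["password"]) in DEFAULT_CREDENTIALS
]
print(vulnerable)  # ['camera-01', 'dvr-03']
```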

John Pironti, president of IP Architects, went on to explain to TechTarget, "The use of IoT devices for recent DDoS attacks has shown how fragile and insecure many of these devices currently are…. The first use was for DDoS, but these same devices are likely to be used as entry points to the internal networks they connect to as well as they become more pervasive."

Gartner projects that 20 billion IoT devices will be used by companies worldwide by 2020. This added mobility and productivity also brings the promise of multiplied threat vectors and vulnerabilities. If companies are ...

Read More on Datafloq
A D3 Image is Worth a Thousand Words: Interview with Morgane Ciot

Many things have been said and done in the realm of analytics, but visualizations remain at the forefront of the data analysis process, where intuition and correct interpretation can help us make sense of data.

As an increasing number of tools emerge, current visualizations are far more than mere pictures on a screen, allowing for movement, exploration, and interaction.

One of these tools is D3, an open-source JavaScript data visualization library. D3 is perhaps the most popular tool for developing rich, interactive data visualizations, used by companies small and large, such as Google and the New York Times.

With the next Open Data Science Conference in Boston coming soon, we had the opportunity to talk with DataRobot’s Morgane Ciot, an ODSC speaker, about her workshop session, “Intro to D3”, the state of data visualization, and her own perspectives on the analytics market.

Morgane Ciot is a data visualization engineer at DataRobot, where she specializes in creating interactive and intuitive D3 visualizations for data analysis and machine learning. Morgane studied computer science and linguistics at McGill University in Montreal. Previously, she worked in the Network Dynamics Lab at McGill, answering questions about social media behavior using predictive models and statistical topic models.

Morgane enjoys studying machine learning (ML), reading, writing, and staging unusual events.

Let's get to know more about Morgane and her views as a data visualization engineer.

Morgane, could you tell us a bit more about yourself, especially about your area of expertise, and what was your motivation to pursue a career in analytics and data science?

I went to school for computer science and linguistics. Those two fields naturally converge in Natural Language Processing (NLP)/Artificial Intelligence (AI), an intersection that was unfortunately not exploited by my program but that nonetheless got me interested in machine learning.

One of the computer science professors at my school was doing what essentially amounted to sociological research on social media behavior using machine learning techniques. Working with him furthered my interest in ML, NLP, and topic modeling, and I began to also explore how to visualize some of the unmanageable amounts of data we had (like, all of Reddit).

I’m probably indebted to that part of my life, and my professor, for my current position as a data viz engineer. Also, machine learning's practical ramifications are going to be game changing. I want to live closest to the eye of the storm when the singularity hits.

Based on your experience, which attributes or skills should every data master have if he/she wants to succeed, and what would be your recommendations for those looking for an opportunity at this career?

Stats, problem-solving skills, and engineering or scripting abilities all converge in the modern data scientist.

You have to be able to understand how to formulate a data science problem, how to approach it, and how to build the ad hoc tools you’ll need to solve it. At least some basic statistical knowledge is crucial. Elements of Statistical Learning by Hastie and Andrew Ng’s Coursera course both provide a solid foundational understanding of machine learning and require some statistical background.

Learn at least one programming language — Python or R are the most popular. R is the de facto language for statisticians, and Python has a thriving community and a ton of data science libraries like scikit-learn and pandas. It’s also great for writing scripts to scrape web data. If you’re feeling more adventurous, maybe look into Julia.
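For a first taste of what that looks like in Python, here is a tiny standard-library-only example fitting a least-squares line to a handful of points (the data is invented):

```python
# Fit a least-squares line y = slope * x + intercept using only the
# Python standard library: a minimal example of the kind of data work
# Python makes easy.
from statistics import mean

def fit_line(xs, ys):
    x_bar, y_bar = mean(xs), mean(ys)
    slope = (
        sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
        / sum((x - x_bar) ** 2 for x in xs)
    )
    return slope, y_bar - slope * x_bar

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = fit_line(xs, ys)
print(round(slope, 2), round(intercept, 2))  # 1.99 0.05
```

Libraries like scikit-learn and pandas wrap this kind of computation (and far more) behind convenient APIs, but writing it once by hand is a good way to demystify what they do.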

As usual, don’t just learn the theory. Find a tangible project to work on. Kaggle hosts competitions you can enter and has a community of experts you can learn from.

Finally, start learning about deep learning. Many of the most interesting papers in the last few years have come out of that area and we’re only just beginning to see how the theory that has been around for decades is going to be put into practice.

Talking about data visualization, what is your view of the role it plays within data science? How important is it in the overall data science process?

Data visualization is pretty fundamental to every stage of the data science process. I think how it’s used in data exploration — viewing feature distributions — is fairly obvious and well-practiced, but people often overlook how important visualizations can be even in the modeling process.

Visualizations should accompany not just how we examine our data, but also how we examine our models! There are various metrics that we can use to assess model performance, but what’s really going to convince an end user is a visualization, not a number. That's what's going to instill trust in model decisions.

Standard introductions to machine learning lionize the ROC curve, but there are plenty of other charts out there that can help us understand what and how a model is doing: plotting predicted vs. actuals, lift charts, feature importance, partial dependence, etc. — this was actually the subject of my ODSC talk last year, which should be accessible on their website.

A visualization that rank-orders the features that were most important to the predictive capacity of a model doesn’t just give you insight, it also helps you model better. You can use those top features to build faster and more accurate models. 
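A toy version of that rank-ordering scores each feature by the absolute correlation of its values with the target; real tools derive importances from the model itself, and the data here is invented:

```python
# Rank features by the absolute Pearson correlation of their values
# with the target: a crude stand-in for the model-based
# feature-importance charts described above.
from statistics import mean, pstdev

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

features = {
    "age":    [23, 45, 31, 52, 40],
    "income": [40, 90, 45, 95, 85],
    "noise":  [3, 1, 4, 1, 5],
}
target = [0, 1, 0, 1, 1]

ranked = sorted(
    features,
    key=lambda name: abs(pearson(features[name], target)),
    reverse=True,
)
print(ranked)  # ['income', 'age', 'noise']
```

The top-ranked names are the ones you would surface in an importance chart, and, as the interview notes, the ones worth keeping when building a faster model.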

What do you think will be the most important data visualization trend in the next couple of years?

Data is becoming ever more important basically everywhere, but popular and even expert understanding hasn’t quite kept up.

Data is slowly consuming us, pressing down from all angles like that Star Wars scene where Luke Skywalker and Princess Leia get crushed by trash. But are people able to actually interpret that data, or are they going to wordlessly nod along to the magical incantations of “data” and “algorithms”?

As decisions and stories become increasingly data-driven, visualizations in the media are going to become more important. Visualizations are sort of inherently democratic.

Everyone who can see can understand a trend; math is an alien language designed to make us feel dumb. I think that in journalism, interactive storytelling — displaying data with a visual and narrative focus — is going to become even more ubiquitous and important than it already is. These visualizations will become even more interactive and possibly even gamified.

The New York Times did a really cool story where you had to draw a line to guess the trend for various statistics, like the employment rate, during the Obama years, before showing you the actual trend. This kind of quasi-gamified interactivity is intuitively more helpful than viewing an array of numbers.

Expert understanding will benefit from visualizations in the same way. Models are being deployed in high-stakes industries, like healthcare and insurance, that need to know precisely why they’re making a decision. They’ll need to either use simplified models that are inherently more intelligible, at the expense of accuracy, or have powerful tools, including visualizations, to persuade their stakeholders that model decisions can be interpreted.

The EU is working on “right to explanation” legislation, which allows any AI-made decision to be challenged by a human. So visualizations focused on model interpretability will become more important.

A few other things: as more and more businesses integrate with machine learning systems, visualizations and dashboards that monitor large-scale ML systems and tell users when models need to be updated will become more prevalent. And of course, we’re generating staggering amounts of new data every day, so visualizations that can accurately summarize that data while also allowing us to explore it in an efficient way — maybe also through unsupervised learning techniques like clustering and topic modeling — will be necessary.

Please tell us a bit about DataRobot, the company you work at.

We’re a machine learning startup that offers a platform data scientists of all stripes can use to build predictive models. I’m equal parts a fan of using the product and working on it, to be honest. The app makes it insanely easy to analyze your data, build dozens of models, use the myriad visualizations and metrics we have to understand which one will be the best for your use case, and then use that one to predict on new data.

The app is essentially an opinionated platform on how to automate your data science project. I say opinionated because it’s a machine that’s been well-oiled by some of the top data scientists in the world, so it’s an opinion you can trust. And as a data scientist, the automation isn’t something to fear. We’re automating the plumbing to allow you to focus on the problem-solving, the detective work. Don’t be a luddite! 

It’s really fun working on the product because you get to learn a ton about machine learning (both the theoretical and real-world applications) almost by osmosis. It’s like putting your textbook under your pillow while you sleep, except it actually works. And since data science is such a protean field, we’re also covering new ground and creating new standards for certain concepts in machine learning. There’s also a huge emphasis, embedded in our culture and our product, on — “democratizing” is abusing the term, but really putting data science into as many hands as possible, through evangelism, teaching, workshops, and the product itself.

Shameless promotional shout-out: we are hiring! If you’re into data or machine learning or python or javascript or d3 or angular or data vis or selling these things or just fast-growing startups with some cool eclectic people, please visit our website and apply!

As a data visualization engineer at DataRobot, what are the key design principles the company applies for development of its visualizations?

The driving design principle is functionality. Above all, will a user be able to derive an insight from this visualization? Will the insight be actionable? Will that insight be delivered immediately, or is the user going to have to bend over backwards scrutinizing the chart for its underlying logic, trying to divine from its welter of hypnotic curves some hidden kernel of truth? We’re not in the business of beautiful, bespoke visualizations, like some of the stuff the NYTimes does.

Data visualization at DataRobot can be tricky because we want to make sure the visualizations are compatible with any sort of data that passes through — and users can build predictive models for virtually any dataset — which means we have to operate at the right level of explanatory and visual abstraction. And we want users of various proficiencies to immediately intuit whether or not a model is performing well, which requires thinking about how a beginner might be able to understand the same charts an expert might expect. So by “functionality” I mean the ability to quickly intuit meaning.

That step is the second in a hierarchy of insight: the first is looking at a single-valued metric, which is only capable of giving you a high-level summary, often an average. This could be obfuscating important truths. A visualization —the second step— exposes these truths a bit further, displaying multiple values at a time over slices of your data, allowing you to see trends and anomalous spots. The third step is actually playing with the visualization. An interactive visualization confirms or denies previous insights by letting you drill down, slice, zoom, project, compare — all ways of reformulating the original view to gain deeper understanding. Interactive functionality is a sub-tenet of our driving design principle. It allows users to better understand what they’re seeing while also engaging them in (admittedly) fun ways. 
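The three-step hierarchy above can be sketched in code. This is a toy illustration, not DataRobot's implementation; the data and function names are made up for the example:

```javascript
// Step 1: a single-valued metric -- an average that can hide structure.
function mean(values) {
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

// Step 2: a visualization's underlying data -- multiple values over
// slices of the data, where trends and anomalous spots become visible.
function sliceBy(rows, key, valueKey) {
  const groups = {};
  for (const row of rows) {
    (groups[row[key]] = groups[row[key]] || []).push(row[valueKey]);
  }
  return Object.fromEntries(
    Object.entries(groups).map(([k, vals]) => [k, mean(vals)])
  );
}

// Step 3: interactivity -- drilling down to a single slice on demand.
function drillDown(rows, key, wanted) {
  return rows.filter((row) => row[key] === wanted);
}

const sales = [
  { region: "north", amount: 10 },
  { region: "north", amount: 12 },
  { region: "south", amount: 50 },
];

// The overall average (24) obscures that "south" behaves very differently.
mean(sales.map((r) => r.amount));   // 24
sliceBy(sales, "region", "amount"); // { north: 11, south: 50 }
drillDown(sales, "region", "south");
```

The single metric here looks unremarkable, but the sliced view immediately exposes the anomalous region, which is exactly the kind of truth a summary number can obfuscate.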

During the ODSC in Boston, you will be presenting an intro to D3, can you give us a heads up? What is D3 and what are its main features and benefits?

D3 is a data visualization library built in JavaScript. It represents data in a browser interface by binding data to a webpage’s DOM elements. It’s very low-level, but there are plenty of wrapper libraries and frameworks built around it that are easier to use, such as C3.js. If you find a browser-rendered visualization toolkit, it’s probably using D3 under the hood. D3 supports transitions and defines a data update function, so you can create really beautiful custom and dynamic visualizations with it, such as these simulations or this frankly overwrought work of art.
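The heart of that data binding is D3's "data join": incoming data is partitioned against the elements already on the page into enter (new), update (existing), and exit (removed) groups. Here is a rough, DOM-free sketch of the idea; this simplified `dataJoin` helper is illustrative, not D3's actual API:

```javascript
// A DOM-free sketch of D3's data join. Given the keys of elements
// already rendered and a new data array, partition the data into
// "enter" (needs a new element), "update" (element exists), and
// "exit" (element should be removed).
function dataJoin(existingKeys, data, keyFn) {
  const existing = new Set(existingKeys);
  const incoming = new Set(data.map(keyFn));
  return {
    enter: data.filter((d) => !existing.has(keyFn(d))),
    update: data.filter((d) => existing.has(keyFn(d))),
    exit: [...existing].filter((k) => !incoming.has(k)),
  };
}

const onPage = ["a", "b"];               // circles already drawn
const next = [{ id: "b" }, { id: "c" }]; // new data to bind
const join = dataJoin(onPage, next, (d) => d.id);
// join.enter  -> [{ id: "c" }]  (append a new circle)
// join.update -> [{ id: "b" }]  (transition it to its new state)
// join.exit   -> ["a"]          (remove its circle)
```

In real D3 the same partition drives selections of SVG nodes, and transitions animate the update and exit groups, which is what makes its dynamic visualizations possible.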

D3 was created by Mike Bostock as a continuation of his graduate work at Stanford. Check out the awesome examples.

Please share with us some details about the session. What will attendees get from it?

Attendees will learn the basics of how D3 works. They’ll come away with a visualization in a static HTML file representing some aspect of a real-world dataset, and a vague sense of having been entertained. I’m hoping the workshop will expose them to the tool and give them a place to start if they want to do more on their own. 

What are the prerequisites attendees should have to take full advantage of your session?

Having already downloaded D3 4.0 (4.0!!!!!) will be useful, but really just a working browser — I’ll be using Chrome — and an IDE or text editor of your choice. And a Positive Attitude™.

Finally, on a more personal tenor, what's the best book you've read recently? 

Story of O: a bildungsroman about a young French girl's spiritual growth. Very inspiring!

Thank you Morgane for your insights and thoughts.

Morgane's “Intro to D3” workshop session will be part of the Open Data Science Conference taking place in Boston, MA from May 3 to 5.

A good excuse to visit beautiful Boston and have a great data science learning experience!

Cloud Analytics Conference – London!

Cloud Analytics Conference – London!

Join Snowflake and The Data Warrior in London on June 1st for a Cloud Analytics Conference
About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

About IoT Platforms, Super Powers Methodology, Superheroes and Super Villains

The world is full of normal people like you and me, but I love to think that superheroes live among us, and I dream that maybe someday I could become one of them and make a better world with my super powers.

The universe of superheroes has room for gods, mutants, and humans with special skills, but also for special agents. I found it fun to spot similarities between this fantastic world and the world of IoT platforms. Comparing IoT platforms to superheroes and supervillains, and finding a reasonable resemblance, is the goal of this article. Opinions, as always, are personal and open to all kinds of comments and appreciations. Enjoy the article.

About IoT Platforms

Many of my regular readers remember my article “It is an IoT Platform, stupid!”. At that time, per Research and Markets, there were more than 260 IoT platforms; today some sources speak of 700 IoT platforms. I confess, I have not been able to follow the birth, evolution and in some cases death of all the IoT platforms out there. I think that many enthusiasts like me have also given up keeping an updated list.

I cannot predict which IoT platforms will survive beyond 2020, or which will be ...

Read More on Datafloq
How to Implement a Successful Big Data and Data Science Strategy

How to Implement a Successful Big Data and Data Science Strategy

Big Data and Data Science are two of the most exciting areas in business today. While most decision makers understand the true potential of both fields, companies remain uncertain about how to implement a successful big data strategy for their enterprises. This roadmap can help you define and implement the right big data strategy in your organization.

There are many ways to incorporate big data and data science processes into your company’s operations, but the practices outlined here will help businesses draw up a solid blueprint of their big data implementation strategy.

Define the Big Data Analytics Strategy

Organizations first need to define a clear strategy, in synchronization with their core business objectives, for the big data implementation. A strategy may include improving operational efficiency, boosting marketing campaigns, analyzing consumers for prediction, or countering fraud to mitigate risk and drive business performance. The business strategy should adhere to the following points to effectively solve business problems.

The business strategy should align itself with the enterprise quality and performance goals.
It should focus on measurable outcomes.
It should transform your company’s capabilities through data-driven decision making.

Choosing the right data

With the voluminous increase in data, it has become problematic for organizations to ...

Read More on Datafloq
How to Improve Your Data Quality to Comply with the GDPR

How to Improve Your Data Quality to Comply with the GDPR

What data quality means to the GDPR

The General Data Protection Regulation (GDPR) that will come into effect on May 25th, 2018 has strong implications for nearly each and every company and organization in Europe. Its principle of “privacy by design”, which was first postulated by Canadian data protection scientist Ann Cavoukian, could lead to a paradigm shift in how businesses develop their marketing campaigns and their customer service.

Many of the articles of the GDPR show the importance of data quality, especially Article 5 (Principles relating to processing of personal data) and Article 16 (Right to rectification). But it is obvious that other parts of the GDPR also demand a high level of data quality; in particular, duplicates should be avoided in order to properly fulfill the data subject’s rights, like the “Right of access” or the “Right to object”.

The problem with this insight is that many businesses are struggling with their data quality. Studies and surveys show that a majority of companies are not satisfied with the data quality in their databases and think that it needs improvement.

But what are possible measures to improve data quality?

Technical and organizational measures

In the “good old days” there were dedicated employees called “data entry clerks”. ...

Read More on Datafloq
Why We Need to Stop Using FTP for Media Data, Like Yesterday

Why We Need to Stop Using FTP for Media Data, Like Yesterday

It’s 2017, and it’s time to start making some serious changes around here. FTP, or the File Transfer Protocol, is one of the most popular transfer methods for sending files to — and downloading from — the cloud. Users like FTP because it’s simple to use and efficient when you’re primarily working with local media servers.

But, the ease of FTP comes at a cost, and the security risks are simply not worth it.

According to a recent report, FTP and SFTP remain “a popular transit protocol for getting files to the cloud, primarily due to its ease and prevalence on local media servers.”

Yet FTP and SFTP — governed by the TCP/IP protocol — were never designed to handle large data transfers. Worse, they are just not as secure as what you could be using, especially if you’re a media company with proprietary content and materials.

One of the most egregious issues with FTP is that its servers can only handle usernames and passwords in plain text. FTP’s inability to handle more than usernames and passwords is exactly why you’re advised not to use root accounts for FTP access. If someone were to discern your username and password, they could ...
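To make the plain-text problem concrete: per the classic FTP USER/PASS exchange, these are the literal lines a client writes on the control channel at login, exactly as an eavesdropper on the network would see them. The helper below is a toy illustration, not a real client:

```javascript
// Toy illustration: the FTP control channel carries credentials as
// plain text. These are the raw command lines a client sends at login.
function ftpLoginCommands(username, password) {
  return [`USER ${username}\r\n`, `PASS ${password}\r\n`];
}

ftpLoginCommands("admin", "hunter2");
// -> ["USER admin\r\n", "PASS hunter2\r\n"] -- the password in the clear
```

SFTP avoids this particular exposure by running over an encrypted SSH channel, but as the article notes, neither protocol was designed with large media transfers in mind.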

Read More on Datafloq
Bad Data that Changed the Course of History

Bad Data that Changed the Course of History

Data drives all the major decisions in the world today.  Every business relies on data to make daily strategic decisions. Every decision from attending to customer needs to gaining competitive advantage is made thanks to data.

As individuals we rely on data for even the most basic daily activities including navigation to and from work as well as for communicating with friends and family.  But what happens when the data we rely on to make our daily decisions is bad?  It can have a drastic impact on our lives whether it’s a small task like choosing where to eat, or deciding whether or not a candidate for a job is qualified to hire.  Relying on bad data can also have a drastic impact on your bottom line.   

Bad Data is Costly

We know that bad data is costly, but just how costly can it be? IBM estimates that bad data costs the US economy roughly $3.1 trillion each year. That’s a huge number. They also found that 1 in 3 business leaders don’t trust the information they use to make decisions. Not only do they not trust the data they are working with, but there is also a high level of ...

Read More on Datafloq
Do Self-driving Cars Hold the Key to a Widespread IoT?

Do Self-driving Cars Hold the Key to a Widespread IoT?

In 2014, Continental Tires developed tires that “talk to you”. The innovation, dubbed eTIS (electronic Tire Information System), consists of sensors embedded beneath the tire tread. The sensors relay information about when your tires are underinflated, when tread is too low, and when your car has too much weight in it from a heavy load. This new entry in the annals of IoT tech was relatively quiet and unglamorous. Yet it forecasted what we’re seeing now: car manufacturers and tire manufacturers are throwing millions of dollars into technology that will enable a widespread internet of things.

Call it necessity facilitating innovation; as I reported in an earlier post here, 1.2 million people die in auto-related accidents every year. That means safety is in high demand. One way to increase safety is to embed things like tires with sensors that can communicate data with a car’s onboard computer. Another way is to replace humans with AI to create self-driving cars, which will hopefully do a better job than we do at driving.

For self-driving cars to truly succeed by 2020, the IoT needs 4.5 million developers. That’s because a comprehensive IoT infrastructure—in which smart cities talk to smart cars—will help driverless vehicles navigate ...

Read More on Datafloq
Snowflake and Spark, Part 1: Why Spark? 

Snowflake and Spark, Part 1: Why Spark? 

Snowflake Computing is making great strides in the evolution of our Elastic DWaaS in the cloud. Here is a recent update from engineering and product management on our integration with Spark: This is the first post in an ongoing series describing Snowflake’s integration with Spark. In this post, we introduce the Snowflake Connector for Spark (package […]
Why Isn’t Big Data Called Small Data?

Why Isn’t Big Data Called Small Data?

Sometimes I think that Big Data has a branding problem.

You see, for data scientists to gain the trust and buy-in from their colleagues, they have to explain how their analysis can add value. They take a “data ocean” of information and distill it into highly-specific and actionable insights for every internal customer, refining and refreshing it along the way to ensure that it is as relevant as possible.

It is like they take the most powerful telescope imaginable and look for a speck of dust on the moon. “Here you go, this precise set of data will prove that you are right.”

The success of Big Data initiatives (to a large extent) comes in the ability to drill down from the planetary level to the sub-atomic level. It’s all about getting to those small insights that would never have appeared had you not started large and refocused, refocused and refocused. Of course, this doesn’t mean that the bigger trends are not relevant, but we have a tendency to view anything “large� with a certain amount of mistrust.

Somehow we naturally think that “big� things have a bigger margin for error, although the assumptions that we made on the way to the smaller insights could ...

Read More on Datafloq
How the GDPR will boost the Megatrend of Human Data Responsibility

How the GDPR will boost the Megatrend of Human Data Responsibility

First of all, some basic facts about the General Data Protection Regulation (GDPR). If you haven’t heard about it, you should pay attention now and get some further information (there are resources for German readers as well). The GDPR will affect everybody working with personal data and is one of the major aspects of Human Data Responsibility (HDR).

The Facts:

The enforcement date of the GDPR is 25th May 2018, so you have a little over one year to introduce the new rules to your company.
There will be extremely heavy fines for organizations that don’t work within the law: up to 4% of global annual turnover or €20 million (whichever is greater).
The rules affect every organization working with personal data of European Union citizens, so this is a worldwide topic.

I also want to point out that, IMHO, the GDPR is a good thing. It is historically based on the Charter of Fundamental Rights of the European Union from the year 2000, where the protection of personal data (Article 8) is on the same level as Human Dignity (Article 1), the Right to Life (Article 2), and Freedom of Thought, Conscience and Religion (Article 10). ...

Read More on Datafloq
How to Avoid a Data Breach

How to Avoid a Data Breach

Even a modest data breach can create serious problems that business owners cannot afford to take lightly. By some estimates, small businesses are the target of more than 40 percent of all online attacks. From providing staff and employees with the additional training in order to avoid any bad habits that may lead to security concerns to staying informed regarding the new threats that may be just over the horizon, business owners would be wise to take whatever steps may be necessary in order to enhance their level of digital security.

Consequences of a Breach

From the largest and most high-profile breaches to situations where business owners may not even be aware that their accounts or information may become compromised, calculating the true cost of a breach can often be difficult. In addition to more tangible instances that may involve the theft of funds or loss of assets, long-term damage to the brand or image of a business can often be quite costly. Consumers who have reason for heightened concern regarding their personal, account or financial information are far more likely to take their business elsewhere. Failing to address digital security concerns could end up sending the wrong message to potential ...

Read More on Datafloq
What To Consider When Hiring Data Science Talent

What To Consider When Hiring Data Science Talent

The truth is that hiring for data science is in many ways more of an art than a science. That does sound oxymoronic, but that does not make it any less true. The reason is obvious. Data science is so new that it can be hard to know what you’re actually looking for. What is the set of skills and abilities that will make a data science team fly and which one will make them flounder?

If you don’t know, then you’re certainly not alone. Fortunately, we do have some years of company experience to draw on. What’s more, IT has been with us for nearly two decades and there are plenty of valuable lessons there too that we can implement in data science hires.

So let’s see what we’ve learned so far.

It’s not only about the numbers

A lot of companies think that if they just get a couple of people who are incredibly good with numbers, then it will all sort itself out. That couldn’t be further from the truth because the numbers alone won’t get you anywhere.

In the data community, there’s a famous saying: Garbage In, Garbage Out. When they say it, they’re mainly talking about the quality of the raw ...

Read More on Datafloq
Buzzing 2017 Trends That Will Affect Big Data Users

Buzzing 2017 Trends That Will Affect Big Data Users

The smartest way of predicting how 2017 will be for big data is to say that it will just get bigger and better. What will get bigger is the number of companies using big data, and what will get better is the way big data technologies will be employed.

The technologies change so fast that it is almost impossible for organizations to keep up with the growth at times. This makes it imperative for the organizations to be informed about what will trend and what is likely to shape the future so that the selection can be made appropriately. Here, we list the big data trends that will affect organizations in 2017 and also the big data industry.

Boom in the Internet of Things (IoT)

In the past few years, we have seen glimpses of IoT being adopted in luxury goods. Some prominent researchers have predicted a revolution in IoT which is sure to generate oodles of data, causing big data technologies to customize their offerings and center them around IoT.

Cloud for Everything Big Data

So far, there has been a mixed reaction to choosing cloud for storage of trivial data. But, it seems that companies have finally found the right mix with hybrid ...

Read More on Datafloq
Do You Know How to Create a Dashboard?

Do You Know How to Create a Dashboard?

Chalk Board
Image by Travis Wise via Flickr

What a question.

Of course, anyone can create a dashboard for their business, right? If I don’t have the time, my IT folks could whip up some for me in a matter of hours, right?

Even more tragically, some read the above question and thought it was “Do you know how to create a chart?” which is a different question entirely.

The truth is, not everyone can create a useful and always up-to-date business dashboard. And to those who think they have the best IT department in the whole industry, here is one little surprise: They may not have the necessary experience to build one for you either.

Really? Is it that difficult to create a business dashboard?

Everyone seems to have one or five displayed on LCD screens in their hallways or conference room nowadays, how can it be that hard to create?

In truth, it is not the creation of the dashboard that is difficult, but the useful and always up-to-date part.

Useful How?

Smart companies (such as our clients) who have LCD screens throughout their office use the information broadcasted within the screen to disseminate important numbers that show the health of the company. By doing this, they engage all employees to think constantly on how their day-to-day tasks affect those numbers. (My last article discussed one of the benefits of this approach)

Therefore the numbers (and figures, and visualization) on the screen better be useful for everyone in the company to know about.

Here is the problem: These numbers usually are hiding inside multiple systems, several spreadsheets, and inside the head of some key personnel.

And, they regularly — if not constantly — change.

So They Need to be Always Up-to-date?


Now you start to see that to design, build, maintain, and keep up with the changes even for a single dashboard is quite a bit of work. Are you sure now that your IT department has the bandwidth (not to mention the required skills and experience)?

It is really a full-time job for qualified personnel; in fact, in a lot of cases a single person is not enough, it requires a team.

Okay, Mr. Smartypants, what do you suggest then?

Let’s start by answering these questions:

  1. Why do I need a dashboard? What purpose does it serve in my company at the moment? One good answer: “I need a lot of visibility into what can give my company the best chance to not only survive, but excel in a fiercely-competitive industry.”
  2. What do I want on the dashboard? Do I know enough about the metrics (ok, KPI if you have to use a buzzword) that affects the bottom line, also the ones that show me the pulse — or even more useful: Problem areas — within the company? If you don’t already have a project to find these metrics, now would be a good time to start one, because most likely your competitors are working on it as well.
  3. Who can help me design, build, and maintain these dashboards? Can my existing personnel do it? Or is it time to chat with folks whose day-to-day business is to design, build, and maintain other companies’ dashboards?

But Didn’t We Just Buy That BI Tool?

Maybe, but a BI suite of tools cannot automagically design, build, and maintain your dashboards. Someone still has to gather, clean up, and prepare the data so the tools can be used on it, and keep doing this as changes come and go.

One of the fallacies in the BI tool industry is the failure to mention the crucial part: without a well-designed and well-maintained data warehouse underneath, even the most sophisticated analytics and visualization tool is useless.

Why does a company like nextCoder exist?

Because we are very useful to our clients. It matters not if they have already paid for BI tools such as Power BI, Tableau, Domo, Birst, etc., because we actually help them design useful dashboards based on existing data, existing tools, and our experience from working on dashboards across different industries, thereby accelerating the process of making data analytics part of the company’s program to grow and to compete in the industry.

“What if we don’t have a tool yet?” Then consider DW Digest(TM), which is designed to work seamlessly with our data warehouse designs and implementations. It is also competitively priced against the tools I mentioned above.

Our online platform is designed to keep these useful dashboards up-to-date and able to cope with changes. Using our services, our clients can concentrate on ironing out kinks, discovering more opportunities, and saving time and ultimately cost throughout the company and across departments, all without worrying about how to maintain those dashboards.

Last question

You may indeed be able to create useful and always up-to-date dashboards. But since your business is probably not making dashboards, is it the best use of your time? Your team’s time?

If you have any questions on how to achieve measured acceleration for your business using dashboards, send them to: or call me at 214.436.3232.

What Can Security Analytics Give Your Team?

What Can Security Analytics Give Your Team?

With the constant changes happening in the technology industry and the ever increasing cyber security threats, businesses are eager now more than ever to protect their data and company assets. As more and more devices come equipped with internet capabilities, create data, and store personal information, there are more routes available for hackers to access this information and for a business to experience a security breach.

Cyber security is necessary for developing and conducting the appropriate safety measures that will ultimately protect an organization’s computer systems, networks, and confidential information. Having access to the best talent and technology is crucial for businesses and other institutions to keep up with and surpass the threats and constant efforts of cyber hackers. With these shifts in technology, whether they be data storage or video analytics, traditional perimeter protection tools are just not enough anymore. As businesses look for security solutions to invest in, they should consider funding a security analytics project as part of their cyber defense program.

Security monitoring and analytics proves to be one of the most fundamental services within a business’s information security system. Establishing your own in-house security operations center within your company to manage comprehensive monitoring and alerting services can ...

Read More on Datafloq
Data, Metadata, Algorithms & Ethics

Data, Metadata, Algorithms & Ethics

The topic of ethical big data use is one that will likely continue popping up in the headlines with increasing frequency in the coming years. As the IoT, AI, and other data-driven technologies become further integrated with our social identities, we will see more and more discussion regarding their regulation.

Recently, transparency advocates began pushing The Open, Public, Electronic and Necessary (or OPEN) Government Data Act, which aims to publish all non-federally restricted data in an open source format, allowing for standardized use by the government as well as the public.

“Our federal government is not just the largest organization in human history, it’s also the most complex,” said executive director of the Data Coalition, Hudson Hollister, in an article on the Federal Times. “To conduct oversight across such scale and complexity is a daunting challenge; fortunately, that is where transparency comes in. By giving Americans direct access to their government’s information, we can deputize millions of citizen inspectors general to help this committee fulfill its mission.”

This type of standardization, transparency, and ethical foresight aims to create a fair and balanced framework for the use of Big Data. Considering the pace of automation and IoT growth, these standards could begin affecting every industry ...

Read More on Datafloq
Why Marketing Needs Quality Data before Big Data and Predictive Analytics

Why Marketing Needs Quality Data before Big Data and Predictive Analytics

Recent marketing hype has been about new analytics and big data, and becoming marketing technologists. However, there are some fundamentals which must first be addressed, and a key stumbling block to effective marketing is the general poor quality of data. Data quality is non-negotiable. In a recent study, Britain's Royal Mail Data Services found that the average impact on businesses was a cost of 6% of annual revenue. While there was some variance among respondents, clearly no company can afford to ignore this problem.

This concern with data quality is not limited to the United Kingdom. Experian's data quality arm, in their annual benchmark report on global data quality, reported that while most businesses globally (and 95% in the US) use data to meet their business objectives, less than 44% of them trust their data.

Customer Experience is Top of Mind for 2017

Some 56% of the respondents in Experian's report want to serve their customers better in 2017, and recognize that a key factor in achieving this is better data. Providing a rich customer experience is the name of the game, and poor or erroneous information about that customer could cause the end of that relationship. It has become apparent to most ...

Read More on Datafloq
Why VPNs Are Vital For Data-Driven Business

Why VPNs Are Vital For Data-Driven Business

Companies have been using Virtual Private Networks (VPNs) for years, typically so that workers could access their desktops remotely, but in the age of big data, these systems are more important than ever before. The fact is, remote working is on the rise, data use is multiplying, and hackers are more innovative than ever before. VPNs are one of the best tools at our disposal for protecting our businesses, our data, and our clients.

Is your company armed against the constant threat of data theft? Here’s what you need to know to keep your business safe.

Are You Out There?

Did you know that 80 percent of corporate professionals work outside the office at least once a week? That’s a lot of people operating outside the protection and constraints of the typical workplace, such as multi-level encryption, firewalls, and protected servers. Though digital attacks can take place anywhere, shifting away from the office puts workers at a unique risk.

Always remind your workers to take precautions when using free WiFi on the road, and invest in a VPN they can use no matter where work takes them. Yes, free WiFi is a must-have – at the hotel or cafes – but using it without ...

Read More on Datafloq
Top 5 ways to use Big Data to improve your Website Design

Top 5 ways to use Big Data to improve your Website Design

Big Data is a buzzword these days. Wondering what big data actually is? Let's get the definition out of the way first, so that we can begin on the same page.

What is Big Data?

Big Data refers to huge volumes of data, both structured and unstructured. The volume of data is so massive in scope that it is almost impossible to process using traditional means. As per Cloud Tweaks, 2.5 quintillion bytes of data are produced every single day, and experts predict that 40 zettabytes of data will exist by the end of 2020. So, basically, big data is everywhere, and it's shaping the internet and influencing the way we do business.

How Big Data is influencing the Web Design

With the help of big data, businesses can create a data-driven web design that delivers the best user experience. A data-driven website design is not restricted to functionality and visual appeal; rather, it takes a more scientific approach to the concept of web design, focusing on how, through the design, a company website can gain more traffic and leads. It has been observed that businesses that make the switch to data-driven web design enjoy ...

Read More on Datafloq
How to Capitalize on the Complex Modern Data Ecosystems

How to Capitalize on the Complex Modern Data Ecosystems

The data ecosystem serving today’s modern enterprises is a multi-platform architecture that attempts to embrace a variety of heterogeneous data sources. This modern data ecosystem (MDE) might include data lakes, traditional data warehouses, SaaS deployments and other cloud-based systems, data hubs, and distributed databases.

Multi-Platform Architecture

Reality of Modern Enterprise

MDEs can potentially enable a wide variety of business goals, support data diversity, optimize costs, and serve multiple systems of insight. However, MDEs will never be able to deliver these benefits unless enterprises can surmount a series of formidable challenges:

Data ownership. Who owns the data and with whom can it be shared?
Integration and unification. How will disparate data be integrated and unified to support reporting and analysis across the entire portfolio?
Data quality risks. How will an enterprise ensure adequate data quality given that different data systems will be characterized by different levels of data quality?
Skillset scarcity.  How will an enterprise fulfill the need for a diverse set of skills?
Optimization issues. How will an enterprise optimize the interaction among an MDE's separate, poorly orchestrated components?
Multiple data models. How will an enterprise work with multiple data models that proliferate, reducing efficiency?
Holistic view. How will an enterprise establish a sustainable method for gaining ...

Read More on Datafloq
3 Fresh Approaches to Maximize Customer Value with Data

3 Fresh Approaches to Maximize Customer Value with Data

New customer acquisition is costly. And customers are increasingly demanding, fickle, and empowered with endless options — new and old — to spend their dollars. So brands are rightly focused on increasing retention and share of wallet to maximize customer value.

Brands know that data holds the key to making the customer value gains they want to see. But they struggle to leverage that data in the right way. Here are 3 fresh approaches many brands are not using, but should consider, to improve customer value.

1. The More Data, The Merrier

You are collecting some data — likely even a lot of data — about your customers. You’ve got some demographics, geography/location, and purchase history. You may have their customer service history and website behavior as well.

Don’t stop there. Do you know their marital status, education level, income? How about the words they said when talking to a customer service rep? How about their tweets? How loyal to your brand are their friends, family, and coworkers in their social networks?

And don’t stop with your customers. What about the enterprise itself? You’ve got a wealth of data about every aspect of the business, including sales data, ops data, and much more.

Why is this important?

Many ...

Read More on Datafloq
Connected Cars: Big Data’s Next Mining Ground

Connected Cars: Big Data’s Next Mining Ground

One of the exciting things about the future of big data is that it will likely start acting more like a living system when products like the self-driving car mature in the marketplace. Instead of needing to correlate the differences, the analytical infrastructure will allow data that is accruing to be analyzed and acted upon automatically. That type of future should be reassuring for most insurance companies that will now only have hacking to worry about when it comes to serious accidents and large payouts.

Data mining that makes sense

Most people have noticed that the amount of privacy that they have in their lives is continuously shrinking. Part of this is due to convenience, while another part is due to additional security that takes away individual freedoms in order to provide the entire neighborhood, town, or community with better protection.

Because the tech industry has already set up the type of licensing that makes users of smartphones and tablets that are in cars or connected to them subject to wide-scale data gathering, the new auto manufacturers are scrambling to put together sophisticated programs that take advantage of the data that they are allowed to gather.

Insurance industry faces a great deal of change

Another ...

Read More on Datafloq
Cloudera Analyst Event: Facing a New Data Management Era

Cloudera Analyst Event: Facing a New Data Management Era

I have to say that I attended this year’s Cloudera analyst event in San Francisco with a mix of excitement, expectation and a grain of salt also.

My excitement and expectation were fueled by all that has been said about Cloudera and its close competitors in the last couple of years, and by the fact that I am currently focusing my own research on big data and “New Data Platforms”. That said, when it comes to events hosted by vendors, I always recommend taking their statements with a grain of salt, because the information might naturally be biased.

However, in the end, the event was an enriching learning experience, full of surprises and discoveries. I learned a lot about a company that is certainly contributing in a big way to the transformation of the enterprise software industry.

The event certainly fulfilled many of my “want-to-know-more” expectations about Cloudera and its offering stack; the path the company has taken; and its view of the enterprise data management market.

Certainly, it looks like Cloudera is leading and strongly paving the way for a new generation of enterprise data software management platforms.

So, let me share with you a brief summary and comments about Cloudera’s 2017 industry analyst gathering.

OK, Machine Learning and Data Science are Hot Today

One of the themes of the event was Cloudera’s keen interest and immersion into Machine Learning and Data Science. Just a few days before the event, the company made two important announcements:

The first announcement was the beta release of Cloudera Data Science Workbench (Figure 1), the company's new self-service environment for data science on top of Cloudera Enterprise. This new offering comes directly from the company's smart acquisition of a machine learning and data science startup.

Figure 1. Cloudera Data Science Workbench (Courtesy of Cloudera)
This product lets data scientists develop in some of the most popular open source languages —R, Python and Scala— with native Apache Spark and Apache Hadoop integration, which in turn speeds up project deployments, from exploration to production.

In this regard, Charles Zedlewski, senior vice president of Products at Cloudera, mentioned that

“Cloudera is focused on improving the user experience for data science and engineering teams, in particular those who want to scale their analytics using Spark for data processing and machine learning. The acquisition of and its team provided a strong foundation, and Data Science Workbench now puts self-service data science at scale within reach for our customers.”

One key approach Cloudera takes with the Data Science Workbench is that it aims to enable data scientists to work in a truly open space that can expand its reach to use, for example, deep learning frameworks such as TensorFlow, Microsoft Cognitive Toolkit, MXNet or BigDL, but within a secure and contained environment.

This is certainly a new offering with huge potential for Cloudera to increase its customer base, but also to reaffirm and grow its presence within existing customers, which can now expand their use of the Cloudera platform without needing to look for third-party options to develop on top of.

The second announcement was the launch of the Cloudera Solution Gallery (Figure 2), which lets Cloudera showcase its large partner base (more than 2,800 partners globally) and a storefront of more than 100 solutions.

This news should not be taken lightly, as it shows Cloudera's capability to start building a complete ecosystem around its robust set of products, which in my view is a defining trait of companies that want to become a de facto industry standard.

Figure 2. Cloudera Solution Gallery (Courtesy of Cloudera)

Cloudera: Way More than Hadoop

During an intensive two-day event filled with presentations, briefings and interviews with Cloudera's executives and customers, a persistent message prevailed. While the company recognizes its origin as a provider of a commercial Hadoop distribution, it is now making clear that its current offering has expanded well beyond the Hadoop realm to become a full-fledged open source data platform. Hadoop certainly remains at the core of Cloudera as the main data engine but, with support for 25 open source projects, the platform now offers much more than Hadoop distributed storage capabilities.
This is reflected through Cloudera’s offerings, from the full fledged Cloudera Enterprise Data Hub, its comprehensive platform, or via one of Cloudera’s special configurations:

Cloudera’s executives made it clear that the company strategy is to make sure they are able to provide, via open source offerings, efficient enterprise-ready data management solutions.

However, don't be surprised if Cloudera's message changes over time, especially if the company sets its sights on larger organizations, which most of the time rely on providers that can center their IT services on the business and are not necessarily tied to any particular technology.

Cloudera is redefining itself so it can reposition its offering as a complete data management platform. This is a logical step considering that Cloudera wants to take a bigger piece of the large enterprise market, even when the company's CEO stated that they “do not want to replace the Netezzas and Oracles of the world”.

Based on these events, it is clear to me that Cloudera will eventually end up competing head-on in specific segments of the data management market, especially with IBM, through IBM BigInsights, and with Teradata, whose products have left and keep leaving a very strong footprint in the data warehouse market. Whether we like it or not, big data incumbents such as Cloudera seem destined to enter the big fight.

The Future, Cloudera and IoT

During the event I also had a chance to attend a couple of sessions specifically devoted to showing Cloudera's deployments in the context of IoT projects. Another thing worth noting is that, even though Cloudera has some really good stories to tell about IoT, the company seems to be in no hurry to jump directly onto this wagon.

Perhaps it's better to let this market become mature and consistent enough before devoting larger technical investments to it. It is always very important to know when and how to invest in an emerging market.

However, we should be very well aware that Cloudera, and the rest of the big data players, will be vital for the growth and evolution of the IoT market.

Figure 3. Cloudera Architecture for IoT (Courtesy of Cloudera)

It’s Hard to Grow Gracefully

Today it's very hard, if not impossible, to deny that Hadoop is strongly immersed in the enterprise data management ecosystem of almost every industry. Cloudera's analyst event was yet another confirmation. Large companies are now increasingly using some of Cloudera's different options and configurations for mission-critical functions.

For Cloudera, then, the nub of the issue is not how to get to the top, but how to stay there, evolve, and leave its footprint.

Cloudera has been very smart and strategic in getting to this position, yet it seems to have reached a place where the tide will get even tougher. From this point on, convincing companies to open the big wallet will take much more than a solid technical justification.

At the time of writing this post, I learned that Cloudera has filed to go public and will trade on the New York Stock Exchange, and as an article on Fortune mentions:

“Cloudera faces tough competition in the data analytics market and cites in its filing several high-profile rivals, including Amazon Web Services, Google, Microsoft, Hewlett Packard Enterprise, and Oracle.”

It also mentions the case of Hortonworks, which:

“went public in late 2014 with its shares trading at nearly $28 during its height in April 2015. However, Hortonworks’ shares have dropped over 60% to $9.90 on Friday as the company has struggled to be profitable.”

In my opinion, for Cloudera to succeed while taking this critical step, it will have to show that it is more than well prepared business-wise, technically, and strategically, and that it is also ready for the unexpected, because only then will it be able to grow gracefully and play big, with the big guys.

Always keep in mind that, as Benjamin Franklin said:

Without continual growth and progress, such words as improvement,
achievement, and success have no meaning.

Meet me in St. Louie, Louie.

Meet me in St. Louie, Louie.

Join me on May 2nd in St. Louis for the SilverLinings event where I will give three talks!
The Importance of Big Data in Real Estate Business

The Importance of Big Data in Real Estate Business

The concept of big data is not new; it has been around for years, and as it creates wonders, most organizations in the world have started turning to this innovative means. As soon as they realized its true potential, profit-making institutions in today's market took the opportunity to stream all the data they capture into their business, thereby strengthening their profit margins. Applying the required analytics and extracting significant value from that data has become common among top marketing honchos in search of ground-breaking methods to boost their business.

The biggest advantages big data analytics brings on board are effectiveness and speed. Gone are the days when it took a great deal of effort to gather information, run analytics, and uncover insights that could be used for future decisions. With big data, it is no longer a challenge to extract insights for instant decisions. With this smart technology in use, you can work faster, stay nimble, and gain a competitive edge that you hardly had before.

Now, when it comes to the real estate business, big data again gives the sector ...

Read More on Datafloq
5 Platforms that Protect Your Startup from DDoS Attacks

5 Platforms that Protect Your Startup from DDoS Attacks

Cybersecurity may not have been on your list of priorities when you founded your startup. If so, then you've already committed one of your first mistakes in business. Looking at the big hacking incidents of 2016, it's apparent that cyber criminals are getting more sophisticated and ruthless every year. Some of them even offer on-demand DDoS services, which competitors can leverage for as little as $5.

A DDoS attack utilizes a large network of infected computers, also known as a “botnet”, to flood the target website's servers and deny access to real users, resulting in lost revenue. It may also lead to secondary damages such as lost data, high remediation costs, and a ruined brand reputation.
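To make the flood-versus-real-user distinction concrete, here is a minimal sliding-window rate limiter of the kind mitigation services build on. This is a toy sketch, not how any of the platforms below actually work; the thresholds and the per-IP bookkeeping are invented for the example.

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Reject clients that exceed max_requests within window_seconds."""
    def __init__(self, max_requests=100, window_seconds=10.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)   # client_ip -> recent request timestamps

    def allow(self, client_ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client_ip]
        # Forget requests that fell out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False                 # request rate looks like a flood: deny
        q.append(now)
        return True

# A client firing 4 requests in 0.3 seconds trips a 3-per-second limit.
limiter = SlidingWindowLimiter(max_requests=3, window_seconds=1.0)
results = [limiter.allow("10.0.0.1", now=t) for t in (0.0, 0.1, 0.2, 0.3)]
print(results)  # [True, True, True, False]
```

Real DDoS protection is far more involved (it distinguishes distributed sources, absorbs traffic at the network edge, and inspects request content), but the core idea of bounding per-client request rates is the same.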

To help you better understand how DDoS attacks work, you can refer to the infographic below:

Infographic source: Incapsula – The Anatomy of a DDoS Attack

To protect your startup from such attacks, below are 5 platforms you should consider:

1. Cloudflare

The most straightforward way to protect against DDoS attacks is to leverage a comprehensive security platform like Cloudflare. It has everything you need to protect your startup from security threats, including but not limited to a web application firewall (WAF), a shared SSL certificate, and advanced DDoS ...

Read More on Datafloq
How Facial Recognition Can Help to Understand People

How Facial Recognition Can Help to Understand People

The facial recognition market is expected to grow to more than $2 Bn by 2020. While that's a small figure compared to the analytics market, which is expected to reach a whopping $200+ Bn around the same time, the demand for face analytics continues to grow in line with expectations and, with it, the application of big data and analytics in many spheres of our lives.

Facebook, Google, Amazon, Microsoft, and a host of other technology majors have acquired, and continue to be on the look-out for, start-ups and companies delving deep into facial recognition. This is testimony not only to the growing demand for facial recognition tools but also to the power they can give organizations to do many wonderful things that weren't even imagined earlier, much less possible. One of the many possibilities is how companies and organizations understand people (customers, prospects, visitors, strangers, patrons, commoners, suspects, etc.) beyond online footprints and such other touch-points.

Admittedly, facial recognition software has been in use for quite some time now. However, its use was limited to a select few, such as state and federal investigating agencies, security organizations and, perhaps, a handful of businesses, where ...

Read More on Datafloq
How Virtual Reality Apps Influences Small Businesses

How Virtual Reality Apps Influences Small Businesses

Virtual reality offers a thrilling experience to users. It connects their senses and stimulates feelings that 2D visuals cannot stir up. With ongoing advancement, this technology will offer users an ever more immersive and exciting experience. Though it has progressed in gaming, healthcare, learning, and other commercial areas, it will soon become a household tool. The acquisition of Oculus VR by Facebook guarantees that virtual reality is set to influence everyday life.

Virtual reality is not only affecting users' daily lives; it is also transforming the operations of businesses. It has become a tool that small business owners can use to boost their businesses. This technology can make rapid, higher returns on investment possible. Besides, it is an effective marketing tool.

5 Ways Virtual Reality Can Influence Small Businesses

1. Changing Customers’ Perception

Virtual reality is not only used for improving sales; it can also change consumers' opinion of a brand favorably. Businesses such as JCPenney and Marriott have used it to entertain their customers and promote their products. With exciting virtual tours of faraway places, these brands have used VR technology to win more approval from people. Business owners can use it to give a unique experience to ...

Read More on Datafloq
Big Data & the US Vote: Did Trump Change Their Mind?

Big Data & the US Vote: Did Trump Change Their Mind?

Two summers before the 2016 US Presidential election, I was sitting around a bonfire in the wilds of Kenya, lingering in the peace that comes from spending the day amongst the extraordinary wildlife of a safari (ordinary for Africa). An intimate gathering of around 15 guests from all continents, the conversation was friendly and centered on the day's sightseeing. Eventually, though, it meandered into the typical conversational territory: who you are, where you are from, and what you do.

There was an extended pause as we all gazed into the fire, reminiscing on elephants, lions, and miles of wildebeest trekking the Mara. I was dreamily wondering about the potential of the universe when an unexpected verbal volley shot across the flames.

“So what do you think about Donald Trump becoming President?” Like a grandmother's awkward question to a pregnant member at the holiday family dinner table, heads turned and all eyes rolled toward me, the sole American representative.

I wish I could say I easily conjured an interesting, insightful, and perhaps even clever reply to demonstrate my thorough comprehension of American politics, but honestly, the thought in my mind a year before the primaries was “… Donald Trump is running for ...

Read More on Datafloq
How Big Data Personalization Influences the Banking Industry

How Big Data Personalization Influences the Banking Industry

It is a common belief today that banks and big banking corporations basically control the world. They deliver funds to the right place, handle big transactions, choose who they want to support, and, most of all, decide what the future holds for the world.

But a person could find themselves wondering about other aspects of the banking industry, such as the data banks gather, possible security risks, and their personalized approach to every client. When you consider how big data research affects the personalization process, things get a little more complicated.

Why is big data needed by the banks?

We see banks as large corporations that do their job behind closed doors, inaccessible to anyone who is not a bank employee, except in rare cases.

What is accessible to us, though, is the money transaction process, restricted to our personal use. This is the part we do and should know well, as it is our money that is in question. Yet our knowledge of banks is limited, while they know a lot more about us, and make use of that information.

Simply by using your social security number and bank account, a bank can learn ...

Read More on Datafloq
IoT: An Open Ocean of Opportunities

IoT: An Open Ocean of Opportunities

The Internet of Things (IoT) is in the wild west phase of its development. There are no formal regulations that dictate a security protocol which is evenly employed by all manufacturers, yet. While there is a distinct lack of structure in the developmental process, there is no shortage of pioneers eagerly hitching up their wagon and venturing out to explore the vast new wilderness. What this means is there are ample opportunities within each industry to become early adopters who can make the most out of this new wave of technology as long as they are carefully vetting the providers with which they choose to work.

Fleet Management

Those companies that have large transportation fleets stand to benefit substantially from the growing integration of IoT technologies. IoT fleet management with companies like Fathym makes it possible to monitor drivers in real time, which can be beneficial to both the driver and the company.

Fleet managers have the ability to maintain far more control over their fleet than ever before. Cell phones and GPS technology have steadily increased the amount of influence and control they can exert. With the integration of the IoT even more is possible. Dispatchers can monitor speeding, unscheduled stops, vehicle maintenance issues, and accidents.

This ...

Read More on Datafloq
10 Tools for the Novice Data Scientist

10 Tools for the Novice Data Scientist

Data scientists harness their knowledge of statistics to convert collected data into potential ideas for product development, customer retention, and the generation of business opportunities. It could even help a dissertation writing service with its work. Recently, data science was dubbed the sexiest job of the 21st century, as demand for data scientists keeps increasing. To become one, you have to gain the necessary skills to enter the world of data science. And when you do, here are some tools you can use to practice on:


RapidMiner began in 2006 as an open-source program under the name Rapid-I. Over the years it was renamed RapidMiner and managed to raise 35 million US dollars in funding. Older versions of the tool are open source, while the latest version can be used within a 14-day trial, after which a license can be bought. RapidMiner covers the whole predictive modeling life-cycle, including deployment and validation. Its graphic user interface is designed using a block-diagram approach, similar to MATLAB Simulink's.


This is one more platform that provides a great Graphic User Interface, which can be used in 6 easy steps:

Sources – Makes use of various sources of data
Datasets – ...

Read More on Datafloq
Why Artificial Intelligence Has Its Negative Sides Too

Why Artificial Intelligence Has Its Negative Sides Too

The emergence of Artificial Intelligence as a technology has been welcomed with a lot of accolades and commendations. Everybody keeps thinking about the benefits and convenience it comes with, but only very few people have thought about its negative aspects. This article compares the advantages of Artificial Intelligence to its disadvantages and leaves the reader to conclude whether the technology is really worth it.

No doubt, Artificial Intelligence brings convenience like no other technology, and it is being adopted across industries. Nobody encounters an application of Artificial Intelligence without falling in love with the technology. Before delving into its disadvantages, it is necessary to outline some of its most popular applications.

Automatic Lawn Mowers

A very good example is lawn mowers. Gone are the days when you needed to push the mower around to manually mow every part of your lawn. Now there are several types of do-it-yourself remote-control lawn mowers, and you can mow the lawn from your living room. In short, mowing the lawn has just become fun.

Smart Refrigerator

Another wonderful smart home gadget that is worthy of mention is the ...

Read More on Datafloq
7 Basic Misconceptions about Cloud Technology in a Business

7 Basic Misconceptions about Cloud Technology in a Business

The cloud is growing larger every day, with more and more customers either storing their data there or using its other functions. In a time of rapid change, it's easy for the cloud to be seen by some as a panacea to cure their many IT ills, while for others it presents as a source of impending woe.

Countless people who want to find out more are asking "what is cloud computing?", while others want to get to the root of the misconceptions circulating. These misconceptions have left Dublin business users and potential business users somewhat confused about the true facts.

Here are seven major misconceptions about cloud technology in a business that create hurdles to adopting it for better results.

1. Cloud is not safe

This is perhaps the most significant and least grounded belief about the cloud. Many managers claim that they wouldn't want critical company information floating somewhere around the internet, stored on shared hardware and accessible to anyone from regular users to the National Security Agency.

The truth is that cloud providers usually boast many more security protocols than any company could ...

Read More on Datafloq
Big Data and Risk Management in Financial Markets (Part II)

Big Data and Risk Management in Financial Markets (Part II)

I. Introduction to forecasting

If you missed the first part, I suggest you read it before going through this article. It gives a good introduction as well as an overview of traditional risk management and big data simulation. This article is instead more focused on big data forecasting.

There are nowadays several new techniques and methods, borrowed from other disciplines, that financial practitioners use with the aim of predicting future market outcomes.

Eklund and Kapetanios (2008) provided a good review of the new predictive methods, and I am borrowing here their classification, which divides forecasting techniques into four groups: single-equation models that use the whole dataset; models that use only a subset of the whole database, even though a complete set is provided; models that use partial datasets to estimate multiple forecasts that are averaged later on; and finally, multivariate models that use the whole dataset.

II. Single equation models

The first group is quite wide and includes common techniques used in new ways, such as ordinary least squares (OLS) regression and Bayesian regression, as well as new advancements in the field, as in the case of factor models.

In the OLS model, when the number of predictors exceeds the number of observations, the matrix X′X is singular, so the generalized (Moore–Penrose) inverse has to be used in order to estimate the parameters.
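A minimal sketch of this situation using NumPy and synthetic data (the dimensions, coefficients, and noise level are invented for illustration): with more predictors than observations, X′X is rank-deficient and cannot be inverted, so `np.linalg.pinv` supplies the minimum-norm least-squares solution instead.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_pred = 20, 50              # more predictors than observations
X = rng.normal(size=(n_obs, n_pred))
beta_true = np.zeros(n_pred)
beta_true[:3] = [1.5, -2.0, 0.7]    # only a few predictors actually matter
y = X @ beta_true + 0.01 * rng.normal(size=n_obs)

# X'X is 50x50 but has rank at most 20, so inv(X.T @ X) @ X.T @ y fails;
# the Moore-Penrose pseudoinverse gives the minimum-norm solution instead.
beta_hat = np.linalg.pinv(X) @ y

print(np.linalg.matrix_rank(X.T @ X))           # 20, not 50: singular
print(np.allclose(X @ beta_hat, y, atol=1e-6))  # True: fits the sample exactly
```

Because the system is underdetermined, the pseudoinverse solution interpolates the sample perfectly; this is exactly why, in forecasting practice, such estimates are usually combined with shrinkage or variable selection rather than used raw.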

Bayesian regression (De Mol, Giannone, ...

Read More on Datafloq
The Dirty Secret About Predictive Analytics

The Dirty Secret About Predictive Analytics

For some time now, predictive analytics has been hailed as the “next big thing.” A quick Google search for “predictive analytics” shows that everyone from Forbes to The Wall Street Journal and beyond has written about how predictive analytics is going to “transform business”, “turn analytics on its head”, and even “make BI obsolete”.

However, despite years of optimism, the analytics market is still dominated by visualization and business intelligence software such as Tableau, Qlik, and Birst. If predictive analytics is the next big thing, why isn't everyone using it?

Predictive analytics examines data and tells you what is likely to happen in the future. It promises to give you the power to “predict the future of your business” and to “know what will happen next.” And current technology is capable of making thousands, even millions, of predictions each second. Sounds pretty darn impressive.

But the dirty secret is that much of the automated predictive analytics technology on offer simply isn’t very useful. Why? Knowing what’s going to happen next is nice, but if you don’t know why, you won’t know what to do about it, and it will be of little value.

Consider the simple case of customer churn. With predictive analytics, you will know ...
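To make the "prediction without a why" point concrete, here is a hedged sketch: a hand-rolled logistic regression fit on synthetic churn data, whose coefficients expose the drivers of churn rather than just the prediction. All feature names, rates, and coefficients here are invented for illustration, not taken from any real churn model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Synthetic customers: [support_tickets, monthly_spend, tenure_months]
X = np.column_stack([
    rng.poisson(2, n),          # support tickets filed
    rng.normal(50, 15, n),      # monthly spend in dollars
    rng.integers(1, 60, n),     # months as a customer
]).astype(float)
# Churn is driven by many tickets and short tenure, not by spend.
logits = 0.9 * X[:, 0] - 0.06 * X[:, 2] - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Standardize features, then fit logistic regression by gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

# The coefficients are the "why": which factors push customers toward churn.
for name, coef in zip(["support_tickets", "monthly_spend", "tenure"], w):
    print(f"{name:15s} {coef:+.2f}")
```

Running this recovers a large positive weight on support tickets and a negative weight on tenure, i.e. an actionable explanation (fix support, reward loyalty), which is exactly what a black-box churn score by itself does not provide.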

Read More on Datafloq
What Most People Get Wrong About Data Lakes

What Most People Get Wrong About Data Lakes

The technology industry has continued to find new ways to interpret big data, develop artificial intelligence, create backup solutions, and expand the cloud into a platform for businesses. One issue many businesses face is finding ways to make data analysis easier in order to deliver faster and more insightful results. The data lake has helped businesses accomplish this, becoming a popular big data tool due to its ability to accumulate data in its original format from a potentially infinite number of sources, as diverse as social media, ticketing systems, and automatic sensors.
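A toy sketch of the "original format, many sources" idea (the source names and record formats are invented for illustration): records land in the lake untouched, tagged only with their origin, and a schema is imposed only when a particular analysis reads them, a pattern usually called schema-on-read.

```python
import json

# A toy "data lake": raw records kept in their original shape,
# tagged only with their source -- no upfront schema is enforced.
lake = []

def ingest(source, raw):
    lake.append({"source": source, "raw": raw})

ingest("social_media", json.dumps({"user": "ana", "text": "love it!"}))
ingest("ticketing", json.dumps({"ticket_id": 42, "status": "open"}))
ingest("sensor", "2017-03-01T12:00:00,23.5")   # a raw CSV line from a device

# Schema-on-read: interpretation is chosen at query time, per source.
def read_sensor_temps():
    for rec in lake:
        if rec["source"] == "sensor":
            ts, temp = rec["raw"].split(",")
            yield ts, float(temp)

print(list(read_sensor_temps()))  # [('2017-03-01T12:00:00', 23.5)]
```

This is the opposite of a traditional warehouse, where every record must be transformed into a fixed schema before it can be stored at all; the trade-off is that the reader, not the writer, carries the burden of making sense of the data.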

The data lake is still a relatively new addition to the technology industry, and since it is still developing, there are often a large number of misconceptions about what data lakes are and how they work. Below are the top six misconceptions.

The Data Lake is Considered Independent Technology

The data lake supports the big data endeavors of businesses by creating a path to the discovery of brand new insights. Many users would describe the data lake as another technological tool, but it is more precisely defined as an aggregation of older tools. This misconception comes ...

Read More on Datafloq
Don’t Let Your Big Data Project Be a Failure

Don’t Let Your Big Data Project Be a Failure

Data collection and analysis is an up-and-coming industry that has grown extensively over the past few years. With the rise of the digital age and our increasing connection to the internet, we have created an immense amount of data. Reports show that by 2013, 90 percent of the world's data had been collected in the previous two years. It's anticipated that in 2020, the digital world will hold as many digital bits as there are stars in our universe. Several prominent companies have learned to use this data and its benefits to improve their businesses. Companies such as Amazon, Facebook, Google, and Apple are among the biggest businesses leading the data collection industry, with Google being one of the largest.

Big data analysis can be used to improve any company in any industry by helping to direct business owners, executives, and other professionals in their important decision-making. Big data analysis works to collect and interpret a vast amount of information and then provides its findings to businesses. These discoveries allow business owners and executives to make better and more informed decisions. However, big data analysis can seem like a nebulous cloud for companies who are currently trying to incorporate ...

Read More on Datafloq
7 Key Questions You Should Ask When Building a Chatbot

7 Key Questions You Should Ask When Building a Chatbot

Conversational AI is positioned to be the next frontier of an engaging consumer experience. We have experienced Google Now and Siri in our day-to-day lives for some time now, and these are very promising products. They will keep getting better in the coming years; you can imagine a smartphone without any apps at all, just an assistant to help you do all your tasks. Really exciting!

Indian enterprises too have been warming up to conversational AI products, with chatbots being the most popular flavor. Prominent use cases include:

Customer Service: You can serve customer requests and provide access to information at a fraction of the cost and at a larger scale than with human capital. Such use cases can already be found at ecommerce and financial services companies.
Transactional Purpose: You can perform transactions such as booking a cab, ordering pizza, recharging a mobile plan, or buying a product. Several B2C startups provide such platforms.

The decision to use a chatbot as another consumer touchpoint needs to be evaluated very carefully rather than taken as a “me too” move. The thing with such products is that either they work well and provide a great consumer experience, or they become the subject of troll ...

Read More on Datafloq
The Big Data Startup That’s Disrupting the Industry

The Big Data Startup That’s Disrupting the Industry

Big data has been completely disrupting the way we do business as a society for the last several years. Like many disruptive forces, it has also gone mainstream recently: every company wants to leverage its power, and for good reason. Now that the industry is growing in popularity, however, some of its limitations are becoming apparent. Some startups are working to fix inherent problems within the industry and evolve the available technology to make big data even more valuable to companies in every industry. One of the biggest problems in big data today is analysis: it’s easy to collect and store data, but figuring out how to analyze and use it can be a major challenge. Taking on that challenge is Neo Technology’s mission: they’re ready to bring easy big data analysis to any company that wants it.

The Challenges of Big Data

Once the ability to collect, store and analyze large datasets efficiently became available, companies with large budgets began assembling data analytics teams and giant databases. These teams were made up of curious coders who were tasked with finding relevant answers to these companies’ biggest questions about customer behavior, efficiency, and many other queries, ...

Read More on Datafloq
Why Big Data Strategies Need DevOps

Why Big Data Strategies Need DevOps

Applying DevOps concepts can greatly benefit any big data initiative, yet many analytics teams still choose not to use these methodologies. Applications built on the components of the big data ecosystem need to be hardened to run in production, and DevOps can be an important part of that.

What is DevOps?

The idea behind DevOps is to tear down the barriers that stand between IT infrastructure administrators and software developers so that everyone is focused on a single goal. This requires a bit of cross-training on both sides so that the terminology used is understood by everyone. Once training is complete, clear lines of direction and communication can be established, with continuous improvement as the aim. Both sides can then bring software features and fixes to end users faster, as DevOps enables them to work in tandem to tune production infrastructure components and test environments to meet new software requirements.

Big data analysts know how tough and complex it is to extract meaningful, accurate answers from big data. In many enterprises, big data software developers lack coordination, which often makes things more challenging, and big data projects remain siloed across different ...

Read More on Datafloq
What is the Impact of Big Data on Mobile Marketing

What is the Impact of Big Data on Mobile Marketing

Every business, in one way or another, has access to data about its customers, competition, and market. Naturally, to stand out from other businesses and ensure a competitive advantage, they need insights from other areas as well.

A customer is also a social person with a set of habits, preferences, constraints, and sympathies. Business-specific data about a customer’s buying habits, economic situation, or demographic category simply fails to present the person in totality. These days, businesses find it extremely valuable to get hold of multi-faceted data about their customers for better marketing output and decision making. Big data analytics, which converges huge volumes of digital data from all niches and walks of life, gives them this exposure. Thanks to big data, businesses can now grab crucial insights that the so-called business analytics of earlier times were not capable of delivering.

Mobile data to feed Mobile Marketing

What is the biggest source of digital data today? We all know it is mobile. Yes, the web is now accessed principally through mobile devices. The vast majority of digital interactions, from web browsing to social media to using a wide range of apps for a variety ...

Read More on Datafloq
3 Companies That Crushed it With Big Data

3 Companies That Crushed it With Big Data

Many companies in the last few years have discovered the power of tiny changes. When you are dealing with a global population of buyers, and millions of people see your products every single day, changing a single small thing can have consequences in the millions of dollars.

The problem is that even though most companies realize this, they don't have access to the right information to make good decisions. This means that billions of dollars are being lost because of a lack of the right data.

Some companies, on the other hand, are doing this marvelously well. Here are a few doing just that.


T-Mobile was recently able to significantly reduce one of the biggest problems in its industry: customer turnover. Cell phone companies struggle to keep customers for more than a few years for a variety of reasons. The problem was discovering those reasons.

One of the biggest ways they did this was dropped-call analysis. They were able to contact customers who were starting to experience more dropped calls, based on their current location, and work with them to improve their plan quality before they dropped T-Mobile as a provider.
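The approach described above can be sketched in a few lines of code. The sample below is a hypothetical illustration, not T-Mobile's actual system: it flags customers whose recent dropped-call counts have risen sharply above their own baseline, so support can reach out before they churn. All names, data, and thresholds are made up.

```python
from collections import defaultdict

# Hypothetical dropped-call log: (customer_id, week, dropped_calls)
CALL_LOG = [
    ("alice", 1, 1), ("alice", 2, 2), ("alice", 3, 6),
    ("bob",   1, 0), ("bob",   2, 1), ("bob",   3, 1),
]

def at_risk_customers(log, window=2, threshold=2.0):
    """Flag customers whose recent dropped-call rate is rising sharply.

    A customer is 'at risk' when the mean of their last `window` weeks
    exceeds `threshold` times the mean of the weeks before that.
    """
    by_customer = defaultdict(list)
    for customer, week, drops in sorted(log, key=lambda r: r[1]):
        by_customer[customer].append(drops)

    flagged = []
    for customer, series in by_customer.items():
        if len(series) <= window:
            continue  # not enough history to compare
        baseline = sum(series[:-window]) / len(series[:-window])
        recent = sum(series[-window:]) / window
        if baseline == 0:
            continue  # no baseline to compare against
        if recent > threshold * baseline:
            flagged.append(customer)
    return flagged
```

On the sample log, only "alice" is flagged: her recent average (4 drops/week) is more than double her baseline (1 drop/week).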

Another interesting item they worked on was sentiment. There are multiple ways ...

Read More on Datafloq
A New Chapter In The Analytics Journey

A New Chapter In The Analytics Journey

Every individual and enterprise travels a unique journey in the pursuit of analytics. In my case, I could never have predicted how my journey would unfold when I first entered the workforce over 20 years ago. The rise of analytics as a strategic imperative and the explosion of career opportunities within the field far surpass what I expected coming out of school. I feel very lucky to have had such a terrific journey so far and will be interested to look back 20 years from now and see what happens from here.

A New Leg in My Journey

As I write this, I am finishing a major leg of my personal analytics journey. As many readers are likely aware, I will be leaving Teradata this month. I had a terrific 14-year run with Teradata where I made a lot of friends, worked with some amazing clients, and got to witness firsthand how the world’s largest organizations have dealt with the rise of big data and analytics. Teradata treated me well and I like to think that I, in turn, contributed a lot to the company. It wasn’t an easy decision to leave, but I came across a great opportunity and every good ...

Read More on Datafloq
6 Incredibly Costly Big Data Marketing Mistakes

6 Incredibly Costly Big Data Marketing Mistakes

Everybody keeps talking about how great big data is – particularly for marketing. And, of course, it is great. It offers some fantastic opportunities, as has been shown over and over again. This is particularly true as the Internet of Things comes online, which offers an ocean of new possibilities. The thing is, if you don’t handle your big data carefully, then it can do a lot of damage as well.

Here we’re going to explore some of the biggest big data marketing mistakes that can cost you a lot of money. That way, you’ll know what to look out for and be in a much better position to avoid them. Sound good? Let’s get started.

Low advertising conversions

The entire point of big data marketing is to convert lots of people. Of course, sometimes that doesn’t happen. Sometimes there are low conversion rates. That in and of itself isn’t a mistake. That’s just grounds for looking to improve your big data marketing strategy.

The mistake lies in the fact that some marketing teams don’t do anything about it. They continue on the way they’re going, creating more ads and hoping that somehow the problem will fix itself. Which, of course, ...

Read More on Datafloq
What you need to know about GDPR

What you need to know about GDPR

Gartner predicts that by the end of 2018, over 50% of companies affected by the GDPR will not be in full compliance with its requirements. Here we explain the impact of the GDPR regulation and how you can prepare…

What is the EU data protection regulation?

Issued by the European Parliament, the Council of the European Union, and the European Commission, the General Data Protection Regulation (GDPR) will replace the current Data Protection Directive 95/46/EC in spring 2018. Its main purpose is to protect the data privacy of EU citizens and harmonise the current data protection laws across EU countries.

Some of the key privacy and data protection requirements of the GDPR that will impact your business include:

Proven Consent: You need to obtain valid consent to hold and use any personal data, and you must be able to provide proof of this consent at any time.

Right to Erasure: You cannot use data for purposes other than the one for which it was originally collected. This means that if someone has agreed to receive your email newsletters, you need to obtain fresh consent before engaging in other forms of communication, such as event notifications. Individuals will also have the right to request the deletion of their details when the data is no longer used for its original purpose.
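The consent and erasure requirements above map naturally onto a small data model. The sketch below is a minimal illustration under stated assumptions, not legal or implementation advice: each consent record is tied to one named purpose, so consent for a newsletter does not carry over to event notifications, and erasure drops every record for a subject.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str                        # e.g. "email_newsletter"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

class ConsentLedger:
    def __init__(self):
        self._records: List[ConsentRecord] = []

    def grant(self, subject_id, purpose):
        # Keeping the timestamp is what lets you *prove* consent later.
        self._records.append(
            ConsentRecord(subject_id, purpose, datetime.now(timezone.utc)))

    def has_consent(self, subject_id, purpose):
        # Consent is only valid for the exact purpose it was granted for.
        return any(r.subject_id == subject_id and r.purpose == purpose
                   and r.withdrawn_at is None for r in self._records)

    def erase(self, subject_id):
        # Right to erasure: drop every record tied to this subject.
        self._records = [r for r in self._records if r.subject_id != subject_id]
```

For example, granting consent for "email_newsletter" does not authorize "event_notifications", and after `erase()` no check for that subject succeeds.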

Privacy Impact ...

Read More on Datafloq
How Big Data Is Reshaping the Fashion Industry

How Big Data Is Reshaping the Fashion Industry

Big data gets a lot of attention in discussions about the financial, healthcare, and marketing sectors, in large part because they were already numbers-driven fields. Anyone could see how more and better data would improve efficiency and profitability there.

However, big data is having a surprising effect on other niches, such as the fashion industry. Fashion doesn’t seem like a good “fit” for big data because it’s so much more subjective than accounting or sales.

But fashion designers, producers, and consumers are all starting to come under the influence of big data, which is reshaping this industry, too … for the better, one hopes.

New and Better Products

In the first place, big data is useful in an engineering sense: for gathering information from customer feedback and product performance to make new and better products. High-quality makeup brands like Urban Decay continually invest in new research and development to improve their popular products even further, and new lines like the Tyme Style flatiron have made it easier for women to style their hair.

Only through intensive research and analysis are product designers able to improve their existing products … as well as introduce new concepts that consumers didn’t know they needed.

Trend Prediction

Fashion trends are ...

Read More on Datafloq
4 Principles and Mistakes to Avoid When Using Big Data

4 Principles and Mistakes to Avoid When Using Big Data

Big Data is often hyped as the Next Big Thing, with fans claiming outlandish benefits for firms willing to use it. Yet, in many cases, companies enter into the world of Big Data without a coherent plan of what they want to achieve, and accordingly make some costly mistakes.

One of the problems is that the geeks who work with Big Data generally do not understand the cut-and-thrust of the business world, where pragmatism is often a huge virtue. Conversely, many smart, successful business people have no idea how data analytics works. This communication problem often means that clients and companies end up talking in woolly generalities, about “business transformation” or some equally vague term, and missing out on the advantages of Big Data.

Some of these problems can be avoided, however, by the implementation of a few basic steps. Let’s look at some.

1. Be Willing to Experiment

If you are thinking about adding a Big Data component to your business, it can be helpful to see this as a process of experimentation. This means two things. First, you should not pin all your hopes for the future on data analytics. Second, be open-minded about the possibilities of Big Data – what you ...

Read More on Datafloq
How to Become an Omni-Channel Data-Driven Retailer

How to Become an Omni-Channel Data-Driven Retailer

In today’s digital age, where customers are as likely to buy a product from an eCommerce website as from a brick-and-mortar store, delivering a seamless, value-adding shopping experience has become more important than ever before. The multiple shopping channels available to customers, and the competition posed by other retailers, have made it an absolute necessity for a retail business to integrate data inputs from different channels and use them to define an omni-channel shopping experience.

As a Big Data and BI influencer, I have worked with a large number of businesses within and beyond the retail industry. Using this knowledge and expertise, I have developed a 5-step approach that businesses operating in the retail sector can adopt to meet the expectations of their customers and become an omni-channel, data-driven retailer. 

Step 1: Collect the Right Type of Customer Data

In their journey to becoming a data-driven organization, businesses are required to collect the right type of data — data that can help them improve the customer experience and maximize the profit they gain from their online and offline channels. 

For example, a business that operates brick and mortar stores would probably like to try different layouts and floor plans to determine ...

Read More on Datafloq
9 Tips For Keeping Customers’ Data Safe

9 Tips For Keeping Customers’ Data Safe

The following are 9 ways to better protect sensitive data and encourage trust from customers. 

1.  Be sure that your privacy policy includes an accurate explanation of how customer data is used by your company. 

Trust plays a critical role in increasing consumers' willingness to share their personal data with businesses.  However, according to an HBR study, social media sites have a tendency to get one of the lowest trust ratings from consumers.

Misleading customers about the way their data is collected, stored, protected and used can create reputation and legal issues for your business.

Trade bodies and government agencies have tightened up how they deal with organizations whose privacy policies contain deceptive statements, so do not get caught doing this.  Double or even triple check your privacy policy to make sure all of its information is up to date and accurate.

What will happen if you don't?  Snapchat has been one of the victims of the new regulatory crackdown.  It was found that the company deceived users regarding personal data, including information being collected from iPhone contact lists, and used lax security measures that exposed users to security breaches.

Snapchat was not fined.  However, the company was forced ...

Read More on Datafloq
What Would the Big Data from Your Brain Tell You?

What Would the Big Data from Your Brain Tell You?

Would you really want to know?

Amongst his other amazing projects, Elon Musk wants to help hook our brains up to computers.

According to a recent report in the Wall Street Journal, he is backing an operation called Neuralink, which is experimenting with putting electrodes in our brains to enhance their function. The benefits for a whole variety of conditions are obvious, but it poses a question:

Is this venture into brain technology the start of the next space race?

We have conquered many parts of our exterior world, but with lightning-fast improvements in A.I. technology, will human brains now be in a race to keep up with their robot A.I. cousins? This “neural lace” technology has so many real-world applications, and it could eventually mean that an entirely different “class” of people emerges. But I would just like to ask whether it would be entirely healthy to have intimate access to every single thought that has ever crossed our minds.

If the technology advanced to this stage, would we want a Big Data style archive of our life?

Most of us have enough trouble remembering what we need to do from day to day, and few of us get enough sleep to ...

Read More on Datafloq
MSys Free LSSGB Training to Transfer Military Skill into Civilian Job Expertise

MSys Free LSSGB Training to Transfer Military Skill into Civilian Job Expertise

When considering a candidate’s profile for a job, many organizations have their own criteria, or they look for specific certifications as a prerequisite. Often these certifications are nothing but a civilian version of military skills. Lean Six Sigma Green Belt (LSSGB), Lean Six Sigma Black Belt (LSSBB), Project Management Professional (PMP), and other certifications are ways to convert your military experience into corporate job skills on a resume. These certifications also give you an understanding of how civilian processes differ from military approaches.

At MSys, we strongly believe that military veterans have the potential to handle any project efficiently by drawing on their military experience. We also understand that vets need to learn how to effectively carry their military techniques into the corporate world before joining a firm. To guide our veterans, MSys has started an initiative, “Give Back to the Society”, and is running special training programs for them. We successfully executed our first veteran training program on Lean Six Sigma Green Belt in March and are overwhelmed by the responses to our initiative from attendees:

Reviews on TrustPilot

Although the success of the first veteran training program on Lean Six Sigma Green Belt has added a feather to MSys’ hat, we want the maximum number of veterans to take advantage of our upcoming LSSGB (weekend batch) and PMP trainings, and we are looking to join hands with more veterans. The last date of registration for both trainings is April 20th, 2017, and the process is as follows:

Registration Process for LSSGB

Without Course Material and MSys LSSGB

  • Drop an email with a scanned copy of any document that verifies your veteran status at
  • Within 24 hours, you will receive a confirmation mail with further details

With Course Material and MSys LSSGB

  • Go to Lean Six Sigma Green Belt
  • Click on the Enroll now for Batch 3 of April Month
  • Use coupon code “WEEKEND299”
  • Pay $299 ($99 for course material + $199 for MSys LSSGB exam)
  • Get yourself registered

Registration Process for PMP

With Course Material

  • Go to PMP Registration
  • Click on the Enroll now for batch 2 of April month
  • Use coupon code “IAMVETERAN@99”
  • Pay $99 for course material
  • Get yourself registered

Without Course

  • Drop an email at with a scanned copy of a document that verifies your veteran status
  • You will receive confirmation within 24 hours

We would like to congratulate our veterans who have cleared the MSys LSSGB certification exam after completing the course and become certified LSSGB professionals.

Collaboration in the Era of Big Data; How Empowerment Will Drive Change

Collaboration in the Era of Big Data; How Empowerment Will Drive Change

As discussed in the 7 Big Data Trends for 2017, organisations need to apply big data analytics to make sense of their organisation and their environment. The impact big data analytics will have on your organisation depends on which type of business analytics is applied within the business. There are four types of analytics, ranging from descriptive and diagnostic analytics to the more advanced predictive and prescriptive analytics. These are different stages of understanding your business and the more advanced the analytics is, the more complex it will be, but also the more value it can create for your organisation.

Organisations that apply business analytics tools will be better able to understand their organisation as well as their environment. This improves their ability to make the right decisions at the right moment and thus seize the right opportunities to create competitive advantage. Descriptive and diagnostic analytics enable an organisation to use a variety of structured data sources to gain insight into what has happened; this is commonly referred to as business intelligence. Predictive analytics uses machine learning and artificial intelligence to discover patterns and understand relationships in various unstructured and structured data sources to develop predictions for the future. Finally, ...

Read More on Datafloq
How to Navigate a Digital Transformation

How to Navigate a Digital Transformation

Perhaps you have heard the phrase “digital transformation”. What does it really mean to “digitally transform” your business?

Transformation is a thorough or dramatic change, like a sports car changing to a robot.

Digital represents software and data leveraging computing, networking, and data storage technology.

Put together, digital transformation, in the context of business, is a thorough or dramatic change with software and data as the enabler.

Netflix digitally transformed the way we acquire and consume movies by using software and the internet to distribute movies to consumers. This digital transformation put Blockbuster out of business.

Amazon digitally transformed the way we shop, seriously crippling major retailers who have been in business for generations.

Three out of the top five S&P companies gained their position by digitally transforming entire markets. You know them as Alphabet (Google), Amazon, and Facebook. The other two in the top five just happen to be technology companies that transformed their industries with digital technology pre-internet (Apple and Microsoft).

With each digital success story, there is a tragedy: old companies hiding behind perceived barriers to entry, continuing to “do what they always do” until they cannot afford to do it any longer.

No Industry is Safe

Uber: Taxi industry
AirBnB: Hotels
AOL/email: Postal System
Facebook: Tabloid magazines/advertising
Amazon: Retail

Digital technology has ...

Read More on Datafloq
Five Tools For Developing Augmented Reality Apps

Five Tools For Developing Augmented Reality Apps

Augmented reality application development entails overlaying the real world with virtual content. App developers utilize AR libraries with open-source APIs to break down the stages of development. Here is a review of five AR tools for mobile app developers, along with their merits and demerits:

1. ARToolKit

ARToolKit is an augmented reality software library used in the development of AR apps. The major advantage of this tool is the open-source code that allows unrestricted access to the library. Other merits of ARToolKit include:

The library supports tracking of advanced object markers via mobile device cameras, reproducing their whereabouts on the screen of the device.
It supports 2D recognition and mapping of additional elements through OpenGL.
It provides a free development environment for operating systems including iOS, Linux, SGI, Windows, Mac OS X, and Android.

Although this AR library can be accessed freely, the development documentation is considerably limited. It comes with test apps, but some of them can be difficult to build. The examples are inadequate, and there are no plans to update the framework yet.

2. Vuforia

Vuforia is an all-inclusive SDK for augmented reality app development.

The main benefits of this tool include a test app with annotations showing the abilities of Vuforia and ...

Read More on Datafloq
How Machine Learning is Revolutionizing Digital Enterprises

How Machine Learning is Revolutionizing Digital Enterprises

According to IDC FutureScapes’ prediction, two-thirds of Global 2000 enterprise CEOs will center their corporate strategy on digital transformation. A major part of that strategy should include machine learning (ML) solutions. Implementing these solutions could change how these enterprises view customer value and their internal operating model today.

If you want to stay ahead of the game, you cannot afford to wait for that to happen. Your digital business needs to move towards automation now, while ML technology is developing rapidly. Machine learning algorithms learn from huge amounts of structured and unstructured data, e.g. text, images, video, voice, body language, and facial expressions. In doing so, they open a new dimension for machines, with limitless applications from healthcare systems to video games and self-driving cars.
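What "learning from data" means can be shown with a deliberately tiny example. The sketch below trains a single perceptron, one of the simplest ML models, to learn the logical AND function from labeled examples; production ML systems differ mainly in scale and model complexity, not in spirit.

```python
# A perceptron learns a set of weights from labeled examples by
# nudging them toward the correct answer after each mistake.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), label) pairs with labels 0 or 1."""
    w1 = w2 = bias = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            predicted = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
            error = label - predicted
            # Nudge the weights toward the correct answer.
            w1 += lr * error * x1
            w2 += lr * error * x2
            bias += lr * error
    return w1, w2, bias

def predict(weights, x1, x2):
    w1, w2, bias = weights
    return 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0

# Labeled training data for logical AND: output is 1 only for (1, 1).
AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights = train_perceptron(AND_DATA)
```

After training, `predict(weights, 1, 1)` returns 1 and the other three inputs return 0; the model was never told the rule, only shown examples.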

In short, ML will intelligently connect people, businesses, and things. It will enable completely new interaction scenarios between customers and companies and eventually allow a truly intelligent enterprise. To fully realize the applications that ML makes possible, we need to build a modern business environment. However, this will only be achieved if businesses understand the distinction between artificial intelligence (AI) and machine learning (ML).

Understanding the Distinction Between ML and AI

Machines that could ...

Read More on Datafloq
Data Vault Training – Europe & US

Data Vault Training – Europe & US

Lots of training classes on Data Vault 2.0 are on the horizon around the world thanks to a new venture that Dan Linstedt and Michael Olschimke have formed called ScaleFree. In addition to world class consulting, they are offering dozens of Data Vault 2.0 related classes, including Bootcamps, Certification, and Introduction to Data Vault Modeling. […]
6 Ways Big Data Can Help With Hiring

6 Ways Big Data Can Help With Hiring

When it comes to Big Data, big changes have occurred in the business world. Any serious business that uses Big Data has transformed its everyday operations by turning this modern tool to its advantage.

Having a centralized hub of information that you can apply in any way you can imagine is a great advantage over the competition. But how can you use Big Data to boost your HR recruitment and help your interviewers hire new talent?

Employee profile

While Big Data offers a huge assortment of options when it comes to HR recruitment, it’s important to note that the human factor is just as important. You can easily create an employee model based on your current staff and build the ideal profile you think will suit the company. The attributes can range from soft skills and professional skills to education, teamwork, and so on.

Keep in mind that these values should always be evaluated and sorted through, not taken at face value. Some of your candidates might be having bad days or anxiety issues – these are not valid inputs for your HR profiling and should be evaluated carefully.
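The employee-model idea above can be sketched as a weighted scoring function. The attribute names, weights, and ratings below are hypothetical; as noted, such scores should inform, not replace, human judgment.

```python
# Hypothetical "ideal employee" profile: attribute -> weight (sums to 1.0).
IDEAL_PROFILE = {
    "soft_skills": 0.3,
    "professional_skills": 0.4,
    "education": 0.2,
    "teamwork": 0.1,
}

def score_candidate(ratings, profile=IDEAL_PROFILE):
    """ratings: attribute -> rating on a 0-10 scale (e.g. from interviews).

    Returns a weighted score between 0 and 10; missing attributes count as 0.
    """
    return sum(weight * ratings.get(attr, 0) for attr, weight in profile.items())

candidates = {
    "A": {"soft_skills": 8, "professional_skills": 9, "education": 7, "teamwork": 6},
    "B": {"soft_skills": 9, "professional_skills": 5, "education": 8, "teamwork": 9},
}
# Rank candidates by how closely they match the profile, best first.
ranked = sorted(candidates, key=lambda c: score_candidate(candidates[c]), reverse=True)
```

With these made-up numbers, candidate A (8.0) outscores candidate B (7.2) because professional skills carry the most weight; changing the weights changes the ranking, which is exactly why they deserve human review.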

Recruitment model

Building upon the profile you created using Big ...

Read More on Datafloq
5 Ways Big Data Analytics Can Help Your eCommerce Business

5 Ways Big Data Analytics Can Help Your eCommerce Business

The words ‘Big Data’ are thrown around a lot these days, but there is no universally accepted definition. The best-known definition comes from analyst Doug Laney, who said in 2001 that Big Data is defined by ‘the 3Vs’: velocity, variety, and volume. This means that Big Data is a large amount of content that is varied and produced quickly. Here are five ways that Big Data analytics can help your online company.

1. Examine Google Trends

Big Data analytics can help your business by giving you an opportunity to examine trends on Google. Trend data shows you what terms and keywords have been searched, where they were searched, and who searched for them. “This information helps you see what the public is interested in, and allows you to adapt to that specific market. Trends can also help you decide the best direction for your website”, says Jane Reed, Operations Manager at Paper Fellows.
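Exported trend data can be mined with a few lines of code. The sketch below uses made-up weekly interest scores in the 0-100 shape that Google Trends exports via its CSV download, and ranks keywords by their recent growth:

```python
# Hypothetical weekly interest scores (0-100) per keyword.
TRENDS = {
    "wireless earbuds": [40, 42, 47, 55, 63, 71],
    "mp3 player":       [30, 28, 27, 25, 24, 22],
}

def growth(series, window=3):
    """Ratio of the mean of the last `window` points to the mean before.

    > 1 means interest is rising; < 1 means it is fading.
    """
    recent = sum(series[-window:]) / window
    earlier = sum(series[:-window]) / len(series[:-window])
    return recent / earlier

def rising_keywords(trends, window=3):
    """Keywords sorted by recent growth, strongest first."""
    return sorted(trends, key=lambda k: growth(trends[k], window), reverse=True)
```

With these sample numbers, "wireless earbuds" ranks first (interest up roughly 47% versus its earlier average) while "mp3 player" is fading, suggesting where an online shop should direct attention.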

2. Prevent Fraud

Through the analysis of large data sets, you can identify where different kinds of fraud are most prevalent. For instance, you can find out in which states or countries credit card fraud is most common, or where ...
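The region analysis described above amounts to a simple tally. The order records below are fabricated for illustration; a real system would run the same computation over millions of rows in a data warehouse.

```python
from collections import Counter

# Hypothetical order records: (order_id, region, flagged_as_fraud)
ORDERS = [
    (1, "US", False), (2, "US", True),  (3, "US", False),
    (4, "BR", True),  (5, "BR", True),  (6, "BR", False),
    (7, "DE", False), (8, "DE", False),
]

def fraud_rate_by_region(orders):
    """Fraction of flagged orders per region, highest first."""
    totals, frauds = Counter(), Counter()
    for _, region, is_fraud in orders:
        totals[region] += 1
        if is_fraud:
            frauds[region] += 1
    rates = {region: frauds[region] / totals[region] for region in totals}
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)
```

On the sample data, "BR" tops the list with two of three orders flagged, which might prompt extra verification steps for orders from that region.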

Read More on Datafloq
Grab the Attention of Employers by Adding Certification in the Most Appropriate Way

Grab the Attention of Employers by Adding Certification in the Most Appropriate Way

You must have learned a ton from an online or classroom course, and you have been incorporating these new skills into your work too. But if you are looking to switch jobs, the question that may strike you is how to include an additional certification on your resume. Including a certification inappropriately can make recruiters skeptical, and in some cases it can make your resume worse. To understand how employers interpret additional courses on professionals’ resumes, we consulted several hiring managers and recruiters. Based on their views, here are some tips on how to add details of additional courses to your application:

Place Certifications in Their Proper Position

Everyone, including the board, recruiters, and hiring managers, agreed that MOOCs (Massive Open Online Courses) and other courses can indicate that you qualify for the job. Only include courses that taught you something related to your job. Keep the course list short and confine it to a single section, such as “Professional Training” within your work history.

Keep It Relevant

Only add certifications that are related to your job profile or that you actually use in your work. Nobody cares that you have studied classical Greek art when you are working as a salesperson. Instead, add only those certifications that are related to the work you expect to do. The list of certifications may change according to the job post for which you are applying.

Display How You Incorporate Your Skills into Your Work

During the discussion, recruiters also agreed that giving evidence of how you put your skills to work helps strengthen the case that the certification meant something. It’s very important for candidates to show that they keep upgrading their skills. It is recommended that candidates include a special project or piece of work to give context around the results they have delivered by applying their new skills. A proper listing of courses, together with successfully completed projects that used them, validates that the candidate is putting his or her newly acquired skills to work.

The points above are the main things to keep in mind while preparing your resume after completing a course. During training, MSys instructors will also guide you on other factors that can help you build a great resume and grab the attention of employers and recruiters.

Roadmap for Project Management Professional (PMP)

Roadmap for Project Management Professional (PMP)

Project Management Professional (PMP) by the Project Management Institute (PMI) is one of the most popular certifications across the globe. PMI® defines project management as the application of skills, knowledge, and techniques to execute projects effectively. Thus, a PMP certification can make any professional stand out from the crowd. Because the certification process can be confusing, this article details the entire journey, from eligibility criteria to renewal of the credential.

Eligibility Criteria for PMP


Individuals with sufficient working experience in project management can take the PMP certification exam. Candidates must meet either of the following requirements:

  • Secondary degree with a minimum of 5 years of project management experience, including 7,500 hours leading and directing projects, plus 35 contact hours of project management education.
  • Four-year degree with a minimum of 3 years of project management experience, including 4,500 hours leading and directing projects, plus 35 contact hours of project management education.

Project management experience is not tied to a job title; it means you have hands-on experience handling projects. You should have been involved in all five stages of the project management lifecycle: initiating, planning, executing, monitoring and controlling, and closing.

Candidates must also have completed a minimum of 35 hours of project management training. For PMI, one actual hour of training or education is equivalent to one contact hour. You can take the training either online or in a classroom.

Form Filling

If you fulfill the prerequisites set for the PMP, the next step is to fill out the PMP exam application. Start by creating an account on the official PMI website, then complete the online application form listed under the myPMI section. You get 90 days to complete the application, which requires the following details:

  1. Personal details such as name, address, contact info, etc.
  2. University/college degree
  3. Details about contact hours
  4. Brief about experience of project management

After submitting the application form, you will receive the acceptance confirmation within a week.

Fees and Membership

After you submit the application form, PMI will send you an email asking you to pay the PMP exam fee. The current fee is $405 for members and $555 for non-members (this may change after the launch of the PMBOK® Guide 6th Edition). PMI membership brings many benefits for a $129 annual fee: as a member, you receive a digital copy of the PMBOK® Guide, peer-written and peer-reviewed articles, and access to on-demand webinars and project and business management books.

Audit Process of PMI®

Once you have completed the application and paid the exam fee, you will immediately learn whether your application has been selected for an audit. If it is, you must submit proof to validate the details mentioned in your application; you get a 3-month period to submit the following documents:

  1. Education certificates (photocopy of university/college degree)
  2. Printed contact hour certificate
  3. Signed letters from the contact persons listed in your experience section (a separate envelope for each letter)

Put all these documents in one envelope and send it to PMI® by registered mail; you will receive a confirmation email within a week.

Scheduling an Exam

Once all the PMI formalities are done, you can schedule your examination through the PMI website. Just log in to the PMI® certification system and select your mode of examination, i.e., computer-based test (CBT) or paper-based test (PBT). You can choose any date within 1 year of submitting the fee.


The PMP certification exam is a closed-book examination. It includes 200 multiple-choice questions that you must attempt within 4 hours. Of the 200, 25 are unscored pretest questions that have no effect on your score and are included to test the validity of future questions. To clear the exam, you should correctly answer at least 137 questions. If you do not clear the exam on the first attempt, you can reschedule it; each applicant gets 3 attempts within a period of one year. After clearing the exam, you will receive your PMP® certificate within 6-8 weeks.

Certification Renewal

The PMP credential is valid for 3 years; to renew it, you need to earn and report an additional 60 PDUs every 3 years. You can begin earning these 60 PDUs as soon as you are certified. Each hour dedicated to professional development activities earns you one PDU. PMI defines two ways to earn PDUs: education and profession.

1.     Educational PDUs (No limit for PDUs earned)

  • You can earn PDUs by attending events like webinars, seminars and workshops conducted by PMI® Registered Education Providers (REP) or PMI® itself.
  • Completing short-term or long-term courses offered by training institutes or universities can also earn you PDUs.
  • You can take a PMI® publication quiz, which involves reading articles and answering at least 70 percent of the questions correctly.
  • Informal learning activities, such as reading books and articles, listening to podcasts, or watching videos related to project management, program management, project scheduling, and project risk.

2.     Professional PDUs (Maximum 45 PDUs in 3-year cycle)

  • Creating new knowledge, such as writing a book or articles, serving as a speaker, or presenting a webinar on a topic of your expertise, can earn you PDUs.
  • Offering uncompensated project scheduling or project management services to an organization other than your employer.

You need to record your earned PDUs in PMI's PDU Activity Reporting form.

After earning 60 PDUs at the end of the credential cycle, you can apply for renewal by paying the renewal fee, which is $60 for members and $150 for non-members.
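As a quick sanity check of the renewal math above (my own sketch; the 60-PDU total and the 45-PDU cap on professional PDUs per 3-year cycle are the figures stated in this article), the tally could be expressed as:

```python
# Sketch of the PMP renewal accounting described above. Education PDUs are
# uncapped; professional PDUs count only up to 45 per 3-year cycle.

REQUIRED_PDUS = 60
PROFESSIONAL_CAP = 45  # max professional PDUs counted per 3-year cycle

def pdus_toward_renewal(educational, professional):
    """Count PDUs toward renewal: education is uncapped, profession is capped."""
    return educational + min(professional, PROFESSIONAL_CAP)

def ready_to_renew(educational, professional):
    """True once the cycle total reaches the required 60 PDUs."""
    return pdus_toward_renewal(educational, professional) >= REQUIRED_PDUS

print(ready_to_renew(20, 50))  # True: 20 + min(50, 45) = 65 >= 60
print(ready_to_renew(5, 50))   # False: 5 + 45 = 50 < 60
```

Note how the cap matters: 50 professional PDUs alone are not enough, because only 45 of them count.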

Follow the process above to get PMP certified and to maintain your credential. MSys Training, a leading training institute in North America, offers a 4-day boot camp and online classroom training with a 100% money-back guarantee.

Schema Definitions from an ODBC Text Source

Schema Definitions from an ODBC Text Source

Recently, I've worked on a disjointed migration project where things just didn't work the way you expect.  For example, you usually like to access data directly from its source (straight from a relational database), but in this situation that was simply not allowed and we were given a load of text files (hundreds) to load.  Perhaps […]
How to Balance Customer Experience With Customer Identity Management?

How to Balance Customer Experience With Customer Identity Management?

The world is going digital, and it comes as no surprise that people love getting online. But there is a condition: they want to get there fast. Yes, without any hassles, without any barriers. They want more mobility but fewer hassles, and they want more security but less effort in authentication. It is confusing, and it leaves businesses with one question: how can they balance customer experience with customer identity management?

Customer identity management is the latest business imperative. If customers cannot find a quick way into your website, they do not hesitate to go to your competitors. Growing competition and shrinking patience have made every business worried. This article covers why customer identity management is essential to customer experience and how you can balance the two.

Present day scenario

Just as no human relationship stays the same throughout its journey, your relationship with your customers also keeps changing. That loyal customer you have had for the past five years cannot be guaranteed to stay loyal in the future. Evolving technologies and an increasing number of options have made your customers more fickle. The only thing that can come to your rescue is the experience you ...

Read More on Datafloq
How Your Small Business Can Take Advantage of the IoT Wave

How Your Small Business Can Take Advantage of the IoT Wave

From self-driving cars, tiny chips, automatic response sensors, smart machines to smart watches, the era of connected devices is already here. The IoT refers to the idea of connecting billions of devices to the internet. This concept has mostly affected how people are conducting business today.

Presently, technology plays a primary role in business development. Both small and large businesses are embracing state-of-the-art technology to keep pace with stiff competition and current clients' expectations.

According to tech giants like Google, Xero, and Cisco, the business world has only scratched the surface of what technologies such as IoT and the cloud have to offer.

Though big businesses have already embraced these technologies, small businesses and entrepreneurs are yet to venture into this new level of technology that makes the world a global village.

It is predicted that by 2020, small businesses will be dominated by smart machines and connected devices. Small businesses and other emerging areas of business will experience a rapid growth of connected things, leading to enhanced safety and security, improved marketing endeavors, and developing new business models.

Small businesses and entrepreneurs can benefit a great deal from the IoT applications to step up against ...

Read More on Datafloq
The 5 Industries With The Highest Cyber Security Risks

The 5 Industries With The Highest Cyber Security Risks

New technology means new avenues for growth in every field, but it also means new threats. In our digital age, cyber security risks are undoing some of the greatest technological advancements in key industries.

In 2015 alone, the Identity Theft Resource Center found that over 780 data breaches occurred, exposing over 169 million records of personal information to the wrong hands.

Breaches occurred across a variety of industries, leaving many to question who is most susceptible and why. Now, two years and thousands of detected hacks later, it is clear which industries are the most vulnerable to cyber security risks.

1. Health Care

The health care industry has experienced an incredible boom in the use and integration of new technology. Unfortunately, it has also been victimized by hackers because of it. Last year over 100 million patient records were stolen by cyber criminals, leaving millions of patients at high risk of having their identities stolen by criminals anywhere in the world.

Cyber criminals target the health care industry for the deeply personal patient information within it. Hospitals, clinics and other medical centers hold patients’ social security numbers, personal addresses, bank information and health information. 

2. Education

Colleges and universities are consistently targeted as well. Universities contain ...

Read More on Datafloq
Is Your Data Center Protected From All Environmental Elements?

Is Your Data Center Protected From All Environmental Elements?

You’ve probably thought about how to keep your data safe from hackers, but what about the naturally occurring environmental elements that could cause massive equipment failures?

The extent to which you must worry about those things varies depending on where the data center is located and what the weather’s usually like there, but, no matter what, environmental elements can wreak havoc before you know it.

Let’s look at some of the most common environmental threats for a data center. We’ll also examine things you can do to minimize the adverse impact of those elements.

Monitoring Temperature

The temperature level of your data center is a crucial environmental element to watch. Luckily, it’s quite easy to ensure you’re within the recommended ranges for data center temperature best practices.

Guidance about temperature is constantly evolving, but current advice suggests keeping the temperature between 64.4 and 80.6 degrees Fahrenheit. There are also classes of "allowable" temperatures intended to give data center personnel some flexibility.
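As a rough illustration of that advice (my own sketch, not from the article; the sensor names and readings are hypothetical), a check of top, middle, and bottom rack sensors against the recommended band might look like:

```python
# Sketch: flag rack sensor readings outside the recommended band.
# Sensor names and readings here are invented for illustration.

RECOMMENDED_F = (64.4, 80.6)  # recommended data center temperature range, deg F

def out_of_range(readings, lo=RECOMMENDED_F[0], hi=RECOMMENDED_F[1]):
    """Return the sensors whose readings fall outside the recommended band."""
    return {sensor: temp for sensor, temp in readings.items()
            if not lo <= temp <= hi}

readings = {
    "rack1-top": 78.2,     # hot air rises, so top sensors usually read highest
    "rack1-middle": 72.5,
    "rack1-bottom": 68.0,
    "rack2-top": 83.1,     # above the recommended maximum -> should alert
}

alerts = out_of_range(readings)
print(alerts)  # {'rack2-top': 83.1}
```

In practice the alert path would page data center personnel rather than print, but the threshold check is the same.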

Besides staying abreast of recommended temperature ranges, monitor your data center carefully with advanced temperature sensors placed at the top, middle, and bottom of server racks. Handheld thermometers are useful for finding hot spots in a room. An alarm system can alert data center personnel if the temperature ...

Read More on Datafloq
Factoring Massive Numbers: A Machine Learning Approach

Factoring Massive Numbers: A Machine Learning Approach

We are interested here in factoring numbers that are a product of two very large primes. Such numbers are used by encryption algorithms such as RSA, and the prime factors represent the keys (public and private) of the encryption code. Here you will also learn how data science techniques are applied to big data, including visualization, to derive insights. This article is good reading for the data scientist in training, who might not necessarily have easy access to interesting data: here the dataset is the set of all real numbers -- not just the integers -- and it is readily available to anyone. Much of the analysis performed here is statistical in nature, and thus, of particular interest to data scientists. 

Factoring numbers that are a product of two large primes allows you to test the strength (or weakness) of these encryption keys. It is believed that if the prime numbers in question are a few hundred binary digits long, factoring is nearly impossible: it would require years of computing power on distributed systems, to factor just one of these numbers.
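To make that difficulty concrete, here is a minimal trial-division sketch (my own illustration, not a serious factoring method): it cracks small semiprimes instantly, but the work grows roughly with the square root of N, which is astronomically large for semiprimes a few hundred binary digits long.

```python
def factor_semiprime(n):
    """Trial division: return the two factors of a semiprime n, smallest first.

    The loop tries odd divisors up to sqrt(n), so the work grows roughly with
    sqrt(n) -- hopeless for RSA-sized numbers with hundreds of binary digits.
    """
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1  # n itself is prime

# Tiny example: 10403 = 101 * 103
print(factor_semiprime(10403))  # (101, 103)
```

Doubling the bit length of n squares the search space, which is why key sizes of a few hundred bits put naive approaches like this far out of reach.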

While the vast majority of big numbers have some small factors and are thus easier to break, the integers that we are dealing ...

Read More on Datafloq
Building a Data Lake in the Cloud 

Building a Data Lake in the Cloud 

So you want to build a Data Lake in the Cloud? Here is how.
New To Analyzing Big Data? 3 Tips For Efficient Data Analysis

New To Analyzing Big Data? 3 Tips For Efficient Data Analysis

As a business owner, you want to save as much time, money, and energy as humanly possible by streamlining your entire business from top to bottom. The more resources you can conserve, the more you can invest in your marketing efforts.

If you're smart, you're already streamlining your smaller needs: outsourcing minor tasks, hiring short-term contract staff, and using professionally programmed, pre-made documents to make sure your paperwork is up to par. These aspects of running a business are probably "old hat" to you.

While you’re streamlining the everyday aspects of running your business, hopefully, you’re not forgetting to streamline your data analytics as well. If you’re collecting any kind of data, it’s likely coming from multiple sources. And since you can’t afford to dilute the efficiency of your business, here are three tips to tighten it up.

1. Make sure your devices and programs are collecting data

This may sound obvious to some, but if you’re new to data analysis, you may not be aware that the collection of data from the devices and programs you use within your network is entirely determined by settings within your control. Some devices automatically log data no matter what; however, many devices have ...

Read More on Datafloq
10 Things that make Embedded Java the best for the Internet of Things

10 Things that make Embedded Java the best for the Internet of Things

The acquisition of Sun Microsystems by Oracle in 2010 has pushed Java more towards enterprises, rather than individual application development. The Java 8 platform includes a specific library suite that targets embedded system programming, thus making up for the lost ground due to lack of specific functionalities in the earlier versions of Java.

Here’s what you need to know about Embedded Java, if you are considering a Java Development Company for your next embedded enterprise application:

1. IoT has marked the start of a new era for Embedded Java.

Until recently, embedded systems were the domain of low-level languages like C. However, with the IoT (Internet of Things), machine-to-machine (M2M) communication is a real possibility that needs full-fledged application programming with an elaborate user interface. 

Earlier, developers used to program in the language specific to the platform. With Java for embedded systems, developers can leverage the ‘Write once, run anywhere’ paradigm and port the same code to embedded devices with a few tweaks.

2. Embedded Java reduces the time and cost required for application development.

The IDEs for embedded Java development (NetBeans from Oracle, etc.) are free of cost, and so are the debugging tools. This makes embedded application programming possible with optimum cost reduction ...

Read More on Datafloq
So You Want to Be a Data Scientist? – It’s Complicated

So You Want to Be a Data Scientist? – It’s Complicated

Rock stars. That’s what some people are calling data scientists. And right now, data science is the sexiest tech career field, at least according to LinkedIn in a recent report of top skills for 2017. Since the term was first coined in 2001, data science has come to mean a large group of activities which combine statistical analysis and data mining.

Anyone considering a career in data science first needs to understand the myriad things such a career involves, the type of education and training required, and exactly what the job market holds. And because the field is growing so fast, both students and mid-career professionals have an opportunity to move into data science careers, if they get the right education and training.

The Field is Broad and Varied

There is no single definition of data science, as it varies with industry, specific business, and what the purpose of the data scientist’s role is. And different roles require different skill sets, therefore the educational and training path is not uniform. Data scientists can come from many fields – math, statistics, computer science, and even engineering. But the role the scientist is to play is now generally broken down into two ...

Read More on Datafloq
Enterprise Performance Management: Not That Popular But Still Bloody Relevant

Enterprise Performance Management: Not That Popular But Still Bloody Relevant

While performing my usual Googling during preparation for one of my latest reports on enterprise performance management (EPM), I noticed a huge difference in popularity between EPM and, for example, big data (Figure 1).

From a market trend perspective, it is fair to acknowledge that the EPM software market has taken a hit from the hype surrounding the emergence of technology trends in the data management space, such as business analytics and, particularly, big data.

Figure 1: Searches for big data, compared with those for EPM (Source: Google Trends)

In the last four years, at least, interest in big data has grown exponentially, making it a huge emerging market in the software industry. The same has happened with other data management related solutions such as analytics.

While this is not that surprising, my initial reaction came with a bit of discomfort. Such a huge difference makes one wonder how many companies have simply jumped onto the big data wagon rather than making a measured, thoughtful decision about how best to deploy their big data initiative within the larger data management infrastructure already in place, especially with regard to having the system co-exist and collaborate effectively with EPM and existing analytics solutions.

Now, don’t get me wrong; I’m not against the deployment of big data solutions and all the potential benefits. On the contrary, I think these solutions are changing the data management landscape for good. But I can’t deny that, over the past couple of years, a number of companies, once past the hype and euphoria, have raised valid concerns about the efficiency of their existing big data initiatives and have questioned its value within the overall data management machinery already in place, especially alongside EPM and analytics solutions, which are vital for measuring performance and providing the right tools for strategy and planning.

The Analytics/EPM/Big Data Conundrum
A study published by Iron Mountain and PwC titled How Organizations Can Unlock Value and Insight from the Information they Hold, for which researchers interviewed 1,800 senior business executives in Europe and North America, concluded that:

"Businesses across all sectors are falling short of realizing the information advantage."

Even more interesting is that, in the same report, when evaluating what they call an Information Value Index, the authors realized that:

"The enterprise sector, scoring 52.6, performs only slightly better than the mid-market (48.8)."

For some, including me, this statement is surprising. One might have imagined that large companies, which commonly have large data management infrastructures, would logically have already mastered, or at least reached an acceptable level of maturity with, their general data management operations. But despite the availability of a greater number of tools and solutions to deal with data, important issues remain as to finding, on one hand, the right way to make existing and new sources of data play a better role within the intrinsic mechanics of the business, and, on the other, how these solutions can play nicely with existing data management solutions such as EPM and business intelligence (BI).

Despite a number of big data success stories—and examples do exist, including Bristol-Myers Squibb, Xerox, and The Weather Company—some information workers, especially those in key areas of the business like finance and other related areas, are:

  • somehow not understanding the potential of big data initiatives within their areas of interest and how to use them to their advantage in the operational, tactical, and strategic execution and planning of their organization, rather than using them for tangential decisions or for relevant yet siloed management tasks.
  • oftentimes swamped with day-to-day data requests and the pressure to deliver based on the amount of data already at their disposal. This means they have a hard time deciphering exactly how to integrate these projects effectively with their own data management arsenals.

In addition, it seems that for a number of information workers on the financial business planning and execution side, key processes and operations remain isolated from others that are directly related to their areas of concern.

The Job Still Needs to Be Done
On the flip side, despite the extensive growth of and hype for big data and advanced analytics solutions, for certain business professionals, especially those in areas such as finance and operations, interest in the EPM software market has not waned.

In every organization, key people from these important areas of the business understand that improving operations and performance is an essential organizational goal. Companies still need to reduce the cost of their performance management cycles as well as make them increasingly agile to be able to promptly respond to the organization’s needs. Frequently, this implies relying on traditional practices and software capabilities.

Activities such as financial reporting, performance monitoring, and strategy planning still assume a big role in any organization concerned with improving its performance and operational efficiency (Figure 2).

Figure 2: Population’s perception of EPM functional area relevance (%)
(Source: 2016 Enterprise Performance Management Market Landscape Report)

So, as new technologies make their way into the enterprise world, a core fact remains: organizations still have basic business problems to solve, including budget and sales planning, and financial consolidation and reporting.

Not only do many organizations find the basic aspects of EPM relevant to their practices, but an increasing number of them are also becoming more conscious of the importance of performing specific tasks with the software. This signals that organizations need to continuously improve their operations and business performance and analyze transactional information, while also evolving and expanding the analytic power of the organization beyond this limit.

How Can EPM Fit Within the New Data Management Technology Framework?
When confronted with the need for better integration, some companies will find they need to deploy new data technology solutions, while others will need to make existing EPM practices work along with new technologies to increase analytics accuracy and boost business performance.

In both cases, a number of organizations have taken a holistic approach, to balance business needs by taking a series of steps to enable the integration of data management solutions. Some of these steps include:

  • taking a realistic business approach towards technology integration. Understanding the business model and its processes is the starting point. But while technical feasibility is vital, it is equally important to take into account a practical business approach to understand how a company generates value through the use of data. This usually means taking an inside-out approach to understanding, by taking control of data from internal sources and that which might come from structured information channels and/or tangible assets (production, sales, purchase orders, etc.). Only after this is done should the potential external data points be identified. In many cases these will come in the form of data from intangible assets (branding, customer experiences) that can directly benefit the specific process, both new or already in place.

  • identifying how data provided by these new technologies can be exploited. Once you understand the business model and how specific big data points can benefit the existing performance measuring process, it is possible to analyze and understand how these new incoming data sources can be incorporated or integrated into the existing data analysis cycle. This means understanding how it will be collected (period, frequency, level of granularity, etc.) and how it will be prepared, curated, and integrated into the existing process to increase its readiness for the specific business model.
  • recognizing how to amplify the value of data. By recognizing and making one or two of these sources effectively relate and improve the existing analytics portfolio, organizations can build a solid data management foundation. Once organizations can identify where these new sources of information can provide extended insights into common business processes, the value of the data can be amplified to help explain customer behavior and needs; to see how branding affects sales increases or decreases; or even to find out which sales regions need improved manufacturing processes.

All this may be easier said than done, and the effort devoted to achieving this is considerable, but if you are thinking in terms of the overall business strategy, it makes sense to take a business-to-technical approach that can have a direct impact on the efficiency, efficacy, and success of the adoption of EPM/big data projects while also improving chances of adoption, understanding, and commitment to these projects.

Companies need to understand how the value of data can be amplified by integrating key big data points with the “traditional� data management cycle so it effectively collaborates with the performance management process, from business financial monitoring to planning and strategy.

While enterprise performance management initiatives are alive and kicking, new big data technologies can be put to work alongside them to expand the EPM software’s capabilities and reach.

The full potential of big data for enterprise performance management will only be realized when enterprises are able to fully leverage all available internal and external data sources towards the same business performance management goal to better understand their knowledge-based capital.

(Originally published on TEC's Blog)
Healthcare IT Security and The Internet of Things

Healthcare IT Security and The Internet of Things

The major distributed denial-of-service (DDoS) attack that targeted Dyn shook a lot of people, especially IT security experts. Dyn is well known for offering domain name services (DNS), and the attack crippled large parts of the internet. In simple terms, a DNS provider plays the same role as an air traffic controller: it routes internet traffic. The DDoS attack made major services like Spotify, Twitter, and Netflix load slower than normal or become unavailable entirely.

Investigators found that the attackers exploited weaknesses in devices such as wireless-enabled baby monitors and smart home appliances that were connected to these major services. The attackers then used the Mirai malware to unleash botnets. In other words, they exploited vulnerable Internet of Things (IoT) devices to make the attack succeed. It is estimated that over 500,000 devices have been infected by the Mirai malware. The Mirai attack also affected Allscripts and Athena Health, which deal with EHR systems.

Why healthcare is more vulnerable than other sectors

Security experts in the healthcare IT industry have warned that IT security in healthcare is no longer just about medical data. Recent data breaches in healthcare were attributed to security compromises at their HIPAA business associates. Cyber ...

Read More on Datafloq
Why Big Data Has To Make Sense To People

Why Big Data Has To Make Sense To People

You might have three PhDs and a brain the size of a football, but if you can’t explain things in simple language to others, it will be a struggle to maximise your potential impact on the world. Of course, there are many notable exceptions of academically brilliant individuals who have transformed the world, but people such as Einstein, Newton, Hawking, et al., often developed their theories in the confines of their labs and studies.

Data Science professionals, clever as they may be, have no choice but to work extremely closely with their non-scientifically-minded colleagues. I’d just like to say that intelligence is important in business, but it is far from the only success factor. A practical focus on getting things done and an ability to get on with other people are two things that every great leader has, and what you might call EQ is sometimes more important than IQ.

So, for the data gurus to make a difference, it is essential that they learn to translate the data into something that everyone would appreciate. This might involve dazzling visualisation and colourful graphics, and it might even involve the odd bit of humour to help the points to sink in.

If the data ...

Read More on Datafloq
The Role Of Machine Learning In Marketing Research And Automation

The Role Of Machine Learning In Marketing Research And Automation

Marketing campaigns generate tons of data that can be used to understand customer preferences and behavior. This includes structured data (information like name and location that prospects provide voluntarily) as well as lots of semi-structured and unstructured data. Social media texts, navigation behavior on your website, photos and emails are all examples of unstructured data that can reveal a lot about the customer when studied individually but may not convey anything at a holistic level.

Let us take an example. A study of user behavior on a popular apparel store showed that 20% of visitors from search engines hit the ‘Buy’ button within two minutes of landing while 30% of visitors bounced (quit the page) for the same keyword. Navigational study of individual users showed that some users quit after checking out the various color variants of the product while a few other users quit as soon as the page finished loading.

At an individual level, it is easy to identify a fix for each of these visitors who do not convert. Perhaps some of these users did not find the product in the color they were looking for while others decided that the product was not for them as soon as ...

Read More on Datafloq
5 Ways Big Data In The Marketing World Will Affect Consumers

5 Ways Big Data In The Marketing World Will Affect Consumers

Do you ever feel like certain companies are following you around the internet? It's easy to think they know your every thought, but in this particular case there is a reasonable explanation for why you see them everywhere. That said, the amount of your information companies hold would still blow your mind.

Big data is going to change the way companies operate. In most cases it will help consumers too. It's going to have a much bigger impact on your life in the future, so let's look at a few things from the marketing world that will affect you going forward.

1 - More Big Data Equals Popular Products

You gave up your freedom a long time ago when you started signing up to every social media network under the sun. Then the smartphone came along and now everyone is glued to the internet more often than not.

We're going to see the same trend going forwards as augmented reality, virtual reality, and lots of wearable devices become mainstream. The amount of big data companies will have at their disposal will make products even more popular.

2 - Ads Will Become A Lot More Targeted

Let's use real-time ...

Read More on Datafloq
A Successful Effort to Transform Veterans Professional Life by MSys

A Successful Effort to Transform Veterans Professional Life by MSys

“When You are Acting Selflessly, You are at Your Bravest” — we have learned this from our Veterans, and to honor them MSys has conducted a free training program.

At MSys, we believe Veterans have all the potential to efficiently handle leadership responsibilities in any organization, yet they need direction to channel their abilities. To help Vets mold their skills to the corporate world, MSys has put its best foot forward and scheduled free training sessions for Lean Six Sigma Green Belt (LSSGB) and Project Management Professional (PMP) on March 28th-31st, 2017. But, as many Veterans wanted to be part of both training sessions, we considered their request and rescheduled the PMP online class to April 25th – 28th, 2017.

To make this training fruitful for as many Veterans as possible, we are arranging another weekend batch for Lean Six Sigma Green Belt training in April, dated April 29th – 30th and May 6th – 7th, 2017. We request our Vets to enroll for the PMP or LSSGB weekend class on or before April 20th, 2017. To know more about the registration process for Lean Six Sigma Green Belt, you can read “Another Effort by MSys to Help Veterans – Organizing Free LSSGB Weekend Classes”; and for the PMP registration process, consider reading “Why Veterans should Take Free LSSGB and PMP Training with MSys Training?”

Two days of Veteran Lean Six Sigma Green Belt training have been completed, and we have received heartwarming responses from our Veteran participants. Today, MSys feels proud that we took this initiative, which gave us a chance to serve our heroes and give back a little in return for their service.

Thank You Veterans, Your Feedback Matters the Most to MSys

We would like to give special thanks to our instructor Jason Saturn, a master Lean Six Sigma instructor at MSys, for making this program a success. With his wide experience and world-class teaching skills, he won the hearts of our participants.

Three Industries That Can Benefit From Augmented Reality

Three Industries That Can Benefit From Augmented Reality

Virtual and augmented reality have remained buzzwords for technology fans, and they are on everyone's radar for the upcoming year as well. Mobile devices are the wave of the future, and to connect with users, more and more businesses and industries are turning to VR and AR, hoping to harness their potential. App development has begun taking a turn toward better, more creative ways to use and integrate VR and AR.

Augmented reality app development has already begun proving it's worth the cost and effort. It has begun to profoundly impact real estate, education and gaming.

Real Estate

Real estate is typically a difficult market, with clients taking a long time to buy or even rent. Although commissions are significant, real estate professionals don't like to wait and are hoping to use every tool available to move the sales cycle along. Augmented reality on a realtor's site can provide information prior to a home walk-through that was never available before. App developers can create presentations that bring the client into the home and offer views that even human eyes are not accustomed to seeing, such as the layout from above, the location within a development or the height of ...

Read More on Datafloq
The Future of the Cloud

The Future of the Cloud

It has been less than a year since Google first provided a glimpse into its vision for cloud computing. If there is any company worth paying attention to in this area, it's Google, especially when the chairman of Google's parent company, Alphabet, claims that the next five years will hold even more innovation in cloud computing than the previous five. Looking back at the last five years, that is quite the statement.

The biggest part of the keynote address focused on machine learning, which gives us a good idea of where cloud computing in general will go. Google has since proven just how serious it is about machine learning, with the news being filled with buzz about DeepMind.

Looking at the keynote address, we can surmise that machine learning will be a regular part of our lives within the next five years. According to Google, its machine learning software could work as something like an API in the future. Developers will be able to plug the software in and focus on creating.

The end goal is that machines will be able to parse through the millions of terabytes of data that humankind collects every ...

Read More on Datafloq
The Future is Bright for Banking

The Future is Bright for Banking

2018 is likely to be a game-changing year for the banking and finance sector. As the General Data Protection Regulation (GDPR) and Revised Payment Service Directive (PSD2) are implemented across the European Union, the exclusive control of banks and other financial institutions over the financial data of their customers is about to end. These new regulations will open the door to almost any company interested in claiming a share, particularly the tech giants, such as Amazon, Facebook, and Google.

While this may look like a challenge to many, we, as journey science experts, view this as an opportunity for banks to partner with large tech, eCommerce, and FinTech enterprises and leverage their expertise at managing customer experiences to revolutionize customer journeys and offer an improved, more holistic experience to their end-users. 

In this article, we have reviewed how the implementation of PSD2 and GDPR will transform the banking and finance industry and result in a new wave of partnerships between banks and ecosystem enterprises, creating a win-win situation for everyone: customers, financial institutions, major tech players such as Google, Amazon and Facebook, eCommerce companies, and small FinTech start-ups.

How Upcoming Regulations Will Revolutionise the Banking Industry

The new regulations will introduce banks to two ...

Read More on Datafloq
Another Effort by MSys to Help Veterans – Organizing Free LSSGB Weekend Classes

Another Effort by MSys to Help Veterans – Organizing Free LSSGB Weekend Classes

With the intention of helping our Veterans mark their presence in the corporate world, MSys Training started the “Give Back to the Society” campaign on February 28th, 2017. We are overwhelmed with the response from Veterans as well as civilians to our initiative. During this one-month journey, we got a chance to meet highly talented Veterans from across the states who have the potential to be part of any organization, and we are glad that with this initiative we became part of their lives. Initially, we had planned to conduct both the LSSGB and PMP free online classes on March 28th – 31st, 2017. However, most of our Vets wanted to attend both LSSGB and PMP, hence MSys decided to reschedule the PMP online class to April 25th – 28th, 2017 and extended the registration date to April 20th, 2017.

We received more than 90 registration requests, but due to class size limitations a few Veterans missed this chance. With over 30 confirmed attendees in our March 28th LSSGB free online class (2 batches), MSys saw the need for a weekend training to allow as many Veterans as possible to take advantage of our free online class, and decided to conduct another free online training session on Lean Six Sigma Green Belt (LSSGB) over 2 consecutive weekends, i.e. April 29th – 30th and May 6th – 7th, 2017.

Our LSSGB training will offer numerous benefits and here are some highlighting features:

  • 4 days online classroom training by lead Lean Six Sigma instructors at MSys
  • Discounted MSys LSSGB certification exam fee and IASSC accredited LSSGB course material
  • 40% off on online self-learning on future training programs and 30% off on online and in-person classroom training
  • Career guidance from MSys consultants
  • $100 bonus on every successful referral enrollment (this won’t be applicable on Veterans training programs’ registration)


The 4-day online classroom LSSGB training will be held on April 29th – 30th and May 6th – 7th, 2017, and the last date to register for the session is April 20th, 2017. To enroll for MSys Training’s weekend training program on LSSGB (Lean Six Sigma Green Belt), follow the registration process below.


Registration Process for LSSGB

Without Course Material and MSys LSSGB

  • Drop an email with a scanned copy of any document that verifies your Veteran status at
  • Within 24 hours, you will receive a confirmation email with other details

With Course Material and MSys LSSGB

  • Go to Lean Six Sigma Green Belt
  • Click on the Enroll now for Batch 3 of April Month
  • Use coupon code “WEEKEND299”
  • Pay $299 ($99 for course material + $199 for MSys LSSGB exam)
  • Complete your registration

Here are some common questions that may arise while registering for the training session; if you don’t find your answers in this list, feel free to contact us:

When do I get my training details?

One of our training consultants will send you the login credentials 5 days before the training start date. You will receive an enrollment confirmation email within 24 hours of your registration request.

Do I get the MSys courseware?

Yes, Veterans can obtain the MSys courseware by paying just $99 per license.

How much do I pay to attend this training?

You don’t need to pay anything to attend the 4-day online class. You pay only if you decide to purchase the course material, the MSys LSSGB exam voucher or the online class recorded videos. If you opt for both the course material and the MSys LSSGB exam, you will pay $299.

How to take the LSSGB Certification Exam?

You can appear for the MSys LSSGB exam and get certified for just $199.

What is the difference between MSys Veterans free classes and paid classes?

There is no difference between our paid and free training programs. MSys Training is hosting this free class to help Vets understand the current market situation and gain the extra skills needed to excel in the corporate world.

Is there any difference between MSys and IASSC certification?

Both certifications are globally recognized. The IASSC exam comes with 1 attempt and is a closed-book exam, whereas the MSys LSSGB exam is an open-book exam with 3 attempts. You can go for either the MSys LSSGB exam at $199 or the IASSC LSSGB exam at $399.

Do I get a course completion certificate?

Yes, on completion of the 4-day online classroom training, MSys will send a certificate worth 35 PDUs to your registered email address.

At MSys Training, we believe “We Rise by Lifting Others!” and the success of our “Give Back to the Society” proved that again.

Guiding Principles for ITIL Practitioners

Guiding Principles for ITIL Practitioners

The ITIL Practitioner is the latest qualification added to the ITIL service lifecycle. The ITIL Practitioner certification builds upon the knowledge gained at the ITIL Foundation level and provides you with the ability to recognize and deliver service improvements using a practical approach to adopting and adapting the ITIL framework. Below are the nine guiding principles for ITIL Practitioners:

  • Design for Experience: Services and processes should be designed from the outset to create a satisfying experience for the end user or customer.
  • Focus on Value: In IT Service Management (ITSM), everything should be designed to deliver value to customers, and the customer decides what is valuable to them.
  • Start Where You Are: You don’t need to start from scratch. First consider what can be achieved with what you already have.
  • Work Holistically: No component or service stands alone. Services are complex systems that have to be designed, deployed, improved and managed with an awareness of the whole.
  • Progress Iteratively: Resist the temptation to do everything in one go. Divide the work into small, manageable tasks that each deliver something useful, and keep going. Together, these small efforts can create great change.
  • Observe Directly: Base decisions on accurate information; check the source of the activity at regular intervals and observe it directly.
  • Be Transparent: Be honest and clear about the facts. Know what is happening and where rumors originate. Speak from a position of knowledge.
  • Keep It Simple: Do only what is required to deliver the desired outcomes, and eliminate anything superfluous.
  • Collaborate: Work creatively with your colleagues toward a shared goal. Combined effort creates commitment and benefits from different perspectives.

At MSys Training, we offer ITIL Practitioner training that provides an understanding of these tools and techniques. The 2-day training will help you implement ITIL methodologies more effectively in your organization.


Copyright © 2017 BBBT - All Rights Reserved