Bio-Artificial Intelligence: A lesson in Why AI Algorithms must “Know” the Field

Artificial Intelligence (AI) is all the rage. It seems that every time I surf the web, I see a new AI product, ranging from consumer electronics to cars, social media apps, and high-end analytical software. For example, check out these other articles on AI in Internet Marketing, the Legal Profession, and several other application areas. There are even AI products advertised in airports. Forget what the pundits say about AI being a big part of our future. Just look around to see that the future is here now.

AI Approaches Must Adapt to the Problem at Hand

With the increasing prevalence of AI, we see successes and failures in the use of the technology. I have seen many questions on Quora along the lines of “What Artificial Intelligence algorithm should I use for mining the web?” Clearly, this is an unanswerable question; without knowing what the goal is, the correct approach cannot be defined. This is an extreme example, but to me, it is indicative of the same problems we have seen historically with other new technologies. There is a tendency to rush in with the new hammer in the toolkit and bang on everything like it is a nail.

The ...


Read More on Datafloq
7 Reasons Why Serverless Computing will Create a Revolution in Cloud Technology

Of late, you may have heard this term: serverless computing. It has been around in the developer world for more than a year, and it has already caused a paradigm shift in the field of cloud technology.

In this article, we will examine the reasons that caused serverless computing to be such a major trend.

Serverless Computing is a computing code execution model where the developers are relieved of several time-consuming activities so that they can focus on other important tasks.

This trend is also known as Function as a Service (FaaS), where the cloud vendor is responsible for starting and stopping a function’s container platform, checking infrastructure security, reducing maintenance effort, improving scalability, and so on, all at low operational cost.

The aim is to develop microservice-oriented solutions that help decompose complex applications into small, easily manageable, and exchangeable modules.

This brings us to the question - are there really ‘serverless’ computing services?

Of course, it is only logical that there should be servers in the background, but developers need not bother about the operation or provisioning of these servers; the entire server management is done by the cloud provider. Thus, the developer can devote more of his time ...
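To make the “developers write only the function” idea concrete, here is a minimal sketch, assuming a hypothetical Python FaaS platform with an AWS-Lambda-style `handler(event, context)` entry point (exact signatures vary by vendor). Note that no server startup, shutdown, or scaling code appears anywhere:

```python
import json

def handler(event, context=None):
    """Hypothetical FaaS entry point: the platform invokes this on demand.
    The developer writes only this function; provisioning, starting, and
    stopping its container is the provider's responsibility."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, we can simulate what the platform does on each request:
if __name__ == "__main__":
    response = handler({"name": "Datafloq"})
    print(response["body"])
```

Invoking `handler` with a sample event, as above, mimics a single platform-triggered execution; in production the vendor would route HTTP requests or queue messages into `event`.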


Read More on Datafloq
5 Ways Big Data In The Marketing World Will Affect Consumers

Do you ever feel like certain companies are following you around the internet? It's easy to think they know your every thought, but in this particular case there is a reasonable explanation for why you see them everywhere. That doesn't mean companies don't have so much of your info it would blow your mind.

Big data is going to change the way companies operate. In most cases it will help consumers too, at least some of the time. It's going to have a much bigger impact on your life in the future, so let's look at a few things from the marketing world that will affect you going forward.

1 - More Big Data Equals Popular Products

You gave up your freedom a long time ago when you started signing up for every social media network under the sun. Then the smartphone came along and now everyone is glued to the internet more often than not.

We're going to see the exact same trend going forwards as augmented reality, virtual reality, and lots of wearable devices become mainstream. The amount of big data companies will have at their disposal will make products even more popular.

2 - Ads Will Become A Lot More Targeted

Let's use real time bidding as ...


Read More on Datafloq
How the IoT is Affecting Property Management

At its heart, the Internet of Things (IoT) is all about connection: connection between smart devices, WiFi, beacons, etc. to collect data on things such as lighting, temperature, or the presence of people. With the IoT, the idea is to make our everyday devices “smarter”: get them talking to each other to share information and then analyze this information for us to gain insights on when to run a dishwasher to save energy or how to set up a store display for foot traffic. For the property management industry, the aim of the IoT is to connect property owners, managers, and even tenants to each other, and then to connect these individuals to the data insights generated from everyday routine activities. Although much of what the IoT can offer is still in its infancy, it is already having a large impact on property management.

Within the industry, security is a main concern for both the tenant and building owners, and the IoT offers some truly amazing solutions. Wireless sensors allow property owners and managers to offer tenants virtual guard services. Tenants or managers can program the sensors through an app to customize their security. For example, if the tenant works from ...


Read More on Datafloq
Microsoft and VMware Will Square-off in the Hybrid Cloud

Many enterprises will be on the road to the hybrid cloud and they will find Microsoft and VMware waiting for them with highly differentiated approaches. Who will win?
What to Look For When Picking a Big Data-as-a-Service Provider

The Big Data-as-a-Service (BDaaS) market is one to definitely keep an eye on in the coming years. It’s been quickly growing as more and more companies understand the value that big data solutions can bring to their individual organizations. This is especially true among mid-size businesses which likely didn’t have the budget or resources to invest in things like constructing their own data centers or employing the best data scientists. With the rise of BDaaS, all of the potential found within big data is now at their fingertips in much the same way other cloud services have led to more opportunities and capabilities. While a company may want to work with a BDaaS provider, choosing one can prove to be difficult. The BDaaS market is changing, at times rapidly so, making the decision a challenging one. Though the market is evolving, there are a number of things you can look for when you’re considering a BDaaS provider.

The first thing you’ll want to do is evaluate your company’s needs. Since this will be different for each organization, it’s a good beginning step for determining a BDaaS provider. When you know the ways in which big data will help your business, you ...


Read More on Datafloq
Big Data & Analytics – Getting Started on Your Journey

There has been a lot of talk and content published around the application of Big Data within enterprises. 

In the past 6-7 years that we’ve been involved in this sector, we’ve seen a significant amount of money being spent by corporates to get value from various data technologies. But the missing piece we often see, and the question we ask our clients, is: where and how is the wider business involved?

For example, many CIOs have approved the spend to build Hadoop clusters or similar but with little thought into how the wider business will go on to use this technology. Furthermore, the business doesn’t really understand the technology and what it could do for them.

The problem doesn’t get any easier now that we’re seeing a lot of hype around Machine Learning and Artificial Intelligence (AI). The biggest problem I’m seeing today is that there seems to be very little connection between Big Data & Analytics and Machine Learning/AI.

The hype over the last few years has also moved to Digital Transformation and what that means to an industry, and more recently toward the application of blockchain technology (but that’s for another time).

In this post I will focus on how to get started on your Big Data & ...


Read More on Datafloq
How Is Big Data Changing Supply Chain Management

In the last couple of years, big data technology has been implemented in many different industries. Big data innovations have lifted business processes to another level by introducing more predictability and by automating the decision-making process. Implementation of big data in supply chain management has shown similar results. The use of big data technology has enabled companies to react faster to changes within their supply chain and to drastically improve customer experience. In this article, we’re going to analyze some of the new supply trends that are directly connected with big data technology implementation.

Real-time tracking

Today, every modern retail company offers a real-time tracking feature. This tracking data can also be used as a basis for innovating the supply chain, especially when it’s combined with data coming from external sources, like eCommerce analytics, social networks, news, weather, etc. Big data allows companies to plan their future inventory and supply routes in real time, without relying solely on historical data. This functionality is very important for the accuracy of the eCommerce logistics process.

The biggest couriers have already replaced the outdated ERP tracking technology with the Internet of Things sensors, which can follow the whole shipment process ...


Read More on Datafloq
Your Number One Priority: Your Data

Behind every successful company, whether it’s a multinational corporation or the smallest start-up, is clean and qualified data. In the following article, we share some of the best ways you can grow and protect your most valuable asset… Your data.

If you want to increase the success of your business and maximise the sales and/or marketing activities you undertake, you need to ensure the data you have is of good quality.

Here we share how you can grow, improve and protect your data…

Invest time and money in growing your data

Data is your business’ most valuable asset; it’s therefore no surprise that we recommend you invest in it. In order for your business to grow, you need to invest time and money in sales and marketing activities. This essentially enables you to grow your data, by measuring which activities work and which don’t, along with the ability to record customer and prospect data.

Recording all business data should be a consistent and continual process, as it will enable you to remain agile and on-trend. It will enable you to react to the most accurate, up to date information available and stay ahead of the competition.

Ways you can grow your data include:


Buying in data from ...


Read More on Datafloq
Brace Yourselves for the Sweeping Changes Brought by Virtual Reality

Technologies change so often that, unless you pay attention to everything relevant happening around you, it’s easy to miss out on the latest advancements. The good news is that virtual reality isn’t likely to go unnoticed, even by those who don’t consider themselves tech-savvy individuals. In fact, VR tech is here to stay and it will leave an unmistakable mark on the industry. Some industries are likely to adopt the new technology before others, as it is expected to set important milestones for online entertainment.

The idea is not exactly new, as Hollywood blockbusters and sci-fi movies pitched the concept to a dazzled audience a long time ago. Until recently, this was regarded as fringe technology and most expected it to be prohibitively expensive. The early virtual reality headsets were very limited and not exactly affordable for regular people. The good news is that as technology gets better, the costs decrease and even the best VR headsets are likely to become mainstream in the near future.

A New Way to Experience the World Around You

There are so many ways in which virtual reality can be used that it’s difficult to make any long-term forecasts. However, there are some areas such as online entertainment ...


Read More on Datafloq
Data-Driven Marketing 2017: 80% of Marketers Agree that Data is More Critical Than Ever

Data is an integral component to any successful marketing or advertising strategy.  Once a novel concept, data-driven marketing is now essential for establishing meaningful connections and personalized experiences with today’s consumers.

In research by the Winterberry Group in partnership with the Global Direct Marketing Association (GDMA), almost 80% of respondents said that they see customer data as critical to their marketing and advertising efforts. More than half (53.4 percent) increased their year-on-year spend on data-driven marketing and advertising in 2016.

Additional findings from the report include:


Almost 90% maintain databases to host information on customers and prospects to improve how they manage and use data
88% are actively segmenting information to better target and engage customers for activities and offers
64% of respondents are purchasing third-party data to support targeted marketing campaigns


Marketers are also spending more on data strategies. In another report conducted in 2016 by Winterberry in conjunction with the DMA, over 40% of respondents in the survey said they expected their use of data-driven marketing to “increase somewhat” or “increase significantly” in Q4 2016. More than 40% of executives also said that they expected their Q4 2016 revenues generated by data-driven marketing efforts to “increase significantly” or “increase somewhat.”



How Reliable is Your Data?

While ...


Read More on Datafloq
How Big data & AI are Improving Internet Marketing

Big data is touching our lives in numerous ways. By the year 2020, nearly 1.7 trillion megabytes of new information will be generated every second. The big data revolution is playing a tremendous role in the future of online marketing as well.

Here are some ways big data is a boon for the Internet marketing profession.

More Granular Approach to Optimization

Whether companies are trying to generate leads, boost Facebook or Instagram likes or focus on direct selling, they need to collect plenty of data to optimize their campaigns. Creating successful online marketing campaigns requires extensive testing and optimization. The more variables you track, the easier it is to improve your campaigns.

Before major advances in server technology and cloud computing, brands had to restrict the data they monitored, which limited their ability to optimize their campaigns. Today, big data allows them to retain much more extensive records of their marketing campaigns, which makes optimizing campaigns easier than ever.
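As a concrete example of what this testing looks like, here is a minimal sketch (standard-library Python only, with invented sample numbers) of a two-proportion z-test comparing the conversion rates of two ad variants, one of the simplest ways to decide whether a tracked variable actually improved a campaign:

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how confidently does variant B's
    conversion rate differ from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 200 conversions out of 10,000 impressions vs 260 out of 10,000:
z = ab_z_score(200, 10_000, 260, 10_000)
print(round(z, 2))  # → 2.83; |z| > 1.96 is significant at the 5% level
```

Here the uplift from 2.0% to 2.6% clears the conventional significance bar, so a marketer could promote variant B with some statistical backing rather than gut feeling.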

Provide Better Targeting

Every experienced marketer agrees that targeting is the most important part of online marketing. Targeting your ads to the right users has a greater impact on your ROI than your landing pages, ads, day parting and virtually every other variable you can picture.

Big data ...


Read More on Datafloq
Virtual Reality In Educational Use: A Few Notable Examples

Back in the old days, as in a few years ago, children were taught using chalkboards and textbooks. Some lucky classes had video and social or physical interaction in their lesson plans. Now it seems that virtual reality application development has found at least one permanent home, in the classroom.

Google has a new platform called Google Expeditions. Its purpose is to bring virtual reality to the classroom using the Google Cardboard headset. Some teams will also teach the teacher, going to classrooms to show instructors how to implement the program. Kits are available for purchase already on the Google site. The virtual reality app developers plan to have students and teachers share a visit to locations not easily explained through textbooks. It's also a perfect venue for the teacher to lecture while she shares the same view, selecting specific topics in the view or directing kids' attention to areas they'd miss on a real field trip.

Although field trips are fun, they tend to be a cumbersome process for educators. Kids miss school time and parents need to give special permission. With app development deployed towards education, bringing the sites, countries and world historic events to the classroom has become easier. ...


Read More on Datafloq
How Will Big Data Change the Future of Employee Retention

Attrition hurts businesses. When it comes to employee turnover, around 70% of organizations report that attrition “has a negative financial impact due to the cost of recruiting, hiring, and training a replacement employee and the overtime work of current employees that’s required until the organization can fill the vacant position.”

You spend a great deal of time and money training employees. You rely on them, you entrust them with all your internal secrets, you grow attached to them in both a practical and emotional sense. Then all of a sudden, they’re gone. You know some level of attrition is inevitable, but isn’t there a way to go beyond all the normal efforts to ensure they’ll stick around?

The commonsense advice is to improve your HR practices and pay attention to employee engagement. In 2016, Gallup found that only 13% of workers worldwide are engaged, while the US figure stands at 32%. Disengaged employees are more likely to leave, while engaged employees are likely to help a business succeed. According to a study from Harvard Business Review, 71% of respondents view employee engagement as “very important to achieving overall organizational success”.

Employee retention through the strategic use of big data is already happening. Bank of America ...


Read More on Datafloq
Big Data Strategy (Part III): is Your Company Data-Driven?

If you missed the first two parts, I previously proposed some tips for analyzing corporate data as well as a data maturity map for understanding an organization's stage of data development. Now, in this final article, I want to conclude this mini-series with some final food for thought and considerations on big data capabilities in a company context.

I. Where is home for big data capabilities?

First of all, I want to spend a few more words on the organizational home (Pearson and Wegener, 2013) for data analytics. I claimed that the Centre of Excellence is the cutting-edge structure for incorporating and supervising the data functions within a company. Its main task is to coordinate cross-unit activities, which include:


Maintaining and upgrading the technological infrastructures;
Deciding what data have to be gathered and from which department;
Helping with talent recruitment;
Planning the insights generation phase and stating the privacy, compliance, and ethics policies.


However, other forms may exist, and it is essential to know them since sometimes they might fit better into preexisting business models.



Data analytics organizational models

The figure shows different combinations of data analytics independence and business models. It ranges from business units (BUs) that are completely independent of one another, to independent BUs that join efforts in some ...


Read More on Datafloq
Data Vault 2.0 Online Training – Early Adopter

Finally, you have an option to get authentic Data Vault 2.0 training online from Dan Linstedt!
A Look at the Role of Big Data in SEO

The concept of big data can seem a bit elusive to most people. There are thousands of possible applications and just when you think you have a grasp on it, something changes. But from a business and marketing perspective, it’s important that you recognize how big data is related to SEO.

Google: The World’s Largest Data Corporation

It’s impossible to separate big data from SEO, if for no other reason than Google is the original and largest big data organization in the world. As SEO expert Jiyan Wei explains, “They have become the institution they are today by analyzing enormous sets of data, making automated inferences, and providing intelligence back to consumers. By studying Google’s methodology and applying their findings, search professionals have been intimately involved with big data for quite some time.”

When you consider that Google is the leader in the SEO industry, it becomes clear that big data and SEO will forever be intertwined. The future of search will be determined by big data, as will the manner in which businesses optimize their websites and digital content.

Data is what makes the massive (and growing) amount of content on the internet decipherable. It’s how users are able to filter out the ...


Read More on Datafloq
Why Every Business Is A Data Business

Hello friend: I am here to give you a wake-up call.

You know that business you have been spending precious time and effort building for 5, 10, 20, 30, 50 or maybe even 100 years? You know, the one that builds widgets or sells professional services or has killer technology or does something that's going to change the world or maybe something not even remotely close to any of that?

What would you say if I told you that this business produces a hidden source of revenue, some may even call it a secret product, that can help you make both your top line and bottom line numbers every quarter?

No, this is not #fakenews. It is true. Simply by existing, every company produces something called DATA, and it is time for you and your company to wake up and start monetizing it. As someone who has spent their entire career in and around data and analytics businesses, I assure you: there's gold in them thar hills.

Taking A Lesson From Fast Food Restaurants

If you recall, not too long ago, gas prices were averaging close to $4.00/gallon nationally. As the economy slowed, predictably we all began to pine for alternate, and more cost-efficient, ...


Read More on Datafloq
How to Secure Your Cloud-based Applications From Cyber Security Attacks

We think our customers’ data and business information is safe if we store everything in the cloud. This is true to some extent, and this shift has also helped businesses from small to big save a lot of money by reducing IT expenses, giving access to scalable tools, and lightening the on-site load on their IT infrastructure. According to the CSA (Cloud Security Alliance), nearly 70% of businesses across the globe now operate in the cloud.

With benefits like great flexibility, automatic software updates, and more, that 70% figure isn’t surprising. However, even the cloud has its downsides for storing apps that hold large amounts of data. That is the main reason the remaining 30% of business owners are still in a dilemma about whether to move to the cloud: they cannot make the move without stringent security practices in place.

Today, we will discuss the top 5 security tips for cloud-based applications you should be aware of.

1. Develop a Threat Model

Develop a threat model for every cloud app that you are taking into consideration for cloud deployment. Find out the potential threats, both in terms of technical and business issues, no matter whether these threats can be ...


Read More on Datafloq
How To Better Interpret Big Data

Big data has become a very popular buzzword in the business world. Tons of articles talk about how big data can influence your business decisions and lead to new insights, but how do you interpret your big data? Statistics are notorious for being manipulatable, and raw numbers can be hard to make heads or tails of.

Before data can be utilized to improve a business, healthcare, or science, it needs to be interpreted. As with any scientific experiment, after the data is collected, that information needs to be processed and analyzed. This allows mistakes to be identified, outliers to be examined and discredited, and decisions to be made about how useful the information is.

Where Is The Data Coming From?

Before you even start to interpret the data, you need to determine the sources of the data. Big data can be about any number of subjects, like consumer behaviors, the efficiency of machines, weather patterns, and crime statistics.

Interpreting that data starts with a strong understanding of its subjects and the tactics used to gather it. If there is a bunch of data about consumer online shopping patterns, but all of the information is from one website, then the data can’t be trusted as all ...
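A minimal sketch of the “examine the outliers” step mentioned above, in plain Python with invented data: flag points that sit far from the mean as candidates for closer inspection. The threshold is arbitrary, and in practice more robust, median-based methods are often preferred because extreme values inflate the mean and standard deviation themselves.

```python
import statistics

def flag_outliers(values, threshold=2.0):
    """Return the points lying more than `threshold` standard deviations
    from the mean -- candidates to examine (and possibly discard)."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [x for x in values if abs(x - mean) > threshold * stdev]

daily_orders = [102, 98, 110, 95, 105, 99, 101, 930]  # one suspicious spike
print(flag_outliers(daily_orders))  # → [930]
```

The flagged value might be a data-entry mistake, a tracking glitch, or a genuine flash sale; the point of interpretation is that a human (or a documented rule) decides which, rather than the number silently skewing every downstream average.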


Read More on Datafloq
The GDPR: 5 Questions Data-Driven Companies Should Ask to Manage Risks and Reputation

Data is rapidly becoming the lifeblood of the global economy. In the world of Big Data and artificial intelligence, data represents a new type of economic asset that can offer companies a decisive competitive advantage, as well as damage the reputation and bottom line of those that fail to ensure the security and confidentiality of critical corporate and customer data.

Despite the severe repercussions of compromised data security, until recently, the fines for breach of data protection regulations were limited and enforcement actions infrequent. However, the introduction of a potentially revolutionary European General Data Protection Regulation (GDPR) is likely to transform the way data-driven companies handle customer data by exposing them to the risk of hefty fines and severe penalties in the event of non-compliance or a data breach.



In this article, I have tried to summarise the implications of GDPR implementation for data-driven companies, as well as the measures businesses can take to ensure the security and privacy of clients’ data and avoid the penalties associated with non-compliance.

How Does GDPR Impact Data-Driven Organisations? 

The General Data Protection Regulation (GDPR) stands out from all existing regulations because of its breadth of client data protection. From conditions on cross-border data transfer to the need to implement, ...


Read More on Datafloq
8 Ways Your Team Can Take Advantage of Big Data

Managing any type of business in today’s digital age can prove challenging. Whether you are in IT, law, sales or any sort of work that requires careful data management, you have probably heard of Big Data. It’s more than a concept: it allows you and your team to operate much more freely and progressively than before. Let’s take a look at some of the ways you can take advantage of Big Data and utilize it to the best of your abilities.

Develop new strategies

By allowing you quick and easy access to a diverse library of data, Big Data allows you and your team to develop strategies you haven’t considered before. This is because the data you will be using is centralized and any type of information is available at the press of a button. Use this advantage to brainstorm ideas and run simulations in order to maximize your productivity and drive your business forward.

Communicate with clients

Sometimes it seems impossible to actually communicate with your client. This is where Big Data helps: it allows you to cross-reference, calculate and determine what your clients want and what their next steps in your relationship will be. Relying on data and analysis to determine what your clients want ...


Read More on Datafloq
Mobile Payments: Are They More Secure Than Credit Cards?

With all the buzz that’s been surrounding our online privacy, cybercrime has become an important issue in everyday life. After all, we trust our most sensitive information to many different websites, believing that they have the safeguards in place to keep our private information private. Unfortunately, our data isn’t safe at all—a fact that’s made clear by the large number of data breaches and cases of identity theft that continue to grow each year. Though encryption and cybersecurity are getting more sophisticated every year, so are the hackers intent on stealing our data. Because of all this, many people worry that the trend toward mobile payments isn’t secure—that it will open them up to having their financial and personal information stolen. Others maintain that mobile payments are more secure than credit cards. So who’s right in this scenario?

Chip Cards vs. Magnetic Strip

Recently, most new credit cards have started to feature “chip” technology, in addition to the traditional magnetic strip. This chip technology is more secure than swiping a card, because it doesn’t store the card number and name of the person in the system—it generates a code unique to that transaction, rather than providing the vendor with the actual ...
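As a rough illustration of the per-transaction-code principle (this is not the actual EMV protocol, and every name below is invented): the card holds a secret key and a transaction counter, and derives a fresh code for each purchase, so a code intercepted from one transaction is useless for replaying another.

```python
import hmac
import hashlib

def transaction_code(card_secret: bytes, counter: int, amount_cents: int) -> str:
    """Derive a one-time code from a per-card secret and a transaction
    counter; a new counter value yields a completely different code."""
    msg = f"{counter}:{amount_cents}".encode()
    return hmac.new(card_secret, msg, hashlib.sha256).hexdigest()[:8]

secret = b"per-card-secret-key"  # never leaves the chip in a real card
code1 = transaction_code(secret, counter=41, amount_cents=1999)
code2 = transaction_code(secret, counter=42, amount_cents=1999)
print(code1 != code2)  # → True: same purchase amount, different codes
```

The issuer, knowing the same secret and counter, can recompute the code to verify the transaction, while a vendor (or a thief skimming the terminal) only ever sees single-use values.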


Read More on Datafloq
Case Studies

Career development programs can help professionals sharpen their skills and advance their careers. Nowadays, a wide array of strategies and activities is available to support career development, opening up real-world career opportunities. Below are some case studies that show how career development programs help:

Career Advancer 

Introduction and Background

Carlos has been working as a quality engineer at a reputable organization for the past 10 years. During this period, he has experienced both the positive and negative aspects of his career. During our discussion, we learned that Carlos had an interest in advertising, but he pursued an engineering degree to build a career in the IT industry. After graduating, he landed a good job with a handsome salary. After about 10 years in the industry, however, he began receiving only nominal salary increases, and he now feels he is underpaid.

Proposed Solution

After discussing his roles and responsibilities over his working experience, we found that he had worked as a team leader for several years. His only concern was salary growth. Considering his 10 years’ experience in QA, we recommended that he pursue the Lean Six Sigma Black Belt (LSSBB) from MSys Training. The LSSBB helped him grow his professional skills, which helped him land a new job with a good salary hike.

Trainers at MSys Training also helped him sharpen his communication and personal skills. He now works at a Fortune 500 firm with a good salary and lives a stress-free life.

 

Major Outcome

MSys Training helped him refine his skills and polish his networking abilities. The LSSBB helped him explore the job market, and Carlos has now joined one of the Fortune 500 companies as a QA team lead.

 

Career Transitioner

Introduction and Background

Jeff is a working professional at a reputable firm in California. He has a total of 8 years of experience in IT and is very passionate about his work. Even after working for so long in the same industry, he saw no growth, and his concern about this led to mental stress. He also has experience managing his team in his manager’s absence.

Proposed Solution

During the counseling session, we learned that he is very good at his work and has great team-management skills, which he demonstrates in his manager’s absence. After a thorough discussion of his career profile, we realized he had not completed any certification or additional courses that could help him achieve his career goals. Considering his knowledge, skills and experience, we recommended the project management course offered by MSys, a leading training provider in the USA.

He joined their CAPM certification course and cleared the exam. Certified by a top training provider, he came across better job opportunities and received a good hike in his salary. He now works at a global firm as a team lead and is happy with his career growth.

Major Outcome

The MSys CAPM certification helped Jeff demonstrate his skills and credibility, which boosted his self-confidence and gave him the energy to sharpen his skills and pursue better opportunities.


Quality Management Strategy – Prince2 Approach | Simplilearn

Quality Management Strategy – Prince2 Approach | Simplilearn

Quality Management Strategy - Prince2 Approach | Simplilearn Quality: Quality is generally defined as the totality of features and inherent or assigned characteristics of a product, person, process, service, and/or system that bear on its ability to meet expectations or satisfy stated needs, requirements or specifications. In PRINCE2®, a product can also be a person, process, service and/or s...Read More.
Re-skilling will always keep a person up to date: In Conversation with BKS Prasad | Simplilearn

Re-skilling will always keep a person up to date: In Conversation with BKS Prasad | Simplilearn

Re-skilling will always keep a person up to date: In Conversation with BKS Prasad | Simplilearn Born and brought up in Karnataka’s cultural capital, Mysore, I completed my education in engineering, began my career in the hardware industry, moved on to cellular operations and subsequently into the IT industry, in both product and service organizations. I worked for startups and MNCs and have about 20 years of experience....Read More.
Recognition and Reward System in the Project Management | Simplilearn

Recognition and Reward System in the Project Management | Simplilearn

Recognition and Reward System in the Project Management | Simplilearn I am sure every project manager nowadays uses a lot of tools and techniques to execute the work within the timeframe or within the estimated cost with the desired quality. The main thing is to complete the work on time. And that is possible only through true, motivated, successful manpower: the resources who are working on the projects. Do ...Read More.
Reducing Project Estimation Overruns | Simplilearn

Reducing Project Estimation Overruns | Simplilearn

Reducing Project Estimation Overruns | Simplilearn Problem Statement: In project management, accurate estimates are the basis of sound project planning. Many processes have been developed to aid engineers in making accurate estimates, such as analogy-based estimation, compartmentalization (i.e., breakdown of tasks), cost estimates, the Delphi method, documenting estimation results, educated assumptions, estim...Read More.
Release Plan in Agile | Agile Certification Training | Simplilearn

Release Plan in Agile | Agile Certification Training | Simplilearn

Release Plan in Agile | Agile Certification Training | Simplilearn Agile practitioners use an agile approach to release management in projects. As the main idea behind agile project management is to achieve customer satisfaction at the end of the project, a release plan in agile is not complete until it is demonstrated to the customer. Agile teams follow the release plan at regular intervals at the end of the i...Read More.
Renewing the PMP certification : CCR Program | Simplilearn

Renewing the PMP certification : CCR Program | Simplilearn

Renewing the PMP certification : CCR Program | Simplilearn The minute you pass your PMP® exam, you will have to work towards the CCR (Continuing Certification Requirements) program to maintain your PMP® credential and an active certification status. Each PMP® certification cycle lasts for three years during which you will need to earn 60 PDUs (Professional Development Units) to ...Read More.
Risk Assessment in Project Management | Simplilearn

Risk Assessment in Project Management | Simplilearn

Risk Assessment in Project Management | Simplilearn The last thing that any project will want to face is risks. Projects are designed to take advantage of resources and opportunities and with these, come uncertainty, challenges and risk. Hence risk management becomes a very important key to all project success. The project risk management plan addresses the process behind risk management and the ris...Read More.
How is Data Analytics Changing the Web Design Landscape?

How is Data Analytics Changing the Web Design Landscape?

In the past, people usually relied on intuition or gut feeling when it came to matters of design. Unfortunately, this kind of approach is highly unreliable. Sure, those with a lot of experience in the field will unknowingly draw patterns from their previous successes or failures, but this usually comes down to a limited number of people (who have enough experience in the first place). Today, web designers have something much more reliable to lean on: the notion of data analytics.

John Wanamaker once said that one half of everything he spent on marketing went to waste; the problem, however, was that he could never know which half. Still, this statement was uttered ages ago, and today things are much different than they were at the time. Thanks to the internet and the numerous analytics tools we have at our disposal, in 2017 it is more or less possible to accurately predict which factors affect website or app users the most. By knowing this, you can also see which website elements are most likely to result in a great ROI.
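The standard way to test which website elements actually move the needle is an A/B comparison of conversion rates. The sketch below uses a two-proportion z-test on invented numbers (2,000 visitors per variant); the figures and variable names are illustrative, not from the article.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-statistic for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: variant B's call-to-action converts 260/2000 vs A's 200/2000
z = two_proportion_z(200, 2000, 260, 2000)
print(round(z, 2))  # → 2.97; |z| > 1.96 means unlikely to be chance at the 5% level
```

A result like this is what lets a designer say "element B outperforms element A" with data rather than gut feeling.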

What Kind of Data Are We Talking About?

We keep repeating this word data, but what kind of data are we actually talking about? ...


Read More on Datafloq
Why Enhanced Connectivity Offers the Opportunity for Superior Data Analysis

Why Enhanced Connectivity Offers the Opportunity for Superior Data Analysis

Today's small businesses often have more opportunities to employ data analysis than they might realize. Devices, appliances and equipment that maintain a constant connection to the Internet can often yield valuable information suitable for use in data analysis efforts. Being able to develop a clearer picture of their needs, their situation and any opportunities just over the horizon can ensure that small businesses are able to enhance the overall efficiency and profitability of their operations. The enhanced connectivity behind the so-called Internet of Things may offer significant benefits when it comes to auditing or analyzing data.

Putting Big Data to Work for Small Businesses

When it comes to the competitive world of business, being able to base decisions on more accurate information is often crucial. Small businesses can use data analysis to fine-tune cost-effective marketing efforts, analyze their current operations to identify areas of waste or inefficiency, and ensure that future policy changes and new workflow processes are tracked and assessed to greater effect. The electronic devices, systems and business equipment that businesses may rely upon during the course of their day-to-day operations may be able ...


Read More on Datafloq
The Ethics of Big Data and the Latent Threat to Democracy

The Ethics of Big Data and the Latent Threat to Democracy

One of the most cliched, overused, yet poignant quotes comes from a dying Uncle Ben, speaking to Peter Parker on the eve of becoming Spider-Man:

“With great power comes great responsibility.”

Big Data and the IoT represent some of the most powerful technological advances that mankind has ever seen in its relatively short history. Social and cultural norms dictate that the average citizen of any developed country owns and regularly uses a smartphone, a social media profile, and a myriad of other devices that are connected to the cloud. The data these users generate is genuinely useful and can contribute to grandiose marketing efforts, widespread government initiatives, and important humanitarian efforts as well. However, there’s a more sinister side to big data and analytics usage that we can’t ignore, and that responsibility demands we confront with an ethical framework.

A New, Confused Field of Research

The problem with approaching the ethics of Big Data is that nobody can really agree yet on a framework. This is due to multiple factors, but, in her piece “Scientists Are Just As Confused About the Ethics of Big-Data Research As You,” published via Wired in mid-2016, tech author Sarah Zhang argues that “the risks—and rewards—of analyzing big, ...


Read More on Datafloq
Mastering Data: A Comprehensive Take on MDM and Analytics

Mastering Data: A Comprehensive Take on MDM and Analytics

Today, CEOs and their teams can access a broader pool of data than anyone could have imagined 10-15 years ago. The issue is not a lack of information or an over-abundance of it, but fetching the right information to make decision-making easy and hassle-free.

A pervasive analytical strategy strengthens activities and welcomes a self-service mechanism, coupled with comprehensive guidance and supervision for the people who access the data. This kind of strict governance often plays an influential role in curbing data quality issues, wrong decision-making and redundant evaluations.

Embrace Master Data Management! MDM is often overlooked, though it is one of the essential driving factors in making an organisation completely data-driven. A comprehensive data management strategy comprises efforts focused on preserving data quality and improving data governance across a diverse range of business applications. With efficient MDM tools and techniques, companies can enjoy better access control and expansive oversight, along with tracking the origin and integrity of data, ensuring improved safety and consistency across wide decision-making procedures.

By combining data analytics with improved governance and supervision with the help of MDM, companies tend to find higher user satisfaction levels. Owing to high quality, trusted data, easy access to data and ...


Read More on Datafloq
Risk Management Cycle or Procedure – ISO 31000 perspective | Simplilearn

Risk Management Cycle or Procedure – ISO 31000 perspective | Simplilearn

Risk Management Cycle or Procedure - ISO 31000 perspective | Simplilearn ISO 31000 has introduced some important and more pertinent terms to the risk management standard and hence helps in better orchestration and implementation of the process across the organization to yield benefits whilst at the same time controlling the costs and the overall optimization of resources. • Risk owner is defined as a “person ...Read More.
Risk register – An important component of overall risk management framework | Simplilearn

Risk register – An important component of overall risk management framework | Simplilearn

Risk register - An important component of overall risk management framework | Simplilearn Risk register is an important component of the overall risk management framework. An in-depth understanding of this topic will enable us to understand the risk management processes better. Risk register is also an important topic of study for the PMP® certification exam. What is a Risk Register? The risk register database can be viewed by projec...Read More.
PMI-ACP Exam: Agile Certified Practitioner – Eligibility Requirements | Simplilearn

PMI-ACP Exam: Agile Certified Practitioner – Eligibility Requirements | Simplilearn

PMI-ACP Exam: Agile Certified Practitioner - Eligibility Requirements | Simplilearn The Project Management Institute (PMI) exam validates the knowledge of aspirants on agile and their ability to understand agile principles and concepts with a PMI-ACP Credential. PMI-ACP Credential on your CV demonstrates your ability to handle challenging projects along with the use of agile tools and techniques. According to Simplyhired, the...Read More.
The Day-to-Day Life of a Data Scientist

The Day-to-Day Life of a Data Scientist

Many businesses and industries understand there is great potential ready to be unlocked in data science, so much so that they’re eager to hire the best data scientists. Despite this, there is still a great deal of confusion about what exactly a data scientist does. Broadly speaking, data scientists use data to solve problems for the businesses and clients they work with, but that general explanation doesn’t even begin to delve into their individual responsibilities and what their day-to-day schedules and tasks are like. To better understand how data scientists work, it’s important to get a feel for what they do in a typical work day. Note, however, that the following is a mere generalization and that daily tasks can differ greatly depending on the data scientist and who they work for. This should only be read as a wide view and not a concrete example.

It may come as a surprise to some business leaders, but the typical day for a data scientist can resemble a typical day for any employee. Meetings are common, and they usually happen on a daily basis. In fact, that’s how many data scientists begin their day, talking about and discussing the work they were ...


Read More on Datafloq
How the Internet of Things Changes Big Data Analytics

How the Internet of Things Changes Big Data Analytics

The internet of things is going to have a dramatic and far-reaching impact on the world that we can’t even imagine. By the year 2020, there will be somewhere in the vicinity of 28 billion sensors online. That’s nearly four per person. And from there it will only multiply.

That means tremendous amounts of data will be pouring in—more data than we can, at this point, process. In fact, we might never be able to process it all: even as we ramp up our ability to process data, the number of sensors is going to keep growing. You can almost see it as an arms race between the data and the ability to process it.

Even if we can’t process that data, it won’t just disappear. Instead, it will get dumped onto huge servers, as the cost of storing information continues to plummet and it becomes easier to just put things on disk than to do anything with them.

So what does all this mean for big data?

For one thing, it means that if you want to make sure that you can use this incredible river of data, you’ll need to start early. You’ll need to start picking and ...


Read More on Datafloq
How New Technologies Are Challenging the Ad Industry Status Quo

How New Technologies Are Challenging the Ad Industry Status Quo

They say that advertising isn’t an exact science and that most of the time you need to be patient and hope for the best. This is obviously something advertising companies would like their clients to believe indefinitely. Until recently, it was difficult to quantify results, and a trial-and-error process was regarded as acceptable in most cases. Once again, technology came to the rescue, and data science changed the way the game is played in this industry.

Today it is much easier for clients to estimate the success of their ongoing promotions, and they can make educated decisions. An ample program will require just as much involvement and time spent on advertising, but at least the results are easier to quantify. If something doesn’t work according to plan, at least you know why and can make the necessary changes. The wheels are set in motion, and data-driven creativity is changing advertising faster than anyone could’ve predicted a couple of years ago.

Know Everything About the Targeted Audience

Companies selling products and services are constantly trying to expand their customer base by any means necessary. Before they get the chance to pitch an idea and convince people that it’s worth investing in what ...


Read More on Datafloq
BI Success – Does the tail wag the dog?

BI Success – Does the tail wag the dog?

I was watching a recorded event this morning about the keys to Business Intelligence success. The top 4 points provided were: finding the right executive sponsor; don’t build what they ask for, build what they need; offer up a buffet (deliver multiple options); and don’t try to boil the ocean. Let’s be honest, there’s nothing […]
How Analytics Is Evolving Like the Medical Field

How Analytics Is Evolving Like the Medical Field

It used to be that a doctor was a doctor for the most part. Even a century ago, unless you lived in a large city, people likely had a town doctor who handled most every type of ailment and guided most any type of treatment. Given the limited medical knowledge and lack of sophisticated treatment options during this time, these generalists could often provide a level of care that was comparable to the best available. Today, that is no longer true in medicine and a similar trend is playing out in analytics.

The Proliferation of Specialists

Today, the medical profession still has general practitioners who are usually our first line of defense. However, there are also specialists focused on a wide range of areas within medicine. Radiologists, surgeons, orthopedists, and more are among those we interact with on a regular basis. Even within these specialties there are sub-specialties, such as brain surgeons and heart surgeons. While all medical professionals go through the same basic training, many then focus on training in a specific area.

When I started in analytics just a few decades ago, most of us were similar to the town doctor. For the most part, we were generalists who would apply ...


Read More on Datafloq
How Big Data Is Changing the Business World, and Why It Matters

How Big Data Is Changing the Business World, and Why It Matters

In the not-too-distant past, businesses ran on data collected through store visits, coupon redemption, mailed surveys and the ever-present telemarketing calls. Information is still collected this way, but there is an increasing reliance on web searches, email contacts and online analytic tools. The information collected online can gauge customer behavior more accurately than traditional means.

The fact is that more and more people conduct their business online as opposed to face-to-face. So a small business that has an online component to its products and services is more likely to attract business. It is also understandably smart to consider providing products and services online because:

It is instant gratification for your customer. Online tools make it very easy to track and ship products.

It provides you with an avenue to showcase your products and services to customers.

You can customize the customer's experience and understand their buying patterns.

A lot of the benefits of online business come from collecting, analyzing, and using big data. Here are some ways in which it is changing the business world.

Big Data Changes How You View Your Customers

In the past your business wouldn't have access to trends as quickly as you do today. This matters because your ...


Read More on Datafloq
The Top 5 Software Development Trends in 2017

The Top 5 Software Development Trends in 2017

As digital start-ups continue to transform industries around the world, businesses need to do all they can to stay competitive. We've heard all about the benefits new technologies can bring, but what should IT professionals be doing right now to stay ahead of the game? The challenges of a rapidly changing environment can seem daunting, but with the right knowledge and preparation, they can be harnessed to propel a company forward. Here are the top software development trends that are shaping enterprise this year.

Automation

It might seem strange to start with something that's been around for a few years now, but we're set to see a huge transformation in how IT automation is implemented. Opportunistic scripting is well-loved by administrators, but the time has come for it to be replaced by a more systemic implementation. The current gold standard is based around heuristic design — using data from operations to learn how a system works, before applying automation accordingly. 
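As a toy illustration of that heuristic approach, assume we have historical readings from a running service: we learn a threshold from the operational data first (mean plus three standard deviations is one common rule of thumb), and only then wire an automated action to it. The metric values and the restart rule below are invented for the example; they are not from any specific tool.

```python
import statistics

# Historical memory usage in MB, sampled from normal operation (invented data)
history = [512, 530, 498, 545, 520, 510, 533, 507]

# Learn a threshold from operations data instead of hard-coding one
threshold = statistics.mean(history) + 3 * statistics.stdev(history)

def should_restart(current_mb: float) -> bool:
    """Apply the learned rule: act only on readings far outside observed behavior."""
    return current_mb > threshold

print(should_restart(525))   # within the learned normal range → False
print(should_restart(700))   # far above anything observed → True
```

The point of the systemic approach is that the trigger adapts as the history changes, rather than living as a magic number inside an opportunistic script.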

In order to manage this change, experts recommend appointing an automation leader in IT, as well as providing incentives for administrators.

Machine Learning

Heuristic automation itself has only been made possible by recent advances in artificial intelligence. And just as the term 'Big Data' reverberated ...


Read More on Datafloq
8 Amazing Smart Home IoT Devices Available in 2017

8 Amazing Smart Home IoT Devices Available in 2017

Do you sometimes feel like your home greets you like an old friend when you come home from a long day in the office? Well, what if it could? The Internet of Things (IoT) continues to develop rapidly, with new devices becoming available every year. Smart homes are one of the most widely adopted IoT applications currently available to consumers, and devices for smart homes are becoming more popular each year. One of the biggest benefits of smart home devices is that they are totally customizable—you can choose the devices that make sense for your life. But what options are out there? Well, we still haven’t seen a device that will cook your breakfast for you, but these 8 cutting-edge smart home IoT devices are available this year, and could make your life a whole lot easier. 

1. Sensi Wi-Fi Thermostat

Let’s face it, not everyone likes a room to be the same temperature. You won’t always be able to agree on what makes for a comfortable room, but you will be able to optimize your preferences with the Sensi Thermostat. The Sensi allows you to set different temperature schedules, depending on the day of the week, giving you better control over ...


Read More on Datafloq
7 Reasons to Switch to Cloud Hosting in 2017

7 Reasons to Switch to Cloud Hosting in 2017

Cloud computing is here to stay – that’s a fact. While some were still skeptical of the wide-ranging effects of the “cloud revolution” several years ago, it has become clear that outsourced, cloud-based computing and infrastructure solutions are the wave of the future.

Still, some people are skeptical. Even though cloud hosting is one of the earliest “cloud” technologies, there are still individuals and companies who aren’t making use of the cloud – and are missing out on the many benefits of websites that are hosted on the cloud.

In this article, we’ll take a look at the top 7 reasons your website should be running on the cloud, to give you an inside perspective on the benefits of cloud web hosting.

1. Lower Costs

Hosting a website on your own is expensive. You have to invest in server architecture, licensing, and operating costs for an on-premises server. Not only that, you’ll have to have IT staff who are able to service and maintain your equipment – or you’ll have to outsource to another IT firm that is willing to do so.

Cloud hosting allows you to avoid all of this. By simply paying a flat fee per month, your website will be hosted on ...


Read More on Datafloq
5 Ways Data Will Be Commoditized in the Future

5 Ways Data Will Be Commoditized in the Future

Data is a valuable resource, and it’s already starting to become commoditized. Institutions that can collect, organize, and distribute data are making significant profits, and data technology is making its way into the hands of less and less tech-savvy users. At the same time, improvements to our technology and its increasing reach are making it cheaper to execute tasks that seemed unthinkable just a decade ago.

So what does this mean for the future of that data industry and for consumers as a whole?

Data Commoditization

We predict these five developments from the commoditization of data in the future:

1. Accessible predictive analytics.

Currently, predictive analytics is a branch of analytics that exists as more of a wish-list item than a feasible or reliable institution. It exists, and is in use by many companies, but it relies on a combination of state-of-the-art artificial intelligence and human data analysts to make it accurate (and allow it to make actionable, clear predictions). Predictive analytics also isn’t something available to the general public, or even small business owners with limited budgets and resources. However, as our access to data technology begins to scale, predictive analytics will become more accessible and may even become a “baseline” ...
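"Accessible" here can mean something as small as fitting a trend line without specialist tooling. The sketch below fits ordinary least squares to a handful of invented monthly sales figures and extrapolates one month ahead; real predictive analytics layers far more on top, but the core mechanism is this approachable. All numbers are hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, in plain Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b   # intercept, slope

months = [1, 2, 3, 4, 5, 6]
sales = [100, 112, 119, 133, 141, 155]   # hypothetical monthly sales figures

a, b = fit_line(months, sales)
forecast = a + b * 7                      # predict month 7 from the fitted trend
print(round(forecast, 1))                 # → 164.3
```

A small business owner could run exactly this on a spreadsheet export, which is what commoditization looks like in practice.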


Read More on Datafloq
Revised DFS Cyber Regulation is Out

Revised DFS Cyber Regulation is Out

New York State’s Department of Financial Services (DFS) has just released its revised first-in-nation proposed cybersecurity regulation.  In formulating the revised proposal, DFS took into account the more than 150 comments it received with regard to its original proposal, which was released in September 2016.  Although the new proposal maintains many of the requirements of the initial proposal, such as the requirements for a Cybersecurity Program, a written Cybersecurity Policy, and the designation of an individual responsible for the program’s implementation and oversight, the new proposal differs in a number of very significant ways, highlighted below:


DFS has retreated from the prescriptive approach it took in its original proposal. Under the new proposal, an entity’s Cybersecurity Program “shall be based on the Covered Entity’s Risk Assessment.”
DFS has deleted the requirements to identify the Covered Nonpublic Information stored by the Covered Entity and to identify its sensitivity.
There is a new requirement to address “asset inventory and device management” in the Cybersecurity Policy, while the requirement to address “capacity and performance planning” has been eliminated.
The Cybersecurity Policy must be approved by a Senior Officer or by the Board of Directors, but the requirement for annual review by the board or a senior officer ...


Read More on Datafloq
Urgent Need on ‘Silent’ Cyber Risks

Urgent Need on ‘Silent’ Cyber Risks

This is an unprecedented time for insurers. As margins associated with conventional lines of coverage continue to tighten, pressure is increasing to offer new forms of coverage to respond to the emerging cyber threats facing insureds in today’s digital economy. At the same time, insurers are compelled to make certain that those risks are effectively excluded from coverage under many other “traditional” policy forms.

Unfortunately for underwriters of both traditional and newer policy forms, emerging cyber threats can be difficult, if not impossible, to predict and factor into underwriting and policy drafting processes. But as we’ve already seen in the context of cyber incidents, today’s unknown cyber threat can become tomorrow’s front-page news and unanticipated limits payout. And if that threat is spread across multiple insureds in an insurer’s coverage portfolio, the bottom-line effect of the aggregated losses could be devastating. Making matters worse — as recently recognized by the Bank of England’s Prudential Regulation Authority (PRA) — these “silent” cyber exposures can simultaneously affect multiple lines of coverage (including casualty, marine, aviation and transport), affecting both direct and facultative coverages.

Imagine this scenario:

Company A manufactures components used in the Wi-Fi systems of commercial airliners. Mr. X, a disgruntled employee of Company ...


Read More on Datafloq
What is the Concept of the Internet of Things in a Box

What is the Concept of the Internet of Things in a Box

A few years ago, the idea of a “Telco in a Box” was very common in the telecommunications industry. Basically, it was a pre-integrated, turnkey, real-time billing and customer care solution that enabled communications service providers (CSPs) to accelerate their growth strategies and increase profitability.

Companies like Accenture, Oracle, Redknee or Tech Mahindra aimed this concept at Mobile Virtual Network Operators (MVNOs), Tier 3 operators and Tier 1 sub-brands. The benefits of this solution were clear:


A low-risk, quick to launch turnkey solution
Go to market faster than competitors


It was only a matter of time before this marketing slogan reached the Internet of Things (IoT). And so it has, for the moment with little noise, but it is certain that we will see much more “IoT in a Box” in the coming months.

What is IoT in a Box, and What’s in the Box?

Today we could say that IoT in a Box is:


A pre-configured, fully integrated, enterprise-enabled IoT bundle optimized for IoT processing (Telco view)
All the required building blocks to develop a wireless IoT system (IoT Vendor view)


In the first case, the IoT in a Box must include some of the following components depending on the application:

Hardware / Hardware as a ...


Read More on Datafloq
Are You Brave Enough to Change Your Data Habits?

Are You Brave Enough to Change Your Data Habits?

Do you often go with gut feeling rather than data and insights? Is your data stored in separate databases, in different formats with different values? We all have bad habits and some are a little hard to kick. However, if there is one you must break, it is surely to make your bad data habits a thing of the past…

Breaking bad data habits isn’t easy. Often there is internal resistance to making data-driven changes, especially with the ‘way things have always been done’ attitude many businesses still embrace. However, ignoring the elephant in the room can be costly. According to Experian’s Data Quality Report, 83% of companies believe their revenue is affected by inaccurate and incomplete customer or prospect data. This is often due to time and money being wasted on unnecessary resources and on marketing and communication activities, which ultimately results in a huge loss of productivity.

The cost of bad data habits 

Data is your business’ most valuable asset – it enables you to make the right decisions and impacts everything from email deliverability to customer service and ultimately revenue generation.

Unfortunately, many businesses don’t realise the scale of the data quality issues and fail to give it the focus it deserves. They continue to follow the status quo, ...


Read More on Datafloq
Great Mathematical API by Wolfram

Great Mathematical API by Wolfram

I was in the process of computing some definite integrals involving special mathematical constants, when I discovered WolframAlpha. It solves tons of mathematical problems, for free, online, offering exact solutions whenever possible. Not just integrals, but matrix computations and much more.

In my case, I was trying to see whether, by computing an integral in two different ways, one using the original function on the original domain and the other using the inverse function on the image domain, I could find a mathematical equality involving one special mathematical constant for the first integral (say e or Pi) and a different special mathematical constant (say log 2) for the second integral. The idea being that, if I managed to find such a relationship, it would mean that the two mathematical constants (say e and log 2) are a simple function of each other, and thus we only need one.
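To make the two-ways idea concrete, here is a small numerical sketch of my own (not from the article), using f(x) = x² on [0, 1] and its inverse √y on the image domain. For an increasing f with f(0) = 0, the two integrals are tied together by the classical identity ∫₀ᵃ f(x) dx + ∫₀^f(a) f⁻¹(y) dy = a·f(a):

```python
# Check numerically that an integral can be computed two ways: via the
# original function on [0, 1], or via its inverse on the image domain.
# For increasing f with f(0) = 0:  integral(f) + integral(f_inv) = a * f(a).

def integrate(g, a, b, n=100_000):
    """Simple composite midpoint-rule quadrature."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x * x          # original function on the original domain
f_inv = lambda y: y ** 0.5   # its inverse on the image domain [0, 1]

left = integrate(f, 0.0, 1.0)        # approximately 1/3
right = integrate(f_inv, 0.0, 1.0)   # approximately 2/3
print(abs(left + right - 1.0) < 1e-6)  # True: the two pieces sum to 1 * f(1)
```

Finding such an identity is easy; the hard part, as described above, is finding one whose two sides evaluate to expressions in *different* special constants.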

Needless to say, I was not able to find such relationships. I did find some interesting stuff though. First, the WolframAlpha API, and then, I rediscovered an obscure but fundamental theorem, not mentioned in math textbooks -- a theorem linking the integral of a function to the integral ...
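For readers who want to try the same kind of computation programmatically, here is a minimal sketch of how a query such as a definite integral can be sent to the WolframAlpha Full Results API. The endpoint lives at api.wolframalpha.com/v2/query and requires an AppID from the Wolfram developer portal; the "DEMO" value below is a placeholder, not a working key:

```python
from urllib.parse import urlencode

def wolfram_query_url(expression, app_id):
    """Build a WolframAlpha Full Results API request URL.

    Fetching the URL (e.g. with urllib.request) returns an XML document
    of result 'pods' containing exact and numeric forms where available.
    """
    params = {"input": expression, "appid": app_id}
    return "https://api.wolframalpha.com/v2/query?" + urlencode(params)

# Placeholder AppID; substitute your own before actually sending the request.
url = wolfram_query_url("integrate exp(-x^2) from 0 to infinity", "DEMO")
print(url)
```

The same query typed into the wolframalpha.com search box returns the exact answer sqrt(pi)/2 for this example.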


Read More on Datafloq
Data Supply Framework 3.0 – ETL Patterns

Data Supply Framework 3.0 – ETL Patterns

This article is the first in a series that discusses aspects of the use of architectural patterns in the Cambriano Information Supply Framework 3.0.

The term architectural pattern may sound grand, misleading or daunting, but it’s really quite a simple concept. It’s like writing a function in a programming language to log in to a database, check that the connection is alive and working, and report back the success of the connection request. If that function can be reused either in the same application development, in the same IT shop or in IT in general (e.g. Java code to connect and test the connection to SQL Server), then it’s well on its way to becoming an architectural pattern. Of course, there are much more sophisticated architectural patterns. But generally, a pattern is a simplified and generic template for addressing a commonly occurring problem. And as with much in architecture, less usually turns out to be more.
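As a minimal sketch of that connect-check-report pattern, here is one possible Python version, using the standard library's sqlite3 purely as a stand-in for whatever database engine a given shop actually runs:

```python
import sqlite3

def check_connection(db_path):
    """Open a connection, verify it is alive with a trivial query,
    and report back whether the connection request succeeded."""
    try:
        conn = sqlite3.connect(db_path)
        conn.execute("SELECT 1")  # liveness check: any cheap round-trip query
        conn.close()
        return True
    except sqlite3.Error:
        return False

# Usage: an in-memory database always connects successfully.
print(check_connection(":memory:"))  # True
```

The pattern, not the code, is the reusable artifact: the same shape (connect, probe, report) can be re-expressed against SQL Server, Oracle or any other engine.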

In this article I will be looking at patterns for the process known as ETL (Extract, Transform and Load), which is the typical mechanism used to take data from source systems (which may be systems of record) through transformation (to put the data into ...


Read More on Datafloq
Why IoT Viability Depends on Education

Why IoT Viability Depends on Education

Although there are now many connected devices, from Google Home to Amazon Echo, the Internet of Things still isn’t a big deal for the purchasing public. Look at Forbes’ 2017 predictions on the IoT and the other developments that go with it--AI, Big Data, etc--and there’s nothing necessarily positive. It’s a “buzzword”. Widespread adoption won’t happen because of complexity. The IoT will shut down the internet again like it did with the DDoS attack, only this time it will be worse. Cybersecurity for IoT devices will be a number one priority because, for one, ransomware will start hitting these devices, too.

But earlier in the same article, there are some predictions that bode well for the IoT. For one, chatbots will continue to take off, or as Narrative Science CEO Stuart Frankel puts it, “The movement towards conversational interfaces will accelerate.” When it comes to the consumer-facing side of the IoT, the conversational interface is the access point for everything from asking Siri where the nearest gas station is to querying Home about artists similar to ABBA. Chatbots are indeed so relevant to the right-here-and-now of business that Aisle50 co-founder Christopher Steiner offers an investor’s guide to chatbots.

Steiner says, “The growing pool ...


Read More on Datafloq
What to Look For When Hiring Programmers

What to Look For When Hiring Programmers

Hiring a developer who is really talented is a skill that relies heavily on intuition, technical acumen, social networking and process management. If you have ever been given the hiring responsibility, then you most probably understand the challenges that are involved.

One particular problem when it comes to IT staffing is that it is tough to evaluate the hallmark qualities of the software engineering candidates. How can you explore the ability of a candidate to think creatively and innovate? How will you know if he or she is a team player? How do you know their capacity to utilize constructive feedback? Can you even investigate their moral fiber? Even though this task isn't simple, it is important to find answers to all these questions so as to hire the best people. Unfortunately, nothing of substance will be achieved when you just ask them.

Some methodologies and approaches can help you to evaluate the subtle dimensions of a programmer's skills and abilities. If used effectively, the techniques can assist you to hire the best programmer there is.

Technical acumen

Determining the technical proficiency of a candidate goes beyond their knowledge in programming languages or technology. An ideal developer will not waste time memorizing something that ...


Read More on Datafloq
Warehouse Management in the Era of Big Data

Warehouse Management in the Era of Big Data

There are many things that determine the efficiency of a business. However, when that business involves the production of a tangible product, there is probably nothing more important to efficiency than the quality of warehouse management.

In recent years there has been a revolution that is changing warehouse management forever. That revolution is big data. Big data is the collection and analysis of a volume of digital information so vast that it couldn’t be stored on computer hardware until recently. It has transformed business analytics and been a game changer in many different industries. Here are some of the ways big data is changing warehouse management in the modern era.

Big Data Improves Operational Efficiency

Operational efficiency can be described as the ability of a business to deliver its product or service to consumers in a way that minimizes cost while maximizing the quality of that product or service as well as other related support services. Operational efficiency is certainly something that can be improved by the integration of big data analysis. Data regarding operations in a warehouse, or any facility for that matter, can be immediately updated in real time.

This can have some real benefits. A warehouse manager can be given a minute ...


Read More on Datafloq
[BreakingNews] Google Has Acquired Kaggle

[BreakingNews] Google Has Acquired Kaggle

The news broke early this morning that Google has acquired Kaggle.com, the site that hosts the largest data mining competitions (first report here, more official ones here, though at the time of writing this post Google had not yet officially announced the transaction). There is no word on the purchase price; since its founding in 2010, the company had raised $12.5 million in investment.


The news came unexpectedly, but in many ways it is not that surprising: Google can benefit from many things in Kaggle’s portfolio. Kaggle is one of the central sites of the data science community, with its own job board and nearly half a million users (a number I can hardly believe), so it is a good talent pool for the company, if only from a recruiting standpoint. Beyond that, with its Kernels initiative, Kaggle has also built its own machine learning platform, which will look good under the Google Cloud banner as well. This is perhaps the point where we, as Kaggle users, stand to gain the most. No wonder: Google is a company that believes in the power of data, so why wouldn’t it believe that the center of the data scientists’ world is worth owning.


On the other hand, the acquisition also proves to me that in the universe of companies and professionals working with data, the big tech giants act like stars with enormous gravity: they absorb promising initiatives ever faster, and the truly exciting things happen around them. With the huge wealth of data and knowledge accumulating there, companies that treat working with data as a barely tolerated hobby or a reluctant experiment will find it increasingly hard to compete.

For example, I recently met a large company where the prevailing view was that they were afraid to build internal competence, because if someone learned this profession inside the company, they would surely be lured away by firms hunting for such people. Yet in the long run the opposite trend is likely: data-driven thinking and fluency with data will become a basic requirement rather than a differentiator. But to reassure the laggards (and sadden the frontrunners), I must note that, like every change that has to happen inside people’s heads, it will unfold far more slowly than the pace dictated by the technology itself.

Congratulations to the Kaggle team!


Recent Amazon Failures Leave “Cloud” Forecast Unclear

Recent Amazon Failures Leave “Cloud” Forecast Unclear

It’s been a rough few weeks for the cloud. Amazon Web Services, the public cloud division of internet juggernaut Amazon, suffered a mighty blow, crippling the internet on a Tuesday afternoon, while internet-connected teddy bears were hacked.

“If you could make God bleed, people would cease to believe in Him,” Ivan Vanko tells Tony Stark in Iron Man 2. “There will be blood in the water, the sharks will come.” A simple typo brought the Internet to its knees on Feb. 28.

AWS accounts for roughly a third of the cloud market, and a large share of internet traffic relies on it, particularly on the Amazon Simple Storage Service (S3). A tiny coding error brought down all of S3’s customers, including popular services and websites such as Spotify, Imgur, Slack, Quora, and, ironically, the Down Detector.

This is the first major malfunction of S3 and the AWS, which is “designed to deliver 99.999999999% durability” for cloud services. Being one of the first cloud services commercially offered, Amazon quickly signed major names to the service. But, other providers, such as Microsoft, Google, and IBM, are catching up. Does this major outage equate to blood in the water, signaling a shift to other services?
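As an aside (my own back-of-the-envelope reading, not from the article), eleven nines of durability is a statement about expected data loss, not about uptime, which is why a durable service can still suffer an outage like this one:

```python
# "99.999999999% durability" corresponds to an annual object-loss
# probability of roughly 1e-11 per object. For a hypothetical customer
# storing one billion objects:
durability = 0.99999999999
p_loss = 1.0 - durability            # about 1e-11 per object per year
objects = 1_000_000_000
expected_losses_per_year = objects * p_loss
print(round(expected_losses_per_year, 2))  # about 0.01 objects per year
```

In other words, the design target is about losing data almost never, while availability (being reachable at all) is governed by a separate, much lower figure.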

It’s possible, though unlikely. Given the rarity ...


Read More on Datafloq
The Next as-a-Service: SRM

The Next as-a-Service: SRM

Storage Resource Management is once again a growth opportunity. The simplicity and visibility provided through new cloud-based as-a-service dashboards empower a broad range of IT management, while retaining the sophistication, augmented by predictive analytics, that storage administrators require.
Why Should Veterans Take Free LSSGB and PMP Training with MSys Training?

Why Should Veterans Take Free LSSGB and PMP Training with MSys Training?

Veterans have the potential to handle the responsibilities of managers or other senior roles in any industry. Their military service brings valuable skills and experience to any workforce. However, they may need training on applying their military experience in the corporate world. We at MSys Training are taking an initiative in this direction by offering four days of complimentary training on Lean Six Sigma Green Belt (LSSGB) and Project Management Professional (PMP). Veterans can enroll by simply sending an email to support@msystraining.com along with a scanned copy of a document that verifies their veteran status. We understand the confidentiality of veteran documents and therefore suggest that veterans hide or blur any sensitive details before sharing the scanned copy.

Continue reading the article to know more about the LSSGB training (March 28th to 31st, 2017) and the PMP training (April 25th to 28th, 2017).

4-Days Lean Six Sigma Green Belt Training

MSys is conducting an LSSGB training for veterans to help them enhance their skills and explore job opportunities in various industries. We are organizing this training session to honor our veterans, and it brings them many benefits. Some of them are listed below:

  • 4 days online classroom training by lead Lean Six Sigma instructors at MSys
  • Career guidance from MSys professionals
  • Discount on Lean Six Sigma Green Belt course material and MSys LSSGB certification exam
  • 30% off on online and in-person classroom training programs and 40% off on online self-learning on future training programs
  • $100 referral bonus on every successful referral enrollment (except Veterans training programs)

Enroll today for the LSSGB training program by MSys Training, scheduled for March 28th to 31st, 2017; the last date to register for the session is March 15th, 2017.

Registration Process for LSSGB

With Course Material and MSys LSSGB Exam Voucher

  • Go to LSSGB Registration
  • Click on “Enroll now” for batch 2 of March 2017
  • Use coupon code “IAMVETERAN299”
  • Pay $299 ($99 for course material + $199 for the MSys LSSGB exam)
  • Register yourself

Without Course Material and MSys LSSGB Exam Voucher

  • Drop an email to support@msystraining.com with a scanned copy of a document that verifies your veteran status
  • You will receive confirmation within 24 hours

4-Days Project Management Professional (PMP) Training

Veterans have all the major interpersonal skills (decision making, leadership and the power to influence) required to be a successful project manager. To sharpen those skills to industry standards, MSys Training is offering complimentary online PMP training for US veterans on April 25th to 28th, 2017. Enrolling in this online classroom training brings several benefits, such as:

  • 4 days live online classroom training by lead PMP instructors
  • Discount on PMP Course material aligned with PMBOK V5
  • Career guidance by MSys professionals
  • 30% off on online and in-person classroom training programs and 40% off on online self-learning (applicable on future training programs)
  • Dedicated learning consultant to fill PMI® application
  • $100 referral bonus on every successful referral enrollment (except Veterans training programs)

Registration Process for PMP

With Course Material

  • Go to PMP Registration
  • Click on “Enroll now” for batch 2 of April 2017
  • Use coupon code “IAMVETERAN@99”
  • Pay $99 for course material
  • Register yourself

Without Course Material

  • Drop an email to support@msystraining.com with a scanned copy of a document that verifies your veteran status
  • You will receive confirmation within 24 hours

MSys is conducting both free trainings for veterans, but if you are not a veteran and still want to be part of the training program, you can register for our training at a 25% discount.

You might have several questions about our training program. Here is a list of some common ones; if you still cannot find your answer, you can send your query to support@msystraining.com.

How to enroll?

Send us an email at support@msystraining.com with a scanned copy of the document that verifies your veteran status to complete your enrollment.

When do I get my training details?

MSys Training will share the www.GoToMeeting.com login credentials with you 5 days prior to the training. You will receive an acknowledgement confirming your registration within 24 hours of your request.

Do I get the MSys courseware?

MSys courseware is available for veterans at just $99 per license for both courses. However, you will need to buy the PMBOK V5 from a local or online store for the class.

Do I get a course completion certificate?

Yes, upon completion of the online classroom training, MSys will send a 35 contact hours certificate for PMP and a 35 PDUs certificate for LSSGB to your registered email address.

What is the difference between MSys paid and Veterans free classes?

There is no difference between our free and paid training sessions. MSys is conducting these free classes so veterans can understand the current market situation and gain the extra skills required to excel in the corporate world.

How to take PMP/LSSGB Certification Exam?

You can appear for the MSys LSSGB exam and get certified for just $199. To schedule a PMP certification exam, on the other hand, you need to fill out an application on www.PMI.org.

I am new to PMI, will you assist me in filling out my application?

Yes, we have a team of dedicated learning consultants who will guide you through the PMI application process.

Is there any difference between IASSC and MSys LSSGB certification?

IASSC is a closed-book exam with just one attempt, whereas MSys LSSGB is an open-book exam with three attempts. You can choose to take the IASSC LSSGB exam at $399 or the MSys LSSGB exam at $199 (both are globally recognized).

Veterans! Gear up to get trained by the experts!


MSys Helping Veterans to Build Place in Corporate World

MSys Helping Veterans to Build Place in Corporate World

Veterans are highly respected in our society and are always appreciated for their contribution. But when it comes to offering them a job, organizations often don’t prefer hiring them and choose civilians over veterans. But have you ever considered who can serve your organization better? Let me highlight some traits of veterans that prove they are strong leaders and can add real value to any organization:

  • Veterans put work before themselves.
  • They are well-disciplined, hard-working, and have leadership qualities.
  • They are quick learners and can adapt to any situation.
  • During their service, they have been trained to mentor others, which is a much-needed skill.

To make their skills industry-ready, MSys Training is offering free trainings on Project Management Professional (PMP) and Lean Six Sigma Green Belt (LSSGB). These PMP and LSSGB trainings bring many benefits beyond saving money. Here are a few of them:

  • 4 days online classroom training by MSys’ lead PMP and Lean Six Sigma instructors
  • In-depth knowledge on Project Management and Lean & Six Sigma Methodologies
  • Allows you to attend 4 days online class from home, office or anywhere you are comfortable
  • Discount on course materials for PMP and LSSGB
  • Special discount on the MSys LSSGB certification exam
  • 30% off on online and in-person classroom trainings and 40% off on online self-learning on future training programs
  • $100 referral bonus on every successful referral enrollment (except Veterans training programs)
  • Career guidance from MSys professionals

At MSys Training, we have instructors who are well-versed in their topics and capable of sharpening professional skills. The instructors are proven experts in their respective fields and have taught for many years. More than expert professionals, the instructors are passionate about their courses!

These 4-day online classroom trainings are scheduled for March 28th-31st, 2017, and the last date to register for them is March 15th, 2017. You can opt for whichever training you are most interested in or looking to build your career in.

These free trainings are hosted only for veterans, but if you are not a military veteran and still want to take this training program, you can register for any training at a 25% discount.

You may have several questions in mind. Here is a list of some common ones; if you do not find your answer, feel free to contact us at support@msystraining.com.

How to enroll?

Simply drop an email to support@msystraining.com with a document that verifies your Veteran status to complete your enrollment.

When do I get my training details?

MSys Training will send you the GoToMeeting.com login credentials 5 days prior to the training start date. You will receive a registration confirmation email within 24 hours of your registration request.

Do I get the MSys courseware?

MSys courseware for both PMP and LSSGB is available for veterans at just $99 per license. In addition, you will need to buy the PMBOK V5 from a local or online store for the class.

Do I get a course completion certificate?

Yes, upon completion of the 4-day online class, MSys will send a 35 PDUs certificate for LSSGB and a 35 contact hours certificate for PMP to your registered email address.

Is there any difference between MSys paid and Veterans free classes?

No, there is no difference between our paid and free training sessions. MSys is conducting these free classes so veterans can understand the current market situation and gain the extra skills required to excel in the corporate world.

How to appear for LSSGB/PMP Certification Exam?

You can take the MSys LSSGB exam and get certified for just $199. For PMP, you need to fill out an application on PMI.org before scheduling the exam.

I’m new to PMI, will you assist me in filling out my application?

Yes, our dedicated learning consultants will guide you through the PMI application process.

What is the difference between IASSC and MSys LSSGB certification?

IASSC is a closed-book exam with just one attempt, whereas MSys LSSGB is an open-book exam with three attempts. You can choose to take the IASSC LSSGB exam at $399 or the MSys LSSGB exam at $199 (both are globally recognized).


MSys Honors Veterans and Offers 4-Days Lean Six Sigma Green Belt Training for No Cost

MSys Honors Veterans and Offers 4-Days Lean Six Sigma Green Belt Training for No Cost

Veterans have all the skills required to succeed as professionals in any industry, but because of their military or army background, many companies don’t give them preference. According to reports, companies also don’t pay veterans in line with their capabilities. MSys is taking an initiative to train veterans in different domains to help them get well-paying jobs.

To help veterans explore and seize the chance to get a good job in various industries, MSys is putting its best foot forward and organizing a free LSSGB training for veterans. To be part of the MSys LSSGB (Lean Six Sigma Green Belt) training program, veterans do not need to pay anything apart from four days of their schedule. As we are running this campaign to honor veterans, we ask you to submit a document verifying your veteran status to prevent scams. This free LSSGB training also brings many benefits, some of which are listed below:

  • 4 days online classroom training by Jason Saetrum, a lead Lean Six Sigma instructor at MSys
  • Heavy discount on LSSGB course material and MSys LSSGB certification exam
  • 30% off on online and in-person classroom training programs and 40% off on online self-learning on future training programs
  • Career guidance from MSys professionals
  • $100 referral bonus on every successful referral enrollment (except Veterans training programs)

Our leading Lean Six Sigma instructor, Jason Saetrum, will conduct this free LSSGB training session for veterans. He is an IASSC- and ASQ-certified Lean Six Sigma training associate, as well as a PMP, CompTIA Project+ and Microsoft Certified Trainer. With more than two decades of industry experience, he trains people to become experts in their respective domains.

Join in by registering for this free LSSGB (Lean Six Sigma Green Belt) training program by MSys Training. The 4-day online classroom LSSGB training will be held from March 28th to 31st, 2017, and the last date to register for the session is March 15th, 2017. What are you waiting for? Avail the benefits today! Register for the LSSGB training and pass this on to your friends and colleagues to get the maximum benefit out of this session.

This free training is hosted only for veterans, but if you are not one and still want to be part of the training program, you can register for our LSSGB training at a 25% discount.

If quality management is not your interest, there is no need to worry! MSys Training is also conducting a free training session on Project Management Professional on the same dates. You can opt for whichever training you are most interested in or looking to build your career in.

Here are some questions that may cross your mind before registering for the LSSGB training:

How to enroll?

To enroll for the training, just drop an email to support@msystraining.com to complete your enrollment.

When do I get my training details?

MSys will send you the GoToMeeting.com login credentials 5 days prior to the training start date. The registration confirmation email will be sent within 24 hours of your registration request.

Do I get the MSys courseware?

MSys courseware is available for veterans at just $99 per license.

How to appear for Certification Exam?

You can take the MSys LSSGB exam and get certified for just $199.

What is the difference between IASSC and MSys certification?

IASSC is a closed-book exam with just one attempt, whereas MSys LSSGB is an open-book exam with three attempts. You can choose to take the IASSC LSSGB exam at $399 or the MSys LSSGB exam at $199 (both are globally recognized).

Is there any difference between MSys paid classes and Veterans free classes?

No, there is absolutely no difference between our paid and free programs. MSys is conducting these free classes so veterans can understand the current market situation and gain the extra skills required to excel in the civilian world.

Do I get a course completion certificate?

Yes, upon completion of the 4-day online class, MSys will send a 35 PDUs certificate to your registered email address.


Converged IoT systems: Bringing the data center to the edge of everything

Converged IoT systems: Bringing the data center to the edge of everything

The next BriefingsDirect thought leadership panel discussion explores the rapidly evolving architectural shift of moving advanced IT capabilities to the edge to support Internet of Things (IoT) requirements.

The demands of data processing, real-time analytics, and platform efficiency at the intercept of IoT and business benefits have forced new technology approaches. We'll now learn how converged systems and high-performance data analysis platforms are bringing the data center to the operational technology (OT) edge.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To hear more about the latest capabilities in gaining unprecedented measurements and operational insights where they’re needed most, please join me in welcoming Phil McRell, General Manager of the IoT Consortia at PTC; Gavin Hill, IoT Marketing Engineer for Northern Europe at National Instruments (NI) in London, and Olivier Frank, Senior Director of Worldwide Business Development and Sales for Edgeline IoT Systems at Hewlett Packard Enterprise (HPE). The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What's driving this need for a different approach to computing when we think about IoT and we think about the “edge” of organizations? Why is this becoming such a hot issue?

McRell: There are several drivers, but the most interesting one is economics. In the past, the costs that would have been required to take an operational site -- a mine, a refinery, or a factory -- and do serious predictive analysis, meant you would have to spend more money than you would get back.

For very high-value assets -- assets that are millions or tens of millions of dollars -- you probably do have some systems in place in these facilities. But once you get a little bit lower in the asset class, there really isn’t a return on investment (ROI) available. What we're seeing now is that's all changing based on the type of technology available.

Gardner: So, in essence, we have this whole untapped tier of technologies that we haven't been able to get a machine-to-machine (M2M) benefit from for gathering information -- or the next stage, which is analyzing that information. How big an opportunity is this? Is this a step change, or is this a minor incremental change? Why is this economically a big deal, Olivier?

Frank: We're talking about Industry 4.0, the fourth generation of change -- after steam, after the Internet, after the cloud, and now this application of IoT to the industrial world. It’s changing at multiple levels. It’s what's happening within the factories and within this ecosystem of suppliers to the manufacturers, and the interaction with consumers of those suppliers and customers. There's connectivity to those different parties that we can then put together.

While our customers have been doing process automation for 40 years, what we're doing together is unleashing the IT standardization, taking technologies that were in the data centers and applying them to the world of process automation, or opening up.

The analogy is what happened when mainframes were challenged by mini computers and then by PCs. It's now open architecture in a world that has been closed.

Gardner: Phil mentioned ROI, Gavin. What is it about the technology price points and capabilities that have come down to the point where it makes sense now to go down to this lower tier of devices and start gathering information?


Hill: There are two pieces to that. The first one is that we're seeing that understanding more about the IoT world is more valuable than we thought. McKinsey Global Institute did a study that said that by about 2025 we're going to be in a situation where IoT in the factory space is going to be worth somewhere between $1.2 trillion and $3.7 trillion. That says a lot.

The second piece is that we're at a stage where we can make technology at a much lower price point. We can put that onto the assets that we have in these industrial environments quite cheaply.

Then, you deal with the real big value, the data. All three of us are quite good at getting the value from our own respective areas of expertise.

Look at someone that we've worked with, Jaguar Land Rover. In their production sites, in their power train facilities, they were at a stage where they created an awful lot of data but didn't do anything with it. About 90 percent of their data wasn't being used for anything. It doesn't matter how many sensors you put on something. If you can't do anything with the data, it's completely useless.

They have been using techniques similar to what we've been doing in our collaborative efforts to gain insight from that data. Now, they're at a stage where probably 90 percent of their data is usable, and that's the big change.

Collaboration is key

Gardner: Let's learn more about your organizations and how you're working collaboratively, as you mentioned, before we get back into understanding how to go about architecting properly for IoT benefits. Phil, tell us about PTC. I understand you won an award in Barcelona recently.

McRell: That was a collaboration that our three organizations did with a pump and valve manufacturer, Flowserve. As Gavin was explaining, there was a lot of learning that had to be done upfront about what kind of sensors you need and what kind of signals you need off those sensors to come up with accurate predictions.

When we collaborate, we rely heavily on NI for their scientists and engineers to provide their expertise. We really need to consume digital data. We can't do anything with analog signals and we don't have the expertise to understand what kind of signals we need. When we obtain that, then with HPE, we can economically crunch that data, provide those predictions, and provide that optimization, because of HPE's hardware that now can live happily in those production environments.

Gardner: Tell us about PTC specifically; what does your organization do?

McRell: For IoT, we have a complete end-to-end platform that allows everything from the data acquisition gateway with NI all the way up to machine learning, augmented reality, dashboards, and mashups, any sort of interface that might be needed for people or other systems to interact.

In an operational setting, there may be one, two, or dozens of different sources of information. You may have information coming from the programmable logic controllers (PLCs) in a factory and you may have things coming from a Manufacturing Execution System (MES) or an Enterprise Resource Planning (ERP) system. There are all kinds of possible sources. We take that, orchestrate the logic, and then we make that available for human decision-making or to feed into another system.

Gardner: So the applications that PTC is developing are relying upon platforms and the extension of the data center down to the edge. Olivier, tell us about Edgeline and how that fits into this?
Frank: We came up with this idea of leveraging the enterprise computing excellence that is our DNA within HPE. As our CEO said, we want to be the IT in the IoT.

According to IDC, 40 percent of the IoT computing will happen at the edge. Just to clarify, it’s not an opposition between the edge and the hybrid IT that we have in HPE; it’s actually a continuum. You need to bring some of the workloads to the edge. It's this notion of time of insight and time of action. The closer you are to what you're measuring, the more real-time you are.

We came up with this idea. What if we could bring the depth of computing we have in the data center in this sub-second environment, where I need to read this intelligent data created by my two partners here, but also, actuate them and do things with them?

Take the example of an electrical short circuit that for some reason caught fire. You don’t want to send the data to the cloud; you want to take immediate action. This is the notion of real-time, immediate action.

We take the deep compute. We integrate the connectivity with NI. We're the first platform that has integrated an industry standard called PXI, which allows NI to integrate the great portfolio of sensors and acquisition and analog-to-digital conversion technologies into our systems.

Finally, we bring enterprise manageability. Since we have a proliferation of systems, system management at the edge becomes a problem. So we bring our award-winning Integrated Lights-Out (iLO) technology -- millions of licenses sold across our ProLiant servers -- to the edge as well.

Gardner: We have the computing depth from HPE, we have insightful analytics and applications from PTC, what does NI bring to the table? Describe the company for us, Gavin?

Working smarter

Hill: As a company, NI is about a $1.2 billion company worldwide. We get involved in an awful lot of industries. But in the IoT space, where we see ourselves fitting within this collaboration with PTC and HPE, is our ability to make a lot of machines smarter.

There are already some sensors on assets, machines, pumps, whatever they may be on the factory floor, but for older or potentially even some newer devices, there are not natively all the sensors that you need to be able to make really good decisions based on that data. To be able to feed in to the PTC systems, the HPE systems, you need to have the right type of data to start off with.

We have the data acquisition and control units that allow us to take that data in, but then do something smart with it. Using something like our CompactRIO System, or as you described, using the PXI platform with the Edgeline products, we can add a certain level of understanding and just a smart nature to these potentially dumb devices. It allows us not only to take in signals, but also potentially control the systems as well.

We not only have some great information from PTC that lets us know when something is going to fail, but we could potentially use their data and their information to allow us to, let’s say, decide to run a pump at half load for a little bit longer. That means that we could get a maintenance engineer out to an oil rig in an appropriate time to fix it before it runs to failure. We have the ability to control as well as to read in.

The other piece of that is that sensor data is great. We like to be as open as possible in taking from any sensor vendor, any sensor provider, but you want to be able to find the needle in the haystack there. We do feature extraction to try and make sure that we give the important pieces of digital data back to PTC, so that can be processed by the HPE Edgeline system as well.
Frank: This is fundamental. Capturing the right data is an art and a science and that’s really what NI brings, because you don’t want to capture noise; it’s proliferation of data. That’s a unique expertise that we're very glad to integrate in the partnership.

Gardner: We certainly understand the big benefit of IoT extending what people have done with operational efficiency over the years. We now know that we have the technical capabilities to do this at an acceptable price point. But what are the obstacles, what are the challenges that organizations still have in creating a true data-driven edge, an IoT rich environment, Phil?

Economic expertise

McRell: That’s why we're together in this consortium. The biggest obstacle is that because there are so many different requirements for different types of technology and expertise, people can become overwhelmed. They'll spend months or years trying to figure this out. We come to the table with end-to-end capability from sensors and strategy and everything in between, pre-integrated at an economical price point.

Speed is important. Many of these organizations are seeing the future, where they have to be fast enough to change their business model. For instance, some OEM discrete manufacturers are going to have to move pretty quickly from just offering product to offering service. If somebody is charging $50 million for capital equipment, and their competitor is charging $10 million a year and the service level is actually better because they are much smarter about what those assets are doing, the $50 million guy is going to go out of business.

We come to the table with the ability to come and quickly get that factory, get those assets smart and connected, make sure the right people, parts, and processes are brought to bear at exactly the right time. That drives all the things people are looking for -- the up-time, the safety, the yield,  and performance of that facility. It comes down to the challenge, if you don't have all the right parties together with that technology and expertise, you can very easily get stuck on something that takes a very long time to unravel.

Gardner: That’s very interesting when you move from a Capital Expenditure (CAPEX) to an Operational Expenditure (OPEX) mentality. Every little bit of that margin goes to your bottom line and therefore you're highly incentivized to look for whole new categories of ways to improve process efficiency.

Any other hurdles, Olivier, that you're trying to combat effectively with the consortium?

Frank: The biggest hurdle is the level of complexity; our customers don't know where to start. So, the promise of us working together is really to show the value of this kind of open architecture injected into a 40-year-old process automation infrastructure and demonstrate -- as we did yesterday with our robot powered by HPE Edgeline -- that we can show immediate value to the plant manager, the quality manager, and the operations manager using the data that already resides in that factory, 70 percent or more of which is unused. That’s the value.

So how do you get that quickly and simply? That’s what we're working to solve so that our customers can enjoy the benefit of the technology faster and faster.

Bridge between OT and IT

Gardner: Now, this is a technology implementation, but it’s done in a category of the organization that might not think of IT in the same way as the business side -- back office applications and data processing. Is the challenge for many organizations a cultural one, where the IT organization doesn't necessarily know and understand this operational efficiency equation and vice versa, and how are we bridging that?

Hill: I'm probably going to give you the high-level view from the operational technology (OT) side; these guys will definitely have more input from their own domains of expertise. But the fact that each of us knows his own piece so well is exactly why this collaboration works really well.

You have situations with the idea of the IoT where a lot of people stood up and said, "Yeah, I can provide a solution. I have the answer," but without having a plan -- never mind a solution. We've done a really good job of understanding that we can do one part of this solution really well, and if we partner with the people who are really good in the other aspects, we provide real solutions to customers. I don't think anyone can compete with us at this stage, and that is exactly why we're in this situation.

Frank: Actually, the biggest hurdle is more on the OT side, not really relying on the IT of the company. For many of our customers, the factory's a silo. At HPE, we haven't been selling too much to that environment. That’s also why, when working as a consortium, it’s important to get to the right audience, which is in the factory. We also bring our IT expertise, especially in the areas of security, because at the moment, when you put an IT device in an OT environment, you potentially have problems that you didn’t have before.

We're living in a closed world, and now the value is to open up. Bringing our security expertise, our managed service, our services competencies to that problem is very important.

Speed and safety out in the open

Hill: There was a really interesting piece in the HPE Discover keynote in December, when HPE Aruba started to talk about how they had an issue when they started bringing conferencing and technology out, and then suddenly everything wanted to be wireless. They said, "Oh, there's a bit of a security issue here now, isn’t there? Everything is out there."

We can see what HPE has contributed to helping them from that side. What we're talking about here on the OT side is a similar state from the security aspect, just a little bit further along in the timeline, and we are trying to work on that as well. Again, we have HPE here and they have a lot of experience in similar transformations.

Frank: At HPE, as you know, we have our Data Center and Hybrid Cloud Group and then we have our Aruba Group. When we do OT or our Industrial IoT, we bring the combination of those skills.

For example, in security, we have HPE Aruba ClearPass technology that’s going to secure the industrial equipment back to the network and then bring in wireless, which will enable the augmented-reality use cases that we showed onstage yesterday. It’s a phased approach, but you see the power of bringing ubiquitous connectivity into the factory, which is a challenge in itself, and then securely connecting the IT systems to this OT equipment, and you understand better the kind of the phases and the challenges of bringing the technology to life for our customers.

McRell: It’s important to think about some of these operational environments. Imagine a refinery the size of a small city and having to make sure that you have the right kind of wireless signal that’s going to make it through all that piping and all those fluids, and everything is going to work properly. There's a lot of expertise, a lot of technology, that we rely on from HPE to make that possible. That’s just one slice of that stack where you can really get gummed up if you don’t have all the right capabilities at the table right from the beginning. 

Gardner: We've also put this in the context of IoT not at the edge isolated, but in the context of hybrid computing and taking advantage of what the cloud can offer. It seems to me that there's also a new role here for a constituency to be brought to the table, and that’s the data scientists in the organization, a new trove of data, elevated abstraction of analytics. How is that progressing? Are we seeing the beginnings of taking IoT data and integrating that, joining that, analyzing that, in the context of data from other aspects of the company or even external datasets?

McRell: There are a couple of levels. It’s important to understand that when we talk about the economics, one of the things that has changed quite a bit is that you can actually go in, get assets connected, and do what we call anomaly detection, pretty simplistic machine learning, but nonetheless, it’s a machine-learning capability.

In some cases, we can get that going in hours. That’s a ground-zero type capability. Over time, as you learn about a line with multiple assets and how they all function together, you learn how the entire facility functions, and then you compare that across multiple facilities. At some point, you're not going to be at the edge anymore; you're going to be doing systems-type analytics, and that’s different.

At that point, you're talking about looking across weeks, months, years. You're going to go into a lot of your back-end and maybe some of your IT systems to do some of that analysis. There's a spectrum that goes back down to the original idea of simply looking for something to go wrong on a particular asset.

The distinction I'm making here is that, in the past, you would have to get a team of data scientists to figure out almost asset by asset how to create the models and iterate on that. That's a lengthy process in and of itself. Today, at that ground-zero level, that’s essentially automated. You don't need a data scientist to get that set up. At some point, as you go across many different systems and long spaces of time, you're going to pull in additional sources and you will get data scientists involved to do some pretty in-depth stuff, but you actually can get started fairly quickly without that work.
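The "ground-zero" anomaly detection McRell describes -- learn an asset's nominal operating parameters from a few hours of data, then flag readings that stray from them -- can be sketched in a few lines of Python. This is a minimal illustrative example; the function names and the three-sigma threshold are assumptions for the sketch, not PTC's ThingWorx API:

```python
from statistics import mean, stdev

def learn_baseline(readings):
    # Learn nominal operating parameters from an initial run of sensor data.
    return mean(readings), stdev(readings)

def is_anomaly(value, baseline, k=3.0):
    # Flag any reading more than k standard deviations from the nominal mean.
    mu, sigma = baseline
    return abs(value - mu) > k * sigma

# Example: a vibration sensor that ran nominally around 0.5 g for a few hours
nominal = [0.48, 0.51, 0.49, 0.52, 0.50, 0.47, 0.53, 0.49]
baseline = learn_baseline(nominal)

print(is_anomaly(0.50, baseline))  # typical reading -> False
print(is_anomaly(1.40, baseline))  # drifting well off nominal -> True
```

The point of the sketch is that no hand-built, per-asset model is needed at this level: the baseline is learned automatically, and data scientists only enter the picture for the cross-facility, long-timespan analysis described above.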

The power of partnership

Frank: To echo what Phil just said, in HPE we're talking about the tri-hybrid architecture -- the edge, so let’s say close to the things; the data center; and then the cloud, which would be a data center that you don’t know where it is. It's kind of these three dimensions.

The great thing partnering with PTC is that the ThingWorx platform, the same platform, can run in any of those three locations. That’s the beauty of our HPE Edgeline architecture. You don't need to modify anything. The same thing works, whether we're in the cloud, in the data center, or on the Edgeline.

To your point about the data scientists, it's time-to-insight. There are things you want to do immediately, and as Phil pointed out, the notion of anomaly detection that we're demonstrating on the show floor is understanding those nominal parameters after a few hours of running your thing, and simply detecting something going off normal. That doesn't require data scientists. That takes us into the ThingWorx platform.
But then, to the industrial processes, we're involving systems integration partners and using our own knowledge to bring to the mix along with our customers, because they own the intelligence of their data. That’s where it creates a very powerful solution.

Gardner: I suppose another benefit that the IT organization can bring to this is process automation and extension. If you're able to understand what's going on in the device, not only would you need to think about how to fix that device at the right time -- not too soon, not too late -- but you might want to look into the inventory of the part, or you might want to extend it to the supply chain if that inventory is missing, or you might want to analyze the correct way to get that part at the lowest price or under the RFP process. Are we starting to also see IT as a systems integrator or in a process integrator role so that the efficiency can extend deeply into the entire business process?

McRell: It's interesting to see how this stuff plays out. Once you start to understand in your facility -- or maybe it’s not your facility, maybe you are servicing someone's facility -- what kind of inventory should you have on hand, what should you have globally in a multi-tier, multi-echelon system, it opens up a lot of possibilities.

Today PTC provides a lot of network visibility, a lot of spare-parts inventory, management, and systems, but there's a limit to what these algorithms can do. They're really the best that’s possible at this point, except when you now have everything connected. That feedback loop allows you to modify all your expectations in real time, get things on the move proactively so the right person and parts, process, kit, all show up at the right time.

Then, you have augmented reality and other tools, so that maybe somebody hasn't done this service procedure before, maybe they've never seen these parts before, but they have a guided walk-through and have everything showing up all nice and neat the day of, without anybody having to actually figure that out. That's a big set of improvements that can really change the economics of how these facilities run.

Connecting the data

Gardner: Any other thoughts on process integration?

Frank: Again, the premise behind industrial IoT is indeed, as you're pointing out, connecting the consumer, the supplier, and the manufacturer. That’s why you have also the emergence of a low-power communication layer, like LoRa or Sigfox, that really can bring these millions of connected devices together and inject them into the systems that we're creating.

Hill: Just from the conversation, I know that we’re all really passionate about this. IoT and the industrial IoT is really just a great topic for us. It's so much bigger than what we're talking about. You've talked a little bit about security, you have asked us about the cloud, you have asked us about the integration of the inventory and to the production side, and it is so much bigger than what we are talking about now.

We probably could have twice this long of a conversation on any one of these topics and still never get halfway to the end of it. It's a really exciting place to be right now. And the really interesting thing that I think all of us are now realizing -- the way that we have made advancements as a partnership as well -- is that you don't know what you don't know. A lot of companies are waking up to that as well, and we're using our collaborations to allow us to know what we don’t know.

Frank: Which is why speed is so important. We can theorize and spend a lot of time in R&D, but the reality is, bring those systems to our customers, and we learn new use cases and new ways to make the technology advance.

Hill: The way that technology has gone, no one releases a product anymore that's the finished piece, one that is going to stay there for 20 or 30 years. That’s not what happens. Products and services are being provided that get constantly updated. How many times a week does your phone update a piece of firmware or an app? You have to be able to change and take the data that you get to adjust everything that’s going on. Otherwise you will not stay ahead of the market.

And that’s exactly what Phil described earlier when he was talking about whether you sell a product or a service that goes alongside a set of products. For me, one of the biggest things is that constant innovation -- where we are going. And we've changed. We were in kind of a linear motion of progression. In the last little while, we've seen a huge amount of exponential growth in these areas.

We had a video at the end of the London HPE Discover keynote, one of HPE’s pieces on what the future could be. We looked at it and thought it was quite funny. There was an automated suitcase that would follow you after you left the airport. I started to laugh at that, but then I took a second and realized that maybe it’s not as ridiculous as it sounds, because we as humans think linearly; that’s just how we're built. But if the technology is changing in an exponential way, that means we cannot ignore even the most ridiculous ideas that are out there, because that’s what’s going to change the industry.

And even by having that video there and by seeing what PTC is doing with the development that they have and what we ourselves are doing in trying out different industries and different applications, we see three companies that are constantly looking through what might happen next and are ready to pounce on that to take advantage of it, each with their own expertise.

Gardner: We're just about out of time, but I'd like to hear a couple of ridiculous examples -- pushing the envelope of what we can do with these sorts of technologies now. We don’t have much time, so less than a minute each, if you can each come up perhaps with one example, named or unnamed, that might have seemed ridiculous at the time, but in hindsight has proven to be quite beneficial and been productive. Phil?

McRell: You can do this as engineering with us, you can do this in service, but we've been talking a lot about manufacturing. In a manufacturing journey, the opportunity, as Gavin and Olivier are describing here, is at the level of what happened between pre- and post-electricity. How fast things will run, the quality at which they will produce products, and then therefore the business model that now you can have because of that capability. These are profound changes. You will see up-times in some of the largest factories in the world go up double digits. You will see lines run multiple times faster over time.

These are things that, if you walked into some of the hardest-running facilities today and then again in a couple of years, it would be really hard to believe what your eyes are seeing, just as somebody who was around before factories had electricity would be astounded by what they see today.

Back to the Future

Gardner: One of the biggest issues at the most macro level in economics is the fact that productivity has plateaued for the past 10 or 15 years. People want to get back to what productivity was -- 3 or 4 percent a year. This sounds like it might be a big part of getting there. Olivier, an example?

Frank: Well, an example would be the impact on mankind -- wealth for humanity. Think about it: with those technologies combined with 3D printing, you can have a new class of manufacturers anywhere in the world -- in Africa, for example -- designing with real-time engineering and some of the concepts we're demonstrating today.

Another part of PTC is Computer-Aided Design (CAD) systems and Product Lifecycle Management (PLM), and we're showing real-time engineering on the floor again. You design those products and do quick prototyping with your 3D printing -- that could be anywhere in the world. And you have your users testing the real thing, understanding whether your engineering choices were relevant and whether there are differences between the digital model and the physical model -- this digital-twin idea.

Then, you're back to the drawing board. So, a new class of manufacturers that we don’t even know yet, serving customers across the world and creating wealth in areas that are not yet industrialized.

Gardner: It's interesting that if you have a 3D printer you might not need to worry about inventory or supply chain.

Hill: Just to add on that one point, the bit that really, really excites me about where we are with technology, as a whole, not even just within the collaboration, you have 3D printing, you have the availability of open software. We all provide very software-centric products, stuff that you can adjust yourself, and that is the way of the future.

That means that among the changes that we see in the manufacturing industry, the next great idea could come from someone who has been in the production plant for 20 years, or it could come from Phil who works in the bank down the road, because at a really good price point, he has the access to that technology, and that is one of the coolest things that I can think about right now.

Where we've seen this sort of development and use of these technologies make a massive difference, look at someone like Duke Energy in the US. We worked with them before we realized where our capabilities were, never mind how we could implement a great solution with PTC and with HPE. Even there, based on our own technology, the people on the power-generation side of things, with some legacy equipment, decided to try this sort of application -- predictive maintenance to be able to see what’s going on in their assets, which are spread across the continent.

They began this at the start of 2013 and have seen savings estimated at $50 million up to this point. That’s a number.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

6 Steps for Planning Your Big Data Strategy


Big data is only getting bigger. Last year we collected more information than in the whole of human history before that. Even more frightening, that data is doubling every year. That’s a lot of data!

That means two things:

1. Ever more can be understood on the basis of big data.

2. The longer you wait to join the big data game, the further ahead everybody else will be.

And when I say ‘everybody’ I really do mean ‘everybody’. From logistics to healthcare to finance, from multinationals to small businesses, big data is part of their strategy, informing the decisions they’re making.

Therefore, isn’t it about time that you joined in as well?

‘Of course,’ the image of my audience in my mind grumbles, ‘but how do I do that?’

Get the right team together

The first thing to realize is that as big data is still quite new, there isn’t yet anything like a ‘big data’ person. What’s more, it isn’t just some IT program. Instead, it’s a business strategy. For that reason, you need to make sure that your team has all the necessary skills to actually make the best of ...


Read More on Datafloq
(Guest Post) Value And Insights: Yves Mulkers ahead of Big Data World 2017


 John Bensalhia talks to Yves Mulkers, freelance Data Architect and blogger at 7wData, about the benefits, developments and challenges linked with Big Data...


“I'm an explorer on new technologies and Data Visualisation, and keep my finger on what's happening with Big Data from an architecture point of view.”

So says Yves Mulkers, freelance Data Architect and social media influencer. Yves is speaking ahead of upcoming Big Data World event in London, where he will make an appearance. Listing the key benefits of what Big Data can offer, Yves says that these are:

“Scalability, cost reduction, new products and revenue streams, tailored solutions and targeting, enterprise wide insights, and Smart cities.”

Having worked as a software developer in various industries, Yves acquired deep expertise in object-oriented thinking and development.
“Doing the full cycle of software development from analysis, implementation, support and project management in combination with a strong empathy, he positioned himself as a technical expert bridging and listening into the needs of the business and end-users.” 

Yves says that this past year has seen a number of breakthroughs in the development of Big Data such as:
“Integrated platforms, data preparation automation, automating automation, GPU and in-memory databases, Artificial Intelligence, micro services, IoT (Internet Of Things), and self-service analytics.”

Big Data can be used to create a competitive advantage in various ways for businesses. In addition to a 360-degree customer view and narrower segmentation of customers, Yves says that next-generation products, real-time customization, and business models based on data products are the new approaches. Better-informed decisions, such as the measurement of consumer sentiment, are also good gauges of the value that Big Data can bring.

Businesses must consider a variety of aspects in order to ensure successful Data implementation. Yves says that businesses must have clear business processes and information state diagrams, and should also ensure that they are on top of their game with respect to training and documentation. Data standards must also be developed and complied with.

For applying data analytics and applications in a business, Yves explains that there are challenges to tackle:
“Creating value from your data products, finding the right talent and tools, maturity of the organisation in information management, and trusting the results of analytics. It's worth noting that Big Data and analytics are not the same as business intelligence.”

In the next five to 10 years, Yves says that:
“Big Data will become the business intelligence of now.”

In addition to businesses and companies, aspects of Big Data will be for everyone to take advantage of:
“Big Data will be embedded in companies’ strategy, and analytics will become available to everyone.”
“Data volumes will keep on growing as data products will become a commodity and improve our quality of life.”

Looking ahead to the event, Yves says that he expects it to bring a lot of value and insights.
“The combination with the sidetracks around Cloud and others, will bring a broader view on the complete architecture (business, technical and data) needed to be successful in Big Data implementations.”

Computer Vision: Picturing The Future Of Retail


It’s no big secret that we live in a consumer-driven society.

The buyer is king and retailers have to keep up with current trends through investing time and money in data-informed insights to shape their core business strategies. 

Big Data will always play a part in influencing how retailers operate and market their products and brands. Advancing deep learning algorithms are increasingly being used to power the development of brands and products that consumers want with greater knowledge on the individual, personal buying experience.  

Emotion plays a huge part in marketing and brand building

But, it’s not all stats and figures anymore. The future of retail could be led by computer vision and mixed reality technologies to transform the way we shop and interact with our favourite brands. Emotion plays a huge part in marketing and brand building, so retailers are now looking at innovative technologies that push the boundaries of interaction and emotion in the buying experience.

There’s been a lot of buzz around Oculus Rift, Samsung Gear, and PlayStation VR recently, but we’re also seeing start-ups breaking onto the scene with more retail-led applications for computer vision and mixed reality tech.

Outdoor giants The North Face have used Virtual Reality as a means ...


Read More on Datafloq
How Virtual Reality Has Taken Over The World

How Virtual Reality Has Taken Over The World

Mobile app developers are now shifting towards virtual reality app development. Apps are being developed for different functions but the hottest trend now is virtual reality apps. This is because virtual reality, as a technology, has been adopted by different industries. And it is being adopted by more industries. Most app developers now work in conjunction with virtual reality engineers.

Instead of the conventional building plans created on cardboard, architects now use virtual reality to present their plans. This gives a 3D effect to the structure, and when the plan is viewed with the appropriate equipment, it feels exactly like the real thing. It cannot be compared to a conventional cardboard plan; in fact, most clients no longer accept cardboard plans. The major advantage of this technology is that it can present a building that is yet to be built in its completed form, just as if it had been photographed with a camera.

Virtual reality apps are also used in the medical field by surgeons. They use them to view the exact locations of some tumors. With the technology, they know the best way to make incisions and remove the tumors. Virtual reality displays the interior of ...


Read More on Datafloq
The Data Doctrine

The Data Doctrine

Message: Thank you for signing The Data Doctrine!

What a fantastic moment. I’ve just signed The Data Doctrine. What is the Data Doctrine? In a philosophy similar to the Agile Manifesto, it offers us data geeks a data-centric culture:

Value Data Programmes1 Preceding Software Projects
Value Stable Data Structures Preceding Stable Code
Value Shared Data Preceding Completed Software
Value Reusable Data Preceding Reusable Code

While reading the Data Doctrine, I found myself looking back at all the lost options and possibilities in data warehouse projects caused by companies, project teams, or even individuals ignoring the value of data and incurring the consequences. I saw it in data warehouse projects struggling with the lack of stable data structures in source systems as well as in the data warehouse itself; in fancy new systems where no one cared about which data was generated, what it contained, and how it came to be; and, even worse for a data warehouse project, in the practice of keeping data locked away, with access limited to a few departmental fiefdoms.
None of this is the way to get value out of corporate data and to leverage it for value creation.

As I advocate flexible, lean, and easily extendable data warehouse principles and practices, I support the idea of The Data Doctrine to evolve the understanding of the need for data architecture as well as for data-centric principles.

So long,
Dirk

1 To emphasize the point, we (the authors of The Data Doctrine) use the British spelling “programme” to reinforce the difference between a data programme, which is a set of structured activities, and a software program, which is a set of instructions that tell a computer what to do (Wikipedia, 2016).

How Big Data Can Enhance Your Storytelling Abilities in Content Marketing

How Big Data Can Enhance Your Storytelling Abilities in Content Marketing

Each day, 2.5 quintillion bytes of big data are created. More than likely, you’re familiar with the concepts of content marketing and big data. But, you might not know exactly how they can work together to create the best possible output.

A common practice a lot of marketers fall into is judging their content purely based on how many pairs of eyes it attracts. In actuality, branded material needs to focus on drawing in the people who are most likely to buy. More often than not, you are only selling to a specific subset of the population that is exposed to your messaging.

While the primary goal of any content marketing strategy is to spark interest in your product or service, randomly creating material without data-driven reasoning is a shot in the dark. Effective storytelling is all about being in the right place at the right time. Let’s talk about the profound impact big data has on your ability to do so.

Precise Targeting

Perhaps the biggest advantage of big data is that you can conduct in-depth research and craft your content according to popular demand. There are many ways to mine information across different channels to ...


Read More on Datafloq
Are Your Social Media Efforts Paying Off? | Simplilearn webinar starts 13-04-2017 11:30

Are Your Social Media Efforts Paying Off? | Simplilearn webinar starts 13-04-2017 11:30

In this webinar, digital marketing expert and social media influencer Lilach Bullock, tells you everything you need to know about how to measure your social media success.  Find out what makes a social media campaign successful and discover the best ways to measure your results. By taking part, you will learn: - How to set up goals for your so...Read More.
4 Trends That are Shaping the Future for Data Centres in India

4 Trends That are Shaping the Future for Data Centres in India

Over the last decade, the concept of data center services has changed drastically. The advent of pioneering trends, such as big data, cloud storage, and wearable technology, has opened up newer avenues for businesses to scale their operations by modifying their existing data centers. Subsequently, organizations are migrating from in-house data centers to public cloud and leased co-location facilities. In addition, there have been other breakthroughs in data center planning and structuring that are making waves today, four of which are discussed in this blog post. Take a look.

1. Hyperscaling

Data centers in India today are hosting IT infrastructure for several e-commerce businesses. Subsequently, they are evolving to incorporate hyperscaling to handle increased online traffic and reduce network outages. Migrating infrastructure onto the cloud allows businesses to keep their networks free from bandwidth crunches and enables a smoother customer experience online when streaming videos and all forms of static information.

2. Hybrid Internal Clouds

Prominent organizations are also shifting focus towards hosting hybrid internal clouds to assimilate ever-increasing network traffic into their operations. A hybrid cloud infrastructure connects two separate clouds (one internal and one external), allowing seamless transfer of data and applications using identity-based user access control. Such cloud ...


Read More on Datafloq
20 Awesome Websites For Collecting Big Data

20 Awesome Websites For Collecting Big Data

Big data is a big deal. That’s why so many companies are working to deploy analytical systems that can track and collect the data they need — if they haven’t already. With it, you can learn a lot about customer behaviors, habits and tendencies, your own products and services, and much more. It can also provide insights into the future, like how to tailor specific marketing campaigns or what new products you should launch.

Between 2011 and 2013, more data was created than in the entire prior history of the human race. And that was years ago. It has exploded even more since then. By the year 2020, there will be an estimated 44 trillion gigabytes.
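
To make that headline figure easier to grasp, a quick conversion (assuming decimal SI units, as storage vendors use) shows that 44 trillion gigabytes is 44 zettabytes:

```python
# Convert 44 trillion gigabytes to zettabytes using decimal (SI) units.
GIGABYTE = 10**9    # bytes
ZETTABYTE = 10**21  # bytes

total_bytes = 44 * 10**12 * GIGABYTE  # 44 trillion gigabytes
in_zettabytes = total_bytes / ZETTABYTE
print(in_zettabytes)  # 44.0
```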

As valuable as this information can be, not everyone has the capacity to collect and access — or even analyze — this much information. What’s the solution if you don’t have access to a system that can facilitate the data for you? What if you don’t have access to data banks or databases? Where can you go? Where can small businesses get the information they need?

Believe it or not, there are many websites on the internet you can use to reference and collect ...


Read More on Datafloq
From Buzz to Brass Tacks: Data-backed Strategies to Improve Sales Performance

From Buzz to Brass Tacks: Data-backed Strategies to Improve Sales Performance

No sales team would pass up an opportunity to improve its overall productivity and profitability. Most also realize that big data and analytics are game changers. The question isn’t whether to harness the power of big data and analytics. It’s deciding what provides the most value to individual salespeople and the entire organization.

While the majority of today's organizations may recognize the potential of big data, most struggle with how to properly apply the skills of data scientists:

Most businesses that do hire data scientists lack a clear idea of how to effectively utilize their skills. Most data scientists are stuck in maintenance mode, organizing and collating data rather than actually analyzing it.

Below are several practical applications of big data that can be used to boost sales performance in an organization.

Real-time Performance Visibility and Optimization

Companies that consistently post record sales quarter after quarter have learned how to close the gap between CRM and the real-time performance of sales teams. One reason is that big data gives management the visibility to see more completely what’s going on with the entire sales team and their processes.

Sales platforms driven by big data give managers more information to work with when it comes to coaching their ...


Read More on Datafloq
How Do Leading Businesses Protect Their Data Centers?

How Do Leading Businesses Protect Their Data Centers?

As more and more businesses invest in data centers to efficiently store and manage company data, it’s becoming increasingly clear that we need to have a discussion about data center integrity and what can be done to mitigate the risks that could potentially come into play at any given moment.

5 Ways to Protect a Data Center

A data center isn’t something to be taken lightly. Whether you design your own, purchase a pre-fabricated data center, or lease one, advanced protection is an absolute must. Here are a few things to be thinking about:

1. Control Physical Access

While you could argue that controlling physical access is second to securing network and system access (in terms of priorities), it’s a good idea to start with the physical side of things. That’s because controlling physical access to your data center is a much easier challenge. It lets you get a small “win” from the start and gives you one less thing to worry about.

A common strategy is to look at your data center in terms of zones. IT pro Rutrell Yasin likes the idea of using three zones. As he explains, “One zone would be for researchers to test and stage equipment, one would provide ...


Read More on Datafloq
The IoT-Connected Car of Today— Cases From Hertz, Nokia, NTT, Mojio & Concur Technologies

The IoT-Connected Car of Today— Cases From Hertz, Nokia, NTT, Mojio & Concur Technologies

Imagine a world where your car not only drives itself, but also says intelligent things like these:


A hotel is just around the corner and you have been driving for eight hours. Would you like to reserve a room and take rest for a couple of hours?
You last serviced the brakes twelve months ago and you have driven about 20,000 miles since then. Would you like me to find a dealer and book an appointment?


This would have seemed an impossibility about five years ago, when the world was unaware of a technology called the Internet of Things (IoT). But today, the IoT is already breaking fresh ground for tech companies and car manufacturers, enabling them to realize their idea of a ‘connected car.’

I recently attended Mobile World Congress (#MWC17) in Barcelona where SAP announced its collaboration with Hertz, Nokia and Concur Technologies. The purpose of this new partnership is to leverage IoT to offer an intelligent, automated experience to car users. SAP also announced its collaboration with Mojio, the connected vehicle platform and app provider for T-Mobile USA and Deutsche Telekom. The integration of Mojio’s cloud computing capabilities with SAP Vehicles Network will make parking and fueling process a ...


Read More on Datafloq
A Big Market For Big Data: An Outlook on 2017

A Big Market For Big Data: An Outlook on 2017

You might think that the big data revolution is already in full swing. You’d be wrong. It can take decades for a big technological revolution to truly make its impact felt. And before you doubt it, big data is a technological revolution. It is already changing the landscape in fields like real estate and health care. That will only accelerate as new uses are found, ideas are implemented, and technologies that make use of it and allow it to excel are rolled out (think the Internet of Things).

For that reason, you have to stay aware of what’s going on and what’s on the horizon. That is where the real promise lies. Therefore, we’re going to look at the outlook for 2017, so that you can inform yourself and make the right decisions about your big data policies.

Machine parts for big data

Just as interchangeable machine parts revolutionized manufacturing, allowing one part to be replaced by another and small businesses to buy parts they knew to be compatible to construct finished products, so big data will experience a standardization revolution. More and more companies will come online with ready-to-use products that businesses can plug and play.

In ...


Read More on Datafloq
AI and Speech Recognition: A Primer for Chatbots

AI and Speech Recognition: A Primer for Chatbots

Conversational User Interfaces (CUI) are at the heart of the current wave of AI development. Although many applications and products out there are simply “Mechanical Turks” — which means machines that pretend to be automated while a hidden person is actually doing all the work — there have been many interesting advancements in speech recognition from both symbolic and statistical learning approaches.

In particular, deep learning is drastically augmenting the abilities of the bots with respect to traditional NLP (i.e., bag-of-words clustering, TF-IDF, etc.) and is creating the concept of “conversation-as-a-platform”, which is disrupting the apps market.
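
To make the contrast concrete, here is a minimal, self-contained sketch of the TF-IDF weighting the author mentions: raw term frequency times log inverse document frequency. (Real toolkits such as scikit-learn add smoothing and normalization on top of this.)

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.
    TF is the raw term count in a document; IDF is log(N / df_t),
    where df_t is the number of documents containing term t."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return weights

docs = [["book", "a", "hotel"],
        ["service", "the", "brakes"],
        ["book", "a", "table"]]
w = tf_idf(docs)
# Rare terms ("hotel") get higher weight than common ones ("book").
```

The limitation deep learning addresses is visible even here: TF-IDF treats every term independently, so it cannot capture word order or meaning across a conversation.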

Our smartphone currently represents the most expensive real estate we buy per square centimeter (even more expensive than the square-meter price of houses in Beverly Hills), and it is not hard to envision that having a bot as the unique interface will make this area worth almost zero.

None of this would be possible, though, without heavy investment in speech recognition research. Deep Reinforcement Learning (DRL) has been the boss in town for the past few years, and it has been fed by human feedback. However, I personally believe that soon we will move toward B2B (bot-to-bot) training for a very simple reason: the reward structure. Humans spend time training their bots if ...


Read More on Datafloq
3 Major Challenges Facing the Future of IoT

3 Major Challenges Facing the Future of IoT

The Internet of Things (IoT) phenomenon—ubiquitous connected things providing key physical data and further processing of that data in the cloud to deliver business insights— presents a huge opportunity for many players in all businesses and industries. Many companies are organizing themselves to focus on IoT and the connectivity of their future products and services.

IoT Challenges

For the IoT industry to thrive, there are three categories of challenges to overcome, and this is true for any new trend in technology, not only IoT:

Figure 1: IoT Challenges


Technology
Business  
Society


Technology

This part covers all the technologies needed to make IoT systems function smoothly, whether as standalone solutions or as part of existing systems. The Cloud Security Alliance (CSA) listed some of the root causes of such technological challenges:


Many IoT systems are poorly designed and implemented, using diverse protocols and technologies that create complex configurations
Lack of mature IoT technologies and business processes
Limited guidance for life cycle maintenance and management of IoT devices
Limited best practices available for IoT developers
Lack of standards for authentication and authorization of IoT edge devices
No best practices for IoT-based incident response activities
Audit and logging standards are not defined for IoT components
Restricted interfaces available for IoT devices to interact with security ...


Read More on Datafloq
How you Can Use Analytics and Data Presentation to Revolutionise your Business

How you Can Use Analytics and Data Presentation to Revolutionise your Business

To handle the growing pile of data, modern businesses have embraced different business analytics tools and technologies. Such technologies have given certain enterprises a competitive advantage over the rest. In a world where entrepreneurs, corporate managers, and key business stakeholders are becoming analytical, nothing can be taken for granted when it comes to data management and presentation.

So much to gain

For business owners who appreciate the need for data analysis, the functional role of business analytics cannot be overemphasized. For those who are yet to reap the full benefits, it is worth learning how critical analytics can be in taking your business to the next level. Coupled with artificial intelligence and data techniques, analytics can come in handy for mining information that might not be available by other means. With a proper approach to analytics, businesses can unearth potential clients and narrow down their list of targeted customers. To reap the maximum benefits of analytics, the necessary measures have to be in place.

The Numbers that Matter

In business, analytics isn’t all about gathering and assessing information on everything you lay your hands on. You need to know the figures that count and the statistics that are worth your efforts. For someone ...


Read More on Datafloq
How IoT Could Affect Our Daily Routine

How IoT Could Affect Our Daily Routine

The Internet of Things may very well have been named when someone was at a loss for words. But whatever the name of the new wave of tech, it appears that IoT is on everyone's radar. The Internet of Things refers to the connectivity of all the elements involved in everyday life. Think refrigerators, coffee machines, and maybe even the oil burner.

Just like TVs now come pre-built with connections for computer cables and various audio, video and game players, your household items may soon come with the capability to speak to the internet. There are a number of repercussions to this. Life will certainly be even easier with more automated systems. There are almost too many time and labor saving devices in existence to list. Consider simple items like dishwashers or windshield wipers up to more complex ones like those that notify us of severe weather or satellites that detect space anomalies. All these things were once left to human eyes and hands.

On the other hand, the internet is far more complex than the direct link between your car's dash and the windshield wipers. Privacy is continually touted as the concern of the future. An item that has been pre-programmed ...


Read More on Datafloq
SAP Leonardo, SAP’s IoT Platform Now Has a Name: Interview with SAP’s Rakesh Gandhi

SAP Leonardo, SAP’s IoT Platform Now Has a Name: Interview with SAP’s Rakesh Gandhi

As the “Internet of Things (IoT)” market becomes less hype and more reality, German software powerhouse SAP is aiming to move fast with important economical and research investments, aiming to become a leader in the IoT field.

One key move is the recent announcement of SAP’s Leonardo Innovation Portfolio, a comprehensive solution offering to enable organizations to plan, design, and deploy IoT solutions.

Of course, with these announcements we felt compelled to reach out to SAP and learn, in their own words, the details of this new IoT portfolio.

As a result, we had the opportunity to speak with Rakesh Gandhi, Vice President for IoT GTM & Solutions at SAP America.

Rakesh is an innovation enthusiast and IoT evangelist, currently responsible for the SAP Leonardo portfolio for IoT innovation GTM and Solutions Management. A 12-year veteran at SAP, Rakesh has been involved in incubating new innovations in Mobile, Cloud for Customer, CEC, and now IoT.


Thank you Mr. Gandhi:

Last year SAP announced an ambitious €2 billion investment plan to help companies and government agencies develop their IoT and Big Data initiatives. Could you share with us some details about this program and what it involves in a general sense?

IoT is one of the key pillars of SAP’s strategy to enable customers’ digital transformation journeys. Over the past several years, SAP has been developing its IoT portfolio working closely with our customers. The recent announcement of the SAP Leonardo brand is a continuation of SAP’s commitment and plans in the following key areas:

  • Accelerate innovation of the IoT solution portfolio, both organically and inorganically through acquisitions.
  • Create awareness of SAP’s IoT innovations that empower customers to run live businesses with smart processes across all lines of business and re-invent business models.
  • Drive customer adoption; scale service, support and co-innovation; and
  • Most importantly, grow its ecosystem of partners and startups in the IoT market.

To date, the key announcements include:

Key acquisitions such as:

  • Fedem: With this acquisition SAP can now build an end-to-end IoT solution in which a digital avatar continuously represents the state of operating assets through feeds from sensors, replacing the need for physical inspection with a “digital inspection.” Additionally, the solution is intended to consider complex forces in play and detect both instantaneous consequences of one-off events and long-term health effects of cyclic loads, making possible accurate monitoring of maintenance requirements and remaining-life prediction for assets.
  • Plat.one: This acquisition helped provide expertise and technology to accelerate the availability of key IoT capabilities in SAP HANA Cloud Platform, such as advanced lifecycle management for IoT devices, broad device connectivity, strong IoT edge capabilities that work seamlessly with a cloud back end, end-to-end role-based security and rapid development tools for IoT applications.
  • Altiscale: This acquisition is helping our customers create business value by harnessing the power of Big Data generated by the connected world.
The launch of the SAP Leonardo brand for the IoT innovation portfolio: This was a major step in announcing our brand for IoT-driven innovation.

SAP Leonardo jumpstart program: This is a major step in our commitment to help our customers drive adoption and rapidly deploy core IoT applications in a short time frame of three months, with fixed scope and price.

Partner ecosystems are critical to our success; we are working closely with partners to create an ecosystem that our customers can leverage to further simplify their deployment projects.

Additionally, SAP is on track in opening up IoT labs to collaborate on Industry 4.0 and IoT innovations with our customers, partners and startups.

Can you share with us some of the details of the new enablement program as well as the general features of the Leonardo IoT Portfolio?

What we are observing in the marketplace is that many organizations are starting with small experimental IoT projects, or may have started to collect and store sensor data with some visualization capabilities.

However, it is still generally believed that IoT as a topic is very low on the maturity curve. SAP now has a very robust portfolio, which has been co-innovated with our early-adopter customers and proven to deliver business value.

The second challenge is the general perception among customers that IoT is still in the hype phase and difficult to deploy, so we decided it is very important for SAP to support our customers’ adoption and show that they can go productively live in a short time frame for a first pilot.

This jumpstart program supports three scenarios as three distinct packages:

  • Vehicle Insights for fleet telematics,
  • Predictive Maintenance & Service with Asset Intelligence Network for connected assets
  • Connected Goods for scenarios such as connected coolers, connected vending machines, and similar mass-market things.
Customers can now deploy one of these scenarios in a three-month timeframe. It is a very structured three-step process: first, SAP teams work with the customer, leveraging a half-day design-thinking workshop, to agree on the pilot deployment scope; second, they deliver a rapid prototype to demonstrate the vision and get customer buy-in.

In the final step, towards the end of the three-month engagement, they deliver a productive pilot system.

Lastly, SAP will continue to engage with customers to help with their IoT roadmap for next processes and business case.

It seems natural to assume SAP has already started working to support IoT projects in key industries and/or lines of business. Could you talk about some of these industry/LoB efforts?

SAP Leonardo IoT innovation portfolio powers digital processes across line of businesses and Industry.

As an example, we have released a new value map [here] for supply chain processes, now referred to as the digital supply chain, which is powered by the SAP Leonardo IoT innovation portfolio.

The same is applicable to other LoBs, e.g., customer service processes to enable predictive and proactive maintenance, and also to industry-specific end-to-end solutions powered by SAP Leonardo, e.g., SAP Connected Goods for the CPG and retail industries.

Is this program designed mostly for SAP’s existing partners and customers? How can non-SAP customers take advantage of it?

The jumpstart program is designed to support all our customers, both existing customers and net-new prospects as well.

This mirrors how the SAP Leonardo portfolio of IoT solutions is designed to work with SAP or non-SAP backends; it is agnostic in that regard.

Finally, what are the technical and/or business requirements for applicants of this program?

As mentioned above, the SAP Leonardo jumpstart program is initially offered for three packages: SAP Vehicle Insights, SAP Connected Goods, and SAP Predictive Maintenance & Service plus Asset Intelligence Network.

These are cloud solutions, and the use cases covered by each of these packages are applicable across multiple industries.

Thank you again Mr. Gandhi!

You can learn more about SAP Leonardo by visiting its website and/or reading this post by Hans Thalbauer.
In the meantime, you can take a look at the video introduction produced by SAP.




What’s in a Name; 7 Blockchain Benefits for the Finance Industry

What’s in a Name; 7 Blockchain Benefits for the Finance Industry

A few days ago, The Merkle ran a story that R3CEV, the largest blockchain consortium of banks and technology firms, admitted that the technology they are developing does not use a blockchain, and as such admitted defeat. A day before that article, R3CEV published a story about when a blockchain is not a blockchain, explaining that what the R3 partnership is developing is actually not a blockchain but an open-source distributed ledger technology (Corda). As R3CEV explains in the article, it is “heavily inspired by and captures the benefits of blockchain systems, but with design choices that make it able to meet the needs of regulated financial institutions”.

The distributed ledger platform that has been developed by R3CEV in collaboration with 70 global institutions from all corners of the financial services industry has a few unique settings that, according to R3CEV, makes it not a blockchain. These changes were required to satisfy regulatory, privacy and scalability concerns. As such, the platform restricts access to data within agreements to predetermined actors and the financial agreements used are smart contracts that are actually legally enforceable as they are rooted firmly in law.
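
The key design choice, restricting agreement data to predetermined actors rather than broadcasting it to every node, can be illustrated with a toy sketch. (This is not Corda's actual API; all names here are hypothetical, purely to show the access pattern.)

```python
# Toy model of a permissioned ledger entry: only named parties may read it.
class Agreement:
    def __init__(self, data, parties):
        self._data = data
        self._parties = set(parties)

    def read(self, party):
        # Unlike a public blockchain, where every node sees every
        # transaction, data here is visible only to the predetermined
        # parties to the agreement.
        if party not in self._parties:
            raise PermissionError(f"{party} is not a party to this agreement")
        return self._data

deal = Agreement({"notional": 1_000_000}, parties={"BankA", "BankB"})
print(deal.read("BankA"))  # a named party can read; any other actor is refused
```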

Whether it is a blockchain or not, or simply ...


Read More on Datafloq
The Google Analytics Data You Need for a Dynamite Content Strategy

The Google Analytics Data You Need for a Dynamite Content Strategy

Healthy and sick patients alike turn to lab tests to measure exactly what’s going on inside their bodies. Similarly, both successful and struggling companies can use data (the marketing equivalent of a lab test) to measure which of their content pieces bring the greatest health to the company.

Data has weighed heavily on the minds of marketers over the past year. Informatica blogger Myles Suer declared 2016 as “The Year of Data and Relevance.” DataOn blog speculates that “2016 will be the year in which the world will start producing more data than we can store.”

Marketers are finding more and more ways to use data to drive their strategies to success. There’s still plenty of time to bring data into your marketing efforts, particularly in regard to your content marketing strategy.

Data Reveals What the Consumer Craves

Content paired with data creates the marketer’s Holy Grail: relevance. A data-driven content strategy ensures you’re getting relevant information in front of the right audience at the right time.

When a marketer can see which of their content pieces earned the most clicks, which held the viewer’s attention the longest, and which sent consumers to sales pages most often, they can streamline content strategy and increase content ...
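
As a concrete illustration of that streamlining, the metrics named above can be combined into a simple ranking. Everything here is hypothetical: the data, the field names, and the weights (which favor conversions) are illustrative choices, not a standard formula.

```python
# Toy example: rank content pieces by a simple engagement score
# combining clicks, average time on page, and conversions.
pieces = [
    {"title": "IoT buyer guide",   "clicks": 1200, "avg_seconds": 95,  "conversions": 40},
    {"title": "Brand story video", "clicks": 3000, "avg_seconds": 30,  "conversions": 12},
    {"title": "Data deep dive",    "clicks": 800,  "avg_seconds": 240, "conversions": 35},
]

def engagement_score(p):
    # Weight conversions most heavily; these weights are illustrative.
    return p["clicks"] * 0.1 + p["avg_seconds"] * 1.0 + p["conversions"] * 10.0

ranked = sorted(pieces, key=engagement_score, reverse=True)
```

Note how the piece with the most clicks ends up last: raw eyeballs alone are exactly the vanity metric the paragraph above warns against.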


Read More on Datafloq
(Guest Post) Winning Solutions: Kirk Borne Discusses the Big Data Concept Ahead of Big Data World London

(Guest Post) Winning Solutions: Kirk Borne Discusses the Big Data Concept Ahead of Big Data World London

Looking ahead to 2017's Big Data World event, Booz Allen Hamilton's Principal Data Scientist discusses the Big Data concept, benefits and developments in detail with John Bensalhia...




2017's Big Data World promises plenty in the way of insightful talks and discussions on the subject. One of the unmissable talks to watch out for in March will come from Kirk Borne, Booz Allen Hamilton's Principal Data Scientist, who will look at “The Self-Driving Organisation and Edge Analytics in a Smart IoT World.”


“I will describe the concept of a self-driving organisation that learns, gains actionable insights, discovers next-best move, innovates, and creates value from streaming Big Data through the application of edge analytics on ubiquitous data sources in the IoT-enriched world.”

As part of this discussion, Kirk will also present an Analytics Roadmap for the IoT-enabled Cognitive Organisation.

“In this case, the “self-driving organisation” is modeled after the self-driving automobile, but applicable organisations include individual organisations, and also smart cities, smart farms, smart manufacturing, and smart X (where X can be anything). The critical technologies include machine learning, machine intelligence, embedded sensors, streaming analytics, and intelligence deployed at the edge of the network.”
“Big Data and data science are expanding beyond the boundaries of your data centre, and even beyond the Cloud, to the point of data collection at the point of action! We used to say “data at the speed of business”, but now we say “business at the speed of data.”

Having achieved a Ph.D. in astronomy from Caltech, Kirk focused most of the first 20 years of his career on astrophysics research (“colliding galaxies and other fun stuff”), including a lot of data analysis as well as modelling and simulation.

“My day job for nearly 18 years was supporting large data systems for NASA astronomy missions, including the Hubble Space Telescope. So, I was working around data all of the time.”
“When data set sizes began to grow “astronomically” in the late 1990s, I began to focus more on data mining research and data science. It became apparent to me that the whole world (and every organisation) was experiencing large growth in digital data. From these observations, I was convinced that we needed to train the next-generation workforce in data skills. So, in 2003, I left my NASA job and joined the faculty at George Mason University (GMU) within the graduate Ph.D. program in Computational Science and Informatics (Data Science).”

As a Professor of Astrophysics and Computational Science at GMU, Kirk helped to create the world’s first Data Science undergraduate degree program.

“I taught and advised students in data science until 2015, at which point the management consulting firm Booz Allen Hamilton (BAH) offered me the position as the firm’s first Principal Data Scientist. I have been working at BAH since then.”

Booz Allen Hamilton offers management consulting services to clients in many sectors: government, industry, and non-profit. “Booz Allen Hamilton (BAH) is over 100 years old, but has reinvented itself as an agile leading-edge technology consultant,” says Kirk.

“Our market focus is very broad, including healthcare, medicine, national defense, cyber-security, law enforcement, energy, finance, transportation, professional sports, systems integration, sustainability, business management, and more. We deliver systems, technology strategy, business insights, consultative services, modelling, and support services in many technology areas: digital systems, advanced analytics, data science, Internet of Things, predictive intelligence, emerging technologies, Cloud, engineering, directed energy, unmanned aerial vehicles (drones), human capital, fraud analytics, and data for social good (plus more, I am sure).”

Discussing Big Data, Kirk regards this as a “concept”.

“It is not really about “Big” or “Data”, but it is all about value creation from your data and information assets. Of course, it is data. But the focus should be on big value, not on big volume; and the goal should be to explore and exploit all of your organisation’s data assets for actionable information and insights.”
“I like to say that the key benefits of Big Data are the three D2D’s: Data-to-Discovery (data exploration), Data-to-Decisions (data exploitation), and Data-to-Dividends (or Data-to-Dollars; i.e., data monetisation).”

Looking back over the past year, Kirk says that there have been several significant Big Data-related developments.

“These include the emergence of the citizen data scientist, which has been accompanied by a growth in self-service tools for analytics and data science. We are also seeing maturity in deep learning tools, which are now being applied in many more interesting contexts, including text analytics. Machine intelligence is also being recognised as a significant component of processes, products, and technologies across a broad spectrum of use cases: connected cars, Internet of Things, smart cities, manufacturing, supply chain, prescriptive machine maintenance, and more.”
“But I think the most notable developments are around data and machine learning ethics – this has been evoked in many discussions around privacy and fairness in algorithms, and it has been called out also in some high-profile cases of predictive modelling failures. These developments demand that we be more transparent and explanatory to our clients and to the general public about what we are doing with data, especially their data!”

Much value can be gleaned from the Smart IoT World for businesses, and in a number of ways, as Kirk explains.

“First of all, businesses can learn about the latest products, the newest ideas, and the emerging technologies. Businesses can acquire lessons learned, best practices, and key benefits, as well as find business partners to help them on this journey from digital disruption to digital transformation.”
“The “Smart” in “Smart IoT” is derived from machine learning, data science, cognitive analytics, and technologies for intelligent data understanding. More than ever, businesses need to focus more on the “I” in “IT” – the Information (i.e., the data) is now the fundamental asset, and the Technology is the enabler. IoT is about ubiquitous sensors collecting data and tracking nearly everything in your organisation: People, Processes, and Products. Smart IoT will deliver big value from Big Data.”

Kirk says that the past few years of Big Data have been described as The End of Demographics and the Age of Personalisation. The next five to ten years, on the other hand, will be the Age of Hyper-Personalisation.

“More than ever, people are at the centre of business,” explains Kirk.

“Big Data can and will be used to engage, delight, and enhance employee experience (EX), user experience (UX), and customer experience (CX). The corresponding actionable insights for each of these human experiences will come from “360 view” Big Data collection (IoT), intelligence at the point of data collection (Edge Analytics), and rich models for behavioural insights (Data Science).”
“These developments will be witnessed in Smart Cities and Smart Organisations of all kinds. The fundamental enabler for all of this is Intelligent Data Understanding: bringing Big Data assets and Data Science models together within countless dynamic data-driven application systems.”

With Big Data World only weeks away, Kirk is looking forward to the great opportunities that it will bring.

“I expect Big Data World to be an information-packed learning experience like no other. The breadth, depth, and diversity of useful Smart IoT applications that will be on display at Big Data World will change the course of existing businesses, inspire new businesses, stimulate new markets, and grow existing capabilities to make the world a better place.”
“I look forward to learning from technology leaders about Smart Cities, IoT implementations, practical business case studies, and accelerators of digital transformation. It is not true that whoever has the most data will win; the organisation that wins is the one who acts on the most data! At Big Data World, we can expect to see many such winning solutions, insights, and applications of Big Data and Smart IoT.”

4 Ways Big Data Can Boost Real Estate Business

4 Ways Big Data Can Boost Real Estate Business

Data collection has changed industries across the board so dramatically that many past models for how to run a business have become obsolete. We have brand-new information at our fingertips that changes the way companies are run, and real estate firms are no exception.

Every day, we can amass new information about properties, finances, client interactions, market performance, risks, and other details that help investors, agents, loan officers, buyers, and sellers to make better informed decisions.

Big data especially influences big decisions. Since the choice to purchase a home or start a business is one of the biggest moves a person can make, the data on which you base it should be accurate and shareable.

Key data influencers in the realty sector include:

1. Data shows the importance of using technology to sell a home

Thanks to the real-time data that’s been collected, we know that 89 percent of buyers use online tools as they search for their dream home. This information shows Realtors how critical it can be to use websites, listing services, social media, digital advertising, apps, and other technological tools to sell a home.

In addition, technology within the property influences your ability to sell it. According to research from Coldwell Banker, about ...


Read More on Datafloq
How B2B Ecosystems & (Big) Data Can Transform Sales and Marketing Practices

How B2B Ecosystems & (Big) Data Can Transform Sales and Marketing Practices

Managing your relationships with customers, suppliers, and partners and constantly improving their experience is a proven way to build a sustainable and profitable business, and contrary to popular assumption, this doesn’t apply only to B2C businesses. With 89% of B2B researchers using the internet during their research process, improved collaboration to deepen existing customer relationships and build new ones is vital to running a successful B2B business as well.

This increasing need for collaboration has led to the development of digital ecosystems. Players like Apple and Google present an interesting case for how B2C ecosystems work. Consider Apple, which is primarily a B2C tech vendor, but it has built a new smart business model that pulls together technologies from multiple domains and combines them to form a solution that wins buyers acceptance. Amazon, Facebook, and Google are working on a similar kind of business model as well.

So, considering the examples of these tech giants, one can suggest that in this era of personalized customer experiences, B2B ecosystems are no longer an add-on; instead, they have become a necessity for progressive B2B businesses. 

Offering valuable insights into customer journey, B2B ecosystems work by segmenting and targeting your audience, allowing you to ...


Read More on Datafloq
Veterans Get Ready to Become Certified PMP Professional with MSys for $0

Veterans Get Ready to Become Certified PMP Professional with MSys for $0

Today’s veterans understand a technologically driven, ever-changing, and extremely high-pressure business environment. Many veterans prefer jobs that demand productive behavior under pressure. They are unafraid to take risks, highly dependable, and constant learners in any role; they know how to handle and react to any situation. Veterans are trained to lead teams and to think under pressure.

Veterans are a perfect choice for the role of project manager, but as we all know, to be a successful project manager one must have certain important skills. We at MSys Training are offering free online classroom training on PMP (Project Management Professional) for US veterans to help them sharpen their project management skills to industry standards. During this 4-day online training, our lead PMP trainer, Michelle Halsey, will provide the complete PMP training.

To join the MSys PMP training program, veterans are not required to pay any money; they only need to submit a scanned copy of their veteran ID to prevent fraud. Registering for this free online classroom training on PMP will offer you several benefits, such as:

  • Course material at discounted price
  • Free 4 days online classroom training
  • 40% off on other online and in-person classroom training programs and 50% off on online self-learning
  • Dedicated learning consultant to help fill out the PMI-PMP® application
  • Career guidance by MSys professionals
  • You can also get a $100 referral bonus on every successful referral enrollment

If you are not a veteran and still want to join this PMP online classroom training, MSys has something for you as well! MSys is offering a flat 25% off all trainings for a limited time.

Important points that need your attention:

  • Last Date of Registration: 15th March 2017
  • Only for US veterans
  • A scanned copy of veteran identification proof must be submitted

MSys is not only offering free PMP training; it is also conducting a Lean Six Sigma Green Belt training program for veterans, so that they can use their expertise to pursue better job opportunities across industries. To confirm your registration for the free online PMP training with MSys, send a query to support@msystraining.com.

Don’t Wait! Grab the Chance Today!


Not Your Father’s Database: Interview with VoltDB’s John Piekos

Not Your Father’s Database: Interview with VoltDB’s John Piekos



As organizations deal with challenging times, both technologically and business-wise, managing increasing volumes of data has become a key to success.

As data management rapidly evolves, the main Big Data paradigm has changed from just “big” to “big, fast, reliable, and efficient”.

Now more than ever in the evolution of the big data and database markets, the pressure is on for software companies to deliver new and improved database solutions capable not just of handling increasing volumes of data but of doing so faster, better, and more reliably.

A number of companies have taken the market by storm, infusing the industry with new and spectacularly advanced database software, for both transactional and non-transactional operations, that is rapidly changing the database software landscape.

One of these companies is VoltDB. This New England (Massachusetts) based company has rapidly become a reference in next-generation database solutions and has gained the favor of important customers in key industries such as communications, finance, and gaming.

VoltDB was co-founded by none other than world-renowned database expert and 2014 ACM A.M. Turing Award recipient Dr. Michael Stonebraker, who has been key in the development of a new-generation database solution and in assembling the talented team in charge of its development.

With the new VoltDB 7.0 already in the market, we had the opportunity to chat with VoltDB’s John Piekos about VoltDB’s key features and evolution.

John is Vice President of Engineering at VoltDB, where he heads up the company’s engineering operations, including product development, QA, technical support, documentation, and field engineering.

John has more than 25 years of experience leading teams and building software, delivering both enterprise and Big Data solutions.

John has held tech leadership positions at several companies, most recently at Progress Software where he led the OpenEdge database, ObjectStore database and Orbix product lines. Previously, John was vice president of Web engineering at EasyAsk, and chief architect at Novera Software, where he led the effort to build the industry’s first Java application server.

John holds an MS in computer science from Worcester Polytechnic Institute and a BS in computer science from the University of Lowell.

Thank you John, please allow me to start with the obvious question:

What’s the idea behind VoltDB, the company, and what makes VoltDB the database, to be different from other database offerings in the market?

What if you could build a database from the ground-up, re-imagine it, re-architect it, to take advantage of modern multi-core hardware and falling RAM prices, with the goal of making it as fast as possible for heavy write use cases like OLTP and the future sensor (IoT) applications?  That was the basis of the research Dr. Stonebraker set out to investigate.

Working with the folks at MIT, Yale, and Brown, they created the H-Store project and proved out the theory that if you eliminated the overhead of traditional databases (logging, latching, buffer management, etc), ran an all in-memory workload, spread that workload across all the available CPUs on the machine and horizontally scaled that workload across multiple machines, you could get orders of magnitude performance out of the database.

The commercial realization of that effort is VoltDB.  VoltDB is fully durable, able to process hundreds of thousands to millions of multi-statement SQL transactions per second, all while producing SQL-driven real-time analytics.

Today an increasing number of emerging databases work partially or totally in-memory while existing ones are changing their design to incorporate this capability. What are in your view the most relevant features users need to look for when trying to choose from an in-memory based database?

First and foremost, users should realize that not all in-memory databases are created equal.  In short, architecture choices require trade-offs.  Some IMDBs are created to process reads (queries) faster and others, like VoltDB, are optimized for fast writes.  It is impractical (impossible) to get both the fastest writes and the fastest reads at the same time on the same data, all while maintaining high consistency because the underlying data organization and architecture is different for writes (row oriented) than it is for reads (columnar).

 It is possible to maintain two separate copies of the data, one in row format, the other in compressed column format, but that reduces the consistency level - data may not agree, or may take a while to agree between the copies.
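The write-versus-read trade-off John describes can be illustrated with a toy sketch (this is purely illustrative Python, not VoltDB's actual storage engine): a row store keeps each record together, making a multi-field write a single operation, while a column store keeps each field in its own array, making a single-column aggregate a contiguous scan at the cost of fanning every write out across columns.

```python
class RowStore:
    """Row-oriented: each record is kept together, so a multi-field
    write is a single append -- cheap, like an OLTP insert."""
    def __init__(self):
        self.rows = []

    def insert(self, record):
        self.rows.append(record)          # one operation per write

    def total(self, field):
        # Reads must touch every row and pick out one field.
        return sum(r[field] for r in self.rows)


class ColumnStore:
    """Column-oriented: each field lives in its own list, so an
    aggregate scans one contiguous column -- cheap, like OLAP."""
    def __init__(self, fields):
        self.cols = {f: [] for f in fields}

    def insert(self, record):
        # A single logical write fans out to one append per column.
        for f, v in record.items():
            self.cols[f].append(v)

    def total(self, field):
        return sum(self.cols[field])      # one contiguous scan


rows = RowStore()
cols = ColumnStore(["user", "amount"])
for rec in [{"user": "a", "amount": 10}, {"user": "b", "amount": 32}]:
    rows.insert(rec)
    cols.insert(rec)

print(rows.total("amount"), cols.total("amount"))  # 42 42
```

Keeping both layouts in sync, as the interview notes, is exactly where the consistency cost appears: every write must now succeed in two places.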

Legacy databases can be tweaked to run in memory, but realize that, short of a complete re-write, the underlying architecture may still be disk-based, and thus incur significant (needless) processing overhead.

VoltDB defines itself as an in-memory, operational database. What does this mean in the context of Big Data, and in the context of IT’s traditional separation between transactional and analytical workloads? How does VoltDB fit into, or reshape, these schemas?

VoltDB supports heavy write workloads - it is capable of ingesting never-ending streams of data at high ingestion rates (100,000+/second per machine, so a cluster of a dozen nodes can process over a million transactions a second).

While processing this workload, VoltDB can calculate (via standard SQL) and deliver strongly consistent real-time analytics, either ad hoc, or optimally, as pre-computed continuous queries via our Materialized View support.
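The "pre-computed continuous query" idea behind a materialized view can be sketched in a few lines (an illustrative simulation only; VoltDB does this inside the engine via SQL view definitions): the aggregate is maintained at write time, so an analytics read is a constant-time lookup rather than a table scan.

```python
from collections import defaultdict

events = []                                          # the base "table"
view = defaultdict(lambda: {"count": 0, "sum": 0})   # the "view"

def insert(game, score):
    """Write path: store the event AND maintain the view in the
    same logical transaction."""
    events.append((game, score))
    agg = view[game]
    agg["count"] += 1
    agg["sum"] += score

for g, s in [("chess", 5), ("chess", 7), ("go", 3)]:
    insert(g, s)

# Real-time analytics read straight from the view -- no scan needed.
print(view["chess"])   # {'count': 2, 'sum': 12}
```

The design choice is to pay a small fixed cost on every write in exchange for analytics reads that stay fast no matter how large the base table grows.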

These are capabilities simply not possible with traditional relational databases.  In the Big Data space, this places VoltDB at the front end, as the ingestion engine for feeds of data, from telco, digital ad tech, mobile, online gaming, IoT, Finance and numerous other application domains.

Just recently, with VoltDB 6.4, VoltDB passed the famous Jepsen test for the safety of distributed databases. Could you share with us some details of the test, the challenges, and the benefits it brought for VoltDB?

We have a nice landing page with this information, including Kyle’s and VoltDB’s founding engineer John Hugg’s blog.

In summary, distributed systems programming is hard. Implementing the happy path isn’t hard, but doing the correct thing (such as returning the correct answer) when things go wrong (nodes failing, networks dropping), is where most of the engineering work takes place. VoltDB prides itself on strong consistency, which means returning the correct answer at all times (or not returning an answer at all - if, for example, we don’t have all of the data available).

Kyle’s Jepsen test is one of the most stringent tests out there.  And while we hoped that VoltDB would pass on the first go-around, we knew Kyle was good at breaking databases (he’s done it to many before us!).  He found a couple of defects, thankfully finding them before any known customer found them, and we quickly went to work fixing them. Working with Kyle and eventually passing the Jepsen test was one of the 2016 engineering highlights at VoltDB. We’re quite proud of that effort.


One interesting aspect of VoltDB is that it is a relational database that complies fully with ACID and brings native SQL support. How does this design differ from, for example, NoSQL and some so-called NewSQL offerings? Advantages, tradeoffs perhaps?

In general, NoSQL offerings favor availability over consistency - specifically, the database is always available to accept new content and can always provide content when queried, even if that content is not the most recent (i.e., correct) version written.

NoSQL solutions rely on non-standard query languages (some are SQL-like), to compute analytics. Additionally, NoSQL data stores do not offer rich transaction semantics, often providing “transactionality” on single key operations only.

Not all NewSQL databases are created equal. Some favor faster reads (over fast writes). Some favor geo-distributed data sets, often resulting in high, or at least unpredictable, latency in access and update patterns. VoltDB’s focus is low and predictable OLTP (write) latency at high transactions-per-second scale, offering rich and strong transaction semantics.

Note that not all databases that claim to provide ACID transactions are equal. The most common place where ACID guarantees are weakened is isolation. VoltDB offers serializable isolation.

Other systems offer multiple levels of isolation, with a performance tradeoff between better performance (weak guarantees) and slower performance (strong guarantees). Isolation models like Read-Committed and Read-Snapshot are examples; many systems default to one of these.

VoltDB’s design trades off complex multi-dimensional (OLAP) style queries for high throughput OLTP-style transactions while maintaining an ACID multi-statement SQL programming interface. The system is capable of surviving single and multi-node failures.

Where failures force a choice between consistency and availability, VoltDB chooses consistency. The database supports transactionally rejoining failed nodes back to a surviving cluster and supports transactionally rebalancing existing data and processing to new nodes.

Real-world VoltDB applications achieve 99.9% latencies under 10ms at throughput exceeding 300,000 transactions per second on commodity Xeon-based 3-node clusters.

How about the handling of non-structured information within VoltDB? Is it expected VoltDB to take care of it or it integrates with other alternative solutions? What’s the common architectural scenario in those cases?

VoltDB supports the storage of JSON strings and can index, query and join on fields within those JSON values. Further, VoltDB can process streamed JSON data directly into the database using our Importers (See the answer for question #9) and custom formatters (custom decoding) - this makes it possible for VoltDB to transactionally process data in almost any format, and even to act as an ETL engine.
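Querying fields inside stored JSON strings can be pictured with a minimal sketch (the `field` helper below is hypothetical shorthand; VoltDB exposes the equivalent through SQL functions over VARCHAR columns):

```python
import json

# Each "row" stores a JSON-encoded document, as described above.
table = [
    '{"user": "alice", "country": "CH", "score": 10}',
    '{"user": "bob",   "country": "US", "score": 7}',
    '{"user": "carol", "country": "CH", "score": 4}',
]

def field(row, name):
    """Extract one named field from a JSON-encoded row."""
    return json.loads(row)[name]

# Roughly: SELECT user FROM table WHERE <country field> = 'CH'
swiss = [field(r, "user") for r in table if field(r, "country") == "CH"]
print(swiss)   # ['alice', 'carol']
```

In a real engine the extracted field value can also be indexed, so the filter above does not have to decode every row on every query.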

How does VoltDB interact with players in the Big Data space such as Hadoop, both open source and commercial distributions?

The VoltDB database supports directly exporting data into a downstream data lake.  This target could be Hadoop, Vertica, a JDBC source or even flat files.  VoltDB handles the real-time data storage and processing, as it is capable of transactionally ingesting (database “writes”) millions of events per second.

Typically the value of this data decreases with age - it becomes cold or stale - and eventually would be migrated to historical storage such as Hadoop, Spark, Vertica, etc.  Consider applications in the telco or online gaming space - the “hot data” may have a lifespan of one month in telco, or even one hour or less, in the case of game play.

Once the data becomes “historical” and is of less immediate value, it may be removed from VoltDB and stored on disk in the historical archive (such as Hadoop, Vertica, etc).

What capabilities VoltDB offers not just for database administration but for development on top of VoltDB with Python, R, or other languages?

While VoltDB offers traditional APIs such as JDBC, ODBC, Java and C++ native bindings, as well as Node.js, Go, Erlang, PHP, Python, etc., I think one of the more exciting next-generation features VoltDB offers is the ability to stream data directly into the database via our in-process Importers. VoltDB is a clustered database, meaning a database comprises one (1) or more processes (usually a machine, VM or container).

A database can be configured to have an “importer,” which is essentially a plug-in that listens to a source, reads incoming messages (events, perhaps) and transactionally processes them. If the VoltDB database is highly available, then the importer is highly available (surviving node failure).  VoltDB supports a Kafka Importer and a socket importer, as well as the ability to create your own custom importer.

Essentially this feature “eliminates the client application” and data can be transactionally streamed directly into VoltDB.  The data streamed can be JSON, CSV, TSV or any custom-defined format.  Further, the importer can choose which transactional behavior to apply to the incoming data.  This is how future applications will be designed: by hooking feeds, streams of data, directly to the database - eliminating much of the work of client application development.
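The importer pattern described above, a plug-in loop that drains a feed, decodes each message with a pluggable formatter, and applies it as a write, can be sketched as follows (a standard-library queue stands in for the Kafka or socket source; all names here are illustrative):

```python
from queue import Queue, Empty

feed = Queue()      # stand-in for the Kafka topic / socket stream
database = []       # stand-in for the database table

def csv_formatter(raw):
    """A custom 'formatter': decode one CSV message into a record."""
    user, amount = raw.split(",")
    return {"user": user, "amount": int(amount)}

def run_importer(source, formatter, sink):
    """Drain the source, decoding and storing each message --
    the per-message write is the 'transaction'."""
    while True:
        try:
            raw = source.get_nowait()
        except Empty:
            return
        sink.append(formatter(raw))

for msg in ["alice,10", "bob,32"]:
    feed.put(msg)

run_importer(feed, csv_formatter, database)
print(len(database), database[0]["user"])  # 2 alice
```

The point of the pattern is the one John makes: no separate client application sits between the feed and the database, so swapping the formatter is all it takes to ingest a different wire format.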

We have one customer who has produced one of the top 10 games in the app store - their application streams in-game events into VoltDB at a rate upwards of 700,000 events per second.  VoltDB hosts a Marketing Optimization application that analyzes these in-game events in an effort to boost revenue.

If you had a crystal ball, how would you visualize the database landscape in 5 years from now? Major advancements?

Specialized databases will continue to carve out significant market share from established vendors.

IoT will be a major market, and will drive storage systems to support two activities: 1) Machine learning (historical analysis) on the Data Lake/Big Data; storage engines will focus on enabling data scientists to capture value from the vast increases of data, and 2) Real-time processing of streams of data. Batch processing of data is no longer acceptable - real-time becomes a “must have”.

Data creation continues to accelerate and capturing value from fresh data in real-time is the new revenue frontier.

Finally, could tell us a song that is an important part of the soundtrack of your life?  

I’m a passionate Bruce Springsteen fan (and also a runner), so it would have to be “Born to Run”.

Springsteen captures that youthful angst so perfectly, challenging us to break out of historic norms and create and experience new things, to challenge ourselves.

This perfectly captures the entrepreneurial spirit both of personal “self” as well as “professional self,” and it matches the unbridled spirit of what we’re trying to accomplish with VoltDB. “Together we could break this trap / We'll run till we drop, baby we'll never go back.”



The Psychology of Data Science

The Psychology of Data Science

I have recently published a piece on what it means and what it takes to be a data scientist. I want to add a further consideration, which lies at the intersection of science and psychology.

I. Data Scientists’ Characteristics

There is no scientist exactly like another, and this is true for data scientists as well. Even if data science seems to be a field run mainly by white American males with PhDs (what I inferred from King and Magoulas, 2015), this is not at all conclusive about the ideal candidate to hire. The suggestion is to value skills and capabilities more than titles or formal education (not many academic programs are well-structured enough to signal the right set of competencies to potential employers).

So far, the paths followed to become a data scientist have been unconventional and varied, so it is important to assess abilities instead of simply deciding based on the type of background or degree level. Never forget that one of the real extra values added by data science is cross-field contamination and cross-sectional application.

But there is also another relevant aspect to take into account in choosing the right candidate for your team, and that is Psychology.

II. How ...


Read More on Datafloq
What Big Data Can Do For Your Small Business

What Big Data Can Do For Your Small Business

One of the most challenging aspects of running a business is managing the large volume of data it encounters on a day-to-day basis. Properly managing this information can provide some potentially useful insights that may ultimately increase your business’ capital. However, given the sheer amount of data, the pieces that contain invaluable knowledge are often buried underneath all the irrelevant information. Managing data may seem like a waste of time and effort when it produces no result, but once you learn how to efficiently manage big data, you’ll be able to make better decisions and plan better strategies for the betterment of your small business. Here’s how you can increase the capital of your small business by managing big data.

Hire Big Data Talent

The need for employees who are skilled in data analytics is dramatically increasing, especially now that the world mostly relies on technology to function. Thus, there is a shortage of employees who are exceptionally skilled in data analysis. Even for small businesses, one or two analytics experts are rarely enough. Building a team of highly skilled big data analytics experts, on the other hand, will make it easier for you to identify and address the ...


Read More on Datafloq
How Kids are Learning Code in the Classroom

How Kids are Learning Code in the Classroom

Reading, writing, and arithmetic…and code? 

“Learn to code” has been a directive given to students and young professionals throughout the country (and the world) for the last several years. It makes sense—so many careers today involve at least some basic knowledge of a programming language. On top of that, some programming-dependent fields, like big data, are experiencing a major talent shortage, and people who can code have an easier time finding a high-paying job. This shortage may be partially due to the fact that coding is not taught as a required skill in the classroom. Much like proficient literacy, which opens doors for students in the workforce (60% of people with proficient literacy work in the management, business, financial, or professional sectors), coding can help students make the transition into the workforce, no matter what field they choose. Additionally, learning to code has other benefits, allowing students to develop problem-solving and critical thinking skills, as well as the all-important “grit” that gives students the persistence to succeed. Many educators are recognizing these benefits, and are working to incorporate coding curriculum into the classroom, to give their students a head start on their future. 

Innovative Methods of Teaching 

It’s easy to get discouraged when ...


Read More on Datafloq
How to Plan Big Data Investments Based on Time-to-Insight Advantages

How to Plan Big Data Investments Based on Time-to-Insight Advantages

For many big companies, the priority now is to capitalize on Big Data investments, but the discussion is still ongoing: is the actual objective to save money, or are there higher goals to achieve?

The all-time golden rule in business is “time is money”. The Fortune 1000 executives who participated in the Fourth Annual Big Data Executive Survey, conducted by NewVantage Partners last year, confirmed that reducing time-to-insight is a more specific driver for major companies to invest in Big Data than the generic objective of simply saving money.

Overview of NewVantage Partners Survey

The 2016 report confirms that leading industrial corporations believe Big Data is set to deliver a competitive advantage over other prevalent business technologies by enabling organizations to act faster in terms of:


  • Gathering data
  • Analyzing data
  • Gaining proper insights
  • Making critical decisions
  • Bringing new capabilities first to market


The NewVantage Partners survey reflects the actual perceptions of the CDOs (Chief Data Officers), CIOs (Chief Information Officers), business VPs and presidents, and other heads of 50 prominent Fortune 1000 organizations who are in charge of Big Data implementation.

The major participants in the survey also included the ...


Read More on Datafloq
IDOL-powered appliance delivers better decisions via comprehensive business information searches

IDOL-powered appliance delivers better decisions via comprehensive business information searches

The next BriefingsDirect digital transformation case study highlights how a Swiss engineering firm created an appliance that quickly deploys to index and deliver comprehensive business information.

By scouring thousands of formats and hundreds of languages, the approach then provides via a simple search interface unprecedented access to trends, leads, and the makings of highly informed business decisions.

We will now explore how SEC 1.01 AG delivers a truly intelligent services solution -- one that returns new information to ongoing queries and combines internal and external information on all sorts of resources to produce a 360-degree view of end users’ areas of intense interest.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how to access the best available information in about half the usual time, we're joined by David Meyer, Chief Technology Officer at SEC 1.01 AG in Switzerland. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What are some of the trends driving the need for what you've developed, the i5 appliance?

Meyer: The most important thing is that we can provide instant access to company-relevant information. This is one of today’s biggest challenges that we address with our i5 appliance.

Decisions are only as good as the information bases they are made on. The i5 provides the ability to access more complete information bases to make substantiated decisions. Also, you don’t want to search all the time; you want to be proactively informed. We do that with our agents and our automated programs that are searching for new information that you're interested in.

Gardner: As an organization, you've been around for quite a while and been involved with large packaged applications -- SAP R/3, for example -- but over time, more data sources and greater ability to gather information came on board, and you saw the need in the market for this appliance. Tell us a little bit about what led you to create it.

Accelerating the journey

Meyer: We started to dive into big data about the time that HPE acquired Autonomy, December 2011, and we saw that it’s very hard for companies to start to become a data-driven organization. With the i5 appliance, we would like to help companies accelerate their journey to become such a company.

Gardner: Tell us what you mean by a 360-degree view? What does that really mean in terms of getting the right information to the right people at the right time?

Meyer: In a company's information scope, you don't just talk about internal information; you also have external information like news feeds, social media feeds, or even governmental or legal information that you need but don't have time to search for every day.

So, you need to have a search appliance that can proactively inform you about things that happen outside. For example, if there's a legal issue with your customer or if you're in a contract discussion and your partner loses his signature authority to sign that contract, how would you get this information if you don't have support from your search engine?
Gardner: And search has become such a popular paradigm for acquiring information, asking a question, and getting great results. Those results are only as good as the data and content they can access. Tell us a little bit about your company SEC 1.01 AG, your size and your scope or your market. Give us a little bit of background about your company.

Meyer: We've been an HPE partner for 26 years, and we build business-critical platforms based on HPE hardware and also the HPE operating system, HP-UX. Since the merger of Autonomy and HPE in 2011, we started to build solutions based on HPE's big-data software, particularly IDOL and Vertica.

Gardner: What was it about the environment that prevented people from doing this on their own? Why wouldn't you go and just do this yourself in your own IT shop?

Meyer: The HPE IDOL software ecosystem is really an ecosystem of different software components, and these parts need to be packed together into something that can be installed very quickly and can provide results very quickly. That’s what we did with the i5 appliance.

We put all this good software from HPE IDOL together into one simple appliance, which is simple to install. We want to shorten the time needed to get started with big data, to get results from it, and to begin the analytical part of using your data and making money from it.

Multiple formats

Gardner: As we mentioned earlier, getting the best access to the best data is essential. There are a lot of APIs and a lot of tools that come with the IDOL ecosystem, as you described it, but you were able to dive into a thousand or more file formats, support 150 languages, and connect to 400 data sources. That's very impressive. Tell us how that came about.

Meyer: When you start to work with unstructured data, you need some important functionality. For example, you need to have support for a lot of languages. Imagine all these social media feeds in different languages. How do you track that if you don't support sentiment analysis on these messages?

On the other hand, you also need to understand any unstructured format. For example, if you have video broadcasts or radio broadcasts and you want to search for the content inside these broadcasts, you need to have a tool to translate the speech to text. HPE IDOL brings all the functionality that is needed to work with unstructured data, and we packed that together in our i5 appliance.

Gardner: That includes digging into PDFs and using OCR. It's quite impressive how deep and comprehensive you can be in terms of all the types of content within your organization.
How do you physically do this? If it's an appliance, you're installing it on-premises, you're able to access data sources from outside your organization, if you choose to do that, but how do you actually implement this and then get at those data sources internally? How would an IT person think about deploying this?

Meyer: We've prepared installable packages. Mainly, you need to have connectors to connect to repositories, to data ports. For example, if you have a Microsoft Exchange Server, you have a connector that understands very well how the Exchange server can communicate to that connector. So, you have the ability to connect to that data source and get any content including the metadata.

You talk about metadata for an e-mail, for example: the “From,” “To,” and “Subject” fields. You have the ability to put all that content and metadata into a centralized index, and then you're able to search and refine that information. Then, you have a reference to your original document.
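The flow Meyer describes (content plus metadata into a centralized index, with a reference back to the original document) can be sketched in a few lines of Python. This is purely illustrative; the class and its toy inverted index are invented and bear no relation to IDOL's actual connector API:

```python
# Minimal sketch of indexing a document's text plus metadata into a
# centralized inverted index, keeping a reference to the original.
# Illustrative only -- the real i5 appliance uses HPE IDOL connectors.

class MiniIndex:
    def __init__(self):
        self.docs = {}      # doc_id -> {"meta": {...}, "ref": source location}
        self.inverted = {}  # term -> set of doc_ids

    def add(self, doc_id, text, meta, ref):
        """Index full text and metadata values; keep a back-reference."""
        self.docs[doc_id] = {"meta": meta, "ref": ref}
        terms = text.lower().split() + [v.lower() for v in meta.values()]
        for term in terms:
            self.inverted.setdefault(term, set()).add(doc_id)

    def search(self, term):
        """Return (metadata, reference) pairs for every matching document."""
        return [(self.docs[d]["meta"], self.docs[d]["ref"])
                for d in sorted(self.inverted.get(term.lower(), []))]

index = MiniIndex()
index.add("msg-1",
          "quarterly results attached",
          {"from": "alice@example.com", "to": "bob@example.com",
           "subject": "results"},
          ref="exchange://inbox/msg-1")

hits = index.search("results")
```

Searching "results" here returns the e-mail's metadata together with the `exchange://` reference, which is exactly the "reference to your original document" the transcript mentions.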

When you want to enrich the information that you have in your company with external information, we developed a so-called SECWebConnector that can capture any information from the Internet. For example, you just need to enter an RSS feed or a webpage, and then you can capture the content and the metadata you want it to search for or that is important for your company.
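What such a web connector does with an RSS feed can be illustrated with Python's standard library. The feed below is a hard-coded sample and `capture_feed` is an invented name; the real SECWebConnector fetches live URLs and handles many more formats:

```python
# Sketch of the capture step a web/RSS connector performs: pull the
# items out of a feed and keep title, link, and publication date as
# metadata ready for indexing.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example News</title>
  <item>
    <title>Regulator opens inquiry</title>
    <link>https://example.com/inquiry</link>
    <pubDate>Mon, 06 Mar 2017 08:00:00 GMT</pubDate>
  </item>
</channel></rss>"""

def capture_feed(xml_text):
    """Extract per-item metadata from an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title"),
            "link": item.findtext("link"),
            "published": item.findtext("pubDate"),
        })
    return items

captured = capture_feed(SAMPLE_FEED)
```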

Gardner: So, it’s actually quite easy to tailor this specifically to an industry focus, if you wish, to a geographic focus. It’s quite easy to develop an index that’s specific to your organization, your needs, and your people.

Informational scope

Meyer: Exactly. In our crowded informational system that we have with the Internet and everything, it’s important that companies can choose where they want to have the information that is important for them. Do I need legal information, do I need news information, do I need social media information, and do I need broadcasting information? It’s very important to build your own informational scope that you want to be informed about, news that you want to be able to search for.

Gardner: And because of the way you structured and engineered this appliance, you're not only able to proactively go out and request things, but you can have a programmatic benefit, where you can tell it to deliver to you results when they arise or when they're discovered. Tell us a little bit how that works.

Meyer: We call them agents. You can define which topics you're interested in, and when some new documents are found by that search or by that topic, then you get informed, with an email or with a push notification on the mobile app.
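At its core, a topic agent of this kind re-runs a stored query against newly arrived documents and notifies its owner of matches. The sketch below is a deliberately naive Python version with invented names and data, not the appliance's implementation:

```python
# Toy "agent" run: each agent is a stored topic owned by a user; when
# new documents arrive, every agent is checked against them and a
# notification is produced for each match.

def run_agents(agents, new_docs):
    """Return (owner, topic, doc_title) for every new match."""
    notifications = []
    for agent in agents:
        topic = agent["topic"].lower()
        for doc in new_docs:
            if topic in doc["text"].lower():
                notifications.append((agent["owner"], agent["topic"],
                                      doc["title"]))
    return notifications

agents = [{"owner": "legal@acme.example", "topic": "signature authority"}]
new_docs = [
    {"title": "Registry update", "text": "Partner loses signature authority"},
    {"title": "Weather", "text": "Sunny in Zurich"},
]
alerts = run_agents(agents, new_docs)
```

In the real appliance the match would come from the index rather than a substring test, and the notification would go out as an e-mail or a mobile push rather than a returned tuple.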

Gardner: Let’s dig into a little bit of this concept of an appliance. You're using IDOL and you're using Vertica, the column-based or high-performance analytics engine, also part of HPE, but soon to be part of Micro Focus. You're also using 3PAR StoreServ and ProLiant DL380 servers. Tell us how that integration happened and why you actually call this an appliance, rather than some other name?

Meyer: Appliance means that all the software is patched together. Every component can talk to the others, talks the same language, and can be configured the same way. We preconfigure a lot, we standardize a lot, and that’s the appliance thing.

And it’s not bound to specific hardware, so it doesn’t need to be this DL380 or whatever; it also depends on how big your environment will be. It can also be a c7000 Blade Chassis or whatever.

When we install an appliance, we have one or two days until it’s installed, and then it starts the initial indexing program, and this takes a while until you have all the data in the index. So, the initial load is big, but after two or three days, you're able to search for information.

You mentioned the HPE Vertica part. We use Vertica to log every action that happens on the appliance. On one hand, this is a security feature. You need to prove that nobody has found the salary list, for example, and so you need to log it.

On the other hand, you can analyze what users are doing. For example, if they don’t find something and it’s always the same thing that people are searching in the company and can't find, perhaps there's some information you need to implement into the appliance.
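This second use of the query log (spotting searches that keep failing) can be sketched as a simple aggregation. The log format here is invented; in the appliance this analysis would run against Vertica, not a Python list:

```python
# Find the most frequent zero-hit searches in a query log: repeated
# misses are candidates for content sources the appliance should add.
from collections import Counter

query_log = [
    {"user": "u1", "query": "supplier contracts", "hits": 0},
    {"user": "u2", "query": "supplier contracts", "hits": 0},
    {"user": "u1", "query": "org chart", "hits": 12},
    {"user": "u3", "query": "supplier contracts", "hits": 0},
]

def failed_searches(log, top_n=3):
    """Count zero-hit queries and return the most common ones."""
    misses = Counter(entry["query"] for entry in log if entry["hits"] == 0)
    return misses.most_common(top_n)

gaps = failed_searches(query_log)
```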

Gardner: You mentioned security and privileges. How does the IT organization allow the right people to access the right information? Are you going to use some other policy engine? How does that work?

Mapped security

Meyer: It's included. It's called mapped security. The connector takes the security information with the document and indexes that security information within the index. So, you will never be able to find a document that you don't have access to in your environment. It's important that this security is given by default.
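Mapped security amounts to storing each document's source-repository ACL alongside it and intersecting that ACL with the caller's groups at query time. A minimal Python sketch, with invented data and names:

```python
# Sketch of "mapped security": every indexed document carries the ACL
# it had in its source system, and results are filtered against the
# searching user's groups, so a user can never see a document they
# could not open in the source repository.

documents = [
    {"id": "d1", "title": "Salary list", "allowed": {"hr"}},
    {"id": "d2", "title": "Office plan", "allowed": {"hr", "staff"}},
]

def secure_search(docs, user_groups):
    """Return only documents whose ACL intersects the user's groups."""
    return [d["title"] for d in docs if d["allowed"] & user_groups]

staff_view = secure_search(documents, {"staff"})
hr_view = secure_search(documents, {"hr"})
```

A member of `staff` sees only the office plan, while HR sees both documents, mirroring the "nobody finds the salary list" guarantee from the transcript.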

Gardner: It sounds to me, David, like were, in a sense, democratizing big data. By gathering and indexing all the unstructured data that you can possibly want to, point at it, and connect to, you're allowing anybody in a company to get access to queries without having to go through a data scientist or a SQL query author. It seems to me that you're really opening up the power of data analysis to many more people on their terms, which are basic search queries. What does that get an organization? Do you have any examples of the ways that people are benefiting by this democratization, this larger pool of people able to use these very powerful tools?

Meyer: Everything is more data-driven. The i5 appliance can give you access to all of that information. The appliance is here to simplify the beginning of becoming a data-driven organization and to find out what power is in the organization's data.
For example, we enabled a Swiss company called Smartinfo to become a proactive news provider. That means they put lots of public information, newspapers, online newspapers, TV broadcasts, radio broadcasts into that index. The customers can then define the topics they're interested in and they're proactively informed about new articles about their interests.

Gardner: In what other ways do you think this will become popular? I'm guessing that a marketing organization would really benefit from finding relationships within their internal organization, between product and service, go-to market, and research and development. The parts of a large distributed organization don't always know what the other part is doing, the unknown unknowns, if you will. Any other examples of how this is a business benefit?

Meyer: You mentioned the marketing organization. How could a marketing organization listen to what customers are saying? For example, customers are communicating on social media, and when you have an engine like i5, you can capture these social media feeds, run sentiment analysis on them, and see an analyzed view of what's being said about your products, your company, or your competitors.

You can detect, for example, a shitstorm about your company, a shitstorm about your competitor, or whatever. You need to have an analytic platform to see that, to visualize that, and this is a big benefit.

On the other hand, it's also this proactive information you get from it, where you can see that your competitor has a new campaign and you get that information right now because you have an agent with the customer's name. You can see that there is something happening and you can act on that information.

Gardner: When you think about future capabilities, are there other aspects that you can add on? It seems extensible to me. What would we be talking about a year from now, for example?

Very extensible

Meyer: It's pretty much extensible. I think about all these different verticals. You can expand it for the health sector, for the transportation sector, whatever. It doesn't really matter.

We do network analysis. That means when you prepare yourself to visit a company, you can have a network picture, what relationships this company has, what employees work there, who is a shareholder of that company, which company has contracts with any of other companies?
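Network analysis of this kind starts from a graph of typed relationships between companies and people. The toy Python version below, with made-up entities, shows the basic neighbourhood query such a "network picture" is built from:

```python
# Toy relationship graph: (entity, entity, relation) triples, queried
# for everything directly linked to one entity. All data is invented.

edges = [
    ("Acme AG", "Jane Roe", "shareholder"),
    ("Acme AG", "Beta GmbH", "contract"),
    ("Beta GmbH", "John Doe", "employee"),
]

def neighbours(graph, entity):
    """Return (other_entity, relation) pairs touching `entity`."""
    result = []
    for a, b, rel in graph:
        if a == entity:
            result.append((b, rel))
        elif b == entity:
            result.append((a, rel))
    return result

acme_links = neighbours(edges, "Acme AG")
```

Visiting Acme AG, you would see at a glance its shareholder and its contract partner; the appliance's job is then to render that neighbourhood visually rather than as tuples.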

This is a new way to get a holistic image of a company, a person, or of something that you want to know. It's thinking how to visualize things, how to visualize information, and that's the main part we are focusing on. How can we visualize or bring new visualizations to the customer?

Gardner: In the marketplace, because it's an ecosystem, we're seeing new APIs coming online all the time. Many of them are very low cost and, in many cases, open source or free. We're also seeing the ability to connect more adequately to LinkedIn and Salesforce, if you have your license for that of course. So, this really seems to me a focal point, a single pane of glass to get a single view of a customer, a market, or a competitor, and at the same time, at an affordable price.

Let's focus on that for a moment. When you have an appliance approach, what we're talking about used to be only possible at very high cost, and many people would need to be involved -- labor, resources, customization. Now, we've eliminated a lot of the labor, a lot of the customization, and the component costs have come down.
We've talked about all the great qualitative benefits, but can we talk about the cost differential between what used to be possible five years ago with data analysis, unstructured data gathering, and indexing, and what you can do now with the i5?

Meyer: You mentioned the price. We have an OEM contract, and that's something that makes us competitive in the market. Companies can build their own intelligence service. It's affordable also for small and medium businesses; it doesn't need to be a huge company with its own engineering and IT staff. It's affordable, it's automated, it's packed together, and it's simple to install.

Companies can increase workplace performance and shorten processes. Everybody has access to all the information they need in their daily work, and they can focus more on their core business. They don't lose time searching for information and not finding it.

Gardner: For those folks who have been listening or reading, are intrigued by this, and want to learn more, where would you point them? How can they get more information on the i5 appliance and some of the concepts we have been discussing?

Meyer: That's our company website, sec101.ch. There you can find any information you would like to have. And this is available now.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

How Big Data Makes the Sharing Economy Possible

How Big Data Makes the Sharing Economy Possible

Long before capitalism and big corporations ruled the world, we used to survive using different tactics: sharing. Bartering, sharing skills, and helping one another in order to feed ourselves and keep sheltered was simply a necessity. These days, we’re starting to shift back to those roots with the “sharing economy”—a concept that would not be possible in modern times without the help of big data’s ability to bring us together. Whether you know it or not, you’re probably participating in the new trend of the sharing economy—but what is it, exactly, and how does big data make this new (old) way of life possible?

What is the Sharing Economy?

Essentially, the sharing economy is the concept of crowdsourcing goods and services from others. This is usually done off of an online or mobile platform that fields requests and provides “matchmaking” services to facilitate sharing. This might mean calling a ride from someone with a car and a few extra hours to spare, buying a meal your neighbor cooked, or renting out someone’s apartment when you’re visiting a new city. You can even start a business by leveraging peer-to-peer lending—a process that bypasses banks and allows individuals to invest in a business via ...


Read More on Datafloq
Best Resources to Prepare for Certified Scrum Master (CSM ®) Certification

Best Resources to Prepare for Certified Scrum Master (CSM ®) Certification

A CSM certification helps you develop the skills to effectively design and utilize Scrum in any project. In the last couple of years, the Certified Scrum Master certification has gained popularity, and it is now used to manage projects in quickly changing business environments. But while preparing for the Certified Scrum Master exam, people often get confused about which books to use and land on books that are not that helpful. Being in the training industry, I have shortlisted some books and reference sites that are actually helpful for preparing for the CSM certification exam.

Scrum Alliance

Being the official certification body for the Certified Scrum Master certification, the Scrum Alliance provides a wide range of articles, white papers, blogs, videos, reports, and other important links about Scrum. You can visit the Scrum Alliance at its official website, https://www.scrumalliance.org/, for up-to-date details and information about the CSM.

Scrum Guides

Scrumguides.org (http://www.scrumguides.org/) is an informative website supported by the Scrum Alliance. It is highly recommended to keep reading the content available on the website to get all updates about the CSM (Certified Scrum Master) certification. The website lets you read the official Scrum Guide online and also allows you to download the 16-page booklet for offline reference. This is a really helpful source for those preparing for the CSM certification.

Chris Sims’ The Elements of Scrum

Unlike a reference guide or training manual, The Elements of Scrum takes a different approach to illustrating the key concepts of Scrum. Chris Sims explains the practices, principles, and pitfalls of the Scrum framework through real-life examples, told as small stories the reader can relate to. These examples make CSM certification prep a breeze.

The Scrum Field Guide – Practical Advice for Your First Year, by Mitch Lacey

With 30 engaging and insightful chapters, Mitch Lacey explains how to implement Scrum in an organization or a work project. He covers all facets of implementation, from bringing people on board to when and how to perform ‘emergency procedures’ that bring a project back on track. This book not only helps you crack the CSM examination, but also provides practical advice throughout your initial days of implementing Agile methods and techniques.

Certified Scrum Master (CSM) 83, by Janice Garrison

Certified Scrum Master (CSM) 83 is only a 72-page book, but it includes the 83 most important and most frequently asked questions that you must know. The book does not contain a mock exam; rather, it is a collection of CSM (Certified Scrum Master) and Scrum-related questions with answers, meant to give you basic knowledge of the CSM.

The books and sites mentioned above are for reference, but studying without expert instruction might be a bad idea. MSys Training offers instructor-led classroom training for all Agile and Scrum certification courses. One can complement these books with learning on a full-fledged MSys certification training course.


ITIL Newbie Commits Following Mistakes While Adopting ITIL

ITIL Newbie Commits Following Mistakes While Adopting ITIL

While stepping into IT service management, newly certified ITIL professionals are full of dynamism and vigor. However, many professionals are not aware that clearing an ITIL certification is just the first step of a long journey. Adopting IT service management's ITIL framework requires a complete understanding of the breadth and depth of changes that might arise around organizational resources, technology, and processes. A clear vision with proper business planning is important for adopting the ITIL process across the organization.

Having been in the training industry for quite a long time now, I have seen many newbies commit some common mistakes while implementing the tools and techniques of ITIL. Here are the common mistakes made by newly certified ITIL professionals:

Certified and All Set for Shouldering the Responsibilities

For success in ITSM, an ITIL certification is not everything. The service lifecycle learned by an applicant during an ITIL course is a simplified version of complex IT service management concepts. It does not offer thorough guidance on delivering the benefits of ITIL to an organization.

More Process Centric than Service Centric

Professionals who have recently passed the ITIL certification focus mainly on process stabilization before gaining an understanding of the services and products offered by the organization. This can lead to a dreadful result, as decisions are made by considering process stabilization alone.

Highly Process Driven

After completing the ITIL Foundation training, professionals want to make every move in the right direction. In this attempt, they usually set up separate groups to manage the problems, incidents, and changes that arise in a project. From the perspective of IT service management concepts, this is counter-productive. Hence, it is very important for newbies to understand that not every process requires a dedicated team; usually a process is a set of activities that can be handled by a single resource. Therefore, the main focus of an ITIL professional should be the procedures that depend on the several activities of a particular process.

Trying to Lead a Project Single-Handedly

ITIL works on the fact that IT depends on support from several parts of the business. Hence, IT needs to work together with them to achieve the business requirements. Newbie ITIL professionals try to implement a project single-handedly, but implementing a process or tool without proper guidance can be a risk for the business.

Neglecting Business Requirements while Creating a Service Catalog

A service catalog is designed to bridge the gap between IT deliverables and business expectations. While designing a service catalog, it is good practice to document the services IT offers to the different firms. Mapping the offered services to the expected business outcomes then bridges the gap between business expectations and IT services.

Hence, it’s very important for newly certified ITIL professionals to take the time to develop practical expertise and gain industry-wide experience before putting their ITIL knowledge to full use.



Copyright © 2017 BBBT - All Rights Reserved