ServiceMaster’s path to an agile development twofer: Better security and DevOps business benefits

The next BriefingsDirect Voice of the Customer security transformation discussion explores how home-maintenance repair and services provider ServiceMaster develops applications with a security-minded focus as a DevOps benefit.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how security technology leads to posture maturity and DevOps business benefits, we're joined by Jennifer Cole, Chief Information Security Officer and Vice President of IT, Information Security, and Governance for ServiceMaster in Memphis, Tennessee, and Ashish Kuthiala, Senior Director of Marketing and Strategy at Hewlett Packard Enterprise DevOps. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Jennifer, tell me, what are some of the top trends that drive your need for security improvements and that also spurred DevOps benefits?

Cole: When we started our DevOps journey, our security team was a little ahead of the curve on application security, and we were able to get in on the front end of our DevOps transformation.


The primary reason for our transformation as a company is that we are an 86-year-old company that has seven brands under one umbrella, and we needed to have one brand, one voice, and be able to talk to our customers in a way that they wanted us to talk to them.

That means enabling IT to get capabilities out there quickly, so that we can interact with our customers "digital first." As a result, we were able to step up the way we looked at security education and process. We had normally been doing our penetration tests after a release. Now we were able to put tools in place to test prior to a release, and also teach our developers along the way that security is everyone's responsibility.

ServiceMaster has been fortunate that we have a C-suite willing to invest in DevOps and an Agile methodology. We also had developers who were willing to learn, and with the right intent to deliver code that would protect our customers. Those things collided, and we have the perfect storm.

So, we're delivering quicker, but we also fail faster, which allows us to go back and fix things sooner. And we're seeing that what we're delivering is a lot more secure.

Gardner: Ashish, it seems obvious, having heard Jennifer describe it, DevOps and security hand-in-hand -- a whole greater than the sum of the parts. Are you seeing this more across various industries?

Stopping defects

Kuthiala: Absolutely. With the adoption of DevOps increasing across enterprises, security is no different from any other quality-assurance (QA) testing that you do. You can't let a defect reach your customer base, and you can't let a security flaw reach your customer base either.

If you look at it from that perspective, and the teams are willing to work together, security is treated no differently than any other QA process. This boils down not just to the vulnerability of the software that you're releasing into the marketplace; there are also so many different regulations and compliance [needs] -- internal, external, your own company policies -- that you have to take a look at. You don't want to go faster and compromise security. So, it's an essential part of DevOps.

Cole: DevOps allows for continuous improvement, too. Security now comes at the front of the software development lifecycle (SDLC), while in the old days security came last. We found problems after they were in production, or after something had been compromised. Now, we're at the beginning of the process, and we're actually getting to train the people at the beginning of the process on how and why to deliver things that are safe for our customers.

Gardner: Jennifer, why is security so important? Is this about your brand preservation? Is this about privacy and security of data? Is this about the ability for high performance to maintain its role in the organization? All the above? What did I miss? Why is this so important?

Cole: Depending on the lens that you are looking through, that answer may be different. For me, as a CISO, it's making sure that our data is secure and that our customers have trust in us to take care of their information. The rest of the C-suite, I am sure, feels the same, but they're also very focused on transformation to digital-first, making sure customers can work with us in any way that they want to and that their ServiceMaster experience is healthy.

Our leaders also want to ensure our customers return to do business with us and are happy in the process.  Our company helps customers in some of the most difficult times in their life, or helps them prevent a difficult time in the ownership of their home.

But for me and the rest of our leadership team, it's making sure that we're doing what's right. We're training our teams along the way to do what's right, to just make the overall ServiceMaster experience better and safe. As young people move into different companies, we want to make sure they have that foundation of thinking about security first -- and also the customer.
We tend to put IT people in a back room, and they never see the customer. This methodology allows IT to see what they've released, correct it if it's wrong, and gives us an opportunity to train for the future.
Through my lens, it’s about protecting our data and making sure our customers are getting service that doesn't have vulnerabilities in it and is safe.

Gardner: Now, Ashish, user experience is top of mind for organizations, particularly organizations that are customer focused like ServiceMaster. When we look at security and DevOps coming together, we can put in place the requirements to maintain that data, but it also means we can get at more data and use it more strategically, more tactically, for personalization and customization -- and at the same time, making sure that those customers are protected.

How important is user experience and data gathering now when it comes to QA and making applications as robust as they can be?

Million-dollar question

Kuthiala: It's a million-dollar question. I'll give you an example of a client I work with. I happen to use their app very, very frequently, and I happen to know the team that owns that app. They told me about 12 months ago that they had invested -- let’s just make up this number -- $1 million in improving the user experience. They asked me how I liked it. I said, "Your app is good. I only use this 20 percent of the features in your app. I really don’t use the other 80 percent. It's not so useful to me."

That was an eye-opener to them, because if they had known exactly what I was doing as a user -- what I use, what I don't use, where I had problems -- they could have put the $1 million or so they invested in enriching the user experience toward the 20 percent that I actually use. They could have made it better than anybody else in the marketplace, and also gathered information on what it is that the market wants by monitoring the user experience with people like me.
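To make that kind of usage monitoring concrete, here is a minimal sketch -- not any specific HPE product, and the event format and feature names are assumptions for illustration -- of how a team might tally in-app telemetry events to see which features actually carry most of the activity:

```python
from collections import Counter

# Hypothetical in-app telemetry events; in practice these would stream in
# from a mobile analytics collector rather than live in a list.
events = [
    {"user": "u1", "feature": "book_service"},
    {"user": "u1", "feature": "book_service"},
    {"user": "u2", "feature": "view_invoice"},
    {"user": "u2", "feature": "book_service"},
    {"user": "u3", "feature": "edit_profile"},
]

usage = Counter(event["feature"] for event in events)
total = sum(usage.values())

# Rank features by their share of all interactions to find the heavily used
# "20 percent" where further investment would pay off.
for feature, count in usage.most_common():
    print(f"{feature}: {count / total:.0%} of interactions")
```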

It's not just the availability and health of the application; it’s the user experience. It's having empathy for the user, as an end-user. HPE of course, makes a lot of these tools, like HPE AppPulse, which is very specifically designed to capture that mobile user experience and bring it back before you have a flood of calls and support people screaming at you as to why the application isn’t working.

Security is also one of those things. All is good until something goes wrong. You don't want to be in a situation when something has actually gone wrong and your brand is being dragged through mud in the press, your revenue starts to decline, and then you look at it. It’s one of those things that you can't look at after the fact.

Gardner: Jennifer, this strikes me as an under-appreciated force multiplier, that the better you maintain data integrity, security, and privacy, the more trust you are going to get to get more data about your customers that you can then apply back to a better experience for them. Is that something that you are banking on at ServiceMaster?
Cole: Absolutely. Trust is important, not only with our customers, but also our employees and leaders. We want people to feel like they're in a healthy environment, where they can give us feedback on that user experience. What I would say to what Ashish was saying is that DevOps actually gives us the ability to deliver what the business wants IT to deliver for our customers.

In the past 25 years, IT has decided what the customer would like to see. In this methodology, you're actually working with your business partners who understand their products and their customers, and they're telling you the features that need to be delivered. Then, you're able to pick the minimum viable product and deliver it first, so that you can capture that 20 percent of functionality.

Also, if you're wrapping security in front of that, it means security is not coming back to you later with penetration test results and saying that you have all of these things to fix, which takes time away from delivering something new for our customers.

This methodology pays off, but the journey is hard. It’s tough because in most companies you have a legacy environment that you have to support. Then, you have this new application environment that you’re creating. There's a healthy balance that you have to find there, and it takes time. But we've seen quicker results and better revenue, our customers are happier, they're enjoying the ServiceMaster experience, instead of our individual brand families, and we've really embraced the methodology.

Gardner: Do you have any examples that you can recall where you've done development projects and you’ve been able to track that data around that particular application? What’s going on with the testing, and then how is that applied back to a DevOps benefit? Maybe you could just walk us through an example of where this has really worked well.

Digital first

Cole: About a year and a half ago, we started with one of our brands, American Home Shield, and looked at where the low hanging fruit -- or minimum viable product -- was in that brand for digital first. Let me describe the business a little bit. Our customers reach out to us, they purchase a policy for their house and we maintain appliances and such in their home, but it is a contractor-based company. We send out a contractor who is not a ServiceMaster associate.

We have to make that work and make our customer feel like they've had a seamless experience with American Home Shield. We had some opportunity in that brand for digital first. We went after it and drastically changed the way that our customers did business with us. Now, it's caught on like wildfire, and we're really trying to focus on one brand and one voice. This is a top-down decision which does help us move faster.

All seven of our brands are home services. We're in 75,000 homes a day and we needed to identify the customers of all the brands, so that we could customize the way that we do business with them. DevOps allows us to move faster into the market and deliver that.

Gardner: Ashish, there aren't that many security vendors that do DevOps, or DevOps vendors that do security. At HPE, how have you made advances in terms of how these two areas come together?

Kuthiala: The strengths of HPE in helping its customers lies with the very fact that we have an end-to-end diverse portfolio. Jennifer talked about taking the security practices and not leaving it toward the end of the cycle, but moving it to the very beginning, which means that you have to get developers to start thinking like security experts and work with the security experts.

Given that we have a portfolio that spans the developers and the security teams, our best practices include building our own customer-facing software products that incorporate security practices, so that when developers are writing code, they can begin to see any immediate security threats as well as whether their code is compliant with any applicable policies or not. Even before code is checked in, the process runs the code through security checks and follows it all the way through the software development lifecycle.

These are security-focused feedback loops. At any point, if there is a problem, the changes are rejected and sent back or feedback is sent back to the developers immediately.
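As a rough sketch of what such a pre-check-in gate could look like -- assuming a generic command-line static-analysis scanner; the scanner name, flags, and JSON output format below are placeholders, not the actual interface of Fortify or any other HPE tool -- the check might run like this:

```python
import json
import subprocess
import sys

def security_gate(changed_files):
    """Scan the changed files and reject the check-in if any high-severity
    findings come back, so the developer gets feedback immediately."""
    # Placeholder scanner invocation; substitute your organization's SAST tool.
    result = subprocess.run(
        ["security-scanner", "--format", "json", *changed_files],
        capture_output=True, text=True,
    )
    findings = json.loads(result.stdout or "[]")
    high = [f for f in findings if f.get("severity") == "high"]

    # The feedback loop: findings are reported before the code lands.
    for finding in high:
        print(f"BLOCKED: {finding['file']}:{finding['line']} - {finding['rule']}")
    return len(high) == 0

if __name__ == "__main__":
    sys.exit(0 if security_gate(sys.argv[1:]) else 1)
```

Wired into a pre-commit hook or a pipeline stage, a non-zero exit code is what rejects the change and sends it back, as described above.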

If it makes it through the cycle and a known vulnerability is found before release to production, we have tools such as App Defender that can plug in to protect the code in production until developers can fix it, allowing you to go faster but remain protected.

Cole: It blocks it from the customer until you can fix it.

Kuthiala: Jennifer, can you describe a little bit how you use some of these products?

Strategic partnership

Cole: Sure. We’ve had a great strategic partnership with HPE in this particular space. Application security caught on fire about two years ago at RSA, which is one of the main security conferences for anyone in our profession.

The topic of application security has not been a focus for CISOs, in my opinion. I was fortunate enough that I had a great team member who came back and said that we have to get on board with this. We had some conversations with HPE and ended up in a great strategic partnership. They've really held our hands and helped us get through the process. In turn, that helped make them better, as well as make us better, and that's what a strategic partnership should be about.

Now, we're watching things as they are developed. So, we're teaching the developer in real-time. Then, if something happens to get through, we have App Defender, which will actually contain it until we can fix it before it releases to our customer. If all of those defenses don’t work, we still do the penetration test along with many other controls that are in place. We also try to go back to just grassroots, sit down with the developers, and help them understand why they would want to develop differently next time.

Someone from security is in every one of the development scrum meetings and on all the product teams. We also participate in Big Room Planning. We're trying to move out of that overall governing role and into a peer-to-peer type role, helping each other learn, and explaining to them why we want them to do things.

Gardner: It seems to me that, having gone at this at the methodological level with those collaboration issues solved, bringing people into the scrum who are security minded, puts you in a position to be able to scale this. I imagine that more and more applications are going to be of a mobile nature, where there's going to be continuous development. We're also going to start perhaps using micro-services for development and ultimately Internet of Things (IoT) if you start measuring more and more things in your homes with your contractors.

Cole: We reach 75,000 homes a day. So, you can imagine that all of those things are going to play a big part in our future.

Gardner: Before we sign off, perhaps you have projections as to where you'd like to see things go. How can DevOps and security work better for you as a tag team?
Cole: For me, the next step for ServiceMaster specifically is making solid plans to migrate off of our legacy systems, so that we can truly focus on maturing DevOps and delivering for our customer in a safer, quicker way, and so we're not always having to balance this legacy environment and this new environment.
If we could accelerate that, I think we will deliver to the customer quicker and also more securely.

Gardner: Ashish, last word, what should people who are on the security side of the house be thinking about DevOps that they might not have appreciated?

Higher quality

Kuthiala: The whole point of adopting DevOps -- delivering your software to your customers faster and with higher quality -- says it. DevOps is an opportunity for security teams to get deeply embedded in the mindset of the developers, the business planners, testers, and production teams -- essentially the whole software development lifecycle -- which they didn't have the opportunity to do earlier.

They would usually come in before code went to production and often would push back the production cycles by a few weeks because they had to do the right thing and ensure release of code that was secure. Now, they’re able to collaborate with and educate developers, sit down with them, tell them exactly what they need to design and therefore deliver secure code right from the design stage. It’s the opportunity to make this a lot better and more secure for their customers.

Cole: The key is security being a strategic partner with the business and the rest of IT, instead of just being a governing body.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Are You Unlocking the Hidden Value of Your ‘Big Data’?

This article is sponsored by CloudMoyo - Partner of choice for solutions at the intersection of Cloud & Analytics.

More and more organizations are coming to understand that there are valuable insights hidden in the data which they generate during the course of their work that could transform their business operations. The science of data analysis is growing fast around the world, and it’s becoming more and more predictive in the way that it’s applied, with an emphasis on trying to plot new courses for businesses eager to grow and to change.

Every business operates differently and has different goals and insights to harness from its big data. But there are also many organizations which can serve as an inspiration; they have understood how big data applies to what they do and they have harnessed it to make changes.

Naturally, the first step is to organize your data gathering process. Collecting, storing and organizing the data that you need in a streamlined, sustainable manner is essential if you are ever to truly derive value from that data. From there, one starts to ask questions like, “What is the value of all this data? What should I be looking for?”

The business of data analysis has undergone ...

Read More on Datafloq
Computational Consistency and Why It Matters

The advance of Big Data analytics and the need for real-time results in application environments such as IoT are driving the need for a new approach to storage. Startups in this space have a particular goal in mind, and that is to reduce the latency between the computational layer [...]
BBBT Hosts Webinar with Cloudera on Data Management and Analytics in the Cloud and on Premises

This Friday, October 28, 2016, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar with Cloudera on what’s driving customers to the cloud, common use cases and deployment patterns, and what the future holds for BI and analytics.

(PRWeb October 25, 2016)

Read the full story at

A Layman’s Guide to Understanding the World of Big Data

"In God we trust; all others must bring data" - W. Edwards Deming

Machine Learning – an elixir for new world technology

The world of analytics is now talking about Support Vector Machines (SVM), Naïve Bayes, expectation maximization using Naïve Bayes, random forests, bagged regressions, et al. -- everything is about adaptive learning and self-evolving algorithms that augment the understanding of the customer with every successive digital footprint.

For e-commerce businesses of this era, data mining and machine learning algorithms play an important role in the following areas:

Product search
Product recommendation and promotions
Fraud detection
Business intelligence
Anticipatory purchases
Pricing management
Supply chain management

Providing an example is the best way to understand how data science works and why it is so useful. For instance, a customer service centre has basic systems that allow employees to check the customer's name, email, phone, and address whenever a customer calls the centre. In this way, the employee can see what this customer has bought in the past and can skip the explanation at the beginning. However, with the help of tools that rely on data science, employees will be able to get more information about the caller, like their return history, the ratings they gave to the company in different surveys, the ...
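As a toy illustration of that kind of enrichment -- the table layouts below are invented for the example, not taken from any particular CRM or contact-centre system -- joining basic caller records with return history and survey ratings might look like this:

```python
import pandas as pd

# Basic caller records the agent already sees.
customers = pd.DataFrame({
    "customer_id": [1, 2],
    "name": ["A. Jones", "B. Smith"],
})

# Extra signals a data-science pipeline could surface for the agent.
returns = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "returned_item": ["kettle", "toaster", "lamp"],
})
surveys = pd.DataFrame({
    "customer_id": [1, 2],
    "avg_rating": [4.5, 2.0],
})

return_counts = returns.groupby("customer_id").size().reset_index(name="num_returns")

# One enriched profile per caller: basic details plus returns and ratings.
profile = (
    customers
    .merge(return_counts, on="customer_id", how="left")
    .merge(surveys, on="customer_id", how="left")
)
print(profile)
```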

Read More on Datafloq
What Happens When You Merge Virtual Reality with Big Data

An article in the Journal of Big Data points out that one of the most difficult challenges of combining big data with virtual reality to produce useful scientific applications is the limitation of human perception.

Virtual Reality, Big Data and Science

Researchers at Caltech have taken the first step towards solving this dilemma. The first step of answering any scientific question begins with asking the right ones. Their team set out to explore the possibility of creating virtual reality platforms consisting of both software and hardware. Those platforms would allow scientists to use immersive virtual reality for multidimensional data visualization.

If you're having a hard time visualizing that, it's because it still exists only in science fiction movies like Iron Man, in which Tony Stark uses it to create his superhero suit. In real life, it would allow scientists to interact with their data and their colleagues simultaneously within the same visual space. Researchers believe it will lead to an improved perception of geometry as well as better retention of perceived relationships between different data sets.

An article in Data Science Central points to some of the key factors converging to make the big data and virtual reality one of the most ...

Read More on Datafloq
What Does Big Data Mean For Sustainability?

Big data is the buzzword. Everything that surrounds you is impacted by big data today. The phenomenon took shape earlier in this decade, and now it has become a full-blown reality. There is now a growing number of compelling ways in which big data analytics is being applied to solve real-world problems. One of the biggest of these problems, environmental sustainability, is also not outside the scope of big data, and this post is about how big data analytics will help the scientists of today solve the problems of tomorrow and make the development of the human race more sustainable and environmentally friendly.

Data gathering is becoming more advanced, and so is our ability to understand copious amounts of data. Better computational strength and enhanced connectivity are riding the information revolution. According to an estimate by IBM, in 2020 there will be 300 times more information than we had in 2005. Understandably, immense possibilities arise from proper utilization of this data, and environmental sustainability is one of them. Of the many promises of Big Data, environmental sustainability is one of the most important to implement and maintain. Why so?

Climate change is happening and there is nothing that can deny its ...

Read More on Datafloq
The Future is Now, Big Data Trends Going Forward

The future is approaching rapidly; in fact, in the fast-evolving Big Data world it has already arrived. The seeds of the future are already sown in the present. Many trends visible today are going to be the cornerstones of how the Big Data landscape will look moving forward. In data technologies since the days of Comma Separated Value (CSV) files, data has never been the end but a means to an end; however, the profoundness of Big Data is slowly but surely changing that unwritten law. Today, data is starting to come to the centre from the periphery. Some of the trends driving the technology world in general, and data science in particular, in this direction are discussed below.

The Internet of Things (IoT) Enabled Connected World

As the IoT market matures, with more and more devices becoming nodes of this huge web of machines and humans, the size and enormity of the data they generate is going to rise further and faster. Among the factors acting as a tipping point, or point of inflexion, for IoT are the coming together and maturation of a few technologies, including miniaturisation and the substantial cost reduction of hardware like sensors, radio devices and modems ...

Read More on Datafloq
Why Perl 6 is Remarkably Robust at Handling Big Data Sets

Perl has undergone a massive overhaul over the last year.

Perl 6 was released last December. It's virtually unrecognizable from previous versions of the 15-year-old programming language.

There are a number of updates Perl programmers should be aware of. Among the most significant is the emphasis on big data. Brian Kelly, the main developer of FullAuto, said these applications will be tremendously useful in many verticals.

“Perl has a huge community of avid users that continues to thrive in spite of detractors,” said Brian Kelly, the main developer behind a configuration management tool written with Perl 5. “This community, like communities for every language in computing, is being pulled into the Big Data world, like it or not.”

Tom Radcliffe, the Director of Engineering at ActiveState, concurs that Perl 6 will lead the way with big data analytics. He said the new language will be particularly valuable for his clients in the financial industry.

“Many of our clients in the finance industry are using ActivePerl to pull data from various databases and process it,” said Radcliffe. “It’s being used as ‘big-data lite’ or as a way to load up big-data Hadoop systems.”

How Does Perl 6 Handle Big Data?

Perl 6 is capable of handling data ...

Read More on Datafloq
The State of Social Media Marketing – Key Trends, Strategies, and Statistics to Power Your Marketing

Social media has reshaped our daily lives – how we socialize, shop, stay informed, work, research, and spend our leisure time. The number of worldwide social media users is expected to reach 2.95 billion by 2020, which is around a third of the Earth's entire population. As of 2016, 78% of the United States population had a social networking profile.

Businesses are now fully on board with using social channels in their marketing strategies. Nine out of 10 U.S. companies are active on social networks, with the same percentage reporting increased brand exposure. More than half of businesses report an increase in sales due to their social media outreach.  

Companies Are Increasing Their Social Media Budgets

According to the Advertiser Perceptions' first Social Media Advertising Report, recently published by Media Post, 24% of major advertisers' digital ad budgets now go toward social media, and 47% of advertisers intend to increase that figure over the next six months. Meanwhile, 38% are using social media in an attempt to generate sales, with many marketers instead using it to boost brand recognition.

In another report by Strongview and Selligent, the estimates were even higher for channels in which marketers plan to increase spend.  56.3% of respondents reported that they would ...

Read More on Datafloq
How Startups Can Build a Big Data Infrastructure

It’s not a big surprise that big data can be very profitable when used correctly. Companies like Teles Properties have successfully used big data to determine the potential of real estate properties, resulting in the firm selling homes for higher prices (and more quickly) than their competitors. Uber, the hugely popular ride-sharing service, uses data as both a tool and a commodity, selling the travel patterns they collect from drivers and using them to implement “surge” pricing for busy times of the day. Behind these data-driven businesses are a strong infrastructure and talented analysts who can identify trends and suggest changes based on those trends.

Want to gain valuable insights that will help you stay relevant as you grow your startup? Of course, you do—and to get these insights, you’ll need to leverage big data. Almost every industry is using data analysis to predict trends and make business decisions, and startups are no exception. The problem? Most startups don’t have the knowledge to set up a simple big data infrastructure, which can limit their potential. Simple big data may sound like an oxymoron, but setting up your big data infrastructure doesn’t have to be complicated. Ready to get started? Here are some ...

Read More on Datafloq
nextCoder 2016-10-19 11:44:18

“Great! We bought a BI Tool!”… Now What?

Read our LinkedIn Post.

How Big Data Takes the Retail Industry to a Whole New, More Informed Space

This article is sponsored by CloudMoyo - Partner of choice for solutions at the intersection of Cloud & Analytics.

Speculation around the future of retail often tends to drift into visions of drones flying through the skies and delivering packages within minutes of a consumer clicking a few buttons on a site. In this projection, the bricks and mortar retail stores are old-fashioned, out of date and a relic of the past. Yet the reality of today's retail environment is a far cry from that distant future. Today's innovative retailers are harnessing information technology, and using big data and analytics in innovative and unusual ways, with a goal of enhancing the shopping experience, as well as gathering and processing valuable data that will help retailers better position themselves to meet consumers' needs.

What kind of insights are being gathered via big data? Many. For example, predicting which products are going to be most popular over the coming weeks and making sure that there is enough stock to meet the demand. Analyzing which branches of a retail chain are busier than others, what products they should be stocking, and who their customers are. Who the customers are in a particular store, what they usually ...

Read More on Datafloq
Why government agencies could lead the way in demanding inter-public cloud interoperability and standardization

The next BriefingsDirect thought leadership panel discussion explores how public-sector organizations can gain economic benefits from cloud interoperability and standardization.

Our panel comes to you in conjunction with The Open Group Paris Event and Member Meeting October 24 through 27, 2016 in France, with a focus on the latest developments in eGovernment.

As government agencies move to the public cloud computing model, the use of more than one public cloud provider can offer economic benefits through competition and choice. But are the public clouds standardized sufficiently for true interoperability, and can the large government contracts in the offing for cloud providers have an impact on the level of maturity around standardization?

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how to best procure multiple cloud services as eGovernment services at low risk and high reward, we're joined by our panel, Dr. Chris Harding, Director for Interoperability at The Open Group; Dave Linthicum, Senior Vice President at Cloud Technology Partners, and Andras Szakal, Vice President and Chief Technology Officer at IBM U.S. Federal. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Andras, I've spoken to some people in the lead-up to this discussion about the level of government-sector adoption of cloud services, especially public cloud. They tell me that it’s lagging the private sector. Is that what you're encountering, that the public sector is lagging the private sector, or is it more complicated than that?

Szakal: It's a bit more complicated than that. Private-sector, born-on-the-cloud adoption is probably much greater than in the public sector, and it's important to differentiate. The industry at large, from a born-on-the-cloud point of view, is very much ahead of the public-sector government implementation of born-on-the-cloud applications.

What really drove that was innovations like the Internet of Things (IoT), gaming systems, and platforms, whereas the government environment was really more about taking existing government and citizen-to-government shared services, and so on, and putting them into the cloud environment.

When you're talking about public cloud, you have to be very specific about the public sector and government, because most governments have their own industry instance of their cloud. In the federal government space, they're acutely aware of the FedRAMP certified public-cloud environments. That can go from moderate risk, where you can have access to the yummy goodness of the entire cloud industry, but then, to FedRAMP High, which would isolate these clouds into their own environments in order to increase the level of protection and lower the risk to the government.

So, the cloud service provider (CSP) created instances of these commercial clouds fit-for-purpose for the federal government. In that case, if we're talking about enterprise applications shifting to the cloud, we're seeing the public sector government side, at the national level, move very rapidly, compared to some of the commercial enterprises who are more leery about what the implications of that movement may be over a period of time. There isn't anybody that's mandating that they do that by law, whereas that is the case on the government side.

Attracting contracts

Gardner: Dave, it seems that if I were a public cloud provider, I couldn't think of a better customer, a better account in terms of size and longevity, than some major government agencies. What are we seeing from the cloud providers in trying to attract the government contracts and perhaps provide the level of interoperability and standardization that they require?

Linthicum: The big three -- Amazon, Google and Microsoft -- are really making an effort to get into that market. They all have federal sides to their house. People are selling into that space right now, and I think that they're seeing some progress. The FAA and certainly the DoD have been moving in that direction.

However, they do realize that they have to build a net new infrastructure, a net new way of doing procurement to get into that space. In the case where the US is building the world’s biggest private cloud at the CIA, they've had to change their technology around the needs of the government.

They see it as really the "Fortune 1." They see it as the largest opportunity that’s there, and they're willing to make huge investments in the billions of dollars to capture that market when it arrives.

Gardner: It seems to me, Chris, that we might be facing a situation where we have cloud providers offering a set of services to large government organizations, but perhaps a different set to the private sector. From an interoperability and standardization perspective, that doesn’t make much sense to me.

What’s your perspective on how public cloud services and standardization are shaping up? Where did you expect things to be at this point?

Harding: The government has an additional dimension to that of the private sector when it comes to procurement, in terms of the need to be transparent and to spend the money that's entrusted to them by the public in a wise manner. One of the issues they have with a lack of standardization is that it makes it more difficult for them to show that they're visibly getting the best deals for the taxpayers when they come to procure cloud services.

In fact, The Open Group produced a guide to cloud computing for business a couple of years ago. One of the things that we argued in that guide was that, when procuring cloud services, the enterprise should model the use that it intends to make of the cloud services and therefore be able to understand the costs that it is likely to incur. This is perhaps more important for government, even more than it is for private enterprises. And you're right, the lack of standardization makes it more difficult for them to do this.
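As a back-of-the-envelope sketch of the usage modeling that guide recommends -- the workload figures and unit prices below are invented for illustration, not any provider's actual rates -- an enterprise might estimate its likely monthly spend like this:

```python
# Hypothetical monthly usage model for one workload.
usage = {
    "vm_hours": 3 * 24 * 30,      # three always-on virtual machines
    "storage_gb_months": 500,     # object storage held for the month
    "egress_gb": 200,             # data transferred out to users
}

# Assumed unit prices in dollars; replace with a provider's actual price list.
prices = {
    "vm_hours": 0.10,
    "storage_gb_months": 0.02,
    "egress_gb": 0.09,
}

monthly_cost = sum(quantity * prices[item] for item, quantity in usage.items())
print(f"Estimated monthly cost: ${monthly_cost:,.2f}")
```

Running the same model against each candidate provider's price list is one simple way to show, transparently, which deal is best for the taxpayer.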

Gardner: Chris, do you think that interoperability is of a higher order of demand in public-sector cloud acquisition than in the private sector, or should there be any differentiation?

Need for interoperability

Harding: Both really have the need for interoperability. The public sector perhaps has a greater need, simply because it’s bigger than a small enterprise and it’s therefore more likely to want to use more cloud services in combination.

Gardner: We've certainly seen a lot of open-source platforms emerge in private cloud as well as hybrid cloud. Is that a driving force yet in the way that the public sector is looking at public cloud services acquisition? Is open source a guide to what we should expect in terms of interoperability and standardization in public-cloud services for eGovernment?

Szakal: Open source, from an application implementation point of view, is one of the questions you're asking, but are you also suggesting that somehow these cloud platforms will be reconsidered or implemented via open source? There's truth to both of those statements.

IBM is the number two cloud provider in the federal government space, if you look at hybrid and the commercial cloud for which we provide three major cloud environments. All of those cloud implementations are based on open source -- OpenStack and Cloud Foundry are key pieces of this -- as well as the entire DevOps lifecycle.

So, open source is important, and if you think of open source as what we call in The Open Group environment "Executable Standards," then it is a way to ensure interoperability.

That’s more important at the cloud-stack level than it is between cloud providers, because between cloud providers you're really going to be talking about API-driven interoperability, and we have that down pretty well.

So, the economy of APIs and the creation of these composite services are going to be very, very important elements. If they're closed and not open to following the normal RESTful approaches defined by the W3C and other industry consortia, then it's going to be difficult to create these composite clouds.
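To illustrate what API-level interoperability can mean in practice -- a sketch only, with hypothetical endpoint URLs and token handling rather than any real provider's API -- a composite service might consume two providers' object stores through one thin RESTful wrapper:

```python
import requests

class CloudObjectStore:
    """Thin, provider-neutral wrapper: as long as each provider exposes a
    RESTful object API, the composite application code stays the same."""

    def __init__(self, base_url, token):
        self.base_url = base_url                      # hypothetical endpoint
        self.headers = {"Authorization": f"Bearer {token}"}

    def get_object(self, bucket, key):
        response = requests.get(f"{self.base_url}/{bucket}/{key}",
                                headers=self.headers)
        response.raise_for_status()
        return response.content

# The same composite service can then read from two different clouds.
provider_a = CloudObjectStore("https://objects.provider-a.example/v1", "TOKEN_A")
provider_b = CloudObjectStore("https://objects.provider-b.example/v1", "TOKEN_B")
```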

Gardner: We saw that OpenStack had its origins in a government agency, NASA. In that case, clearly a government organization, at least in the United States, was driving the desire for interoperability and standardization, a common platform approach. Has that been successful, Dave? Why wouldn’t the government continue to try to take that approach of a common, open-source platform for cloud interoperability?

Linthicum: OpenStack has had some fair success, but I wouldn't call it excellent success. One of the issues is that the government left it dangling out there; while the government uses some aspects of it, I really expected them to drive more adoption around that open standard, for lots of reasons.

So, they have to hack the operating systems and meet very specific needs around security, governance, compliance, and things like that. They have special use cases, such as the DoD, weapons control systems in real time, and some IoT stuff that the government would like to move into. So, that’s out there as an opportunity.
In other words, the ability to work with some of the distros out there, and there are dozens of them, and get into a special government version of that operating system, which is supported openly by the government integrators and providers, is something they really should take advantage of. It hasn’t happened so far and it’s a bit disappointing.

Insight into Europe

Gardner: Do any of you have any insight into Europe and some of the government agencies there? They haven’t been shy in the past about mandating certain practices when it comes to public contracts for acquisition of IT services. I think cloud should follow the same path. Is there a big difference in what’s going on in Europe and in North America?

Szakal: I just got off the phone a few minutes ago with my counterpart in the UK. The nice thing about the way the UK government is approaching cloud computing is that they're trying to do so by taking the handcuffs off the vendors and making sure that they are standards-based. They're meeting a certain quality of services for them, but they're not mandating through policy and by law the structure of their cloud. So, it allows for us, at least within IBM, to take advantage of this incredible industry ecosystem you have on the commercial side, without having to consider that you might have to lift and shift all of this very expensive infrastructure over to these industry clouds.

The EU is, in similar ways, following a similar practice. Obviously, data sovereignty is really an important element for most governments. So, you see a lot of focus on data sovereignty and data portability, more so than we do around strict requirements in following a particular set of security controls or standards that would lock you in and make it more difficult for you to evolve over a period of time.

Gardner: Chris Harding, to Andras’ point about data interoperability, do you see that as a point on the arrow that perhaps other cloud interoperability standards would follow? Is that something that you're focused on more specifically than more general cloud infrastructure services?

Harding: Cloud is a huge spectrum, from the infrastructure services at the bottom, up to the business services, the application services, and software as a service (SaaS), and data interoperability sits on top of that stack.

I'm not sure that we're ready to get real data interoperability yet, but the work that's being done on trying to establish common frameworks for understanding data, for interpreting data, is very important as a basis for gaining interoperability at that level in the future.

We also need to bear in mind that the nature of data is changing. It’s no longer a case that all data comes from a SQL database. There are all sorts of ways in which data is represented, including human forms, such as text and speech, and interpreting those is becoming more possible and more important.

This is the exciting area, where you see the most interesting work on interoperability.

Gardner: Dave Linthicum, one of the things that some of us who have been proponents of cloud for a number of years now have looked to is the opportunity to get something that couldn’t have been done before, a whole greater than the sum of the parts.
It seems to me that if you have a common cloud fabric and a sufficient amount of interoperability for data, applications, and infrastructure services, and that cuts across both the public and the private sector, then many of the difficulties we've had -- health insurance payer-and-provider interoperability and communication, sharing of government services and data with the private sector -- many of the things that have probably been blamed on bureaucracy and technical backwardness, could be solved if a common public cloud approach were adopted by the major public cloud providers. It seems to me a very significant benefit could be drawn when the public and private sectors have a commonality that owning your own data centers, as in the past, just couldn't provide.

Am I chewing on too much pie in the sky here, Dave, or is there actually something to be said about the cloud model, not just between government to government agencies, but the public and private sectors?

Getting more savvy

Linthicum: The public-cloud providers out there, the big ones, are getting more savvy about providing interoperability, because they realized that it’s going to be multi-cloud. It’s going to be different private and public cloud instances, different kinds of technologies, that are there, and you have to work and play well with a number of different technologies.

However, to be a little bit more skeptical, over the years, I've found out that they're in it for their own selfish interests, and they should be, because they're corporations. They're going to basically try to play up their technology to get into a market and hold on to the market, and by doing that, they typically operate against interoperability. They want to make it as difficult as possible to integrate with the competitors and leverage their competitors’ services.

So, we have that kind of dynamic going on, and it’s incredibly frustrating, because we can certainly stand up, have the discussion, and reveal the concepts. You just did a really good job in revealing that this has been Nirvana, and we should start moving in this direction. You will typically get lots of head-nodding from the public-cloud providers and the private-cloud providers but actions speak louder than words, and thus far, it’s been very counterproductive.

Interoperability is occurring but it’s in dribs and drabs and nothing holistic.

Gardner: Chris, it seems as if the earlier you try to instill interoperability and standardization both in technical terms, as well as methodological, that you're able to carry that into the future where we don't repave cow paths, but we have highly non-interoperable data centers replaced by them being in the cloud, rather than in some building that you control.

What do you think is going to be part of the discussion at The Open Group Paris Event, October 24, around some of these concepts of eGovernment? Shouldn’t they be talking about trying to make interoperability something that's in place from the start, rather than something that has to be imposed later in the process?

Harding: Certainly this will be an important topic at the forthcoming Paris event. My personal view is that the question of when you should standardize something to gain interoperability is a very difficult balancing act. If you do it too late, then you just get a mess of things that don’t interoperate, but equally, if you try to introduce standards before the market is ready for them, you generally end up with something that doesn’t work, and you get a mess for a different reason.

Part of the value of industry events, such as The Open Group events, is for people in different roles in different organizations to be able to discuss with each other and get a feel for the state of maturity and the directions in which it's possible to create a standard that will stick. We're seeing a standard paradigm, the API paradigm, that was mentioned earlier. We need to start building more specific standards on top of those, and certainly in Paris and at future Open Group events, those are the things we'll be discussing.

Gardner: Andras, you wear a couple of different hats. One is the Chief Technology Officer at IBM US Federal, but you're also very much involved with The Open Group. I think you're on the Board of Directors. How do you see the progression of what The Open Group has been able to do in other spheres around standardization -- both the methodological, such as TOGAF®, an Open Group enterprise architecture framework standard, as well as the implementation and enforcement of standards? Is what The Open Group has done in the past something you expect to be applicable to these cloud issues?

Szakal: IBM has a unique history, being one of the only companies in the technology arena that is over 100 years old and has been able to retain great value for its customers over that long period of time. We shifted from a fairly closed computing environment to this idea of open interoperability and freedom of choice.

That's our approach for our cloud environment as well. What drives us in this direction is that our customers require it from IBM; we're a common infrastructure and the glue that binds together many of our enterprise customers and the largest financial, banking, and healthcare institutions in the world, to ensure that they can interoperate with other vendors.
As such, we were one of the founders of The Open Group, which has been at the forefront of helping facilitate this discussion about open interoperability. I'm totally with Chris as to when you would approach that. As I said before, my concern is that you interoperate at the service level in the economy of APIs. That would suggest that there are some other elements for that, not just the API itself, but the ability to effectively manage credentials, security, or some other common services, like being able to manage object stores to the place that you would like to be able to store your information, so that data sovereignty isn’t an issue. These are all the things that will occur over a period of time.

Early days

It's early, heady days in the cloud world, and we're going to see all of that goodness come to pass as we go forward. In reality, we talk about cloud as if it's a thing. Its true value isn't so much in the technology, but in creating these new disruptive business capabilities and business models. Openness of the cloud doesn't, by itself, facilitate the creation of those new business models.

That’s where we need to focus. Are we able to actually drive these new collaborative models with our cloud capabilities? You're going to be interoperating with many CSPs not just two, three, or four, especially as you see different factors grow into the cloud. It won’t matter where they operate their cloud services from; it will matter how they actually interoperate at that API level.

Gardner: It certainly seems to me that the interoperability is the killer application of the cloud. It can really foster greater inter-department collaboration and synergy, government to government, state to federal, across the EU, for example as well, and then also to the private sector, where you have healthcare concerns and you've got monetary and banking and finance concerns all very deeply entrenched in both public and private sectors. So, we hope that that’s where the openness leads to.

Chris, before we wrap up, it seems to me that there's a precedent that has been set successfully with The Open Group, when it comes to security. We've been able to do some pretty good work over the past several years with cloud security using the adoption of standards around encryption or tokenization, for example. Doesn’t that sort of give us a path to greater interoperability at other levels of cloud services? Is security a harbinger of things to come?

Harding: Security certainly is a key aspect that needs to be incorporated in the standards we build on the API paradigm. But some people talk about the move to digital transformation, the digital enterprise. Cloud and other things like IoT, big-data analysis, and so on are all coming together, and a key underpinning requirement for that is platform integration. That's where the Open Platform 3.0™ Forum of The Open Group is centering on the possibilities for platform interoperability to enable digital platform integration. Security is a key aspect of that, but there are other aspects too.

Gardner: I am afraid we will have to leave it there. We've been discussing the latest developments in eGovernment and cloud adoption with a panel of experts. Our focus on these issues comes in conjunction with The Open Group Paris Event and Member Meeting, October 24-27, 2016 in Paris, France, and there is still time to register and find more information on that event, and many others coming in the near future.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: The Open Group.

You may also be interested in:

Drill to Detail Podcast: Data Modeling, Data Vault, and Snowflake!

Check out Episode 5 of Drill to Detail, where Mark Rittman asks me about data modeling, data vault, and Snowflake DB!
BBBT Hosts Webinar with Splunk on Leveraging Machine Data to Deliver New Insights for Business Analytics

This Friday, October 21, 2016, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar with Splunk on enabling organizations to harness the transformative power of machine data. Splunk provides the leading software platform for real-time Operational Intelligence.

(PRWeb October 18, 2016)

Read the full story at

Salesforce Acquires BeyondCore to Enable Analytics . . . and More

In October of 2014, Salesforce announced the launch of Salesforce Wave, the cloud-based company’s analytics cloud platform. By that time, Salesforce had already realized that to be able to compete with the powerful incumbents in the business software arena—the Oracles, SAPs and IBMs of the world—arriving to the cloud at full swing would require it to expand its offerings to the business
Payment Privacy: Do New Apps Protect User Data?

If you’ve downloaded iOS 10 for your iPhone, you may have noticed that Apple Pay is a more prominent component of the Wallet and that Apple Pay is just one of the many alternative payment systems being used in big stores, tiny craft shops, and even among friends today. Some of these, like PayPal, have been around for a long time, while many are part of a new, increasingly digital economy.

Unfortunately, though many are eager to use these new apps and simplify their financial exchanges, not all of these mobile alternatives are ready for the big time. In fact, some may be putting you and your customers at financial risk. If you’re a business owner, it’s important to know which of these apps you can trust and which should be tossed back into the digital sea.

Business Ready, Customer Safe

For companies that are ready to expand beyond PayPal and Apple Pay, there are definitely a number of apps that are business-ready and won't compromise customer data, including 2CheckOut, a simple e-commerce app open to both businesses and individuals, and Payline Data, an e-commerce provider that centers transparency in its business.

Payline Data offers two different tiers, much like the send money to ...

Read More on Datafloq
How Millennials are Fueling Demand for Data-Driven Transportation

How Millennials are Fueling Demand for Data-Driven Transportation

There’s long been a myth that millennials don’t buy cars—a theory that has been slowly losing credibility over the past few years, as more young people begin to lease and purchase vehicles. In fact, automakers saw record sales in 2015, following several sluggish years of decline. The myth of the absent millennial car buyer was largely based on speculative theories, which just haven’t held up under scrutiny. However, there are differences in what millennials prioritize when it comes to transportation—and these attitudes will ultimately lead to increased demand for data-driven transportation, both in private vehicles and public transit. Here’s why:

Young Adults are Interested in Electric Cars

More automobile manufacturers are getting into the electric car business, and young adults are eager to get these new vehicles on the road. According to The Consumer Federation of America’s polls, 50% of participants 18-34 years old would consider buying an electric vehicle, and this age group showed the most interest in these vehicles of all participants. Aside from the obvious environmental angle and the cars’ lower maintenance costs, many electric cars are among the new wave of vehicles that integrate GPS, traffic tracking and analysis, and other technology that can sync with users’ other ...

Read More on Datafloq
Data Preparation: Is the Dream of Reversing the 80/20 Rule Dead?

Data Preparation: Is the Dream of Reversing the 80/20 Rule Dead?

I recently had someone ask me, “For years we’ve talked about changing analytics from 80% data prep and 20% analytics to 20% data prep and 80% analytics, yet we still seem stuck with 80% data prep. Why is that?” It is a very good question about a very real issue that causes many people frustration.

I believe that there is actually a good answer to it and that the perceived lack of progress is not as bad as it first appears. To explain, we need to differentiate between a new data source and/or a new business problem and existing ones we have addressed before.

Breaking New Ground

Whenever a new data source is first acquired and analyzed, there is a lot of initial work required to understand, cleanse, and assess the data. Without that initial work, it isn’t possible to perform effective analysis. Much of the work will be a one-time effort, but it can be substantial. For example, determining how to identify and handle inaccurate sensor readings or incorrectly recorded prices.

From the earliest days of my career, some of the most challenging work has been working with new data. For the first couple of analytics on a new data source, the ratio ...

Read More on Datafloq
Artificial Intelligence Regulation: Let’s not Regulate Mathematics!

Artificial Intelligence Regulation: Let’s not Regulate Mathematics!

On Wednesday, ahead of today’s White House Frontiers Conference, the White House Office of Science and Technology Policy released its report on Preparing for the Future of Artificial Intelligence. The report is optimistic, comprehensive and well-balanced. In summary: full-speed ahead.  But let’s be smart when it comes to Artificial Intelligence regulation.

The premise is that we are going toward an AI-based future, mostly for the common good. The progress of AI needs to be encouraged through investment, training and education. When AI is incorporated into existing applications, regulators should consider risks as well as benefits before intervening, and Artificial Intelligence regulation should not be used to arbitrarily burden or slow down the development of AI. That said, safety and ethics should be primary concerns as we move AI systems from the lab into the much more unpredictable real world. All sensible stuff.  

The difficulty of Artificial Intelligence regulation

Unsurprisingly, though, all is not completely straightforward. The question of Artificial Intelligence regulation poses considerable challenges. The OSTP report discusses the issues of fairness and transparency, and brings up two different concerns:

The need to prevent automated systems from making decisions that discriminate against certain groups or individuals.
The need for transparency in AI systems, in the ...

Read More on Datafloq
12 Interesting Big Data Careers That Everyone Should Know

12 Interesting Big Data Careers That Everyone Should Know

Ever wondered how people as young as 25-30 become CTOs and attain exponential growth in a short time?

Sure, they have the talent, but they also take the right steps to grow in their careers. They are very clear about what they want to achieve and create milestones to get there.

Like them, have you planned your career to succeed? If not, it is not too late to rework your career plan. If you are a graduate, then some data science opportunities await you.

A study says that data science is going to open up as many as 10 million jobs in this decade. Now, since you already know there are many opportunities, how do you leverage your skills to tap into them? First and foremost, look at what skills define you. Is it your expertise, your visualization skills, or your management skills that you not only demonstrate but also enjoy using?

Once you're clear on that, work towards it and learn the different software languages that are trending in the industry and in high demand. Take up certification courses that can give you the much-needed edge. After you build your portfolio with technical skills, a broad range of data job profiles can help you settle in and earn a ...

Read More on Datafloq
Four Tips for Setting up A Server Room For Your Company

Four Tips for Setting up A Server Room For Your Company

Although there is a growing trend for businesses to use the cloud to store their data, many businesses still rely on traditional servers. If your business falls into this category, you most likely will need to create a dedicated room where the servers can be stored. Your server room can range from extremely basic to elaborate, depending on the needs of your business.

If you work with an outside technology advisor, they can provide you with input on how best to set up the room including what equipment to invest in and how to organize the servers within the room. On the other hand, if you are working on your own, you will need to educate yourself about how best to go about designing the space. The following tips should give you a good jumping off point that you can use to get started.

Find The Best Server Rack For Your Business

Server racks come in many different styles and sizes. Finding one that is well-suited to your needs depends on how you plan to use it. For instance, do you plan on using it primarily to store data or for handling backups? After you decide which racks are the best choice for ...

Read More on Datafloq
How Big Data is Changing the World Around Us

How Big Data is Changing the World Around Us

Developing technologies and new businesses demonstrate the power of big data and how it has come to define our lives in the modern world. From saving lives to selling expensive goods, big data has been helping individuals, businesses and governments find new solutions to problems. So what is big data about? As implied, it is a collection of huge data sets that are analyzed to reveal trends, patterns, associations and causes related to human behavior, interactions and conditions. One would be surprised to know that all of this data comes from the digital interactions we have every day.

From search engine queries to the products we buy, everything we do on the internet gets registered. The penetration of smartphones has leveraged this data-collection opportunity to a huge extent in recent times. Today, more than a billion Google searches are made every day and 294 billion emails are sent by users across the globe. Wearable and smart technologies have further ensured that the world is connected, with trillions of sensors that track, monitor, communicate and even help with real-world functions. Big data has no limits on its sources. From Tweets ...

Read More on Datafloq
BBBT Hosts Trifacta Webinar on Data Wrangling for Any User, Any Data, and Any Cloud

BBBT Hosts Trifacta Webinar on Data Wrangling for Any User, Any Data, and Any Cloud

This Friday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from Trifacta on its emergence as the global leader in data wrangling. Trifacta has more than 16,000 users at more than 3,500 companies, and the largest partner ecosystem of any vendor.

(PRWeb October 12, 2016)

Read the full story at

What is Artificial Intelligence? Louis Monier explains Everything

What is Artificial Intelligence? Louis Monier explains Everything

Our Chief Scientist Louis Monier gives you the straight dope on AI.

Artificial Intelligence, always a very polarizing subject, is back on top of the news. Unless you have been on a deep space mission for the past year, you have been exposed to opinions ranging from “this will change everything for the better” to “this will spell our doom”.

But what are the facts? What is Artificial Intelligence? Why is it seeing a resurgence now? How can I benefit from it? How does it affect my business? This is the first in a series of posts where we answer the most common questions about Artificial Intelligence.

So what is Artificial Intelligence?

Put simply Artificial Intelligence, or AI, is the study of tasks that are effortless for humans but very difficult for machines. Let’s think of what we can accomplish within the first few years of our lives:

We can recognize people and objects, and pick up a voice out of a noisy background.
We are on our way to mastering a language by our third year.
We acquire a lot of common sense, we know a big toy won’t fit inside a small box, that if you play with water you’ll get wet, and that your ...

Read More on Datafloq
Why Big Data as a Service is the Hottest Trend in Cloud Now

Why Big Data as a Service is the Hottest Trend in Cloud Now

This article is sponsored by CloudMoyo - Partner of choice for solutions at the intersection of Cloud & Analytics.

Once in a while, a technology comes along that is so disruptive and so unique that it often takes a few years before applications that make the most effective use of that technology are sufficiently developed. So it is with the advent of cloud computing and its application as a service to analyze big data.

Big data as a service (BDaaS) is an evolution of software as a service (SaaS) and platform as a service (PaaS), with the added ingredient of massive amounts of data. Essentially, the BDaaS offering is a solution for companies to solve problems that they are facing by analyzing and interpreting their data. Organizations around the world have warmed to the idea that their next phase of growth will be driven by understanding the insights that are gleaned from the data which is produced by their interactions.

Why is big data so hard for companies to unpack without help from external service providers? Paul Hoffman, CTO of Space-Time Insight explains, “Organizations are collecting and storing data that is generated within their walls (e.g. business and operational data) as well as externally (e.g. ...

Read More on Datafloq
Big Data in the Fast-paced World of Finance

Big Data in the Fast-paced World of Finance

In 1998, the SEC began to allow online trading of financial securities. It took traders very little time to realize that data, and lots of it, was going to be absolutely essential to making money. Some firms realized that a tiny advantage in data, be it in speed of transfer or quality of data, could lead to huge increases in profits and revenue. Thus began high-speed and algorithmic trading.

Today the market runs on big data. There are a couple of ways it is involved, and all of them have the potential to make or lose trillions of dollars.


The first way big data is involved in the financial industry is via algorithmic trading. This is where computer programs take all available data for a specific stock or market and sift through that data to determine good times to buy and sell as well as good securities to buy and sell.  If the program can identify a pattern, that pattern can be exploited and a lot of money can be made.

When it comes to algorithmic trading, the more data the better. Patterns can be found scattered throughout data, and most algorithms will look for multiple patterns that point to a good purchase. ...
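Purely as an illustration of the pattern-hunting idea described above (not taken from the article), here is a minimal sketch of one of the simplest rules an algorithmic trader might encode: a moving-average crossover that flags possible buy or sell points in a price series. The prices and window lengths are invented for the example.

```python
# Illustrative only: a toy moving-average crossover signal.
# Real algorithmic trading systems use far richer data and risk controls.

def moving_average(prices, window):
    """Trailing moving average; returns None until enough data exists."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def crossover_signal(prices, short_window=5, long_window=20):
    """Return 'buy', 'sell', or 'hold' based on a short/long average crossover."""
    short_ma = moving_average(prices, short_window)
    long_ma = moving_average(prices, long_window)
    if short_ma is None or long_ma is None:
        return "hold"
    if short_ma > long_ma:
        return "buy"    # short-term momentum above the longer trend
    if short_ma < long_ma:
        return "sell"   # short-term momentum below the longer trend
    return "hold"

# Hypothetical price history for one security
prices = [101.2, 101.5, 100.9, 102.3, 103.0, 103.4, 102.8, 104.1]
print(crossover_signal(prices, short_window=3, long_window=6))
```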

Read More on Datafloq
How propelling instant results to the Excel edge democratizes advanced analytics

How propelling instant results to the Excel edge democratizes advanced analytics

The next BriefingsDirect Voice of the Customer digital transformation case study explores how powerful and diverse financial information is newly and uniquely delivered to the ubiquitous Excel spreadsheet edge.

We'll explore how HTI Labs in London provides the means and governance with its Schematiq tool to bring critical data services to the interface users want. By leveraging the best of instant cloud-delivered data with spreadsheets, Schematiq democratizes end-user empowerment while providing powerful new ways to harness and access complex information.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how complex cloud core-to-edge processes and benefits can be better managed and exploited we're joined by Darren Harris, CEO and Co-Founder of HTI Labs, and Jonathan Glass, CTO and Co-Founder of HTI Labs, based in London. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Let's put some context around this first. What major trends in the financial sector led you to create HTI Labs, and what are the problems you're seeking to solve?

Harris: Obviously, in finance, spreadsheets are widespread and are being used for a number of varying problems. A real issue started a number of years ago, when spreadsheets got out of control. People were using them everywhere, which created a lot of operational risk in processes. Firms wanted to get their hands around it for governance, and there were loads of Excel-type issues that needed to be eradicated.

That led to the creation of centralized teams that locked down rigid processes and effectively took away a lot of the innovation and discovery process that traders are using to spot opportunities and explore data.

Through this process, we're trying to help with governance to understand the tools to explore, and [deliver] the ability to put the data in the hands of people ... [with] the right balance.

So by taking the best of regulatory scrutiny around what a person needs, and some innovation that we put into Schematiq, we see an opportunity to take Excel to another level -- but not sacrifice the control that’s needed.

Gardner: Jonathan, are there technology trends that allowed you to be able to do this, whereas it may not have been feasible economically or technically before?

Upstream capabilities

Glass: There are a lot of really great back-end technologies available now, along with the ability to scale compute resources either internally or externally. Essentially, the desktop remains quite similar. Excel has remained much the same, but the upstream capabilities have really grown.

So there's a challenge. Data that people feel they should have access to is getting bigger, more complex, and less structured. So Excel, which is this great front end for coming to grips with data, is becoming a bit of a bottleneck in terms of actually keeping up with the data that's out there that people want.

Gardner: So, we're going to keep Excel. We're not going to throw the baby out with the bathwater, so to speak, but we are going to do something a little bit different and interesting. What is it that we're now putting into Excel and how is that different from what was available in the past?

Harris: Schematiq extends Excel and allows it to access unstructured data. It also reduces the complexity and technical limitations that Excel has as an out-of-the-box product.

We have the notion of a data link that's effectively in a single cell that allows you to reference data that’s held externally on a back-end site. So, where people used to ingest data from another system directly into Excel, and effectively divorce it from the source, we can leave that data where it is.
It's a paradigm of take a question to the data; don’t pull the data to the question. That means we can leverage the power of the big-data platforms and how they process an analytic database on the back-end, but where you can effectively use Excel as the front screen. Ask questions from Excel, but push that query to the back-end. That's very different in terms of the model that most people are used to working with in Excel.
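To make the "take the question to the data" paradigm concrete, here is a minimal, hypothetical sketch (not Schematiq's actual API) contrasting pulling a whole table into the client with pushing an aggregate query down to the back-end and bringing back only the answer. Python's built-in sqlite3 stands in for a large analytic store.

```python
import sqlite3

# Stand-in back-end store (in real life this would be a large analytic database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("rates", 1.2e6), ("fx", 8.0e5), ("rates", 3.5e6)])

# Anti-pattern: pull every row to the client, then summarise locally
# (the "ingest the data and divorce it from the source" model).
rows = conn.execute("SELECT desk, notional FROM trades").fetchall()
totals = {}
for desk, notional in rows:
    totals[desk] = totals.get(desk, 0.0) + notional

# Pushdown pattern: send the question to the data and return only the answer,
# leaving the detail where it lives.
pushed_down = conn.execute(
    "SELECT desk, SUM(notional) FROM trades GROUP BY desk"
).fetchall()

print(totals)        # computed client-side
print(pushed_down)   # computed by the back-end
```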

Gardner: This is a two-way street. It's a bit different. And you're also looking at the quality, compliance, and regulatory concerns over that data.

Harris: Absolutely. An end-user is able to break down or decompose any workflow process with data and debug it the same way they can in a spreadsheet. The transparency that we add on top of Excel’s use with Schematiq allows us to monitor what everybody is doing and the function they're using. So, you can give them agility, but still maintain the governance and the control.

In organizations, lots of teams have become disengaged. IT has tried to create some central core platform that’s quite restrictive, and it's not really serving the users. They have gotten disengaged and they've created what Gartner referred to as the Shadow BI Team, with databases under their desk, and stuff like that.

By bringing in Schematiq we add that transparency back, and we allow IT and the users to have an informed discussion -- a very analytic conversation -- around what they're using, how they are using it, where the bottlenecks are. And then, they can work out where the best value is. It's all about agility and control. You just can't give the self-service tools to an organization and not have the transparency for any oversight or governance.

To the edge

Gardner: So we have, in a sense, brought this core to the edge. We've managed it in terms of compliance and security. Now, we can start to think about how creative we can get with what's on that back-end that we deliver. Tell us a little bit about what you go after, what your users want to experiment with, and then how you enable that.

Glass: We try to be as agnostic to that as we can, because it's the creativity of the end-user that really drives value.

We have a variety of different data sources, traditional relational databases, object stores, OLAP cubes, APIs, web queries, and flat files. People want to bring that stuff together. They want some way that they can pull this stuff in from different sources and create something that's unique. This concept of putting together data that hasn't been put together before is where the sparks start to fly and where the value really comes from.

Gardner: And with Schematiq you're enabling that aggregation and cleansing ability to combine, as well as delivering it. Is that right?

Harris: Absolutely. It's that discovery process. It may be very early on in a long chain. This thing may progress to be something more classic, operational, and structured business intelligence (BI), but allowing end-users the ability to cleanse, explore data, and then hand over an artifact that someone in the core team can work with or use as an asset. The iteration curve is so much tighter and the cost of doing that is so much less. Users are able to innovate and put together the scenario of the business case for why this is a good idea.

The only thing I would add to the sources that Jon has just mentioned is with HPE Haven OnDemand, [you gain access to] the unstructured analytics, giving the users the ability to access and leverage all of the HPE IDOL capabilities. That capability is a really powerful and transformational thing for businesses.

They have such a set of unstructured data [services] available in voice and text, and when you allow business users access to that data, the things they come up with, their ideas, are just quite amazing.

Technologists always try to put themselves in the minds of the users, and we've all historically done a bad job of making the data more accessible for them. When you allow them the ability to analyze PDFs without structure, to share that, to analyze sentiment, to include concepts and entities, or even enrich a core proposition, you're really starting to create innovation. You've raised the awareness of all of these analytics that exist in the world today in the back-end, shown end-users what they can do, and then put their brains to work discovering and inventing.
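As a hedged illustration of what calling such a hosted text-analytics service looks like from code, the sketch below posts raw text to a sentiment endpoint and prints the JSON result. The endpoint path and parameter names follow the pattern of the since-retired Haven OnDemand REST API and should be treated as assumptions; the API key is a placeholder.

```python
import requests

# Hedged sketch: send unstructured text to a hosted analytics API for sentiment.
# Endpoint and parameter names are assumptions modelled on the retired
# HPE Haven OnDemand service; substitute your provider's real values.
API_KEY = "your-api-key-here"   # hypothetical credential
ENDPOINT = "https://api.havenondemand.com/1/api/sync/analyzesentiment/v1"

def analyze_sentiment(text):
    """Post raw text to the analytics service and return its JSON verdict."""
    response = requests.post(ENDPOINT, data={"apikey": API_KEY, "text": text})
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = analyze_sentiment("The service visit was quick and the engineer was excellent.")
    print(result)
```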

Gardner: Many of these financial organizations are well-established, many of them for hundreds of years perhaps. All are thinking about digital transformation, the journey, and are looking to become more data-driven and to empower more people to take advantage of that. So, it seems to me you're almost an agent of digital transformation, even in a very technical and sophisticated sector like finance.

Making data accessible

Glass: There are a lot of stereotypes in terms of who the business analysts are and who the people are that come up with ideas and intervention. The true power of democratization is making data more accessible, lowering the technical barrier, and allowing people to explore and innovate. Things always come from where you least expect them.

Gardner: I imagine that Microsoft is pleased with this, because there are some people who are a bit down on Excel. They think that it's manual, that it's by rote, and that it's not the way to go. So, you, in a sense, are helping Excel get a new lease on life.

Glass: I don’t think we're the whole story in that space, but I love Excel. I've used it for years and years at work. I've seen the power of what it can do and what it can deliver, and I have a bit of an understanding of why that is. It’s the live nature of it, the fact that people can look at data in a spreadsheet, see where it’s come from, see where it’s going, they can trust it, and they can believe in it.
That’s why what we're trying to do is create these live connections to these upstream data sources. There are manual steps, download, copy/paste, move around the sheet, which is where errors creep in. It’s where the bloat, the slowness, and the unreliability can happen, but by changing that into a live connection to the data source, it becomes instant and it goes back to being trusted, reliable, and actionable.

Harris: There's something in the DNA, as well, of how people interact with data and so we can lay out effectively the algorithm or the process of understanding a calculation or a data flow. That’s why you see a lot of other systems that are more web-based or web-centric and replicate an Excel-type experience.

The user starts to use it and starts to think, "Wow, it’s just like Excel," and it isn’t. They hit a barrier, they hit a wall, and then they hit the "export" button. Then, they put it back [into Excel] and create their own way to work with it. So, there's something in the DNA of Excel and the way people lay things out. I think of [Excel] almost like a programming environment for non-programmers. Some people describe it as a functional language, very much like Haskell, and the Excel functions they write effectively work through and navigate the data.

Gardner: No need to worry that if you build it, will they come; they're already there.

Harris: Absolutely.

Gardner: Tell us a bit about HTI Labs and how your company came about, and where you are on your evolution.

Cutting edge

Harris: HTI Labs was founded in 2012. The core backbone of the team worked for the same tier 1 investment bank, building risk and trading systems for front-office teams. We were really, I suppose, at the cutting edge of all the big data technologies that were being used at the time -- real-time, distributed graphs and cubes, and everything.

As a core team, it was about taking that expertise and bringing it to other industries: using Monte Carlo farms in risk calculations, the ability to export data at speed, and real-time risk. These things were becoming more central to other organizations, which was an opportunity.

At the moment, we're focusing predominantly on energy trading. Our software is being used across a number of other sectors, and our largest client has installed Schematiq on 120 desktops, which is great. That’s a great validation of what we're doing. We're also a member of the London Stock Exchange Elite Program for high-growth companies, based in London.

Glass: Darren and I met when we were working for the same company. I started out as a quant doing the modeling, the math behind pricing, but I found that my interest lay more in the engineering. Rather than doing it once, can I do it a million times, can I do these things reliably and scale them?

Because I started in a front-office environment, it was very spreadsheet-dominated, very VBA-dominated. There's good and bad in that, and a lot of lessons in it. Darren and I met up and crossed the divide together, from the top-down, big IT systems to the bottom-up, end-user-developed spreadsheets, and so on. We found a middle ground together, which we feel is quite a powerful combination.

Gardner: Back to where this leads. We're seeing more-and-more companies using data services like Haven OnDemand and starting to employ machine learning, artificial intelligence (AI), and bots to augment what the humans do so well. Is there an opportunity for that to play here, or maybe it already is? The question basically is, how does AI come to bear on what you can deliver out to the Excel edge?

Harris: I think what you see is that out of the box, you have a base unit of capability. The algorithms are built, but the key to making them so much more improved is the feedback loop between your domain users, your business users, and how they can enrich and train effectively these algorithms.

So, we see a future where the self-service BI tools that they use to interact with data and explore would almost become the same mechanism where people will see the results from the algorithms and give feedback to send back to the underlying algorithm.

Gardner: And Jonathan, where do you see the use of bots, particularly perhaps with an API model like Haven OnDemand?

The role of bots

Glass: The concept of bots is replicating an insight or a process that somebody might already be doing manually. People create these data flows and analyses that they maybe run once, because they're quite time-consuming to run. The really exciting possibility is that you make these things run 24x7. So, rather than having to pull from the data source, you start receiving notifications, in your own mailbox, from the processes you have created. You look at those and decide whether it's a good insight or a bad insight, and you can then start to train it and refine it.

The training and refining is that loop that potentially goes back to IT, gets back through a development loop, and it’s about closing that loop and tightening that loop. That's the thing that really adds value to those opportunities.

Gardner: Perhaps we should unpack Schematiq a bit to understand how one might go back and do that within the context of your tool. Are there several components of the tool, one of which might lend itself to going back and automating?

Glass: Absolutely. You can imagine the spreadsheet has some inputs and some outputs. One of the components within the Schematiq architecture is the ability to take a spreadsheet, to take the logic and the process that’s embedded in our spreadsheet, and turn it into an executable module of code, which you can host on your server, you can schedule, you can run as often as you like, and you can trigger based on events.

It’s a way of emitting code from a spreadsheet. You take the insight, without a business-analysis loop and a development loop, and you take the exact thing that the user, the analyst, has programmed. You make it into something that you can run, commoditize, and scale. That’s quite an important way in which we shorten that development loop. We create a cycle that’s tight and rapid.
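As a rough, hypothetical sketch of the general idea (not Schematiq's actual mechanism), the snippet below treats a piece of "spreadsheet logic" as an ordinary function and runs it on a simple schedule, which is the effect described above: the analyst's exact calculation becomes something that can be hosted, scheduled, and triggered. The function, inputs, and interval are all invented for the illustration.

```python
import time

# Hypothetical stand-in for logic lifted out of a spreadsheet:
# a few inputs in, one derived value out.
def spreadsheet_logic(position, price, fx_rate):
    """Value a position in base currency, as the analyst's sheet would."""
    return position * price * fx_rate

def run_on_schedule(job, interval_seconds, iterations):
    """Tiny scheduler loop; a real deployment would use a proper job runner."""
    for _ in range(iterations):
        print("scheduled run produced:", job())
        time.sleep(interval_seconds)

if __name__ == "__main__":
    # The calculation the analyst built interactively now runs unattended.
    run_on_schedule(lambda: spreadsheet_logic(1000, 101.7, 0.79),
                    interval_seconds=1, iterations=3)
```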

Gardner: Darren, would you like to explain the other components that make-up Schematiq?

Harris: There are four components of Schematiq architecture. There's the workbench that extends Excel and allows the ability to have large structured data analytics. We have the asset manager, which is really all about governance. So, you can think of it like source control for Excel, but with a lot more around metadata control, transparency, and analytics on what people are using and how they are using it.

There's a server component that allows you just to off-load and scale analytics horizontally, if they do that, and build repeatable or overnight processes. The last part is the portal. This is really about allowing end-users to instantly share their insights with other people. Picking up from Jon’s point about the compound executable, but it’s defined in Schematiq. That can be off-loaded to a server and exposed as another API to a computer, the mobile, or even a function.

So, it’s very much all about empowering the end-user to connect, create, govern, share instantly and then allow consumption from anybody on any device.

Market for data services

Gardner: I imagine, given the sensitive nature of the financial markets and activities, that you have some boundaries that you can’t cross when it comes to examining what’s going on in between the core and the edge.

Tell me about how you, as an organization, can look at what’s going on with the Schematiq and the democratization, and whether that creates another market for data services when you see what the demand entails.

Harris: It’s definitely the case that people have internal datasets they create and that they look after. People are very precious about them because they are hugely valuable, and one of the things that we strive to help people do is to share those things.

Across the trading floor, you might effectively have a dozen or more different IT infrastructures, if you think of what’s existing on the desk as being a miniature infrastructure that’s been created. So, it's about making easy for people to share these things, to create master datasets that they gain value from, and to see that they gain mutual value from that, rather than feeling closed in, and don’t want to share this with their neighbors.

If we work together and if we have the tools that enable us to collaborate effectively, then we can all get more done and we can all add more value.

Gardner: It's interesting to me that the more we look at the use of data, the more it opens up new markets and innovation capabilities that we hadn’t even considered before. And, as an analyst, I expect to see more of a marketplace of data services. You strike me as an accelerant to that.

Harris: Absolutely. As the analytics come online and are exposed by APIs, the underlying store that’s used is becoming a bit irrelevant. If you look at what the analytics can do for you, that’s how you consume the insight, and you can connect to other sources. You can connect to Twitter, you can connect to Facebook, you can connect to PDFs; whether it’s NoSQL, structured, columnar, or rows, it doesn’t really matter. You don’t see that complexity. The fact that you can just create an API key, access it as a consumer, and start to work with it is really powerful.

There was the recent example in the UK of a report on the Iraq War. It’s 2.2 million words, it took seven years to write, and it’s available online, but there's no way any normal person could consume or analyze that. That’s three times the complete works of Shakespeare.

Using these APIs, you can start to pull out mentions, countries, and locations, really get into the data, and give anybody with Excel at home, in our case, or any other tool, the ability to analyze it and share those insights. We're very used to media where we get just the headline, and spin comes into play. People turn things on their head, and you really never get to delve into the underlying detail.
What’s really interesting is when democratization and sharing of insights and collaboration comes, we can all be informed. We can all really dig deep, and all these people that work there, the great analysts, could start to collaborate and delve and find things and find new discoveries and share that insight.

Gardner: All right, a little light bulb just went off in my head. Whereas before we would go to a headline and a news story and might have a hyperlink to a source, now I could get a headline and a news story, open up my Excel spreadsheet, get to the actual data source behind the entire story, and then probe and analyze it any which way I wanted to.

Harris: Yes, exactly. I think the most savvy consumer now, the analyst, is starting to demand that transparency. We've seen it in the UK with election messages, quotes, and even financial statistics, where people just don’t believe the headlines. They're demanding transparency in the process, and so governance can only be a good thing.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Can Your Business Survive Without IoT Solutions?

Can Your Business Survive Without IoT Solutions?

When was the last time you saw a Blockbuster? The once-dominant video rental chain has all but disappeared. After going bankrupt in 2011, Blockbuster was bought out by Dish Network. There now seem to be only 13 stores left in the world.

What Went Wrong?

Blockbuster failed to keep up with a rapidly changing industry that was disrupted by Netflix and Redbox. The franchise that had a store on every corner quietly disappeared, and we were too busy binge-watching Netflix to notice.

The IoT Revolution

Industrial companies of all types are now facing a disruption called the Internet of Things. This disruption is being referred to as the next industrial revolution. If Netflix can bankrupt a company as big as Blockbuster, what could an industrial revolution do to unprepared businesses?

IoT is changing the way businesses, individuals, and “things” (inanimate objects) interact, communicate, and react. This results in the automation of tasks, and more importantly, the production of data.  IoT and the data it generates are revolutionizing every major industry. This means businesses either have to adapt or be left behind.

Dangers of the Comfort Zone

As we pointed out in “The IoT Tsunami: Will You Catch the Data Wave”, IoT is experiencing rapid growth. No one can predict exactly how enormous it’s going to get.  ...

Read More on Datafloq
How a Global Gaming Company Used Data Analytics to Stay Ahead of Rivals

How a Global Gaming Company Used Data Analytics to Stay Ahead of Rivals

It may be in the entertainment domain but Wargaming’s data analytics operations are as humongous as that of any Fortune 500 company. Wargaming is a leader in the free-to-play Massively Multiplayer Online (MMO) game market across all gaming platforms – PC, console and mobile.

Just to get a grasp of the enormity of its operations here are some figures:

It has a user base of 150 million & counting
Three of its mainline games - World of Tanks, World of Tanks Blitz and World of Warships - together generate 550 million ‘events’ a day
Every game consists of over 15 different main data sources (battle, session, account, logs)
Players are located in different time zones across Europe, North America, CIS and Asia
Its team has to process over 3TB of raw data daily
Number of employees: Over 4000 in 15 worldwide offices

Business Play

Wargaming, a pioneer in the ‘free-to-play’ concept of games in the West, provides high-quality games for free, but makes money from upselling items within the game itself. Its first blockbuster was ‘World of Tanks’, launched in 2010, and that placed the company on the global gaming map. This first ‘epic’ game was followed by a suite of other games, chiefly involving aircraft and warships.

About 80% ...

Read More on Datafloq
What is the Blockchain – part 3 – Blockchain Startups and Five Challenges to Overcome

What is the Blockchain – part 3 – Blockchain Startups and Five Challenges to Overcome

In this series of posts, I am providing insights into a technology that will change our world. Blockchain has been called as important an invention as the Internet, and Johann Palychata, a research analyst from BNP Paribas, called Blockchain an invention like the steam or combustion engine.

In part 1 of this series I gave an introduction to Blockchain, and in part 2 I provided insights into different types of Blockchain and consensus algorithms. This third part will discuss some of the major challenges we will need to overcome to make Blockchain truly change our world for the better. But first, let’s look at some startups that are trying to change the world through Blockchain.

Blockchain Startups


Ethereum has the ambition to reinvent the internet, and it is well on track to achieve that. Ethereum has been around for a few years now; it is a decentralised platform for developing Decentralised Applications (DApps) that run through smart contracts. These smart contracts are small software programs that execute a task, a sort of If This Then That statement, but a lot more complex. They run on a custom-built blockchain and, as such, there is not a chance for fraud, censorship or ...
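Real smart contracts run as code on the chain itself (on Ethereum they are typically written in Solidity); purely as a conceptual toy that mimics the "If This Then That" shape described above, here is a small sketch of an escrow-style rule. The parties, condition names, and amount are invented for the illustration.

```python
# Conceptual toy only: mimics the "if this then that" shape of a smart contract.
# A real contract lives on-chain and executes without a trusted intermediary.

def escrow_contract(goods_delivered, payment_locked, amount):
    """Release payment to the seller only once delivery is confirmed."""
    if goods_delivered and payment_locked:
        return {"action": "release_payment", "to": "seller", "amount": amount}
    if payment_locked:
        return {"action": "hold_payment", "amount": amount}
    return {"action": "no_op"}

print(escrow_contract(goods_delivered=True, payment_locked=True, amount=250))
```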

Read More on Datafloq
Adat-alapú vezetés (Data-Driven Leadership) meetup on Tuesday

Adat-alapú vezetés (Data-Driven Leadership) meetup on Tuesday

Exploiting the potential of big data solutions cannot be treated as a purely IT task. Here too, these solutions only make a company's operations more efficient if data-driven thinking takes root at the leadership level. On the technology side there are countless opportunities for a professional interested in data science or big data to get a taste of these topics; just think of the many technology meetups open to the public in Budapest today.

The options for the leadership layer are far more limited, which is why I am delighted that next Tuesday, October 11, 2016, the Spark Institute is holding its Adat-alapú vezetés (Data-Driven Leadership) meetup. The Spark Institute's courses target the executive level, aiming to prepare participants to adapt to a changing technology environment and to make use of disruptive technologies.

This is where data-driven thinking and the business side of the big data world come into the picture. At Tuesday's meetup I will give the opening talk on the technological and business aspects of big data, then Szukács István (StreamBright Data) will speak about the potential of recommender systems, followed by Vértes Balázs ( on the quality assurance of online advertising.

October 11, 2016, 7:00 PM

Adat-alapú vezetés (Data-Driven Leadership) meetup - Spark Institute

Venue: LogMeIn - 1061 Budapest, Paulay Ede u. 12.

Please note, the event is not organized by , and attendance requires registration; you can register on the event's website.


DMZ Europe 2016

DMZ Europe 2016

This year’s European Data Modeling Zone (DMZ) will take place in the wonderful German capital, Berlin, and I’m very happy to again be a speaker at this great event! This year I’ll speak about how to start with a conceptual model, move to a logical model, and finally model the physical Data Vault. During this session we will do some exercises (no, no push-ups!!) to get our brains up and running on modeling.

What Does it Mean to be a Big Data Snooper?

What Does it Mean to be a Big Data Snooper?

It is natural for the human mind to engage in a practice called apophenia, which is when our brains instinctively find patterns that suggest meaningful relationships in various forms of collected data. This phenomenon has also been referred to as "patternicity" or "big data snooping". While this tendency can be quite beneficial in the collection and analysis of many data forms, it also has the potential to lead to invalid predictions and biased findings. Many may presume that this phenomenon only affects new hires in various data collection fields, but many renowned scientists have been shown to engage in big data snooping as well. This is why it is vital for new-hire training to put a large emphasis on how to detect big data snooping and implement ways to avoid it.

This August, the website Datanami interviewed Ryan Sullivan, the CEO of a California-based analytics company, and he spoke about how to ensure that these relationships are not formed in a false context. For instance, one of the main ways a data analyst forms a proper hypothesis is by identifying the errors found in previously established relationships in order to weed out any findings that ...

Read More on Datafloq
How Big Data can Improve your Warehouse Environment

How Big Data can Improve your Warehouse Environment

Big data is changing how the general population tracks, monitors and improves their overall performance in both their personal lives and the workplace. These strategies are implemented with iPhone apps that record user information, websites dedicated to help you avoid certain URLs while you're trying to work, and even alarms that remind you to stay on task. However, one sector of the business market is taking this love of incremental improvement to the next level: warehouse and manufacturing.

Tracking Worker Progress and Efficiency

One approach that has already shown its face in the workplace is tracking the flow of worker efficiency within the warehouse environment. As many individuals know, warehouse manufacturing relies on a quick and steady flow of work in order to achieve the shared goal of shipping products to the customer. Without a respectable amount of optimization, the company begins to lose revenue and becomes susceptible to going out of business.

Although big data can be used in various ways, such as a cubing and dimensional weight system, employers are mostly using it to monitor the peak hours of optimal work within the warehouse. For example, workflow and production lines seem to take a negative dive during the noon and ...

Read More on Datafloq
DDoS Attacks: Use Cloud-Based Protection and Apache Module

DDoS Attacks: Use Cloud-Based Protection and Apache Module

Any self-respecting webmaster or site administrator should be familiar with the different security techniques to mitigate potential Distributed Denial of Service (DDoS) attacks — a security threat that utilizes a network of compromised systems to flood a web server with traffic and deny access to legitimate users. With the high costs and opportunity losses associated with downtime, administrators need to step up and explore every option to protect the integrity of the website infrastructure.

Protecting against DDoS basically involves three methodologies, which range from free to costly, and which are applicable to web deployments of different sizes.

Mod_evasive: a DIY approach

The most basic line of defense against DDoS is the do-it-yourself approach, which involves an Apache web server module called mod_evasive. Formerly known as mod_dosevasive, mod_evasive is an add-on module for Apache designed to stop traffic-based attacks by setting request thresholds and blacklisting offending IP addresses. Mod_evasive monitors traffic using a dynamic hash table and denies access to IP addresses with suspicious activity.

Because it slots straight into an existing Apache installation, it’s a popular choice for small web deployments that don’t expect much traffic. However, it does require some tweaking and setting up in order to be effective against basic attacks. Using mod_evasive requires ...
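Conceptually, what mod_evasive does is count requests per client IP over short windows and temporarily block addresses that exceed a threshold. The Python below is a simplified, hedged illustration of that logic, not the module's actual code or configuration directives; the window, count, and blocking-period values are made up.

```python
import time
from collections import defaultdict, deque

# Simplified illustration of threshold-based blocking in the spirit of mod_evasive.
PAGE_INTERVAL = 1.0     # seconds over which requests are counted
PAGE_COUNT = 5          # max requests allowed per IP in that interval
BLOCKING_PERIOD = 10.0  # seconds an offending IP stays blocked

recent_hits = defaultdict(deque)   # ip -> timestamps of recent requests
blocked_until = {}                 # ip -> time when the block expires

def allow_request(ip, now=None):
    """Return True if the request should be served, False if the IP is blocked."""
    now = now if now is not None else time.time()
    if blocked_until.get(ip, 0) > now:
        return False
    hits = recent_hits[ip]
    hits.append(now)
    while hits and now - hits[0] > PAGE_INTERVAL:   # drop hits outside the window
        hits.popleft()
    if len(hits) > PAGE_COUNT:
        blocked_until[ip] = now + BLOCKING_PERIOD    # start the blocking period
        return False
    return True

# Seven rapid requests from one address: the last two should be rejected.
for i in range(7):
    print(i, allow_request("203.0.113.7", now=100.0 + i * 0.05))
```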

Read More on Datafloq
Beyond Prediction: Why Adaptive Analytics Matter

Beyond Prediction: Why Adaptive Analytics Matter

Predictive analytics were once considered the cutting-edge way to interact with consumers. The data sets were built on history – what a consumer purchased before, how many times a consumer visited a site before buying, and at what point in the buying journey consumers were getting stuck. Those data points offered a wealth of information to companies - and sometimes some surprisingly obvious revelations - that directly impacted how the consumer journey was approached in the future.

Predictive analytics are still a valuable driving force behind business marketing decisions but contemporary technology calls for even more. Companies should not just look at the past when planning marketing campaigns; adaptive analytics in real time should play a major role, too. Interacting with consumers during the buying process gives greater depth of connection and ensures that what a company is offering to a specific consumer is relevant.

A “GPS” for Companies

The best predictive analytics have an adaptable component that makes them even more relevant. In a piece written on, smart adaptive analytics platforms are compared to GPS systems.

“GPS leverages historical data (predictive) with real-time data (adaptive) to guide me to make better decisions. Earlier generation GPS systems would provide sub-par results due to ...

Read More on Datafloq
How a North American Railroad Used the Magic of Big Data & Cloud for Crew / Workforce management

How a North American Railroad Used the Magic of Big Data & Cloud for Crew / Workforce management

This article is sponsored by CloudMoyo - Partner of choice for solutions at the intersection of Cloud & Analytics.

One of the most gratifying things that a big data analytics firm can be part of is the transformation of an established company as it integrates new practices and insights generated by its data.  That is exactly what happened in the collaboration between CloudMoyo and the Kansas City Southern Railway Company (KCS), an organization which has been operating since 1887 across 10 central U.S states as well as northern Mexico and southern Canada.

Logistics are the lifeblood of a transportation company like KCS. It has over 13,000 freight cars and 1,044 locomotives, and its rail network comprises approximately 6,600 route miles that link commercial and industrial markets in the United States and Mexico. It runs roughly 500 trains per day, with an average of more than 800 crew members working daily across 181 interchange points with other railroads. Add to this the complexities of repairs, re-crews, allocations, scheduling, incidents, services, the movement of people and goods, vacations, and communications, and it turns out to be ...

Read More on Datafloq
Four Data-Driven Strategies to Start a Successful Blog

Four Data-Driven Strategies to Start a Successful Blog

Jeremy Schoemaker, an Internet millionaire and the founder of ShoeMoney, has long discussed the benefits of blogging. In a recent webinar and blog post, he pointed out that more millionaires have been created over the last decade from blogging than any other business model.

However, it takes a lot of work and ingenuity to thrive as a blogger. Many aspiring bloggers fail, because they don’t come up with a well-thought-out strategy and manage their time wisely. If you are starting your first blog, you should follow these strategies and also read this data-driven guide for setting up a blog.

The Importance of a Data-Driven Blogging Strategy

Do you dream of being the next millionaire blogger? You will need two things to succeed:

A strong drive and willingness to work long hours
A plan based on empirically proven strategies

Unfortunately, many promising bloggers work countless hours without getting any results, because they don’t have the right strategy in place. Here are some blogging statistics you need to keep in mind while developing your strategy.

Understand the Benefits of Daily Posts

Many blogs only post articles once or twice a week. These blogs will have a harder time generating the traffic they need to be profitable. A recent study from ...

Read More on Datafloq
BBBT to Host Webinar from Attunity on Modern Data Warehouse Automation and Optimization

BBBT to Host Webinar from Attunity on Modern Data Warehouse Automation and Optimization

This Friday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from Attunity on managing data more effectively with data replication, data warehouse automation, and data usage analytics software.

(PRWeb October 04, 2016)

Read the full story at

5 Ways Marketers are Using Data to Drive Their Content Strategies

5 Ways Marketers are Using Data to Drive Their Content Strategies

When you think about writing, data probably isn’t one of the first things that comes to mind. After all, writing is a creative pursuit that tends to be more of an art than a science. But, if you had any previous doubts that big data and analytics are the way of the future, doubt no more. Data and content marketing are merging together and the results are outstanding.

Five Ways Content Marketers are Leveraging Data

Data is becoming synonymous with words like growth and modernization – and these are two words that marketers love. Thus, they’re turning to data in overwhelming numbers. Specifically, here are some of the ways they’re using data.

1. Blog Optimization

As Neil Patel, one of today’s most successful internet marketers, likes to point out, blogging is one of the single most important aspects of any business. “A blog can serve as the foundation for just about anything you want — a personal brand, an enterprise level business, a multi-million dollar content provider or a startup,” he points out. “A blog is super important.”

Part of developing a successful blog that brings a high return is identifying who your audience is and creating a layout, voice, and content strategy that satisfies their ...

Read More on Datafloq
How to Create a Culture of Cyber Security in the Workplace

How to Create a Culture of Cyber Security in the Workplace

Cyber security is a growing concern that businesses big and small should be prioritizing within their companies as soon as possible. Data breaches are increasing every year, with hackers getting more and more sophisticated as security systems improve. Breaches are expensive, causing costs that often climb into the millions—and they can also cost companies their reputation and trust with clients and customers.

Concerned about cyber security? Aside from ensuring you have the basics of data security covered (encryption, backups, a breach response plan), you’ll also need to help your company develop a culture of cyber security in the workplace. 84% of executives believe that culture is critical to success, and having your team engaged with the security process will help keep sensitive data safe. We’ve pulled together some tips to help you achieve a culture of cyber security throughout your organization.


The first step in creating a culture of cyber security is to give due recognition to the importance of protecting the company’s private data and preventing a breach. Preventative measures such as implementing encryption and protecting company mobile devices is a good start, but a written policy for cyber security and response protocols for a breach are essential as well. This ...

Read More on Datafloq
How to Solve the “IoT is Hard” Problem

How to Solve the “IoT is Hard” Problem

Have you ever tried something that looks easy but turns out to be hard (or damn near impossible)?

Assembling an IoT solution is one of those things that appears deceptively simple. Recent evidence has suggested that IoT is a lot harder than it looks. CBInsights has reported a slowdown in IoT investments. While this could be a coincidence, it could also indicate that IoT is surprisingly difficult to pull off. 

What makes IoT hard? 

Model airplane kits come to mind when thinking about that question. A model airplane kit came in all different sizes and consisted of a kit full of plastic pieces, glue, paint, and stickers, and instructions for assembly. The task was to separate all the plastic parts from the piece holding them all together then assemble with glue and paint. 

Easy, right? Not so fast.  The trouble is, you had to be multi-talented. You needed to be able to decipher and follow confusing instructions. You needed the eyesight and patience to see, handle, and keep track of small parts. You needed the artistic skill to be able to paint, glue, and apply stickers. Lastly, you had to have a place to work (i.e., a table that would be undisturbed for days). 

Compare that with Lego. A Lego kit ...

Read More on Datafloq
Seven Types of Data Network Attacks Everyone Should Be Aware Of

Seven Types of Data Network Attacks Everyone Should Be Aware Of

A computer network is a combination of various, primarily independent electronic systems (especially computers, but also sensors, actuators, agents, and other components) that enables communication between the individual systems. The goal is the sharing of resources such as network printers, servers, media files, and databases. Also important is the ability to centrally manage network devices, network users, their permissions, and data. This is particularly important today, as it supports direct communication between network users (chat, VoIP telephony, etc.).

Data Network Attacks and its Types

According to, any given threat to a computer system, whether it is hardware, network, or software related, is termed a "security risk." These threats are inevitable in the digital world and exist simply because the system exists. They cannot be entirely avoided or easily fixed, regardless of technical advances in information technology. An important part of network security is paying close attention to the security of the computer system, because without suitable and comprehensive security measures the threat remains even if the system has not yet been the victim of an attack. There are no limits to network attacks; therefore, the security of computer systems ...

Read More on Datafloq
Should We Store This Data?

Should We Store This Data?

A data scientist usually answers the question in the title with an automatic yes: if we have some data, store it, what is there even to ask? And indeed, what is stopping us? Considering how cheap data storage has become, the question may even seem odd at first. I have seen the attached chart in several people's presentations and I often use it myself: it shows how the yearly cost of storing a single GB of data has collapsed over the past 35 years. Around the time I was born, it would have cost more than a million dollars to store what today fits on an 8 GB pendrive handed out as a promotional gift. Although I searched a lot, I did not find 2015/2016 figures, but the chart still shows that in recent years we have dropped well below 10 dollar cents, which is what 1 GB of storage capacity costs today.

[Chart: cost per gigabyte of storage over time]

What does this mean? It means that unless the volume of data is extreme, at an average large company the hourly wages of the colleagues sitting in the meeting about storing the data probably far exceed the total cost of the storage itself.

Still, I have to defend the organizations that do not store every piece of data they have ever come across or ever possessed. Whether the data should be stored at all is only one consideration; there are a few others as well:

  • If we store it, who will be authorized to read this data? In a large-enterprise environment, deciding that is far less trivial than saving the data itself.
  • If we do store the data, how securely do we have to do it? Does it contain sensitive data whose protection requires extra cost?
  • Deep inside a dataset there is often customer data that must be deleted because of legal obligations, so data kept carelessly can end up causing more headaches than we originally expected.

It is clear, then, that the "store everything" rule mostly brings challenges for a traditional giant corporation. I have only seen the principle adopted in places where (1) the handling of customers' personal data was simplified, (2) every employee could freely access all data (with the exception of financial data), meaning the data-access permission rules were very simple, and (3) a flexibly scalable storage infrastructure (e.g., AWS) was typically available. Adding up these three conditions, it is easy to guess that only tech companies coming from the innovative, startup world fall into this circle.

Yet in reality, creating the conditions for the "store everything" principle is worth doing at every company. In more and more industries, the opportunities hidden in data are one of the most important prerequisites of long-term competitiveness, and exploiting data, not surprisingly, tends to require data that has already been stored.

A note on prices: if I open my Google Drive account, a single click lets me push my available storage capacity up to 10 TB (10 * 1000 GB). In return I get fairly secure access and storage without data loss, with my data kept in three copies at once in data centers at least 300 km apart, so even the unlucky arrival of a small asteroid would hardly endanger the family photos. For all this, they would currently charge me $100 a month.

That is, storing 1 GB for a year would cost $0.12 (barely 35 HUF).
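To make the arithmetic explicit, here is a minimal sketch of that calculation, assuming the Google Drive plan quoted above (10 TB for $100 a month) and a rough 2016 exchange rate of 280 HUF per USD; the exchange rate is an assumption, not a figure from the original post.

# Back-of-the-envelope check of the per-gigabyte storage cost cited above.
# Assumes the quoted Google Drive plan: 10 TB (10 * 1000 GB) for $100 per month.
capacity_gb = 10 * 1000          # 10 TB expressed in GB
monthly_cost_usd = 100.0         # quoted monthly price
huf_per_usd = 280.0              # rough 2016 exchange rate (assumption)

yearly_cost_usd = monthly_cost_usd * 12
cost_per_gb_year_usd = yearly_cost_usd / capacity_gb
cost_per_gb_year_huf = cost_per_gb_year_usd * huf_per_usd

print(f"Storing 1 GB for a year: ${cost_per_gb_year_usd:.2f} (~{cost_per_gb_year_huf:.0f} HUF)")
# -> Storing 1 GB for a year: $0.12 (~34 HUF)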

Do you also have a lot of data, are you already working on exploiting it, but would it help to have someone look at it with fresh eyes, or would you like to talk through other ways it could be used? Feel free to write to us; we are happy to think together with anyone interested in innovative big data and data science solutions. Address:


Top 3 Tips for Staying Current in the Evolving World of Data Warehousing

Top 3 Tips for Staying Current in the Evolving World of Data Warehousing

The world of data is changing and changing fast. How do you keep up? Here are my Top 3 tips.
Hyper Personalised Customer Interactions in Banking

Hyper Personalised Customer Interactions in Banking

In today’s tight world economy, financial institutions under stress, including banks, are looking for fresh solutions to help attract new customers, retain profitable clients, and ultimately boost the bottom line.

Banking on Big Data analytics is the way forward. The lines between the real world and digital finance are fast blurring. Increasingly, financial institutions and banking executives around the globe are asking how analytics can help optimize the bank-client relationship. The answer lies in combining the customer information base, technology, and analytical methods.

Take, for example, what US-based Novantas recently did for one of its clients, a well-known US bank. This global leader in analytic advisory services and technology solutions for financial institutions implemented a solution for the bank, which has some 10 million customers, enabling it to run a far more profitable business than ever before by using persona-based pricing per customer.

Solution highlights:

Traditionally, banks have offered "honeymoon rates" to attract new deposits in the hope that customers will keep their money with the bank for many years to come. The bet is that the short-term loss will be recouped because customer inertia keeps those funds from moving to another bank once the honeymoon period is over. Almost ...

Read More on Datafloq
Enterprise Storage: Still A Land Of Opportunity For Those With The Right Vision

Enterprise Storage: Still A Land Of Opportunity For Those With The Right Vision

Those who have traditionally predicted meteoric data growth haven’t backed off. IoT is only the latest generator of data volume, and more will come from technologies now appearing on the horizon, such as blockchain. So there are now some interesting startups looking to capitalize on hot storage trends.
Tech Tip: Quick Start for Getting Your Data into Snowflake

Tech Tip: Quick Start for Getting Your Data into Snowflake

Want to learn how to load data into a cloud data warehouse? This post will show you one way, with Snowflake.
Six Tips to become a Data Driven Business

Six Tips to become a Data Driven Business

Data is everywhere. All businesses have access to it. Yet, turning data into actionable insights and becoming a data-driven business can be difficult to achieve.

What does data-driven mean?

A data-driven business utilises data to inform every business decision it makes. By analysing and evaluating relevant data, it can draw conclusions and predict trends. Data-driven businesses ensure their company culture evolves to encourage innovation and agility. It is imperative for all employees within a data-driven business to collect, analyse and learn from data on a regular basis in order to consistently drive results and improve their skill sets.

Why is it important?

By analysing data and creating actionable insights you are able to implement effective business strategies. Doing so can help you increase your competitive advantage, guide product and/or service innovation, increase margins, minimise waste, improve customer service and retain employees.

It is much more reliable to make business decisions based on evidence rather than testing an idea based on an assumption. Data can highlight and predict trends that benefit your business, along with potential issues, thus allowing you to react quickly to ensure minimal damage.

But how does a business become data driven?

Identify your business goals

Those with usable data ...

Read More on Datafloq
How to Build Impressive Dashboards for Data Management

How to Build Impressive Dashboards for Data Management

Advancements in analysis technology have given companies the ability to measure almost anything. With so much data available, it becomes increasingly important for managers to determine which metrics meet specific needs, so as to avoid drowning in an ocean of ultimately irrelevant information. The most productive way to present this information is through business dashboards, which makes choosing the most appropriate dashboard example and template an important decision.

A dashboard is a web interface that organizes and presents information in a clear, concise manner. Business dashboards are data management tools that provide visibility into key performance indicators, or KPIs, through visually engaging and understandable graphics, bringing clarity to data.

Dashboard Examples and Development

According to Harold Kerzner, Senior Executive Director for Project Management at the International Institute for Learning, dashboards can be operational or analytical, although several dashboard examples and templates provide for both. An operational dashboard monitors performance, while an analytical dashboard focuses on data trends over time. In his book, Project Management Metrics, KPIs, and Dashboards: A Guide to Measuring and Monitoring Project Performance, Kerzner discusses the benefits of dashboards, the rules for dashboards and the best dashboard practices. He also provides a number of dashboard examples and dashboard templates.
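To make the operational-versus-analytical distinction concrete, here is a minimal pandas sketch; the project metrics, column names, and figures are hypothetical and are not taken from Kerzner's book.

import pandas as pd

# Hypothetical project metrics; columns and values are illustrative only.
df = pd.DataFrame({
    "month":         ["2016-07", "2016-08", "2016-09", "2016-09"],
    "project":       ["Alpha",   "Alpha",   "Alpha",   "Beta"],
    "tasks_done":    [12,        18,        25,        7],
    "tasks_planned": [15,        20,        24,        10],
})

# Operational view: how are we doing right now?
latest = df[df["month"] == df["month"].max()]
operational = latest.assign(
    completion_rate=(latest["tasks_done"] / latest["tasks_planned"]).round(2)
)[["project", "completion_rate"]]

# Analytical view: how has performance trended over time?
monthly = df.groupby("month")[["tasks_done", "tasks_planned"]].sum()
monthly["completion_rate"] = (monthly["tasks_done"] / monthly["tasks_planned"]).round(2)

print(operational)
print(monthly[["completion_rate"]])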

Dashboard examples can include several ...

Read More on Datafloq
Data Centers Should Be Wary of This Danger

Data Centers Should Be Wary of This Danger

Security is one of the most important things to consider when setting up a data center. Are you able to protect your data from hackers, break-ins, and other disasters, natural or otherwise? A properly prepared data center should be able to stand up to all of these things and more, providing a safe and secure location for people to store their precious data.

Unfortunately, one of the most dangerous things in a data center is also the most often overlooked danger. What is this hazard that all data centers should be wary of?

It’s the same thing that keeps the lights on and the servers running – electricity.

The Dangers of Electricity

Everyone knows that when you’re setting up your electronics in your home, you should plug them into a surge protector before plugging them into the wall socket.  This is designed to shield your expensive electronics from power surges caused by lightning strikes or other problems.

In your home, it’s not just your computers and cell phones that you need to worry about. Refrigerators, ovens, washing machines, dryers, light switches, and ground fault circuit interrupter sockets can all be damaged or destroyed.

Even if it only lasts for a couple millionths of a second, a ...

Read More on Datafloq
How JetBlue turns mobile applications quality assurance into improved user experience wins

How JetBlue turns mobile applications quality assurance into improved user experience wins

The next BriefingsDirect Voice of the Customer performance engineering case study discussion examines how JetBlue Airways in New York uses virtual environments to reduce software development costs, centralize performance testing, and create a climate for continuous integration and real-time monitoring of mobile applications.

We'll now hear how JetBlue cultivated a DevOps model by including advanced performance feedback in the continuous integration process to enable greater customer and workforce productivity.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how efficient performance engineering has reduced testing, hardware and maintenance costs by as much as 60 percent, we're joined by Mohammed Mahmud, the Senior Software Performance Engineer at JetBlue Airways in New York. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Why is mobile user experience so very important to your ability to serve customers well these days?

Mahmud: It's really very important for us to give the customer an option to check in, book flights, manage bookings, check flight status, and more. On flights, they have an option to watch TV, listen to music, and buy things using mobile devices as well. But on board, they have to use Fly-Fi [wireless networks]. This is one of the most important business drivers for JetBlue Airways.

Gardner: What sort of climate or environment have you had to put together in order to make sure that those mobile apps really work well, and that your brand and reputation don’t suffer?
Mahmud: I believe a real-time monitoring solution is the key to success. We use HPE Business Service Management (BSM), integrated with third-party applications for monitoring purposes. We created some synthetic transactions and put them out there on a real device to see ... how it impacts performance. If there are any issues with that, we can fix it before it happens in the production environment.

Also, we have a real-time monitoring solution in place. This solution uses real devices to get the real user experience and to identify potential performance bottlenecks in a live production environment. If anything goes wrong there, we can get alerts from the production environment, and we can mitigate that issue right away.
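The two ideas described here, synthetic transactions and production alerting, can be sketched in a few lines. This is not HPE BSM; it is a minimal, generic illustration, and the URL, latency threshold, and alerting hook are hypothetical.

import time
import requests  # third-party; pip install requests

CHECK_URL = "https://example.com/mobile/checkin"   # hypothetical endpoint to exercise
LATENCY_THRESHOLD_S = 2.0                          # alert if the flow takes longer than this
CHECK_INTERVAL_S = 60                              # run the synthetic transaction once a minute

def run_synthetic_transaction():
    """Exercise one user flow and return (ok, elapsed_seconds)."""
    start = time.monotonic()
    try:
        resp = requests.get(CHECK_URL, timeout=10)
        ok = resp.status_code == 200
    except requests.RequestException:
        ok = False
    return ok, time.monotonic() - start

def alert(message):
    # Placeholder: a real setup would page on-call or feed a monitoring dashboard.
    print(f"ALERT: {message}")

if __name__ == "__main__":
    while True:
        ok, elapsed = run_synthetic_transaction()
        if not ok:
            alert(f"synthetic check-in failed after {elapsed:.1f}s")
        elif elapsed > LATENCY_THRESHOLD_S:
            alert(f"synthetic check-in slow: {elapsed:.1f}s (threshold {LATENCY_THRESHOLD_S}s)")
        time.sleep(CHECK_INTERVAL_S)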

DevOps benefits

Gardner: How have you been able to connect the development process to the operational environment?

Mahmud: My area is strictly performance engineering, but we're in the process of putting the performance effort into our DevOps model. We're going to be part of the continuous integration (CI) process, so we can take part in the development process and give performance feedback early in the development phase.

In this model, an application module upgrade is kicking off the functional test cases and giving feedback to the developers. Our plan is to take part of that CI process and include the performance test cases to provide performance feedback in the very early stage of the development process.
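A performance gate in continuous integration, as described here, can be as simple as a script that reads the load-test results and fails the build when a budget is breached. The sketch below is a generic illustration, not JetBlue's pipeline; the results-file format and the per-transaction budgets are assumptions.

"""Minimal CI performance gate: fail the build if load-test results breach thresholds.

Assumes a results.json file produced earlier in the CI job, with entries like
{"transaction": "search_flights", "p95_ms": 850}. Names and budgets are hypothetical.
"""
import json
import sys

THRESHOLDS_MS = {
    "search_flights": 1000,
    "check_in": 800,
}

def main(results_path="results.json"):
    with open(results_path) as f:
        results = json.load(f)

    failures = []
    for entry in results:
        budget = THRESHOLDS_MS.get(entry["transaction"])
        if budget is not None and entry["p95_ms"] > budget:
            failures.append(f'{entry["transaction"]}: p95 {entry["p95_ms"]}ms > {budget}ms')

    if failures:
        print("Performance gate failed:")
        for line in failures:
            print("  " + line)
        sys.exit(1)   # non-zero exit marks the CI stage as failed
    print("Performance gate passed.")

if __name__ == "__main__":
    main(*sys.argv[1:])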

Gardner: How often are you updating these apps? Are you doing it monthly, quarterly, more frequently?

Mahmud: Most of them on a two- or three-week basis.

Gardner: How are you managing the virtual environment to create as close to the operating environment as you can? How do the virtualized services and networks benefit you?

Mahmud: We're maintaining a complete virtualized environment for our performance testing and performance engineering. Before our developers create any kind of a service, or put it out there, they do mock-ups using third-party applications. The virtual environment they're creating is similar to the production environment, so that when it’s being deployed out there in the actual environment, it works efficiently and perfectly without any issue.
Our developers recently started using the service virtualization technology. Also, we use network virtualization technology to measure the latency for various geographical locations.

Gardner: How has performance engineering changed over the past few years? We've seen a lot of changes in general, in development, mobile of course, DevOps, and the need for more rapid development. But, how have you seen it shift in the past few years in performance engineering?

Mahmud: When I came to JetBlue Airways, LoadRunner was the only product they had. The performance team was responsible for evaluating application performance by running a performance test and delivering results that identified pass or fail against the requirements provided. It was strictly performance testing.

The statistics they used to provide were pretty straightforward, maybe some transaction response times and some server statistics, but no other analysis or detailed information. Now, it’s more than that. We don’t just test the application and determine pass or fail. We analyze the logs, traffic flow, and user behavior in order to design an effective test. This is more performance engineering than performance testing.

Early in the cycle

We're getting engaged early in the development cycle to provide performance feedback. We're doing the performance testing, providing the response time in cases where multiple users are using that application or that module, finding out how this is going to impact the performance, and finding bottlenecks before it goes to the integration point.

So, it’s more of coming to the developers' table, sitting together, and figuring out any performance issue.

Gardner: Understanding the trajectory forward, it seems that we're going to be doing more with microservices, APIs, more points of contact, generating more data, trying to bring the analysis of that data back into the application. Where do you see it going now that you've told us where it has come from? What will be some of the next benefits that performance engineering can bring to the overall development process?

Mahmud: Well, as I mentioned earlier, we're planning to be part of the continuous integration; our goal is to become engaged earlier in the development process. That's sitting together with the developers on a one-to-one basis to see what they need to make sure that we have performance-efficient applications in our environment for our customers. Again, this is all about getting involved in the earlier stages. That's number one.

Number two, we're trying to mitigate any kind of volume-related issue. Sometimes, we have yearly sales. We don’t know when that's going to happen, but when it happens, it’s an enormous pressure on the system. It's a big thing, and we need to make sure we're prepared for that kind of traffic on our site.

Our applications are mostly our web and JetBlue mobile applications. They're really crucial for us and for our business. We're trying to become engaged in the early stages and be part of the development process as well.

Gardner: Of course it’s important to be able to demonstrate value. Do you have any metrics of success or you can point to ways in which getting in early, getting in deep, has been of significant value? How do you measure your effectiveness?

Mahmud: We did an assessment in our production environment to see how much it would cost us if it went down for an hour. I'm not authorized to discuss any numbers, but I can tell you that it was in the millions of dollars.
So, before it goes to production with any kind of performance-related issue, we make sure that we're solving it before it happens. Right there, we're saving millions of dollars. That’s the value we are adding.

Gardner: Of course more and more people identify the mobile app with the company. This is how they interact; it becomes the brand. So, it's super important for that as well.

Adding value

Mahmud: Of course, and I can add another thing. When I joined JetBlue three years ago, we were at the bottom of the industry benchmark list. Now, we're within the top five. So, we're adding value to our organization.

Gardner: It pays to get it done right the first time and get it early, almost in any activity these days.

What comes next? Where would you like to extend continuous integration processes, to more types of applications, developing more services? Where do you take the success and extend it?

Mahmud: Right now, we're most engaged with our web and mobile applications. Other teams are interested in doing performance testing for their systems as well, so we're getting engaged with the SAP, DB, HR, and payroll teams too. We're getting engaged more day by day. It’s getting bigger every day.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Data-Driven Strategies to Optimize Your Customer Experiences and Increase Conversions

Data-Driven Strategies to Optimize Your Customer Experiences and Increase Conversions

Data-driven marketing, once a novel concept, has become the critical foundation for marketing success. Marketers no longer have the luxury of viewing data as an extra ingredient to integrate into marketing strategies.  Data is now the fuel that drives the entire marketing engine. Data tells us who our best customers and prospects are, what messaging they will be most apt to respond to, who is indicating in-market purchase intent, which channels they prefer, and overall, how to drive conversions and increase revenue.

In a recent survey by Econsultancy and Adobe, marketers indicated their top goals for using data in marketing. According to the research, the top three goals among company respondents included:

Better customer experiences – 38%
Identifying new audiences & customers – 37%
Retaining and upselling existing customers – 37%

However, many marketers continue to be challenged in achieving a single customer view in order to meet these objectives. Data must be integrated, cleansed, and enhanced in order to fully understand a customer’s behaviors and, in turn, personalize customer experiences to increase conversion rates. Only 20% of the company marketers surveyed said that they have created an actionable single customer view. According to the research:

Among those who have achieved a single customer ...

Read More on Datafloq
Data, Data, Data (How and Why?)

Data, Data, Data (How and Why?)

As organizations of today grow and evolve, so too do their data volumes. Instead of throwing away old information, structured and unstructured data alike are funneled into enterprise data hubs, but often without the essential tool for keeping this data in order: metadata.

The struggle to manage information within the big data landscape is as complex as the digital workflows it supports. This landscape includes the internal ecosystem and the wider geography of partners and third-party entities. The complexity of all of the available data is compounded with the increasing rate of production and diversity of formats.

Data assets are critical to your business operations -- they need to be discovered at all points of the digital lifecycle. Key to building trust in your data is ensuring its accuracy and usability. Leveraging meaningful metadata provides your best chance for a return on investment on the data assets created and becomes an essential line of defense against lost opportunities. Your users' digital experience is based on their ability to identify, discover and experience your brand in the way it was intended. Value is not found -- it's made -- so make the data meaningful to you, your users and your organization by managing it ...

Read More on Datafloq
BBBT to Host Webinar from Dundas Data Visualization on Building Big Data Embedded Solutions with Smarter Visualizations

BBBT to Host Webinar from Dundas Data Visualization on Building Big Data Embedded Solutions with Smarter Visualizations

This Friday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from Dundas Data Visualization on how Dundas BI can be seamlessly embedded into any application.

(PRWeb September 27, 2016)

Read the full story at

How to Offload the ETL Bottleneck with Hadoop

How to Offload the ETL Bottleneck with Hadoop

Where’s my Data?   

There is a battle today inside IT to meet Enterprise Data Warehouse (EDW) overnight data load commitments with ETL solutions. Traditional ETL solutions have struggled to keep pace with the tsunami of data now being generated. Fortunately, there is a solution in an ecosystem of software called Hadoop. Before I lay out the problem, I want to give you a reason to care.

Hadoop’s benefits:

Process more data in a smaller batch window to meet ETL and EDW commitments
Reduce EDW storage costs
Eliminate the need to purge EDW data or archive to tape
Improve the query performance of the EDW
Free up enterprise grade servers for business intelligence and analytics use

Financial Justification

IT leaders are justifying a Hadoop deployment on the savings from reduced EDW storage fees and extra server capacity. The real benefit comes from their ability to meet or exceed ETL commitments. 

What is ETL?

ETL stands for Extract, Transform, and Load.   There is a high demand inside every business for enterprise data to facilitate decision making and report on recent performance (see Take Action ....and Empowering Business Leaders …). Transactional data produced at the point of sale or within production facilities exists in many different systems, types, structures, and generally in a highly normalized form. Thus, this data in its original state is ...
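As a concrete illustration of offloading the transform step to a Hadoop cluster, here is a minimal PySpark sketch. It is a generic example under stated assumptions, not the author's implementation: the HDFS paths, file format, and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A minimal sketch of offloading the "T" in ETL to a Hadoop/Spark cluster.
spark = SparkSession.builder.appName("etl-offload-sketch").getOrCreate()

# Extract: raw point-of-sale records landed on HDFS by an upstream ingest job.
raw = spark.read.option("header", "true").csv("hdfs:///landing/pos_transactions/")

# Transform: denormalize and aggregate on the cluster instead of inside the EDW.
daily_sales = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("store_id", "sale_date")
       .agg(F.sum("amount").alias("daily_revenue"),
            F.count("*").alias("transaction_count"))
)

# Load: write a compact, query-ready extract for the data warehouse to pick up.
daily_sales.write.mode("overwrite").parquet("hdfs:///curated/daily_sales/")
spark.stop()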

Read More on Datafloq
Seven secrets to highly effective procurement: How business networks fuel innovation and transformation

Seven secrets to highly effective procurement: How business networks fuel innovation and transformation

The next BriefingsDirect innovation discussion focuses on how technology, data analysis, and digital networks are transforming procurement and the source-to-pay process as we know it. We’ll also discuss what it takes to do procurement well in this new era of business networks. 

Far beyond just automating tasks and transactions, procurement today is a strategic function that demands an integrated, end-to-end approach built on deep insights and intelligence to drive informed source-to-pay decisions and actions that enable businesses to adopt a true business ecosystem-wide digital strategy.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

And according to the findings of a benchmarking survey conducted by SAP Ariba, there are seven essential traits of modern procurement organizations that are driving this innovation and business transformation.

To learn more about the survey results on procurement best practices, please join me in welcoming Kay Ree Lee, Director of Value Realization at SAP. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Procurement seems more complex than ever. Supply chains now stretch around the globe, regulation is on the rise, and risk is heightened on many fronts in terms of supply chain integrity.

Innovative companies, however, have figured out how to overcome these challenges, and so, at the value realization group you have uncovered some of these best practices through your annual benchmarking survey. Tell us about this survey and what you found.

Lee: We have an annual benchmarking program that covers purchasing operations, payables, sourcing, contract management, and working capital. What's unique about it, Dana, is that it combines a traditional survey with data from our procurement applications and business network.

This past year, we looked at more than 200 customers who participated, covering more than $350 billion in spend. We analyzed their quantitative and qualitative responses and identified the intersection between those responses for top performers compared to average performers. Then, we drew correlations between what top performers did well and the practices that drove those achievements.
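As a rough illustration of that kind of analysis, the sketch below splits respondents into top and average performers on one outcome metric and compares how often each group follows a given practice. The data, column names, and the quartile cutoff are all hypothetical and are not SAP Ariba's methodology.

import pandas as pd

# Hypothetical benchmarking responses: an outcome metric plus one practice flag.
df = pd.DataFrame({
    "respondent":       ["A", "B", "C", "D", "E", "F"],
    "sourcing_savings": [0.090, 0.081, 0.050, 0.045, 0.040, 0.030],  # savings as share of sourced spend
    "single_platform":  [True, True, False, True, False, False],     # practice: integrated platform
})

# Call the top quartile on savings the "top performers".
cutoff = df["sourcing_savings"].quantile(0.75)
df["segment"] = df["sourcing_savings"].apply(lambda s: "top" if s >= cutoff else "average")

# Compare the outcome and the practice adoption rate across the two segments.
summary = df.groupby("segment").agg(
    avg_savings=("sourcing_savings", "mean"),
    platform_adoption=("single_platform", "mean"),
)
print(summary)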

Gardner: By making that intersection, it’s an example of the power of business networks, because you're able to gather intelligence from your business network environment or ecosystem and then apply a survey back into that. It seems to me that there is a whole greater than the sum of the parts between what the Ariba Network can do and what market intelligence is demanding.

Universe of insights

Lee: That’s right. The data from the applications in the Ariba Network contain a universe of insights, intelligence, and transactional data that we've amassed over the last 20-plus years. By looking at the data, we've found that there are specific patterns and trends that can help a lot of companies improve their procurement performance -- either by processing transactions with fewer errors or processing them faster. They can source more effectively by collaborating with more suppliers, having suppliers bid on more events, and working collaboratively with suppliers.

Gardner: And across these 200 companies, you mentioned $350 billion of spend. Do you have any sense of what kind of companies these are, or do they cross a variety of different types of companies in different places doing different vertical industry activities?

Lee: They're actually cross-industry. We have a lot of companies in the services industry and in the manufacturing industry as well.

Gardner: This sounds like a unique, powerful dataset, indicative of what's going on not just in one or two places, but across industries. Before we dig into the detail, let’s look at the big picture, a 100,000-foot view. What would you say are some the major high-level takeaways that define best-in-class procurement and organizations that can produce it these days based on your data?

Lee: There are four key takeaways that define what best-in-class procurement organizations do.

The first one is that a lot of these best-in-class organizations, when they look at source-to-pay or procure-to-pay, manage it as an end-to-end process. They don't just look at a set of discrete tasks; they look at it as a big, broad picture. More often than not, they have an assigned process expert or a process owner that's accountable for the entire end-to-end process. That's key takeaway number one.

Key takeaway number two is that a lot of these best-in-class organizations also have an integrated platform from which they manage all of their spend. And through this platform, procurement organizations provide their internal stakeholders with flexibility, based on what they're trying to purchase.

For example, a company may need to keep track of items that are critical to manufacturing, with inventory visibility and tracking. That's one requirement.

Another requirement is purchasing manufacturing or machine parts that are not stocked, which can be bought through supplier catalogs with pre-negotiated part descriptions and item pricing.
Gardner: Are you saying that this same platform can be used in these companies across all the different types of procurement and source-to-pay activities -- internal services, even indirect, perhaps across different parts of a large company? That could be manufacturing or transportation? Is it the common platform common for all types of purchasing?

Common platform

Lee: That's right. One common platform for different permutations of what you're trying to buy. This is important.

The third key takeaway was that best-in-class organizations leverage technology to fuel greater collaboration. They don't just automate tasks. One example of this is by providing self-service options.

Perhaps a lot of companies think that self-service options are dangerous, because you're letting the person who is requesting items select on their own, and they could make mistakes. But the way to think about a self-service option is that it's providing an alternative for stakeholders to buy and to have a guided buying experience that is both simple and compliant and that's available 24/7.

You don't need someone there supervising them. They can go on the platform and they can pick the items, because they know the items best -- and they can do this around the clock. That's another way of offering flexibility and fueling greater collaboration and ultimately, adoption.

Gardner: We have technologies like mobile these days that allow that democratization of involvement. That sounds like a powerful approach.

Lee: It is. And it ties to the fourth key takeaway, which is that best-in-class organizations connect to networks. Networks have become very prevalent these days, but best-in-class companies connect to networks to assess intelligence, not just transact. They go out to the network, they collaborate, and they get intelligence. A network really offers scale that organizations would otherwise have to achieve by developing multiple point-to-point connections for transacting across thousands of different suppliers.

You now go on a network and you have access to thousands of suppliers. Years ago, you would have had to develop point-to-point connectivity, which costs money, takes a long time, and you have to test all those connections, etc.

Gardner: I'm old enough to remember Metcalfe's Law, which roughly says that the more participants in a network, the more valuable that network becomes, and I think that's probably the case here. Is there any indication from your data and research that the size and breadth and depth of the business network value works in this same fashion?

Lee: Absolutely. Those three words are key. The size -- you want a lot of suppliers transacting on there. And then the breadth -- you want your network to contain global suppliers, so some suppliers that can transact in remote parts of the world, even Nigeria or Angola.

Then, the depth of the network -- the types of suppliers that transact on there. You want to have suppliers that can transact across a plethora of different spend categories -- suppliers that offer services, suppliers that offer parts, and suppliers that offer more mundane items.

But you hit the nail on the head with the size and breadth of the network.

Pretty straightforward

Gardner: So for industry analysts like myself, these seem pretty straightforward. I see where procurement and business networks are going, I can certainly agree that these are major and important points.

But I wonder, because we're in such a dynamic world and because companies -- at least in many of the procurement organizations -- are still catching up in technology, how are these findings different than if you had done the survey four or five years ago? What's been a big shift in terms of how this journey is progressing for these large and important companies?

Lee: I don't think that there's been a big shift. Over the last two to five years, perhaps priorities have changed, and there are some patterns that we see in the data. For example, within sourcing, while sourcing savings fluctuate up and down, sourcing continues to be very important to a lot of organizations for delivering cost savings.

The data tells us organizations need to be agile and they need to continue to do more with less.

One of the key takeaways from this is that the cost structure of procurement organizations has come down. They have fewer people operating certain processes, and that means it costs organizations less to operate those processes, because they're leveraging technology even more. They're also able to deliver higher savings, because they're including more and different suppliers as they go to market for certain spend categories.

That's where we're seeing difference. It's not really a shift, but there are some patterns in the data.

Gardner: It seems to me, too, that because we're adding more data and insight through that technology, we can elevate procurement more prominently into the category of spend management. That allows companies to make decisions across entire industries, or across the entire company, based on these insights and best practices, and they can save a lot more money.

But then, it seems to me that that elevates procurement to a strategic level, not just a way to save money or to reduce costs, but to actually enable processes and agility, as you pointed out, that haven't been done before.

Before we go to the traits themselves, is there a sense that your findings illustrate this movement of procurement to a more strategic role?

Front and center

Lee: Absolutely. That's another one of the key traits that we have found from the study. Top performing organizations do not view procurement as a back-office function. Procurement is front and center. It plays a strategic role within the organization to manage the organization’s spend.

When you talk about managing spend, you could talk about it at the surface level. But we have a lot of organizations that manage spend to a depth that includes performing strategic supplier relationship management, supplier risk management, and deep spend analysis. The ability to manage at this depth distinguishes top performers from average performers.

Gardner: As we know, Kay Ree, many people most trust their cohorts, people in other companies doing the same function they are, for business acumen. So this information is great, because we're learning from the people that are doing it in the field and doing it well. What are some of the other traits that you uncovered in your research?

Lee: Let me go back to the first trait. The first one that we saw that drove top performing organizations was that top performers play a strategic role within the organization. They manage more spend and they manage that spend at a deep level.

One of the stats that I will share is that top performers see a 36 percent higher spend under management, compared to the average organization. And they do this by playing a strategic role in the organization. They're not just processing transactions. They have a seat at the leadership table. They're a part of the business in making decisions. They're part of the planning, budgeting, and financial process.

They also ensure that they're working collaboratively with their stakeholders to ensure that procurement is viewed as a trusted business adviser, not an administrator or a gatekeeper. That’s really the first trait that we saw that distinguishes top performers.

The second one is that top performers have an integrated platform for all procurement spend, and they conduct regular stakeholder spend reviews -- resulting in higher sourcing savings.

And this is key. They conduct quarterly – or even more frequent -- meetings with the businesses to review their spend. These reviews serve different purposes. They provide a forum for discussing various sourcing opportunities.

Imagine going to the business unit to talk to them about their spend from the previous year. "Here is who you have spent money with. What is your plan for the upcoming year? What spend categories can we help you source? What's your priority for the upcoming year? Are there any capital projects that we can help out with?"

Sourcing opportunities

It's understanding the business and stakeholder requirements that helps procurement identify additional sourcing opportunities. Procurement then has to be proactive in collaborating with stakeholders and ensuring that it is responsive and agile to their requirements. That's the second finding that we saw from the survey.

The third one is that top performers manage procure-to-pay as an end-to-end process with a single point of accountability, and this really drives higher purchase order (PO) and invoicing efficiency. This one is quite straightforward. Our quantitative and qualitative research tells us that having a single point of accountability drives a higher transactional efficiency.

Gardner: I can speak to that personally. In too many instances, I work with companies where one hand doesn’t know what the other is doing, and there is finger pointing. Any kind of exception management becomes bogged down, because there isn’t that point of accountability. I think that’s super important.

Lee: We see that as well. Top performers operationalize savings after they have sourced spend categories and captured negotiated savings. The question then becomes how do they operationalize negotiated savings so that it becomes actual savings? The way top performers approach it is that they manage compliance for those sourced categories by creating fit-for-purpose strategies for purchase. So, they drive more spend toward contract and electronic catalogs through a guided buying experience.

You do that by having available to your stakeholders contracts and catalogs that would guide them to the negotiated pricing, so that they don't have to enter pricing, which would then dilute your savings. Top performers also look at working capital, and they look at it closely, with the ability to analyze historical payment trends and then optimize payment instruments resulting in higher discounts.

Sometimes, working capital is not as important to procurement because it's left to the accounts payable (AP) function, but top performers or top performing procurement organizations look at it holistically; as another lever that they manage within the sourcing and procure-to-pay process.

So, it's another negotiation point when they are sourcing, to take advantage of opportunities to standardize payment terms, take discounts when they need to, and also look at historical data and really have a strategy, and variations of the strategy, for how we're going to pay strategic suppliers. What’s the payment term for standard suppliers, when do we pay on terms versus discounts, and then when do we pay on a P-Card? They look at working capital holistically as part of their entire procurement process.

Gardner: It really shows where being agile and intelligent can have major benefits in terms of your ability to time and enforce delivery of goods and services -- and also get the best price in the market. That’s very cool.

Lee: And having all of that information and having the ability to transact efficiently is key. Let’s say you have all the information, but you can't transact efficiently. You're slow to make invoice payments, as an example. Then, while you have a strategy and approach, you can’t even make a change there (related to working capital). So, it's important to be able to do both, so that you have the options and the flexibility to be able to operationalize that strategy.

Top performers leverage technology and provide self-service to enable around-the-clock business. This really helps organizations drive down cycle time for PO processing.

Within the oil and gas sector, for example, it's critical for organizations to get the items out to the field, because if they don't, they may jeopardize operations on a large scale. Offering the ability to perform self-service and to enable that 24x7 gives organizations flexibility and offers the users the ability to maneuver themselves around the system quite easily. Systems nowadays are quite user-friendly. Let the users do their work, trust them in doing their work, so that they can purchase the items they need to, when they want to.

User experience

Gardner: Kay Ree, this really points out the importance of the user experience, and not just your end-user customers, but your internal employee users and how younger folks, millennials in particular, expect that self-service capability.

Lee: That’s right. Purchasing shouldn't be any different. We should follow the lead of other industries and other mobile apps and allow users to do self-service. If you want to buy something, you go out there, you pick the item, the pricing is out there, it’s negotiated pricing, so you pick the item, and then let’s go.

Gardner: That’s enabling a lot of productivity. That’s great. Okay, last one.

Lee: The last one is that top performers leverage technology to automate PO and invoice processing to increase administrative efficiency. What we see is best-in-class organizations leverage technology with various features and functionalities within the technology itself to increase administrative efficiency.

An example of this could be the ability to collaborate with suppliers on the requisitioning process. Perhaps you're doing three bids and a buy, and during that process no one is picking up the phone anymore. You list your requirements for what you're trying to buy, send them out automatically to three suppliers, the suppliers provide responses back, you pick a response, and the system converts the requirements into a PO.

So that flexibility by leveraging technology is key.

Gardner: Of course, we expect to get even more technology involved with business processes. We hear things about the Internet of Things (IoT), more data, more measurement, more scientific data analysis being applied to what may have been more gut instinct types of business decision making, now it’s more empirical. So I think we should expect to see even more technology being brought to bear on many of these processes in the next several years. So that’s kind of important to see elevated to a top trait.

All right, what I really like about this, Kay Ree, is this information is not just from an academic or maybe a theory or prediction, but this is what organizations are actually doing. Do we have any way of demonstrating what you get in return? If these are best practices as the marketplace defines them, what is the marketplace seeing when they adopt these principles? What do they get for this innovation? Brass tacks, money, productivity and benefits -- what are the real paybacks?

Lee: I'll share stats for top performers. Top performers are able to achieve about 7.8 percent in savings per year as a percent of source spend. That’s a key monetary benefit that most organizations look to. It’s 7.8 percent in savings.

Gardner: And 7.8 percent to someone who's not familiar with what we're talking about might not seem large, but this is a huge amount of money for many companies.

Lee: That's right. Per billion dollars, that’s $78 million.

Efficient processing

They also manage more than 80 percent of their spend and they manage this spend to a greater depth by having the right tools to do it -- processing transactions efficiently, managing contracts, and managing compliance. And they have data that lets them run deeper spend analysis. That’s a key business benefit for organizations that are looking to transact over the network, looking to leverage more technology.

Top performers also transact and collaborate electronically with suppliers to achieve a 99 percent-plus electronic PO rate. Best-in-class organizations don't even attach a PDF to an email anymore. They create a requisition, it gets approved, it becomes a PO, and it is automatically sent to a supplier. No one is involved in it. So the entire process becomes touch-less.

Gardner: These traits promote that automation that then leads to better data, which allows for better process. And so on. It really is a virtuous cycle that you can get into when you do this.

Lee: That’s right. One leads to another.

Gardner: Are there other ways that we're seeing paybacks?

Lee: The proof of the pudding is in the eating. I'll share a couple of examples from my experience looking at data for specific companies. One organization utilizes the availability of collaboration and sourcing tools to source transportation lanes, to obtain better-negotiated rates, and drive higher sourcing savings.

A lot of organizations use collaboration and sourcing tools, but the reason this is interesting is that there are different ways to source transportation. Doing it through an eSourcing tool and generating a high percentage of savings with collaboration and sourcing tools was an eye-opener for me. That's an example of an organization really using technology to its benefit by going out and sourcing an uncommon spend category.

For another example, I have a customer that was really struggling to get control of their operational costs related to transaction processing, while trying to manage and drive a high degree of compliance. What they were struggling with is that their cost structure was high. They wanted to keep the cost structure lower, but still drive a high degree of compliance.

When we looked at their benchmark data, it opened the customer's eyes to how to drive improvements: directing transactions to catalogs and contracts where applicable, driving suppliers to create invoice-based contracts in the Ariba Network, and enabling more suppliers to invoice electronically. This helped increase administrative efficiency and reduce invoice errors, which had been causing a lot of rework for the AP team.

So, these two examples, in addition to the quantitative benefits, show the tremendous opportunity organizations have to adopt and leverage some of these technologies.

Virtuous cycle

Gardner: So, we're seeing more technology become available, and more data and analytics become available as business networks are built out in terms of size, breadth, and depth, and we've identified that the paybacks can lead to a virtuous cycle of improvement.

Where do you see things going now that you've had a chance to really dig into this data and see these best practices in actual daily occurrence? What would you see happening in the future? How can we extrapolate from what we've learned in the market to what we should expect to see in the market?

Lee: We're still only just scratching the surface with insights. We have a roadmap of advanced insights that we're planning for our customers that will allow us to further leverage the insights and intelligence embedded in our network to help our customers increase efficiency in operations and effectiveness of sourcing.

Gardner: It sounds very exciting, and I think we can also consider bringing artificial intelligence and machine learning capabilities into this as we use cloud computing. And so the information and insights are then shared through a sophisticated infrastructure and services delivery approach. Who knows where we might start seeing the ability to analyze these processes and add all sorts of new value-added benefits and transactional efficiency? It's going to be really exciting in the next several years.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

Adventures in the Cloud

Adventures in the Cloud

I just got back from a whirlwind tour that included a trip to San Francisco to visit the Snowflake HQ, attend the annual Oracle ACE Directors briefing, meetings and events at Oracle OpenWorld (OOW), speaking at OakTable World (OTW), then off to Chicago to speak at Enterprise Dataversity. Yes it was packed. And lots of […]
What is the Blockchain – part 2 – and Why It Will Change Our World

What is the Blockchain – part 2 – and Why It Will Change Our World

For the tech-savvy people among us, the Blockchain might be nothing new and it may be clear that it will have a big impact on the world. However, for many people, the Blockchain is still a mystery, a puzzle or an unknown unknown. Therefore, in a series of posts, I share with you what the Blockchain is, how it works and how it will completely change the world as we know it, if we get it right.

In my first post about the Blockchain, I explained the basics of the Blockchain and in this post I will go a bit deeper and talk about the different types of Blockchains, some examples of dApps and talk about the most important part of the Blockchain; the consensus algorithms to validate the data.
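Before looking at the different types, a toy sketch may help readers who are new to the idea of blocks being chained together by hashes. This is a minimal illustration only; it is not Bitcoin's implementation, and it omits the consensus mechanisms (such as proof of work) discussed later in the post.

import hashlib
import json
import time

# A toy hash-linked chain, just to illustrate the chaining idea.
# Real blockchains add a consensus mechanism (e.g. proof of work) on top of this.

def make_block(transactions, previous_hash):
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["genesis"], previous_hash="0" * 64)
block_1 = make_block(["Alice pays Bob 5"], previous_hash=genesis["hash"])
block_2 = make_block(["Bob pays Carol 2"], previous_hash=block_1["hash"])

def chain_is_valid(chain):
    # Each block must point at the hash of the block before it.
    for prev, curr in zip(chain, chain[1:]):
        if curr["previous_hash"] != prev["hash"]:
            return False
    return True

print(chain_is_valid([genesis, block_1, block_2]))  # True
genesis["hash"] = "forged"                           # tampering breaks the link
print(chain_is_valid([genesis, block_1, block_2]))  # False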

Different Types of Blockchains

The most well-known Blockchain is the Bitcoin Blockchain. The Bitcoin Blockchain was envisioned by Satoshi Nakamoto in 2008, and it is a so-called permissionless Blockchain, or public Blockchain. This means that anyone interested in joining the Blockchain can do so by simply hooking up a computer to the decentralised Blockchain network, downloading the Blockchain, and contributing to the processing of transactions. It is not required to have a previous relationship with ...

Read More on Datafloq
4 Big Data Tips for Creating a Safer Workplace

4 Big Data Tips for Creating a Safer Workplace

Advanced analytics is having a huge impact on fields such as marketing, where it produces more sales at less cost, and CRM, where it boosts retention and profit per customer, to name two of many. In the field of workplace health and safety, it is safe to say that Big Data has had less impact to date. But that is changing. A growing number of companies are analysing data through a safety lens to modify their philosophy and practices. The changes are making workplaces safer than ever.

Effectively Using the Past to Change the Future

An oft-repeated proverb says, “Those who fail to learn from the past are doomed to repeat it.” The principle applies to workplaces where similar accidents happen repeatedly under similar circumstances, yet without the pattern being obvious. Big Data analytics offers companies a way to drill down into information to produce a clear picture of the past, which is the basis for preventing it from happening again.

Analysing the copious information available to companies about what has already happened allows them to predict the future and then to change that future to something better, something safer, before it arrives.
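Here is a minimal sketch of that kind of drill-down, assuming a simple table of historical incident records; the sites, causes, and column names are hypothetical.

import pandas as pd

# Aggregate historical incident records to surface recurring circumstances.
incidents = pd.DataFrame({
    "site":      ["Plant A", "Plant A", "Plant B", "Plant A", "Plant B"],
    "cause":     ["slip", "falling object", "slip", "slip", "machine guard"],
    "shift":     ["night", "day", "night", "night", "day"],
    "lost_days": [3, 0, 5, 2, 12],
})

# Which combinations of site, cause, and shift recur, and how costly are they?
patterns = (
    incidents.groupby(["site", "cause", "shift"])
             .agg(count=("cause", "size"), total_lost_days=("lost_days", "sum"))
             .sort_values("count", ascending=False)
)
print(patterns)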

A widely used process for understanding the past and ...

Read More on Datafloq
How flash storage provides competitive edge for Canadian music service provider SOCAN

How flash storage provides competitive edge for Canadian music service provider SOCAN

The next BriefingsDirect Voice of the Customer digital business transformation case study examines how Canadian nonprofit SOCAN faced digital disruption and fought back with a successful storage modernizing journey. We'll learn how adopting storage innovation allows for faster responses to end-user needs and opens the door to new business opportunities.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how SOCAN gained a new competitive capability for its performance rights management business we're joined by Trevor Jackson, Director of IT Infrastructure for SOCAN, the Society of Composers, Authors and Music Publishers of Canada, based in Toronto. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: The music business has changed a lot in the past five years or so. There are lots of interesting things going on with licensing models and people wanting to get access to music, but people also wanting to control their own art.

Tell us about some of the drivers for your organization, and then also about some of your technology decisions.
Jackson: We've traditionally been handling performances of music on radio stations, on television, and in movies. Over the last 10 or 15 years, with the advent of YouTube, Spotify, Netflix, and digital streaming services, we're seeing a huge increase in the volume of data that we have to digest and analyze as an organization.

Gardner: And what function do you serve? For those who might not be familiar with your organization or this type of organization, tell us the role you play in the music and content industries.

Play music ethically

Jackson: At a very high level, what we do is license the use of music in Canada. What that means is that we allow businesses through licensing to ethically play any type of music they want within their environment. Whether it's a bar, restaurant, television station, or a radio station, we collect the royalties on behalf of the creators of the music and then redistribute that to them.

We're a not-for-profit organization. Anything that we don't spend on running the business, which is the collecting, processing, and payment of those royalties, goes back to the creators or the publishers of the music.

Gardner: When you talk about data, tell us about the type of data you collect in order to accomplish that mission?

Jackson: It's all kinds of data. For the most part, it's unstructured. We collect it from many different sources, again radio and television stations, and of course, YouTube is another example.

There are some standards, but one of the challenges is that we have to do data transformation to ensure that, once we get the data, we can analyze it and it fits into our databases, so that we can do the processing on information.
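
To make that transformation step concrete, here is a minimal sketch, with entirely hypothetical field names, of how records from different sources can be normalized into one common schema before they are matched against the rights database:

from datetime import datetime

def normalize_youtube_row(row):
    # One row of a YouTube usage report becomes one record in the common schema.
    return {
        "work_title": row["video_title"],
        "performance_count": int(row["view_count"]),
        "performed_at": datetime.fromisoformat(row["date"]),
        "source": "youtube",
    }

def normalize_radio_row(row):
    # A radio log line represents a single spin of one song.
    return {
        "work_title": row["song"],
        "performance_count": 1,
        "performed_at": datetime.strptime(row["aired"], "%d/%m/%Y %H:%M"),
        "source": "radio",
    }

print(normalize_radio_row({"song": "Example Song", "aired": "01/03/2016 14:05"}))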

Gardner: And what sort of data volumes are we talking about here?

Jackson: We're not talking about petabytes, but the thing about performance information is that it's very granular. For example, the files that YouTube sends to us may have billions of rows for all the performances that are played, as they're going through their cycle through the month; it's the same thing with radio stations.

We don't store any digital files or copies of music. It's all performance-related information -- the song that was played and when it was played. That's the type of information that we analyze.

Gardner: So, it's metadata about what's been going on in terms of how these performances have been used and played. Where were you two years ago in this journey, and how have things changed for you in terms of what you can do with the data and how performance of your data is benefiting your business?

Jackson: We've been on flash for almost two years now. About two and a half years ago, we realized that the storage area network (SAN) that we did have, which was a traditional tiered-storage array, just didn't have the throughput or the input/output operations per second (IOPS) to handle the explosive amount of data that we were seeing.

With YouTube coming online, as well as Spotify, we knew we had to do something about that. We had to increase our throughput.

Performance requirements

Gardner: Are you generating reports from this data at a certain frequency or is there streaming? How is the output in terms of performance requirements?

Jackson: We ingest a lot of data from the data-source providers. We have to analyze what was played, who owns the works that were played, correlate that with our database, and then ensure that the monies are paid out accordingly.

Gardner: Are these reports for the generation of the money done by the hour, day, or week? How frequently do you have to make that analysis?

Jackson: We do what we call a distribution, which is a payment of royalties, once a quarter. When we're doing a payment on a distribution, it’s typically on performances that occurred nine months prior to the day of the distribution.
Gardner: What did you do two and a half years ago in terms of moving to flash and solid state disk (SSD) technologies? How did you integrate that into your existing infrastructure, or create the infrastructure to accommodate that, and then what did you get for it?

Jackson: When we started looking at another solution to improve our throughput, we actually started looking at another tiered-storage array. I came to the HPE Discover [conference] about two years ago and saw the presentation on the all-flash [3PAR Storage portfolio] that they were talking about, the benefits of all-flash for the price of spinning disk, which was to me very intriguing.

I met with some of the HPE engineers and had a deep-dive discussion on how they were doing this magic that they were claiming. We had a really good discussion, and when I went back to Toronto, I also met with some HPE engineers in the Toronto offices. I brought my technical team with me to do a bit of a deeper dive and just to kick the tires to understand fully what they were proposing.

We came away from that meeting very intrigued and very happy with what we saw. From then on, we made the leap to purchase the HPE storage. We've had it running for about [two years] now, and it’s been running very well for us.

Gardner: What sort of metrics do you have in terms of technology, speeds and feeds, but also metrics in terms of business value and economics?

Jackson: I don’t want to get into too much detail, but as an anecdote, we saw some processes that we were running going from days to hours just by putting it on all-flash. To us, that's a huge improvement.

Gardner: What other benefits have you gotten? Are there some analytics benefits, backup and recovery benefits, or data lifecycle management benefits?

OPEX perspective

Jackson: Looking at it from an OPEX perspective, because of the IOPS that we have available to us, planning maintenance windows has actually been a lot easier for the team to work with.

Before, we would have to plan something akin to landing the space shuttle. We had to make sure that we weren’t doing it during a certain time, because it could affect the batch processes. Then, we'd potentially be late on our payments, our distributions. Because we have so many IOPS on tap, we're able to do these maintenance windows within business hours. The guys are happier because they have a greater work-life balance.

The other benefit that we saw was that all-flash uses less power than spinning disk. Because of less power, there is less heat and a need for less floor space. Of course, speed is the number one driving factor for a company to go all-flash.

Gardner: In terms of automation, integration, load-balancing, and some of those other benefits that come with flash storage media environments, were you able to use some of your IT folks for other innovation projects, rather than speeds and feeds projects?

Jackson: When you're freeing up resources from keeping the lights on, it's adding more value to the business. IT traditionally is a cost center, but now we can take those resources off of the day-to-day mundane tasks and put them into projects, which is what we've been doing. We're able to add greater benefit to our members.

Gardner: And has your experience with flash in modernizing your storage prompted you to move toward other infrastructure modernization techniques, including virtualization, software-defined composable infrastructure, maybe hyperconverged? Is this an end point for you or maybe a starting point?

Jackson: IT is always changing, always transforming, and we're definitely looking at other technologies.

Some of the big buzzwords out there, blockchain, machine learning, and whatnot are things that we’re looking at very closely as an organization. We know our business very well and we're hoping to leverage that knowledge with technology to further drive our business forward.

Gardner: We're hearing a lot of promising visions these days about how machine learning could be brought to bear on things like data transformation and making that analysis better, faster, cheaper. So, that's pretty interesting stuff.
Are you now looking to extend what you do? Is the technology an enabler more than a cost center in some ways for your general SOCAN vision and mission?

Jackson: Absolutely. We're in the music business, but there is no way we can do what we do without technology; technically it’s impossible. We're constantly looking at ways that we can leverage what we have today, as well as what’s out in the marketplace or coming down the pipe, to ensure that we can definitely add the value to our members to ensure that they're paid and compensated for their hard work.

Gardner: And user experience and user quality of experience are top-of-mind for everybody these days.

Jackson: Absolutely, that’s very true.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Why Alternative Data is the New Financial Data for Industry Investors

Why Alternative Data is the New Financial Data for Industry Investors

Information that has the ability to give investors an edge has long been coveted, but the nature of that information has evolved over time. Currently, traditional financial data, such as stock price history and fundamentals, is the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock’s health before traditional financial data. Obviously, this has major implications for investors.

If information is power, then unique information sourced from places, well, not-yet-sourced from, is a new level of domination. This is alternative data — unmined information with the potential to be leveraged for commercial value. Given that we’re in what we like to call a data revolution, where nearly every move we make can be digitized, tracked, and analyzed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. To help paint a more concrete picture of how alternative data may be useful, below is a list of 10 industry sectors that we believe can produce new data for investors.

Business Operations

When it comes to operational challenges like keeping costs low and improving efficiencies, data from various sources, such ...

Read More on Datafloq
$46 Billion: Big Data is Increasingly Meaning Big Money

$46 Billion: Big Data is Increasingly Meaning Big Money

The Big Data industry is bigger than ever. In terms of data, that's a true statement any day of the week; every time you click, tap or swipe, you're making a contribution to the data mine. In terms of money, revenues have never been better! Companies offering services classified within the Big Data industry reached a new milestone: more than $46 billion in revenue year-to-date.

Big Data is Poised to Get Bigger

That's a big number, no matter how you slice it. In fact, it's ten times bigger than the estimated net worth of the 2016 Republican presidential candidate, a man who thinks everything should be big. Although, I have a feeling that his answer would be entertaining if you were able to ask him, "What does big data mean?"

And, just like political ad-buys, the numbers are only getting bigger as the year goes on. Some analysts believe that Big Data has a long way to go in terms of making its way into every corner of the market. How big could the Big Data market become? SNS Research released a report stating that by 2020, Big Data will generate $72 billion in revenue.

Companies need to invest capital in order to grow the technological ...

Read More on Datafloq
What is Video Analytics and Why is it Becoming Such a Big Player?

What is Video Analytics and Why is it Becoming Such a Big Player?

Explosive growth in the video analytics market has really caught the eye of business intelligence departments. This little-known class of software is quickly picking up speed across an expansive range of applications. From security to public safety to crowd management, video analytics has begun to see a big boom in the business world.

Video analytics, or intelligent video analytics, is software that is used to monitor video streams in near real-time. While monitoring the videos, the software identifies attributes, events or patterns of specific behavior via video analysis of monitored environments. Video analysis software also generates automatic alerts and can facilitate forensic analysis of historical data to identify trends, patterns and incidents. The software enables its users to analyze, organize and share any insight gained from the data to make smarter, better decisions. It can promote enhanced coordination across and within agencies and organizations. Its applications are widespread, including monitoring vehicle patterns or violations of traffic laws, or people entering restricted areas during defined time frames. The data can then be sorted by time and date or over an extended time period to create a trend analysis.

A simple function of video analytics is motion detection with a fixed background. More technical functions ...
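
As a rough sketch of that simplest case, the snippet below (Python with OpenCV; the video path and area threshold are placeholders) subtracts a learned background model from each frame and flags frames where a large enough region has changed:

import cv2

capture = cv2.VideoCapture("camera_feed.mp4")             # placeholder video source
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                         # foreground = pixels that moved
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if any(cv2.contourArea(c) > 1500 for c in contours):   # ignore tiny changes
        print("motion detected")                           # a real system would raise an alert

capture.release()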

Read More on Datafloq
Why Machine Learning is the Future of Data Analytics

Why Machine Learning is the Future of Data Analytics

Facebook makes use of a user’s likes and preferences to show ads that they may be interested in. When you mistype a search query on Google, the search engine instantly cross-references it against the millions of similar typos to interpret the correct query and shows you results appropriately. Tesla makes use of your car’s vital parameters and benchmarks this against available data to know when you are due for servicing. Netflix studies engagement and behavior from millions of users to precisely know what images and promotions elicit better response from users.

All of this is just the tip of the iceberg. In the last few years, machine learning techniques have proven to be incredibly effective for predictive and deep insights when used with data analytics. Many companies regard big data as their biggest asset because it reflects their aggregate experience. After all, every partner, customer, defect, transaction, and complaint gives the company an experience to learn from. 

While many companies have focused in recent years on how to store and manage all this data, it's not just about the quantity of data or how it's being stored. By combining data analytics with machine learning, companies can predict the future with ...
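
A minimal sketch of that combination, assuming a hypothetical file of historical customer records, is to train a model on what the company has already experienced and then check how well it predicts outcomes it has not yet seen:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

history = pd.read_csv("customer_history.csv")              # hypothetical historical records
X = history[["complaints", "transactions", "tenure_months"]]
y = history["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)

print("holdout accuracy:", model.score(X_test, y_test))    # how well the past predicts the future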

Read More on Datafloq
IT Sapiens, for Those Who Are Not

IT Sapiens, for Those Who Are Not

Perhaps one of the most refreshing moments in my analyst life is when I get the chance to witness the emergence of new tech companies—innovating and helping small and big organizations alike to solve their problems with data. This is exactly the case with Latvia-based IT Sapiens, an up-and-coming company focused on helping those small or budget-minded companies to solve their basic yet crucial
Why The Cars of the Future Will Rely on the IoT

Why The Cars of the Future Will Rely on the IoT

Once, electric cars were a novelty: they couldn't go very far, and weren't a practical option for consumers. Fortunately, a lot has changed since those days, and electric vehicles are now much more accessible and high-tech than they were in the past. National Drive Electric Week aims to give even more drivers the tools they need to consider an electric car as a more sustainable option. These cars of the future are an exciting development in reducing our oil dependency, but where are they headed?

The Evolution of Electric Cars and Their Benefits

Gas cars weren't always the norm, and in fact, they used to be much less popular than their electric counterparts. Back in the late 1800s, electric cars completely took over the market, and gas cars were much less popular. Unfortunately, there were some limitations at that time: each car had to be assembled by hand (while by 1910, gas-powered cars could be produced by assembly line) and the electrical infrastructure limited the vehicles to city-only driving. Gas cars, meanwhile, became safer and more convenient, so most production of electric cars quickly declined.

Later in the century, during the late 60s and early 70s, electric cars experienced a Renaissance, ...

Read More on Datafloq
Strategic DevOps—How advanced testing brings broad benefits to Independent Health

Strategic DevOps—How advanced testing brings broad benefits to Independent Health

The next BriefingsDirect Voice of the Customer digital business transformation case study highlights how Independent Health in Buffalo, New York has entered into a next phase of "strategic DevOps."

After a two-year drive to improve software development, speed to value, and the user experience of customer service applications, Independent Health has further extended advanced testing benefits to ongoing application production and performance monitoring.

Learn here how the reuse of proven performance scripts and replaying of synthetic transactions that mimic user experience have cut costs and gained early warning and trending insights into app behaviors and system status.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to describe how to attain such new strategic levels of DevOps benefits are Chris Trimper, Manager of Quality Assurance Engineering at Independent Health in Buffalo, New York, and Todd DeCapua, Senior Director of Technology and Product Innovation at CSC Digital Brand Services Division and former Chief Technology Evangelist at Hewlett Packard Enterprise (HPE). The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What were the major drivers that led you to increase the way in which you use DevOps, particularly when you're looking at user-experience issues in the field and in production?

Trimper: We were really hoping to get a better understanding of our users and their experiences. The way I always describe it to folks is that we wanted to have that opportunity to almost look over their shoulder and understand how the system was performing for them.

Whether your user is internal or external, if they don't have that good user experience, they're going to be very frustrated and they're going to have a poor experience. Internally, time is money. So, if it takes longer for things to happen, you get frustration and potential turnover; it's an unfortunate barrier.

Gardner: What kind of applications are we talking about? Is this across the spectrum of different type of apps, or did you focus on one particular type of app to start out?

End users important

Trimper: Well, when we started, we knew that the end user, our members, were the most important thing to us, and we started off with the applications that our servicing center used, specifically our customer relationship management (CRM) tool.

If the member information doesn’t pop fast when a member calls, it can lead to poor call quality, queuing up calls, and it just slows down the whole business. We pride ourselves on our commitment to our members. That goes even as far as, when you call up, making sure that the person on the other end of the phone can service you well. Unfortunately, they can only service you as well as the data that’s provided to them to understand the member and their benefits.

Gardner: It’s one thing to look at user experience through performance, but it's a whole new dimension or additional dimension when you're looking at user experience in terms of how they utilize that application, how well it suits their particular work progress, or the processes for their business, their line of business. Are you able to take that additional step, or are you at the point where the feedback is about how users behave and react in a business setting in addition to just how the application performs?

Trimper: We're starting to get to that point. Before, we only had as much information as we were provided about how an application was used or what they were doing. Obviously, you can't stand there and watch what they're doing 24x7.

Lately, we've been consuming an immense amount of log data from our systems and understanding what they're doing, so that we can understand their problems and their woes, or make sure that what we're testing, whether it's in production monitoring or pre-production testing, is an accurate representation of our user. Again, whether it’s internal or external, they're both just as valuable to us.

Gardner: Before we go any further, Chris, tell us a little bit about Independent Health. What kind of organization is it, how big is it, and what sort of services do you provide in your communities?
Trimper: We're a healthcare company for the Western New York area. We're a smaller organization. We define the red-shirt treatment that stands for the best quality care that we can provide our members. We try to be very proactive in everything that we do for our members as well. We drive members to the provider to do preventative things, that healthier lifestyle that everybody is trying to go for.

Gardner: Todd, we're hearing this interesting progression toward a feedback loop of moving beyond performance monitoring into behaviors and use patterns and improving that user experience. How common is that, or is Independent Health on the bleeding edge?

Ahead of the curve

DeCapua: Independent Health is definitely moving with, or maybe a little bit ahead of, the curve in the way that they're leveraging some of these capabilities.

If we were to step back and look at where we've been from an industry perspective across many different markets, Agile was hot, and now, as you start to use Agile and break all the right internal systems for all the right reasons, you have to start adopting some of these DevOps practices.

Independent Health is moving a little bit ahead on some of those pieces, and they're probably focusing on a lot of the right things, when you look across other customers I work with. It's things like speed of time to value. That goes across technology teams, business teams, and they're really focused on their end customer, because they're talking about getting these new feature functions to benefit their end customers for all the right reasons.

You heard Chris talking about that improved end-user experience around their customer service applications. This is when people are calling in, and you're using tools to see what's going on and what your end users are doing.

There's another organization that actually recorded what their customers were doing when they were having issues. That was a production-monitoring type thing, but now you're recording a video of this. If you called within 10 minutes of having that online issue, as you are calling in and speaking with that customer service representative, they're able to watch the video and see exactly what you did to get that error online to cause that phone call. So having these different types of user exceptions, and being able to do the type of production monitoring that Independent Health is doing, is fantastic.

Another area that Chris was telling me about is some of the social media aspects and being able to monitor that is another way of getting feedback. Now, I do think that Independent Health is hitting the bleeding edge on that piece. That’s what I've observed.

Gardner: Let’s hear some more about that social media aspect, getting additional input, additional data through all the available channels that you can.

Trimper: It would be foolish not to pay attention to all aspects of our members, and we're very careful to make sure that they're getting that quality that we try to aim for. Whether it happens to be Facebook, Twitter, or some other mechanism that they give us feedback on, we take all that feedback very seriously.

I remember an instance or two where there might have been some negative feedback. That went right to the product-management team to try to figure out how to make that person’s experience better. It’s interesting, from a healthcare perspective, thinking about that. Normally, you think about a member’s copay or their experience in the hospital. Now, it's their experience with this application or this web app, but those are all just as important to us.

Broadened out?

Gardner: You started this with those customer-care applications. Has this broadened out into other application development? How do you plan to take the benefits that you've enjoyed early and extend them into more and more aspects of your overall IT organization?

Trimper: We started off with the customer service applications and we've grown it into observing our provider portals as well. A provider can come in and look at the benefits of a member, the member portal that the members actually log in to. So, we're actually doing production monitoring of pretty much all of our key areas.

We also do pre-production monitoring of it. So, as we are doing a release, we don’t have to wait until it gets to production to understand how it went. We're going a little bit beyond normal performance testing. We're running the same exact types of continuous monitoring in both our pre-production region and our production regions to ensure that quality that we love to provide.

Gardner: And how are the operations people taking this? Has this been building bridges? Has this been something that struck them as a foreign entity in their domain? How has that gone?

Trimper: At first, it was a little interesting. It felt like to them it was just another thing that they had to check out and had to look at, but I took a unique approach with it. I sat down and talked to them personally and said, "You hear about all these problems that people have, and it’s impossible for you to be an expert on all these applications and understand how it works. Luckily, coming from the quality organization, we test them all the time and we know the business processes."
The way I sold it to them is, when you see an alert, when you look at the statistics, it’s for these key business processes that you hear about, but you may not necessarily want to know all the details about them or have the time to do that. So, we really gave them insight into the applications.

As far as the alerting, there was a little bit of an adoption practice for that, but overall we've noticed a decrease in the number of support tickets for applications, because we're allowing them to be more proactive, whether it’s proactive of an unfortunately blown service-level agreement (SLA), or it’s a degradation in quality of the performance. We can observe both of those, and then they can react appropriately.

Gardner: Todd, he actually sat down and talked to the production people. Is this something novel? Are we seeing more of that these days?

DeCapua: We're definitely seeing more of it, and I know it’s not unique for Chris. I know there was some push back at the beginning from the operations teams.

There was another thing that was interesting. I was waiting for Chris to hit on it, and maybe he can go into it a little bit more. It was the way that he rolled this out. When you're bringing a monitoring solution in, it’s often the ops team that’s bringing in this solution.

Making it visible

What’s changing now is that you have these application-development testing teams that are saying, "We also want to be able to get access to these types of monitoring, so that our teams can see it and we can improve what we are doing and improve the quality of what we deliver to you, the ops teams. We are going to do instrumenting and everything else that we want to get this type of detail to make it visible."

Chris was sharing with me how he made this available first to the directors, and not just one group of directors, but all the directors, making this very plain-sight visible, and helping to drive some of the support for the change that needed to happen across the entire organization.

As we think about that as a proven practice, maybe Chris is one of the people blazing the trail there. It was a big way of improving and helping to illuminate for all parties, this is what’s happening, and again, we want to work to deliver better quality.

Gardner: Anything to add to that, Chris?

Trimper: There were several folks in the development area who weren't necessarily the happiest when they learned that what they originally thought was there and what was really there, in terms of performance, didn't line up that well.

One of the directors shared an experience with me. He would go into our utilities and look at the dashboards before he was heading to a meeting in our customer service center. He would understand what kind of looks he was going to be given when he walked in, because he was directly responsible for the functionality and performance of all this stuff.

He was pleased that, as they went through different releases and were able to continually make things better, he started seeing everything is green, everything is great today. So, when I walk in, it’s going to be sunshine and happiness, and it was sunshine and happiness, as opposed to potentially a little bit doomy and gloomy. It's been a really great experience for everyone to have. There's a little bit of pain going through it, but eventually, it has been seen as a very positive thing.

Gardner: What about the tools that you have in place? What allows you to provide these organizational and cultural benefits? It seems to me that you need to have data in your hands. You need to have some ability to execute once you have got that data. What’s the technology side of this; we've heard quite a bit about the people and the process?

Trimper: This whole thing came about because our CIO came to me and said, "We need to know more about our production systems. I know that your team is doing all the performance testing in pre-production. Some of the folks at HPE told me about this new tool called Performance Anywhere. Here it is, check it out, and get back to me."

We were doing all the pre-production testing, and we learned that all the scripts we had built, which were already tried and true, running, and continuously updated as we get new releases, could just be turned into these production monitors. Then we found, through using the tool during our trial and now over the two-plus years we have been working with it, that it was a fairly easy process.

Difficult point

The most difficult point was understanding how to get production data that we could work with, but you could literally take a test on your VUGen script and turn it into a production monitor in 5-10 minutes, and that was pretty invaluable to us.

That means that every time we get a release, we don’t have to modify two sets of scripts and we don’t have two different teams working on everything. We have one team that is involved in the full life cycle of these releases and that can very knowledgeably make the change to those production monitors.

Gardner: HPE Performance Anywhere. Todd, are a lot of people using it in the same fashion, where they're getting this dual benefit from pre-production and also in deployment and operations?

DeCapua: Yes, it's definitely something that people are becoming more and more aware of. It's a capability that's been around for a little while. You'll also hear about things like IT4IT, but I don't want to open up that whole can of worms unless we want to dive into it. But as that starts to happen, people like Chris, people like his CIO, want to be able to get better visibility into all systems that are in production, and is there an easy way to do that? Being able to provide that easy way for all of your stakeholders and all of your customers is a capability that we're definitely seeing people adopt.

Gardner: Can you provide a bit more detail in terms of the actual products and services that made this possible for you, Chris?

Trimper: We started with our HPE LoadRunner scripts, specifically the VUGen scripts, that we were able to turn into the production monitors. Using the AppPulse Active tool from the AppPulse suite of tools, we were able to build our scripts using their SaaS infrastructure and have these monitors built for us and available to test our systems.

Gardner: So what do you see in your call center? Are you able to analyze in any way and say, "We can point to these improvements, these benefits, from the ability for us to tie the loop back on production and quality assurance across the production spectrum?"

Trimper: We can do a lot of trend analysis. To be perfectly honest, we didn't think that the report would run, but we did a year-to-date trend analysis and it actually was able to compile all of our statistics. We saw two really neat things.

When you had open enrollment, we saw this little spike that shot up there, which we would expect to see, but hopefully we can be more prepared for it as time goes on. But we saw a gradual decrease, and I think, due to the ability to monitor, react, and plan better for a better-performing system, through the course of the year, for this one key piece of pulling member data, we went from an average of about 12-14 seconds down to 4 seconds, and that trend actually is continuing to go down.

I don’t know if it’s now 3 or less today, but if you think about that 12 or 14 down to about 4, that was a really big improvement, and it spoke volumes to our capabilities of really understanding that whole picture and being able to see all of that in one place was really helpful to us.

Where next?

Gardner: Looking to the future, now that you've made feedback loops demonstrate important business benefits and even move into a performance benefit for the business at large, where can you go next? Perhaps you're looking at security and privacy issues, given that you're dealing with compliance and regulatory requirements like most other healthcare organizations. Can you start to employ these methods and these tools to improve other aspects of your SLAs?

Trimper: Definitely, in terms of the SLAs and making sure that we're keeping everything alive and well. As for some of the security aspects, those are still things where we haven’t necessarily gone down the channels yet. But we've started to realize that there are an awful lot of places where we can either tie back or really start closing the gaps in our understanding of just all that is our systems.

Gardner: Todd, last word, what should people be thinking about when they look at their tooling for quality assurance and extending those benefits into full production and maybe doing some cultural bonding at the same time?

DeCapua: The culture is a huge piece. No matter what we talk about nowadays, it starts with that. When I look at somebody like Independent Health, the focus of that culture and the organization is on their end user, on their customer.

When you look at what Chris and his team have been able to do, at a minimum, it's reducing the number of production incidents. And while you're reducing production incidents, you're doing a number of things. There are actually hard costs there that you're saving. There are opportunity costs now that you can have these resources working on other things to benefit that end customer.

We've talked a lot about DevOps, we've talked a lot about monitoring, we've mentioned now culture, but where is that focus for your organization? How is it that you can start small and incrementally show that value? Because now, what you're going to do is be able to illustrate that in maybe two or three slides, two or three pages.
But some of the things that Chris has been doing, and other organizations are also doing, is showing, "We did this, we made this investment, this is the return we got, and here's the value." For Independent Health, their customers have a choice, and if you're able to move their experience from 12-14 seconds to 4 seconds, that’s going to help. That’s going to be something that Independent Health wants to be able to share with their potential new customers.

As far as acquiring new customers and retaining their existing customers, this is the real value. That's probably my ending point. It's a culture, there are tools that are involved, but what is the value to the organization around that culture and how is it that you can then take that and use that to gain further support as you move forward?

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

How to Empower Business Leaders with Dashboards

How to Empower Business Leaders with Dashboards

Provide an insightful report and you’ll empower an executive for a day. Provide an interactive data dashboard and you’ll empower him for a lifetime.

I have been a consumer of financial performance data for more than 20 years. An insightful performance report (revenue, profit, and expense) empowered me for a day. But in the long run, I was left frustrated for three reasons: 

1. Reports come too late to react to
2. Reports contain errors causing doubt
3. Data behind the numbers were rarely accessible

Financial reporting is not about the numbers. Financial reporting is about the story behind the numbers and the actions you take as a result of that story. 

My ability to tell the "performance" story has been as good as the data I would get. When I could not see the details behind numbers, I had to make assumptions without certainty of facts.  At times I felt like a sports fan shouting at the scoreboard hoping to change the result.


Before I go any further, I do not blame IT. Business Leaders often don't know what they want until they want it. IT people perform heroic acts trying to anticipate ...

Read More on Datafloq
5 Ways to Increase Your Start-up Valuation with Cloud Accounting

5 Ways to Increase Your Start-up Valuation with Cloud Accounting

Cloud accounting is a great tool, not only for large businesses and organizations but also for start-ups. It is the process or financial strategy which helps businesses to store, monitor and process confidential financial data, without investing in expensive in-house IT server infrastructure. In the case of start-ups, Cloud accounting solutions bring in a whole plethora of benefits which help the emerging business to grow and optimize the efficacy of its operational protocols, while also increasing its valuation. Cloud accounting’s dynamic advantages like real-time data access, scalability, and flexibility make it a great choice for developing start-ups. 

The fundamental trait of any start-up is its affinity for innovation and creativity. A start-up that implements Cloud accounting protocols within its business strategy benefits in more than one way. It can help start-ups with limited resources and/or knowledge to properly manage their finances without much of a hassle. Here are a few advantages of implementing Cloud accounting techniques, which help to drive up a start-up's valuation. 

1. Impressive Data Flow and Access Management 

Unlike large businesses and corporations, start-ups lack the necessary IT resources, such as skilled IT personnel, specialized hardware, and top-notch security systems, to properly set up and manage in-house data servers. Without a significant amount of funding, start-ups ...

Read More on Datafloq
How Emerging Industries are Using Big Data to their Advantage

How Emerging Industries are Using Big Data to their Advantage

It’s exciting to watch a new industry figure out ways to use big data. There are so many ways different industries put data to work for them and for their audience. For example, Google is using RankBrain to determine search results.

This is exciting because it’s a very real example of Artificial Intelligence employing data to affect what you see in front of you every day. Go ahead, try typing an unusual query into Google. RankBrain will help determine the result based on similar and unconnected searches, your own searches, and the data generated by clicks in those searches, and it will do this in real-time.

In other words, machine learning will use big data to personalize the result of each search for you. And in the world of Google updates and SEO, the mysterious, exciting thing is that there was an “Unnamed Major Update” in May. Was that change a full-scale takeover by AI? We won’t know until there’s an announcement.

So Google is something of a juggernaut and a pioneer in the big data world. Internet giants like Google were at the forefront of Big Data’s emergence to the public eye in 2010, which has ushered in Analytics 3.0. This is ...

Read More on Datafloq
What was left off the shopping list? – A new Hungarian data mining competition launches today

What was left off the shopping list? – A new Hungarian data mining competition launches today

One important milestone on the road to becoming a data scientist is entering data mining competitions. Taking part in a good competition is an excellent way to test your knowledge of machine learning methods, your ability to set up proper training and testing procedures, and your evaluation strategy. That is why sites that organize data mining competitions are so popular; they are worth following even if you don't have time to join the contests yourself.

I am especially pleased when Hungarian competitions are launched, since these events are an indicator of where the local data community really stands. That is why I want to draw particular attention to the Cetli ("Shopping List") Competition starting today: in this contest, launched thanks to and with the support of Nextent, we get to work with data from the Cetli app. We can see the shopping lists of anonymized users, and we know where items were deleted from those lists. The task is to estimate which product the organizers deleted from the list at a given store. As a result, we can see at once what was bought and where, so the dataset is interesting in itself. 

If you are interested, take a look around the competition site and register as a competitor. 

Official competition page 

The competition organizers will talk about the launch in more detail at tonight's Meetup.

Do you feel you have a dataset that would make for an interesting data mining competition? Would you like to know what the best achievable solution might be, or who really knows how to solve that kind of problem? Or do you simply want your vendors to compete? Get in touch with us, and we'll gladly help you formulate, announce, and even run the competition.  - Gáspár Csaba


Why is Data becoming established in the C-Suite?

Why is Data becoming established in the C-Suite?

Over the last decade, MBN Solutions have seen the seniority of data and analytics roles grow in various types of organization. Chief Data Officers (CDOs) have become increasingly popular, as the most senior of those roles within a company.

With the passing of time, those roles continue to operate and it feels like the CDO is now an established presence in today’s boardrooms. Why is that? Given the demand we also see for Analytics or Data Science leaders, why has the CDO reached the top first?

Although there are a few businesses with Chief Analytics Officers or Chief Scientists, many appear satisfied with a CDO at the top for now. Wondering why, I’ve been chatting to some of those data and analytics leaders we have placed. A few common themes have emerged, to perhaps explain this pattern.

MBN has been tracking this in the market since we posed this question back in 2013 to an audience of senior data professionals at an event we hosted at Home House, London: “How many of you anticipate a Data professional sitting at C-Suite level within the next few years?”

The response from the audience was striking. Of almost 80 individuals in the room, only 2 raised their hands.

So, I thought I’d ...

Read More on Datafloq
Will Fog Computing Hide the Clouds of the Internet of Things?

Will Fog Computing Hide the Clouds of the Internet of Things?

As if the Internet of Things (IoT) were not complicated enough, the Marketing team at Cisco introduced its Fog Computing vision in January 2014, also known as Edge Computing to other, more purist vendors.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have resorted to shocking headlines around this subject, taking advantage of the hype around the IoT.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model and how companies are using IoT intelligent gateways in the Fog to connect the “Things” to the Cloud, through some application areas and examples of Fog Computing.

The problem with the Cloud

As the Internet of Things proliferates, businesses face a growing need to analyse data from sources at the edge of a network, whether mobile phones, gateways or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.

The IoT owes its explosive growth to the connection of physical things and operation technologies (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make “smart” decisions without human intervention. Currently, such resources ...
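
The gateway idea can be illustrated with a minimal sketch (the threshold and field names are invented): raw readings are reduced to a compact summary at the edge, and time-critical decisions are taken locally rather than after a round trip to the cloud:

import statistics

def summarize(window):
    # Reduce one window of sensor readings to a compact summary at the gateway.
    return {
        "count": len(window),
        "mean": statistics.fmean(window),
        "max": max(window),
        "alert": max(window) > 80.0,        # act locally, without waiting for the cloud
    }

readings = [71.2, 69.8, 83.4, 70.1]         # one window of temperature samples
print(summarize(readings))                  # only this small summary is sent upstream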

Read More on Datafloq
How Smart Factories and the IIoT Can Prevent Product Recalls

How Smart Factories and the IIoT Can Prevent Product Recalls

In recent news, Samsung Electronics Co. has initiated a global recall of 2.5 million of its Galaxy Note 7 smartphones, after finding that the batteries of some of the phones exploded while charging. This recall is expected to cost the company close to $1 billion.

This is not a one-off incident

Product recalls have plagued the manufacturing world for decades, right from the food and drug industries to automotive, causing huge losses and risk to human life. In 1982, Johnson & Johnson recalled 31 million bottles of Tylenol, which retailed at $100 million, after 7 people died in the Chicago area. In 2000, Ford recalled 20 million Firestone tires, losing around $3 billion, after 174 people died in road accidents due to faulty tires. In 2009, Toyota issued a recall of 10 million vehicles due to numerous issues, including gas pedals and faulty airbags, that resulted in a $2 billion loss consisting of repair expenses and lost sales, in addition to its stock price dropping more than 20 percent, or $35 billion.

Most manufacturers have very stringent quality-control processes for their products before they are shipped. So how and why do these faulty products, which pose serious risks to life and to the business, still make it to the market?

Koh Dong-jin, president ...

Read More on Datafloq
Becoming a Big Data Scientist: Skills You Need to Know and How to Learn Them

Becoming a Big Data Scientist: Skills You Need to Know and How to Learn Them

To say that data scientists are in high demand would actually be sort of an understatement. With big data being utilized more and more within organizations, executives want men and women who know big data inside and out. The number of data scientist positions is growing each year. This demand is reflected in the amount of money being paid to data scientists, with the median salary for computer and information research scientists hitting more than $110,000 in 2015, according to the Bureau of Labor Statistics. But it's not enough to be considered a data scientist; you need to have the right skills to get noticed above your peers. In this way, you'll be able to land the most coveted jobs that are out there. In other words, mastering certain skills will get you noticed far more quickly.

One can look at data science skills you should know through a broad lens. Simply saying you need programming skills, for example, would be accurate, but let’s get more specific than that. In an analysis from CrowdFlower of LinkedIn job postings, the most cited skill for data scientist openings was SQL. In fact, more than half (57 percent) listed SQL ...

Read More on Datafloq
5 Ways Blockchain will Transform Financial Services

5 Ways Blockchain will Transform Financial Services

Blockchain is being hailed as “the new internet” and is driving transformation for businesses across multiple sectors, particularly financial services. But how exactly?

Blockchain in a Nutshell

Let's start with a quick recap of what exactly blockchain is and its benefits. Pinching a definition from the Financial Times…

“A blockchain is a shared digital ledger that allows transactions to be recorded and verified electronically over a network of computers without a central ledger. Cryptography is used to protect the data from fraud or hackers.”

So why is everyone, including us of course, so excited? Because the benefits are extensive: decentralisation, reliability, simplification, transparency, traceability, cost saving, reduced room for error, faster transactions and improved data quality… just to mention a few!
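
To make the definition above a little more tangible, here is a toy sketch of the core idea: nothing more than a chain of records in which each block stores the hash of its predecessor, so tampering with any earlier entry breaks every hash that follows. It leaves out the network, consensus, and everything else a real blockchain needs.

import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "tx": "genesis", "prev": "0" * 64}]

def add_block(tx):
    prev = chain[-1]
    chain.append({"index": prev["index"] + 1, "tx": tx, "prev": block_hash(prev)})

add_block("Alice pays Bob 10")
add_block("Bob pays Carol 4")

# Every block must reference the hash of the block before it; changing any
# earlier transaction invalidates every link that follows.
print(all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain))))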

So let’s take a look at some specific ways blockchain will transform the Financial Services industry – ultimately creating a much more satisfying customer experience for us all.

1. Asset Management

Use case: Settlements

Traditional trade processes within asset management can be slow, manual, cumbersome and filled with risk when reconciling and matching – and they're getting more complex with cross-border transactions and non-standard investment products, e.g. loans. Each party in the trade lifecycle (e.g. broker dealers, intermediaries, custodians, clearing and settlement teams) currently keeps their ...

Read More on Datafloq
How to Integrate Sqoop in a Data Ingestion Layer with Hive, Oozie in a Big Data Application?

How to Integrate Sqoop in a Data Ingestion Layer with Hive, Oozie in a Big Data Application?

Sqoop is a tool that helps migrate and transfer data in bulk between an RDBMS and a Hadoop system. This blog post will focus on integrating Sqoop with other projects in the Hadoop ecosystem and Big Data applications. Working for a Big Data solutions provider, I learned it, and here I will show how to schedule a Sqoop job with Oozie and how to load the data from Sqoop into a Hive data warehouse on Hadoop, or even HBase. I have tried to give the solutions with code here, which should make them easy to understand.


Java: JDK 1.7

Cloudera version:  CDH4.6


Initial steps

Here we assume that we already have some data on the HDFS of our Hadoop cluster and in our MySQL database. Now we need to integrate the jobs with Oozie and Hive. Let's understand it with the code!

Code walkthrough

Below is an Oozie workflow which will trigger a Sqoop import job to import the data from MySQL into Hadoop:

<workflow-app name="musqoop-wf" xmlns="uri:oozie:workflow:0.1">
    <start to="mysqoopaction"/>
    <action name="mysqoopaction">
        <!-- Sqoop action; the standard action namespace is uri:oozie:sqoop-action:0.2 -->
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <command>import --table People --connect ...</command>
        </sqoop>
        <ok to="end"/>
        <error to="error"/>
    </action>
    <kill name="error">
        <message>Sqoop import failed</message>
    </kill>
    <end name="end"/>
</workflow-app>

Now, how do you add properties for Sqoop when integrating with Oozie? This workflow file helps answer that question and passes the configured properties to Sqoop.

For example: add a statement in a ...

Read More on Datafloq
Why Employee Training and Big Data Should Work Together

Why Employee Training and Big Data Should Work Together

The flood of data available today is growing by leaps and bounds as expanding networks are able to capture real-time user decisions in an instant. The ability to analyze this data for trends and insights is becoming an eagerly-sought advantage for corporate training companies and organizations of all kinds. But more of them are beginning to discover that it poses benefits beyond marketing forecasts and instant statistics. Big data is being adapted to e-learning processes to train better employees.

Data-driven approaches are being used to perfect adaptive learning, create better courses, and provide electronic monitoring and testing in ways a single human instructor couldn't cope with. Around 77% of US companies offer e-learning, but little of it leverages big data. Here's why every company should bring big data to employee training.

1. Determine Effectiveness

Big data computing can return analytics that quantify all the results of employee training - lesson retention, employee learning needs, more productive curriculum and techniques, and improved learning software. Learning directors are able to look at a number of different factors and determine which works and which doesn't.

Organizations can develop metrics for each training module based on employee learning time, test results, questions, and feedback. Do certain methods work ...

Read More on Datafloq
Why Big Data Is Growing So Fast?

Why Big Data Is Growing So Fast?

You are probably already feeling some of the impact of living in a data-driven world. You rarely come across a Google search that doesn’t answer your question, and often enough you find more than enough information to write your own tome on any topic you can imagine.

Moreover, your hard drive is probably filled with so much data accumulated over the years that you might wonder what would happen to all your data if it crashed and you hadn’t backed up everything. Fortunately, hard drive data recovery experts can quickly resolve this issue.

In one of his books, author Deepak Chopra narrates how his hard drive crashed when he was writing a book he had spent months researching. Since he had not yet backed up his data, he immediately experienced crushing despair. He had no idea how to reconstruct his pivotal ideas or retrace his in-depth research findings. Fortunately, he discovered that it was possible to completely recover a hard-drive and was amazed (and relieved) when he quickly got his restored hard-drive in the mail.

Keeping Track of Big Data

Four years from now, there will be 5,200 GB of data per person. International Data Corporation, a research group, believes that there will ...

Read More on Datafloq
Breaking Analytics Out Of The Box – Literally

Breaking Analytics Out Of The Box – Literally

The lines between open source and commercial products are blurring rapidly as our options for building and executing analytics grow by the day. The range of options and price points available today enable anyone from a large enterprise to a single researcher to gain access to affordable, powerful analytic tools and infrastructure. As a result, analytics will continue to become more pervasive and more impactful.

Author’s note: I typically avoid mentioning specific products or services in my blogs. However, it is unavoidable for this topic. While I will make mention of a number of my company’s offerings here to illustrate specific examples of the themes, the themes themselves are broad and industry-wide.

Blurred Lines

Given the cost and overhead, it used to be that organizations would have to make an either/or choice when it came to selecting data platforms and analytical tools. Even with the advent of the open source movement, common opinions espoused either avoiding open source altogether or migrating completely to open source options. Time has shown that this either/or choice was a false one. As it turned out, most organizations now utilize a mixture of open source and commercial products to achieve maximum effectiveness.

From a platform perspective, large organizations typically are ...

Read More on Datafloq
Who Competes With VMware Now?

Who Competes With VMware Now?

Yesterday, September 7, 2016, the EMC² logo disappeared. It’s hard for me to imagine that one of the greatest tech marketing companies of all time is suddenly gone. Yes, I know it lives on under Dell Technologies as DellEMC, but the impressions I have of the two companies side by side are so disparate that I’m still having trouble seeing how one blends into the other.
How always-available data forms the digital lifeblood for a university medical center

How always-available data forms the digital lifeblood for a university medical center

The next BriefingsDirect Voice of the Customer digital business transformation case study examines how the Nebraska Medical Center in Omaha consolidated and unified its data-protection capacities.

We'll explore how adopting storage innovation protects the state's largest hospital from data disruption and adds operational simplicity to complex data lifecycle management.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how more than 150 terabytes of data remain safe and sound, we're joined by Jeff Bergholz, Manager of Technical Systems at The Nebraska Medical Center in Omaha. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about the major drivers that led you to seek a new backup strategy as a way to keep your data sound and available no matter what.

Bergholz: At Nebraska Medicine, we consist of three hospitals with multiple data centers. We try to keep an active-active data center going. Epic is our electronic medical record (EMR) system, and with that, we have a challenge of making sure that we protect patient data as well as keeping it highly available and redundant.

We were on HPE storage for that, and with it, we were really only able to do a clone-type process between data centers and keep retention of that data, but it was a very traditional approach.

A couple of years ago, we did a beta program with HPE on the P6200 platform, creating a tertiary replica of our patient data. With that, this past year, we augmented our data protection suite. We went from license-based to capacity-based and we introduced some new D2D dedupe devices into that, and StoreOnce as well. What that affords us is the ability to easily replicate that data over to another StoreOnce appliance with minimal disruption.

Part of our goal is to keep backups available for potential recovery. With all the cyber threats that are going on in today's world, we've recently increased our retention cycle from 7 weeks to 52 weeks. We saw and heard from the analysts that the average vulnerability sits in your system for 205 to 210 days. So, we had to come up with a plan for what it would take to provide recovery in case something were to happen.
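
To make that arithmetic concrete, here is a minimal sketch, using only the figures quoted in the interview, of why a 7-week retention cycle cannot cover a 205-to-210-day dwell time while a 52-week cycle can (the function name is ours, not part of any product):

```python
# Sketch: does a backup retention window cover the expected attacker dwell time?
# Numbers are illustrative, taken from the figures quoted in the interview.

AVG_DWELL_DAYS = 210          # analysts' estimate of how long a vulnerability sits undetected

def retention_covers_dwell(retention_weeks: int, dwell_days: int = AVG_DWELL_DAYS) -> bool:
    """Return True if a clean backup copy should still exist when the compromise is found."""
    return retention_weeks * 7 > dwell_days

print(retention_covers_dwell(7))    # False: 49 days of retention is far too short
print(retention_covers_dwell(52))   # True: 364 days comfortably covers ~210 days
```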

We came up with a long-term solution and we're enacting it now. Combining HPE 3PAR storage with the StoreOnce, we're able to more easily move data throughout our system. What's important there is that our backup windows have greatly been improved. What used to take us 24 hours now takes us 12 hours, and we're able to guarantee that we have multiple copies of the EMR in multiple locations.

We demonstrate it, because we're tested at least quarterly by Epic as to whether we can restore back to where we were before. Not only are we backing it up, we're also testing and ensuring that we're able to reproduce that data.

More intelligent approach

Gardner: So it sounds like a much more intelligent approach to backup and recovery with the dedupe, a lower cost in storage, and the ability to do more with that data now that it’s parsed in such a way that it’s available for the right reason at the right time.

Bergholz: Resource-wise, we always have to do more with less. With our main EMR, we're looking at potentially 150 terabytes of data in a dedupe that shrinks down greatly, and our overall storage footprint for all other systems was approaching 4 petabytes of storage.

We've seen some 30:1 dedupe compression ratios within that, which really has allowed my staff and other engineers to be more efficient and frees up some of their time to do other things, as opposed to having to manage the normal backup and retention of that data.
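
As a rough illustration of what a ratio like that means physically, here is a back-of-the-envelope sketch using the numbers mentioned above (purely illustrative):

```python
# Back-of-the-envelope dedupe savings, using the figures quoted above.

def stored_after_dedupe(logical_tb: float, ratio: float) -> float:
    """Physical capacity needed once deduplication is applied."""
    return logical_tb / ratio

emr_logical_tb = 150          # ~150 TB of EMR backup data
dedupe_ratio = 30             # ~30:1 as cited in the interview

print(f"{emr_logical_tb} TB of logical backup data needs roughly "
      f"{stored_after_dedupe(emr_logical_tb, dedupe_ratio):.0f} TB on disk at {dedupe_ratio}:1")
# -> 150 TB of logical backup data needs roughly 5 TB on disk at 30:1
```
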
We're always challenged to do more and more. We grow 20-30 percent annually, and in terms of having appropriate resources, we're not going to get 20 to 30 percent more resources every year. So, we have to work smarter with less and leverage the technologies that we have.

Gardner: Many organizations these days are using hybrid media across their storage requirements. The old adage was that for backup and recovery, use the cheaper, slower media. Do you have a different approach to that and have you gone in a different direction?

Bergholz: We do, and backup is as important to us as the data that exists out there. Time and time again, we've had to demonstrate the ability to restore in different scenarios within the accepted time for restoring and bringing service back. They're not going to wait for that. When clinicians or caregivers are taking care of patients, they want that data as quickly as possible. While it may not be the EMR, it may be some ancillary documents that they need to be able to get in order to provide better care.

We're able, upon request, to enact and restore in 5 to 10 minutes. In many cases, once we receive a ticket or a notification, we have full data restoration within 15 minutes.

Gardner: Is that to say that you're all flash, all SSD, or some combination? How did you accomplish that very impressive recovery rate?

Bergholz: We're pretty much all dedupe-type devices. It’s not necessarily SSD, but it's good spinning disk, and we have the technology in place to replicate that data and have it highly available on spinning disk, versus having to go to tape to do the restoration. We deal with bunches of restorations on a daily basis. It’s something we're accustomed to and our customers require quick restoration.

In a consolidated strategic approach, we put the technology behind it. We didn't do the cheapest thing, but we did the best thing, and having an active-active data center and backing up across both data centers enables us to do it. So, we did spend money on the backup portion, because it's important to our organization.

Gardner: You mentioned capacity-based pricing. For those of our listeners and readers who might not be familiar with that, what is that and why was that a benefit to you?

Bit of a struggle

Bergholz: It was a little bit of a struggle for us. We were always traditionally client-based or application-based in the backup. If we needed Microsoft Exchange mailboxes, we had to have an Exchange plug-in. If we had Oracle, we had to have an Oracle plug-in, or a SQL plug-in.

While that was great and enabled us to do a lot, we were always having to get another plug-in to do it. Given the dedupe compression ratios we were getting, going to a capacity-based license allowed us to strategically and tactically plan for any increase in our environment. So now, we can buy in chunklets and keep ahead of the game, making sure that we're effective there.
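
A simple way to picture buying "in chunklets" ahead of growth is to project protected capacity forward at the stated 20 to 30 percent growth rate and see when the next chunk of licensed capacity is needed. The sketch below uses made-up numbers and function names; it illustrates the planning idea only and is not any HPE licensing tool:

```python
# Sketch: when does annual growth force the next capacity-license "chunklet"?
# All numbers are hypothetical; the idea is simply to plan purchases ahead of demand.

def chunklet_purchases(current_tb, licensed_tb, growth_rate, chunk_tb, years=5):
    """Project capacity forward and report the year each new chunklet is needed."""
    purchases = []
    capacity, licensed = current_tb, licensed_tb
    for year in range(1, years + 1):
        capacity *= (1 + growth_rate)
        while capacity > licensed:           # buy chunklets until we're covered again
            licensed += chunk_tb
            purchases.append((year, licensed))
    return purchases

for year, licensed in chunklet_purchases(
        current_tb=400, licensed_tb=500, growth_rate=0.25, chunk_tb=100):
    print(f"Year {year}: buy a 100 TB chunklet, licensed capacity now {licensed} TB")
```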

We're in the throes of enacting an archive solution through a product called QStar, which I believe HPE is OEM-ing, and we're looking at that as a long-term archive process. That's going to a linear tape file system (LTFS), utilizing the management tools that product brings us to afford the long-term archiving of patient information.

Our biggest challenge is that we never delete anything. It’s always hard with any application. Because of the age of the patient, many cases are required to be kept for 21 years; some, 7 years; some, 9 years. And we're a teaching hospital and research is done on some of that data. So we delete almost nothing.
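
Retention rules like those, 21 years for cases tied to a patient's age, 7 or 9 years for others, and effectively forever for research and imaging data, are easy to picture as a small policy lookup. The sketch below is only an illustration of the idea, not Nebraska Medicine's or QStar's actual policy engine:

```python
from datetime import date

# Illustrative retention rules, loosely based on the periods mentioned above.
RETENTION_YEARS = {
    "pediatric": 21,      # records tied to a minor's age
    "standard": 7,
    "extended": 9,
    "research": None,     # None = keep forever (teaching/research data)
    "radiology": None,    # imaging is effectively never deleted
}

def eligible_for_deletion(record_type: str, closed: date, today: date) -> bool:
    """True only if the record's retention class allows deletion and the period has lapsed."""
    years = RETENTION_YEARS.get(record_type)
    if years is None:
        return False
    return (today - closed).days / 365.25 >= years

print(eligible_for_deletion("standard", date(2008, 5, 1), date(2016, 9, 8)))   # True
print(eligible_for_deletion("radiology", date(1990, 1, 1), date(2016, 9, 8)))  # False
```
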
In the case of our radiology system, we're approaching 250 terabytes right now. Trying to back up and restore that amount of data with traditional tools is very ineffective, but we need to keep it forever.

By going to a tertiary-type copy, which this technology brings us, we have our source array, our replicated array, plus now a tertiary array to send that data to, which is our LTFS solution.

Gardner: And with your backup and recovery infrastructure in place, and the sense of confidence that comes with it, has that translated back into how you approach the larger data lifecycle management equation? That is to say, are there some benefits from the assurance of quality backups that then allow people to do things they may not have done otherwise, or to stop worrying, and therefore achieve a better business transformation outcome for your patients and your clinicians?

Bergholz: From a leadership perspective, there's nothing real sexy about backup. It doesn’t get oohs and ahs out of people, but when you need data to be restored, you get the oohs and ahs and the thank-yous and the praise for doing that. Being able to demonstrate solutions time and time again buys confidence through leadership throughout the organization and it makes those people sleep safer at night.

Recently, we passed HIMSS Level 7. One of the remarks from that group was that a) we hadn't had any production outage, and b) when they asked a physician on the floor what he does when things go down or when something is lost, he said the awesome part here is that we haven't gone down and, when we lose something, we're able to restore it in a very timely manner. That was noted on our award.

Gardner: Of course, many healthcare organizations have been using thin clients and keeping everything at the server level for a lot of reasons, an edge-to-core integration benefit. Would you feel more enabled to go into mobile and virtualization knowing that everything kept on the data-center side is secure and backed up, not worrying about the fact that you don't have any data on the client? Is that factored into any of your architectural decisions about the client side?

Desktop virtualization

Bergholz: We have been in the throes of desktop virtualization. We do a lot of Citrix XenApp presentation of applications, which keeps the data in the data center, and a lot of our desktop devices connect to that environment.

The next natural progression for us is desktop virtualization (VDI), ensuring that we're keeping that data safe in the data center, ensuring that we're backing it up and protecting the patient information on it, and it's an interesting thought and philosophy. We tried to sell it as an ROI-type initiative to start with. By the time you put all the pieces of the puzzle together, the ROI really doesn't pan out; at least that's what we've seen in two different iterations.

Although it can be somewhat cheaper, it's not significant enough to make a huge launch in that route. But the main play there, and the main support we have organizationally, is from a data-security perspective. Also, there's the ease of managing the virtual desktop environment. It frees up our desktop engineers from being feet on the ground, so to speak, to being application engineers, able to layer in the applications to be provisioned through the virtual desktop environment.

And one important thing in the healthcare industry is that when you have a workstation that has an issue and requires replacement or re-imaging, that’s an invasive step. If it’s in a patient room or in a clinical-care area, you actually have to go in, disrupt that flow, put a different system in, re-image, make sure you get everything you need. It can be anywhere from an hour to a three-hour process.

We do have a smattering of thin devices out there. When there are issues, it's merely a matter of redeploying a gold image to them. The great part about thin devices versus thick devices is that in a lot of cases, they're operating in a sterile environment. With traditional desktops, the fans are sucking air through, which matters for infection control; there's noise; perhaps they're blowing dust within a room if it's not entirely clean. SSD devices are a perfect play there. It's really a drop-off, unplug, and re-plug sort of technology.

We're excited about that for what it will bring to the overall experience. Our guiding principle is that you have the same experience no matter where you're working. Getting from Step A to Step Z is a journey. So, you do it a little bit at a time and you learn as you go along, but we're going to get there and we'll see the benefit of that.

Gardner: And ensuring the recovery and veracity of that data is a huge part of being able to make those other improvements.

Bergholz: Absolutely. What we've seen from time to time is that users, while they're fairly knowledgeable, save their documents wherever they choose to. Policy is to make sure you put them within the data center, but that may or may not always be adhered to. By going to desktop virtualization, they won't have any other choice.

A thin client takes that a step further and ensures that nothing gets saved back to a device, where that device could potentially disappear and cause a situation.

We do encrypt all of our stuff. Any device that's out there is covered by encryption, but still there's information on there. It’s well-protected, but this just takes away that potential.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Influencer Summit 2016—Teradata Reshapes Itself with Analytics and the Cloud

Influencer Summit 2016—Teradata Reshapes Itself with Analytics and the Cloud

For anyone with even a small amount of understanding regarding current trends in the software industry it will come as no surprise that the great majority of enterprise software companies are focusing on the incorporation of analytics, big data, cloud adoption, and especially the Internet of Things into their software solutions. In fact, these capabilities have become so ubiquitous that for
Top 5 Up and Coming Big Data Jobs of 2016

Top 5 Up and Coming Big Data Jobs of 2016

Big data has been all the rage in the past few years. In 2012, the Harvard Business Review described data science as the sexiest job of the 21st century, and big data has only grown bigger since then. Whether it’s location tracking or mapping customer behavior across a website, there are numerous job opportunities within big data for those who like numbers. With the explosion of the Internet of Things and machine learning, the growth of big data jobs will continue in the coming years.

By 2020, businesses utilizing data will boost their productivity by $430 billion compared to competitors that do not use big data. As a result, all kinds of jobs are in high demand today to help businesses increase their efficiency and more effectively target customers. Here are the top five up-and-coming big data jobs of 2016.

1. Data Analyst

This one’s a bit of a no-brainer, but big data can’t help a business unless they have the resources to process that data and extract meaningful trends that they can act on. It’s the data analyst’s job to wrangle all of the data and provide a company with conclusions on what the data is saying. ...

Read More on Datafloq
Why Do Television Companies Need a Digital Transformation

Why Do Television Companies Need a Digital Transformation

Over just a few years, the world of television production, distribution, and consumption has changed dramatically. In the past, with only a few channels to choose from, viewers watched news and entertainment television at specific times of the day or night. They were also limited by where and how to watch. Options included staying home, going to a friend’s house, or perhaps going to a restaurant or bar to watch a special game, show, news story, or event. The TV industry has since completed the move from standard definition to high definition, and the discussion has now turned to 4K and 8K video standards. But before any of that can happen, analog broadcasting needs to transform digitally. That means the TV industry unavoidably needs a disruptive transformation of its ICT platform to cope with the new processes of acquisition, production, distribution, and consumption.

Fast-forward to today, and you have a very different scenario. Thanks to the rise of the Internet – and, in particular, mobile technology – people have nearly limitless options for their news and entertainment sources. Not only that, but they can choose to get their news and other media on TV or ...

Read More on Datafloq
Take Action in Real Time with Big Data Analytics

Take Action in Real Time with Big Data Analytics

I wrote about predicting the future with analytics in a blog titled "Remember the Past and Predict the Future". In many industries, technological advances have greatly reduced the margin of error for predicting future outcomes. But what good is a prediction if you don’t take action? 

Thinking about the ramifications of taking action on predictions can turn into a mind warping time travel movie. Or more realistically, a never-ending iterative loop:

If my data is telling me that X is going to happen, then I need to do Y. 
But if I do Y, my data tells me Z will happen.  
While Z is better than X, is it the best alternative?  
Perhaps I desire ZZ. 
How do I exit analysis paralysis and make an actionable decision before the outcome happens?

In “Remember the Past …”, I wrote that prediction becomes much more accurate the closer you get to the predicted outcome. While this is great and hopefully intuitive, are we able to take action fast enough?  If the weather forecast offers a 5-minute lead time, will you ever grab your umbrella? 

Reaction Time v. Prediction Lead Time

If we agree that predictions are generally more accurate the closer in time you get to the predicted outcome, then perhaps our focus should be ...
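
One way to frame that question is as a direct comparison between the lead time a prediction gives you and the time you need to react to it; a prediction is only actionable when the first exceeds the second. A minimal sketch with hypothetical numbers:

```python
# Sketch: a prediction is only actionable if its lead time exceeds your reaction time.

def is_actionable(lead_time_min: float, reaction_time_min: float) -> bool:
    return lead_time_min > reaction_time_min

# It takes ~10 minutes to go back home for the umbrella; a 5-minute warning is useless.
print(is_actionable(lead_time_min=5, reaction_time_min=10))    # False
# A 60-minute warning leaves plenty of room to act.
print(is_actionable(lead_time_min=60, reaction_time_min=10))   # True
```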

Read More on Datafloq
What do Big Data and Robo Advisors Have in Common?

What do Big Data and Robo Advisors Have in Common?

There are many investors and analysts that are calling 2016 the year of the robo. One can barely get on a computer without being bombarded with ads and content from a sector that is growing faster than it can handle.

Even some of the more traditional brokers are starting to offer a more digitalized version of an advisor for a new generation of investors that expect massive amounts of data to be at their fingertips, sorted and organized for their viewing pleasure. The market for these robo advisors is expected to move from $2 billion in 2013 to $500 billion in 2020.

Investing has never been an easy game. There are thousands of variables playing into financial markets at any given time, making it virtually impossible for man and machine alike to consistently and accurately call market movements. That is exactly what robo advisors intend to do, however. In theory, if one had all the necessary data, they should be able to accurately predict market movements virtually every single time. This concept has brought about a massive push for data in wealth management.

How they work

Robo advisors take a person’s data, crunch it, and find smart investment decisions based on that data. Some are ...
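
In spirit, that crunch-the-data step can be as simple as mapping a risk profile derived from a client's data onto portfolio weights. The toy sketch below is a deliberately naive illustration of the idea, with invented weighting factors; it is not how any real robo advisor scores clients:

```python
# Toy robo-advisor sketch: turn basic client data into a stock/bond split.
# Entirely illustrative; real products use far richer data and models.

def risk_score(age: int, years_to_goal: int, loss_tolerance: float) -> float:
    """Crude 0-1 risk score: younger, longer-horizon, loss-tolerant clients score higher."""
    horizon = min(years_to_goal / 40, 1.0)
    youth = max(0.0, (65 - age) / 45)
    return round(0.4 * horizon + 0.3 * youth + 0.3 * loss_tolerance, 2)

def allocation(score: float) -> dict:
    equities = round(0.3 + 0.6 * score, 2)       # between 30% and 90% equities
    return {"equities": equities, "bonds": round(1 - equities, 2)}

score = risk_score(age=30, years_to_goal=35, loss_tolerance=0.7)
print(score, allocation(score))    # e.g. 0.79 {'equities': 0.77, 'bonds': 0.23}
```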

Read More on Datafloq
Kick-Starting the 4th Industrial Revolution, One Blockchain at a Time

Kick-Starting the 4th Industrial Revolution, One Blockchain at a Time

We live in a future of accelerated change and today’s world is changing faster than we have ever seen before. New technology is changing the way we live, work and collaborate. No longer is it sufficient for organisations to sit back and stick to the status quo. Today’s new technologies require an active attitude from organisations if they want to remain in business in the next decade.

I am talking about the 4th Industrial Revolution that is rapidly approaching, and it will bring change unlike anything we have ever seen before. In fact, it will change what it means to be human. It also offers us a tremendous opportunity to create a world that is good for all, where technology is used for good, the privacy of consumers is respected, and data is used to improve the lives of all humans. The 4th Industrial Revolution is all about Algorithms, Machine Learning and Artificial Intelligence. It is about robotics, 3D printing and VR/AR, nanotechnology, and many more emerging technologies. It is disruption on all levels, resulting in system-wide innovations that can change an industry in years instead of decades. The combination of such revolutionary emerging technologies will bring us realities that until recently would ...

Read More on Datafloq
What Does Your Medical Record Say About You? (and who is reading it?)

What Does Your Medical Record Say About You? (and who is reading it?)

What happens when you fill in that medical history questionnaire?

A new doctor or a new office or sometimes it's a routine visit… how many times have you filled in your personal medical information - from your address and insurance information down to the significant (or awkward) events of your medical history - the illness, the surgery, the procedure? Not only is this excruciatingly private information, but it’s also important to you that it is accurate, timely AND secure.

Every time you write your name and information on a form, a person has to read it, interpret it, and most likely enter it in some electronic form for a structured database. Will they get it right? Would you know if they did or did not translate it correctly?

An entire job industry exists for medical data entry. Not only are non-medical strangers reading your personal medical information time after time, but the potential exists for information to be incorrectly interpreted, with errors in spelling, dates, treatments and more. The data entry employee or outsourced contractor is not medically trained, and they are only human as far as reading and entering name after name. YOUR name and affliction and treatment are just a blip in ...
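
Even a crude similarity check against a patient roster can flag likely mistyped names for human review, which hints at how the "will they get it right?" risk can be reduced. The sketch below uses only Python's standard library and invented names; it is not how any actual medical data-entry system works:

```python
import difflib

# Hypothetical patient roster and a value typed in by a data-entry worker.
roster = ["Katherine Johnson", "Catherine Jonson", "Kathryn Jensen"]
typed = "Katharine Jonsen"

# Flag entries that don't exactly match but closely resemble an existing patient.
matches = difflib.get_close_matches(typed, roster, n=3, cutoff=0.8)
if typed not in roster and matches:
    print(f"Possible transcription error: '{typed}' resembles {matches}; review before filing")
```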

Read More on Datafloq
Why Isn’t Your Business Using NoSQL Databases Yet?

Why Isn’t Your Business Using NoSQL Databases Yet?

Businesses today are collecting huge amounts of data every single day that can be used to benefit the business and its operations. The sheer volume of data that companies now deal with and store on a daily basis means that traditional frameworks are under pressure and in some cases are no longer fit for purpose. NoSQL databases could be your solution to dealing with today’s data demands if you’re not already using this framework option.

Traditional relational database management systems (RDBMS) are a great choice if a business is dealing with small amounts of data that need to be kept well-structured. But when large volumes are added, performance can degrade, often making an RDBMS an unsuitable tool when it comes to big data.

In contrast, NoSQL, often known as Not Only SQL, is scalable. This type of database has been designed with the high volumes of incoming information associated with big data in mind. NoSQL is particularly useful if you have lots of unstructured data stored in multiple areas that need to be correlated and large quantities of data need to be accessed fast.
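
The practical difference shows up in how records are shaped: a relational table forces every row into the same columns, while a document store happily keeps records whose fields vary. The sketch below mimics that document-style flexibility with plain Python dictionaries rather than any particular NoSQL product's API:

```python
# Document-style records: each "row" can carry different fields, unlike a fixed SQL schema.
orders = [
    {"id": 1, "customer": "Acme", "total": 120.0},
    {"id": 2, "customer": "Globex", "total": 80.5,
     "shipping": {"carrier": "DHL", "tracking": "XY123"}},      # extra nested field
    {"id": 3, "customer": "Acme", "total": 42.0, "coupon": "SPRING16"},
]

# Queries then tolerate missing fields instead of requiring schema changes.
acme_total = sum(o["total"] for o in orders if o["customer"] == "Acme")
with_tracking = [o["id"] for o in orders if "shipping" in o]

print(acme_total)       # 162.0
print(with_tracking)    # [2]
```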

Up until recently, RDBMS relational databases were the most commonly used but NoSQL databases are ...

Read More on Datafloq
Cyber Security of the Connected Car in the Age of the Internet of Things

Cyber Security of the Connected Car in the Age of the Internet of Things

The Revolutionary Design and Features of Connected Cars

In this age of the Internet of Things, virtual technology affects just about every aspect of our lives. From the way that we watch movies and television to the manner in which we shop and order food from our favorite restaurants, we have become increasingly dependent on wireless and virtual inventions that are designed to make our lives easier.

This technology now extends to the very cars that we drive every day to school, work, or anywhere else we need to go. Our new wireless Internet vehicles, dubbed connected cars, are designed to help us with ordinary driving tasks like backing out of a driveway, parallel parking, and even making a phone call or sending a text without having to dial or type on our cell phones. 

Our cars can tell us what directions to take and what the weather will be like once we arrive at our destination. They play movies, connect to global satellite radio stations, and keep us entertained at the touch of a button. All of their technological features center on maximizing our driving pleasure, improving our safety and handling, and relieving us of much of the thought and effort that ...

Read More on Datafloq
How Big Data and CRM are Shaping Modern Marketing

How Big Data and CRM are Shaping Modern Marketing

Big Data is the term for massive data sets that can be mined with analytics software to produce information about your potential customers’ habits, preferences, likes and dislikes, needs and wants.

This knowledge allows you to predict the types of marketing, advertising and customer service to extend to them to produce the most sales, satisfaction and loyalty.
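
In practice, predicting "the types of marketing to extend" often starts with something as simple as segmenting customers by observed behavior. Here is a minimal sketch with invented purchase data, using only Python's standard library:

```python
from collections import defaultdict

# Invented purchase history; in reality this would come from CRM and web analytics data.
purchases = [
    {"customer": "ann", "category": "garden", "amount": 40},
    {"customer": "ann", "category": "garden", "amount": 65},
    {"customer": "bob", "category": "electronics", "amount": 300},
    {"customer": "cara", "category": "garden", "amount": 15},
]

# Segment customers by total spend per category, then pick a campaign per segment.
spend = defaultdict(lambda: defaultdict(float))
for p in purchases:
    spend[p["customer"]][p["category"]] += p["amount"]

for customer, cats in spend.items():
    top_category = max(cats, key=cats.get)
    print(f"{customer}: target with a {top_category} campaign (spend {cats[top_category]:.0f})")
```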

Skilled use of Big Data produces a larger clientele, and that is a good thing. However, having more customers means you must also have an effective means of keeping track of them, managing your contacts and appointments with them and providing them with care and service that has a personal feel to it rather than making them feel like a “number.”

That’s where CRM software becomes an essential tool for profiting from growth in your base of customers and potential customers. Good CRM software does exactly what the name implies – offers outstanding Customer Relationship Management with the goal of fattening your bottom line.

With that brief primer behind us, let’s look at five ways that the integration of Big Data and CRM is shaping today’s marketing campaigns.

Achieving Targeted Multi-Channel Reach

The data acquired by marketers tells them where to find their customers and potential customers. The information ...

Read More on Datafloq
How Big Data is Revolutionizing the Manufacturing Industry

How Big Data is Revolutionizing the Manufacturing Industry

Data collection and analysis are an integral part of our society, and are important activities we use to inform our decisions. Big data is no exception. Made up of extremely large sets of data that can be analyzed for trends and other information, big data is extremely useful and relevant when determining strategies and plans for communities and companies. In fact, it’s changing the face of many different industries—including manufacturing. Let’s take a look at how big data is revolutionizing the manufacturing industry.

Big Data’s Role in the Manufacturing Industry

“Made in the USA” is a proud label attached to goods manufactured on U.S. soil. While this label doesn’t necessarily assure good quality, most U.S. manufacturers are dedicated to producing well-made goods and paying workers fair wages. A recent report from Ohio University found that the manufacturing industry represents 12% of the country’s gross domestic product (GDP), and that these goods raked in $1.2 trillion from exports in 2013. American manufacturing is becoming stronger again, with a 30% increase in output since the end of the recession, and 54% of manufacturers considering bringing their production back from overseas.

Why is manufacturing experiencing such a positive surge in the United States? Part of it ...

Read More on Datafloq
What is the Blockchain and Why is it So Important?

What is the Blockchain and Why is it So Important?

Blockchain is growing in importance. Increasingly, organisations have to explore what this revolutionary technology will mean for their business. Marc Andreessen from the well-known VC firm Andreessen Horowitz calls it as big an invention as the internet. Last year, in my Big Data Trends prediction for 2016, I already foresaw that 2016 would become the year of the Blockchain, and now Gartner has also included it in their Hype Cycle for Emerging Technologies.

Many organisations are already exploring the possibilities of the Blockchain, although primarily still in the Financial Services industry. The R3 Partnership is a consortium of 45 of the biggest financial institutions, investigating what the Blockchain means for them. Next to the R3 consortium, four of the biggest global banks, led by Swiss bank UBS, have developed a “Utility Settlement Coin” (USC), which is the digital counterpart of each of the major currencies backed by central banks. Their objective is to develop a settlement system that processes transactions in (near) real-time instead of days. A third example is Australia Post, who have released plans for developing a blockchain-based e-voting system for the state of Victoria.
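
At the data-structure level, the property these projects rely on is simple: each block carries a hash of the previous block, so tampering with any past record breaks every later link. Below is a minimal sketch of that chaining, illustrative only, with no consensus, mining, or networking:

```python
import hashlib, json

def make_block(data, prev_hash):
    """A block is just its payload plus the hash of the previous block."""
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    """Every block must point at the hash of the block before it."""
    return all(curr["prev_hash"] == prev["hash"] for prev, curr in zip(chain, chain[1:]))

genesis = make_block("genesis", prev_hash="0" * 64)
chain = [genesis, make_block("pay 10 to A", genesis["hash"])]
chain.append(make_block("pay 5 to B", chain[-1]["hash"]))

print(chain_is_valid(chain))                                     # True
chain[1] = make_block("pay 1000 to A", chain[1]["prev_hash"])    # tamper with history
print(chain_is_valid(chain))   # False: the next block no longer points at the altered block
```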

The possibilities of the Blockchain are enormous and it seems that almost any industry that deals ...

Read More on Datafloq
Maintaining disabled FK’s, wisdom or farce?

Maintaining disabled FK’s, wisdom or farce?

A while back, I wrote a post about having FKs (foreign keys) in your data warehouse. Well, a similar question came up recently on an Oracle forum with the above title. It is a fair question and it does surface fairly regularly in a variety of contexts (not just data warehousing). Of course, as The […]
