4 Big Data Tips for Creating a Safer Workplace

Advanced analytics is having a huge impact on fields such as marketing, where it produces more sales at lower cost, and CRM, where it boosts retention and profit per customer, to name two of many. In workplace health and safety, it is safe to say that Big Data has had less impact to date. But that is changing. A growing number of companies are analysing data through a safety lens to modify their philosophy and practices, and the changes are making workplaces safer than ever.

Effectively Using the Past to Change the Future

An oft-repeated proverb says, “Those who fail to learn from the past are doomed to repeat it.” The principle applies to workplaces where similar accidents recur under similar circumstances without the pattern ever becoming obvious. Big Data analytics offers companies a way to drill down into their information and produce a clear picture of the past, which is the basis for preventing it from happening again.

Analysing the copious information available to companies about what has already happened allows them to predict the future and then to change that future to something better, something safer, before it arrives.

A widely used process for understanding the past and ...


Read More on Datafloq
How flash storage provides competitive edge for Canadian music service provider SOCAN

The next BriefingsDirect Voice of the Customer digital business transformation case study examines how Canadian nonprofit SOCAN faced digital disruption and fought back with a successful storage modernization journey. We'll learn how adopting storage innovation allows for faster responses to end-user needs and opens the door to new business opportunities.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how SOCAN gained a new competitive capability for its performance rights management business, we're joined by Trevor Jackson, Director of IT Infrastructure for SOCAN, the Society of Composers, Authors and Music Publishers of Canada, based in Toronto. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: The music business has changed a lot in the past five years or so. There are lots of interesting things going on with licensing models and people wanting to get access to music, but people also wanting to control their own art.

Tell us about some of the drivers for your organization, and then also about some of your technology decisions.
Jackson: We've traditionally been handling performances of music -- that is, radio stations, television, and movies. Over the last 10 or 15 years, with the advent of YouTube, Spotify, Netflix, and digital streaming services, we're seeing a huge increase in the volume of data that we have to digest and analyze as an organization.

Gardner: And what function do you serve? For those who might not be familiar with your organization or this type of organization, tell us the role you play in the music and content industries.

Play music ethically

Jackson: At a very high level, what we do is license the use of music in Canada. What that means is that we allow businesses through licensing to ethically play any type of music they want within their environment. Whether it's a bar, restaurant, television station, or a radio station, we collect the royalties on behalf of the creators of the music and then redistribute that to them.

We're a not-for-profit organization. Anything that we don't spend on running the business, which is the collecting, processing, and payment of those royalties, goes back to the creators or the publishers of the music.

Gardner: When you talk about data, tell us about the type of data you collect in order to accomplish that mission?

Jackson: It's all kinds of data. For the most part, it's unstructured. We collect it from many different sources, again radio and television stations, and of course, YouTube is another example.

There are some standards, but one of the challenges is that we have to do data transformation to ensure that, once we get the data, we can analyze it and it fits into our databases, so that we can process the information.

Gardner: And what sort of data volumes are we talking about here?

Jackson: We're not talking about petabytes, but the thing about performance information is that it's very granular. For example, the files that YouTube sends to us may have billions of rows for all the performances that are played as they go through their monthly cycle; it's the same thing with radio stations.

We don't store any digital files or copies of music. It's all performance-related information -- the song that was played and when it was played. That's the type of information that we analyze.

Gardner: So, it's metadata about what's been going on in terms of how these performances have been used and played. Where were you two years ago in this journey, and how have things changed for you in terms of what you can do with the data and how performance of your data is benefiting your business?

Jackson: We've been on flash for almost two years now. About two and a half years ago, we realized that the storage area network (SAN) that we did have, which was a traditional tiered-storage array, just didn't have the throughput or the input/output operations per second (IOPS) to handle the explosive amount of data that we were seeing.

With YouTube coming online, as well as Spotify, we knew we had to do something about that. We had to increase our throughput.

Performance requirements

Gardner: Are you generating reports from this data at a certain frequency or is there streaming? How is the output in terms of performance requirements?

Jackson: We ingest a lot of data from the data-source providers. We have to analyze what was played, who owns the works that were played, correlate that with our database, and then ensure that the monies are paid out accordingly.

Gardner: Are these reports for the generation of the money done by the hour, day, or week? How frequently do you have to make that analysis?

Jackson: We do what we call a distribution, which is a payment of royalties, once a quarter. When we're doing a payment on a distribution, it’s typically on performances that occurred nine months prior to the day of the distribution.
Gardner: What did you do two and a half years ago in terms of moving to flash and solid state disk (SSD) technologies? How did you integrate that into your existing infrastructure, or create the infrastructure to accommodate that, and then what did you get for it?

Jackson: When we started looking at another solution to improve our throughput, we actually started looking at another tiered-storage array. I came to the HPE Discover [conference] about two years ago and saw the presentation on the all-flash [3PAR Storage portfolio], where they were talking about the benefits of all-flash for the price of spinning disk, which to me was very intriguing.

I met with some of the HPE engineers and had a deep-dive discussion on how they were doing this magic that they were claiming. We had a really good discussion, and when I went back to Toronto, I also met with some HPE engineers in the Toronto offices. I brought my technical team with me to do a bit of a deeper dive and just to kick the tires to understand fully what they were proposing.

We came away from that meeting very intrigued and very happy with what we saw. From then on, we made the leap to purchase the HPE storage. We've had it running for about [two years] now, and it’s been running very well for us.

Gardner: What sort of metrics do you have in terms of technology, speeds and feeds, but also metrics in terms of business value and economics?

Jackson: I don’t want to get into too much detail, but as an anecdote, we saw some processes that we were running going from days to hours just by putting it on all-flash. To us, that's a huge improvement.

Gardner: What other benefits have you gotten? Are there some analytics benefits, backup and recovery benefits, or data lifecycle management benefits?

OPEX perspective

Jackson: Looking at it from an OPEX perspective, because of the IOPS that we have available to us, planning maintenance windows has actually become a lot easier for the team.

Before, we would have to plan something akin to landing the space shuttle. We had to make sure that we weren’t doing it during a certain time, because it could affect the batch processes. Then, we'd potentially be late on our payments, our distributions. Because we have so many IOPS on tap, we're able to do these maintenance windows within business hours. The guys are happier because they have a greater work-life balance.

The other benefit that we saw was that all-flash uses less power than spinning disk. With less power, there is less heat and a need for less floor space. Of course, speed is the number one driving factor for a company to go all-flash.

Gardner: In terms of automation, integration, load-balancing, and some of those other benefits that come with flash storage media environments, were you able to use some of your IT folks for other innovation projects, rather than speeds and feeds projects?

Jackson: When you're freeing up resources from keeping the lights on, it's adding more value to the business. IT traditionally is a cost center, but now we can take those resources and take them off of the day-to-day mundane tasks and put them into projects, which is what we've been doing. We're able to add greater benefit to our members.

Gardner: And has your experience with flash in modernizing your storage prompted you to move toward other infrastructure modernization techniques, including virtualization, software-defined composable infrastructure, maybe hyperconverged? Is this an end point for you or maybe a starting point?

Jackson: IT is always changing, always transforming, and we're definitely looking at other technologies.

Some of the big buzzwords out there, blockchain, machine learning, and whatnot are things that we’re looking at very closely as an organization. We know our business very well and we're hoping to leverage that knowledge with technology to further drive our business forward.

Gardner: We're hearing a lot of promising visions these days about how machine learning could be brought to bear on things like data transformation, making that analysis better, faster, and cheaper. So that's pretty interesting stuff.
Are you now looking to extend what you do? Is the technology an enabler more than a cost center in some ways for your general SOCAN vision and mission?

Jackson: Absolutely. We're in the music business, but there is no way we can do what we do without technology; technically it’s impossible. We're constantly looking at ways that we can leverage what we have today, as well as what’s out in the marketplace or coming down the pipe, to ensure that we can definitely add the value to our members to ensure that they're paid and compensated for their hard work.

Gardner: And user experience and user quality of experience are top-of-mind for everybody these days.

Jackson: Absolutely, that’s very true.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Why Alternative Data is the New Financial Data for Industry Investors

Information that has the ability to give investors an edge has long been coveted, but the nature of that information has evolved over time. Currently, traditional financial data, such as stock price history and fundamentals, is the standard for determining the health of a stock. However, alternative data has the potential to reveal insights about a stock’s health before traditional financial data. Obviously, this has major implications for investors.

If information is power, then unique information sourced from places not yet tapped is a new level of domination. This is alternative data -- unmined information with the potential to be leveraged for commercial value. Given that we're in what we like to call a data revolution, where nearly every move we make can be digitized, tracked, and analyzed, every company is now a data company. Everyone is both producing and consuming immense amounts of data in the race to make more money. To help paint a more concrete picture of how alternative data may be useful, below is a list of 10 industry sectors that we believe can produce new data for investors.

Business Operations

When it comes to operational challenges like keeping costs low and improving efficiencies, data from various sources, such ...


Read More on Datafloq
$46 Billion: Big Data is Increasingly Meaning Big Money

The Big Data industry is bigger than ever. In terms of data, that's a true statement any day of the week; every time you click, tap, or swipe, you're making a contribution to the data mine. In terms of money, revenues have never been better! Companies offering services classified within the Big Data industry reached a new milestone: more than $46 billion in revenue year-to-date.

Big Data is Poised to Get Bigger

That's a big number, no matter how you slice it. In fact, it's ten times bigger than the estimated net worth of the 2016 Republican Presidential Candidate, a man who thinks everything should be big. Although, I have a feeling that his answer would be entertaining if you were able to ask him, "What does big data mean?"

And, just like political ad-buys, the numbers are only getting bigger as the year goes on. Some analysts believe that Big Data has a long way to go in terms of making its way into every corner of the market. How big could the Big Data market become? SNS Research released a report stating that by 2020, Big Data will generate $72 billion in revenue.

Companies need to invest capital in order to grow the technological ...


Read More on Datafloq
What is Video Analytics and Why is it Becoming Such a Big Player?

Explosive growth in the video analytics market has caught the eye of business intelligence departments. This little-known class of software is quickly picking up speed across an expansive range of applications. From security to public safety to crowd management, video analytics is beginning to see a big boom in the business world.

Video analytics, or intelligent video analytics, is software that is used to monitor video streams in near real-time. While monitoring the videos, the software identifies attributes, events or patterns of specific behavior via video analysis of monitored environments. Video analysis software also generates automatic alerts and can facilitate forensic analysis of historical data to identify trends, patterns and incidents. The software enables its users to analyze, organize and share any insight gained from the data to make smarter, better decisions. It can promote enhanced coordination across and within agencies and organizations. Its applications are widespread, including monitoring vehicle patterns or violations of traffic laws, or people entering restricted areas during defined time frames. The data can then be sorted by time and date or over an extended time period to create a trend analysis.

A simple function of video analytics is motion detection with a fixed background. More technical functions ...
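
As a rough illustration of that simple case, here is a minimal sketch of fixed-background motion detection, assuming Python with OpenCV installed; the file name and the region-size threshold are invented for illustration:

import cv2

# Learn the fixed background statistically, then flag pixels that deviate from it.
cap = cv2.VideoCapture("lobby_camera.mp4")  # hypothetical video source
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # white pixels = motion against the background
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > 500]  # ignore tiny noise
    if moving:
        print(f"motion detected in {len(moving)} region(s)")

cap.release()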


Read More on Datafloq
Why Machine Learning is the Future of Data Analytics

Facebook makes use of a user’s likes and preferences to show ads that they may be interested in. When you mistype a search query on Google, the search engine instantly cross-references it against the millions of similar typos to interpret the correct query and shows you results appropriately. Tesla makes use of your car’s vital parameters and benchmarks this against available data to know when you are due for servicing. Netflix studies engagement and behavior from millions of users to precisely know what images and promotions elicit better response from users.

All of this is just the tip of the iceberg. In the last few years, machine learning techniques have proven incredibly effective for predictive and deep insights when used with data analytics. Many companies treat big data as their biggest asset because it reflects their aggregate experience. After all, every partner, customer, defect, transaction, and complaint gives the company an experience to learn from.
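
As a minimal sketch of that combination -- assuming scikit-learn and a made-up daily-sales history, not any particular company's data -- a model learns from accumulated experience and extrapolates forward:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: day number vs. units sold (illustrative values only).
days = np.arange(1, 11).reshape(-1, 1)
sales = np.array([120, 132, 128, 140, 151, 149, 160, 158, 171, 175])

model = LinearRegression().fit(days, sales)       # learn the trend from the past
forecast = model.predict(np.array([[11], [12]]))  # "predict the future"
print(forecast.round(1))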

While in recent years many companies have focused more on how to store and manage all this data, it's not just about the quantity of data or how it's stored. By combining data analytics with machine learning, companies can predict the future with ...


Read More on Datafloq
IT Sapiens, for Those Who Are Not

Perhaps one of the most refreshing moments in my analyst life is when I get the chance to witness the emergence of new tech companies—innovating and helping small and big organizations alike to solve their problems with data. This is exactly the case with Latvia-based IT Sapiens, an up-and-coming company focused on helping those small or budget-minded companies to solve their basic yet crucial
Why The Cars of the Future Will Rely on the IoT

Once, electric cars were a novelty: they couldn't go very far and weren't a practical option for consumers. Fortunately, a lot has changed since those days, and electric vehicles are now much more accessible and high-tech than they were in the past. National Drive Electric Week aims to give even more drivers the tools they need to make a more sustainable choice and consider an electric car. These cars of the future are an exciting development in reducing our oil dependency, but where are they headed?

The Evolution of Electric Cars and Their Benefits

Gas cars weren't always the norm; in fact, they used to be much less popular than their electric counterparts. Back in the late 1800s, electric cars dominated the market, and gas cars were much less popular. Unfortunately, there were some limitations at that time: each car had to be assembled by hand (while by 1910, gas-powered cars could be produced by assembly line), and the electrical infrastructure limited the vehicles to city-only driving. Gas cars, meanwhile, became safer and more convenient, so production of electric cars quickly declined.

Later in the century, during the late '60s and early '70s, electric cars experienced a renaissance, ...


Read More on Datafloq
Strategic DevOps—How advanced testing brings broad benefits to Independent Health

The next BriefingsDirect Voice of the Customer digital business transformation case study highlights how Independent Health in Buffalo, New York has entered into a next phase of "strategic DevOps."

After a two-year drive to improve software development, speed to value, and the user experience of customer service applications, Independent Health has further extended advanced testing benefits into ongoing app production and performance monitoring.

Learn here how reusing proven performance scripts and replaying synthetic transactions that mimic the user experience have cut costs and yielded early-warning and trending insights into app behavior and system status.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to describe how to attain such new strategic levels of DevOps benefits are Chris Trimper, Manager of Quality Assurance Engineering at Independent Health in Buffalo, New York, and Todd DeCapua, Senior Director of Technology and Product Innovation at CSC Digital Brand Services Division and former Chief Technology Evangelist at Hewlett Packard Enterprise (HPE). The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What were the major drivers that led you to increase the way in which you use DevOps, particularly when you're looking at user-experience issues in the field and in production?

Trimper: We were really hoping to get a better understanding of our users and their experiences. The way I always describe it to folks is that we wanted to have that opportunity to almost look over their shoulder and understand how the system was performing for them.

Whether your user is internal or external, if they don't have a good user experience, they're going to be very frustrated. Internally, time is money. So, if it takes longer for things to happen, you get frustration and potential turnover; it's an unfortunate barrier.

Gardner: What kind of applications are we talking about? Is this across the spectrum of different type of apps, or did you focus on one particular type of app to start out?

End users important

Trimper: Well, when we started, we knew that the end users, our members, were the most important thing to us, and we started off with the applications that our servicing center used, specifically our customer relationship management (CRM) tool.

If the member information doesn’t pop fast when a member calls, it can lead to poor call quality, queuing up calls, and it just slows down the whole business. We pride ourselves on our commitment to our members. That goes even as far as, when you call up, making sure that the person on the other end of the phone can service you well. Unfortunately, they can only service you as well as the data that’s provided to them to understand the member and their benefits.

Gardner: It’s one thing to look at user experience through performance, but it's a whole new dimension or additional dimension when you're looking at user experience in terms of how they utilize that application, how well it suits their particular work progress, or the processes for their business, their line of business. Are you able to take that additional step, or are you at the point where the feedback is about how users behave and react in a business setting in addition to just how the application performs?

Trimper: We're starting to get to that point. Before, we only had as much information as we were provided about how an application was used or what they were doing. Obviously, you can't stand there and watch what they're doing 24x7.

Lately, we've been consuming an immense amount of log data from our systems and understanding what they're doing, so that we can understand their problems and their woes, or make sure that what we're testing, whether it's in production monitoring or pre-production testing, is an accurate representation of our user. Again, whether it’s internal or external, they're both just as valuable to us.

Gardner: Before we go any further, Chris, tell us a little bit about Independent Health. What kind of organization is it, how big is it, and what sort of services do you provide in your communities?
Trimper: We're a healthcare company for the Western New York area. We're a smaller organization. We deliver the "red-shirt treatment," which stands for the best quality care that we can provide our members. We try to be very proactive in everything that we do for our members as well. We drive members to the provider to do preventive things -- the healthier lifestyle that everybody is trying to achieve.

Gardner: Todd, we're hearing this interesting progression toward a feedback loop of moving beyond performance monitoring into behaviors and use patterns and improving that user experience. How common is that, or is Independent Health on the bleeding edge?

Ahead of the curve

DeCapua: Independent Health is definitely moving with, or maybe a little bit ahead of, the curve in the way that they're leveraging some of these capabilities.

If we were to step back and look at where we've been from an industry perspective across many different markets, Agile was hot, and now, as you start to use Agile and break all the right internal systems for all the right reasons, you have to start adopting some of these DevOps practices.

Independent Health is moving a little bit ahead on some of those pieces, and they're probably focusing on a lot of the right things, when you look across other customers I work with. It's things like speed of time to value. That goes across technology teams, business teams, and they're really focused on their end customer, because they're talking about getting these new feature functions to benefit their end customers for all the right reasons.

You heard Chris talking about that improved end-user experience around their customer service applications. This is when people are calling in, and you're using tools to see what's going on and what your end users are doing.

There's another organization that actually recorded what their customers were doing when they were having issues. That was a production-monitoring type thing, but now you're recording a video of this. If you called within 10 minutes of having that online issue, as you are calling in and speaking with that customer service representative, they're able to watch the video and see exactly what you did to get that error online to cause that phone call. So having these different types of users’ exceptions, being able to do the type of production monitoring that Independent Health is doing is fantastic.

Another area that Chris was telling me about is some of the social media aspects and being able to monitor that is another way of getting feedback. Now, I do think that Independent Health is hitting the bleeding edge on that piece. That’s what I've observed.

Gardner: Let’s hear some more about that social media aspect, getting additional input, additional data through all the available channels that you can.

Trimper: It would be foolish not to pay attention to all aspects of our members, and we're very careful to make sure that they're getting that quality that we try to aim for. Whether it happens to be Facebook, Twitter, or some other mechanism that they give us feedback on, we take all that feedback very seriously.

I remember an instance or two where there might have been some negative feedback. That went right to the product-management team to try to figure out how to make that person’s experience better. It’s interesting, from a healthcare perspective, thinking about that. Normally, you think about a member’s copay or their experience in the hospital. Now, it's their experience with this application or this web app, but those are all just as important to us.

Broadened out?

Gardner: You started this with those customer-care applications. Has this broadened out into other application development? How do you plan to take the benefits that you've enjoyed early and extend them into more and more aspects of your overall IT organization?

Trimper: We started off with the customer service applications, and we've grown it into observing our provider portals as well, where a provider can come in and look at a member's benefits, and the member portal that members actually log in to. So, we're actually doing production monitoring of pretty much all of our key areas.

We also do pre-production monitoring of it. So, as we are doing a release, we don’t have to wait until it gets to production to understand how it went. We're going a little bit beyond normal performance testing. We're running the same exact types of continuous monitoring in both our pre-production region and our production regions to ensure that quality that we love to provide.

Gardner: And how are the operations people taking this? Has this been building bridges? Has this been something that struck them as a foreign entity in their domain? How has that gone?

Trimper: At first, it was a little interesting. It felt like to them it was just another thing that they had to check out and had to look at, but I took a unique approach with it. I sat down and talked to them personally and said, "You hear about all these problems that people have, and it’s impossible for you to be an expert on all these applications and understand how it works. Luckily, coming from the quality organization, we test them all the time and we know the business processes."
The way I sold it to them is, when you see an alert, when you look at the statistics, it’s for these key business processes that you hear about, but you may not necessarily want to know all the details about them or have the time to do that. So, we really gave them insight into the applications.

As far as the alerting, there was a little bit of an adoption process for that, but overall we've noticed a decrease in the number of support tickets for applications, because we're allowing them to be more proactive, whether it's about an unfortunately blown service-level agreement (SLA) or a degradation in performance quality. We can observe both of those, and then they can react appropriately.

Gardner: Todd, he actually sat down and talked to the production people. Is this something novel? Are we seeing more of that these days?

DeCapua: We're definitely seeing more of it, and I know it’s not unique for Chris. I know there was some push back at the beginning from the operations teams.

There was another thing that was interesting. I was waiting for Chris to hit on it, and maybe he can go into it a little bit more. It was the way that he rolled this out. When you're bringing a monitoring solution in, it’s often the ops team that’s bringing in this solution.

Making it visible

What’s changing now is that you have these application-development testing teams that are saying, "We also want to be able to get access to these types of monitoring, so that our teams can see it and we can improve what we are doing and improve the quality of what we deliver to you, the ops teams. We are going to do instrumenting and everything else that we want to get this type of detail to make it visible."

Chris was sharing with me how he made this available first to the directors, and not just one group of directors, but all the directors, making this very plain-sight visible, and helping to drive some of the support for the change that needed to happen across the entire organization.

As we think about that as a proven practice, maybe Chris is one of the people blazing the trail there. It was a big way of improving and helping to illuminate for all parties, this is what’s happening, and again, we want to work to deliver better quality.

Gardner: Anything to add to that, Chris?

Trimper: There were several folks in the development area who weren't necessarily the happiest when they learned that what was really there in terms of performance didn't match what they originally thought was there.

One of the directors shared an experience with me. He would go into our utilities and look at the dashboards before he was heading to a meeting in our customer service center. He would understand what kind of looks he was going to be given when he walked in, because he was directly responsible for the functionality and performance of all this stuff.

He was pleased that, as they went through different releases and were able to continually make things better, he started seeing everything is green, everything is great today. So, when I walk in, it’s going to be sunshine and happiness, and it was sunshine and happiness, as opposed to potentially a little bit doomy and gloomy. It's been a really great experience for everyone to have. There's a little bit of pain going through it, but eventually, it has been seen as a very positive thing.

Gardner: What about the tools that you have in place? What allows you to provide these organizational and cultural benefits? It seems to me that you need to have data in your hands. You need to have some ability to execute once you have got that data. What’s the technology side of this; we've heard quite a bit about the people and the process?

Trimper: This whole thing came about because our CIO came to me and said, "We need to know more about our production systems. I know that your team is doing all the performance testing in pre-production. Some of the folks at HPE told me about this new tool called Performance Anywhere. Here it is, check it out, and get back to me."

We were doing all the pre-production testing, and we learned that all the scripts we had -- already tried and true, running continuously, and updated as we get new releases -- could simply be turned into these production monitors. Then, we found through the trial, and now through more than two years of working with it, that it was a fairly easy process.

Difficult point

The most difficult point was understanding how to get production data that we could work with, but you could literally take a test built on your VUGen script and turn it into a production monitor in 5-10 minutes, and that was pretty invaluable to us.

That means that every time we get a release, we don’t have to modify two sets of scripts and we don’t have two different teams working on everything. We have one team that is involved in the full life cycle of these releases and that can very knowledgeably make the change to those production monitors.

Gardner: HPE Performance Anywhere. Todd, are a lot of people using it in the same fashion, getting this dual benefit from pre-production and also in deployment and operations?

DeCapua: Yes, it's definitely something that people are becoming more and more aware of. It's a capability that's been around for a little while. You'll also hear about things like IT4IT, but I don't want to open up that whole can of worms unless we want to dive into it. But as that starts to happen, people like Chris, people like his CIO, want to be able to get better visibility into all systems that are in production, and is there an easy way to do that? Being able to provide that easy way for all of your stakeholders and all of your customers is a capability that we're definitely seeing people adopt.

Gardner: Can you provide a bit more detail in terms of the actual products and services that made this possible for you, Chris?

Trimper: We started with our HPE LoadRunner scripts, specifically the VUGen scripts, that we were able to turn into the production monitors. Using the AppPulse Active tool from the AppPulse suite of tools, we were able to build our scripts using their SaaS infrastructure and have these monitors built for us and available to test our systems.

Gardner: So what do you see in your call center? Are you able to analyze in any way and say, "We can point to these improvements, these benefits, from the ability for us to tie the loop back on production and quality assurance across the production spectrum?"

Trimper: We can do a lot of trend analysis. To be perfectly honest, we didn't think that the report would run, but we did a year-to-date trend analysis, and it actually was able to compile all of our statistics. We saw two really neat things.

When we had open enrollment, we saw this little spike that shot up there, which we would expect to see, but hopefully we can be more prepared for it as time goes on. But we saw a gradual decrease. Due to the ability to monitor, react, and plan better for a better-performing system, over the course of the year, for this one key piece of pulling member data, we went from an average of about 12-14 seconds down to 4 seconds, and that trend is continuing to go down.

I don't know if it's now 3 or less today, but if you think about going from 12 or 14 down to about 4, that was a really big improvement. It spoke volumes about our capability to understand the whole picture, and being able to see all of that in one place was really helpful to us.

Where next?

Gardner: Looking to the future, now that you've made feedback loops demonstrate important business benefits and even move into a performance benefit for the business at large, where can you go next? Perhaps you're looking at security and privacy issues, given that you're dealing with compliance and regulatory requirements like most other healthcare organizations. Can you start to employ these methods and these tools to improve other aspects of your SLAs?

Trimper: Definitely, in terms of the SLAs and making sure that we're keeping everything alive and well. As for some of the security aspects, those are channels we haven't necessarily gone down yet. But we've started to realize that there are an awful lot of places where we can either tie back or really start closing the gaps in our understanding of everything that is in our systems.

Gardner: Todd, last word, what should people be thinking about when they look at their tooling for quality assurance and extending those benefits into full production and maybe doing some cultural bonding at the same time?

DeCapua: The culture is a huge piece. No matter what we talk about nowadays, it starts with that. When I look at somebody like Independent Health, the focus of that culture and the organization is on their end user, on their customer.

When you look at what Chris and his team have been able to do, at a minimum, it's reducing the number of production incidents. And while you're reducing production incidents, you're doing a number of things. There are actual hard costs there that you're saving. There are opportunity costs, now that you can have these resources working on other things to benefit that end customer.

We've talked a lot about DevOps, we've talked a lot about monitoring, we've mentioned now culture, but where is that focus for your organization? How is it that you can start small and incrementally show that value? Because now, what you're going to do is be able to illustrate that in maybe two or three slides, two or three pages.
But some of the things that Chris has been doing, and other organizations are also doing, are showing, "We did this, we made this investment, this is the return we got, and here's the value." For Independent Health, their customers have a choice, and if you're able to move their experience from 12-14 seconds to 4 seconds, that's going to help. That's going to be something that Independent Health wants to be able to share with their potential new customers.

As far as acquiring new customers and retaining their existing customers, this is the real value. That's probably my ending point. It's a culture, there are tools that are involved, but what is the value to the organization around that culture and how is it that you can then take that and use that to gain further support as you move forward?

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

How to Empower Business Leaders with Dashboards

Provide an insightful report and you’ll empower an executive for a day. Provide an interactive data dashboard and you’ll empower him for a lifetime.

I have been a consumer of financial performance data for more than 20 years. An insightful performance report (revenue, profit, and expense) empowered me for a day. But in the long run, I was left frustrated for three reasons: 

1. Reports come too late to react to
2. Reports contain errors causing doubt
3. Data behind the numbers were rarely accessible

Financial reporting is not about the numbers. Financial reporting is about the story behind the numbers and the actions you take as a result of that story. 

My ability to tell the "performance" story has only been as good as the data I would get. When I could not see the details behind the numbers, I had to make assumptions without certainty of facts. At times I felt like a sports fan shouting at the scoreboard, hoping to change the result.


Before I go any further, I do not blame IT. Business Leaders often don't know what they want until they want it. IT people perform heroic acts trying to anticipate ...


Read More on Datafloq
5 Ways to Increase Your Start-up Valuation with Cloud Accounting

Cloud accounting is a great tool, not only for large businesses and organizations but also for start-ups. It is the process, or financial strategy, that helps businesses store, monitor, and process confidential financial data without investing in expensive in-house IT server infrastructure. For start-ups, Cloud accounting solutions bring a whole plethora of benefits that help the emerging business grow and optimize the efficiency of its operations, while also increasing its valuation. Cloud accounting's dynamic advantages, like real-time data access, scalability, and flexibility, make it a great choice for developing start-ups.

The fundamental trait of any start-up is its affinity for innovation and creativity. A start-up that implements Cloud accounting alongside its business strategy benefits in more than one way: it can properly manage its financial affairs without much hassle, even with limited resources and/or knowledge. Here are a few advantages of implementing Cloud accounting techniques that help drive up a start-up's valuation.

1. Impressive Data Flow and Access Management

Unlike large businesses and corporations, start-ups lack the necessary IT resources -- skilled IT personnel, specialized hardware, and top-notch security systems -- to properly set up and manage in-house data servers. Without a significant amount of funding, start-ups ...


Read More on Datafloq
How Emerging Industries are Using Big Data to their Advantage

It’s exciting to watch a new industry figure out ways to use big data. There are so many ways different industries put data to work for them and for their audience. For example, Google is using RankBrain to determine search results.

This is exciting because it’s a very real example of Artificial Intelligence employing data to affect what you see in front of you every day. Go ahead, try typing an unusual query into Google. RankBrain will help determine the result based on similar and unconnected searches, your own searches, and the data generated by clicks in those searches, and it will do this in real-time.

In other words, machine learning will use big data to personalize the result of each search for you. And in the world of Google updates and SEO, the mysterious, exciting thing is that there was an “Unnamed Major Update” in May. Was that change a full-scale takeover by AI? We won’t know until there’s an announcement.

So Google is something of a juggernaut and a pioneer in the big data world. Internet giants like Google were at the forefront of Big Data’s emergence to the public eye in 2010, which has ushered in Analytics 3.0. This is ...


Read More on Datafloq
What Was Left Off the Shopping List? – A New Hungarian Data Mining Competition Launches Today

Entering data mining competitions is an important milestone on the road to becoming a data scientist. A good competition is an excellent way to test your knowledge of machine learning methods, your ability to set up a proper training and testing regime, and your evaluation strategy. That is why kaggle.com, the site that organizes data mining competitions, is so popular; it is worth following the events there even if you have no time to join the contests yourself.

I am especially pleased when Hungarian competitions launch, since such events are an indicator of where the local data community really stands. That is why I want to draw particular attention to the Cetli ("Shopping List") Competition starting today: in this contest, launched courtesy of and supported by Nextent, we get to work on data from the application called Cetli. We can see the shopping lists of anonymized users, and we know where items were deleted from those lists. The task is to estimate which product the organizers deleted from the list in a given store. As a result, we can see at once what was bought and where, so the dataset is interesting in its own right.

If you are interested, take a look around the competition's site and register as a competitor.

Official competition page

The organizers will talk in more detail about the competition launch at tonight's Budapest.py Meetup.

Do you feel you have a dataset that could make for an interesting data mining competition? Would you like to know what the best achievable solution would be, or are you curious who really knows how to solve that kind of problem? Or do you simply want your vendors to compete? Get in touch with us, and we will gladly help formulate, announce, and even run the competition. - Gáspár Csaba gaspar.csaba@dmlab.hu


Why is Data becoming established in the C-Suite?

Over the last decade, MBN Solutions has seen the seniority of data and analytics roles grow in various types of organization. Chief Data Officers (CDOs) have become increasingly popular as the most senior of those roles within a company.

With the passing of time, those roles continue to operate and it feels like the CDO is now an established presence in today’s boardrooms. Why is that? Given the demand we also see for Analytics or Data Science leaders, why has the CDO reached the top first?

Although there are a few businesses with Chief Analytics Officers or Chief Scientists, many appear satisfied with a CDO at the top for now. Wondering why, I’ve been chatting to some of those data and analytics leaders we have placed. A few common themes have emerged, to perhaps explain this pattern.

MBN has been tracking this in the market. Back in 2013, we posed this question to an audience of senior data professionals at an event we hosted in Home House, London: "How many of you anticipate a Data professional sitting at C-Suite level within the next few years?"

The response from the audience was striking. Of almost 80 individuals in the room, only 2 raised their hands.

So, I thought I’d ...


Read More on Datafloq
Will Fog Computing Hide the Clouds of the Internet of Things?

As if the Internet of Things (IoT) were not complicated enough, the Marketing team at Cisco introduced its Fog Computing vision in January 2014 -- also known as Edge Computing to other, more purist vendors.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have resorted to shocking headlines around this subject, taking advantage of the hype of the IoT.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model and how companies are using intelligent IoT gateways in the Fog to connect the "Things" to the Cloud, through some application areas and examples of Fog Computing.

The problem with the Cloud

As the Internet of Things proliferates, businesses face a growing need to analyse data from sources at the edge of a network, whether mobile phones, gateways or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.
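
As a toy sketch of that division of labor -- with entirely invented readings and thresholds, assuming only the Python standard library -- the fog layer reacts locally with low latency and forwards only a compact aggregate to the cloud:

from statistics import mean

# Hypothetical one-minute batch of temperature readings from an edge sensor.
readings = [21.3, 21.4, 21.2, 35.9, 21.5, 21.3]

# Fog/edge layer: act on anomalies immediately, without a round trip to the cloud.
anomalies = [r for r in readings if r > 30.0]
if anomalies:
    print("local alert raised:", anomalies)

# Cloud layer: receives only a small summary instead of every raw sample.
summary = {"count": len(readings), "mean": round(mean(readings), 2), "max": max(readings)}
print("forwarded to cloud:", summary)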

The IoT owes its explosive growth to the connection of physical things and operational technology (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make "smart" decisions without human intervention. Currently, such resources ...


Read More on Datafloq
How Smart Factories and the IIoT Can Prevent Product Recalls

In recent news, Samsung Electronics Co. initiated a global recall of 2.5 million of its Galaxy Note 7 smartphones after finding that the batteries of some of the phones exploded while charging. The recall is expected to cost the company close to $1 billion.

This is not a one-off incident

Product recalls have plagued the manufacturing world for decades, from the food and drug industries to automotive, causing huge losses and risk to human life. In 1982, Johnson & Johnson recalled 31 million bottles of Tylenol, which retailed at $100 million, after 7 people died in the Chicago area. In 2000, Ford recalled 20 million Firestone tires, losing around $3 billion, after 174 people died in road accidents caused by faulty tires. In 2009, Toyota issued a recall of 10 million vehicles due to numerous issues, including gas pedals and faulty airbags, which resulted in a $2 billion loss from repair expenses and lost sales, in addition to its stock price dropping more than 20%, or $35 billion.

Most manufacturers have very stringent quality-control processes for their products before they are shipped. So how and why do these faulty products, which pose serious risks to life and business, make it to the market?

Koh Dong-jin, president ...


Read More on Datafloq
Becoming a Big Data Scientist: Skills You Need to Know and How to Learn Them

To say that data scientists are in high demand would actually be sort of an understatement. With big data being utilized more and more within organizations, executives want men and women who know big data inside and out. The number of data scientist positions is on the rise and growing each year. This demand is reflected in the amount of money being paid to data scientists, with the median salary for computer and information research scientists hitting more than $110,000 in 2015, according to the Bureau of Labor Statistics. But it's not enough to be considered a data scientist; you need the right skills to get noticed above your peers. That way, you'll be able to land the most coveted jobs out there. In other words, mastering certain skills will get you noticed far more quickly.

One can look at data science skills you should know through a broad lens. Simply saying you need programming skills, for example, would be accurate, but let’s get more specific than that. In an analysis from CrowdFlower of LinkedIn job postings, the most cited skill for data scientist openings was SQL. In fact, more than half (57 percent) listed SQL ...


Read More on Datafloq
5 Ways Blockchain will Transform Financial Services

Blockchain is being hailed as "the new internet" and is driving transformation for businesses across multiple sectors, particularly financial services. But how exactly?

Blockchain in a Nutshell

Let's start with a quick recap of what exactly blockchain is and its benefits. Pinching a definition from the Financial Times…

“A blockchain is a shared digital ledger that allows transactions to be recorded and verified electronically over a network of computers without a central ledger. Cryptography is used to protect the data from fraud or hackers.”

So why is everyone, including us of course, so excited? Because the benefits are extensive: decentralisation, reliability, simplification, transparency, traceability, cost saving, reduced room for error, faster transactions, and improved data quality… just to mention a few!
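
To make the "shared ledger protected by cryptography" part of that definition concrete, here is a toy sketch in Python -- an illustration of the chaining idea, not any production blockchain -- in which each block stores the hash of its predecessor, so tampering with one record invalidates everything after it:

import hashlib, json

def make_block(data, prev_hash):
    # A toy block: the payload plus the hash that chains it to its predecessor.
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block("A pays B 10", "0" * 64)
second = make_block("B pays C 4", genesis["hash"])

# Each party can re-derive the hashes independently; any edit to the genesis
# block changes its hash and breaks the link recorded in the next block.
assert second["prev"] == genesis["hash"]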

So let’s take a look at some specific ways blockchain will transform the Financial Services industry – ultimately creating a much more satisfying customer experience for us all.

1. Asset Management

Use case: Settlements

Traditional trade processes within asset management can be slow, manual, cumbersome, and filled with risk when reconciling and matching -- and they're getting more complex with cross-border transactions and non-standard investment products, e.g., loans. Each party in the trade lifecycle (e.g. broker dealers, intermediaries, custodians, clearing and settlement teams) currently keeps their ...


Read More on Datafloq
How to Integrate Sqoop in a Data Ingestion Layer with Hive, Oozie in a Big Data Application?

Sqoop is a tool that helps migrate and transfer data in bulk between an RDBMS and a Hadoop system. This blog post focuses on integrating Sqoop with other projects in the Hadoop ecosystem and with Big Data applications. Working for a Big Data solution provider, I learned these techniques, and here I will show how to schedule a Sqoop job with Oozie and how to load data from Sqoop into a Hive data warehouse on Hadoop, or even into HBase. I have included the code for each solution to make it easy to understand.

Environment

Java: JDK 1.7

Cloudera version:  CDH4.6

MySQL.

Initial steps

Here we assume that we already have some data on the HDFS of our Hadoop cluster and in our MySQL database. Now we need to integrate the jobs with Oozie and Hive. Let's understand it with the code!

Code walkthrough

Below is an Oozie workflow that triggers an import job to bring the data from MySQL into Hadoop:

<workflow-app name="musqoop-wf" xmlns="uri:oozie:workflow:0.1">
    ...
    <action name="mysqoopaction">
        <!-- The Sqoop action uses its own namespace, distinct from the workflow's -->
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>myjt:8021</job-tracker>
            <name-node>nn:8020</name-node>
            <command>import --table People --connect ...</command>
        </sqoop>
        <ok to="next"/>
        <error to="error"/>
    </action>
    ...
</workflow-app>

Now, how do you add a property for Sqoop when integrating with Oozie? The same workflow file answers this question: it is where you configure properties for Sqoop.

For Ex: Add a statement in a ...
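
One common way to do it -- a sketch based on Oozie's standard sqoop-action syntax, not necessarily the exact statement the author intended -- is a <configuration> block inside the action; the property shown here is illustrative:

<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>myjt:8021</job-tracker>
    <name-node>nn:8020</name-node>
    <configuration>
        <!-- Illustrative property; substitute whatever Sqoop/Hadoop setting you need -->
        <property>
            <name>mapred.job.queue.name</name>
            <value>default</value>
        </property>
    </configuration>
    <command>import --table People --connect ...</command>
</sqoop>

Loading the imported data into Hive, mentioned at the start of the post, can be sketched the same way: one option is appending Sqoop's --hive-import flag to the command element.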


Read More on Datafloq
Why Employee Training and Big Data Should Work Together

The flood of data available today is growing by leaps and bounds as expanding networks capture real-time user decisions in an instant. The ability to analyze this data for trends and insights is becoming an eagerly sought advantage for corporate training companies and organizations of all kinds. But more of them are beginning to discover that it offers benefits beyond marketing forecasts and instant statistics. Big data is being adapted to e-learning processes to train better employees.

Data-driven approaches are being used to perfect adaptive learning, create better courses, and provide electronic monitoring and testing in ways a single human instructor couldn't cope with. Around 77% of US companies offer e-learning, but little of it leverages big data. Here's why every company should bring big data to employee training.

1. Determine Effectiveness

Big data computing can return analytics that quantify all the results of employee training - lesson retention, employee learning needs, more productive curriculum and techniques, and improved learning software. Learning directors are able to look at a number of different factors and determine which works and which doesn't.

Organizations can develop metrics for each training module based on employee learning time, test results, questions, and feedback. Do certain methods work ...


Read More on Datafloq
Why Is Big Data Growing So Fast?

Why Is Big Data Growing So Fast?

You are probably already feeling some of the impact of living in a data-driven world. You rarely come across a Google search that doesn’t answer your question, and often enough you find more than enough information to write your own tome on any topic you can imagine.

Moreover, your hard drive is probably filled with so much data accumulated over the years that you might wonder what would happen to all your data if it crashed and you hadn’t backed up everything. Fortunately, hard drive data recovery experts can quickly resolve this issue.

In one of his books, author Deepak Chopra narrates how his hard drive crashed when he was writing a book he had spent months researching. Since he had not yet backed up his data, he immediately experienced crushing despair. He had no idea how to reconstruct his pivotal ideas or retrace his in-depth research findings. Fortunately, he discovered that it was possible to completely recover a hard drive and was amazed (and relieved) when he quickly got his restored hard drive in the mail.

Keeping Track of Big Data

Four years from now, there will be 5,200 GB of data per person. International Data Corporation, a research group, believes that there will ...


Read More on Datafloq
Breaking Analytics Out Of The Box – Literally

Breaking Analytics Out Of The Box – Literally

The lines between open source and commercial products are blurring rapidly as our options for building and executing analytics grow by the day. The range of options and price points available today enable anyone from a large enterprise to a single researcher to gain access to affordable, powerful analytic tools and infrastructure. As a result, analytics will continue to become more pervasive and more impactful.

Author’s note: I typically avoid mentioning specific products or services in my blogs. However, it is unavoidable for this topic. While I will make mention of a number of my company’s offerings here to illustrate specific examples of the themes, the themes themselves are broad and industry-wide.

Blurred Lines

Given the cost and overhead, it used to be that organizations would have to make an either/or choice when it came to selecting data platforms and analytical tools. Even with the advent of the open source movement, common opinions espoused either avoiding open source altogether or migrating completely to open source options. Time has shown that this either/or choice was a false one. As it turned out, most organizations now utilize a mixture of open source and commercial products to achieve maximum effectiveness.

From a platform perspective, large organizations typically are ...


Read More on Datafloq
Who Competes With VMware Now?

Who Competes With VMware Now?

Yesterday, September 7, 2016, the EMC² logo disappeared. It’s hard for me to imagine that one of the greatest tech marketing companies of all time is suddenly gone. Yes, I know it lives on under Dell Technologies as DellEMC, but the impressions I have of the two companies side by side are so disparate that I’m still having trouble seeing how one blends into the other.
How always-available data forms the digital lifeblood for a university medical center

How always-available data forms the digital lifeblood for a university medical center

The next BriefingsDirect Voice of the Customer digital business transformation case study examines how the Nebraska Medical Center in Omaha consolidated and unified its data-protection capacities.

We'll explore how adopting storage innovation protects the state's largest hospital from data disruption and adds operational simplicity to complex data lifecycle management.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how more than 150 terabytes of data remain safe and sound, we're joined by Jeff Bergholz, Manager of Technical Systems at The Nebraska Medical Center in Omaha. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about the major drivers that led you to seek a new backup strategy as a way to keep your data sound and available no matter what.

Bergholz: At Nebraska Medicine, we consist of three hospitals with multiple data centers. We try to keep an active-active data center going. Epic is our electronic medical record (EMR) system, and with that, we have a challenge of making sure that we protect patient data as well as keeping it highly available and redundant.

We were on HPE storage for that, and with it, were really only able to do a clone-type process between data centers and keep retention of that data, but it was a very traditional approach.

A couple of years ago, we did a beta program with HPE on the P6200 platform, creating a tertiary replica of our patient data. Building on that, this past year we augmented our data protection suite: we went from license-based to capacity-based licensing and introduced new D2D dedupe devices, StoreOnce as well. That affords us the ability to easily replicate that data over to another StoreOnce appliance with minimal disruption.

Part of our goal is to keep backups available for potential recovery. With all the cyber threats in today's world, we've recently increased our retention cycle from 7 weeks to 52 weeks. We saw and heard from the analysts that the average vulnerability sits in your system for 205 to 210 days. So, we had to come up with a plan for what it would take to provide recovery in case something were to happen.

We came up with a long-term solution and we're enacting it now. Combining HPE 3PAR storage with the StoreOnce, we're able to more easily move data throughout our system. What's important there is that our backup windows have greatly been improved. What used to take us 24 hours now takes us 12 hours, and we're able to guarantee that we have multiple copies of the EMR in multiple locations.

We demonstrate it, because we're tested at least quarterly by Epic as to whether we can restore back to where we were before. Not only are we backing it up, we're also testing and ensuring that we're able to reproduce that data.

More intelligent approach

Gardner: So it sounds like a much more intelligent approach to backup and recovery with the dedupe, a lower cost in storage, and the ability to do more with that data now that it’s parsed in such a way that it’s available for the right reason at the right time.

Bergholz: Resource-wise, we always have to do more with less. With our main EMR, we're looking at potentially 150 terabytes of data that dedupe shrinks down greatly, and our overall storage footprint for all other systems was approaching 4 petabytes.

We've seen some 30:1 deduplication ratios within that, which has really allowed my staff and other engineers to be more efficient and frees up some of their time to do other things, as opposed to having to manage the normal backup and retention.
HPE Data Protector:
Backup with Brains
Learn More Here
We're always challenged to do more and more. We grow 20 to 30 percent annually, but we're not going to get 20 to 30 percent more resources every year. So, we have to work smarter with less and leverage the technologies that we have.

Gardner: Many organizations these days are using hybrid media across their storage requirements. The old adage was that for backup and recovery, use the cheaper, slower media. Do you have a different approach to that and have you gone in a different direction?

Bergholz: We do, and backup is as important to us as the primary data that exists out there. Time and time again, we've had to demonstrate the ability to restore in different scenarios within the accepted time to restore and return service. When clinicians or caregivers are taking care of patients, they want that data as quickly as possible; they're not going to wait. While it may not be the EMR, it may be some ancillary documents that they need in order to provide better care.

We're able, upon request, to enact and restore in 5 to 10 minutes. In many cases, once we receive a ticket or a notification, we have full data restoration within 15 minutes.

Gardner: Is that to say that you're all flash, all SSD, or some combination? How did you accomplish that very impressive recovery rate?

Bergholz: We're pretty much all dedupe-type devices. It’s not necessarily SSD, but it's good spinning disk, and we have the technology in place to replicate that data and have it highly available on spinning disk, versus having to go to tape to do the restoration. We deal with bunches of restorations on a daily basis. It’s something we're accustomed to and our customers require quick restoration.

In a consolidated strategic approach, we put the technology behind it. We didn't choose the cheapest option, but the best one, and having an active-active data center and backing up across both data centers enables us to do it. So, we did spend money on the backup portion, because it's important to our organization.

Gardner: You mentioned capacity-based pricing. For those of our listeners and readers who might not be familiar with that, what is that and why was that a benefit to you?

Bit of a struggle

Bergholz: It was a little bit of a struggle for us. Traditionally, we were always client-based or application-based in the backup. If we needed to back up Microsoft Exchange mailboxes, we had to have an Exchange plug-in. If we had Oracle, we had to have an Oracle plug-in, or a SQL plug-in.

While that was great and enabled us to do a lot, we were always having to get another plug-in to do it. Given the dedupe compression ratios we were seeing, going to a capacity-based license allowed us to strategically and tactically plan for any growth within our environment. So now, we can buy in chunklets and keep ahead of the game, making sure that we're effective there.

We're in the throes of enacting an archive solution through a product called QStar, which I believe HPE is OEM-ing, and we're looking at that as a long-term archive process. It goes to a linear tape file system, utilizing the management tools that product brings us to afford the long-term archiving of patient information.

Our biggest challenge is that we never delete anything. It’s always hard with any application. Because of the age of the patient, many cases are required to be kept for 21 years; some, 7 years; some, 9 years. And we're a teaching hospital and research is done on some of that data. So we delete almost nothing.
HPE Data Protector:
Backup with Brains
Learn More Here
In the case of our radiology system, we're approaching 250 terabytes right now. Trying to back up and restore that amount of data with traditional tools is very ineffective, but we need to keep it forever.

By going to a tertiary-type copy, which this technology brings us, we have our source array, our replicated array, plus now a tertiary array to take that data to, which is our LTFS solution.

Gardner: And with your backup and recovery infrastructure in place and a sense of confidence that comes with that, has that translated back into how you do the larger data lifecycle management equation? That is to say, are there some benefits of knowledge of quality assurance in backup that then allows people to do things they may not have done or not worried about, and therefore have a better business transformation outcome for your patients and your clinicians?

Bergholz: From a leadership perspective, there's nothing real sexy about backup. It doesn’t get oohs and ahs out of people, but when you need data to be restored, you get the oohs and ahs and the thank-yous and the praise for doing that. Being able to demonstrate solutions time and time again buys confidence through leadership throughout the organization and it makes those people sleep safer at night.

Recently, we passed HIMSS Level 7. One of the remarks from that group was that a) we hadn’t had any production sort of outage, and b) when they asked a physician on the floor, what do you do when things go down, and what do you do when you lose something? He said the awesome part here is that we haven’t gone down and, when we lose something, we're able to restore that in a very timely manner. That was noted on our award.

Gardner: Of course, many healthcare organizations have been using thin clients and keeping everything at the server level for a lot of reasons, an edge-to-core integration benefit. Would you feel more enabled to go into mobile and virtualization knowing that everything kept on the data-center side is secure and backed up, without worrying about having any data on the client? Is that factored into any of your architectural decisions about client computing?

Desktop virtualization

Bergholz: We have been in the throes of desktop virtualization. We do a lot of Citrix XenApp presentation of applications, which keeps the data in the data center, and a lot of our desktop devices connect to that environment.

The next natural progression for us is desktop virtualization (VDI), ensuring that we're keeping that data safe in the data center, ensuring that we're backing it up, and protecting the patient information on it. It's an interesting thought and philosophy. We tried to sell it as an ROI-type initiative to start with, but by the time you put all the pieces of the puzzle together, the ROI really doesn't pan out; at least that's what we've seen in two different iterations.

Although it can be somewhat cheaper, the difference isn't significant enough to justify a huge launch down that route. The main play there, and the main support we have organizationally, is from a data-security perspective. There's also the ease of managing the virtual desktop environment: it frees up our desktop engineers from being feet on the ground, so to speak, to being application engineers who layer in the applications provisioned through the virtual desktop environment.

And one important thing in the healthcare industry is that when you have a workstation that has an issue and requires replacement or re-imaging, that’s an invasive step. If it’s in a patient room or in a clinical-care area, you actually have to go in, disrupt that flow, put a different system in, re-image, make sure you get everything you need. It can be anywhere from an hour to a three-hour process.

We do have a smattering of thin devices out there. When there are issues, it's merely a matter of redoing a gold image. The great part about thin devices versus thick devices is that in a lot of cases they're operating in a sterile environment. With traditional desktops, the fans are sucking air through infection-control areas and all that; there's noise; perhaps they're blowing dust within a room if it's not entirely clean. SSD devices are a perfect play there. It's really a drop-off, unplug, and re-plug sort of technology.

We're excited about that for what it will bring to the overall experience. Our guiding principle is that you have the same experience no matter where you're working. Getting there from Step A to Step Z is a journey. So, you do that a little bit a time and you learn as you go along, but we're going to get there and we'll see the benefit of that.
HPE Data Protector:
Backup with Brains
Learn More Here
Gardner: And ensuring the recovery and veracity of that data is a huge part of being able to make those other improvements.

Bergholz: Absolutely. What we've seen from time to time is that users, while fairly knowledgeable, save their documents wherever they happen to save them. Policy is to keep documents within the data center, but that may not always be adhered to. By going to desktop virtualization, they won't have any other choice.

A thin client takes that a step further and ensures that nothing gets saved back to a device, where that device could potentially disappear and cause a situation.

We do encrypt all of our stuff. Any device that's out there is covered by encryption, but still there's information on there. It’s well-protected, but this just takes away that potential.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Influencer Summit 2016—Teradata Reshapes Itself with Analytics and the Cloud

Influencer Summit 2016—Teradata Reshapes Itself with Analytics and the Cloud

For anyone with even a small amount of understanding of current trends in the software industry, it will come as no surprise that the great majority of enterprise software companies are focusing on the incorporation of analytics, big data, cloud adoption, and especially the Internet of Things into their software solutions. In fact, these capabilities have become so ubiquitous that for
Top 5 Up and Coming Big Data Jobs of 2016

Top 5 Up and Coming Big Data Jobs of 2016

Big data has been all the rage in the past few years. In 2012, the Harvard Business Review described data science as the sexiest job of the 21st century, and big data has only grown bigger since then. Whether it’s location tracking or mapping customer behavior across a website, there are numerous job opportunities within big data for those who like numbers. With the explosion of the Internet of Things and machine learning, the growth of big data jobs will continue in the coming years.

By 2020, businesses utilizing data will boost their productivity by $430 billion in comparison to the competition that does not use big data. As a result, there are all kinds of jobs that are in high demand today to help businesses increase their efficiency and more effectively target customers. Here are the top five up-and-coming big data jobs of 2016.

1. Data Analyst

This one’s a bit of a no-brainer, but big data can’t help a business unless they have the resources to process that data and extract meaningful trends that they can act on. It’s the data analyst’s job to wrangle all of the data and provide a company with conclusions on what the data is saying. ...


Read More on Datafloq
Why Do Television Companies Need a Digital Transformation?

Why Do Television Companies Need a Digital Transformation?

Over just a few years, the world of television production, distribution, and consumption has changed dramatically. In the past, with only a few channels to choose from, viewers watched news and entertainment television at specific times of the day or night. They were also limited by where and how to watch. Options included staying home, going to a friend's house, or perhaps going to a restaurant or bar to watch a special game, show, news story, or event. The TV industry has been completing the move from standard definition to high definition, and now the discussion is about 4K and 8K video standards. But before any of that can happen, analog broadcasting needs to be transformed digitally. That means the TV industry unavoidably needs a disruptive transformation of its ICT platform to cope with the new processes of acquisition, production, distribution and consumption.

Fast-forward to today, and you have a very different scenario. Thanks to the rise of the Internet – and, in particular, mobile technology – people have nearly limitless options for their news and entertainment sources. Not only that, but they can choose to get their news and other media on TV or ...


Read More on Datafloq
Take Action in Real Time with Big Data Analytics

Take Action in Real Time with Big Data Analytics

I wrote about predicting the future with analytics in a blog titled "Remember the Past and Predict the Future". In many industries, technological advances have greatly reduced the margin of error for predicting future outcomes. But what good is a prediction if you don’t take action? 

Thinking about the ramifications of taking action on predictions can turn into a mind-warping time-travel movie. Or, more realistically, a never-ending iterative loop:

If my data is telling me that X is going to happen, then I need to do Y. 
But if I do Y, my data tells me Z will happen.  
While Z is better than X, is it the best alternative?  
Perhaps I desire ZZ. 
How do I exit analysis paralysis and make an actionable decision before the outcome happens?

In “Remember the Past …”, I wrote that prediction becomes much more accurate the closer you get to the predicted outcome. While this is great and hopefully intuitive, are we able to take action fast enough?  If the weather forecast offers a 5-minute lead time, will you ever grab your umbrella? 



Reaction Time v. Prediction Lead Time

If we agree that predictions are generally more accurate the closer in time you get to the predicted outcome, then perhaps our focus should be ...


Read More on Datafloq
What do Big Data and Robo Advisors Have in Common?

What do Big Data and Robo Advisors Have in Common?

There are many investors and analysts that are calling 2016 the year of the robo. One can barely get on a computer without being bombarded with ads and content from a sector that is growing faster than it can handle.

Even some of the more traditional brokers are starting to offer a more digitalized version of an advisor for a new generation of investors that expect massive amounts of data to be at their fingertips, sorted and organized for their viewing pleasure. The market for these robo advisors is expected to move from $2 billion in 2013 to $500 billion in 2020.

Investing has never been an easy game. There are thousands of variables playing into financial markets at any given time, making it virtually impossible for man and machine alike to consistently and accurately call market movements. That is exactly what robo advisors intend to do, however. In theory, if one had all the necessary data, they should be able to accurately predict market movements virtually every single time. This concept has brought about a massive push for data in wealth management.

How they work

Robo advisors take a person's data, crunch it, and make smart investment decisions based on that data. Some are ...


Read More on Datafloq
Kick-Starting the 4th Industrial Revolution, One Blockchain at a Time

Kick-Starting the 4th Industrial Revolution, One Blockchain at a Time

We live in a future of accelerated change, and today's world is changing faster than we have ever seen before. New technology is changing the way we live, work and collaborate. No longer is it sufficient for organisations to sit back and stick to the status quo. Today's new technologies require an active attitude from organisations if they want to remain in business in the next decade.

I am talking about the 4th Industrial Revolution that is rapidly approaching, and it will bring change unlike anything we have ever seen before. In fact, it will change what it means to be human. It also offers us a tremendous opportunity to create a world that is good for all, where technology is used for good, the privacy of consumers is respected, and data is used to improve the lives of all humans. The 4th Industrial Revolution is all about algorithms, machine learning and artificial intelligence. It is about robotics, 3D printing, VR/AR, nanotechnology, and many more emerging technologies. It is disruption on all levels, resulting in system-wide innovations that can change an industry in years instead of decades. The combination of such revolutionary emerging technologies will bring us realities that until recently would ...


Read More on Datafloq
What Does Your Medical Record Say About You? (and who is reading it?)

What Does Your Medical Record Say About You? (and who is reading it?)

What happens when you fill in that medical history questionnaire?

A new doctor or a new office, or sometimes it's a routine visit… how many times have you filled in your personal medical information, from your address and insurance information down to the significant (or awkward) events of your medical history: the illness, the surgery, the procedure? Not only is this excruciatingly private information, but it's also important to you that it is accurate, timely AND secure.

Every time you write your name and information on a form, a person has to read it, interpret it, and most likely enter it into some electronic form for a structured database. Will they get it right? Would you know if they did or did not translate it correctly?

An entire job industry exists for medical data entry. Not only are non-medical strangers reading your personal medical information time after time, but the potential exists for information to be incorrectly interpreted, with errors in spelling, dates, treatments and more. The data entry employee or outsourced contractor is not medically trained, and they are only human as far as reading and entering name after name. Your name, affliction and treatment are just a blip in ...


Read More on Datafloq
Why Isn’t Your Business Using NoSQL Databases Yet?

Why Isn’t Your Business Using NoSQL Databases Yet?

Businesses today are collecting huge amounts of data every single day that can be used to benefit the business and its operations. The sheer volume of data that companies now deal with and store on a daily basis means that traditional frameworks are under pressure and in some cases are no longer fit for purpose. NoSQL databases could be your solution to dealing with today’s data demands if you’re not already using this framework option.

Traditional relational database management systems (RDBMS) are a great choice if a business is dealing with small amounts of data that need to be kept well-structured. But when large volumes are added, performance can degrade, often making an RDBMS an unsuitable tool when it comes to big data.

In contrast, NoSQL, often known as Not Only SQL, is scalable. This type of database has been designed with the high volumes of incoming information associated with big data in mind. NoSQL is particularly useful if you have lots of unstructured data stored in multiple areas that need to be correlated and large quantities of data need to be accessed fast.

Up until recently, RDBMS relational databases were the most commonly used but NoSQL databases are ...


Read More on Datafloq
Cyber Security of the Connected Car in the Age of the Internet of Things

Cyber Security of the Connected Car in the Age of the Internet of Things

The Revolutionary Design and Features of Connected Cars

In this age of the Internet of Things, virtual technology affects just about every aspect of our lives. From the way that we watch movies and television to the manner in which we shop and order food from our favorite restaurants, we have become increasingly dependent on wireless and virtual inventions that are designed to make our lives easier.

This technology now extends to the very cars that we drive every day to school, work, or anywhere else we need to go. Our new wireless Internet vehicles, dubbed connected cars, are designed to help us with ordinary driving tasks like backing out of a driveway, parallel parking, and even making a phone call or sending a text without having to dial or type on our cell phones. 

Our cars can tell us what directions to take and what the weather will be like once we arrive at our destination. They play movies, connect to global satellite radio stations, and keep us entertained at the touch of a button. All of their technological features center on maximizing our driving pleasure, improving our safety and handling, and relieving us of much of the thought and effort that ...


Read More on Datafloq
How Big Data and CRM are Shaping Modern Marketing

How Big Data and CRM are Shaping Modern Marketing

Big Data is the term for massive data sets that can be mined with analytics software to produce information about your potential customers’ habits, preferences, likes and dislikes, needs and wants.

This knowledge allows you to predict the types of marketing, advertising and customer service to extend to them to produce the most sales, satisfaction and loyalty.

Skilled use of Big Data produces a larger clientele, and that is a good thing. However, having more customers means you must also have an effective means of keeping track of them, managing your contacts and appointments with them and providing them with care and service that has a personal feel to it rather than making them feel like a “number.”

That’s where CRM software becomes an essential tool for profiting from growth in your base of customers and potential customers. Good CRM software does exactly what the name implies – offers outstanding Customer Relationship Management with the goal of fattening your bottom line.

With that brief primer behind us, let’s look at five ways that the integration of Big Data and CRM is shaping today’s marketing campaigns.

Achieving Targeted Multi-Channel Reach

The data acquired by marketers tells them where to find their customers and potential customers. The information ...


Read More on Datafloq
How Big Data is Revolutionizing the Manufacturing Industry

How Big Data is Revolutionizing the Manufacturing Industry

Data collection and analysis are an integral part of our society, and are important activities we use to inform our decisions. Big data is no exception. Made up of extremely large sets of data that can be analyzed for trends and other information, big data is extremely useful and relevant when determining strategies and plans for communities and companies. In fact, it’s changing the face of many different industries—including manufacturing. Let’s take a look at how big data is revolutionizing the manufacturing industry.

Big Data’s Role in the Manufacturing Industry

“Made in the USA” is a proud label attached to goods manufactured on U.S. soil. While this label doesn’t necessarily assure good quality, most U.S. manufacturers are dedicated to producing well-made goods and paying workers fair wages. A recent report from Ohio University found that the manufacturing industry represents 12% of the country’s gross domestic product (GDP), and that these goods raked in $1.2 trillion from exports in 2013. American manufacturing is becoming stronger again, with a 30% increase in output since the end of the recession, and 54% of manufacturers considering bringing their production back from overseas.

Why is manufacturing experiencing such a positive surge in the United States? Part of it ...


Read More on Datafloq
What is the Blockchain and Why is it So Important?

What is the Blockchain and Why is it So Important?

Blockchain is growing in importance. Increasingly, organisations have to explore what this revolutionary technology will mean for their business. Marc Andreessen of the well-known VC firm Andreessen Horowitz calls it as big an invention as the internet. Last year, in my Big Data Trends prediction for 2016, I foresaw that 2016 would become the year of the Blockchain, and now Gartner has also included it in their Hype Cycle for Emerging Technologies.

Many organisations are already exploring the possibilities of the Blockchain, although primarily still in the Financial Services industry. The R3 Partnership is a consortium of 45 of the biggest financial institutions, investigating what the Blockchain means for them. Next to the R3 consortium, four of the biggest global banks, led by Swiss bank UBS, have developed a “Utility Settlement Coin” (USC), which is the digital counterpart of each of the major currencies backed by central banks. Their objective is to develop a settlement system that processes transactions in (near) real-time instead of days. A third example is Australia Post, who have released plans for developing a blockchain-based e-voting system for the state of Victoria.

The possibilities of the Blockchain are enormous and it seems that almost any industry that deals ...


Read More on Datafloq
Maintaining disabled FK’s, wisdom or farce?

Maintaining disabled FK’s, wisdom or farce?

A while back, I wrote a post about having FKs (foreign keys) in your data warehouse. Well, a similar question came up recently on an Oracle forum with the above title. It is a fair question and it does surface fairly regularly in a variety of contexts (not just data warehousing). Of course, as The […]
Seven Magnificent Big Data Success Stories

Seven Magnificent Big Data Success Stories

Big data has arrived. Big Data is here for keeps. Big Data is the future.

Despite some of the malicious, mendacious and malodorous words of naysayers, sceptics and contrarians, the world of big data and big data analytics is replete with totally amazing and fabulous success stories.

Big Data gurus are often accused of not delivering coherent, cohesive and verifiable accounts of Big Data successes. Which is understandable but at the same time a pity. So here, to illustrate this miraculous and remarkable turnaround, I give you not three but seven of the many Big Data success stories that I could have casually grabbed out of the ether.

First, we take a trip to Glasgow to discover the leveraging of Big Data in alternative investments. Then we pass over to Boston to explore the magic of Big Data at Universal Legal.

The Richy Rich Student Debt Mega Alpha Fund – Big Data and Corporate Welfare

Govan based Hedge Fund operators RCN are proudly leveraging Big Data to the max. Their Student Debt Mega Alpha Fund is one of the most imaginative schemes in the whole of the financial industry landscape, from Singapore, through Soho, to Stateside.

RCN use Big Data in innovative, unique and inventive ways. ...


Read More on Datafloq
Loyalty management innovator Aimia’s transformation journey to modernized IT

Loyalty management innovator Aimia’s transformation journey to modernized IT

The next BriefingsDirect Voice of the Customer digital business transformation case study examines how loyalty management innovator Aimia is modernizing, consolidating, and standardizing its global IT infrastructure.

As a result of rapid growth and myriad acquisitions, Montreal-based Aimia is in a leapfrog mode -- modernizing applications, consolidating data centers, and adopting industry standard platforms. We'll now learn how improving end-user experiences and leveraging big data analytics helps IT organizations head off digital disruption and improve core operations and processes.
 
Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how Aimia is entering a new era of strategic IT innovation, we're joined by André Hébert, Senior Vice President of Technology at Aimia in Montreal. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What are some of the major drivers that have made you seek a common IT strategy?

Hébert: If you go back in time, Aimia grew through a bunch of acquisitions. We started as Aeroplan, Air Canada's frequent flyer program and decided to go in the loyalty space. That was the corporate strategy all along. We acquired two major companies, one in the UK and one that was US-based, which gave us a global footprint. As a result of these acquisitions, we ended up with quite a large IT footprint worldwide and wanted to look at ways of globalizing and also consolidating our IT footprint.

Gardner: For many people, when they think of a loyalty program, it's frequent flyer miles, perhaps points at a specific retail outlet, but this varies quite a bit market to market around the globe. How do you take something that's rather fractured as a business and make it a global enterprise?

Hébert: We've split the business into two different business units. The first one is around coalition loyalty. This is where Aimia actually runs the program. Good examples are Aeroplan in Canada or Nectar in the UK, where we own the currency, we operate the program, and basically manage all of the coalition partners. That's one side.

The other side is what we call our global loyalty solutions. This is where we run loyalty programs for other companies. Through our standard technology, we set up a technology footprint within the customer site or preferably in one of our data centers and we run the technology, but the program is often white-labeled, so Aimia's name doesn't appear anywhere. We run it for banks, retailers and many industry verticals.

Almost like money

Gardner: You mentioned the word currency, and as I think about it, loyalty points are almost like money -- it is currency -- it can be traded, and it can be put into other programs. Tell us about this idea. Are you operating almost like a bank or a virtual currency trader of some sort?

Hébert: You could say that the currency is like money. It is accumulated. If you look at our systems, they're very similar to bank-account systems. So our systems are like banks'. If you look at debit and credit transactions, they mimic the accumulation and redemption transactions that our members do.
Gardner: What's been your challenge from an IT perspective to allow your company to thrive in this digital economy?

Hébert: Our biggest challenge was how large the technology footprint was. We still operate many dozens of data centers across the globe. The project with HPE is to consolidate all of our technology footprint into four Tier 3 data centers that are scattered across the globe to better serve our customers. Those will benefit from the best security standards and extremely robust data-center infrastructure. 

On the infrastructure side, it's all about simplifying, consolidating, virtualizing, and leveraging the cloud, but in a virtual private way, so that we also keep our data very secure. That's on the infra side.

On the application side, we probably have more applications than we have customers. One of the big drivers there is that we have a global product strategy. Several loyalty products have now been developed. We're slowly migrating all of our customers over to our new loyalty systems that we've created to simplify our application portfolios. We have a large number of applications today, and the plan is to try to consolidate all these applications into key products that we've been developing over the last few years.

Gardner: That’s quite a challenge. You're modernizing and consolidating applications. At the same time, you're consolidating and modernizing your infrastructure. It reminds me of what HPE did just a few years ago when it decided to split and to consolidate many data centers. Was that something that attracted you to HPE, that they have themselves gone through a similar activity?

Hébert: Yes, that is one of the reasons. We've shopped around for a partner that can help us in that space and we thought that HPE had the best credentials, the best offer for us to go forward. 

Virtual Private Cloud (VPC), the solution that they offered, is innovative, and it is both virtual and private. So, we feel that our customers' data will be significantly more secure than it would be in just any public cloud.

Gardner: How is consolidating applications and modernizing infrastructure at the same time helping you to manage these compliance and data-protection issues?

Raising the bar

Hébert: The modernization and infrastructure consolidation is, in fact, helping greatly in continuing to secure data and meet ever more demanding security standards, such as PCI DSS 3.0. Through this process, we're going to raise the bar significantly on data privacy.

Gardner: André, a lot of organizations don't necessarily know how to start. There's so much to do when it comes to apps, data, infrastructure modernization and, in your case, moving to VPC. Do you have any thoughts about how to chunk that out, how to prioritize, or are you making this sort of a big bang approach, where you are going to do it all at once and try to do it as rapidly as possible? Do you have a philosophy about how to go about something so complex?

Hébert: We've actually scheduled the whole project. It’s a three-year journey into the new HPE world. We decided to attack it by region, starting with Canada and the US, North America. Then, we moved on to zooming into Asia-Pacific, and the last phase of the project is to do Europe. We decided to go geographically. 
The program is run centrally from Canada, but we have boots on the ground in all of those regions. HPE has taken the lead on the actual technical work. Aimia does the support work, providing documentation and helping with all of the intricacies of our systems and the infrastructure, but it's a co-led project, with HPE doing the heavy lifting.

Gardner: Something about costs comes to mind when you go standard. Sometimes, there are some upfront cost, you have to leapfrog that hurdle, but your long-term operating costs can be significantly lower. What is it about the cost structure? Is it the standardized infrastructure platforms, are you using cheaper hardware, is it open source software, all the above? How do you factor this as a return on investment (ROI) type of an equation?

Hébert: It’s all of the above. Because we're right in the middle of this project, it will allow us to standardize, to evergreen, a lot of our technology that was getting older. A lot of our servers were getting old. So, we're giving the infrastructure a shot in the arm as far as modernization. 

From a VPC point of view, we're going to leverage this internal cloud much more significantly. From a CPU point of view, and from an infrastructure point of view, we're going to have significantly fewer physical servers than what we have today. It's all operated and run by HPE. So, all of the management, all of the ITO work is done by HPE, which means that we can focus on apps, because our secret sauce is in apps, not in infrastructure. Infrastructure is a necessary evil.

Gardner: That brings up another topic, DevOps. When you're developing, modernizing, or having a continuous-development process for your applications, if you have that cloud and infrastructure in place and it’s modern, that can allow you to do more with the development phase. Is that something you've been able to measure at all in terms of the ability to generate or update apps more rapidly?

Hébert: We're just dipping our toe into advanced DevOps, but definitely there are some benefits around that. We're currently focused on trying to get more value from that.

Gardner: When you think about ROI, there are, of course, those direct costs on infrastructure, but there are ancillary benefits in terms of agility, business innovation, and being able to come to market faster with new products and services. Is that something that is a big motivator for you and do you have anything to demonstrate yet in terms of how that could factor?

Relationship 2.0

Hébert: We're very much focused right now on what I would say is Relationship 1.0, but HPE was selected as a partner for their ability to innovate. They also are in a transition phase, as we all know, so while we're focused on getting the heavy lifting done, we're focusing on innovation and focusing on new projects with HPE. We actually call that Relationship 2.0.

Gardner: For others who are looking at similar issues -- consolidation, modernization, reducing costs over time, leveraging cloud models -- any words of advice now that you are into this journey as to how to best go about it or maybe things to avoid?
Hébert: When we first looked at this, we thought that we could do a lot of that consolidation work ourselves. Consolidating 42 data centers into 4 is a big job, and where HPE helps in that regard is that they bring the experience, they bring the teams, and they bring the focus to this. 

We probably could have done it ourselves. It probably would have cost more and it probably would have taken longer. One of the benefits that I also see is that HPE manages thousands and thousands of servers. With their ability to automate all of the server management, they've taken it to another level. As a small company, we couldn't afford to do all of the automation that they can afford to do on these thousands of servers.

Gardner: Before we close out, André, looking to the future -- two, three, four years out -- when you've gone through this process, when you've gotten those modern apps and they are running on virtual private clouds and you can take advantage of cloud models, where do you see this going next? 

Do you have some ideas about mobile applications, about different types of transactional capabilities, maybe getting more into the retail sector? How does this enable you to have even greater growth strategically as a company in a few years?

Hébert: If you start with the cloud, the world is about to see a very different cloud model. If you fast forward five years, there will be mega clouds, and everybody will be leveraging these clouds. Companies that actually purchase servers will be a thing of the past. 

When it comes to mobile, clearly Aimia’s strategy around mobile is very focused. The world is going mobile. Most apps will require mobile support. If you look at analytics, we have a whole other business that focuses on analytics. Clearly, loyalty is all about making all this data make sense, and there's a ton of data out there. We have got a business unit that specializes in big data, in advanced analytics, as it pertains to the consumers, and clearly for us it is a very strategic area that we're investing in significantly.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Big Data on the Roof of the World

Big Data on the Roof of the World

Once upon a time, there was a mountain known as Peak 15. Very little was known about it. Then in 1852, surveyors found it was the highest in the world, and they named it Everest.

As with other significant challenges that we can identify in life, many people have been driven by a passionate desire to conquer peaks all around the world. This is just one illustration of those of us who can identify their significant challenges and rise to them. This sharp focus, determination and courage turns ordinary citizens into people who are invariably on a mission. People who know what they want.

When we know what we want to accomplish, almost anything flowing from that can be driven by a single unified goal, objective and mission.

From the outset, having a clear idea of the significant challenges and why we want to address those significant challenges is far more important than knowing how we are going to go about addressing that challenge.

Of course, we should also have an idea of how we might go about meeting our challenges, whilst at the same time recognising that few things waylay legitimate ambitions more than being held hostage to the means. This requires that we ...


Read More on Datafloq
How Cutting-edge Data Technologies Rapidly Change the World of HR

How Cutting-edge Data Technologies Rapidly Change the World of HR

Cutting-edge technologies help create a bright workforce-management ecosystem that works for you. To achieve this, an organization needs to find talented, predictive, data-driven, intuitive, and user-friendly technologies that integrate with its core HR management system and enable the HR team to perform better.

Let’s see how the new-age solutions are changing the way HR performs.

Today, HR management is not just about automating payroll or time and attendance management. It has evolved to include HR planning and placement strategies, creating efficient channels for better workforce engagement, and enhancing employee training and development. To accomplish all these tasks, it requires a technology that can provide accurate data in an easily comprehensible manner and support organization-wide decision making.

Talent acquisition, development, and retention

Recruitment is one of the biggest areas where HR technology can be leveraged to drive better results, especially in a world where businesses of all sizes are engaged in a “war for talent” and struggle to find the candidates they need. Regular recruitment methods, like using ATS (Applicant Tracking Systems), job portals, and recruitment consultants, cannot meet modern recruitment challenges.

Recruitment methodologies are changing, and technology needs to support them. Social hiring is increasing today, and organizations are ...


Read More on Datafloq
How Big data Emerged as a Productive Game Changer and is Shaping Industries

How Big data Emerged as a Productive Game Changer and is Shaping Industries

The term big data is thrown around relentlessly in business conversations; scores of folks hardly know how to carry it out, but everyone says they are doing it. Nowadays, there is barely any company that is not being influenced by big data. It is indeed a source of competition, productivity growth and innovation, as long as best practices and the right organization are in place.

Data is now fusing into every modern industry, and many stakeholders share the opinion that big data is an integral part of every business. This trend has made its way into every industry. It is so big that it is likely to disrupt everything from consumer behavior to almost every other field of our lives. Moreover, the advantages connected with Big Data have the potential to change the way organizations run their diverse operations.

Understanding big data well continues to be a challenge for a number of the industries implementing this technology. According to a Gartner survey, 75% of organizations are investing in big data or plan to do so over the next two years. Most companies have many goals for investing in big data projects, but the most important one is to enrich customer experience, achieve targeted ...


Read More on Datafloq
How to Improve Big Data Analytics with Machine Learning

How to Improve Big Data Analytics with Machine Learning

The amount of data generated by businesses every single day is colossal – and it’s growing at an exponential rate. Many organisations are now reporting that they have a mass of unstructured data and no idea how to analyze it or use it in any way – from emails to social media posts, customer letters to voicemail messages, there’s now so much information to digest and analyze, businesses are finding it almost impossible to keep up. What’s the point in collecting and storing so much data if you can’t find a way to interpret it effectively?

But advances in analytics tools and software have endowed businesses with an exciting way to leverage big data: machine learning. Machine learning is described as ‘a type of artificial intelligence which provides computers with the ability to learn without being explicitly programmed’. It also focuses on the ‘development of computer programs that can teach themselves to grow and change when exposed to new data’.

Here’s a real-world example of big data and machine learning at work. When Google sends you an alert that you should set off on a journey in order for you to be on time for a meeting, that’s the result of big ...


Read More on Datafloq
Three Ways to Successfully Manage Your Marketing Data

Three Ways to Successfully Manage Your Marketing Data

Data is at the center of today's marketing strategies – essential to driving the right connections with the right people and across the right channels. With the fluidity of data moving in and out of channel systems and consumers interacting with brands through any number of touchpoints, data is constantly changing. Marketers must be extra diligent to proactively manage customer and prospect data to maintain the integrity of such a valuable business asset.

Research firm Ascend2 recently conducted a survey examining the state of marketing data management. 

Study findings included:

Measuring ROI to attribute sales resulting from the marketing data management investment is a top priority. Improving the quality and accessibility of marketing data are also top goals.



Companies fail or thrive based in large part on the quality of their business decisions. Making more accurate decisions is the most valuable benefit of using marketing data for 54% of companies.



Poor access to marketing data will limit its use. And if the marketing data is of poor quality, it will have limited usefulness. Combined, these are the most significant barriers to success.



Tactically, the most effective use of marketing data is for campaign targeting. Getting the right message to the right person at the right time requires ...


Read More on Datafloq
Big data and cloud combo spark momentous genomic medicine advances at HudsonAlpha

Big data and cloud combo spark momentous genomic medicine advances at HudsonAlpha

The next BriefingsDirect Voice of the Customer IT innovation case study explores how the HudsonAlpha Institute for Biotechnology engages in digital transformation for genomic research and healthcare paybacks.

We'll learn how HudsonAlpha leverages modern IT infrastructure and big-data analytics to power a pioneering research project incubator and genomic medicine innovator.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe new possibilities for exploiting cutting-edge IT infrastructure and big data analytics for potentially unprecedented healthcare benefits, we're joined by Dr. Liz Worthey, Director of Software Development and Informatics at the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: It seems to me that genomics research and IT have a lot in common. There's not much daylight between them -- two different types of technology, but highly interdependent. Have I got that right?

Worthey: Absolutely. It used to be that the IT infrastructure was fairly far away from the clinic or the research, but now they're so deeply intertwined that it necessitates many meetings a week between the leadership of both in order to make sure we get it right.

Gardner: And you have background in both.

Worthey: My background is primarily on the biology side, although I'm Director of Informatics and I've spent about 20 years working in the software-development and informatics side. I'm not IT Director, but I'm pretty IT savvy, because I've had to develop that skill set over the years. My undergraduate degree was in immunology, and since then, my focus has really been on genetics informatics and bioinformatics.

Gardner: Please describe what genetic informatics or genomic informatics is for our audience.

Worthey: Since 2003, when we received the first version of a human reference genome, there's been a large field involved in the task of extracting knowledge that can be used for society and health from genomic data.

A [human] genome is 3.2 billion nucleotides in length, and in there, there's a lot of really useful information. There's information about which diseases that individual may be more likely to get and which diseases they will get.

It’s also information about which drugs they should and shouldn't take; information about which types of procedures, surveillance procedures, what colonoscopies they should have. And so, the clinical aspects of genomics are really developing the analytical capabilities to extract that data in real time so that we can use it to help an individual patient.

On top of that, there's also a lot of research. A lot of that is in large-scale studies across hundreds of thousands of individuals to look for signals that are more difficult to extract from a single genome. Genomics, clinical genomics, is all of that together.

Parallel trajectory

Gardner: Where is the societal change potential in terms of what we can do with this information and these technologies?

Worthey: Genomics has existed for maybe 20 years, but the vast majority of that was the first step. Over the last six years, we've taken maybe the second or third step in a journey that’s thousands of steps long.

We're right on the edge. We couldn't do this before, because we didn't have any data. We didn't have the capability to sequence a genome cheaply enough to sequence lots of them. We also didn't have the storage to hold that data even if we could produce it, and we certainly didn't have enough compute, infrastructure-wise, to do the analysis. On top of that, we didn't have the analytical know-how or capabilities either. All of that is really coalescing at the same time.
As genomics and the sequencing technology have come up, the compute and computing technologies have come up at the same time. They're feeding each other, and genomics is now driving IT to think about things in a very different way.

Gardner: Let's dive into that a little bit. What are the hurdles technologically for getting to where you want to be, and how do you customize that or need to customize that, for your particular requirements?

Worthey: There are a number of hurdles. Certainly, there are simpler hurdles that we have to get past, like storage tied with compression: how do you compress that data so that you can store millions of genomes at an affordable price?
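To make the compression hurdle concrete, here is a minimal sketch of the simplest domain-specific trick: packing nucleotides at two bits each, roughly a 4x saving over ASCII before any general-purpose compressor runs. It ignores real-world wrinkles such as ambiguity codes and quality scores; production formats like CRAM go further by storing only differences against a reference genome.

    # Minimal sketch: 2-bit packing of a DNA sequence (A, C, G, T only).
    CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
    BASE = {v: k for k, v in CODE.items()}

    def pack(seq):
        """Pack 4 nucleotides per byte (about 4x smaller than plain text)."""
        out = bytearray()
        for i in range(0, len(seq), 4):
            byte = 0
            for j, ch in enumerate(seq[i:i + 4]):
                byte |= CODE[ch] << (j * 2)
            out.append(byte)
        return bytes(out)

    def unpack(data, length):
        """Recover the original sequence from packed bytes."""
        out = []
        for byte in data:
            for j in range(4):
                if len(out) == length:
                    break
                out.append(BASE[(byte >> (j * 2)) & 0b11])
        return "".join(out)

    seq = "ACGTACGTAA"
    assert unpack(pack(seq), len(seq)) == seq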

A bigger hurdle is the ability to query information at a lot of disparate sites. When we think about genomic medicine, one of the things that we really want to do is share data between institutions that are geographically diverse. And the data that we want to share is millions of data points, each of which has hundreds or thousands of annotations or curations.

Those are fairly complex queries, even when you're doing it in one site, but in order to really change the practice of medicine, we have to be able to do that regionally, nationally, and globally. So, the analytics questions there are large.

We have 3.2 billion data points for each individual. The data is quite broad, but it’s also pretty deep. One of the big problems is that we don’t have all the data that we need to do genomic medicine. There's going to be data mining -- generate the data, form a hypothesis, look at the data, see what you get, come back with a new hypothesis, and so on.

Finally, one of the problems that we have is that a lot of the algorithms you might use only exist in the brains of MDs, other clinical folks, or researchers. There is really a lot of human-computer interaction work to be done so that we can extract that knowledge.

There are lots of problems. Another big problem is that we really want to put this knowledge in the hands of the doctor while they have seven minutes to see the patient. So, it’s also delivery of answers at that point in time, and the ability to query the data by the person who is doing the analysis, which ideally will be an MD.

Cloud technology

Gardner: Interestingly, the emergence of cloud methods and technology over the past five or 10 years would address some of those issues about distributing the data effectively -- and also perhaps getting actionable intelligence to a physician in an actual critical-care environment. How important is cloud to this process and what sort of infrastructure would be optimal for the types of tasks that you have in mind?

Worthey: If you had asked me that question two years ago, on the genomic medicine side, I would have said that cloud wasn't really part of the picture -- and not for technology reasons, but for business reasons. There were a lot of questions around privacy and sharing of healthcare information, and hospitals didn't like the idea.

They were very reluctant to move to the cloud. Over the last two years, that has started to change; enough of them decided to do it that everybody now views it as something that is permissible.

Cloud is absolutely necessary in many ways, because we have periods where lots of data has to be computed and analytics have to be run, and then periods where new information is just coming off the sequencer. So, it's that perfect crest and trough.

If you don't have the ability to deal with that sort of fluctuation, if you buy a certain amount of hardware and you only have it available in-house, your pipeline becomes impacted by the crests and then often sits idle for a long time.
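The scheduling logic behind that crest-and-trough argument fits in a few lines. This is a hypothetical sketch with invented capacities, not HudsonAlpha's actual scheduler:

    # Route jobs in-house until local nodes saturate, then burst to cloud.
    LOCAL_NODES = 32        # fixed on-premises capacity (invented)
    MAX_WAIT_JOBS = 8       # backlog tolerated before bursting (invented)

    def place_job(running_local, queued_local):
        if running_local < LOCAL_NODES:
            return "local"          # spare in-house capacity
        if queued_local < MAX_WAIT_JOBS:
            return "local-queue"    # a small backlog is acceptable
        return "cloud"              # crest: burst to elastic capacity

    # A crest of jobs arriving off the sequencer:
    for queued in (0, 5, 20):
        print(queued, "->", place_job(running_local=32, queued_local=queued))

During the troughs, the same policy leaves the cloud bill at zero instead of leaving purchased hardware idle.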
But it’s also important to have stuff in-house, because sometimes, you want to do things in a different way. Sometimes, you want to do things in a more secure manner.

Hybrid is kind of the poster child for many of the new technologies coming out that look at both of those -- that allow you to run things in-house and also run the same jobs, on the same data, in the cloud. So, it's key.

Gardner: That brings me to the next question about this concept of genomics as a service or a platform to support genomics as a service. How do you envision that and how might that come about?

Worthey: When we think about the infrastructure to support that, it has to be something flexible and it has to be provided by organizations that are able to move rapidly, because the field is moving really quickly.

It has to be infrastructure that supports this hypothesis-driven research, and it has to be infrastructure that can deal with these huge datasets. Much of the data is ordered, organized, and well-structured, but because it's healthcare, a lot of the information that we use as part of the interpretation phase of genomic medicine is completely unstructured. There needs to be support for extraction of data from silos.

My dream is that the people who provide these technologies will also help us deal with some of these boundaries, the policy boundaries, to sharing data, because that’s what we need to do for this to become routine.

Data and policy

Gardner: We've seen some of that when it comes to other forms of data, perhaps in the financial sector. More and more, we're seeing tokenization, authentication, and encryption, where data can exist for a period of time with a certain policy attached to it, and then something happens to the data as a result of that policy. Is that what you're referring to?

Worthey: Absolutely. It's really interesting to come to a meeting like HPE Discover because you get to see what everybody else is doing in different fields. Much of the things that people in my field have regarded as very difficult are actually not that hard at all; they happen all the time in other industries.

A lot of this -- the encryption, the encrypted data sharing, the ability to set access controls in a particular way that lasts only a certain amount of time for a particular set of users -- seems complex, but it happens all the time in other fields. A big part of this is talking to people who have a lot of experience in a regulated environment -- just not this regulated environment -- learning the language they use to talk to the people who set policy there, transferring that to our policy makers, and ideally getting them together to talk to one another.
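Those time-boxed, user-scoped access grants are indeed standard machinery elsewhere. As a minimal sketch, with an invented secret and claims (a real system would use a managed key and a standard token format such as JWT):

    import base64, hashlib, hmac, json, time

    SECRET = b"demo-secret"   # illustration only; use a managed key in practice

    def grant(user, resource, ttl_seconds):
        # Issue a signed grant that self-expires after ttl_seconds.
        claim = base64.urlsafe_b64encode(json.dumps(
            {"user": user, "res": resource,
             "exp": time.time() + ttl_seconds}).encode())
        sig = base64.urlsafe_b64encode(
            hmac.new(SECRET, claim, hashlib.sha256).digest())
        return (claim + b"." + sig).decode()

    def check(token):
        claim, sig = token.encode().split(b".")
        expected = base64.urlsafe_b64encode(
            hmac.new(SECRET, claim, hashlib.sha256).digest())
        if not hmac.compare_digest(sig, expected):
            return False          # tampered token
        return json.loads(base64.urlsafe_b64decode(claim))["exp"] > time.time()

    token = grant("analyst@site-b", "genome-42", ttl_seconds=3600)
    print(check(token))           # True until the hour is up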

Gardner: Liz, you mentioned the interest layers in getting your requirements to the technology vendors, cloud providers, and network providers. Is that under way? Is that something that's yet to happen? Where is the synergy between the genomic research community and the technology-vendor platform provider community?

Worthey: This is happening fast. For genomics, there's been a shift in the volume of genomic data that we can produce with some new sequencing technology that's coming. If you're a provider of hardware or software solutions to deal with big data, look at genomics, because we're probably going to overtake many of those other industries in terms of the volume and complexity of the data that we have.

The reason that's really interesting is that you get invited to talk at forums where there are lots of technology companies, you make them aware of the work that has to be done in the fields of medicine and genomic research, and then you can start having those discussions.

A lot of the things that those companies are already doing, the use cases, are similar and maybe need some refinement, but a lot of that capability is already there.

Gardner: It's interesting that you’ve become sort of the “New York” of use cases. If you can make it there, you can make it anywhere. In other words, if we can solve this genomic data issue and use the cloud fruitfully to distribute and gather -- and then control and monitor the data as to where it should be under what circumstances -- we can do just about anything.

Correct me if I am wrong, though. We're using data in the genomic sense for population groups. We're winnowing those groups down into particular diseases. How farfetched is it to think about individuals having their own genomic database that would follow them like an authenticated human design? Is that completely out of the bounds? How far would that possibly be?

Technology is there

Worthey: I’ve had my genome sequenced, and it’s accessible. I could pick it up and look at it on the tools that I developed through my phone sitting here on the table. In terms of the ability to do that, a lot of that technology is already here.

The number of people that are being sequenced is increasing rapidly. We're already using genomics to make diagnosis in patients and to understand their drug interactions. So, we are here.

One of the things that we are talking about just now is, at what point in a person’s life should you sequence their genome. I and a number of other people in the field believe that that is earlier, rather than later, before they get sick. Then, we have that information to use when they get those first symptoms. You are not waiting until they're really ill before you do that.

I can’t imagine a future where that's not what's going to happen, and I don’t think that future is too far away. We're going to see it in our lifetimes, and our children are definitely going to see it in theirs.

Gardner: The inhibitors, though, would be more of an ethical nature, not a technological nature.

Worthey: And policy, and society; the society impact of this is huge.

The data that we already have, clinical information, is really for that one person, but your genome is shared among your family, even distant relatives that you’ve never met. So, when we think about this, there are many very hard ethical questions that we have to think about. There are lots of experts that are working on that, but we can’t let that get in the way of progress. We have to do it. We just have to make sure we do it right.

Gardner: To come back down a little bit toward the technology side of things, seeing as so much progress has been made and that there is the tight relationship between information technology and some of the fantastic things that can happen with the proper knowledge around genomic information, can you describe the infrastructure you have in place? What’s working? What do you use for big-data infrastructure, and cloud or hybrid cloud as well?

Worthey: I'm not on the IT side, but I can tell you about the other side, and I can talk a little bit about the IT side as well. In terms of the technologies that we use to store all of that varying information, we're currently using Hadoop and MongoDB. We finished our proof of concept with HPE, looking at their Vertica solution.

We have to work out what the next steps might be for our proof of concept. Certainly, we're very interested in looking at the solutions that they have here; they fit our needs. The issue being addressed on that side is lots of variants and complex queries that you need to answer really fast.
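To give a feel for that workload, here is a hypothetical sketch of such a query in Python. The variants schema, thresholds, and connection details are all invented; the vertica_python driver is one DB-API option among many:

    import vertica_python  # assumed driver; any DB-API connection works the same way

    QUERY = """
    SELECT chrom, pos, ref, alt,
           COUNT(DISTINCT sample_id) AS carriers
    FROM   variants
    WHERE  gene = 'BRCA1'
      AND  predicted_impact = 'HIGH'
      AND  population_frequency < 0.001
    GROUP  BY chrom, pos, ref, alt
    ORDER  BY carriers DESC
    LIMIT  20
    """

    conn_info = {"host": "db.example.org", "port": 5433,
                 "user": "analyst", "password": "...", "database": "genomics"}

    with vertica_python.connect(**conn_info) as conn:
        cur = conn.cursor()
        cur.execute(QUERY)   # a columnar engine scans only the filtered columns
        for row in cur.fetchall():
            print(row)

Filter-and-aggregate queries of this shape, run across thousands of samples, are exactly what columnar stores are built to answer quickly.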
On the other side, one of the technological hurdles that we have to meet is the unstructured data. We have electronic health record (EHR) information that’s coming in. We want to hook up to those EHRs and we want to use systems to process that data to make it organized, so that we can use it for the interpretation part.

In-house solution

We developed in-house solutions that we're using right now that allow humans to come in, look at that data, and select the terms from it. So, you'd select disease terms. And then, we have in-house solutions to map them to the genomic side. We're looking at things like HPE's IDOL as a proof-of-concept (POC) on that side. We're talking to some EHR companies about how to hook up the EHR, through those solutions, to our software to make it a seamless product that would give us all of that.
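The term-selection step can be pictured with a toy dictionary matcher. A production system would use a real entity-extraction engine (IDOL, in the POC mentioned above); the term-to-code table below is an invented subset in HPO-style notation:

    # Toy matcher: find known phenotype terms in a free-text note and map
    # them to ontology-style codes. Real pipelines use NLP engines; this
    # only shows the shape of the task.
    TERM_TO_CODE = {
        "seizure": "HP:0001250",
        "muscle weakness": "HP:0001324",
        "developmental delay": "HP:0001263",
    }

    def extract_phenotypes(note):
        text = note.lower()
        return [(t, c) for t, c in TERM_TO_CODE.items() if t in text]

    note = "Pt presents with recurrent seizure activity and mild muscle weakness."
    print(extract_phenotypes(note))
    # [('seizure', 'HP:0001250'), ('muscle weakness', 'HP:0001324')]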

In terms of hardware, we do have HPE hardware in-house; I think we have 12 petabytes of their storage. We also have DataDirect Networks hardware with a general parallel file system solution. We even have graphics processors (GPUs) for some of the analysis that we do -- a large bank of them, because for certain types of problems that we have to solve they're much faster. So we're pretty IT-rich, with a lot of heavy investment on the IT side.

Gardner: And cloud -- any preference to the topology that works for you architecturally for cloud, or is that still something you are toying with?

Worthey: We're currently looking at three different solutions that are all cloud solutions. We not only do the research and the clinical, but we also have a lab that produces lots of data for other customers, a lab that produces genomic data as a service.

They have a challenge of getting that amount of data returned to customers in a timely fashion. So, there are solutions that we're looking at there. There are also, as we talked at the start, solutions to help us with that in-flow of the data coming off the sequencers and the compute -- and so we're looking at a number of different solutions that are cloud-based to solve some of those challenges.

Gardner: Before we close, we've talked about healthcare and population impacts, but I should think there's also a commercial aspect to this. That kind of information will lend itself to entrepreneurial activities and to products and services in great demand in the marketplace. Is that something you're involved with as well, and wouldn't that help foot the bill for some of these many costly IT infrastructure investments?

Worthey: One of the ways that HudsonAlpha Institute was set up was just that model. We have a research, not-for-profit side, but we also have a number of affiliate companies that are for-profit, where intellectual property and ideas can go across to that side and be used to generate revenue that funds the research and keeps us moving on the cutting edge.

We do have a services lab that does genomic sequencing and analytics; you can order that from them. We also serve a lot of people who have government contracts for this type of work. And then, we have an entity called Envision Genomics. For disclosure, I'm one of the founders of that entity. It's focused on empowering people to do genomic medicine and working with lots of different solution providers to get genomic medicine done everywhere it's applicable.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Six Practical Data Center Management Tips

Six Practical Data Center Management Tips

Nowadays, big data platforms are everywhere, and an overwhelming majority of people store digital content in the cloud. The hyperactive lifestyle of today's professionals, forced to access their work on multiple platforms as they travel, contributes to this.

Of course, big data is about much more than cloud-based systems. Businesses, government agencies, schools, and even the average individual now rely on data centers to save, organize and facilitate valuable information.

Data center administrators are more important than ever — at least if you want a reliable system in place. But what about the data center itself?

Experienced administrators are a given, along with a substantial maintenance team. But what improvements can be made to the actual data center and related hardware? What can you do to ensure your data center operates smoothly?

Maximize Storage Space

You can never have enough storage space, especially in a data center designed to hold thousands of terabytes or petabytes of information.

It’s not just about maximizing the amount of storage, though — you also want the space available to be efficient. It won’t do you any good to make use of old, outdated hard drives or storage systems. Read-write speeds won’t be able to keep up with customer demand.
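One rough way to check whether a volume can keep up is a quick sequential-throughput test. The sketch below uses a placeholder path and sizes; note that the read pass will be flattered by the OS page cache unless the test file is much larger than RAM:

    import os, time

    PATH = "/data/throughput_test.bin"   # hypothetical mount point
    SIZE = 512 * 1024 * 1024             # 512 MB test file
    CHUNK = b"\0" * (4 * 1024 * 1024)    # 4 MB writes

    start = time.time()
    with open(PATH, "wb") as f:
        for _ in range(SIZE // len(CHUNK)):
            f.write(CHUNK)
        f.flush()
        os.fsync(f.fileno())             # force data to disk before timing stops
    print("write MB/s:", round(SIZE / 1e6 / (time.time() - start)))

    start = time.time()
    with open(PATH, "rb") as f:
        while f.read(len(CHUNK)):
            pass                         # cached reads will look optimistic
    print("read MB/s:", round(SIZE / 1e6 / (time.time() - start)))
    os.remove(PATH)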

Don’t forget ...


Read More on Datafloq
How Holographic Computing can Transform the Construction and Architecture Industries

How Holographic Computing can Transform the Construction and Architecture Industries

The construction industry is a multibillion-dollar industry. Yet, many of their processes have remained unchanged. Even though many technologies have been introduced to the construction and architecture industry over the last decade, holographic computing enabled by IoT devices like Microsoft Hololens is fundamentally changing how people visualize things, collaborate, and work in the construction industry.

Microsoft’s HoloLens, a one-of-a-kind wearable holographic computer, is a revolutionary gizmo powered by Windows 10 that enables users to interact with high-definition holograms in the real world. In other words, it lets you see holograms melded with reality. This enables architects and construction managers to be more imaginative while changing the way they think about design.

A gaze into the current scenario

Architects have to deal with space, shapes, and 3D objects as part of their job. Moreover, they need to create designs and translate them into sets of 2D documents. It has always been difficult for architects, engineers, and contractors to collaborate remotely and to interpret physical and digital information, as well as dimensional relationships, effectively.

Until now, paper drawings have been one of the most common and effective ways to interact with each other and show the dynamics. Though a few sophisticated software and tools are ...


Read More on Datafloq
Why the 2016 Hype Cycle for Emerging Technologies is all about Data

Why the 2016 Hype Cycle for Emerging Technologies is all about Data

Every year, Gartner publishes the Gartner Hype Cycle for Emerging Technologies, and today it revealed the 2016 edition. Last year, Gartner tempered expectations for Big Data by predicting it would take another 5-10 years to reach the plateau of productivity. This year, Gartner finally added “emerging” technologies such as the Blockchain and Machine Learning. Let’s have a look at the 2016 Hype Cycle for Emerging Technologies and see what it means.



Gartner identified three key technology trends that organisations need to track in order to gain competitive advantage:

1. The Perceptual Smart Machine Age

Gartner added ‘General Purpose Machine Intelligence’ as an ‘Innovation Trigger’ to this year’s hype cycle and expects it to take more than 10 years to reach the plateau of productivity. Of course, machine intelligence is nothing new, but that is (very) specific machine intelligence, i.e. a machine that is extremely good at doing one simple task. General purpose machine intelligence is something different and requires extreme amounts of computing power and near-endless amounts of data, which is why it will probably take a lot longer. It is also called Artificial General Intelligence, and it is an emerging field dealing with the development of ‘thinking machines’ with intelligence that is ...


Read More on Datafloq
5 Ways to Use Geo-Location Data to Transform Your Retail Marketing Strategy

5 Ways to Use Geo-Location Data to Transform Your Retail Marketing Strategy

Mobile has completely transformed the retail landscape. According to Pew’s October 2015 research, 68% of Americans have smartphones and 45% have tablet computers.  Mobile ownership is growing at record speeds, giving retailers unprecedented opportunities to target consumers by their geolocation data. The ability for mobile devices to track and report a person’s location in real time with a reliable degree of accuracy continues to evolve as new technologies continue to revolutionize the mobile landscape.

Here’s a look at 5 ways retailers can boost customer acquisition and retention by tapping into a consumer’s geolocation data.

Location-Based Marketing

Beacon technology allows brands to pinpoint where a customer is at any given moment, and then send them push notifications with coupons, promotions, or other targeted offers. The use of beacons is further accelerating the shift to mobile commerce by providing immediate relevancy and value to customers. For example, a clothing retailer may send a promotion to a consumer near one of its locations, or while the individual is visiting a competitor.
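In outline, the server side of such a program can be very small. This is an invented sketch: the zones, offers, and send_push stub stand in for a vendor's beacon platform and push-notification service:

    # When a beacon sighting arrives, look up the zone and push a matching offer.
    OFFERS_BY_ZONE = {
        "entrance": "Welcome back! 10% off today.",
        "denim":    "Buy one, get one 50% off jeans.",
        "checkout": "Add a belt for $5 at the register.",
    }

    def send_push(customer_id, message):
        print(f"push -> {customer_id}: {message}")  # stand-in for a push service

    def on_beacon_sighting(customer_id, beacon_zone, opted_in):
        if not opted_in:          # respect consent before targeting anyone
            return
        offer = OFFERS_BY_ZONE.get(beacon_zone)
        if offer:
            send_push(customer_id, offer)

    on_beacon_sighting("cust-123", "denim", opted_in=True)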

A study conducted by beacon platform Swirl found that 73 percent of shoppers who received a beacon-triggered message on their smartphone said it increased their likelihood of making a purchase during a store visit, while 61 percent said the message ...


Read More on Datafloq
5 Ways Big Data is Changing the Gambling Industry

5 Ways Big Data is Changing the Gambling Industry

Microgaming, launched in 1994, was the first online gambling website. Since its introduction, the gambling industry has come a long way toward full digitalization. Today, with online gambling insanely popular, casinos need to acquire advanced big data tools in order to accurately determine the odds and personalize their games for different types of players. Most industry leaders have already hired teams of big data engineers and technicians. In this article, we will try to determine how big data influences the gambling industry.

It helps bookmakers establish more realistic odds

Bookmakers were the first gambling professionals who implemented big data analysis in their work processes. Big data helps them analyze previous games, determine winning and point scoring patterns and establish more realistic odds. A survey conducted by Talented has determined that more than 72% of UK bookmaking customers believe that their bookmaker doesn’t offer them a personalized service.

Also, 67% of gamblers will be loyal to the brand that offers a more customized service. Personalization of the bookmaking process can be done by introducing tailored odds and offers, exclusive offers in real time and targeted push notifications.
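The arithmetic behind "more realistic odds" is straightforward once a model has estimated a win probability from historical games. A minimal sketch, with made-up numbers:

    def decimal_odds(p_win, margin=0.05):
        """Fair odds are 1/p; shading by (1 + margin) builds in the book's edge."""
        return round(1.0 / (p_win * (1.0 + margin)), 2)

    # A model trained on past games estimates a 40% win probability:
    print(decimal_odds(0.40))   # ~2.38, instead of the 'fair' 2.50

Personalization then becomes a matter of varying the margin or the offer per customer segment rather than quoting one price to everyone.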

It helps poker players in developing their game strategies

In recent years, poker has become much more ...


Read More on Datafloq
Why Blending Data Analytics and Gut-Feeling Benefits your Business

Why Blending Data Analytics and Gut-Feeling Benefits your Business

Understanding the impact of Big Data is not self-evident for many companies. Big Data offers almost endless possibilities and as such organizations are overwhelmed. Big Data requires different technologies, new IT systems, new processes and a different way of working. In addition, Big Data requires a different culture and changing your company culture is always hard, especially when new technology is involved.

In order to be successful with big data, you need a culture that incorporates data-driven decision-making. That does not mean, however, that organizations should only focus on big data analytics and that they should ignore gut-feeling. Gut-feeling, or intuitive synthesis, is an important aspect of decision-making and successful companies are capable of combining the two in what has become known as Design Thinking.

Design Thinking: A Creative and Data-Driven Process

In the past decade, design thinking, also known as a human-centered approach to innovation, has become a popular practice at organizations around the world for generating innovative and competitive strategies. Although the history of design thinking can be traced back to the 1960s, its adaptation for business purposes followed in 1991, led by IDEO founder David Kelley. Specialized design thinking firms such as IDEO help organizations create ...


Read More on Datafloq
Behind the Scenes with IoT; How Big Data Can Help

Behind the Scenes with IoT; How Big Data Can Help

It is hard to avoid hearing the term IoT or about the tremendous business opportunities it creates. But what does IoT really mean to the beneficiary (i.e., the end customer)? What happens behind the scenes?

The Internet of Things (IoT), as defined by Wikipedia, is the network of physical devices, vehicles, buildings and other items—embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data.

This definition does not give enough credit to what happens beyond the physical devices. 

The Scope of IoT

The IoT opportunity is commonly segmented into unique industries and applications as pictured in the figure below provided by IoT Analytics.

Figure 1

The beneficiaries' perception of IoT will most certainly vary across Industry/Application combinations. Behind the scenes, there are four IoT components that make an IoT application complete. These components seem to be the same across all industries and applications.

Four Components of an IoT Application

Figure 2

Smart Connected Device

A smart connected device is the “Star of the Show” in the IoT world. Examples are Fitbits and Nest thermostats in the consumer world, and smart meters, security monitors, and vehicle tracking in the business world. A device qualifying as “Smart” and “Connected” will consist of five layers, as shown in Figure 3.

Figure 3

Physical Properties (Form Factor)

What you see ...


Read More on Datafloq
What is DataOps and Why Do We Need It?

What is DataOps and Why Do We Need It?

As businesses churn out more and more data every day, the emergence of a new set of best practices has been helping to improve the coordination between the analysis of this data and the general operation of a business. These best practices are known as DataOps – and they’ve become essential for any business looking to compete in the world of real-time BI.

What is DataOps?

So what exactly is DataOps? It’s essentially a method of managing data, with greater focus on communication and integration. It also promotes automation, as well as collaboration between all of those who will come into contact with the data: the engineers and the data scientists. DataOps bridges the gap between those who collate the data, those who analyse the data and those who will put the findings from that analysis to good use.
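As a toy illustration of the method, the sketch below wires a data-quality gate between pipeline stages so that bad records stop the run instead of silently reaching analysts. All stage names and rules are invented:

    def extract():
        return [{"user": "a", "spend": 12.0}, {"user": "b", "spend": None}]

    def validate(rows):
        """Fail fast on bad records instead of letting them reach analysis."""
        bad = [r for r in rows if r["spend"] is None]
        if bad:
            raise ValueError(f"{len(bad)} record(s) missing 'spend'")
        return rows

    def transform(rows):
        return sum(r["spend"] for r in rows) / len(rows)

    try:
        print("avg spend:", transform(validate(extract())))
    except ValueError as err:
        print("pipeline halted:", err)   # alert the team; don't publish bad numbers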

Why do we need DataOps?

The first reason we need DataOps – a streamlined, effective process – is because time is of the essence in the world of business. There’s a reason that so much emphasis has been placed on real-time data gathering and analysis – because things move fast, and a new opportunity could arrive and have disappeared within the blink of an eye.
We ...


Read More on Datafloq
Machine Learning Becomes Mainstream: How to Increase Your Competitive Advantage

Machine Learning Becomes Mainstream: How to Increase Your Competitive Advantage

First there was big data – extremely large data sets that made it possible to use data analytics to reveal patterns and trends, allowing businesses to improve customer relations and production efficiency. Then came fast data analytics – the application of big data analytics in real-time to help solve issues with customer relations, security, and other challenges before they became problems. Now, with machine learning, the concepts of big data and fast data analytics can be used in combination with artificial intelligence (AI) to avoid these problems and challenges in the first place.

So what is machine learning, and how can it help your business? Machine learning is a subset of AI that lets computers “learn” without being explicitly programmed. Through machine learning, computers can develop the ability to learn through experience and search through data sets to detect patterns and trends. Instead of extracting that information for human comprehension and application, the computer uses it to adjust its own program actions.
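For a concrete, minimal picture of that idea, the following sketch fits a classifier on a handful of made-up customer records using scikit-learn. No rule for "churn" is programmed anywhere; the model infers it from the examples:

    from sklearn.linear_model import LogisticRegression

    # Features: [visits_per_month, avg_basket_dollars]; label: churned (1) or not (0)
    X = [[1, 10], [2, 15], [1, 5], [8, 60], [9, 80], [7, 55]]
    y = [1, 1, 1, 0, 0, 0]

    model = LogisticRegression().fit(X, y)
    print(model.predict([[2, 12], [8, 70]]))  # likely [1, 0]: churner, loyal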

What does that mean for your business? Machine learning can be used across industries, including but not limited to healthcare, automotive, financial services, cloud service providers, and more. With machine learning, professionals and businesses in these industries can get improved ...


Read More on Datafloq
What is Information Governance? Why Do I Need It?

What is Information Governance? Why Do I Need It?

The explosive growth of information has been the defining characteristic of our era, the Age of Information. The amount of data, the number of data sources, the uses for data, and the routes that data travel have all been growing at an exponential rate, creating new industries for defining, processing, collecting, accessing, and curating information. In such an environment, everyone recognizes the importance of Information Governance, but how to go about this massive task is harder to grasp.

What is Information Governance (IG)? How does it differ from IT Governance, Information Management, or Data Governance? In this article, we will try to shed a little light on Information Governance – an emerging area of data management that focuses specifically on business processes and compliance issues.

What is Information Governance?

Information governance is a set of decision rights and responsibilities that allows the different aspects of information -- valuation, creation, storage, use, archiving, and deletion -- to function properly. To ensure information is used effectively, information governance includes the purposes, policies, standards, and processes that help an organization achieve its goals.
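To make "decision rights and responsibilities" concrete, here is a hypothetical sketch of the kind of record an IG program might keep per dataset; the dataset, roles, and retention period are invented:

    from dataclasses import dataclass

    @dataclass
    class GovernanceRecord:
        dataset: str
        owner: str              # accountable decision-maker
        steward: str            # day-to-day custodian
        retention_years: int
        lifecycle_rights: dict  # lifecycle stage -> role allowed to decide

    registry = {
        "customer_orders": GovernanceRecord(
            dataset="customer_orders",
            owner="VP Sales",
            steward="Data Engineering",
            retention_years=7,
            lifecycle_rights={"create": "apps", "archive": "steward",
                              "delete": "owner"},
        )
    }

    print(registry["customer_orders"].lifecycle_rights["delete"])  # 'owner'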

Here, decision rights means determining who owns the data and who makes the decisions about it. By defining the decision-makers and owners, we can now assign ...


Read More on Datafloq
Cybersecurity crosses the chasm: How IT now looks to the cloud for best security

Cybersecurity crosses the chasm: How IT now looks to the cloud for best security

The next BriefingsDirect cybersecurity innovation and transformation panel discussion explores how cloud security is rapidly advancing, and how enterprises can begin to innovate and prevail over digital disruption by increasingly using cloud-defined security.

We'll examine how a secure content-collaboration services provider removes the notion of organizational boundaries so that businesses can better extend processes. And we'll hear how fewer boundaries and cloud-based security together support transformative business benefits.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To share how security technology leads to business innovations, we're joined by Daren Glenister, Chief Technology Officer at Intralinks in Houston, and Chris Steffen, Chief Evangelist for Cloud Security at HPE. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Daren, what are the top three trends driving your need to extend security and thereby preserve trust with your customers?

Glenister: The top thing for us is speed of business -- people being able to do business beyond boundaries -- and how they can enable the business rather than just protect it. In the past, security has always been about how we shut things down and stop data. But now it's about how we do it all securely and how we perform business outside of the organization. So, it's enabling business.

The second thing we've seen is compliance. Compliance is a huge issue for most of the major corporations. You have to be able to understand where the data is and who has access to it, and to know who's using it and make sure that they can be completely compliant.

The third thing is primarily the shift between security inside and outside of the organization. It's been a fundamental shift for us: security has moved from people trusting their own infrastructure to using a third party who can provide that security to a far higher standard, because that's what they do all day, every day. That security shift from on-premises to the cloud is the third big driver for us, and we've seen it in the market.

Gardner: You're in a unique position to be able to comment on this. Tell us about Intralinks, what the company does, and why security at the edge is part of your core competency.

Secure collaboration

Glenister: We're a software-as-a-service (SaaS) provider and we provide secure collaboration for data wherever that data is, whether it's inside a corporation or shared outside. Typically, once people share data outside, whether through e-mail or any other method, the commercial tools out there have lost control of that data.

We have the ability to actually lock that data down, control it, and put the governance and compliance around it to secure it -- to know where the high-value intellectual property (IP) is and who has access to it -- and still be able to share. And, if data is at risk, we can revoke access for someone who has left the organization.

Gardner: And these are industries that have security as a paramount concern. So, we’re talking about finance and insurance. Give us a little bit more indication of the type of data we’re talking about.

Glenister: It's anybody with high-value IP or compliance requirements -- banking, finance, healthcare, life sciences, for example, and manufacturing. Even when you’re looking at manufacturing overseas and you have IP going over to China to manufacture your product, your plans are also being shared overseas. We've seen a lot of companies now asking how to protect those plans and therefore, protect IP.
Gardner: Chris, Intralinks seems to be ahead of the curve, recognizing how cloud can be an enabler for security. We're surely seeing a shift in the market, at least I certainly am. In the last six months or so, companies that were saying that security was a reason not to go to the cloud are now saying that security is a reason they're going to the cloud. They can attain security better. What's happened that has made that perspective flip?

Steffen: I don't know exactly what’s happened, but you're absolutely right; that flip is going on. We've done a lot of research recently and shown that when you’re looking at inherent barriers going to a cloud solution, security and compliance considerations are always right there at the top. We commissioned the study through 451 Research, and we kind of knew that’s what was going on, but they sure nailed it down, one and two, security and compliance, right there. [Get a copy of the report.]

The reality, though, is that the C-table, executives, IT managers, those types, are starting to look at the massive burden of security and hoping to find help somewhere. They can look at a provider like Intralinks, they can look at a provider like HPE, and ask, "How can they help us meet our security requirements?"

They can’t just third-party their security requirements away. That’s not going to cut it with all the regulators that are out there, but we have solutions. HPE has a solution, Intralinks has solutions, a lot of third-party providers have solutions that will help the customer address some of those concerns, so those guys can actually sleep at night.

Gardner: We're hearing so much about digital disruption in so many industries, and we're hearing about why IT can’t wait, IT needs to be agile and have change in the business model to appeal to customers to improve their user experience.

It seems that security concerns have been a governor on that. "We can’t do this because 'blank' security issue arises." It seems to me that it's a huge benefit when you can come to them and say, "We're going to allow you to be agile. We're going to allow you to fight back against disruption because security can, in fact, be managed." How far are we to converting disruption in security into an enabler when you go to the cloud?

Very difficult

Glenister: The biggest thing for most organizations is they're large, and it’s very difficult to transform just the legacy systems and processes that are in-place. It's very difficult for organizations to change quickly. To actually drive that, they have to look at alternatives, and that’s why a lot of people move into cloud. Driving the move to the cloud is, "Can we quickly enable the business? Can we quickly provide those solutions, rather than having to spend 18 months trying to change our process and spend millions of dollars doing it?"

Enablement of the business is actually driving the need to go to the cloud, and obviously will drive security around that. To Chris’s point a few minutes ago, not all vendors are the same. Some vendors are in the cloud and they're not as secure as others. People are looking for trusted partners like HPE and Intralinks, and they are putting their trust and their crown jewels, in effect, with us because of that security. That’s why we work with HPE, because they have a similar philosophy around security as we do, and that’s important.

Steffen: The only thing I would add to that is that security is not only a concern of the big business or the small business; it’s everybody’s concern. It’s one of those things where you need to find a trusted provider. You need to find that provider that will not only understand the requirements that you're looking for, but the requirements that you have.

This is my opinion, but when you're kicking tires and looking at your overall compliance infrastructure, there's a pretty good chance you had to have that compliance for more than a day or two. It’s something that has been iterative; it may change, it may grow, whatever.

So, when you're looking at a partner, a lot of different providers will start to at least try to ensure that you don’t start at square-one again. You don’t want to migrate to a cloud solution and then have all the compliance work that you’ve done previously just wiped away. You want a partner that will map those controls and that really understands those controls.

Perfect examples are in the financial services industry. There are 10 or 11 regulatory bodies that some of the biggest banks in the world all have to be compliant with. It’s extremely complicated. You can’t really expect that Big Bank 123 is going to just throw away all that effort, move to whatever provider, and hope for the best. Obviously, they can’t be that way. So the key is to take a map of those controls, understand those controls, then map those controls to your new environment.

Gardner: Let’s get into a little bit of the how ... How this happens. What is it that we can do with security technology, with methodologies, with organizations that allow us to go into cloud, remove this notion of a boundary around your organization and do it securely? What’s the secret sauce, Daren?

Glenister: One of the things for us, being a cloud vendor, is that we can protect data outside. We have the ability to actually embed the security into documents wherever documents go. Instead of just having the control of data at rest within the organization, we have the ability to actually control it in motion inside and outside the perimeter.

You have the ability to control that data, and if you think about sharing with third parties, quite often people say, "We can’t share with a third-party because we don’t have compliance, we don’t have a security around it." Now, they can share, they can guarantee that the information is secure at rest, and in motion.
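One way to picture control that follows the document is envelope encryption with a revocable key service: the ciphertext can travel anywhere, but it is readable only while the owner's service still honors the grant. This conceptual sketch uses the Python cryptography package; the in-memory key_service dict and grant IDs are stand-ins, not Intralinks' actual mechanism:

    from cryptography.fernet import Fernet

    key_service = {}   # grant_id -> key (stand-in for a real key service)

    def share(doc, grant_id):
        key = Fernet.generate_key()
        key_service[grant_id] = key
        return Fernet(key).encrypt(doc)   # this ciphertext can travel anywhere

    def open_doc(blob, grant_id):
        key = key_service.get(grant_id)
        if key is None:
            raise PermissionError("access revoked")
        return Fernet(key).decrypt(blob)

    blob = share(b"Q3 acquisition terms", grant_id="partner-7")
    print(open_doc(blob, "partner-7"))    # readable while the grant exists
    del key_service["partner-7"]          # the user leaves: revoke the grant
    try:
        open_doc(blob, "partner-7")
    except PermissionError as err:
        print(err)                        # 'access revoked'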

Typically, if you look at most organizations, they have at-rest data covered. Those systems and procedures are relative child's play; that's been covered for many years. The challenge is data in motion: how do you extend working with third parties and with outside organizations?

Innovative activities

Gardner: It strikes me that we're looking at these capabilities through the lens of security, but isn’t it also the case that this enables entirely new innovative activities. When you can control your data, when you can extend where it goes, for how long, to certain people, under certain circumstances, we're applying policies, bringing intelligence to a document, to a piece of data, not just securing it but getting control over it and extending its usefulness. So why would companies not recognize that security-first brings larger business benefits that extend for years?

Glenister: Historically, security has always been, "No, you can't do this, let's stop." If you look at a finance environment, it's stop using thumb drives, stop using email, stop using anything, rather than easing a solution into place. We've seen a transition. Over the last six months, you're starting to see people saying, "How do we enable? How do we give people control?" As a result, you see new solutions coming out from organizations, and how they can impact the bottom line.

Gardner: Behavior modification has always a big part of technology adoption. Chris, what is it that we can do in the industry to show people that being secure and extending the security to wherever the data is going to go gives us much more opportunity for innovation? To me this is a huge enticing carrot that I don’t think people have perhaps fully grokked.

Steffen: Absolutely. And the reality of it is that it's an educational process. One of the things that I've been doing for quite some time now is trying to educate people. I can talk with a fellow CISSP about Diffie-Hellman key exchange, and I promise that your CEO does not care, and he shouldn't. He shouldn't ever have to care. That's not something that he needs to care about, but he does need to understand total cost of ownership (TCO), and he needs to understand return on investment (ROI). He needs to be able to go to bed at night understanding that his company is going to be okay when he wakes up in the morning and that his company is secure.

It’s an iterative process; it’s something that they have to understand. What is cloud security? What does it mean to have defense in depth? What does it mean to have a matured security policy vision? Those are things that really change the attitudinal barriers that you have at a C-table that you then have to get past.

Security practitioners, those tinfoil hat types -- I classify myself as one of those people, too -- truly believe that they understand how data security works and how the cloud can be secured, and they already sleep well at night. Unfortunately, they're not the ones who are writing the checks.

It's really about shifting that paradigm of education from the practitioner level, where they get it, up to the CIO and the CISO, who hopefully understand, and then up to the C-table and the CFO -- making certain that they can understand and write that check, confident that going to a cloud solution will allow them to sleep at night and allow the company to innovate. Then they'll see security as an enabler to move the business forward.
Gardner: So, perhaps it’s incumbent upon IT and security personnel to start to evangelize inside their companies as to the business benefits of extended security, rather than the glass is always half empty.

Steffen: I couldn’t agree more. It’s a unique situation. Having your -- again, I'll use the term -- tinfoil hat people talking to your C-table about security -- they're big and scary, and so on. But the reality of it is that it really is critically important that they do understand the value that security brings to an organization.

Going back to our original conversations, in the last 6 to 12 months, you're starting to see that paradigm shift a little bit, where C-table executives aren't satisfied with check-box compliance. They want to understand what it takes to be secure, so they have experts in-house and they want to understand that. If they don't have experts in-house, there are third-party partners out there that can provide that education.

Gardner: I think it’s important for us to establish that the more secure and expert you are at security the more of a differentiator you have against your competition. You're going to clean up in your market if you can do it better than they can.

Step back

Steffen: Absolutely, and take that even a step further back. People have been talking for two decades now about technology as a differentiator and how you can make a technical decision, or embrace and exploit technology, to be the differentiator in your vertical, in your segment, and so on.

The credit reporting agency that I worked for a long time ago was one of those innovators, and people thought we were nuts for doing some of the stuff that we are doing. Years later, everybody is doing the same thing now.

It really can set up those things. Security is that new frontier. If you can prove that you're more secure than the next guy, that your customer data is more secured than the next guy, and that you're willing to protect your customers more than the next guy, maybe it’s not something you put on a billboard, but people know.

Would you go to retailer A because they've had a credit-card breach, or do you decide to go to retailer B? It's not a straw man. Talk to Target, talk to Home Depot, talk to some of these big-box stores that have had breaches, and ask how their numbers looked after they had to announce a breach.

Gardner: Daren, let's go to some examples. Can you think of an example of Intralinks and a security capability that became a business differentiator or enabler?

Glenister: Think about banks at the moment, where they're working with customers. There's a drive for security. Security people have always known about security and how they can enable and protect the business.

But what’s happening is that the customers are now more demanding because the media is blowing up all of the cyber crimes, threats, and hacks. The consumer is now saying they need their data to be protected.

A perfect example is my daughter, who was applying for a credit card recently. She's going off to college. They asked her to send a copy of her passport, Social Security card, and driver’s license to them by email. She looked at me and said, "What do you think?" It's like, "No. Why would you?"

People have actually voted, saying they're not going to do business with that organization. If you look in the finance organizations now, banks and the credit-card companies are now looking at how to engage with the customer and show that they have been securing and protecting their data to enable new capabilities like loan or credit-card applications and protecting the customer’s data, because customers can vote with their feet and choose not to do business with you.

So, it’s become a business-enabler to say we're protecting your data and we have your concerns at heart.

Gardner: And it’s not to say that that information shouldn’t be made available to a credit card or an agency that’s ascertaining credit, but you certainly wouldn’t do it through email.

Insecure tool

Glenister: Absolutely, because email is the biggest sharing tool on the planet, but it’s also one of the most insecure tools on the planet. So, why would you trust your data to it?

Steffen: We've talked about security awareness, the security-awareness culture, and security-awareness programs. If you have a vendor-management program, or you're subject to vendor management from some other entity, one of the things they would also request is that you have a security-awareness program.

Even five to seven years ago, people looked at that as drudgery. It was the same thing as all the other nonsensical HR training that you have to look at. Maybe, to some extent, it still is, but the reality is that when I've given those programs before, people are actually excited. It's not only because you get the opportunity to understand security from a business perspective, but a good security professional will then apply that to, "By the way, your email is not secured here, but your email is not secured at home, too. Don’t be stupid here, but don’t be stupid there either."

We're going to fix the router passwords here -- you don't need to worry about that -- but if you have a home router, change the default password. Those sound like very simple, straightforward things, but when you share that with your employees and you build that culture, not only do you have more secure employees, but the culture of your business and the culture of security change.

In effect, you're finally getting to see that translate into what goes on outside of corporate America. People are expecting information-security parameters around the businesses they do business with. Whether it's the big-box store, the banks, or the hospitals, it really is starting to translate.

Glenister: Security is a culture. I look at a lot of companies for whom we do once-a-year certification or attestation, an online test. People click through it, and some may have a test at the end and they answer the questions and that’s it, they're done. It's nice, but it has to be a year-round, day-to-day culture with every organization understanding the implications of security and the risk associated with that.

If you don’t do that, if you don’t embed that culture, then it becomes a one-time entity and your security is secure once a year.

Steffen: We were talking about this before we started. I'm a firm believer in security awareness. One of the things that I've always done is take advantage of these pretend Hallmark holidays. The latest one was Star Wars Day. Nearly everybody has seen Star Wars or certainly heard of Star Wars at some point or another, and you can’t even go into a store these days without hearing about it.

For Star Wars Day, I created a blog to talk about how information-security failures led to the downfall of the Galactic Empire.
It was a fun blog. It wasn't supposed to be deadly serious, but the kicker is that we talked about key information-security points. You use a holiday like that to get people engaged with what's going on and educate them on some key concepts of information security -- and, accidentally, they're learning. That learning carries over to the next blog that you do, and maybe they pay a little more attention to it. Maybe they pay attention to someone piggybacking through the door, and maybe they pay attention to not putting something sensitive in an e-mail, and so on.

It's still a little iterative thing; it’s not going to happen overnight. It sounds silly talking about information security failures in Star Wars, but those are the kind of things that engage people and make people understand more about information security topics.

Looking to the future

Gardner: Before we sign off, let’s put on our little tinfoil hat with a crystal ball in front. If we've flipped in the last six months or so, people now see the cloud as inherently more secure, and they want to partner with their cloud provider to do security better. Let’s go out a year or two, how impactful will this flip be? What are the implications when we think about this, and we take into consideration what it really means when people think that cloud is the way to go to be secure on the internet?

Steffen: The one that immediately comes to mind for me -- Intralinks is actually starting to do some of this -- is you're going to see niche cloud. Here's what I mean by niche cloud. Let’s just take some random regulatory body that's applicable to a certain segment of business. Maybe they can’t go to a general public cloud because they're regulated in a way that it's not really possible.

What you're going to see is a cloud service that basically says, "We get it, we love your type, and we're going to create a cloud. Maybe it will cost you a little bit more to do it, but we understand from a compliance perspective the hell that you are going through. We want to help you, and our cloud is designed specifically to address your concerns."

When you have niche cloud, all of a sudden, it knocks down your biggest inherent barriers. We've already talked about security. Compliance is another one, and compliance is a big, fat, ugly one. So, if you have a cloud provider that's willing to maybe even assume some of the liability that comes with moving to their cloud, they're the winners. So let's talk 24 months from now. I'm telling you that that's going to be happening.

Gardner: All right, we'll check back on that. Daren, your prediction?

Glenister: You're going to see a shift that we're already seeing, and Chris will probably see this as well. It's a shift from discussions around security to transformation. You definitely see security now transforming business, enabling businesses to do things and interact with their customers in ways they've never done before.

You'll see that impact in two ways. One is going to be new business opportunities, so revenue coming in, but it's also going to streamline internal processes, making things easier to do internally. And you'll see a transformation of the business inside and outside. That's going to drive a lot of new opportunities, new capabilities, and innovations we've never seen before.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Why the Future of the Olympics is “The Internet of Olympic Games”

Why the Future of the Olympics is “The Internet of Olympic Games”

Forgive me for using this headline to attract more readers during the summer holidays. Certainly, I will include a comment about it in my famous post “The abuse of shocking headlines in IoT, or how many stupid things will be connected?” I encourage you to continue reading.

The Internet of Things in “Smart City Rio 2016”

I guess that many of you are following the Rio 2016 Olympic Games on TV or the Internet, or are among the lucky ones watching the Games in Rio, and are hearing and reading complaints from athletes, journalists, spectators, and fans about these Games.

How is it possible that there are so many complaints about traffic jams in one of the most famous “Smart Cities”? I imagine citizens and visitors are not satisfied with the performance of the Rio Operations Center, known as COR, which allows the local government to continuously monitor the city from a single center. As a result, agencies can act more quickly in situations such as unforeseen traffic events or environmental disasters, increase the security of the streets, and respond to unpredictable weather conditions.

How come the IoT did not prevent complaints about the green water in the sensorized swimming pool?

How come the Organization allowed empty and noisy stadiums? Maybe ...


Read More on Datafloq
BBBT to Host Webinar from TimeXtender on Governance in the Age of Self-Service

BBBT to Host Webinar from TimeXtender on Governance in the Age of Self-Service

This Friday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from TimeXtender on its Data Discovery Hub, which covers end-to-end data discovery needs, enables business, and liberates IT.

(PRWeb August 16, 2016)

Read the full story at http://www.prweb.com/releases/2016/08/prweb13612311.htm

One more time: Do we still need Data Modeling?

One more time: Do we still need Data Modeling?

More specifically do we still need to worry about data modeling in the NoSQL, Hadoop, Big Data, Data Lake, world? This keeps coming up. Today it was via email after a presentation I gave last week. This time the query was about the place of data modeling tools in this new world order. Bottom line: […]
It’s YOUR Fault! Big Data Takes on the Blame

It’s YOUR Fault! Big Data Takes on the Blame

It’s your fault. 

No, it’s the other person’s fault.

It’s the guy who cut you off in traffic.

It’s “customer service.”

It’s the government.

It’s the weather.

It’s global warming.

It’s the way things go.

A tree falls in the forest. A roof falls in. A business fails. A stock market crashes. A disease spreads.

Sh!t happens

 Who or what caused it? Cause and effect – causality – is deeply woven into our lives.

Figuring out who or what caused something is a universal activity of everyday life as well as one of global consequence. We want to know why something happened, regardless of good or bad outcomes.

Researchers use causality to test whether a drug has the desired effects as well as to control the less desirable side effects. It’s used by scientists to figure out the common cold and the secrets of the cosmos. What-caused-it determines who is going to pay for damages – for the automobile accident or for global warming. It’s used by governments to develop regulations and uphold laws.
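
The simplest version of the researchers' causal test mentioned above is a randomized comparison between a treated group and an untreated group. Here is a minimal Python sketch of that idea; the trial data is simulated and purely illustrative, not from the article:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical trial: recovery times (days) for placebo vs. drug groups.
placebo = rng.normal(loc=10.0, scale=2.0, size=200)
drug = rng.normal(loc=9.2, scale=2.0, size=200)

# Because group assignment was randomized, a significant difference in
# means is evidence that the drug *caused* the faster recovery.
t_stat, p_value = stats.ttest_ind(drug, placebo)
print(f"placebo mean={placebo.mean():.2f}, drug mean={drug.mean():.2f}")
print(f"t={t_stat:.2f}, p={p_value:.4f}")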

Knowing what caused what not only tries to explain what already happened, it also leans forward into the future. Causality is used for the ultimate gold – prediction. You use it to protect your finances and your safety. Business tries ...


Read More on Datafloq
How to Secure the Internet of Things (IoT) with Blockchain

How to Secure the Internet of Things (IoT) with Blockchain

IoT is creating new opportunities and providing a competitive advantage for businesses in current and new markets. It touches everything, not just the data, but how, when, where and why you collect it. The technologies that have created the Internet of Things aren’t changing just the internet; they are changing the things connected to the internet: the devices and gateways on the edge of the network that are now able to request a service or start an action without human intervention at many levels.

Because the generation and analysis of data is so essential to the IoT, consideration must be given to protecting data throughout its life cycle. Managing information at all levels is complex because data will flow across many administrative boundaries with different policies and intents.
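
The post does not show what blockchain-style protection actually looks like, so here is a minimal, hypothetical Python sketch of the core idea: each device reading is hash-chained to the previous one, so silently editing any stored reading breaks the chain. This illustrates the general technique only; it is not a production design and not anything from the original article.

import hashlib
import json
import time

def make_block(reading: dict, prev_hash: str) -> dict:
    """Wrap one sensor reading in a block linked to the previous block."""
    block = {"ts": time.time(), "reading": reading, "prev": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain: list) -> bool:
    """Recompute every hash; any edited reading invalidates the chain."""
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("ts", "reading", "prev")}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain, prev = [], "genesis"
for temp in (21.5, 21.7, 22.1):            # hypothetical sensor readings
    block = make_block({"sensor": "t1", "temp": temp}, prev)
    chain.append(block)
    prev = block["hash"]

print(verify(chain))                        # True
chain[1]["reading"]["temp"] = 99.9          # tamper with stored history
print(verify(chain))                        # False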

Given the various technological and physical components that truly make up an IoT ecosystem, it is good to consider the IoT as a system-of-systems. The architecting of these systems that provide business value to organizations will often be a complex undertaking, as enterprise architects work to design integrated solutions that include edge devices, applications, transports, protocols, and analytics capabilities that make up a fully functioning IoT system. This complexity introduces challenges to keeping the IoT secure, ...


Read More on Datafloq
5 Big Data Migration Mistakes To Avoid

5 Big Data Migration Mistakes To Avoid

Data migration can be a painful process that involves multiple steps in the Extract-Transform-Load (ETL) process. The challenges with migration can be even higher when we are talking about big data. This is especially true when we are migrating different types of structured and unstructured data within the same system. In this article, we will take a look at some of the most common mistakes that can cause delays or, worse, failure of the big data migration project.

Ignoring The Governance Structure

Big data migration processes can be quite overwhelming and organizations often spend most of the time firefighting infrastructure and load related issues. In the process, businesses often miss out on more critical aspects of migration like identifying the governance structure of the data. Understanding the ownership of data and who has permissions to access, create, edit or delete data is important to ensure that the data owners are in the loop of the process. This can be an issue from a legal standpoint as well, depending on the industry you operate in.
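
One concrete way to avoid this pitfall is to make governance metadata a hard gate in the migration pipeline itself. The Python sketch below is purely illustrative (the manifest format is invented for this example): it refuses to migrate any dataset that lacks a recorded owner or an access-control list.

# Hypothetical governance gate for a big data migration pipeline.
datasets = [
    {"name": "orders", "owner": "sales-ops", "acl": ["sales", "finance"]},
    {"name": "clickstream", "owner": None, "acl": []},  # governance missing
]

def governance_issues(ds: dict) -> list:
    """Return a list of governance problems; empty means safe to migrate."""
    problems = []
    if not ds.get("owner"):
        problems.append("no data owner recorded")
    if not ds.get("acl"):
        problems.append("no access-control list defined")
    return problems

for ds in datasets:
    issues = governance_issues(ds)
    if issues:
        print("BLOCKED %s: %s" % (ds["name"], ", ".join(issues)))
    else:
        print("OK to migrate %s" % ds["name"])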

Not Cleansing Data

Merging or migrating your big data to a new system can be a good opportunity to cleanse the content and remove any legacy structures that your data ...


Read More on Datafloq
Why You Should Ask If Your Business is Intelligent

Why You Should Ask If Your Business is Intelligent

Asking whether your business is intelligent might seem like an unusual question but the response is becoming more important to companies that want to get ahead of rivals and maintain a competitive edge.

Business intelligence is a set of techniques and tools used to acquire and transform raw data into meaningful information that can be used to analyse the business and improve decision-making processes. Business intelligence technologies are capable of handling huge amounts of data and can help identify business opportunities and risks that can then be incorporated into an effective business strategy. If done correctly, business intelligence can provide companies with stability and give them an edge in a competitive marketplace.

The concept of business intelligence is becoming increasingly important as businesses are now expected to react and adapt to changing environments and trends. One of the main advantages of investing in business intelligence is the ability to analyse current customer buying trends and use this information to develop products that match current needs, thereby increasing profitability.

There are lots of ways to assess whether your business is intelligent. Try answering these 3 questions to see if you should invest in a business intelligence solution.

1. Do you Gain ...


Read More on Datafloq
Why You Should Level Up Your Data Wrangling Skills

Why You Should Level Up Your Data Wrangling Skills

Big data analytics seems to be everywhere these days, but it’s not one of those trends that simply pops up without reason. Big data analytics has proven to be crucial to discovering new insights for businesses, giving them new capabilities and improving on their existing operations. To actually get down to using that data is the big challenge many organizations are facing. It’s easy to simply say that companies need to gather data and analyze it, but in practice the process can be complex.

That’s where data wrangling comes in. Before businesses can even think of fully implementing results from big data analyses, they need to engage in a bit of data wrangling. Perhaps this is your job, making your role the first step in a larger strategy. While many may dismiss the role of data wrangler as mere janitorial work, when done well, it can make all the difference between successful big data efforts and continuing struggles. But first, you need to ensure your data wrangling skills are up to par.
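
For a taste of what that work looks like in practice, here is a minimal pandas sketch (the data below is invented): trimming stray whitespace, parsing dates, converting formatted numbers, and dropping duplicates, the unglamorous steps that make downstream analysis trustworthy.

import pandas as pd

# Hypothetical messy input: stray spaces, formatted numbers, duplicate rows.
raw = pd.DataFrame({
    "customer": [" Acme ", "Acme", "Globex "],
    "signup": ["2016-08-01", "2016-08-01", "2016-08-03"],
    "spend": ["1,200", "1,200", "450"],
})

clean = (
    raw.assign(
        customer=raw["customer"].str.strip(),          # normalize names
        signup=pd.to_datetime(raw["signup"]),          # real datetimes
        spend=raw["spend"].str.replace(",", "").astype(float),
    )
    .drop_duplicates()                                  # remove repeated rows
    .reset_index(drop=True)
)
print(clean)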

So what exactly makes a good data wrangler? If it’s a job that’s often thought of as less than glamorous, why even bother honing your data wrangling skills? To be ...


Read More on Datafloq
Why You Should Embrace Analytic Athleticism

Why You Should Embrace Analytic Athleticism

With today’s rapidly changing mix of analytic techniques, toolsets, and platforms, it’s difficult for any organization to be confident it is keeping its analytic workforce and skillsets up to date.

I often have clients ask if they need to consider turning over a large portion of their analytics organization in order to adapt to these changes. I firmly believe that this is usually not the case and that the fundamental skills for analytic success are in place. Those skills simply need tuning and updating.

In fact, I see a strong parallel between athleticism and analytic capability. I also see a strong parallel between learning to speak multiple languages and learning to work within differing analytic environments. I’ll explain what I mean by both of these statements in this blog in the hope that it will help make the path forward seem clearer and less intimidating.

Analytic Athleticism

People generally accept the notion of inherent athleticism. This concept says simply that there are people who are athletic and those who aren’t. While anyone can maximize their inherent athleticism with training, someone who isn’t very athletic will never compete at an elite level in any sport. On the other hand, people with an elite level of ...


Read More on Datafloq
How to Build a Career in Project Management If You Are an IT Professional | Simplilearn

How to Build a Career in Project Management If You Are an IT Professional | Simplilearn

How to Build a Career in Project Management If You Are an IT Professional | Simplilearn Recently, we were invited by our friend, Allan, for Christmas celebrations at his place. He had invited close friends and family to the event. After a wonderful round of delicious snacks, we sat around the table, conversing, when Allan introduced me to his nephew Mark, a twenty-something software professional. When he realized I was a Project Mana...Read More.
9 Social Media Marketing Skills you Need Right Now | Simplilearn

9 Social Media Marketing Skills you Need Right Now | Simplilearn

9 Social Media Marketing Skills you Need Right Now | Simplilearn Today, if a brand doesn’t exist on social media, they are not represented online. Ignoring social media means losing your opportunity for interaction with customers, thought leaders, and tastemakers, which costs you engagement and web traffic as a whole. According to Hubspot, in 2014 92% of marketers claimed that social media marketing was im...Read More.
6 Free Agile Tools for Every Project Manager | Simplilearn

6 Free Agile Tools for Every Project Manager | Simplilearn

6 Free Agile Tools for Every Project Manager | Simplilearn Agile Project Management has been around for decades and yet thousands of companies and managers fail to seamlessly transition from traditional to Agile when managing projects. Major reasons include lack of knowledge & awareness of inexpensive/free tools which can enable swift decision making in an organization. While a knowledge-base can only b...Read More.
Will China Pip the USA at Rio ’16? Big Data Analytics Predicts | Simplilearn

Will China Pip the USA at Rio ’16? Big Data Analytics Predicts | Simplilearn

Will China Pip the USA at Rio ’16? Big Data Analytics Predicts | Simplilearn August 5th, 2016 – The most awaited day of the year. Countries are gearing up for the biggest sports battle of the decade – the 2016 Summer Olympics at Rio de Janeiro, Brazil. More than 200 NOCs (National Olympic Committee) are assembling their best athletes to bring home the gold. But this time, countries are implementing a range of po...Read More.
What is Critical Chain Project Management? | Simplilearn

What is Critical Chain Project Management? | Simplilearn

What is Critical Chain Project Management? | Simplilearn Critical Chain Project Management A Brief Overview Critical Chain Project Management was developed and publicized by Dr. Eliyahu M. Goldratt in 1997. Followers of this methodology of Project Management claim it to be an alternative to the established standard of Project Management as advocated by PMBOK® and other Standards of Project...Read More.
What Does This Text Really Mean?

What Does This Text Really Mean?

Imagine that you were tasked with indexing (categorizing and summarizing) the contents of a large collection of randomly formatted text documents to make them easily accessible throughout your company. Perhaps this collection contains legal documents as well as medical records (with various levels of sensitivity). Suppose that this collection contained over 100 million documents that you had to analyze. Besides categorizing each document as medical or legal, you must also assign them to subcategories like cardiology or probate. Finally, you must capture the meaning of each document for reporting and analytics.

Besides setting aside the rest of your life to do this job manually, does this problem have a solution?

Natural Language Processing [NLP]

Natural Language Processing (NLP) tools and techniques could potentially provide a workable solution to the problem at hand. NLP is an area of Computer Science whose aim is to programmatically understand and produce human language in both written and spoken form.

Someday NLP will have to account for every language used on the face of the planet, and the dialects and colloquialisms that go along with them. For the purpose of this blog, let's stick to the language in which you are currently reading, US English.
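
As a minimal sketch of the supervised flavor of this task, scikit-learn's TF-IDF vectorizer plus a linear classifier can route documents into categories such as legal and medical once labeled examples exist. The toy corpus below is invented for illustration and is not the article's approach:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny labeled corpus; a real system would need far more examples.
docs = [
    "the patient presented with chest pain and arrhythmia",
    "echocardiogram shows reduced ejection fraction in the patient",
    "the will names the executor of the estate",
    "probate court granted letters of administration for the estate",
]
labels = ["medical", "medical", "legal", "legal"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(docs, labels)

print(model.predict(["the deceased left no valid will"]))           # ['legal']
print(model.predict(["patient scheduled for a cardiac procedure"])) # ['medical']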

For ...


Read More on Datafloq
An Overview of the Customer Relationship Management (CRM) Market | Simplilearn

An Overview of the Customer Relationship Management (CRM) Market | Simplilearn

An Overview of the Customer Relationship Management (CRM) Market | Simplilearn What is CRM and Why is it Becoming So Popular? Customer Relationship Management (CRM) software is used by businesses to manage interaction with current and future customers. Today’s CRMs have evolved into a confluence of multiple functions but the primary objective of a CRM is to serve the customers better, increase retention, and manage the ...Read More.
Free Whitepaper: Considerations of Deploying Enterprise-wide eLearning Content | Simplilearn

Free Whitepaper: Considerations of Deploying Enterprise-wide eLearning Content | Simplilearn

 Cloud-based learning management systems have revolutionized the field of education. An increasing number of enterprises are taking to the cloud to get their employees trained and equipped to face the challenges of the industry.   But what factors should you consider before choosing an LMS solution? This whitepaper helps you understand th...Read More.
What is the difference between Agile Coach and Agile Consultant? | Simplilearn

What is the difference between Agile Coach and Agile Consultant? | Simplilearn

What is the difference between Agile Coach and Agile Consultant? | Simplilearn ‘I am here to help you’. Both the Agile Coach and Agile Consultant will say the same to you. Whom will you choose? It gets difficult to make a choice as many people are not able to differentiate between an Agile Coach and an Agile Consultant which makes it even harder to take a decision. I would like to throw some light on this inter...Read More.
How software-defined storage translates into just-in-time data center scaling and hybrid IT benefits

How software-defined storage translates into just-in-time data center scaling and hybrid IT benefits

The next BriefingsDirect Voice of the Customer case study examines how hosting provider Opus Interactive adopted a software-defined storage approach to better support its thousands of customers.

We'll learn how scaling of customized IT infrastructure for a hosting organization in a multi-tenant environment benefits from flexibility of modern storage, unified management, and elastic hardware licensing. The result is gaining the confidence that storage supply will always meet dynamic hybrid computing demand -- even in cutting-edge hosting environments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how massive storage and data-center infrastructure needs can be met in a just-in-time manner, we're joined by Eric Hulbert, CEO at Opus Interactive in Portland, Oregon. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What were the major drivers when you decided to re-evaluate your storage, and what were the major requirements that you had?

Hulbert: Our biggest requirement was high-availability in multi-tenancy. That was number one, because we're service providers and we have to meet the needs of a lot of customers, not just a single enterprise or even enterprises with multiple business groups.

So we were looking for something that met those requirements. Cost was a concern as well. We wanted it to be affordable, but needed it to be enterprise-grade with all the appropriate feature sets -- but most importantly it would be the scale-out architecture.

We were tired of the monolithic controller-bound SANs, where we'd have to buy a specific bigger size. We'd start to get close to where the boundary would be and then we would have to do a lift-and-shift upgrade, which is not easy to do with almost a thousand customers.

Ultimately, we made the choice to go to one of the first software-defined storage architectures, which is a company called LeftHand Networks, later acquired by Hewlett Packard Enterprise (HPE), and then some 3PAR equipment, also acquired by HPE. Those were, by far, the biggest factors while we made that selection on our storage platform.

Gardner: Give us a sense of the scale-out requirements.

Hulbert: We have three primary data centers in the Pacific Northwest and one in Dallas, Texas. We also have the ability for a little bit of space in New York, for some of our East Coast customers, and one in San Jose, California. So, we have five data centers in total.

Gardner: Is there a typical customer, or a wide range of customers?

Big range

Hulbert: We have a pretty big range. Our typical customers are in the finance, travel and tourism, and hospitality industries. There are quite a few in there. Healthcare is a growing vertical for us as well.

Then, we rounded that out with manufacturing and a little bit of retail. One of our actual verticals, if you can call it a vertical, is the MSPs and IT companies, and even some VARs, that are moving into the cloud.

We enable them to do their managed services and be the "boots on the ground" for their customers. That spreads us into the tens of thousands of customers, because we have about 25 to 30 MSPs that work with us throughout the country, using our infrastructure. We just provide the infrastructure as a service, and that's been a growing vertical for us.
Software Defined Storage
Eliminate Complexity and Free Infrastructure
From the Limitations of Dedicated Hardware
Gardner: And then, across that ecosystem, you're doing colocation, cloud hosting, managed services? What's the mix? What’s the largest part of the pie chart in terms of the services you're providing in the market?

Hulbert: We're about 75 percent cloud hosting, specifically a VMware-based private cloud, a multi-tenant private cloud. It's considered public cloud, but we call it private cloud.

We do a lot of hybrid cloud, where we have customers that are doing bursting into Amazon or [Microsoft] Azure. So, we have the ability to get them either Direct Connect Amazon connections or Azure ExpressRoute connections into any of our data centers. Then, 20 percent is colocation, and about 5 percent for back-up and disaster recovery (DR) rounds that out.

Gardner: Everyone, it seems, is concerned about digital disruption these days. For you, disruption is probably about not being able to meet demand. You're in a tight business, a competitive business. What’s the way that you're looking at this disruption in terms of your major needs as a business? What are your threats? What keeps you up at night?

Still redundant

Hulbert: Early on, we wanted a concurrently maintainable infrastructure, which also follows through with the data centers that we're at. So, we needed Tier 3-plus facilities that are concurrently maintainable. We wanted the infrastructure to be the same. We're not kept up at night, because we can take an entire section of our solution offline for maintenance. It could be a failure, but we're still redundant.

It's a little bit more expensive, but we're not trying to compete with the commodity hosting providers out there. We're very customized. We're looking for customers that need more of that high-touch level of service, and so we architect these big solutions for them -- and we host with 100 percent up-time.

The infrastructure piece is scalable with scale-out architecture on the storage side. We use only HP blades, so that we just keep stacking in blades as we go. We try to stay a couple of blade chassis ahead, so that we can take pretty large bursts of that infrastructure as needed.

That's the architecture that I would recommend for other service providers looking for a way to make sure they can scale out and not have to do any lift-and-shift on their SAN, or even rack-and-stack individual servers, which takes more time.

You have to cable every one of those servers, versus cabling a single blade chassis once. Then, you can just slot in 16 blades quickly as you're scaling. That allows you to scale quite a bit faster.

Gardner: When it comes to making the choice for software-defined, what has that gotten you? I know people are thinking about that in many cases -- not just service providers, but enterprises. What did service-defined storage get for you, and are you furthering your software-defined architecture to more parts of your infrastructure?

Hulbert: We wanted it to be software-defined because we have multiple locations and we wanted one pane of glass. We use HPE OneView to manage that, and it would be very similar for an enterprise. Say it has 30 remote offices, it wants to put the equipment there, and the business units need to provision servers and storage. We don't want to be going to each individual appliance, chassis, or application; we want one place to provision it all.

Since we're dealing now with nearly a thousand customers -- and thousands and thousands of virtual servers, storage nodes, and all of that -- the chunklets of data are distributed across all of these. Being able to do that from one single pane of glass, from a management standpoint, is quite important for us.

So, it's that software-defined aspect, especially distributing the data into chunklets, that allows us to grow quicker and to put a lot of automation on the back end.
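
To make the automation point concrete: with software-defined storage, provisioning becomes an API call instead of a trip to the data center. The Python sketch below is purely illustrative; the endpoint, payload fields, and token are invented and do not represent HPE OneView's or any other vendor's actual API.

import requests

# Hypothetical software-defined storage API; all names are illustrative.
API = "https://storage-mgmt.example.com/api/v1"
TOKEN = "REPLACE_ME"

def provision_volume(customer: str, size_gib: int, site: str) -> dict:
    """Ask the storage layer for a new thin-provisioned customer volume."""
    resp = requests.post(
        API + "/volumes",
        headers={"Authorization": "Bearer " + TOKEN},
        json={
            "name": customer + "-data",
            "sizeGiB": size_gib,
            "site": site,            # e.g. "portland" or "dallas"
            "thinProvision": True,   # grow on demand, no lift-and-shift
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# One call per tenant instead of a walk to the data center.
print(provision_volume("acme", size_gib=500, site="portland"))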

We only have 11 system administrators and engineers on our team managing that many servers, which shows you that our density is pretty high. That only works well if we have really good management tools, and having it software-defined means fewer people walking to and from the data center.

Even though our data centers are manned facilities, our infrastructure is basically lights out. We do everything from remote terminals.

Gardner: And does this software-defined extend across networking as well? Are you hyper-converged, converged? How would you define where you're going or where you'd like to go?

Converged infrastructure

Hulbert: We're not hyper-converged. For our scale, we can’t get into the prepackaged hyper-converged product. For us, it would be more of a converged infrastructure approach.

As I said, we do use the c-Class blade chassis with Virtual Connect, which is software-defined networking. We do a lot of VLANs and things like that on the software side.

We still have some out-of-band networking outside of that, the separate network stacks, because we're not just a cloud provider. We also do colocation and a lot of hybrid computing where people are connecting between them. So, we have to worry about Fibre Channel and iSCSI and connections into the SAN.

That adds a couple of other layers and a few extra management steps, but at our scale, it's not as if we're adding tens of thousands of servers a day, or even an hour, as I'm sure Amazon has to. So we can take that one small hit to pull that portion of the networking out, and it works pretty well for us.

Gardner: How do you see the evolution of your business in terms of moving past disruption, adopting these newer architectures? Are there types of services, for example, that you're going to be able to offer soon or in the foreseeable future, based on what you're hearing from some of the vendors?

Hulbert: Absolutely. One of the first ones I mentioned earlier was the ability for customers that want to burst into the public cloud to do Amazon Direct Connects. Even with the telecom providers, you're looking at 15 to 25 milliseconds of latency. For some of these applications, that's just too much latency, so it's not going to work.

Now, with the most recent announcement from Amazon, they put a physical Direct Connect node in Oregon, about a mile from our data-center facility. It's from EdgeConneX, who we partnered with.

Now, we can offer the lowest latency for both Amazon and Azure ExpressRoute in the Pacific Northwest, specifically in Oregon. That's really huge for our customers, because we have some that do a lot of public-cloud bursting on both platforms. So that's one new offering we're doing.

Disruption, as we've heard, is around containers. We're launching a new container-as-a-service platform later this year based on ContainerX. That will allow us to do containers for both Windows and *nix platforms, regardless of what the developers are looking for.

We're targeting developers, DevOps guys, who are looking to do microservices to take their application, old or new, and architect it into the containers. That’s going to be a very disruptive new offering. We've been working on a platform for a while now because we have multiple locations and we can do the geographic dispersion for that.

I think it’s going to take a little bit of the VMware market share over time. We're primarily a VMware shop, but I don’t think it’s going to be too much of an impact to us. It's another vertical we're going to be going after. Those are probably the two most important things we see as big disruptive factors for us.

Hybrid computing

Gardner: As an organization that's been deep into hybrid cloud and hybrid computing, is there anything out there in terms of the enterprises that you think they should better understand? Are there any sort of misconceptions about hybrid computing that you detect in the corporate space that you would like to set them straight on?

Hulbert: The hybrid that people typically hear about is more like having on-premises equipment. Let's say I'm a credit union, and in one of the bank branches we've decided to put three or four cabinets of our equipment in one of the vaults. Maybe they've added one UPS and one generator, but it's not to the enterprise level, and they're bursting to the public cloud for the things that make sense to meet their security requirements.

To me, that’s not really the best use of hybrid IT. Hybrid IT is where you're putting what used to be on-premises in an actual enterprise-level, Tier 3 or higher data center. Then, you're using either a form of bursting into private dedicated cloud from a provider in one of those data centers or into the public cloud, which is the most common definition of that hybrid cloud. That’s what I would typically define as hybrid cloud and hybrid IT.

Gardner: What I'm hearing is that you should get out of your own data center, use somebody else's, and then take advantage of the proximity in that data center, the other cloud services that you can avail yourself of.

Hulbert: Absolutely. The biggest benefit to them is at their individual locations or bank branches. This is the scenario where we use the credit union. They're going to have maybe one or two telco providers, and they're going to have their 100 or maybe 200 Mb-per-second circuits.

They're paying quite a premium for them, and when they get into one of these data centers, they're going to have the ability to have 10-gig, or even 40- or 100-gig, connected internet pipes with a lot higher headroom for connectivity at a better price point.

On top of that, they'll have 10-gig connection options into all the different cloud providers. Maybe they have an Oracle stack that they want to put on an Oracle cloud someday along with their own on-premises equipment. The hybrid pieces get more challenging, because at the branch they're not going to get the connectivity they need. Maybe they want to burst into Amazon or Azure, or maybe they want the Opus cloud.

They need faster connectivity for that, but they have equipment that still has usable life. Why not move that to an enterprise-grade data center and not worry about air-conditioning challenges, electrical problems, or whether it's secure?

All of these facilities, including ours, check every box for the compliance and auditing that happens on an annual basis. Those things that used to be real headaches aren't core to their business, and they don't have to do them any more. They can focus on what's core: the application and their customers.

Gardner: So proximity still counts, and probably will count for an awfully long time. You get benefits from taking advantage of proximity in these data centers, but you can still have, as you say, what you consider core under your control, under your tutelage and set up your requirements appropriately?

Mature model

Hulbert: It really comes down to the fact that the cloud model is very mature at this point. We’ve been doing it for over a decade. We started doing cloud before it was even called cloud. It was just virtualization. We launched our platform in late 2005 and it proved out, time and time again, with 100 percent up-time.

We have one example of a large customer, a travel and tourism operator, that brings visitors from outside the US to the US. They do over $1 billion a year in revenue, and we host their entire infrastructure.

It's a lot of infrastructure and it’s a very mature model. We've been doing it for a long time, and that helps them to not worry about what used to be on-premises for them. They moved it all. A portion of it is colocated, and the rest is all on our private cloud. They can just focus on the application, all the transactions, and ultimately on making their customers happy.

Gardner: Going back to the storage equation, Eric, do you have any examples of where the storage software-defined environment gave you the opportunity to satisfy customers or price points, either business or technical metrics that demonstrate how this new approach to storage particularly fills out this costs equation?

Hulbert: In terms of the software-defined storage, the ability to easily provision the different sized data storage we need for the virtual servers that are running on that is absolutely paramount.

We need super-quick provisioning, so we can move things around. When you add in the layers of VMware, like storage vMotion, we can replicate volumes between data centers. Having that software-defined makes that very easy for us, especially with the built-in redundancy that we have and not being controller-bound like we mentioned earlier on.

Those are pretty key attributes, but on top of that, as customers are growing, we can very easily add more volumes for them. Say they have a footprint in our Portland facility and want to add a footprint in our Dallas, Texas facility and do geographic load balancing. It makes it very easy for us to replicate the applications between the two facilities, slowly adding on those layers as customers need to grow. It makes that easy for them as well.

Gardner: One last question: what comes next in terms of containers? What we're seeing is that containers have a lot to do with developers and DevOps, but ultimately I'd think that the envelope gets pushed out into production, especially when you hear about things like composable infrastructure. If you've been composing infrastructure in the earlier part of the process, in development, it takes care of itself in production.

Do you actually see more of these trends accomplishing that where production is lights-out like you are, where more of the definition of infrastructure and applications, productivity, and capabilities is in that development in DevOps stage?

Virtualization

Hulbert: Definitely. Over time, it is going to be very similar to what we saw when customers were moving from dedicated physical equipment into the cloud, which is really virtualization.

This is the next evolution, where we're moving into containers. At the end of the day, the developers and the product managers for whatever applications they're actually developing don't really care what it runs on or how it all works. They just want it to work.

They want it to be a utility consumption-based model. They want the composable infrastructure. They want to be able to get all their microservices deployed at all these different locations on the edge, to be close to their customers.

Containers are going to be a great way to do that, because they take away the overhead of dealing with operations. So, developers can just put the little APIs and the different things that they need where they need them. As we see more of that stuff pushed to the edge to get the eyeball traffic, that's going to be a great way to do it. With the ability to do even further bursting into the bigger public clouds worldwide, I think we can get to a really large scale in a great way.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

 You may also be interested in:

How to Create an Amazon S3 Image Processing Pipeline in Python

How to Create an Amazon S3 Image Processing Pipeline in Python

Need to create a simple Amazon S3 image processing pipeline to batch edit images? Now that Algorithmia supports Amazon S3 integration, here’s a quick way to automatically create thumbnails with custom dimensions.

In this demo, we’ll use SmartThumbnail, a microservice that uses face detection to perfectly crop every photo to the same size without awkwardly cropped heads or faces.

While manually cropping just a handful of photos isn't bad, cropping hundreds or thousands of images in real time would be extremely expensive, time-consuming, and tedious.

So, instead of doing this by hand so that every face in every photo is perfectly preserved, we can run all the photos through SmartThumbnail. The output is both intuitive and expected, each and every time.

We’re going to connect to an Amazon S3 bucket, process the images in a folder, and then save the new thumbnail images back to the folder. We’ll be using Python for this tutorial, but this could easily be done in JavaScript/Node, Rust, Scala, Java, or Ruby.

Don’t use Amazon S3? Want to use Dropbox instead? No problem. Here’s our guide to creating a Dropbox image processing pipeline.

Ready? Let’s go.

Step 1: Create a Free Account and Install Client

You’ll need a free Algorithmia account for this tutorial. Use the promo code “s3” to get an additional 50,000 credits when you sign up.

Next, make sure you have the ...
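
Since the post is truncated above, here is a hedged sketch of the overall pipeline it describes. The bucket name, prefixes, algorithm path and version, and SmartThumbnail's input/output format are all assumptions made for illustration; check the Algorithmia catalog for the real interface.

import base64

import Algorithmia   # pip install algorithmia
import boto3         # AWS SDK for Python

S3_BUCKET = "my-photo-bucket"   # hypothetical bucket
SRC_PREFIX = "photos/"          # folder holding the originals
DST_PREFIX = "thumbnails/"      # folder for the generated thumbnails

s3 = boto3.client("s3")
client = Algorithmia.client("YOUR_API_KEY")
# Algorithm path/version is an assumption; look it up in the catalog.
smart_thumbnail = client.algo("opencv/SmartThumbnail/0.1")

def make_thumbnail(image_bytes, width=200, height=200):
    """Send one image through SmartThumbnail; the payload format is assumed."""
    payload = {
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "width": width,
        "height": height,
    }
    result = smart_thumbnail.pipe(payload).result
    return base64.b64decode(result["image"])

# List every image under the source prefix, crop it, and save it back.
resp = s3.list_objects_v2(Bucket=S3_BUCKET, Prefix=SRC_PREFIX)
for obj in resp.get("Contents", []):
    key = obj["Key"]
    if not key.lower().endswith((".jpg", ".jpeg", ".png")):
        continue
    body = s3.get_object(Bucket=S3_BUCKET, Key=key)["Body"].read()
    thumb = make_thumbnail(body)
    dst_key = DST_PREFIX + key.split("/")[-1]
    s3.put_object(Bucket=S3_BUCKET, Key=dst_key, Body=thumb)
    print("wrote s3://%s/%s" % (S3_BUCKET, dst_key))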


Read More on Datafloq
Our Autumn Courses at BME

Our Autumn Courses at BME

Many people are still on holiday or enjoying the summer, but our team has started updating the autumn BME courses. By now it is a tradition that we also open these courses to external students. Based on the experience of previous semesters, every course drew at least as many external attendees as students who took it through "official" channels. We think the arrangement benefits everyone: externals get to learn something new, the students get a realistic picture of how useful the material is thanks to the externals' questions and feedback, and for the instructors, interactive classes are always exciting to teach.

So here is the menu of what you can currently choose from.

If you want to master the programming languages of data analysis

Course name: Alkalmazott adatelemzés (Applied Data Analytics, or ADA)
Tuesday and Thursday, 12:00-14:00
Room: to be decided, but certainly on the Lágymányosi campus, Magyar tudósok körútja 2.
Official course syllabus

The course centres on iteratively developed data processing procedures; we teach you the programming languages of data analysis. We start the topic with regular expressions and the awk text processor, then teach the SAS programming language and Python and R programming, while also briefly reviewing the most important machine learning tasks. During the semester students receive three small homework assignments, and the course closes with an exam. These are of course not mandatory for external attendees, but we give anyone who wants to test themselves the opportunity. Both the syllabus and the requirements are being reworked, though, so there may still be surprises here :)
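
(To give a flavor of the very first topic, regular expressions, here is a minimal Python example of the kind of exercise the course starts from; the log line below is invented.)

import re

# Pull the timestamp and status code out of a made-up web server log line.
line = '127.0.0.1 - - [16/Aug/2016:10:15:32] "GET /index.html" 200 5230'
match = re.search(r'\[(?P<ts>[^\]]+)\]\s+"[^"]+"\s+(?P<status>\d{3})', line)
if match:
    print(match.group("ts"))       # 16/Aug/2016:10:15:32
    print(match.group("status"))   # 200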

If you want to find your way around the jungle of big data technologies

Course name: 'Big Data' elemzési eszközök nyílt forráskódú platformokon ('Big Data' analytics tools on open-source platforms)
Tuesday, 12:00-14:00
Room: to be decided
Official course syllabus

Here Dmlab's big data specialists give insight into how the technology stack that has grown up in this field fits together. We start from the MapReduce and Hadoop basics and get all the way to the newest technologies. Obviously we cannot dive deeply into every one of them, but anyone who sits through this course will find it easy to navigate among big data technologies. At the end of the semester students are graded on the basis of a test and a homework assignment; on request, external attendees may also take these assessments.

 

Application

Since we want to keep a healthy ratio of internal and external students in every course, please apply for one of the courses using the form below. We will get back to you about your application within a few days.

Application form

Custom-tailored training
Please note that we are also happy to develop a syllabus tailored to an individual or a company where faster progress and more efficient learning are needed. In that setting we can adapt far better to your existing competencies, and during the hands-on parts of the training we can even run the analyses on your own data. We are likewise happy to teach other technologies: we can deliver training in Python, R, RapidMiner, IBM SPSS Modeler, SAS, Oracle and KNime environments, and we have good training proposals for big data technologies as well. Over the past year we delivered more than ten such trainings, so feel free to contact us if a question like this comes up; once the syllabus is fixed, we can quickly give you a quote adapted to your specific needs.

If this sounds interesting, write us a few lines: Nagy-Rácz István - nagy.istvan@dmlab.hu

Image source

Share on Facebook! Share on Twitter! Share on Tumblr!

Criclytics – How Big Data is helping Teams Win Big at the T20 World Cup | Simplilearn

Criclytics – How Big Data is helping Teams Win Big at the T20 World Cup | Simplilearn

Criclytics - How Big Data is helping Teams Win Big at the T20 World Cup | Simplilearn We just witnessed the rage of the cricket season. The battle between the teams. The fight for the trophy. Fans sitting glued to television screens every evening. The sporting of team jerseys.The T20 world cup was as grand as ever.And everybody loves going back in time, assessing how teams and players have performed, talking about how fresh mileston...Read More.
Are you future ready? Building a career in Mobile App Development | Simplilearn

Are you future ready? Building a career in Mobile App Development | Simplilearn

Are you future ready? Building a career in Mobile App Development | Simplilearn When it comes to the IT industry, there are plenty of career paths one can take.  But one field that has seen a tremendous rise in popularity, of late, is that of Mobile App Development.     With over 4.88 billion mobile phone users globally, mobile devices have become ubiquitous. And using Mobile Apps for everything from ...Read More.
Jitendra Vaswani: A professional blogger sharing his secrets | Simplilearn

Jitendra Vaswani: A professional blogger sharing his secrets | Simplilearn

Jitendra Vaswani: A professional blogger sharing his secrets | Simplilearn Jitendra Vaswani is the founder of Digiexe Complete Digital Marketing Services, internet-marketing blog Bloggersideas.com, and tech blog TechNoven.com. Like most tech founders, he is a self-taught SEO and blogging expert, and made his way up the hard way. Currently, Jiten consults with startups to increase their social media and organic presence in...Read More.
BBBT to Host Webinar from Kognitio on its new Software Release “Kognitio on Hadoop”

BBBT to Host Webinar from Kognitio on its new Software Release “Kognitio on Hadoop”

This Wednesday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from Kognitio on how its latest release provides ultra-fast, high concurrency SQL, natively on Hadoop.

(PRWeb August 08, 2016)

Read the full story at http://www.prweb.com/releases/2016/08/prweb13600719.htm

Is it Time for a Renaissance of Research?

Is it Time for a Renaissance of Research?

When I talk with research leaders across the UK (and Europe), I consistently hear some common woes.

The problem for research leaders today

Many feel under-utilised & almost all suggest they appear to have less influence than they had in the past.

With regard to the cause of this ‘demotion’, many cite the rise of executive interest in Big Data & Analytics. It seems that customer research is now often viewed as the poor relation to a more ‘modern’ data analysis solution.

The fact that market or customer research often still sits in a different department from data & analytics teams can exacerbate the problem.

Too few companies yet bring all these components of holistic customer insight together.

Yet, despite this apparent ‘doom & gloom‘ for research professionals, the wind appears to be changing in the wider marketing community. At the start of 2016 a number of data & marketing leaders were asked for their predictions as to key themes for the year. Many cited the need for more focus on emotion. We’ve shared some content on that previously.

Beyond even more emotional marketing, in this thought-provoking piece, Bruce Temkin shares the criticality of engaging with emotion when designing better customer experiences:



So, if marketers & CX leaders need to better engage with people’s ...


Read More on Datafloq
Six Pitfalls to Avoid When Becoming a Data Driven Enterprise

Six Pitfalls to Avoid When Becoming a Data Driven Enterprise

With Big Data Survey, an initiative of Big Data Expo and GoDataDriven, midway through, it is time for an interim analysis of the score on the door. In this article we share 6 pitfalls to avoid when becoming a data driven enterprise.

Hundreds of organizations from a wide variety of industries have already shared their insights. Around 55 per cent of the participants work at organizations of over 100 employees, and they are a representative cross-section of the data population: 14% are BI Specialists, 11% Directors, 10% Marketing Managers and 6% Data Scientists. It’s not too late to participate in Big Data Survey and receive the full research report.

#1 Lack of Vision

Data is definitely still a large theme for organizations. No less than 88 per cent of respondents indicate that the opportunities of data for their organizations are substantial. Remarkably, respondents still think of a strong vision (87%) and support from management (60%) as most crucial for success with data.



#2 No Data or Poor Quality of Data

When it comes to laying the technological foundation, the data infrastructure, the biggest challenge is making data available (49%), followed by improving data quality (48%). Developing the right skills for setting up data infrastructure (35%) seems to be ...


Read More on Datafloq
3 steps to your dream IT Service role | Simplilearn webinar starts 31-08-2016 10:30

3 steps to your dream IT Service role | Simplilearn webinar starts 31-08-2016 10:30

In this webinar, we will guide you through all the necessary steps to land your dream IT Service role.

Webinar Agenda:

  • Do you really know what your dream job is?
  • Assess your capabilities
  • Identify opportunities
  • Landing the job.
Why Humanizing Algorithms Is a Good Idea

Why Humanizing Algorithms Is a Good Idea

Algorithms are taking over the world. Not yet completely and not yet definitively, but they are well on their way to automating a lot of tasks and jobs. This algorithmization offers many benefits for organizations and consumers; boring tasks can be outsourced to an algorithm that is exceptionally good at a very dull task, much better than humans could ever become. More complicated tasks can benefit from insights derived from analyzing multiple data sources, and these insights can help humans in the task at hand. Soon, however, even these tasks could be taken over by algorithms.

We know many examples of the first, ranging from robots that build your smartphone to algorithms that find that particular website within milliseconds. More and more, we also see great examples of the latter. There is an algorithm that has a seat on the board of directors of Hong Kong venture capital firm Deep Knowledge Ventures. In addition, there are algorithms that can instantly translate spoken language into a different language.

Algorithms are therefore rapidly changing how we do business. Businesses consist of value propositions, customer segments, consumer relationships, channels, revenue streams, cost structures, limited resources, partnerships and activities. Algorithms enable each of these elements to be automated ...


Read More on Datafloq
Why Humanizing Algorithms Could Prevent them from Going Awry

Why Humanizing Algorithms Could Prevent them from Going Awry

Algorithms are taking over the world. Not yet completely and not yet definitively, but they are well on their way to automating a lot of tasks and jobs. This algorithmization offers many benefits for organizations and consumers; boring tasks can be outsourced to an algorithm that is exceptionally good at a very dull task, much better than humans could ever become. More complicated tasks can benefit from insights derived from analyzing multiple data sources, and these insights can help humans in the task at hand. Soon, however, even these tasks could be taken over by algorithms.

We know many examples of the first, ranging from robots that build your smartphone to algorithms that find that particular website within milliseconds. More and more, we also see great examples of the latter, from an algorithm that has a seat on the board of directors of Hong Kong venture capital firm Deep Knowledge Ventures to algorithms that can instantly translate spoken language into a different language.

Algorithms are therefore rapidly changing how we do business. Businesses consist of value propositions, customer segments, consumer relationships, channels, revenue streams, cost structures, limited resources, partnerships and activities. Algorithms enable each of these elements to be automated and using ...


Read More on Datafloq
The CRUNCH Conference Is Back This Year

The CRUNCH Conference Is Back This Year

Last year we wrote that Hungary had come of age in big data terms when the teams of Prezi, UStream and RapidMiner organized a data conference in Budapest that was world-class in both its speakers and its organization. The Crunch conference is being held again this year. Browsing the list of workshops, it will again be worth attending, and the industry speakers and audience visiting Hungary also offer a relatively rare opportunity for networking.

Thanks to the organizers, readers of this blog can register for the conference at a discount via the link below. Take advantage of the opportunity, and let's meet at the Millenáris again this autumn.

Share on Facebook! Share on Twitter! Share on Tumblr!

How IT innovators turn digital disruption into a business productivity force multiplier

How IT innovators turn digital disruption into a business productivity force multiplier

The next BriefingsDirect business innovation thought leadership panel discussion examines how digital business transformation has been accomplished by several prominent enterprises. We'll explore how the convergence of cloud, mobility, and big-data analytics has prompted companies to innovate and produce new levels of award-winning productivity.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how these trend-setters create innovation value, we're joined by some finalists from the Citrix Synergy 2016 Innovation Awards Program: Olaf Romer, Head of Corporate IT and group CIO at Bâloise in Basel, Switzerland; Alan Crawford, CIO of Action for Children in London, and Craig Patterson, CEO of Patterson and Associates in San Antonio, Texas. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Olaf, what are the major trends that drove you to reexamine the workplace conceptually, and how did you arrive at your technology direction for innovating in that regard?

Romer: First of all, we're a traditional Swiss insurer. So, our driver was to become a little bit more modern in order to attract the new generation of people to our company. In Switzerland, this is a little bit of a problem; there are also big companies in Zurich, for example. So, it's very important for us.

We did this in two directions. One direction is on the IT side, and the other direction is on the real-estate side. We changed from the traditional office boxes to a flex office with open space, like Google has. Nobody has their own desk, not even me. We can go anywhere in our office and sit with whom we think it’s necessary. This is also on the IT side. We go in this direction to go for more mobility, an easier way to work in our company.

Gardner: And because you’re an insurance organization, you have a borderless type of enterprise, where you need to interact with field offices, other payers, suppliers, and customers, of course.

Was that ability to deal with many different types of end-point environments also a concern, and how did you solve that?

Romer: The first step was inside our company, and now, we want to go outside to our brokers and to our customers. The security aspect is very, very important. We're still working on being absolutely secure, because we're handling sensitive customer data. We're still in the process of opening our ecosystem outward to the brokers and customers, but also to other companies we work with. [See related post, Expert panel explores the new reality for cloud security and trusted mobile apps delivery.]

Gardner: Alan, tell us about Action for Children and what you’ve been doing in terms of increasing the mobile style of interactions in business.

Crawford: Action for Children is a UK charity. It helps 300,000 children, families, and young people every year. About 5,000 staff operate from between 300 and 500 locations; 300 are our own, and a couple of hundred locations are with our partner agencies.

When I started there, the big driver was around security and mobility. A lot of the XP computers were running out of support, and the staff outside the office was working on paper.

There was a great opportunity in giving modern tablets to staff to improve the productivity. Productivity in our case means that if you spend less time doing unnecessary visits or do something in one visit instead of three, you can spend more quality time with the family to improve the outcomes for the children.

Gardner: And, of course, as a non-profit organization, costs are always a concern. We’ve heard an awful lot here at Citrix Synergy about lower cost client and endpoint devices. Has that been a good news to your ears? [Learn more about Citrix Synergy 2016.]

Productivity improvements

Crawford: It has. We started with security and productivity as being the main drivers, but actually, as we’ve rolled out, we’ve seen those productivity improvements arise. Now, we're looking at the cost, about the savings we can make on travel, print, and stationery. Our starting budget this year is £1.3 million ($1.7 million) less than it was the year before we introduced tablets for those things. We're trying to work out exactly how much of that we can attribute to the mobile technology and how much of that is due to other factors.

Gardner: Craig, you're working with a number of public sector organizations. Tell us about what they are facing and what mobility as a style of work means to them.

Patterson: Absolutely. I'm working with a lot of public housing authorities. One is Lucas Metropolitan, and another is the Hampton Redevelopment Agency. What they're facing is declining budgets and a need to do more with less.

When we look at traditional housing-authority and government-service agencies that are paper-based, paper just continues to multiply. You put one piece in the copier and 20 pieces come out. So, being able to take the documents that contain secure private information of our clients and connect those with the clients out in the field is why we need mobility and efficiency and workflows.

And the cloud is what came to mind for that. With content management, we can capture data out in the field. We can move our staff out into the field. We don't have to bring all of the clients into the office, which can sometimes pose a hardship, especially for the elderly, the disabled, and many of those in the greatest need. Mobility and efficiency with the cloud, and the security around it, have become paramount in how we perform our business.

Gardner: I suppose another aspect of mobility is the ability to bring data in analytics to the very edge. Have you yet to take advantage of that or do you see that it’s something that you’re going to be working toward?

Patterson: We know that it’s something we're working toward. We know from the analytics that we’ve been able to see so far that mobility is the key. For some time, people have thought that we can’t put online things like applications for affordable housing, because people don’t have access to the Internet.

Our analytics prove that entirely wrong. Age groups of 75 and 80 were accessing it on mobile devices faster than the younger group was. What it means is that they find a relative, a grandchild or whoever they need that allows them to access the Internet. It’s been our mindset that has kept us from making the internet and those mobility avenues into our systems available on a broader scale. So, we're moving in that direction so that self service to that community can be displayed more in a broader context.

Measuring outcomes

Crawford: On the analytics, and how that's helped by mobile working, we had a very similar result at Action for Children in the same year we brought out tablets. We started to do outcome measures with the children we work with. For each child, we do a baseline measure when we first meet the family, and then maybe three months later, whatever the period of the intervention, we do a further measure.

Doing that directly on a tablet with the family present has really enhanced the outcome measures. We now have measures on 50,000 children and we can aggregate that, see what the trends are, see what the patterns are geographically by types of service and types of intervention.

Gardner: So it’s that two-way street; the more data and analytics you can bring down to the edge, the more you can actually capture and reapply, and that creates a virtuous cycle of improvement in productivity.

Crawford: Absolutely. In this case, we're looking at the data and learning lessons about what works better to improve the outcomes for disadvantaged children, which is really what we're about.

Gardner: Olaf, user experience is a big topic these days, and in insurance it goes right to the very edge, where there might be a settlement event of some sort, then back to the broker and back to the enterprise. User-experience improvements at every step of that ultimately mean a more productive outcome for your end customers. [See related post, How the Citrix Technology Professionals Program produces user experience benefits from greater ecosystem collaboration.]

How does user experience factor into this mobility and data in an analytics equation?

Romer: First of all, the insurance business is a little bit different from the others here. The problem is that our customers normally don't want to touch us during the year. They get a one-time invoice from us and they have to pay the premium. Then they hope, and we also hope, that they won't have a claim.

We have only one touch a year, and that's a bit of a problem. We try to do everything to be more attractive to customers and draw them to us, so that it's clear to them that if they have a problem or need new insurance, they go to Bâloise Insurance.

We're working to bring in a little bit of consumerization. In former years, the insurance business was very complicated and not transparent. Customers had to answer 67 questions before they could take out insurance with us, and that's the point: to make it as simple as possible and to work with new technology, we have to be attractive to customers, for example by letting them take out insurance through an iPhone. That's not so easy.

If you talk with a core insurance person about calculating the premiums, they'll still want those 67 answers from the customers. So it's not only the technology; it's also about working a little differently in the insurance business, and the technology will help us there too. For me, the buzzword is big data, and now we have to bring out the value of the data we have in our business, so that we can go directly to the right customer segment with the right user interface.

Gardner: Another concept we've heard quite a bit at Synergy is the need to allow IT to say yes more often. Starting with you, Craig: what trends and technology are most impactful in letting you say yes to requests and to the need for agility in these public-sector organizations?

Device agnosticism

Patterson: It's device agnosticism, with bring your own device (BYOD). It's a device the individuals are already familiar with. I'll take it from two angles. It could be an employee delivering a service out in the field who brings their own device, or a partner or contractor, so that we can integrate and shrink-wrap certain data. We still have data security while they're doing something out in the field for us. It could be inspections, customer service, medical, and so on.

Then, on the client end, they have their own device. We can deliver products through portals that don't care what device they have, based on mobile protocols and security. Those are the types of trends that are going to allow us to collect the big analytics, test what we think we know, and find out whether we really know it or not; to get the facts.

The other piece of it is making it easy to access the services we provide to the community, because it's now a digital community, not just the hardcore community. To see people standing in a waiting line for applications now hurts my feelings. We want to see them online, accessing services 24x7, when it makes sense for them. Those are the types of services I see becoming the greater trends in our industry.

Gardner: Alan, what allows you to say “yes” more often?

Crawford: When I started, with the XP laptops, we were saying no. Compare that with our programs and children's centers now: staff are using the tablets and the technology, and we have closed Facebook groups with those families. There's now peer support outside office hours, around when children are going to bed, which is often when issues arise in a family.

They use Eventbrite, the booking app. There are some standard off-the-shelf apps, but the really enterprising example is one of our services in a rural community, which used to tell everybody in the community what services it was running through printed posters and flyers. That has moved to developing our own app. The prototypes are already out there, and the full app will be out in a few weeks' time. We're saying yes to all of those things. We want to support them. It's not just yes, but yes, and how can we help you do that?

Gardner: Olaf, of course, productivity gains are only as persuasive as the metrics behind them when you need to convince the higher-ups in the boardroom to invest. Do you have any measurements, metrics, or even anecdotes about how you measure productivity and what you've done to modernize your workspaces?

Romer: For us, it's the feedback from the people. It's very difficult to measure on a pure technology level, but the feedback from our people is very good and very important for us. With the BYOD we introduced a year and a half ago, you can see a strong cultural change in collaboration. We work together much more efficiently across the company and across departments.

In former times, we had closed file shares, and I couldn't see the files of the department next to me. Now we're working in a completely modern, collaborative way. Still, in traditional insurance lines, let's say with the government, it's very hard for people to work in the new style.

In the beginning, there were very strong concerns about that, and now we're in a cultural shift. We get a lot of good feedback that in project teams, or when there are problems or issues, we can work much better and faster together.

Metrics of success

Gardner: Craig, of course it's great to say yes to your constituents, but it's also good to tell your higher-ups, and those who control the budget, that you're doing more with less. Are there any metrics of success you can recall from the public-sector organizations you're working with?

Patterson: Absolutely. I'll talk about files and workflow. Before, when a document came into the organization, we mapped how much time and money it took to get it into a file folder, having been viewed by everyone who needed to view it. For quick context: a document used to take a file folder, a label maker, and a copy machine, and every time a person needed to put a document in that folder, someone had to get it there. Now, the term "file clerk" is actually becoming obsolete.

When a document comes in, it gets scanned, it's instantaneously put in the correct order in the right electronic folder, and an electronic notification is sent to the person who needs to know. That happens in seconds. Month by month, it adds up to real savings; before, we were managing files rather than assisting people.
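The flow Patterson describes (scan, file, notify) can be pictured with a minimal sketch like the following. The classifier and the routing table are stand-ins for illustration, not a real document-management API.

import datetime

def classify(text):
    # Stand-in classifier; a real system might use barcodes, form
    # templates, or a trained model to pick the folder.
    return "inspections" if "inspection" in text.lower() else "general"

def owner_of(folder):
    # Hypothetical routing table mapping folders to responsible staff.
    return {"inspections": "inspector@agency.example"}.get(folder, "clerk@agency.example")

def notify(address, record):
    print(f"notify {address}: new document in {record['folder']}")

def file_document(text, store):
    folder = classify(text)
    record = {"folder": folder,
              "received": datetime.datetime.now().isoformat(),
              "text": text}
    store.setdefault(folder, []).append(record)   # filed in order of arrival
    notify(owner_of(folder), record)              # instant electronic notification
    return record

store = {}
file_document("Unit 4B annual inspection report", store)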

The metrics are in the neighborhood of 75 percent paper reduction, because people aren't making copies. That means they're not walking to the copy machine and, along the way, to the water cooler and the conversation pits, so some of those lost efficiencies are recovered. We can now see how many file folders you looked at, and how many documents you actually touched, read, and reviewed in comparison with somebody else.

We saw as few as five documents for one person in a month, compared with 1,700 for another. That starts to tell you something about where your workload is shifting. Not everyone likes that. They might consider it a little bit "big brother," but we need those analytics to know how best to change our workflows to serve our customer, and that's the community.

Gardner: I don't know if this is a metric that's easy to measure, but less bureaucracy is something just about everyone would favor. Can you point to something that shows you're able to reduce bureaucracy through technology?

Patterson: When you look at bureaucracy and unnecessary paper flows, there are certain yes-and-no questions that are part of bureaucracy. A document comes across somebody's desk, and their job is to stamp yes or no on it. What decision do they really have to make? Often none; they just have to stamp yes. To me, that's classic bureaucracy.

Now, if the document hits that person's desk and it meets a certain criterion or threshold, the computer automatically and instantaneously approves it, with a documented audit trail. That saves some of our clients in the housing-authority industry real effort when the auditors come to review things. And when a person did have to make a decision, the system recorded how long it took, so we can ask why something is taking so long, or whether there are questions that don't need to be answered at all.
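A minimal sketch of that rule, assuming an invented dollar threshold: requests under the limit are approved instantly and logged to an audit trail, while everything else is queued for a human decision.

import datetime

AUTO_APPROVE_LIMIT = 500.00  # hypothetical policy threshold
audit_trail = []

def review(request):
    decision = "auto-approved" if request["amount"] <= AUTO_APPROVE_LIMIT else "needs-review"
    audit_trail.append({
        "request_id": request["id"],
        "decision": decision,
        "timestamp": datetime.datetime.now().isoformat(),  # documented for auditors
    })
    return decision

print(review({"id": 101, "amount": 120.00}))   # auto-approved in milliseconds
print(review({"id": 102, "amount": 4200.00}))  # routed to a person
print(audit_trail)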

Gardner: So let the systems do what they do best, and let the people handle exception management and the value-added activities. Alan, did you have thoughts about metrics of success, bureaucracy, or both?

Proxy measure

Crawford: Yes, it’s the metrics. The Citrix CEO [Kirill Tatarinov] talked at Citrix Synergy about productivity actually going down in the last few years. We’ve put all these tablets out there and we have individual case studies where we know a particular family-support worker has driven 1,700 miles in the year with the tablet, and it was 3,400 miles in the year without. That’s a proxy measure of how much time they're spending on the road, and we have all the associated cost of fuel and wasted time and effort.

We've just installed an app, rolled out in the last month or so, that measures how many tablets have been switched on in the month, how much they've been used each day, and what they've been used for. We can break that down by geographical area and give that information back to the line managers, because they're the people to whom it will actually make sense.

I'm right at the stage where it's great information, and it's really powerful, but the open question is how many hours a day they should actually be using that tablet. We're not quite sure, and it probably varies from one type of service to another.

We look at those trends over a period of months. We can tell managers that, yes, overall staff usage is 90 percent, but it's 85 percent in your area. All managers, I find, are fairly competitive.
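A minimal sketch of that kind of per-area rollup, with invented event fields rather than any real telemetry schema:

from collections import defaultdict

# Invented monthly device events; a real feed would come from the
# measurement app Crawford describes.
events = [
    {"tablet": "T1", "area": "East", "switched_on": True},
    {"tablet": "T2", "area": "East", "switched_on": False},
    {"tablet": "T3", "area": "West", "switched_on": True},
]

by_area = defaultdict(lambda: {"on": 0, "total": 0})
for e in events:
    by_area[e["area"]]["total"] += 1
    by_area[e["area"]]["on"] += int(e["switched_on"])

for area, s in sorted(by_area.items()):
    print(f"{area}: {100 * s['on'] / s['total']:.0f}% of tablets used this month")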

Gardner: Well, that may be a hallmark of business agility, when you can try things out, A/B testing. We’ll try this, we’ll try that, we don’t pay a penalty for doing that. We can simply learn from it and immediately apply our lesson back to the process.

Crawford: It's all about how we support those areas where we identify that they're not making the most of the technology they've been given. It might be human factors (the staff, or even the managers, are fearful), or it might be technical factors; there are inhibitors around mobile network coverage, and even broadband coverage, in some rural areas. We follow up on all of that user-experience information we get back and try to proactively improve things.

Gardner: Olaf, when we ask enterprises where they are in their digital transformation, many are saying they're just at the beginning. For you, who are obviously well into a digital transformation process, what lessons learned could you share; any words of advice for others as they embark on this journey?

Romer: The first digital transformation in the insurance business came in the mid-1990s, when we started to go paperless and work with digital systems. Today, more than 90 percent of our new insurance contracts are completely paperless. In Germany, for example, you can give a digital signature. That's not allowed in Switzerland at the moment, but from a technical perspective, we can do it.

My advice would be that digitalization gives you a good opportunity to rethink things and make them simple. We built up great complexity over the years, and now we're able to bring that down and make everything as simple as possible. We created the slogan, "Simply Safe," to push ourselves to rethink everything we're doing and make it simple and safe. Again, for insurance, it's very important that digitalization reduces complexity rather than adding to it.

Gardner: Craig, digital transformation, lessons learned, what advice can you offer others as they embark?

Document and workflow

Patterson: For digital transformation, I'll use document and workflow as the example. Start with the higher-end items; there's low-hanging fruit there. I don't know if we'll ever be totally paperless, which would really allow us to go mobile, but at the same time, know what not to scan, know what to archive, and know what to just get rid of. And don't hang on to old technologies for too long. Technology lifecycles are getting shorter, and we need to plan our strategies along those lines.

Gardner: Alan, words of advice on those also interested in digital transformation?

Crawford: For us, it started with connecting to our cause. If you tell social-care staff, "We're going to do a digital transformation," it's not going to really enthuse them. But if you explain that this is about actually improving the lives of children with technology, then they start to get interested. So there's a bit about using your cause and relating the change to your cause.

A lot of our people factors are about how to engage and train. It's no longer IT saying, "Here's the solution, and we expect you to do ABC." It's working with those social-care workers: here are the options; what will work for you, and how should we approach it? And then it's never letting up.

Actually, you’ve got to follow through on all this change to get the real benefits out of it. You’ve got to be a bit tenacious with it to really see the benefits in the end.

Gardner: Tie your digital transformation to the organization's mission so that there is no daylight between them.

Crawford: We've got a project, digitally enabling Action for Children, and that was to try to link the two together inextricably.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Citrix.

You may also be interested in:

Infrastructure as destiny — How Purdue builds an IT support fabric for big data-enabled IoT

The next BriefingsDirect Voice of the Customer IT infrastructure thought leadership case study explores how Purdue University has created a strategic IT environment to support dynamic workload requirements.

We'll now hear how Purdue extended a research and development IT support infrastructure to provide a common, "operationally credible" approach to supporting the myriad compute demands of end users and departments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how a public university is moving toward IT as a service, please join Gerry McCartney, Chief Information Officer at Purdue University in Indiana. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: When you're in the business of IT infrastructure, you need to predict the future. How do you close the gap between what you think will be demanded of your infrastructure in a few years and what you need to put in place now?

McCartney: A lot of the job that we do is based on trust and people believing that we can be responsive to situations. The most effective way to show that right now is to respond to people’s issues today. If you can do that effectively, then you can present a case that you can take a forward-looking perspective and satisfy what you and they anticipate to be their needs.

I don’t think you can make forward-looking statements credibly, especially to a somewhat cynical group of users, if you're not able to satisfy today’s needs. We refer to that as operational credibility. I don’t like the term operational excellence, but are you credible in what you provide? Do people believe you when you speak?

Gardner: We hear an awful lot about digital disruption in other industries. We see big examples of it in taxi cabs, for example, or hospitality. Is there digital disruption going on at university campuses as well, and how would you describe that?

McCartney: You can think of a university as consisting of three main lines of business, two of which are our core activities: teaching and educating students, and producing new knowledge, or research. The third is the business of running that business. A very large infrastructure is built up around that third leg, for a variety of reasons.

But if we look at the first two, research in particular, which is where we started, this concept of the third leg of science has been around for some time now. It used to be just experimentation and theory creation. You create a theory, then you do an experiment with some test tubes or the like, or grow a crop in the field. Then you refine your theory, and you continue in that dyadic mode, going backward and forward.

Third leg of science

That was all right until we wanted to crash lorries into walls or to fly a probe into the sun. You don’t get to do that a thousand times, because you can’t afford it, or it’s too big or too small. Simulation has now become what we refer to as the third leg of science.

Slightly more than 35 percent of our actual research now uses high-performance computing (HPC) in some key parts of it to produce results, then shape the theory formulation, and the actual experimentation, which obviously still goes on.

Around teaching, we've seen for-profit universities and, more recently, massive open online courses (MOOCs). There's a strong sense that the current mode of instructional delivery cannot stay the same as it has been for the last few hundred years, and that it's ripe for reform.

Indeed, my boss at Purdue, Mitch Daniels, would be a clear and vibrant voice in that debate himself. To go back to my earlier comments, our job there is to be able to provide credible alternatives, credible solutions to ideas as they emerge. We still haven’t figured that out collectively as an industry, but that’s something that is in the forefront of a lot of peoples’ minds.

Gardner: Suffice to say that information technology will play a major role in that, whatever it is.

McCartney: It’s hard to imagine a solution that isn’t actually completely dependent upon information technology, for at least its delivery, and maybe for more than that.

Gardner: So, high-performance computing is a bedrock for the simulations needed in modern research. Has that provided you with a good stepping stone toward more cloud-based, distributed computing-based fabric, and ultimately composable infrastructure-based environments?

McCartney: Indeed it has. I can go back maybe seven or eight years at our place, and we had close to 70 data centers on our campus. And by a data center, I mean a room with at least a 200-amp supply and at least 30 tons of additional cooling, not just a room that happens to have some computers in it; I couldn't possibly count how many of those there are. Those stand-alone data centers are almost all gone now, thanks to our community cluster program, and the long game is that we probably won't have much hardware on our campus at all a few years from now.

Right now, our principal requirement is around research computing, because we have to put the storage close to the compute. That's just a requirement of the technology.

In fact, many of our administrative services right now are provided by cloud providers. Our users are completely oblivious to that, but we have no on-premises solution at all. We're not doing travel, expense reimbursement and a variety of back-office things on our campus at all.
That trend is going to continue, and the forcing function there is that I can't spend enough on security to protect all the assets I have. So, rather than spend even more on security and fail to provide that completely secure environment, it's better to go to somebody who can provide that environment.

Data-compute link

Gardner: What sort of an infrastructure software environment do you think will give you that opportunity to make the right choices when you decide on-prem versus cloud, even for those intensive workloads that require a tight data and compute link?

McCartney: The worry for any CIO is that the only thing I have that's mine is my business data. Anything else -- web services, network services -- I can buy from a vendor. What nobody else can provide is my actual accounts, to use a business term, whether that's research information, instructional information, or just regular bookkeeping information.

When you walk into the room of a new solution, you're immediately looking at the exit door. In other words, when I have to leave, how easy, difficult, or expensive is it going to be to extract my information back from that solution?

That drives a huge part of any consideration, whether it's cloud or on-prem, proprietary or open code. When this product dies, the company goes bust, or we lose interest in it, how easy, expensive, or difficult is it for me to extract my business data back from that environment? Because I am going to need to do that.

Gardner: What, at this juncture, meets that requirement in your mind? We've heard a lot recently about container technology, standards for open-source platforms, industry accepted norms for cloud platforms. What do you think reduces your risk at this point?

McCartney: I don't think it's there yet for me. I'm happy to put, relatively speaking, small lines of business there. You're also dependent on your network availability and volume. So I'm quite happy where we are; we weren't the first, and being first isn't an important narrative for us as an institution.

I'm quite happy for everybody else to knock the bumps out of the road for me, and I'll be happy to drive along it when it’s a six-lane highway. Right now it's barely paved, and I'll allow other brave souls to go there ahead of me.

Gardner: You mentioned the word "cynical" early on in our discussion. Tell me a little bit about the unique requirements of a university environment, where you need to provide a common, centrally managed approach to IT for cost, security, and manageability, but also see to the unique concerns and requirements of individual stakeholders.

McCartney: All universities are, as they should be, full of self-consciously very smart people who are all convinced they could do a job, any particular job, better than the incumbent is doing it. Having said that, the vast bulk of them have very little interest in anything to do with infrastructure.

The way this plays out is that the central IT group provides the core base services: the network, the wireless services, base storage, base compute, things like that. Then, as you move to the edge, the departments add the things that make a difference at the edge.

Providing the service

In other words, if you have a unique electrical device that you want to plug into a socket in the wall because you're in paleontology, cell biology, or organic chemistry, that's fine. You don't need your own electricity-generating plant to do that. I can provide you with the electricity. You just need the cute device, and you can do your business, and everybody is happy.

Whatever the IT equivalent to that is, I want to be the energy supplier. Then, you have your device at the edge that makes a difference for you. You don't have to worry about the electricity working; it's just there. I go back to that phrase "operational credibility." Are we genuinely surprised when the service doesn’t work? That’s what credibility means.

Gardner: So, to me, that really starts to mean IT as a service, not just electricity or compute or storage. It's really the function of IT. Is that in line with your thinking, and how would you best describe IT as a service?

McCartney: I think that's exactly right, Dana. There are two components to this. There's an operational component, which is, are you a credible provider of whatever the institution decides the services are that it needs, lighting, air-conditioning or the IT equivalence of that? They just work. They work at reasonable cost; it's all good. That’s the operational component.

The difference with IT, as opposed to other infrastructure components, is that IT has itself the capability to transform entire processes. That’s not true of other infrastructure things. I can take an IT process and completely reengineer something that's important to me, using advantages that the technology gives me.
For example, I might be concerned about student performance in particular programs. I can use geolocation data about their movement. I can use network activity. I can use a variety of other resources available to me to help guide those students on what's good behavior, and what behavior is helpful to the outcome they want. You can't do that with an air-conditioning system.
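As a sketch of the idea, and only that, here is one way such signals could be combined into a single engagement score. The weights and field names are assumptions for illustration, not Purdue's actual model.

def engagement_score(days_on_campus, network_logins, club_events):
    # Simple weighted sum; a real model would be fit to outcome data.
    return 0.5 * days_on_campus + 0.3 * network_logins + 0.2 * club_events

students = {
    "student_a": engagement_score(days_on_campus=26, network_logins=140, club_events=4),
    "student_b": engagement_score(days_on_campus=6, network_logins=30, club_events=0),
}

# Flag low-engagement students so an adviser can reach out early.
flagged = [name for name, score in students.items() if score < 30]
print("advisers might reach out to:", flagged)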

IT has that capability to reinvent itself and to reinvent entire processes. You mentioned some of them, such as the way Uber has entirely disrupted the taxi industry. I'd say the same thing here.

There's one part of the CIO's job that's operational: does everything work? The second part is, if we're in a transition period to a new business model, how involved are the IT leaders in your group in that discussion? It's not just whether we can do this with IT; it's whether the CIO and the CIO's staff can bring an imagination to the conversation, a perspective different from the other voices in the organization. That's true of any industry or line of business.

Are you merely there as a handmaiden waiting to be told what to do, or are you an active partner in the conversation? Are you a business partner? I know that’s a phrase people like to use. There's a kind of a great divide there.

Gardner: I can see where IT is a disruptor -- and it’s also a solution to the disruptor, but that solution might further disrupt things. So, it's really an interesting period. Tell me a little bit more about this concept of student retention using new technologies -- geolocation for example -- as well as big data which has become more available at much lower cost. You might even think of analytics as a service as another component of IT as a service.

How impactful will that be on how you can manage your campus, not only for student retention, but perhaps for other aspects of a smarter intelligent campus opportunity? [See related post, Nottingham Trent University Elevates Big Data’s Role to Improving Student Retention in Higher Education.]

Personalized attention

McCartney: One of the great attractions of small educational institutions is that you get a lot of personalized attention. The constraint of a small institution is that you have very little choice. There's a small number of faculty, and they simply can’t offer the options and different concentrations that you get in a large institution.

In a large institution, you have the exact opposite problem. You have many, many choices, perhaps even too many: subjects that, as a 19-year-old, you've never even heard of. Perhaps you get less individualized attention, and you fill that gap by taking advice from students who went to your high school a year before, people in your residence hall, or people you bump into on the street. The knowledge you acquire there is accidental, opportunistic, and not structured in any way around you as an individual, but it's better than nothing.

There are advisors, of course, and there are people, but you don't know these individuals. You have to go and form relationships with them and they have to understand you and you have to understand them.

A big-data opportunity here is to be able to look at the students at some level of individuality. "Look, this is your past, this is what you have done, this is what you think, and this is the behavior that we are not sure you're engaging in right now. Have you thought about this path, have you thought about this kind of behavior for yourself?"

A well-established principle in student services is that the best indicator of student success is how engaged students are in the institution. There are many surrogate measures of that, such as whether they participate in clubs. Do they go home every weekend, indicating that they're not really engaged, that they haven't made the transition?

Independent of your academic ability, your SAT scores, and the GPA you got in high school, that engagement is highly correlated with success and good outcomes, the outcomes everybody wants.
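That correlation claim is easy to sanity-check on data. Here is a minimal sketch with invented numbers (it needs Python 3.10+ for statistics.correlation):

from statistics import correlation  # Python 3.10+

# Invented data: an engagement measure (e.g., club participation) and a
# binary outcome (1 = retained/graduated). A real analysis would also
# control for SAT scores and high-school GPA, as McCartney notes.
engagement = [8, 2, 9, 4, 7, 1, 6, 3]
outcome    = [1, 0, 1, 0, 1, 0, 1, 1]

print(f"engagement vs. outcome: r = {correlation(engagement, outcome):.2f}")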

As an institution, how do you advise or counsel them? Perhaps there's nothing there that they're interested in, and that can be a problem with a small institution. It's very intimate. Everybody says, "Dana, we can see you're not having a great time. Would you like to join the chess club or the draughts club?" And you say, "Well, I was looking for the Legion of Doom Club, and you don't seem to have one here."

Go to a large institution and they probably have two of those clubs, but how would you find them, and how would you even know to look? How would you discover new things you didn't even know you liked, because the high school you went to didn't teach applied engineering, or a whole pile of other things, for that matter?

Gardner: It’s interesting when you look at it that way. The student retention equation is, in a business sense, the equivalent of user experience, personalization, engagement, share of wallet, those sorts of metrics.

We have the opportunity now, probably for the first time, to use big data, Internet of Things (IoT), and analytics to measure, predict, and intercede at a behavioral level. So in this case, to make somebody a productive member of society at a capacity they might miss and you only have one or two chances at that, seems like a rather monumental opportunity.

Effective path

McCartney: You're exactly right, Dana. I'm not sure I like the equivalence with a customer, but I get the point you're making. What you're trying to do is genuinely help students discover an effective path for themselves and learn it. You can learn it randomly, and that's nice, but we don't want to create a kind of railroad track either: you're here, so you've got to end up over there. That's not helpful.

My own experience, and I don’t know about other people listening to this, is that you have remarkably little information when you're making these choices at 19 and 20. Usually, if you were getting direction, it was from somebody who had a plan for you that was more based on their experience of life, some 20 or 30 years previously than on your experience of life.
So where big data can be a very effective play is to say: "Look, here are people who look like you, and here are the choices they made. You might find some of these choices interesting. If you do, then here's how you'd go about exploring them."
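A minimal sketch of that "people like you" idea, as a nearest-neighbor lookup over invented student feature vectors; this illustrates the approach only, not Purdue's system.

import math

# Invented records: feature vectors for past students, plus the program
# each one ultimately chose.
past_students = [
    {"features": [0.9, 0.2, 0.7], "program": "applied engineering"},
    {"features": [0.1, 0.8, 0.3], "program": "pharmacy"},
    {"features": [0.8, 0.3, 0.6], "program": "computer science"},
]

def similar_choices(features, k=2):
    # Rank past students by Euclidean distance (math.dist, Python 3.8+).
    ranked = sorted(past_students, key=lambda p: math.dist(features, p["features"]))
    return [p["program"] for p in ranked[:k]]

# Programs chosen by the two most similar past students.
print(similar_choices([0.85, 0.25, 0.65]))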

As you rightly say, and implicitly suggested, there's concern right now about the high cost, especially of residential education. The most wasteful expenditure is when you spend a year or two finding out you should never have been in a program, that you have no love for the subject and no affinity for it.

The sooner you can find that out for yourself and make a conscious choice, the better. We see big data having a very active role in that, because one of the great advantages of being a large institution is that we have tens of thousands of students over many years. We know what those outcomes look like, and we know the different choices different people have made. Yes, you can be the first person to make a brand-new choice, and good for you if you are.

Gardner: Well it’s an interesting way of looking at big data that has a major societal benefit in the offing. It also provides predictability and tools for people in ways they hadn’t had before. So, I think it’s very commendable.

Before we sign off, what comes next -- high-performance computing (HPC), fabric cloud, IT as a service -- is there another chapter in this journey that perhaps you have a bead on that we're not aware of?

McCartney: Oh my goodness, yes. We have an event that I started three years ago called "Dawn or Doom," which asks: if technology is a forcing function -- and we're not even going to assert that it definitely is -- are we reaching a point of a new nirvana, a new human paradise where we've resolved all major social and health problems? Or have we created some new seventh circle of hell, an unmitigated disaster for almost everybody, if not everybody? Is this the end of life as we know it? Do we create robots that are superior to us in every way, so that we become just some intermediate form of life that has reached the end of its cycle?

This is an annual event that's free and open; anybody who wants to come is very welcome to attend. You can Google "Dawn or Doom Purdue." We look at it from all different perspectives. We obviously have engineers and computer scientists, but we also have psychologists and labor economists. What about the future of work? If nobody has a job, is that a blessing or a curse?

Psychologists and philosophers ask: what does artificial intelligence mean? What does a self-conscious machine mean? Currently, of course, we worry about things like food security. And the Zika virus: are we spawning a whole new set of viruses we have no cure for? Have we reached the end of the effectiveness of antibiotics or not?

These are all incredibly interesting questions I would think any intelligent person would want to at least probe around, and we've had some significant success with that.

Next event

Gardner: When is the next Dawn or Doom event, and where will it be?

McCartney: It will be in West Lafayette, Indiana, on October 3 and 4. We have a number of external high-profile keynote speakers, and then we have a passel of Purdue faculty. So you will find something to entertain even the most arcane of interests. [For more on Dawn or Doom, see the book, Dawn or Doom: The Risks and Rewards of Emerging Technologies.]

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

BBBT to Host Webinar from Rocana on how Digital Transformation Starts with Total Operational Visibility

This Friday, the Boulder Business Intelligence Brain Trust (BBBT), the largest industry analyst consortium of its kind, will host a private webinar from Rocana on how Rocana Ops levels the playing field by providing a total visibility solution for CIOs and technologists.

(PRWeb August 02, 2016)

Read the full story at http://www.prweb.com/releases/2016/08/prweb13588453.htm

How UPS automates supply chain management and gains greater insight for procurement efficiency

The next BriefingsDirect business innovation for procurement case study examines how UPS modernizes and streamlines its procure-to-pay processes.

Learn how UPS -- across billions of dollars of supplier spend per year -- automates supply-chain management and leverages new technologies to provide greater insight into procurement networks. This business process innovation exchange comes to you in conjunction with the Tradeshift Innovation Day held in New York on June 22, 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. 

To explore how procurement has become strategic for UPS, BriefingsDirect sat down with Jamie Dawson, Vice-President of UPS's Global Business Services Procurement Strategy in Atlanta. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What are the major trends you're seeing in procurement, and how are you changing your strategy to adjust?

Dawson: We're seeing a lot of evolution in the marketplace, in terms of both technology and new opportunities in ways to procure goods, and that's true around the globe. We're adjusting our strategy and also challenging some of our business partners to come along with us.

We're a $60 billion company. Last year, our total expenses were somewhere in the $50-billion range, lots of goods and services flowing around the globe.

Gardner: And so any way you can find new efficiency or new spend-management benefits turns into significant savings.

Dawson: Absolutely.

Gardner: Now that you're looking for new strategies and new solutions, what is it in procurement that’s of most interest to you and how are you using technology in ways you didn't before?

Collaboration and partnerships

Dawson: One of the new ways is a combination of partnerships, both with third parties and with our own internal business partners. We're collaborating with other functions; procurement is not something we're doing to them. We're working with them to understand what their needs are, and working with their suppliers as well.

Gardner: We're hearing some very interesting things these days about using machine learning and artificial intelligence, combining that with human agents who are specialized. It sounds like, in some ways, external procurement services can do the job better than anyone. Is that something that you're open to? Is procurement as a service something you're looking at? [See related post, How new modes of buying and evaluating goods and services disrupts business procurement — for the better.]


Dawson: Procurement-as-a-service has a certain niche play. There will always be basic buy-and-sell items, even for individuals. There are some things you don't research; you just go out and buy them. There are other things for which you do a lot of research and look into different solutions.

There are different things that will cause you to research more. Maybe it's a competitive advantage, maybe you're looking for an opportunity in a new space or a new corner of the globe. So, you'll do a lot more research, and your solutions need to be scalable. If you create and start in Europe, maybe you'll also want to use it in Asia. If you start in the US, maybe you want to use elsewhere.

Gardner: It sure sounds like, during this period of experimentation, the boundary between the things you buy by rote and the things you buy with a lot of expertise or research is shifting. Are you experimenting as an organization, and what is interesting to you as you look at new opportunities from the people in the procurement-network space?

Dawson: There will always be complex areas that require solution orientation more than just price. They need a deep understanding of industry, knowledge, and partnership. There are a lot of other areas where the opportunities are expanding every day. [See related post, ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement.]

Gardner: As you think about what you've done and been able to accomplish, do you have any advice for other organizations that are also starting to think about modernizing and strategizing, rather than just doing it in the traditional old way? What would you tell them?

Dawson: Two things. One would be within the procurement organizations to be open to new ideas. And second, get the rest of the organization behind you, because you're going to need their support.

Gardner: It seems that procurement as a function is just far more strategic than it used to be. Not only are you able to get more goods and services, but you can save significant amounts of money. Do you feel that your profile as an organization within UPS is rising or expanding in terms of the role you play in the larger organization? [See related post, CPO expert Joanna Martinez extols the virtues of redesigning procurement for strategic business agility.]


Don't have to sell

Dawson: I'm certainly aware that knowledge of our capabilities and our demonstrated successes is now being recognized throughout the organization. And it becomes self-feeding. You get on a roll, and you can further expand the capabilities once that knowledge is out there; you don't have to sell.

Gardner: Last question, looking to the future, on a vision level, what’s really exciting to you? What are you thinking that might be more important to you in how you do business two or three years from now? It could be technology, suppliers, ecosystems, cloud enabled intelligence, that sort of thing.

Dawson: It's a very interesting question, because it's almost the same answer: your greatest fear is also the greatest benefit. I listened to what we just heard about the Tradeshift Go tool, and it's crazy how exciting this is. You heard all the questions in the room about how to adapt it to what you already have today. The world still exists as it exists today.

So, there's this huge transition period where we're bolting these fantastic new ideas onto our existing infrastructure. That transition into what's new, and really embracing it, is the most exciting part of all.

Gardner: Disruption can be good and disruption can be bad.

Dawson: It will be a challenging journey.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Tradeshift.

You may also be interested in:

How To Become A Data-Driven Company

The value of big data can't be overstated for businesses nowadays. The collection and analysis of data have allowed thousands of businesses to make decisions driven by that data, lending those decisions more weight and credibility, and even to predict the future with machine learning.

But how can you actually become a business driven by data? The idea of being able to make powerful decisions based on hard evidence is something many businesses crave, but implementing this kind of system can be challenging, to say the least. Here are a few steps you can take in order to build your data strategy.

Collect the data

The first, and arguably the most important, step to becoming a data-driven company is to start collecting the data. You can't make those all-important decisions without it.
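As a minimal sketch of what "start collecting" can look like, here is an append-only event logger. A cloud platform would replace the local file with object storage, and the event fields here are purely illustrative.

import datetime
import json

def record_event(event, path="events.log"):
    # Keep every field intact; don't discriminate on the data yet.
    event["_received"] = datetime.datetime.now(datetime.timezone.utc).isoformat()
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

record_event({"type": "page_view", "page": "/pricing", "user": "u123"})
record_event({"type": "signup", "plan": "trial", "user": "u123"})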

Start off by acquiring a simple cloud-based software platform, and store as much data as you possibly can. Don’t discriminate on the data you store at this stage – you don’t know what might come in handy further down the line. There might be certain metrics that are totally useless, but you won’t find out until the point of analysis. A simple rule of thumb is this: if you’re not sure ...


Read More on Datafloq
