Benefits of Taking Lean Six Sigma Green Belt Certification


Lean Six Sigma focuses on the areas where an organization fails to deliver the required products and services. For example, in the manufacturing industry, Lean Six Sigma aims to minimize the number of defective products a company produces. Along the same lines, the Lean Six Sigma Green Belt certification is a credential used for quality improvement in any organization, whether a manufacturing firm or a process management organization.
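
To make the idea of minimizing defects concrete, here is a small illustrative sketch (with made-up numbers) of the standard DPMO (defects per million opportunities) metric that Lean Six Sigma projects commonly track; it only shows the arithmetic and is not part of any certification material.

```python
# Illustrative only: DPMO (defects per million opportunities), the standard
# defect metric tracked in Lean Six Sigma projects. All figures are made up.
defects = 15          # defective units found in a sample
units = 5000          # units produced
opportunities = 4     # defect opportunities per unit

dpmo = defects / (units * opportunities) * 1_000_000
print(f"DPMO = {dpmo:.0f}")   # 750 here; "Six Sigma" quality corresponds to about 3.4 DPMO
```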

Many organizations are interested in Green Belt or Black Belt certification but are still unaware of its benefits. Here are some of the advantages you will experience after becoming Green Belt certified:

  • Strategic Benefits
  • People Development Benefits
  • Financial Benefits
  • Customer Benefits
  • Competitive Benefits

Strategic Benefits 

A Lean Six Sigma certified professional specializes in handling projects of strategic importance. Any organization with a Green Belt certified professional can enjoy this strategic benefit by solving major business problems. The certified professional can solve complicated problems in a shorter period of time. Beyond solving problems, they can find the root cause and prevent it from recurring.

People Development Benefits

As a Green Belt certified professional, you will experience numerous benefits provided by your organization. Apart from tools and methodologies, Lean Six Sigma Green Belt training enhances personal skills such as speaking with confidence, solving problems successfully, and suggesting new ideas.

Financial Benefits 

A Lean Six Sigma Green Belt certified professional gives the organization confidence that the project will be completed successfully, with potential savings of around $60k per project. Typically, the savings per project reflect what most organizations would otherwise need to spend to complete the project on time and achieve a positive outcome.

Customer Benefits

Certified Green Belt professionals help deliver better quality products and services, which leads to higher customer satisfaction. Nowadays, many customers look for organizations that utilize Lean Six Sigma. Green Belt professionals increase an organization's credibility, help attract customers, and assure them that you offer the best services or products in the industry.

Competitive Benefits

You can use Lean Six Sigma in sales pitches and marketing campaigns. By implementing Lean Six Sigma and earning a Green Belt certification, you'll be able to improve your organization's performance and keep it ahead of competitors. Because everything in Lean Six Sigma is based on gathered data and analysis, you will be more confident in making decisions.

If you are a professional looking to take advantage of being Lean Six Sigma Green Belt certified, numerous training bodies offer LSSGB training. MSys Training is one of the world's leading training providers, offering LSSGB training led by industry experts. Register for their training session and keep yourself one step ahead of your colleagues.


Importance of Release Plan in Agile


Agile practitioners utilize different agile methods for release management in a project. The major idea behind agile project management is to achieve customer satisfaction through project delivery. The agile release plan is not complete until the project is successfully delivered to the customer. Agile teams review and update the release plan at specific intervals, at the end of each iteration, with the customer's consent.

Definition of Release Plan

A release plan can be defined as a roadmap showing how an agile team plans to achieve the project goal: how the project requirements and implementation plans are integrated into the project data sheet, and how the overall goal of the project is achieved while keeping the customer's consent in view.

Importance of a Release Plan

The release plan is a very important element for product owners, as it enables them to communicate with the project's stakeholders and demonstrate the potential of a given project. The release plan helps agile team members understand the expectations and the plan the team needs to achieve, and it helps them work according to that plan. In short, the release plan acts as a guidepost that gives the project team a definite direction.

Steps to Design a Release Plan

Initially, the product owner needs to demonstrate the scope, expectations, quality, and other elements to customers. After this initial step, the product owner needs to estimate the requirements to meet those expectations, then prioritize the project needs and decide a release date. The steps for planning a release include the following (a rough sketch of the underlying arithmetic follows the list):

  1. Defining conditions of satisfaction
  2. Estimating user stories
  3. Choosing an iteration length
  4. Velocity estimation
  5. Prioritizing user stories
  6. Selecting stories and deciding a release date
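
As a rough illustration of how steps 2 through 6 combine, here is a minimal Python sketch assuming made-up story-point estimates, a two-week iteration length, and an estimated velocity; it only shows the arithmetic that links scope, velocity, and iteration length to a candidate release date, not a prescribed agile tool.

```python
from datetime import date, timedelta

# Assumed inputs: estimated, prioritized user stories (in story points),
# a chosen iteration length, and the team's estimated velocity.
story_points = [5, 3, 8, 2, 13, 5, 8]
iteration_length_days = 14      # two-week iterations
estimated_velocity = 12         # story points per iteration

total_points = sum(story_points)
iterations_needed = -(-total_points // estimated_velocity)   # ceiling division

project_start = date(2017, 8, 1)                              # assumed start date
release_date = project_start + timedelta(days=iterations_needed * iteration_length_days)

print(f"Total scope: {total_points} story points")
print(f"Iterations needed: {iterations_needed}")
print(f"Candidate release date: {release_date}")
```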

End of Release Plan

The release plan ends with finalizing the user requirements and allotting them to specific iterations. The first three iterations are well defined, with complete requirements and iteration lengths, while the remaining iterations are kept relatively loosely defined. This practice makes the release plan flexible and allows it to adapt to critical situations.

To get more information about Agile and Scrum planning and processes, you can join MSys Agile and Scrum online training. With MSys Training, you can also consider attending Agile and Scrum certification training in your city in the USA.


Top 7 Reasons to Take PMP Certification


The PMP credential is a globally accepted and acknowledged professional certification for project management that validates a professional's experience and knowledge of project management. The credential is offered by the PMI (Project Management Institute) to professionals who pass the PMP examination hosted by PMI. There are numerous benefits to taking the PMP certification; here are a few of them that illustrate why one should consider it:

1.     Globally Acknowledged Certification

The PMP credential is the most reputed project management certification around the world, with more than 450,000 certified PMP professionals. This credential helps professionals highlight their expertise and skills to global organizations.

2.     Hike in the Salary

Certified PMI-PMP professionals witness a dramatic hike in their salary. According to the Project Management Salary Survey, PMP certified project managers earn 25% more than their non-certified colleagues. Further, the Project Management Institute states that 71% of project managers witnessed an increase in compensation. PMP certified professionals earn a median salary of $110,000 per year.

3.     Expands Market Reach and Scope

Project Management Professional (PMI-PMP) certification holders have built a close-knit global community where they share insights and tips and pool their experience. As a certified PMP, you get access to the major project management communities, which helps you stay updated with the latest trends and developments in the industry.

4.     Better Job Opportunities

According to a recent survey, more than 80% of high-performing projects are led by PMP credentialed project managers, and organizations in which more than 35% of project managers are certified deliver much better project performance. Hence, a PMI-PMP certification offers better career paths and greater job opportunities to professionals in the project management world.

5.     Most Challenging Projects

A PMP certification helps professionals work on more challenging and important projects, since it shows their dedication to project management and validates the experience and knowledge required to handle such projects. The PMP certification demonstrates a professional's skills and experience in leading and directing projects, as it requires both education and experience.

6.     Greater Visibility to Global Employers

The PMP credential is a standard that showcases one's expertise in handling projects effectively, and the certification catches a recruiter's eye immediately during profile evaluation. Research suggests that employers prefer hiring project managers with a PMP certification over those without one.

7.     Utility across Industry

The PMP credential is among the best certifications for project managers across industries, including IT, business processing, telecom, finance, commerce, research, and others.

MSys Training, one of the world's leading training providers, offers a 4-day intensive PMP exam prep training with a 100% money-back assurance of passing the PMP certification exam on the first attempt. If you are looking for career growth and better opportunities, simply drop us a query at support@msystraining.com or call +1-408 878 3078.


Making the Most out of the Webinar


Nowadays, we are all so busy with our professional and personal lives that we don't get time to learn the new tools and technologies launched in the market. But we do have the option of attending webinars on the latest trends to stay updated. MSys Training, one of the leading training providers, organizes webinars on various topics, including project management, Lean Six Sigma, and ITIL, that can help you enhance your managerial skills. Here are a few more reasons why you should consider attending webinars:

Easy to Register and Attend

Let's say you are looking to learn something new. Learning a new technique on your own is harder than learning from an expert, and it is the cherry on the cake if you get to learn without paying anything or making any prior commitment. A webinar is a treat for those who are seeking to learn new things every day. Registration for a webinar is very easy and hardly takes a few seconds to fill out the required information. You can reserve a seat without any commitment to attend. You simply need to log in to the portal and start listening to the webinar from anywhere. The best thing is that you don't have to sit through an awkward seminar that you cannot quit in the middle of the session. With the ability to get new and updated information from experts at no cost, attending a webinar is a worthwhile idea.

Great Source to Build a Network

From a commercial point of view, speakers don't gain much from sharing their time and knowledge with you; the attendees, however, have a lot to gain from these webinars. Speakers invest their time to hand out important information relevant to the topic they're speaking on. As an attendee, it is important to extract as much useful information as possible from this experience. Many webinars allow you to ask questions related to the topic, so if you liked the session and enjoyed the presentation, ask questions if you have any.

You can also reach out to the speaker on their social accounts, let them know about your experience, and offer some meaningful feedback; they will then be more likely to get back in touch with you.

Break from Your Daily Life

Attending a webinar is one of the best ways for professionals to take a break from their daily schedule. Many people think a webinar is a time-consuming activity and therefore prefer not to attend. But the knowledge and information you gain from webinars is worth that time. Take some time out of your schedule to attend a webinar; it will help you refresh, and you will return to work with new-found excitement.

So, if you are thinking of attending a webinar in the coming days, MSys webinars are highly recommended. They are conducted by leading experts in the industry.


Major Highlights from PMBOK® Guide Sixth Edition


We all know that the Project Management Body of Knowledge (PMBOK® Guide) Sixth Edition will be launched this July. From the new PMBOK® Guide, we can expect considerable changes in every aspect, whether it's the exam, training courses, or anything else related to PMI.

PMI defines the role of the project manager in every industry and helps us identify the knowledge areas we need to perform our jobs. For new project managers, the Sixth Edition validates the requirement for the Talent Triangle, which means project managers must have knowledge in technical project management, leadership, and strategic and business management. In this edition of the Project Management Body of Knowledge, the inclusion of Agile practices is the major addition and will have a great impact on PMI certification courses.

Below are the highlighted changes that you can witness in the 6th edition of Project Management Body of Knowledge (PMBOK® Guide):

  • The first three chapters of the guide have been completely rewritten to highlight the importance of project management's role in business value creation and organizational change.
  • Agile concepts are incorporated into all 10 knowledge areas. Due to these changes, project managers need to gain knowledge from the Agile Practice Guide along with the PMBOK® Guide.
  • All 10 knowledge areas will focus on new topics such as Key Concepts, Trends and Emerging Practices, Tailoring Considerations, and Approaches in Agile, Iterative, and Adaptive Environments.
  • In the new edition, the role of the project manager has been expanded to that of a leader, business expert, and strategic thinker.
  • A new business case has been included in project initiation, along with the requirement for a benefits management plan.
  • The 'Project Human Resource Management' knowledge area has been renamed 'Project Resource Management' so that it covers both physical and human resources.
  • 'Project Time Management' has been renamed 'Project Schedule Management'.
  • Three new processes have been added: Implement Risk Responses, Manage Project Knowledge, and Control Resources.
  • Close Procurements has been removed from the guide because studies suggest that project managers usually don't close contracts.

Although the Guide to the Project Management Body of Knowledge (PMBOK® Guide) Sixth Edition is launching in the third quarter of 2017, PMI will permit current PMP® candidates to continue taking an exam based on the current version, the PMBOK® Guide Fifth Edition, for up to half a year. This means the PMP® exam based on the PMBOK® Guide Sixth Edition will be conducted from the first quarter of 2018.

MSys Training, one of the world's leading training providers, offers long-term and short-term training courses that help individuals update and improve their skills and achieve their career objectives.


How IoT is Changing the World: Cases from Visa, Airbus, Bosch & SNCF



The Internet of Things (IoT) is changing our world. This may seem like a bold statement, but consider the impact this revolutionary technology has already had on communications, education, manufacturing, science, business, and many other fields of life. Clearly, the IoT is moving really fast from concept to reality and transforming how industries operate and create value. 

As the IoT creeps towards mass adoption, IT giants experiment and innovate with the technology to explore new opportunities and create new revenue streams. I was invited to Genius of Things Summit as a Futurist by Watson IoT and WIRED Insider and attended the long-awaited grand opening of IBM’s headquarters for Watson Internet of Things in Munich. The two-day event provided me an insight into what IBM’s doing to constantly push the boundaries of what’s possible with the IoT.

In this article, I have discussed the major developments that caught my interest and that, in my opinion, will impact and improve customer experience substantially. 

IoT capabilities become an integral part of our lifestyle

According to IBM, the number of connected devices is expected to rise to as high as 30 billion in the next three years. This increasingly connected culture presents businesses with an opportunity to harness digital ...


Read More on Datafloq

Sumo Logic CEO on how modern apps benefit from ‘continuous intelligence’ and DevOps insights

The next BriefingsDirect applications health monitoring interview explores how a new breed of continuous intelligence emerges by gaining data from systems infrastructure logs -- either on-premises or in the cloud -- and then cross-referencing that with intrinsic business metrics information.

We’ll now explore how these new levels of insight and intelligence into what really goes on underneath the covers of modern applications help ensure that apps are built, deployed, and operated properly.

Today, more than ever, how a company's applications perform equates with how the company itself performs and is perceived. From airlines to retail, from finding cabs to gaming, how the applications work deeply impacts how the business processes and business outcomes work, too.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or  download a copy.

We’re joined by an executive from Sumo Logic to learn why modern applications are different, what's needed to make them robust and agile, and how the right mix of data, metrics and machine learning provides the means to make and keep apps operating better than ever.

To describe how to build and maintain the best applications, welcome Ramin Sayar, President and CEO of Sumo Logic. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: There’s no doubt that the apps make the company, but what is it about modern applications that makes them so difficult to really know? How is that different from the applications we were using 10 years ago?

Sayar: You hit it on the head a little bit earlier. This notion of always-on, always-available, always-accessible types of applications, either delivered through rich web mobile interfaces or through traditional mechanisms that are served up through laptops or other access points and point-of-sale systems are driving a next wave of technology architecture supporting these apps.

These modern apps are around a modern stack, and so they’re using new platform services that are created by public-cloud providers, they’re using new development processes such as agile or continuous delivery, and they’re expected to constantly be learning and iterating so they can improve not only the user experience -- but the business outcomes.

Gardner: Of course, developers and business leaders are under pressure, more than ever before, to put new apps out more quickly, and to then update and refine them on a continuous basis. So this is a never-ending process.

User experience

Sayar: You’re spot on. The obvious benefits around always on is centered on the rich user interaction and user experience. So, while a lot of the conversation around modern apps tends to focus on the technology and the components, there are actually fundamental challenges in the process of how these new apps are also built and managed on an ongoing basis, and what implications that has for security. A lot of times, those two aspects are left out when people are discussing modern apps.

Gardner: That's right. We’re now talking so much about DevOps these days, but in the same breath, we’re taking about SecOps -- security and operations. They’re really joined at the hip.

Sayar: Yes, they’re starting to blend. You’re seeing the technology decisions around public cloud, around Docker and containers, and microservices and APIs, and not only led by developers or DevOps teams. They’re heavily influenced and partnering with the SecOps and security teams and CISOs, because the data is distributed. Now there needs to be better visibility instrumentation, not just for the access logs, but for the business process and holistic view of the service and service-level agreements (SLAs).

Gardner: What’s different from say 10 years ago? Distributed used to mean that I had, under my own data-center roof, an application that would be drawing from a database, using an application server, perhaps a couple of services, but mostly all under my control. Now, it’s much more complex, with many more moving parts.

Sayar: We like to look at the evolution of these modern apps. For example, a lot of our customers have traditional monolithic apps that follow the more traditional waterfall approach for iterating and release. Often, those are run on bare-metal physical servers, or possibly virtual machines (VMs). They are simple, three-tier web apps.
We see one of two things happening. The first is that there is a need for either replacing the front end of those apps, and we refer to those as brownfield. They start to change from waterfall to agile and they start to have more of an N-tier feel. It's really more around the front end. Maybe your web properties are a good example of that. And they start to componentize pieces of their apps, either on VMs or in private clouds, and that's often good for existing types of workloads.

The other big trend is this new way of building apps, what we call greenfield workloads, versus the brownfield workloads, and those take a fundamentally different approach.

Often it's centered on new technology, a stack entirely using microservices, API-first development methodology, and using new modern containers like Docker, Mesosphere, CoreOS, and using public-cloud infrastructure and services from Amazon Web Services (AWS), or Microsoft Azure. As a result, what you’re seeing is the technology decisions that are made there require different skill sets and teams to come together to be able to deliver on the DevOps and SecOps processes that we just mentioned.

Gardner: Ramin, it’s important to point out that we’re not just talking about public-facing business-to-consumer (B2C) apps, not that those aren't important, but we’re also talking about all those very important business-to-business (B2B) and business-to-employee (B2E) apps. I can't tell you how frustrating it is when you get on the phone with somebody and they say, “Well, I’ll help you, but my app is down,” or the data isn’t available. So this is not just for the public facing apps, it's all apps, right?

It's a data problem

Sayar: Absolutely. Regardless of whether it's enterprise or consumer, if it's mid-market small and medium business (SMB) or enterprise that you are building these apps for, what we see from our customers is that they all have a similar challenge, and they’re really trying to deal with the volume, the velocity, and the variety of the data around these new architectures and how they grapple and get their hands around it. At the end of day, it becomes a data problem, not just a process or technology problem.

Gardner: Let's talk about the challenges then. If we have many moving parts, if we need to do things faster, if we need to consider the development lifecycle and processes as well as ongoing security, if we’re dealing with outside third-party cloud providers, where do we go to find the common thread of insight, even though we have more complexity across more organizational boundaries?

Sayar: From a Sumo Logic perspective, we’re trying to provide full-stack visibility, not only from code and your repositories like GitHub or Jenkins, but all the way through the components of your code, to API calls, to what your deployment tools are used for in terms of provisioning and performance.

We spend a lot of effort to integrate to the various DevOps tool chain vendors, as well as provide the holistic view of what users are doing in terms of access to those applications and services. We know who has checked in which code or which branch and which build created potential issues for the performance, latency, or outage. So we give you that 360-view by providing that full stack set of capabilities.

Gardner: So, the more information the better, no matter where in the process, no matter where in the lifecycle. But then, that adds its own level of complexity. I wonder is this a fire-hose approach or boiling-the-ocean approach? How do you make that manageable and then actionable?

Sayar: We’ve invested quite a bit of our intellectual property (IP) on not only providing integration with these various sources of data, but also a lot in the machine learning  and algorithms, so that we can take advantage of the architecture of being a true cloud native multitenant fast and simple solution.

So, unlike others that are out there and available for you, Sumo Logic's architecture is truly cloud native and multitenant, but it's centered on the principle of near real-time data streaming.

As the data is coming in, our data-streaming engine allows developers, IT ops administrators, sysadmins, and security professionals to have their own view, coarse-grained or fine-grained, through the role-based access controls we have in the system, and to leverage the same data for different purposes, versus having to wait for someone to create a dashboard, create a view, or be able to get access to a system when something breaks.

Gardner: That’s interesting. Having been in the industry long enough, I remember when logs basically meant batch. You'd get a log dump, and then you would do something with it. That would generate a report, many times with manual steps involved. So what's the big step to going to streaming? Why is that an essential part of making this so actionable?

Sayar: It’s driven based on the architectures and the applications. No longer is it acceptable to look at samples of data that span 5 or 15 minutes. You need the real-time data, sub-second, millisecond latency to be able to understand causality, and be able to understand when you’re having a potential threat, risk, or security concern, versus code-quality issues that are causing potential performance outages and therefore business impact.

The old way was to hope and pray, when I deployed code, that I wouldn't only find out something was wrong when a user complained. That is no longer acceptable. You lose business and credibility, and at the end of the day, there's no real way to hold developers, operations folks, or security folks accountable because of the legacy tools and process approach.

Center of the business

Those expectations have changed, because of the consumerization of IT and the fact that apps are the center of the business, as we’ve talked about. What we really do is provide a simple way for us to analyze the metadata coming in and provide very simple access through APIs or through our user interfaces based on your role to be able to address issues proactively.

Conceptually, there’s this notion of wartime and peacetime as we’re building and delivering our service. We look at the problems that users -- customers of Sumo Logic and internally here at Sumo Logic -- are used to and then we break that down into this lifecycle -- centered on this concept of peacetime and wartime.

Peacetime is when nothing is wrong, but you want to stay ahead of issues and you want to be able to proactively assess the health of your service, your application, your operational level agreements, your SLAs, and be notified when something is trending the wrong way.

Then, there's this notion of wartime, and wartime is all hands on deck. Instead of being alerted 15 minutes or an hour after an outage has happened or security risk and threat implication has been discovered, the real-time data-streaming engine is notifying people instantly, and you're getting PagerDuty alerts, you're getting Slack notifications. It's no longer the traditional helpdesk notification process when people are getting on bridge lines.

Because the teams are often distributed and it’s shared responsibility and ownership for identifying an issue in wartime, we're enabling collaboration and new ways of collaboration by leveraging the integrations to things like Slack, PagerDuty notification systems through the real-time platform we've built.

So, the always-on application expectations that customers and consumers have, have now been transformed to always-on available development and security resources to be able to address problems proactively.

Gardner: It sounds like we're able to not only take the data and information in real time from the applications to understand what’s going on with the applications, but we can take that same information and start applying it to other business metrics, other business environmental impacts that then give us an even greater insight into how to manage the business and the processes. Am I overstating that or is that where we are heading here?

Sayar: That’s exactly right. The essence of what we provide in terms of the service is a platform that leverages the machine logs and time-series data from a single platform or service that eliminates a lot of the complexity that exists in traditional processes and tools. No longer do you need to do “swivel-chair” correlation, because we're looking at multiple UIs and tools and products. No longer do you have to wait for the helpdesk person to notify you. We're trying to provide that instant knowledge and collaboration through the real-time data-streaming platform we've built to bring teams together versus divided.

Gardner: That sounds terrific if I'm the IT guy or gal, but why should this be of interest to somebody higher up in the organization, at a business process, even at a C-table level? What is it about continuous intelligence that cannot only help apps run on time and well, but help my business run on time and well?

Need for agility

Sayar: We talked a little bit about the whole need for agility. From a business point of view, the line-of-business folks who are associated with any of these greenfield projects or apps want to be able to increase the cycle times of the application delivery. They want to have measurable results in terms of application changes or web changes, so that their web properties have either increased or potentially decreased in terms of user satisfaction or, at the end of the day, business revenue.

So, we're able to help the developers, the DevOps teams, and ultimately, line of business deliver on the speed and agility needs for these new modes. We do that through a single comprehensive platform, as I mentioned.

At the same time, what’s interesting here is that no longer is security an afterthought. No longer is security in the back room trying to figure out when a threat or an attack has happened. Security has a seat at the table in a lot of boardrooms, and more importantly, in a lot of strategic initiatives for enterprise companies today.

At the same time we're helping with agility, we're also helping with prevention. And so a lot of our customers often start with the security teams that are looking for a new way to be able to inspect this volume of data that’s coming in -- not at the infrastructure level or only the end-user level -- but at the application and code level. What we're really able to do, as I mentioned earlier, is provide a unifying approach to bring these disparate teams together.
Gardner: And yet individuals can extract the intelligence view that best suits what their needs are in that moment.

Sayar: Yes. And ultimately what we're able to do is improve customer experience, increase revenue-generating services, increase efficiencies and agility of actually delivering code that’s quality and therefore the applications, and lastly, improve collaboration and communication.

Gardner: I’d really like to hear some real world examples of how this works, but before we go there, I’m still interested in the how. As to this idea of machine learning, we're hearing an awful lot today about bots, artificial intelligence (AI), and machine learning. Parse this out a bit for me. What is it that you're using machine learning  for when it comes to this volume and variety in understanding apps and making that useable in the context of a business metric of some kind?

Sayar: This is an interesting topic, because of a lot of noise in the market around big data or machine learning and advanced analytics. Since Sumo Logic was started six years ago, we built this platform to ensure that not only we have the best in class security and encryption capabilities, but it was centered on the fundamental purpose around democratizing analytics, making it simpler to be able to allow more than just a subset of folks get access to information for their roles and responsibilities, whether you're security, ops, or development teams.

To answer your question a little bit more succinctly, our platform is predicated on multiple levels of machine learning and analytics capabilities. Starting at the lowest level, something that we refer to as LogReduce is meant to separate the signal from the noise. Ultimately, it helps a lot of our users and customers reduce mean time to identification by upwards of 90 percent, because they're not searching the irrelevant data. They're searching the relevant data, oftentimes the patterns that are infrequent or not really known, versus what's constantly occurring in their environment.

In doing so, it’s not just about mean time to identification, but it’s also how quickly we're able to respond and repair. We've seen customers using LogReduce reduce the mean time to resolution by upwards of 50 percent.
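
To make the signal-versus-noise idea concrete, here is a toy sketch of the general technique of collapsing similar log lines into patterns so that rare ones stand out. It is only an illustration of the concept, not Sumo Logic's actual LogReduce implementation, and the log lines are invented.

```python
import re
from collections import Counter

# Invented log lines; the last one is the rare event worth investigating.
logs = [
    "user 1041 logged in from 10.0.0.12",
    "user 2203 logged in from 10.0.4.77",
    "user 1041 logged in from 10.0.0.12",
    "payment gateway timeout after 30000 ms",
]

def signature(line: str) -> str:
    # Mask variable tokens (numbers, IPs) so similar lines share one pattern.
    return re.sub(r"\b\d+(\.\d+)*\b", "<num>", line)

counts = Counter(signature(line) for line in logs)
for pattern, count in counts.most_common():
    print(f"{count:3d}  {pattern}")
# Patterns with a count of 1 surface the unusual "signal" hidden in the noise.
```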

Predictive capabilities

Our core analytics, at the lowest level, is helping solve operational metrics and value. Then, we start to become less reactive. When you've had an outage or a security threat, you start to leverage some of our other predictive capabilities in our stack.

For example, I mentioned this concept of peacetime and wartime. In the notion of peacetime, you're looking at changes over time when you've deployed code and/or applications to various geographies and locations. A lot of times, developers and ops folks that use Sumo want to use log compare or outlier predictor operators that are in their machine learning capabilities to show and compare differences of branches of code and quality of their code to relevancy around performance and availability of the service and app.

We allow them, with a click of a button, to compare this window for these events and these metrics for the last hour, last day, last week, last month, and compare them to other time slices of data and show how much better or worse it is. This is before deploying to production. When they look at production, we're able to allow them to use predictive analytics to look at anomalies and abnormal behavior to get more proactive.
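
As a rough, hypothetical sketch of the window-comparison idea described above (not Sumo Logic's actual operators), the snippet below compares a recent slice of a latency metric against a baseline window and flags values that look like outliers; all numbers are assumed.

```python
import statistics

baseline = [120, 118, 125, 130, 122, 119, 127, 124]   # ms, e.g. last week's window (assumed)
current = [121, 126, 310, 123]                         # ms, e.g. the last hour (assumed)

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

for value in current:
    z = (value - mean) / stdev                         # how unusual is this value?
    status = "OUTLIER" if abs(z) > 3 else "ok"
    print(f"{value:4d} ms  z={z:6.2f}  {status}")
```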

So, reactive, to proactive, all the way to predictive is the philosophy that we've been trying to build in terms of our analytics stack and capabilities.

Gardner: How are some actual customers using this and what are they getting back for their investment?

Sayar: We have customers that span retail and e-commerce, high-tech, media, entertainment, travel, and insurance. We're well north of 1,200 unique paying customers, and they span anyone from Airbnb, Anheuser-Busch, Adobe, Metadata, Marriott, Twitter, Telstra, Xora -- modern companies as well as traditional companies.

What do they all have in common? Often, what we see is a digital transformation project or initiative. They either have to build greenfield or brownfield apps and they need a new approach and a new service, and that's where they start leveraging Sumo Logic.

Second, what we see is that's it’s not always a digital transformation; it's often a cost reduction and/or a consolidation project. Consolidation could be tools or infrastructure and data center, or it could be migration to co-los or public-cloud infrastructures.

The nice thing about Sumo Logic is that we can connect anything from your top of rack switch, to your discrete storage arrays, to network devices, to operating system, and middleware, through to your content-delivery network (CDN) providers and your public-cloud infrastructures.

As it’s a migration or consolidation project, we’re able to help them compare performance and availability, SLAs that they have associated with those, as well as differences in terms of delivery of infrastructure services to the developers or users.

So whether it's agility-driven or cost-driven, Sumo Logic is very relevant for all these customers that are spanning the data-center infrastructure consolidation to new workload projects that they may be building in private-cloud or public-cloud endpoints.

Gardner: Ramin, how about a couple of concrete examples of what you were just referring to.

Cloud migration

Sayar: One good example is in the media space or media and entertainment space, for example, Hearst Media. They, like a lot of our other customers, were undergoing a digital-transformation project and a cloud-migration project. They were moving about 36 apps to AWS and they needed a single platform that provided machine-learning analytics to be able to recognize and quickly identify performance issues prior to making the migration and updates to any of the apps rolling over to AWS. They were able to really improve cycle times, as well as efficiency, with respect to identifying and resolving issues fast.

Another example would be JetBlue. We do a lot in the travel space. JetBlue is also another AWS and cloud customer. They provide a lot of in-flight entertainment to their customers. They wanted to be able to look at the service quality for the revenue model for the in-flight entertainment system and be able to ascertain what movies are being watched, what’s the quality of service, whether that’s being degraded or having to charge customers more than once for any type of service outages. That’s how they're using Sumo Logic to better assess and manage customer experience. It's not too dissimilar from Alaska Airlines or others that are also providing in-flight notification and wireless type of services.

The last one is someone that we're all pretty familiar with and that’s Airbnb. We're seeing a fundamental disruption in the travel space and how we reserve hotels or apartments or homes, and Airbnb has led the charge, like Uber in the transportation space. In their case, they're taking a lot of credit-card and payment-processing information. They're using Sumo Logic for payment-card industry (PCI) audit and security, as well as operational visibility in terms of their websites and presence.

Gardner: It’s interesting. Not only are you giving them benefits along insight lines, but it sounds to me like you're giving them a green light to go ahead and experiment and then learn very quickly whether that experiment worked or not, so that they can find refine. That’s so important in our digital business and agility drive these days.

Sayar: Absolutely. And if I were to think of another interesting example, Anheuser-Busch is another one of our customers. In this case, the CISO wanted to have a new approach to security and not one that was centered on guarding the data and access to the data, but providing a single platform for all constituents within Anheuser-Busch, whether security teams, operations teams, developers, or support teams.

We did a pilot for them, and as they're modernizing a lot of their apps, as they start to look at the next generation of security analytics, the adoption of Sumo started to become instant inside AB InBev. Now, they're looking at not just their existing real estate of infrastructure and apps for all these teams, but they're going to connect it to future projects such as the Connected Path, so they can understand what the yield is from each pour in a particular keg in a location and figure out whether that’s optimized or when they can replace the keg.

So, you're going from a reactive approach for security and processes around deployment and operations to next-gen connected Internet of Things (IoT) and devices to understand business performance and yield. That's a great example of an innovative company doing something unique and different with Sumo Logic.

Gardner: So, what happens as these companies modernize and they start to avail themselves of more public-cloud infrastructure services, ultimately more-and-more of their apps are going to be of, by, and for somebody else’s public cloud? Where do you fit in that scenario?

Data source and location

Sayar: Whether you're running on-prem, in co-los, through CDN providers like Akamai, on AWS or Azure or Heroku, or on SaaS platforms, we provide a single platform that can manage and ingest all that data for you. Interestingly enough, about half our customers' workloads run on-premises and half of them run in the cloud.

We’re agnostic to where the data is or where their applications or workloads reside. The benefit we provide is the single ubiquitous platform for managing the data streams that are coming in from devices, from applications, from infrastructure, from mobile to you, in a simple, real-time way through a multitenant cloud service.

Gardner: This reminds me of what I heard, 10 or 15 years ago about business intelligence (BI), drawing data, analyzing it, making it close to being proactive in its ability to help the organization. How is continuous intelligence different, or even better, and something that would replace what we refer to as BI?

Sayar: The issue that we faced with the first generation of BI was it was very rear-view and mirror-centric, meaning that it was looking at data and things in the past. Where we're at today with this need for speed and the necessity to be always on, always available, the expectation is that it’s sub-millisecond latency to understand what's going on, from a security, operational, or user-experience point of view.

I'd say that we're on V2 or next generation of what was traditionally called BI, and we refer to that as continuous intelligence, because you're continuously adapting and learning. It's not only based on what humans know and what rules and correlation that they try to presuppose and create alarms and filters and things around that. It’s what machines and machine intelligence needs to supplement that with to provide the best-in-class type of capability, which is what we refer to as continuous intelligence.

Gardner: We’re almost out of time, but I wanted to look to the future a little bit. Obviously, there's a lot of investing going on now around big data and analytics as it pertains to many different elements of many different businesses, depending on their verticals. Then, we're talking about some of the logic benefit and continuous intelligence as it applies to applications and their lifecycle.

Where do we start to see crossover between those? How do I leverage what I’m doing in big data generally in my organization and more specifically, what I can do with continuous intelligence from my systems, from my applications?

Business Insights

Sayar: We touched a little bit on that in terms of the types of data that we integrate and ingest. At the end of the day, when we talk about full-stack visibility, it's from everything with respect to providing business insights to operational insights, to security insights.

We have some customers that are in credit-card payment processing, and they actually use us to understand activations for credit cards, so they're extracting value from the data coming into Sumo Logic to understand and predict business impact and relevant revenue associated with these services that they're managing; in this case, a set of apps that run on a CDN.
At the same time, the fraud and risk team are using us for threat and prevention. The operations team is using us for understanding identification of issues proactively to be able to address any application or infrastructure issues, and that’s what we refer to as full stack.

Full stack isn’t just the technology; it's providing business visibility insights to line the business users or users that are looking at metrics around user experience and service quality, to operational-level impacts that help you become more proactive, or in some cases, reactive to wartime issues, as we've talked about. And lastly, the security team helps you take a different security posture around reactive and proactive, around threat, detection, and risk.

In a nutshell, where we see these things starting to converge is what we refer to as full stack visibility around our strategy for continuous intelligence, and that is technology to business to users.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or  download a copy. Sponsor: Sumo Logic.


How Unlimited Mobile Data and 5G Will Make VR the New Normal

This is the companion piece to my last article on mobile data, The dark (and not-so-dark) side of mobile data.

The stage is set, we're just not quite there yet. When 5G internet becomes a reality, when mobile data is unlimited, and when Virtual Reality and Augmented Reality applications proliferate, it won't be that odd to see someone walking down the street with a headset on.

First, the mobile web and data. If a mobile web provider limits the amount of data you can use, it's undoubtedly a money-making scheme. The data itself is not a commodity. This is evidenced by a provider like T-Mobile offering two smartphone lines of unlimited data and including video streaming and mobile hotspots for no extra charge. To offer streaming data and hotspot data as add-ons is to imply that these data are fundamentally different from cellular data: that they're bonus data, a special kind of data the provider could potentially charge extra for.

But data is data. There’s no limit to it, and increasingly, providers are making unlimited data plans the standard. The difference between streaming video data and audio data, say, is that a packet of video data is much larger, because it includes both ...


Read More on Datafloq

How Artificial Intelligence Will Change the Face of eCommerce

In 2016, the artificial intelligence and machine learning hype cycle reached fever pitch. We’ve been here before; it seems as if revolutionary artificial intelligence has been on the cusp of reality for decades. It’s understandable that many in the eCommerce world are skeptical, particularly smaller eCommerce merchants who have to be careful where they invest resources.

Although the benefits of AI / ML are often overblown, it's clear that it will make — and is already making — a real difference. The contrast between this and other “AI changes everything” hype cycles is significant. The technology in the limited domains relevant to eCommerce has advanced enormously: we’re better at data analytics on huge scales, we’re better at natural language processing and pattern recognition, we’re better at building cheap and scalable infrastructure, and the market has matured: many companies offer products and services that use AI / ML to provide real-world benefits to retailers without requiring a PhD in computer science.

Artificial intelligence is the application of automated systems to decision making, the discovery of solutions, and the delivery of insights. That sounds great, but it doesn’t mean anything without practical applications, and, in 2017, there’s no shortage of eCommerce companies and solution ...


Read More on Datafloq

Why Big Data Desperately Needs Transparency

People have an innate suspicion of numbers.

We understand that the answer to life, the universe, and everything is too complex to be boiled down to the number 42 (according to Hitchhiker’s Guide to the Galaxy), but in the search to quantify our existence, we do allow our lives to be ruled by numbers. We count the calories in our food, we count the minutes on our daily commute, and we definitely count the number of emails in our inboxes. We use our experience to decide which numbers are good and which numbers are bad. If we didn’t manage our lives by numbers, we would be obese, late and overwhelmed.

However, there are increasing amounts of data in our lives where we are not certain of the origin. The failure of opinion polling has already been widely debated, and if such a “fine art” can get it wrong, who is to say that Big Data is any different? Isn’t polling Big Data by another name? We haven’t really got a clue how these opinion pollsters got their figures, and most corporate people are equally unsure where all their stats are conjured up from.

If Big Data is to have the impact that it ...


Read More on Datafloq

The Impact of Artificial Intelligence on the Accounting Profession

Artificial Intelligence, or AI, is defined as the capacity of machines and software to exhibit or imitate a sense of cognitive intelligence. The idea behind AI holds great potential and yet also raises many concerns. In fiction, scientific and otherwise, developing AI tools and applications capable of thinking, learning, and reacting like living beings doesn't end very well. In the real world, however, the concerns it generates are not that grave, but it does pose the threat of a revolutionary disruption similar to that of cloud accounting.

There is no need to deny the fact that AI can become an invaluable business enhancement tool in professions where significant training is a necessary prerequisite. Professions where technical precision and sound judgment are necessary, such as accounting, also have great scope for AI applications. In fact, a report from Deloitte states that in the very near future AI could help develop a whole new paradigm of services and products, creating a whole new market with huge investor profits.

Niche business areas like customer service, research and development, logistics, sales, and marketing also have great potential for AI applications. Although AI as a technology is still in its infancy, a report from the European Commission states that the global market for AI and AI applications grew from €700 million in 2013 to €27 billion by the end of 2015. 

Similar to the cloud accounting disruption in the industry a few years earlier, accountants and the accounting industry in general, will ...


Read More on Datafloq

The 8 Actors a Data Scientist Must Meet

If you are on the verge of joining the Analytics industry, your vision of an ideal day in a life of a practitioner could be someone who effortlessly dives into data to create that perfect beer-and-nappy Eureka insight moment (technically termed as market basket analysis) which shows the unlikely insight that beer and diapers get bought together during game nights; a solution that gets implemented immediately in the retail store design and in turn leads to a dramatic increase in revenue growth.
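
For readers new to the term, here is a tiny, made-up example of the market basket arithmetic behind the beer-and-diapers story: support, confidence, and lift for the rule {diapers} → {beer}.

```python
# Made-up transaction data for illustration only.
baskets = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"milk", "diapers"},
    {"beer", "chips"},
    {"milk", "bread"},
]

def support(itemset: set) -> float:
    # Fraction of baskets containing every item in the itemset.
    return sum(itemset <= basket for basket in baskets) / len(baskets)

support_both = support({"beer", "diapers"})
confidence = support_both / support({"diapers"})      # P(beer | diapers)
lift = confidence / support({"beer"})                  # > 1 means "more often than chance"

print(f"support={support_both:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```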

Truth is the journey from data to revenue growth is more like a well-crafted drama that unfolds across many different stages. There are various actors that need to enact their roles, successfully, on each of these make-shift stages, for the journey to be completed.

Intrigued? Let’s see how:

"However beautiful the strategy, you should occasionally look at the results” – Sir Winston Churchill

Role: Strategist

Script: Let’s say the business faces a profitability problem or an issue of increased credit risk. The strategist must work with the business to understand the context and convert that into an analytics problem which can be addressed by data. Data then becomes an enabler to understand patterns and help solve the problem.

"Sometimes we stare so long at ...


Read More on Datafloq

What is Federated SSO and How is it Different from SSO?

Federated SSO and SSO may look similar to many people. You cannot blame them, since users only see the surface of the process: they log in with their credentials and enjoy different applications or multiple systems without repeating the login step. It's a snap! But under the hood, the two techniques work differently. So, do you want to know how federated SSO is different from SSO? Are you perplexed about federated SSO, or is your organization struggling to choose between federated SSO and SSO? This article will offer you insight into federated SSO and show how SSO and federated SSO are on quite different pages. Please read on.

What is federated SSO?

To understand federated SSO, you first need to understand federation. Federation is a relationship maintained between organizations, in which users from each organization get access across the others' web properties. Federated SSO therefore provides the user with an authentication token that is trusted across the federated organizations, so the user does not need to create a separate account for every organization in the federation to access its web properties and applications.

Note: the use of SAML is common in federation protocols.
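
To make the token-trust idea concrete, here is a deliberately simplified sketch of a federated flow: an identity provider (IdP) at one organization issues a signed token, and a service provider (SP) at a partner organization validates it without asking the user to register again. Real federations use SAML or OpenID Connect rather than a hand-rolled HMAC token; the names and key below are purely illustrative assumptions.

```python
import hashlib
import hmac
import json
import time

# Assumed shared trust material, exchanged when the federation was established.
SHARED_TRUST_KEY = b"key-exchanged-at-federation-setup"

def idp_issue_token(user_id: str) -> str:
    """Organization A's IdP authenticates the user and issues a signed token."""
    claims = json.dumps({"sub": user_id, "iss": "idp.org-a.example", "exp": time.time() + 300})
    sig = hmac.new(SHARED_TRUST_KEY, claims.encode(), hashlib.sha256).hexdigest()
    return claims + "." + sig

def sp_validate_token(token: str) -> dict:
    """Organization B's SP trusts the token instead of running its own login."""
    claims, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_TRUST_KEY, claims.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("untrusted token")
    payload = json.loads(claims)
    if payload["exp"] < time.time():
        raise ValueError("token expired")
    return payload

print(sp_validate_token(idp_issue_token("alice@org-a.example")))
```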

How does Federated SSO work?

Let us start with ...


Read More on Datafloq

Why Big Data Struggles to Prove its Value in the World of Healthcare

Big data is quickly expanding to a number of industries, and healthcare is no exception. With the use of big data, all kinds of medical records and studies can be digitized and easily analyzed. However, using big data in the healthcare space comes with its own set of challenges that anyone involved in the industry should be aware of.

Unreliable Data

The U.S. government and other private organizations have poured billions of dollars into digitizing medical records, but so far the data has basically just stayed where it is. Next to nothing has been done to analyze and actually use that data, in large part because the data is incredibly difficult to use and interpret. Medical data is often stored in databases, which tend to not be easily compatible with each other. Some of the best and most useful information is often added to records as freeform notes, which can be hard to digitize and interpret. Medical records also pass through multiple people’s hands, from nurses to techs and doctors, before making it to the digital world, meaning it is relatively easy for errors or discrepancies to enter someone’s personal information. One of the biggest pushes for big data in healthcare is ...


Read More on Datafloq

Small Businesses are Quickly Adapting as Big Data is Becoming More User Friendly

While businesses of all sizes have challenges to overcome, small businesses often deal with growing pains. It can be hard to keep an operation profitable, especially when resources are limited. Most small-business managers want to concentrate on only the most vital aspects of their organization.

Though certain things in the world of business were once reserved for only the largest and most successful organizations, technological breakthroughs are changing this fact. Big data is becoming more user-friendly than ever.

This type of change has big implications for small businesses. While some people would assume that businesses would only analyze and consider data relevant to their size, this isn’t always the case. In fact, analyzing large scale data and information can help a small business grow effectively.

Why Analyzing Data is a Good Move for Businesses

Organizations of all sizes that have been operating long enough will experience their share of successes and failures, and both can prove valuable. Businesses can learn what works (and what doesn't) by analyzing data.

Everything from search engine results to sales records can help organizations find out what steps they should be taking next. For smaller organizations, this is especially valuable. These organizations can’t afford to ...


Read More on Datafloq
How to Write a Data Analysis Report

How to Write a Data Analysis Report

Let’s face it, data analysis reports, whether you’re writing them for universities or for big data, are intimidating. They’re also not a great deal of fun to write. I asked some people where they’d rank writing one and it came in just above going to the dentist. That’s not a good place to find yourself (and here I’m talking both about the list and the chair).

You know what the crazy thing is? They’re actually not that hard to write! Like so many things in life you just need to know where to start. For that reason, I thought I’d write you up a quick article so that the next time you at least know how to get it over with as quickly as possible. And then, maybe you’ll start to enjoy it more and it will become only as unpleasant as being woken by your neighbor drilling holes in the wall. We can only hope, right? 

There is no one right way

The first thing you’ve got to realize is that there is not yet one way to present your data. Admittedly, that’s unfortunate. It would probably be helpful for everybody if there was some standard way to do these things. That’s ...


Read More on Datafloq
The Pros And Cons Of Virtual Reality App Development

The Pros And Cons Of Virtual Reality App Development

Virtual reality is a technology that creates an artificial environment. With a few clicks and a headset, users can build a new world. The innovation is useful in education, medicine, filming, engineering, mechanics and other fields, and virtual reality app development is a revenue-generating opportunity. Most of the existing VR apps are in the gaming area, but some app developers have ventured into other fields. The benefits and drawbacks of VR apps are discussed below.

Pros

Training Tool

Virtual reality apps are used for training purposes in business, education, medicine, the armed forces, and construction. In the healthcare sector, VR apps are used particularly in the field of surgery; for instance, robotic surgery has been adopted in medical schools for teaching future surgeons. The benefits of this technology for training purposes are listed below.


VR apps provide realistic scenarios ideal for teaching.
They present little or no risk to the students and instructors.
These applications are safe to use and can be remotely controlled.
Virtual reality applications simplify complex situations during training.
They are innovative and ideal for different methods of teaching.
These apps are engaging and fun to use.


Economical

VR apps are cost-effective. They reduce the cost of creating different prototypes for training purposes. ...


Read More on Datafloq
Is Blockchain the Silver Bullet Needed by the IoT Industry?

Is Blockchain the Silver Bullet Needed by the IoT Industry?

It was only a matter of time before I ended up writing an article about the connection between the Internet of Things (IoT) and Blockchain, the technology (arguably still in the infancy of its development) that may have the greatest power to transform our world.

In a future planet interconnected not just by devices, but by the events taking place across it, with billions of devices talking to one another in real time, the Internet of Things will require a secure and efficient way to track all interactions, transactions, and activities of every “thing” in the network.

Blockchain’s role could be a coordination layer across devices and the enabler of the IoT to securely facilitate interactions and transactions between devices, and may also support certain processes related to architecture scalability, data sharing, and advancements in encryption and private key technology, enhanced security, and potentially even privacy.

With blockchain, addressing the Achilles' heel of an IoT world of heterogeneous OEM devices now becomes viable. I wonder, however, whether it is feasible for this decentralized IoT network to co-exist with IoT sub-networks or centralized cloud-based IoT models.

But let's face it, blockchain is still a nascent and controversial technology (experts estimate that it might take 5 -10 years for the ...


Read More on Datafloq
What I Always Wanted To Know About Big Data* (*but was afraid to ask)

What I Always Wanted To Know About Big Data* (*but was afraid to ask)

When I first heard the term Big Data a few years ago, I didn’t think much of it. Soon after, Big Data started appearing in many of my conversations with many of my tech friends. So I started asking a very simple question 'What is Big Data?'. I kept asking that question to various folks and I did not get the same answer twice from any number of people. ‘Oh, it’s a lot of data’. ‘It’s variety of data’. ‘It’s how fast the data is piling up’. Really? I thought to myself but was afraid to ask more questions. As none of it made much sense to me, I decided to dig into it myself. Obviously, my first stop was Google.

When I typed ‘Big Data’ at that time, this showed up.



Ahh, It all made sense right away. None of the people I was talking to really knew much about Big Data but were talking about it anyway as everyone else was talking about it.

In this series of articles on Big Data, my target audience is those people who come across the term but don't live and breathe Big Data on a daily basis in their regular jobs. ...


Read More on Datafloq
Hot Trends in AI Advancements you Should Know

Hot Trends in AI Advancements you Should Know

Artificial Intelligence is nothing new; the renewed interest in it, however, is. The biggest tech companies in the world are dedicating efforts to the field. Whether it's making smart devices, smart vehicles, voice assistants or robots, every company has its own take on AI. Facebook, Google, Microsoft, Samsung and more are all working on AI in some form.

The developments in AI have been rapid and the momentum is not poised to slow down anytime soon. When Google introduced Google Now and voice search, Apple had Siri and Microsoft introduced Cortana. Digital assistants allowed users to interact with devices through voice and kept conversational help always available. We could talk to our devices just like any other human; that is how AI chatbot technology changed the human-machine bond. That was AI going mainstream.

We had glimpses of robots learning quickly and adapting to human needs in recent movies like Her and Ex Machina. We even saw the pitfalls of them outsmarting their own creators, and theories that a war between robots and humans is not too far away.

There has even been a controversial conference on Love and Sex with Robots, held on 19-20 December in London, where as ...


Read More on Datafloq
Quantitative Risk Analysis – Purpose and Objectives | Simplilearn

Quantitative Risk Analysis – Purpose and Objectives | Simplilearn

PERFORM QUANTITATIVE RISK ANALYSIS Welcome to "Perform Quantitative Risk Analysis", lesson 8 of this course. After planning risk management and identifying risks, we need to carry out the analysis of each risk. There are two types of analysis, namely qualitative and quantitative. Qualitative analysis has already been discussed in t...Read More.
Qualitative Risk Analysis – Purpose and Objectives | Simplilearn

Qualitative Risk Analysis – Purpose and Objectives | Simplilearn

PERFORM QUALITATIVE RISK ANALYSIS Welcome to "Perform Qualitative Risk Analysis", lesson 7 of this course. After planning risk management and identifying risks, we need to carry out the analysis of each risk. There are two types of analysis, namely qualitative analysis and quantitative analysis. This lesson is about "...Read More.
Business Continuity and Disaster Recovery Architecture | Simplilearn

Business Continuity and Disaster Recovery Architecture | Simplilearn

BUSINESS CONTINUITY AND DISASTER RECOVERY PLANNING Introduction Hello and Welcome to Lesson 8 of CISSP Certification Course by SimpliLearn! This lesson is about Business Continuity and Disaster Recovery Planning. Business Continuity and Disaster Recovery Planning is one of the ten domains of the Common Body of Knowledge (CBK) for the CISSP exam. Th...Read More.
PMI-RMP – Plan Risk Management Methodologies | Simplilearn

PMI-RMP – Plan Risk Management Methodologies | Simplilearn

PLAN RISK MANAGEMENT Welcome to Lesson 5. So far we have discussed the framework, principles, concepts and high-level activities that are carried out in each of the project Risk Management processes. In this lesson we will discuss in detail the first process, Plan Risk Management. To ensure that project risk management is a succes...Read More.
Going beyond Top 10’s with “Movers & Shakers” | Simplilearn webinar starts 21-03-2017 11:30

Going beyond Top 10’s with “Movers & Shakers” | Simplilearn webinar starts 21-03-2017 11:30

Forget about useless “Top X” reports! Embrace the concept of “movers & shakers” and look at items that really made a difference. With some simple math and the help of Excel or Google Sheet, you will be able to provide much more valuable insight.  Join Stéphane Hamel, Digital Analytics Faculty Chair, Digital ...Read More.
Disruption in Search | Simplilearn webinar starts 07-03-2017 10:00

Disruption in Search | Simplilearn webinar starts 07-03-2017 10:00

Mobile has more than proven itself when it comes to consumer engagement that leads to sales and loyalty. But some of the most impressive successes have come in the enterprise. Whether it’s an app that enables in-the-field workers to increase productivity or the exchange by health professionals of text messages and X-rays that benefit patients...Read More.
Can You Drive Influencer Impact Without Deep Pockets? | Simplilearn webinar starts 14-03-2017 11:30

Can You Drive Influencer Impact Without Deep Pockets? | Simplilearn webinar starts 14-03-2017 11:30

As Influencer Marketing matures from a "you scratch my back, I'll scratch yours" world of collaboration into a wheeling and dealing, high-cost undertaking, many smaller businesses are wondering if they'll be blocked out of the game due to lack of budget. In this webinar, we'll look at the maturation of the market, including bo...Read More.
The utopia of the 360° view of the customer | Simplilearn webinar starts 23-03-2017 16:30

The utopia of the 360° view of the customer | Simplilearn webinar starts 23-03-2017 16:30

Marketers dream of a 360° view of the customer. Of course, solution vendors won’t burst their bubble. To the contrary, they will inflate a little more the preconceived idea that the more you know about your customers, the better decisions you will make.  Join Stéphane Hamel, Digital Analytics Faculty Chair, Digital Analytics ...Read More.
Getting Azure Certified – Everything you Need to Know | Simplilearn webinar starts 01-03-2017 10:30

Getting Azure Certified – Everything you Need to Know | Simplilearn webinar starts 01-03-2017 10:30

This Webinar deals with Microsoft Azure from a high-level perspective, in particular with the Microsoft Certification Exam 70-532: Developing Microsoft Azure Solutions.  Join Utkarsh, as he explores the strategies and what you have to ignore on your path to success. This session would be a validation that you have the skills and knowledge nec...Read More.
How Deep Learning Affects SEO

How Deep Learning Affects SEO

Google is always up to something new and fascinating, and one of the things it has been working on is the evolution of Google Search from machine learning to "something more". Originally, a group of Google's engineers worked on the search engine's recognition of synonyms. With users inputting different words interchangeably, Google would apply this knowledge to better understand what they were searching for.

The next step came with the translation of websites, where the engineers fed the system with a large number of translated documents, “teaching” Google how one language is mapped to another. This way, Google was able to translate sites into languages that none of the engineers spoke. And now, the ultimate step in this evolution is deep learning.

What is deep learning?

Deep learning is based on the notion of digital neurons, which are organized into layers. Each layer extracts higher-level features from the data it receives and passes them on to the next layer. As a result, the higher layers can understand more abstract notions in the input data. Take images, for example.

The first layer receives an input of pixels and is “taught” to recognize shapes from them. Higher layers ...
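
The excerpt is cut off here, but the layered idea it describes can be shown with a tiny, illustrative sketch. The weights below are random and untrained; the only point is how each layer transforms the features it receives and hands them to the next layer.

    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, n_out):
        # Linear transform followed by a ReLU non-linearity
        w = rng.normal(size=(x.shape[0], n_out))
        return np.maximum(0, w.T @ x)

    pixels = rng.random(784)        # e.g. a flattened 28x28 image
    edges = layer(pixels, 128)      # first layer: low-level features such as edges
    shapes = layer(edges, 64)       # next layer: combinations of edges (shapes)
    concepts = layer(shapes, 10)    # higher layers: more abstract notions
    print(concepts.shape)           # (10,)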


Read More on Datafloq
Why Big Data Can Mean Big Gains for Progressive Traders

Why Big Data Can Mean Big Gains for Progressive Traders

For decades, stock traders and other investors have been on the cutting edge of technology. They’re always looking for the slightest advantage that will allow them to be successful. And in recent years, the savviest traders have been relying on big data.

Big Data Improves Technical Analysis

Analysis has always played a significant role in the evaluation of stocks, bonds, and options. Specifically, technical analysis has played a key part. As RJO Futures explains, “Technical analysis is the study of price action and volume through the careful analysis of various different chart types. Modern-day technical analysis looks to expand upon such principles as price trends, moving averages, volume and open interest, support and resistance levels, as well as momentum indicators.”

Until recently, technical analysis has relied on outdated tools – like spreadsheets and rudimentary equations – to provide traders with insights into which trades make sense under a specific set of circumstances. When big data entered the picture, everything changed.
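
To make one of the indicators named above concrete, here is a simple moving average computed over a handful of made-up closing prices. A big data pipeline applies the same arithmetic to tick-level streams rather than to a spreadsheet column.

    closes = [101.2, 102.8, 101.9, 103.4, 104.1, 103.7, 105.0, 106.2]   # illustrative prices

    def simple_moving_average(prices, window):
        # Average of each trailing window of closing prices
        return [sum(prices[i - window:i]) / window for i in range(window, len(prices) + 1)]

    print(simple_moving_average(closes, 3))   # 3-period SMA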

3 Ways Big Data Intersects Trading

Walk through offices on Wall Street today and you’ll notice some stark differences from what was happening 10 or 15 years ago. Let’s take a look at some of the changes that have been brought on by big data:

Tex ...


Read More on Datafloq
What Are Uber’s Future Plans in Terms of Data Analytics?

What Are Uber’s Future Plans in Terms of Data Analytics?

There’s almost no one left in the world, except those perhaps residing in the farthest corners of the earth, who has not yet heard of Uber and the Uber business model.

BigInsights Principal Raj Dalal met up with Uber's Chief Data Architect M C Srivas on a recent visit to San Francisco, where, in the course of an hour-long conversation, Srivas spoke of what data analytics means for Uber and how data innovation is being used to further what is now popularly known around the world as "the Uber model."

In the first part of this three-part series, we had looked at how data analytics was the key to Uber’s success.

In this second part, we look at Uber's future plans based on data analytics.

Raj: As chief data architect here at Uber what are some breakthroughs you hope to make to take Uber to the next phase?

Srivas: Unlike Google or Facebook or Twitter, every bit of Uber's data is actually a monetary transaction. Livelihoods depend on getting that exactly right. At Uber, every piece of data affects somebody's pocket, so the need for quality is much higher than for any standard website. That's a whole different set of challenges.

The second part ...


Read More on Datafloq
Garbage In is Garbage Out; How Big Data Scientists Can Benefit from Human Judgment

Garbage In is Garbage Out; How Big Data Scientists Can Benefit from Human Judgment

This article is Sponsored by Search Strategy Solutions, experts in offering your data scientists high-quality, reliable human judgments and support.

The quality of your data determines the quality of the insights you draw from it. Of course, the quality of your data models and algorithms has an impact on your results as well, but in general it is garbage in, garbage out. That is why (Total) Data Quality Management (DQM) and Master Data Management (MDM) have been around for a very long time, and they should be a vital aspect of your data governance policies.

Data governance can offer many benefits for organizations, including reduced head count, higher quality of data, better data analysis and time savings. As such, those companies that can maintain a balance of value creation and risk exposure in relation to data can create competitive advantage.

Human judgments and Data Quality

Garbage in, garbage out. With the hype around artificial intelligence and machine learning, that principle has become more important than ever. Any organization that takes itself seriously and employs data scientists to develop artificial intelligence and machine learning solutions should take the quality of its data very seriously. Data that is used to develop, test and train algorithms should be of high quality, ...
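
As a small illustration of what that discipline can look like before data ever reaches an algorithm, here is a hypothetical quality check; the column names and sample values are invented.

    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 2, 4, None],
        "spend":       [120.0, -5.0, 310.5, None, 80.0],
    })

    report = {
        "rows": len(df),
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
        "missing_values": int(df.isna().sum().sum()),
        "negative_spend": int((df["spend"] < 0).sum()),
    }
    print(report)   # quantify the garbage before it reaches your models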


Read More on Datafloq
How Facial Recognition Plays a Role in Catching Criminals

How Facial Recognition Plays a Role in Catching Criminals

Big data is constantly making our lives better by helping businesses to serve us better while becoming more efficient, pinpointing trends in large datasets, and even improving healthcare. It’s also definitely making criminals’ lives harder in many ways, making our communities safer overall. From helping police to identify crime patterns in cities to scanning large databases for specific information, big data is revolutionizing criminal justice, and making it a lot harder for people to get away with breaking the law. Fingerprint analysis has long been an important part of forensic evidence to build a case, but thanks to big data and artificial intelligence, there’s a newer tool that can help identify suspects: facial recognition.

How it Works

If you're out and about these days, there's a good chance you're getting caught on video somewhere. Surveillance cameras aren't just for gas stations anymore, and they can be helpful witnesses to criminal activity. With enhancement technology available, footage is often an easy way to identify a suspect. However, calling on the public to identify individuals in these images often wastes precious time—time for criminals to skip town or change their appearance.

Facial recognition in criminal justice relies on a database—a big data dataset that contains ...
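
The excerpt stops here, but one common way such a match is made (not necessarily the exact method the article goes on to describe) is to reduce every face to a numeric feature vector, an "embedding", and compare a new face against the database of known vectors. The embeddings below are random stand-ins; a real system would produce them with a trained neural network.

    import numpy as np

    rng = np.random.default_rng(1)
    database = {f"subject_{i}": rng.random(128) for i in range(1000)}   # known faces

    def best_match(probe, db):
        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        # Return the database entry whose embedding is most similar to the probe
        return max(db.items(), key=lambda kv: cosine(probe, kv[1]))

    probe_face = rng.random(128)    # embedding extracted from surveillance footage
    name, _ = best_match(probe_face, database)
    print(name)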


Read More on Datafloq
How to Navigate the World of Disciplined Software Selection

How to Navigate the World of Disciplined Software Selection

When selected with care, enterprise software can be a strategic differentiator that drives efficiency and cost savings throughout your organization. Furthermore, as the great crew change occurs in any industry, the younger generation has certain implicit expectations about how you will leverage that technology, from automation for repetitive processes to the usability of application interfaces, to remote access anywhere on the devices of their preference. If your organization finds itself labeled a technology dinosaur, then you will neither attract nor retain the best young talent in a competitive employment landscape.

Beyond the human resources aspect, the bar that defines baseline efficiency levels in the business keeps moving upward. If your organization fails to leverage technology effectively, your business will ultimately suffer and fall behind as your competition leapfrogs ahead. If you accumulate enough technical debt in your IT organization, you will find it difficult to catch up to the competition.

Selecting the right software product can be a complicated and daunting process, especially when the costs to purchase, implement, and maintain the product can be in the hundreds of thousands to millions of dollars. Improper selection of a software product results in a direct impact to information management ...


Read More on Datafloq
Designing the Data Management Infrastructure of Tomorrow

Designing the Data Management Infrastructure of Tomorrow

Today, more than ever before, organisations realise the strategic importance of data and consider it a corporate asset that must be managed and protected just like any other asset. Recognising this, an increasing number of farsighted organisations are investing in the tools, skills, and infrastructure required to capture, store, manage, and analyse data.

More organisations are now viewing data management as a holistic activity that requires enterprise-wide collaboration and coordination to share data across the organisation, extract insights, and rapidly convert them into action before opportunities are lost. However, despite the increasing investment in data management infrastructure, there are not many organisations that spend time and effort on anticipating the future events that may impact their data management practices. 

From upcoming rules and regulations to the need to create better customer experiences in order to discover hidden value in customer journeys, there are a number of factors that demand a more proactive approach from organisational leaders and decision makers when it comes to the planning and design of an enterprise’s data management infrastructure. 

Breaking Down the Data Silos 

When it comes to efficient data management, the biggest challenge that enterprises need to overcome is the elimination of the silos ...


Read More on Datafloq
Moving Beyond Predictions – Second Order Analytics

Moving Beyond Predictions – Second Order Analytics

Last month, I wrote about why simply making predictions isn't enough to drive value with analytics. I made the case that behind stories of failed analytic initiatives, there is often a lack of action to turn the predictions into something valuable. It turns out that identifying and then taking the right action often leads to requirements for even more complex analyses beyond the initial effort to get to the predictions! Let's explore what that means.

Identifying The Action Is The Next Step

Once I have a prediction, simulation, or forecast, the next step is to identify what action is required to realize the potential value uncovered. Let’s consider the example of using sensor data for predictive or condition-based maintenance. In this type of analysis, sensor data is captured and analyzed to identify when a problem for a piece of equipment is likely. For example, an increase in friction and temperature within a gear might point to the need to replace certain components before the entire assembly fails.
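
A bare-bones sketch of that gear example might look like the following; the readings and thresholds are invented, and a real condition-based maintenance system would use statistical or machine-learning models rather than fixed limits.

    readings = [
        {"hour": 1, "friction": 0.12, "temp_c": 61},
        {"hour": 2, "friction": 0.14, "temp_c": 63},
        {"hour": 3, "friction": 0.21, "temp_c": 74},   # drifting upward
        {"hour": 4, "friction": 0.27, "temp_c": 82},
    ]

    FRICTION_LIMIT, TEMP_LIMIT = 0.20, 75   # hypothetical normal operating limits

    def maintenance_alerts(samples):
        # Flag any hour where either signal exceeds its limit
        return [s["hour"] for s in samples
                if s["friction"] > FRICTION_LIMIT or s["temp_c"] > TEMP_LIMIT]

    print(maintenance_alerts(readings))   # [3, 4] -> schedule the replacement action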

Identifying the problem ahead of time sounds great. All we have to do is to identify when something is going to break and then fix it before it breaks. Doing so saves ...


Read More on Datafloq
How to Boost Your Career in Big Data and Analytics

How to Boost Your Career in Big Data and Analytics

The world is increasingly digital, and this means big data is here to stay. In fact, the importance of big data and data analytics is only going to continue growing in the coming years. It is a fantastic career move and it could be just the type of career you have been trying to find.

Professionals who are working in this field can expect an impressive salary, with the median salary for data scientists being $116,000. Even those who are at the entry level will find high salaries, with average earnings of $92,000. As more and more companies realize the need for specialists in big data and analytics, the number of these jobs will continue to grow. Close to 80% of the data scientists say there is currently a shortage of professionals working in the field.

What Type of Education Is Needed?

Most data scientists - 92% - have an advanced degree. Only eight percent have a bachelor’s degree; 44% have a master’s degree and 48% have a Ph.D. Therefore, it stands to reason that those who want to boost their career and have the best chance for a long and fruitful career with great compensation will work toward getting higher education.

Some of ...


Read More on Datafloq
Why is Big Data so Important in Today’s World?

Why is Big Data so Important in Today’s World?

"Big revenues are gained from Big Data" (SAP)

There is no doubt that industries are ablaze with a huge eruption of data. No sector has remained untouched by this drastic change over the past decade. Technology has crept inside every business arena and has become an essential part of every processing unit. In the IT industry specifically, software and automation are the bare essentials, used in each and every phase of a process cycle.

Businesses are focusing more on agility and innovation than on stability, and adopting big data technologies helps companies achieve that in no time. Big data analytics has not only allowed firms to stay on top of changing dynamics but has also let them predict future trends, giving them a competitive edge.

What is driving the widespread adoption of big data across the industries?

Let's find out the reasons behind all the hype around big data:

Firms witnessing surprising growth

Needless to say, Big Data is taking the world by storm with its countless benefits. Big Data is allowing leading firms like IBM and Amazon to develop cutting-edge technologies that provide high-end services to their customers.

"Orchestrating Big Data, Cloud ...


Read More on Datafloq
Learn How Radical Analytics Can Uncover Blind Spots in Your Data | Simplilearn webinar starts 09-02-2017 14:00

Learn How Radical Analytics Can Uncover Blind Spots in Your Data | Simplilearn webinar starts 09-02-2017 14:00

Web Analytics has always been associated with defining objectives, setting KPIs, seeking executive buy-ins, and embracing a data-driven culture. If this is something you still believe, then your ROI is probably showing a downward trend. Presenting – a whole new approach to data driven decision making – Radical Analytics. Join Stephane...Read More.
Visualizing an IT Roadmap for Leading Digital Transformation on the Cloud | Simplilearn webinar starts 09-02-2017 03:00

Visualizing an IT Roadmap for Leading Digital Transformation on the Cloud | Simplilearn webinar starts 09-02-2017 03:00

In today’s digital world, IT is about leadership, not just support. Beyond ensuring stability and squeezing costs, today’s IT organizations are tasked with helping companies become nimble digital enterprises. But how can senior IT leaders take the lead in transforming business strategy and operations, instead of reacting to the digital...Read More.
Enterprise transformation to Agile: Challenges and Upcoming trends | Simplilearn webinar starts 02-03-2017 10:00

Enterprise transformation to Agile: Challenges and Upcoming trends | Simplilearn webinar starts 02-03-2017 10:00

If you're looking to introduce Agile to your workplace, you probably already know about the benefits in speed, quality, and bottom-line results that you can expect. Conducted by Agile expert Jeff Allen, this webinar will show you how to overcome challenges that enterprises face during the transition process to Agile. The session will cover th...Read More.
Is Data the Currency of the Future?

Is Data the Currency of the Future?

Technology has now infiltrated nearly every facet of human life, making the world more connected than it’s ever been. Think about it - in just a few short years a completely new tech vernacular has emerged, making commonplace inventive terms like trolling, cookies, memes and overgrams. One of the most descriptive, and now relevant phrases that has come into being as a result of the digital revolution is “data mining.” Data mining is exactly what it sounds like - digging for data. The idea that data is a valuable commodity is implicit in this phrase. And it is. In fact, recently data has become such a critical part of the business world and the world in general that the question has been raised: is data becoming a form of currency?

The intrinsic, quantifiable value of data is without question. Think about all the ways that data provides monetary value to companies. First, there is the obvious fact that data-informed insights are more likely to pay off. Eliminating “gut-based” decision-making allows companies to grow more consistently and methodically. Secondly, consider how the wealth of consumer insights available through data mining has allowed marketers and brand ambassadors to connect with their customers like ...


Read More on Datafloq
7 Interesting and Unusual Uses of AI You May Not Know About

7 Interesting and Unusual Uses of AI You May Not Know About

Artificial Intelligence (AI) is not just a buzzword used by scientists, science fiction authors, and filmmakers. The future has already arrived. AI has a large number of application fields, many of them robotics-related (e.g. Google Home and Amazon Echo). Let's take a journey through the most interesting AI use cases.

Customer Service

If you are accustomed to robot-like chatbots ineffectively simulating human behavior, you will have to forget about them very soon. AI is believed to be the future of customer service. Many companies are working on AI projects, and platforms such as Init.ai, motion.ai and Octane AI already assist businesses in creating feature-rich chatbots using AI. Technology solutions empowered by AI can do much more than just conduct a conversation; they can be personal assistants like Siri, Microsoft's Cortana, and Google Now.

What about the new crop? Developers offer AI apps like Mezi and Claire that help people manage their trips. Virtual humans, such as the 3D intelligent virtual assistants developed by iDAvatars, are emotionally expressive and can speak the customer's language, thus delivering a superior customer experience.



Source: http://codebaby.com/

Food and Drinks Industry

You may have heard about some "boring" probable uses of AI like "intelligent machines" for food sorting. What about this one? The ...


Read More on Datafloq
Dark Matter: How GPS Data is Helping to Unlock the Universe’s Secrets

Dark Matter: How GPS Data is Helping to Unlock the Universe’s Secrets

Data is essential to the way we live our lives here on Earth. But it’s also proven to be very helpful in exploring the mysteries of the universe beyond our tiny planet. Because there are so many phenomena observed in space that are difficult to measure, many mysteries remain about how the universe functions, changes, and evolves. Dark matter has been of particular interest to researchers, but it has been notoriously difficult to study—because all that has been detected of the elusive matter is its gravitational pull. New research has revealed that there may be a way to find this matter, however, using a common technology that we use every day: GPS satellites and data. 

What is Dark Matter?

Dark matter is one of the most elusive and mysterious substances that exist. It is thought to be responsible for the formation of galaxies, but very little is known about its nature beyond the hypothesis that it makes up roughly 85% of the universe's matter and that its gravity keeps galaxies together. Scientists have been hunting for answers about dark matter for years, with few solid answers. Even extra-sensitive sensors haven't been able to detect dark matter ...


Read More on Datafloq
Email Marketing Data You Can’t Afford To Ignore

Email Marketing Data You Can’t Afford To Ignore

Compared to previous types of marketing, digital marketing is a data goldmine. There’s no 100% verifiable way to measure how many people read a newspaper ad, but Google Analytics can quickly tell you how many people visited each page on your website, how long they visited for, and so much more.

In the realm of digital marketing, email marketing takes the lead as the most effective — with websites coming in at a close second. On average, every dollar spent on email marketing generates $38 in return. If you’re ignoring email marketing, you’re throwing away money.

To make the most of email marketing, it is important to understand the wealth of data every email campaign generates. By digging into the data, you can improve your email marketing for even better results for your company.

Fortunately, many email marketing services make it easy to understand important email marketing data — no degree in research methods or mathematics required.

While your particular marketing goals will influence which email marketing data is most important for you to examine, there are five types of data that are always worth tracking.

Open Rate

Part of what makes email marketing so amazing is how effective it is at putting your brand in ...
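
The excerpt is truncated here, but the metrics it is building toward are simple ratios. The campaign counts below are hypothetical; your email platform reports the raw numbers for you.

    campaign = {"delivered": 5000, "opened": 1100, "clicked": 240,
                "revenue": 1824.0, "cost": 48.0}

    open_rate = campaign["opened"] / campaign["delivered"]     # share of delivered emails opened
    click_to_open = campaign["clicked"] / campaign["opened"]   # share of openers who clicked
    roi = campaign["revenue"] / campaign["cost"]               # dollars back per dollar spent

    print(f"open rate {open_rate:.1%}, click-to-open {click_to_open:.1%}, ROI {roi:.0f}x")

At the $38-per-$1 return cited above, even small improvements in these ratios compound quickly.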


Read More on Datafloq
The Dark (and not-so-dark) Side of Mobile Data

The Dark (and not-so-dark) Side of Mobile Data

For information consumers, constant communicators and internet junkies everywhere, unlimited mobile data is a pass to unlimited playtime. It’s like unlimited crack for the addict. For data scientists, analysts and marketers, the words ‘unlimited mobile data’ spark raised eyebrows. Unlimited data, by its very nature, makes for limitless possibilities. How its use will pan out is, of course, a different story.

One way companies such as Target and Macy’s use smartphone data is movement tracking. When a customer arrives, a camera gets a shot of their face or license plate, then the retailer tracks where they go in the store through WiFi. Although these stores aren’t malicious, and are ostensibly doing this to build user profiles and personalize marketing, it seems ominous: 80% of shoppers feel in-store tracking is unacceptable; 68.5% are concerned their data will not be kept safe, and 67% say “tracking feels like spying.”

This is the grey area of mobile data usage, because many people might not be okay with tracking, but they might be happy to take advantage of a special discount the store offers as a result of tracking. Customers with Wal-Mart’s app, which uses geolocation tracking, spend nearly 40% more per month. This benefits Wal-Mart, ...


Read More on Datafloq
5 Major Big Data Predictions for 2017

5 Major Big Data Predictions for 2017

Big data is, well, big business. It's believed that over the next four years, more than two billion people will be using personal cloud storage at a capacity of 1.7 gigabytes per user per month. The cloud will gradually become the standard method of storage for both personal and professional needs, minimizing the use of everything from physical servers to flash drives.

The cloud is expected to take the forefront in a lot more than storage. Here are five major big data predictions that will be with us in the workplace long beyond 2017.

Training

eLearning companies were among the first industries to take full advantage of the cloud. Through mobile learning management systems, they provide exceptional tools that have transformed onboarding and training for companies and employees in every way. The cloud has given every employer great opportunities to create and customize training to fit specific needs and goals. The tech lets the employer design training unique to each individual employee. Employees can log in, pause sessions, pick up during their commute, and take the exam at home.

Content

The cloud has significantly altered the way we create, edit, share, and present content. The once mighty Microsoft and Corel office staples are taking a step back ...


Read More on Datafloq
Two-Factor Authentication: To Build Or To Buy?

Two-Factor Authentication: To Build Or To Buy?

For any business, in any niche, cost and price are among the most discussed and most closely watched factors. And why not? More often than not, cost is either the deal maker or the deal breaker when it comes to decision-making. Need I say that everybody is trying to optimize this factor as much as possible? But many times, in trying to reduce it, people end up making the wrong decisions and failing in the original quest.

For example, with the rise in data breaches, the implementation of two-factor authentication is at its peak. If you deal with sensitive customer data, you have probably either implemented this solution already or are going to implement it very soon. And obviously, the idea of building two-factor authentication on your own must have crossed your mind more than once. After all, it would reduce costs (or so it seems), you would have full control over your data, and it is exciting, right? So let's do it!

Well, hold your horses! If you do some analysis, building 2FA on your own is not that wise once you look at other factors like prior investment, after-effects, and so on. Don't worry, we'll soon dive into ...
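
To give a feel for what "building it yourself" actually involves, here is a bare-bones sketch of the TOTP algorithm (RFC 6238) that most authenticator apps use. Note everything it omits: secret provisioning and QR enrolment, rate limiting, clock-drift windows, backup codes, and recovery flows, which is where most of the real cost hides.

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32, digits=6, period=30):
        # HMAC-SHA1 over the current 30-second time step, then dynamic truncation
        key = base64.b32decode(secret_b32)
        counter = struct.pack(">Q", int(time.time()) // period)
        mac = hmac.new(key, counter, hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
        return str(code).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))   # compare with the code shown in the user's authenticator app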


Read More on Datafloq
Why Creativity is Crucial in Data Science

Why Creativity is Crucial in Data Science

Data science might not be seen as the most creative of pursuits.

You add a load of data into a repository, and you crunch it at the other end to draw your conclusions. Data in, data out; where is the scope for creativity there? It is not like you are working with a blank canvas.

For me, the definition of creativity is when you are able to make something out of nothing. This requires an incredible amount of imagination, and seeing past the obvious headline statistics to reach a deeper conclusion is the hallmark of a great Big Data professional.

Many of the most successful people come across as innovative thinkers when they interview with us. They have no choice: moulding the data in unique and unexpected ways is their job. Just as Einstein found inspiration in his violin playing, many leading data scientists find that when their creative juices are flowing, they arrive at the most elegant solutions to the challenges they face. These Data Creatives are some of the hardest candidates to find, mainly due to the subjectivity involved. (See also a previous blog for more on Data Creatives)

It is actually one of my favourite interview questions:

“What is the ...


Read More on Datafloq
Attackers Can Intercept Data From CouchDB and Hadoop Databases

Attackers Can Intercept Data From CouchDB and Hadoop Databases

Is Your Big Data Vulnerable?

The year 2017 started on a dramatic, Hollywood-style note for a few database-service providers. In a series of events, several databases became victims of ransomware. This raised questions about the security of cloud-based databases, which has been the biggest drawback of the cloud so far.

MongoDB and Elasticsearch Fell Victim

In what seemed to be a hacking attempt by an individual or a group, thousands of MongoDB-based databases fell victim to a cyber-attack. The hacker claimed to have access to many databases and threatened to delete or encrypt the data if a ransom was not paid.

While many were still trying to get over the MongoDB incident, news regarding Elasticsearch clusters started doing the rounds. In this case, the data was deleted from the cluster and a message asking for ransom was left behind. Experts are working on establishing whether this is an isolated event or whether it is related to the MongoDB attack.

Hadoop and CouchDB Instances

When Hadoop and CouchDB started facing similar issues, where data was deleted from the instances, the first thought that crossed many people's minds was that these had also been hit by ransomware. But the reality turned out to be different. The hackers targeting Hadoop ...


Read More on Datafloq
Understanding the Human Side of Big Data

Understanding the Human Side of Big Data

The idea of "Big Data" has generated a lot of hype, especially in recent years as the variety of data sources increases and affordable new data tools appear. While success with big data analytics is not guaranteed, the chance for discovering operational improvements from your own data still makes it attractive. It's been estimated that a large retailer can increase margins by over 60 percent through process insights. Increasingly in the social media era, big data is shaping consumer-brand relationships.

For many companies, however, big data is not providing the anticipated returns. The value in data-driven decisions seems to be elusive. Colin Strong, the author of "Humanizing Big Data", advises looking to the human rather than the technical side to obtain positive results.

Companies need to re-examine big data operations in terms of the humans involved. This can mean not only the consumers feeding a flood of data points, but the data analysts themselves.

The Humans Behind the Data

Treating data analysis as a purely statistical exercise means missing out on the human circumstances that generate the information. Human social environments can shape consumer preferences rapidly and profoundly. Basing decisions solely on the numbers means that potential changes are missed, and brands could miss spotting ...


Read More on Datafloq
Here, There and Everywhere: Interview with Brian Wood on Teradata’s Cloud Strategy

Here, There and Everywhere: Interview with Brian Wood on Teradata’s Cloud Strategy

(Image Courtesy of Teradata)
In a post about Teradata's 2016 Partners event I wrote about the big effort Teradata is making to ensure its software offerings are now available both on-premises and in the cloud, in a variety of forms and shapes, with a big push to ensure Teradata's availability, especially for hybrid cloud configurations.

So, the data management and analytics software giant seems to be sticking to its promise by increasingly bringing its flagship Teradata Database and other solutions to the cloud, whether in the form of its own Managed Cloud for the Americas and Europe, as a private cloud-ready solution, or via public cloud providers such as AWS and, as most recently announced, Microsoft's Azure Marketplace.

To chat about this latest news and Teradata's overall cloud strategy, we sat down with Teradata's Brian Wood.

Brian Wood is director of cloud marketing at Teradata. He is a results-oriented technology marketing executive with 15+ years of digital, lead gen, sales / marketing operations & team leadership success.

Brian has an MS in Engineering Management from Stanford University, a BS in Electrical Engineering from Cornell University, and served as an F-14 Radar Intercept Officer in the US Navy.

Throughout 2016, and especially during its 2016 Partners conference, Teradata made it clear that it is undergoing an important transformation, and a key part of that strategy is its path to the cloud. Offerings such as Teradata Database on different private and public cloud configurations, including AWS, VMware, Teradata Managed Cloud, and of course Microsoft Azure, are available now. Could you share some details about the progress of this strategy so far?

Thanks for asking, Jorge. It’s been a whirlwind because Teradata has advanced tremendously across all aspects of cloud deployment in the past few months; the progress has been rapid and substantial.

To be clear, hybrid cloud is central to Teradata’s strategy and it’s all about giving customers choice. One thing that’s unique to Teradata is that we offer the very same data and analytic software across all modes of deployment – whether managed cloud, public cloud, private cloud, or on-premises.

What this means for customers is that it's easy for them to transfer data and workloads from one environment to another without hassle or loss of functionality; they can have all the features in any environment and dial it up or down as needed. Customers like this flexibility because nobody wants to be locked in, and it's also helpful to be able to choose the right tool for the job without worrying about compatibility or consistency of results.

Specific cloud-related advancements in the last few months include:
  • Expanding Teradata Managed Cloud to now include both Americas and Europe
  • Increasing the scalability of Teradata Database on AWS up to 64 nodes
  • Launching Aster Analytics on AWS with support up to 33 nodes
  • Expanding Teradata Database on VMware scalability up to 32 virtual nodes
  • Bolstering our Consulting and Managed Services across all cloud options
  • And announcing upcoming availability of Teradata Database on Azure in Q1
These are just the ones that have been announced; there are many more in the pipeline queued up for release in the near future. Stay tuned!

The latest news is the availability of Teradata Database on Microsoft’s Azure Marketplace. Could you give us the details around the announcement?

We’re very excited about announcing Q1 availability for Teradata Database on Azure because many important Teradata customers have told us that Microsoft Azure is their preferred public cloud environment. We at Teradata are agnostic; whether AWS, Azure, VMware, or other future deployment options, we want what’s best for the customer and listen closely to their needs.

It all ties back to giving customers choice in how they consume Teradata, and offering the same set of capabilities across the board to make experimentation, switching, and augmentation as easy as possible.

Our offerings on Azure Marketplace will be very similar to what we offer on AWS Marketplace, including:
  • Teradata Database 15.10 (our latest version)
  • Teradata ecosystem software (including QueryGrid, Unity, Data Mover, Viewpoint, Ecosystem Manager, and more)
  • Teradata Aster Analytics for multi-genre advanced analytics
  • Teradata Consulting and Managed Services to help customers get the most value from their cloud investment
  • Azure Resource Manager Templates to facilitate the provisioning and configuration process and accelerate ecosystem deployment

What about configuration and licensing options for Teradata Database in Azure?

The configuration and licensing options for Teradata Database on Azure will be similar to what is available on AWS Marketplace. Customers use Azure Marketplace as the medium through which to find and subscribe to Teradata software; they are technically Azure customers but Teradata provides Premier Cloud Support as a bundled part of the software subscription price.

One small difference between what will be available on Azure Marketplace compared to what is now available on AWS Marketplace is subscription duration. Whereas on AWS Marketplace we currently offer both hourly and annual subscription options, on Azure Marketplace we will initially offer just an hourly option.

Most customers choose hourly for their testing phase anyway, so we expect this to be a non-issue. In Q2 we plan to introduce BYOL (Bring Your Own License) capability on both AWS Marketplace and Azure Marketplace which will enable us to create subscription durations of our choosing.

Can we expect technical and functional limitations from this version compared with the on-premises solution?

No, there are no technical or functional limitations of what is available from Teradata in the cloud versus on-premises. In fact, this is one of our key differentiators: customers consume the same best-in-class Teradata software regardless of deployment choice. As a result, customers can have confidence that their existing investment, infrastructure, training, integration, etc., is fully compatible from one environment to another.

One thing to note, of course, is that a node in one environment will likely have a different performance profile than what is experienced with a node in another environment. In other words, depending on the workload, a single node of our flagship Teradata IntelliFlex system may require up to six to ten instances or virtual machines in a public cloud environment to yield the same performance.

There are many variables that can affect performance – such as query complexity, concurrency, cores, I/O, internode bandwidth, and more – so mileage may vary according to the situation. This is why we always recommend a PoC (proof of concept) to determine what is needed to meet specific customer requirements.

Considering a hybrid cloud scenario, what can we expect with regard to integration with the rest of the Teradata stack, especially on-premises?

Hybrid cloud is central to Teradata's strategy; I cannot emphasize this enough. We define hybrid cloud as a customer environment consisting of a mix of managed, public, private, and on-premises resources orchestrated to work together.

We believe that customers should have choice and so we’ve made it easy to move data and workloads in between these deployment modes, all of which use the same Teradata software. As such, customers can fully leverage existing investments, including infrastructure, training, integration, etc. Nothing is stranded or wasted.

Hybrid deployment also introduces the potential for new and interesting use cases that were less economically attractive in an all-on-premises world. For example, three key hybrid cloud use cases we foresee are:
  • Cloud data labs – cloud-based sandboxes that tie back to on-premises systems
  • Cloud disaster recovery – cloud-based passive systems that are quickly brought to life only when needed
  • Cloud bursting – cloud-based augmentation of on-premises capacity to alleviate short-term periods of greater-than-usual utilization


How about migrating from existing Teradata deployments to Azure? What is the level of support Teradata and/or Azure will offer?

Teradata offers more than a dozen cloud-specific packages via our Consulting and Managed Services team to help customers get the most value from their Azure deployments in three main areas: Architecture, Implementation, and Management.

Specific to migration, we first always recommend that customers have a clear strategy and cloud architecture document prior to moving anything so that the plan and expectations are clear and realistic. We can facilitate such discussions and help surface assumptions about what may or may not be true in different deployment environments.

Once the strategy is set, our Consulting and Managed Services team is available to assist customers or completely own the migration process, including backups, transfer, validation, testing, and so on. This includes not only Teradata-to-Teradata migration (e.g., on-premises to the cloud), but also competitor-to-Teradata migrations as well. We especially love the latter ones!

Finally, can you share with us a bit of what is next for Teradata in the Cloud?

Wow, where should I start? We’re operating at breakneck pace. Seriously, we have many new cloud developments in the works right now, and we’ve been hiring cloud developers like crazy (hint: tell ‘em Brian sent you!).

You'll see more cloud announcements from us this quarter, and without letting the cat out of the bag, expect advancements in the realm of automation, configuration assistance, and an expansion of managed offers.

Cloud is a key enabler to our ability to help customers get the most value from their data, so it’s definitely an exciting time to be involved in helping define the future of Teradata.
Thanks for your questions and interest!

How To Apply IoT Applications In Real Life

How To Apply IoT Applications In Real Life

The Internet of Things (IoT) is a hot topic in business circles, and with current innovations in the field it is easy to see why. Related technologies like sensor-embedded connected systems have the potential to transform the way we live and even how businesses operate. To illustrate the impact of the trend, here are the top ten applications that showcase the Internet of Things:


Wearables  
Smart Homes
Connected Cars
Smart Cities
Industrial IoT
Retail
Healthcare
Smart Grids
Environmental Monitoring
Supply Chain


1 - Wearables

When it comes to technologies that made the Internet of Things possible, wearables deserve special mention. Wearables are typically sensor-embedded objects driven by software and designed to collect data about users, send it, and sometimes even receive it. Depending on what a device is for, the data received about the user can be used to glean insights on anything from the user's health to bank transactions.

Since their inception, there has been increasing demand for wearable devices, and brands like Google and Apple are already cashing in on it. Cases in point are the Apple Watch and Google Glass.

2 - Smart Homes

If wearables like smart watches aren't enough to convince you that IoT is ...


Read More on Datafloq
Real-Time Kafka Data Ingestion into HBase via PySpark

Real-Time Kafka Data Ingestion into HBase via PySpark

Streaming data is becoming an essential part of every data integration project nowadays; if not an explicit requirement, then second nature. The advantages gained from real-time data streaming are many. To name a few: real-time analytics and decision making, better resource utilization, data pipelining, facilitation of micro-services and much more.

Python has many modules that are used heavily by data engineers and scientists to achieve different goals. While Scala is gaining a great deal of attention, Python is still favored by many out there, including myself. Apache Spark has a Python API, PySpark, which exposes the Spark programming model to Python, allowing fellow "pythoners" to make use of Python on the highly distributed and scalable Spark framework.

Often, persisting real-time data streams is essential, and ingesting MapR Streams / Kafka data into MapR-DB / HBase is a very common use case. Both Kafka and HBase are built with two very important goals in mind: scalability and performance. In this blog post, I'm going to show you how to integrate both technologies using Python code that runs on Apache Spark (via PySpark). I had already tried to search for such a combination on the internet with no luck; I found Scala examples but not ...
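
The excerpt stops here, but the pattern the author describes usually looks something like the sketch below. It assumes the spark-streaming-kafka package and the happybase HBase client are available; the broker address, topic name, table name, and record fields are placeholders.

    import json
    import happybase
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    sc = SparkContext(appName="KafkaToHBase")
    ssc = StreamingContext(sc, batchDuration=5)          # 5-second micro-batches

    stream = KafkaUtils.createDirectStream(
        ssc, ["sensor-events"], {"metadata.broker.list": "broker1:9092"})

    def write_partition(records):
        # One HBase connection per partition, not per record
        conn = happybase.Connection("hbase-host")
        table = conn.table("sensor_events")
        for _, value in records:
            event = json.loads(value)
            table.put(event["id"].encode(),
                      {b"cf:payload": json.dumps(event).encode()})
        conn.close()

    stream.foreachRDD(lambda rdd: rdd.foreachPartition(write_partition))

    ssc.start()
    ssc.awaitTermination()

Writing from foreachPartition keeps connection setup off the per-record path, which is the usual way to keep the HBase write side efficient.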


Read More on Datafloq
Corporate Self-Service Analytics: 4 Questions You Should Ask Yourself Before You Start

Corporate Self-Service Analytics: 4 Questions You Should Ask Yourself Before You Start

Today's customers are socially driven and more value-conscious than ever before. Believe it or not, everyday customer interactions create a whopping 2.5 exabytes of data, which is equal to 2.5 million terabytes, and this figure has been predicted to grow by 40 percent with every passing year. As organisations face the mounting challenge of coping with the surge in the amount of data and the number of customer interactions, it has become extremely difficult to manage the huge quantities of information whilst providing a satisfying customer experience. It is imperative for businesses and corporations to create a customer-centric experience by adopting a data-driven approach based on predictive analytics.

Integrating an advanced self-service analytics (SSA) environment to strengthen your analytics and data handling strategy can prove beneficial for your business, regardless of the type and size of your enterprise. A corporate SSA environment can dramatically improve your operational capabilities, as it provides an in-depth understanding of consumer data. This, in turn, helps your workforce take a more responsive, nimble approach to analyzing data, and fosters fact-based decision making rather than reliance on predictions and guesswork. Self-service analytics offers a wealth of intelligence and insights into how ...


Read More on Datafloq
6 Reasons Data Analytics Will Make a Splash in 2017

6 Reasons Data Analytics Will Make a Splash in 2017

There is no doubt about it. Millennials, unlike previous generations, love self-service options. They want the freedom to customize orders, change account information, and find answers to their questions without intervention from a middle man. Think of them as the first demographic who prefers navigating automated phone systems over speaking to a real person, at least most of the time.

This is even true within organizations. Users would much rather access and query data themselves. It’s faster and more customized than relying on standard reports or submitting a request for an ad hoc report.

This trend is only going to grow as 2017 continues. Check out six ways that data analytics will trend over the next twelve months.

1. Data Gets Smarter

The Vs of big data are veracity, volume, velocity, and variety. Smart data involves removing volume, variety, and velocity from the picture and focusing on veracity. The idea is that by doing so, the value of that data to people and organizations increases in a meaningful way.

Smart data is useful and actionable. It involves weeding out the fluff and providing information that people can use to make decisions and predict trends. In 2017, brands will increasingly use artificial intelligence when analyzing data ...


Read More on Datafloq
Five Ways How Big Data is Applied in Russia

Five Ways How Big Data is Applied in Russia

Everybody has heard about Big Data, but relatively few people know what it means and how we can actually use it. Those who do can build their business around Big Data to optimize spending and forecast the behavior of their clients.

The meaning behind Big Data

The amount of digital data is growing every year. According to IBS, the total amount of data stored in 2015 was more than 6.5 zettabytes, and it is constantly growing. These huge masses of data are called Big Data. In Russia, the term is often extended to include the tools for analyzing the data, but the main idea stays the same. Only about 1.5% of this data is actually useful, and we need strong analytics to extract the information we need from the whole data pool.

Big Data over the world

Nowadays the USA is the pioneer of Big Data practices, with the majority of companies being at least interested in the subject. In 2014, according to IDC, more regions began to look in data's direction: Europe, Asia (excluding Japan) and Africa took a 45% share of big data software and hardware.

Where Big Data is used

Big Data is used in a wide range of areas. With clever use of data analytics, you can find out the effectiveness ...


Read More on Datafloq
Future-Proof: Data-Driven Marketing Predictions for 2017

Future-Proof: Data-Driven Marketing Predictions for 2017

We recently spoke with Daniel Cantorna, Head of Data Products at ICLP, to get his insights on the future of big data in 2017 and his specialty, data-driven marketing. Daniel has worked on marketing automation programmes and has architected solutions to collect, store and understand customer data, as well as creating several data products that help marketers make better decisions. Here are some of his thoughts:

The Difference Between Data-Driven Marketing and Data-Informed Marketing. There is a big difference between data-driven and data-informed marketing, but the two approaches can strengthen each other.

Essentially, data-driven marketing is the direct use of insights or outcomes from data analysis to inform a marketing decision which then impacts on a customer experience in some way. For example, offering special discounts to customers who click on your ad, visit your web page, or sign up for your mail list.

On the other hand, data-informed marketing describes a process whereby the outcome of some data analysis may inform or direct a future decision e.g. we have noticed that there is a dip in sales when products are advertised using these (x, y, z) words, so let’s try using a different product ...


Read More on Datafloq
How to Perfect Lambda Architecture with Oracle Data Integrator (and Kafka / MapR Streams)

How to Perfect Lambda Architecture with Oracle Data Integrator (and Kafka / MapR Streams)

"Lambda architecture is a data-processing architecture designed to handle massive quantities of data by taking advantage of both batch- and stream-processing methods. This approach to architecture attempts to balance latency, throughput, and fault-tolerance by using batch processing to provide comprehensive and accurate views of batch data, while simultaneously using real-time stream processing to provide views of online data. The two view outputs may be joined before presentation. The rise of lambda architecture is correlated with the growth of big data, real-time analytics, and the drive to mitigate the latencies of map-reduce." - Wikipedia



Previously, I've written some blogs covering many use cases for Oracle Data Integrator (ODI) doing batch processing on top of the MapR distribution, and for Oracle GoldenGate (OGG) streaming transactional data into MapR Streams and other Hadoop components. While the combination of both products fits the lambda architecture perfectly, the latest release of ODI (12.2.1.2.6) has many great new features, including the ability to handle Kafka streams as source and target from ODI itself. This feature offers tremendous advantages to anyone who already has, or is planning, a lambda architecture, by simplifying the way we process and handle both batch and fast data within the same logical design, under one product. Now, if we combine OGG's streaming capabilities with ODI's batch/streaming capabilities, ...


Read More on Datafloq
How to Build a Data Science Team

How to Build a Data Science Team

Businesses today need to do more than merely acknowledge big data. They need to embrace data and analytics and make them an integral part of their company. Of course, this will require building a quality team of data scientists to handle the data and analytics for the company. Choosing the right members for the team can be difficult, mainly because the field is so new and many companies are still trying to learn exactly what a good data scientist should offer. Putting together an entire team has the potential to be more difficult. The following information should help to make the process easier.

The Right People

What roles need to be filled for a data science team? You will need to have data scientists who can work on large datasets and who understand the theory behind the science. They should also be capable of developing predictive models. Data engineers and data software developers are important, too. They need to understand architecture, infrastructure, and distributed programming.

Some of the other roles to fill in a data science team include the data solutions architect, data platform administrator, full-stack developer, and designer. Those companies that have teams focusing on building data products will also likely want ...


Read More on Datafloq
The Role and Impact of Big Data on the Banking & Finance Sector

The Role and Impact of Big Data on the Banking & Finance Sector

Technology has played a large part in the evolution of the banking and finance industry over the last one to two decades. The services, and the way banks operate, have advanced to make life easier for both customers and banking professionals. When the Big Data revolution hit various industries, the banking industry realised the opportunities associated with it. This article provides an insight into the impact of Big Data on the banking sector.

How Big is Big Data in the Banking Sector?

Banking firms have always had a huge amount of information stored in their databases, but were often clueless about what to do with it. Big Data has unlocked the doors to converting this huge amount of data into meaningful benefits for themselves and their customers.

According to a report by Alacer, banks in the US currently hold around 1 exabyte of stored data, which is equal to 273 billion MP3s. Typically, information in the banking industry comes from sources such as customer bank visits, credit/debit card histories, banking volumes, account transactions, call logs, and web interactions.

Role and Impact of Big Data

As mentioned in the above paragraphs, there are a lot of areas which have been or can ...


Read More on Datafloq
How Cloud Computing Provides Innovative Business Solutions

How Cloud Computing Provides Innovative Business Solutions

You’ve surely heard the term “cloud computing” by now. Even parents and grandparents know they can use their iPhone to back up their family photos in the cloud, although they may not know exactly what that means.

In basic terms, cloud computing is the act of storing and accessing your data and programs over the internet, as opposed to a stand-alone desktop computer’s hard drive. While the term “cloud computing” is fairly new, the concept is not. People have been using cloud computing since the 1960s, even prior to the internet as we know it today.

When Salesforce.com made its debut in 1999, it delivered the first enterprise application over a basic website. It wasn’t long before other companies followed suit, and in 2002, Amazon Web Services was launched.

Why it’s called the cloud

Referring to this technology as “the cloud” is a genius marketing strategy that paints a picture in the user’s mind of an all-powerful network where data is accessible at all times, from all devices. It’s much like a puffy, white cloud that hovers overhead and follows you wherever you go.

Cloud computing is important because it allows people to collaborate in real time from opposite ends of the world. It’s also ...


Read More on Datafloq
How to Protect your Network against rising Cybercrime

How to Protect your Network against rising Cybercrime

A security system is a means by which something is secured through a set of interworking components and devices. When it comes to information, security is defined as the protection of information to minimize exposure to unauthorized personnel.

Cybercrime covers a wide range of malicious activities, including the illegal interception of data, system interference that compromises network integrity and availability, and copyright infringement. These offenses are committed using telecommunication networks such as the internet and mobile phones. The crimes may be committed by individuals or small groups, as well as by criminal organizations that are often spread around the world, committing crimes on an unprecedented scale with the intent to harm the reputation of the victim. The offenses can cause physical or mental harm, or loss, to the victim directly or indirectly. These crimes threaten a nation’s security and financial health. Cyber criminals often choose to operate in countries with weak or nonexistent cybercrime laws.

How Is Your Website Getting Harmed by Hackers?

A hacker is a highly skilled computer expert, capable of breaking into computer systems and networks using bugs and exploits to gain unauthorized access to data. Below are the most common types of attacks hackers launch against websites:

1. ...


Read More on Datafloq
Yep, I’m Writing a Book on Modern Data Management Platforms (2017-02 Update)

Yep, I’m Writing a Book on Modern Data Management Platforms (2017-02 Update)

(Image courtesy of Thomas Skirde)
As I mentioned in my first blog about the book, I'm now working hard to deliver a piece that will hopefully serve as a practical guide for the implementation of a successful modern data management platform.

I'll try to provide frequent updates and, perhaps, share some pains and gains about its development.
For now, here's some additional information, including the general outline and the type of audience the book is intended for.

I invite you to be part of the process and leave your comments, observations and words of encouragement right below, or better yet, to consider:
  • Participating in our Data Management Platforms survey (to obtain a nice discount right off the bat)
  • Pre-ordering the book: soon I’ll provide you with details on how to pre-order your copy, but in the meantime, you can show your interest by signing up to our pre-order list, or
  • Providing us with information about your own successful enterprise use case, which we may use in the book
Needless to say, the information you provide will be kept confidential and used only for the purpose of developing this book.
So here, take a look at the update...


New Data Management Platforms

Discovering Architecture Blueprints

About the Book

What Is This Book About?

This book is the result of a comprehensive study into the improvement, expansion, and modernization of different types of architectures, solutions, and platforms to address the need for better and more effective ways of dealing with increasing and more complex volumes of data.

In conducting his research for the book, the author has made every effort to analyze in detail a number of successful modern data management deployments as well as the different types of solutions proposed by software providers, with the aim of providing guidance and establishing practical blueprints for the adoption and/or modernization of existing data management platforms.
These new platforms have the capability of expanding the ability of enterprises to manage new data sources—from ingestion to exposure—more accurately and efficiently, and with increased speed.

The book is the result of extensive research conducted by the author examining a wide number of real-world, modern data management use cases and the plethora of software solutions offered by various software providers that have been deployed to address them. Taking a software vendor‒agnostic viewpoint, the book analyzes what companies in different business areas and industries have done to achieve success in this endeavor, and infers general architecture footprints that may be useful to those enterprises looking to deploy a new data management platform or improve an already existing one.

Who Is This Book For?

This book is intended for both business and technical professionals in the area of information technology (IT). These roles would include chief information officers (CIOs), chief technology officers (CTOs), chief financial officers (CFOs), data architects, and data management specialists interested in learning, evaluating, or implementing any of the plethora of new technologies at their disposal for modernizing their existing data management frameworks.

The book is also intended for students in the fields of computer sciences and informatics interested in learning about new trends and technologies for deploying data architecture platforms. It is not only relevant for those individuals considering pursuing a big data/data management‒related career, but also for those looking to enrich their analytics/data sciences skills with information about new platform technologies.
This book is also relevant for:


  • Professionals in the IT market who would like to enrich their knowledge and stay abreast of developments in information management.
  • Entrepreneurs who would like to launch a data management platform start-up or consultancy, enhancing their understanding of the market, learning about some start-up ideas and services for consultants, and gaining sample business proposals.
  • Executives looking to assess the value and opportunities of deploying and/or improving their data management platforms. 
  • Finally, the book can also be used by a general audience from both the IT and business areas to learn about the current data management landscape and technologies in order to acquire an informed opinion about how to use these technologies for deploying modern technology data management platforms. 

What Does This Book Cover? 

The book covers a wide variety of topics, from a general exploration of the data management landscape to a more detailed review of specific topics, including the following:

  • The evolution of data management
  • A comprehensive introduction to Big Data, NoSQL, and analytics databases 
  • The emergence of new technologies for faster data processing—such as in-memory databases, data streaming, and real-time technologies—and their role in the new data management landscape
  • The evolution of the data warehouse and its new role within modern data management solutions 
  • New approaches to data management, such as data lakes, enterprise data hubs, and alternative solutions 
  • A revision of the data integration issue—new components, approaches, and solutions 
  • A detailed review of real-world use cases, and a suggested approach to finding the right deployment blueprint 

How Is the Book Structured?

The book is divided into four comprehensive parts that offer a historical perspective, the groundwork for the development of data management platforms and associated concepts, and an analysis of real-world modern data management frameworks, working toward the establishment of potential deployment blueprints.

  • Part I. A brief history of diverse data management platform architectures, and how their evolution has set the stage for the emergence of new data management technologies. 
  • Part II. The need for and emergence of new data management technologies such as Big Data, NoSQL, data streaming, and real-time systems in reshaping existing data management infrastructures. 
  • Part III. An in-depth exploration of these new technologies and their interaction with existing technologies to reshape and create new data management infrastructures. 
  • Part IV. A study of real-world modern data management infrastructures, along with a proposal of a concrete and plausible blueprint. 

General Outline

The following is a general outline of the book:

Table of Contents
Preface 
Acknowledgment 
Prologue 
Introduction 
Part I. Brief History of Data Management Platform Architectures 
          Chapter 1. The Never-Ending Need to Manage Data
          Chapter 2. The Evolution of Structured Data Repositories
          Chapter 3. The Evolution of Data Warehouse as the Main Data Management Platform
Part II. The Need for and Emergence of New Data Management Technologies 
          Chapter 4. Big Data: A Primer
          Chapter 5. NoSQL: A Primer
          Chapter 6. Need for Speed 1: The Emergence of In-Memory Technologies
          Chapter 7. Need for Speed 2: Events, Streams, and the Real-Time Paradigm
          Chapter 8. The Role of New Technologies in Reshaping the Analytics and Business Intelligence Space
Part III. New Data Management Platforms: A First Exploration 
          Chapter 9. The Data Warehouse, Expanded and Improved
          Chapter 10. Data Lakes: Concept and Approach
          Chapter 11. Data Hub: Concept and Approach
          Chapter 12. Data Lake vs. Data Hub: Key Differences and Considerations
          Chapter 13. Analysis of Alternative Solutions
          Chapter 13. Considerations on Data Ingestion, Integration, and Consolidation
Part IV. Studying Plausible New Data Management Platforms 
          Chapter 14. Methodology
          Chapter 15. Data Lakes
               Sub-Chapter 15.1. Analyzing three real-world use cases
               Sub-Chapter 15.2. Proposing a feasible blueprint
          Chapter 16. Data Hubs
               Sub-Chapter 16.1. Analyzing three real-world use cases
               Sub-Chapter 16.2. Proposing a feasible blueprint
          Chapter 17. Summary and Conclusion
Summary and Conclusions
Appendix A. The Cloud Factor: Data Management Platforms in the Cloud
Appendix B. Brief Intro into Analytics and Business Intelligence with Big Data
Appendix D. Brief Intro into Virtualization and Data Integration
Appendix E. Brief Intro into the Role of Data Governance in Big Data & Modern Data Management Strategies
Appendix F. Brief Intro into Analytics and Business Intelligence with Big Data
Glossary 
Bibliography 
Index 

Main Post Image courtesy of Thomas Skirde 

Once Again: Open Data Science Course

Once Again: Open Data Science Course

Continuing the tradition, we would like to draw your attention to our data science course running in the spring semester at the Műegyetem, where participants can gain insight into the world of data analysis through theoretical and practical classes. Classes are held weekly on Tuesdays from 10:15 and every second Friday from 10:15. The first session starts on Tuesday, 7 February, at 10:15.

In terms of topics, we cover the basics of data analysis: data models, CRISP-DM, supervised and unsupervised learning methods, data mining modelling, and many application examples: churn prediction, risk estimation, segmentation, and time series forecasting. In the first weeks we work with RapidMiner, and then pick up the basics of data analysis in Python during the practical sessions. We ask everyone to bring their own laptop to the practical classes with the appropriate software packages installed (we work with the free versions).

The course also includes a homework assignment, which will be a supervised learning task on a real dataset; in fact, the homework solutions will compete against each other in a closed data mining competition run through kaggle.com. We will provide information about the venue and the exact schedule after registration, once your application has been accepted.

It is already clear that we can expect quite a few external applicants, so we will have to apply some restrictions: we can admit as many external participants as there are enrolled students, which currently stands at 24. Therefore, if you are interested in the course and would like to join us, please submit your application by filling out the short questionnaire below.

We will notify everyone within a few days as to whether we are able to accept their application. This will depend on the order of responses as well as on motivation. We will only accept a large number (3+) of applications from the same company in exceptional cases.

We look forward to seeing you!

 

Image source.


Immutability in JavaScript using Redux

Immutability in JavaScript using Redux

In an ever-growing ecosystem of rich and complicated JavaScript applications, there’s more state to be managed than ever before: the current user, the list of posts loaded, etc.

Any set of data that needs a history of events can be considered stateful. Managing state can be hard and error-prone, but working with immutable data (rather than mutable data) and certain supporting technologies (namely Redux, for the purposes of this article) can help significantly.

Immutable data has restrictions, namely that it can’t be changed once it’s created, but it also has many benefits, particularly in reference versus value equality, which can greatly speed up applications that rely on frequently comparing data (checking if something needs to update, for example).

Using immutable states allows us to write code that can quickly tell if the state has changed, without needing to do a recursive comparison on the data, which is usually much, much faster.
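That reference-versus-value point can be shown in a few lines. The sketch below is written in Python to keep the examples in this digest consistent, and it only mimics the Redux pattern: a pure reducer never mutates the existing state, so unchanged branches keep their identity and change detection becomes a cheap identity check rather than a deep comparison. The action shape and state layout are invented for illustration.

```python
# A Redux-style pure reducer in Python: never mutate, always return new state.
def reducer(state, action):
    if action["type"] == "ADD_POST":
        # Only the "posts" branch is replaced; the "user" branch keeps its identity.
        return {**state, "posts": state["posts"] + [action["payload"]]}
    return state  # unknown action: return the exact same object

old_state = {"user": {"name": "ada"}, "posts": ["hello"]}
new_state = reducer(old_state, {"type": "ADD_POST", "payload": "world"})

# Cheap change detection via identity checks, no deep comparison needed:
print(new_state is old_state)                    # False -> something changed
print(new_state["user"] is old_state["user"])    # True  -> user branch untouched
print(new_state["posts"] is old_state["posts"])  # False -> posts branch changed
```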

This article will cover the practical applications of Redux when managing state through action creators, pure functions, composed reducers, impure actions with Redux-saga and Redux Thunk and, finally, use of Redux with React. That said, there are a lot of alternatives to Redux, such as MobX, Relay, and Flux-based libraries.



Why Redux?

The key ...


Read More on Datafloq
The Symbiotic Relationship between IoT and Analytics

The Symbiotic Relationship between IoT and Analytics

Throughout history, the business world has been driven forward by new technologies, with these developments rapidly redefining what companies are capable of doing. As early business adopters take advantage of these technological opportunities, it often becomes apparent to industry observers that a mutually beneficial relationship exists between certain new technologies, one that drives the growth of both.

We’re seeing one example of this phenomenon today, with the growing importance of both the Internet of Things (IoT) and analytics technologies in businesses worldwide.

Most industry observers already understand that each of these technologies will play a tremendously important role in the business world of the future, but it is by looking at the symbiotic relationship between the two that we can best understand what each has to offer.

What is the Internet of Things?

The Internet of Things is a collective term for all smart devices in the world capable of generating and delivering data without human intervention. The Internet of Things is set to redefine our understanding of how our industrial systems operate, and the potential upside for the business world is massive.

How does Analytics fit in?

The development of analytics technology closely mirrored the beginning of the big data era. As organizations saw the exponential growth of ...


Read More on Datafloq
Techdown – The Presence of Technology on Super Bowl

Techdown – The Presence of Technology on Super Bowl

With Super Bowl LI rapidly approaching, there are tons of questions that need answering. They range from the ones about the halftime show – why did Adele turn it down, what will it look like this year, and will it be more spectacular than the infamous Super Bowl XXXVIII with its wardrobe malfunction incident? – to the actual game.

Namely, is Tom Brady going to be allowed anywhere near the ball before the game starts and, most importantly, how did the Atlanta Falcons manage to get here? After a more or less horrible decade ever since Michael Vick left – except the 2012 season, which was quite good, actually – almost nobody saw this coming. And while the New England Patriots have been consistently good season after season and have even won four championship titles in the last 15 years, the Falcons are rather a surprise. Maybe it’s their way to say farewell and show respect to the Georgia Dome?

However, these aren’t the only questions since lots of people are going to be focusing on ads and social media reports. So, if you’re interested in the tech side of the Super Bowl more than in the sports aspect of it, here ...


Read More on Datafloq
How Mixed Reality Will Transform Advertising

How Mixed Reality Will Transform Advertising

Imagine meeting a friend far away from home at a coffee shop by projecting your digital self, your friend interacting with you as you sip coffee, while a virtual kitten that’s indistinguishable from a real one plays on the table. Exciting, isn’t it? Well, that’s what Mixed Reality has in store for you. Before diving deep, let’s first understand how it’s different from Virtual Reality (VR) and Augmented Reality (AR).

Virtual Reality

VR completely overrides the physical world and creates its own digital world. It takes the user into another immersive environment and allows the viewer to experience something that would not have been feasible otherwise. For example, VR lets you have a 3D experience of the Rio de Janeiro Carnival, the FIFA World Cup, your favorite band, sea surfing and much more, just by wearing a headset (irrespective of your physical location). Here is a video from Google’s Artist in Residence (AIR) program which shows artists producing creative pieces using the HTC Vive and Google Tilt Brush:



 Liked it? Now don’t miss Katie Rodgers creating a garment using a virtual mannequin. 

Augmented Reality

Unlike VR, AR works in tandem with the real world by superimposing digital objects on the real-world environment. It functions by augmenting the physical ...


Read More on Datafloq
Algorithms are Black Boxes, That is Why We Need Explainable AI

Algorithms are Black Boxes, That is Why We Need Explainable AI

Artificial Intelligence offers a lot of advantages for organisations by creating better and more efficient organisations, improving customer services with conversational AI and reducing a wide variety of risks in different industries. Although we are only at the beginning of the AI revolution that is upon us, we can already see that artificial intelligence will have a profound effect on our lives. As a result, AI governance is also becoming increasingly important, if we want to reap the benefits of artificial intelligence.

Data governance and ethics have always been important and a few years ago, I developed ethical guidelines for organisations to follow, if they want to get started with big data. Such ethical guidelines are becoming more important, especially now since algorithms are taking over more and more decisions. Automated decision-making is great until it has a negative outcome for you and you can’t change that decision or, at least, understand the rationale behind that decision. In addition, algorithms offer tremendous opportunities, but they have two major flaws:


Algorithms are extremely literal; they pursue their (ultimate) goal literally and do exactly what they are told while ignoring any other important consideration;
Algorithms are black boxes; whatever happens inside an algorithm is only known ...


Read More on Datafloq
How Important is Big Data in Recruiting?

How Important is Big Data in Recruiting?

Intuition. That’s what most recruiters and hiring managers use to make their decisions on which applicants get offers. In a world that is moving toward more data and logic-driven decisions at every turn, recruiting is a surprising change of pace. Or is it? Big data has touched nearly every facet of business, and recruiting is no exception. Large companies have been using data to choose their new employees and decrease turnover for several years. But how much better is predictive analysis than the human intuition when it comes to recruiting?

Can Big Data Save Companies Money on Hiring?

It costs a lot to hire and train an employee. Figures vary widely based on the type of position, but some studies estimate that onboarding and training cost the equivalent of six to nine months of the new employee’s salary. With costs like that, minimizing turnover is a key money-saving goal for any business. 

Data points available from information about applicants online have helped hiring managers to devise new ways of evaluating potential employees, including complex personality tests that can help predict culture fit. But only when large data sets are used to predict specific outcomes, like retention rates, does big data come into recruiting. 
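As a rough, invented illustration of that last point, the sketch below fits a tiny logistic-regression model on made-up applicant features (average tenure in previous jobs and an assessment score) to estimate the probability that a hire stays at least a year. Real recruiting models use far richer data and careful validation; this is only meant to show the shape of such a prediction.

```python
# Toy retention model: predict one-year retention from two invented features.
from sklearn.linear_model import LogisticRegression

# Columns: [average years per previous job, assessment score 0-100]
X = [[0.8, 55], [3.5, 80], [1.2, 60], [4.0, 90], [0.5, 40], [2.8, 75]]
y = [0, 1, 0, 1, 0, 1]  # 1 = stayed at least a year, 0 = left earlier

model = LogisticRegression().fit(X, y)

candidate = [[2.5, 70]]
prob_retained = model.predict_proba(candidate)[0][1]
print(f"Estimated retention probability: {prob_retained:.2f}")
```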

For the big companies ...


Read More on Datafloq
Sketch Notes Reflections at TDWI Roundtable with FCO-IM

Sketch Notes Reflections at TDWI Roundtable with FCO-IM

Our 25th anniversary roundtable in Frankfurt with FCO-IM was a great success. Almost 90 registrations and more than 60 attendees was an unexpectedly good outcome for a topic that is almost unknown in Germany. If you want to know what happened at the roundtable, read about it in my previous blog post.

Curing method-illness in Enterprise Architecture

Curing method-illness in Enterprise Architecture

Years ago my son came home with some geography homework. He had to learn the countries and capitals of the European continent. When I was practicing with him I encountered the country Yugoslavia... Now, this was 2010 I am talking about; Yugoslavia had by then split up into (among others) Serbia and Montenegro. I honestly thought this was a mistake (old material used by the teacher), so I went to the teacher and asked her whether we could get an update on the material. This was her response:

"The method has not been updated yet"

This is, in my opinion, a small symptom of what is very wrong with education (which is completely saturated with this ‘method-illness’), with society as a whole, and, more particularly, with Enterprise Architecture. History tells us that this religion-like belief in a method, blindly following it and demanding that others do so too, will not end well. Methods used in this way kill critical thinking, creativity and innovation.

Methods, best practices, checklists and other attempts to mask (especially from management and society) the inherent uncertainty of the work and world we’re in are extremely damaging to people, organisations and society.

Only ‘systems’ (using a very broad definition of ‘system’ here) that are simple, linear, predictable and have a generic context might benefit from this method-approach. In my world of data-enterprise-IT-Business-Blabla-architecture, these systems simply do not exist.

The moment a system becomes complicated, complex or even chaotic*, it all breaks down, and it becomes dangerous on so many levels when the design participants of these systems still act as if it were a simple, linear, method-based system. The inherent complexity and uncertainty of these systems require a deep dive into the context (the real world) surrounding the system. It requires architects to experiment, try something, rinse and repeat, to be a full member of the operationalisation (!) and to hang in there (!), learning, discussing. And yeah, for many architects this is scary...

I understand the temptation: the phrase ‘we use TOGAF’ or ‘we have written a PSA1’ sends a vibe of trust and safety (‘hey, I followed the method, not my fault’) and is highly rewarded by senior management. What is not rewarded is stating the uncertainty (‘What do you mean, you don’t know? That’s why I hired you’). Make senior management part of the uncertainty; figure out together how to approach it, how to build a learning-by-doing mentality as well as mutual respect for one another and for emerging insights (!).

We ‘the architects’ should stand firm in stating the message ‘this is complicated, we can do it, but I am not sure how’.

How do we need to change? We need to go back to the fundamentals, steer away from the various technical hypes, and distance ourselves from architectural methods. We should separate concerns fiercely, isolate, abstract, collaborate with many experts in the field, communicate and, above all, honour the context of the domain we are working in and how it affects the real world. Remember, there is no ‘one architecture’ in complex systems that you can design beforehand. And if you tell that fairy-tale, you are deluding yourself, your colleagues, your organisation and, ultimately, society.

Back to my opening story: if teachers do not evolve into educators truly interested in the kids they need to educate, and if they keep relying on the methods to educate our kids, I say... let's automate the teachers; long live the MOOCs and YouTube. And the sad thing is, this teaching by method is training future architects to do the same; this method-illness is deeply rooted in our society. Instead, teach them how to think critically, to think outside the box using fundamental skills. So, throw away your methods, burn them. Educating kids is about connecting with kids, parents and their concerns; it's a unique and proud profession, something you need to learn and train hard for; it's a valuable, hard-to-learn and respected skill.

I leave it to the reader to translate this analogy to Enterprise Architecture and the state we are in.

*referring to the Cynefin framework of Dave Snowden

1 Project Start Architecture

 

Why We Need to Rethink Security for Wearable Tech

Why We Need to Rethink Security for Wearable Tech

As ubiquitous computing continues to become the focal point of our daily lives, it is more important than ever to make decisions about the scope of data people unknowingly share. With wearable technology, users are enjoying the comfort that comes through ambient intelligence. However, they also risk potentially exposing their private data to nefarious actors. For example, a hacker may gauge the best time to rob victims while they sleep based on their leaked heartbeat data. Hackers can also use data to discover medical conditions that can be exploited for illegal gains.

Security in wearable technology is different from the precautions people take in other settings. This is due to the increased attack surface exposed by such devices. In 2015, a vulnerability in the Fitbit wristband was disclosed that allowed an attacker to upload malicious code when the device was in close range. The code could then be transferred to any connected computer making other devices vulnerable as well.

Bluetooth is becoming the premier connectivity option for a majority of wearable devices. Unfortunately, Bluetooth is not secure, and it continues to be a weak link in the security chain. Freely available tools, such as Crackle, can be used to crack Bluetooth encryption. ...


Read More on Datafloq
4 Ways Cloud Analytics Helps IT Businesses

4 Ways Cloud Analytics Helps IT Businesses

Does cloud analytics really help in the overall scope of information technology? This is one question asked by most IT businesses. As the cloud continues to gain prominence in both the technology and business worlds, such questions arise more frequently. Before we delve into the significance of cloud analytics for IT businesses, it’s important to first define it. Cloud analytics is a service model in which the data analytics process is provided through private and public clouds under a subscription-based model. In this blog post, we take a look at how cloud analytics can have a positive impact on the way IT businesses operate.

1. Streamlined Business Operations

One of the major benefits of cloud computing is the web-based services that host all the programs users need. This covers all the elements of analytics: rather than investing in multiple programs, the cloud offers a single place for all your hosting needs. Cloud analytics is a streamlined model for IT businesses, especially call centers, which previously had to run separate software for specific applications. The cloud simplifies this by providing a single platform.

2. Cost-Effective

Apart from being simplified, the option of subscription-based services allows businesses to opt for the pay-per-use ...


Read More on Datafloq
RI (Referential Integrity) Constraints: 3 Reasons to Include Them in Your Data Warehouse

RI (Referential Integrity) Constraints: 3 Reasons to Include Them in Your Data Warehouse

3 Reasons to Include Referential Integrity Constraints in Your Data Warehouse
Data Security, Data Ethics, and Data Ownership

Data Security, Data Ethics, and Data Ownership

I. The problem(s)

Data security represents one of the main problems of this data-abundant generation, since a higher magnitude of data is correlated with looser control and a higher probability of fraud, a higher likelihood of losing one's privacy, and of becoming the target of illicit or unethical activities. Today, more than ever, a universal data regulation is needed — and some steps have already been taken toward one (OECD, 2013). This is especially true because everyone complains about privacy leaks, but no one wants to give up the extra services and customized products that companies are developing based on our personal data.

It is essential to protect individual privacy without erasing companies’ capacity to use data for driving business in a heterogeneous but harmonized way. Any fragment of data has to be collected with prior explicit consent, and guarded and controlled against manipulation and error. A privacy assessment to understand how people would be affected by the use of their data is crucial as well.

II. Fairness and Data Minimization

There are two important concepts to be considered from a data protection point of view: fairness and minimization.

Fairness concerns how data are obtained, and the transparency needed from organizations that are collecting them, especially about their future potential uses.

Data minimization, instead, concerns the ability to gather ...


Read More on Datafloq
Could GPS Satellites Be Replaced with Synthetic Diamonds?

Could GPS Satellites Be Replaced with Synthetic Diamonds?

Today, we take GPS (Global Positioning System) technology for granted. The days of navigating with paper maps seem like a long time ago, even though modern GPS systems are really fairly new. That’s because the technology has been so revolutionary to the way we live our lives that we can’t really imagine life without it. It helps us navigate while driving, helps us track our steps, and even powers crowdsourced crime analysis maps to help keep us safe. Technology is always evolving, however, and there may be something new on the horizon that could make GPS obsolete: synthetic diamonds. 

How Does GPS Work?

Currently, GPS works using three different devices: satellites, ground devices, and receivers. Satellites, of course, orbit the Earth and are able to relay positioning data back to the ground devices. The ground devices then use radar to confirm positioning. Our receivers are the devices we use on an everyday basis: our phones, GPS units in the car, smart watches, and any other instrument that can enable GPS. While this system works perfectly well for most of our needs, there are some limitations on the accuracy and speed of GPS. For some technology, like driverless cars, we’ll need ...
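To give a feel for the arithmetic behind that description, the toy sketch below converts signal travel times from satellites at known positions into ranges (distance = speed of light × travel time) and then solves for the receiver position with a least-squares fit. It is a flat 2-D example with made-up coordinates, and it ignores receiver clock bias, relativistic corrections, and everything else a real GPS receiver has to handle.

```python
# Toy 2-D trilateration: ranges from travel times, position from least squares.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light, m/s

# Invented satellite positions (metres) and one-way signal travel times (s).
sats = np.array([[20_000_000.0, 0.0],
                 [0.0, 20_000_000.0],
                 [-15_000_000.0, -15_000_000.0]])
travel_times = np.array([0.066712819, 0.066712819, 0.070759630])

ranges = C * travel_times  # pseudorange to each satellite

def residuals(pos):
    # Difference between measured ranges and ranges implied by a candidate position.
    return np.linalg.norm(sats - pos, axis=1) - ranges

solution = least_squares(residuals, x0=np.array([1e6, 1e6]))
# Expect a position close to the origin (0, 0) for these made-up numbers.
print("Estimated receiver position (m):", solution.x.round(0))
```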


Read More on Datafloq
What is the Key to Securing your Big Data Environments

What is the Key to Securing your Big Data Environments

Big data environments are now more common in companies; nearly every industry has its hand in the cookie jar. Because of this, companies are generating more data today than at any other point in history. Vast “silos” of information are being structured and filled every day, giving companies a competitive advantage and letting them better tailor their products to their customers’ needs. But with this growth of big data and big data environments, we are also seeing growth in cybercrime. Security for big data has become a great concern in the last few years. Much of the data that is gleaned is sensitive; we know it, you know it, and cybercriminals absolutely know it and love it.

For companies that run big data environments, securing their data warehouses and modes of deployment is vital to not only their success but also to their customers’, clients’ and business partners’ privacy. Knowing how to secure your big data isn’t as difficult as it might sound.

Security strategies can be found through asking specific questions such as “Who is running specific big data requests,” “What analytics requests are users running” or “Are users trying to download sensitive data or is the request part of a job ...


Read More on Datafloq
Data Vault (CDVDM) Training Down Under

Data Vault (CDVDM) Training Down Under

If you're looking to get some great Data Vault training in Australia in March, the Genesee Academy is running the CDVDM course in Brisbane, Melbourne and Sydney. You can get the schedule dates at the Genesee site. It’s a great course and well worth the investment for anyone looking to get into Data Vault […]
Journey Science in Telecom: Take Customer Experience to the Next Level

Journey Science in Telecom: Take Customer Experience to the Next Level

Journey Science, derived from connected data across different customer activities, has become pivotal for the telecommunications industry, providing the means to drastically improve customer experience and retention. It has the ability to link together scattered pieces of data and to advance a telco business’s objectives. Siloed approaches are becoming obsolete – take call centers as an example – there is only so much that you can do with data from only one system.

By using insights from customer journey analytics, telco businesses can better measure the user experience and make informed decisions about refining it. The data not only allow them to take a proactive approach towards customer satisfaction, but enable the prediction of future failures as well. With customer journey analytics, you can map touchpoints to journeys and revamp your strategies to better cater to customers’ needs.
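As a small, made-up illustration of what linking scattered pieces of data into journeys can look like, the sketch below stitches call-centre and web events for the same customers into a single time-ordered journey; the column names, channels, and events are invented for the example.

```python
# Toy journey stitching: merge events from two siloed systems per customer.
import pandas as pd

calls = pd.DataFrame({
    "customer_id": [1, 2],
    "timestamp": pd.to_datetime(["2017-02-01 10:05", "2017-02-02 14:30"]),
    "event": ["billing complaint", "upgrade enquiry"],
})
web = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "timestamp": pd.to_datetime(
        ["2017-02-01 09:50", "2017-02-01 10:40", "2017-02-02 14:00"]),
    "event": ["viewed bill", "visited cancellation page", "viewed tariffs"],
})

calls["channel"] = "call_centre"
web["channel"] = "web"

# One time-ordered journey per customer, regardless of the source system.
journeys = (pd.concat([calls, web])
              .sort_values(["customer_id", "timestamp"])
              .reset_index(drop=True))
print(journeys)
```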

In the telecom industry, it is difficult for a business to effectively manage the massive volume of data with existing systems and technology. There are several areas where telecom companies need to make improvements, such as reducing costs, improving customer experience, increasing conversion rates, and many more. To do so, they need to derive meaning from the collected data by finding ...


Read More on Datafloq
OCSL sets its sights on the Nirvana of hybrid IT—attaining the right mix of hybrid cloud for its clients

OCSL sets its sights on the Nirvana of hybrid IT—attaining the right mix of hybrid cloud for its clients

The next BriefingsDirect digital transformation case study explores how UK IT consultancy OCSL has set its sights on the holy grail of hybrid IT -- helping its clients to find and attain the right mix of hybrid cloud.

We'll now explore how each enterprise -- and perhaps even units within each enterprise -- determines the path to a proper mix of public and private cloud. Closer to home, they're looking at the proper fit of converged infrastructure, hyper-converged infrastructure (HCI), and software-defined data center (SDDC) platforms.

Implementing such a services-attuned architecture may be the most viable means to dynamically apportion applications and data support among and between cloud and on-premises deployments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

To describe how to rationalize the right mix of hybrid cloud and hybrid IT services along with infrastructure choices on-premises, we are joined by Mark Skelton, Head of Consultancy at OCSL in London. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: People increasingly want to have some IT on premises, and they want public cloud -- with some available continuum between them. But deciding the right mix is difficult and probably something that’s going to change over time. What drivers are you seeing now as organizations make this determination?

Skelton: It’s a blend of a lot of things. We've been working with enterprises for a long time on their hybrid and cloud messaging. Our clients have been struggling just to understand what hybrid really means, but also how to make hybrid a reality and how to get started, because it really is a minefield. You look at what Microsoft is doing, what AWS is doing, and what HPE is doing with their technologies. There's so much out there. How do they get started?

We've been struggling in the last 18 months to get customers on that journey and get started. But now, because technology is advancing, we're seeing customers starting to embrace it and starting to evolve and transform into those things. And, we've matured our models and frameworks as well to help customer adoption.

Gardner: Do you see the rationale for hybrid IT shaking down to an economic equation? Is it to try to take advantage of technologies that are available? Is it about compliance and security? You're probably tempted to say all of the above, but I'm looking for what's driving the top-of-mind decision-making now.

Start with the economics

Skelton: The initial decision-making process begins with the economics. I think everyone has bought into the marketing messages from the public cloud providers saying, "We can reduce your costs, we can reduce your overhead -- and not just from a culture perspective, but from management, from personal perspective, and from a technology solutions perspective."


CIOs, and even financial officers, are seeing economics as the tipping point they need to go into a hybrid cloud, or even all into a public cloud. But it’s not always cheap to put everything into a public cloud. When we look at business cases with clients, it’s the long-term investment we look at. Over time, it’s not always cheap to put things into public cloud. That’s where hybrid started to come back into the front of people’s minds.

We can use public cloud for the right workloads and where they want to be flexible and burst and be a bit more agile or even give global reach to long global businesses, but then keep the crown jewels back inside secured data centers where they're known and trusted and closer to some of the key, critical systems.

So, it starts with the finance side of the things, but quickly evolves beyond that, and financial decisions aren't the only reasons why people are going to public or hybrid cloud.

Gardner: In a more perfect world, we'd be able to move things back and forth with ease and simplicity, where we could take the A/B testing-type of approach to a public and private cloud decision. We're not quite there yet, but do you see a day where that choice about public and private will be dynamic -- and perhaps among multiple clouds or multi-cloud hybrid environment?

Skelton: Absolutely. I think multi-cloud is the Nirvana for every organization, just because there isn't one-size-fits-all for every type of work. We've been talking about it for quite a long time. The technology hasn't really been there to underpin multi-cloud and truly make it easy to move on-premises to public or vice versa. But I think now we're getting there with technology.

Are we there yet? No, there are still a few big releases coming, things that we're waiting to be released to market, which will help simplify that multi-cloud and the ability to migrate up and back, but we're just not there yet, in my opinion.

Gardner: We might be tempted to break this out between applications and data. Application workloads might be a bit more flexible across a continuum of hybrid cloud, but other considerations are brought to the data. That can be security, regulation, control, compliance, data sovereignty, GDPR, and so forth. Are you seeing your customers looking at this divide between applications and data, and how they are able to rationalize one versus the other?

Skelton: Applications, as you have just mentioned, are the simpler things to move into a cloud model, but the data is really the crown jewels of the business, and people are nervous about putting that into public cloud. So what we're seeing lot of is putting applications into the public cloud for the agility, elasticity, and global reach and trying to keep data on-premises because they're nervous about those breaches in the service providers’ data centers.

That's what we are seeing, but we are seeing an uprising of things like object storage, so we're working with Scality, for example, and they have a unique solution for blending public and on-premises solutions, so we can pin things to certain platforms in a secure data center and then, where the data is not quite critical, move it into a public cloud environment.

Gardner: It sounds like you've been quite busy. Please tell us about OCSL, an overview of your company and where you're focusing most of your efforts in terms of hybrid computing.

Rebrand and refresh

Skelton: OCSL has been around for 26 years as a business. Recently, we've been through a rebrand and a refresh of what we are focusing on, and we're moving more towards being a services organization, leading with our people and our consultants.

We're focusing on transforming customers and clients into the cloud environment, whether that's applications or data center, cloud, or hybrid cloud. We're trying to get customers on that journey of transformation, engaging with business-level people and business requirements, and working out how we make cloud a reality, rather than just saying there's a product and you go and do whatever you want with it. We're finding out what those businesses want, what the key requirements are, and then finding the right cloud models to fit.

Gardner: So many organizations are facing not just a retrofit or a rethinking around IT, but truly a digital transformation for the entire organization. There are many cases of sloughing off business lines, and other cases of acquiring. It's an interesting time in terms of a mass reconfiguration of businesses and how they identify themselves.

Skelton: What's changed for me is, when I go and speak to a customer, I'm no longer just speaking to the IT guys; I'm actually engaging with the finance officers, the marketing officers, the digital officers -- that's the common one that is creeping up now. And it's a very different conversation.

We're looking at business outcomes now, rather than focusing on "I need this disk, this product." It's more: "I need to deliver this service back to the business." That's how we're changing as a business. It's doing that business consultancy, engaging with that, and then finding the right solutions to fit the requirements and truly transform the business.

Gardner: Of course, HPE has been going through transformations itself for the past several years, and that doesn't seem to be slowing up much. Tell us about the alliance between OCSL and HPE. How do you come together as a whole greater than the sum of the parts?

Skelton: HPE is transforming and becoming a more agile organization, with some of the spinoffs that we've had recently aiding that agility. OCSL has worked in partnership with HPE for many years, and it's all about going to market together and working together to engage with the customers at right level and find the right solutions. We've had great success with that over many years.

Gardner: Now, let’s go to the "show rather than tell" part of our discussion. Are there some examples that you can look to, clients that you work with, that have progressed through a transition to hybrid computing, hybrid cloud, and enjoyed certain benefits or found unintended consequences that we can learn from?

Skelton: We've had a lot of successes in the last 12 months as we've taken clients on the journey to hybrid cloud. One of the key ones that resonates with me is a legal firm that we've been working with. They were in a bit of a state. They had an infrastructure that was aging, was unstable, and wasn't delivering quality service back to the lawyers who were trying to embrace technology -- mobile devices, dictation software, those kinds of things.

We came in with a first prospectus on how we would actually address some of those problems. We challenged them, and said that we need to go through a stabilization phase. Public cloud is not going to be the immediate answer. They're being courted by the big vendors, as everyone is, about public cloud and they were saying it was the Nirvana for them.

We challenged that and we got them to a stable platform first, built on HPE hardware. We got instant stability for them. So, the business saw immediate returns and delivery of service. It’s all about getting that impactful thing back to the business, first and foremost.

Building cloud model

Now, we're working through each of their service lines, looking at how we can break them up and transform them into a cloud model. That involves breaking down those apps, deconstructing the apps, and thinking about how we can use pockets of public cloud in line with the hybrid on-premise in our data-center infrastructure.

They've now started to see real innovative solutions taking that business forward, but they got instant stability.

Gardner: Were there any situations where organizations were very high-minded and fanciful about what they were going to get from cloud that may have led to some disappointment -- so unintended consequences. Maybe others might benefit from hindsight. What do you look out for, now that you have been doing this for a while in terms of hybrid cloud adoption?

Skelton: One of the things I've seen a lot of with cloud is that people have bought into the messaging from the big public cloud vendors about how they can just turn on services and keep consuming, consuming, consuming. A lot of people have gotten themselves into a state where bills have been rising and rising, and the economics are looking ridiculous. The finance officers are now coming back and saying they need to rein that back in. How do they put some control around that?

That's where hybrid is helping, because you can start to move some of those workloads back into your own, isolated data center. But the key for me is that it comes down to putting some thought into what you're putting into the cloud. Think through how you can transform and use the services properly. Don't just turn everything on because it's there and a click of a button away; actually put some design and planning into adopting cloud.

Gardner: It also sounds like the IT people might need to go out and have a pint with the procurement people and learn a few basics about good contract writing, terms and conditions, and putting in clauses that allow you to back out, if needed. Is that something that we should be mindful of -- IT being in the procurement mode as well as specifying technology mode?

Skelton: Procurement definitely needs to be involved in the initial setup with the cloud, whenever they're committing to a consumption number, but once that's done, it becomes IT's responsibility in terms of how they're consuming it. Procurement should stay involved all the way through, keeping constant track of what's going on -- and that's not happening.

The IT guys don't really care about the cost; they care about the widgets, turning things on, and playing around with them. I don't think they really realize how much it is all going to cost. So yes, there is a bit of a disconnect in lots of organizations: procurement handles the upfront piece, then it goes away, and then IT comes in and spends all of the money.

Gardner: In the complex service delivery environment, that procurement function probably should be constant and vigilant.

Big change in procurement

Skelton: Procurement departments are going to change. We're starting to see that in some of the bigger organizations; they're getting closer to the IT departments. They need to understand the technology and what's being used, but that's quite rare at the moment. Over the next 12 months, that's going to be a big change in the larger organizations.

Gardner: Before we close, let's take a look to the future. A year or two from now, if we sit down again, I imagine that more microservices will be involved and containerization will have an effect, where the complexity of services and what we even think of as an application could be quite different -- more of an API-driven environment, perhaps.

So the complexity about managing your cloud and hybrid cloud to find the right mix, and pricing that, and being vigilant about whether you're getting your money’s worth or not, seems to be something where we should start thinking about applying artificial intelligence (AI), machine learning, what I like to call BotOps, something that is going to be there for you automatically without human intervention.

Does that sound on track to you, and do you think that we need to start looking to advanced automation and even AI-driven automation to manage this complex divide between organizations and cloud providers?

Skelton: You hit a lot of key points there in terms of where the future is going. I think we're still in the phase of trying to build the right platforms to be ready for the future. We see the recent releases of HPE Synergy, for example, being able to support these modern platforms, and that's really allowing us to embrace things like microservices. Docker and Mesosphere are two types of platforms that will disrupt organizations and the way we do things, but you need to find the right platform first.

Hopefully, in 12 months, we can have those platforms in place and can then start to embrace some of this great new technology and really rethink our applications. And it's a challenge to the ISVs; they've got to work out how they can take advantage of some of these technologies.

We're seeing a lot of talk about serverless computing. It's where there is nothing running until you spin up resources as and when you need them. The classic use case for that is Uber; they have built a whole business on that serverless type of model. I think that in 12 months' time, we're going to see a lot more of that in more enterprise-type organizations.

I don't think we have it quite clear in our minds how we're going to embrace that, but it's the ISV community that really needs to start driving it. Beyond that, it's absolutely AI and bots. We're all going to be talking to computers, and they're going to be responding with very human sorts of reactions. That's the next wave.

I am bringing that into enterprise organizations to see how we can solve some business challenges. Service desk management is one of the use cases: some of our clients are looking at whether they can get immediate responses from bots to common queries, so they don't need as many support staff. It's already starting to happen.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

5 Security Tips for Protecting Your Company Blog

5 Security Tips for Protecting Your Company Blog

A blog is one of the most precious digital assets a company has in 2017. It provides SEO value, engages customers, and pushes site visitors through the conversion funnel. But a blog can also be a magnet for malicious behavior and external threats, which is why businesses really need to hunker down on blog security in the coming months.

Here’s How You Can Protect Your Blog

Protecting your blog is a lot like protecting your home. There are many different entry points through which hackers can enter your blog and you’ll have to account for all of them in order to fully protect your investment.

Here are some of the most important things you can do to get started.

1. Choose the Right CMS Platform

Blogging security all starts with the CMS platform you’re using to run your blog. If you’re using a platform with a questionable history of website security, then don’t be surprised when you discover that your blog isn’t as safe as you’d like.

By far, the most secure platform is WordPress. When you set up your blog with WordPress, you can rest assured knowing that you’re relying on a platform that millions of people use. As a result, security updates are continually ...


Read More on Datafloq
Why Cloud Adoption Will Continue to Grow in 2017

Why Cloud Adoption Will Continue to Grow in 2017

Cloud computing has had a pretty slow start over the past few years. It makes sense, because there are many security and privacy concerns weighing down the technology.

It looks like that sentiment is finally waning, however. Cloud adoption in 2016 was higher than it’s ever been, and that trend continues as we move into 2017.  Most recently, worldwide software company Epicor acquired the cloud-based enterprise content management company, docSTAR, providing just one example of a trend we can expect to see more of throughout the coming year.

Surveys indicate that 17% of enterprises have over 1,000 virtual machines deployed in the cloud, a big increase from 13% in 2015. Furthermore, 95% of survey respondents indicated they utilized cloud services.

As cloud interest — and adoption — rises, enterprises are forced to jump on the bandwagon or risk getting left behind. There are roadblocks and obstacles you may encounter along the way, but more importantly, the benefits far outweigh the risks.

Embracing the Cloud

Cloud solutions now provide benefits like rapid elasticity, proper scaling support, broad network access and on-demand self-service software. And these are quickly becoming necessary in today’s hyper-efficient, technology-oriented world.

This, coupled with the fact that local hardware is aging fast, will push ...


Read More on Datafloq
5 Multi-factor Authentication Strategies Enterprises Can Use

5 Multi-factor Authentication Strategies Enterprises Can Use

In 2016 alone, more than 2.2 billion records were exposed in data breaches. Using password-based authentication with a good hashing scheme like Bcrypt is perfectly fine as long as you can guarantee that your users won't use easy-to-guess passwords or reuse passwords across many portals. Both of these assumptions turn out to be wrong in a substantial number of cases, however. There are ample reasons to require multi-factor authentication. Other than asking your employees and users to use a password manager, here are five strategies a business can use to protect itself (note that SMS as a second factor is not recommended because of serious security weaknesses):

HOTP/TOTP

This is the second most common way of providing multi-factor authentication (after SMS). HOTP and TOTP are one-time-password strategies that generate a short code for the user to enter when logging in. With TOTP the code expires after a fixed time window, while HOTP advances an internal counter with each use. The code is generated by an already-authenticated application installed on the user's mobile device; Google Authenticator is one example. Note that any application that supports HOTP/TOTP can be used to log in to multiple services; you do not need a new application for each service.
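
To make the mechanics concrete, here is a minimal Python sketch of TOTP code generation as described in RFC 6238; the base32 demo secret, the 30-second window and the 6-digit length are illustrative assumptions rather than values tied to any particular service.

    # Minimal TOTP sketch (RFC 6238); the shared secret is a made-up demo value.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32, interval=30, digits=6):
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval              # time-based moving factor
        msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))                         # current 6-digit code

A verifying server recomputes the code for the current time window (and usually one or two adjacent windows, to tolerate clock drift) and compares it with what the user typed.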

Magic ...


Read More on Datafloq
4 Simple Data Security Tips for Self-Taught Developers

4 Simple Data Security Tips for Self-Taught Developers

The vast majority of developers are self-taught. This makes sense, considering how quickly technology evolves and adapts to changing market conditions. It just isn't feasible to return to universities and coding boot camps every time something changes. But the most shocking statistic to come out of a recent survey of software developers is that approximately 13% of coders are completely self-taught, completely bypassing traditional education programs.

For the self-taught coder out there, or the intrepid high schooler with a laptop and a passion for creating digital magic, security is something that is usually lacking. I’ve worked on projects with dev teams consisting of individuals with a variety of backgrounds. The common thread, in my experience, is that self-taught coders oftentimes lack the foundational concepts behind securing the content they produce.

Granted, some of these guys started out as hackers, but the majority of white-hat, self-taught developers need the help of formally trained developers in completely securing their project for release into the wild.

1. Collaborate with Your Peers and Build a Reputation the Hard Way

If you’ve learned to code on your own, don’t limit yourself by remaining a solo operator. Teaming up with a group of seasoned experts can dramatically increase your knowledge, ...


Read More on Datafloq
How Uber Depends on Data Analytics to Deliver Extreme Customer Service

How Uber Depends on Data Analytics to Deliver Extreme Customer Service

From a simple limo-hailing app for friends to the world's go-to taxi app, Uber's growth over its approximately seven years of existence can be described in one word: "phenomenal."

But there's another way to define Uber, one that not many have given thought to. Uber is a big data company, much like Google and Amazon. It not only makes effective use of the data already in its banks for business operations; it also keeps gathering data -- data from drivers, data about drivers, data from passengers, data about passengers, data from traffic systems around the world, transactional data -- and analyzing all of it in real time.

BigInsights Principal Raj Dalal caught up with Uber’s Chief Data Architect M C Srivas on his recent trip to San Francisco. In the course of the hour-long conversation, among many things, Srivas spoke of what data analytics means for Uber, and how innovation in data is being used to further what is now popularly known around the world as “the Uber model.”

Raj: I have been tracking for a while now how data can be used to drive “extreme customer service”. Uber has done some exciting stuff, matching supply and demand and estimating ...


Read More on Datafloq
The Yin & Yang of data: ‘Science of data’ & Datascience

The Yin & Yang of data: ‘Science of data’ & Datascience


People familiar with my thinking know that I am a bit of a 'fundamentalist' when it comes to 'data'. I am the guy who pushes the non-sexy part of data: data quality, data governance, metadata, data protection, data integration, semantics, rules, etc.

It is hard to stand your ground in a time when short-termism, technology fetishism and data populism are thriving. I see 'data architectures' in my industry that boil down to super-duper databases, ultra-fast massively parallel hardware and, of course, huge amounts of software that seem to glorify 'coding' your way to the promised kingdom.

Call me old school, but I want (data) architectures to separate concerns on various levels (conceptual, logical and technical), dimensions (process, data, interaction) and aspects (law & regulation, people, organisation, governance, culture). Architectures should enable businesses to reach certain goals that (preferably) serve the customer, civilian, patient, student, etc.

Lately I have been studying the 'datascience' community, attempting to comprehend how they think, act and serve the common goals of an organisation. I have abandoned (just a bit :-)) my declarative nature of data modelling, semantics, data quality and governance, and I have drowned myself in Coursera courses, learning to code in Python, Julia and R. I dusted off the old statistics books from university and installed Anaconda, Jupyter Notebook, RStudio, Git, etc.

And oh my, it is soooo cool. Give me some data, I don’t care what, where it comes from and what it exactly means, but I can do something cool with it. Promise!

Now my problem…

  • (1) It seems to me that the 'science' in 'datascience' is on average extremely low to non-existent. Example: I have heard of 'data science' labs/environments where the code is not versioned at all and the data is not temporally frozen; ergo, reproducibility is next to zero (a small illustration of what that record-keeping could look like follows after this list). Discovering a relationship between variables does not mean it is a proven fact; more is needed. Datascience is not equal to data analysis with R (or whatever), is it?
  • (2) There seems to be a huge trust in the relevance and quality of data wherever it comes from, whatever its context and however it is tortured. Information sits in the fabric of our universe1; it is life, it is the real world. Data is the 'retarded little brother' of this 'information': an attempt by humankind to capture information in a very poor way. Huge amounts of context are lost in this capturing. Attempting to retrofit 'information' from this 'retarded brother' called 'data' is dangerous and should be done with great care. Having these conversations with data scientists is hard, and we simply seem to completely disconnect.
  • (3) There seems to be little business focus, little bottom-line focus. Data scientists love to 'play'; they call it experimenting or innovating. I call it 'play' (if you are lucky they call their environment a 'sandbox', wtf?). Playing on company resources should be killed. Experiments (or innovations) start with a hypothesis, something you want to prove or investigate. You can fail, you can succeed, but you serve the bottom line (and yes, failing serves the bottom line!) and the purpose/mission of an organisation. Data scientists seem to think they are done when they have made some fascinating machine learning, predictive or whatever model in their sandbox or other lab-kind-of environment. Getting that model deployed at scale in a production environment for everyone to use, affecting the real world -- that is where the bottom-line value really shines; you are not done until this is achieved.
  • (4) There seems to be little regard for data protection aspects. The new GDPR (General Data Protection Regulation) is also highly relevant for datascience: your 'sandbox' or separate research environment needs to be compliant as well, and the penalties for non-compliance are huge.
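
As a small illustration of the reproducibility point in (1), here is a hedged Python sketch that pins the random seed and records a content hash of the exact data snapshot a model run used, so the result can later be traced back to the data and code that produced it. The file names and fields are hypothetical examples, not a prescription.

    # Record what a model run actually used, so it can be reproduced later.
    # "training_snapshot.csv" and "run_record.json" are hypothetical file names.
    import hashlib, json, random, time

    SEED = 42
    random.seed(SEED)                                       # pin randomness for the run

    with open("training_snapshot.csv", "rb") as f:
        data_sha256 = hashlib.sha256(f.read()).hexdigest()  # fingerprint of the frozen data

    run_record = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "seed": SEED,
        "data_sha256": data_sha256,
        "code_version": "<git commit hash>",                # e.g. output of `git rev-parse HEAD`
    }
    with open("run_record.json", "w") as f:
        json.dump(run_record, f, indent=2)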

There is huge value in datascience, its potential is staggering and it is soo much fun. But please, stop fooling around. This is serious business with serious consequences and opportunities for everyone and for probably every domain you can think of, whether it be automotive, banking, healthcare, poverty, climate control, energy, education, etc…

The 'science of data' and 'datascience' are the yin & yang of fulfilling the promise of being truly data-driven. Both are needed.

For my Data Quadrant Model followers; it is the marriage  between quadrant I & II versus quadrant III & IV.

1 Increasingly, there is more and more 'evidence' originating from theoretical physics that this statement holds some truth, link [Dutch]. I would also like to draw attention to this blog post by John O'Gorman, very insightful, putting DIKW to rest.

 

 

Three Best Practices to Help you Learn to Code

Three Best Practices to Help you Learn to Code

Have you ever had any interest in computer, system or software programming? Have you so far experienced any challenges that might be holding you back or almost making you give up? Many people think that programming is a daunting task, but that is wrong. Here is a list of best practices you can follow to ensure that you perfect your programming skills.

Use a real world problem

Programs are developed to solve a given real-world problem or to improve existing solutions. One of the best ways to perfect your programming skills is to apply them in real life. Take a given problem in society and use it as your personal learning project. Apply the skills you have learned to create a solution to that problem. At the end of it all, you'll have learned something new as well as perfected what you had already learned.

Take your time, don't rush

When you start understanding a given task, a certain kind of joy comes into your mind and you're tempted to take on more complicated tasks without taking the time to think them through. This is bad practice, since you'll end up wasting more time than if you handled one small bit ...


Read More on Datafloq
IoT Progress: What’s Really Happening with Driverless Cars in 2017?

IoT Progress: What’s Really Happening with Driverless Cars in 2017?

There’s been a lot of buzz about driverless cars since the first prototypes were announced. However, aside from a few grim stories about accidents (like the fatal crash involving a driverless Tesla in 2016 due to a combination of sensor and driver failure) and sporadic updates on the progress of driverless cars, most people haven’t seen them on the road—because there aren’t that many out there yet. So what’s really going on with autonomous vehicles in 2017? Let’s find out. 

Almost Ready for the Showrooms 

This year’s CES (formerly the Consumer Electronics Show) is hosting a special guest: Delphi Automotive Plc, which is showing off its new Audis with self-driving systems that consumers will eventually be able to purchase. This shows that driverless car manufacturers are shifting their focus from proving that the technology works to actually selling the new cars. Showrooms will soon feature these vehicles with autonomous features, and the goal for manufacturers of these systems is to get them in the hands of automakers and public and private transit services. Projections are that these cars will be on the road within about 5 years. 

What Features Will They Have?

With self-driving cars coming to a road near you (Uber and Lyft ...


Read More on Datafloq
Why does Data Decay so Fast and What to do About It?

Why does Data Decay so Fast and What to do About It?

People change jobs, get promoted and move home. Companies go out of business, expand and relocate. Every one of these changes contributes to data decay. It’s been said that business databases degrade by around 30% per year, but why?

A report by IDG states that companies with effective data grow 35% faster year-on-year. However, for this to happen your data needs to have a high level of accuracy, consistency and completeness. Yet for many businesses, data quality is seen as an abstract concept – let’s examine why…

What is data decay?

Data decay refers to the gradual loss of data quality within a system, including key company information, personal details and most importantly, accurate contact information. As a result, the data becomes outdated and often invalid.
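
To get a feel for how quickly that compounds, here is a small Python sketch using the roughly 30 percent annual degradation figure quoted above; the starting record count is a hypothetical placeholder.

    # If roughly 30% of records decay each year, how much of the database stays accurate?
    decay_rate = 0.30          # illustrative annual decay rate, per the figure quoted above
    records = 10_000           # hypothetical starting number of accurate records

    for year in range(1, 6):
        records *= (1 - decay_rate)
        print(f"After year {year}: about {records:,.0f} records still accurate")

After three years at that rate, fewer than 3,500 of the original 10,000 records would still be accurate, which illustrates why data quality is an ongoing concern rather than a one-off exercise.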

Why does data decay so quickly?

The world is constantly changing and sadly data is not immune to that change. From the moment you capture information, your data is at the mercy of processes and systems, as well as a number of human factors:

Disparate systems

Collecting data across multiple systems can often lead to inaccuracies, including typos, incomplete information or duplicate records.

If you integrate your systems without a cleansing exercise you are only bringing across your “dirty data”. As a result, ...


Read More on Datafloq
How do the USA’s Slow Internet Speeds Impact Big Data Implementation

How do the USA’s Slow Internet Speeds Impact Big Data Implementation

Many people think that the internet in the US is the fastest in the world. The truth is that the internet network in the United States is fundamentally broken. It requires huge investments in order to compete with the nationwide networks of Norway, Japan, Singapore, South Korea and many other developed countries.

Such slow speeds often obstruct the implementation of various advanced systems, processes and technologies. Since big data requires a complex infrastructure and extremely high computing performance, storage, structuring and analysis processes are sometimes directly affected by low internet speeds and various other forms of outdated technology.

Why is U.S. internet so slow?

While Tier 1 and Tier 2 networks work relatively well, the so-called 'last mile' part of the network drastically decreases overall internet speed. This is the last stretch of infrastructure that connects individual homes and corporate offices with the rest of the network and brings worldwide data directly to our modems.

A huge part of this 'last mile' infrastructure is made of outdated copper cabling -- much of it the same kind of wiring that has connected our phones to the network since the time of Alexander Graham Bell -- along with aging coaxial cable. These cables have many 'bottlenecks' that slow data flow down. Although data easily travels thousands of miles ...


Read More on Datafloq
5 Reasons Why Enterprises Are Embracing OSS

5 Reasons Why Enterprises Are Embracing OSS

Enterprises are often characterized as resistant to open source software, both in usage and in offerings. Building enterprise software is not only about building software, but also about building processes. Open source software often lacks long-lasting patrons who can support those processes, and so it struggles to present itself as a serious contender to proprietary offerings. Enterprises are therefore usually hesitant to adopt open source solutions and, as a result, end up not releasing their own offerings as OSS either, creating an impasse.

A new line of thought has emerged recently, however, in which enterprises are offering their solutions under OSS licenses. This rise is mostly fueled by the success of products like Android, MongoDB, Elasticsearch and many more. Here are five reasons why enterprises are increasingly choosing to open source their products:

Builds User Base and Trust

When users can evaluate the product before making a commitment to it, they have a better fulfillment experience. An important point to note is that OSS does not mean the product is free of cost. Enterprises usually have two parallel offerings: a Community Edition that is free of cost and an Enterprise Edition that has some additional features, plus other important add-ons like priority support, ...


Read More on Datafloq
Battle of the Segments: Market Segmentation vs. Customer Segmentation

Battle of the Segments: Market Segmentation vs. Customer Segmentation

In a former life, I worked for a cloud-based retail technology solutions provider looking to bring a retail merchandise planning application to the US market. As with all IT vendors, the actual marketing of the solution preceded the finished product.

In the solution pitch, we talked a lot about top-down and bottom-up planning processes. Top-down planning includes the strategic objectives mandated by management based on a number of inputs, including last year's actual company performance, growth objectives, forecasts and general market indicators. For most, top-down planning was only as granular as the department/category level (most retailers employ at least a four-level merchandise hierarchy, often starting with department and/or category). The merchandisers then work the plan to devise a bottom-up plan. Oftentimes, these plans can go as granular as the SKU-by-location level, but might sit higher in the overall product/store hierarchy.
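
To make the top-down/bottom-up distinction concrete, here is a small Python sketch that spreads a top-down department target across SKUs in proportion to last year's sales, the kind of starting point a merchandiser would then adjust item by item. The department target, SKU names and figures are hypothetical examples.

    # Seed a bottom-up plan from a top-down department target.
    # All names and numbers are hypothetical placeholders.
    last_year_sales = {"SKU-001": 120_000, "SKU-002": 80_000, "SKU-003": 50_000}
    department_target = 275_000          # top-down plan for the department next year

    total = sum(last_year_sales.values())
    bottom_up_seed = {sku: department_target * sales / total
                      for sku, sales in last_year_sales.items()}

    for sku, planned in bottom_up_seed.items():
        print(f"{sku}: {planned:,.0f}")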

OK, enough background. This is what I found interesting: when we were building the messaging for the merchandise financial planning solution, the VP of Product Management, our resident retail expert, insisted that the following message be included in the sales pitch: "The bottom-up plan always wins." I remember the conversation like it was yesterday. I looked at her inquisitively and ...


Read More on Datafloq
Cloud Data Warehousing: Is it for real?

Cloud Data Warehousing: Is it for real?

  Our industry is full of hype and hyped terms. Big Data. NoSQL. The Cloud. Self-service <whatever>. And Cloud Data Warehousing. Some of the offerings and solutions are real. Some less so. Newest on the scene is cloud data warehousing (or data warehousing in the cloud). As with all new tech, there are a variety […]
How to Protect the Online Reputation of a Small Business

How to Protect the Online Reputation of a Small Business

When customers conduct online research before spending their money, they want to see excellent reviews, high ratings and competitive prices. According to a Bright Local survey, nearly 92 percent of customers search for online reviews, and an average customer reads two to six reviews before trusting a business. Additionally, 88 percent of customers trust online reviews more than personal recommendations.

Customer reviews play a significant role in determining the online reputation of your small business, and not paying attention to them can result in a loss of reputation, revenue and customers. With so many people looking at your brand's online reputation, it's important to make sure your customer reviews are as positive as possible.

Here are some tips for managing the online reputation of a small business:

1. Build an online presence

You need to create a business website in order to appear in search results and to provide accurate and relevant information to your visitors. In addition to your business website, you should create an active blog for your small business.

2. Take charge of social media

To support and promote your business activities, you must have an active presence on popular social media platforms such as Facebook, Twitter, LinkedIn, ...


Read More on Datafloq
How to Measure Productivity Benefits and Losses of New Technology

How to Measure Productivity Benefits and Losses of New Technology

Technology exists to make our lives easier. Even simple machines and basic tools saved our ancestors countless hours of manual labor; these days, the latest gadgets promise to shave minutes off our already-lightning-fast tasks, help us communicate more efficiently, or even fully automate tasks that once populated our to-do lists.

When you purchase a new device, upgrade to a new kind of software, or phase out some antiquated technology, your instinct tells you that your team will be more productive -- but what do you have to back that up? Some companies, like Dialpad, have been able to run general studies confirming, for example, that eliminating desk phones can save a company more than $1 million over the course of six months. But how do they calculate this figure? And more importantly, how can you make these calculations for your own investments in technology, before you even pull the trigger on them?
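
One way to approach that question before buying is a back-of-the-envelope model: estimate the hours saved per person, multiply by loaded labor cost and headcount, and subtract what the change costs. Every figure in the Python sketch below is a hypothetical placeholder, not a number from the Dialpad study.

    # Back-of-the-envelope productivity payoff for a planned technology change.
    # All numbers are hypothetical placeholders -- substitute your own estimates.
    employees          = 120      # people affected by the change
    hours_saved_weekly = 0.75     # hours saved per employee per week
    loaded_hourly_cost = 45.0     # fully loaded cost of an hour of their time ($)
    weeks_per_year     = 48       # working weeks considered
    annual_tool_cost   = 30_000   # licences, subscriptions, support ($/year)
    one_time_costs     = 20_000   # rollout, training, migration ($)

    annual_benefit = employees * hours_saved_weekly * loaded_hourly_cost * weeks_per_year
    first_year_net = annual_benefit - annual_tool_cost - one_time_costs

    print(f"Estimated annual benefit: ${annual_benefit:,.0f}")
    print(f"Estimated first-year net: ${first_year_net:,.0f}")

The same structure works in reverse for measuring losses: time lost to retraining or downtime simply enters the model as negative hours saved.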

Potential Values of Upgrades

First, make a list of all the ways that your planned purchase would benefit your company. These are some of the most common ways:


Reduced direct costs. Technology could help you eliminate direct costs altogether. For example, if you’re paying hundreds of dollars a month for a subscription service that could be ...


Read More on Datafloq
Why Drones could be the Future of Real-Time Mapping

Why Drones could be the Future of Real-Time Mapping

For some, drones are nothing but a punch line: Amazon's next attempt at replacing humans in its day-to-day operations, or a device that crashes at the drop of a hat. To others, they're a sign of an uneasy future, one they're not comfortable with. Despite these misgivings, there are a lot of exciting uses for drones, and they're earning a spot in our world. Mapping is something we all take for granted, but it takes a lot of work to keep those maps up-to-date and secure, and it can be extremely frustrating when they're not. The big players, Google and Apple, still use trucks to gather map data, but that might be changing in the future.

Bring in the Drones

Apple has had a hard time competing with Google’s dominance in the mapping space, but it may have found the answer: drones. The company has already gotten approval from the Federal Aviation Administration (FAA) to move forward with the plan, though drone laws are always evolving, and may not fall in Apple’s favor. The drones may not be coming right away, but it’s an option that will help Apple compete with Google’s massively popular Maps application. The drones, in addition to the other features the company is building ...


Read More on Datafloq
