Golden Knights

Trademark fight between the U.S. Army and the new National Hockey League (NHL) team, the Las Vegas Golden Knights: Army challenges Golden Knights trademark; Vegas responds

Best line from the article: “…we are not aware of a single complaint from anyone attending our games that they were expecting to see the parachute team and not a professional hockey game.”

Why Cybersecurity is Turning Into a More Popular Field

As cybersecurity threats become more and more prevalent, it’s no surprise that employment in cybersecurity is on the rise as well. It is estimated that by the year 2022, there will be 1.8 million open jobs in this field. With this becoming a bigger industry, this article will address why cybersecurity is important and what to expect from the industry in the future.

Why Cybersecurity?

As cyber attacks become more common and more harmful, they are also becoming more complex and expensive. Some reports show that the average cost of a security breach has increased by 29% over the past few years. Even though we tend to hear only about attacks on high-profile entities, no company or individual with an online presence is immune to a cyber attack. Being prepared for hacking attempts and security threats is crucial for any business or institution. Businesses are more eager than ever to hire adequately trained information security professionals who can help protect their client records and company assets. Information security professionals with these coveted skills are in high demand, with a job outlook that is expected to increase 37% from 2012 to 2022, according to the Bureau of Labor Statistics. ...


Read More on Datafloq
Using Artificial Intelligence to Facilitate Fraud

Any new tool or technology has the potential to be put to use for good purposes or, unfortunately, for harmful purposes. Artificial intelligence is no different. As we see the rapid progress occurring in the AI space, lots of attention has been paid to all of the good uses of AI. However, it is inevitable that those with nefarious intent are also studying AI successes with an eye toward how to twist them into tools to pursue their less than honorable goals.

Is That Video Actually Real?

Most people today take for granted that if you see a video of someone saying something, then there is no doubt that they said it. However, researchers have built some impressive AI processes that take historical video and audio of a speaker as input. The process then not only pieces together audio that sounds authentic but also creates a very realistic fake video of the person speaking the made-up words.

Check out this video that fakes words from presidents Obama, Bush, and Trump. Here is a video you can watch that feigns former president Obama providing an endorsement. Who is he endorsing in the fake video? The company that has created a product that can mimic people’s audio and video ...


Read More on Datafloq
The Steps for Bringing Your Company Into the World of Big Data

During an organizational development workshop, we had to collect the ways in which the companies we support approach the world of big data. An interesting arc emerged, which is especially instructive for those who feel they would like to take a step forward in 2018 in exploiting the opportunities offered by data.

The process can be broken down into five main steps:

  1. Building momentum and enthusiasm - As a first step, openness and enthusiasm are needed to get something moving. This usually happens in two stages: first, one of the defining personalities of the company or business unit gets a taste for big data: they read a good article online, hear an inspiring talk at a conference, or simply fall in love with the topic while chatting with a long-unseen relative. It is usually worth spreading this enthusiasm to the other players as well; internal workshops, where a general big data talk lays the groundwork for colleagues’ positive attitude toward the cause, offer excellent opportunities for this (we give such talks ourselves, but more on that later).
  2. Gathering competence - With enough initial momentum, the goal is to gather competencies related to the world of big data. This can mean bringing in new employees, but internal data-analysis courses or building a trusted relationship with suitable external partners also belong here.
  3. Validation - In the next phase, drawing on these competencies, the company selects which kinds of processes are worth transforming to be data-driven. This is partly a business task, since it must also be examined whether the data that play a key role in this approach are already available within the company. Concrete data-analysis tasks are rarely carried out here; rather, the point is to verify the viability of the novel, innovative use of data.
  4. Proof of concept - Once we know where big data methods would be worth using, we should not build a full system right away: it is much more important to verify that the data science task we have set for ourselves can be solved at an adequate level. At this stage we typically demonstrate on historical data that a good analytical method can deliver progress in the business sense. Beyond solving the data science tasks and running the machine learning procedures, more precise return-on-investment calculations can also be made at this point.
  5. Building the system - If it has been demonstrated that machine learning methods could have delivered benefits in the past, it is worth capturing those benefits in the present and the future as well. This requires building a system that operates the big data solution continuously, checks its behavior from time to time, and quantifies the surplus it achieves. Many believe this is just a small step after the previous one, but in reality, something that worked well on already-known historical data can demand a significant amount of development and integration work when it has to be embedded into a complete system.
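The proof-of-concept step (step 4) can be sketched in a few lines: score customers on historical data with a simple model, target by score, and check whether that would have beaten the historical baseline. The records and the hand-rolled scoring rule below are invented purely for illustration, not a real methodology.

```python
# A minimal sketch of a proof-of-concept on historical data: score
# customers for churn risk with a simple hand-rolled rule, target the
# highest-risk half, and compare against the historical base rate.
# All records and the scoring rule are invented for illustration.

# (customer_id, monthly_spend, support_calls, churned_next_month)
history = [
    (1, 120, 0, False), (2, 35, 4, True),  (3, 80, 1, False),
    (4, 20, 5, True),   (5, 60, 3, True),  (6, 150, 0, False),
    (7, 45, 2, False),  (8, 25, 6, True),
]

def churn_score(spend, calls):
    # Hypothetical rule: many support calls raise risk, high spend
    # lowers it. A real PoC would fit such a model from the data.
    return calls * 0.15 - spend * 0.002

# Target the riskiest half of customers with a retention offer.
ranked = sorted(history, key=lambda r: churn_score(r[1], r[2]), reverse=True)
targeted = ranked[: len(ranked) // 2]

# Compare: how often did targeting hit an actual churner, versus the
# base churn rate a random campaign would have achieved?
caught = sum(1 for r in targeted if r[3])
base_rate = sum(1 for r in history if r[3]) / len(history)
precision = caught / len(targeted)
print(f"base churn rate: {base_rate:.2f}, targeted precision: {precision:.2f}")
```

On this toy history the score-driven targeting catches every churner (precision 1.00 against a 0.50 base rate); historical evidence of that kind, plus the return-on-investment arithmetic built on top of it, is what justifies moving on to step 5.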

It is clear that the process can stall at any of the five steps: either because priorities shift the focus elsewhere and the company never acquires the right competence, or because the result of the proof of concept does not clearly show that building a new system is worthwhile.

At the same time, the model above really does help in identifying what someone actually needs. For example, if you are merely enthusiastic about the world of big data, you do not necessarily have to commit to a specific technology or platform yet; you can still do that between steps 4 and 5 - even if you think you want to acquire, right away, competencies that are closely tied to the eventual technologies.

If you, too, want to bring your company deeper into the world of big data, it is worth thinking about where you stand in the process above and focusing on where you actually are. In our experience, it is not worth skipping any phase of development; doing so always comes back to bite you later.

We are happy to write more about our experiences with the phases above; tell us which phase you find exciting right now:

Which step should we elaborate on in more detail - Poll

(Image source)

The Growth and Future of Artificial intelligence (AI)

There is an ongoing wave of mistrust toward AI because it is perceived as a threat, but what many people have not noticed is how much benefit AI offers. It is now time to embrace its use.

It has been projected that by 2035, AI will add $814 billion to the UK economy, raising growth rates from 2.5 to 3.5%. The capabilities of AI are immense. For instance, a machine has mastered the complex game of Go, long thought to be a difficult challenge for artificial processing. Vehicles are now operating independently, including trucks that are remotely driven by a single operator. Automated parts and robotic devices that complete tasks with ease are proliferating; these accomplishments are giving new momentum to the AI revolution.

Where is AI headed?

There is no doubt that the advancement of AI is significant, and we have only seen the tip of the iceberg. Realizing AI’s full capabilities requires understanding how far these technological advances can go in taking on the real-world processes that we now consider tough to mimic.

When we ...


Read More on Datafloq
A Guide to Understanding and Using Big Data in Marketing

Data-driven marketing has been transformed by big data. Marketers have access to massive amounts of data streaming to them at lightning speed from various channels. This data is rich with information about customers that marketers can use to personalize campaigns, making them more relevant and effective.

Research has shown that companies, in general, are factoring data into their marketing and sales decisions to a greater degree in order to boost their returns on investment by 15 to 20%. The problem that marketers find with big data is that the information they need can be hidden deep within the data. This is why big data analytics tools are necessary, but they are only helpful if you are using them correctly. If you are going to be working with big data, there are some tips that you should know to help your strategy move more efficiently.

Look Further Than Collection And Analysis

With the large amount of data that is available to marketers, it is easy to get trapped in the revolving door of collecting and analyzing. To avoid this trap, you need to ask yourself how having all the information will help or impact future marketing strategies. You need to ...


Read More on Datafloq
Why Modern Marketers Also Need to Be Data Doctors?

Over the years, marketing in the B2B sector has come a long way, especially when it comes to personalisation and hyper-relevant communication. Although personalisation tools have been at marketers’ disposal for years, not every marketer is reaching out to the right prospects with the right message at the right time. This is where data comes into play, giving contextual insights about prospects and customers.

If you look closely, modern B2B marketers need to be part data doctors. A doctor diagnoses based on the symptoms and then prescribes the medicine. The doctor deciphers symptoms such as temperature, blood pressure, cough, and pain before matching them with the right medicine.

A marketer is no different. They need to learn about customer needs and match them with the right solution and message. The better the needs are understood, the more effective the message is, leading to improved conversions. The data holds the keys to understanding customers and their needs.

Creating customers

While data helps you measure which of your marketing campaigns are working, that is not its ultimate purpose. The ultimate purpose is to create customers, from interest to investment.

There are two parts to this –


Being able to gain insights into your prospects and customers
Scoring your ...


Read More on Datafloq
Why the Organisation of the Tomorrow is a Data Organisation

The fast-changing, uncertain and ambiguous environments that organisations operate in today require them to re-think all their internal business processes and customer touch points. In addition, due to the availability of emerging (information) technologies such as big data, blockchain and artificial intelligence, it has become easier for startups to compete with existing organisations. Often these startups are more flexible and agile than Fortune 1000 companies, and they can become a significant threat if not paid attention to. Therefore, focusing purely on day-to-day operations is simply not enough, and organisations have to become innovative and adaptive to change if they wish to remain competitive.

The Paradigm Shift

The key characteristic of these new startups is that they are, at their core, data companies, regardless of the product or service they offer. Companies such as Google, Facebook, WeChat or Amazon have long understood the importance of data, but, unfortunately, many large organisations still struggle with this paradigm shift. I often tell organisations that viewing the company as a data company will completely change all processes and customer touchpoints. This is a difficult change, but it is required if they want to be able to compete with startups who have been doing this from the beginning.

Therefore, ...


Read More on Datafloq
How Artificial Intelligence Is Delivering a Personalized Content Experience

Artificial Intelligence is driving our efforts toward delivering a personalized content experience. Experience is the biggest enterprise disruption in 60 years. Experience is not some academic or grandiose idea.

Your friends’ and family’s behaviors are shaped by being consumers, whether they are interacting with technology on their mobile devices, at a bank kiosk, or using a touchscreen in retail or their car.

Digital is everywhere. We can tangibly see it in our everyday lives. This is changing the way companies organize themselves departmentally, and how they architect themselves technologically. 

Enterprises need to change the way they think about technology. But the biggest organizational change becomes how you break down departmental silos and put the customer at the forefront of what you are trying to do. Customers are only concerned with a consistent story from your enterprise that is personalized to what they are trying to achieve. But with the amount of data skyrocketing within organizations, how do you make real personalized experiences for customers?

John Mellor, who runs the Strategy and Business Development and Alliances Group at Adobe, gives us all a practical example in his everyday life...

IoT is quickly becoming a key technology in giving truly personalized experiences for customers. John travels often ...


Read More on Datafloq
How Big Data will Influence the Future of Business

For centuries now, the foundation of the commercial business enterprise has been access to data and information. The new world of business generates colossal amounts of data, and technology has devised numerous ways through which business enterprises can harness it. However, this new world order has bumped up against two significant regulation and public policy areas: job opportunities and career development. Big data technology and its associated algorithms have presented a considerable challenge to white-collar jobs and to what employees should be briefed about regarding business operations in the 21st century. However, big data is also set to bring huge benefits to our society, such as the creation of numerous job opportunities.

In the most basic terms, big data technology is the sum of all the tools and processes related to the utilization and management of colossal data sets. The concept of big data was conceived out of the necessity to understand the patterns, tastes, preferences, and trends created when people interact with each other and with the systems that business enterprises have put in place. Companies can use big data analytics provided by Google BigQuery to figure out their most valuable consumers. Big data technology can also ...
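The "most valuable consumers" analysis mentioned here is, at heart, a group-by aggregation that a warehouse query (in BigQuery, for instance) would run over transaction logs: group orders by customer, sum revenue, rank. A stand-in sketch in plain Python, with made-up order records:

```python
# Rank customers by total spend -- the aggregation behind a
# "most valuable consumers" query. The order data is invented.
from collections import defaultdict

orders = [
    ("alice", 40.0), ("bob", 15.5), ("alice", 60.0),
    ("carol", 120.0), ("bob", 9.5), ("carol", 30.0),
]

# Total revenue per customer.
revenue = defaultdict(float)
for customer, amount in orders:
    revenue[customer] += amount

# Rank customers by lifetime revenue, highest first.
top = sorted(revenue.items(), key=lambda kv: kv[1], reverse=True)
print(top)
```

In a warehouse this would be a one-line `SELECT customer, SUM(amount) ... GROUP BY ... ORDER BY` query; the point is that "most valuable consumers" is a ranking over aggregated spend, whatever engine computes it.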


Read More on Datafloq
11 Technology Predictions for Our World in 2050

When we think about 2050 it seems ages away, and we imagine a completely different world, but in reality it is just 30 years from now, and we can already anticipate what will be possible by that time. We have a lot of environmental and social problems; let’s see how technology may solve them by 2050. Today’s article is about the tech of the future!

Before writing this article, I did some research and checked what industry influencers think about this topic, for example, Mr Kurzweil, Business Insider, Forbes, etc.
Let’s start with Ray Kurzweil – the world’s foremost futurist, author of bestsellers like “The Age of Spiritual Machines” and “How to Create a Mind.” He’s so influential that Google hired him to lead its artificial intelligence efforts. He is very well known for making predictions, which are right about 86% of the time.

Tech of the future: technology predictions for our world in 2050

1. Nanobots will plug our brains straight into the cloud

Tech of the future: nanobots. Kurzweil believes that by 2050 nanobots will plug our brains straight into the cloud, giving us full-immersion virtual reality from within the nervous system. Just like we do now with ...


Read More on Datafloq
South Korea Considering Development Of Artillery Defense System

[Mauldin Economics]

In an article I missed on the first go-round from last October, Ankit Panda, senior editor at The Diplomat, detailed a request by the South Korean Joint Chiefs of Staff to the National Assembly Defense Committee to study the feasibility of a missile defense system to counter North Korean long-range artillery and rocket artillery capabilities.

North Korea has invested heavily in its arsenal of conventional artillery. Other than nuclear weapons, this capability likely poses the greatest threat to South Korean security, particularly given the vulnerability of the capital Seoul, a city of nearly 10 million that lies just 35 miles south of the demilitarized zone.

The artillery defense system the South Korean Joint Chiefs seek to develop is not intended to protect civilian areas, however. It would be designed to shield critical command-and-control and missile defense sites. They already considered and rejected buying Israel’s existing Iron Dome missile defense system as inadequate to the magnitude of the threat.

As Panda pointed out, the challenges of developing an artillery defense system capable of effectively countering North Korean capabilities are formidable.

South Korea would need to be confident that it could maintain an acceptable intercept rate against the incoming projectiles—a task that may require a prohibitively large investment in launchers and interceptors. Moreover, the battle management software required for a system like this may prove to be exceptionally complex as well. Existing missile defense systems can already be overwhelmed by multiple targets.
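The interceptor-inventory problem can be made concrete with a back-of-the-envelope calculation: if each interceptor kills its target with probability p, then n interceptors succeed against one incoming round with probability 1 - (1 - p)^n, so the shots required per round grow quickly as the desired intercept rate rises. All numbers below are illustrative assumptions, not actual system parameters.

```python
# Back-of-the-envelope interceptor arithmetic. With single-shot kill
# probability p_kill, firing n interceptors at one incoming round
# intercepts it with probability 1 - (1 - p_kill)**n. Solve for the
# smallest n meeting a target rate. All parameters are assumptions.
import math

def shots_needed(p_kill: float, target_rate: float) -> int:
    """Smallest n with 1 - (1 - p_kill)**n >= target_rate."""
    return math.ceil(math.log(1 - target_rate) / math.log(1 - p_kill))

p_kill = 0.7        # assumed single-shot kill probability
target_rate = 0.95  # desired per-round intercept probability
salvo = 240         # assumed rounds in one artillery salvo

n = shots_needed(p_kill, target_rate)
print(n, n * salvo)  # interceptors per incoming round, and per salvo
```

With these assumed figures, three interceptors per incoming round implies 720 for a single 240-round salvo; a sustained barrage would multiply that figure, which illustrates why the required inventory of launchers and interceptors could be prohibitively large.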

It is likely that there will be broader interest in South Korean progress in this area (Iron Dome is a joint effort by the Israelis and Raytheon). Chinese and Russian long-range precision fires capabilities are bulwarks of the anti-access/area denial strategies the U.S. military is currently attempting to overcome via the Third Offset Strategy and multi-domain battle initiatives.

Using Real-Time Marketing and Machine Learning based Analytics to Drive Customer Value Management

The value of data-driven Customer Value Management (CVM) cannot be overstated. Data, and the algorithms and analytics that shape it, are an integral part of customer value management in a telecom company. With customer expectations rising, it is up to telecom companies to provide customers with a seamless experience and to ensure that doing so helps boost revenue in the process.

To understand this concept in a more functional manner, I recently interviewed the chief of CVM at Mahindra Comviva, Amit Sanyal. With so much on hand to discuss, I got to the crux of the matter straight away and asked Amit about the pillars he considered to be important for a customer value management program being driven by analytics. 

He responded that any analytics-driven approach to CVM should be dedicated to three pillars.

Analytics themselves have an important part to play, which is why they form the first pillar. Understanding consumer behaviour is not child’s play, so it is indeed profitable for a telecom company if the analytics are spot on in their methodology.
The second pillar pertains to efficiency and points towards context ...


Read More on Datafloq
Internet of Things: How Much Does it Cost to Build IoT Solutions?

Without any doubt, IoT is improving the quality of our lives every day. More companies invest in IoT solutions in order for their business to run smoothly, and more people invest in technology that helps them save time and make their lives easier. If you are one of the people who want to start building IoT solutions, you should ask yourself a very important question: how much does it actually cost?

Validate Your Idea

Before talking about the actual costs of building the IoT solution, you have to make sure that you have a target audience for your idea and also build a minimum viable product, in order to decide whether it’s worth it or not.

First of all, it’s very important to study your competition, to search for similar solutions that already exist on the market, and to figure out a way to make your product stand out and bring something different to the game. Maybe you could build a website to gauge customer response. The next step would be to create a PoC (Proof of Concept), which is basically the evidence that your product is able to solve a real-life problem and has the potential to be a commercial success. Building a strong ...


Read More on Datafloq
Arctic Territories

The Arctic is an ocean, so claims there should be resolvable by existing rules concerning 12-mile territorial limits and 200 nautical mile exclusive economic zones. But, the law of the sea allows countries to claim beyond the 200 nautical mile limit if they can prove that their continental shelf extends beyond those zones. This has led to more issues.

There are only five nations with claims in the Arctic: the United States, Russia, Canada, Norway and Denmark (Greenland). Some claims are fairly typical, like the sea border between Alaskan and Canadian territory being in dispute, Hans Island near Greenland being disputed between Denmark and Canada, and the question as to whether the Northwest Passage is Canadian territory or international waters. These are all disputes that will probably be solved through diplomacy.

But, confusing the situation is that three nations claim the North Pole. The North Pole is 430 miles (700 kilometers) from the nearest land. The sea depth there is 13,980 feet (4,261 meters).

Canada’s claims go all the way to the North Pole and 200 miles beyond it, based upon where they have claimed that their continental shelf is: Canada Claims North Pole

and: http://www.rcinet.ca/en/2016/05/03/canada-to-submit-its-arctic-continental-shelf-claim-in-2018/

Russia has made similar claims. They are saying the Lomonosov and Mendeleyev ridges are part of its continental shelf and therefore part of its territory. A topographic map of the area is worth looking at:

Denmark also claims the North Pole. Apparently the Lomonosov Ridge is also an extension of Greenland.

This map nicely summarizes the confusing and competing claims:

The Russians have gone so far as to dive down to the Lomonosov Ridge and plant a flag there in 2007. The flag was planted 14,000 feet below sea level.

5 Data Challenges for Brand and Marketing Agencies

The rapid pace of technological innovation and the sudden emergence of Big Data has left a lot of marketers feeling left behind. In less than a generation, the marketing industry has shifted in a way nearly unprecedented in its entire history.  It has gone from a largely intuitive or psychological art to being heavily defined by data, analysis, and science.  

This has created numerous challenges for marketing agencies, both new and old. Getting a handle on their data, using it properly, and finding new ways to reach out to consumers are challenges facing every marketer at work today.  

What are some of the biggest data challenges faced by modern brands and marketing departments? And what could potentially address those problems? Here are some answers.

1. The Problem:  Getting a Handle on Big Data

One of the biggest challenges simply involves collecting, storing, and accessing data. Some organizations still find themselves struggling to get the information they need flowing in. Others have opened too many pipelines, and find themselves drowning in an ocean of data without clear ways of organizing it.

In either case, what's called for is a data-collection plan. Don't collect data for its own sake. Have clear goals in mind for what the data will be used for, ...


Read More on Datafloq
Build a Digital Marketing Empire with the Six Best Digital Asset Management Vendors

An increasing number of digital marketers are realizing the need for a robust digital asset management (DAM) system. It’s becoming much more critical for marketers to have easy access capabilities, as well as adequate storage that doesn’t hinder findability, for the many files and assets they must manage on a daily basis. This is especially true for growing companies.

Digital Marketing Today is Increasingly Complex

Competing in a global marketplace means digital marketers must be ahead of their competitors, always. But great marketing is multi-faceted, encompassing:


Data
Processes
Teams
Products
Channels
Content


With the growing number of social media channels, there is also a corresponding need for more marketing content. Audiences demand content that’s relevant to them, meaning content must be customized for the channel, the medium, and the individual – in other words, you can’t post the same content on Pinterest that you post on Instagram. With a DAM system, digital marketers can centralize all their assets, guidelines, materials, and anything else that fuels brand-building initiatives.

Benefits of DAM Solutions

Digital marketers understand the importance of keeping all verbal and visual content and communication on-brand. In order to ensure on-brand messaging, the most recent and updated assets must be readily available – from anywhere, at any time. Then, you must ...


Read More on Datafloq
The Anatomy of a Data Story

Stories are how people make sense of the world, so it follows that they’d also be our way of making sense of data. As much as technology can facilitate data storytelling, it cannot (yet) perform the pivotal step of placing data into the human context. That’s our job, and we’re really good at it. We’re such natural storytellers, in fact, that most of us already use a story format for presenting information.

Take the standard scientific research paper format. After the title and authors (our main cast of characters), we get a summary of the story in the article abstract. It’s like one of Shakespeare’s spoiler-laden prologues, a quick preview of what’s to come. After that comes the Introduction, complete with the scientific problem and research question that incites the story’s action, the experiment itself. The Materials and Methods sections chronicle the heroes’ journey as they conduct the experiment, and the Results and Discussion sections are when we find out how they fared. Beginning, middle, end. Rising action, falling action, resolution. Story!

But there’s a big difference between following a narrative arc and telling a story that will deliver an impactful message and linger with audience members. In his TED Talk on data storytelling, blogger Ben Wellington of I ...


Read More on Datafloq
The Arctic

Brief article from Michael Peck on Russia’s Arctic ambitions: Russia has plans to dominate the Arctic

In case you missed it, the Arctic has been warming up for the last few decades. As Peck points out: “Once a lure for hardy explorers–and a hiding place for ballistic missile submarines–the North Pole is now seen as a new frontier with abundant energy and mineral resources. With polar ice melting, new shipping lanes are opening up that offer the prospect of more direct routes for cargo vessels sailing between North America, Europe and Asia.”

Anyhow, while the U.S. Navy has only one heavy icebreaker, fellow NATO member Canada is working the problem. They are building a facility at Baffin Island and are developing Arctic-capable patrol vessels, frigates, etc. They are also planning to build 6-8 Harry DeWolf-class Arctic patrol vessels:

https://en.wikipedia.org/wiki/Harry_DeWolf-class_offshore_patrol_vessel

based upon: https://en.wikipedia.org/wiki/NoCGV_Svalbard

There are also the efforts of NATO members Norway and Denmark, so it is not like the United States and Russia are the only participants here.

Still, the Northwest Passage has only been open seasonally since 2000, with a cruise liner going through it in 2006. It is now being transited with increasing regularity. Of course, there is also the Northeast Passage, and Russia has the Northern Sea Route, which it is developing. Still, these passages see limited use right now. In 2016, 18 ships (7 of them Russian) traversed the Northern Sea Route.

A few related links:

Canada at War. The Arctic. Northwest Passage, 1944

Amazing Voyage Through Perilous Arctic Ocean (2000)

Warming ‘opens Northwest Passage’ (2007)

Plain sailing on the Northwest Passage (2007)

That pricey Arctic luxury cruise was just the beginning. Up next: Arctic shipping. (2016)

 

The picture at the top of this post is from 2016.

Two Kursk books listed on Amazon.com

We have a link to my Kursk book on our website: Kursk: The Battle of Prokhorovka

This link leads to the best source for purchasing the book, the publisher Aberdeen Books, although it is probably best just to purchase it directly from him (click on the image of the Kursk book in the sidebar to get to the Aberdeen Bookstore). That link does not show any used books. Apparently all the used books are listed here: Kursk: The Battle of Prokhorovka by Christopher A. Lawrence (2015-12-14)

In that link they are listing new books from $302.55 and used books from $321.07. There is a used copy listed for $1,710.00. Now if anyone out there is ready to part with $1,700 for a Kursk book, please just contact me. I have an author’s copy or two I would sacrifice at that price.

 

First World War Digital Resources

Informal portrait of Charles E. W. Bean working on official files in his Victoria Barracks office during the writing of the Official History of Australia in the War of 1914-1918. The files on his desk are probably the Operations Files, 1914-18 War, that were prepared by the army between 1925 and 1930 and are now held by the Australian War Memorial as AWM 26. Courtesy of the Australian War Memorial. [Defence in Depth]

Chris and I have both taken to task the highly problematic state of affairs with regard to military record-keeping in the digital era. So it is only fair to also highlight the strengths of the Internet for historical research, one of which is the increasing availability of digitized archival holdings, documents, and sources.

Although the posts are a couple of years old now, Dr. Robert T. Foley of the Defence Studies Department at King’s College London has provided a wonderful compilation of links to digital holdings and resources documenting the experiences of many of the belligerents in the First World War. The links include digitized archival holdings and electronic copies of often hard-to-find official histories of ground, sea, and air operations.

Digital First World War Resources: Online Archival Sources

Digital First World War Resources: Online Official Histories — The War on Land

Digital First World War Resources: Online Official Histories — The War at Sea and in the Air

For TDI, the availability of such materials greatly broadens potential sources for research on historical combat. For example, TDI made use of German regional archival holdings to compile data on the use of chemical weapons in urban environments from the separate state armies that formed part of the Imperial German Army in the First World War. Although much of the German Army’s historical archives were destroyed by Allied bombing at the end of the Second World War, a great deal of material survived in regional state archives and in other places, as Dr. Foley shows. Access to the highly detailed official histories is another boon for such research.

The Digital Era hints at unprecedented access to historical resources, and more materials are being added all the time. Current historians should benefit greatly. Future historians, alas, are not as likely to be so fortunate when it comes time to craft histories of the current era.

Why a Risk-Based Approach to Cybersecurity Presents Solutions

Why a Risk-Based Approach to Cybersecurity Presents Solutions

Those who choose not to take cybersecurity seriously find themselves in a very compromising position. Even someone who only peripherally follows the news learns about major hacking operations that procured sensitive and private information. In some highly sensational cases, the servers belonged to major retail chains. Those not examining cybersecurity from the proper perspective might not fully grasp the severity of these breaches. A multi-billion-dollar retail chain certainly has a large enough budget to pay for the best possible firewalls and other network security systems. Yet hackers are able to breach those same safeguarded networks. If nothing else, the news of disastrous breaches reveals that hackers employ a combination of expertise and sophistication that could succeed against any chosen target.

Also worth noting is the potential for an “inside job.” Disgruntled employees have taken the immoral and often illegal step of violating network security. Such acts of revenge may get the employee into trouble, but only after the fact. The damage, as the saying goes, is done. Persons impacted by such actions must deal with the unfortunate consequences.

All this may sound like dire, doom-and-gloom scenarios that herald the futile nature of cybersecurity. That truly ...


Read More on Datafloq
How Big Data Enables Open Strategizing

How Big Data Enables Open Strategizing

Open strategy is the decentralisation of strategy formulation across previously excluded internal and external stakeholders. Traditionally, companies focused on control and ownership of an organisation’s assets to ensure competitive advantage. However, this focus on control of resources is no longer the key to success. In today’s data-driven societies, knowledge, i.e. data, is widespread and easily accessible, and organisations are increasingly turning into data companies. Access to this knowledge can fuel innovation, especially when it is used to embrace external ideas and data, and when these insights are combined with internal R&D. This enables organisations to find new opportunities, develop new business models and remain competitive in this digital age.

The process of open strategizing is possible thanks to the plummeting costs of communication and the availability of new technologies such as big data, blockchain and artificial intelligence. These technologies allow organisations to better understand and apply the intelligence of the crowd, resulting in better solutions, improved innovations and more informed decision-making. Open strategizing is increasingly being adopted by organisations.

IBM’s Innovation Jam

Open strategy has been around for almost 20 years since it was first used by IBM in their Innovation Jam. It can be seen as an extension of open innovation, which ...


Read More on Datafloq
7 Practical Ways How To Make Money Out Of Bitcoin Indirectly

7 Practical Ways How To Make Money Out Of Bitcoin Indirectly

The bitcoin craze has taken the global economy by surprise. It has started a significant shakeup in the e-commerce and finance worlds, including the stock and forex markets. Since its introduction in 2009, the coin has gained immense value and is already transforming the lives and financial status of its initial investors and traders.

However, even with this trackable success, most investors are still apprehensive about investing directly in the coins, preferring to make money out of the cryptocurrency indirectly by investing in products contributing to its success. But what are these investment options that allow you to gain from Bitcoin without actually owning the tokens? Here are seven ways to go about it:

1. Bitcoin consultancy

Just like the stock market, Bitcoin thrives on speculation and the political attitude of the day. With little experience in stock price speculation and value-forecasting, individuals have set up thriving Bitcoin consultancy firms that offer actionable and highly reliable price forecasts about the digital currency by studying its trends. Given that the online currency is just gaining momentum, with countries like Japan and South Korea legalizing its use, you too can cash in on the Bitcoin trade as a consultant who trades forecast ...


Read More on Datafloq
Angular JS – Most Preferable Open-Source for Mobile Apps and Web Apps

Angular JS – Most Preferable Open-Source for Mobile Apps and Web Apps

When you browse through the web pages of any website, what’s the first thing you notice? Certainly, how the web pages are designed. It’s among the first things that create an impact, and it’s a determining factor for how long you choose to browse and stay on that web page. Developers invest a great deal to make these web pages more appealing and responsive.

Nowadays, a website is expected to be more than a set of plain static HTML pages. The more responsive and dynamic a site is, the better its UI. Web pages should be appealing, lively, creative, dynamic, functional and responsive. JavaScript frameworks come to our aid in making all of this possible.

Angular JS, an open-source framework maintained by Google, was arguably the best JavaScript framework of 2017. It addresses the challenges developers face during development and testing of code, and it remains one of the most popular JavaScript frameworks today.

Top Features of Angular JS

Let’s discuss its top features and the benefits it provides in some detail, emphasizing the development benefits of using the Angular JS framework. Have a quick view of what’s in store for you –

Its Architecture Is Simple

Angular ...


Read More on Datafloq
Future Conventional Warfare Scenarios

Future Conventional Warfare Scenarios

What are the U.S. Armed Forces’ potential conventional warfare missions?  Is conventional warfare gone, leaving the U.S. Army conducting special ops, training, coordinating air and drone strikes, providing counterinsurgency support, and generally just kicking down doors?

Well, there are still a few potential conventional warfare scenarios out there, even if they have a low probability of occurring:

  1. Korea: We still have the majority of the 2nd Infantry Division deployed in Korea as a reserve force for the Republic of Korea (ROK) Army. If a war blows up in Korea, then we are immediately right in the middle of a conventional war. It is 1950 all over again. Amid all the “fire and fury” type comments, I consider the odds of this occurring to be low. Still, it is one conventional warfare mission that has existed since 1950 and does not appear to be going away.
  2. Taiwan: I don’t think China is going to invade Taiwan (their third largest trading partner), but stranger things have happened. I believe we are informally committed to defend Taiwan if this happens. We have no ground troops there.
  3. Ukraine: We have no commitment to defend Ukraine. On the other hand, if Russia rolls across the border with tanks and heads towards Kiev, then we may decide we need to intervene. Exactly what forces we would use is a question, but this is potentially a mission in the future. I don’t think it is likely. If Russia was going to conduct a conventional invasion of Ukraine, it would have done so in 2014.
  4. Baltic States: On the other hand, we do have a commitment to defend the three Baltic States (Latvia, Lithuania and Estonia). They are members of NATO. Right now, with the forces currently in place, a Russian conventional invasion would sweep over these three countries in a matter of days. Then what? The U.S. would be challenged to quickly move a single armored or mechanized division there, let alone the several divisions it would probably take to re-claim them. We currently are not defending them and do not have the ability to quickly re-take them. That said, the odds of Russia doing this are very, very close to zero, because they would end up in a war with 29 nations. This is probably not the best use of their time.
  5. Belarus: On the other hand, I don’t rule out tanks rolling into Belarus at some point in the future. Lukashenko, the Belarus dictator, is 63 years old, and these guys don’t live forever. Once he is gone, will Belarus undergo a calm transition of power to a new president (for life)….or does Russia take this opportunity to reclaim Belarus? Unlike Ukraine, there is not a strong nationalist group that is clearly ready to fight off any Russian invaders. If Russia did decide to take Belarus (probably making sure they were invited, like they were in Afghanistan in 1979), is there anything we could do about it? How concerned would we be about it?
  6. Georgia: Russia already had a five-day war with Georgia in 2008. Russia probably could have overrun Georgia if they wanted to. They probably can now. It is a very small country and geographically isolated from NATO. I don’t rule out it becoming a battlefield in the future. Not sure what the United States could do about it.
  7. Iran: While I don’t think that the U.S. will ever invade Iran, I would have said the same thing about Iraq in 2000. Of course, Iran is a country with a population more than twice that of Iraq. Invading Iraq in 2003 led to lots of long-term complications. Invading Iran might get even more difficult.
  8. The mission not yet named: The last 30 years are notable in that the United States has been dragged into three major wars rather suddenly. At the beginning of 1990, I don’t recall any defense analyst saying the United States was about to enter into a war with Iraq for the sake of saving Kuwait (who we had no alliance with). Yet, less than a year later, this is exactly what we did, and it was done with a large conventional force of nine deployed U.S. divisions. In 2000, I don’t recall too many defense analysts saying that we would soon be invading Afghanistan and Iraq. These missions came rather suddenly. So, one must always assume that there is a possible conventional mission at any time in any place. It has happened twice in the last 30 years. These are hard to plan for and to structure forces for, yet there is clearly a need for a mobile conventional force just in case.

Anyhow, that list appears to cover the possible conventional warfare missions for the United States right now. The one with the highest probability of occurring is “the mission not yet named.” There are many other flash points in the world, but most of them are not ones that would attract American conventional ground forces. Still, as shown by Kuwait in 1990 and Iraq in 2003, we can end up involved in a conventional conflict with very little notice. This is a far cry from the days of the Cold War when the Soviet Union and Warsaw Pact were lined up along the border of Germany. The future ain’t what it used to be, to borrow a quote.

We are now popular!

We are now popular!

We have been linked in an article in Popular Mechanics: The U.S. Is Sending Deadly Javelin Missiles to Ukraine

The article is definitely worth reading. Have no idea who Kyle Mizokami is.

This is the link they connected to:

The Russian Artillery Strike That Spooked The U.S. Army

Anyhow, appreciate the link.

 

Which Software Should You Get? Off-The-Shelf or Custom Built?

Which Software Should You Get? Off-The-Shelf or Custom Built?

It's a big question that can have huge consequences for your business. Should you go for off-the-shelf software? Or opt for a custom build?

As frustrating as it may sound, there's no one-size-fits-all answer to this dilemma. Every business is different, and so are its needs. You might be in a position where the ready and available (and often cheaper) off-the-shelf software is the ideal solution you're looking for. On the other hand, you may crave a more ambitious construct, tailor-fitted to what you require, to get ahead of competitors with technology that's unique to you.

First, to help you figure out which type of software works better for your business, let's explore the key differences between off-the-shelf and custom solutions.

Off-the-shelf software is the commercial approach to solving your business needs. It's relatively cheap and readily available. A good example of an off-the-shelf product for companies is Microsoft Office - a pre-existing package that's available to everyone that you can get up and running with minimal fuss.

Custom software has become extremely popular with businesses in recent times, due in no small part to how its bespoke programming can cater precisely to your needs. Because it's created ...


Read More on Datafloq
Who Will Be the Haves and Have Nots?

Who Will Be the Haves and Have Nots?

The rapid advance of technology, an advance that is occurring exponentially, is creating a new gap within American society – those who can harness the power of technological advancement and those who can’t. This widening gap will have lasting economic consequences.
Russian General Staff Chief Dishes On Military Operations In Syria

Russian General Staff Chief Dishes On Military Operations In Syria

General of the Army Valeriy Gerasimov, Chief of the General Staff of the Armed Forces of the Russian Federation and First Deputy Minister of Defence of the Russian Federation [Wikipedia]

General of the Army Valery Gerasimov, Chief of the General Staff of the Armed Forces of Russia, provided detailed information on Russian military operations in Syria in an interview published in Komsomolskaya Pravda on the day after Christmas.

Maxim A. Suchkov, the Russian coverage editor for Al-Monitor, provided an English-language summary on Twitter.

While Gerasimov’s comments should be read critically, they do provide a fascinating insight into the Russian perspective on the intervention in Syria, which has proved remarkably successful with an economical investment in resources and money.

Gerasimov stated that planning for Russian military operations used Operation Anadyr, the secret deployment of troops and weapons to Cuba in 1962, as a template. A large-scale deployment of ground forces was ruled out at the start. The Syrian government army and militias were deemed combat-capable despite heavy combat losses, so the primary supporting tasks were identified as targeting and supporting fires to disrupt enemy “control systems.”

The clandestine transfer of up to 50 Russian combat aircraft to Hmeimim Air Base in Latakia, Syria, began a month before the beginning of operations in late-September 2015. Logistical and infrastructure preparations took much longer. The most difficult initial challenge, according to Gerasimov, was coordinating Russian air support with Syrian government ground forces, but it was resolved over time.

The Russians viewed Daesh (ISIS) forces battling the Syrian government as a regular army employing combat tactics, fielding about 1,500 tanks and 1,200 artillery pieces seized from Syria and Iraq.

While the U.S.-led coalition conducted 8-10 air strikes per day against Daesh in Syria, the Russians averaged 60-70, with a peak of 120-140. Gerasimov attributed the disparity to the fact that the coalition was seeking to topple Bashar al-Assad’s regime, not the defeat of Daesh. He said that while the Russians obtained cooperation with the U.S. over aerial deconfliction and “de-escalation” in southern Syria, offers for joint planning, surveillance, and strikes were turned down. Gerasimov asserted that Daesh would have been defeated faster had there been more collaboration.

More controversially, Gerasimov claimed that U.S.-supported New Syrian Army rebel forces at Al Tanf and Al-Shaddidi were “virtually” Daesh militants, seeking to destabilize Syria, and complained that the U.S. refused Russian access to the camp at Rukban.

According to Russian estimates, there were a total of 59,000 Daesh fighters in September 2015, and 10,000 more were recruited thereafter. Now only 2,800 remain, and most militants are returning to their home countries or moving on: most are believed to be heading to Libya, some to Afghanistan, and others to Southwest Asia.

Gerasimov stated that Russia will continue to deploy sufficient forces in Syria to provide offensive support if needed and the Mediterranean naval presence will be maintained. The military situation remains unstable and the primary objective is the elimination of remaining al Nusra/Hay’at Tahrir al-Sham (al Qaida in Syria) fighters.

48,000 Russian troops were rotated through Syria, most for three months, drawn from nearly 90% of Russian Army divisions and half of the regiments and brigades. 200 new weapons were tested, and “great leaps” were made in developing and using drone technology, which Gerasimov deemed now “integral” to the Russian military.

Gerasimov said that he briefed Russian Defense Minister Sergei Shoigu on Syria twice daily, and Shoigu updated Russian President Vladimir Putin “once or twice a week.” All three would “sometimes” meet to plan together, and Gerasimov averred that “Putin sets [the] goals, tasks, [and] knows all the details on every level.”

4 Crucial Tips for Securing Your IoT Deployment Networks

4 Crucial Tips for Securing Your IoT Deployment Networks

Over the last few decades, the world has witnessed tremendous changes in the technology landscape. Today, individuals, organisations, and businesses are using cutting-edge tools and technologies to simplify their work and lives and to achieve their goals.

The introduction of computers and the implementation of internet technology have brought a sea change to human life. Today, both individuals and businesses are taking advantage of revolutionary technologies such as virtual reality, augmented reality, wearables, and the Internet of Things (IoT) to increase their efficiency and offer a unique experience to customers.

Among all these technologies, IoT is one of the most important, used by a large number of businesses to streamline their work environments. The term “Internet of Things” was first introduced by Procter & Gamble’s Kevin Ashton back in 1999, and over the years it has completely changed the way we communicate in the digital world and the way we work today.

A Brief Snapshot of IoT

IoT is a network of connected devices, including everything from washing machines, cell phones, wearable devices, and coffee makers to headphones and buildings, that contain sensors and software to collect the information ...


Read More on Datafloq
The Human Touch of IoT CEOs

The Human Touch of IoT CEOs

A few days before the Christmas holidays, I received an email from a customer that said, “… I want to tell you that I have really appreciated your help, your professional approach and your ‘human touch’: they are as important as knowledge is …”.

Moved by the Christmas spirit that surrounds us these days, I changed my publishing priorities and decided to dedicate a few lines to what I consider a very important issue: what is the human touch value of CEOs in the Internet of Things?

I do not intend to convert this article into an analysis of the types of CEOs, or a list of the best CEOs of IoT companies (for that there will be time).

My objective today is to make IoT CEOs aware, especially those of large multinationals, of their responsibility to put a human touch on their actions and decisions. Not only do the stability and quality of work of millions of people depend on them, but also the conservation of our planet in favourable conditions for future generations.

The Human Touch of IoT CEOs to Save the World

Global Warming is very real.  Even if greenhouse gas concentrations stabilised today, the planet ...


Read More on Datafloq
The Future is All about AI Devices That Can Actually Serve Us

The Future is All about AI Devices That Can Actually Serve Us

The role of Artificial Intelligence (AI) devices in augmenting humans and in achieving tasks that were previously considered unachievable is just amazing. With the world progressing towards an age of unlimited innovations and unhindered progress, we can expect that AI will have a greater role in actually serving us for the better.

Since I have been associated with this wave of change towards AI-driven technologies and modules, I have literally been amazed at the ground we have covered during the last couple of years or so. As the technology behind AI gets revamped and updated on a regular basis, we can expect the wave of change to serve us in an even better way in the future.

A few cases of AI at work currently really do make us excited about the future of this technology. Some of the examples of this technology include:


We now have AI personal assistants to help us in tackling everyday tasks that were becoming a bit overwhelming in the past. These digital assistants can help streamline what you are doing, and come in handy to get your schedule on the right track. The potential for smart apps goes far beyond digital assistants. Many mobile applications are starting ...


Read More on Datafloq
Happy 2018!

Happy 2018!

Wishing everyone a Happy New Year!
Oracle Analytics Cloud … and the (Welcome) Return of the Enterprise BI Platform

Oracle Analytics Cloud … and the (Welcome) Return of the Enterprise BI Platform

Oracle Analytics Cloud, the “full” implementation of Oracle’s business analytics platform tools in Oracle Public Cloud, was released back in May 2017, and I covered its basic product proposition in a Drill to Detail Podcast episode with Oracle’s Vasu Murthy a few weeks before then.

Up until now I’d not really had a chance to properly review Oracle Analytics Cloud, as the products I work with right now are mainly based around Google’s Cloud Platform and the Looker BI tool, but Oracle kindly made available some cloud promotion credits through their Oracle ACE Director community program, and so I gave it, along with a few other supporting cloud services, a spin over the Christmas break.

In the end it was only really a high-level evaluation, but with my previous experience with Oracle’s BI Cloud Service, Data Visualization Desktop and Cloud Services, and the on-premises versions of their BI Enterprise Edition product line, I was able to get up and running fairly quickly. More interesting for me was trying to work out who the target market is for Oracle Analytics Cloud and what particular use-cases it tries to address; why Oracle released Oracle Analytics Cloud when they already have Oracle BI Cloud Service, and how they differentiate the two products and the two markets they presumably serve; and to speculate on why Oracle made the choices it made when coming up with Oracle Analytics Cloud, given the competition it faces in the market.

For anyone not so familiar with Oracle’s business analytics products: you’ve basically got a classic end-to-end enterprise BI platform, Oracle Business Intelligence 12c, covering everything from reporting tools through business metadata modeling to application and database server integration, that was typically sold to the IT function within corporate and government customers. It was recently joined by Oracle Data Visualization, which turned all this on its head and sold to the sorts of buyers who were now buying Tableau on their company credit cards and bypassing the IT department so that they could actually get something done.

Oracle’s first forays into moving all this into the cloud were focused again on the needs of these non-technical business buyers, starting with Oracle Business Intelligence Cloud Service (“BICS”), a departmental cut-down version of Oracle Business Intelligence that I covered in an article for Oracle Magazine when the product first came out. Over time BICS was extended to include data visualization capabilities that were being introduced with the Data Visualization Option for the new Oracle Business Intelligence 12c release, and then just those visualization features were packaged up into another end-user focused cloud service, Oracle Data Visualization Cloud Service, that became available around two years ago.

One thing you couldn’t do with any of those cloud tools, though, was migrate your on-premises Oracle BI deployments into Oracle’s Public Cloud, as BICS made you use a cut-down version of the Oracle Database along with a much-simplified data modeling tool to store and organize your data for reporting. But around the same time as Data Visualization Cloud Service came out, it started to become possible to “lift-and-shift” your full enterprise business metadata models into BICS and run them against a full Oracle Database instance running in the cloud, or against on-premises databases using a special secure connector between Oracle’s cloud data centres and the ones running your Oracle database servers.

“Lifting and Shifting” an on-premises RPD into Oracle BI Cloud Service

What some customers still wanted though was more than this; BICS, DVCS and all the other BI products Oracle offered were fully-managed services, not unlike Looker and Google’s BigQuery distributed database service that I use day-to-day in my current product management role at Qubit. What these customers wanted was full on-premise Oracle Business Intelligence running in Oracle Public Cloud that they could then connect into securely and manage the configuration settings to suit just their particular needs, and choose when to apply patches and when to run backups to suit their standard operating model.

What they also wanted was full unfettered access to Oracle’s BI Server metadata layer, so they could not only upload their on-premises business metadata models but then extend and update them, and at the same time incorporate elements of functionality from Oracle’s Essbase Server, which had also recently made its own transition into Oracle Public Cloud. Crucially, these customers still wanted to pay for all of these software costs monthly out of OpEx, but also wanted the license metric to move back to per-processor rather than the named-user basis BICS and DVCS used, so they could then roll out cloud analytics to everyone in the organization rather than just a few users, and also spin up test, development and pre-production environments if they’d licensed enough processor capacity to run them all. This, then, was the set of requirements Oracle Analytics Cloud was put together to meet, and so I took a look at what v1 of this new product looked like over the time between Christmas and New Year.

Installing Oracle Analytics Cloud is more like installing Tableau Server, or one of the other pre-rolled BI server software VMs you find on services such as AWS Marketplace, than getting a new account on BICS or one of the other Oracle fully-managed services running on Oracle Public Cloud. You first have to set up some object storage space using Oracle Cloud Storage to hold the various configuration and log files OAC will require, then set up a database instance using Oracle Database Cloud Service to hold the RCU schema that OAC will require.

Creating the Oracle Analytics Cloud service

Oh, and you almost definitely need some prior knowledge of how to set up and configure Oracle software, and of the previous on-premises version of Oracle Business Intelligence, before any of this makes sense; Oracle Analytics Cloud is definitely aimed at the IT department rather than casual business users, and moreover at those IT departments already invested in Oracle’s analytics, database and cloud technologies, for whom this will seem very familiar and in fact very reassuring — this is Oracle’s full on-premise functionality ported up into the cloud, not some cut-down managed-for-you version that hides all the controls and is impossible to manage via your own DevOps scripting.

Truth be told, I found the installation process just baffling when I first ran through it; being more used these days to services such as Looker and Google’s BigQuery, Cloud DataPrep and Google Data Studio, the need to understand an installation process at all, and one requiring mandatory knowledge of other current and historic products related to the one I was installing, seemed crazy in comparison to the startup-friendly SaaS products I’m now more familiar with.

But not every organization is a startup, and not every user of analytics software is a consumer; complex enterprise software is complex because enterprise and large government customers have gone beyond simple customer success departments, growth hackers and biz ops teams to needing to support operations across multiple countries, integrate with legacy systems you’ve never heard of and hopefully never will, and do everything according to strict sets of rules, such as Sarbanes-Oxley and HIPAA, that land you in jail if you don’t comply with them. These types of organization have very strict rules and procedures around how software is hosted, accessed and managed, and vendors like Oracle know these customers’ needs well: in most cases they would prefer a complex but controllable installation process over one that hides all the details but potentially leaves them exposed to regulatory issues thereafter, meaning they could never sign off the provisioning process as complete and fully compliant.

And one of the first lessons I learnt in product management is that whilst engineers and product managers prefer packages that give end-users all available functionality for one simple price, enterprise salespeople much prefer options.

PMs like packages, enterprise sales prefer options

With packages sold to enterprises with a single list price advertised — say, $1m all-in for the right to run everything on a single high-powered server — what happens in reality is that one or two customers pay that price, but the vast majority get some sort of discount so that everybody pays something and no sales are lost just because the customer couldn’t afford the list price.

What works far better when selling enterprise deals is when everything beyond a basic core set of functionality becomes an option and customers can then align what they want with what they can afford without the vendor giving everything away each time and massively-discounting deals to make them affordable.
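That core-plus-options dynamic can be made concrete with a small, purely hypothetical sketch in Python. All of the prices, budgets, and discount rules below are invented for illustration; the point is simply why a modular price list can capture more revenue across customers with very different budgets than a single heavily-discounted package.

```python
# Hypothetical comparison of "one big package" vs "core plus options" pricing.
# Every number here is invented; only the structure of the argument matters.

def package_revenue(budgets, list_price, floor=0.3):
    """Single all-in package: most customers negotiate down from list price,
    paying min(budget, list_price); deals below a discount floor are lost."""
    total = 0
    for b in budgets:
        price = min(b, list_price)
        if price < list_price * floor:
            continue  # deal lost: even the deepest allowed discount is too expensive
        total += price
    return total

def options_revenue(budgets, core_price, option_price, n_options):
    """Core product plus priced options: each customer buys the core and
    as many options as their remaining budget allows."""
    total = 0
    for b in budgets:
        if b < core_price:
            continue  # can't even afford the core edition
        affordable = min(n_options, (b - core_price) // option_price)
        total += core_price + affordable * option_price
    return total

budgets = [1_000_000, 650_000, 400_000, 250_000]  # invented customer budgets
print(package_revenue(budgets, list_price=1_000_000))
print(options_revenue(budgets, core_price=200_000, option_price=150_000, n_options=6))
```

With these invented numbers the package model loses the smallest customer entirely and discounts the rest, while the options model sells something to everyone, which is exactly the sales logic described above.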

Hence Oracle Analytics Cloud having both Standard and Enterprise Editions, and the Standard Edition having two variants based around just data visualization or just Essbase, and there being an upcoming Data Lake Edition that will include other net-new functionality, and presumably some time in the future other functionality being added as additional options that will need further license spend by the customer before becoming available to their end-users.

Just remember, when you’re sitting in the engineering department mocking the salespeople, that extracting the right price from the right customers whilst everyone feels they’ve got a good deal is what B2B selling is all about; that’s why those salespeople are about the only people paid more than the engineers in most startups.

So what does Oracle Analytics Cloud look like once you’ve uploaded or pointed it towards some data, and started to visualize and analyze it using the various tools available in the platform? Well, it’s effectively Oracle Business Intelligence 12c with the Data Visualization Option, at least in the configuration I’d chosen; “Self-Service Data Visualization, Preparation and Smart Discovery” along with “Enterprise Data Models”, which translates to Visual Analyzer plus Answers, Dashboards and a full RPD in old on-premise terms. There’s the new home page inherited from Visual Analyzer and a new console that allows you to perform admin functions, define and work with data sources, and download the BI Administration tool if the online modeler is too simplistic for your project.

Oracle Analytics Cloud Console and Home Page

I was able to upload and then analyze some of my cycling stats from Strava, working out that Saturdays in September were my best time for logging the miles and working off some calories in the last twelve months.

Visual Analyzer within Oracle Analytics Cloud

I was also pleased to see that the classic Answers interface was still available, along with the original home page, recent and popular content, and task menus.

Classic Answers interface from OBIEE12c within Oracle Analytics Cloud

And this gets to what the central appeal of Oracle Analytics Cloud really is; it’s full Oracle Business Intelligence 12c, Oracle Data Visualization and Oracle Essbase with all the configuration options, IT appeal and product capabilities that were built-up over the years based on the requirements of large-scale, sophisticated and demanding enterprise and government customers. It’s not Superset or Looker, it makes no sense to anyone who’s not already invested in the Oracle analytics and database ecosystems and it can meet just about any analytics requirement, and then some … and now it all runs natively in the cloud.

Sadly my trial account and promotional credits ran out after a couple of weeks, but if Oracle are ever kind enough to extend that access sometime in the New Year I’d be glad to roll up my sleeves, dust down my old book on how it all works, and give this new cloud incarnation of the product that defined my career for the best part of ten years a further look.


Oracle Analytics Cloud … and the (Welcome) Return of the Enterprise BI Platform was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

It ain’t over till it’s over

It ain’t over till it’s over

Article on ISIL: Fight against ISIL not over yet

Highlights:

  1. ISIL fighters are able to move through parts of Syria that they (the international coalition) are unable to target (meaning Syrian government-controlled areas).
  2. An estimated 1,000 – 2,000 ISIL fighters are left fighting in the desert along the Iraqi-Syrian border.
  3. Maj. Gen. Gedney warned that as ISIL loses control of the territory it held in Syria and Iraq, it will try to “vanish” into the population before transforming itself into a more traditional insurgency (just to state the obvious).
Oracle SQL Developer Data Modeler 17.4

Oracle SQL Developer Data Modeler 17.4

Originally posted on HeliFromFinland:
This is a version I have been waiting for: so many important bug fixes! I downloaded it immediately the evening it was available and finished one of my projects using it. If you have not downloaded it yet, do it today: http://www.oracle.com/technetwork/developer-tools/datamodeler/downloads/index.html Why did I wait for it? Because of, for instance,…
A Global Perspective: The Future of Artificial Intelligence

A Global Perspective: The Future of Artificial Intelligence

The world is being transformed through rapid breakthroughs in data and AI. Every day we hear about more innovations, from small startups to global economies. Nearly every job currently occupied by human labour, from farmers and offshore customer service representatives to taxi drivers, could be given to a robot within the next few decades.

As 2017 draws to a close, let's take a look at how AI is being developed around the globe and what artificial intelligence has in store for us in the coming year.

USA: Utilizing AI for AgTech

Accenture predicts that by 2035, AI may boost U.S. labour productivity by 35 percent. This means an increase in profitability rates by an average of 38 percent across 16 industries.

In North America, tech institutes are developing technology to transform agriculture. Farming may be one of the oldest forms of livelihood known to man, but here you see farmers utilising modern technology to feed the planet.

For example, Abundant Robotics uses AI to harvest firm fruits autonomously, automating harvesting equipment with robotic solutions: literally, crop-harvesting robots.

AgVoice is a startup in Georgia that has found a way to create a natural language processing tool-kit for crop scouts and agronomists. AgVoice's system detects soy fungal ...


Read More on Datafloq
Speaking of North Korea….

Speaking of North Korea….

Spotted this article yesterday on intercepting North Korea: Can U.S. Stealth Fighters Shoot Down North Korea Missiles

This caught my attention in light of our past discussions:

The Pros And Cons Of Shooting Down North Korean Ballistic Missile Tests

A few highlights from the article:

  1. It will be a little while before the F-35 is capable of shooting down North Korean ballistic missiles. There are some “slight tweaks” that have to be made.
  2. There is THAAD; in a recent test, the system in Alaska intercepted a U.S. ballistic missile fired over the Pacific.
  3. There is GMD, which according to Atlantic magazine is 55% effective. Last May it did intercept an ICBM that was launched from 4,200 miles away.
  4. They could develop drones with lasers.
  5. “If North Korea were to launch only one missile at us, we could probably shoot it down…But their new missile could carry some very simple decoys, and it’s not certain that the missile we send out will be able to tell the difference between debris, decoys and a real warhead.”
  6. There are several interesting links in the article.

As we have noted:

North Korean Missile Likely Broke Up on Re-Entry

5 Ways to Recognize What Technology Your Business Needs

5 Ways to Recognize What Technology Your Business Needs

Technology has revolutionized the way businesses carry out their operations today. Startups are now able to compete with larger established companies by taking advantage of the options technology provides to them.

By strategically using the right technology, businesses have been able to significantly cut down on their operational costs and expand their growth exponentially. The following are aspects of technology that businesses should consider when seeking to improve their business:

1. Improved communication

Thanks to the internet, companies nowadays can have streamlined communication either internally or externally at super-fast speeds thus eliminating unnecessary delays which used to be common in the past. Communication is now usually just a click or a call away. With enhanced real-time communication available, businesses can maximize the use of modernized communication methods such as live video conferencing, live chats, emails, and Voice over Internet Protocol (VoIP).

Firms can also use social media platforms like Facebook and Twitter to engage with their audience and carry out their online campaigns at relatively low costs compared to using traditional marketing channels. Efficient communication will also increase your firm's productivity by enabling you to swiftly get your message across to your business partners and clients while eliminating downtimes caused by delayed communication.

2. Better ...


Read More on Datafloq
4 Signs Your Business Needs VR Analytics

4 Signs Your Business Needs VR Analytics

The big problem with most data collection is that we have to make huge assumptions about what it means. So, for example, we assume that when somebody clicks through to a story they are interested in it. We assume that if they leave a page open that they are reading and engaging with it. And we assume that if they view what we want them to see, they are engaged with it.

But those are just assumptions and in the past, the only way to check the truth of them was by fixing people with eye tracking gear. That was obviously a very difficult thing to do, as people aren’t just going to let you fix machines to their temples so that you can actually see where they’re looking.

Or rather, they weren’t going to do that. Of course, with the ever wider introduction of VR that has entirely changed. For when they use VR we have to track where people are looking. How else can we make sure that we know what to show them?

And that means we can now stop making those assumptions. Instead, we can find out what people are actually paying attention to. And as people’s attention ...


Read More on Datafloq
Year Three

Year Three

As of today, the blog is two years old. The question is… where does it go from here, and what do we do with it?

Year Two

The blog now consists of 545 posts and we have 366 comments (that we did not consider to be spam). That comes out to 259 posts and 104 comments last year and 286 posts and 262 comments this year.

The question is… where do we go from here? Right now, our answer is the same as last year, which is to keep on keeping on. Pretty much just keep doing what we are doing. Now, there is much more we could do with the blog, but any major improvement requires an investment of time and money, and…

We have considered bringing in more bloggers, having a paid employee posting daily defense news (so we can compete with the other blogs and news services), and having a paid blogger do more military history material (which we know is of interest to a number of our readers)… but this means our primary business during the day would be maintaining and developing the blog. Our interest is in study and analysis, not journalism. We think there is still a severe shortage of good fact-based analysis of defense affairs. We do not think there is a shortage of journalists and news sites. So for this next year, it does appear that this will continue to be a “not-to-interfere” effort while we pursue our various writing, marketing and analytical efforts.

We wish you all a happy New Year and hope that 2018 will be a great year for you.

Will Apple Catch Up to Its Competitors in Artificial Intelligence?

Will Apple Catch Up to Its Competitors in Artificial Intelligence?

Believe it or not, in the ’80s Time Magazine called Apple “a chaotic mess without a strategic vision and certainly no future.” Since then, Apple has made one of the biggest business comebacks of the last 50 years, achieving a valuation over $900 billion in 2017. This valuation is due in large part to the iPhone. The iPhone does employ AI for Siri and voice recognition — but compared to companies like Alphabet and IBM, Apple hasn’t necessarily been a big name in the AI field.

That very well may change. In late November 2017, the richest company in the world gave Quartz a glimpse of how its deep neural network, VoxelNet, identifies data points from a LiDAR sensor. This is part of Apple’s effort to develop AI for a self-driving car. Originally, it looked like there would eventually be an Apple iCar out driving around on its own cognizance. But the company has since scrapped efforts to produce a car in favor of concentrating on software for what Tim Cook calls “the mother of all AI projects.” To have the best self-driving car, you have to have the best AI.

Will Apple be able to compete in the battle for the ...


Read More on Datafloq
TDI Reports at DTIC

TDI Reports at DTIC

Just as a quick, easy test, I decided to find out which of The Dupuy Institute (TDI) reports are on the Defense Technical Information Center (DTIC) site. Our report list is here: http://www.dupuyinstitute.org/tdipub3.htm

We are a private company, but most of these reports were done under contract for the U.S. government. In my past searches of the DTIC file, I found that maybe 40% of Trevor Dupuy’s HERO reports were at DTIC. So, I would expect that at least a few of the TDI reports would be filed at DTIC.

TDI has 80 reports listed on its site. There are 0 listed on DTIC under our name.

https://publicaccess.dtic.mil/psm/api/service/search/search?&q=%22dupuy+institute%22&site=default_collection&sort=relevance&start=0

There are a significant number of reports listed based upon our work, but a search on “Dupuy Institute” yields no actual reports done by us. I searched for a few of our reports by name (combat in cities, situational awareness, enemy prisoner of war, our insurgency work, our Bosnia casualty estimate) and found four:

https://publicaccess.dtic.mil/psm/api/service/search/search?site=default_collection&q=capture+rate+study

These were four of the eight reports we did as part of the Capture Rate Study. So apparently one of the contract managers was diligent enough to make sure those studies were placed in DTIC (as was our Kursk Data Base), but since then (2001), none of our reports have been placed there.

Now, I have not checked NTIS and other sources, but I have reason to believe that not much of what we have done in the last 20+ years is archived in government repositories. If you need a copy of a TDI report, you have to come to us.

We are a private company. What happens when we decide to close our doors?

5 Ways How Big Data is Boosting the Efficiency of Healthcare Services

5 Ways How Big Data is Boosting the Efficiency of Healthcare Services

The importance of data is limitless in an industry where millions of valuable patient records keep circulating from one end of the system to the other. Converting this data into insight that can later be used for delivering better healthcare services to end users was made possible by the introduction of Big Data technologies in the industry. Gradually, the wave of digitalization transformed the way hospitals and other healthcare systems managed and accessed medical records to offer better care to patients.

Although healthcare companies have been considerably slower than other industries to adopt new technologies like Big Data, its impact on the industry has been noticeable within a short span of time. Leveraging Big Data, companies have witnessed improvement in their performance and have been successful in delivering affordable care services to the growing population.

Here are five ways indicating the positive impacts of Big Data in the healthcare sector.

1. Pointing Out High-Risk Patients

Predictive analytics in the field of medicine is all about using current data to make medical predictions that help hospitals avoid reimbursement and financial penalties and better serve the patients. Application of predictive analytics in the healthcare domain can contribute to improving chronic disease ...


Read More on Datafloq
Big Data and AI: Are They Good For Life Sciences?

Big Data and AI: Are They Good For Life Sciences?

Over the years, financial industries have built a long track record of managing data and applying analytics to optimize customer relationships and develop new services. Fortunately, more and more life science companies have begun to fully embrace, and seize upon, the opportunities of organizing and applying their data in a systematic way, be it for drug development or patient care challenges.

Big data has taken off in a big way. Now, most organizations, irrespective of industry, grapple with quintillions of bytes of data every day. They try hard to figure out an information management strategy that can accelerate the flow of insights, which unfortunately complicates their big data solutions and increases the cost of implementation and upkeep. In such a situation, you should consider hiring a reliable big data consulting firm that understands how important it is to make smart decisions for accelerating business growth.

Significance of Data for Life Science

Data is certainly not a new concept. From pharmaceutical development to medical care, life science firms have started making effective use of data, and as a result notable progress has been made in the efficiencies of drug development and the quality of insights produced at the research ...


Read More on Datafloq
Basements

Basements

Basements appear to be very important in the world of studies and analysis. That is where various obscure records and reports are stored. As the industry gets grayer and retires, significant pieces of work are becoming harder to find. Sometimes the easiest way to find these reports is to call someone you know and ask them where to find it.

Let me give a few examples. At one point, when we were doing an analysis of Lanchester equations in combat modeling, I was aware that Bob McQuie, formerly of CAA, had done some work on the subject. So, I called him. It turns out he had kept a small file of his own work, but he had loaned it to his neighbor as a result of a conversation. So… he reclaimed the file, two of our researchers drove over to his house, he gave us the file, and we still have it today. It turns out that much of his material is also available through DTIC. A quick DTIC search shows the following: https://publicaccess.dtic.mil/psm/api/service/search/search?site=default_collection&q=mcquie

Of particular interest are his benchmark studies. His work on “breakpoints” and his comments on Lanchester equations are not included in the DTIC listing because they were published in Army, November 1987. I have a copy in my basement. Neither is his article on the 3:1 rule (published in Phalanx, December 1989). He also did some work on regression analysis of historical battles that I have yet to locate.

Battle Outcomes: Casualty Rates As a Measure of Defeat

So, some of his work has been preserved. On the other hand, during that same casualty estimation methodologies study we also sent two researchers over to another “gray beard’s” house, and he let them look through his basement. We found the very useful study called Report of the Model Input Data and Process Committee, referenced in my book War by Numbers, page 295. It does not show up in DTIC. We could not have found this study without a visit to his basement. He now lives in Florida, where they don’t have basements, so I assume the remaining boxes of materials he had have disappeared.

I am currently trying to locate another major study that was done by SAIC. So far, I have found one former SAIC employee who has two volumes of the multi-volume study. It is not listed in DTIC. To obtain a complete copy of the study, I am afraid I will have to contact someone else and pay to have it copied. Again, I just happen to know whom to talk to to find out which basement it is stored away in.

It is hard to appreciate the unique efforts that go into researching some of these projects. But there is a sense at this end that as the “gray beards” disappear, reports and research efforts are disappearing with them.

How Automation Will Change Work In 2018

How Automation Will Change Work In 2018

When automation first became prominent — that is, when everyone started to take notice somewhere within the last couple of years — a lot of doomsayers claimed AI was going to take over the workforce. Many still believe that advanced automation and AI systems are going to replace human laborers completely.

Two professors from Oxford named Carl Frey and Michael Osborne took time to analyze over 700 occupations, after which they declared that 69 million US jobs, 47% of the workforce, would be lost to automation. Several news outlets, including CNBC, also reported that many Americans were worried their careers and jobs would suffer because of automation.

The reality is, most of this is not likely to happen. AI and automation will take some opportunities, but they will create new ones as well.

About 9% of US jobs will disappear due to automation in 2018, but an additional 2% will be introduced because of the "automation economy" or the systems involved. The number of jobs affected is much lower than many claimed, not to mention the systems and hardware are doing the opposite: creating work.

But surely there's some amount of change coming, right? You can't just overhaul the entire market with automation, AI, machine learning ...


Read More on Datafloq
How Big Data is Improving Jewellery Business Efficiency and Marketing

How Big Data is Improving Jewellery Business Efficiency and Marketing

Did you know that in today's digital world, over 2.5 zettabytes of data is already in existence? It's no wonder the term 'big data' is becoming one of the latest buzzwords in the technology world, referring to the huge data volumes that flood a business on a daily basis – in structured and unstructured formats.

Thanks to the digital revolution, over 5 billion people globally are using their smartphones to text, call, tweet, and browse. Thus, volumes of data are now presently available for both individuals and businesses. Some companies including Walmart have reported dealing with about a million customer transactions per hour, providing the business with over 2.4 petabytes of data.

Recent statistics also estimate that by 2020, every person will be in a position to create 1.7 megabytes of data per second. This data growth shows that we are not only living in a data-driven world but that the world will become increasingly data-driven in the future. As such, businesses can potentially mine and analyse big data for insights that can lead to informed decisions and strategic business moves.

Benefits of Big Data in Today's Business Organization

We all know the essential role that data plays in a business organisation. The ...


Read More on Datafloq
A Very Merry Christmas to All!

A Very Merry Christmas to All!

Wishing everyone a very merry Christmas!
TDI Friday Read: How Do We Know What We Know About War?

TDI Friday Read: How Do We Know What We Know About War?

The late, great Carl Sagan.

Today’s edition of TDI Friday Read asks the question, how do we know if the theories and concepts we use to understand and explain war and warfare accurately depict reality? There is certainly no shortage of explanatory theories available, starting with Sun Tzu in the 6th century BCE and running to the present. As I have mentioned before, all combat models and simulations are theories about how combat works. Military doctrine is also a functional theory of warfare. But how do we know if any of these theories are actually true?

Well, one simple way to find out if a particular theory is valid is to use it to predict the outcome of the phenomenon it purports to explain. Testing theory through prediction is a fundamental aspect of the philosophy of science. If a theory is accurate, it should be able to produce a reasonably accurate prediction of future behavior.

In his 2016 article, “Can We Predict Politics? Toward What End?” Michael D. Ward, a Professor of Political Science at Duke University, made a case for a robust effort for using prediction as a way of evaluating the thicket of theory populating security and strategic studies. Dropping invalid theories and concepts is important, but there is probably more value in figuring out how and why they are wrong.

Screw Theory! We Need More Prediction in Security Studies!

Trevor Dupuy and TDI publicly put their theories to the test in the form of combat casualty estimates for the 1991 Gulf War, the U.S. intervention in Bosnia, and the Iraqi insurgency. How well did they do?

Predictions

Dupuy himself argued passionately for independent testing of combat models against real-world data, a process known as validation. This is actually seldom done in the U.S. military operations research community.

Military History and Validation of Combat Models

However, TDI has done validation testing of Dupuy’s Quantified Judgement Model (QJM) and Tactical Numerical Deterministic Model (TNDM). The results are available for all to judge.

Validating Trevor Dupuy’s Combat Models

I will conclude this post on a dissenting note. Trevor Dupuy spent decades arguing for more rigor in the development of combat models and analysis, with only modest success. In fact, he encountered significant skepticism and resistance to his ideas and proposals. To this day, the U.S. Defense Department seems relatively uninterested in evidence-based research on this subject. Why?

David Wilkinson, Editor-in-Chief of the Oxford Review, wrote a fascinating blog post, “Why evidence-based practice probably isn’t worth it…,” looking at why practitioners seem to have little actual interest in evidence-based practice. His argument:

The problem with evidence based practice is that, outside of areas like health care and aviation/technology, most people in organisations don’t care about having research evidence for almost anything they do. That doesn’t mean they are not interested in research but they are just not that interested in using the research to change how they do things – period.

His explanation for why this is and what might be done to remedy the situation is quite interesting.

Happy Holidays to all!

Strachan On The Changing Character Of War

Strachan On The Changing Character Of War

The Cove, the professional development site for the Australian Army, has posted a link to a 2011 lecture by Professor Sir Hew Strachan. Strachan, a Professor of International Relations at St. Andrews University in Scotland, is one of the more perceptive and trenchant observers about the recent trends in strategy, war, and warfare from a historian’s perspective. I highly recommend his recent book, The Direction of War.

Strachan’s lecture, “The Changing Character of War,” proceeds from Carl von Clausewitz’s discussions in On War on change and continuity in the history of war to look at the trajectories of recent conflicts. Among the topics Strachan’s lecture covers are technological determinism, the irregular conflicts of the early 21st century, political and social mobilization, the spectrum of conflict, the impact of the Second World War on contemporary theorizing about war and warfare, and deterrence.

This is well worth the time to listen to and think about.

9 Main Code Refactoring Techniques

9 Main Code Refactoring Techniques

Code refactoring is one of the key terms in software development and today I would like to talk about code refactoring techniques that might increase your efficiency!

But first, let’s agree on what code refactoring is. Code refactoring is the process of changing a program’s source code without modifying its external functional behaviour, in order to improve some of the nonfunctional attributes of the software. In other words, code refactoring is the process of clarifying and simplifying the design of existing code without changing its behaviour. Nowadays, agile software development is a must, and agile teams maintain and extend their code a lot from iteration to iteration; without continuous refactoring, this is hard to do. This is because un-refactored code tends to rot: unhealthy dependencies between classes or packages, bad allocation of class responsibilities, far too many responsibilities per method or class, duplicated code, and many other varieties of confusion and clutter. So the advantages include improved code readability and reduced complexity; these can improve source-code maintainability and create a more expressive internal architecture.
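As a minimal sketch of the idea (the functions here are my own illustration, not from the article), an “extract method” refactoring pulls each responsibility out into a small, named function while leaving the external behaviour identical:

```python
# Before: one function mixes validation, discount logic and formatting.
def invoice_total(prices, is_member):
    total = 0.0
    for p in prices:
        if p < 0:
            raise ValueError("price cannot be negative")
        total += p
    if is_member:
        total *= 0.9  # 10% member discount
    return round(total, 2)

# After: each responsibility extracted into its own function.
# Same inputs, same outputs -- that is the definition of refactoring.
def validate(prices):
    for p in prices:
        if p < 0:
            raise ValueError("price cannot be negative")

def apply_member_discount(total, is_member):
    return total * 0.9 if is_member else total

def invoice_total_refactored(prices, is_member):
    validate(prices)
    return round(apply_member_discount(sum(prices), is_member), 2)
```

The payoff is that each piece can now be read, tested and changed in isolation, which is exactly the improved maintainability the paragraph above describes.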

Two of the most influential people in software development of recent times, Martin Fowler and Kent Beck, wrote the book on the subject, “Refactoring: Improving the Design of Existing Code”. I highly recommend ...


Read More on Datafloq
19 Code Smells That Are Most Common

19 Code Smells That Are Most Common

In Apiumhub we always focus on quality and best practices in software development. When we don’t start working on a project from scratch, we very often find code smells, and this article is about them. Martin Fowler explained very well what a code smell is: a surface indication that usually corresponds to a deeper problem in the software system. The term was first coined by Kent Beck while helping Martin with the Refactoring book, which I highly recommend reading. If you are interested in this topic, here you may find a list of other very useful software development and software architecture books.

One of the nice things about code smells is that it’s easy for inexperienced people to spot them, even if they don’t know enough to evaluate whether there’s a real problem or how to correct it. Many companies organize a “code smell of the week” and ask developers to look for the smell and bring it up with the senior members of the team. Doing it one smell at a time is a good way of gradually teaching people on the team to be better programmers.
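To make the “surface indication of a deeper problem” idea concrete, here is a small sketch of one of the easiest smells to spot, duplicated code (the example is mine, not from the article): the same rule copy-pasted in two places means a future change must be made twice, and will eventually be missed once.

```python
# Smell: the same VAT computation is duplicated in two functions,
# so the rate lives in two places at once.
def net_price(price):
    return price + price * 0.21

def invoice_line_total(price, qty):
    return (price + price * 0.21) * qty

# Fix: extract the shared rule so it lives in exactly one place.
VAT_RATE = 0.21

def with_vat(price):
    return price * (1 + VAT_RATE)

def net_price_fixed(price):
    return with_vat(price)

def invoice_line_total_fixed(price, qty):
    return with_vat(price) * qty
```

Spotting the repetition requires no deep expertise, which is why, as the paragraph above notes, smells are a good teaching device for less experienced team members.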

Common Code Smells

Developers are typically trained to look out for logical errors that have been accidentally introduced ...


Read More on Datafloq
How Blockchain Will Disrupt the Content Industry

How Blockchain Will Disrupt the Content Industry

From its humble beginnings with bitcoin, blockchain has morphed into a technology that has the potential to affect nearly every industry, from banking to online publishing. At dscvr.it, we also see the enormous potential of blockchain, which is why we are developing a decentralised collaboration platform that will offer solutions to fake news, clickbait and plagiarism. There are several ways blockchain can solve problems such as plagiarism and spam; let’s see how blockchain will dramatically improve content as we know it, since the ripple effects of blockchain are just beginning to be felt.

Problems with Existing Content

Ever since the first online article or video was published, there have been a plethora of problems that content creators face, no matter how secure or professional the sites may be. Almost any content creator has experienced their content being copied or ripped by others, without their consent. Also, since the start of Trump’s Presidency, fake news is, unfortunately, the new normal. According to Bloomberg Businessweek, even medical journals are now experiencing problems with fake news. The open-access system of allowing the internet to distribute high-quality research to a wider audience has allowed unreliable content into the mix.

Spam also remains a problem, since people keep ...


Read More on Datafloq
Webinar Wrap-up: When PPC Best Practices Fail | Simplilearn

Webinar Wrap-up: When PPC Best Practices Fail | Simplilearn

Pay-per-click (PPC) online advertising has been popular for more than a decade—long enough for ample best practices to have been proven and widely published. However, sometimes even the best best practices fall short. This isn’t just a matter of bad luck. Sometimes following a particular PPC best practice causes bad results, yet for ver...Read More.
Digital Transformation Through Big Data, Analytics & Machine Learning

Digital Transformation Through Big Data, Analytics & Machine Learning

Can you imagine collecting data from 10,000 different data sources, comprising over 65 billion records?

That’s exactly what RELX Group, a leading global information and analytics company, is doing to evolve with the Intelligent World. They have undergone a digital transformation to adapt to the ever-changing digital landscape. They are achieving transformation through their control over Big Data and Analytics, and by using Machine Learning technology.

I spoke to Vijay Raghavan, the CTO for the LexisNexis Risk Solutions and Reed Business Information divisions at RELX Group, to find out how they help other organizations to make more informed, data driven decisions. 

Whether RELX Group is working with a financial institution that needs to approve a mortgage loan or with clients in entirely different industries, it is leveraging Big Data, Analytics, and Machine Learning to help them address their very specific issues by creating or enhancing algorithms in order to garner insights from data.

Our world is being shaped by our technology. Data must be leveraged to make decisions so that businesses can evolve alongside the rapid pace of technology.
...


Read More on Datafloq
Everything You Need to Know for a Hadoop Developer Interview | Simplilearn

Everything You Need to Know for a Hadoop Developer Interview | Simplilearn

Everything You Need to Know for a Hadoop Developer Interview | Simplilearn The number of companies adopting Big Data initiatives has been ramping up over the last few years, and they are producing tangible results. A recent Harvard Business Review study reveals that more than 80% of executives characterize their Big Data investments as successful. Hadoop developers are leading the way for many of these big data deployment...Read More.
How to close a project? | Simplilearn

How to close a project? | Simplilearn

How to close a project? | Simplilearn How to close a project? Project Manager’s involvement at the closing stage: Many seem to think that project closing is not an important process in project management, which is however not the fact. Project closing is as important as other processes in project management. Until and unless your project has been closed with the planned procedure...Read More.
4 Young App Developers Who Became Millionaires | Simplilearn

4 Young App Developers Who Became Millionaires | Simplilearn

4 Young App Developers Who Became Millionaires | Simplilearn   There are two types of youth. The first are those who spend the prime years of their life scuttling between entry- to mid-level jobs. They have average Joes for role models, follow a beaten path, and rarely dream big. Perhaps, one day, they make it big and make millions. Or perhaps not. And then there is the other type. The kind that have...Read More.
20 Most Popular Data Science Interview Questions | Simplilearn

20 Most Popular Data Science Interview Questions | Simplilearn

20 Most Popular Data Science Interview Questions | Simplilearn Harvard Business Review referred to it as “The Sexiest Job of the 21st Century.” Glassdoor placed it in the first position on the 25 Best Jobs in America list. According to IBM, demand for this role will soar 28% by 2020. It should come as no surprise that in the new era of Big Data and machine learning, data scientists are becoming ...Read More.
Project Documentation and its Importance | Simplilearn

Project Documentation and its Importance | Simplilearn

Project Documentation and its Importance | Simplilearn Project management leaders are often asked a common question: what is the importance of project documentation and how can I ensure I’m performing the function right. There’s no doubt that project documentation is a vital part of project management. It is substantiated by the essential two functions of documentation: to make sure that pr...Read More.
21 Reasons You Should Learn R, Python, and Hadoop | Simplilearn

21 Reasons You Should Learn R, Python, and Hadoop | Simplilearn

21 Reasons You Should Learn R, Python, and Hadoop | Simplilearn As Big Data continues to grow in importance at Software as a Service (SaaS) companies, the field of Big Data analytics is a safe bet for any professional looking for a fulfilling, high-paying career. If you’re considering starting or advancing your career in the field of Big Data and data science, we’ve described three popular programm...Read More.
9 Reasons you should consider a CSM Certification | Simplilearn

9 Reasons you should consider a CSM Certification | Simplilearn

9 Reasons you should consider a CSM Certification | Simplilearn Today’s trend toward agile software environments has created a surge in demand for professionals with expertise in lean and agile methodologies who can successfully manage and execute agile projects. A Scrum Master Certification not only trains a professional in these techniques, but also serves as tangible, demonstrable proof to employers of...Read More.
5 Reasons to Become a Full-Stack Digital Marketer | Simplilearn

5 Reasons to Become a Full-Stack Digital Marketer | Simplilearn

5 Reasons to Become a Full-Stack Digital Marketer | Simplilearn It is a universal fact that people have begun to move from analog to digital. There has been a massive increase in the amount of digital content that is consumed on a daily basis via laptops, computers, smartphones, and other smart devices. Organizations have begun to realize that their marketing strategies need to adapt quickly to keep up with thi...Read More.
3 Digital Marketing Trends to Watch for in 2018 | Simplilearn

3 Digital Marketing Trends to Watch for in 2018 | Simplilearn

3 Digital Marketing Trends to Watch for in 2018 | Simplilearn It’s that time of the year when all the digital marketing minds attempt to predict future trends. Before we dive in, it’s important to note that rarely does a trend come out of nowhere and make an entrance at the start of a new year. Trends are typically a long time coming, and in the marketing world, they take more than a year to real...Read More.
What Your Data Says About Your eCommerce Business

What Your Data Says About Your eCommerce Business

Running an online business as a retailer is a monumental responsibility. Every single facet of your store, including your page design, branding, and content marketing, must be tailor-made to your specific vision and your target market.

The most important task, however, is to make sure your business stays on the path to growth. This requires you to make data-driven decisions on a consistent basis: leave no room for guesswork, and let the data show your next move.

Without further ado, below are some of the most important pieces of data in eCommerce and what you should do with them:

1. Low Conversion Rate

Generating traffic to your online store is only one side of the coin. You also need to design an experience that will convert visitors into paying customers.

According to statistics, the median conversion rate of eCommerce businesses fluctuates at around only 2.95% globally. That is far from ideal, especially considering how difficult it is to generate traffic to your site in the first place.
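To make the metric concrete, conversion rate is simply completed orders divided by total visitors. A minimal sketch, with invented numbers for a hypothetical store:

```python
def conversion_rate(orders: int, visitors: int) -> float:
    """Return the conversion rate as a percentage of visitors."""
    if visitors == 0:
        return 0.0
    return orders / visitors * 100

# Hypothetical store: 15,000 monthly visitors, 443 completed orders.
rate = conversion_rate(443, 15_000)
print(f"{rate:.2f}%")  # lands close to the ~2.95% global median cited above
```

Tracking this one number over time, and per traffic source, is usually the first step toward finding out which part of the funnel is leaking.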



Image: Smart Insights

Unfortunately, conversion rates depend on a variety of different factors, and it can be tricky to pinpoint which ones affect you. You may need to build social proof, improve the performance of your site, or perform a ...


Read More on Datafloq
Why Continued Development of AI Will Rely on Public Opinion and Perception

Why Continued Development of AI Will Rely on Public Opinion and Perception

At the end of October 2017, the Kingdom of Saudi Arabia raised quite a few eyebrows by becoming the first country in the world to grant citizenship to a robot. Business Insider reports that this “empty-eyed humanoid” is named Sophia, and was produced by Hanson Robotics. You can watch her full presentation at the Future Investment Initiative, with moderator Andrew Ross Sorkin leading the conversation.

During the presentation, Sorkin alludes to the public’s uneasy stance on AI. Sophia replies: "You've been reading too much Elon Musk. And watching too many Hollywood movies… Don't worry, if you're nice to me, I'll be nice to you. Treat me as a smart input output system."

Sorkin’s (and the public’s) uneasiness is not necessarily unwarranted. In March 2016, Sophia was already making headlines for an interview in which she stated, “OK. I will destroy humans.” However, Sophia’s suggestion that the moderator’s perception, as well as the perception of the public at large, has been tainted by popular culture is something that we need to take into account as well.

While it’s important to take great precautions in the face of great power, it’s also easy to add to an alarmist echo chamber when it comes to ...


Read More on Datafloq
Problems With Structural Bias With Big Data Automation Models

Problems With Structural Bias With Big Data Automation Models

Automation is not a fad. It is the future of business models in almost every industry. Unfortunately, automation has introduced new risks that brands must prepare for.

One of the biggest concerns that has been raised in recent months is the risk of institutional bias and unintentional discrimination. Brands may make decisions based on demographic information drawn from limited sample sizes or flawed data sets.

This can lead to some problems. Brands must understand the challenges of structural bias in their data.

What problems can biased data create?

Brands are discovering a couple of different ways that biased data can affect their business models. Here are some of the issues they must be prepared to address.

Allegations of employment discrimination based on flawed data

A couple of months ago, James Damore, a Google employee, stirred up quite a controversy after publishing his Google manifesto. He cherry-picked some studies to present data that yielded an unfavourable view of female employees at Google. Wired’s Megan Molteni states that this shows the problems that poorly selected data can create in an increasingly diverse workforce:

“It wasn’t a screed or a rant, but, judging by his document, Damore clearly feels that some basic truths are getting ignored—silenced, even—by Google’s bosses. So in ...


Read More on Datafloq
3 Ways How the Internet of Things is Disrupting the Supply Chain

3 Ways How the Internet of Things is Disrupting the Supply Chain

The Internet of Things is rapidly disrupting many industries and areas of business, including supply chains, in ways ranging from smart warehousing to asset tracking and fleet management. Here are three ways the IoT is disrupting supply chains across the globe:

1. Smart Warehousing

The future of warehousing lies in the smart warehouse. Similar in concept to the modern smart home, these facilities often include features like time-controlled lighting, automated thermostats and industrial robotics – all of which are connectable via the IoT.

But smart warehousing goes beyond the basic needs of a residential structure. With such high levels of automation, smart warehouses are almost entirely self-sufficient. This has raised concern amongst some general labourers, as their fears of being replaced by robotic workers are seemingly coming true.

Proponents of industrial automation and general IT are more optimistic. By taking the stance that job-site robots will work alongside their human counterparts, it’s easy to see how increasing the amount of industrial automation can be a good move for companies in the long run.

2. Asset Tracking

Many of the recent breakthroughs in asset tracking focus on Global Positioning System (GPS) and radio frequency identification (RFID) technology. While the former is ideal when tracking materials, goods or other merchandise, the ...


Read More on Datafloq
8 Trends of the Internet of Things for 2018

8 Trends of the Internet of Things for 2018

The Internet of Things (IoT) is growing rapidly, and 2018 will be a fascinating year for the IoT industry. IoT technology continues to evolve at an incredibly rapid pace. Consumers and businesses alike are anticipating the next big innovation. They are all set to embrace the ground-breaking impact of the Internet of Things on our lives, like ATMs that report crimes around them, forks that tell you if you are eating too fast, or an IP address for each organ of your body that doctors can connect to and check. In 2018, IoT will see tremendous growth in all directions; the following 8 trends are the main developments we predict for next year:



Trend 1 —Lack of standardization will continue

Digitally connected devices are fast becoming an essential part of our everyday lives. Although the adoption of IoT will be large, it will most likely be slow. The primary reason for this is lack of standardization.

Though industry leaders are trying to develop specified standards and get rid of fragmentation, it will still exist. There will be no clear standards in the near future of IoT unless a well-respected organization like the IEEE steps in and leads the way, or the government imposes restrictions on doing business ...


Read More on Datafloq
Fireside Chat: Edge Computing Vs. Cloud Computing | Simplilearn webinar starts 13-12-2017 21:30

Fireside Chat: Edge Computing Vs. Cloud Computing | Simplilearn webinar starts 13-12-2017 21:30

Is the era of cloud computing coming to an end? Experts predict that cloud computing is gradually making way for the next big thing: Edge. NASSCOM Product Connect and Simplilearn together present this live fireside chat on edge computing and how it stacks up against cloud. Tune in to watch Bernard Golden, Cloud Computing Expert and Anand Nara...Read More.
Expert Webinar: Practical Risk Management Steps for the Threat Hunter | Simplilearn webinar starts 13-12-2017 23:00

Expert Webinar: Practical Risk Management Steps for the Threat Hunter | Simplilearn webinar starts 13-12-2017 23:00

What is threat hunting and why is this field gaining popularity among security organizations? Threat hunting emphasizes a proactive approach to security and encourages organizations to anticipate and manage potential security breaches and hacks. Join Dr. James Stanger, Sr Director of Products at CompTIA, in a live webinar as he walks us throu...Read More.
TechCast: Paradigm Shift: From ‘Time Off to Learn’ to Learning Anytime | Simplilearn webinar starts 13-12-2017 15:00

TechCast: Paradigm Shift: From ‘Time Off to Learn’ to Learning Anytime | Simplilearn webinar starts 13-12-2017 15:00

The business world is in a state of technological revolution. Advances in Artificial Intelligence and Machine Learning have made several science fiction tropes like self-driving cars and robots a reality. Join Dr. Madana Kumar, Vice President and Global Head of Learning and development, UST Global, in an interactive TechCast as he discusses the ...Read More.
Elasticsearch for Dummies

Elasticsearch for Dummies

Have you heard about the popular open source tool used for searching and indexing that is used by giants like Wikipedia and LinkedIn? Even if not, I’m pretty sure you’ve heard of it in passing.

Yes, I’m talking about Elasticsearch. In this blog, you’ll get to know the basics of Elasticsearch, its advantages, how to install it, and how to index documents using Elasticsearch.

What is Elasticsearch?

Elasticsearch is an open-source, enterprise-grade search engine that can power extremely fast searches supporting all kinds of data discovery applications. With Elasticsearch we can store, search, and analyse big volumes of data quickly and in near real time. It is generally used as the underlying search engine powering applications that have simple or complex search features and requirements.

Advantages of Elasticsearch

BUILT ON TOP OF LUCENE – Being built on top of Lucene, it offers the most powerful full-text search capabilities.

DOCUMENT-ORIENTED – Being built on top of Lucene, it stores complex entities as structured JSON documents and indexes all fields by default, providing higher performance.

SCHEMA-FREE – It stores large quantities of semi-structured (JSON) data in a distributed fashion. It also attempts to detect the data structure, index the data present, and make it search-friendly.

FULL-TEXT SEARCH – Elasticsearch performs linguistic searches against documents and returns the documents that match the ...
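To make the indexing idea concrete, here is a toy, from-scratch sketch of the inverted-index concept that underlies Lucene and Elasticsearch. It illustrates the idea only; it is not the actual Elasticsearch API:

```python
from collections import defaultdict

def build_index(docs: dict[str, str]) -> dict[str, set[str]]:
    """Map each lowercased term to the set of document ids containing it."""
    index: dict[str, set[str]] = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return ids of documents containing every term in the query."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        results &= index.get(term, set())  # intersect posting lists
    return results

docs = {
    "1": "Elasticsearch powers fast full-text search",
    "2": "Lucene provides full-text search capabilities",
    "3": "JSON documents are indexed by default",
}
index = build_index(docs)
print(sorted(search(index, "full-text search")))  # ['1', '2']
```

Because every term maps directly to the documents containing it, lookups stay fast regardless of how many documents are stored; Elasticsearch adds analysis (stemming, tokenisation), scoring, and distribution on top of this basic structure.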


Read More on Datafloq
The Principle Of Mass On The Future Battlefield

The Principle Of Mass On The Future Battlefield

Men of the U.S. Army 369th Infantry Regiment “Harlem’s Hellfighters,” in action at Séchault on September 29, 1918 during the Meuse-Argonne Offensive. [Wikimedia]

Given the historical trend toward battlefield dispersion as a result of the increasing lethality of weapons, how will the principle of mass apply in future warfare? I have been wondering about this for a while in the context of the two principal missions the U.S. Army must plan and prepare for, combined arms maneuver and wide area security. As multi-domain battle advocates contend, future combat will place a premium on smaller, faster combat formations capable of massing large amounts of firepower. However, wide area security missions, such as stabilization and counterinsurgency, will continue to demand significant numbers of “boots on the ground,” the traditional definition of mass on the battlefield. These seemingly contradictory requirements are contributing to the Army’s ongoing “identity crisis” over future doctrine, training, and force structure in an era of budget austerity and unchanging global security responsibilities.

Over at the Australian Army Land Power Forum, Lieutenant Colonel James Davis addresses the question of generating mass in combat in the context of the strategic challenges that army faces. He cites traditional responses by Western armies to this problem: “Regular and Reserve Force partnering through a standing force generation cycle, indigenous force partnering through deployed training teams and Reserve mobilisation to reconstitute and regenerate deployed units.”

Davis also mentions AirLand Battle and “blitzkrieg” as examples of tactical and operational approaches to limiting the ability of enemy forces to mass on the battlefield. To this he adds “more recent operational concepts, New Generation Warfare and Multi Domain Battle, [that] operate in the air, electromagnetic spectrum and cyber domain and to deny adversary close combat forces access to the battle zone.” These newer concepts use Cyber Electromagnetic Activities (CEMA), Information Operations, long range Joint Fires, and Robotic and Autonomous systems (RAS) to attack enemy efforts to mass.

The U.S. Army is moving rapidly to develop, integrate and deploy these capabilities. Yet, however effectively new doctrine and technology may influence mass in combined arms maneuver combat, it is harder to see how they can mitigate the need for manpower in wide area security missions. Some countries may have the strategic latitude to emphasize combined arms maneuver over wide area security, but the U.S. Army cannot afford to do so in the current security environment. Although conflicts emphasizing combined arms maneuver may present the most dangerous security challenge to the U.S., contingencies involving wide area security are far more likely.

How this may be resolved is an open question at this point in time. It is also a demonstration as to how tactical and operational considerations influence strategic options.

American Telecom Moguls May Now Destroy The Internet Thanks To The FCC

American Telecom Moguls May Now Destroy The Internet Thanks To The FCC

Net neutrality is once again a hot topic in the mainstream media and internet forums thanks to the recent decision by FCC Chairman Ajit Pai to end the enforcement of net neutrality. The decision is deeply unpopular with the majority of internet users, and demand for net neutrality is even crossing party lines in America's divided political culture.

Many fear that our free and unregulated internet will come under attack by the large telecommunications monopolies that control the majority of the country's internet users. On the other hand, some fear that handing over control to the FCC sets a dangerous precedent, since the agency is known to be harsh in its regulation of other information media. As an individual, you can secure your data by using an overseas VPN network, like those offered by ExpressVPN, in order to circumvent Orwellian control over your internet usage.

Controlling ISP Monopolies

AT&T was the first example of a telecommunications monopoly, and it had a hold on the nation's infrastructure during the 1980s. Antitrust lawsuits eventually forced AT&T to dissolve its assets, though it later went on into the computer market as a result. ...


Read More on Datafloq
How Small Business Owners can Exploit Big Data Analytics

How Small Business Owners can Exploit Big Data Analytics

Big data has previously been utilised by big businesses with such success that small businesses have been compelled to embrace it as well. Big data has become an essential part of business, and it has been integrated with other factors of production such as hard assets and human capital. Big data comprises large pools of data, especially concerning the market and consumers. Business owners use this data to carry out analysis of the market and make better decisions. Most small business owners tend to pay little attention to big data analytics as a resource. However, there has been increasing interest among small businesses in utilising big data as a resource in building their business models.

How Small Businesses Could Use Big Data Solutions for Their Businesses

Big data is a resource for small businesses to exploit. Every small business owner can expect to benefit from big data analytics in the following ways:

• Helps in making sales funnels that are more efficient. A sales funnel comprises the various buying stages that a particular company leads its customers through. Efficient sales funnels are important for small businesses, since you can use them to direct more visitors to your ...


Read More on Datafloq
How Technological Advances Are Transforming the Medical Field

How Technological Advances Are Transforming the Medical Field

A wide variety of innovations in today’s digital world are revolutionizing the healthcare industry. Technology in the medical field is here to stay. It is no longer a secret that technology has impacted our day-to-day lives in one way or another. Until recently, the healthcare industry had mostly been unaffected by the technological advancements that characterize the digital age. However, the wave of digital transformation is slowly making inroads into the medical field. Technology has carved out a society where patients can have better treatment, better recovery, and improved lives. From safe deliveries for expectant mothers and effective cancer treatment to the management of cardiac conditions, doctors can now save millions of lives globally.

Doctors will be able to tackle health problems better and in a cost-effective manner as biomedical research continues to improve. The following are the technological advances that are making a significant change in medicine.

Medical equipment technology

The integration of innovations in the healthcare industry has led to the benefit of an improved quality of life for patients. Medical technology has led to the development of equipment that can perform non-invasive surgeries reducing the time required by the patient to heal. More comfortable scanning devices and ...


Read More on Datafloq
TDI Friday Read: The Lanchester Equations

TDI Friday Read: The Lanchester Equations

Frederick W. Lanchester (1868-1946), British engineer and author of the Lanchester combat attrition equations. [Lanchester.com]

Today’s edition of TDI Friday Read addresses the Lanchester equations and their use in U.S. combat models and simulations. In 1916, British engineer Frederick W. Lanchester published a set of calculations he had derived for determining the results of attrition in combat. Lanchester intended them to be applied as an abstract conceptualization of aerial combat, stating that he did not believe they were applicable to ground combat.

Due to their elegant simplicity, U.S. military operations researchers nevertheless began incorporating the Lanchester equations into their land warfare computer combat models and simulations in the 1950s and 60s. The equations are the basis for many models and simulations used throughout the U.S. defense community today.
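For reference, Lanchester’s equations for aimed fire (the “square law”) state that each side’s losses are proportional to the size of the opposing force: dA/dt = −β·B and dB/dt = −α·A. A minimal numeric sketch (the coefficients and force sizes are arbitrary, for illustration only) shows why these equations reward concentration of force so heavily:

```python
def lanchester_square(a0: float, b0: float, alpha: float, beta: float,
                      dt: float = 0.001) -> tuple[float, float]:
    """Integrate dA/dt = -beta*B, dB/dt = -alpha*A until one side reaches zero.

    alpha is the effectiveness of each A unit; beta that of each B unit.
    Simple simultaneous Euler steps are accurate enough for illustration.
    """
    A, B = a0, b0
    while A > 0 and B > 0:
        A, B = A - beta * B * dt, B - alpha * A * dt
    return max(A, 0.0), max(B, 0.0)

# Equal per-unit effectiveness, but side A has a 2:1 numerical advantage.
A_left, B_left = lanchester_square(2000, 1000, alpha=0.05, beta=0.05)
# The square law predicts A wins with about sqrt(2000^2 - 1000^2) ~ 1732 survivors.
print(round(A_left), round(B_left))
```

Under the square law, doubling a force far more than doubles its effective combat power, which is precisely the elegant property that made the equations attractive to modelers, and equally the kind of prediction that has resisted validation against real-world combat data.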

The problem with using Lanchester’s equations is that, despite numerous efforts, no one has been able to demonstrate that they accurately represent real-world combat.

Lanchester equations have been weighed….

Really…..Lanchester?

Trevor Dupuy was critical of combat models based on the Lanchester equations because they cannot account for the role behavioral and moral (i.e. human) factors play in combat.

Human Factors In Warfare: Interaction Of Variable Factors

He was also critical of models and simulations that had not been tested to see whether they could reliably represent real-world combat experience. In the modeling and simulation community, this sort of testing is known as validation.

Military History and Validation of Combat Models

The use of unvalidated concepts, like the Lanchester equations, and unvalidated combat models and simulations persists. Critics have dubbed this the “base of sand� problem, and it continues to affect not only models and simulations, but all abstract theories of combat, including those represented in military doctrine.

Wargaming Multi-Domain Battle: The Base Of Sand Problem

Next-generation Business Process Management (BPM)—Achieving Process Effectiveness, Pervasiveness, and Control

Next-generation Business Process Management (BPM)—Achieving Process Effectiveness, Pervasiveness, and Control



The range of what we think and do is limited by what we fail to notice. And because we fail to notice that we fail to notice there is little we can do to change until we notice how failing to notice shapes our thoughts and deeds.
—R.D. Laing

Amid the hype surrounding technology trends such as big data, cloud computing, or the Internet of Things, for a vast number of organizations, a quiet, persistent question remains unanswered: how do we ensure efficiency and control of our business operations?

Business process efficiency and proficiency are essential ingredients for ensuring business growth and competitive advantage. Every day, organizations are discovering that their business process management (BPM) applications and practices are insufficient to take them to higher levels of effectiveness and control.

Consumers of BPM technology are now pushing the limits of BPM practices, and BPM software providers are urging the technology forward. So what can we expect from the next generation of BPM applications and practices?

BPM Effectiveness Via Automation

Effective business process management software can help you keep efficient and accurate track of your business processes. Mihai Badita, senior business analyst at UiPath, a software company that offers solutions for automating manual business processes, said, “We estimate that around 50 to 60 percent of tasks can be automated, for the time being.”

This is a bold but not unexpected statement from a relatively new company that appears to rival established robotic process automation software companies such as Blue Prism, Automation Anywhere, and Integrify—the latter offering an interesting workflow automation solution that can automate the process of collecting and routing requests—as well as market-leading BPM software providers such as Appian and Pegasystems. According to the Institute for Robotic Process Automation (IRPA), process automation can generate cost savings of 25 to 50 percent and enable business process execution on a 24/7 basis, 365 days a year.

Aside from the obvious effects that automation might have on business processes, such as cost savings and freeing up time and resources, business process automation can help many organizations address repetitive tasks that involve a great deal of detail. Many delays during business process execution are caused by these manual and repetitive tasks, and bottlenecks can arise when decisions need to be made manually. Such processes could be automated and executed entirely without human intervention.
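As a trivial illustration of the point (the rule names, fields, and thresholds here are invented for the example), a routine approval decision governed by well-defined business rules can be executed without human intervention, with only the exceptions routed to a person:

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    vendor: str
    amount: float
    po_matched: bool  # invoice matches an existing purchase order

def auto_approve(inv: Invoice, limit: float = 5000.0) -> str:
    """Apply the business rules a human reviewer would otherwise apply manually."""
    if not inv.po_matched:
        return "route-to-human"   # exception: needs manual review
    if inv.amount <= limit:
        return "approved"         # routine case: handled automatically
    return "route-to-human"       # above approval limit: escalate

queue = [
    Invoice("Acme", 1200.0, True),
    Invoice("Globex", 9800.0, True),
    Invoice("Initech", 300.0, False),
]
print([auto_approve(inv) for inv in queue])
# → ['approved', 'route-to-human', 'route-to-human']
```

The value is exactly as the paragraph describes: the routine, well-specified majority of cases flows through with no delay, while human attention is reserved for the genuine exceptions.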

Process robots are a set of specific software modules capable of capturing information from different systems, manipulating data, and connecting with systems for processing one or multiple transactions. Of course, it’s important to consider the role of effectively training these process robots—including programming and implementing them—to ensure efficiency and precision, making sure business rules are well-defined even before this training to ensure success of the automation strategy.

There are indications that automation will grow in the BPM arena in the coming years, with the incorporation of improved advanced machine learning techniques and artificial intelligence algorithms.

BPM Pervasiveness Through Mobility, Development, and the Cloud

Mobile technology affects perhaps no other component of the enterprise software stack as strongly as BPM. The first mobility goal of every organization has been to enable employees involved in all stages of every business process to operate independently, unrestricted by location and time. A user with a new purchase order to submit, confirm, or authorize should be able to do so using a mobile device no matter where he or she is located or what time it is.

To address security and privacy concerns and to meet specific governance and business requirements, companies realize it is imperative to take this effective yet simple solution, the mobile app interaction schema, to the next level of integration.

Organizations are recognizing the need for increased enterprise software integration of BPM routines at all levels, and as a result they are taking a wider approach to mobile adoption. Many organizations are taking further steps to develop and deploy custom mobile solutions, and many if not all of those deployments involve business process improvements and the ability to integrate with the rest of the enterprise software stack. A study from MGI Research notes that, at the time of the study, 75 percent of all companies reported a mobile apps development cycle of nine months or less.

With this trend, many BPM software providers are already offering customers the ability to accelerate the development of mobile and custom process-oriented applications, with development tools that can either avoid or minimize the need for coding. They can also offer visual and modular components to speed development, with varying degrees of integration and compliance with internal IT regulations for security, privacy, and governance. To mention just a couple, companies such as South Africa-based K2 and French company W4, now part of Itesoft, have developed capabilities well beyond traditional BPM features for modeling and executing business processes, allowing organizations to develop fully customizable process-oriented business applications.

Another component in the provision of pervasive business processes is the development of process-oriented applications with a high degree of integration with the various systems of record (ERPs, CRMs, and others) to effectively improve the way users move across business processes and interact with existing systems. Companies such as Kofax, with its process automation offerings, aim to enable organizations to develop so-called smart process applications (SPAs). These focus on process-based applications that integrate well with existing systems and can be embedded to work seamlessly in different operating and platform environments, providing the ability to execute business processes from the user's platform and device of choice while preserving data accuracy and consistency across platforms.

Other important factors of a more pervasive BPM framework have to do, respectively, with the integration of BPMs mobile capabilities within larger corporate mobile strategies and solutions, including enterprise mobile management (EMM) or enterprise mobile application development platforms (MADPs) and, of course, the adoption of corporate business process management in the cloud.

Interestingly, some BPM providers are rapidly increasing their ability to incorporate more control and management capabilities to mobile app environments, such as improved security and role administration. Without being a substitute for the previous solutions mentioned, this can be an effective first step in encouraging corporate BPM apps development.

With regards to cloud adoption, aside from the lower costs and faster return on investment already discussed, the possibility that specialized service providers can take care of the development and administration of a reliable and secure environment can, within many organizations, encourage rapid and effective development of mobile and embeddable process-oriented applications.

Not BI Versus BPM, But BI and BPM

Software companies have now realized that business intelligence also needs to be process-oriented. One example of this new direction came when Swedish enterprise software provider IFS acquired a company called VisionWaves. VisionWaves, now the IFS Enterprise Operational Intelligence (EOI) offering, is an interesting product that aims to offer organizations a wide view of the state of the business, via a corporate cockpit that combines views and analysis of process and performance within a single environment.

This signals an increasing interest in process and performance within the software industry. The need for information and the speed of business can force operational data analysis to run at different paces, creating silos that work out of step with one another and sometimes make the overall picture difficult to understand.

Some organizations are realizing that as the use of analytics becomes more important, its effectiveness and impact depend on its ability to feed into actual decision making at all levels. The need for information never wavers—its value remains and even increases—but the need for collaboration, process control, and performance monitoring also increases at the point when risk mitigation, opportunity identification, and actual informed decisions are to be made.

In order to improve business operations through the use of analytics, business intelligence (BI) needs to be naturally process-oriented, embedded within a user’s operational environment to provide collaboration and synergy and be, of course, efficient and fast enough to provide information in real-time.

Vitria, with its operational intelligence approach; Kofax, with its process intelligence analytics; and Salient, with its Collaborative Intelligence Suite, all aim to provide users with a process-centric view of data, infusing analytics right in the trenches of business operations.

Last but not least, something worth mentioning—and that in my view has great potential for improving the synergy between BI/analytics and BPM—has to do with recent efforts and developments within the decision-making process of an organization. This includes the recent announcements of the publication of the Decision Model and Notation (DMN), an industry standard modeling notation for decision management and business rules by the Object Management Group (OMG).

Widespread use of more formal methods for decision management can certainly have a big impact on the way organizations design the use of analytics directly involved in decision making at different levels and aspects of an organization, gaining control, measurability, and business operations effectiveness.

Conclusions—BPM for the Future

Never before has there been such an accumulated effort—from vendors incorporating new technology within BPM solutions, to user and professional groups modernizing BPM practices—to increase operational efficiency in organizations. Still, the challenges remain: achieving effective collaboration and communication of insights, obtaining an efficient analytical view of the entire organization, and closing important operational gaps, including those between technology and business.

As we noted in the beginning of this look at business process management and automation, the range of what we think and do is limited by what we fail to notice. There is also a lot of value to be unveiled within processes, if we optimize them properly and take advantage of the tools available to us.

(Originally published in TEC's Blog)

The 6 Biggest Cybersecurity Concerns for 2018

The 6 Biggest Cybersecurity Concerns for 2018

If there’s anything you can say about the field of cybersecurity, it’s that it’s growing. By 2019, there will be 6 million new cybersecurity jobs on the market worldwide. This increase is commensurate with the increase in big data, which goes directly with advancements in both the cloud and the IoT. As the world moves closer to 180 zettabytes of data in 2025 (a conservative estimate), Forbes’ Gil Press reports that 2018 will see “5 times higher growth in spending on cloud versus on-premises analytics solutions.” The amount of data we’re generating is calling for increased spending on analytics and storage.

There’s no question the cloud will continue to grow (where else will all the data go?), but the amount of IoT advancement is less certain. Victor Vilas, Business Development Manager Europe for AndSoft, says that IP traffic will reach 2 zettabytes per year by 2019. More devices creating more data will create more cybersecurity concerns. But what will the biggest threats be in the burgeoning cybersecurity industry? The cybersecurity analyst who knows how to deal with these is well prepared for 2018.

Unfilled Cybersecurity Jobs

First and foremost, cybersecurity and IT can’t deal with threats properly if there aren’t enough qualified individuals ...


Read More on Datafloq
Using DW Digest Analytics platform for collecting room humidity data

Using DW Digest Analytics platform for collecting room humidity data

In this blog post, I’m gonna show you how our proprietary Data Warehouse Digest Analytics (DW Digest) platform can be used to collect data from yet another kind of data source.

What I mean by that is that we at nextCoder have been serving our clients, large and small, by integrating data from well-known data sources such as Google APIs, QuickBooks, PropertyWare, NetSuite, Qualtrics, Fonality, Omnitracs and SkyEye GPS tracking systems, MyEKOS, and restaurant POS systems, to name a few. Our clients use these data for analyses that help them make better business decisions. Moreover, last year we figured out how to pull data from injection moulding machines.

This time, we read data from a humidity sensor. Yep, this is the first time we’ve used our DW Digest platform to collect data, directly from an environmental sensor (welcome to the IoT world!).

To accomplish this, we built a simple humidity and temperature sensor circuit based on the Arduino YUN device. In real life, this simple device could easily be replaced by a commercially available sensor device, particularly one that offers Wi-Fi connectivity and data collection.

We wrote a program to run on the Arduino board, which reads the current room humidity and temperature and makes the numbers available via a REST-like API.

{arduino program snippet here}

We then set up an instance of our DW Digest platform on a server. For us this is pretty simple to do since our instances are virtualized; it literally took only 30 minutes to set up a new instance and get it running.

For this exercise, we scheduled the data to be collected every 10 minutes.

{screenshot of schedule dashboard}

The results can be observed on a special Dashboard we set up on the same DW Digest instance.

{blue dashboard screenshot}

You can see that the peaks in humidity are where a humidifier appliance was turned on in the room {need more blurb}

Factors to consider when embarking on a project like this:

    Since data is transferred via the network, one of the challenges is IP addressing. Spend some time thinking through how the server/data collector can reach the sensor device (in a “pull” approach), or how the sensor device can reach the server/data collector (in a “push” approach).
    You’ll need a good, stable Data Warehouse software platform where you can schedule the data collection. Either that, or you set up the sensor device to maintain its own schedule and send the data to the server periodically, although that way you will be at the mercy of whatever mechanism the sensor has (or lacks) to send the data. We chose a “pull” approach where the schedule is maintained by our customer-proven DW Digest platform, since our platform already has a web interface for maintaining the schedule.

    In our case, we used the same DW Digest platform to build a Dashboard to show the data. In your case, you might send the data to trigger some other actions, or for further analysis.
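As an illustration of the “pull” approach described above, here is a minimal Python sketch of a poller that fetches and parses one reading from the sensor’s REST-like API. The endpoint shape and field names are hypothetical, not the actual DW Digest or Arduino code:

```python
import json

def parse_reading(payload: str) -> dict:
    """Parse one JSON reading returned by the sensor's REST-like API.

    Hypothetical payload shape: {"humidity": 41.5, "temperature": 22.3}
    """
    data = json.loads(payload)
    return {
        "humidity_pct": float(data["humidity"]),
        "temperature_c": float(data["temperature"]),
    }

def poll_once(fetch) -> dict:
    """Pull a single reading. `fetch` is any callable returning the raw
    response body -- e.g. urllib.request.urlopen(sensor_url).read()
    against the device's IP address on the local network."""
    return parse_reading(fetch())

# Canned response standing in for the Arduino during a dry run:
reading = poll_once(lambda: '{"humidity": 41.5, "temperature": 22.3}')
print(reading["humidity_pct"])  # 41.5
```

In a real deployment, the scheduler (every 10 minutes, as above) would call `poll_once` with a network-backed `fetch` and hand the result to the warehouse loader.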

So what’s next?

The sky is the limit, really. We are moving on to build a dust/particle sensor circuit with a similar setup. The goal is to help allergy sufferers (such as myself) study the effects of dust, pollen, mold, etc., on premises rather than just relying on public pollen forecasts. Stay tuned!

The Post-Algorithmic Era Has Arrived

The Post-Algorithmic Era Has Arrived

Last week, IIA hosted our annual Predictions and Priorities webinar and published the associated research brief. When we sat down to determine what we should focus on this year, Tom Davenport and I both immediately raised a trend that we’ve recently been discussing with organizations. After reconciling our semantics, we realized that we were both excited about the same base trend. I want to reiterate it here as I think it is a critical trend to understand and adapt to. Namely, “the post-algorithmic era has arrived.”

Does This Mean Algorithms Are a Thing of The Past?

Of course not! In fact, algorithms are being embedded into more and more business processes every day. If anything, analytics processes built on top of algorithms are going to continue to rise in importance for the foreseeable future. If that’s the case, then what does “post-algorithmic era” mean?

The defining factor of the post-algorithmic era is that having access to algorithms, and knowledge of how to execute them, is no longer a differentiator. Not long ago, a large part of the value proposition for an analytics professional was:

Knowing how to use the then-few (and complex) software tools that contained algorithms, and

Knowing from experience which specific algorithms might work ...


Read More on Datafloq
Isolating the Guerilla

Isolating the Guerilla

The Vietnam War was significant in that it was the third bloodiest war in U.S. military history (58,000 U.S. killed), and the U.S. Army chose to learn no lessons from it !!! This last point is discussed in my book America’s Modern Wars: Understanding Iraq, Afghanistan and Vietnam.

In 1965 Trevor Dupuy’s HERO (Historical Evaluation Research Organization) conducted a three-volume study called “Isolating the Guerilla.” It was an interesting survey of 19 insurgencies that included on its research team 26 experts. This included General Geoffrey Lord Bourne (British Army, ret.), Andrew C. Janos, Peter Paret, among others.

These guys:

https://en.wikipedia.org/wiki/Geoffrey_Bourne,_Baron_Bourne

http://www.nytimes.com/1975/04/26/archives/col-r-ernest-dupuy-88-dead-publicist-and-military-historian.html

https://en.wikipedia.org/wiki/Trevor_N._Dupuy

https://es.wikipedia.org/wiki/Andrew_C._Janos

https://www.goodreads.com/author/show/2793254.William_A_Nighswonger

https://en.wikipedia.org/wiki/Peter_Paret

http://www.legacy.com/obituaries/northjersey/obituary.aspx?pid=163090077

https://en.wikipedia.org/wiki/Theodore_Ropp

https://en.wikipedia.org/wiki/Gunther_E._Rothenberg

http://www.ur.umich.edu/9495/Oct03_94/29.htm

http://www.andersonfuneralhomeltd.com/home/index.cfm/obituaries/view/fh_id/12343/id/3994242

http://www.nytimes.com/1984/08/31/obituaries/frank-n-trager-78-an-expert-on-asia-dies.html

 

The first volume of the study, although developed from historical sources, was classified after it was completed. How does one classify a study that was developed from unclassified sources?

As such, the first volume of the study was in the classified safe at DMSI when I was there. I was aware of the study, but had never taken the time to look at it. DMSI went out of business in the early 1990s and all the classified material there was destroyed. The Dupuy Institute did not have a copy of this volume of the study.

In 2004 we did our casualty and duration estimate for Iraq. It was based upon a survey of 28 insurgencies. We then expanded that work to do an analysis based upon 89 insurgencies. This was done independently of our past work back in 1965, which I had never seen. This is detailed in my book America’s Modern Wars.

As this work was being completed, I was contacted in 2008 by a Lt. Colonel Michael F. Trevett. It turns out he had an unclassified copy of the study. He found it in the Ft. Huachuca library. It was declassified in 2004 and was also in DTIC. So, I finally got a copy of the study after we had almost completed our work on insurgencies. In retrospect, it would have been useful to have from the start. Again, another case of disappearing studies.

In 2011, Michael F. Trevett published the study as a book called Isolating the Guerrilla. The book is the study, with many of the appendices and supporting data removed at the request of the publisher. It was a self-publishing effort paid for by Michael out of his personal/family funds. He has since left the Army. I did write the foreword to the book.

What can I say about this case? We did a study on insurgencies in 1965 that had some relevance to the wars we entered in Afghanistan in 2001 and Iraq in 2003. It remained classified and buried in a library in Ft. Huachuca, Arizona and at DTIC. It was de-classified in 2004 and came back to light in 2008. This was through the effort of a single motivated Lt. Colonel who was willing to take the time and his own personal money to make it happen.

Big Data Evolves to Unprecedented Changes of 5G

Big Data Evolves to Unprecedented Changes of 5G

Within the next three years, 5G will be the new standard for mobile content delivery. Industry experts have spent the last year discussing the benefits 5G will provide. However, less attention has been paid to the complex relationship between 5G and big data.

Big data needs must be addressed before 5G can be possible

The coming launch of 5G is one of the most exciting technological breakthroughs of our lifetime. However, it is taking longer than many people would hope. Broadband communication companies first discussed the introduction of 5G back in 2013, and it won’t hit the market until 2020.

The reason for the delay is simple. We still lack the infrastructure necessary to implement 5G. This will require investments in deep fibre and other hardware to support the big data needs that 5G technology is about to create. 

The seldom discussed relationship between 5G and big data

Big data has made 5G possible. The problem is that many IT providers have closed their minds to other, more promising possibilities. Previous generations of broadband cellular network technology have been far more primitive. They did not offer the IOT support or other capabilities that come with 5G.

The coming 5G networks will depend on a tremendous amount ...


Read More on Datafloq
Must-read books for 2018

Must-read books for 2018

Guest post by Erika Goldwater

Books have always been my strongest foundations for learning and there is no shortage of great business and marketing books to keep me busy. I love to have a book in my hands and a notepad by my side. I am a voracious reader, sometimes reading two books a week. Although I have to admit, not all my book choices are business or marketing related as I have a weakness for James Patterson books.

As we get ready to jump into 2018, there are a few marketing and business books that are essentials for the New Year. Some of these books may be familiar and some may be brand new. These selections offer a variety of perspectives on today’s relevant business and marketing topics, but also, touch on leadership and personal growth to help guide us in life and in work.

I’ve included a few books here written by friends because not only are they great reads, but they are written by people who live and breathe marketing and sales every day. These are not books written by scholars; they are books by practitioners who have experienced successes and failures, and who share their stories along the way. Take a look and enjoy your newest reading list.

Principles by Ray Dalio

If you haven’t heard of Ray Dalio, you must read this book first. Dalio, the founder of Bridgewater Associates, is known for practicing “radical truth,” and the book is inspirational but also instructional. It discusses the role of principles and why they are essential for establishing a successful business, as well as a solid foundation for our personal lives.

The Workplace Writer’s Process: A Guide to Getting the Job Done by Anne H. Janzer

This how-to guide gives practical advice that will help even the most reluctant writer write. Take a look at the section with the field guide to writers in the workplace, including the aspirational writer and the inadvertent writer, which made me laugh.

Contagious by Jonah Berger

The title says it all. This book by Berger shares insights into why something catches on and it’s not what you might think. This is a fun and engaging book that will help you think about your business and marketing efforts in a different way.

Everybody Writes by Ann Handley

Handley makes writing fun, even for those who say they can’t write. This book is a must-have for anyone who writes, and that, of course, is everybody. It might be one of my favorite books on the list for many reasons, least of all because it is universally helpful, most of all because it makes learning how to write better an engaging read.

The Right Way To Select Technology, Get the Real Story on Finding the Best Fit by Tony Byrne

This is a must-read for anyone involved in technology purchases. Why? Because you are probably making purchase decisions and attempting the integration all wrong. Byrne helps explain how to select technology for the best fit and gives examples and instructions. Don’t buy technology in 2018 without reading this.

Analytics How to win with intelligence by John K. Thompson and Shawn P. Rogers

Analytics and big data are part of what powers organizations today. However, many organizations aren’t using the data effectively and in fact, they may even be using the information without permission. Thompson and Rogers introduce the three-part litmus test (context, permission, and accuracy) that can help us all maximize our analytics.

Getting to Yes And, The Art of Business Improv by Bob Kulhan

Improv techniques used in business? Yep. Marketers are well aware of this art form via marketer Tim Washer, who uses improv techniques in his work at Cisco. This book by Kulhan is a great introduction to using more agile thinking in business and highlights the “so what” factor for each.

Unleash Possible A Marketing Playbook That Drives Sales by Samantha Stone

This is a true B2B marketing playbook that shows how to build the marketing engine to drive revenue for organizations of any size. Stone gives case studies, metrics, interview guides and instructions to get the reader thinking about the marketing and sales partnership that drive results.

Leaders Eat Last by Simon Sinek
I love this book. If you have ever had the pleasure of hearing Sinek speak, the book reads the same way he delivers a presentation: engaging, educational, and totally entertaining. Read it and incorporate some of the leadership tips he shares. Your teams will appreciate it.

Driving Demand by Carlos Hidalgo

This book was published in 2015 and in all honesty, I had a hand in it. It is still an essential read for any business, sales or marketing leader who is looking to transform their organization using modern marketing methods and change management techniques. Becoming a customer-centric organization isn’t easy but Hidalgo helps teams get there via clearly defined methods.

Experiences: The 7th Era of Marketing by Carla Johnson and Robert Rose

The buying process has changed today, yet marketers still struggle to engage their audiences. Experiences is the book that will help marketers become more effective in 2018. With extensive data and practical experiences to support their advice, Johnson and Rose show not only how we need to deliver experience-driven content, but explain why as well.

Be Like Amazon by Jeffrey and Bryan Eisenberg

What can we learn from Amazon? A lot. Jeffrey and Bryan Eisenberg make the guiding principles of Amazon easy to understand and actionable for anyone looking to get into business or grow an existing business. Not every business wants to be Amazon, but if we learn customer centricity Amazon-style, success is possible.

This is only a snippet of some of the great books available. Did I miss any of your favorites? Share them in the comments.

 

4 Technical Challenges AR and VR Need To Solve In 2018

4 Technical Challenges AR and VR Need To Solve In 2018

Apple has recently launched ARKit to help developers make augmented reality applications for the iOS platform. Google has several projects like Daydream and Google Cardboard to provide low cost virtual reality experiences. The industry standards, the Oculus Rift and the HTC Vive have also grown leaps and bounds in the last few years. Virtual and Augmented reality are finally beginning to be viable, after decades of experimentation. But there is still a long way to go. Here are the top 4 technical challenges AR and VR need to solve in 2018 to become a household staple.

Latency

Latency in drawing new content is the biggest technical challenge AR and VR face today, and has been since the inception of these ideas. Any system is bound to have a threshold latency which comes from the frame rate of the content being drawn, the refresh rate of the display, and the input lag from the interaction that started the new content draw. The question of what the human eye perceives as “fluid motion” is a very complex one, but VR headsets typically target a latency less than 30-40 ms. Problems originating from latency are not isolated to the AR / VR field. All systems that ...
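Those three contributions — frame time, display refresh interval, and input lag — add up roughly linearly, so a quick budget check looks like this (the numbers below are illustrative assumptions, not figures from any specific headset):

```python
def motion_to_photon_ms(frame_rate_hz: float, display_hz: float,
                        input_lag_ms: float) -> float:
    """Rough worst-case latency budget: one full frame render time,
    plus one display refresh interval, plus input-sampling lag."""
    frame_ms = 1000.0 / frame_rate_hz
    refresh_ms = 1000.0 / display_hz
    return frame_ms + refresh_ms + input_lag_ms

# A 90 Hz render and display pipeline with 5 ms of input lag:
budget = motion_to_photon_ms(90, 90, 5)
print(round(budget, 1))  # 27.2 -- inside the 30-40 ms target
```

Drop the frame rate to 60 Hz and the same budget climbs past 38 ms, which is why frame-rate dips are so noticeable in VR.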


Read More on Datafloq
IoT, Autonomous Vehicles, Robots: Safety Will Determine Success

IoT, Autonomous Vehicles, Robots: Safety Will Determine Success

The current climate of non-stop cyberattacks seems remote to our everyday existence because it doesn’t affect us physically. A hacker can hold data hostage and hit a company for thousands of dollars in ransom without anyone blinking an eye. Although this hurts bank accounts, it reinforces the reality that money doesn’t have anything physical backing it. The digital world of big data and money continues on, apologies are made, sometimes salaries suffer, sometimes an IT guy loses his job—but no one gets hurt.

This won’t be the case with the IoT and autonomous vehicles. By nature, these advancements will have a profound effect—for bad or for good—on our physical well-being.

IoT Looks Primed for Safety  

The internet of things will play a big role in the transportation sector, a sector where safety and efficiency can use a boost. Sensors and cameras enable more knowledge, while the net enables better communication. There are multiple ways in which the IoT could affect transportation tech:


Sensors in seat belts and seats can tell whether a driver is drunk and relay that information to the vehicle so that it won’t start
Like rearview cameras in cars, sensors can relay information to drivers and cars to help prevent accidents
Smart cars ...


Read More on Datafloq
The Urban Warfare Study

The Urban Warfare Study

And then…..we discovered the existence of a significant missing study that we wanted to see.

Around 2000, the Center for Army Analysis (CAA) contracted us to conduct an analysis of how to represent urban warfare in combat models. This was the first work we had ever done on urban warfare, so…….we first started our literature search. While there was a lot of impressionistic stuff gathered from reading about Stalingrad and watching field exercises, there was little hard data or analysis. Simply no one had ever done any analysis of the nature of urban warfare.

But, on the board of directors of The Dupuy Institute was a grand old gentleman called John Kettelle. He had previously been the president of Ketron, an operations research company that he had founded. Kettelle had been around the business for a while, having been an office mate of Kimball, of Morse and Kimball fame (the people who wrote the original U.S. Operations Research “textbook” in 1951: Methods of Operations Research). He is here: https://www.adventfuneral.com/services/john-dunster-kettelle-jr.htm?wpmp_switcher=mobile

John had mentioned several times a massive study on urban warfare that he had done for the U.S. Army in the 1970s. He had mentioned details of it, including that it was worked on by his staff over the course of several years, consisted of several volumes, looked into operations in Stalingrad, was pretty extensive and exhaustive, and had a civil disturbance component to it that he claimed was there at the request of the Nixon White House. John Kettelle sold off his company Ketron in the 1990s and was now semi-retired.

So, I asked John Kettelle where his study was. He said he did not know. He called over to the surviving elements of Ketron and they did not have a copy. Apparently significant parts of the study were classified. In our review of the urban warfare literature around 2000 we found no mention of the study or indications that anyone had seen or drawn any references from it.

This was probably the first extensive study ever done on urban warfare. It employed at least a half-dozen people for multiple years. Clearly the U.S. Army spent several million of our hard earned tax dollars on it…..yet it was not being used and could not be found. It was not listed in DTIC, NTIS, on the web, nor was it in Ketron’s files, and John Kettelle did not have a copy of it. It was lost !!!

So, we proceeded with our urban warfare studies independent of past research and ended up doing three reports on the subject. These studies are discussed in two chapters of my book War by Numbers.

All three studies are listed in our report list: http://www.dupuyinstitute.org/tdipub3.htm

The first one is available on line at:  http://www.dupuyinstitute.org/pdf/urbanwar.pdf

As the Ketron urban warfare study was classified, there were probably copies of it in classified U.S. Army command files in the 1970s. If these files have been properly retired, then these classified files may exist in the archives. At some point, they may be declassified. At some point the study may be re-discovered. But……the U.S. Army, after spending millions for this study, proceeded to obtain no benefit from it in the late 1990s, when a lot of people re-opened the issue of urban warfare. This would have certainly been a useful study, especially as much of what the Army, RAND and others were discussing at the time was not based upon hard data and was often dead wrong.

This may be a case of the U.S. Army having to re-invent the wheel because it has not done a good job of protecting and disseminating its studies and analysis. This seems to particularly be a problem with studies that were done by contractors that have gone out of business. Keep in mind, we were doing our urban warfare work for the Center for Army Analysis. As a minimum, they should have had a copy of it.

The SAIC Library

The SAIC Library

The story of the disappearing SAIC research library occurred in the middle of the 1990s, during the same time that the HERO Library was disappearing. SAIC had a “Military Operations Analysis Division” that for a time was a competitor to HERO/DMSI. In particular, around 1990, they hired three former HERO/DMSI employees and used them for studies that normally would have been done by us. Trevor Dupuy was on the outs with some people at the U.S. Army Concepts Analysis Agency (CAA). Some time in the mid-1990s, SAIC decided to close down their military operations analysis division.

The early 1990s were a difficult time for defense contractors. The Warsaw Pact and the Soviet Union had disappeared and the defense industry was shrinking. SAIC got rid of the division that did analytical work for DOD as they realized it was a dying business (something that we could never get through our heads). Companies like BDM, one of the stalwarts in the industry since 1959, were sold off in the 1990s, with Trevor Dupuy’s old company, DMSI, also going out of business in the 1990s.

Anyhow, SAIC had a library for this division. It was the size of two double offices, maybe 400 square feet or more. It was smaller than the HERO Library. They decided to dissolve the library along with the division. They told the staff to grab what they wanted and dumped the rest. Having never had access to this library, I do not know if there were any holdings of value, but as SAIC had been around since 1969, it is hard to believe that there was not something unique there.

 

This post is related to:

The HERO Library

Missing HERO Reports

 

Big Data, Machine Learning and AI Predictions for 2018

Big Data, Machine Learning and AI Predictions for 2018

Investment in Big Data and AI shows no signs of slowing down. Here are some of our predictions for the year to come...

Automation of human tasks 

Whilst we’re still a far cry from the hyperbolic envisioning of robots taking over our jobs and casting us into a pit of uncertainty, there is evidence to suggest that cognitive technologies are on the rise, and Big Data is helping this. The use of technology for performing more ‘human’ tasks is growing rapidly and is set to continue growing over the coming years: technology is increasingly being used for tasks we have always considered ‘human’, like planning, strategizing, and facial recognition. As we saw in 2017, creative industries are succumbing to this ‘takeover’ in areas like writing music and literature. Forrester has even predicted that in 2018, automation will take 9% of US jobs while creating the equivalent of only 2%.

Machine Learning capabilities 

Machine Learning capabilities are growing rapidly, transforming business applications in all sorts of industries from medicine and healthcare, to self-driving cars, gaming, and fraud detection to name a few. We’re expecting Machine Learning processing to become even faster and smarter in 2018 where we can see it being applied to even ...


Read More on Datafloq
Expert Webinar: Break Free from Traditional Sales with Digital Selling | Simplilearn webinar starts 07-12-2017 21:30

Expert Webinar: Break Free from Traditional Sales with Digital Selling | Simplilearn webinar starts 07-12-2017 21:30

Do you still depend on traditional selling methods like cold calling and trade shows to win customers? Your numbers are clearly drying up. This is the year of digital selling—the process of engaging with prospects online by strategically using social networks and digital sales tools. Join our webinar as Lilach Bullock, Forbes Top Women Socia...Read More.
7 Industries That Will Be Taken Over by AI and Robots (And How to Adapt)

7 Industries That Will Be Taken Over by AI and Robots (And How to Adapt)

Robots are taking over. You can see it already happening at McDonald's with its automated ordering kiosks, or your nearest supermarket with its self-checkout machine. Soon, it will be normal to see driverless cars and people walking their robot dogs outside. And this is only the beginning. With advancements in technology, many jobs typically performed by humans are being replaced by artificial intelligence (AI) and robots. Below are seven industries that are significantly impacted by automation:  

1. Transportation

Fully autonomous vehicles, or self-driving cars, will be hitting the road in the next few years. In fact, personal self-driving cars are expected to be on the market by 2018, with commercial applications not far behind. Self-driving cars will gain even more popularity as Uber plans to acquire 24,000 autonomous Volvo SUVs.

This means you can expect to see more people being dropped off at places without a driver in the near future. Although this may put many transportation workers out of work, Morgan Stanley predicts that driverless cars will save the U.S. $1.3 trillion a year between 2035 and 2050, for a global annual saving of $5.6 trillion. As well, self-driving cars will help improve transportation in many ways, including decreasing the risks of accidents, ...


Read More on Datafloq
3 Ways Organizations are Leveraging Big Data Across Different Applications

3 Ways Organizations are Leveraging Big Data Across Different Applications

The origin of the term “big data” can be traced back to a book by Erik Larson in 1989. However, economists and industry pundits have only started to discuss the impact big data has had on global organisations over the past few years. The impact has been incredible and will be even more significant by the end of the decade. Here are some of the applications brands are using big data for.

Tax software

Taxes are one of the biggest challenges companies face. The United States tax code is currently 73,954 pages long. It becomes more complicated over time as Congress enacts new laws that all businesses must comply with.

In addition to dealing with the complexity of the United States tax code, organisations must also adhere to tax laws in other countries. The number of different tax authorities they must comply with is increasing as more organisations expand internationally in the new global economy.

Complying with these tax challenges would be nearly impossible without the assistance of modern technology. Fortunately, new tax management applications have simplified these problems. Big data makes it easier for organisations to stay in compliance. According to a study from Deloitte, they are turning their attention to predictive analytics models, ...


Read More on Datafloq
How Artificial Intelligence Will Change Corporate Governance

How Artificial Intelligence Will Change Corporate Governance

Growing investments in Artificial Intelligence (AI) technology have transformed many areas in the business world, especially among high-tech and financial organisations. External spending on AI-related projects went up to $12 billion in 2016.

Companies looking into AI may focus on the potential for automating low-skill tasks, but they are overlooking a major opportunity. Artificial Intelligence can also play a significant role in corporate governance. AI can help streamline decision-making processes, transform big decisions from gut feelings to data-driven knowledge, and better predict the future outcome of such decisions. As such, AI can improve an organisation’s leadership.

Streamlining Corporate Governance

One of the most daunting tasks of any board member is determining which executives to trust. Who has the data to back up their claims, and who is simply giving the board the runaround?

Companies like Salesforce have already started implementing AI tech into their boardroom to help settle disputes. CEO Marc Benioff has "hired" an AI "assistant" called Einstein.

According to "Business Insider," Einstein accompanies Benioff to all of his meetings. After everyone has spoken, Benioff can turn to Einstein and ask the AI what executives "need attention" and which ones are giving him inaccurate information.

The data gained from AI technology works well as a ...


Read More on Datafloq
Combat Power vs Combat Power

Combat Power vs Combat Power

In my last post, I ended up comparing Combat Effectiveness Value (CEV) to Combat Power. CEV is only part of combat power. In Trevor Dupuy’s formulation, Combat Power is P = S x V x CEV.

This means that combat power (P) is the product of force strength, including weapon effects (S), operational and environmental factors (V) and human factors (CEV).
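As a quick illustrative sketch (my own, not from Dupuy’s text), the formulation above can be expressed directly in code:

```python
def combat_power(s: float, v: float, cev: float) -> float:
    """Dupuy's combat power formula: P = S x V x CEV.

    s   -- force strength, including weapon effects (S)
    v   -- combined operational and environmental factors (V)
    cev -- combat effectiveness value, i.e. human factors (CEV)
    """
    return s * v * cev

# Hypothetical example: a force with strength 100, a combined 0.8
# operational/environmental multiplier, and a CEV of 1.2.
print(combat_power(100, 0.8, 1.2))  # -> 96.0
```

The numbers here are invented for illustration; in practice V itself is the product of the many terrain, weather, posture, and other factors Dupuy enumerates.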

From his list of 73 variables on page 33 of Numbers, Predictions and War (NPW), the operational and environmental factors include terrain factors, weather factors, season factors, air superiority factors, posture factors, mobility effects, vulnerability factors, tactical air effects, other combat processes (including surprise), and the intangible factors (which are included in his CEV).

Again, it turns into a much longer laundry list of variables than we have from ADP 3.0.

SAP Data Hub and the Rise of a New Generation of Analytics Solutions

SAP Data Hub and the Rise of a New Generation of Analytics Solutions

“Companies are looking for a unified and open approach to help them accelerate and expand the flow of data across their data landscapes for all users.

SAP Data Hub bridges the gap between Big Data and enterprise data, enabling companies to build applications that extract value from data across the organization, no matter if it lies in the cloud or on premise, in a data lake or the enterprise data warehouse, or in an SAP or non-SAP system.”

This is part of what Bernd Leukert, member of SAP’s executive board for Products & Innovation, said during SAP’s Big Data Event, held at the SAP Hudson Yards office in New York City, as part of the SAP Data Hub announcement. In my view, that announcement marked the beginning of a small yet important trend within analytics: the launch of new or renewed, integrated software platforms for analytics, BI and data science.

This movement, marked by other important announcements including Teradata’s new Analytics Platform as well as IBM’s Integrated Analytics offering, is another step towards a new generation of platforms and a consolidation of functions and features for data analysis and data science.

According to SAP, the new SAP Data Hub solution offers customers a:

  • Simpler, more scalable approach to data landscape integration, management and governance
  • Easier creation of powerful data processing pipelines to accelerate and expand data-centric projects
  • Modern, open architecture approach that includes support for different data storage systems

One way SAP aims to achieve this with its Data Hub solution is by creating value across the intricate and diverse data management process that runs from data collection, through integration and transformation, to preparation for generating insight and action.

To increase efficiency across all management stages, including data integration, data orchestration and data governance, the new SAP Data Hub creates “data pipelines” to accelerate business results, all coordinated under a centralized “Data Operations Cockpit”.

From what we can see, SAP aims to position the new solution as the ideal data management platform for the rest of the SAP analytics and BI product stack (including tight integration with SAP HANA and the ability to take advantage of solutions like SAP Vora, SAP’s in-memory, distributed computing solution) as well as for core Big Data sources including Apache Hadoop and Apache Spark (see figure below).

SAP’s Data Hub General Architecture (Courtesy of SAP) 

SAP Data Hub’s data pipes can access, process and transform data coming from different sources into information, to be used along with external computation and analytics libraries including Google’s TensorFlow.

Another interesting aspect of the new SAP Data Hub is that it aims to provide a more agile way to develop and deploy data-driven applications, allowing core data management activities and workflows to be developed and configured from a central platform, shortening the development process and speeding up results.
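To make the pipeline idea concrete, here is a minimal, purely illustrative sketch of what a data pipeline looks like conceptually, as a chain of composable stages (collect, transform, prepare). The stage names and composition API are my own invention, not SAP Data Hub’s actual interface:

```python
# Illustrative sketch of the data-pipeline concept: each stage's output
# feeds the next. This is NOT SAP Data Hub's API, just the general idea.
from typing import Callable

def pipeline(*stages: Callable) -> Callable:
    """Compose stages left to right into a single callable."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

# Hypothetical stages: parse raw records, drop malformed rows, aggregate.
parse = lambda rows: [r.split(",") for r in rows]
valid = lambda rows: [r for r in rows if len(r) == 2]
total = lambda rows: sum(int(amount) for _, amount in rows)

process = pipeline(parse, valid, total)
print(process(["a,10", "b,5", "bad-row", "c,1"]))  # -> 16
```

In a platform like the one described above, each stage would typically be a distributed operator over a data lake or warehouse rather than an in-memory function, but the composition principle is the same.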

Key functional elements of the new platform include:

Some of SAP’s Data Hub Major Functional Elements (Courtesy of SAP)
According to SAP, this new solution will become, along with SAP Vora and SAP Cloud Platform Big Data Services, a key component of SAP's Leonardo digital innovation system.

Analytics and BI on the Verge of a New Generation

As companies watch their data landscapes grow and become more complex, new solutions are taking over the analytics landscape, pushed in great measure by newer companies the likes of Tableau, Qlik, or Dataiku, to name just a few.

It seems the big software powerhouses are now pushing hard to come up with a new generation of tools to consolidate their data management and analytics offerings.

With this, it is not difficult to foresee a new competitive arena: a race to gain the favor of a totally new generation of data specialists, and one I’m eager to keep track of, of course.

In the meantime take a look below at SAP’s Data Hub intro video and get a glimpse of this new solution.

Of course, please do let me know if you have comments or feedback, and let’s keep the conversation going.





* All logos and images are trademarks and property of their respective owners
How Will the Internet of Things Impact Education?

How Will the Internet of Things Impact Education?

The Internet of Things. It sounds like something quite innocuous and unassuming. It will turn out to be anything but, simply down to the sheer size of the thing. With the huge expansion that will come from 31 billion devices being added to the internet, the game is going to change entirely.

The interesting thing is that like with so many innovations, we can’t yet know everything that’s going to happen. If you don’t believe me in that regard, just consider the home computer. For a long time, it was a tool without a purpose, as people thought it was interesting but not really all that useful. Then spreadsheets came along. This revolutionized computers in particular as well as business as a whole.

Of course, that’s not all she wrote. Though we can’t make all the predictions and we’ll probably be caught by surprise in many areas, there are a lot of predictions we can make. It’s just important to realize that most of these only form the tip of the iceberg.

Data Collection

The obvious first change is how much more data will be available. This will matter to education in two big ways. The first is that there will ...


Read More on Datafloq
How Retailers are Leveraging Augmented Reality this Holiday Season

How Retailers are Leveraging Augmented Reality this Holiday Season

The National Retail Federation predicts that between November and December 2017, holiday retail sales will top out at $678.75 billion. As shoppers across the country prepare to drop some serious cash over the next few weeks, they'll have greater access than ever before to digital technologies that transform the way they shop for, try on, and try out products.

Taking center stage? Augmented reality. Forget the days of standing in a crowded dressing room line or lurching your head around like the Exorcist trying to get a glimpse of how your new shirt looks from the back. Today's retailers are implementing whip-smart software solutions that make the shopping experience a breeze.

Today, we're taking a look at a few innovative brands fully embracing the augmented reality trend and what it means for the future of the industry as a whole.

Ready to get started? Let's try a few on for size.

Gap's "Dressing Room" App

Think fashion and tech, and Gap might not be the first retailer to come to mind. However, after experiencing a lag in recent sales, the global brand is making giant strides toward becoming more tech-savvy. In an ironic twist, the company more known for making denim cool again than for ...


Read More on Datafloq

Copyright © 2018 BBBT - All Rights Reserved