The One Board Wargame To Rule Them All

The cover of SPI’s monster wargame, The Campaign For North Africa: The Desert War 1940-43 [SPI]

Even as board gaming enjoys a resurgence in the age of ubiquitous computer gaming, table-top wargaming, sadly, continues its long, slow decline in popularity from its 1970s-80s heyday. Pockets of enthusiasm remain, however, and there is new advocacy for wargaming as a method of professional military education.

Luke Winkie has written an ode to that bygone era through a look at the legacy of The Campaign For North Africa: The Desert War 1940-43, a so-called “monster” wargame created by designer Richard Berg and published by Simulations Publications, Inc. (SPI) in 1979. It is a representation of the entire North African theater of war at the company/battalion level, played on five maps which together extend over 10 feet, accompanied by 70 charts and tables. The rule book encompasses three volumes. There are over 1,600 cardboard counter playing pieces. As befits the real conflict, the game places a major emphasis on managing logistics and supply, which can either enable or inhibit combat options. The rule book recommends that each side consist of five players: an overall commander, a battlefield commander, an air power commander, one dedicated to managing rear area activities, and one devoted to overseeing logistics.

The game map. [BoardGameGeek]

Given that completing a full game requires an estimated 1,500 hours, actually playing The Campaign For North Africa is something that would appeal only to committed, die-hard wargame enthusiasts (known as grognards, Napoleonic-era slang for “grumblers,” i.e., veteran soldiers). As the game blurb suggests, the infamous monster wargames were an effort to appeal to a desire for a “super detailed, intensive simulation specially designed for maximum realism,” or as realistic as war on a tabletop can be, anyway. Berg admitted that he intentionally designed the game to be “wretched excess.”

Although The Campaign For North Africa was never popular, it did acquire a distinct notoriety not entirely confined to those of us nostalgic for board wargaming’s illustriously nerdy past. It retains a dedicated fanbase. Winkie’s article describes the recent efforts of Jake, a 16-year-old Minnesotan who, unable to afford a second-hand edition of the game priced at $400, printed out the maps and rule book for himself. He and a dedicated group of friends intend to complete a game before Jake heads off to college in two years. Berg himself harbors few romantic sentiments about wargaming or his past work, having sold his own last copy of the game several years ago because a “whole bunch of dollars seemed to be [a] more worthwhile thing to have.” The greatness of SPI’s game offerings has been tempered by the realization that the company died for its business sins.

However, some folks of a certain age relate more to Jake’s youthful enthusiasm and the attraction to a love of structure and complexity embodied in The Campaign For North Africa‘s depth of detail. These elements led many of us on to a scholarly study of war and warfare. Some of us may have discovered the work of Trevor Dupuy in an advertisement for Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles in the pages of SPI’s legendary Strategy & Tactics magazine, way back in the day.

Book Commentary: Predictive Analytics by Eric Siegel

As much as we’d like to imagine that the deployment and use of predictive analytics has become a commodity for every organization and a staple of every “modern” business, the reality is that a number of small, medium, and even large organizations are still not using predictive analytics and data mining solutions as part of their core business software stack.

The reasons are plenty: insufficient time, budget, or human resources, as well as a dose of inexperience and ignorance of its real potential benefits. These and other reasons came to mind when I had the opportunity to read the book Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die, written by former Columbia University professor and founder of the Predictive Analytics World conference series, Eric Siegel.

Aside from being clearly written and filled with examples and bits of humor that make it enjoyable, what makes this book stand out, in my view, is that it is written for a general audience in plain English. That makes it a great option for those new to the field who want to understand what predictive analytics is and what effects and benefits it can bring to any organization.

With plenty of industry examples and use cases, Mr. Siegel neatly introduces the reader to the world of predictive analytics: what it is, and how this discipline and its tools are currently helping an increasing number of organizations (Facebook, HP, Google, Pfizer, and other big players in their fields) to discover hidden trends, make predictions, and plan for better decisions with data.

Another great aspect of the book is its clear and easy explanation of important current topics, including data mining and machine learning, as keys to more advanced topics such as artificial intelligence and deep learning. It also does a good job of mentioning the caveats and dangers of making wrong assumptions when using predictive analytics.

I especially enjoyed the central section of the book, filled with examples and use cases of predictive analytics in different industries and lines of business (healthcare, finance, and law enforcement, among others), as well as the list of resources at the end of the book.

Of course for me, having been a practitioner for many years, there was a small sense of wanting a bit more technical and theoretical detail. Still, the book is a great introductory reference both for novices who need to grasp the full potential of predictive analytics and for those familiar with the topic who want to know what their peers are doing, expanding their view of how predictive analytics can be applied in their organization.

If you are still struggling to understand what predictive analytics is, what benefits it can offer your organization, and what it can do to improve your decision making and planning abilities, or if you want a fresh view of the new use cases for this discipline and its software solutions, Predictive Analytics by Eric Siegel is certainly a reference you should consider having on your physical or virtual bookshelf.

Have you read the book? About to do it? Don’t be shy, share your comments right below…

Used Kursk Books

Kursk: The Battle of Prokhorovka: Amazon.com is selling off its used copies of the Kursk book at $118.80. This is the first time I have seen them selling the book for below $200. They have eight “used-acceptable” books at $118.80. Another seller has a “used-acceptable” copy for $114.82. Amazon.com has six “used-good” for $120.04, seven “used-very good” for $121.28, and three “used-like new” for $122.51.

Kind of mystified how Amazon.com ended up with 24 used books (8 acceptable + 6 good + 7 very good + 3 like new).

Highlights from Newly Launched PMBOK Guide 6th Edition

The 6th edition of the Project Management Body of Knowledge (PMBOK®) Guide was released this month, and we are all curious to know what changes have been made in the guide. The inclusion of Agile concepts is the major addition, and it will bring significant changes to PMI certification courses like PMP, CAPM, and PMI-ACP. Soon, we can expect changes in the exam prep materials, exam format, and training style too.

The new PMBOK Guide supports the broadening spectrum of project delivery approaches and incorporates Agile concepts into all knowledge areas. The PMBOK Guide focuses mainly on project managers from various industries, and to make it more relevant to the current PM scenario, some processes have been removed and others added. The 6th edition of the guide validates the requirement for the Talent Triangle, which ensures project managers have a solid understanding of technical project management, leadership, and strategic and business management.

Here are some of the major highlights from the 6th edition of Project Management Body of Knowledge (PMBOK®):

  • To highlight the importance of the project manager’s role in business value creation and organizational change, the first three chapters of the book have been completely rewritten.
  • The Agile concept has been introduced in all 10 knowledge areas. PMI has also launched an Agile Practice Guide to help aspirants understand agile concepts easily.
  • In the PMBOK Guide V6, the role of the project manager has been defined as a leader, strategic thinker and business expert.
  • Introduces three new processes:
    1. Implement Risk Responses
    2. Manage Project Knowledge
    3. Control Resources
  • Two knowledge areas have been renamed to make their names more accurate. The ‘Human Resource Management’ module has been renamed ‘Resource Management’ and ‘Time Management’ is now ‘Schedule Management’.
  • All 10 knowledge areas now include four new sections as below
    1. Key Concepts
    2. Tailoring Considerations
    3. Trends and Emerging Practices
    4. Considerations for Agile/Adaptive Environments
  • A new business case section is added in the initiation and requirements phase of the project management plan.
  • The removal of the Close Procurements process from the book is one of the major changes.

This is the first version of the PMBOK Guide that focuses on the implementation of project management approaches in an Agile environment. V6 of the PMBOK Guide was released in September 2017; PMI will bring the changes to the exam in the first quarter of 2018, although the date is not confirmed yet. PMP aspirants who started studying with PMBOK Guide 5 can take the examination based on that version before 2018.


Secretary of the Army, take 3

On 19 July 2017, Mark Thomas Esper was nominated to be the new Secretary of the Army. This is the third nomination for this position, as the first two nominees, Vincent Viola and Mark E. Green, withdrew. Not sure when Congress will review and approve this nomination; I am guessing it won’t happen in September. The acting Secretary of the Army is Ryan McCarthy (approved as Undersecretary of the Army in August 2017).

Mr. Esper’s background:

  1. Graduate of USMA (West Point) in 1986 with a BS in Engineering.
  2. Masters degree from Harvard in 1995.
  3. PhD from GWU in 2008.
  4. Served as an infantry officer with the 101st Airborne Division during the Gulf War (1990-1991).
  5. Over ten years of active duty (1986-1996?). I gather he is still in the Army Reserve as a Lt. Colonel.
  6. Chief of Staff of the Heritage Foundation, 1996-1998.
  7. Senior staffer for Senate Foreign Relations Committee and Senate Governmental Affairs Committee, 1998-2002.
  8. Policy Director, House Armed Services Committee, 2001-2002.
  9. Deputy Assistant Secretary of Defense for Negotiations Policy, 2002-2004.
  10. Director of National Security Affairs for U.S. Senate, 2004-2006.
  11. Executive Vice President at Aerospace Industries Association, 2006-2007.
  12. National Policy Director for Senator Fred Thompson’s 2008 Presidential campaign, 2007-2008.
  13. Executive Vice President of the Global Intellectual Property Center, and Vice President for Europe and Eurasia at U.S. Chamber of Commerce, 2008-2010.
  14. Vice President of Government Relations at Raytheon, 2010 to present.

Wikipedia article: https://en.wikipedia.org/wiki/Mark_Esper

Rocket Man

One of the two best versions of this song, from 2011: https://www.youtube.com/watch?v=HQnmLT1KbQo

The other great version, from 1978: https://www.youtube.com/watch?v=ngm_5PNSP00

Human Factors In Warfare: Diminishing Returns In Combat

[Jan Spousta; Wikimedia Commons]

One of the basic problems facing military commanders at all levels is deciding how to allocate available forces to accomplish desired objectives. A guiding concept in this sort of decision-making is economy of force, one of the fundamental and enduring principles of war. As defined in the U.S. Army’s Field Manual FM 100-5, Field Service Regulations, Operations (which Trevor Dupuy believed contained the best listing of the principles):

Economy of Force

Minimum essential means must be employed at points other than that of decision. To devote means to unnecessary secondary efforts or to employ excessive means on required secondary efforts is to violate the principle of both mass and the objective. Limited attacks, the defensive, deception, or even retrograde action are used in noncritical areas to achieve mass in the critical area.

How do leaders determine the appropriate means for accomplishing a particular mission? The risk of assigning too few forces to a critical task is self-evident, but is it possible to allocate too many? Determining the appropriate means in battle has historically involved subjective calculations by commanders and their staff advisors of the relative combat power of friendly and enemy forces. Most often, it entails a rudimentary numerical comparison of numbers of troops and weapons and estimates of the influence of environmental and operational factors. An exemplar of this is the so-called “3-1 rule,” which holds that an attacking force must achieve a three to one superiority in order to defeat a defending force.

Through detailed analysis of combat data from World War II and the 1967 and 1973 Arab-Israeli wars, Dupuy determined that combat appears subject to a law of diminishing returns and that it is indeed possible to over-allocate forces to a mission.[1] By comparing the theoretical outcomes of combat engagements with the actual results, Dupuy discovered that a force with a combat power advantage greater than double that of its adversary seldom achieved proportionally better results than a 2-1 advantage. A combat power superiority of 3 or 4 to 1 rarely yielded additional benefit when measured in terms of casualty rates, ground gained or lost, and mission accomplishment.

Dupuy also found that attackers sometimes gained marginal benefits from combat power advantages greater than 2-1, though less proportionally and economically than the numbers of forces would suggest. Defenders, however, received no benefit at all from a combat power advantage beyond 2-1.
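To make the shape of this finding concrete, here is a minimal Python sketch of the capped-returns idea. It is purely illustrative and is not Dupuy’s QJM/TNDM formulation: the 2-1 cap comes from the text above, while the square-root taper for attackers is an arbitrary assumption chosen only to show marginal, sub-proportional gains.

```python
# Toy illustration of the diminishing-returns effect described above,
# NOT Dupuy's actual QJM/TNDM model. Assumptions: benefit tracks the
# combat power ratio up to 2-1; attackers gain only marginal benefit
# beyond that (modeled, arbitrarily, as half the square root of the
# excess); defenders gain nothing at all past 2-1.

def effective_advantage(ratio: float, attacker: bool = True) -> float:
    """Return the portion of a combat power ratio that yields real benefit."""
    cap = 2.0
    if ratio <= cap:
        return ratio
    return cap + 0.5 * (ratio - cap) ** 0.5 if attacker else cap

for r in (1.5, 2.0, 3.0, 4.0):
    print(f"{r}:1 -> attacker {effective_advantage(r):.2f}, "
          f"defender {effective_advantage(r, attacker=False):.2f}")
```

Running the sketch shows a 4-1 ratio buying an attacker barely more than a 3-1 ratio does, and buying a defender nothing beyond 2-1, which is the pattern Dupuy observed in the combat data.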

Two human factors contributed to this apparent force limitation, Dupuy believed: Clausewitzian friction and breakpoints. As described in a previous post, friction accumulates on the battlefield through the innumerable human interactions between soldiers, degrading combat performance. This phenomenon increases as the number of soldiers increases.

A breakpoint represents a change of combat posture by a unit on the battlefield, for example, from attack to defense, or from defense to withdrawal. A voluntary breakpoint occurs due to mission accomplishment or a commander’s order. An involuntary breakpoint happens when a unit spontaneously ceases an attack, withdraws without orders, or breaks and routs. Involuntary breakpoints occur for a variety of reasons (though contrary to popular wisdom, seldom due to casualties). Soldiers are not automatons and will rarely fight to the death.

As Dupuy summarized,

It is obvious that the law of diminishing returns applies to combat. The old military adage that the greater the superiority the better, is not necessarily true. In the interests of economy of force, it appears to be unnecessary, and not really cost-effective, to build up a combat power superiority greater than two-to-one. (Note that this is not the same as a numerical superiority of two-to-one.)[2] Of course, to take advantage of this phenomenon, it is essential that a commander be satisfied that he has a reliable basis for calculating relative combat power. This requires an ability to understand and use “combat multipliers” with greater precision than permitted by U.S. Army doctrine today.[3] [Emphasis added.]

NOTES

[1] This section is drawn from Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), Chapter 11.

[2] This relates to Dupuy’s foundational conception of combat power, which is clearly defined and explained in Understanding War, Chapter 8.

[3] Dupuy, Understanding War, p. 139.

New Beginning with Our Heroes – First PMBOK V6 Based Training for Veterans

The launch of PMBOK Guide V6 can be considered the beginning of a new journey for PMI’s certifications, especially the PMP. PMI has made some major and a few minor changes in the book, which will start affecting the PMP and other certification exams from Q1 2018.

In the past few months of 2017, MSys has conducted special PMP, LSSGB, ITIL and PMI-ACP classes for veterans and supported several veterans in starting a new career. With the overwhelming support for and success of our “Give Back to Society” initiative, MSys would like to offer extended support and begin our PMBOK V6 journey by announcing three PMBOK V6 classes exclusively for veterans. Yes, you heard it right! MSys has decided to organize its first ever PMBOK Guide V6 based PMP class for our heroes.

Here is the schedule for PMP boot camp based on the PMBOK Guide 6th edition:

To enroll for these special PMP classes based on V6 of the PMBOK Guide, drop an email at support@msystraining.com or give us a call on +1-408 878 3078. If project management is not your forte, you can also attend the LSSGB, LSSBB, ITIL Foundation and PMI-ACP special year-end online classes to take your corporate career to a new height.

Have a look at the discounted price for all courses:

The PMP certification course includes:

1)     PMBOK V6 based course material designed and developed by Jason Saetrum

2)     Simulations as per new exam format

3)     Class recorded videos

4)     35 contact hours certificate

5)     Reference book for V5 to V6 comparison

6)     Dedicated support in transitioning

MSys has been rated 5 out of 5 by our past veteran students. Read our success stories on www.msystraining.com or www.trustpilot.com. We are also the first organization to announce a PMBOK V6 based training calendar. Thank you for your trust and support in making MSys a leading training and certification provider in North America.

To know more about the course, you can visit our official site www.msystraining.com. If you have any queries about our veteran special classes or any regular class, you can contact us at support@msystraining.com or call us at +1-408 878 3078.


How European Organizations Increase Their Innovation Speed with Smart Data Application

For the third consecutive year, GoDataDriven collaborated with Big Data Expo to conduct the Big Data Survey. Over 800 professionals participated in this third edition of the Big Data Survey and shared their insights and experiences on topics like data strategy, implementing data science, technology, cloud, GDPR, and how to become attractive as an employer.

What Makes a Data Strategy Successful?

As in the first two editions of the Big Data Survey, vision (87%) remains the most important aspect of a successful Big Data strategy, followed by talent (54%) and, last year’s number two, support from management (50%).

Improving data quality is the largest challenge when it comes to implementing data infrastructure, followed by making data available and implementing data governance.

"Our management has a clear vision of the type of data products our organization needs to realize. We are constantly working to collect more data that supports us in executing our Big Data Strategy�, says Erik van Osenbruggen, Operational Manager at Promovendum/CAK Group

Although 9 out of every 10 organizations see lots of potential in data, data are primarily applied in dashboards and not so much to develop predictive models. The number of websites that are personalized in real time using artificial intelligence is still very limited.

Becoming ...


Read More on Datafloq
How Big Data Analytics Boosts Cyber Security

As more and more parts of our lives become connected to the internet, and more of our daily transactions take place online, cybersecurity is becoming an increasingly important topic. Just as modern technology changes more quickly than ever before, so do cyber criminals create newer and faster ways to target and rip off organizations. New malware is difficult to detect using previous strategies, which means we need new cybersecurity strategies to ensure commercial security.

One such new strategy is to use big data analytics. Big data analytics is an automated process by which a computer system examines large and varied sets of data to find patterns and trends. It is currently used to help companies track customer preferences and therefore better target their products and advertisements to specific users. However, with some reprogramming, those same big data analytics could be used to detect, respond to, and ultimately prevent cybercrime.
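As a toy illustration of the pattern-finding idea just described, here is a minimal Python sketch that flags hosts whose latest daily event count deviates sharply from their own history. The hostnames, counts, and the three-standard-deviation threshold are all assumptions made up for the example; real systems work with far richer features at streaming scale.

```python
# A minimal sketch of statistical anomaly detection: flag hosts whose
# latest event count is a large outlier against their own baseline.
# The z-score threshold of 3.0 is an assumption for illustration.

from statistics import mean, stdev

def find_anomalies(event_counts: dict[str, list[int]],
                   z_threshold: float = 3.0) -> list[str]:
    """Return host names whose latest count deviates sharply from history."""
    anomalous = []
    for host, counts in event_counts.items():
        history, latest = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough baseline data to judge
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(latest - mu) / sigma > z_threshold:
            anomalous.append(host)
    return anomalous

logins = {"web-01": [101, 98, 104, 99, 102, 97, 480],  # sudden spike
          "web-02": [52, 49, 51, 50, 53, 48, 51]}      # normal traffic
print(find_anomalies(logins))  # ['web-01']
```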

Here are some ways that big data analytics could help in the fight against cyber criminals.

1. Identifying Anomalies in Device and Employee Behavior

It is nigh-on impossible for a human user to manually analyze the millions of alerts that internet customers generate each month and pick out the valid ones from the threats. A ...


Read More on Datafloq
Economics of Warfare 20

This is the twentieth lecture from Professor Michael Spagat’s Economics of Warfare course that he gives at Royal Holloway University. It is posted on his blog Wars, Numbers and Human Losses at: https://mikespagat.wordpress.com/

This lecture continues the discussion of terrorism, looking at whether poverty or poor education causes terrorism. The conventional wisdom, supported by a book by Alan Krueger, is that they do not. The lecture presents four studies. Of those, one study (Krueger) makes the argument that they do not (see pages 4-5), while three of the studies (Enders and Hoover, de Mesquita, and Benmelech) find a limited association (see pages 6-14, 15, and 21). Some of these other three studies have to work pretty hard to make their point. One is left to conclude that while poverty may have some impact on the degree of terrorism and the recruitment of terrorists, it is probably not the main or determining factor. We probably need to look elsewhere for the root causes.

The link to his lecture is here: http://personal.rhul.ac.uk/uhte/014/Economics%20of%20Warfare/Lecture%2020.pdf

Thoughts on Looker JOIN 2017 and the Looker Product Roadmap

Anyone who follows me on Twitter will probably know that I’ve been at the Looker JOIN 2017 product conference in San Francisco this week, if only because I’ve been tweeting constantly for the past two days about the new features coming with the Looker 5 release, the vibrant partner ecosystem, and customers coming up on stage and talking about how Looker had transformed their organization and helped them become more data-driven.

I’d initially registered to go to the event on my own behalf but ended up attending in my new role as Product Manager responsible for an analytics service that used Looker as its data analysis component, speaking about our experiences and meeting our contacts in Looker’s partner and product engineering teams.

But I was also interested in what Looker were planning on doing next, given the very interesting position they were now in: having ridden the wave of the new analytic databases introduced by Google and Amazon, they had started to displace Tableau and the other standalone data visualization tools in the types of data-driven digital businesses and tech startups that value Looker’s ability to query and analyze all the data they collect, rather than just the subsets that tools like Tableau are capable of working with.

Looker started out as a back-end technology play, with its tech-focused initial set of customers willing to overlook more primitive front-end features. An example was Qubit when I arrived, whose engineering team loved the LookML modeling language and the efficient way in which it accessed their BigQuery data layer. As Looker’s customer base became more mainstream and sales grew in the enterprise market, they’d need to close the feature gap with tools like Tableau and more generally work on the front-end user experience, particularly the initial experience when users first log in and, from my experience, aren’t really sure what to do next.

Looker are by no means the first BI software vendor to face this challenge. I remember a similar evolution taking place with Oracle’s BI platform in the years following the Siebel acquisition, where features such as a user homepage that surfaced recent and relevant content, integration links into other applications and business processes, and a general UI makeover were introduced with the 11g release, as I wrote about in a blog post at the time for my old company. At the same time, though, Oracle introduced deep integration with their Oracle Fusion Middleware platform that greatly increased the complexity of the product, added primarily to enable BI functionality to be embedded in their upcoming Fusion Applications business applications suite.

None of this is particularly meant as criticism of Oracle; I only use it as an example because I wrote the book on it back in the day and built a business providing consulting and training for customers moving onto that platform from earlier versions. And of course this is in part an argument for delivering BI as cloud-based software-as-a-service, as Oracle and others have since done. But even with a move to the cloud, much of the product complexity remains, and you can’t help noticing that development priority is now around enabling, and in some cases requiring, customers to adopt other products from that supplier rather than building out core BI functionality.

What’s interesting to me is the opportunity Looker has to make a similar transition from niche to mainstream, but with a blank sheet of paper when creating the product’s architecture and, more importantly, without the “strategy tax” of having to integrate and drive sales of unrelated business and infrastructure software. Just as interesting is the opportunity this focus gives Looker to evolve their product, which happens to be built on a modern, scalable and standards-based platform, into a platform for running data applications, of which Looker the BI tool is just one example, much as Salesforce.com did with their Force.com platform a while ago.

To get some sense of how successful this strategy has been for Salesforce, consider how busy San Francisco is when Oracle Openworld comes to town each year; 60,000 customers, vendors and salespeople descend on downtown San Francisco, fill all the bars and hotel rooms and make crossing the road down by the Moscone a major logistical exercise. Two weeks later, though, Salesforce roll into town with their Dreamforce event and 180,000 turn up, and now it’s 2nd and 3rd Street that are turfed over, not just the gap between Moscone North and South that the city had only just opened up after Openworld.

So what new features were announced at Looker JOIN 2017 with the launch of Looker 5, the latest release of their platform? Well, if you followed my tweets at the time you’ll have seen the UI had a refresh, with a new initial homepage that surfaces content users recently worked on and content commonly used within their group. It also introduced the Action Hub, an evolution of the integrations feature recently added to the Looker 4 platform and used by Segment to enable users to route newly-discovered user cohorts to their platform and onwards to other marketing applications (the similarity to features Oracle added to their BI tool back in 2007 was noted by me at the time, and several times since).

Other UI improvements included a feature where statistical analysis is used to automatically suggest which other columns are commonly used by other users when you add columns yourself to a Look. This was added to help new users navigate what can often be fairly complex and lengthy explores with no particular indication of which columns are useful for a particular report.

Basic query federation will also come with this new release, giving users the ability to combine data in a query from two (or more?) explores and have the front-end environment join them together. It’s not clear whether this join happens in-memory in the web browser (I think so) or server-side, but it’s clear that these joins aren’t defined once-only in the LookML model and instead seem to be transitory, defined as needed by the user as part of an individual Look.

More importantly, as with all query federation features in tools going back to the original nQuire Query Server that provided the core of Oracle’s subsequent BI Server query federation technology, there’s a limit to the size of tables you can join, both from a network bandwidth perspective and because the server running the federation engine is typically much less powerful than the database server engines it sources the data from; something to bear in mind when thinking about joining BigQuery datasets to other sources once this feature becomes available.
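To make the client-side federation idea concrete, here is a hedged sketch, using pandas, of what joining two independently fetched result sets outside the database looks like in principle. Looker’s actual implementation details aren’t public, and the table and column names here are hypothetical; the point is simply that both result sets must travel over the network and fit in the federating layer’s memory, which is why table size matters.

```python
# Illustrative only: two result sets fetched from separate sources,
# joined in the client/federation layer rather than in either database.

import pandas as pd

# Pretend these came back from two separate explores/databases.
orders = pd.DataFrame({"customer_id": [1, 2, 3],
                       "order_total": [120.0, 75.5, 210.0]})
support = pd.DataFrame({"customer_id": [2, 3, 4],
                        "open_tickets": [1, 0, 3]})

# The join happens here, outside the databases, so every row of both
# result sets has to be shipped to, and held by, the federating engine.
merged = orders.merge(support, on="customer_id", how="outer")
print(merged)
```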

A very welcome development from my work perspective was the addition of what Looker are calling “Analytic Blocks”, “Data Blocks” and “Viz Blocks” to their existing Looker Blocks method of packaging up best-practice metadata models and analysis techniques shared by Looker and partner developers. Analytic Blocks seem a further development of existing Looker Blocks, and examples announced included ones for web analysis and cohort analysis, whilst Viz Blocks extend Looker’s capability to add custom data visualizations to the hosted (and more commonly-used) version of Looker. Data Blocks, meanwhile, package up LookML patterns for public datasources and commonly-used SaaS data services.

My thoughts on Looker evolving into a broader platform for hosting data-driven applications were confirmed when Looker announced new analytic apps. At first I thought I’d heard these positioned as having accompanying ETL routines, which as we know from bitter experience usually end up being more hassle than they’re worth, but they’re actually just examples of data, analytic and viz LookML blocks brought together to create a specific industry or horizontal application.

Finally, Looker announced one feature out soon and another due later in 2018 that addressed a problem I’d certainly identified myself but didn’t expect Looker to try to address, and I was pleased to see them making it a focus. The breakthrough that new cloud-based, massively-distributed query engines such as Google BigQuery give users is the ability to query petabyte-size datasets with response times typically within a minute or less, which is fantastic and was the topic of my presentation in the deep-dive sessions.

But users aren’t impressed that you’ve managed to query petabytes of data in under a minute; they want response times of under a second, and BigQuery just isn’t optimized for that. What is, though, is in-memory cache technology such as Druid, used by Superset and a few other cutting-edge BI tools, and Looker announced imminent support for Druid, with the proviso that functionality would be limited at first and evolve over time as the engineering team gets to grips with Druid’s different approach to data access and querying.

For myself, though, the real prize in developing out this query optimisation would be in making Looker aggregate-aware, so that pre-built aggregates in Druid, MySQL or other fast-response datastores could be mapped into Looker’s LookML model and used automatically by the query engine when data is requested at summary level. As far as I know no similar feature is on Google or Amazon Web Services’ roadmap, and therefore investment by Looker to solve this “last mile” query problem would not be wasted.
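As a rough sketch of what aggregate-awareness means in practice (not Looker’s design, which hadn’t been announced in any detail), here is a toy routing function: if a pre-built rollup covers the requested grouping columns, answer from it instead of the large base table. The registry format and table names are hypothetical, purely for illustration.

```python
# Toy aggregate-aware routing: pick the smallest pre-built rollup that
# can answer a grouped query, falling back to the detailed base table.
# All table names and the registry structure are made up for this sketch.

AGGREGATES = {
    # grouping columns covered -> pre-built rollup table
    frozenset({"order_date"}): "agg_sales_by_day",
    frozenset({"order_date", "region"}): "agg_sales_by_day_region",
}
BASE_TABLE = "sales_base"  # the large, slow, detailed source

def route_query(group_by: set[str]) -> str:
    """Return the cheapest table that can answer a grouped query."""
    candidates = [(cols, table) for cols, table in AGGREGATES.items()
                  if group_by <= cols]  # rollup must cover every column
    if not candidates:
        return BASE_TABLE
    # prefer the rollup with the fewest columns, i.e. the smallest one
    return min(candidates, key=lambda ct: len(ct[0]))[1]

print(route_query({"order_date"}))             # agg_sales_by_day
print(route_query({"order_date", "product"}))  # sales_base (no rollup fits)
```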

Another potential solution, shown in the final roadmap session at the end of the event, uses memory in the user’s desktop browser to cache data returned by a Look, so that all subsequent front-end user interactions run at split-second speed.

So, based on what I saw at Looker JOIN 2017, does Looker the BI tool, and Looker the company, have the potential to grow beyond its current super-enthusiastic market of tech startups and VC-funded retail and other high-growth companies and become the next Salesforce.com? Will it end up being bought by Google to complement their more basic, free-to-use Google Data Studio tool (Alphabet Ventures led the most recent round of investment in Looker back in May this year)?

Or will usage grow rapidly at first and then plateau, with further growth limited by the very platform choices that were innovative at the time but now limit their ability to take advantage of the next wave of IT innovation, as happened with Qlik and their eventual buyout by private equity?

Or will they suffer perhaps the worst outcome to this story and be bought by one of the software mega-vendors, to be adapted thereafter into the analysis and reporting solution for that company’s applications, with more general analytics innovation mostly ending as engineers depart and customers move on to the next point solution? Well, that may be the fate of BeyondCore, but having met and talked with their founder, Arijit Sengupta, last year just after their acquisition by Salesforce, I think they’ll end up subsumed as a distinct product within Einstein but play a big role in Salesforce’s drive to add AI to CRM and sales automation.

My opinion, from what I saw at JOIN 2017 and from speaking with their product teams, customers and founder Lloyd Tabb on a couple of occasions (one of which was at the main customer service desk, where I saw him regularly speaking with attendees and helping with support questions just after delivering the product keynote), is that they’ve got the vision, market positioning and ability to execute that give them a shot at the big prize, and enough VC funding to go for it without an immediate need to sell to a trade buyer and dilute their current focus. Good luck to them, and I’m already looking forward to next year’s JOIN event and a trip back to San Francisco.


Thoughts on Looker JOIN 2017 and the Looker Product Roadmap was originally published in Mark Rittman’s Personal Blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Why Bitcoin Will Ultimately Fail and What Will Come Next

We live in exciting times, where it has become possible to send money across the globe nearly instantaneously, where you can create value out of nothing and where we are working towards a future that is decentralised. The first application that kickstarted this revolution was Bitcoin, launched in 2009. Since then, the price of Bitcoin has increased dramatically, reaching $5,000 for the first time in September 2017. The news of the Chinese government banning ICOs resulted in a brief dip in the value of the coin, as well as of almost all other cryptocurrencies, but it recovered within two days. It seems that Bitcoin continues to rise in value, with multiple people predicting that Bitcoin will reach $10,000 by the end of 2017, $250,000 in ten years, $500,000 by 2030, or even a value of $1 million in 5-10 years. Obviously, there is hype going on if such enormous returns are predicted.

However, I believe that Bitcoin will not reach those valuations. In fact, I believe that Bitcoin will ultimately fail and be worth nothing or close to nothing. This will probably not happen in the short term. Most likely the value will keep ...


Read More on Datafloq
BOARD International: Cognitive, Mobile, and Collaborative

Business Intelligence (BI) and Enterprise Performance Management (EPM) software provider BOARD International recently released version 10.1 of its all-in-one BI and EPM solution. This release includes new user experience, collaboration, and cognitive capabilities, which will enable BOARD to enter the cognitive computing field.

By incorporating all these new capabilities into its single BI/EPM offering, BOARD continues to uphold its philosophy of offering powerful capabilities within a single platform.

With version 10.1, BOARD aims to significantly improve the way users interact with data. The new version’s interface introduces new user interaction functionality in areas such as user experience and storytelling and is a major improvement over that of the previous version.

BOARD gave me an exclusive overview of the main features of version 10.1 and the company's product roadmap. Read on for details.

Getting On Board with Cognitive Technologies

With version 10.1, BOARD seems to be making its solution fit for a new era centered on machine learning. The solution uses natural language recognition (NLR) and natural language generation (NLG) capabilities to offer users new ways to interact with data (see Figure 1).

Figure 1. BOARD’s assistant (image courtesy of Board International)

For instance, users can now create an entire report in a drag-and-drop interface. They can also directly ‘talk’ to the system through spoken and written language. The system uses search-like strings, automatically translating human speech into words, words into queries, and queries into reports that include the most important insights from the source information.

One key aspect of these features is that users can create a report by simply writing a search string or request. Specifically, BOARD uses a fuzzy search mechanism, which scans the string for character sequences that are not only identical but also similar to the query term, to transform the request into a machine-generated report (Figure 2).

Figure 2. BOARD’s machine-generated report analysis (image courtesy of Board International)
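To illustrate the fuzzy-matching behavior described above, here is a minimal Python sketch using the standard library’s difflib. BOARD’s actual engine is proprietary and surely far more sophisticated; the report names and the similarity cutoff are assumptions made up for the example.

```python
# Toy fuzzy search: match a (possibly misspelled) request string against
# known report names by character-sequence similarity.

import difflib

KNOWN_REPORTS = ["Sales by Region", "Inventory Turnover",
                 "Quarterly Revenue Forecast"]

def fuzzy_find(request: str, cutoff: float = 0.5) -> list[str]:
    """Return report names whose character sequences resemble the request."""
    lowered = {r.lower(): r for r in KNOWN_REPORTS}  # case-insensitive match
    hits = difflib.get_close_matches(request.lower(), list(lowered),
                                     n=3, cutoff=cutoff)
    return [lowered[h] for h in hits]

print(fuzzy_find("quartely revenu forcast"))  # ['Quarterly Revenue Forecast']
```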

BOARD can also identify, recover, and list reports that match the search criteria, such as reports generated by other users. This capability speeds up the solution development process by enabling users to identify existing work that can be used for a new purpose.


In-context Collaboration

BOARD has also improved its collaboration strategy, specifically by facilitating communication between users. The vendor has introduced an in-context collaboration feature that enables users to share their analyses, communicate via live chat, and edit and author reports together in a single interface. Embedded security (Figure 3) ensures users have the right level of access and defines user groups. This enables users to share analytics securely and seems to improve the overall analysis of data and the development of analytics apps.

Figure 3. BOARD’s embedded collaboration features (Courtesy of Board International)

User Experience and Storytelling

BOARD is also continuing to focus heavily on customer experience and functional efficiency.

The latest version of BOARD’s BI and EPM platform has redesigned user interfaces, including a color-coded tile menu with icons to improve hierarchy management and touchscreen usability. In addition, the configuration panel now offers more time and analytics functions.

10.1 also introduces Presentations, a new storytelling capability that enables users to personalize their reports and save them as a live presentation. Users can then share with other users and groups presentations that incorporate live information rather than static images and graphs, improving user collaboration.

This new feature lets BOARD stay up to date with current trends in BI and compete with other players in the field that already offer similar capabilities, such as Tableau and Yellowfin.

Mobility, Cognitive Capabilities, and Collaboration: BOARD’s Bet for the Future

BOARD also explained that it‘s paving the way for medium- and long-term product advancements.

In its latest release, BOARD has ensured its HTML 5-based client will replicate all the functionality of its existing Windows client interface in the future. This will enable users to choose freely between mobile and desktop devices.

10.1 also introduces new mobile apps and add-ons, which widen BOARD’s intrinsic analytics and data management capabilities and the solution’s mobile functions and features. The company is also currently reinforcing the product’s interaction with the Microsoft Office software stack in a continuous effort to help users increase productivity. This will help users conduct BI and EPM analysis more easily, as they will have access to embedded analytics services within standard Office applications such as Word and Excel.

Lastly, 10.1 also includes more features for accessing big data sources and cloud-based technologies, and BOARD has partnered with cloud CRM and business software leader Salesforce.com. It’s also worth noting that BOARD is now expanding its North American presence. Specifically, the vendor is increasing its human and material resources to reinforce its marketing and sales efforts and its support and services capabilities.

BOARD 10.1 offers a good balance of analytics and enterprise performance management capabilities. It could be a solution for those looking to start using analytics or enhance their existing analytics capabilities.

(Originally published on TEC's Blog)
Attention Entrepreneurs: Why You Need To Start Managing Your Data Now

Data is the foundation for every move you make. It drives your decisions and the way you interact with customers during the sales process. Collecting accurate data is vital, and there are serious consequences when the data you collect is inaccurate or incomplete.

Imagine this scenario:

You’ve set your marketing team up with a brilliant CRM to capture leads and track their every move. You know the importance of collecting data, and you didn’t spare a dime when it came to buying the best CRM software out there.

The sales rep told you their software is capable of tracking your contacts’ behaviour in depth. You can segment them based on the links they click (or don’t click), the videos they watch, the forms they fill out; you can even track them across the internet. Of course, you signed up.

Fast forward three years. You hire a top marketing manager to launch a massive marketing campaign to take advantage of the data you’ve been collecting for the past few years. They log into your account only to find a mess.

Your CRM is filled with unused tags, duplicate tags, and tags that are indecipherable. The people who created most of the tags have left the ...


Read More on Datafloq
Top-Paying Certifications for 2017

What is the next upcoming certification? Which certification should be on your credential list? The IT skills and salary survey conducted in 2017 will answer all your questions related to certification. The predicted salary may vary depending on location, organization and experience.

1. Certified in Risk and Information Systems Control (CRISC) (Annual Earnings: $131,298)

The ISACA (Information Systems Audit and Control Association) offers and manages the CRISC certification. The CRISC certification is specially designed for project managers, IT professionals and others whose job is to identify and manage business risks. According to the survey, over 20,000 people are certified CRISC professionals, and 96% of those have kept the certification active. The demand for professionals with these skills is higher than the number of certified professionals, making CRISC the highest-paying certification.

Certification Process: To become CRISC certified, one should have a minimum of three years of experience in the four areas covered by the certification. It is a computer-based examination, and aspirants need to register through the ISACA website. To maintain the certification, one needs to submit the required Continuing Professional Education (CPE) credits every year. The CRISC certification has been one of the top certifications for years, and with the popularity of cloud computing it will stay in high demand for years to come.

2. Certified Information Security Manager (CISM) (Annual Earnings: $128,156)

The CISM certification is another popular credential created and maintained by ISACA. This certification focuses on security strategy management and on evaluating systems and policies. Since its launch in 2002, over 32,000 professionals have been certified, making this a popular area with a small supply of certified individuals.

Certification Process: The process to get CISM certified is the same as for CRISC, and it is also a computer-based exam. To appear for the CISM certification exam, you must have at least 5 years of experience in information security, with a minimum of three of those years as a security manager. To maintain the certification, you need to submit the required Continuing Education Credits to ISACA each year.

3. AWS Certified Solutions Architect (CSA) Associate (Annual Earnings: $125,091)

The AWS Certified Solutions Architect (CSA) certification demonstrates an individual’s proficiency in designing and implementing scalable systems on AWS. The demand for highly skilled AWS solutions architects makes this one of the top-paying certifications. According to the stats, there are more than 10,000 certified professionals, which, considering the popularity of the AWS-CSA certification, is a very small number.

Certification Process: To get AWS-CSA certified, one needs a minimum of 6 months of experience with AWS. This is again a computer-based exam and can be taken at Kryterion testing centers. The certification course covers various topics, including selecting suitable AWS services, designing on AWS, ingress and egress of data, identifying cost-control measures and estimating AWS costs.

4. Certified Information Systems Security Professional (CISSP) (Annual Earnings: $121,729)

The CISSP certification proves an individual’s expertise in security management. This credential is offered and managed by the International Information Systems Security Certification Consortium (ISC)². Along with other security-related certifications, CISSP is a high-value certification in the field of security, but unlike the others, one can get an associate credential while gathering the required experience. There are around 111,000 certified professionals worldwide, almost two-thirds of them in the United States.

Certification Process: The CISSP exam is based on 8 areas of computer security, including communications and network security, security and risk management, asset security, software development security, identity and access management, security engineering, security operations, and security assessment and testing. The CISSP certification requires a minimum of five years of experience in the field of information security, with at least 3 years as a security manager. To maintain the certification, you need to submit Continuing Education Credits to the governing authority every year. You can apply for the CISSP Associate certification while earning the required experience for the CISSP.

5. Project Management Professional (PMP®) (Annual Earnings: $119,349)

PMP is a globally recognized certification for project management and the fifth highest-paying certification. It is offered and maintained by the Project Management Institute (PMI®). Across 210 countries and territories worldwide, there are almost 730,000 active PMPs.

Certification Process: The PMP exam focuses on the five process groups of a project (initiating, planning, executing, monitoring and controlling, and closing) and validates expertise in running projects successfully. To appear for the PMP certification exam, one must have a bachelor’s degree with 4,500 hours of project management experience, while those without a bachelor’s degree need 7,500 hours of experience. Moreover, individuals need to attend 35 hours of PMP-related training. To book an examination, individuals need to apply at the official PMI website. To maintain the certification, you need to submit 60 professional development units every three years.

6. Certified Information Systems Auditor (CISA) (Annual Earnings: $115,471)

The CISA certification is specially designed for professionals whose job responsibilities include monitoring, auditing, assessing and controlling business systems. It certifies a candidate’s ability to control vulnerabilities, propose processes and controls, and update business policies according to business standards.

Certification Process: To appear for the CISA certification exam, you need a minimum of five years of experience in information security auditing or control. To keep the CISA credential, one needs to earn continuing professional education credits.

7. Citrix Certified Professional – Virtualization (CCP-V) (Annual Earnings: $105,086)

CCP-V proves candidates’ ability to deploy virtual desktops and applications using a variety of Citrix technologies. The number of certified professionals is very low compared to the demand.

Certification Process: Only certified Citrix Certified Associate – Virtualization (CCA-V) candidates can apply for CCP-V. The certification is valid for three years, and to maintain it you need to submit continuing education credits.

8. ITIL® V3 Foundation (Annual Earnings: $103,408)

ITIL is a widely used framework in the world of IT management. ITIL Foundation is an entry-level certification in the ITIL lifecycle and offers an understanding of ITIL concepts and terminology. Selected authorized partners offer ITIL certification education, training and certifications. This certification focuses on the intersection of IT and business needs.

Certification Process: There are no prerequisites to fulfill before appearing for the ITIL certification exam. Just attend the training, register for the exam, pass it and get certified.

If you are looking to improve your skillset as well as your salary, you can pursue any of the certifications relevant to your field. While choosing a certification, consider your current skills, then select one that can power your career to the next level. If you have experience in storage or networking, you can pursue a cloud computing or virtualization certification. You can take the PMP® or ITIL® to speed up your management career. There are several training providers, like MSys Training, that guide and help you to get the certification that is right for you.


How to get the PMP Certification in just 9 Steps

Project Management Professional (PMP) is a globally recognized certification for project management, and the Project Management Institute (PMI®) is the governing body of this certification. As of 2017, there are around 730,000 PMP certified professionals across the world. If you are also looking to become a certified Project Management Professional, follow the steps below to learn how to get the PMP certification:

Step 1: Read PMI® Credentials Handbook

The PMI® organization has published a handbook, easily available on its official website. The PMI handbook explains everything one should know about the process of applying for the PMP, taking the exam and getting certified.

Step 2: Be Eligible for the PMP Certification

The PMI has set criteria for applying for PMP certification. To appear for the PMP certification exam, one must have a bachelor’s degree with 4,500 hours of project management experience, while those without a bachelor’s degree need 7,500 hours of experience. Moreover, individuals need to attend 35 hours of PMP certification training to earn the required contact hours. You can attend online PMP courses or an in-person PMP bootcamp to earn contact hours.
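As a quick sketch of this eligibility arithmetic, using the figures quoted in this post (4,500 hours with a bachelor’s degree, 7,500 without, plus 35 contact hours in both cases), the following checks whether a candidate meets the thresholds:

```python
# Eligibility check based on the thresholds stated in this post.

def pmp_eligible(has_bachelors: bool, pm_hours: int, contact_hours: int) -> bool:
    """Return True if the candidate meets the quoted PMP exam thresholds."""
    required_hours = 4_500 if has_bachelors else 7_500
    return pm_hours >= required_hours and contact_hours >= 35

print(pmp_eligible(True, 5000, 35))   # True
print(pmp_eligible(False, 5000, 35))  # False: needs 7,500 hours without a degree
```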

Step 3: Apply for the PMP Exam

You can apply for the PMP certification exam before or after completing your online PMP courses or in-person PMP bootcamp. To register, you have to visit the PMI website (http://www.pmi.org) and fill out the PMP application form. Once your PMP application has been approved, you can book an exam on the most suitable date.

Step 4: Read the PMBOK Guide

‘What is the PMP certification?’ The PMP certification is a globally recognized credential for project managers and is based on the PMBOK Guide. Hence, during PMP certification training, it is recommended to read the book multiple times.


Step 5: Online Study Material

Another way to start preparing for the PMP exam is listening to online audio books or using other study materials. Nowadays, there are numerous sources available to consult while prepping for the PMP exam. These prep materials make the concepts, tools, and techniques of the PMBOK easy and explain them with examples from everyday life.

Step 6: Attend PMP Bootcamp

You can attend PMP certification training either online or in person. There are several training providers, colleges and universities that run workshops around the world. A few organizations even provide a passing guarantee with their training programs. Just like other prep material, the PMP certification training will help you gain an understanding of the PMBOK® Guide. The major benefit is that you can discuss your problems with an instructor and fellow students.

Step 7: Attend as Many Simulation Tests as Possible

There are numerous online tests available on the internet. Also, many organizations offer simulation tests with their training programs that help you prepare for the PMP certification exam. You will see your score improve as you attempt more and more simulation exams. If you consistently score 85% or more in your simulations, you are ready to appear for the exam.

Step 8: Prepare a Study Plan

While planning for the PMP, it’s very important to decide on a schedule for your study. You need to plan everything, including how many hours a day you will dedicate to study, how many sample tests you will answer and how many chapters from the PMBOK Guide you will cover in that time. This way, you will know how many months you will need to prepare for the PMP. While planning your study, prepare yourself for ups and downs, as the PMP exam is a serious endeavor.
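As a toy illustration of this scheduling step, the sketch below turns chapter counts, practice tests, and weekly study hours into an estimated preparation time. All the example numbers are assumptions you would replace with your own.

```python
# Toy study-plan calculator: estimate preparation time from assumed inputs.

import math

def weeks_needed(chapters: int, hours_per_chapter: float,
                 practice_tests: int, hours_per_test: float,
                 study_hours_per_week: float) -> int:
    """Estimate how many weeks of preparation the plan implies."""
    total_hours = chapters * hours_per_chapter + practice_tests * hours_per_test
    return math.ceil(total_hours / study_hours_per_week)

# e.g. 13 PMBOK chapters at ~6 hours each, 8 full-length simulations at 4 hours
print(weeks_needed(13, 6, 8, 4, 10))  # 11 weeks at 10 study hours per week
```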

Step 9: Take the Exam

After all the planning and study, it is time to take the final exam. A few things will help you succeed in this last step. Don’t stress too much, especially the day before the exam. Get proper sleep, and enjoy your breakfast and the journey from your home to the examination center. You have studied hard and now you are ready for the final exam. Prior to the exam there is a short tutorial explaining how to use the computer and the software. After reading the instructions carefully, click the “Start Exam” button and begin your test.


Changes to be expected with the launch of PMBOK Guide Sixth Edition

To support the broadening spectrum of project delivery approaches, PMI is offering a Sixth Edition of the Project Management Body of Knowledge (PMBOK®) Guide. PMI also collaborated with the Agile Alliance to launch a new Agile Practice Guide that offers a complete understanding of Agile practices. The new edition of the PMBOK® Guide contains information about agile, whereas the Agile Practice Guide serves as a bridge between the waterfall model and agile.

Every 4 to 5 years, PMI updates the PMBOK Guide with the latest practices in project management. Each knowledge area of the Guide will contain a section on approaches for agile, iterative and adaptive environments, defining how these practices integrate into projects. The Guide will also contain detailed information on strategic and business knowledge, including information on the PMI Talent Triangle™, project management business documents and the critical skills required for success in today’s market.

 

The current Project Management Professional (PMP)® certification exam is based on current practices in the project management industry. As project management practices evolve, so does the PMP exam. With the launch of the 6th edition of the PMBOK Guide, the PMP exam will change soon. These updates will ensure that the PMP exam remains coordinated with the PMBOK® Guide.

What can be updated in the Exam?

Although the PMP certification exam is not a test of the PMBOK® Guide, the Guide is its primary reference. A few changes we can expect with the new edition are updates to the lexicon and terminology used within the certification exam, as well as harmonization of tools, techniques, and process groups.

If you use the PMBOK® Guide as a major study tool for the PMP exam, you will find the following updates in its sixth edition:

  • Addition of a new chapter on the role of the project manager, with an emphasis on leading projects effectively.
  • Renaming of two Knowledge Areas to more accurately reflect what is being managed:

* Time Management changed to Schedule Management

* Human Resource Management changed to Resource Management

All Knowledge Areas have four new sections:

  1. Key Concepts
  2. Tailoring Considerations
  3. Trends and Emerging Practices
  4. Considerations for Agile/Adaptive Environments

A Few Facts to Consider

  • The PMP certification exam will change in the first quarter of 2018. However, the exact date is yet to be announced.
  • Aspirants taking the exam prior to Q1 2018 will sit an exam that references the fifth edition of the PMBOK® Guide.

Reasons to Get PMI-ACP Certified

Reasons to Get PMI-ACP Certified

In today’s digital world, Agile has become the best way to work. Numerous training providers offer short-term as well as long-term certification courses, which makes it difficult for aspirants to select the certification best suited to them. Agile certification is trending nowadays, and the release of the PMBOK Guide 6th edition will bring Agile to project management as well. Is getting Agile certified by PMI worth it? Here are some reasons that prove the PMI-ACP is worth your time and money:

Validates Expertise

Though all certifications validate knowledge in their domain, the PMI-ACP® specifically confirms your expertise in Scrum and agile. Nowadays, many organizations explicitly ask for PMI-ACP certified candidates in their job descriptions. This certification can also help you take your career to new heights within your current organization.

Bigger Coverage

Unlike other agile certifications that focus on one or a few frameworks of the agile methodology, the PMI-ACP® has a very rich syllabus. It covers multiple frameworks, such as Scrum, XP, Lean, Kanban, and many others. The training and knowledge needed to sit for this certification will improve the value of a job seeker’s profile.

In Demand Across the Market

Today’s digital world keeps changing every day, and so do job trends. Every field in every industry demands Agile-certified professionals, which makes the PMI-ACP® a handy option. If aspirants want to open the gates to job opportunities in the market, they have to be agile-friendly. The supply of certified professionals is smaller than the demand, and people with the PMI-ACP certification will remain in demand until there is enough supply. To become one of these certified professionals, attend PMI-ACP training online or an in-person PMI-ACP bootcamp and get certified.

Higher Pay Scale

Although pay scale varies according to many factors, a relevant certification is one of the most important. PMI-ACP® certified professionals are paid more than their non-certified peers; the certification brings you a better position and package in return. Many surveys conducted by leading organizations clearly emphasize the value of the PMI-ACP certification.

PMI® Way

PMI® certifications hold tremendous credibility in the industry compared to those of other certification bodies. If you are looking to get ahead in your career with Agile, the PMI-ACP is the ultimate choice.

Confused among the training providers? The best way to select your training partner is to get in touch with them and get answers to all your queries; if you are satisfied with their answers, register with them. Several training organizations (like MSys Training) offer 100% pass assurance or a money-back guarantee. Such providers can prove to be the best option for you.


Robo-advisors: How Big Data is Changing the Financial Landscape

Robo-advisors: How Big Data is Changing the Financial Landscape

It wasn’t long ago that, for the layperson, investment automation seemed reserved for hedge funds and investment banks—monoliths of the investing world that play the numbers with analytics to gain an advantage penny upon penny. If you simply had a casual interest in stocks, the state of affairs wasn’t very encouraging. Big players stood to make a whole lot more than you from a wide variety and volume of investments, as well as lightning-fast, automated trades.

But now things are different. This summer’s news about robo-investing paints a picture of where we are now with big data and investing:


Betterment, the original robo-advisor, has some 300,000 users and is valued at over $800 million
Robo-analysts are able to pore over reams of data in financial statements, saving a great deal of time for fund managers and other decision-makers, who can then use the analyses to make stock-buying decisions
Anyone can use robo-advisors from fintech companies or banks to manage their portfolio
Fintech robo-advisors are competing with banks, but banks’ robo-advisors may face conflicts of interest in their recommendations, because their algorithms are skewed to prefer companies that pay the banks for marketing
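
At bottom, much of what these services automate is simple portfolio bookkeeping. As a minimal, hedged sketch (in Python), here is the kind of threshold-rebalancing logic at the heart of many robo-advisors; the asset classes, target weights, and 5% drift threshold are invented for illustration and are not any particular product’s algorithm.

```python
# Minimal sketch of robo-advisor-style threshold rebalancing.
# Asset classes, target weights, and the drift threshold are illustrative.

TARGETS = {"STOCKS": 0.60, "BONDS": 0.40}  # desired portfolio mix
DRIFT_THRESHOLD = 0.05                     # rebalance at 5 points of drift

def rebalance_orders(holdings: dict) -> dict:
    """Return buy (+) / sell (-) dollar amounts to restore target weights."""
    total = sum(holdings.values())
    orders = {}
    for asset, target in TARGETS.items():
        weight = holdings.get(asset, 0.0) / total
        if abs(weight - target) > DRIFT_THRESHOLD:
            orders[asset] = target * total - holdings.get(asset, 0.0)
    return orders

# A stock rally pushes equities to ~69% of the portfolio; the sketch
# recommends selling $1,200 of stocks and buying $1,200 of bonds.
print(rebalance_orders({"STOCKS": 9000.0, "BONDS": 4000.0}))
```

The point is not the arithmetic, which is trivial, but that software can apply it continuously, dispassionately, and at scale, which is precisely the edge the big players once kept to themselves.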


Robo-investing is blowing the door open on a whole new level of trade. Because ...


Read More on Datafloq
Yes, Artificial Intelligence Is Analytics

Yes, Artificial Intelligence Is Analytics

There seems to be some confusion as to exactly what artificial intelligence (AI) is, and how the discipline of AI should be categorized. Is AI a form of analytics or is it a totally new discipline that is distinct from analytics? I firmly believe that AI is more closely related to predictive analytics and data science than to any other discipline. One might even argue that AI is the next generation of predictive analytics. Additionally, AI is often utilized in situations where it is necessary to operationalize the analytics process. So, in that sense, AI is also often pushing the envelope of prescriptive, operationalized analytics. It would be a mistake to say that AI is not a form of analytics.

AI’s Relationship to Predictive Analytics

Let’s review a few basic facts that help define predictive analytics and then look at how AI fits well within those bounds. At its core, predictive analytics is, naturally, about predicting something. Who will buy? Will certain equipment break? Which price will maximize profits? Each of these questions can be addressed by following a familiar workflow:


First, we identify a metric or state that we want to predict and gather historical information on that metric or state. For ...
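
As a minimal sketch of that workflow, assuming a tabular dataset and using scikit-learn, with synthetic numbers standing in for real historical data:

```python
# Minimal sketch of the predictive-analytics workflow described above:
# gather historical data on a metric ("did the customer buy?"), fit a
# model, then predict for new cases. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Step 1: historical observations -- two features per customer and the
# outcome we want to predict (bought: 1, did not buy: 0).
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Step 2: fit a model on the past...
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Step 3: ...check how well it predicts outcomes it has not seen...
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# Step 4: ...and operationalize it by scoring a brand-new prospect.
print("will buy?", bool(model.predict([[1.2, -0.3]])[0]))
```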


Read More on Datafloq
“So Fricking Stupid”: Muddling Through Strategic Insolvency

“So Fricking Stupid”: Muddling Through Strategic Insolvency

As I have mentioned before, the United States faces a crisis of “strategic insolvency” with regard to the imbalance between its foreign and military policy commitments and the resources it has allocated to meet them. Rather than addressing the problem directly, the nation’s political leadership appears to be opting to “muddle through” by maintaining the policy and budgetary status quo. A case in point is the 2017 Fiscal Year budget, which should have been approved last year. Instead, Congress passed a series of continuing resolutions (CRs) that keep funding at existing levels while its members try to come to an agreement.

That part is not working out so well. Representative Adam Smith, the ranking Democrat on the House Armed Services Committee (HASC), warned earlier this week that the congressional budget process is headed for “a complete meltdown” in December, Sydney J. Freedberg, Jr. reported in Defense One. The likely outcome, according to Smith, will be another year-long CR in place of a budget. Smith vented that this would constitute “borderline legislative malpractice, particularly for the Department of Defense.”

Smith finds himself in bipartisan agreement with HASC chairman Mac Thornberry and Senate Armed Services chairman John McCain that ongoing CRs and the restrictions of sequestration have contributed to training and maintenance shortfalls that resulted in multiple accidents—including two U.S. Navy ship collisions—that have killed 42 American servicemembers this summer.

As Freedberg explained,

What’s the budget train wreck, according to Smith? The strong Republican majority in the House has passed a defense bill that goes $72 billion over the maximum allowed by the 2011 Budget Control Act. That would trigger the automatic cuts called sequestration unless the BCA is amended, as it has been in the past. But the slim GOP majority in the Senate needs Democratic votes to amend the BCA, and the Dems won’t deal unless non-defense spending rises as much as defense – which is anathema to Republican hardliners in the House.

“Do you understand just how fricking stupid that is?” a clearly frustrated Smith asked rhetorically. A possible alternative would be to shift the extra defense spending into Overseas Contingency Operation funding, which is not subject to the BCA, as has been done before. Smith derided this option as “a fiscal sleight of hand [that] would be bad governance and ‘hypocritical.’”

Just as politics has gridlocked budget negotiations, so too does it prevent flexibility in managing the existing defense budget. Smith believes a lot of money could be freed up by closing domestic military bases deemed unnecessary by the Defense Department and canceling some controversial nuclear weapons programs, but such choices would be politically contentious, to say the least.

The fundamental problem may be simpler: no one knows how much money is really needed to properly fund current strategic plans.

One briefer from the Pentagon’s influential and secretive Office of Net Assessment told Smith that “we do not have the money to fund the strategy that we put in place in 2012,” the congressman recalled. “And I said, ‘how much would you need?’…. He had no idea.”

And the muddling through continues.

Multi-Access Edge Computing – A Perfect IoT Technique

Multi-Access Edge Computing – A Perfect IoT Technique

The Internet of Things (IoT) is the vast world of things connected to the internet and to internet business. In this fast-paced world, the IoT has revolutionized the way we think about business. Here we turn to multi-access edge computing (MEC), an emerging and growing trend in the IoT.

MEC plays a key role as an enabler for the IoT because it is a valuable and widely recognized architectural concept and technology. That is why it is an important element of next-generation networks.

ETSI, the standards body behind MEC, describes it in general terms as an IT service environment with cloud computing capabilities at the edge of the mobile network, within the Radio Access Network (RAN). Below we discuss this IoT trend in more detail, to clarify its concept and its influence on IoT business.

Global Standardization of MEC

To standardize MEC globally, ETSI has established an Industry Specification Group (ISG) that provides an open and efficient environment for third-party applications across different multi-access platforms. The purpose of these specifications is to drive adoption around the globe while providing the framework of architecture references to ...
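
To make the pattern MEC enables concrete, here is a hedged Python sketch of edge-side aggregation: process data close to where it is produced, and ship only a compact summary upstream. The sensor, its readings, and the upload step are simulated stand-ins; ETSI’s specifications define no such API.

```python
# Illustrative sketch of the edge-computing pattern behind MEC: raw
# readings are aggregated at the network edge, and only a small summary
# crosses the backhaul to the central cloud. Everything here is simulated.
import random
import statistics

def read_sensor() -> float:
    """Stand-in for a device reading arriving at an edge node."""
    return random.gauss(21.0, 0.8)  # e.g., a temperature in degrees C

def edge_summarize(n_readings: int) -> dict:
    """Aggregate locally so the backhaul carries a summary, not raw data."""
    readings = [read_sensor() for _ in range(n_readings)]
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    }

def send_to_cloud(summary: dict) -> None:
    """Stand-in for the (much smaller) upload to the central cloud."""
    print("uploading summary:", summary)

# 1,000 raw readings stay at the edge; three numbers cross the network.
send_to_cloud(edge_summarize(1000))
```

Latency works the same way as bandwidth here: decisions that depend on those readings can be made at the edge node, next to the RAN, without a round trip to a distant data center.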


Read More on Datafloq
4 Problems Still Facing Rugged Data Storage

4 Problems Still Facing Rugged Data Storage

Most data storage devices on the market are relatively reliable, but only if kept from getting too hot or cold and in a vibration-free place.

In contrast, rugged data storage devices are among the toughest available. Preferred by military personnel and others who regularly encounter demanding environments, these tech tools can withstand extreme temperatures, plus shocks and vibrations.

Below, we’ll look at four things that still make efficient use of rugged data storage difficult.

1. Storage-Related Delays Compromise Real-Time Intelligence Decisions

One of the primary reasons the military uses rugged data storage is the need to make in-the-moment decisions that could affect national or international security.

There is a huge and growing volume of data, and delays in processing or storing it can lead to outdated intelligence. To stay competitive, today’s providers of rugged data devices must ensure they meet the fast-paced demands of people who rely on them while making time-sensitive judgments.

2. Next-Generation Flight Data Recorders Are Expensive

Flight recorders, also known as aerospace data recorders, are the most commonly used pieces of equipment that transmit real-time data during air travel. They are part of a booming sector projected to be worth more than $2 million by 2025, up from a market value of just over $1.4 million ...


Read More on Datafloq
The Pros And Cons Of Shooting Down North Korean Ballistic Missile Tests

The Pros And Cons Of Shooting Down North Korean Ballistic Missile Tests

Two THAAD interceptors and a Standard-Missile 3 Block IA missile were launched resulting in the intercept of two near-simultaneous medium-range ballistic missile targets during designated Flight Test Operational-01 (FTO-01) on September 10, 2013 in the vicinity of the U.S. Army Kwajalein Atoll/ Reagan Test Site and surrounding areas in the western Pacific. The test demonstrated the ability of the Aegis BMD and THAAD weapon systems to function in a layered defense architecture. Photos taken by Missile Defense Agency. (Photo Credit: Missile Defense Agency)

On 3 September, North Korea tested what it claimed to be a thermonuclear warhead that can be mounted on a ballistic missile. While analysts debate whether the device detonated actually was a deliverable thermonuclear bomb, it is clear that the regime of Kim Jong Un is making progress in developing the capability to strike the United States and its regional allies with nuclear weapons.

Is there anything that can be done to halt North Korea’s weapons development and mitigate its threatening behavior? At the moment, there appear to be few policy options, and each of them carries significant risk.

  1. Launch a preemptive military strike.
  2. Enlist or coerce China into reining in North Korea’s adventurism.
  3. Accept the fact that North Korea is now the ninth nuclear power in the world—with the capability to strike the U.S. and its regional allies with nuclear-armed ballistic missiles—and adopt the Cold War approach of containing it militarily and limiting its nuclear arsenal through negotiation.

Would attempting to shoot down forthcoming North Korean ballistic missile test launches be a viable policy alternative for the U.S. and its allies? Geof Clark proposed this option in a recent post:

I would argue that the U.S. use the United Nations as a forum to define the parameters for any possible North Korean missile launch that should be intercepted with allied BMD [ballistic missile defense] assets. If, for example, a North Korean missile looks likely to hit close to Tokyo, based upon the trajectory identified by Aegis ships at sea, then BMD should shoot it down. By making our rules of engagement public, this would provide a clear signal to China and Russia that the U.S. and its allies intend to use their BMD capabilities (and potentially learn from any failures) against live enemy missiles, but also temper the risk of escalation into any further missile volleys between any parties.

A number of commentators questioned why the U.S. or Japan elected not to attempt to intercept North Korea’s 29 August ballistic missile test that flew directly over Japanese territory. A variety of technical and political issues were cited as justification for restraint. The U.S. and Japan resorted to the usual mix of condemnation and calls for further economic sanctions.

What are the arguments for and against a policy of intercepting North Korean missile tests?

Pros:

  1. The main argument in favor of this is that it could change the narrative with North Korea, which goes like this: Kim’s government stages some provocation, and the U.S. and its allies respond with outraged rhetoric, diplomatic moves to further isolate Kim’s regime, and the imposition of a new round of economic sanctions. It is hard to see how much more isolated North Korea can be made, however, and the vast majority of its trade is with a benevolent China across a porous border. This story has played out repeatedly, yet nothing really changes. Shooting down North Korea’s missile tests could change this stale narrative by preventing it from conducting provocations without consequence.
  2. It would send a strong message to North Korea.
  3. It is not out of line with the provocations North Korea has committed over the years (for example, sinking a South Korean patrol boat in 2002).
  4. It is a step short of a preemptive strike by the U.S. and its allies.
  5. It could stall North Korean missile development (especially the ballistic cap, which the North Koreans still have not developed).
  6. It could provide the basis for negotiations.
  7. It is a credible threat (unlike threatening trade sanctions against China to coerce it into restraining North Korea).
  8. It would embarrass Kim’s government by demonstrating that its threats are no longer effective.

Cons:

  1. Which missile tests would be shot down? The U.S. has already declared that any North Korean missile that appears to be targeted at the territory of the U.S. or its allies would be engaged by BMDs and considered an act of war. (The determination that the 29 August test was not aimed at friendly territory was a major factor in the decision not to engage it.) The Trump administration has repeatedly warned the North Koreans of a massive military response to any perceived attack.
    1. Intercepting a North Korean test flying over Japan or into international waters would likely be interpreted by Kim’s regime as a deliberate escalation of the conflict. Such an act would probably extinguish what some have seen as signals from North Korea of a willingness to engage in diplomatic talks, and could precipitate counter-provocations in what is already a highly tense stand-off.
    2. Some have speculated that the North Koreans may attempt to launch a ballistic missile carrying what many believe to be a recently-tested thermonuclear warhead. The consequences of an attempt to intercept such a test would inevitably be dire.
    3. What about targeting North Korean short-range ballistic missiles, or long-range missiles tested at short ranges? Intercepting these tests would pose formidable technical challenges for U.S. and allied BMD systems. The risk of failed intercepts would increase, and the level of provocation to the North Koreans would be very high.
  2. China might interpret an attempted intercept as a violation of North Korean sovereignty. Although the Chinese have expressed frustration with North Korea’s behavior, it remains a Chinese client state. While certainly provocative, North Korea’s missile tests over Japan are not a clear-cut violation of international law. China remains committed to defending North Korea against foreign threats. Intercepting an allegedly “peaceful” ballistic missile test could easily bring China to North Korea’s overt assistance. This would run contrary to the Trump administration’s avowed policy of enlisting the Chinese to restrain Kim’s government, and it raises the potential for a direct U.S./China confrontation.
  3. It is not at all clear that key U.S. allies South Korea and Japan would support a policy of intercepting North Korean missile tests not aimed at their territory. The U.S. needs permission from these countries to deploy its theater BMD systems within range of North Korean missiles. South Korea is already ambivalent about hosting U.S. BMDs and Japan has indicated that it will maintain its own policy regarding intercepting potential threats. An aggressive U.S. policy could risk damaging or splitting the alliance.
  4. A vow to intercept North Korean missile tests would place enormous pressure on U.S. and allied BMDs to perform effectively, a capability that remains highly uncertain. While theater BMDs have performed better in tests than the U.S. intercontinental Ground-Based Midcourse Defense (GMD) system, it is unlikely they can intercept every potential target. Any weaknesses demonstrated by theater BMD increase the political effectiveness of North Korea’s putative ballistic missile capability. The current ambiguity works in favor of the U.S. and its allies; dispelling the uncertainty would be a high price to pay in any circumstance other than the defense of U.S. or allied territory.
  5. It is not evident that suppressing North Korean missile tests at this point would have a significant impact on its capabilities. North Korea has already demonstrated that its ballistic missiles work well enough to pose a clear threat to the U.S. and its allies. Further testing would only refine existing technology to reduce the probability of technical failures.

Like the other available policy options, this one too carries a mix of potential benefits and risky downsides. The consequences of attempting to implement it cannot be completely foreseen. What does seem clear is that the existing approach does not seem to have worked. Successfully resolving a problem like North Korea is likely to take time, patience, and no small amount of imagination.


Equestrian Sports – Where Trots Get Digital and Technology Meets Tradition

Equestrian Sports – Where Trots Get Digital and Technology Meets Tradition

Equestrian sport is one of the oldest forms of sports entertainment; its history dates back to ancient Greek civilization. Since then, the trots have transcended the barriers of each period, walking stride for stride with the customs of every age and entertaining the crowds in the process. And in this present age, where everything is digitized, the trots have become digital too.

I recently attended the annual World Equestrian Festival, CHIO Aachen—the Wimbledon of equestrian sports—and interviewed Ingrid Klimke, the two-time Olympic equestrian champion; Michael Mronz, the organizer of the CHIO Aachen; and Björn Ganzhorn, the head of SAP Global Sponsorships.

We have heard about the deployment of SAP-powered solutions for enterprise management. Now SAP is also making strides in the world of sports, integrating data and analytics with the traditional thrill and passion to dramatically transform the sports experience for fans, media, athletes, and organizers. Leading this race in the adoption of data- and analytics-driven solutions is equestrian sport.

CHIO Aachen and Data & Analytics

In 1924, the first horse show took place at the Aachen Soers, which remains the venue of the CHIO Aachen to this day. Since then, the organizers of the CHIO Aachen ...


Read More on Datafloq
Would You Give Up Your Data for Science?

Would You Give Up Your Data for Science?

The world is not as private a place as it might seem.

Even though you might be sitting quietly at work, on your way back home on the train or relaxing in front of the latest Netflix series, your life is being scrutinised. Some of the data will be for your eyes only, and there is a certain amount of choice as to what is in the public domain, but much of the snooping is still done without our knowledge or consent.

The data genie is well and truly out of the bottle, but if he is to be fully empowered, we have to get used to the idea that privacy is an outdated concept.

Many of us are happy to donate our organs after we leave this world to benefit others, but why is there still a reluctance to share the data from our tech wearables? This could save multiple lives.

I suppose that it stems from the fact that we have been taught not to “talk to strangers.” The more others know about us, the more power they have over us, and sharing our data with some faceless corporation is still seen as too much of a risk.

Maybe we are still in ...


Read More on Datafloq
Why Do You Need Cloud-Based Learning Management System In Your Organization?

Why Do You Need Cloud-Based Learning Management System In Your Organization?

Today, organizations face more challenges and greater demands to deliver on their promises, and that, too, against the backdrop of an extremely competitive environment. It is not easy to provide real value for customers’ money anymore. So, professionals need to be more creative, more informed, more productive, and more advanced in their approach to delivering quality products and services.

It has been one of the foremost objectives of organizations to enhance the professional development and education of employees, because doing so can make a significant impact on the organization’s value proposition. This is why an integrated cloud-based learning management system (LMS) could be a fitting answer for strong professional development and education. This new-age combination of online education, virtual classroom settings, social and mobile learning, and performance support can change the operations of any organization for the better.

If you have been wondering whether a cloud-based LMS would greatly benefit your company, and you want to incorporate one, then you are on the right track. You probably have your own apprehensions about switching to a new system; but let me tell you, there are several reasons why a cloud-based system is the best choice for your organization’s eLearning program.

What exactly is a cloud-based LMS?

Simply ...


Read More on Datafloq
Virtual Reality App Development And The Opportunity It Presents For Your Business

Virtual Reality App Development And The Opportunity It Presents For Your Business

As mobile technology continues to break new boundaries and set new records on a daily basis, virtual reality is fast becoming a reality. For many industries and businesses, including health, hospitality, education, transportation, and manufacturing, virtual reality app development is a “gold mine.”

The application of virtual reality in business has revolutionized how business is run. Today, traditional business practices are being replaced with new and innovative techniques, and this has had a tremendous effect on growth.

Below are some ways VR app development can benefit your business:

1. Show and Tell: Virtual Reality can help you showcase your products

Even though mobile technology seems to have affected almost every area of our daily lives, many people prefer the traditional way of seeing a product they intend to buy before making a choice. With virtual reality apps, they will not just be able to see the product; they will be able to “experience” it: its functionality, usability, design, features, etc., irrespective of its size. For example, VR gives tourists the opportunity to experience places in an immersive 3D environment. As a result, they can discover potential travel destinations even before they set out to buy travel tickets.

The man who runs a hotel can give his clients a 3D ...


Read More on Datafloq
3 Ways the IoT for Cars is Evolving Travel

3 Ways the IoT for Cars is Evolving Travel

The connected cars of tomorrow are right around the corner. There are myriad ways the Internet of Things [IoT] will revolutionize the way we travel, whether it’s by car, bus, boat or train. Carmakers connect their vehicles to the IoT in one of two ways, or both: technology that is built-in, or embedded in a vehicle, and technology that is tethered to a mobile device you bring along for the ride. The production of embedded and/or tethered connected vehicles is forecast to increase from 12.4 million in 2016 to 61 million in 2020.

Here are the top three ways the IoT will impact travel in the future, beginning with ways in which it is already doing so.

1. Smart Car Convenience

With today’s 4G networks powering navigation systems, alerting us to traffic conditions, and feeding content to in-vehicle infotainment systems, one could argue that it already has. With features like GPS-aided navigation, voice-activated infotainment systems, and power ports for plug-ins galore, many vehicles are now connected, comfortable, state-of-the-art living rooms on wheels. Internet connectivity also allows vehicle owners to analyze their vehicle’s performance and help diagnose problems requiring repair.

The rapid growth expected in the IoT will be driven largely by the connected vehicle – ...


Read More on Datafloq
Virtual Reality, So What’s All the Fuss About?

Virtual Reality, So What’s All the Fuss About?

Virtual reality has become one of the fastest-growing industries of our modern age. A once-exclusive sideshow has now become affordable and widespread. Major companies such as Google, Facebook, Sony, and HTC have all invested massively in the emerging technology and are driving it forward.

The most widespread use of VR today is within the video games industry. In a medium where immersion in the game world is key, VR is a natural fit. There are already a number of brands pushing the boundaries of our virtual experiences. Oculus Rift, Sony PlayStation VR, and the HTC Vive all offer high-end headsets to purchase and play at home. But what is the future of VR in our society?

Aside from gaming, virtual reality has many implications, some rather frightening, some potentially beneficial. Within the business world, people could potentially conduct meetings anywhere in the world. One would only need a headset and access to a virtual boardroom. Communication between people would become that much more interactive.

In education, children might one day be taught entirely online, with books replaced by a virtual world where they are not only told what happens but can literally see it happening. Imagine a program that allows students to see how everything ...


Read More on Datafloq
How Conversational AI Will Change Customer Service

How Conversational AI Will Change Customer Service

By 2020, approximately 20.4 billion devices are estimated to be connected to the internet. These IoT devices are getting smarter, connecting to intelligent applications, such as Amazon's Alexa or Apple's Siri, and helping consumers make transactions and complete tasks. However, they are also sparking conversational AI, and it stands to change customer service. Explore conversational AI, its benefits and challenges, and how it will help change customer service:

What Is Conversational AI?

Conversational AI consists of an advanced technology that uses natural language processing (NLP) so that computers can comprehend human language. Conversational AI includes a variety of technologies, such as chatbots, advanced notifications and personal assistants.
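
As a toy illustration of the first step such systems perform, mapping a user’s free-text utterance to an intent and a canned response, consider the sketch below; production conversational AI relies on trained NLP models rather than the keyword matching shown here.

```python
# Toy sketch of intent recognition, the first step in a conversational
# AI pipeline. Real systems use trained NLP models; this keyword matcher
# only illustrates the utterance -> intent -> response flow.
import re

INTENTS = {
    "check_order": {"order", "shipped", "tracking", "delivery"},
    "billing": {"charge", "charged", "refund", "invoice", "payment"},
}

RESPONSES = {
    "check_order": "Let me look up your order status.",
    "billing": "I can help with billing questions.",
    "fallback": "Sorry, could you rephrase that?",
}

def classify(utterance: str) -> str:
    """Pick the intent whose keywords overlap the utterance the most."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    best, best_hits = "fallback", 0
    for intent, keywords in INTENTS.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = intent, hits
    return best

print(RESPONSES[classify("Where is my order? I need the tracking number")])
print(RESPONSES[classify("I was double charged and want a refund")])
```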

Technical Challenges

Conversational AI is not without its challenges. Some of the technical challenges it faces concern the user interface. For example, product browsing lacks support due to the single-column layout of many chat messaging canvases. There is also no directory organizing the various bots. Moreover, bots still require time to learn patterns of human behavior and speech, such as the correct use of pronouns, to provide a user experience that is natural and effortless.

Key Benefits

Organizations can use conversational AI in several ways, including via chatbots, and there are myriad benefits ...


Read More on Datafloq
Status of Books

Status of Books

War by Numbers: Understanding Conventional Combat: For some reason, Amazon.com does not have a Kindle edition available at the moment (I recall that they did). I have talked to the publisher and they are looking into it. The paperback edition is for sale on Amazon.com and of course, University of Nebraska Press. I have heard that some people overseas have gotten copies, but other people are having a problem. I also have the publisher looking into that. There is one 5-star review of the book on Amazon.com. I don’t know the reviewer (meaning it is not a planted review).

Kursk: The Battle of Prokhorovka: The book has been selling at a consistent rate this year, and at that rate, it will be out of stock in the second half of 2018. If you are thinking about getting it, you probably don’t want to tarry too long. There are currently no plans for a re-print.

America’s Modern Wars: Understanding Iraq, Afghanistan and Vietnam: I do consider this the most significant of my three books, and of course, it is the one with the worst sales. I guess the study and analysis of insurgencies is passé, as we have done such a great job of winning these types of wars.

Simple Ways EVERY Business Can Lend a Helping Hand in a Crisis

Simple Ways EVERY Business Can Lend a Helping Hand in a Crisis

This article originally appeared on LinkedIn. Follow me on LinkedIn to join the conversation.

The recent floods in Texas and Irma beating down on the Caribbean have dominated the news. Stories of strength and persistence give us hope, while tales of loss make us grateful. Even from far away, I am touched by the grand gestures many businesses are making to help those most in need, like this one from my friend Dan Briscoe and his team at HCSS, or Amazon, which made it easy for The Marketing Advisory Network to donate money to support the American Red Cross efforts on the ground.

Volunteering time and making donations are just some of the ways your business can help. I’ve put together a list of easy things any business can do to support a community in its time of need.

Extend payment terms – Remove financial concerns by notifying customers in impacted zones that payment terms have been extended for an extra 30 or 60 days without penalty for any invoice outstanding. The bottom line impact is tiny compared to the gesture of good will.

Pause your promotional communications – I promise you the last thing on the minds of residents crushed by flooding is your web seminar. Pause promotional communications for at least two weeks following an emergency situation.

Place impacted contacts on your sales do not call list temporarily – Temporarily place contacts in impacted areas on your sales do not call list to avoid causing frustration. This is not the time to “take advantage” of fear and uncertainty.

Proactively communicate about shipping/delivery delays – Your customers may be affected by the emergency even if they are outside of the impact zone.

Don’t raise prices – Airlines are coming under fire for dramatic price hikes in the wake of Irma predictions. The internet is not happy and is responding with ire toward airline brands, except for JetBlue, which capped fares leaving Florida at $99 and is winning hearts across the country. Any short-term revenue you gain from price spikes will be offset by a long-term hit to your reputation.

It’s not only right to respond when your customers need you, it’s good business.

The Benefits of Data Analytics for Insurance Businesses

The Benefits of Data Analytics for Insurance Businesses

“Data” this. “Analytics” that. You’re probably tired of hearing all these buzzwords. They’re just useless hype anyway, right?

Not so fast.

Taking advantage of data analytics can actually make all the difference between a failing insurance business and a thriving one. Here’s how data analytics can help you improve the performance of your business.

It can help you increase profitability

Insurance is traditionally a “paper and ink” business. But today’s agent is outmatched by the consumer, who has unlimited access to information on the Internet.

Meanwhile, what data is the agent using? Sadly, most of them aren’t using anything more sophisticated than the agency’s quoting software and email. Here’s where new digital platforms and data analytics come into play.

Analytics can tell you who owns what policy, how many policies they own, and where potential gaps in coverage are. And while insurance brokerages can keep track of policies in their current database system, they often can’t share that information with other offices, and sometimes it’s even difficult to share information with other colleagues in the same office.

Being able to see and share data allows agents to make smarter, more personalized recommendations to customers, cross-sell products when appropriate, and stay on top of customers’ insurance needs.
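
As a hedged sketch of the kind of query this enables, who owns which policy lines and where the gaps are, here is a minimal example using pandas; the customers and policy lines are invented for illustration.

```python
# Hedged sketch of coverage-gap analysis: given which policies each
# customer holds, flag missing lines of business as cross-sell leads.
# Customers and policy lines are invented for illustration.
import pandas as pd

LINES = ["auto", "home", "life"]

policies = pd.DataFrame(
    {
        "customer": ["ana", "ana", "ben", "cho"],
        "line": ["auto", "home", "auto", "life"],
    }
)

# Pivot into a customer-by-line ownership matrix...
owned = pd.crosstab(policies["customer"], policies["line"]).reindex(
    columns=LINES, fill_value=0
)

# ...then list each customer's uncovered lines as potential gaps.
gaps = {
    customer: [line for line in LINES if not row[line]]
    for customer, row in owned.iterrows()
}
print(gaps)  # {'ana': ['life'], 'ben': ['home', 'life'], 'cho': ['auto', 'home']}
```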

For example, if ...


Read More on Datafloq
3 Data Driven Technologies That Are Opening Our Eyes Wider

3 Data Driven Technologies That Are Opening Our Eyes Wider

Every morning when you open your eyes, you are flooded with information about the world around you, information that gives you the perspective you need to get out of bed without tripping and knocking your head on the floor. Without collecting data about the world around us, we would have hardly any idea how to get what we want. Data-driven technologies help us gather that data and put it into a perspective we can maneuver, so that we can see a clear path from Point A to Point B.

Although data-driven decision making has existed for many years, a couple of new technologies are coming out of the woodwork that have begun changing the game for businesses. Some of them are even helping smaller operations outpace larger competitors. These are the technologies we will have to keep an eye on in the coming years as they continue to shape the playing field.

Business Intelligence Becomes More Accessible And Agile

BI is a pretty old technology that has been around since companies began storing their customers’ information on magnetic hard disks instead of typing it onto sheets of paper. The internet and the advent of cloud computing have made this process much smarter and ...


Read More on Datafloq
Click & Ready: Setup a virtual training classroom

Click & Ready: Setup a virtual training classroom

For our upcoming event “BusinessObjects Arbeitskreis,” or simply BOAK (www.boak.ch), we are providing hands-on sessions to participants. We have used Cloudshare as a platform for multiple years now to provide every participant with their own virtual machine environment. This year we have the chance to use the Training module of the Cloudshare platform. This module simplifies the setup of user environments massively and gives us – the instructors – better capabilities to assist our participants. In this article I quickly walk you through how you can set up a virtual training classroom in Cloudshare:

Create a new class

Open the Training module in the web interface of Cloudshare and choose “Create a New Class”. Give it a name and choose which blueprint and snapshot of your VM environment you want to use for the class.

In addition, choose an environment policy. The policy controls how long an environment can be used and after which idle time an environment is suspended. Here you see an example of an environment policy:

Next, you have to choose a starting time for your class. Important: this starting time is used by Cloudshare to prepare all the necessary VM environments in the background, so that your students don’t have to wait once they log in:

Finally, specify a passphrase and a maximum number of students. You can also allow non-registered users to join – an option we are grateful for during BOAK.

Manage students & run the class

After creating the class, you can register students and optionally send out an invitation email to them.

All students need to join a class is the Student Login Link and the passphrase:

If you allow unregistered users to join the class, you can force them to provide some details like name and company:

Once you are logged in, the student’s personal environment is already up and running – just click “Start Using This Environment”:

Click on “View VM”

Now the HTML5-based remote desktop session opens directly within the browser – please note: no add-ons are needed for this! This helps us keep the cost of additional configuration on our training laptops to an absolute minimum.

As a student I can ask for help, chat with the instructor or with the whole class.

As an instructor I can switch to the Instructor Console where I see a thumbnail of every student’s screen:

If a student asks for help, I can quickly “zoom in” and see the student’s screen live:

If needed, the instructor can take over control of a student’s screen:

Conclusion

In this blog post I’ve shown how you can use Cloudshare’s Training Module to set up and run a virtual training class. I have to admit that I’m pretty amazed at how easy this feature was to use. So far everybody is happy with it:

  • Our IT administrators because it reduces laptop-side configurations.
  • Me as an instructor: I can easily invite students and run a class without deep technical knowledge of virtualization, VM management, etc.
  • Our students: so far they have been very happy with the solution. It is easy to use, and the performance is as if you were working on your local PC.


What Skills Do I Need to Become a Data Scientist?

What Skills Do I Need to Become a Data Scientist?

Leveraging the use of big data as an insight-generating engine has driven the demand for data scientists at the enterprise level, across all industry verticals. Whether it is to refine the process of product development, help improve customer retention, or mine through the data to find new business opportunities—organizations are increasingly relying on the expertise of data scientists to sustain, grow, and outdo their competition.

Consequently, as the demand for data scientists increases, the discipline presents an enticing career path for students and existing professionals. This includes those who are not data scientists but are obsessed with data, which has left them asking:

What skills do I need to become a data scientist?

This article aims to answer this question. We will dive into the technical and non-technical skills that are critical for success in data science.


If you are a potential data scientist, you can use the information herein to carve out a successful career in data science.
If you are a data analytics director at an organization, you can leverage this information to train your existing team of data scientists, making them more productive and efficient at their work.


This is an address for all those who love to wrangle and rumble ...


Read More on Datafloq
Recent Academic Research On Counterinsurgency

Recent Academic Research On Counterinsurgency

An understanding of the people and culture of the host country is an important aspect of counterinsurgency. Here, 1st Lt. Jeff Harris (center) and Capt. Robert Erdman explain to Sheik Ishmael Kaleel Gomar Al Dulayani what was found in houses belonging to members of his tribe during a cordon and search mission in Hawr Rajab, Baghdad, Nov. 29, 2006. The Soldiers are from Troop A, 1st Squadron, 40th Cavalry Regiment. (Photo Credit: Staff Sgt. Sean A. Foley)

As the United States’ ongoing decade-and-a-half-long involvement in Afghanistan remains largely recessed from the public mind, the once-intense debate over counterinsurgency warfare has cooled as well. Interest stirred mildly recently as the Trump administration rejected a proposal to turn the war over to contractors and elected to slightly increase the U.S. troop presence there. The administration’s stated policy does not appear to differ significantly from the one that preceded it.

The public debate, such as it was, occasioned two excellent articles addressing Afghanistan policy and relevant recent academic scholarship on counterinsurgency, one by Max Fisher and Amanda Taub in the New York Times, and the other by Patrick Burke in War is Boring.

Fisher and Taub addressed the question of the seeming intractability of the Afghan war. “There is a reason that Afghanistan’s conflict, then and now, so defies solutions,” they wrote. “Its combination of state collapse, civil conflict, ethnic disintegration and multisided intervention has locked it in a self-perpetuating cycle that may be simply beyond outside resolution.”

The article weaves together findings of studies on these topics by Ken Menkhaus; Romain Malejacq; Dipali Mukhopadhyay; and Jason Lyall, Graeme Blair, and Kosuke Imai. Fisher and Taub concluded on the pessimistic note that bringing peace and stability to Afghanistan may be on a generational time scale.

Burke looked at a more specific aspect of counterinsurgency: the relationship between civilian casualties and counterinsurgent success or failure. Separating insurgents from the civilian population is one of the central conundrums of counterinsurgency, referred to as the “identification problem.” Burke noted that current U.S. military doctrine holds that “excessive civilian casualties will cripple counterinsurgency operations, possibly to the point of failure.” This notion rests on the prevailing assumption that civilians have agency, that they can choose between supporting insurgents or counterinsurgents, and that reducing civilian deaths and “winning hearts and minds” is the path to counterinsurgency success.

Burke surveyed work by Matthew Adam Kocher, Thomas B. Pepinsky, and Stathis N. Kalyvas; Luke Condra and Jacob Shapiro; Lyall, Blair, and Imai; Christopher Day and William Reno; Lee J.M. Seymour; Paul Staniland; and Fotini Christia. The picture portrayed in this research indicates that there is no clear, direct relationship between civilian casualties and counterinsurgent success. While civilians do hold non-combatant deaths against counterinsurgents, the relevance of blame can depend greatly on whether the losses were inflicted by locals or foreigners. In some cases, counterinsurgent brutality helped them succeed or had little influence on the outcome. In others, decisions made by insurgent leaders had more influence over civilian choices than civilian casualties did.

While the collective conclusions of the studies surveyed by Fisher, Taub and Burke proved inconclusive, the results certainly warrant deep reconsideration of the central assumptions underpinning prevailing U.S. political and military thinking about counterinsurgency. The articles and studies cited above provide plenty of food for thought.

Introduction to Blockchain & What It Means to Big Data

Introduction to Blockchain & What It Means to Big Data

“Arguably the most significant development in information technology over the past few years, blockchain has the potential to change the way that the world approaches big data, with enhanced security and data quality just two of the benefits afforded to businesses using Satoshi Nakamoto’s landmark technology.”

What is a Blockchain?

Blockchain is a distributed database system that acts as an “open ledger” to store and manage transactions. Each record in the database is called a block and contains details such as the transaction timestamp as well as a link to the previous block. This makes it practically impossible for anyone to alter records retrospectively. Also, because the same transaction is recorded across multiple, distributed database systems, the technology is secure by design.

With the above in mind, blockchain is immutable – information remains in the same state for as long as the network exists.
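
To make that structure concrete, here is a minimal Python sketch of a hash-linked chain: each block carries a timestamp, some data, and the hash of the previous block. It is nothing like a production blockchain, but it shows why a retroactive edit is immediately detectable.

```python
# Minimal sketch of the block structure described above: each block
# stores a timestamp, some data, and the previous block's hash, so
# altering any historical record breaks every later link.
import hashlib
import time

def block_hash(block: dict) -> str:
    payload = f"{block['timestamp']}|{block['data']}|{block['prev_hash']}"
    return hashlib.sha256(payload.encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    chain.append({
        "timestamp": time.time(),
        "data": data,
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    })

def is_valid(chain: list) -> bool:
    """Recompute each link; a retroactive edit shows up as a mismatch."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
add_block(chain, "alice pays bob 5")
add_block(chain, "bob pays carol 2")
print(is_valid(chain))                    # True
chain[0]["data"] = "alice pays bob 500"   # tamper with history...
print(is_valid(chain))                    # False: the edit is detected
```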

Blockchain and Big Data

When you talk about blockchain in the context of Bitcoin, the connection to Big Data seems a little tenuous. What if, instead of Bitcoin, the blockchain was a ledger for other financial transactions? Or business contracts? Or stock trades?

The financial services industry is starting to take a serious look at blockchain technology. Oliver ...


Read More on Datafloq
3 Big Data Hurdles Every Omnichannel Marketer Must Face

3 Big Data Hurdles Every Omnichannel Marketer Must Face

Unless you’ve been living under a rock for the past few years, you know that big data is driving the present and future of marketing – or all business operations for that matter. According to Gartner, the worldwide business intelligence and analytics market is expected to reach $18.3 billion by the end of 2017.

For all the potential it offers, the sheer volume of big data is growing at such a fast rate that marketers are struggling to keep up with the trends.

With so much information available, the most difficult task is often simply knowing what to look for and how to apply it to a marketing strategy. Even though every business has a unique set of needs, each struggles with similar challenges along the way. Let’s discuss three of the most prevalent obstacles marketers face when dealing with big data.

1. Controlling Data Sources

In the early years of big data, gathering information was much more clear-cut than it is now. This was simply due to a limited number of sources available. Marketers were grabbing any and all the data they could to make more educated decisions. Fast forward to 2017, and the situation is reversed.

There are over 2.5 quintillion bytes ...


Read More on Datafloq
How Long Do We Have to Wait for the Internet of Things 4.0?

How Long Do We Have to Wait for the Internet of Things 4.0?

“In memory of my brother Juan.”

I did not intend to reach one more time for a shocking IoT headline, but the fact is that, per Gartner’s 2016 Hype Cycle, the Internet of Things (IoT) had to fall into the dreaded Trough of Disillusionment, and the powerful analyst firm decided not to mention IoT at all in 2017. This is corroborated by many pessimistic articles about IoT project failures.

So it is our responsibility as IoT influencers to continue evangelizing about how “the Internet of Things will change our world.”

In the article “The Internet of Things… Are We There Yet?”, Cees Links points out that “The IoT is suffering today from a lack of understanding of its true value proposition and even if we are currently in the Valley of Disillusionment, we should not be distracted. We still have a lot to learn, but we are in the middle of shaping a better world for the next generation”.

IoT 1.0, or the Time of the Systems

It seems prehistoric. I’m talking about the world of telemetry, machine-to-machine, and industrial control systems (PLCs, SCADA systems, HMIs, …). But IoT 1.0 is still the one that holds the largest number of connected devices ...


Read More on Datafloq
Microsoft Azure Stack vs. VMware Cloud on AWS — Customers, Start Your Pencil Sharpeners

Microsoft Azure Stack vs. VMware Cloud on AWS — Customers, Start Your Pencil Sharpeners

Now that pricing metrics for Microsoft's Azure Stack can be compared to VMware Cloud on AWS, potential customers will likely revisit their initial assumptions. Enterprise customers will now be sharpening their pencils as they look for all means available to exact discounts from both vendors.
Combat Readiness And The U.S. Army’s “Identity Crisis”

Combat Readiness And The U.S. Army’s “Identity Crisis”

Servicemen of the U.S. Army’s 173rd Airborne Brigade Combat Team (standing) train Ukrainian National Guard members during a joint military exercise called “Fearless Guardian 2015,” at the International Peacekeeping and Security Center near the western village of Starychy, Ukraine, on May 7, 2015. [Newsweek]

Last week, Wesley Morgan reported in POLITICO about an internal readiness study recently conducted by the U.S. Army 173rd Airborne Infantry Brigade Combat Team. As U.S. European Command’s only airborne unit, the 173rd Airborne Brigade has been participating in exercises in the Baltic States and the Ukraine since 2014 to demonstrate the North Atlantic Treaty Organization’s (NATO) resolve to counter potential Russian aggression in Eastern Europe.

The experience the brigade gained working with Baltic and particularly Ukrainian military units that had engaged with Russian and Russian-backed Ukrainian Separatist forces has been sobering. Colonel Gregory Anderson, the 173rd Airborne Brigade commander, commissioned the study as a result. “The lessons we learned from our Ukrainian partners were substantial. It was a real eye-opener on the absolute need to look at ourselves critically,” he told POLITICO.

The study candidly assessed that the 173rd Airborne Brigade currently lacked “essential capabilities needed to accomplish its mission effectively and with decisive speed” against near-peer adversaries or sophisticated non-state actors. Among the capability gaps the study cited were

  • the lack of air defense and electronic warfare units and an over-reliance on satellite communications and Global Positioning System (GPS) navigation;
  • simple countermeasures such as camouflage nets to hide vehicles from enemy helicopters or drones, which are “hard-to-find luxuries for tactical units”;
  • the urgent need to replace up-armored Humvees with the forthcoming Ground Mobility Vehicle, a much lighter-weight, more mobile truck; and
  • the likewise urgent need to field the projected Mobile Protected Firepower armored vehicle companies the U.S. Army is planning to add to each infantry brigade combat team.

The report also stressed the vulnerability of the brigade to demonstrated Russian electronic warfare capabilities, which would likely deprive it of GPS navigation and targeting and satellite communications in combat. While the brigade has been purchasing electronic warfare gear of its own from over-the-counter suppliers, it would need additional specialized personnel to use the equipment.

As analyst Adrian Bonenberger commented, “The report is framed as being about the 173rd, but it’s really about more than the 173rd. It’s about what the Army needs to do… If Russia uses electronic warfare to jam the brigade’s artillery, and its anti-tank weapons can’t penetrate any of the Russian armor, and they’re able to confuse and disrupt and quickly overwhelm those paratroopers, we could be in for a long war.”

While the report is a wake-up call with regard to the combat readiness in the short-term, it also pointedly demonstrates the complexity of the strategic “identity crisis” that faces the U.S. Army in general. Many of the 173rd Airborne Brigade’s current challenges can be traced directly to the previous decade and a half of deployments conducting wide area security missions during counterinsurgency operations in Iraq and Afghanistan. The brigade’s perceived shortcomings for combined arms maneuver missions are either logical adaptations to the demands of counterinsurgency warfare or capabilities that atrophied through disuse.

The Army’s specific lack of readiness to wage combined arms maneuver warfare against potential peer or near-peer opponents in Europe can be remedied given time and resourcing in the short-term. This will not solve the long-term strategic conundrum the Army faces in needing to be prepared to fight conventional and irregular conflicts at the same time, however. Unless the U.S. is willing to 1) increase defense spending to balance force structure to the demands of foreign and military policy objectives, or 2) realign foreign and military policy goals with the available force structure, it will have to resort to patching up short-term readiness issues as best as possible and continue to muddle through. Given the current state of U.S. domestic politics, muddling through will likely be the default option unless or until the consequences of doing so force a change.

North Korea And The U.S. Navy

North Korea And The U.S. Navy

North Korean leader Kim Jong-un inspects what is said to be a hydrogen bomb. [EPA]

This past week has seen some extraordinary events in the stand-off between North Korea and, it seems, the rest of the world. North Korea continues to test its nuclear weapons, the latest test causing a 6.3 magnitude earthquake. Evaluation of these events does indicate the strength of the weaponry used; however, some doubt exists as to the veracity of claims about the technology used.

The force of the explosion, at 100-150 kilotons, could have been ten times bigger than North Korea’s previous test. But experts argue that is still not quite powerful enough to have been a genuine hydrogen bomb. Instead, they suggest it might have been an implosion device boosted by tritium and deuterium gas (hydrogen isotopes). If that was the case, making the device small enough to be turned into a warhead that could be carried on an ICBM would be technically difficult. On the other hand, if it turns out to have been a two-stage device, in which an initial blast is used to amplify the main detonation, then it probably was a small thermonuclear bomb, which could be miniaturised into a compact warhead. There is as yet no way of knowing which it was. Although experts are sceptical about the latter, they have been caught out often enough by North Korea’s nuclear programme advancing faster than most expected.

The United States has responded with several voices, including Defense Secretary James Mattis, who said, “We are not looking to the total annihilation of a country, namely North Korea. But as I said, we have many options to do so.” Diplomatically, both China and Russia claim they are united against the nuclearisation of the Korean peninsula. Meanwhile, the U.S. has called for an emergency meeting of the United Nations Security Council, saying that North Korea is “begging for war.” China has suggested that the U.S. and South Korea cease military exercises in exchange for North Korea freezing its missile and nuclear programs. The U.S. demands the “strongest possible measures” be put into effect.

The previous post gives a link to a detailed chronology of North Korean ballistic missile developments. It has been a problem decades in the making, and not easily solved. Other posts have addressed the defenses that the U.S. and Japan have against a threatened strike on Guam, and also the different layers of defense that exist between a potential North Korean missile strike and its many potential targets.

One of these layers is of particular interest: the component provided by the U.S. Navy (USN), specifically the Arleigh Burke-class destroyers equipped with the AN/SPY-1D radar and carrying the RIM-161 Standard Missile 3 (SM-3), as part of the Aegis Ballistic Missile Defense (Aegis BMD) system. The name Aegis is taken from ancient Greece: the shield carried by Zeus and Athena, said to “produce a sound as from a myriad roaring dragons.” (Iliad, 4.17) The name is intended to evoke a strong defense, and has been effectively branded as such by Lockheed Martin, the manufacturer. Lockheed continues to sell Aegis technology to the USN under the Aegis BMD program, and also to Japan under the Aegis Ashore banner. The powerful Aegis radars were first fielded on the Ticonderoga-class cruisers, authorized in 1978, built from 1980, and commissioned from 1983. Their original targets were the Soviet anti-ship cruise missiles and bombers that would hunt USN carrier battle groups during the Cold War. The technology has since evolved to offer some defense against ballistic missiles; per the Congressional Research Service:

Aegis BMD

enables warships to shoot down enemy ballistic missiles … Aegis BMD-equipped vessels can transmit their target detection information to the Ground-Based Midcourse Defense system and, if needed, engage potential threats using the RIM-161 Standard Missile 3 (SM-3) mid-course interceptors and the RIM-156 Standard Missile 2 Extended Range Block IV (SM-2 Block IV) or RIM-174 Standard Extended Range Active Missile (SM-6) terminal-phase interceptors. Aegis BMD does not have the ability to intercept ICBMs, although future versions may allow limited intercept capability.  [Emphasis added]

As retired F-35 and F-22 pilot Lt. Col. Berke so accurately noted before, information is the most precious commodity. The capability to detect a missile launch from the Sea of Japan and transmit that information through a secure network, at the speed of light, to every other component of the BMD network is the first and crucial step in the kill chain that hopefully results in a shoot-down of each and every North Korean missile fired in anger, or even off-target. I would argue that the U.S. should use the United Nations as a forum to define the parameters for any North Korean missile launch that should be intercepted by allied BMD assets. If, for example, a North Korean missile looks likely to hit close to Tokyo, based upon the trajectory identified by Aegis ships at sea, then BMD should shoot it down. Making these rules of engagement public would provide a clear signal to China and Russia that the U.S. and its allies intend to use their BMD capabilities (and potentially learn from any failures) against live enemy missiles, while also tempering the risk of escalation into further missile volleys between any parties.

The U.S.S. John S. McCain after collision with a commercial tanker. [EPA]

Recently, however, the credibility of U.S. BMD deterrence has taken a large step backward due to self-inflicted wounds. This is related to the concept of friction, as discussed on this blog. We can see the effects of friction in the U.S. Navy’s recent safety and navigation incidents, which have cost the lives of seventeen sailors, injured more, and arguably cost some prestige as well.

  • U.S.S. John S. McCain (DDG-56) collides with Alnic MC, a Liberian-flagged oil tanker of 30,040 gross tons, at 05:24 on 21 August 2017, east of the Strait of Malacca. (wikipedia), NavyTimes.
  • U.S.S. Fitzgerald (DDG-62), of 9,000 gross tons, collides with MV ACX Crystal, a Philippines-flagged container ship of 29,060 gross tons, on 17 June 2017 off Japan.

As noted here, an interesting comparison is with a Russian naval vessel and its collision with a commercial ship in the Eastern Mediterranean Sea, on 27 April, close to the busy sea lanes of the Bosporus. Warships do not transmit an Automatic Identification System (AIS) signal, so the ship is not visible to AIS-connected shipping; however, it would be visible on radar within a certain range as an unidentified object. Also, warships, in accordance with widespread practice, are not predictable in their movements, including speed and course.

As The Economist reports, in a critique of the USN: “… critics argue that the 277-ship naval fleet is already overstretched, particularly in the Western Pacific, where naval competition with an increasingly capable China requires a high tempo of operations. The John S. McCain was on its way to Singapore after a ‘freedom of navigation’ mission during which it had sailed through international waters near a reef where China has created an artificial island. The Chinese media have been cock-a-hoop over pictures of American warships limping into port with apparently self-inflicted damage.

The spate of accidents has raised questions about whether they are in some way linked to a common cause. Inevitably, there has been speculation that hacking of the ships’ computers or navigation systems by the Chinese or North Koreans might be responsible. The navy says it has seen nothing that suggests this might have happened.

It is far more likely that unrelenting operational demands on forward-deployed vessels and several years of Pentagon spending distorted by budget caps and sequestration have taken their toll. A report by the Government Accountability Office in 2015 found that the Navy was working on the basis that its Japan-based cruisers and destroyers would spend 67% of their time deployed and 33% in maintenance. That meant there would be no time left for training. Without training drills to remind sailors of the ‘basic seamanship’ referred to by Admiral Richardson, it would not be surprising if some bad habits and sloppiness have crept in.” [emphasis added]
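The arithmetic implied by that deployment schedule is stark:

$$67\%\ \text{deployed} + 33\%\ \text{maintenance} = 100\%$$

which leaves zero planned time in the cycle for dedicated training.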

Also, here is a great video, showing replay data, based on the Automatic Identification System (AIS) data source:

The Future of Big Data in Australia

The Future of Big Data in Australia

"Big Data� is one of those words that seems to be everywhere at the moment, used by everyone from Data Scientists to Branding Analysts. Yet, until now, it has remained merely a buzzword for many companies.

It seems, though, that things are changing in Australia. Recent research has found that Australian companies are spending more than ever before on Big Data systems and research, and that this stream of investment is projected to continue.

This is a welcome development, of course. Implementing Big Data systems is now easier and cheaper than ever before, and can bring great benefits to any enterprise that takes it seriously. Even small companies are embracing its power, and it is making itself felt in many industries that were previously sceptical about it.

The Growth of Big Data

New research by Telsyte suggests that Big Data is going from strength to strength in Australia. Conducted in the early part of 2017, the research involved talking to a huge range of companies in order to ascertain their attitudes toward Big Data, and whether they would be investing more in this area in the future.

The results were striking. 83% of the Australian CIOs that Telsyte spoke to are planning ...


Read More on Datafloq
[Talend Podcast] Big Data in 2020: Featuring Mark van Rijmenam of Datafloq

[Talend Podcast] Big Data in 2020: Featuring Mark van Rijmenam of Datafloq

According to research firm IDC, the big data and business analytics market is predicted to hit $203 billion in the year 2020. Today, the creation and consumption of data continue to grow by leaps and bounds, along with continued investment in big data analytics hardware, software, and services, and in data scientists and their continuing education. However, technologies, tools, and trends in big data seem to go out of fashion just as quickly as they come in. With the pace of change accelerating at an ever-increasing rate, what will “big data” mean in 2020?

To answer this question, I sat down with prominent big data influencer Mark van Rijmenam, founder of Datafloq, to talk about what changes to expect in the world of big data, artificial intelligence, analytics, and more in the next two years.

Follow Mark van Rijmenam (@VanRijmenam)

Follow Datafloq (@datafloq)


...


Read More on Datafloq
Amazon Moves Further Towards Robot Automation

Amazon Moves Further Towards Robot Automation

Robots are soon to become a large part of the world's workforce. While things don't currently look like a science-fiction television show, that time may come. Right now, the manufacturing and industrial sectors have integrated robotics into their daily operations, and in Amazon's warehouses, robots are a common sight. Unfortunately, the current crop of robotic systems employed by Amazon isn't living up to expectations.

The robots don't seem to be as effective as desired. The company has "robo pickers" integrated into operations and they don't seem to pick things up -- or drop them off -- too well. A lot of disappointment seems to have ensued.

This does not mean, however, that Amazon plans on ceasing use of robots in its warehouses. Rather, the plan is to work at improving the way the automated system works. Once warehouse robotics are perfected, the benefits could be enormous.

The Unique Warehouse

A successfully automated warehouse could do wonders for Amazon's productivity. By automating the product line, packages could be shipped far more expediently, and costs might be brought down significantly, passing savings on to the consumer.

Amazon wants to turn its current warehouse into the warehouse of the future. Improving how its robots perform absolutely would assist ...


Read More on Datafloq
Is your Data ready for GDPR?

Is your Data ready for GDPR?

The EU’s General Data Protection Regulation (GDPR) is set to disrupt not only the data industry but every business that holds customer data. With new rules on how businesses manage their customer data, GDPR should no doubt be on every business’s radar.

First introduced in May 2016, the new rules will begin to be enforced in May 2018. With less than 11 months to make the necessary changes, the clock is ticking to clean up your data and protect yourself from substantial fines.

The fines for non-compliance will be up to 4% of global annual turnover or 20 million euro, whichever is higher.
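That rule is simple enough to express in a couple of lines; the function below is an illustrative sketch of the maximum-fine calculation, not part of any official compliance tool:

```python
# Illustrative sketch of the GDPR maximum-fine rule: the greater of
# 4% of global annual turnover or 20 million euro.
def max_gdpr_fine(global_annual_turnover_eur):
    return max(0.04 * global_annual_turnover_eur, 20_000_000)

print(max_gdpr_fine(1_000_000_000))  # 40,000,000: the 4% rule dominates
print(max_gdpr_fine(100_000_000))    # 20,000,000: the fixed floor applies
```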

What are the most important changes?

From May 2018, the individuals whose personal data you hold will have the following rights:

  • The right of access
  • The right to rectification
  • The right to erasure
  • The right to restrict processing
  • The right to data portability
  • Rights related to restricting automated decision making and profiling


One of the largest changes brought about by the GDPR is that you must legally ask permission to use an email address for email marketing campaigns, while simultaneously recording the date, the channel, and the statement used to opt in or out of email marketing.
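A minimal sketch of what such an opt-in record might look like in practice (the field names and structure are illustrative assumptions, not a prescribed GDPR schema):

```python
# Illustrative consent record capturing the date, channel, and statement
# used to opt in or out; field names are assumptions, not a GDPR standard.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    email: str            # the address the consent applies to
    opted_in: bool        # True for opt-in, False for opt-out
    timestamp: datetime   # when consent was given or withdrawn
    channel: str          # e.g. "web form", "phone", "in store"
    statement: str        # the exact wording the individual agreed to

record = ConsentRecord(
    email="jane@example.com",
    opted_in=True,
    timestamp=datetime.now(timezone.utc),
    channel="web form",
    statement="I agree to receive the monthly newsletter.",
)
```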

The real question is, how does one prepare their data so that they can effectively satisfy these rights?  The answer lies ...


Read More on Datafloq
Why Security Analytics is so Important for the Success of the Internet of Things

Why Security Analytics is so Important for the Success of the Internet of Things

The Internet of Things is expected to grow to 8.4 billion devices in 2017, with predictions of more than 20 billion devices by 2020. While this market is growing rapidly, it faces a major barrier on the way to its success.

Connected devices are vulnerable, as seen in the DDoS attack on October 21st, 2016, which took down the DNS provider Dyn. Large websites such as Etsy, Twitter, PayPal, Verizon, Comcast and Reddit were among the many that were virtually unusable during this attack. The hackers turned to unsecured IoT devices to create an extensive botnet so they could push enough traffic to take down Dyn.

While this was the largest attack caused by IoT security issues, it certainly isn't the first. The IoT market needs to find a way to properly secure these devices before more high-profile attacks completely negate the benefits of having this connected technology in your organization.

The Consequences of Unsecured Connected Devices

IoT devices add countless potential attack surfaces to an organization, whether you have an official policy or people are bringing in their own technology. These devices connect to your network and have the potential to give attackers a direct entry point to your infrastructure.

While several ...


Read More on Datafloq
How these 6 Industries Can Be Transformed through Blockchain Technology

How these 6 Industries Can Be Transformed through Blockchain Technology

These days, you might be hearing a lot about blockchain technology.

Right?

And you must have some questions in mind, like: What is blockchain technology? How is this technology useful for different industries? How will it impact your business?

Having these questions in mind is extremely common, because this high-end technology has created a buzz in the market.

Currently, banking and payments are not the only industries that can be affected by blockchain tech. Law enforcement, entertainment, education, stock trading, charity, and other industries could also be transformed by blockchain technology.

According to Anirban Bose, a Global Head of Banking and Financial Services, blockchain is a game-changer in the industry, and no financial services firm can afford to ignore this technology.

The existence of Bitcoin as a decentralized digital currency is only possible because of blockchain technology, essentially a public ledger which securely and automatically verifies and records a high volume of transactions digitally.

Gradually, a growing number of entrepreneurs are coming to believe that various industries can be disrupted using blockchain technology. There are even various business use cases for transactions that are verified and organized by a decentralized platform requiring no central supervision.
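The core of that “public ledger” idea can be shown in a few lines: each block commits to the hash of its predecessor, so altering any recorded transaction breaks every later link. A minimal sketch (not any production blockchain's actual code):

```python
# Minimal hash-chain sketch: each block stores its predecessor's hash,
# so tampering with any earlier block invalidates the rest of the chain.
import hashlib
import json

def make_block(transactions, prev_hash):
    block = {"transactions": transactions, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["Alice pays Bob 5"], prev_hash="0" * 64)
second = make_block(["Bob pays Carol 2"], prev_hash=genesis["hash"])

# Any change to the genesis transactions changes its hash, which would
# no longer match the prev_hash recorded in the next block.
print(second["prev_hash"] == genesis["hash"])  # True until tampering
```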

Let’s look at the different ...


Read More on Datafloq
5 Applications of Big Data in Social Media Marketing

5 Applications of Big Data in Social Media Marketing

The use of big data in social media has many aspects when it comes to marketing. There are various ways you can use big data to model consumer preferences, attract clients, and drive sales. Several of these applications are described below.

Vision recognition

Using big data to recognise images in social media posts can assist with generating custom classifiers. After a photo is uploaded, the application returns terms that represent the things it has found in the photo, like events or objects. You can train your app on specific image sets, like logos, to recognise customised images in real time. This will help you as a marketer to detect customers who are posting on your social pages and prompt you to engage.

Personality insights

Big data in social media analyses personality attributes from text like emails and social posts, so that you get the right insights about people. Users uncover a deeper understanding of the needs, characteristics, personalities, and values that drive personalisation. Big data outputs a profile across three dimensions: needs, values, and the five ...


Read More on Datafloq
How To Turn Your Organization Into Making Data-Driven Decisions

How To Turn Your Organization Into Making Data-Driven Decisions

Most organizations are now realizing the power of data and analytics. In the past decade, a few pioneering organizations have demonstrated how making strategic decisions based on data-derived facts can push business and innovation forward. It’s not only anecdotal; evidence that making data-based decisions makes business sense is piling up. The McKinsey Global Institute has indicated that data-driven organizations are 23 times more likely to outcompete non-data-intensive organizations in terms of new customer acquisition, nine times as likely to surpass competitors in terms of customer loyalty, and more than twice as likely to be more profitable than competitors.

It’s tough enough for many organizations to keep track of their data; it’s much tougher to translate the data into valuable business insights. Before an organization can use data as a core part of their decision-making process, it needs to build a foundation of well-governed and valuable data, commitment from company personnel to integrate data analytics into their business process, and common language surrounding data to ensure sound communication between departments and across the organizational hierarchy. Luckily, many organizations have already taken the first steps and through their experiences, you can learn how to turn your organization into a data analytics powerhouse.

Create a ...


Read More on Datafloq
The Reality Of Working With Data

The Reality Of Working With Data

There is much talk about the commercialization of Big Data. Many understand its benefits, but there is a lack of awareness of how it actually happens. Some organizations join the Big Data game with the misconception that it is a simple, easy, and quick implementation, and have no real strategy in place to derive value from the data. The truth is that there is a disconnect between Big Data expectations and reality.

Working with data is a complex process that needs time, effort, and a proper strategy. Unless organizations have the necessary skills in-house, they will need the right partner or vendor, one with data engineering and data transformation solutions in place to turn raw data into a high-quality data product--one that is both accessible and consumable.

Strategy First

Before embarking on Big Data investments, the first thing an organization needs to do is set a data strategy. This refers to the overall vision, as well as the definitive action steps, that serve as the platform for an organization to harness data-dependent or data-related capabilities. Data architects and engineers need clear and specific objectives to achieve the organization’s data goals. A misconception when it comes to data investments ...


Read More on Datafloq
Chronology of North Korean Missile Development

Chronology of North Korean Missile Development

A nice little article from AFP: https://www.yahoo.com/news/chronology-north-korean-missile-development-011251742.html

Key dates: 1999, 2000, 2005, 2006, 2009, 2016 and of course 2017.

Get Certified! #DataVault 2.0 Certification in the US

Get Certified! #DataVault 2.0 Certification in the US

If you have been waiting to get your Data Vault 2.0 certification, there are three sessions coming up in the next few months right in the USA. Here is the list.
Structure Of The U.S. Defense Department History Programs

Structure Of The U.S. Defense Department History Programs

With the recent discussions of the challenges facing U.S. government historians in writing the official military histories of recent conflicts, it might be helpful to provide a brief outline of the structure of the Department of Defense (DOD) offices and programs involved. There are separate DOD agency, joint, and service programs which, while having distinct missions, sometimes have overlapping focuses and topics. They are also distinct from other Executive Branch agency history offices, such as the Office of the Historian at the State Department.

The Office of the Secretary of Defense has its own Historical Office, which focuses on collecting, preserving, and presenting the history of the defense secretaries. Its primary publications are the Secretaries of Defense Historical Series. Although the office coordinates joint historical efforts among the military services and DOD agency history offices, it does not direct their activities.

The Joint History Office of the Joint Chiefs of Staff (JCS) provides historical support to the Chairman and Vice Chairman of the Joint Chiefs of Staff and to the Joint Staff. Its primary publications are the JCS and National Policy series, as well as various institutional studies and topical monographs.

The Joint History Office also administers the Joint History Program, which includes the history offices of the joint combatant commands. Its primary role is to maintain the history programs of the commanders of the combatant commands. Current guidance for the Joint History Program is provided by Chairman of the Joint Chiefs Instruction 5320.1B, “Guidance for the Joint History Program,” dated 13 January 2009.

Each of the military services also has its own history program. Perhaps the largest and best known is the Army Historical Program. Its activities are defined in Army Regulation 870-5, “Military History: Responsibilities, Policies, and Procedures,” dated 21 September 2007. The program is administered by the Chief of Military History, who is the principal advisor to the Secretary of the Army and the Army Chief of Staff for all historical matters, and is dual-hatted as the director of the U.S. Army Center of Military History.

The Air Force History and Museum Program is outlined in Air Force Policy Directive 84-1, “Historical Information, Property, and Art,” dated 16 September 2005. The Director of Air Force History and Museums, Policies, and Programs oversees the Air Force Historical Studies Office and its field operating agency, the Air Force Historical Research Agency.

The Navy History Program is managed by the Director of Navy History. Its activities are described in OPNAV Instruction 5750.4E, “Navy History Programs,” dated 18 June 2012. The Navy’s central historical office is the Naval History and Heritage Command, which includes the Navy Department Library and the National Museum of the United States Navy in Washington, D.C.

The U.S. Marine Corps History Division, a branch of Marine Corps University, runs and administers the Marine history program. Its policies, procedures, standards, and responsibilities are outlined in Marine Corps Order 5750.1H, dated 13 February 2009.

In future posts, I will take a closer look at the activities and publications of these programs.

How to Perform a GRU Implementation in TensorFlow

How to Perform a GRU Implementation in TensorFlow

MLPs (Multi-Layer Perceptrons) are great for many classification and regression tasks, but it is hard for MLPs to do classification and regression on sequences. In this code tutorial, a GRU is implemented in TensorFlow.

Introduction

A sequence is an ordered set of items, and sequences appear everywhere. In the stock market, the closing price is a sequence, with time providing the ordering. In sentences, words follow a certain ordering, so sentences can be viewed as sequences, too. A gigantic MLP could learn parameters based on sequences, but this would be infeasible in terms of computation time. The family of Recurrent Neural Networks (RNNs) solves this by specifying hidden states which depend not only on the input but also on the previous hidden state. GRUs are among the simplest RNNs; vanilla RNNs are even simpler, but those models suffer from the vanishing gradient problem.
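For reference, the vanilla RNN recurrence being described is commonly written as (standard textbook form, not the article's own notation):

$$h_t = \tanh(W x_t + U h_{t-1} + b)$$

Backpropagation through time multiplies the Jacobian of this update across every step, so over long sequences those products shrink toward zero: the vanishing gradient problem that GRUs are designed to avoid.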

GRU Model

The key idea of GRUs is that the gradient chains do not vanish with the length of the sequence; this is achieved by allowing the model to pass values completely through the cells. The model is defined by the standard GRU equations [1]:

$$z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)$$

$$r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)$$

$$\tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)$$

$$h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t$$

Here $z_t$ is the update gate, $r_t$ the reset gate, $\tilde{h}_t$ the candidate state, and $\odot$ denotes element-wise multiplication.
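A minimal sketch of one GRU step implementing these equations, using the TensorFlow 2.x eager API (the dimensions, variable names, and random initialization are illustrative assumptions, not the article's original code):

```python
import tensorflow as tf

batch, input_dim, hidden_dim = 4, 8, 16

def weight(*shape):
    # Small random initialization; any sensible scheme works for a demo.
    return tf.Variable(tf.random.normal(shape, stddev=0.1))

# One input/recurrent weight pair and one bias per gate:
# update gate z, reset gate r, and the candidate state h~.
Wz, Uz, bz = weight(input_dim, hidden_dim), weight(hidden_dim, hidden_dim), tf.Variable(tf.zeros([hidden_dim]))
Wr, Ur, br = weight(input_dim, hidden_dim), weight(hidden_dim, hidden_dim), tf.Variable(tf.zeros([hidden_dim]))
Wh, Uh, bh = weight(input_dim, hidden_dim), weight(hidden_dim, hidden_dim), tf.Variable(tf.zeros([hidden_dim]))

def gru_step(x_t, h_prev):
    """One GRU time step for a batch of inputs, mirroring the equations above."""
    z = tf.sigmoid(x_t @ Wz + h_prev @ Uz + bz)           # update gate
    r = tf.sigmoid(x_t @ Wr + h_prev @ Ur + br)           # reset gate
    h_tilde = tf.tanh(x_t @ Wh + (r * h_prev) @ Uh + bh)  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde               # new hidden state

x_t = tf.random.normal([batch, input_dim])
h_prev = tf.zeros([batch, hidden_dim])
h_t = gru_step(x_t, h_prev)  # shape: (batch, hidden_dim)
```

In practice, tf.keras.layers.GRU packages this cell (with minor variants) together with the loop over time steps.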
I had a hard time understanding this model, but it turns out that it is not too hard ...


Read More on Datafloq
5 Limiting Factors for Data-driven Businesses

5 Limiting Factors for Data-driven Businesses

Many companies today claim they are data-driven just because they have a central system that stores all their data. Unfortunately, they are not. A company that is truly data-driven must have a system where anyone in the organization who can use data can access it at any time. There are many challenges and difficulties faced today by companies that want to become data-driven. Many of these problems are described below to help you identify them and find solutions to the data issues in your organization.

Data Aggregation Is A Big Challenge

Many companies know that making decisions based on available and accessible data can help improve productivity, and they are willing to adopt this approach. The problem is, they don't know how to aggregate data from different sources. Gathering data from various sources into a single system makes it easy to access, and users will be able to interact with it as efficiently as possible. Many companies don't know how to use data, and they have to understand why experts are needed to help them find a resourceful means of building a system that delivers results.

Disruptive Innovation Is Hard To Predict

Companies need to be able to understand the impact that ...


Read More on Datafloq
Why Is Blockchain Gaining So Much Popularity?

Why Is Blockchain Gaining So Much Popularity?

Blockchain is the latest buzzword, with almost all fintech people and enthusiasts talking about its potential globally. The technology, originally devised for the digital currency Bitcoin, is gaining a lot of popularity and traction. It was invented by Satoshi Nakamoto, a secretive internet user, in 2008, before it went online in 2009. Several attempts to identify Satoshi have been made without any conclusive proof.

According to blockchain.info, about 16.5 million bitcoins have been mined to date, with a CAGR of 34% since 2009. The website quotes an aggregate of over 200,000 bitcoin transactions taking place every day, and the number is increasing with each passing minute. In March 2017, the value of a bitcoin, at US$1,268, exceeded that of an ounce of gold ($1,233) for the first time. There has been no looking back since: currently, a bitcoin is valued at over US$4,500. That’s quite impressive!

But what makes Bitcoin, and the technology behind it, the preferred choice for performing transactions? Let’s delve more deeply into it to understand this. Every business is based on transactions. These transactions are routed through third-party intermediaries such as banks, brokers, and lawyers. The process of completing a business transaction thus takes a ...


Read More on Datafloq
Human Factors In Warfare: Friction

Human Factors In Warfare: Friction

The Prussian military philosopher Carl von Clausewitz identified the concept of friction in warfare in his book On War, published in 1832.

Everything in war is very simple, but the simplest thing is difficult. The difficulties accumulate and end by producing a kind of friction that is inconceivable unless one has experienced war… Countless minor incidents—the kind you can never really foresee—combine to lower the general level of performance, so that one always falls far short of the intended goal… Friction is the only concept that more or less corresponds to the factors that distinguish real war from war on paper… None of [the military machine’s] components is of one piece: each part is composed of individuals, every one of whom retains his potential of friction [and] the least important of whom may chance to delay things or somehow make them go wrong…

[Carl von Clausewitz, On War, Edited and translated by Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1984). Book One, Chapter 7, 119-120.]

While recognizing this hugely significant intangible element, Clausewitz also asserted that “[F]riction…brings about effects that cannot be measured, just because they are largely due to chance.” Nevertheless, the clearly self-evident nature of friction in warfare subsequently led to the assimilation of the concept into the thinking of most military theorists and practitioners.

Flash forward 140 years or so. While listening to a lecture on combat simulation, Trevor Dupuy had a flash of insight that led him to conclude that it was indeed possible to measure the effects of friction.[1] Based on his work with historical combat data, Dupuy knew that smaller-sized combat forces suffer higher casualty rates than do larger-sized forces. As the diagram at the top demonstrates, this is partly explained by the fact that small units have a much higher proportion of their front line troops exposed to hostile fire than large units.

However, this relationship can account for only a fraction of friction’s total effect. The average exposure of a company of 200 soldiers is about seven times greater than that of an army group of 100,000. Yet casualty rates for a company in intensive combat can be up to 70 times greater than those of an army group. This discrepancy clearly shows the influence of another factor at work.
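Using the rounded figures just quoted, the size of that other factor falls out of simple division:

$$\frac{\text{casualty-rate ratio}}{\text{exposure ratio}} \approx \frac{70}{7} = 10$$

Exposure accounts for a factor of about seven; the residual factor of about ten is what Dupuy attributed to friction, as discussed next.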

Dupuy hypothesized that this reflected the influence of the relationship between dispersion, deployment, and friction on combat. As friction in combat accumulates through the aggregation of soldiers into larger-sized units, its effects degrade the lethal effects of weapons from their theoretical maximum. Dupuy calculated that friction affects a force of 100,000 ten times more than it does a unit of 200. Because friction is an ambient, human factor on the battlefield, higher quality forces do a better job of managing its effects than lower quality ones do.

After looking at World War II combat casualty data to calculate the effect of friction on combat, Dupuy looked at casualty rates from earlier eras and found a steady correlation, which he believed further validated his hypothesis.

Despite the consistent fit of the data, Dupuy felt that his work was only the beginning of a proper investigation into the phenomenon.

During the periods of actual combat, the lower the level, the closer the loss rates will approach the theoretical lethalities of the weapons in the hands of the opposing combatants. But there will never be a very close relationship of such rates with the theoretical lethalities. War does not consist merely of a number of duels. Duels, in fact, are only a very small—though integral—part of combat. Combat is a complex process involving interaction over time of many men and numerous weapons combined in a great number of different, and differently organized, units. This process cannot be understood completely by considering the theoretical interactions of individual men and weapons. Complete understanding requires knowing how to structure such interactions and fit them together. Learning how to structure these interactions must be based on scientific analysis of real combat data.

NOTES

[1] This post is based on Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), Chapter 14.

Deployed Troop Counts

Deployed Troop Counts

Well, it turns out we have somewhat more troops deployed in Afghanistan than previously reported. The figure had been reported as 8,400; it turns out we have 11,000. This does not include the 3,900 that have recently been authorized to go there.

We also officially have 5,262 troops in Iraq and 503 in Syria. These figures are low, with a couple of thousand more troops across the two countries (it is not clear whether that is supposed to be a couple of thousand more in each country).

So potentially we are looking at around 15,000 troops in Afghanistan, and we may have around 8,000 troops in Iraq and Syria.

Reuters article: https://www.reuters.com/article/us-usa-afghanistan-military-idUSKCN1BA2IF

 

HDFS vs. HBase : All you need to know

HDFS vs. HBase : All you need to know

The sudden increase in the volume of data from the order of gigabytes to zettabytes has created the need for a more organized file system for the storage and processing of data. The demand stemming from the data market has brought Hadoop into the limelight, making it one of the biggest players in the industry. The Hadoop Distributed File System (HDFS), Hadoop’s commonly known file system, and HBase (Hadoop’s database) are among the most topical and advanced data storage and management systems available in the market.



What are HDFS and HBase?

HDFS is fault-tolerant by design and supports rapid data transfer between nodes even during system failures. HBase is a non-relational, open-source, NoSQL database that runs on top of Hadoop. Under the CAP (Consistency, Availability, and Partition tolerance) theorem, HBase is a CP-type system.

HDFS is most suitable for performing batch analytics. However, one of its biggest drawbacks is its inability to perform real-time analysis, the trending requirement of the IT industry. HBase, on the other hand, can handle large data sets but is not appropriate for batch analytics; instead, it is used to write and read data from Hadoop in real time.
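As an illustration of that real-time read/write pattern, here is a minimal sketch using the third-party happybase client, which talks to HBase through its Thrift gateway; the host, table, and column names are assumptions for the example:

```python
# Minimal sketch of low-latency writes and reads against HBase via its
# Thrift gateway, using the third-party happybase client.
# Host, table, and column names are illustrative assumptions.
import happybase

connection = happybase.Connection("hbase-thrift-host")  # assumed host
table = connection.table("user_events")                 # assumed table

# Write a single row immediately: no batch job required.
table.put(b"user42", {b"cf:last_event": b"login"})

# Read it back by row key with low latency.
row = table.row(b"user42")
print(row[b"cf:last_event"])  # b'login'
```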

Both HDFS and HBase are capable of processing structured, semi-structured as well as unstructured data. ...


Read More on Datafloq
How Big Data can Help Cab Aggregators

How Big Data can Help Cab Aggregators

Gone are the days when folks would have to stand and wait for taxis or cabs to pick them up and drop them at their destinations. Now, it’s all at their fingertips. Cabs are, well and truly, only a touch away. It has indeed become easy; no more hollering or wasting your time. But we should realise that there is a lot going on behind the scenes to get you that cab. Do the aggregators not have problems, covering so many cities, countries, and continents? Yes, they do. They have problems galore. One such set of problems can be solved by Big Data and its analysis.

The global cab aggregator space is undergoing cut-throat competition. New players are finding it increasingly hard to differentiate themselves from their peers. Ever since their inception, customer retention and ensuring a good ride experience have always been of maximum importance. With abundant data at their disposal, it becomes important for senior management to consider solutions that uncover actionable insights and deliver on customer expectations. It is no longer just a game of marketing, as simply competing through offers and promotional codes won’t always result in revenue. The key is simple: real-time exploratory data analytics.

Ever wondered why ...


Read More on Datafloq
Economy, Environment, Employment: How Driverless Vehicles Might Change Our World

Economy, Environment, Employment: How Driverless Vehicles Might Change Our World

The promise of widespread autonomous vehicles has dominated recent headlines in both the technology and automobile industries, but the idea has been around much longer than that. Public discussions about self-piloting vehicles took place as early as 1939; according to Wired magazine, GM's exhibit at the World's Fair sparked such discussions. In 2010, thanks to Google, the first truly driverless vehicles hit the road. While the obvious benefits of self-driving cars have been discussed, like fewer accidents and better access to safe transportation, there are some unintended consequences worth discussing. From higher unemployment rates to better insurance options, a driverless future has an incredible potential impact on our lives.

Unemployment in Driving Industries

An increase in driverless vehicles could mean an increase in unemployment. According to U.S. labor statistics from 2014, nearly 2.6 million people are actively employed as truck drivers, taxi and independent service drivers, and public transportation operators. If driverless cars become commonplace, the effect on American employment would be devastating.

Jobs related to the driving industry could be lost as well: administrative services, management companies, and parent companies supporting trucking or driving positions could be eliminated or downsized.

Loss of Private Car Ownership

Industries and people learn to adapt. ...


Read More on Datafloq
6 Reasons To Build Common Language Into Your Code

6 Reasons To Build Common Language Into Your Code

For the most part, programming languages are remarkably precise. Even the smallest deviation from the norms of the language can result in some serious syntax and contextual errors.

This isn’t as much of an issue with modern languages, as some are designed to be both written and read like regular English. This includes languages such as COBOL, AppleScript, Inform and more. However, some of the older languages can be incredibly complex to translate just by looking at active code.

There is one component that makes reading code more bearable, particularly in smaller segments. As you might have guessed, it has to do with comments, also referred to as “commenting out” code.

Comments — often denoted by a specific tag or symbol — are not read by the development environment or finished application. Instead, they exist only to guide someone reading through the active code.

For example, you might include a comment that explains what a snippet is used for, describing the name of an object, its parameters, and what it does. Front-end IT professionals also use a similar strategy when working on projects that involve multiple sets of eyes.
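A short illustration of that kind of descriptive comment (a hypothetical Python snippet, purely as an example):

```python
def monthly_payment(principal, annual_rate, months):
    """Return the fixed monthly payment on an amortizing loan.

    principal   -- amount borrowed
    annual_rate -- nominal yearly interest rate, e.g. 0.05 for 5%
    months      -- number of monthly payments
    """
    r = annual_rate / 12  # convert the yearly rate to a monthly rate
    # Standard annuity formula; handle the zero-interest edge case first.
    if r == 0:
        return principal / months
    return principal * r / (1 - (1 + r) ** -months)
```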

Being able to rely on a consistent, common language ensures that, regardless of background or ...


Read More on Datafloq
Autumn Data Science Elective Courses – Not Just for BME Students

Autumn Data Science Elective Courses – Not Just for BME Students

(In brief, for students:)

BME elective courses from credible instructors:

- Applied Data Analytics (Tue & Thu 12:00): you work with your own laptop in every class; Python, R, and SAS, plus the foundations of data science and machine learning
- 'Big Data' Analysis Tools on Open-Source Platforms (Wed 12:00): Hadoop, Spark, and the full big data stack

Go to Neptun!


(For external participants and anyone interested in the details)

This year we are again opening our most important BME courses to external participants. This means that, alongside the university's students, you can get an insight into the world of data science and big data over 14 weeks starting in early September.

The initiative is very popular, but the number of places is limited by (1) the size of the available rooms and (2) our principle of not admitting more external participants to a course than the number of university students enrolled in it.

 

If you want to master the programming languages of data analysis

Course name: Applied Data Analytics (ADA)
Tuesday and Thursday, 12:00-14:00
Room: Lágymányos campus, Magyar tudósok körútja
Official course syllabus

The focus is on iteratively developed data processing workflows; we teach you the programming languages of data analysis. The topic is solving the basic tasks of data science when the problem has to be solved in a programming language: we teach SAS, Python, and R in such a way that everyone in class writes the same program code on their own machine as the instructor does on the projector. This makes genuine shared thinking and a transfer of perspective possible, because here the quirks of real data really do get in the way; not everything is as pretty as in the machine learning books.

If you want to find your way through the jungle of big data technologies

Course name: 'Big Data' Analysis Tools on Open-Source Platforms
Wednesday, 12:00-14:00
Room: Lágymányos campus, Magyar tudósok körútja
Official course syllabus

Here Dmlab's big data specialists give an insight into the structure of the technology stack that has grown up in this field. We start from the MapReduce and Hadoop fundamentals and work our way up to the newest technologies. Obviously we cannot go deep into all of them, but anyone who completes this course will find it easy to navigate among big data technologies. At the end of the semester, students receive a grade based on a test and a homework assignment; on request, external participants can also take these assessments.

External participants can apply to both courses here: APPLICATION

Applications are processed essentially in order of submission, and we expect to announce the results at the end of the week. Newsletter subscribers were sent this information a few days earlier. There are no special prerequisites for participation, and attending the course is free. It is also possible to officially enroll in the course at BME for a few tens of thousands of forints, in which case we officially examine you and you take the course as a student.


Book Review – Eastern Front 1943

Book Review – Eastern Front 1943

Obviously, anything related to the Battle of Kursk gets my attention. This book review was just emailed to me: https://saberandscroll.weebly.com/blog-ii/germany-and-the-second-world-war-volume-viii-the-eastern-front-1943-1944-the-war-in-the-east-and-on-the-neighbouring-fronts-edited-by-karl-heinz-frieser

I wonder if Frieser’s book references my book (probably not, as I did not publish until 2015). In any case, there is no review of my Kursk book on the Saber and Scroll website.

That book review is part of a very interesting website that hosts two book review blogs: https://saberandscroll.weebly.com/

The Power of Artificial Intelligence to Revolutionize the Oil & Gas Industry

The Power of Artificial Intelligence to Revolutionize the Oil & Gas Industry

Global oil supply appears comfortable for the next three years but, according to the Market Report Series: Oil 2017, growth slows considerably after that. Keeping pace with demand after 2020 means catching up with 21st-century technology immediately.

The oil and gas industry has long leveraged innovation, namely when it comes to drilling, but business intelligence capabilities, such as Artificial Intelligence (A.I.), have been reserved for only the largest of projects and biggest of companies.

But seeing as there is no end in sight for the era of cheap oil, the industry has no choice but to be more bullheaded about the democratization of tools like A.I., automation, and IoT, given the need to operate in lower-margin environments. Data and intelligence must move beyond the drill bit, into every facet of an organization.

Bring Production into the Space Age

Drilling operations have long since been in the space age. Companies have invested billions in the hope and prayer that some crazy new technology using thin fiber to crack through hard stone would produce oil. And voila: with one tweak on a mobile app, a rig worker can instruct a drill bit that is thousands of feet underground to hit squarely on the ...


Read More on Datafloq
DMAIC Approach in Lean Six Sigma

DMAIC Approach in Lean Six Sigma

Lean and Six Sigma concepts help to improve the efficiency and effectiveness of business performance. The statistical representation of Lean Six Sigma describes the performance of a process. To achieve Six Sigma, a process needs to reduce its defect rate to below 3.4 defects per million opportunities (DPMO). Lean Six Sigma focuses on process enhancement and reducing process defects, and is driven by the DMAIC cycle, which defines, measures, analyzes, improves, and controls the process.
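That 3.4-per-million benchmark can be checked with a few lines; the function below is an illustrative sketch of the standard DPMO formula, not part of any particular Six Sigma toolkit:

```python
# Illustrative sketch of the standard DPMO (defects per million
# opportunities) calculation used to benchmark Six Sigma performance.
def dpmo(defects, units, opportunities_per_unit):
    return defects / (units * opportunities_per_unit) * 1_000_000

# Example: 7 defects found in 1,000 units with 50 opportunities each.
rate = dpmo(defects=7, units=1_000, opportunities_per_unit=50)
print(rate)         # 140.0 DPMO
print(rate <= 3.4)  # False: not yet at the Six Sigma level
```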

What is DMAIC in Six Sigma?

DMAIC is a data-driven quality methodology utilized for process improvement by integrating a Six Sigma quality initiative. The methodology starts with the Define phase, where the core business process and CTQ (critical-to-quality) issues are defined. This phase spells out customer expectations and requirements, where the process starts and stops, the project boundaries, and the process flow.

The performance of the business is measured in the second phase, Measure. A data collection plan is scheduled to define the types of defects and metrics in the process, and process variation and performance targets are defined based on the gaps between current and goal performance. Next, in the Analyze phase, the team finds the root causes of defects and opportunities for process betterment, and a relevant solution is identified to reduce the defects in the process. The plan for improving the process is implemented in the Improve phase. The last phase, Control, includes documentation and standardization of the new process. It defines a plan to observe the process and to control its performance.

How to apply DMAIC Steps to Continuous Improvement Projects?


The Lean Six Sigma concept not only helps to resolve problems; it also helps to identify the actual cause of a problem and offers a relevant solution to control it. Here are some questions, related to each phase, that can help solve a problem in a statistical way:


  • Define: What is the process outcome to be measured?
  • Measure: What is the current performance of the measured process?
  • Analyze: What are the root causes of the problem?
  • Improve: How can the cause be eliminated from the process, or how can the problem’s size be reduced?
  • Control: Has the performance of the process improved?

DMAIC is a cycle used to remove defects from a process and improve opportunities for business improvement. To learn more about DMAIC and other Lean Six Sigma methodologies, contact support@msystraining.com. MSys lead master instructors will offer detailed information.


PMP/CAPM Certification – Before or After Releasing the PMBOK Guide V6?

PMP/CAPM Certification – Before or After Releasing the PMBOK Guide V6?

The PMBOK® Guide Sixth Edition is bringing a huge revolution by changing the foundation of the project management profession. The release of the PMBOK® Guide V6, with the inclusion of Agile concepts in all Knowledge Areas, is the biggest change in the history of the PMBOK Guide! The PMBOK Guide 6 will make PMPs more qualified, as they will not only use Scrum vocabulary but will utilize core Agile practices like Progressive Elaboration, Rolling Wave Planning, and Decomposition.

PMI published the first edition of the PMBOK® Guide in 1996, and since then it has grown from 176 pages to 589 pages. In that time, the project manager’s role has expanded from the engineering department to strategic organizational change. Considering all those facts, the PMBOK® Guide V6 has changed more drastically than ever before.

The release of the new PMBOK Guide will bring major changes to the PMP and CAPM certification exams, and becoming PMP/CAPM certified will be more challenging for aspirants looking to get certified in 2017 or early 2018.

Impact of PMBOK Guide 6th Edition on PMP/CAPM Aspirants

  • Lack of Study Material: As PMBOK V6 is releasing in the third quarter, instructors and training providers will not get much time to study it and prepare course material. The current resources are based on the 5th edition of the guide, and it takes time for them to be updated to the latest edition.
  • Increased Difficulty Level: The addition of Agile will increase the difficulty level of PMP/CAPM examination and professionals have to work harder to get certified.
  • Hike in Examination Fee: It has been predicted that the launch of PMBOK Guide 6 might increase the exam fee of PMI certification exams.
  • Change in Exam Format: We can expect changes in the exam format because of changes in knowledge areas and processes.

Though the PMBOK Guide is releasing later this month or early next month, we can expect to see it reflected in the PMP/CAPM exam from January 2018. For professionals who are passionate about their careers and don’t want to delay their career growth, it’s high time to schedule the exam with PMI and get certified in 2017.


Fifth Generation Deterrence

Fifth Generation Deterrence

“Deterrence is the art of producing in the mind of the enemy… the FEAR to attack. And so, … the Doomsday machine is terrifying and simple to understand… and completely credible and convincing.” – Dr. Strangelove.

In a previous post, we looked at some aspects of the nuclear balance of power. In this post, we will consider some aspects of conventional deterrence. Ironically, Chris Lawrence was recently cleaning out a box in his office (as posted on this blog) which contained an important article for this debate, “The Case for More Effective, Less Expensive Weapons Systems: What ‘Quality Versus Quantity’ Issue?” by none other than Pierre M. Sprey, published in 1982 and available here.

In comparing the F-15 and F-16, Sprey identifies four principal effectiveness characteristics that contribute to victory in air-to-air combat:

  1. Achieving surprise bounces and avoiding being surprised;
  2. Out-numbering the enemy in the air;
  3. Out-maneuvering the enemy to reach firing position (when surprise fails);
  4. Achieving reliable kills within the brief firing opportunities presented by combat.

“Surprise is the first because, in every air war since WWI, somewhere between 65% and 85% of all fighters shot down were unaware of their attacker.” Sprey mentions that the F-16 is superior to the F-15 due to its smaller size and the fact that it smokes much less, both of which are clearly Within-Visual-Range (WVR) combat considerations. Further, his discussion of Beyond-Visual-Range (BVR) combat is dismissive.

The F-15 has an apparent advantage inasmuch as it carries the Sparrow radar missile. On closer examination, this proves to be little or no advantage: in Vietnam, the Sparrow had a kill rate of .08 to .10, less than one third that of the AIM-9D/G — and the new models of the Sparrow do not appear to have corrected the major reasons for this disappointing performance; even worse, locking-on with the Sparrow destroys surprise because of the distinctive and powerful radar signature involved.

Sprey was right to criticize the performance of the early radar-guided missiles. From “Trends in Air-to-Air Combat: Implications for Future Air Superiority,” page 10:

From 1965 through 1968, during Operation Rolling Thunder, AIM-7 Sparrow missiles succeeded in downing their targets only 8 percent of the time and AIM-9 Sidewinders only 15 percent of the time. Pre-conflict testing indicated expected success rates of 71 and 65 percent respectively. Despite these problems, AAMs offered advantages over guns and accounted for the vast majority of U.S. air-to-air victories throughout the war.

Sprey seemed to miss the fact that the radar-guided missile that supported BVR air combat was not something in the far-distant future, but an ongoing evolution of radar and missile technology. Even in the 1980s, the share of air-to-air combat victories by BVR missiles was on the rise, and since the 1990s it has become the most common way to shoot down an enemy aircraft.

In an Aviation Week podcast in July of this year, retired Marine Lt. Col. David Berke (also previously quoted on this blog) and Pierre Sprey debated the F-35. Therein, Sprey offers a formulaic definition of air power, as created by force and effectiveness, with force being a function of cost, reliability, and how often the aircraft can fly per day (sortie generation rate?). “To create air power, you have to put a bunch of airplanes in the sky over the enemy. You can’t do it with a tiny handful, even if they are like unbelievably good. If you send six aircraft to China, they could care less what they are … F-22 deployments are now six aircraft.”

Berke counters with the idea that he expressed before in his initial conversation with Aviation Week (as analyzed on this blog): that information and situational awareness are by far the most important factors in aerial warfare. This stems from the advantage of surprise, which was Sprey’s first criterion in 1982 and remains a critical factor in warfare to this day. This reminds me a bit of Disraeli’s truism of “lies, damn lies and statistics”: pick the metrics that tell your story, rather than objectively looking at the data.

Critics beyond Mr. Sprey have said that high-technology weapons like the F-22 and the F-35 are irrelevant to America’s wars; “the [F-22] was not relevant to the military’s operations in places like Iraq, Afghanistan and Libya — at least according to then-Secretary of Defense Robert Gates.” Indeed, according to the Washington Post, “Gates called the $65 billion fleet a ‘niche silver-bullet solution’ to a major aerial war threat that remains distant. … and has promised to urge President Obama to veto the military spending bill if the full Senate retains F-22 funding.”

The current conflict in Syria against ISIS, following the Russian deployment, has resulted in crowded and contested airspace, as evidenced by the shoot-down of a Russian Air Force Su-24 by a Turkish (NATO) F-16 (wikipedia), as reported on this blog. Indeed, ironically for Mr. Sprey’s analysis of the relative values of the AIM-9 vs. the AIM-7 missiles, as again reported by this blog,

[T]he U.S. Navy F/A-18E Super Hornet locked onto a Su-22 Fitter at a range of 1.5 miles. It fired an AIM-9X heat-seeking Sidewinder missile at it. The Syrian pilot was able to send off flares to draw the missile away from the Su-22. The AIM-9X is not supposed to be so easily distracted. They had to shoot down the Su-22 with a radar guided AMRAAM missile.

For the record, the AIM-7 was a direct technical predecessor of the AIM-120 AMRAAM. We can perhaps conclude that having more than one type of weapon is useful, especially as other air power nations are always trying to improve their countermeasures, and this incident shows that they can do so effectively. Of course, more observations are necessary for statistical proof, but since air combat has been so rare since the end of the Cold War, the opportunity to learn the lesson and improve the AIM-9X should not be squandered.

USAF Air Combat Dominance as Deterrent

Hence to fight and conquer in all your battles is not supreme excellence; supreme excellence consists in breaking the enemy’s resistance without fighting. – Sun Tzu

The admonition to win without fighting is indeed a timeless principle of warfare, and it is clearly illustrated by this report on the performance of the F-22 in the war against ISIS, in the crowded airspace over Syria, from Aviation Week on June 4, 2017. I have quoted at length and applied emphasis.

Shell, a U.S. Air Force lieutenant colonel and Raptor squadron commander who spoke on the condition that Aviation Week identify him only by his call sign, and his squadron of stealth F-22 Lockheed Martin Raptors had a critical job to do: de-conflict coalition operations over Syria with an irate Russia.

… one of the most critical missions the F-22 conducts in the skies over Syria, particularly in the weeks following the April 6 Tomahawk strike, is de-confliction between coalition and non-coalition aircraft, says Shell. … the stealth F-22’s ability to evade detection gives it a unique advantage in getting non-coalition players to cooperate, says Shell. 

‘It is easier to bring air dominance to bear if you know where the other aircraft are that you are trying to influence, and they don’t know where you are,’ says Shell. ‘When other airplanes don’t know where you are, their sense of comfort goes down, so they have a tendency to comply more.’

… U.S. and non-coalition aircraft were still communicating directly, over an internationally recognized, unsecure frequency often used for emergencies known as ‘Guard,’  says Shell. His F-22s acted as a kind of quarterback, using high-fidelity sensors to determine the positions of all the actors on the battlefield, directing non-coalition aircraft where to fly and asking them over the Guard frequency to move out of the way. 

The Raptors were able to fly in contested areas, in range of surface-to-air missile systems and fighters, without the non-coalition players knowing their exact positions, Shell says. This allowed them to establish air superiority—giving coalition forces freedom of movement in the air and on the ground—and a credible deterrent.

Far from being a silver-bullet solution for a distant aerial war, America’s stealth fighters are providing credible deterrence on the front lines today. They have achieved, in some cases, the ultimate goal of winning without fighting, by exploiting the advantage of surprise. The right question might be how many aircraft are required for this mission, given the enormous costs of fifth-generation fighters (more on this later). As a quarterback, the F-22 can support many allied units as part of a larger team.

Giving credit where it is due, Mr. Sprey has rightly stated in his Aviation Week interview, “cost is part of the force you can bring to bear upon the enemy.” His mechanism for computing air power in 2017, however, seems to ignore the most important aspect of air power since it first emerged in World War I: surprise. His dogmatic focus on the lightweight, single-purpose air-to-air fighter appears to shun even available, proven technology.

Here’s Why Blockchain Matters for Your Online Reputation

Here’s Why Blockchain Matters for Your Online Reputation

Just as word-of-mouth can make or break your business, your online reputation can either help or hurt it. In today’s digital world, our “real” lives are increasingly merging with our digital lives, and that merger is extending into business and e-commerce. But how can you ensure that what is being said about your company, or one you plan to do business with, is true? The answer rests in the power of the blockchain. Blockchain technology stands to change online reputations in more ways than you think, and it is worthwhile incorporating online reputation management (ORM) into your blockchain strategy. Here is what you need to know:

What is Online Reputation?

The advent of the internet and social media has empowered anyone with access to a device that can type content to become a critic. This criticism has given rise to the online reputation: a conglomerate of opinions, beliefs, and experiences expressed, typically, online. It is not just limited to employees or celebrities; companies have online reputations, too. The subject has become so important that it has created a niche in the form of online reputation management, or ORM. ORM businesses drum up “dream teams” that include SEO experts, private investigators, attorneys, publicists and ...


Read More on Datafloq
Economics of Warfare 19 – 4

Economics of Warfare 19 – 4

Continuing with a fourth and final posting on the nineteenth lecture from Professor Michael Spagat’s Economics of Warfare course that he gives at Royal Holloway University. It is posted on his blog Wars, Numbers and Human Losses at: https://mikespagat.wordpress.com/

This lecture continues the discussion of terrorism, looking at whether poverty or poor education causes terrorism. The conventional wisdom, supported by a book by Alan Krueger, is that they do not. Dr. Spagat explores this in more depth and the data tends to support this theme, although there are exceptions.

On slide 39, Dr. Spagat leaves us with a gem of a quote. The data he had been looking at was responses to surveys about terrorism. As he notes: “It is one thing to voice support on a survey for terrorism or attacks–it is another matter entirely to strap on explosives and blow oneself up. In other words, suicide bombers have to be really committed individuals.”

He then goes on to show that Palestinian suicide bombers are generally less impoverished and better educated on average than the population they are drawn from. He notes a similar pattern when looking at deceased Hezbollah militants (pages 39-41). This is not surprising if you are familiar with the history of revolutions and insurgencies.

The link to his lecture is here: http://personal.rhul.ac.uk/uhte/014/Economics%20of%20Warfare/Lecture%2019.pdf

Visualizing European Population Density

Visualizing European Population Density

Anonymous [Reddit]

A map of European countries, each with a population smaller than that of the London metropolitan area. As of 2015, London had 13.8 million residents.

Military History In The Digital Era

Military History In The Digital Era

Volumes of the U.S. Army in World War II official history series published by the U.S. Army Center of Military History [Hewes Library photo]

The U.S. National Archives and Records Administration (NARA) has released a draft strategic plan announcing that it will “no longer accept transfers of permanent or temporary records in analog formats and will accept records only in electronic format and with appropriate metadata” by the end of 2022. Given the widespread shift to so-called “paperless” offices across society, this change may not be as drastic as it may seem. Whether this will produce an improvement in record keeping is another question.

Military historians are starting to encounter the impact of electronic records on the preservation and availability of historical documentation of America’s recent conflicts. Adin Dobkin wrote an excellent overview earlier this year on the challenges the U.S. Army Center of Military History faces in writing the official histories of the U.S. Army in Afghanistan and Iraq. Army field historians on tight deployment timelines “hoovered up” huge amounts of electronic historical documentation during the conflicts. Now official historians have to sort through enormous amounts of material that is often poorly organized and removed from the context in which it was originally created. Despite the volume of material collected, much of it has little historical value and there are gaps in crucial documentation. Separating the useful wheat from the digital chaff can be tedious and time-consuming.

Record keeping in the paper age was often much better. As Chris wrote earlier this year, TDI conducted three separate studies on Army records management in the late 1990s and early 2000s. Each of these studies warned that U.S. Army documentation retention standards and practices had degraded significantly. Significant gaps existed in operational records vital to future historians. TDI found that the Army had better records for Red Cloud’s War of 1866-1868 than it did for Bosnia.

TDI is often asked why it tends to focus on the World War II era and earlier for its analytical studies. The answer is pretty simple: those are the most recent conflicts for which relatively complete, primary source historical data is available for the opposing combatants. Unfortunately, the Digital Age is unlikely to change that basic fact.

Object — Just Another Storage Technology or a New Business Proposition?

Object — Just Another Storage Technology or a New Business Proposition?

Object storage has lived in the IT background. But with advances in object storage technology and the fact that primary storage array sales are declining, object can now take over as a storage growth engine.
Is Big Data a Slippery Slope?

Is Big Data a Slippery Slope?

“I know one thing; that I know nothing.”

These words were famously attributed to Socrates in one of Plato’s accounts of the philosopher, a phrase that now comes to represent the Socratic paradox. While it is contested whether Socrates actually said these words, the meaning is still poignant, indicating that the wisest people are the ones who don’t assume to know all, who keep an open mind, and who “know when they know nothing.”

With the rise of Big Data, we now have more information at hand than ever before. Some might even say that we know more now than we ever have before — and this is dangerous thinking. Tom Goodwin, head of innovation at Zenith Media and Forbes contributor, would likely agree.

“We overestimate the importance of what we know, rather than focus on what this data makes clear we don’t actually know. The more you know, the more you know you don’t know,” he says in his post, “The Dark Side of Big Data.” “Above all I’m concerned we believe that big data is used as a cure all, we’ve somehow assumed that it will solve all our problems and I think that the reality doesn’t meet the hype and ...


Read More on Datafloq
Working in the (data) heart of KPN

Working in the (data) heart of KPN


The main question appears to be a simple one: What makes it so interesting to work as a Data Analyst at KPN? “Do you have a minute,” Ruben Timmermans asks with a smile. What follows is a talk about concretising “Xs and Ys,” the urge to innovate and true team spirit.

After earning his degree in Econometrics, Ruben Timmermans started working as a Data Analyst at KPN in 2016. “Econometrics teaches you all about the theoretical application of mathematical models. At KPN, these Xs and Ys are translated into concrete business with the help of statistics. There is a mathematical solution for every wish or problem. It truly makes you realise how much value data analysis can add for the customer,” Ruben explains excitedly.

Is a degree in Econometrics a requirement for working as a Data Analyst at KPN?

Ruben: “We have a lot of econometricians in our Data & Analytics department, but this is also an interesting professional environment for people with a different scientific and statistical background. KPN is an organisation with millions of customers who are each on their own “customer journey.” It is our job to discover and optimise these journeys using data ...


Read More on Datafloq
Human Factors In Warfare: Suppression

Human Factors In Warfare: Suppression

Images from a Finnish Army artillery salvo fired by towed 130mm howitzers during an exercise in 2013. [Puolustusvoimat – Försvarsmakten – The Finnish Defence Forces/YouTube]

According to Trevor Dupuy, “Suppression is perhaps the most obvious and most extensive manifestation of the impact of fear on the battlefield.” As he detailed in Understanding War: History and Theory of Combat (1987),

There is probably no obscurity of combat requiring clarification and understanding more urgently than that of suppression… Suppression usually is defined as the effect of fire (primarily artillery fire) upon the behavior of hostile personnel, reducing, limiting, or inhibiting their performance of combat duties. Suppression lasts as long as the fires continue and for some brief, indeterminate period thereafter. Suppression is the most important effect of artillery fire, contributing directly to the ability of the supported maneuver units to accomplish their missions while preventing the enemy units from accomplishing theirs. (p. 251)

Official US Army field artillery doctrine makes a distinction between “suppression” and “neutralization.” Suppression is defined to be instantaneous and fleeting; neutralization, while also temporary, is relatively longer-lasting. Neutralization, the doctrine says, results when suppressive effects are so severe and long-lasting that a target is put out of action for a period of time after the suppressive fire is halted. Neutralization combines the psychological effects of suppressive gunfire with a certain amount of damage. The general concept of neutralization, as distinct from the more fleeting suppression, is a reasonable one. (p. 252)

Despite widespread acknowledgement of the existence of suppression and neutralization, the lack of interest in analyzing its effects was a source of professional frustration for Dupuy. As he commented in 1989,

The British did some interesting but inconclusive work on suppression in their battlefield operations research in World War II. In the United States I am aware of considerable talk about suppression, but very little accomplishment, over the past 20 years. In the light of the significance of suppression, our failure to come to grips with the issue is really quite disgraceful.

This lack of interest is curious, given that suppression and neutralization remain embedded in U.S. Army combat doctrine to this day. The current Army definitions are:

Suppression – In the context of the computed effects of field artillery fires, renders a target ineffective for a short period of time producing at least 3-percent casualties or materiel damage. [Army Doctrine Reference Publication (ADRP) 1-02, Terms and Military Symbols, December 2015, p. 1-87]

Neutralization – In the context of the computed effects of field artillery fires renders a target ineffective for a short period of time, producing 10-percent casualties or materiel damage. [ADRP 1-02, p. 1-65]
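
Purely to make those doctrinal thresholds concrete, here is a minimal Python sketch of the computed-effects classification as the two ADRP 1-02 definitions above describe it. The function name and example value are mine, not doctrine.

def classify_fire_effect(casualty_fraction):
    """Classify the computed effect of field artillery fire using the
    ADRP 1-02 thresholds quoted above (3% suppression, 10% neutralization).
    casualty_fraction is casualties or materiel damage as a fraction of
    the target, e.g. 0.05 for 5-percent casualties."""
    if casualty_fraction >= 0.10:
        return "neutralized"
    if casualty_fraction >= 0.03:
        return "suppressed"
    return "no computed effect"

# A fire mission assessed at 4-percent casualties counts as suppression
print(classify_fire_effect(0.04))  # -> suppressed

Note that the percentages here are doctrinal inputs rather than empirical findings, which is exactly the problem discussed below.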

A particular source for Dupuy’s irritation was the fact that these definitions were likely empirically wrong. As he argued in Understanding War,

This is almost certainly the wrong way to approach quantification of neutralization. Not only is there no historical evidence that 10% casualties are enough to achieve this effect, there is no evidence that any level of losses is required to achieve the psycho-physiological effects of suppression or neutralization. Furthermore, the time period in which casualties are incurred is probably more important than any arbitrary percentage of loss, and the replacement of casualties and repair of damage are probably irrelevant. (p. 252)

Thirty years after Dupuy pointed this problem out, the construct remains enshrined in U.S. doctrine, unquestioned and unsubstantiated. Dupuy himself was convinced that suppression probably had little, if anything, to do with personnel loss rates.

I believe now that suppression is related to and probably a component of disruption caused by combat processes other than surprise, such as a communications failure. Further research may reveal, however, that suppression is a very distinct form of disruption that can be measured or estimated quite independently of disruption caused by any other phenomenon. (Understanding War, p. 251)

He had developed a hypothesis for measuring the effects of suppression, but was unable to interest anyone in the U.S. government or military in sponsoring a study on it. Suppression as a combat phenomenon remains only vaguely understood.

How Can Predictive Analytics Enhance Customer Base and Experience

How Can Predictive Analytics Enhance Customer Base and Experience

The marketplace has changed, that much is apparent. Customer expectations have taken a U-turn and put most previous practices to rest. People no longer tolerate being generalized and expect companies to treat them on a personal level. While this is understandable reasoning from an individual's perspective, the game completely changes when looking at this from the other side. Companies simply do not have the time or resources to get to know every single client personally. This is where predictive analytics comes in.

The what now?

According to the very aptly named Predictive Analytics Today, predictive analytics is a branch of advanced analytics that is used to make predictions about future events. It works by gathering previous and current customer data, processing it, and extrapolating a plausible outcome.

Simply put, it helps determine what marketing strategies resulted in a sale in the past and which ones have the highest probability of succeeding in the future. It is useful in every stage of the sales process, from helping to get customers engaged to keeping them as followers of your brand.
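
As a concrete illustration, a simple model can be trained on past marketing touches and their sale outcomes, then asked to score a new prospect. This is only a minimal sketch, assuming scikit-learn is available; the feature names and the tiny data set are hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [emails_sent, discount_offered_pct, pages_viewed]
X_history = np.array([[1, 0, 2], [3, 10, 8], [0, 0, 1], [2, 15, 12]])
y_history = np.array([0, 1, 0, 1])  # 1 = the contact resulted in a sale

model = LogisticRegression().fit(X_history, y_history)

# Estimated probability that a new prospect converts
prospect = np.array([[2, 10, 6]])
print(model.predict_proba(prospect)[0, 1])

In practice the same idea scales to millions of customers and far richer features, which is what lets it operate at every stage of the sales process.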

Pre-sale stage

It is important to know that the process begins way before any actual purchases are made. The beginning of the journey is ...


Read More on Datafloq
Real-time stream processing: Are we doing it wrong?

Real-time stream processing: Are we doing it wrong?

Humans to Machine – Shift of data source

Data has been growing exponentially. We have more data streaming through the wire than we can keep on disk, from both a value and a volume perspective. This data is created by everything we deal with on a daily basis. When humans were the dominant creators of data, we naturally had smaller amounts of data to deal with, and at the same time its value persisted for a longer period. This, in fact, holds true now as well when humans are the creators of the data.

However, humans are no longer the dominant creators of data. Machines, sensors, devices and the like took over long ago. This machine-generated data arrives with such humongous speed that 90% of all data in existence was created in the last two years. Such data tends to have a limited shelf life as far as value is concerned: the value of data decreases rapidly with time, and if it is not processed as soon as possible, it may not be very useful for ongoing business and operations. Naturally, we need a different thought process and approach to deal with ...
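
One way to picture that shrinking shelf life is to model a record’s value as decaying exponentially with age. This is a toy illustration only; the half-life is an arbitrary assumption.

def data_value(initial_value, age_seconds, half_life_seconds=60.0):
    """Toy model of data 'shelf life': value halves every half-life."""
    return initial_value * 0.5 ** (age_seconds / half_life_seconds)

# A reading worth 1.0 when fresh is nearly worthless after five minutes
for age in (0, 30, 60, 300):
    print(age, round(data_value(1.0, age), 3))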


Read More on Datafloq
Anyone Can Be A Historian

Anyone Can Be A Historian

In the world of government contracting, it is hard for a contractor to keep working with the government for longer than 3-5 years. Problems happen, people annoy each other, mistakes are made, frictions develop, and pretty soon people start wondering if they could do better with another contractor. So it is not unusual to see contractors fall in and out of favor. I have seen it happen repeatedly.

Many years ago a company that was a competitor to Dupuy’s HERO conducted a study. It was well done, as they hired one of our employees onto their staff and another of our employees as a consultant. They got a follow-on contract. But, this being the government, as is often the case, the follow-on contract came a year or so after the original effort was completed. The original team had moved on to other projects in the company. As it was, defense budgets were in a period of decline, so the company decided it could conduct the next study using available staff in order to keep them employed. The former HERO employee was not available, as he had been assigned to another project and that project manager did not want to let him go. The consultant was not called back. Instead they took some available engineers who were between contracts and put them on the project. After all, anyone can do history.

Needless to say, the next study was a failure. I was later told by a manager in the government that they would never hire that contractor back. Apparently this work requires enough expertise that we cannot be easily replaced by any bright guy.

Dupuy On Youtube

Dupuy On Youtube

One can hear the voice of Trevor Dupuy on YouTube: https://www.youtube.com/watch?v=cRbX6ur-pJ4

The more extended version is here: https://www.youtube.com/watch?v=ycNkCJpcbO4


Why You Need Cyber Threat Intelligence

Why You Need Cyber Threat Intelligence

Cyber threat intelligence has become more and more a necessity for every business, no matter how large or small. There are a number of different methods you can implement in order to combat cyber threats, including cyber intelligence. This method involves learning about the various cyber-risks that could affect your business. By making use of cyber threat intelligence, you can be prepared for what’s coming and know how to avoid these attacks.

Why Do You Need to be Concerned?

Many small business owners feel that as long as they have a virus scanner, a firewall, and other tools, they’re protected. They don’t feel like hackers would target their small business. Unfortunately, that’s often not the case. In 2015, around 43 percent of cyber-attacks were aimed at small businesses. That means small businesses are definitely not too small of a target for hackers! No matter how small your business is, you can’t simply assume hackers won’t notice you. You’ve got to be prepared, and that means you need to have a cyber intelligence policy in place.

Define Threats

The first thing to do is to define your threats. If you don’t know what’s going to cause your organization harm, you can’t be prepared to fight ...


Read More on Datafloq
Against the Panzers

Against the Panzers

The book that came out of the A2/D2 Study (Anti-Armor Defense Data Study) was Against the Panzers, by Allyn R. Vannoy and Jay Karamales: Against the Panzers: United States Infantry Versus German Tanks, 1944-1945

The graphics person for my three books and for the images on this website is Jay Karamales. Jay is a multi-talented person whose primary occupation is programming. Apparently the challenge of writing a book while working a full-time job was stressful enough that he never tried it again. Unfortunately, there was never an Against the Panzers II, although I gathered he did some work on it.

For a taste of Mr. Karamales’ book, I recommend you take a look at his article in the TNDM Newsletter: http://www.dupuyinstitute.org/pdf/v1n6.pdf

IoT: Penetrating the Possibilities of a Data Driven Economy

IoT: Penetrating the Possibilities of a Data Driven Economy


All of us are accustomed to the smart wearables, such as the ones we wear on a jogging track. We also have seen the concept of smart homes turn into a reality. We have seen a farmer sort and track his flock of sheep with the help of a mountable RFID device. 

Every physical element around us (including ourselves) has become a part of a real and rhythmic whole – communicating information with each other at all times. All thanks to the Internet of Things!

Ever since the Internet of Things (IoT) manifested into reality, integrating the physical world with our digital routine, experts and thought leaders have waited for it to transform the dream of a data driven economy into a witnessed possibility.

As the concept of Internet of Things continues to evolve and grow, it now appears that the wait is finally over. 

Welcome to the Industrial Internet of Things (IIoT). This is a concept-turned-reality, which looks set to change the traditional picture of industrial production for years to come. 

Industrial Internet of Things – What Is It? 

The Industrial Internet of Things, or IIoT, is the Internet of Things applied in industrial settings, combining operational technology with information technology to help in ...


Read More on Datafloq
Advances in Blockchain Analytics, Through more Investment & Collaboration

Advances in Blockchain Analytics, Through more Investment & Collaboration

Since our previous post, raising questions for Blockchain to answer, I have seen more news on Blockchain Analytics.

It seems many individuals and firms are working in this space, bringing together the power of analytics (sometimes plus AI) both to analyse blockchain data and to enable new types of analytics.

We raised, in that previous post, the unanswered questions on whether blockchain could help overcome current data challenges & how data held in blocks in chains could be analysed.

In this short follow-up post, I’ll share some progress I’ve made in seeing answers to some of those questions. This includes news items & a video, together with an encouraging increase in collaboration across these different technology specialisms.

Investment in cracking Blockchain Analytics

There have been several stories in UK & US news about investments in blockchain start-ups & more established firms. Indeed the US government seems particularly committed to investing in this technology development.

One story that caught my eye, as it offered promise for more ‘out of the box‘ solutions to blockchain data analytics, is this news item on the acquisition of Skry (formerly CoinAnalytics) by Bloq. If they are able to realise the potential outlined in this article, it could make for some breakthroughs ...


Read More on Datafloq
How Blockchain Could Improve Your Big Data

How Blockchain Could Improve Your Big Data

The rise of cloud storage has helped companies collect and manage massive amounts of data. Data comes from corporate systems, Internet of Things objects and unstructured sources like online forums. New analytics tools like Hadoop help companies make sense of that data.

Yet simply having data and analysis tools doesn't mean the results of an analysis are meaningful. Getting true insight from data depends on the data being correct. With the many sources that feed into data lakes and the many transformations big data goes through to be processed, there are many possible ways for errors to be accidentally or deliberately introduced. It's no surprise, then, that one survey found only a third of executives trust their analytics programs.

This lack of trust in the data not only limits its use within the enterprise that collected it but also limits the potential for companies to monetize their data by sharing it with others.

The solution to these problems may be found in an unexpected source: the blockchain technology that supports cryptocurrencies like Bitcoin.
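
The mechanism behind that hope is easy to sketch. In a blockchain, each record is hashed together with the hash of the record before it, so silently altering any point in the history becomes detectable. The snippet below is a minimal illustration of that idea in Python, not any particular product’s API.

import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous block's hash, so that
    altering any earlier record invalidates every later hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    hashes, prev = [], "genesis"
    for rec in records:
        prev = block_hash(rec, prev)
        hashes.append(prev)
    return hashes

def verify_chain(records, hashes):
    prev = "genesis"
    for rec, expected in zip(records, hashes):
        prev = block_hash(rec, prev)
        if prev != expected:
            return False  # the history has been tampered with
    return True

data = [{"sensor": "A", "reading": 21.5}, {"sensor": "A", "reading": 21.7}]
chain = build_chain(data)
data[0]["reading"] = 99.9          # tamper with an earlier record
print(verify_chain(data, chain))   # -> False

Applied to a data pipeline, this is what would let an analyst trust that the records feeding an analysis are the records that were originally written.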

Blockchain and Data Quality

As a technology, blockchain became famous along with Bitcoin, and most companies probably think its relevance, if any, is as another payments technology.

The right way to view blockchain, ...


Read More on Datafloq
How to Make the Most of Your Big Data

How to Make the Most of Your Big Data

Data rules the world these days. But now that it’s handily surpassed just about every other quantifiable thing on Earth in terms of usefulness and monetary value, a single burning question remains: Just what, precisely, do we do with all of it?

Don’t be scared off by the name — “big” data is every bit as useful for smaller companies as for globe-spanning corporations. As it turns out, it’s not really about how much you have, but rather how you use it. If you collect and rely on data, but you’re wondering how to make it more useful and actionable, here are a few places you can get started.

Start With a Problem That Actually Needs Solving

You don’t need to pore over a data point just because it exists. Some data just isn’t going to yield anything useful — so you need to pick your battles. What problems are you actually trying to solve?

Perhaps the best source of this kind of “structured” data is the information that changes hands during customer transactions. With a little effort, you can uncover patterns you didn’t know existed, such as seasonal fluctuations and spending habits.
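
For instance, with transaction records in hand, a few lines of pandas are enough to surface a seasonal pattern. This sketch assumes pandas is installed; the column names and figures are hypothetical.

import pandas as pd

# Hypothetical transaction records
sales = pd.DataFrame({
    "date": pd.to_datetime(["2017-01-15", "2017-06-20",
                            "2017-06-25", "2017-12-05"]),
    "amount": [40.0, 120.0, 95.0, 310.0],
})

# Average transaction size by month hints at seasonal fluctuation
by_month = sales.groupby(sales["date"].dt.month)["amount"].mean()
print(by_month)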

Operationalize Your Results

Supposing you started with a problem that needed solving and ...


Read More on Datafloq
A2/D2 Study

A2/D2 Study

A2/D2 Study = Anti-armor defense data study.

In the last days of the Soviet Union—before anyone realized they *were* the last days—the NATO nations were still doing all they could to prepare for a possible Soviet onslaught into Western Europe. They had spent decades developing combat models to help them predict where the blow would fall, where defense would be critical, where logistics would make the difference, what mix of forces could survive. Their main problem was that they didn’t know how far they could trust those models. How could they validate them? Maybe if the models could reverse-engineer the past, they could be relied upon to predict the future.

To that end, the American Department of Defense (DoD) and (particularly) the British Defence Operational Analysis Establishment (DOAE) undertook to collect data about historical battles that resembled the battles they expected to be fighting, with the aim of feeding that data into their models and seeing how much the models’ results resembled the historical outcomes of those battles. The thinking went that if the models could produce a result similar to history, they could be confident that feeding in modern data would produce a realistic result and teach them how to adjust their dispositions for optimal results.
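
The validation logic itself is simple to state, even if assembling the historical data was the hard part. The toy sketch below compares a model’s predicted outcomes against historical ones for a handful of engagements; the numbers are invented for illustration and have nothing to do with the actual study results.

# Invented engagement records:
# (name, model-predicted attacker casualties, historical attacker casualties)
engagements = [
    ("Engagement A", 220, 180),
    ("Engagement B", 75, 110),
    ("Engagement C", 400, 390),
]

# Mean absolute percentage error of the model against history
errors = [abs(pred - hist) / hist for _, pred, hist in engagements]
mape = 100 * sum(errors) / len(errors)
print(f"Mean absolute error vs. history: {mape:.0f}%")

A model that tracked history closely on engagements like these would earn some confidence for predicting a hypothetical future battle; one that diverged wildly would not.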

One of the battles that NATO expected to fight was a Soviet armored drive through the Fulda Gap, a relatively flat corridor through otherwise rough terrain in south-central West Germany. The battle that most resembled such an operation, in the minds of the planners, was the December 1944 surprise attack by the German Army into the Ardennes Forest region along the German/Luxembourg/Belgian border, which became known as the Battle of the Bulge for the wedge-shaped salient it drove into American lines. As the British involvement in this epic battle—what Churchill called the greatest battle in the history of the U.S. Army—was peripheral, consisting of a minor holding action by XXX Corps, the DOAE delegated collecting the relevant data for this battle to the DoD. The responsible element of the DoD was the Army’s Concepts Analysis Agency (CAA), which in turn hired defense contractor Science Applications International Corporation (SAIC) to perform the data collection and study. In late 1990 SAIC began in-depth research, consisting of archival reviews and interviews of surviving veterans, for a project which hoped to identify engagements down to vehicle-on-vehicle actions, with rounds expended, ammunition types, ranges, and other quantitative data which could be fed into models. Ultimately the study team, led by former HERO researcher and Trevor Dupuy protégé Jay Karamales, identified and recorded details for 56 combat actions from the ETO in 1944-1945, most from the Battle of the Bulge; and the detailed data from these engagements was used in the validation efforts for various combat models. This quantitative data, along with a copious amount of anecdotal information, was used as the basis for Karamales’ 1996 book with his co-author Allyn Vannoy titled Against the Panzers: United States Infantry versus German Tanks, 1944-1945: A History of Eight Battles Told through Diaries, Unit Histories and Interviews.

Copies of this study are available at DTIC. If you put “saic a2d2” into a search engine you should find all the volumes in PDF format on the DTIC website. As an example, http://www.dtic.mil/dtic/tr/fulltext/u2/a232910.pdf or http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA284378


Behavioral Advertising: Find out who clicks on your Ads!

Behavioral Advertising: Find out who clicks on your Ads!

The terms Behavioral Advertising and Predictive Advertising have been making the rounds for quite a while. However, not everyone is clear about how minuscule pieces of information come together to form a pattern.

“Behavioral Advertising can be summed up as a technique used by advertisers to display targeted ads to consumers, by collecting their browsing data and analyzing patterns over a period of time.”



Figure 1: An ecosystem of Behavioral Targeting, and how it works.

How does it work?

Let us say you recently browsed the handbag collection on Macy’s website. Shortly afterward, whichever website you visit serves you handbag ads with attractive prices and quick delivery options. Over the past few years, Behavioral Advertising has rapidly displaced traditional approaches such as demographic targeting.

Behavioral Advertising has hugely impacted consumer buying behavior by altering consumers’ self-perception. Let’s understand this with an example: you purchase some vegan products a few times a month, every month. Based on the targeting algorithm, marketers label you an “animal conscious” consumer, and soon most of the ads you see will be based on the label they have created: “Animal Conscious.” Soon enough, you’ll start responding to this label and purchase more “animal-friendly” products.
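
The labeling step described above amounts to a simple rule over purchase history. The sketch below is deliberately simplified; the threshold and label are invented for illustration, and real ad platforms use far richer behavioral models.

from typing import Optional

def label_consumer(purchases) -> Optional[str]:
    """Attach a segment label once a purchase pattern crosses a threshold."""
    vegan_buys = sum(1 for p in purchases if p == "vegan")
    if vegan_buys >= 3:             # a few vegan purchases a month...
        return "animal conscious"   # ...and the segment label sticks
    return None

history = ["vegan", "electronics", "vegan", "vegan"]
print(label_consumer(history))      # -> animal conscious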

What’s cooking behind the ...


Read More on Datafloq
7 Tips for Selecting the Best IoT Platform for Your Business

7 Tips for Selecting the Best IoT Platform for Your Business

It is now an established fact that maintaining an Internet of Things platform of your choice is crucial to the way business processes are carried out, and the rapid adoption of IoT in all fields of industry has made it an asset to any company.

However, in spite of the rapid growth of IoT, the sad fact is that 90% of the data still goes unused, making it evident that we are not leveraging the full potential of IoT. Failure to use the correct IoT platform is often the most common reason for this. When IBM conducted research into this, they came up with solutions.

One feasible solution was to release cognitive IoT solutions, where the computer is not programmed but instead learns through experience, human interaction and data. Through cognitive IoT, vast amounts of data can be put to use.

So, let’s take a look at some of the best tips that will help you choose the right IoT platform for your business.

1. The Unique Offerings of Each Vendor

IoT has revolutionized many industrial sectors to a great extent; hence, there are so many IoT platforms out there that choosing the best becomes a tough task. After initial research, you will probably have to focus ...


Read More on Datafloq
Wrong KPIs are costly: How to avoid Wells Fargo’s mistake

Wrong KPIs are costly: How to avoid Wells Fargo’s mistake

Fraud charges and a $185 million fine – this was the price of one wrong KPI for Wells Fargo. This notorious case probably induced many companies to double-check their motivation schemes and reach out to BI consulting practitioners. And it’s easy to understand these precautions. For many years, Wells Fargo was a role model for banks that dreamed of repeating its success in cross-selling. Likewise, Wells Fargo’s failure motivated its followers to double-check their own strategies.

Wells Fargo defined a brilliant strategy. With a limited number of customers in the market, the best scenario was to sell more products to existing customers. The company followed the principles of smart strategic management, which instruct one to translate a corporate strategy into KPIs, and defined a KPI on cross-selling with a target of 8 products sold per customer. Wells Fargo even invented a catchy motto: Eight is great. Only many years later did the company find out that it had chosen the wrong incentive, one that led to the creation of 2 million fake accounts.

While this article cites Wells Fargo’s example, the problem of poorly defined KPIs that are not connected with ultimate business goals can appear in any industry.

A BI solution can help in defining the right KPIs

Any ...


Read More on Datafloq
Importance of Agile Approach to Project Management

Importance of Agile Approach to Project Management

The traditional Waterfall approach to project management works well in more stable contexts. Several project frameworks, such as PRINCE2®, PMBOK® and APM BoK, derive from this approach. Waterfall frameworks, however, cope poorly with volatile, uncertain, complex and ambiguous conditions; they are most relevant where the requirements are defined before the work begins and will not change during the life of the project.

However, where there are frequent changes throughout the project, a waterfall framework means expensive reworking of the plan. This leads to consideration of the Agile approach, in which numerous small deliveries and a continuous conversation with the customer offer greater flexibility and therefore reduce waste. The success of an Agile framework depends on several factors. The three key elements are as follows:

1. A Self-Organizing Team

A self-organizing team is one of the most important factors in starting with an Agile framework. By moving away from the traditional working culture, team members are encouraged to utilize their overlapping skills and work together, which results in greater empowerment and satisfaction.

2. Requirements Management and Time-Boxing

This factor covers the calculation of time and cost elements along with prioritization of the requirements. The Agile concept differs from Waterfall expectations: within agreed parameters, team members can change the requirements, but time and cost cannot be changed. Customers always want to place many requirements in the ‘must haves’ section; however, Agile frameworks keep these to around 40-45% of the total effort (see the sketch after this list).

3. People Engagement

People engagement is a crucial part of the Agile approach and has proven more successful, as it allows stakeholders from different sections to discuss what they do and the order of the work. This practice is recognized as a more motivational approach than the traditional “command and control” approaches.
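
As promised above, here is a minimal sketch of the 40-45% rule of thumb from the time-boxing element: given a prioritized backlog, check how much of the total estimated effort the ‘must haves’ consume. The requirement names, priorities and effort figures are hypothetical.

# Hypothetical prioritized backlog: (requirement, priority, effort points)
requirements = [
    ("login", "must", 5), ("search", "must", 8),
    ("export", "should", 6), ("theming", "could", 4),
    ("reporting", "should", 7),
]

total = sum(effort for _, _, effort in requirements)
must = sum(effort for _, prio, effort in requirements if prio == "must")
share = 100 * must / total
print(f"Must-have share: {share:.0f}% of effort")
if share > 45:
    print("Warning: must-haves exceed the ~40-45% guideline")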

Agile Is Not What You Think

Not everyone is ready to use the Agile approach, and a common misconception about Agile is that there is a single unifying Agile methodology. In fact there is no standard way to organize and manage an Agile project, which makes it attractive to some and threatening to others.

There are people who think Agile practice is only relevant to the software industry, but that’s not true. The Agile approach can be used in other, non-software fields, such as improving business processes, renovating a large building or improving job aids for customer-facing personnel. To learn how to apply Agile practices to any project, join the training programs organized by MSys Training.


How Will PMBOK Guide 6 Change Your Life?

How Will PMBOK Guide 6 Change Your Life?

The release of PMBOK Guide 6 has become the talk of the town, and the delay in its launch has made people even more curious about it. Even I am waiting for the release of the PMBOK Guide 6th edition. Being associated with MSys Training, a training provider that also offers PMP training, I always get a chance to contact people associated with PMI and PMP.

After the announcement of PMBOK V6, everybody was talking about it. I read many articles about the PMBOK Guide 6 updates and its launch; I even posted some myself. While reading and researching, I realized that the launch of PMBOK Guide 6 will definitely make the PMP exam tougher, and for any PMP aspirant it will be difficult to clear the PMP certification exam on the first or even the second attempt.

There are several small and big changes in PMBOK Guide 6; some of them are as follows:

  • The first 3 chapters have been completely rewritten to highlight the importance of project management’s role in business value creation and organizational change.
  • All knowledge areas will focus on new topics such as Key Concepts, Tailoring Considerations, Approaches in Agile, Trends and Emerging Practices, and Adaptive and Iterative Environments.
  • The ‘Project Human Resource Management’ module has been renamed ‘Project Resource Management’ so that it covers physical resources as well as human ones.
  • In PMBOK V6, the role of the project manager has been expanded to that of a business expert, strategic thinker and leader.
  • ‘Project Time Management’ has been renamed ‘Project Schedule Management’.
  • Three new processes have been added: Manage Project Knowledge, Implement Risk Responses and Control Resources.
  • The Close Procurements process has been removed from the book.

Among all these changes, the major one that will make the PMP exam more difficult is the addition of Agile to project management studies. The Agile concept has been incorporated into all 10 knowledge areas.

Considering all the facts and myths circulating, I recommend that PMP aspirants looking to get certified by 2017 or early 2018 get into action now. Get trained, start studying, utilize your project management expertise, and earn your PMP certification before PMBOK V6 impacts the examination.


Explore the World of Agile with Agile Practice Guide

Explore the World of Agile with Agile Practice Guide

PMI and the Agile Alliance collaborated to design and develop a guide that combines agile and project management practices. The Agile Practice Guide is scheduled to launch in September 2017 alongside the PMBOK Guide 6. The guide is designed to give a greater understanding of agile practices and will offer knowledge on how agile relates to the project management community.

To illustrate several ways to be agile, the guide defines some of the commonly used agile frameworks, including Scrum, Scrumban, Scrum of Scrums, Large Scale Scrum, Enterprise Scrum, Scaled Agile Framework, Kanban, eXtreme Programming (XP), Feature-Driven Development (FDD), Agile Unified Process (AUP), Dynamic Systems Development Method (DSDM) and Disciplined Agile.

The Agile Practice Guide introduces the Agile Manifesto mindset, principles and values. The guide also covers the concepts of high-uncertainty and definable work, and the correlation between agile approaches and lean and Kanban methods. The book elaborates empirical measurements for the team and their reporting status. The guide also explores organizational factors impacting the use of agile practices, such as readiness, culture, business practices and the role of a PMO. Apart from that, the guide focuses on several other factors that play important roles in creating an Agile environment, such as team composition and servant leadership.

The Agile Practice Guide will be helpful for individuals trained in project management who are looking to transition to Agile. The guide will include a mapping of agile concepts to the Process Groups and Knowledge Areas described in the PMBOK® Guide V6, helping project practitioners adopt an agile way of working. According to reports, the publisher will seek feedback from agile experts to improve the practice guide.


History and Evolution of the PMBOK Guide

History and Evolution of the PMBOK Guide

A Guide to the Project Management Body of Knowledge (PMBOK® Guide) by PMI documents best practices and standards for project management. The current version of the PMBOK is considered one of the most important exam preparation books for the PMP (Project Management Professional) and PMI-ACP certifications. The PMBOK 6th edition is scheduled to launch in the third quarter of 2017. In this article, we will go through the history and evolution of the PMBOK Guide:

The first PMBOK® Guide was published in 1996. Each successive edition was released to surpass the previous version, incorporating new best practices and standards of project management.

PMBOK® Guide 1st Version [1996]

PMI saw a need to bring together all official documents and guides to improve the development of the project management discipline, and published the first edition of the PMBOK® Guide in 1996. This edition was an extended version of “Ethics, Standards, and Accreditation Committee Final Report”, a white paper published in 1983.

PMBOK® Guide Version 2 [2000]

The upgraded 2nd edition of the PMBOK Guide was launched in 2000. This edition included knowledge and practices that were commonly accepted in the field of project management and had proven valuable and useful to most projects. The PMBOK® Guide Version 2 also reflected the growth of project management and corrected errors found in the previous edition.

PMBOK® Guide Version 3 [2004]

After releasing the PMBOK® Guide 2nd Edition, PMI received thousands of suggestions for improving the guide. PMI’s editorial committee reviewed those suggestions, integrated the recommendations, and released the third edition in 2004. The project management practices included in the 3rd edition of the PMBOK® Guide were those considered useful to most projects.

PMBOK® Guide Version 4 [2009]

The fourth edition of the PMBOK was launched five years after the publication of its predecessor. In this edition, the content of the PMBOK® Guide was edited to make it more consistent and accessible, and a clear distinction was made between project documents and the project management plan. The “triple constraints” of project management were expanded to six: scope, schedule, quality, resources, risk and budget.

PMBOK® Guide Version 5 [2013]

The current, 5th version of the PMBOK® Guide was released in 2013. Acting on the suggestions and recommendations received, PMI made changes to the PMBOK Guide 4th Edition, and the 5th edition represents PMI’s continual effort to upgrade and update the body of knowledge. Many PMP certification aspirants refer to the PMBOK® Guide to prepare for the PMP certification exam.

PMBOK® Guide 6 [2017]

The PMBOK® Guide 6 will be published in July 2017. This edition will incorporate Agile, which has become one of the fastest growing methodologies in recent years. The edition also brings some minor changes to the process groups, processes and naming conventions of the PMBOK® Guide methodology. The PMI Talent Triangle (Leadership, Technical Project Management, and Business and Strategic Management) will also be incorporated into the 6th edition of the PMBOK® Guide.

The PMBOK Guide is one of the major sources for preparing for the PMP and PMI-ACP, but after the launch of a new edition many people initially find it difficult to understand the newly added terminology and processes. Hence, if you are a PMP aspirant, now is the time to make the decision and book an exam appointment with PMI before the sixth edition of the PMBOK affects the certification exam.


Extended LSSGB, PMP, ITIL Foundation and PMI-ACP Classes for Veterans

Extended LSSGB, PMP, ITIL Foundation and PMI-ACP Classes for Veterans

Industry progress and changing trends and practices are not in anyone’s control, but you can keep pace with them by re-skilling, up-skilling and multi-skilling. Updating your skills is the only way to stay relevant in any industry. MSys is working to reduce the skill gap between organizations and individuals. The idea behind initiating the “Give Back to the Society” campaign is to generate equal opportunities for Veterans and their families.

Since March, MSys has been organizing complimentary classes and has helped more than 25 Veterans get certified and establish their own identities in the corporate world. Considering requests from our Vets, and to increase opportunities for them, MSys has now added PMI-ACP, LSS Black Belt and ITIL Foundation.

Any training is more fruitful once the participant gets certified, and we have observed 60% of our Vet participants opting for the paid course and earning their certification. Considering requests from our Veteran friends, and as a tribute to Vets for their service, MSys has decided to extend the “Give Back to the Society” campaign for the entire year.

Here is the schedule for the Veterans classes in the upcoming months:

Some recent reviews from Vets who attended previous classes:

Our Helping Hands

A few of our master instructors, including Jason Saetrum, Michelle Halsey and Alfred Howard, have joined hands with us to make this campaign more fruitful for Vets, and will be guiding them on how to get certified and transform their careers.

Why reduced cost?

To be able to continue delivering these training programs to Veterans, MSys has decided to offer the courses at a very low price compared to the standard rates. Veterans’ families can also avail themselves of this opportunity.

To learn more about our Veterans program, visit https://www.msystraining.com/veteran. To inquire about the program, drop us an email at support@msystraining.com or give us a ring on +1-408-848-3078 and we will get back to you.

Stay connected to get more updates on “Give Back to the Society”!

Here are some common questions asked by Veterans before enrolling in the classes. Have a look at them; if you still don’t find answers to your queries, drop us an email at support@msystraining.com or call us at 408-878-3078.

How to register for the classes?

Go to the Veteran page, select the course with a suitable date, and register yourself.

OR

Drop an email with your preferred course and dates to support@msystraining.com and one of our Learning Managers will confirm your registration within 24 hours.

When do I get my training details?

You will receive a registration confirmation email within 24 hours of your registration request. All trainings are online and hosted on GoToMeeting. An MSys representative will send you the GoToMeeting.com login credentials 5 days prior to the training.

Do I get the MSys courseware?

MSys courseware for PMP, PMI-ACP, LSSGB and ITIL Foundation is available for Vets at just $149, $199, $349 and $449 respectively.

How do I appear for the Certification Exam?

Certification fees for ITIL Foundation and LSSGB are included in the MSys courseware fees. On completion of training, you will get an exam coupon with six months’ validity. For PMP and PMI-ACP, one of our consultants will help you fill in the application form on http://www.pmi.org.

What is the difference between MSys and IASSC LSSGB certification?

Both certifications are globally recognized. The IASSC exam is a closed-book exam with a single attempt, whereas the MSys LSSGB is an open-book exam with three attempts.

Is there any difference between MSys Veterans classes and regular paid classes?

No, there is absolutely no difference between our paid and Veteran programs. MSys is conducting these classes to help Veterans understand the current market situation and gain the extra skills required to excel in the civilian world.

Do I get a course completion certificate?

Yes, upon completion of the online training, MSys will send a course completion certificate along with a PDUs/Contact Hours certificate to your registered email address.

