Soviet Tank Repairs at Kursk (part 1 of 2)

Below is text taken directly from Valeriy Zamulin’s book Demolishing the Myth, pages 447-448.

To quote:

The units of the [German] II SS Panzer Corps also left behind only a scorched field and demolished equipment when they eventually withdrew. The headquarters of the 2nd Tank Corps reported on 16 July 1943: “…The enemy, organizing a retreat during the night, withdrew all his forces, evacuated all the damaged equipment, and torched the remaining knocked-out tanks and vehicles on the battlefield.”

In the Red Army, the main burden for recovering tanks and self-propelled guns and for transporting them to collection points for disabled vehicles lay on the brigades’ equipment companies and the personnel of the tank battalions. Usually, the crews themselves actively participated in the recovery of their damaged tanks or self-propelled guns and then performed any routine or moderately difficult repair jobs. Kommunar tractors, and Komintern and Stalinets-2 artillery prime movers were used for this work. Mainly, however, a single T-34 or a pair of them made the recovery, because the tractors had bulky profiles, insufficient power, and no armor protection.

In contrast, the recovery and repair work in the units of [German] Army Group South was well-organized. Each panzer regiment had a well-equipped separate tank maintenance company, while a separate Tiger battalion had its own tank maintenance platoon. These elements managed to do 95% of all the repair work on the armored vehicles, which was performed in frontline conditions.

Unfortunately, it is impossible to say the same thing about the repair work in the corps formations of the [Soviet] 5th Guards Tank Army. At 2400 12.07.43, the headquarters of the 29th Tank Corps reported the following information in a combat dispatch:

The brigade and battalion of the corps are engaged in recovering the wounded and equipment. In the course of the night 3 T-34 tanks and 1 Su-122 self-propelled gun will be repaired.

The recovery of damaged vehicles is being implemented by three turret-less T-34 tanks and one M-3 tank. Five teams are performing the repair work: two teams from the 32nd Tank Brigade and one from the 31st Tank Brigade, in addition to the corps repair teams. One of the corps teams is doing the repair work on self-propelled guns.

Thus, of the 55 knocked-out tanks and self-propelled guns, the 29th Tank Corps was only able to repair four over the night. Naturally, at such a pace it was not easy to restore the combat capability of the corps quickly.

P.A. Rotmistrov [Fifth Guards Tank Army commander] later wrote:

The presence in the army of only mechanical tools could not guarantee the quick restoration and repair of parts necessary for tank repairs. The lack of welding equipment and repair workshops delayed the fabrication and rehabilitation of spare parts, and thus also the repair of tanks and self-propelled guns within set periods. Army and Front depots of inventory of armored vehicles were located far away (150-300 kilometers), and the insufficient amount of transportation in the army’s corps and brigades complicated the timely supply of components and spare parts.

There were no special break-down teams in the repair units, so it was necessary to pull qualified mechanics from repair work in order to break down the tanks, which reduced labor productivity.

AI and Big Data enable better diagnosis, personalized treatment of eye diseases

“Exactly one year ago, we spoke about the fact that, in future, it will be possible to diagnose diabetes from the eye using automatic digital retinal screening, without the assistance of an...

Maintaining Customer Relationships by Maintaining your Data

Imagine a world where you only ever met anybody once. Where every encounter lasted no longer than a few hours and no matter how well you got on with that person, you never spoke to them again. This would be a very sad world to live in.

Now take this world and compare it to a business which only ever meets its customer once. No meaningful relationship is ever developed, there is no way of knowing what the market wants and there is no feedback to provide you with that satisfying feeling that you have delivered something truly valuable.

Luckily, we do not live in a world like this. However, it is shocking how many businesses discard good customers by failing to maintain contact and nurture a good customer relationship.

Developing and maintaining a good customer relationship is crucial for a variety of reasons:


It is easier and cheaper to market to existing customers
It helps maintain a good brand image, which keeps customers from switching to competitors
It opens doors to new customers through valuable testimonials and reviews from previous customers
It grants you vital product feedback, which allows you to improve your product or service


So why do businesses allow customer relationships to decay?

The answer lies primarily in poor data. ...


Using Artificial Intelligence To Fix Healthcare

The healthcare industry should be using Artificial Intelligence (AI) to a far greater degree than at present, but progress has been painfully slow. The same factors that make the healthcare system so...

7 IoT Influencers You Should Be Following

The Internet of Things (IoT) is a powerful, transformative force and cornerstone for digital businesses taking advantage of the convergence of the physical and digital worlds. IoT is creating an...

Population Now versus 2050

As you may have noted in my previous demographics posts, I have been tracking the populations of the various nations I was looking at: in 1950, currently, and as estimated for 2050. Let me summarize briefly what we are looking at (measured in millions of people):

                          1950         2017         2050

China                     583 (1953)   1,411        1,360
India                     361 (1951)   1,324        1,700
United States             151          309 (2018)   402
Soviet Union/Russia       182 (1951)   143 (2018)   132
Japan                     83           127          109
Germany                   69           83           79

Now, a lot of numbers there. Let us set the U.S. at a value of 1 and everyone else at a value relative to it. So:

                          1950    2017    2050

China                     3.86    4.57    3.38
India                     2.39    4.28    4.23
United States             1.00    1.00    1.00
Soviet Union/Russia       1.21     .46     .33
Japan                      .55     .41     .27
Germany                    .46     .27     .20

So, during the height of the bad old days (the 1950s), the Soviet Union had a larger population than the U.S.; and China, part of the communist bloc and actually in a hot war with us, had almost four times the population. Now….well, the Soviet Union is gone. In 2050, China will only have about three times the U.S. population, while a number of major powers (like Japan, Germany and Russia) will be a smaller fraction of the U.S. population.
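
As a quick check of the arithmetic, a few lines of Python reproduce the second table from the first. This is just a sketch; the only inputs are the figures from the tables above, in millions:

    # Each country's population divided by the U.S. figure for the same year.
    # Figures in millions, copied from the first table above.
    populations = {
        "China":               (583, 1411, 1360),
        "India":               (361, 1324, 1700),
        "United States":       (151,  309,  402),
        "Soviet Union/Russia": (182,  143,  132),
        "Japan":               ( 83,  127,  109),
        "Germany":             ( 69,   83,   79),
    }
    us = populations["United States"]

    print(f"{'':22}{'1950':>8}{'2017':>8}{'2050':>8}")
    for country, figures in populations.items():
        ratios = (p / u for p, u in zip(figures, us))
        print(f"{country:22}" + "".join(f"{r:8.2f}" for r in ratios))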

Again, I note that some people like to talk about America in decline on the world stage. I really don’t see it economically or demographically.

U.S. versus The World (GDP)

Of course, the real challenge would be to predict GDPs in 2050. We probably can for the U.S. On the other hand, it is pretty hard to say where the Chinese economy will be in 2050. I would be hesitant to do a straight-line estimate.
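
A toy calculation shows why a straight-line estimate is a gamble: small differences in an assumed growth rate compound dramatically over three decades. The rates below are hypothetical, not forecasts:

    # Compound a hypothetical economy (indexed to 1.0 today) for 30 years
    # at three assumed growth rates; the spread illustrates why long-range
    # straight-line GDP projections are risky.
    for rate in (0.02, 0.04, 0.06):
        size = (1 + rate) ** 30
        print(f"{rate:.0%} annual growth -> {size:.1f}x today's economy")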

U.S. versus China (GDP)

Digital transformation: Closing the gap between innovation and execution

Coming up with the ideas is easy, but turning those bright sparks into successful businesses is a lot harder. Most organisations do not have a major problem generating new ideas, but many firms fail...

Managing High Availability in PostgreSQL – Part I

Managing high availability in your PostgreSQL hosting is very important to ensuring your clusters maintain exceptional uptime and strong operational performance so your data is always available to your application. In an earlier blog post, we introduced you to configuring high availability for PostgreSQL using streaming replication, and now we’re going to show you how to best manage PostgreSQL high availability.

There are multiple tools available for managing the high availability of your PostgreSQL clusters using streaming replication. These solutions offer automatic failover capabilities, monitoring, replication, and other useful administrative tasks. Some of the prominent open source solutions include:



  • PostgreSQL Automatic Failover by ClusterLabs
  • Replication Manager for PostgreSQL Clusters by repmgr (2ndQuadrant)
  • Patroni by Zalando

Each of these tools provides its own way of managing the clusters. In our three-part series of posts on high availability for PostgreSQL, we’ll share an overview, the prerequisites, and the working and test results for each of these three tools. Here in Part 1, we’ll take a deep dive into the PostgreSQL Automatic Failover (PAF) solution by ClusterLabs.

PostgreSQL Automatic Failover

PostgreSQL Automatic Failover (PAF) is a high availability management solution for PostgreSQL by ClusterLabs. PAF makes use of the popular, industry-standard Pacemaker and Corosync stack. With Pacemaker and Corosync together, you’ll be able to detect failures in the system and act accordingly.
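
In a PAF deployment it is Pacemaker that performs this monitoring and the failover itself, but as a rough illustration of the kind of check involved, the sketch below polls each PostgreSQL node directly and asks which one is the primary. The hostnames and credentials are placeholders, not part of PAF or Pacemaker:

    # Toy health check: pg_is_in_recovery() returns false on the primary
    # and true on a streaming-replication standby. Hostnames and
    # credentials below are placeholders for illustration only.
    import psycopg2

    NODES = ["pg-node1.example.com", "pg-node2.example.com", "pg-node3.example.com"]

    def check_node(host):
        try:
            conn = psycopg2.connect(host=host, dbname="postgres",
                                    user="monitor", password="secret",
                                    connect_timeout=3)
            # "with conn" wraps the query in a transaction; it does not close.
            with conn, conn.cursor() as cur:
                cur.execute("SELECT pg_is_in_recovery()")
                in_recovery = cur.fetchone()[0]
            conn.close()
            return "standby" if in_recovery else "PRIMARY"
        except psycopg2.OperationalError:
            return "unreachable"

    for node in NODES:
        print(f"{node}: {check_node(node)}")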

Pacemaker is capable of managing many resources, and ...


Leadership in the age of disruption: How to rise to digital transformation

Brad Parks, VP of Business Development at Morpheus Data, looks at what it takes to be a leader in the age of disruption. Technology has been transforming business since the invention of the wheel and...

Do you really want to be data-driven?

I was once asked by a CIO to help him present to his board on why they wanted to become “data-driven”. Five minutes into my presentation, a board member and I concluded that based on what we...

Inside the chief data privacy officer role with Barbara Lawler

This fall, leading data platform company Looker announced that Barbara Lawler had joined the company as its new chief privacy and data ethics officer. She appears to have been an obvious choice for...

Data Cleansing: Why it Matters in Today’s World

Data is the backbone of every organization; simply put, without data, a company’s present and future will fall to pieces, and every organization will eventually end up taking wrong decisions with...

In the blockchain economy, intellectual property will be obsolete

Canada has a rich history of innovation, but in the next few decades, powerful technological forces will transform the global economy. Large multinational companies have jumped out to a head start in...

How to Set Up an AI R&D Lab

There’s a common misconception that rebranding as an AI company is as simple as having data, infrastructure, and off-the-shelf data and analytics. The reality is that AI is complex, high-risk,...

How Integrating AI Into Recruitment Can Benefit Companies Facing a Labor Crisis

Entrepreneurs intent on fast growth need to stay one step ahead of their competitors. That’s where AI comes in — or should. Unemployment in the United States is at its lowest rate (3.7...

Trevor Dupuy and Technological Determinism in Digital Age Warfare

Is this the only innovation in weapons technology in history with the ability in itself to change warfare and alter the balance of power? Trevor Dupuy thought it might be.

Shot IVY-MIKE, Eniwetok Atoll, 1 November 1952. [Wikimedia]

Trevor Dupuy was skeptical about the role of technology in determining outcomes in warfare. While he did believe technological innovation was crucial, he did not think that technology itself decided success or failure on the battlefield. As he wrote in a work published posthumously in 1997,

I am a humanist, who is also convinced that technology is as important today in war as it ever was (and it has always been important), and that any national or military leader who neglects military technology does so to his peril and that of his country. But, paradoxically, perhaps to an extent even greater than ever before, the quality of military men is what wins wars and preserves nations. (emphasis added)

His conclusion was largely based upon his quantitative approach to studying military history, particularly the way humans have historically responded to the relentless trend of increasingly lethal military technology.

The Historical Relationship Between Weapon Lethality and Battle Casualty Rates

Based on a 1964 study for the U.S. Army, Dupuy identified a long-term historical relationship between increasing weapon lethality and decreasing average daily casualty rates in battle. (He summarized these findings in his book, The Evolution of Weapons and Warfare (1980). The quotes below are taken from it.)

Since antiquity, military technological development has produced weapons of ever increasing lethality. The rate of increase in lethality has grown particularly dramatically since the mid-19th century.

However, in contrast, the average daily casualty rate in combat has been in decline since 1600. With notable exceptions during the 19th century, casualty rates have continued to fall through the late 20th century. If technological innovation has produced vastly more lethal weapons, why have there been fewer average daily casualties in battle?

The primary cause, Dupuy concluded, was that humans have adapted to increasing weapon lethality by changing the way they fight. He identified three key tactical trends in the modern era that have influenced the relationship between lethality and casualties:

Technological Innovation and Organizational Assimilation

Dupuy noted that the historical correlation between weapons development and their use in combat has not been linear because the pace of integration has been largely determined by military leaders, not the rate of technological innovation. “The process of doctrinal assimilation of new weapons into compatible tactical and organizational systems has proved to be much more significant than invention of a weapon or adoption of a prototype, regardless of the dimensions of the advance in lethality.” [p. 337]

As a result, the history of warfare has been exemplified more often by a discontinuity between weapons and tactical systems than by effective continuity.

During most of military history there have been marked and observable imbalances between military efforts and military results, an imbalance particularly manifested by inconclusive battles and high combat casualties. More often than not this imbalance seems to be the result of incompatibility, or incongruence, between the weapons of warfare available and the means and/or tactics employing the weapons. [p. 341]

In short, military organizations typically have not been fully effective at exploiting new weapons technology to advantage on the battlefield. Truly decisive alignment between weapons and systems for their employment has been exceptionally rare. Dupuy asserted that

There have been six important tactical systems in military history in which weapons and tactics were in obvious congruence, and which were able to achieve decisive results at small casualty costs while inflicting disproportionate numbers of casualties. These systems were:

  • the Macedonian system of Alexander the Great, ca. 340 B.C.
  • the Roman system of Scipio and Flaminius, ca. 200 B.C.
  • the Mongol system of Genghis Khan, ca. A.D. 1200
  • the English system of Edward I, Edward III, and Henry V, ca. A.D. 1350
  • the French system of Napoleon, ca. A.D. 1800
  • the German blitzkrieg system, ca. A.D. 1940 [p. 341]

With one caveat, Dupuy could not identify any single weapon that had decisively changed warfare in and of itself without a corresponding human adaptation in its use on the battlefield.

Save for the recent significant exception of strategic nuclear weapons, there have been no historical instances in which new and lethal weapons have, of themselves, altered the conduct of war or the balance of power until they have been incorporated into a new tactical system exploiting their lethality and permitting their coordination with other weapons; the full significance of this one exception is not yet clear, since the changes it has caused in warfare and the influence it has exerted on international relations have yet to be tested in war.

Until the present time, the application of sound, imaginative thinking to the problem of warfare (on either an individual or an institutional basis) has been more significant than any new weapon; such thinking is necessary to real assimilation of weaponry; it can also alter the course of human affairs without new weapons. [p. 340]

Technological Superiority and Offset Strategies

Will new technologies like robotics and artificial intelligence provide the basis for a seventh tactical system where weapons and their use align with decisive battlefield results? Maybe. If Dupuy’s analysis is accurate, however, it is more likely that future increases in weapon lethality will continue to be counterbalanced by human ingenuity in how those weapons are used, yielding indeterminate—perhaps costly and indecisive—battlefield outcomes.

Genuinely effective congruence between weapons and force employment continues to be difficult to achieve. Dupuy believed the preconditions necessary for successful technological assimilation since the mid-19th century have been a combination of conducive military leadership; effective coordination of national economic, technological-scientific, and military resources; and the opportunity to evaluate and analyze battlefield experience.

Can the U.S. meet these preconditions? That certainly seemed to be the goal of the so-called Third Offset Strategy, articulated in 2014 by the Obama administration. It called for maintaining “U.S. military superiority over capable adversaries through the development of novel capabilities and concepts.” Although the Trump administration has stopped using the term, it has made “maximizing lethality” the cornerstone of the 2018 National Defense Strategy, with increased funding for the Defense Department’s modernization priorities in FY2019 (though perhaps not in FY2020).

Dupuy’s original work on weapon lethality in the 1960s coincided with development in the U.S. of what advocates of a “revolution in military affairs” (RMA) have termed the “First Offset Strategy,” which involved the potential use of nuclear weapons to balance Soviet superiority in manpower and materiel. RMA proponents pointed to the lopsided victory of the U.S. and its allies over Iraq in the 1991 Gulf War as proof of the success of a “Second Offset Strategy,” which exploited U.S. precision-guided munitions, stealth, and intelligence, surveillance, and reconnaissance systems developed to counter the Soviet Army in Germany in the 1980s. Dupuy was one of the few to attribute the decisiveness of the Gulf War both to airpower and to the superior effectiveness of U.S. combat forces.

Trevor Dupuy certainly was not an anti-technology Luddite. He recognized the importance of military technological advances and the need to invest in them. But he believed that the human element has always been more important on the battlefield. Most wars in history have been fought without a clear-cut technological advantage for one side; some have been bloody and pointless, while others have been decisive for reasons other than technology. While the future is certainly unknown and past performance is not a guarantor of future results, it would be a gamble to rely on technological superiority alone to provide the margin of success in future warfare.

Tips for Choosing the Right Tech Stack for Mobile App Development

Anyone familiar with the mobile app market will tell you that the competition is cut-throat. A business could have a high-quality offering along with a plethora of optimized features for its app, and yet the project could fail to make an impact unless backed by, among other things, a suitable technology stack and a novel idea.

Picking the right mobile technology stack is crucial to the success of a mobile application development project, since it not only galvanizes the project but also makes it easy to maintain and scale while ensuring it remains in sync with the business’ needs. In addition, the right technology stack helps cut down costs and considerably reduces mobile app development time. And that’s not all: a well-qualified technology stack for Java development can also influence the ease of future updates and releases for the company’s app.

Let's review some technology stacks for mobile app development that have proven to be quite helpful for Java developers.

1. Native App Development: iOS Tech Stack

To develop an iOS app, one can go with either Objective-C or Swift. Objective-C, a superset of the C programming language, offers object-oriented abilities along with a dynamic runtime environment. ...


Lean impact: How to apply lean startup principles to non-profits for social good

Modern technology developers often use terms such as minimum viable product and iterative development. Startup companies popularized these lean startup approaches to develop products while conserving...

Why Businesses Should Take Charge of Their Big Data Skills Needs

For the last few years, businesses all over the globe have been racing to incorporate big data systems and solutions into their processes and operations. That has increased demand for workers with the skills to operate, maintain, and expand those systems to levels not seen in other professions in decades, if ever. All the while, the global education system has been slow to adapt to the growing need, creating a severe shortage of qualified workers in the labor pool.

Part of the problem is that the field of big data encompasses so many skills that it's difficult for today's students to decide which subset offers them the best chance for a long and rewarding career, and which will be phased out as newer technology continues to emerge. For businesses, that reality means that it's not going to be possible to rely on the general labor pool to supply them with the talented individuals they will need to power their businesses into the future. Instead, they're going to have to take matters into their own hands, and do whatever it takes to secure the talent they need. Here's how they can do it.

Training to Suit

Any business that's making big data a centerpiece ...


Data Management Cultures

The big data revolution: data-management power relations in the “company-state-individual” trinity

Serious responsibility and unparalleled business potential. That is roughly how one could describe the opportunity promised by the handling, ownership, and analysis of (personal) data.

Big data scenarios

In recent months one could hear about the GDPR, its importance, and the obligations that come with it day and night. But is this really “only” about sensitive data possibly falling into the wrong hands? The possibilities of data collection and data use are in fact far more varied, and often friendlier than the horror stories suggest. What is more, if we dig a little deeper into the topic, we may well improve the efficiency of our own processes too. We looked into some points of interest and the data-management scenarios found in the world today.

A public 2007 European Union study[1] reported that, with the convergence of IT, telecommunications, media, and entertainment and the development of business intelligence, personal data would take on special significance by around 2015, and whoever held it could end up in a monopoly position. The authors outlined three data-management scenarios. They forecast surprisingly well: these data-analysis and data-storage practices, or at least ones very similar to them, can be seen in the world’s major economic regions:

  1. The company-centric scenario
  2. The state-centric scenario
  3. The individual-centric scenario

The three viewpoints offer different angles on, and opportunities for, why and how it is worth working with data.

The company-centric big data scenario – United capitalist data liberalism

The homeland of the most significant data-mining powers is the United States. It leads the IT sector and its companies are soaring, so it is no surprise that the American company-centric approach to data tempts many; everyone is trying to climb aboard this train.

American law often favors the largest holders of the world’s data wealth (for example Google, Facebook, Apple, Amazon, eBay, or Twitter): after a bit of consent, data processing becomes almost unrestricted. Indeed, one of Donald Trump’s first acts in office allowed American internet service providers to sell the customer data flowing through their networks. By way of a domestic analogy, this would be as if Telekom here offered our neighbor’s internet traffic to us for sale (a bit absurd, isn’t it?). Occasionally a “data scandal” erupts in America too, but their number and significance are negligible compared with the information held. Relying on the data they have acquired and purchased, the internet giants operate with staggering efficiency, which fills many with fear, while this decision-support system tempts others and makes them greedy. The existing system could really only be shaken up from within, for example with antitrust laws. The breakup of Google, Facebook, or Amazon has been demanded many times, as once happened to AT&T, the formerly monopolistic American telecommunications company.

Profit-oriented corporate decision-support systems

Having set aside the feelings and personal thoughts the method evokes, we can dig into the essence of the scenario. The principle rests on achieving the greatest possible gain in efficiency: at its center stand process optimization and support for lower-level decision points, and, with the personal dimension stripped away, it treats data as an objective money-making force. This method of data collection saves costs and increases revenue. At Dmlab we have many references connected to this kind of data processing: we have taken part in developing failure prediction, optimizing fuel consumption, and finding offer-making procedures that are more favorable to customers yet still acceptable to the company.

The state-centric big data dictatorship – The IT foundations of the Chinese phalanstery

Good old China is the best example here. The world’s oldest continuous civilization has superbly exploited the advantages offered by data mining as well. We are not necessarily thinking of the secret-service and intelligence line (in contrast with the American NSA scandals), but much more of the use of everyday information. China currently operates a point system that rates citizens based on activity that can be translated into data (for example education, workplace, financial transactions, or social relationships). All of this is topped off with a vast state internet censorship apparatus that employs nearly two million people. It is a frightening yet efficient and effective realization of data ownership, also known as a Byzantine information-management system.

Just one example among many: China is building out its public-space camera network at breakneck speed, and numerous public institutions and schools have already been fitted with cameras on an experimental basis. In the schools, students’ abilities, learning habits, and discipline are monitored. The data gained this way will presumably feed into the indicators of the state decision-support point system, and perhaps in the near future this will decide whether a citizen has “collected” enough points to be creditworthy, to apply for a passport, or even to compete successfully for a civil-service position.

Centralized database management

Oversight and control. These are the two, at times frightening, keywords of this state-centric approach. Nowhere else in the world is such a rigid and strict data-management scenario to be found, although one or two of its aspects turn up at certain companies. These are mainly firms where the role of data is less about increasing efficiency and much more about supervising and controlling employees and processes. The spread of this mindset has been given a strong push by the various deep learning solutions built for processing video, image, and audio material.

At Dmlab we have also encountered projects where we drew on the principles of this scenario: together with the National Tax and Customs Administration (NAV) we worked on creating a more effective method against VAT fraud, and we could mention any of our proposed solutions against fraud in general.

Individual-centric big data – Questionable European trust and far-reaching individualism in data management

There is no state (yet) onto which this scenario can be unambiguously mapped, but with the new European data protection rules (GDPR) coming into force, Europe has undoubtedly become the center of this mindset, that is, of the right to self-determination over personal data. The right to the protection of privacy fits continental (primarily Swedish) practice. Good examples are the spirit of the Linux operating system, the ideology behind PirateBay, or the soon-to-take-effect directive under which banks must, at the customer’s request, hand over financial data to third parties. Let us add, though, that the companies living off monopolized data do not come from the continent, so a European company can become a champion of data self-determination with an easy stroke of the pen.

Business intelligence solutions for privacy

European thinking gives an important role to the protection of privacy and various civil liberties, and to strict data-protection rules. According to this worldview it is privacy, that is, private life and information as private property, that should receive the greatest emphasis, rather than profit and control. The European Union is one of the most enthusiastic representatives of this view, but the continent commands far less force than the momentum with which the USA or China has thrown itself into exploiting the field. The reason for this handicap is mostly that the engine of these ideas is the EU’s institutional system, which is not (yet) matched by user awareness as a driving force. Alongside this piece we also recommend another (in our view good) article on the dilemmas of European data protection.

Looking around among our business processes, this method may seem a bit of an odd one out, but surprising as it may be, data can also simply be used for “wellbeing”, without any profit motive, if, say, we want things to be better for our customers or our colleagues. Examples include a personalized leave-recommendation management information system based on the working log, an internal HR-analytics solution, or returning users’ data to the customers themselves.

Nothing is black or white, and sometimes even the gray is hard to see

The world’s societies cannot be unambiguously sorted into these three scenarios: in any given country we usually find a blend of the three big data scenarios, and this changes quickly as technological trends develop; the possibilities for using data grow ever more varied, so it is worth keeping an eye on the trends.

The protection of personal and business data, set against freedom of information, is a duality and a force that many use (and abuse). Moving beyond caution, one should see the opportunity in this speeding train, and preferably climb aboard as soon as possible.

 

[1] WORLD ECONOMIC FORUM (2007): Digital Ecosystem - Convergence between IT, Telecoms, Media and Entertainment: Scenarios to 2015. World Scenario Series, 2007

 


Schema-on-what? How to model JSON

How do you make sense out of schema-on-read? This post shows you how to turn a JSON document into a relational data model that any modeler or relational database person could understand.
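
Only the summary survives here, but the core move it describes can be sketched in a few lines: scalar fields become columns of a parent table, and each nested array becomes a child table carrying the parent's key. The sample document and table names below are invented for illustration:

    # Flatten a JSON document into relational rows: scalars go into a
    # parent "orders" table; each element of the nested array becomes a
    # row of a child "order_items" table keyed back to the parent.
    import json

    doc = json.loads("""
    {
      "order_id": 17,
      "customer": "Acme Corp",
      "items": [
        {"sku": "A-100", "qty": 2},
        {"sku": "B-205", "qty": 1}
      ]
    }
    """)

    orders_row = {k: v for k, v in doc.items()
                  if not isinstance(v, (list, dict))}
    order_items_rows = [{"order_id": doc["order_id"], **item}
                        for item in doc["items"]]

    print("orders:", orders_row)
    print("order_items:", order_items_rows)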
Mental Framework For A Data Driven Digital Transformation

Over the last years, my small business has undergone a digital transformation from a marketing service company to a data literacy consultancy. What does a data literacy consultancy do? We teach...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Use Data to Monitor Your Competition

Everybody in this day and age is well-familiar with the power of big data and contemporary machine learning operations, but few business owners and working professionals seem to have a solid grasp on how to use data to monitor the competition. This is unfortunate, as recent digital innovations have made keeping an eye on your competition easier and cheaper than ever before.

Here’s how you can harness the power of data to monitor your competition, and what your company will need to do to guarantee that it maintains a competitive advantage in the market.

Start on social media

There are many ways to use data to monitor your competition, but by far the easiest is to tap into popular social media platforms, which possess reams of lucrative data just waiting to be exploited by savvy companies. You should seriously be asking yourself what kind of social media strategy your competitors are using from the get-go; first and foremost, which platforms are they on, and which do they dedicate the most time and resources towards? Furthermore, you should be asking yourself how the competition is garnering user engagement, which is just as important as garnering more overall views.

Take a deep dive into how you ...


Taking a ‘big data’ view of regulatory information management

For pharmaceutical and life sciences firms, a case-by-case approach to information and content management has been prevalent for too long. It is inefficient and results in different parts of a...

How to Turn Your Data Into Gold With Machine Learning

We have never actually liked the term ‘Big Data’. The term implies that you should have large amounts of data to get anything valuable from it or that big is the only aspect of what’s distinctive...

Next Gen Digital Platforms Drive The Long Awaited Business Agility

The ‘platform’ metaphor has long been used in a variety of ways. In the context of platform economy, 21st century usage of the word “platform” sometimes refers solely to online...

Graph Databases are the New Paradigm in the Ever-Expanding Data Universe

The world of databases is always evolving to fit business and technical needs. This world was once ruled by SQL Relational Database Management Systems (RDBMS). But that has been changing, and quite...

How Banks Can Thrive By Moving to Cloud

By the end of 2019, worldwide spending on digital transformation will reach US$1.7 trillion. That’s a 42% increase from 2017, according to IDC Futurescape: Worldwide Digital Transformation 2018...

From big data to AI: Where are we now, and what is the road forward?

In 2016, the AI hype was just beginning, and many people were still cautious when mentioning the term “AI”. After all, many of us have been indoctrinated for years to avoid this term, as...

Operational Intelligence: The Future of Urban Mobility

Cities today are facing many of the same problems they were facing 100 years ago. We live in a changing world but lingering remnants of history still affect our lives, despite huge leaps in our...

Without data management, forget AI and machine learning in health care

Data management gets lost in the enthusiasm around artificial intelligence (AI) and machine learning (ML). Not surprising, when it’s an algorithm that decides what search results to show you,...

10 features to look for in visualization tools for big data

Visualization tools for big data show promise in unlocking the value of data collected by enterprises. Getting the best results requires building out an infrastructure for aggregating data from...

How can companies connect disparate data silos?

With automation initiatives well on their way, some major mining companies have embarked on the next step in the transformation of mining. “The intelligent mine” is a vision for an operation that can...

Quantum Computing to Protect Data: Will You Wait and See or Be an Early Adopter?

Time to dispel a myth: quantum computing is still just a theory. It’s not. If you don’t believe us, read here. And because it’s past the theoretical stage, commercialization is not far away,...

The Great 3-1 Rule Debate

[This piece was originally posted on 13 July 2016.]

Trevor Dupuy’s article cited in my previous post, “Combat Data and the 3:1 Rule,” was the final salvo in a roaring, multi-year debate between two highly regarded members of the U.S. strategic and security studies academic communities, political scientist John Mearsheimer and military analyst/polymath Joshua Epstein. Carried out primarily in the pages of the academic journal International Security, Epstein and Mearsheimer argued over the validity of the 3-1 rule and other analytical models with respect to the NATO/Warsaw Pact military balance in Europe in the 1980s. Epstein cited Dupuy’s empirical research in support of his criticism of Mearsheimer’s reliance on the 3-1 rule. In turn, Mearsheimer questioned Dupuy’s data and conclusions to refute Epstein. Dupuy’s article defended his research and pointed out the errors in Mearsheimer’s assertions. With the publication of Dupuy’s rebuttal, the International Security editors called a time out on the debate thread.

The Epstein/Mearsheimer debate was itself part of a larger political debate over U.S. policy toward the Soviet Union during the administration of Ronald Reagan. This interdisciplinary argument, which has since become legendary in security and strategic studies circles, drew in some of the biggest names in these fields, including Eliot Cohen, Barry Posen, the late Samuel Huntington, and Stephen Biddle. As Jeffery Friedman observed,

These debates played a prominent role in the “renaissance of security studies” because they brought together scholars with different theoretical, methodological, and professional backgrounds to push forward a cohesive line of research that had clear implications for the conduct of contemporary defense policy. Just as importantly, the debate forced scholars to engage broader, fundamental issues. Is “military power” something that can be studied using static measures like force ratios, or does it require a more dynamic analysis? How should analysts evaluate the role of doctrine, or politics, or military strategy in determining the appropriate “balance”? What role should formal modeling play in formulating defense policy? What is the place for empirical analysis, and what are the strengths and limitations of existing data?[1]

It is well worth the time to revisit the contributions to the 1980s debate. I have included a bibliography below that is not exhaustive, but is a place to start. The collapse of the Soviet Union and the end of the Cold War diminished the intensity of the debates, which simmered through the 1990s and then were obscured during the counterterrorism/counterinsurgency conflicts of the post-9/11 era. It is possible that the challenges posed by China and Russia amidst the ongoing “hybrid” conflict in Syria and Iraq may revive interest in interrogating the bases of military analyses in the U.S. and the West. It is a discussion that is long overdue and potentially quite illuminating.

NOTES

[1] Jeffery A. Friedman, “Manpower and Counterinsurgency: Empirical Foundations for Theory and Doctrine,” Security Studies 20 (2011)

BIBLIOGRAPHY

(Note: Some of these are behind paywalls, but some are available in PDF format. Mearsheimer has made many of his publications freely available here.)

John J. Mearsheimer, “Why the Soviets Can’t Win Quickly in Central Europe,” International Security 7, no. 1 (Summer 1982)

Samuel P. Huntington, “Conventional Deterrence and Conventional Retaliation in Europe,” International Security 8, no. 3 (Winter 1983/84)

Joshua M. Epstein, Strategy and Force Planning (Washington, DC: Brookings, 1987)

Joshua M. Epstein, “Dynamic Analysis and the Conventional Balance in Europe,” International Security 12, no. 4 (Spring 1988)

John J. Mearsheimer, “Numbers, Strategy, and the European Balance,” International Security 12, no. 4 (Spring 1988)

Stephen Biddle, “The European Conventional Balance,” Survival 30, no. 2 (March/April 1988)

Eliot A. Cohen, “Toward Better Net Assessment: Rethinking the European Conventional Balance,” International Security 13, no. 1 (Summer 1988)

Joshua M. Epstein, “The 3:1 Rule, the Adaptive Dynamic Model, and the Future of Security Studies,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, “Assessing the Conventional Balance,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, Barry R. Posen, and Eliot A. Cohen, “Correspondence: Reassessing Net Assessment,” International Security 13, no. 4 (Spring 1989)

Trevor N. Dupuy, “Combat Data and the 3:1 Rule,” International Security 14, no. 1 (Summer 1989)

Stephen Biddle et al., Defense at Low Force Levels (Alexandria, VA: Institute for Defense Analyses, 1991)

Ford Eyes Using Personal Data to Boost Profits

Ford Motor Company is known for making cars and trucks; but the future for the iconic automaker might look a little more like Facebook than an assembly line. As it struggles with hemorrhaging...

5 Ways to Embrace Data to Engage Customers

By nature, we are a consumer-driven society. We are fast-paced, with fast apps, and ultimately we make fast friends. The question is, though, how do we transition those friendships into relationships that stick? How do we bridge the gap from a prospective buyer, consumer, and user to a customer that keeps coming back, when there are plenty of other fish in the sea, so to speak?

The baseline of any relationship is communication, and how you choose to actively engage in that conversation. Most importantly, when the goal is flipping a passerby into a patron, your website and marketing should be your focus. Essentially, you want to date the consumer. But how do you get there?

The answer is that you first have to “date the data”.

Just as dating has evolved from a mere 15 years ago, so has the process of analysing data. We can now form a real bond and build trust by using resources that simply help you know your customer better. You are now able to grow those roots by identifying customers’ preferences and ultimately getting a foothold on repeat traffic.

Here are five ways you can date your data, and enhance your customer engagement.

1. Show You Care

Don’t ...


How Will IBM’s Acquisition Affect Red Hat?

ZDNet reported that IBM's acquisition of Red Hat is the biggest business deal to hit the open source market in history. For those of us who have grown up with Red Hat's distributions alongside releases like Ubuntu and Mandrake, seeing them acquired by a powerhouse like IBM bodes well for the company. Red Hat is a flagship in the world of Linux, and it's unlikely that IBM will change the name, since doing so risks losing the brand loyalty of hundreds of thousands of users. However, this acquisition could have much more far-reaching consequences for the tech world at large and for Red Hat directly.

OpenShift May Be Coming into Its Own

One of the notable things that IBM has been angling for is the presentation of OpenShift, Red Hat's comprehensively developed Kubernetes platform. In recent years, Red Hat's marketing of Kubernetes has put the company on the radar of a lot of major firms, and quite a lot of these companies have adopted the open source solution, among them IBM’s direct cloud competitors. However, OpenShift was not nearly as well marketed as Kubernetes, despite being just as well-designed and implemented as ...


CIO Strategy: Kimberly-Clark’s Digital Transformation

Consumer products icon Kimberly-Clark’s CIO launched a digital transformation at the highest strategic levels. Here’s her game plan. Google, Amazon and other big technology companies may...

Blockchain smart contracts can finally have a real world impact

You have probably heard that blockchain technology and “smart contracts” are going to revolutionize our lives. But there’s a problem: before smart contracts can do anything really useful,...

Cloud Security: How to Secure Your Sensitive Data in the Cloud

In today’s always-connected world, an increasing number of organisations are moving their data to the cloud for operational efficiency, cost management, agility, scalability, etc. As more data is...

What is adversarial artificial intelligence and why does it matter?

Artificial intelligence (AI) is quickly becoming a critical component in how government, business and citizens defend themselves against cyber attacks. Starting with technology designed to automate...

How smart city investment can unlock economic growth

Smart city investments can trigger “a robust cycle” of economic growth by unlocking savings and attracting businesses, residents, and talent, according to a major new benchmarking report of 136...

Why Blockchain is Quickly Becoming the Gold Standard for Supply Chains

Global supply chains are complex processes. Different companies, with distinctive objectives, work together to achieve a common goal: to bring something from A to B. For a supply chain to work, partners have to trust each other. To that end, there are multiple checks and balances, extensive documents and different checkpoints, all interacting in a web of bureaucratic processes. Knowing the amount of paperwork required to send a product from farm to plate, it is remarkable that we have managed to develop global supply chains.

However, the processes in place are time-consuming, expensive and they don’t always prevent growing problems such as counterfeit products, fragmentation and falsification of data, lack of transparency, extensive settlement times and incorrect storage conditions.

The availability of counterfeit products in particular is an extensive problem. Research showed that 20 out of 47 items audited from renowned retailers such as Amazon or eBay turned out to be counterfeit. This results in annual damages of $1 trillion in missed income for retailers and manufacturers. To make matters worse, according to Paige Cox, VP and Head of Digital Supply Chain Networks at SAP, fake pharma kills over 1 million people every year, and counterfeit manufacturers make astronomical profits. It is time for change.

Collaboration ...


Future transport: From MaaS to the Internet of Mobility

Throughout the 20th century, the American dream was commonly depicted as moving out to the suburbs with a white picket fence and a car (usually a Ford) in the garage. Over the decades, this dream...

Reimagining Smart Cities With Connected Consumer Products

We are fast moving towards a future where cities will feature hundreds and thousands of smart connected objects, talking to each other, exchanging and producing meaningful data and insights,...

Diagnosing Cancer with Data in 72 Hours


 A csapat tagjai: Nádai Bence Szalóki Kristóf, Adriana Custode, Vuchetich Bálint, és Rabatin Gábor 

Októberben került megrendezésre első alkalommal a JunctionXBudapest, a híres finn hackhaton, a Junction előversenyeként. Már nyáron eldöntöttük a Dmlabnál, hogy próbára tesszük magunkat és elindulunk a versenyen. A háromnapos megmérettetés alatt új emberekkel, cégekkel ismerkedtünk meg és rengeteget tanultunk ezen rövid idő alatt is. 

Felvetődhet a kérdés, hogy mi is az a Junction X Budapest? A Junction eredetileg egy Finnországban megrendezett hackhaton, amelyre az évek során annyira megnőtt az érdeklődés, hogy úgy döntöttek, egy új sorozatot indítanak JunctionX néven a világ különböző pontjain. Budapestre a TechEmbassy csapata hozta el nekünk a versenyt, akik már első alkalommal is nagyon színvonalas eseményt szerveztek.

 A hackhaton lényege, hogy rövid idő kell elkészíteni, felvázolni vagy megvalósítani egy olyan ötletet, amivel megnyerheted a versenyt. Ezzel mi sem voltunk másképp, ám ötletünk megvalósításáig hosszadalmas út vezetett. Nekünk péntek 19 órától vasárnap 13 óráig volt lehetőségünk megvalósítani az innovatív és olykor lehetetlennek tűnő elképzeléseinket.

 Először választanunk kellett egyet a négy challange közül, melyen versenyezni kívántunk. A péntek esténk azzal telt, hogy eldöntsük, hogy a Nokia vagy a Varian feladatát válasszuk. Végül az utóbbi mellett döntöttünk, amelyet egy cseppet sem bántunk meg. A Varian feladata agyi tumorok detektációja volt MRI és CT képeken. A feladat tehát képfeldolgozás volt, ahol nem szabták meg, hogy milyen technológiát használhatunk, a cél az volt, hogy találjuk meg nekik a tumorokat a valós felvételeken.

 A feladat jelentőségét mutatta, hogy elmondásuk szerint a saját mérnökeik számára is rendkívül nagy nehézséget jelent a tumorok automatikus felismerése, így egy jó megoldás valóban megkönyítené a vállalat munkáját. Péntek este jött egy, akkor még lehetetlennek tűnő ötlet, mely az idő előrehaladtával egyre megvalósíthatóbbnak tűnt.

 Ezen ötletünk alapján az MRI felvételekből készítettünk egy 3D-s agyat, benne a tumorral, amit egy VR környezetben jelenítettünk meg. Az ötletünket hallva a Varion munkatársai nem gondolták, hogy képesek leszünk elkészíteni ilyen rövid idő alatt tervünket. Szombat délelőttől egészen vasárnap 13 óráig megállás nélkül a megoldáson dolgozva sikerült elkészítenünk azt, amiben oly kevesen hittek az ottlévők közül - köztük néha mi magunk is. Vasárnap délutánra elkészült a 3D-s vizualizációnk.

As a first step in finding the tumors, we had to process the DICOM files we had received. This file format is used in healthcare to store images produced by various medical devices. After loading the images, we applied various transformations (grayscaling) and filters (anisotropic filtering, erosion, dilation) to remove the unnecessary parts and emphasize the important ones. We could then search the preprocessed images for tumors. Our first idea was a neural network, but we dropped it because of the lack of uniformity in the data and the small number of samples. In the end we solved the task with the OpenCV library, searching the processed images for tumor-like shapes with the right attributes; a sketch of this idea follows below.
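The team's actual code is not included in the post; what follows is a minimal sketch of the preprocessing and shape-search idea described above, assuming the pydicom and opencv-python packages. The file name, thresholds and size limits are illustrative placeholders, not the team's actual values.

# A minimal sketch of the preprocessing + shape-search idea described above.
import cv2
import numpy as np
import pydicom

ds = pydicom.dcmread("slice_001.dcm")          # one MRI slice in DICOM format
img = ds.pixel_array.astype(np.float32)

# Normalize to 8-bit grayscale.
img = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Smooth while keeping edges (a stand-in for the anisotropic filter),
# then clean up with erosion and dilation.
img = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
kernel = np.ones((3, 3), np.uint8)
img = cv2.erode(img, kernel, iterations=1)
img = cv2.dilate(img, kernel, iterations=1)

# Threshold and look for bright, roughly round blobs of plausible size.
_, mask = cv2.threshold(img, 200, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    area = cv2.contourArea(c)
    if 50 < area < 5000:                       # plausible tumor size (made up)
        (x, y), r = cv2.minEnclosingCircle(c)
        circularity = area / (np.pi * r * r)   # 1.0 would be a perfect circle
        if circularity > 0.6:
            print(f"candidate at ({x:.0f}, {y:.0f}), area={area:.0f}")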

Tumor detection was carried out by performing the following steps:

[Figure: the tumor-detection pipeline]

Once the tumor was found, we moved on to the 3D visualization. It was built with Unity, where we used raymarching to construct a three-dimensional model of the examined brain from the MRI images. We placed the tumor found during detection accurately within the model: we knew the tumor's location but not its exact shape, so during the competition we represented it with an ellipsoid. The brain could not only be viewed but also examined by adjusting various attributes; the placement of cutting planes along the different axes, the intensity and the threshold could all be customized individually. We built a virtual reality application for this and presented the project to the other teams on a Samsung Gear VR headset.
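The post does not include the Unity/raymarching code, but the underlying volume idea can be sketched in a few lines of Python, assuming pydicom and a stack of equally spaced slices; all file names and coordinates below are illustrative.

# Sketch: stack 2D DICOM slices into a 3D volume and mark the tumor
# (position known, shape not) with an ellipsoid mask, as the post describes.
import numpy as np
import pydicom

slice_files = [f"slice_{i:03d}.dcm" for i in range(1, 101)]   # placeholder names
volume = np.stack([pydicom.dcmread(f).pixel_array for f in slice_files])

cz, cy, cx = 50, 120, 130          # detected tumor center (voxel coordinates)
rz, ry, rx = 8, 12, 10             # assumed semi-axes of the ellipsoid
z, y, x = np.ogrid[:volume.shape[0], :volume.shape[1], :volume.shape[2]]
tumor_mask = ((z - cz) / rz) ** 2 + ((y - cy) / ry) ** 2 + ((x - cx) / rx) ** 2 <= 1.0

print("voxels marked as tumor:", int(tumor_mask.sum()))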

Our solution was well received by everyone. We tried to be innovative and bold; even so, we unfortunately did not win first place in our category, although as the best Hungarian team in the Community Challenge we finished 8th out of 44 participants.

The hackathon was a great experience for us, and it showed that we are capable of building anything we can think up.


What 5G will bring to the IoT device, network and application

What 5G will bring to the IoT device, network and application

As global organisations increasingly adopt IoT solutions, Peter Van Den Houten at Kore discusses the major innovations set to transform connectivity and networking infrastructure It is perhaps an...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Why Do Law Firms Hire Virtual Paralegals?

Why Do Law Firms Hire Virtual Paralegals?

Every growing business requires additional staff to streamline specific tasks and maintain efficiency internally. However, hiring an in-house team adds to the company's overhead, even when the business needs the help only for a limited period.

But today, in the technological world, virtual assistants are becoming paramount in numerous industries, and the legal sector is no exception.

Many law firms are wary of hiring assistants for their paralegal work, as they would likely have to share crucial legal information.

There is no need to worry, though. Let's look at what a virtual paralegal is, and then move on to the immense benefits:

What is a virtual paralegal, and what are the benefits of hiring one?

A virtual assistant is an employee who works remotely, that is, away from your office premises. A virtual assistant is always equipped with the latest technology and uses various modes of communication to stay connected with clients. Virtual paralegals, in turn, are senior-level professionals experienced in the paralegal process, serving law firms and attorneys to reduce their overall burden.

These paralegal services are a boon for solo practitioners and small law firms, as they can streamline administrative tasks and attract more clients to the firm.

Benefits of virtual paralegals

Cost-saving

The number one advantage ...


Read More on Datafloq
Artificial Intelligence Is Revolutionizing Earthquake Prediction

Artificial Intelligence Is Revolutionizing Earthquake Prediction

Predicting earthquakes used to be the stuff of science fiction, but not anymore! Artificial intelligence is revolutionizing this brand-new area of research, and the predictions are going to get even more accurate over time. Here's what you need to know about AI and earthquake prediction:

How Accurate Are The Predictions Now?

They are not very accurate yet, but scientists can make educated guesses about which parts of the world are most prone to earthquakes. Scientists can estimate the maximum intensity of earthquake that a given region has the potential to experience, and AI can estimate the probability that an earthquake will strike a region over the next 50 years. Check out this link to learn more about how earthquakes are predicted.

Can Artificial Intelligence Predict Aftershocks?

Aftershocks are smaller earthquakes that occur after the main event, and AI is exceptionally effective at predicting them. This can greatly improve the repair process in earthquake-affected regions, which is important since aftershocks are sometimes nearly as severe as the main event.

How Does It Work?

The technology is able to predict earthquakes by modeling the movements of plates, and it's able to predict movements that occur over very long periods of time. ...


Read More on Datafloq
Why the EU Is More Likely to Drive IT and Security Trends Than the US

Why the EU Is More Likely to Drive IT and Security Trends Than the US

The General Data Protection Regulation (GDPR) has been a game changer for data privacy, and U.S. companies are beginning to catch up to the EU in data management practices. However, privacy is only...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Smart Contracts: The Business Process Enablers for Blockchain

Smart Contracts: The Business Process Enablers for Blockchain

This article is the first in a series of articles exploring ‘Smart Contracts’ and how enterprises can benefit from their use. Here, Antonio delves into the basics of smart contracts for the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Force Ratios in Conventional Combat

Force Ratios in Conventional Combat

American soldiers of the 117th Infantry Regiment, Tennessee National Guard, part of the 30th Infantry Division, move past a destroyed American M5A1 “Stuart” tank on their march to recapture the town of St. Vith during the Battle of the Bulge, January 1945. [Wikipedia]

[This piece was originally posted on 16 May 2017.]

This post is a partial response to questions from one of our readers (Stilzkin) on the subject of force ratios in conventional combat. I know of no detailed discussion of the phenomenon published to date. It was clearly addressed by Clausewitz. For example:

At Leuthen Frederick the Great, with about 30,000 men, defeated 80,000 Austrians; at Rossbach he defeated 50,000 allies with 25,000 men. These however are the only examples of victories over an opponent two or even nearly three times as strong. Charles XII at the battle of Narva is not in the same category. The Russians at that time could hardly be considered Europeans; moreover, we know too little about the main features of that battle. Bonaparte commanded 120,000 men at Dresden against 220,000—not quite half. At Kolin, Frederick the Great’s 30,000 men could not defeat 50,000 Austrians; similarly, victory eluded Bonaparte at the desperate battle of Leipzig, though with his 160,000 men against 280,000, his opponent was far from being twice as strong.

These examples may show that in modern Europe even the most talented general will find it very difficult to defeat an opponent twice his strength. When we observe that the skill of the greatest commanders may be counterbalanced by a two-to-one ratio in the fighting forces, we cannot doubt that superiority in numbers (it does not have to be more than double) will suffice to assure victory, however adverse the other circumstances.

and:

If we thus strip the engagement of all the variables arising from its purpose and circumstance, and disregard the fighting value of the troops involved (which is a given quantity), we are left with the bare concept of the engagement, a shapeless battle in which the only distinguishing factor is the number of troops on either side.

These numbers, therefore, will determine victory. It is, of course, evident from the mass of abstractions I have made to reach this point that superiority of numbers in a given engagement is only one of the factors that determines victory. Superior numbers, far from contributing everything, or even a substantial part, to victory, may actually be contributing very little, depending on the circumstances.

But superiority varies in degree. It can be two to one, or three or four to one, and so on; it can obviously reach the point where it is overwhelming.

In this sense superiority of numbers admittedly is the most important factor in the outcome of an engagement, as long as it is great enough to counterbalance all other contributing circumstances. It thus follows that as many troops as possible should be brought into the engagement at the decisive point.

And, in relation to making a combat model:

Numerical superiority was a material factor. It was chosen from all elements that make up victory because, by using combinations of time and space, it could be fitted into a mathematical system of laws. It was thought that all other factors could be ignored if they were assumed to be equal on both sides and thus cancelled one another out. That might have been acceptable as a temporary device for the study of the characteristics of this single factor; but to make the device permanent, to accept superiority of numbers as the one and only rule, and to reduce the whole secret of the art of war to a formula of numerical superiority at a certain time and a certain place was an oversimplification that would not have stood up for a moment against the realities of life.

Force ratios were discussed in various versions of FM 105-5 Maneuver Control, but as far as I can tell, this material was not analytically developed. It was a set of rules, pulled together by a group of anonymous writers for the sake of being able to adjudicate wargames.

The only detailed quantification of force ratios was provided in Numbers, Predictions and War by Trevor Dupuy. Again, these were modeling constructs, not something that was analytically developed (although there was significant background research done and the model was validated multiple times). He then discusses the subject in his book Understanding War, which I consider the most significant book of the 90+ that he wrote or co-authored.

The only analytically based discussion of force ratios that I am aware of (or at least can think of at this moment) is my discussion in my upcoming book War by Numbers: Understanding Conventional Combat. It is the second chapter of the book: http://www.dupuyinstitute.org/blog/2016/02/17/war-by-numbers-iii/

In this book, I assembled the force ratios required to win a battle based upon a large number of cases from World War II division-level combat. For example (page 18 of the manuscript):

I did this for the ETO, for the battles of Kharkov and Kursk (Eastern Front 1943, divided by when the Germans were attacking and when the Soviets were attacking) and for the PTO (Manila and Okinawa 1945).

There is more than can be done on this, and we do have the data assembled to do this, but as always, I have not gotten around to it. This is why I am already considering a War by Numbers II, as I am already thinking about all the subjects I did not cover in sufficient depth in my first book.

How open source is fuelling an explosion in fintech innovation

How open source is fuelling an explosion in fintech innovation

The open nature of fintech is a clear break with traditional financial services. Where incumbent businesses have complex legacy IT systems – often developed over decades – the new fintech companies...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Internet of Moving Things: Where Big Data meets mobility

The Internet of Moving Things: Where Big Data meets mobility

Robin Chase doesn’t buy into the hype about mobile — at least when mobile is associated with portable gadgets like smartphones. But the rising stock of technological tools related to smart...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Empathy in the business world

Empathy in the business world

 

Lead with your EARS, not with your MOUTH

As you have probably already heard, our team visited Crunch, one of the region's most important data conferences, which was held this year in parallel with two other prestigious conferences: Amuse and Impact, the latter featuring one interesting talk after another on Product Owner and Product Management topics. The talk that gripped me most and got me thinking was Paul Ortchanian's, which was essentially about how, as a project manager, you should treat the people you work with: pay attention to them and take the time to listen to them. The opening sentence of the talk was the following:

              Lead with your EARS, not with your MOUTH.

In any project that many people work on, it is inevitable that viewpoints clash and people fail to find common ground. As a project manager it is very important to know how to build consensus. Not everyone's will can prevail, as that would lead to chaos, but everyone's opinion must be heard, and a solution must be found that everyone can agree on.

 

              You are not going to win every battle.

Empathy is the key to all of this; it makes communication easier and helps the team move toward the goal as one. We must try to see the world from the other person's perspective and understand their point of view as well. We can achieve this by involving others in the search for a solution, because the future is something we have to decide on together.

              The future needs to be agreed on as a group.

People were not born with two ears and one mouth for nothing, the speaker said. A golden rule for a project manager might be to listen and ask twice as much as you speak. The more information we have and the more opinions we hear, the better we can reach a decision that suits everyone and that everyone will accept. The speaker gave a very simple, easily understood example of what he meant: “Suppose an overweight man goes to the doctor because there is a problem with his blood pressure: it is very high. The doctor knows from the start how the problem could be solved: the patient should lose weight and the high blood pressure would disappear. But if the doctor said only this, we could be almost certain that the patient would not start dieting. Instead, the doctor first examines the patient: listens to his heart with a stethoscope, asks about his job, his hobbies and his stress levels, in other words builds a personal rapport with him. Then he sends the patient for blood tests and other examinations, and sits down with him to go through the results. He explains that the blood work shows high cholesterol on top of the high blood pressure, grounding his conclusion in facts, and only then says that, based on what he has seen, the patient should lose weight.”

In the projects we work on with our team, the main goal is always that our clients and partners end up with what they can best use for their business. Since in the world of data this tends to be rather hazy at the start of a project, we build the advice above into all of our communication with clients.


How Big Data Analytics Will Make a Huge Difference in Small Business Social Media Marketing

How Big Data Analytics Will Make a Huge Difference in Small Business Social Media Marketing

When most small business owners think of Big Data analytics, the first image that often pops into their minds is dollar signs. This type of technology is fantastic and innovative, but it is also highly technical and can be expensive. Plus, it requires specialized knowledge and training, which means more employees to hire, more overhead, and more problems.

However, as Big Data technology continues to advance, it has actually become more practical and accessible for small businesses. More than half of companies today have adopted Big Data systems into their current strategies - a number that is rising as more and more businesses find sensible ways to incorporate this technology.

One area of marketing where small businesses can expect the most impact from data is in social media. Ad spending on social has only increased year over year, and 64% of companies are planning to spend a decent chunk of their time and marketing budget on this platform in the upcoming year. Big Data can make this type of digital advertising much more effective, and small business owners are becoming more comfortable with integrating it into their campaigns.

So, what ways can small businesses expect to see the biggest impact from Big Data analytics in their ...


Read More on Datafloq
How Big Data Analytics Is The Lure For Business Today?

How Big Data Analytics Is The Lure For Business Today?

In today’s digital world, businesses generate large amounts of data or ‘Big Data’, which is a potential goldmine if used correctly. Whether you are a multi-million dollar organization or a small...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How The Construction Industry Is Leveraging Big Data

How The Construction Industry Is Leveraging Big Data

Big data is making its way into virtually every industry – and that includes construction. From safety to productivity and streamlining business processes, the construction industry is leveraging big...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
U.S. versus The World (GDP)

U.S. versus The World (GDP)

There have been a lot of statements lately coming out of Russia (Putin in particular), China and other places about how the U.S. is in decline. Not sure what the basis of these statements is. Right now the United States GDP is $19.391 Trillion. The entire world’s GDP, according to the World Bank, is $80.684 Trillion. This means that the U.S.’s GDP makes up only 24% of the world’s GDP. This hardly puts us out in the dumpster.

Now, it has changed over time. In 1960 the U.S. GDP was $543.3 billion while the World’s was $1.366 Trillion. This is 40%. So we have declined from having 40% of all the goods and services of the world to only having 24% of all the goods and services in the world. I would argue that in fact this is not a decline, but a growth on the part of the rest of the world, and in many cases a well-deserved growth. In fact, it was the intention of the Marshall Plan to restore the European economy for the purpose of making stable, democratic, economically viable countries. This is a plan that succeeded in spades. The data from 1960 is only 15 years after the devastating World War II, so it is really no surprise that the U.S. economy, living in “splendid isolationism” or at least conveniently isolated by two very large bodies of water, was in much better shape than those people who had panzers running back and forth across their front lawns.
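The two shares quoted above are easy to verify; a quick sketch, using only the World Bank figures cited in this post (in billions of USD):

# Quick check of the GDP shares quoted above (billions of USD).
us_1960, world_1960 = 543.3, 1366.0
us_2017, world_2017 = 19391.0, 80684.0

print(round(100 * us_1960 / world_1960))  # 40 (percent)
print(round(100 * us_2017 / world_2017))  # 24 (percent)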

As of 1980 the United States GDP was 26% of the world GDP. Much of this change was a result of the growth of the western European and Japanese economies. It was before the later boom of the Chinese economy. But this is close to the same percentage as it is now. Just to compare over time:

Year        Percent GDP
1960        40%
1970        39%
1980        26%
1990        26%
2000        31%
2010        23%
2017        24%

 

There is definitely a trend here, but it is a trend caused by the rest of the world growing, not by the United States declining. You can definitely see the world economy growing significantly after 2000. But the most significant relative shift occurred between 1970, when the U.S. economy made up 39% of the world’s economy, and 1980, when it was down to 26%. It is now 24%, so really not a hugely significant shift over the last 40 years. It is not clear how one concludes from this that the United States is now in decline.

In the 57 years covered by this data, the U.S. GDP actually declined in only one year (2009). That is a pretty good run.

Big Data is a Powerful Asset for Business Success

Big Data is a Powerful Asset for Business Success

Various business trends today, such as the use of artificial intelligence and multimedia visual marketing, are connected to the concept of Big Data. Every action internet users take generates a data...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How To Hire The Right Data Scientist For Your Business

How To Hire The Right Data Scientist For Your Business

Data Scientists are hot. It’s no secret just how in demand they are, with Indeed recently reporting that job posts for Data Scientists increased by 75% between 2015 and 2018. The 2017 LinkedIn US Emerging Jobs Report also noted that there are over 1,800 open Machine Learning Engineering positions on the networking site, with Machine Learning Engineer, Data Scientist and Big Data Engineer all ranking at the top of the emerging jobs list.

With such high demand for top-tier talent, there is a lot of competition among companies to find the right people for the job. To stand out as a company that Data Scientists will want to work at, you need to be confident in finding the right person and be able to present your company as an attractive option. It can be hard navigating this market, so here are some pointers to help you on your way.

First of all: is your company really ready for a Data Scientist? With high demand come even higher salaries. In last year's 2017/18 Salary Report, we found that the average salary for a Data Scientist in the USA is $120,253. It’s not cheap to hire great talent! Glassdoor ...


Read More on Datafloq
Are We Nearing The End Of Hadoop And Big Data?

Are We Nearing The End Of Hadoop And Big Data?

A few weeks ago, two giants of the big data Hadoop era, Cloudera and Hortonworks, announced they would be merging. The announcement claimed it would be a “merger of equals.” It is fascinating to see...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
The Impact of Artificial Intelligence on the World Economy

The Impact of Artificial Intelligence on the World Economy

A customer takes a picture as robotic arms collect pre-packaged dishes from a cold storage, done according to the diners’ orders, at Haidilao’s new AI restaurant in Beijing, China, Nov.r...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
R Shiny for Cross Industry, Cross Function Data Visualization

R Shiny for Cross Industry, Cross Function Data Visualization

The increasing access to Big Data in this world of connectivity has opened up the possibility of gaining deep insights that can help businesses take relevant, strategic decisions that can spur growth. No wonder then that the Big Data market is expected to grow from USD 28.65 Billion in 2016 to USD 66.79 Billion by 2021, at a high Compound Annual Growth Rate (CAGR) of 18.45%, according to MarketsandMarkets Research.

The growth in Big Data in itself is not as significant as the ability to cull insights and forecast future trends. As a result, Zion Market Research expects the global predictive analytics market to grow at a CAGR of 21 per cent from USD 3.49 billion in 2016 to approximately USD 10.95 billion by 2022.
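As a quick sanity check, the end values quoted above are consistent with the stated growth rates; a small sketch, using only the figures from the cited reports (in USD billions):

# Sanity check of the CAGR figures quoted above.
big_data_2021 = 28.65 * (1 + 0.1845) ** 5       # 2016 -> 2021 at 18.45% CAGR
predictive_2022 = 3.49 * (1 + 0.21) ** 6        # 2016 -> 2022 at 21% CAGR
print(round(big_data_2021, 2))    # ~66.8  (report says USD 66.79 Billion)
print(round(predictive_2022, 2))  # ~10.95 (report says USD 10.95 billion)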

Given the complexity of data, data visualization tools have become critical to presenting the data in ways that can help understand the dynamics between data elements better. Charts, videos, infographics and even virtual reality and augmented reality presentations are being used for more engaging and intuitive insights.

The visualization tools convert numeric algorithmic outputs into images that help in an intuitive understanding of the depth and range of the data in an easy-to-interpret format. Therefore, the visualization is ...


Read More on Datafloq
5 Ways Big Data Affects Your Personal Life: What Does This Mean?

5 Ways Big Data Affects Your Personal Life: What Does This Mean?

Big data refers to huge data sets that are exabytes in size. An exabyte is equal to a million terabytes. This large amount of data is used to improve our everyday lives. From online shopping to TV streaming to preventing crime, big data has come to play an important role in helping companies compile, analyze, and make use of this data. Here are some ways big data is used in our personal lives.

Online Shopping

Online retailers try to target consumers by looking at their browsing history, purchasing history, or search history. Amazon looks at the items you last purchased and compares them to the big data of what other users have purchased; it then suggests other items that you're likely to buy. Databases also keep track of the items in warehouses to prevent theft and to collect product data. A lot of these techniques were originally employed by Amazon and are now being used by other retailers as well.
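This is not Amazon's actual algorithm, which the article does not describe in detail, but the co-purchase idea can be illustrated with a toy item-to-item counter; all items and orders below are made up.

# Toy illustration of the co-purchase idea: count how often items are bought
# together, then suggest the items most often co-purchased with a new purchase.
from collections import Counter
from itertools import combinations

orders = [                                    # made-up purchase histories
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"camera", "camera_bag"},
    {"sd_card", "card_reader"},
]

co_counts = {}                                # item -> Counter of co-purchases
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts.setdefault(a, Counter())[b] += 1
        co_counts.setdefault(b, Counter())[a] += 1

just_bought = "camera"
print(co_counts[just_bought].most_common(2))  # e.g. [('sd_card', 2), ('tripod', 1)]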

Music Streaming

Music streaming has become more personalized, allowing people to create their own radio stations and playlists. Music streaming services collect this information based on the type of music that the user listens to. They also collect the data from their libraries to ...


Read More on Datafloq
Artificial Intelligence Accurately Diagnoses Skin Cancers

Artificial Intelligence Accurately Diagnoses Skin Cancers

New study conducted by an international team of researchers suggests that artificial intelligence (AI) may be better than highly-trained humans at detecting certain skin cancers Artificial...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Machine learning, meet quantum computing

Machine learning, meet quantum computing

Back in 1958, in the earliest days of the computing revolution, the US Office of Naval Research organized a press conference to unveil a device invented by a psychologist named Frank Rosenblatt at...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Japan’s Grand Strategy and Military Forces (I)

Japan’s Grand Strategy and Military Forces (I)

[Source: Consulate-General of Japan, Sydney]

This is the first in a series of Orders of Battle (OOB) posts, which will cover Japan, the neighboring and regional powers in East Asia, as well as the major global players, with a specific viewpoint on their military forces in East Asia and the Greater Indo-Pacific. The idea is to provide a catalog of forces and capabilities, but also to provide some analysis of how those forces are linked to the nation’s strategy.

The geographic term “Indo-Pacific” is a relatively new one, and is referred to by name in the grand strategy detailed by the Japanese Ministry of Foreign Affairs (MOFA) in April 2017. It also aligns with the strategy and terminology used by US Defense Secretary James Mattis at the Shangri-La conference in June 2018. Dr. Michael J. Green has a good primer on the evolution of Japan’s grand strategy, along with a workable definition of the term:

What is “grand strategy”? It is the integration of all instruments of national power to shape a more favorable external environment for peace and prosperity. These comprehensive instruments of power are diplomatic, informational, military and economic. Successful grand strategies are most important in peacetime, since war may be considered the failure of strategy.

Nonetheless, the seminal speech by Vice President Pence regarding China policy on 4 October 2018 contained an articulation of Chinese grand strategy: “Beijing is employing a whole-of-government approach, using political, economic, and military tools, as well as propaganda, to advance its influence and benefit its interests in the United States.” The concept of grand strategy is not new; Thucydides is often credited with the first discussion of this concept in the History of the Peloponnesian War (431-404 BCE). It is fundamentally about the projection of power in all its forms.

With the Focus on the Indo-Pacific Strategy, What About the Home Islands? 

[Source: Japanese Ministry of Defense (MOD) ]

The East Asian region has some long-simmering conflicts, legacies from past wars such as World War II (or the Great Pacific War) (1937-1945), the Korean War (1950-1953), and the Chinese Civil War (1927-1949). These conflicts led to static and stable borders, across which a “military balance” is often referred to; think-tank publications often refer to this, for example the International Institute for Strategic Studies (IISS) offers a publication with this title. The points emphasized by IISS in the 2018 edition are “new arms orders and deliveries graphics and essays on Chinese and Russian air-launched weapons, artificial intelligence and defence, and Russian strategic-force modernisation.”

So the Japanese military has two challenges. It must maintain the balance of power at home, playing defense against neighbors who are developing and deploying new capabilities that have a material effect on this balance. And, as seen above, Japan is working to build an offense as part of the new grand strategy, in which military forces play a role.

Given the size and capability of the Japanese military forces, it is possible to project power at great distances from the Japanese home waters. Yet, as a legacy of the Great Pacific War, the Japanese do not technically have armed forces. The constitution, imposed by the Americans, officially renounces war as a sovereign right of the nation.

In July 2014, the constitution was officially “re-interpreted” to allow collective self-defense. Previously, if the American military came under attack, for example in Guam, nearby Japanese military units could not legally engage the forces attacking the Americans, even though the two are allied nations that conduct numerous training exercises together; that is, they train to fight together. This caused significant policy debate in Japan.

More recently, the legal status of the SDF, an item of debate in the national election in September 2018, is viewed as requiring clarification, with some saying the forces are altogether illegal. “It’s time to tackle a constitutional revision,” Abe said in a victory speech.

The original defense plan was for the American military to defend Japan. The practical realities of the Cold War and the Soviet threat to Japan ended up creating what are technically “self-defense forces” (SDF) in three branches:

  • Japan Ground Self-Defense Forces (JGSDF)
  • Japan Maritime Self-Defense Forces (JMSDF)
  • Japan Air Self-Defense Forces (JASDF)

In the next post, these forces will be cataloged, with specific capabilities linked to Japanese strategy. As a quick preview, the map below illustrates the early warning radar sites, airborne early warning aircraft, and fighter-interceptor aircraft, charged with the mission to maintain a balance of power in the air, as Russian and Chinese air forces challenge the sovereignty of Japanese airspace. With the Russians, this is an old dance from the Cold War, but recently the Chinese have gotten into this game as well.

[Source: J-Wings magazine, December 2018]

Data and IoT: The Secret Sauce

Data and IoT: The Secret Sauce

Data and IoT: Discussion about the Internet of Things often centers around sensors and hardware, the additions to our physical environments. They may be built into street lights and bus stops, or in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
6 Business Intelligence Trends to Watch For in 2019

6 Business Intelligence Trends to Watch For in 2019

The goal of business intelligence (BI) is to thoughtfully and purposefully collect and analyze past information to support an organization and make better decisions about it. As the new year...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
5 Reasons Why Artificial Intelligence is the Driving Force for Web Development

5 Reasons Why Artificial Intelligence is the Driving Force for Web Development

Web development is an ever-evolving field, with users always looking for personalized content and rich experiences. In a nutshell, they want smart web applications that carry hyper-customized user experiences.

This is challenging for web development companies, as traditional approaches cannot satisfy users' increasing demands. Hence, AI (Artificial Intelligence) and other tools help make web development solutions more effective.

Automating the web-building process enables web developers to spend less attention on basic, tiresome tasks. It helps them focus on value-adding aspects like planning a state-of-the-art user experience, formulating a design strategy and more. Human-machine collaboration can also drive the web's transformation.

Let’s have a look at the reasons to use artificial intelligence for web development:

1. Self-learning Algorithms

AI simplifies web programming. It can perform basic tasks like updating and adding records to a database, and it can predict which bits of code are most likely to be useful for solving a given problem, using those predictions to suggest a particular solution to web developers.
You can build smarter apps and bots with artificial intelligence technology much faster than before.

2. Virtual Assistants

It sometimes becomes tedious for web developers to design the basic layouts and templates of web ...


Read More on Datafloq
The State of Analytics Degrees in Universities

The State of Analytics Degrees in Universities

If you want to hire students from universities with strong analytical skills, you need to know the landscape of available programs and skills. Some believe that universities move slowly, but many...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to future-proof your IT job in the age of AI

How to future-proof your IT job in the age of AI

Could a robot do your job? Could you help a robot do its job? If you are thinking about your career development and where you’d like to be a year from now, it’s time to ask yourself these questions....

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Accelerating Data Sharing

Accelerating Data Sharing

Figshare and Digital Science published the 2018 State of Open Data report this week. Based on an annual survey run with us at Springer Nature since 2016, the report tracks changes in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Datascience: more bang for the buck

Datascience: more bang for the buck

Bang

'I made some Python code that really rocks, Ronald'

'It extracts data from various sources, validates it, does some cleansing, codes xxx business rules, lots of integration of course, executes some kind of predictive model, outputs the data and visualizes it'.

And then the magical words are uttered: let's deploy it to production…

Alas, the magic of datascience ends abruptly. IT is blamed for not being agile, and architects are scorned for being too restrictive and even killing innovation in the process.

Datascience has a problem: it fails to operationalize its brilliant models, and it therefore fails to deliver value to the business. There, I said it. I know it is pretty polarizing, but I encounter it on a daily basis now. Datascience needs to grow up.

It’s all about (1) the required quality of services and (2) separating concerns. Neither seems to be considered that important in datascience. They should be!

Let me clarify:

Quality of services

Two use cases:

(a) deploying a risk model at scale (let's say 500K transactions per day) that evaluates a customer transaction in real time against contra-information, determining in the process the level of supervision needed. Oh, and by the way: one has to take 'equality of rights' into account, since the organization is publicly owned.

(b) doing a one-time analysis on various sources, using advanced machine learning, where the output is used for a one-time policy-influencing decision.

The quality of services required for (a) and (b) are like night and day. (a) needs to run at scale and in real time (direct feedback), using contra-information; provenance is hugely important; it is subject-based, so there are privacy concerns; it is an automated decision (there is heavy legal baggage here); equality of rights demands metadata (which model did we use on which transaction, what data did we evaluate, …); and much more.

(b) is a one-off: its output influences new policy or contributes to some insight. Here the required quality of services might be that the model is versioned and properly annotated, and that the dataset is archived properly to ensure repeatability.

My point is that whenever you start on an analytic journey, you should establish the quality of services you require beforehand, as far as possible. For that to happen you need a clear, explicit statement of how the required information product contributes to the bottom line. So yes: a proper portfolio management process, a risk-based impact assessment (!), and deployment patterns (architecture!) designed in advance.

With regard to datascience it is vital to make a conscious choice, before you start, about the required quality of services. If these requirements are high, you may want to work closely with system engineers, data modelling experts, rule experts, legal experts, and so on. Only then might you be able to deploy your work and generate the value that the field of datascience promises us.
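As an illustration of making that conscious choice explicit, here is a minimal sketch of a quality-of-services checklist in Python; the fields and the two profiles paraphrase use cases (a) and (b) above and are illustrative, not any kind of standard.

# A minimal sketch: state the required quality of services before you start.
from dataclasses import dataclass

@dataclass
class QualityOfServices:
    throughput_per_day: int        # how many evaluations must run per day
    realtime: bool                 # is direct feedback needed?
    uses_personal_data: bool       # privacy / legal review required?
    automated_decision: bool       # triggers extra legal obligations
    provenance_required: bool      # must we log model + data per decision?
    repeatable: bool               # versioned model + archived dataset?

risk_model_at_scale = QualityOfServices(   # use case (a)
    throughput_per_day=500_000, realtime=True, uses_personal_data=True,
    automated_decision=True, provenance_required=True, repeatable=True,
)

one_off_analysis = QualityOfServices(      # use case (b)
    throughput_per_day=1, realtime=False, uses_personal_data=False,
    automated_decision=False, provenance_required=False, repeatable=True,
)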

 

Separation of concerns

For those who do not know what 'separation of concerns' means, start with Wikipedia or google Edsger Dijkstra, one of the greatest (Dutch) computer scientists.

Anything IT-related suffers from the 'overloaded concerns' issue. Some examples:

  • XBRL is a great standard, but it suffers from overloading: integrity, validation, structure, meaning and presentation concerns are all bundled into one technical exchange format.
  • Data Vault is a great technical modeling paradigm, but it does not capture logical, linguistic or semantic concerns, and yet the data modelling community keeps trying.
  • ArchiMate is a great modeling notation in the Enterprise Architecture arena, so why is it overloaded with process concerns? BPMN is a much better choice there.

And of course we see it in code, and we have seen it for ages in all programming languages: the human tendency to solve every challenge with the tool we are most familiar with or trained in. Datascience is no different. Failing to separate concerns lies at the root of many software problems: maintainability, transparency, changeability, performance, scalability and many, many more.

Take the example I started this blog with:

A brilliant Python script in which a staggering number of concerns have all been dealt with at once. This might not be a problem when the required quality of services is modest. But when the required quality of services is high, it becomes painfully clear that 'deploying' this code to production is a fantasy.

Extraction, validation, cleansing and integration concerns might be better dealt with by using tools and techniques from the information (modeling) arena.

Business rules might be better designed, for example, by means of RuleSpeak, which makes them more transparent to legal people and domain experts (which, by the way, is a huge concern, especially in AI!).

Visualization and presentation might be better served by tools your organization has already purchased, be it Tableau, SAS Visual Analytics, Qlik, Tibco or whatever. A sketch of this separation follows below.
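To make the contrast concrete, here is a minimal sketch of what separating those concerns can look like in Python, under the assumption of a simple batch job; every file name, column and rule in it is illustrative.

# Each concern lives behind its own function with a narrow interface, so
# extraction, validation, rules and presentation can be swapped out (or
# handed to specialized tools) independently.
import csv
from typing import Iterable

def extract(path: str) -> Iterable[dict]:
    """Extraction concern: only knows about the source format."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def validate(rows: Iterable[dict]) -> Iterable[dict]:
    """Validation concern: schema checks, nothing else."""
    for row in rows:
        if row.get("amount") and float(row["amount"]) >= 0:
            yield row

def apply_rules(rows: Iterable[dict]) -> Iterable[dict]:
    """Business-rule concern: kept separate so domain experts can review it."""
    for row in rows:
        row["high_value"] = float(row["amount"]) > 10_000
        yield row

def present(rows: Iterable[dict]) -> None:
    """Presentation concern: trivially replaceable by a BI tool."""
    for row in rows:
        print(row)

if __name__ == "__main__":
    present(apply_rules(validate(extract("transactions.csv"))))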

 

Finally

Dear datascientist: stop blaming IT, architects or whoever else for the fact that your brilliant code is not being deployed to production. Instead, reflect on your own role and your own part in the total supply chain of things that need to happen to actually get code working in production at the required quality of services.

Dear organization: stop blaming datascientists for not delivering the value that was promised. Start organizing, consciously, the operationalization of datascience. It is not a walk in the park; it is an assignment that requires an extremely broad skill set, a holistic view, cooperation and, of course, attention to human behavior and nature.

And the majority of these challenges fall to management!

Starting a datascience team or department without organizing the operationalization is a waste of resources.

Operationalization of datascience IS NOT a technical problem!

For my dataquadrant fans: it is all about transitioning from quadrant IV to II.


Mastering data governance initiatives in the age of IIoT

Mastering data governance initiatives in the age of IIoT

The Industrial Internet of Things encompasses internet-connected devices that companies use within their organizations. Improved data governance is a necessary goal to strive for regarding the...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
From DevOps to DataOps

From DevOps to DataOps

Over the past 10 years, many of us in technology companies have experienced the emergence of “DevOps.” This new set of practices and tools has improved the velocity, quality, predictability and scale...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How Big Data Is Redefining Ad Networks

How Big Data Is Redefining Ad Networks

The rapid rise of big data in virtually every realm of economic activity has caused a flurry of activity across the market, as business owners and hard-working professionals try to cash in on this exciting trend. Despite the renewed attention being paid to big data lately, however, many commentators are still paying pitifully little attention to how big data is set to redefine ad networks, which could be the most important way that it reshapes our material lives.

Here are some of the ways that continued innovations in big data could redefine ad networks as we know them, and how our consumer lifestyles will soon never be the same.

Big data is already changing advertising

To comprehensively understand how big data is redefining ad networks, you need to have a basic understanding of how it’s already changed the advertising industry. Despite the fact that many proponents and critics of big data alike frequently talk about it as though it were some forthcoming innovation, making use of software to sort through dizzying sums of information has been a vital part of the market for years now. The way that companies like Amazon link hands with Madison Avenue to deliver you enticing content like never ...


Read More on Datafloq
How should CIOs manage data at the edge?

How should CIOs manage data at the edge?

The ubiquity of popular buzzwords or phrases in the technology community brings a certain kind of pressure. If everyone seems to be talking about the importance and transformative potential of an...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
What is Data Portability?

What is Data Portability?

In May of 2018 the European Union tightened regulations about customers' right to data portability as part of the GDPR (General Data Protection Regulation). But what do these changes mean, and how will...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
With blockchain asset tracking, Walmart pushes supplier tech adoption

With blockchain asset tracking, Walmart pushes supplier tech adoption

Walmart Inc.’s recent mandate that suppliers of leafy green vegetables use blockchain by September 2019 faces two important hurdles that other companies should consider: the adoption rate of a...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
6 Ways AR Plays a Role in Your Company’s Digital Transformation

6 Ways AR Plays a Role in Your Company’s Digital Transformation

The Augmented Reality revolution has reached a new point, with enterprise adoption overshadowing the consumer world. Market leaders have started directing their focus from niche offerings to...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Inside SAP’s digital transformation strategy

Inside SAP’s digital transformation strategy

SAP has been at the forefront of business digital transformation, primarily by selling technology and applications… that enable companies to create new digital models and opportunities. But SAP...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How AR and VR Changing the Dynamics of Design Industry

How AR and VR Changing the Dynamics of Design Industry

The world we live in, the place we call home, has become more of a global village: connected in all dimensions, transforming in all aspects. It goes through a constant evolutionary phase and has been very consistent in its technological revolution. The basis for these changes is billions of individuals and millions of companies whose sole purpose is to innovate their respective domains and automate the learning and building process.

Over the past few decades, we have witnessed a revolution in the technology sector with the introduction of AI and the ease it brings to different sectors. The latest addition to this sui generis system is the introduction of Virtual Reality (VR) and Augmented Reality (AR). These interfaces are taking the world of design by storm and are providing a groundbreaking, real-time experience to industrialists and customers alike. They are profoundly altering our lifestyle: the way we create, work, live and play.

Design is the sole representative of a company, or at least of its virtual aspect; it is how audiences interact with their brands. As long as a company portrays itself uniquely and brands itself substantially, its product is the ...


Read More on Datafloq
Facial recognition’s failings: Coping with uncertainty in the age of machine learning

Facial recognition’s failings: Coping with uncertainty in the age of machine learning

Deep learning is a technology with a lot of promise: helping computers “see” the world, understand speech, and make sense of language. But away from the headlines about computers...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Monetizing IoT Data Is The Next Step In A Connected Economy

Monetizing IoT Data Is The Next Step In A Connected Economy

We are moving at breakneck speed into the digitized world, where data has reigned as king for several years. With the Internet of Things (IoT) becoming more prominent in our daily lives, data will...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
36th Panzer Regiment Tank Losses, January 1944

36th Panzer Regiment Tank Losses, January 1944

The 36th Panzer Regiment was the tank component of the 14th Panzer Division, which had been destroyed at Stalingrad. When the division was recreated, it was supposed to have a three-battalion panzer regiment. However, it received only the I. and III. battalions before transferring to the eastern front in the autumn of 1943. As losses accumulated, its remaining tanks and assault guns were concentrated in the III. battalion, and the I. battalion was sent out of the theatre to replenish.

On 1 January 1944, the regiment had the following vehicles operational: 10 StuG, 11 Pz III, 11 Pz IV. In short term repair were: 7 StuG, 1 Pz III and 8 Pz IV. However, there is some uncertainty regarding the Pz III tanks, as they are not to be found in the organization chart, except for 6 command tanks in the battalion and the regiment. This is according to the monthly report to the Inspector-General of Panzer Troops (BA-MA RH 10/152).

The battalion war diary can be found in file BA-MA RH39/380. It is not as detailed as the war diary I have used for previous posts on I./Pz.Rgt. 26 Panther battalion. It just contains a narrative and I don’t have the kind of detailed annexes included in the file on I./Pz.Rgt. 26.

From the war diary, I conclude the following losses during January 1944:

StuG: 10 complete losses. One of them was only damaged by enemy fire, but could not be recovered due to nearby enemy units and was fired upon by other German StuG until it caught fire. Another 10 StuG were damaged, either by enemy fire or suffered from technical breakdowns.

Pz IV: 3 complete losses, 6 damaged. As there has been some posts on the effectiveness of artillery versus armour on the blog, it can be worth mentioning that one of the damaged Pz IV was hit by artillery fire.

Pz III tanks are not mentioned at all in the battalion war diary.

There are two comments on repairs in the war diary. On 14 January, it is said that one repaired tank returned, and on 27 January, it is reported that 3 Pz IV and 1 StuG returned from the workshops. However, this cannot be all the repairs. On 1 February, the battalion had 5 operational StuG and 2 in short-term repair. As it started out with 10+7 StuG, had 5+2 on 1 February, and reported 10 destroyed and 10 damaged during January, there must have been more repairs. The figures suggest that 15 StuG were repaired, as there were no shipments of new StuG from the factories, according to the records in BA-MA RH 10/349 (list of deliveries of new AFV). Neither is any transfer of AFV from other units mentioned in the war diary.
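The StuG bookkeeping above can be reconciled explicitly; a quick back-of-the-envelope check, using only the figures quoted from the war diary and strength returns:

# Back-of-the-envelope check of the StuG figures quoted above.
operational_jan1, in_repair_jan1 = 10, 7
destroyed, newly_damaged = 10, 10
operational_feb1, in_repair_feb1 = 5, 2

fleet_jan1 = operational_jan1 + in_repair_jan1           # 17 on hand
fleet_feb1 = fleet_jan1 - destroyed                      # 7 left (no deliveries)
assert fleet_feb1 == operational_feb1 + in_repair_feb1   # 7 == 5 + 2

# Vehicles that passed through the workshops: everything damaged at some
# point (7 already in repair + 10 newly damaged) minus the 2 still in repair.
repaired = in_repair_jan1 + newly_damaged - in_repair_feb1
print(repaired)  # 15, matching the estimate in the text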

It seems that the number of repaired Pz IV is 5, given the number on hand on 1 February.

All in all, this would mean that the battalion started out with 10 StuG and 11 Pz IV operational on 1 January, irretrievably lost 10 StuG and 3 Pz IV, had 10 StuG and 6 Pz IV damaged, and repaired 15 StuG and 5 Pz IV. It should be noted that these figures are less certain than those given for the I./Pz.Rgt. 26 in previous posts, as the war diary of the III./Pz.Rgt. 36 is not as detailed.

Admittedly, it is problematic to compare loss rates between units fighting different enemy formations, but it is still tempting to compare the III./Pz.Rgt. 36 with the I./Pz.Rgt. 26. After all, they fought in the same general area (Ukraine south of Kiev) in similar conditions against similar Soviet units. Clearly, the StuG and Pz IV were far more often directly destroyed by hits from enemy units. On the other hand, there seems to have been significantly fewer cases of mechanical breakdown among StuG and Pz IV. Ten such cases are explicitly mentioned, but in many cases the war diary just says that a tank was out of action, without giving a cause. Most likely, in those cases the cause was enemy action.

Clearly the proportions between destroyed by enemy fire, damaged by enemy fire and lost due to other causes differ considerably between the III./Pz.Rgt 36 and I./Pz.Rgt. 26.

[Picture: the SS Panzer Corps in July 1943]

 

Understanding the role of automation in data management strategies

Understanding the role of automation in data management strategies

“The bigger the better,” so the saying goes. However, when it comes to data, it’s not so simple. We’ve ended up with bigger data, but have we really got better data? In my experience, businesses are...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
How to Make Your Business Smarter with IBM Watson Studio

How to Make Your Business Smarter with IBM Watson Studio

A smart business is one that runs on the numbers. Today, open communication means more access to customers and competition than ever. Even small companies have world-wide customer lists, in part due to the increasingly streamlined logistics companies that help with shipping. In a business world of static customers and increasing competition, staying ahead of the curve often means staying on the leading edge of technology. Smart businesses leverage technology for direct gains, and that all comes down to the data. Efficiency improvements, trend tracking, smooth workflows and dozens of other internal operational improvements depend on the data you can collect and interpret.

All Data is Part of Big Data

Big Data is a term that gets thrown around a lot, and all it means is that you now have access to an incredible amount of data points. Some data points are important, while others just clutter up the landscape. In a Big Data approach to collection, you take it all in and start sorting through to gain valuable business insights. The sheer volume of information collected can make this approach seem inefficient. But, when done well, data collection can lead to some serious business benefits.

Benefits of Adopting a Data-Centric Approach

Data collection ...


Read More on Datafloq
Hitachi’s Bill Schmarzo: ‘IoT avalanche will open up security vulnerabilities’

Hitachi’s Bill Schmarzo: ‘IoT avalanche will open up security vulnerabilities’

Known as the ‘Dean of Big Data’, Hitachi Vantara CTO Bill Schmarzo reveals his thoughts on AI, blockchain and the internet of things. Bill Schmarzo is CTO in charge of internet of things (IoT) and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Top 4 Artificial Intelligence Applications in Financial Institutions

Top 4 Artificial Intelligence Applications in Financial Institutions

Artificial intelligence has become very important in financial institutions and banking. Many banking applications and software products are embracing AI extensively in order to compete in a very intense environment, using applications like virtual assistants, chatbots and AI debt-collection assistants.

It is believed that more than 85% of bank customer interactions will be managed solely by artificial intelligence by 2020. TechEmergence, the AI market research specialist, believes that chatbots are going to become the primary consumer AI apps in the future, as banks need to engage with their customers who seek help and information.

AI Solutions in Banks & Financial Institutions

The use of chatbots and various virtual assistants reduces expensive and tedious work in call centres, easing the burden on customer service agents to a great extent. Financial institutions and banks have no real alternative to artificial intelligence for providing quick responses and effective solutions to the thousands of customers who contact them every day.

Here are the top 4 Artificial Intelligence solutions that are currently being used in banks and financial institutions:

Sales Assistant

Artificial intelligence provides real-time assistance to fill forms in banks. This can increase the bank's conversion rate from 2% to 12%. It ...


Read More on Datafloq
Automatic Clustering, Materialized Views and Automatic Maintenance in Snowflake 

Automatic Clustering, Materialized Views and Automatic Maintenance in Snowflake 

Boy are things going bananas at Snowflake these days. The really big news a few weeks back was another round of funding! This week we announced two new major features.
Connected Vehicles – Are Commuters in The Privacy Driving Seat?

Connected Vehicles – Are Commuters in The Privacy Driving Seat?

We seem to be living in a world that is barreling headlong towards a science fiction reality that was the dream of writers in the 1950’s. Our parents were shown images and told stories of an ever...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Drug Barons, Rogue States and Terror Groups Use Banks – Can Blockchain Stop Them?

Drug Barons, Rogue States and Terror Groups Use Banks – Can Blockchain Stop Them?

Scathing reports by regulators have accused traditional banks of inadvertently helping “drug kingpins and rogue nations” – enabling them to commit money laundering, make questionable transfers and...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Can the Blockchain Make AI-based Systems Free of Monopolization?

Can the Blockchain Make AI-based Systems Free of Monopolization?

The potential of blockchain technology is thought by many to be huge - the technology could not only affect the economy, but also medicine, scientific research, government, education, and several other fields. The same is thought about artificial intelligence. If both technologies have so much potential, what could happen if the two are combined? According to some computer scientists, venture capitalists, and entrepreneurs, decentralization of AI-based systems is one of the expected outcomes.

Some people from the AI sector have raised concerns about the extent to which companies such as Google or Facebook have taken control over online data and how this control could limit the training of machine learning programs. Dawn Song, a computer science professor at UC Berkeley, believes that blockchain technology could significantly limit the control that Internet giants have over online information. Song stresses the importance of machine learning capabilities that are under the users' control, and she believes blockchain could be the answer. More specifically, blockchain technology could be used to provide AI networks access to stores of online data without the need to involve third parties in the process.

Song also found a way to test these beliefs, as she is currently developing a blockchain named Oasis, ...


Read More on Datafloq
