Big Data Innovation, Issue 25


THE LEADING VOICE IN BIG DATA INNOVATION

BIG DATA INNOVATION NOV 2016 | #25

Quantum Physics And The Big Data Question | 14

Quantum theory may be beyond the realms of most of our understanding, but it could have a huge impact on data

+ The Top 5 Chief Data Officers Of 2016
We take a look at the 5 top performers in the CDO role from the previous 12 months | 16

Data Isn't Dead, But The Way We Do Election Modeling Could Be
The election of Donald Trump has shocked the world, but what does it mean for the future of polling? | 12


Big Data & Analytics Innovation Summit | Speakers Include

Jan 25 & 26, 2017 | Las Vegas

Contact Details: Roy Asterley | +1 415 692 5426 | rasterley@theiegroup.com | theinnovationenterprise.com


ISSUE 25

EDITOR’S LETTER Welcome to the 25th Edition of Big Data Innovation

I am writing this on November 9th, and the world has just experienced the shock of Donald Trump being elected President. If the outcry around the world is to be believed, it is going to have a huge impact on minorities and women across the US, but it is also likely to have an impact on how people perceive data in the future. Regardless of political preferences or potential outcomes, the truth is that almost every pollster got this wrong. There were a couple of outliers who claimed that Trump would win, but their work was generally seen as erroneous, and the consensus was a comfortable win for Clinton. Whether or not this impacted voting is a question for another day, but what it has done is bring data into disrepute on the biggest stage.

It can now act as a stick with which to beat the entire concept of data-driven decisions, especially when it comes to government policy. We have seen this with the appointment of DJ Patil as a major part of the Obama administration, but this policy may be scrapped in the future. The trouble is that the importance of open data in government and opposition cannot be overstated. The media needs data to evidence corruption and poor leadership, and the government needs it to create the best policies. Most importantly, the general public needs to trust that this is being done. With the attacks on the media throughout the campaign, how are die-hard supporters of Trump going to accept data shown in the media that discredits him?

Trump’s election campaign has been anti-data, with the new President-Elect calling it ‘overrated’ and lying about or using misleading figures throughout, whilst his personal attacks have meant that the data used by Clinton has been largely ignored in favor of more newsworthy stories. Let’s hope that a Trump White House can embrace data and push its development within government, both for the good of the country and the industry as a whole.

george hill managing editor



Chief Data Officer Summit | Develop a Cohesive Enterprise Data Strategy

December 7 & 8, 2016 | New York | Speakers Include

Contact Details: Roy Asterley | +1 415 692 5426 | rasterley@theiegroup.com

theinnovationenterprise.com


contents

6 | DESPITE IT BEING 'OVERRATED', DONALD TRUMP IS USING DATA, BUT TO WHAT ENDS?
According to the Gartner Hype Cycle, data-related technologies are dominant. We look at this in more detail and try to see why this is the case.

10 | AI AND THE FUTURE OF MENTAL HEALTH
We have heard a considerable amount about the use of data in traditional healthcare, but how about mental health?

12 | DATA ISN'T DEAD, BUT THE WAY WE DO ELECTION MODELING COULD BE
The election of Donald Trump has shocked the world, but what does it mean for the future of polling?

14 | QUANTUM PHYSICS AND THE BIG DATA QUESTION
Quantum theory may be beyond the realms of most of our understanding, but it could have a huge impact on data

16 | THE TOP 5 CHIEF DATA OFFICERS OF 2016
We take a look at the 5 top performers in the CDO role from the previous 12 months

18 | COMPANIES MERGING FOR DATA SHOULD BEWARE PRIVACY CONCERNS
We are seeing a growing amount of data-driven M&A activity, but companies need to be careful when merging databases

20 | 7 MYTHS ABOUT BLOCKCHAIN
Nick Ayton takes a look at some of the common myths about Blockchain and explains why they don't hold up

WRITE FOR US
Do you want to contribute to our next issue? Contact ghill@theiegroup.com for details.

ADVERTISING
For advertising opportunities, contact achristofi@theiegroup.com.

managing editor george hill | assistant editor james ovenden | creative director charlotte weyer | contributors shivanee pattni, nick ayton, david barton, olivia timson, jake hissitt




DESPITE IT BEING ‘OVERRATED’, DONALD TRUMP IS USING DATA, BUT TO WHAT ENDS? George Hill, Editor in Chief

...despite the outward showing of hatred towards data and analytics, has been doing considerable work behind the scenes


WE HAVE PREVIOUSLY discussed Donald Trump's dislike for data, with him having publicly referred to it as 'overrated'. In fact, in our article, we claimed that it was only in May 2016 that he had hired a pollster to his team, which was shocking at the time. However, it seems that the Trump team, despite the outward showing of hatred towards data and analytics, did considerable work behind the scenes, creating an impressive data science team based in San Antonio. In a recent article from Joshua Green and Sasha Issenberg in Bloomberg Businessweek, we found out several things about the campaign's data operations, including:

• The campaign spends $100,000 every week on surveys
• The campaign's own polling has found that Trump has a low chance of winning, despite the public statements, with Brad Parscale, one of Trump's closest advisors and the man who runs its data operations, saying 'Nate Silver's [founder of fivethirtyeight.com] results have been similar to ours'
• The team is utilizing social media data in an attempt to suppress turnout among demographics unlikely to vote for Trump, including white liberal idealists, young women, and black people
• The team has found unconventional, and in some cases spammy, methods of gaining small political donations; in their first email blast, 60% of their emails were caught in spam filters

However, the most interesting element is that data actually guided much of Trump's campaign, despite the commonly held belief that much of it was haphazard and reactive, largely due to the negative things that Trump has said about minorities, women, or political opponents. For instance, the locations of his rallies were chosen through Cambridge Analytica's ranking of the places in any given state with the largest clusters of persuadable voters. We have also seen that some of the questionable things he has said were actually data-driven, with the Bloomberg story claiming: 'Trump's invocation at the debate of Clinton's WikiLeaks e-mails and support for the Trans-Pacific Partnership was designed to turn off Sanders supporters. The parade of

women who say they were sexually assaulted by Bill Clinton and harassed or threatened by Hillary is meant to undermine her appeal to young women. And her 1996 suggestion that some African American males are 'super predators' is the basis of a below-the-radar effort to discourage infrequent black voters from showing up at the polls — particularly in Florida.' Trump also created a huge fundraising database known as Project Alamo, which grew to 100 dedicated members of staff, all paid for by the Trump campaign (rather than the RNC). The database isn't just data collected by Trump, though; it includes the full $100 million RNC database created after Mitt Romney's defeat in 2012 was blamed on a lack of data effort compared to Barack Obama's team. It has since been added to by the campaign to include significantly more data, making the list worth between $36 million and $112 million. All of this had a considerably higher impact on Donald Trump's words and actions on the campaign trail than many would believe, but the really brilliant thing he may have done wasn't on the path to the White House, but in how he prepared to deal with the aftermath. This is because, despite winning the election, his historical business plan is in ruins. His business is focussed on putting his name on luxury buildings, hotels, and golf courses in order to give them the gravitas that his name represented, yet recent figures from an online poll by Morning Consult show that 40% of voters said that his campaign made them less likely to buy his products, 46% said they wouldn't stay in a Trump hotel, and 63% said they wouldn't use one of his golf courses. This represents a problem for Trump, given that he hasn't been making money from building things



himself, but has instead licensed his name to luxury developments. If there are huge swathes of the population who are actively against the name, nobody is going to license it. However, this list gives him a significant amount of power that can be exploited in either a political or commercial way. Some believe that it will allow him to create a powerful right wing or alt-right media empire, given the Trump campaign's employment of Roger Ailes, founder of Fox News, and Steve Bannon, executive chairman of Breitbart News. It would create a strong foundation of those likely to watch a network focussed on the issues often discussed during the campaign. The second idea is that the campaign is trying to splinter the Republican party, which would fit with the


continued narrative of criticizing party leaders such as Paul Ryan and John McCain, whilst also spearing 'the system'. Having this database of his own supporters, combined with the wider party membership, would allow him and his supporters to largely control the narrative, making dividing the party considerably easier. Regardless of whether either of these theories is true, one thing is for certain - the 2008 and 2012 elections demonstrated how voter data can get people to vote. Will 2016 show how voter data can actually stop people voting, and then be used as a commercial or political vehicle afterwards?


Premium online courses delivered by industry experts

academy.theinnovationenterprise.com



AI And The Future Of Mental Health Shivanee Pattni, Technology Commentator



THERE HAS BEEN a lot of research and experimentation done to try and understand incurable brain diseases. Despite finding ways of easing the symptoms, prevention and early diagnosis remain problems that hold back a potential cure. Early detection of cognitive changes can significantly slow a disease's development and its complications, and prevent devastating consequences. With its grounding in cognitive science and its ability to process huge amounts of data, artificial intelligence could be the missing element in solving the mystery around neural disorders. Despite the progress in medical technology, when it comes to diagnostic tools for cognitive disorders, most of the time healthcare specialists apply tests using good old pen and paper. The Montreal Cognitive Assessment (MoCA) and the Clock Drawing Test (CDT) remain among the most effective ways to detect degenerative changes in the brain. With the CDT, for example, a person is told to draw a clock face showing a specified time and then copy the previous drawing. The test demonstrates how people perform in terms of verbal understanding, spatial knowledge, and memory. By using the CDT, it is possible to say which potential disorder is currently affecting a person's brain. However, despite its high accuracy, the method can still be subject to a doctor's subjective judgement when identifying to what extent cognitive changes have taken place. Thus, it's possible to say what type of disorder a person is suffering from, but unfortunately only after the disease has started affecting their quality of life. In the case of Alzheimer's disease, for example, it can be decades before a cognitive change becomes noticeable. While analyzing the issue, the Computer Science and Artificial Intelligence Laboratory at MIT (CSAIL) suggested that instead of analyzing final drawings or scores in cognitive tests like the CDT and MoCA, scientists and doctors should consider the process of the assessment as a whole. In order to extract data from the process of assessment, the research team combined hardware and artificial

intelligence and created the Anoto Live Pen. The digital ballpoint pen is capable of measuring its position on the paper upwards of 80 times a second, with a built-in camera capturing movements and collecting important data. Data collected from 2,600 tests over 9 years allowed the CSAIL team to create specialized software in the form of the Digital Clock Drawing Test (dCDT). The newly developed method proved to be more accurate than the conventional CDT, thanks to the more thorough analysis of the results due to the thousands of new data points available. The CSAIL team has found that individuals with memory impairments spend more time thinking prior to drawing than those without a disorder. Also, individuals with signs of Parkinson's disease spend more time drawing the clocks - these insights were impossible to see with the conventional test. The main purpose of the dCDT is to save time on the detection of cognitive changes, so the disease can be diagnosed at earlier stages.
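The CSAIL system itself is far more sophisticated, but the underlying idea of the dCDT, scoring the drawing process rather than only the finished clock, can be sketched in a few lines. Everything below (the sample format, the two timing features, and the screening threshold) is invented purely for illustration and is not the CSAIL implementation.

```python
# Illustrative sketch only: timing features from digitized pen samples, loosely
# inspired by the dCDT idea of analyzing the drawing *process*. The data format,
# feature names, and threshold here are hypothetical, not the CSAIL model.
import numpy as np

def drawing_features(timestamps_s, pen_down):
    """timestamps_s: sample times in seconds; pen_down: one boolean per sample."""
    timestamps_s = np.asarray(timestamps_s, dtype=float)
    pen_down = np.asarray(pen_down, dtype=bool)
    first_stroke = int(np.argmax(pen_down))                       # first pen-down sample
    think_time = timestamps_s[first_stroke] - timestamps_s[0]     # latency before drawing
    draw_time = timestamps_s[-1] - timestamps_s[first_stroke]     # time spent drawing
    return {"think_time_s": think_time, "draw_time_s": draw_time}

# A fake 80 Hz recording: 4 seconds of hesitation, then 20 seconds of drawing.
t = np.arange(0, 24, 1 / 80)
pen_down = t >= 4.0
features = drawing_features(t, pen_down)
print(features)

# A toy screening rule: a long pre-drawing latency is the kind of signal the
# article says correlates with memory impairment (the threshold is made up).
if features["think_time_s"] > 3.0:
    print("Flag for further assessment (illustrative rule only)")
```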

However, the CSAIL team is not the only one that believes AI and its capability to learn from data can contribute to the diagnostics and treatment of degenerative diseases. Bryan Johnson, the founder of Braintree, has recently invested $100 million in Kernel, a human intelligence company which is developing the world's first neuroprosthesis to mimic and improve the cognition of mentally ill patients. The idea of the project is to create a prosthesis which can be either implanted or attached to the body, where machine learning algorithms facilitate communication between brain cells by operating directly on the neural code that is responsible for storing and recalling information in the human brain. Whilst the device's priority is to understand and slow down cognitive changes, researchers at Kernel also believe that the overall development of AI can benefit too.

Artificial Intelligence is based on a deep understanding of how the human brain works, so if the technology is in direct communication with the organ, the research and development of AI can become more accurate. However, due to ethical concerns around AI, it is likely that direct interaction between the human brain and a machine will be difficult to implement any time soon. AI is a field with enormous potential, and despite numerous fears that one day technology will act against humanity, so far the solutions provided by machine learning have mostly acted in society's interest. AI-driven approaches are already disrupting conventional methods of diagnostics, showing outstanding results such as in the case of the dCDT. Increasing interest among investors and scientists in integrating AI within the healthcare system also means that, one day, those who suffer may be able to regain their independence of mind.



DATA ISN’T DEAD, But The Way We Do Election Modeling Could Be David Barton, Head of Analytics, Innovation Enterprise

IN THE RUN-UP to the election, Huffington Post mounted what could, in the world of statistics at least, only be described as a sustained assault on Nate Silver’s 538 website, publishing no fewer than three articles virulently criticizing his predictions for the US Presidential election and the models he used to make them. What prompted this vendetta is unclear, but it seems that it was, at least in part, an attempt to calm fears among its readership that Hillary Clinton may lose to Trump. Well, they did an excellent job. Possibly too good, apparently calming many to the extent that they felt they didn’t have to worry about actually voting for her. There is a strong case to be made that Silver was wrong,


because he was, giving her a 65% chance of winning. However, he was far less wrong than most, predicting many swing states for Clinton that only went to Trump within the margin of error. And he was certainly less wrong than the Huffington Post, who had her at a patently ludicrous 98.3%. Not that they were the only ones. Princeton Election Consortium's Sam Wang's model showed a >99% chance of a Clinton victory, and The New York Times' model at The Upshot put her chances at only a slightly more conservative 85%. In his attack on Silver in HuffPo, Ryan Grim wrote: 'The short version is that Silver is changing the results of polls to fit where he thinks the polls truly are, rather than simply entering the poll


numbers into his model and crunching them.’ His article concluded, ‘If you want to put your faith in the numbers, you can relax. She’s got this.’ Given that many will have taken Grim’s advice and had faith in the numbers, it is understandable that this faith is shaken, maybe even gone. The numbers failed them. Mike Murphy, a Republican strategist who predicted a Clinton win, summed up the mood towards numbers on MSNBC, saying ‘My crystal ball has been shattered into atoms. Tonight, data died.’ But what do flaws in the election modeling and polls actually tell us this time? Is data really dead? Ultimately, this result has served only to underline Silver’s argument that numbers are not magically correct simply because they’re numbers. They are not infallible, and in politics as in business, they need context, because when humans are involved, there is always going to be a high degree of uncertainty. Silver’s trend line adjustments that Grim took such umbrage with in his article are an attempt to account for this uncertainty, correcting predictions based on historical patterns. The problem is that there is a limited amount of historical data when it comes to elections, with thorough and modern polls only going back as far as the 1970s, and this election was totally different to those that had gone before. There has never been a female presidential candidate from a major political party before, there has never been a candidate as completely off the wall as Trump before, there has never been an approach to an election campaign like his before, and many who voted for him had never voted at all before. This meant that historical data was more or less rendered useless, adding in a further layer of uncertainty, which is likely why even Silver got it so wrong. There were a number of reasons polls were wrong on an individual basis. Pollsters are less likely to question new voters, in this case white working class Americans. It has also been claimed that there were many so-called ‘Shy Trump’ supporters, echoing a phenomenon seen in Britain when the

polls got it so wrong about Brexit and a Conservative Party majority in 2015, with voters hiding their intentions for fear of being labeled racist or sexist. Frank McCarthy, a Republican consultant with the Keelen Group, a consulting firm in Washington, DC, said: 'People have been told that they have to be embarrassed to support Donald Trump, even when they're answering a quick question in a telephone poll.' He added that, 'What we've been hearing from the [Republican National Committee] for months is there's a distinct difference on people who get polled by a real person versus touch tone push poll.' Data is not dead, but this election must be a learning experience for all pollsters and election modellers, including Nate Silver. Not all polls were wrong. Over the last two months, 10 polls published on Real Clear Politics gave Trump the lead. Nine of these were from LA Times/USC, and pollsters must look at what they did right. MogIA, an AI system which analyses data from Google, Twitter, and Facebook, also correctly predicted that Donald Trump would win based on the amount of chatter there was, and advances in the ability to use unstructured data to gauge sentiment should help improve this even more before the next election. Politics is ultimately about more than numbers, it is about people, and people are hard to predict, but not completely impossible. Election modelling has to account for uncertainty, and if it looks at the context on social media and corrects for other factors seen in this election, then it will improve. We have to accept this uncertainty rather than demand definitive answers from the press and pollsters as soon as possible. As Pradeep Mutalik noted ahead of the election, 'Aggregating poll results accurately and assigning a probability estimate to the win are completely different problems. Forecasters do the former pretty well, but the science of election modeling still has a long way to go.' This defeat is a defining moment for election modellers, and how they bounce back will determine whether or not they ever actually get there.
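To make Mutalik's distinction concrete, here is a toy sketch of the two separate problems: taking poll averages as given, and then turning them into a win probability. Every number in it (states, electoral votes, leads, error sizes) is hypothetical. The point it illustrates is that allowing a shared, correlated polling error across states, broadly the kind of caution built into Silver's approach, produces a much less certain forecast than treating each state's error as independent.

```python
# Toy election forecast sketch: poll aggregation vs. win-probability estimation.
# All states, leads, electoral votes, and error magnitudes below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical swing states: (electoral votes, Clinton's averaged poll lead in points)
SWING = {"A": (29, 1.0), "B": (20, 2.0), "C": (18, 3.0), "D": (16, 1.5), "E": (10, 2.5)}
SAFE_CLINTON_EV = 250   # hypothetical electoral votes treated as locked in
NEEDED = 270

def win_probability(correlated_sd, state_sd, n_sims=100_000):
    """Monte Carlo: a shared error term moves every swing state in the same direction."""
    shared = rng.normal(0.0, correlated_sd, n_sims)      # systematic national polling miss
    clinton_ev = np.full(n_sims, SAFE_CLINTON_EV, dtype=float)
    for ev, lead in SWING.values():
        margin = lead + shared + rng.normal(0.0, state_sd, n_sims)
        clinton_ev += ev * (margin > 0)                  # state won if final margin positive
    return float(np.mean(clinton_ev >= NEEDED))

# Treating state errors as (almost) independent produces an overconfident forecast...
print("mostly independent errors:", win_probability(correlated_sd=0.5, state_sd=3.0))
# ...while allowing a shared, correlated miss across states is far more cautious.
print("correlated errors:        ", win_probability(correlated_sd=4.0, state_sd=3.0))
```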

...There is a strong case to be made that Silver was wrong, because he was, giving her a 65% chance of winning



Quantum Physics And The Big Data Question George Hill, Editor in Chief


...Quantum physics may actually have a bigger impact on data in the future than simply the way that we run our computers




WE HAVE SPENT a considerable amount of time discussing the implications that quantum computing may well have on big data and computing power in the coming years, but it may be another aspect of quantum theory that could see the biggest change. Quantum computing essentially allows computer bits to operate in three states, rather than the two states in current technologies. Essentially, when you turn on your computer and load a file, play music, or even just move your mouse around on the screen, you are causing bits to be either turned on or off. The sequencing of these switches then powers whatever you see on your screen. With quantum computing, instead of bits, we use qubits, which aren't just on or off: a qubit can be on, off, or both at once. It is a difficult concept to get your head around, but the most important thing to know is that it means computers that work thousands of times faster than regular computers. The implications of this for data are clear - think in-memory speeds accelerated beyond anything you could imagine today - but quantum physics may actually have a bigger impact on data in the future than simply the way that we run our computers. Through quantum entanglement, we may have the first unhackable data transfer process and the fastest method of communication. Quantum entanglement essentially 'binds' two particles together, with changes occurring to one of the two entangled particles having the same effect on the other. This can technically happen at huge distances: if one of the pair is changed on one side of the universe, it will still have the same result on the other. At present the record is over 300km, held by NTT Basic Research Laboratories in Kanagawa, Japan, with instantaneous changes occurring in two entangled particles. The implications of this for data are huge, as it would basically create a way of transporting data with no way of intercepting it unless you had the other particle. This technology is not some kind of far off sci-fi vision of the

future either; we are already seeing the beginnings of it today. China recently made headlines when it launched what the country billed as a 'hack-proof' satellite. The technology they are using in the satellite is quantum entanglement, but at present it isn't necessarily doing much communicating; instead, it has been launched to allow for '[A] two-year mission…to develop 'hack-proof' quantum communications, allowing users to send messages securely and at speeds faster than light', according to the Xinhua news agency. It may be some years before this becomes a widespread way of communicating data, but with the developments that China and others are making, it is not so distant that it seems unfeasible. One aspect that few are talking about is that it may actually speed up the development and spread of quantum computing, as some of the security concerns that many have about the technology would no longer be relevant. For instance, Konstantinos Karagiannis, BT Security's Global Technical Lead of Ethical Hacking, told Vice in 2014 that quantum computing 'will instantly change the world of encryption.' This is because the speed of quantum computers will allow them to quickly get through current encryption techniques such as public-key cryptography (PK) and even break the Data Encryption Standard (DES) put forward by the US Government. If data can only be communicated between two particles in the same state, there is no way that this kind of interception could happen. Although this wouldn't necessarily be the case once the data has been communicated, as storage on current technologies will still use these kinds of encryption, it stops anything being collected as data moves between two places. This negates a significant amount of the threat and may even have implications for the storage of data in the future. So in the future we may see a concept originally conceived by Albert Einstein making the computers and data transfer of today look like little more than punch cards.
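As a rough illustration of the 'on, off, or both at once' idea described above, here is a tiny single-qubit sketch using a plain state vector and a Hadamard gate; it is not tied to any particular quantum SDK, and it simulates the math rather than real quantum hardware.

```python
# Minimal single-qubit illustration: a state vector, a Hadamard gate, and measurement.
# A classical bit is always exactly [1, 0] ("off") or [0, 1] ("on"); a qubit can be
# any normalized combination of the two until it is measured.
import numpy as np

ket0 = np.array([1.0, 0.0])                  # the "off" state |0>
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate

state = H @ ket0                             # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                   # Born rule: measurement probabilities
print("amplitudes:", state)                  # ~[0.707, 0.707]
print("P(0), P(1):", probs)                  # ~[0.5, 0.5]

# Measurement collapses the superposition: repeated runs give roughly 50/50 outcomes.
rng = np.random.default_rng(1)
print("ten measurements:", rng.choice([0, 1], size=10, p=probs))
```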



The Top 5 Chief Data Officers of 2016 Jake Hissitt, Big Data Journalist THE CHIEF DATA OFFICER (CDO) role has been developing for the past few years, with many companies creating the position to help deal with the huge influx of data they are seeing. It is often an overlooked position compared to some of the more traditional boardroom stalwarts like the CEO and CFO, but plays an essential role in companies today. However, many are doing amazing work and we wanted to pick out 5 who have particularly impressed us in the past year. This isn’t necessarily going to be about new innovative ideas or approaches, but more about what they have achieved and what their role says about them.

DJ Patil
U.S. Chief Data Scientist, White House Office of Science and Technology Policy

In a commencement speech given at the University of Maryland in May 2012, he gave a detailed account of his failings, one of which was taking calculus at a community college: 'It was then I realized that I wasn't just stupid; I was really stupid.'

This 'really stupid' guy has gone on to become arguably the most famous CDO in the world, having worked for LinkedIn, eBay, and the Department of Defense before accepting the role at the White House. Although he may not be the person who comes up with the latest new technologies or innovative new ways of utilizing data, given his work for the government, he is undoubtedly the most highly-visible data leader in the world. So rather than discussing his work, it is worth noting the impact he has just by being himself, continuing to spread new ideas around data and informing data policy in the most powerful office in the world.

Tom Schenk Jr
Chief Data Officer for the City of Chicago

Chicago is one of the most forward-thinking cities in the US regarding its use of data.


They have become known for taking proactive steps in how they use their data. For instance, they have opened up one of the world's most open data portals, giving citizen data scientists the opportunity to make big differences in their city. They have also responded to the accusations leveled at their police department by using data to investigate them and to help prevent similar problems in the future.

Most of this comes from Tom Schenk Jr, Chief Data Officer for the City of Chicago, who has been in the role for the past two years and is central to the city's embrace of data. He also writes in his free time about data and the intricacies of its use. Tom is the epitome of somebody who brings his passion for what he does to his job, and this shows in how the city has embraced data.


Meeta Yadav
Chief Data Officer, Blockchain at IBM

There are two things that have had a huge impact on data in 2016: one has been the increasing prevalence of AI and cognitive computing, and the other is the spreading realization that Blockchain may well be a new dawn for how companies utilize and store their data.

Unsurprisingly, there is one company that has been instrumental in both - IBM. Big Blue is never far away from data innovation; take Watson as a prime example of its drive for constant improvement. They have recognized this with their blockchain efforts and appointed Meeta Yadav as Chief Data Officer, Blockchain. With her background in academia at NC State, followed by work on IoT architecture, CTO Innovation, and as Technical Assistant to the CTO of Middleware at IBM, she is more than qualified for what will potentially be one of the key roles in developing blockchain for the wider data community.

Steven Hirsch
Chief Data Officer at Intercontinental Exchange Group / NYSE

2016 has shown that money genuinely makes the world go round, with shifts in where money is being spent, local jobs being lost to other countries, and huge currency fluctuations impacting companies across the world. At the center of all this unpredictability have been stock market fluctuations, reflecting what these shifts mean and acting as a barometer of the overall state of all public companies.

The NYSE tracks every trade, every stock, and the tiniest movement in every company. It looks at all the potential factors that could have impacted a movement and holds all of the historic data from some of the world's most important stock exchanges. In charge of all this data is Steven Hirsch, a man who has overseen the data initiatives of the organization for the past 9 years, seeing it through recessions and seismic global shifts. Having worked in the position for nearly a decade, Steven is one of the longest-serving CDOs in the data world, and having worked with him several times, we can testify to his knowledge and passion for data.

Helen Crooks
Chief Data Officer at Lloyd's of London

The insurance industry is one that runs on data. Without it, insurers would be unable to evaluate risk effectively, establish culpability, or even make a profit. At the center of this are companies like Lloyd's of London, one of the world's best-known insurance companies. Not only that, but they are also one of the most advanced in terms of advising on insurance. Helen Crooks, the company's CDO, is pivotal to this. She has worked her way up through some of the most advanced data companies in the world, including taking a leading role in one of the defining data projects in the UK of the past 3 decades, the roll-out of the Tesco Clubcard scheme by dunnhumby in the mid-90s. She also worked at Virgin Media and Tesco before taking the helm at Lloyd's of London. Given the current turmoil around the world, with wars and political decisions turning previously stable markets into roller coasters, it is testament to her experience and skill that she is excelling at such an important data-driven company looking to predict every change.



Companies Merging For Data Should Beware Privacy Concerns Olivia Timson, Analytics Evangelist

THE IMPORTANCE OF DATA has long been recognized by organizations of all hues, particularly those in the tech sector, for whom it is often their lifeblood. This value is now being reflected in a number of recent mergers and acquisitions, with the decisions of AT&T and Time Warner, Verizon and Yahoo, and Microsoft and LinkedIn motivated largely by the monetization of users' personal information, providing additional data with which they can target consumers with content and advertising. AT&T's purchase of Time Warner for $85 billion is the latest such merger, should regulators allow it to go through. Randall Stephenson, AT&T's chief executive, has publicly stated his belief that the benefits of the merger lie in the additional data AT&T can provide to both Time Warner and advertisers about consumer viewing habits, which they can use to tailor specialized, interactive programming for AT&T's mobile customers.

Stephenson noted that: 'We'll develop content that's better tailored to what specific audience segments want to watch, when, where and on which device, and we'll use the insights to expand the market for addressable advertising. Addressable advertising is far more effective and more valuable both to the advertisers and to our customer.' AT&T's decision to buy Time Warner for its data follows an emerging pattern in M&A, with Microsoft's purchase of LinkedIn for $26.2 billion earlier this year also said to have been driven primarily by access to data. LinkedIn's 450 million users are, arguably, Microsoft's core demographic, and the enormous amounts of data they generate could yield insights and products Microsoft could use to monetize its investment in LinkedIn. For example, Microsoft's digital assistant Cortana could search your LinkedIn network to find out who is going to be at your next meeting and brief you accordingly.


Is the future for M&A one in which companies come together simply to pool their data resources? Privacy campaigners such as Jeffrey Chester of the Center for Digital Democracy seem to think so, but they also believe there are dangers. Chester has argued of the AT&T/Time Warner deal that, 'The goal of the next generation of Big Media mergers: bringing together under a single entity massive broadband network connections and vast production and content capabilities, along with sophisticated data-mining operations that deliver micro-targeted ads'. Dimitri Sirota, CEO of BigID, has also warned that LinkedIn has a 'deep insight into the majority of professionals in terms of relationships, interests, satisfaction with their work, intentions to leave their work,' noting that, 'Mergers complicate how personal data is shared across franchises. These businesses will suddenly have new personal data it can leverage, begging questions around controls.'

Microsoft Chief Executive Satya Nadella said of the LinkedIn buyout that, 'Nothing will get connected or linked without users opting in', yet at the same time he said one result of the merger would be that they could use machine learning on user data to generate more recruitment leads and help drive B2B sales, indicating that privacy may not be his priority. The recent decision by the Federal Communications Commission (FCC) to ban broadband internet service providers (ISPs) from automatically using large amounts of a person's information on behalf of marketers and advertisers without a user's say-so, essentially enshrining Nadella's commitment to ensure opt-in into law, may go some way to assuaging such concerns. Traditionally, the theory has gone that customers are willing to hand over their data if they feel they are getting something in return. But does this theory hold water if the data is being shared with a company from whom they are not getting anything? And do customers trust companies enough to ensure their data is secure during a merger? At the moment, there has been little public outcry outside of privacy groups, but as this becomes more of an issue, it may be that companies see an increase in resistance from users.



7 Myths About Blockchain Nick Ayton, Digital Disruptor & BOOMer, Digital BOOM

The Blockchain (ledger) itself is not immutable. It is a myth.


THERE ARE MANY MYTHS circulating about the Bitcoin Blockchain, most of them propagated by people that assume too much and make rapid judgments without really knowing… Pretty normal, you might think, for an emerging disruptive technology to provoke such extreme reactions and opinions. It is obvious that some don't want Blockchain to succeed and that others are still scared of it, looking only to defend themselves. And then there are the opportunists looking to make money from Blockchain by pretending they have the answers, even though 6 months ago they couldn't spell Blockchain. So here are some of the common myths you will hear and read about Blockchain…

ONE: Blockchain is Immutable – WRONG.

The Blockchain (ledger) itself is not immutable. It is a myth. Immutability in fact comes from the expenditure of effort (in this case power and computing resources) relating to the Proof of Work algorithm, where the difficulty increases and the Bitcoin reward for Miners is reduced every 4 years (currently 12.5). The ultimate visible example of Proof of Work is the Pyramids in Egypt, which took decades to complete; those that took on the project had to feed and water the hundreds of thousands of people who, in turn, moved mountains. The finished structure demonstrates the completed work, encompassing extreme difficulty and expenditure of resources. This is very different to the Proof of Stake investment/vested approach. Miners not only expend huge effort and resources, they also invest significant amounts of capital in Mining equipment to mine the mainstream crypto-currencies. Miners make the investment in computing and electrical power to solve the increasingly challenging (difficult) PoW algorithm (SHA256) to try to get a match, called Hashing. This is used to verify transactions and enable a new Block

to be written (taking 10 minutes), for which they get a reward. It is a race, and mining power has a direct impact on the result.

TWO: Smart Contracts are Smart and they are Legal documents – WRONG again

Smart Contracts are dumb. They are not contracts at all. They are scripts, software code deployed onto the Blockchain at a particular address (data store), that follow simple instructions, normally triggered by events, e.g. IF, THEN statements. They are normally written as a transaction instruction and rely on the computational capabilities of the Ethereum Blockchain. Smart Contracts eliminate the need for individuals to handle time-consuming and costly business processes. They are autonomous and, once loaded, cannot be stopped or altered. Like a virus, they can operate alone (autonomously) or in conjunction with other Smart Contracts and Data Stores as Oracles, and interoperate with other legacy systems. Smart Contracts are not contracts in any legal sense, nor will legal contracts be part of Smart Contracts. But they are capable of executing terms (as instructions) that may reside in an agreement between parties, to make a payment or move entitlement/ownership and transfer funds. They form part of the business logic layer that links with the process logic to form what become unintelligent groups of transactions. Smart Contracts are still emerging, and their development and deployment is very complex. They are vulnerable to attacks/hacks, and they are where most of the recent problems, such as the DAO, have been. Smart Contracts carry transaction instructions and become layers and groups of Smart Contracts that work together to form a new generation of Decentralised applications, or Dapps. Smart Contracts, along with Keys (Public and Private digital fingerprints),

play an increasingly important role in the design of Blockchain Operating Models, where core business processes are automated using embedded Smart Contracts. They may eventually be embedded as firmware into physical things in an IoT world, with everything written to a ledger of Everything. Smart Contracts form the business logic layer that directs the transaction traffic between the participants. A Smart Contract does not have any legal status, and the legal position is largely irrelevant. But what happens should a Smart Contract do something the parties had not intended? Make a wrong payment? Who is responsible? Was it hacked, or was it just poor coding?

THREE: The Bitcoin network can be shut down – WRONG

Bankers and purists generally don't like Bitcoin, primarily because of its openness and transparency, and because it is generally free (micro fees), arguing that its anonymity allows misuse. Bitcoin is not owned by anyone. The Bitcoin Foundation provides oversight but not control. Bitcoin is now 7 years old and last week broke through the $640 USD resistance level. It is already mature. The code is Open Source, and anyone can download it and set themselves up as a Miner to mine Bitcoin. Others may download Bitcoin clients onto their smartphones as Bitcoin Wallets to transact Peer 2 Peer with whomever they want, passing Bitcoin tokens across the network and not via any central control or body, like a clearing bank or central authority. It is frictionless, and the transaction is guaranteed and immediate. Bitcoin is for the people, run by the people, and, like the Internet, cannot be shut down - although there are, of course, efforts to police it, and others that want to control it. It is beyond the central authorities (Regulators, Clearing Banks and some Governments) that see the Libertarian threat of Bitcoin and fear it. It is censorship resistant, without geographical boundaries. Many governments are starting to see the potential and want to be part of its



evolution, as sovereign governments race to be first to issue debt on the Blockchain and deploy their own crypto-currency. To be one of the first matters. The other myth is that once the 21 million Bitcoins are mined the currency will collapse and people will lose their value. The last coins will be mined in 2140, and given there are 8 decimal places to play with, the Bitcoin crypto-currency will do just fine.

FOUR: There are 20 or 30 Crypto-currencies in circulation – WRONG again

In fact there are over 800 crypto-currencies, and within a few years there will be 5,000 to 10,000. There are many Crypto Exchanges around the world, and the number is increasing all the time, with more than 20m Bitcoin Wallets now handling multiple digital and fiat currencies. The leading crypto-currencies remain Bitcoin, which has a market cap approaching $12 billion, and Ethereum, with a market cap of $1 billion, followed by Ripple, Litecoin, Ethereum Classic, Monero, Dash, Augur, MaidSafeCoin, Nem, Waves, Steem, Dogecoin, Factom, DigixDAO, Lisk, Gulden, Synereo, Storjcoin and many, many more. www.coinmarketcap.com is a great place to find them. Serving the purpose of a 'digital tokenized rail', they are based on many years of cryptography thinking, recent innovations, and the Computer Science breakthrough that created Distributed Ledger Technology, or DLT, which for me takes the brakes off commerce. They are an integral part of the core design, and their use depends on the business outcome sought. Each crypto-currency is used for a different purpose, forming the basis of the security model. The crypto keys are part of the transaction to be validated by the Miners, and each transaction, once Hashed, is held on copies of distributed ledgers held on the network nodes, where


Miners hold a full copy of the Ledger, giving Blockchain its zero-downtime attributes. Some are designed as exchange-tradable digital currencies, others as tokens; some reward and deliver functionality as part of forked code that attempts to deliver a specific level of performance. Zerocash, or Zcash (which starts mining now), attempts to offer what Bitcoin has but with more anonymity. Others are part of a forked design to improve scaling or to support a different consensus algorithm. Many crypto-currencies will never reach the volumes required to become mainstream. ETHER, or ETH, was really designed as part of the Ethereum deployment to reward Miners and Coders for their computational efforts, known as Gas. Hence the expression 'gassing up' your Smart Contracts: inefficient and poorly written code is not only computationally more expensive, it is also more open and vulnerable.

FIVE: Bitcoin and Ethereum are the same – WRONG

Bitcoin was designed as a Peer 2 Peer financial system using core code based on the Satoshi paper of 2008. It was designed as a borderless financial (payment) system beyond the control and manipulation of central banks, governments, and retail banks that take disproportionate fees for doing very little. Bitcoin was the first and is the largest Public Blockchain. Ethereum forked the original open source code and went live in 2015 with its own genesis block, when Vitalik Buterin engineered it to deliver a complete development environment around a global computational machine that may one day become the new Internet. Ether arrived in Jan 2016 and things really took off. Ethereum offered for the first time a Turing-complete language (a development environment) to build out the architecture for Smart Contracts that can run as a Blockchain Operating Model. Ethereum delivers the future

...there are over 800 crypto-currencies and within a few years there will be 5,000 to 10,000.


of commerce, where the crypto-currency as a financial rail is a consequence and not the focus. They do share common features: they started from the same source code, and each is at once a network (P2P), a currency, and a technology.

SIX: The Bitcoin Blockchain has been hacked – again, technically WRONG

The underlying Bitcoin network has not been hacked. Bitcoin Exchanges have been hacked - Mt Gox comes to mind - and some were run by people whose motives may not have been open. Smart Contracts have been hacked to change their logic outputs, in the main to divert funds to a different address - the DAO, for example. Cold stores holding key information have also been hacked, allowing hackers to get access to your Bitcoin Wallet/Account and remove the coins. People who own and trade Bitcoin and ETHER keep a wide range of accounts, Wallets, and Keys to spread the risk that hackers will log their Public and Private Keys. With each fork things are improving as preproduction Use Cases emerge.

SEVEN: Blockchains cannot be linked together – WRONG

Apart from a host of activity to patent aspects of Blockchain design, largely by banks, much of the Blockchain community remains Open Source and committed to improving the underlying performance of the Blockchain code so that it may scale. Many Blockchain start-ups are doing amazing things to improve the performance and usability of the original code. ConsenSys, Eris Industries (now Monax), and Tendermint are achieving great things.

The big remaining activity is to link Chains developed and built in different languages and structures (consensus, voting, approaches to Identity, etc.), and enabling the passing of Tokens (currencies/value) between each Blockchain remains complex. Two leading examples of this effort are Interledger and Cosmos Hub (Tendermint), projects that are linking chains together to create Blockchain ecosystems and communities, which is very useful when deployed as an Industry Solution.

Summary

These are the popular myths. It is inevitable with any new technology that there is broad confusion, and even more so with Blockchain, which is very complex and involves very smart people designing, developing, and building a new future where the rules of trade are different and where organizational operating models function differently.

Smart Contracts are dumb. They are not contracts at all

The recent forking activity is a natural evolution process, but again those that aren't fans point the finger at Blockchain, calling it high risk, claiming it can be hacked, and saying Bitcoin is a turbulent currency despite it outperforming everything. Blockchain is a set of building blocks, a development environment in which to be creative, to re-design commerce, and to invent new ways of solving business problems. However, for me, the most compelling thing about Blockchain is that it delivers the opportunity to build new Operating Models designed for competitive advantage in terms of cost and efficiency. Every industry, every new invention, anything that people don't fully understand, is surrounded by myths: non-facts presented as facts, non-truths presented as definitive information. But none of this matters to Blockchain, because the people that know, know, and we in the Blockchain community are pleased about that!
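To make the Proof of Work idea from myth ONE a little more concrete, here is a toy sketch of the hashing race described there: repeatedly hashing a block's contents with SHA-256 until the result meets a difficulty target. The block contents and difficulty are made up for illustration; real Bitcoin mining double-hashes a binary block header against a vastly harder target.

```python
# Toy Proof of Work sketch: find a nonce whose SHA-256 hash of the block contents
# starts with a given number of zero hex characters. Illustration only; real Bitcoin
# mining uses double SHA-256 over a binary header and a far harder difficulty target.
import hashlib

def mine(block_data: str, difficulty: int):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

block = "prev_hash=abc123;tx=alice->bob:1.5;reward=12.5"   # made-up block contents
nonce, digest = mine(block, difficulty=4)
print(f"nonce found after {nonce + 1} attempts: {digest}")

# Verifying the work takes a single hash, which is what makes tampering expensive:
# change one character of the block and the hash bears no resemblance to the old one,
# so the whole search has to be redone.
tampered = hashlib.sha256(f"{block.replace('1.5', '9.5')}|{nonce}".encode()).hexdigest()
print("same nonce, tampered block:", tampered)
```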
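The 'dumb scripts triggered by events' description in myth TWO can be sketched in the same spirit. This is only an analogy in ordinary Python: real Smart Contracts are written in languages such as Solidity, deployed to an address on the chain, and executed by the network rather than by any one party. The escrow rules, parties, and amounts below are invented for illustration.

```python
# Analogy only: a "smart contract" as a dumb IF/THEN script bound to some stored state
# and triggered by events. Real contracts live at a Blockchain address and are executed
# by the network; the escrow logic and parties here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class EscrowContract:
    buyer: str
    seller: str
    amount: float
    delivered: bool = False
    paid_out: bool = False
    log: list = field(default_factory=list)

    def on_event(self, event: str) -> None:
        # The whole "contract" is just condition/action rules it follows blindly.
        if event == "goods_delivered":
            self.delivered = True
            self.log.append("delivery confirmed")
        if self.delivered and not self.paid_out:          # IF delivered THEN release funds
            self.paid_out = True
            self.log.append(f"release {self.amount} to {self.seller}")

contract = EscrowContract(buyer="alice", seller="bob", amount=1.5)
contract.on_event("goods_delivered")
print(contract.log)   # ['delivery confirmed', 'release 1.5 to bob']
```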




FOR MORE GREAT CONTENT, GO TO IEONDEMAND

www.ieondemand.com

Over 4000 hours of interactive on-demand video content

There are probably dozens of definitions for the single job of Head of Innovation and with them dozens of perspectives on how it should be done. Without any official credentials on the subject, I was asked to give my personal account of running an innovation team in the context of an innovation-hungry organisation that started on the high street and has grown to employ 16,000 people over 80 years. In the past year or so I have learned that when it comes to innovation, culture trumps everything and there really aren't any rules. In order to get by, I stick to some guiding principles and lots of gut feel. Join me for an honest and straightforward perspective on a modern job without a…

What would happen if a company funded every new product idea from any employee, no questions asked? As an experiment, Adobe did exactly that. In this session, Mark Randall shares the surprising discoveries Adobe made in creating Kickbox, the new innovation process that's already becoming an industry model for igniting innovation. Each employee receives a mysterious red box packed with imagination, money and a strange game with six levels. Learn why the principles behind Kickbox are so powerful, why Adobe is open sourcing the entire process and how any organization can tap these principles to ignite innovation.

Mark Randall's serial entrepreneurial career conceiving, designing and marketing innovative technology spans nearly 20 years and three successful high-tech start-ups. As Chief Strategist, VP of Creativity at Adobe, Mark Randall is focused on infusing divergent thinking at the software giant. Mark has fielded over a dozen award-winning products which combined have sold over a million units, generated over $100 million in sales and won two Emmy awards. As an innovator, Mark has a dozen U.S. patents, he's been named to Digital Media Magazine's "Digital Media 100" and he is one of Streaming Magazine's "50 Most Influential People."


Stay on the cutting edge: innovative, convenient content updated regularly with the latest ideas from the sharpest minds in your industry.

