Big Data Innovation, Issue 23


THE LEADING VOICE IN BIG DATA INNOVATION

BIG DATA INNOVATION JUL 2016 | #23

Big Data Opening Up The Human Genome | 10


Data Scientists Are The Next UX Designers: Chatbots and virtual assistants are going to change how we interact with technology, and data will be the new UX design | 15

The Importance of Big Data After Brexit: Great Britain recently voted to leave the EU in a decision that has rocked the world, but data could now underpin a system based on facts | 22


Boston Data Festival September 8 & 9, 2016

SUMMITS: Big Data Innovation | Data Visualization | Internet of Things


+1 415 614 4191 jc@theiegroup.com www.theinnovationenterprise.com


ISSUE 23

EDITOR’S LETTER Welcome to the 23rd Edition of the Big Data Innovation Magazine

We are currently at a strange time regarding the use of data. On the one hand, it is having a huge impact on our lives, changing the way we shop, how we interact, the things that are advertised to us, and even how we move around the country. On the other, we are seeing major decisions and opinions being formed in defiance of the evidence presented by data. As Britain voted to leave the EU despite evidence showing that it was going to have a negative impact on the country as a whole, Michael Gove attempted to discredit the data presented by experts by saying, ‘I think people in this country have had enough of experts.’ This wasn’t a slip of the tongue either: Arron Banks, the largest donor to the Leave campaign, stated that the campaign team decided early on that ‘facts don’t work, and that’s it. The Remain campaign featured fact, fact, fact, fact, fact. It just doesn’t work. You have got to connect with people emotionally. It’s the Trump success.’

The ‘Trump success’ that Banks refers to is Donald Trump’s strong polling in the race to become the next president, despite Politifact finding that only 25% of what he has said during his presidential campaign has been even half true. In fact, some of his biggest claims, such as ‘Don’t believe those phony numbers when you hear 4.9 and 5 percent unemployment. The number’s probably 28, 29, as high as 35. In fact, I even heard recently 42 percent,’ and that the 2016 federal omnibus spending bill ‘funds illegal immigrants coming in and through your border, right through Phoenix,’ have been shown to be completely false.

It is an interesting dichotomy: facts and truth aren’t working in political discourse, yet politics is perhaps the element of people’s lives that should be most filled with truth and data.

Politicians essentially need to catch up with businesses in realizing that it is not necessarily about what the data shows, but about how it is presented and understood. We are currently at a crossroads, with some saying that we live in a post-factual society, whilst the wealth of data around us suggests otherwise. It now needs to be the responsibility of those in power to make sure data is presented effectively and without spin; otherwise, we will continue to see confusing and frustrating results from confused and frustrated voters.

George Hill, Managing Editor


Big Data & Analytics Innovation Summit September 14 & 15, 2016 Sydney


+61 (2) 8188 7345 vhernandez@theiegroup.com BD Analytics Sydney ie Group



contents

6 | 4 QUESTIONS FOR STEPHEN DILLION, SR ENGINEERING FELLOW / BIG DATA ARCHITECT AT SCHNEIDER ELECTRIC
We discuss IoT and big data with Stephen Dillion ahead of his presentation at the Big Data Innovation Summit in Boston

10 | BIG DATA OPENING UP THE HUMAN GENOME
The human genome has the potential to open up huge opportunities for healthcare, but we need big data to unlock it first

12 | BIG DATA’S A BUST: SMALL DATA PROVIDES CONTEXT TO USER BEHAVIOUR
With the ultimate goal of big data being to create small data, we ask whether big data can function without it

15 | DATA SCIENTISTS ARE THE NEXT UX DESIGNERS
Chatbots and virtual assistants are going to change how we interact with technology, and data will be the new UX design

17 | DEEPMIND’S WORK WITH THE NHS SHOWS IT IS GOOGLE’S BEST PURCHASE
Despite questions being raised about the reasons for the purchase of the company, we are now seeing the full potential it may have

20 | HOW DATA IS BEING USED TO CONTAIN THE SPREAD OF ZIKA
The Zika virus is all over the news, destroying lives and putting unborn children in danger, but could data stop its spread?

22 | THE IMPORTANCE OF BIG DATA AFTER BREXIT
Great Britain recently voted to leave the EU in a decision that has rocked the world, but data could now underpin a system based on facts

WRITE FOR US
Do you want to contribute to our next issue? Contact ghill@theiegroup.com for details

ADVERTISING
For advertising opportunities, contact achristofi@theiegroup.com for more information

managing editor george hill | assistant editor james ovenden | creative director nathan wood

contributors gabrielle morse, laura denham, chris davidson, alex lane, megan rimmer


4 Questions For Stephen Dillion, Sr Engineering Fellow & Big Data Architect at Schneider Electric
Gabrielle Morse, Organizer, Big Data Innovation Summit

AHEAD OF HIS PRESENTATION AT THE BIG DATA INNOVATION SUMMIT in Boston on September 8 & 9, we spoke to Stephen Dillion about his work as a Sr Engineering Fellow / Big Data Architect at Schneider Electric. Stephen is a senior member of Schneider Electric’s engineering fellow program, with over 17 years of experience in the data engineering field. He has been working with IoT since 2009, Big Data technologies since 2011, and Fast Data technologies since 2012, and has spoken at numerous conferences and events on the subjects of in-memory databases, MongoDB, and NoSQL. He is currently working on IoT innovation with a focus on Fast Data technologies for Schneider Electric’s Global Solutions team, which is responsible for developing the company’s IoT platform.


1

Innovation Enterprise: How do you see the use of data in companies developing over the next 5 years?

Stephen: We can expect streaming, in-memory databases, and graph analytics to become commonplace as fast data solutions are embraced in support of real-time decisioning. This will prepare companies to implement predictive analytics and deep learning, which will be expected to be part of enterprise architectures, especially in the IoT industry. Today, predictive analytics is the Holy Grail for companies, but most are simply not ready for a multitude of reasons: either their use cases are not understood well enough, their employees are not up to speed on the technical topics, they’ve not identified the value proposition for investing in this area, or they’re still early in the adoption process of big data solutions. Of course, we see some companies at the forefront of IoT today that are utilizing predictive analytics, but the gap between these companies and the rest is wide.

2

What should companies do to improve their data security in the wake of several widely reported hacks over the last two years?

Learn, adapt, and be aware. Companies must invest not only in a technological effort but also in one of social awareness. From the technology perspective, companies must be able to react faster to potential threats, especially as devices become more connected to networks and to other devices. It is fair to say that new, unforeseen threats will emerge as a result - threats that human intervention and today’s reactive responses alone cannot sufficiently handle. Machine learning is one area companies can look to: it can play a role in preventing many such breaches, or at least identify threats and intervene as they happen, as opposed to after they’ve happened. But even these technical approaches only go so far. If people are not aware of the possible threats of social engineering, one employee can cause a multi-million dollar system’s security to be breached.
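As a concrete illustration of the kind of machine learning Dillion describes, here is a minimal sketch that flags unusual login events with scikit-learn’s IsolationForest. It is purely illustrative: the features, data, and thresholds are invented, and a production intrusion-detection system would use far richer signals.

```python
# Minimal anomaly-detection sketch: flag unusual login events.
# All features and data here are invented for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Each row: [hour of login, MB transferred, failed attempts before success]
normal_logins = np.column_stack([
    rng.normal(10, 2, 500),   # mostly daytime logins
    rng.normal(50, 15, 500),  # typical transfer volumes
    rng.poisson(0.2, 500),    # the occasional mistyped password
])
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_logins)

# A 3 a.m. login moving 900MB after 7 failed attempts looks suspicious.
print(model.predict(np.array([[3, 900, 7]])))  # [-1] means flagged as anomalous
```

The point is not the specific model but the workflow Dillion hints at: learn what ‘normal’ looks like from historical events, then intervene on outliers as they happen rather than after the fact.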

In most cases there is no absolutely correct answer regarding which of the myriad technologies will meet your needs, but there are absolutely incorrect answers if applied to the wrong use case

3

What is your outlook on the amount of regulation of data from governments?

There are so many concerns today regarding the security of data, especially in the IoT domain, that the amount of regulation can only increase. Where and how that regulation will be implemented is the real question the industry is watching. Simply locking data down - encrypting everything in flight and at rest and enforcing in-depth access control practices - will not be sufficient; even the greatest banks with the most secure of vaults are still guarded by other security measures. However, if data becomes too difficult to work with, due to too many or overly harsh constraints, its full potential will not be achieved. Security must be fluid, and this is where artificial intelligence can play a role.



4

What advice would you give to a company making their first moves into implementing a new data program?

First and foremost, you must understand your use case(s). Your use cases will define the value for your business; if you cannot identify the value, you may want to question the purpose of your data program’s initiative. Without understanding the value, you will have no way to measurably determine success. Knowing your use cases well will also help narrow down your technology choices. In most cases there is no absolutely correct answer regarding which of the myriad technologies will meet your needs, but there are absolutely incorrect answers if applied to the wrong use case. Be comfortable in the fact that there will be tradeoffs between different solutions, and only you can weigh the pros and cons against your needs.

Second, technology should not drive your business decisions, but eventually you must implement technologies that will support your business. There is no one-size-fits-all solution, and you must be ready to cultivate and acquire the skills necessary to support your data initiative with a mixture of specialists and generalists. These must be open-minded technologists who can adapt to evolving use cases and technologies. Otherwise, you’ll have a team ill-prepared to implement a solution when the technologies must change.

Finally, be willing to fail fast and fail forward. This is still a practice that I observe many in the industry are afraid to embrace. If something is not working, or your use cases have evolved beyond the usefulness of the chosen technologies, then it is a foolish and stubborn person who will refuse to make the necessary changes.

Stephen will be speaking at the Big Data Innovation Summit on September 8 & 9

Today, predictive analytics is the Holy Grail for companies, but most are simply not ready for a multitude of reasons


Smart Cities Innovation Summit October 19 & 20, 2016 San Francisco

+1 415 614 4191 jc@theiegroup.com Smart Cities San Francisco ie Group


Big Data Opening Up The Human Genome
Laura Denham, Organizer, Big Data Innovation

WE HAVE TALKED ABOUT THE USE of big data in medicine consistently across the last few years. It has very much been at the heart of modern medical discoveries, and its future is bright. However, we are currently at the very start of the process, and it will not be until we can harness the power of data in our healthcare that we can truly make the most of the early potential it has shown. One of the most important elements of this is going to be the mapping of the human genome, which has created a significant opportunity for medicine. Theoretically, it could show exactly what may happen to somebody with a specific genomic sequence: which diseases they are at risk from, ailments that others with a similar makeup suffer from, and the best treatments if they do become ill. However, the issue this brings is that the human genome contains 3 billion base pairs, with a considerable number of these varying in each person. They can change the smallest detail of a human being too, from whether somebody can roll their tongue through to the chances of them suffering from heart disease.

This challenge is where big data is currently having a huge impact, and will have an even bigger impact in the future. The ability to identify not just these links, but also the elements that could create the links, is going to stem from big data and analytics. David Delaney, Chief Medical Officer at SAP, believes that ‘There are so many factors that will cause expression or non-expression of a gene. And when you add in factors around a patient’s environment, it turns out that there are so many different pieces to this. The hope right now is that once we get more of the puzzle pieces out there, we will be able to start making the links that will eventually lead to major breakthroughs in things like cancer.’

Ultimately, cancer is not a single disease... It’s a constellation of different diseases that you can subdivide based on organ type or tissue pathology, but you can also divide it on the basis of their genetic changes

One of the key diseases that this kind of data is going to impact most is cancer. In an article in Health IT Analytics, Marcin Imielinski, MD, PhD, a Core Member and Assistant Investigator at the New York Genome Center, pointed out that ‘Ultimately, cancer is not a single disease... It’s a constellation of different diseases that you can subdivide based on organ type or tissue pathology, but you can also divide it on the basis of their genetic changes.’ This makes it even more complex, given that there are over 100 types of cancer, yet in the current system they are treated as a single disease. Through new data technologies, algorithms, and filtering, it is possible to treat them as genuinely different diseases; by recording genome sequences and different reactions to drugs, it becomes possible to treat cancers more effectively, or to take preventative steps against cancers that are more common among those with similar genetic sequences.

When the datasets become large enough (i.e. when we have a considerable number of mapped genomes), this may become the largest and most highly valued single dataset in the world. A single human genome is around 200GB, so if we were to theoretically sequence the entire population of the world, it would work out at 1,200,000,000,000GB, or 1.2 zettabytes of data, which would require a huge amount of computing power to process. Even if we could only sequence 10% of the world, this number would still pose many challenges in management, but significant opportunities in finding these links more accurately. However, according to Reid J. Robison (a physician who has become a data scientist), only around 0.1% of the data within each genome - roughly 125MB - comprises mutations that can cause different characteristics and diseases, meaning a dataset of only 750,000,000GB, or 750 petabytes, for the entire world. Despite the huge numbers involved, this would be far more manageable and gives scientists an even better chance of blockbuster breakthroughs.
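The storage arithmetic above is easy to sanity-check. The short Python sketch below reproduces the article’s figures; note that a population of six billion is an assumption implied by the quoted totals rather than a stated input, and decimal byte units are used throughout. On those figures, the variants-only dataset works out roughly a thousand times smaller than the full 1.2 zettabytes.

```python
# Back-of-the-envelope check of the genome storage figures quoted above.
GB = 10**9                          # decimal gigabyte, in bytes
GENOME_SIZE = 200 * GB              # one sequenced human genome (~200GB)
VARIANTS_PER_GENOME = 125 * 10**6   # ~125MB of meaningful variants per genome
POPULATION = 6_000_000_000          # assumption implied by the article's totals

all_genomes = GENOME_SIZE * POPULATION
variants_only = VARIANTS_PER_GENOME * POPULATION

print(f"All genomes:   {all_genomes / 10**21:.1f} zettabytes")   # 1.2 zettabytes
print(f"Variants only: {variants_only / 10**15:.0f} petabytes")  # 750 petabytes
```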

Only around 0.1% of the data within each genome - roughly 125MB - comprises mutations that can cause different characteristics and diseases

We are not yet at a stage where these are going to be genuine practical solutions, but we are getting closer. Although big data will have its most profound impact when we hit critical mass, it is already providing the key to finding out what we will need to look for when we get there.



Big Data's a Bust: Small Data Provides Context To User Behaviour
Chris Davidson, Product Marketing Manager, M-Files

HAVE YOU EVER WONDERED WHAT YOUR CLIENTS think about your product? Did you ever ponder why they picked your product rather than your competitor’s? Have you ever speculated why one particular software deployment at your company failed while others have gone very well? Over the past several years we’ve been promised that Big Data has the answers to these questions, but many of us are finding this is just not true. One of the biggest problems with Big Data is that it keeps getting bigger, so it seems as though we are drowning in our own information.



Small Data Provides Context for Big Data

Business consultant and author Martin Lindstrom suggests that we flip the script on big data, since most of us lack the tools and the time. In his book, Small Data: The Tiny Clues that Uncover Huge Trends, Lindstrom suggests that while we may have a wide range of data points about our customers’ behavior, what we are missing is the ‘small data.’ He defines small data as the context and connections the human brain makes when examining data points. Lindstrom strongly advocates that ‘we must not ignore fleeting but instructive glimpses of human behaviour in our haste to download and analyse large collections of cold information.’ I see the concept of small data as allowing humans to condense volumes of big data into manageable and actionable chunks of information. A computer can provide an organization with information about a client’s behavior, but it takes a human to take that information and create a new direction in product development.

Agile Narratives

As a proponent of agile project management, Lindstrom’s small data thesis strikes a chord with me. Fundamentally, he is building a narrative around user behaviour. This narrative is known in agile parlance as a ‘user story.’ A user story describes how a customer or user employs the product, and is written from the perspective of the user. If Lindstrom were to create a user story, I’m theorising he would gather the mechanical data behind user behaviour - clicks, page views, downloads, buying patterns, etc. He would then interview the clients to glean the ‘whys’ behind the patterns in the data. Finally, he would do more wide-ranging research in order to place the hard data and interview content into cultural context.

Business consultant and author Martin Lindstrom suggests that we flip the script on Big Data since most of us lack the tools and the time

Extrapolating Lindstrom’s framework beyond external clients and applying it to internal clients for successful enterprise content management (ECM) deployments could prove particularly effective. Analyst firms like Gartner Research laud ‘contextual relevance’ as the future of the ECM marketplace. When I ponder the phrase ‘contextual relevance,’ my take is that it means providing the user with the ability to use content and data to create the narrative (user story) they need to make a client happy or complete a project successfully. Hence, I see the entire purpose of an ECM system as providing a large portion of that narrative.

ECM Integration with Existing Business Systems Gets You Contextual Nirvana I do want to mention that it’s almost impossible to deliver on contextual relevance or small data without effectively and efficiently integrating with business systems such as ERP or CRM. Furthermore, these integrations should be direct, dynamic and bidirectional. Integration is important because it provides the connection between data, documents and non-document objects. These linkages serve as a portion of the context to allow users to reduce big data into small data.



Want to contribute?

channels.theinnovationenterprise.com/authors contact ghill@theiegroup.com


Data Scientists Are The Next UX Designers
Laura Denham, Organizer, Big Data Innovation

TRADITIONALLY WHEN WE THINK OF UX DESIGN, it is around how people design websites, apps and the layout of digital products. It allows people to move through the digital space as seamlessly and efficiently as possible, all within an attractive and well-designed environment. It has as much to do with aesthetics as it does with the way that customers or users interact with a site. This has meant that it is generally seen as a creative aspect of the business, with more in common with traditional design roles. However, the world of UX is beginning to diversify, with the increased use of chatbots and personal assistants creating a new field of UX. Here, the idea isn’t necessarily to make sure that things look good, create a customer journey, or make a site look clean; it is to interact effectively with somebody asking a question of the technology. This move puts the onus of UX not on looks, but on knowledge and understanding of the question being asked. Having a chatbot that looks good but can’t understand many of the questions asked of it is the equivalent of having a really nice looking taxi with a taxi driver who doesn’t know their way around - it may be well designed, but you are still going to get frustrated when you don’t get where you want to go.

the world of UX is beginning to diversify, with the increased use of chatbots and personal assistants creating a new field of UX

This new form of UX is being driven by data, AI, and cognitive computing, meaning that those who can understand and implement data-driven insights will become some of the most important members of these new UX teams. We are equally going to see an increasing number of copywriters becoming integral to their development, with Anna Kelsey (who was hired straight out of
Harvard), the AI interaction designer at x.ai - a startup creating a chatbot - saying, ‘The whole idea of creating a character, and thinking very technically about the way specific words or groupings of words can make people react and respond, is something I thought about all the time in college.’ It will also empower those who maybe before had a bit part in company perception, with Ben Brown, co-founder of Howdy - a chatbot that runs within Slack - claiming that, ‘All of a sudden [microcopywriters are] the king because it’s nothing but microcopy now. That little form validation error message, or whatever, is now the full and total sum of your brand’s representation [in this interface].’

However, although the tone of the bot and how it says things will be important, the most important thing will be how it works, which is where data takes the lead. Firstly, the ability to find answers to questions by mining billions of web pages, APIs and social media feeds has reached a stage where it brings up the correct answer more often than not, compared to fairly limited capabilities only a couple of years ago. We have seen through the development of cognitive computing technologies, like IBM’s Watson, that the practical uses aren’t a sci-fi future, but very much a current proposition. However, it is not simply about answering questions, but about analyzing the questions in the first place through semantic analysis. We may not be at a stage where a virtual assistant can understand the complexities of tone, like sarcasm or anger (the word ‘sick’, for instance, could have multiple meanings to a person, but only one to a virtual assistant), but through AI, sentiment analysis and voice analysis of millions of interactions and outcomes, these systems are building a better understanding of meaning, even in colloquial speech.

The big challenge that currently exists is the use of these not as standalone apps, but within and alongside other apps. This increases the reliance on data-driven insight, but also opens up the technology for even more uses. For instance, it would be possible for a virtual assistant to set reminders about birthdays from Facebook, appointments from calendar applications, and order food knowing your preferences from your previous online shopping experiences. It would essentially be able to create an entire dataset focused on the individual, far superior to anything that a single company could collect at present. This will create the ultimate seamless user experience, where the technology can learn about the individual elements of people’s lives; but with this amount of information, the only people who can truly utilize it will be those who can already work with huge amounts of data.
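To make the data dependence concrete, here is a deliberately tiny sketch of the kind of intent matching that sits underneath a chatbot. Every intent, phrase, and name is invented for illustration; real assistants replace this bag-of-words scoring with semantic models trained on millions of interactions.

```python
# Toy intent matcher: score an utterance against known intents using
# bag-of-words cosine similarity. All intents and phrases are invented.
from collections import Counter
import math

INTENTS = {
    "set_reminder": "remind me to set a reminder for an appointment or birthday",
    "order_food": "order food delivery dinner pizza hungry",
    "get_weather": "weather forecast rain temperature today",
}

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(utterance: str) -> str:
    words = Counter(utterance.lower().split())
    return max(INTENTS, key=lambda i: cosine(words, Counter(INTENTS[i].split())))

print(classify("can you remind me about my mum's birthday"))  # set_reminder
```

A matcher like this fails on exactly the ambiguity described above - ‘sick’ scores the same in every context - which is why the large-scale sentiment and voice analysis the article mentions matters so much.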


DEEPMIND’S WORK WITH THE NHS SHOWS IT IS GOOGLE’S BEST PURCHASE Alex Lane, Big Data Evangelist

WHEN GOOGLE BOUGHT DEEPMIND FOR $500 MILLION in 2014, it was just one of a number of artificial intelligence (AI) acquisitions, albeit a highly expensive one. Since then, DeepMind has taken center stage in the tech giant’s push to become the world leader in AI and machine learning, with Alphabet chairman Eric Schmidt labelling the company ‘one of the greatest British success stories of the modern age.’



Much of the work DeepMind has completed thus far may seem frivolous - teaching itself to play classic arcade games like Pong and Space Invaders, for example. However, the ability for machines to beat these games has wide-reaching consequences. Its victory over legendary Go player Lee Se-dol was said by The Verge to be a ‘huge moment in the history of artificial intelligence, and something many predicted would be decades away.’ It also won the 2016 Innovation Lions Grand Prix. DeepMind has committed itself to many projects over the years, but its work with the NHS is perhaps the best evidence of the widespread benefits the firm could bring to the world. The company has now followed up its work on kidney failure alongside the UK’s Royal Free Hospital London by announcing another collaboration, with the world-famous Moorfields Eye Hospital in east London. DeepMind’s AI technology will be applied by Moorfields to one million anonymous OCT (Optical Coherence Tomography) scans, the aim being that the machine learning system will eventually teach itself to recognize, from just one digital eye scan, any conditions that pose a threat to someone’s eyesight, such as age-related macular degeneration and the sight loss that occurs as a result of diabetes.

It is hoped that the research will enable eye care professionals to make faster, more accurate diagnoses. DeepMind claims that patients who receive the correct treatment when it’s needed stand a far better chance of retaining their sight, particularly in the case of diabetes, adding that up to 98% of severe sight loss resulting from diabetes can be prevented by early detection and treatment.
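DeepMind has not published its model here, so purely to illustrate what a system ‘teaching itself’ to classify scans involves, below is a minimal, hypothetical convolutional network in PyTorch. The architecture, input size, and labels are invented and bear no relation to DeepMind’s actual system; it shows only the general shape of image classification by machine learning.

```python
# Purely illustrative CNN for sorting eye scans into "healthy" vs "at risk".
# NOT DeepMind's architecture; all sizes and labels are invented.
import torch
import torch.nn as nn

class TinyScanClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # grayscale scan in
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 32 * 32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                  # (N, 32, 32, 32) for 128x128 input
        return self.classifier(x.flatten(1))  # logits over condition classes

model = TinyScanClassifier()
fake_scans = torch.randn(4, 1, 128, 128)  # four random stand-in scans
print(model(fake_scans).shape)            # torch.Size([4, 2])
```

Trained on enough labeled examples - the role the one million Moorfields scans play - a model of this general kind learns the visual markers of each condition rather than being told them explicitly.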

DeepMind has not made it through without attracting its fair share of criticism. Its partnership with the Royal Free Hospital drew ire because the company appeared to have access to far more data than it needed, and because of an apparent lack of transparency: the hospital was not upfront with patients about the project and failed to inform them that their personal information was being provided to a commercial entity.

This time, they are on more solid ground. Explicit patient consent is not required as the scans are historic, which means the results won’t affect the care of current patients

This time, they are on more solid ground. Explicit patient consent is not required, as the scans are historic, which means the results won’t affect the care of current patients, and DeepMind will also have access to related anonymous information about the patients’ eye conditions and disease management. As DeepMind’s AI technology gets more advanced, these kinds of important projects are going to become more and more common, and for Google - a company that has always said its primary objective was to do good, or at least not be evil - this is likely to make DeepMind a hugely important part of its operations.

Moorfields’ Professor Sir Peng Tee Khaw said: ‘Our research with DeepMind has the potential to revolutionize the way professionals carry out eye tests and could lead to earlier detection and treatment of common eye diseases such as age-related macular degeneration. With sight loss predicted to double by the year 2050, it is vital we explore the use of cutting-edge technology to prevent eye disease.’


Premium online courses delivered by industry experts academy.theinnovationenterprise.com



Data science is now one of the primary tools health experts have at their disposal for attempting to control an outbreak

HOW DATA IS BEING USED TO CONTAIN THE SPREAD OF ZIKA
Megan Rimmer, Analytics Commentator

THE ZIKA VIRUS RECENTLY CLAIMED ITS FIRST VICTIM in the continental US, taking the life of an as-yet-unidentified pensioner in Salt Lake County, Utah. Although Zika has been around since the 1940s, it is only during the last few years that it has really exploded, and its spread across the Americas has been a tremendous cause for concern, particularly with the Rio Olympics coming up.



As with all contagions, one of the most pressing challenges for containment is understanding where the disease will spread. Obviously, it is not enough simply to deal with a disease once it has infected an area. Kamran Khan, an infectious disease physician at Toronto-based St. Michael’s Hospital, notes that one thing is true of the spread of infectious diseases: ‘If you start to analyze the situation when an outbreak occurs, you’re already too late.’

This is particularly true of Zika, as there is still so little known about the disease. It is often symptomless, with just 1 in 4 of those infected developing symptoms. The most worrying aspect of the virus is the birth defects it causes, such as abnormally small heads and brain damage. From what we know about the disease so far, it is transmitted by the Aedes aegypti and Aedes albopictus mosquitoes, neither of which is found in Utah. The majority of cases in America have been travel-related, which means finding a pattern to its spread is exceptionally difficult. The only countermeasure available at the moment is ‘mosquito management’ - an indiscriminate, costly, and wasteful program of insecticide spraying in areas with a large population of the mosquitoes in question, the environmental impact of which is hard to ascertain.

Data science is now one of the primary tools health experts have at their disposal for attempting to control an outbreak. Big data has a checkered history when it comes to spotting disease trends. Google’s Flu tracker, for example, was a spectacular failure that is often held up as a warning of the hubris of data practitioners. On the other hand, an algorithm from Nashville-based health analytics firm WPC Healthcare was able to predict the spread of another mosquito-borne virus, West Nile, with 85% accuracy.

The data necessary to track a virus like Zika comes from a variety of sources, including clinical trials and flight patterns. University of Miami researchers, for example, have previously used risk maps to combat mosquito-borne illnesses like malaria and dengue fever. They want to apply the same tools to Zika, and researchers from the university are using data from several local databases to develop maps that county officials could use to forecast where the disease may strike next. They have already garnered a number of insights that could prove useful, including that affluent neighborhoods are more likely to have Zika mosquitoes, though they have yet to find out why.

Obviously, when you are dealing with a disease like Zika, in which travel appears to play a central role in its spread, one of the first things that requires analysis is data streams like flight itineraries. By blending this information with clinical information, you can better understand the points of origin.

Organizations are also learning a lot from their recent experience dealing with the Ebola virus, where big data was utilized in a number of ways to quell the spread. The Centers for Disease Control and Prevention (CDC), for example, used real-time mapping software and data from telecommunications masts to track the disease across West Africa, sending resources to anywhere it could see the threat of infection rising, as quickly as possible.

It is through this sharing of data that organizations and governments will be able to best understand the spread of diseases, and organizations need to work together to ensure this happens. Big data is not a cure in and of itself, but if it allows organizations to respond in a timely fashion, containment is a far easier proposition.
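As a hedged sketch of the ‘blending’ described above, the snippet below joins hypothetical flight volumes with hypothetical case rates to rank destinations by a crude importation-risk score. The file names, columns, and scoring formula are all invented; real epidemiological models are vastly more sophisticated.

```python
# Hypothetical sketch: rank destinations by a crude Zika importation-risk
# score. File names, columns, and the formula are invented for illustration.
import pandas as pd

# flights.csv: origin, destination, passengers   (monthly totals)
# cases.csv:   origin, case_rate                 (cases per 100k residents)
flights = pd.read_csv("flights.csv")
cases = pd.read_csv("cases.csv")

# Weight each route by how active the outbreak is at its origin.
routes = flights.merge(cases, on="origin")
routes["risk"] = routes["passengers"] * routes["case_rate"]

# Aggregate into a per-destination ranking to guide surveillance spending.
ranking = routes.groupby("destination")["risk"].sum().sort_values(ascending=False)
print(ranking.head(10))
```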



THE IMPORTANCE OF BIG DATA AFTER BREXIT
George Hill, Managing Editor

BRITAIN HAS BEEN ROCKED BY THE EU REFERENDUM result, and the long-term implications are yet to be seen. In the short term, we can see that the pound has crashed to its lowest level since 1985, Scotland is likely to leave the UK (taking close to 10% of the country’s total GDP with it), estate agents are predicting a 20% drop in house prices across the country, and rumours abound of Morgan Stanley already moving 2,000 jobs abroad. With David Cameron, the British Prime Minister, resigning after calling the referendum in the first place, the country is also facing leadership issues in what seems like an increasingly dark time.

There is considerable anger among those under 50, too (who will be impacted the hardest and for the longest), with voting data showing that the vast majority of them voted to remain, whilst the majority over that age voted to leave. In a democratic society, this is simply something that must be accepted, even if it seems unfair on those who will bear the brunt of a decision they didn’t agree to.

In the face of all of this, though, there may be a glimmer of light if the new government is willing to use data effectively to create new deals, protocols and laws. At present, laws are generally set through opinion and lobbying rather than fact. Consider gun control in the US, currently the subject of sit-ins by Democrats due to a lack of reform following several mass shootings: laws limiting gun use elsewhere in the world, in places like the UK and Australia, have significantly decreased gun crime, yet, primarily due to lobbying, similar laws are not being enacted in the US.

data could now help them both prioritize what needs to be done and the best way to do it

However, Britain has the opportunity to use data to create effective laws, policies and legislation based on sound facts rather than political rhetoric. Laws can now be informed by the huge amount of public data that the UK holds. Rather than policing based on archaic laws, with the amount of reform needed, an entire legal system could theoretically be based around real numbers rather than political will.

Another aspect the new government will need to deal with is that there are little to no plans for this scenario. One of the major criticisms of the Leave campaign has been that it didn’t set out any real scenarios in its campaigning, but arguably, data could now help the government both prioritize what needs to be done and find the best way to do it. It will be needed, given the huge amount that needs to be done and the relatively short time in which to do it.

However, one of the key elements is going to be reconciling a completely divided nation. During the campaign itself, things became so toxic that the polarized viewpoints on both sides seem almost irreparable; the rifts left by an increasingly aggressive and divisive campaign appear insurmountable, but through the use of data, bridging them could be possible. Thanks to voter data, online activity and even social media analytics, it is possible to identify areas of consensus and attempt to build bridges between divided communities.

The process is not going to be easy, and even with the use of data it will take many years to get Britain back to where it was prior to June 23rd 2016. However, with the use of data, the UK will have a better chance of repairing the undoubtedly huge self-inflicted economic damage.


For more great content, go to Innovation Enterprise On Demand

www.ieondemand.com

Over 5000 hours of interactive on-demand video content

There are probably dozens of definitions for the single job of Head of Innovation, and with them dozens of perspectives on how it should be done. Without any official credentials on the subject, I was asked to give my personal account of running an innovation team in the context of an innovation-hungry organisation that started on the high street and has grown to employ 16,000 people over 80 years. In the past year or so I have learned that when it comes to innovation, culture trumps everything and there really aren’t any rules. In order to get by, I stick to some guiding principles and lots of gut feel. Join me for an honest and straightforward perspective on a modern job without a...

What would happen if a company funded every new product idea from any employee, no questions asked? As an experiment, Adobe did exactly that. In this session, Mark Randall will share the surprising discoveries Adobe made in creating Kickbox, the new innovation process that’s already becoming an industry model. Each employee receives a mysterious red box packed with imagination, money and a strange game with six levels. Learn why the principles behind Kickbox are so powerful in igniting innovation, why Adobe is open sourcing the entire process, and how any organization can tap these principles to ignite innovation.

Mark Randall’s serial entrepreneurial career conceiving, designing and marketing innovative technology spans nearly 20 years and three successful high-tech start-ups. As Chief Strategist, VP of Creativity at Adobe, Mark Randall is focused on infusing divergent thinking at the software giant. Mark has fielded over a dozen award-winning products which combined have sold over a million units, generated over $100 million in sales and won two Emmy awards. As an innovator, Mark has a dozen U.S. patents, he’s been named to Digital Media Magazine’s “Digital Media 100” and he is one of Streaming Magazine’s “50 Most Influential People.”


Stay on the cutting edge Innovative, convenient content updated regularly with the latest ideas from the sharpest minds in your industry.

