volume 5 | no. 1 | 15,80 euro
Deep Dive: Collecting Relevant Insights

The Service Design Promise
By Ben Reason

Purpose-Driven Research as Key to Successful Service Design
By Stefan Moritz and Marcus Gabrielsson

When Design and Market Researchers Join Forces
By Remko van der Lugt and Gerrita van der Veen
May 2013
The Journal of Service Design
ISSN 1868-6052
Publisher: Service Design Network
Chief Editor: Birgit Mager
Print: peipers – DruckZentrum Kölnwest
Fonts: Mercury G3, Whitney Pro
Service Design Network gGmbH, Mülheimer Freiheit 56
Melvin Brand Flu
Contact & Advertising Sales
Project Management &
For ordering or subscribing to Touchpoint, please visit
Pictures: Unless otherwise stated, the copyrights of all images used for illustration lie with the author(s) of the respective article.
from the editors
Deep Dive: Collecting Relevant Insights

As the practice of service design continues to mature and find greater traction within companies, it continually faces the same commercial question that challenges its very existence: “Does it add value?” The traditional realms of designers – emotions, experiences and aesthetics – don’t naturally lend themselves to the facts-and-figures demands of the boardroom. But one aspect of the service design approach holds the key to supporting these two very different areas: research.

Research has always been at the heart of service design; without an understanding of who the consumers of a service are, how could a designer possibly tailor it to them? But research has always fallen into two distinct areas. Qualitative research typically provides inspiration (around experience, behaviours and motivations), whereas quantitative research provides evidence (about statistics, needs and performance). This division between qualitative and quantitative research presents a challenge: How can the benefits of both be harnessed to deliver real innovation and better services?

This issue of Touchpoint is dedicated to just that topic, and inside you will find researchers and practitioners sharing their thoughts on bridging the gap. Those looking for a practical method in which quantitative and qualitative techniques join forces to determine (and improve) the likelihood of success for a service concept can find some inspiration coming from Helsinki. Four service designers and service analysts from Palmu Inc. share a model which has iterative co-design at its heart, in “A Comprehensive Model for Measuring Value” on page 40.

And in the interest of providing our readers with even more tangible, concrete approaches to bringing ‘qual’ and ‘quant’ closer together, we have several other articles introducing well-defined techniques, such as one by Anna Lässer and Christina Dicke, who show how Lean UX can help translate qualitative insights into measurable units (see page 44). And covering the airline industry – where innovation and service are key differentiators – we learn of an online research community of frequent fliers which delivered qualitative research insight in the form of ideas for service concepts, which were then fed into a quantitative validation process, where concepts were measured against innovation KPIs. You can read more in “Fusing Qualitative and Quantitative Skills in Service Design” on page 62.

In our Tools and Methods section, Stephen Masiclat comes to the rescue of personas. Despite their widespread use in service design (and user-centred design more broadly), they routinely draw criticism for being too speculative and ‘soft’. He introduces Q Methodology as a technique to bolster the reputation and authority of personas, and better establish them as a quantitative technique as well (see page 84).

Moving away from this issue specifically, we’re proud to announce that the entire back catalogue of Touchpoint will soon be available on an article-by-article basis, for purchase via the SDN website (free access to SDN members!). And lastly, we’re busy planning the upcoming SDN Global Conference in Cardiff this autumn. We hope to see you all there!
Jesse Grimes for the editorial board
Birgit Mager is professor for service design at Köln International School of Design (KISD), Cologne, Germany. She is founder and director of sedes research – the centre for service design research at KISD – co-founder and president of the Service Design Network, and chief editor of Touchpoint.

Jesse Grimes has thirteen years’ experience as an interaction designer and consultant, now specialising in service design. He has worked in London, Copenhagen, Düsseldorf and Sydney, and is now based in Amsterdam with Dutch agency Informaat.

Melvin Brand Flu is partner of strategy and business design at Livework. He is a business and strategy consultant with over 20 years’ experience working for companies across continents. He advises executives and businesses on the cutting edge of business innovation, in industries ranging from telecommunications and financial services to the public sector and entertainment.

Roberto Saco is the principal at Aporia Advisors, a management advisory in South Florida (USA). His clients reside mostly in North America and include organisations in the financial and professional services as well as large industrial concerns. He is also an independent scholar and management instructor teaching at Miami Dade College.
from the editors

10 Measuring the Role and Value of Design
Birgit Mager

forrester's take

12 The Danger of Big Data

feature: deep dive: collecting relevant insights

14 The Service Design Promise
Ben Reason

20 Opening the Black Box of …
Tony Driscoll, Craig LaRosa

26 Using Data to Support Effective Decision Making

30 'Live Labs': Prototyping Environments to Measure Customer Experience
Katrin Dribbisch, Manuel Großmann, Martin Jordan, Olga Scupin

34 When Design and Market Researchers Join Forces
Remko van der Lugt, Gerrita van der Veen

40 A Comprehensive Model for Measuring Value
Antti Koskinen, Petteri Hertto, Maiju Nöyränen, Mikko Jäppinen

44 Lean UX
Anna Lässer, Christina Dicke

48 Purpose-Driven Research as Key to Successful Service Design
Stefan Moritz, Marcus Gabrielsson

54 Life and Death Data
Christopher Ferguson

58 Left Brain, Right Brain: Working at the Intersection of Design and Business
Erick Mohr

62 Fusing Qualitative and Quantitative Skills in Service Design
Thomas Troch, Tom De Ruyck, Annelies Verhaeghe, Charles Hageman

68 Measuring and Demonstrating the Value of Service Design
David Singh, David Le Brocquy

72 Building the Bridge
Samara Tanaka, Isabel K. Adler, Ana Fucs, Bernardo Segura

76 Pilot or Perish
Lavrans Løvlie, Melvin Brand Flu

tools and methods

84 Build Better Personas Using Subjective Science
Stephen Masiclat

90 Workstyles: At Your Service
Yen Chiang

profiles

92 Interview: Katrine Rau Ofenstein

touchpoint 5-1
let's visit the welsh capital!
SERVICE DESIGN GLOBAL CONFERENCE CARDIFF | UNITED KINGDOM 19th – 20th NOVEMBER 2013 MEMBERS DAY 18th NOVEMBER
Cardiff, the capital of Wales, a principality of the United Kingdom, was ranked sixth in National Geographic's top 10 alternative tourist destinations in 2011. It is also the city where the Service Design Network has decided to hold its Service Design Global Conference this year. The event will take place on the 19th and 20th of November, 2013. This will be an international event that will offer you the opportunity to gain new insights into service design while getting to know people from the field! Stay connected to the SDN social networks and the conference website to get the latest news! http://www.sdnc13.com
join the sdn national conference in japan The Japan National Chapter of the Service Design Network invites practitioners and researchers to participate in the first Service Design Network National Conference in Japan, to be held on Saturday, May 11th 2013 at the Recruit Academy Hall, Tokyo. The goals of the conference are to share knowledge, experiences and understanding about service design, and also to connect participants from various fields of expertise. In particular, it focuses on exploring innovative applications of service design approaches to current and future business practices. The conference will feature a special keynote from Prof. Birgit Mager, the president of the Service Design Network, introductory and visionary talks on service design, business case presentations, and workshops to learn service design tools. Have a look at the website: http://bit.ly/sdnc-jp13 M.T.
sdn offers 10 'outside in' books!
SDN offers you the opportunity to purchase the full back catalogue of Touchpoint for a reduced price! Fill your bookcase with four years’ worth of in-depth articles relevant to service design, written by many different authors from the field! Visit our online shop: http://bit.ly/touchpoint-shop
Since 2011, Kerry Bodine has been writing the Forrester’s Take column. Recently, she and Harley Manning published their book Outside In: The Power of Putting Customers at the Center of Your Business. The Service Design Network has one book to offer each of the first ten purchasers of the Touchpoint Collection between the 15th of May and the 15th of June 2013!
sdn new website The Service Design Network has a brand new website! Easy to navigate, it offers all users the opportunity to create a profile and upload job announcements, articles, papers, presentations, news... as long as it's related to service design, of course! One of the big innovations is the ability to search online for any article published in Touchpoint since its first issue in 2009! Thanks to a keyword search, users can look for specific topics and discover which articles are related to them. SDN members have full access to all the articles, while non-members can purchase the PDF versions of articles via PayPal. Join us on www.service-design-network.org and connect, read, discover and share!
Purchase the Touchpoint Collection for a chance of a free copy of Outside In!
‘Outside In: The Power of Putting Customers at the Center of Your Business’ by Kerry Bodine and Harley Manning.
KISD Photo Studio
Join the more than 5,000 members of the Service Design Network group on LinkedIn! Share your thoughts, start discussions or polls, give advice and answer questions with other LinkedIn users interested in the service design field! Join the SDN group on LinkedIn: http://bit.ly/sdn-linkedin-group
government service design manual
building the service design knowledge base
“From April 2014, all new and redesigned digital government services will need to be so good that people prefer to use them,” says the digital design manual of the UK Government. Service design is not necessarily digital – but if you put it at the heart of designing digital services, it is bound to make a radical difference. And this is exactly what the UK Government is planning to do! Look at digital as a service! Be user-focused! Work in multidisciplinary teams! And apply service design phases to the design of digital services! Detailed guides will help the multidisciplinary teams to live service design thinking and doing in all phases of their projects! An impressive mission and an amazing effort! We'll keep a close eye on the project and keep you updated! B.M. Read more: https://www.gov.uk/service-manual
Throughout the last decade, service design has been applied in all branches of service industries, as well as in the public and social sector. It is high time to build a solid knowledge base and to give access to the huge amount of knowledge and experience incorporated in these projects. The SDN Case Study Database will enable learning and networking as well as further research on the value and impact of service design. Yes, it is a bit of effort to summarise
a project and submit a case study. But we have made it easy: you'll find a template that takes you through the whole process. And we will reward your effort: If your case study is accepted, you'll get a free, one-year subscription to Touchpoint. And three accepted submissions will each receive one free ticket for the SDN Conference in Cardiff on November 18th-20th. Who will they be, we wonder? Join in and submit your case studies. Help to build the knowledge base of service design! B.M. Find out more at: http://bit.ly/SDN-case-study
powered by service design
Last September, the Thailand Center of Creativity and Design (TCDC) held the Creativities Unfold Conference 2012. Birgit Mager's keynote lit the service design touchpaper, and a lot of energy was invested by TCDC and other enthusiasts in setting up a powerful, long-term initiative to really embrace service design in theory and practice. In February 2013, the next step was taken: an intensive service design curriculum, led by Birgit Mager and accompanied by Ben Reason of Livework and Julia Schaeper of the NHS UK. Almost four weeks of diving deep into the world of service design theory and methods, along with six practical projects carried out in parallel, enabled designers, entrepreneurs and government to better understand and to apply service design. In parallel to the curriculum, the TCDC and the Thai Government prepared the launch of the Speed Train Project, in which service design will play a crucial role in the design and implementation of a new high-speed train system in Thailand, supported by Livework as the service design agency. Touchpoint will keep you updated on the progress of the project! B.M.
Service Design books
Book Recommendations from the Network

service design – on the evolution of design expertise
by T. Kuosa and L. Westerlund
Like most recent publications on service design, this publication gives an overview of the economic developments that justify the need for service design. Some of the contributions have a very marked link to the Finnish and Estonian cultures and economies. Like most recent publications, the book relates service design to other disciplines connected to service industries, like service science, service engineering and service management and points out its interdependency with brand and business models. And, like most recent publications, it relates service design to the creative
class, open innovation and to other concepts related to creativity, moving between Florida and Schumpeter and Einstein and Csíkszentmihályi. Unlike most publications, the authors make attempts at finding new ways of understanding, defining and systematising service design. The ‘Design Tree’ is an attempt to visualise the service design ecology. Systematic content analysis of the ways that professionals describe service design leads to interesting dimensions of reference systems. They relate art and (service) design, playing with out-of-the-box views on service design. These threads are certainly not spun to their conclusion, but they do lead into new realms. The book is quite an easy read. The
lack of case studies is really the biggest concern, especially since service design has such a strong dynamic in Finland and the Nordic region. The book was published at the end of 2012 by a team focused around Tuomo Kuosa and Leo Westerlund with funding from the Service D1 project. It comes complete with a fresh and 'busy'-looking journal (thanks, Jesse!). B.M. You can download the book at http://bit.ly/evolution-design-expertise

1 Part of the Baltic Interreg Programme 2007-2013
service design – from insight to implementation
by A. Polaine, L. Løvlie, and B. Reason

The library for service design has been constantly growing in the last few years, and it is nice to put a new book onto the shelf: a book rooted in service design practice and informed by years of experience with a broad spectrum of clients and challenges. Polaine, Løvlie and Reason present a refreshing mix of historical perspective, methodological overview and advice and practical examples, with the latter being the facet that makes the book most valuable. Even though some of the projects are themselves historical and the methods and visuals have been seen before by the dedicated service designer, thus far there has been no comparably comprehensive summary of real-world experiences.

“Andy, Lavrans and I felt that there was no book to use with students. We wanted it to be a book for practitioners and tell the story of where we come from,” says Ben Reason. The book is, therefore, also the story of one of the first service design agencies. It gives amusingly straight answers to frequently asked questions, it combines the big picture of economic change with detailed ‘how to’ tips and it involves guest authors who give outside perspectives, like Chris Risdon's on the value of journey maps and visualisation. Knowing Livework and their recent strategic focus puts high expectation on the topic of the business value of service design, and a full chapter is dedicated to measuring this, one of the most relevant topics for service design today: “We have not found a single, perfect method that provides robust evidence for the value of service design.
However, it is important to define some measurement criteria…”. A nice overview on optional methods from other fields is given, and room for another book on this topic is left on the shelf! Readability, content-wise, is great. Layout and typos are, however, another story...! A must-read for everyone interested in service design. Service Design Network members also get a 25% discount: just contact media@service-design-network.org to receive your promotional code! B.M.
Measuring the Role and Value of Design

Reflections on a scoping study

Aiming to define future areas of research funding, the UK Design Council and the Arts & Humanities Research Council took a closer look at design research and business collaborations in UK universities.

Birgit Mager is professor for service design at Köln International School of Design (KISD), Cologne, Germany. She is co-founder and president of the Service Design Network.
Based on a scoping study (conducted by Madano Partnership) on the value of design in general, service design was identified as one of the two areas – the second being health care – that received the strongest endorsement for future research funding. Growth, innovation and impact are to be expected in this still-young field of design, but research needs to be done to stimulate and strengthen these perspectives! The study points out that service design is strongly perceived as focused on public and social services, and not so much on business and economic value, which is quite surprising when we look at the number of business-related cases that are presented at SDN conferences and in publications. It might be related to the limited scope of participants in the research, and also to a specific perspective of the academic community in service design. One recommendation arising from the research is that service design should grow its client base beyond the public and social spheres, identifying and tackling bigger cases instead of small-scale interventions.
“… demand for service design is growing and will continue to do so, as our service-led economy grows. However, this reported demand works against comments that service design works on too small a scale to flourish. Expanding the sector to work on a larger scale – for example working in hospitals nationally rather than in one department – may be a challenge for the sector as it is constructed currently on a small and more informal basis.”1 Within the field of service design a need for more research on how design adds value was identified. “There is little compelling academic or professional practice material on the impact and value of service design. Respondents were unable on the whole to provide detailed or robust case studies of impact and there appears to be no common method or framework for measurement.”2 So one recommended focus for future research funding lies in measurement-related research in interdisciplinary teams, a focus that Service Design
Read more and download the report: bit.ly/scoping-study
Network will happily support. The ongoing SDN effort to build a strong case-study database will certainly help to give better measurements for and proof of value (see also page 8, and contribute your cases!). Service design teaching is criticised for “...being shallow and not looking to other disciplines to strengthen its theory”3. Most certainly, good education is at the core of the future success of service design. Over the last few months, the SDN Academia Board has conducted a research project among our academic partners in order to build a transparent overview of opportunities to study service design. Today, we lack a clear understanding of what curricula for service design need to contain and how interdisciplinarity has to be part of them, and this needs to change! Results will be published soon. Hopefully, design councils all over the world will be able to access this report and relate their considerations on future focus and funding to it. The SDN National Chapters will certainly try to push this forward and strengthen service design research on a national level!
References 1, 2, 3: Scoping Study on Service Design, submitted by the Madano Partnership, December 2012.
Touchpoint, the Journal of Service Design, was launched in May 2009 and is the first journal on service design worldwide. Each issue focuses on one topic and features news and trends, interviews, insightful discussions and case studies. The articles published in Touchpoint since its first publication are now available online! The formatted PDFs of single articles are now downloadable at no cost for SDN members, and can be purchased by non-members. You will have the opportunity to search articles by volume and issue, by keywords or by author. Don't wait! Visit the new SDN website and rediscover hundreds of gripping articles!
Free access for SDN members! Hundreds of articles about service design in the SDN database! www.service-design-network.org
The Danger of Big Data

Service design requires a mix of research inputs
Big data has become a big buzz phrase. A couple of months ago, I spoke at a conference in Las Vegas. Immediately before my talk, two advertising execs, one a professed quant geek and the other a “creative”, spoke about how their agencies rely less on hunches these days and more on quantitative data to drive emotional relevance between their clients and consumers. “We can identify human emotions in massive rivers of data,” the ad men said. When I pressed them for an example during the Q&A session, they described how they had recently mined millions of click streams, search queries, video views, website clicks and the like for a major mortgage lender. All in, the technology investment behind their analysis must have stacked well into six figures. And their big emotional insight? When people start shopping around for a mortgage, that’s all they can focus on until they’ve gotten it all sorted out. I could hardly believe my ears! Any skilled ethnographer could have discovered that same insight – and then some – through a day of in-home customer visits and a handful of taxi receipts. Service design teams can now glean customer insights from social media, financial systems,
emails, surveys, call centers, and digital and analog sensors. It’s amazing and wonderful, yes. But here’s the danger: Companies that become mesmerised by big data put themselves at risk of spending enormous amounts of time and money amassing new data sources — and, in the process, forgetting that research methods like observation and one-on-one interviews even exist. This has the potential to create a large, and exceedingly expensive, blind spot. Don’t get me wrong. I’m not a big data hater. However, to create a complete picture of who your customers are and what kind of services they really need, you need a combination of quantitative and qualitative research methods. Here are two effective ways to do it. As I alluded to a moment ago, you can gauge home buyers’ attitudes, emotions, and behaviors
through qualitative research with just a handful of subjects – and then validate the statistical significance of those findings through quantitative analysis. Or, moving in the opposite direction, you can identify a problem or opportunity area in the home buying experience through quantitative research – and then uncover the underlying reasons for the problem (or the specific characteristics of the desired solution) through qualitative studies. To see how this can play out in a little more detail, consider the work that European power and gas provider E.ON did to improve its customer onboarding experience. Through surveys, E.ON discovered that many customers thought the company was boring. Maybe this isn’t too surprising given that we’re talking about a utility provider! But this news didn’t sit well with Adam Elliott, E.ON’s head of customer insights. “We want customers to get the first bill and love us. If they don’t have a strong positive opinion of us after three to four months, we haven’t sufficiently engaged with them.”
Why is Adam so concerned with customers’ emotions? “The way that we make a customer feel will dictate how they behave — whether they stay with us as their energy provider, whether they call us up, and whether they shout at us when they do call,” he says. For example, the churn rate for new customers — those who had been with E.ON for a year or less — was much too high. But when and how did customers want to be engaged? And what kinds of emotions did new customers have during the switchover period? “These things have huge cost and revenue implications,” says Adam. “But typical management information doesn’t address how our customers feel.” To get to the bottom of these questions, Adam enlisted the help of Andrew Franklin, E.ON’s head of service design. Andrew asked a group of brand-new customers to keep every communication they received from E.ON over the six weeks that it took to get them switched over from their old provider. He also asked them to create a graph about how they felt about the switching experience each day. The “happy graphs,” as Andrew calls them, helped E.ON see exactly how customers’ emotions changed over time and clued the team into things that the company just hadn’t picked up on before. “When we looked at the graphs, we saw the lines going down. But nothing had happened. The customers hadn’t had any interactions with us.” Subsequent one-on-one interviews revealed that E.ON’s new recruits wanted reassurance that they’d made the right decision — and became concerned when they didn’t get it. “In the past, we felt that reaching out during these six weeks was an extra thing that we didn’t need to do”, says Andrew. But the research showed that what
happened during that time actually mattered a lot. So E.ON created additional touchpoints that now enable them to stay in touch with customers during the switchover period, even though there’s nothing of substance to communicate. And, to alleviate any doubts customers might have, the company revised its standard sales quote format so that prospective customers can easily compare E.ON’s charges with those of their current provider. These may sound like relatively simple improvements. That’s because they are. But they’ve helped E.ON fill a gap in new customers’ emotional needs during a critical time period – and they wouldn’t have been possible without a mix of quantitative and qualitative research.
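For the analytically minded, the pattern the E.ON team spotted in the “happy graphs” – sentiment sliding downwards on days when nothing was sent – can be sketched as a very simple script. The scores, day numbers and helper function below are purely hypothetical illustrations, not E.ON's actual data or tooling:

```python
# Hypothetical "happy graph" data for one switching customer:
# daily self-reported mood scores (1-10) and the days on which
# the provider actually sent a communication.
scores = {1: 8, 2: 8, 3: 7, 4: 6, 5: 6, 6: 5, 7: 7}   # day -> mood score
contact_days = {1, 7}                                  # days with a touchpoint

def silent_dips(scores, contact_days):
    """Return days where the mood dropped although no communication
    went out -- the 'lines going down, but nothing had happened'
    pattern described in the article."""
    days = sorted(scores)
    return [d for prev, d in zip(days, days[1:])
            if scores[d] < scores[prev] and d not in contact_days]

print(silent_dips(scores, contact_days))  # -> [3, 4, 6]
```

A day flagged this way is exactly the cue for the qualitative follow-up step: the one-on-one interviews that revealed customers were quietly looking for reassurance.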
The E.ON story is an excerpt from Outside In: The Power Of Putting Customers At The Center Of Your Business.
Kerry Bodine is vice president and principal analyst at Forrester Research and the coauthor of Outside In. Her research, analysis, and opinions appear frequently on sites such as Harvard Business Review, Forbes, and Fast Company.
The Service Design Promise
Ben Reason is director at Livework – Service Innovation & Design.
Service design promises something that – although hard to pin down – does have some common characteristics. I would argue that at its core, service design both brings a practical understanding of people (specifically, service users and customers) and enables an organisation to imagine how to respond to this understanding in ways that improve the service offer and/or experience. I have often been tempted to say to our clients that they are scared of their customers (patients, clients, etc.), and once or twice I have actually done so. What I really mean is that they are scared that the effort to better understand their customers will open them to criticism from those same customers. In my experience, introducing a service design approach gives clients a structure in which to start with customer understanding, formulate an analysis and develop responses that can mature and become systematic over time. This approach reduces fear and enables organisations to improve their services. At Livework, we have found that an actionable understanding of a customer – and the designer’s focus on the tangible, visceral aspects of experience – has a number of benefits. Often, the understanding developed is richer because of the use of design
tools such as visualisation, sketching and scenario-making. With some customer groups (such as younger people), we have found that using visual tools creates insights that do not emerge in conversation. A creative response to customer issues can lead to solutions that are much less “us and them”, and instead align customers and service providers in value creation. The design tendency and capability to prototype regularly enables our clients to relax, try something and learn, rather than the traditional approach of either killing an idea, or rushing to invest in it before it is proven. And finally, service design has the facility to retain the customer perspective through the development process. It defines the desired experience and retains customer requirements and visual aspects well into the specification of complex services, whereas in a non-design approach, that customer perspective could be lost. However, service design can also over-promise. Or we can feel that it is the panacea for all our customers’ problems. Lately at Livework we have felt the need to recognise where the promise falls short and respond to the diagnosis. Here’s what we have learnt through engaging with sympathetic but critical voices. First up, service design can fall short in its understanding of the organisation. We have experienced again and again how what feels like compelling evidence for change, coupled with exciting solutions, falls on deaf ears. What we realise is that this difficulty is due to internal political, procedural or process issues that we are not aware of. Birgit Mager’s presentation at the SDN Global Conference 2012 in Paris reiterated that service design has an overwhelming
focus on the end user. We feel the need to complement this focus with an understanding of the organisation, and support our design activities with this knowledge. Secondly, we want to have a bigger impact. We want to see a case study which credits service design as the critical factor behind a tangible shift in an organisation, in the way that industrial design is understood as core to Apple’s success. Thirdly, can service design identify real value? Not for customers but for our clients. Can we find the solution and the case that demonstrates significant business benefit? Too often we are arguing for the customer and failing to demonstrate how our work will benefit the company. For a while this was ok, because clients wanted someone from outside to shake them up with customer insight. Now, perhaps in more austere times, we feel that they want clear savings or revenue.
Finally, on a related point, service design needs to create proof. This is subtly different to the previous challenge. Proof is something we can do that is new. The natural design ability to prototype and test can grow into a truly new way to demonstrate that something works and has value. It’s better than a business case; it’s a business case in action. So what to do? I think that there are many directions being taken by service design practitioners and service design buyers. Some companies are pairing designers with business consultants to balance the picture. In other cases – including the best projects I have been involved in – service designers find great internal people who have the knowledge to complement their design insight and qualities. The insurance project with the inspired actuary comes to mind. But this is more than simply adding a numbers guy to the creative guy and bingo! At Livework we are pursuing four key directions. It is early days, so please forgive me if these are half formulated:

BETTER ORGANISATIONAL INSIGHT
We are turning our ability to understand the human experience and motivation from the customer to the organisation. We want to develop insight into what will
work in an organisation. This is different to change management: All we want is to understand how to develop the right solution that will work with the business as well as the customers. Whilst this will be to our benefit and enable us to better gauge our pitch and prepare for our audience, we also believe that it will provide additional benefit to our client, who will share our deeper understanding. A PROACTIVE ROLE
We aim to have a greater impact by better understanding the business agenda, and by focusing on it. Often we have known that agenda only because it comes with the brief: our client has done the hard work of working out the problem, and we come in to do the customer bit. We believe that service design needs to take an active lead and identify business issues and opportunities for clients. We know that doing this elevates our value and increases the impact of our work.
THE CHALLENGE OF SCALE
We think that service design has a specific case to make about the relationship between an organisation and its customers. The understanding that services are made in the interaction between provider and customer, and that the service experience is different every time, is a very challenging concept for people in businesses that serve large numbers of customers and are used to industrial production methods. As a challenge this is interesting, but potentially annoying. If we can offer a solution that lets an organisation serve customers as individuals whilst still accessing the economies of scale that come with mass production and consumption, then we really do have a new story.

THE IMPORTANCE OF FIGURES
Finally, we do need to be able to put numbers to our designs. Simply understanding the cost and the potential for savings and revenue will help the service design case radically. But we should not think that this is the beginning and end of engaging with business. To conclude, service design has a great toolkit and, I think, a great advantage in the visual, practical and holistic nature of the approach. But we have spent too much time on the outside looking into our clients' businesses without truly understanding them. We would like to change them, but we don't want to get too close. Perhaps we are scared of them? I think that if we do get closer and truly understand them – their motivations, language and context – we will find another fascinating and human environment where our specific skills, approach and perspective will add significant value.
Deep Dive: Collecting Relevant Insights
Exploring successful strategies that bridge the border between two ways of thinking
Opening the Black Box of Research The use of qualitative and quantitative research in service design
Katrin Dribbisch is a Ph.D. candidate at the University of Potsdam. Manuel Großmann is a senior service designer at Fjord. Martin Jordan is a senior user experience designer at Nokia. Olga Scupin is a community manager at Stylemarks.
Service design practitioners seem to agree that research is important. Yet there seems to be a bias against quantitative research, and a preconception in favour of qualitative methods, in the service design context. Only scant evidence is available on how research is actually embedded in the design process and which methods – qualitative and quantitative – are being used. This article aims to shed light on the 'research black box' of service design by offering a current analysis of how research methods are used in this field. For this article, both qualitative and quantitative research methods were used in order to investigate the topic from different angles. We conducted structured interviews with five practitioners: two from design agencies, two from large corporations and one from a small start-up business. Building on the insights from our qualitative interviews, we carried out a quantitative online survey with 152 participants.

STRENGTHS AND WEAKNESSES OF QUALITATIVE AND QUANTITATIVE RESEARCH
Quantitative research methods explore the ‘what’, qualitative methods examine the ‘why’. Yet which question comes first depends on the project, on previously gathered data and, of course, on the
decision of the design researcher. Talking to service design practitioners we explored the strengths and weaknesses of qualitative and quantitative methods. When qualitative research methods are used at the beginning of a project, they help to understand the subject. Qualitative research such as ethnographic immersions or diary studies allows us to record anecdotal examples of people’s behaviour and, more generally, insights one could not have asked for. The main benefit is “...a small sampling, but very deep study and analysis of what people need, expect and their [ulterior] motivation,” explains Christine Truc Modica of Fjord. Especially for designers, qualitative research results have big inspirational potential.
Interviewees:
– Julia Leihener, service designer at Deutsche Telekom's Creation Center
– Christine Truc Modica, program director of user research at Fjord
– Justin McMurray, founder of Somewhere
– Dr. Hartmut Obendorf, Head of User and Design Research at Nokia
– Reto Wettach, design director of IxDS

When used after quantitative methods, qualitative methods are applied to dig deeper, to uncover the reasons behind tracked actions and to identify “...blind spots you would not find by going through existing quantitative material,” Julia Leihener of Deutsche Telekom states. A main challenge, though, is finding the right people for user tests, interviews and cultural probe studies: people who fit a target segment. The researcher cannot know upfront whether a participant might be the wrong kind of person for the study, or whether they can provide useful feedback. With small sample sizes, just two unsuitable participants can distort the study. In any case, designers and researchers regularly have to emphasise to managers and product owners that the insights might not be representative, or demonstrate that something is valid for a market. Beyond that, qualitative research is very time-consuming. Whether it is spending a whole day in the field, recording consumer journeys or conducting a cultural probes study, the preparation, execution and evaluation are time-intensive, and the results still have to be validated with quantitative research afterwards. Quantitative research methods are highly useful for tracking performance, measuring success and optimising based on valid numbers. Large sample sizes, with up to thousands of participants in questionnaires, can provide representative results. They are also faster to conduct and more concrete. When used in agile and iterative development environments, a method such as A/B testing – measuring the success of one design against another – can be used to check a hypothesis quickly.
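To make "checking a hypothesis quickly" concrete, here is a rough sketch (my illustration, not the interviewees') of how A/B counts can be tested for reliability with a standard two-proportion z-test; all figures are invented:

```python
# Hypothetical A/B test check: is the observed difference between two
# variants trustworthy, or is the sample too small to trust the numbers?
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (difference, two-sided p-value) for conversion counts of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF, built from math.erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Small sample: a seemingly large lift, but only a tendency (p well above 0.05)
diff, p = two_proportion_z(12, 100, 18, 100)
print(f"lift={diff:.0%}, p={p:.2f}")

# Same rates at 20x the sample: now the difference is statistically reliable
diff, p = two_proportion_z(240, 2000, 360, 2000)
print(f"lift={diff:.0%}, p={p:.3f}")
```

This mirrors the caveat the interviewees raise below: with small samples the data can only indicate tendencies.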
“Trust your instincts and act on your intuition, but be guided by the data,” advises Somewhere’s Justin McMurray. Yet when a sample size is too small, there is little to rely on, and the data can only be used to check tendencies, not trusted as absolute numbers. Apart from this, it can be difficult to shape the right questions, design the right type of test or define the proper event to track. “It’s easy to miss the right point,” Modica remarks, and research might even need a dedicated data analyst for the numbers. Nokia’s Dr. Hartmut Obendorf notes: “Just statistics won’t work [either]. You have to get a statement out of the statistics.” So whatever the applied method – quantitative or qualitative – its results and data have to be processed, filtered and interpreted.

[Figure: Is research part of your work? Positive answers by area; the highest share shown is 97.1%.]

HOW AND WHEN RESEARCH IS USED IN SERVICE DESIGN
Building on the insights from the semi-structured interviews, we conducted an online survey with 152 participants. This was aimed at validating the assertions of our interviewees and gathering more evidence on the actual use of research in service design. The survey shows that research is part of most participants’ work, with the highest share among academics and the lowest among corporations, with startups and agencies ranging in between. However, the question ‘Out of all projects that you worked on in 2012, how often did you do research yourself?’ produced mixed answers. For more than one quarter of participants, research was part of up to 25% of their projects, and a similar number of people said more than 75% of their projects in 2012 included research. For 20.4% of participants, all of their projects in 2012 included research. Overall, research seems to play a big role amongst service design practitioners.

[Figure: Out of all projects that you worked on in 2012, how often did you do research yourself? Answer options: never, up to 25%, up to 50%, up to 75% and more than 75% of the time; visible shares include 29.4%, 28.3%, 16.7% and 8.7%.]

REASONS FOR USING QUALITATIVE AND QUANTITATIVE RESEARCH RESULTS

The relationship between quantitative and qualitative research can best be understood by looking at the reasons for using the respective research approaches (see graphic). Qualitative research results are mainly used to better understand the context of the project, to test a concept or an idea, and to find new opportunities and get inspired. The main reasons for using quantitative research results, however, are to provide evidence, to measure success and to track performance. The various sectors set different priorities with regard to using quantitative and qualitative research data. For academia (79.4%) and agencies (63.0%), understanding the context of the project better is the main reason for using qualitative research results, whereas for corporations (60.7%) and startups (66.7%) finding new opportunities ranks highest. Quantitative research results are primarily used by academia (52.9%) and agencies (50.0%) to provide evidence, by corporations to measure success (55.6%) and by startups to track performance (50.0%). This seems to reflect the main challenges of the different sectors: whereas agencies and academia have to persuade their clients, corporations use quantitative data for business decisions, and startups are more process-oriented, using quantitative data to evaluate performance in the market. The survey results thus confirm the findings of the qualitative interviews with service design practitioners. The interviewees state that qualitative research provides deeper understanding and inspiration and helps to gather feedback. Quantitative research results serve a different purpose: tracking performance and evaluating success.

PRACTICAL CHALLENGES OF COMBINING BOTH APPROACHES
[Figure: Main reasons for using qualitative research methods – to understand the context of my project better, to test a concept or an idea, to find new opportunities, to get inspired – and for using quantitative research methods – to provide evidence, to measure success, to track performance.]

The survey shows that most participants are willing to include research in their projects, but face a variety of challenges in doing so. As one participant puts it: “Working in an agency, we rarely have the possibility to follow the same, ideal research [and] design process each time, so we rather adapt the process for each project to match the resources, information sources and materials available.” There are many practical challenges that need to be taken into account: “For instance, you might start with desktop research, just because there’s [something] available and the client wants some kind of proof that the whole effort is worthwhile”, one participant from a corporate background comments. Many survey participants state that time and money constraints prevent research. One online innovation consultant says: “Clients do not like to invest in qualitative research, since they very often think they know [something] anyway.” As a workaround, the participant suggests: “In most cases I do research via interviewing sales people, to reveal some of their customer insights.” One participant from a startup reflects: “If there’s enough budget, I try to work with anthropologists.” Both types of research also seem disconnected within the organisation and from design teams. A participant from a corporation critically suggests that: “Outputs from qualitative and quantitative research could inform our design much better than it actually does.” One very insightful example of the disconnectedness of research and design was given by another corporate designer: “Research is done by other departments. I sit in the design department and I put some time aside to do my own research [...]: it’s many times [more] difficult to make the management rethink their proposals or consider [a] different functionality.” This exemplifies the obstacles designers face in justifying research in both corporate and agency environments. Another challenge mentioned is the differing understanding of research: “Designers do research in a very different way from analysts or researchers themselves.” Despite practical challenges, the benefit of combining qualitative and quantitative research is acknowledged as “...critical to the success of a research project” and “...[a] lot more robust as [it] captures the whole experience.”

PRESENTING AND DELIVERING RESEARCH
The interviews and the online survey confirm that practitioners primarily conduct research to help in improving or creating services. Long research reports
that purely document primary data in an objective manner provide little value to practitioners. With regard to user research data, Christine Truc Modica claims: “It’s not about finding, it’s about translating.” Service design practitioners need to see the research report as a service in itself, one that has to be designed in a user-centric fashion. It is therefore very important to know to whom the research results will be presented: in other words, who is the user of the research data? Members of design teams use research results as inspiration. Results of both qualitative and quantitative research should be presented in a visual and tangible way so that the team can use them to drive the design further. What is interesting is that designers do not value quantitative data as a source of inspiration: “Once I see numbers, nothing happens in my mind”, one interviewee comments. Quantitative data, however, can be inspirational for designers if it is contextualised. The “...legibility of data” – the way we read and understand data – needs to be improved, as Reto Wettach of IxDS points out. In contrast to designers, decisionmakers need to be convinced of the necessity of a specific design. When research results are presented to them, they need to understand the impact of the results on a strategic design direction or a value proposition. In this context, the impact on the overall business is of higher importance than inspirational aspects. Especially in start-ups, the two groups – designers and decisionmakers – merge. Strategy and design
happen within the same team, and thus the research methods used and the presentation of results are mixed. Frequently, automated analytical tools are used and the entire team is connected to their users almost in real time. We discovered that quantitative data is presented in snapshot format to the team. As a best-practice example, Justin McMurray shares that research highlights are read out to the entire team during a Friday-night beer session. This provides inspirational insights and gives continuous feedback on current tendencies. No matter to whom research results are presented, they always need to tell a story so that data turns into actionable recommendations. By adequately translating and presenting data, service designers can overcome one of the biggest challenges of doing research in service design: selling the value of research – especially qualitative research – within the organisation and to clients.

LESSONS LEARNT FOR RESEARCH IN SERVICE DESIGN

This article provides an overview of how research is understood and used by service design practitioners in different sectors. We conclude that both research approaches are employed. It takes both qualitative and quantitative research to answer the ‘what’ and the ‘why’ and, in the best case, they are not separated but interlinked. Service design practitioners should value both qualitative and quantitative research. Rather than choosing one over the other, service designers would do better to creatively combine the two approaches. In this manner, they can overcome the weaknesses and build on the strengths of the respective methods: broadening inspiration and providing evidence.

Yet it seems that designers know too little about how to be inspired by quantitative data. According to Wettach: “Designers have to prepare themselves for more metric-based or data-based design.” But all data needs people with the ability to read it, to extract meaningful statements out of hard, measurable numbers just as from soft behavioural anecdotes. “This is the next phase also in the university,” Wettach summarises, “where we will teach people to work with this data […] and to design in this context.” Services cannot be built and improved by applying only quantitative or only qualitative research methods. Obendorf generally questions the distinction between the two and rather distinguishes the “...kind of detail, authenticity and validity [that] is provided.” The strict differentiation of both methods can turn out to be rather counterproductive in the end. Thus, service designers should think of data as insights, and not divide it into data that is inspirational and data that is only used to measure things.

Notes on the survey: 22.4% of participants were from academia, 30.3% from agencies, 19.1% from corporations, 15.8% from start-ups and 12.5% from other sectors. The share of women (52.0%) and men (46.7%) was almost equal. The dominant age group across all sectors was 26-35 years (63.2%), followed at some distance by the age groups 18-25 years (17.8%) and 36-45 years. The survey hence represents mostly young professionals. The research question did not specify usage in service design projects only; it is nevertheless assumed that survey participants responded against the backdrop of their service design work.
Using Data to Support Effective Decision Making
Amanda Kross is senior strategist at Brightspot Strategy. Amanda recognises that people should be at the core of every design decision. As a senior strategist, she is focused on improving experiences through innovative research methods. She has worked with Google, the Albright-Knox Art Gallery and the University of Minnesota to develop space, service and organisational strategies.
As service designers, we understand the value of quick decision making. Timely consensus and approval ensure a project is delivered on time, on budget and in alignment with organisational objectives. As with major personal life changes – selecting a college, buying a home, having a child – people expect to understand the emotional, financial and extended impact of a change before making a decision. So, how can we use information – both qualitative and quantitative – to create a business case for change that enables effective client decision making? And how can we facilitate a process of gathering, analysing and visualising data that builds consensus along the way? As a mindset and a practice, service design helps create better experiences for people by understanding and improving the complex systems within which they live, work and play. Done right, it couples qualitative and quantitative research methods to gain a deep understanding of the problem, create holistic solutions and enable effective decision making and implementation. While qualitative methods enable you to gain an understanding of people’s experience, motivations, behaviours and norms, quantitative methods provide you with the tools you need to systematically forecast needs and assess the business impact.
Based on our experience, there are several key principles that can be followed to enable decision making through innovative research methods:
1. Collect information that addresses both the human experience and business performance
2. Validate research findings by collecting information from several data sources
3. Break down complex systems by defining criteria for analysis
4. Prototype options to test concepts and enable quick decision making
5. Collect baseline data that can be measured against after a project is complete
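Principle 3 often takes shape as a weighted decision matrix, a tool described later in this article for a university relocation study. A minimal sketch of the idea, with invented departments, criteria, weights and scores:

```python
# Hypothetical weighted decision matrix: each department is scored 1-5
# against criteria, and a weighted sum ranks relocation priority.
WEIGHTS = {"campus_interaction": 0.4, "change_readiness": 0.25,
           "space_fit": 0.2, "cost": 0.15}  # weights sum to 1.0

departments = {
    "Admissions":  {"campus_interaction": 5, "change_readiness": 3, "space_fit": 4, "cost": 2},
    "IT Services": {"campus_interaction": 2, "change_readiness": 5, "space_fit": 3, "cost": 4},
    "Finance":     {"campus_interaction": 1, "change_readiness": 4, "space_fit": 5, "cost": 5},
}

def priority(scores):
    """Weighted sum across criteria; higher means keep closer to campus."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

ranked = sorted(departments, key=lambda d: priority(departments[d]), reverse=True)
for dept in ranked:
    print(dept, round(priority(departments[dept]), 2))
```

Swapping in different weights is how the same matrix can generate several relocation scenarios for comparison.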
[Figure: Research method organisation matrix – methods arranged by data type and source. Business performance: staff survey, gate counts, utilisation study, space program analysis, service plot, customer experience survey, peer space program analysis. Human experience: interviews, observations, personas, service blueprints, visitor voyages, workstyles, prototyping, peer interviews, peer observations.]
When developing the approach for a new project, it is important to consider which qualitative and quantitative research methods could be used to understand the full impact of a service, space or organisational change. In order to do this, you need to understand the relationship between the human experience and business performance, by answering the following questions about the current state and future aspirations:
1. What experience and behaviour am I trying to improve? (Qualitative)
2. What organisational systems are currently supporting this service? (Quantitative)
3. What is the organisation’s mission and vision for the future? (Qualitative)
4. How does the organisation measure performance? (Quantitative)
5. What are my peers doing? (Qualitative)
6. How do we compare against them? (Quantitative)
The methods used to answer these questions can easily be organised using one of our favourite tools: a two-by-two matrix! This matrix, organised by data type (human experience and business performance) and source (internal and external), is a helpful guide in planning the design research process. By identifying the data you would like to collect (are you interested in understanding visitor experience and space utilisation?) you can use the matrix to select the appropriate research methods from the corresponding quadrant (observations and ticket sales, for example). The most robust research studies collect data from each quadrant to address an existing challenge.
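The quadrant lookup described above can be treated as a simple table; the quadrant assignments below are illustrative guesses, not the article's exact figure:

```python
# Hypothetical sketch of the two-by-two research method matrix as a lookup:
# keyed by (data type, source), returning candidate research methods.
MATRIX = {
    ("human experience", "internal"): ["interviews", "observations", "personas",
                                       "service blueprints", "prototyping"],
    ("human experience", "external"): ["peer interviews", "peer observations"],
    ("business performance", "internal"): ["staff survey", "gate counts",
                                           "utilisation study"],
    ("business performance", "external"): ["customer experience survey",
                                           "peer space program analysis"],
}

def suggest_methods(data_type, source):
    """Pick candidate research methods for one quadrant of the matrix."""
    return MATRIX[(data_type, source)]

# e.g. planning a study of visitor experience using internal sources
print(suggest_methods("human experience", "internal"))
```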
LEFT BRAIN, RIGHT BRAIN
This article will unpack these principles and provide resources to help you apply them to your daily work.
[Figure: Example data types by quadrant – business performance: budget, sales, utilisation, space efficiency, staffing, consumer and market trends, costs; human experience: behaviours, habits, interactions, satisfaction, productivity, culture, community, personas, consumer trends, customer reviews, service design trends.]
[Figure: ServicePlot™ framework and sample mapping – axes run from distributed (‘push’ model) to centralised (‘pull’ model), and from ‘do it for our users’ to ‘do it with our partners and users’.]

DOUBLE UP
We all know the importance of asking questions. Questions that might seem obvious or irrelevant can expose critical information about project drivers and characteristics. In the same vein, it is important to ask the same question more than once: to different people, during different conversations, in different contexts or using different methods. This integrated approach enables us to gather input from multiple stakeholders, validate data with users, create hypotheses for recommendations and tell a complete story. Our clients at the NCSU Libraries at North Carolina State University understand the value of this approach quite well. During a recent project to develop the service model for the new James B. Hunt Library, we worked closely with a cross-sectional staff committee to develop a new service model and philosophy and to determine which services would be offered where, when, by whom and requiring what skills and training. This process combined (mostly) qualitative tools like personas, journey maps and service blueprints, supplemented by quantitative data inputs. Together, we quantified the current state by looking at gate counts, check-outs, logins and reference questions. We also identified future staffing needs for each service area by forecasting times of low, moderate
and peak traffic over the course of a day, a week and a semester, working in an iterative way and checking back against staffing headcounts by level and role. Looking at this challenge from these different perspectives allowed us to consider the staffing, space and service implications of a consolidated service point that would enable user-centered ‘side-by-side’ support for learning and research, innovative technology support for high-tech spaces and proactive roving staff delivering services when and where they are needed throughout the library.

DEFINE CRITERIA
The service design practice is rooted in qualitative research methods that enable practitioners to put themselves in the shoes of the customer, user and service provider. Interviews, observations and behaviour mapping are used to empathise with and understand the needs, rituals and aspirations of our clients and colleagues. In addition to personas, journey maps and service blueprints, there are several methods to systematically break down a big problem into pieces small enough to quantify and analyse. Decision matrices can be used to aggregate and organise data from individual stakeholders – collected through interviews, questionnaires or workshops – to streamline the analysis and interpretation process. We recently used a decision matrix with a private university to structure their relocation strategy for over 5,000 administrative staff. Through a series of interviews, we collected a vast amount of information about each department’s interactions, the function of their current state, degree of organisational change and institutional goals. These findings were then used to map departments along two spectrums: location and relocation priority. The matrix informed the proximity of each department to the main campus, the type of space they received and when each department should move. The decision matrix resulted in four relocation scenarios exploring different cost, time and move options. This school of thought translates well into the service design practice. ServicePlot™, a tool developed by brightspot, NC State and AECOM as part of the Learning Space Toolkit (www.learningspacetoolkit.com), can be used to understand an existing service philosophy and envision a future philosophy based on organisational values and customer needs. The two principal drivers that position organisations along a spectrum of different philosophies are responsibility (the services an organisation provides or performs) and delivery (how and where services are delivered). Once the current state and future direction of an organisation are plotted, the ServicePlot™ can be used to guide further conversations about where an organisation’s service philosophy is and where they want it to go in the future.

MOCK IT UP
One of the biggest barriers to decision making is fear of failure. Naturally, we are all resistant to taking risks, worried that we will not be successful. Many leaders and managers now encourage staff to take risks, accepting that failure is part of the exploration and creation process required to reach a successful solution. One of the best methods to circumvent people’s fear of failure is to prototype new service models at a small scale to test their impact on staff, roles, space and technology. In addition to addressing risks, prototyping enables you to identify potential issues in the design before making a significant investment. Prototypes can be especially helpful in understanding the relationship between a space, the people using it and the services provided to them. Through full-scale mock-ups of different workstation furniture configurations, we were able to identify the ideal space layout for our clients at Canvas, a digital agency in New York City. Staff worked at folding tables rearranged into various configurations over the course of several weeks, to see which solution best supported their existing workstyles. Not only did this approach help our team select a furniture solution that best met the needs of the users, but it also gave staff and leadership ownership of that selection, further reinforcing buy-in and consensus building.

DATA YOU CAN MEASURE AGAINST
Lastly, we would like to leave you with the principle of evaluation. Decisionmakers are interested in understanding the performance of their investment 3, 6, 12 or 24 months after its implementation. Therefore, it is important that every project include several business-performance (internal) measurements from the research method matrix that can be evaluated against in the future. These could include customer or staff surveys, utilisation studies or revenue analysis. Compared against the previous state, these measures present a business case for change.

TO SUM UP
Quantitative research methods should be used to supplement and support qualitative data findings. When faced with a new challenge, remember to ask yourself how you will measure the success of a proposed strategy. The answer will ultimately point you towards the most effective quantitative method for that project.
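The evaluation principle above (collecting baseline data you can measure against later) can be sketched in a few lines; the metric names and values below are invented:

```python
# Hypothetical post-occupancy evaluation: compare measurements taken
# 12 months after a project against the baseline collected beforehand.
baseline = {"visits_per_week": 1200, "staff_satisfaction": 3.1, "utilisation": 0.55}
after_12_months = {"visits_per_week": 1550, "staff_satisfaction": 3.8, "utilisation": 0.71}

def percent_change(before, after):
    """Relative change versus the baseline value, in percent."""
    return (after - before) / before * 100

for metric in baseline:
    change = percent_change(baseline[metric], after_12_months[metric])
    print(f"{metric}: {change:+.1f}% vs. baseline")
```

Presented alongside the previous state, such deltas are the raw material of the business case for change described above.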
‘Live Labs’: Prototyping Environments to Measure Customer Experience
Tony Driscoll is principal in business strategy at Continuum. He is a seasoned manager with direct experience in the creative, technical and business aspects of design and innovation.
Craig LaRosa is managing principal at Continuum. An accomplished design and innovation leader, Craig has proven his expertise working with clients across a range of industries including Holiday Inn, Sam Adams, Hershey and Audi.
In this digital age, companies have become accustomed to capturing real-time customer data from websites and apps with the click of a mouse or the tap of a finger. Nearly every interaction a customer has on these digital platforms is logged in the form of 1s and 0s, making it cheap and easy for a designer to tweak a site’s interface and see almost instantaneously how that affects customers’ behaviours. In a physical service environment, however, obtaining that kind of rich quantitative customer experience data can be much more difficult. There, customer experiences are full of intangibles, with the sum total of the experience made up of dozens of component parts: the texture of the floor, the level of lighting, the selection of music being played, the smells that pervade the space, the comfort of the seating or the way in which a server engages with you. All of these things must work in concert to evoke a feeling and inspire the behaviours that will ultimately drive business results. The challenge with traditional quantitative metrics, however, is that they tend to focus on business outcomes – or Key Performance Indicators – that are useful in telling you that something is wrong, but often fail when it comes to explaining what aspect of a customer’s experience isn’t delivering. Similarly,
customer experience surveys often miss the mark by capturing what a consumer thinks he does or feels, rather than what he actually does or feels. Even so, when a business is embarking on an expensive new project, it’s natural and desirable for it to want to quantify an experience in order to minimise risk and design the project to meet both business and customer needs. So how does a company go about getting the hard numbers it needs to refine its offerings? When one of us (Tony) worked as an Imagineer at Disney, he and his colleagues built, tested and refined entire rides and shows inside warehouses to ensure they fully delivered on customer experience before they ever saw the light of day at a park. But not every company has the budget or resources of
Disney. At the design and consulting firm Continuum, we have been pioneering the concept of ‘live labs’, designing affordable tools to measure the important elements of a customer experience and combine them in a holistic way. Our goal is to understand qualitatively which aspects of the overall customer experience are key drivers and then look for quantitative proxies that can give us a sense of how well these elements are delivering on the overall customer experience. These tools can be used to reduce risk and improve services, whether designing a new experience, maintaining an existing one or auditing and refining an experience that is not functioning as it should.

BUILDING NEW EXPERIENCES
The Boston-based restaurant chain, Bertucci’s, which has served pasta and brick-oven pizzas to families for several decades, came to us looking to partner on creating a new restaurant concept that would appeal to the ‘Millennials’ demographic. Together we named it ‘2ovens’ and developed a different kind of experience around the needs of younger customers, who are more interested in hanging out in a fun, social environment and having the freedom to design their own dining experience as an alternative to the traditional. We knew that the physical make up of the space would
be key to succeeding in providing the best experience, as well as meeting the business goals of the company. As a low-cost way of testing the concept, we built a full-size, foam-core mock-up and brought in chefs, servers and faux-customers to test its viability. The prototype taught the team important lessons, such as the importance of every seat being able to view the two wood-fired ovens that we identified as being one of our key experience drivers. In order to actually test how the restaurant would work, however, our client went the next step by opening up a pilot restaurant in a Boston suburb as a live laboratory. There we were able to test the concept and make further changes before launching more widely.

One important feature, for example, was ‘table turns’. With most restaurants, the goal is to turn over tables as quickly as possible to increase revenues but, with 2ovens, we developed a concept in keeping with Millennials’ habits. We encouraged them to stay longer, continuing to eat and drink as they did. To facilitate this, we created a slot at the table to hold the menu so it would always be there for ordering. Because of the layout, however, some tables were not able to incorporate this feature. By analysing the data from the tables with and without menus, Bertucci’s is able to tell how this feature affects
length of time spent and total revenues for tables, in order to see if both the company’s business goals and the goals for customer experience have been met. While the pilot restaurant has already vastly exceeded expectations – a great sign overall – we’ve also been able to qualitatively identify some aspects of the experience that have not been implemented effectively. By identifying and correcting these aspects of service design, the company will be better able to ensure the efficiency of its staff and the enjoyment of its customers when it opens more widely in the coming months.

REFINING EXISTING EXPERIENCES
While the 2ovens work allowed us to build a project from the ground up, often we are working to refine experiences that already exist, giving companies a set of tools to constantly monitor experiences to gradually improve them or, in some cases, completely retooling them to fix what isn’t working. When we worked with a global medical diagnostic firm to create a new service experience for blood testing, we found the actual test took only a few moments. However, customers perceived their wait time as being insufferably long due to the
anxieties they had about the blood-drawing process, potential results, time available away from their job and paying for the test. By focusing on the patient’s anxiety as the key experience driver, the company worked to minimise anxieties by altering the layout of the space, the information available and the intake form, as well as installing an electronic queue that would allow patients to know where they stood in line. In order to capture the emotional aspects of the experience, they implemented a touchscreen survey at the end with a series of simple questions to understand how the wait time matched the patient’s expectations. Using that data, we were able to help the company make subtle changes in how representatives greeted people or seated patients, monitoring the effects in real time to improve the overall experience.
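A perceived-versus-actual wait comparison of this kind can be sketched in a few lines of Python. Everything below – the visit records, field names and numbers – is hypothetical, invented purely for illustration:

```python
from statistics import mean

# Hypothetical records: actual wait (minutes, from the electronic queue)
# and the patient's perceived wait (from the touchscreen survey).
visits = [
    {"actual": 6, "perceived": 15},
    {"actual": 8, "perceived": 20},
    {"actual": 5, "perceived": 9},
    {"actual": 7, "perceived": 18},
    {"actual": 4, "perceived": 6},
]

def perception_gap(records):
    """Average number of minutes by which perceived wait exceeds actual wait."""
    return mean(r["perceived"] - r["actual"] for r in records)

def perceived_ratio(records):
    """How many times longer the wait feels than it really is, on average."""
    return mean(r["perceived"] / r["actual"] for r in records)

gap = perception_gap(visits)
ratio = perceived_ratio(visits)
print(f"Perceived waits exceed actual waits by {gap:.1f} min on average "
      f"({ratio:.1f}x the real duration)")
```

Recomputing these two summaries after each service tweak shows whether the perception gap – the anxiety signal – is actually closing.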
“ [...] the physical make up of the space would be key to succeeding in providing the best experience, as well as meeting the business goals of the company.”

AUDITING UNDERPERFORMING EXPERIENCES
We performed a more dramatic intervention for a major financial services company that wanted to better integrate customer service and consultation in their newly formatted retail locations. Their existing space was designed with a traditional counter for transactions, as well as a row of offices for consultation. In addition, they had installed a large video wall with LCD screens to advertise their latest service offerings. In order to understand how these services were being utilised, we analysed a day’s worth of security footage from multiple cameras around the space. We meticulously noted the demographics of customers, where they walked, and even where they looked inside the space, in order to draw up a ‘heat map’ of the average customer’s experience. When we analysed this data, we found that, despite our client’s design intent, the vast majority of customers were
still using the transaction counter. Only 2% of people over the course of the entire day stepped into one of the fully staffed offices for a consultation. What’s more, not a single person went over to the video wall, or even looked at it over the course of the day. From those findings, we were able to dramatically redesign the service environment in order to more successfully achieve the company’s intended customer experience and business goals. While these techniques are still in their early stages, we can see them being used more widely to rapidly prototype, test and refine customer experiences in a wide variety of physical spaces through market launch and beyond. This ‘live lab’ approach to customer experience can help to minimise risk and meet both customer and business needs by providing actionable insights that ensure the most critical components of customer experience are delivered upon at all times.
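The footage-coding exercise described above lends itself to a simple numerical sketch. Assuming customer positions have already been extracted from the video as (x, y) coordinates – the synthetic positions, grid and zone boundary below are invented for illustration – a heat map and zone statistics take only a few lines of NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tracked positions (x, y in metres) of customers over one day,
# coded from security footage. Most traffic clusters near the counter.
counter = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(480, 2))
offices = rng.normal(loc=[8.0, 4.0], scale=0.3, size=(10, 2))
positions = np.vstack([counter, offices])

# Bin positions into a grid: each cell counts how often a customer stood there.
heat, xedges, yedges = np.histogram2d(
    positions[:, 0], positions[:, 1],
    bins=(10, 5), range=[[0, 10], [0, 5]],
)

# Share of observations falling inside the consultation-office zone (x >= 7).
office_share = heat[xedges[:-1] >= 7].sum() / heat.sum()
print(f"{office_share:.0%} of observed positions were in the office zone")
```

Plotted as colours, `heat` is exactly the ‘heat map’ of the average customer’s day; the zone share is the kind of hard number (e.g. the 2% consultation figure) that motivates a redesign.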
When Design and Market Researchers Join Forces The challenge of member retention in health clubs
Remko van der Lugt is professor of co-design at Utrecht University of Applied Sciences and researcher at ID-Studiolab, TU Delft. His current research focuses on involving users in designing for sustainable behaviour change. He has extensive experience in organising and facilitating creative co-design processes in business, governmental organisations and academia.
Gerrita van der Veen is professor of Marketing, Market Research & Innovation at the HU Business School Utrecht, and a partner at HighValue, a brand consultancy in Amsterdam. With more than 20 years of experience in market research, she is a specialist in consumer behaviour in relation to brand strategy, concept development and innovation.
The health club industry is booming: one in six citizens of the Netherlands is now a member of a health club. However, about one quarter of health club members cancel their membership every year. The cancellation rate is accelerating. Loyalty in other sports – football, for example – is much greater. The health club industry’s most pressing question is, therefore: what can be done to promote the retention of health club members? The retention issue is partly attributable to increased competition in the industry. More and more people want to stay fit, and increasing numbers of health clubs have opened their doors to meet demand. The health club industry is a prime example of an industry where entrepreneurs start their businesses based on an opportunity combined with personal passion: in this case a passion for sports and fitness. Health club owners must invest significant sums of money in equipment when they start out. In order to attract new members, they tend to promise strength and endurance by providing the best training methods and equipment, assuming that this will attract customers, since this is what they would want themselves. However, many of their customers have different motives and desires. If these are not met, they are likely to walk out the door
within the first three months. Health club owners need to start thinking from the customer’s perspective and to understand that different customers have different motives for going to the health club. ‘Band-Aid’ solutions do not suffice. One health club owner told us how he had invested a large sum of money to install a fancy fireplace, in order to enhance the ‘experience’. However, his members do not gather around the fireplace, and he feels that the investment was made in vain. In order to help health club owners gain insight into the motives and needs of customers so they can target their innovation efforts accordingly, we recently conducted an extensive research
and innovation project. We opted for a combination of generative qualitative research and quantitative panel research, followed by a series of idea generation sessions and service innovation pilots.

DESIGN RESEARCH MEETS MARKET RESEARCH
Most marketing models and techniques tend to make the customer fit the world of the company, instead of opening up and truly trying to understand the customer’s world. The result is that market researchers confine their research to the needs and desires of customers regarding the domain in which the current activities of the company take place. Organisations tend to perpetuate the image they have of themselves by using it to interpret the world around them.

In marketing, existing theoretical models are used to identify target groups. Lifestyle models segment consumers based on different lifestyles, each of which represents a particular value-orientation and associated consumer needs. Psychodynamic models based on the work of Freud and Jung describe consumer behaviour by means of archetypal needs-fulfilment strategies, thus providing an understanding of the world of latent needs and access to them. Such theoretical models harbour their own vision of reality, thus sustaining the internal orientation.

The design-research approach aims to take the world of the customer as the ‘centre of the universe’ rather than the world of the company. These research methods attempt to systematically study the complexity of the consumer context without entering the interaction with preformulated hypotheses. Such a user-driven approach can help companies take on an external orientation, provided that
a. it delivers valid insights;
b. the company takes part in the learning process;
c. serious efforts are made to develop a shared understanding between the design researchers and market researchers.

THE RESEARCH PROCESS
The qualitative techniques were used to map the customer’s experiential world in all its richness. The results of the qualitative study were subsequently used as the basis for an exploratory follow-up study, in which we searched for relevant motives and the resulting motivational customer segments. For this, we chose a quantitative approach, because the quantitative extrapolation of structures and patterns from the data is more easily accepted as ‘evidence’ to support our choices. These results were then reconnected and enriched with the insights from the qualitative study. Developing various innovative service concepts for each customer segment provided stepping stones for the health club owners to start targeting their innovation efforts.

The project team consisted of a sub-team of service designers and a sub-team of market researchers. In addition, a group of 16 fitness club owners were actively involved in supporting the
research, interpreting results, generating ideas and executing pilots within their businesses. In order to prepare the participating health club owners, we provided them with an interview kit, and asked them to investigate their club members’ experience at the gym. This raised awareness among the health club owners that their clients’ needs are different from their own. In a session, the owners identified a set of user profiles based on the identified needs. The results were used as input for designing both the generative research tools and the questionnaire for the panel research.

Interview posters (left) and health club owners interpreting and organising the interview posters (right).

GENERATIVE DESIGN RESEARCH
Thirteen athletes from two different health clubs in the Dutch city of Utrecht took part in the qualitative part of the study. The health clubs differed in their range of services and location. One health club was small and located in a deprived area. The other was large and located in an affluent neighbourhood.
About a week before the first session, the participants received a sensitising package containing inspiring assignments that helped the athletes to get prepared. For instance, one task was to photograph the contents of their own gym bag and make a series of photos of their trip to the health club.
The fitness sensitiser, containing a variety of assignments for the participants.
In the two group sessions, individual creative assignments were alternated with group discussions. This helped the participants to bring their own world of experience into focus. They provided input through collages and then presented their creations to each other and discussed the resulting insights. All of the sessions were recorded on video. Relevant segments were identified from the video footage and then transcribed. Segments were clustered and relationships were identified. A model gradually developed that identified the motivational factors for these athletes. The model shows that the athlete’s motives change during the various phases of their health club experience.

Different motivational drivers during the fitness journey (before – during – after): motivation, experience and result, spanning drivers such as desire, structure, discipline, pressure, self image, movement, social context, feeling good, improve, maintain and compensate.

We constructed a set of seven personas based on the motives, desires and ambitions of the group of athletes. Here, we opted for staying as close as possible to the characters of the real users in the study. For each cluster of needs, one participant was selected to develop more fully as a persona, allowing for slight fictional additions and the inclusion of insights from other users in that cluster.

PANEL RESEARCH

The results from the qualitative study were used as a basis for quantitative panel research in order to gain insight into the various behavioural motives, on the basis of which a number of different motivational segments could then be distinguished. The customers of twenty different health clubs were asked to rate statements on motivation and perception on a 7-point scale based on personal applicability. More than 5,000 respondents participated in the study from January to June 2010 (response rate 30%).

The basis for our segmentation was a combination of the motivation and perception statements for all three phases of a visit (before, during and after). These questions covered the psychological drivers relating to exercise as they had been revealed through the qualitative research. Firstly, a factor analysis was carried out in order to expose the structure of the statements. Two strong factors and two weak factors were derived from this analysis; all the factors lent themselves to clear interpretation.

Results of the factor analysis on motivation and perception statements (factors: achievement oriented, discipline, extrinsic motivation, individual; loadings ranging from .36 to .77). Statements included: ‘Admiration from others, greater self-confidence through better appearance’; ‘When I see others, I want to achieve that too’; ‘Seeing others working hard gives me energy’; ‘I constantly want to be better’; ‘It’s an addiction: sometimes I have to stop myself from going’; ‘Sport can also be an escape’; ‘I used to be overweight, but now I have no weight problem. That is because of exercising’; ‘I feel great afterwards’; ‘I like the rhythm and regularity’; ‘You know that it’s good for you, so you go’; ‘It irritates me if other people watch me while I’m exercising’; ‘I find it dull to exercise using apparatus’; ‘I like eating good food and I like going to the pub, so I also need to exercise’; ‘Having fun exercising on your own and feeling happy’; ‘It’s more fun with other people’; ‘The freedom to decide for myself when I exercise’.
On the basis of these factor analyses, we formulated the following four factors relating to what motivates health club members to exercise:
• Factor 1: performance versus compensation (as the personal goal of the health club customer): those whose personal goal is performance prioritise physical hard work and mental release. Those whose personal goal is compensation exercise in order to compensate for an unhealthy lifestyle.
• Factor 2: discipline versus freedom: those who like routine and regularity versus those who would like to be able to exercise whenever they feel like it.
• Factor 3: extrinsic versus intrinsic motivation: those who do not derive much pleasure from exercise versus those who enjoy the activity for its own sake.
• Factor 4: exercising individually versus exercising together: those who would rather exercise individually versus those who prefer to exercise with others and sometimes need others in order to achieve their fitness goals.

Athlete segmentation based on the four motivational factors, mapping segments such as the results-oriented customer (performance) and the maintenance customer (intrinsic motivation), illustrated with member quotes: “If you are fit, you look better and that means you will achieve more in life... people who look better are more successful”; “I am fairly disciplined. I do the things I ought to do in order to compensate for my habits”; “If I want to stay happy, I just have to exercise. It makes me feel good”; “I exercise at least six times a week and always at least for two hours”; “When I exercise with other people, I have more fun. I talk more, and I enjoy myself more”.
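The factor-extraction step behind such results can be sketched as follows. This is not the study’s actual procedure or data: the 7-point ratings are synthetic, and eigendecomposition of the correlation matrix (principal component loadings) stands in for whatever factor-analysis routine the researchers used:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 7-point ratings from 500 respondents on 8 statements.
# Statements 0-3 are driven by one latent motive, 4-7 by another.
n = 500
latent = rng.normal(size=(n, 2))
loadings_true = np.zeros((8, 2))
loadings_true[:4, 0] = 0.85
loadings_true[4:, 1] = 0.70
raw = latent @ loadings_true.T + rng.normal(scale=0.6, size=(n, 8))
ratings = np.clip(np.round(raw * 1.2 + 4), 1, 7)  # squash onto a 1-7 scale

# Eigendecompose the correlation matrix and scale eigenvectors into loadings
# (a simplified stand-in for the factor analysis used in the study).
corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # strongest factors first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
loadings = eigvecs[:, :2] * np.sqrt(eigvals[:2])

for f in range(2):
    top = np.argsort(-np.abs(loadings[:, f]))[:4]
    print(f"Factor {f + 1}: statements {sorted(top.tolist())}")
```

Reading off which statements load highest on each factor is what allows naming factors like ‘achievement oriented’ or ‘discipline’ in the table above.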
A two-step clustering analysis was then carried out.[1] This resulted in the formulation of five customer segments, which differ from one another in each of the four motivators. Rather than providing a report, we ensured that the resulting insights would indeed lead to new perspectives by means of organising four creative sessions based on the motivational model. Each session focused on a single customer segment. Eight health club owners participated in these sessions. They each brought along one of their members who represented one of the segments. These sessions led to 61 ideas that were developed into 21 innovation concepts. These concepts were then organised in the various customer segments, thus providing a framework for fitness entrepreneurs to focus their innovation efforts and direction for new services. Finally, about 15 concepts were prototyped within individual health clubs in order to provide case examples.

An example of a service concept for the ‘enthusiast’ customer segment is ‘Items-of-Connection’: many athletes do not appreciate social contact during their workout. At the same time, anonymous athletes experience a minimal bond with the health club. A variety of low-threshold elements promote brief and no-obligation moments of contact. For instance, when an athlete logs on to a piece of exercise equipment, their unique avatar appears on the ‘Connect Wall’ engaged in the relevant activity. Innocuous, playful interactions
take place between the avatars at random. This provides the athletes with no-obligation opportunities to interact with people around them.
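A simplified stand-in for the segmentation step above can be sketched with a plain k-means on the four factor scores (the study used a two-step clustering procedure; all scores and segment centres below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical respondents scored on the four motivational factors
# (performance-compensation, discipline-freedom, extrinsic-intrinsic,
# individual-together), drawn around three made-up segment centres.
centres_true = np.array([[ 1.5,  1.0, -1.0,  1.0],
                         [-1.0,  0.5,  1.0, -1.0],
                         [ 0.0, -1.5,  0.0,  0.5]])
scores = np.vstack([c + rng.normal(scale=0.3, size=(200, 4))
                    for c in centres_true])

def kmeans(data, k, iters=50, seed=0):
    """Plain k-means: a simplified stand-in for two-step clustering."""
    r = np.random.default_rng(seed)
    centres = data[r.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Assign each respondent to the nearest centre.
        dists = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centre to the mean of its members (keep it if empty).
        centres = np.array([data[labels == j].mean(axis=0)
                            if np.any(labels == j) else centres[j]
                            for j in range(k)])
    return labels, centres

labels, centres = kmeans(scores, k=3)
print("segment sizes:", np.bincount(labels))
```

Each resulting cluster centre describes a segment’s profile on the four motivators, which is exactly the kind of summary a creative session can then be built around.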
ENGAGING THE CLIENTS

The health club owners were actively involved throughout the whole process. This helped them to understand the mechanics of the market they were operating in. For this purpose, the creative and qualitative design research approaches were much more powerful. However, these entrepreneurs first needed to be convinced of the purpose of this effort. As the status quo is numbers-based market research, they first had to be engaged by means of basic statistics, after which they were more prone to open up to the real voice of the customer.

LEARNING GOES BOTH WAYS

The collaboration between the disciplines of design and marketing provided learning in two directions. The creative skills of the designers helped the market researchers and health club owners to access the world of imagination. Marketing and market research are disciplines that are very cognitive in nature. In order to bring emotional aspects into the equation, the market researchers appreciated the visualisation and narrative skills of the design researchers. Simultaneously, the marketeers and market researchers helped bring reason to the design process or, at least, they helped substantiate insights and concepts, in order to make them more viable in the business world. Solely using results from the largely subjective design research may be useful in the design process; however, for a segmentation that can be used in business practice, a sense of objectivity is required. This is where design researchers can learn from the experience of market researchers.

‘Why should I trust your findings?’ The qualitative analysis and interpretation in design research is largely a subjective process, which puts high requirements on the skills and experience of the researchers. At the start of the process, this led to distrust from the market researchers. We put effort into inviting the market researchers to be present at the generative sessions and during the data interpretation. Along the way, the market researchers started to appreciate the value, especially when they saw how the results made things ‘come alive’ for the health club owners. To enhance collaboration in the exploratory stage of user-driven research, qualitative analysis can be complemented by an exploratory quantitative segmentation. This allows for the necessary flexibility to learn from the world of the customer, without being (completely) dependent on the subjective interpretation of the researchers. It also makes it possible to substantiate and legitimise choices made for a target group and positioning strategies.
Ultimately, the goal of this research project was to help health club owners understand their customers better, so that they can make better use of their limited innovation budgets to keep their customers on board. The quantitative research provided them with information about the market segmentation. The qualitative design research provided them with empathy and a deep understanding of their customers’ experiential worlds. Most of all, their engagement throughout the project, including the piloting of service innovations at their venues, inspired them to really make choices and take initiative in developing new services that better meet their customers’ needs and desires.
[1] Since the factors are not all equally reliable, the cluster analysis was carried out using the individual items.
A Comprehensive Model for Measuring Value Merging the Needs of Management and Design
Antti Koskinen is service designer at Service Design Agency Palmu Inc. Antti has worked as the creative director or CEO of various design agencies for the past 14 years.

Petteri Hertto is service analyst at Service Design Agency Palmu Inc. Petteri has over 10 years of experience in concept testing and service optimisation. His areas of expertise are KPI definition and quantitative and qualitative design research.

Maiju Nöyränen is service analyst at Service Design Agency Palmu Inc. Maiju is a psychologist with 12 years of experience in behavioural analysis, qualitative customer profiling and customer insight.

Mikko Jäppinen is service designer at Service Design Agency Palmu Inc. Mikko is an industrial designer with over 15 years’ experience in designing services. He specialises in concept design, sketching and visualising.
In most cases, managers and designers need different data. Managers use ‘hard data’, figures and KPIs (Key Performance Indicators) to make decisions based on validated facts. Designers need tools to understand customer needs, motivation and behaviour. We will introduce a proven model for merging the qualitative design drivers preferred by designers with the quantitative KPIs preferred by managers. The model represents an agile method of iterative co-design. It builds a service that meets established goals and enables an optimised balance between business and customer value in service execution.

‘Customer Insight’ is a hot business buzzword today. A considerable amount of research is conducted to support business decision-making and the work of designers. But, in real life, most of this research is rarely utilised. Findings are presented and – in some cases – used in business decision-making. But when the project lands on the designer’s desk, the findings are ignored. The vast majority of customer research is conducted using quantitative methods full of figures and percentages on demographics, brand preferences, etc. Qualitative studies flesh out the numbers and gather general expectations and opinions of the target group. The definition ‘Female, 35-45 years,
Conservative’ enables development management to estimate the size of the market and tone of communication, but offers few guidelines to designers. A designer needs concrete, hands-on information: why are customers using the existing service? What is their motivation? What problems or limitations exist with it? Designers need a deep understanding of customer behaviour and contextual and cultural factors, as well as identification of those factors that would solve the customers’ problems. Results should be presented in a format that is directly applicable in service development: revealing behaviours, motivations, needs and their relevance for various customer profiles.
Quite often, even tiny alterations in design can make drastic changes in the value of service for the customer (customer value). Traditional market research is too general to measure these micro-improvements. The conflict between the information provided and actual design needs has led to a situation wherein design is said to be based on research, while designers are actually making decisions based on instinct. Traditional research tools are not perfect for management, either. Typically, a potential idea is tested before management will make large investments in it: this testing generally takes the form of concept tests conducted by market researchers. There are several problems in this traditional testing approach: it does not iterate the concept; it does not indicate if the idea is void
of customer value and it does not communicate how the concept should be changed to increase customer value. But the main problem is that, with traditional testing, the findings are recognised too late. Typically, when research shows that there are problems with the idea or design at hand, no-one involved wants to listen. Designers do not want to admit that they have done poor design. Project managers don’t want to report that they have wasted several weeks and substantial sums on design and research. The typical outcome is either the death of the potential concept or only a few, easily fixable cosmetic changes to the solution.

Palmu Inc. is a service design company founded by designers, researchers and business consultants. For four years, designers and researchers have sat side by side and developed the method described below. It has grown out of a culture in which the researcher has the final say on design and the designers have the right to demand the answers that they actually need. The first step in the creation of a service is an idea. Service ideas arise from different sources: customers,
brainstorming or even research. Our Service Success Forecasting model measures the potential of an idea and shows how it should be designed to achieve its maximum potential.

SERVICE SUCCESS FORECASTING
Level 0: Goal setting

When a potential service idea has been found, a KPI workshop is held at which designers and managers jointly define the success metrics and the goals for the service. In order to achieve common agreement and engagement, it is vital that both parties be involved at this stage. The established KPIs will, throughout the design process, act as quantifiable goals that reveal customer value and will drive the coming service to success.

Level 1: Conceptual

Once the KPIs have been defined, the idea is tested on target users as soon as possible. Even the simplest mock-ups and MVPs (minimum viable products) will reveal customer reactions. The first test round should focus on validating customer needs and the ability of the idea
to solve them. Testing as soon as possible ensures that the right course is charted: pivoting is easy at this point.

Level 2: Usability

When customer feedback validates that the idea solves real customer problems, the design of the actual solution (i.e. a visual prototype) begins. Again, the solution should be tested on target users as soon as possible. If there is any negative feedback, corrections should be made instantly and another round of testing undertaken. The key issue is to test the service concept iteratively. Varying the elements of the solution between test rounds enables agile co-design of the concept. This method guarantees that we end up with an extensively tested, customer value-proven service – not a stack of dense test reports that nobody actually utilises. Test iterations are repeated until all customer feedback is positive.

Level 3: Wow!

But that is not enough! We strongly emphasise that ‘OK’-level executions (everything is fine in terms of usability
and idea) just will not cut it when it comes to gaining the necessary customer and business value for a successful service. A successful service has to have something that differentiates it from competitors and leaves a yen for it in customers’ minds. We call this reaching the ‘Wow!’ level. If the concept iteration reaches the Wow! level, the service idea has a strong prognosis for success. If the Wow! level is reached, but the big picture remains unclear, a quantitative test can be used to validate the results with a larger customer base, e.g. to gain feedback from different customer profiles. Quantitative tests also reveal the raw numbers for managers: is the concept attractive enough to create a business case?

THE MODEL IN ACTION
The model is iteratively co-created within our multidisciplinary organisation through a vast number of service design cases for different business areas. The fundamental insight in iteration of the Service Success Forecasting model was the importance of cooperative goal-setting in KPI workshops. A facility services client, for example, was experiencing major challenges in implementing new service ideas in their organisation. The KPI workshop was used to set shared goals at every level of the organisation. During the iterative rounds of service idea testing, KPIs were also visualised for everyone – managers and workers and clients of the facility services company – and designers had access to see how the new pop-up service iterations performed in real life. An iterative, agile development process is used in the majority of our
cases. The opportunity to make quick fixes and dramatic pivots in concepts and usability has become a natural part of the design process. Service ideas for media, banking, insurance, medical and directory company clients are quickly and continuously prototyped and tested: a single relevant piece of feedback from a customer can lead to immediate iteration. We strongly believe that the best proof of the relevance of the iteration is vigorous testing. Test interviews are open to the whole team (researchers, designers and the client) through a broadcast streaming application. Most of the concepts we have tested easily reach the ‘OK’ level. Applause-generating triumphs are rare. In a case for a car website, only one of five good service ideas survived the process. Another example is that of a real-estate company that wanted to develop a self-service concept for tenants. After rounds of testing iterative mock-ups, we needed to know the respective customer value of various service elements in order to prioritise their hierarchy and role within the self-service concept, and to identify the elements that reached the ‘Wow!’ level. We achieved this through quantitative research and through studying layouts and customer profiles via multivariate analysis (cluster and factor analysis tools). As true believers in the ‘suck-it-and-see’ culture of design, we also continuously and iteratively test our Service Success Forecasting model. Only in this way can we create increasingly better services for customers, whose demands for Wow! experiences are continuously increasing.
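The multivariate step described above – clustering customer profiles and prioritising service elements by customer value – can be sketched roughly as follows. This is a minimal illustration with invented rating data and a toy k-means; it is not Palmu's actual tooling, which combined cluster and factor analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey data: 60 tenants rate four self-service elements
# on a 1-7 scale. Two synthetic respondent groups with different priorities.
group_a = rng.integers(5, 8, size=(30, 4)).astype(float)        # rates everything high
group_b = np.column_stack([rng.integers(1, 4, size=(30, 2)),    # low on first two elements
                           rng.integers(5, 8, size=(30, 2))]).astype(float)
ratings = np.vstack([group_a, group_b])

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns a cluster label for each respondent."""
    r = np.random.default_rng(seed)
    centres = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centres) ** 2).sum(axis=2), axis=1)
        centres = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centres[j] for j in range(k)])
    return labels

labels = kmeans(ratings, k=2)
elements = ["fault reporting", "room booking", "payments", "notice board"]
for j in range(2):
    # Rank the service elements by mean rating within each customer cluster
    profile = ratings[labels == j].mean(axis=0)
    ranked = [elements[i] for i in np.argsort(-profile)]
    print(f"cluster {j}: priority order = {ranked}")
```

The element names and rating distributions are assumptions for illustration only; in practice the cluster profiles would feed the prioritisation of elements within the self-service concept.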
touchpoint 5-1 43
Lean UX: An Iterative Process Between Quantitative and Qualitative User-experience Research
Anna Lässer, Design Thinking consultant, relevantive AG.
Christina Dicke, Ph.D. senior researcher, Quality and Usability Lab, Telekom Innovation Laboratories, TU Berlin.
If UX research is to become the driving and defining force in the product development cycle – and it should – it cannot be sporadic. Particularly in the case of innovative product development, it has to go beyond the traditional insights and issues of focussed, selective UX testing towards an iterative process of validating market needs and defining business goals, from which customer value propositions can be derived and product features developed. While many companies rely heavily on (web) metrics to reveal quantified tendencies, metrics do not give insights into why users use products or services in a specific way. By combining both quantitative and qualitative methods, applying Lean UX becomes a strategic advantage: by radically translating qualitative user insights into measurable units, companies have the means to evaluate proposed solutions and designs along the KPIs most viable for their specific business model and product phase. This paper will introduce a practical Lean UX approach that combines qualitative and quantitative research and utilises Design Thinking methods for fast-paced, user-centred product and service development. After giving a definition of Lean UX and Design Thinking to establish a common understanding, the
key features of a Lean UX process will be introduced and exemplified by a real-life scenario. Finally, the interplay between qualitative and quantitative research will be highlighted.
LEAN UX AND DESIGN THINKING
In the context of this paper, Lean UX and Design Thinking are understood as iterative, solution-focussed approaches to user-centred design and product development. They embody the philosophy of question first, then build, measure, and learn1. Therefore, they advocate testing ideas at an early stage so that adjustments can be made promptly, with minimal effort and at little cost. Lean UX (testing) focuses on a limited, well-defined set of research questions throughout the product development cycle and is defined only by this scope, not by the research methods applied.
deep dive: collecting relevant insights
Phases of the Lean UX & Design Thinking methodology: kick-off workshop (all stakeholders), feature backlog, prioritise core feature, rapid prototyping, evaluation & optimisation, implementation, testing & optimisation
While Lean UX (testing) seeks to answer the questions posed and generates user insights, Design Thinking methods are used to make insights accessible to all stakeholders, to foster active collaboration, and to support a creative and solution-oriented product or service design.
PHASES OF LEAN UX
The Kick-Off Workshop
As depicted in the figure above, each Lean UX project starts with a kick-off workshop, which is crucial to understanding a company’s current situation and needs. In the case of a company providing a car-sharing service, product owners, technicians, designers and other relevant stakeholders would participate in the workshop. Together with all stakeholders, the product or offered service is analysed and put into an internal (business) and external (competitors) context. This is an important step to fully understand its facets, especially the Key Performance Indicators (KPIs) that the product is evaluated on.
The Feature Backlog
In the feature backlog, already existing hypotheses and assumptions regarding user needs and possible causes of problems are collected. Additionally, a list of features that are being considered for implementation is added to the log; these are then translated into research questions and hypotheses.
Prioritise Core Feature
From the feature backlog, the feature, hypothesis and research question with the highest priority is selected.
• Example Feature: an RFID sticker on a driving licence to replace separate access cards.
• Example Hypothesis: a certain number of users will not always want to carry a separate ID card with them to gain access to the car-sharing service and will therefore not have it to hand when they decide to use a car on the spur of the moment.
• Example Research Questions: When do users use car-sharing services? Do users plan trips ahead or use car sharing on the spur of the moment? What do users think about ID or access cards?
Rapid Prototyping & Evaluation and Optimisation
Depending on the core feature, hypothesis and research question, a suitable prototype and evaluation method will be designed that will deliver the necessary insights. Methods with a qualitative emphasis, such as interviews, behavioural observations
“While qualitative research seeks to generate an understanding of users, their motives and needs, quantitative research validates hypotheses formed on the basis of qualitative research.” and diary studies are good choices for enabling an understanding of potential users, how they interact with certain technologies or services, presumptions, prior relevant experiences, mental models, and so forth. Depending on the insights gained, the prototype is adjusted and evaluated again. Once the best possible prototypical solution is found, all relevant insights gained are presented in a workshop. By applying Design Thinking methods, these insights and the prototypical solution are presented to the product team in a way that makes them as accessible and transferable into an implementation as possible. Active collaboration is a core element of a successful workshop, through which recommendations can be formulated and solutions can be conceived. As a general principle, at a minimum those stakeholders who later have to work with the research results ought to participate in these phases as observers. Integrating the product team’s feedback into the prototyping and evaluation cycle makes the research itself iterative, as it can be tailored to cater to the company’s needs. The product team’s participation during this
phase is an important step in building more empathy for potential users and gaining an understanding of users’ needs, which, in turn, enables the product team to successfully create user-centred products.
Implementation
The prototypical solution is then implemented based on the recommendations given by the product team.
Testing & Optimisation
The implementation and the proposed solution are then tested and evaluated. This can be done by means of a quantitative performance measurement, for example in the form of A/B testing (where A = the solution in place and B = the intended improvement), or, depending on the core feature and associated research questions, by another quantitative study focussed on evaluating the hypothesis. If, for example, the research team finds a strong influence of gender, income or some other aspect on the usage of car-sharing services, a questionnaire-based evaluation with a sufficiently large sample might be the right choice to demonstrate a statistically significant effect and its size.
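A quantitative A/B performance measurement of the kind described can be sketched with a standard two-proportion z-test; the conversion numbers below are invented for illustration and the function is a generic textbook test, not a tool the authors name:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: does variant B's conversion rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: A = current booking flow, B = redesigned flow
z, p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If p falls below the agreed significance level, the improvement can be reported against the company's KPIs; a follow-up qualitative round would then explain why users preferred B.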
The interplay of qualitative and quantitative research methods in Lean UX
QUALITATIVE AND QUANTITATIVE RESEARCH IN LEAN UX
The figure above visualises the interconnection of qualitative and quantitative research in a Lean UX project. Given that the Lean UX approach accompanies a product or service throughout its lifecycle, data or insights generated by one method feed into the other. While qualitative research seeks to generate an understanding of users, their motives and needs, quantitative research validates hypotheses formed on the basis of qualitative research. As all hypotheses aim at improving elements that impact KPIs, a post-hoc evaluation of each iteration in the Lean UX cycle is a business imperative in user-centred design. On the other hand, user behaviour, revealed by quantitative data analysis, ought to be explained by qualitative research. This, too, is a necessity, as an exclusively number-based optimisation precludes solutions and innovations that are based on a deeper understanding of the user. Thereby, Lean UX evolves into a process that generates results that fit the pace set by a client. It complements short deadlines that demand a very focussed, yet sound research design that delivers
actionable and verifiable results. Lean UX is based on four requirements:
• Knowledge about internal and external stakeholders has to be shared;
• Directly related stakeholders, such as the product team, need to be involved;
• Performance measurements based on the company’s KPIs need to be conducted to evaluate the success of recommendations and solutions; and
• All insights and data need to be made as accessible and understandable as possible to stakeholders by applying Design Thinking methods.
Hence, by strategically combining rapid prototyping and quantitative and qualitative research into a format facilitated through Design Thinking, it is possible to fulfil two main requirements stipulated by companies: to enable an agile and iterative context that adapts to the company’s development pace and to ensure validated decisions that will have a positive impact on the company’s business goals.
References 1 Ries, Eric (2011). The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses. Crown Publishing.
Purpose-Driven Research as Key to Successful Service Design
Stefan Moritz recently joined Veryday as director of service design. Featured as a notable alumnus in Business Week, Stefan has worked with numerous blue-chip brands, including Adidas, Disney, Nokia and Philips. He has accumulated wide experience from the fields of marketing, design and technology.
Marcus Gabrielsson is a design strategist and researcher at Veryday. Based in Stockholm, Sweden, Marcus has been with Veryday for 10 years. Leading multidisciplinary teams, he has extensive experience in designing and executing insight and analysis strategies, as well as translating insights into actionable and business-compatible concepts.
A service experience takes place at the very moment when a person interacts with an organisation’s touchpoints over time. That interaction can’t be exported, as the customer always has to participate. However, each person wears different hats when it comes to their needs and expectations, depending on mood, agenda, time of day, etc. For instance, travelling through an airport is different and feels different on a business trip versus a vacation with the family. Recent neurological and psychological research gives us new insight into the way experiences work. Daniel Kahneman is well known for researching the disconnect between the ‘Experiencing Self’ and the ‘Reflective Self’. The latter derives emotion from the memory of an experience. The peak positive and negative moments – and especially the end of an experience – shape that memory1. Understanding individual emotions is key to designing better contexts. Designing for better emotional experiences, as well as practical ones, places a new set of demands on service designers and enterprises. As many organisations are shifting to customer centricity, the demand for research, data and metrics is increasing. Decision makers need the numbers to back up the business case for each design concept, and designers need qualitative
insights to spark innovation. At first sight, there seems to be a conflict between rigour and magic. How can intuition and imagination best connect with robust validation? A recent global study of CEOs highlights that 66% consider customer relations a key source of sustained economic value and 73% are investing heavily in customer insights. The CEOs’ challenge is that, even if they uncover customer insights, their organisation is not necessarily equipped to respond with relevance and speed2. For service designers to respond successfully, it is critical to find the right combination of quantitative and qualitative research. It is as important to consider the decision-making process and organisational engagement as it is to uncover quality insights and innovative design solutions.
Four stages of research: from requirement document to sparking innovation
We see four key challenges. Firstly, finding the right balance between statistics (quantitative data) and the understanding of human needs and desires (qualitative data). Deep and truly game-changing insights most often come out of human interaction and are an essential part of the design process. Quantitative data can be limited in its depth and is often not sufficient to uncover the underlying emotions and motivations – the ‘Why’ behind the numbers that serves as innovation-sparking fuel. Qualitative research, on the other hand, can be time-consuming to prepare and expensive at a scale where it would be statistically robust. Often, decision makers are used to a more robust statistical validation and to research reports based on quantitative data. Secondly, to drive relevant outcomes, you have to negotiate between business reality and understanding customers. It can be difficult to share empathy with customers’ emotional experiences across all relevant stakeholders. This is not only key to
creating relevant designs, but also for stakeholder and project confidence, buy-in and prioritisation of investments. Thirdly, it can be hard to articulate the value of service design solutions in a way that connects with established measurements and benchmarks. The fourth challenge we see is empowering decision makers with the right data and intelligence to validate solutions and, what’s more, to justify these to their peers, the board or shareholders. Veryday has worked with both qualitative and quantitative design research for more than 40 years. In the last decade, a number of service design projects have highlighted the need for new perspectives and solutions. Ultimately, it comes down to undertaking a focussed effort to understand the constraints and culture of business owners, to negotiating the right approach and, ideally, to re-appraising it regularly. The framework of ‘Four stages of research’ has been a useful aid in optimising the choice and combination of relevant research formats.
The Very Deep Water framework: what people say & think; what people do & use (observations); what people know, feel & dream (psychological elicitation techniques) – revealing tacit knowledge and latent needs
There are five key lessons that might help us create more effective research for service design projects:
1. Research the research: properly leverage existing research and help the organisation to make sense of it from both consumer and business perspectives. For example, with a large international travel business we conducted an audit of all existing research to identify important principles and gaps, as well as making the whole body of knowledge accessible and more valuable. Real-time behavioural data presents new challenges, but also new opportunities. Persona clusters can be mapped against business models and help identify the value of service design solutions.
2. The client is also a user: organisations often need help navigating and packaging their insight strategy. As service designers, we should be well equipped to help organisations to design both qualitative and quantitative studies. We need to step back and take internal politics, motivations, perception of risk and value as seriously
as what we know about customers. We used this approach successfully to help the innovation department at a global life-science equipment provider completely change their strategy.
3. Sense and feel the magic: involve and engage different stakeholders to be immersed in research data. As Bill Moggridge said: “The only way to experience an experience is to experience it.”3 So, bring stakeholders along during research, give them a role and let them experience first hand. This will make them ambassadors for insights and concepts further on in the process and help connect organisation and customer needs. For example, with a global furniture retailer we have role-played selected situations and insights to give the client team a chance to ‘feel themselves’ into their customers in a very effective way.
4. Purpose as key to success: understanding the ‘Why’ of people’s behaviours and emotions is as important as why research initiatives take place and how they are linked to each other. Customer behaviour is not always well represented
Adapted from Sleeswijk Visser, Stappers & Van der Lugt, 2005
Shared Veryday method
Emotional Experience Mapping As humans, customers are influenced by emotions in their acceptance, liking and attachment to products and services. An emotion is both a mental and physiological state, associated with a wide variety of feelings, thoughts and behaviours. Emotions evoke different actions and reactions to products, services and the brands that create them. Alongside traditional customer journey maps, we also map the emotional journey of customers – before, during and after using a service – providing vital cues to innovation. Understanding emotions means that we can design the most compelling context, services and systems possible. Truly understanding and addressing the emotions of customers offers an opportunity to differentiate a service experience.
How does it work?
1. Segmentation & recruitment for interviews, shadowing and observation
2. Consumer interviews capturing and probing emotions that occur before, during and after user interaction
3. Emotion clustering
4. Mapping and analysis of the total emotional journey
5. Emotional pattern analysis
The map provides vital clues on how to add value to a proposition and make our clients’ offerings more meaningful and attractive to their customers.
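The mapping and pattern-analysis steps of Emotional Experience Mapping can be illustrated with a minimal sketch; the journey stages and valence codes below are invented, and the simple mean-valence aggregation is an assumption, not Veryday's actual coding scheme:

```python
# Hypothetical coded interview data: each journey stage collects emotion
# codes with a valence from -2 (strong negative) to +2 (strong positive).
journey = {
    "find address":  [+1, 0, -1, +1],
    "opening hours": [-2, -1, -1, 0],
    "queue":         [-1, -2, -1, -2],
    "pick-up":       [+2, +1, +2, +1],
}

# Emotional 'heat map': mean valence per stage of the customer journey
heat_map = {stage: sum(v) / len(v) for stage, v in journey.items()}
low_point = min(heat_map, key=heat_map.get)

print(heat_map)
print("biggest frustration:", low_point)   # candidate stage for redesign
```

Overlaying these per-stage scores on a traditional customer journey map highlights the emotional lows that the ideation phase should target first.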
“The opportunity for service designers is to see ‘Qual’ and ‘Quant’ as complements to the process.”
in explicit verbalised articulation. It is key to understanding what lies behind, what the drivers are and, ultimately, to discovering the underlying purpose. The framework ‘Very Deep Water’ has been very helpful in discussing and identifying the right methodological set-up.
5. Connecting research strategy with validation strategy: if the measurement is negotiated upfront, it becomes easier to connect and inform a quantitative study with qualitative input. In many projects, we have seen that a research set-up with clear constraints, as well as a strategy on how to process and validate data, makes the research and analysis process more effective. In a recent project, we helped a logistics provider improve the service offered to private individuals. We used our ‘Speed Dating’ method to respond to the requirement of quantitative validation. Through scenario business cases, extrapolation and quantified iterations, we established results that were judged to be ‘good enough’ for decision-making in a fraction of the cost and time that a regular approach would have taken. Overall, the opportunity for service designers is to see ‘Qual’ and ‘Quant’ as complements to the process. When the two approaches work together, the process is more efficient and service design projects more successful. We used such a Qual/Quant set-up for a large global
FMCG brand very successfully last year. During a six-month period, we conducted qualitative studies in Asia, the US, Europe and South America, spending many hours shadowing, interviewing and co-creating with consumers in their own private environments. Each insight process was followed by ideation, prototyping and local validation tests. This allowed the organisation to draw global conclusions from local insights and data. As a young field, we in service design need to continue to develop tools and best practice to bridge, as well as link, the two approaches. Ultimately, business facts and figures with a clear connection to human behaviour are hard to argue with. Setting a validation strategy early on in the process with decision makers and stakeholders creates a more integrated development process. Validated innovation rooted in deep insights is the key to building strong and sustainable service design solutions and can create valuable long-term brand affinity.
References
1 Daniel Kahneman, TED talk, 2010: http://www.ted.com/talks/daniel_kahneman_the_riddle_of_experience_vs_memory.html
2 IBM conducted more than 1,700 in-depth, face-to-face interviews with CEOs, general managers and senior public sector leaders from around the globe: http://www-935.ibm.com/services/us/en/c-suite/ceostudy2012/
3 http://www.billmoggridge.com/celebration/?cat=6
Shared Veryday method
Speed Dating From the start of this project, we knew that this logistics company required validation tests for concepts for them to be able to make strategic decisions. This meant we designed our qualitative method set-up with a quantitative study in mind. We considered how we were going to measure the success of our concepts. Consumer issues that later emerged during our qualitative study helped us frame problems, as well as giving input on how to measure the extent to which the concept addressed or eliminated these issues. With this knowledge, we outlined a quantitative study and incorporated qualitative variables at the same time.
The qualitative part: we researched customers from the very beginning of their decision-making process at their kitchen tables, through every step of the purchase process, from finding the right address and service, to looking up the opening hours of the service. Through a combination of immersive research techniques, observations and interviews, we tracked frustrations and emotional highs and lows throughout the whole experience. This allowed us to overlay an emotional ‘heat map’ on top of the more traditional customer journey map. This helped identify events and touchpoints and the emotional clusters connected to specific parts of the journey. Analysis of these clusters of emotions later informed the ideation process and helped us reduce customer frustrations with our design concept.
The quantitative part: the ‘Speed Dating’ method combines both quantitative and qualitative approaches. Having addressed consumer frustrations with our concepts, we wanted to find out to what extent these frustrations were reduced or eliminated. This was measured by asking questions related to the issue that the concept aimed to solve. In this way, we were able to extract statistics on how well we solved the problem.
This is how Speed Dating works: • A number of stations are set up in one location • Consumers are exposed to a service prototype that is brought to life • Throughout the validation test, the customer is prompted with various questions, which are logged through a questionnaire • When completed, the customer moves to the next prototype • Analysis and validation
Here is a sample question we asked in the project, based on a qualitative insight, namely that customers sometimes had difficulties understanding what input was needed from them in order for the service to perform at its best: “This service makes it easy for me to understand what I need to do to get what I want.” The respondents were then asked to rate on a scale how well the prototype lived up to the above statement. This way, we were able to quantitatively measure the acceptance level and likeability of each prototype. User-validated concepts, in combination with scenario business cases for each concept, allowed top management to make grounded decisions on how to prioritise and move forward.
Pros and cons of this method: this method requires active recruitment and working with the same cohort throughout both the qualitative and quantitative research stages (the same cohort, that is, for evaluating the prototype as the one identifying the need for it, and not necessarily the same individuals), and the experience needs to be brought to life. Both take time and drive costs that need to be considered. The benefit of this validation is being able to ask the ‘Whys’ for more qualitative input behind the collected data. In our experience, it is possible to get qualitative and quantitative feedback on several service prototypes from many customers in a relatively short period of time. The value of this method lies in obtaining a robust validation in a very effective way by leveraging the best of qualitative and quantitative techniques.
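The quantitative scoring step of the Speed Dating method can be sketched as follows. The prototype names and ratings are invented, and the 'top-2-box' share is one common way to summarise Likert-style agreement – an assumption here, not necessarily the authors' metric:

```python
from statistics import mean

# Hypothetical Speed Dating results: each list holds respondents' 1-5
# agreement ratings for the statement "This service makes it easy for me
# to understand what I need to do to get what I want."
responses = {
    "prototype_A": [4, 5, 3, 4, 5, 4],
    "prototype_B": [2, 3, 2, 4, 3, 2],
    "prototype_C": [5, 4, 5, 5, 4, 5],
}

# Mean agreement and share of "top-2-box" answers (4 or 5) per prototype
scores = {
    name: {"mean": round(mean(r), 2),
           "top2": round(sum(x >= 4 for x in r) / len(r), 2)}
    for name, r in responses.items()
}

# Prototypes ordered by mean agreement, strongest first
ranking = sorted(scores, key=lambda n: scores[n]["mean"], reverse=True)
print(scores)
print(ranking)
```

Combined with scenario business cases per concept, a summary table like this is the kind of artefact that lets management prioritise prototypes on grounded numbers.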
Life and Death Data Merging qualitative and quantitative approaches to improve patient engagement
Christopher Ferguson is the CEO of Bridgeable (formerly Cooler Solutions), a Toronto-based research and design firm. At Bridgeable, Chris leads project teams in the healthcare and consumer sectors with world-leading organisations such as Genentech, the Centre for Addiction and Mental Health, Mt. Sinai Hospital and Specialized.
This article will address the challenges and opportunities in transforming quantitative data into qualitative experiences for patients. This approach culminated in the design of a shared decision-making service for cancer patients. The service provides patients and health care practitioners with decision-making materials to improve the treatment experience of people with breast and prostate cancer. While quantitative data about treatment options has existed for decades, research showed that patients and physicians had very different understandings of the consequences and trade-offs each treatment offered.
BRIDGING SCIENTIFIC TREATMENT AND HUMAN EXPERIENCE
Within healthcare systems, our ability to collect quantitative data has never been better. Technological infrastructure and analytics are increasingly accessible and affordable. Meanwhile, our healthcare systems struggle to address the changing patterns of daily life. Patients are regularly frustrated by their inability to understand and manage their own health. In order to bridge this gap, we must design healthcare delivery that effectively merges quantitative data with experiences that are qualitatively improved from the perspective of patients.
Regulatory reforms are driving a change towards the inclusion of qualitative elements of patient experience. These reforms are replacing the traditional medical paradigm where the physician made treatment decisions driven by quantitative data. For example, many clinical trials now require that quantitative scientific results must be supported by qualitative reports from the patients. What this means in practical terms is that even if a new treatment outperforms a placebo by objective measures, the treatment benefits must also be reported by the patient. This has led to certain products that demonstrated scientific efficacy being rejected simply because patients did not experience the results as described in the study.
percentage of patients with breast cancer considering chemotherapy that rate living as long as possible a top priority:
The other major reform coming into place that is driving change within healthcare is the ability for physicians to bill for the time spent educating patients. In the past, patient education was a ‘nice-to-have’ feature of patient-physician interaction. With the requirement for patients to be more involved in decision-making and with physicians now being compensated to spend time on education, the stage is set for new modes of interaction that bridge between objective quantitative data and subjective qualitative realities.
CANCER TREATMENT DECISION MAKING
Within cancer treatment, patients have historically had little understanding or input into their own treatment. Physicians have been the principal decision makers, yet statistics show that physicians do not have intuitive understanding of their patients’ preferences. For example, doctors believe that 71% of patients with breast cancer rate keeping their breasts as a top priority. The actual figure reported by patients is 7%. Furthermore, doctors believe that 96% of breast cancer patients considering chemotherapy rate living as long as possible a top priority. When you ask patients, only 59% agree that it is a top priority. Even the experience of receiving a cancer diagnosis can be baffling to
percentage of patients with breast cancer that rate keeping their breasts a top priority (Source: Lee et al., 2010)
some patients: one patient in our study shared their experience of completely misunderstanding the cancer diagnosis that was being communicated to them: “The nurse told me, and she used a big long word I couldn’t even pronounce. She said, ‘you’re taking this really well,’ and I said ‘that’s because I don’t know what you just said.’” The focus on quantitative survival data alone misses issues of key importance to people about how various treatments would impact their daily life. In cancers where various treatment modalities can be offered, treatment courses tend to be prescribed based on physician specialisation preferences. From a purely statistical basis, there was little variation between the survival rates of patients who undergo different treatments. Even if a physician wanted to direct a patient to external resources, no trusted resource exists that is tailored to the communication needs of patients or the treatment impact on their daily life. In fact, one clinician in our study reported: “... the hospital doesn’t have resources to support decisions… 90% of patients are left without treatment support.” Given the historic lack of patient involvement in decision making, we set out to design a shared decision-making service that could effectively educate patients and empower concordant care.
Personal and Clinical approaches to content used in testing
IDENTIFYING KNOWLEDGE GAPS AMONGST PATIENTS
We began by conducting qualitative research with select cancer types that have multiple treatment options and where these options have highly subjective risks and benefits. The cancer types focused on for the study were prostate and breast cancer. Prostate cancer is highly unlikely to be fatal. Treatment options range greatly, from observation to surgery and radiation therapy. For most cases, observation is all that is required, although many patients fear the idea of having a cancer growing inside them. As one clinician explained: “Patients treat their anxiety with surgery.” A major side effect of prostate surgery is impotence, which can have a significant impact on the quality of life for people who are sexually active. Also, many people confused the term ‘infertility’ with the term ‘impotence’, only to find out the real impact too late.
Breast cancer treatments also offer a wide variety of risks and benefits that can have a qualitative impact on people’s lives. More invasive treatments such as surgery can cause limitations in movement, scarring and pain without changing survival rates. Less invasive treatments like chemotherapy can last 2-6 months and affect people’s ability to go back to work or to help out at home.
DESIGNING FOR ENGAGEMENT
The focus of our project was an online decision support tool that patients, their families and clinicians could use together or independently. The decision support tool is web-based, allowing for access both inside clinical settings and in people’s everyday lives. We began our project by partnering with the University of Toronto’s Biomedical Communications Program, one of four such programs in the world. Biomedical communication is an interdisciplinary field that merges visual communication and scientific research to develop images and interactive technology that communicate complex scientific ideas. Together with professors and a graduate from the program, we began a human-centred design approach that included both in situ observation and prototype iteration.
deep dive: collecting relevant insights
We developed a range of communication styles for communicating key content. For example, we developed definitions of treatment options in a narrative style illustrated with hand sketches, and also in a textbook layout. We also designed a wide variety of visual representations of data. We tested survival data presented as curves, stacked icons and random arrays, with various icon types used to represent the proportion of people who survive each year. The content and overall structure of the decision-making tool were iterated multiple times based on direct feedback from people undergoing cancer treatment. Prototyping helped to determine the rational and emotional impacts on patients and their families. Currently, the decision-making tool is being prepared for the clinical setting with a world-leading cancer treatment centre, where it will be incorporated into the cancer services delivered to patients and their families.
THE CHALLENGES AHEAD
As we continue to collect and store terabytes of patient data, the real work of designers will be in translating scientific data into the challenges of everyday life. One major challenge ahead is how to help patients set goals and measure preferences when so many important elements of the healthcare experience are fundamentally qualitative. For example, a treatment goal for a person with prostate cancer, such as good sexual function or normal urinary function, is hard for most patients to relate to a numerical value. In order to address this, Bridgeable’s designers have been experimenting with patient cases and personas as a way for people to relate their goals to a patient type who has been through a specific procedure.
Variations in data visualisation for patients
Another challenge is how to enrol physicians and the medical community to accept and adopt new forms of decision making. The scientific paradigm dictates that all things can be objectively measured and quantified. Patient preferences can too easily become measures that lack any real meaning to patients. At the same time, quantification offers real benefits for measuring and tracking how services improve patient satisfaction and outcomes over time. Ultimately, physicians decide what is allowed within treatment centres, so services that communicate qualitative values must be designed so that physicians also consider them valid and trustworthy. As designers, we play an important role in developing solutions that both provide patients with qualitatively improved experiences and meet medical science’s need to quantify and measure.
Left Brain, Right Brain: Working at the Intersection of Design and Business
Erick Mohr is director at Truth. Erick has been working with companies such as E.ON, Barclays and Sony to drive business growth. He holds an MA in Design and an MBA with a major in finance, which allows him to bring a unique set of creative and analytical skills to the projects he is involved with.
Whenever a service design project is presented in a boardroom, questions on hard, measurable outputs often emerge, such as: ‘Who are our most valuable customer segments?’ ‘Which ideas and propositions resonate with these segments?’ and ‘How much are they willing to pay for these propositions?’ To answer such questions, service designers need to rely on a hybrid approach that combines qualitative and quantitative methodologies. Traditionally, service design is grounded in qualitative techniques. For example, the ‘Double Diamond’ approach (coined by the UK’s Design Council1) was built around techniques such as brainstorming and ethnography. Research is often conducted with a relatively small number of participants, but issues and insights are explored in great depth. Qualitative approaches are often quite subjective because designers need to interpret what people say and translate that into actionable insights. Quantitative research, on the other hand, relies on a very large sample (often thousands of people), providing breadth of insight. The interpretation of results is often quite objective because you are dealing with hard data. Clearly, both approaches offer their own benefits. But why is quantitative research not so common in service
design? One of the reasons is that the two approaches are quite different in nature and provide very distinct outputs that are often perceived as ‘incongruent’. But what about running quantitative and qualitative research in parallel? Employing these methodologies simultaneously allows us to iterate results throughout the design and development stages, providing both depth and breadth of insight. Most importantly, it enables us to translate the innovative, creative thinking delivered by qualitative techniques into tangible facts provided by quantitative research, and vice versa. At Truth, we developed a three-step approach that combines both techniques to help service designers build a solid business case for new service propositions, and to get the critical buy-in from the boardroom:
1. Understand who your customers are
2. Define what your customers really want
3. Determine how much these customers want to pay for it
Throughout this process, we can dip in and out of qualitative and quantitative methods, allowing us to iterate and refine insights. This article will present practical examples to demonstrate how this hybrid approach can be applied in practice.
STEP 1: UNDERSTAND WHO YOUR CUSTOMERS ARE
In order to gain a better understanding of who the customers of a service are, you’ll have to explore not only their needs, behaviours and attitudes, but also define whether there are different customer segments with different priorities within the customer base.
Developing a segmentation model
We recently developed a segmentation model for a British bank as an innovation platform for new products and services. We started by looking at data collected from the bank’s internal systems to identify the behaviours customers have towards their money. For example, what payment methods do certain types of people prefer to use? Cash, debit card, credit card? How often do they use these payment methods? Do they rely on credit facilities? It is interesting to note that this type of behavioural data is often readily available from a company’s database and, more importantly, it’s free! The next step was to cross-reference the data with customers’ age and income. From this exercise, a few clear patterns started to become evident. For example, we realised that younger customers tend to rely quite a lot on overdraft facilities, but are not so keen to use cheques.
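Mechanically, behavioural clusters of this kind can be surfaced with any standard clustering algorithm. The sketch below uses a minimal k-means in plain Python; the customer data and the two dimensions (overdraft reliance and cheque usage) are invented for illustration, not the bank’s actual variables.

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Minimal k-means: assign each point to the nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical customers: (overdraft reliance 0-1, cheque usage 0-1)
customers = [(0.9, 0.1), (0.8, 0.2), (0.85, 0.05),   # a 'younger' profile
             (0.1, 0.8), (0.2, 0.9), (0.15, 0.85)]   # an 'older' profile
centroids, clusters = kmeans(customers, k=2)
```

In practice one would cluster on many more behavioural variables and then, as described above, overlay demographics on the resulting groups to interpret them.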
Qualitative research (few users, subjective, depth) and quantitative research (many users, objective, breadth) are often viewed as ‘incongruent’ methods
Now that we had a good understanding of how different clusters behaved, we then had to explore why they behaved like that. To do so, we relied on ethnographic approaches to explore customers’ attitudes towards personal finances. Some very interesting and consistent insights emerged within each cluster, such as the notion that for some customers, cash was being used as a budgeting tool. For example, by starting their week with, let’s say, 100 pounds in their pockets, they could clearly see how much they had left at the end of each day. On the other hand, these people were not so keen to use debit cards, as how much was being spent over time wasn’t made visible to them.
Combining qualitative and quantitative methodologies through an iterative approach
Exploring the trade-off between price level and take-up (monthly data charge vs. pay per use vs. included in handset cost; accepted price range: 20-50 extra)
Behavioural clusters emerging from the data analysis
STEP 2: DEFINE WHAT YOUR CUSTOMERS REALLY WANT
Now that you have a pretty good idea of who your customers are – their needs, wants, attitudes, behaviours and, most importantly, which segments are potentially most valuable – the next step is to define what they want.
This exercise enabled us to overlay attitudinal and behavioural insights, providing a preliminary set of customer segments. In order to validate and further refine the emerging segments, we then conducted a final piece of quantitative research. We opted for a phone questionnaire with a sample of 1,000 participants, in which we explored the attitudes and behaviours of the preliminary segments in further detail. The end result was a clear definition of clusters of customers who were:
• Very distinct in nature (so we know which products or services might be more appealing to each segment)
• Identifiable in behavioural terms (so we can easily spot them and target accordingly).
Identifying and prioritising propositions for a mobile phone vendor
Another client of ours wanted to develop a pipeline of added-value location-based services aimed at the global market. We kicked the project off with a qualitative stage, in which we visited a number of different markets to understand how people were using their phones and to explore latent needs and opportunities. During these visits we co-created potential applications, leaving us with a large number of ideas. So the question for the next stage was: which ideas should we take forward in each market? To answer this question we relied on an online quantitative survey, conducted in every market where the co-creation took place. The survey relied on a set of storyboards, allowing respondents to understand the nuances of each idea before providing feedback around dimensions such as appeal, price sensitivity and likely frequency of use. These dimensions enabled us to understand core need areas of different customer groups, and prioritise ideas accordingly. For example, by evaluating the relative appeal and likely frequency of use, we could clearly define which ideas to take forward.
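One simple way to operationalise that last step is to rank each idea by the product of its mean appeal and its likely frequency of use. This is only a sketch: the idea names and scores below are invented, and a real study would weight the dimensions and segment the respondents.

```python
# Hypothetical survey results per idea: mean appeal and likely
# frequency of use, each on a 1-5 scale (names are illustrative only).
ideas = {
    "venue finder":   {"appeal": 4.2, "frequency": 3.8},
    "friend locator": {"appeal": 3.9, "frequency": 1.2},
    "city guide":     {"appeal": 2.1, "frequency": 4.0},
}

def priority(scores):
    # Ideas that are both appealing and frequently used rise to the top.
    return scores["appeal"] * scores["frequency"]

ranked = sorted(ideas, key=lambda name: priority(ideas[name]), reverse=True)
# ranked[0] == "venue finder"
```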
Prototyping with Excel: identifying the optimum combination of service options
STEP 3: DETERMINE HOW MUCH CUSTOMERS WANT TO PAY
The last stage is to define how much your target customers are willing to pay for the services you are developing. Ultimately, this will inform the financial feasibility of a new product or service.
Defining optimum price levels for after-sales services
The mobile phone market is becoming increasingly competitive. A major handset manufacturer asked us to help develop a suite of after-sales services in order to increase retention and drive differentiation. Following an initial stage of qualitative insight and creative development (in which many ideas were generated), our team faced the challenge of narrowing down the initial set of ideas. To do so, we employed a quantitative technique called conjoint analysis, which forces respondents to make trade-offs. If we simply asked customers what they wanted, they would probably say: ‘The best of everything, for as little as possible’. Conjoint, on the other hand, replicates real-life situations where customers are constantly weighing up options. As a result, it allowed us to:
• Optimise ideas by identifying the strongest elements that could drive customer retention
• Estimate market size by assessing potential take-up for different service options and price levels
Moreover, conjoint analysis produced a wealth of data that was used to build an Excel-based simulator. This allowed us to prototype different scenarios and combinations of service features. Ultimately, the simulator enabled the client team to gain internal buy-in
to take the project to a pilot stage, by demonstrating what the financial and commercial results would be like.
BRINGING EVERYTHING TOGETHER
Once you go through each of the steps outlined above, you will be able to create a solid business case for your service design project, by knowing:
1. Who your customers are, so you can target the right segments.
2. What they want, so you can design service solutions that appeal to them.
3. How much they want to pay, so you can prioritise ideas and evaluate commercial feasibility.
Ultimately, by relying on a hybrid approach of qualitative and quantitative methods, you can ensure that a service design project is not only desirable (from a customer’s point of view), but also economically viable.
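To make Step 3 concrete: a conjoint-derived simulator like the Excel tool described above ultimately sums part-worth utilities for a bundle of features and a price, and converts the total into an expected take-up. The sketch below is a minimal illustration; the part-worths, feature names and the logit choice rule are all assumptions, not the actual client model.

```python
import math

# Hypothetical part-worth utilities estimated from a conjoint exercise
# (feature names and numbers are illustrative, not real client data).
part_worths = {
    "insurance":       0.8,
    "express repair":  1.2,
    "data backup":     0.5,
    "price_per_euro": -0.15,  # utility lost per euro of monthly fee
}

def takeup(features, price):
    """Share of respondents choosing the bundle over 'no thanks'
    (utility 0), using a logit choice rule on total utility."""
    utility = (sum(part_worths[f] for f in features)
               + part_worths["price_per_euro"] * price)
    return 1 / (1 + math.exp(-utility))

# Prototyping scenarios, as the Excel simulator allowed:
cheap = takeup(["insurance"], price=3)
rich  = takeup(["insurance", "express repair", "data backup"], price=12)
```

Sweeping the price argument reproduces the price/take-up trade-off curves shown in the figures.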
References
1 The Design Process [Online]. Retrieved January 7, 2013, from http://www.designcouncil.org.uk/designprocess
Fusing Qualitative and Quantitative Skills in Service Design
Engaging frequent flyers in the co-creation of a positive transfer experience
For most travellers, the transfer between connecting flights is a phase in their journey they would be happy to skip. In order to bring delight to a moment characterised by negative emotions, Air France and KLM have joined forces with their frequent flyers in the development of new service concepts. By integrating a broad array of contemporary techniques like research communities, ethnographic blogs, online co-creation and quantitative measurement of emotions, the customer takes on a central and active role in this creative process. Participatory reporting formats ensure that the rich output these techniques generate increases the impact of consumer needs and emotions on the implementation of the new concepts.
When travelling from Manchester to Hong Kong, you might first fly to Amsterdam and transfer there to a second plane to your final destination. From a traveller’s perspective, you then have either too much time to spend at the airport or too little time to reach the gate of your connecting flight. Transfer flights are chosen either because there is no other option available or when travellers need to make a trade-off between time and cost. It’s no surprise that the transfer phase often causes frustrations, especially with frequent flyers who know that a lot can go wrong.
In complex situations such as transfers, where different stakeholders (travellers, airline, airport, security and customs) come together, putting the customer at the centre of the service design process can unite and engage all parties into anticipating consumer needs.
BRIDGING THE BORDER BETWEEN QUANTITATIVE AND QUALITATIVE RESEARCH
Business success is contingent upon the adoption of innovations, new products, services, processes and ideas. In turn, this is dependent upon consumers’ acceptance and perceptions of an innovation.
FREQUENT FLYERS IN THE PILOT SEAT
Thomas Troch is research innovation manager at InSites Consulting. Tom De Ruyck is head of research communities at InSites Consulting. Annelies Verhaeghe is head of research innovation at InSites Consulting. Charles Hageman is product strategy manager at KLM.
Iterative project flow: consumer insight activation (board game to capture existing insights, knowledge and assumptions; consumer story dashboard and daily highlight mails), ideation and concept development (workshop sessions to co-create and iterate ideas with frequent flyers; deck of concept cards; concept casino to share results and inspire the re-write of the concepts), leading to a shortlist of concepts for implementation
Traditionally, the consumer is treated as a passive player in this process, mainly because consumers are often relegated to the role of ‘validator’ through traditional and often quantitative methods of consumer inquiry. With user-centricity, co-creation, sequence, evidence and a holistic intention as the principles of service design1, active customer participation and qualitative research are inherent to the approach. Both quantitative and qualitative research have their place in detecting opportunities for new innovation projects, fuelling the creative process and taking ideas from concept to implementation. The strength of an innovation project often lies in a fusion of research techniques2. Research communities are such a hybrid methodology. Although they are qualitative by nature, they combine the best of both qualitative and quantitative tools to generate rich insights and ideas. Together with InSites Consulting, the Customer Insight team and the R&D Customer Ground Experience team of Air France and KLM connected with their frequent travellers through this methodology in a staged-innovation approach.
MAKING ITERATIVE CUSTOMER CONNECTIONS
Online research communities were developed as a research methodology to take advantage of the characteristics of modern consumers, matching their social media behaviour and emphasising the dialogue between brands and consumers. We can define a research community as ‘a small group (up to 150) of highly engaged people joined together by a common passion, connected online for a longer period, who are systematically engaged by applying various social media techniques for different business objectives, especially co-creation or even collaboration’. By definition, communities are not representative, as they work best with participants who identify with the topic and/or the brand hosting the platform3. In the iterative process of designing new transfer services, we applied the following structure:
1. Consumer insight activation: detecting new needs and ‘dissatisfiers’ from transfer passengers through personal ethnographic blogs.
2. Ideation and concept development: the insights from the first phase trigger the development of new service concepts.
3. Quantitative validation: concepts with a high potential are integrated in a quantitative and emotional concept ‘screener’. The results guide a workshop to rewrite the concept boards and develop the final proposition.
In the following explanation of the three stages outlined above, we highlight how the choice of methodology and participatory reporting formats contributes to the efficiency and effectiveness of the service design approach.
Consumer story dashboard
1. CONSUMER INSIGHT ACTIVATION
A service design project starts by identifying the problem that needs to be addressed: therefore, we started by discovering and shaping consumer insights. In the first phase of exploration, before involving the passengers of Air France and KLM, a knowledge map was created summarising all existing insights, knowledge and assumptions present within the organisation. We developed a board game to capture this information, based on personification techniques. The team members were all assigned a persona representing a typical passenger and had to come up with needs and problems that these customers might experience during transfer. This existing knowledge was challenged and enriched by connecting with forty frequent flyers. They reported their transfer experience on a personal blog and we immersed ourselves in their world through 400 observations in text and pictures to detect new needs. The Air France and KLM team was kept up to date with the most striking and refreshing consumer stories through daily email messages. A ‘consumer story dashboard’ was set up to analyse all stories and visuals. This acted as an online reporting tool for visual and unstructured information, allowing the researcher to intuitively analyse qualitative data in a quantitative way. Once the results were uploaded, the rich input could easily be compared on many dimensions, such as type of airport, stage in the transfer process and type of frequent traveller, without the need for any statistics.
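The kind of comparison the dashboard supports can be sketched in a few lines: tag each story with its dimensions, then count stories per dimension value so the rich input can be browsed side by side. The tags and dimension values below are invented for illustration, not the actual study data.

```python
from collections import Counter

# Hypothetical tagged observations from the ethnographic blogs.
stories = [
    {"airport": "AMS", "stage": "security",   "traveller": "gold"},
    {"airport": "CDG", "stage": "wayfinding", "traveller": "silver"},
    {"airport": "AMS", "stage": "wayfinding", "traveller": "gold"},
    {"airport": "AMS", "stage": "security",   "traveller": "silver"},
]

def compare(dimension):
    """Count stories per value of a dimension, so qualitative input
    can be compared across cuts without any statistics."""
    return Counter(s[dimension] for s in stories)

by_airport = compare("airport")   # e.g. how many stories per airport
```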
2. IDEATION AND CONCEPT DEVELOPMENT
Contrary to what many people think, co-creation is not about getting, but about giving. We combined the 68 insights discovered in the first phase into 10 clusters according to their themes. Each cluster was a trigger for the team to generate new ideas to create a smooth and positive transfer experience. To facilitate true collaboration, a series of workshop sessions with the Air France and KLM team was carried out alternately with idea generation by the frequent flyers on the online community. For this idea generation, we were looking for customers with a specific profile who could help us create service concepts that were both new and relevant. Fifty frequent flyers joined a three-week ideation and concept development community: half of them were selected based on their profile as innovators – with characteristics that showed they were used to challenging the norm and were in search of what is unique and original in travel – combining a focus on functional benefits with social independence. The other half were influentials – accepting the norm and in search of what is relevant in travel – being team players with a focus on social benefits4. The community environment proved to be particularly stimulating for the generation of ideas: consumers received challenges based on the detected insights and could build further on each other’s ideas to make them more relevant. Although the passengers were recruited based on their involvement in the travel category, we further increased their engagement by integrating gamification elements in the online environment.
Ideation tool
By adding a countdown to the challenges, the competitive nature of people could be stimulated to come up with as many ideas as possible in a limited timeframe. We also tapped into the collaborative spirit of these frequent flyers by granting a status to each idea they generated. Participants could improve the idea by commenting and consequently make the status of the idea change from ‘mining’, through to ‘rough diamond’, ‘cut diamond’ and finally ‘diamond ring’. As a reward, the most feasible ideas with the highest status were visualised by one of the service designers in the team.
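The status mechanic can be sketched as a simple ladder. The comment thresholds below are illustrative assumptions, since the article does not state the actual rules the community platform used.

```python
# Status ladder from the community mechanic; the comment counts
# needed for each step are invented for illustration.
STATUSES = ["mining", "rough diamond", "cut diamond", "diamond ring"]
THRESHOLDS = [0, 3, 6, 10]  # comments needed to reach each status

def idea_status(comment_count):
    """Return the highest status whose threshold the idea has reached."""
    current = STATUSES[0]
    for status, needed in zip(STATUSES, THRESHOLDS):
        if comment_count >= needed:
            current = status
    return current

idea_status(0)   # 'mining'
idea_status(7)   # 'cut diamond'
idea_status(12)  # 'diamond ring'
```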
3. QUANTITATIVE VALIDATION
A successful service concept fits the strategy and objectives of Air France and KLM, as well as the needs of consumers. In the previous stage, the passengers and the project team inspired each other to come up with over 700 unique ideas, resulting in 32 new transfer concepts. The four co-created concepts showing the highest relevance for both travellers and
the airline companies were integrated in a quantitative validation. The results guided a workshop to rewrite the concept boards and develop the final proposition. Traditional innovation KPIs such as ‘appeal’ and ‘uniqueness’ help management to push successful concepts forward to the implementation stage. But it is also important to consider the emotional aspects of the service in this test. In many cases, measuring emotions in quantitative research is done very rationally, by asking people to indicate which emotion they feel. One can also wonder to what extent consumers are aware of all their emotions, and whether they are even able to answer this question directly. Therefore, we integrated both direct and implicit measurement of emotions into this validation. The two implicit techniques we applied aim at measuring emotions based on reaction times: ‘dual tasks’ asked participants to indicate all the emotions they experienced concerning a given concept, while remembering a set of symbols. In the second implicit technique, we put the participants under time pressure and asked them to indicate their emotion on a concept-by-concept basis. The results of the emotional validation were quite surprising. Although three of the four concepts scored very similarly on traditional quantitative innovation KPIs – ‘appeal’, ‘usefulness’, ‘unpriced buying intention’, ‘uniqueness’ and ‘talkability’ – and showed little difference in perception by Air France or KLM customers, the emotional measures were very different. The difference was not only between concepts but, more importantly, the concepts evoked different emotions between Air France and KLM customers. When connecting identical concepts to different brands, the heritage of these brands plays a key role in the emotional perception. While both
airlines can share the same platform for their services, they should use a different tone of voice to connect with their customers, resulting in a different implementation of the concept.
Example of a concept board visual: ‘Welcome back Mr Kawasaki. Let me show you to your terminal.’ (airport contact staff)
To generate true impact (and surprise) using the results from the quantitative and emotional validation, we organised a ‘concept casino’, truly engaging the team and providing a welcome interruption. Each member of the Air France and KLM team received a number of poker chips and, as we presented the scores of the different concepts one KPI at a time, they could place their bet on the concept that scored highest on, for example, ‘unpriced buying intention’. Not only did this stimulate a competitive, informal and creative atmosphere, it was also meaningful in translating the results into a rewrite of the final concepts.
PREPARE FOR LANDING
Three of the final service concepts are currently being investigated by Air France and KLM: a mobile transfer application including real-time customer notifications and communication about travel details, a new in-flight transfer video and a concept known as ‘The Agent of the Future’. Furthermore, the insights and the other 29 concepts have laid the blueprint for future service innovations in the transfer zone. The Air France and KLM teams will regularly review their service based on the concept cards we have developed. This deck of concept cards embodies the afterlife of the project and is a trigger that refers to the other deliverables. The characteristics of online research communities have proven to be crucial in this service innovation project. From helping us to understand their world of transfer, to creating new concepts, the frequent flyers
in this community were great ‘partners in crime’ for coming up with new and relevant business propositions. With qualitative insights as the main source of inspiration in service design, quantitative research should not be excluded: we can combine the best of both worlds. Not only are facts and figures often a trigger to start a new project and to select the concepts that make it to the final stage of implementation; service designers can also use numbers to their advantage. Given that it is not always easy to demonstrate the ROI of research and service design, we defined the KPIs of this project up-front and measured the change. By comparing the existing knowledge and assumptions at the start of the project with the actual result, we can prove the added value of this approach.
References
1 Stickdorn, M. and Schneider, J. (2011). This is Service Design Thinking. Amsterdam: BIS Publishers.
2 Verhaeghe, A., De Ruyck, T. and Rogeaux, M. (2010). 'Exploring the world of water – Fusing contemporary research methods.' ESOMAR Congress 2010.
3 De Ruyck, T., Van Kesteren, M., Ludwig, S. and Schillewaert, N. (2010). 'How fans become shapers of an ice-cream brand. Towards the next frontier in conducting insight communities.' ESOMAR Qualitative Research Conference 2010.
4 Van Belleghem, S. and De Ruyck, T. (2012). 'From Co-creation towards Structural Collaboration.'
Measuring and Demonstrating the Value of Service Design
The aim of this paper is to demonstrate how the combination of service design and a traditional quantitative method has delivered proven and scalable results. We will use a real-life client case study to share our approach and key learning points. The client was a major banking brand operating within the UK financial services sector, providing products to consumers via a network of Independent Financial Advisors (IFAs). Our approach used a combination of Commitment-based Methods (CbM), a behaviour-based service design philosophy, and traditional Statistical Process Control (SPC) principles and practices. This was executed within a structured delivery approach.
CLIENT CONCERNS AND OUR OFFER
The client’s primary concerns were:
• The client was six months into a two-year plan and was already significantly behind target.
• In addition, the client projected they would fall short of their two-year business plan by 50%.
• They were perceived by the market as the “provider of last resort” due to poor service and product performance.
• The client had no further cash to invest in improving performance.
Based on our diagnosis, we created an offer: to correct the first year’s shortfall and to increase the overall target by 100%, financed by a risk-reward relationship. This turnaround would be achieved by a refreshed proposition and supporting service model.
SUPPORTING METHODOLOGIES
There were two distinct strands to the method deployed within this engagement:
1. CbM Service Design – qualitative element. This is defined as the identification and design of the network of promises made by a performer (and her/his own performers) to a customer to fulfil the agreed and mutually valuable Conditions of Satisfaction (CoS)1. A business is made up of networks of people who make requests of and offers to each other. Requests come in many forms: responsibilities, tasks, customer orders or complaints, customer requests for information and invitations to meetings are all essentially requests made by one person or group to another.
David Singh is a proposition director responsible for proposition insight and design within Royal London Insurance. David Le Brocquy is director at Vision Consulting after 12 years of working internationally with Andersen Consulting.
Similarly, an offer is linked with concepts such as ‘sales call’, ‘advertising’, etc. Clearly, offers and requests form a huge part of a business, and it is for this reason that we need to understand what exactly is happening when offers and requests are made. If we know what the structure of a ‘good’ request or offer looks like, we can quickly analyse a business example to see what elements are missing or dysfunctional, to help improve performance and the satisfaction of all. If we look at the detail of how requests are made, we see four key elements:
a. The offer: what the performer will do for the customer (or what the customer may ask of the performer)
b. Negotiation: an agreement between the customer and performer of what each needs to do to realise the offer – this is called the promise (the Conditions of Satisfaction)
c. Execution: the delivery of the promise to the requirements of both customer and performer
d. Satisfaction: checking, for the sake of learning, that both customer and end performer are satisfied and primed to improve the next offer
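Read as a protocol, the four elements form a small state machine that a workflow tool could track. The sketch below is an illustration of that reading, not Vision Consulting’s actual notation; the class and stage names are our assumptions.

```python
# The CbM request cycle as a minimal state machine. Stage names follow
# the four elements above; the class itself is an illustrative sketch.
STAGES = ["offer", "negotiation", "execution", "satisfaction", "complete"]

class Request:
    def __init__(self, customer, performer, offer):
        self.customer, self.performer, self.offer = customer, performer, offer
        self.stage = "offer"
        self.conditions_of_satisfaction = None

    def advance(self, **details):
        """Move the request to the next stage, recording the agreed
        Conditions of Satisfaction when negotiation completes."""
        if self.stage == "negotiation" and "cos" in details:
            self.conditions_of_satisfaction = details["cos"]
        self.stage = STAGES[STAGES.index(self.stage) + 1]
        return self.stage

req = Request("IFA", "bank", "issue policy within 5 days")
req.advance()                              # offer -> negotiation
req.advance(cos="policy docs by Friday")   # negotiation -> execution
```

Modelling requests this way makes missing or dysfunctional elements visible: a request stuck before negotiation has no agreed Conditions of Satisfaction, and one that never reaches the satisfaction check loses the learning step.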
This can be visualised as a loop between customer and performer, running from the offer through negotiation and execution to mutual satisfaction:
2. Statistical Process Control (SPC)2 – quantitative element. This is a method of monitoring, controlling and, ideally, improving a process through statistical analysis. Its four basic steps are: measuring the process, eliminating variances in the process to make it consistent, monitoring the process, and improving the process towards its best target value.

HOW DID WE COMBINE THE METHODS?
With this client we adopted our standard approach and supporting principles:
CbM Promise Model
Typical Project Approach

ENGAGEMENT APPROACH: SERVICE DESIGN AND IMPLEMENTATION DISTINCTIONS

Imagine the Future
Key Activities – Understanding, from the client and internal perspectives, the current drivers of performance; the design of the future end-to-end business model to address client assessments and your ambition.
Outputs – Client assessments: an understanding of why clients do or do not use the services you provide. Diagnostic map: outlines the current end-to-end alignment of the business to the new ambition. Proposition review: articulation and mapping of the current customer proposition against the new ambition. Proposition design: the service-based proposition to address the current client-defined gaps in your offer. High-level behavioural design: the creation of the new design to deliver the ambition and the proposition to the client-defined needs. Pilot plan: definition of how the design will be proven – business case, customer focus and anticipated benefits, etc.

Prove the Future
Key Activities – Working with real customers and your staff to prove, through a pilot, the proposition, design and business case.
Outputs – Proven design: based on the insights and learnings from the success of the pilot, underpinning the case for further implementation. Proven business case: against the design standards and the business goals. A design: a proposition and design that can be managed and enhanced by your staff. A team: capable of addressing changes in your business environment.

Scale & Sustain the Future
Key Activities – The creation of the approach and plan to scale and sustain the pilot benefits across your business.
Outputs – Ambition: your ambition achieved. A plan: to sustain and scale from the initial pilot. A team: that will own, be capable of, and act as advocates and coaches to drive rollout of your design.
Promise (Service) Design
The project had three distinct phases:
1. Diagnosis: understanding the market and the client organisation’s customer ambitions and concerns; understanding the current proposition, how well it is received and delivered; and identifying the breakdowns and opportunities in the current service design.
2. Design: building a compelling proposition and supporting service model based on the insight gathered during diagnosis.
3. Prove and scale: delivery of a pilot and business case.

Diagnosis: 2 weeks
1. An investigation to understand the key breakdowns and opportunities, ranging from the customer to the back-end fulfilment concerns of the full end-to-end supply chain.
2. Key outputs: understanding of key issues and opportunities, a high-level customer proposition, a high-level business opportunity and a high-level design.

Detailed design and pilot build: 3 weeks
1. Final design of the customer (IFA, independent financial adviser) proposition around the key concerns and ambitions of the IFA community: to offer selected IFAs within the defined market an exclusive service that will increase their own defined level of success through enhanced reputation, matched with a low-cost, hassle-free service.
2. Creation of the sales and operational design and supporting
Promise network: the core IFA promise to the customer, supported by lender promises covering the enquiry (3-hour decision; DIP manager), the application (application manager) and funds drawdown.
practices aligned around the concerns and opportunities of all within the full supply chain.
3. Creation of a coordinated, end-to-end business process, as illustrated above.
The model describes the supporting service design. At the core there is a “service promise” (point one of this list) between the customer and the IFA. To support this “service promise”, we designed a number of internal promises to address the key moments of truth3 where “trust” in the promise will either be created or destroyed. The key moments of truth (identified during the diagnosis phase) are where the initial enquiry decision, the application completion and the drawdown of funds take place. Based on this, we created a series of sub-promises, connecting the core promise and sub-promises in a seamless manner.
a. Build a balanced scorecard, targets and supporting control charts covering all the key moments (enquiry, application and availability of funds) across the service design (see table below), supported by an open-to-all, web-based tool. The lead indicators were used to predict future performance through a set of agreed assumptions, e.g. conversion rates across the supply chain: over time, we saw that, in 60% of cases, enquiries led to applications. Individual SPC control charts were created for each of the key measures and tracked on a daily basis.
b. Create a control group to compare and contrast performance.
c. Key outputs: a pilot design to prove the proposition, the end-to-end service design and the pilot business targets.
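The individual control charts, and the run rule referred to later in the text, can be sketched as follows. This is a generic SPC illustration (an individuals chart plus a ‘consecutive increases’ rule), not the project’s actual tool, and the sample data is invented:

```python
def individuals_chart_limits(values):
    """Individuals (XmR) chart: centre line and 3-sigma control limits,
    with sigma estimated from the average moving range (mR-bar / 1.128)."""
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return centre - 3 * sigma, centre, centre + 3 * sigma

def improvement_run(values, run_length=6):
    """Return the index where `run_length` consecutive increases end,
    or None: a run-rule signal of a sustained shift in performance."""
    streak = 0
    for i in range(1, len(values)):
        streak = streak + 1 if values[i] > values[i - 1] else 0
        if streak >= run_length:
            return i
    return None

# Hypothetical daily enquiry-to-application conversion rates
daily_rates = [0.58, 0.61, 0.59, 0.62, 0.60, 0.57, 0.63]
lcl, centre, ucl = individuals_chart_limits(daily_rates)
```

Points outside the limits, or a long enough run of increases, indicate a change that is unlikely to be ordinary process variation.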
Balanced scorecard measures

Customer (IFA) facing
• Lead indicators (predictors of future performance): number of engaged partners; volume/value of enquiries; volume/value of applications
• Lag indicators (results metrics): value of sold product; quality of customer

Organisation (performer) facing
• Lead indicators: quality of customer based on credit score; volume of sold product; value of individual transaction; quality of application (information, etc.)
• Lag indicators: risk, e.g. level of financial exposure; unit costs / operational efficiency
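The link between lead and lag indicators can be sketched as a simple funnel projection. Only the 60% enquiry-to-application figure comes from the text; the other numbers here are assumptions for illustration:

```python
def project_funnel(enquiries, stage_rates):
    """Project volumes down the supply chain, applying an agreed
    conversion-rate assumption at each stage."""
    volumes = [enquiries]
    for rate in stage_rates:
        volumes.append(volumes[-1] * rate)
    return volumes

# 1,000 enquiries; 60% become applications; an assumed 80% draw down funds
projection = project_funnel(1000, [0.60, 0.80])  # [1000, 600.0, 480.0]
```

Comparing projected lag metrics against actuals is one way to test whether the agreed assumptions still hold.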
Run pilot / scale: 8 weeks
1. Prove the proposition and service design by working with real customers within the market.
2. Build sustainability by working side-by-side with bank staff and coaching in the real business environment.
3. Create a business case for scaling the proven proposition and service design.
4. Use SPC to demonstrate the statistical shift in process performance. Using predefined rules4, we were able to identify a shift in design performance that could not be challenged from a statistical perspective, e.g. six continuous points of improvement indicate a statistically-defensible shift in performance. Therefore, we were able to demonstrate to all project key players and stakeholders that an improvement in performance was achieved and sustained.
5. Key outputs: the business result and a design capable of being scaled and sustained by employees and customers.

SUMMARY RESULTS
The project achieved a number of tangible and intangible results without changing the product, tasks/workflow or accepting lower-quality business. The results include:
• The delivery of a diagnosis and design, and senior stakeholder acceptance of pilot results (a 600% improvement in pilot performance across 20 IFAs) in eleven weeks, capable of scaling
• A scaled and sustainable solution that delivered a 500% increase in revenue performance
• A 30% improvement in customer quality
• Application values (£ sterling) improved by 40%
• An independent audit by Group Credit & Risk highlighted a significant increase in the quality of the business generated
• Improved capacity by 300%
• Adoption of the approach across the business as the default service model for the defined product type
• Achievement of the 10-year business plan in 18 months
• Zero out-of-pocket cost to the client: all consulting fees were paid as a risk-based cost of sale, i.e. a percentage of all sales shared with the consulting partner

SUMMARY
All the CbM projects we delivered shared a set of common drivers for success. For us, CbM-based service design must be:
• Driven by a ‘scary’ ambition to challenge current ways of doing things
• Based on a solid diagnosis that allows you to bring a new insight to the client’s pain or ambition and to how it can be addressed through a new design. Successful service design without a solid diagnosis is based on luck
• Ninety percent internal to the organisation, ten percent facing the market. Successful design provides a coordinated solution across the value chain, addressing the needs of all and securing their defined commitment to support the central service promise
• Outcome- and results-focused from the outset. The retrofitting of metrics will only lead to disaster
• About results and what you can do: most clients do not care about the ‘how’
• Spoken on the client’s terms, not in jargon. People buy solutions, not philosophies
• And, finally: there are no poor users, just poor designers.
References
1 Sull, D. N. and Spinosa, C. (2005). ‘Using Commitments to Manage Across Units’. MIT Sloan Management Review, Fall 2005, Vol. 47, No. 1
2 http://aleduc.iweb.bsu.edu/itmfg265/Misc/nelson_rules.htm
3 https://www.mckinseyquarterly.com/The_moment_of_truth_in_customer_service_1728
4 http://en.wikipedia.org/wiki/t
Building the Bridge
Mix methods to leverage service design within enterprises
Samara Tanaka is a strategic designer at MJV. She has worked as a graphic and interaction designer in the USA and Brazil, and her research focuses on strategy for youth political engagement and social innovation. Isabel K. Adler is chief innovation officer at MJV and author of the book Design Thinking: Business Innovation. She has worked as an interaction designer at Microsoft, USA and as a user researcher at Océ, The Netherlands. Ana Fucs is a strategic designer at MJV. She has worked as an interface designer and developer at Xerox Research Centre Europe and in the infographic design team at O Globo Newspaper. Bernardo Segura is a strategic consultant at MJV. He is an industrial engineer and has worked at Deloitte, IBM and Renault, with experience in business strategy, operations, logistics and process management.
Deciding which methodology to use in design work takes into consideration the type of information needed to conduct a project: that is, to inspire service designers to create innovative solutions. However, that decision has the potential to influence the way results are communicated within the enterprise. With that in mind, we present here a case study that shows a mix of qualitative and quantitative methods that were carefully chosen, taking into consideration how the results would be used internally by our client, and therefore increasing the penetration of the service design solutions. BACKGROUND
The starting point of a service design project is the translation of the information that we need to know into research questions. The most appropriate method is selected and, as the research is conducted, we tease out patterns and insights that will give focus to the idea-generation phase. Although in the business environment numbers and objective data are more familiar formats for answering questions, service designers are keener to use qualitative research in their process, as it attempts to describe and understand, as much as possible, the whole situation of interest and helps to build up an understanding of the interrelationship among the various components of the phenomena under study1. Therefore, qualitative research is a strong starting point for service design, as it raises issues concerning the context of study that can then be worked on to generate innovative solutions. However, qualitative data has to be carefully interpreted and is often based on a small number of respondents, so it can be less valued in the corporate context, where marketing specialists have been working for decades with quantitative data. In the latter case, we examine a larger number of respondents so that the results can be generalised to the target group. We are then better able to compare between subjects, as the research is more structured and standardised2, allowing for more grounded decisions.
Co-creation session with customers and staff
Nevertheless, depending on how the research is conducted, some research methods (e.g. focus groups) can produce qualitative data, quantitative data or both, showing that there isn’t necessarily a clear-cut process in method application. Personas are another example of a widely used service design tool that lies at the frontier of qualitative and quantitative data. A well defined persona is based on data collected from a variety of sources: going out on field studies to really understand the people who would use the product but, at the same time, bringing in statistical data to grant credibility and broad perspective to the answers3. Facing this context, we, as the service design team of MJV Technology and Innovation, decided to test a mixed method when redesigning the experience of a petrol station of the future. CASE STUDY: USER EXPERIENCE AT A PETROL STATION OF THE FUTURE
The present case study is a project we conducted for a Brazilian company that had created a concept petrol station that focused on technology and sustainable solutions. In this project, our aim was to understand how the user was experiencing the service with the purpose of defining which elements could be replicated for other petrol stations. To answer this question, we planned a design-based research phase that would include quantitative data within the results to support internal decision making, since this was one of our client’s requests. The project ran for one month and, due to the very short time frame, we kicked off with a quick qualitative exploratory research that enabled the holistic understanding of the context and the formulation of the mixed research methodology that followed. With only
one week of immersion, we carefully planned our daily tasks to cover different techniques that needed to be spread throughout the day for more accurate sampling. Each research shift consisted of time-distributed slots where we needed to gather quantitative observation, as well as conduct a certain number of interviews and observations. We applied a methodology that consisted of interviews, ethnographic research and participant observation, and techniques such as shadowing, ‘a day in the life’4 and ‘mystery shopper’. Due to some characteristics of the service being analysed – it lasts on average three to four minutes, consists of short interactions and dialogues between customers and providers and has a repetitive nature – we were able to extract both qualitative and quantitative data from the same observations. An illustration of this dual nature of the observations can be seen in the shadowing technique, where we had qualitative observations of the service journey, as well as quantitative data about customer trajectories that were used to analyse points of agglomeration and flow issues. With the data visualisations created, we noticed, for instance, gender differences in behaviour and service usage that had not been pinpointed in qualitative observation, such as the fact
These illustrations represent customer trajectories within the service journey, with women in orange and men in violet. During the immersion phase, both qualitative and quantitative data were gathered and matched.
that women tend to gravitate toward the middle pumps, whereas men tend to spread out more among all the different pumps.

LESSONS LEARNED FROM USING QUANTI FOR SERVICE DESIGN
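A minimal sketch of how quantitative tallies can be pulled out of qualitative shadowing logs, as in the case above; the log format and numbers are invented for illustration:

```python
from collections import Counter

# Hypothetical shadowing log: (gender, pump position) per observed customer
observations = [
    ("woman", "middle"), ("man", "edge"), ("woman", "middle"),
    ("man", "middle"), ("woman", "middle"), ("man", "edge"),
]

# Tally pump choice by gender to surface patterns the qualitative
# notes alone might miss (e.g. women favouring the middle pumps)
by_gender_pump = Counter(observations)
women_total = sum(1 for gender, _ in observations if gender == "woman")
share_women_middle = by_gender_pump[("woman", "middle")] / women_total
```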
In the case study, we applied various uses of qualitative and quantitative approaches that, beyond being valuable to the project, gave us practical insights when using simultaneous mixed methods in service design: Quantitative analysis helps validate qualitative observations. This exchange between different types of information helped the design team validate and guide the next steps during the research and was well received
by our client, who immediately took a decision on the next steps to be followed internally. Having a specific focus helps the researcher capture unexpected information. In our project, the same researchers applied different techniques and, due to this shift of focus, we were able to detect some abnormal customer behaviours. For instance, while gathering customer statistics of car models, duration of stay and customers’ age, we observed a large number of people trying to find things to do during refuelling, invariably making use of their mobiles. When the observation is based on a well-defined focus, anything that falls out of the ordinary pops out, allowing the researcher to notice other details. Gathering quantitative data out of qualitative methods was a way of bringing efficiency and agility to our project. The characteristics of this service facilitated the process, but this combination can be applied to various settings. Even with a small set of samples, extracting numbers from observations is very effective and might quickly bring insights and conclusions that can guide the project. One should not reject a quantitative approach even if there is not a very large amount of data to be collected. Having flexibility in the choice of methods may help secure results. Based on
constant analysis of the data, we reframed the research methods in order to improve our understanding. This flexibility became evident during the prototyping stage, when we needed to evaluate the reactions of customers exposed to different visual contents on a screen. We tried to measure interactions using a quantitative approach but quickly changed strategy as we noticed that a qualitative approach would be more valuable. A multidisciplinary team, with staff skilled in both quantitative and qualitative approaches, is needed. The researchers should be able to collaborate and adapt their methods and techniques without losing focus. We used a format similar to agile methodologies, which allowed the team to interact in short-loop iterations and to have a large number of discussions and analyses during the process, leading to daily changes in the field research.

USING MIXED METHODS WISELY
Based on the findings from our case study and on an analysis of previous projects conducted at MJV, we conceived a framework to support the decision of which qualitative and quantitative methods to use for service design. This framework allows us to know when to
QUALI QUANTI (simultaneous approach)

Characteristics:
> requires a large sample to start with
> definition of hypotheses comes from initial observation of behaviours and environment
> flexible approach with frequent iterations
> quantitative data collected during field observation
combine the different methodologies and for which types of projects they are better suited. Traditional applications mentioned in the literature indicate using a qualitative method to help define a subsequent quantitative approach or vice-versa. We show here that a simultaneous approach could be more appropriate in certain cases, indicating both its advantages and downsides. The decision-making language of most companies is usually based exclusively on quantitative results. By combining qualitative and quantitative data collection methods, solutions generated by service designers can have a greater penetration within the corporate environment, without losing depth and allowing useful connections among the different approaches.
References
1 Malhotra, N. K. and Birks, D. F. (2000). Marketing Research: An Applied Approach (European Edition). Pearson Education Ltd, Chapter 6: 159.
2 Creusen, M. E. H. and Stokmans, M. J. W. (2003). ‘Concept Testing and Conjoint Analysis’. International Paper: Delft University of Technology, Faculty of Industrial Design Engineering.
3 Pruitt, J. and Adlin, T. (2006). The Persona Lifecycle: Keeping People in Mind throughout Product Design. Elsevier.
4 Vianna, M. et al. (2012). Design Thinking: Business Innovation. Rio de Janeiro: MJV Press.
Indication:
> quanti is a straightforward way of detecting an issue, and quali is then used to understand the reasons behind it
> when the problem is not defined
> in extremely complex contexts
> complex problems with a short time for resolution

Advantages:
> examines a larger number of data in a short period and quickly narrows the research
> having a holistic view of the context in order to raise hypotheses that will then be validated
> agility to understand and measure the context
> allows detection of useful unexpected information
> efficiency in time management

Downsides:
> risk of having a skewed or even wrong focus
> can be time consuming and expensive
> requires research staff skilled in both approaches
> risk of losing focus
Testing a new ticket machine for transport in Oslo
Lavrans Løvlie is a co-founder and partner of Livework, a service design and business design consultancy with offices in London, Rotterdam, Oslo and Sao Paulo. Pioneering service design for 11 years, he advises top brands across the world on how to design services that make sense both for business and for customers.
Melvin Brand Flu is partner of strategy and business design at Livework. He is a business and strategy consultant with over 20 years’ experience working for companies across continents. He advises executives and businesses on the cutting edge of business innovation in industries ranging from telecommunications and financial services, to the public sector and entertainment.
Pilot or Perish

Designers bring ideas to life and are able to convince organisations to build services that their customers will love, but fail to communicate how these will bring value to the business. At the same time, businesses spend millions to communicate their ideas through business cases that fail to convince customers. Neither beautiful designs nor detailed spreadsheets give (senior) management enough certainty to invest ambitiously in new services – but pilots will. Livework is working with clients in different industries to conduct ‘real life’ pilots with actual clients in real business scenarios and settings. The outcomes often challenge long-held beliefs about customers and the organisation, and guide the design and implementation of services that work in the real world1. Pilots that combine the creativity of design with the analytical rigour of business consulting provide a powerful approach for launching services that can perform in the market. Before we look at why, we need to understand how both design practice and business cases are risky propositions on their own.
TANGIBLE VISIONS OR ‘CORPORATE ENTERTAINMENT’
Organisations thirst for exciting visions and smart strategies that are wonderfully presented. Great service visualisations bring designers praise and design buyers acclaim from their peers, but often fail in the challenging process of going all the way to market. When the applause for a brilliantly envisioned customer experience fades and enthusiasm meets with business realities, the need for reassurance slowly kills the persuasive vision. A thrilling vision of the future ends up serving as a moment of engaging corporate entertainment. For an organisation this has value in itself – and there is certainly a market for it – but ultimately it does not carry enough weight to make an impact in the real world.
USER INSIGHTS VS. CUSTOMER BEHAVIOUR
Qualitative design research is great at bringing customer experiences to life inside an organisation. It is also a quick and cost-efficient way to pinpoint the most important needs and opportunities for innovation. However, most design researchers have had to answer the question ‘So, how can I trust a conclusion based on what a handful of people say?’ This uncertainty is not caused by businesses trusting numbers over people. It is caused by an approach that does not tell them much about how customers behave en masse. Understanding how different people behave differently at large scales is needed to judge commercial value and to set priorities and make crucial decisions in practice. Qualitative research produces convincing stories, but needs to be backed up by evidence that acting on them will make an impact on large groups of customers. MARRYING INSIGHTS WITH QUANTITATIVE DATA ONLY GETS YOU SO FAR
Quantitative research and data analysis might reveal useful patterns and trends but are not necessarily a good predictor
of customer behaviour. Validating these with customers can still throw up false positives that point in the wrong directions. As an example, a bank invested heavily in enabling account holders to manage their household finances via an online tool, expecting to strengthen their relationship with their customers, as well as attract new ones. Research clearly showed the interest and need of the customers to better manage their finances, customer clinics confirmed this and user testing made the tool easy to use. Two years after its launch, the uptake is disappointingly low, despite the strong push to promote and redesign the service. SOLVING THE RIGHT PROBLEMS
Visual and tangible ideas have incredibly high value. Designers have a wealth of techniques to help communicate a customer experience. When managers need to make decisions that involve effort and risk, it is invaluable to see what a service will look and feel like for customers. At the same time, the designer’s strength in seeing the service from a customer perspective is a weakness in terms of overlooking the factors that really will make the idea sink or swim. A service concept that makes perfect sense for customers may simply be unachievable by the organisation. The reasons will often be found a long distance away from the design brief. It can be a detail of an IT system, an organisational structure that is too costly to change or a corporate culture that will demand a high political price for being challenged.
Designers are great at making services simple for customers, but often underestimate how complex they are for organisations to realise. Design can make a convincing argument that customers will appreciate and value the service, but it also needs to prove that the idea is achievable for the business. As a case in point, a regional council dealing with children with disabilities wanted to simplify the service and make it more transparent to families and caregivers. While everyone involved in the project bought into the proposed elegant solutions and designs, it did not address the fundamental question: how to do more with less. In addition, senior management and stakeholders could only see the massive reorganisation and redistribution of power required, and stopped the project.
PILOTS ACCELERATE LEARNING
When you trial services in a real-life situation, you gain insights that are simply unachievable on the drawing board or in a spreadsheet. When you move from theory to practice you will gain insights and (dis)prove assumptions from day one. Pilots can be adjusted and changed in rapid iterations, helping the organisation learn quickly from the behaviour of customers, staff and systems.
Full-scale subway station pilot of a new information system for Ruter (Transport for Oslo). “The pilot taught us things that would have been impossible to understand working at our desks” according to Ruter CEO Bernt Reitan Jenssen.
WHY BUSINESS CASES COST ORGANISATIONS BILLIONS
Most organisations assure the viability of an initiative by building a business case that has to meet certain targets, such as return on investment (ROI), or an industry-specific measure such as average revenue per user (ARPU). Ultimately, a business case is supposed to offer management a rationale for committing resources to a project. Unfortunately, businesses have a graveyard of unsuccessful projects and initiatives that have cost hundreds of millions but failed to deliver, even though the business case was accepted. Often the business case is massaged, negotiated and adjusted internally to reach management acceptance, failing to test the assumptions in the market and with customers. A global brand had a solid business case for upgrading core IT systems which was based on one billion in additional revenue. Since this programme only touched internal operational systems, customers would not experience a better service. Where would any significant increase of sales come from?

BUILD PILOTS TO DEMONSTRATE AND CONVINCE
Service pilots create a confidence in success that neither design nor business cases can create on their own. They merge design skill with business analysis in a way that produces effects beyond both approaches.
In testing a new service, members of staff from different departments, as well as consultants and trainers, performed the same front-line roles. Because of the diverse backgrounds and levels of experience of the people servicing the customers, the organisation very quickly learned the best ways of interacting with and supporting the customer. This informed front-line staff behaviours and policies even beyond the scope of the new service.
PILOTS CAN FAIL
PILOTS ENABLE TRUE CUSTOMER INVOLVEMENT

When you ask them, customers will say one thing and do another. The same goes for staff. When you run a real service at small scale, you can observe actual behaviour, see the results and engage with customers and staff about their experience. Pilots enable customer involvement beyond voicing opinions through surveys, observations and workshops. They let you both understand and measure what really matters in the everyday life of customers and staff. The bank with its financial tool for households found out after launch that even though customers really liked the idea, they were not interested in changing their behaviour and using the application to take control of their finances. Piloting with real-life customers instead of controlled groups would have exposed the human tendency to give socially desirable responses to topics such as financial responsibility. People like to think of themselves as taking charge, but in reality they didn't care enough to sit down and work it through.

Setting up a pilot so that it works for customers and staff means it will most likely be successful. Designing a pilot to learn with customers and staff might actually mean failure in terms of the business objective. Understanding why something does not work is sometimes more valuable than seeing how something can work. When piloting a new online-offline service concept to support new customers, the project team faced significant internal resistance from several departments. The pilots failed to make any impact on customers and were discontinued under heavy political pressure. However, the pilots exposed a painful process flaw in the running of the existing service. This enabled the organisation to improve its current online and offline services significantly by eliminating the redundant process.

PILOTS ENGAGE AND ENERGISE THE ORGANISATION

Pilots need to be run by the people who will ultimately deliver the service, not by consultants. This engages staff, from top to bottom, in making the service better for themselves and for customers in hands-on ways. Pilots inspire because they shift the focus from performance in daily routines to asking 'how can we do this better?' Pilots breed a culture of continual improvement. In one pilot, sales agents were given lightweight, easy-to-use tools to support quick and extremely customer-friendly instructions to customers. During the pilot, an internal competition grew between agents as to who was the best at servicing the pilots. The pilot tools, through their immediate feedback, for the first time gave agents meaningful feedback on their performance and, since the results were public, motivated them in ways incentive schemes had not.

PILOTS HELP IDENTIFY REAL ECONOMIC VALUE, IF YOU DESIGN THEM WITH THE BUSINESS

Pilots allow you to test, validate and adjust the business case against actual numbers. When pilots are designed to align closely with business goals, they provide the data needed to model the economic performance of the service when scaled up. This means that numbers can be compared to historical data and business forecasts. Pilots build a body of evidence that enables confident decisions.
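The logic of adjusting a business case to actual pilot numbers can be sketched in a few lines. Every figure and field name below is invented for illustration and is not drawn from any of the cases in this article:

```python
# Hypothetical sketch of scaling observed pilot economics to a full
# customer base. All numbers are invented for illustration.

pilot = {
    "customers_served": 400,
    "conversion_rate": 0.12,       # observed during the pilot
    "revenue_per_customer": 90.0,  # euros, measured during the pilot
    "cost_per_customer": 55.0,
}

def projected_annual_margin(pilot, customers_at_scale):
    """Scale the per-customer economics observed in the pilot up to the
    full customer base, giving a figure that can be compared against
    historical data and business forecasts."""
    margin = pilot["revenue_per_customer"] - pilot["cost_per_customer"]
    return customers_at_scale * pilot["conversion_rate"] * margin

print(projected_annual_margin(pilot, customers_at_scale=250_000))
```

As the pilot runs and the observed conversion rate or per-customer costs change, the projection changes with them, which is exactly the "test, validate and adjust" loop described above.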
Design of the six scenarios that made up a nine-month pilot to help long-term unemployed people back to work. This picture is from the office of the pilot manager.
Changing the call script for an insurance call centre enabled agents to show more empathy and offer a better service to customers. The data analysis revealed that even though the experience of these customers was significantly better, it did not contribute to the organisation's bottom line due to the limited number of customers who were impacted.

PILOTS WILL TELL YOU MORE THAN THE BUSINESS CASE
A business case describes a model for the financial value of a service. By definition, a model can't incorporate the complexity that businesses and customers experience on a daily basis. Pilots allow organisations to identify crucial gaps in the model, and highlight opportunities that were not obvious at first sight. Pilots turn economic theory into hard evidence. In a retail pilot, customers were offered a service that would enable them to walk out of a mobile phone shop with a fully functioning phone, with all their settings and content transferred from the old one. The pilot proved the willingness to pay for the service was very high, which meant it had the potential for commercial success. However, the pilot also revealed the tremendous pull this service had in attracting new customers. This had greater economic value and supported a more interesting business case.

IF YOU THINK PILOTS ARE EXPENSIVE, TRY FAILURE
In an interesting paradox of corporate behaviour, managers are willing to invest hundreds of millions to bring a new service to market without properly testing it in real-life situations first. Unfortunately, the cost of both fundamental issues with the proposition and small irritations in interactions will multiply when a service is launched and scaled. Bringing new services to market demands a lot of any organisation, and rightly so, as the risks are significant. Managers need both to be convinced and to have evidence that their efforts will bring success. Pilots create both and, more importantly, they allow businesses to test, learn and refine their ideas in a real-world context that involves customers and staff. Pilots also unite the skills of business consulting with design skills, to produce results with impact beyond both. Service designers have the skill to imagine experiences that make sense to customers, and to build, run and iterate service pilots. In addition, designers have the skill to craft stories that convince managers and staff. Pilots built on solid business logic enable organisations to detail new service propositions based on evidence. If you want your idea to become a successful reality for customers and for the business, pilot it to make sure it makes sense.
The examples in this article include cases from the telecoms industry, insurance, banking, health services and transport, and are anonymised to respect client confidentiality.
touchpoint 5-1 81
Tools and Methods
Service Design-Related Techniques, Activities and Deliverables
Build Better Personas Using Subjective Science
Q Methodology gives service designers statistical tools for turning statements of opinion into insightful user profiles.
Stephen Masiclat is a professor at Syracuse University and director of the New Media Management graduate program. He is an award-winning designer and owner/principal of the design consultancy SM(D).
Service design has become a valued business process in Europe, but has struggled in the US, despite efforts by firms like Continuum and IDEO. Part of the reason is cultural: American distaste for big government has meant fewer opportunities for service designers to address large civic systems and thereby build awareness of the value of service design thinking. But a complete explanation of the difference in uptake must address another cultural divide: that between the highly quantitative values of American managers and the exploratory values of designers. Service designers seeking to bridge the gap should consider Q Methodology, a scientific method for studying subjectivity. It is particularly powerful for creating design personas based on captured points of view.

THE PROBLEM OF PERSONAS
While CEOs look for manageable tasks and measurable outcomes, service design practice uses many investigative phases that are inherently speculative. Ideation, role-playing, even brainstorming (in early stages) are journeys of serendipitous discovery which look – from the outside – like haphazard, non-scientific procedures. This is particularly true when it comes to the use of personas to guide design. No other method in the service design toolbox draws more criticism.
In their 2006 article, C.N. Chapman and R.P. Milham heaped scorn on personas, calling them non-scientific and non-generalisable, and insisting that they were difficult or impossible to verify as accurate: "This involves several aspects [including]… burdens on inference related to personas' high specificity; and the possibility that personas are non-falsifiable."1 Objections like this, necessary to scientific debate, become broadsides that executives use to scuttle the 'soft' claims of designers. To be fair, these grievances arise from real limits to some qualitative research methods, yet overcoming these limits leads to another management quibble with qualitative design methods: their cost.
According to standard design practice, personas should be built only after exhaustive observation of potential users. But developing and executing rigorous observational protocols (that allow statistical analysis) is an expensive undertaking. Augmenting with comprehensive searches through demographic, psychographic and market/behavioural data further raises costs. Worse, when a service is innovative or substantially new, there are no historical data from which to proceed, and designers are left with little choice but to go on experience, intelligent guesswork and the sorts of rapid-prototyping exercises that business executives too eagerly dismiss as stabs in the dark. We encountered problems with observation when a major motion picture studio asked us to conduct research into the viability of a market for social media-friendly showings of feature films. In essence, we were asked to study movie audiences and to discover whether there was a customer persona amenable to a new movie service that encouraged mobile phone use. Our difficulty in exhaustively examining potential users led us to a method that we believe sets a high mark for scientifically discerning personas and that bridges the cultural divide between quantitatively oriented managers and design consultants.

Our first approach was to convene focus groups to solicit opinions and attitudes toward social movie theatre experiences, but every discussion began with some participants vehemently declaring intense dislike of anything (or anyone) that disrupted the immersive experience. This anger intimidated potential users of the proposed service, and forced us to find more private ways to study attitudes. In our second attempt, we used Q Methodology, a method designed to study people's strongly held views. There are four basic steps to conducting a Q-study: developing the Q-sample, the Q-sort, factor analysis and factor interpretation. In the final step, the researcher discovers the framework of statements that defines a group, thus giving insight into the subjective views that form this group's highly personal, but internally consistent, frameworks for understanding and acting in the world.

STEP 1: DEVELOPING THE Q-SAMPLE
Any worthwhile endeavour will engender discussion, and in 'Q' a set of discussions that deepen understanding of an endeavour is called the concourse. From the concourse, one draws a representative Q-sample for study. In our movie service study, the Q-sample was drawn from the concourse of statements from a number of discussions, including:
• Statements of opinion heard in our unsuccessful focus groups
• Statements from the research team's discussions of the research question
• Statements from the client about the target market for the new service
A test subject reads and sorts statements of opinion.
The last item leads to another characteristic that strongly recommends Q Methodology to service designers. A sample can include speculative statements that are thought to be representative of stakeholders' states of mind. As a matter of course, Q Methodology will automatically validate or invalidate these statements (i.e. these statements may or may not load on a statistically significant factor). Service design has numerous methods for generating statement texts. They might come from observing role-playing exercises or service scenario skits. They can be built up from mind-maps and affinity diagrams, or derived from narratives like storyboards or motivation matrices. In our study, as we were developing the concourse, one of the researchers expressed his opinion that there would not be a market for the movie service because "...as far as I'm concerned, using apps to do things like check in to a movie turns something fun into a chore." We explored this opinion further and eventually included five statements that spoke (positively and negatively) to interacting with content and social sharing applications, to see if the concerns were valid. The statements were:
• I like to Skype with my friends while I watch a movie at home.
• I'm always using Google to look up things that appear in movies and TV.
• Sometimes I see or hear things in movies that I just have to share on Twitter.
• Social TV apps like GetGlue turn television watching into a chore.
• Software allowing me to turn a movie into a pop-up video experience for my friends would be fantastic.

The full Q-sample is available online.2 In the end, these statements did not correlate with any of the significant factors in our research, and we were confident that the attitude voiced by the researcher – a valid personal opinion – was not pervasive enough to form part of the psychological makeup of any of the emergent personas and their views of the proposed service. This bears repeating: in early design phases when hard data is scarce, the natural tendency is to project personal feelings and experiences onto the persona. In the absence of data, anecdotes abound and are made powerful by their narrative force. Q Methodology encourages researchers to tease apart the narratives into declarative and subjective statements of opinion, which can then be tested and analysed to see if they are held deeply enough to form a basis for differentiating populations.

STEP 2: THE Q-SORT
Once you have a set of statements, the next step is to have people sort them. This is the core of Q Methodology. A person arranges the statements in a matrix according to a condition of instruction, such as 'Please sort these statements from those you most agree with to those you most disagree with' or 'from those that make you feel appreciated to those that make you feel taken for granted'. The Q-sorter (test subject) ranks the statements from their point of view, and this action captures their subjectivity. The ordering results in unique arrays where statements, identified by number, are arranged in relation to each other. The arrays from multiple subjects can be examined with statistical methods, specifically factor analysis. The final step is to ask each Q-sorter to briefly explain (in writing) why they placed particular statements at the extremes of the array. These texts give researchers insight into the emotion that accompanies personal views. In our study, for example, the vehemence expressed in focus groups was explained almost universally as righteous anger at other people's rudeness: "Using a phone in a movie is just rude!" This is a powerful method for service design. Instead of distilling reams of data, design researchers can develop a broad set of attitudinal statements that come from many potential users. In the process of sorting and analysing, the actual personas – defined by common and consistent attitudes – emerge from the data. The figure above shows a typical Q-sort array.
A completed Q-sort with statements, identified by number, placed in the array.
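For readers who want to see the mechanics, a completed sort is simply an array of ranks, and multiple such arrays can be compared statistically. The sketch below is illustrative only: the column heights, the sorts and the grouping threshold are all invented, and real Q studies use dedicated factor-analysis software rather than this simplified correlation check.

```python
import numpy as np

# A forced quasi-normal distribution for a 17-statement sample: few cells
# at the extremes (-3 = most disagree, +3 = most agree), most in the centre.
columns = {-3: 1, -2: 2, -1: 3, 0: 5, 1: 3, 2: 2, 3: 1}
assert sum(columns.values()) == 17  # one cell per statement

# Each row is one subject's completed sort: the rank given to each
# statement (a simplified eight-statement example, for readability).
sorts = np.array([
    [ 3,  2,  1, -3, -2, -1, 0, 0],   # subject A
    [ 3,  1,  2, -2, -3, -1, 0, 0],   # subject B: a similar view to A
    [-3, -2, -1,  3,  2,  1, 0, 0],   # subject C: the mirror-image view
])

r = np.corrcoef(sorts)  # Pearson correlations between subjects' sorts

# Sorts correlating beyond roughly 2-2.5 times the standard error of a
# zero correlation (SE = 1 / sqrt(number of statements)) form a 'family'.
threshold = 2.5 / np.sqrt(sorts.shape[1])

print("A and B share a family:", bool(r[0, 1] > threshold))
print("A and C share a family:", bool(r[0, 2] > threshold))
```

Here subjects A and B correlate strongly and fall into one family, while C, holding the opposite view, does not: in Q terms, each family of sorts is a candidate persona.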
“Q Methodology gives designers a clear picture of the motivating mindset first.”
The precise structure of an array depends on the number of statements in the concourse. Generally, the extreme ends (usually corresponding to 'strongly agree' and 'strongly disagree') have one or two cells for the statements that elicit the clearest responses, whilst the centre of the array (reserved for statements seen as neutral or non-controversial) has the largest number of cells.

STEP 3: ANALYSING Q-SORTS
A Google search for 'q methodology software' yields a number of tools for conducting the statistical analysis, including complex factor rotations. Most are free, and the best of these provide tutorials on the best methods of factor rotation. The wide availability of software to do the calculation and derivation means one can discover statistically significant factors without understanding the maths, much as one can jet to Barcelona without full command of Bernoulli's principle. As Steven Brown, the leading Q methodologist in the US, says: "In Q, the role of mathematics is quite subdued and serves primarily to prepare the data to reveal their structure." Those interested in the maths should begin with Brown's excellent Primer on Q Methodology.3 Briefly, then, highly correlated Q-sorts (i.e. correlations 2-2.5 times the standard error) form families. Q-sorts belonging to one family are highly correlated with one another and uncorrelated with elements of other families. Factor analysis (including rotational transformations) tells us how many different families there are. This is precisely what recommends this method to service designers. The focus in Q is on identifying typical characteristics for each factor. Individuals might differ in their degree of fit to the factor: some will be better exemplars than others, but Q Methodology excels at the discovery of valid opinion categories. And for the purposes of service design, we can safely say that these categories, derived as factors, are personas.

STEP 4: INTERPRETING THE FACTORS
In our study, highly correlated statements coalesced into compelling portraits as we layered additional information. On each Q-sort form, we asked for a few basic demographic descriptors, like details of device ownership, age and gender. From these we were able to discern additional characteristics of the factors/personas. For example, the target persona emerged from the data: she turned out to be a young woman who (according to the factor analysis of statements) saw virtual social interactions as a natural evolution of the physical, to the point where texting a friend across the room was as natural as texting across the country. The proposed movie service was, to her, an acknowledgement that entertainment was meant to engender interaction, and she was excited about the service because she wanted to respect others' expectations of silence. It is crucial to note that the attitudes emerged first. We discovered the persona's gender and age only after we had identified her psychological makeup. Q Methodology gives designers a clear picture of the motivating mindset first. This later allows more direct connection to many of the other service design tools, such as mind-maps, affinity diagrams and motivation matrices, and makes for a rigorously defined and internally consistent design programme.

Q VS R METHODOLOGY: SIZE DOESN'T MATTER

Q Methodology is not a method built on large sample sizes. Rather, Q is concerned with the nature of correlated sorts and the extent to which they are similar or dissimilar. That being the case, large numbers (and their associated confidence intervals), so important to most social research, are secondary to the existence of valid, correlated factors. In the majority of cases, 40 to 50 test subjects will suffice to discern valid factors. Additional subjects tend to load on existing factors: they don't increase the decimal-point accuracy of the factors. This reality has the additional advantage of keeping costs low, as researchers needn't field large-scale surveys with random samples from the general population.

CONCLUSION

Adopting Q Methodology as a standard practice in defining personas would do much to move the discipline toward wider acceptance in the executive suites of international business, and to combat the notion that it uses 'soft', non-scientific methods. Q Methodology lets us test our assumptions about the attitudes that form the worldview of a persona but, more relevant to service design, Q-analysis is a scientifically valid, falsifiable way to discover personas without the benefit of historical data sets such as purchase histories, comment threads or other large, unstructured data sets. Personas generated in this manner are demonstrably valid and, from this start, the business of designing excellent services can proceed confidently.
1 Chapman, C.N. and Milham, R.P. (2006). 'The persona's new clothes: methodological and practical arguments against a popular method'. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting, 634-636.
2 Masiclat, S. et al. (2012). 'Market Viability for Social+Content Experiences'. Whitepaper from the Center for Digital Convergence, Syracuse University.
3 Brown, S.R. (1991). 'A Primer on Q Methodology'. Operant Subjectivity: Journal of the International Society for the Scientific Study of Subjectivity, Vol. 16, No. 3/4.
Workstyles: At Your Service
Introducing a versatile tool to forecast space and service needs
'Workstyles' integrate quantitative and qualitative information about users, enabling planners and designers to forecast the variety and scale of user needs with more certainty. This article explains what workstyles are and how to develop and apply them to space and service design projects.

The term 'workstyle' has been popularised in several disciplines to signal the different ways in which people work. In mass media, there are comparisons between 'old' and 'new' workstyles, such as the work mentality and tools of Generation X versus Generation Y. In the workplace, workstyles function as employee personas that must be managed and engaged in different ways to optimise the well-being of staff, as well as their performance and growth. And finally, in workplace strategy and design, workstyles quantify the space and service offerings that each user group receives in order to forecast final design needs.

WHAT ARE WORKSTYLES?
So, what defines a workstyle, and how do they differ from personas? Akin to personas, workstyles are flexible, tangible representations of user groups that capture their key characteristics and needs, and that inform research and ideation: for example, their expectations about their work environment, their technology skills, the amount of time they spend working with others or their motivations for advancing their career. And, like personas, the 'buckets' or characteristics to fill for each workstyle can be defined according to the project's needs. But, unlike typical personas, workstyles go beyond illustrating users' needs to include quantitative information about how those needs will be met: asking not only what, but also how much, of different spaces, services and technologies are required to enable effective work. Workstyles act as building blocks, each with a kit-of-parts to meet that workstyle's needs. Furthermore, workstyles typically follow the MECE rule – Mutually Exclusive, Collectively Exhaustive – so that each potential user or user group can be assigned to a workstyle.

DEVELOPING AND APPLYING WORKSTYLES

Workstyles are developed and validated in the research phase and applied during planning, through the following workflow:
1. Conduct general research, interviews and/or workshops to understand and broadly define the different types of users and the key dimensions that differentiate them.
2. Survey users along the key dimensions.
3. Analyse survey, interview and/or workshop data to develop workstyles, based on the key dimensions, which should be mutually exclusive and collectively exhaustive. At a basic level, this is done by 'cutting' the data along one or multiple dimensions, following natural breaks in the data. If the future user group is known, they are surveyed in Step 2 and concurrently assigned a workstyle when defined in Step 3.
4. Define the types and quantities of spaces and services (the kit-of-parts) that each workstyle receives, based on research, best practices and/or aspirations.
Developing and applying workstyles. [Figure: the workflow runs from (1) preliminary research and (2) a user survey along key dimensions (e.g. time spent out-of-office, time spent collaborating), to (3) workstyle development and (4 & 5) kit-of-parts development and review, covering spaces (assigned workstations, small meeting rooms) and services (event and travel planning, technology help, research assistance, professional coaching).]
5. Review and validate the workstyles and kit-of-parts through further interviews, workshops and/or observations.
6. Calculate the ratio of workstyles to be served.
7. Calculate, based on the ratios and kit-of-parts, the final types and quantities of spaces and services to deliver.

As workstyles are a tool to integrate qualitative and quantitative information, the questions asked in each research method should strive to deliver both types of information. As an example, understanding the narrative about where, how and with whom users work, as well as how long they spend in each location and how frequently they use workplace services, can reveal how many service points to provide, the staffing needs by type (e.g. technology, research, finance) and level (e.g. novice, expert), and the proportion of services to provide.
[Figure, continued: (6 & 7) calculating workstyle ratios and final deliverables.]
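Steps 3, 6 and 7 of the workflow lend themselves to a simple computational sketch. The dimensions, break points, workstyle names, survey figures and kits-of-parts below are all invented for illustration and are not drawn from the article:

```python
# Step 3 (simplified): cut two survey dimensions at a natural break (here
# 50%) to assign each respondent one of four MECE workstyles.
def assign_workstyle(pct_out_of_office, pct_collaborating):
    mobile = pct_out_of_office >= 50
    social = pct_collaborating >= 50
    if mobile:
        return "roaming collaborator" if social else "independent traveller"
    return "resident collaborator" if social else "resident specialist"

# Step 4 (hypothetical kits-of-parts): per-person rates of spaces/services.
kits = {
    "roaming collaborator":  {"assigned workstation": 0.5, "small meeting room": 0.10},
    "independent traveller": {"assigned workstation": 0.5, "small meeting room": 0.02},
    "resident collaborator": {"assigned workstation": 1.0, "small meeting room": 0.10},
    "resident specialist":   {"assigned workstation": 1.0, "small meeting room": 0.02},
}

# Steps 6 & 7: count the workstyles in the surveyed population, then scale
# each kit by its headcount and sum into the final quantities to deliver.
survey = [(10, 70), (80, 20), (60, 60), (5, 5), (20, 80), (90, 75)]
headcount = {}
for out_of_office, collaborating in survey:
    style = assign_workstyle(out_of_office, collaborating)
    headcount[style] = headcount.get(style, 0) + 1

totals = {}
for style, people in headcount.items():
    for item, rate in kits[style].items():
        totals[item] = totals.get(item, 0) + rate * people

print(headcount)
print(totals)
```

Because the kits-of-parts are just per-person rates, adjusting one rate and re-running the calculation shows the effect on the bottom line immediately, which is the dynamic use of workstyles described below.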
Successfully defined workstyles take the guesswork out of forecasting what is needed. They integrate qualitative and quantitative information so that planners and designers can easily scale designs and offerings up or down, and they afford a view of the micro (individual) and macro (environment) scales in concert. Moreover, designers can use workstyles in a dynamic manner, adjusting kits-of-parts in real time, as needed, to understand the effect on, and to negotiate, the bottom line. With their level of detail and flexibility, workstyles enable designers to break down the complexity of planning for a variety of users, and to do so in an efficient, considerate and research-backed manner.

BEYOND THE WORKSTYLE
Putting aside the 'work' in 'workstyle', workstyles can be applied in numerous settings beyond the workplace, whether it be for spaces only, services only, or integrating both in serviced environments such as libraries, airport lounges, museums, hotels, banks, fitness clubs and community centres. Just as personas have been used across disciplines, we hope to see planners and designers in all realms use workstyles to understand and plan for their users, to make grounded design decisions, to explain their work to others, and to surpass the call to conduct and apply qualitative and quantitative research in the design process.
Yen Chiang is a strategist at Brightspot Strategy. She engages clients through a user-centred and participatory process to envision and plan brighter work and learning experiences, and has worked on projects in the higher education, cultural, corporate and non-profit sectors.
Katrine Rau Ofenstein is a service designer at GE. She is very passionate about solving problems at a systems level. Before joining GE, she worked as a consultant in Denmark and the US, where she helped Fortune 500 companies, non-profit organisations and healthcare groups to identify business challenges and solve user problems through service innovation. Katrine has an MSc in Service Design from Aalborg University, Denmark, and is on the management team of the Service Design Network as the National Chapter Principal.
"I didn't plan on being so interested in this ethnographic study, but it quickly caught my attention."

Touchpoint decided to get to know Katrine Rau Ofenstein better: the principal of SDN National Chapters, who recently joined GE as one of its first service designers.

How did you first get in touch with service design?
I started thinking about designing services before I even knew the term. When I was travelling in Southeast Asia for half a year, I had a lot of time and interest in learning how people in other cultures were performing their services. I didn't plan on being so interested in this ethnographic study, but it quickly caught my attention. In the areas where I travelled, service employees valued taking the time to connect with people in their work. It really opened my eyes to how much our rushed culture has changed the world we live in and our expectations of good service.

You were one of the first founders of a National Chapter of the SDN – what made you push this forward?
I was really interested in connecting with like-minded people and sharing stories. I was finding that there wasn't enough experience sharing in Denmark. The events that were taking place were always about one company telling its own story, rather than the service experience it was delivering. I wanted to create a neutral forum that was not coloured by one company. The SDN gave me this opportunity. My co-founders and I set out with a mission of creating stronger bonds between designers. From there, we would be able to reach out to more companies and organisations to teach them about the service design approach.

"[...] getting the government and the policy makers to consider alternative ways of thinking would have a giant impact."

You are now at GE as a service designer. What difference are you hoping to make? Where do you see the opportunities?
GE is relatively new to the term 'service design', even though it has been thinking along the lines of service design for many years. An interesting thing about GE is that there is buy-in on design, and on the impact of design, at the higher executive levels. One of my main interests is to focus on service interactions and human-to-human interactions. I see this focus applying not only to GE employees but also to users of GE's products. All that being said, I'm interested in discovering new opportunities for integrating service design within the divisions of the organisation.

You moved to San Francisco about a year ago – does the USA need the same type of service design as Scandinavia?
When I moved to California and started interacting with the companies here, I was fascinated to discover one main difference between the evolution of service design in Europe and the USA: in Europe, the public sector has really been at the forefront of utilising service design, while in the US, service design has mainly been popularised in the private sector. The service design work seen in hospitals showcases this: in the US, some of the largest hospitals, like Kaiser Permanente and the Mayo Clinic, have been working on service design for years. In Scandinavia, many of the public hospitals have been picking up the model seen at Kaiser to create their own internal design teams. In reality, modern healthcare systems across the globe need service design. However, the economic model of each culture influences when and how service design is integrated. The same applies to any other industry. Most importantly, I believe that all countries have a need for service design in both the private and public sectors.

What is your favourite service?
While I may enjoy a variety of smaller services, I don't have a favourite social service (which is the area I'm most interested in with my work). That being said, one of the biggest opportunities for service design that I see is within government. Imagine how much we could do for our society! Not to say that service design or a single service designer can change the world, but just getting the government and the policy makers to consider alternative ways of thinking would have a giant impact. That would be amazing. One of my favourite service design companies, which I always refer to, is Participle in London. They have done amazing work for the public sector. They are at the forefront of understanding how to influence a government by creating meaningful services for the people around them.
Interview by Birgit Mager
buy touchpoint online!
Touchpoint, the SDN Service Design Journal, was launched in May 2009 and is the first journal on service design worldwide. Each issue focuses on one topic and features news and trends, interviews, insightful discussions and case studies. All issues of Touchpoint are available on the SDN website, both as a printed version and as an ebook. To purchase Touchpoint issues visit http://bit.ly/touchpoint-shop
volume 4 | no. 1 | 12,80 euro
volume 4 | no. 2 | 12,80 euro
uro ,80 e 3 | 12 | no. 13 me 4 nuary 20
ge l Chan Cultura ice Design by Serv ill
W ¬ How Worlds u Intend? Service Living Know What Yo s Service Evenson y By Shelle
and rdable ects all, Affo oj ete Sm ice Design Pr Compl rv sful Se Succes oker By Chris
Eat, Sleep, Play Design Principles for Eating Sustainably
By Michelle McCune
A Performing Arts Perspective on Service Design By Raymond P. Fisk and Stephen J. Grove
Hospitality Service as Science and Art
By Kipum Lee
Boom! Wow. Wow! WOW! BOOOOM!!!
By Christopher Wright and Jennifer Young
By Markus Hormeß and Adam Lawrence
Reinventing Flight. Porter Airlines: a Case Study
The Lost Pleasure of Randomness and Surprise By Fabio Di Liberto
volume 3 | no. 1 | 12,80 euro
volume 3 | no. 2 | 12,80 euro
volume 3 | no. 3 | 12,80 euro
Learning, Changing, Growing • Being Led or Finding the Way?
From Sketchbook to Spreadsheet
• Overcoming the
Mary Cook and Joseph Harrington
Jesse Grimes and Mark Alexander Fonds
• Better Services for the People
Service Design Creates Break through Cultural Change in the Brazilian Financial Industry By Tennyson Pinheiro, Luis Alt and Jose Mello
• Innovating in Health Care –
Sylvia Harris and Chelsea Mauldin
an Environment Adverse to Change Francesca Dickson, Emily Friedman, Lorna Ross
• Using Service Design Education
to Design University Services • Service Transformation:
Learning the Language of Finance Gives Your Ideas the Best Chance of Success By Jürgen Tanghe
Service Design on Steroids Melvin Brand Flu
Designing Human Rights By Zack Brisson and Panthea Lee
volume 2 | no. 1 | 12,80 euro
volume 2 | no. 2 | 12,80 euro
volume 2 | no. 3 | 12,80 euro
the journal of Service Design
Service Design and Behavioural Change
Business Impact of Service Design
Connecting the Dots
• Designing motivation or motivating design? Exploring Service Design, motivation and behavioural change Fergus Bisset and Dan Lockton
• Service Design – The Bottom Line Lavrans Løvlie and Ben Reason
• Service Design as Business Change Agent Mark Hartevelt and Hugo Raaijmakers
• How Human Is Your Business? Lauren Currie and Sarah Drummond
• Design and behaviour in complex B2B service engagements Ben Shaw and Melissa Cefkin
• Service Design at a Crossroads
• Stuck in a Price War? Use Service Design to Change the Game in B2B Relations. Lotte Christiansen, Rikke B E Knutzen, Søren Bolvig Poulsen
• Charging Up: energy usage in households around the world Geke van Dijk
service design network
Touchpoint | The Journal of Service Design
Touchpoint – the journal of service design
volume 1 | no. 1
volume 1 | no. 2 | 12,80 euro
volume 1 | no. 3 | 12,80 euro
What is Service Design?
Health and Service Design
Beyond Basics
• Make yourself useful Joe Heapy
• A healthy relationship Lavrans Løvlie, Ben Reason, Mark Mugglestone and John-Arne Røttingen
• Dutch Design: Time for a New Definition
• Do you really need that iPhone
• Designing from within Julia Schaeper, Lynne Maher and Helen Baxter
• Design’s Odd Couple Fran Samalionis and James Moed
• Service Design 2020: What does the future hold and (how) can we shape it? Bruce S. Tether and Ileana Stigliani
• Revealing experiences Christine Janae-Leoniak
• Service Design: From Products to People Lavrans Løvlie
• Great expectations: The healthcare journey Gianna Marzilli Ericson
A Time Machine for Service Designers
Service Design on Stage
Order online at http://bit.ly/touchpoint-shop
Buy the Touchpoint Collection and, in one fell swoop, get the whole back catalogue of Touchpoint (Vol. 1 No. 2 to Vol. 4 No. 3), along with a subscription to Volume 5, at an irresistible price!
download articles All articles published in Touchpoint since its first issue are available online! The formatted PDFs of single articles are downloadable at no cost for SDN members and can be purchased by non-members. You can search articles by volume and issue, by keyword or by author!
free access for sdn members!
Order online at http://bit.ly/touchpoint-shop
SERVICE DESIGN GLOBAL CONFERENCE CARDIFF | UNITED KINGDOM 19th – 20th NOVEMBER 2013 MEMBERS DAY 18th NOVEMBER
About Service Design Network The Service Design Network is the global centre for recognising and promoting excellence in the field of service design. Through national and international events, online and print publications, and coordination with academic institutions, the network connects multiple disciplines within agencies, business, and government to strengthen the impact of service design in both the public and private sectors. Service Design Network Office | Ubierring 40 | 50678 Cologne | Germany | www.service-design-network.org
Facts and figures, performance statistics and KPIs – these are what managers want when they initiate a project. Most service designers, on t...
Published on May 15, 2013