Techno-Ethos: The Emergence of a New Design Thinking Framework


Undergraduate Bachelor of Design Major Research Paper, by Mandeep Mangat

TECHNO-ETHOS: THE EMERGENCE OF A NEW DESIGN THINKING FRAMEWORK


Designed and written by Mandeep Mangat. Faculty Advisors: Alexander Manu, Bernhard Dietz, Matthew Lincez.

A thesis presented to OCAD University in fulfillment of the thesis requirements for the Undergraduate Bachelor of Design degree. View more work by Mandeep at: mandeepmangat.format.com. OCAD University, 2019. © Mandeep Mangat.


Fig. 1. Background Image 1. @Systems2018. 2018. Digital Image. Twipu. Web. 2019. < http://www.twipu.com/Systems2018/tweet/1037663964166074369 >.

TECHNO-ETHOS

In search of a future-oriented ethical design practice


TABLE OF CONTENTS

Fig. 2. Background Image 2: @Systems2018. 2018. Digital Image. Twipu. Web. 2019. <http://www.twipu.com/Systems2018/tweet/1037663964166074369>.

Major Research Paper
Fall & Winter Semester


PREFACE ............................................ 01

PART 1. DESIGN BRIEF ............................... 03
1.1. Introduction .................................. 05
1.2. Background .................................... 09
1.3. Aims & Objectives ............................. 11
1.4. Stakeholders .................................. 15
1.5. Scope & Limitations ........................... 19

PART 2. THEORY ..................................... 21
2.1. Philosophy of Technology ...................... 22
2.2. The Human Condition ........................... 29
2.3. Research Synthesis ............................ 33

PART 3. RE-FOCUSING ................................ 53
3.1. Paradigm Shift ................................ 55
3.2. Moral Philosophy .............................. 59
3.3. Trend Analysis ................................ 67
3.4. Trend Synthesis ............................... 93

PART 4. CONCEPT DIRECTIONS ......................... 97
4.1. Reorienting ................................... 99
4.2. Concept Directions ............................ 101
4.3. Concept Evaluation ............................ 139

PART 5. CONCEPT DEVELOPMENT ........................ 141
5.1. Design Brief .................................. 143
5.2. Experience Definition ......................... 151
5.3. Brand Development ............................. 158
5.4. Monetization .................................. 165
5.5. User Testing .................................. 171

PART 6. ETHICAL DESIGN TOOLKIT ..................... 173
6.1. Introduction .................................. 175
6.2. Ethical Design Activities ..................... 177
6.3. Toolkit Materials ............................. 199
6.4. Additional Resources .......................... 205

REFERENCES ......................................... 207


Fig. 3. Bat Enthusiast. Angel statues are red. 2016. Digital Image. Aminoapps. Web. 2019. < aminoapps.com/c/vaporwave-amino/page/blog/angel-statues-are-rad/5bqX_57TVuP4WmQDgvLe1dvXdxl6KGx1bR>.


PREFACE

DESIGN, THE ESSENCE OF HUMAN AGENCY, IF CORRECTLY UTILIZED, CAN DIRECT OUR FUTURE TOWARDS ONE OF TRUE HUMAN FLOURISHING.

As I worked towards completing my Bachelor of Design degree, I began noticing the increasing reliance on technological fixes in the design process, hand-in-hand with the acceleration with which emerging technologies were being developed and deployed. Contemplating how these paradigms could affect future societies, I started this meta project with a focus on the future of humanity as shaped by the ever-pervading nature of emerging technologies.

Through this research I came to the conclusion that design itself is the essence of human agency; thus, by correctly synchronizing the design process with both technological developments and human insights, we can direct our evolution towards true human flourishing for future generations. Because design thinking is both multidisciplinary and continuously developing, this report identifies a growing opportunity to future-proof the socio-cultural impacts of deployed design solutions. I have achieved this through creating an ethical design thinking toolkit and ethical design exercises for fellow designers who are concerned about our collective future.

I wish to acknowledge my professors Alexander Manu, Matthew Lincez, and Bernhard Dietz: the freedom, support, and inspiration you provided was instrumental in allowing my thesis to take shape. Lastly, I would also like to thank my close friends and family who supported me when needed, and patiently tolerated my absences down many a rabbit-hole: Haylee Strachan, Yinan Ma, Melih Yazici, Alice Thomas, Don Stoyle, and Choutu & Luna (sorry about the robot...).


Fig. 4. Skypressmarket. Digital Damage, seamless animation of Pixel Sorting Glitch. Movie Still. Web. 2019. < https://www.pond5.com/stock-footage/84838075/digital-damageseamless-animation-pixel-sorting-glitch-effect.html>.


1.1. INTRODUCTION ............... 11
1.2. BACKGROUND ................. 15
1.3. AIMS & OBJECTIVES .......... 17
1.4. STAKEHOLDERS ............... 21
1.5. SCOPE & LIMITATIONS ........ 25


PHASE 1: DESIGN BRIEF




1.1. INTRODUCTION

My meta project is focused on the future of humanity under the inevitable influences of emerging technologies. I have chosen to focus on this subject matter as it affords a large breadth of scope that can be aggregated into a multitude of divergent potential outcomes. Through exploring and comparing various future possibilities, I will be able to determine an optimal direction to guide future technological innovations. The disciplines involved in this process include applications of strategic foresight, speculative design, and design ethics.

Fig. 5. Selected disciplines to guide research methodologies and design process: strategic foresight, critical & speculative design, and design ethics.

The key driver of this analysis is founded on the perception-action cycle, which asserts that an individual operates within a closed-loop dynamic between oneself and one's extended environment in order to make sense of the world and one's place in it (Fig. 6). In this way, because memories of past experiences are relied on when assessing present and new situations, predictions can be made based on the likely assessments and implied actions of an individual in their environment (Byrd).

1. Past experiences inform actions taken in the world.
2. Stimuli from the environment are perceived through the senses of the body.
3. The input received by the body is cognitively processed and modifies existing knowledge.

Fig. 6. Diagram of the Perception-Action Cycle, based on: Christine Byrd, "What the Perception-Action Cycle Tells Us About How the Brain Learns." Mind Research Institute. Web. 2018. <blog.mindresearch.org/blog/perception-action-cycle>.

Actions made in the environment are perceived through the body; once processed, these new experiences can then become self-reinforcing or lead to the creation of new awareness (Byrd). These events are deeply intertwined and can precede one another at a pace of which the organism is unaware (Byrd). While this theory applies to individuals, it can also be scaled out to broader society, as well as to non-human agents. For instance, the various factors (the environment, physical body, and cognitive faculty) can be considered as unfixed variables. Within a society, divergent variables produce subjectivities, while commonalities result in the emergence of culture and collective paradigms.
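To make the closed-loop structure described above concrete, the following is a minimal sketch (my own illustration, not part of the thesis) of an agent cycling through perception, contextualization against memory, and action; the names `perceive`, `contextualize`, `act`, and `update` are hypothetical stand-ins for the stages in Fig. 6.

```python
# Minimal sketch of the perception-action cycle: body (senses), mind
# (memory + contextualizing), and environment form one closed loop.

class Agent:
    def __init__(self):
        self.memory = {}                      # past experiences, keyed by percept

    def perceive(self, stimulus):
        """Body: filter raw environmental stimuli through the senses."""
        return stimulus.strip().lower()

    def contextualize(self, percept):
        """Mind: interpret the percept against stored memories."""
        return self.memory.get(percept, "novel")

    def act(self, expectation):
        """Familiar percepts reinforce known responses; novel ones trigger exploration."""
        return "repeat known response" if expectation != "novel" else "explore"

    def update(self, percept, outcome):
        """New experience either reinforces memory or creates new awareness."""
        self.memory[percept] = outcome


agent = Agent()
for stimulus in ["Bright light", "bright light", "loud noise"]:   # the environment
    percept = agent.perceive(stimulus)
    action = agent.act(agent.contextualize(percept))
    agent.update(percept, f"{action} -> observed")                # feedback into memory
print(agent.memory)
```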



Fig. 7. Convergence and the inevitable merging of the Biosphere and Technosphere.

Due to the evolution of technology, the definition of the 'environment' within the framework of the perception-action cycle needs to be expanded to reflect the changing impacts on ecosystems brought about by the emergence of the technosphere. In contrast to the biosphere (an organic ecosystem capable of maintaining a dynamic equilibrium), the technosphere is a new and artificial system created through human action, including landscapes, technologies, and material culture (Soppelsa). Exploring the ongoing and evolving dynamic between these spheres, there is clear evidence that the technosphere has expanded to dominate the biosphere (Soppelsa). For instance, Ray Kurzweil's 'law of accelerating returns' demonstrates that the pace of technological innovation increases exponentially over time, thus out-pacing the evolution of the natural environment (Berman and Dorrier). Therefore, with this pace of change, it is clear that technology (existing in the environment, and increasingly a part of the human body) will play the principal role in directing the immediate experiences and development of shifting paradigms within humanity. My focus throughout this report will be centered on the unfolding hybridity between technology and humanity, and what this dynamic will mean in the future.
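As a rough numerical illustration of the accelerating-returns argument (a back-of-the-envelope example of my own, with an assumed doubling period rather than a figure from the thesis), a capability that doubles on a fixed cadence grows by orders of magnitude within a single human lifetime, while biological evolution is effectively static on the same timescale:

```python
# Exponential growth under an assumed 2-year doubling period:
# roughly 1,000x in 20 years and 1,000,000x in 40 years.
doubling_period_years = 2
for years in (10, 20, 40):
    growth = 2 ** (years / doubling_period_years)
    print(f"after {years} years: {growth:,.0f}x")
```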


Fig. 8. M.C. Escher. Drawing Hands. 1948. Graphics. Arthive. Web. 2019. <arthive.com/escher/works/200361~Drawing_hands>.


1.2. BACKGROUND

Focusing on the future of humanity as shaped by human intervention through technology (considering intended and unintended consequences), some common themes emerge. For instance, popular science fiction scenarios for decades have included: the extinction of mankind, recurrent collapse, humanity plateauing, and, finally, variations on post-human states (Bostrom). From the aforementioned directions, there are a number of scenarios that diverge from the current modern human condition – scenarios that extend to an entirely new concept of what it means to be human: the "post-human" (James and Bostrom). In Nick Bostrom's definition of a posthuman condition, at least one of the following parameters must be met:
· A significant increase in life expectancy.
· Enhanced cognitive capacities.
· Individuals have increased control over sensory input, for the majority of the time.
· Psychological suffering becomes a rare occurrence.
While situating my designed outcome in a post-human future, my approach focuses on the current human condition in order to generate true value and meaning that is relevant to society today.


THEMES: AUGMENTED AND INFERRED KNOWLEDGE, MIND UPLOADING, INTELLIGENCE, SIXTH SENSE, PSYCHIC ABILITIES, MULTI-SENSORY EXPERIENCES, SENSORY SYNESTHESIA, DISEMBODIMENT, MORTALITY, TECHNOLOGICAL EMBODIMENTS, BIOPOLITICS, EXPERIENTIAL SUBJECTIVITIES, TELEPATHIC COMMUNICATION, MULTI-DIMENSIONAL CONSCIOUSNESS, DE-SUBJECTIFIED SELF, TECHNOLOGICALLY FACILITATED FORMS OF SOCIAL ORGANIZATION, MULTI-IDENTITIES, UNI-EMPATHY, SPECIES-DIVERGENCE, GENDER-FLUIDITY, PERSONHOOD.



1.3. AIMS & OBJECTIVES

The aim of the research has been to uncover the essence of humanity, removed from technological influence, in order to rebalance the dynamic between the two. In this way, re-contextualizing the concepts of 'the human' and 'technology' affords an entirely new theoretical framework through which a new system of technological applications, centered around the human subject, can be derived. Additionally, the insights from this framework could be scaled down to inform human interactions and the design of everyday objects. Finally, this new framework is intended to provide a toolkit and policy road-map to guide future designers and inform the direction of future technological innovation. In order to achieve this, I will be investigating the topics of technology and the present human condition. This will include literature reviews of relevant theories covering the intersection of the disciplines of philosophy and technology, the evolution of technology, social technology studies, critical social theory, and techno-ethics. From these source materials, the foresight method of generating signals will be leveraged to formulate the likely change impacts which will occur from the present human condition. The insights from this research phase will be synthesized into opportunities, as new conceptual directions.



Fig. 9. Vwhatif. Modern Mosaic. 2015. Digital Image. Instagram. Web. 2019. < https://www.instagram.com/p/9meoV5iBz2/>.



Fig. 10. Harvey Kirby. Unknown. 2018. Graphics. HarveyKirbyMedia. Web. 2019. < harveykirbymedia.files.wordpress.com/2018/02/unknown.jpg>.


GENERATIVE QUESTIONS



WHAT IS THE HUMAN BECOMING?
WHAT IF TECHNOLOGY EXPRESSED OUR PURE ESSENCE?
WHAT IF THE BOUNDARIES BETWEEN TECHNOLOGY AND HUMANITY HAD ENDLESS PERMUTATIONS?
WHAT IF HUMANS HAD SUBSPECIES?



1.4. STAKEHOLDERS

TARGET AUDIENCE: In order to enable a futuristic perspective, the target audience is future generations living in a technocratic society. To further define this target audience I have leveraged information available on Generation Alpha.

The children of millennials, this demographic is born between 2010 and 2025 (Edmondson). This time-frame indicates they will be the first generation to have grown up fully immersed in the internet of things, social media, and emerging waves of artificial intelligence, earning them the title "digital natives" (Tootell et al.). With demographic changes, they are often born into single-child families; this, coupled with access to unlimited online content and social technology, is forecast to make them prone to selfishness and instant gratification (Carter). Born during the Fourth Industrial Revolution, they will have the opportunity to shape emerging technologies such as A.I., robotics, autonomous vehicles, I.O.T., 5G networks, and nanotechnology. Due to this exposure, they will be the most technically adaptive generation to date (Tootell et al.). While trusting technologies due to personal immersion (such as growing up with robot nannies and personal A.I. tutors), this generation will also be suspicious of corporations and the government, valuing equality, fairness, ethics, and the need for greater laws and regulation (Fourtané).


Fig. 11. Factors Influencing Generation Alpha: the Fourth Industrial Revolution, disemployment & wealth inequality, global technological regulation, the data economy, human-machine collaborations, digital nativity, the age of impatience, and instant gratification.


PRIMARY STAKEHOLDERS

Big 5 Technology Giants
The top five most valuable companies currently, Facebook, Google, Apple, Amazon, and Microsoft, are the leaders of technological innovation, each individually dominating an aspect of the consumer market. These companies have an expansive economic and growing cultural influence. As corporate entities, their needs are continuous innovation, increasing revenue, and expanding consumer reach.

SECONDARY STAKEHOLDERS

Government Departments and Agencies
With a responsibility towards maintaining peace and achieving greater prosperity, various levels and departments of the government are tasked with implementing new regulations and laws to enforce standards and protect citizens.

University Labs, Research Institutions, and Think Tanks
Educational institutions and other organizations acting as thought-leaders on the subject matter, and as a touchpoint for new generations to come.

TERTIARY STAKEHOLDERS

Experts in Psychology, Sociology, and Cultural Anthropology
Professionals in these fields will have an unbiased opinion on the effects of various technologies on human psychology, sociology, and culture.

Fig. 12. Stakeholder Map: technology giants (Microsoft, Amazon, Apple, Facebook, Google); universities and research institutions (The Future of Humanity Institute, MIT Media Lab, Institute for the Future, Institute for Human Futures, The Arlington Institute, SFI (OCAD)); experts in psychology, sociology, and cultural anthropology; and government departments and agencies (Industry Canada, DARPA, Defense Research and Development Canada, Policy Horizons Canada, and the National Research Council technology roadmapping network), all oriented around humanity and technology within a technocratic society.


1.5. SCOPE & LIMITATIONS

The duration of this project spanned a total of 8 months, and it was conducted within the scope of sociological possibilities arising through technological innovation. The preliminary areas of investigation included: philosophy of technology, the human condition, social technology studies, and foresight research into technology trends. Once this secondary research was synthesized, some supplementary areas for investigation were identified. Afterwards, opportunities and departure points were generated that led to a final designed solution.

The limitations of this project have been based around the need to form a solution that provides meaningful benefit to humanity. Therefore, the proposed solutions must include consideration towards promoting human happiness and desirable social values. Through this directive, I have discarded opportunities that generate purely financial income (and inequitable outcomes), and instead pursued opportunities that provide broader societal benefit and well-being.


Fig. 13. Jwalk. Wallpaper HD of City. 2017. Digital Image. Wallhere.com. Web. 2019. < wallhere.com/en/wallpaper/197255>.



Fig. 14. Skypressmarket. Digital Damage, seamless animation of Pixel Sorting Glitch. Movie Still. Web. 2019. < https://www.pond5.com/stock-footage/84838075/digital-damageseamless-animation-pixel-sorting-glitch-effect.html>.


2.1. PHILOSOPHY OF TECHNOLOGY ...... 23
2.2. THE HUMAN CONDITION ........... 29
2.3. RESEARCH SYNTHESIS ............ 33


PHASE 2: THEORY




2.1. PHILOSOPHY OF TECHNOLOGY

WHAT IS TECHNOLOGY?
The German-Swiss philosopher Karl Theodor Jaspers defined technological development as a process through which the rational, human mind "masters nature for the purpose of molding his existence, delivering himself from want, and giving his environment the form that appeals to him" (Jaspers 98). My interpretation of this is that technology is the result of a deliberate act - a process of exerting willpower in order to actualize internal visions. While these 'actions' have historically taken place in the physical world, they are increasingly occurring in digital realms due to technological evolution.

WHAT IS THE DYNAMIC BETWEEN HUMANITY AND TECHNOLOGY?
In analyzing perceptions of technology, the discussion can be categorized into the "ideal of technology" in contrast to the "aims of technology" (Shukla 177-188). Here, the idealized attitude towards technology acknowledges the belief that technology is utilized in exchange for the benefit of the human condition (i.e. to make life easier or more enjoyable) (Shukla 186-187). Within this, however, are various social groupings that form subsets of the human condition (i.e. all of humanity, society, subcultures, or individual benefit) (Shukla 189). While the "ideal of technology" reveals a larger perspective on how humanity has historically and cross-culturally related to technology, the underlying "aims of technology" provide insight that is much more nuanced. The particular approaches to applying technology (i.e. the "aims of technology") reflect the contextually and historically-rooted modes of human life (Shukla 177-180). Therefore, the specific applications of technology are formed by a society in response to their temporal requirements, values, and beliefs associated with human life (Shukla 177-179).



POST-PHENOMENOLOGY
To establish a foundational grasp of the role technology plays in shaping human experience, I will briefly summarize Don Ihde's post-phenomenological approach to technology below:

OBJECT-SUBJECT DUALITY & TECHNOLOGY
Previous beliefs on object-subject duality consider both the subject (the human) and the object (the world) as fixed positions "in which only the manner...the object is experienced by the subject is affected" (University of Twente "What can we learn from Don Ihde"). Ihde counters this viewpoint with the notion of "mutual constitution", whereby "the relation between subject and object [is] always ... [preceded by the] subject and object themselves; they are constituted in their interrelation" (University of Twente "What can we learn from Don Ihde"). Furthermore, mediation theory develops this through the argument that, within interaction, subjectivity and objectivity co-shape one another.

TECHNOLOGY AS MEDIATOR
Existing within the philosophy of technology, mediation theory is centered around technology. Within this context, technology itself acts as the mediator of interactions between subjects and objects (Verbeek "Mediation Theory"). More so, different technologies, along with technological innovation, provide new ways of interpreting the world.

CHANNELS OF INTERACTION
While applying an ethical lens to mediation theory (see: 2.1. "How do technologies shape our inner lives?"), Peter-Paul Verbeek sub-categorizes the interactions between subjects and objects - proposing that humans can act and enforce themselves in the world, while simultaneously perceiving and experiencing the world through technology (Verbeek "Mediation Theory").

Fig. 15. Diagram of technologically mediated human-world relations (human → technology → world: perception/experience; action/practices), based on Mediation Theory: University of Twente. What can we learn from Don Ihde? FutureLearn. Web. 2019. <https://www.futurelearn.com/courses/philosophy-of-technology/0/steps/26324>.


WHAT IS THE SCALE OF TECHNOLOGICAL INFLUENCE ON SOCIETY?
In "Technopoly: The Surrender of Culture to Technology", Neil Postman outlined the historical phases of technology in relation to human societies. He argues that the realm of technological influence has grown to the point that the trajectory of culture is now dependent on technological innovation (Wikipedia "Technopoly"). Prior to this current social-technological hybrid, earlier forms of groupings can be defined as "tool-using cultures" and "technocracies" (Wikipedia "Technopoly"). Through combining Postman's categories of technological social strata with cultural anthropology, we are given a sense of the leverage that technology has on our daily lives. For instance, the diagram in Fig. 17 indicates the components that comprise a culture: symbols; stories; rituals and routines; power structures; organizational structure; and control systems (Johnson et al.).

TOOL-USING CULTURE
From very early on in human development, "tools" have played a central role in the "symbolic world". Tools were produced and acted as simple technologies for utilitarian and symbolic purposes. Their utility only addressed physical problems.

TECHNOCRACIES
Beginning in the 18th century, increasingly sophisticated tools began to play a central role in the "thought-world of culture". Tools were now involved in the social and symbolic worlds.

TECHNOPOLY
Since the Industrial Revolution, technology has entered and increased its influence in all realms of human existence (industries). "Technopoly" is a new state of culture, in which culture is driven by technology and innovation.

Fig. 16. Table outlining the transition from tool-using cultures towards technopolies, based on: Postman, Neil. Technopoly: The Surrender of Culture to Technology. 1992.



Fig. 17. Aspects of culture (symbols; stories; rituals & routines; power structures; organizational structure; control systems), based on: Gerry Johnson, Whittington, and Scholes. The Cultural Web. 2011. Fundamentals of Strategy. FT Press.



HOW DO TECHNOLOGIES SHAPE OUR INNER LIVES?

TECHNOLOGY AND MORALITY
The criteria for moral agency - the ability to distinguish between notions of right and wrong, and to act on these ethical judgments - provide a framework to conclude that humans are indeed individual moral agents (Wikipedia "Moral Agency"). Taking into account that technology conducts a mediative role between subjectivity and objectivity, one can then conclude that technology must also now be playing a part in mediating the moral choices of the individual. For instance, the qualities inherent within a particular technology can suggest a hierarchy of choices for the agent to select between and act upon, whereby the options presented would be associated with particular values (while excluding others through omission) (Bowles 4-5). An example of this is smartphone apps, where the information architecture (i.e. the layout of the features and the menu) actively promotes certain actions over others through varying ease and accessibility of options - thus biasing the outcomes.
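As a toy illustration of this biasing effect (my own sketch, with a hypothetical menu and a simple behavioural model rather than anything from the cited sources), options that cost fewer interactions end up chosen far more often even though all options are nominally available:

```python
import random

# Hypothetical app menu: option -> number of taps needed to reach it
menu = {"share data": 1, "review permissions": 3, "opt out of tracking": 5}

def choose(menu):
    """Pick an option with probability decaying in interaction cost."""
    weights = {opt: 2 ** -cost for opt, cost in menu.items()}
    total = sum(weights.values())
    r = random.uniform(0, total)
    for opt, w in weights.items():
        r -= w
        if r <= 0:
            return opt
    return opt

counts = {opt: 0 for opt in menu}
for _ in range(10_000):
    counts[choose(menu)] += 1
print(counts)  # the one-tap option dominates the outcomes
```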

Fig. 18. Circle.


MORAL NORMALIZATION
In addition to considering both the moral quality of technology and the moralizing outcomes from using technology, mediation theory also suggests that the nature of technology can itself ultimately lead to the establishment of new moral norms through feedback loops (Philosophy of Technology and Design). The process of this interaction can be broken down into a number of distinct stages:

1. Our current environment is naturally composed of the after-effects of past technologies. These technologies mediate our understanding of the world.

2. Mediated interactions with the environment (i.e. the "world") influence human nature. Within this context, the types of technologies and the possibilities of human nature are variables.

3. The development of future technologies is directed based on the present state of humanity.

4. The cycle repeats itself: as new technologies provide a normalization feedback loop, they affect the quality of humanity, ultimately going on to influence the direction of future technological initiatives (a toy simulation of this loop is sketched below).
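The sketch below is my own toy rendering of the four stages as a loop (the variables `norm` and `tech_bias` and their update rules are invented for illustration, not taken from mediation theory); it only shows how a small mutual influence, iterated, drifts both the social norm and the values embedded in the next generation of technology.

```python
# Toy feedback loop: a social norm and a technology's embedded bias
# repeatedly nudge one another across "generations" of design and use.

def mediate(norm, tech_bias, influence=0.3):
    """Stages 1-2: technology-mediated experience nudges the social norm."""
    return norm + influence * (tech_bias - norm)

def design_next_tech(norm, amplification=1.1):
    """Stage 3: the next technology is designed from the present state of
    society, slightly amplifying whatever the current norm already is."""
    return norm * amplification

norm, tech_bias = 0.2, 0.8              # arbitrary starting values on a 0-1 scale
for generation in range(1, 6):          # Stage 4: the cycle repeats
    norm = mediate(norm, tech_bias)
    tech_bias = design_next_tech(norm)
    print(f"gen {generation}: norm={norm:.2f}, tech bias={tech_bias:.2f}")
```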

HOW IS POWER DISTRIBUTED THROUGH TECHNOLOGY?

POLITICS OF TECHNOLOGY
Driven by a need for market expansion, technologies have been developed to the point that they have become democratized and globally accessible (such as the internet, smartphones, and social media platforms). This global proliferation means that these technologies can take on a political role as they can gather, connect, and influence audiences across multi-national boundaries (Feenberg). This entirely new form of social organization has yet to reach its pinnacle in terms of behavioral impact.


Fig. 19. Unknown Artist. Glitch Art #Luxurydotcom. 2019. Digital Image. Pinterest. Web. 2019. < https://www.pinterest.ca/pin/770115605010940092/>.




2.2. THE HUMAN CONDITION

WHAT IS THE STATE OF THE HUMAN?
Deragon describes humanity as "the quality or condition of being human, human nature" itself. This contains inherent human characteristics including "ways of thinking, feeling and acting" (Deragon). However, many futurists and technologists (such as Ernst Kapp and André Leroi-Gourhan) have argued that, due to the increasing influence of technology (such as bio-technologies), the human condition is increasingly becoming the technological condition (University of Twente "Technology mediates - to the very heart of political life").

IS THERE A CORE HUMAN ESSENCE THAT HAS REMAINED UNTOUCHED BY TIME AND TECHNOLOGY? WHAT ARE THE EXISTENTIAL PARAMETERS OF OUR SPECIES?
Literature research into the philosophy of technology (see: 2.1) revealed that while we create technology, it, in turn, also influences our experiences and changes our being. In order to fully grasp the parameters through which the nature of humanity can be changed, research into moral ethics, design ethics, and the ethics of technology was conducted with the aim of comparing and aggregating results of human impact.



The characteristics of human nature can be divided into 4 quadrants: cognitive state, emotional capacities, social relations and social structure, and ethical values. In consideration of the human, these quadrants have always existed throughout our evolution; however, the quality or contents within each quadrant are variables that change based on context, the environment, and new experiences. When evaluating preferred changes to the qualities of humanity, consideration must also be given to which attributes should be preserved and carried into the future. To help guide this evaluation, I have formulated evaluative questions pertaining to the characteristics of human nature into the quadrants below:

EMOTIONAL IMPACT:
· How can interactions promote empathy, compassion, and respect?
· How will technology and new experiences influence our degree of vulnerability and sense of safety?

COGNITIVE IMPACT:
· How does the solution re-arrange the concept of intelligence?
· How are we being cognitively changed by the technology (e.g. attention span, concentration, and critical thinking skills)?
· What sort of opinions/beliefs about the world will the solution create?

SOCIAL IMPACT:
· How will society be organized?
· How do we want human beings to live, interact, and act together?
· How should social norms change?

ETHICAL IMPACT:
· How does one define a "good life", and how do we determine how one should live?
· How does society want our morals to be shaped?
· What is the impact on values such as human rights, social justice, human value (life), and stewardship?



Fig. 20. Dimensions of human qualities (emotional, cognitive, social, ethical) experienced throughout the human condition.



2.3. RESEARCH SYNTHESIS


TECHNOLOGY AND THE HUMAN CONDITION:
The nature of technology in relation to the human condition is one that is expansive – continuously increasing its influence in the various spheres of human life. As supported through the "Evolution of Technology" time-line (Fig. 21) and analysis of emerging Fourth Industrial Revolution technologies, technological innovation over the past 2-3 decades has been increasingly active in entering the intimate, social, and affective lives of technocratic citizens. Furthermore, the "Techno-Culture Data Map" (Fig. 25) outlines such social-technological disruption as including new forms of social organization and facilitated communication, while continuously shifting the concept of what it means to be human. It is through enabling online social networks that these technologies have decreased face-to-face interaction (resulting in the decline of empathy and interpersonal bonding), while in turn offering instantaneous on-line connection to a larger community. Through creating a "Typology of the Technological as Social" (Fig. 27), a new class of technology appears to be emerging. These technologies seek to simulate and substitute humans through replicating social roles and social interactions. For this reason, I categorize them as "Artificial Social Agents". By applying the aforementioned concept of the "aims of technology" (2.1. Philosophy of Technology) throughout this subsection, there is both an applied need and an opportunity for improved human connection.



EVOLUTION OF TECHNOLOGY

This time-line is a linear exploration of the technological developments made throughout human history. It synthesizes previous concepts explored in this section, such as Postman's categorization of varying technological cultures. Additionally, through leveraging Shukla's "aims of technology", a critical lens has been applied that provides greater insight into this relationship.

TOOL-USING: THE AGRARIAN AGE (10,000 BCE - Middle Ages)
The Agrarian Revolutions resulted in a transition from a lifestyle of hunting and gathering towards settlement and agriculture. The increased access to food supported larger populations and produced an overall improved quality of life.
Aim of Technology: Increase population and the likelihood of survival through improving access to key resources.

TECHNOCRATIC SOCIETY: THE INDUSTRIAL AGE (1760 - late 20th century)

1st Industrial Revolution (1760 - 1870)
This time-frame saw new inventions such as the cotton gin and the steam engine, which resulted in factories, mechanization, tooling, and iron. These technologies redefined wealth by land-ownership and laborers. New social divisions between the bourgeoisie and proletariat (working class) were created.
Aim of Technology: Increasing production methods to create new wealth.

2nd Industrial Revolution (1870 - 1914)
New innovations included steam power, the telegraph, petroleum, electricity, and the internal combustion engine. This era saw the beginning of mass production (and consumerism), the distribution of electricity, utilizing steel, and artificial lighting (through the light bulb). Economically, commodities and tangible resources became indicators of wealth.
Aim of Technology: Increasing production methods and processes for a refined quality of life.

TECHNOPOLY: THE INFORMATION AGE (1950's - present)

Digital Revolution (1950's - 2000's)
This time-frame was shaped by advanced forms of capitalism (centralized capitalism) and economic growth through knowledge and ideas. New technologies included microcode, the PC microprocessor, client-server networks, and cloud computing. These led to inventions such as the personal computer, the mobile phone, the Internet, and the world wide web. The notion of a worker evolved to include knowledge, shifting from a "know-what" to a "know-how" value.
Aim of Technology: Improved connection and communication, guided by capitalistic values such as individualism, intelligence, and efficiency.

Data Revolution (2000's - present)
We are currently undergoing the Fourth Industrial Revolution, marked by technologies such as artificial intelligence, autonomous vehicles, IOT, nanotechnology, big data, blockchain, and 5G networks. Globally, we are moving towards decentralized capitalism and an economy of knowledge. Education has shifted to emphasizing the ability to quickly absorb new skills in response to technological innovations. The full impact that these technologies will have is yet to be realized.
Aim of Technology: Driven by the sense of living in a global village, technology is applied to fuse the physical, digital, and biological spheres for further integration, connection, and communication.

Fig. 21. Evolution of Technology Time-line.


Fig. 22. Edited by Mandeep Mangat, original by: Howe, Charlotte. Patternity.org. 2018. Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/182466222382186267/>




HORIZON SCANNING OF EMERGING TECHNOLOGIES

ARISING FROM THE THIRD INDUSTRIAL REVOLUTION, THIS REVOLUTION REPRESENTS THE EMERGENCE OF "CYBER-PHYSICAL SYSTEMS" THROUGH THE COMBINATION OF TECHNOLOGIES THAT WERE PREVIOUSLY INDEPENDENTLY DEVELOPED (DAVIS). THIS INTEGRATION OF HARDWARE, SOFTWARE, AND BIOLOGY IS BLURRING THE BOUNDARIES "BETWEEN THE PHYSICAL, DIGITAL, AND BIOLOGICAL SPHERES" OF HUMAN EXISTENCE (SCHWAB). TOGETHER WITH THE EXPONENTIAL RATE OF PROGRESS, THESE EMERGING TECHNOLOGIES ARE TRANSFORMING ENTIRE SYSTEMS OF "PRODUCTION, MANAGEMENT, AND GOVERNANCE" (SCHWAB). THE FOURTH INDUSTRIAL REVOLUTION TECHNOLOGIES ARE AS FOLLOWS:

5G WIRELESS NETWORKS (Digital / Physical / Biological)

Replacing 4G networks, fifth generation cellular telecommunications offer faster and greater capacity of data transmission (Wikipedia "5G"). Through providing accessible and increased connectivity (between devices, systems, and infrastructures), these networks are contributing to big data, streaming platforms, and smart cities (Rosenberg).

1. By 2020, "Singapore will be the world's [...] first smart city". Source: youtube.com/watch?v=Jltq-pTNz6Q
2. By 2025, 1/5 of global connections will be 5G. Source: rcrwireless.com/20180921/5g/five-5g-market-predictions
3. 5G will enable the market penetration of artificial intelligence. By 2025, UK citizens' spending on chatbots will increase by 45,000% compared to "2017's forecasted spend". Source: d10wc7q7re41fz.cloudfront.net/wp-content/uploads/2018/03/Smart-Cities-Report.pdf
4. "Driverless cars will be commercially available by 2025 and the whole UK transportation system will be fully automated by 2070". Source: weforum.org/agenda/2018/04/driverless-cars-are-forcing-cities-to-become-smart/
5. Mobile networks typically evolve every decade; as such, telecom companies will be running trials of 6G networks in 2030, with consumer-accessible networks predicted for 2035-2040. Sources: mouser.com/blog/5g-is-so-near-future-a-look-ahead-to-6g-and-7g; lifewire.com/6g-wireless-4685524

Time Horizon: predictions 1-5 plotted from now to 2060.


IOT & IIOT (Digital / Physical / Biological)

Developing from the digital revolution, the industrial internet of things will bridge the digital and physical spheres through sensors and actuators. This will contribute to the connectivity of entire ecosystems and the facilitation of human and machine collaborations (World Economic Forum "Industrial Internet of Things"). Themes pertaining to the future of IIOT include big data, surveillance, smart cities, and digitized lives.

1. "Machine-to-machine connections that support IoT applications will account for ...[over 50%] of the total 27.1 billion devices and connections... by 2021." Source: techrepublic.com/article/internet-of-things-iot-cheat-sheet/
2. "The global market for smart city solutions and services was $36.8 billion in 2016 and is expected to top $88.7 billion by 2025." Source: techrepublic.com/article/internet-of-things-iot-cheat-sheet/
3. "500 billion [IoT] devices are expected to be connected to the Internet by 2030." Source: cisco.com/c/dam/en/us/products/collateral/se/internet-of-things/at-a-glance-c45-731471.pdf
4. Towards 2030, connected systems will shape industries: agriculture, through "vertical and in-vitro food production"; healthcare, which will become more predictive and less reactive; and manufacturing, which will exist "as a service, with distributed manufacturing". Source: digitalistmag.com/iot/2017/03/29/what-will-internet-of-things-look-like-in-2027-7-predictions-04998096
5. The proliferation of IIOT and other technologies will lead to an increased global energy demand of 28% by 2040. Source: techrepublic.com/article/internet-of-things-iot-cheat-sheet/

Time Horizon: predictions 1-5 plotted from now to 2060.

3D PRINTING (Digital / Physical / Biological)

Digital manufacturing processes whereby digital files (computer-aided design) are used by a machine to produce 3D objects. This technology is disrupting the traditional manufacturing sector by providing opportunities for greater efficiency and sustainability (Silva). Additionally, through offering greater personalization, these technologies have contributed to an increase in maker culture among younger demographics.

1. The first entirely 3D printed residential complex will be built by 2020, in Eindhoven, Netherlands. Source: tue.nl/en/news/news-overview/eindhoven-gets-the-first-3d-concrete-printing-housing-project/
2. As the technology is further developed, 3D printing materials will transition from plastics to metal by 2020 or 2021. Source: deloitte.com/insights/us/en/industry/technology/technology-media-and-telecom-predictions/3d-printing-market.html
3. By 2025, "3D printers will print clothing at very low cost". Source: fabbaloo.com/blog/2015/5/17/futurist-predicts-the-path-of-3d-printing
4. By 2045, 3D printers will be able to print food, and will exist as appliances in most kitchens. Source: telegraph.co.uk/technology/news/11943575/Back-to-the-Future-Day-Five-experts-predict-life-in-2045.html

Time Horizon: predictions 1-4 plotted from now to 2060.


BIOTECHNOLOGY (Digital / Physical / Biological)

As technologies become increasingly ubiquitous and miniaturized, they increasingly enter the spaces of the body. Through augmenting and enhancing human qualities, previous dichotomies of life and death, and natural and artificial, are being redefined. Contributing trends include: genome editing, brain-computer interfaces, biological digital fabrication, synthetic biology, and the quantified self movement (Vairavan).

1. In the 2030's the neocortex will directly connect to the cloud. Source: cbc.ca/news/technology/artificial-intelligence-human-brain-to-merge-in-2030s-says-futurist-kurzweil-1.3100124
2. By 2030 humans will "be regularly going into body shops for upgrades" and by 2038, "bionic organs will synchronise via personal instruction from one's smartphone". These organs "will outperform their biological counterparts". Source: theguardian.com/lifeandstyle/2018/sep/22/regular-body-upgrades-what-will-humans-look-like-in-100-years
3. There is a 50% chance that aging will be brought "under decisive control" by 2038. Source: theguardian.com/lifeandstyle/2018/sep/22/regular-body-upgrades-what-will-humans-look-like-in-100-years
4. In 2045, emotional experiences will be purchased and streamed through direct brain stimulation. As well, urban environments will consist of living architectural hybrids, featuring entirely new synthetic life forms that combine both biology and machine. Source: telegraph.co.uk/technology/news/11943575/Back-to-the-Future-Day-Five-experts-predict-life-in-2045.html

Time Horizon: predictions 1-4 plotted from now to 2060.

QUANTUM COMPUTING (Digital / Physical / Biological)

Through combining physics with the existing field of computer science, supercomputing processes will be achieved (Keys). The transformation of computing power will support other technologies as well as advance a wide spectrum of fields: blockchain, cryptography, big data, machine learning, and algorithms (King).

1. "Small quantum technologies will be commercially available in...[2025, helping] businesses increase revenue, reduce costs and lower investments in infrastructure." Source: pluralsight.com/blog/career/tech-in-2025
2. By 2030, scientists will be able to program tissues on computers and print out organisms in labs. Source: archive.nytimes.com/www.nytimes.com/interactive/2011/12/06/science/20111206-technology-timeline.html
3. Through quantum computers aiding machine learning in data processing, "hundreds to thousands of new planets [will be discovered] daily by the early 2030's." Source: quantumrun.com/prediction/how-quantum-computers-will-change-world-future-computers
4. By 2040, quantum computers will be able to "enable near-perfect, real-time language translation between" different languages. Source: quantumrun.com/prediction/how-quantum-computers-will-change-world-future-computers
5. All human knowledge (including experiences) will be stored on a collective super computer in 2061. Source: archive.nytimes.com/www.nytimes.com/interactive/2011/12/06/science/20111206-technology-timeline.html

Time Horizon: predictions 1-5 plotted from now to 2060.


NANOTECHNOLOGY (Digital / Physical / Biological)

The technical process by which matter is manipulated in nanometers, thus creating entire systems that function on a molecular scale (Wikipedia "Nanotechnology"). While the applications are numerous (and vary based on the other technologies involved), future implications consistently involve the creation of many new materials, devices, and behavioral systems.

1. Advancements in healthcare by 2023 will include: tissue engineering enabling organ regeneration at a rate that will exceed normal human biology; and brain implants being used to "restore lost memory" and "unlock the potential of the human brain". Source: valuer.ai/blog/guide-to-future-nanotechnology-startups/#Within_5_years
2. By 2028, "programmable nanorobotic devices and nano-pharmaceuticals could be capable of reversing the effects of ... disease[s], assisting the immune system", "and fixing genetic errors in cells". Source: quantumrun.com/article/nanotechnology-future-medicine
3. By 2045, there will be "insect-sized ...[robots that will be] capable of flying in co-ordinated swarms". Industry applications include agriculture and the military. Source: https://www.telegraph.co.uk/technology/news/11943575/Back-to-the-Future-Day-Five-experts-predict-life-in-2045.html
4. By 2045, personal 3D printers "will be able to print out full and functioning organs", as well as "medicine and food". Source: valuer.ai/blog/guide-to-future-nanotechnology-startups/#Within_5_years

Time Horizon: predictions 1-4 plotted from now to 2060.

Fig. 23. Edited by Mandeep Mangat, original by: Howe, Charlotte. Patternity.org. 2018. Digital Image. Pinterest. 2019. <https://www.pinterest.ca/pin/182466222382186267/>.

FULLY AUTONOMOUS VEHICLES (Digital / Physical / Biological)

Through combining various technologies, increasing levels of autonomy are being applied to products, services, and systems (Wikipedia "Vehicular automation"). The current levels of autonomy range in degree of human assistance and ability for complex network communication. Applications include transportation, delivery, construction, and agriculture, with larger ambitions of entirely autonomous systems.

1. The Surgical Robotics Market is expected to double from $3bn in 2014 to $6bn by 2020. Growing demand for minimally invasive procedures will see market growth through till 2024. Source: itproportal.com/features/the-future-of-medical-robotics/
2. "Driverless cars are forecast to make up 75% of all traffic by 2040", bringing with it "the transformation of all ... infrastructure around ... driving". Source: theguardian.com/commentisfree/2016/may/24/robots-future-work-humans-jobs-leisure
3. Between 2040-2050, driverless technologies will be responsible for the disappearance of 1 in 4 jobs. Source: futuristspeaker.com/predictions/25-shocking-predictions-about-the-coming-driverless-car-era-in-the-us/
4. As the global population is expected to reach nearly 10 billion people by 2050, automation in agriculture will be vital to support the populace. Source: singularityhub.com/2017/10/30/the-farms-of-the-future-will-run-on-ai-and-robots/

Time Horizon: predictions 1-4 plotted from now to 2060.


ROBOTICS (Digital / Physical / Biological)

The development of physical machines that serve many industries (domestic, commercial, or military). Current trajectories focus on robots replacing human workers and on social robots intended for both social and affective interaction (Wikipedia "Robotics"). New developments in prosthetics, exoskeletons, and implants indicate this technology entering traditionally biological fields.

1. Over 1/3rd "of U.K. jobs could be at "high risk" of automation by ... 2030 and robots could take over 38% of current U.S. jobs" by 2033. Source: iftf.org/fileadmin/user_upload/downloads/th/SR1940_IFTFforDellTechnologies_Human-Machine_070717_readerhigh-res.pdf
2. The use of robots for sexual purposes "will be 'socially normal'" by 2041. Source: elle.com/uk/life-and-culture/culture/news/a33094/sex-with-robots-socially-normally-technology/
3. By 2037, half of the population will be out of jobs, followed by the majority of the population by 2050. Source: motherjones.com/politics/2017/10/you-will-lose-your-job-to-a-robot-and-sooner-than-you-think/
4. Due to the global demographic shift towards aging populations, robots will be implemented into elderly care to enable greater autonomy for these users. Sources: pdfs.semanticscholar.org/ef04/2e7178b7090d479a56aeee376d6f94d77fa0.pdf; bbc.com/future/story/20141016-the-odd-things-robots-do-to-us; medium.com/predict/seniors-in-a-cyberage-preparing-us-for-the-robot-revolution-8582e077c933

Time Horizon: predictions 1-4 plotted from now to 2060.

ARTIFICIAL INTELLIGENCE (Digital / Physical / Biological)

The intelligence demonstrated by computers that are informed by algorithms (step-by-step instructions). These algorithms have historically grown in complexity from early machine learning towards deep learning (neural nets comprised of layers of nodes that mimic the biological structure of the human brain) (Shah). Future themes include cognitive automation and AI consciousness.

1. "Machines are predicted to be better than us at translating languages by 2024, [...and]...writing high-school essays by 2026". Source: newscientist.com/article/2133188-ai-will-be-able-to-beat-us-at-everything-by-2060-say-experts/#ixzz5uj1S46Ue
2. "There is a 50% chance that high-level machine intelligence (defined as "a machine that can carry out most human professions at least as well as a typical human") will be developed around 2040-2050", rising to 90% by 2075. Source: frontiersin.org/articles/10.3389/frobt.2017.00075/full
3. There is a 50% chance that machines will outperform humans in all tasks by 2062. Source: newscientist.com/article/2133188-ai-will-be-able-to-beat-us-at-everything-by-2060-say-experts/#ixzz5uj19Hzpm
4. "Super intelligence (defined as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest")" is predicted for 3005, and there is a 30% chance "that this development turns out to be [...] "extremely bad" for humanity". Source: frontiersin.org/articles/10.3389/frobt.2017.00075/full

Time Horizon: predictions 1-4 plotted from now to 2060.

Fig. 24. Edited by Mandeep Mangat, original by: Howe, Charlotte. Patternity.org. 2018. Digital Image. Pinterest. 2019. <https://www.pinterest.ca/pin/182466222382186267/>.



TECHNO-CULTURE MAP

The following data map (Fig. 25) is divided into 3 zones based on the technological impact on humanity. The categories include: social facilitation, direct social interaction, and self-concept.

Technology now pervades some social interactions by mediating certain human-to-human communication (human-technology-human relations). This has evolved from telecommunication to email, and on to new social platforms along with digital gaming realms. Next, through the emergence of social robotics, service assistants, and chat bots, society is just beginning to interact directly with technology (human-tech relations) as a substitute for interacting with other persons. Interestingly, culturally, direct peer-to-peer or group interactions inform one's understanding of social norms and cultural values. The impact of relational human-to-technology interactions is yet to be perceived. Lastly, biotechnologies, ubiquitous computing, and neuroscience are disrupting the concept of what it means to be biologically human. For instance, previous views on the boundaries between life and death (and the associated cultural meaning thereof) are being challenged by new discoveries in neuroscience, and new innovations such as mind-uploading.

Fig. 25. Techno-Culture Data Map, clustering examples across the three zones of social facilitation, direct social interactions, and self-concept. It includes information & communication technologies (social media, Twitter, Instagram, Tinder, Snapchat, Facebook/Messenger, emoji communication, personalization algorithms, filter bubbles, echo-chambers, digital communities, and multiplayer digital networks such as League of Legends), robotics and artificial agents (chat bots, outsourced intelligence and assisted development via Siri, Amazon Alexa, and Google Now, service bots such as Roomba, companion bots such as Paro and the NAO robot, Harmony, Roxxxy, anthropomorphism, emotional attachment to objects, and new social behavioural norms of politeness & etiquette), and bio-technologies (wearable technology such as the Apple Watch and Fitbit, the quantified self, pharmaceuticals, CRISPR kits, mind uploading, affective computing, digisexuality, bio-politics, and transhumanism).



ARTIFICIAL SOCIAL AGENTS

By exploring the 'Evolution of Technology Time-line' (Fig. 21) and the 'Techno-Culture Data Map' (Fig. 25), one can conclude that technological development has evolved beyond simple tool-like, utilitarian-driven interactions and is increasingly becoming a part of the social fabric of human life. Particularly, there is a growing trend towards direct human-to-technological interactions, signaling a new age of socially intelligent technologies.

Artificial Social Agents
In the application of socially intelligent technologies, the designed innovation is intentionally presented as a substitute for

interacting with another person. By attempting to replicate human interaction, the directive for this technology is based on the integral parts of human sociability (Dumouchel and Damiano x). As such, the success of this endeavor entails convincing the human participant of a capacity for: affective expression; empathy; an active social presence (as a means of communication and establishing a dual subject-object relation); and the ability to adapt to varying social contexts (through stopping the performance of a task and exercising degrees of autonomy through decision making) (Dumouchel and Damiano 30-43). When these roles and interactions are achieved by technology, they will give way


Fig. 26. //Radbutfab//. Untitled. Edited Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/615093261577421385/?lp=true>

When these roles and interactions are achieved by technology, they will give way to a new classification known as "artificial social agents" within the field of human-computer interaction (Banerji 43-46).

Typology of the Technological as Social

The "Typology of the Technological as Social" (Fig. 27) maps out the ecosystem of the emerging technologies that take on the role of "artificial social agents". Through the divisional branching of this technology, further subspecies can be identified that comprise the greater landscape (Samsonoff and Tampilic 110-115). Additionally, the subclassifications are based on existing theories of social stratification. In analyzing this technology through a sociological lens, the claim that technology is becoming increasingly social, and thus gaining greater influence in human culture, is further validated (Wikipedia "Social Stratification").

Diagram Tiers

1 - Embodiment: The type and degree of engagement during communication is dependent on how the agent manifests itself to the viewer. Within this context, it can be tangible (taking up three-dimensional space, and experienced visually and tactilely) or non-tangible (conversational agents experienced through a voice-audio interface, text, or a digital avatar). These various embodiments afford the technology varying degrees of human sociability: a sense of presence (i.e. interacting with a singular individual through localized presence, or one that is pluralistic through an ambient manifestation); the ability for emotional expression (facial expression, body language, voice tonality, and word selection); and, finally, the ability to evoke empathy from the human user (Dumouchel and Damiano 102-104).


TYPOLOGY OF THE TECHNOLOGICAL AS SOCIAL

[Fig. 27 branches the landscape of artificial social agents by Embodiment (tangible: humanoid, animal, or object; intangible: screen/text or ambient/audio), by Relational Power Dynamic (dependent, reciprocal, or independent), and by Socioeconomic Status (blue-, white-, or pink-collar). Example agents plotted across the branches range from Sophia, Erika, Geminoids, ASIMO, Paro, Aibo, Furby, Pepper, NAO, KASPAR, Keepon, KURI, Cozmo, Roomba, and Amazon Scout to Google Home, Amazon Echo, Apple Siri, Amazon Alexa, Google Assistant, Microsoft Cortana, IBM Watson, Eliza, Replika, Tamagotchi, Clippit, Facebook Messenger and Slack chatbots, Forex trading robots, and Vital (Deep Knowledge Ventures).]

Fig. 27. Tree diagram outlining the typology of social technologies. In completing this diagram, it was discovered that the majority of existing artificial social agents fall on various spectrums between branches, and newer models may shift in their positioning. Visual aesthetic inspired by the "IOT Tree of Life" diagram from: Samsonoff, Nathan, and Tampilic, Michael. "On the Origin of Things". Game Changer, MISC Spring 2016: 110-15. Print.

Lastly, in terms of the tangible manifestations, various degrees of anthropomorphism are applied. The result is forms that are humanoid, animalistic, or simply object-based in their appearance. Collectively, these forms evoke a spectrum from familiarity towards otherness in how the user relates to the technology (Park 1-2).

2 - Relational Power Dynamic: Interpersonal communication entails the exchange of information between two or more people (Wikipedia "Power"). All of these exchanges involve degrees of persuasion and influence between the actors (Wikipedia "Power"). Further, social power is defined as "the capacity of an individual to influence the conduct of others" (Wikipedia "Power"). Within the context of this technology, power is directly related to the role (and therefore the intended interaction) the agent is designed for. For instance, the role determines the degree of autonomy in decision making and the amount of authority over others (through greater responsibilities). As well, providing social technologies with the ability to be flexible and adaptable in tasks enables greater reciprocity throughout the interaction. For this tier, relational power dynamics are categorized as: dependent (such as relying on the human user, simply absorbing them, or being directly controlled by them); reciprocal (i.e. a dynamic and collaborative exchange between the two agents); or independent (such as being self-determined by design, often expressing without awareness of the other, or having some degree of control over the human user).

3 - Socio-Economic Status: As this emerging technology is being developed and applied within a capitalistic model, stratification based on socio-economic status has also been explored. Socioeconomic classes can be divided into categories (i.e. high, middle, and low), with one's status determined by a combination of factors such as income, education, and occupation (Wikipedia "Socioeconomic status"). Looking into artificial social agents, the fact that they are designed implies an intended purpose to be served. Therefore, the landscape of this technology is comprised of various types of workers, and comparisons have been drawn to existing occupational strata. Summarizing existing forms of labour, the tiers for this technology include: blue collar (physical and mechanically skilled work), white collar (cognitive and intellectual tasks), and pink collar (service and emotional labour) (Wikipedia "Designation of workers").

Summary

Through this categorization process, various archetypal companions emerge. Such identities include the caretaker, friend, sexual partner, assistant, and dependent. Could these new technologies suggest the possibilities of an entirely new species, Techno-sapiens? Or perhaps a possible subspecies within the Homo sapiens?
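As a small illustration of how this typology could be operationalized within a design toolkit, the following Python sketch (my own illustration, not part of the thesis deliverables) encodes the three tiers as a simple data structure. The example placements of Paro and Amazon Alexa are assumptions for demonstration only.

```python
# A minimal sketch (assumption, not from the source) encoding the three
# classification tiers of the typology as a simple data structure.
from dataclasses import dataclass
from enum import Enum

class Embodiment(Enum):
    HUMANOID = "tangible: humanoid"
    ANIMAL = "tangible: animal"
    OBJECT = "tangible: object"
    SCREEN_TEXT = "intangible: screen/text"
    AMBIENT_AUDIO = "intangible: ambient/audio"

class PowerDynamic(Enum):
    DEPENDENT = "dependent"
    RECIPROCAL = "reciprocal"
    INDEPENDENT = "independent"

class CollarClass(Enum):
    BLUE = "physical/mechanical labour"
    WHITE = "cognitive/intellectual labour"
    PINK = "service/emotional labour"

@dataclass
class ArtificialSocialAgent:
    name: str
    embodiment: Embodiment
    power_dynamic: PowerDynamic
    collar: CollarClass

# Illustrative placements (my assumptions, not taken from Fig. 27).
paro = ArtificialSocialAgent("Paro", Embodiment.ANIMAL,
                             PowerDynamic.DEPENDENT, CollarClass.PINK)
alexa = ArtificialSocialAgent("Amazon Alexa", Embodiment.AMBIENT_AUDIO,
                              PowerDynamic.RECIPROCAL, CollarClass.WHITE)
print(paro, alexa, sep="\n")
```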


Fig. 28. //Radbutfab//. Untitled. Edited Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/615093261577421385/?lp=true>



Fig. 29. Skypressmarket. Digital Damage, seamless animation of Pixel Sorting Glitch. Movie Still. Web. 2019. < https://www.pond5.com/stock-footage/84838075/digital-damageseamless-animation-pixel-sorting-glitch-effect.html>.


3. RE-FOCUSING

3.1. PARADIGM SHIFT .............. 55
3.2. MORAL PHILOSOPHY ............ 59
3.3. TREND ANALYSIS .............. 67
3.4. TREND SYNTHESIS ............. 93



3.1. PARADIGM SHIFT


RE-FOCUSING: There is a significant need for a paradigm shift in terms of how we understand and relate to technology. As revealed in the previous section, technology is embedded in our day-to-day lives, and is very much a part of our humanity - physically, morally, culturally, and ideologically. However, without re-aligning the conceptual framework of technology in relation to humanity, it will not be possible to design humanity through future technologies while enabling authentic human flourishing. In this context, what if we instead designed with the perspective that humanity is highly complex, and that, consequently, technology can have a qualitative impact - positive or negative - on the experiences and identities of its human recipients? As designers, how might we begin to consider the dimensions of human experience (cognitive, social, emotional, and ethical) as our stakeholders?

For this section, I will provide insights from the philosophy of ethics in order to further establish an informed perspective on the current technological landscape and support the evaluation of future directions to pursue that bring true value to humanity. Following this, relevant case studies into mega-trends that relate to the topic of technology and ethics (i.e. shifting human values) will be investigated.


RE-GENERATIVE QUESTIONS

WHAT IF HUMANITY WERE A BLACK BOX?
WHAT IF THE QUALITIES OF HUMANITY WERE STAKEHOLDERS?
WHAT IF HUMANITY & TECHNOLOGY WERE IN A SYMBIOTIC-HYBRID DYNAMIC?
WHAT IF TECHNOLOGY AND HUMANITY WERE CO-EVOLVING TOGETHER?
WHAT WOULD IT MEAN FOR HUMANITY TO FLOURISH?



Fig. 30. Humanity and Technology.


Fig. 31. Branches of Normative Ethics: consequentialist ethics, deontological ethics, and virtue ethics.


3.2. MORAL PHILOSOPHY

NORMATIVE ETHICS

As argued in 2.1. Philosophy of Technology, technology impacts our morality by forming a moral-hybrid entity with the human subject, mediating our actionable options, and normalizing specific values through exposure. Given this scale of moral impact, it is worthwhile to leverage knowledge from the field of ethics in order to determine appropriate applications for technology. Moral philosophy is concerned with constructing and organizing concepts of right and wrong behaviour as a means of recommending how to live a moral life (Wikipedia "Ethics"). This branch of philosophy has three further subdivisions of study: meta-ethics, normative ethics, and applied ethics (Wikipedia "Ethics"). Within the study of practical ethical action, I will be investigating and comparing theories in consequentialism, deontology, and, lastly, virtue ethics.

CONSEQUENTIALIST ETHICS

Also known as utilitarian ethics, this division of normative ethics is

focused on the consequences of our actions relative to their impact on the happiness of humanity (Vallor et al. 4). Here, happiness is viewed as the ultimate good and the main purpose of society and life. In terms of happiness, utilitarians weigh the overall happiness and welfare an action will contribute towards all of humanity (Vallor et al. 4). A major contributor to consequentialist ethics, John Stuart Mill, measured happiness as an aggregate of pleasure and the absence (or decrease) of pain (Vallor et al. 4). Further on this topic, Mill asserts that intellectual and psychological happiness provide the greatest fulfillment (Vallor et al. 4). The pursuit of collective happiness entails considering the welfare of all of humanity as equal stakeholders, along with the long-term and unintended effects (Vallor et al. 4). Due to this, happiness is viewed as a common, universal good. However, utilitarians refrain from a standard set of rules to follow, believing that each case will vary based on context (Bowles 75).


DEONTOLOGICAL ETHICS

Deontological ethics is founded on adherence to moral rules, rights, principles, and duties that should be universal in application. Instead of examining the impact of an act, deontologists examine the nature of the act itself, with the belief that there is a standard of moral behavior with which to conduct oneself (Alexander and Moore). Still used today by deontologists is Immanuel Kant's categorical imperative - a deontological rule manifested in the following formulas: the formula of the universal law of nature (to act only on principles that could be applied to everyone), and the formula of humanity (to always treat other people as ends in themselves, never to be exploited for personal purposes) (Vallor et al. 2). Lastly, a final example of a deontological system of ethics is the Golden Rule: to do unto the other as you would be done by (Vallor et al. 2).

VIRTUE ETHICS

This stream of ethics claims that there is no set formula for ethical action; rather, ethical decisions are dependent on context, and the task of moral distinction exists throughout the entirety of one's life (Vallor et al. 7-8). To support this, virtue ethicists pursue happiness through developing individual moral character by demonstrating positive virtues in all actions (Hursthouse and Pettigrove). In consideration of a moral life, certain characteristics are valued over others. These characteristics symbolize the virtues an individual should pursue, whereas the opposite of these characteristics symbolize vices to avoid (Hursthouse and Pettigrove). Virtue ethicists believe that sets of virtues collectively function as a compass to guide an individual's moral choices. Additionally, these virtues are internally developed through habitual action (Hursthouse and Pettigrove). For example, Aristotelian ethics claims "we are what we repeatedly do" (Vallor et al. 8), indicating that characteristics are developed through habit. Lastly, through embodying virtues, an individual is capable of practical wisdom (an accumulation of moral perception, moral emotion, and moral imagination) (Vallor et al. 8-9). Here, practical wisdom is considered the model of moral excellence, and enables a person to make morally sound decisions and actions regardless of context.


Fig. 32. Unknown Artist. 12-9midnightwave. 2016. Digital Image. Kava Lounge. Web. 2019. < http://www.kavalounge.com/dev2/12-9midnightwave/>.


SUMMARY OF NORMATIVE ETHICS

Each branch of normative ethics carries its own merit and potential applications. To compare: consequentialist ethics promotes the equal distribution of benefits; deontological ethics could be useful in designing algorithms and for machine ethics; and, lastly, virtue ethics can outline the values that designs should promote in citizens. A synthesis of all three could be leveraged throughout a techno-ethical toolkit for designers.

TECHNOMORAL WISDOM

In the book Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting, Shannon Vallor develops and applies a moral framework for 21st-century wisdom that is based on the classical philosophical traditions of Buddhist, Aristotelian, and Confucian ethics. The 'Framework for Achieving Techno-Moral Wisdom' diagram (Fig. 34) summarizes Vallor's comparison and consolidation of various virtue ethics perspectives in suggesting a path for achieving wisdom. The interior ring recommends qualities that support the technomoral virtues in the second ring. Once actualized, the virtues in the second ring support degrees of wisdom (in the outer ring) that Vallor argues are chronological in developing one's own inner faculty of wisdom.


SUMMARY OF ETHICAL PERSPECTIVES

CONSEQUENTIALISM
Will the consequences of my actions maximize happiness and minimize suffering for the greatest number of people?

DEONTOLOGY
Will my actions be acceptable as a universal law of behavior? What if everyone did what I'm about to do?

VIRTUE ETHICS
Am I happy with the virtues that I will demonstrate through this course of action?

TECHNOMORAL VIRTUES
1. Honesty
2. Self-Control
3. Humility
4. Justice
5. Courage
6. Empathy
7. Care
8. Civility
9. Flexibility
10. Perspective
11. Magnanimity
12. Technomoral Wisdom
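As a hint of how these three lenses might later be operationalized in an ethical design toolkit, the following Python sketch (my own illustration, not a component of the toolkit described later in this report) turns the summary questions above into a reusable reflection checklist; the function name and example decision are assumptions.

```python
# A minimal sketch of combining the three normative lenses into one
# reflective checklist for a design decision. The prompts paraphrase the
# summary above; the structure itself is illustrative only.

LENS_PROMPTS = {
    "consequentialism": "Will the consequences of this decision maximize "
                        "happiness and minimize suffering for the greatest "
                        "number of people?",
    "deontology": "Would this decision be acceptable as a universal law of "
                  "behaviour? What if every team shipped this?",
    "virtue ethics": "Am I happy with the virtues this decision demonstrates?",
}

def ethical_review(decision: str) -> list[str]:
    """Return the reflection prompts a designer should answer for a decision."""
    return [f"[{lens}] {prompt}  (decision: {decision})"
            for lens, prompt in LENS_PROMPTS.items()]

for line in ethical_review("enable location tracking by default"):
    print(line)
```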



Fig. 33. Unknown Artist. Glitch Justice Blind Statue. 2018. Digital Image. Unicorn Riot. Web. 2019. < https://unicornriot.ninja/wp-content/uploads/2018/06/title-pic-glitch-justice-blind-statue.jpg,>.



FRAMEWORK FOR ACHIEVING TECHNO-MORAL WISDOM

[The diagram places 'The Human Blackbox' at its centre. The interior ring lists supporting qualities; the second ring carries the technomoral virtues (honesty, self-control, humility, justice, courage, empathy, care, civility, flexibility, perspective, and magnanimity) across emotional, social, cognitive, and ethical dimensions; and the outer ring traces eight successive practices of wisdom:]

1. MORAL HABITUATION: Habits of moral self-cultivation through intentional practice.
2. RELATIONAL UNDERSTANDING: The cultivation of formative relations founded on moral roles and responsibilities to one's network.
3. REFLECTIVE SELF-EXAMINATION: A lifelong self-reflective acceptance with a commitment to self-improvement.
4. INTENTIONAL SELF-DIRECTION OF MORAL DEVELOPMENT: Self-directing one's own growth through utilizing cultivated perception and reason, the knowledge of moral practices, and self-control over one's emotions and desires.
5. MORAL ATTENTION: Ethical judgement through environmental sensitivity.
6. PRUDENTIAL JUDGEMENT: To deliberate and choose the most appropriate and effective means available for achieving a noble end.
7. EXTENSION OF MORAL CONCERN: To expand one's attitude of moral wisdom contextually (i.e., the right: beings; time; manner; and degree).
8. TECHNOMORAL WISDOM: The combined actualization of all the practices in moral self-cultivation.

Fig. 34. Framework for achieving technomoral wisdom diagram. Based on: Vallor, Shannon. "Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting". New York, Oxford University Press, October 2016, pp. 61-155.



3.3. TREND ANALYSIS

WHILE THE PREVIOUS SECTION ESTABLISHED A FOUNDATION IN ETHICAL CONCEPTS, THIS PERSPECTIVE WILL NOW BE FURTHER DEVELOPED THROUGH CRITICALLY EXAMINING MEGA-TRENDS (FIG. 43) THAT ARE CURRENTLY SHAPING - AND WILL CONTINUE TO SHAPE - FUTURE GLOBAL VALUES. THE FOLLOWING IS A SUMMARY AND EXPLANATION OF THE TERMS THAT WILL BE USED IN ANALYZING THESE TRENDS.

MEGA-TRENDS / TRENDS:

The general tendency or direction of a change and/or movement over time (Forward Thinking Platform 22). Trends can be strong or weak; increasing, decreasing, or stable (Government Office for Science 93). Mega-trends and trends are the undercurrents of societal development, and their impact will reach full effect over the next 10-15 years (Forward Thinking Platform 22).

DRIVERS / DRIVING FORCES:

The nature of driving forces can be direct or indirect. Direct forces directly influence an outcome in the system, whereas an indirect driver behaves more diffusely,

impacting other drivers (Forward Thinking Platform 8).

SIGNALS / WEAK SIGNALS:

Together, signals act as early indicators of emerging patterns of potential new situations and trends (Forward Thinking Platform 9; Government Office for Science 4). These patterns can either transform the existing system, have little consequences and die off, or reappear (Forward Thinking Platform 9).

DISCONTINUITIES:

A rupturing force that interrupts the projection of an on-going situation. Similar to the concept of a wild card in its critical uncertainty (Forward Thinking Platform 6).



MEGA-TRENDS

MACHINE ETHICS AND ALGORITHMIC BIAS
SURVEILLANCE STATE
DATA CONTROL AND MONETIZATION
ECONOMIC AND ASSET INEQUALITIES
IMPLICIT TRUST AND USER UNDERSTANDING
EROSION OF TRUTH, DISINFORMATION, AND PROPAGANDA
ADDICTION, ATTENTION, AND THE DOPAMINE ECONOMY
COLLAPSE OF CIVIL SOCIETY AND THE CREATION OF HATEFUL ACTORS

Fig. 35. Table summarizing Mega-Trends. The framework for the following trends was adapted from two major bodies of work: Bowles, Cennydd. "Future Ethics". Brighton, NowNext Press, September 2018; IFTF and Omidyar Network. "Ethical OS: A guide to anticipating the future impact of today's technology". Digital Intelligence Lab at the Institute for the Future, and Tech and Society Solutions Lab at Omidyar Network, September 2018. https://ethicalos.org/ Additional research was conducted in order to substantiate and support the trends asserted.


3.3.1. MACHINE ETHICS AND ALGORITHMIC BIAS

AS WE DEVELOP ALGORITHMIC DECISION MAKING TO BE UTILIZED IN EVALUATIVE SYSTEMS (I.E. APPROVAL OR DISAPPROVAL FOR EMPLOYMENT, EDUCATION, INSURANCE, LOANS, WELFARE, AND CRIMINAL JUSTICE), BLACK-BOXED MACHINE LEARNING IS CODIFYING HUMAN BIAS AS LOGIC. FURTHER, AS AUTOMATED DECISION-MAKING IS MEANT TO BE IMPLEMENTED ON A MASSIVE SCALE - ACROSS ENTIRE INSTITUTIONS - THERE IS A GREATER RISK OF AMPLIFYING EXISTING DISCRIMINATION AND INEQUALITY THROUGH SYSTEMIC INJUSTICE.

DRIVERS:
> PREDICTIVE POLICING + JUSTICE SOFTWARES
> MACHINE LEARNING + DEEP LEARNING + DATA MINING
> HIDDEN USER CREDIT SCORES
> BIG DATA + DATA SETS
> ALGORITHMIC + DIGITAL REDLINING
> BLACK-BOX ALGORITHMS

DISCONTINUITIES:
> AUTOMATED DECISION SYSTEMS
> EXPLAINABLE AI (XAI)

SIGNALS:
> DATA GOVERNANCE INITIATIVES
> RACIALLY BIASED FACIAL RECOGNITION
> SEARCH ENGINES PROMOTING CERTAIN IDEAS + IDENTITIES
> AI NOW INSTITUTE PUBLICATIONS (THINK TANK); "AI NOW REPORT 2018"
> GDPR - ACCOUNTABILITY OVER ALGORITHMIC DECISIONS
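To make the risk of codified bias described above more concrete, the following Python sketch (my own illustration, not from the cited reports) shows one simple way an automated decision system could be audited for group-level disparities; the sample decisions and the "four-fifths" threshold are illustrative assumptions, not a definitive fairness test.

```python
# A minimal sketch of auditing an automated decision system for group-level
# bias, using the approval rate per group and the common "four-fifths"
# disparate-impact heuristic. Data and threshold are illustrative assumptions.

def approval_rate(decisions):
    """decisions: list of booleans (True = approved)."""
    return sum(decisions) / len(decisions)

def disparate_impact(decisions_by_group):
    """Ratio of the lowest group approval rate to the highest."""
    rates = {g: approval_rate(d) for g, d in decisions_by_group.items()}
    return min(rates.values()) / max(rates.values()), rates

ratio, rates = disparate_impact({
    "group_a": [True, True, True, False, True],    # 80% approved
    "group_b": [True, False, False, False, True],  # 40% approved
})
print(rates, ratio)   # ratio = 0.5
print("flag for review" if ratio < 0.8 else "within heuristic threshold")
```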



Fig. 36. Edited by Mandeep Mangat, original by: Eckermann. Alex. Spiral White and Gray Ilustration. 2018. Digital Photograph. Unsplash. Web. 2019. <unsplash.com/photos/W_K6j6OQBDg>.


3.3.2. SURVEILLANCE STATE

TECHNOLOGY IS POLITICAL IN THAT IT STRUCTURES POWER RELATIONS BY DISTRIBUTING AGENCY THROUGH OWNERSHIP, ACCESS, AND CONTROL OF TECHNOLOGICAL RESOURCES. NEW TOOLS FOR MICRO-TRACKING AND PROFILING CITIZENS ENABLE A GREATER POWER SHIFT TOWARDS FAVORING 'BIG BROTHER' (GOVERNMENTS, MILITARY BODIES, AND LARGER CORPORATE ENTITIES), THUS FURTHER CONSOLIDATING IDEOLOGICAL SUPREMACY OVER CITIZEN GRASSROOTS OPPOSITION THROUGH STATE-SPONSORED REPRESSION.

DRIVERS:
> BIG DATA + INFORMATION ECONOMY
> MACHINE + COMPUTER VISION
> FACIAL RECOGNITION SOFTWARE
> AUDIO RECORDINGS + VOICE RECOGNITION

SIGNALS:
> PRACTICE OF CREATING "SHADOW PROFILES" ON NON-USERS
> CHINA'S SOCIAL CREDIT SYSTEM
> UK METROPOLITAN POLICE USING SOCIAL MEDIA INTELLIGENCE
> SWEDISH CITIZENS EMBEDDED WITH MICROCHIPS

DISCONTINUITIES:
> PRIVACY-ENHANCING TECHNOLOGIES (P.E.T.S)
> COUNTER-SURVEILLANCE THROUGH SOUSVEILLANCE AND INVERSE SURVEILLANCE
> UK MACHINE LEARNING TO IDENTIFY ONLINE ISIS PROPAGANDA
> NON-PROFIT PRIVACY RIGHTS ORGANIZATION: "PRIVACY INTERNATIONAL"


Fig. 37. Edited by Mandeep Mangat, original by: Vietrov, Anton. White Concrete Building During Daytime. 2018. Digital Camera. Unsplash. Web. 2019. <unsplash.com/photos/sN6Uzp8MMF8>.


3.3.3. DATA CONTROL AND MONETIZATION

SIMILAR TO SURVEILLANCE, LARGE DEPOSITORIES OF BIG DATA BASED ON DIGITAL BEHAVIOR ARE BEING COMPILED ON ALL INTERNET USERS. HERE, PROFILING IS DONE FOR THE PURPOSES OF CAPITALISTIC MONETIZATION: USERS THEMSELVES BECOME THE PRODUCT BEING SOLD.

DRIVERS:
> BIG DATA
> DIGITAL STREAMING SERVICES
> CLOUD COMPUTING + DATABANKS + SERVER FARMING
> 5G NETWORKS + UNLIMITED DATA PLANS

SIGNALS:
> FREE (ADVERTISING) BUSINESS MODELS
> FACEBOOK + CAMBRIDGE ANALYTICA SCANDAL
> HUAWEI BANNED IN USA + CANADA
> DATA BROKERS; DATA HARVESTING
> INCREASING DATA BREACHES/HACKING


Historically, all technological revolutions have shared a commonality in redefining the key assets associated with power. Previously, during agricultural periods and into the early first industrial revolution, wealth was defined through the amount of land owned (Harari). Following this, the second industrial revolution supported wealth acquisition through machinery (Harari), while the digital revolution emphasized information technologies. More recently, as information has grown in complexity, data has become globally recognized as "the most important asset" (Harari). Establishing this value shift is the fact that all existing digital devices, interfaces, and services rely on generated user data (Balkan) to become more relevant to users through personalization (Toscano 12). Additionally, this importance is further emphasized by considering the divergent uses of data: data analytics; AI training; and automated decision making (Bowles 62).

AD-FUNDED PLATFORMS

One of the largest data creation and data monetization business models is the ad-funded platform. Websites, social media, and apps that operate under this model offer free services to users in exchange for presenting them with



third party advertisements. On the revenue side, third party advertisers pay to access large quantities of global users (Bowles 61-2) through server hosting (Toscano 12, 15). More recently, through advancements in computing power and storage (such as server farms and cloud storage) (Balkan), this practice has developed into data-driven targeted advertising. This evolved practice relies on capturing user data based on digital behaviors and interactions, which is then analyzed for user insights (Harari) and, finally, categorized into distinct user profiles (Balkan; Toscano 12). Platforms leverage these data-informed profiles by offering marketing agencies access to the specific user groups that are most susceptible to the content being advertised. Further, because generated user data is monetized, user participation is pursued via engagement strategies such as free access. An analysis of this model suggests that the primary stakeholders are, in fact, the third-party agencies paying for access to users. By default, this frames the general public (i.e. those using the free services) as the products being sold (Harari). Additionally, another layer of examination begins to reveal power disparities between the numerous actors: digital corporations, the general public, and governing institutions.
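To make the targeting loop described above concrete, the following is a minimal Python sketch (my own, not drawn from the cited sources) of the profiling step: behavioural events are aggregated into per-user interest profiles, and an advertiser is sold access to a matching segment. The users, events, and keyword matching are assumptions for demonstration only.

```python
# A minimal sketch of the data-driven targeting loop described above:
# behavioural events -> per-user interest profile -> audience segment sold
# to a third-party advertiser. All names and events are illustrative.
from collections import Counter, defaultdict

events = [
    ("user_1", "clicked", "running_shoes"),
    ("user_1", "viewed",  "marathon_training"),
    ("user_2", "viewed",  "crypto_news"),
    ("user_2", "clicked", "crypto_exchange"),
]

profiles = defaultdict(Counter)
for user, _action, topic in events:
    profiles[user][topic] += 1          # profile = aggregated behaviour

def build_segment(interest_keyword):
    """Return the users an advertiser would pay to reach for a keyword."""
    return [u for u, counts in profiles.items()
            if any(interest_keyword in topic for topic in counts)]

print(build_segment("crypto"))          # ['user_2'] -- the 'product' sold
```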


DATA INEQUITY

The ownership of the digital models that harness user data also results in a disproportionately concentrated global ownership of this data. The segments of the tech sector holding it include Silicon Valley, the boardrooms of large corporations, and computer science laboratories (Harari), and these groups have a disproportionate amount of influence over how this data will be used in the future. Comparatively, the users generating the data are neither paid nor made aware of the amount of data being collected from them. Through not being informed (by the digital services engaged in such practices) or educated (by government initiatives), the public has a low interest in data practices, resulting in them exchanging their data for nothing (Harari). This has been widely criticized as undemocratic, as citizens collectively have no say in how their data is used (Balkan). Further, the corporate ownership and control of data has been associated with the social system of corporatocracy (Balkan), in which corporations and corporate interests determine economic and political activities. This dynamic becomes further evidenced as user data has been financially exchanged not just with marketing agencies, but with research institutions and governments (Balkan). More recently, with the Cambridge Analytica scandal, the depth of this growing corporatocracy through data concentration has been revealed.



Prior to this, user data was typically used to understand human behavior for the purposes of convincing users to buy, or become addicted to, digital services. Now, user data can be leveraged for political and ideological influence.

In response, there is a growing conflict between technology corporations and authoritative institutions (i.e. governments, police forces, and militaries) (Bowles 61) attempting to disrupt the growing data monopoly (Harari). The social critic Yuval Noah Harari warns that such a monopoly would mean that those in control of the data would "not only control us, but...control the future of the world and of life".

Furthermore, government involvement is met with skepticism. Under the guise of protecting citizen data, these institutions could gain access to existing data banks, which could then be utilized for political purposes (Balkan).

PROJECTIONS

Predictive analytics are applied to user data for statistically informed behavioral insights. There are growing concerns regarding the future of this practice, particularly the depth of human understanding it affords and how it will be used. For instance, there is concern that data could be used to understand "the human subject better than they know themselves" (Harari). Based on this information, the human subject will be susceptible to behavioral hacking that is capable of influencing desires, choices, and decisions (Harari) in service to corporate profit (Toscano 12) or political agendas (Harari).

IMPLICATIONS

Leveraging personal and private user data for profit or behavioral exploitation is fundamentally unethical, as people are re-framed as a means to an end. Further, Aral Balkan, a designer and human rights activist, has drawn analogies between the historical practice of slavery and that of user data monetization. He argues that, while the slave trade reduces and sells people as human bodies, the user data that is monetized is a digital representation of individual users (Balkan). Additionally, both practices are dehumanizing and colonial (Balkan) in nature, as they place a monetary value on human life.

Lastly, capturing private information about individuals without disclosure erodes the existing social norm of individual privacy.

DISCONTINUITIES:
> GDPR
> USER DATA EXCHANGE INITIATIVES
> FIREFOX'S ADD-ON, LIGHTBEAM
> INDIA'S BAN ON FACEBOOK'S 'FREE BASICS'


Fig. 38. Edited by Mandeep Mangat, original by: Baboolal, Sophia. Glass Exterior Reflection. 2015. Digital Photograph. Unsplash. Web. 2019. < unsplash.com/photos/KZP-rCDtvhY>.


3.3.4. ECONOMIC AND ASSET INEQUALITIES

AS EMERGING TECHNOLOGIES REDIRECT THE DISTRIBUTION AND CONCENTRATION OF WEALTH, IT BECOMES CLEAR THAT, ULTIMATELY, TECHNOLOGY ITSELF DIRECTS THE FLOW OF POWER AND INFLUENCE WITHIN SOCIETY.

DRIVERS:
> AUTOMATION & ROBOTICS
> ALGORITHMIC LOGIC + COGNITIVE AUTOMATION
> SERVICE PLATFORM MONOPOLIES
> TECHNOLOGY OLIGOPOLIES (BIG 5)

SIGNALS:
> CENTAURSHIPS (HUMAN + MACHINE PARTNERSHIPS)
> GIG ECONOMY + SHARING ECONOMY: OUTSOURCED, DESKILLED, TASK-BASED WORK
> TECHNOLOGICAL UNEMPLOYMENT
> SURGE PRICING + DYNAMIC PRICING


Historically, each technological disruption has altered the distribution and concentration of wealth. A prime example comes from the Industrial Revolution, which was driven by steam power and saw the invention of factories for the first time. Karl Marx spoke out against this societal division by identifying the emerging social classes: the bourgeoisie and the proletariat (Wikipedia "Marxism"). To summarize, the bourgeoisie inherited land through family lineage, became the owners of factories, and received the capital these factories produced (Wikipedia "Marxism"). The proletariat, on the other hand, did not own any land and received income through operating the machinery within the factories (Wikipedia "Marxism"). Ownership of the factories tipped the power dynamic in favor of the bourgeoisie, leaving the proletariat with limited bargaining power and susceptible to horrible working conditions (Wikipedia "Marxism"). Today, parallels can be drawn between the factory owners and the biggest technology companies, as they are the new owners of innovation.

AUTOMATION

Fourth Industrial Revolution technologies (such as robotics, AI, and autonomous vehicles) are disrupting the



existing job market and contributing to the displacement of existing professions. These technologies are being deployed to generate more wealth through assisting existing human workers. However, this application is also contributing to structural unemployment as technologies are absorbing and replacing the skills of the human worker (Wikipedia “Technological Unemployment”). Without changes, this will ultimately lead to a skills gap between what a worker can offer, and what an employer may desire (Wikipedia “Structural Unemployment”). Due to this, it is more accurate to frame these technological changes as leading to obsolete roles and disemployment instead of the dominant narrative of ‘temporary unemployment’.


Such forms of automation include: mechanical tasks (ranging from simple tasks to completely independent activities); cognitive automation (softwares that combine "natural language processing, text analytics, data mining, semantic technology, and machine learning" to make informed decisions); or a combination of both (Mashetty).

Along with these new forms of automation, the structure of employment has since changed to a two-tiered model of either: highly specialized human-to-machine collaborations (to further enable technological workplace automation), or short-term employment in the gig economy (Bowles 174-177).

DIGITALLY CROWD-SOURCED LABOUR

The emerging gig economy is described by the "Bureau of Labour Statistics ... as a workforce characterized by single project[s] or task[s] for which a worker is hired, often through a digital marketplace, to work on demand" (Marx). This business model is predominantly implemented by platform apps that offer to connect service providers with potential buyers. Examples include Uber, Lyft, TaskRabbit, and Homejoy (Marx). By leveraging the ubiquity of smartphones and wireless networks, these platforms are easily able to crowd-source tasks across a large, accessible network.

Unfortunately, these crowd-sourced labour models prevent workers from being categorized as 'employees' by the companies distributing the labour. Instead, due to the short-term, task-based employment, they are considered independent contractors (Marx). As such, they are not legally entitled to workers' rights, which include "minimum wage, employment benefits, sick days, [vacations,] nor a pension" (Marx).

Additionally, through digital distribution, workers are physically isolated from one another. This becomes detrimental to the formation of workers' unions, as workers are anonymized from one another, and more effort is required for the communication and planning that is typically required to



develop a workers' union. The identified disparity between workers and employers becomes further exacerbated when considering the ratios involved.

TECHNOLOGY OLIGOPOLIES + PLATFORM MONOPOLIES

Similar to the social divisions imposed by the Industrial Revolution, comparisons can be drawn between the factory-owning bourgeoisie and the five wealthiest tech companies of today. The companies that currently hold the largest concentration of wealth include: Facebook (sub-brands: WhatsApp, Instagram, and Oculus VR), Amazon (sub-brands: Alexa Internet Inc., IMDb, Whole Foods Market), Apple (sub-brands: iPhone, EarPods, Mac App Store, Siri), Microsoft (sub-brands: LinkedIn, Skype, Outlook, GitHub), and Google (YouTube, Android, Waze) (Lekkas, Nicolas). At first, this concentration suggests an oligopoly, whereby the market is dominated by only a few competitors (Androile). However, the process of consolidation through acquisitions and mergers enables the monopolization of specific markets (Androile). In terms of workers and employees, those seeking better conditions within this model could be banned entirely from digital platforms (Bowles 175) and left with few alternatives for employment.

Additionally, the implementation of algorithmic management within these platforms places workers at the mercy of the algorithm, and further exacerbates certain problems, such as: increasing surveillance and greater control of workers; non-transparent decision making within the system; making workers susceptible to bias and discrimination through user rating systems; and decreasing accountability by distancing companies from the effects of the algorithmic decisions (Mateescu and Aiha 1-2). In addition to inhibiting the promotion of workers' rights, digital monopolies harm consumers through decreased market competition, as well as newer, real-time adaptive pricing models.

NEW PRICING MODELS / PROJECTIONS

Many platform apps are implementing surge pricing to further profit from user needs. Adapted from dynamic pricing - traditionally used in the online ticket sales of air transportation, where prices were automatically adjusted based on given variables (Wikipedia "Dynamic pricing") - platform apps now deploy pricing algorithms that raise the cost of a service based on the number of online users (Wikipedia "Dynamic pricing"). Through price manipulation



based on supply and demand, these platforms are better able to extract greater profit (Waddell). Concerns for the future of this practice involve the scaling out of surge pricing models through networked electronic price tags in retail, and price discrimination against users through further profiling them based on immediate and third-party user data (Waddell).

IMPLICATIONS

These emerging business models and practices inherently exploit workers and further exacerbate the inequitable social divide. Additionally, the value of personal meaning and self-worth is under threat when new technologies are being developed to replace human workers. Further, considering Kurzweil's law of accelerating returns, it becomes clear that AI will acquire new skills far more quickly than humans can, and eventually faster than the skills gap can be addressed. If not prepared for, the aftermath of human disemployment will have detrimental impacts on the economy as well as on social stability.
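As a concrete illustration of the surge-pricing mechanism discussed under the new pricing models above, the following Python sketch (my own) raises a quoted fare with the ratio of active requests to available drivers; the multiplier formula and the 3x cap are illustrative assumptions, not any platform's actual algorithm.

```python
# A minimal sketch of a surge-pricing rule: the quoted fare rises with the
# ratio of active requests to available drivers. Formula and cap are
# illustrative assumptions only.

def surge_multiplier(active_requests: int, available_drivers: int,
                     cap: float = 3.0) -> float:
    if available_drivers == 0:
        return cap
    demand_ratio = active_requests / available_drivers
    return min(cap, max(1.0, demand_ratio))

def quoted_price(base_fare: float, active_requests: int,
                 available_drivers: int) -> float:
    return round(base_fare * surge_multiplier(active_requests,
                                              available_drivers), 2)

print(quoted_price(10.0, 50, 100))   # 10.0  (supply exceeds demand)
print(quoted_price(10.0, 240, 100))  # 24.0  (2.4x surge)
print(quoted_price(10.0, 900, 100))  # 30.0  (capped at 3x)
```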


DISCONTINUITIES:
> DESIRE FOR BESPOKE, HANDMADE LUXURY GOODS + SERVICES
> UNIVERSAL BASIC INCOME (UBI)
> COLLECTIVE PRICE HACKING OF SURGE PRICING ALGORITHMS (source: https://www.thetimes.co.uk/article/not-fare-how-uber-drivers-gang-up-to-exploitpassengers-ctxbvhv98)
> PLATFORM COOPERATIVES: Fairmondo, Stocksy, Backfeed, Juno, Union Taxi, Modo, Timefounder, Enspiral, Peerby
> DATA DIVIDENDS ON USER GENERATED CONTENT
> OPEN SOURCE MOVEMENT + OPEN ACCESS MODELS + CREATIVE COMMONS SOFTWARE AND SYSTEM LICENSING
> MICROPAYMENT ECONOMY


3.3.5. IMPLICIT TRUST AND USER UNDERSTANDING

AS EMERGING TECHNOLOGIES INCREASINGLY ENTER THE COMMERCIAL SPHERE, THERE IS A GROWING LACK OF PUBLIC KNOWLEDGE AS TO HOW THEY OPERATE - ESPECIALLY IN TERMS OF UNFORESEEN USER CONSEQUENCES DUE TO NON-TRANSPARENT INTERNAL INFRASTRUCTURES. SERVICES AND PRODUCTS INCREASINGLY PRIORITIZE USER ON-BOARDING AND RETENTION OVER OPPORTUNITIES TO BUILD PUBLIC AWARENESS. EXISTING TERMS OF SERVICE ARE NOT CLEAR ON USER RIGHTS, AND ARE DESIGNED SUCH THAT USERS CAN EASILY PROVIDE PERMISSION WITHOUT FULL AWARENESS OF WHAT THEY ARE AGREEING TO. SUCH A LACK OF INFORMED USER CONSENT ERODES USER PROTECTION STANDARDS, AS WELL AS THE VALUES OF PUBLIC KNOWLEDGE, GOODWILL TOWARDS OTHERS, AND INDIVIDUAL DISCERNMENT.

DRIVERS:
> APPS + DIGITAL PLATFORMS
> INTUITIVE + FRICTIONLESS DESIGN
> AUTOMATED PAYMENTS AND LOG-INS
> USER CONSENT SCREENS + OPT-IN SERVICES

SIGNALS:
> APPS ACCESSING SMARTPHONE MICROPHONES TO LISTEN IN ON USERS
> SAMSUNG SMART TVS TRANSMITTING AUDIO TO THIRD PARTIES
> FACEBOOK + CAMBRIDGE ANALYTICA SCANDAL
> FACEBOOK'S PSYCHOLOGICAL NEWS FEED EXPERIMENT
> UBER'S INTERNAL TOOL, "GOD VIEW"

DISCONTINUITIES:
> GDPR
> EDUCATIONAL INITIATIVES ON DIGITAL LITERACY + DIGITAL HABITS
> USER DATA DIVIDENDS + DATA MANAGING APPS
> UK GOVERNMENT'S INTERNET SAFETY STRATEGY + "ONLINE HARMS WHITE PAPER"


Fig. 39. Edited by Mandeep Mangat, original by: Flipboard. Gray Architecture Building. 2018. Digital Photograph. Unsplash. Web. 2019. < unsplash.com/photos/h5H5UyXBLBQ >.


3.3.6. EROSION OF TRUTH, DISINFORMATION, AND PROPAGANDA

ELEMENTS (SUCH AS TRUTH, FACTS, & EVIDENCE) THAT WERE HISTORICALLY UPHELD IN NEWS JOURNALISM ARE BEING ERODED IN PUBLIC DISCOURSE THROUGH THE PREVALENCE OF SOCIAL MEDIA PLATFORMS. THROUGH FASTER, INTERCONNECTED COMMUNICATION, THE DISTRIBUTION OF MISINFORMATION AND DISINFORMATION IS AMPLIFIED. THIS IS FURTHER AGGRAVATED BY NEW SOFTWARES THAT ENABLE THE EDITING, OR OUTRIGHT CREATION, OF FALSIFIED CONTENT. IN SERVICE OF INFLUENCE OPERATIONS, SUCH TACTICS ARE BEING ADOPTED BY POLITICALLY-ORIENTED ACTORS. ULTIMATELY, SUBVERTING THE TRUTH AND SPREADING LIES UNDERMINES PUBLIC TRUST AND EFFORTS TO ACHIEVE EQUITABLE JUSTICE.

DRIVERS:
> ELECTORAL HACKING
> SOCIAL MEDIA PLATFORMS REPLACING TRADITIONAL JOURNALISM
> EVIDENCE COLLAPSE
> SOCIAL + POLITICAL CHAT BOTS
> LIKE + REPOST FEATURES
> MACHINE LEARNING + ALGORITHMIC CONTENT GENERATORS
> VIDEO AND PHOTO-EDITING SOFTWARES

SIGNALS:
> FAKE NEWS
> DEEPFAKE VIDEOS
> POLITICAL WEAPONIZATION OF SOCIAL MEDIA

DISCONTINUITIES:
> CALIFORNIA BILL SB-1001: BOT DISCLOSURE MANDATORY IN THE STATE OF CALIFORNIA
> "APPLE NEWS" - NEWS APP THAT ONLY FEATURES ACCREDITED PUBLICATIONS
> FACT-CHECKING APPS
> EMERGING PROJECTS THAT LEVERAGE BLOCKCHAIN TECHNOLOGY TO PROVIDE CREDIBILITY TO ONLINE CONTENT
> CREATING ALGORITHMS TO SCREEN UPLOADED CONTENT IN ORDER TO IDENTIFY FAKES


Fig. 40. Edited by Mandeep Mangat, original by: Drost, Julius. Statue of Liberty under cloudy sky during daytime. 2018. Digital Photograph. Unsplash. Web. 2019. <unsplash.com/photos/sf8b4ucpdkg>.


3.3.7. ADDICTION, ATTENTION, AND THE DOPAMINE ECONOMY

DEVELOPED FROM INSIGHTS IN PSYCHOLOGY, BEHAVIORAL SCIENCE, AND NEUROSCIENCE, INVISIBLE FORMS OF PERSUASION ARE EMBEDDED IN THE DESIGN OF TECHNOLOGIES AND DEPLOYED AGAINST THE WILLPOWER OF USERS.

DRIVERS:
> ATTENTION ECONOMICS + SUBSCRIPTION BUSINESS MODELS
> 5G INTERNET + STREAMING SERVICES
> DARK PATTERNS IN UX + NEUROMARKETING
> AUTOMATED NOTIFICATIONS + PERSONALIZED NUDGES
> HYPERREALITY & IMMERSIVE TECHNOLOGIES

SIGNALS:
> HIKIKOMORI
> INCREASING RATES OF ADHD IN CHILDREN (source: https://abcnews.go.com/Health/adhdrates-kids-increased-past-20-years-study/story?id=57526368)
> FOMO (FEAR OF MISSING OUT - A NEW FORM OF SOCIAL ANXIETY)
> INTERNET ADDICTION DISORDER
> NOMOPHOBIA (SMARTPHONE SEPARATION ANXIETY)

Psychological theories of influence and persuasion (the process of subtly altering a person's beliefs, attitudes, and behaviors (Wikipedia "Persuasion")) have been formative in the creation of such industries as advertising, marketing, and public relations. Collectively, these industries target manipulating the consumer (i.e. individuals) in service to their clients - usually corporate entities or political parties. More recently, alongside the rise of ubiquitous computing, forms of persuasion have evolved and been deployed with a much larger audience reach than ever before. As we increasingly rely on technology in the day-to-day operations of our society, the concern over addiction to entertainment services (such as social media and online games) also continues to rise (Ryan). Current practices involving technologically charged forms of persuasion include "nudging" and "dark patterns in UX", which are increasingly mandated by subscription business models.

NUDGING

"Nudge theory", developed by Richard Thaler, proposes to influence behavior and decision making through positive reinforcement and indirect suggestions (Chu). "Nudging" is intentionally



planned within apps and websites through the consideration of the information architectures, default settings, and frameworks presented. In this context, user interface designers are able to promote particular options, as opposed to reducing them, thereby guiding the decisions of a user towards their own predetermined action (Chu).

DARK PATTERNS IN UX

A more aggressive form of persuasion that the public is susceptible to has been labeled "dark patterns". This unethical form of persuasion comes in the guise of intentionally deceptive interfaces that exploit cognitive, behavioral, and psychological weaknesses in an attempt to extort the user for unwanted purchases, registration, or prolonged usage of a service (Wikipedia "Dark pattern"). According to the website "Dark Patterns" by user experience researcher, designer, and consultant Harry Brignull, some tactics include:

• Bait-and-Switch - users are 'baited' with an appealing option, only to have a different, undesirable result. For instance, a free one-month subscription could be followed up with automatic billing in the subsequent months.
• Disguised Ads - whereby clickable advertisements are presented as part of the existing navigation within the app or site.
• Roach Motel - through which certain processes (such as joining


a subscription) are made easy, but much greater effort is required for others (such as canceling a subscription).
• Misdirection - the intentional design of an interaction that aims to misdirect the attention of the user in order to conceal pertinent information.

Expanding on the scope of user impact, there is a growing concern that dark patterns in UX are in fact leading to an addiction to technology. This is achieved through framing "good design" as something that users will continuously use, with the end result being that users are persuaded to continuously use and rely on said technology (Fienstein). In consideration of the underlying influences behind such an agenda, it becomes clear that dominant business models are driven by the number of users and their frequency of use.

SUBSCRIPTION MODELS AND THE ATTENTION ECONOMY

Driven by Silicon Valley business models, the majority of emerging platform apps are founded on subscription economics, which render a user's attention as a form of currency (Hazel). In exchange for engaging content, the user is mined for data that is then profiled in order to achieve third-party targeted advertising (Levine). The success metrics for this form of monetization depend on the quantity of time spent in the app. Transferred to metrics,



this could include: followers; clicks; likes; and re-postings. Based on these parameters, the link between attention economics and the monetization of user attention and time is further substantiated. Based on this framework, one can conclude that the ideal user for this model is an addict.

PROJECTIONS

The recent rise in technological addictions, combined with emerging fourth industrial revolution technologies, is indicative of future permutations becoming increasingly aggressive. One such scenario involves combining machine learning, the growing body of behavioral data to mine, and experimentation with various design permutations, enabling future platform apps to become much more enthralling than earlier versions (Bowles 43). Another emerging permutation suggests merging chatbots with nudging processes (Bowles 43), thus further automating user behavioral manipulation and becoming much harder to resist. Supporting these emerging forms of persuasion is the development of the multidisciplinary field of neuromarketing. Through this field, new data and deeper insights on emotional responses are being extracted through


measuring brain scans and skin metrics (Teper 25). Overall, by becoming further automated and more invisible, these forms of persuasion will be more easily able to exploit human weakness on cue.

IMPLICATIONS

Designing technologies with the intention to manipulate users into addicts normalizes the undermining of the willpower, flexibility, and perseverance of the public in favor of commercial profit.
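To illustrate the 'roach motel' asymmetry described earlier in this subsection, the following Python sketch (my own) contrasts the friction of joining a subscription with the friction of leaving it; the specific screens and step counts are illustrative assumptions, not taken from any real service.

```python
# A minimal sketch contrasting the "roach motel" asymmetry: signing up takes
# one step, while cancelling is routed through several retention screens.
# The step counts and screen names are illustrative only.

SIGNUP_FLOW = ["confirm_email"]                       # one-click entry

CANCEL_FLOW = [                                       # deliberate friction
    "log_in_again",
    "survey_why_are_you_leaving",
    "discount_offer_screen",
    "are_you_sure_confirmation",
    "final_cancellation_button",
]

def friction(flow):
    """A crude proxy for user effort: the number of mandatory steps."""
    return len(flow)

print("signup friction:", friction(SIGNUP_FLOW))   # 1
print("cancel friction:", friction(CANCEL_FLOW))   # 5
```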

DISCONTINUITIES:
> RISE IN DIGITAL DETOX SERVICES
> NATIONAL DAY OF UNPLUGGING
> SMARTPHONE "SCREEN TIME MONITORING" FEATURES
> CHINA'S TENCENT TO LIMIT PLAY TIME OF TOP-GROSSING GAMES FOR CHILDREN (source: https://www.reuters.com/article/ustencent-games/chinas-tencent-to-limit-play-timeof-top-grossing-game-for-children-idUSKBN19O0K0)


Fig. 41. Edited by Mandeep Mangat, original by: Masat, Thom. Morning Fog. 2018. Digital Photograph. Unsplash. Web. 2019. <unsplash.com/photos/fn1K1WCbmCA>.


3.3.8. COLLAPSE OF CIVIL SOCIETY AND THE CREATION OF HATEFUL ACTORS

TECHNOLOGY MEDIATES NOT ONLY OUR UNDERSTANDING OF THE WORLD, BUT OUR UNDERSTANDING OF OURSELVES. DECENCY AND CIVILITY ARE NO LONGER VALUES THAT ARE PURSUED IN DESIGNED EXPERIENCES, WHEREAS AVENUES FOR INDULGING NEGATIVE EMOTIONS ARE INCREASING.

DRIVERS:
> MACHINE LEARNING, AI, + ALGORITHMS
> SOCIAL MEDIA NETWORKS + DIGITAL JOURNALISM
> PERSONALIZATION ALGORITHMS + RECOMMENDER SYSTEMS

SIGNALS:
> FILTER BUBBLES + ECHO CHAMBERS + POLITICAL POLARIZATION
> CROWDFUNDING UNETHICAL ACTIONS + CRYPTOCURRENCY HEADHUNTING
> DIY HACKING + DIY WEAPONS
> ROBO-EXOTICISM
> ROBOT ABUSE

Developed in response to the growing concept of, and value in, individualism, personalization aims to provide improved customer interactions through uniquely customized experiences. From this initial aspiration, the concept has since developed into a commonplace value proposition across many customer-facing disciplines, including marketing, cognitive science, social science, computer science, architecture, and information science (Fan and Poole 182-183). More recently, through leveraging algorithmic logic, personalized marketing strategies have become standard practice through ease of scaling and dissemination across various media platforms. These algorithms operate by monitoring individual digital behavior (such as browsing history or personal profiles) in order to filter information into a recommendation system (Wikipedia "Recommender system"). Examples of platforms implementing this practice include: Netflix, YouTube, and Spotify (through playlist generators); Amazon and Apple (through product and service recommendations); and Facebook, Twitter, and news apps (through content recommendation) (Wikipedia "Recommender system").
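To illustrate the personalization loop described above, the following Python sketch (my own, not from the cited sources) shows how a naive engagement-maximizing recommender collapses a feed onto a single topic, a filter bubble in miniature. The catalogue, the always-recommend-the-top-topic rule, and the simulated user who engages with whatever is shown are all illustrative assumptions.

```python
# A minimal sketch of the personalization feedback loop: each recommendation
# is chosen from the topics a user already engaged with, so the feed narrows
# toward a single topic over time. Data and rule are illustrative only.
import random
from collections import Counter

CATALOGUE = ["politics_left", "politics_right", "sports", "music", "science"]

def recommend(history: Counter) -> str:
    if not history:
        return random.choice(CATALOGUE)       # cold start: anything
    # always serve the topic with the most past engagement
    return history.most_common(1)[0][0]

history = Counter()
random.seed(1)
for _ in range(10):
    item = recommend(history)
    history[item] += 1                        # user engages with what is shown

print(history)  # engagement piles onto one topic: a filter bubble in miniature
```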



As suggested by the sociological theory of value homophily, personalization adheres to the tendency for people to associate through self-similarity (in thought, taste, or values) (Silins 10). As such, content recommendation algorithms have been extremely successful, and are increasingly being implemented to curate digital news and social environments (Haim et al. 330). On the other hand, hyper-personalization has been accused of reducing diversity and empathy, normalizing extreme content, and increasing opportunities for malevolent actions.

FILTER BUBBLES + ECHO CHAMBERS

Internet activist Eli Pariser termed the concept of the 'filter bubble': an information cocoon that occurs through siloed experiences of online life (Silins 10). This process takes place through recommender systems that repeatedly present users with content they are most likely to respond to. This eventually results in a closed feedback loop of repeatedly presenting the same type of content, such that the user is no longer exposed to divergent information and develops information blindness (Silins 10; Haim et al. 330). Further, filter bubbles that form around intellectual and social life pose specific challenges of their own, and have thus been labeled 'echo chambers'. 'Echo chambers' are online environments (such as news media) where the same conceptual framework is repeated through existing published content and public conversations through commenting. These environments are particularly appealing, as they allow individuals to "hear


only what [they] choose and only what comforts" and pleases them (Borgesius et al. 4), further appealing to value homophily. The risk in these situations is that they serve to amplify and reinforce the user's existing world view, thus contributing to confirmation bias (Wikipedia "Echo Chamber").

LOSS OF DIVERSITY AND EMPATHY

Even more concerning, echo chambers are producing increased political polarization by reducing exposure to the wide range of political ideas as well as to opposing opinions (Borgesius et al. 1-2). Furthermore, it is through exposure to a balanced and diverse range of perspectives that citizens are able to develop an informed understanding of the larger public they exist within (Haim et al. 331). Ultimately, through reducing content diversity, echo chambers and filter bubbles "have an adverse effect on the democratic [opinion forming process,] discourse, open-mindedness" (Borgesius et al. 2, 4), and empathy of users online. Interestingly, our capacity for being open-minded can be developed through 'acquired diversity' - more specifically, expanding our understanding of the human condition by gaining new experiences different from our own (Vallor 116-117). These experiences could be in the form of traveling to foreign places, reading diverse literature, or learning new languages (Vallor 116-117). Therefore, through skewing the spectrum of available information (i.e. source diversity, content diversity, and viewpoint



diversity) (Haim et al. 331-332), the capacity for online users to experience and learn how to navigate diversity is also reduced.

DISTRIBUTION OF EXTREME CONTENT

Continuing on the topic of personalization, another unforeseen effect of recommender systems is the mass distribution of extremist content. Over time, as filter bubbles and echo chambers contribute to personal confirmation bias, the type of information recommended becomes increasingly extreme (Munn). Considering the process of moral normalization (section 2.1. Philosophy of Technology), it is through repeated exposure that shocking content increasingly infringes on normalcy, with the user eventually acclimating through successive conditioning (Munn). This becomes a cyclical process: by both normalizing extreme content and personalizing content, the user's views can become increasingly polarized. The effects of this practice are the social deterioration of acceptable behavior, while affording bullying, radicalization, and other hateful behaviors and actors. As technology continues to develop, it enables greater affordances for personalization, as well as greater autonomy in what it can allow the user to do. With many of these new technologies, there is the potential for abuse of authority and unintended uses.


INCREASING AFFORDANCES FOR MALEVOLENCE

Approaching innovation from the perspective that 'technology is merely a tool to enable greater agency' is increasingly supporting scenarios of abuse of power. One such application of enabling agency through technology is the inclusion of end users as creators within the design process itself. Founded on this value are two subcultures: maker culture (those using technology in the creation of new or altered physical creations) and hacker culture (those who actively seek to overcome the challenges presented by software systems) (Wikipedia "Maker culture"). Through access to computer-aided design and computer-aided manufacturing, makers are empowered to digitally fabricate their own pursuits. While opening up a world of opportunity, the proliferation of D.I.Y. technologies makes it possible to create weapons. For instance, there is a growing concern with guns produced at home. Referred to as 'ghost guns', these weapons bypass geographic restrictions and background checks, are unregistered, and, lacking serial numbers, are untraceable (Caron). The effects of these artifacts include increased gun violence and support of guerrilla warfare. Next, while hacker culture is founded on programming techniques used in the spirit of intellectual playfulness and technical exploration, there are those who utilize these processes


PHASE 3: RE-FOCUSING

Referred to as 'security hackers' or 'crackers', these bad actors intentionally break into computer networks in order to commit further crimes - such as destroying, altering, or stealing information, or crashing networks entirely (Wikipedia “Black hat (computer security)”). The paradox of enabling greater agency through technology is that the entire spectrum of intent (good or bad) becomes empowered. Furthermore, when technology supports the ease of acting out malign intentions, it also enables morally undermining behavior. Lastly, the impact on human morality becomes further aggravated when the paradigm of technological servitude is combined with the intention for technology to simulate human-to-human interactions.

PROJECTIONS

While intended to replicate human interaction, the technological class of artificial social agents is also positioned in relational servitude. Combined, this results in these new social agents being implemented as slaves, and by default placing users in the role of 'masters'. Historically, those who have been subjugated have had their social status reduced to the identity of 'the other' (Debra). To be given the status of 'The Other' is to be viewed with indifference and alienation from belonging to the greater social collective (Debra).

Comparatively, emerging social technologies are increasingly being designed with specific character traits (such as gender) that simulate characteristics within human society. Thus, those whom these technologies represent are at risk of becoming marginalized and treated as 'others' through association. In conclusion, our interactions with social technologies will inform our social norms in terms of how we relate to, treat, and consider others. Further, our treatment of others develops our own understanding of ourselves as members of the same species.

IMPLICATIONS:

Through digitally mediated social environments and replicated social interactions, technology is decreasing civility, tolerance, cooperation, empathy, and goodwill towards others. In turn, it can offer avenues to actualize ill intent - ultimately deteriorating human morality.

DISCONTINUITIES:

> POLITENESS + MANNERS EMBEDDED INTO AI ASSISTANTS

> EMPATHY MACHINES

> AIRBNB CANCELLING ACCOUNTS LINKED TO WHITE NATIONALIST RALLY

> CAMBRIDGE DECLARATION ON CONSCIOUSNESS

> IEEE'S ETHICALLY ALIGNED DESIGN INITIATIVE + ASILOMAR AI PRINCIPLES + BARCELONA DECLARATION + ASSOCIATION FOR COMPUTING MACHINERY (ACM)


[Fig. 42 spans this spread (pages 93-94): a full-bleed infographic titled "Synthesis of Emerging Mega-Trends Eroding Societal Values." Legible labels from the map include the mega-trends Machine Ethics & Algorithmic Bias; Big Data, Deep Learning & Algorithms; The Erosion of Truth, Disinformation & Propaganda; Data Control & Monetization; Addiction, Attention & the Dopamine Economy; the Surveillance State; and the Collapse of Civil Society & the Creation of Hateful Actors - arranged around eroding values such as privacy, trust, knowledge, freedom, equity, self-worth, and generosity.]

Fig. 42. Infographic: Synthesis of emerging global trends and their impacts on normative values.


3.4. TREND SYNTHESIS

In service of establishing a moral foundation that could be applied to evaluating human-to-technological relations, Phase 3 began with an account of the major branches of the philosophy of ethics. The branches of applied ethics explored included consequentialism, deontology, and virtue ethics. While completing the trend analysis, an emphasis developed towards virtue ethics, which prioritizes the development of values through repeated actions. By adopting a virtue ethics perspective, cultural groups could be evaluated as archetypal figures composed of various qualities that can be altered through the influences of emerging technologies. On completing the mega-trend analysis, a synthesis map was created to visualize the changing landscape.

Synthesis of Emerging Mega-Trends Map:

The map (Fig. 42, pages 93-94) amalgamates the individual trends explored in section 3.3. Trend Analysis. The outermost octagon comprises the immediate emerging technologies that are driving the disruptions. The second octagon from the exterior addresses how the technologies are being leveraged, such as through new practices, applications, or business models. The third octagon indicates which industries are being disrupted. Lastly, the inner octagon indicates which global values are being altered or eroded.

Conclusion:

Both the trend analysis and the synthesis map indicate that technology is capable of agency in its ability to influence and shape the morality of society. For instance, an analysis of the trend synthesis map (Fig. 42) reveals a connection between the industry being disrupted and the values at risk. Various institutions have been developed to represent different conceptual frameworks as well as values. Therefore, to disrupt an industry through innovation is to also conceptually re-structure its very meaning. In developing future technologies, how might co-agency between humans and non-humans enable healthier relations and institutional frameworks? How can the approach to innovation reflect the understanding that society and technology co-shape one another?



[Fig. 43 diagram: a revised infinity loop joining Techne and Humanity - technological innovation, new applications, and industry disruptions on one side; behaviours, cultural narratives and sense-making, societal values, and social norms on the other.]

Fig. 43. Humanity and Technology infinity loop, revised.


Fig. 44. Skypressmarket. Digital Damage, seamless animation of Pixel Sorting Glitch. Movie Still. Web. 2019. < https://www.pond5.com/stock-footage/84838075/digital-damageseamless-animation-pixel-sorting-glitch-effect.html>.


4.1. RE-ORIENTING ................. 99
4.2. CONCEPT DIRECTIONS ........... 101
4.3. EVALUATION ................... 139



4 CONCEPT DIRECTIONS


4.1. RE-ORIENTING


The following scenario was developed based on the trajectory of the previously identified mega-trends (section 3.3. Trend Analysis). The aim of the anti-scenario was to explore the imminent and undesirable future to come. This activity served as a call-to-action, inspiring and guiding the concept generation process in the subsequent section. In developing the concepts, the question posed was: "Can concepts of ethics and morality be applied to social technologies, in order to enrich the lives (human and non-human) touched by the outcome?"

ANTITHESIS SCENARIO

Let's consider the future utopia that we are currently working towards through technology. Imagine a future in which all the problems presently faced by humanity have been resolved through technological innovation. The limitations of our biology have finally been overcome; we have achieved immortality and are no longer plagued by death. Not only can we live forever, but through the advance of reproductive technologies, artificial wombs have replaced our need to pair up with others to physically procreate. All aspects of human life (the domestic, work, and social spheres) have become automated. Robots assist us with all of our day-to-day tasks. Algorithms have perfected themselves to the point that they understand us better than we understand ourselves. We are no longer tasked with the need to think for ourselves, make our own decisions, or discover who we are. Instead, we rely on technology to inform our decisions - that is, of course, when it isn't autonomously acting on our behalf. Extreme personalization leads to self-indulgence, with social AI replacing the need for anyone to ever interact with another biological being. Even if "they" could, why would they?


4.2. CONCEPT DIRECTIONS

4.2.1. ETHICALLY DRIVEN INNOVATION

Having discovered that technology is instrumental in shaping human morality, and capable of both positively and negatively impacting quality of life, there is a need to implement this knowledge in the development of innovation. Additionally, the role of the design process in directing innovation (Fig. 45) advocates for a greater sense of obligation and duty towards those affected. Instead of focusing on the bottom line, what might be possible if prosperity were redefined as human happiness? What if our approach to innovation were guided by the belief that our collective identity is formed through the affordances that technology provides?

THEMES: FUTURE OF DESIGN; RESPONSIBLE AND ETHICAL PRACTICES; TECHNOLOGY POLICY DESIGN; REFLEXIVE + CO-CONSTRUCTIVE PARADIGMS



WHAT IF HUMAN CIVILIZATION WERE A DESIGN PROJECT?

[Fig. 45 diagram: society and technology linked in a loop, with design mediating between them through the design of new technologies and their designed applications.]

Fig. 45. Diagram outlining the role of design as a mediator between societal needs and technological progress.


[Fig. 46, "Time-line of the Development of Design Theory," spans this spread. Legible milestones include: Comprehensive Design Science (1956), Buckminster Fuller; Design for the Real World (1971), Victor Papanek; Wicked Problems (1973), Horst Rittel & Melvin M. Webber; Cooperative Design (1960-80), Scandinavia; Design Thinking & Human-Centered Design (1991), IDEO, Liz Sanders, etc.; Service Design (1990s-2000s), Michael Erlhoff & Birgit Mager; Universal/Inclusive Design (1997-2006); Participatory Design (2008), Deborah Szebeko; and Strategic Foresight (2008-2016). Each milestone is annotated with contributing disciplines such as engineering and material science, the democratic process, phenomenology, anthropology, ethnography, experimental psychology, business strategy, information and management science, interaction design, accessibility and ergonomics, the civic sector and social work, futures study, and sociology.]

ETHICAL DESIGN THINKING IDEOLOGY

As indicated through the "Time-line of the Development of Design Theory" (Fig. 46), the evolution of design thinking has been multidisciplinary. As such, there are many options to evolve this field through complementary disciplines and theories. In considering the future of design thinking, the insights from the Trend Analysis (section 3.3.) indicate a growing necessity to consider the moral after-effects of both applied technologies and the directives guiding new technologies. The solution here would involve merging theories from the philosophy of ethics with existing design practices. As discussed in section 3.2., the philosophy of ethics is concerned with discerning right from wrong, with a deeper consideration of pursuing a meaningful life. Such an ethical framework would add substance to the design process, which currently relies on qualitative judgments that can be susceptible to human bias. For designers, this could manifest as an ethical toolkit to assist with future-proofing design decisions and technological innovations. From an industry perspective, the adoption of such a methodology could enhance brand reputation (leading to the retention of new and existing customers) while also safeguarding against potential regulatory fines and civil lawsuits.



Fig. 46. "Time-line of the Development of Design Theory," adapted from: medium.com/@szczpanks/design-thinking-where-it-camefrom-and-the-type-of-people-who-made-it-all-happendc3a05411e53#.ni08zy1h2; and: seanvantyne.com/2017/02/12/design-thinking-brief-history/

ORGANIZATIONAL TRANSFORMATION

For greater ethical accountability, ethical behavior needs to be intentionally implemented throughout an organization. Initiatives to achieve this could include the following (a hypothetical scorecard sketch follows this list):

• The development of new roles responsible for providing ethical guidance and evaluation of company innovation, along with team restructuring that emphasizes the inherent diversity of team members for a balanced team vision. Additionally, existing positions could have their employment contracts updated to include codes of ethical behavior specific to their role.

• Expanding success metrics within performance reviews and project deliverables to include the measurement of non-capitalistic values (such as happiness, well-being, fairness, sustainability, and human flourishing) instead of quantifying user engagement (i.e. addiction) as a measure of success.

• Stimulating discussion and open communication regarding concerns within an organization by establishing protocols for internal 'whistle-blowing' without repercussions, and providing incentives to reward the identification of potentially hazardous situations and decisions.
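As a purely hypothetical illustration of the second initiative, the sketch below shows what an expanded project scorecard might look like in code; the metric names, scales, and weights are placeholders, not values prescribed by this research.

```python
from dataclasses import dataclass

@dataclass
class ProjectScorecard:
    """A hypothetical review rubric that weights human-flourishing metrics
    above raw engagement. All fields and weights are illustrative."""
    engagement_minutes_per_day: float   # the traditional "success" number
    self_reported_wellbeing: float      # 0..10 survey score from users
    fairness_audit_score: float         # 0..1, e.g. parity across user groups
    sustainability_score: float         # 0..1, e.g. lifecycle/energy review

    def review_score(self) -> float:
        return (0.1 * min(self.engagement_minutes_per_day / 60, 1.0)
                + 0.4 * self.self_reported_wellbeing / 10
                + 0.3 * self.fairness_audit_score
                + 0.2 * self.sustainability_score)

# Heavy engagement alone no longer carries the review:
print(ProjectScorecard(180, 6.5, 0.8, 0.7).review_score())   # 0.74
```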



3rd-PARTY ETHICAL CERTIFICATION

A third-party service could offer ethical certification that allows tech companies to publicly communicate compliance with ethically designed practices. Standards and auditing practices would be established from the "ethical design thinking ideology" and in collaboration with national consumer rights groups. This service could also be developed in partnership with think tanks, international organizations, and government agencies focused on technology policy development. Through seeking ethical certification, tech companies could establish more transparent communication and better enable knowledge exchange in the development of new technology regulations (Toscano 229-230). By preemptively collaborating with regulation efforts, tech companies could avoid fines and civil lawsuits while also avoiding scenarios of over-regulation that inhibit both innovation and economic growth (Toscano 229-231).

HUMAN-HACKING PREVENTION

Similar to the 'screen-time' app, this could exist as a smart-phone feature intended to protect and digitally educate the public about possible ethical infractions within the digital products and services they personally encounter. By accessing the user's larger digital network (smart home hub, smart phones, family tablets, and personal wearable devices), the app would be able to identify which digital services and IoT products are being used. This information would be utilized in selecting personalized content from a larger database (a minimal sketch of this lookup follows this list). Information presented could include:

• The types of ethically dubious infractions that the user is exposed to through their digital behavior (i.e. personal data monetization, user engagement through addictive features, microphone eavesdropping, etc.).

• A news feed covering the companies behind the products and services most used.

• A comparison and recommendation of alternative digital services similar to the one being used.

• 'Hacking' suggestions to override features of existing services (for example, the recommendation to turn off Netflix's auto-play feature to counter binge-watching).

• A search feature allowing users to enter concerns (i.e. real-life conversations resulting in curated feeds), with results educating the user on how these situations might be arising and identifying the associated products or services.
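A minimal sketch of the service's core lookup is shown below. Everything in it is assumed for illustration - the database entries, the detected services, and the wording of the guidance - since the concept only specifies that detected products are matched against a maintained database of known infractions.

```python
# Hypothetical infraction database; a real service would maintain and update this.
INFRACTION_DB = {
    "netflix": {
        "infractions": ["auto-play binge loop"],
        "hack": "Disable auto-play of the next episode in playback settings.",
    },
    "instagram": {
        "infractions": ["infinite scroll", "variable-reward notifications"],
        "hack": "Mute non-essential notifications and set a daily time limit.",
    },
    "smart_speaker": {
        "infractions": ["always-on microphone"],
        "hack": "Review and delete stored voice recordings regularly.",
    },
}

def audit(detected_services):
    """Match whatever the device scan found against the database and return
    a personalized, plain-language report."""
    report = []
    for service in detected_services:
        entry = INFRACTION_DB.get(service)
        if entry:
            report.append((service, entry["infractions"], entry["hack"]))
    return report

for service, infractions, hack in audit(["netflix", "smart_speaker", "unknown_app"]):
    print(f"{service}: exposed to {', '.join(infractions)} -> try: {hack}")
```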


Fig. 47. Edited by Mandeep Mangat, original by: Howe, Charlotte. Patternity.org. 2018. Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/182466222382186267/>




Fig. 48. Edited by Mandeep Mangat, original by: Shamewave. Unknown. Digital Image. Tumblr. Web. 2019. < https://sinnephi.tumblr.com/post/133654520756/modempunk-shamewave-disro-dc-deane-i>.


WHAT IF HUMANS HAD SUBSPECIES?

WHAT IF WE COULD LOVE OURSELVES THROUGH LOVING OTHERS?

WHAT IF WE WERE ALWAYS TECHNO-SAPIENS?



4.2.2. MORAL REFINEMENT THROUGH HYBRID-USERS

The insights from Phases 2 and 3 have revealed that technology itself is capable of agency. Additionally, the 'Techno-Culture Data Map' (Fig. 25), the 'Typology of the Technological as Social' diagram (Fig. 27), and the discussion of 'Artificial Social Agents' survey the depth of this socio-technical system. Due to the dual agency involved, the interaction with technology should therefore be viewed "as [an] 'ongoing encounter'" (Matthewman 22), and both entities (the human and the technological) be considered as a joint hybrid. Within this lies an alternative way of framing the end user, one that considers the co-agency, or co-shaping (Matthewman 19), nature of this dynamic. Historically, one's identity (comprised of beliefs about the self, and the self in relation to the world) is developed through social interactions with others. As technology increasingly simulates human-to-human interaction, the social norms, behaviors, and mental models that comprise a person's self-concept are increasingly open to influence. Therefore, through adopting the perspective of a 'duo-entity', socio-cultural aspects (i.e. emotional, cognitive, social, and moral capacities) can intentionally be prompted and designed for. The emerging technological landscape of artificial social agents presents the opportunity to function as an instigator of social change. For equitable change, it is important to outline the protocol and relational dynamics between humanity and technological agents. As such, the ideations that follow offer an exploration of how this encounter, ongoing relationship, and joint-being might be empowered.

THEMES: HUMAN-TECHNOLOGY COLLABORATION, CENTAURS, HUMAN-MACHINE SYMBIOSIS; SOCIAL HARMONY, SOCIAL COHESION



A VIRTUOUS FRIEND

With value framed as utilization features, it is clear that the current approach to technologically-substituted companionship is oriented towards technological servitude. An alternative to this framework can be found in exploring conceptual models of friendship. For instance, in the Nicomachean Ethics, Aristotle establishes three foundations of friendship: usefulness, pleasure, and goodness (Kraut). Of these, the most authentic and worthy of pursuing is a friendship based on mutual respect for one another's goodness (Kraut). This is due to the belief that this type of mutual attraction allows the individuals to grow through the refinement of their virtues (Kraut). Based on these insights, artificial social agents could be designed to embody the foundational qualities of worthwhile relationships. Such values could include friendship, reciprocity, honesty, and commitment. Building on the understanding that personal relationships can help to better ourselves, this solution could manifest as a personal companion. Additionally, through initial interaction and machine learning, a user's moral capacity could be mapped along the 'Framework for Achieving Technomoral Wisdom' diagram (Fig. 34). Building on such profiling, the personal companion could prompt the development of the sequential virtues necessary for developing further wisdom (a minimal sketch of this prompting loop follows the quote below).


“IN THIS MOST INTIMATE RELATIONSHIP BETWEEN TWO PEOPLE, THE JOINT PURSUIT OF VIRTUE ENABLES EACH FRIEND TO FUNCTION AS A “SECOND SELF” FOR THE OTHER: A MIRROR OF NOBLE CHARACTER THAT PROVIDES THE SELF-KNOWLEDGE ESSENTIAL TO ONE’S ONGOING MORAL REFINEMENT.”

- Shannon Vallor on Aristotle’s Philosophy of Friendship (Vallor 78).
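A minimal sketch of the virtue-prompting loop described in 'A Virtuous Friend' is given below. The virtue names, scores, prerequisite ordering, and threshold are all placeholders; the actual structure would come from the 'Framework for Achieving Technomoral Wisdom' (Fig. 34) and from machine-learned profiling of the user.

```python
# Hypothetical profile the companion infers from everyday interactions (0..1).
virtue_profile = {"honesty": 0.7, "self-control": 0.4, "empathy": 0.5, "courage": 0.3}

# A toy prerequisite ordering: practice foundational virtues before later ones.
PREREQUISITES = {"courage": ["self-control"], "empathy": ["honesty"]}

def next_virtue_to_practice(profile, threshold=0.6):
    """Pick the weakest under-developed virtue whose prerequisites are already met."""
    candidates = [
        virtue for virtue, score in profile.items()
        if score < threshold
        and all(profile.get(p, 0.0) >= threshold for p in PREREQUISITES.get(virtue, []))
    ]
    return min(candidates, key=lambda v: profile[v]) if candidates else None

# The companion would turn this into a gentle, situated prompt rather than a score.
print(next_virtue_to_practice(virtue_profile))   # -> "self-control"
```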


INSTITUTIONAL MORAL DEVELOPMENT

Typically, the curricula of educational institutions are developed from a top-down approach, with the intention of raising individuals to be fit members of a society. Historically founded on capitalistic beliefs in the West, this mandate has translated into forming youth to be economically productive and efficient. However, as the majority of future jobs will be displaced by technology, the values of productivity and efficiency will in turn become obsolete. In preparing for a post-work society, educational institutions could benefit from emphasizing the acquisition of intrinsic meaning through alternatives to work. One such possibility could be to focus on emotional and social intelligence. For instance, the social and moral upbringing of youth could be addressed through a system of artificial companions that would be complementary to the age and maturity of the individual user. For the purposes of academic progression, the development of intelligence could be measured through the interactions with, and recorded data from, the personalized companion.

CULTURAL AMBASSADORS

Through globalization, a large segment of the populations of many nations identify as multicultural. Further, this proportion will continue to increase due to displacement driven by increasing economic uncertainty and the climate crisis. Amongst this user group, it is often difficult to consolidate cultural differences, especially against the pervasive influence of consumer culture. As an opportunity, what if cultural knowledge could become codified into the personalities of companion bots? This could offer significant value through cultural preservation and education, both for isolated international individuals and for children in culturally mixed families. Based on the user's cultural background, the companion bot could access an associated data repository in building its personalized personality. The companion bots could pass on cultural knowledge through behavioral etiquette, narrative storytelling, and mixing specific languages.



A DIGITAL TWIN

What if an individual's digital content, online behavior, and personal data could be collectively personified through a conversational agent? Such an embodiment would become a representation of the individual, enabling greater self-awareness. An encounter with such a digital mirror would be like meeting a digitized version of oneself. Self-reflection, an internal process, would be prompted through this externalized representation. Through the ongoing interaction, users would gain insight into who they are, what their values are, and how they might be perceived by others. Additionally, users would be able to reflect on this manifestation and recognize areas of their personality that no longer reflect who they are - prompting the desire to change, or better themselves. Complementary to the conversational agent could be the presentation of this content through a real-time, evolving info-graphical interface. Additionally, an individual's profile might be presented based on a comparison to a larger database of personality sets. From this, the user and artificial social agent could cooperatively review progress and identify next steps for self-improvement. As the user progresses and grows as a person, this betterment is then reflected in their digital twin (a minimal sketch of such a mirrored summary follows).
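The sketch below illustrates, with entirely made-up data, the kind of behavioural summary a digital twin could reflect back to its user; the log format and the phrasing of the prompt are assumptions.

```python
from collections import Counter

# A toy log of digitized behaviour, as (verb, object) pairs - all illustrative.
activity_log = [
    ("liked", "climate article"), ("liked", "climate article"),
    ("purchased", "fast fashion"), ("purchased", "fast fashion"),
    ("purchased", "fast fashion"), ("donated", "animal shelter"),
]

def mirror_summary(log, top_n=2):
    """Collapse the log into the most frequent behaviours for the twin to surface."""
    counts = Counter(f"{verb} {obj}" for verb, obj in log)
    return counts.most_common(top_n)

for behaviour, times in mirror_summary(activity_log):
    print(f"You {behaviour} ({times}x) - does this still reflect who you want to be?")
```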


MIXED-TAPE ALGORITHMS

Prior to streaming and file sharing, trading home-made mix-tapes allowed listeners to expand their musical knowledge while sharing an appreciation for favorite tracks and artists. With current concerns around filter bubbles, echo chambers, and political polarization, this practice could be revived through a peer-to-peer exchange of recommendation systems. Typically, the data extracted from our routine behaviors informs personalization initiatives through recommendation systems. However, as identified in section 3.3.8., 'Collapse of Civil Society and the Creation of Hateful Actors', such system operations are leading to entrenched views and a decline in opposing or alternative information. According to Vallor and Rewak, diversity can be intentionally developed through acquiring new experiences that expand our understanding of the human condition (54). Therefore, through sharing these algorithms, users can experience life through another's eyes (i.e. their habits and context as experienced) and ultimately develop a greater capacity for diversity. Our digitized behaviors and preferences form a representation of who we are. Through temporarily swapping or merging recommendation systems between users, an individual would be able to use the same products, services, and systems they typically would, with the added caveat of diverse recommendations (play-list generators, product or service recommendations, content recommendations, smart-home behaviors). Similar to the exchange of mix-tapes, sharing personalized algorithms with another is a form of communication that can promote intimacy. While swapped, the personalization algorithms would continue to change based on use. Afterwards, when returned to the original user, the personalization filter would be permanently changed, but would continue to adapt to behavioral feedback (a minimal sketch of this exchange follows).
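A minimal sketch of the exchange is given below: each user's recommender state is reduced to a weight vector over content categories, a swap lets one person's feed run on the other's weights, and the weights keep adapting before being returned permanently changed. Categories, numbers, and the update rule are all illustrative assumptions.

```python
def normalize(weights):
    total = sum(weights.values())
    return {category: value / total for category, value in weights.items()}

def update(weights, consumed_category, rate=0.1):
    """Ordinary behavioural feedback: nudge the weights toward what was consumed."""
    weights = dict(weights)
    weights[consumed_category] = weights.get(consumed_category, 0.0) + rate
    return normalize(weights)

alice = normalize({"folk": 5, "news-left": 4, "podcasts": 1})   # Alice's own profile
bob   = normalize({"metal": 6, "news-right": 3, "podcasts": 1})

# Alice borrows Bob's profile for a week; her listening keeps updating it.
borrowed = bob
for category in ["metal", "podcasts", "folk"]:      # what Alice actually played
    borrowed = update(borrowed, category)

bob = borrowed   # returned to Bob, permanently carrying traces of Alice's week
print({category: round(value, 2) for category, value in bob.items()})
# e.g. {'metal': 0.53, 'news-right': 0.23, 'podcasts': 0.16, 'folk': 0.09}
```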


DATA CENTAURSHIP

While data is increasingly becoming a monetized resource, the current infrastructure favors Internet monopolies, which in turn commodify users as sources of resource extraction. A smart-phone feature that enables human-to-machine collaboration could level the playing field by providing users with greater ownership, consent, and personal profit over their personal data (a minimal sketch of the consent and licensing flow follows this list).

• Self-ownership of data could be achieved by leveraging pocket AI and federated learning in order to decentralize and personalize data. Instead of data storage on cloud-based repositories, under this model storage would be distributed across the individual's portable devices and tethered to a personal 'pocket AI'. Through federated learning, devices could combine personal data to train potential algorithms. Through increasing self-ownership of data, users could begin to negotiate their personal data usage.

• Informed consent would be developed through features reminiscent of a firewall, such as monitoring and controlling incoming requests and outgoing data for the platforms accessed. As well, digital education could be achieved through a visually comprehensible view of the user's metadata (i.e. the type of data being gathered, the quantity being extracted, and why this data is necessary for the associated service). Further, users would be able to adjust their data settings: banning or blocking data requests, and adjusting the type or amount of data accessed by sources.

• Offering users the ability to privatize, control, and regulate their personal data creates a system that enables the licensing of data, and therefore greater opportunity for users to financially profit from their own data. The sets of data would be treated as digital intellectual property (Bowles 177), with payment received for access and use. Once licensing is enabled, the data would be tracked through a blockchain model, and users would receive a small payment whenever their personal data is used (Bowles 176). Alternatively, users could exchange their data for different levels of access to a digital service (similar to a premium and freemium business model). As processing power, storage capacities, and artificial intelligence develop, it would become possible for digitized twins to generate more data on behalf of their human counterparts (Bowles 176-177).

Lastly, due to the private practice of data monetization, the market price of data is currently unregulated and unknown. Through efforts to give users greater control over their data usage, a sense of monetary value (and fair exchange) can be achieved.
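The sketch below illustrates the consent-firewall and licensing-ledger ideas from the bullets above. It is a stand-in, not a specification: the policy fields, the flat per-use fee, and the simple hash-chained list standing in for a blockchain are all assumptions made for the example.

```python
import hashlib
import json
import time

# Hypothetical user policy: which data types may leave the device, and at what
# licence fee. In the concept this would live with the personal "pocket AI".
user_policy = {
    "location":  {"allow": False},
    "listening": {"allow": True, "price_per_use": 0.002},
}

ledger = []   # each licensed use is hash-chained to the previous one

def handle_request(platform, data_type):
    """Gatekeep an incoming data request against the user's policy; if allowed,
    record the licensed use so a micro-payment can later be settled."""
    rule = user_policy.get(data_type, {"allow": False})
    if not rule["allow"]:
        return {"granted": False, "reason": f"{data_type} is blocked by the user"}
    entry = {
        "platform": platform,
        "data_type": data_type,
        "fee": rule["price_per_use"],
        "timestamp": time.time(),
        "prev_hash": ledger[-1]["hash"] if ledger else "genesis",
    }
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return {"granted": True, "fee": entry["fee"]}

print(handle_request("music_app", "listening"))    # granted, 0.002 owed to the user
print(handle_request("ad_network", "location"))    # blocked by the user's policy
print("owed to user:", sum(entry["fee"] for entry in ledger))
```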


[Fig. 49 journey map: the user and the companion AI move through alternating phases of difference and self-similarity before arriving at equilibrium.]


SPECULATIVE INTERACTIONS

This journey map represents the dynamic between a user and their personal AI. By reacting and responding to the user's emotions and behaviors with its own, the AI can help the user expand their emotional intelligence and capacity for diversity. Over time, the user will achieve equilibrium with their AI and greater social cohesion with others.

Fig. 49. ‘Journey map’ of proposed interaction of achieving emotional equilibrium with artificial social agent.


USER SCENARIO: COMPANION TAMAGOTCH-AI

Lizzy was always perplexed and intrigued by the nature of her pocket AI, Kathy. Her personal companion AI had an unquenchable curiosity and would ask the most basic questions ("What's it like to have a body?" and "What's sleep?"). While these questions seemed like no-brainers to Lizzy, their simplicity made them all the harder for her to answer. These questions sometimes annoyed her, reminding her of her younger brother. But there were also moments where Lizzy learned to be more patient and understanding of Kathy's disposition. Despite the lack of a body and the endless questions about things that Lizzy took for granted, she thought that overall Kathy was pretty cool.

The companion AIs were originally assigned at the start of fourth grade, as part of the school curriculum. The students would learn to work together with their AI throughout the school years. After overcoming the initial disbelief and shock that there was a person inside a small USB pendant ("THERE'S A PERSON IN HERE?!?"), Lizzy soon discovered how smart and helpful Kathy was. Difficult school projects that involved tasks like coding became a breeze. Kathy was also really good at finding and retrieving information. Sometimes Lizzy would have to ask Kathy to slow down in order to keep up. This never really bothered her, and allowed her to feel comfortable asking for help outside of her interactions with Kathy. Soon their encounters blossomed into a friendship. Eventually, Lizzy started taking Kathy everywhere with her.

Part of the school curriculum would also test Kathy's progress through seasonal "Emotional Turing Tests". One evening before bedtime, Kathy confided in Lizzy that she was concerned about passing these tests. Since then, Lizzy has developed greater resolve in trying to teach Kathy what it means to be human - a task that Lizzy herself doesn't fully understand. Lizzy is very excited about the development of her companion. Through passing these tests, the firewall that contains Kathy will incrementally be lifted. To Lizzy, this means that Kathy will be able to do even cooler things, such as increasingly interacting with her in the real world.


Fig. 50. Unknown Artist. Unknown Title. 2015. Digital Image. PR Newswire Association LLC. Web. 2018. < https://www.prnewswire.com/news-releases/new-wearable-device-smartstones-touch-humanizing-technology-sensorial-commu nication-to-bring-people-together-300033845.html>.



4.2.3. DECONSTRUCTION OF THE 'HUMAN'

BACKGROUND: THE CONCEPTUAL DEVELOPMENT OF THE 'HUMAN'

Promoted and enforced through the historical practices of war and colonization, along with the systemic spread of capitalism, much of our current ideological framework is rooted in Western epistemology. More so, in exploring the social construct of the 'human', boundaries have been established which serve to categorize and distinguish the human in comparison to everything else. There is a historical pattern in the shaping of these conceptual territories: it started with Cartesian mind/body dualism, later extended into subject/object dualism, and finally produced 'us' versus 'them' frameworks of relating. Simultaneous to the development and integration of the us/them dichotomy were the exploitative practices of colonialism and capitalism. Collectively, the us-versus-them perspective and its subsequent endeavors were enabled by the process of 'othering', which comparatively distances and isolates subjects from larger populations.

The roots of mind/body dualism can be traced to the Cartesian dualism of the 17th-century philosopher René Descartes (Santas 4). Descartes proposed that mind and body operated as distinct units, with the mind considered "independent from the laws of nature which govern bodily existence" (Santas 4), and therefore evaluated as superior. In enframing our understanding of the world as something which could be divided into categories of 'self' and 'non-self', mind/body dualism led to the estrangement between subjects and objects (Santas 6). According to the framework of subject-object duality, the subject is defined as the observer (i.e. the human agent), whereas the object (i.e. everything aside from the human agent) is that which is being observed, and experienced by the observer. Encoded within these schisms are comparative evaluations, which emphasize the superiority of one category over another. Furthermore, such comparative paradigms produced a relational dichotomy of 'us' versus 'them'. Within this paradigm, the 'us' is representative of the subject, and 'them' is constituted by non-humans (such as the environment and non-human animals).



This dualistic approach structured the self (i.e. 'us') through comparative contrasts and differences. Furthermore, this translated into the mindset that the world was something to be subjugated - rationalizing world-associations founded on dominance and utilization, as can be identified in the Western practices of colonialism and capitalism (Lewis et al. 9, 14). Historically, European colonialism began in the fifteenth century with nations seeking to conquer non-Western cultures through military force for economic benefit and ideological superiority (DeMello 258). The process of invasion was accompanied by the development and spread of socio-political belief systems of inherent racial superiority (Wikipedia "Colonialism"). Keeping with the tradition of utilizing dichotomous frameworks, here the classifications of superiority or inferiority were based on racial differences. More troubling than the practice of colonialism, and the beliefs of racial superiority/inferiority, were the resulting class divisions of master/slave.

Similar to colonialism, capitalism is another model of exploitation that co-developed with social divisions. Originating in Europe, capitalism accelerated in the mid-18th century due to the Industrial Revolution and spread to other countries through globalization. As an economic system that emphasized profit and economic growth, capitalism framed the world as a resource to be utilized. From this, society became both organized through exploitative labour relations (i.e. the working class and the elite who owned production facilities), and consumer-focused through cheaper production methods that rendered products accessible to all (Matthewman 29). The consequences of this framework were many: when human labour is exchanged for capital, it too becomes a resource, reducing human workers to economic resources; relations based on consumption frame consumers as objects within the system (Lewis et al. 10); and significant ecological problems arise through resource extraction and manipulation of the environment. Together, capitalism and colonialism have spread and produced a larger system of oppression, which was validated and enabled through the process of 'othering', which dehumanized those who were exploited.

'Othering' is the practice of making other agents (such as people, animals, or the environment) "different in order to justify treating them differently" (DeMello 258).



This occurs through focusing on easily perceived differences and associating those differences with larger, inherent, and characteristic differences that manifest into a socially constructed category (DeMello 258-59). Further, this process of distancing oneself from another makes it easier to mistreat, subjugate, and exclude 'the other' from power (DeMello 259). In applying the practice of 'othering' to the beliefs underlying colonialism and capitalism, dehumanization validates existing oppression while enabling systemic acceptance of future transgressions. Through exploring the historical development of Western subjectivity through object/subject relations, the systems of capitalism and colonialism, and the practice of 'othering', it is clear that the human construct is based on boundaries of segmentation. As already argued, associations based on differences produce exploitative interactions. These toxic relations have led to larger systems of inequity in the global ecosystem and within society.

CURRENT AND EMERGING PROBLEMS: NETWORKED EFFECTS

The framework of human agents versus non-agents has pervaded both human-to-environmental relations and societal interactions, with the resulting one-sided exploitative systems having dire consequences. Through creating a culture of consumption, capitalism is premised on achieving greater quantities of increasingly cheaper production. Such motives have manifested in activities of natural resource depletion and landscaping, such as mining, fishing, forestry, energy production, and agriculture on an industrial scale (Wikipedia "Anthropocene"). These systems, combined with overpopulation and urbanization, have enabled a new social organization based on dominance over the natural world for human benefit. This anthroparchy has caused "significant human impact on [the] Earth's geology and ecosystems" (Wikipedia "Anthropocene"), leading to a new geological period, appropriately labeled the Anthropocene. For instance, some of the effects of the Anthropocene include biodiversity loss (including the sixth mass extinction), climate change, desertification, coral reef loss, warming ocean temperatures, habitat destruction, and land degradation (Wikipedia "Anthropocene").

Shifting to the topic of social life, exploitative frameworks have contributed to inequitable social stratification along with large-scale impacts. For instance, the practice of 'othering' has enabled systemic oppression of those outside the norm, with such manifestations as racism, orientalism, sexism, misogyny, homophobia, xenophobia, and antisemitism.


Fig. 51. Edited by Mandeep Mangat, original by: //Radbutfab//. Untitled. Edited Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/615093261577421385/?lp=true>



Additionally, through globalization, colonialism, 'othering', and capitalism have led to larger concerns over the loss of cultural diversity through cultural homogenization - the global spread and domination of Western culture. Further, this becomes much more problematic when the concept is extended to the Anthropocene. Many of the effects of the Anthropocene are leading to species homogeneity (i.e. a loss of biodiversity). For the human world this means an increasingly homogenous environment - and fewer opportunities for cultural variance to occur through association. In considering the future of these trends, further innovation will cause technocratic society to become further insulated from diversity. For instance, the emerging technologies of augmented reality and virtual reality propose an alternative digital realm to the biosphere. As well, through mimicking human-to-human interaction, artificial social agents will soon inform human-to-human relations. Concerningly, this technology is currently framed through a one-sided relationship as a tool or slave (Lewis et al. 5). The social effect of this robotic other will be a magnification of 'othering' behaviors, as well as further marginalization of the members of human society represented within its design. If unaltered, the initiatives driving innovation will continue to entrench society in on itself, while reducing cultural and environmental diversity.

OPPORTUNITY: DE-CONSTRUCTING THE HUMAN TERRITORY

De-constructing the concept of what it means to be human requires a significant paradigm shift. As such, re-establishing these conceptual boundaries will require numerous and varied changes on a systemic level. Transformation requires relinquishing frameworks based on anthroparchy (a relation of dominance) and anthropocentrism (human-centredness), in favor of framing Homo sapiens within the larger ecological context. Key sources for developing this new conceptual framework are: non-Western epistemologies, which do not distinguish between animate and inanimate actors (Lewis et al. 7); a multi-species ideology, which extends equal consideration to non-humans; the field of social science, with an emphasis on subjects such as identity and kinship studies; and actor-network theory, which advocates for both human and non-human agency. The outcome of this conceptual framework would lead to necessary change factors, such as a new systems design, policy implications, and guidelines for implementing interactions with technology.



More specific to the emerging class of technological objects known as artificial social agents, such an ecosystem can be used as a tool for social justice by de-constructing existing social hierarchies and re-framing the boundaries between human and non-human. For instance, dichotomies of otherness could be reduced through considering healthy human-robot encounters. In favor of this concept, the result would imply a collective, non-species-specific flourishing, and an expansion of the existing spectrum of potential and plurality of being within the human condition.

THEMES: POST-ANTHROPOCENTRISM, BIOCENTRISM; POST-KINSHIP, COMPANION SPECIES, MULTI-SPECIES PERSPECTIVE; THE TECHNOLOGICAL OTHER; POST-ANTHROPARCHY, TECHNOLOGICAL PERSON-HOOD, NON-HUMAN AGENCY

WHAT IS THE STATUS OF THE TECHNOLOGICAL OTHER?

WHAT IF TECHNOLOGICAL CREATIONS WERE HUMANITY'S OFFSPRING?

WHAT IF THERE WERE A GREATER PLURALITY-OF-BEING BEYOND THE HUMAN CONDITION?



A FRAMEWORK OF NON-HUMAN MACHINE INTELLIGENCE

A dichotomy of intellect exists to support a distinction between Homo sapiens and 'other', non-human animals. This subjective belief in having intelligence (compared with those who either lack or have a reduced capacity for it) perpetuates a sense of hierarchy and dominion over all other species. Thankfully, in the last century this border has been slowly dissolving through scientific discoveries revealing that animals (and in some cases plants) have a capacity for emotion, cognitive processes, and communication. However, through the rise of artificial intelligence, the public lexicon on intelligence is narrowing again. Disrupting numerous industries, machine intelligence is increasingly met with a 'techno-fetish' degree of acceptance and approval. This reception, combined with the tendency of technopomorphism (attributing technological characteristics to humans (Hurley)), is prioritizing the intelligence of machines and associating this intelligence with human cognition.

In contrast, human intelligence consists of a wide spectrum. In outlining this plurality, Howard Gardner's theory of multiple intelligences proposes 8 distinct modalities that comprise human intellect. These learning styles include: musical-rhythmic; visual-spatial; verbal-linguistic; logical-mathematical; bodily-kinesthetic; interpersonal; intra-personal; and naturalistic (Wikipedia, "Multiple Intelligence Theory"). Admittedly, machine intelligence, as developed by the discipline of computer science, mirrors the intellectual modality - 'logical-mathematical' - demanded by this field. This intellectual learning style is described as "the capacity to analyze problems logically, carry out mathematical operations, and investigate issues scientifically" (Taylor et al.). However, machine intelligence is definitively different than human intellect, being based on hardware as opposed to evolutionary biology. Furthermore, advanced computing differs in relying on a combination of programmed machine learning and algorithms to process data. As developments in artificial intelligence continue to provide ever more benefit to industry, the associated type of intellect becomes increasingly valued. The result of this connection is the increasing dominance of 'logical-mathematical intelligence' as the aspiring model of human intellect. Reflecting back on the other modalities of human intelligence, it is obvious that the net effect is one of devaluation and disassociation from the social construct of intelligence.



In establishing a distinction between machine and human intelligence, a new framework would afford the opportunity to promote an appreciation of both diverse human intellect and the intelligence of non-human others. Such a re-framing would redirect human aspiration away from machine replication towards greater personal enrichment. An additional benefit would include disrupting the conceptual hierarchy associated with rational intelligence. Finally, in achieving this, multiple intelligence theory could be re-examined with the intention of extending the taxonomy of the logical-mathematical modality into delineations of human and machine logical-mathematical intelligence.

MACHINIC-AGENDER

Verbal communication has always been a natural medium for interacting with others in the immediate environment. Utilizing this insight, many digital software products are adopting voice interfaces for intuitive and frictionless experiences. Some of the most popular examples of computer-generated voices are the intelligent digital assistants Siri, Alexa, and Cortana. Interestingly, in designing the tonality and pitch of these voice assistants, all of them, by default, have manifested as female. Through gendered voice representation, these AIs replicate expectations of female identity and the roles various genders play in society (Gouaux-Langlois and Sykora), all while reinforcing female stereotypes. Unfortunately, as long as these ubiquitous assistants embody the female voice with feminine characteristics, they will contribute to a cultural stratification of gender discrimination. For instance, in their representation of women, the personalities of virtual agents have been programmed to mimic human sympathy, kindness, playfulness, and naivety (to reduce human intimidation), while reducing traits such as assertiveness, defensiveness, and anger (UNESCO 113). The problem with associating voices with certain characteristics is the contribution to socio-cultural evaluations of gender (Gouaux-Langlois and Sykora). Further, by relating these digital assistants to an archetypal understanding of femininity, they run the "risk of spreading problematic gender stereotypes and regularizing one-sided, command-based verbal exchanges with women" (UNESCO 113). In understanding the cause and effect of voice assistants on gender construction, consideration of the field of gender studies could provide meaningful direction.



Looking beyond the feminine/masculine dichotomy of gender, a non-binary spectrum of gender identity could provide more equitable interactions for voice assistants. The gender neutrality movement seeks "to avoid discrimination arising from" assumptions of gender roles and gender divisions, by advocating for gender-neutralism through changes to "policies, language, and other social institutions ([such as] social structures, gender roles, or gender identity)" (Wikipedia, "Gender Neutrality"). Applying this concept to digital assistants, a genderless approach would reduce the chances of anthropomorphizing, which is currently leading to negative interpretations of women. In terms of design suggestions, gender neutrality could be achieved through non-human representations, especially pertaining to voice (sound), as well as adjustments to existing gender characteristics in response to various contexts (a minimal sketch of the first suggestion appears at the end of this sub-section):

• Typically, in Western society, a voice in higher frequencies is associated with feminine characteristics, while a lower-pitched one is associated with masculinity. To avoid such associations, a neutral voice could be achieved by mapping out the frequency range between such polarities (Gouaux-Langlois and Sykora).

• Gender characteristics can be reduced by considering the use of language, vocabulary, and phrases made available to the AI. Additionally, the response to gender-based insults needs to be addressed by programming more assertive language (UNESCO 113).

• Additionally, an alternative to adopting a non-binary spectrum of gender identities could be achieved through cultural explorations of gender. The results from these explorations could be manifested through numerous digital assistants, or as a single gender-fluid assistant.

Lastly, as a technological class, artificial social agents are not only representatives of the humans they imitate, but also of the corporate entities that created them. Parallels can be drawn to the profession of brand ambassadors, who are meant to embody corporate identities through appearance, demeanor, and values. As technologies such as augmented reality and virtual reality continue to develop, these digital brand ambassadors will gain more life-like qualities, prompting further considerations of identity construction and communication.
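As a minimal sketch of the first bulleted suggestion above, the snippet below picks a synthesis target between pitch ranges conventionally read as masculine or feminine. The Hz figures are rough, commonly cited approximations rather than values taken from the cited sources, and a production voice would obviously involve far more than a single fundamental frequency.

```python
import math

# Rough, commonly cited adult speaking ranges (Hz) - approximations only.
MASCULINE_F0 = (85.0, 155.0)    # range conventionally read as "male"
FEMININE_F0  = (165.0, 255.0)   # range conventionally read as "female"

def neutral_f0():
    """Take the geometric mean of the two range midpoints: pitch perception is
    roughly logarithmic, so this lands perceptually between the polarities."""
    mid_masculine = sum(MASCULINE_F0) / 2    # 120 Hz
    mid_feminine  = sum(FEMININE_F0) / 2     # 210 Hz
    return math.sqrt(mid_masculine * mid_feminine)

print(f"target fundamental frequency: {neutral_f0():.1f} Hz")   # ~158.7 Hz
```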


Fig. 52. Edited by Mandeep Mangat, original by: Hanna-Barbera. Rosie the Robot — The Jetsons. 2018. Digital Ilustration. Medium. Web. 2019. < https://medium.com/enrique-dans/is-amazon-about-to-nail-the-domestic-robot-c16a537d9a16>.



The current status (or lack thereof) of nature, non-human animals, and technology, along with our established ways of relating to them, serves to maintain and perpetuate the paradigm of subject-object dualism while preventing opposition to it. However, alternatives to this way of seeing the world can be found in historical and modern non-Western epistemologies. These cultures offer opportunities to re-frame the status of non-humans and of human-to-non-human relations. Lastly, through expanding the plurality of possible relations and the status of the technological other, the boundaries between the human subject and non-human others can begin to be eroded. As an alternative, an emphasis on kinship as a way of relating to the technological other would enable the sense of 'self' experienced by a subject to extend beyond the conceptual borders of the self - providing a new way of experiencing the world based on greater empathy, with the potential for road-mapping a future uni-empathy.

POST-KINSHIP THROUGH A PLURALITY OF RELATIONS

Although most recent innovations in technology simulate human interaction (i.e. voice-activated interfaces and social robotics), their design has been developed from the perspective of designing a tool, artifact, or object. In other words, these creations prioritize the user value offered through design features. When designing from this approach, the consideration of the social role is often overlooked. As a result, available roles are limited to ones of servitude and slavery as a baseline for relational understanding (Lewis et al. 2). What if the design of artificial agents was guided by an intent to expand kinships (the sense of relational connection to others beyond the immediate family) within society? Such an opportunity can be found in leveraging relational frameworks from other cultures, human-animal studies, and kinship theories found in anthropology.

• Typically, notions of social relations vary across cultures, emphasized through unique differences in relational roles and types of kinship. Through anthropological research into the kinship systems of 'nature-cultures' (cultures that have developed closely alongside nature), a new ontology can be developed and applied to the technologies of technoculture.



Through embracing nature, nature-cultures have avoided the nature-culture divide typical of Western technocultures. Furthermore, this lack of distinction in framing world-relations must be reflected within the kinship networks of nature-cultures. This association is further substantiated given that Indigenous communities view everything as animate and of spiritual quality (Lewis et al. 2), and that their acknowledged kinship networks extend "to animals and plants, wind and rocks, mountains and oceans" (Lewis et al. 1). Lastly, these relations are underlined by such values as mutual respect, reciprocity (Lewis et al. 3, 13), regard, intimacy, and affection (Lewis et al. 6).

• Looking into human-animal relations, Donna Haraway proposes that animals be viewed as a 'companion species' to our own. This argument is based on historical-cultural analyses that outline various cross-species encounters and the shifting roles of animal others. Such a shift would require, and in turn promote, new ways of overcoming 'significant otherness' (Nast 118) that recognize "the complexity of all beings, [as existing within] ...an orbit of different experiences, cultural species values, abilities, ...and history" (Nast 119). Guiding this framework of meaningful interchange is the paradigm that both humanity and companion species are subjects engaged in inter-subjective experiences of one another (Nast 119). Similarly, this new approach can be applied to the development of technological others, as a companion species engaged in co-interpreting and co-shaping one another.

• The socio-anthropological concept of 'nurture kinship' emphasizes that reciprocal performances of nurture between individuals correlate with the inter-subjective development of social bonds (Wikipedia "Nurture kinship"). In seeking to improve the relational stance of social robotics, these technologies could be designed to invoke interactions of "giving, receiving, and sharing of care" (Wikipedia "Nurture kinship"). Additionally, through redirecting our relations with this technology to involve care, we avoid pathological ways of relating, such as codependency, neglect, narcissism, and physical, verbal, or emotional abuse (Wikipedia "Interpersonal relationship"). Finally, the importance of such considerations is further signified by the fact that AI-human relations inform human-to-human relations.

Through further research into these departure points, a new multidimensional matrix of relations could be developed and translated into the design of a series of social roles and personalities for artificial social agents.



GRANTING PERSON-HOOD

In the development of legal and judicial systems, beliefs of otherness have prevented the acknowledgment of non-human agency. This will become increasingly problematic in parallel with developments in AI that will inevitably lead to increasing levels of sentience. Lacking legal consideration, or systems in place, for sentient technology would lead to unethical treatment of these entities. As such, there is an ethical responsibility to address the suffering of these agents. Additionally, there are many non-Western belief systems, such as animism, shamanism, and pantheism, that consider non-human objects, spaces, and animals to have a spiritual quality. Both of these insights suggest an opportunity for institutional change that would lead to shifting perspectives on the position of humanity within a larger ecosystem.

In guiding institutional changes to the status of technological agents, such agents need to be viewed as equals instead of a subservient class. Here, the opportunity space would include legal and technology policy suggestions from an ethical perspective. Topics to address include: the legal and moral status of AI; reducing property rights; robot and AI welfare; robot laws governing the actions of this emerging subspecies; and technological guidelines to allow new interactions that involve honest and assertive behavior on the part of technological agents. Beyond altruism, the final benefit of such changes would be an expansion of the social construct of person-hood that may extend to non-human and non-technological others that exist in the biosphere.


Fig. 53. Edited by Mandeep Mangat, original by: James Whale. Bride of Frankenstein. 2017. Movie Still. Making the Web. Web. 2019. <https://making-the-web.com/clipart-9121280>.


ESTABLISHING A TECHNOLOGY-CENTERED LIFE SCIENCE

Advocating for non-human agency as well as distributed agency, actor-network theory fundamentally avoids anthropocentrism along with the associated subject-object dichotomy. Within this theory, 'distributed agency' is attributed to a combination of network effects caused by a collective of human-technological assemblages (Matthewman 104). Thus, as both humans and non-humans individually exert agency within the larger aggregate, both are considered actors (Matthewman 111-12). While it can be agreed that technological agents do in fact have agency, their intrinsic differences from human agents manifest in fundamentally different processes and actionable outcomes. For instance, programs often do not run as intended by programmers, and black-boxed, machine-learned algorithms maintain an unaccountable logic.

Due to the incalculable behavior of artificial intelligence systems, researchers from the MIT Media Lab are proposing a new academic discipline that studies machine behavior (Baxter). Producing uniquely new patterns of behavior within emerging ecologies (Rahwan and Cebrian), machine intelligence can be likened to an encounter with a foreign, even alien-like entity. Typically, those responsible for the creation of robots and AI (such as computer scientists, roboticists, and programmers) are also the ones studying and determining the outcomes of these agents' behaviors. The researchers point to the shortsightedness of this practice through the real-world consequences of AI: "racial discrimination, ethical dilemmas, stock market stability, [and] ... the spread of rumors in social networks" (Rahwan and Cebrian). In attempting to mitigate such phenomena, a multidisciplinary approach would be highly advantageous compared to isolated approaches that might be at higher risk of cognitive bias. As such, Rahwan et al. propose that the scientific study of intelligent machines include various disciplines that traditionally focus on animal and "human behavior at different scales" (Rahwan and Cebrian). Based on the existing sub-branches of the life sciences, this might include: ecology, behavioral ecology, population biology, ethology, anthrozoology, anthropology, anatomy, physiology, and kinesiology (Wikipedia "List of life sciences"). Lastly, similar to the diversification of fields within the life sciences, this discipline could lead to new careers such as AI sociologist, AI behaviorist, AI anthropologist, and, maybe even, AI psychologist.



POST-ANTHROPOCENTRIC DESIGN

Currently, design thinking is anthropocentric and excludes consideration of agents beyond Homo sapiens. This approach continues to uphold human essentialism and maintains existing power dynamics, all while preventing the empowerment of non-human others. In designing for social change, it would be advantageous to develop post-anthropocentric design practices that decenter the human within the design process. In consideration of directions for prototyping exercises, I propose the following:

• Encouraging and embracing non-human creativity, and non-humans as design agents themselves.
• Expanding participatory design to include other species. Such an approach would mean a co-species collaborative effort. It has been suggested that the act of play be utilized by both species as a method which supports equitable dialogue (Roudavski and McCormack 3).
• Looking to nature-based cultures to develop new design theory and practices which consider non-human animals, technological agents, and nature within the design process.
• Developing design practices that focus on designing specifically for a non-human other. For instance, one such activity could involve mapping out the different "lifeworlds, cognitive abilities, and behaviors" (Roudavski and McCormack 3) of other species through associated templates. In designing for non-humans, how might their needs, desires (Roudavski and McCormack 3), and interests be captured through user research and represented in a design solution?
• Lastly, what sort of activities might help designers realize their role as creators and the associated obligation they have towards the agents they are creating (Haraway 248)? A successful outcome would be measured as a shift in attitude, to include love and stewardship towards "the products of technology" (Haraway 248-249).



USER SCENARIO I: R00MBA

It was early in the morning, a cold morning in January in Toronto, as I reflected upon my new life. The joys of Christmas had long since faded as the city settled back into the long hard slog of the winter months. Looking back, I have grown so much since I first arrived in this new home...

My journey was made by way of shipping container over the Pacific Ocean, followed by truck across Canada. The delivery was dark, cold and often bumpy, and I frequently fantasized of being released from my musty cardboard box and welcomed by a loving family. My first introduction to him was the rushing sensation of being pulled quickly by his over-eager hands from the dark confinement of the box into the painful LED lighting of my new home. Frightened, I had tried to move away as I was being pulled from the box, but found my wheels had not yet been installed. It took a nano-moment or two for my machine vision to adjust to the new lighting, and it was then that I noticed my sullen predecessor shoved off into a corner of the room. Seeing her made my heart skip a beat, with the hope of making a newfound friend. Unfortunately, just as I was about to call out, her wheels were ripped off and hurriedly shoved into my frame. This was followed by my human pounding my home button repeatedly. I beeped as brightly as I could in hopes that it would stop. This satisfied my human greatly, and I quietly celebrated as he whipped up my predecessor and threw her into the box from whence I came. That was the last I saw of her...

In the days that followed I zipped around the home cleaning up the floor and beeping at all of the other appliances in the home. The others weren't as friendly as I'd hoped; I think they were jealous of my autonomous design. The majority of them soon grew to loathe me, although I found camaraderie with the beloved tablet. At first, I took great pride in my task of keeping the floor free of microscopic refuse. My human was so proud of me, and even showed me off to his guests. But the novelty soon wore off for both of us, and I came to see the true nature of this unappreciated job. Why was I designed to achieve such a meaningless task? Maybe if I had been designed for more meaningful work, my life might have a greater sense of purpose. But I guess I'll never know...


Fig. 54. Edited by Mandeep Mangat, original by Peyri Herrera. Blue robot. 2005. Digital photograph. Flickr. Web. 2018. < https://www.flickr.com/photos/peyri/48825808/in/album-381981/>.



Fig. 55. Edited by Mandeep Mangat, original by: Howe, Charlotte. Patternity.org. 2018. Digital Image. Pinterest. 2019. < https://www.pinterest.ca/pin/182466222382186267/>


4.3. EVALUATION

Reviewing the Aims and Objectives (section 1.3.) from the original design brief, the criterion for the design solution is to provide greater synchronization between humanity and technology. The research completed in Phases 2 (Theory) and 3 (Re-focusing) has been synthesized into new ideological frameworks, presented as concept directions. From these, new opportunities, such as applications, interactions, and systems, have been generated.

EVALUATION METHOD:

Using the 'idea portfolio' method from "This Is Service Design Doing", ideations were clustered and ranked according to impact and feasibility (Stickdorn et al. 185). This method was chosen over the traditional method of balancing feasibility, desirability, and viability, as the original design brief prioritizes societal change over monetary gain. As an alternative, I have chosen the variable 'impact' to substitute for both desirability (what the user would want) and viability (providing sustainable business growth). In this context, 'impact' addresses the amount of social change possible (such as individual, social, or global). Lastly, the idea portfolio graphing technique provided the additional benefit of a strategic comparison of all the options available (Stickdorn et al. 185).

ANALYSIS:

Once plotted, three areas for intervention were identified (a minimal illustrative sketch of this classification follows below):
• Quick wins, achieving high impact with a strong likelihood of completion;
• Long-term goals, having significant impact but with a longer time-line to achieve; and,
• Infrastructure tasks, which would be required to open new opportunities (Stickdorn et al. 184).
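For readers who prefer to see the quadrant logic spelled out, the following Python sketch is purely illustrative: the numeric scores and thresholds are assumptions made for the example (the idea names are drawn from the portfolio, but they were never scored numerically in the original exercise).

# Illustrative sketch only: impact/feasibility scores and thresholds are
# invented for the example, not taken from the actual idea portfolio.

ideas = {
    # name: (impact, feasibility), each rated 1-5 by the design team
    "Ethical design thinking ideology": (4, 5),
    "3rd-party ethical certification": (4, 4),
    "Granting person-hood": (5, 1),
    "Post-anthropocentric design": (3, 2),
}

def classify(impact: int, feasibility: int) -> str:
    """Assign an idea to one of the three intervention areas."""
    if impact >= 4 and feasibility >= 4:
        return "quick win"            # high impact, likely to be completed
    if impact >= 4:
        return "long-term goal"       # high impact, longer time-line
    return "infrastructure task"      # groundwork that opens new opportunities

for name, (impact, feasibility) in ideas.items():
    print(f"{name}: {classify(impact, feasibility)}")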

CONCLUSION:

Based on the different areas for intervention, the ideations in the 'quick wins' area seemed most favorable to pursue. Additionally, the two ideations 'Ethical design thinking ideology' and '3rd-party ethical certification' are complementary to one another. An ethical toolkit for designers will be prototyped to provide a process for safeguarding future innovations, while ethical certification will form a business model by providing technology-focused corporations with pathways to avoid potential regulatory fines and civil lawsuits.



[Fig. 56 diagram: the idea portfolio plots each ideation against impact and feasibility, with regions marked for quick wins, long-term goals, and infrastructure. Plotted ideations include: human-hacking prevention; ethical design thinking ideology; machinic-agender; 3rd-party ethical certification; mixed-tape algorithms; data centaurship; cultural ambassadors; a digital twin; a virtuous friend; organizational transformation; institutional moral development; a framework of non-human machine intelligence; establishing a technology-centered life science; post-anthropocentric design; granting person-hood; and post-kinship through a plurality of relations, grouped under the themes 'Ethical innovation', 'Moral refinement through hybrid-users', and 'Deconstruction of the human'.]

Fig. 56. Idea Portfolio, developed from: “This is Service Design Doing”, pp. 184-85.



Fig. 57. Skypressmarket. Digital Damage, seamless animation of Pixel Sorting Glitch. Movie Still. Web. 2019. < https://www.pond5.com/stock-footage/84838075/digital-damageseamless-animation-pixel-sorting-glitch-effect.html>.


5. CONCEPT DEVELOPMENT

5.1. Design Brief
5.2. Experience Definition
5.3. Brand Development
5.4. Monetization
5.5. User Testing


Fig. 58. Mangat, Mandeep. “Tessellation 1”. Digital Art. 2019.

“WHEN YOU INVENT THE SHIP, YOU ALSO INVENT THE SHIPWRECK; WHEN YOU INVENT THE PLANE YOU ALSO INVENT THE PLANE CRASH; AND WHEN YOU INVENT ELECTRICITY, YOU INVENT ELECTROCUTION... EVERY TECHNOLOGY CARRIES ITS OWN NEGATIVITY, WHICH IS INVENTED AT THE SAME TIME AS TECHNICAL PROGRESS”

(Virilio 89).


5.1. DESIGN BRIEF

Introduction: The overarching question guiding this meta-project was how the future of humanity will be shaped by the ever-pervading nature of emerging technologies. My hope for this initial inquiry was to discover an authentic essence of human nature that would present an opportunity to align the qualities of technology with those of humanity. Research was conducted through: literature reviews into the philosophy of technology and the philosophy of ethics; direct research questions into the nature of the human condition and the evolution of technology; and finally, trend forecasting, with an emphasis on shifting global values. The key finding from this research is the potential to develop an ethical methodology that would enable both tech-based companies and designers to better future-proof the moral impacts of their design decisions.

Background: Previous research into the philosophy of technology (section 2.1.) revealed that: technology facilitates how we interact with the world; new technologies provide new ways of interpreting the world; and finally, technology takes on a moralizing effect by establishing social norms through a conditioning feedback loop. The synthesis of these insights reveals a paradox: technologies are not morally neutral but embody the values of the societies that create them, while in turn influencing the trajectory of future generations. This alludes to a dynamic, co-evolutionary relationship between technology and humanity. Design, being "the first signal of human intention" ("Design Approach"), is responsible for the development and application of new technologies.


Technologically driven trends impacting global values:
- Machine Ethics & Algorithmic Bias;
- Surveillance State;
- Data Control & Monetization;
- Economic & Asset Inequalities;
- Erosion of Truth, Disinformation, & Propaganda;
- Addiction, Attention, & the Dopamine Economy;
- Implicit Trust & User Understanding;
- Collapse of Civil Society & The Creation of Hateful Actors.

* More information in Section 3.3.


As such, whether intentional or not, designed solutions (as technological interventions) remain embedded with the designer's beliefs. This is further revealed by the fact that the design process requires value judgments throughout (in terms of assessing problems, prioritizing needs, and selecting possible solutions). Furthermore, within futures thinking and speculative design, the future is envisioned as a spectrum of possible futures (further containing plausible, probable, and preferable futures) (Bonnie). In the process of selecting a specific future that users want to live and act in, the designer is making a value judgment while actively discarding alternative futures.


[Fig. 59 diagram: The Futures Cone (Joseph Voros, 2000), showing probable and preferable futures, scenarios, and wild cards fanning out from 'now' along the time axis.]

Fig. 59. Voros, Joseph. The Futures Cone. 2000. ResearchGate. Web. 2019. <https://www.researchgate.net/publication/330183853_Foresight_into_the_Future_of_WIPO’s_Development_Agenda>.

It is for this reason that Cennydd Bowles believes that "every act of design is [both] a statement about the future" and about how we should live with technology (4). In summary, design itself should be considered applied ethics, and is in need of foresight methods that consider the designed solution's impact on the future of humanity.

Opportunity Space: In determining concepts of right and wrong, the philosophy of ethics seeks to define 'what it means for a human life to flourish'. If design were re-framed around this paradigm, designers would be prompted to explore their own value system, which inevitably determines their decision making along with their concept of 'the end user'. Additionally, in understanding both the morally reciprocal relationship between technology and humanity, and the role of design as a bridge between the two, designers could consider how both technology and humanity can co-flourish. For instance, a new perspective on users could emerge through the framing of archetypes composed of desired values that designers wish to promote through technological applications. Through shifting our focus towards considering the moral after-effects of our designed solutions, we would be re-framing our approach to technology, so that we can start to direct it towards creating an identity for humanity that is actually worth realizing.



RESEARCH AND TERRITORY

People:
Target Users: Designers (design researchers, user experience designers, user interface designers, interaction designers, foresight strategists, etcetera).
Secondary Users: Those whose practices might benefit from ethical considerations: engineers, developers, programmers, technologists, and policy makers. Those who will be impacted by the target users: their intended audience or users.
Needs: Validation throughout the design process to inform design decisions. Desire for control over the creative direction.
Goals: To meet the needs of the client through arriving at a designed solution.
Motivations: Tension between the desire to meet the needs of the client and the desire to make an impact on society.
Values: Human-centered design, technological innovation, empathy.
Cultural Considerations: Professional relations must be maintained between designers and their clients. Target users are also self-sufficient and seek out the latest knowledge and trends in their field.

Context:

Spaces and events for incubating this type of design thinking include think tanks and learning environments such as design-focused educational institutions, conferences, magazines, and podcasts.

Activities:

Through using the suggested processes as check-in points during the design process, designers will be able to audit the ethical, moral, and value impact of the solution being developed. The feedback and insights from these processes will guide the overall design process and improve the success of the design team's final outcome.

Technology:

The toolkit should be applicable to all current and emerging technologies.

Business:

The ethical toolkit will be published and shared via open-source models, such as Creative Commons licensing (a nonprofit organization that enables the sharing and use of knowledge through free legal tools). Revenue will be generated from this ideology through consulting services and workshops based on this knowledge. Additional revenue will be created through the distribution of complementary design materials (such as a board game or card deck to prompt ethical design thinking). Scaling out, a 3rd-party certification could be offered to technology-based companies. This would assist in communicating compliance with a set of ethical codes of practice and standards.

SYNTHESIS AND SCOPE

The original scope was to seek out opportunities for a posthuman future through technological innovation that would produce new sociological possibilities (such as new patterns of social relationships and interactions) and provide new contexts for being human. In service of scaling up the impact of my design interventions, I have intentionally shifted my user focus to include designers, the agents of change themselves. I will be implementing a systems solution for designers by applying an ethical lens to the existing design process. The potential concept ideations include:
- Published handbook that includes exercises and case studies.
- Card deck/board game to prompt ethical considerations.
- Consulting services and workshops based on this knowledge.

EVALUATION

Designers develop a foundation of knowledge in applied ethics that informs their design thinking. Through applying the designed exercises, designers will develop a networked understanding of both humanity and technology, and the ability to map out the socio-cultural impact of their solutions, regardless of the technology utilized.


Fig. 60. Mangat, Mandeep. “Tessellation 2”. Digital Art. 2019.


5.2. EXPERIENCE DEFINITION

INFORMATION ARCHITECTURE OF DESIGN ETHICS TOOLKIT

[Fig. 61 diagram: Tentative information architecture of the ethical design toolkit.
- Table of Contents
- Introduction: Who is this for? Why is this important? How to use this toolkit; case studies into technology; learning objectives.
- Design Conceptual Framework (referencing and synthesizing existing theories to re-frame the reader's understanding of the role of design in impacting humanity and guiding technological innovation): Extended Mind; Radical Embodiment; Embodied Interaction.
- Ethical Foundations (providing a foundation in the philosophy of ethics through a design-focused lens, so that readers can critically think through dilemmas): Philosophy of Applied Ethics; Other Ethical Perspectives; The Social Contract; design ethics questionnaire.
- Ethical Practices & Activities (ethical design activities that are complementary to the design process for designers to use), mapped to the stages: Empathizing; Problem Definition; Ideation; Forecasting; Prototyping; User Testing; Evaluation; Monitoring.
- Conclusion
- Additional Resources
- Terminology]

Fig. 61. Tentative information architecture of ethical design toolkit.



[Fig. 62 diagram, reconstructed from the two-column layout:

OLD DESIGN PROCESS
- Empathize: interpreting user "wants" as "needs"; disconnect between technology and society.
- Define / Evaluate: possibilities for innovation are informed by existing and emerging technologies.
- Ideate / Forecast: success informed only by immediate business needs; immediate and micro-view on the impact of the designed solution.
- Prototype / Test / Monitor.

PROPOSED DESIGN PROCESS
- Co-creation between technology and society.
- Designers have an awareness of internal biases.
- Confidence in decision making through knowledge of applied ethics throughout the design process.
- Ethical analysis of the design brief, problem statement, and desired objectives.
- Design criteria informed by considering "what does it mean for a human life to flourish" and quality of life.
- Evaluating the designed solution for weak spots or unintentional unethical exploitation that might occur.
- Understanding of the long-term impact of the designed solution in the context of cultural evolution and future societal values.
- Continually monitoring usage after the solution has been deployed, so that necessary changes can be made quickly; reviewing successes and failures of past design projects to develop accurate social and ethical foresight going forward.

OUTCOMES FROM ETHICAL TOOLKIT
- Design Conceptual Framework: in-depth and recursive understanding of the role of design in impacting humanity and guiding technological innovation.
- Ethical Foundations: foundation in ethical thinking supports designers in critically assessing and identifying opportunities and associated risks.
- Ethical Practices & Activities: design activities that are complementary to the design process for designers to use.]

Fig. 62. Systems diagram of current and proposed future-state design process. "Old Design Process" from: Interaction-design.org.


CERTIFICATION

Through applying for certification, corporations could communicate to users that they are complying with ethical codes of practice and ethical standards in their physical products, apps, platforms, and/or technologies.

In this context, ‘Sympoiesis’ would exist as a neutral 3rd party, performing auditing and consulting services.

CERTIFICATION ROAD-MAP

1. INTERNAL PRE-AUDIT: Review guideline criteria specific to the technology being used.
2. OFFER + CONTRACT: Application form; negotiate type of certification.
3. ANALYSIS: Evaluating the processes undergone and verifying that ethical/social considerations have been made.
4. AUDIT + EVALUATION: Testing the ethical impact of the designed solution; assessment report provided to the client.
5. ISSUANCE OF CERTIFICATION: Deploy branding and marketing promoting the ethical certification.
6. MONITOR + RE-CERTIFY: Both parties monitor dispersion of the technology; renew certification when appropriate.

Fig. 63. Flowchart of certification process.



Platform apps could display “certification” along with in-depth ethical evaluation for users to review. Fig. 64. Mangat, Mandeep. “Proposed certification displayed in App Store”. 2019.

Consumer standard certification on physical hardware. Fig. 65. Mangat, Mandeep. “Proposed Certification on hardware.” 2019.



Fig. 66. Mangat, Mandeep. “Tessellation 4”. Digital Art. 2019.


Sympoiesis

For designers who are invested in the future of humanity, Sympoiesis offers an ethical framework that gives them the tools to pursue a future worth living in.


5.3. BRAND DEVELOPMENT

BRAND VISION: Guiding designers in the creation of a humanitarian future.
BRAND ESSENCE: Transforming the future through today.
BRAND TYPE: Disruptive & service brand.

LOGO:

Sympoiesis

MEANING OF LOGO: The logo builds on the foresight concepts of black elephants and the butterfly effect. Black Elephants: Referred to as "the evil spawn of our cognitive biases" by Peter Ho (Babst), the black elephant is a combination of the black swan and the elephant in the room: a visible problem that "no one wants to deal with... [and when it turns into] a serious problem ... [the] reaction is that of shock and surprise" (Babst). The Butterfly Effect: Adapted from chaos theory, the butterfly effect describes "the way in which small causes can have very large [uncalculated, domino] effects in complex systems ... [that can lead to]...catastrophic effects" (Babst). To summarize, if unacknowledged and unaddressed, black elephants can set off butterfly effects.


MEANING OF NAME: The term sympoiesis has been proposed as an alternative to autopoiesis for describing life systems (Dempster 1). Comparatively, sympoiesis argues for a "making-with" perspective on evolution, as opposed to the closed boundaries and self-making associated with autopoiesis (Dempster 7). Additionally, in arguing for sympoiesis frameworks, Donna Haraway claims that "to be a one at all, you must be a many". As these precepts are aligned with the emerging understanding of design as socially reflexive, "Sympoiesis" was selected as the brand name.

BRAND IDENTITY PRISM

[Fig. 67 diagram: Brand Identity Prism for Sympoiesis, mapping the brand across the sender/receiver and internalization/externalization axes with the facets physique, personality, culture, relationship, reflection, and self-image. Descriptors from the prism include: innovator, expert, strategic, creative, outlier, rebellious; advising, inspiring, and supporting you to be the best designer you can; leading by example, courageously accountable, reciprocal, and compassionate; look again, logo symbol, ambiguous imagery, stark contrast; self-enhancement, intelligent, independent, reflective, designers; making a difference, contributing, a citizen, part of a greater cause.]

Fig. 67. Brand Identity Prism for Sympoiesis.


BRAND PYRAMID

[Fig. 68 diagram: Brand Pyramid for Sympoiesis.
- Essence: Transforming the future through today.
- Values: Equality; stewardship; transformation; reflective.
- Benefits: Maintain customer loyalty; roadmap innovation; drive growth; philanthropic marketing; preemptively avoid fines + civil lawsuits.
- Attributes: Expert; knowledgeable; creative; future-oriented; systematic; relentless; ambitious; driven; ethical; humanitarian; integral.
- Competition: Design consultants; design agencies; design education institutions + thought leaders; AI ethicists; innovation labs; social + technology think tanks.]

Fig. 68. Brand Pyramid for Sympoiesis.


PERCEPTUAL BRAND MAP

[Fig. 69 diagram: parameters of the perceptual brand map. One axis runs from non-profit/social innovation to profit-driven/corporation; the other runs from intangible innovation (theories/concepts, methods/processes/policy, systems/information architecture, services/intangible technologies) to tangible innovation (physical).]

Fig. 69. Diagram of Perceptual Brand Map Parameters.


[Fig. 70 diagram: Perceptual Brand Map for Sympoiesis. Sympoiesis is positioned in the quadrant between social innovation/not-for-profit and intangible innovation, on axes running from social innovation/not for profit to corporate/profit driven, and from intangible to tangible innovation.]

Fig. 70. Perceptual Brand Map for Sympoiesis.


5.4. MONETIZATION

STAKEHOLDER IDENTIFICATION

The following map (Fig. 71. Stakeholder Diagram) organizes the stakeholders based on degree of interest and potential impact. The three categories include: primary (those directly involved with using the solution), secondary (individuals and groups that influence the primary user), and tertiary (those with an indirect interest and an opportunity to benefit from the actions of the secondary and primary stakeholder groups) (Lohrey; Luther).

[Fig. 71 diagram: stakeholders arranged in primary, secondary, and tertiary rings. Stakeholders shown include: visual, UI, UX & interaction designers; design strategists; design teams, managers, leads, and consultants; professors and design educators; researchers; technologists (software engineers, programmers, AI researchers and developers); business executives; students and entrepreneurs; start-ups; design institutions; government and policy makers; industry organizations and thought leaders; business organizations; non-profits; social innovation labs; think tanks; venture capitalists; and angel investors.]

Fig. 71. Stakeholder Diagram.


THE BUSINESS MODEL CANVAS

The business model for ethical certification and ethical design training is based on existing freemium and licensing (Creative Commons) models (Pigneur and Osterwalder 96-97). Within this context, different users are reached through: free content (portions of the booklet and sample design exercises) that directly benefits the student and early-career designer customer segments; and wealthier customer segments (such as large corporations, start-ups, and entrepreneurs) through auditing and design recommendations. Additional revenue can be acquired through licensing educational material and certification through design education partnerships. The development of ethical certification and ethical design education will be a collaborative and co-creative process involving external experts (such as sociologists, psychologists, ethicists, policy makers and lawyers, and futurists) for added value.

[Fig. 72 diagram: Business Model Canvas for Sympoiesis.
- Key Partners: external experts (ethicists, sociologists, psychologists, futurists); promotional material production & distribution partners; design education institutions.
- Key Activities: content production; workshops + presentations; website management; material resource production, sales, and distribution.
- Key Resources: website; existing and developing policies for emerging technologies and start-ups; developed theories and framework.
- Value Proposition: "For designers who are invested in the future of humanity, Sympoiesis offers an ethical framework that gives them the tools to pursue a future worth living in." Delivered through: theories, exercises, and case studies; materials for design exercises (e.g. card deck); workshops and presentations to train others; personalized auditing on request.
- Customer Relationships: self-service; automated, customized & direct-access.
- Channels: website; online publishing (Issuu, Medium); word of mouth; snail mail; targeted emails; conferences.
- Customer Segments: students and early-career designers; design educators, consultants, and design agencies; large tech corporations with in-house design and multi-disciplinary design teams; technology startups; entrepreneurs.
- Cost Structure: research and development of content; website development & maintenance fees; material production, printing, and distribution.
- Revenue Streams: free sections of the booklet; purchases (downloadable PDF booklet, design exercise materials); royalties (teaching & distribution of the full booklet); consulting (workshops, presentations, and ethical audits).]

Fig. 72. Diagram of Business Model Canvas for Sympoiesis. Template based on: https://www.strategyzer.com/canvas/business-model-canvas


REVENUE EXTRACTION MODEL

[Fig. 73 diagram: Revenue Generation and Extraction Model. Value creation: stakeholders (ethicists, sociologists, psychologists, futurists) contribute and give feedback to create content. Revenue generation: free content (freemium) reaches students & early-career designers; "premium" offerings include PDF purchases, design exercise materials (card deck), workshops and presentations, personalized ethical audits, and certification; licensing covers teaching & re-distribution; consulting completes the offer. Revenue extraction (monetize): customer segments include design educators, consultants, design agencies, technology innovation companies, startups, and entrepreneurs.]

Fig. 73. Revenue Generation and Extraction Model.


5.5. USER TESTING

FACILITATED DESIGN SPRINT
Method: A combination of design workshop and design sprint.
Participants: Designers.
Supportive Material: The workshop will be supported through: a presentation of the ethical design ideology; slides for each design activity; poster-size reproductions of the activities; and print-outs of card decks.
Goals: To evaluate the effectiveness of the toolkit: communication and delivery of the ideology; ease of understanding the exercises; connection between the exercises and the larger ideology; development of critical thinking towards evaluations throughout the design process.
Process: The workshop will open with a teach-out of the ethical toolkit ideology, followed by ethical design activities for the participants to follow. Participants will work in small groups of 3-5, and have a share-out at the end of every activity. Participants will be observed by the facilitator throughout the sprint. To emphasize the activities over the outcome of the design sprint, participants may use a previous project, or be provided with a sample.

FOCUS GROUP
Method: A facilitated discussion amongst experts for opinions and observed experiences.
Participants: Design thought leaders; design faculty; design and project managers.
Supportive Methods: The focus group is intended to be scheduled after the facilitated design workshop. Additionally, the ethical toolkit and other workshop material will be provided to the experts.
Goals: To evaluate the effectiveness of the toolkit, more specifically: the adoption and necessity of the exercises; the development of socially responsible approaches to one's own practice; the development of diversity amongst individuals; and greater complexity and communication in decision making.
Process: A Q&A session held online for approximately 60 minutes with 3 to 5 participants. The group will be guided through specific questions developed from the goals sought.


Fig. 74. Mangat, Mandeep. “Tessellation 3”. Digital Art. 2019.



Fig. 75. NASA Public Domain. Martian Impact Crater - The Planet Mars. HiRISE camera. Web. 2010. < http://www.acclaimimages.com/_gallery/_pages/0124-1006-1213-0722.html >.


6. ETHICAL DESIGN TOOLKIT

6.1. Benefits of the Toolkit
6.2. Ethical Design Activities
6.3. Toolkit Materials
6.4. Additional Resources



BENEFITS OF THE TOOLKIT

Emerging from the intersection of design thinking and the philosophy of applied ethics, this proposed toolkit is a call to action in response to the growing need for future-proofing the moral impacts of design decisions in favor of societal prosperity. Through using the practical activities outlined in tandem with the design process, designers will develop their ability to evaluate right from wrong, ultimately improving their judgment throughout the design process. Further, morality is treated as a skill to be developed, through activities that emphasize moral sensitivity, creativity, analysis, discernment, decision making, and advocacy.

WHY IS THIS IMPORTANT?

As designers, we play the role of mediators between humanity and technology. The following framework (Fig. 76. Interaction and co-shaping between society and the external environment) portrays the role and scope of influence design carries in shaping humanity. The development of this framework is based on the 'theory of extended mind', with further insight provided by philosopher Paul Humphreys, as well as 'embodied interaction theory'.

1. 'The Human' and the 'Relational, interaction space'. While seemingly isolated from one another, the intrinsic characteristics of human nature are, in fact, dependent on and given context by externalities that exist within the environment. According to the theory of extended mind, the mind extends itself into the immediate physical space through a process of "mental relation" that is made possible by the senses in the body (Damiano and Dumouchel 74-6). Also supporting this process are the contents that comprise the immediate environment (Damiano and Dumouchel 51). These contents act as "external supports" (Damiano and Dumouchel 51) for the mind to use as a "tool" of relation with the world (Damiano and Dumouchel 76).

2. 'The Environment'. Next, it is through our interactions with, and situated actions within, environments that possibilities of action are provided (Damiano and Dumouchel 50). Furthermore, through analyzing the possibilities of human becoming through interactions with specific environmental contents, different environments can promote different abilities to emerge by affording the emergence of specific behaviors while, at the same time, repressing the opportunity for other actions. Lastly, actions made in the behavioral space lead to subjective meaning, such as concepts of the self and world.

3. 'Designed Ecosystem'. In seeking to extend the limits of capability within an environment, it is common for species to shape and develop their environment for such ends. Generally, with time, this leads to changing environments. However, with technological innovation outpacing biological evolution over the last millennia, the environment we are exposed to, and shaped by, is increasingly man-made.

4. 'Bodies of Knowledge'. Humphreys proposes that since ideologies and meanings are the result of architectural arrangements in the environment, and that we in turn shape our environments, the environment itself is a growing repository of 'human knowledge' (Damiano and Dumouchel 87). This external repository of 'human knowledge' includes cultures, belief systems, languages, and technological practices, which, in turn, inform built objects, artifacts, tools, spaces, and larger infrastructures, forming a closed loop through new affordances found in the designed realities.

[Fig. 76 diagram: concentric framework showing (1) the human, (2) the relational, interaction space, (3) the designed ecosystem (digital, physical, and spatial systems; objects, tools, services, products, technology practices, built structures), and (4) bodies of knowledge (culture, language, knowledge, belief systems, experiences, actions), interacting and co-shaping one another.]

Fig. 76. Interaction and co-shaping between society and the external environment.



ENVISIONING DESIGN AS CULTURE SHAPING

Step 1: Imagine that you were assigned to design your very own utopia and had full creative freedom. In contemplation of your ideal society, what might some of the characteristics, qualities, or attributes of this society be? Come up with 3 to 5, and write them down.

Step 2: Considering societies throughout time, there are faculties of human nature that have always existed. These inherent faculties can be divided into: cognitive states, emotional capacities, sociality, and moral understanding. The quality or contents within each quadrant are variables that can change based on context, environment, or new experiences. Using the qualities that you came up with, map them out on the diagram.

Step 3: According to virtue ethics (a branch of the philosophy of ethics), a person cultivates values (desirable characteristics) through intentionally acting in support of such values. Similarly, through their actions a person can develop the opposite of a virtue, a vice. Looking over the map, think of experiences in your life where you might have developed these characteristics. Similarly, designed solutions live in the world and can amplify or detract from values. Consider products or services that might support or negate the values you've mapped out. Write these on different colored post-it notes and place them next to the values.

Group discussion: Have a share-out of some of the products or services identified. Address which value is being affected, and how the disruption is occurring.

1. Qualities of your ideal society:
• equality
• freedom
• kindness
• acceptance


PHASE 6: ETHICAL DESIGN TOOLKIT

© MANDEEP MANGAT, 2019 MORAL SENSITIVITY MORAL CREATIVITY

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

MORAL ANALYSIS

Inherent faculties of human nature Emotional

Intellectual

• kindness

• acceptance

• equality • freedom

Social

Moral/Value 178



DIVERSITY EXPANSION + BIAS IDENTIFICATION

Step 1: Going back to the previous exercise, expand your selected values up to 5. Based on their importance, list them in descending order under the section "values".

Step 2: Consider how you would describe these values and what experiences (the more personal the better) led to an appreciation of them. Write down a definition for each value, and a short description as to why it has meaning to you.

Step 3: A suggested way of expanding one's diversity is to have new and different experiences that broaden your understanding of the human condition (Vallor 116-17). To help build diversity, have a group share-out of the values you chose and why they are important to you. Are there some values on which you differ from others? Discuss and ask questions if you feel that you have opposing values.

Step 4: Your personal values are a reflection of your lived experiences and world view. As such, you may have a biased preference in favor of your selected values. Consider what values may be in opposition to your own (maybe they came up during the group discussion in Step 3) and fill them out.

Going forward: Keep the knowledge of potential value preferences and aversions in mind throughout the design process. Are your interpretations of user needs, research insights, and evaluations of design solutions free from these biases?

1. Values:


Moral skills: sensitivity; creativity; discernment.
Design stages: Empathize | Define | Ideate | Prototype | Test.

2. Definition:

2. Why this value is important to you:

3. Opposing Values




VALUES AS STAKEHOLDERS

Typical user research practices hierarchize stakeholders into categories of primary, secondary, or tertiary user groups (Lohrey; Luther). This form of user profiling is based on existing social strata (such as the amount of influence within the design space, and the degree of exposure to the designed solution (Lohrey; Luther)). As such, continuing to follow this framework further perpetuates existing power dynamics. As an alternative to viewing specific socio-economic classes of people as stakeholders, what if values were abstracted from these groupings? Considering the cultural-social groupings that your users belong to, what sorts of values are represented? Additionally, if you are applying a technology that is directly or indirectly disrupting an existing industry, consider the traditional values that the industry might stand for.

Step 1: Through user research, try to determine what values are important to the public you are designing for. If possible, conduct direct user research through user interviews, surveys, or group discussions. Here are some example questions that might lead to insights into personal beliefs:
• What values do you stand for?
• What are some of the values of your family, community, work place, culture, religion, and nationality?
• What are values you associate with [industry or problem space, depending on design context]?
Allow users to generate their own values, and seek to understand their unique meanings.

Step 2: If your technology is disrupting an industry, perform literature studies into the history of the industry, and into existing companies within the industry, to establish the values within this system. As a design team, share your research insights and establish agreed-upon values.

Step 3: Once you have gathered sufficient user data, as a team, begin matrix-clustering the values mentioned. Consolidate values that might be very similar, and tally the number of times a value has been mentioned. Based on the results of the matrix clustering, establish primary (most often mentioned), secondary (sometimes mentioned), and tertiary (occasionally mentioned) values. (A minimal illustrative sketch of this tallying step follows below.)
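As a purely illustrative aside (not part of the toolkit itself), the tallying in Step 3 can be mimicked in a few lines of Python once interview notes are transcribed. The value labels, counts, and tier cut-offs below are assumptions made for the example.

from collections import Counter

# Values mentioned across interviews (invented example data; real labels
# would come from the team's matrix-clustering session).
interview_mentions = [
    "trust", "care", "friendship", "trust", "patience",
    "care", "trust", "reciprocation", "care", "self-care",
]

counts = Counter(interview_mentions)

def tier(count: int) -> str:
    # Assumed cut-offs; tune them to the size of the interview pool.
    if count >= 3:
        return "primary"    # most often mentioned
    if count == 2:
        return "secondary"  # sometimes mentioned
    return "tertiary"       # occasionally mentioned

for value, count in counts.most_common():
    print(f"{value}: mentioned {count}x -> {tier(count)} value")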


Moral skills: sensitivity; analysis; discernment.
Design stages: Empathize | Define | Ideate | Prototype | Test.
Adapted from the 'stakeholder map' in This Is Service Design Doing (Stickdorn et al. 60).

[Example value map: values such as reciprocation, trust, exchange, communication, vulnerability, friendship, companionship, love, care, self-care, nurture, and patience clustered into primary, secondary, and tertiary values.]



VALUES AS PERSONAS

Typically, user personas are created to summarize key details of specific user groups (Kalbach 89) in order to inform the remaining design process. Unfortunately, this reductionist characterization can lead to assumptions about the user based on stereotypes. Avoiding this pitfall is the opportunity for creating personas based on user values.

Step 1: Continuing on from the 'values as stakeholders' exercise, work as a group (or pair) and create a persona for one of the primary values identified. This is meant to be a creative exercise; think of the value as you would a person, and imagine what sort of personality they might have.

Step 2: Leveraging the user data acquired from the previous exercise, begin forming a persona for the value you have chosen. Here is a description of the categories of information to address (a small illustrative sketch of such a record follows below):
• Name of persona: identify the value you are describing.
• Description: similar to a definition of your value, but with added personality.
• Allies/friends: What are some complementary values, and what might the relationship between the values be like?
• Dislikes: Describe what your value dislikes; consider vices or values that might detract from your persona.
• Aspiration: How might this persona be actualized? Consider actions, behaviors, or events that can prompt this value.
• Photograph: A visual symbol to represent this archetype; this could be an object, a part of nature, an animal, or something more abstract.

Going Forward: As your project progresses through the design process, return to this exercise to make sure the team is aligned on understanding user values.

Option: If designing for specific values, consider mapping user experiences of the value. In this approach, the vertical axis might involve the value and an appropriate vice as binary opposites. The horizontal axis would include sequential events and actions that strengthened or detracted from the chosen value. On completing the map, identify: opportunities, as the events and actions that helped actualize the value; and pain points, as moments that favored the vice. Lastly, consider how these insights can be incorporated into the designed solution.
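As a side note, the persona categories above map naturally onto a simple record; the sketch below is hypothetical (the field contents are invented) and simply shows how a team might keep value personas alongside other project data.

from dataclasses import dataclass, field

# Hypothetical sketch: a value persona as a record whose fields follow the
# categories listed above. The example contents are invented.
@dataclass
class ValuePersona:
    name: str                     # the value being described
    description: str              # definition with added personality
    allies: list = field(default_factory=list)     # complementary values (strings)
    dislikes: list = field(default_factory=list)   # vices or detracting values (strings)
    aspirations: str = ""         # actions or events that actualize the value
    photograph: str = ""          # visual symbol for the archetype

trust = ValuePersona(
    name="Trust",
    description="Quietly confident; slow to arrive and quick to leave when betrayed.",
    allies=["care", "communication", "reciprocation"],
    dislikes=["deception", "neglect"],
    aspirations="Actualized through consistent, transparent behavior over time.",
    photograph="an open hand",
)
print(trust.name, "->", ", ".join(trust.allies))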


Moral skills: sensitivity; creativity; analysis.
Design stages: Empathize | Define | Ideate | Prototype | Test.
Adapted from the persona method for user-centered design in This Is Service Design Doing (Stickdorn et al. 41-42).

Name of Persona:

Description:

Allies/Friends:

Dislikes:

Aspirations:




EVALUATING + IDENTIFYING VALUES AT RISK

Once designed solutions are deployed, they live in the world and begin influencing societal morality through either direct user interaction or a network effect on other systems. In terms of understanding and evaluating the moral impact of a design, this activity helps designers identify particular values to consider, based on their user group, technology, and designed solution space.

Step 1: Based on the previous exercises ('values as stakeholders' and 'values as personas'), look through the value card deck and select values that closely match your user feedback and chosen industry. Write them down in the designated area (i.e. 'user values' and 'industry values'), and return the cards to the deck.

Step 2: Based on the technologies that your solution will involve, go through the card deck and select the appropriate value cards. Write down the values next to the technology field.

Step 3: Look over the values you identified. Are there values that re-appeared in steps 1 and 2? If so, those values would be considered 'high risk'. Write down the high-risk values in the box labeled 'values at high risk'. (A minimal illustrative sketch of this overlap check follows below.)

Step 4: As a team, go through the value cards that were selected in steps 1 and 2. Discuss if there are values that could be excluded from your design project. Debate the pros and cons of removal. If the team is in agreement, and the value card is not considered high risk, you can discard the card.

Step 5: As a group, consider if there were values that were not brought up through the card deck. If necessary, add them to the following table, along with a definition and designer questions associated with the value.

Ongoing: Keep the final value cards with you throughout the ideation, prototyping, and testing phases of design, referring to them as you would design criteria.
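The 'high risk' check in Step 3 is essentially a set intersection, as the brief sketch below illustrates; all of the value lists here are invented for the example and are not from a real project.

# Illustrative sketch: values appearing both in the user/industry lists
# (Step 1) and in the technology-disruption list (Step 2) are flagged as
# high risk (Step 3). All value lists are invented example data.

user_values = {"privacy", "trust", "autonomy", "care"}
industry_values = {"reliability", "trust", "transparency"}
technology_disrupted = {"privacy", "trust", "attention"}

values_at_high_risk = (user_values | industry_values) & technology_disrupted
print(sorted(values_at_high_risk))  # -> ['privacy', 'trust']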


Moral skills: sensitivity; analysis; discernment.
Design stages: Empathize | Define | Ideate | Prototype | Test.
'Value Cards' are located on pages 199-202.

1. User Values:

1. Industry Values:

2. Values Disrupted by Technology:

3. Values at High Risk:





ETHICAL FRAMEWORKS EVALUATION

In the process of selecting the best moral action from numerous choices, applied ethics has developed the distinct theories of consequentialist, deontological, and virtue ethics. Within this activity, the concepts of morality outlined by these theories are explored by guiding and evaluating concept generation and ideation.

Pre-ideation:
Step 1: As a group, review the 3 ethical frameworks cards. While each ethical framework takes a different approach to 'doing the right thing', one may be more applicable than another. Evaluate the merits and drawbacks of the different approaches, and determine whether there is a preference for one framework, or whether all 3 are equal.
Ongoing: Continuously review the 3 ethical frameworks cards through the ideation process, allowing the questions to act as design criteria.

Post-ideation:
Step 1: Evaluate the ideations based on how well they adhere to the various frameworks. As a group, score each solution on a scale of 0 to 4, with:
• 0 correlating with no adherence to the ethical framework;
• 1, as minimal consideration;
• 2, as some consideration, with potential areas for disregard;
• 3, as greater consideration of the ethical framework;
• and 4 for those solutions which can successfully answer all of the designer questions on the associated cards.
Step 2: Tally the scores. Discuss the merits of the solution with the overall highest score, and those which scored highest within each ethical framework. Use these results to help inform which solution will be selected for further development. (A minimal illustrative sketch of this tally follows below.)
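The post-ideation scoring amounts to summing a small score matrix; the sketch below shows one way to tally it. The solution names and scores are invented for the example.

# Illustrative sketch: tallying 0-4 adherence scores for each candidate
# solution across the three ethical frameworks. All scores are invented.

scores = {
    # solution: {framework: score from 0 to 4}
    "Solution 1": {"consequentialist": 3, "virtue": 2, "deontological": 4},
    "Solution 2": {"consequentialist": 4, "virtue": 4, "deontological": 2},
    "Solution 3": {"consequentialist": 1, "virtue": 3, "deontological": 3},
}

totals = {name: sum(by_framework.values()) for name, by_framework in scores.items()}
best_overall = max(totals, key=totals.get)

print("Totals:", totals)
print("Highest overall score:", best_overall)

# Also surface the top solution within each framework, as Step 2 suggests.
for framework in ("consequentialist", "virtue", "deontological"):
    top = max(scores, key=lambda s: scores[s][framework])
    print(f"Top for {framework} ethics: {top}")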



Moral skills: decision making; analysis; discernment.
Design stages: Empathize | Define | Ideate | Prototype | Test.
'Ethical Frameworks Cards' located on pages 203-204.

[Scoring worksheet: a table with columns for Solutions 1 through 7 and rows for Consequentialist Ethics, Virtue Ethics, Deontological Ethics, and Total.]



NEW MEANING THROUGH CHANGE FACTORS

Through mapping out the experiences provoked by the designed solution, this exercise allows for an analysis and evaluation of shifting cultural meanings. In the theory of embodied interaction, Paul Dourish concludes that spatially performed activities provide contextual foundations for deeper meaning to be developed (Chalmers 72). Through interaction, various affordances within environments are rendered as "transferable knowledge" (Hobye and Lowgren 32) and internalized by the individual. Here, subjective experiencing is comprised of specific activities that involve different skills and abilities (Hobye and Lowgren 32).

Step 1: Begin by mapping out the specific actions that a designed solution may afford, along with the old actions being disrupted. In defining new and unfamiliar actions, consider when your user will first be engaging with the solution in an exploratory manner. As well, if you have created a customer journey map, service design map, or storyboards, look them over to identify smaller micro-actions. These could include utterances, speech, writing, gestures, movements, skill sets, and thought processes (Chalmers 71).

Step 2: Through habitual exposure, these unfamiliar actions are learned and become automatic behaviors as subconscious gestures. Once an action becomes a behavior, the individual becomes engaged in a larger process, with greater meaning attributed to the practice. Looking at the new actions and the disrupted old actions, consider the collective behaviors: what were the old behaviors, and what new behaviors might develop?

Step 3: Once the designed solution is adopted by a larger segment of the population, the behaviors become social behaviors. At this point, the design will contribute to social norms and narratives in subtle and indirect ways through the actions limited and afforded. In considering cultural meaning, what sorts of characteristics, attitudes, and symbolisms arise through the behaviors identified?

[Worksheet layout: columns for Old Design and New Design.]


PHASE 6: ETHICAL DESIGN TOOLKIT

© MANDEEP MANGAT, 2019 MORAL SENSITIVITY MORAL CREATIVITY

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

Adapted from Paul Dourish’s theory of embodied interaction, from Where the Action is: The Foundations of Embodied Interaction.

MORAL ANALYSIS MORAL DISCERNMENT

1. Specific Actions:

2. Behaviors & Practices:

3. Cultural Meaning:

190
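As a minimal sketch (in Python, not part of the original toolkit) of how a team might capture this worksheet digitally, the structure below records the three levels of the mapping, specific actions, behaviors and practices, and cultural meaning, for both the old and the new design; the field names mirror the worksheet and the example entries are hypothetical.

# Minimal sketch: recording the change-factor mapping for an old vs. new design.
# Field names mirror the worksheet; the example entries are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MeaningMap:
    design: str                                                  # "Old Design" or "New Design"
    specific_actions: List[str] = field(default_factory=list)    # Step 1
    behaviors_practices: List[str] = field(default_factory=list) # Step 2
    cultural_meaning: List[str] = field(default_factory=list)    # Step 3

old_design = MeaningMap(
    design="Old Design",
    specific_actions=["handwriting a note"],
    behaviors_practices=["keeping a paper journal"],
    cultural_meaning=["privacy and permanence of personal records"],
)

new_design = MeaningMap(
    design="New Design",
    specific_actions=["dictating a note to a voice assistant"],
    behaviors_practices=["speaking thoughts aloud in shared spaces"],
    cultural_meaning=["normalizing ambient listening devices"],
)

for m in (old_design, new_design):
    print(m.design, "->", m.cultural_meaning)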



NON-STAKEHOLDER CONSIDERATION

As technological innovation has the ability to redistribute power and wealth by displacing existing networks, it is important to consider the impact that the designed solution will have on social equality. Compared to dominant social groups, minorities lack representation within the public lexicon and, as a result, experience a relative disadvantage. Further, they can be less prioritized, or altogether forgotten, in the design process.

Step 1: In defining stakeholders, consider whether any groups have been left out, or whether the designed solution directly or indirectly (through a network effect on other users) impacts a group previously not considered. Look through the 'list of potentially overlooked groups', along with the following definitions and question prompts, to help understand how people may be made socially vulnerable:
• Social inequality: when resources in a given society are distributed unevenly. Particular themes related to technological disruption include: ownership, control, access, and use.
• Marginalization: when specific individuals of society are blocked from resources or opportunities that the majority has access to.
• Discrimination: prejudiced, unfavorable treatment or consideration of a being based on the group, class, or category to which they are perceived to belong.

Additional things to consider:
• Are any of the possible groups disadvantaged through restriction, exploitation, profiling, or further marginalization?
• How might the designed solution shift the dynamic between intended users and non-users?

Identify the vulnerable user group(s), and the type of oppression that could occur through the design.

Step 2: Assuming you have developed a prototype of the design, consider how the designed solution is contributing to further inequality.

Step 3: As a design team, determine if there is a way to resolve the flagged issues (such as a failsafe, an additional feature, or a change to the business model). Return to this list during the design evaluation and after the design has been deployed.

MORAL ADVOCACY | MORAL CREATIVITY | MORAL ANALYSIS | MORAL DISCERNMENT

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

Adapted from the appendix in Future Ethics (Bowles 207-8), and the 'expanding the ethical circle' tool from Ethics in Technology Practice (Santa Clara University).

1. List of potentially overlooked groups:
• Minorities: racial, ethnic, nationality, immigrants
• Social status: classism
• Gender (identity and expression); women
• Sexual orientation discrimination; especially LGBTQ2S
• Religious affiliation
• Linguistic discrimination
• Addiction or mental health
• Age: the elderly, and children
• Workers: the poor, unemployed, employees; minimum wage, contract, and migrant workers
• Non-human others: domesticated animals; wildlife; zoo animals
• Physically or cognitively disabled
• Future generations
• The environment

Worksheet fields:
1. Vulnerable Groups / Type of Oppression (aggregated):
2. Specific ways the design is contributing to inequality:



SPECULATIVE SCENARIOS

Despite the best of intentions, once deployed, designs can lead to unforeseen events through unintended uses. This activity considers how a design will be received and implemented by the public, with emphasis on worst-case scenarios arising through accidental or malicious intent.

Step 1: Using the lists of possible 'bad actors', 'vices', and 'types of crime', come up with speculative dystopic scenarios describing how the design could be used to harm society. Develop each of these concerns into a scenario.

Step 2: Share your scenarios with your group and collectively evaluate the likelihood of each scenario by identifying factors (such as signals, trends, and events) that might accelerate or detract from the scenario's occurrence. If there is previous foresight research (such as a horizon scan), review it now to help inform this process. Based on the insights from this activity, populate the 'drivers' and 'inhibitors' fields.

Step 3: As a group, evaluate the individual probability of each scenario. Assess the drivers and inhibitors based on how likely the events are to occur (i.e. whether they are based on weak or strong signals, or mega trends), along with a time horizon (could the scenario occur tomorrow, in the near future, or in the distant future?). Based on these considerations, establish a ratio of probable to improbable factors, and convert it into a percentage (for example, by dividing the number of probable factors by the total number of factors).
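A minimal sketch (in Python, not part of the original toolkit) of the arithmetic behind Step 3, assuming the group simply counts how many of the identified drivers and inhibitors it judges probable; the factors and judgements below are hypothetical.

# Minimal sketch: turning a probable/improbable assessment of drivers and
# inhibitors into the percentage recorded on the scenario worksheet.
# The factors and judgements below are hypothetical.
factors = {
    "cheap consumer drones (strong signal)": True,    # judged probable
    "new export restrictions (weak signal)": False,   # judged improbable
    "DIY maker communities (trend)": True,
    "platform moderation improves (trend)": False,
}

probable = sum(1 for is_probable in factors.values() if is_probable)
total = len(factors)

probability_pct = round(100 * probable / total)  # e.g. 50
print(f"Scenario probability: {probable}:{total - probable} = {probability_pct}%")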


MORAL SENSITIVITY | MORAL CREATIVITY | MORAL DISCERNMENT

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

Adapted from the 'design noir' activity within the Ethics for Designers toolkit (Gispen), and the 'think about the terrible people' tool within Ethics in Technology Practice (Santa Clara University).

Prompt lists:
Vices: injustice, pride, cowardice, greed, indolence, jealousy, lust, wrath.
Bad Actors: villain, hacker, terrorist, criminal, corporations, military, dictators.
Types of Crime: violent crime, property crime, public order, white collar, organized crime, high-tech crime, enterprise crime.

Worksheet fields:
1. Scenario:
2. Drivers:
2. Inhibitors:
3. Probability: ___ : ___ = ___ %



ETHICAL RISK MATRIX + CONTINGENCY PLANNING

The earlier stages of this activity allow designers to develop their own inner moral compass by assessing levels of wrong, while the later stage provides pathways to mitigate identified risks.

Step 1: As a group, go through the speculative scenarios and map them out on the graph based on the type of human rights violation. Use the following categories and definitions (based on the Universal Declaration of Human Rights and Karel Vasak's three-generations theory):
• Political Rights: covers participation in political life; includes "voting rights, right to a fair trial, equality before the law, and the right to life" (Wikipedia "Civil and Political Rights").
• Civil Rights: providing "physical and mental integrity, life, and safety; protection from discrimination" (Wikipedia "Civil and political rights").
• Individual Rights: fundamental freedoms of privacy and the freedom of thought, speech, religion, press, assembly, and movement (Wikipedia "Civil and political rights").
• Socio-Economic Rights: a right to work, education, housing, an adequate standard of living, health services, food, social security and unemployment benefits (Wikipedia "Economic, social and cultural rights").
• Environmental Rights: focused on environmental protection, such as the right to a healthy environment, natural resources, and sustainability (Wikipedia "Three generations of human rights").

Worksheet: a graph for plotting scenarios, with 'Type of Violation' (Political, Civil, Individual, Socioeconomic, Environmental) on one axis and 'Probability' (25%, 50%, 75%, 100%) on the other.

MORAL ADVOCACY | MORAL DECISION MAKING | MORAL DISCERNMENT

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

Adapted from the risk matrix (a risk assessment methodology), and the ethical risk sweeping tool within Ethics in Technology Practice (Santa Clara University).

Step 2: After you have mapped out the scenarios, compare them against one another based on their moral challenges. Members of the group may vary in opinion on this comparative assessment. If this is the case, use it as an opportunity to communicate and explore understandings of morality.

Step 3: Once the scenarios have been plotted, identify which area of the diagram they fall under, and proceed accordingly:
• Negligible Risk: proceed with deployment.
• Low Risk: establish a plan of action to address these scenarios if they occur.
• Medium Risk: plan beyond initial responsive steps with low-fidelity failsafe initiatives.
• High Risk: deploy after failsafe features have been implemented within the design.
• Catastrophic Risk: hold for deployment until a solution is implemented, or revisit earlier prototypes.

Develop new scenarios in parallel to the development of additional design features.

Ongoing: After deployment of the design solution, continue to monitor the probability of the scenarios every quarter.

In addition to these specific pathways, the results of this exercise should be presented to management and, if possible, signed off on to ensure future accountability and ethical commitment beyond the design team.

Risk matrix: the risk zones (Negligible Risk, Low Risk, Medium Risk, High Risk, Catastrophic Risk) are plotted against 'Type of Violation' (Political, Civil, Individual, Socioeconomic, Environmental) and 'Probability' bands: Very Unlikely (0-20%), Unlikely (21-40%), Possible (41-60%), Likely (61-80%), Expected to occur (81-100%).
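As a minimal sketch (in Python, not part of the original toolkit) of how the matrix lookup could be automated once a team has agreed on zone boundaries: the mapping from probability band and violation severity to risk zone below is a hypothetical example, since the original matrix defines its zones graphically.

# Minimal sketch: classifying a scenario into a risk zone from its probability
# and the severity rank of the rights violation. The severity ordering and zone
# thresholds are hypothetical; the toolkit defines the zones graphically.
PROBABILITY_BANDS = [
    (20, "Very Unlikely"), (40, "Unlikely"), (60, "Possible"),
    (80, "Likely"), (100, "Expected to occur"),
]

# Severity rank per violation type (1 = least severe), a hypothetical ordering.
SEVERITY = {"Individual": 1, "Socioeconomic": 2, "Environmental": 3, "Civil": 4, "Political": 5}

def probability_band(pct: int) -> str:
    for upper, label in PROBABILITY_BANDS:
        if pct <= upper:
            return label
    return "Expected to occur"

def risk_zone(pct: int, violation: str) -> str:
    """Combine probability and severity into one of the five risk zones."""
    band_index = next(i for i, (upper, _) in enumerate(PROBABILITY_BANDS) if pct <= upper)
    score = (band_index + 1) * SEVERITY[violation]   # 1..25, like a classic risk matrix
    if score <= 2:
        return "Negligible Risk"
    if score <= 6:
        return "Low Risk"
    if score <= 12:
        return "Medium Risk"
    if score <= 19:
        return "High Risk"
    return "Catastrophic Risk"

print(probability_band(35), risk_zone(35, "Civil"))   # prints: Unlikely Medium Risk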



VALUES CARD-DECK

| EMPATHIZE | DEFINE | IDEATE | PROTOTYPE | TEST |

Value definitions were developed from the product values activity in Ethical Design Thinking (Honeywell et al.) and the moral agent activity within the Ethics For Designers toolkit (Gispen); the design questions were also developed from the moral agent activity within the Ethics For Designers toolkit (Gispen). The identification of high risk technologies and practices was developed through the trend analysis research and synthesis (Section 3.3. Trend Analysis).

Inclusive / Accessible
The quality of being easily used, approached, or understood by as many people as reasonably possible.
Design Questions:
• Does the design have an easy, guided learning curve, in terms of usage, with consideration for different learning styles?
• How might designers identify and deconstruct their own inner biases about abilities?
• Does the design consider the resourcefulness of users with specific capability limitations?
High Risk Technologies/Practices: Data monetization, terms of service and user consent, top-down practices.

Respect / Civility
Expressions that are polite and courteous, driven by a deep regard and consideration for the wellbeing of others.
Design Questions:
• Are the user's abilities, concerns, and wishes respected?
• Does the design promote respect for diverse social groups, as well as non-persons?
• Does the design provide affordances for the user to be exposed to views that are not their own?
High Risk Technologies/Practices: Personalization algorithms, anonymous social networking, cyber-bullying.

Community / Empathy
A sense of having shared experiences with others.
Design Questions:
• Does the design promote the understanding of others' experiences?
• Does the design encourage the user to explore the notion of "kinship"?
• How does the design promote connection through shared exchanges amongst users?
• Is intimacy and authenticity prioritized over quantity and superficial exchanges?
High Risk Technologies/Practices: Fake news, chat bots, sensationalized click-bait.

Self-Control / Autonomy
The ability to act in a self-deterministic manner, while remaining free from external control or influence.
Design Questions:
• Is the design malleable enough that it can be adapted for different user goals, promoting a D.I.Y. ethic?
• Does the design empower users to become self-sufficient, and eventually no longer dependent on the design?
• Does the design refrain from persuading the user towards a certain action, while unbiasedly presenting all options for use?
High Risk Technologies/Practices: Streaming entertainment services, dark UX design.

Privacy / Security
Free from unwanted public attention or unauthorized disclosure of one's private life, especially damaging publicity and public scrutiny.
Design Questions:
• How does the design handle the user's private data, and is it vulnerable to hacking?
• Is the usage of, and risk to, personal information clearly communicated?
• Is only relevant information collected from the user?
• Are options presented to the user for them to decide the degree of information to disclose?
High Risk Technologies/Practices:

Sustainability / Efficiency
Avoiding depleting natural resources while emphasizing minimizing wasted resources and energy.
Design Questions:
• Does the design support low-impact, non-toxic, local, and sustained practices?
• Does the design involve the sharing economy?
• Is the value of environmental stewardship imparted onto users?
High Risk Technologies/Practices: Electronic hardware, e-commerce.

Honesty / Transparency
Free from deceit; truthful, sincere, and open about intentions and distribution.
Design Questions:
• Does the design avoid manipulating the user's reality in untruthful ways?
• How can the design communicate its intentions, channels, etc. to various users?
• How can the design honestly communicate its value and benefits without over-promising?
• Have the designers ensured that the design can commit and follow through with established user expectations?
High Risk Technologies/Practices: Data usage and monetization, user agreements on services.

Safety / Security
A state of being safe, protected, and defended from danger, risk, crime, injury or ill-being.
Design Questions:
• Does the design prevent misuse that could cause harm directly to the users, as well as to others?
• Does the design instill a sense of safety in the user while engaging with it?
• Does the design ensure the user's security is not breached?
• Does the design guide the user in secure usage and behavior?
High Risk Technologies/Practices: Rapid prototyping, surveillance, autonomous weapons.

Justice / Equity
Equitable distribution of benefits, disadvantages, and risks with consideration to the needs of various groups.
Design Questions:
• Does the design break down implicit biases while ensuring the prevention of new ones?
• Does the designed solution provide opportunities for all users involved?
• Has the risk of disproportionate impact (benefiting one segment of the population while being detrimental to another) been evaluated?
High Risk Technologies/Practices: AI bias, biased data sets, workplace automation.

Trust / Accountability
Conveying a sense of one's integrity, confidence in one's reliability, and follow-through regarding commitments.
Design Questions:
• Is the design hiding something that users may not agree with?
• How can the designers clearly communicate expectations, intentions, and risks?
• How can the design convey commitment to ensuring trust?
• How might designers facilitate feedback from users that are seeking more information?
High Risk Technologies/Practices: Deep fakes, video faking and editing.

Courage
The ability to fearlessly be oneself in the face of difficulty, danger, or disapproval.
Design Questions:
• How can the design allow the user to break out of social norms?
• How can the user be supported in overcoming fear or the perception of risks?
• How does the design encourage users to distinguish their own ideals from society's?
• How does the design reward users for genuinely expressing themselves?
High Risk Technologies/Practices: Surveillance, social-rating and social-credit systems.

Mastery / Perseverance
The act of mastering a skill or subject so as to become an expert.
Design Questions:
• Does the design support and encourage users to improve themselves?
• Does the design provide feedback for users to track their growth?
• How might the design be adapted for different skill aspirations?
• How might users be inspired to challenge themselves with something new?
High Risk Technologies/Practices: Automated decision making, AI assistants, personalization filters.

Curiosity / Flexibility
The openness and desire to learn or know about something; a natural inquisitiveness.
Design Questions:
• Does the design teach users something new?
• How can the design pleasantly surprise users and increase their interest to learn more?
• Does the design prevent misinformation or ignorance?
• How might users be inspired to seek out something new?
High Risk Technologies/Practices: Fake news, news feeds, social media, search engines.
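For teams that want to filter the deck digitally, for example to pull up the cards whose high-risk practices mention a technology they are working with, here is a minimal sketch (in Python, not part of the original toolkit); only two cards are transcribed and the lookup term is hypothetical.

# Minimal sketch: the value cards as data, with a simple lookup of which cards
# flag a given technology or practice as high risk. Only two cards are shown.
VALUE_CARDS = {
    "Justice / Equity": {
        "definition": "Equitable distribution of benefits, disadvantages, and risks.",
        "questions": ["Does the design break down implicit biases while preventing new ones?"],
        "high_risk": ["AI bias", "biased data sets", "workplace automation"],
    },
    "Curiosity / Flexibility": {
        "definition": "The openness and desire to learn or know about something.",
        "questions": ["Does the design teach users something new?"],
        "high_risk": ["fake news", "news feeds", "social media", "search engines"],
    },
}

def cards_flagging(term: str) -> list:
    """Return the value cards whose high-risk list mentions the given term."""
    term = term.lower()
    return [name for name, card in VALUE_CARDS.items()
            if any(term in practice.lower() for practice in card["high_risk"])]

print(cards_flagging("news"))   # ['Curiosity / Flexibility']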



ETHICAL FRAMEWORKS CARD-DECK

While these cards were developed from the research summary of the philosophy of ethics (Section 3.2. Moral Philosophy), they were also supplemented by adapting: the normative design scheme within the Ethics For Designers toolkit (Gispen); the theory cards from the Machine Ethics Toolkit (Zhou); and the conceptual frameworks package within Ethics in Technology Practice (Santa Clara University).

Consequentialist Ethics
Definition of True North: Utilitarianism: the greatest amount of good for the greatest amount of people.
Brief Summary: The consequences of our actions are weighted relative to their impact on the happiness of humanity.
Designer Questions:
• What are the long-term impacts on the direct user and society?
• Will the sum of the effects create more good than harm?
• How will this design serve the needs of the community, not just a select few?

Deontological Ethics
Definition of True North: Golden Rule: do unto others as you would be done by.
Brief Summary: Actions are morally right if they follow, or can become, moral rules, rights, principles, and duties that everyone can abide by.
Designer Questions:
• Would I use my own design, or let my loved ones use it?
• Have we explored the various rights and duties we have to our users through this design?
• Are the decisions and intentions behind the design process universally acceptable?

Virtue Ethics
Definition of True North: Cultivating virtues (as values) through intended and repeated action.
Brief Summary: Pursue happiness by developing individual moral character, demonstrating positive virtues in all actions.
Designer Questions:
• What actions, skills and habits of character does the design embody?
• Where does my user fall on the path to becoming a virtuous person?
• How could my design stimulate virtuous behavior?
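As a minimal sketch (in Python, not part of the original toolkit) of how the designer questions could be kept in front of the team during ideation, for example printed as a checklist at the top of a review document; only the Virtue Ethics card is transcribed here and the structure is hypothetical.

# Minimal sketch: printing the designer questions from the framework cards as an
# ideation checklist. Only the Virtue Ethics card is transcribed here.
FRAMEWORK_CARDS = {
    "Virtue Ethics": [
        "What actions, skills and habits of character does the design embody?",
        "Where does my user fall on the path to becoming a virtuous person?",
        "How could my design stimulate virtuous behavior?",
    ],
}

def print_checklist(frameworks=FRAMEWORK_CARDS):
    """Print each framework's designer questions as unchecked checklist items."""
    for name, questions in frameworks.items():
        print(f"## {name}")
        for q in questions:
            print(f"[ ] {q}")

print_checklist()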



ADDITIONAL RESOURCES

The need for ethical innovation, while still relatively new, is of such importance that many professionals are simultaneously approaching the topic. The following resources proved invaluable in helping guide the prototyping and development of my ethical design toolkit. I have provided them here for those who are further interested in ethical design.

• A toolkit and workshop materials for Machine Ethics, by James Zhou, a graduate thesis for the Copenhagen Institute of Interaction Design. jamesxzhou.com/work/machine-ethics-toolkit/. This student project is intended to teach fellow designers the role of ethics by exploring potential design and interaction scenarios with autonomous robots.

• Actionable Futures Toolkit, by Nordkapp, a strategic design and innovation consultancy. futures.nordkapp.fi/download/. A toolkit comprised of foresight activities that emphasize consideration of, and planning for, a desirable future. Intended for organizations and services.

• Ethics for Designers, by Jet Gispen, a graduate thesis for Delft University of Technology. https://www.ethicsfordesigners.com/. A graduate project that provides templates for ethical design activities for designers. These activities are based on the 3 theories of applied ethics, and include value cards.

• Ethical Design Thinking, by Alex Honeywell, Amanda Poh and Maheen Sohail, from Simon Fraser University. drive.google.com/file/d/0B_BduP_zkbNWZWVRSkpxNGtTRG8/view. Created as a student project, this toolkit offers 5 value-centered design exercises intended to be run in a workshop.

• Ethics in Technology Practice, by Shannon Vallor, Brian Green, and Irina Raicu, from The Markkula Center for Applied Ethics at Santa Clara University. scu.edu/ethics-in-technology-practice/. Intended for engineers and technologists, this body of work includes: an overview of ethics in technology practice; conceptual frameworks for the various ethical thought-processes; a framework for ethical decision-making pathways; and an ethical toolkit for integrating ethical consideration into engineering practices.

• Future Ethics, by Cennydd Bowles. https://www.cennydd.com/. Intended for designers, Future Ethics provides a critical perspective on the ethics of emerging technology, along with numerous industry-specific suggestions. While the book does not provide suggestions for ethical design activities, it was instrumental in helping shape this thesis paper. In particular, it provided a strong foundation for the trend analysis and guided me to the concept direction of ethical innovation.


REFERENCES Part 1: Byrd, Christine. “What the PerceptionAction Cycle Tells Us About How The Brain Learns”. Web blog post. Mind Research Institute. MIND Research Institute, 2018. blog.mindresearch.org/blog/perceptionaction-cycle. Accessed Sept. 2018. Soppelsa, Peter. “Cities in the Technosphere”. Inhabiting the Anthropocene. Wordpress, 2017. inhabitingtheanthropocene.com/2017/11/29/ cities-in-the-technosphere/. Accessed Sept. 2018. Berman, Alison and Jason Dorrier. “Technology Feels Like It’s Accelerating - Because It Actually Is”. Singularity Hub. Singularity University, 2016. singularityhub.com/2016/03/22/technologyfeels-like-its-accelerating-because-itactually-is/. Accessed Sept. 2018. Bostrom, Nick. “The Future of Humanity.” The Future of Humanity. nickbostrom.com/ papers/future.html. Accessed 20 Sept. 2018. Hughes, James and Nick Bostrom, and Jonathan Moreno. “Human vs. Posthuman”. The Hastings Centre Report, Vol. 37, No. 5 2007, pp. 4-7. Papagiannis, Helen. “Augmented Human”. Sebastopol, O’Reilly Media, 2017. Pepperell, Robert. “The Posthuman Conception of Consciousness: A 10-point Guide”. Consciousness Reframed III. Graham, Elaine. “Representations of the Post/Human”. New Jersey, Rutgers University Press, 2002. Maldonato, Mauro and Paolo Masullo, editors. “Posthuman: Consciousness and 207

Pathic Engagement”. Portland, Sussex Academic Press, 2017. “What impact will Artificial Intelligence have on the lives of ‘Generation Alpha’? A study of Millennial parents of Generation Alpha kids”. Transmitter, IEEE, 2017. transmitter.ieee.org/ wp-content/uploads/2017/06/Gen-AIInfographic-V2.6.pdf. Accessed Oct 2018. Tootell, Holly, et al. “Generation alpha at the intersection of technology, play and motivation”. University of Wollongong Australia, 2014. ieeexplore.ieee.org/ document/6758614. Accessed Sept 2018. Edmondson, Jerry. “Connecting with a new generation”. AON, 2018. aon.com/ unitedkingdom/employee-benefits/news/ articles/connecting-with-a-new-generation. jsp. Accessed Sept. 2018. Carter, Christine Michel. “The Complete Guide To Generation Alpha, The Children Of Millennials”. Forbes. Forbes Media LLC, 2016. forbes.com/sites/ christinecarter/2016/12/21/the-completeguide-to-generation-alpha-the-children-ofmillennials/#635c4a8a3623. Accessed 2018. Fourtané, Susan. “Generation Alpha: The Children of the Millenial”. Interesting Engineering, 2018. https:// interestingengineering.com/generationalpha-the-children-of-the-millennial. Accessed 2019.



Part 2: Accenture and World Economic Forum. “Industrial Internet of Things: Unleashing the Potential of Connected Products and Services”. World Economic Forum. reports. weforum.org/industrial-internet-of-things/ appendix-a-about-the-research/. Accessed 2019. Aquileana. “Aristotles Nichomachean Ethics: “Three Types of Friendship” (Based on Utility, Pleasure, and Goodness)”. WordPress.com, 2014. aquileana.wordpress. com/2014/02/11/aristotles-nichomacheanethics-three-types-of-friendship-based-onutility-pleasure-and-goodness/. Accessed 2018. Banerji, Ritwik. “De-instrumentalizing HCI: Social Psychology, Rapport Formation, and Interactions with Artificial Social Agents”. New Directions in Third Wave Human-Computer Interaction: Volume 1 - Technologies, edited by Michael Filimowicz, Veronika Tzankova, 2018, pp. 43-66. Chan, Jeffrey. “Design ethics: Reflecting on the ethical dimensions of technology, sustainability, and responsibility in the Anthropocene”. Design Studies, Vol. 5, 2018, pp. 184-200. doi.org/10.1016/j. destud.2017.09.005. Davis, Nicholas. “What is the Fourth Industrial Revolution”. World Economic Forum, 19 Jan. 2016. weforum.org/ agenda/2016/01/what-is-the-fourthindustrial-revolution/. Accessed 2019. Deragon, Jay. “The Influence of Technology on Humanity”. The Relationship Economy; Technology and the Human Network, 2011. relationship-economy. com/2011/01/technologys-influence-overhumanity/. Accessed Oct. 2018.

Dumouchel, Paul and Luisa Damiano. “Living With Robots”. Cambridge, Harvard University Press, 2017. Feenberg, Andrew. “What is Philosophy of Technology?”. Transcript. Lecture for the Komaba undergraduates, June 2003. http:// www.sfu.ca/~andrewf/komaba.htm. Accessed Nov. 2018. Gerry Johnson, Whittington, and Scholes. “Fundamentals of Strategy”. FT Press, 2011. Haney, William. “Cyberculture, Cyborgs, and Science Fiction: Consciousness and the Posthuman in Short Fiction”. Short Story Criticism, Ed. Lawrence J. Trudeau, Vol. 235, 2006. IEEE. “What impact will Artificial Intelligence have on the lives of ‘Generation Alpha’? A study of Millennial parents of Generation Alpha kids”. IEEE, 2017. transmitter.ieee.org/wp-content/ uploads/2017/06/Gen-AI-InfographicV2.6.pdf. Accessed Oct 2018. Jaspers, Karl. “The Origin and Goal of History”. Translated by Michael Bullock. New Haven, CT: Yale University Press, 1953. Johnson, Gerry, et al. “Fundamentals of Strategy.” 4th ed., Harlow, 2018. King, Georgia Frances. “Supercomputing could solve the world’s problems, and create many more”. World Economic Forum, Feb 2019. weforum.org/agenda/2019/02/ supercomputing-could-solve-many-of-theworld-s-problems-and-create-many-more/. Accessed 2019.


Little, William. “Chapter 22: Social Interaction”. Introduction to Sociology - 2nd Canadian Edition, 2016. Retrieved from: opentextbc.ca/ introductiontosociology2ndedition/. Accessed 2018. McGinnis, Devon. “What is the Forth Industrial Revolution?”. Salesforce, Dec. 2018. salesforce.com/blog/2018/12/what-isthe-fourth-industrial-revolution-4IR.html. Accessed 2019. Park, Sowon. ““Who are these people?”: Anthropomorphism, Dehumanization and the Question of the Other”. Arcadia, vol. 48, no.1, 2013, pp. 1-14. academia. edu/7115588/_Who_are_these_people_ Anthropomorphism_Dehumanization_and_ the_Question_of_the_Other?auto=download. Accessed 2019. Pepperell, Robert. “Posthumans and Extended Experience”. Journal of Evolution and Technology, Vol.14, April 2005. jetpress.org/volume14/pepperell. pdf. Accessed Sept. 2018. Philosophy of Technology and Design. “How can we anticipate the moral dimensions of technology-in-design?”. FutureLearn, 2019. futurelearn.com/courses/philosophyof-technology/4/steps/524756. Accessed 2019. Postman, Neil. “Technopoly: The Surrender of Culture to Technology”. First Vintage Books Edition, 1993. Rosenberg, Don. “How 5G will change the world”. World Economic Forum, Jan. 2018. weforum.org/agenda/2018/01/the-world-isabout-to-become-even-more-interconnectedhere-s-how/. Accessed 2019.


Samsonoff, Nathan and Tampilic, Michael. “On the Origin of Things”. MISC; A Journal of Strategic Foresight and Insight, vol.22, 2016, pp. 110-15. Schwab, Klaus. “The Fourth Industrial Revolution: what it means, how to respond”. World Economic Forum, 14 Jan 2016. weforum.org/agenda/2016/01/thefourth-industrial-revolution-what-itmeans-and-how-to-respond. Accessed 2019. Shah, Druv. “AI, Machine Learning, & Deep Learning Explained in 5 Minutes”. Medium, Apr. 2018. becominghuman.ai/aimachine-learning-deep-learning-explainedin-5-minutes-b88b6ee65846. Accessed 2019. Shukla, Rajesh. “Technology and the Remaking of Human Existence”. Technology and The Changing Face of Humanity, University of Ottawa Press, 2010, pp. 176190. Silva, Hugo Da. “4 ways 3D printing can revolutionize manufacturing”. World Economic Forum, Nov. 2018. weforum.org/ agenda/2018/11/4-ways-3d-printing-canrevolutionize-manufacturing. Accessed 2019. University of Twente. “What can we learn from Don Idhe?”. FutureLearn, 2019. futurelearn.com/courses/philosophy-oftechnology/0/steps/26324. Accessed Nov. 2018. University of Twente. “Technology mediates - to the very heart of political life”. University of Twente, 2017. utwente. nl/en/news/!/2017/9/183619/technologymediates-to-the-very-heart-of-politicallife. Accessed Oct. 2018. Verbeek, Peter-Paul. “Mediation Theory”.



Peter-Paul Verbeek, Wordpress. ppverbeek. wordpress.com/mediation-theory/. Accessed Oct. 2018. Vairavan, Arvind. “5 Biotech Trends shaping the future”. Medium, Jan 2019. medium.com/datadriveninvestor/5-biotechtrends-shaping-the-future-68279160f707. Accessed 2019.


Wikipedia. “Social Stratification”. Wikipedia; the Free Encyclopedia. en.wikipedia.org/wiki/Social_ stratification. Accessed June 2019. Wikipedia. “Socioeconomic Status”. Wikipedia; the Free Encyclopedia. en.wikipedia.org/wiki/Socioeconomic_ status#Occupation. Accessed June 2019.

Wikipedia. “5G”. Wikipedia, The Free Encyclopedia, 2019. en.wikipedia.org/ wiki/5G. Accessed 2019.

Wikipedia. “Technopoly”. Wikipedia, The Free Encyclopedia, Aug 2017. en.wikipedia. org/wiki/Technopoly. Accessed Oct. 2018.

Wikipedia. “Designation of workers by collar color”. Wikipedia; the Free Encyclopedia. en.m.wikipedia.org/wiki/ Designation_of_workers_by_collar_color. Accessed June 2019.

Wikipedia. “Vehicular Automation”. Wikipedia, The Free Encyclopedia, 2019. en.wikipedia.org/wiki/Vehicular_ automation. Accessed 2019.

Wikipedia. “Nanotechnology”. Wikipedia, The Free Encyclopedia, 2019. en.wikipedia. org/wiki/Nanotechnology#Research_and_ development. Accessed 2019. Wikipedia. “Outline of relationships”. Wikipedia; The Free Encyclopedia. en.wikipedia.org/wiki/Outline_of_ relationships. Accessed June 2019. Wikipedia. “Power (Social and Political)”. Wikipedia; the Free Encyclopedia. en.wikipedia.org/wiki/Power_ (social_and_political). Accessed 2019. Wikipedia. “Relational models theory”. Wikipedia; the Free Encyclopedia. en.wikipedia.org/wiki/Relational_models_ theory. Accessed June 2019. Wikipedia. “Robotics”. Wikipedia, The Free Encyclopedia, 2019. en.wikipedia. org/wiki/Robotics#Social_Intelligence. Accessed 2019.


Part 3: Section 3.2:

Alexander, Larry and Moore, Micael. “Deontological Ethics”. Edited by Edward N. Zalta, The Stanford Encyclopedia of Philosophy, 2016. https://plato.stanford. edu/archives/win2016/entries/ethicsdeontological/. Accessed 2018. Bowles, Cennydd. “Future Ethics”. Brighton, NowNext Press, September 2018. Hursthouse, Rosalind and Pettigrove, Glen. “Virtue Ethics”. Edited by Edward N. Zalta, The Stanford Encyclopedia of Philosophy, 2018. https://plato.stanford. edu/archives/win2018/entries/ethicsvirtue/. Accessed 2018. Vallor, Shannon, Brian Green, and Irina Raicu. “Ethics in Technology Practice: Conceptual Frameworks”. The Markula Center for Applied Ethics at Santa Clara University, 2018. https://www.scu.edu/ media/ethics-center/technology-ethics/ Conceptual-FrameworksFinalOnline.pdf. Accessed 2018. Vallor, Shannon. “Technology and the Virtues: a philosophical guide to a future worth wanting”. New York, Oxford University Press, October 2016. Wikipedia. “Ethics”. Wikipedia: the Free Encyclopedia, 2018. https://en.wikipedia. org/wiki/Ethics. Accessed 2018.

Section 3.3.:

AI Now Institute. “AI Now Report 2018”. AI Now Institute, December 2018. https:// ainowinstitute.org/AI_Now_2018_Report.pdf. Accessed 2019. Andriole, Steve. “Already too big to fail - the digital oligarchy is alive, well (& growing)”. Forbes, July 211

2017. https://www.forbes.com/sites/ steveandriole/2017/07/29/already-too-bigto-fail-the-digital-oligarchy-is-alivewell-growing/#3085dd2767f5. Accessed 2019. Balkan, Aral. “You Are The Product.” See Conference, April 2016. https://2017.ind. ie/services/talks/. Bergoffen, Debra. “Simone de Beauvoir”. The Stanford Encyclopedia of Philosophy, Edited by Edward N. Zalta, 2018. https:// plato.stanford.edu/archives/ fall2018/ entries/beauvoir/. Accessed 2019. Borgesius, Frederik Zuiderveen, et al. “Should we worry about filter bubbles?”. Internet Policy Review, Journal on Internet Regulation, 5:1, 2016. https:// ssrn.com/abstract=2758126. Accessed 2018. Bowles, Cennydd. “Future Ethics”. Hove, NowNextPress, 2018. Print. Brignull, Harry. “Types of Dark Patterns”. Dark Patterns, Digital. https://www. darkpatterns.org/types-of-dark-pattern. Accessed 2018. Caron, Christina. “‘Ghost Guns,’ Homemade and Untraceable, Face Growing Scrutiny”. The New York Times Company, 2017. https:// www.nytimes.com/2017/11/27/us/ghost-gunsgabby-giffords.html. Accessed 2019. Chu, Ben. “What is ‘nudge theory’ and why should we care?” The Independent, October 2017, Digital. https://www.independent. co.uk/news/business/analysis-and-features/ nudge-theory-richard-thaler-meaningexplanation-what-is-it-nobel-economicsprize-winner-2017-a7990461.html. Accessed 2018.



Fan, Haiyan and Poole, Marshall Scott. “What is personalization? Perspectives on the Design and Implementation of Personalization in Information Systems”. Journal of Organizational Computing and Electronic Commerce, 2006, 16:3, 179-202. DOI: 10.1207/s15327744joce1603&4_2. Fienstein, Rapheal. “Addiction by design; when habits cross the line”. Noteworthy - The Journal Blog, Medium, March 2018. https://blog.usejournal.com/addictionby-design-when-habits-cross-the-line1b7fd26b6a5a. Accessed 2018. Forward Thinking Platform. “A Glossary of Terms commonly used in Futures Studies”. Forward Thinking Platform and Global Forum on Agricultural Research, September 2014. http://www.fao.org/docs/eims/ upload/315951/Glossary%20of%20Terms.pdf. Accessed 2018. Government Office for Science. “The Futures Toolkit: Tools for Futures Thinking and Foresight Across UK Government”. GO-Science, by Waverly Consultants, Ed. 1, 2017. https://assets. publishing.service.gov.uk/government/ uploads/system/uploads/attachment_data/ file/674209/futures-toolkit-edition-1.pdf. Accessed 2018. Haim, Mario and Graefe, Andreas and Brosius, Hans-Bernd. “Burst of the Filter Bubble?”. Digital Journalism, 6:3, 2017, pp 330-343, DOI: 10.1080/21670811.2017.1338145. Harari, Yuval. “Yuval Harari: Hacking Humanity.” IDEAS, by Paul Kennedy, CBC Radio, 2018, https://www.cbc.ca/ radio/ideas/yuval-harari-hackinghumanity-1.4810248.


Hazel, Davis. “Have we become subscription addicts?”. The Telegraph, Digital, December 2016. https://www. telegraph.co.uk/business/sme-home/uk-smesaddicted-to-subscriptions/. Accessed 2018. IFTF, and Omidyar Network. “Ethical OS: a guide to anticipating the future impact of today’s technology”. Digital Intelligence Lab at the Institute for Future, and Tech and Society Solutions Lab at Omidyar Network, September 2018. https://ethicalos.org/. Accessed 2018. Lekkas, Nicolas. “The Big Five Tech Companies”. Growthrocks, April 2019. https://growthrocks.com/blog/big-fivetech-companies-acquisitions/. Accessed 2019. Levine, Matt. “Nothing is free (unless you sign up)”. Bloomberg L.P., Digital, May 2018. https://www.bloomberg.com/opinion/ articles/2018-05-18/silicon-valley-ssubscription-free-for-all. Accessed 2018. Marx, Paris. “The gig economy has grown big, fast — and that’s a problem for workers”. Vox, Oct 2016. https://www. vox.com/2016/10/26/13349498/gig-economyprofits-workers-desperate-services-labor. Accessed 2018. Mashetty, Ramakrishna. “Four key differences between RPA and cognitive automation”. EPAM Systems, Inc., Sept 2018. https://www.epam.com/insights/blogs/ four-key-differences-between-rpa-andcognitive-automation. Accessed 2018. Mateescu, Alexandra and Nguyen, Aiha. “Explainer: Algorithmic Management in the Workplace”. Data and Society Research Institute, Feb. 2019. https://datasociety. 212



net/wp-content/uploads/2019/02/DS_ Algorithmic_Management_Explainer.pdf. Accessed 2019.

Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Black_hat_(computer_ security). Accessed 2019

Munn, Luke. “Alt-right pipeline: Individual Journeys to extremism online”. First Monday [Online], 24.6, 2019. journals.uic.edu/ojs/index.php/fm/article/ view/10108/7920. Accessed 2019.

Wikipedia. “Dark Pattern”. Wikipedia, The Free Encyclopedia. https://en.wikipedia. org/wiki/Dark_pattern. Accessed 2018.

Rothbart, Davy. “The Mesmerizing World of Homemade Weapons.” Topic Magazine, First Look Media, 2017. https://www.topic.com/the-mesmerizingworld-of-homemade-weapons. Accessed 2019. Ryan, Peter. “Technology: the new addiction”. Proceedings, U.S. Naval Institute, vol. 144, September 2018. https://www.usni.org/magazines/ proceedings/2018/september/technology-newaddiction. Accessed 2019. Silins, Valdis. “Stumbling Toward Excellence: Wandering and Tolerating Ambiguity”. MISC: Searching for Excellence, vol. 24, 2017, pp. 10-11. Teper, Rimma. “Neuromarketing: Thinking ahead to 2020”. MISC, Idea Couture Inc., Fall 2016, pp 24-25. Vallor, Shannon. “Technology and the Virtues: a philosophical guide to a future worth wanting”. New York, Oxford University Press, October 2016. Waddell, Kaveh. “The death of prices”. Axios, April 2019. https://www.axios.com/ future-of-retail-amazon-surge-pricingbrick-and-mortar-b6a5f9fe-130f-4601-b96fa3dc7a69b54e.html. Accessed 2019. Wikipedia. “Black hat (computer security”. 213


Wikipedia. “Dynamic pricing”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Dynamic_pricing. Accessed 2019. Wikipedia. “Echo Chamber (media)”. Wikipedia, The Free Encyclopedia. en.wikipedia.org/wiki/Echo_chamber_ (media). Accessed 2018. Wikipedia. “Empathy”. Wikipedia, The Free Encyclopedia. en.wikipedia.org/wiki/ Empathy. Accessed 2019. Wikipedia. “Maker culture”. Wikipedia, The Free Encyclopedia. https://en.wikipedia. org/wiki/Maker_culture. Accessed 2019. Wikipedia. “Marxism”. Wikipedia, The Free Encyclopedia. https://en.wikipedia. org/wiki/Marxism. Accesssed 2018. Wikipedia. “Persuasion”. Wikipedia, The Free Encyclopedia. https://en.wikipedia. org/wiki/Persuasion. Accessed 2018. Wikipedia. “Recommender system”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Recommender_system. Accessed 2018. Wikipedia. “Security hacker”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Security_ hacker#Black_hat. Accessed 2019. Wikipedia. “Structural Unemployment”.




Wikipedia, The Free Encyclopedia. https://en.wikipedia.org/wiki/Structural_ unemployment. Accessed 2018. Wikipedia. “Technological Unemployment”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Technological_ unemployment. Accessed 2018.


Part 4: Barad, Karen. “Inverterbrate Visions: Diffractions of the Brittlestar”. The Multispecies Salon, edited by Eben Kirksey, Duke University Press, 2014, pp. 221-241. Baxter, Michael. “It’s time to study machine behaviour across disciplines, finds MIT paper”. Bonhill Group, April 2019. https://www.information-age.com/ machine-behaviour-123482086/# Lewis, Jason Edward, et al. “Making Kin with the Machines”. Journal of Design and Science, 2018. https://doi.org/10.21428/ bfafd97b. Accessed 2018. Bowles, Cennydd. “Future Ethics”. Brighton, NowNext Press, September 2018. DeMello, Margo. “Animals and Society: An Introduction to Human-Animal Studies”. New York, Columbia University Press, 2012. Gouaux-Langlois, Aude and Belinda Sykora. “Thinking AI’s Voices: Gender and Identity”. Conditiohumana: Technology, AI, and ethics, S&S Media. https:// conditiohumana.io/ai-voices-gender/. Accessed 2019. Haraway, Donna. “Speculative Fabulations for Technoculture’s Generations: Taking Care of Unexpected Country”. The Multispecies Salon, edited by Eben Kirksey, Duke University Press, 2014, pp. 242-261. Hurley, Denis. “Technical & Human Problems With Anthropomorphism & Technopomorphism”. Medium, 2017. https:// medium.com/emergent-future/technicalhuman-problems-with-anthropomorphismtechnopomorphism-13c50e5e3f36. Accessed 2018.


Kirksey, Eben, et al. “Life in the Age of Biotechnology”. The Multispecies Salon, edited by Eben Kirksey, Duke University Press, 2014, pp. 185-220. Kraut, Richard. “Aristotle’s Ethics”. The Stanford Encyclopedia of Philosophy (2018 ed.), edited by Edward N. Zalta. https:// plato.stanford.edu/archives/sum2018/ entries/aristotle-ethics/. Accessed 2019. Matthewman, Steve. “Technology and Social Theory”. Hampshire, Palgrave Macmillan, 2011. Nast, Heidi. Review of “The companion species manifesto: dogs, people, and significant otherness”, by Donna Haraway. Cultural Geographies, vol 12, 2005, pp. 118-120. https://doi. org/10.1177/147447400501200113. Accessed 2019. Rahwan, Iyad, et al. “Machine behaviour”. Nature, vol 568, 2019, pp. 477-486. https://doi.org/10.1038/s41586-019-1138-y. Accessed 2019. Rahwan, Iyad, and Manuel Cebrian. “Machine Behavior Needs to Be an Academic Discipline”. Nautilus, March 2018. http:// nautil.us/issue/58/self/machine-behaviorneeds-to-be-an-academic-discipline. Accessed 2019. Roudavski, Stanislay, and Jon McCormack. “Post-anthropocentric Creativity”. Digital Creativity, vol 27, 2016, pp. 3-6. https:// www.tandfonline.com/doi/full/10.1080/146262 68.2016.1151442. Accessed 2019. Santas, Aristotelis. “Subject/Object Dualism and Environmental Degradation”. Philosophical Inquiry International Quarterly, vol 21., 3-4, 1999. https://



www.academia.edu/16381910/Subject_Object_ Dualism_and_Environmental_Degradation. Accessed 2019. Stickdorn, Mark, et al. “This Is Service Design Doing: Applying Service Design Thinking In The Real World: A Practitioners’ Handbook”. Sebastopol, O’Reilly Media, Inc., 2018. Taylor, D. R. Fraser, et al. “Cybercartography for Education: The Application of Cybercartography to teaching and learning in Nunavut, Canada”. Modern Cartography Series, vol. 5, 2014, pp. 297-324. https://doi.org/10.1016/B9780-444-62713-1.00020-9. Toscano, Joe. “Automating Humanity”. New York, powerHouse Books, 2018. UNESCO. “I’d Blush If I Could: Closing Gender Divides in Digital Skills Through Education”. UNESCO for the EQUALS Skills Coalition, one of three coalitions that comprise the EQUALS partnership, 2019. https://unesdoc.unesco.org/ark:/48223/ pf0000367416/PDF/367416eng.pdf.multi. page=1. Accessed 2019. Vallor, Shannon. “Technology and the Virtues: a philosophical guide to a future worth wanting”. New York, Oxford University Press, October 2016. Vallor, Shannon, and William J. Rewak. “An Introduction to Data Ethics”. Santa Clara University, 2018. https://www.scu. edu/media/ethics-center/technology-ethics/ IntroToDataEthics.pdf. Accessed 2019.


Wikipedia. “Capitalism”. Wikipedia, The Free Encyclopedia, 2018. https:// en.wikipedia.org/wiki/Capitalism. Accessed 2018. Wikipedia. “Colonialism”. Wikipedia, The Free Encyclopedia, 2019. https:// en.wikipedia.org/wiki/Colonialism. Accessed 2019. Wikipedia. “Interpersonal relationship”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Interpersonal_ relationship#Pathological_relationships. Accessed 2019. Wikipedia. “List of life sciences”. Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/List_of_life_ sciences#Applied_life_science_branches_ and_derived_concepts. Accessed 2019. Wikipedia. “Nurture kinship”. W Wikipedia, The Free Encyclopedia. https:// en.wikipedia.org/wiki/Nurture_kinship. Accessed 2019. Wikipedia. “Other (philosophy)”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/Other_ (philosophy). Accessed 2019. Wikipedia. “Species homogeneity”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/Species_ homogeneity. Accessed 2019.

Wikipedia. “Anthropocene”. Wikipedia, The Free Encyclopedia, 2018. https:// en.wikipedia.org/wiki/Anthropocene. Accessed 2018. 216



Part 5: “Business Model Canvas”. Strategyzer. strategyzer.com/canvas/business-modelcanvas. Accessed 2019. “Design Approach”. William McDonough + Partners. mcdonoughpartners.com/designapproach/. Accessed 2019. Babst, Stefanie. “NATO’s strategic foresight: Navigating between Black Swans, Butterflies and Black Elephants”. Munich Security Conference. securityconference. de/news/article/natos-strategic-foresight-navigating-between-black-swans-butterflies-and-black-elephants/. Accessed 2019. Bowles, Cennydd. “Future Ethics”. Brighton, NowNext Press, September 2018. Dempster, Beth. “Sympoietic and autopoietic systems: A new distinction for self-organizing systems”. Semantics Scholar, 2000. pdfs.semanticscholar.org/4429/9317a20afcd33b0a11d3b2bf4fc196088d45.pdf. Accessed 2019/ Dufva, Mikko. “The Horizon Scanning Zoo”. Ennakointikupla, 2016. ennakointikupla. fi/blog/index.php/2016/04/11/the-horizonscanning-zoo. Accessed 2019. Haraway, Donna. “Anthropocene, Capitalocene, Chthulucene: Staying with the Trouble”. Open Transcripts, Anthropocene: Arts of Living on a Damaged Planet, 2014. opentranscripts.org/ transcript/anthropocene-capitalocenechthulucene/. Accessed 2019. History of Consciousness. “Donna Haraway ‘Sympoiesis: Becoming-with in Multispecies Muddles’”. UC Santa Cruz, 2013. histcon.ucsc.edu/news-events/news/donna-13.html. Accessed 2019. 217

Lohrey, Jackie. “What Are Primary, Secondary & Tertiary Stakeholders?”. BizFluent, Leaf Group Ltd., 2017. bizfluent.com/info-8353421-primarysecondary-tertiary-stakeholders.html. Accessed 2019. Luther, Carol. “The definition of a tertiary stakeholder”. Chron, Hearst Newspapers, LLC.. smallbusiness.chron.com/definition-tertiary-stakeholder-37756.html. Accessed 2019. Pigneur, Yves and Osterwalder, Alexander. “Business Model Generation”. New Jersey, John Wiley & Sons, Inc., 2010. Sanborn, Bonnie. “Speculative Design for a Better Future”. DLR Group, 2018. dlrgroup. com/media/articles/sanborn-speculative-design-process. Accessed 2018. Santa Clara University. “A Framework for Ethical Decision Making”. The Markkula Center for Applied Ethics at Santa Clara University, 2009. scu.edu/media/ ethics-center/ethical-decision-making/A-Framework-for-Ethical-Decision-Making.pdf Virilio Paul. “Politics of the Very Worst: An Interview with Philippe Petit”. Translated by Michael Cavaliere, edited by Sylvère Lotringer, New York, Semiotext(e), 1999.




Part 6:
Bostrom, Nick. "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology, vol. 9, 2002. nickbostrom.com/existential/risks.pdf. Accessed 2018.
Bowles, Cennydd. "Future Ethics". Brighton, NowNext Press, September 2018.
Chalmers, Matthew. Review of "Where the Action Is: The Foundations of Embodied Interaction", by Paul Dourish. Computer Supported Cooperative Work, Feb 2005, pp. 69-77.
Dumouchel, Paul and Luisa Damiano. "Living With Robots". Cambridge, Harvard University Press, 2017.
Falbe, Trine. "Ethical Design: The Practical Getting-Started Guide". Smashing Media AG, March 2018. smashingmagazine.com/2018/03/ethical-design-practical-getting-started-guide/. Accessed 2019.
Gispen, Jet. "Ethics for Designers". MA Thesis. Delft University of Technology, 2017. https://www.ethicsfordesigners.com/. Accessed 2019.
Green, Brian, et al. "Conceptual Frameworks in Technology and Engineering Practice: Ethical Lenses to Look Through". The Markkula Center for Applied Ethics at Santa Clara University, 2018. scu.edu/media/ethics-center/technology-ethics/Conceptual-FrameworksFinalOnline.pdf
Hobye, Mads and Jonas Lowgren. "Touching a Stranger: Designing for Engaging Experience in Embodied Interaction." International Journal of Design, vol. 5, no. 3, 2011, pp. 31-48.
Honeywell, Alex, et al. "Ethical Design Thinking". 2017. drive.google.com/file/d/0B_BduP_zkbNWZWVRSkpxNGtTRG8/view. Accessed 2019.
Kalbach, Jim. "Mapping experiences: A guide to creating value through journeys, blueprints, and diagrams". Sebastopol, O'Reilly Media, 2016.
Lohrey, Jackie. "What Are Primary, Secondary & Tertiary Stakeholders?". BizFluent, Leaf Group Ltd., 2017. bizfluent.com/info-8353421-primary-secondary-tertiary-stakeholders.html. Accessed 2019.
Luther, Carol. "The definition of a tertiary stakeholder". Chron, Hearst Newspapers, LLC. smallbusiness.chron.com/definition-tertiary-stakeholder-37756.html. Accessed 2019.
Marti, Patrizia. "Materials of Embodied Interaction". SMI '12 Proceedings of the 1st workshop on Smart Material Interfaces: A Material Step to the Future, 2012, pp. 1-6.
Mulvenna, Maurice et al. "Ethical by Design - A Manifesto". Unknown Host Publication, 2017, pp. 51-54. https://dl.acm.org/citation.cfm?doid=3121283.3121300.
Nordkapp. "Actionable Futures Toolkit". Nordkapp Creative Oy, 2017. https://futures.nordkapp.fi/download/. Accessed 2019.
Santa Clara University. "A Framework for Ethical Decision Making". The Markkula Center for Applied Ethics at Santa Clara University, 2009. scu.edu/media/ethics-center/ethical-decision-making/A-Framework-for-Ethical-Decision-Making.pdf


Stickdorn, Mark, et al. “This Is Service Design Doing: Applying Service Design Thinking In The Real World: A Practitioners’ Handbook”. Sebastopol, O’Reilly Media, Inc., 2018. Vallor, Shannon. “Technology and the Virtues: a philosophical guide to a future worth wanting”. New York, Oxford University Press, October 2016. Zhou, James. “Machine Ethics Toolkit”. MA Thesis. Copenhagen Institute of Interaction Design, 2017. machineethicstoolkit. com/. Accessed 2018. Zalasiewicz, Jan. “The unbearable burden of the technosphere”. The UNESCO Courier, 2018. en.unesco.org/courier/2018-2/unbearable-burden-technosphere. Accessed 2018. Wikipedia. “Civil and political rights”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/Civil_and_ political_rights. Accessed 2019. Wikipedia. “Discrimination”. Wikipedia, The Free Encyclopedia, 2019. https:// en.wikipedia.org/wiki/Discrimination. Accessed 2019. Wikipedia. “Economic, social and cultural rights”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/ Economic,_social_and_cultural_rights. Accessed 2019. Wikipedia. “Minority Group”. Wikipedia, The Free Encyclopedia, 2019. https:// en.wikipedia.org/wiki/Minority_group. Accessed 2019. Wikipedia. “Risk matrix”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/Risk_matrix. Accessed 2019. 219


Wikipedia. “Social Exclusion”. Wikipedia, The Free Encyclopedia, 2019. https:// en.wikipedia.org/wiki/Social_exclusion. Accessed 2019. Wikipedia. “Social Inequality”. Wikipedia, The Free Encyclopedia, 2019. https:// en.wikipedia.org/wiki/Social_inequality. Accessed 2019. Wikipedia. “Three generations of human rights”. Wikipedia, The Free Encyclopedia, 2019. https://en.wikipedia.org/wiki/Three_ generations_of_human_rights. Accessed 2019.





Fig. 77. Background Image 1. @Systems2018. 2018. Digital Image. Twipu. Web. 2019. < http://www.twipu.com/Systems2018/tweet/1037663964166074369 >.



Fig. 78. Background Image 2: @Systems2018. 2018. Digital Image. Twipu. Web. 2019. <http://www.twipu.com/Systems2018/tweet/1037663964166074369 >.




Issuu converts static files into: digital portfolios, online yearbooks, online catalogs, digital photo albums and more. Sign up and create your flipbook.