

Side Effects
Your tech. Their truth.
Research background: Gen Z tech-paradox
Our research has shown a discrepancy between intention and behavior in Gen Z tech-adopters: users favor cybersecurity measures, yet trade data for convenient online experiences.
Authentic consumer care occupies the space between legal compliance and the consumer, with its largest impact on psychological perception.
Gen Z shows an intention-behavior paradox around privacy and personal boundaries.
Gen Z has the highest expectation and openness towards disruptive innovation.
Personalization outweighs privacy
Gen Z protects their information while willingly sharing for personalized experiences.

88% are willing to share personal information with social media companies to enhance personalized experiences.
21 percentage points more accepting of the social media business model of selling user data and advertisements than older generations (88% vs. 67%).
24% more addicted to social media, even after being exposed to increased mainstream discussion of data privacy concerns.
Gen Z digital natives have the highest expectations and openness to future innovation.


77% of Gen Z believe that biometrics will see increased adoption in the next five years.
29% of Gen Z believe that VR will see increased adoption in the next five years.
Gen Z is open to travelling by using VR.
Tech Adoption Model
We aim to create a “care”ful adoption of emerging technologies and innovational services.
Usability: Is it easy and transparent?
Usefulness: Is it convenient, e.g., time, money, and intrusiveness?

Perception: Does it feel psychologically safe?
“This is where designers can create impact”
(Ning Su, Tech Expert)
Problem Statement
How might businesses engaged in disruptive technology innovation demonstrate authentic care for consumers in new services?
How might we design for safe psychological perception and safe behavior in brand-consumer relationships?


Participants: 5, Gen Z, tech-familiar
Format: In-person, SDM Lobby, 6th floor
Duration: 1.5 hours
Date: Friday, March 28th
Workshop 1
Workshop 1: Foresight scenarios
In our first workshop, we focused on future dystopia scenarios in tech innovation to uncover the underlying emotions and value drivers in tech adoption. Each scenario offers a different value in exchange for an intrusion into personal boundaries, privacy, and safety.

Future Scenario: Love Without Secrets
Scenario cards:
ChatGPT brain implant
Amazon drone delivery
Amazon Key
Dating App without boundaries
Deepfake voice calls
A revolutionary new dating app doesn’t just analyze your social media, camera roll, and personal data; it goes further. It accesses your search history, private messages, bank transactions, health records, and even microphone data to uncover your subconscious preferences, daily habits, and hidden desires.
But the real game-changer? Once matched, you and your partner’s Google Calendars automatically sync, allowing them to see your schedule, plan surprise dates, or even "coincidentally" bump into you. The app guarantees 100% compatibility, making relationships seamless, efficient, and entirely data-driven.
The Choice:
Would you give up your privacy for a love that anticipates your every move? Is a perfect relationship worth letting an algorithm, and your partner, know exactly where you'll be at all times?
Insight 1: Value drivers
Users value convenience, social power, and autonomy.
Convenience
Social power, status, and competition: "fear of becoming average" and losing authenticity are two sides of the same coin; competition is rooted in financial survival in the job market
Decision-making agency: AI is welcomed with open arms as long as personal categories such as relationships, food, and exercise stay off limits (setting boundaries)
“Of course I would use a ChatGPT brain implant to get a job.”

Insight 2: Generational tech trends
Users have an inherent technological familiarity and openness towards further innovation in the future, trading data for value.
“Shopping everything in-store is not a reality anymore.”
Generational normalization of existing technologies, e.g., Internet, Alexa, AI
Gen Z’s social interaction is already tech-driven (“techno-human”: sending reels, liking, FaceTiming, following, unfollowing)
Data monetization
Online e-commerce and metaverse
Subscription models (capitalizing on consumer forgetfulness and automated payments)
Data breaches and selling emails/passwords
AI automation and the reduction of human touch

Insight 3: Underlying emotions
Users feel uncomfortable, powerless, and hopeless when it comes to protecting personal data and engaging in tech systems.
“It feels impossible to go off-grid if you want to be part of society.”
Discomfort with excessive surveillance
Powerlessness within the broader governmental regime and system construct
Hopelessness: "going off-grid" feels impossible
Mixed emotions of excitement and fear about AI
Uncertainty about whether personal data is truly deleted
Trust in using AI for human relationship advice
Buyers’ remorse: Can you undo? Insurance?
Data Warranties?

Users value convenience, social power, and autonomy.
HMW give users back their power and autonomy without losing convenience?
Users have inherent technological familiarity and openness towards further innovation in the future, trading data for value.
HMW soften the tradeoff between data and value while honoring openness towards technological innovation?
Users feel uncomfortable, powerless, and hopeless when it comes to protecting personal data and engaging in tech systems.
HMW make users feel comfortable, powerful, and hopeful in their decisions within a highly technological and personal data era?





Workshop 2: Platform co-creation and UX refinement
Participants: 6, Gen Z, tech-familiar
Format: Online, Zoom
Duration: 40 minutes
Date: Saturday, April 12th
In our second workshop, we co-created our platform concept with our Gen Z target audience. We focused on usability, value, and understanding.
UX screens:
Onboarding and verification
Breaking news and featured products
Product detail page
Alternative product recommendations
Product review
Insight 1: “Looks like Pinterest, not panic”
Users do not feel a sense of danger on the platform, as it comes across as a casual social-media sharing app. Problem: missing urgency. Pivot: a traffic-light warning label. A sense of danger or urgency should be signaled via a measurement system and color theory, e.g., red for danger.
Insight 2: “Feels like fake reviews”
Users wish to read and see real voices that have experienced fraud or misuse with tech innovation; these could be users or employees acting as "whistleblowers". Problem: missing trust. Pivot: user reviews.
Experts in tech and data validate the security of products, and users can add personal stories after verification, declaring their voice safely (anonymously) in the app.
Design a believable system (Why should users trust our platform over famous brands?)
Expert-based evaluation
Consumer autonomy
Ethical advertising


This homepage acts as your ethical tech feed, showing a trusted “Product of the Week” and real-time breaking news on privacy or safety risks.
Each story is labeled by severity and links to a chat room where users can discuss or take action. It’s designed to keep you informed, aware, and engaged.


This screen shows Tesla flagged for privacy risks, with each colored bar representing a risk area backed by real metrics. The bars reflect how poorly the product scores in Data Collection, User Control, Transparency, Data Sharing, and Misuse Potential, all weighted by importance.
The higher the bar, the greater the concern. It's a quick, data-backed way to see where the product fails users.
This is the Product Recommendations screen on Side Effects, filtered by the “Voice Assistants” category.
Each product (like Mycroft Mark II, Sonos, and HomePod mini) is labeled with color-coded badges: “Worth Trying” shows overall value, while the “Safe” tags highlight individual areas (like data sharing or transparency) where the product performs well.
It’s a quick, trust-based guide to help users pick ethical tech confidently.


This section boosts community engagement through topic-based chat channels and short, scrollable videos. Users can join conversations, share insights, and watch reels that break down tech risks in simple, relatable ways, making learning social and interactive.
It turns ethical tech awareness into a shared experience. Whether you're chatting in privacy forums or watching quick reels, you're part of a growing, informed community driving change together.




Expert-based evaluation: Panel and me
Contracted experts in the tech and data privacy industries will evaluate innovation products in various categories according to our standard rubric, factoring in data collection, user control, data sharing, transparency, and misuse potential.
1. Data Collection (30%): How much data does it collect? (location, contacts, behavior, biometrics)
2. User Control (20%): Are users given real choices? (opt-out, delete, permission design)
3. Data Sharing (20%): Is data shared with third parties? Is it anonymized?
4. Transparency (15%): Are privacy policies clear? Is it easy to understand how data is used?
5. Misuse Potential (15%): Has the product been misused in the past?
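As an illustration, the weighted rubric above can be collapsed into a single risk score and mapped to the traffic-light warning label from Workshop 2. This is a minimal sketch: the 0-10 sub-score scale, the category keys, and the traffic-light thresholds are assumptions for illustration, not part of the rubric itself.

```python
# Weighted risk score based on the Side Effects rubric percentages.
# Each category receives a sub-score from 0 (safe) to 10 (high risk);
# the weights mirror the rubric and sum to 1.0.

WEIGHTS = {
    "data_collection": 0.30,
    "user_control": 0.20,
    "data_sharing": 0.20,
    "transparency": 0.15,
    "misuse_potential": 0.15,
}

def risk_score(sub_scores: dict[str, float]) -> float:
    """Return the weighted 0-10 risk score for a product."""
    if set(sub_scores) != set(WEIGHTS):
        raise ValueError("sub-scores must cover exactly the rubric categories")
    return sum(WEIGHTS[cat] * score for cat, score in sub_scores.items())

def traffic_light(score: float) -> str:
    """Map a risk score to a warning-label color (thresholds are assumed)."""
    if score < 3.5:
        return "green"
    if score < 7.0:
        return "yellow"
    return "red"

# Hypothetical sub-scores for a product flagged for heavy data collection.
example = {
    "data_collection": 9,
    "user_control": 7,
    "data_sharing": 8,
    "transparency": 6,
    "misuse_potential": 5,
}
print(risk_score(example), traffic_light(risk_score(example)))  # roughly 7.35, "red"
```

Keeping the weights in one table means the expert panel can re-tune category importance without touching the scoring or labeling logic.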



Consumer autonomy: Plug-in shopping option
As an external feature of the app, consumers can use a Side Effects plug-in while shopping, which informs them about potential tech and privacy issues close to the point of purchase.


Ethical advertising: New incentive for brands
Over time, brands will have an incentive to structure their business models and infrastructure to meet our safety rubric; otherwise, they risk being exposed in scandals. After validating their safety, Side Effects will recommend their products as alternatives, creating ethical advertising.

Vision: A world where consumer care and safety are baked into business.

Businesses and consumers should work hand in hand to create and serve products. We begin this journey with transparency about current concerns on the consumer side.

THANK YOU
APPENDIX
1. How Gen Z Uses Social Media Is Causing a Data Privacy Paradox
DeBrusk, C., & Kreacic, A. (2023, August 23). How Gen Z Uses Social Media Is Causing a Data Privacy Paradox. Oliver Wyman Forum. Retrieved from https://www.oliverwymanforum.com/uncharted/gen-z-s-privacy-paradox-in-the-era-of-social-media.html
2. Gen Z's Travel Intentions and Museum Visits in the Metaverse
Nazli, M., Bulut, Ç., & Ozarslan, Y. (2024). Gen Z travel intentions and museum visits in the metaverse: Case of Egypt, Scotland, and Turkey. Current Issues in Tourism, 1–19. https://doi.org/10.1080/13683500.2024.2376885
3. Trust in Ride-Sharing Alone vs. Self-Driving Cars
MediaPost Communications. (n.d.). Consumers Trust Ridesharing Alone Less Than A Self-Driving Ride. Retrieved from https://www.mediapost.com/publications/article/341012/consumers-trust-ridesharing-alone-less-than-a-self.html
AAA Newsroom. (2025, February 25). Fear in Self-Driving Vehicles Persists. Retrieved from https://newsroom.aaa.com/2025/02/aaa-fear-in-self-driving-vehicles-persists/
4. EY. (2023). How consumers rely on technology but don't trust it. EY Future Consumer Index. https://www.ey.com
5. Swiss Re. (2025). Consumer trust and data sharing in the age of generative AI: Survey. Swiss Re Institute. https://www.swissre.com
6. Edelman. (2024). Technology industry watch out, innovation at risk: 2024 Edelman Trust Barometer. Edelman. https://www.edelman.com
Bibliography

https://docs.google.com/document/d/1fmbiYJp7QmI_coyz0ONOl5gNHJt3dC60oDCcyzL6AjY/edit?tab=t.0

https://docs.google.com/document/d/1pjyIREUJ2F2MjIKh9CwtSCS8s_xwcM2uqh4SFXP8qI/edit?tab=t.0

https://docs.google.com/document/d/1RXenfDH4SecA9bVk58wK7vKyEKhPQUpYowB1tsTU2s/edit?tab=t.0

https://docs.google.com/document/d/1PKnOe1vDjxoCpqcXJfTcBmnMkQL-uI5RBAOoGlv5s58/edit?tab=t.0#heading=h.s3dt4de7qs2d