BUILD A CITY
Interactive Installation Documentation Team Spark August 6, 2019
TABLE OF CONTENTS

INTRODUCTION
1.1 PROJECT EXECUTIVE SUMMARY
1.2 PROJECT PREMISE
1.3 PROJECT GOALS
THE TEAM
2.1 ROLES AND RESPONSIBILITIES
2.2 MILESTONES AND DELIVERABLES
PROJECT OVERVIEW
3.1 RESEARCH AND IDEATION
3.2 BLOCKS TO BUILDINGS IDEA
3.3 EXPERIENCE OVERVIEW
3.4 INTERACTION
3.4.1 BLOCKS
3.4.2 TOUCHPOINTS
3.4.3 VIRTUAL CITY
3.4.4 SUSTAINABILITY
3.4.5 FEEDBACK
USER EXPERIENCE (UX)
4.1 DESIGN STATEMENT
4.2 RESEARCH
4.3 FIRST ITERATION AND USER TESTING
4.4 SECOND ITERATION AND USER TESTING
4.5 USABILITY TESTING
4.6 USER EXPERIENCE MAP
4.7 THE CLIENT
4.7.1 SCIENCE WORLD
4.7.2 MEASURING SUCCESS
ART BIBLE
5.1 ART STYLE
5.1.1 LOOK
5.1.2 DESIGN ITERATION
5.1.3 COLOURS
5.2 3D MODELS
5.2.1 MODELING THE BUILDINGS
5.2.2 UV MAPPING THE BUILDINGS
5.2.3 TEXTURING THE BUILDINGS
5.2.4 EXPORTING TO UNITY
5.3 DYNAMIC LIGHTING AND SHADERS
5.3.1 DYNAMIC LIGHTING
5.3.2 SHADERS
5.4 ENVIRONMENT LAYOUT
5.5 ANIMATIONS
5.5.1 BUILDING ANIMATION
5.5.2 CAR ANIMATION
5.6 USER INTERFACE
5.6.1 FONTS
5.6.2 EXAMPLE SHOTS
5.6.3 UI ASSETS
5.6.4 LOGO
5.7 COLOUR SCHEME
TECHNICAL DOCUMENTATION
6.1 CHOOSING THE TECH
6.1.1 AUGMENTED REALITY
6.1.2 TOUCHPOINTS
6.2 TECHNICAL GUIDE
6.2.1 TOUCH DETECTION
6.2.2 COMBINATION
6.2.3 AFFORDANCES
6.2.4 SCREEN SHOT
6.2.5 MULTIPLAYER EXPERIENCE
PIVOTS AND LESSONS LEARNED
7.1 USER EXPERIENCE
7.2 ART
7.3 TECH
7.4 CLIENT
FUTURE DEVELOPMENTS
8.1 ADDITIONAL FEATURES
8.1.1 ROTATION
8.1.2 UNDO
8.1.3 GAMIFICATION
8.2 TAKEAWAY
8.3 ADDITIONAL USES/CLIENTS
INTRODUCTION
This document outlines Team Spark’s journey from brief to minimum viable product, with a focus on process, problem-solving, and product development. The document also includes a detailed Art Bible, Technical Guide, and suggestions for future iterations of the product.
1.1 PROJECT EXECUTIVE SUMMARY This project, undertaken by Team Spark for the client, Thinkingbox, was produced as part of the “Projects III” course in the Master of Digital Media (MDM) program at the Centre for Digital Media (CDM). The team had 14 weeks to deliver the MVP. The goal was to create an out-of-home (OOH) installation that used new and exciting technology to deliver a unique user experience. The team focused on the idea of mixing the physical and virtual worlds, where players could use building blocks to create a virtual city. The final product delivers an interactive experience where players can build a city, learn about the environmental impact and sustainability of different buildings, and view their completed city in a 360° photo.
1.2 PROJECT PREMISE The Blocks to Buildings Installation is a collaboration between Thinkingbox and a group of graduate students at the Centre for Digital Media. The client (Thinkingbox) approached Team Spark to pitch and create an installation focused on, but not limited to, the automotive or entertainment industries. The team aimed to use their expertise in technical art, user experience, business management, and programming to research, propose, and prototype an original and exciting solution.
1.3 PROJECT GOALS The goal of the project was to create a Minimum Viable Product (MVP) of an interactive installation that took real-world interactions (building with simple blocks) and translated/augmented the experience in a virtual world.
Client’s Goals for the Project:
● Explore new, exciting technology, pushing the envelope on what’s possible
● Research and compile a list of installations and technologies, developed through iterative ideation
● Provide a fleshed-out experience, with a focus on user experience from beginning to end, including:
o Onboarding
o The ability to build with blocks (focusing on the combinations as a core element of building) and to see the result manifest in the virtual world
o Wrapping up/restarting the entire experience
o Takeaways for the client and the user
● Lessons learned and pivots

Team Spark’s goals and objectives were to provide an MVP that met the client’s goals. This included:

Tech
● Codebase / Git repo + iOS app
● Ability to place three different blocks that correspond to three different buildings
● Ability to combine those three blocks in any pairing to create new buildings
● Multiplayer abilities
● “Clear Board” function on an alternate device (like a remote)
● UI elements
● Suggestions for how to expand

Art
● Nine building models
● Environment for building
● Day-to-night lighting
● Animation (building and combining)
● Simple gamification
● People/cars in the city

User Experience
● User testing results and learnings
● Technical rider
● Onboarding and offboarding strategy
● Suggested takeaway

Research and Client Information
● A list of our installation research
● A list of our own installation ideas
● Summary of our tech research (what worked and what didn’t)
● What success looks like to the client and the best way to measure it
THE TEAM
2.1 ROLES AND RESPONSIBILITIES Team Spark is made up of seven individuals from diverse backgrounds, with support from our advisor, Yangos. The roles were divided as follows:

Name | Main Role | Supporting Role
Basel Alnashinibi | Technical / 3D Artist | Tech Support
Mika Chang | UX Designer | Graphic Designer
Annie Chen | 2D/3D Artist | Animator
Pranay Jain | Programmer | UX Designer
Emma Konrad | Project Manager | Writer
Stephanie Wu | UI Designer | Tech Support
Silver Xue | Programmer | Art Support
2.2 MILESTONES AND DELIVERABLES

Date | Milestone | Details
May 8 | Project kickoff | Scheduling and team building
May 13 | First Client Meeting | Team finalizes gameplay structure and core loop
May 22 | Ideation Ends | Team Spark pitches to Thinkingbox
June 5 | Discovery Workshop | Work with Thinkingbox to refine the idea and determine the scope of the project
June 6 | Production begins | Team Spark will begin iterative prototyping
June 26 | Pre-Alpha Development Ends | Team Spark to present current iteration of project(s) for feedback and comments
July 24 | Alpha Development Ends | Team Spark to present current iteration of project(s) for feedback and comments
July 26 | Features and Content Lock | No new features will be added to the MVP at this point
July 27 | Final Fixes and Updates | Team Spark to refine and debug before final presentation
August 6 | Presentation to client; Minimum Viable Product (MVP) delivery | Team presents the MVP to Thinkingbox; MVP and documentation are handed over to Thinkingbox
PROJECT OVERVIEW
This section provides an overview of the Blocks to Buildings Installation, including the results of the research and ideation phase and how we chose the final concept.
3.1 RESEARCH AND IDEATION

Before the Meeting: Installation Research
The team spent the week prior to the initial client meeting researching existing installations. We wanted to get an idea of what had been done, as well as what was successful and why. At the end of the research period we had a list of 60 installations, which we then used to create a “Best Of” compilation of our favourites. Both of these documents can be found in the 01_SupportingDocuments folder.

The Ideas
After our initial meeting, the team began to brainstorm our own ideas, with a particular focus on the entertainment and automotive industries. We timeboxed our sessions, giving ourselves 10 minutes to come up with 10 ideas for each sector. A spreadsheet of all the ideas can be found in the 01_SupportingDocuments folder (title: 03_TeamIdeas). With more than 60 ideas, we knew we had to whittle the pile down. Through discussion and voting, we chose 13 ideas to present to Thinkingbox. These were deemed the most exciting, creative, and original of all our ideas.

The Pitches
On May 22, Team Spark pitched 13 ideas to Thinkingbox. All of the ideas were thought starters, and the goal was to choose three to five for the team to develop further. The pitch presentation slides can be found in the 01_SupportingDocuments folder (title: 04_PitchDecks).

The Final Three
Following this meeting, Thinkingbox and Team Spark chose to move forward with three installation ideas: Build a City (eventually Blocks to Buildings), Spotify Light Activation, and Colgate Smile. We worked through each idea and presented them anew at the Discovery Workshop. The presentation slides can be found in the 01_SupportingDocuments folder (title: 05_FinalThree). Shortly after that, we decided on the activation we would move forward with: Blocks to Buildings.
3.2 BLOCKS TO BUILDINGS IDEA The Blocks to Buildings Installation is about merging the physical world with the virtual. Users place simple building blocks onto a screen and watch as those blocks manifest on-screen as city buildings.
3.3 EXPERIENCE OVERVIEW Blocks to Buildings offers users the opportunity to build with their hands and see a virtual city matching their design appear on-screen. As a potential player approaches the installation, there are two options. If the building space is unoccupied, the player can interact with the space; if it is occupied (i.e. someone is already building), the player can watch on the main screen or join the other player. Currently, four iPad screens are networked together to provide a larger playing space. Each iPad has three blocks intended to be used with it; however, the player has the freedom to mix and match the blocks as they desire. Next to the iPad, the player will find an iPhone with a remote control screen. The remote allows them to rotate the camera view so they can see the city they’re building from different angles. Perhaps more importantly, it has an “Info” button. Should the player feel uncertain of what to do, they can tap that button and an explanatory video will start playing. Upon choosing and placing a block on the screen, a 3D model of a building matching the shape and colour of the block appears on-screen. If the player would like to delete a building, they can use the red triangular block (one shared across all four iPads).
The player can also combine buildings. When a new building is placed close enough to an existing model, the two combine to create an entirely new building. Because of these combinations, the player can create a total of nine different buildings. The player can interact with the installation for as long as they’d like; however, the average interaction lasts about 3-5 minutes. When they are happy with their city, the player can press “Finish Building” on the remote, and they will be rewarded with a 360° image of their city. After a moment, the screen resets and the next player can begin building. In future versions, the player should be able to access their 360° image at a later date (i.e. once they’re at home) on an online platform (such as Flickr) and share it on their social media.
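The “close enough” rule can be sketched as a simple distance check. This is an illustrative sketch only: the radius value, coordinate representation, and function name are assumptions, not the project’s actual Unity implementation.

```python
import math

# Assumed combine radius, in grid units; the real threshold was tuned on the device.
COMBINE_RADIUS = 1.5

def should_combine(new_pos, existing_pos, radius=COMBINE_RADIUS):
    """Return True when a newly placed building lands close enough
    to an existing one for the two to merge into a combined building."""
    dx = new_pos[0] - existing_pos[0]
    dy = new_pos[1] - existing_pos[1]
    return math.hypot(dx, dy) <= radius
```

With a radius of 1.5 grid units, a building placed diagonally adjacent to another would trigger a combination, while one placed several cells away would not.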
3.4 INTERACTION This section outlines the different objects with which the player can interact and the ways in which they can do that.
3.4.1 BLOCKS The blocks are the foundation of the installation. When a block is placed upon the iPad, it registers as a 3D model on-screen. This is done using touchpoints: each block has a specific number of touchpoints, identifying it as its respective model. All the details on the blocks are below.

Block | Size | Units | Touchpoints | Function
1 | Small Square | Base Unit | 2 | Creates a home
2 | Medium Rectangle | 2x Base Unit | 3 | Creates a condo
3 | High Rectangle | 3x Base Unit | 4 | Creates an apartment
4 | Triangle | N/A | 5 | Deletes buildings
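The touchpoint lookup described above amounts to mapping a touch count to a block identity. The sketch below is illustrative (the names are hypothetical, and the actual detection is implemented in the Unity app):

```python
# Touchpoint counts per block, taken from the table above.
# Identifiers are hypothetical; the real project implements this in Unity.
BLOCK_BY_TOUCH_COUNT = {
    2: "small_square",      # creates a home
    3: "medium_rectangle",  # creates a condo
    4: "high_rectangle",    # creates an apartment
    5: "triangle",          # eraser: deletes buildings
}

def identify_block(touch_count):
    """Map the number of simultaneous touches to a block type,
    or None for an unrecognized count (e.g. a stray finger tap)."""
    return BLOCK_BY_TOUCH_COUNT.get(touch_count)
```

Counts outside the 2-5 range fall through to None, which is one simple way to ignore accidental single-finger touches.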
3.4.2 TOUCHPOINTS The blocks are recognized by the iPads using multi-touch. Each block has a distinct number of touchpoints (outlined above) that are added to the block using conductive copper tape. For more information on the touchpoints, see Section 6.2.1 Touch Detection.
3.4.3 VIRTUAL CITY The blocks, once placed on the iPad, are manifested as 3D buildings on screen. Each block has a specific building attached to it, and the buildings have been made to mirror the size and colour of the block.

Base Buildings
There are three base buildings: green, yellow, and blue.

Combinations
The player can also combine the blocks in any pairing, creating six further buildings. For details on the blocks and their corresponding models, see Section 5.1 Art Style.

Affordances
The 3D models have been carefully designed to mirror the blocks in both shape and colour. This helps the player connect what they place on the iPad with what appears on screen. Any future changes to the models should keep these affordances in mind.
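The three base buildings plus every unordered pairing give the nine buildings in total. A sketch of the lookup is below; the building names are taken from the point system in Section 4.4, while the function and dictionary names are illustrative assumptions:

```python
# Base buildings and pair combinations. Names come from the point system
# table in Section 4.4; the lookup itself is an illustrative sketch.
BASE = {"green": "House", "yellow": "Condo", "blue": "Storefront"}

COMBINED = {
    ("green", "green"): "Apartment",
    ("yellow", "yellow"): "Mega Condo",
    ("blue", "blue"): "Mall",
    ("green", "yellow"): "Dorm",
    ("blue", "yellow"): "Skyscraper",
    ("blue", "green"): "Hippie House",
}

def combine(colour_a, colour_b):
    """Look up the building produced by pairing two blocks.
    Sorting the colours makes the lookup order-insensitive."""
    return COMBINED[tuple(sorted((colour_a, colour_b)))]

# 3 base buildings + 6 pairings = 9 buildings in total.
assert len(BASE) + len(COMBINED) == 9
```

Because the pair key is sorted, placing a yellow block next to a blue one produces the same result as placing a blue block next to a yellow one.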
3.4.4 SUSTAINABILITY The narrative of the installation is lightly focused on sustainable cities. However, the main driver of the experience is always the act of building. Therefore, the theme of sustainability is quite passive, and the player is actively confronted with the ideas only after the building has ended. The three blocks each represent one element of a sustainable city. They are divided as follows:

Block | Sustainable Feature | Effect of placing that block
Green | Green Spaces | Increases the number of green trees that appear on the map
Yellow | Energy Efficiency | Increases the visibility of a yellow electric grid on the map
Blue | Social Wellness/Community | Increases the number of blue cars on the road

The combinations of the buildings have a dual effect. For example, a combination of a green and blue block would increase both the trees and the cars. The purpose of these affordances is to increase the representation of each colour on the map according to how many of those blocks have been placed. As noted above, the point is not to tell people what or how to think about these cities, but rather to let them explore the different ways they can build and the different effects this has on the city. Should the player wish to learn more after building, there is a screen set to the side that explains all three building types and why each is important to a sustainable city (for more details, see Section 4.4 Second Iteration and User Testing).
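The dual effect of combinations can be sketched as a tally over the colours carried by each placed building. This is an illustrative sketch; the feature names and data representation are assumptions, not the project’s code:

```python
from collections import Counter

# One on-map feature per colour, per the table above.
FEATURE_BY_COLOUR = {"green": "trees", "yellow": "electric_grid", "blue": "cars"}

def feature_levels(placed_buildings):
    """Tally how strongly each feature should show on the map.

    placed_buildings is a list of colour tuples: a base building is a
    single colour, while a combined building carries both of its colours
    and therefore boosts both features (the 'dual effect' described above).
    """
    counts = Counter()
    for colours in placed_buildings:
        for colour in colours:
            counts[FEATURE_BY_COLOUR[colour]] += 1
    return dict(counts)
```

A city containing one green house and one green-blue combination would show trees at level 2 and cars at level 1.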
3.4.5 FEEDBACK In addition to the feedback for the sustainable cities, there is also direct feedback when a player builds. This is done using animation and sound. When the player builds a singular building, a spinning animation and a “whooshing” noise play to signal that the build was successful. When two buildings are combined, a separate animation and sound plays, distinguishing it from a simple build. The word “Combined!” also appears on screen, further alerting the player to the fact that the two buildings combined to create something new. Other Sounds The sound effects were mastered using Audacity and are used under Creative Commons license from the sources detailed in 02_Sound Folder.
There were three “strains” of sound constructed, one for each building type. These sounds were programmed to play at volumes proportional to the number of buildings of the corresponding type on the screen. There were also sounds for building creation and combination, programmed to play when their respective actions were triggered. Future developments could include additional daytime sound effects, such as birds, or detailed nighttime sound effects, such as crickets, that come along with nightfall. These sounds were not added to the final installation; however, they were researched, and sources for them can be found in the reference document.
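The proportional-volume behaviour can be sketched as below. The documentation only states that volume scales with the number of buildings of each type; the linear ramp, the clamp at ten buildings, and the function name are assumptions for illustration:

```python
def strain_volumes(building_counts, max_count=10, max_volume=1.0):
    """Scale each colour's sound strain with its building count.

    The linear ramp and the clamp at max_count are illustrative
    assumptions; the source only specifies proportionality.
    """
    return {
        colour: max_volume * min(count, max_count) / max_count
        for colour, count in building_counts.items()
    }
```

Five green buildings would play the green strain at half volume, and anything past the clamp plays at full volume.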
USER EXPERIENCE
4.1 DESIGN STATEMENT

The Idea
To build a Sustainable Metropolitan City: a Digital + OOH Activation at Science World. A sustainable city-building installation that lets users imagine what our city could look like in the future. The installation aimed to:
● Provide multiple choices of buildings for users to apply to their city
● Assign each building a different level of environmental impact (such as CO2 emissions)
● Allow users to build their own cities and see how “green” their cities are
● Put each city on one planet and export a digital version as a takeaway for users
● Let users walk through planet by planet to see city views built by different users

Target Audience
The installation aims to raise awareness of how to make the world more sustainable through an engaging and fun city-building activation. Our preliminary design targets the general public in Metro Vancouver who visit museums such as Science World in their spare time. In order to discover the possible needs and requirements of our stakeholder/client, we interviewed Science World staff and discussed their KPIs and how they measure success for each exhibition.

Core Mechanic
The user is given a tablet as the input medium and multiple physical blocks as the interaction tool, while a large screen beside them displays the output. Users place the blocks onto the iPad to build their city virtually. A simple user flow follows:
4.2 RESEARCH

Observation: How do people naturally place the blocks?
Objective: To observe users playing with the blocks in order to determine the way people would naturally place them.
Tools: 10-15 blocks
Task: Ask users to play with the blocks, see how they interact with them, and at the same time add variations such as the shape of the blocks, instructions, etc.
Exploration Questions:
● How will users naturally interact with the blocks?
● How do users feel when they are building with the blocks?
● What are the possible factors that would impact the users’ decision-making while building?
The Results:
● Visual balance: when building with blocks of the same shape (but different colours), balancing the colours/visuals is sometimes a concern
● Stacking up: users tend to stack the blocks when they are different shapes and colours
● Shape matching: users tend to fill the map with blocks of corresponding shapes (in this case, colour doesn’t matter)
[INSERT BUILDING PHOTOS HERE]
Visual Guidelines Testing: How might we make users place the blocks in the way we want them to?
The Purpose: The purpose of this user testing was to uncover how we could guide users to build with the blocks through our visual design. In our installation, we do not want users to stack the blocks; instead, we want them to sprawl across the surface. According to the insights we gathered from the observation, when users were given a guideline to build, they had little tendency to stack the blocks. Instead, they tried to fill in the map with blocks of a corresponding shape.
Tools: Five maps with different visual design elements, including colour, dotted lines, slash patterns, and unit division (Figure 1), and three types of blocks with different units: one (A), two (B), and three (C) (Figure 2).

Figure 1

Figure 2

Task: Ask users to ignore the appearance (colour) of the blocks, telling them the only difference is the size of each (1, 2, and 3). Ask users to fill one map at a time (in 2 minutes) with blocks according to the visual design of each map. Record the result, observe the user journey, and gather insights on the decision-making.
The Assumptions:
● The user will put the blocks on the map without stacking them
● The shape and colour of blocks may affect how the user puts the blocks on the map
● The user knows what “a unit” means and can match the blocks to the corresponding unit
● The user can identify where to put the blocks, according to the guidelines
● The user can easily recognize where to put the blocks with the guidance of dotted lines
● The user knows they can place blocks with different unit combinations on the maps, because the dotted lines will not mislead them
The Results:
● The dotted lines work well for indicating where to put the blocks. Slash patterns and solid lines make it even clearer that users can put the blocks on the map in different combinations
● Colours can easily mislead users. It is hard for users to identify the road by colours; using other symbols to indicate the road helps users better understand where to put the blocks
● Without virtual feedback on a big screen, users may still tend to stack the blocks to create depth for better visuals
● Watch the scale of the road in comparison to the blocks. If the road is too wide, users may want to do something with it/build on it
Based on the testing results, we started to draft the first version of the map as a guideline for our next user testing.
4.3 FIRST ITERATION AND USER TESTING
The Purpose: To simulate the installation environment and observe how users interact with the blocks, including their reaction to the virtual world.
Method
Tools: a projector, a display screen, three different types of blocks, and the Unity software
Tasks:
● Project a digital map onto the desk as the user input interface, with the display screen as the output of actions
● Prepare the blocks beside the map and ask users to start playing with them freely
● At the same time, use a tablet behind the scenes to simulate the users’ actions, triggering the corresponding reactions on the display screen
Assumptions:
● The user will know what to do once they see the installation setup
● The grid on the map will prevent users from stacking up the blocks
● The user will notice the display screen output while building with the physical blocks
● The user will discover the rules and combine the blocks to make new buildings
Observations and Insights
Area | Observation | How might we... | Actionable plan
Building transformation | The users saw the transformation of the buildings but had no idea about the rules behind it. They wondered how many combinations they could make. | How might we effectively inform users of the rules of each combination? | Design the appearance of the blocks with different patterns, icons, or colours.
Appearance of the block | The users couldn’t figure out the correlation between the appearance of the blocks and how it appears virtually on the big screen. | How might we make connections between physical objects and their virtual reflections? | Through corresponding height, colour, and shape between physical blocks and virtual 3D models.
The map | The users tended to put the buildings on the preserved area (on the grass). | How might we tell users where to build houses on the map? | Give users an error sign when they place a block on the wrong area.
The map | The intention of the grid on the map was clear to the users, but the space is not wide enough for them to put each house separately. | How might we adjust the spacing of the grid so that users can make combinations easily? | Test with the iPad and see what the best distance is between buildings.
The game play | The users had no clue about the purpose of the installation and what they should achieve at the end. | How might we make the subject (sustainable city building) stand out? | Add more information around this topic through environment design.
The game play | Some of the testers related this to a city-building installation and expected to see people move in once their city was built. | How might we create takeaways that will either amaze or meet the expectations of users? | Add animations at the end.
The game play | Some of them tried to stack the blocks up, while others put a block either horizontally or vertically to see what would happen on the big screen. | How might we guide users to put the blocks correctly? | Design the appearance of the blocks with different patterns, icons, or colours.
The block movement/interaction | Most of the users picked one block at a time and put it in a fixed place. Some put one block in multiple places to see what would happen. | How might we encourage users to combine two blocks together? | A tutorial at the beginning.
The block movement/interaction | The users were curious about what would happen if they took out the block they had just put on the map. | How might we let users cancel the action they just made? | Add an eraser tool in Unity.
4.4 SECOND ITERATION AND USER TESTING
Defined Problem Areas
Based on the results of the first user testing, we identified the key insights and the problem areas we wanted to tackle. First, we found that the grid on the map could not prevent users from stacking up the blocks, and some users even placed the blocks at different angles to see what would happen on the display screen. Second, even though users noticed the combination of the buildings, they couldn’t figure out the correlation between the appearance of the blocks and how they appear virtually on the display screen. The problems we aimed to tackle next were:
I. How might we make the subject (sustainable city building) more remarkable?
II. How might we guide users to put the blocks correctly?
III. How might we encourage users to combine two blocks together and make connections between physical objects and their virtual reflections?
Our Proposed Solutions
Problem I: How might we make the subject (sustainable city building) more remarkable?
Our Solution: Point system design
We looked into the sustainable building categories created by LEED, the most widely used green building rating system in the world, and designed a rating system for our installation with three categories: Green Space, Energy Efficiency, and Social Wellness.

Proposed starting point system (points-to-perfect ratio):

Building | Social Wellness | Energy Efficiency | Green Space
House (Green) | -1 | 1 | 1
Apartment (Green + Green) | 1 | -1 | 2
Condo (Yellow) | 1 | 1 | -1
Mega Condo (Yellow + Yellow) | 1 | 2 | -1
Storefront (Blue) | 1 | -1 | 1
Mall (Blue + Blue) | 2 | 1 | -1
Dorm (Green + Yellow) | 1 | -1 | 1
Skyscraper (Yellow + Blue) | 1 | 1 | -1
Hippie House (Green + Blue) | -1 | 1 | 1
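Scoring a finished city is then a matter of summing the per-category points of each placed building and awarding the badge for the strongest category. A minimal sketch, with the point values copied from the table above (the function names are illustrative, not the project’s code):

```python
# Point values from the table above: (Social Wellness,
# Energy Efficiency, Green Space). Names are illustrative.
POINTS = {
    "House": (-1, 1, 1),
    "Apartment": (1, -1, 2),
    "Condo": (1, 1, -1),
    "Mega Condo": (1, 2, -1),
    "Storefront": (1, -1, 1),
    "Mall": (2, 1, -1),
    "Dorm": (1, -1, 1),
    "Skyscraper": (1, 1, -1),
    "Hippie House": (-1, 1, 1),
}
CATEGORIES = ("Social Wellness", "Energy Efficiency", "Green Space")

def score_city(buildings):
    """Sum the per-category points for every building in the city."""
    totals = [0, 0, 0]
    for building in buildings:
        for i, points in enumerate(POINTS[building]):
            totals[i] += points
    return dict(zip(CATEGORIES, totals))

def badge(buildings):
    """Award the badge for the highest-scoring category."""
    scores = score_city(buildings)
    return max(scores, key=scores.get)
```

For example, a city with one House and one Condo would score 0 / 2 / 0, earning the Energy Efficiency badge.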
Problem II: How might we guide users to place the blocks correctly?
Our Solution: Block skin design
Based on the point system, we designed the block skins as follows. The colours and icons indicate the category of each block.

The concept design of the block skin

However, for clarity, we decided to put icons only on the tops of the blocks to make them look like stamps, so that users would naturally place the blocks vertically rather than horizontally.

Problem III: How might we encourage users to combine two blocks together and make connections between physical objects and their virtual reflections?
Our Solution: Colour/shape matching of virtual buildings and physical blocks
There are three types of blocks with nine building combinations. According to the first user testing results, most users noticed there were combinations of buildings but had no idea how they worked. Therefore, we started to consider how we could provide users with clues through the appearance of the blocks and models. Initially, we thought to colour each surface of the blocks with four different colours (nine colours in total), with each colour representing one type of building; users would have to match the same colours to make a combination. Then we came up with the idea of making the shape and colours of the blocks match the shape of the virtual buildings.

User Testing
Before we finalized our 3D models and launched the digital usability test, we wanted to verify the following assumptions:
● The colour-match solution makes sense to the user and, at the same time, informs them of the building combination logic
● The icons on the tops of the blocks help the user place the blocks vertically
● The user will only put one block on the tablet at a time

Colour Match Testing
The Purpose: To verify whether the colour-match solution makes sense to the user and, at the same time, informs the user of the building combination logic.
Tools: Printed building pictures and the blocks in green, yellow, and blue.
Task: Ask the user to use one or two blocks to make each building in the pictures according to its colours.
Observations & Insights: The users could make most of the buildings with the correct block combinations as long as the scales and colours of the buildings matched the physical blocks’ appearance. However, two buildings confused the users, because their size and colour layout did not intuitively relate them to the corresponding blocks. As a result, we revised the colour and scale of these two models.
Testing of Gameplay
The Purpose: To confirm whether the icons on the tops of the blocks encouraged users to place the blocks vertically, and whether users would only put one block on the tablet at a time. Also, to test whether the point system helps convey the message of sustainable building through gameplay.
Tools: The Unity prototype, a display screen, iPads, the building blocks (in green, yellow, and blue), and a red block as an eraser. We also placed a bar chart with three categories (green space, energy efficiency, and social wellness) in the top-right corner of the display screen to visually represent the output of the point system.
Task: Ask users to build and make combinations of the buildings with the blocks. At the end of the game, evaluate the results according to the point system and give users a badge as a takeaway. After users finish playing the game, there is an introduction to sustainable buildings on the side for users to further understand the purpose of this city-building installation.

The UI element in the top-right corner of the display screen showing the outcome.

Users receive a badge in one of the three sustainable building categories depending on how many points they earn by the end of the game.

The educational information is presented to the side. Whenever users are done interacting with the installation, they can gain a further understanding of each colour and its representation.

Observations & Insights
Through this gameplay user testing, we confirmed that the icons on the tops of the blocks help users place the blocks in the right position, but users still left multiple blocks on the tablet at the same time, which caused issues due to technical constraints. To tackle this problem, we made a tutorial video to guide users on how to interact with the installation.

The tutorial video introducing how to interact with the blocks and tablet.

In terms of the gamification of the point system, we found our testers did not resonate with the bar chart and the final takeaways. The point system did not make sense to them compared to the number of buildings in each colour shown on the display screen. As a result, we decided to put the point system in the parking lot at this phase and instead focus on enhancing each colour and its representation.
4.5 USABILITY TESTING
Mock-Up Space
Our potential clients are museums such as Science World. In order to simulate the environment, we set up the installation in a spacious room called the Hanger at the Centre for Digital Media.

Science World Exhibition Space

Our mock-up spot in the Hanger at the Centre for Digital Media
4.6 USER EXPERIENCE MAP

Basic Blocks

Combinations

The Affordance

The Takeaway
4.7 THE CLIENT 4.7.1 SCIENCE WORLD With the focus on learning, this installation is currently best suited for educational institutions. For the purpose of this MVP, we chose Science World as our client. Science World’s mission is as follows: Through science and nature, we ignite wonder and empower dreams. With that in mind, we focused our installation on nature and, through building and collaboration, igniting wonder. Our installation also fit in with their vision, which specifies that “within a generation, Canada will be a country of thriving, sustainable communities rooted in scientific literacy, technological innovation, and a deep connection to nature.” These ideas fundamentally changed the way we thought of measuring success. Instead of traditional KPI metrics, we knew that successful learning should be our key measurement. With that in mind, we reached out to Science World to discover how they measure the success of their current exhibits.
4.7.2 MEASURING SUCCESS
Science World uses two papers to outline and test the success of their exhibitions. The first, “Assessing Exhibits for Learning in Science Centers: A Practical Tool” (by Chantal Barriault), provides a way to quantifiably measure qualitative, observed data. There are three stages to this measurement:
Initiation behaviours - Walking up to the exhibit, watching someone else participate in the exhibit, initial participation in the exhibit
Transition behaviours - Repeating the activity, expressing a positive emotional response in reaction to engaging in the activity
Breakthrough behaviours - Referring to past experiences while engaging in the activity, seeking and sharing information, engaging in exploratory behaviour (repeating the activity to achieve different outcomes, remaining on task for 5+ minutes)
How do we make this quantitative? Through observation, recording data, and plotting it:
Take the number of visitors who initiate interaction with the exhibit (that is, actually use it or watch someone use it), and then track the percentage of people who move on to transition and breakthrough behaviours. A successful exhibition has a high number of participants who make it to the breakthrough stage (as recorded through observation of behaviour when interacting with the exhibit). The second paper, “Direct and Unobtrusive Measures of Informal STEM Education Outcomes”, provides ideas for how to ethically and effectively measure the success of the exhibit through observation. Both papers can be found in the 01_SupportingDocuments folder.
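The percentage tracking described above can be sketched as a small funnel calculation (a hypothetical illustration; the function and stage names are ours, not drawn from Barriault's paper):

```python
def engagement_funnel(observations):
    """Compute the share of visitors reaching each behaviour stage.

    `observations` maps a visitor id to the deepest stage observed:
    "initiation", "transition", or "breakthrough".
    """
    order = ["initiation", "transition", "breakthrough"]
    total = len(observations)
    if total == 0:
        return {stage: 0.0 for stage in order}
    depth = {stage: i for i, stage in enumerate(order)}
    counts = {stage: 0 for stage in order}
    for deepest in observations.values():
        # A visitor who reached a deeper stage also passed the shallower ones.
        for stage in order[:depth[deepest] + 1]:
            counts[stage] += 1
    return {stage: counts[stage] / total for stage in order}

# Example: 10 observed visitors, of whom 6 transitioned and 3 broke through.
visitors = {f"v{i}": "initiation" for i in range(4)}
visitors.update({f"v{i}": "transition" for i in range(4, 7)})
visitors.update({f"v{i}": "breakthrough" for i in range(7, 10)})
print(engagement_funnel(visitors))
# {'initiation': 1.0, 'transition': 0.6, 'breakthrough': 0.3}
```

On this scheme, a high breakthrough share indicates a successful exhibit.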
ART BIBLE
5.1 ART STYLE
5.1.1 LOOK
We wanted the style and look of the city to be cartoon-like and stylized, to appeal to our target audience. We initially started with basic shapes, then evolved into making more complex architectural structures and buildings. We gathered references for stylized buildings and developed models similar to those designs (Figure 1.1). We wanted the buildings to look simple yet full, pairing a simple silhouette with intricate patterns and details. We wanted each building to have unique attributes that best represent it visually, but we also designed the smaller elements of the buildings with the same process to keep them consistent.
Figure 1.1
5.1.2 DESIGN ITERATION
Version 1: The first version was designed for user testing. We designed different types of buildings to show the connection between them. For example, if the user combined two houses, they made a dormitory. Two condos could be combined to make a multi-condo, and two storefronts would turn into a big mall. By applying the same material to all the buildings, we kept the art consistent. After the testing, we found that the design was not clear enough for users to differentiate the buildings and discover the logic behind the combinations. We also received similar feedback from Thinkingbox, and so we developed the second generation of models.
House
Dorm
Hipple House
Condo
Multi-condo
Villa
Storefront
Mall
Skyscraper
Version 2: The second version was built on the foundations of the first. We changed the colours not only to match the blocks, but also to make it easier for users to memorize the buildings. We also reworked the shaders and added more detail to the textures, such as moss on the green walls and randomly lit windows, to make the design more realistic. This version was clearer and more user-friendly, as evidenced by the results of the user testing.
Green
Green by Green
Green by Blue
Yellow
Yellow by Yellow
Yellow by Green
Blue
Blue by Blue
Blue by Yellow
Building Map: Figure 1.2 shows the initial rules for how the buildings are connected and combined with each other.
Figure 1.2
5.1.3 COLOURS
The colours of the buildings change over time, depending on the lighting (daytime versus nighttime). We gave each building a different colour theme, and for each theme we chose different shades (see Figure 1.3) for the brick textures to create richer visual content. We then followed the same principle and later developed more colours of low saturation and low brightness for the environment and UI elements. Colour palette:
Figure 1.3
5.2 3D MODELS
5.2.1 MODELING THE BUILDINGS
We built nine custom buildings with different architectural structures to convey the appearance of a multifaceted city. When building the geometry for the models, we took into account that they needed to run on an iPad, so we made sure the models had a low polygon count. For future development, if there is a need for more models, it would be optimal to build them with a minimal process: objects should not include any extra edges or poly loops, because a minimalist build allows for faster rendering, as well as faster building and iteration. Figures 2.1-2.2 show specifically how we modelled our buildings. During production we did not model each building as a single object; we combined several objects together to form a structure, which allows you to duplicate similar objects and quickly iterate on the style of the building. While building the objects, you should also take into account the normals of faces, and make sure they are either hard or soft depending on the intended look. Once all the objects are in the correct position and orientation, you can combine them into one object and export it to Unity. For software we used Autodesk Maya to model the buildings; alternatively you can use 3ds Max, Blender, or 3D Coat. We used Autodesk Maya because we were familiar with it, and because it provides tools for artists to quickly build models.
Figure 2.1
Figure 2.2
Tools we used in Maya:
Combine - Combines two objects into one
Hard Edge - Hardens the normals of a face to make it look sharper
Soft Edge - Interpolates the normals of a face to make it look smoother
Transfer Attributes - If two objects share the same topology, you can transfer the vertex positions and UV mapping from one to the other
5.2.2 UV MAPPING THE BUILDINGS
We used Autodesk Maya to unwrap the UVs of the 3D objects, because it has many tools that make it easy to unfold and organize the UVs. The objects are mainly hard surfaces, so we first mapped the UVs automatically (Figure 3.1) to quickly unfold all of them. Unfortunately, automatic UV mapping produces unorganized UV shells, so you need to clean them up and organize them together. To organize the UV shells, use the “Cut” and “Stitch Together” tools (Figure 3.2), which help form neat UV shells. You can also use the “Stack and Orient” tools (Figure 3.3) as well as the “Unstack” tool to organize the UVs’ position and orientation within the UV quadrant. Try as best you can to fit the entire building’s UVs into the UV quadrant, and make sure they all share the same texel density; you can use the “Get” and “Set” tools (Figure 3.4) to verify this. There is no single correct way to do UV mapping; use your best judgement to balance efficiency against visible seams.
Figure 3.1
Figure 3.2
Figure 3.3
Figure 3.4
5.2.3 TEXTURING THE BUILDINGS
We used Substance Painter to paint the textures on the buildings, because it is efficient and easy to use. The first step is baking the essential maps that make up the shading for the building: you need to extract several maps from Substance Painter, including World Space Normal, Ambient Occlusion, Position, Depth, Curvature, and Thickness. We then add colours to the shaded model to achieve a specific look or style. In order to reduce the time taken to iterate on the colours, we used masks and fill layers instead of using empty layers and poly-painting the colours onto the models. Fill layers allow you to assign specific colours or textures to specific UV shells. When creating a fill layer, add a black mask to it, and then use the polygon fill tool to isolate the specific UV shells that should be affected by the fill layer. After you complete this process, you can change the colour of the fill layer in the base colour tab. For the shading, we tested different modes and settled on the following setup for baking: within the Base Colour channel, create fill layers and follow the structure shown in Figure 4.1.
Figure 4.1
After completing this process, you should end up with a model that looks like Figure 4.2. Afterwards you should add a folder for colour and place it under the depth folder; otherwise the colour will override the shading. Create new fill layers with textures or colours and place those fill layers under the colour folder. Depending on which colours you assign to the model, it should look similar to Figure 4.3, which incorporates both shading and colour. For specific textures we used the tile generator in Substance Painter, or Substance Designer, depending on the complexity of the texture.
Figure 4.2
Figure 4.3
When exporting the textures for the models, you need to create a configuration in Substance Painter that exports only the Base Colour channel. Create a new channel by pressing “RGB + A” and assign it a name (depending on how you want to name your textures). Afterwards, assign “Base Colour” to the RGB channel and the Opacity to the A channel; it should look similar to Figure 4.4. Return to the texture export menu and select the configuration you created from the config menu. Based on our tests, we agreed that the “Common Padding” should be “Dilation + Transparent”, because it provides the option of editing the textures in Photoshop if necessary. We also decreased the dilation to 3 pixels, because we did not want the dilation to affect any nearby UV shells. Select the maps that you want to export with the desired resolution, and then select the export button.
Figure 4.4
Figure 4.5
5.2.4 EXPORTING TO UNITY
When you’ve finished modeling your 3D objects, you can export them from Maya into Unity. For consistency, we found that exporting the models as FBX files was the most efficient way to export both static models and animations. In order to export an object, select it, go to the File menu, and select “Export Selection” (Figure 5.1). Afterwards you should see a menu where you can choose the location to save your file; make sure to select “FBX export” from the “Files of type” menu (Figure 5.2). Once you import the model into Unity, make sure to adjust the scale factor of the model according to the size of the environment. You can adjust it inside the FBX file settings (Figure 5.3), and the change should affect all instances of the object in the scene. With the right size set, you can now drag your FBX model into the scene and begin interacting with it.
Figure 5.1
Figure 5.2
Figure 5.3
5.3 DYNAMIC LIGHTING AND SHADERS
5.3.1 DYNAMIC LIGHTING
We wanted to create a day-and-night effect to bring more life into the city, as well as to emphasize the shading in the environment. We created a C# script that rotates a directional light in the scene over a period of time that can be altered in the Unity Inspector; you can attach this script to the directional light in the scene. The script also affects other light attributes, including the exposure and the light colour: the exposure is reduced at night, and the colour changes from white during the day to light blue at night. The script also drives the skybox rotation, which is linked to the speed of the day-night shift, as well as the change in the skybox textures. Figure 6.1 shows the attributes that control the day-and-night effect. Before changing any of the attributes, you need to assign the directional light to the “Sun” slot and the skybox to the “SkyBox” slot, so that the script can change their attributes. “Daytime Speed” controls how quickly the days and nights cycle through the scene. “Exposure Min and Max” control the brightness and exposure of the light source during the different cycles. “Day and Night Colours” control the colour that the light source adds to the scene; they change gradually during each cycle.
Figure 6.1
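The attribute behaviour described above boils down to a time-driven interpolation. Below is a rough Python sketch of that math (the actual implementation is the C# script attached to the directional light; the function name and default values here are illustrative):

```python
import math

def day_night_state(t, daytime_speed=1.0,
                    exposure_min=0.3, exposure_max=1.0,
                    day_colour=(1.0, 1.0, 1.0), night_colour=(0.7, 0.8, 1.0)):
    """Return (sun_angle_deg, exposure, light_colour) at time t (seconds).

    The sun rotates 360 degrees per cycle; `blend` is 1 at midday and
    0 at midnight, driving both the exposure and the colour shift.
    """
    angle = (t * daytime_speed * 360.0) % 360.0
    # Cosine blend gives a smooth day-to-night transition.
    blend = (1.0 + math.cos(math.radians(angle))) / 2.0
    exposure = exposure_min + (exposure_max - exposure_min) * blend
    colour = tuple(n + (d - n) * blend for d, n in zip(day_colour, night_colour))
    return angle, exposure, colour
```

Raising `daytime_speed` shortens the cycle, mirroring the “Daytime Speed” field in the Inspector.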
5.3.2 SHADERS
For shading we decided to use cel shading (Figure 7.2) instead of the standard Autodesk Interactive material (Figure 7.3), because we wanted a cartoon-like, simple aesthetic. We designed three shaders that help achieve this look in the environment: one is assigned to opaque objects, the second to transparent objects, and the third to the skybox. The opaque shader (“Toon” in Figure 7.4) has three inputs: Base Colour, Emissive, and Normal. The emissive texture can also gradually turn on and off with a slider called “Emissive Amount” that ranges from 0 to 1. The transparent shader (“Mask” in Figure 7.5) has two inputs: Base Colour and Alpha Mask. The skybox shader (“Skybox” in Figure 7.6) allows you to shift between two different skybox maps, along with the default settings provided by the default skybox shader.
Figure 7.2
Figure 7.3
Figure 7.4
Figure 7.5
Figure 7.6
5.4 ENVIRONMENT LAYOUT
We populated the environment with assets acquired from this account (https://gumroad.com/discover?query=3dex&tags=). We used these assets to add colour to the ground, and added tree objects to the areas that cannot be built on. We bought these assets because we wanted the scene to look more complete and wanted to add variety to the ground textures. We developed the environment to mimic the schematic of a city, which has a grid-like structure for the roads, areas for buildings, and areas for gardens and parks. The environment can be changed or manipulated to suit the experience, as long as it retains the fundamental elements of a city: a road, an area to build, and an area for parks. We specifically acquired the following items: Stylized Dirt Material, Stylized Grass Material, Stylized Stone Material, Stylized Tiles Material, Stylized Wood Material, and a stylized pine tree asset (Figure 8.1-2). You can assign these materials to any properly UV-mapped object in Unity, and you can place the tree asset anywhere in the Unity scene.
Figure 8.1
Figure 8.2
5.5 ANIMATIONS
5.5.1 BUILDING ANIMATION
The animations and interactions are designed for building the three basic buildings and the combinations made from them. Normally there are three properties we can animate: position, rotation, and scale. Since the position information is already controlled by code so that users can move the blocks, we only used the other two to create the final look. We used Maya to author the main animation and exported the FBX files to Unity. In Maya, we set the scene to 30 frames per second, added keyframes on Scale Y to make the transition look stretched, and adjusted the timing. We also added a rotation from 180 to 720 degrees, which positions the building in the correct orientation after the animation concludes. We then edited the curves in the Graph Editor, enhanced the stretch effect, and applied ease-in and ease-out to the rotation curve. Once finished, the Graph Editor should look similar to Figure 9.1 for Rotation Y and Figure 9.2 for Scale Y.
Figure 9.1
Figure 9.2
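The eased rotation described above can be sketched numerically. The following Python fragment evaluates a smoothstep-eased Rotation Y at a given frame (the function names and the smoothstep choice are ours; the real curve lives in the Maya keyframes):

```python
def ease_in_out(u):
    """Smoothstep easing: zero slope at both ends (ease-in and ease-out)."""
    return u * u * (3.0 - 2.0 * u)

def rotation_at_frame(frame, total_frames=30, start_deg=180.0, end_deg=720.0):
    """Rotation Y value at `frame` for a 30 fps pop-in animation."""
    u = min(max(frame / total_frames, 0.0), 1.0)  # clamp to [0, 1]
    return start_deg + (end_deg - start_deg) * ease_in_out(u)
```

Because 720 is a multiple of 360, the building always ends the animation in its original orientation.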
Then export the FBX file and import it into Unity; by connecting the animation file to an Animator Controller, you can apply the animation to all the models. In this project, we applied the popping animation to the three basic blocks. For the combinations, the animation is only applied to the combined buildings. That animation is built in Unity, which is a simpler and more efficient way to achieve the result we want; in this case, the Animator Controller was built first, and then the animated properties were added. To differentiate the two animations, we designed the combination animation in a quirkier way, changing the rotation value on the Z-axis (Figure 9.3) and adjusting the scale curve (Figure 9.4) to create a feeling of growing.
Figure 9.3
Figure 9.4
5.5.2 CAR ANIMATION
We included a custom interaction where cars move freely in the city when there are more buildings that support social wellness (the large blue buildings). To reduce confusion about the interaction, we gave the cars a blue colour to associate them with those buildings. To start, we took a car model, modified it to our specifications, and imported it into Unity as a prefab. We then added two components to the object: a Nav Mesh Agent (Figure 9.5), and a custom script called “AI Movement” (Figure 9.6) that guides the car prefab towards certain destinations and then destroys it. You can adjust the parameters of the Nav Mesh Agent (Figure 9.5) depending on your preferences and how the agent should behave in the environment. For the “AI Movement” script, you need to provide an end point and a start point in order to create the animation: the car begins moving from the start point and moves across the map until it reaches one of the end points. To create the end/start points, add empty game objects to the environment and place them at specific destinations. Afterwards you can assign these game objects to the Next Target or Previous Target slots. The “Previous Target” slot has only one input; this is where the car will be spawned when it is created. The “Next Target” slot accepts a custom number of inputs, which you can adjust with the Size parameter. Add the end-point game objects to the Next Target slots, and the car will move to a random end point once it is spawned.
Figure 9.5
Figure 9.6
In order for the cars to move across the map, you need to bake a walkable navmesh onto a plane in the environment. We created a road object that represents the walkable plane and baked the navmesh onto it. To begin, select the road, then go to the Navigation tab under the AI menu, where you should see four tabs (Figure 9.7-8). First, check “Navigation Static” and “Generate OffMesh Links” in the Object tab, then choose the attribute you want to assign from the Navigation Area: for example, Walkable areas allow the cars to travel across them, while Not Walkable areas prevent the cars from traveling on them. Repeat this step for all the planes within the environment, choosing the appropriate attribute based on your preferences. Once you’ve completed the last step, move to the Bake tab and click the Bake button (Figure 9.8). You can adjust the padding between the Not Walkable and Walkable areas with the “Agent Radius”. You should see the plane area turn light blue, which means it is now active (Figure 9.9). Now the car (a Nav Mesh Agent) can travel in the light blue areas of the environment and avoid the Not Walkable areas.
Figure 9.7
Figure 9.8
Figure 9.9
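A simplified Python sketch of the “AI Movement” behaviour described above (straight-line movement stands in for the baked navmesh, and the class and field names are our own, not the project's C# script):

```python
import random

class CarAgent:
    """Spawn at the previous target, pick a random end point from the
    next targets, step toward it each update, and despawn on arrival."""

    def __init__(self, previous_target, next_targets, speed=1.0):
        self.position = previous_target                  # the single Previous Target slot
        self.destination = random.choice(next_targets)   # one of the Next Target slots
        self.speed = speed
        self.alive = True

    def update(self, dt):
        if not self.alive:
            return
        dx = self.destination[0] - self.position[0]
        dz = self.destination[1] - self.position[1]
        dist = (dx * dx + dz * dz) ** 0.5
        step = self.speed * dt
        if dist <= step:                  # reached an end point
            self.position = self.destination
            self.alive = False            # Destroy(gameObject) in the Unity version
        else:
            self.position = (self.position[0] + dx / dist * step,
                             self.position[1] + dz / dist * step)
```

In the real scene the Nav Mesh Agent handles pathing around Not Walkable areas; here the agent simply walks a straight line.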
5.6 USER INTERFACE
The UI is designed in bold, simple colours, intended to evoke the blocks as well as a more childlike interface. Text and image boxes are designed with rounded corners. Colours are used mainly in a flat manner, with some use of gradients, such as in the buttons, for emphasis. Drop shadows are used to set objects apart on a separate layer in front of other UI elements.
5.6.1 FONTS
Two main fonts are used throughout the game: one for the logo and one for body text. The logo font is Cheap Pine and the body font is League Gothic. Cheap Pine is used in the current version of the installation because an Adobe Fonts commercial license is available to the team through the CDM’s Adobe CC subscription. However, if the installation were to be made commercial, independent of the CDM Adobe CC subscription, the license would have to be bought from its respective design studio or through a separate Adobe subscription. A final alternative would be a font change, to one that is either open source or has a commercial license available. League Gothic is a free font, so no further action is needed to use it in the installation.
5.6.2 EXAMPLE SHOTS
5.6.3 UI ASSETS Central items to the overall UI design are shown here, separate from other elements.
5.6.4 LOGO
The logo for Blocks to Buildings is simple, childlike, and meant to evoke the DIY, off-kilter vibe of older building shows and games (e.g. Bob the Builder). The font has a constructed quality to it, and the three blocks mimic the blocks the player can use to build their own city. The colour version keeps with the traditional block colours (green, blue, and yellow), and the logo can also be used in black and white.
5.7 COLOUR SCHEME
The overall colour palette of the Blocks to Buildings installation is based on two separate schemes: one for the UI and one for the models and environment. The colour palette for the environment has a lower overall saturation, leaning towards more naturalistic colours for a sense of realism.
The colour palette for the UI is based on the bright, bold colours of the building blocks. This is reflected in the buildings as well, whose colours were made to match the colours of the blocks.
TECHNICAL DOCUMENTATION
6.1 CHOOSING THE TECH
Team Spark researched and considered a few different options when it came to the best tech for the experience. The top four are outlined below.

Traditional AR
Idea: Using a toolkit such as ARKit or Vuforia, we can use marker or object recognition to identify the blocks and build the city.
Foreseeable Issues: Both Vuforia and ARKit can only identify up to 5 objects at a time, making the scale of the project (an entire city built of blocks) difficult.
Alternative: This tech could currently be better suited to smaller-scope projects, such as building a single house. Otherwise, we would have to build our own AR app to match the scale of the project.

RFID or Bluetooth Beacon
Idea: Use RFID chips or a Bluetooth beacon to map the placement of the blocks and identify which model should be paired with each specific block.
Foreseeable Issues: Wouldn’t enable the “combination” aspect of the initial idea, as the blocks would have to be associated with specific models. There may also be issues with the precision of the location tracking.

3D Printed Buildings on an Interactive Screen
Idea: Inspired by the Maersk Drilling table, this idea would allow users to place buildings on an interactive screen and then “build” through the virtual world.
Foreseeable Issues: This idea loses a lot of the physical building aspect of the original idea. Instead, players could place a model and then transform it (make it taller, shorter, wider, etc.) on screen. A potential pro is that very few models are needed, because players can move them to new locations once they’ve finished editing the original location in the virtual world.

Interactive Touch Screen [FINAL DECISION]
Idea: This idea is very similar to the one directly above but allows for a bit more physical creativity. Players are given blocks with multi-touch capabilities. When they place a block on the screen, the object is recognized and the model appears on screen.
Foreseeable Issues: Does not allow for stacking blocks, so a planned-city approach (i.e. providing geographic landmarks/roadways for the player to build around) is suggested. Reference: https://www.instructables.com/id/Object-Interaction-With-Touchscreens/
From these options, the team chose the top two and presented them to Thinkingbox. Eventually, the Touchpoints method was chosen. Details on both technologies, including pros and cons, are outlined below.
6.1.1 AUGMENTED REALITY
The team compared Vuforia and ARKit to determine the best software to use, should we choose Augmented Reality as our technology of choice.

Vuforia:
- More supported devices
- Baked into Unity
- Uses C#
- Functionalities: Model Targets with Deep Learning (instantly recognize objects by shape using pre-existing 3D models and deep learning algorithms); Image Targets (the easiest way to put AR content on flat objects); Multi Targets (for objects with flat surfaces and multiple sides); Cylinder Targets (for placing AR content on objects with cylindrical shapes); Ground Plane (as part of Smart Terrain, this feature enables digital content to be placed on floors and tabletop surfaces - see the list of supported devices); VuMarks (identify and add content to a series of objects); Object Targets (for scanning an object); and Fusion
- Better for managing and customizing shaders, renderings, and 3D models

ARKit:
- iOS 12+ or iPadOS (ARKit 3 limited to Apple devices with A12/A12+ processors)
- Baked into Xcode
- Uses Swift
- Three similar fundamental concepts: World Tracking; Scene Understanding (which includes three stages: Plane Detection, Hit-Testing, and Light Estimation); and Rendering with the help of ARKit’s companion, the SceneKit framework, which is actually a game engine
- Uses Visual Inertial Odometry to very accurately track the world around your phone
- [ARKit 3.0] Support for collaborative sessions that allow sharing of World Maps
- Difficult to customize shaders and renderings
Verdict
Vuforia is better for Unity integration. As a game engine, Unity is better at incorporating and managing models, shaders, textures, etc. than Xcode; it is also multi-platform. ARKit provides better world tracking and object detection, and also provides extended tracking, which allows for managing many 3D object anchors simultaneously (see Extended Tracking below). Both platforms proved roughly equally capable of accomplishing any task; the choice comes down to personal preference of coding environment.

Features and Limitations of AR
Object Detection
For 3D object detection, we require objects with high detail (“features”). This means small textures, gradients, or rough surfaces, so that the object target’s features can be differentiated from the background.
● The blocks don’t work - smooth surfaces, solid colours, too small
● Combinations of blocks are better, but still hard to detect accurately because of the very smooth surfaces and large areas of the same colour/texture
● Objects with positive results: coffee cups with designed cup holders, toys, blocks with markered surfaces

Number of Detected Images/Objects
● Vuforia has a limit of 5 ImageTargets.
● ARKit provides the property configuration.maximumNumberOfTrackedImages, which allows you to define the maximum number of image targets to be tracked in a view at any given time. It has a default value of 1, but can be increased. Various posts suggest the maximum possible value is 2 to 4. It seems like a value that Apple aims to keep increasing over the years as their AR tech improves, but for now the maximum is 4.
● Object detection is even slower and more energy-consuming than image detection. As things stand, even object detection (marker or markerless) tops out at around 4 objects in a given frame, and realistically 3 or fewer would provide a pleasant user experience.

Extended Tracking
ARKit uses extended object tracking in such a way that it only tracks one object at a time (no concurrent tracking), but at the same time maintains the anchors dropped on previously tracked objects in world space. It allows you to drop virtual nodes positioned based on these anchors. This means ARKit will continue to track any virtual objects in world space even after it has lost current tracking of the corresponding real-world object. This allows for a more fluid and immersive experience, while adding as many anchors to the app as we want.

Limitations
● Lag in object detection (especially for 3D objects). This improves with markers, but that limits what physical objects can be used for the project.
● Might require calibration of the world environment and/or tutorials for first-time users.
● Jitters in continued tracking, especially as the number of tracked objects increases.
6.1.2 TOUCHPOINTS
One of the options that we explored during the prototyping phase involved a tutorial called Object Interaction With Touchscreens. With its helpful guides and to-dos on software functionality, touchpoint methodology, and necessary materials, this resource formed an important precursor to our final deliverable by serving as the inspiration and guiding methodology for the Unity project we have today.
Above: One of the early methods considered for touchpoint systems involved differentiating interactions based on distinctive isosceles-triangle touchpoint configurations. These configurations were encoded by noting the unique angle of each isosceles triangle in the leaf elements of an XML file. An example of such a file is pictured below:
Although the idea and guide were excellent and most of the tutorial was instructive and clear, after working on and exploring the software referenced in Step 6, we found several drawbacks to pursuing the use of this pre-existing software: namely, the software was not well maintained (seemingly by one independent developer, with the most recent commit on Sep 6, 2016 - 3 years ago), and as a result, neither the tutorial nor the GitHub repository contained the project documentation necessary to set up and extend the project.
Above: View from the master branch of the GitHub repository. Note the lack of the usual README.md file that typically accompanies complex software projects like this, as well as the age of the most recent commit. However, we took it upon ourselves to forge ahead and navigate the perils of open source software anyway. Whereas technologies from companies such as Stripe, Apple (e.g. ARKit), Microsoft, and Unity (e.g. Vuforia) have teams of people generating and maintaining forums and extensive documentation to guide users through the proper setup and usage of their developer tools and software, with an open source project like this we had to piece all this knowledge and information together ourselves. In the process we wrote documentation on the setup of this project, aiming to be more comprehensive than the documentation available online. Should the client find the curiosity or need, this documentation for setting up the iOS software can be found detailed at this link.
Additional potential trouble spots with this software were that it relies on a tech stack of Xcode and C++, particularly openFrameworks, in which the tech lead, Silver, had little to no expertise. Additionally, it is worth noting that you cannot compile or run this software project without a licensed Apple Developer Program account, which retails for $99/year. These factors, in combination with the issues previously mentioned, gave us reason to doubt, and to seek alternatives to, working directly with this software in the long term. It was for these reasons that we decided to build our own variation of the project in Unity that accomplished all the capabilities of the touchscreen object and more. However, it is important to note that this resource still proved very valuable: it gave us the whole idea and informed us about the touchscreen method, along with the materials and different “ingredients” we’d need (as well as the recipe).
6.2 TECHNICAL GUIDE 6.2.1 TOUCH DETECTION By using the Unity Input class, we are capable of tracking multiple fingers touching the mobile device screen simultaneously. The Input.touches property array provides detailed information about how many touch points are on the screen, as well as the position and status of each touch point. There are two methods we could use to create buildings based on touchpoints, compared below.

Identify touchpoints based on triangular shape
● Implementation: group all adjacent touch points; find all the touch points that form a triangular shape; create buildings based on the angles of each triangle
● Advantages: creates multiple buildings at the same time; higher theoretical accuracy
● Disadvantages: heavy computation

Identify touchpoints based on number of touches
● Implementation: count the number of touches on the screen; create a building based on that count
● Advantages: easy implementation
● Disadvantages: lower accuracy; creates one building at a time
In this prototype, we used the method that identifies touchpoints based on the number of touches, since it is easier to implement and proved more reliable in practice (see 7.3). When there are two, three, or four touches on the touch screen, the system calculates the middle point between the touchpoints and detects whether there is any building under that point. If there is already a building under it, and the building is not a combined building, the system destroys it and creates a new building based on the number of touch points.
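The midpoint logic described above can be sketched as follows. This is an illustrative Python sketch of the algorithm only, not the actual Unity C# implementation; the function names, the grid-cell keying of the board, and the count-to-block mapping are all hypothetical.

```python
def touch_midpoint(touches):
    """Average the (x, y) positions of the active touch points."""
    xs = [p[0] for p in touches]
    ys = [p[1] for p in touches]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def handle_touches(touches, board, block_for_count):
    """Create (or replace) a building at the cell under the midpoint of
    2-4 simultaneous touches; combined buildings are never replaced."""
    n = len(touches)
    if n not in (2, 3, 4):
        return board
    mx, my = touch_midpoint(touches)
    cell = (round(mx), round(my))
    existing = board.get(cell)
    if existing is not None and existing["combined"]:
        return board  # combined buildings are left untouched
    board[cell] = {"type": block_for_count[n], "combined": False}
    return board
```

The key behaviour is the guard on combined buildings: placing a block over a merged building leaves it intact, while placing it over a single-block building replaces that building.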
6.2.2 COMBINATION Each building object has an associated collider which detects when any other building enters its collider space. When that happens, we clear the two 'colliding' buildings and introduce a new combined building, created based on the tags of the two 'colliding' buildings. Tags indicate different kinds of buildings: there are nine building tags in total, six for combined buildings and three for single-block buildings. The table below shows all the possible combinations and their corresponding building tags.
            Building              Building1             Building2
Building    BuildingCombined1x1   BuildingCombined1x2   BuildingCombined1x3
Building1   BuildingCombined1x2   BuildingCombined2x2   BuildingCombined2x3
Building2   BuildingCombined1x3   BuildingCombined2x3   BuildingCombined3x3
Unity 3D physics also requires that at least one of the colliding objects has a Rigidbody component attached for collision detection to register. Since all the single-block buildings can collide with each other, a Rigidbody is attached to all of them.
Here is the Rigidbody setting we used: since gravity is not relevant when building cities, it is turned off. And since multiple combinations on the same building are not desired, only single-block buildings have colliders and Rigidbody components attached.
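The tag-based combination lookup can be sketched in Python (illustrative only; the shipped code is C# in Unity, and the helper names here are hypothetical). Using an unordered pair as the key keeps the lookup symmetric, so Building + Building1 and Building1 + Building yield the same combined tag.

```python
# Unordered pair of single-block tags -> combined building tag.
COMBINATIONS = {
    frozenset(["Building"]):               "BuildingCombined1x1",
    frozenset(["Building", "Building1"]):  "BuildingCombined1x2",
    frozenset(["Building", "Building2"]):  "BuildingCombined1x3",
    frozenset(["Building1"]):              "BuildingCombined2x2",
    frozenset(["Building1", "Building2"]): "BuildingCombined2x3",
    frozenset(["Building2"]):              "BuildingCombined3x3",
}

def combined_tag(tag_a, tag_b):
    """Return the tag of the building that replaces two colliding ones."""
    return COMBINATIONS[frozenset([tag_a, tag_b])]
```

Note the six combined tags plus the three single-block tags account for all nine tags in the system.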
6.2.3 AFFORDANCES In general, there are three sustainable building categories: green space, energy efficiency, and social wellness. Each single-block building belongs to one of these categories, and each combined building inherits the categories of the two buildings it was combined from.
Affordances will appear based on the number of buildings in the sustainable building categories, as shown below.
Green Space
● 3 or more buildings in the category: bird sounds play in the background; trees grow on the ground
● Fewer than 3: no bird sounds in the background; no trees on the ground

Energy Efficiency
● 3 or more buildings in the category: an electricity sound plays in the background; the road texture switches to a circuit pattern
● Fewer than 3: no electricity sound in the background; the road texture switches to a soil colour

Social Wellness
● 3 or more buildings in the category: crowd-talking sounds play in the background; cars move around on the roads
● Fewer than 3: no crowd-talking sounds in the background; no cars moving around on the roads
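The threshold rule behind these affordances can be sketched in Python. This is a hypothetical sketch of the rule only, not the Unity implementation; the category strings and function name are our own.

```python
THRESHOLD = 3  # affordances turn on at 3 or more buildings in a category

def active_affordances(categories):
    """categories: one category string per building currently on the board.
    Returns, per category, whether its affordance set (background sounds,
    trees, road texture, cars) should currently be active."""
    counts = {}
    for c in categories:
        counts[c] = counts.get(c, 0) + 1
    return {c: counts.get(c, 0) >= THRESHOLD
            for c in ("green_space", "energy_efficiency", "social_wellness")}
```

Combined buildings would contribute to every category they inherit, so a single combined block can advance two counters at once.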
6.2.4 SCREEN SHOT In the system, there are two screenshot functions. The first uses the open-source asset 360 Screenshot Capture, which takes a screenshot and injects XMP metadata into the output image so that the image is supported by web-based 360 viewers such as Facebook or Flickr. In this prototype, captured 360 images are saved to the Application Data Path under the ScreenShot/360Image/ folder; the images can be viewed in web 360 viewers, but users have to upload them manually. Here is a sample picture in a regular image viewer. View this picture in 360 on Flickr.
The other function captures screenshots from four different camera angles and outputs four PNG images. Note: this function is not compatible with the light rotation. Here are some sample outputs:
6.2.5 MULTIPLAYER EXPERIENCE We realized that, given the size of the blocks, the iPad touch screen provided a very limited game area for users to build a city in. Furthermore, we wanted to incorporate an element of collaboration into the building process, so we decided to expand the experience with multiplayer functionality. To implement this, we used the Unity UNET Low-Level API (LLAPI). We implemented a standalone server application with a combined game board, and set up the installation with four iPads, each running a client application that connects to the server. Since this was an on-site installation, we used a LAN server connection over WiFi. Each client has a handshake method which, once connected to the server, receives a unique connection ID and accordingly generates a unique city area on the screen. This makes the environment extensible: as more iPads are added, we get a variety of city planes to work on.
We used a variety of two-way messages to communicate between the clients and the server, including changing the position of or deleting a building, rotating the game view, and turning the game on or off.
One of the main challenges we came across while developing the multiplayer platform was dealing with duplicate IDs for different game objects in different client apps. This meant that creating, updating, or deleting a building created by one client could affect other clients' buildings as well. To deal with this, we created a function that generates a unique ID for each GameObject created in each client's scene, combining the GameObject id provided by the client with the client's connectionId. We also maintained a local Dictionary to keep track of which buildings are on the board for any update or delete.
We can also extend this function to use a more secure hashing function to prevent any possible collisions, but given the scale of this project, it didn’t seem necessary.
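A Python sketch of this ID scheme and the accompanying Dictionary (illustrative only; the actual code is C# in Unity, and the class and method names here are hypothetical):

```python
def unique_building_id(connection_id, local_object_id):
    """Namespace a client's local GameObject id by its connection ID so
    two clients can never collide on the same key."""
    return f"{connection_id}:{local_object_id}"

class BuildingRegistry:
    """Server-side dictionary tracking which buildings are on the board."""

    def __init__(self):
        self.buildings = {}

    def create(self, connection_id, object_id, data):
        self.buildings[unique_building_id(connection_id, object_id)] = data

    def update(self, connection_id, object_id, data):
        key = unique_building_id(connection_id, object_id)
        if key in self.buildings:
            self.buildings[key] = data

    def delete(self, connection_id, object_id):
        self.buildings.pop(unique_building_id(connection_id, object_id), None)
```

Because each key embeds the connection ID, two clients using the same local object id address different entries, so one client's delete can never remove another client's building.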
Furthermore, since the server app represents the game board as a single plane, we had to find a way to calculate the exact position of each building on the game board given its position on the client side. We arrived at the following solution, which provides a modular, easily extensible formula that allows for more clients. In this method we use the bottom left as the origin and calculate all positions relative to it.
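As an illustration of this kind of mapping, here is a minimal Python sketch, assuming each client owns an equal-width strip of the shared plane, offset along x by its connection ID. The strip layout, zero-based connection IDs, and width parameter are our assumptions for the sketch, not the exact production formula.

```python
def to_board_position(client_pos, connection_id, client_width):
    """Map a client-local position (origin at the bottom left) onto the
    shared server board by offsetting along x per connection ID."""
    x, y = client_pos
    return (x + connection_id * client_width, y)
```

With four iPads of equal width w, client 0 occupies x in [0, w), client 1 occupies [w, 2w), and so on; adding a fifth iPad requires no change to the formula.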
Note: The Unity UNET framework was marked as deprecated at the beginning of this year. Unity plans to release a new framework for creating multiplayer games but currently has no timeline for its launch, so we used UNET for this project since it is still supported for the time being.
PIVOTS AND LESSONS LEARNED
7.1 USER EXPERIENCE Stacking Versus Sprawling The greatest User Experience adaptation during the project was encouraging users to build in a sprawling manner instead of stacking the blocks. Solution: first, we added icons to the tops of the blocks; then we created a grid-like map for the user to build upon. Both served to encourage building out rather than up.
7.2 ART Art Mirrors Life The greatest adaptation for Art was the set of changes made to the models to reflect the blocks in both colour and size. The original models were more lifelike; however, their structure and textures were changed so that users could quickly and easily draw the connection between the blocks they were playing with and the models they saw on screen. Different Environments Another pivot was fairly small and easy to implement but went a long way for user comprehension: the team ensured that the environment on each iPad was different, which further helped to orient the user.
7.3 TECH AR Versus Touchpoints The team researched a number of different technologies that could have served to create this experience. The final two were augmented reality and touchpoints. After discussing with Thinkingbox, the team chose to move forward with touchpoints, mainly because the touchpoints could read and register the blocks much faster, providing a smoother interaction for the player. 2-3-4 Touchpoints Versus Triangle Touchpoints
Originally, the team discussed using isosceles triangles to form the touchpoints. In this case, all the blocks would have had three touchpoints, and the angle of their vertex would have been the differentiating factor. However, we quickly found that this method was not very reliable: the blocks were often confused for each other. Because of this, we pivoted and assigned distinct 2, 3, 4, and 5 touchpoints to the blocks.
7.4 CLIENT Godzilla to Air Canada to Science World When the team originally pitched this idea, it was for the Godzilla movie. However, as that movie came out at the beginning of June, the team decided to choose something a little more relevant (and a little less specific). At first, we tried to stay in the realm of entertainment and looked at other games and movies, but they felt too specific. We considered creating an installation for Air Canada, but that idea required a more "paint-by-numbers" approach, so it was nixed. Finally, we settled on an installation for Science World, where we could make the focus education. The installation is built to be adaptive, however, and other uses for it are listed in the next chapter.
FUTURE DEVELOPMENTS
Given the limited duration of this project, Spark could not develop all the features we envisioned for this product. This chapter details some developments that we feel would be good additions to the product.
8.1 ADDITIONAL FEATURES 8.1.1 ROTATION In future versions, Team Spark suggests including a rotation block. This block would require a specific three-point angled touchpoint pattern (an isosceles triangle made from dots of copper tape). When the player places it on an existing building on the screen, they would be able to rotate the block, and the model would rotate as well. We suggest using one of the circular blocks to distinguish it from the building blocks.
8.1.2 UNDO Team Spark would also suggest incorporating an undo function into the remote control, which would allow the player to undo the most recent combination. This would increase the control the player has over their city and allow them to place blocks close to each other without combining them.
8.1.3 GAMIFICATION The current version of the installation has some affordances that can influence player behaviour; however, gamification itself was moved to future versions due to time constraints. The original idea was to have scores associated with the three types of buildings (green space, energy efficiency, and social wellness). The focus would be on balancing the three elements, as a sustainable city needs balance to be effective. We played around with a few different ways of implementing this and would recommend the following if future versions include this feature: Scorekeeping We recommend instituting a form of scorekeeping for the game, where the buildings the player places on the board affect their score. We played around with different variations of scorekeeping and, through testing, discovered that the best way to convey the balancing aspect of the game was either a gauge or a radar chart (see image below).
It’s important to note that if the specific numbers are displayed, the player will focus more on balancing the numbers than on the act of building itself. It’s for this reason that we suggest more open, interpretive options for scorekeeping. Here is the base points system developed by Team Spark:
Building                        Social   Energy Efficiency   Green Space
House (Green)                     -1            1                 1
Apartment (Green + Green)          1           -1                 2
Condo (Yellow)                     1            1                -1
Mega Condo (Yellow + Yellow)       1            2                -1
Storefront (Blue)                  1           -1                 1
Mall (Blue + Blue)                 2            1                -1
Dorm (Green + Yellow)              1           -1                 1
Skyscraper (Yellow + Blue)         1            1                -1
Hippie House (Green + Blue)       -1            1                 1
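A scorekeeping sketch based on this points system, in Python. The function names and the spread-based balance metric are hypothetical illustrations for future versions, not part of the current build.

```python
# Base points per building: (social, energy efficiency, green space).
SCORES = {
    "House":        (-1, 1, 1),
    "Apartment":    (1, -1, 2),
    "Condo":        (1, 1, -1),
    "Mega Condo":   (1, 2, -1),
    "Storefront":   (1, -1, 1),
    "Mall":         (2, 1, -1),
    "Dorm":         (1, -1, 1),
    "Skyscraper":   (1, 1, -1),
    "Hippie House": (-1, 1, 1),
}

def city_totals(buildings):
    """Sum the per-category points for every building in the city."""
    s = e = g = 0
    for b in buildings:
        ds, de, dg = SCORES[b]
        s, e, g = s + ds, e + de, g + dg
    return {"social": s, "energy": e, "green": g}

def imbalance(totals):
    """Spread between the best and worst category; 0 is a balanced city."""
    vals = list(totals.values())
    return max(vals) - min(vals)
```

For example, a House plus a Condo totals (social 0, energy 2, green 0), an energy-heavy city with imbalance 2; adding a Storefront brings the totals to (1, 1, 1), a perfectly balanced score. A gauge or radar chart could then display the imbalance without exposing raw numbers.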
Affordances We recommend maintaining the affordances already in the game as additional feedback for the player. These affordances can also be refined further. Currently, they appear after 3 buildings of any one kind are placed in the space. Eventually, they could be more gradual, appearing and disappearing in proportion to the rest of the buildings in the space. Regulations In order for gamification to be complete, we would recommend including more regulations. First, a regulation on where the player can build each building would create the need for a bit of strategy. Second, we suggest creating a population goal: the player must house a certain number of people while maintaining balance in the city. This adds another level of challenge and complexity to the game. For more information on our research into gamification, please see Sustainable Gamification Elements in the 01_Research folder.
8.2 TAKEAWAY The current MVP takes a 360° photo of the completed city and displays it for the player. In future iterations, we would recommend making this photo accessible through a link or email so that players can share their experiences online.
8.3 ADDITIONAL USES/CLIENTS While the current MVP is built as an educational tool for science institutions and learning spaces, there are other applications for this technology. We've outlined some of our ideas below, with a specific focus on the commercial marketing space. Godzilla The original pitch for this idea was a marketing tool for the Godzilla movie. To adapt the installation to this idea (or a similar movie), two major changes would have to be implemented. First, Team Spark would recommend a timer for building: after 3 minutes, the experience would come to a close. Second, an animation and the destruction of the city would have to be added: after the timer sounds, Godzilla would traipse through the city, destroying it. This format could work for any number of disaster movies. DC Comics The second idea focused on bringing a younger audience to the historically more somber, adult DC Comics. The purpose of the installation would be to engage new users in the DC Comics world through an experience that speaks to their other fields of interest, primarily world-building. One major change we would suggest for this route is the skinning of the game. Players would be able to create different worlds in the DC universe (Gotham, Themyscira, Atlantis, etc.) by choosing which skin they'd like to place on the building. Similar to the Godzilla idea, we'd also suggest some form of animation at the end of the experience, such as Batman swooping through the buildings of Gotham. For more details on this client, see DC Comics - Audience and Client Profile in 01_SupportingDocuments.
Air Canada Finally, this installation could be set up in airports to engage flyers. Upon scanning their boarding pass, they could build the city they are about to visit. The main difference in this experience would be the freedom to build: in this scenario, we would suggest more of a fill-in-the-blank or paint-by-numbers approach. This is because players would be building existing cities with identifiable landmarks, and it would be incongruent to allow them to move things around or change the makeup of the city. More details on this approach, as well as the current iteration, can be found in UX Use Cases in the 01_SupportingDocuments folder.
A PROJECT BY