



Rhys Duindam


Eindhoven University of Technology

Sebastiaan Krijnen s114143 B32




This arrow will "post" the knowledge (in this case from a research paper) to the first iteration. P22 means that the link can be found on page 22. Sometimes, a small description of the link is added too.

This arrow will "receive" the knowledge (again from a research paper) from another section. In the same way, you could find the information on page 13.


HOW TO READ THIS REPORT? In this report you will find some symbols that may seem odd to you. I'm talking about the arrows, spread around the entire report. These arrows indicate that there's a connection between two parts of the design process: one event may have led to some findings that were useful in another event. During my design process, I've noticed that almost all of the decisions, thoughts and events were connected to each other. So I added these symbols purely as a clarification of the design process. Sometimes the arrows or lines stay on the same page; sometimes the thoughts will be indicated with a small arrow. Please see the example:


TO FIRST ITERATION P11 — High frequencies can be felt the best with the fingertips.

FROM RESEARCH PAPER P13 — Different genres have a relatively low impact on the amount of flow reached with the tested deaf people.

However, not all the connections will be shown. If I had done that, this report would have become one big mess of arrows and lists. Enjoy reading!


























1. PROJECT VISION — Finding my drives and intentions ............... 7
2. Foundation of knowledge and ideas ............... 8
3. Getting grip on a concept ............... 22
4. Shaping, reshaping & refining ............... 38
5. An overview of the result ............... 60
6. CONCLUSIONS — Reflection & what it could become ............... 68


Project description: "This project focuses on the development of new musical instruments for mentally and/or motorically impaired people. The challenge in this project is to combine the notion of new ways of controlling (electronic) music with the extra challenge of designing for specific groups, which in turn may lead you to a very innovative controller." 1


1. PROJECT VISION - finding my drives and intentions

Trying to define my project vision, I started working from my vision as a designer. My goal is to get people to notice, feel and eventually admire the deepest meanings and reasons of a design, mainly as a result of the designer's intrinsic motivations. So as a starting point, I took the question "What is my intrinsic motivation for designing this instrument for disabled people?" as a point of grip for choosing a direction. So, what is it, this motivation for doing the project as my FBP anyway? I'll answer it: everyone on earth knows the feeling of listening to music; it could be stated – roughly – as being equal in strength for every individual. However, the feeling of freedom, control and fun of actually playing and mastering an instrument is not directly available to everyone. Especially not to disabled people, who may have physical or mental limits to playing an instrument. If I design an instrument for these people, I will be able to share this feeling that makes me so extremely happy.

Still, as my motivation for the project, I would not consider this strong enough. I was looking for something that triggered me even more. The answer could be found in my yet-to-be-defined target group. So what kind of disability was I looking for? In the first place, a disability that has a fundamental effect on the perception of music. I set this to ensure that I am reaching a target group that is in need of a design not only for playing, but also for perceiving the instrument. In my opinion, that could be something that would help this group even more. This would surely be something challenging and extraordinary. After all, I was looking at my fascinations as a musician, in relation … Summing this up, I almost automatically chose my target group: deaf people. Or, as I would describe it myself in fewer words:

"Design a musical instrument for disabled people that can be adapted and is able to adapt itself."

So, now that I've finally guided you to my choice by shaping my project vision, I can proudly present the project goal:

(To) give deaf people a musical experience by playing an …

How could you describe this musical experience? How could deaf people actually notice the music? Answers to all the questions that you now (and I at that time) may think of will likely be given in the rest of the report.

Here you can find the full project description:







2.1 LITERATURE RESEARCH — Feeling music — Seeing music


2.2 INITIAL IDEAS — Conductor's sticks — Embracing sound — Knockbox — Capture sound — Playing with electronics: Ping Dong


2.1 LITERATURE RESEARCH - foundation of knowledge -

The research paper
Different from what I'd expected, there were hardly any research papers centered around my goal. However, literature about deaf people in connection to music was widely available. Just to be clear, these papers were about the passive form of using music or sound, not the active form of playing an instrument. A helpful design research paper was "Enhancing Musical Experience for the Hearing Impaired using Visual and Haptic Displays" 2. In my own words, it mainly addressed the question: how to give deaf people a musical experience by offering them a tactile and visual representation of music?

— diagram and photo of the Haptic Chair, used to communicate music to the deaf test persons


— Feeling music

In the paper, theories about the three main senses hearing, touching and feeling are discussed. A very fundamental statement was made here:

"Shibata (2001) found that some deaf people process vibrations sensed via touch in the part of the brain used by most people for hearing." 3

"Sound transmitted through the air and through other physical media such as floors, walls, chairs and machines act on the entire human body, not just the ears, and play an important role in the perception of music and environmental events for almost all people, but in particular for the deaf." (Glennie, 2009) 4

This ensured that the sense of touch could offer a way of perceiving music to deaf people. Even with objects and surfaces from the everyday environment, sound can be transmitted successfully to the deaf person.

"Palmer developed a theory in which he claimed that the vibrations produced by low-pitched (low frequency) tones can be felt by body sensors in the feet, legs and hips; middle tones can be felt in the stomach, chest and arms; and high-pitched tones can be felt in the fingers, head and hair." 5

TO FIRST ITERATION — High frequencies can be felt the best with the fingertips.

The above mapping of different frequencies to different body locations can provide the most successful sound communication to deaf people.

"In addition, Kayser et al. (2005) suggests that a mechanism to physically 'feel' music might provide an experience to a hearing-impaired person that is qualitatively similar to that experienced by a hearing person." 6

So, a step further towards my main goal: this stated that deaf people could even get a musical experience just like hearing people would. However, to use this term, a closer definition should be given to better understand it. There is no widely accepted definition for this complex concept, though. An approach was given by Professor Daniel Levitin (2007): "The aesthetic and emotional appreciation of variations in pitch and rhythm over time, which typically cause a desire to synchronize one's body movements to the underlying (rhythmic) pulse." As this definition is completely based on hearing people, it could not fully suit the context around my goal. For a more suitable answer, the paper referred to Sloboda, O'Neill, and Ivaldi (2001)7. They have studied musical engagement in everyday life using conventional 'flow' methodology: someone who experiences timelessness, effortlessness and lack of self-consciousness would be in a state of flow (Csikszentmihalyi, 1975)8. Using this flow as a point of grip, it can be measured (on the Flow State Scale, FSS) to find to what extent someone has a musical experience.

— Seeing music

This scale enabled them to measure the felt musical experience using haptic and/or visual displays. This was done by placing test persons on the Haptic Chair (see the previous page for an image), playing music and eventually adding a visual representation of the music, using a 2D panel of moving shapes and colors synchronized to the music (see the figures below). The highest FSS results were found when using either the haptic display alone or the haptic display together with the visual display. Specific results for the played musical genres hardly gave any difference in result.

TO INITIAL IDEAS P15 — Different genres have a relatively low impact on the amount of flow reached with the tested deaf people.

"Taylor, Moss, Stamatakis, and Tyler (2006) suggested that the human perirhinal cortex helps to bind the major aspects of audio-visual features to provide meaningful multi-modal representations. It can thus be established that even though humans receive sensations via distinct sensory pathways, the information is not always perceived independently, but often put together before being finally processed." 9

"Thompson, Russo, and Quinto (2008) have shown that facial expressions of a singer can significantly influence the judgement of emotions in music. Referring to the above with regard to people with hearing impairment, exploiting the visual mode might be one of the ways to compensate for the lack of auditory information." 10

All of this suggests that a visual element as an addition to the tactile element can amplify the musical experience. Especially human gestures or other movements that express emotion can amplify the musical feeling even more.

"Many participants reported that they could 'hear' better when watching human gestures while listening to music and seated in the Haptic Chair. Referring to face/lip movements and conductor's gestures, some participants said these (gestures) 'are more musical'."

As you can read above, watching human gestures while feeling the music contributed even more to the musical experience than the abstract visual display did.

TO INITIAL IDEAS P15 — Body movements as gestures to express music (like a conductor or singer) give a meaningful visual representation of music.

— 2D representation of music during the trials, using moving colored blobs
— 3D representation of music during the trials: screen captures of an orchestra conductor making expressive gestures
— photos of singers (who have inspired me) showing their musical emotions by using gestures and facial expressions. From low to high: Michael Jackson, Chris Martin, Pharrell Williams, Thom Yorke.


3 Shibata, D. (2001). Brains of Deaf People "Hear" Music (Vol. 16). International Arts-Medicine Association Newsletter.
4 Glennie, D. E. Hearing Essay. Retrieved Jul. 8, 2009.
5 Palmer, R. (1994). Tac-tile sounds system (TTSS). Retrieved Jul. 8, 2009; Palmer, R. (1997). Feeling Music. Based on the paper presented at the 3rd Nordic Conference of Music Therapy, Finland.
6 Kayser, C., Petkov, C. I., Augath, M., & Logothetis, N. K. (2005). Integration of Touch and Sound in Auditory Cortex. Neuron, 48(2), 373–384.
7 Sloboda, J. A., O'Neill, S. A., & Ivaldi, A. (2001). Functions of music in everyday life: an exploratory study using the Experience Sampling Method. Musicae Scientiae, 5(1), 9–29.
8 Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: HarperCollins.
9 Taylor, K. I., Moss, H. E., Stamatakis, E. A., & Tyler, L. K. (2006). Binding crossmodal object features in perirhinal cortex. Proceedings of the National Academy of Sciences, 103(21), 8239–8244.
10 Thompson, W. F., Russo, F. A., & Quinto, L. (2008). Audio-visual integration of emotional cues in song. Cognition and Emotion, 22(8), 1457–1470.

2.2 INITIAL IDEAS - foundation of ideas -

A circle of research and idea generation

While getting a strong foundation of knowledge during the design paper research, I did some idea generation too. To make sure I didn't limit my creativity, I strongly kept in mind that I didn't have to generate these ideas directly from the research. However, the fact that I did my research and idea generation simultaneously resulted in a circle: reading research papers inspired me to get new ideas and, the other way around, completely different ideas inspired me to do research on other subjects. Next, I'll be discussing some (very rough) ideas I came up with. Mind the fact that most of these ideas – in one way or another – can be found back in my final prototype. These ideas come from my deepest interests, so making them guide my design process suggests that the process comes from my deepest interests too. So, the ideas Conductor's sticks, Embracing sound, Knockbox and Capture sound, to be shaped into real concepts in the next chapter, can be read here:

— exhibiting my ideas with sketches and a little pitch. Getting feedback from the coaches helped me further.


FROM RESEARCH PAPER P12 — Body movements as gestures to express music (like a conductor or singer) give a meaningful visual representation of music. + So (deaf) people can get a musical experience by getting into the theoretical flow. + Different genres of music have a relatively low impact on the amount of flow reached with the tested deaf people.

— Conductor's sticks

Focusing completely on the player's movement as an input method, I was inspired by conductors. I imagined conductors in front of big orchestras, waving their sticks to lead the musicians. I've done it a few times as a teenager. The sense of full control is what attracts me, but the unreachable part intrigues me too: often, musicians know what to play, so just marking the beat with some emotional signs would be enough. The fact that the musicians mostly know what to do – partly disregarding the input – points at a moment where the musicians' own intelligence takes over. In a creative way, still. Somehow, that may be scary or even frustrating. But the way I look at it, it's beautiful: something that has been built or guided by yourself and that, at one point, knows how to continue, may stimulate the engagement (of the player, but also the virtual engagement of the instrument) and thus the player's flow. All of the aforementioned is exactly what the idea Conductor's sticks is about.

So what is it? Roughly sketched, I saw two sticks the size of a traditional baton (the official name of these sticks). The aimed interaction was mostly in the gestures you can make with them. I thought of quite straightforward mappings of low-high movements to low and high instruments, or a mapping of quick-slow movements to staccato-legato instruments11. The instruments could be traditional orchestra instruments, associated with these sticks. As different genres don't appear to have great influence on the amount of flow reached with the deaf people, other sounds and samples could also be used. Most important: I didn't see any way of getting other input than movement, so a fixed variety of samples would be necessary. Furthermore, the sounds could be felt as vibrations through the sticks' grips.

So, in a nutshell: a person can move the sticks to produce sound, dependent on the way he moves them (and the mapping of movement to sound).

TO CAPTURE SOUND P18 — The system can learn the way the player tends to interact and, with that, adapt itself, knowing what to play – that's where the "unreachable and intelligent" element comes in.

TO FIRST & SECOND ITERATION — The limbs' movements as gestures to express music (like a conductor or singer) give a meaningful visual representation of music.

11 staccato playing = with each sound or note sharply detached or separated from the others; legato playing = in a smooth flowing manner, without breaks between notes (from Oxford Dictionaries).


FROM B22 PROJECT — My musical instrument Granulizor from my B22 project inspired me to get a more emotionally valuable interaction.

— Embracing sound

For me, the embracing movement seemed the first clear and emotionally meaningful motion. I knew I wanted to approach this project more from this side, so this (very rough) idea was actually an important step in my idea generation. But why did I want this embracing movement? Well, the fact that sound isn't actually touchable nor embraceable for anyone is obvious. But adding the fact that deaf people cannot hear it either, I wanted to design something to conquer both of these problems. With this instrument, they would even be able to do things hearing people can't do. However far-fetched the idea may seem, the first step of the new principle was made:

TO SECOND ITERATION — Raise their self-esteem about their hearing loss by enabling them to do something with sound even hearing people cannot do.

How exactly I saw this idea as a concrete instrument became visible during my first iteration (see the concept SoundHug at page 29).


— Knockbox What exactly the instrument idea Knockbox would look like was not very clear yet, but its principle was: the main goal of this idea was to make people aware of sound traveling through objects and materials. I took the aspect of shape as something to make variable on an object. The intention then was to interact with it by, for instance, knocking on it to hear and feel its shape. You could see it as a tool to learn the link between shape and material on the one hand and sound and vibrations on the other, while actively deforming an object and making sound with it. As deaf people wouldn't hear the output but feel it, I thought of a way to make it "knock back", i.e. give feedback as a response to their actions. Indeed, the direct feedback of the vibrations through the material would give the strongest impression of the object. Read more about my sound communication exploration in paragraph 3.1 at page 24.


TO ANYSENSE P30 — Exploring the sound through objects and materials by touching them. + Direct feedback on the taken action (such as knocking, ticking, waving) through an object.


FROM INSPIRATION — Granulizor B22 project, using recorded environmental sounds to make music + aforementioned Embracing sound + Knockbox + movie of a woman hearing the days of the week for the first time ever: pure admiration of sound.

FROM PROJECT DESCRIPTION P7 — "Design a musical instrument ... that can be adapted": designing an adaptive instrument.

— Capture sound

I used the same motivations as for the Embracing sound and Knockbox ideas, but this time, recording sound from the environment kept my attention. The instrument Granulizor from my B22 project used this principle too. So, my idea in the first place was to physically open the instrument and record sounds from the environment, e.g. sounds from electrical devices, nature sounds, people talking, etc. After closing the instrument and thus stopping the recording of this sample, the sound could be manipulated by moving, shaping or even deforming it. This idea from Capture sound of physically opening, closing and shaking the instrument could be considered a direct reaction to my Granulizor project; in that project, I was ultimately missing a meaningful way of controlling the instrument. Instead of turning knobs and moving sliders in order to change the parameters of an effect, I felt like looking for a more emotionally meaningful interaction.

But why was I so interested in recording sound? Two answers. Firstly (from my Granulizor project), I am pointing at the aesthetic value of the environmental sounds around us (see some examples a few lines back) that we don't even notice. Making ordinary and everyday sounds available to play with can offer us a way to use them in a joyful way. By this, I aim at a higher appreciation of these sounds. Secondly, I see an opportunity in the flexibility of recorded sounds. If a player records and manipulates a sound in a very special way, this could make the sound completely his. Just like trained guitarists come up with their own playing techniques to create their own sound: it makes it unique, and the player can be proud to show it. As "pride" and "self-esteem" for disabled people were key words in this project (see page 16), I could clearly see the importance of this idea.

TO EMOTIONS BY GESTURES/MOVEMENTS P28 — Finding a meaningful way of controlling the texture of sound.

TO CAPTUREBALL P29 — Recording sound to ensure flexibility and the possibility to make a unique-sounding instrument.

— the Pure Data code I needed in order to receive the accelerometer values from touchOSC on my phone

FROM B22 PROJECT — Experience from using Pure Data for prototyping an instrument, using ways of processing sensor data.

FROM CONDUCTOR'S STICKS P15 — The idea of waving with your hands to control music.

— Playing with electronics

For designing an instrument, I knew technology would play a prominent role. So, playing around with hardware and software for creating an instrument was a great practice. Already having some experience with this from the project Granulizor, I decided to build an instrument in Pure Data. From all the aforementioned ideas, I knew movement as an input for controlling an instrument would be something I wanted to use anyway. But how to measure movement? Speaking in terms of sensors: accelerometers, tilt sensors, light sensors and cameras would be the most straightforward examples. I wanted to do something easy yet powerful, and I knew an accelerometer could offer a precise measurement in 3 different dimensions. Both accurate and versatile: great! As a last choice, I used touchOSC on my smartphone to act as a wireless accelerometer. The wireless character seemed useful for stimulating complete freedom in movement. So, sending the accelerometer data from my phone to Pure Data, my canvas was still empty, ready to be filled with my instrument patch.

Accelerometer to a piano sampler

So, what did I make? First off, I experimented with sending the sensor data directly from Pure Data to a virtual instrument, using loopMIDI for the communication and MIDI as the data itself. As a virtual instrument, I chose a simple piano tone, the standard sampler instrument in FL Studio 11. Mapping the Z-axis accelerometer values low-high to all the piano's keys low-high, I could control the piano keys by waving my phone in the air. Result: slowly waving my phone created some quick and continuous piano notes that felt quite nervous. Waving in large and quick motions created a chaotic and loud mess of notes. In that sense, these hard, uncontrolled motions could surely be heard back in the sound. However, when playing in a controlled, rhythmic way, the piano's sound was still relatively chaotic. This was mainly a result of the chromatic mapping of the floating sensor value to the piano keys.
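The chromatic mapping described above can be sketched in a few lines. This is an illustrative reconstruction rather than the actual Pure Data patch: the input value range, the 88-key note range and the function name are my own assumptions.

```python
# Illustrative sketch (not the author's actual Pd patch): mapping a floating
# accelerometer Z value chromatically onto MIDI note numbers.
# The input range (-10..10) and the note range 21..108 (A0-C8) are assumptions.

def z_to_midi_note(z, z_min=-10.0, z_max=10.0, note_min=21, note_max=108):
    """Map a raw Z-axis reading linearly onto the piano keys."""
    z = max(z_min, min(z_max, z))          # clamp to the expected range
    span = (z - z_min) / (z_max - z_min)   # normalize to 0..1
    return note_min + round(span * (note_max - note_min))

# Even a small wiggle of the phone crosses semitone boundaries, which is
# why this chromatic mapping sounded so chaotic:
notes = [z_to_midi_note(z) for z in (0.0, 0.2, 0.4, 0.6)]
print(notes)
```

In the actual prototype this note stream went to FL Studio over loopMIDI; here a plain list of note numbers stands in for the MIDI output.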

— the Pure Data code that measures if the player is making a hitting gesture and gets the speed of this motion

— the Pure Data code that selects the needed notes from a chord


The Ping Dong

So, continuing from this chaos of nervous and uncontrolled piano notes, I went looking for something with a higher musical value. Taking the floating Z-axis accelerometer value, I decided not to map it directly to a pitch, but to the volume of an impact. While playing the previous piano prototype, I particularly liked playing it with hard and sudden movements, as those caused the most significant change in sound. So from this, I built a patch measuring the level of motion impact. From a mathematical perspective, I calculated the amount of change between two sensor values, resulting in the derivative of this value. Using a threshold value to detect an impact event, I could simultaneously measure the force behind it.

While moving the phone to this patch, I associated this motion of hitting in the air with hitting a ball and passing it away. Together with my desire to shape it into a game with a musical undertone, I decided to make it a musical ping pong game. The game element (both to make the player continue playing and, at the same time, to make music with it) was in the following: by waving the instrument, you virtually hit a ball that you could hear flying up and falling down with a rising and falling tone. Then, the challenge was to wave the instrument again at the right moment – the moment the falling tone would approach its initial pitch. To get a more familiar musical undertone, the tones played were the notes from a selected major chord12. To add a playful element, each cycle of hitting, rising and falling increased the key by 1 (making the chord sound higher). Speeding up the rising and falling tone even increased the difficulty of hitting at the right moment. When hitting too late or too early, the ball would virtually fall down, stopping the playing notes and resetting the system to the first chord. In this way, you get a musical instrument by playing a ping pong game.

TO TESTING P57 — The wireless character stimulates complete freedom in movement.

TO SECOND ITERATION — Making a difference in the sound output between flowing motions and rhythmic motions. + Calculating the mathematical derivative of the accelerometer values, you can measure how quickly the player is moving, i.e. whether he is playing "soft or hard".

These ideas, together with my literature research, shape the firm base beneath the developing instrument concept that I'm about to show you.

12 So the chord C major would just use the notes C, E and G, in any octave. In this case, an example of a tone rising and falling would be C3 – E3 – G3 – C4 – E4 – G4 – E4 – C4 – G3 – E3 – C3, using the number as an indication of the octave.
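The two mechanics of the Ping Dong – impact detection via the derivative of the sensor signal, and the rising-and-falling chord cycle of footnote 12 – can be sketched as follows. This is a hedged reconstruction in Python rather than the actual Pure Data patch; the threshold value and the function names are illustrative assumptions.

```python
# Sketch of the Ping Dong mechanics (assumed names/values, not the real patch).

IMPACT_THRESHOLD = 3.0  # illustrative; the patch's actual threshold is not documented

def detect_impacts(z_samples, threshold=IMPACT_THRESHOLD):
    """Detect 'hit' events: the change between two consecutive accelerometer
    samples (the discrete derivative) exceeding a threshold. The size of the
    change doubles as the force of the hit."""
    impacts = []
    for i in range(1, len(z_samples)):
        delta = abs(z_samples[i] - z_samples[i - 1])
        if delta > threshold:
            impacts.append((i, delta))
    return impacts

def rise_and_fall(chord=("C", "E", "G"), low_octave=3, high_octave=4):
    """The rising-and-falling tone cycle of footnote 12: the chord notes go up
    through the octaves and back down without repeating the top note."""
    rising = [f"{note}{octave}"
              for octave in range(low_octave, high_octave + 1)
              for note in chord]
    return rising + rising[-2::-1]

# A smooth motion produces no impact; one sudden jump produces one:
print(detect_impacts([0.0, 0.1, 0.2, 5.0, 5.1]))
print(" ".join(rise_and_fall()))  # C3 E3 G3 C4 E4 G4 E4 C4 G3 E3 C3
```

Raising the key by 1 each cycle and speeding up the tone, as the text describes, would then just shift and retime this note list.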




MAIN QUESTIONS

So let's continue. For achieving the main goal, I split it up into three different questions:

1. How to communicate sound/music to these deaf people?
2. How to use gestures/emotions to control the instrument?
3. What is an effective and appropriate sound for these gestures/emotions?

In the upcoming paragraphs, I will show you the exploration, ideation and testing done to come closer to an answer to these questions.



FIRST ITERATION

3.1 SOUND COMMUNICATION: SURFACE TRANSDUCER — Sound through water — Sound through air — Sound through materials




3.2 EMOTIONS BY GESTURES/MOVEMENTS — SoundHug — Captureball — Tingling & Stroking



— AnySense


3.3 PROTOTYPE #1 — Squeez-eh — Testing Squeezeh



3.1 SOUND COMMUNICATION - how to make the deaf hear it -


FROM RESEARCH PAPER P11 — For the best result, send lower frequencies to the feet, legs and chest and higher frequencies to the finger(top)s and head.

A first clue to the answer to the question of my goal could be found in the way to communicate music to the player. To be clear, with "communicating" I mean "transferring it in a way the player can perceive". I'm beginning with this because I see it as a fundamental part of the musical experience I'm aiming at.




Surface transducer
Although I had read that a wooden "Haptic Chair" successfully communicated vibrations from played music through the wood to the deaf person, I insisted on finding different ways. Knowing that wood as a vibration medium actually worked, I took that as a starting point for an exploration. First of all, a way of making vibrations had to be found. From the Haptic Chair I knew the SolidDrive Sound Transducer: a sound speaker transmitting vibrations directly to a surface, instead of transmitting them to the traditional diaphragm for air vibrations. This kind of surface transducer appeared to be the perfect actuator in my case too. Due to easy availability and a low price, I decided to buy the very similar V5 Vibration Speaker. Via a USB input and a 3.5mm jack plug, almost any sound source could be plugged into this vibration speaker13. Now, the sound communication exploration could start.


— Sound through water "Full enclosure of the body" was the reason I experimented with this. Putting a hand inside a bucket of water and feeling the vibrations through the fluid, that was my intention. Although these vibrations were perceptible, the strongest results came from the part of the bucket that touched a solid object (the table, in this case). In comparison to the test with just the solid object, the bucket of water didn't give any strong results.

— Sound through air The experiments I did with air only used objects filled with air. While holding my vibration speaker directly against the surface of these objects (see the photos around this text), I noted some positive results: the vibrations were tangible. Although the intensity was comparable to that in the material exploration (which I will present next), I realized this intensity was mainly caused by the stretched and tight material. For instance, a skippy ball I was holding against the speaker actually caused some noticeable vibrations. A possible reason for this positive outcome was the thin plastic layer acting as a vibrating membrane.


— Sound through materials Finally, solid materials and objects had a positive effect too. Especially wooden materials and hollow objects seemed to communicate the vibrations very well. Large objects like a hollow, plastic bench had almost the same positive impact as some large wooden objects. Last but not least, direct contact of the speaker with the "solid materials" of our body, i.e. our bones, had an outstanding result. The bones with the best results were the body of the sternum (the bone in the middle of the chest) and the skull (especially near the temporal bone, e.g. the temple and behind the ear). See the images on the right.

Conclusion To answer the question Which method communicated sound with the best result?, I would strongly lean towards the direct-contact-to-the-body method, the last one I described in my exploration. The vibrations felt very strong through the body, as they didn't get the chance to lose strength in another medium. If I am not able to use direct contact with the body, then a solid medium made of wood, and preferably hollow, would be the best option.


13 In this report, I use the words vibration speaker and surface transducer as synonyms.

3.2 EMOTIONS BY GESTURES/MOVEMENTS - the desired emotional input -


FROM SOUND COMMUNICATION P24 — Direct contact with the chest or head to communicate vibrations is the strongest.

While exploring the options for sound communication, I got inspired for possible applications in the instrument. That's why some concepts use the chest as the centre for interaction. In this paragraph, I'll propose the concepts, together with some characteristics I assessed them on: what emotions14 does the idea imply? And, following from this, is it an introvert or an extravert interaction? Finally, I'll present my criteria and, by that, support my choice for further concept development.



F R O M I N I T I A L I D E A S P 1 6 Embracing sound

— SoundHug

Using the chest as the centre for interaction, one of the initial ideas, “Embracing sound”, was even confirmed in its power for the target group; the interaction was both centered around an effective way of communicating sound and around using the gestures and emotions I was looking for. However, just the idea of embracing something didn’t seem to have a logical flow: does the player just embrace something and then stop? Where did the apparently embraced sounds go? Did they just slip away, ready to be grabbed again? That’s why I added a last action: throwing or passing it away. This completion added an extravert character to the concept. Summing this up, which emotions does it stimulate? Attraction and pride. The movement of embracing things points at a desire to collect and grab things, i.e. you’re trying to “attract” certain things you want to have. When keeping this to yourself – considerably introvert – you can then throw them back into the world. You could see this motion as an expression of disgust, but considering the fact that you’ve just collected them out of attraction, positive emotions like pride (of achievement), relief and joy are more likely.

Attraction, pride, relief, joy
Both introvert and extravert

F R O M I N I T I A L I D E A S P 1 8 Capture sound

— Captureball

Instead of forcing the player to bring things to his body, this Captureball concept proposes the main gestures in another way: not centered around the body. This gives the idea a more extravert feeling. So, what is it? By detaching the focus from the body and shifting it towards the environment, I wanted to enable the player to capture sounds around him. This could be done by opening the instrument, recording something, closing it and then manipulating it. It still uses the sense of attraction from the previous concept, but then more outside the player’s physical range. The aimed result: curiosity. By this, the player will be invited to walk around, investigate the world, record it and then use it to make sound.

Extravert when looking for sounds,
introvert when playing them


The chosen emotions are based on the widely accepted categorization of emotions from wiki/Contrasting_and_categorization_of_emotions 29


F R O M S O U N D C O M M U N I C AT I O N P 2 7 Direct contact to the chest and head give the most intense vibration experience.

F R O M R E S E A R C H P A P E R P 1 1 High frequencies can be felt the best with the fingertips.


F R O M I N I T I A L I D E A S P 1 8 Capture sound

— Tingling & Stroking

Previous concepts used relatively big arm motions and gestures. SoundHug used the chest as the centre, but Captureball didn’t. Captureball already conflicted with the vibrations-to-chest method, as it was using the hands as the input location. Since I knew from the sound communication exploration that the hands didn’t give as strong results as the chest, I had to find a suitable option for this location of the body. Tingling & Stroking was a direct reaction to that. For this concept, I wanted the player to subtly feel and stroke the instrument in order to feel the vibrations. The aimed interaction is tickling, tingling, stroking and rubbing. These close and caring actions give it an overall intimate feeling. Ultimately, this interaction will become a fundamental part of the final concept.

Curiosity, attraction

— AnySense

From the same motivations that I proposed in my initial ideas under Capture sound and the concept Captureball I described before, I wanted to take the recording idea a step further. A major reason for making the full environment playable in the Captureball was to encourage an active attitude. Taking this positive part and adding it to the positive, intimate character of Tingling & Stroking resulted in AnySense. The goal for this concept was to get sounds from the environment by physically touching objects. This could be done by two shapes, directly connected to each other: one for touching an object, one touching the player’s body. Even just sending the vibrations from an object directly – and amplified – to the body would be an effective tool for deaf people (this was later confirmed in the Co-reflection at page 58). However, the core of the initial motivation was to use these sounds to make music. Having these two objects, you could enable the player to manipulate the sound by deforming or moving the objects. All of this would result in a creative instrument that stimulates you to feel the environment around you, while actively deforming it to your preference.

Curiosity, pride
Introvert and extravert


T O S E C O N D I T E R AT I O N Two shapes, connected to each other, one feeling the sound input from the other one and vice versa


— a video that I showed during the Demo Day, displaying some of the aforementioned interactions, using the yellow dummy shapes (to focus on interaction instead of form). Different names are used in it, but it indicates the intended emotions.

The video can be found at

¾ 3.3 PROTOTYPE #1 - presenting Squeez-eh -

From the aforementioned concepts I wanted to build a technically feasible prototype to test during the Demo Day (a great opportunity to have visitors test my prototypes and get feedback). Although the deeper intentions of the concepts Captureball and AnySense felt the strongest to me, I failed to make a working prototype for those. On the other hand, the basic principle of communicating sound flourished the most in SoundHug and Tingling & Stroking. Because even this basic principle hadn’t been tested before, I saw this as a good chance to test it anyway. Furthermore, I chose to add the intimate sense of Tingling & Stroking too. This resulted in the following concept:

Squeez-eh is a squeezable and rounded box-shaped form that can be placed against the chest. When pushing it towards you, some vibrations can already be felt, directly emitted to the chest. These vibrations come from a low bass tone, changeable in pitch by squeezing the shape in different directions. When squeezed “inwards”, i.e. the long ends towards each other, the tone rises, insinuating a stressed state of the object. When squeezed “outwards”, i.e. the long edges straight and the centre pushed down, the tone falls, insinuating that the object is forced downwards. This last action even strengthened the experience, because while pushing it down, the speaker that emits the sound is pushed some more towards the body.
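As a rough illustration of this squeeze-to-pitch behavior, the mapping could be sketched as follows. The sensor range, base frequency and semitone span are my assumptions, not values from the project:

```python
def squeeze_to_pitch(deformation, base_freq=60.0, semitone_range=12):
    """Map a squeeze reading to the frequency of the bass tone.

    deformation runs from -1.0 (squeezed "outwards", centre pushed down)
    to +1.0 (squeezed "inwards", long ends towards each other); 0.0 is
    the shape at rest. Inward squeezing raises the tone, outward lowers
    it. All numbers here are illustrative assumptions.
    """
    deformation = max(-1.0, min(1.0, deformation))  # clamp sensor noise
    semitones = deformation * semitone_range
    return base_freq * 2 ** (semitones / 12.0)
```

With these assumed values, the shape at rest gives the 60 Hz base tone, a full inward squeeze raises it one octave to 120 Hz, and a full outward squeeze lowers it to 30 Hz.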

— Testing Squeez-eh For testing the prototype, I gave the visitors of the Demo Day the working prototype. Together with that, I showed a video demonstrating the concepts that I’ve just discussed (see previous pages). In this video, I used two half-sphere-shaped forms to mark the aimed movements; I used these general shapes because I wanted to get the most feedback on the proposed emotions and the corresponding movements. For now, let’s take a look at the feedback and my own observations:



Interaction !

Different test persons had different preferences for the body part to put the instrument against: for some of them, the head gave the intense experience they were looking for. For others this location was too intense, even causing dizziness. The same went for the chest. In general, when I didn’t indicate where to put the instrument, people started trying out different places. Mainly, these places were the chest, body and hands.

Sound !

Although I tried to suggest a squeezing motion with the shape, someone shook the instrument instead.


People found it hard to find a way to get a noticeable change in the sound, even when they deformed the object and changed the note. “A gliding tone as an effect of a squeeze motion is too 1-dimensional”, quoting one of the visitors. Apparently, the mapping didn’t give enough freedom, because two different actions just gave the same two results every time.


The bass tone, emitted by the instrument, felt too much like ordinary vibrations: it didn’t feel like making music, more like making vibrations. ¾


While playing the instrument, most of the people fully concentrated on the instrument itself.

— a video showing people playing the instrument at the Demo Day. This video plays the sound from the instrument too. You can find it at


¾ 3.4 CONCLUSION & DISCUSSION - answering the questions from this chapter -

By exploring, ideating, prototyping and testing, I can now conclude the following as a response to the 3 questions from the main goal:


How to communicate sound/music to these deaf people?


How to use gestures/emotions to control the instrument?


What is an effective and appropriate sound for these gestures/emotions?

By proposing different concepts I tried to determine the best way of answering the last two questions. Technical difficulties meant that I couldn’t make a prototype for every concept I wanted to test. As a result, I only got results for a single concept. Although I now knew what could be improved for that single concept, I still didn’t know what the results of the other concepts would have been. That’s what I will find out in the next iteration.


The answers

1. A strong way of communicating sound/music is by making sounds with a vibration speaker that is kept directly on (or very close to) the body. The location on the body should be flexible though, as every individual has their own preference of intensity.

C O M M U N I C AT I O N P 2 7


2. Possible ways to use gestures/emotions to control the instrument could be:
• Hugging/throwing movements
• Catching movements for recording environmental sounds
• Tingling and stroking movements
• Searching and touching actions for feeling environmental sounds

Like with the sound communication question, which one of these ways is the strongest can’t be concluded from the test. However, following from the assessment (I’m aiming at extravert actions, using the emotion of pride) of the corresponding emotions, I can suggest that the actions of SoundHug and AnySense could suit my goals the best.


3. Only suggestions of appropriate sounds for the gestures/emotions can be concluded from the test: the sound should alter in such a way that it doesn’t feel like a continuous, harmonic vibration. I should use a mapping that is less one-dimensional than binding one parameter of the interaction to one parameter of the sound output. Then, the instrument can get a freer interaction. This can contribute to the search for a sound that feels more like music. ¾
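A less one-dimensional mapping could, for example, blend several interaction parameters into several sound parameters at once. The following sketch is purely illustrative — the parameter names, formulas and constants are my assumptions, not the project's implementation:

```python
def map_gesture(squeeze, motion):
    """Blend two interaction parameters (each 0.0 to 1.0) into three
    sound parameters, so that no single action is bound to exactly one
    audible result. All names and formulas are illustrative assumptions."""
    pitch = 60.0 * 2 ** (squeeze + 0.5 * motion)  # both inputs shift pitch
    amplitude = min(1.0, 0.3 + 0.7 * motion)      # motion mainly drives loudness
    roughness = squeeze * motion                  # texture only when combined
    return pitch, amplitude, roughness
```

The cross-term means squeezing while moving produces a result neither action gives alone, which is one way to escape the “two actions, two fixed results” problem the visitor pointed out.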






4 . 1 A D AY O F D E A F — Findings — Conclusion



— Sound input — Emotion input — Sound output



¾ 4.1 A DAY OF DEAF - research by experiencing deafness -

There was one thing I knew was necessary for designing an instrument for handicapped people: experiencing the handicap itself too.

But why? First of all, I wanted to experience the practical impossibilities from the perspective of a deaf person. It could lead to previously unforeseen difficulties, mainly because I didn’t have many points of reference in comparable design projects. Secondly, I wanted to experience the impact on our way of using the five senses. I was curious if, losing the sense of hearing, the other four senses would become more sensitive.

And how? Well, I couldn’t find any product that would lower my threshold of hearing to such a value that I wouldn’t be able to hear any sound. After some research, using the Ohropax Classic earplugs seemed to be a proper answer to my question. These little spheres consist of petroleum jelly, a variety of paraffin waxes and cotton wool15. A major advantage of this product was that it was unobtrusive. Especially during social interaction, I didn’t want people to change their usual social behavior by noticing my earplugs. So, although some other sound-reduction products (headphones, for example) could have offered me better sound reduction, walking around with them would have distorted the results in my social interaction.

— a case of the Ohropax Classic earplugs I used during the day


— Findings

During one day, exactly 24 hours, starting at 1:00 and ending at 1:00 the next day, I continued life as a deaf person. My findings:

! Almost all the environmental sounds I was used to hearing were still noticeable. However, when I wanted to understand people talking in the distance, I had to focus on it in order to comprehend the language.

! As foreseen, I had some practical issues too:
– Due to the chewing sound while eating, I couldn’t hear the television or people talking. While having dinner, I had to wait with chewing until the person I was talking to stopped speaking.
– When a talking person started to speak more quietly (in this case as an expression of emotion), I had to pay a lot of attention to understand it. Even though this person knew I was wearing earplugs, he tended to forget it. A good cause for this was that my own social behavior wasn’t different than usual.
– Sometimes, people scared me because of their seemingly sudden appearance behind or next to me.

! Like with the chewing, sounds/vibrations from anything directly in contact with or close to my head felt like they were amplified. An explanation based on this day could be: most sounds around me were limited; however, sound from body parts in direct contact close to my ears felt the strongest. Although I normally hear these sounds too, they feel heavier when isolated, resulting in an amplified feeling. This happened mostly while:
– brushing my teeth
– talking
– moving my head on my pillow while lying in bed

! During the day, I listened to the same song multiple times. After the 24 hours, when I listened to the song again without earplugs, the music sounded shrill (high-pitched and a little piercing). This could suggest that after these 24 hours, I had become used to the lower frequencies. This explanation can be supported by the filtering diagram of the Ohropax earplugs (image below): you can see that the high frequencies are reduced more than the middle and low frequencies.

F R O M L I T E R AT U R E R E S E A R C H Vibrations to the body of a deaf person can be processed in the auditory cortex (the part of the brain that hearing persons use to process sound).

But what can I use from this? Most of the findings were distorted by the fact that I could still hear a lot of sounds around me. Nevertheless, I think that the finding that sounds/vibrations from anything directly in contact with or close to my head felt amplified would hold for completely deaf people too – even more so considering the fact that these stronger vibrations in the body can be processed by the auditory cortex. ¾


The material structure of the Ohropax Classic earplugs can be found under “Product Information > Material:” on 41

¾ 4.2 TECHNOLOGY EXPLORATION - finding electronics for realizing my ideas -

Current direction

Regarding the former issues in testing my ideas due to the absence of a technical prototype, I decided to approach my second iteration from a more technical side. For the first phase, a technology exploration, I collected components I could use to realize my ideas. As a start, I purchased some helpful components:

— Sound input

– Piezo elements as contact microphones, in small and large sizes. These were very helpful in realizing the idea of using sound through objects as an input. The two different sizes caused a significant difference in strength.

— Emotion input

– Accelerometer for gesture recognition. The expressiveness I was looking for could be found in the concepts that used gestures and motions. As I marked earlier, the accelerometer is a great way of measuring these movements. Already having experience with the accelerometer from the creation of Ping Dong, I could easily apply this knowledge to my other concepts.

– LDR for closing (hand) movements or as a proximity sensor.
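To sketch how these two emotion-input sensors could be turned into gesture events, consider the simulation below. The thresholds, event names and the idea of classifying by acceleration magnitude are my assumptions, not the project's code:

```python
import math

def detect_events(ax, ay, az, ldr, shake_threshold=1.8, dark_threshold=100):
    """Turn raw sensor readings into gesture events.

    ax, ay, az: acceleration in g (an accelerometer at rest reads about
    1 g in total); ldr: 0 (dark) to 1023 (bright), like an Arduino
    analog reading. Thresholds are illustrative guesses, not calibrated
    values from the project."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    events = []
    if magnitude > shake_threshold:
        events.append("big_movement")   # e.g. a hugging/throwing gesture
    if ldr < dark_threshold:
        events.append("hand_closed")    # LDR covered by a closing hand
    return events
```

A resting unit (about 1 g, uncovered LDR) would report no events, while a vigorous arm movement with a covered sensor would report both.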



F R O M S O U N D C O M M U N I C AT I O N P 2 7 Vibrations in direct contact with the body give the best result for feeling them.

— Sound output

– The original V5 Vibration Speaker that I used in my first iteration was a great success. Its biggest disadvantage was its size, but this size was also the cause of its power.

– Piezo elements as surface transducers. I hoped a piezo element would give enough power to make a surface emit sound. Unfortunately, this wasn’t the case. When I put the piezo element on a metal, thin-walled case, I could vaguely hear something, and only the high tones. Even worse, the resulting vibrations through the object were not noticeable with my fingertips at all.

— the greeting card speaker I ripped out of a Hallmark greeting card. The circuit on the right is used to store and trigger the music.

– The greeting card speaker was actually a great plan for buying an inexpensive speaker. These famous birthday cards that play some music after opening cost only 5 euros. Breaking up the paper and the electrical components provided me with a tiny speaker. I knew that this speaker could make the card play at a high volume; however, the power of the speaker turned out to be not so big after all. I think the fragile plastic construction was a cause of that.

– Mini surface transducers gave much better results. They worked the same as any speaker and used the same principle as the original V5 Vibration Speaker I put inside the first prototype. A major advantage was their minimal size: hardly 1 by 1.5 cm. Just like I had done with the V5 Vibration Speaker, I experimented a lot with putting them on different locations of the body. The most effective places turned out to be: fingertips, palm of the hand and the temporal bone.

— the small surface transducers I bought

– Furthermore, I saw mapping different tones to different body parts as another way to communicate music to the deaf person’s body. A source of inspiration was Jeroen Blom’s project Sensible Sense16. In this project, he designed a system that could offer a tactile sense for people using a hand prosthesis. This new “remapped” sense was actually something I could use for deafness too. During an ideation phase about this principle, the first thoughts of haptic jackets, sleeves and gloves came around.
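Such a “remapped sense” could route each tone to a different body-worn transducer by its frequency. The sketch below is one possible interpretation: low tones to the chest, high tones to the fingertips (which, per the research paper, feel high frequencies best). The band edges are assumed values:

```python
def transducer_for(freq_hz):
    """Pick a body location for a tone, sketching the remapped-sense
    idea: low tones to a large, central body part and high tones to the
    fingertips, which sense high frequencies best. The exact band edges
    (200 Hz and 1000 Hz) are assumptions for illustration."""
    if freq_hz < 200:
        return "chest"
    if freq_hz < 1000:
        return "palm"
    return "fingertips"
```

A 60 Hz bass tone would then drive the chest unit, a 440 Hz tone the palm, and anything above 1 kHz the fingertip transducers.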


Link to video of Jeroen Blom and his project: 43

¾ 4.3 THE STEP TO WEARABLE - finding electronics for realizing my ideas -


F R O M E M O T I O N S I wanted to aim at an extravert interaction

In my final prototype, you’ll see that I’ve chosen to use the V5 Vibration Speaker as an addition to the smaller transducers. These smaller speakers will ultimately be used to drive the aforementioned “different mapping method” (from the previous page). Although the smaller ones do make perceptible vibrations, their output was weak in comparison to the bigger V5 Vibration Speaker. This closer look at my aimed interaction, together with the technical implementation, pushed me towards an important step: the product should be wearable. Why exactly?

– The speakers had to be very close to the body, constantly.

– When the user plays the instrument, he doesn’t have to look at it, which stimulates a more extravert way of playing. During the first Demo Day, almost everyone was looking at the instrument itself. I want to achieve the opposite, so that the player looks into the world, explores it and plays the instrument as an expression of pride.

– The instrument and player will become one; when the instrument is bound to the body, you can move your body freely, as you don’t have to hold anything. This contributes to the player’s flow.



— the moment when I discovered that a wearable instrument suited my needs.



— different shapes to suggest a wearable instrument. In this case: a belt.

— L E F T one of the first drawings that illustrated not only a wearable band, but also two hotspots: an upper spot and a lower spot.

Next, you will read about a first quick prototype I made to find the exact interaction I wanted to use in my final concept. ¾


¾ 4.4 PROTOTYPE #2: THE GLOVE - shaping it -

The concept was on its way. With the clarification of the instrument being something wearable, my gesture-driven ideas pointed directly at gloves. Just the idea of having gloves didn’t bring a clear concept, though. But the extravert AnySense and sound recording-oriented Captureball formed the basis for it. Since I had experimented with some necessary electronics, I could easily start building a prototype. This prototype could have been tested in order to find a proper interaction.

— the glove prototype, made within a few hours. The prototype consisted of a 5-euro glove from H&M, four sensors, wires and duct tape. And some imagination, of course.



— a sketch showing the first idea to use an eight-cored cable to bundle all seven wires that turned out to be necessary for sending the sensor values to the Arduino.

— L E F T a sketch showing the sensor placement on the actual glove: piezo discs on the fingertips, accelerometer on the back of the hand and LDR on the palm of the hand.

But wait, why did I say “could have been tested”? After all, the prototype wasn’t finished in enough detail to make it fully working. Plus, the prototype couldn’t have provided me with the answer to “what will be a good interaction?”, as the glove idea was just a random guess. And I was searching for a concept based on my previous design decisions, not on a random guess. There was still one important design decision I had to make, which proceeded from some findings at the Demo Day: the location of the instrument should be variable. There were varying preferences for this location among the visitors, so I saw this as a necessary step away from the glove shape. ¾ 47

¾ 4.5 THE CONCEPT - from design decisions to a concept -

MAIN GOAL Give deaf people a beautiful way of experiencing their environment by playing sounds on objects or materials, recording these and manipulating them according to their preference.

— a more specific goal, composed from the design decisions I made earlier, led to the concept on the right page.





π The instrument consists of an expandable network of wearable bands spread around the body.
π These bands can capture vibrations and movement, and output vibrations.
π Vibrations can be captured by holding one band unit to an object or surface. By moving the band on the surface or by interacting with the recorded surface, vibrations can be made and recorded. Afterwards, the recorded sound can then be played by moving the bands.
π This played sound can be felt through all the bands, using surface transducers in direct contact with the skin.
π Depending on the player’s playing method (melodic or rhythmic), the sound is played in a flowing or rhythmic way. If the player wants to, the different bands can be played in different ways, e.g. a band around the leg can be played in a rhythmic way while a band around the arm is played in a melodic way.

— gestures as an emotional, visual representation of
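The band network described above can be sketched as a small data structure. This is only a structural illustration; the class name, fields and playback summary are my assumptions, not part of the concept's actual implementation:

```python
class BandUnit:
    """One wearable band of the network: records vibrations from a
    surface and plays them back through its transducer. A structural
    sketch only; names and fields are illustrative assumptions."""

    def __init__(self, location, style="melodic"):
        self.location = location   # e.g. "arm", "leg", "chest"
        self.style = style         # "melodic" (flowing) or "rhythmic"
        self.sample = []

    def record(self, frames):
        """Store vibration frames captured while touching a surface."""
        self.sample = list(frames)

    def play(self):
        """Summarize what this band would output when moved."""
        return {"location": self.location, "style": self.style,
                "frames": len(self.sample)}
```

A leg band could be constructed with `style="rhythmic"` while an arm band stays melodic, reflecting the per-band playing methods described above.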






¾ 4.6 PROTOTYPE #3: THE WATCH - reshaping it -

— Requirements This was the first prototype to be built around the concept. From this concept, I knew I needed to put a piezo disc, small surface transducer, accelerometer and an LDR inside a wearable band. These electronics needed to have a meaningful place inside the band, in order to insinuate the aimed interaction:


– The aimed location on the body was the wrist, so it had to invite the player to put it there
– The band had to be easy to put on and take off
– The band had to support the interaction of placing it on an object (to record sound)
– The surface transducer had to face the skin to communicate the vibrations

— the first prototype to this concept, as shown on the Final Demo Days. Worn by a visitor.


— The design

1. There was one shape of a product that’s familiar to the wrist location: a watch. Drawing the shape of a band, together with a firm case to hold the electronics (in a watch, this is where the clock would be), actually looks like an ordinary (smart)watch. The accelerometer and surface transducer could easily be put inside the case. The recording components, the LDR and piezo disc, had to be placed somewhere else.


2. An extremely simple solution for making it easy to put on and take off was using Velcro strips. These strips insinuated something wearable to be opened and closed; exactly what I needed.


3. To support the actual concept of placing the band on an object, I placed the recording components on the opposite side of the band. Why there? For touching an object, people automatically face the palm of the hand towards the object. Logical, as the fingertips – very sensitive – then face the object too. So that’s why I chose the opposite side of the band. Now the shape was complete: the clockwork (accelerometer and surface transducer) on top, and a band, closeable with Velcro strips, containing the LDR and piezo disc.


4. As I had already determined the place of the surface transducer, this was an easy step: the transducer could be placed on the inner part of the “clockwork case”, directly facing the skin. If the user tightened the band, he would tighten the transducer to his skin too.



— Testing it

1. One reason for making it a wearable band was the flexibility of location. However, as it clearly had the shape and size of a watch, the idea of putting it on other body locations became vague. An important cause for this was the band size: although the bands were made from elastic fabric, they were not resizable to, for instance, a leg, head or chest. The elastic fabric was effective for making the band tight, though. In this way, the transducer had firm contact with the skin.

— a sketch showing the design of the prototype. It also indicates the location of the sensors.





— the elastic straps I used as a base for the prototype



— The idea of being able to do things; pride as an aimed emotion

2. So the Velcro strips were useful for putting it on very quickly, but the problem was that they weren’t flexible in size at all. Another problem with the Velcro closing system was that the band was hard to close on your own. To some extent, this is in contradiction to the emotions I am pointing at in this project: the deaf player has to be able to do everything on his own, independent from others. So there should be another closing system for the bands, one that is easier to close as an individual.

3. From the previous points I could already conclude that it was important to find a way to make the band more adjustable in size. I couldn’t find any solution to that while still maintaining the division of the electronics into two separate parts in the band. But why did I want this division in the first place? Yes, to suggest to the user to wear it as a watch. However, I was trying to find ways to make it universal, and this division was a clear barricade to that. So, as I didn’t see the importance of this division any more, I decided to put all the components in one part. (T O P R O T O T Y P E # 4 P 5 4 Making it one hotspot)

— these reflections led to an improved design that you can find on the following pages

4. The contact with the skin on the inside part of the band felt uncomfortable. The outside fabric was actually the same as the inside fabric. As the function of the inside part was mainly to communicate sound by touching the skin (and not the outside part), the fabric had to feel that way too. (T O P R O T O T Y P E # 4 P 5 4 Softer fabric on the inside of the band) ¾


¾ 4.7 PROTOTYPE #4: FEELBACK - refining it -

The major design decisions had been made. The concept was clear. Only, during the creation of the previous prototype, some more decisions, specifications and changes had to be made. Unexpected yet eye-opening. Some necessary changes are described in the Testing section of the Prototype #3 paragraph. You can find the final prototype here; these are some solutions to those problems.

1+2. Now that the association of the band with a watch was (more or less) gone, the criterion became generalized to “The instrument had to invite the user to wear it on any location on the body and to play it”.

To fulfill the criterion to invite the user to wear it, I had to find an alternative to the Velcro strips; these strips already insinuated an opening-wearing-closing interaction. The alternative I found was the plastic click buckle. In the same way as the previous Velcro strips, these click buckles insinuate an opening-wearing-closing interaction. On top of that, these closing systems are generally applied in clothing accessories like bags or sportswear. I saw the association of the instrument with wearable accessories as a positive contribution to the criterion.

To fulfill the criterion to invite the user to play it, I added something to the interaction. The objective was the following: the user sees the object, grabs it, puts it on, moves it and eventually plays it. The instrument’s job is to guide the user through these steps and eventually make the user play it. So what I added is “sleepMode”, as I called it during the programming phase. When the instrument is on a surface, untouched for some seconds, it turns on the sleep mode: the sound input goes directly to the sound output. Result: when a person sees it and approaches it, the instrument will directly give feedback on any sound through the surface it’s on. Optionally, when the user grabs it, it gets out of sleep mode and returns to the ordinary behavior.

T O C O - R E F L E C T I O N P 5 9 Confrontation phase

A major step was to give the instrument one “hotspot”. This spot is both the input and the output: the sound input, the gesture input and the vibration output. This changed the interaction into a way more intuitive one: from now on, the hotspot was the centerpiece of attention. As this spot had to be used to touch objects/materials too, the plastic block from The Watch prototype made the interaction very distant. The hotspot was now the only layer between the user’s body part and the object/material. The hard-shaped plastic box couldn’t connect up very well to different objects/materials. That’s why I decided to change the material of the hotspot to a little cushion. In fact, this cushion was more flexible and it felt much more like a wearable too. In addition, the cushion texture even provided a pressing action while touching something. This empowered the sense of connecting yourself to the object even more.
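The sleepMode behavior can be sketched as a tiny state machine. The class name, the boolean input and the three-second timeout are assumptions for illustration, not the actual program from the prototype:

```python
class Hotspot:
    """Sketch of the 'sleepMode' behavior: left untouched on a surface
    for a few seconds, the unit routes its sound input straight to the
    output; once grabbed or moved, it returns to normal playing.
    The 3-second timeout is an assumed value, not from the project."""

    SLEEP_AFTER = 3.0  # seconds of stillness before sleep mode

    def __init__(self):
        self.mode = "normal"
        self.idle = 0.0

    def update(self, dt, grabbed_or_moved):
        """Advance the state by dt seconds and return the current mode."""
        if grabbed_or_moved:
            self.idle = 0.0
            self.mode = "normal"
        else:
            self.idle += dt
            if self.idle >= self.SLEEP_AFTER:
                self.mode = "sleep"  # input passed directly to output
        return self.mode
```

Left alone, the unit drifts into sleep mode and echoes surface sounds; the first grab resets it to normal behavior, matching the guiding sequence described above.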


While choosing the material for the aforementioned cushion, I searched for some textiles to wrap around it too. From The Watch I knew I wanted a soft fabric on the inside. Synthetic felt turned out to have the character of both being soft and being able to communicate sounds very well17. The outside part of the band still had to be covered too. This part didn’t have direct contact with the skin, but it could be touched while feeling the band. Plus, this fabric is constantly visible from the outside. That’s why I chose sweat fabric as the textile: it feels soft and looks like clothing too.

As a last design decision, I wanted to clearly indicate the “hotspot on the hotspot” too, i.e. the very sensitive piezo disc as the sound input. In The Watch, this disc wasn’t even visible, but as it was the centerpiece of attention now, it needed to be attractive. So, the look of the spot had to be attractive and sensitive. Ultimately, I chose to use the piezo disc in its purest form. It already had a golden, shiny and attractive appearance. The metal of the disc felt firm yet sensitive because of the scratches on it. Although, there may be other materials that satisfy these requirements even more. ¾


C O N C L U S I O N P 6 8 There could be materials that feel more attractive and sensitive than the golden piezo disc


Pieter Bron used synthetic felt too, to send vibrations from a surface transducer through a cushion.

¾ 4.8 TESTING - validating the design -

Final Demo Days test

At this last stage of the design process, the prototype was tested with a deaf person. Before I tell more about this, I'll quickly give some earlier feedback from the visitors at the Final Demo Days.

– The vibrations from the surface transducers were hardly noticeable. Even when I added an amplifier circuit, the vibrations were not strong enough to be felt anywhere on the body. To temporarily fix this problem, I added an "Enhancer Element": a very comparable band, intended to be put around the chest. This unit enclosed the original V5 Vibration Speaker, so regarding the strength of this speaker, successful sound communication was guaranteed.

– The closing system of the band was in fact easy to use; however, people still had difficulties putting the bands on: the band had to be very tight to get a strong touch to the skin. The elastic band of The Watch prototype was actually easy to tighten. Changing the band textile back to something elastic may offer a solution for this.

– For the playing interaction, I had designed a sequence of two steps: touching objects to record a sample and then moving the bands to manipulate the sample. These consecutive steps did not turn out to be intuitive at all. To some extent, the first step was achieved when the player touched the golden ring. After some explanation and hesitation, the users touched objects too. The most intuitive action turned out to be tapping on a surface. However, I had programmed the system to record sound only when directly touching an object (using the light sensor), so the program was made to record long samples, for instance when rubbing a surface. This was in complete contrast to this intuitive action. The result was a program recording small samples over and over while tapping the surface over and over: the aimed next interaction, moving the bands separated from the surface, hardly produced any sound (often empty samples were recorded). To improve this, I needed to melt the two steps of interaction into one: touching surfaces, using direct manipulation, would be the solution for this. As a bonus, the user could still choose to do the next step anyway, but this would not be required to make music with it.

– To send the sensor data to the Arduino, I used a UTP cable, going from the prototype to a black box containing the Arduino chip. The five-metre cables I used to connect the instrument seemed to be long enough. However, the friction of the cable limited the freedom of movement. Just as I had seen in Ping Dong, a wireless instrument actually stimulates this freedom. So, a wireless instrument would be a better choice.

F R O M P I N G D O N G P 2 1 wireless instrument to stimulate a freedom of movement

MAIN GOAL Give deaf people a beautiful way of experiencing their environment by playing sounds on objects or materials and by manipulating these according to their preference.

— a visitor of the Final Demo Days wearing the Enhancer Element in addition

— the goal adjusted to the findings of my tests: the recording action is not an aimed goal any more.

— a visitor of the Final Demo Days exploring the glass material of a window with the instrument
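The recording problem described above, where taps triggered many near-empty samples, comes down to contact duration. A hedged sketch of the "melted" one-step logic in Python (illustrative only: the threshold and function names are my assumptions, not the project's code) could treat short contacts as direct manipulation and only sustained contacts as worth recording:

```python
# Assumed threshold: contacts shorter than this gave near-empty recordings.
MIN_SAMPLE_SECONDS = 0.5

def handle_contact(duration, samples):
    """Decide what to do with one touch event.

    duration: seconds the light sensor was covered
    samples:  amplitudes captured during the contact

    Returns "direct" when the sound should be passed straight to the
    vibration speakers (direct manipulation), or "record" when the
    contact was long enough to yield a usable sample.
    """
    if duration < MIN_SAMPLE_SECONDS or not any(samples):
        return "direct"  # e.g. tapping: play it, don't store an empty sample
    return "record"      # e.g. rubbing a surface: worth keeping
```

With this split, tapping a surface is immediately audible (and feelable) instead of producing a pile of empty recordings.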


Co-reflection

Like I promised to tell you, the last test I did was with a deaf person. For the test, I used the co-reflection method in order to get on the same level of thinking. Although we could communicate well by just talking (he could lip-read very well and he could talk too), I prepared a short PowerPoint presentation to ask him some more complex questions. This is the way I built it up:

1. Exploration phase
Introduce him to the subject, ask him some things about himself and continue to the subject again. Check if he already has some experience with musical instruments by asking him the following questions:

"What role does music play in your life?"
"Do you have some experience with playing musical instruments?"

Tell him I'm designing something that every deaf person should be able to play. Give him objects to play with and let him make music with them. After a while, do it together. Some objects that I used: pencil, kazoo, plastic box, rubber band, glass, plastic bag, plastic glove, SMINT box, textile (some of these were actually used to build the prototype). After playing, ask which interaction with objects gave him the best musical experience.

His own experience with listening to music was mainly from going out. The "boom-boom" beat, as he described it himself, was easily noticeable. However, other instruments with a more melodic value were harder to feel for him. The experience he had with playing music was mostly playing drums – something logical, following from the aforementioned. The reason for this was the direct tactile feedback. Feeling the "boom-boom" through his body was the main source of musical experience for him.

The objects from the jam that gave him the best musical experience were mostly the rhythmic ones. For instance, the vibrations directly to the skin, caused by shaking the SMINT box, felt musical to him. Furthermore, I asked if a rubber band plucked like a guitar would feel musical to him too. He told me that almost all objects vibrating directly to the skin felt strong.

2. Ideation phase
Following from this, give him a pen and paper and the tools he has just used. Make him draw or build his own prototype with these tools. Build and draw my own ideas too. After a few minutes, ask him what his ideas are. Ask him to criticize his own idea(s), if necessary by asking him some critical questions. Then propose my own built or drawn ideas and ask him in which way they could be improved.

As an effect of the jam session from before, the ideas he came up with all had the factor of direct contact to the skin. Plus, all of them had a rhythmic intention.

3. Confrontation phase
Then, propose the prototype of the instrument by drawing and using the tools from the previous phases (don't tell every step of interaction in detail, otherwise nothing will be left to test). Then set up the actual working prototype, make him play it and afterwards reflect on it.

The drawings seemed to explain the idea well to him. When I gave him the prototype, he was doubting where to wear it. Ultimately, he put it around his wrist. Just like during the Final Demo Days, my test person didn't intuitively start to touch random objects. Clapping seemed to give him a hard impact sound as feedback. This is something he obviously liked. Due to some failings of the prototype, the recording function didn't work. As this wasn't the main interaction any more, this failure didn't influence the test much. After taking off the bands and putting them on the other side of the table, the instrument went into "sleep mode" and started to directly output the sound input from the table surface. This actually attracted us back to the instrument. So, the attractive character of the "sleep mode" function had positive results.

— the deaf test person rubbing the instrument against the metal rings of my notebook

— the deaf test person ticking on the instrument using his pen










5.2 THE PROTOTYPE — The instrument — The sound — The code


¾ 5.1 THE CONCEPT - an overview of the concept -

Interaction •

Feelback is an expandable network of wearable bands spread around the body.

The units can be worn on any location of the body: this is left open to the preference of the user.

The bands can detect vibrations by touching objects/surfaces and interacting with them, e.g. knocking on it, rubbing it, ticking on it, etc.

These created vibrations can be felt through all the bands, using a vibration speaker directly to the skin.

According to the user's movements, the input sound is changed to a rhythmic or melodic/flowing sound.

If the user doesn't touch any object, he can still play the sounds he has previously felt (using a recording system) by moving the bands through the air.

When a band is left alone for a while, it tries to get some attention by directly playing sounds from the surface that it is put on ("sleep mode").
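How "according to the user's movements" maps onto the rhythmic versus melodic/flowing sound characters is not spelled out here. One plausible reading, sketched below in Python with assumed threshold values and names (this is an illustration, not the project's program), classifies jerky motion as rhythmic and smooth motion as flowing:

```python
def sound_character(accel_readings, jerk_threshold=2.0):
    """Classify movement as "rhythmic" or "flowing" (illustrative sketch).

    accel_readings: a sequence of acceleration magnitudes over time.
    Large jumps between consecutive readings suggest abrupt, beat-like
    movement; small jumps suggest smooth waving through the air.
    """
    if len(accel_readings) < 2:
        return "flowing"
    max_jump = max(abs(b - a) for a, b in zip(accel_readings, accel_readings[1:]))
    return "rhythmic" if max_jump > jerk_threshold else "flowing"
```

A sudden shake would then switch the output to a rhythmic character, while slow waving keeps it flowing.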




Object •

The units use a click closing system to easily put the band on. This plastic system also enables the user to resize the band, if he wants to put it on another body location.

The adjustable ribbon part of the band is somewhat smaller than the rest. In this way, it can easily fit the hand too.

The body of the main part of the band is somewhat wider and thicker and formed by a flexible cushion: this allows the player to press it tightly against objects.

The sound input part of the band is made of a firm, shiny, golden disc: it makes the centerpiece of interaction more attractive.

The band has wireless communication to stimulate a complete freedom of movement.


the movement of feeling the ring of the band: a possible interaction for playing the instrument —


the movement of waving the hands through the air: a possible interaction too.


¾ 5.2 THE PROTOTYPE - the final instrument that I've built -

— The instrument


— the poster I showed on the Final Demo Days, displaying the instrument being worn on different body parts.

— The sound A video containing the sound from the instrument can be found after June 26th 2014 at


— The code For illustrating the program I wrote, some important parts of the code can be found here:

— the part of the code that is the layer on top of the recorder and sequencer I built. For debugging, I added a quick view on the waveform, too.

— for detecting the beats in a sample, while recording, this piece of code remembers the moments in the sample. Afterwards, the beats will be used for playback.
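The beat-remembering idea in the caption above can be illustrated with a short sketch: while recording, remember the positions where the amplitude jumps above a threshold, then reuse those positions for playback. The Python below is a reconstruction for illustration only; the threshold and minimum-gap values are my assumptions, not the project's code.

```python
def detect_beats(samples, threshold=0.5, min_gap=4):
    """Return the indices in `samples` where a beat starts.

    A beat is counted when the absolute amplitude crosses `threshold`,
    with at least `min_gap` samples between two beats so that one hit
    is not counted twice.
    """
    beats = []
    for i, s in enumerate(samples):
        if abs(s) >= threshold and (not beats or i - beats[-1] >= min_gap):
            beats.append(i)
    return beats
```

The resulting list of beat moments can then drive playback, e.g. by triggering the stored sample at each remembered position.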


— the above code is used to play the sound input as a floating output: reverb, delay, equalizing and pitch bend are used for this
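Of the four effects named in the caption above (reverb, delay, equalizing, pitch bend), the delay is the simplest to illustrate. Below is a minimal feedback delay line in Python, as a sketch of the idea only; the project's actual effect chain is not reproduced here and the parameter values are assumptions.

```python
def feedback_delay(samples, delay_samples=3, feedback=0.5):
    """Apply a simple feedback delay to a list of amplitudes.

    Each output sample is the input sample plus a scaled copy of the
    output from `delay_samples` steps ago, producing decaying echoes.
    """
    out = []
    for i, s in enumerate(samples):
        delayed = out[i - delay_samples] if i >= delay_samples else 0.0
        out.append(s + feedback * delayed)
    return out
```

A single impulse fed through this line produces a trail of echoes, each half as strong as the previous one, which gives the "floating" character described above.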



6. CONCLUSION & R E CO M M E N DAT I O N S - reflection & what it could become -

¾ Hands-on approach

After all that happened in five months, there are a lot of things to reflect on. Let's start with the most important one. As a personal goal, I had set out to approach this strongly user-focused project in a hands-on way. The reason for this was to ensure that I could test my designs often. I succeeded in following this with an iteration-heavy process. However, in the first half this was somewhat weaker and in the second half it was too much. A reason for this weaker start (in terms of the hands-on approach) was the fact that I wanted to have a firm base of research. I started making the first prototypes only at the end of this half. That resulted in a prototype that wasn't representative of the ideas I had. What followed: a few loose ideas and concepts, one prototype of them and just one tested. The test during the Demo Day didn't clarify the best direction to go. Instead, it increased my desire to explore the other directions.

In contrast to this, the second half of the project was primarily hands-on. I explored a lot of new electronics (which contributed to my Integrating Technology skills more than ever), tried to build different prototypes with them, reflected on them, rebuilt, refined, etc. However, now I was focusing on the technology so much that I started losing grip on the concepts I had. These concepts were based on my goals to make an emotionally valuable interaction, but during this stage, I lost track of this. This quite loose way of exploring and building resulted in an actual working prototype (I was actually very happy with that). But as I implied, a big disadvantage of it was that the final prototype lost the strength of its non-technological aspects. Examples of this were the material choice, the final interaction, the form design and details. This caused the final prototype to work fine, but not to fully feel or look fine. For instance, the material choice was based on a quick decision on the availability of some textiles. After all, I know these decisions could have been more profound. All these decisions shape the body of the design, which is from my point of view a fundamental starting point for making the user enjoy the design. So taking these decisions to a higher level would clearly improve the overall quality of my design.

As a conclusion, I can state the following: in future projects, I should clearly look for a more balanced approach. I know I can control both sides of the previous two stories, but I have to find a better balance between them.

In addition to my design process reflection, I'd like to highlight something about my research. As this project had a strong user-focused undertone, the user research played an important role. There were two major moments when I gained some valuable results from this: at the beginning, while doing literature research, and at the end, while validating my design with the target group. The initial literature research had a successful effect on my design process: it offered me new insights and banned some incorrect assumptions. However, this should have been the case for my validation too. The problem was that the test happened completely at the end. The outcomes of this test weren't different from what I'd already seen with hearing users. But this was more or less a case of "good luck": it could have brought some very different outcomes. I surely learned from this to plan these user tests not around the absolute end, but at a certain time before building the final prototype. Although I've fulfilled my personal goal to "Make quick … prototypes", the essential addition of "…that can be used during user tests to generate knowledge rapidly" wasn't satisfied. So in future projects, I should not wait until I can propose a beautiful, finished prototype to my user group. A lo-fi experience prototype could be enough to get valuable feedback. And this valuable feedback is essential for shaping the final concept and prototype.

So, in general... I know what to do and how to do it. The main learning point from this project is to find a proper balance and timing. Fortunately, as I knew what to do and how to do it, I got a satisfying result anyway. I have actually gone somewhere with this project: somewhere beautiful, where I could see a smile on the face of my deaf test person. ¾

— the deaf test person, smiling when feeling the vibrations from the input directly to his body.


Report B32 Feelback  

Design report about Feelback: a musical instrument for the deaf.
