Augmented Reality, art and technology
Marty, the new affordable AR headset. Yolande Kolstee & Pieter Jonker
Augmented Prototyping: Augmented Reality to support the design process
Radioscape. Edwin van der Heide
Image courtesy STUDIO Edwin van der Heide
AR[t] Magazine about Augmented Reality, art and technology
Colophon ISSN Number 2213-2481
Contact The Augmented Reality Lab (AR Lab) Royal Academy of Art, The Hague (Koninklijke Academie van Beeldende Kunsten)
Prinsessegracht 4 2514 AN The Hague The Netherlands +31 (0)70 3154795 www.arlab.nl firstname.lastname@example.org
Editorial team Yolande Kolstee, Hanna Schraffenberger, Esmé Vahrmeijer (graphic design) and Jouke Verlinden.
Contributors AR Lab & Partners: Wim van Eck, Edwin van der Heide, Pieter Jonker, Maarten Lamers, Ferenc Molnár (photography), Maaike Roozenburg and Dirk Vis. Guest Contributors: Esther de Graaff, Vincent Hui, Barbara Nordhjem and Martin Sjardijn.
Cover ‘Marty’, the new Augmented Reality headset by the AR Lab, worn by Mariana Kniveton.
Table of contents
Welcome to AR[t]
Penetrating deeper into the temporal lobe
Marty — The new affordable AR headset!
Yolande Kolstee and Pieter Jonker
Radioscape — in the context of augmented reality
[AR]chitectural Education at Play Vincent Hui
Digital technologies and fine art — a complex relationship
Edwin van der Heide
Who owns the space?
Chasing virtual spooks, losing real weight
Esther de Graaff
Augmented Reality: A Story Dirk Vis
Smart Replicas — bringing heritage back to life
Unspecialize! The more you know the less you see
Finding what I didn’t seek
How it was made: a tangible replica with mobile AR
Jouke Verlinden, Maaike Roozenburg & Wolf Song
Augmented Belief in Reality Maarten H. Lamers
A portrait of Daniel Disselkoen Hanna Schraffenberger
Augmented Prototyping: Augmented Reality to support the design process
The revival of Virtual Reality?
Wim van Eck
Welcome to the second issue of AR[t], the magazine about Augmented Reality, art and technology! Much has happened in the six months since AR[t] magazine’s first issue. The AR Lab has developed a new Augmented Reality headset in close co-operation with the Bio-robotics Lab at Delft University of Technology. Our article “The Augmented Painting: Playful Interaction with Multi-Spectral Images” has been accepted at ISMAR 2012, the International Symposium on Mixed and Augmented Reality. The project described in this article shows in an interactive way discoveries made by material specialists and art historians in Vincent van Gogh’s paintings. The AR Lab has grown: in September 2012 Dirk Vis joined our team for one day per week to promote AR and other new visualization techniques like 3D printing, in various disciplines at the Royal Academy of Art, by organizing a Pop-Up Art Gallery. We have furthermore welcomed Mariana Kniveton as an intern at the Lab. She researches the use of innovative visualization techniques in museums and has supported the editors of this magazine in various tasks. We have also invited DPI Animation House, a company based in Scheveningen, to join the AR Lab. We are experimenting with artistic projection mapping, while Jouke Verlinden works in this field for industrial goals like rapid prototyping.

The hard work of the last few months has paid off and the AR Lab won the 3rd prize at the RAAK-AWARD 2012 for the high quality of the research work done in the framework of a RAAK-Pro Research Programme. Furthermore, the AR Lab’s Van Gogh iPad project has been shortlisted for the ‘Most Innovative use of AR 2012’ Award as part of the AR Summit held in June 2012 in London. For the second time in a row we have presented our research at the ISMAR conference, which took place from 5 till 8 November 2012 in Atlanta, and the AR Lab’s presentation of the Van Gogh project was rewarded as ‘Best Demo 2012’.

The AR[t] magazine has developed as well. Six months ago we started off as an aspiring magazine series for the emerging AR community in and outside the Netherlands. We received enthusiastic feedback on the first issue of the magazine. More than that, we were approached with various contributions from interested parties. The editors have selected several of these contributions to be included in this second issue. As a consequence this issue is even more diverse than its predecessor. Besides contributions from researchers, artists and lecturers of the AR Lab (based at the Royal Academy of Art, The Hague), Delft University of Technology (TU Delft) and Leiden University, this second issue also features texts by international researchers as well as local young writers. Contributions include technical articles, entertaining essays, in-depth discussions of AR art works as well as short columns.

While the magazine undoubtedly has grown, its core remains the same. In AR[t], we share our interest in Augmented Reality (AR), discuss its applications in the arts and provide insight into the underlying technology. If you haven’t done so yet, we invite you to check out the first issue and our website www.arlab.nl to learn more about Augmented Reality in the arts and the work of the AR Lab.
Penetrating deeper into the temporal lobe
Barbara Nordhjem
CREW
CREW is an interdisciplinary collective of artists and scientists mainly based in Belgium. The composition of the group changes depending on the project, but the performances are always at the intersection between art and science. CREW has been combining theatre and technology since 1998, with Eric Joris as artistic leader. Their first immersive performance, “Crash”, premiered in 2004. Since then, the group has been experimenting with hybrid performances and installations where story-telling, live elements, and human-machine interfaces come together. There is no traditional separation between the actors on a stage and the audience. Instead the visitor is guided through different settings where active participation is required. A lot of CREW’s activities evolve around issues that are also currently being investigated by neuroscientists and philosophers. Themes such as the human mind, senses and our experience of reality are explored with “multimedia as a prosthesis”. Apart from the performance group, there is also the CREW_lab, which focuses on research related to immersive media and the development of new technology.
I am always looking for art that messes with your mind and tricks your senses. The performance Terra Nova by CREW starts out as a perverted experiment with the audience as guinea pigs. In other words, I feel like I have come to the right place. CREW is an interdisciplinary group of artists and scientists based in Belgium. Their performances incorporate new technological environments with live performance. Terra Nova can be experienced by 55 participants at a time. Before entering the performance space, we are all given headphones and divided into five smaller groups. During the first part of the performance, I am placed in a steel chair and can observe how other people are guided through an alternate reality. A line of people enters, wearing brown jackets and heavy backpacks. Their footsteps are slow and insecure, the outside world is sealed off by headphones and video goggles. Each person is accompanied by an actor. The movements of the actors appear to be orchestrated by a central person who gives directions by using gestures, as if they carry out a medical procedure or experimental protocol with care and precision. It looks like a scene taking place in a dystopian near-future society. “The fear of death is in the temporal lobe, we are going deeper”, a monotone voice announces. The participants are then tied to wooden boards which are flipped backwards into horizontal position. On the sideline, we can also watch a video projection, images of narrow hallways, is there light at the end?
CREW, Terra Nova at DEAF2012. Image by Jan Sprij, 2012
At some point, I am instructed to proceed into another room by a distant voice. I line up behind other group members and we enter a room with a bare-chested man. He tells a story about delirium and fatigue during a rough expedition to the South Pole. The central theme is our perception of reality and how certain situations can create a distorted view of our surroundings. This way, the polar quest becomes a metaphor for exploring the brain and our senses. After a dramatic mind trip involving penguins, snowstorms, and the frontal lobe, I finally find myself in the role of ‘immersant’: a traveller in a virtual surrounding. I am eager to try out the video goggles and see if I indeed will feel like I am somewhere in a different time and place. Perhaps a few people from CREW noticed one person continuously turning her head around after being ‘plugged in’. The projection on the video goggles gives a partial view of the scene according to the direction I turn my head. I can explore the environment as I look and walk around (or rather, I wildly bounce my head in all directions to see if the equipment can keep up). This direct relationship between head movement and vision really gives the feeling of being placed in the video.

CREW, Terra Nova at DEAF2012. Image by Jan Sprij, 2012

Levels of reality

CREW collaborates with institutes for media technology to develop customized tools for immersive environments. The group works with ‘omni-directional video’ (OVD), which allows you to see surround video on a head mounted display (HMD). The places you are exploring as an immersant are pre-recorded, while actors around you appear via real-time video filmed with a small head mounted camera. By using an orientation tracker placed on the immersant’s head, it is possible to move around in a multi-layered video environment.
What appears to be one coherent view of the environment is created by combining 360 degree pre-recorded video, real-time video from the current environment, and metadata from the orientation tracker. Different recordings are merged into one unified scene where the immersant can move around and explore.

Unlike augmented reality (AR) and virtual reality (VR) there are no computer-generated virtual elements, but different layers of video recorded reality. VR takes place completely in a virtual 3D world, whereas AR allows you to see the real world with added virtual elements. What CREW presents is a form of mixed reality where video recorded elements from a different time and place are combined with elements recorded in real-time. In mixed reality there has been a strong focus on the visual aspect of the experience, especially in earlier VR scenarios where the mind travels in a completely computer-generated world detached from the physical body. CREW takes a turn towards embodied technology and physical presence by combining video, sound, motion tracking, and tactile stimulation.

Tapping into the senses

CREW takes a lot of experimental findings from cognitive neuroscience and puts them into practice in the performance. In the beginning of an immersion, you find yourself in a state of sensory deprivation. Visual references to the surrounding space are concealed during an initial period of complete darkness. To add to the loss of orientation, you are then strapped onto a standing bed which is slowly flipped backwards. This creates a very effective feeling of disorientation by moving the body around in space without any visual references. Shielding off visual input and relocating the body in space removes the reference to reality.

It is already known that the brain is more likely to fill in information when there is no sensory input. Early studies in the 1960s and 1970s showed that people start to hallucinate when they are put into tanks where they are floating in complete darkness. More recently, there has also been an experiment showing some hallucinations after just 15 minutes of sensory deprivation. It seems like you begin to perceive your own reality in the absence of external input. In the case of CREW, the initial period of complete disorientation seems to create a heightened sense of presence by preparing you for accepting the new mediated reality. If you don’t know where you are, you might as well be located in the video projection appearing in front of you.

Another tactic used to enhance presence — the feeling of being there — is to create a link between what is shown on the HMD and what you feel on your body. During the immersion, you will see a virtual hand being touched via the video goggles and feel your own hand being touched the same way. The combination of visual and tactile stimulation can create a powerful illusion: you start experiencing the virtual hand as your own. In other words, your sense of bodily self is moved into the virtual world; you feel like it is really you on the screen. This effect works with a principle called the rubber hand illusion (RHI), which has been used in several scientific experiments. The participant’s real hand is hidden and a rubber hand (or someone else’s hand) is visible instead. Synchronous stroking and tapping of both the participant’s hidden hand and the rubber hand creates the illusion that the rubber hand actually belongs to the participant. If stimulation is not synchronous, the illusion does not take place. You can try the illusion at home by putting a glove on the table and keeping your own hand hidden under the table. Let someone stroke both the glove and the real hand in the same way at the same time. The RHI is an example of visual capture, where the visual sense dominates over other senses. You feel tactile stimulation on your body, but see it happening somewhere else. The brain has to integrate conflicting information, with the result that the fake hand feels like a part of your own body.
Image by René Passet
CREW tells a story about perception, the brain, and what we experience as real. It is a story about the mind and a poetic interpretation of scientific studies. A lot of fundamental questions are posed about who we are and what we believe is real. Can we trust our senses at all if our experience of where we are and what we perceive can be manipulated so easily? CREW states that neuro-philosopher Thomas Metzinger inspired the performance Terra Nova. The idea of our bodily self is something we take for granted and which we see as a fundamental part of being a person. Metzinger argues that how we experience ourselves is an ongoing process rather than something fixed and stable. The experience of being a person with a body and a mind is the result of neural processes and sensory information. When your brain begins to receive evidence that your body is located somewhere in a virtual world, that becomes the experienced reality.

It may be a disturbing idea that our perception of reality and of having a self is not stable but can be manipulated. At the same time, it also means that we are able to see and feel much more than what we may think. After the performance, I walk outside with my unstable self and my deceivable senses. I look down at my hands a second time to check that they are really mine.

References

Botvinick, M., & Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature, 391, 756.

Lenggenhager, B., Tadi, T., Metzinger, T., & Blanke, O. (2007). Video ergo sum: manipulating bodily self-consciousness. Science, 317, 1096-1099.

Mason, O. J., & Brady, F. (2009). The psychotomimetic effects of short-term sensory deprivation. The Journal of Nervous and Mental Disease, 197, 783-785.

Metzinger, T. (2007). Empirical perspectives from the self-model theory of subjectivity: a brief summary with examples. Progress in Brain Research, 168, 215-278.

Wynants, N., Vanhoutte, K., & Bekaert, P. (2008). Being inside the image. Heightening the sense of presence in a video captured environment through artistic means: the case of CREW. Proceedings of the 11th Annual International Workshop on Presence, 157-162.
Barbara Nordhjem
I graduated in 2011 with a master’s degree in Cognitive Neuroscience from the University of Leiden. Now, I am in the Visual Neuroscience Group at the University Medical Center in Groningen. I am generally interested in visual perception and how we are able to extract the most useful information from the environment in different situations. I have a rather mixed background. Before I started my PhD research about different ways of seeing our surroundings and the neural mechanisms involved, I initiated a festival for live visuals and electronic music in Denmark and have worked at the media art institute V2_ in Rotterdam.
Marty, the new affordable AR headset!
Yolande Kolstee and Pieter Jonker

The AR Lab has developed a new Augmented Reality headset in close co-operation with our partner the Delft Biorobotics Lab of the Delft University of Technology. It is a successor to the well-known headset George, which became famous because of the work done on co-operation through AR by our partner, the Systems Engineering Department of the Faculty of Technology, Policy & Management at the Delft University of Technology, for the CSI-The Hague project (www.csithehague.com).
Our new headset deserves an appropriate name: ‘Marty’ refers to the scene in “Back to the Future (2)” in which VR headsets were used. In contrast with George, the new headset is a video see-through system. Niels Mulder designed both George and the new Marty.

The Sony HMZ-T1 VR optics was used as a starting point for building Marty. The basic components of the Sony head mounted display have been re-used. Two Logitech C905 cameras were added to the design and the glasses have been made more robust. In line with the design of the George headset, simplicity and robustness have been integrated with state of the art design. Niels drew inspiration from the famous View-Master: a simple, low cost setup which has been in use for decades and allows you to see real 3D images in a simple stereo setup.

Oytun Akman and Robert Prevel, PhD researchers at the Bio-Robotics Lab at Delft University of Technology, developed the software for this headset. The software allows:
■ Head-pose tracking and on-line building of a coarse 3D point-cloud map at 30fps;
■ Reconstruction of the 3D scene and on-line building of a fine grain 3D point-cloud map at 2fps;
■ Hand pose tracking for human-machine interfacing to control the headset.

The software works by tracking 3D natural features on multiple scales, using only a stereo video feed from a pair of webcams. It is still very difficult to position virtual information in three-dimensional space for AR headsets, so we are very glad that major breakthroughs in software development by TU Delft have made 3D natural feature tracking possible. With natural feature tracking, no markers are needed; instead, salient points in the 3D space — on walls, books, and paintings for example — are tracked and used to provide invisible anchor points by which the virtual objects or scenes can be positioned. In contrast to markers, natural feature tracking is non-intrusive as it uses key points from the world around you.

Open Source and low-cost

This project is open source. The AR Lab will make the 3D model and print associated with this low-cost AR headset available to the public. This allows anyone to make an AR headset at 720p for only € 1.500, paving the way for high quality Augmented Reality for the general public.

With this new headset and software, developed by the Delft University of Technology and the AR Lab, it will be possible to walk through any space and experience Augmented Reality as never before. The design files of Marty will be made available a.s.a.p. on SourceForge. The tracking, mapping and user interface software will be made available (on SourceForge) after Oytun Akman defends his PhD thesis on December 3, 2012. The AR Lab will open a blog for help and comments on the design of Marty and the tracking, mapping and user interface software.

The PhD thesis of Oytun Akman (see Figure 1) can be downloaded from: http://repository.tudelft.nl/view/ir/uuid:3adeccef-19db-4a06-ab26-8636ac03f5c0/
Figure 1: Oytun Akman’s PhD thesis
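The coarse point-cloud mapping described above rests on standard stereo geometry: a natural feature seen by both webcams shifts horizontally between the two images, and that disparity gives its depth. A minimal sketch in Python; the focal length, baseline and principal point below are illustrative assumptions, not Marty’s actual calibration:

```python
import numpy as np

# Hypothetical rectified-stereo parameters: focal length in pixels,
# baseline in metres, principal point of a 1280x720 (720p) image.
FOCAL_PX = 700.0
BASELINE_M = 0.065
CX, CY = 640.0, 360.0

def triangulate(left_pts, right_pts):
    """Back-project matched feature points from a rectified stereo pair.

    left_pts, right_pts: (N, 2) pixel coordinates of the same natural
    features seen by the left and right camera. Returns (N, 3) points
    in the left camera frame, using Z = f * B / disparity.
    """
    left = np.asarray(left_pts, dtype=float)
    right = np.asarray(right_pts, dtype=float)
    disparity = left[:, 0] - right[:, 0]        # horizontal shift in pixels
    z = FOCAL_PX * BASELINE_M / disparity       # depth from disparity
    x = (left[:, 0] - CX) * z / FOCAL_PX
    y = (left[:, 1] - CY) * z / FOCAL_PX
    return np.column_stack([x, y, z])

# A feature 1 m straight ahead produces a disparity of f*B/Z = 45.5 px.
cloud = triangulate([[CX + 45.5, CY]], [[CX, CY]])
print(cloud)   # ~[[0.065 0.    1.   ]]
```

In the real system this back-projection runs inside the tracking loop together with feature matching and pose estimation; the sketch only shows the geometric core that turns matched key points into 3D anchor points.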
Advantages of AR headsets over smartphone or tablet PC based AR

Using a smartphone or tablet PC to see AR information has a lot of advantages because smartphones and tablets are so widely used. Many users experience an extra layer of information by scanning AR markers or by using GPS and compass information on their location. This can link them to cultural, historical or commercial information.

However, there are some disadvantages too. The screen size is very limited, even when using an iPad or Android tablet. Moreover, smartphone-based AR requires that the user hold his phone at eye-level to position the phone between his eyes and the real world he is augmenting. This puts the user in a tiring pose that cannot be sustained and interferes with other activities, such as walking around in a street finding historical or architectural information to insert into the AR view.

Another annoyance is the process of setting up an AR application via, for example, Layar. You have to start up Layar, activate the application, say ‘Restaurants around’, and then wait for the phone to get the correct information. In most cases, you also have to lay the smartphone flat to prevent compass errors. In addition, distortion of the virtual layer in relation to the real world makes smartphones insufficient for certain applications.

Head-set AR displays, on the other hand, permit the user to see the real world in 3D, which enables the user to move around safely and to see and experience virtual reality at the same time. The experience of a virtual layer when using AR eye-wear is immersive, as we know from our earlier experiments with AR headsets in an exhibition in Milan, where we showed virtual furniture. By designing the headset frame properly, we can keep occlusions to a minimum and maximize the user’s field of view. This allows the AR glasses to be used hands free and lets the user do tasks requiring both hands, as many people have seen in the BMW commercial where a mechanical engineer is repairing a motor.

The CYBER-I BI-Ocular: the new professional AR headset

The AR Lab ordered the manufacturing and obtained the very first optical see-through headset from our partner CybermindNL.com, the CYBER-I BI-Ocular. This is an SXGA optical see-through AR headset with two USB 2 uEye LE cameras. It is about a factor 10 more expensive than Marty but, unlike Marty, apt for the professional market. We are negotiating with business partners to bring a professional version of Oytun Akman’s software to the market for this headset. Contact email@example.com if you have a business proposal.

The CYBER-I BI-Ocular headset
Radioscape in the context of augmented reality
Edwin van der Heide

When I initially got the idea for Radioscape, I was thinking about creating a parallel reality that traverses the existing physical space and reveals itself by navigating through that physical space. It was later, when I encountered the term augmented reality, that I started reflecting about artworks in which the augmentation of reality plays a central role. During this process I became more and more convinced of the idea that Radioscape can be seen as an artwork in which reality gets augmented. It’s important to mention that I consider all modalities (not only the visual) when thinking about augmentation, and I believe that it’s a requirement for an augmented experience that the perceived result is more than the combination of the real and the virtual. The real and the virtual should not just co-exist but relate to, and influence, each other, leading to an experience that is more complex than the addition of the two. I like to see it as a perceived ‘multiplication’ of the world around us with a virtual constructed reality.

Image courtesy STUDIO Edwin van der Heide
In this text I’ll first give a more technical explanation of Radioscape. I believe it’s important to understand the parallels between the acoustic space and the electromagnetic space that form the foundations of the work. After that I’ll go into more detail on the nature of the augmentation that occurs in the work.

It was a call for artworks for an area in the countryside of Japan that brought me to the following question: we perceive sounds as individual sources that merge and interact with the acoustic space around us; would it be possible to achieve a similar experience with electromagnetic (radio) waves? Sound is not only spatial because it traverses and reflects in the three dimensional space around us, it is also spatial because our listening results in the perception of sound as a spatial phenomenon. Furthermore, any spatial experience, such as the perception of sound, has a bodily component to it because we relate the perception of (things in) space to the perception of our body in space. We can act in, and move ourselves through, the acoustic space.

Radio is often used to transmit individual signals from one point to one or more other points. The spatial nature of the electromagnetic waves and the multiplicity of sources are ignored in order to realize individual communication channels (examples of exceptions are certain radio amateur contests, fox hunting and GPS). To establish these channels, different forms of modulation (like AM, FM, GSM, etc.) are being used. Individual transmitters broadcast on their own frequency and with a receiver we tune in to a specific frequency while filtering away the others.

I wondered what would happen if we would simply shift sound up in frequency, transmit it at one location, receive it at another location and shift it back down. This is a form of transmitting without using any form of modulation, just using a certain frequency range within the electromagnetic spectrum. What I especially wanted to know is what would happen if we would transmit two independent sounds by shifting them up individually, transmitting them at two different locations, receiving them at another location, shifting them back down and listening to the received signal. I was expecting that the signals would ‘mix’ in the electromagnetic space just like acoustic sounds ‘mix’ and co-exist in the acoustic space. It is an approach to radio where we think of it as a field of transmitted sources with the receiver moving through this field, instead of a receiver that is tuning in to a single transmitter and establishing a (one directional) non-spatial, point-to-point connection.

The above setup of transmitter(s) and receiver can simply be compared to a setup of speaker(s) and a microphone, but addressing the electromagnetic space instead of the acoustic space. The transmitters can be seen as amplifiers with an antenna instead of a speaker connected to the output. And the receivers use an antenna instead of a microphone. The used frequency range is determined by the amount of frequency shift used by the transmitter and receiver.¹

The loudness of acoustic sound sources decays when we increase the distance to the source because the radiated energy gets spread over a bigger surface in space and, besides that, the air itself absorbs a bit of the energy as well. Electromagnetic waves have that same behavior, but when using regular radio transmission techniques this effect is avoided or compensated for. The reason for this is that both FM and AM use a carrier signal that is modulated by the audio signal.²

The proposed system of shifting the audio signal up (and down) in frequency is a form of transmission without using a carrier frequency. Any signal the receiver receives within the chosen frequency range becomes audible. It is the individual strength of the received signal(s) that directly corresponds to the loudness of the signal(s). The closer the receiver gets to the transmitter, the louder the signal becomes and vice versa.
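The shift-up/transmit/shift-down idea can be sketched numerically: a frequency shift is a multiplication with a complex exponential, and summing the shifted signals stands in for their superposition in the electromagnetic field. The sample rate, tone frequencies and 100 kHz shift below are arbitrary illustrations, not the values used in the actual work:

```python
import numpy as np

FS = 48_000                      # sample rate in Hz
t = np.arange(FS) / FS           # one second of time

def shift(signal, offset_hz):
    """Shift a complex baseband signal up (or down) by offset_hz."""
    return signal * np.exp(2j * np.pi * offset_hz * t)

# Two independent sounds, here just complex tones at 440 Hz and 660 Hz.
sound_a = np.exp(2j * np.pi * 440 * t)
sound_b = np.exp(2j * np.pi * 660 * t)

# Each transmitter shifts its own sound into the chosen radio band...
tx = shift(sound_a, 100_000) + shift(sound_b, 100_000)
# ...the signals superpose in the shared medium, and a single receiver
# shifts the whole band back down again.
rx = shift(tx, -100_000)

# The received spectrum contains both tones, mixed like acoustic sources.
spectrum = np.abs(np.fft.fft(rx))
peaks = np.sort(np.argsort(spectrum)[-2:])
print(peaks)   # [440 660]
```

Because no carrier is involved, whatever else happens to fall in the shifted band would come back down with the tones, which is exactly the mixing behavior the work is after.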
While sound travels with a speed of about 340 meters per second, radio waves travel with the speed of light: 300,000 kilometers per second. An acoustic sound reflecting in space is something that we perceive as both a timbral and a temporal phenomenon. We can hear an acoustic sound reflecting in space, resulting in reverb or echoes. It is temporal not because it just happens in time but temporal because we perceive it as something that happens in time. We speak for example about a reverb with a duration of 1.5 seconds or an echo that arrives half a second later. Radio waves can also reflect in space but since they are that much faster than sound we would need enormously big spaces in order to be able to perceive these temporally. For example, it takes approximately 2.5 seconds for a radio signal to travel to the moon and come back. The moon has an average distance of about 384,400 kilometers.

The wavelength of a wave depends on its frequency and traveling speed. Frequencies in the FM band range have a wavelength of about three meters. When these waves reflect between buildings in a street this leads to standing wave patterns with cancellation points that occur - for example - every 1.5 meters. We can avoid these standing wave patterns by using a relatively long wavelength of 175 meters (= 1.7 MHz). At this wavelength buildings are not ‘just’ reflectors, they start to become conductors and resonators for the transmitted signals. This means that the physical environment is excited by, and responds
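The figures in this passage follow from the basic wave relations (travel time = distance / speed, wavelength = speed / frequency); a quick check against the rounded values in the text:

```python
C_KM_S = 300_000      # speed of light in km/s
MOON_KM = 384_400     # average Earth-moon distance in km

# Round trip to the moon and back:
print(round(2 * MOON_KM / C_KM_S, 2))   # 2.56 seconds, roughly 2.5

# Wavelength at 1.7 MHz (speed in m/s divided by frequency in Hz):
print(round(3e8 / 1.7e6))               # 176 metres, the text rounds to 175
```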
to, the transmitted radio waves.

While developing the work I learned about different antenna principles. A vertical antenna has an omni-directional sensitivity pattern and relates to the electric component of the electromagnetic field. A coil or loop antenna is only sensitive from the sides and relates to the magnetic component of the electromagnetic field. I realized that these two directivity patterns were equal to the patterns of the omni-directional microphone and the figure-eight microphone. There is a stereo recording technique called mid-side (M-S) that uses exactly these two microphones, and I started to wonder whether it would be possible to realize a stereo receiver with such an antenna setup. It would not be a receiver that receives a signal that is broadcast in stereo but a receiver that creates a stereo image with the positions of the individual (mono) transmitters. Transmitters that are to the left of the antenna will be heard more on the left and transmitters to the right side of the antenna will be heard more on the right. Rotating and moving the receiver changes the stereo image directly.

Image courtesy STUDIO Edwin van der Heide

Each transmitter is transmitting its own layer of the meta-composition. The layers are slowly changing and eventually repeating after 4-10 minutes. The changes within a layer are the slowest changes that you can experience in this environment. It’s a result of not walking and not moving the receiver and just listening to the changes
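The mid-side principle behind such a stereo receiver can be sketched in a few lines: the omni-directional (mid) signal and the figure-eight (side) signal are summed and differenced into left and right channels. The sine-shaped directivity used for the side antenna is a textbook idealization, not a measurement of the actual Radioscape antennas:

```python
import numpy as np

def ms_decode(mid, side):
    """Decode a mid-side pair into left/right: L = M + S, R = M - S."""
    return mid + side, mid - side

def receive(source, theta):
    """Receive a mono source at angle theta (radians, positive = left).

    The omni (mid) antenna picks it up at full strength; the figure-eight
    (side) antenna scales it by sin(theta), with opposite sign per side.
    """
    mid = source
    side = np.sin(theta) * source
    return ms_decode(mid, side)

tone = np.sin(2 * np.pi * 440 * np.arange(480) / 48_000)

# A source fully to the left ends up only in the left channel.
left, right = receive(tone, np.pi / 2)
print(np.allclose(left, 2 * tone), np.allclose(right, 0))   # True True
```

Rotating the receiver simply changes theta for every transmitter at once, which is why turning the handheld antenna re-pans the whole stereo image directly.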
within the received layers themselves. The next level of change is the interaction that occurs when you don’t walk but just move the receiver. By doing so, you re-orient yourself in the field of received signals and find new perspectives to the environment. The last level of change is the result of simply walking and thereby getting closer to certain transmitters while moving away from others. Certain transmitted signals will decrease or disappear while other signals will fade in or become louder. While listening, you alternate your focus and the way of interacting.

The Radioscape receiver is a handheld receiver. By moving the receiver you explore the space around you. Every movement of the receiver has a direct effect on the received signals. The scale and the speed of the changes match well with the space that we describe with the movements of our hands and arms. Therefore the space around you becomes an almost tangible space in which you explore and remember positions and transitions. It’s intuitive to navigate in the sense that it reveals itself easily but is complex enough to keep on exploring.

Radioscape takes place in the public space. My preferred locations are areas within a city with a lot of diversity, that are easy to walk and have streets that are close to each other. It can happen that you suddenly receive a transmitter that is otherwise inaudible because of its large distance, or that one of the transmitted signals becomes dominant over the others. The resonating buildings are an interesting example of a situation where the real world interacts with the added layer. The radio signals are not just a parallel reality that leads to associated relations between the physical space and the electromagnetic space. They directly influence each other.

Besides the intentionally transmitted signals, the receiver picks up other signals that are present in the same frequency range. Neon lights and computers that control street lights can emit strong fields around them that don’t carry far but can take over in loudness on a local scale. Already present signals that are otherwise unperceivable become perceivable and part of the received environment. These moments add to the experience and make it even more real.

Radioscape is a work in which you can share your experiences. By navigating the city you generate your own sonic order, combinations and timing of the composition. When with other people, you share the same augmented space. The global experience will be more or less identical while local signals really differ from each other. I’ve seen participants talk about what they experience, imitate each other and notify each other of particularities they discover. They are in an augmented world that is based on the intersection of the physical world with a composed parallel world of transmitted signals. Furthermore, there is a true interaction between these two worlds.
the participants frequently have to choose where to go. It is the chosen city area that dictates the
possibilities of where you can go. as mentioned earlier the buildings in the
■ 1 Examples of frequency bands are the Lw, Mw,
environment become conductors and resonators
Sw and FM band. radioscape is however not
for the radio waves. There is often a clear effect
using the corresponding modulation tech-
when you move the receiver close to a building.
nique and therefore incompatible with con-
It happens for example that you start hearing a
â– 2 The carrier signal is generated by an oscil-
transmitter signal (carrier) is multiplied with
lator that corresponds to the transmitter
the audio signal. The received signal decays
frequency. when using frequency modula-
with increasing distance between the trans-
tion (FM) the transmitted audio signal modu-
mitter and receiver, but this is compensated
lates the transmitter frequency (carrier) and
for by an automatic gain control system build
therefore the content is independent from
into the receiver and resulting in a constant
the received amplitude. when using ampli-
amplitude of the carrier. It does not influ-
tude modulation (aM) the amplitude of the
ence the dynamics within the audio signal.
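The mid-side principle the work borrows from stereo recording is simple to state in code: the omni-directional signal acts as the “mid” channel, the figure-eight signal as the “side” channel, and the left and right channels are their sum and difference. The sketch below only illustrates that decoding step; it is not the actual Radioscape receiver software.

```python
# Mid-side (M-S) decoding, illustrative sketch. The omni-directional
# antenna/microphone supplies the "mid" signal M; the figure-eight
# supplies the "side" signal S. A source off to one side contributes
# to S with a sign that depends on which lobe of the figure-eight
# pattern it hits, so left and right end up differing.

def ms_decode(mid, side):
    """Convert mid/side sample lists into left/right sample lists."""
    left = [m + s for m, s in zip(mid, side)]
    right = [m - s for m, s in zip(mid, side)]
    return left, right

# A source dead ahead contributes only to mid: equal in L and R.
# A source to the left contributes positively to side: louder in L.
mid = [1.0, 1.0]
side = [0.0, 0.5]
left, right = ms_decode(mid, side)
print(left, right)  # [1.0, 1.5] [1.0, 0.5]
```

Rotating the receiver changes how each transmitter projects onto the two antenna patterns, which is exactly why the stereo image follows the movement of the hand.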
Image courtesy STUDIO Edwin van der Heide
Finding what I didn’t seek
Esther de Graaff

“I’ve never been here”, says one of the participants, who has lived in Rotterdam for eleven years. At the beginning of the “Serendipitor Walk” ten minutes earlier, he told us he knows every part of Rotterdam like the back of his hand. Within a few minutes, my initial scepticism about this navigation app has disappeared. Mark Shepard, a charming and pleasant American artist, architect, and researcher, smiles. In 2010, he developed the “Serendipitor” app together with the V2_ Institute for the Unstable Media in Rotterdam. It is part of his longstanding project “Sentient City Survival Kit”: a collection of objects for surviving futuristic urban life. ‘Surviving’ may sound rather severe, but the term indicates that Shepard hasn’t just made a nice toy. His work is an investigation into the relationship between the ubiquitous computer and the city, in an attempt to provoke thought about our controlled society. By following various instructions, you are forced to relate to the city in a novel way. For example: follow a man with an umbrella, or buy a rose and give it to the first passerby. Each walk is determined by four types of instructions: social (meeting people), geographic (go left), abstract (track a square) or architectural (enter the tallest building). You decide how long and complicated your walk will be and how many assignments you wish to do, depending on how much time you have to play, explains Shepard. It is not surprising that his instructions draw on artists like Vito Acconci and Yoko Ono, and on the Fluxus movement, an international avant-garde art movement from the sixties which focused on art based on mundane things.
During the DEAF festival in Rotterdam (May 17-20, 2012), Shepard was present every day to walk the “Serendipitor Walk” with a group; an amazing opportunity. On a beautiful Friday afternoon, we accompany the artist from the DEAF main site at the Coolsingel into the city. Our destination has been entered, but the exact route depends on the instructions we are given. For our first assignment, we must walk down the Coolsingel to the Meent, find a quiet spot, and stay there for a while. OK. So we arrive at a very busy intersection, not the easiest place for quietness. We spot a wide staircase at the side of a building, which leads us to a higher, quieter place. Suddenly we are faced with a door, behind which we find an oasis of calm and beauty. We’ve come across a gallery unknown to me: the WTC Art Gallery. After walking around here for a while, we press ‘Next’ for our next assignment. We must go east and find a bar, offer someone a drink and ask him/her to draw a map of his/her childhood. At Café Brasserie Dudok, we meet a woman from Turkey who is waiting for a friend. While enjoying a drink she tells and draws us her life story in just ten minutes. She has established an impressive career as a doctor in many European cities. Despite how short our encounter was, I am moved by the beautiful story. Next, we placed her drawing over the map of Rotterdam, so that the lines literally determine our route. This dictates that we head west.

Current navigational products, such as TomTom, make it very easy to get from A to B. Previously, computer technology could only be found indoors. Nowadays, the Internet is everywhere in the city. We are surrounded by these invisible digital information systems, which constitute a kind of virtual layer over our reality. This creates a capacity to gather and process information that is increasingly interwoven with the physical structure of our urban space. For instance, various applications for smartphones and tablets provide digital information about your immediate surroundings. Simply hold your phone in front of a street to see where you can find the nearest ATM, or point it at an old building to read its history. This is incredibly convenient, but the rise of computers also makes it easier for futuristic cities to keep an eye on us. You could say that the city is getting smarter. Shepard wants to make us think about the consequences this could have on our culture and politics, in regard to privacy for example. His app presents an alternative way to navigate through a city where everything is regulated, public and controlled. Shepard’s point is to become aware of your surroundings and the route you travel. However, the instructions themselves are fairly concrete and thus exert some form of control. Shepard responds with a smile: “Freedom is not possible without control”.

After having walked for an hour and a half, we collectively decide to end the ‘artist walk’. The assignments kept guiding us away from our original route. We did not reach our destination, but that is not important. During this walk, I found places in Rotterdam that I would otherwise have walked right by. We found places unexpectedly when we were searching for something else. And I’ll be damned: he inspired me and got me thinking. I often have no idea what route I traveled, simply because I look at the screen instead of my surroundings. The app stimulates you to really look around and to make contact with people. The app is available for download and use in every city in the world. Want to skip a task? Anything is possible, simply press ‘Next’. Each assignment gives the opportunity to find something you were never really looking for. Something we should do much more often.

The app is free to download from the App Store. Small detail: unfortunately not suitable for Android users.
© June 2012, Esther de Graaff
ESTHER DE GRAAFF www.estherdegraaff.nl
Project manager, curator, writer, coordinator — call me any of these things; I do them all. My love for organizing events is central. It’s all about the end result. I support artists, curators and institutions in organising various projects, both in content and in organisation. Social themes are essential to my work. I am not interested in art for the sake of art; it’s all about sharing experiences. Culture is an encounter. The projects, texts and exhibitions created by Esther de Graaff are characterized by creative solutions and efficiency. Inspiring people with beautiful images and deepening their view of the world is the whole point.
Smart Replicas — bringing heritage back to life
Maaike Roozenburg

Treasuries of objects
Our museological heritage comprises collections of objects — from paintings to pottery — which have been categorized, grown and preserved over the course of centuries. Objects we value for their historical, cultural and social context; objects that play a key role in the way we view our shared history and cultural identity. Our history is examined, reconstructed and visualized through these objects, but how can we relate to all these stored items? What is their relevance at this time? And what is their significance in a rapidly digitizing society?

Museums and cultural heritage sites are the “institutions” who manage this heritage; storehouses of objects, stories, images and historical knowledge. The conservative way these objects are made accessible — you must go to a museum or exhibition — means you can admire, but not touch, let alone use them. This strips the objects of their main purpose and function — their original use — and isolates them from our daily lives.
Image by Sam Rentmeester
‘Enriching’ objects
The knowledge and information which museums and cultural heritage sites hold, gather and develop about the items they own and manage is barely accessible to the public. It is fixed within the confines of the museum and is often reduced to a short text on a museum plaque or an item in a catalogue.

In the project Smart Replicas, we investigate how 3D prototyping — such as 3D scanning, printing and reproductions which make digital objects “real” and vice versa — and augmented reality (AR) can contribute to the accessibility of museum objects while providing the means with which we can also increase our knowledge of the objects. How can you “reverse engineer” a museum piece? Will this allow you to take the museum piece out of the museum and put it to use? Can you turn cultural heritage into new design? How can you link these objects to the historical information which is stored in a museum? How could visitors interact with these “enriched” objects? These questions will be examined in the project Smart Replicas.

Smart replica is the term we have given to replicas of historical objects made usable again by means of 3D scanning and printing techniques. Smart refers to ‘intelligent’: these objects are not just copies, but replicas enriched by innovative technologies such as AR to bring information across, so that they can serve their original intent and provide information outside the museum at the same time. That way, heritage can provide new designs; objects shaped and refined through the centuries can be used in the present.

Smart Replicas is a project by Studio Maaike Roozenburg in cooperation with four partners: Delft University of Technology, Faculty of Industrial Design and Faculty of Civil Engineering; Delft Heritage; Museum Boijmans Van Beuningen, Department of Archaeology; and the AR Lab together with students from the Graphic Design department of the Royal Academy of Art. A team of very different parties, each with their own background and expertise, contributes to this project: academics, students, curators and technical experts. Smart Replicas is a project in which the worlds of museological heritage, design, art history, 3D prototyping and AR come together.
'Reverse engineering' historic items Seven tea cups from the Boijmans Van Beuningen collection form the “base material” to explore how reverse engineering and 3D prototyping can be used to produce useful replicas of museum pieces. Previous tests have shown that a (medical) CT scanner can scan very vulnerable and “untouchable” historical objects. This does not damage the objects and meets the museum’s guidelines for handling these objects. The resulting data can be converted to a digital reconstruction of the object which can be printed out on a 3D printer. The aim is to combine 3D prototyping techniques with the historical porcelain techniques used to create the original cups. This includes experimenting with milling porcelain casting moulds straight from 3D models as well as printing onto porcelain directly. It is important that the replica is a complete designer product that invites everyday use and can lead a new life outside the museum’s walls.
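The scan-and-reconstruct route described above can be illustrated with a toy sketch: stack the 2D attenuation slices a CT scanner produces into a voxel volume, then threshold the attenuation values to separate material from air. The threshold and voxel size below are invented example numbers, not the project’s actual parameters.

```python
# Illustrative sketch (not the project's actual pipeline): stacking
# 2D CT slices into a voxel volume and thresholding the attenuation
# values to classify each voxel as porcelain or air.

VOXEL_MM = 0.2          # assumed voxel edge length in millimetres
THRESHOLD = 0.5         # attenuation above this counts as material

def reconstruct(slices, threshold=THRESHOLD):
    """Turn a list of 2D attenuation slices into a binary voxel volume."""
    return [[[value > threshold for value in row] for row in slice_]
            for slice_ in slices]

def material_volume_mm3(volume, voxel_mm=VOXEL_MM):
    """Physical volume of the material voxels, in cubic millimetres."""
    count = sum(value for slice_ in volume for row in slice_ for value in row)
    return count * voxel_mm ** 3

# Two tiny 2x2 example slices; a hollow object would show up as
# rings of True voxels around False (air) voxels.
slices = [[[0.9, 0.1], [0.8, 0.7]],
          [[0.2, 0.6], [0.0, 0.9]]]
vol = reconstruct(slices)
print(round(material_volume_mm3(vol), 6))  # 5 voxels -> 0.04
```

A real reconstruction would then extract a surface mesh from this binary volume (for instance with a marching-cubes step) before anything can be 3D printed.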
‘Smart’ objects
AR brings the physical and digital worlds of an object together, thus transforming it into an information medium. Or rather, AR can show the information, history, context and stories associated with the object. Each porcelain cup is a historical source, with a wealth of historical information: stories about travelling, locations, production processes, use and rituals. AR allows us to visualise this information and actually link it to the object; connecting knowledge to the object itself, outside the context of a museum. This allows the knowledge, carried by the replica, to leave the museum and enter the wide world.

Images by Maaike Roozenburg

A group of students from the Graphic Design department at the Royal Academy of Art is working on structuring and shaping digitized historical information concerning the cups. They can develop their own storyline or narrative and design it for a corresponding medium. Routes can be accessed with Google Maps; moving images can be displayed through animation; and text can be presented with the aid of visual or audio stimuli. These designs will be linked to the replicas using AR. The research also includes the markers that create that link. Here we examine how to combine the historical techniques and decorations of the original museum pieces with technology aided by markers and object recognition.

What is the real significance of our heritage? With the project “Smart Replicas”, we examine how 3D prototyping can be used to put heritage back to use and how AR can turn objects into information carriers outside the museum. The underlying questions we want to ask are: What is the meaning of all these stored historical objects? How can we relate to them? Are they relevant to us now? What happens when you copy them? What does this change in our perception and appreciation of the real object? What is the relevance of these objects in our increasingly digital society? Augmented Reality and 3D prototyping offer opportunities to pose these questions (indirectly), and to identify and investigate them.

The result is a trial installation in Museum Boijmans Van Beuningen which will run in spring 2013, where the smart replica prototypes can be ‘touched’, used and tested by the public. The project can be followed at http://smartrep-

Maaike Roozenburg
Maaike Roozenburg is a designer. She develops concepts, projects and products on the border of heritage, design and visual communication. After graduating from the Gerrit Rietveld Academy in Amsterdam she founded Studio Maaike Roozenburg, in which she combines a historical fascination with high-tech materials and techniques and ‘traditional’ crafts. ‘Research by design’ plays an important role in these projects. The studio aims to research and communicate the value of historical (museum) heritage for us now. This approach can result in a design collection of ceramics pieces, a work for the facade of a building, an exhibition for a museum, or a mobile application that unlocks a historical collection. Besides her work at the studio, Roozenburg teaches at the Post Graduate Course of Industrial Design at the Royal Academy of Art.

Image by Julia Blaukopf
How it was made: a tangible replica with mobile AR Jouke Verlinden, Maaike Roozenburg & Wolf Song
Replicated cup and lid in porcelain, augmented by animated 3D graphics on a smartphone.
As part of the “Smart Replica” theme initiated by Maaike Roozenburg, we made a proof of concept in the minor on Advanced Prototyping at the Faculty of Industrial Design, Delft University of Technology. The top figure presents an impression of the final result: an exact replica in porcelain of an 18th century sugar cup and lid; the decorations function as markers for a handheld AR overlay. On the cup, the augmentation is a 3D animation floating on top of the physical object, while the physical lid is covered exactly with a virtual golden decoration that matches the original lid. Most of the technologies that we used require only a basic skill level with some experience in 3D modeling (CAD or visualization). A close account of this developmental process can be found on the
Below, the most essential steps are discussed.

Scanning
The original Loosdrecht porcelain objects are part of the collection of Museum Boijmans Van Beuningen. They measure approximately 8 cm and, due to their fragility, non-contact scanning had to be selected. A medical Computed Tomography scanner was employed, in which a radiation source and the detectors rotate around the sample and measure the attenuation of the X-rays by the sample from different angles. In the slice the pixel resolution is approximately 0.2 mm, yielding a result which is still sufficient to inspect object thickness and geometry. Furthermore, the scans gave a surprising effect: the gold decorations (“goudluster”) caused distortions in the scans.

Reconstruction
A substantial part of the reverse engineering is the digital reconstruction of the scanned point clouds into a valid 3D CAD model that is printable. In this case, much effort was spent to convert the collection of JPEG pictures of 2D slices into a valid 3D mesh. However, this was not possible given the time constraints and we ended up extracting a vertical section view to generate a working revolve in Rhinoceros, a regular type of CAD package. In the cup, an AR tag was embossed, to be used as an optical marker. A 3D print was made with a polyjet technique in maximum resolution (an Objet Eden, accuracy 16 micron).

Acknowledgements
This project was done with students Anne Lien van der Linden, Kenneth van Kogelenberg, Mariet Sauerwein and Elizabeth Berghuijs. Furthermore, we would like to thank Wim Verwaal and Wim van Eck for their technical advice and guidance, as well as Alexandra Gaba-van Dongen of Museum Boijmans Van Beuningen for trusting us with such a unique piece of cultural heritage.
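The “vertical section to revolve” shortcut used in the reconstruction step amounts to sweeping a 2D profile around the vertical axis, which a few lines of code can illustrate. The profile points below are invented example values, not measurements of the actual cup, and this sketch only generates the ring of vertices, not a full CAD surface.

```python
import math

# Illustrative surface-of-revolution sketch: sweep a 2D section
# profile of (radius, height) points around the z-axis, as a CAD
# "revolve" command does. The profile values are made up.

def revolve(profile, steps=16):
    """Return vertices for the profile swept around the z-axis."""
    vertices = []
    for i in range(steps):
        angle = 2 * math.pi * i / steps
        for r, z in profile:
            vertices.append((r * math.cos(angle),
                             r * math.sin(angle),
                             z))
    return vertices

# A crude cup section: foot, belly and rim (radius and height in mm).
profile = [(15.0, 0.0), (25.0, 20.0), (22.0, 40.0)]
mesh = revolve(profile, steps=32)
print(len(mesh))  # 32 rings x 3 profile points = 96 vertices
```

The appeal of the revolve route is that a single clean 2D section is far easier to extract from noisy slice data than a watertight mesh of the whole object.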
Manufacturing and decoration
Based on the 3D prints, molds were made in plaster to pour the porcelain clay into. After the mass had dried, the molds were carefully removed and the casts were baked and painted, after which a final transparent glaze was applied.
Augmentation
For mobile AR, we chose Junaio, a straightforward smartphone application that offers location-based augmentation as well as image recognition. The so-called GLUE functionality allowed us to overlay 3D models and animations on top of predetermined images (patterns), with a web-based interface to adapt the scale, rotation, and position of the objects. For modeling we used 3D Studio Max, which could export in the proprietary .md2 format with a special plugin. The augmentation of the lid was a gold version of the same object, which was simply a modified version of the same reconstructed 3D file of the lid. The cup was extended with an animation of several portraits that seemed to float in the air.

Original Loosdrecht sugar cup and resulting mesh model after the CT scan.
Augmented Belief in Reality Maarten H. Lamers
At the Ars Electronica Festival 2008 I encountered artist Julius von Bismarck, who refreshingly resembled an Alaskan bearded mountain-man more than the typical sleek new-media artist. Anyway, Julius was there to present his work, the “Image Fulgurator”, a low-tech device that secretly adds elements to photographs taken nearby. Truly augmented reality, in my opinion.

Image courtesy the artist and Alexander Levy, Berlin
Let me explain. If you were to take photographs with your own camera of, let’s say, Barack Obama, and the Image Fulgurator was nearby, then a peace dove or a tomato could appear in all your photos. But you wouldn’t see the dove or tomato with your own eyes, only in the photos that you made. It works by projecting the visual element into the scene, for example onto Barack, exactly during the flash of your own camera. Your eyes won’t register it, but your camera will.

“But how is this augmented reality?” you may be asking now. Well, the Image Fulgurator definitely augments something, namely your photos. And since we accept photos as reality, particularly the ones we make ourselves, it augments what we believe to be reality. Actually, that is what all see-through AR does; it adds elements to our mediated vision of the real world. And with most AR systems we willingly suspend our disbelief that, for example, a pink bunny really appears from the pages of the book in our hands. We know that the bunny isn’t really in front of us, but we happily ignore this knowledge so we can enjoy it longer. In general, to appreciate most augmented reality, we must willingly ignore what we know is real.

And that is what sets the Image Fulgurator apart from most AR that I encounter: it does not rely on our willing suspension of disbelief! When tricked by the Image Fulgurator, we are truly confused by what is augmented and what is real. Just go online and watch the faces of the photographers that Julius tricked and filmed, while they stare puzzled onto the LCD of their fancy digital cameras. Their faces do not show understanding and acceptance of what happened. Quite the opposite! They are completely surprised by what appears to be real. It is like the pink bunny just jumped from the book and kicked them in the nuts: ultimate
[AR]chitectural Education at Play Vincent Hui
Image courtesy of Sam Ghantous
As a child, video games seemed to be an escape from the rigors of school, yet learning continued in the gaming environment — whether it was keeping track of scores and experience points, checking my health meter, or even simply reading in-game dialogue, my entertainment was fueling my education. The superposition of gaming data atop the fantastic onscreen activity ensured that I could better understand the world I was playing in. It comes as no surprise then that, as an architecture professor, I bring these same edutainment and gaming perspectives into my pedagogy through the development and deployment of an augmented reality (AR) app, entitled arch-app, at the nation’s largest architecture program at Ryerson University in Toronto, Canada.

The widespread ubiquity of AR technologies has extended into video gaming platforms such as the PlayStation Vita and Xbox Kinect, pioneered by its rapid integration in mobile computing devices and smart phones. In these gaming worlds, ubiquitous data is persistently on display, informing players about everything from mapping information to enemy data, thereby enhancing the gaming experience. Where current generation gaming technologies have pushed the boundaries of representation and simulation of the real world with incredible fidelity, AR access has empowered people with the ability to bring this persistent, ubiquitous data to the real world. With a quick glance through a smartphone, entirely new layers of information are made visible. Recent revelations that the rapid adoption of smartphones is accelerating (approximately one third of the global cell phone market) serve to highlight the inevitable ubiquity of an AR-enabled population. It is within this context of interface familiarity and hardware access that I have worked with my Ryerson University colleagues, Graham McCarthy and Steven Marsden, in assembling the infrastructure of an AR app that effectively provides a mobile database of building information to anyone with a smartphone.

Vincent Hui (MRAIC, Assoc. AIA, LEED AP) received his Bachelor of Environmental Studies (Architecture), Master of Architecture, and Teaching Certification from the University of Waterloo, as well as a Master of Business Administration (specializations in Marketing and Strategy) from the Schulich School of Business at York University. After gaining international and domestic work experience with architecture firms around the world, he became a partner at Atelier Anaesthetic in 2003. He has been awarded several teaching citations while at the University of Waterloo since 2001, within both the Schools of Planning and Architecture. He currently teaches a variety of courses within the Department of Architectural Science at Ryerson University in Toronto, Canada, ranging from design studios to advanced architectural computing and digital fabrication. Vincent’s works with physical computing and digital fabrication have been exhibited and published internationally. His recent work with architectural appropriation of ubiquitous computing, augmented reality, and data-scapes has culminated in the development of tools that allow users to access data on any landmark in the built environment.

One of the biggest challenges faced by educators in architectural science programs is providing meaningful and self-directed pedagogical tools that are accessible, current, and engaging. To many students, architectural engineering material is often discussed in the confines of classrooms rather than contextualized in real world projects. It is incumbent upon educators to seamlessly connect academia to real world application. Augmented reality proved to be the most suitable technology to ameliorate this condition. In the case of Ryerson University, the vast majority of new students are commuters who take public transit from their suburban homes to what amounts to a relatively unfamiliar environment in the downtown core. Despite the incredible architectural renaissance Toronto has experienced in the past three decades, many incoming students arrive unfamiliar with the acclaimed work literally across the street from their classrooms. Developed within four months, the early prototype of the arch-app was a simple database of notable architectural projects within the
The arch-app in use in the summer allowing a user to see a building in the winter.
Image courtesy of Sam Ghantous
downtown Toronto campus that highlighted basic project information, such as the completion date, the architects, and some basic text describing the project’s architectural relevance alongside supporting imagery. Originally developed in HTML 5, the project was a website that catered to a student audience that would explore the city while mobilized with access to a database of architectural landmarks. The prototypical infrastructure for the app, known as RULA Maps (Ryerson University Library & Archives Maps), became a tricorder of sorts where any user with a web-enabled phone could bring up individual layers of pedagogically valuable information. Like the venerable tricorder of science fiction, the RULA Maps project transformed a simple phone into a device able to uncover pinpoint data on nearby landmarks.

It has since grown into an AR interface that allows users to visualize everything from historic imagery and structural drawings, investigate the history and theory behind a building’s design, leverage global positioning to offer tours and guidance to particular sites, and even watch interviews with the architects behind the projects. Though a proprietary standalone application is currently underway, the arch-app content from the robust database has been integrated into widely accessible AR viewers such as Layar and Junaio.

Instead of listening to lectures about the built environment within the classroom, the arch-app has become a tool to introduce students to their surrounding buildings and serves to contextualize the academic discussion via real world application. For example, rather than assign readings and show slides of notable steel buildings in a Structures course, students could follow a preordained tour to visit relevant buildings in the city in person, using the Google Maps interface and global positioning system integrated in the arch-app. Once at a key landmark, students could also leverage the online database through the app to not only simultaneously see historic imagery atop what currently exists in the real world, but also better understand the building through detail and interior imagery, as well as future design proposals by other architects. An indispensable feature to students within the arch-app is the ability to arbitrarily use it to discover information about buildings they find interesting as they explore the city. Beyond showcasing orthographic, rendered, and historic imagery, the arch-app also provides users a glimpse into a building at different times of day and through the seasons. The app effectively became an augmented reality architecture professor in students’ pockets, uncovering facets of the buildings inaccessible or not necessarily visible in person. As though immersed in a video game environment, students engage the built world with a sense of discovery and confidence in seeing what they have learned in class applied in the real world.

AR interface identifying nearby architectural landmarks
Image courtesy of Sam Ghantous

Taking advantage of the Web 2.0 paradigm of information sharing and user-generated content, the arch-app also serves as a platform for students to add and update content as part of their academic responsibilities in various courses. Students are not only part of the end-user audience community, but also content creators and editors who vigilantly ensure app content is current, relevant, and robust. For one course, students are asked to create 3D structural models that can be pulled up in the app, while in another they would be asked to take photos of a construction site to maintain an archive of progress imagery once the building is complete. The aggregation of this process results in an archive that is both current and diverse, while also leveraging students’ learning and sense of accomplishment in having their content uploaded on the app. This has proven to be quite successful as, anecdotally, students have not only drawn material from courses to the built world, but have also leveraged the arch-app in integrating material from multiple courses. For example, using the AR component of the app, students are able to view structural models in tandem with existing conditions and historic imagery to understand why certain architectural decisions were made. This level of integration among courses and connection to existing buildings in real-time is a unique opportunity that is only made possible via the AR opportunities provided by the arch-app.

The pedagogical model that the arch-app has established has since partially become a victim of its own success. As students and even members of the general public continued to upload content freely, the limited capacity of the testing servers indicated a need to find better models of maintaining content. The current model in use relies upon providing students enrolled in specific classes using the arch-app with accounts whereupon they may upload content on their own. Unfortunately, as with any wiki-based system, there remains a need for a stronger mechanism for curating content, which currently is in the hands of a few professors and

The arch-app has served to address several educational issues in architectural engineering, including increasing accessibility, maintaining currency, integrative thinking with other courses, connecting students to a community beyond the university, and spurring greater curricular engagement within the department. Its success since its deployment a year ago has merited great interest both from within the academic community and from a variety of external stakeholders, including the general public. Future directions for greater integration with other courses within the architectural science program, and even with other departments such as Interior Design, are excellent opportunities in continuing the positive academic trajectory of the arch-app, while interest from professional architecture organizations has generated requests for collaborations to provide content from their membership to serve as a platform documenting great local projects. Another dimension currently in discussion is how to make the arch-app more engaging by integrating social media and possibly adopting game mechanics within the AR interface. AR-gaming hybridization under the arch-app may prove to be an excellent opportunity to maintain learning in an engaging, entertaining framework.
ar-gaming interfaces fostering learning â€” appar-
and drawing in real world application while also
ently some things never change!
netting secondary benefits such as encouraging in-
“There’s only one mind, the one we all share”
Image by Martin Sjardijn
Digital technologies and fine art — a complex relationship Martin Sjardijn
3D print, Weightless Sculpture, 10x10x10 cm (3D printer KABK)
The launch of the first Sputnik in 1957, also called "the start of the space age", had major implications for the ensuing Internet. The fact that mankind wanted to leave the earth seemed to run counter to the retrospective mentality of the artists, who turned to mythology rather than technology. With the advent of digital media, integration is gradually becoming noticeable in society, which, instead of opposing modernism, gives the modern perspective a higher plan through digital means.
At the same time, we see the appearance of art that has transitioned through cyberspace as a 'wormhole', and thereby took a super-, supra-, digi- or megamodern form1. An interactive digital (visual) language is developing globally. Works of art are created in real time, sometimes through social media, sometimes through virtual and augmented reality, sometimes converting physical material to virtual forms and sometimes the reverse. With the advent of 3D printing, sculptures can be made with modern software that never could have been realized before.
Image by Martin Sjardijn: Gerwin de Haan tests the Elements installation with virtual sculpture Blue Cyber, TU Delft
The results of digital media also have clear effects on image development, which can be seen, for instance, in architecture and literature2. Developments in the field of virtual and augmented reality are still young and provide challenging possibilities for aesthetic and visual experiments, especially from a traditional fine arts perspective. These new forms of expression deserve attention from artists with a free mindset, because they allow free investigations, backed by a rich tradition.
However, besides technological advancement, significant cultural developments have also taken place during the digital revolution of the past fifty years. At art schools this has led to a critical attitude towards the modern desire for innovation. Around 1978, 'the exhausted modern artist could no longer keep pace with new technologies, and was left behind in myth and irony', as was concluded by Achille Bonito Oliva in his book Trans-Avantgarde International3, a review of art in the late 1970s and early 1980s. The consequences can still be seen in art academies and especially in the liberal arts departments. The new tools that arrived with the advent of computers, such as the Internet and digitally controlled equipment, are available at art academies, but their application is still geared towards industrial design — the fine arts departments still focus mainly on traditional techniques. Only social media seem to attract the attention of fine arts students as a contemporary means of communication. In 'Not Just for the Boutique: Art schools and digital everyday culture', Florian Cramer reports on the use of digital equipment and software programs among 350 third-year art students in the Netherlands and observes: "Conversely, art schools are now mostly chosen by students who love manual craft such as drawing and hand making of tangible products. Often, for example in zine- and print making, this is motivated by cultural opposition to electronic media"4.
In line with this, the exhibition "Simply painting", which was shown at the Gemeentemuseum The Hague from 10 March till 17 June 2012, seemed to be a protest against the ever-present digital media and its profound cultural consequences. Not that there is anything wrong with handmade art — on the contrary, it is an essential cog in the fine arts machine, and the closest one to mankind — but computers have penetrated so deeply into our society, partly due to rapid miniaturization, that research and experimentation from the aesthetically oriented field of fine arts is of paramount importance. This research is currently being conducted by academics with technical backgrounds and at academies of art, mostly in the photography, film and applied art departments.
During my research into zero-gravity in sculpture at the University of Technology Delft, I was much more impressed by their interactive technical visualizations than by those I found at academies of art. Opportunities at art academies abound in the form of knowledge, software and available equipment. In addition to traditional subjects, a wider knowledge base should be offered; digital literacy courses should be offered as part of the liberal arts, to allow students to understand the artistic possibilities these techniques present. Cooperation with Universities of Technology provides highly interesting results, as seen from the AR Lab of the Royal Academy. The artist, born from romanticism, is now both an artist and an explorer, and is able to poeticize widely introduced and socially accepted technology in dialogue with technically oriented scientists.
1 Introduction to digimodernism.
2 Anabela Sarmento, Issues of Human Computer Interaction, IRM Press, 2005.
3 Achille Bonito Oliva, Trans-Avantgarde International, a review of art in the late 1970s and early 1980s, published 1982 by Giancarlo Politi Editore.
4 Florian Cramer, 'Not Just for the Boutique: Art schools and digital everyday culture'.
Martin Sjardijn
Martin Sjardijn was born in The Hague, The Netherlands. He studied Fine Arts at the Royal Academy of Art and, for some years, Cultural Sciences and Philosophy. After painting for many years, he started the Weightless Sculpture Project in 1985. In 1990 he began to work with virtual reality, using, for example, head-mounted displays and data gloves with tactile feedback. Since 1998 he has been developing an art and educational project using interactive 3D technology in collaboration with the Groninger Museum in the Netherlands. His latest concept is called ArtSpaceLab, which contains a virtual exhibition, a database and a proposal for an art project inside the International Space Station. The virtual Coop Himmelb(l)au Pavilion gives access to a database containing a big collection of free software and (video) tutorials as used by Sjardijn. At the Royal Academy of Art he teaches virtual modeling for autonomous art.
Cora Beijersbergen van Henegouwen
Who owns the space? A legal issue on the use of markerless Augmented Reality Yolande Kolstee Using the right AR app, we can see virtual commercial and cultural information and even virtual art placed all around us. But what if we dislike the content made visible by the app? Who owns the space that surrounds us? Can one object to any of this information placed in AR space? What if the information is false? To be aware of a virtual layer that is placed on top of physical space is one thing, but to be able to erase or correct the virtual layer is another.
When using visible markers, such as AR or QR codes, there is at least a chance to notice that virtual information might be available. A marker might even work as a warning. The use of visible markers (by an invisible entity) can even be found in one of the most terrible stories in the bible, Exodus 12:7–13, where God commanded Moses to inform all the Israelites to mark their doorposts with lamb's blood, by which the Lord will pass over them, sparing all the Israelite first-borns: they will not "suffer the destroyer to come into your houses and smite you".
Although AR markers are still in use, in recent years they haven't been required for every AR application. Nowadays, AR often uses spatial information derived from the worldwide satellite-based global positioning system. Using GPS (and a compass) it is possible to post additional location-based information all around us — even inside our own house — without us even noticing. With smartphones and apps such as Layar, we are able to see location-based information around us. For example, we can see who tweets around us. And we can even get directions to reach the tweeter via an app like "Maps".
AR artist Sander Veenhof's works serve as a good example of AR art that uses location-based information. Without the board of MoMA in New York knowing, he placed virtual art in the museum and even added a virtual 7th floor. He did not only augment the MoMA, but also the Pentagon and the White House. Through this art invasion, he showed clearly that there are no physical borders anymore (see http://www.sndrv.nl/moma/).
Other, less playful uses of AR stand in stark contrast to Sander's artistic use of AR. In a column, Lester Madden discusses information that might be useful to burglars. There are more critical comments on the application of AR, as we can see in the blog of Brian Wassom, a commercial litigator, who discusses legal issues related to AR. He describes his first legal issue at http://www.wassom.com/:
"It's on. For real, this time. A newly filed legal complaint raises non-imaginary (although certainly still-untested) legal theories concerning an actual, commercial use of augmented reality. AR litigation is now a cold, hard reality. And the result of this initial salvo could have a huge impact on AR campaigns across the board. On October 19, 2011, four consumer advocacy groups (the Center for Digital Democracy, Consumer Action, Consumer Watchdog, and The Praxis Project, who I'll refer to collectively as "CDD") filed a Complaint and Request for Investigation with the Federal Trade Commission (FTC) against PepsiCo and its subsidiary, Frito-Lay. The ultimate point of the Complaint is to argue that Frito-Lay's campaign deceives teens into eating too many unhealthy snacks, thus contributing to the childhood obesity problem."
In AR space, we can add information on political preferences, for example, or information on sexual habits. This is a really serious topic, because we don't have a system yet to track ownership of this 'space' around us, we don't have to give our personal approval, and many of us are unaware of the virtual information which is placed in the air around us. That is why we will pay close attention to this topic in the next issue of AR[t].
AR can be used to display information that might be useful to burglars.
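The geometry behind such location-based layers can be sketched in a few lines: an app knows the phone's GPS position and compass heading, and shows a point of interest when it is both nearby and inside the camera's horizontal field of view. The sketch below is purely illustrative; the function names, the 500-metre range and the 60-degree field of view are assumptions, not how Layar or any other app actually works.

```python
# Illustrative sketch of GPS + compass filtering for a location-based AR layer.
# All names and thresholds are assumptions for the sake of the example.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in 0..360 degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def visible(user, heading, poi, max_m=500, fov=60):
    """Show `poi` only when it is within range and inside the field of view."""
    if distance_m(*user, *poi) > max_m:
        return False
    # signed angle between compass heading and bearing to the POI, -180..180
    off = (bearing_deg(*user, *poi) - heading + 180) % 360 - 180
    return abs(off) <= fov / 2

# e.g. a phone pointing roughly north (heading 10 degrees) sees a POI
# about 111 m due north of it:
# visible((52.0, 4.3), 10.0, (52.001, 4.3)) -> True
```

A real app does this for every point in its database on every sensor update, then projects the surviving points onto the camera image; the filtering step above is the part that decides "who owns" the space you are looking at.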
Chasing virtual spooks, losing real weight
Augmented running and a side trip into the history of audio augmented reality
Hanna Schraffenberger
A strange voice tells me to run. My heartbeat rises as I follow the instructions without giving them a second thought. The voice's manner of speaking reminds me of my parents' TomTom. The only difference: instead of telling me to take a turn, I am instructed to accelerate, slow down, to run or — if I am lucky — to walk. I am running with my new mobile app and virtual trainer. The app tracks every move, knows when my heartbeat rises, and is supposed to help me gain speed and lose weight.
Today, I run to clear my head after a mentally exhausting but physically unchallenging day. However, trying to catch my breath, my thoughts return to work. More precisely, I pore over my research topic, non-visual augmented reality.
In augmented reality (AR), virtual content is added to our real environment. Most often, this happens visually. By now, probably all of us have seen some three-dimensional objects popping up upon designated markers, virtual pink bunnies above augmented cereal boxes or walking directions superimposed on real streets. However, AR does not have to be visual. Sound, in particular, has already brought forth some fascinating AR applications and artworks such as Edwin van der Heide's Radioscape1 and Theo Watson's Audio Space2. Entering the latter, visitors can hear the sounds left by previous visitors, spatialized, as if they were actually still there. At the same time, they can leave their own audio messages at any point within a room. It is not just the fact that the physical space is augmented with the ghost-like presence of previous visitors that makes me term it AR. What convinces me is that visitors can relate their own sounds and messages to those left earlier by others, thereby establishing connections between the virtual and the real. I imagine walkers, cyclists and other runners leaving their sound-trails behind on the road, leaving it up to me to add my own sounds and follow their steps, which are spread across time and space.
My favorite mobile app, RjDj, can also be considered AR sound art. The app remixes the sounds of the surroundings and provides you with a soundtrack to your life that blends in, makes use of and accompanies your environment. Although it is certainly no typical AR application, the relation between the sounds of the real environment and those produced by the app is so strong that often, they seem to melt into a single soundscape.
I will have to try this app while running. I can already hear the sound of my steps on the asphalt evolving, blending into a rhythmical soundscape, slowly displaced by the wind of heavy breathing, interrupted by pitched variations of my sudden greetings whenever I meet another runner.
While RjDj and successor apps like Inception and Dimensions3 are a rather recent phenomenon, the idea of remixing the sonic environment is not new. The artist Akitsugu Maebayashi has worked with similar concepts for a long time. His portable Sonic Interface4 was built in 1999 — years before mobile phones gained comparable sound-processing abilities. The custom-built device consists of a laptop, headphones and microphones, and uses delays, overlapping repetitions and distortions in order to recompose ambient sounds in urban space. The resulting soundscapes break the usual synchronicity between what one hears and what one sees. Unsurprisingly, Maebayashi is not the only one who has been exploring sound-based augmentations of the environment early on. In fact, audio augmentations of our environment have quite a history of their own. Unfortunately, they are less known in the context of AR and are often not even considered to be part of its history.
"Walk!", my virtual trainer gives in to my exhaustion and I slow down. However, my thoughts keep racing. Quickly, they approach the early 1990s: Tom Caudell is believed to have coined the term Augmented Reality. It describes a head-worn display that superimposes visual information onto real objects5. In Caudell's case, the new AR system helps workers assemble cables into an aircraft at Boeing. What usually goes unnoticed is that around the same time, Janet Cardiff started recording her so-called audio walks. Those walks are designed for a certain walking route and confront the listener with instructions such as "Go towards the brownish green garbage can. Then there's a trail off to your right. Take the trail, it's overgrown a bit. There's an eaten-out dead tree. Looks like ants."6 While the listener navigates the space, he gets to listen to edited mixes of pre-recorded sounds, which blend in with the present sounds of the environment. Cardiff's virtual recorded soundscapes mimic the real physical one "in order to create a new world as a seamless combination of the two."7 By superimposing an additional virtual world onto our existing one, and thereby creating a new, mixed reality, Cardiff's sound art explores one of the key concepts of AR. And Cardiff is not alone with this idea; as early as 1987, Cilia Erens introduced sound walks, soundscapes and sound panoramas in the Netherlands. In contrast to Cardiff, she forgoes spoken content and uses largely unmixed everyday sounds. Yet, the effect is similar; they create "a new reality within existing realms, a form of 'augmented reality'."8 Clearly, the developments in non-visual AR were in no way inferior to the development of its visual counterpart.
Taking slow steps, I imagine being on such a walk right now… Listening to instructions on which route to take, where to look, superimposed footsteps here, sounds recorded there, on this path earlier, maybe altered with special effects. I imagine those sounds mixing in with the naturally present sounds of the river, bikes, and the occasional mopeds passing by. "Run!", my trainer, whom I decide to call Tom, puts an abrupt end to this walk. The fact that AR sound art like Cardiff's and Erens' walks are not usually mentioned in the context of AR leaves me wondering what else we might have missed.
After Tom's instruction, my music fades back in. The song is intended to get me to run even faster. After my footsteps have adapted to the new rhythm, it hits me: these instructions about how fast to run, the information about my heart rate, distance covered and calories burned, and options such as racing against a virtual running partner in real physical space — this is just like AR. In fact, my virtual running trainer shares most of the characteristics commonly found in AR applications. It adds another layer of content to my running. It is interactive and operates in real-time. Just like many other GPS-based AR applications, it reacts to my position in the world. Most importantly, Tom fulfills my own, personal requirements for an AR experience: something is added (the instructions), something is augmented (the running), and most importantly, there is a relationship between the two.
When another runner passes me slowly, my heart rate drops. I wonder whether it might be his heart rate that is mistakenly reported back to me. I am astonished that, without the sensor's help, I cannot even accurately perceive such basic and vital facts as my very own heart rate. Maybe this is far-fetched, but with respect to that, the running app relates to the kind of AR applications which allow us to perceive things about the world that we normally cannot perceive, such as seeing heat, feeling magnetic fields or hearing ultra-high frequencies.9
So why are virtual Tom and his colleagues not considered to be AR? Perhaps because there are also numerous differences between running apps and common AR applications. To begin with, this running app does not augment the environment. Rather, it augments an activity — my running. And to be honest, despite the fact that Tom follows my every move — chasing a virtual competitor or running with a virtual trainer — it still feels like they are running on my phone while I have to tackle the real road. What is more, location-based AR applications usually display content related to the user's absolute position in the world. Tom, on the other hand, is only interested in the change of my position over time.
"Stop!", apparently my position has changed enough. My run is over. The result: 583 kcal burned, 5 miles run and the revelation that the combination of the virtual and the real encompasses much more than just adding virtual visual objects to the real physical environment. There is a whole field of augmented activities as well! I cannot wait to jam with virtual bands, to try augmented eating or to take an augmented nap. As if to approve, my heart rate makes a last excited jump. Who knows, in the future, Tom might learn from existing AR. He might then have a look at my environment and direct my turns so that I discover new routes, point out sights or, when needed, help me find a shortcut home. Considering current developments in lightweight AR glasses, I guess it cannot be long until we can also see our virtual competitor passing by, are asked to design avatars representing our personal best time in races against other runners and are challenged to chase visual virtual spooks. I would not mind that. And I bet that that is when augmented running will be truly considered to be AR.
References
■ 1 An article about Radioscape by Edwin van der Heide is featured in this magazine on pp. 18-23.
■ 2 Theo Watson, Audio Space (2005).
■ 3 For RjDj, Inception and Dimensions see http://rjdj.me/
■ 4 Akitsugu Maebayashi, Sonic Interface, …m8/installation.html and http://www.
■ 5 T. P. Caudell and D. W. Mizell, "Augmented Reality: An Application of Heads-Up Display Technology to Manual Manufacturing Processes", Proceedings of 1992 IEEE Hawaii International Conference on Systems Sciences, 1992, pp.
■ 6 Janet Cardiff, Forest Walk (1991).
■ 7 Janet Cardiff, Introduction to the Audio Walks, http://www.cardiffmiller.
■ 8 Cilia Erens, The Audible Space.
■ 9 The course 'Perceptualization', which is taught by Edwin van der Heide and Maarten Lamers as part of the MSc Media Technology at Leiden University, discusses such translations of information to our human modalities. See …tion/ and http://www.maartenlamers.com/PZ/
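The recipe described for Sonic Interface (delays, overlapping repetitions and distortion applied to ambient sound) can be approximated in a few lines. The sketch below is purely illustrative: the actual device's latencies and processing are not documented here, so the delay time, feedback factor, repeat count and the soft-clipping stand-in for distortion are all assumptions.

```python
# Toy "remix the environment" effect in the spirit of delays and
# overlapping repetitions; every parameter here is invented for the example.
import numpy as np

def remix(signal, sample_rate=44100, delay_s=0.3, feedback=0.5, repeats=3):
    """Overlay progressively quieter, delayed copies of `signal` onto itself,
    then soft-clip the sum as a crude stand-in for distortion."""
    delay = int(delay_s * sample_rate)
    # room for the original plus the tails of the delayed copies
    out = np.concatenate([signal.astype(float), np.zeros(delay * repeats)])
    for i in range(1, repeats + 1):
        out[i * delay : i * delay + len(signal)] += (feedback ** i) * signal
    return np.tanh(out)  # keeps the result strictly inside (-1, 1)

# A one-second 440 Hz test tone stands in for a microphone signal.
tone = 0.5 * np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
wet = remix(tone)
```

Fed with a live microphone buffer instead of a test tone, the same few operations already "break the usual synchronicity between what one hears and what one sees", which is exactly the effect the article ascribes to the original device.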
Augmented Reality: A Story
Dirk Vis
Augmented reality (AR) is used for many literary purposes: secret pages can contain additional information that is only visible on smartphones; promotional texts can be linked to gable stones and bus stops; a book can be scattered all over the world virtually; et cetera. However, AR can also be used as a medium to tell a story. An AR story consists of separate parts that each tell a story of their own, which are integrated into a wider story arc and which interact with the physical reality of the reader. The text below is an exploration of the narrative possibilities of AR.
All of your decisions are stored: all of your purchases, your travel destinations and your doubts are recorded. Not only far away on Google's server farms, but also nearby, where they can be seen best: in your face.
Mike writes at night. He is working on the code and the interface for an application. He writes non-stop. He is making an application that calculates how your face will look when you get older. He has forgotten whether it was his idea or Ada's. They invented it as a joke, just as they came up with everything else. But this night, in a single night, he will write the code that will earn him money.
The afternoon before, Ada and he were standing by his door. She was on the sidewalk; he on the doorstep. In her hands she held her sketchbook. It was the last thing she still had to collect, the last thing left of her in his house.
- "People are scared of the idea", he said, "— you are too — that everything is recorded by Google, by cameras, but they forget — you forget — that every misstep is already stored in the place where it hurts the most: in your face. I'm going to build that application: the future face one."
- "You're crazy", she said. "There's a reason nobody can see how we will look when we are older. True youth cannot know what it means to be old."
Reading is a complex task, but the reality of it is simple: the reader sees a text on paper, in clay or in pixels, and reads. Imagine if the text had eyes and scanned the reader back as he read. The text records the movements of the reader and responds. The reader finds his way through the text with movements of the head. The story is different — and the effect of the story is different — because the position, the facial features and the expressions of the readers differ.
Mike tests his work on himself that night. The program scans his face. The color and texture of his skin are linked to his personal data from the cloud. His medical records, credit card information and favourite music are combined, supplemented and extrapolated with the facial scan.
He writes his code with fervour. He is constructing the interface, which the user will read later. Sometimes he is afraid of what he does; that things might arise that he'd rather not see, much like how Ada sees things that she would rather not. Illusions, nocturnal animals and fears that remain invisible to others, she sees. She has started to draw them.
From the porch door he touched Ada's cheek. Probably for the last time, he thought.
- "Your skin registers the amount of wine you drink, cigarettes you smoke, men you use. Cigarettes make your radiance fade. Literally. You may think, my light will dim as I get old anyway, but old women can shine. You say that your genes determine how you look, but you decide whether to smoke, and so you too decide your future face. Whatever you do, even if it's nothing at all, it is engraved in your face."
The city is quiet. Only the taxi drivers and bats are still awake. Mike holds his face in front of the test version of his application. The camera scans the bags under his eyes as he writes: "The system compares colors, patterns and textures with data from others to determine your lifestyle and future face."
Ada was still standing right there on the sidewalk. Mike held his face close to hers.
- "You think you can get away with this."
He saw the skin around her eyes, of which he knew every square millimeter.
- "...but your wrinkles will tell your tale. It will be written plainly on your face for all the world to see, no matter how small it is. In Saudi Arabia adultery carries the death penalty; some tribes paint a purple spot on the face of the adulteress which remains for months. I'm going to make that program that we thought of once, that shows your future face. Somewhere on your face it will be printed that you wanted to be alone, that you could not make the decision yourself, and that you cheated on me. You think you have solved it and got rid of it, but everyone will read it in your eyes, your skin, your nose: 'do not trust her'."
Ada and Mike stared silently at each other. He touched the tip of her nose with his forefinger. That was his favourite place. She had asked him once to draw lines of altitude on her face, like they are on a map. Her nose was the highest mountain. The lines gave a cross-sectional view that made her face look like a layer cake.
- "I know you hate my programming. You think I make gadgets that contribute nothing to society, but imagine if you could see your future face, the effects of a night out, if you could be your own Tamagotchi, shape your own true face. Plastic surgery will shrivel by comparison."
Mike lets the program scan a height map of his face, in order that a virtual 3D model can be made. He continues his work, inflamed. The light is returning outside. Based on his personal data from the Internet, the program calculates how his face will look in five, ten or twenty years. His nose is finished first. Noses never stop growing, as if they record all the smells of a lifetime and all the faces that came close. The program creates a virtual model and, as Mike looks into the camera, it superimposes the computer-generated future face over his own. It's as if he's talking to a 60-year-old Mike.
- "Finish your application," one Mike says to the other. He says it himself, but it looks as if his elder counterpart is speaking. "Finish your program, show Ada and then go outside."
Dirk Vis
Dirk Vis (1981) studied Beeld & Taal (Image & Language) at the Rietveld Academy and Design at the Sandberg Institute. He published Bestseller (2009) in limited edition, made electronic poems with K. Michel, is editor of De Gids and teaches Interactive Media at the Royal Academy of Fine Arts. www.dirkvis.net
Unspecialize! The more you know the less you see
Image by Mareike Bode
A portrait of Daniel Disselkoen
Hanna Schraffenberger
Image by Daniel Disselkoen
I’m looking out of the window of a tram. Daniel Disselkoen, the graduate from the royal academy of art I’m about to meet, lives in the Statenkwartier ― a part of The Hague I don’t know yet. It’s a nice district and although there is nothing spectacular about this neighborhood, I enjoy the view of
During my studies I took the same tram to art academy on an almost daily basis. on this ride there simply was nothing new to see and hence no motivation to look outside. I realized how easily it becomes boring when you live in a place you know so well.
streets I haven’t seen before. a few minutes later I meet Daniel in a small café
Leaving the familiar behind, Daniel spends several
and he tells me more about this area. He grew up
months abroad. First in america, at the Minne-
in this neighborhood. after studying law and philo-
apolis College of art and Design, then in Japan,
sophy in Groningen for four years, he moved back
doing an internship at the advertisement agency
to the very same street he was born in. That was
four years ago and marked the beginning of his graphic design studies at the art academy. Just recently he graduated. It is his knowledge of the city that has motivated his search for new perspectives and inspired several of his works.
When you don’t know a place you look around and spot all those new and interesting things. But the better you get to know the area the less you look around. While commuting in Japan I 55
paid a lot of attention to my surroundings. I realized that there are three main groups of passengers: those staring at their phone, those who read, and those who sleep or look down. It’s a general phenomenon that when people know the area they don’t look outside. Upon returning home, Daniel decides to take matters into his own hands and sets out to make his regular route to the academy interesting again.
The online world calls the game the real-world version of Pac-Man. However, the arcade game was not what inspired Daniel.
I used to play this game with dots on the window. I think there are quite a lot of people who played a game like this when they were kids. Those people enjoy rediscovering it now. Others like it because it is entirely new to them.
His story is convincing. Indeed, Man-eater makes me relive my own rides to school, pretending the bird poop on the window was Super Mario who had to jump from passing car to car. I like the beautiful and unexpected update to the childhood game and appreciate that Daniel turned it into a visually appealing version. However, reliving childhood memories wasn't Daniel's original intention with the game. His goal is to make people notice the outside world again.

The result is Man-eater, the simplest augmented reality game I've ever seen. Without the use of phones, headsets or computers, merely relying on two stickers, the game adds an additional layer on top of our view, changes our focus and allows us to experience our well-known surroundings in a new way.

The first sticker shows a little monster, the so-called Man-eater. It is placed on the window of the tram. The second sticker is a manual, placed on the headrest right in front of the potential player. The manual is hard to overlook and states the four simple rules:
1. Close one eye and look to the right.
2. With the Man-eater, eat as many heads of pedestrians as possible.
3. Time to play: between two stops of the tram.
4. If you haven't eaten enough heads, start again at the next stop.

The first level challenges you to eat at least 3 heads; the third level asks for 12. Daniel is aware that this won't always be possible.
Sometimes the circumstances outside make the game completely unplayable with those rules. For example, if there is simply no one outside then you can’t play, or if there are far too many people it will be too easy. But I think the simplicity is also the beauty of the game.
Traveling by tram, I just think this is a great moment for looking around and experiencing your environment in a new way. With the game, I want to provide a fresh perspective, give them a new experience of the city they know so well.

Contrary to what one might think, Daniel doesn't think that people always have to look around.
I don’t necessarily think it is bad that people don’t look outside. I enjoy those rides where you do nothing at all. The last thing I want is to force commuters into another thing they have to do while they travel. I think in the tram or in a train your mind can be satisfied with the fact that you are traveling. So you are free to do nothing at all, your mind can reverberate, you can just let your thoughts travel as well. But for those that don’t look around anymore because they don’t expect something interesting to see, I made the Man-eater to let them explore the familiar in a new way.
Image by Daniel Disselkoen
Besides the fact that he doesn’t want to push people to ‘do something’ in the tram, Daniel has more reservations about whether or not to place the stickers.
Very often I don't like street art like graffiti and stickers. Too many times it's just somebody tagging his name. I think when you push your work in public places, you have to be mindful of your audience.
Daniel's unobtrusive solution: in case the stickers are not appreciated or even considered vandalism, they can easily be removed due to his use of removable stickers.

However, not all aspects of the project's realization are so commendable. In order to test how his project is perceived and get the natural reaction of those playing the game, he filmed the tram passengers sitting next to the sticker without them knowing about it.

I wanted to know how people react by shooting first, and ask for permission to use the footage afterwards. At first, it seemed like the passengers were not playing the game. Before playing, people looked around cautiously to see whether someone was watching them. Only when they didn't feel observed did they feel comfortable playing it. Consequently, I couldn't look at them the whole time. So I just sat there and filmed them with my phone.

The dilemma of whether and how to observe people with or without their knowledge is also well known in science. Once you tell somebody that you are observing him, he might behave differently. If you don't tell, there are ethical considerations to take into account. Daniel's approach paid off but also gave rise to doubts about his method.

From my short glances, it seemed like they were not playing the game. Only when I checked out the video footage, I noticed the little movement of their heads ― they were playing it after all. However, a few people noticed that I was filming them and apparently felt really bad about it. They stopped playing, sat somewhere else or even stepped out of the tram. I felt terrible about it.
His close observations of commuters, his interest in their behavior and reactions to the game not only led to some valuable insights regarding the success of Man-eater, but also inspired his newest project.

Just like I had noticed before in Japan, we also have those different types of commuters: those reading, those looking at their phone or tablet, and those sleeping or looking down. During my observations, I noticed it's the people staring at their phone or tablet who did not play Man-eater at all. They just sat in the tram, looking at their phone, not even noticing a sticker right in front of them. That observation made me want to create something that turns the tablet or phone into a device that makes you look around and take part in your surroundings, rather than isolate yourself from them.

Indeed, his newest project does just that. Or to be precise: it will do just that, as soon as it is published in the App Store. Right now, Daniel is in contact with programmers, who will turn his concept into an actual app. Until then, the details remain a secret. However, from what I could deduce, the upcoming app will be focused on a similar idea.

I noticed it is not only knowing an area well that makes you perceive less about it. Knowledge in general makes you see your environment in a certain way. I'm a designer and the more I've learned about design, the more I started to perceive the world in terms of design. Of course, I am exaggerating when I say that, looking at the world, I see typefaces, patterns, grids, packages, posters and advertisements. At the same time, I miss other things about the world. For example, I might only see the package and not its content. To generalize and exaggerate a bit more: a biologist might spot plants, an architect might focus on buildings and a psychologist might perceive more about people. What you perceive about the world is influenced by what you are specialized in. My app is intended to free people from their specialized view of the world and will provide them with another way to look at it.

To achieve this, the app will basically use everything a phone or tablet has to offer: it will make use of the modern phone's computational capabilities and mobility, it will use the camera as well as information about the user's position, make use of the Internet connection, use the fact that people always have it with them and incorporate social aspects. Hearing about technical elements such as spatial awareness and the use of the camera, I expect an AR application. However, this much is for sure: Daniel did not start out with the intent to make an AR app.

I don't have a strong opinion about what AR is. I don't have a fixed definition of it. For me, AR is a bit like labels or branding in advertisement. It's something you put on top, and suddenly, people experience a product in a different way. I would never start out thinking 'I should make something with Augmented Reality', and then come up with a project. But in retrospect, I can say that the Man-eater can be seen as an Augmented Reality project. I don't know whether the new project will fit your definition of AR, but it will definitely let people perceive reality differently.

The underlying question I asked myself is "Can we change the way we look at our environment and how we experience it?" With Man-eater, I have shown it is possible. With the new app, I take the concept of providing a new experience of the surroundings even further. This game can be played everywhere. I would really like it if people started using it in their room, continued outside, looked around, and then were driven to also discover different places.

I am curious how he intends to alter my view of the world this time.

Image by Daniel Disselkoen
In contrast to Man-eater, which manages to augment the outside with as little as two stickers, this app sounds rather technical. I wonder to what degree technology served as an inspiration.

Of course, you do need to know something about the material you are working with. But it's only because I realized that people don't look around anymore when they are busy with their phones and tablets, that I thought it was an interesting subject and medium to use. I wanted to know more about it, so I got a tablet. Knowing its functionality and possibilities allowed me to come up with the final idea. But the inspiration was not the technology, but my observation of people being completely immersed in it. For me technology is not a target, but a tool. I think when you know too much about something technical, you don't come up with ideas that create something new. You then might come up with a technical improvement or technical innovation or something like that. If you know everything about one subject, it becomes harder and harder to see the big picture and to come up with a new idea.

Image by Daniel Disselkoen

Ordering our second round of coffees, I have a pretty clear idea that with his work, Daniel wants to create something new and intends to confront us with another perspective on the world ― an ambition he shares with many artists. However, given all our talking about apps, games, travelling and trams, I am not sure yet whether he considers his games works of art.
I would call it 'applied arts'. I like it if what I do also has a specific function which is not abstract. Of course the work and shape can be abstract. But I like it when I can see if it works. If people look outside, play the Man-eater and enjoy it, it works. This concrete functionality is an important aspect of my work.

Finishing our drinks, we talk about his current work: right now he is designing a crazy bridge, which presumably will never be built. We talk about his other works, including documentaries about Mall-walkers and 47-second long interviews
held in the middle of Japan's busiest intersection. We talk about the commercial potential of his app, and calculate the hours needed to realize it. Quickly, the time has come to pay for our coffees. Usually, at this point in an interview, I ask my dialog partners a last question about their plans for the future. I already know that Daniel will be working hard on getting his app done. I ask about his plans anyway.
I think I might want to work in advertising. It is one of the little areas where you can come up with an idea and it doesn't matter which medium you use. You can adjust the medium to your idea. Ideally, I want to first come up with an idea, then choose the suitable medium and then show it to a client. A good agency where you can work like that ― that's a place where I want to work.

I thank Daniel for the interview. On my way back home, there's no Man-eater to keep me company. The effect of seeing something new is wearing off and my interest in looking outside is fading quickly. So I take a look around inside the tram. Daniel's right: there are the ones with the phones, the readers and the sleepers. But there are also quite a few people gazing out the window. Just when I get suspicious, I notice the way they pronounce Scheveningen. There's no doubt about it: they are German tourists. I'm not sure whether locals really lost their interest in the outside world. However, I have to yield a point to Daniel's observations. What we perceive is shaped by what we know. Knowing German makes me spot Germans on the tram. What I have learned about Daniel is most probably similarly shaped by my knowledge and specialization. Given my own background in creative science, I see the scientist in Daniel: a young observer, driven by the question of how we can change what we perceive about the world. I see his games as a series of experiments. And if his app will let me perceive things differently, I can't wait to take part and try it out myself.

Daniel's website: www.danieldisselkoen.nl

Image by Daniel Disselkoen
Augmented Prototyping: Augmented Reality to support the design process

Jouke Verlinden

In the last decade, new augmented reality techniques emerged to visualize and interact with physical shapes and related artifact knowledge. This enables so-called Augmented Prototyping approaches that combine physical models with principles of projection or video mixing to establish a dynamic prototype at a relatively low cost. For example, Nam and Lee documented the prototyping setup for an interactive tour guide: a physical mockup was equipped with microswitches to control a computer simulation, while the output is merged either by a see-through Head Mounted Display (see Figure 1, left) or by projector (Figure 1, right). In both cases, physical cues such as proportions, key layout and grasping control can be combined with alternate screen graphics and workflow.

Figure 1. Two Augmented Prototyping scenarios (Nam and Lee, 2003).

Interactive Augmented Prototyping (IAP) requires imaging technologies (output) and sensor technologies (input) to establish an interactive spatial experience. Furthermore, AP technology embraces existing physical prototyping methods and can include virtual prototyping/simulation tools. In comparison to traditional physical prototyping, this combination of the physical and digital realms offers a highly dynamic model that can cover engineering and aesthetic aspects simultaneously. The physical interaction allows a tactile and haptic dialogue between participants and the artefact model. This article draws upon my doctoral research, in which I apply AR techniques to the field of industrial design. It illustrates the benefits and systems by summarizing a number of installations found in academia. It will then elaborate on the investigation of what the true impact of such techniques could be in practice.

IAP as design support in literature

Since the inception of AR, speculative design support scenarios for IAP have been devised. For example, Bimber et al. (2001) predicted five application scenarios without much detail: augmented design review, hybrid assembling, hybrid modelling/sketching, visual inspection of moulded parts and hybrid ergonomic design. We surveyed the existing IAP applications, which cover a mixture of domains and employ various display types (head-mounted displays, video mixing or see-through, and so forth). A detailed characterisation of these IAP systems can be found in Table 1, including domain, objective, and interactivity. In terms of design domains, the applications cover information appliances, automotive, architecture and factory planning, while some systems propose a general-purpose IAP system.
Publication | Design objective | Interaction | AR display type
Klinker et al. (2002) "Fata Morgana" | Presentation of concept cars | Virtual object on turntable, user moves around | HMD video mixing
Cheok et al. (2002) | Generating curves and surfaces | Index finger is tracked, creation of control points in air | HMD video mixing
Fiorentino et al. (2002) "Spacedesign" | Free-form surface modeling, inclusion of physical models | | HMD video mixing
Bimber et al. (2001) "Augmented Engineering" | Supporting sketching on mirror, grasping physical objects | |
Bandyopadhyay et al. (2002) "Dynamic shader lamps" | Painting on physical objects | Moving object and paintbrush, selecting color from virtual palette | Fixed multiple projectors (projector-based AR)
 | Exploring component features on CNC or clay models | Moving object on turntable, change texture/paint by menu |
 | 3D sketching and RP | |
Rauterberg et al. (1998) "BUILD-IT" | Factory planning: layout check, collaborative reviews | Moving objects in 2D plane |
Underkoffler and Ishii (1999) "URP" | Interactive simulation for urban architecture: reflections/shadows/wind | Moving objects in 2D plane |
Fründ et al. (2003) | Support automotive modeling | Moving components in 3D space (rotation, translation, scaling) | Video mixing HMD
 | Nightclub layout with pedestrian flow simulation | Small-scale physical objects |
 | Kitchen layout, full scale | Full-scale cardboard cabinets and voice control | Multiple fixed projectors
Nam and Lee (2003) | Simulating information appliances: usability assessment of a digital tour guide | Operating the switches while grasping the object | Video mixing HMD and fixed projector
 | Simulating information appliances: dialogue definition and evaluation | Sketching screens and screen transitions, operating the switches while grasping the object | Embedded screen and fixed projector
 | Simulating information appliances: usability assessment of a remote control | |
 | Simulating information appliances: handheld voice recorder | | Handheld mockup with projection

Table 1: Topical overview of Augmented Prototyping research.
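Several of the systems in Table 1 recover the pose of a marked object from camera images (e.g. via ARToolKit). For a planar marker, the underlying math is a homography estimated from the marker's corner points, then decomposed into rotation and translation. The sketch below is a generic illustration of that decomposition, not the ARToolKit implementation; all names and the camera matrix are mine:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: homography mapping src (N,2) to dst (N,2), N >= 4."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        rows.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Recover rotation R and translation t of a planar (z=0) marker.

    For marker points on the z=0 plane, the image homography satisfies
    H ~ K [r1 r2 t], so stripping the intrinsics K and renormalizing
    yields the first two rotation columns and the translation.
    """
    M = np.linalg.inv(K) @ H
    if M[2, 2] < 0:                    # keep the marker in front of the camera
        M = -M
    scale = (np.linalg.norm(M[:, 0]) + np.linalg.norm(M[:, 1])) / 2.0
    M = M / scale
    r1, r2, t = M[:, 0], M[:, 1], M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)        # snap to the nearest rotation matrix
    return U @ Vt, t
```

Given the four detected corners of a marker of known size, these two steps yield the object's pose, which is what lets the systems above align overlays with a handheld mockup.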
Five different scenarios were identified by inspecting the design activities: presentation, geometric modeling, interactive painting, layout design and simulating information appliances. These are discussed in the following subsections.
Presentation

The main takeaway of most AR systems is the fact that product visualizations can be shared with other stakeholders in the design process (e.g. higher management or potential users). Such presentation systems have been specifically devised in automotive design. For example, Klinker et al. (2002) investigated the presentation of (virtual) concept cars in a typical showroom by observing the behavior of designers and by presenting some proof-of-concept examples. The resulting system is shown in Figure 2.

Figure 2. Example of a presentation system (Klinker, 2002).

Instead of using head-mounted displays, projector-based AR systems have also been documented that employ mockups and projectors, for example the system shown in Figure 3. This setup is on demonstration at the High-Tech Automotive Campus in Helmond, the Netherlands, and comprises a full-scale model of a racing car and three video projections that are carefully aligned. In both cases described above, the interaction is passive, merely to inspect designs that are modeled in separate applications.

Figure 3. Full-scale car mockup with projection (HTAS, 2009).

Interactive painting

Interactive painting systems such as dynamic shader lamps indicate the advantages of digital drawing on physical objects (Bandyopadhyay et al., 2001). Based on Raskar's shader lamps technique, a white object is illuminated by a collection of video projectors from different angles. A tracked wand acts as a drawing tool. When in contact with the object's surface, strokes are captured and rendered in an airbrush effect. As it copies natural drawing on objects, this establishes an easy-to-use interface that has been positively evaluated by kids and graphic artists. A restriction of such interactive painting systems is that the shape of the physical object cannot differ from that of the virtual object; the haptic and visual display of the virtual object would otherwise be misaligned with the physical object.

Figure 4. Dynamic shader lamps system in use (Bandyopadhyay et al., 2001).

More intricate progress in software techniques allows sketching on arbitrary surfaces from various distances (Cao et al., 2006). Specific applications have been developed for customizing ceramic plates, cf. Figure 5.
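The airbrush-style stroke rendering of such painting systems can be sketched as a soft dab blended into a texture at each tracked contact point. The falloff model and parameter names below are my own illustration, not the cited system's implementation:

```python
import numpy as np

def airbrush_stamp(texture, cx, cy, radius, color, opacity=0.5):
    """Blend one soft 'airbrush' dab into an RGB float texture, in place.

    A stroke captured from a tracked wand is a sequence of contact
    points; stamping each point in order produces the airbrushed line.
    """
    h, w, _ = texture.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sigma = radius / 2.0
    # Gaussian falloff centred on the stroke sample
    alpha = opacity * np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
    texture += alpha[..., None] * (np.asarray(color, float) - texture)
    return texture

# Stamp a short red stroke onto a blank 64x64 texture
tex = np.zeros((64, 64, 3))
for px in range(20, 40, 2):
    airbrush_stamp(tex, px, 32, radius=5, color=(1.0, 0.0, 0.0))
```

In a projector-based setup, the resulting texture would then be warped onto the tracked physical object each frame, which is what keeps the paint "attached" to the surface as it moves.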
Geometric modelling

Some geometric modelling tools that originated from VR were adapted to AR. For example, Cheok et al. (2002) presented a see-through AR system that tracks the index finger by ARToolKit (Kato and Billinghurst, 1999). The user can generate curves and surfaces that float in the air. As opposed to regular VR, this enables awareness of phenomenological space, which is relevant for most product design activities. However, the system provides no tactile feedback, as no physical objects are included. Furthermore, interaction is difficult to scale up to multiple users at a single location, as the movement envelopes of such tracking systems are small. A similar system was presented by Fiorentino et al. (2002), who adapted their free-form VR modelling application to work with see-through display technologies and infrared 3D tracking. Again, interaction takes place in mid-air and although physical objects can be included, these are only used to project texture maps. In contrast, an augmented modelling system developed at a German car manufacturer deployed virtual components on a physical global shape (Fründ et al., 2003). Modelling operations were limited to component placement (translation, orientation, scaling), while a Pinch Glove supported the dialogue.

Figure 5. blueBrush (van den Berg, 2006).

Layout design

Layout design systems like URP (Underkoffler and Ishii, 1999) and BUILD-IT (Rauterberg et al., 1998) offer a number of fixed physical components that can be reconfigured on a planar surface. In URP, the components represent buildings; the augmentation focuses on the simulation of light reflection, shadows and wind, and simulation results such as flow fields are directly projected in 2D on the components themselves, cf. Figure 6. The BUILD-IT system supports the layout of assembly lines in a similar way; simple data are projected on top of the blocks while a large view on the wall shows the resulting manufacturing plant in a 3D perspective. Both systems exhibit the potential of using physical design components as user interfaces: the parts are managed by direct manipulation and a design can be reconfigured by multiple hands/users simultaneously. Furthermore, the light reflection and other simulation modules' support in combination with this tangible interface shows the combination of physical spatial reasoning and computational simulation.

Figure 6. URP system displaying an interactive wind simulation (Underkoffler & Ishii, 1999).

Simulating information appliances

Examples of this type of design support are primarily proposed for the design of information appliances, referring to consumer products that include electronics, e.g., mobile phones, MP3 players, etc. In this case, augmentation is used to overlay graphics or other types of visual feedback on a physical model and to simulate navigational behaviour. Much emphasis is put on measuring button interaction to assess the usability of a design by capturing and time-stamping click events. As presented in the introduction of this article, Nam and Lee documented the design evaluation of a hand-held information appliance. They used ARToolKit to track the position and orientation of the object; the optical marker is attached to the back of the object. In this case, a video-based AR HMD and a projector-based AR are compared and
evaluated; the projector-based AR display reportedly performs more accurately in the interaction tests. Nam (2005) expanded this simulation to the interaction modelling of both screen and buttons for a hand-held tour guide by the use of state transition graphs. Although the aspect of button interaction is well elaborated, the modelling of shape and its features is limited (e.g., the location of the buttons). Support to track layout modifications is lacking and this has to be performed manually.

Kanai et al. (2007) developed a usability assessment tool to rapidly test interaction with mockups by RFID technology and video projection. The level of interactivity with the prototype is even higher when the user can model and interact with a product simulation: simple tags can be glued on foam mockups and the user dons a glove with a reader.

Figure 7. Thermostat mockup with RFID-based interaction (Kanai et al., 2007).
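The state-transition modelling of screens and buttons, combined with time-stamped click events, can be sketched in a few lines. The screen names and transitions below are invented for illustration; they are not from Nam's tour-guide system:

```python
import time

class ApplianceMockup:
    """Minimal state-transition model of an information appliance's dialogue.

    Every button press is time-stamped, so usability metrics (time between
    clicks, wrong-button counts, dead ends) can be derived afterwards.
    """

    def __init__(self, transitions, start):
        self.transitions = transitions  # {(screen, button): next_screen}
        self.screen = start
        self.log = []                   # [(timestamp, screen, button), ...]

    def press(self, button):
        self.log.append((time.time(), self.screen, button))
        # Buttons with no defined transition leave the dialogue state unchanged
        self.screen = self.transitions.get((self.screen, button), self.screen)
        return self.screen

# Hypothetical dialogue graph for a hand-held tour guide
transitions = {
    ("home", "menu"): "menu",
    ("menu", "ok"):   "map",
    ("map",  "back"): "home",
}
guide = ApplianceMockup(transitions, start="home")
guide.press("menu")
guide.press("ok")
```

In an IAP setup, `press` would be fired by the physical microswitches or RFID-tagged buttons on the mockup, while the current screen is projected or overlaid onto it.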
Characterizing the IAP support scenarios

The aforementioned design support scenarios can be characterized by three activities: browsing, interaction with the artefact's behaviour and alteration of the model. In the table below, a brief summary is shown.
Scenario | Browsing | Interaction with behaviour | Alteration of model
Presentation | View model from all sides, either handheld or fixed to the environment | |
Interactive painting | View model from all sides, either handheld or fixed to the environment | | Adaptation of textures/materials/annotations
Geometric modelling | View model from all sides, either handheld or fixed to the environment | | Creation of new geometry (curves/surfaces)
Layout design | View layout of multiple components | Real-time rendering of environment (simulations) | Change assembly (2D or 3D)
Simulating information appliances | Manual handling (physical mockups) | Digital product interaction |

Table 2. Characterization of IAP design support scenarios from literature.
Assessing the impact of IAP in industry

To assess the possible impact of such systems, we interviewed 13 top design and engineering agencies in the Netherlands (Verlinden et al., 2010). We targeted senior project managers in various domains, ranging from well-known interior and furniture designers to studios in automotive and product design. With each participant, we showed a short demonstration of IAP and held a 90-minute semi-structured interview. We asked their opinion regarding the strengths and weaknesses of IAP. The responses are summarized in Table 3, ordered by the number of participants that expressed these. The most prominent strengths are in line with the envisioned benefits: allowing to explore multiple design variants and to communicate with the client and other stakeholders. On the other hand, primary weaknesses signify the effort needed to realize such prototypes and the quality of the projection.
Strengths | Weaknesses
Strong idea, fills a gap in current design process, different from present methods. | Might take too much effort to realize (labor intensive, complex).
Exploring/presenting many variations rapidly without making it costly. | Quality of the projection vs. regular finished models (e.g. wood finish).
Flexible with fabrics and layout, styling tool. | Physical model is required, which also needs updating when applied in a new situation.
Bring ideas rapidly to the customer. | Only 2.5D (just a "skin").
Might be spectacular, especially with a handheld version. | Requires a lot of knowledge/might be difficult.
Good for acquisition/marketing. | Tactility not the same as end version (interior).
Easy to use. | User can occlude the projection.
 | Cannot support 1:1 scale for large products/setups. Possibly wrong perception of the product by client.

Table 3: Expressed IAP strengths and weaknesses and their occurrences.
Then we asked about the envisioned benefits of IAP for their own design process. The results are depicted in Figure 8. The participants mention as most important benefits: communication with external parties, facilitating the user testing process, facilitating concept development, improving insights, and reducing errors/mistakes. Concerning the method, three main issues need addressing: i) does the variability in a design match the physical–virtual division, ii) what is the trade-off between speed and quality, iii) how does this affect the emancipation of other stakeholders. The first is related to product domain and activity; the fit with some (like website or kitchen design) is less obvious. The second concern touches the need for speed versus the need for credible prototypes, which differs based on customer and studio tradition. The last demarcates the role of the designer in the design process and the power that clients and other stakeholders have in the process to explore alternatives that might be undesirable.

Figure 8: Envisioned benefits of the I/O Pad (n=13).

The results show that almost all design and engineering firms extensively use models during design reviews, and they consider IAP to have the potential to avoid miscommunication. However, they perceive the inclusion of such new technology as time consuming, which might not be worth the risk of adopting such systems.

Augmenting the design discourse

Product design is never a solitary process: often fellow designers are involved in projects, while the act of design requires collaboration with clients, prospective users, marketers, manufacturers, engineers and other experts. It remains difficult to bridge the differences in knowledge, skills and attitudes among the stakeholders. When considering decision making during the design process, physical models can also be regarded as "boundary objects", as interfaces between the stakeholders in transition between design phases (Smulders et al., 2008). These boundary objects encompass both product specifications and argumentation, providing a platform to create shared insight and to freeze the status of a product design for later use. The authors argue that miscommunication and misinterpretation are often related to the characteristics of the boundary objects used, which should bridge the interfaces between design phases and the related discourse domains. There is some evidence that the inclusion of prototypes enhances the performance of collaborative engineering teams (Yang, 2004; McGarry, 2005). Virtual prototypes such as 3D renderings and virtual reality models do provide good insight, yet have challenges in providing a proper perception of context, scale and proportions (Kuutti et al., 2001). Because the end result is typically a physical object, a materialization of the idea is more approachable than technical drawings or specifications, and often more economical to fabricate.

Based on the functions that other IAP systems portray, we hypothesized that a key element of augmentation is not just to enrich a physical mockup with additional product information but to add support for design reviews. Design reviews represent formal discussions between the stakeholders of a design process, and are key in decision making during the design process (Huet et al., 2007). Our initial concept of the IAP design review system was devised to support synchronous, co-located meetings that typically do not employ advanced recording techniques. The interaction concept is shown in Figure 9: it allows hosting presentations and discussions on design alternatives while using handheld and large projector-based AR systems to add information to the models as described in the previous sections.

Figure 9. Interaction concept of IAP as a design review system.

During the presentation, audio and video feeds of all IAP systems are captured, while interaction with the pens and navigation through the presentation are captured as notes; input events are stored as well in the segment index, similar to the Where Were We system (Minneman and Harrison, 1993). The augmented prototyping technologies allow a fusion of audio, video, model changes, and user annotations. This results in a large data warehouse of multimedia indexed by semantic tags (decisions, tool usage, camera switching and the like). These sessions can be inspected later to refine and reflect on the decisions and planned design activities. To our knowledge, this endeavour is the first to capture experiences by recording all channels of augmented reality sessions.

In our current implementation, we have made a handheld system based on a picoprojector, a webcam and a small UMPC tablet (Figure 10), running a customized version of ARToolKit. The AR Cube, the desktop version, is packed in a flightcase equipped with an ultra-short-throw projector, a specialized IR tracking system (Personal Space Technologies) and a large TabletPC running VRmeer, based on OpenSceneGraph. It can be placed on a table and be up and running in less than 10 minutes (Figure 11).

Figure 10. Working prototype of our handheld IAP system.
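A segment index of the kind described above, i.e. time-stamped review events retrievable by semantic tag, can be sketched in a few lines. The class, tag names and payloads below are illustrative, not taken from the actual system:

```python
from collections import defaultdict

class SessionIndex:
    """Store time-stamped design-review events; retrieve them by semantic tag
    (decisions, tool usage, camera switching, and the like)."""

    def __init__(self):
        self.events = []                  # [(timestamp, tags, payload), ...]
        self._by_tag = defaultdict(list)  # tag -> indices into self.events

    def record(self, timestamp, tags, payload):
        index = len(self.events)
        self.events.append((timestamp, frozenset(tags), payload))
        for tag in tags:
            self._by_tag[tag].append(index)

    def segments(self, tag):
        """All events carrying `tag`, in session order."""
        return [self.events[i] for i in self._by_tag[tag]]

# Hypothetical fragment of a recorded review session
session = SessionIndex()
session.record(0.0, {"decision"}, "choose red trim variant")
session.record(4.2, {"tool"}, "annotation pen picked up")
session.record(9.8, {"decision", "camera"}, "freeze dashboard layout")
```

Indexing by tag rather than by media stream is the design choice that makes later reflection practical: a reviewer can jump straight to every "decision" moment and replay only the audio/video around those timestamps.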
Conclusions

Although some commercial AR solutions have emerged, most inspiration can be drawn from the IAP installations created in academia. As design support tools, these systems showcase the power of tangible computing as natural and embodied interaction. By inspecting the collection, three different interaction characteristics to support design emerged: browsing, interaction with the artefact's behaviour, and alteration of the model. In assessing the impact of IAP, Dutch design studios were surveyed. Their reactions are positive; most of the proposed hardware solutions are already available. However, software is regarded as the missing link, while the proper configuration of the overall solution concept is difficult to grasp at this moment. Based on this feedback and other empirical studies, a design review concept was developed, incorporating two different projector-based AR systems: a handheld and a larger desktop model. Pilot studies and additional demonstrations show promising results.

Figure 11. AR Cube in use with several 3D printed objects.
References
Bandyopadhyay, D., Raskar, R. and Fuchs, H. (2001) "Dynamic shader lamps: painting on movable objects", Proceedings of the International Symposium on Augmented Reality (ISMAR), pp. 207–216.
Bimber, O., Stork, A. and Branco, P. (2001) "Projection-based augmented engineering", Proceedings of the International Conference on Human-Computer Interaction (HCI 2001), vol. 1, pp. 787–791.
Bochenek, G.M., Ragusa, J.M. and Malone, L.C. (2001) "Integrating virtual 3-D display systems into product design reviews: some insights from empirical testing", Int. J. Technology Management, vol. 21, nos. 3–4, pp. 340–352.
Cao, X. and Balakrishnan, R. (2006) "Interacting with dynamically defined information spaces using a handheld projector and a pen", Proceedings of the ACM UIST Symposium on User Interface Software and Technology, pp. 225–234.
Cheok, A.D., Edmund, N.W.C. and Eng, A.W. (2002) "Inexpensive non-sensor based augmented reality modeling of curves and surfaces in physical space", proceedings.
Fiorentino, M., de Amicis, R., Monno, G. and Stork, A. (2002) "Spacedesign: a mixed reality workplace for aesthetic industrial design", Proceedings of ISMAR '02, pp. 86–96.
Fründ, J., Gausemeier, J., Matysczok, C. and Radovski, R. (2003) "Cooperative design support within automobile advance development using augmented reality technology", Proceedings of CSCW in Design, pp. 492–497.
Kanai, S., Horiuchi, S., Shiroma, Y., Yokoyama, A. and Kikuta, Y. (2007) "An integrated environment for testing and assessing the usability of information appliances using digital and physical mock-ups", Lecture Notes in Computer Science, vol. 4563, pp. 478–487.
Kato, H. and Billinghurst, M. (1999) "Marker tracking and HMD calibration for a video-based augmented reality conferencing system", Proceedings of the International Workshop on Augmented Reality (IWAR '99).
Klinker, G., Dutoit, A.H., Bauer, M., Bayer, J., Novak, V. and Matzke, D. (2002) "Fata Morgana – a presentation system for product design", Proceedings of ISMAR '02, pp. 76–
Kuutti, K., Battarbee, K., Säde, S., Mattelmäki, T., Keinonen, T., Teirikko, T. and Tornberg, A. (2001) "Virtual prototypes in usability testing", Proceedings of the 34th Annual Hawaii International Conference on System Sciences (HICSS-34), 3–6 January, vol. 5.
McGarry, B. (2005) Things to think with: understanding interactions with artefacts in engineering design. PhD thesis, University of Queensland, School of Information Technology and Electrical Engineering.
Minneman, S.L. and Harrison, S.R. (1993) "Where were we: making and using near-synchronous, pre-narrative video", Proceedings of ACM Multimedia '93, pp. 207–214.
Nam, T-J. (2005) "Sketch-based rapid prototyping platform for hardware-software integrated interactive products", Proceedings of
Nam, T-J. and Lee, W. (2003) "Integrating hardware and software: augmented reality based prototyping method for digital products", Proceedings of CHI '03, pp. 956–957.
Rauterberg, M., Fjeld, M., Krueger, H., Bichsel, M., Leonhardt, U. and Meier, M. (1998) "BUILD-IT: a planning tool for construction and design", Video Program of CHI '98.
Smulders, F.E., Lousberg, L. and Dorst, K. (2008) "Towards different communication in collaborative design", International Journal of Managing Projects in Business, vol. 1, no. 3.
Underkoffler, J. and Ishii, H. (1999) "Urp: a luminous-tangible workbench for urban planning and design", Proceedings of CHI '99.
Verlinden, J.C., de Smit, A., Horváth, I., Epema, E. and de Jong, M. (2003a) "Time compression characteristics of the augmented prototyping pipeline", Proceedings of Euro-uRapid '03, p. A/1.
Verlinden, J.C., de Smit, A., Peeters, A.W.J. and van Gelderen, M.H. (2003b) "Development of a flexible augmented prototyping system", Journal of WSCG, vol. 11, no. 3, pp. 496–503.
Verlinden, J., de Smit, A. and Horváth, I. (2004a) "Case-based exploration of the augmented prototyping dialogue to support design", Proceedings of TMCE 2004, pp. 245–254.
Verlinden, J., van den Esker, W., Wind, L. and Horváth, I. (2004b) "Qualitative comparison of virtual and augmented prototyping of handheld products", Proceedings of Design 2004, pp. 533–538.
Verlinden, J.C., Horváth, I. and Nam, T-J. (2009) "Recording augmented reality experiences to capture design reviews", International Journal on Interactive Design and Manufacturing, vol. 3, no. 3, August 2009, pp. 189–
Yang, M.Y. (2004) "An examination of prototyping and design outcome", Proceedings of DETC '04, Paper no. DETC2004-57552.
Links to YouTube videos of these systems
van den Berg (2006) "Project Light Blue", …denberg/lightblue.html
Underkoffler's Urp and other Luminous Room demos: http://vimeo.com/2235474
Fjeld et al., BUILD-IT video for CHI '98
Bluebrush prototype by van den Berg: http://www.youtube.com/watch?v=Dvvl9-C2rU0
Tek-Jin Nam's sketch-based rapid prototyping
Verlinden AP videos: http://youtu.be/F-3BrBfigEw
Fiorentino's Spacedesign: http://www.youtube.com/watch?v=GOMx_sytCmU
Bandyopadhyay et al., Dynamic Shader Lamps: http://www.youtube.com/
Image by Oculus VR
The revival of Virtual Reality?
Wim van Eck
In the early '90s, virtual reality (VR) was the 'next big thing'. Popularised by movies such as "The Lawnmower Man", it was expected that soon everybody would be immersed in virtual worlds using head-mounted displays (HMDs) and CAVEs. The industry invested enormous amounts of money, but never managed to deliver affordable hardware which would live up to people's high expectations. The technology simply wasn't ready yet. VR soon became a niche product only used by industries which could afford it, such as the army and research institutes. Since Augmented Reality (which is even more technologically impressive) seems to have been widely accepted, we can safely say goodbye to VR. Or can we…?
In August this year, a VR enthusiast named Palmer Luckey started a Kickstarter project which promises a truly immersive virtual reality headset, the Oculus Rift. While currently the best consumer headsets only have a 45-degree horizontal and a 52-degree diagonal field of view, Palmer's design succeeds in offering a stunning 90-degree horizontal and 110-degree diagonal field of view. This means that instead of having the equivalent of a large screen at a couple of meters distance, you can actually hardly see the edges of the screen anymore. Combined with an ultra-low-latency head tracker with 6 degrees of freedom, you should finally be able to experience VR as it was originally meant to be, at an affordable price of $275.
Palmer's Kickstarter project was quickly backed by some of the most influential game developers, such as John Carmack (id Software), Gabe Newell (Valve) and Cliff Bleszinski (Epic Games), who were completely sold after experiencing an early prototype of the Rift. After a month the project was backed by almost 10,000 people and had raised $2,437,430, ten times more than the $250,000 goal. The first batch of headsets is mostly meant for developers, so they can already develop projects for it, while a further improved consumer version of the Rift is expected to be on sale at the end of 2013. Popular game engines such as Unity and the Unreal Engine have promised support for the Rift, and the first compatible games are currently being announced. Many are already saying the Rift will be the next big innovation in gaming.
At the AR Lab, we are also very curious about this headset. Not only for VR purposes, but also to see if we can turn it into an Augmented Reality headset, just like we turned the Sony HMZ-T1 into 'Marty' (see pages 14-17 of this magazine). We backed the project and will receive two prototypes of the headset in December. We will keep you posted on our experiences!
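The jump from a 45-degree to a 90-degree horizontal field of view is easy to quantify: a flat screen at distance d that fills a horizontal field of view of θ degrees is 2·d·tan(θ/2) wide. A quick back-of-the-envelope sketch (the 2 m viewing distance is an illustrative assumption, not a figure from the article):

```python
import math

def screen_width_equivalent(distance_m, fov_deg):
    """Width of a flat screen at distance_m metres that fills
    a horizontal field of view of fov_deg degrees."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

# At an assumed 2 m viewing distance:
w45 = screen_width_equivalent(2.0, 45)   # ~1.66 m wide: a large TV
w90 = screen_width_equivalent(2.0, 90)   # ~4.0 m wide: edges fall outside comfortable view
```

Doubling the field of view thus more than doubles the equivalent screen width, which is why the Rift's display feels like it surrounds you rather than sitting in front of you.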
Contributors Wim van Eck
Royal Academy of Art (KABK) firstname.lastname@example.org
Leiden University email@example.com
Wim van Eck is the 3D animation specialist of the AR Lab. His main tasks are developing Augmented Reality projects, supporting and supervising students and creating 3D content. His interests include real-time 3D animation, game design and creative research.
Edwin van der Heide Leiden University firstname.lastname@example.org
Edwin van der Heide is an artist and researcher in the field of sound, space and interaction. Besides running his own studio, he is a part-time assistant professor at Leiden University (LIACS / Media Technology MSc programme) and heads the Spatial Interaction Lab at the ArtScience Interfaculty of the Royal Conservatoire and Arts Academy in The Hague.
Pieter Jonker Delft University of Technology P.P.Jonker@tudelft.nl
Pieter Jonker is Professor at Delft University of Technology, Faculty of Mechanical, Maritime and Materials Engineering (3mE). His main interests and fields of research are real-time embedded image processing, parallel image processing architectures, robot vision, robot learning and Augmented Reality.
Yolande Kolstee Royal Academy of Art (KABK) Y.Kolstee@kabk.nl
Yolande Kolstee has been head of the AR Lab since 2006. She holds the post of Lector (the Dutch term for a researcher at a university of applied sciences) in the field of 'Innovative Visualisation Techniques in Higher Art Education' at the Royal Academy of Art, The Hague.
Maarten Lamers Leiden University
Maarten Lamers is assistant professor at the Leiden Institute of Advanced Computer Science (LIACS) and board member of the Media Technology MSc program. Specializations include social robotics, bio-hybrid computer games, scientific creativity, and models for perceptualization.
Ferenc Molnár Photographer email@example.com
Ferenc Molnár is a multimedia artist based in The Hague since 1991. In 2006 he returned to the KABK to study photography, and that is where he started to experiment with AR. His focus is on the possibilities and impact of this new technology as a communication platform in our visual culture.
Maaike Roozenburg Royal Academy of Art (KABK) firstname.lastname@example.org
Maaike Roozenburg is a designer. She develops concepts, projects and products on the border of heritage, design and visual communication. She founded Studio Maaike Roozenburg, in which she combines a historical fascination with high-tech materials and techniques and 'traditional' crafts. Besides her work at the studio, Roozenburg teaches at the Post Graduate Course of Industrial Design at the Royal Academy of Art.
Hanna Schraffenberger Leiden University email@example.com Hanna Schraffenberger works as a researcher and PhD student at the Leiden Institute of Advanced Computer Science (LIACS) and at the AR Lab in The Hague. Her research interests include interaction in interactive art and (non-visual) Augmented Reality.
Esmé Vahrmeijer Royal Academy of Art (KABK) firstname.lastname@example.org
Esmé Vahrmeijer is the graphic designer and webmaster of the AR Lab. Besides her work at the AR Lab, she is a part-time student at the Royal Academy of Art (KABK) and runs her own graphic design studio Ooxo. Her interests are graphic design, typography, web design, photography and education.
Vincent Hui Ryerson University, Department of Architectural Science
Vincent Hui teaches a variety of courses within the Department of Architectural Science at Ryerson University in Toronto, Canada, ranging from design studios to advanced architectural computing and digital fabrication.
Jouke Verlinden Delft University of Technology email@example.com
Jouke Verlinden is assistant professor at the section of computer aided design engineering at the Faculty of Industrial Design Engineering. With a background in virtual reality and interaction design, he leads the "Augmented Matter in Context" lab, which focuses on the blend between bits and atoms for design and creativity.
Barbara Nordhjem http://nordhjem.net
Barbara Nordhjem is a PhD student at the Visual Neuroscience Group at the University Medical Center in Groningen. She is interested in visual perception and how humans are able to extract the most useful information from the environment in different situations.
Dirk Vis Royal Academy of Art (KABK) firstname.lastname@example.org
Dirk Vis (1981) studied Beeld & Taal (Image & Language) at the Rietveld Academy and Design at the Sandberg Institute. He published Bestseller (2009) in a limited edition, made electronic poems with K. Michel, is an editor of De Gids and teaches Interactive Media at the Royal Academy of Fine Arts.
Guest Contributors
Esther de Graaff www.estherdegraaff.nl
Esther de Graaff studied art history with a focus on contemporary art. She now works as a project manager, curator, writer and coordinator, supporting artists, curators and institutions in organizing various projects, both in content and in organization. It is her mission to bring culture to the heart of our lives.
Martin Sjardijn http://www.sjardijn.com/
Martin Sjardijn is a painter, sculptor, and digital and conceptual artist. He was born in The Hague, The Netherlands, where he also studied Fine Arts at the Royal Academy of Art and, for some years, Cultural Sciences and Philosophy. At the Royal Academy of Art he teaches virtual modeling for autonomous art.
Special thanks We would like to thank Mariana Kniveton, Reba Wesdorp, Tama McGlinn and last but not least the Stichting Innovatie Alliantie (SIA) and the RAAK (Regionale Aandacht en Actie voor Kenniscirculatie) initiative of the Dutch Ministry of Education, Culture and Science.
Next Issue The next issue of AR[t] will be out in the second quarter of 2013.