


Donovan Ong 639309

SECTION A
A0 Introduction
A1 Design Computation
A2 Composition/Generation
A3+4 Conclusion + Learning Outcomes
A5 Appendix - Algorithmic Sketches

SECTION B
B1 Research Field
B2 Case Study 1.0
B3 Case Study 2.0
B4 Technique: Development
B5 Technique: Prototypes
B6 Technique: Proposal
B7 Learning Objectives & Outcomes
B8 Appendix - Algorithmic Sketches

SECTION C
C1.1 Design Concept: Preliminary Trials
C1.2 Design Concept: Final
C2 Tectonic Elements & Prototypes
C3 Final Detail Model
C4 Learning Objectives & Outcomes



DONOVAN ONG, 3RD YEAR ARCHITECTURE

Perhaps to know what drives my study and thus my work, it must be understood that for me, the design construct is a difficult concept to grasp. In all my work so far the requirements of the brief are established and expectations shown, and yet invariably there is a failure to deliver. For me, the construct lacks any pragmatic process, and thus, though the end goal is understood and even partly formed in mind, the path to the manifestation of such a design cannot be reached.

Perhaps for the me of today, underlying all personal motivations in my study lies not an interest stemming from the finished architectural form, but the aim that one day the abstract construct that is the design process is made clear to me.

My experience in design software can only be described as basic, with limited technical skill in Rhino and SketchUp. It should be noted that my thoughts on using software have not gone further than considerations of efficiency, convenience and ease. In the coming weeks I hope to explore the realm of digital design in a much wider scope, taking into account topics such as the ethics and morality of such techniques, and thus, perhaps, the future of architectural design as a whole. What is it that most justifies a design process? The pursuit of ultimate efficiency? The achievement of 'true' creativity? Or perhaps, above all else, the selfish interest of the designer? It is my hope that by the end of the subject my opinions on such considerations can begin to form.






MERCEDES BENZ MUSEUM

It is necessary to understand that when Tony Fry speaks of design futuring, he speaks not only of a consideration for sustainable design or human attitude, but also of the futuring every design fundamentally brings: "designed things go on designing, be they designed to do so or not"[1]. Every individual design develops design as a realm, and thus, by extension, develops our future.

In analysing the Mercedes Benz Museum as a design we thus seek its true implications for design, both as it currently is and as it will be, in this case by looking at the museum's geometry: a space of curves seemingly avoiding the traditional right angle. It is clear that Berkel and Sobek exaggerate the fact that we have stepped into an age where even the once most basic structural principles are evolving.

What this geometry means for design is, however, a more complex matter. It is hard to say, without the architect explicitly stating so, that Berkel and Sobek's museum has directly influenced a work, yet it cannot be denied that it has at the very least contributed to the field of architectural abstract geometries.

The implications of a design, however, do not extend merely to the final built form, but of course to the method behind it, in this case 3D modelling[2]. Perhaps more than an indication of successful design, the fact that the museum is built further validates the method: it is feasible, and perhaps even preferred, to have even the most complex structures planned entirely in 3D.

Can the influence of the museum compare to, say, Heinz Isler's concrete shells? Perhaps not, yet even so its contribution to the field cannot be ignored. To develop the realm of design is to develop future possibility; to appreciate design futuring is to appreciate this fact. It is not calculating how much the museum has influenced that is significant, but the simple understanding that it did.

Notes
1. Tony Fry, Design Futuring (Oxford: Berg Publishing, 2009), 7
2. Klaus Bollinger, Manfred Grohmann and Oliver Tessmann, 'Structured Becoming: Evolutionary Processes In Design Engineering', Architectural Design, 80 (2010), 34-39 <http://dx.doi.org/10.1002/ad.1103>
Fig2. commons/b/b5/Mercedes-Benz_Museum_interior-6_2013_March.jpg





In the previous analysis of the Mercedes Benz Museum, it is claimed that there is a certain significance that comes with a built project. For Berkel and Sobek, it is a validation of their ideas, but perhaps more importantly another step taken in the validation of their method: 3D-planned design.

"Whenever we bring something into being we also destroy something"[1]

The cost of this proof, however, must not be forgotten. There is of course an embodied time and energy in all action, yet it is clear that the resource difference between the concept and the built environment is incomparable.

In Oliver Von Malm's chapel within the Asam Church we are shown his parametric and quasi-modernist style, with his shapes loosely inspired by the already existing baroque elements of the Asam church[2].

It may be true that digital design will never bring the same level of impact as the architecture of reality, as inherently architecture is a field grounded soundly in the real; our end goal is no doubt the physical manifestation of our design. Yet when we speak of design futuring, in terms of implications for future design and thus future possibility, can the digital concept not still have its significant place?

Design develops design. Though the fact that the Mercedes Benz Museum is built is important, even without being constructed Malm's conceptual chapel no doubt enriches the parametric architectural field, developing the environment in which works such as PTW Architects' Water Cube or UNStudio's Burnham Pavilion can be built.

Notes
1. Tony Fry, Design Futuring (Oxford: Berg Publishing, 2009), 4
2. Marjan Colletti, 'Digitalia - The Other Digital Practice', Architectural Design, 80 (2010), 16-23
Fig3. https://turkeberts.files.wordpress.com/2014/10/dc6a3104.jpg
Fig4. "DigitAlia - the Other Digital Practice", Architectural Design 204 (2010): 25



THE CREATIVE SYMBIOSIS

As the line between tool and partner blurs, the status quo is disturbed. No longer can we claim to be the all-powerful creator-wielder of a simple instrument; in the age of creative symbiosis, we do not use but communicate[1].

For the ancient Greeks, creativity sourced not from the self, but from the whims of the divine[2]. For those of the Post-Enlightenment, creative design becomes the representation of the creative soul we cultivate through our studies in each of our respective contexts[3]. For the designer of today, creativity is sourced not from the soul but from the electrical impulses of the brain; 'creativity' begins to mean a collective term for the cognitive processes responsible for the design[4].

In the creative symbiosis, we thus extract from this cognitive creativity some processes we have begun to understand, logic and iteration, and assign the program the task of diversifying and documenting, based on our communicated parameters, the idea to its logical conclusion[5].

The results are vast and various, their effects widespread. In the realm of architectural creativity there begins a paradigm shift: slowly, it is not so much the initial idea that determines creativity, but the choice of iteration, the conscious decision in parameter variation; the process of design[6].



In response to this new focus on the process, we ask then: as our partner in computational design, is the program a limitation on our creativity? Undoubtedly so, yet is this not inherent in all mediums? Does not the translation of thought intrinsically come with a loss of information? This is true for conventional means of design, including computerisation: the base design is manifested from the initial idea, and grows from there.

We must understand, however, that in computation we begin with the function, the limitation, the rule: the algorithm. With this process we thus begin to explore what can be instead of what should (according to our preconceived ideal); we begin to research by design[7]. Yet countering this, does not the language of the program define our interaction, and thus our design? Are we not funneled, knowingly or not, into a certain shape, pattern, philosophy?



Notes
1. Yehuda E. Kalay, Architecture's New Media (Cambridge, Mass.: MIT Press, 2004), 3
2. Jonah Lehrer, Imagine (Boston: Houghton Mifflin Harcourt, 2012), 5
3. Ibid.
4. Ibid., 6
5. Kalay, Architecture's New Media: Principles, Theories and Methods of Computer-Aided Design, 2
6. Branko Kolarevic, Architecture In The Digital Age (New York, NY: Spon Press, 2003), 7
7. Rivka Oxman and Robert Oxman, Theories of the Digital in Architecture (New York: Routledge, 2014), 5
Fig5.

THE TECTONIC MATERIALITY

Can computing be used to re-define practice? We need only look at the work of Achim Menges and UNStudio to see that for those at the cutting edge of the creative discourse, it already has. Computation enables geometries that were once far too hard to conceptualize, let alone iterate. Were the wood panels of the Landesgartenschau readily available? The frames and claddings of the Youturn pavilion? Through sheer necessity the gap between architect and engineer is once again bridged[7]; as the designer we must make possible the manufacture and joining of each and every part for the sake of realizing our design, and it is to this end that we innovate the tectonic materiality[8].

FIGURE 6: YOUTURN PAVILION 2010 BY UNSTUDIO

Notes
7. Branko Kolarevic, Architecture in the Digital Age: Design and Manufacturing (London: Spon Press, 2003), 7
8. Rivka Oxman and Robert Oxman, Theories of the Digital in Architecture (New York: Routledge, 2014), 5
Fig 6.












In computational design, the contrary is advocated: iteration breeds emergence, variation feeds research; the process thus defines the design, the final product, and therefore demands an equal, or perhaps greater, importance.

The inspiration underlying the design is set: ink dispersion. Pseudo-morphogenesis, procession; whether generic or manipulated: movement. When we speak of generation in various fields of thought, inherently there is change, there is motion, there is flow; principles unchanged in computational design.

For UNStudio's National Art Museum of China (fig.7), this is taken literally: the form is to be the manifestation of diffusive flow. To this end, the eight-levelled structure is proportionally balanced, vertically similar to the ink cloud, with the internal spaces planned according to contextual social procession.

Composition and generation are integral to architecture. Where there is one, there will be the other; we can only question their balance. Ingrained in conventional architecture is the ultimate importance of the final product, and reasonably so: it is the end design, the end composition, that we build, that above all iteration we give the title 'best'.

Already we see how contemporary practice reacts to computational creativity. We are asked to critique the use of generation, yet there are no clear indications of balance between composition and generation. By itself, generation, I will claim, is useless in the architectural design process. This however is not the refusal of speculation in architecture but indeed the refusal of pure generation, if it can even exist. What architects aim for, be it for the now or the future, is real; what pure generation fundamentally produces, is not.

In UNStudio's museum, generation is key and in fact is what inspires composition. Thus, it can be argued that the emphasis on generation is perhaps stronger than in its contemporaries, yet as an architectural piece there can be no refuting the existence of composition in the generated forms.

Notes
Fig7.
Fig8. Ibid
Fig9. Ibid


A major benefit of algorithmic computational design generation is often described as the "potential for differentiation"[1]. In the context of design generation, diversifying a single concept quickly, efficiently and neatly is what I will argue to be what Schumacher implies as this potential.

In its time, the Guggenheim of Gehry was awesome: the shapes near unthinkable, its construction more so. Yet pushing the boundaries of what could be done, the Museum was realized. This was innovation.

When iteration slows, when we become comfortable with what we have already developed, the potential is killed. We cut it down for the sake of current fashion, to the end of developing personal style. Though stagnant, it is able to remain beautiful. Yet must this acceptable standard remain? We are enabled more than ever before to change; we are able, more than ever before, to encourage emergence and innovation.

In a lecture by Michael Weinstock[2], we are shown how much computation is able to develop within a period of seven years, starting from simple temporary shades and small sculptures to permanent fixtures and real structures. Explorations are made about the properties of the material, and iterations are guided and pushed by the results of their research: generation begets design.

Fast forward thirteen years to the concept models for Atlantis in Sentosa, Singapore. It is a beautiful piece: context sensitive, cutting-edge geometries and, perhaps most importantly, clearly recognisable as a Gehry piece. It is this immediate recognition that worries me. Can I fault someone for building a name? For aiming


to achieve (or in this case, having already achieved) recognition through a style? Even in the context of computation it is hard. Though we live in a time where we are enabled more than ever to change, more than ever to innovate, is there not a beauty in achieving a defining personal ideal architecture? Even understanding that it is this selfish concept 'style' that stifles creativity, can I oppose it?


Notes
1. Rivka Oxman and Robert Oxman, Theories of the Digital in Architecture (New York: Routledge, 2014), 7
2. Michael Weinstock, Fabrication Intelligence - Note change of time, VIDEO/lecture.php?ID=2752
Fig10.




Being only three weeks into the subject, it is difficult to clearly state my intended design approach in regards to the Part C design project; however, from what I have learned, I intend on investing time and effort into understanding the algorithm and the design that stems from the program.

Why this can be significant in my design journey is that it completely redefines the process. In my introduction I state that what primarily drives me is the aim of achieving a clear understanding of the abstract construct that is the process of design. In my introduction I also state my interest in gaining a wider scope on the theory behind computation. This perhaps is the clearest difference between the present me and the me of three weeks past: I am more interested in computational theory than I have ever been.

It was stated before that I only considered software as a means for efficiency, ease and convenience; I saw the program only as a means for computerisation. As shown clearly in explorations such as A1 CREATIVE SYMBIOSIS, I hope to develop my skills in computation, to design through the program as a partner in parametric architecture.

In computational design, we are taught again and again that it is the process that defines the design. It is the process that innovates. It is the process on which we base our research. In my two design subjects so far, I felt clearly that though they claimed to be interested in the creative process, in the choice of iteration, it was the end product that held meaning. In Studio Air, I am overjoyed to have the opportunity to spend ample time on the process and techniques behind generation, not on deliverables (such as plans and sections) that force the immature idea to be developed to the end of one more complete composition that I can only feel disappointment in.





A5.1 This exploration in week two really made clear what the possibilities of computational design could be: without any preconceived ideas, without that beginning phase of generating concepts, I am able, through parametric algorithm and program, to design shapes, and through iteration, to see the emergence of (at least to me) new forms and ideas.

A5.2 This algorithm, created in response to a task in week 3, was selected because it marks my most advanced three-dimensional exploration in Grasshopper (or really any other program) to date. To put it another way, this was perhaps when I started to truly feel that I could create an architecture from algorithmic computational forms.



References

Bollinger, Klaus, Manfred Grohmann, and Oliver Tessmann, 'Structured Becoming: Evolutionary Processes In Design Engineering', Architectural Design, 80 (2010), 34-39

Colletti, Marjan, 'Digitalia - The Other Digital Practice', Architectural Design, 80 (2010), 16-23 <http://dx.doi.org/10.1002/ad.1037>

Fry, Tony, Design Futuring (Oxford: Berg, 2009)

Kalay, Yehuda E., Architecture's New Media (Cambridge, Mass.: MIT Press, 2004)

Kolarevic, Branko, Architecture In The Digital Age (New York, NY: Spon Press, 2003)

Lehrer, Jonah, Imagine (Boston: Houghton Mifflin Harcourt, 2012)

Oxman, Rivka, and Robert Oxman, Theories of the Digital in Architecture (New York: Routledge, 2014)

Weinstock, Michael, Fabrication Intelligence - Note change of time, php?ID=2752

Images

Figure 2: Mercedes Benz Museum
Figure 3: Asam Church
Figure 4: Oliver Von Malm Chapel, "DigitAlia - the Other Digital Practice", Architectural Design 204 (2010): 25
Figure 5: Landesgartenschau Exhibition Hall, ICD/ITK/IIGS
Figure 6: Youturn Pavilion, UNStudio
Figure 7: National Museum of China, UNStudio
Figure 8: Museum Isometric Layout, Ibid
Figure 9: Ink Dispersion, Ibid
Figure 10: Atlantis in Sentosa by Frank Gehry, Greg Lynn + Team




EVOLVING THE PROCESS

Biomimicry is no innovative concept, existing at varying scales of architectural design. From the ornamentation of the classical orders to theories of fundamental origins, we have looked to nature for guidance, and in Her substantiality we find near infinite potential. More than final forms, structural systems or organic function, however, I have chosen to study biological evolution in the context of the architectural discourse, and thus how the design can independently evolve through the bottom-up process.

"Nature [is] a colossal engine producing unlimited variations based on complex, but comprehensible, rules... adaptations of species by incremental augmentation is no different from tools that are progressively improved by selection and refinement"[1]

In Butler's theories on evolution as a mechanical system, the advocation of nature as a machine, and thus the machine as a part of nature, is made. This is what I will study in the coming precedents: evolution as a program; automatic recursion and iteration.
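This idea of evolution as a program, incremental augmentation by selection and refinement, can be read as a simple loop: generate variants, score them, keep the fittest, recurse. The following Python sketch is purely illustrative; the fitness target and mutation scheme are my own inventions, not anything drawn from Butler or the precedents that follow.

```python
import random

def evolve(seed, fitness, mutate, generations=50, brood=8):
    """Minimal selection-and-refinement loop: each generation spawns
    `brood` mutated variants and keeps the best of parent + variants."""
    best = seed
    for _ in range(generations):
        variants = [mutate(best) for _ in range(brood)]
        best = max(variants + [best], key=fitness)
    return best

# Illustrative run: 'evolve' a single number toward an arbitrary target.
random.seed(0)
target = 1.618
result = evolve(
    seed=0.0,
    fitness=lambda x: -abs(x - target),              # closer is fitter
    mutate=lambda x: x + random.uniform(-0.1, 0.1),  # incremental augmentation
)
print(result)  # converges close to the target
```

In a parametric context, the "seed" would be a set of slider values, "mutate" a perturbation of those parameters, and "fitness" whatever criterion the designer scripts; the loop itself is the automatic recursion and iteration described above.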



Notes 1. Self, J. (2013). Darwin Among the Machines. Architectural Design, 83(4), 66-71. doi:10.1002/ ad.1620

THE MIDDLE ROAD

Using programs such as Grasshopper, we inevitably come to realize that the interference of the designer is still too great; that still, though perhaps unconsciously, there is the notion of the program as a tool. In A5 ALGORITHMIC SKETCH APPENDIX, we see this clearly.

In A5.1, upon reflection, we understand that what drives iteration is an aim for 'interesting' form: the direction of the process, whether decided or not, is subject to the designer. In A5.2, variation is far simpler, yet from the simple fact that the initial geometry is created through computerizing (in contrast to computing) the designer's idea, we are able to claim the same.

Though these are shapes that could only be achieved through computation, fundamentally the design has not reached what Jahn describes as "bottom up"[1]. It instead lies between computerization and true generative design, existing in a sense as both bottom up and top down: though the process begins not so dissimilarly to a generative design approach, the initial resulting forms dictate, according to each designer's agenda, future iteration, and thus, though we move from the bottom up, in parallel, through leading the program to the final form, we also begin to move top down.

RULES OF SIX
ARANDA LASCH

In Aranda Lasch's Rules of Six, installed in New York City in 2008, one example of a design closer to the bottom-up approach is shown: they explicitly state the aim of self-assembly, replicating lab-like mineral formation through software simulations[2]. From this evidence-based coding, biomimetic forms appear. Seemingly it is exactly what we seek: a design that "multiplies indefinitely without sacrificing stability"[3] and thus is capable of self-evolution.

Yet in the context of a truly independently evolving design, there is something not quite right: the design is held back by the real. In attempting programmatic evolution, we necessarily accept emergence and thus speculation; simply, what exists in contemporary knowledge is inadequate. Though Rules of Six can evolve mathematically, it lacks much of the complexity of the iterative developments of biological evolution. "Infinitely... indefinitely"[4], yet fundamentally unchanging. In this we see clearly: the predetermined potential can never be more.

Notes
1. Lecture 4, Gwyllim Jahn, Parametric Modelling
2.
3. Ibid
Fig11. Ibid.










WANDERERS - WEARABLES
NERI OXMAN

"The processes, considered as being evolutionary, can be introduced to a mechanism (also considered as evolutionary), and a mutual training, resilience and growth can be developed." Negroponte, 1969[1]

In Neri Oxman's work, we begin to truly see Negroponte's computational evolution. In a video showing fifteen formal variations[2], we take a step closer to what bottom-up design can be: it is through unrestrained generative design that Oxman's forms evolve to attain some natural property; emergent biomimicry. Unlike Rules of Six, though the underlying behavior (veins, cells, partitions, fibres)[3] is set, we do not limit the function to the preexisting and preconceived.

In this case, does not the so-called bottom-up generative design approach yield once again to the designer's intent? The preexisting? The preconceived? There is no evidence of color or composition arrived at through computational iteration. If they were indeed done 'by hand', are we still able to call this design bottom up? If we imagine the difference between conventional top down and bottom up as a clear line, though obviously contrasting, it is near impossible to define their fundamental separations. Yet in comparing Lasch's and Oxman's work, do we not see clearly that varying scales of bottom up exist?

In many ways, in both selected precedents, I am disappointed by what I perceive to be an incomplete exploration into true emergent design, and thus the true evolution of the tool into the partner: automatic iteration through scripting 'basic' behaviors, so that we no longer anticipate, or need to manipulate, the resulting form during the process. Yet what of the post-generation?

Though seemingly embracing the bottom up, Oxman's Wanderers are not without concerns. In particular are the symbolisms and design intents layered over the generated form: what is it that dictates, for instance, the color? The composition of each piece of clothing? In the chest piece Qamar, Oxman cites an inspiration from the luminosity of the scarred Moon. In the Mushtari skirt, Jupiter and the human gastrointestinal tract. In the cape Otaared, the atmosphere of Mercury in relation to human respiration[4].

However, the arrival of disappointment heralds the existence of failed expectation, and it is in this that we renew the persisting will to progress. As we step closer towards the evolving program we ask, as we always have and as we always will: if not this, then what? What can design be?

Notes
1. Negroponte, 1969, Towards a Humanism Through Machines
3. Neri-Oxman-Wanderers
4. Ibid.
Fig12. lery/21605971/Neri-Oxman-Wanderers
Fig13. Ibid.




Drawing the discussion back to architecture, we must of course speak of the fabrication concerns of such a design.

Lasch's Rules of Six are 3D printed simply and conventionally, no doubt with the purpose of highlighting the key geometries of the installations first put up in New York in 2008. Oxman's wearables are 3D printed using innovative techniques to allow, for the first time ever, colored gradients, exhibited at EuroMold 2014[1].

Scaled to the built environment, however, such success, at least currently, cannot be had. We cannot simply 3D print the building and expect it to stand; to meet our expectations as an architecture.

In Lasch's work, it is claimed that the code used for the installation could be applied at an indefinite scale: "its sprawling construction could represent molecules, rooms, buildings or entire neighborhoods"[2]. However, representation and realization in terms of real-world construction are simply on different levels of complexity. Lasch fails to outline solutions to the first-level issues that unplanned evolution brings: contours, sun paths, face orientation, natural ventilation; there is no plan for a building here.

Yet Oxman's agenda was the wearable, not the buildable. Her criteria were necessarily different. Even so, replacing arguably fundamental design elements such as the opaque glass-like materials, complexity on the level of, say, the Otaared, if even structurally possible today, would be near impossible financially.

In Oxman's work there are no claims to the design as building, yet even in the face of failure, we cannot help but imagine. In picturing an architecture built five years from now, this selection I can only call a failure. In speculating a century from now, who can say?

Notes
1.
2. Neri-Oxman-Wanderers
Fig14. Ibid
Fig15. Ibid.






REFLECTIONS

In choosing the research field, we necessarily introspect: why? What is it that draws one to biomimicry? Inherently primitive, intrinsically natural, supposedly logical: in conventional evolution theory, we are taught: live, grow, reproduce. In the algorithm: to live is to be clearly documented; to grow is to be logically concluded; to reproduce is to be continually iterated. To apply computational design, and therefore varying scales of generative emergence, is thus to evolve.

What is it that draws me so? As a beginner, it may be simply attributed to the obvious parallels, augmented by the instinctual love for the natural: a power Dr. Roudavski describes as "sublime"[1]. Yet must this attraction be founded on mere inexperience?

Perhaps there is something more. Perhaps in the primitive, the intuitive, the basic, the shared, we find the link between now and tomorrow. To act we must first know. To know we must first relate.

More than simplicity's sake, I stake my interest on this fact. Mutual understanding through mutual behavior[2]: in our ubiquitous fundamental fervor, our innate love for the natural, we are spurred to motion and thus are revealed the deeper reason:

Biomimicry for the sake of design.

Notes
1. Lecture 5, Stanislav Roudavski, Patterning
2. Lecture 3, Dominique Hes, Working in the Future Built Environment




ITERATION SERIES 1&2 INTERNAL POINT
We examine the four InternalPoint nodes, in series 1 by first using the provided extremes (-1 to 1), and in series 2 by replacing values within the slider in order to experiment with a far larger value range (-40 to 40). Why so much time is spent on the InternalPoint node is that, at first examination, it is seemingly where emergent variance is spurred most easily; unlike, say, manipulating the size of the HexaGrid, results, especially when playing with higher values, were highly unpredictable.

ITERATION SERIES 8 SPACING EXPRESSION
In the original algorithm, even spacing is achieved through a specific mathematical expression (Pythagoras). Through simple changes (integer addition and subtraction) we show the fragility of the even spacing, whilst playing with the ideas of compressed or expanded (split) panels.

ITERATION SERIES 3 INPUT VARIANCE: IMAGE CULL
We replaced the image provided for the pattern cull with three of our own, labeling iterations according to their resulting patterns.
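For reference, the Pythagorean spacing expression behind series 8, n*(2*sqrt(s^2-(s/2)^2)), is n times twice the apothem of a hexagon of side s, which simplifies to n*s*sqrt(3). A small standalone Python sketch of the expression and the series-8 perturbation follows; the function names, and the assumption that the +4/-4 constant is added inside the bracket, are mine, not the Grasshopper definition itself.

```python
import math

def hex_spacing(s):
    """Centre-to-centre spacing of adjacent hexagon columns with side
    length s: 2*sqrt(s^2 - (s/2)^2), which simplifies to s*sqrt(3)."""
    return 2 * math.sqrt(s**2 - (s / 2) ** 2)

def column_x(n, s, shift=0.0):
    """x-position of column n; `shift` mimics the series-8 integer
    addition/subtraction that breaks the even spacing."""
    return n * (hex_spacing(s) + shift)

s = 4.0
print(round(hex_spacing(s), 3))                          # 6.928, i.e. 4*sqrt(3)
print([round(column_x(n, s), 2) for n in range(3)])      # even spacing
print([round(column_x(n, s, 4), 2) for n in range(3)])   # expanded (split) panels
print([round(column_x(n, s, -4), 2) for n in range(3)])  # compressed panels
```

The fragility the iterations expose is visible in the numbers: any constant added to the bracketed term scales with n, so the error between panels grows with every column rather than staying uniform.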





ITERATION SERIES 5 INPUT VARIANCE: RADIAL GRID
We replace the HexaGrid with the RadGrid. Using the default values of the Radial Grid node, we found that the results were, perhaps unexpectedly, not circles but variations on the triangle. We then iterate further using the Internal Point nodes used in series 1&2.

ITERATION SERIES 4 CELL INPUT
Interesting iteration no doubt, showing what changes in the panels controlling the cell and point of the List Item nodes could achieve; however, playing with the two panels in isolation, we could not achieve as large or obvious a change as seen in iterations such as 1&2, hence the need for a zoomed view of the cells. It is still not very clear what is happening, however from the few iterations completed we understand that some kind of offset or bending of cells occurs each time we change the values of the panels from the default.

Perhaps of note is how the image pattern cull is represented when using the Radial Grid: there are seemingly more folds on the geometries affected, suggesting that something more complex than the simple offset occurs whilst using the Radial Grid as our base input.

ITERATION SERIES 6 INPUT VARIANCE: SQUARE GRID
We replace the HexaGrid with the SquareGrid. Using the default values of the Square Grid node, we find that the resulting effect is an array of smaller separated rectangular grids, in which grid squares affected by the image pattern cull are dotted. Iteration once again takes place through manipulating the Internal Point sliders, and in each we once again see the original geometry fold or stretch.

ITERATION SERIES 7 INPUT VARIANCE: LOFT TRIAL
We replace the HexaGrid with a simple loft. This resulted in near total failure of the algorithm, which perhaps should not surprise; in this attempt we reconfirm the fact that translating a 2D pattern into 3D geometry takes more than a simple change in the initial input.
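The image pattern cull running through these series can be understood as a thresholding step: sample a brightness value at each grid cell and keep only the cells that clear the cutoff. A rough Python analogue of that idea follows; the synthetic `brightness` function stands in for the sampled image, so this is a sketch of the mechanism, not the actual Grasshopper definition.

```python
def brightness(i, j):
    """Synthetic stand-in for sampling image brightness (0..1) at cell (i, j)."""
    return ((i + j) % 10) / 10.0

def cull_grid(nx, ny, threshold=0.5):
    """Keep only the grid cells whose sampled brightness clears the
    threshold -- the cells the pattern cull would leave in place."""
    return [(i, j) for i in range(nx) for j in range(ny)
            if brightness(i, j) >= threshold]

kept = cull_grid(4, 4)
print(kept)  # [(2, 3), (3, 2), (3, 3)]
```

In these terms, swapping the image (series 3) corresponds to swapping the `brightness` function, while swapping the grid (series 5 to 7) corresponds to changing which (i, j) cells are sampled in the first place.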



CONTROL/DEFAULT [0.2/0.2][0.4/0.0][0.0/0.0][0.2/0.0]

ITERATION 1.1 [1.0/-1.0] [0.4/0.0][0.0/0.0][0.2/0.0]

ITERATION 1.2 [1.0/-1.0][1.0/-1.0][0.0/0.0][0.2/0.0]

ITERATION 1.5/ORIGINAL HEXGRID [0.0/0.0][0.0/0.0][0.0/0.0][0.0/0.0]

ITERATION 1.6 [-1.0/1.0] [0.4/0.0][0.0/0.0][0.2/0.0]

ITERATION 1.7 [-1.0/1.0][-1.0/1.0][0.0/0.0][0.2/0.0]

ITERATION 2.1 [2.2/3.1] [5.0/4.1][0.0/0.0][3.7/0.0]

ITERATION 2.2 [2.2/6.9][5.0/-4.1][5.8/-7.8][5.6/0.0]

ITERATION 2.3 [-3.3/-0.4][4.2/4.9][-1.5/0.0][-2.4/-4.0]




ITERATION 1.3 [1.0/-1.0] [1.0/-1.0][1.0/-1.0][0.2/0.0]

ITERATION 1.4 [1.0/-1.0][1.0/-1.0][1.0/-1.0][1.0/-1.0]


ITERATION 1.8 [-1.0/1.0][-1.0/1.0][-1.0/1.0][0.2/0.0]

ITERATION 1.9 [-1.0/1.0][-1.0/1.0][-1.0/1.0][-1.0/1.0]


ITERATION 2.4 [8.0/-4.8][-4.4/0.0][0.1/0.0][-0.7/-0.6]

ITERATION 2.5 [0.5/-5.1][-4.1/-1.0][-2.3/0.0][4.8/-9.6]




ITERATION 4.1 CELL INPUT [20,300,40,400]




ITERATION 5.1 RADIAL GRID [0.0/0.0][0.0/0.0][0.0/0.0][0.0/0.0]

ITERATION 5.2 RADIAL GRID [0.3/-0.9][1.3/-1.3][0.0/0.0][0.0/0.0]

ITERATION 5.3 RADIAL GRID [0.3/0.6][1.9/-2.6][1.9/-2.5][0.0/0.0]

ITERATION 6.1 SQUARE GRID [0.0/0.0][0.0/0.0][0.0/0.0][0.0/0.0]

ITERATION 6.2 SQUARE GRID [0.3/-0.9][1.3/-1.3][0.0/0.0][0.0/0.0]

ITERATION 6.3 SQUARE GRID [0.3/0.6][1.9/-2.6][1.9/-2.5][0.0/0.0]

ITERATION 7.1 LOFT TRIAL [0.0/0.0][0.0/0.0][0.0/0.0][0.0/0.0]



ITERATION 8.1 HORIZONTAL EXPANSION ARRAY n*(2*sqrt(s^2-(s/2)^2)) [+4]




ITERATION 8.3 VERTICAL COMPRESSION ARRAY n*(2*sqrt(s^2-(s/2)^2)) [-4]
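Both series-8 array labels use the same spacing expression, n*(2*sqrt(s^2-(s/2)^2)), which is n times a hexagon's width across flats for side length s. A quick Python check of that arithmetic (function name hypothetical):

```python
import math

def hex_row_offset(n, s):
    """Offset of the n-th row in the series-8 array:
    n * (2 * sqrt(s^2 - (s/2)^2)), which simplifies to
    n * s * sqrt(3) - n hexagon widths across flats."""
    return n * (2 * math.sqrt(s**2 - (s / 2) ** 2))

# The [+4]/[-4] variants shift by four cell widths:
assert abs(hex_row_offset(4, 1.0) - 4 * math.sqrt(3)) < 1e-9
```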







In generating the eight series, what drove iteration was an aim to understand the algorithm through flexing it: what did each node do, and, from there, what inputs could I replace them with?

Accordingly, with the selection criteria set, we choose four highlights:

To speculate is to see beyond intention. To question what could be.

ITERATION 1.1 Though the variance does not bring as much impact as iterations such as 2.2, the difference is significant, and the result maintains its potential as a wall cladding.

What if instead of a wall cladding, I extruded each cell into a room? Suddenly instead of the entirety of series 8, only 8.1 and 8.2 qualify.

Thus, there is no real direction in terms of form; in a sense we begin to experience what Rivka and Robert Oxman describe as research by design: through playing with the algorithm, we learn. From our learning we take a step further: from manipulating numerical values we begin to manipulate the nodes themselves. From play we achieve a deeper understanding of the fundamental fixed logics of the algorithm, stripping the code down to only the essentials, ready for use in future computational design.

ITERATION 2.1+2.3+2.4 We begin to see major, obvious changes. No longer do the arrayed geometries look even remotely similar to the beginning hexagonal grid, yet there is a consistent pattern with obvious relationships to the image cull. ITERATION 3.3 Akin to 1.1, the variance is not staggering, yet it fits the selection criteria accordingly.

What if, instead of simply a pattern, iterations such as series 3 informed the layout of entire complexes? What if, instead of taking the grid, we took only one geometry? Suddenly the need for any relationship with the image pattern cull is gone; what interested me in 6.1 becomes irrelevant and the unpredictability and instability of 2.2 becomes its asset; from one iteration we are given 40 geometries.

SELECTION CRITERIA Though our generative processes lack any formal direction, I wanted to ground generation with real composition; at this stage I am concerned not with what surprised me, not with what I learned, but with which iterations could be used as an architecture. Focusing more specifically, I brought it back to the original purpose of the algorithm, which mimicked the cladding of FOA's Spanish Pavilion; in selecting my highlights I chose to judge them based on their potential as wall panels and patterns.

ITERATION 6.1 Though the panels are not directly linked/repeated, I felt there was potential in 6.1 as a simple cladding. Though the base rectangular form is perhaps not as interesting as previous iterations, there is a greater clarity in regards to the image pattern cull.

In the world of speculation, lines incapable of being fabricated into patterned cladding find new purpose as framing systems for shades; folds deemed useless are given meaning through sculpture.

ITERATION SERIES 8 Perhaps not so surprising is the qualification of the entirety of series 8; though a different arrangement in array is present, the resulting geometries present high potential for wall paneling and cladding.







1 PRIMARY GEOMETRY Calling the geometries fractals may be hasty; though the geometry splinters at each of the polygon's corners, the fractures are seemingly not smaller versions of the base hexagon.

2 SIZE VARIATION Quick analysis of this picture reveals around four different sized geometries. Without a complete analysis we cannot say definitively, but we take this assumption forward as we continue.

3 COMPOSITIONAL SURFACES Composition of the piece has no identifiable pattern (the original work is based on real simulations); however, it is seen that surfaces sharing equal heights are seemingly joined together into a continuous surface.

Start: Hexagon fractals.

Start: Scaling point differences.

Start: Random array+join Breps.



4 COMPOSITIONAL STACKING The different sized geometries are then stacked to achieve vertical variation. We are unable to identify a pattern (the original work is based on real simulations).

5 NEGATIVE AREAS It is hard to tell how the negatives are done in this piece, though one general characteristic observed is the tendency for the negatives to never be as deep as the geometries are high.

6 SURFACE PATTERNING A secondary design element; however, it is extremely interesting due to the suggestion that the primary geometries are formed from the patterned lines.

Start: Random point list+move

Start: Trim solid with base plane.

Start: Hexagrid lines + cull patterns.



1.11 1/0.333/0.333

1.13 5/0.430/0.001

1.12 4/0.430/0.333

1.14 5/0.430/0.112

1.15 4/0.525/0.115






In finding the primary geometry, we begin with the fractal script given in week 4, with minor variations. Each iteration on the left is labeled factor/scale1/scale2. In 1.13, a problem occurred when scale2 became too low (in this case 0.001): baked results did not give us the brep we saw in Grasshopper.
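The factor/scale1/scale2 labels suggest a recursive corner-splitting scheme. This is only a hypothetical reading of the week-4 script, not its actual code, but the logic can be sketched as: at each recursion level, place a shrunken hexagon at every corner of the parent, alternating between the two scale factors.

```python
import math

def hex_corners(cx, cy, side):
    """Corner coordinates of a regular hexagon centred at (cx, cy)."""
    return [(cx + side * math.cos(math.radians(60 * k)),
             cy + side * math.sin(math.radians(60 * k))) for k in range(6)]

def fracture(cx, cy, side, depth, scale1, scale2):
    """Recursively place shrinking hexagons at each corner.
    `depth` plays the role of the factor slider; scale1/scale2
    shrink the child cells at alternating corners (hypothetical
    interpretation of the labels)."""
    cells = [(cx, cy, side)]
    if depth == 0:
        return cells
    for i, (x, y) in enumerate(hex_corners(cx, cy, side)):
        s = side * (scale1 if i % 2 == 0 else scale2)
        cells += fracture(x, y, s, depth - 1, scale1, scale2)
    return cells

# Iteration 1.11 was labelled 1/0.333/0.333: one level of recursion,
# both scales at a third of the parent side.
print(len(fracture(0, 0, 1.0, 1, 0.333, 0.333)))  # 7 (1 parent + 6 children)
```

The 1.13 failure (scale2 = 0.001) fits this structure: near-zero children produce degenerate geometry that survives preview but not baking.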




Next, we needed to remove the top fracture of iteration 1.15. We find the component we need quite quickly, as seen in 1.20. Several methods were tried: the cherry picker + index cull method produced no results; Brep/Brep nodes culled entire orientations of faces; the Prune and Trim Tree functions were not appropriate. In the end, we use a Split Tree node and extract the tree (fracture) that we did not need. We then split that unnecessary fracture into faces, extracting its base as a capping for the rest of the fractal.



In finding a way to scale our geometry into four variations, we first attempt the point-distance method. Quickly we understand that center points (where the geometries are placed) will be needed, and thus from the box array we switch to the rectangle array. This succeeds in 2.21; however, in 2.22 we vastly improve it by scaling not only the height but the entire geometry, alongside a remapping of the domain to 1-4, meaning no more flat geometry. The major problem here was the residue geometry: floating pyramids make the algorithm unusable without manual intervention.
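The remapping move in 2.22 is a standard linear domain remap: a point-to-attractor distance in some source domain becomes a scale factor in the target domain 1-4, so the minimum scale is never zero (no flat geometry). A minimal sketch of that arithmetic (function name hypothetical):

```python
def remap(value, src_min, src_max, dst_min=1.0, dst_max=4.0):
    """Linearly remap `value` from the source domain into the
    1-to-4 scale domain used in 2.22, so no geometry ever
    scales flat (to zero)."""
    t = (value - src_min) / (src_max - src_min)
    return dst_min + t * (dst_max - dst_min)

# A geometry halfway between the nearest and farthest distance
# scales to the middle of the domain:
print(remap(5.0, 0.0, 10.0))  # 2.5
```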



I was interested in the point distance due to its dynamism, but more than this was the wish to keep the algorithm as continuous as possible. To this end, we settle for a much simpler and perhaps much more mundane solution: scaling the geometry four times. This meant that I was in total control of the scale instead of a point informing the scale. It felt very much like computerization (vs. computation).






2.30 STANDARD SCALING 1.0/0.8/0.5/0.4












Due to the nature of the project, it is near impossible to extrapolate an algorithmic pattern from simple observation. Thus we turn to random culling of rectangular arrays, making sure that the capping of each geometry had the chance to overlap, as one key element listed for Rules of Six was the fact that overlapping geometries became continuous. To achieve this continuity, we first try joining the brep; however, the surfaces are still not one, as evidenced by the overlapping lines. Trying the mesh, however, produced results closer to our intention: surfaces were now continuous.



In the first trial, I attempted to take from the Random Reduce node the list of points which dictated the placement of the primary geometries. This was not possible due to the size and complexity of the tree (as shown in the Param Viewer). In the end, I settled for a duplication of what was in essence the algorithm 3.10; however, this time the geometry arrayed was a smaller variant, and the array was moved up according to a vector defined by the height of the primary base geometry. This works because of how Grasshopper's pseudo-random nodes behave: randomness is based on a seed, and when the seed remains the same, so does the pattern. One obvious problem here was the fact that, akin to 2.21/2.22, there appeared some unwanted residue geometry; in this case, however, I did not know how to deal with it algorithmically and, not wanting to dwell, we move on.
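The seeded-randomness behaviour this trick relies on is the same in any pseudo-random generator, not just Grasshopper's: the same seed always reproduces the same sequence, so two copies of the same reduction stay in lockstep. A small illustration in Python (function name hypothetical):

```python
import random

def random_reduce(items, keep, seed):
    """Cull a list down to `keep` items using a seeded generator;
    the same seed always reproduces the same cull - the behaviour
    relied on when duplicating the 3.10 array."""
    rng = random.Random(seed)
    return rng.sample(items, keep)

cells = list(range(40))
a = random_reduce(cells, 10, seed=7)
b = random_reduce(cells, 10, seed=7)
assert a == b  # identical seed, identical pattern
c = random_reduce(cells, 10, seed=8)  # a new seed generally gives a new cull
```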



As mentioned before, the parameters for our Random Reduce node were far too complex. The workaround this time was to manually pick negatives through the cherry picker. For iteration 5.11, we used the fractals we scaled in part 2; however, due to the geometry being an open collection of breps with no capping or real joining, trimming was not successful. We instead create our own hexagonal extrusion for trimming, but quickly understand that trimming created complete hollows instead of negatives, due to Rhino's system of solids - not truly solid but a closed surface. We thus replace Trim with Solid Difference and end up with negatives closer to our intent. It is important to note that once we decide to use our own extrusion, the algorithm is split.



In 6.10, we try to trim a larger hexagon grid to the rectangle extracted from the surface of the box used as the base plane in part 5. This results in the failure shown in 6.10: not only are cells randomly culled, the intersecting curves are never shattered or split. In 6.20, we move past having the grid on the entire surface of the box and instead make a smaller grid that fits within the box. The first layer of larger hexagons is randomly reduced and then further region-unioned. The second, smaller layer is simply randomly reduced. The combination of these layers gives the surface pattern a complexity closer to the original.












SIMILARITIES From the very beginning, my task was to re-engineer Aranda Lasch's Rules of Six. Through splitting the installation into six elements, there are obvious connections between the final outcome of my algorithm and Lasch's work:

1. Primary Geometry: Hexagonal base with triangulated fractures at each corner.
2. Size Variation: In my case, four scales of the primary geometry, mimicking the variation in Rules of Six.
3. Compositional Surfaces: Overlapping surfaces are joined together to form a continuous surface.
4. Compositional Stacking: On top of our first-level primary geometries, we stacked smaller versions to replicate vertical variation.
5. Negatives: Using a hexagon-based geometry, we take a negative from the base plane.
6. Surface Patterning: Using a basic hexagonal grid, we layer and pattern to produce a complexity similar to Rules of Six.

DIFFERENCES It was obvious from the start that differences would occur; inferring from Lasch's website, it is clear that Rules of Six was not produced in Rhino. A major issue was the fact that the original work was based on real-world mineral formations. This meant the information that informed the geometry, composition and pattern was unavailable to us. In these cases, we had no choice but to turn to random functions or manual interpretations of how the pattern would generally look.

1. Primary Geometry: Modelling the geometry as seen in the picture may have been more accurate, as what we produce, though similar, is in many ways fundamentally different. Scale, the triangular fractures, the capping: we got close, but the proportionality and general feel of the piece feels wrong.
2. Size Variation: We guess that there are four general scales of the geometry; this may be completely wrong - there may have been hundreds of scales with only minor variation.
3. Compositional Surfaces: The composition or layout of the geometry itself, as mentioned before, had to be random, as we had no information to inform a plan. Relating to the composition, another major issue was that I did not diffuse smaller scales onto the base plane; unlike Rules of Six, there is only one scale dictating the pattern.
4. Compositional Stacking: The method used in my re-engineering was a mere clumsy workaround. In reality, there are more than two levels of stacking, and each stacked element varied in size.
5. Negatives: Unable to use the actual fractal, we used simple hexagon extrusions. This meant much of the complexity in the shape attributed to the fracture triangles could not manifest in my work. The hexagonal extrusions also did not vary in scale, meaning all negatives looked the same and had the same depth, once again simplifying the existing work.
6. Surface Patterning: A few problems here. Once again we did not know what informed the pattern in the original work, thus we turn to random culling methods. The second issue I could not overcome was applying the pattern to the negatives: some surface curves float above negative space. The third issue is that the hexagrids were extremely specific to the base plane used, as I could not find a way to trim the hexagrid to the specific face of the base-plane box.
7. One ignored element was the fact that Lasch's installation consisted of around 12 panels which each fit together with interlocking hexagonal edging; in terms of fabrication, this may be because of maximum 3D printing size, however it still exists as a design choice.

SPECULATION

In analysing Rules of Six in Part B1, we criticized the implication that Lasch's work scales indefinitely. As an even simpler re-engineering, I stand by this statement. Alone, the algorithm I have produced cannot become a plan for a house, a complex of buildings, or a layout for a city; the next step in pushing this is to decide on a direction and further specialize the code. For instance, if each fractal were now to become a building, what script could be added to computationally add openings such as windows and doors? Further, could the surface pattern perhaps inform some kind of layout for each building? In terms of performance, can we use plug-ins such as gecko to optimize where each building might be (as opposed to the random function we use)? Lasch identifies, whilst exaggerating, the potentials of her forms. Unrestrained by the aim of re-engineering, I would like to explore how far this script can be pushed in scale.













SPLIT {0;0;3;0}

SPLIT {0;0;1 to 3;0} {0;0;5 to 6;0}


SPLIT {0;0;1,3,5;0}

SPLIT {0;0;1 to 3;0}

SPLIT {0;0;1,3,5,6;0}

SPLIT {0;0;0,2,4;0}











SEED = 2

SEED = 3

SEED = 4

SEED = 5

SEED = 6

SEED = 7
















From the beginning, I felt that though my script was to a certain degree robust, it was overly specialized, as if existing only to the end of re-engineering Aranda Lasch's Rules of Six. In regards to B3, I felt that this was appropriate, and for its aim, successful. However, having split the elements into six, iterating the whole script into something completely different was extremely difficult.

On purely computational power, the slightest change in, say, the factor size of the fractal would take upwards of five minutes to compute, depending on the value. Other sliders could take double, triple, quadruple that, or even just crash the program. This led us to iterate based on sections: firstly with fractals, then with capping, composition, etc.

Changing inputs felt like the only way to truly achieve some kind of new form; however, there were not many areas to do this without significantly altering the script - to a degree that, though it may produce a differing form, is contrary to the aim of producing something different from what is fundamentally the same. Where input interchange was possible, however, I found opportunities that moved far from the original project, evident perhaps most prominently in surface patterning.

In searching for selection criteria, in line with our objective of creating something unidentifiable from the original, I wanted to contrast in purpose. For Lasch, Rules of Six was a sculptural installation created for exhibition. The purpose there is symbolism, experimentation, proofing, expression. For my criteria, I wanted something more pragmatic and in a very physical sense usable. Buildings here would not make sense. A master urban plan even less so. I chose instead to select based on potential for seating.

With the criteria set, some obvious decisions are made. Firstly: all fractals were not appropriate. In composition iterations, shapes other than the triangle or square, though possible, were not optimal for seating, because the area of the top flat surface (the seat) shrunk with every added polygon face. All seeding iterations fit as layouts for how seats may have been arranged. Finally, in surface patterns, an obvious choice was the extrusions of the various grids; however, lofts showed some potential due to their entirely curved forms allowing surface area for seating.




In developing our fabrication technique, we take one of the successful iterations from B3, in this case the triangular grid extrusion.

In the render there are two elements to the composition: the black is a shorter step or foundation sitting above the site; on top of the black steps are the white triangular seats themselves. Accordingly, touching pieces are joined into one continuous surface.

In fabricating the piece, we take an example from each of the two components, scaling them to fit within an A4 sheet. Fabrication is inherently complex. It considers factors such as labor, cost, time and safety - factors not explored in depth here. Materiality was another major concern we skimmed; in our experimental models, we assume the paper to represent some kind of wood.

Instead, we look at connections, repeating our paper models using three different methods.

Firstly, drawing inspiration from Weinstock's experimentation with wood and carbon tape, we tape the edges of our components. A major benefit of this was the lack of need for additional design considerations (notches, tabs, joints); tape could be applied directly to the cut-outs.

Secondly, we looked at the lamination of wood - how a structure could gain rigidity through layering and gluing. Quickly we found that, due to the thickness we were working with, such properties could not be feasibly achieved.

In addressing the problems of our second method, our third experimental model focused on structural framing. I found this to be the most successful. Through gluing ply to the edges of the paper cut out, we gave the model weight, structural strength and rigidity, all required for seating.












SITE ANALYSIS What I noticed from the site: To the North: Cleared areas, empty of all but browning grass. Suburbs to my left. Warehouses to my right. To the South: Noises of the traffic on the bridge. The darkness of the underpass. The Graffiti lining the walls. Around the site: faded signs and huge electrical pylons.

Cleared, Empty Land

Warehouses, Industrial Buildings

The people on the track were in constant motion. I wanted to show, however, that there were sights worth seeing. In contrast to the intents of the walking and biking track, in contrast to the flow of the river, I wanted to create a place to stop. How could I encourage that? At first I wanted to array my seating around the entire bend of the river, but that wouldn't really respond to the conditions of the site. I want now to take a closer look and record the locations of places worth stopping for; the seats I design will be, indirectly, a marker for these places.

Electrical Pylons Suburbs Bridge Underpass



Performatively, there were also many conditions of the site I wanted to address. These were perhaps more in line with my research field. I wanted the seats not to be standardized, disregarding the site, but to respond to it. In physical form, this meant following the curvature of the river and the contours of the land. In regards to site conditions, I wanted my seating to respond to flooding through not only physical design but adaptive materials.


PRECEDENTS Aranda Lasch, 20 Bridges for Central Park: like Rules of Six, successful in experimentation and in giving meaning or symbolism physical form. In this project she talks about extending the culture of iron and ceramic bridges in Central Park. Though not inherently an issue, it lacked physical purpose or tangible use, something I wanted in my own work. Biothing, Seroussi Pavilion: similar to Rules of Six as an experimental piece, but I used this example as I was interested in using fields for my exercise. A success here is how they manage to wrap these fields around to create the skeletal pavilion, able to stand to some degree on its own. A failure for me is that they do not cite the inspirations for the points or the form… is it purely for the sake of interesting form? For my work, I wanted the design to be influenced and informed by conditions present on site.





1. To have physical purpose - being a realistic seating platform

2. To improve existing seating conditions
- views
- the placement of the seats
- the material response

3. To respond to performative issues of the site
- flooding
- sun exposure
- leaf fall
- rainfall
- car traffic noise

4. Human traffic
- manipulating vs physical obstruction

5. Structural stability
- semi-permanent lifespan
- ease of fabrication

So these are the aims I have set myself, and against which I want to compare my final product. I wanted my seat to be a seat: physical purpose took priority over pretty much everything else. There were also around five benches that I identified on site; how would my design compare to them? There were also many performative issues about the site: how did, say, the materials and the structure respond to conditions such as flooding and rainfall? I mention before that I wanted to encourage people to stop; this meant to me manipulating human traffic as opposed to physically obstructing it. Did my design clash with the original intentions of the track? Structural stability: I wanted the seats to be semi-permanent. Did the structure represent that goal? And if so, was fabrication feasible?

GRASSHOPPER The Grasshopper functions that I am interested in are really based on the point. I felt that points were an extremely simple but effective way to let the site inform the design. From points, I wanted to experiment with metaballs, fields, rotational fields, graph controllers, lofting, piping, extruding, clusters and recursion.


NEXT STEPS The next step is to really take my ideas and push them into reality. I know what I want to make, why I want to make it, how it improves the site, what I am interested in, and how that relates to the theory. From here, I need to find the site conditions which will inform the starting points, or the 'bottom', of my design. From there, I will need to experiment with recursion or other Grasshopper functions to grow the design up.





For the design project specifically for our studio, it felt that though there were stakeholders in organisations such as CERES, general populations such as the suburban residents, the power facility, and the land itself, another major stakeholder was ourselves.

In terms of developing skills in the three dimensional media, I felt that the re-engineering project of case study 2.0 was most beneficial.


From the re-engineering, not only did I explore many of the techniques taught to us through combining and flexing, I was able to understand how more complex algorithms could be structured and how this structure limited them. I mention a number of times that the final result seemed restrictive. In the Interim presentation, it was suggested that the restriction may be inherent. This was something I wanted to explore further.

OBJECTIVE 2

What is it that we wanted to achieve? What is it that we wanted to explore? In regards to this, I felt I satisfied the objective, seen perhaps most clearly in B6 Proposal. I found a purpose for my design, and set the criteria against which I could compare my design.

I felt that I was able to develop my skills in making a case for proposing my design. Seen most clearly in B6 Proposal, I identified what I wanted to achieve, why I wanted to achieve it, how it was beneficial to the site, and the criteria against which I wanted to compare my design. It is hard to judge personally how persuasive my arguments may be; however, judging from some comments in the Interim presentation, there was some extent of success in regards to the outcome of this objective.



It is hard to assess my outcomes in this. What situations does the guide refer to?

An understanding of architecture and air. To be honest, I do not feel like this was an objective I had solid outcomes for. It is a vague objective, and I understood this objective as thinking about designs in real physical space.

Referring once again to B3 Case Study 2.0, the fact that I was able to split the project into six and then re-engineer a project in its likeness shows the extent of my developed capabilities in analyzing contemporary architectural projects.

My models in B5 were simple, to say the least. Though physical, they were not close to any scale, and were not designed to respond to site conditions such as sun exposure, rainfall, flooding, etc.

I mentioned many differences between my own conception and the original project, and in doing so outline my current level of critical thinking in regards to analyzing projects.

In terms of a purely theoretical circumstance, I felt that I achieved strong outcomes in B2 and B4, where I iterated existing scripts. In that situation, yes, I was able to generate a variety of design possibilities. However, was I able to translate this to real-life situations, in which the conditions are many times more complex? At this stage, the answer is clear: no.

This is something I need to work on. There is still a long way to go, but the outcome for this objective was successful to me.


OBJECTIVE 7 In my exploration of B1, I felt that I really started to extend the explorations first seen in Part A into foundational understandings of computational geometries, data structures and types of programming. Through my research into bottom-up design, and my analysis of what we were doing now in Grasshopper, I found the outcome for this objective, though still developing, a success at this stage.

OBJECTIVE 8 Because we are still very much beginners in the computational field, I felt that any personalized repertoire at this stage would be more a limitation than a benefit. Now is the time to explore the limitations and the range of functions available. Comfort kills creativity. It may someday happen to me, but not so soon. For the outcome of this objective, I have failed - but it is not something I can honestly feel unsatisfied about.





The examples I picked were from weeks 5 and 6. These exercises really represented what I wanted to pursue in Part C. In particular, I liked the point, as it was so easy to inform a design based on existing information in point form (contours, flood heights, shadows, etc.). From points, I wanted to experiment with metaballs, fields, rotational fields, graph controllers, lofting, piping and extruding. Further, in line with my explorations in B1, I wanted to try my hand at clustering and recursive functions.


REFERENCES
Negroponte, N. (1970). The Architecture Machine. Cambridge, Mass.: MIT Press.
Self, J. (2013). Darwin Among the Machines. Architectural Design, 83(4), 66-71. doi:10.1002/ad.1620
Weinstock, M. Fabrication Intelligence.
LECTURES
Lecture 3: Dominique Hes, Working in the Future Built Environment
Lecture 4: Gwyllim Jahn, Parametric Modelling
Lecture 5: Stanislav Roudavski, Patterning
CASE STUDIES & IMAGES
20 Bridges for Central Park; Seroussi Pavilion; Rules of Six; Wanderers; Spanish Pavilion




A SEATED VIEW Introducing seating that responded to the site, that responded to the objectives I had set, that balanced the negative man-made elements on the bank of CERES - I knew what it was I wanted to do, and now it was time to find out how.

LEARNING FROM FAILURE: REFERENCED RECURSION Though ultimately not taking the form of a seat, the branching L-Systems using Hoopsnake were no doubt interesting; yet it was not mere interest that drove this design, it was responding to the ultimate, and perhaps most neglected, stakeholder: the landscape itself.

What this meant for me is that form referred to actual elements existing on the site. Relating back to my interest in the point, stated in Part B, I began each iteration with three vectors that pointed to a specific structure or view I felt related to my design objective of balance, including structures such as the close-by bridge and the opposing factories.

In the end, there were two major issues we faced that halted further exploration:

Firstly, the transformation from reference to vector was far too arbitrary. Why three vectors? There were virtually an unlimited number of things we could point to. Why the location of the origin? Was there a reason for the primary point, or could it simply be anywhere?

Secondly, driving the algorithm: in hindsight, encouraged by my algorithmic interest in recursion, taken from my precedents of Aranda Lasch and Neri Oxman, I had chosen to explore the Hoopsnake looping system without fully understanding it. Simply following the tutorial and replacing inputs created a very shallow design that I could not further transform from the branches into a real seat.

These two points were of course not the only issues, yet we must take from every failure the lesson we can. In the case of my Hoopsnake experimentation, though falling short here, I wanted to push my design to more closely and meaningfully reference the site and its elements.
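The looping that Hoopsnake performs for a branching L-System can be illustrated, outside Grasshopper, by the textbook rewrite step an L-system applies each pass around the loop. This is a generic sketch, not the definition used in the tutorial I followed:

```python
def lsystem(axiom, rules, generations):
    """Rewrite the axiom string `generations` times - each pass
    corresponds to one trip around a Hoopsnake-style loop."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# A classic branching rule (F = grow; [ and ] push/pop a branch):
print(lsystem("F", {"F": "F[+F]F[-F]F"}, 1))  # F[+F]F[-F]F
```

Each generation the string (and thus the branch count) grows geometrically, which is exactly why simply replacing inputs, without understanding the rule being rewritten, gives so little control over the resulting form.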

Line Projection Iteration 1

Line Projection Iteration 2

Line Projection Iteration 3


Hoopsnake Iteration 1

Hoopsnake Iteration 2

Hoopsnake Iteration 3




At the time, what really pushes the exploration into the Rabbit plugin is the search for a form more suitable for seating, in accordance with my criteria and intent; yet in reflection it is obvious that it shares the major problem of our first experiment in Hoopsnake: though this time it may have formed planar surfaces capable of seating, this did not excuse the shallow method of simply replacing inputs through form and layout.

In the previous Hoopsnake iterations, a major problem is the weakness in reference. The eight-point system is my response to the issue: taking the wider site into consideration (the southern bank of CERES), I looked at all the major 'negatives' of the site. Of course, this was still to some extent arbitrary, yet I felt it took a step in the right direction: no longer was I restricted by some number related more to the pragmatics of the algorithm than to the site; instead we looked at the space of the site itself, creating and developing more meaningful reference points. These eight points are first used in our Rabbit iteration, through connecting the dots and creating a basic form.

A second, contrasting issue however was the nature of how I approached the design brief of a 'living architecture'. My response to this was seemingly conveniently answered through Rabbit's Cellular Automata system, which through adjusting parameters could grow, in real time, a form based on the life and death of cells in relation to each other's positions. Yet once again this response to the brief was taken almost exactly from an existing system. I never asked myself what 'living architecture' could or should mean for the project. Simply being related to living and dying did not make it appropriate; simply taking, without modification, from another could not be called design.
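The live/die-by-neighbour rule that Rabbit animates can be sketched with a generic Game-of-Life-style step (Rabbit's exact rules and parameters may differ; this is only an illustration of the mechanism):

```python
def life_step(alive, width, height):
    """One generation of a Game-of-Life-style cellular automaton:
    a dead cell with exactly 3 live neighbours is born; a live cell
    survives with 2 or 3 live neighbours (generic B3/S23 rule)."""
    def neighbours(x, y):
        return sum((nx, ny) in alive
                   for nx in (x - 1, x, x + 1)
                   for ny in (y - 1, y, y + 1)
                   if (nx, ny) != (x, y))
    return {(x, y)
            for x in range(width) for y in range(height)
            if neighbours(x, y) == 3
            or ((x, y) in alive and neighbours(x, y) == 2)}

# A "blinker" oscillates between a row and a column of three cells:
row = {(1, 2), (2, 2), (3, 2)}
print(sorted(life_step(row, 5, 5)))  # [(2, 1), (2, 2), (2, 3)]
```

The point stands regardless of the rule chosen: the form "grows" entirely from the rule and the starting cells, so using it unmodified means the design decisions were made by someone else's system.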


Generic Box Iterations

8 Point Form Iteration


LEARNING FROM FAILURE: LANDSCAPE OF SEATING With the final experimentation, used for the final presentation, I aimed to respond to the major issue of a shallow algorithm by simplifying the idea of recursion. This meant looking back to Case Study 2.0, in which I reverse engineered Aranda Lasch’s Rules of Six. The term I used to describe the process I took to designing the algorithm of Case Study 2.0 was ‘selected repetition’. It was far less complex than designing a loopable script for insertion into nodes such as Hoopsnake, relying instead on brute force: simple repetition of predetermined clusters with selective inputs. This selective repetition I felt was a success. I could now really drive the layout, the form and the scale, and thus direct the algorithm toward a result that corresponded purposefully to my design intent. Another success I felt with this piece was further developing the form that responded to the site: in our first experiment with the eight point system, we merely connected the lines and extruded. In this, height is no longer arbitrary, but a scaled value in relation to the distance away from each of the eight points.

This generated form was then flipped and joined to itself to the end of a planar seating surface, a decision that will be discussed further in C1.2 and C2.
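The distance-referenced height mentioned above can be sketched in a few lines. This is a hedged illustration, not the actual Grasshopper definition: the point coordinates, the mean-distance mapping and the scale/base values are all invented for the example.

```python
import math

def scaled_height(sample, reference_points, scale=0.05, base=0.3):
    """Height at a sample point as a function of its distance to the
    reference points, so that height is driven by the site rather
    than set arbitrarily. The mean-distance mapping and the scale
    and base values are illustrative assumptions."""
    dists = [math.dist(sample, p) for p in reference_points]
    return base + scale * sum(dists) / len(dists)

# Hypothetical reference points on a flat site (metres).
refs = [(0, 0), (40, 5), (75, 20), (90, 60),
        (60, 95), (20, 90), (-5, 55), (-10, 25)]
for sample in [(30, 40), (80, 80)]:
    print(sample, round(scaled_height(sample, refs), 2))
```

Because the heights are derived rather than modelled by hand, moving a reference point automatically reshapes the extrusion, which is the behaviour the selective repetition relied on.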

The piece was however not without failure. In feedback for the final presentation, one criticism was that I had based my layout on the contours of the site: was this something that the user could really read? It felt like, for the sake of referring to the site, I had pretended the contours were relevant to the project. If the landscape was really built, would the scaling (done by sectioning each contour) be seen as something coherent? In reflection, I thought not.

A secondary problem was the design intent itself. Somewhere down the line it had watered down, and I lost focus and vision on what I wanted to do, what I wanted to achieve with this piece of architecture. In the end, it was not sculpture, nor was it a high-tech manifestation of novel never-seen-before ideas. With a weak response to the objective, the design would naturally fall apart. I claimed to try to balance the negatives of the site by directing users to a certain view, a certain landscape that I felt represented the creek more than, say, the electrical pylons, suburbs and warehouses, yet for the sake of impact, for the sake of readable scaling, I had increased the area to almost the entire southern half of the CERES bank. How could the user know what it was I wanted to show? They could not. It was an unconsidered mess. I claim to respond to the context of the site, yet the result is so shallow, so weak, so inadequate.




















AN IMPROVED REFERENCE: 12 POINT SYSTEM In the 12 point system, now spanning the entire bank of CERES, the negatives were chosen in accordance with our criterion of elements ‘immediately or easily visible’:

1. Electrical Pylon South
2. Graffitied Sign South
3. Suburbs
4. Electrical Pylon North
5. Graffitied Sign North
6. Warehouse North
7. Warehouse East
8. Goldsmith Grove (road)
9. Warehouse South-East 1
10. Warehouse South-East 2
11. Warehouse South
12. Blyth St - Arthurton Rd Bridge

[2] Graffitied Sign South

[9] Warehouse South-East 1

[10] Warehouse South-East 2

[4] Electrical Pylon North

[1] Electrical Pylon South

[5] Graffitied Sign North

[12] Blyth St-Arthurton Rd Bridge


RESPONDING TO FAILURE: In the previous explorations, there was no doubt improvement in our selection of references: from the very arbitrary three-vector branching system, to the eight point system, to the eight point system with referenced height. Yet there was one major issue that I could not really justify: why the South bank? What was specific about my selected area that influenced the project? There was no real reason.

So the question became: what would be the new space from which we took our points? Either we had to define a smaller space or take the whole. Ultimately, the latter was chosen over the former due to the scale of the design objective we had defined: the change or balance we aimed to achieve, overcoming the negatives of the site, did not affect any one particular area but the entirety of the site.

This thus leads us to question: why CERES? Was it simply because the tutor suggested the location? I hoped that this was not the case. In every step I questioned how the context shaped my design, and it was no different here. Through the four principles or missions CERES state on their website, I chose one that would dictate the rest of my design:

“Embrace and facilitate rapid change”

What would this mean for the design then? I began to move from the idea of balance to the idea of embracing and facilitating balance. Objectively, was a landscape of seating going to balance the negatives of the site? I did not feel so. What could it do then?

To change we must first understand what it is we change, and why it is we change.

This statement brought upon us a new design intent: awareness through seating. This also responded to our issue of readability: it was in fact my first experience on site that spurred the initial intent of balancing negatives, as it was through negatives that I read the site. In contrast to contours, by putting my seating on each of the negatives I chose through my new 12 point system, we inherently incorporated their undeniable presence and power. Faced with the looming steel pylons, the unending sound of traffic echoing underneath the bridge, the graffiti on virtually everything, I wanted the user to ask themselves:

Was this the Merri Creek that should be? That could be?

This new focus on my design intent I hoped to keep throughout the process of design.

The final point I wanted to write on was how this all then fit into the brief of a ‘living architecture’. In Rabbit, we explored Cellular Automata: life in the algorithm, albeit simplified. It was extremely interesting, yet in terms of understanding, in terms of creating, it seemed out of my reach. I instead took the idea of living and dying in the context of our new principle of facilitating and embracing change, referenced from CERES, and aimed to produce a seating that could be quickly and easily fabricated, constructed and disassembled: a seating designed to have an impermanent lifespan. It was in this consideration that I tried to embody an architecture with ‘life’.

What this meant for the design was that I would avoid at all costs 3D printing, and focus instead on laser cutting planar surfaces that could fit together through detailing - something I will explore in greater detail in C2.

CONNECTING THE DOTS We connect the 12 negative points we have selected to form the outline of our initial polygonal shape.



1. Connect Selected Reference Points

2. Fillet Polyline

3. Extrude to Two Center Points

4. Contour + Extrude

5. Fabrication Detailing

DEVELOPING THE FORM From our connected points, we fillet the resulting polyline before extruding to two points, found by approximating the centers of the two halves of the polygon. From there, contours are used to determine the thickness of each stacked board, finishing with simple details for ease of joining after laser cutting/fabrication.
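The ‘center of the two halves’ used as extrusion targets can be approximated as below - a sketch assuming a simple vertex-average centroid and an even split of the vertex list, whereas the actual definition may use Grasshopper's area centroid:

```python
import math

def centroid(points):
    """Vertex-average centre of a set of 2D points - a rough
    stand-in for an area centroid."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def extrusion_targets(polygon):
    """Split the polygon's vertex list into two halves and take the
    centroid of each as the two points to extrude towards."""
    mid = len(polygon) // 2
    return centroid(polygon[:mid]), centroid(polygon[mid:])

# A hypothetical regular 12-gon standing in for the connected points.
poly = [(math.cos(i * math.pi / 6) * 10, math.sin(i * math.pi / 6) * 10)
        for i in range(12)]
a, b = extrusion_targets(poly)
print(a, b)
```

Extruding towards the two centroids rather than arbitrary points is what kept the stacked form balanced over its footprint, for the sake of structural stability.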

SELECTIVE REPETITION With the form and method selected, it was time to choose what we were going to repeat in our design. To introduce variance and complexity, each seating at each point would be based on a slightly different polygon, each consisting of 11 out of the 12 points. The point that would be excluded from the polyline represented the location of the seat. For instance, in this image, point [6] Warehouse North is the location of the seating, and thus the point is excluded from the resulting polygon. This process is then repeated 11 more times, one for each location/point.
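The exclusion rule described above reduces to a short loop. A minimal sketch, using placeholder labels rather than the surveyed coordinates:

```python
def seat_polygons(points):
    """Selective repetition: for each of the n reference points,
    build the polygon from the other n-1 points. The excluded
    point is where that seat itself is placed."""
    return {i: [p for j, p in enumerate(points) if j != i]
            for i in range(len(points))}

# Placeholder labels standing in for the 12 surveyed negatives.
labels = [f"P{i}" for i in range(1, 13)]
polygons = seat_polygons(labels)
print(len(polygons))        # 12 seats, one per point
print(len(polygons[5]))     # each polygon uses 11 points
print("P6" in polygons[5])  # point [6] excluded at its own seat: False
```

Each of the 12 resulting polygons is subtly different, which is where the variance and complexity of the layout comes from.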















LAYOUT DIAGRAM - NOT TO SCALE Due to the size of each seat (max width 500mm, height 420mm) and the relatively long distance between each, it was not possible to show them in plan to scale. This diagram shows the approximate proposed location of each seat. Seats that could not feasibly be placed were moved horizontally towards the bank until they were in a more open location.



[4] Electrical Pylon North

[12] Blyth St - Arthurton Rd Bridge



SIMPLICITY IN PLANARITY: CARD CUTTING PROTOTYPE As mentioned before, in accordance with our newly adopted principle of “facilitating and embracing rapid change”, I wanted to make our temporary seating structures easy not only to fabricate, but to construct and later disassemble and remove. This meant planarity, but as I soon found out, planarity did not necessarily mean simplicity or ease of fabrication. In the Seated Landscape exploration, one key decision was made for the initial form: instead of using a function such as smooth mesh to create a more comfortable and feasible seating space, we flipped the form over itself to create a flat surface free of protrusions and sharp angles. Perhaps the complexity of the task should have been obvious from the time we unrolled the surfaces, yet it was only when I had hands-on experience trying to fabricate the piece that I understood that being flat did not necessarily correlate with easy construction. With easily over 30 panels per seat (not including internal frames or joinery), it was clear that this fabrication method would not be in line with my objective of facilitating rapid change and thus rapid realization of our design.



A DIFFERENT PLANARITY: SECTIONING/STACKING As aforementioned, unrolling a relatively complex brep would produce a corresponding complexity of surfaces - in our case, surfaces that were non-uniform and thus required very specific detailing and joinery. Keeping in mind the intended use as seating, a hollow structure would not adequately hold the weight of the user.

So what could we do? Clearly 3D printing was still out of the question; it was just not practical to implement in the current technological environment. I believed there was still something to be said for planarity: though it may not mean inherent simplicity, it still held the potential for it. This belief manifested in my design as the sectioned mesh: though the initial polygon is referenced from our 12 selected points, pragmatics to the end of efficient fabrication and construction/deconstruction heavily influenced the final form. For instance, when the polyline is extruded to the two points, what defined the locations of those points? Though I had tried using the 12 point system and referencing other elements of the site (the center of CERES vs the center of the industrial zone?), ultimately we take the centroids of the two halves for the sake of structural stability.

Another legitimate question that may be asked is what determined the number or size of the sections. One major advantage of the parametric system used is that these numbers were readily modifiable, responding to available materials and available time: thousands of sections were possible depending on whether sheet material was used, or the number of stacks could instead be reduced to as few as 3 for the sake of cutting labor and transportation to the site.

Stacking of course eliminated our issues of compressive strength due to its fully solid construction, yet as it was, there was little to no resistance to shearing, causing even minor forces to collapse the stack. Another issue was defining where the stacks would be placed in order to achieve the form we developed; one idea was to etch the outlines on each panel so we could line the panels up one by one. In the end, the detail we used, though extremely simple, felt like it dealt with these key issues whilst retaining that very simple, very fast design: using square rod cores at each of the two points we extruded to. The advantages were multi-faceted:

now there was a real resistance to shear, and on the bottom a rod that could be attached to the surface relatively easily, giving the seating structural integrity. At the same time, the cut-outs for each rod core would ensure the panels lined up correctly and rather intuitively; given a basic image of the intended form, no plans or further instructions would really be needed.
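The readily modifiable sectioning discussed earlier comes down to a single division. A sketch with assumed material values (the 420 mm seat height is taken from the layout notes; the sheet thicknesses are hypothetical):

```python
import math

def board_count(seat_height_mm, sheet_thickness_mm):
    """Number of stacked boards needed to reach the seat height,
    given the thickness of the available sheet material."""
    return math.ceil(seat_height_mm / sheet_thickness_mm)

# Hypothetical materials: thin 6 mm card vs. thick 140 mm slabs.
print(board_count(420, 6))    # 70 boards from thin sheet
print(board_count(420, 140))  # 3 boards from thick stock
```

Because the count is derived from the parameters rather than fixed, swapping material (or trading cutting labor against transport) only means changing one input and re-cutting.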





GROUND UP Building the seating as it would be on site, at scale: from the bottom up, we first install the rods into the ground before slotting in each panel.









There can be no doubt that the brief we were given for the final project was open; in tutorials, even ideas such as dresses were discussed as feasible explorations.

Generating a variety of design solutions to a given situation: this was an area in which I felt extremely lacking. It is true that I had explored many different plug-ins and algorithmic styles, yet was this not merely to the end of adequacy?

Developing skills in three dimensional media. I have no doubt that I improved my skills in programs such as Rhino, Photoshop, Grasshopper and rendering systems such as Vray.

Keeping the base requirements of a living architecture, it was implied that we were to find our own objectives and aims within this project.

Through my adoption of one of CERES’ ideals, and through my consideration of site, users, and experience, I found my aim: awareness in seating. Encouraging the user to consider potential issues was the first step to change, and it is this first step that I wanted my design to embody as a facilitator of change. I felt that I had justified it well and referenced meaningful elements taken from the context, and thus leave this subject satisfied in regards to Objective 1.

Variety in exploration was spurred by failure, not a search for options. Instead of selecting the most successful or promising solution, it felt like from the beginning there was only one style I wanted to pursue: recursion. Though this desire brought me many different experiences, I did not feel like I satisfied the intent of Objective 2.

I am infinitely better at parametric modelling than I was 12 weeks ago. Being my second studio, I have improved my diagramming in very meaningful ways (superimposition, rendering, proportions, layouts). This was also my first time with digital fabrication, and though my projects have been simple, I am happy with what I have produced this semester (though it did get expensive at times haha).



My skills in presenting a proposal orally were, I felt, extremely lacking, though hopefully my ability to think critically shows in my writing.

I did not do much of this in Part C, but looking at the course as a whole, it is clear that I have been influenced by the precedents I have chosen. Where applicable, I try to be critical in each case, finding what I like and dislike and more importantly, why.

I hope to work on my oral presentation skills to better articulate my thoughts and thus convince my peers, my seniors and my future clients. In this digital world, in this digital field, it is necessary to understand the importance of face to face and physical presentables.

Once again I must refer to my case study projects 1.0 and 2.0 and say that I have satisfied the intents of this objective.


OBJECTIVE 4 This objective still felt very vague to me, but perhaps it is purposely so. If that is the case, my interpretation was understanding the relationship between the digital and the real through physical models “in atmosphere”, and in this regard I certainly achieved a greater understanding and appreciation of the two mediums. Physical models gave that sense of scale which is lost in the infinite space of the Rhino drawing space: where a kilometer can be as small as a centimeter, we lose understanding of what it is we actually design. Perhaps more significantly, they gave a real complexity that in many ways transcended diagrams.



Certainly I have developed the foundations of algorithmic and thus data structure understanding. Concepts such as flattening, grafting, simplifying and complex path mapping are now terms familiar to me.

I mentioned this before, but I felt like personalization of a repertoire was not necessarily something to encourage at such an early stage of our exploration. I have barely scratched the surface; yet it is true that we all have interests, and it is true that we all want to follow our interests.

For me, this was recursion. Perhaps this is what they meant by a personal repertoire, but I know recursion is not the only style that will interest me, and though specializing will eventually be necessary, it is not now, and it will not be tomorrow.

One issue I have, however, is understanding tree structures as they become more complex; this is especially true when trying to line up different structures together. All in all, I felt like I have most of the basics down.



Algorithmic Sketchbook Preliminary  

Design Studio Air Tutorial 5 Donovan Ong
