We are approaching the first “crit” of the term, and our students are already proposing joyful projects for the Burning Man festival and Buro Happold’s newly refurbished HQ on Newman Street. The talented photographer NK Guy (**http://nkguy.com/** and **http://burningcam.com/**) gave an excellent evening lecture at our campus to inspire our students and to mark the release of the book “The Art of Burning Man” (Taschen), which will feature some of our studio’s work. Here are a couple of images of the students’ projects and of our buzzing DS10 space (pictures by Toby Burgess):

# Tensegrity

Tensegrity, also known as tensional integrity, is a portmanteau coined by Buckminster Fuller in the 1960s. It is a structural principle based on the use of isolated components in compression inside a net of continuous tension. It is also known as “floating compression”, a term promoted by Kenneth Snelson. The compressed members, such as struts or bars, do not touch one another, and the prestressed tensioned members, such as tendons or cables, define the system spatially. Snelson defines tensegrity as a closed structural system composed of a set of three or more elongated compression struts within a network of tension tendons, the combined parts mutually supportive in such a way that the struts do not touch one another, but press outwardly against nodal points in the tension network to form a firm, triangulated, prestressed, tension and compression unit. A triangulated network is stronger and stiffer than a non-triangulated one.
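The simplest unit satisfying Snelson’s definition is the three-strut tensegrity prism. Below is a minimal sketch in Python (a hypothetical stand-in for the parametric tools used in the studio, not our actual definition): two parallel triangles, the upper one rotated by a twist angle commonly quoted as 90° + 180°/n, i.e. 150° for three struts (the sign and the exact tendon layout depend on convention), with struts spanning between the triangles and cables along the polygon edges and diagonals.

```python
import math

def tensegrity_prism(n=3, radius=1.0, height=1.0):
    """Node coordinates and member lists for an n-strut tensegrity prism.

    Bottom polygon nodes sit at angles 2*pi*i/n; the top polygon is
    rotated by a twist of 90 + 180/n degrees (150 degrees for n = 3).
    The tendon layout below is one plausible choice, for illustration.
    """
    twist = math.pi / 2 + math.pi / n
    bottom = [(radius * math.cos(2 * math.pi * i / n),
               radius * math.sin(2 * math.pi * i / n), 0.0) for i in range(n)]
    top = [(radius * math.cos(2 * math.pi * i / n + twist),
            radius * math.sin(2 * math.pi * i / n + twist), height) for i in range(n)]
    struts = [(bottom[i], top[i]) for i in range(n)]                    # compression
    cables = ([(bottom[i], bottom[(i + 1) % n]) for i in range(n)] +   # bottom polygon
              [(top[i], top[(i + 1) % n]) for i in range(n)] +         # top polygon
              [(bottom[i], top[(i + 1) % n]) for i in range(n)])       # diagonal tendons
    return struts, cables

def length(member):
    """Euclidean length of a (start, end) member."""
    p, q = member
    return math.dist(p, q)
```

By symmetry all struts come out the same length and no two struts share a node, which is the defining “floating compression” condition.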

**Biotensegrity**

Biotensegrity, a term coined by Dr. Stephen Levin, is the application of tensegrity principles to biological structures. Harvard physician and scientist Donald Ingber has developed a theory of tensegrity in molecular biology to explain cellular structure. The shapes of cells can be mathematically modelled if a tensegrity model is used for the cell’s cytoskeleton. The cytoskeleton is a network of fibres composed of proteins contained within a cell’s cytoplasm; it is a dynamic structure, parts of which are constantly destroyed, renewed or newly constructed.

The study of tensegrity structures started with the geometric remodelling of the tensegrity model using digital software, to understand the deployability of the system. The basic form of tensegrity was explored through the addition of struts to the cell module. The basic cell modules were then combined according to geometric tessellation, and applied to both regular and irregular surfaces while maintaining the structural frequency and mesh tension.

Using a basic cell module of three struts, the tensional components were explored with both cables and fabric. The fabric helps create enclosures and makes the installation more aesthetically pleasing. The design is still at an early stage, studying the behaviour of the system, with the goal of expanding the design possibilities in the future, whether for Burning Man or Buro Happold.

In the next project, the possibilities of tensegrity structures will be explored further, and the design will focus on the playful intervention of the tensegrity structure and the advantages of this super-lightweight system for both the Burning Man and the Buro Happold proposals.

# Recursive Growth through Aggregation

The beauty of recursive algorithms is that they can generate intricate sculptural shapes from a simple definition. The recursion starts from a base condition (an element, object or shape) that is not itself defined recursively; the following iterations are defined by loops in which items are repeated in a self-similar way. Different structures arise from subtle variations of the function definition, creating forms reminiscent of plants, corals and micro-organisms. With this initial investigation, and further physical representation exercises, my aim is to explore how design can be defined through recursive aggregation.
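A minimal sketch of the idea in Python (an illustration only, not the studio definition): each segment spawns two shorter, rotated children until a depth limit, the base case, is reached, producing a simple plant-like binary tree.

```python
import math

def branch(x, y, angle, seg_length, depth, segments):
    """Recursively grow a 2D binary tree of line segments.

    The base case (depth == 0) stops the recursion; every other level
    is defined in terms of two smaller, rotated copies of itself.
    """
    if depth == 0:
        return
    x2 = x + seg_length * math.cos(angle)
    y2 = y + seg_length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))
    for turn in (-math.pi / 6, math.pi / 6):       # two self-similar children
        branch(x2, y2, angle + turn, seg_length * 0.7, depth - 1, segments)

segments = []
branch(0.0, 0.0, math.pi / 2, 1.0, 6, segments)    # grow upward, 6 levels
```

Changing only the turn angles or the shrink factor produces markedly different forms, which is exactly the sensitivity to the function definition described above.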

# System development – Cellular Automata

A cellular automaton (CA) is a collection of (coloured) cells arranged on a grid. The cells evolve through a number of time steps according to a set of rules based on the states of the neighbouring cells, and the rules can be applied iteratively for as many steps as desired. Such a model was first considered in the 1950s by John von Neumann, who used it to build his “universal constructor”. Further studies were conducted in the 1980s by Stephen Wolfram, whose extensive research culminated in the publication of the book “A New Kind of Science”, which provides an exhaustive collection of results concerning cellular automata.

The fundamental parameter of a cellular automaton is the grid on which it is computed. A CA can be computed on a 1D line or on a 2D or 3D grid, and the grid can vary in shape: it can consist of squares, triangles, hexagons, etc. Another parameter is the number k of colours or states a cell can have. k=2 (a binary CA) is the simplest choice, and also the one I have been using in my experiments. In a binary automaton the number 0 is usually assigned to the colour white and 1 to the colour black. In my experiments, 0 refers to a cell being dead and 1 to a cell being alive. An alive cell generates a point in space, whereas a dead one generates a void.

The evolution of the CA is also governed by the set of rules applied, which depend on the states of each cell’s neighbours; the number of possible rules grows combinatorially with the size of the neighbourhood and the number of states. For my form-finding experiments, each iteration of a 2D CA was memorised by the computer and stacked in 3D space. The result was a collection of points generated by a CA controlled by its initial configuration (the initial state of each cell in the grid), the evolution rule and the number of iterations.
The rules governing the evolution of a CA are vast and produce interesting results, varying from ordered CAs that die after a few iterations to chaotic patterns. After experimenting with a few rules, I decided to research the Game of Life rule in more detail. The Game of Life was devised by John Conway in 1970 and popularised in Martin Gardner’s Scientific American columns. It is a binary (k=2) outer totalistic cellular automaton with a Moore neighbourhood of range r=1. The evolution rule states that a dead cell comes to life if surrounded by exactly 3 alive neighbours, and an alive cell survives if surrounded by 2 or 3 alive neighbours. Such a simple rule can produce very interesting results when computed in 3D space. For my experiments I have been using the Rabbit plugin by Morphocode, using their sample CA definition as a starting point.

[caption id="attachment_8590" align="aligncenter" width="545"]Game of Life CA evolution - Initial configuration=Pentomino Puzzle - Time=150[/caption]
[caption id="attachment_8589" align="aligncenter" width="545"]Game of Life CA evolution - Initial configuration=Queen Bee Shuttle - Time=150[/caption]
[caption id="attachment_8591" align="aligncenter" width="545"]Game of Life CA evolution - Initial configuration=Diehard - Time=150[/caption]
[caption id="attachment_8592" align="aligncenter" width="545"]For this experiment the same evolution rule was applied, but the CA grew in both directions[/caption]
[caption id="attachment_8597" align="aligncenter" width="545"]The same CA definition, first explored in vertical growth, explored in circular growth[/caption]
[caption id="attachment_8599" align="aligncenter" width="545"]Following the circular growth experiments, various curves and rotation angles were explored for the growth pattern[/caption]
[caption id="attachment_8600" align="aligncenter" width="545"]Proximity experiments using the points generated by the CA[/caption]
[caption id="attachment_8601" align="aligncenter" width="545"]The lines generated by the proximity experiments were used to generate structural frames[/caption]
[caption id="attachment_8596" align="aligncenter" width="545"]Experiments in building the frames generated by the CA[/caption]
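The Game of Life step and the 3D stacking of generations can be sketched in a few lines of Python (an illustration, standing in for the Rabbit/Grasshopper definition actually used in the experiments): each generation of a set-based Game of Life is recorded as a layer of (x, y, t) points, with time as the third axis.

```python
from collections import Counter

def life_step(live):
    """One Game of Life step on an unbounded grid of live (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3 (B3/S23).
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

def stack_in_3d(seed, steps):
    """Record each generation as a layer of (x, y, t) points."""
    points, live = [], set(seed)
    for t in range(steps):
        points.extend((x, y, t) for x, y in live)
        live = life_step(live)
    return points
```

Feeding in a named seed such as the blinker (three cells in a row, a period-2 oscillator) produces the same kind of point cloud that the proximity and framing experiments above were built on.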

# Pattern Formation of Reaction-diffusion

# Mesh Recursive Subdivision based on Alain Fournier’s algorithm

The principle of the subdivision method to generate mountains is to recursively subdivide (split) polygons of a model up to a required level of detail. At the same time the parts of the split polygons will be perturbed. The initial shape of the model is retained to an extent, depending on the perturbations. Thus, a central point of the fractal subdivision algorithm is perturbation as a function of the subdivision level.

Concerning mountains, the higher the level, the smaller the perturbation; otherwise the mountains would get higher and higher. In addition, there must be a random number generator to obtain irregularities within the shape and to achieve a kind of statistical similarity:

p_n = p(n) * rnd();

where p_n = perturbation at level n,

p(n) = perturbation function depending on level n, and

rnd() = random number generator.

Fournier developed a subdivision algorithm for a triangle. Here, the midpoints of each side of the triangle are connected, creating four new sub triangles.
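The scheme can be sketched in Python (an illustrative implementation, not Fournier’s original code). The perturbation function is taken, as an assumption, to be p(n) = amp * 0.5^n, so detail shrinks with the level; note that midpoints shared by neighbouring triangles are perturbed independently here, a simplification of the crack-free original.

```python
import random

def subdivide(tri, max_level, amp=1.0, level=0, rng=None):
    """Recursive midpoint subdivision of a triangle, Fournier-style.

    Each side's midpoint is displaced along z by p(n) * rnd(), where the
    perturbation p(n) = amp * 0.5**n shrinks with the level n, so higher
    levels add ever finer detail.
    """
    if rng is None:
        rng = random.Random(42)        # fixed seed, like Grasshopper's seed input
    if level == max_level:
        return [tri]
    a, b, c = tri

    def mid(p, q):
        m = [(p[i] + q[i]) / 2.0 for i in range(3)]
        m[2] += amp * 0.5 ** level * rng.uniform(-1.0, 1.0)   # p(n) * rnd()
        return tuple(m)

    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    tris = []
    # Connecting the three midpoints yields four sub-triangles, each
    # subdivided recursively in turn.
    for t in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        tris.extend(subdivide(t, max_level, amp, level + 1, rng))
    return tris
```

Each level multiplies the triangle count by four, so k levels of subdivision give 4^k triangles; setting amp to 0 recovers the flat, unperturbed subdivision.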

**Based on this algorithm, the process is applied recursively to all the newly generated triangles, so that the shape is not limited to vertical mountains.**

In random perturbation, the first iteration is based on a random parameter within the range 0–9, and the following iterations are also based on a random parameter. This is done in Grasshopper by setting the seed number of the initial polygon and the seed number of the iterations. All iterations are perturbed along the z-axis of the newly produced polygon.

This resulted in a different shape for the ‘base’ and for all the iterations after the first one, ranging from small to large volumes depending on the seed of the random number generator. By producing polygons using random perturbation, each iteration is different from the others. The iteration runs ten (10) times in Grasshopper.

Base perturbation * random seed: 1 – 10

Second – third perturbation * random seed: 0 – 10

Based on this principle, one module was chosen to continue to the next step: the 5;5 module, i.e. **Base perturbation * random seed: 5** and **Second – third perturbation * random seed: 5**. After the second iteration, whenever two surfaces would intersect, one or both of them were removed. This resulted in a random yet controlled structure based on the principle applied.

For the next brief, I am developing the module to grow beyond the third iteration and to grow following a certain flow.