Moving Wall [scripted and built!]

Moving Wall is an experiment in the automation of design through implicit techniques. Animation, long and widely used to navigate through finished, static models, can also serve as a means of designing and visualizing [re]active spaces.

We all know responsive architecture from everyday life: sliding doors, automatic doors, even escalators now react to human stimuli. Nevertheless, these simple devices offer no significant environmental qualities, nor do they influence how we approach our built environment. If hand-held devices can change how we communicate and navigate cities through immediate access to remote information, architecture can likewise bring brand new sensations and human relationships through precisely the opposite mechanism: blurring the separation of spaces with movable devices that react to human proximity.

[design concept]

Moving Wall is just one instance of this mode of thinking. At archi[o]logics we believe in finding-out-through-doing, so we felt it necessary to build a working prototype, a proof of concept showing that another materiality in architecture is possible. We used Flexinol wire and sensors as the moving devices, and a PBasic-programmable board as the basis for the "intelligence" of the facade component.


The design code

For this project we used rvb (Rhino Visual Basic, RhinoScript), plus PBasic for the actual prototype. Rhino has no built-in tool for creating animations, so we had to write our own in order to visualize what we were doing. We followed these steps:

1. Write all code to create actual geometry

2. Implement its parametric variables

3. Implement a way to delete all geometry while maintaining the variables

4. Change the variables

5. Implement a way to keep track of changes and render every single time we modify the state of the model

6. Automate the process through a number of steps (frames)

7. Finally, delete all changes and return to the original state
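The loop behind these seven steps can be sketched in plain Python, independent of Rhino. This is an illustrative assumption, not our actual rvb script: geometry is represented as simple data, a single hypothetical variable (panel angle) stands in for the parametric model, and the Rhino create/delete/render commands are replaced by list operations.

```python
# Sketch of the seven-step animation loop, assuming a minimal parametric
# model: wall panels whose opening angle is the only variable. In Rhino,
# create_geometry and the deletions would be RhinoScript calls instead.

def create_geometry(params):
    """Steps 1-2: build geometry from the parametric variables."""
    return [{"panel": i, "angle": params["angle"]}
            for i in range(params["panels"])]

def animate(params, frames, step):
    """Steps 3-7: rebuild the model once per frame, then restore the start state."""
    original = dict(params)            # remember the initial state (step 7)
    snapshots = []
    geometry = create_geometry(params)
    for _ in range(frames):            # step 6: automate over a number of frames
        geometry = None                # step 3: delete geometry, keep variables
        params["angle"] += step        # step 4: change the variables
        geometry = create_geometry(params)
        snapshots.append(geometry)     # step 5: record/render the new state
    params.update(original)            # step 7: return to the original state
    return snapshots

frames = animate({"panels": 4, "angle": 0.0}, frames=10, step=5.0)
print(len(frames), frames[-1][0]["angle"])  # 10 frames, final angle 50.0
```

The essential point is that only the geometry is disposable; the variables persist across frames, which is what makes the model animatable rather than merely static.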


Moving Wall Concept Diagram

The prototype

The prototype emerged from a series of generic exercises on output movement driven by specific input conditions. As input variables we used the people in the building and the amount of shadow they cast; as output we obtained the movement generated by an overemphasized response, and the shift in the spatial environment caused by the circulation of people. To introduce the variables into our input procedure, we briefly studied precedents of popular single-variable responsive systems, such as the aforementioned automatic doors.

[proof of concept]

Furthermore, we investigated possible controller systems, using the light sensors to explore how different input variables could produce different ranges of output conditions (speed of movement).
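One way to picture such a controller is as a mapping from a light-sensor reading to an actuation speed. The sketch below is a hypothetical illustration, not our PBasic code: the 0-1023 reading range, the threshold, and the speed range are all assumed values, chosen only to show how more shadow (a lower reading) could produce a faster, amplified response.

```python
# Hypothetical controller mapping: a light-sensor reading (assumed range
# 0 = full shadow, 1023 = full light) is mapped onto an output speed for
# the actuated wall. Threshold and speed range are illustrative, not measured.

def shadow_to_speed(reading, min_speed=0.0, max_speed=1.0, threshold=800):
    """More shadow (lower reading) -> faster, amplified movement."""
    if reading >= threshold:          # bright: nobody nearby, wall stays still
        return min_speed
    shadow = 1.0 - reading / threshold
    return min_speed + shadow * (max_speed - min_speed)

print(shadow_to_speed(1000))  # bright, no movement
print(shadow_to_speed(400))   # partial shadow, moderate speed
print(shadow_to_speed(0))     # full shadow, maximum response
```

On the actual board, the same idea would be expressed as a loop that polls the sensor and pulses the Flexinol wire proportionally.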

A major concern of the design is the amplification of movement and its relation to the spatial capabilities of existing built spaces. Geometry therefore plays an important role throughout the entire process, from brainstorming to the actual research prototype.

The project has been carried out with the support and cooperation of: Dae Wook Lee, Guillermo Sevillano (SUMA Arquitectos), Heejoo Shi, and Sang wan Shin.
