Algorithmic Futures. The Analog Beginnings of Advanced Parametric Design in First Year Studios

Abstract: This paper examines the use of algorithms and parameter-based methods in architecture and design and proposes a new, implicit pedagogical approach for introducing algorithmic design processes in first-year design studios using traditional analog and manual techniques. In an attempt to provide a platform for advanced design studios and to close the gap between lower- and upper-level courses, an approach was developed that grew out of the theories and strategies of Alan Turing, Ron Resch, and Sol LeWitt, among others, through which students focused on contextual analysis, value definition, and pattern development. They developed simple algorithms and descriptive rule sets to establish generative, deterministic processes that repeatedly led to unexpected outcomes. The assessment of the studio experience presented here demonstrates both the successes and failures of the pedagogical and practical strategies implemented and reveals the quality and effectiveness of utilizing parametric and algorithmic processes as a pedagogical approach.


Introduction
In architecture and design, the word "parametric" is a loaded term; the word "parametricism", even more so, and hotly debated. The current movement of parametric design in architecture, heavily pattern-based, began in the early 1990s at schools like Columbia and the Architectural Association and continued through the work of Greg Lynn, Ben van Berkel, and others (AIACC, 2012). However, parametric design has a long history in mathematics and the sciences, with one of the earliest uses of the term found in an 1821 paper by physicist John Leslie, Geometrical Analysis and Geometry of Curve Lines (Leslie, 1821). While parametric design in architecture emerged well before the advent of computer-aided design, reaching back beyond Gaudí's hanging chain model and through to the mid-twentieth-century architects Luigi Moretti and Frei Otto, current trends in education and the profession would suggest that parametric design is identified by numerical operations performed in a computational environment (Davis, 2013). This is especially evident in the way current architecture and design programs are structured, heavily or exclusively favoring digital fabrication and computational design over analog methodologies. This dichotomy is at the heart of the central question facing architecture today: its own identity. According to Alexander Tzonis, architecture is isolated from the cultural mainstream and has been facing the greatest identity crisis in its history since the mid-twentieth century. Constantly shifting societal priorities and cultural perspectives have demanded a redefinition of practice and education radically different from what has been traditionally accepted (Tzonis, 1969).
The more recent and robust emergence of technology in the design process has raised further questions about the very nature of design and has set up a number of oppositions: the expression of pattern versus imagination, determinism versus ambiguity, authorship versus autonomous computation, and the question of technology itself, or man versus machine. These oppositions are explored in the three sections below. In the wake of postmodernism, with no defined movement in contemporary architecture and rifts between phenomenologists and numerologists, the analog and the digital, the profession seems aimless. Following postmodernism, the search for a definition of what architecture is has led to the emergence of various camps, some calling for a return to a time when architecture was more "human" (Gans, 1977). In direct retaliation against the stark formalism that has grown out of current parametric approaches in architecture, especially Patrik Schumacher's "parametricism", others have gone as far as rejecting architecture as art, which is primarily about self-expression, and have accused parametricists of being less about social progress than self-promotion (Hosey, 2015). This resistance against the architectural fascism of Patrik Schumacher is found not just among traditionalists but among parametric-based designers as well, who accuse Schumacher's parametricism of being less a method, approach, or process than a personal style, concerned solely with aesthetics (Schumacher, 2008).
Compounding the issue of identity in architecture, first-year, basic, and foundation design studios still rely on traditional approaches, which can create a disconnect for students between their basic design education and more advanced courses, leaving an unreasonably wide gap to bridge as they progress through the architecture curriculum. In developing our approach to parametric design as an implicit pedagogy, we challenged the above oppositions and pursued a middle ground, siding with Daniel Davis in that all design is parameter-based, and always has been (Davis, 2014). We operated across a broad spectrum of intentions, curiosities, materialities, and media, and engaged in processes that seemed to lend themselves to, or crave, computation but were just as easily done by hand. Ultimately, we realized that the computer, like a hammer or a ruler, is just a tool and that parametric design is, in the end, just design. Our process and its outcomes are described here through research, precedent analysis, and the presentation of a series of first-year design projects developed, implemented, and studied over a five-year period.

Pattern
In 1952, Alan Turing, often referred to as the father of modern computer science, published a paper, The Chemical Basis of Morphogenesis, on how pattern and structure develop in biological organisms. While the paper provides the "background mathematics required", Turing is clear up front in the summary that, despite the underlying mathematics, the emergence of pattern is due to the interaction of chemical substances based on "certain well-known physical laws" and algorithms. Describing these systems using words such as "instability" and "random disturbance", he undercuts the precision and sterility we typically associate with mathematical formulae and presents the paper in its simplest terms, making the case that our understanding of how pattern develops and is structured in nature is "elementary" and messy. In contrast with how current parametricists and their detractors describe the approach, and despite the forced computational complexity evident in current parametric design, Turing argues that the patterns and structures we see in nature result from simple, environmental rule-sets and qualitative rather than quantitative conditions, or parameters, and concludes that understanding the theory of such processes may only require digital computation in "a few particular cases" (Turing, 1952).
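Turing's mechanism can be sketched computationally. The following is a minimal illustration, not Turing's own equations: a one-dimensional Gray-Scott reaction-diffusion system, a later model in the same family, in which two interacting "chemical substances" plus a small random disturbance evolve away from uniform initial conditions. All parameter values here are conventional textbook choices, not drawn from the 1952 paper.

```python
import numpy as np

def reaction_diffusion(n=200, steps=3000, seed=0):
    """One-dimensional Gray-Scott reaction-diffusion sketch.

    Two concentration fields u and v diffuse and react; a small
    random disturbance (Turing's "instability") seeds the process.
    """
    rng = np.random.default_rng(seed)
    u = np.ones(n)            # substrate, initially uniform
    v = np.zeros(n)           # activator, initially absent
    mid = slice(n // 2 - 10, n // 2 + 10)
    u[mid], v[mid] = 0.5, 0.25            # local perturbation
    u += 0.01 * rng.standard_normal(n)    # random disturbance
    Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060  # conventional values
    for _ in range(steps):
        # discrete Laplacian on a periodic domain
        lap_u = np.roll(u, 1) + np.roll(u, -1) - 2 * u
        lap_v = np.roll(v, 1) + np.roll(v, -1) - 2 * v
        uvv = u * v * v
        u += Du * lap_u - uvv + F * (1 - u)
        v += Dv * lap_v + uvv - (F + k) * v
    return u, v
```

The point of the sketch is Turing's, not the particular model: the rules are few and local, yet the resulting distribution of the two substances is not something one would predict by inspecting the rules.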
Opponents of parametric design pounce on the formulaic and totalizing aspects of designing through pattern, contending that it tends toward homogeneity and rejects the "strangeness of ourselves" and our world. They argue that true, imaginative design thinking requires decision making that is fraught with doubt and risk, whereas parametric design is pure, unimaginative calculation (Wigley, 2014). Despite this argument, Turing's position is clear that naturally patterned systems are not homogeneous at all, developing constantly from one pattern to another through a series of strange occurrences, and that, regardless of the underlying mathematical terms in its expression of this evolution of pattern, nature is quite risky and imaginative in its processes. Following Turing, we developed an approach through which we asked first-year design students to identify, analyse, and represent or express patterns in nature and the physical world through material and spatial relationships. Although students engaged in discussion and research about the connection between mathematics, pattern, and the design process, student concerns, interests, and curiosities were qualitative rather than quantitative in character. These curiosities, or intentions, formed the basis of their investigations and were one of three major criteria for evaluating their design process. While students were free to explore and investigate nature in any way they chose, they were held to expressing the results of their investigations through a progression of "true" relationships, reducing the reliance on opinion and invention while intuitively prioritizing natural alignments, connections, and sequences. Imagination and individual decision making, along with the associated risks, were occurring throughout the process.
Instead of relying on numbers, projects were language-based, initiated with a question and the definition of terms, following another example from Turing's process in his paper Computing Machinery and Intelligence (Turing, 1950).
Following initial investigations, students engaged in a set of rigorous operations based on the model of morphogenesis and evolutionary design, beginning with a simple element that is repeated, aggregated, and mutated based on environmental parameters and conditions. These operations are not characterized merely by superficial "minor variations" meant to impose a false complexity, as Mark Wigley suggests (Wigley, 2014), but by a series of natural, intuitive steps based on internal logics inherent in any real or natural system. This process is most clearly visible in an assignment called the Pattern Project, conceived by Karen Bermann at Iowa State University (Bermann, 2000), during the teaching of which the approach described here was initially developed. In the Pattern Project, students are asked to analyse, deconstruct, and unfold a three-dimensional object or tool into a two-dimensional drawing, similar to a clothing pattern. Then, based on a set of simple criteria, students extract a fragment from the larger pattern to repeat, aggregate, and mutate in material form, in this case using paper. At the conclusion of the project, students arrive at a patterned material and spatial construction. Although the operational rule-sets of the project are simple, what results is surprisingly complex and, often, suggestive of biological systems. The processes of the Pattern Project form the foundation for the development of the projects that would follow.
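The repeat-aggregate-mutate cycle at the core of the Pattern Project can be sketched abstractly in code. This is a hypothetical illustration, not the studio's actual procedure: a two-dimensional fragment, represented as a list of points, is copied repeatedly, with each copy rotated and translated relative to the last, so that a simple deterministic rule accumulates into an overall figure the rule itself does not describe.

```python
import math

def aggregate(fragment, copies, angle=0.3, dx=1.0):
    """Repeat, aggregate, mutate: each new copy of the fragment is the
    previous copy rotated by `angle` and shifted by `dx`.

    `fragment` is a list of (x, y) points; the names and the specific
    transformation are hypothetical, chosen only to illustrate the cycle.
    """
    result = [list(fragment)]
    c, s = math.cos(angle), math.sin(angle)
    for _ in range(copies - 1):
        prev = result[-1]
        result.append([(c * x - s * y + dx, s * x + c * y)
                       for (x, y) in prev])
    return result
```

Because the mutation is applied to the previous copy rather than the original, small differences compound, which is what gives even a paper exercise its evolutionary character.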

Responsibility
A central point of contention regarding parametric design is the question of authorship. Architectural education and the profession have historically coveted the cultivation of the individual genius, responsible for poetically crafting ideas into narratives manifested through material and spatial relationships that somehow masterfully represent the complexities of our everyday life. In this way, the poetics of narrative elevate building to architecture. For opponents of computational design, the notion of autopoiesis is perverse, in which a machine is responsible for making design decisions in a cultural vacuum, free from the dystopia of the modern world and lacking a critical cultural perspective, leading only to a fetishized homogeneity of pattern and structure. The narrative, or argument, Reinhold Martin claims, is what "reinforces the premise that cultural value is performed" in the making of architecture (Martin, 2014). Without it, the process is purely procedural and about the numbers. Unabashedly, parametric designers seem to have abandoned the narrative for what David Benjamin calls "possibilities". Using computers only as tools, Benjamin and his students at Columbia University's GSAPP push the limits of what is potential at the intersection of architecture and biology, using the surplus of time that a computational environment affords to test as many scenarios and outcomes as possible. Processes are less about critical theoretical positions than performance and efficiency. While designer and computer are both making hundreds of decisions independently throughout the process, Benjamin points out that the computer is not autonomous. The applications his students use are programmed, and students are responsible for being aware of and controlling every decision, whoever or whatever is making it.
The individual genius has no place in this model, as the author may be a number of designers working together simultaneously, or plugging in at some future date through an open-source platform. This way of working foreshadows working environments in the field, which are becoming increasingly collaborative and interdisciplinary, necessarily so to give designers the ability to understand and manipulate the complexity of contemporary architecture (Benjamin, 2014). The narrative in Benjamin's case, if there is one, may be the connections that are formed between scientists and architects toward a future in which architecture has more agency in our lives beyond telling interesting stories.
The question of authorship in art, architecture, and design has been ever present, and certainly predates the advent of computer-aided computational design. From the master-apprentice relationships of the Renaissance to corporate architecture's mega-firms, the question of who is actually responsible for what has lingered. Nevertheless, when it comes to art and architecture, technology has always gotten a bad rap. David Hockney's book, Secret Knowledge, sent shockwaves through the art world when it proposed that Johannes Vermeer painted his masterpieces with the aid of complex optics, a claim some historians received as a "refutation of the 17th century Dutch master's genius" (Riefe, 2014), leading some to reconsider the value of the work itself. A more modern example is the artist Sol LeWitt, known for designing his constructions for other artists and volunteers to execute. His work, visibly measured and calculated, is based on a process by which it revealed only what was absolutely essential through its making, employing an economy of means that he distinguished as an ethical position (Lovatt, 2010). LeWitt made his position on authorship clear when he stated, "I can't paint a picture of a person because to me it wouldn't be ethical. I mean, I can draw a line on a wall because I think it's an ethical act", suggesting that any attempt on the part of the artist to represent someone's likeness would require judgment and, therefore, bias (Teerink, 2012). As a reaction to intellectual bias, Sol LeWitt writes that "the artist's will is secondary to the process he initiates from idea to completion. His willfulness may only be ego". LeWitt was perhaps the first artist to produce and sell ideas as an art form for others to fulfill through contractual constraints.
However, regardless of who executes the work, LeWitt is clear when he states in his Sentences on Conceptual Art that "banal ideas cannot be rescued by beautiful execution", placing the responsibility for the work solely on the shoulders of the one who conceives of the initial idea (LeWitt, 1969). During a project we entitled Monster, authorship was equally in question, as students developed a process of iteration through transformation based on simple rule sets derived from observing tangible, measurable conditions, or facts, within a patterned context, whereby intellectual bias was minimized and students learned to operate from a set of truths rather than their own opinions or desires. They were asked to derive a form and structure based on questions about a personal, internal fear through a layered, aggregated, and mutated body and frame that, once set in motion, developed its own internal logic and grew beyond the control of the designer. The decision-making process was restructured as students no longer struggled for control over ideas or made things to satisfy certain external expectations about form or beauty. In this way, students learned to evaluate work based on the integrity of the process and what the work is or has become, rather than what it could or should be based upon some subjective viewpoint about aesthetics. When asked in studio during an initial review of the work whether or not criticism about balance and formal composition was valid, one student replied, "No", explaining that if the work is "true" and "right", what we see is what we get, and the only way to improve the composition is to ask a different, perhaps more rigorous, question.

Mystery
We were first introduced to the work of Sol LeWitt as undergraduate design students, as an example of an artist who used simple rule sets to make a painting or drawing that led to an unanticipated outcome. This idea fascinated us, because as beginning designers we assumed that the architectural design process was to work toward some predetermined idea that would be imposed on the world based upon an initial sketch, parti, or diagram. What was so interesting about LeWitt's evolutionary process was not necessarily that it was minimal, measured, and calculated using only lines, but that, derived from a few basic parameters, it led to something unexpected. The unexpected nature of LeWitt's work is what Mario Carpo refers to as "magic". But for Carpo, the notion of magic, or indeterminacy, in parametric computational design is a strange one. He finds himself between two groups, the phenomenologists on one hand and parametric designers on the other, and contends that both are concerned with the indeterminate, the search for mystery in the process. But to try to find it in computation, he argues, is irrational (Carpo, 2014). Yet it is this irrational thinking, tempered by rational, logical execution, that LeWitt believes leads to new experience and gives art and design its vitality (LeWitt, 1969).
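The gap between a LeWitt-style instruction set and its outcome can be made concrete in code: a short, fully deterministic rewriting rule, applied repeatedly, yields a drawing program far more intricate than the rule itself. The example below is a generic L-system rewriter, a hypothetical illustration rather than any of LeWitt's actual instructions; the symbols F, +, and - would conventionally be read as "draw forward", "turn left", and "turn right".

```python
def expand(axiom, rules, generations):
    """Apply a deterministic substitution rule set to a string of
    drawing instructions, L-system style.

    Every symbol with an entry in `rules` is replaced by its expansion
    each generation; all other symbols pass through unchanged.
    """
    instructions = axiom
    for _ in range(generations):
        instructions = "".join(rules.get(ch, ch) for ch in instructions)
    return instructions

# A Koch-curve-like rule: a single stroke becomes nine symbols,
# and the drawing's length grows geometrically with each generation.
koch = expand("F", {"F": "F+F-F-F+F"}, 2)
```

The rule fits on one line, yet the executor of the resulting instruction string, like LeWitt's volunteers, cannot foresee the figure until it is drawn.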
In 1970, British mathematician John Conway devised an algorithm-based, zero-player computer program called the Game of Life. The program comprises a community of pixels, or cells, in which an initial configuration of cells evolves into a highly complex, patterned microcosm based on a few basic rules known as "genetic laws". The program was important for a number of reasons, including advancing algorithmic design and opening up a new field of mathematics, cellular automata (Gardner, 1970). But perhaps the most exciting potential of Life was its mathematical proof of "undecidability", which describes the inability of an algorithmic program, such as a universal Turing machine, to determine the outcomes of ever-evolving patterns (Berlekamp, Conway, and Guy, 1982). This idea of irrational indeterminacy, found to be core to an algorithmic, computational environment and suggesting that it may be a core principle in nature as well, has prompted philosophers and cognitive scientists to draw on Life to understand the evolution of complex philosophical constructs, such as consciousness and free will, from simple sets of deterministic rules (Dennett, 2003). Manually and more fundamentally, perhaps, the work of artist, computer scientist, and applied geometrist Ron Resch demonstrates the unexpected, mysterious characteristics found in highly ordered natural systems. He begins with a simple operation, such as a twist, performed on a piece of paper, iterating through a process of critical discovery as patterned systems evolve into more complex material and spatial forms. An essential part of Resch's approach is the way in which he engages with the system through trial and error, observing it diagnostically in order to make adjustments, not to determine, control, or mimic toward some desired outcome, but to gain a better understanding of its natural qualities (Resch, 1973).
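Conway's "genetic laws" reduce to a few lines: a dead cell with exactly three live neighbours is born, and a live cell survives with two or three live neighbours. A minimal sketch, tracking live cells as coordinate pairs (the function and variable names are ours, not Conway's):

```python
from collections import Counter

def life_step(cells):
    """Advance one generation of Conway's Game of Life.

    `cells` is a set of (x, y) pairs marking live cells. Count how many
    live neighbours every candidate cell has, then apply the two rules:
    birth on exactly 3, survival on 2 or 3.
    """
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in cells)
    }
```

Even the three-cell "blinker" never settles: it oscillates between a row and a column indefinitely, a first hint of how deterministic rules can resist prediction.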
While the process itself is determined in some way, through rules about ways of operating, outcomes are unexpected and, in many ways, magical. Working within a semester-long project simply titled Process, we asked students to investigate a 60 by 60 centimetre granite plaza tile through the activities of seeing, materializing, and articulating, toward the goal of uncovering hidden topologies that may lie buried beneath the flat, polished surface of the tile, utilizing various techniques including rubbing, drawing, and casting. Based upon their initial curiosities, which ranged from surface conditions such as cracking to aggregate size and density, students began with flat line drawings that categorized the components of the tile into a network of fields, points, and spaces. After assigning numerical values to lines and intersections of lines based on hierarchy within the system, such as lineweight, density, and values of tone and shade, molds were constructed through a technique of triangulation to form plaster casts, which mysteriously gave volume and shape to aspects of the tile that previously could only be imagined. Finally, continuing to follow process parameters set up weeks earlier, students articulated the surface of the plaster casts through a layering and superimposition of material and spatial relationships, including image and text, that formed a rich identity for an object typically ignored. Results were decidedly undecidable as the process evolved ambiguously, with no ideological position being taken about right or wrong, as Carpo suggests occurs when designers are interested in the indeterminacy of a system or process (Carpo, 2014). The only "knowns", it seemed, were the individual decisions made in the moment based on the logic of the process, with each outcome more unexpected than the last.

Conclusion
Because our approach to parametric design was implicit rather than explicit, and because we were dealing with first-year design students, we were concerned about whether students would find negotiating the process too complex, whether we would develop within them desirable "designerly" characteristics such as independence and self-confidence, and whether we would meet our traditional course outcomes effectively. While we concluded that we met course outcomes effectively, as evidenced by the quality and sophistication of the work, as well as the students' ability to communicate well about their process and take responsibility for their decisions, whether students developed an inner voice, the ability to tap personal desires to tell a meaningful story, or a full understanding of the implications of what they were doing remains to be seen. Although student course evaluations remained consistently high over more than seven semesters, student ability and understanding will have to be tracked as they progress through the curriculum. And although we skirted the technology debate, it remains at the root of all aspects of this discussion. As we assumed at the outset, we concluded that computers are only tools; given more autonomy in the process, they might influence or limit a designer's decisions and, therefore, the outcomes of those decisions. With the possible exception of time, we found relatively few unique benefits to utilizing digital fabrication methods and, in fact, found that making manually led to outcomes rife with character, meaning, and vitality, drawing from the frailties and imperfections natural to the human hand.
Ultimately, we found that our approach and its outcomes were not out of line with more traditional approaches, and students developed in similar ways as they have in the past working through more conventional means and methods. Personal desire and decision making were not replaced by rote rule following and hollow, meaningless computation. The desire within the student, it seemed, merely shifted toward curiosities about place, object, surface, and the physical conditions there to be investigated, rather than being focused on or determinate about a final object, form, shape, or outcome. As in any design process, those investigations led to the identification, description, and representation of material and spatial relationships, only now based on certain parameters, often with unintended or unexpected outcomes. In our experience, the differences in outcomes between our approach and others were few and nominal.