Issue 33
06/08/2022
ISSN 2634-8578
Curated By:
Francesca Coman
-
Climate solutions, Fun Palace
Image source: Cantrell, Martin, Ellis 2017
Wild Disequilibria 
03/08/2022
Climate solutions, Climatic Energy, cognitive tools, Ecological Autonomy, landscape futures
Marantha Dawkins, Bradley Cantrell

mmd5mk@virginia.edu

Climatic Energy and Ecological Autonomy 

There is no way back to the climate that we once knew: “our old world, the one that we have inhabited for the last 12,000 years, has ended”.[1] Accepting this end presents an opportunity to reframe considerations of risk, indeterminacy, and danger as questions of restructuring and rewilding; shifting the discussion of global warming from a matter of a scarcity of resources to an abundance of energy that can kick-start landscape futures. 

To engage this future, it is critical to set up some terms for how design will engage with the multitude of potential climates before us. Rather than working preventatively by designing solutions that are predicated on the simplification of the environment by models, we advocate for an experimentalism that is concerned with the proliferation of complexity and autonomy in the context of radical change. Earth systems are moving hundreds to thousands of times faster than they did when humans first documented them. This acceleration is distributed across such vast space and time scales that the consequences are ubiquitous but also unthinkable, which sets present-day Earth out of reach of existing cognitive tools. For example, twenty- to fifty-year decarbonisation plans are expected to solve problems that will unfold over million-year timescales.[2] These efforts are well-intentioned but poorly framed; in the relentless pursuit of a future that looks the same as the past, there is a failure to acknowledge that it is easier to destroy a system than it is to create one, a failure to acknowledge the fool’s errand of stasis that is embodied in preservation, and most importantly, a failure to recognise that climate change is not a problem to be solved.[3] Climate “solutions” are left conceptually bankrupt when they flatten complex contexts into one-dimensional problem sets that are doomed by unknowable variability. From succession, to extinction, to ocean biochemistry, to ice migration; our understanding of environmental norms has expired.[4] 

The expiration of our environmental understanding is underlined by the state of climate adaptation today – filled with moving targets, brittle infrastructures, increasing rates of failure, and overly complicated management regimes. These symptoms illustrate the trouble contemporary adaptation has in escaping the cognitive dissonance built into the way knowledge about climate change is produced: the information has eclipsed its own ideological boundaries. This eclipse represents a crisis of knowledge, and therefore must give rise to a new climatic form. Changing how we think and how we see climatic energy asks us to make contact with the underlying texture and character of this nascent unruliness we find ourselves in, and the wilds that it can produce. 

Earth’s new wilds will look very different from the wilderness of the past. Classical wilderness is characterised by purity: it is unsettled, uncultivated, and untouched. But given the massive reshaping of ecological patterns and processes across the Earth, wilderness has become less useful, conceptually. Even in protected wilderness areas, “it has become a challenge to sustain ecological patterns and processes without increasingly frequent and intensive management interventions, including control of invading species, management of endangered populations, and pollution remediation”.[5] Subsequently, recent work has begun to focus less on the pursuit of historical nature and more on promoting ecological autonomy.[6, 7, 8] Wildness, on the other hand, is undomesticated rather than untouched. The difference between undomesticated and untouched means that design priorities change from maintaining a precious and pure environment to creating plural conditions of autonomy and distributed control that promote both human and non-human form. 

Working with wildness requires new ways of imagining and engaging futurity that operate beyond concepts of classical earth systems and the conventional modelling procedures that re-enact them, though conventional climate thinking, especially with the aid of computation, has achieved so much: “everything we know about the world’s climate – past, present, future – we know through models”.[9] Models take weather, which is experiential and ephemeral, abstract it into data over long periods of time, and assemble this data into patterns. Over time, these patterns have become increasingly dimensional. This way of understanding climate has advanced extremely quickly over the past few decades, enough that we can get incredibly high-resolution pictures (like the one below, which illustrates how water temperature swirls around the earth). Climate models use grids to organise their high-resolution, layered data and assign it rules about how to pass information to neighbouring cells. But the infinite storage capacity of the grid cells and the ways they are set up to handle rules and parameters create a vicious cycle, by enabling exponential growth toward greater and greater degrees of accuracy. Models get bigger and bigger, heavier and heavier, with more and more data; operating under the assumption that collecting enough information will eventually lead to the establishment of a perfect “control” earth,[10] and to an earth that is under perfect control. But this clearly isn’t the case, as for these models, more data means more uncertainty about the future. This is the central issue with the traditional, bottom-up climate knowledge that continues to pursue precision. It produces ever more perfect descriptions of the past while casting the future as more and more obscene and unthinkable. In other words, in a nonlinear world, looking through the lens of these bottom-up models refracts the future into an aberration.[11] 

Figure 1 – Global ocean temperatures modelled at Los Alamos National Laboratory illustrate how heat travels in swirling eddies across the globe. Image source: Los Alamos National Laboratory.
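To make the gridded mechanics concrete, the following is a minimal sketch, in Python, of the cell-and-neighbour logic described above. It is a toy diffusion model under stated assumptions, not any actual general circulation model: the grid size, the constant k, and the wrap-around boundaries are illustrative inventions.

import numpy as np

# Toy gridded "climate": a single scalar field (temperature) on a coarse lattice.
# Real models store many layered variables per cell, with far richer rules.
temp = np.full((24, 48), 15.0)   # 24 x 48 grid of cells, degrees C
temp[10:14, 20:28] += 5.0        # a warm anomaly to watch spread

def step(field, k=0.1):
    """One update: every cell exchanges information with its four neighbours.

    Plain diffusion with wrap-around edges stands in for the rule-passing
    between neighbouring cells described above.
    """
    north = np.roll(field, -1, axis=0)
    south = np.roll(field, 1, axis=0)
    east = np.roll(field, -1, axis=1)
    west = np.roll(field, 1, axis=1)
    return field + k * (north + south + east + west - 4 * field)

for _ in range(100):
    temp = step(temp)

print(f"peak of the anomaly after 100 steps: {temp.max():.2f} C")

Even at this scale the pattern is visible: each pass only redistributes information locally, and "resolution" is simply how finely the lattice carves the field – which is why the pursuit of accuracy always asks for a bigger grid.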

The technological structure of models binds us to a bizarre present. It is a state which forecloses the future in the same way that Narcissus found himself bound to his own reflection. When he saw his reflection in a river, he “[mistook] a mere shadow for a real body” and found himself transfixed by a “fleeting image”.[12] The climatic transfixion is the hypnotism of the immediate, the hypothetically knowable, which devalues real life in favour of an imaginary, gridded one. We are always just a few simulations from perfect understanding and an ideal solution. But this perfection is a form of deskilling which simulates not only ideas but thinking itself. The illusion of the ideal hypothetical solution, just out of reach, allows the technical image to operate not only as subject but as project;[13] a project of accuracy. And the project of making decisions about accuracy in models then displaces the imperative of making decisions about the environments that the models aim to describe by suspending us in the inertia of a present that is accumulating more data than it can handle. 

It is important to take note of this accumulation because too much information starts to take on its own life. It becomes a burden beyond knowledge,[14] which makes evident that “without forgetting it is quite impossible to live at all”.[15] But rather than forget accumulated data and work with the materiality of the present, we produce metanarratives via statistics. These metanarratives are a false consciousness. Issues with resolution, boundary conditions, parameterization, and the representation of physical processes represent technical barriers to accuracy, but the deeper problem facing accuracy is the inadequacy of old data to predict new dynamics. For example, the means and extremes of evapotranspiration, precipitation and river discharge have undergone such extreme variation due to anthropogenic climate change that fundamental concepts about the behaviour of earth systems for fields like water resource management are undergoing radical transformation.[16] Changes like this illustrate how dependence upon the windows of variability that statistics produce is no longer viable. This directly conflicts with the central conceit of models: that the metanarrative can be explanatory and predictive. In his recently published book, Justin Joque challenges the completeness of the explanatory qualities of statistics by underlining the conflicts between its mathematical and metaphysical assumptions.[17] He describes how statistics (and its accelerated form, machine learning) are better at describing imaginary worlds than understanding the real one. Statistical knowledge produces a way of living on top of reality rather than in it. 

Figure 2 – An illustration of how a climate model breaks the Earth's surface and atmosphere into rectangular chunks within which data is stored, manipulated, and passed on to neighbouring cells. Image source: ERA-Interim Archive.

The shells of modelled environments miss the materiality, the complexity and the energy of an ecosystem breaking apart and restructuring itself. The phase of a system that follows a large shift is known as a “back loop” in resilience ecology,[18, 19] and is an original and unstable period of invention that is highly contingent upon the materials left strewn about in the ruins of old norms. For ecological systems in transition, plant form, geological structure, biochemistry and raw materiality matter. These are landscape-scale issues that are not described in the abstractions of parts per million. High-level knowledge of climate change, while potentially relevant for some scales of decision-making, does not capture the differentiated impacts of its effects that are critical for structuring discussions around the specific ways that environments will grow and change, degrade or complexify through time. 

This is where wilds can play a role in structuring design experimentation. Wildness is unquestionably of reality, or a product of the physical world inhabited by corporeal form. Wilds as in situ experiments become model forms, which have a long epistemological history as a tool for complex and contingent knowledge. Physicists (and, here, conventional climate modellers) look to universal laws to codify, explain and predict events, but because medical and biological scientists, for example, do not have the luxury of stable universalism, they often use experiments as loose vehicles for projection. By “repeatedly returning to, manipulating, observing, interpreting, and reinterpreting certain subjects—such as flies, mice, worms, or microbes—or, as they are known in biology, ‘model systems’”, experimenters can acquire a reliable body of knowledge grounded in existing space and time.[20] This is how we position the project of wildness, which can be found in settings ranging from wastewater swamps to robotically maintained coral reefs, reclaimed mines and up-tempo forests. Experimental wilds, rather than precisely calculated infrastructures, have the potential to do more than fail at adapting to climate: they can serve “not only as points of reference and illustrations of general principles or values but also as sites of continued investigation and reinterpretation”.[21] 

There is a tension between a humility of human smallness and a lunacy in which we imagine ourselves engineering dramatic and effective climate fixes using politics and abstract principles. In both of these cases, climate is framed as being about control: control of narrative, control of environment. This control imaginary produces its own terms of engagement. Because its connections to causality, accuracy, utility, certainty and reality are empty promises, modelling loses its role as a scientific project and instead becomes a historical, political and aesthetic one. When the model is assumed to take on the role of explaining how climate works, climate itself becomes effectively useless. So rather than thickening the layer of virtualisation, a focus on wild experiments represents a turn to land and to embodied changes occurring in real time. To do this will require an embrace of aspects of the environment that have been marginalised, such as expanded autonomy, distributed intelligence, a confrontation of failure, and pluralities of control. This is not a back-to-the-earth strategy, but a focus on engagement, interaction and modification; a purposeful approach to curating climatic conditions that embraces the complexity of entanglements that form the ether of existence. 

References

[1] M. Davis, “Living on the Ice Shelf”, Guernica.org https://www.guernicamag.com/living_on_the_ice_shelf_humani/, (accessed May 01, 2022). 

[2] V. Masson-Delmotte, P. Zhai, A. Pirani, S.L. Connors, C. Péan, S. Berger, N. Caud, Y. Chen, L. Goldfarb, M.I. Gomis, M. Huang, K. Leitzell, E. Lonnoy, J.B.R. Matthews, T.K. Maycock, T. Waterfield, O. Yelekçi, R. Yu, and B. Zhou (eds.), IPCC, 2021: Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, UK and New York, USA, 2021), doi:10.1017/9781009157896. 

[3] R. Holmes, “The problem with solutions”, Places Journal (2020). 

[4] V. Masson-Delmotte, P. Zhai, A. Pirani, S.L. Connors, C. Péan, S. Berger, N. Caud, Y. Chen, L. Goldfarb, M.I. Gomis, M. Huang, K. Leitzell, E. Lonnoy, J.B.R. Matthews, T.K. Maycock, T. Waterfield, O. Yelekçi, R. Yu, and B. Zhou (eds.), IPCC, 2021: Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, UK and New York, USA, 2021), doi:10.1017/9781009157896. 

[5] B. Cantrell, L.J. Martin, and E.C. Ellis, “Designing autonomy: Opportunities for new wildness in the Anthropocene”, Trends in Ecology & Evolution 32.3 (2017), 156-166. 

[6] Ibid. 

[7] R.T. Corlett, “Restoration, reintroduction, and rewilding in a changing world”, Trends in Ecology & Evolution 31 (2016), 453–462. 

[8] J. Svenning, et al., “Science for a wilder Anthropocene: Synthesis and future directions for trophic rewilding research”, Proceedings of the National Academy of Sciences 113 (2015), 898–906. 

[9] P. N. Edwards, A vast machine: Computer models, climate data, and the politics of global warming (MIT Press, Cambridge, 2010). 

[10] P. N. Edwards, “Control earth”, Places Journal (2016). 

[11] J. Baudrillard, Cool Memories V: 2000-2004, (Polity, Oxford, 2006). 

[12] Ovid, Metamorphoses III, (Indiana University Press, Bloomington, 1955), 85. 

[13] B. Han, Psychopolitics: Neoliberalism and new technologies of power, (Verso Books, New York, 2017). 

[14] B. Frohmann, Deflating Information, (University of Toronto Press, Toronto, 2016). 

[15] F. Nietzsche, On the Advantage and Disadvantage of History for Life, (1874). 

[16] P. C. D. Milly, et al. “Stationarity is dead: whither water management?”, Science 319.5863 (2008), 573-574. 

[17] J. Joque, Revolutionary Mathematics: Artificial Intelligence, Statistics and the Logic of Capitalism, (Verso Books, New York, 2022). 

[18] Gunderson and Holling (2001); and C.S. Holling, “From complex regions to complex worlds”, Ecology and Society 9.1 (2004), 11. 

[19] S. Wakefield, Anthropocene Back Loop (Open Humanities Press, 2020). 

[20] A. N. H. Creager, et al., eds. Science without laws: model systems, cases, exemplary narratives (Duke University Press, Durham, 2007). 

[21] Ibid. 

Architectural Authorship in “the Last Mile”
29/04/2022
Architectural Authorship, automation, digitalisation, Fun Palace, Leon Battista Alberti, mass-customisation, the Last Mile
Yixuan Chen

y.chen.20@alumni.ucl.ac.uk

Introduction 

A loyal companion to the breakthroughs of artificial intelligence is the fear of losing jobs due to a robotic takeover of the labour market. Mary L. Gray and Siddharth Suri’s research on ghost work unveiled another possible future, where a “last mile” requiring human intervention would always exist in the journey towards automation. [1] The so-called “paradox of the last mile” has shaped the human labour market throughout the industrial age, recurrently re-organising itself as it absorbs marginalised groups into its territory. These groups range from child labourers in factories, to the “human computer” women of NASA, to on-demand workers from Amazon Mechanical Turk (MTurk). [2] Yet their strenuous efforts are often rendered invisible behind the ostensibly neutral algorithmic form of the automation process, creating “ghost work”. [3] 

Based on this concept of “the last mile”, this study intends to excavate how its paradox has influenced architectural authorship, especially during architecture’s encounters with digital revolutions. I will firstly contextualise “architectural authorship” and “the last mile” in previous studies. Then I will discuss the (dis)entanglements between “automation” and “digitalisation”. Following Antoine Picon and Nicholas Negroponte, I distinguish between the pre-information age, information age and post-information age before locating my arguments according to these three periods. Accordingly, I will study how Leon Battista Alberti, the Fun Palace, and mass-customised houses fail in the last mile of architectural digitalisation and how these failures affect architectural authorship. From these case studies, I challenge the dominant narrative of architectural authorship, either as divinity or total dissolution. In the end, I contend that it is imperative to conceive architectural authorship as relational and call for the involvement of multi-faceted agents in this post-information age. 

Academic Context 

Architectural Authorship in the Digital Age 

The emergence of architects’ authorial status can be dated back to Alberti’s De re aedificatoria, which states that “the author’s original intentions” should be sustained throughout construction. [4] Yet at the same time, those architects should keep a distance from the construction process. [5] It not only marks the shift from the artisanal authorship of craftsmen to the intellectual authorship of architects but also begets the divide between the authorship of architectural designs and architectural end products. [6] However, this tradition can be problematic in the digital age, when multi-layered authorship becomes feasible with the advent of mass-collaboration software and digital customisation technologies. [7] 

Based on this, Antoine Picon has argued that, despite attempts to include various actors by collaborative platforms such as BIM, architects have entered the Darwinian world of competition with engineers, constructors and existing monopolies, to maintain their prerogative authorship over the profession. [8] These challenges have brought about a shifting attention in the profession, from authorship as architects to ownership as entrepreneurs. [9] Yuan and Wang, on the other hand, call for a reconciliation of architectural authorship between regional traditions and technologies from a pragmatic perspective. [10] However, these accounts did not throw off the fetters of positioning architects as the centre of analysis. In the following article, I will introduce “the last mile”, a theory from the field of automation, to provide another perspective on the issues of architectural authorship. 

“The Last Mile” as Method 

The meaning of “the last mile” has changed several times throughout history. Metaphorically, it was used to indicate the distance between the status quo and the goal, in various fields, such as movies, legal negotiations, and presidential campaigns. [11] It was first introduced in the technology industry as “the last mile” of telecommunication, one of the earliest traceable records of which dates to the late 1980s. [12] Afterwards, “the last mile” of logistics began to be widely used in the early 2000s, following the dot-com boom of the late 90s that fuelled discussions of B2C eCommerce. [13] However, in this article, I will use “the last mile” of automation, a concept from the “AI revolution” under way since 2010, to reconsider architectural authorship. [14] In this context, “the last mile” of automation refers to “the gap between what a person can do and what a computer can do”, as Gray and Suri define it in their book. [15] 

I employ this theory to discuss architectural authorship for two purposes.  

1. Understanding the paradox of automation helps explain how architectural authorship changes along with technological advancements. Pasquinelli and Joler suggest that “automation is a myth”, because machines have never entirely operated by themselves without human assistance, and might never do so. [16] From this arises the paradox that “the desire to eliminate human labour always generates new tasks for humans”, a shortcoming that has “stretched across the industrial era”. [17] Although confined within the architectural profession, architectural authorship changes in parallel with these alterations of labour tasks. 

2. I contend that changes in denotations of “the last mile” signal turning points in both digital and architectural history. As Figure 1 suggests, in digital history, the implication of the last mile has changed from the transmission of data to the analysis of data, and then to automation based on data. The former change was in step with the arrival of the small-data environment in the 1990s and the latter corresponds with the leap towards the big-data environment around 2010. [18] In a similar fashion, after the increasing availability of personal computers after the 90s, the digital spline in architecture found formal expression and from around 2010 onwards, spirits of interactivity and mass-collaboration began to take their root in the design profession. [19] Therefore, revisiting the digital history of architecture from the angle of “the last mile” can not only provide alternative readings of architectural authorship in the past but can also be indicative of how the future might be influenced. 

Figure 1 Changes of Meanings for “the Last Mile” in Digital History, and Digital Turns in Architectural History. 

Between Automation and Digitalisation 

Before elucidating how architectural authorship was changed by the arrival of the automated/digital age, it is imperative to distinguish two concepts mentioned in the previous section – automation and digitalisation. To begin with, although automation first came into use in the automotive industry in 1936 to describe “the automatic handling of parts”, what this phrase alludes to has long been rooted in history. [20] As Ekbia and Nardi define it, automation essentially relates to labour-saving mechanisms that reduce the human burden by transferring it to machines in labour-requiring tasks, both manual and cognitive. [21] Despite this long history, it was not until the emergence of digital computers after WWII that the term became widely applicable. [22] The notion of computerised automation was put forward by the computer scientist Michael Dertouzos in 1979, highlighting its potential for tailoring products on demand. [23] With respect to cognitive tasks, artificial intelligence that mimics human thinking is employed to tackle functions concerning “data processing, decision making, and organizational management”. [24] 

Digitalisation, on the other hand, is a more recent concept engendered by the society of information in the late 19th century, according to Antoine Picon. [25] This period was later referred to as the Second Industrial Revolution, when mass-production was made possible by a series of innovations, including electrical power, automobiles, and the internal combustion engine. It triggered what Beniger called the “control revolution” – the volume of data exploded to the degree that it begot revolutions in information technology. [26] Crucial to this revolution was the invention of digital computing, which brought about a paradigm shift in the information society. [27] It has changed “the DNA of information” in the sense that, as Nicholas Negroponte suggests, “all media has become digital”, by converting information from atoms to bits. [28] In this sense, Negroponte distinguishes between the information age, which is based on economics of scale, and the post-information age, founded on personalisation. [29] 

It can be observed that automation and digitalisation are intertwined in multiple ways. Firstly, had there been no advancement in automation during the Second Industrial Revolution, there would be no need to develop information technology, as data would have remained at a manageable level. Secondly, the advent of digital computers has further intermingled these two concepts, to the extent that, in numerous cases, for something to be automated, it first needs to be digitalised, and vice versa. In the architectural field alone, examples of this can be found in cybernetics in architecture and planning, digital fabrication, smart materials, and so on. Hence, although these two terms are fundamentally different – most obviously, automation is affiliated with the process of input and output, while digitalisation relates to information media – the following analysis makes no attempt to differentiate between the two. Instead, I discuss “the last mile” in the context of reciprocity between these two concepts. After all, architecture itself sits at the convergence point between material objects and media technologies. [30] 

Leon Battista Alberti: Before the Information Age 

Digitalisation efforts made by architects, however, appeared to come earlier than such attempts made in industrial settings of the late 19th century. This spirit can be traced back to Alberti’s insistence on identicality during information transmission, by compressing two-dimensional and three-dimensional information into digits – which is exemplified by Descriptio Urbis Romae and De statua. [31] In terms of architecture, as mentioned previously, he positions built architecture as an exact copy of architects’ intention. [32] This stance might be influenced by his views on painting. First, he maintains that all arts, including architecture, are subordinate to paintings, where “the architraves, the capitals, the bases, the columns, the pediments, and all other similar ornaments” came from. [33] Second, in his accounts, “the point is a sign” that can be seen by eyes, the line is joined by points, and the surface by lines. [34] As a result, the link between signs and architecture is established through paintings since architecture is derived from paintings and paintings from points/signs.  

Furthermore, architecture can also be built according to the given signs. In Alberti’s words, “the whole art of buildings consists in the design (lineamenti), and in the structure”, and by lineamenti, he means the ability of architects to find “proper places, determinate numbers, just proportion and beautiful order” for their constructions. [35] It can be assumed that, if buildings are to be identical to their design, then, to begin with, there must be “determinate numbers” to convey architects’ visions by digital means – such as De statua (Fig. 2). Also, in translating the design into buildings, these numbers and proportions should be unbothered by any distortions as they are placed in actual places – places studied and measured by digital means, just like Descriptio Urbis Romae (Fig. 2). 

Although the Albertian design process reflects the spirit of the mechanical age, insisting on the identicality of production, it can be argued that his pursuit of precise copying was also influenced by his pre-modern digital inventions being used to manage data. [36] Therefore, what signs/points mean to architecture for Alberti can be compared to what bits mean to information for Negroponte, as the latter is composed of the former and can be retrieved from the former. Ideally, this translation process can be achieved by means of digitalisation. 

Figure 2 Descriptio Urbis Romae (Left) and De statua (Right) [37] 

Yet it is obvious that the last mile for Alberti is vastly longer than that for Negroponte. As Giorgio Vasari noted in the case of Servite Church of the Annunziata, while Alberti’s drawings and models were employed for the construction of the rotunda, the result turned out to be unsatisfactory, and the arches of nine chapels are falling backwards from the tribune due to construction difficulties. [38] Also, in the loggia of the Via della Vigna Nuova, his initial plan to build semi-circular vaults was aborted because of the inability to fulfil this shape on-site. [39] These two cases suggest that the allographic design process – employing precise measurements and construction – which heralded the modern digital modelling software and 3D-printing technologies, was deeply problematic in Alberti’s time. 

This problem was recognised by Alberti himself in his De re aedificatoria, when he wrote that to be “a wise man”, one cannot stop in the middle or at the end of one’s work and say, “I wish that were otherwise”. [40] In Alberti’s opinion, this problem can be offset by making “real models of wood and other substances”, as well as by following his instruction to “examine and compute the particulars and sum of your future expense, the size, height, thickness, number”, and so on. [41] While models can be completed without being exactly precise, architectural drawings should achieve the exactness measured “by the real compartments founded upon reason”. [42] According to these descriptions, the design process conceived by Alberti can be summarised as Figure 3. 

Figure 3 Albertian Design Process 
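Read this way, Alberti's demand for “determinate numbers” resembles a notational pipeline, and a toy version can be written down. The sketch below, in Python, is purely illustrative – the proportions, names and figures are invented, not Alberti's own – but it shows the principle: design intent reduced to numbers that an executor could carry to site unchanged.

# A toy "compute the particulars" pass, in the spirit of Alberti's instruction
# to fix size, height, thickness and number before construction begins.
# All figures here are invented for illustration.
COLUMN = {"height": 7.0, "diameter": 0.875}   # metres; an 8:1 proportion
PORTICO = {"columns": 6, "bay": 2.625}        # bay spacing = 3 diameters

def particulars(column, portico):
    """Return the determinate numbers an executor would need on site."""
    width = (portico["columns"] - 1) * portico["bay"] + column["diameter"]
    return {
        "columns": portico["columns"],
        "column_height_m": column["height"],
        "proportion": column["height"] / column["diameter"],  # must not drift
        "portico_width_m": round(width, 3),
    }

print(particulars(COLUMN, PORTICO))

The fragility Vasari records sits exactly here: the numbers survive transmission only if measurement and execution can honour them, and in Alberti's time that last step repeatedly failed.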

If, as previously discussed, architecture and its context can be viewed as an assembly of points and signs, the Albertian design process can be compared to how these data are collected, analysed and judged until the process reaches the “good to print” point – the point when architects exit and construction begins. Nonetheless, what Vasari has unveiled is that the collection, analysis and execution of data can fail due to technological constraints, and this failure impedes architects from making a sensible judgement. Here, the so-called “technological constraints” are what I consider to be “the last mile” that can be found across the Albertian design process. As Vasari added, many of these technological limitations at that time were surmounted with the assistance of Salvestro Fancelli, who realised Alberti’s models and drawings, and a Florentine named Luca, who was responsible for the construction process. [43] Regardless of these efforts, Alberti remarked that only people involved in intellectual activities – especially mathematics and paintings – are architects; the opposite of craftsmen. [44] Subsequently, the challenges of confronting “the last mile” are removed from architects’ responsibilities through this ostensibly neutral design process, narrowing the scope of who is eligible to be called an architect. The marginalisation of artisanal activities, either those of model makers, draughtsmen or craftsmen, is consistent with attributing the laborious last mile of data collection, analysis and execution – measuring, model making, constructing – exclusively to their domain. 

While the division of labour is necessary for architecture, as John Ruskin argued, it would be “degraded and dishonourable” if manual work were less valued than intellectual work. [45] For this reason, Ruskin praised Gothic architecture with respect to the freedom granted to craftsmen to execute their own talents. [46] Such freedom, however, can be expected if the last mile is narrowed to the extent that, through digitalisation/automation, people can be at the same time both architects and craftsmen. Or can it? 

Fun Palace: At the Turn of the Information and Post-Information Age 

Whilst the Albertian allographic mode of designing architecture has exerted a profound impact on the architectural discipline due to subsequent changes to the ways architects have been trained, from the site to the academy, this ambition of separating design from buildings was not fulfilled, or even agreed upon among architects, in the second half of the 20th century. [47] Besides, the information age on the basis of scale had limited influences on architectural history, except for bringing about a new functional area – the control room. [48] Architecture’s initial encounters with the digital revolution after Alberti’s pre-modern technologies can be traced back to the 1960s, when architects envisaged futuristic cybernetic-oriented environments. [49] Different from Alberti’s emphasis on the identicality of information – the information per se – this time, digitalisation and information in architecture convey a rather different message. 

Gordon Pask defined cybernetics as “the field concerned with information flows in all media, including biological, mechanical, and even cosmological systems”. [50] By emphasising the flow of data – rather than the information per se – cybernetics distinguishes itself in two aspects. Firstly, it is characterised by attempts at reterritorialization – it breaks down the boundaries between biological organisms and machines, between observers and systems, and between observers, systems and their environments, during its different development phases – which are categorised respectively as first-order cybernetics (1943-1960), second-order cybernetics (1960-1985) and third-order cybernetics (1985-1996). [51]  

Secondly, while data and information became secondary to their flow, catalysed by technologies and mixed realities, cybernetics is also typified by the construction of frameworks. [52] The so-called framework was initially perceived as a classifying system for all machines, and later, after computers were made more widely available and powerful, it began to be recognised as the computational process. [53] This thinking also leads to Stephen Wolfram’s assertion that the physical reality of the whole universe is generated by the computational process and is itself a computational process. [54] This is where the fundamental difference is between the Albertian paradigm and cybernetics, as the former is based on mathematical equations and the latter attempts to understand the world as a framework/computation. [55] Briefly, in cybernetics theory, information per se is subordinate to the flow of information and this flow can again be subsumed into the framework, which is later known as computational processes (Fig. 4). 

Figure 4 Information in Cybernetics Theory 

In Cedric Price’s Fun Palace, this hierarchical order resulted in what Isozaki described as “erasing architecture into system” after its partial completion (Fig. 5). [56] Such an erasure of architecture was rooted in the conceptual process, since the cybernetics expert in charge of the Fun Palace was Gordon Pask, who founded his theory and practice on second-order cybernetics. [57] Considering that one major feature of second-order cybernetics is what Maturana and Varela termed “allopoiesis” – a process of producing something other than the system’s original component – it is understandable that if the system is architecture, then it would generate something other than architecture. [58] In the case of the Fun Palace, it was presupposed that architecture is capable of generating social activities, and that architects can become social controllers. [59] More importantly, Cedric Price rejected all that is “designed” and instead only made sketches of indistinct elements, diagrams of forces, and functional programs, rather than architectural details. [60] All these ideas, highlighting the potential in regarding architecture as the framework of computing – in contrast to seeing architecture as information – rendered the system more pronounced and set architecture aside. 

Figure 5 Fun Palace in London before Demolition [61] 

By rejecting architecture as pre-designed, Price and Littlewood strived to problematize the conventional paradigm of architectural authorship. They highlighted that the first and foremost quality of the space should be its informality, and that “with informality goes flexibility”. [62] This envisages user participation by rebuking fixed interventions by architects such as permanent structures or anchored teak benches. [63] In this regard, flexibility is no longer positioned as a trait of buildings but that of use, by encouraging users to appropriate the space. [64] As a result, it delineates a scenario of “the death of the author” in which buildings are no longer viewed as objects by architects, but as bodily experiences by users – architectural authorship is shared between architects and users. [65] 

However, it would be questionable to claim the anonymity of architectural authorship – anonymous in the sense of “the death of the author” – based on an insignificant traditional architectural presence in this project, as Isozaki did. [66] To begin with, Isozaki himself has remarked that in its initial design, the Fun Palace would have been “bulky”, “heavy”, and “lacking in freedom”, indicating the deficiency of transportation and construction technologies at that time. [67] Apart from the last mile to construction, as Reyner Banham explained, if the Fun Palace’s vision of mass-participation is to be accomplished, three premises must be set – skilful technicians, computer technologies that ensure interactive experiences and programmable operations, and a secure source of electricity connecting to the state grid. [68] While the last two concerns are related to technological and infrastructural constraints, the need for technicians suggests that, despite its claim, this project is not a fully automated one. The necessary involvement of human factors to assist this supposedly automated machine can be further confirmed in Price and Littlewood’s accounts that “the movement of staff, piped services and escape routes” would be contained within “stanchions of the superstructure”. [69] Consequently, if architects can extend their authorship by translating elements of indeterminacy into architectural flexibility, and users can be involved by experiencing and appropriating the space, it would be problematic to leave the authorship of these technicians unacknowledged and confine them within service pipes. [70] 

The authorship of the Fun Palace is further complicated when the content of its program is scrutinized. Price and Littlewood envisaged that people’s activities would feed into the system, and that decisions would be made according to this information. [71] During this feed-in and feedback process, human activities would be quantified and registered in a flow chart (Fig. 6). [72] However, the hand-written proposed list of activities in Figure 6 shows that human engagement is inseparable from the ostensibly automated flow chart. The arrows and lines mask human labours that are essential for observing, recognising, and classifying human activities. These tasks are the last mile of machine learning, which still requires heavy human participation even in the early 21st century. 

For instance, when, in 2007, the artificial intelligence project ImageNet was developed to recognise and identify the main object in pictures, developers found it impossible to increase the system’s accuracy by developing AI alone (and only assisting it when it failed). [73] Finally, they improved the accuracy of ImageNet’s algorithms by finding a “gold standard” of labelling the object – not from the developments of AI itself, but by using 49,000 on-demand workers from the online outsourcing platform MTurk to perform the labelling process. [74] This example suggests that if the automation promised by the Fun Palace is to be achieved, it is likely to require more than just the involvement of architects, users, and technicians. In the time of the Fun Palace’s original conception, the attempt was not fulfilled due to the impotence of computing technologies. Yet if such an attempt was to be made in the 2020s, it is likely that architectural authorship would be shared among architects, users, technicians, and ghost workers from platforms such as MTurk. 
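The routing pattern at stake can be condensed into a few lines. The Python fragment below is a hypothetical sketch of such a human-in-the-loop fallback; classify and ask_human are invented stand-ins, not MTurk's or ImageNet's actual interfaces, and the threshold is an assumption.

CONFIDENCE_THRESHOLD = 0.90  # assumed cut-off; real systems tune this value

def classify(image_path):
    """Stand-in for a trained model: returns (label, confidence)."""
    return "dog", 0.62  # hard-coded stub for illustration

def ask_human(image_path):
    """Stand-in for posting a labelling task to an on-demand platform."""
    return input(f"Please label {image_path}: ")

def label(image_path):
    guess, confidence = classify(image_path)
    if confidence >= CONFIDENCE_THRESHOLD:
        return guess               # automation keeps the easy cases
    return ask_human(image_path)   # humans absorb the residual "last mile"

print(label("photos/0001.jpg"))

Everything below the threshold – the hard, ambiguous residue – flows silently to human workers, which is precisely the labour that disappears behind the flow chart's arrows.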

Figure 6 Cybernetic Diagram (Left) and Proposed Activities (Right) [75] 

Returning to the topic of cybernetics, whilst cybernetic theories tend to redefine territories of the architectural system by including what was previously the other parts of the system – machines, observers, adaptive environments – the example of the Fun Palace has shown that this process of blurring boundaries would not be possible without human assistance, at least initially. The flow of information between these spheres would require human interventions to make this process feasible and comprehensible because, in essence, “the information source of machine learning (whatever its name: input data, training data or just data) is always a representation of human skills, activities and behaviours, social production at large”. [76] 

Houses of Mass-Customisation: In the Post-information Age 

Although cybernetics theories have metaphorically or practically influenced architectural discourse in multiple ways, from Metabolism and Archigram to Negroponte and Cedric Price, such impact was diminished after the 1970s, in parallel with the near-total banishment of cybernetics as an independent discipline in academia. [77] After a long hibernation during “the winter of artificial intelligence”, architecture’s next encounter with digital revolutions happened in the 1990s. [78] It was triggered by the increasing popularity and affordability of personal computers – contrary to the expectations of cybernetics engineers, who back in the 1960s dreamt that computers would increase both in power and size. [79] These distinctive material conditions led to the underlying difference between the second-order cybernetics of the 1960s and architecture’s first digital turn in the 1990s. I contend that this distinction can be explained by comparing Turing’s universal machine with Deleuze’s notion of the “objectile”. 

As Stanley Mathews argued, the Fun Palace works in the same way as the universal machine. [80] The latter is a precursor of modern electronic computers, which can function as different devices – either as typewriters, drawing boards, or other machines – according to different codes they receive (Fig. 7). [81] Comparatively, “objectile” connotes a situation in which a series of variant objects is produced based on their shared algorithms (Fig. 8). [82] These products are so-called “non-standard series”, whose key definition relates to their variance rather than form. [83]  

Figure 7 Simplified Diagram of the Universal Machine 
Figure 8 Non-standard Production 

While the universal machine claims its universality through sheer capacity – an infinite one-dimensional tape on which its programmers can mark the symbols of any instruction – and seems to require more power to support its every change, non-standard production can operate on a smaller scale and under less demanding environments. [84] The emphasis on variance in non-standard production processes also indicates a shift of attention from the “process” underscored by second-order cybernetics towards the product of certain parametric models. When the latter is applied to architecture, the physical building regains its significance as the variable product. 
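The contrast can also be shown schematically. The sketch below, in Python with invented names and parameters, treats the “objectile” as a shared generative function: one fixed algorithm, a non-standard series of variant instances. It is a conceptual illustration only, not the notation of any particular CAD system.

from dataclasses import dataclass

@dataclass
class Pavilion:
    span: float   # metres
    rise: float   # metres
    bays: int

def objectile(span, rise_ratio=0.3, bay_width=1.2):
    """One shared algorithm; every parameter set yields a distinct object."""
    return Pavilion(span=span, rise=span * rise_ratio, bays=round(span / bay_width))

# A non-standard series: variants of the same objectile, none identical.
for pavilion in [objectile(s) for s in (6.0, 7.5, 9.0, 12.0)]:
    print(pavilion)

Where the universal machine swaps whole programs to become a different device, the objectile holds the program constant and lets the parameters vary – variance, not form, is what the series shares.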

However, it does not mean a total cut-off between cybernetics and non-standard production. Since human-machine interactions are crucial for customising according to users’ input, I maintain that mass-customisation reconnects architecture with first-order cybernetics whilst resisting the notion of chaos and complexity intrinsic in second-order cybernetics.  

Figure 9 Flatwriter [85] 

Such correlation can be justified by comparing two examples. First, the visionary project Flatwriter (1967) by the Hungarian architect Yona Friedman proposed a scenario in which users can choose their preferred apartment plan from several patterns of spatial configurations, locations, and orientations. [86] Based on their preferences, they would receive optimised feedback from the system (Fig. 9). [87] This optimisation process would consider issues concerning access to the building, comfortable environments, lighting, communication, and so on. [88] Given that it rejects chaos and uncertainty by adjusting users’ selections for certain patterns of order and layout, this user-computer interaction system is essentially an application of first-order cybernetics, as Yiannoudes argued. [89] Contemporary open-source architectural platforms are based on the same logic. As the founder of WikiHouse argued, since the target group of mass-customisation is the 99 per cent who are constantly overlooked by the normative production of buildings after the retreat of state intervention, designing “normal” environments for them is the primary concern – transgression and disorder should be set aside. [90] As Figure 10 illustrates, similarly to Flatwriter, in theory, WikiHouse would pre-set design rules and offer design proposals according to calculations of the parametric model. [91] These rules would follow a “LEGO-like system”, which produces designs by arranging and composing standard types or systems. [92] Both Flatwriter’s optimisation and WikiHouse’s “LEGO-like system” are pursuing design in accordance with patterns, and discouraging chaotic results. 

Figure 10 Designing Process for a WikiHouse [93] 
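As a thought experiment, the rule-bound behaviour of such configurators can be reduced to a toy. Everything in the sketch below – the component names, widths, and the single MAX_WIDTH rule – is invented; neither Flatwriter nor WikiHouse publishes this exact logic. The structural point survives the simplification: preferences go in, and whatever violates the pre-set pattern is filtered out.

COMPONENTS = {"bedroom": 3.0, "kitchen": 2.4, "bath": 1.8}  # widths in metres
MAX_WIDTH = 9.0  # a pre-set rule standing in for the "LEGO-like" grammar

def propose(preferences):
    """Arrange requested components left to right, dropping whatever breaks the rule."""
    plan, used = [], 0.0
    for room in preferences:
        width = COMPONENTS.get(room)
        if width is None or used + width > MAX_WIDTH:
            continue  # the system quietly enforces order over user intent
        plan.append((room, round(used, 1)))
        used += width
    return plan

# The fourth request is dropped: order is preserved, "chaos" is filtered out.
print(propose(["bedroom", "bedroom", "kitchen", "bath"]))

The design choice worth noticing is that user agency enters only as a request list; the grammar, and with it the authorship, stays with whoever wrote the rules.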

Nevertheless, neither Flatwriter nor WikiHouse has achieved what is supposed to be an automatic process of using parametric models to generate a variety of designs. For Flatwriter, the last mile of automation could be ascribed to the unavailability of computers capable of performing calculations or processing images. For WikiHouse, the project has not yet fulfilled its promise of developing algorithms for design rules that resemble how the “LEGO blocks” are organised. Specifically, at the current stage, plans, components and structures of WikiHouse are designed in SketchUp by hand. [94] The flexibility granted to users is achieved by grouping plywood lumber into components and allowing users to duplicate them (Fig. 11). Admittedly, if users are proficient in SketchUp, they could possibly customise their WikiHouse on demand – but that would then go against the promise of democratising buildings through open-source platforms. [95]  

Figure 11 SketchUp Models of WikiHouse [96] 

Consequently, the last mile of automation again causes a conundrum of architectural authorship. Firstly, in both cases, never mind “the death of the author”, it appears that there is no author to be identified. One can argue that this signals a democratic spirit, anonymising the once Howard Roark-style architects and substituting them with a “creative common”. Nonetheless, it must be cautioned that such substitution takes time, and during this time, architects are obliged to be involved when automation fails. To democratise buildings is not to end architects’ authorship over architecture but, conceivably, for architects to serve for a long time as what Ratti and Claudel called “choral architects”, who sit at the intersection of top-down and bottom-up, orchestrating the transition from the information age of scale to the post-information age of collaboration and interactivity. [97] Although projects with similar intentions of generating design and customising housing through parametric models – such as Intelligent City and Nabr – may prove to be more mature in their algorithmic processes, architects are still required to coordinate across extensive sectors – clients’ inputs, design automation, prefabrication, logistics, and construction. [98] Architectural authorship in this sense is not definitive but relational, carrying multitudes of meanings and involving multiplicities of agents. [99]  

In addition, it would be inaccurate to claim architectural authorship for the user, even though these projects all prioritise users’ opinions in the design process. By hailing first-order cybernetics while rejecting the second order, advocating order while disapproving of disorder, they risk the erasure of architectural authorship – just as those who play with LEGO do not have authorship over the brand, to extend the metaphor of the “LEGO-like system” in WikiHouse. This is especially so because a digital turn in terms of technology does not guarantee a cognitive turn in terms of thinking. [100] Assuming that the capitalist characteristics of production will not change, technological advancements are likely to be appropriated by corporate and state power, either by means of monopoly or censorship.  

Figure 12 Non-standard Production After Repositioning Users 

This erasure of human agency should be further elucidated in relation to the suppression of chaos in these systems. As Robin Evans explained, there are two types of methods to address chaos: (1) preventing humans from making chaos by organising humans; and (2) limiting the effects of chaotic environments by organising the system. [101] While Flatwriter and WikiHouse follow the former at the expense of diminishing human agency, it is necessary to reinvite observers and chaos as an integral part of the system on the way towards mass-customisation and mass-collaboration (Fig. 12). 

Conclusion 

For Walter Benjamin, “the angel of history” moves into the future with its face turned towards the past, where wreckages were piled upon wreckages. [102] For me, addressing the paradox of “the last mile” in the history of architectural digitalisation is this backward gaze that can possibly provide a different angle to look into the future.  

This article mainly discussed three moments in architectural history when technology failed to live up to the expectation of full automation/digitalisation. Such failure is where “the last mile” lies. I employ “the last mile” as a perspective to scrutinize architectural authorship in these moments of digital revolutions. Before the information age, the Albertian notational system can be regarded as one of the earliest attempts to digitalise architecture. Alberti’s insistence on the identical copying between designers’ drawings and buildings resulted in the divide between architects as intellectuals and artisans as labourers. However, this allographic mode of architectural authorship was not widely accepted even into the late 20th century.  

At the turn of the information age and post-information age, Cedric Price’s Fun Palace was another attempt made by architects to respond to the digital revolution in the post-war era. It was influenced by second-order cybernetics theories that focused on the flow of information and the computational process. Buildings were deemed only as a catalyst, and architectural authorship was shared between architects and users. Yet by examining how the Fun Palace failed in the last mile, I put forward the idea that this authorship should also be attributed to technicians and ghost workers assisting the computation processes behind the stage. 

Finally, I analysed two case studies of open-source architectural platforms established for mass-customisation. By comparing Flatwriter of the cybernetics era and WikiHouse of the post-information age, I cautioned that both systems degrade architectural authorship into emptiness, by excluding users and discouraging acts of chaos. Also, by studying how these systems fail in the last mile, I position architects as “choral architects” who mediate between the information and post-information age. Subsequently, architectural authorship in the age of mass-customisation and mass-collaboration should be regarded as relational, involving actors from multiple positions. 

References

  1. Mary L. Gray and Siddharth Suri, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass (New York: Houghton Mifflin Harcourt Publishing Company, 2019).
  2. Gray and Suri.
  3. Gray and Suri.
  4. Mario Carpo, The Alphabet and the Algorithm (London: The MIT Press, 2011), p. 22.
  5. Carpo, The Alphabet and the Algorithm, p. 22.
  6. Carpo, The Alphabet and the Algorithm, pp. 22–23.
  7. Mario Carpo, The Second Digital Turn: Design Beyond Intelligence (Cambridge, MA: The MIT Press, 2017), pp. 131, 140.
  8. Antoine Picon, ‘From Authorship to Ownership’, Architectural Design, 86.5 (2016), pp. 39–40.
  9. Picon, ‘From Authorship to Ownership’, pp. 39 & 41.
  10. Philip F. Yuan and Xiang Wang, ‘From Theory to Praxis: Digital Tools and the New Architectural Authorship’, Architectural Design, 88.6 (2018), 94–101 (p. 101) <https://doi.org/10.1002/ad.2371>.
  11. ‘“The Last Mile” An Exciting Play’, New Leader with Which Is Combined the American Appeal, 10.18 (1930), 6; Benjamin B Ferencz, ‘Defining Aggression–The Last Mile’, Columbia Journal of Transnational Law, 12.3 (1973), 430–63; John Osborne, ‘The Last Mile’, The New Republic (Pre-1988) (Washington, 1980), 8–9.
  12. Donald F Burnside, ‘Last-Mile Communications Alternatives’, Networking Management, 1 April 1988, 57-.
  13. Mikko Punakivi, Hannu Yrjölä, and Jan Holmström, ‘Solving the Last Mile Issue: Reception Box or Delivery Box?’, International Journal of Physical Distribution and Logistics Management, 31.6 (2001), 427–39 <https://doi.org/10.1108/09600030110399423>.
  14. Gray and Suri, p. 12.
  15. Gray and Suri, p. 12.
  16. Matteo Pasquinelli and Vladan Joler, ‘The Nooscope Manifested: AI as Instrument of Knowledge Extractivism’, 2020, pp. 1–23 (p. 19) <https://doi.org/10.1007/s00146-020-01097-6>.
  17. Gray and Suri, pp. 12 & 71.
  18. Carpo, The Second Digital Turn: Design Beyond Intelligence, pp. 9, 18 & 68.
  19. Carpo, The Second Digital Turn: Design Beyond Intelligence, pp. 5, 18 & 68.
  20. James Beniger, The Control Revolution: Technological and Economic Origins of the Information Society (London: Harvard University Press, 1986), p. 295.
  21. Hamid R. Ekbia and Bonnie Nardi, Heteromation, and Other Stories of Computing and Capitalism (Cambridge, Massachusetts: The MIT Press, 2017), p. 25.
  22. Ekbia and Nardi, pp. 25–26.
  23. Michael L. Dertouzos, ‘Individualized Automation’, in The Computer Age: A Twenty-Year View, ed. by Michael L. Dertouzos and Joel Moses, 4th edn (Cambridge, Massachusetts: The MIT Press, 1983), p. 52.
  24. Ekbia and Nardi, p. 26.
  25. Antoine Picon, Digital Culture in Architecture: An Introduction for the Design Professions (Basel: Birkhäuser, 2010), p. 16.
  26. Beniger, p. 433.
  27. Picon, Digital Culture in Architecture: An Introduction for the Design Professions, pp. 24–26.
  28. Nicholas Negroponte, Being Digital (New York: Vintage Books, 1995), pp. 11 & 16.
  29. Negroponte, pp. 163–64.
  30. Carpo, The Alphabet and the Algorithm, p. 12.
  31. Carpo, The Alphabet and the Algorithm, pp. 54–55.
  32. Carpo, The Alphabet and the Algorithm, p. 26.
  33. Leon Battista Alberti, On Painting, trans. by Rocco SiniSgalli (Cambridge: Cambridge University Press, 2011), p. 45.
  34. Alberti, On Painting, p. 23.
  35. Leon Battista Alberti, The Ten Books of Architecture (Toronto: Dover Publications, Inc, 1986), p. 1.
  36. Carpo, The Alphabet and the Algorithm, p. 27.
  37. ‘Architectural Intentions from Vitruvius to the Renaissance’ [online] <https://f12arch531project.files.wordpress.com/2012/10/xproulx-4.jpg>; ‘Alberti’s Diffinitore’ <http://www.thesculptorsfuneral.com/episode-04-alberti-and-de-statua/7zf3hfxtgyps12r9igveuqa788ptgj> [accessed 23 April 2021].
  38. Giorgio Vasari, The Lives of the Artists, trans. by Julia Conaway & Peter Bondanella (Oxford: Oxford University Press, 1998), p. 182.
  39. Vasari, p. 181.
  40. Alberti, The Ten Books of Architecture, p. 22.
  41. Alberti, The Ten Books of Architecture, p. 22.
  42. Alberti, The Ten Books of Architecture, p. 22.
  43. Vasari, p. 183.
  44. Mary Hollingsworth, ‘The Architect in Fifteenth-Century Florence’, Art History, 7.4 (1984), 385–410 (p. 396).
  45. Adrian Forty, Words and Buildings: A Vocabulary of Modern Architecture (New York: Thames & Hudson, 2000), p. 138.
  46. Forty, p. 138.
  47. Forty, p. 137; Carpo, The Alphabet and the Algorithm, p. 78.
  48. Picon, Digital Culture in Architecture: An Introduction for the Design Professions, p. 20.
  49. Mario Carpo, ‘Myth of the Digital’, Gta Papers, 2019, 1–16 (p. 3).
  50. N. Katherine Hayles, ‘Cybernetics’, in Critical Terms for Media Studies, ed. by W.J.T. Mitchell and Mark B.N. Hansen (Chicago and London: The University of Chicago Press, 2010), p. 145.
  51. Hayles, p. 149.
  52. Hayles, pp. 149–50.
  53. Socrates Yiannoudes, Architecture and Adaptation: From Cybernetics to Tangible Computing (New York and London: Taylor & Francis, 2016), p. 11; Hayles, p. 150.
  54. Hayles, p. 150.
  55. Stephen Wolfram, A New Kind of Science (Champaign: Wolfram Media, Inc., 2002), pp. 1, 5 & 14.
  56. Arata Isozaki, ‘Erasing Architecture into the System’, in Re: CP, ed. by Cedric Price and Hans-Ulrich Obrist (Basel: Birkhäuser, 2003), pp. 25–47 (p. 35).
  57. Yiannoudes, p. 29.
  58. Yiannoudes, p. 14.
  59. Stanley Mathews, ‘The Fun Palace as Virtual Architecture: Cedric Price and the Practices of Indeterminacy’, Journal of Architectural Education, 59.3 (2006), 39–48 (p. 43); Yiannoudes, p. 26.
  60. Isozaki, p. 34; Yiannoudes, p. 50.
  61. Stanley Mathews, p. 47.
  62. Cedric Price and Joan Littlewood, ‘The Fun Palace’, The Drama Review, 12.3 (1968), 127–34 (p. 130).
  63. Price and Littlewood, p. 130.
  64. Forty, p. 148.
  65. Jonathan Hill, Actions of Architecture (London: Routledge, 2003), pp. 68–69.
  66. Isozaki, p. 34.
  67. Isozaki, p. 35.
  68. Reyner Banham, Megastructure: Urban Futures of the Recent Past (London: Thames and Hudson, 1976).
  69. Price and Littlewood, p. 133.
  70. Forty, pp. 142-8.
  71. Yiannoudes, p. 29.
  72. Yiannoudes, p. 31.
  73. Gray and Suri, pp. 33–34.
  74. Gray and Suri, p. 34.
  75. Cedric Price, Fun Palace Project (1961-1985), <https://www.cca.qc.ca/en/archives/380477/cedric-price-fonds/396839/projects/399301/fun-palace-project#fa-obj-309847> [accessed 25 April 2021].
  76. Pasquinelli and Joler, p. 19.
  77. Yiannoudes, p. 18; Carpo, ‘Myth of the Digital’, p. 11; Hayles, p. 145.
  78. Mario Carpo, ‘Myth of the Digital’, pp. 11–13.
  79. Carpo, ‘Myth of the Digital’, p. 13.
  80. Mathews, p. 42.
  81. Yiannoudes, p. 33.
  82. Carpo, The Alphabet and the Algorithm, p. 99.
  83. Carpo, The Alphabet and the Algorithm, p. 99.
  84. Yiannoudes, p. 50.
  85. Yiannoudes, p. 30.
  86. Yiannoudes, p. 30.
  87. Yiannoudes, p. 30.
  88. Yiannoudes, p. 31.
  89. Yiannoudes, p. 31.
  90. Alastair Parvin, ‘Architecture (and the Other 99%): Open-Source Architecture and the Design Commons’, Architectural Design: The Architecture of Transgression, 226, 2013, 90–95 (p. 95).
  91. Open Systems Lab, ‘The DfMA Housing Manual’, 2019 <https://docs.google.com/document/d/1OiLXP7QJ2h4wMbdmypQByAi_fso7zWjLSdg8Lf4KvaY/edit#> [accessed 25 April 2021].
  92. Open Systems Lab.
  93. Open Systems Lab.
  94. Carlo Ratti and Matthew Claudel, ‘Open Source Gets Physical: How Digital Collaboration Technologies Became Tangible’, in Open Source Architecture (London: Thames and Hudson, 2015).
  95. Parvin.
  96. ‘An Introduction to WikiHouse Modelling’, dir. by James Hardiman, online film recording, YouTube, 5 June 2014, <https://www.youtube.com/watch?v=qB4rfM6krLc> [accessed 25 April 2021].
  97. Carlo Ratti and Matthew Claudel, ‘Building Harmonies: Toward a Choral Architect’, in Open Source Architecture (London: Thames and Hudson, 2015).
  98. Oliver David Krieg and Oliver Lang, ‘The Future of Wood: Parametric Building Platforms’, Wood Design & Building, 88 (2021), 41–44 (p. 44).
  99. Ratti and Claudel, ‘Building Harmonies: Toward a Choral Architect’.
  100. Carpo, The Second Digital Turn: Design Beyond Intelligence, p. 162.
  101. Robin Evans, ‘Towards “Anarchitecture”’, in Translations From Drawings to Building and Other Essays (从绘图到建筑物的翻译及其他文章), trans. by Liu Dongyang (Beijing: China Architecture & Building Press, 2018), p. 20.
  102. Walter Benjamin, Illuminations: Essays and Reflections (New York: Schocken Books, 2007), p. 12.
