ISSN 2634-8578
29/04/2022
A design process consists of a conventionalised practice – a process of (personal) habits that have proven successful – combined with a quest for creative and innovative actions. As tasks within the field of architecture and urban design become more complex, professionals tend to specialise in one of many subsets, such as designing, modelling, engineering, managing, construction, etc. They use digital tools developed for these specialised tasks only. Paradoxically, therefore, automation and new algorithms in architecture and urbanism are primarily oriented towards simplifying tasks within subsets, rather than engaging with the complex challenges the field is facing. This fragmented landscape of digital technologies, together with the lack of proper data, hinders professionals’ and developers’ ability to investigate the full digital potential of architecture and urban design. [1] Today, while designers explore the aid that digital technologies can provide, it is mostly the conventionalised part of practice that is being automated to achieve a more efficient workflow. This position statement argues for a different approach: to overcome fragmentation and discuss the preconditions for truly coping with complexity in design – which is not a visual complexity, nor a complexity of form, but rather a complexity of intentions, performance and engagement, constituted in a large set of parameters. We substantiate our statement with experience from practice, reflecting on the Retrofit Project: our effort to develop a smart tool that supports the design of energy-neutral districts. [2]
So, can designers break free from the established fragmentation and compute more than technical rationale, regulations and socio-economic constraints? Can they also incorporate intentions of aesthetics, representation, culture and critical intelligence into an architectural algorithm? To do so, the focus of digital tools should shift from efficiency to good architecture. And to compute good architecture, there is a need to codify a designer’s evaluation system: a prescriptive method of navigating a design process by giving value to every design decision. This evaluation system ought to incorporate architectural liberty – and therein lies the biggest challenge: differentiating between where to apply conventionalised design decisions and where (and how) to be creative or inventive. Within a 5000-year-old profession, the permitted liberty for these creative acts has been defined elastically: while some treatises allow a designing architect only a minimum of liberty, others lean towards a maximum form of liberty to guarantee good architecture. [3]
In the late 1990s and early 2000s, a small group of early adopters, such as Greg Lynn, Zaha Hadid Architects and UNStudio, tried to tackle the field’s complexity using the emerging digital technologies. They conveniently inferred their new style or signature architecture from these computational techniques. This inference, however, causes an instant divide between existing design currents and these avant-garde styles. In the latter, the claimed notion of complexity – the justification for their computational techniques – lies mostly within the subset of form-giving and does not cover the complexity of the field. This stylistic path is visible in, for example, Zaha Hadid Architects’ 2006 masterplan for Kartal-Pendik in Istanbul. The design thrives on binary decisions in the 3D-modelling tool Maya, where it plays out a maximum of two parameters at once: the building block with inner court, and the tower. The resulting plastic urban mesh looks novel and stylistically intriguing, yet produces no real urbanity and contains no intelligence on the level of the building type. This methodology generates no knowledge of how well the proposed urban quarter (or its constituent buildings) will perform in terms of, for example, costs, energy production and consumption, infrastructure, city utilities, diversity and health. The fluid mass still needs all conventional design operations to effectively turn it into a mixture of types, urban functions and local identity. Arguably, the early adopters’ stylistic path avoided dealing with real complexity and remained close to simple automation. In doing so, while they promoted a digital turn, they may also have dug the foundations for today’s fragmentation in the field.
Ironically, to some extent Schumacher’s treatise – certainly the parts that promote parametricism as a style – reads as a cover-up of the shortcomings of parametric software; for example, its inability to produce local diversity and typological characteristics beyond formal plasticity. [4] Schumacher further rejects Vitruvius to prevent structural rationale from taking primacy, and he disavows composition, harmony and proportion as outdated, variable communication structures, proposing “fluid space” as the new norm. [5] This only makes sense knowing that the alternative – a higher intelligence across the whole field of architecture and urban planning, such as codified data and machine-learning algorithms – did not yet exist for the early adopters. Contemporary applications such as Delve or Hypar do make use of such intelligent algorithms, yet prioritise technical and economic parameters (e.g. daylight, density, costs) in order to market efficiency. [6]
Any endeavour to overcome the established fragmentation and simplified automation will ultimately find itself struggling with the question of what good architecture is. After all, even with large computational power at hand, the question remains: how to evaluate design decisions beyond the merely personal or functional, in a time when no unified design theory exists? In fact, the fragmented specialisation of today’s professionals has popularised the proclamation of efficiency. As a result, an efficiency driver (whether geared to controlling costs, management or resources) is often disguised as moral behaviour, as if its first interest were good architecture, with the profit and needs of beneficiaries coming only second. If the added value of good architecture cannot be defined, the efficiency driver will continue to gain the upper hand, eroding the architectural profession into an engineering and construction service that provides calculations, permits and execution drawings.
It was inspiring to encounter Alessandro Bava’s Computational Tendencies on this matter:
The definition of what constitutes “good” architecture is, in fact, always at the center of architecture discourse, despite never finding a definite answer. Discourses around digital architecture have too often resolved the question of the “good” in architecture by escaping into the realm of taste or artistic judgment. [7]
Bava renders Serlio’s architectural treatise as an original evaluation system that attributes universal value, and revisits Rossi’s exalted rationalism to propose a merger of architecture’s scientific qualities with its artistic qualities. He aims to re-establish architecture’s habitat-forming abilities and to prevent architecture from becoming an amalgam of reduced and fragmented services. However, Serlio’s treatise did not provide a fully codified and closed formal system, as it still includes the liberty of the architect. [8] Serlio’s On Domestic Architecture places its emphasis on ideal building types, mostly without context. No consideration is given to how these types ought to be modified when they have to be fitted into less ideal configurations, such as non-orthogonal grids. The books also remain silent on the exceptions: the corner-piece type, or the fitting parts that mediate between buildings and squares on a higher level. This is not a cheap critique of Serlio’s work; it is an awareness one needs when revisiting it as a “proto-BIM system, one whose core values are not market availability or construction efficiency, but harmonic proportions”. [9] Arguably, it is precisely the liberty, the modifications and the exceptions that need to be codified in order to reach beyond simplified automation, across fragmentation, and towards an architectural algorithm that assists designers.
This is easier said than done; otherwise the market would be flooded with design technologies by now. As with most design problems, the only way to solve them is by tackling them in practice. In 2021, the Design Sciences Hub, affiliated with the University of Antwerp, set up the Retrofit Project. The aim is to develop an application that tests the feasibility of district developments. The solution will show an urban plan with an automatically generated function mix and an optimised energetic and ecological footprint, for any given site and context. The project team collaborates with machine-learning experts and environmental engineers for the necessary interdisciplinary execution. Retrofit is currently in the proof-of-concept phase, which focuses on energy neutrality; in the long run it will also tackle urban health and carbon neutrality.
The problem of modifications and exceptions seems the easiest to examine, as it primarily translates into a challenge of computational power and of coping with a multitude of parameters. However, these algorithms should be smart enough to select, from the necessary modifications and exceptions, the specific range that complies with the design task at hand. In this case, the algorithm should select the correct modifications and exceptions needed to integrate certain types into any given site within the Retrofit application. In other words, there is a need for an intelligent algorithm that can be fed a large number of types as input data to generate entirely new or appropriate building types. The catch resides in the word “intelligent”: algorithms are not created intelligent, they are trained to reach a certain level of intelligence based on (1) codifiable theory and (2) relevant training sets of data, as the sketch below illustrates. Inquiring into a variety of evaluation systems for architectural design that emerged over the last 40 years, Verbruggen revealed the impossibility of creating a closed theoretical framework and of uniquely relating this framework to a conventionalised evaluation system in practice. [10] As such, both the codifiable theory – a unified evaluation system that integrates scientific and artistic qualities into one set of rules – and the training set hardly exist in architecture and urban design. To complicate matters even more, today’s non-unification is itself often embraced as the precondition for good architecture. [11-15]
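To make these two ingredients tangible, consider a deliberately naive sketch – ours, not Retrofit code. It assumes the scikit-learn library, and every feature, value and label in it is invented: the codified theory appears as machine-readable features, the training set as hand-labelled examples, and the labelling itself is precisely the evaluation problem at stake.

```python
# A deliberately naive sketch: "training" presupposes (1) codified features
# and (2) labelled examples. All features, values and labels are invented.
from sklearn.tree import DecisionTreeClassifier

# (1) Codifiable theory: each building type reduced to explicit features
#     [footprint_m2, storeys, has_courtyard (0/1), orthogonal_site (0/1)]
X = [
    [420, 4, 1, 1],
    [380, 5, 1, 0],
    [90, 22, 0, 1],
    [110, 18, 0, 0],
]
# (2) A training set: someone has to label which examples count as "good" -
#     and that judgement is exactly what architecture has not unified.
y = ["good", "good", "poor", "poor"]

model = DecisionTreeClassifier(random_state=0).fit(X, y)
print(model.predict([[300, 6, 1, 1]]))  # e.g. ['good'] under these toy labels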
And so the liberty question emerges here once again: how can different types, their modifications and exceptions, including their respective relationships with different contexts, be codified? It is easy to talk about codification, but much harder to implement it within a project. When different types are inserted into a database, how are the attributes defined? This task proved very laborious and raised many new questions in the Retrofit Project. Attributes will include shape and size, yet might also include levels of privacy, preferred material usage, degree of openness, average energetic performance, historic and social acceptance in specific areas, compatibility with different functions, etc. Which values define when and where a specific type is appropriate, and how are they weighed? Do architects alone populate the database, and if so, which architect is qualified, and why? And when an AI application examines existing typologies within our built environment, which of these examples should be considered good, and why? Can big data or IoT sensors help in data gathering? To truly take everything into account, how much data do we really need (e.g. a structure’s age and condition, social importance, usage, materials, history, etc.)? Furthermore, when the Retrofit application runs on an artificially intelligent algorithm that is trained to think beyond the capabilities of a single architect, will the results diverge (too) much from what society is used to?
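To illustrate how quickly these questions accumulate, the following hypothetical schema sketches a single database entry. Every field name, unit and scale is our assumption, not a Retrofit standard, and each one silently answers a question that is in fact still open.

```python
# Hypothetical schema for one codified building type. Every field, unit and
# scale is an assumption; the comments mark the open questions each hides.
from dataclasses import dataclass, field

@dataclass
class BuildingType:
    name: str
    footprint_m2: float            # shape and size
    storeys: int
    privacy_level: int             # 0 (public) to 5 (private): who calibrates this?
    openness: float                # 0.0-1.0 facade permeability: measured how?
    energy_kwh_m2_year: float      # average energetic performance: from which sample?
    preferred_materials: list[str] = field(default_factory=list)
    compatible_functions: list[str] = field(default_factory=list)
    accepted_in: list[str] = field(default_factory=list)  # historic/social acceptance, per area

court_block = BuildingType(
    name="perimeter block with inner court",
    footprint_m2=420.0, storeys=4, privacy_level=2,
    openness=0.4, energy_kwh_m2_year=85.0,
    preferred_materials=["brick"],
    compatible_functions=["housing", "ground-floor retail"],
    accepted_in=["Antwerp inner city"],
)
```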
The many practical questions arising from the Retrofit Project show that the architect’s liberty is both the problem and the potential for digital technologies to tackle the true complexity of the field. Liberty is undeniably linked to the design process; encoding a design process therefore needs to (1) capture the architect’s evaluation system and (2) allow for targeted and smart data gathering. The evaluation system can then be coded into an algorithm, with the help of machine-learning experts, and trained using the gathered data. Both the evaluation system and the necessary data rely heavily on the architect’s liberty. Because dealing with these liberties is a difficult task – perhaps the most difficult task in the age of digital architecture – many contemporary businesses and start-ups that claim to revolutionise the design process with innovative technologies might not revolutionise anything, because they opt for the easy route and avoid the liberty aspect. An architectural algorithm that does take the liberty aspect into account may provide designers with an artificial assistant that helps tackle all the complexities of the field while tapping into the full potential of today’s available computational power.
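Written out as code, the asymmetry becomes visible: a weighted evaluation system is trivial to formulate, while the liberty term resists formulation. The sketch below is a hypothetical reduction of ours – the criteria, weights and the zero-valued liberty stub are all assumptions, not Retrofit’s actual evaluator.

```python
# Hypothetical evaluation system: a weighted sum of conventional criteria,
# plus an explicit 'liberty' term that is, for now, only a stub.

def evaluate(design: dict[str, float], weights: dict[str, float]) -> float:
    """Score a candidate design; all criteria are normalised to 0..1."""
    conventional = sum(weights[criterion] * design[criterion] for criterion in weights)
    # The hard part: when and how far may the design deviate from convention,
    # and at what value? Codifying this is the open research question.
    liberty = 0.0
    return conventional + liberty

weights = {"energy": 0.4, "cost": 0.3, "daylight": 0.2, "diversity": 0.1}
candidate = {"energy": 0.8, "cost": 0.6, "daylight": 0.7, "diversity": 0.5}
print(evaluate(candidate, weights))  # approximately 0.69 under these invented weights
```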
This could be the ultimate task we set ourselves at the DSH. Studying a large dataset of design processes, steps and creative acts might reveal codifiable patterns that could be integrated into a unified and conventionalised evaluation system. Such a study would target large and diverse groups of designers and users in general, including their knowledge exchange with other involved professionals. Could such an integral evaluation system, combined with data gathering, finally offer the prospect of developing a truly architectural algorithm? Eventually, this too will encounter issues that require further study, such as deciding whom to involve and how to navigate wisely between the highs and lows of the wisdom of crowds: [16] can we still trust the emerging patterns detected by machine-learning algorithms to constitute proper architectural liberty and, thus, good architecture? We will proceed vigilantly, but we must explore this path to avoid further fragmentation, non-crucial automation and the propagation of false complexity.
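As a closing illustration, a minimal sketch of what such pattern detection could look like, assuming design processes were ever logged as sequences of named steps (the logs and step names below are invented): frequent step pairs would be candidates for conventionalised behaviour, while rare ones might mark creative acts – or mere noise, which is where the wisdom-of-crowds caveat returns.

```python
# Toy pattern detection over logged design processes. The logs and step
# names are invented; data of this kind does not yet exist at scale.
from collections import Counter

logs = [
    ["set_grid", "place_court_block", "adjust_corner", "check_daylight"],
    ["set_grid", "place_tower", "check_daylight", "adjust_corner"],
    ["set_grid", "place_court_block", "check_daylight", "adjust_corner"],
]

# Count consecutive pairs of design steps (bigrams) across all processes.
bigrams = Counter((a, b) for log in logs for a, b in zip(log, log[1:]))
print(bigrams.most_common(3))
# Frequent pairs, e.g. ('set_grid', 'place_court_block'), hint at convention;
# rare pairs may be creative acts - or noise the crowd should not canonise.
```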
References
[1] N. Leach, Architecture in the Age of Artificial Intelligence: An Introduction for Architects (London; New York: Bloomsbury Visual Arts, 2021).
The Design Sciences Hub [DSH] is a valorisation team of the Antwerp Valorisation Office. The DSH works closely with IDLab Antwerp for the machine-learning components and with the UAntwerp research group Energy and Materials in Infrastructure and Buildings [EMIB] to study energy neutrality within the Retrofit Project. Although the project is led and executed by the University of Antwerp, private industry is involved as well: four real estate partners – Bopro, Immogra, Quares and Vooruitzicht – are financing and steering the project, as is The Beacon, maximising the insights from digital technology companies. Also see: https://www.uantwerpen.be/en/projects/project-design-sciences-hub/projects/retrofit/
[3] H.W. Kruft, A History of Architectural Theory: From Vitruvius to the Present (London: Zwemmer; New York: Princeton Architectural Press, 1994).
[4] P. Schumacher, The Autopoiesis of Architecture: A New Framework for Architecture. Vol. 1 (Chichester: John Wiley & Sons Ltd, 2011). P. Schumacher, The Autopoiesis of Architecture: A New Agenda for Architecture. Vol. 2 (Chichester: John Wiley & Sons Ltd, 2012).
[5] Ibid.
[6] Delve is a product of Sidewalk Labs, founded as Google’s urban innovation lab, becoming an Alphabet company in 2016. Hypar is a building generator application started by former Autodesk and Happold engineer Ian Keough. Also see www.hypar.io, www.sidewalklabs.com/delve.
[7] A. Bava, “Computational Tendencies”, in N. Axel, T. Geisler, N. Hirsch & A. L. Rezende (eds), Exhibition Catalogue of the 26th Biennial of Design Ljubljana, Slovenia (e-flux Architecture and BIO26 | Common Knowledge, 2020).
[8] H.W. Kruft, A History of Architectural Theory: From Vitruvius to the Present (London: Zwemmer; New York: Princeton Architectural Press, 1994).
[9] A. Bava, “Computational Tendencies”, in N. Axel, T. Geisler, N. Hirsch & A. L. Rezende (eds), Exhibition Catalogue of the 26th Biennial of Design Ljubljana, Slovenia (e-flux Architecture and BIO26 | Common Knowledge, 2020).
[10] S. Verbruggen, The Critical Residue: Creativity and Order in Architectural Design Theories 1972-2012 (2017).
[11] M. Gausa & S. Cros, Operative Optimism (Barcelona: Actar, 2005).
[12] W. S. Saunders, The New Architectural Pragmatism: A Harvard Design Magazine Reader (Minneapolis: University of Minnesota Press, 2007).
[13] R. Somol & S. Whiting, “Notes around the Doppler Effect and Other Moods of Modernism” (2002), in K. Sykes (ed.), Constructing a New Agenda: Architectural Theory 1993–2009 (1st ed., New York: Princeton Architectural Press, 2010), 188–203.
[14] K. Sykes (ed.), Constructing a New Agenda: Architectural Theory 1993–2009 (1st ed., New York: Princeton Architectural Press, 2010).
[15] S. Whiting, “The Projective, Judgment and Legibility”, lecture at the Projective Landscape conference, organised by TU Delft and the Stylos foundation (Delft, March 2006).
[16] P. Mavrodiev & F. Schweitzer, “Enhanced or distorted wisdom of crowds? An agent-based model of opinion formation under social influence”, Swarm Intelligence 15(1–2) (2021), 31–46, doi:10.1007/s11721-021-00189-3; J. Surowiecki, The Wisdom of Crowds: Why the Many Are Smarter than the Few (London: Abacus, 2005).