Designing Human-Centric Spaces with Holodeck VR and Machine Learning
The discussion in this panel centered on housing density in cities driving high demand for efficient, smaller apartments. The design challenge was to create spaces that feel larger to the occupant while still functioning for everyday life.
Designers have traditionally evaluated how their designs are perceived using static 2D and 3D platforms as part of the overall building design and evaluation process. In this panel, Kohn Pedersen Fox (KPF) architects Cobus Bothma and Xin Zhang discussed the use of AI-based agents to create and test these spaces in immersive virtual environments, using NVIDIA Holodeck VR and the KPF Urban Interface (a custom evaluation tool).
They demonstrated how data helped them determine the size of apartment room layouts for different types of occupants, so that the client could offer apartments suited to an occupant’s lifestyle before construction. Analytics-based generative design at the urban scale can use datasets to prioritize comfort, mobility and access, and aesthetics. The process begins with Human Input (Generative Design Data), proceeds to Machine Learning (Analysis of Data), and ends with Human Evaluation (Immersive Collaborative Evaluation in VR). (See photos 04–09 below. Images: Akiko Ashley / Architosh. All rights reserved.)
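The generate-analyze-evaluate loop described above can be sketched in miniature. The following is an illustrative outline only, not KPF's actual tooling: candidate layouts stand in for the generative-design step, a weighted multi-criteria score stands in for the machine-analysis step, and the ranked shortlist is what a human team would then review immersively in VR. All criteria names and weights are hypothetical.

```python
import random

# Hypothetical criteria and weights -- illustrative only, not KPF's
# actual evaluation metrics from the Urban Interface.
WEIGHTS = {"comfort": 0.40, "mobility": 0.35, "aesthetics": 0.25}

def score_layout(layout, weights=WEIGHTS):
    """Weighted sum of normalized criteria scores (each in 0..1)."""
    return sum(weights[k] * layout[k] for k in weights)

def generate_candidates(n, rng):
    """Stand-in for generative design: random candidate layouts."""
    return [{k: rng.random() for k in WEIGHTS} for _ in range(n)]

def rank_candidates(candidates, top=3):
    """Machine-analysis step: shortlist layouts for human review in VR."""
    return sorted(candidates, key=score_layout, reverse=True)[:top]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
shortlist = rank_candidates(generate_candidates(100, rng))
for layout in shortlist:
    print(round(score_layout(layout), 3))
```

In a real workflow the random generator would be replaced by a trained model and the scoring criteria would be derived from occupant and urban datasets, but the shape of the pipeline, generate many options, score them automatically, and hand a shortlist to humans, is the same.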
Kohn Pedersen Fox has worked on or developed such projects as the Petersen Automotive Museum, the IBM Headquarters, and 10 & 30 Hudson Yards in New York City. KPF has a staff of 550, including 30 Principals, working across 16 countries.
Proprietary tools are common in large firms that want to design more efficiently, incorporating data and machine learning into the design process so they can work with clients interactively before construction begins.