An XR-based digital twin of Tampa is a first-of-its-kind installation combining a 3D-printed model of the Water Street Tampa area with projected real-time mapping and data visualizations rendered with Unreal Engine.
The developers can use the digital twin with city planners, property buyers, outdoor advertisers, design consultants, community groups, and more.
XR-based Digital Twin
The installation is found amongst the restaurants, shops, and food stalls of Sparkman Wharf, where Strategic Property Partners (SPP) showcases the XR installation as the crown jewel of the high-tech marketing center inside the company’s headquarters.
At the touch of a screen, the digital twin’s imagery can be projected onto the 17-foot-diameter physical model. Two large video walls encircle the model display and synchronize real-time 3D views looking out across the city from any floor of any individual building, while the physical model shows a projected view cone from the same building (see images next page).
The installation combines 3D-printed replicas, projection mapping, and real-time rendering technology, and is the largest and most sophisticated model of its kind known to be developed to date.
Digital Twin Purposes
SPP is the company behind the multibillion-dollar Water Street Tampa development. The projection-mapped scale model of Tampa enables SPP to show data in real time and offers the flexibility to change as the design of the 56-acre development evolves over the next five years.
The model is used in numerous ways, including conversations around real estate sales and leasing, meetings with city planners, internal design review sessions, billboard advertising, and much more. The data contained in the digital twin supports sales, design collaboration, and city review processes alike.
SPP turned to IMERZA, a company that focuses on experiential technology for real estate, seeking to leverage its expertise in this area. IMERZA partnered with DCBolt, a projection-mapping expert, and together the two companies created all the custom software and content for the project.
The Physical Model
The 3D-printed physical model is an accurate replica of Tampa. It is designed and fabricated in a modular way to allow for easy replacement of future-phase buildings as they are designed.
Rendering the visualizations projected onto the model in real time removes the need to re-render hundreds of animations when something changes in the city, reducing the overall cost of the project to a fraction of what traditionally rendered content would require. (see the video next page)
next page: IMERZA’s Process, Unreal Engine, and Ray Tracing
IMERZA’s Process—Unreal Engine
The entire project took one year to complete, during which IMERZA wrote six custom applications: interactive touchscreen kiosks, a video wall application, an Apple iPad application, a custom data aggregator, a content management system (CMS), and a projection-mapped content application.
The Unreal Engine is at the core of four of the six applications. All of the applications communicate with each other via the IMERZA API. For example, the SPP team can select an individual building and show that building’s cone-of-view on the physical model, while simultaneously showing the views from every floor of that building on the surrounding wall screens. They can also rotate that view a full 360 degrees by turning a compass on the custom iPad app. Both the projection model and the video walls rotate together in perfect sync.
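The IMERZA API itself is proprietary and not documented here, but the coordination it enables can be sketched. Assuming the applications exchange small JSON state messages over the network, a controller such as the iPad compass app might broadcast something like the following; the field names are illustrative assumptions, not the real API:

```cpp
// Sketch only: the real IMERZA API is proprietary. Field names such as
// "buildingId" and "headingDegrees" are illustrative assumptions.
#include "Dom/JsonObject.h"
#include "Serialization/JsonSerializer.h"
#include "Serialization/JsonWriter.h"

// Build the JSON payload a controller app (e.g., the iPad compass) might
// broadcast so the projection model and video walls stay in sync.
FString MakeViewSyncMessage(const FString& BuildingId, int32 Floor, float HeadingDegrees)
{
    TSharedPtr<FJsonObject> Msg = MakeShared<FJsonObject>();
    Msg->SetStringField(TEXT("type"), TEXT("viewSync"));
    Msg->SetStringField(TEXT("buildingId"), BuildingId);
    Msg->SetNumberField(TEXT("floor"), Floor);
    Msg->SetNumberField(TEXT("headingDegrees"), HeadingDegrees);

    // Serialize to a string; each listening application (projection app,
    // video wall app) would parse this and update its own camera state.
    FString Out;
    TSharedRef<TJsonWriter<>> Writer = TJsonWriterFactory<>::Create(&Out);
    FJsonSerializer::Serialize(Msg.ToSharedRef(), Writer);
    return Out;
}
```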
Advertising
The SPP team can upload a potential advertiser’s collateral to their CMS and have the system automatically apply it to all the billboard and digital signage locations throughout the Tampa environment, both on the scale model and in the video wall imagery.
This gives advertisers both the orientation the physical model provides, showing exactly where each placement sits in the city, and the “experience” of encountering that advertising in context. It is, fundamentally, simulation.
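The article does not detail the Unreal side of this pipeline. One plausible sketch is to tag every billboard actor and swap a texture parameter on a dynamic material instance whenever new collateral is uploaded; the tag and parameter names below are assumptions:

```cpp
// Sketch: apply an uploaded advertiser texture to every tagged billboard.
// The "Billboard" actor tag and "AdTexture" material parameter are assumed
// names, not confirmed details of IMERZA's implementation.
#include "Kismet/GameplayStatics.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void ApplyAdvertCreative(UWorld* World, UTexture2D* Creative)
{
    TArray<AActor*> Billboards;
    UGameplayStatics::GetAllActorsWithTag(World, FName("Billboard"), Billboards);

    for (AActor* Billboard : Billboards)
    {
        // Assumes each billboard carries a static mesh whose material
        // exposes an "AdTexture" texture parameter.
        if (UStaticMeshComponent* Mesh = Billboard->FindComponentByClass<UStaticMeshComponent>())
        {
            if (UMaterialInstanceDynamic* Mid = Mesh->CreateDynamicMaterialInstance(0))
            {
                Mid->SetTextureParameterValue(FName("AdTexture"), Creative);
            }
        }
    }
}
```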
“Features like this could only be possible with real-time rendering,” says Dorian Vee, Co-Founder and CTO of IMERZA. “Unreal Engine renders all of the content, and technologies such as nDisplay and NVIDIA Quadro RTX cards enable us to keep the multiple PCs frame-locked and rendering 12 separate cameras to the 12 projectors in real-time.”
Unreal Engine
The team opted for Unreal Engine to power its applications because it needed a solution that could produce stunning visuals while also providing the ability to connect to multiple third-party tools and handle complex datasets. “The choice to use Unreal Engine was really a no-brainer,” recalls Vee. “Unreal Engine allows our artists to create beautiful visuals faster utilizing real-time ray tracing, and the open-source code and access to technologies such as nDisplay allowed us to customize the engine to suit our needs and innovate quickly.”
One of the most impressive aspects of the project from a visual standpoint is the clarity of the projections the team was able to achieve. This was no walk in the park. “The biggest challenge was rendering 12 different cameras to 12 projectors and ensuring everything, down to each particle, was absolutely frame-locked,” explains Vee. “If it wasn’t frame-locked, the projection would be a blurry mess.”
Unreal Engine, NVIDIA Quadro cards, and nDisplay technology were the key to frame-locking the 24 million pixels (roughly two million per projector across the 12 outputs) being projected and rendered at over 90 frames per second.
For the video walls and touchscreens, IMERZA also used Quadro RTX 6000 cards, leveraging real-time ray tracing to give extra realism to the scenes. To keep everything running efficiently at 4K, the team used Blueprint-driven logic to toggle between RTX-GI and screen-space GI based on view distance. RTX reflections can likewise be switched on and off per material through a material node driven by camera visibility and distance.
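IMERZA implemented this switching in Blueprint; a C++ equivalent of the same distance test might flip the engine’s console variables, roughly as below. The console variable names match UE 4.2x-era ray-tracing settings and may differ between engine versions, and the distance threshold is an arbitrary placeholder:

```cpp
// C++ analogue of the Blueprint distance test described above. The console
// variable names are UE 4.2x-era ray-tracing settings and may differ between
// engine versions; the 50 m threshold is an arbitrary placeholder.
#include "HAL/IConsoleManager.h"

void UpdateGIMode(float CameraDistanceToModel)
{
    const bool bUseRayTracedGI = CameraDistanceToModel < 5000.f; // 50 m in Unreal units

    // Enable ray-traced GI up close, fall back to screen-space GI far away.
    if (IConsoleVariable* RayTracedGI =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.RayTracing.GlobalIllumination")))
    {
        RayTracedGI->Set(bUseRayTracedGI ? 1 : 0);
    }
    if (IConsoleVariable* ScreenSpaceGI =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.SSGI.Enable")))
    {
        ScreenSpaceGI->Set(bUseRayTracedGI ? 0 : 1);
    }
}
```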
Real-time Technology
Interactive workflows like those behind the Water Street Tampa Marketing Center project are ironing out some of the friction that exists in the traditional architectural design process. “I could go on for days about this, but I’ll keep it simple,” says Vee. “The majority of people have an extremely difficult time visualizing design. Real-time technology allows everyone to see the same thing and make decisions faster. This translates to reduced costs, fewer change-orders, and overall better design.”
Vee is equally emphatic about the overall impact real-time technology is having on real estate and urban development. “It isn’t often something comes along that can revolutionize an industry. Something that revolutionizes several industries at once happens maybe once a century,” he says. “Unreal Engine is doing just that. AEC, film production, product design, vehicle design, training, and manufacturing—even creating new industries. It’s exciting, and I feel lucky to be a part of it.”
next page: Further Resources
Further Resources
For the scale model, IMERZA used Lightact servers with Quadro RTX 6000 cards. These servers were frame-locked (synchronized on each display) using Unreal Engine’s nDisplay technology. The nDisplay system works with Unreal Engine to render 3D content simultaneously to multiple displays in real time. The team modified the nDisplay code to allow for more granular control when passing data and to enable easier automation. Modifying the nDisplay code provided access to nDisplay commands from the command-line interface, which made it dramatically easier to meet the 24/7 operational requirement.
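IMERZA’s modified build is proprietary, but stock nDisplay already offers a cluster-event mechanism that illustrates how a single command reaches every frame-locked render node at once. A sketch against the UE 4.2x-era API (type and method names changed in later engine versions):

```cpp
// Stock nDisplay cluster events, UE 4.2x-era API (names changed in later
// engine versions). IMERZA's modified build is proprietary; this only
// illustrates how one command reaches every frame-locked node at once.
#include "IDisplayCluster.h"
#include "Cluster/IDisplayClusterClusterManager.h"
#include "Cluster/DisplayClusterClusterEvent.h"

void BroadcastSelectBuilding(const FString& BuildingId)
{
    FDisplayClusterClusterEvent Event;
    Event.Category = TEXT("imerza");   // illustrative grouping, not real
    Event.Type     = TEXT("selection");
    Event.Name     = TEXT("selectBuilding");
    Event.Parameters.Add(TEXT("buildingId"), BuildingId);

    // The cluster manager delivers the event to all render nodes on the
    // same frame, keeping the 12 projector outputs in lockstep.
    if (IDisplayClusterClusterManager* ClusterMgr = IDisplayCluster::Get().GetClusterMgr())
    {
        ClusterMgr->EmitClusterEvent(Event, /*MasterOnly=*/true);
    }
}
```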
To align the digital model with the physical model, a three-point algorithm was used to get the approximate camera location in real-world space, from which IMERZA fine-tuned the position using further calculations authored in the Blueprint visual scripting system. Finally, the DCBolt team used blends and masks to fine-tune each projector for maximum visual quality.
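The article does not spell out the three-point algorithm itself. One standard way to recover an approximate rigid transform from three point correspondences is to build an orthonormal frame on each triangle and map one frame onto the other, roughly as sketched here with Unreal’s math types; this is illustrative only, and IMERZA’s actual calculations may differ:

```cpp
// Illustrative three-point alignment using Unreal math types: given three
// reference points measured on the physical model and their counterparts in
// the digital scene, build an orthonormal frame on each triangle and map one
// frame onto the other. This yields the approximate rigid transform that a
// refinement pass (Blueprint, in IMERZA's case) would then fine-tune.
#include "Math/Vector.h"
#include "Math/Matrix.h"
#include "Math/RotationMatrix.h"
#include "Math/Transform.h"

FTransform AlignThreePoints(const FVector Model[3], const FVector Digital[3])
{
    // Orthonormal frame from a triangle: first edge as X, surface normal as Z.
    auto MakeFrame = [](const FVector P[3]) -> FMatrix
    {
        const FVector X = (P[1] - P[0]).GetSafeNormal();
        const FVector N = FVector::CrossProduct(P[1] - P[0], P[2] - P[0]).GetSafeNormal();
        return FRotationMatrix::MakeFromXZ(X, N);
    };

    // Rotation taking the model's frame onto the digital frame
    // (row-vector convention: apply inverse model frame, then digital frame).
    const FMatrix Rotation = MakeFrame(Model).Inverse() * MakeFrame(Digital);

    // Translation that maps the first model point onto its digital counterpart.
    const FVector Translation = Digital[0] - Rotation.TransformPosition(Model[0]);

    FTransform Result(Rotation);
    Result.SetTranslation(Translation);
    return Result;
}
```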
For data that changes on a regular or real-time basis, IMERZA wrote a data aggregator that pulls from multiple sources, including commercial real-estate data sources and ESRI ArcGIS—a geographic information system for working with maps and geographic information developed by the Environmental Systems Research Institute. The aggregator normalizes the data into a new geocoded database from which the team translates the data into visuals.
“This allowed us to show all data types together in context, such as market data, rent growth, leasing rates over time, building data, city data, projected traffic data, real-time traffic from Google, and solar analysis,” explains Vee. “For data that changes less regularly, such as census data, we worked closely with the ESRI team to author a method to render ArcGIS map data to JPG textures, then ingest the textures at runtime.”
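Epic does not say how those JPG textures are ingested; stock Unreal offers a minimal runtime path via FImageUtils, sketched below with a placeholder file path and material parameter name:

```cpp
// Minimal sketch of loading a pre-rendered ArcGIS map image (saved as JPG)
// into a texture at runtime and applying it to a ground material. The file
// path and "MapOverlay" parameter name are placeholders.
#include "ImageUtils.h"
#include "Materials/MaterialInstanceDynamic.h"

void ApplyMapOverlay(UMaterialInstanceDynamic* GroundMaterial)
{
    // FImageUtils decodes the file and creates a transient UTexture2D.
    UTexture2D* MapTile =
        FImageUtils::ImportFileAsTexture2D(TEXT("Data/ArcGIS/census_overlay.jpg"));

    if (MapTile && GroundMaterial)
    {
        GroundMaterial->SetTextureParameterValue(FName("MapOverlay"), MapTile);
    }
}
```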