November 1, 2018
Effective responses to natural and man-made disasters require preparation and the ability to mobilize quickly. Virtual and mixed reality (VR and MR) are giving city planners and first responders a number of tools that can help save lives when these devastating events strike.
On the planning side, generating virtual environments complete with detailed characters may sound like playing SimCity in the early 2000s. But by combining large-scale real-world data, algorithms that generate 3D models, and VR, planners can simulate disaster scenarios for cities with increasing accuracy.
We talked with Xun Luo, IEEE Senior Member and professor at Tianjin University of Technology in China, who is an expert in the creation of these environments.
When it comes to creating a model, “The first step is to prepare the digital building blocks, which are the buildings. The second step is to combine these building blocks into communities. The third step is to add dynamic traffic (people and vehicles),” says Luo.
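The three steps Luo describes can be pictured as a simple data pipeline. The sketch below is purely illustrative; every class and name is hypothetical, not part of Luo's actual software.

```python
# Illustrative sketch of the three-step pipeline: buildings -> communities
# -> dynamic traffic. All names here are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Building:                       # step 1: a digital "building block"
    name: str
    floors: int

@dataclass
class Community:                      # step 2: buildings combined into a community
    buildings: List[Building] = field(default_factory=list)

@dataclass
class CityModel:                      # step 3: communities plus dynamic traffic
    communities: List[Community] = field(default_factory=list)
    agents: List[str] = field(default_factory=list)   # people and vehicles

    def add_traffic(self, *agents: str) -> None:
        self.agents.extend(agents)

# Assemble a toy model following the three steps
blocks = [Building("tower-a", 30), Building("school", 4)]
district = Community(buildings=blocks)
city = CityModel(communities=[district])
city.add_traffic("pedestrian", "bus", "car")
print(len(city.communities), len(city.agents))  # 1 3
```

In a real system each stage would of course draw on the large-scale city data the article mentions; the structure, not the detail, is the point here.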
The level of detail is not to be underestimated: “For generation of digital characters, we collect the profile photo of a real person, and then we use deep learning methods to automatically construct the 3D head model.” This makes the models astoundingly accurate, both visually and socio-economically.
To Luo, “This helps researchers from a broad spectrum of backgrounds to understand cities’ problems, and try different solutions with low cost and risk.” Digitizing the process allows for experimentation and realistic scenario-testing in ways that were previously impossible.
When disaster strikes, unmanned vehicles can be ideal for search-and-rescue missions. Drones excel in this capacity because they can move quickly and relay video back to their operators. However, controlling them once they’re out of sight can be quite challenging. Researchers in Austria are using MR to help solve this problem, as reported by IEEE Spectrum.
The system uses a Microsoft HoloLens MR headset to track the orientation of the operator’s head; when the operator looks in a new direction, the drone repositions itself to provide the corresponding view. This creates the sense of having X-ray vision, since the operator can effectively look through the walls separating them from the drone, making it a great potential asset for saving lives.
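The core idea of coupling head orientation to drone position can be sketched with a little geometry. The function below is a hypothetical simplification (2D, yaw only), not the researchers' actual control code: it places the drone a fixed standoff distance ahead of the operator along their gaze, with the camera facing the same way, so turning the head sweeps the drone's viewpoint.

```python
# Hedged sketch: map the operator's head yaw to a drone pose so the
# drone's camera looks along the operator's line of sight (2D, yaw only).
import math

def drone_pose(head_x: float, head_y: float,
               head_yaw_deg: float, standoff: float):
    """Return (x, y, yaw_deg) for the drone: 'standoff' metres ahead of
    the operator along their gaze, camera pointing the same direction."""
    yaw = math.radians(head_yaw_deg)
    dx, dy = math.cos(yaw), math.sin(yaw)
    return (head_x + standoff * dx, head_y + standoff * dy, head_yaw_deg)

# Operator at the origin looks "north" (90 degrees); drone flies 10 m ahead.
x, y, yaw = drone_pose(0.0, 0.0, 90.0, 10.0)
print(round(x, 3), round(y, 3), yaw)  # 0.0 10.0 90.0
```

A real controller would also handle pitch, obstacle avoidance, and smoothing of head motion, but this captures the "turn your head, the drone follows" mapping.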
Recent research published on IEEE Xplore proposes a similar concept: an engineer or architect teleoperates a WALK-MAN robot using a virtual reality device and a body-tracking system, allowing them to investigate buildings at risk of collapse after a disaster, a task that has traditionally been extremely risky.
It’s worth mentioning that while these two projects are still in the research phase, this type of technology is not a future “what if?” As Luo puts it, “Urban simulation using VR/MR technology is not the future, but a practical solution now. In 5-10 years, new city expansion and improvement will be largely reflected in the digital world first.” And this virtual proving ground stands to improve the lives of all city residents.
To hear more about Xun Luo’s work, watch his video interview: