
Editorial
Ubiquitous Digitalization for Everyman
Prof. Arup Dasgupta
arup@geospatialworld.net

For long, the term “digital”, in all its connotations, has been a recurring keyword in the geospatial field. Digital versions of analogue technologies have led to robust data acquisition, storage and analysis systems. Digital data is ubiquitous, persistent and immutable over generations, thus enabling sharing and multiple use-cases. The evolution of common data standards, within and across systems, has led to widespread use of data for a range of purposes.
Urban planning and management is one of the areas where this is most visible. Architecture, Engineering and Construction (AEC) started with simple maps, then moved to digitization and CAD, from 2D to 3D, and on to GIS integration.
Today, the Digital Twin has become the trending buzzword. It is a complete virtual representation of a real-world environment that can be created using CAD, BIM and GIS. By adding data from IoT-enabled devices, changes in the real world can be replicated in the virtual world, enabling better and more efficient management of the urban environment.
Digital Cities represent another aspect of Industrial Revolution 4.0, where people, data, processes and objects form a virtual continuum in a digital representation of the world. While Digital Twins replicate the inanimate world, the Metaverse can populate such an environment with avatars of real persons. Just as in the movies, these avatars can virtually visit locations that would be difficult to access physically.
While the urban landscape provides the most visible example of such integration, there is no doubt that the concept can be extended to other areas. As an example, the European Commission is promoting the Destination Earth concept, in which Digital Twins will be used to model global weather and predict weather extremes and climate scenarios.
Geospatial systems are used in many fields, urban planning being one of them. However, individual systems such as urban, agriculture and forestry are also intimately interrelated, and each influences the others in complex ways. The future, therefore, is not only Digital Cities, but Digital Everything.
The task of building Digital Cities is itself humongous. The technology, particularly in terms of interoperability, scalability and openness, is yet to mature in full measure. The human dimension has to be addressed by internalizing the requirements of sustainability and resilience, not only in administration and industry but also by mainstreaming social inclusion in all types of projects and throughout all project phases.
If we can achieve this in all other spheres, then we can begin to work towards Digital Everything.
In the future, issues of security, safety and privacy will become more complicated, and social and human factors may be impacted as we move into a Digital Universe. The road ahead is difficult, but the challenge is worth taking up.
“Commercial data providers have immensely increased the coverage of Earth, and with different types of sensors and modalities. But you need to analyze all that data to glean greater and more accurate insights,” says Daryl Madden, Vice President, Geospatial Systems, Textron Systems.
How do you see the geospatial intelligence scene evolving, and what are the top trends?

The clear-cut answer is Artificial Intelligence/Machine Learning (AI/ML); however, the community has moved from the buzzword to implementation, and the challenge is in the details.
Going back 10 or 20 years, analysts who wanted insight were dependent on national resources when it came to satellite-based Earth observation. With the advent of commercial satellites, there was an abundance of imagery. However, there was always the risk of missing a threat simply because there were not enough analysts to process the data.
With AI/ML, you can take these hundreds or thousands of images and run them through a pre-processor. Again, this doesn’t work magically, and it’s not straightforward. You need to build enough training sets on specific types of imagery, then determine what answer is “good enough,” and whether users will buy in.
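As a rough sketch of what such a pre-processing step might look like, the example below batch-scores a folder of images with an off-the-shelf detector and flags only those whose best detection clears a configurable “good enough” confidence threshold. The model choice, folder path and threshold are illustrative assumptions, not a description of Textron Systems’ pipeline.

# Illustrative batch triage of imagery with an off-the-shelf detector (not
# Textron Systems' pipeline): score each image and flag only those whose
# best detection clears a "good enough" confidence threshold.
from pathlib import Path

import torch
import torchvision
from PIL import Image
from torchvision.transforms.functional import to_tensor

CONFIDENCE_THRESHOLD = 0.8  # assumed value; tuned per sensor and target type


def triage_images(image_dir: str, threshold: float = CONFIDENCE_THRESHOLD):
    """Return (filename, score) pairs for images worth an analyst's attention."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    flagged = []
    with torch.no_grad():
        for path in sorted(Path(image_dir).glob("*.jpg")):
            tensor = to_tensor(Image.open(path).convert("RGB"))
            detections = model([tensor])[0]  # dict of boxes, labels, scores
            best = detections["scores"].max().item() if len(detections["scores"]) else 0.0
            if best >= threshold:
                flagged.append((path.name, best))
    return flagged


if __name__ == "__main__":
    for name, score in triage_images("imagery/"):  # assumed folder of image chips
        print(f"{name}: confidence {score:.2f}")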
Different users build different training sets and use different models. Additionally, users do not want to bounce around between different applications. What we have done with the RemoteView™ software is take an adaptable approach, integrating various applications within the imagery analyst’s tool of choice.
You mentioned the abundance of imagery with the onset of the commercial satellite industry. In a way, the commercial industry’s participation in the GEOINT space has expanded the democratization of data and technologies. How do you view this trend?

The commercial industry has greatly transformed the GEOINT space. Commercial data providers have immensely increased the coverage of Earth, and with different modalities; for instance, synthetic aperture radar (SAR), radio frequency (RF), multispectral, hyperspectral, etc. So, now you are not only getting a lot of pictures of the Earth, but you are also getting a lot more information about those pictures. For example, if you are getting pictures of a ship, you can tie that into the RF signal to see if the vessel is emitting a signal that could give its location. If it is not emitting that signal, that may be an indication that the ship is doing something it’s not supposed to.
Multispectral imagery will give you information like tree cover or the presence of certain materials, such as iron oxide; you can look at different bands within the imagery to learn more about what is on the Earth, rather than relying on a picture and a human trying to decipher it. This is leading to multi-event correlation.
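To make the band-analysis idea concrete, here is a minimal sketch of the kind of band arithmetic involved, computing the standard NDVI vegetation index and a common red/blue iron-oxide ratio. The band arrays and thresholds are assumptions for illustration, not tied to any specific sensor or product.

# Illustrative band arithmetic on a multispectral scene; band layout and
# thresholds are assumptions, not tied to any specific sensor or product.
import numpy as np


def band_indices(blue: np.ndarray, red: np.ndarray, nir: np.ndarray):
    """Return NDVI (a vegetation/tree-cover proxy) and a red/blue iron-oxide ratio."""
    eps = 1e-6  # guard against division by zero
    ndvi = (nir - red) / (nir + red + eps)  # normalized difference vegetation index
    iron_oxide = red / (blue + eps)  # common ratio used to highlight iron-oxide-rich surfaces
    return ndvi, iron_oxide


if __name__ == "__main__":
    rng = np.random.default_rng(0)  # synthetic reflectance bands for demonstration
    blue, red, nir = (rng.uniform(0.01, 0.5, (256, 256)) for _ in range(3))
    ndvi, iron = band_indices(blue, red, nir)
    print("vegetated pixels:", int((ndvi > 0.4).sum()))
    print("iron-oxide candidates:", int((iron > 2.0).sum()))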
There is just immensely more information out there from commercial satellite providers, but you do need to merge and analyze that data to glean greater and more accurate insights.
Tell us about some of the innovations and top-of-the-line products from Textron Systems.

We at Textron Systems continue to enhance our industry-leading imagery exploitation tool, RemoteView, and have added two new full-motion video products to our offerings: SeeGEO®, a web-based solution with the ability to ingest, display and store/discover full-motion video (FMV) through an easy-to-use interface, and Optice™, a desktop solution utilizing GPU technology for high-performance analytics such as object tracking.
For over 25 years, RemoteView has played a critical role in helping intelligence agencies monitor bad actors and critical situations around the world, including natural disasters.
One example is the tsunami in Thailand, where you could look at before-and-after images and see which roads were gone, where relief supplies could land, or what the status of hospitals was. RemoteView has done that and been critical to national security.
Additionally, over the past few years, we have added software products that are purpose-built to enhance interactions with full-motion video (FMV) sources. Our SeeGEO and Optice software are two different products that utilize real-time FMV viewing from a drone. With Optice, you can track a vehicle… an example could be a bank robbery, where you could click on the car, see if the suspects have thrown something out of the window, and track where it landed. With SeeGEO, you can store that footage and retrieve it later to compare it with something you saw a few days ago, or show all the areas the drone has flown and covered.
Textron Systems has been at the center of the geopolitical climate and safety of our nation, and we continue to ramp that up with newer products in the area of FMV.
As the GEOINT space is evolving, so are its applications. Geointelligence is no longer just about defense and security. As a business leader, how do you see Textron Systems’ client base evolving?

We are truly seeing this. SeeGEO and Optice were originally designed for the defense market, but our first Optice customer was actually a local sheriff’s drone department, which was operating search and rescue missions. Although we see similar needs and workflows, we realize that these are two totally different markets with different users, infrastructure and purchasing models.
We are seeing an emergence of drone technology in the first responder market, and our products meet their needs. Some of the core things you would do in the military are what first responders do; for example, rescues, search and discovery, mapping areas, or monitoring ongoing situations. However, their infrastructure is less mature than the military’s, where the network is set up to pass FMV back to headquarters.
So, we are partnering with commercial solutions so we can pass that FMV with metadata remotely. This is not only in defense and security, but homeland security as well. SeeGEO, Optice and Feature Analyst™ software will soon be available commercially on our new e-commerce site.

The Feature Analyst software can extract the extent of flooded areas and determine how many sq. miles or acres were affected, or are still currently flooded. The red areas indicate change between before and after photos to observe the flooded streets and displaced watercraft.
The Feature Analyst and LIDAR Analyst software are used jointly to identify and delineate port infrastructure. The software is extracting and counting how many structures are in each of the terminal zones.

Extreme weather events and natural disasters are on the rise, requiring more attention to prediction, monitoring and post-disaster response to support mitigation efforts. How is real-time GEOINT making valuable inroads here?

Back in 2017, when Hurricane Harvey hit Houston, Texas, we reached out to see how we could help with our RemoteView product for flood analysis. We quickly realized that all of the overhead electro-optical imagery was useless because of the cloud cover. Having the ability to use different modalities of data sets was imperative to help solve different challenges.

The LIDAR Analyst software is shown extracting buildings from an office park as well as counting the number of trees and tree canopy, which are converted into valuable spatial data that can be modeled in 3D.
In this case, synthetic aperture radar (SAR) would allow you to “see through the clouds.” So, certainly, natural disasters fit within the geospatial arena.
The evolution of smart cities is grabbing much attention from urban policymakers. What is Textron Systems’ vision for equipping the urban development sector?

Textron Systems has worked in this market for several years. One example is the city of Durham in North Carolina, which we supported in its stormwater runoff prevention strategy through object-oriented feature extraction of pervious and impervious areas. Automated feature extraction software enabled the North Carolina Public Works GIS group to process and analyze data more efficiently, at a higher level of accuracy, and on a much larger scale than manual processing would allow. A case study completed with the city using our Feature Analyst™ software demonstrated that automated feature extraction can accomplish in mere minutes a task that would normally take days. We look forward to continuing to equip the urban development sector with accurate and efficient solutions. The full case study can be found on our website.
Smart natural resource management is being looked at as the greener way to achieve overall development goals for any region. How are you supporting natural resource management goals?

Textron Systems has a wide range of users, and our Feature Analyst software can be very useful for setting overall development goals for resource management. One example is the use of Feature Analyst to identify all logging in the New York City Watershed, which encompasses more than 1 million acres in upstate New York. The non-profit Watershed Agricultural Council’s (WAC) goal for the New York City Watershed was to balance water quality protection with a viable rural economy.
The WAC chose the Feature Analyst extension for ArcGIS to identify logged areas within the watershed. They accomplished this by first training the Feature Analyst software with examples from known logging areas, then refining the classification results using hierarchical learning, and finally using the trained model to batch-classify other National Agriculture Imagery Program (NAIP) aerial images and extract logged areas from them. The results of this project allowed resource managers, for the first time, to know the full extent of logging in the New York City Watershed and to calculate its impact on the watershed’s water quality.
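For readers curious what this train-then-batch-classify pattern looks like in practice, the generic sketch below trains a supervised classifier on labeled example pixels and then loops over image tiles to estimate a logged-area acreage per tile. It is not the Feature Analyst API; the features, labels, tiles and pixel size are synthetic stand-ins used purely for illustration.

# Generic train-then-batch-classify sketch (NOT the Feature Analyst API).
# Per-pixel features, labels, tiles and pixel size are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# 1. Training examples from "known logging areas": rows are pixels, columns are
#    spectral/texture features; label 1 = logged, 0 = not logged.
X_train = rng.normal(size=(2000, 6))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] > 0.8).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# 2. Batch-classify other image tiles and report an estimated logged acreage
#    for each one (assumes roughly 1 m ground sample distance per pixel).
PIXEL_AREA_ACRES = 0.000247
for tile_id in range(3):
    tile_features = rng.normal(size=(256 * 256, 6))
    logged_mask = model.predict(tile_features).reshape(256, 256)
    acres = logged_mask.sum() * PIXEL_AREA_ACRES
    print(f"tile {tile_id}: ~{acres:.1f} acres classified as logged")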
Given recent advances in sensor miniaturization and data fusion capability, and increasing digitalization and workflow integration, how do you view the future, in terms of both opportunities and challenges?

As exciting as the past years have been, we can expect even more innovations to come. We have seen hardware get smaller and processing power increase. Therefore, more and more data will become available, and sifting through and fusing this data to derive useful insights will become more challenging.
As always, we continue to work closely as teammates, listen to the analysts' needs and show the art of the possible (from a technology standpoint) to provide the best tools and processes to meet the ever-demanding needs of the intelligence community. We are excited to continue to play a major role in this geospatial revolution!
Interviewed by: Anusuya Datta
FUTURE IS HERE
Metaverse for Building Next-Gen Infrastructure
While the very concept of Metaverse and what constitutes it continues to be defined, its potential to unleash the next wave of digital disruption is obvious, especially in the infrastructure space. By Anusuya Datta
“We believe Metaverse will be the successor to the mobile internet… We’ll be able to feel present – like we’re right there with people no matter how far apart we actually are.” Mark Zuckerberg, October 28, 2021
Metaverse has become the most widely bandied neologism ever since the Facebook CEO laid out his vision and rechristened the social media giant Meta.
Although the concept of the Metaverse existed long before Zuckerberg created the hype and gave it a ‘consumerist’ spin, several organizations had already experimented with it and attempted pilot implementations in various spheres.
“Many people think of the Metaverse in terms of social and gaming experiences. But businesses have always explored the potential of virtual or augmented realities,” says Juergen Dold, Executive Vice President, Hexagon. He explains that the real-time 3D visualizations that companies like Hexagon are creating are actually solving real-world problems, as they are anchored in the physical world — a street, a construction site, a factory, a park, a building, even products like cars and phones.
And because today’s Metaverse is an evolution of the virtual universe of the past, its use and significance in all things related to infrastructure — be it construction of standalone buildings, transport infrastructure or even entire cities — is obvious.

DID YOU KNOW?
The term metaverse first came up in Snow Crash, a 1992 science fiction novel by Neal Stephenson. In this fictional world, humans, as programmable avatars, interact with each other and software agents, in a 3D virtual space that uses the metaphor of the real world.
For instance, as Benson Chan, Chief Operating Officer, Strategy of Things, explains, in its simplest form the Metaverse is the digital extension of the smart city. The city’s Metaverse is an online version of the city and works in lockstep with the physical city and community. “Smart cities utilize the physical data from their many sensors, building information models, digital infrastructure and geospatial information to replicate and create models in the Metaverse that enable it to work and behave like the physical city.”
Similarly, the Metaverse, also known as Web 3.0, is a collection of digital data that represents the real world and enables more intuitive, immersive interaction with that data. The Metaverse for construction, or the Industrial Metaverse, is really just an iteration of the ‘digital twin’ concept.
“An immersive representation of the physical and digital worlds combined as one that enables us to better understand how these representations of a given project interact with one another,” says Nathan Patton, Product Marketing Manager, Strategy & Innovation, Trimble.
The Metaverse connects the digital and physical worlds, bringing different complex systems together in the same space; understanding the flow of data between them enables better decisions for managing infrastructure. “Multiple assets create something unique and entirely new by collecting real-time data and gaining insights,” elaborates Henrique Reis, 3D Geospatial Analyst (Digital Twins) at AAM.
What is the Metaverse?
The Metaverse can be most succinctly defined as the next stage in the evolution of the internet, where boundaries between the physical and the virtual merge.
But it is a bit more complex. As Cesium CEO Patrick Cozzi explains, the Metaverse is a progression of the internet from something that is 2D to fully immersive 3D.
The ‘fully immersive 3D’ is the underlying factor here.
“The media types on the internet have gone from text to images, to video, and now we have all the technology to enable immersive 3D experiences,” he adds.
And that is why, as Jensen Huang, Founder and CEO of NVIDIA, recently said, the combination of AI and computer graphics will power the Metaverse.
“It’s a 3D embodied Web, where we can connect inside virtual worlds that look and feel to us as rich and complex as the real world. We can play, socialize, work and create within these interoperable worlds despite being separated by great distances in the real world,” elaborates Jeff Kember, Director, Omniverse Technologies, NVIDIA.
Naturally, many Metaverse experiences will also take place in a digital twin of the real world, whether for productivity applications at work, monitoring a construction site, or training for GEOINT purposes.
The underlying technologies, such as 3D, augmented reality (AR) and virtual reality (VR), digital twins and real-time streaming, have finally matured and are converging. “In many ways, the Metaverse we are talking about is the ultimate modeling and simulation environment, with the pieces finally coming together,” emphasizes Nadine Alameh, CEO, Open Geospatial Consortium.

Source: Radoff, J. (2021, April 7). The Seven Layers of the Metaverse. Medium.

Graphic courtesy of Strategy of Things
While the very concept of the Metaverse and what constitutes it continues to be defined, its potential to unleash the next wave of digital disruption is obvious. McKinsey estimates that in the first five months of 2022, more than USD 120 billion was invested in Metaverse technology and infrastructure, more than double the USD 57 billion invested in all of 2021.
Geospatial Foundational to Metaverse
If the Metaverse is to be a real-time representation of the physical world, it is a no-brainer that the concept of location or geography is central to that idea. Everything about the Metaverse is geospatial, emphasizes Alameh. “You need to know where you are; where things are; where things are relative to each other,” she says.
Staying tethered to our real geography, not just accurate coordinates but the entire real environment around us, will be essential if we want to capitalize on the full potential of what is envisioned as the Metaverse.
Imagine, while planning a city, being able to not only model but also experience changes to the city’s physical environment, such as traffic congestion, environmental emissions and sea level rise. Or imagine, while constructing a building, actually seeing the view from the windows of your new office.
“Geospatial information will be fundamental to combining the technology that propagates through the command of the physical world to make us understand different environments. This social-digital interaction is intrinsically geospatial and will rely heavily on interoperability to combine data/information to gain new insights, interactive actions of digital twin technologies for modelling and simulation that affect the real world,” says Reis.
Digital realities already handle air traffic; autonomous trains already move between terminals. Now imagine first responders never making a wrong turn, industrial plants predicting and performing maintenance, or autonomous technologies making mines and construction sites completely safe and sustainable, elaborates Dold.
This is being made possible by advances in technology that will help city leaders quickly solve infrastructure problems, improve operations, analyze resources more efficiently and save costs, freeing them to develop strategies that improve the quality of life for citizens in a given place.
Metaverse for Smart Cities
One of the biggest lessons from the pandemic is that cities across countries need a fresh take on the nature of real estate. In 2021, Seoul became one of the first cities to announce plans for a Metaverse, a virtual communication ecosystem for municipal administration.
Similarly, China has invested in exploring the Metaverse in Shanghai. Santa Monica, California, in the United States offers a Metaverse experience for tourists. And recently, on October 15, Dubai announced a Metaverse strategy.
Dubai had also recently announced a Higher Committee for Future Technology and Digital Economy, which will play an integral part in the city’s mission to spearhead the Metaverse and Web3.
Many more cities are engaging with technologies related to the Metaverse, such as digital twins, blockchain and IoT, to help them host municipal events, attract tourism, or even post NFTs in virtual spaces that could then appear on real apartment walls!
However, for most cities, these technologies are more useful and interesting than the more immersive versions of the Metaverse that currently exist, thinks Lena Geraghty, Director of Sustainability and Innovation, Center for City Solutions.
Here, big cities that are often on the cutting edge of technological innovation will likely continue to invest in a more immersive version of the Metaverse.
Geraghty thinks many other cities will also continue to incorporate IoT and digital twins into their operations as these tools become more accessible and as their utility is further explored.
“Technologies such as digital twins have tremendous potential to help cities with a range of goals, from planning for infrastructure changes to lowering building emissions. They allow city leaders to view their cities with a granularity that allows them to play out hypothetical scenarios to anticipate specific impacts of new buildings, street changes, or other land use decisions. Cities, including Boston, Orlando, Las Vegas, and Phoenix, have already invested in this technology.”
Chan points to an interesting angle. Cities have long faced challenges with providing equity, accessibility and quality of life for vulnerable residents, and with the pandemic resetting smart city priorities, these once again have emerged on top of the agenda. Mobility issues have limited senior citizens and physically disabled residents from fully accessing services, visiting businesses and attending events. Residents in lower socio-economic communities do not have the same services as other areas have, nor do they have the same access to quality education and health services.
While the Metaverse cannot solve all these challenges fully, Chan believes its immersive nature offers the potential to make a meaningful impact in ways not possible before.
“For example, senior citizens and physically disabled residents with limited mobility are no longer restricted in where they can go or what they can do. In the Metaverse, they can visit and engage with friends, attend classes and events, exercise, and access services, and do so in ways that are similar to in-person activities. City leaders and managers can use the Metaverse to host more effective community engagement and collaboration meetings with residents and businesses,” he explains.
The Metaverse could also change how people return to work in a post-pandemic world.
However, cities also must be proactive in considering how the Metaverse will impact them. “We have seen that new technology can provide innovative solutions to better cities when local leaders plan for it but can also be detrimental to cities when it arrives without foresight,” Geraghty says.
Metaverse for Construction
Digital twins are already an integral part of construction processes. The immersive side of the Metaverse will further drive a desire for better ways of interacting with the data that describes a construction project, points out Trimble’s Patton.
THE LAYERS
Some of the ways we interact with the Metaverse differ from how we interact with the physical community in the smart city. However, the city’s Metaverse creates the same outcomes, aligned to the same needs and priorities of its residents, businesses and visitors as the physical smart city. Therefore, it is helpful to think about the Metaverse in layers.
Technology layer: components such as VR/AR and digital twins.
Data layer: feeds the digital twin and other reality models and services in the Metaverse.
Connectivity layer: allows residents, businesses and visitors to access the Metaverse and interact with each other.
Content and experiences layer: the interactions and engagements between the community and other members that bring the Metaverse to life and allow it to grow and sustain itself.
Innovation layer: the tools and means for Metaverse community members to continuously create content, experiences and services for the Metaverse.
VIRTUAL LAND FOR SALE!
As the metaverse begins to take shape, virtual worlds such as Decentraland, The Sandbox, Somnium Space and Cryptovoxels are equipping users with avatars, notes an EY paper. Some companies are using virtual land to create new marketing channels through immersive experiences, digital goods like NFTs and sponsored content. EY says over 200 major consumer brands have reportedly bought virtual land in the metaverse, including Atari and Warner Music Group.
And that applies not only to immersing ourselves in 3D models in the typical sense, but also to the digital representation of the project schedule, costs, materials and timeline.
“All of these things are digital representations of the construction project that I feel will be made more intuitive through some iteration of the Metaverse. For example, today a legal title for a building is signed over as a paper document that is recorded in a county’s ledger; in the future, the digital data will represent the building itself and can be sold and verified through the blockchain as opposed to a physical library,” he explains.
Stephanie Lin, Senior Director, Global Retail Strategy at Matterport, agrees. “From real estate, architecture, design, engineering, retail, travel and hospitality, we know that giving virtual access to physical space has tremendous value.”
Replicating real-world environments adds a completely new level of authenticity and reliability to the countless simulations and training sessions. This can range from immersive job training for difficult or hazardous conditions to severely cutting down the carbon footprint and pollution generated by humans today. Being able to test and validate a Metaverse proof-of-concept prior to physical world execution can be immensely valuable and cost saving.
Simulation brings enormous opportunity for all enterprises as simulating projects thousands of times virtually before producing in reality will save on cost and waste, and increase operational efficiency and accuracy. “Applying accurate physical simulation to the digital twin gives us incredible superpowers. We can teleport to any part of the digital twin just like we can in a video game, and inspect any aspect of it reflected from the real world. We can also run simulations to predict the near future, or test many possible futures for us to pick the most optimal one,” says Kember of NVIDIA.
NVIDIA’s Omniverse is a prominent example in this area, and already powering several businesses — whether by DNEG or Sony Pictures Animation to build accelerated USD media and entertainment workflows; by Foster + Partners or KPF to bring architectural design to the next level; by Ericsson to create a digital twin of a city for 5G deployment optimization; by BMW Group for building factories of the future; or by Lockheed Martin to build simulation environments to better predict wildfire spread.
Cozzi gives the example of the work Cesium does with its partner Komatsu, creating a digital twin of the construction site. “We envision a future where you may have an operator on site wearing AR glasses, seeing overlays of stockpiled volumes and cut-and-fill maps, and they'd be connected to someone off site wearing VR, both pointing at the same areas, and maybe a third person in an operations center with a big screen,” he says.
Similarly, the Matterport digital twins are ready-made for the Metaverse. “Through our partner platform, augmented reality and virtual reality technologies are integrating with digital twins to create seamless and immersive experiences in these digital environments. We believe digital twins can be one of the core foundations for building the Metaverse,” explains Lin.
Is the Metaverse Enhanced Virtual Reality?
It seems not. As Cozzi explains, the Metaverse will be much more than VR. VR will be one way to engage with the Metaverse, just as your phone today is one way to engage with the internet; but you will also engage with the Metaverse through AR wearables, and through your laptop and phone as you know them today.
Photo courtesy of NVIDIA

NVIDIA Omniverse - BMW AI Factory of the Future.
Chan explains that this interpretation of what makes a Metaverse is a bit simplistic. “From a technology perspective, digital twins and VR/AR are some of the more top-of-mind components that enable the Metaverse, but they are not the sole components,” he says.
Geraghty concurs: “These technologies are iterations of what the Metaverse may one day become. Technologies like 3D modeling and digital twins are developments that move towards blending the physical and digital worlds, which is what the Metaverse posits to do. As technology advances, these tools may improve to become even more lifelike and useful in various sectors.”
However, Patton largely agrees that the ‘Industrial Metaverse’ will be an iteration of the digital twin concept with deeper interaction, collaboration, and immersive capabilities. “Essentially, it seeks to be the Digital Twin concept actually realized.”
The concept of the ‘Industrial Metaverse’ is really no different from the other iterations of virtual worlds we have seen in the past, but now the emphasis is on introducing genuine value in other vertical areas of opportunity. Computers were originally seen as devices geared only towards gaming, but over time the technology advanced to the point where more tangible benefits extended into business (advanced calculation, digital visualization of information, etc.), making them a bona fide requirement for successful operations. The same is happening to immersive technologies now, Patton says.
Future Economic Impact
Now that we have the technology to create true-to-reality digital twins of the physical world, this evolution of the web will be much larger than ever before. Creators will make more things for virtual worlds than they do for the physical world, and enterprises will build countless digital twins of products, environments and spaces, from object scale to planetary scale, says Kember.
There are estimates that the economic impact of the Metaverse will touch USD 1 trillion in the next few years. Gartner has predicted that one in every four consumers will be using the Metaverse for at least an hour each day by 2026.
The next era of industries and AI will be enabled by the Metaverse and these interoperable virtual worlds. For one, 3D workflows are now essential to every industry. After all, everything we design and build is typically first built in a virtual world. Be it a car, a bridge, a factory or a city, everything is first designed with various CAD tools before it is built.
Dold thinks that as more and more users adopt these innovations, and as companies continue to make advanced technologies easier to use, these solutions will become suitable for an ever-widening base of users, extending beyond traditional industrial and governmental customers to entertainment, healthcare and fashion.
“Imagine we can model our entire earth and ecosystem and do simulations to understand the impact of policies on climate change; from local to global and global to local,” says Alameh.
The Metaverse can help us better connect with the world around us, interact more deeply, and get to the root of issues. There will be a larger market, larger industry with more designers and creators building digital things in virtual worlds.
And just like the initial internet did, Web 3.0 will spark many new economies, larger than our current physical economy.
Photo courtesy of Cesium

Conceptual model of a construction site in the metaverse, where real world location data from real-time 3D scans and intelligent machines are combined to create fully immersive and interactive digitized worlds.
Anusuya Datta
Editor-at-Large Americas
anusuya@geospatialworld.net