
The Future of the Metaverse in Healthcare

by Jody Ranck | February 09, 2022

Key Takeaways

Facebook’s rebranding as Meta has driven attention to the metaverse as the next major technology paradigm, but the concept still lacks clarity. Most discussions of the metaverse focus on applications utilizing AR/VR. These are components of the metaverse, but the actual metaverse of the future will be far more complex and difficult to create.
In the present, it’s important to focus on innovations in the constitutive components of the metaverse and spatial computing: Edge AI, AR/VR, sensors, optics and displays, and control mechanisms. A great deal of academic research and startup activity is under way in the areas that will become part of the healthcare metaverse, and the shape of that metaverse will slowly emerge from these components. The whole will be harder to discern and will take at least a decade to materialize.
The social infrastructure to create the metaverse will be critical and a very steep hill to climb. The dominant platform players see the metaverse as the next big thing for land grabs. The metaverse will need even more cross-sector cooperation than the creation of the internet did to build standards and address the ethical issues of bridging the physical and digital worlds. Those issues will make current debates over responsible AI and ethics look elementary by comparison.
The future impact of the metaverse will most likely benefit medical and patient education, collaborative research, and precision medicine. The nearest-term applications of the metaverse will be in these areas, where we already see AR/VR work being developed, particularly in academic research centers.


Facebook’s recent name change to Meta has sparked a flurry of speculative articles on the future metaverse and its impact on society overall, as well as diverse sectors such as healthcare. The science fiction vision of the metaverse has been around for decades with examples such as Neal Stephenson’s Snow Crash, Hollywood films such as The Matrix and many contemporary gaming platforms from Fortnite to Unreal Engine. We are hearing more and more about augmented reality (AR) and virtual reality (VR) in healthcare, with use cases in surgery, pain management and medical education. But what is the metaverse? And what role do we foresee it having across the health IT ecosystem in the future?

Metaverse applications in healthcare run the gamut of new ways of delivering medical education, from using simulation to improve surgeons’ skills on more complex surgeries, to new modes of patient engagement. We have seen fragments of the future in examples such as Second Life, which for a period offered spaces for patients to receive counseling on HIV and sexually transmitted infections in a way that preserved privacy and improved doctor-patient relationships. Collaborative spaces for medical professionals are already forming, led by platforms such as Veyond. Digital twin technology is now being used for simulation and will likely play a role in future precision medicine platforms.

Matthew Ball is one of the earliest and most thorough analysts of the metaverse and has developed the best description(s) of what it is and what it is not. For starters, it can be described as follows:

  • Persistent: it never ends
  • Synchronous and live: a living experience in real time
  • Exists without any cap on usage and every user has a unique sense of presence: everyone can participate and with agency
  • Fully functioning economy: individuals and businesses can own and invest in assets and be rewarded with value
  • Omni-experiential: it spans the digital and physical worlds, public and private spheres, and open and closed platforms
  • Massive interoperability across assets and experiences
  • Driven by content and experiences: an even wider range of contributors than the internet

The interoperability requirement should give anyone in healthcare pause, considering the mountain we may be climbing here.
Ball also points out a number of misleading interpretations of the metaverse and attempts to clarify by describing what it is not:

  • Not a virtual world; virtual worlds have been around for quite a while
  • Not a virtual space such as Second Life. Just having a virtual space with avatars is not enough to be considered a metaverse
  • Not just a digital and virtual economy. World of Warcraft and Bitcoin are not metaverses.
  • Not a game or theme park like Disneyland. A metaverse is not a game in itself.
  • It is not an app store or user-generated content platform like YouTube.

Right now, this wide-open space means that big tech (Facebook, Microsoft, Google) is scrambling to invest in and build early versions of pieces of what will be the metaverse; this is part of a future land grab and evokes the familiar pattern in which major technology leaps reshuffle who dominates the economy.

Why Does the Metaverse Matter?

A similar but less well-defined vision of the metaverse is the “spatial computing” paradigm described by Robert Scoble and Irena Cronin in their book “The Infinite Retina”. They focus on the technologies that will enable the metaverse, including AI, AR, VR, computer vision, sensors, robotics, cloud computing and 3D technologies. These technologies, oriented toward spatial computing and/or the metaverse, will drive change in six key areas:

  • Optics and Displays
  • Wireless and Communications
  • Control Mechanisms (voice, eyes, hands)
  • Sensors and Mapping
  • Computing Architectures
  • Artificial Intelligence

Paradigm shifts do not happen overnight and most often do not unfold the way speculative thinkers envision. To understand why, it will be important to look at some of the challenges.

Challenges in Building the Metaverse

The first challenge is that the metaverse is not the internet, and the internet was not designed for the metaverse (see Ball). Persistent communication across the physical and digital worlds—and the standards needed to accomplish this with the prerequisite computing architecture—is a major lift. Getting land-grabbing big tech companies to the same table on standards will be an interesting challenge, particularly when fiefdoms have served certain companies so well in the present technology era. Who will drive FHIR for the metaverse? (Although, hopefully, we’ll have moved beyond it by then.) The socio-political processes to create interoperability will most likely be measured in decades.

Speculative Insights on the Metaverse in Healthcare

From the introduction above, it’s evident we are still in a moment when the metaverse is being defined, often without a great deal of clarity. The other challenge is the rhetoric of futurists who often lack a background in health IT and claim that wearables and VR are about to radically change patient behavior and access to healthcare. A better recipe on those fronts, for now, is less technology and more social innovation.

Cronin and Scoble point to the rise of a new practitioner: the Virtualist. A living example is Dr. Sherylle Calder in South Africa, who uses virtual screens to improve patients’ visual performance (EyeGym.com). Her therapy uses these screens to retrain pathways in the brain, and she has worked with cyclists and rugby teams. One can imagine, for example, a new modality of virtual care that uses digital twins and VR/AR to let patients see the impact of behaviors or therapies on their bodies in real time, which could better motivate behavior change.

Fig. 1: Components and Technology Layers of the Metaverse (Source: Outlier Ventures)

The layers include individual personas (avatars and assets); hardware and software (VR/AR headsets, PCs, gaming consoles, client software); assets from within the virtual world; virtual representations of physical assets (buildings, avatars, collectibles, wearables); and currencies and content that exist as images, audio, video and structured data sets.

Radiologists and surgeons are already becoming early-stage virtualists, according to these writers. Surgical VR applications can function as a kind of Waze for surgeons, helping them perform better in more complex surgeries. Mediview is already providing holographic applications for surgeons, along with a number of other companies such as Philips (the Azurion platform using HoloLens 2). The University of Connecticut School of Medicine is utilizing PrecisionOS and Oculus to train residents in orthopedic surgery.

VR is already growing in medicine, with applications for pain management (Applied VR, Psious, Firsthand Technology), schizophrenia, pre-operative pediatric surgery (UCSF), dementia, autism, and PTSD. Virtual coaches are being developed with AR glasses that will prompt users with behavior-change cues and reminders. Even Apple has filed a patent for glasses that would project a screen onto a user’s eyes.

Digital twins are also frequently mentioned as one of the initial steps into the metaverse. Sensors, AI, and augmented reality can come together in a convergence of physical and digital worlds, according to Microsoft, and this enables simulations and interactions in both realms. Some speculate that a digital twin could show up for a doctor visit someday, or perhaps just a twin of your liver, after real-time biometric data are analyzed at the edge on a remote patient monitoring (RPM) platform. However, this still may not reach far enough to be considered the metaverse in Ball’s definition.
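The RPM-to-digital-twin loop described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor’s API: the class name, the alert threshold, and the readings are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class OrganTwin:
    """A toy digital twin of an organ, updated from streamed RPM biometrics."""
    name: str
    readings: list = field(default_factory=list)
    alert_threshold: float = 100.0  # hypothetical cutoff for a flagged value

    def update(self, value: float) -> bool:
        """Ingest one sensor reading; return True if it warrants an alert."""
        self.readings.append(value)
        return value > self.alert_threshold

    def summary(self) -> float:
        """Rolling mean a clinician's dashboard might display."""
        return sum(self.readings) / len(self.readings)

# Simulated stream from a remote patient monitoring platform
twin = OrganTwin(name="liver")
alerts = [twin.update(v) for v in [85.0, 92.0, 110.0]]
print(alerts)           # only the reading above the threshold is flagged
print(twin.summary())
```

The point of the sketch is the direction of data flow: sensor readings update the twin continuously, and the twin, not the raw stream, is what a clinician (or a future metaverse environment) would interact with.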

Companies to Watch

NVIDIA is one of the big players with its Omniverse platform for connecting virtual worlds. It has already been used across media, architecture, manufacturing, computing and game development for virtual collaboration spaces. Roblox bills itself as a platform for building “experiences” and is among the gaming platforms most used by teens today. Its recent IPO will provide financial resources for expansion; however, we have yet to see much mention of healthcare in Roblox’s strategic vision.

Microsoft is making substantial investments in these technologies, including their HoloLens, Azure Digital Twins, IoT, Mesh and Power Platform, as a full metaverse technology stack that could be leveraged in healthcare as well. The digital twins and VR applications have the most immediate applications for research, education, robotic surgery and perhaps even new modalities of virtual care in the future.

Zimmer Biomet uses HoloLens in their OptiVu Mixed Reality Solution platform that has applications in their robotic surgery offering as well as for pre-operative education with patients. A number of academic medical centers are also experimenting with a diverse range of surgical and medical education applications that utilize AR/VR. Some of the leading institutions include the University of Southern California (Institute for Creative Technologies), Johns Hopkins University (neurosurgery), University of Miami (Gordon Center for Simulation and Innovation in Medical Education).

Figure 2: Roadmap to a Future Metaverse (Source: Lee et al.)


Given the long road to the creation of a real metaverse, it is important to pay attention to the technologies and companies that are building the early components of what will become the metaverse in time:

  • AR/VR applications in surgery and medical education are already in use; the academic programs listed above, with more cutting-edge or experimental applications, are worth following to assess the technology’s readiness to enter the market.
  • Digital twin usage has grown in the past two years, and we may see a wider range of applications, including some that could be used in virtual care.
  • Edge AI is going to play a growing role in remote patient monitoring and wearables and may offer paths to next-generation apps built on the data collected from sensors. Some of these applications may be too expensive with current cloud computing architectures, but Edge AI will enable them with less latency, more context awareness, and better economics.
  • The metaverse will build on the shift we’ve seen from the pandemic and growth in virtual work/care. Tools for more robust forms of virtual collaboration, from surgery to business operations, will be a focus of some vendors in the metaverse space. Blended reality applications, new screens, sensors, and control mechanisms will begin entering the market for specific professions and tasks.
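The latency and economics argument for Edge AI above can be made concrete with a minimal sketch: run a cheap check on-device and forward only the exceptional readings to the cloud, so most data never leaves the wearable. The function name, the heart-rate samples, and the 120 bpm cutoff are illustrative assumptions, not a real product’s logic.

```python
def edge_filter(readings, is_anomalous):
    """On-device triage: keep normal readings local and forward only
    anomalous ones to the cloud, cutting bandwidth cost and round-trip
    latency compared with streaming every sample to a cloud model."""
    return [r for r in readings if is_anomalous(r)]

# A wearable might sample heart rate continuously; only outliers leave the device.
samples = [72, 75, 74, 140, 73, 71, 155]
to_cloud = edge_filter(samples, lambda hr: hr > 120)
print(f"forwarded {len(to_cloud)}/{len(samples)} readings: {to_cloud}")
```

In practice the anomaly test would be a small on-device model rather than a fixed threshold, but the economics are the same: the cloud sees two readings instead of seven.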

The metaverse will also see a great deal of socio-political discourse over the ethics of data control, digital haves and have-nots, and contests over open vs. closed standards and environments. While visionaries in the field are calling for open protocols and standards, just as we heard from the 1980s through the early 1990s, the stakes will be high for those hoping to become market makers in what is viewed as the next technological revolution. The relative openness of the future means it’s all up for grabs.
