Geographic Frontiers


Where am I? I just got lost in the Metaverse again!



“The metaverse is coming and it’s a very big deal.” That was the title of a July 2020 Forbes article by Cathy Hackl, and I agree. It is a very big deal, with companies, particularly Facebook, seeing it as a huge part of their future. Facebook is creating a new parent company called “Meta” and recently announced its plan to hire 10,000 workers in Europe over the next five years to build the Facebook metaverse. I hope a few of the new employees will be geographers.

In navigating the world of metaverses (or should that be the metaverse of metaverses?), we’ll find there are multiple concepts of what makes a metaverse.

At one end is a metaverse that augments reality. It is tied to where you are in the real, geographic world. It supplies additional information based on your location and what you encounter.

At the other end is an immersive virtual world—perhaps a digital mirror of the real world but without the constraints. It is one you make up. It will likely have spatial relationships—since that is how we deal with reality—but they will all be relative to the metaverse we have entered.

In the middle is a mix of reality augmented by virtual objects. A surgeon might practice a procedure on a virtual body before attempting it on a real patient. An assembly worker may manipulate virtual parts rather than real ones. Behind the scenes, robots will do the heavy lifting and assembly based on the instructions generated by the worker in his or her virtual space.

Coffee in Katmandu

Here’s an example of how metaverses might operate. You and your friend want to meet for coffee. Instead of meeting at a coffee shop, you plan to meet at a Buddhist monastery in Tibet. So, you go to the kitchen, grab a cup of coffee, sit down at the table and put on your “metaverse glasses.” You are then immersed in your Buddhist monastery. You walk out onto a terrace to see the stunning vista of the Himalayas and your friend sitting at the table. Your friend is dressed in his Tibetan monk outfit because that’s just the way he rolls. You’ve decided on more conventional hiking pants, a sweater and a down vest. Even in this virtual metaverse, you can track that you have a real cup of coffee on the table in front of you, and you are able to take a sip without spilling it on yourself.

About five minutes into your conversation, you realize your son’s school—as in the school building—has popped up on the shoulder of a nearby peak. Underneath is a notice that the school has called; your son is running a fever and needs to be picked up. You bid farewell to your friend and at once are back in your kitchen. Still wearing your “metaverse glasses,” you grab your car keys and head toward the door. You look down and realize you are still in your pajamas. It is a cool fall day, so you quickly put on your hiking pants, a sweater and a down vest.

Your metaverse glasses are synced to your car. When you start it up, it tells you how much charge is left in the battery and gives you a visible range of how far you can drive and still get back home. (You forgot to plug in the car last night.) As you travel past the coffee shop, a notice pops up that it is National Coffee Day and the shop is selling coffee at half price. That explains the exceptionally long line at the drive-thru. As you’re about to turn onto the highway on-ramp, you receive a notice that an accident has just been reported on your planned route. You take side streets to the school instead, with your glasses giving you visual cues for each turn.

As you enter the school and sign in, you realize you’ve never met the school nurse before. Your “metaverse glasses” show you her picture from the school staff directory along with her name and even where she went to college. As you turn the corner, you recognize her immediately and greet her by name. Your son is sitting in a chair nearby. The nurse begins running down his symptoms. As she names them off, the symptoms appear around him as if floating. You have your “metaverse glasses” capture the combined real and virtual image for playback once you get home.

All this sounds pretty good, and for the most part, it is. So, how could one possibly get lost in the metaverse? It could happen. It happens today.

Death by Metaverse

In 2011, Albert and Rita Chretien headed out from British Columbia to attend a convention in Las Vegas. Based on information provided by their navigation system, they decided to take a scenic shortcut through the mountains in northern Nevada. What their GPS system could not tell them was that the road, barely passable in good conditions, was impassable due to late-winter snow and mud. Six miles in, they got stuck in the mud and were stranded. Having injured her knee in their first attempt to walk out, Rita stayed in the van while Albert set out on his own. Seven weeks after they got stuck, antler hunters came upon the van. They found Rita inside. She had survived by melting snow for water and rationing what food she and Albert had brought with them on the trip. She had given up hope of rescue and presumed she would soon die. Albert’s body was not found until a year and a half later, roughly seven miles from where he had set off.

Albert and Rita were pulled into the metaverse of the GPS-enabled navigation system. Even as their surroundings were telling them that the road was getting rougher and the terrain more extreme, they thought they would get through it. They believed in the metaverse until they got stuck.

If such disorientation happens with GPS-enabled navigation systems, it will happen with metaverses. People will get disoriented when using metaverses, some so much so that they will be seriously injured or die. These instances might be rare. But just like the occasional “death by GPS” stories, these are the ones that will make the news.

This doesn’t mean we stop the development of metaverse technology because it might harm people. (If we had followed that line of thinking one hundred years ago, we would have outlawed automobiles because people got hurt in accidents.) We simply need to be cognizant of necessary “safety features.” Metaverse designers, tech company apparatchiks, and software coders would do well to consult with geographers to integrate such features into their metaverses. Like the “kick” in the movie Inception or the ringing telephone in The Matrix, there must be a mechanism individuals can use to pull themselves out and understand where they are in real geographic time and space.

Time is intimately tied to geography. The Global Positioning System satellites we’ve grown to depend on for determining location are essentially extremely accurate clocks orbiting the earth. Measuring the differences in when signals from these satellites arrive at a receiver—whether a fixed device on a network or the cell phone in our pocket—gives us our location. With a universe in constant motion, our location is determined by our place in both time and space—a downright Einsteinian concept. When leaving a metaverse, we need to be informed of both time and space to be successfully reintroduced to reality.
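For the technically curious, the arithmetic behind this is surprisingly compact. Here is a minimal sketch in Python of the underlying idea: treat each satellite’s signal arrival time as a distance-plus-clock-error measurement and solve the measurements together for a position. The satellite coordinates, pseudoranges, and clock bias below are invented for illustration, not real ephemeris data, and a real receiver does considerably more work.

```python
import numpy as np

def solve_position(sat_positions, pseudoranges, iterations=10):
    """Gauss-Newton solve for receiver (x, y, z) in meters and clock bias (expressed in meters).

    Each pseudorange is the signal's travel time multiplied by the speed of
    light: the true distance to the satellite plus an unknown receiver clock error.
    """
    x = np.zeros(4)  # initial guess: Earth's center, zero clock bias
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x[:3], axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian: unit vectors from each satellite toward the receiver, plus the clock term
        J = np.hstack([-(sat_positions - x[:3]) / ranges[:, None],
                       np.ones((len(pseudoranges), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x[:3], x[3]

# Illustrative satellites near GPS orbital radius (~26,600 km) and a receiver on
# the Earth's surface with a 150 m clock error.
sats = np.array([[26_600e3, 0.0, 0.0],
                 [0.0, 26_600e3, 0.0],
                 [0.0, 0.0, 26_600e3],
                 [15_000e3, 15_000e3, 15_000e3],
                 [0.0, -20_000e3, 17_000e3]])
truth = np.array([6_371e3, 0.0, 0.0])
pseudoranges = np.linalg.norm(sats - truth, axis=1) + 150.0
position, clock_bias_m = solve_position(sats, pseudoranges)
print(np.round(position), round(clock_bias_m, 3))  # recovers the receiver position and the bias
```

The clock bias is the Einsteinian part: the receiver’s cheap clock is solved for right alongside its position, which is why four satellites, not three, are the practical minimum.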

There are two ways a metaverse can geographically fail its participants. The first is that it might contain incorrect or incomplete information. This is the problem the Chretiens encountered: their navigation system led them to believe a perfectly passable road would provide them a scenic shortcut to their destination. The second is that, like The Matrix or the dreams in Inception, the metaverse may be so convincing that individuals lose sense of where they really are. Perhaps a more appropriate example of this phenomenon also occurred in 2011, the same year the Chretiens made their ill-fated drive into the Nevada mountains. Three women were driving a rental SUV at night in Bellevue, Washington, when water started coming into the car. Their GPS navigation system had directed them down a boat ramp and into a lake. All three managed to escape the sinking SUV. Certainly, the navigation system was flawed in identifying a boat ramp as a road, but the women had little other sensory information available to them since it was the middle of the night. They simply proceeded in the direction their GPS navigation system metaverse was taking them. Time was a factor. Had they been navigating in daylight, they likely would have recognized the error of their navigation system and stopped short of the water.

Of Mice and Men

To understand what geographic safety features would be effective in a metaverse, we need to understand how individuals innately find their way in the real world. We likely will lean on these same innate tendencies whether we are in the real world or immersed in a metaverse. We will need a “kick” to put the metaverse into geographic perspective.

Two studies, each now a decade old, showed that cells in the hippocampus associated with direction finding were firing in baby rats not long after birth. Scientists divide the brain cells that enable direction finding into three types: direction (where the head is pointing), place (location in an environment), and grid (distance covered while moving). The upshot is, like mice (or more specifically rats), we come hardwired to operate in the physical world around us. It’s postulated these innate brain cell functions lead to our broader understanding of how things geographically fit together in our physical world. Rooms fit into buildings. Paths and streets connect buildings to one another and to natural features. And so on. Like the seafaring navigators of centuries ago, we are wired to operate on dead reckoning. We have some sense of where we begin (place), where we are headed (direction) and how fast we are traveling (grid). When we can, we factor in cues from our environment to help us. Those with a strong spatial understanding are better at creating a “bird’s-eye view” of where they are in relation to where they need to go.
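Dead reckoning itself is simple enough to sketch in a few lines of code. The class below illustrates the bookkeeping involved, not how place, direction, and grid cells actually do it; the field names and the walking example are invented.

```python
# A minimal sketch of dead reckoning: start from a known point and update the
# position estimate from heading and speed alone, with no external fixes.
import math
from dataclasses import dataclass

@dataclass
class DeadReckoner:
    x: float = 0.0        # east position in meters ("place")
    y: float = 0.0        # north position in meters
    heading: float = 0.0  # radians, 0 = due north ("direction")

    def update(self, speed: float, turn_rate: float, dt: float) -> None:
        """Advance the estimate given speed (m/s) and turn rate (rad/s) over dt seconds ("grid")."""
        self.heading += turn_rate * dt
        self.x += speed * dt * math.sin(self.heading)
        self.y += speed * dt * math.cos(self.heading)

# Walk north for 10 s at 1.5 m/s, turn 90 degrees right, walk 10 s more.
nav = DeadReckoner()
for _ in range(10):
    nav.update(speed=1.5, turn_rate=0.0, dt=1.0)
nav.heading += math.pi / 2
for _ in range(10):
    nav.update(speed=1.5, turn_rate=0.0, dt=1.0)
print(round(nav.x, 1), round(nav.y, 1))  # roughly 15 m east, 15 m north of the start
```

The catch, for sailors and for us, is that small errors in heading and speed accumulate, which is why environmental cues and the bird’s-eye view matter so much.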

It seems we humans can also learn things. Studies have shown that in addition to the spatial navigation we come wired with in our hippocampus, we learn how to create and follow directions using another part of our brain—the caudate nucleus. Psychologists call this the egocentric approach. We start from a point and follow a set of learned directions. In many cases, this works fine. It’s the primary approach programmers use when writing navigation programs that give us turn-by-turn directions.
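In code, the egocentric approach amounts to a list consumed in order. The hypothetical route below is the kind of structure a turn-by-turn program works from; the street names and distances are invented for illustration, not any real routing API.

```python
# A minimal sketch of the egocentric approach: navigation as an ordered list of
# learned directions, followed one step at a time from a known start point.
turn_by_turn = [
    ("head north on Main St", 0.8),             # (instruction, distance in km)
    ("merge onto the beltway heading east", 2.5),
    ("take exit 12 toward School Rd", 5.0),
    ("turn left into the school parking lot", 0.3),
]

def follow(route):
    """Announce each step in order; there is no map, so a missed step cannot be repaired."""
    total_km = 0.0
    for instruction, km in route:
        total_km += km
        print(f"{instruction} for {km:.1f} km ({total_km:.1f} km from start)")

follow(turn_by_turn)
```

The bird’s-eye (allocentric) alternative would instead keep a map of how places connect and re-plan from wherever you happen to be.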

Some people do not fully develop their innate spatial abilities or allow them to wane. I once had a meeting with a rather brilliant computer programmer working on some leading-edge software. He was two hours late to the meeting. It turns out he missed his exit from the beltway, so his only option was to drive completely around the city on the beltway until he came upon the exit again. I’m guessing he had a well-developed caudate nucleus. He was good at creating directions for a computer (computer programming) but could only follow directions when it came to navigation.

To be successful, metaverse developers will need to accommodate both types of people. At times we will need the metaverse to give us instructions on where to go next. At other times, the metaverse will give us the leeway of finding our way around on our own. It will need to be smart enough to know which approach works under which circumstances.

The Kick

When it comes to engineering the kicks that allow individuals to escape their metaverse and come back to the real world, developers must rely on our innate sense of geographic and spatial understanding. Even those whose hardwired traits have atrophied will find that relying on innate spatial ability is still the best strategy for quick or emergency extraction. We need to be told immediately where we are and how to get back to where we came from.

Should the metaverse recognize that an individual must leave? Metaverse developers might be wise to incorporate a geofencing capability so that when a participant moves outside a specific area, such as their home, or is about to venture into a dangerous area, there will automatically be a kick. They are brought back to geographic reality and asked whether they are really where they want to be.

In 2015, a Brazilian couple followed their GPS and drove into a gang-controlled favela in Rio de Janeiro. The favela had a street with the same name as the beach resort they were driving to. Thinking they might be police, gang members ambushed their car, killing the wife. A geofence within their GPS navigation program—as in, let us know if we go outside a predefined area—could have kicked them out of their “metaverse” and asked them if they really knew where they were headed.
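Geofencing of this sort is not exotic technology. A minimal sketch of the check involved might look like the Python below; the coordinates, the 500-meter radius, and the on_kick callback are invented for illustration and are not any vendor’s API.

```python
# A minimal geofence "kick": if the real-world position leaves a predefined safe
# area, interrupt the session and ask the user where they actually are.
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class Geofence:
    def __init__(self, center_lat, center_lon, radius_m, on_kick):
        self.center = (center_lat, center_lon)
        self.radius_m = radius_m
        self.on_kick = on_kick  # called when the fence is crossed

    def check(self, lat, lon):
        distance = haversine_m(self.center[0], self.center[1], lat, lon)
        if distance > self.radius_m:
            self.on_kick(lat, lon, distance)

# Example: a fence around home; crossing it pauses the session with a question.
def kick(lat, lon, distance_m):
    print(f"You are {distance_m:.0f} m from home at ({lat:.4f}, {lon:.4f}). "
          "Is this really where you want to be?")

home_fence = Geofence(49.2827, -123.1207, radius_m=500, on_kick=kick)
home_fence.check(49.2900, -123.1300)  # outside the 500 m fence, so the kick fires
```

The hard part is not the distance check; it is deciding what counts as a safe area and what the kick should feel like.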

The dilemma for developers will be in discerning how people will want to get their “kicks.” Here are some possibilities. Each has some potential shortcomings.

The escape hatch. Always floating around in the corner of one’s metaverse is an escape hatch icon. When touched, it might give a bird’s-eye view of the real world. Satisfied they are geographically safe, the user returns to their metaverse. Will the user know they might be in trouble and need to use their escape hatch? The metaverse may be so convincing they have no idea they are headed for the cliff.

I’ll set my own preferences, thank you. Before launching their metaverse, the user sets a number of conditions—geofences, a time limit—that will trigger a kick (a sketch of such preferences follows these options). This might be like the user controls set by a parent for their metaverse-crazed child. Will the user be able to anticipate all potential problems? How many would take the time to do this?

Be smart enough to know when I’m in trouble. Perhaps using a learning artificial intelligence program, the metaverse knows when the individual needs their kick. Will the program learn fast enough? Will it become an annoyance to the user by executing the kick when it’s not needed?

Opt out to stay in. Given a choice, some will simply opt out of any kick feature believing that they would never lose track of geographic reality—at least not to the point they would put themselves in danger. How often do people get lost in the real world when convinced they know where they are going?
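The “set my own preferences” option, for instance, might amount to little more than a small bundle of user-chosen conditions checked as the session runs. The sketch below is hypothetical: the field names, the defaults, and the should_kick logic are mine, invented only to show how few knobs the basic idea needs.

```python
# A hedged sketch of user-set kick preferences: a geofence, a time limit, and an
# opt-in for automatic kicks. A real system would need far more nuance.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class KickPreferences:
    home: Tuple[float, float]                 # (lat, lon) the fence is centered on
    geofence_radius_m: Optional[float] = 100  # kick if the body moves this far; None = off
    session_limit_min: Optional[int] = 60     # kick after this many minutes; None = off
    allow_auto_kick: bool = True              # let the system kick on its own judgment

    def should_kick(self, distance_from_home_m: float, elapsed_min: float,
                    system_flag: bool) -> Optional[str]:
        """Return the reason for a kick, or None if the session may continue."""
        if self.geofence_radius_m is not None and distance_from_home_m > self.geofence_radius_m:
            return "left the safe area"
        if self.session_limit_min is not None and elapsed_min > self.session_limit_min:
            return "session time limit reached"
        if self.allow_auto_kick and system_flag:
            return "the system thinks something is wrong"
        return None

prefs = KickPreferences(home=(49.2827, -123.1207))
print(prefs.should_kick(distance_from_home_m=250, elapsed_min=12, system_flag=False))
# -> "left the safe area"
```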

If metaverses are going to be as compelling as developers claim, I think automatic kicks will be necessary. Experience says that even with the limited metaverses established by GPS navigation systems, people will follow them even when there are indications something might be going wrong. The metaverses in our near future must be smart enough to know when something might be going wrong, kick us out, and help us regain our real-world spatial and chronological understanding. It may be an annoyance at times. It will likely make a mistake from time to time, ideally almost always on the side of caution.

What if developers don’t do this? Where we now read the occasional tragic story of death by GPS, in a decade we will be reading tragic stories of death by metaverse.

