Metaverse and interfaces: How will we move and interact within virtual worlds?


We can think of the development of the metaverse as a kind of gold rush. Now that the concept of a virtual world where we can spend hours working and having fun has taken hold, numerous companies around the world are working on solutions: software, platforms, and virtual and augmented reality headsets. But interfaces are just as crucial. It’s pointless to have the best technology on the market if it is complicated, unintuitive, or unnatural to use. Video games have always faced this problem, and until now they have relied on classic control schemes: the mouse-and-keyboard combination or a traditional gamepad, such as those for PlayStation and Xbox. These work great for people who grew up playing video games, but they are far less effective for other kinds of immersive experiences, such as collaborating with remote colleagues.


Simplicity is the keyword


The metaverse can already be a fascinating experience at times, especially in worlds meant to be explored through virtual reality headsets (as envisioned by Meta and others). However, controllers and interfaces need to take a leap forward if the metaverse is to appeal to a wide audience. The controllers bundled with the Oculus Quest and HTC Vive, while very comfortable for those accustomed to video games, are not always ideal for professional collaboration. They are precise and represent a significant advance over a mouse and keyboard (the standard interface of Second Life, which we can consider the first metaverse experiment), but they still don’t convey the sensation of, for example, moving one’s fingers.


Moreover, on-screen interfaces are not very effective, and the trend will likely be to do away with them. The ideal interface, after all, is the one that isn’t there: if the system could understand our intentions, through voice commands and by tracking where our gaze is focused, there would be no need for additional tools.


What is available to us today


Numerous companies are working on controllers that can emulate hands, and one of the most interesting technologies in this regard is Valve’s Index controllers. They may initially look similar to those bundled with Oculus or HTC Vive headsets, but appearances can be deceiving. Each of these gadgets is equipped with 87 sensors (optical, capacitive, force, and motion sensors) that track the position of the hand and individual fingers as well as grip pressure. All the input gathered from these sensors is processed to infer the player’s intentions and replicate them in the metaverse. The result is that to drop an object on the ground, you simply open your hand, and to throw it, you make the gesture as if you were actually holding something, adjusting the force accordingly. Want to play rock-paper-scissors? It’s as easy as doing it in person with friends.
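To make the idea concrete, here is a minimal sketch of how readings like these could be mapped to intentions. It is purely illustrative: the field names, thresholds, and logic are assumptions invented for the example, not Valve’s actual API or algorithm.

```python
# Hypothetical sketch: inferring grab/release/throw intent from
# normalized hand readings (0.0 = fully open, 1.0 = fully closed).
from dataclasses import dataclass

@dataclass
class HandSample:
    grip_force: float   # normalized grip-sensor pressure
    finger_curl: float  # average curl of the tracked fingers
    velocity: tuple     # hand velocity (m/s) from motion tracking

GRAB_THRESHOLD = 0.6
RELEASE_THRESHOLD = 0.2
THROW_SPEED = 1.5  # m/s: releasing while moving fast counts as a throw

def infer_intent(sample: HandSample, holding: bool) -> str:
    closed = max(sample.grip_force, sample.finger_curl)
    speed = sum(v * v for v in sample.velocity) ** 0.5
    if not holding and closed > GRAB_THRESHOLD:
        return "grab"
    if holding and closed < RELEASE_THRESHOLD:
        return "throw" if speed > THROW_SPEED else "drop"
    return "idle"

# Example: a closing hand moving fast, not yet holding anything -> "grab"
sample = HandSample(grip_force=0.8, finger_curl=0.7, velocity=(2.0, 0.5, 0.0))
print(infer_intent(sample, holding=False))
```

In a real system, logic of this kind would run every frame, fed by the headset’s tracking runtime.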


To date, this is certainly the most effective input system for VR, as well as the most economically accessible. Another valid alternative comes from the VRfree gloves by the Swiss company Sensoryx AG, which are compatible with the major VR headsets. These gloves also offer finger tracking, but their precision is limited compared to Valve’s controllers because they use significantly fewer sensors. It’s worth noting that Sensoryx AG offers several versions, including the VRfree Haptic Glove, which provides tactile feedback: essentially, we can “physically” feel the objects we grasp.


The VRfree Sole, on the other hand, is a special insole designed to track movement. It detects foot pressure and can determine whether the user is sitting, standing, walking, or climbing stairs. However, it is not a tool intended for exploring virtual worlds but rather for conducting clinical studies.
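As an illustration of the idea, here is a toy heuristic for mapping normalized foot-pressure readings to an activity state. The signals and thresholds are invented for the example and have nothing to do with Sensoryx’s actual algorithm.

```python
def classify_posture(heel: float, toe: float, steps_per_second: float) -> str:
    """heel/toe are pressure readings normalized to body weight;
    steps_per_second is a crude gait-frequency signal."""
    if heel + toe < 0.3:
        return "sitting"    # most of the weight is off the feet
    if steps_per_second < 0.3:
        return "standing"   # loaded feet, but no gait rhythm
    # While stepping, stair climbing loads the forefoot far more than
    # level walking does; we use that as a toy discriminator.
    return "climbing stairs" if toe > 2 * heel else "walking"

# Forefoot-heavy loading with a steady gait -> "climbing stairs"
print(classify_posture(heel=0.1, toe=0.5, steps_per_second=1.0))
```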


Gamers who want to move more intuitively within virtual worlds can experiment with Cybershoes, a kind of shoe equipped with sensors and wheels. Don’t worry: you won’t risk banging into a wall or tumbling down the stairs. These shoes are not designed for actually moving around a room but for simulating movement while comfortably seated: rolling the wheels with your feet sets the walking speed.
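The underlying translation from wheel rotation to virtual locomotion is straightforward; a hedged sketch follows, with the wheel radius and encoder resolution chosen arbitrarily for the example (they are not Cybershoes specifications).

```python
# Illustrative only: converting wheel-sensor rotation into a virtual
# walking speed, the basic idea behind seated locomotion devices.
import math

WHEEL_RADIUS_M = 0.03  # assumed wheel radius
TICKS_PER_REV = 360    # assumed encoder resolution

def walking_speed(ticks: int, dt: float) -> float:
    """Return forward speed in m/s from encoder ticks over dt seconds."""
    revolutions = ticks / TICKS_PER_REV
    distance = revolutions * 2 * math.pi * WHEEL_RADIUS_M
    return distance / dt

# e.g. 90 ticks in 0.1 s -> a quarter revolution -> roughly a 0.47 m/s stroll
print(f"{walking_speed(90, 0.1):.2f} m/s")
```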


Those who feel nostalgic for the 90s and the early VR experiments might be fascinated by VR treadmills, such as the one developed by Ping-S: special “turntables” to which players are strapped, allowing them to walk, and even run, while remaining in place, without any risk of colliding with walls or objects. However, these solutions are very expensive and not particularly accurate, and they are mainly intended for arcade halls, where the “wow” effect can make a difference. In everyday use, they make little sense for gaming enthusiasts and businesses alike.


The interface of the future is the one that doesn’t exist yet


The problem with many current interfaces is that for over 20 years we have been trying to replicate reality as closely as possible. This may not be the ideal approach, however, and research is now focusing elsewhere. In particular, two companies are leading the way in developing solutions to make interaction in the metaverse simpler and more natural. One of them is, of course, Meta: the giant led by Zuckerberg has gone all-in on the metaverse, and it’s not surprising that it doesn’t want to limit itself to offering the headset and the software platform. The company has long been filing patents on eye tracking, initially to gather more data on user behavior and deliver more effective advertising. Over time, however, the focus shifted to the metaverse, and Zuckerberg now wants to integrate facial and eye tracking into future versions of the Oculus headsets. Face tracking will make it possible to go from avatars that, with their perpetually smiling expression, resemble LEGO or Playmobil characters to far more expressive digital alter egos, making communication easier. And, why not, eye tracking could be used to issue commands, for example by simply shifting one’s gaze.
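One common way to turn gaze into a command is dwell selection: an action fires when the eyes rest on a target long enough. The sketch below is a generic illustration of that pattern; the class, names, and timings are assumptions for the example, not Meta’s implementation.

```python
# Hedged sketch of dwell-based gaze selection: call update() once per
# frame with whatever the eye tracker reports as the gazed-at element.
import time

DWELL_SECONDS = 0.8  # how long a gaze must rest on a target to "click"

class DwellSelector:
    def __init__(self):
        self.target = None
        self.since = 0.0

    def update(self, gazed_target, now=None):
        """Returns the target once per completed dwell, else None."""
        now = time.monotonic() if now is None else now
        if gazed_target != self.target:
            # Gaze moved: restart the dwell timer on the new target.
            self.target, self.since = gazed_target, now
            return None
        if gazed_target is not None and now - self.since >= DWELL_SECONDS:
            self.since = now  # reset so we don't fire every frame
            return gazed_target
        return None
```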


Elon Musk, on the other hand, is looking even further ahead and has been experimenting with neural interfaces on monkeys for some time. Why wear gloves, suits, or strange shoes, or use counterintuitive controllers, when we could do everything with our minds? The entrepreneur behind PayPal, Tesla, and SpaceX has been working on Neuralink, a brain-implant interface. No, it has nothing to do with the jacks that cyberspace cowboys inserted into their skulls to enter cyberspace in William Gibson’s stories. It won’t be used to “transfer” us into the matrix, but rather to control interfaces: moving cursors and activating commands. Essentially, it’s a joystick controlled by the mind, at least initially. According to Musk, in the future this technology could solve the problems of autism and schizophrenia and even enable paralyzed individuals to walk again, though he has provided no evidence to support these claims. Does it work well? It’s hard to say, since so far it has only been tested on monkeys, allowing them to play Pong using only their minds. Musk initially estimated that the first human tests could be conducted as early as 2020, a date that slipped to 2021 and then to 2022. Will we see these tests soon? It’s not certain: Max Hodak, co-founder and president of Neuralink, left the company in 2021 along with seven other key members.


And what if the screen were the ideal interface for the metaverse?


We have explored various possibilities, both present and future, for metaverse interfaces. Setting aside Musk’s highly futuristic vision, the other examples mentioned assume that we will access the metaverse through VR headsets, which is Zuckerberg’s vision. While this is fascinating, it isn’t necessarily the path technology will take. This doesn’t mean we will soon forget about headsets; on the contrary, they will likely remain a prominent technology. But they may not be the only one. It is quite possible that for some time this approach won’t be embraced by everyone, and as a result the metaverse will also need to be accessible through traditional PC screens and, above all, mobile phones, using the familiar interfaces we have been accustomed to for decades. These are the most common and simplest interfaces to use, and everyone already has access to them.