There is one last experience I want to tell you about from my trip to AWE USA a few months ago: the time I visited XCOM Labs to try its remote rendering solution!
A few friends of mine suggested I visit the booth of XCOM Labs, so I booked a meeting to try their solution in one of their private rooms. As soon as I arrived, an employee had me put on a HoloLens and walk around to see a complete baseball stadium scene rendered around me via servers that were right there in the room with me. We are talking about remote rendering: the servers were rendering the scene and sending it over the local network to my headset, which acted as a dumb display. This experience had everything I could expect from a remote rendering solution: the scene was much more complex than anything the HoloLens could handle on its own, and the rendering latency was so low as to be almost unnoticeable. When I removed the HoloLens, I noticed that it had a special antenna connected, which I guess was part of the networking solution I was shown.
After that, it was my turn to try the VR demo: I wore a Vive Focus 3, again with a special network accessory attached, and stepped into a sort of square physical ring, where a very complex virtual panorama appeared around me, with plants growing as I went around waving my hands. Again, I noticed that the remote rendering was working well, with good visuals and low latency. When I removed the headset, I could say that the experience had been good, but I did not understand what was supposed to surprise me: the rendering servers were not even in the cloud but right there, in the room with us, so I had not experienced anything different from Virtual Desktop.
I was about to leave the room, very puzzled, when I was introduced to someone at the company who explained to me what I had just experienced. And yes, he told me that at the end of the day, the result is quite similar to Virtual Desktop, because what the company does is stream the rendered scene from a computer to an AR or VR headset. But what I should have looked at is how the network was built.
This employee told me that there was not a simple Wi-Fi network in the room, but a complex wireless network working with unlicensed millimeter wave (mmWave), which offers much more bandwidth than Wi-Fi 6. The network was composed of a grid of many small access points installed on the ceiling, and as I moved around the room, the system handled the roaming between the various access points so that I was always served by the ones that were best for me at that moment. This system has been developed so that it can serve many people in the same room: with Virtual Desktop, you can't reliably stream to more than one person, while with this system, you can perform remote rendering for 8 people together in the same room. And if you increase the number of streaming servers, you can scale up to dozens of people enjoying remote rendering at the same time. The system guarantees that there is no interference between the different streams, so every user gets the best possible experience.
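Just to make the roaming idea concrete, here is a toy sketch of how a system could pick the serving access point as a user walks around. This is purely my own illustration under invented assumptions (a hypothetical 4x4 ceiling grid, distance as a stand-in for signal quality); it is not XCOM Labs' actual algorithm.

```python
import math

# Hypothetical ceiling layout: a 4x4 grid of mmWave access points (APs),
# 2 meters apart. Real deployments and spacing may differ.
ACCESS_POINTS = [(x * 2.0, y * 2.0) for x in range(4) for y in range(4)]

def best_ap(user_pos):
    """Return the index of the AP closest to the user.

    A real system would choose by measured signal quality and
    load-balance multiple users across APs; plain distance is
    just a stand-in for this sketch.
    """
    distances = [math.dist(user_pos, ap) for ap in ACCESS_POINTS]
    return distances.index(min(distances))

# A user walking diagonally across the room gets handed off
# between different APs along the way.
path = [(0.5, 0.5), (2.5, 2.5), (5.2, 5.2)]
serving = [best_ap(p) for p in path]
print(serving)  # → [0, 5, 15]: a different AP serves each position
```

The point of the grid is exactly this: wherever you stand, some AP is close by, so the directional mmWave link never has to cover a long, easily blocked path.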
The HoloLens 2 used in the tests. Those little white cylinders are the special antennas developed by XCOM Labs. There was a grid of around 16 of them on the ceiling to create the network for the headsets
XCOM Labs has worked on all the layers of this solution: the network architecture, the hardware, and the streaming protocols. They are not aiming to replace Virtual Desktop; they want to offer this solution to companies that need to provide remote rendering to multiple employees at the same time in the same room. One good example could be LBVR (location-based VR) venues, which could replace backpack PCs with Vive Focus 3 devices plus remote rendering. In fact, they are already partners of THE VOID, with which they developed the last demo I tried.
I’ve found it to be a cool solution.
Since this is my last post about AWE, let me cast a ray of visibility on some companies I met there, but about which I do not have enough material to write a full article:
Gamedriver: an interesting solution to create automated tests for your XR application in Unity. According to the team, if you need to test, for instance, that your game increases the health of the player by 50 HP when they grab a white suitcase, you can create an automated test that simulates the grab and checks that the action is executed. It's very interesting and much needed for companies that practice test-driven development
Photon industries: one of the leading companies in the implementation of multiplayer networking in games. They have now released a new SDK (Fusion) and are becoming more interested in metaverse platforms, too
Surreal Events: a platform to host events in 3D. It exploits the graphical power of Unreal, streamed with cloud rendering so that it can run on all platforms via the browser. Surreal was not the only platform using cloud rendering, which was, in my opinion, one of the big trends at AWE this year. Ironically, AWE showed exactly the limits of these solutions: the network there was not great, so they all showed choppy visuals. Anyway, the product itself was good, especially in terms of graphical quality
Allied Powers: a startup proposing a small device, similar to an old MP3 player, with electrodes that you put on your shoulders so that it can massage you via electrical stimulation. At low intensity, it was relaxing, but as the intensity increased, things became a bit too… intense for my muscles
InWorld.ai: a startup that claims to be able to create NPCs that can populate virtual worlds and feel alive and intelligent, like real people. It got good visibility after AWE, also because it can count on John Gaeta (Oscar winner for The Matrix) as one of its advisors. I found the team to be composed of smart people, but the current state of the project hasn't excited me. It has potential, but it has a long road ahead. I spoke with a few of these NPCs and they just answered like chatbots… one even replied "I think you are right" when I told him he was very stupid. So, the company claims to have intelligent avatars, but the avatars don't think they are intelligent themselves…
Hoboloco: a company that is trying to build a system to let you move in VR while staying seated, by just using the movements of your feet (something like the 3dRudder)
Tiled Media: a company that can stream 8K videos with adaptive quality (more resolution where the viewer is looking, less in the periphery), so you can watch 360 videos, for instance of sports matches, in streaming, with very high quality.
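The viewport-adaptive idea behind that last entry can be sketched in a few lines: the 360 sphere is split into tiles, and each tile's quality depends on how far it is from where the user is looking. This is my own toy illustration (with a made-up tile count and thresholds), not Tiled Media's actual scheme.

```python
# Toy viewport-adaptive tiling for a 360 video: tiles near the
# viewing direction stream at high quality, peripheral tiles at
# lower quality, saving bandwidth where the user can't see detail.
TILES_PER_ROW = 8  # hypothetical: 8 tiles spanning 360 degrees of yaw

def tile_quality(view_yaw_deg):
    """Assign each tile a quality tier based on the angular
    distance between its center and the viewing direction."""
    qualities = []
    for i in range(TILES_PER_ROW):
        center = (i + 0.5) * 360.0 / TILES_PER_ROW
        # shortest angular distance, accounting for 360° wrap-around
        delta = abs((center - view_yaw_deg + 180.0) % 360.0 - 180.0)
        if delta < 45.0:
            qualities.append("high")    # in front of the viewer
        elif delta < 90.0:
            qualities.append("medium")  # near periphery
        else:
            qualities.append("low")     # behind the viewer
    return qualities

# Looking straight ahead (yaw 0°): only the tiles around the gaze
# direction get the full-resolution stream.
print(tile_quality(0.0))
# → ['high', 'medium', 'low', 'low', 'low', 'low', 'medium', 'high']
```

When the viewer turns their head, the same function re-runs with the new yaw and the player swaps tile streams accordingly, which is why low latency matters for these systems too.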
I hope you have enjoyed all these articles of mine about AWE! Next year, when I come back, I'll tell you even more stories from there…
… but wait, there is a final surprise about AWE! I still have to publish the article about my tests with the Mojo Vision contact lenses! It's coming soon and I'm sure you will like it… subscribe to my newsletter to be sure not to miss it
(Header image by XCOM Labs)