Hilton Innovation VR

At the Hilton Innovation Gallery, we developed a cutting-edge virtual reality experience to immerse users in the brand’s luxury offerings. While the guest walks through virtual suites, a docent controls their experience from an iPad, where they can set the lighting, swap furnishings, and even change the time of day! The installation was built with bleeding-edge tech, including the TPCast system that let us use HTC’s Vive VR headset wirelessly. Remote control was powered by the Rockwell LAB’s own Spacebrew.

👤 Role: Software Engineer


Highlights

Show Control

So much is orchestrated from the docent’s iPad. They use it to initiate and customize their guest’s experience. In the real world, the system triggers choreographed lighting and projection settings in the room, projects floorplans onto the floor, and renders the headset user’s point of view on a wall. In the VR scene, the docent loads branded environments, fades smoothly between furnishings and materials, controls lighting, and even sets the scene’s time of day, literally moving the sun and moon!
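The time-of-day control amounts to mapping a clock hour to rotation angles for the sun and moon. The production version lived in Unity; the sketch below restates the idea in JavaScript, and its function name and angle conventions are illustrative, not taken from the project.

```javascript
// Map a 24-hour clock time to rotation angles for the sun and moon.
// Convention (an assumption): 0° at 6:00 (sunrise), 90° at noon,
// 180° at 18:00 (sunset). The moon sits opposite the sun.
function celestialAngles(hour) {
  const sunDeg = (hour / 24) * 360 - 90;
  const moonDeg = (sunDeg + 180) % 360;
  return { sunDeg, moonDeg };
}
```

Animating the time of day then just means tweening `hour` and re-deriving both angles each frame, which keeps the sun and moon in lockstep.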

Spacebrew provides bi-directional communication. So as the guest looks around in VR, the real-world wall projection smoothly follows their gaze, allowing spectators to share the experience and giving the docent feedback as they guide it. When no one is in VR, the docent can activate automatic animation paths, or pilot the wall projection directly from the iPad.
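The smooth gaze-following comes down to easing the projection camera toward the headset’s latest reported orientation each frame, taking the shortest path around the circle so the camera never whips the long way around. A minimal sketch of that smoothing (function names are mine, not from the codebase):

```javascript
// Wrap an angular difference into the range (-180, 180] so we always
// rotate the short way around.
function shortestDelta(fromDeg, toDeg) {
  let d = (toDeg - fromDeg) % 360;
  if (d > 180) d -= 360;
  if (d <= -180) d += 360;
  return d;
}

// Each frame, move the projection camera a fraction of the remaining
// angular distance toward the headset's yaw (exponential smoothing).
function smoothYaw(currentDeg, targetDeg, factor) {
  return currentDeg + shortestDelta(currentDeg, targetDeg) * factor;
}
```

Running `smoothYaw` on every incoming Spacebrew orientation message gives the wall projection its lag-free but fluid feel, since fast head turns are followed quickly while sensor jitter is averaged out.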

Hardware

Hardware for this installation includes a VR rig, controlled lighting, and wall and floor projections, all piloted from the iPad. For VR, we used bleeding-edge technology: an HTC Vive headset made wireless by the TPCast add-on, which brought wireless to the Vive before HTC’s official release.

Software

The iPad controller was a React.js web application served from a Node.js server. The VR experience was built in Unity with custom animations and shaders that could crossfade between surface materials as well as change the time of day, smoothly transitioning the sky textures and lighting direction of the sun and the moon. The Node server sent TCP messages to a Crestron to control real-world lighting. All other systems communicated over Spacebrew.

React.js, which was relatively new at the time, was a perfect solution for this project’s controller app. React’s emphasis on a single source of truth was key. The state of this distributed system was complex, including real-world factors like whether a projection screen was lowered, virtual-world factors like the position of the VR user and time of day, as well as UI factors like the branded theme of the UI.
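With every real-world, virtual-world, and UI factor folded into one state object, updates become plain, predictable transitions. The reducer sketch below illustrates the idea; the field names and action types are mine, not the project’s actual schema:

```javascript
// One illustrative slice of the controller's single source of truth:
// real-world hardware state, virtual-scene state, and UI state together.
const initialState = {
  screenLowered: false,            // real-world: projection screen
  timeOfDay: 12,                   // virtual-world: scene clock (hours)
  theme: 'default',                // UI: branded theme
};

// Pure reducer: every change to the system flows through here,
// so the UI always reflects one consistent snapshot.
function reducer(state, action) {
  switch (action.type) {
    case 'SCREEN_LOWERED':
      return { ...state, screenLowered: action.lowered };
    case 'SET_TIME_OF_DAY':
      return { ...state, timeOfDay: action.hour };
    case 'SET_THEME':
      return { ...state, theme: action.theme };
    default:
      return state;
  }
}
```

Because the reducer is pure, the same action stream can drive the iPad UI, be logged for debugging, or be replayed against a fresh state during testing.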

Testing Early

Testing early was essential, but since the Crestron would be acquired and configured by a vendor during install, our system would lack critical hardware until the final week of the project. To get ahead of this, I drafted a TCP messaging protocol, got sign-off from the vendor, then created a mock Crestron in Node.js. It behaved realistically, including faking delays due to hardware warmups.

Open source contributions

After the project wrapped, we contributed to the Spacebrew libraries for Unity and JavaScript, adding error handling and automating reconnection attempts with exponential backoff.
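Exponential backoff doubles the wait between retries up to a cap, so a briefly unplugged router recovers in a second while a down server isn’t hammered. A minimal sketch of the pattern; the constants here are illustrative, not the values in the contributed code:

```javascript
// Delay before reconnect attempt N: base * 2^N, capped at maxMs.
function backoffMs(attempt, baseMs = 1000, maxMs = 30000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Retry loop: wait the backed-off delay, try to connect, and on
// failure schedule the next attempt with a longer delay.
function scheduleReconnect(connectFn, attempt = 0) {
  setTimeout(() => {
    connectFn((err) => {
      if (err) scheduleReconnect(connectFn, attempt + 1);
    });
  }, backoffMs(attempt));
}
```

Capping the delay matters in an installation context: once the network returns, the client is never more than the cap (here, thirty seconds) away from its next attempt.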