Meta's web browser now uses depth sensing for WebXR Hit Testing on Quest 3 and Quest 3S, enabling instant mixed reality object placement without a Scene Mesh.

The WebXR Hit Testing API lets developers cast a conceptual ray from an origin, such as the user's head or a controller, and find where it first intersects real-world geometry. The API is part of the WebXR open standard, but how it works at the underlying technical level varies between devices.

Previously on Quest 3 and Quest 3S, in WebXR the headset would use the Scene Mesh generated by the mixed reality setup process to determine which real-world geometry the raycast hits. That approach had several problems. If the user hadn't set up a mesh for the room they were in, they'd be forced to do so as soon as a developer called the hit testing API, adding significant friction. And even if they had a Scene Mesh, it wouldn't reflect moved furniture or any other changes to the room since the scan.

Demo clip from Meta engineer Rik Cabanier.

With Horizon Browser 40.4, rolling out now, the WebXR Hit Testing API uses Meta's Depth API under the hood rather than the Scene Mesh.

Supported on Quest 3 and Quest 3S, the Depth API provides real-time first-person depth frames, generated by a computer vision algorithm that compares the disparity between the two tracking cameras on the front of the headset. It works up to a distance of around 5 meters, and is typically used to implement dynamic occlusion in mixed reality, since it lets you determine whether virtual objects should be occluded by physical geometry.

Related: Meta SDKs Get Instant Placement, Keyboard Cutout, Colocation Discovery (UploadVR, David Heaney)

For hit testing, the Depth API enables instantly placing virtual objects on real-world surfaces without the need for a Scene Mesh. This capability has been easily available to Unity developers for around a year as part of Meta's Mixed Reality Utility Kit (MRUK), and Unreal and native developers can implement it themselves using the Depth API. Now it's easily available to WebXR developers too.

Keep in mind, however, that this is only appropriate for spawning simple stationary objects and interfaces. If entities need to move around surfaces or interact with the rest of the room, a scanned Scene Mesh will still be needed.
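Nothing changes in how a web app requests hit testing; the API surface is the same, and only the data source behind the results is new. For reference, here is a minimal sketch of the standard WebXR Hit Testing flow. The function name and logging are illustrative, and rendering, feature detection, and error handling are omitted:

```typescript
// Minimal sketch of the standard WebXR Hit Testing flow.
// Assumes WebXR type definitions (e.g. @types/webxr) are available.

async function startPlacementSession(): Promise<void> {
  // Request an AR session with the hit-test feature enabled.
  const session = await navigator.xr!.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
  });

  // The ray origin: here the viewer (the user's head). A controller's
  // targetRaySpace from an XRInputSource works the same way.
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const hitTestSource = await session.requestHitTestSource!({
    space: viewerSpace,
  });

  // Reference space in which hit poses will be expressed.
  const localSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: XRFrame) => {
    // Where does the ray first intersect real-world geometry this frame?
    // On Quest 3/3S with Horizon Browser 40.4+, the browser answers this
    // from live depth frames rather than a pre-scanned Scene Mesh.
    const results = frame.getHitTestResults(hitTestSource);
    if (results.length > 0) {
      const pose = results[0].getPose(localSpace);
      if (pose) {
        // pose.transform gives a point on the detected surface, suitable
        // for anchoring a reticle or spawning a stationary object.
        console.log("Surface hit at", pose.transform.position);
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```

Because the ray here follows the viewer space, this pattern produces the familiar reticle-on-the-nearest-surface behavior seen in placement demos like the clip above.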
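Separately from hit testing, web content can also read per-frame depth directly through the WebXR Depth Sensing module, which is the web-facing route to the dynamic occlusion use case mentioned earlier. Below is a hedged sketch of the CPU-side API, assuming the browser grants the 'depth-sensing' feature; a real occlusion implementation would use the GPU-optimized path instead:

```typescript
// Hedged sketch using the separate WebXR Depth Sensing module, one way
// web content can read real-time depth frames. Whether a given browser
// grants the feature, and in which formats, is implementation-defined.

async function logCenterDepth(): Promise<void> {
  const session = await navigator.xr!.requestSession("immersive-ar", {
    requiredFeatures: ["depth-sensing"],
    depthSensing: {
      usagePreference: ["cpu-optimized"],
      dataFormatPreference: ["luminance-alpha"],
    },
  });

  const localSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: XRFrame) => {
    const viewerPose = frame.getViewerPose(localSpace);
    for (const view of viewerPose?.views ?? []) {
      // Per-view depth buffer for this frame, if the device produced one.
      const depth = frame.getDepthInformation(view);
      if (depth) {
        // Distance in meters at the center of the view (normalized coords).
        console.log("Depth at view center:", depth.getDepthInMeters(0.5, 0.5));
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```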