Apple released visionOS 26.4 this week, bringing VR foveated streaming and a key improvement to spatial audio in mixed reality.
Foveated Streaming
The key new feature, foveated streaming, was announced last month when the first visionOS 26.4 beta was released to Vision Pro developers.
Before you continue reading, note that foveated streaming is not the same as foveated rendering, though the two techniques can be used alongside each other. As the names suggest, foveated rendering involves the host device actually rendering the area of each frame you're currently looking at at higher resolution, while foveated streaming refers to sending that area to the headset with higher image quality than the rest of the frame.
It's a term you may have heard in the context of Valve's Steam Frame, where it's a fundamental always-on feature of its PC VR streaming offering, delivered by default via the bundled USB wireless adapter for PCs.
Given that the video decoders in headsets have a limited maximum resolution and bitrate, foveated streaming helps prioritize resolution and compression quality where you’re currently looking.
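To make the idea concrete, here's a minimal Swift sketch of how a streaming encoder might weight per-tile compression quality by distance from a coarse gaze region. All names and the falloff curve here are illustrative assumptions for explanation, not Apple's or Nvidia's actual APIs.

```swift
import Foundation

// A coarse gaze region in normalized frame coordinates (0...1),
// standing in for the "rough" gaze information the headset provides.
struct GazeRegion {
    let centerX: Double
    let centerY: Double
}

/// Returns a per-tile quality weight in 0...1, where 1.0 means full
/// encode quality. Tiles inside the foveal radius get full quality;
/// quality falls off linearly outside it, clamped to a floor so the
/// periphery stays legible. Radius and floor values are arbitrary
/// examples, not real encoder parameters.
func tileQuality(tileX: Int, tileY: Int,
                 tilesPerRow: Int, tilesPerColumn: Int,
                 gaze: GazeRegion,
                 fovealRadius: Double = 0.15) -> Double {
    // Center of this tile in normalized frame coordinates.
    let cx = (Double(tileX) + 0.5) / Double(tilesPerRow)
    let cy = (Double(tileY) + 0.5) / Double(tilesPerColumn)
    // Euclidean distance from the tile center to the gaze center.
    let dx = cx - gaze.centerX
    let dy = cy - gaze.centerY
    let dist = (dx * dx + dy * dy).squareRoot()
    if dist <= fovealRadius { return 1.0 }
    return max(0.3, 1.0 - (dist - fovealRadius) * 2.0)
}
```

In a real pipeline this kind of weight map would feed the video encoder's per-region quantization, so bits are spent where the eye can resolve detail and saved in the periphery.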
Valve’s depiction of foveated streaming.
Unlike the macOS Spatial Rendering introduced in the main visionOS 26 release last year, which is a relatively high-level system that only supports a local Mac as a host, Foveated Streaming is a low-level, host-agnostic framework that any PC VR server could in theory implement. For developers who want a ready-to-go implementation, Apple worked closely with Nvidia to add support to its CloudXR SDK. And despite the name, CloudXR can be used with both local and cloud PCs.
Normally, for native visionOS apps, the operating system does not provide developers with any information about where the user is looking – Apple says this is to preserve privacy. Instead, developers only receive events, such as which element the user was looking at as they performed the pinch gesture. But crucially, to make foveated streaming work, the new API tells the developer the "rough" region of the frame the user is looking at. It's the first time visionOS developers have had access to this kind of information.
This should allow the host to render this region at higher resolution too, not just stream it at higher quality. As always, that will require the specific VR game to support foveated rendering, or to work with tools that inject it.

Over the past few weeks, multiple companies have announced supporting Vision Pro’s foveated streaming via Nvidia’s CloudXR SDK:
X-Plane 12 and iRacing are releasing dedicated visionOS clients that automatically connect to the sim on your PC, automatically align your cockpit to your physical peripherals through computer vision, and segment out those peripherals with passthrough so you can see them in VR.

Autodesk VRED has already added an immersive mode that pixel streams high-fidelity assets to Vision Pro, and it's already being used by Kia, BMW, Volvo, and Rivian.

A lone developer released Clear XR on TestFlight, which lets you stream any OpenXR VR game from your PC with foveated streaming.
One notable limitation is that Nvidia’s CloudXR SDK only supports Nvidia’s Ada and Blackwell GPU architectures, meaning RTX 40-series and 50-series graphics cards.
Improved Spatial Audio In Mixed Reality
Since launch, Apple Vision Pro has supported not just basic positional spatial audio, but audio that scans the features and materials of your space to precisely match sound to your physical environment. Apple calls this Audio Ray Tracing.
With visionOS 26.4, Apple says Vision Pro now stores and remembers the acoustic properties of the rooms you use the headset in, so the Audio Ray Tracing feature can initialize more quickly and have a more holistic model of your environment without you needing to move around first.

