<h1>The Quest 3's 'Hyperscapes' Are Impressive, Weird, and Doomed to Be Under-Appreciated</h1>
<p>By James Pero · September 26, 2025</p>
<p>There was a lot to unpack at <a href="https://gizmodo.com/live-updates-from-meta-connect-2025-2000658450" rel="nofollow noopener" target="_blank">Meta's latest Connect conference</a>, and almost none of it had to do with the Quest. That's actually good news for fans of smart glasses: Meta introduced not one, not two, but three new pairs, one of which <a href="https://gizmodo.com/meta-ray-ban-display-hands-on-the-smart-glasses-you-were-waiting-for-2000660384" rel="nofollow noopener" target="_blank">actually has a display in it</a>. For fans of VR, which is traditionally what Connect is all about, the selection of things to get excited about was a little sparser.</p>
<p>One thing that did make the keynote, however, was <a href="https://www.meta.com/experiences/meta-horizon-hyperscape-capture-beta/8798130056953686/?srsltid=AfmBOooenFywBmha1bnuU443QlUqHBRY6bnPPsyYDUAKGXvhCsWZVWK_" rel="nofollow noopener" target="_blank">Meta's Hyperscape</a>. If that name sounds familiar, it's because Meta announced it a year ago, but it's only now starting to roll it out for real. In case you either didn't catch the announcement back then or have since forgotten, Hyperscapes are hyperrealistic environments that can be captured within a few minutes just by walking around with a Quest 3 strapped to your head.
While Meta has had examples for people to explore for a while now (environments include Chance the Rapper's home studio and Gordon Ramsay's kitchen), this is the first time it's actually allowing Quest 3 and <a href="https://gizmodo.com/meta-quest-3s-hands-on-it-may-be-cheaper-but-it-still-made-me-queasy-2000502999" rel="nofollow noopener" target="_blank">3S</a> users to capture content for themselves.</p>
<p>Naturally, after seeing lots of wild <a href="https://x.com/bilawalsidhu/status/1970830926549766296" rel="nofollow">examples on X</a>, I wanted to give it a whirl for myself, and folks, I'm glad I did.</p>
<p lang="en" dir="ltr">Hyperscaped my office. Results aren't perfect by any means but still kind of cool? <a href="https://t.co/0SnICH1iip" rel="nofollow">pic.twitter.com/0SnICH1iip</a></p>
<p>— James Pero (@jamestpero) <a href="https://twitter.com/jamestpero/status/1971601106850783723?ref_src=twsrc%5Etfw" rel="nofollow noopener" target="_blank">September 26, 2025</a></p>
<p>The first thing you need to know is that in order to use Hyperscape Capture on your Quest 3/3S, you need to download the Public Test Channel (PTC) version of Horizon OS, which is a public beta. Having downloaded it myself, I can warn you that it's a little glitchy. But as long as you're okay dealing with some bugs and a vastly different UI, you can download Horizon OS v81 and get capturing. To join, open the Meta Horizon app, go to your Headset settings, tap Advanced Settings, and toggle on "Public Test Channel." You'll then be able to go into your headset and download v81 as a software update in your Settings.</p>
<p>The process of creating a Hyperscape couldn't be easier, really. Once you download the Hyperscape Capture app from the store, you can just load in and start capturing.
Hyperscape Capture is still in beta, mind you, so not everything is seamless; it may take a couple of tries to get the feature to work. Once it does, you'll be prompted to put the headset on and start walking around your chosen room. This is the mapping process, and Meta overlays a helpful grid on the areas it wants you to ogle, so you can go stare at them until the app is satisfied.</p>
<p lang="en" dir="ltr">Turn any room into an immersive world 🌍✨</p>
<p>At <a href="https://twitter.com/hashtag/MetaConnect?src=hash&amp;ref_src=twsrc%5Etfw" rel="nofollow noopener" target="_blank">#MetaConnect</a>, we shared how Hyperscape Capture (Beta) lets you capture physical spaces on Meta Quest in minutes and transform them into photorealistic environments 🤯</p>
<p>See it in the Meta Horizon Store 👉 <a href="https://t.co/XElaYPJxNj" rel="nofollow">https://t.co/XElaYPJxNj</a> <a href="https://t.co/j8K7AjEsEC" rel="nofollow">pic.twitter.com/j8K7AjEsEC</a></p>
<p>— Meta Horizon Developers (@MetaHorizonDevs) <a href="https://twitter.com/MetaHorizonDevs/status/1969062949810561250?ref_src=twsrc%5Etfw" rel="nofollow noopener" target="_blank">September 19, 2025</a></p>
<p>I chose the Gizmodo office, which is pretty big and, honestly, a tall order given that it's a little dark and has lots of stuff in it. Plus, I had to walk around staring vacantly at my co-workers while they worked, and Hyperscape is not optimized for people; it's meant for places. The things you map should be stationary so the headset can capture everything in detail. What I'm trying to say is that you should not do what I did: my choice of environment was a product of me having no bandwidth or free time and needing to get a test done whenever I actually had a second.
But, hey, consider my non-ideal environment a stress test!</p>
<p>Once I'd awkwardly shuffled around the room staring at stuff (you'll have to go through several rounds that capture the objects, the details, and the ceiling), it was time for the hard part: waiting. I'm not going to lie, it takes a long time for Meta to process captures. In my case, since it was a rather large space, it took about 8 hours. Good thing I wasn't on a tight deadline! Ultimately, this isn't a big deal since, again, the feature is in beta, and it's really more for fun than anything. It's still good to keep in mind if you're feeling excited and want to get exploring right away, though.</p>
<p><img loading="lazy" decoding="async" class="size-full wp-image-2000664253" src="https://www.newsbeep.com/ca/wp-content/uploads/2025/09/Image-from-Slack-4.jpg" alt="Hyperscape app for Meta Quest 3" width="1170" height="1156" />This was one angle that Hyperscape captured particularly well. © Screenshot by James Pero / Gizmodo</p>
<p>The good news is, the results were honestly worth the wait. The next day, after mapping my office, I popped the Quest 3S on my face to check it out and was genuinely impressed by the level of detail the headset was able to capture, given my pretty expedited run-through and less-than-ideal choice of environment. While some details (people, and chairs that had moved during mapping) were blurred, lots of elements, especially those I actually spent time looking at, were rendered in enough detail to make me feel like I was witnessing something close to the real thing.</p>
<p>I was able to use the Quest 3S controllers to teleport around, looking at how well (or badly) details were captured, and found that there was a big difference depending on which area of the room I was in. The areas where I was patient and spent more time were decidedly more realistic than those I rushed through.
That's hardly a complaint, though. If you're patient and you choose the right space, I'm sure Hyperscape could capture most moderately sized rooms in serviceable detail. And even when Hyperscape Capture doesn't succeed, the results are kind of… interesting. The blurred stuff is like glitch art, which I know is not what Meta is going for here, but I can appreciate the imperfections nonetheless.</p>
<p><img loading="lazy" decoding="async" class="size-full wp-image-2000664179" src="https://www.newsbeep.com/ca/wp-content/uploads/2025/09/hyperscape-example-3.jpg" alt="Hyperscape Example 3" width="1920" height="1280" />Not everything was pristine (people and moving chairs, especially), but I'll forgive Hyperscape, given my not-ideal choice of environment. © Screenshot by James Pero / Gizmodo</p>
<p>Plus, let's not forget that this sort of 3D replication of environments used to require all sorts of advanced equipment. Sure, if you want something totally immersive and high-res, this is not the ideal solution, since you're limited to the hardware of the Quest 3/3S, but for more general, everyday purposes, it's hard to beat the fact that all you need is a headset. Honestly, after using Hyperscape once, I'm excited to try mapping other scenes. I'm already thinking about how much I would have loved to have had Hyperscape back when my parents sold my childhood home. I still have dreams about that house, and being able to capture my memories of it in immersive detail would have been legitimately special. It's too late for that now, of course, but I'm already preparing to map my current apartment before I move out next year.</p>
<p>Will any of this move the needle for VR? Probably not.
<a href="https://gizmodo.com/vr-headsets-are-better-than-ever-and-no-one-seems-to-care-2000663098" rel="nofollow noopener" target="_blank">As I recently wrote</a>, headsets, no matter what features they introduce or how sophisticated the hardware becomes, may still end up being fairly niche devices. In the case of Hyperscape, though, it's easy to see the mass appeal. It's further proof to me that VR has come a long way, and while it may not be the ultimate form factor, there are things that headsets can do that other technology just can't. You probably won't run home and strap a headset to your face after reading this, but ya know what? Maybe you should.</p>