{"id":86196,"date":"2025-08-22T00:13:12","date_gmt":"2025-08-22T00:13:12","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/86196\/"},"modified":"2025-08-22T00:13:12","modified_gmt":"2025-08-22T00:13:12","slug":"the-pixel-10-pros-100x-zoom-is-googles-most-controversial-use-of-ai-yet-heres-why","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/86196\/","title":{"rendered":"The Pixel 10 Pro&#8217;s 100x zoom is Google&#8217;s most controversial use of AI yet \u2014 here&#8217;s why"},"content":{"rendered":"<p><a href=\"https:\/\/www.androidauthority.com\/best-pixel-features-explained-3217987\/\" rel=\"nofollow noopener\" target=\"_blank\">Google loves AI<\/a>, and it\u2019s doubled down on the tech with every new Pixel generation. But this year\u2019s <a href=\"https:\/\/www.androidauthority.com\/google-pixel-10-specs-features-price-availability-3588502\/\" rel=\"nofollow noopener\" target=\"_blank\">Pixel 10 Pro and Pro XL<\/a> take things to another level, introducing a diffusion model to upscale images from the phone\u2019s conservative 5x optical zoom into telescopic-length 100x photos.<\/p>\n<p>Google is no stranger to computational photography or AI-assisted imaging \u2014 features like <a href=\"https:\/\/www.androidauthority.com\/pixel-9-add-me-3520637\/\" rel=\"nofollow noopener\" target=\"_blank\">Add Me<\/a> and Astrophotography mode laid the groundwork for its ongoing evolution. However, the introduction of diffusion models in the Pixel 10 Pro series marks a significant shift: using generative AI to reconstruct details beyond what the sensor can physically capture.<\/p>\n<p>Thankfully, Google includes the original, unprocessed photo alongside the enhanced version, allowing users to decide how much AI is too much. Google also securely writes AI metadata into the file so others can check if pictures have been artificially enhanced. 
Still, this all raises the question of whether AI enhancements risk going too far.<\/p>\n<h2>What is diffusion upscaling?<\/h2>\n<p><img class=\"e_sg\" decoding=\"async\" loading=\"lazy\"  title=\"Stable Diffusion Qualcomm Doggo\"  alt=\"Stable Diffusion Qualcomm Doggo\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/Stable-Diffusion-Qualcomm-Doggo.jpg\"\/><\/p>\n<p>Robert Triggs \/ Android Authority<\/p>\n<p>If you\u2019ve followed the AI landscape at all, you\u2019ve probably encountered the term diffusion in the context of image generation. Stable Diffusion was the breakout image generation tool that brought the concept mainstream \u2014 <a href=\"https:\/\/www.androidauthority.com\/qualcomm-offline-ai-image-generator-3290206\/\" rel=\"nofollow noopener\" target=\"_blank\">Qualcomm even managed to get it running on a demo phone<\/a> a couple of years back.<\/p>\n<p>Diffusion models are fascinating because they recreate images from random noise, refining them over many iterations to match a target prompt. They\u2019re trained by progressively introducing more noise to an image and then learning to reverse that process. Diffusion can generate realistic images from essentially nothing, but it can also clean up noisy images or super-size low-resolution ones.<\/p>\n<p>Still, we\u2019re not talking full-blown image regeneration with the Pixel 10 Pro. Starting from a low-res or noisy zoomed-in crop (instead of pure noise), Google\u2019s diffusion model acts as an intelligent denoiser, polishing up edges and fine details without reinventing swathes of the original image \u2014 at least in theory. 
Done well, you could consider it a texture enhancer or AI sharpener rather than a synthetic image generator.<\/p>\n<p>Are you OK with phones using AI to add more detail to pictures? (146 votes)<\/p>\n<ul>\n<li>Yes, it&#8217;s fine. (33%)<\/li>\n<li>It&#8217;s OK, but only in moderation. (45%)<\/li>\n<li>No, it&#8217;s wrong. (22%)<\/li>\n<\/ul>\n<p><img class=\"e_sg\" decoding=\"async\" loading=\"lazy\"  title=\"Pixel 10 Pro Pro Res Zoom Redditor Dry Astronomer 3210\"  alt=\"Pixel 10 Pro Pro Res Zoom Redditor Dry Astronomer 3210\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/Pixel-10-Pro-Pro-Res-Zoom-Redditor-Dry_Astronomer-3210-scaled.jpeg\"\/><\/p>\n<p>Based on patterns learned from countless training images, the model fills in textures and details that should statistically exist beneath the noise. That seems to be closer to Google\u2019s angle here, though some creative license will always exist with diffusion.<\/p>\n<p>That said, the lower the quality of the input, the more likely the model is to misinterpret what it sees. Extremely noisy or low-res images, such as 100x long-range shots in less ideal lighting, are more prone to aggressive \u201challucination,\u201d where entire details or even objects can be reinvented. Early results suggest that 100x is perhaps a stretch too far for Google\u2019s diffusion upscaling approach. Shorter zoom distances may fare better.<\/p>\n<p>Diffusion creates detail from noise \u2014 whether for generating new images or touching up existing ones.<\/p>\n<p>Google already seems aware of this approach\u2019s limitations. 
During our pre-brief, Google highlighted that special tuning is applied when a person is detected in the shot to prevent \u201cinaccurate representation.\u201d Likewise, Google suggests its model is best for landscapes and landmarks (think solid, block textures), while wildlife is best kept to a more limited range in the region of 30x to 60x, likely because fine textures like fur are far more complex to fake convincingly.<\/p>\n<p>More importantly, Google takes a different approach when it detects people as the subject. Diffusion\u2019s random approach to detail enhancement might be fine for minor textures on brickwork or distant trees, but it\u2019s potentially rather troublesome for facial features, which is why Google flicks the off switch in these situations. To demonstrate, I generated a random, low-res AI image of a person and ran a 3x diffusion upscale eight times using precisely the same settings.<\/p>\n<p><img class=\"e_sg\" decoding=\"async\" loading=\"lazy\"  title=\"Diffusion Loop Small\" alt=\"Diffusion Loop Small\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/Diffusion_Loop_Small.gif\"\/><\/p>\n<p>The same algorithm produced eight slightly different-looking versions of the same person, but which one is even close to the original image? Minor, random variations in eyes, eyebrows, hairlines, and facial structures can make people look somewhat different when upscaled via diffusion. There\u2019s always the risk that a diffusion model makes far more glaring mistakes, some of which can be horrifically jarring. 
Google might be erring on the side of caution here, but there\u2019s no guarantee that other brands will do the same.<\/p>\n<h2>Is this good or bad?<\/h2>\n<p><img class=\"e_sg\" decoding=\"async\" loading=\"lazy\"  title=\"google pixel 10 pro zoom sample 2 90x\"  alt=\"google pixel 10 pro zoom sample 2 90x\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/google-pixel-10-pro-zoom-sample-2-90x.jpg\"\/><\/p>\n<p>Rita El Khoury \/ Android Authority<\/p>\n<p>Clearly, inventing details in your pictures is a contentious topic and marks a notable shift from Google\u2019s past image processing efforts at long range. Previous versions of <a href=\"https:\/\/www.androidauthority.com\/google-pixel-super-res-zoom-3223994\/\" rel=\"nofollow noopener\" target=\"_blank\">Super Res Zoom<\/a> relied on sub-pixel shifts between frames to extract and enhance real additional detail when shooting past 10x \u2014 a clever multi-frame sampling technique rooted in physics and optics, with a dose of innovative processing to piece it all together.<\/p>\n<p>Historically, Google\u2019s reputation for computational photography has revolved around doing more with less, but all based on actual captured data. HDR layering, Night Sight, and Astrophotography blend information harnessed from multiple frames and exposures, but nothing is invented out of thin air.<\/p>\n<p>Diffusion, however, is a departure. It hallucinates extra detail that looks real based on patterns from thousands of similar images \u2014 but it\u2019s not necessarily what was actually there when you pressed the shutter. For some users, that might cross a line.<\/p>\n<p>Diffusion marks a shift in Google&#8217;s use of AI to enhance your pictures.<\/p>\n<p>Then again, at 100x, your eyes couldn\u2019t see what was really there either. As long as the image looks believable, most people won\u2019t know \u2014 or care. Pixel fans have already embraced other AI tools that make pictures look better. 
<a href=\"https:\/\/www.androidauthority.com\/what-is-google-photos-magic-editor-3329779\/\" rel=\"nofollow noopener\" target=\"_blank\">Magic Editor<\/a>, Best Take, and Photo Unblur all leverage machine learning to reshape reality to some degree. And rather than protest, many users rave about them.<\/p>\n<p>Google also isn\u2019t alone in exploring AI upscaling. The <a href=\"https:\/\/www.androidauthority.com\/oneplus-13-review-3512919\/\" rel=\"nofollow noopener\" target=\"_blank\">OnePlus 13<\/a> and OPPO Find X8 series boast <a href=\"https:\/\/www.androidauthority.com\/oneplus-13-camera-zoom-vs-pixel-vs-iphone-3512529\/\" rel=\"nofollow noopener\" target=\"_blank\">impressive long-range zoom<\/a> results based on <a href=\"https:\/\/www.oppo.com\/en\/newsroom\/stories\/oppo-find-x8-series-camera-ai-telescope-zoom\/\" target=\"_blank\" rel=\"nofollow noopener\">OPPO\u2019s AI Telescope Zoom<\/a>, which again fills in missing details at extreme distances. These phones offer extremely compelling long-range zoom capabilities from seemingly modest lenses.<\/p>\n<p><img class=\"e_sg\" decoding=\"async\" loading=\"lazy\"  title=\"OPPO AI Telescope Zoom\"  alt=\"OPPO AI Telescope Zoom\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/OPPO-AI-Telescope-Zoom.webp.webp\"\/><\/p>\n<p>Let\u2019s face it: Between color profiles, filters, and RAW edits, the boundary between a photo and what\u2019s real has always been blurry. Personally, I\u2019ll take more emotive color palettes over hardcore realism every time. Object removal and diffusion are just more tools on the belt to help you capture the pictures you want to take.<\/p>\n<p>Still, I can\u2019t help but feel that padding out fine detail is a cheap shortcut. Smartphones can\u2019t overcome the range limitations of compact optics, but inventing the details hardly feels like a compelling solution. 
But what concerns me more is what comes next: if 30x is acceptable today, what stops that kind of hallucination from creeping into your 10x shots tomorrow? Would you be happy with a phone that uses AI outpainting instead of a real wide-angle lens?<\/p>\n<p>While there\u2019s plenty of grey area, there\u2019s a boundary hidden somewhere within it. The Pixel 10 Pro\u2019s long-range zoom feels like it\u2019s approaching it, and fast.<\/p>\n","protected":false},"excerpt":{"rendered":"Google loves AI, and it\u2019s doubled down on the tech with every new Pixel generation. 
But this year\u2019s&hellip;\n","protected":false},"author":2,"featured_media":86197,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,16729,105],"class_list":{"0":"post-86196","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-google-pixel-10","14":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/86196","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=86196"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/86196\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/86197"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=86196"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=86196"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=86196"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}