While each of Apple’s latest iPhone models has a different rear camera system, they all feature the same brand-new Center Stage front-facing, or “selfie,” camera. The base iPhone 17, ultra-thin iPhone Air, and flagship iPhone 17 Pro all have the same Center Stage camera, complete with its groundbreaking square image sensor and sophisticated new features. We sat down with key minds at Apple to learn more about the new camera and how it empowers mobile photography.
We chatted with Jon McCormack, Apple’s Vice President of Camera and Photos, and Megan Nash, Apple’s iPhone Product Manager. The two, along with the teams they lead, are responsible for many evolutions in iPhone camera technology, including the focus of this article: the new Center Stage front camera.
500 Billion Selfies a Year Are Captured on iPhone
Although particularly passionate iPhone photographers rely heavily on the iPhone’s rear camera system, which has seen numerous improvements in the current generation, the front camera is essential for a wide range of iPhone users. It is an inherently personal camera, most commonly used for self-portraits and FaceTime calls.
“Globally, [people] took around 500 billion selfies on iPhone last year,” Jon McCormack tells PetaPixel. “We like to say iPhone is a very social camera, and that is especially true of the front camera. It’s how we express ourselves, it’s how we share our point of view, [and] it’s how we record our precious memories.”
McCormack explains that at Apple, the team puts “tremendous care and consideration” into every iPhone experience, including selfies. The team has learned a lot about how its users capture selfies over the years, including that “most people hold iPhone vertically” when capturing selfies.
This is an integral part of the selfie process, as it keeps the camera centered and in a natural position for eye gaze, Megan Nash adds. However, selfies are not exclusively photos of just the iPhone user; they often include friends and family squeezing into the frame.
How People Solve Their Selfie Problems
“People want to get more photos of friends and group shots, and they want to take selfies with landmarks in the background,” McCormack says. “So we’ve noticed some really interesting things. People often use workarounds to get the shots they want. We see selfie sticks, we see people switch to the ultra-wide camera, people rotate the iPhone, and we even see folks handing the phone over to the tallest person in the group to get the best arm extension.”
All these “workarounds,” as McCormack describes them, center around people trying to make the iPhone’s cameras work for their specific self-portrait needs.
“But we knew we could do something better,” McCormack says.
The iPhone 17 Pro’s new Center Stage front camera automatically expands its field of view to capture better group shots. | Image credit: Jordan Drake
The New Center Stage Front Camera Has Been Years in the Making
Development of the new Center Stage camera centered on a single goal: improving the self-portrait experience on iPhone.
“We asked ourselves: ‘What if the camera could just understand what you’re trying to capture and make adjustments for you?’” McCormack continues. “So we set out to remove the effort and make the experience seamless, to keep you in the moment and let the technology do the hard work for you. We wanted to design a digital camera experience that makes it easier and more delightful to take selfies, especially in groups, and to allow you to stay more present when communicating with loved ones or on video calls.”
All of Apple’s new iPhone models, including the iPhone 17 shown here, feature the new Center Stage front camera.
A Groundbreaking New Square Image Sensor
A significant part of the new Center Stage camera’s success relies upon the new 24-megapixel square image sensor.
“The front camera sensor and lens were developed together with really clear ideas about the [user] experiences we wanted to enable,” Nash says. “We knew we wanted to help users fit more friends into their group selfies, so we needed more field of view.”
However, a wider field of view often introduces distortion, as the ultra-wide cameras on many smartphones demonstrate, and that distortion can degrade image quality.
Rather than simply widening the lens or cramming more pixels onto the same-size sensor, Apple built a new, larger image sensor with the same pixel size as the prior-generation camera, ensuring no reduction in image quality.
“We grew the sensor to almost double the previous sensor size to match pixel-per-pixel sharpness,” Nash says. “So it’s the same number of pixels per degree field of view as the previous front camera.”
That larger sensor enabled Apple to achieve the desired wider field of view without sacrificing image quality or introducing distortion.
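The trade-off Nash describes can be sketched with some quick arithmetic. The figures below are purely illustrative, not Apple's actual specifications: the point is that roughly doubling sensor area at the same pixel pitch yields about √2× the linear resolution, which can be spent entirely on a wider field of view while pixels per degree stay constant.

```python
import math

def pixels_per_degree(sensor_width_px, fov_deg):
    """Angular resolution: how many pixels cover each degree of the scene."""
    return sensor_width_px / fov_deg

# Hypothetical previous-generation figures, for illustration only.
old_px, old_fov = 3024, 73.0

# Doubling sensor area at the same pixel size gives ~sqrt(2)x linear pixels.
new_px = round(old_px * math.sqrt(2))

# Spend all of the extra resolution on field of view, holding px/deg constant.
new_fov = new_px / pixels_per_degree(old_px, old_fov)

print(round(pixels_per_degree(old_px, old_fov), 1))  # ≈ 41.4 px/deg
print(round(pixels_per_degree(new_px, new_fov), 1))  # ≈ 41.4 px/deg (unchanged)
print(round(new_fov, 1))                             # ≈ 103.2°, notably wider
```

This matches the "same number of pixels per degree" framing: sharpness per degree is preserved even though the frame takes in much more of the scene.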
“The best of both worlds,” Nash adds.
Smartphone image sensors are typically 4:3. But Apple determined that a camera designed to be “orientation agnostic,” meaning it works equally well for portrait-orientation selfies of one person and landscape shots of groups, should have a square sensor.
“[Square sensors] aren’t common in the industry,” Nash says. “They’re more expensive, they take up more silicon area and they take up more space in the device.”
“For the first time ever, we’ve uncoupled the orientation of your phone and the aspect ratio of your capture,” Nash says.
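The geometry behind that uncoupling is simple: a square sensor contains a full-resolution landscape crop and a full-resolution portrait crop, so the output aspect ratio can be chosen in software regardless of how the phone is held. A minimal sketch, with a hypothetical sensor side length:

```python
def crop_from_square(side_px, aspect_w, aspect_h):
    """Largest aspect_w:aspect_h crop that fits inside a square sensor."""
    if aspect_w >= aspect_h:  # landscape-style crop: full width, reduced height
        return side_px, side_px * aspect_h // aspect_w
    else:                     # portrait-style crop: reduced width, full height
        return side_px * aspect_w // aspect_h, side_px

SIDE = 4096  # hypothetical square-sensor side length, for illustration only

print(crop_from_square(SIDE, 4, 3))  # landscape 4:3 -> (4096, 3072)
print(crop_from_square(SIDE, 3, 4))  # portrait 3:4  -> (3072, 4096)
```

Both crops use the sensor's full extent along one axis, which is why neither orientation is the "degraded" one; a conventional 4:3 sensor can only do this for the orientation it was mounted in.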
Hardware and Software Work Together in Harmony
The new Center Stage camera offers more than a new sensor and selfie experience. The team also brought over the stabilization technology from Action Mode on the rear camera, significantly improving the stabilization of the new front camera.
The new Center Stage Camera relies on various Apple technologies and recent hardware and software advancements, beyond just the latest sensor and the Action Mode-derived stabilization. The larger image sensor has increased processing demands, as do the subject detection modes that automatically swap between different orientations and crops.
“We were actually thinking years in advance when we were thinking about this new front camera,” Nash recalls. “The experiences we wanted to enable and how it would need the new high-speed Apple Camera Interface, those were especially important.”
Nash says that the new Center Stage Camera required Apple’s A19 and A19 Pro chips to process the necessary data, including for the new Dual Capture experience. Ultimately, the new Center Stage front camera was shipped as soon as it was ready.
“The magic of these experiences is really in the hardware, software, and Apple Silicon,” Nash concludes.
“And the reaction that we are getting over and over again about the new Center Stage front camera is people saying, ‘Well, of course that’s what a selfie camera should have always been. What have people been doing over these years? That took so much to do?’” McCormack laughs. “It took so much processing power and so much industrial design. We’ve been wanting to do this for a while, and this is just the first year we can actually pull it off.”
Apple’s Approach to Cameras
The Center Stage front camera reflects Apple’s broader approach to cameras and imaging technology. The company is not focused solely on installing a new sensor in an iPhone or increasing the megapixel count to achieve a new benchmark.
“To us, a camera is the combination of the lens and sensor, which is the camera module, image signal processor, CPU, GPU, and then all the math we put on top of it,” McCormack says.
The purpose of a camera also matters significantly. Apple saw a problem: people were trying all sorts of workarounds to capture the selfies they wanted. The answer was a completely different type of camera system; the problem informed the solution.
“One of our single biggest bottlenecks is actually compute speed,” he adds. McCormack says Apple has a lot of image-related math queued up, “just waiting for more silicon to arrive to enable better stuff.”
A significant part of the development work was based on trying to make the iPhone camera experience “invisible,” McCormack says.
Apple wants its users to stay in the moment, which means not needing to fuss with settings or crop modes. While photographers may wish to have extensive control when using the rear camera, most selfies are captured “in the moment,” focusing more on preserving a specific time and place. The last thing people want to deal with when posing with their friends for a group shot is diving into menus. These menus are still there for those who want more control, of course.
“It sounds simple,” McCormack says of making an essentially invisible camera experience. “But it often takes an extraordinary amount of innovation. The new Center Stage Camera is no exception… when you ask who works on the camera at Apple, the ripples of the answer go out forever because there are just so many people and so much involved.”
When the Camera Gets Out of the Way, People Capture Better and More Meaningful Photos
“Just take a minute and think about all the things you can do with the all new Center Stage front camera this year without even thinking about it,” says McCormack. “You can frame yourself solo, frame a group of friends, you can go on a hike or a run while you record yourself, or you can give a walking or biking tour with both you and the environment stable in the frame. And you can do all of this without having to think about the camera settings or even the composition. You’ll get great image quality, authentic skin tones, and excellent stabilization.”
Enabling the Center Stage front camera to swap automatically between framing modes in all of these situations required a lot of development. McCormack says “intention” is the crucial concern here. The camera must distinguish someone who is supposed to be in the frame from someone who just walked behind the primary subject, and it needs to know when to zoom automatically or when to swap orientations. “We spent months tuning the parameters on this,” McCormack says. “The system needs to be responsive, but not twitchy. You don’t want the camera to overreact, but when your friend leans into the shot, you want the framing to readjust smoothly.”
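The "responsive but not twitchy" balance McCormack describes is a classic control-tuning problem. This is not Apple's algorithm, but a common way to get that behavior is to ignore small jitters with a dead band and then ease toward the new framing with exponential smoothing, rather than snapping:

```python
# Illustrative auto-framing smoother. The detector and constants are
# hypothetical; the technique (dead band + exponential smoothing) is generic.

DEAD_BAND = 0.05   # ignore target changes under 5% of current crop width
SMOOTHING = 0.15   # fraction of the remaining distance covered per frame

def update_crop(current, target):
    """Advance the crop width one frame toward the detector's target."""
    if abs(target - current) < DEAD_BAND * current:
        return current  # small jitter: hold framing steady ("not twitchy")
    return current + SMOOTHING * (target - current)  # ease toward target

crop = 1.0  # normalized crop width
# Simulated per-frame targets: a friend leans into the shot at frame 3.
for target in [1.0, 1.0, 1.6, 1.6, 1.6]:
    crop = update_crop(crop, target)
# After a few frames the crop has widened partway toward 1.6 and keeps
# converging smoothly instead of jumping in a single frame.
```

The two constants trade off exactly the qualities in the quote: a larger dead band resists overreaction to passersby, while a larger smoothing factor makes the reframe feel more responsive when someone genuinely joins the shot.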
“The camera just has to ‘get it.’ There’s a lot of technology here, and it’s taken us years to refine it,” McCormack concludes. “We needed a bigger sensor, we needed a really fast processor and it took a ton of math. But the funny thing is we did all of this just so it could be effortless. The camera on iPhone is all about letting you capture the moment while you stay in that moment. No settings, no distraction — just pull out your iPhone and let the new Center Stage front camera do all the work for you.”
Image credits: Apple, unless otherwise noted