Apple camera execs talk iPhone 12 in a brand-new interview

"...the company thinks of camera development holistically."

What you need to know

  • The cameras in Apple's iPhone 12 lineup are the best yet.
  • Two of the people responsible for them have been talking to photography site PetaPixel.
  • Apple's iPhone Product Line Manager, Francesca Sweet, and Vice President of Camera Software Engineering, Jon McCormack, discussed the cameras.

The launch of Apple's iPhone 12 and iPhone 12 Pro models brings the best cameras ever attached to an iPhone, and the company has a ton of people working to make sure they're as impressive as can be. Two of those people, iPhone Product Line Manager Francesca Sweet and Vice President of Camera Software Engineering Jon McCormack, have been speaking to photography site PetaPixel.

Straight out of the gate, it's clear that Apple doesn't see the lenses alone as what makes the magic happen. Instead, the report notes that Apple treats camera development as something that involves everything about the iPhone, including the A14 Bionic.

"In an interview with Apple's Product Line Manager, iPhone Francesca Sweet and Vice President, Camera Software Engineering Jon McCormack, both made clear that the company thinks of camera development holistically: it's not just the sensor and lenses, but also everything from Apple's A14 Bionic chip, to the image signal processing, to the software behind its computational photography."

Apple's progress in closing the computational photography gap between the iPhone and Google's Pixel phones has been noticeable over the last couple of years, not least with the addition and steady improvement of Night Mode.

The pair also note that the iPhone tries to do what Apple thinks a photographer would normally do in post. That means applying machine learning to produce a finished image without the need to take shots into an editing app afterwards.

"We replicate as much as we can to what the photographer will do in post," McCormack continued. "There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there."

That's accomplished by taking an image and then breaking it down into its components, allowing machine learning to get to work.

"The background, foreground, eyes, lips, hair, skin, clothing, skies. We process all these independently like you would in Lightroom with a bunch of local adjustments," he explained. "We adjust everything from exposure, contrast, and saturation, and combine them all together."

The full interview is absolutely worth a read, with everything covered in far more detail. There are some gorgeous sample shots showing off what these cameras are capable of, too.
