It’s computational. We’ll get back to that – hi all! It’s #TravelTuesday and I, Dave Williams, am here as always, this week reporting from the UK.
This week I want to touch on a subject that has emerged recently to do with Samsung’s Galaxy phones. It turns out Samsung have been faking their moon photos. This is hardly a surprise to many of us, but I guess the surprise comes more from the fact that the marketing of the Galaxy phones makes out that their detailed, close-up shots of the moon are 100% real. What’s been happening is that the camera app has been trained to recognise the presence of the moon in a photo and then, with the help of AI, switch out the bad version for a good one.
Whether this is right or wrong is an absolutely enormous discussion, but whatever your feelings, this is the future of photography. In fact, the future of photography is that it’s computational. The rapid advancement of mobile photography is where the roots are being laid for this, but I’m sure it’s all going to translate to mirrorless photography. Take a look at this image:
I shot this on iPhone (and I’ll be talking about it at the iPhone Photography Conference), but here’s why it’s computational. I shot it using the Profoto camera app, and the app fired a strobe located in the room in the background, providing the ambient light you see in there. This strobe and the others in the Profoto range are fired from the app, but for that to happen there’s a huge amount of mathematics involved. Bluetooth telling the flash when to fire, the camera being told when to store the image that’s constantly being updated with data from the sensor, and many other factors all come into play. Even deeper than this, we cannot change the aperture on the iPhone camera, meaning we’re changing the other elements of the exposure triad to balance our exposure. OK, so that doesn’t seem like a huge deal, but what about this example:
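To see why a locked aperture forces the maths onto the other two sides of the exposure triad, here’s a tiny sketch using the standard ISO-adjusted exposure value formula, EV = log2(N²/t) − log2(ISO/100). The f/1.78 aperture is a typical iPhone main-lens value I’m assuming for illustration; the function name is mine, not anything from Apple or Profoto.

```python
import math

def exposure_value(aperture, shutter_s, iso):
    """ISO-adjusted exposure value: log2(N^2 / t) - log2(ISO / 100).

    Two settings with the same EV produce the same overall exposure.
    """
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)

# Aperture is locked at f/1.78, so halving the shutter time must be
# offset by doubling the ISO to keep the exposure balanced.
ev_a = exposure_value(1.78, 1 / 60, 200)
ev_b = exposure_value(1.78, 1 / 120, 400)
```

Run it and the two EVs come out identical, which is exactly the trade the phone is making silently every time you shoot.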
This is a long exposure shot on the Even Longer app for iPhone. This photo was made over the course of about 90 seconds on my iPhone 13 Pro Max, with no filter. In most of our minds we envision the shutter being kept open, because traditionally that’s what a long exposure is. Here’s the thing: the iPhone doesn’t allow for anything longer than a one-second exposure. This image, therefore, is ninety images, each exposed for one second, stacked and blended together by the app. It’s made computationally.
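The stack-and-blend idea can be sketched in a few lines. This is my own simplified assumption of how such an app might blend frames (averaging pixel values), not Even Longer’s published pipeline, and the “frames” here are just flat lists of pixel values rather than real images:

```python
def stack_frames(frames):
    """Blend equally exposed short frames into one long exposure.

    Averaging pixel-by-pixel mimics the light-gathering of a single
    long exposure without blowing out the highlights.
    """
    n = len(frames)
    width = len(frames[0])
    # Sum each pixel position across all frames, then divide by the count.
    return [sum(frame[i] for frame in frames) / n for i in range(width)]

# Ninety hypothetical one-second "frames", each a row of four pixels.
# The first pixel flickers from frame to frame; the rest stay constant.
frames = [[i % 8, 10, 20, 30] for i in range(90)]
blended = stack_frames(frames)
```

The constant pixels survive the blend unchanged while the flickering one gets smoothed out, which is why stacked star trails and silky water look the way they do.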
I’ll talk about this in a bit more detail in my class at the iPhone Photography Conference, and I’m sure some of the other instructors will have something to say about it as well. See you there!