Tension - Seven


Curated by tz1PdHWyZmtWntsZmQjrDRRkfGvWSyRVBLoi
Aug 19, 2024 at 10:43 PM

I take pictures of moments that never happened and landscapes that don’t exist using an unmodified iPhone. The photographs exploit the limits of human imagination (in product design) using commodity hardware to create images that look past the familiar photographic output of smartphones to reveal the programming and processing that work invisibly in the background to deliver the food porn and family snapshots that fill our phones. Many images rely on causing failures in the image processing to generate outputs that depart from the visual polish that we expect from computational photography.

All the images are generated in camera (with fewer than a handful of exceptions), although the images that I post may be cropped or rotated to an ideal cardinal direction.

One of the most interesting things about this process is the total lack of visibility into the computational system that underlies the iPhone’s panorama process. This is functionally a black box that I feed with light and movement, attempting to nudge the output towards a specific outcome. At the core of this is the iPhone’s panorama feature. The panorama feature has changed substantially since it was originally introduced, and the level of abstraction has increased. These changes accelerated with the iPhone 12; however, the parts that intersect with my work were only added in a software update in the fall of 2021. All modern in-camera panorama work falls under the broad heading of computational photography. In 2021 the familiar photo-stitching recipe for panoramas started to give way to elements of photogrammetry and depth data that relied upon the multiple cameras on the new iPhones.

NOTE: I have to make a lot of assumptions about what Apple is doing in the panorama software because they don’t disclose much and because I have only the smallest possible interest in methodically reverse engineering how the feature works -- which is no interest.

Early on I felt I had some sense of what the variables were, but as I have progressed over the last ten years I have found a number of shots that are simply outliers and may not be reproducible because the number of variables is unknowable.

HISTORICAL CONTEXT

The tradition of generative photography is built on a similar repurposing of hardware and processes. This is a natural part of the adoption curve in technology, where it becomes possible for a curious la(z)yperson to start to misuse the technology. Darkroom photography. Long exposures. Abusing Polaroids. Intentional camera movement. Point and shoot photography. Digital photography. Circuit bending digital cameras. Circuit bending digital video recorders intended for children. The benign and beautiful exploitation of image file formats on home computers. This tracks back to film projection with Anthony McCall and the co-opting of overhead projectors for light shows.

My work exists in that continuum. Over the last ten years my personal work has coalesced around the iPhone and creating work that exploits the gaps between the assumptions made by the designers of the computational photography processing at the center of the iPhone’s camera and the actions of this one particular user. The Interlocutor project (on OBJKT) and several curious parties have duplicated the basic process.

These images are built up in a systematic way in the panorama function in the iPhone camera. The patterns created in the panorama application are ingested by the image processing system in the iPhone and output as images because the phone doesn’t know I am doing it wrong. The final image is partly controlled by the input data (color, light, movement, and a broader list of optical effects including the camera’s reaction to extreme shifts in brightness) and partly controlled by the software on the iPhone — which thinks that this is just another day at the office. This is best described as a form of generative photography where there is a disconnect between the input and the output.
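The disconnect between input and output can be illustrated with a toy model. Apple's actual pipeline is undisclosed (as noted above, everything about it is an assumption), but the classic photo-stitching recipe pastes together narrow strips from successive frames while assuming the scene stays still. The sketch below simulates that assumption being violated: when the subject moves in step with the pan, the stitcher assembles a "moment" that never existed in any single frame. All names and the one-dimensional "images" here are purely illustrative.

```python
# Toy model of strip-based panorama stitching. The stitcher assumes a
# static scene and pastes one narrow strip per capture step. This is an
# illustration of the general technique, not Apple's implementation.

def stitch(frames, strip_width=1):
    """Concatenate one strip from each frame at the current pan position.

    frames: list of 1-D 'images' (lists of pixel values), one per step.
    Frame i is sampled at offset i * strip_width, mimicking a camera
    panning left to right over a scene it assumes is frozen.
    """
    panorama = []
    for i, frame in enumerate(frames):
        start = i * strip_width
        panorama.extend(frame[start:start + strip_width])
    return panorama

scene = [0, 1, 2, 3, 4, 5]

# Static scene: the panorama faithfully reproduces it.
static_frames = [scene] * len(scene)
assert stitch(static_frames) == scene

# "Doing it wrong": the subject shifts in step with the pan, so the same
# pixel keeps landing under the sampling window. The stitcher, unaware,
# outputs a scene that was never in front of the camera.
moving_frames = [scene[-i:] + scene[:-i] for i in range(len(scene))]
print(stitch(moving_frames))  # -> [0, 0, 0, 0, 0, 0]
```

In the toy model the failure shows up as a flat repetition; in the real feature, feeding the black box coordinated light and movement produces far richer departures, because the modern pipeline layers depth data and photogrammetry on top of simple stitching.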

My work also exists in the continuum of the detritus that I generate (also money). Recent images have been shot with wrappers from candy and Scottish chocolate caramel wafer biscuits as well as labels reclaimed from food and drink.