ABOUT CELADOOR
I take pictures of moments that never happened and landscapes that don’t exist using an unmodified iPhone. The photographs exploit the limits of human imagination in product design, using commodity hardware to create images that look past the familiar photographic output of smartphones and reveal the programming and processing that work invisibly in the background to deliver the food porn and family snapshots that fill our phones. Many images rely on causing failures in the image processing to generate outputs that depart from the visual polish we expect from computational photography.
All the images are generated in camera (with fewer than a handful of exceptions), although the images that I post may be cropped or rotated to an ideal cardinal direction.
One of the most interesting things about this process is the total lack of visibility into the computational system that underlies the iPhone’s panorama process. It is functionally a black box that I feed with light and movement, attempting to nudge the output toward a specific outcome. At the core of this is the iPhone’s panorama feature, which has changed substantially since it was originally introduced, and whose level of abstraction has increased. These changes accelerated with the iPhone 12, though the parts that intersect with my work were only added in a software update in the fall of 2021. All modern in-camera panorama work falls under the broad heading of computational photography; in 2021 the familiar photo-stitching recipe for panoramas started to give way to elements of photogrammetry and depth data that relied upon the multiple cameras on the new iPhones.
NOTE: I have to make a lot of assumptions about what Apple is doing in the panorama software, because they don’t disclose much and because I have only the smallest possible interest in methodically reverse engineering how the feature works -- which is to say, no interest.
Early on I felt I had some sense of what the variables were, but as I have progressed over the last ten years I have found a number of shots that are simply outliers and may not be reproducible, because the number of variables is unknowable.
I am referencing this earlier mint on HEN to some degree, as it demonstrates image capture from an earlier generation of the iPhone panorama feature.
#### Occlusion XIV: capture
This short video is a screen capture from an iPhone during the image capture for the next three pieces in the Occlusion series. You can see the layers of the image build up, but the interaction between the layers is shrouded in the decisions made by the software engineering team behind the panorama feature. [ teia.art/objkt/133296 ]
HISTORICAL CONTEXT
The tradition of generative photography is built on a similar repurposing of hardware and processes. This is a natural part of the adoption curve in technology, where it becomes possible for a curious la(z)yperson to start to misuse the technology. Darkroom photography. Long exposures. Abusing Polaroids. Intentional camera movement. Point and shoot photography. Digital photography. Circuit bending digital cameras. Circuit bending digital video recorders intended for children. The benign and beautiful exploitation of image file formats on home computers. This lineage tracks back to film projection with Anthony McCall and the cooption of overhead projectors for light shows.
My work exists in that continuum. Over the last ten years my personal work has coalesced around the iPhone, creating work that exploits the gaps between the assumptions made by the designers of the computational photography processing at the center of the iPhone’s camera and the actions of this one particular user. The Interlocutor project (on OBJKT) and several curious parties have duplicated the basic process.
These images are built up in a systematic way in the panorama function of the iPhone camera. The patterns created in the panorama application are ingested by the image processing system in the iPhone and output as images because the phone doesn’t know I am doing it wrong. The final image is partly controlled by the input data (color, light, movement, and a broader list of optical effects, including the camera’s reaction to extreme shifts in brightness) and partly controlled by the software on the iPhone, which thinks that this is just another day at the office. This is best described as a form of generative photography in which there is a disconnect between the input and the output.
My work also exists in the continuum of the detritus that I generate (also money). Recent images have been shot with wrappers from candy and Scottish chocolate caramel wafer biscuits as well as labels reclaimed from food and drink.
Originally sent to collectors of the documentation on Rodeo. A free mint to accompany a really cheap mint. 👀