Why Can’t My Camera Capture the Wildfire Sky?


Today, some cameras and apps allow a user to choose a white-balance preset, such as “daylight.” But despite the seemingly descriptive name, the setting is really just a way for the camera to choose a specific color temperature, not a surefire way to make daytime images look right. Other cameras and apps offer sliders that let a user select a tone, dialing in an appearance that matches an ideal. That’s not duplicitous—it’s what all photographs have always done.
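To make that concrete: picking a white-balance preset amounts to telling the camera what color of light to assume is illuminating the scene, so it can scale the color channels to cancel that tint. The sketch below is a toy Python illustration of the idea; the Kelvin-to-RGB curve fit and the green-normalized gains are assumptions for demonstration, not how any particular camera or app actually implements its presets.

```python
import math

import numpy as np


def illuminant_rgb(kelvin: float) -> np.ndarray:
    """Approximate the RGB tint of black-body light at `kelvin` (a common curve fit)."""
    t = kelvin / 100.0
    r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    return np.clip([r, g, b], 0.0, 255.0)


def apply_white_balance(image: np.ndarray, kelvin: float) -> np.ndarray:
    """Scale each channel so light of the chosen color temperature renders as neutral."""
    r, g, b = illuminant_rgb(kelvin)
    gains = np.array([g / r, 1.0, g / b])  # normalize gains to the green channel
    return np.clip(image * gains, 0.0, 1.0)


# The same pixels "corrected" under two presets: assuming warmer light (3200 K)
# pushes the result toward blue; assuming daylight (5500 K) barely changes it.
scene = np.full((2, 2, 3), [0.9, 0.5, 0.2])    # an orange-ish patch, values in 0..1
print(apply_white_balance(scene, 5500)[0, 0])  # "daylight" preset
print(apply_white_balance(scene, 3200)[0, 0])  # tungsten/indoor preset
```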

I don’t live in a place whose sky is flushed by fire, so I asked the author Robin Sloan, who lives in Oakland, California, to take photos illustrating the phenomenon. The image on the left, below, is from the iOS camera. The one on the right was taken with the Halide app, which offers manual control over camera settings, including white balance.

(Photos by Robin Sloan)

“I would say the reality is about halfway between them,” Sloan told me. He also shared another image taken with a Sony camera set to “daylight” white balance, which made the scene look much stranger than in person. The high contrast in that image, which appears below, tricks the eye into thinking the orange is brighter.

(Photo by Robin Sloan)

For Californians gawking at their fiery sky, an image might never be able to capture the embodied sensation of life beneath it. The smoke would have been moving in concert with the dynamics of the air, for example, causing the apparent colors to shift and dance in person. That phenomenon might be impossible to capture fully in a still image, or even a video. Likewise, the eerie claustrophobia of being surrounded by pure orange wouldn’t translate to a screen, much like a James Turrell installation looks less impressive photographed than in person. The images going viral on social media are evocative. But are they real? No, and yes.


Blaming cameras for their failures or making a “That’s just how photography works” defense of them can be tempting. But images and videos have never captured the world as it really is—they simply create a new understanding of that world from the light that objects emit and reflect.

People who practice photography as a craft think of their work as a collaboration with materials and equipment. They “make” images—they don’t “capture” them—much as an artist creates a painting with canvas and pigment and medium, or a chef creates a meal with protein, vegetables, fat, and salt. But the equipment has become invisible to the rest of us—a window that steals part of the world and puts it inside our smartphones.

The irony is that software now manipulates images more than ever. Today’s smartphones perform enormous amounts of software processing beyond automatically adjusting white balance. Features such as “Portrait” mode, high dynamic range (HDR), and low-light capability strive to invent new styles of pictures. And then there are the filters—whose very name was borrowed from the optical attachments used to color-correct film. They make it plainly obvious that images are always manipulated. And yet, somehow, filters further entrenched the idea that images bear truth. An Instagram post tagged #nofilter makes an implicit claim against artifice: This is how it really looked. But there is no such thing as an unfiltered image, just differently filtered ones.
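That last claim is easy to demonstrate: a sensor’s raw values cannot be shown on a screen at all without passing through some tone curve, and which curve is a choice. A toy sketch, with made-up numbers rather than any phone’s actual pipeline:

```python
import numpy as np

# Stand-in for linear sensor measurements of a scene, scaled 0..1.
raw = np.linspace(0.0, 1.0, 5)

# Two of the many possible renderings. Neither is the "unfiltered" one:
# without some such curve the raw values can't be displayed at all.
neutral = raw ** (1 / 2.2)                                         # plain gamma encoding
punchy = np.clip(1.2 * (raw - 0.5) + 0.5, 0.0, 1.0) ** (1 / 2.2)   # extra contrast, then gamma

print(np.round(neutral, 2))
print(np.round(punchy, 2))
```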
