Using An Integrating Sphere To Test Camera Metering In Low Light – Project Overview

(This article is a supporting part of our ongoing testing of low-light camera metering reliability)


What is the dimmest light level that a given camera can meter? For a nightscape photographer, and particularly for someone who films timelapse video with day/night transitions, this question is critical. Normally when I film a timelapse video that starts during the day and ends at night, I start with the camera in aperture priority mode, and the camera compensates as the sun sets.

However, as twilight fades into night, at some point the camera’s metering fails, and I have to switch the camera to manual mode and adjust the settings myself until night has fully fallen. Innumerable exposure errors have occurred because I didn’t switch to manual at the right moment, or because I adjusted the manual settings unevenly.

In recent years, some cameras have become able to meter at night, making the switch to manual mode unnecessary. Manufacturers’ published metering specifications, however, turn out to be far too conservative to tell you which cameras can actually do this. Therefore, while stuck in quarantine in 2020, Sean Goebel and I set out to build a standardized comparison of the low-light metering abilities of different cameras.

To answer this question, we built an integrating sphere. An integrating sphere is a sphere that is painted white on the inside and has LEDs for internal illumination. The camera lens pokes into the sphere. Baffles force light from the LEDs to bounce at least twice before it can enter the camera lens, which makes the illumination inside the sphere almost perfectly even.

In other words, it doesn’t matter where the camera is pointed inside the sphere–it’s all the same brightness. The sphere is carefully sealed so that no outside light can leak in.

 

 

Gray image showing what the camera sees

 

Here are the basics of the metering test: we use an Arduino (an open-source microcontroller) to control the LEDs inside the sphere and trigger the camera being tested. We calibrated the maximum light level of the sphere to be exactly 0 Exposure Value (EV0).

For reference, EV0 is about the light level of a very dim restaurant or a landscape 20 minutes after sunset, and “correct” exposure settings for this are 1/8 s, ISO 6400, f/2.8. The camera is set to aperture priority mode with an f/2.8 lens and takes a photo of the inside of the sphere.

The light level is dropped by half (to EV-1), and another photo is taken. Ideally, the camera detects the dimming of the light and compensates with a longer exposure time and/or a higher ISO. The light level is dropped by one stop again (to EV-2), and this repeats until the ambient light level reaches EV-10. The correct settings for EV-10 are 30 sec, ISO 25,600, f/2.8, appropriate for the light level of a cloudy, moonless night without light pollution. Most manufacturers report that their bodies can meter down to light levels of EV0 or EV1, but as our tests show, most bodies can meter well below this.
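To make the EV arithmetic concrete, here is a minimal Python sketch of the standard exposure relation (the function name is mine, just for illustration, and EV is referenced to ISO 100, as is conventional). It reproduces the two “correct” settings quoted above:

def correct_shutter_seconds(scene_ev, f_number=2.8, iso=100):
    # Shutter time that "correctly" exposes a scene of the given EV, using the
    # standard exposure relation 2**EV * (iso / 100) = f_number**2 / t.
    return f_number ** 2 / (2 ** scene_ev * iso / 100)

print(correct_shutter_seconds(0, iso=6400))     # ~0.12 s, i.e. roughly 1/8 s at EV0
print(correct_shutter_seconds(-10, iso=25600))  # ~31 s, i.e. roughly 30 s at EV-10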

After collecting the 11 photos in aperture priority mode, each with half the light level of the previous photo, we use Python’s RawPy library to calculate the average brightness of each raw file. We plot this against the EV at which the photo was taken. If a camera were perfect, the average brightness of the raw files would be constant, since the camera would double its exposure time or ISO with each new photo to compensate for the decrease in light. In practice, plots look like this:

For the first few dimmings, the camera compensates effectively, and the average brightness of the resulting images is constant. Beyond some point (about EV-6 here), however, the camera only partially compensates for the decrease in light level–when the light drops by 1 stop, the camera adjusts by only 2/3 stop. Further still, the camera can no longer detect any change in brightness and stops adjusting its settings entirely.
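For reference, the per-frame brightness measurement behind these plots can be sketched in a few lines of Python. This is a simplification assuming the rawpy and numpy packages are installed; the function name and the crude black-level handling are mine, not our exact analysis script:

import numpy as np
import rawpy

def mean_raw_brightness(path):
    # Average pixel value over the visible raw sensor area, with a coarse
    # black-level subtraction so that "half the light" shows up as "half the number".
    with rawpy.imread(path) as raw:
        pixels = raw.raw_image_visible.astype(np.float64)
        black = float(np.mean(raw.black_level_per_channel))
        return max(float(pixels.mean()) - black, 0.0)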

Because cameras don’t abruptly change from perfect metering to completely nonfunctional metering, we define two criteria for metering failure.

First, we report the light level at which the camera adjusts its settings by only 2/3 of a stop (or less) for a 1-stop decrease in light; in other words, the first time its compensation falls short by at least 1/3 of a stop.

Second, we report the light level at which the camera produces images that are half the brightness of (one whole stop dimmer than) its EV0 image.
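In code, both failure criteria reduce to simple comparisons on the list of average raw brightnesses. Here is a minimal sketch under that framing (the names are illustrative, not taken from our actual script):

import math

def metering_failure_points(evs, brightness):
    # evs: sphere light levels for each shot, e.g. [0, -1, -2, ..., -10]
    # brightness: mean raw brightness of the frame taken at each light level
    partial_fail = None   # first EV where compensation falls short by >= 1/3 stop
    one_stop_fail = None  # first EV where the frame is >= 1 full stop dimmer than at EV0
    for i in range(1, len(evs)):
        shortfall = math.log2(brightness[i - 1] / brightness[i])  # stops lost this step
        if partial_fail is None and shortfall >= 1 / 3:
            partial_fail = evs[i]
        if one_stop_fail is None and brightness[i] <= brightness[0] / 2:
            one_stop_fail = evs[i]
    return partial_fail, one_stop_fail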

Once you know these numbers for your specific camera, you can translate them (using this article) into the dimmest ambient lighting conditions you can safely capture in aperture priority mode.

For example, if your camera meters correctly at EV-5 but fails at EV-6, that means your camera (with an f/2.8 lens) can correctly meter and expose a landscape lit by a ~50%-illuminated moon, but will fail under a ~25%-illuminated moon, and you should switch to manual exposure on such dark nights.


Integrating Sphere & Camera Metering Test Project

Main Project Page – Test Results

Project Overview – What Is An Integrating Sphere, and How We Used One to Measure Cameras’ Low-Light Metering Capability
(YOU ARE HERE)

Frequently Asked Questions / FAQ

What are EVs, and What do They Mean for Different Cameras? (Non-Technical Explanation)

The Technical Explanation of EVs, and Calibration of the Integrating Sphere

So, How Did You Build an Integrating Sphere, Anyway?

Timelapse Methods Compared: Aperture Priority VS Holy Grail Method


 

So, How Did You Build an Integrating Sphere, Anyway?

(This article is a supporting part of our ongoing testing of low-light camera metering reliability)


An integrating sphere is used for measuring the emission or detection of light. It is a sphere coated on the inside with diffuse white paint, which scatters light uniformly and thereby eliminates directionality effects. Put simply, an integrating sphere has very even internal illumination, and for this project that means it doesn’t matter where a camera is pointed inside it.

A vacuum-rated integrating sphere usable at cryogenic temperatures (which I used extensively in grad school for characterizing infrared detectors) costs tens of thousands of dollars. If you are fine with room temperature and normal air pressure, an integrating sphere can be 3D printed for a few hundred dollars. I could have done that, but part of the fun of this project was to do everything as cheaply as possible. I decided to papier mâché one.

I bought a beach ball at Walmart for $2 and collected the Penny Saver newsprint coupons that came in the mail for a few weeks. I read that flour-and-water papier mâché paste grows mold, so I used wood glue instead.

I did the first layer with white paper to make it easier to paint. The rest used newsprint.

After three couple-hour papier mâché sessions, separated by a day or two of drying, I had a newfound respect for just 3D printing these. Oh well.

Finally, I cut it in half and removed the beach ball. I cut a hole at one end for the lens.

I painted the inside with flat ultrawhite house paint and then sanded it to improve the smoothness. This photo was taken after the first sanding but before the second coat.

Most integrating spheres use many LEDs for even illumination, but I used only two in order to reach the required dim light levels.

I put a 1-stop neutral density gel and a ¾ color temperature orange (CTO) gel over each LED to make it dimmer and warmer in color.

The center ring bounces the LED light back, forcing at least two reflections before the light can enter the lens. I later put a sheet of paper over it to dim the light further.

The initial fit test, before additional painting and the work to make it light-tight.

 

Integrating sphere in use.

Here’s the Arduino (lower) and my interface board (on top). The Arduino is the Duemilanove I originally bought for my timelapse motion project a decade ago. The knurled knob in the top center of the photo is a variable resistor that controls the voltage to the LEDs. It let me adjust the maximum brightness level to EV0, after which I glued it in place to prevent further brightness changes. The dual-pin female plug at right is where the LED connects. The 2.5mm connector at bottom connects to the wire for triggering the camera, and the two integrated circuits alongside it are optocouplers for triggering focus and exposure (equivalent to a half-press and full-press of the shutter button). The pushbutton on the left lets me interrupt the code if I want to go faster. The Arduino provides a USB interface to the user, and the code it runs can be viewed here.

Project price list:

Beach ball $2
3 bottles wood glue $12
Flat ultrawhite paint – sample size $3
Black spray paint $4
Opaque black fabric $4
Arduino Duemilanove $20 if bought new
Paper folder (thin plastic for baffles) $2
LEDs $5 for 30, used 2
Optocouplers $5 for 50, used 2
Variable resistor $4 for 3, used 1
Total $61

 


 

Integrating Sphere & Camera Metering Test Project

Main Project Page – Test Results

Project Overview – What Is An Integrating Sphere, and How We Used One to Measure Cameras’ Low-Light Metering Capability

Frequently Asked Questions / FAQ

What are EVs, and What do They Mean for Different Cameras? (Non-Technical Explanation)

The Technical Explanation of EVs, and Calibration of the Integrating Sphere

So, How Did You Build an Integrating Sphere, Anyway?
(YOU ARE HERE)

Timelapse Methods Compared: Aperture Priority VS Holy Grail Method


 

What Are EVs in Photography, and What Do They Mean for Different Cameras? (Non-Technical Explanation)

(This article is a supporting part of our ongoing testing of low-light camera metering reliability)


Exposure Values (EVs) have two common usages. First, they can quantify the brightness of a scene: higher EV numbers indicate a brighter scene. Second, they can describe differences in exposure settings (for example, exposure compensation). In both usages, a change of 1 EV means the light level has doubled or halved. Below are the approximate EV ratings of an assortment of scenes:

Chart credit: Brent L. Ander

Photographic light meters typically meter a scene and report its EV plus the exposure settings appropriate to capture it. The table below shows what settings correspond to different EV situations.

Chart credit: Brent L. Ander

To use the above chart, suppose you want to know what shutter speed properly exposes a 5 EV scene at f/4 and ISO 400. Look down the ISO 400 column to the row with “5”, then follow that row right to the f/4 column. The appropriate shutter speed is ⅛ second. If this is still unclear, the link gives several more examples.
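If you prefer arithmetic to a chart lookup, the same walk through the table reduces to one formula (assuming the usual convention that the EV numbers describe scene brightness referenced to ISO 100): shutter time t = N² / (2^EV × ISO/100). For the example above, t = 4² / (2^5 × 400/100) = 16 / 128 = ⅛ second, matching the chart.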

Below is a graph of the ambient brightness in Exposure Values for moonlit landscape conditions. You can also use this moonlight exposure calculator to check exposure for a specific aperture, ISO, and moon phase.

Chart created by Sean Goebel

EVs and Your Camera’s Histogram: Different Dynamic Ranges Result In Different Histograms For The Same Light

Because the inside of the sphere is a single uniform brightness, a photo of it produces a histogram with one narrow spike. Due to cameras having different dynamic ranges, however, that lone histogram spike will fall in a different place for one camera versus another.

To demonstrate this, I set a Canon M5 and a Sony A7m3 to the “correct” EV0 settings of ISO 100, f/2.8, 8 seconds, then inserted them into the sphere with it set to EV0. The same 50mm lens at f/2.8 was mounted on both via an adapter.

As you can see, the Sony (left) reports that the scene is ⅓ stop underexposed, but its histogram is perfectly centered. The Canon (right) reports that the scene is 1 stop overexposed, but its histogram is slightly left of center. In both cases the EV0 exposure lands roughly where it should (near the middle of the histogram), but the two cameras have different ideas of where in their dynamic range it should sit: Canon and Sony have simply programmed their metering differently.

In summary, the exposure value of a scene describes its brightness, and it can be converted into camera settings that can capture the scene. However, because different models of cameras have different dynamic ranges, not all cameras will place the histogram bump in the same place, even with the same settings and lens.


Integrating Sphere & Camera Metering Test Project

Main Project Page – Test Results

Project Overview – What Is An Integrating Sphere, and How We Used One to Measure Cameras’ Low-Light Metering Capability

Frequently Asked Questions / FAQ

What are EVs, and What do They Mean for Different Cameras? (Non-Technical Explanation)
(YOU ARE HERE)

The Technical Explanation of EVs, and Calibration of the Integrating Sphere

So, How Did You Build an Integrating Sphere, Anyway?

Timelapse Methods Compared: Aperture Priority VS Holy Grail Method