03 2019
Recreating 1850s Vintage Photography (and IR!)
Using Careful Science and Pretty Faces
While I was visiting San Francisco, Kristy Headley, a dear friend and fellow engineer, showed me her studio. There I was lucky enough to sit for her while she did some vintage tintyping. Tintyping was one of the earliest forms of photography, popular in the 1850s. It was a kind of incredibly inconvenient Polaroid- the photos were exposed instantly and finished after a quick wash, on plates of iron or aluminum (never actually tin). All you needed was a very, very large camera, plates treated with “collodion” to make a light-reactive surface, and a sizeable collection of chemicals. Unfortunately, the genuine process carries a high risk of explosion, cyanide gas, blindness from silver halide splashback, or getting whacked out on ether.
Kristy showed me each painstaking step, and I was enamoured with the result. With her mastery of lighting techniques, she’d produced tintype photographs where every detail popped- every freckle, hair, and eyelash seemed to stand out. Yet, these looked very different from modern photography. There was a certain, seemingly unquantifiable trait to them. Why?
I set out to quantify this ol’ timey look, and to find a way to reproduce it with a digital camera. Perhaps I could share in this delight without keeping jugs of volatile chemicals in my room.
Look to the Spectrum!
Kristy had told me that her exposure plates were only sensitive to certain colors: UV, blue, and some green. There was a specific range of wavelengths this collodion emulsion was sensitive to- one that doesn’t match human vision.
Human vision, as the chart above shows, covers a narrow band of the electromagnetic spectrum. Electromagnetic waves between about 400 and 700 nanometers are perceived by us as light- the seven colors of the rainbow! (An interesting article about this is to come later.) Dip to wavelengths shorter than 400 and we get imperceivable ultraviolet (UV) light; rise to wavelengths above 700 and we get imperceivable infrared (IR) light.
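As a quick reference, the approximate band boundaries above can be sketched in a few lines of Python (the 400/700nm cutoffs are the rough figures from the text, not hard physical limits):

```python
# Rough spectral bands as described above; boundaries are approximate.
def classify(wavelength_nm):
    """Return the band a given wavelength (in nanometers) falls into."""
    if wavelength_nm < 400:
        return "ultraviolet"   # imperceivable UV
    elif wavelength_nm <= 700:
        return "visible"       # the rainbow we can see
    else:
        return "infrared"      # imperceivable IR

print(classify(380))  # ultraviolet
print(classify(550))  # visible
print(classify(850))  # infrared
```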
But what do vintage photographic emulsion surfaces see?
Engineer Niles Lund seems to be the only person who has researched the spectral sensitivity of collodion, the emulsion chemical stew used to make wet-plate and tintype photographs in the mid-1800s. After exchanging a few emails, Niles sent me this updated spectral chart.
So now we know what to recreate if we want “digital collodion”! We need to cut out the upper half of the visible spectrum, allowing nothing above about 520 nanometers into the camera.
I’ve done much work in this field, so I thought this would be a great time to blend science and art to explore this concept, and compare it to other hyper-spectral imagery!
Setting up an experiment
Now we invite some friends and models to come sit for us! I asked the models to avoid skin-concealing makeup and zinc-oxide sunscreen (one of the few sunscreens that blocks UV-A rays). They modeled under a broad-spectrum light source (the sun), with a silver reflector as a secondary light source, and held a pose for a few minutes while I photographed each under the following conditions:
Visible Spectrum
This is what humans see (obviously). Using three cones, sensitive to three different ranges of wavelengths that loosely map to red, green, and blue, we see everything from violet (the shortest wavelength) to red (the longest). Not surprisingly, this is what digital cameras see as well. Silicon sensors, whether CCD or CMOS, can take in a theoretical 290nm to 1100nm. However, it would be maddening to have your photography overexposed, blooming, and flaring due to light waves you cannot see, to say nothing of the threat of overheating, so sensors are limited by optical filters placed directly over them to photograph, more or less, only the human visible spectrum. Photons that don’t play nice in this range are either bounced back out of the lens, or absorbed and destroyed.
Here’s what our model Kayla looked like in the visible spectrum:
Orthochromatic Blue (Collodion AND Squid Vision)
This could be seen as the same spectrum that octopus and squid might view in. Humans, except for rare and awesome mutants and the colorblind, are “trichromatic”, with three cones that let in three different ranges of wavelengths. Cephalopods have only one- the S cone, which is sensitive in almost exactly the fashion of the spectral curve of 1850s collodion (except for UV). They might make a different mechanical use of their eyes that could allow for more color vision, but I digress…
By using Schott BG25 bandpass glass (bought off a gentleman who fits esoteric German glass into filters) on a lens that cuts UV (any normal lens), on any unmodified camera, we get a theoretical spectral sensitivity that looks like:
Of course, all the photons that this can transmit that the camera can’t capture become irrelevant, so on an unmodified camera, we will get a spectral response that looks like:
This means that on an unmodified camera, using a lens that doesn’t block UV, we’re going to get an image where the focus is more on near-UV damage to the skin- concentrations of freckles and blemishes are going to be much more pronounced. There’s going to be a light leak of very red photons, and we’ll be missing much of the UV range, but we’re very close to vintage emulsion now.
Our model, under the same lighting conditions (with an exposure compensation of +.7, calibrated from an 18% gray card):
By gum, we’ve done it! With that nanty narking, I think we can say we’ve captured the ol’ timey feel of collodion!
…but it is not scientifically accurate. This is about the best we can do on an UNMODIFIED camera, as we need some expensive modifications to allow UV to extend into the sensor. More on that later.
Let’s see what she looks like in some other, more bizarre wavelengths!
Near Infrared and Short Wave Infrared (Snake Vision)
This is, likely, what snakes see: the orange-and-red end of the human-visible spectrum plus the near infrared (NIR) just beyond it. This is what I like to photograph foliage in, because it can be false-colored and remapped using the color channels your camera already knows how to deliver data in. I have a Nikon D80 that has been modified to shoot in this wavelength range, with an upper infrared bound (unmeasured) limited only by the physical limitations of a silicon CCD.
For some reason, for the duration of this experiment, this setup absolutely refused to focus properly. This camera does not have live-view, and it is very difficult to assess these images until they’ve been processed on a computer. Nonetheless, these blurry images were interesting…
Short Wave Infrared
I don’t think anything “sees” solely in this spectrum, though it is a difficult concept to explore, so perhaps time will tell. To capture the full breadth of short wave infrared, or to move into longer wave infrared, we would have to use a special sensor made out of indium gallium arsenide. It’s interesting to see that any sun damage and melanin deposits, such as freckles and tans, will not appear in this wavelength. The IR photons do a good deal more sub-surface scattering in the skin than visible light, rendering each person as a waxen figure.
Polishing these photos up with a bit of vintage-style split-toning
You might have noticed these are not delivered in monochromatic black and white, as in the style of a silver gelatin process. Instead, I chose to render them out with a bit of split-toning. This has been a common practice since the birth of photography, using various methods to bring color to the highlights and to the shadows.
In this case, I made the highlights gold to mimic the flame-gilding of the mid 1800s, which “fixed” the highlights of a photograph using a gold chloride solution and an open flame. As ambrotypes caught on in the 1860s, “ruby” hued backs became more common in photography- so I enjoy using a red tint in my lowlights.
A Horrible, No-Good Problem Arises in the Bayer Filter!
The images I captured occasionally showed very odd artifacts. Fine hairs would get lost, and appear in a fashion that almost looked like… a shadow? But why would such a thing happen?
Then I realized we’ve surpassed the intention of a digital sensor, and we’re paying the price.
All digital sensors are actually monochromatic- black and white. In order to capture color, a translucent, microscopic grid of red, green, and blue tiles is spread across the sensor, with one tile to each pixel. This is a “Color Filter Array”, or CFA. Conventionally, this grid is laid out in a “Bayer pattern”, named for Bryce Bayer of Eastman Kodak, who conceived of it. This red-green-green-blue pattern mimics the human eye’s sensitivity (a range where two of our three cones’ sensitivities overlap), allowing more green to be considered by the camera. Interestingly, it was originally supposed to be cyan-magenta-yellow, to better subtract out the necessary wavelengths, but the technology to make such a CFA simply did not exist at the time of its inception.
The sensor still functionally reports the incoming light as black-and-white, but now the camera, or camera raw processor if you move it to a computer, performs a process called “demosaicing”, or “debayering”. This reconstructs the image in four pixel groups, using a variety of different methods, depending on the camera and the software, finally outputting RGB channels in full resolution.
This means every digital photograph you have ever taken has functionally been reconstructed from quarter-resolution samples! Obviously, we haven’t noticed, so we don’t really mind- but when we’re missing an entire wavelength range (red) and sharply attenuating another (green), we’re effectively using only one out of every four pixels.
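To make that one-in-four figure concrete, here is a minimal sketch in Python with numpy. It assumes an RGGB layout (actual CFA layouts vary by camera) and shows that under blue-only illumination, real signal lands on just one pixel of every 2x2 block:

```python
import numpy as np

def bayer_mask(h, w):
    """Boolean masks for the R, G, and B sites of an RGGB Bayer mosaic."""
    r = np.zeros((h, w), dtype=bool)
    r[0::2, 0::2] = True            # red at the top-left of each 2x2 block
    b = np.zeros((h, w), dtype=bool)
    b[1::2, 1::2] = True            # blue at the bottom-right
    g = ~(r | b)                    # the two remaining sites are green
    return r, g, b

h, w = 4, 4
r, g, b = bayer_mask(h, w)

# A uniform scene lit only by blue light: R and G sites record ~nothing.
scene = np.full((h, w), 100.0)
mosaic = np.where(b, scene, 0.0)

print(b.sum() / (h * w))  # 0.25 -> only one useful pixel in every four
```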
Using RawTherapee, the discovery of which was a definite win in this project, I could explore my sensor data before it was demosaiced and processed into the image I would see on a screen. As you can see above, it was disheartening.
What is a demosaic process supposed to use to rebuild the image? Without the appropriate data, it simply uses blank black pixels to try to interpolate data. This is going to get weird. Without writing your own demosaic algorithm, and accompanying debayering software to run it, there is no fix for this. (Update: I fixed it by writing my own demosaic algorithm and accompanying debayering software.)
Images captured in the “digital collodion” spectrum on a color digital camera will always get a little muddled when opened by your camera or photo editing software because of this simple fact. The luminance will be interpreted as ~60% less than it is, artificially flattening an image. Details will be lost, and reconstructed as goo.
Immediately after posting this, I thought to look at the spectral response of each of the Bayer tiles. It turns out the blue tile perfectly matches the orthochromatic technique, so I can effectively write an algorithm that simply discards the other three pixels, reducing this to a highly accurate quarter-resolution image. Updates to follow.
Update: Fixing the Debayer Problem!
After writing a surprisingly simple script in MATLAB, which did an embarrassingly simple downsample to throw out the three pixels that were not relevant to the spectrum we captured in, we have a solution. I cannot stress enough: we got lucky. This almost teaches bad color science- you typically cannot select a wavelength in post. We just got very, very, comically lucky and found that my Nikon D610 uses the same spectral sensitivities in the blue Bayer tile as collodion once did. That is an odd coincidence, to say the least. This is worth researching more later.
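As a rough illustration of the approach (re-sketched here in Python with numpy rather than MATLAB, and assuming an RGGB layout with the blue site at position (1,1) of each 2x2 block- check your own camera's CFA layout before relying on this), the downsample amounts to a single strided slice:

```python
import numpy as np

def blue_sites_only(raw):
    """Keep only the blue Bayer sites of a mosaiced sensor readout.

    raw: 2D array of raw sensor values with even height and width,
    assumed to be laid out RGGB. Returns a quarter-resolution
    monochrome image built from the one tile per 2x2 block that
    actually saw the orthochromatic-blue spectrum.
    """
    return raw[1::2, 1::2]

# Toy 4x4 "raw file" so the slice is easy to follow.
raw = np.arange(16, dtype=float).reshape(4, 4)
mono = blue_sites_only(raw)
print(mono.shape)  # (2, 2) -> half the width, half the height
```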
I will later rewrite this script for free interpreters, so that anyone can follow in my footsteps without spending money on expensive coding platforms. A full new article will be posted exploring this. Update: You can find the code, run on freeware, to perform this in my next post.
The Results are Stunning
The differences far surpassed what I expected. Pleasantly, the math suddenly matched what my light meter had suggested in camera. Now, without any exposure adjustments, the photograph matches the levels of the full-spectrum photo- remember, I had already taken exposure compensation into consideration, allowing an extra 2/3 of a stop of light, when I took the photo.
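For anyone checking that exposure math: compensation is measured in stops, and each stop doubles the light reaching the sensor, so the +2/3 stop used here (roughly the +.7 set on the camera) works out to about a 1.59x light multiplier:

```python
# Exposure compensation in stops -> multiplier on light.
# Each full stop doubles the exposure.
def stops_to_factor(stops):
    """Light multiplier for a given exposure compensation in stops."""
    return 2.0 ** stops

print(round(stops_to_factor(2 / 3), 3))  # -> 1.587
print(round(stops_to_factor(1.0), 3))    # -> 2.0 (one full stop)
```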
There’s no bringing back the fine details- those little wisps of hair are lost to the ages, but they didn’t become a gooey mess. So much detail has been restored to the photo, I am absolutely stunned. I did not expect this level of improvement. I went back and reuploaded this spectrum in the earlier section with Kayla.
One more comparison:
Perhaps my short-wave infrared photos could benefit from this process as well?
All of the Photo Results
All results delivered as near to from-camera as possible, using small exposure adjustments to match skin tone. When gradient adjustments were used, they were used uniformly across all photos of the model, to keep lighting consistent between wavelengths.
Kaitlin
Kevin
Christine
Xach
Caitlin
Kayla
Next Steps to Take it Further
In order to fully and accurately capture the collodion sensitivity range, I need to not only use a lens that allows ultraviolet photons to pass, as used in this experiment, but a camera with a sensor modified to receive UV light. Because this is not a fast or cheap prospect, I’ve been dragging my feet to see if this could be done in conjunction with another modification…
As photographing humans draws no benefit from false color, hyperspectral photography (such as UV) is best delivered in black and white. Due to the Bayer-pattern problems described above, having a color sensor actually hurts collodion-spectrum portraiture- if the reconstruction of the image relies on a missing red channel, then we need a monochromatic sensor to gather an image through these filters with due sharpness. So we have two options (update: a third option has arisen): buy a monochromatic camera, which can run between 3,000 and 50,000 dollars, or do a “mono-mod” to scrape the Bayer filter off an existing camera. Conversion shops have assured me this is impossible on the Nikon cameras I like to use, but I hold out hope for a “full spectrum” monochromatic modification. Alternatively, now that a custom debayer option exists that quarters the resolution, I can look into intelligent upscaling, possibly using AI to regain the resolution lost in the process.
Final Thoughts
- As I used attractive models in this experiment, instead of the lab tools one should use to characterize optics, the spectral range of each photograph is an educated guess at best, based on theoretical readings. Lab characterizations will follow.
- Trust the math. I know things look underexposed and incorrect, but if you’ve set your exposure compensation based on an 18% gray card- or, realistically, ignored that and just set it to +.7- then you can trust your light meter when shooting in “orthochromatic blue”.
- Don’t trust the Bayer filter. It’s bad and it should feel bad. Consider this in any hyperspectral pursuit, and in any pursuit where you limit your input wavelengths with an optical filter.
- Sometimes there’s no substitute for the real thing- recreating wet-plate collodion and tintypes is severely limited by digital considerations- but limitation is the mother of invention!
Recreating the Look of 1850s Tintypes in Digital with Math and Science - NewsGroove Uk
[…] note: A longer and more comprehensive version of this article can be found on Warren’s website. In it, Warren also discusses challenges he encountered with Bayer filters and how he wrote his own […]
David Perry
Really impressive, I’ve ordered the BG25 filter and have a Canon camera with the color filter removed. What lens would you recommend? The filter is 52mm.
Phil Warren
Assuming you have a full-spectrum modification (which is likely, unless you specified otherwise when you had the modification done), I would recommend you look at lenses that let UV spectra pass through. Glass blocks anything below about 200nm, but you’ll probably find the 200-400nm range really adds to simulated tintype photography.
I strongly recommend the Nikkor F/1.8D with a mount adapter.
For a comprehensive list, see:
https://kolarivision.com/uv-photography-lens-compatibility/
Phil Warren
Also, look into step-down or step-up converters to fit the filter to your lens. Unfortunately it’s difficult to have anything other than a 52mm filter made for BG25, but hopefully this article brings enough attention to the glass to make it worth producing a few more!
Helge Mruck
Fascinating research! I had no idea the demosaicing would have such a destructive effect.
I did know about Nikon sensors being impossible to turn monochromatic – I’ve tried!
The issues with focusing in snake-vision infrared: since, due to the difference in refraction, the infrared image requires focusing at a different distance than the visible spectrum, would it be worth trying to focus halfway between the two and stopping down the aperture? But I’m sure you’ve already tried that!
Many thanks again for this highly informative article, and I look forward to trying out your custom debayer!
Phil Warren
I’ve begun to question whether, at close range, the near infrared + short wave infrared can even be focused appropriately while wide open- but more likely I was focused more on the science than the artistry of this shoot, as I did notice the later subjects were sharper. Either way, I’d certainly recommend stopping down more when shooting in IR!
David Bateman
Interesting article. I look forward to seeing your script- will you make it available to be used with RawTherapee? Or will you write a custom script for the dcraw command line?
You may have been lucky, but there is some consistency in the blue dyes used in cameras following that curve. Red and green dyes are also responsive further into the UV. You’ll typically see a false-color yellow from 350nm to about 375nm; blue is above that. Mostly only green is responsive below 340nm, so images in the UV-B range are mostly green.
As for monochrome cameras, you are forgetting lower cost Astrophotography cameras. You can get a ZWO monochrome for under $1000. I would love to play with a ZWO 1600mm, but still too expensive for me. Conversion sites I know of are maxmax (ldp) and monochrome imaging services. But the latter only really converts Sony cameras. However, the microlenses are removed during these conversions and they do not appear to affect the UV transmission. Maybe even increase UV transmission to the photosites. So best to get a monochrome camera.
For more Uv related information, I would recommend looking at Ultraviolet Photography.com, a great resource.
Phil Warren
You’ve given me a lot to think about! I hadn’t considered writing the script directly for RawTherapee- that seems the most considerate thing to do, since I know the command line is not a comfortable place for everyone. I will look into that! If I have time in the next few days, I’ll make a quick post with a dcraw/ImageMagick script, though I have my doubts about the endian considerations of dcraw, which make me distrust it as a 16-bit solution, and I hate encouraging quantization errors.
I’ve been talking to maxmax, and researching thoroughly on UltraVioletPhotography.com, but had no idea about the dye transmissivity!
I appreciate your feedback, hope to hear from you more as I do more writeups!
Recreating the Look of 1850s Tintypes in Digital with Math and Science – News Tech Dude
[…] note: A longer and more comprehensive version of this article can be found on Warren’s website. In it, Warren also discusses challenges he encountered with Bayer filters and how he wrote his own […]
Karsten Bruun Qvist
Thanks for bringing attention to the problems demosaicing can cause when there are no real data collected behind some of the color patches. I do a fair amount of IR work, and would certainly be interested in checking whether image quality would improve by taking this into account. Look forward to reading more about this work!
Phil Warren
I will let you know! I’ve read some accounts that suggest the bayer dyes are equally transmissive of IR, so it becomes irrelevant, but I haven’t seen the data to back this claim up. Later articles will explore!
Phil Warren
Following up, months after the fact- it turns out IR has the opposite problem from this “digital collodion”. The CFA (Bayer pattern) is more-or-less equally transmissive of IR, meaning each channel will capture a nearly identical amount of energy. Unfortunately, the Bayer pattern is not tuned for this behavior either, but its effect will be barely noticeable. This opens doors for some very interesting experimentation; an article on this is coming shortly.
Demosaic for Tintype or Blue-Spectra Photography @ Phil Warren Photography
[…] Recreating 1850s Vintage Photos […]
Rainer
Thanks so much for that article.
Because I’m interested in digital collodion pics, I’m searching for the filter.
Would the BG3 also be suitable, compared to the BG25?
I ask because I could get the BG3 in 58mm.
Thanks in advance for a short reply.
regards
Rainer
Phil Warren
Rainer- BG3 is a great filter too- you’ll see a more pronounced effect from it, exaggerating a little beyond what collodion would have done. BG3 will not allow ANY green to hit your sensor, whereas collodion allowed some green. If you are shooting on a modified camera, you could theoretically let more UV spectra in as well. In short, you’ll exaggerate the artistic intent of your project, likely with stellar results, but shift slightly away from historic re-creationism.
Note: if you are shooting on a modified, full-spectrum camera, you’ll need to stack something that blocks IR on top of that, probably an S8612 or BG50.
I found this chart to be kind of helpful when comparing Schott glass. https://www.us.schott.com/d/advanced_optics/c7ef378d-ea1b-4c36-9cd1-79a379804dca/1.5/schott-optical-filter-diagram-december-2012-eng.pdf
Colorizing Infrared with AI @ Phil Warren Photography
[…] Recreating 1850s Vintage Photos […]
Bookmarks for May 3rd through May 4th : Extenuating Circumstances
[…] Recreating 1850s Vintage Photography (and IR!) @ Phil Warren Photography – […]
Alexander Rutz
Dear Phil,
thanks for your informative and detailed article, this is something I wanted to try myself for some time too.
But then I was wondering if it’s not easier to actually take advantage of the sensor’s RGB-filter, rather than trying to get rid of it.
With it we actually have a pretty effective way of getting only the near-UV spectrum: the B channel. It is not cut off at the same point as the BG25 you use, but the results are quite similar.
A quick test with a Fujifilm RAF file in Iridient Developer rendered a reasonably tintypeish image. Drop me a line and I can send you some screenshots.
All the best!
Alex