Astrophotography 101

In short:  Astrophotography relies on letting as much light into your camera as possible before the earth rotates enough that the celestial bodies begin to leave streaks - which typically happens in under 30 seconds, depending on your lens.

Finding a Dark Sky

Mastery of your camera makes absolutely no difference if you aren’t shooting in an environment with very little light pollution.

Shoot Without the Moon

Moonlight washes out the faint stars, so ideally you shoot between sunset and moonrise.

Don’t Forget the Weather

Living in California as I do, it becomes very easy to forget that weather exists.  The skies are generally clear at night here, but when travelling this isn’t the case.  Additionally, wind storms in the desert (the darker spots of California) kick up quite a bit of dust, which can become a cloying haze in even simple astrophotography.

Avoid Light Pollution

This is simple: check darkskyfinder.com and find a place that is very, very dark.  Interestingly, these line up with ghost towns with some frequency.

Your Camera Settings

To better understand astrophotography, you need to assess the ideal settings, and then understand why those simply won’t work.  Then we can dial in actual, pragmatic settings that will work.

The Ideal Settings

F-Stop

You want those stars to be as sharp as possible, right?  And your lens likely has the best ratio of sharpness to depth of field somewhere around f7.

Shutter speed

You want to let in as much light as possible, and with such a low ISO, you probably want to leave the shutter open as long as you can before the stars start to streak - right up at that 30-second ceiling.

ISO

You want as little noise as possible, because in post, trying to differentiate between faint stars and noise may become difficult.  Minimize noise by setting the ISO as low as possible, and on most cameras the lowest is ISO100.

Stop Dreaming

Try these settings, and see what you get.  Blackness, with no stars.  Clearly these settings don’t quite work, but we can reasonably assume that the two most important settings are the shutter speed and ISO.

Actual settings that work

F-Stop

Open up that aperture as much as possible.  You are desperate for every bit of light, so open it up as far as the lens will allow-  f2.8 or lower, if you can.

Shutter Speed

Here you’ll need to use something called the “500 Rule”.  Simply put, determine your maximum shutter speed time by dividing 500 by your lens’ focal length.  I like to shoot on as wide a lens as possible for this reason, and with my Rokinon 14mm, I can shoot at (500/14), or 35 seconds.  I cannot recommend the Rokinon 14mm enough for astrophotography.
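If you’d rather not do the division in your head, the rule is trivial to sketch.  (The crop-factor parameter is my own addition, not part of the rule as stated above - on a crop-sensor body, you divide by the full-frame-equivalent focal length.)

```python
def max_shutter_seconds(focal_length_mm: float, crop_factor: float = 1.0) -> float:
    """500 Rule: the longest shutter time before stars visibly streak."""
    return 500 / (focal_length_mm * crop_factor)

print(round(max_shutter_seconds(14), 1))       # 35.7 - the ~35 seconds quoted above
print(round(max_shutter_seconds(14, 1.5), 1))  # 23.8 - the same lens on a 1.5x crop body
```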

ISO

You very likely need to set this near your camera’s max - try around ISO3200.  The lower the better, obviously, but at f2.8 with your shutter time capped by the 500 Rule, you need the sensitivity to capture any stars at all.

Try to hit an Exposure Value (EV) of -7

The exposure value is a universal number for how bright an exposure is.  There’s a formula to calculate it.  You don’t need to know it.  You can use a calculator, which also comes with a more in-depth explanation of astrophotography:

Milky Way Exposure Calculator
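For the curious, the formula behind that calculator is simple enough to sketch: EV = log2(N²/t) at ISO100, minus one stop for every doubling of ISO.  Plugging in the working settings from above (f2.8, the Rokinon’s 35 seconds, ISO3200) lands right around that -7 target:

```python
import math

def exposure_value(f_stop: float, shutter_s: float, iso: int) -> float:
    """EV = log2(N^2 / t) at ISO100, shifted down as ISO rises."""
    ev_100 = math.log2(f_stop ** 2 / shutter_s)
    return ev_100 - math.log2(iso / 100)

print(round(exposure_value(2.8, 35, 3200), 1))  # -7.2, close to the EV -7 target
```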

Consider your foreground

You might find that shooting the dizzying expanse of the Milky Way gets old quickly - without capturing the environment you are experiencing it in, these shots often lack impact.

Consider tree silhouettes, models on rock outcroppings, or interesting structures.

Think about painting the environment with flashlights, or sprinting around firing your flashes during the long exposures.

 

Shooting in Infrared: So You Want to Photograph Things You Can’t See

If you want to skip the boring stuff:  IR photography is real complicated and involves breaking every conventional way you would use a camera.  Here’s how to do it:

Infrared (IR) photography is kind of the opposite of photography.  It’s important to know that both in form and in function, IR photography breaks all the rules of photography.  Instead of capturing the world as you see and experience it, and how others might see and experience it, you are creating an image using wavelengths the human eye could never see.  You aren’t forging a connection with the world-  you’re breaking one.

As such, know that your camera, regardless of what it is, is not designed to capture, record, or process infrared.  Absolutely every step of the process is either breaking or circumventing the way image technology is supposed to work.

What is Infrared?

Light, being a form of electromagnetic radiation, has a wavelength.  This wavelength dictates what color we perceive the light to be, largely due to how those photons (which, let’s not get into it, are particles that are also waves) tickle the cones in our eyes.  We have three types of cones, and if you plot out the tristimulus model for light perception, you see that the three different cones allow us to perceive color fairly evenly-ish between 390 and 700 nanometers.  But, of course, waves can exist well outside this boundary.  The shorter wavelengths, below 390nm, are even shorter than violet, and are commonly called “ultraviolet” (UV).  The longer wavelengths, above 700nm-ish, are longer than red, and are called “infrared” (IR).  Our eyes can’t see these, but an electronic sensor theoretically can!  In fact, through a process called “hyperspectral imaging”, you can capture whichever wavelengths you want - with a sensor designed appropriately!

This includes Near Infrared (NIR), Shortwave Infrared (SWIR), Mid-wave Infrared (MWIR), and Longwave Infrared (LWIR) - acronyms you might see on more pedantic websites.
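If the acronym soup helps, here’s a rough map of those bands.  The numbers below are commonly-quoted ballpark figures, not a standard - different sources draw the lines differently:

```python
# Commonly-quoted IR bands, in nanometers (boundaries vary by source)
IR_BANDS_NM = {
    "NIR":  (700, 1_400),
    "SWIR": (1_400, 3_000),
    "MWIR": (3_000, 8_000),
    "LWIR": (8_000, 15_000),
}

def classify_ir(wavelength_nm: float) -> str:
    """Name the IR band a wavelength falls into, per the table above."""
    for band, (lo, hi) in IR_BANDS_NM.items():
        if lo <= wavelength_nm < hi:
            return band
    return "outside these bands"

print(classify_ir(850))  # NIR - roughly where a 720nm-filtered camera shoots
```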

Of course, any image you capture in non-visible light needs to be re-mapped, or “false colored”, so that you can present the image in the visible spectrum.  The wider the range of wavelengths you capture, the more you have available to remap.

Why Can’t My Camera Shoot Infrared?

The sensor on your camera can fudge fairly far on either side of the visible spectrum.  Obviously, it did not behoove any camera sensor manufacturer to tune a sensor to capture wavelengths outside of what we’d be able to see, so instead they created sensors that roughly cover where our eyes would see.  Roughly.  There’s some wiggle room, quite a bit of wiggle room, on both sides of the visible spectrum.

But it would do your capture process no good to have all these spare photons bouncing about, especially as those photons needlessly impart a good deal of heat energy.  As such, the camera manufacturers place a dichroic filter, or “hot mirror”, in front of the sensor that bounces that useless IR light right back out of the lens.  This filter sits in a stack that’s often collectively called the “Optical Low-Pass Filter”, or OLPF.  There’s also a nice piece of glass - well, quartz, actually - sandwiched into that stack, which creates a blurring/anti-aliasing effect that a normal photographer would find necessary to keep from generating slightly bizarre artifacts in fine detail.  Basically, you’ve got an optical wafer in front of your sensor that keeps your camera from recording wavelengths beyond human vision, and this wafer can take different forms with different sensitivities depending on the camera.  Sometimes they don’t work so perfectly, which allows us to perform some trickery to capture IR on an unmodified camera!

 

Forcing a Camera to Shoot Infrared!

Without modifying the camera

Depending on your camera, the imperfect OLPF might let in enough photons to capture IR, if you set your exposure long enough!  In fact, some cameras don’t even bother blocking IR until it passes 750nm, though most begin to cut around 700.

What this means is, if you purchase an IR filter above 700nm, such as the cheap and easily-available R72, which prevents any photons UNDER 720nm from passing, you can set your exposure to a much longer period and let the photons that sneak past the interior OLPF trickle in.  You could call these filters optical high-pass, longpass, or infrared bandpass, and in my experience they seem to be far more effective than the optical low-pass filter inside your camera, which is why the long exposure trick works.

This has a few problems

  1. You won’t get much of a color range to false color later.  Wavelengths under 720nm disappear, obviously, with an R72.  You can’t really use a shorter-wavelength filter, because your camera is going to be far more sensitive to the 500-700nm wavelengths that aren’t being cut by the OLPF.  That means if you used, say, a 560nm filter, you’d get a crapton of “red” photons before you ever got an adequate amount of IR photons.
  2. You can’t autofocus with the filter on.
  3. You can’t really see through your camera at all.  Because the IR filter blocks visible light, if you look through the lens you’ll simply see black.
  4. You’re going to have to find the proper exposure through trial and error.
  5. Because you are functionally shooting long-exposure, even if only a quarter of a second, you are going to experience blur on anything moving, and will need a tripod to combat camera shake.
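Problem 4, at least, can be systematized a bit: pick a starting guess, then bracket upward by doubling the shutter time - one stop per shot.  A small sketch (the starting guess is arbitrary):

```python
def bracket_exposures(start_s: float, stops: int = 4) -> list[float]:
    """Shutter times from a starting guess, doubling (one stop) per shot."""
    return [start_s * 2 ** i for i in range(stops + 1)]

print(bracket_exposures(0.25))  # [0.25, 0.5, 1.0, 2.0, 4.0]
```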

The actual steps to shooting with an unmodified camera and a high-pass lens filter are simple:

  1. Compose your shot on a tripod and set focus.
  2. Affix a 720nm high-pass lens filter.
  3. Make up some wackadoo exposure (I’m sure there is a multiplier for your normal exposure, but I haven’t figured it out yet).  I found 1/4 second at ISO320 and f2 to work well.
  4. Shoot, ideally with a cable release to prevent camera shake.
  5. “Chimp” and see what the camera actually recorded.  You should see a deep red image with black blacks and near-white highlights.  If you don’t, adjust your exposure and try again until it works.
  6. Go on to the steps to channel swap and process the photo. (post inbound, stay-tuned!)
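Until that post lands, the heart of the channel swap in step 6 is just trading the red and blue channels.  A minimal sketch with numpy (the pixel values here are illustrative, not from a real capture):

```python
import numpy as np

def channel_swap(rgb: np.ndarray) -> np.ndarray:
    """Swap red and blue channels (RGB -> BGR), the classic IR false-color first step."""
    return rgb[..., ::-1]

# A deep-red pixel - roughly what an R72 capture looks like - becomes blue.
pixel = np.array([[[200, 40, 40]]], dtype=np.uint8)
print(channel_swap(pixel)[0, 0].tolist())  # [40, 40, 200]
```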

 

Modifying the Camera to Properly Shoot Infrared

If you want to remap the infrared spectrum between 560 and 1000nm, you are going to have to modify your camera and tear out the optical lowpass filter deep inside its guts.

I do not recommend attempting to modify your camera yourself.  I work in a lab where we perform research and development on image capture devices, so I, in theory, have more resources than the normal human to perform such a conversion, yet I deemed it too risky to perform myself.  I seldom deem anything too risky to perform myself.  But this isn’t a simple tooth extraction or appendectomy.

If you accidentally discharge a capacitor in your camera while tinkering, you could die.  Literally, no-foolin’, no takesies-backsies, insta-death.  But, laughing in the face of death as us photographers so often do, let’s move on to an even bigger risk.

Your camera might never achieve focus again.

Really, if you give anything a good knock, shock, or sneeze at the wrong time, your camera is not only not going to shoot IR, it’s never going to shoot anything ever again.  It will cease to be a camera, and become a conversation piece about what could have been.

Moreover, the replacement glass you need to take the place of the OLPF inside your camera is expensive and difficult to find - you’re going to need to order it from a company that will do these conversions for you anyway.

Here’s ultimately what needs to happen:  You need to get that OLPF out of your camera, and have it replaced by a piece of glass that allows infrared through.  That glass needs to have a similar, perhaps identical, thickness to the OLPF that has been removed, or else you’ll need to recalibrate focus, which I do not know how to do at the scale we’re talking about here.  Any conversion service will handle that for you.

You can replace the OLPF with a few things.

  • Clear, no-filtering-whatsoever glass.  Your camera now captures all of the light spectrum, forever.  This seems the most attractive option at first glance, because you can now shoot UV, NIR, IR, or any crazy thing you can dream of, so long as you have the correct filter placed over your lens, which you can easily change.
    • UV lens filters are so expensive that it’s doubtful you’ll want to drop an additional $600 on a filter, after having paid for this conversion (and I’m not certain the OLPF got in the way of using this anyway).
    • Like shooting with a pre-conversion camera with a lens filter, you’ll be unable to look through the viewfinder.
    • Like shooting with a pre-conversion camera with a lens filter, you’ll be unable to use autofocus.
    • You will require a bandpass filter for every lens you want to use.
  • Ultralowpass filter – Pass ~214nm to ~360nm, block everything else.  This range is ultraviolet, and nothing else.  I’m including this in the tutorial because it’s the same philosophy, but this glass is hard to find and prohibitively expensive.  I have no experience with this, nor have I ever met anyone who does, so I am speaking hypothetically.
  • IR bandpass filters - Using one of these means you won’t need a filter over your lens at all, let alone one for every lens you own.

The two main companies seem to be Kolari Vision and LifePixel.  I myself sent away to have my Nikon D80 converted by Kolari.

The actual steps to shooting with a camera modified with an internal IR bandpass filter are even simpler, but still have some caveats:

Set an exposure compensation.  This may differ based on your camera, but I use around -0.7, and from there I can use my light meter as if everything were completely normal.

Keep your ISO at 100.  Always.  Seriously.  The slightest bit of noise is going to destroy this.

Shoot with prime lenses, and don’t trust the focus through the lens!

  • A material’s index of refraction (IOR) varies with the wavelength of the light passing through it.  Anyone outside the field of optical engineering probably won’t need to think about this, but it is why we see rainbows when white light hits a prism.  Your camera lenses are designed with this in mind, and specially tuned so it doesn’t cause problems.  They are NOT, however, designed with infrared in mind.  This is speculation on my part, but I have found, with much frustration, that my focus is sorely impaired with all of my zoom lenses.  I have a 12-24 DX lens that somehow has an incredibly shallow depth of field, even completely stopped down, when used on my IR camera.  This might vary from lens to lens, but you will likely find that you need to shoot with prime lenses in order to avoid missing focus.  Break out that 50mm, because, as usual, it’s going to be your workhorse lens here.

So here are my final notes on IR photography.

  • Noise becomes very evident after a channel swap, so try and shoot everything at ISO100.
  • A wider range of wavelengths means more mappable colors.
  • Anything above 720nm will likely need to be presented in grayscale.
  • 720nm will give you slightly pink-or-gold leaves and blue everything else.
  • 560nm will give you golds, pinks, and blues with the proper post-processing, but you are including half the visible spectrum at this point, so you might lose some of your science-y bragging rights.

 

  • Shooting on an unmodified camera has underwhelming results.
  • Shooting with an external high-pass filter is not very fun, as you can’t even look through the viewfinder.
  • If your camera is modified with clear glass, you still use an external high-pass filter over the lens and compose in “live view”.
  • Shooting on a modified camera with an optical high-pass filter directly over the sensor allows you to focus, look through the lens, and shoot normally.