The Full Dome Saga Pt 3: Creating the Dome Master

Welcome back to The Full Dome Saga, the story of full dome production. Check out parts 1 and 2.

So what is a “dome master”? Most planetarians like to consider themselves masters of the dome, but that’s not what we’re talking about here.

[Image: brycecanyon]

This is a dome master!

Frames for full dome movies are square, with a circular image inscribed within them; this circular frame is called a dome master. On a flat screen, a dome master looks like a distorted spherical image, but when projected upward onto the dome it produces an accurate, immersive effect for the viewer.

[Image: superfisheye]

How do you make a dome master? Unfortunately, it isn't as simple as taking footage or images intended for a normal flat screen and running them through some software to make them immersive. You need to capture the footage differently, using special camera lenses. A super fisheye lens captures the 180 degree perspective necessary to get enough picture to fill the dome.

Many inexpensive cameras can capture high resolution still images, like the Bryce Canyon image above. However, those megapixels aren't put to use in video capture mode. The maximum video resolution for most digital cameras and DSLRs is a 1920 x 1080 rectangle (perfect for your HD TV at home!). Remembering back to previous articles, the fisheye lens produces a circular image inside the camera's sensor frame. So when capturing video with a DSLR and fisheye lens, the frame height limits you to a dome master video of roughly 1,000 x 1,000 pixels, which isn't nearly enough!
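For a rough sense of those numbers, here is a minimal sketch (assuming an idealized fisheye whose image circle is limited by the 1,080-pixel frame height; the little helper function is just for illustration):

```python
import math

# Back-of-the-envelope check: how big a dome master can an HD video frame hold?
# Assumes an idealized fisheye whose image circle is limited by the shorter
# side of the frame (here the 1,080-pixel height).

def dome_master_from_frame(width_px, height_px):
    diameter = min(width_px, height_px)                 # inscribed circle diameter
    circle_pixels = math.pi * (diameter / 2) ** 2       # pixels inside the circle
    frame_pixels = width_px * height_px
    return diameter, circle_pixels / frame_pixels

diameter, used = dome_master_from_frame(1920, 1080)
print(f"Dome master diameter: {diameter} px")                  # 1080 px (the rough "1,000 x 1,000" above)
print(f"Fraction of sensor pixels actually used: {used:.0%}")  # about 44% of the HD frame
```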

[Image: liveaction]

Red Epic pixels are expensive; look at all the wasted pixels on either side! Luckily this was shot with a much less expensive Canon 7D!

So, let's put a fisheye on a Red Epic and start filming! Again, hold your horses… The frame captured by a 4k movie camera is not a square image either; it's approximately 4,000 pixels wide by 2,000 pixels tall (just what you need for a regular movie theater). Slap a fisheye lens on there and you get a large black rectangle with a circle in the middle of it. Planetarium producers only care about the height of the frame; that sets the maximum size of the dome master. So a 4k movie camera that costs $100,000+ will only yield a 2k dome master, and the resolution to either side is thrown away. Bummer!
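Running the same arithmetic with round cinema 4k numbers (the exact Red Epic frame size varies by shooting mode) shows how much of that expensive sensor goes unused:

```python
import math

# Same arithmetic with round cinema 4k numbers: a ~4,000 x 2,000 frame
# still only yields a ~2,000-pixel dome master, and most of the frame is wasted.
width, height = 4000, 2000
diameter = min(width, height)                                   # 2,000 px dome master
wasted = 1 - math.pi * (diameter / 2) ** 2 / (width * height)
print(f"{diameter} px dome master, about {wasted:.0%} of the frame thrown away")  # ~61%
```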

This has been the main reason that full dome movies, up until now, have been composed mainly of computer animation. That, and of course we don't have a lot of live footage of outer space. In recent years full dome movies have moved beyond just astronomy topics: there are biology shows, chemistry shows, history shows, and entertainment shows featuring roller coasters in outer space. Computer animation has been the planetarium producer's solution for getting a true 4k by 4k dome master at whatever frame rate they want.

When producing CGI content, an animator essentially creates a virtual movie set, with virtual objects, virtual lights, even a virtual camera. The animator assigns textures to the objects and makes them move. When the scene is finished, the animator picks the resolution they want, then the scene is rendered. This means that the computer creates each individual frame of the animation. One frame at a time, the computer calculates how the scene should look based on the lighting and texturing of the scene as seen from the point of view of the virtual camera. Those frames are then assembled into a movie.
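As a toy illustration of that frame-by-frame idea (and nothing like a real animation package's workflow), the sketch below "renders" a dot moving across a tiny text image by computing every pixel of every frame:

```python
# Toy illustration of frame-by-frame rendering: the "scene" is a dot moving
# across a tiny text image, and rendering a frame means computing every pixel
# for that moment in time. A real renderer does the same thing at 4,000 x 4,000
# with lights, textures, and a virtual camera.
WIDTH, HEIGHT, FRAME_COUNT = 16, 4, 8

def render_frame(frame_number):
    dot_x = frame_number * (WIDTH - 1) // (FRAME_COUNT - 1)   # where the dot is on this frame
    return ["".join("#" if x == dot_x else "." for x in range(WIDTH))
            for _ in range(HEIGHT)]

movie = [render_frame(n) for n in range(FRAME_COUNT)]          # one image per frame
for n, frame in enumerate(movie):
    print(f"frame {n}:")
    print("\n".join(frame))
```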

[Image: TritonUnrendered]

An unrendered scene. Virtual Dome Camera is circled. “+” signs represent steam and ice particles that will be generated during rendering.

When producing for the dome, the animator uses a virtual camera with a virtual super fisheye lens on it. In fact, this virtual camera can capture beyond a 180 degree field of view, which is difficult to do with a real camera. Many full dome animators use a field of view of 200 degrees or more. This captures more of the scene, giving the viewer an even more realistic experience. At render time, the animator chooses an output resolution of 4,000 x 4,000 and renders out dome master frames that will later be assembled into a movie.
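For the curious, here is a minimal sketch of how a dome master pixel corresponds to a viewing direction, assuming an equidistant ("f-theta") fisheye model. The function and its parameters are illustrative only; the actual projection depends on the renderer and lens profile being used.

```python
import math

# Minimal sketch: map a dome master pixel to a viewing direction, assuming an
# equidistant ("f-theta") fisheye model. This is one common choice, not any
# particular renderer's API; the real projection depends on the lens profile.

def pixel_to_direction(x, y, size=4000, fov_deg=180.0):
    """Return a unit direction vector for pixel (x, y), or None outside the image circle.

    The image circle is inscribed in the size x size frame, and the centre of
    the frame looks straight up at the dome's zenith.
    """
    cx = cy = size / 2.0
    dx, dy = x - cx, cy - y                      # image y grows downward, so flip it
    r = math.hypot(dx, dy) / (size / 2.0)        # 0 at the centre, 1 at the circle's rim
    if r > 1.0:
        return None                              # one of the wasted corner pixels
    theta = r * math.radians(fov_deg / 2.0)      # angle away from the zenith
    phi = math.atan2(dy, dx)                     # azimuth around the dome
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# With a 200-degree field of view, the rim of the image circle reaches 10 degrees
# below the dome's horizon, which is what pulls extra scenery into the frame.
print(pixel_to_direction(4000, 2000, fov_deg=200.0))   # rim pixel: z < 0, i.e. below horizontal
```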

[Image: TritonSideBySide]

You see more of the scene with a wider field of view! The experience becomes that much more immersive for the viewer.

So is the sky the limit here? Not really. Come back next time to learn about how rendering actually works, and the time required to produce a 4k animation.

The Full Dome Saga Part 2: Resolution and Frame Rate

Welcome back to the Full Dome Saga. For those of you just checking in, last week's article can be found here: Part 1.

So what is a 4k production? It's a full dome movie that takes four Kyras to produce: a 4K production…

Jokes aside… 4k is short for 4,000 pixels.

[Image: catanddog]

The movement here is not smooth.

Movies everywhere are made up of individual images called frames, played back at high speed to simulate motion. The higher the frame rate, the smoother the motion and the greater the perception of realism. Low frame rates cause motion to appear jerky and the picture to flicker. Think back on early films from the 1930s and 1940s, or funny internet GIFs of today.

When we go to a movie theater and they advertise that they are a 4k cinema, what they really mean is that each frame of their movie is roughly 4,000 pixels wide and 2,000 pixels tall, because movies are rectangular (your HD TV at home is 1920×1080 pixels).

[Image: bilbosmeagol]

Back away, Smeagol, you look too realistic at this frame rate!!

Generally, movie theaters run at 24 frames per second. Peter Jackson made headlines when he offered The Hobbit at 48 frames per second (gasp!). Many still argue that high frame rates for live action films on flat screens produce too much realism for the audience to enjoy. Personally, I enjoyed the hyper realism of The Hobbit at 48 fps, but that's just me.

Movies like The Hobbit push the envelope of current technology. Capturing live action at high resolutions and high frame rates (then double this for two cameras if you're making a 3D movie) requires sophisticated and expensive camera equipment. The Red Epic (used to film The Hobbit and other recent blockbusters) is advertised as a 4k camera. I am often asked, why don't you planetarians use that to get 4k live action footage for your dome? Not so fast, son…

Here in the planetarium things are a bit different. Rather than a rectangular screen in front of you, there is a hemispherical screen that surrounds you. The movie is in front of you, next to you, above you, and behind you. 4k in the planetarium world means each frame is 4,000 by 4,000 pixels (a square rather than a rectangle), roughly twice the number of pixels per frame.
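A quick count with the same round numbers makes the gap clear (real standards differ slightly; DCI cinema 4k, for example, is 4096 x 2160):

```python
# Quick pixel counts with the same round numbers used above.
formats = {
    "HD TV":     (1920, 1080),
    "Cinema 4k": (4000, 2000),
    "Dome 4k":   (4000, 4000),
}
for name, (w, h) in formats.items():
    print(f"{name:10s}: {w * h / 1e6:5.1f} megapixels per frame")
# A 4k dome master carries twice the pixels of a 4k cinema frame,
# and roughly eight times the pixels of an HD frame.
```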

[Image: tospaceandback]

“To Space and Back” produced by Sky Skan. This full dome movie is available in 8k, 60 frames per second, and 3D!

Many digital planetaria run full dome movies at 30 frames per second, but producers are beginning to push the envelope. If you thought The Hobbit at 48fps was crazy, some digital planetaria offer full dome movies at 60 fps. The high frame rate applied to CG graphics creates that extra degree of smoothness and realism in the motion of objects on the dome. Stay tuned, as we have been dabbling in this a bit ourselves here at LASM!

The planetarium dome is a large surface that surrounds the audience, unlike a traditional movie screen, so motion is amplified on a dome screen. For this reason planetarium shows are edited with longer cuts than traditional movies and TV content, camera movement is slower, and objects move more slowly. Movement that is fast and somewhat jerky on a flat screen will look even more so on a domed screen. A high frame rate smooths the motion, resulting in a more realistic experience.

Why are most full dome movies CGI, you ask? Come back next week for The Full Dome Saga Pt. 3, where we will investigate the ins and outs of live action vs. CGI production for the dome…

The Full Dome Saga Part 1: From Mechanical to Digital

Hello Readers, you are about to embark on a journey through the world of planetarium technology and production. Every Friday for the next few weeks we will explore all things digital planetarium. The first part of the Full Dome Saga is a brief background on the technological evolution from the traditional planetarium to the digital dome of today. In later posts we will compare the production techniques and equipment used for Hollywood movies to those used for planetarium production, and cover animation and rendering techniques, live action capture, and more.

When many people think planetarium, they imagine a dark, domed room with a strange machine in the center. They imagine a mystifying experience where a presenter takes them on a tour of the stars and constellations in our sky. In the last ten years, the digital revolution has taken the planetarium on an interesting journey (which is far from over). With advances in digital projection systems, software and computers, planetaria are transforming into immersive theaters. Using anywhere between one and thirty projectors, a bank of synchronized computers, and sophisticated software, planetaria are pushing the boundaries of possibility for both education and entertainment.

[Image: DSC_2987]

These digital systems allow presenters to move beyond the traditional night sky star talk. In a digital planetarium you can watch a full dome movie, specially produced to cover the entire dome, providing a "you are there" experience unlike any other. Differing from traditional movie theaters, digital planetaria possess real-time astronomy visualization software. A presenter can simulate and navigate through actual astronomical data in real time, almost like a video game. Full dome content is still largely CGI, with only small amounts of live action. Where are the IMAX-type nature films designed for the immersive planetarium theater experience? They're coming… (I hope!)

Here at LASM, the Irene W. Pennington Planetarium houses a 4k digital projection system made by a company called Sky Skan. Our image on the dome is created by two projectors (one at the front and the other at the back of the dome). There are four computers sending visual information to each projector, and one computer that stores the surround sound (see picture at left). All of the computers are controlled by a main master computer and software called Digital Sky and SPICE. In order for everything to run seamlessly (no pun intended), all of the computers must run simultaneously with no lag.

Why are there so many pieces? Projecting a 4k video at a normal frame rate requires multiple computers to share the job; each one takes a small piece of the video to send to its projector, allowing everything to run quickly and smoothly. Current projector technology makes it difficult and expensive to cover a large 60-foot dome with a high quality image using a single projector, so we use two. Other planetaria use four, six, or more to accomplish this task.
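To get a feel for why the work is divided, here is a back-of-the-envelope sketch with hypothetical, uncompressed numbers (real systems compress the video and blend overlapping projector edges):

```python
# Rough, uncompressed numbers for why the job gets split (hypothetical figures:
# 24-bit colour, no video compression, no projector edge blending).
width = height = 4000          # dome master resolution
bytes_per_pixel = 3            # 24-bit colour
fps = 30
computers = 8                  # e.g. four computers feeding each of two projectors

total_rate = width * height * bytes_per_pixel * fps            # bytes per second
print(f"Uncompressed full-dome stream: {total_rate / 1e9:.1f} GB/s")           # ~1.4 GB/s
print(f"Split {computers} ways: {total_rate / computers / 1e6:.0f} MB/s each")  # ~180 MB/s
```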

What does 4k even mean anyway? And does frame rate matter? Return next week to learn about how projection and video in the planetarium compares to your HD TV…

Summer Time-Lapse Over Mississippi River

This is a time-lapse shot from 10 am through 9 pm on August 13, with one still picture taken every 3 seconds, for a total of about 10,000 frames. The resulting video was originally 5 1/2 minutes long, but has been squeezed down here to 1 1/2 minutes.
Shot from the window of my office at LASM.
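A quick check on the playback length, assuming the finished video plays back at 30 frames per second (an assumption; the playback rate isn't stated above):

```python
# Quick check on the playback length, assuming 30 frames per second
# (an assumption; the playback rate isn't stated above).
frames = 10_000                      # still pictures, one taken every 3 seconds
playback_fps = 30

minutes = frames / playback_fps / 60
print(f"Full-speed playback: about {minutes:.1f} minutes")      # ~5.6 min, the "5 1/2 minutes" above
print(f"Squeezing that into 1.5 minutes is a further {minutes / 1.5:.1f}x speed-up")
```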

Photographing Meteors (for beginners!)

[Image: earthswiftorbit]

Debris left behind by comet Swift-Tuttle intersects Earth’s orbit. This year, Earth travels through the densest part of the debris field around August 12.

Right now, Earth is entering the debris field of comet Swift-Tuttle. What does this mean? The Perseid Meteor Shower is upon us! It takes Earth about two weeks to pass through the field of dust, rock, and metal left behind by the comet. These bits fall into Earth's atmosphere and burn up, producing the streaks of light we call meteors, which appear in the northern sky. The meteors appear to radiate from the constellation Perseus, but it is not essential to know where Perseus is; just look north. The peak of the shower comes in the middle of the two weeks it takes Earth to pass through the debris (this year, August 12th and 13th). The best time to see the meteors is after midnight. This year the Moon will be in its waxing crescent phase and will set in the early evening, so the sky will be extra dark, making it perfect for photography. The peak is still a few days away, but that gives you time to practice your meteor photographing skills in anticipation of the big night.

Not all astrophotography is difficult, especially in the age of digital cameras. You can accomplish good results with most digital cameras and a tripod. My two weapons of choice are a Canon 7D with a 180 degree fisheye lens, for capturing pictures that we can use in the planetarium, and of course my Sony Nex-5 (if you’re in the market for a new camera, Sony makes a whole family of Nex cameras at a variety of prices). Small point and shoot cameras can work too! Many of these little cameras come with some if not all of the necessary manual controls.

[Image: cameras]

Canon 7D (left) and Sony Nex-5 (right).

For the meteor shower I will set up both cameras. Below you can see how the two kinds of pictures look different. The circular "dome master" picture from the 7D projects onto the planetarium dome, while the rectangular picture from the Nex-5 is for printing or viewing on a computer screen.

[Image: deathvalleydome]

Canon 7D + Tripod + ISO 1600 + Shutter Open for 65 Seconds. Death Valley has America’s darkest night skies, but with these settings we picked up the lights of Las Vegas over 100 miles away. Click the pic to see the full size version.

So here's what to do. I'd recommend experimenting first in your back yard or somewhere with little light from nearby houses (with long exposure images, you'll be surprised how much human-made light shows up in your photos). While we hope for totally clear skies, a few stray clouds here and there can make for some interesting additions to your photos, especially when doing long exposure shots…

1. Point your camera north towards the meteors, and switch it to manual mode.

2. Higher ISOs make the camera more sensitive to light (better for dark conditions). Some digital cameras let you go to ISOs up in the thousands, but don’t get too carried away. These high ISO values can result in noisy images. I usually set ISO between 800 and 1600 for taking pictures of the night sky.

[Image: astrophoto]

Nex-5 + Tripod + ISO 1600 + shutter open for 80 seconds. This photo was taken later at night; the Milky Way was in the western sky (pointing away from the Las Vegas lights).

3. Many digital cameras allow you to adjust your shutter speed. In order to pick up starlight, you need to keep the shutter open for a minimum of about 3 seconds. Yes, you read correctly. For everyday photography, shutter speeds are hundredths or thousandths of a second; for nighttime shots that shutter needs to stay open for a while to collect as much light as possible. Also, make sure that your f/stop (how wide the aperture is open) is as wide as it can go. The wider the aperture, the smaller the f/stop number (f/2, f/4, f/5.6), and the more light enters the lens.

4. Some cameras have a setting called “bulb”, which allows you to leave the shutter open for as long as you want! It closes when you press the picture button again. If you have this setting, experiment with this too. Remember, the Earth is rotating; the stars appear to slowly rise in the east and set in the west. By using the bulb setting and leaving your shutter open for about 10 minutes or so, you will begin to see “star trails” showing up in your picture.

So the longer you leave your shutter open, the longer your “star trails” will become. Also, you will get more meteors in your picture. Check out these cool pictures taken by someone else.

[Image: otherpersonsmerseid]

Click the picture to visit this photographer’s page with more meteor shower pictures.

5. So, perhaps we can crank up the ISO and the shutter speed and we'll get awesome pics, right? Not exactly. Exposing for too long, paired with an ISO that is too high, results in a noisy image: the black parts of your image will have tiny dots of blue, green, and red. This happens when the light sensor in your camera heats up. So there is a trade-off…

For longer exposures of 10 minutes or more, bring the ISO down to 800 or less. If you want to capture more light over a shorter period of time, go with a higher ISO (800+) and keep the shutter open for less than 10 minutes. If your camera is older, use roughly ISO 400 as the threshold instead. (A rough comparison of these combinations is sketched just after this list.)

6. I will leave you with one last tip. Keep that camera as still as possible! If you don’t have a tripod, you can prop your camera up on a table or even on the ground. When pressing the picture button, be very careful not to knock the camera. If you use the bulb function you will have to press the picture button again to close the shutter and finish the picture. It is most essential that you are careful at this point. Disturbing the camera will blur your image.
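As promised above, here is a rough sketch of the trade-off from step 5. It uses the common photographic rule of thumb that recorded brightness scales with shutter time times ISO divided by the f-number squared, while noise grows with higher ISO and with the sensor heating up during very long exposures. The combinations shown are hypothetical examples, not recommendations.

```python
import math

# Rough look at the trade-off in step 5, using the rule of thumb that recorded
# brightness scales with shutter_time * ISO / f_number**2, while noise grows
# with higher ISO and with very long exposures heating the sensor.
# The combinations below are hypothetical examples, not recommendations.

def stops(shutter_s, iso, f_number):
    """Relative exposure in photographic stops versus a 1 s, ISO 100, f/2 baseline."""
    baseline = 1.0 * 100 / 2 ** 2
    return math.log2((shutter_s * iso / f_number ** 2) / baseline)

combos = [
    ("Long & low ISO:   10 min, ISO 800,  f/2.8", 600, 800, 2.8),
    ("Short & high ISO: 80 s,   ISO 1600, f/2.8",  80, 1600, 2.8),
    ("Older camera:     80 s,   ISO 400,  f/2.8",  80, 400, 2.8),
]
for label, t, iso, f in combos:
    print(f"{label}  ->  {stops(t, iso, f):+.1f} stops vs baseline")
```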

In summary: all cameras take slightly different pictures, so experiment and see what results you get with your setup. The more you experiment with different settings and combinations, the more you will understand how your manual functions work. It's a learning experience. Have fun!