In the past two blog postings we uncovered how some characters in the Harry Potter universe are tied to Greek, Roman, and Norse mythology. Not only are certain names shared, but the stories of the characters and the myths from which their names are derived are intertwined. In the first blog posting we dealt mainly with two of Harry’s classmates: Draco and Luna. In the second blog posting we looked at two members of the family Black: Sirius and Bellatrix. Now it is time we took a closer look at one of Harry Potter’s fiercest enemies and one of his fiercest allies: Fenrir Greyback and Albus Dumbledore.
If you’ve ever read the Harry Potter series of books, or even seen the films, you’ll have come across many unusual and colorful character names. Millicent Bulstrode, Fleur Delacour, Argus Filch, and Gilderoy Lockhart are just a few of the names you’ll encounter while working your way through the seven books in the series. These names, through the sounds they create and their connection to other words, such as the Slither in Slytherin or the Guile in Goyle, can give the reader an inkling of what to expect from such characters. But there are other names, such as Draco, Sirius, and Luna, which can also tell the reader something about their respective characters, not through allusion but through the astronomical backgrounds from which their names are derived.
When we see depictions of the solar system, we often see an inaccurate representation of its size and scale. Some planets are enlarged, others are shrunk, and the orbital paths are pulled in closer to the sun. This is done because, when viewing the entire solar system, you want to see everything that is there. Not only is there an enormous amount of space between the planets (especially between the rocky inner planets and the outer gas giants), but the sizes of the inner and outer planets are at opposite ends of the spectrum. After all, you can fit over a thousand Earths inside Jupiter.
Giving an accurate depiction of the solar system’s scale is a very difficult task. It is a task that was recently tackled by filmmakers Alex Gorosh and Wylie Overstreet, using the “Earth as a marble” concept.
What does it sound like to sling-shot around Jupiter or to crash land on Venus? What does it sound like when molecules rapidly vibrate around each other or when you’re able to fly through a nebula? Writing music for a visual medium can be challenging enough as it is, but when you’re attempting to rhapsodize upon experiences that people have a hard time wrapping their heads around, it creates a whole new realm of difficulty. I mean, how do you summarize the feeling of approaching a star ten times more massive than our Sun? But aside from the creative aspect, the task of writing music for planetarium productions is completely different from writing music for any other visual medium.
So what is a “dome master”? Most planetarians like to consider themselves masters of the dome, but that’s not what we’re talking about here.
Frames for full dome movies are square, with a circular image inscribed within them; this is called a dome master. When viewed on a flat screen, a dome master looks like a distorted spherical image, but when projected upward onto the dome, the image produces an accurate immersive effect for the viewer.
How do you make a dome master? Unfortunately it isn’t really as simple as taking footage or images intended for a normal flat screen and running them through some software to make them immersive. You need to capture the footage differently using special camera lenses. The super fisheye lens captures the 180 degree perspective necessary to get enough picture to fill the dome.
Many inexpensive cameras can capture high resolution still images, like the Bryce Canyon image above. However, those megapixels aren’t put to use in video capture mode. The maximum video resolution for most digital cameras and DSLRs is a 1920 x 1080 rectangle (perfect for your HD TV at home!). As we saw in previous articles, the fisheye lens produces a circular image inside the camera’s rectangular sensor frame. So when capturing video with a DSLR and a fisheye lens, you get at most a roughly 1,000 x 1,000 pixel dome master video, which isn’t nearly enough!
So, let’s put a fisheye on a Red Epic and start filming! Again, hold your horses… The frame captured by a 4k movie camera is not a square image either; it’s approximately 4,000 pixels wide by 2,000 pixels tall (just what you need for a regular movie theater). Slap a fisheye lens on there and you get a large black rectangle with a circle in the middle of it. Planetarium producers only care about the height of the frame, since that sets the maximum size of the dome master. So a 4k movie camera that costs $100,000+ will only yield a 2k dome master, and the resolution to either side is thrown away. Bummer!
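To make the arithmetic concrete, here is a small Python sketch (the function name and numbers are illustrative, not from any camera vendor) of why a rectangular sensor wastes pixels when shooting through a fisheye:

```python
def dome_master_size(frame_width, frame_height):
    """A circular fisheye image inscribed in a rectangular frame:
    the usable dome master is a square limited by the shorter side."""
    side = min(frame_width, frame_height)
    unused = frame_width * frame_height - side * side
    return side, unused

# A 1920 x 1080 DSLR video frame yields roughly a 1080 x 1080 dome master.
print(dome_master_size(1920, 1080))   # (1080, 907200)

# A ~4k cinema camera frame (about 4000 x 2000) still yields only a
# 2000-pixel dome master; half the sensor's pixels are thrown away.
print(dome_master_size(4000, 2000))   # (2000, 4000000)
```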
This is the main reason that full dome movies have, until now, been composed mostly of computer animation. That, and of course, we don’t have a lot of live footage of outer space. In recent years full dome movies have moved beyond just astronomy topics. There are biology shows, chemistry shows, history shows, and entertainment shows featuring roller coasters in outer space. Computer animation has been the planetarium producers’ solution for getting that true 4k by 4k dome master at whatever frame rate they want.
When producing CGI content, an animator essentially creates a virtual movie set, with virtual objects, virtual lights, even a virtual camera. The animator assigns textures to the objects and makes them move. When the scene is finished, the animator picks the desired resolution, then the scene is rendered. This means that the computer creates each individual frame of the animation: one frame at a time, it calculates how the scene should look based on the lighting and texturing of the scene as seen from the point of view of the virtual camera. Those frames are then assembled into a movie.
When producing for the dome, the animator uses a virtual camera with a virtual super fisheye lens on it. In fact, this virtual camera can capture beyond a 180 degree field of view, which is difficult to do with a real camera. Many full dome animators use a field of view of 200 degrees or more. This captures more of the scene, giving the viewer an even more realistic experience. At render time, the animator chooses an output resolution of 4,000 x 4,000 and renders out dome master frames that will later be assembled into a movie.
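As a rough illustration of what the virtual fisheye does at render time, here is a sketch assuming an equidistant (linear-in-angle) fisheye projection. Real renderers offer several fisheye models, so take the exact formula as illustrative rather than as any particular package’s implementation:

```python
import math

def dome_master_pixel(azimuth_deg, altitude_deg, resolution=4000, fov_deg=180):
    """Map a sky direction to dome master pixel coordinates, assuming an
    equidistant fisheye: distance from the frame center grows linearly
    with angle from the zenith. The zenith lands at the center; with a
    180-degree field of view the horizon lands on the edge of the circle."""
    radius = resolution / 2.0
    zenith_angle = 90.0 - altitude_deg              # degrees from straight up
    r = (zenith_angle / (fov_deg / 2.0)) * radius   # linear in angle
    az = math.radians(azimuth_deg)
    x = radius + r * math.sin(az)
    y = radius - r * math.cos(az)                   # azimuth 0 maps toward the top
    return x, y

# The zenith maps to the center of a 4000 x 4000 dome master.
print(dome_master_pixel(0, 90))              # (2000.0, 2000.0)

# With a 200-degree field of view, the horizon lands inside the circle's
# edge, leaving room to capture a bit of scenery below the horizon too.
print(dome_master_pixel(0, 0, fov_deg=200))  # (2000.0, 200.0)
```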
So is the sky the limit here? Not really. Come back next time to learn about how rendering actually works, and the time required to produce a 4k animation.
Welcome back to the Full Dome Saga. For those of you just checking in, last week’s article can be found here: Part 1.
So what is a 4k production? It’s a full dome movie that takes four Kyras to produce it, a 4K production…..
Jokes aside… 4k is short for 4,000 pixels.
Movies everywhere are made up of individual images called frames, played back at high speeds to simulate motion. The higher the frame rate, the smoother the motion and the greater the perception of realism. Low frame rates cause motion to appear jerky and the picture to flicker. Think back on early films from the 1930s and 1940s, or funny internet GIFs of today.
When we go to a movie theater that advertises itself as a 4k cinema, what that really means is that each frame of the movie is approximately 4,000 pixels wide and 2,000 pixels tall, because movies are rectangular (your HD TV at home is 1920 x 1080 pixels).
Generally, movie theaters run at 24 frames per second. Peter Jackson made headlines when he offered The Hobbit at 48 frames per second (gasp!). Many still argue that high frame rates for live action films on flat screens produce too much realism for the audience to enjoy. Personally, I enjoyed the hyper-realism of The Hobbit at 48 fps, but that’s just me.
Movies like The Hobbit push the envelope for current technology. Capturing live action at high resolutions and high frame rates (then double this for two cameras if you’re making a 3D movie) requires sophisticated and expensive camera equipment. The Red Epic (used to film The Hobbit and other recent blockbusters) is advertised as a 4k camera. I am often asked: why don’t you planetarians use that to get 4k live action footage for your dome? Not so fast, son…
Here in the planetarium things are a bit different. Rather than a rectangular screen in front of you, there is a hemispherical screen that surrounds you. The movie is in front of you, next to you, above you, and behind you. 4k in the planetarium world means each frame is 4,000 by 4,000 pixels (a square rather than a rectangle), twice the number of pixels per frame.
Many digital planetaria run full dome movies at 30 frames per second, but producers are beginning to push the envelope. If you thought The Hobbit at 48fps was crazy, some digital planetaria offer full dome movies at 60 fps. The high frame rate applied to CG graphics creates that extra degree of smoothness and realism in the motion of objects on the dome. Stay tuned, as we have been dabbling in this a bit ourselves here at LASM!
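The arithmetic behind these comparisons is simple enough to sketch in Python (the formats and frame rates below are the approximate ones quoted above):

```python
def pixels_per_second(width, height, fps):
    """Raw pixel throughput: pixels per frame times frames per second."""
    return width * height * fps

cinema  = pixels_per_second(4000, 2000, 24)   # approximate 4k cinema at 24 fps
dome_30 = pixels_per_second(4000, 4000, 30)   # 4k dome master at 30 fps
dome_60 = pixels_per_second(4000, 4000, 60)   # 4k dome master at 60 fps

print(cinema)             # 192000000
print(dome_60)            # 960000000
print(dome_60 // cinema)  # 5 -- five times the pixel throughput of 4k cinema
```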
Unlike a traditional movie screen, the planetarium dome is a large surface that surrounds the audience. Therefore motion is amplified on a dome screen. For this reason planetarium shows are edited with longer cuts than traditional movies and TV content, camera movement is slower, and objects move slower. Movement that is fast or somewhat jerky on a flat screen will look even more so on a domed screen. A high frame rate smooths the motion, resulting in a more realistic experience.
Why are most full dome movies CGI you ask? Come back next week for The Full Dome Saga PT3 where we will investigate the ins and outs of Live Action vs CGI production for the dome…..
The Full Dome Saga Part 1: From Mechanical to Digital
Hello Readers, you are about to embark on a journey through the world of planetarium technology and production. Every Friday for the next few weeks we will explore all things digital planetarium. The first part of the Full Dome Saga is a brief background on the technological evolution from the traditional planetarium to the digital dome of today. In later posts we will compare production techniques and equipment used for Hollywood movies to that used for planetarium production, animation and rendering techniques, live action capture, and more.
When many people think planetarium, they imagine a dark, domed room with a strange machine in the center. They imagine a mystifying experience where a presenter takes them on a tour of the stars and constellations in our sky. In the last ten years, the digital revolution has taken the planetarium on an interesting journey (which is far from over). With advances in digital projection systems, software and computers, planetaria are transforming into immersive theaters. Using anywhere between one and thirty projectors, a bank of synchronized computers, and sophisticated software, planetaria are pushing the boundaries of possibility for both education and entertainment.
These digital systems allow presenters to move beyond the traditional night sky star talk. In a digital planetarium you can watch a full dome movie, specially produced to cover the entire dome, providing a “you are there” experience unlike any other. Differing from traditional movie theaters, digital planetaria possess realtime astronomy visualization software. A presenter can simulate and navigate through actual astronomical data in real time, almost like a video game. Full dome content is still largely CGI, with only small amounts of live action. Where are the IMAX type nature films designed for the immersive planetarium theater experience? They’re coming…. (I hope!)
Here at LASM the Irene W. Pennington Planetarium houses a 4k digital projection system made by a company called Sky Skan. Our image on the dome is created by two projectors (one at the front and the other at the back of the dome). Four computers send visual information to each projector, and one computer handles the surround sound (see picture at left). All of the computers are controlled by a master computer running software called Digital Sky and SPICE. In order for everything to run seamlessly (no pun intended), all of the computers must stay in sync with no lag.
Why are there so many pieces? Projecting a 4k video at a normal frame rate requires multiple computers to share the job, each one takes a small piece of the video to send to the projector (allowing everything to run quickly and smoothly). Current projector technology makes it difficult and expensive to cover a large 60-foot dome with a high quality image using a single projector, so we use two. Other planetaria use four, six or more to accomplish this task.
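The load-sharing idea can be sketched in a few lines of Python. This is a hypothetical illustration of slicing a frame among render computers, not Sky Skan’s actual scheme:

```python
def frame_slices(width, height, num_computers):
    """Divide a frame into equal-width vertical strips, one per render
    computer. Each tuple is (x_offset, y_offset, strip_width, strip_height).
    A hypothetical sketch of the load-sharing idea only."""
    strip = width // num_computers
    return [(i * strip, 0, strip, height) for i in range(num_computers)]

# Four computers each handle a 1000-pixel-wide strip of a 4000 x 4000 frame.
for piece in frame_slices(4000, 4000, 4):
    print(piece)
```

Splitting the work this way means each machine only has to decode and push a quarter of the pixels in time for the next frame, which is what keeps playback smooth.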
What does 4k even mean anyway? And does frame rate matter? Return next week to learn about how projection and video in the planetarium compares to your HD TV…
This is a time-lapse taken from 10 am to 9 pm on August 13: roughly 10,000 still pictures, one taken every 3 seconds. The resulting video was originally 5 1/2 minutes long but has been squeezed down here to 1 1/2 minutes.
Shot from the window of my office at LASM.
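The time-lapse arithmetic works out like this (assuming a playback rate of 30 frames per second, which is consistent with the durations quoted above):

```python
def timelapse_duration(frame_count, playback_fps):
    """Playback length in seconds for a time-lapse of `frame_count` stills."""
    return frame_count / playback_fps

# Roughly 10,000 stills played back at 30 fps run about 5.5 minutes...
original_seconds = timelapse_duration(10_000, 30)
print(round(original_seconds / 60, 1))          # 5.6

# ...so squeezing the clip down to 1.5 minutes speeds it up almost 4x.
print(round(original_seconds / (1.5 * 60), 1))  # 3.7
```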
Right now, Earth is entering the debris field of comet Swift-Tuttle. What does this mean? The Perseid Meteor Shower is upon us! It takes Earth about two weeks to pass through the field of dust, rock, and metal left behind by the comet. These bits (technically, meteoroids) fall into Earth’s atmosphere and burn up, leaving streaks of light, the meteors, in what appears to be the northern sky. The meteors appear to radiate from the constellation Perseus. It is not essential to know where Perseus is; just look north. The middle of the two weeks it takes Earth to pass through the debris (this year, August 12th and 13th) is the peak of the shower. The best time to see the meteors is after midnight. This year the Moon will be in its waxing crescent phase and will set in the early evening, so the sky will be extra dark, making it perfect for photography. The peak is still a few days away, but that gives you time to practice your meteor photographing skills in anticipation of the big night.
Not all astrophotography is difficult, especially in the age of digital cameras. You can accomplish good results with most digital cameras and a tripod. My two weapons of choice are a Canon 7D with a 180 degree fisheye lens, for capturing pictures that we can use in the planetarium, and of course my Sony Nex-5 (if you’re in the market for a new camera, Sony makes a whole family of Nex cameras at a variety of prices). Small point and shoot cameras can work too! Many of these little cameras come with some, if not all, of the necessary manual controls.
For the meteor shower I will set up both cameras. Below you can see how the two kinds of pictures look different. The circular “dome master” picture from the 7D projects onto the planetarium dome, while the rectangular picture from the Nex-5 is for printing or viewing on a computer screen.
So here’s what to do. I’d recommend experimenting in your back yard or somewhere with little light from nearby houses first (with long exposure images, you’ll be surprised how much human-made light will show up in your photos). While we hope for totally clear skies, a few stray clouds here and there can make for some interesting additions to your photos. Especially when doing long exposure shots……
1. Point your camera north towards the meteors, and switch it to the manual mode.
2. Higher ISOs make the camera more sensitive to light (better for dark conditions). Some digital cameras let you go to ISOs up in the thousands, but don’t get too carried away. These high ISO values can result in noisy images. I usually set ISO between 800 and 1600 for taking pictures of the night sky.
3. Many digital cameras allow you to adjust your shutter speed. In order to pick up starlight, you need to keep the shutter open for a minimum of about 3 seconds. Yes, you read that correctly. For everyday photography, shutter speeds are hundredths or thousandths of a second. For nighttime shots, that shutter needs to stay open for a while to collect as much light as possible. Also, make sure that your f/stop (how wide the aperture is open) is as wide as it can go. The wider the aperture, the smaller the f/stop number (f/2, f/4, f/5.6), and the more light enters the lens.
4. Some cameras have a setting called “bulb”, which allows you to leave the shutter open for as long as you want! It closes when you press the picture button again. If you have this setting, experiment with this too. Remember, the Earth is rotating; the stars appear to slowly rise in the east and set in the west. By using the bulb setting and leaving your shutter open for about 10 minutes or so, you will begin to see “star trails” showing up in your picture.
So the longer you leave your shutter open, the longer your “star trails” will become. Also, you will get more meteors in your picture. Check out these cool pictures taken by someone else.
5. So, perhaps we can crank up the ISO and the shutter speed and we’ll get awesome pics, right? Not exactly. Too long an exposure paired with too high an ISO results in a noisy image: the black parts of your image will have tiny dots of blue, green, and red. This happens when the light sensor in your camera heats up. So there is a trade-off…
If you want to collect light gradually over a long period, try exposures of 10 minutes or more and bring the ISO down to 800 or less. If you want to capture more light over a shorter period, push the ISO to 800 or higher and keep the shutter open for less than 10 minutes. If your camera is older, try staying around ISO 400, give or take.
6. I will leave you with one last tip. Keep that camera as still as possible! If you don’t have a tripod, you can prop your camera up on a table or even on the ground. When pressing the picture button, be very careful not to knock the camera. If you use the bulb function you will have to press the picture button again to close the shutter and finish the picture. It is most essential that you are careful at this point. Disturbing the camera will blur your image.
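The numbers behind the star-trail and exposure trade-offs in steps 4 and 5 can be sketched in Python. The functions below are illustrative rules of thumb, not exact photometry:

```python
def star_trail_degrees(exposure_minutes):
    """Earth rotates about 15 degrees per hour, so a star sweeps out
    roughly 0.25 degrees of trail per minute of open shutter."""
    return exposure_minutes * 15.0 / 60.0

def relative_light(shutter_seconds, iso, f_number):
    """Rough relative exposure: proportional to shutter time and ISO,
    inversely proportional to the square of the f-number."""
    return shutter_seconds * iso / (f_number ** 2)

# A 10-minute bulb exposure produces trails about 2.5 degrees long.
print(star_trail_degrees(10))   # 2.5

# A 10-minute f/2.8 exposure at ISO 800 gathers ten times the light of a
# 30-second f/2.8 shot at ISO 1600 -- hence the trade-off in step 5.
ratio = relative_light(600, 800, 2.8) / relative_light(30, 1600, 2.8)
print(round(ratio, 6))          # 10.0
```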
In summary: all cameras take slightly different pictures, so experiment and see what results you get with your setup. The more you experiment with different settings and combinations, the more you will understand how your manual functions work. It’s a learning experience. Have fun!