Monthly Archive for October, 2010

Welcome to my Lair!

Welcome to the Lair of Evil Dr Ganymede! On this site, you’ll find my 2300AD RPG blog and science blog as well as articles about role-playing games, worldbuilding, and Stellar Mapping that I’ve written or co-authored, along with artwork that I’ve created using 3D modelling software (POVray and Lightwave) over the years. I’ve also put my Ph.D. Thesis and abstracts online on my Science page.

I run a blog here in which I generally talk about interesting planetary science and astronomy topics, but I may talk about RPGs and worldbuilding and other stuff I find interesting too (you can read all the blog entries below this post, so scroll down! To read the full article, just click on the title of the entry in the blog). If you just want to read blog entries about specific subjects (e.g. Science, Worldbuilding, Stellar Mapping, etc) you can just click the link in the sidebar on the right under “Blog Topics” (you can use that link as a bookmark too).

You can also subscribe to my blog so you receive an email whenever I update it, comment on the individual blog articles, or send me an email!

Thanks for visiting!

Why I love Voyager

Voyager flyby.

I’ll just come out and say it – I love Voyager.

And no, I’m not referring to the Star Trek series, I’m talking about the original JPL/NASA spacecraft! Lately I’ve been posting some of the Voyager images I worked on here on my Science Blog, I’ve finally got my Voyager for POVray code back online, and only yesterday Voyager 2 edged past being 13 light-hours from Earth according to its twitter feed (this, I might add, is quite a long way – just over 93.85 AU!). So I guess I’m on a Voyager kick at the moment 😉

It’s hard to believe that back in the 1970s, we knew very little about the outer solar system. Even by the mid 80s, Uranus, Neptune and Pluto each got a couple of pages at most in an astronomy textbook because of that – but the Voyager spacecraft literally re-wrote those textbooks. In my opinion, they were largely responsible for the creation of Planetary Science as a rich field of study, because they revealed a bewildering variety of new worlds for us to understand – and I was growing up while this revolution was going on. It’s largely because of Voyager that I got into astronomy and planetary science in the first place, so I owe it a lot :).

Voyager showed us the turbulent, changing atmosphere of Jupiter. It showed us the craters and impact basins of Callisto, the bright sulci and dark terrain of Ganymede, and the mysteriously smooth, cracked surface of Europa. One of my earliest memories is of watching Patrick Moore talk about Io’s active volcanoes, during live TV coverage while the images were coming in from Voyager 1 in 1979 (one fun story is that the active volcanism was predicted in a scientific paper that was published a matter of days before the flyby confirmed them!).

Two years later in 1981, Saturn was revealed to us, along with Titan’s thick atmosphere, Enceladus’ patchwork terrain of craters and grooves, Mimas’ similarity to the Death Star, the bright and dark faces of Iapetus, spokes and braids in Saturn’s rings and of course majestic Saturn itself. Voyager 1 subsequently went on a path that took it out of the solar system after that, but Voyager 2 continued to Uranus and Neptune.

In 1986 we saw mysterious Uranus for the first time (I remember watching that one on TV too!) – while the planet itself turned out to be a bit of a visual disappointment (just a bland greenish sphere), the moons stole the show yet again, with Miranda’s crazy patchwork terrain and huge cliffs, Ariel’s rifts, and Titania’s canyons. Sadly the geometry of the flyby meant that we couldn’t get as much information as we did from the previous ones, since Uranus is tipped on its side and Voyager was passing through the system essentially perpendicular to the satellite orbits – so we had to be content with distant views of the other moons that could only hint at interesting features.

In 1989, as the Eastern Bloc was starting to crumble here on Earth, Voyager 2 flew past Neptune, revealing a blue world with high cirrus clouds casting shadows on the atmosphere below, a great dark spot, a super-fast “Scooter” bright spot, and clumpy rings. Again though, the satellite Triton stole the show, with its pink methane icecap, frozen nitrogen lakes, mysterious “cantaloupe terrain”, and cryovolcanic geysers that erupted dark material into the satellite’s incredibly thin atmosphere.

And finally, in 1990, Voyager 1 was commanded to perform one last task – to take a family portrait of the solar system from its vantage point about 40 AU from Earth. And there was Earth, a “pale blue dot” looking small and insignificant and almost lost in the glare of the sun. The late, great, Professor Carl Sagan summed it up nicely:

    From this distant vantage point, the Earth might not seem of particular interest. But for us, it’s different. Consider again that dot. That’s here, that’s home, that’s us. On it everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives. The aggregate of our joy and suffering, thousands of confident religions, ideologies, and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilization, every king and peasant, every young couple in love, every mother and father, hopeful child, inventor and explorer, every teacher of morals, every corrupt politician, every “superstar,” every “supreme leader,” every saint and sinner in the history of our species lived there – on a mote of dust suspended in a sunbeam. – Prof. Carl Sagan.

And they’re still going today, barely. The signals from the Voyagers are getting weaker over time as their power runs low and they get farther from Earth, but they’re probing the very edge of the sun’s influence and may reach interstellar space in the next few years. Voyager 2 just passed 13 light-hours from Earth, and Voyager 1 is just over 16 light-hours from Earth (over 115 AU!). The cameras may be dead but they’re still sending back good science with their other instruments about the solar wind out there at the edge of the solar system, and hopefully they’ll last till about 2025.

The spacecraft that we have sent out there since then – Cassini, New Horizons, and Galileo – are following in the metaphorical footsteps of Voyager, and those more long-term missions are answering many of the questions that Voyager raised (while raising more of their own, of course). But Voyager’s always going to have a special place in my heart, because through it the outer solar system was revealed in all its glory to humanity for the first time, and in some way I was a part of that.

Voyager for POVRay now online again!

My Voyager for POVray model is now live again on this website (any bookmarks to it will have to be updated, as the URL has changed)!

Seeing in a different light.

In my last post I talked about the fun I had making huge mosaics of Ganymede and showed off one that I made of Ganymede’s anti-jovian hemisphere taken by Voyager 2. Today I’m going to show you the side of Ganymede that permanently faces Jupiter, taken by Voyager 1 when it passed through the Jupiter system in January 1979, six months before its sister ship arrived there.

So, without further ado, here it is – the Subjovian hemisphere of Ganymede, again mosaicked by me in 1999 (click the image to see the full-size mosaic in all its glory):
VGR1 Ganymede OBV subjovian

This mosaic shows Perrine Regio (the dark blocks of terrain at top left) and Barnard and Nicholson Regiones (the dark areas in the central part of the hemisphere). The bright ray crater Tros is visible in the bright terrain of Phrygia Sulcus, between Perrine and Barnard Regiones. You can also make out the “polar caps” of Ganymede, visible as the paler terrain at the north and south poles (at top and bottom of the image). If you keep going west around the satellite from Perrine Regio, you’ll come to the eastern edge of Galileo Regio, which was the large area of dark terrain seen at the top right of the Voyager 2 image in my previous post.

This mosaic was made using the same filters as the Voyager 2 mosaic, but if you look closely you’ll see that the top-left part of Ganymede looks blurry. This is actually due to the motion of the spacecraft as it was taking the pictures. Voyager was moving pretty fast as it travelled through the Jupiter system (I’m not sure what its exact velocity was at the time, but right now Voyager 1 is travelling at about 17 km/s relative to the sun!), and that had to be compensated for when taking pictures by rotating the camera on its scan platform or by rotating the whole spacecraft.

Another issue is that the solar illumination drops as the inverse square of the distance from the sun. This means that if you go twice as far from the sun as the Earth then the illumination there will drop to a quarter (1/4) of what it is at the Earth’s distance (conversely, if you go towards the sun to half the Earth’s distance then the sun will appear four times brighter). At Jupiter’s distance from the sun – just over five times further out than Earth – the sunlight is about 27 times dimmer than at Earth. I don’t think this would actually be all that noticeable to the human eye, and it’d still probably be sufficient to blind us if we looked directly at the sun without any protection, but the dimmer illumination does make a difference for cameras, requiring longer exposures – these particular images were taken with 360 second (6 minute) exposures.
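The inverse-square falloff described above is easy to play with in a few lines of code – here’s a minimal sketch (the function name is just my own, for illustration), with distances measured in AU so that Earth’s illumination comes out as 1:

```python
def relative_illumination(distance_au):
    """Sunlight intensity relative to Earth's, at a distance in AU.

    Illumination falls off as the inverse square of distance from the sun.
    """
    return 1.0 / distance_au ** 2

# Twice Earth's distance -> a quarter of the illumination.
print(relative_illumination(2.0))  # 0.25

# Jupiter at ~5.2 AU -> sunlight roughly 27 times dimmer than at Earth.
print(round(1.0 / relative_illumination(5.2)))  # 27
```

The 27-times-dimmer figure for Jupiter drops straight out of squaring its ~5.2 AU distance.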

Despite the fast-moving camera (and spacecraft) and the long exposure times, most of the images taken by the Voyagers actually came out really sharp, which is a testimony to the skill of the people who planned the images and engineered the spacecraft – but there were still a few cases (like this one) when things didn’t quite turn out according to plan and the compensation wasn’t enough, and that’s why some of the images are blurry. Unfortunately there’s no way to fix this after the fact.

What about other filter combinations though? One simple trick that I tried was to take the Orange mosaic, duplicate it (changing the brightness and contrast to more closely match the Blue mosaic), and put the modified Orange mosaic in the green channel of the image instead of the blurry blue one. The result is shown below:
VGR1 Ganymede OOV subjovian

While this does sharpen the mosaic (since we’ve got rid of the blurry parts of the image), it’s purely for aesthetic purposes – it’s not accurate or useful for scientific purposes at all (you might also notice that the overall colour of Ganymede appears more yellow-tinged than in the OBV mosaic).

What about true colour? You might recall from my previous post that Voyager’s cameras don’t have a Red filter, so the closest that we can get to true colour is an Orange/Green/Blue combination. It so happens that Voyager 1 did take some Green images in this sequence, but unfortunately they don’t cover the whole hemisphere. But at least we can see something close to what Ganymede might look like to human eyes, and that’s shown in the next mosaic:
VGR1 Ganymede OGB subjovian

We’re interested in the central part of the image – going from top to bottom – which is covered by Orange, Green, and Blue filters. The rest of the image looks tinted magenta and dark-green because the Green filter images don’t cover those areas, so we’re just seeing those parts of Ganymede through the red and blue channels of the image. The colour difference is quite dramatic – in the OBV mosaics Ganymede looks a lot browner in colour, whereas in the OGB mosaic it looks a more greyish/beige colour. But that’s closer to the true colour of Ganymede that we would see with our own eyes if we were there.

It occurs to me that I might be able to write a program to interpolate between the filters and give a more-accurate-but-still-simulated “true colour” view – so as an example I could take an existing Green image and Violet image, and linearly interpolate a Blue filter image (since Blue is about 43% of the way between Violet and Green in the spectrum). Of course, it wouldn’t be accurate because the way that the surface reflects blue wavelengths probably isn’t on a nice straight line between Violet and Green, but it’d be better than just eyeballing and fiddling with an image like I did in the mosaic where I modified the Orange to replace the Blue. Hrm…
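As a quick sketch of that interpolation idea (names and the toy 2×2 “images” are just for illustration; the 0.43 weight assumes Blue sits about 43% of the way from Violet to Green in wavelength, and a straight linear blend is of course only an approximation):

```python
def interpolate_blue(violet, green, t=0.43):
    """Linearly interpolate a synthetic Blue-filter value from
    Violet and Green brightness values at the same pixel."""
    return (1.0 - t) * violet + t * green

# Toy 2x2 "images" of pixel brightness values from two filters.
violet_img = [[0.2, 0.4], [0.6, 0.8]]
green_img = [[0.3, 0.5], [0.7, 0.9]]

# Blend the two, pixel by pixel, into a simulated Blue image.
blue_img = [[interpolate_blue(v, g) for v, g in zip(v_row, g_row)]
            for v_row, g_row in zip(violet_img, green_img)]
```

This is exactly the “better than eyeballing, still not accurate” trade-off described above – the real surface reflectance at blue wavelengths need not lie on that line.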

All of this goes to show that one has to be careful when looking at colour images taken using different filters, because they are false-colour – not what one would see with the naked eye. That said, there’s still obviously a lot of very useful science that can be done with these images (including subtracting one filter from the other, taking ratios etc) which can tell us a lot about the nature of the surface material that we’re looking at, which I might discuss in a later post. But for me, nothing’s quite as satisfying as being able to see something in as close to true colour as possible, because that’s the closest thing to being there and seeing it with my own eyes – which is essentially why I got into planetary science and astronomy in the first place!

Postcards from a distant moon

While digging around Unmannedspaceflight.com (which is full of really cool images of the other bodies in the solar system), I got inspired to try to find some of the images that I used to work on when I did my Ph.D. at Lancaster University all those years ago. Fortunately I hadn’t lost them as I initially suspected and they’re all safely backed up online now, so I’m going to show some of them off in this and later posts and explain a bit about the story behind them.

You’re probably used to seeing planetary images on the news or in an article – a nice panoramic mosaic taken by Spirit or Opportunity on the surface of Mars, or an image of some terrain on a distant moon – but there’s actually quite a lot of work that goes into making these images presentable (and scientifically useful). You can get a lot of the raw images online from NASA nowadays, and new images are being released frequently (for example, head over to the Cassini raw images website, and click “browse latest 500 images” to see what’s hot off the press from the Saturnian system). While these raw images are useful for basic visual examination and interpretation, a lot of the time one needs to process the images somewhat before they’re usable for scientific purposes.

Let’s take the example that I’m going to show you here – a global mosaic of Jupiter’s largest moon Ganymede that I lovingly hand-made back in 1999 (click the image to see the full-size mosaic in all its glory):
VGR2 Ganymede OBV antijovian

This mosaic shows the anti-jovian hemisphere of Ganymede – like Earth’s moon, Ganymede is tidally locked to Jupiter which means that one side (the “sub-jovian hemisphere”) always faces the planet. We’re looking at the opposite side here, that always faces away from Jupiter – equivalent to what we’d call the “far side of the moon” if we were talking about Luna here. The mosaic is presented as if we were looking at the globe of the planet, with the north pole at the top of the image and the south pole at the bottom (though we can’t actually see either here because they weren’t covered in the mosaic), and the equator running from left to right across the middle of the mosaic. You can see that Ganymede is brownish in colour, with light and darker brown areas (the large dark area on the top right is Galileo Regio, and the lighter linear strip marking its western border is called Uruk Sulcus), and whiter areas where asteroids have smashed into its surface to make craters and expose relatively fresh ice (e.g. Osiris crater at the bottom). I’ll probably get on to talking about Ganymede itself more in a later post, but today I want to talk about the epic image processing that went into making the mosaic (if you want to look at some other mosaics of Ganymede that I made, check out my Ganymede gallery on Flickr. I’ll probably talk about the Voyager 1 mosaics in a later post).

This mosaic is made up of 18 separate images (some of which you can see as thumbnails of their original format here), taken by the Voyager 2 spacecraft way back in 1979. Now, space probes don’t actually take colour images – they take greyscale images through various camera filters, which are then combined (on the ground) to make a colour image. This happens in a modern handheld digital camera too – light passes through red, green and blue filters (or a Bayer filter made of all three), and that’s combined within the camera to make the colour image. Conversely, a digital colour image can be broken down into red, green, and blue channels, which would all look slightly different because of the way the subject reflects light in the red, green and blue parts of the visible spectrum (if you’ve got Photoshop, you can play around with this by opening up a photo and going to the “channels” window and turning some of them on or off).

So to make a colour image from Voyager’s greyscale images, we have to combine three greyscale images by putting them in the red, green, and blue (R/G/B) channels. Ideally, to get a “true colour” image, you would take three greyscale images of a target in rapid succession (or even at the same time) – one through a Red filter, one through a Green filter, and one through a Blue filter – and these would be directly equivalent to the R/G/B channels in the colour image. Unfortunately, Voyager’s vidicon cameras didn’t come with all those filters – they have an Orange filter but not a Red one, along with Methane (infrared), Violet and Ultraviolet filters whose responsiveness peaks at those wavelengths in and around the visible spectrum. This means we can’t get a truly accurate representation of the colours we would see with our own eyes, but we can get close if we have Orange, Green, and Blue images. We can also make other combinations such as Orange/Blue/Violet or even Methane (IR)/Green/Ultraviolet if we have images from those filters, but those are progressively more “false-colour”. In each case, we put the reddest filter in the Red channel of the colour image, the middle filter in the Green channel, and the bluest filter in the Blue channel, which is what I did in this mosaic – I combined Orange, Blue and Violet images to make a slightly false-colour view.
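The reddest-to-Red, bluest-to-Blue channel assignment can be sketched in a few lines of numpy (this is just an illustration of the principle, not the actual ISIS/Photoshop workflow I used – the function name and toy arrays are my own):

```python
import numpy as np

def combine_filters(reddest, middle, bluest):
    """Stack three greyscale filter images into one RGB image.

    The reddest filter (e.g. Orange) goes in the Red channel, the
    middle filter (e.g. Blue) in Green, and the bluest (e.g. Violet)
    in Blue -- the same ordering described for the OBV mosaic.
    """
    return np.dstack([reddest, middle, bluest])

# Toy 4x4 greyscale "mosaics" with uniform brightness per filter.
orange = np.full((4, 4), 0.8)
blue_f = np.full((4, 4), 0.5)
violet = np.full((4, 4), 0.3)

rgb = combine_filters(orange, blue_f, violet)  # shape (4, 4, 3)
```

With real mosaics, each argument would be the full stitched greyscale mosaic for that filter.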

The other thing I had to do was to use the ISIS image processing software to import the images, calibrate them (remove the dots called reseau marks and correct for known distortions in the camera optics), correct the camera pointing (itself an epic tale), find enough match-points between each pair of images in the set, reproject them, account for the way light reflects from the target’s surface, and then stitch them all together to make a single mosaic – and I had to do that for all the images taken with each filter that I was using.

Finding the match points was a hugely time-consuming and incredibly tedious task, since I had to do that by hand (if you were in a rich US university you could use a separate IDL program to do it automatically, but that was horribly expensive and completely unaffordable for us – so we did the next best thing). Matchpoints are pairs of (x,y) co-ordinates of features that are visible in two overlapping images – by finding those and telling ISIS where they are in each image pair, the program can put the images on top of each other when it’s mosaicking them in a way that they match up properly (modern cameras that have a “Panorama” mode do the same thing (automatically), so that when you take three overlapping photos it can stitch them together to make a continuous mosaic). Somewhere at home I have a notebook that is completely filled with (at least) three pairs of x and y pixel co-ordinates for each overlapping image pair, that I would then meticulously enter by hand into a text file that I would then pipe into ISIS – this mosaic alone had a total of 81 matchpoints in it (for all of the filters)!
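To give a flavour of what the matchpoints buy you, here’s a toy sketch (nothing to do with ISIS’s actual, far more sophisticated geometry – the function name and co-ordinates are made up): given pairs of (x, y) positions of the same features in two overlapping images, you can estimate the shift needed to lay one image on top of the other.

```python
def estimate_offset(points_a, points_b):
    """Estimate the average (dx, dy) shift that aligns image B with
    image A, from matchpoint pairs: the same features' pixel
    co-ordinates in each of the two overlapping images."""
    n = len(points_a)
    dx = sum(xa - xb for (xa, _), (xb, _) in zip(points_a, points_b)) / n
    dy = sum(ya - yb for (_, ya), (_, yb) in zip(points_a, points_b)) / n
    return dx, dy

# Three matchpoints per image pair, like the hand-recorded notebook entries.
feats_in_a = [(120, 80), (210, 95), (160, 140)]
feats_in_b = [(100, 70), (190, 85), (140, 130)]

print(estimate_offset(feats_in_a, feats_in_b))  # (20.0, 10.0)
```

A real mosaicking program also has to solve for rotation, camera pointing, and map projection simultaneously across the whole image set, which is why the matchpoint bookkeeping mattered so much.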

Once that was all done, the three separate mosaics taken through each filter could then be combined in Photoshop to create the resulting (false) colour image, which is what you see here (the magenta/yellow/green strips are where the images in each channel don’t quite cover each other, so if you have say a red and blue component without the green one there, the image looks like it’s tinted magenta).

Now, I made this mosaic back in 1999. I can’t exactly recall the specs of the computer we were using – looking at the specs for computers from that era that I can find on the net, it could have been a 400 MHz Pentium III (but more likely was whatever came before that) with 128MB RAM, a 20GB Hard drive, and running Red Hat Linux (that was the only OS that the ISIS image processing software would work on). We thought that was pretty awesome at the time, but it’s quite mind-boggling to think that I’m currently typing this entry on a home PC that at that time would have been called an honest-to-god Supercomputer (2.83 GHz Intel Q9550 Quadcore CPU, 4 GB DDR3 RAM, and a 500 GB hard drive)! The (estimated) specs may help you appreciate just how big a deal making this mosaic was at the time, because I’m pretty sure I had to leave the computer running overnight to stitch everything together (whereas it’d probably be done in less than 30 minutes today or something) – and sometimes it didn’t work because the computer ran out of scratch memory or the match-points weren’t quite right!

Of course, as processing power and hard drive space have increased over time, all this has become less time-consuming (though the process hasn’t actually changed all that much), but spare a thought for all the poor saps who had to make these mosaics in the 1970s, ’80s, and ’90s ;). I will say though that I wouldn’t have gone to all that effort if I didn’t think the results would be worthwhile, and it was rather satisfying to see the final mosaic at the end – as it is, this particular mosaic made it into my thesis as well :).

In my next post, I’ll show you what the other side of Ganymede looks like (as seen by Voyager 1) and demonstrate the difference that filters make to the appearance of the image.

Space Art page is up!

Hooray, I finally got around to adding my Space Art! All I need to do now is add the pics of the 3D models I’ve made, but I’ll figure that out later.