It may come as a shock to some of you that all images received from space telescopes are black and white. They are colourized later. But first, let's understand how cameras work and whether there are any differences between the cameras in smartphones and the cameras sent to space.
How cameras in smartphones work
The camera in your smartphone consists of various components, but we will be focusing on the sensor. Sensors are essential components of cameras: they are devices that 'capture' light coming from the lens and convert it into an image1. On their surface, they contain photosites, light-collecting cavities made of silicon, arranged in minute grids.
Sensors come in two types: CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor)2. Both rely on the photoelectric effect, first explained by Albert Einstein in 1905. When certain substances like silicon are hit by electromagnetic radiation under the right conditions, electrons absorb the light, gain enough energy to break free of their atoms, and are ejected. So when the camera's lens focuses light onto the silicon in a photosite, electrons are knocked loose, and the number of electrons freed is roughly proportional to the number of photons that hit the silicon3. Counting the charge collected at each photosite gives the brightness of that pixel, which can be used to produce an image. This process, on its own, results in an image in black and white.
What about colour images? Images in colour are obtained by using the Bayer filter array. These filters, transparent only to certain colours, are placed on top of each cavity. They come in three colours, blue, green, and red, arranged in a pattern. Each filter transmits only its own colour and blocks the rest, so almost two-thirds of the light is discarded.4
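The sampling can be sketched in a few lines of NumPy. This is a toy model, not any camera maker's actual pipeline; it assumes an RGGB layout, one common Bayer arrangement, and simply records which single colour channel each photosite would keep:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer filter array: each photosite records
    only one of the three colour channels of the scene."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red-filtered sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green-filtered sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green-filtered sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue-filtered sites
    return mosaic
```

Each photosite keeps one channel out of three, which is why roughly two-thirds of the incoming light never registers; the camera's processor later interpolates the two missing colours at every pixel, a step called demosaicing.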
Figure 1 – An image depicting camera sensors letting in either red, green, or blue light
Why are cameras in space telescopes designed to see in black and white?
As mentioned previously, without the Bayer filter array, images produced by smartphone cameras would be in grayscale. Using Bayer filters means allowing only some of the light to reach the sensor. This is undesirable when cameras are used for scientific purposes, as in space telescopes: astrophysicists want to observe an object across many wavelengths, including ones our eyes cannot detect.
Observing in visible light alone discards a lot of scientific data. Take the Hubble Space Telescope, for instance. Hubble's cameras use CCD sensors, and its filters let through only a narrow range of wavelengths at a time5. Where a smartphone camera uses three filters to produce a colour image instantly, Hubble observes an object in one 'colour' at a time; it cannot view an object in both infrared and ultraviolet simultaneously.
How are images from space colourized?
To obtain colour images, space telescopes take multiple images of the same object through red, green, and blue filters. Scientists later process and combine these three exposures to produce the colour images you see. This is broadband filtering, or 'true colour' imaging, meaning the result approximates how our eyes would see the object if they could. But that's not all: when observing in wavelengths the human eye is not sensitive to, scientists assign visible colours to them.
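In outline, the combination step is simple channel stacking. This NumPy sketch is illustrative only: the function name and the crude peak normalisation are my assumptions, and real pipelines also align, calibrate, and stretch the frames first:

```python
import numpy as np

def combine_filters(r_frame, g_frame, b_frame):
    """Stack three grayscale exposures (taken through red, green,
    and blue filters) into one colour image, scaled to [0, 1]."""
    rgb = np.stack([r_frame, g_frame, b_frame], axis=-1).astype(float)
    peak = rgb.max()
    return rgb / peak if peak > 0 else rgb
```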
Our eyes contain millions of specialised cells called rods and cones. Cone cells are sensitive to colour; there are three types, each most sensitive to either blue, green, or red light. Each cone sends signals to the brain, which combines them, allowing us to see millions of colours6.
Figure 2 – An image of NGC 7635, also known as The Bubble Nebula, which is an emission nebula. This image was taken using narrowband filtering. In it, the blue colour represents oxygen, green hydrogen, and red nitrogen.7
There is another type of filtering, called narrowband filtering, which involves taking images of an object at one very specific wavelength8. Narrowband filters let through only a very small portion of the electromagnetic spectrum at a time and are generally used to observe emission nebulae9.
Nebulae are giant clouds of gas and dust in space, some of which are the result of a star's death10. Emission nebulae are one type of nebula. Ultraviolet radiation from a nearby star energizes the atoms in them, meaning their electrons jump to a higher energy level; as the electrons fall back to lower energy levels, energy is emitted in the form of light, which makes these nebulae glow11.
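The light emitted by these falling electrons has a predictable wavelength. For hydrogen, the Rydberg formula gives it directly; here is a small Python sketch (the function name is my own, and the formula assumes hydrogen's simple structure). The fall from level 3 to level 2 is the H-alpha line, the red glow typical of emission nebulae, near 656 nm:

```python
# Rydberg constant (infinite nuclear mass), in inverse metres
RYDBERG = 1.0973731568e7

def hydrogen_line_nm(n_upper, n_lower):
    """Wavelength in nm of the photon emitted when a hydrogen
    electron falls from level n_upper to level n_lower."""
    inv_wavelength = RYDBERG * (1 / n_lower**2 - 1 / n_upper**2)
    return 1e9 / inv_wavelength
```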
Photographing emission nebulae involves narrowband filtering, as previously mentioned. Some gases in a nebula emit light at very similar wavelengths, which makes the gases extremely hard to distinguish by eye. For practical purposes, each gas present in the nebula is therefore assigned its own colour. For example, blue can indicate the presence of oxygen, while reddish-pink indicates hydrogen. In this way, one might also say that the colour of the nebula is enhanced.
Figure 3 – The Milky Way galaxy as seen in infrared light
You might have come across images taken in radio, ultraviolet, and other wavelengths not visible to the human eye. A separate scale of colours is chosen to represent them: in infrared observations, for example, red, yellow, and orange show warm temperatures, while shades of purple and blue depict colder ones12.
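One simple way to build such a colour scale is to interpolate each pixel between a 'cold' and a 'warm' colour. This Python sketch is purely illustrative; the purple and orange endpoints are arbitrary choices of mine, not any observatory's actual palette:

```python
import numpy as np

def false_colour(ir):
    """Map a grayscale infrared intensity array onto a
    purple-to-orange scale: cold regions purple, warm orange."""
    cold = np.array([0.3, 0.0, 0.5])   # purple for cold regions
    warm = np.array([1.0, 0.6, 0.0])   # orange for warm regions
    lo, hi = ir.min(), ir.max()
    if hi > lo:
        t = (ir - lo) / (hi - lo)      # normalise to [0, 1]
    else:
        t = np.zeros_like(ir, dtype=float)
    return t[..., None] * warm + (1.0 - t[..., None]) * cold
```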
The present and the future
The Perseverance rover, which touched down on Mars on the 18th of February 2021, is one of the few spacecraft to carry a built-in colour camera. Mastcam-Z, one of Perseverance's cameras, is its highest-resolution colour camera13. The James Webb Space Telescope, which is expected to launch later this year, will see the universe in infrared and a small portion of the visible spectrum (red and orange light).
Space telescopes have helped us understand this Universe in more detail than we could have imagined. High above the surface of the Earth, there is no atmosphere to obscure their vision, helping us to see the Universe in ways our eyes alone never could. Though some images of space you see may not match how the human eye would perceive the object, they are not falsified: they still represent real scientific data. Colourizing black and white images from space has only increased our wonder towards the universe.
1. The Smartphone Photographer. “The Phone Camera Sensor: A Simple Introduction | The Smartphone Photographer.” Accessed July 18, 2021. https://thesmartphonephotographer.com/phone-camera-sensor/.
5. Moskowitz, Clara. “Truth Behind the Photos: What the Hubble Space Telescope Really Sees.” Space.com. Accessed July 19, 2021. https://www.space.com/8059-truth-photos-hubble-space-telescope-sees.html.
12. Fries-Gaither, Jessica. “Seeing Temperature Through Color — Keeping Warm — Beyond Penguins and Polar Bears.”
13. Schneider, Jaron. “A Closer Look at the Mars Perseverance Rover’s Incredible Cameras | PetaPixel.” Accessed July 21, 2021. https://petapixel.com/2021/03/01/a-closer-look-at-the-mars-perseverance-rovers-incredible-cameras/.
Figure 1 – Cambridge in Colour. “Understanding Digital Camera Sensors.” Accessed July 18, 2021. https://www.cambridgeincolour.com/tutorials/camera-sensors.htm.