You’ve seen the first color images from the James Webb Space Telescope, right? A stellar nursery that reveals previously unseen stars, the atmosphere of a giant exoplanet probed, a group of galaxies, a beautiful planetary nebula, and the deepest picture of the universe ever recorded. Pretty cool, huh? But were they real? Of course they were real! Did Webb capture them in a single shot, the way you take a picture with your phone? Not at all. Webb is designed to be sensitive to light we can’t see, and it carries four science instruments with seventeen modes of operation. “When you get the data, it doesn’t look like a pretty color image at all,” said Klaus Pontoppidan, a Webb program scientist at STScI who leads a team of 30 imaging experts. “They hardly look alike [and] only if you know what to look for can you appreciate them.” Webb’s engineers had to heavily process the images long before they were released, and for some pretty simple, common-sense reasons. So what’s going on? This isn’t just taking a photo on a phone.

Image design

First comes shot selection. NASA was looking for objects that would fill the frame nicely, have structure and make good use of color, while also emphasizing the science. Webb cannot see every part of the sky at any given time, and since the telescope’s launch was delayed several times, there was no way for engineers to meticulously plan the first images until Webb took to the sky last December. When it did, the engineers had a list of about 70 targets, chosen to show the range of science Webb is capable of and to yield stunning color images. “Once we knew when we could get the data, we could go down that list and pick the highest-priority targets that were visible at that time,” Pontoppidan said. “The images were planned for a long time [and] a lot of work went into simulating what the observations would look like, so that everything could be set up properly.”

The Carina Nebula as imaged by the James Webb Space Telescope (JWST). NASA, ESA, CSA and STScI

How Webb data comes back to Earth

Before engineers can begin working on Webb’s images, the raw data must be beamed back to our planet from a million miles away in space. This is done using NASA JPL’s Deep Space Network (DSN), which is how engineers communicate with, and receive data from, the agency’s 30-plus robotic probes in the solar system and beyond, including Webb. There are three complexes in the DSN, positioned roughly 120º apart around the globe: Goldstone in California, Madrid in Spain and Canberra in Australia. Radio waves are very reliable, but slow; data arrives at about two megabits per second (Mbps). However, the DSN will soon be upgraded from slow radio transmissions to ultra-fast “space lasers” that could massively increase data rates, up to 10 or even 100 times faster. “We design things, upload them to the observatory, get the data and bring it back to Earth, and then we have another long period of time where we process the data,” Pontoppidan said.
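
To put that radio link speed in perspective, here’s a back-of-the-envelope sketch in Python. The one-gigabyte image size is a made-up figure for illustration, not a real Webb data-product size; the point is how the quoted 2 Mbps rate compares with the 10x and 100x optical speedups mentioned above.

```python
# Rough downlink-time estimate for a hypothetical 1 GB raw image at the
# radio rate quoted above, versus the 10x and 100x "space laser" rates.
# The image size is an illustrative assumption, not an official figure.

IMAGE_BITS = 1 * 8e9  # a hypothetical 1 GB image, expressed in bits


def downlink_minutes(rate_mbps: float) -> float:
    """Minutes needed to transmit the image at a given link rate."""
    return IMAGE_BITS / (rate_mbps * 1e6) / 60


for label, rate_mbps in [("radio, 2 Mbps", 2.0),
                         ("laser, 10x faster", 20.0),
                         ("laser, 100x faster", 200.0)]:
    print(f"{label:>18}: {downlink_minutes(rate_mbps):6.1f} minutes")
# At 2 Mbps the image takes over an hour; at 100x, well under a minute.
```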

Why the colors in Webb’s photos are “fake”

Are the Webb Telescope images in color? Are the colors in space photos real? Not as captured. The Webb Telescope sees in infrared. It’s up there specifically to detect infrared light, the faintest and most distant light in the universe. It essentially sees heat radiation, not visible light; it sees a different part of the electromagnetic spectrum:

The electromagnetic spectrum, with the visible range (shaded) enlarged on the right. Encyclopaedia Britannica/UIG via Getty Images

Think of a rainbow. At one end it’s red; at the other end it’s blue or violet. The full spectrum is, in fact, much wider, but those two ends mark the limits of the colors the human eye can perceive. Beyond the blue there are shorter and shorter wavelengths of light for which we have no names. The same is true beyond the red, where the wavelengths of light get longer and longer. That’s where Webb looks: the infrared portion of the electromagnetic spectrum. It also uses masking techniques (filters) that allow it to detect dim light sources next to very bright ones. But none of this arrives in “color.” So how can the pictures we see be in color?
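
To make the rainbow picture concrete, here’s a tiny Python sketch that labels where a given wavelength falls. The band edges are rounded illustrative numbers (visible light spans roughly 380 to 750 nanometers, and Webb’s instruments together cover roughly 0.6 to 28.5 microns), not precise specifications.

```python
# Classify a wavelength relative to human vision and Webb's infrared
# range. Band edges are rounded, illustrative values, not exact specs.

def classify(wavelength_nm: float) -> str:
    if wavelength_nm < 380:
        return "shorter than visible (beyond blue: ultraviolet and beyond)"
    if wavelength_nm <= 750:
        return "visible light"
    if wavelength_nm <= 28_500:
        return "infrared (beyond red): Webb's territory"
    return "longer than Webb's detectors cover"


for wl_nm in [450, 650, 2_000, 10_000]:
    print(f"{wl_nm:>6} nm: {classify(wl_nm)}")
```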

How Webb’s photos are colored

Engineers move Webb’s images up the electromagnetic spectrum, from a part we can’t perceive to the visible light we can see. They take monochrome brightness images captured by Webb through up to 29 different narrow-band filters, each detecting a different wavelength of infrared light. They assign the light collected by each filter a different visible color, from red (which has the longest wavelength) to blue (which has the shortest). They then combine the layers into a composite image. Is this cheating? All the engineers do is take radiation from a part of the spectrum our eyes can’t see and shift it to a part we can see. It’s like playing a song in a different key. Additionally, all cameras, including your smartphone camera, use filters to capture the images you see. No, not Instagram filters, but individual red, green and blue filters that, when combined, produce a visible image that looks “real.” If you think Webb’s images aren’t real, then you should also think your own smartphone photos are fake.

The galaxy cluster SMACS 0723, known as Webb’s first deep field, features “pointy” stars and even … galaxies. NASA, ESA, CSA, STScI, Webb ERO
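
For the curious, here’s a minimal Python sketch of that chromatic-ordering idea: each filter’s grayscale exposure is tinted a visible hue, blue for the shortest infrared wavelength through red for the longest, and the tinted layers are summed into one composite. The filter wavelengths and random arrays below are stand-ins for real exposures; actual Webb processing also involves alignment, stretching and artifact cleanup that this toy version skips entirely.

```python
# Chromatic-ordering sketch: tint each filter's mono exposure by
# wavelength (shortest -> blue, longest -> red) and sum the layers.
import colorsys

import numpy as np

# Fake mono exposures keyed by filter wavelength in microns (made up).
rng = np.random.default_rng(0)
exposures = {0.9: rng.random((64, 64)),
             2.0: rng.random((64, 64)),
             4.4: rng.random((64, 64))}


def composite(exposures: dict[float, np.ndarray]) -> np.ndarray:
    """Combine hue-tinted single-filter layers into one RGB image."""
    wavelengths = sorted(exposures)
    lo, hi = wavelengths[0], wavelengths[-1]
    rgb = np.zeros(next(iter(exposures.values())).shape + (3,))
    for wl in wavelengths:
        t = (wl - lo) / (hi - lo)        # 0 = shortest, 1 = longest
        hue = (1 - t) * (240 / 360)      # 240 degrees = blue, 0 = red
        tint = np.array(colorsys.hsv_to_rgb(hue, 1.0, 1.0))
        rgb += exposures[wl][..., None] * tint  # tint and accumulate
    return np.clip(rgb / len(wavelengths), 0.0, 1.0)


image = composite(exposures)
print(image.shape)  # (64, 64, 3): an RGB array ready to display or save
```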

How long does it take to process Webb images?

It’s a complex process, and for data from Webb it simply hadn’t been done before. So it takes a few weeks for each image to emerge in its full colorful glory. “Typically, the process from raw telescope data to a final, clean image that conveys scientific information about the universe can take anywhere from weeks to a month,” said Alyssa Pagan, a science visualization developer at STScI. It was definitely worth the wait. “In the first images we have observations of a few days,” Pontoppidan said. “This is really just the beginning and we’re only scratching the surface.” I wish you clear skies and open eyes.