Image stacking to remove noise and optical artifacts, careful use of color filters to enhance contrast and pull out detail. The press release says it used Red: F444W, Orange: F335M, Yellow: F470N, Green: F200W, Cyan: F187N, Blue: F090W. The N filters are narrowband. F470N is only 54 nanometers wide: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...
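To make the chromatic ordering concrete, here's a minimal sketch of the compositing step. The random arrays are stand-ins for the real per-filter exposures; only the idea that shorter-wavelength filters map to bluer channels comes from the press release, everything else here is made up:

```python
import numpy as np

# Hypothetical monochrome exposures, one per filter (stand-ins for real data).
rng = np.random.default_rng(0)
f090w = rng.random((64, 64))   # shortest wavelength -> blue channel
f200w = rng.random((64, 64))   # mid wavelength      -> green channel
f444w = rng.random((64, 64))   # longest wavelength  -> red channel

def normalize(img):
    """Scale an image to the 0..1 display range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

# Stack the three bands into an RGB cube: longest wavelength becomes red.
rgb = np.dstack([normalize(f444w), normalize(f200w), normalize(f090w)])
```

Real pipelines also stretch each band nonlinearly before combining, but the channel assignment is the part that sets the "color" of the final image.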
Almost all the light in this image is way off the red end of the human visual spectrum, of course. The shortest wavelength filter is F090W which has a center wavelength of 902nm, about the same color as the light coming out of a TV remote infrared LED, which is barely visible in pure darkness.
This is what it looks like through a film SLR, without the detail enhancing filters: http://www.phys.ttu.edu/~ozprof/3372f.htm Here's a 20 minute exposure through a telescope: http://www.phys.ttu.edu/~ozprof/3372fk.jpg Maybe what you would see with your own eyes through binoculars at a dark site well away from city lights. A dim red smudge, hints of finer detail.
It’s real light, just color shifted, as the JWST is designed to look at very distant and thus red-shifted objects. The nebula is however much closer than that.
It's all real, but you would not be able to see it with your bare eyes even if you were relatively close to the nebula. The world around us would look very different if our eyes could perceive more of the infrared and ultraviolet spectrum.
The coloring is usually done to indicate different temperatures or wavelengths detected, so it can be a bit misleading.
I'm waiting for https://en.wikipedia.org/wiki/Pillars_of_Creation made by Webb.
It's not the same object, but similarly awesome. Maybe the article gives you a useful overview of how different telescopes 'see', and how that is translated into pictures for us.
I really wish astronomers would come up with (or adopt) a standard mechanism for indicating the field of view of an image. The scale of this one in the night sky is much larger than the deep field one.
In the Ring Nebula image, the two galaxies just kind of casually hanging out on the left side (just above midline), one face-on and the other edge-on, are pretty impressive.
There are a few others to be found (I suspect image duration is much shorter than for the Deep Field).
Even as far-from-primary-interest-objects, amazing detail.
Is this image distorted in any way at all? It feels like the galaxies are somehow oriented around a center spot. Not all of them, but enough to give the image a distorted feeling. Probably it's just my mind pattern matching against something that doesn't really exist.
Something missing from this discussion that's worth pointing out:
This image shows profound "gravitational lensing", which you know. But what you might not know is that is precisely _why_ they chose to photograph it.
This galaxy cluster (SMACS 0723) may be the most well known and powerful gravitational lens we have observed. The galaxies shown distorted around the edges are actually behind the lens, but are magnified by it. This means we can see even farther in this region of space than normal, because we compound the power of the JWST with the power of this natural lens.
It all adds up to providing the "deepest" view of the universe yet, allowing us to see galaxies at a distance of more than 13.2B lightyears. This lets us see structures formed in the infancy of the universe, that wouldn't be possible looking at most other points in the sky, or even anywhere else in this deep field besides the perimeter of the lens in the middle.
Other features include the prominent arcs in this field. The powerful gravitational field of a galaxy cluster can bend the light rays from more distant galaxies behind it, just as a magnifying glass bends and warps images. Stars are also captured with prominent diffraction spikes, as they appear brighter at shorter wavelengths.
Yes, it is distorted by a gravitational lensing effect of a massive galaxy cluster. Each image has a short discussion at this link, and a longer discussion linked via "Learn more about this image" for even more info: https://www.nasa.gov/webbfirstimages
Thanks for posting these links! It was frustrating that the main NASA PR pages linked photos that were 1280x720. I guess that's to protect their bandwidth costs since much of the general public is probably viewing on mobile anyway and higher res would not only be slower but wasted bits.
I just wish NASA had provided a link at the bottom of their low-res image pages to intermediate sized images (~4k) for desktop viewing.
I believe this page has what you want: https://www.nasa.gov/webbfirstimages Click on the image, twice, to get to a large-but-not-crazy resolution photo.
Mobile is actually a great platform for high resolution, since you can zoom in really easily and navigate the full image.
However, after spending 10 minutes on mobile this morning, I was unable to find any high resolution images, and many images had that anti-pattern of a BS HTML gallery that severely restricts interacting with the image.
Elements absorb light at certain frequencies. Given a spectral analysis of the light that passes through the atmosphere and another of the light that doesn't pass through the atmosphere, you can take the difference and see what frequencies were absorbed by the atmosphere. This tells you what elements make up the atmosphere. The H2O sections in the graph are the light frequencies that are absorbed by water molecules ("amount of light blocked" on the Y axis), indicating that the atmosphere contains water.
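A toy version of that subtraction (the band centers, widths and depths below are illustrative stand-ins, not real HITRAN values):

```python
import numpy as np

# Hypothetical wavelengths (microns) and a flat baseline stellar spectrum,
# i.e. the light that does NOT pass through the planet's atmosphere.
wavelengths = np.linspace(0.6, 2.8, 500)
baseline = np.ones_like(wavelengths)

# The same spectrum after passing through the atmosphere: pretend water
# absorbs around ~1.4 and ~1.9 microns (illustrative bands only).
transmitted = baseline.copy()
for center in (1.4, 1.9):
    transmitted -= 0.3 * np.exp(-((wavelengths - center) / 0.05) ** 2)

# "Amount of light blocked" = the difference between the two spectra.
blocked = baseline - transmitted
absorption_bands = wavelengths[blocked > 0.15]   # wavelengths absorbed en route
```

The peaks in `blocked` land at the molecule's absorption bands, which is exactly what the H2O labels on the published graph mark.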
I know nothing about optics. What is the effect that causes the 6 or 8 points of light that come off of bright objects? Does it have to do with the hex-shaped mirrors on JWST?
Yes, and also two of the trusses to the secondary mirror (these are the two additional horizontal lines). The Hubble Space Telescope gets 4 lines because of its 4 trusses.
The simple answer is the physics of the scope: the support arms cause diffraction spikes. Hubble has them too, but it has a smaller mirror and a different support arrangement. They're super common on consumer scopes such as RCs, Newts, and big tube scopes that aren't RASAs.
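You can reproduce the effect numerically: the far-field diffraction pattern (the PSF) is approximately the squared magnitude of the Fourier transform of the aperture, and any straight edge or strut throws a spike perpendicular to itself. A toy sketch with one circular mirror and a single vertical strut (nothing like JWST's real geometry):

```python
import numpy as np

N = 256
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

# Toy aperture: a filled circle standing in for the mirror ...
aperture = (x**2 + y**2 < 40**2).astype(float)
# ... crossed by a thin vertical "strut" that blocks light.
aperture[:, N // 2 - 1:N // 2 + 1] = 0.0

# Far-field diffraction pattern (PSF) ~ |FFT of the aperture|^2.
psf = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
psf /= psf.max()

# The vertical strut scatters light into a HORIZONTAL spike: compare a
# point on the spike with a point at the same radius off the spike.
on_spike = psf[N // 2, N // 2 + 60]
off_spike = psf[N // 2 + 42, N // 2 + 42]
```

A hexagonal mirror outline does the same thing with three pairs of spikes, one pair perpendicular to each pair of parallel edges, which is where JWST's six-pointed stars come from.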
Watching the livestream, I was more than surprised that color correction actually happens in Photoshop.
Also, there seem to be multiple layer-masks involved for specific regions and objects.
I get that you can shift and composite color, based on hue, apply filters etc, but: Photoshop?
Curious if anyone can explain whether what we see is actual science or some touched-up version of objects in our universe.
p.s.: What struck me the most is the absence of noise, especially for the deep field photo. Hubble took many exposures over weeks, which normally would allow for reliable reduction of noise; Webb took some shots over the course of hours and there’s hardly any noise to see. Weirdest part is seeing them just “healing brushing” away some dots. What is the decision process for altering images like that?
The difference between 'actual science' and 'some touched up version of objects in our universe' is smaller than you might think: no matter how good your eyes, if there was no frequency shift involved you would not be able to perceive the image, other than as an array of numbers. To facilitate your consumption of the data it has to be frequency shifted, and the easiest way to do this is to map the IR intensity to a range of colors that are graded the same way we grade false color images from other sources: higher intensities get brighter colors and lower intensities darker colors. Because not all of these are equally pleasing to the eye and/or enlightening, Photoshop is actually a pretty good choice, because it allows for dynamic experimentation with what brings out the various details in the best way.
If you would rather stare at an array of numbers or a non colorized version (black-and-white) it would be much harder to make out the various features.
So think of it as a visual aid, rather than an arts project or a way to falsify the data: the colorization is part of the science, specifically: how to present the data best.
I get that the acquired data needs to be transformed in a way so we get an image that depicts a reality we can visually process.
I honestly thought there are tools in NASA's imaging group that, based on scientific rules, pump out an image that is correct. Seeing Photoshop in use left me wondering…
I get that the investment needs to be “sold” too, would be sad though if we reached fashion-ad conduct for science…
And don’t get me wrong: I am in awe and more than happy this thing finally gets put to use.
> actual science or some touched up version of objects in our universe.
Here's a mental model that I found particularly beneficial:
All electromagnetic radiation is the same, in the sense that every proton/neutron is the same. But adding a few more protons/neutrons creates an entirely new element, with entirely new chemical properties. From something simple come entirely new, powerful behaviours. So just as Iron is massively different from Plutonium, Microwaves are massively different from Gamma rays.
What we call "visible light" is not particularly special, except to us and our specific human biology. It feels more real because it's visible to us, but it's not special on the grand scale of the universe.
What we're observing through these telescopes isn't a dog chasing a ball. We're seeing stuff billions of light years away, millions of light years in size, billions of years ago. Passing by trillions of other stars and planets on the way.
These objects are emitting a gargantuan amount of information. Why should we only present the information that happens to be in the same subset as what our primitive primate vision cones can process?
So, no, if you were to teleport to the nebula/galaxy that we're showing images for, it wouldn't look exactly like that to your human eyes. Instead, what you're seeing is what a god with perfect vision of the universe would see. You're seeing the universe for what it is, not just the part of it that is presented to humans.
1. Photoshop is really good at composing different (spectral) layers together. There are alternatives like PixInsight that are more geared toward deep-sky astronomy work, but I'm sure it's easier to hire people who can just take a Photoshop class.
There are many layers/masks involved for different filters. The filters accept or reject certain wavelengths of light and may be designed for specific elements on the periodic table. People often talk about hydrogen filters, oxygen filters, sulfur filters, etc. The color distinction you see is actually indicating elemental composition much of the time. I'm not sure what filters Webb is using.
2. Modern telescopes clean up their images by taking a "master dark frame": a stacked frame built from many frames taken with the lens cap on. The goal is to compute the noise profile of the sensor. I'm sure the darks for the sensors were determined before launch and are at the ready to correct and calibrate images coming from the telescope. Think of it as applying a bespoke noise filter for that sensor. It's a fast process to apply, but not to generate. If they really make the raws available, I'm sure we'll see more noise there.
3. The touch-up you see them doing is the removal of a hot pixel which survived the calibration process with the dark frame. No doubt space telescopes still get the errant hot pixel from some kind of particle or cosmic ray that makes it to the sensor and flips a bit (and is therefore not accounted for in the master dark). Happens all the time. They're probably keeping a map of where they're getting hot pixels.
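Steps 2 and 3 can be sketched in a few lines. The sensor model below is completely made up, but the logic (median-stack lens-cap frames into a master dark, subtract it, then replace outliers that survive) is the standard recipe:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated sensor: a fixed dark-current pattern plus per-exposure read noise.
dark_pattern = rng.random((64, 64)) * 5.0

def expose(scene):
    """One simulated exposure: scene + dark current + read noise."""
    return scene + dark_pattern + rng.normal(0.0, 0.5, scene.shape)

# 2. Master dark: median-stack many lens-cap-on (all-zero scene) frames.
zeros = np.zeros((64, 64))
master_dark = np.median([expose(zeros) for _ in range(32)], axis=0)

# Calibrate a light frame by subtracting the master dark.
scene = np.full((64, 64), 100.0)
light = expose(scene)
light[10, 10] += 5000.0                 # a cosmic-ray hit the dark can't know about
calibrated = light - master_dark

# 3. "Healing brush": replace surviving outliers with the global median,
#    using a robust threshold (median absolute deviation).
med = np.median(calibrated)
mad = np.median(np.abs(calibrated - med))
hot = np.abs(calibrated - med) > 10 * mad
calibrated[hot] = med
```

The same robust-threshold idea is what "healing brushing away some dots" amounts to, just done by hand in their case.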
I think you've answered your own question there, it's just PR images touched up by the media team without regard for anything. If there's any science being done it'll be done by matlab scripts using raw data as input.
I’m confused.. why would we expect some other image processing software to be better than Photoshop - a software package which has been the top of its class for ~30 years?
(And have been eagerly waiting for this moment for ages)
It just seems “unscientific” to just use Photoshop, and above all I'm curious about the set of rules and algorithms that enables them to decide which hue to pick for which region, levels, etc.
I had always assumed they were doing it completely mathematically though. Like collating spectrometry readings to know what elements were present where and figuring out the temperature for blackbody emission or something, or even just linearly transforming the raw data from the spectrum the telescope can receive to the visible spectrum.
Kinda disappointing if it's really just a paint-by-numbers Photoshop job to look nice.
I am not doing astronomy but Photoshop is useful to analyze any kind of image. You can manipulate contrast, apply all sorts of filters, map a color palette, etc... All that using a user-friendly interface. It is very mature software used by millions of people, for general purpose image work, no custom tool will come close.
I guess that scientists will also use specialized software for fine analysis, but it doesn't make Photoshop useless.
I use PixInsight myself, but you can do a lot on photoshop and a lot of people will take the output of PixInsight to touch up in photoshop.
Modern sensors are amazingly low noise. 10 years ago, I used to have to calibrate out darks, bias and flats just to remove my sensor noise. Now with modern CMOS sensors, people still do that but it isn't as necessary, and you can overcome much of the sensor noise by capturing enough data - and that's where JWST just dominates. There is nothing impeding the data and causing noise.
Shot noise is easily removed by integration. Read noise on modern sensors is almost nonexistent and easily calibrated out. Dark current is extremely low. Bias, hot and cold pixels are all things that can be removed with calibration and integration. With space telescopes, cosmic rays are probably the most annoying thing, but if you stack enough images they integrate right out.
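For the cosmic-ray case specifically, a sigma-clipped stack looks roughly like this (toy frames with a known noise level; real tools estimate the noise per pixel):

```python
import numpy as np

rng = np.random.default_rng(3)
true_signal = 50.0

# Ten sub-exposures of the same patch of sky, each with read noise ...
frames = true_signal + rng.normal(0.0, 2.0, size=(10, 32, 32))
# ... and a cosmic-ray hit in one random frame/pixel.
frames[3, 5, 5] += 10_000.0

# Sigma-clip: mask samples far from the per-pixel median, then average
# whatever survives in each pixel.
med = np.median(frames, axis=0)
deviation = np.abs(frames - med)
keep = deviation < 5 * 2.0            # 5 sigma, using the known noise level
stacked = np.where(keep, frames, np.nan)
result = np.nanmean(stacked, axis=0)
```

The hit is a huge outlier against the per-pixel median, so it gets masked and the remaining nine samples integrate right back to the true signal.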
But back to photoshop, the final images are just publishing art. I use pixinsight myself for all the heavy lifting, pixel math, integration and calibration but sometimes go out to photoshop for cleanup - especially for web/print.
I read a detailed interview with the person who does the enhancements a couple of days ago (can’t recall where, grrr).
He said:
A) there are two of them in the team doing the imaging
B) it doesn’t start with an image - it’s literally heaps of binary data that the scientists stitch together
C) he then does the colour overlay based on agreed norms (one colour per input frequency for consistency)
D) most of his “touch up” work is getting the colour gradient right between the brightest and dimmer objects - without this a lot of resolution would be lost (brights too bright, or dim not visible).
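Point D is usually done with a nonlinear stretch such as asinh, which is roughly linear for faint pixels and logarithmic for bright ones, so star cores don't clip while faint detail stays visible. A minimal sketch (the `soft` knee value is an arbitrary choice, not anything from the interview):

```python
import numpy as np

# Hypothetical linear data spanning a huge dynamic range:
# faint nebulosity near 10 counts, a bright star at 100,000.
data = np.array([10.0, 100.0, 1_000.0, 100_000.0])

def asinh_stretch(x, soft=50.0):
    """Compress dynamic range: ~linear below `soft`, ~logarithmic above it."""
    y = np.arcsinh(x / soft)
    return y / y.max()              # normalize to 0..1 for display

stretched = asinh_stretch(data)
```

Displayed linearly, the faint pixel would be 1/10,000 of the star's brightness (invisible); after the stretch it still sits a few percent up the brightness scale while the star stays at the top.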
I work with images on the other end of the scale regularly, and amongst scientists it's probably 50:50 Photoshop or ImageJ for editing images like that.
Unfortunately the NASA stream online was a disaster. Choppy video and it seemed like nobody had prepared anything. Also 720p in 2022...
Don't get me wrong, the images are amazing, but when small startups like Rocket Lab can have uninterrupted streams all the way to the orbit, but NASA stream from a studio looks more amateurish than your average 13-year-old Fortnite player on Twitch, it leaves a pretty bad impression.
Seriously it was such a mess. Lag aside, they had MULTIPLE cases of either someone's mic not being on, or someone with a hot mic after they were done whispering over the stream. Almost every single transition to scientists in other cities failed. This is really unfortunate because they hyped up this event big time. They announced it two weeks in advance, had a countdown, even had scientists do "reaction" videos to seeing the photos for the first time...
People often underestimate how insanely hard it is to put something like this together, but I'm surprised NASA did. It's not like this is the first time NASA has done a livecast.
I'm not sure if NASA or the White House directed that stream. I've seen much better-organized streams from NASA. It wasn't just technically flawed. It was late, abrupt, disjointed and the talking points appeared to be delivered by people that had little knowledge in the matter. I can't believe I saw that level of disorganization from our highest executive office.
Existential dread pro-tip: The Wikipedia page on "Ultimate fate of the universe" is a fantastic way to compel the question of why anything ultimately matters.
Coming up with personal answers to this is the ultimate character resolve exercise!
One of my favourite concepts from Douglas Adams was the Total Perspective Vortex, a form of punishment that would drive the victim insane by showing them the entire totality of existence and their place in it.
It's terrifying how alone and ephemeral we truly are: there are already places in our expanding universe that will never be reachable, even via communication, with any technology on any time scale (unless the universe's expansion reverses course). Any communication we may receive today will be from civilizations that ceased to exist thousands to billions of years ago. And humans will likely never travel outside the solar system.
Why existential dread? We're extremely lucky to be alive. That one sperm hit that one egg and we survived to now. That is extremely unlikely; each of us is one sperm out of hundreds of millions, so savor this existence!!
Indeed, it is truly cause to pause and step back. What's the name of that phenomenon common amongst astronauts when they see the earth from afar? I feel like our society could use more of that.
I made this page (posted in another thread yesterday) because I was rather underwhelmed by the .gif. I think the page shows in much better detail the difference between the telescopes' capabilities.
That is incorrect. The famous Hubble Ultra Deep Field image[1] took 11.3 days of imaging spread over four months (because of high demand to use Hubble). However, that is a different part of the sky. The Hubble image shown here was taken as part of RELICS[2], a survey of images to find good candidates for JWST to image, and was only exposed for 1.7 hours (5 orbits at ~20 minutes each), compared to JWST's exposure time of 12.5 hours. So comparisons between Hubble and JWST for that particular shot are not fair to Hubble.
Ok to be honest I know it's not cool to admit it, but so far it all looks the same.
If someone told me that the Webb picture was taken by Hubble I would not have thought about it for an extra second.
I'm hoping that in the future we see pictures of locations and environments that are mind-blowing to the average person who loves space.
The very rough equivalent in computer terms: a 1997 PC computing something and taking a week or so to do it and returning the answer: 3.
The same by the 2022 version: 3.14159265358979323846 in a few milliseconds.
Both the speed of the computation and the resolution of the result are what makes it impressive, not the fact that the nature of the universe does not change fundamentally when viewed across a longer span of time.
It is mind-blowing, but maybe not to the 'average person who loves space'. But if you stop for a bit longer to understand what it took to create that image and what it is that you are actually looking at (the age of the objects involved, their apparent size and the resolving power and temperature of the telescope required to make it) it becomes a lot more impressive.
The difference is in a) the details and b) the length of time the telescope has to gather light to get the photo. JWST got the photo in hours when Hubble took weeks, and there's easily 10x as many objects in the JWST shot.
JWST can thus observe much fainter and much more distant objects - galaxies billions of years old, exoplanets, etc., and it can do more of it.
These are just the initial "pretty pictures", processed to look nice and promoted as part of NASA's ongoing fundraising. The more valuable science payload is in the spectral data, which will tell us about the composition of these objects. Another exciting aspect of JWST is the IR instrument (NIRCam), which can see red-shifted wavelengths revealing much older objects from the early universe.
To me, the real 'shock and awe' will be when scientific papers are published which reveal new knowledge and deeper understanding of our universe. This will take some time although I'm sure the first papers are already racing toward pre-print.
I kind of agree with you, these pictures do look like more of the same. But that's okay, the real exciting stuff isn't going to be pretty pictures, it's going to be what astronomers and physicists are able to learn by peering deep into the origins of the universe. The pictures of galaxies are nice to look at, but the real ramifications of JWST will take years to play out.
This makes the Hubble telescope even more impressive in my eyes. Built 50 years ago with presumably 60 year old tech.
> Hubble telescope was funded and built in the 1970s by the United States space agency NASA with contributions from the European Space Agency. Its intended launch was 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. Hubble was finally launched in 1990.
The exoplanet analysis is what I'm most intrigued by. They're getting much more data than in the past on these.
Of course they went for an easy gas giant target first (it has lots of water, which is great), but those Earth-like planets in the Goldilocks zone are gonna be some of the most exciting stuff that comes out of this. Looking forward to it.
I don't know about Proxima Centauri b, but they'll be spending around 25% of "Cycle 1" (the first 6,000 hours of science) working on exoplanets, don't worry:
"Over the coming year, researchers will use spectroscopy to analyze the surfaces and atmospheres of several dozen exoplanets, from small rocky planets to gas- and ice-rich giants. Nearly one-quarter of Webb’s Cycle 1 observation time is allocated to studying exoplanets and the materials that form them." - https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-...
1150 light years away! Imagine how much more detail can be detected for stuff within 50 light years.
Really, they should already be building a second James Webb. I am sure even 10 of them would see 100% utilization for their whole lifetimes. I can only imagine what kind of needless political games are happening around prioritization of its time slots.
Or start working on next-gen, bigger, more resilient etc. It costs peanuts compared to any significant CERN upgrade and we have so much room to progress in astronomy (aka understanding our home, this universe) just by getting more data and resolution.
I fear there won't be any more JWSTs at all. People are already bitching about how much it cost and that all it does is make pretty pictures right here in this thread and there were many times that it came within a hair of having its budget slashed.
Super happy we have one JWST, and I hope fervently that it will outlast its original mission by a large fraction, every sign right now points in that direction.
A lot of the pictures have some bright stars with 6 long lens-flare-like points coming out of them in a consistent pattern. Is that because of the hexagonal shape of JWST's lenses/mirrors?
That's quite exhaustive, but it makes me wonder why isn't anything done to correct for that. Like for example instead of taking one 15h exposure, why not take three 5h exposures and roll the telescope 5 degrees in between, then median filter out the artefacts?
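The roll-and-median idea can be sketched with toy frames. This only illustrates the geometry: in practice the spikes carry real flux from the star, and rolling the telescope costs pointing overhead, so it isn't free:

```python
import numpy as np

size = 101
c = size // 2

def aligned_frame(direction):
    """One exposure after de-rotation to a common sky frame: flat sky
    plus a diffraction spike whose angle followed the telescope's roll."""
    frame = np.full((size, size), 10.0)          # flat "sky"
    if direction == "horizontal":
        frame[c, :] = 1000.0
    elif direction == "vertical":
        frame[:, c] = 1000.0
    else:                                        # diagonal roll
        idx = np.arange(size)
        frame[idx, idx] = 1000.0
    return frame

frames = [aligned_frame(d) for d in ("horizontal", "vertical", "diagonal")]

# Per-pixel median: any pixel hit by a spike in only one exposure
# recovers the sky value from the other two.
combined = np.median(frames, axis=0)
```

Only the star itself, where all the spikes cross, survives the median; everywhere else the spike is voted out, which is exactly why the median works for cosmic rays too.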
I think you also had a similar comment and linked the same article under the previous topic about JWST's first image?
The article is very informative, but my read of it is different: the three major "spikes" are in fact due to the hexagonal shape of the mirrors and how they're laid out. The struts also add three spikes, but: two of them coincide with the mirror spikes, while one of them (from the vertical strut) is visible on its own, and causes the smaller perfectly horizontal spike.
The image I'm basing this on is in your article with a caption starting from "The point spread function for the James Webb Space Telescope" [1]
From the other comments, I understand why it's there, but i wish they would photoshop them out.
The images take on a more synthetic and fake quality when the technical physical man-made constraints of our telescope get projected out onto the natural very much NON-man-made universe.
Yeah, it's the hexagonal shape. The objects with the 6 diffraction spikes are overexposed compared to the rest of the objects in the picture, so they're generally brighter and/or closer objects.
I really appreciate the work of the US Air Force Cambridge Research Laboratories for creating HITRAN. HITRAN is a molecular spectroscopic database used to identify molecules in gases and atmospheres. It is the standard archive for transmission and radiance calculations. Without their groundwork we would not be as good at understanding planetary atmospheres.
Stephan's Quintet (NIRCam and MIRI Composite Image):
https://stsci-opo.org/STScI-01G7DB1FHPMJCCY59CQGZC1YJQ.png
Southern Ring Nebula (NIRCam and MIRI Images Side by Side):
https://stsci-opo.org/STScI-01G79R28V7S4AXDN8NG5QCPGE3.png
“Cosmic Cliffs” in the Carina Nebula (NIRCam Image):
https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png
Webb's First Deep Field (NIRCam Image):
https://stsci-opo.org/STScI-01G7DDBW5NNXTJV8PGHB0465QP.png
Exoplanet WASP-96 b (NIRISS Transmission Spectrum):
https://stsci-opo.org/STScI-01G7NBXDHYYSVBP2M476PRGG3A.png
> https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png
Is this for real?! It looks like it came right out of a Sci-Fi movie/book. Could anyone explain how much of this is post-editing magic?
Anyway, that looks like science fiction because science fiction borrowed that look from astronomy. https://en.wikipedia.org/wiki/Nebula
The coloring is usually done to indicate different temperatures or wavelengths detected, so it can be a bit misleading.
i.e. If we were moving at the same velocity as the Nebula, looking with our own eyes.
i.e. What it would look like "in real life if I actually went there"
Not much weirder than looking at an X-ray image.
More here: https://en.wikipedia.org/wiki/Absorption_spectroscopy
Much more about this particular graph here: https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-...
Here's a mental model that I found particularly beneficial:
All electromagnetic radiation is the same. In the sense that every proton/neutron is the same. But adding a few more protons/neutrons creates an entirely new element, with entirely new chemical properties. From something simple come incredibly new powerful behaviours. So just as Iron is massively different from Plutonium, Microwaves are massively different from Gamma rays.
What we call "visible light" is not particularly special, except to us, and our specific human biology. It feels more real because it's visible to us, but it's not on the grand scale of the universe.
What we're observing through these telescopes isn't a dog chasing a ball. We're seeing stuff billions of light years away, millions of light years in size, billions of years ago. Passing by trillions of other stars and planets on the way.
These objects are emitting a gargantuan amount of information. Why should we only present the information that happens to be in the same subset as what our primitive primate vision cones can process?
So, no, if you were to teleport to the nebula/galaxy that we're showing images for, it wouldn't look exactly like that to your human eyes. Instead, what you're seeing is what a god with perfect vision of the universe would see. You're seeing the universe for what it is, not just the part of it that is presented to humans.
1. Photoshop is really good at compositing different (spectral) layers together. There are alternatives, like PixInsight, that are more geared toward deep-sky astronomy work, but I'm sure it's easier to hire people who can just take a Photoshop class.
There are many layers/masks involved for the different filters. The filters accept or reject certain wavelengths of light and may be designed for specific elements on the periodic table - people often talk about hydrogen filters, oxygen filters, sulfur filters, etc. The colour distinction you see actually indicates elemental composition much of the time. I'm not sure which filters Webb is using.
2. Modern telescopes clean up their images using a "master dark frame": a stack of many frames taken with the lens cap on. The goal is to compute the noise profile of the sensor. I'm sure the darks for the sensors were determined before launch and are at the ready to correct and calibrate images coming from the telescope. Think of it as a bespoke noise filter for that sensor: fast to apply, but slow to generate. If they really make the raws available, I'm sure we'll see more noise there.
3. The touch-up you see them doing is the removal of a hot pixel which survived the calibration against the dark frame. No doubt space telescopes still get errant hot pixels when some unwanted particle or cosmic ray makes it to the sensor and flips a bit (and is therefore not accounted for in the master dark). Happens all the time. They're probably keeping a map of where they're getting hot pixels.
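As a toy sketch of points 2 and 3 (all numbers made up, plain NumPy standing in for the real calibration pipeline): build a master dark from stacked lens-cap frames, subtract it from a light frame, then median-replace a cosmic-ray hit that survives calibration.

```python
import numpy as np

rng = np.random.default_rng(42)
H, W = 64, 64

# Hypothetical sensor: a fixed per-pixel dark-current pattern, plus fresh
# random read noise in every frame.
dark_pattern = rng.uniform(0.0, 5.0, size=(H, W))

def read_noise():
    return rng.normal(0.0, 1.0, size=(H, W))

# Master dark: average many lens-cap-on frames. Averaging preserves the
# fixed pattern while suppressing the random read noise.
master_dark = np.mean([dark_pattern + read_noise() for _ in range(100)], axis=0)

# Light frame: real signal + the same fixed pattern + read noise, plus one
# cosmic-ray hit the master dark cannot know about.
signal = np.zeros((H, W))
signal[30:34, 30:34] = 200.0            # a fake "star"
light = signal + dark_pattern + read_noise()
light[10, 10] += 5000.0                 # cosmic-ray hit

calibrated = light - master_dark        # removes the fixed-pattern noise

def despike(img, threshold=1000.0):
    """Replace pixels that differ wildly from their 3x3 neighbourhood median."""
    padded = np.pad(img, 1, mode="edge")
    neigh = np.stack([padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                      for dy in range(3) for dx in range(3) if (dy, dx) != (1, 1)])
    med = np.median(neigh, axis=0)
    hot = np.abs(img - med) > threshold
    return np.where(hot, med, img), hot

clean, hot = despike(calibrated)
print(int(hot.sum()))                   # only the cosmic-ray pixel is flagged
```

The real pipeline is of course far more involved (per-detector darks, flats, bias), but the shape of the operation is the same: a slow-to-build, fast-to-apply correction, plus outlier rejection for hits the dark can't predict.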
(And have been eagerly waiting for this moment for ages)
It just seems "unscientific" to use Photoshop, and above all I'm curious about the set of rules and algorithms that lets them decide which hue to pick for which region, which levels, etc.
Kinda disappointing if it's really just paint-by-numbers Photoshop to look nice.
I guess that scientists will also use specialized software for fine analysis, but it doesn't make Photoshop useless.
Modern sensors are amazingly low noise. Ten years ago I used to have to calibrate out darks, bias, and flats just to remove my sensor noise. With modern CMOS sensors people still do that, but it isn't as necessary, and you can overcome much of the sensor noise by capturing enough data - and that's where JWST just dominates: there is nothing impeding the data and causing noise.
Shot noise is easily removed by integration; read noise on modern sensors is almost nonexistent and easily calibrated out; dark current is extremely low; and bias, hot, and cold pixels can all be removed with calibration and integration. With space telescopes, cosmic rays are probably the most annoying thing, but if you stack enough images they integrate right out.
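The "shot noise integrates out" point can be sketched numerically (toy numbers, Poisson photon counts): stacking N frames improves the signal-to-noise ratio roughly by sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)

true_flux = 9.0        # photons per pixel per exposure (made up)
n_pixels = 100_000

def snr(n_frames):
    """Stack n_frames Poisson-noisy exposures and measure signal-to-noise."""
    frames = rng.poisson(true_flux, size=(n_frames, n_pixels))
    stacked = frames.mean(axis=0)
    return stacked.mean() / stacked.std()

# A single frame has SNR ~ flux / sqrt(flux) = 3; stacking 100 frames
# should land close to sqrt(100) times better, i.e. around 30.
print(round(snr(1), 1), round(snr(100), 1))
```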
But back to photoshop, the final images are just publishing art. I use pixinsight myself for all the heavy lifting, pixel math, integration and calibration but sometimes go out to photoshop for cleanup - especially for web/print.
He said:
A) There are two of them on the team doing the imaging.
B) It doesn't start with an image - it's literally heaps of binary data that the scientists stitch together.
C) He then does the colour overlay based on agreed norms (one colour per input frequency, for consistency).
D) Most of his "touch up" work is getting the colour gradient right between the brightest and dimmer objects - without this a lot of resolution would be lost (brights too bright, or dims not visible).
Hope this helps…
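A minimal NumPy sketch of the colour overlay and gradient steps (filter names and numbers are placeholders, not the actual pipeline): assign each filter frame to a colour channel, and apply an asinh stretch so the brightest objects don't blow out while the dim ones stay visible.

```python
import numpy as np

rng = np.random.default_rng(1)

def asinh_stretch(channel, soften=0.02):
    """Compress a huge dynamic range: bright cores stay bounded, faint wisps get lifted."""
    x = channel / channel.max()
    return np.arcsinh(x / soften) / np.arcsinh(1.0 / soften)

# Three hypothetical monochrome filter frames (stand-ins for e.g. F444W,
# F200W, F090W) whose pixel values span several orders of magnitude.
frames = {name: rng.exponential(1.0, size=(64, 64)) ** 3
          for name in ("long", "mid", "short")}

# One colour per input filter, longest wavelength mapped to red.
rgb = np.dstack([asinh_stretch(frames["long"]),    # red
                 asinh_stretch(frames["mid"]),     # green
                 asinh_stretch(frames["short"])])  # blue

# A plain linear scaling leaves most pixels near black; the stretch lifts them.
linear = frames["long"] / frames["long"].max()
print(float(np.median(linear)), float(np.median(asinh_stretch(frames["long"]))))
```

Without the stretch the brights are too bright or the dims invisible, exactly as described in point D.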
Here's the API to access the boring original data.
Don't get me wrong, the images are amazing, but when small startups like Rocket Lab can have uninterrupted streams all the way to the orbit, but NASA stream from a studio looks more amateurish than your average 13-year-old Fortnite player on Twitch, it leaves a pretty bad impression.
People often underestimate how insanely hard it is to put something like this together, but I'm surprised NASA did - it's not like this is the first time NASA has done a livecast.
Coming up with personal answers to this is the ultimate character resolve exercise!
https://www.youtube.com/watch?v=MBRqu0YOH14
consciousness is a hell of a drug
edit: Seems to be called the overview effect [0]
[0]: https://en.wikipedia.org/wiki/Overview_effect
https://www.facebook.com/photo.php?fbid=10159217085846758&se...
https://blog.wolfd.me/hubble-jwst/
(If you're on mobile, you should be able to zoom in and still use the slider)
https://johnedchristensen.github.io/WebbCompare/
[1]https://esahubble.org/images/heic0611b/
[2]https://archive.stsci.edu/prepds/relics/
I'm hoping that in the future we see pictures of locations and environments that are mind-blowing to the average person who loves space.
The same by the 2022 version: 3.14159265358979323846 in a few milliseconds.
Both the speed of the computation and the resolution of the result are what makes it impressive, not the fact that the nature of the universe does not change fundamentally when viewed across a longer span of time.
It is mind-blowing, but maybe not to the 'average person who loves space'. But if you stop for a bit longer to understand what it took to create that image and what it is that you are actually looking at (the age of the objects involved, their apparent size and the resolving power and temperature of the telescope required to make it) it becomes a lot more impressive.
JWST can thus observe much fainter and much more distant objects - galaxies billions of years old, exoplanets, etc., and it can do more of it.
To me, the real 'shock and awe' will be when scientific papers are published which reveal new knowledge and deeper understanding of our universe. This will take some time although I'm sure the first papers are already racing toward pre-print.
> Hubble telescope was funded and built in the 1970s by the United States space agency NASA with contributions from the European Space Agency. Its intended launch was 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. Hubble was finally launched in 1990.
Of course they went for an easy gas giant target first (it has lots of water, which is great), but those Earth-like planets in the Goldilocks zone are gonna be some of the most exciting stuff that comes out of this. Looking forward to it.
https://en.wikipedia.org/wiki/Proxima_Centauri_b
"Over the coming year, researchers will use spectroscopy to analyze the surfaces and atmospheres of several dozen exoplanets, from small rocky planets to gas- and ice-rich giants. Nearly one-quarter of Webb’s Cycle 1 observation time is allocated to studying exoplanets and the materials that form them." - https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-...
An obvious target for the coronagraph for regular imaging, but there's no way to get a transmission spectrum of its atmosphere.
Really, they should already be building a second James Webb. I am sure even ten of them would see 100% utilization for their whole lifetimes. I can only imagine what kind of needless political games are happening around the prioritization of its time slots.
Or start working on next-gen, bigger, more resilient etc. It costs peanuts compared to any significant CERN upgrade and we have so much room to progress in astronomy (aka understanding our home, this universe) just by getting more data and resolution.
Super happy we have one JWST, and I fervently hope it will outlast its original mission by a large margin; every sign right now points in that direction.
Hubble shows four spikes because it has two struts.
https://bigthink.com/starts-with-a-bang/james-webb-spikes/
https://www.universetoday.com/155062/wondering-about-the-6-r...
The article is very informative, but my read of it is different: the three major "spikes" are in fact due to the hexagonal shape of the mirrors and how they're laid out. The struts also add three spikes, but: two of them coincide with the mirror spikes, while one of them (from the vertical strut) is visible on its own, and causes the smaller perfectly horizontal spike.
The image I'm basing this on is in your article with a caption starting from "The point spread function for the James Webb Space Telescope" [1]
[1]: https://bigthink.com/wp-content/uploads/2022/03/FOFC8ZPX0AIB...
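The strut-to-spike relationship falls straight out of Fraunhofer diffraction: the point spread function is the squared magnitude of the Fourier transform of the aperture, so a thin vertical strut spreads light into a horizontal spike. A toy sketch (all sizes arbitrary, nothing like the real telescope geometry):

```python
import numpy as np

N = 256
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

# Circular aperture with one thin vertical support strut blocking it.
aperture = (x**2 + y**2 < (N // 4)**2).astype(float)
aperture[:, N // 2 - 1:N // 2 + 1] = 0.0

# Fraunhofer: the PSF is |FFT(aperture)|^2, zero frequency shifted to the centre.
psf = np.abs(np.fft.fftshift(np.fft.fft2(aperture)))**2
psf /= psf.max()

# Far from the core, a pixel on the horizontal axis sits on the spike cast
# by the vertical strut; an off-axis diagonal pixel does not.
horiz = psf[N // 2, N // 2 + 40]
diag = psf[N // 2 + 40, N // 2 + 40]
print(horiz > 10 * diag)
```

Replace the disk with a hexagon-tiled mirror and you get the extra spikes the article describes.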
The images take on a more synthetic and fake quality when the technical, physical, man-made constraints of our telescope get projected out onto the natural, very much non-man-made universe.
Look at https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png and observe the incredible entropy in the nebula itself. The consistent, perfectly straight lines of each star's diffraction spikes are jarring in the image.
but we should edit them :)
https://www.youtube.com/watch?v=UBcc3vpJTAU
https://hitran.org/ (free after registration)
https://hitran.org/media/refs/HITRAN-2020.pdf
HAPI (programming interface manual) https://hitran.org/static/hapi/hapi_manual.pdf
Youtube tutorials https://www.youtube.com/watch?v=NiKuigtFahk&list=PLqOG3cBizT...
It is very easy to use and might help you understand the WASP-96 b transmission spectrum. https://stsci-opo.org/STScI-01G7NBXDHYYSVBP2M476PRGG3A.png
https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_...
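Tying the HITRAN data to the transmission spectrum: the underlying physics is the Beer-Lambert law, T = exp(-σnL). Where the water cross-section σ (what HITRAN tabulates) is large, more of the starlight grazing the planet's limb is absorbed, so the transit looks deeper at that wavelength. A sketch with purely illustrative numbers, not real WASP-96 b values:

```python
import math

def transmitted_fraction(sigma_cm2, n_per_cm3, path_cm):
    """Beer-Lambert: fraction of light surviving a path through an absorber."""
    return math.exp(-sigma_cm2 * n_per_cm3 * path_cm)

# On a strong water line the cross-section dwarfs the nearby continuum, so
# almost no light gets through; off the line, most of it does. That
# wavelength-to-wavelength difference is the transmission spectrum.
on_line = transmitted_fraction(1e-20, 1e13, 1e9)    # strong absorption line
off_line = transmitted_fraction(1e-24, 1e13, 1e9)   # continuum
print(on_line, off_line)
```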
https://webbtelescope.org/contents/media/images/2022/034/01G...
https://webbtelescope.org/contents/media/images/2022/034/01G...
https://webbtelescope.org/contents/media/images/2022/035/01G...
https://webbtelescope.org/contents/media/images/2022/034/01G...