chaps · 4 years ago
Direct links --

Stephan's Quintet (NIRCam and MIRI Composite Image):

https://stsci-opo.org/STScI-01G7DB1FHPMJCCY59CQGZC1YJQ.png

Southern Ring Nebula (NIRCam and MIRI Images Side by Side):

https://stsci-opo.org/STScI-01G79R28V7S4AXDN8NG5QCPGE3.png

“Cosmic Cliffs” in the Carina Nebula (NIRCam Image):

https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png

Webb's First Deep Field (NIRCam Image):

https://stsci-opo.org/STScI-01G7DDBW5NNXTJV8PGHB0465QP.png

Exoplanet WASP-96 b (NIRISS Transmission Spectrum):

https://stsci-opo.org/STScI-01G7NBXDHYYSVBP2M476PRGG3A.png

j0e1 · 4 years ago
> “Cosmic Cliffs” in the Carina Nebula (NIRCam Image):

> https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png

Is this for real?! It looks like it came right out of a Sci-Fi movie/book. Could anyone explain how much of this is post-editing magic?

sbierwagen · 4 years ago
Image stacking to remove noise and optical artifacts, careful use of color filters to enhance contrast and pull out detail. The press release says it used Red: F444W, Orange: F335M, Yellow: F470N, Green: F200W, Cyan: F187N, Blue: F090W. The N filters are narrowband. F470N is only 54 nanometers wide: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

Almost all the light in this image is way off the red end of the human visual spectrum, of course. The shortest wavelength filter is F090W which has a center wavelength of 902nm, about the same color as the light coming out of a TV remote infrared LED, which is barely visible in pure darkness.

This is what it looks like through a film SLR, without the detail enhancing filters: http://www.phys.ttu.edu/~ozprof/3372f.htm Here's a 20 minute exposure through a telescope: http://www.phys.ttu.edu/~ozprof/3372fk.jpg Maybe what you would see with your own eyes through binoculars at a dark site well away from city lights. A dim red smudge, hints of finer detail.
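For illustration, here is a toy sketch of how such filter channels might be combined into a false-color composite. The filter-to-color mapping is from the press release quoted above, but the color values, weights, and image data are invented stand-ins, not the actual STScI processing:

```python
import numpy as np

# Toy false-color composite: each filter image is a 2D intensity array; give
# each filter a display color, tint, and sum. The filter-to-color mapping is
# from the press release; the color values and image data are invented.
FILTER_COLORS = {
    "F090W": (0.0, 0.0, 1.0),  # blue
    "F187N": (0.0, 1.0, 1.0),  # cyan
    "F200W": (0.0, 1.0, 0.0),  # green
    "F470N": (1.0, 1.0, 0.0),  # yellow
    "F335M": (1.0, 0.5, 0.0),  # orange
    "F444W": (1.0, 0.0, 0.0),  # red
}

def composite(channels):
    """channels: dict of filter name -> 2D intensity array (all the same shape)."""
    h, w = next(iter(channels.values())).shape
    rgb = np.zeros((h, w, 3))
    for name, img in channels.items():
        rgb += img[:, :, None] * np.array(FILTER_COLORS[name])  # tint and sum
    return np.clip(rgb / rgb.max(), 0.0, 1.0)  # normalize for display

channels = {name: np.random.rand(64, 64) for name in FILTER_COLORS}
img = composite(channels)
print(img.shape)  # (64, 64, 3)
```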

Retric · 4 years ago
It’s real light, just color shifted: the JWST is designed to look at very distant and thus red-shifted objects. The nebula, however, is much closer than that.

Anyway, that looks like science fiction because science fiction borrowed that look from astronomy. https://en.wikipedia.org/wiki/Nebula

The5thElephant · 4 years ago
It's all real, but you would not be able to see it with your bare eyes even if you were relatively close to the nebula. The world around us would look very different if our eyes could perceive more of the infrared and ultraviolet spectrum.

The coloring is usually done to indicate different temperatures or wavelengths detected, so it can be a bit misleading.

LargoLasskhyfv · 4 years ago
I'm waiting for https://en.wikipedia.org/wiki/Pillars_of_Creation made by Webb. It's not the same object, but similarly awesome. Maybe the article gives you a useful overview of how different telescopes 'see', and how that is translated into pictures for us.
randyrand · 4 years ago
Does anyone have a simulated image of what it would look like in visible light without red shifting?

i.e. If we were moving at the same velocity of the Nebula looking with our own eyes.

i.e. What it would look like "in real life if I actually went there"

BurningFrog · 4 years ago
Much of it is in infrared light we can't see, so it's "transposed" to the visible spectrum.

Not much weirder than looking at an X-ray image.

est · 4 years ago
I am wondering what size the "cosmic dust" grains are. They look like the size of stars but without emitting light?
pwned1 · 4 years ago
My god, look at the background of the first image at full scale.
jcims · 4 years ago
I really wish astronomers would come up with (or adopt) a standard mechanism for indicating the field of view of an image. The scale of this one in the night sky is much larger than the deep field one.
dredmorbius · 4 years ago
In the Ring Nebula image, the two galaxies just kind of casually hanging out on the left side (just above midline), one square on, the other edge on, is pretty impressive.

There are a few others to be found (I suspect image duration is much shorter than for the Deep Field).

Even as far-from-primary-interest-objects, amazing detail.

rootusrootus · 4 years ago
Thanks for the direct links!

> Webb's First Deep Field (NIRCam Image)

Is this image distorted in any way at all? It feels like the galaxies are somehow oriented around a center spot. Not all of them, but enough to give the image a distorted feeling. Probably it's just my mind pattern matching against something that doesn't really exist.

bcherry · 4 years ago
Something missing from this discussion that's worth pointing out:

This image shows profound "gravitational lensing", which you may know. But what you might not know is that this is precisely _why_ they chose to photograph it.

This galaxy cluster (SMACS 0723) may be the most well known and powerful gravitational lens we have observed. The galaxies shown distorted around the edges are actually behind the lens, but are magnified by it. This means we can see even farther in this region of space than normal, because we compound the power of the JWST with the power of this natural lens.

It all adds up to providing the "deepest" view of the universe yet, allowing us to see galaxies at a distance of more than 13.2B lightyears. This lets us see structures formed in the infancy of the universe, that wouldn't be possible looking at most other points in the sky, or even anywhere else in this deep field besides the perimeter of the lens in the middle.
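To get a rough sense of the scale of such a lens, the Einstein radius of a cluster can be estimated from theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s)). The mass and distances below are illustrative round numbers, not SMACS 0723's actual measured values:

```python
import math

# Back-of-envelope Einstein radius for a cluster-scale gravitational lens.
# All inputs are illustrative round numbers, not measured values.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # megaparsec, m

M = 1e14 * M_SUN     # cluster-scale lensing mass (assumed)
D_l = 1500 * MPC     # observer-to-lens distance (assumed)
D_s = 4000 * MPC     # observer-to-source distance (assumed)
D_ls = D_s - D_l     # lens-to-source (naive; cosmological distances don't
                     # really subtract like this, but it's fine for a sketch)

theta_e = math.sqrt(4 * G * M / c**2 * D_ls / (D_l * D_s))  # radians
arcsec = math.degrees(theta_e) * 3600
print(f"Einstein radius ~ {arcsec:.0f} arcsec")
```

This lands in the tens-of-arcseconds range, which is why the magnified arcs visibly ring the cluster core in the deep field.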

palmtree3000 · 4 years ago
Gravitational lensing. From the description[0]:

Other features include the prominent arcs in this field. The powerful gravitational field of a galaxy cluster can bend the light rays from more distant galaxies behind it, just as a magnifying glass bends and warps images. Stars are also captured with prominent diffraction spikes, as they appear brighter at shorter wavelengths.

[0] https://webbtelescope.org/contents/news-releases/2022/news-2...

coldpie · 4 years ago
Yes, it is distorted by a gravitational lensing effect of a massive galaxy cluster. Each image has a short discussion at this link, and a longer discussion linked via "Learn more about this image" for even more info: https://www.nasa.gov/webbfirstimages

mrandish · 4 years ago
Thanks for posting these links! It was frustrating that the main NASA PR pages linked photos that were 1280x720. I guess that's to protect their bandwidth costs since much of the general public is probably viewing on mobile anyway and higher res would not only be slower but wasted bits.

I just wish NASA had provided a link at the bottom of their low-res image pages to intermediate sized images (~4k) for desktop viewing.

coldpie · 4 years ago
I believe this page has what you want: https://www.nasa.gov/webbfirstimages Click on the image, twice, to get to a large-but-not-crazy resolution photo.
epistasis · 4 years ago
Mobile is actually a great platform to get high resolution, since you can zoom in really easily and navigate the full image.

However, after spending 10 minutes on mobile this morning, I was unable to find any high resolution images, and many images had that anti-pattern of a BS HTML gallery that severely restricts interacting with the image.

Wowfunhappy · 4 years ago
Not that I'm complaining since I hate jpeg compression, but you'd think that if they were concerned about bandwidth, they wouldn't use png...
yread · 4 years ago
you can also download full-res (even uncompressed) images from the ESA site (they developed two of the IR instruments)
vishnugupta · 4 years ago
If it helps others like me, I found it easier to download the images through wget and then open the local file through browser.
samstave · 4 years ago
May anyone please ELI5 how to interpret the WASP-96 water spectrum graph above?
coldpie · 4 years ago
Elements absorb light at certain frequencies. Given a spectral analysis of the light that passes through the atmosphere and another of the light that doesn't pass through the atmosphere, you can take the difference and see what frequencies were absorbed by the atmosphere. This tells you what elements make up the atmosphere. The H2O sections in the graph are the light frequencies that are absorbed by water molecules ("amount of light blocked" on the Y axis), indicating that the atmosphere contains water.

More here: https://en.wikipedia.org/wiki/Absorption_spectroscopy

Much more about this particular graph here: https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-...
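A toy version of the differencing idea described above, with all numbers invented (real transmission spectra come from careful transit photometry, not this simplification):

```python
import numpy as np

# Compare the star's light out of transit vs. in transit; wavelengths where
# extra light goes missing were absorbed by the planet's atmosphere.
wavelengths = np.linspace(0.6, 2.8, 200)          # microns
out_of_transit = np.full_like(wavelengths, 1.0)   # normalized stellar flux

# Pretend water absorbs around 1.4 and 1.9 microns (real H2O bands, toy depths).
absorption = (0.0020 * np.exp(-((wavelengths - 1.4) / 0.1) ** 2)
              + 0.0015 * np.exp(-((wavelengths - 1.9) / 0.1) ** 2))
in_transit = out_of_transit * (1.0 - 0.01 - absorption)  # 1% dip + bands

blocked = 1.0 - in_transit / out_of_transit  # "amount of light blocked"
peak = float(wavelengths[np.argmax(blocked)])
print(f"strongest extra absorption near {peak:.2f} microns")
```

The bumps in `blocked` at the water bands are the analogue of the H2O features labeled in the WASP-96 b graph.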

ndm000 · 4 years ago
I know nothing about optics. What is the effect that causes the 6 or 8 points of light to come off of bright objects? Does it have to do with the hex-shaped mirrors on JWST?
arianvanp · 4 years ago
it's called a point spread function, and it's an artifact that occurs in any mirror telescope. https://bigthink.com/starts-with-a-bang/james-webb-spikes/ explains it pretty well.
PavleMiha · 4 years ago
Yes, and two of the trusses to the secondary mirror also cause the two additional horizontal lines. The Hubble Space Telescope gets 4 lines because of its 4 trusses.
supernovae · 4 years ago
The simple answer is that it's the physics of the scope: the support arms cause diffraction spikes. Hubble has them too, but with a smaller mirror and a different support arrangement. They're super common on consumer scopes such as RCs, Newts, and big tube scopes that aren't RASAs.
Keyframe · 4 years ago
Aperture shape, so in this case I guess the answer is yes?
anshumankmr · 4 years ago
Anyone have the gigabyte size image of this?
whiteboardr · 4 years ago
Watching the livestream, I was more than surprised that color correction actually happens in Photoshop.

Also, there seem to be multiple layer-masks involved for specific regions and objects.

I get that you can shift and composite color, based on hue, apply filters etc, but: Photoshop?

Curious if anyone can explain whether what we see is actual science or some touched-up version of objects in our universe.

p.s.: What struck me the most is the absence of noise, especially in the deep field photo. Hubble took many exposures over weeks, which normally allows for reliable noise reduction; Webb took some shots over the course of hours, and there's hardly any noise to see. The weirdest part was seeing them just "healing-brush" away some dots. What is the decision process for altering images like that?

(edit for typos)

jacquesm · 4 years ago
The difference between 'actual science' and 'some touched up version of objects in our universe' is smaller than you might think: no matter how good your eyes, if there were no frequency shift involved you would not be able to perceive the image other than as an array of numbers. To facilitate your consumption of the data, it has to be frequency shifted, and the easiest way to do this is to map the IR intensity to a range of colors that are graded the same way we grade false-color images from other sources: higher intensities get brighter colors and lower intensities darker colors. Because not all of these mappings are equally pleasing to the eye and/or enlightening, Photoshop is actually a pretty good choice, because it allows for dynamic experimentation with what brings out the various details in the best way.

If you would rather stare at an array of numbers or a non colorized version (black-and-white) it would be much harder to make out the various features.

So think of it as a visual aid, rather than an arts project or a way to falsify the data: the colorization is part of the science, specifically: how to present the data best.
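A minimal sketch of that intensity-to-color mapping, with an invented two-color ramp and random stand-in data (not the actual grading used for the released images):

```python
import numpy as np

# A raw detector frame is just an array of numbers; mapping intensity onto a
# color ramp is what makes it viewable. Ramp endpoints and data are invented.
def intensity_to_rgb(frame):
    """Map a 2D array of IR intensities onto a dark-blue -> bright-orange ramp."""
    norm = (frame - frame.min()) / (frame.max() - frame.min() + 1e-12)
    dark = np.array([0.0, 0.0, 0.2])    # low intensity: near-black blue
    bright = np.array([1.0, 0.7, 0.2])  # high intensity: bright orange
    return dark + norm[:, :, None] * (bright - dark)

frame = np.random.rand(32, 32)  # stand-in for raw detector data
rgb = intensity_to_rgb(frame)
print(rgb.shape)  # (32, 32, 3)
```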

whiteboardr · 4 years ago
Thanks, but how is the sausage made then?

Guess that’s my main question.

I get that the aquired data needs to be transformed in a way so we get an image that depicts a reality we can visually process.

I honestly thought there were some tools in NASA's imaging group that, based on scientific rules, pump out an image that is correct. Seeing Photoshop in use left me wondering…

I get that the investment needs to be “sold” too, would be sad though if we reached fashion-ad conduct for science…

And don’t get me wrong: I am in awe and more than happy this thing finally gets put to use.

ricardobeat · 4 years ago
What would these look like, if you could point a ground telescope at the exact same spot? How much light is in the visible spectrum?
deanCommie · 4 years ago
> actual science or some touched up version of objects in our universe.

Here's a mental model that I found particularly beneficial:

All electromagnetic radiation is the same. In the sense that every proton/neutron is the same. But adding a few more protons/neutrons creates an entirely new element, with entirely new chemical properties. From something simple come incredibly new powerful behaviours. So just as Iron is massively different from Plutonium, Microwaves are massively different from Gamma rays.

What we call "visible light" is not particularly special, except to us, and our specific human biology. It feels more real because it's visible to us, but it's not on the grand scale of the universe.

What we're observing through these telescopes isn't a dog chasing a ball. We're seeing stuff billions of light years away, millions of light years in size, billions of years ago. Passing by trillions of other stars and planets on the way.

These objects are emitting a gargantuan amount of information. Why should we only present the information that happens to be in the same subset as what our primitive primate vision cones can process?

So, no, if you were to teleport to the nebula/galaxy that we're showing images for, it wouldn't look exactly like that to your human eyes. Instead, what you're seeing is what a god with perfect vision of the universe would see. You're seeing the universe for what it is, not just the part of it that is presented to humans.

penneyd · 4 years ago
Very nicely stated.
dmead · 4 years ago
amateur astronomer here. (https://www.instagram.com/mead_observatory/)

1. photoshop is really good at composing different (spectral) layers together. there are alternatives to this, like PixInsight, that are more geared toward deep-sky astronomy work, but I'm sure it's easier to hire people who can just take a Photoshop class.

there are many layers/masks involved for different filters. the filters accept or reject certain wavelengths of light and may be designed for specific elements on the periodic table. people often talk about hydrogen filters, oxygen filters, sulfur filters, etc. the color distinction you see is actually indicating elemental composition much of the time. I'm not sure what filters Webb is using.

2. modern telescopes clean up their images by taking a "master dark frame" that is a stacked frame of many frames taken with the lens cap on. The goal there is to compute the noise profile of the sensor. I'm sure before launch the darks for the sensors were determined and are at the ready to correct and calibrate images coming from the telescope. think of it as applying a bespoke noise filter for that sensor. It's a fast process to apply it, but not to generate it. If they really make the raws available I'm sure we'll see more noise there.

3. the touch-up you see them doing is the removal of a hot pixel which survived the calibration process with the dark frame. no doubt space telescopes still get the occasional errant hot pixel when some kind of particle or cosmic ray they don't want makes it to the sensor and flips a bit (and is therefore not accounted for in the master dark). happens all the time. they're probably keeping a map of where they're getting hot pixels.
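The dark-frame calibration in point 2 can be sketched in a few lines. This is a toy model with synthetic data, not the actual JWST pipeline:

```python
import numpy as np

# Stack many frames taken with the shutter closed into a "master dark", then
# subtract it from the science frame to remove the sensor's fixed pattern.
rng = np.random.default_rng(0)
shape = (64, 64)
fixed_pattern = rng.exponential(5.0, shape)      # per-pixel dark current

# 20 dark frames: the fixed pattern plus random read noise each time.
darks = [fixed_pattern + rng.normal(0.0, 1.0, shape) for _ in range(20)]
master_dark = np.median(darks, axis=0)           # median rejects outliers

light = 100.0 + fixed_pattern + rng.normal(0.0, 1.0, shape)  # sky + dark + noise
calibrated = light - master_dark                 # bespoke noise filter applied

print(round(float(calibrated.mean())))  # ~100: only the sky signal remains
```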

buildbot · 4 years ago
To point 3, they are absolutely keeping many, many maps of the pixels and dark current for all of their sensors - this is a good picture of the process for a standard astronomical CCD: https://cdn.nightskypix.com/wp-content/uploads/2020/06/calib...
willis936 · 4 years ago
Thanks for 3. Without the explanation it really did come off as doctoring data to be more artistic.
moffkalast · 4 years ago
I think you've answered your own question there, it's just PR images touched up by the media team without regard for anything. If there's any science being done it'll be done by matlab scripts using raw data as input.
bottled_poe · 4 years ago
I’m confused.. why would we expect some other image processing software to be better than Photoshop - a software package which has been the top of its class for ~30 years?
yread · 4 years ago
Can it even open FITS?
randyrand · 4 years ago
Because Photoshop is not open source, not verifiable, and not documented at a scientific level as to how its filters behave.

roywiggins · 4 years ago
Webb's primary camera is infrared, so there is by necessity a choice to be made with how to present the data for humans who can't see in infrared.
whiteboardr · 4 years ago
I am aware.

(And have been eagerly waiting for this moment for ages)

It just seems “unscientific” to just use Photoshop, and above all I'm curious about the set of rules and algorithms that lets them decide which hue to pick for which region, levels, etc.

Wowfunhappy · 4 years ago
But, the infrared data is supposed to help us determine what I might see if I could teleport there (and time travel, not die, etc.), right?
shitpostbot · 4 years ago
I had always assumed they were doing it completely mathematically though. Like collating spectrometry readings to know what elements were present where and figuring out the temperature for blackbody emission or something, or even just linearly transforming the raw data from the spectrum the telescope can receive to the visible spectrum.

Kinda disappointing if it's really just a paint-by-numbers Photoshop job to look nice

GuB-42 · 4 years ago
I am not doing astronomy but Photoshop is useful to analyze any kind of image. You can manipulate contrast, apply all sorts of filters, map a color palette, etc... All that using a user-friendly interface. It is very mature software used by millions of people, for general purpose image work, no custom tool will come close.

I guess that scientists will also use specialized software for fine analysis, but it doesn't make Photoshop useless.

amelius · 4 years ago
I'd recommend Fiji; it was developed within a scientific environment, and it is free (unlike Photoshop).
supernovae · 4 years ago
I use PixInsight myself, but you can do a lot on photoshop and a lot of people will take the output of PixInsight to touch up in photoshop.

Modern sensors are amazingly low noise. 10 years ago, I used to have to calibrate out darks, bias, and flats just to remove my sensor noise. People still do that with modern CMOS sensors, but it isn't as necessary, and you can overcome much of the sensor noise by capturing enough data; that's where JWST just dominates. There is nothing impeding the data and causing noise.

Shot noise is easily removed by integration; read noise on modern sensors is almost nonexistent and easily calibrated out; dark current is extremely low; and bias, hot, and cold pixels can all be removed with calibration and integration. With space telescopes, cosmic rays are probably the most annoying thing, but if you stack enough images they integrate right out.

But back to photoshop, the final images are just publishing art. I use pixinsight myself for all the heavy lifting, pixel math, integration and calibration but sometimes go out to photoshop for cleanup - especially for web/print.
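The "stack enough data and the noise integrates out" point can be illustrated with a toy simulation; the signal and noise levels here are arbitrary, not real JWST numbers:

```python
import numpy as np

# Averaging N exposures of the same field shrinks random noise by ~sqrt(N).
rng = np.random.default_rng(1)
true_signal = 50.0

def residual_noise(n_frames):
    frames = true_signal + rng.normal(0.0, 10.0, size=(n_frames, 1000))
    stacked = frames.mean(axis=0)       # integrate the frames
    return float(stacked.std())         # noise left after stacking

single = residual_noise(1)
stacked_100 = residual_noise(100)
print(f"noise after 1 frame: {single:.1f}, after 100 frames: {stacked_100:.1f}")
```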

Paddywack · 4 years ago
I read a detailed interview with the person who does the enhancements a couple of days ago (can’t recall where a grrr).

He said: A) there are two of them in the team doing the imaging B) it doesn’t start with an image - it’s literally heaps of binary data that the scientists stitch together C) he then does the colour overlay based on agreed norms (one colour per input frequency for consistency) D) most of his “touch up” work is getting the colour gradient right between the brightest and dimmer objects - without this a lot of resolution would be lost (brights too bright, or dim not visible).

Hope this helps…

beowulfey · 4 years ago
I work with images on the other end of the scale regularly, and amongst scientists it's probably 50:50 Photoshop or ImageJ for editing images like that.
djfobbz · 4 years ago
I was wondering the same...why not also share the boring originals that we can process through our own filters?
_justinfunk · 4 years ago
https://mast.stsci.edu/api/v0/index.html

Here's the API to access the boring original data.

yread · 4 years ago
I would have expected ImageJ to have plugins better suited to scientific work
clint · 4 years ago
Why would you assume this?
oittaa · 4 years ago
Unfortunately the NASA stream online was a disaster. Choppy video and it seemed like nobody had prepared anything. Also 720p in 2022...

Don't get me wrong, the images are amazing, but when small startups like Rocket Lab can have uninterrupted streams all the way to orbit while a NASA stream from a studio looks more amateurish than your average 13-year-old Fortnite player on Twitch, it leaves a pretty bad impression.

ehsankia · 4 years ago
Seriously it was such a mess. Lag aside, they had MULTIPLE cases of either someone's mic not being on, or someone with a hot mic after they were done whispering over the stream. Almost every single transition to scientists in other cities failed. This is really unfortunate because they hyped up this event big time. They announced it two weeks in advance, had a countdown, even had scientists do "reaction" videos to seeing the photos for the first time...

People often underestimate how insanely hard it is to put something like this together, but I'm surprised NASA did. It's not like this is the first time NASA has done a livecast.

the_cat_kittles · 4 years ago
the classic "hacker news landing page critique" applied to nasa, love it
SalmoShalazar · 4 years ago
I think NASA’s funding generally goes towards doing science rather than optimizing their Fortnite streams
dan_quixote · 4 years ago
I'm not sure if NASA or the White House directed that stream. I've seen much better-organized streams from NASA. It wasn't just technically flawed: it was late, abrupt, disjointed, and the talking points appeared to be delivered by people who had little knowledge of the matter. I can't believe I saw that level of disorganization from our highest executive office.
silentsea90 · 4 years ago
Way to brighten my day with awe and wonder, way to ruin my day with existential dread about our place in the universe.
sho_hn · 4 years ago
Existential dread pro-tip: The Wikipedia page on "Ultimate fate of the universe" is a fantastic way to compel the question of why anything ultimately matters.

Coming up with personal answers to this is the ultimate character resolve exercise!

yreg · 4 years ago
I found Kurzgesagt's video on Optimistic Nihilism helpful.

https://www.youtube.com/watch?v=MBRqu0YOH14

idiotsecant · 4 years ago
Nothing matters. You live for a while and then you die, but it sure can be a cool trip getting there!
sillysaurusx · 4 years ago
See also "Ask HN: What's the point of life?" https://news.ycombinator.com/item?id=28866558
noneeeed · 4 years ago
One of my favourite concepts from Douglas Adams was the Total Perspective Vortex, a form of punishment that would drive the victim insane by showing them the entire totality of existence and their place in it.
_moof · 4 years ago
Didn't work on Zaphod though. He just ate the cake.
silentsea90 · 4 years ago
Wow. That's genius
leeoniya · 4 years ago
it's terrifying how alone and ephemeral we truly are: there are already places in our expanding universe that will never be reachable, even via communication, with any technology on any time scale (unless the universe's expansion reverses course). any communication we receive today may be from civilizations that ceased to exist thousands to billions of years ago. and humans will likely never travel outside the solar system.

consciousness is a hell of a drug

HKH2 · 4 years ago
It seems more like the fear of missing out. I don't feel terrified at all.
layer8 · 4 years ago
You’re aware that this is just the observable universe? It may be completely irrelevant relative to the total universe. ;)
WhompingWindows · 4 years ago
Why existential dread? We're extremely lucky to be alive. That one sperm hit that one egg, and we survived to now. The odds were incredibly long: each of us is one sperm out of hundreds of millions. So savor this existence!!
teh_klev · 4 years ago
It's like looking into the Total Perspective Vortex.
mkeedlinger · 4 years ago
Indeed, it is truly cause to pause and step back. What's the name of that phenomenon common amongst astronauts when they see the earth from afar? I feel like our society could use more of that.

edit: Seems to be called the overview effect [0]

[0]: https://en.wikipedia.org/wiki/Overview_effect

crhulls · 4 years ago
Here is a Hubble side by side of the deep field for comparison

https://www.facebook.com/photo.php?fbid=10159217085846758&se...

quaintdev · 4 years ago
A GIF comparing both Hubble and JWST https://i.redd.it/9uyhwijeo0b91.gif
wolfd · 4 years ago
I made this page (posted in another thread yesterday) because I was rather underwhelmed by the .gif. I think the page shows in much better detail the difference between the telescopes' capabilities.

https://blog.wolfd.me/hubble-jwst/

(If you're on mobile, you should be able to zoom in and still use the slider)

ehsankia · 4 years ago
Here's another tool with all 4 photos:

https://johnedchristensen.github.io/WebbCompare/

bdefore · 4 years ago
The additional detail in the red spiral galaxy around 12:30 is stark compared to the others. Any ideas why?
throwaway5752 · 4 years ago
My understanding is that it is also 12-13 hours of exposure for the Webb image vs weeks for Hubble.
pavon · 4 years ago
That is incorrect. The famous Hubble Ultra Deep Field image[1] took 11.3 days of imaging spread over four months (because of high demand to use Hubble). However, that is a different part of the sky. The Hubble image shown here was taken as part of RELICS[2], a survey of images to find good candidates for JWST to image, and was only exposed for 1.7 hours (5 orbits at ~20 minutes each), compared to JWST's exposure time of 12.5 hours. So comparisons between Hubble and JWST for that particular shot are not fair to Hubble.

[1]https://esahubble.org/images/heic0611b/

[2]https://archive.stsci.edu/prepds/relics/

mike10921 · 4 years ago
Ok to be honest I know it's not cool to admit it, but so far it all looks the same. If someone told me that the Webb picture was taken by Hubble I would not have thought about it for an extra second.

I'm hoping that in the future we see pictures of locations and environments that are mind-blowing to the average person who loves space.

jacquesm · 4 years ago
The very rough equivalent in computer terms: a 1997 PC computing something and taking a week or so to do it and returning the answer: 3.

The same by the 2022 version: 3.14159265358979323846 in a few milliseconds.

Both the speed of the computation and the resolution of the result are what makes it impressive, not the fact that the nature of the universe does not change fundamentally when viewed across a longer span of time.

It is mind-blowing, but maybe not to the 'average person who loves space'. But if you stop for a bit longer to understand what it took to create that image and what it is that you are actually looking at (the age of the objects involved, their apparent size and the resolving power and temperature of the telescope required to make it) it becomes a lot more impressive.

ceejayoz · 4 years ago
The difference is in a) the details and b) the length of time the telescope has to gather light to get the photo. JWST got the photo in hours when Hubble took weeks, and there's easily 10x as many objects in the JWST shot.

JWST can thus observe much fainter and much more distant objects - galaxies billions of years old, exoplanets, etc., and it can do more of it.

mrandish · 4 years ago
These are just the initial "pretty pictures", processed to look nice and promoted as part of NASA's ongoing fundraising. The more valuable science payload is in the spectral data, which will tell us about the composition of these objects. Another exciting aspect of JWST is its near-infrared camera (NIRCam), which can see red-shifted wavelengths, revealing much older objects from the early universe.

To me, the real 'shock and awe' will be when scientific papers are published which reveal new knowledge and deeper understanding of our universe. This will take some time although I'm sure the first papers are already racing toward pre-print.

loudmax · 4 years ago
I kind of agree with you, these pictures do look like more of the same. But that's okay, the real exciting stuff isn't going to be pretty pictures, it's going to be what astronomers and physicists are able to learn by peering deep into the origins of the universe. The pictures of galaxies are nice to look at, but the real ramifications of JWST will take years to play out.
systemvoltage · 4 years ago
This makes the Hubble telescope even more impressive in my eyes. Built 50 years ago with presumably 60 year old tech.

> Hubble telescope was funded and built in the 1970s by the United States space agency NASA with contributions from the European Space Agency. Its intended launch was 1983, but the project was beset by technical delays, budget problems, and the 1986 Challenger disaster. Hubble was finally launched in 1990.

kzrdude · 4 years ago
Right, and it's slightly rotated, 20-30 degrees (a guess). Just for others who try to line them up
slfnflctd · 4 years ago
The exoplanet analysis is what I'm most intrigued by. They're getting much more data than in the past on these.

Of course they went for an easy gas giant target first (it has lots of water, which is great), but those Earth-like planets in the Goldilocks zone are gonna be some of the most exciting stuff that comes out of this. Looking forward to it.

kentonv · 4 years ago
So is there any reason not to point this at Proxima Centauri b, like, ASAP?

https://en.wikipedia.org/wiki/Proxima_Centauri_b

yupper32 · 4 years ago
I don't know about Proxima Centauri b, but they'll be spending around 25% of "Cycle 1" (the first 6,000 hours of science) working on exoplanets, don't worry:

"Over the coming year, researchers will use spectroscopy to analyze the surfaces and atmospheres of several dozen exoplanets, from small rocky planets to gas- and ice-rich giants. Nearly one-quarter of Webb’s Cycle 1 observation time is allocated to studying exoplanets and the materials that form them." - https://www.nasa.gov/image-feature/goddard/2022/nasa-s-webb-...

sbierwagen · 4 years ago
WASP-96 b has an orbit that passes in front of its star; Proxima Centauri b doesn't.

Proxima Centauri b is an obvious target for the coronagraph for regular imaging, but there's no way to get a transmission spectrum of its atmosphere.

saiya-jin · 4 years ago
1,150 light years away! Imagine how much more detail can be detected for stuff within 50 light years.

Really, they should already be building a 2nd James Webb. I am sure even 10 of them would get 100% utilization for their whole lifetime. I can only imagine what kind of needless political games are happening around prioritization of time slots for it.

Or start working on the next gen: bigger, more resilient, etc. It costs peanuts compared to any significant CERN upgrade, and we have so much room to progress in astronomy (aka understanding our home, this universe) just by getting more data and resolution.

jacquesm · 4 years ago
I fear there won't be any more JWSTs at all. People are already bitching about how much it cost and that all it does is make pretty pictures right here in this thread and there were many times that it came within a hair of having its budget slashed.

Super happy we have one JWST, and I fervently hope it will outlast its original mission by a large fraction; every sign right now points in that direction.

mden · 4 years ago
The next NASA space telescope is The Nancy Grace Roman Space Telescope - https://www.jpl.nasa.gov/missions/the-nancy-grace-roman-spac....
historynops · 4 years ago
A lot of the pictures have some bright stars with six long, lens-flare-like points coming out of them in a consistent pattern. Is that because of the hexagonal shape of JWST's mirrors?
rbliss · 4 years ago
Yes, it's a combination of both the primary mirror and struts. The JWST website has a very helpful infographic explaining: https://webbtelescope.org/contents/media/images/01G529MX46J7...
creatonez · 4 years ago
Here is an image showing how each part of the distortion comes about - https://bigthink.com/wp-content/uploads/2022/03/FOFC8ZPX0AIB...
moffkalast · 4 years ago
That's quite exhaustive, but it makes me wonder why nothing is done to correct for it. For example, instead of taking one 15h exposure, why not take three 5h exposures and roll the telescope 5 degrees in between, then median-filter out the artefacts?
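In principle the roll-and-median idea works because the spikes are fixed to the telescope while real sources are fixed to the sky. A toy sketch with made-up pixel values (not the actual JWST calibration pipeline, which also has to handle pointing error, undersampling, and the fuel/time cost of extra roll maneuvers):

```python
import numpy as np

def median_stack(frames):
    """Per-pixel median of co-aligned exposures."""
    return np.median(np.stack(frames), axis=0)

# Synthetic 5x5 sky: a real source at the center in every frame,
# plus a spike artifact that lands on a different pixel each roll
# (standing in for diffraction spikes rotating with the telescope).
sky = np.zeros((5, 5))
sky[2, 2] = 100.0  # the star itself, present in every frame

frames = []
for spike_pixel in [(2, 3), (3, 2), (1, 2)]:  # artifact moves per roll
    f = sky.copy()
    f[spike_pixel] = 50.0  # artifact appears in only this frame
    frames.append(f)

clean = median_stack(frames)
# The star survives (present in all 3 frames); each artifact appears
# in only 1 of 3 frames, so the per-pixel median rejects it.
```

The trade-off is that each roll costs overhead and splits the total integration time, which matters for the faintest targets.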
MontagFTB · 4 years ago
You beat me to it- incredibly helpful diagram. Thanks for sharing it.
coldpie · 4 years ago
Wow, thanks for this link. The level of communication around JWST's technology and launch has been amazing, and this is a great example of that.
ceejayoz · 4 years ago
It's not the mirrors, it's the three struts supporting the reflector.

Hubble shows four spikes because it has two struts.

https://bigthink.com/starts-with-a-bang/james-webb-spikes/

https://www.universetoday.com/155062/wondering-about-the-6-r...

krajzeg · 4 years ago
I think you also had a similar comment and linked the same article under the previous topic about JWST's first image?

The article is very informative, but my read of it is different: the three major "spikes" are in fact due to the hexagonal shape of the mirrors and how they're laid out. The struts also add three spikes, but: two of them coincide with the mirror spikes, while one of them (from the vertical strut) is visible on its own, and causes the smaller perfectly horizontal spike.

The image I'm basing this on is in your article with a caption starting from "The point spread function for the James Webb Space Telescope" [1]

[1]: https://bigthink.com/wp-content/uploads/2022/03/FOFC8ZPX0AIB...

deanCommie · 4 years ago
From the other comments, I understand why they're there, but I wish they would photoshop them out.

The images take on a more synthetic, fake quality when the physical, man-made constraints of our telescope get projected onto the natural, very much non-man-made universe.

Look at https://stsci-opo.org/STScI-01G7ETPF7DVBJAC42JR5N6EQRH.png and observe the incredible entropy in the nebula itself. The consistent, perfectly straight lines of each star are jarring in the image.

deanCommie · 4 years ago
To be clear - I realize these are for science. They shouldn't be edited for scientists.

but we should edit them :)

deelowe · 4 years ago
More or less. That's how they've explained it in the past.
mackieem · 4 years ago
Yeah, it's the hexagonal shape. The objects with the 6 diffraction spikes are overexposed compared to the rest of the objects in the picture, so they're generally brighter and/or closer objects.

https://www.youtube.com/watch?v=UBcc3vpJTAU

MontagFTB · 4 years ago
Here’s an infographic from NASA explaining the phenomenon: https://webbtelescope.org/contents/media/images/01G529MX46J7...
micromacrofoot · 4 years ago
Also, I recall reading that those stars are so bright because they're within our galaxy... so they're the foreground really
WebbWeaver · 4 years ago
I really appreciate the work of the US Air Force Cambridge Research Laboratories in creating HITRAN. HITRAN is a molecular spectroscopic database used to look up the absorption lines of molecules in gases and atmospheres. It is the standard archive for transmission and radiance calculations. Without their groundwork we would not be as good at understanding planetary atmospheres.

https://hitran.org/ (free after registration)

https://hitran.org/media/refs/HITRAN-2020.pdf

HAPI (programming interface manual) https://hitran.org/static/hapi/hapi_manual.pdf

Youtube tutorials https://www.youtube.com/watch?v=NiKuigtFahk&list=PLqOG3cBizT...

It is very easy to use and might help in understanding the WASP-96 b transmission spectrum. https://stsci-opo.org/STScI-01G7NBXDHYYSVBP2M476PRGG3A.png

https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_...
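A minimal sketch of the idea behind such transmission calculations: line lists like HITRAN's give an absorption coefficient per molecule, and Beer-Lambert turns that into transmittance. The line centers, strengths, and column density below are made up for illustration; real values would come from the HITRAN database (e.g. fetched via HAPI):

```python
import numpy as np

# Beer-Lambert toy: a line list gives an absorption coefficient k(nu);
# transmittance through a column of gas is T(nu) = exp(-k(nu) * N),
# where N is the column density. Everything numeric here is invented
# for illustration, not real HITRAN data.

nu = np.linspace(1.0, 2.0, 500)  # wavelength grid (toy units, "microns")

def lorentzian(nu, nu0, strength, gamma=0.01):
    """Pressure-broadened (Lorentz) line profile."""
    return strength * gamma / np.pi / ((nu - nu0) ** 2 + gamma ** 2)

# Hypothetical water-like band: a few overlapping lines near 1.4 um,
# roughly where the WASP-96 b spectrum shows a water feature.
k = sum(lorentzian(nu, center, strength)
        for center, strength in [(1.35, 0.8), (1.40, 1.0), (1.45, 0.6)])

N = 0.05  # hypothetical column density (arbitrary units)
transmittance = np.exp(-k * N)
# transmittance dips toward the line centers and stays near 1 in the
# continuum between bands -- the shape a transit spectrum traces out.
```

The real HAPI workflow replaces the invented `lorentzian` lines with database line parameters and proper temperature/pressure-dependent profiles, but the Beer-Lambert step at the end is the same.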