If you are ever able to make it to the KSC Visitor Complex at Cape Canaveral, they have mock-ups of both the Gemini and earlier Mercury capsules that you can get into as a size reference. They are both incredibly tight. It's amazing that during Gemini 7 they spent 14 days crammed in the capsule, testing systems and carrying out normal human activity (eating, sleeping, bodily functions), all while being seconds from death at any time if things went wrong. These early astronauts were men of a different caliber.
542458 85 days ago [-]
The Todd Miller Apollo 13 documentary (which is phenomenal; it's entirely archival footage, but assembled better than most blockbusters) has these bits during launch/landing where they overlay the three astronauts' heart rates with the footage. My big takeaway from that was that they were incredibly unflappable, almost to an absurd degree.
zettabomb 85 days ago [-]
I think you mean Apollo 11; Apollo 13 was the dramatization with Tom Hanks. Both excellent movies in their own ways.
https://www.imdb.com/title/tt8760684/
Not to take away from their achievement, but do we know if there were drugs involved?
roryirvine 85 days ago [-]
Not for takeoff. Apollo astronauts used barbiturates to help sleep, and scopolamine with dextroamphetamine during re-entry. Unsurprisingly, the average HR during re-entry was rather higher than at launch! (source: https://space.stackexchange.com/a/31987 )
Gemini would likely have been similar, save for using Cyclizine during re-entry rather than scopolamine/dextroamphetamine.
Detailed info about the Apollo medkit: http://heroicrelics.org/info/csm/apollo-medical-kit.html
Less-detailed history of NASA medkits, mentioning Gemini: https://www.spacesafetymagazine.com/spaceflight/space-medici...
ISTR an astronaut saying that you didn't 'get into' a Mercury capsule, instead you 'put it on'.
psunavy03 85 days ago [-]
Not too much different than most tactical jets, to be honest.
teachrdan 85 days ago [-]
You don't stay in a tactical jet for 14 days...
Aardwolf 85 days ago [-]
I can't seem to find anything resembling a washroom in a Mercury capsule. How was that managed during those 34 hours?
KineticLensman 85 days ago [-]
Urine collection tubes - like a condom with a bag attached.
euroderf 84 days ago [-]
So, a diet of only liquids? But didn't they have the standard steak breakfast on launch day?
KineticLensman 85 days ago [-]
Nobody stayed in a Mercury capsule for 14 days, although Gordon Cooper's flight (the last and longest Mercury mission) lasted 34 hours.
teachrdan 85 days ago [-]
Sorry for the misunderstanding, I was referring to the longest Gemini mission.
solotronics 85 days ago [-]
For anybody near DFW, the Apollo 7 Command Module is on display at the Frontiers of Flight Museum at Dallas Love Field. It's pretty amazing to see it in person and think about the engineering involved.
https://en.wikipedia.org/wiki/Apollo_7#/media/File:Apollo_7_...
isoprophlex 86 days ago [-]
No amount of computational smartphone photography can match, in my eyes, the clarity and contrast and intensity of whatever analogue medium these were captured on.
This looks gorgeous. I'm extremely tempted to splurge on this, and the Apollo, books...
lm28469 86 days ago [-]
Medium format film (120, 6x6), Hasselblad cameras. I personally think we're barely starting to match the quality of medium format film with modern medium format sensors.
https://airandspace.si.edu/collection-objects/camera-hasselb...
Depends on how you define quality. While medium and large format photography are extremely high resolution, that's not the only factor. Space-age lenses were significantly lower resolution than the film. Modern mirrorless lenses are starting to come close to being able to out-resolve film but still aren't there, meaning that you get more functional resolution out of modern digital. Digital also beats the pants off film for dynamic range and low light. That said, the noise (grain) and dynamic-range fall-off in film are more pleasing than digital to most eyes. So it's not all about technical specs.
thw_9a83c 85 days ago [-]
> Digital also beats the pants off film for dynamic range and low light.
While this is true now, it took a surprisingly long time to get there. The dynamic range of professional medium format negative film is still respectable. Perhaps not so much in low light, but it's very immune to overexposure.
Also, you can buy a cheap medium-format film camera in good condition and experience that "huge sensor" effect, but unfortunately there are no inexpensive 6x6 digital cameras.
buildbot 85 days ago [-]
In fact, there is basically only one digital back ever made at that size, the Dicomed BigShot.
thw_9a83c 85 days ago [-]
Interesting. I didn't even know that. I had to look up the size of the modern Hasselblad digital camera sensor, and it's 43.8 × 32.9 mm.
buildbot 85 days ago [-]
It's incredibly rare and specific; many people in the digital back world don't even know about them. 60x60mm sensor, larger than the actual film gate of 56x56mm. There was also a version for the Rollei 6x6 6000 series, the Rollei Q16. I've only seen one for sale ever.
Technically larger than 6x6 film sensors have existed since the 80s or 90s at least but are typically only used for government things… Some digital aerial systems use huge sensors.
bhickey 85 days ago [-]
> Space age lenses were significantly lower resolution than the film.
Can you say a little more about this? Modern lenses boast about seven elements or aspherics, but does that actually matter in prime lenses? You can get an achromat with two lenses and an apochromat with three. There have definitely been some advances in glass since the space program, like fluorite versus BK7, but I'm wholly in the dark on the nuances.
bayindirh 85 days ago [-]
I find modern primes much sharper than their older counterparts not because of the elements or the optical design, but from the glass directly.
Sony's "run of the mill" F2/28 can take stunning pictures, for example. F1.8/55ZA is still from another world, but that thing is made to be sharp from the get go.
The same thing is also happening with corrective glasses. My prescription isn't changing, but the lenses I get are much higher resolution than the set they replace, every time, so much so that I forget I'm wearing corrective glasses.
dylan604 85 days ago [-]
> I find modern primes much sharper than their older counterparts not because of the elements or the optical design, but from the glass directly
Even back in their prime, haha, Cooke leaned into its glass manufacturing by calling it the Cooke Look. All of the things that gave it that look are things modern lens makers would consider issues to correct.
bayindirh 85 days ago [-]
Actually, I'm pretty flexible when it comes to how lenses and systems behave. A lens with its unique flaws and look is just as valuable to me as a razor-sharp, ultra-high-fidelity lens.
It all boils down to what you want to achieve and what emotion you're trying to create with your photography. Film emulation has come a long way, but emulating glass is not possible the same way (since you don't have any information about what happened to your photons on their way to your sensor), and lenses are an important part of the equation, and will forever be, I think.
dylan604 85 days ago [-]
finally, someone that gets it! these different lenses are all just tools. using the right tool for the job is part of being good at your job.
bayindirh 85 days ago [-]
Actually, anything we use is "just tools". IDEs, programming languages, operating systems, text editors, fonts, etc.
We all prefer different toolsets due to our differing needs and preferences. Understanding it removes a lot of misunderstanding, anger and confusion from the environment.
But, reaching there requires experience, maturity and some insight. Tool suitability is real (you can't drive a screw with a pair of pliers), but the dialogue can be improved a ton with a sprinkle of empathy and understanding.
actionfromafar 85 days ago [-]
The lenses also have to be better to compensate for the smaller sensors. All lens defects get more "magnified" the smaller the sensor is. So a straight comparison isn't fair unless the sensor is the same size as the film was.
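To put a rough number on that magnification effect: delivering the same detail across the final image means a smaller format needs proportionally more line pairs per millimetre from its lens. A quick back-of-the-envelope sketch (the 2000-line-pair target is an arbitrary illustrative figure; the frame widths are standard 6x6, full-frame, and APS-C dimensions):

```python
# Required lens resolution (lp/mm) scales inversely with frame width:
# to resolve the same number of line pairs across the whole frame, a
# smaller sensor needs proportionally more line pairs per millimetre.
def required_lp_per_mm(target_lp_across_frame, frame_width_mm):
    return target_lp_across_frame / frame_width_mm

TARGET = 2000  # line pairs across the frame width (illustrative)

for name, width_mm in [("6x6 film", 56.0), ("full frame", 36.0), ("APS-C", 23.5)]:
    print(f"{name:>10}: {required_lp_per_mm(TARGET, width_mm):.0f} lp/mm needed")
```

At 56 mm across, 6x6 film asks for about 36 lp/mm from the lens where an APS-C sensor would need about 85, which is why straight lens comparisons across formats are misleading.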
bbatha 85 days ago [-]
I wrote a longer post a few months ago.[1] The tl;dr is a) computer aided design and manufacturing b) aspherical elements c) fluorite glass d) retro focus wide angle designs and e) improved coatings. Mirrorless lenses also beat slr lenses because they are much closer to the film plane — of course rangefinders and other classic designs never had this problem to begin with.
1: https://news.ycombinator.com/item?id=42962652
Edit: this is just for prosumer style cameras. If you look at phone sized optics that’s a whole other ballgame.
tecleandor 85 days ago [-]
Film in big formats is incredible, gives great images even on shitty cameras...
kjkjadksj 85 days ago [-]
These Hasselblads are not shitty cameras. A Holga, on the other hand, is, and the images are shitty as a result despite being medium format.
tecleandor 84 days ago [-]
Wasn't talking about those cameras in particular, but about bigger-format film in general.
Even on shittier cameras, like a Holga 120 leaking light everywhere with a plastic lens, the results with medium format film are always surprising and give you a lot of leeway.
kjkjadksj 85 days ago [-]
Even the fuji medium formats don’t have sensors as large. They are more like 4cm x 3cm.
jillesvangurp 85 days ago [-]
You are talking about the digital, heavily processed photos here, which are indeed a lot nicer than the originals taken and printed 50 years ago. The originals were actually a bit under/over exposed (very harsh light in space), and quite grainy.
isoprophlex 85 days ago [-]
I get your point, but no amount of post-processing gives my iPhone 17 Pro portraits the sharpness and atmosphere of what I shoot with my crappy Nikon with a prime lens.
And I feel that these old analogue photos contain even more magic in the base material, digital reconstruction notwithstanding
astrange 85 days ago [-]
Those are medium format images. Rent a Fujifilm GFX100 if you want to try something like that. Storage space for the raws adds up though.
londons_explore 85 days ago [-]
Computational photography is about to get really good when it can combine hundreds or thousands of frames into one. 1000 frames effectively combined are equivalent to a lens and sensor of 1000x the surface area, i.e. exceeding a single frame from a DSLR.
Current methods use optical flow and gyroscopes to align images, but I imagine future methods will use AI to understand movement that doesn't work well for optical flow (e.g. where a specular reflection 'moves' on a wine glass).
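For rigid, whole-frame motion you don't even need optical flow or AI; classic FFT phase correlation recovers a global translation directly. A minimal sketch (the function name is mine, not from any particular library):

```python
import numpy as np

def phase_correlate(ref, moving):
    """Estimate the integer (dy, dx) shift of `moving` relative to `ref`.
    The cross-power spectrum conj(F_ref) * F_moving keeps only the phase
    difference, whose inverse FFT peaks at the translation offset."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moving)
    cross /= np.abs(cross) + 1e-12        # normalize: phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above N/2 around to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Demo: a frame circularly shifted by (3, -5) should be recovered exactly.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
moving = np.roll(ref, (3, -5), axis=(0, 1))
print(phase_correlate(ref, moving))  # (3, -5)
```

Real pipelines refine this to sub-pixel precision and handle rotation and local motion, which is where optical flow or learned methods come in.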
dmitrygr 85 days ago [-]
Saying "AI" does not magically solve anything. The current best ML systems can't even solve medium-difficulty calculus problems. We're nowhere near them doing original work like understanding what's in 1000 images, creating a world model out of that, and then rendering a better image of that world without hallucinations.
astrange 85 days ago [-]
I doubt there's enough information in more than ~4 samples to be worth fusing into one image. Maybe more if you have a full HDR display and supply chain or if the lighting is really bad, but almost everything is about having good taste in SDR tone mapping/image development otherwise.
LiquidPolymer 85 days ago [-]
I have the "Apollo Remastered" book and it is gorgeous. I'm going to buy this one too. Obviously they went back to the original film from the missions and did a full scan. NASA almost never gives access to the original film, so instead we have been seeing duplicate transparencies, which involve a loss of detail and dynamic range. They were good enough back then, but these first-generation scans cannot be matched for detail and color.
isoprophlex 85 days ago [-]
I couldn't help myself and purchased both, ha. Very excited about this..!
speed_spread 85 days ago [-]
I assume you've watched with delight the 2019 Apollo 11 documentary assembled from previously unused 70mm footage.
glimshe 85 days ago [-]
Great lenses with huge film frames. It's doable in digital, but not on a smartphone or budget SLR.
djmips 85 days ago [-]
I like this, it's really cool, especially the stacked images from 16 mm film. The first image (the first selfie in space, on Gemini 12) is very artistic, but I like the original better in that example; just look at the specular highlight before and after.
svelle 85 days ago [-]
I understood it less as an attempt at improvement and more as an alternate version of the same shot, where you see more of Aldrin and other smaller bits that were less visible in the original and rightfully iconic shot.
tjpnz 85 days ago [-]
One of those missions lasted just shy of 14 days. Boggles the mind given the size of the Gemini capsule.
pgreenwood 85 days ago [-]
In case anyone wants to have a go at doing this for Mercury, Gemini, or Apollo, all of the RAWs are publicly available for free:
https://tothemoon.im-ldi.com/
All of the photographs from these missions are public domain and always have been.
themadturk 85 days ago [-]
"These missions are now forgotten by most Americans, as most of the people alive during that time are now deceased."
I'm still here! I was ten years old during most of the Gemini era. I remember this stuff. I haven't forgotten.
jl6 85 days ago [-]
The headline before/after image is astonishing, almost in-credible. I can't see how the left image was restored into the right. It looks like there is substantial new detail on the right that I can't see anywhere on the left.
I can only assume that the image on the left is a low resolution scan produced for this web article, and that there must be a much better scan somewhere else.
xandrius 85 days ago [-]
Well, that photo must have come from a film negative, which can hold an astounding amount of detail, even old film.
So what improved is probably our digitization tools; with some post-processing, you can reveal a lot of detail.
imdsm 85 days ago [-]
It must be. The amount of detail is incredible, and even trying to extract data from the before picture, it doesn't come close to what you see in the newly processed image.
My attempt: https://i.imgur.com/QZDDEB5.png
Of course because you don't have access to the original data.
Imagine:
1. Film -> Method 1 -> Photo #1
2. Film -> Method 2 -> Photo #2
Instead you tried:
3. Photo #1 -> Method 3 -> Photo #2
Which instead gives you a badly edited Photo #1. You don't have the source code, so to speak.
kiicia 85 days ago [-]
The absolute least they did was to rescan the original film with a newer scanning process/device into a higher-resolution, higher-bit-depth "digital negative". You cannot replicate that from a low-quality JPEG image.
qingcharles 85 days ago [-]
This is my 30-second attempt with Photoshop (original on left, mine on right):
https://imgur.com/a/YC2iBHX
Looking at it "correctly" the man's image is obvious.
I enjoy image-art for this 'eye of the observer' opportunity.
gdbsjjdn 83 days ago [-]
It sounds like they took the original film negatives and rescanned them. You can scan them at varying levels of light intensity to get more detail in the highlights/lowlights. By comparison a single compressed digital file is going to have a limited dynamic range.
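A toy version of that multi-intensity scan merge, assuming the scans can be put on a common radiance scale by dividing out exposure; the hat weighting is a standard HDR-style simplification, not necessarily what any scanning house actually uses:

```python
import numpy as np

def merge_scans(scans, exposures):
    """Merge scans of the same frame taken at different light levels.
    Each scan is divided by its exposure to put it on a common radiance
    scale, then scans are averaged with a 'hat' weight that trusts
    mid-tones and down-weights clipped shadows/highlights."""
    scans = [np.asarray(s, dtype=float) for s in scans]
    num = np.zeros_like(scans[0])
    den = np.zeros_like(scans[0])
    for s, t in zip(scans, exposures):
        w = 1.0 - np.abs(2.0 * s - 1.0)   # peak weight at 0.5, zero at 0 and 1
        num += w * s / t
        den += w
    return num / np.maximum(den, 1e-12)

# Demo: a bright scan clips the highlights; a dim scan recovers them.
radiance = np.linspace(0.0, 2.0, 5)
bright = np.clip(radiance * 1.0, 0, 1)    # exposure 1.0: top half clipped
dim    = np.clip(radiance * 0.25, 0, 1)   # exposure 0.25: nothing clipped
merged = merge_scans([bright, dim], [1.0, 0.25])
print(merged)  # recovers [0. 0.5 1. 1.5 2.], including values bright clipped
```

The merged result spans more dynamic range than either single scan, which is the point of scanning the negative at several light levels.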
martinclayton 85 days ago [-]
Fabulous, obviously.
I found the headline image confusing: I thought there was a Sontaran (Doctor Who baddie) in there! Aldrin's face and the Earth reflection are quite confusing to the eye.
qingcharles 85 days ago [-]
Did they rescan the negatives for this? He mentions having access to RAW files at one point, but it's not clear.
pgreenwood 85 days ago [-]
All the photographs from Mercury, Gemini and Apollo are public domain. They've been previously re-scanned with modern equipment and all of the RAWs are publicly available for free:
https://tothemoon.im-ldi.com/
Thank you for that! It gives all the details. Interesting that they used an HR-500. When I was scanning for Universal I was using a Hasselblad, which has a much higher resolution (although usually the film stock itself was way lower "resolution" than the scanner could image).
CobrastanJorji 85 days ago [-]
What does "stack hundreds and hundreds of frames to bring out incredible detail" mean, technically?
dredmorbius 85 days ago [-]
The 16 mm film format mentioned is usually a motion-picture, rather than still, camera format, generally shot at 24 frames per second (FPS), though values from 8 to 48 FPS were fairly common.
Presumably the stacking of hundreds of frames would involve combining several seconds of footage (8.33 seconds would be 200 frames of film at 24 FPS) to generate a higher-resolution image. Success would depend on how much camera and subject movement occurred over that time, though image stabilisation should help somewhat with the former.
I suspect a clearer explanation of the process was omitted from the article, perhaps due to poor editing.
dredmorbius 83 days ago [-]
Oh, and "image stacking" is the technique of combining multiple images into a single frame, allowing for greater clarity (as here), noise removal (frequently applied in astrophotography), or depth-of-field / focal-plane stacking.
This almost always implies digital image processing, and depending on the goal / intent, various filters or masks may be applied. De-noising typically relies on including only visual elements appearing in most (but not all) frames of the stack, eliminating glitches such as satellite flares, meteors, terrestrial-based light sources, aircraft, sensor noise, or even radiation spots.
"Take Better Night Sky Photos with Image Stacking"
<https://photographylife.com/night-sky-image-stacking>
"Focus Stacking" (Wikipedia)
<https://en.wikipedia.org/wiki/Focus_stacking>
Again, from TFA the source was motion-film footage from fixed-position cameras which could be used to generate images with much greater resolution than any individual frame. Note that 16mm film grain is typically fairly large, but with stacking and post-processing, smaller details can be inferred.
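The "appears in most but not all frames" rejection rule is commonly implemented as sigma clipping. A minimal numpy sketch of the idea (the kappa threshold and demo numbers are arbitrary):

```python
import numpy as np

def sigma_clip_stack(frames, kappa=3.0):
    """Mean-stack frames, but mask out per-pixel outliers (satellite
    trails, cosmic-ray hits, dropouts) more than kappa standard
    deviations from that pixel's median across the stack."""
    stack = np.asarray(frames, dtype=float)          # (n_frames, H, W)
    med = np.median(stack, axis=0)
    std = np.std(stack, axis=0) + 1e-12
    mask = np.abs(stack - med) <= kappa * std
    return np.sum(stack * mask, axis=0) / np.maximum(mask.sum(axis=0), 1)

# Demo: 10 noisy frames of a flat grey field; one frame has a bright
# "satellite trail" that plain averaging would smear into the result.
rng = np.random.default_rng(1)
frames = 0.5 + 0.01 * rng.standard_normal((10, 8, 8))
frames[3, 4, :] += 5.0                 # the glitch, one row of one frame
clean = sigma_clip_stack(frames)
print(abs(clean - 0.5).max())          # small: the trail is rejected
```

A plain mean would pull the affected row up by about 0.5; the clipped stack stays within the noise floor of the good frames.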
procflora 85 days ago [-]
Taking many exposures of the same scene and averaging each pixel value with the corresponding pixels from all the exposures. This increases the SNR and dynamic range, but naturally doesn't work very well if your subject isn't static. It doesn't increase the resolution of the image, though.
For the 16mm movie cameras in this case, they probably selected frames from rigidly mounted cameras with little subject motion to get a good result out of it. Glenn strapped in tight in his tiny cockpit probably provided them with a decent number of frames they could stack without introducing much motion blur. In fact, you can see a bit of blurring at the end of one of the white straps center frame in that shot: https://cdn.arstechnica.net/wp-content/uploads/2025/09/03b-G...
There are algorithms in computational photography that effectively "stack" multiple images of the same thing in order to achieve a kind of 'super-resolution'. It's been a bit since I've done any of that, so I don't recall the details, but my hazy memory says it's pretty straightforward linear algebra (with the concomitant drawbacks that algorithms of this type have).
It's a pretty common technique in astrophotography. Here's a page that goes into some more detail in that context: https://clarkvision.com/articles/image-stacking-methods/
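The noise-averaging part, at least, is easy to verify numerically: for uncorrelated noise, stacking N frames cuts the noise by a factor of √N, which is the headroom that "hundreds and hundreds of frames" buys (the signal and noise levels here are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(42)
signal = np.linspace(0, 1, 1000)      # stand-in for one clean scanline
NOISE = 0.2                           # per-frame noise sigma

def noisy_frame():
    """Simulate one grabbed frame: the true signal plus sensor/grain noise."""
    return signal + NOISE * rng.standard_normal(signal.shape)

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(256)], axis=0)

print(np.std(single - signal))    # ~0.2 residual noise in one frame
print(np.std(stacked - signal))   # ~0.2 / sqrt(256) = ~0.0125 after stacking
```

This is pure SNR gain; recovering extra resolution on top of it additionally needs the sub-pixel shifts between frames that super-resolution methods exploit.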
ecoled_ame 85 days ago [-]
Photography and 1960s NASA. Best combo
dostick 85 days ago [-]
Does anyone know if those new photos are published on NASA's images site? If they are public domain...
Gormo 85 days ago [-]
They're credited to NASA in the Ars Technica article, so they are likely public domain.
echelon_musk 85 days ago [-]
> Gemini 5, ended just two weeks ago, in 1965
1965 was two weeks ago?
FartyMcFarter 85 days ago [-]
Yep. I'm trying to find tickets for the next Beatles concert as we speak.
laborcontract 86 days ago [-]
Note: this has nothing to do with Gemini, Google's latest image editing model.
OJFord 86 days ago [-]
Which, had it been around anything close to 60 years, could have been confusing!
randomtoast 86 days ago [-]
The name comes from Latin "gemini" meaning "twins," referring to the mythological Dioscuri, Castor and Pollux, sons of Leda. In mythology, Pollux was immortal, Castor mortal; their story is connected with themes of brotherhood and sacrifice.
Project Gemini was NASA's second human spaceflight program (1961-1966), preceding Apollo. It developed spaceflight techniques such as orbital rendezvous and docking, essential for the Moon landing.
Gemini is also a lightweight internet protocol and associated ecosystem (the Gemini Protocol), designed as a middle ground between Gopher and the modern web (HTTP/HTTPS), emphasizing simplicity and privacy.
It is also the name of Google's multimodal AI model, successor to Bard (announced 2023).
CobrastanJorji 85 days ago [-]
Google supposedly merged two of their AI teams, "DeepMind" and "Brain," and named the resulting team "Gemini" because it came from two things. Which is kind of weird as a metaphor because merging two things isn't really how twins work, but it does sound a lot better than "Google Reorg."
JdeBP 85 days ago [-]
But it is connected to the Internet protocol of the same name, which uses 1965 as its well-known port number (although IANA has not heard about this yet).