Landing on the moon is not easy. Just this year, the first two American lunar landing attempts since Apollo 17 in 1972 failed. First Astrobotic's Peregrine lander failed to reach the moon due to a propellant leak, and just a month later Intuitive Machines' lunar lander took a tumble as it was making its final descent onto the lunar surface.
It was navigating primarily with an autonomous visual guidance system when one of its landing legs struck a rock that the guidance system failed to steer around, breaking the leg and tipping the lander over onto its side. Despite this unfortunate collision, IM-1 still managed to return valuable data to Earth, but undoubtedly losing a 120 million dollar lunar lander was devastating. NASA needs to relearn how to land on the moon, and more importantly, develop the technologies needed to do it reliably.
So why is landing on the moon so difficult, even with decades of modern technological development? Firefly's Blue Ghost mission is the next lunar landing attempt, scheduled to launch this month, and I visited their Austin, Texas headquarters to get a closer look at Blue Ghost and learn how they are giving themselves the best chance at success, while making landing on the surface of the moon easier for all that follow. [Static] Aldrin: I'll leave it in Slew.
Armstrong (to Mike): Relay to us. Aldrin (to Mike): See if they've got me now. I've got good signal strength in Slew.
Collins: Okay. You should have him now, Houston. Duke: Eagle, we've got you now.
It's looking good. Over. (Pause) Eagle… What you just heard was Houston communicating with Michael Collins in orbit around the moon,
asking him to relay messages to Buzz Aldrin inside the Apollo lunar lander. That static was the result of a poor connection between the lunar lander and Houston. Aldrin began manually pointing the lander's S-band antenna back towards Earth to regain a better connection.
If this connection hadn't been improved, the landing attempt likely would have had to be called off; risking losing contact with the astronauts on the surface was not an option. In the latest US moon landing attempts there is no command module to relay messages to Earth, and there is certainly no skilled human pilot on board to solve problems.
[REF] It simply doesn't make financial sense to send humans to the moon on these early exploratory missions. These lunar landings are part of NASA's Commercial Lunar Payload Services initiative, a 2.6 billion dollar fund set up specifically to fund private companies like Intuitive Machines, Astrobotic, and Firefly, with each mission receiving around 100 million dollars in funding. Now consider that it costs around 1 million dollars per kilogram of payload to reach the moon today.
The two space suits worn by the astronauts on the moon for Apollo 11 alone weighed 160 kilograms. [REF] Launching humans, the life support needed to keep them alive, and the requirement to bring them back safely simply do not fit into NASA's current budget for these missions. About 50% of all moon landings across all spacefaring nations fail.
Around 55% of Mars landing attempts fail. These are unacceptable safety margins for humans. These missions serve an incredibly important purpose:
to develop the expertise needed to land on the moon reliably even without skilled human pilots, and to test and develop the technologies needed to land autonomously, making it easier and safer for the crewed missions of the Artemis program. Technology has improved some things.
It was unimaginable to send a lunar lander to the moon for just 120 million dollars in 1972. While Armstrong was descending onto the lunar surface, he had to make multiple course corrections as unexpected boulders and craters appeared in the landing zone. That is something we can largely avoid today thanks to the Lunar Reconnaissance Orbiter, which has been building a massive 3D map of the entire moon at a 100 meter resolution, with areas of interest getting even higher resolution mapping. It's the same map we use to create these animations. We now know where every crater and large boulder exists on the moon.
It's this map that makes autonomous lunar navigation work. Will Coogan, Firefly's chief engineer for their lunar lander, Blue Ghost, showed us these cameras right in the clean room where Blue Ghost was built.
We have two different cameras on our vision navigation system. One is pointed straight at you. Right there you can see there's a lens cap on it.
Yeah. And one is pointed straight down just behind it on a bracket. The reason it's on a bracket is you don't want anything except the moon in the field of view of the vision navigation camera.
If you get glints coming off of MLIs around the legs, that can interfere with the vision navigation solution. It's not known for sure that it would, but it's a risk that's not worth finding out about. So we put it on a bracket so that the feet are not within the field of view of the camera.
The reason we have two is because when you're first coming down, you're braking; you're firing tangential to the lunar surface to arrest all of your horizontal velocity. So you need this camera to look at the lunar surface. When you slow down enough, you have to land on your feet and you're just fighting gravity at that point.
So then we switch to the other camera. Do you have any redundancy for the cameras if something goes wrong with one? Obviously they're pointing in different directions. They feed into the extended Kalman filter, as does everything else.
So there's a certain altitude below which we're fine. But for navigation relative to the surface, no, there isn't a one-for-one swap for those. Okay.
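Firefly didn't walk us through the filter itself, but as a rough illustration of what "feeding into an extended Kalman filter" means, here is a minimal one-dimensional Kalman filter that blends an IMU-propagated altitude estimate with an occasional camera-derived altitude fix. Every name and number below is invented for illustration; the real filter estimates full position, velocity, and attitude.

```python
# Minimal illustrative 1D Kalman filter: fuse an IMU-propagated altitude
# estimate with occasional camera-derived altitude fixes.
# All values are made up; a real lander filter is an extended Kalman filter
# over position, velocity and attitude states.

def predict(alt, var, imu_delta, imu_var):
    """Propagate the altitude estimate using an IMU-integrated altitude change."""
    return alt + imu_delta, var + imu_var

def update(alt, var, measured_alt, meas_var):
    """Correct the estimate with a camera-derived altitude measurement."""
    gain = var / (var + meas_var)          # how much to trust the measurement
    alt = alt + gain * (measured_alt - alt)
    var = (1.0 - gain) * var
    return alt, var

# Toy descent: start at 1000 m with 25 m^2 of uncertainty.
alt, var = 1000.0, 25.0
for step in range(5):
    alt, var = predict(alt, var, imu_delta=-50.0, imu_var=4.0)          # IMU says we dropped ~50 m
    alt, var = update(alt, var, measured_alt=alt - 3.0, meas_var=9.0)   # camera fix disagrees slightly
    print(f"step {step}: altitude {alt:.1f} m, variance {var:.2f}")
```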
So Blue Ghost, similarly to IM-1, is heavily reliant on navigating with these cameras. It uses the detailed map of the moon created by the Lunar Reconnaissance Orbiter to figure out exactly where it is. [REF] It does this by taking the known positions of surface features, like the craters that pockmark the surface of the moon.
The lander's computer then uses the images from its cameras to compare what it sees to the known map of the moon. There's no weather on the moon to occlude these features, but the software does need to account for changing shadows. Thankfully, it's easy to predict where the sun will be.
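To make that comparison concrete, here is a toy sketch of the idea, not Firefly's actual algorithm: craters detected in a downward-looking image are matched against craters whose coordinates are known from the LRO map, and each match implies where the lander must be; averaging the implied positions gives a fix. The pinhole-camera assumption, focal length, and coordinates are all invented.

```python
# Toy terrain-relative navigation sketch (not Firefly's code): the camera
# looks straight down, detected craters are matched to craters with known
# map coordinates, and each match gives one estimate of the lander's
# horizontal position. Averaging the estimates gives a position fix.

FOCAL_LENGTH_PX = 2000.0   # assumed camera focal length, in pixels

def position_fix(matches, altitude_m):
    """matches: list of (pixel_offset_x, pixel_offset_y, map_x_m, map_y_m),
    with pixel offsets measured from the image centre."""
    est_x, est_y = [], []
    for px, py, map_x, map_y in matches:
        # Ground distance from the point directly below the lander to the crater.
        ground_dx = px * altitude_m / FOCAL_LENGTH_PX
        ground_dy = py * altitude_m / FOCAL_LENGTH_PX
        # If the crater sits at (map_x, map_y), the lander is offset the other way.
        est_x.append(map_x - ground_dx)
        est_y.append(map_y - ground_dy)
    return sum(est_x) / len(est_x), sum(est_y) / len(est_y)

# Three matched craters seen from 500 m above the surface (made-up values).
matches = [(120, -40, 1850.0, 960.0), (-300, 10, 1745.0, 972.0), (60, 220, 1835.0, 1025.0)]
print(position_fix(matches, altitude_m=500.0))
```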
Blue Ghost will be tracking along the terminator line, the line which separates the illuminated day side from the dark night side of the moon, landing at what is essentially the moon's sunrise to give itself a full lunar day, equivalent to 14 Earth days, before its mission ends with a spectacular lunar sunset.
With temperatures dipping as low as minus 130 degrees Celsius, the lander's batteries and electronics cannot survive the lunar night. The visual navigation system's algorithms work in a very similar way to those of star trackers, which spacecraft use to establish their orientation, but star trackers can't provide position data; the star field looks almost identical from anywhere in our solar system.
With the exact size and shape of every crater on the moon known, it's relatively trivial math to estimate altitude. The algorithms can estimate orientation, position, and velocity with a high degree of accuracy.
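As a sense of what that "trivial math" looks like: if a crater of known real diameter spans a measurable angle in the camera image, the range to it follows from basic trigonometry. A minimal sketch with made-up numbers:

```python
import math

# If a crater of known true diameter spans a measured angle in the camera's
# view, the range to it follows from basic trigonometry. For a crater close
# to directly below the lander, that range is roughly the altitude.
# The values below are invented for illustration.

def range_to_crater(true_diameter_m, apparent_angle_deg):
    half_angle = math.radians(apparent_angle_deg) / 2.0
    return (true_diameter_m / 2.0) / math.tan(half_angle)

# A 400 m crater appearing 2.3 degrees wide implies roughly a 10 km range.
print(f"{range_to_crater(400.0, 2.3):.0f} m")
```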
The issue comes when the cameras can't see these features, which happens at lower altitudes. This exact issue impaired Neil Armstrong's landing on Apollo 11,
when dust kicked up by the thrusters occluded the surface of the moon. This is something that can impair these autonomous visual navigation systems too, and below a certain altitude our digital map of the moon is no longer high resolution enough for the visual navigation system to work, as Will Coogan explained. I should say that for all the excellent imagery we have, below a certain altitude you do start to see things that even LRO [Put "LRO: Lunar Reconnaissance Orbiter" on screen] can't see.
And at that point you have to rely on a different system because you're navigating relative to things that have never been seen before. So you start by looking at the things you do know. They might be over on the horizon by now and they're getting pretty far apart.
You identify something new and now you know where that new thing is relative to something you already knew. And so now you start navigating relative to that. That's actually called a SLAM algorithm.
I don't remember what the acronym stands for. It was not named through lander missions; it was named through robotics programs, so it's maybe a little unfortunate for us that it has that acronym, but it's a commonly used algorithm. SLAM stands for simultaneous localization and mapping, and it's the same algorithm autonomous cars can use to navigate busy city streets.
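Here is a heavily simplified sketch of the localize-then-map loop Will describes: features whose positions are already known are used to estimate where the lander is, and newly spotted features are then added to the map relative to that estimate, so they can take over as references once the known ones drift out of view. Real SLAM implementations are probabilistic and far more involved; the landmark names and coordinates below are invented.

```python
# Heavily simplified 2D illustration of the localize-then-map idea behind SLAM
# (not Firefly's implementation). Observations are (dx, dy) offsets from the
# lander to a landmark; already-mapped landmarks localize the lander, and new
# landmarks are added to the map using that position estimate.

landmark_map = {"crater_A": (100.0, 40.0), "crater_B": (160.0, -20.0)}  # known positions

def localize(observations):
    """Estimate lander position from observations of already-mapped landmarks."""
    fixes = []
    for name, (dx, dy) in observations.items():
        if name in landmark_map:
            mx, my = landmark_map[name]
            fixes.append((mx - dx, my - dy))   # lander = landmark position minus offset
    xs, ys = zip(*fixes)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def add_new_landmarks(position, observations):
    """Map any landmark seen for the first time, relative to the estimated position."""
    px, py = position
    for name, (dx, dy) in observations.items():
        if name not in landmark_map:
            landmark_map[name] = (px + dx, py + dy)

obs = {"crater_A": (60.0, 35.0), "crater_B": (120.0, -25.0), "boulder_C": (-15.0, 10.0)}
pos = localize(obs)          # position from the craters we already know
add_new_landmarks(pos, obs)  # boulder_C can now serve as a reference later
print(pos, landmark_map["boulder_C"])
```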
Then, at around 10 meters above the surface, visuals are no longer viable because plumes of dust are expected to occlude the view. At this point the lander is essentially flying blind, relying only on its inertial measurement unit to softly touch down.
Even with perfect guidance this is no easy feat. Precision thrust control comes from the Spectre reaction control system, a total of 8 hypergolic rocket motors that can be precisely controlled.
One of the difficulties with lunar landers is just how light they are. With so much of their weight dedicated to propellant, they become significantly lighter as the propellant is consumed, continuously altering the thrust-to-weight ratio and continually shifting the lander's center of gravity.
Intuitive Machines' lander was quite tall for this reason. They stacked their propellant tanks on top of each other to avoid lateral center of gravity shifts that could cause the lander to roll. However, this taller configuration raises the lander's center of gravity, making it easier to tip over.
Blue Ghost instead has 4 propellant tanks sitting next to each other on a single level to keep the center of gravity low. Because the two propellants have different densities, 4 tanks are needed to keep the lateral center of gravity balanced. This continuous weight change has to be monitored and adjusted for by reducing thrust from the reaction control system.
One way this can be done is by duty cycling the engines on and off. With shorter or longer cycles the thrust can be precisely controlled. At this point, you are getting a picture of why landing on the moon is so difficult.
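To put rough numbers on the duty cycling: holding the same deceleration on a lighter lander takes less average thrust, so the thrusters spend a smaller fraction of each pulse cycle firing. The thruster force and masses below are invented, not Blue Ghost's actual figures.

```python
# Back-of-the-envelope sketch of duty-cycled thrust control (invented numbers,
# not Blue Ghost's). To hold a constant descent deceleration, the required
# average thrust drops as propellant burns off, so the fraction of each pulse
# cycle the thrusters spend "on" must drop with it.

LUNAR_G = 1.62           # lunar surface gravity, m/s^2
THRUSTER_FORCE = 1400.0  # assumed total thrust while the thrusters fire, N

def duty_cycle(mass_kg, target_decel=1.0):
    """Fraction of each cycle the thrusters must fire to hover plus decelerate."""
    required_force = mass_kg * (LUNAR_G + target_decel)
    return required_force / THRUSTER_FORCE

for mass in (470.0, 420.0, 370.0):    # lander getting lighter as propellant burns
    print(f"mass {mass:.0f} kg -> duty cycle {duty_cycle(mass):.0%}")
```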
Even the legs of these lunar landers need to be carefully engineered. Will showed us some of the design considerations for this seemingly simple part of the lander. You also see that there's no foot on that leg. There's actually a hollow cylinder.
This is a 3D printed titanium piece. The reason we 3D print is not just because it's easier to manufacture (in fact, it's arguably not), but because we're able to vary the infill. So it doesn't have to be solid titanium.
We can have cavities on the inside. Inside of that goes what we call a shock shuttle. It's a crushable aluminum honeycomb piece, and that absorbs the impact of landing and actually compresses when we land.
So that not only absorbs the shock for the payloads, keeping them within the launch environment, but it also makes sure that all four feet are more likely to be in contact with the ground. If you have no compliance, you can have a foot floating, and then you don't have a stable platform.
We deploy payloads, and that could push us over. So we want compliant feet. We will probably be able to show you a foot outside the clean room when we leave here, but that also has a swivel bowl ankle.
It's got a nice bowl shape, so it doesn't catch on the lunar surface if we come in at a slight angle. Where does the sensor go for the foot? Like you were saying, one of the ways you know you've landed is through a foot sensor.
So we actually start with the foot at an angle, and then when we land it presumably goes flat, or to some other direction, and it unplugs something. And that's all the sensor is. So it's not complicated.
There is a possibility that we land and that foot stays at the exact same angle, but we have four feet, so it's unlikely to happen on every single foot, and only one of them needs to trip to confirm landing. We do checks on those, right?
Because there is a risk that during launch one comes out, or during transit we have an intermittent connection, we never have them power off the engine until we're below a certain threshold, and we're doing checks on them throughout flight to make sure we don't have any transient signals coming from that. And if we do, we say ignore that sensor during landing, just so we don't get a false contact. So that foot sensor is one of the ways the ground crew can figure out if the lander has touched down, but you may remember that with Intuitive Machines they weren't entirely sure of the status of the lander after it had toppled over.
There are a number of checks to figure out if the lander has successfully landed. Sensors in the landing legs triggering is the first sign. Ground controllers also check for signals from the lander's data links, whether its engines have shut off, and they can check its orientation with the inertial measurement unit.
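As a rough sketch of how those cross-checks might be combined: the rule that only one foot sensor needs to trip, and the in-flight masking of a sensor that has shown transient signals, come from Will's description; the function, names, and values below are purely illustrative, not flight software.

```python
# Illustrative combination of the landing cross-checks described above
# (not flight software). A foot sensor flagged as unreliable during transit
# is ignored, and confirmation requires a healthy foot contact plus the
# engine-off and IMU-orientation checks.

def landing_confirmed(foot_contacts, masked_feet, engines_off, imu_upright):
    """foot_contacts: dict of foot name -> bool; masked_feet: sensors to ignore."""
    healthy_contact = any(
        tripped for foot, tripped in foot_contacts.items() if foot not in masked_feet
    )
    return healthy_contact and engines_off and imu_upright

feet = {"leg_1": False, "leg_2": True, "leg_3": False, "leg_4": False}
print(landing_confirmed(feet, masked_feet={"leg_4"}, engines_off=True, imu_upright=True))
```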
However, there is a scenario where all these checks could indicate a successful landing, but in reality the lander bounced off the ground and is actually in free fall back to the moon. The moon is just really far away, and we don't have that much real-time data on what's happening up there. One of the missions of Blue Ghost, and the overall CLPS initiative, is to improve this situation and provide more tools to future lunar missions.
One of the major improvements that needs to be made is to communication and navigation systems, an area where the US and Europe have a blind spot and China has a leg up. In 2018 China launched its first lunar relay satellite into Lagrange point 2, a parking orbit beyond the moon where it has a consistent view of the far side of the moon and a direct line of sight with Earth, preventing the extended blackout periods that a single satellite orbiting the moon would experience. With this relay satellite, China landed humanity's first lunar lander on the far side of the moon in 2019. And in 2024 they launched another relay satellite.
Its huge parabolic antenna offers 10 X-band channels, allowing it to talk to up to 10 different lunar probes at the same time. Neither the US nor Europe has anything close to this capability right now. But Blue Ghost is carrying a payload to evaluate using global navigation satellite signals emitted by American and European satellites for navigation,
which could provide future lunar spacecraft with accurate position, velocity, and time estimates. [REF] Currently, navigation beyond Earth is heavily reliant on NASA's Deep Space Network, an array of giant radio antennas which transmit positioning data to interplanetary spacecraft to keep them on course. Blue Ghost will be receiving positioning data from the Swedish Space Corporation until the visual navigation system takes over.
However, NASA is already using GPS signals to provide position data to 4 spacecraft studying the Earth's magnetic field, which orbit at around half the distance to the moon. But ultimately NASA's goals are much higher than repurposing global positioning signals for navigation on the moon; they plan to build out an entire network of nodes and satellites for future moon missions called LunaNet. LunaNet is the next phase, and it actually began with Intuitive Machines' first landing, which carried the Lunar Node-1 beacon, a test node for this future LunaNet system.
It is a small cubesat-sized S-band navigation beacon, something LN-1's principal engineer compared to a lighthouse, providing critical navigation information from the shore.
[REF] A beacon that he envisions flying on every subsequent lunar mission, gradually building up a network of lunar nodes. The original plan for this payload was to transmit its signal around the clock, but with the lander's botched landing, it could conduct just two 15 minute transmissions, which the Deep Space Network successfully locked onto, providing brief but critical data on the new system. [REF] Blue Ghost's next mission, which is targeting a launch in 2026, will include a new communications satellite that will orbit the moon.
Built by the British satellite manufacturer Surrey Satellite Technology, this is a European Space Agency payload which will provide communication for future polar and far side moon missions, a critical step toward European and American goals for future moon missions. These CLPS missions will gradually build up the technology and infrastructure critical to making moon landings safer and more reliable, and even with the hiccups it's important to remember that Apollo hardware was thoroughly tested before Apollo 11's success, with its fair share of failures along the way.
We can and should be celebrating these CLPS missions as a modern reincarnation of the early Apollo program, but at a fraction of the cost. Something you can also enjoy at a fraction of the cost is Nebula. For just 3 dollars a month you can sign up to Nebula and get access to amazing exclusive content, while supporting the work we do here.
Before I go into an entire spiel about how great Nebula is: I know many of you are sick of signing up for yet another subscription service. So, if you simply want to support our channel and get lifelong access to all of Nebula's exclusive programming, we are offering lifetime memberships for a limited time for 300 dollars.
Unlike other subscriptions, you can buy once and never pay again. For the holiday period we are also offering gift cards for you to give to family members, and if you want to give them a trial of Nebula before doing so, all Nebula members have guest passes to give out that don't require credit card details to use. We also have our own 5-part Battle of Britain series where we recreated the central command room of the Battle of Britain to show in intricate detail how Britain won its most impactful victory.
We show you how incoming raids were plotted on the huge central map with color coded tiles and how the commander kept track of what squadrons were available using a light up board mounted on the walls. We spent over a month recreating this room using archived footage and photos, an investment of time for a single animation asset that we simply can’t afford on YouTube. For less than the price of a cup of coffee per month, you can get access to all of Nebula’s Originals, along with our entire catalog without any ads.
You can also easily download videos to watch on the go, for just 3 dollars a month.
This channel depends on the funding Nebula provides us. If you have been subscribed to this channel for more than 3 years, you have seen the huge increases in production quality that Nebula has facilitated, growing from 2D animations that Mike and I taught ourselves how to do, to having a full team of incredibly talented 3D artists that rival any TV production.
This is expensive work and we would love to grow our team even more, something we can only do with your support, as YouTube ad revenue simply does not cover the bills. Last year we made a financial loss for 3 months in a row. YouTube is simply a volatile platform where we depend on the whims of advertisers.
Nebula is our life raft in a volatile sea of social media. This is a common theme across YouTube creators. Nebula’s goal is to enable and level up our entire roster of creators.
To remove the financial uncertainty that forces us to rush projects. To remove the algorithms that force us to analyze data points instead of what really matters, the audience on the other side of the screen.