What’s Going on With Russia’s Space Program?

#Russia #Space #Program #ISS #Nauka

Was the recent ISS emergency an aberration, or a warning of things to come?

An inauspicious start: The newly arrived Nauka science module (right) alongside a Soyuz crew vehicle. (Roskosmos)

Last month, something that long-time observers of the space program thought might never happen actually took place 450 kilometers above Earth: Russia’s 20-ton Nauka (“Science”) module successfully docked to the International Space Station. It was the first expansion of the Russian segment of the station in more than a decade. All the other ISS partners largely completed construction of their facilities years ago.

As its name suggests, Nauka is designed as a laboratory, complete with a workshop, a glovebox for experiments, attachment points for exterior payloads, an airlock, and a European-built robotic arm that will allow cosmonauts to install equipment outside the station—the first such capability on the Russian segment. Nauka also adds more sleeping space for the cosmonauts and a new toilet hooked up to a sophisticated water-recycling system.

The new module launched to the station on a Proton-M rocket on July 21. After eight days of mostly silence from the Russian space agency Roskosmos about Nauka’s trouble-laden trek to its destination and nerve-racking final approach to the ISS, the successful docking was met with fanfare in Moscow. “Starting today, foreigners are learning to pronounce a new Russian word—Nauka,” declared Roskosmos head Dmitry Rogozin. Meanwhile, on Russian social media an army of online trolls went into overdrive to trumpet the success.

Just three hours later, though, the mood turned dramatically. People monitoring communications from the station heard alarming reports from the cosmonauts that Nauka’s thrusters were firing for no reason, sending the entire station into an uncontrolled cartwheel. Live broadcasts from orbit showed a blizzard of flakes outside the station—apparent engine exhaust. There were some tense moments on the ground as thrusters on other station modules had to be fired to counter the unexpected thrust and bring the station back under control. The emergency ended only when Nauka ran out of fuel.

The inadvertent engine firings, which could have damaged the $100 billion ISS, were the result of a software error. Another programming mistake days earlier had also caused propulsion problems, wasting fuel and leaving mission controllers only one attempt at docking.

As usual, Roskosmos has been mostly silent about the mishaps, leaving it largely to independent researchers to sort out what actually happened. Coincidentally, the Russian Duma is now preparing a law that would criminalize virtually any reporting on Russian space and military activities.

What has happened to Russia’s once elite human spaceflight program?

Nauka’s journey, like other events in the international spotlight, even the Olympics, is now treated in Russia as part of a propaganda war with the West. Every Kremlin success, no matter how small, is overhyped. Any hint of corruption or mismanagement is glossed over or hidden from view. Often, blame is shifted to the United States or elsewhere. In a post-docking interview that aired on a Russian TV show known for its ultra-nationalist rhetoric, Rogozin blamed Ukrainian-built bellows in Nauka’s propellant tanks for the module’s propulsion problems.

In truth, Nauka’s dangerous post-docking failure was only the latest snafu in a long string of embarrassing technical problems that have plagued the project over three decades. The pervasive software issues were only part of a drama that included changing contractors and major redesigns of Soviet-era systems whose warranties expired years before they had a chance to fly.

Nauka is the last Russian spacecraft that can trace its roots to a transport ship known as TKS, developed in the 1960s and ’70s by the design collective of the prolific Soviet space pioneer Vladimir Chelomei. The TKS was originally intended for the top-secret Soviet military space station called Almaz. The same design was later used for the modules of the Mir space station, and was then adopted for the first Russian piece of the ISS.

In the 1990s, as the components of the international station were being built around the world, the hardware that eventually became Nauka was planned for launch before the end of that decade. But various financial and technical problems kept it and the rest of the Russian segment on the ground for nearly a quarter of a century.

In the early 2010s, engineers found severe contamination in the module’s critical propulsion system, reportedly the result of workers mistakenly thinking they were supposed to dismantle it. All attempts to fully clear the system failed, but after years of delays, Nauka’s engines were certified to fly anyway. In the final days before launch, Nauka had to be removed from the fueling facility because press photos of the module posted on the Internet revealed the lack of thermal blankets on critical flight control sensors. The blankets had to be urgently fashioned from leftover materials.

What’s next for the troubled science module?

Nauka arrives at an awkward time, as the ISS is approaching an uncertain retirement date. With the Kremlin’s long-proclaimed lunar exploration program stalled by money problems, Russian officials have switched to talking about building a new station in a different orbit from the ISS, although no new money has been allocated to the project so far. This proposed smaller facility would be visited only occasionally by cosmonauts, and could overfly the strategically important Arctic region if multiple technical issues associated with the new orbit can be resolved. In the new orbit, the future Russian station could be reached by crew vehicles and cargo spacecraft launched from Russia, rather than from Kazakhstan, as with vehicles bound for the ISS.

Under these circumstances, adding more Russian modules to the current station would seem to make no sense. Yet Roskosmos has kept the next module, called Prichal (“Pier”), on schedule for a launch to the ISS this November. Beyond that, another major component is currently under construction in Russia. This upgraded new-generation version of Nauka, known as the Science and Power Module or NEM, was intended to make the Russian segment truly independent from the rest of the ISS in terms of energy supplies and flight control.

However, this year, Roskosmos publicly committed to making the NEM the core of the new station rather than sending it to the ISS. After a closed-door meeting on July 26, the Council of Chief Designers—which has charted the direction of Russia’s space program since the days of Sputnik—deferred all critical questions about the post-ISS base to some unspecified future.

That means Nauka and Prichal may have a relatively short life in orbit compared to their predecessors. And flight controllers on both sides of the world will be left hoping there are no more in-space emergencies like the one that happened last month.

The Dalnegorsk UFO Crash: Roswell Incident of the Soviet Union

#Dalnegorsk #UFO #Crash #Roswell #SovietUnion

This internationally famous UFO incident took place on January 29, 1986, at 7:55 p.m. Some have called it the Roswell Incident of the Soviet Union. The information concerning this incident was sent to us by a number of Russian ufologists.

Dalnegorsk is a small mining town in the Far East of Russia. That cold January day a reddish sphere flew into the town from the southeast, crossed part of Dalnegorsk, and crashed into Izvestkovaya Mountain (also known as Height or Hill 611, because of its size). The object flew noiselessly and parallel to the ground; it was approximately three meters in diameter, of a near-perfect round shape, with no projections or cavities, its colour similar to that of burning stainless steel. One eyewitness, V. Kandakov, said that the speed of the UFO was close to 15 meters per hour. The object slowly ascended and descended, and its glow would intensify every time it rose. On its approach to Hill 611 the object “jerked” and fell down like a rock.

All witnesses reported that the object “jerked” or “jumped”. Most of them recall two “jumps”; two girls remember that the object actually “jumped” four times. The witnesses heard a weak, muted thump. The object burned intensively at the cliff’s edge for an hour. A geological expedition to the site, led by V. Skavinsky of the Institute of Geology and Geophysics of the Siberian Branch of the Soviet Academy of Sciences (1988), confirmed the object’s movements through a series of chemical and physical tests of the rocks collected from the site. Valeri Dvuzhilni, head of the Far Eastern Committee for Anomalous Phenomena, was the first to investigate the crash. With the help of our colleagues in Russia, this is the most accurate account of the incident to date.

Dr. Dvuzhilni arrived at the site two days after the crash. Deep snow covered the area at the time, but the site of the crash, located on a rocky ledge, was devoid of snow. All around the site were remnants of silica rocks, splintered by exposure to high temperatures and “smoky” in appearance. Many pieces, and a nearby rock, contained particles of silvery metal, some “sprayed” on, some in the form of solidified balls. At the edge of the site a tree stump was found; it was burnt and emitted a chemical smell. The objects collected at the site were later dubbed “tiny nets”, “little balls”, “lead balls”, and “glass pieces”, after what each resembled.

Closer examination revealed very unusual properties. One of the “tiny nets” contained torn and very thin (17-micrometer) threads. Each of the threads consisted of even thinner fibers, tied up in plaits. Intertwined with the fibers were very thin gold wires. Soviet scientists, at such facilities as the Omsk branch of the Academy of Sciences, analyzed all the collected pieces. Without going into specific details, suffice it to say that the technology to produce such materials was not yet available on Earth…except for one disturbing account.

To give an idea of the complexity of the pieces’ composition, consider the “iron balls”. Each of them had its own chemical composition: iron with a large admixture of aluminum, manganese, nickel, chromium, tungsten, and cobalt.

Such differences indicate that the object was not just a piece of lead and iron but some heterogeneous construction, made from heterogeneous alloys for some definite purpose. When melted in a vacuum, some pieces would spread over a base, while on another base they would form into balls. Half of the balls were covered with convex glass-like structures. Neither physicists nor metallurgists can say what these structures are or what their composition is. The “tiny nets” (or “mesh”) have confused many researchers; it is impossible to understand their structure or the nature of their formation.

A. Kulikov, an expert on carbon at the Chemistry Institute of the Far Eastern Department of the Academy of Sciences, USSR, wrote that it was not possible to get an idea of what the “mesh” is. It resembles glassy carbon, but the conditions leading to such a formation are unknown; an ordinary fire certainly could not produce such glassy carbon. The most mysterious aspect of the collected items was the disappearance, after vacuum melting, of gold, silver, and nickel, and the appearance, seemingly from nowhere, of molybdenum, which was not in the chamber to begin with.

The only thing that could be more or less easily explained was the ash found on site: something biological was burned during the crash. A flock of birds, perhaps, or a stray dog; or someone who was inside the crashed object?

Dr. Dvuzhilni’s article was published in the Soviet (Uzbekistan) magazine NLO: Chto, Gde, Kogda? (Issue 1, 1990; a reprint of an article in FENOMEN Magazine, March 23, 1990). In his article “Dalnegorski Phenomen,” Dvuzhilni provides details unavailable elsewhere.

The southwesterly trajectory of the object points back roughly toward the Xichang Cosmodrome of the People’s Republic of China, where satellites are launched into geosynchronous orbit by Long March-2 carrier rockets. There is no record of any rocket launches in the PRC at the end of January. At the same time, the Xinhua agency reported on January 25, 1988, that a glowing red sphere had been sighted not far from the cosmodrome, where it hovered for 30 minutes. Possibly, UFOs had shown interest in the Chinese cosmodrome in 1988 and 1989.

There is another curious detail: at the site on Height 611, small pieces of a light gray color were discovered, but only in the contact area. These specimens did not match any of the local varieties of soil. Remarkably, spectroscopic analysis matched the specimens to the Yaroslavl tuffs of the polymetallic deposits (i.e., the specimens possessed characteristic elements of the Yaroslavl, but not the Dalnegorsk, tuffs). There is a possibility that the object picked up pieces of tuff in the Yaroslavl area. Tuffs undergo metamorphosis under the effect of high temperatures.

The site of the crash itself was something like an anomalous zone, and it remained “active” for three years after the crash. Insects avoid the place. The zone affects mechanical and electronic equipment, and some people, including a local chemist, became very sick.

Hill 611 is located in an area of numerous anomalies, according to an article in the Soviet digest Tainy XX Veka (Moscow, 1990, CP Vsya Moskva Publishing House). Even photos taken at the site, when developed, failed to show the hill, though they clearly showed other locations. Members of an expedition to the site later reported that their flashlights all stopped working at the same time; when they checked the flashlights upon returning home, they discovered burned wires.

Eight days after the UFO crash at Hill 611, on February 8, 1986, at 8:30 p.m., two more yellowish spheres flew in from the north, heading south. Reaching the site of the crash, they circled it four times, then turned back to the north and flew away. Then on November 28, 1987 (a Saturday night, at 11:24 p.m.), 32 flying objects appeared out of nowhere. There were hundreds of witnesses, both military and civilian.

The objects flew over 12 different settlements, and 13 of them flew to Dalnegorsk and the crash site. Three of the UFOs hovered over the settlement, and five of them illuminated the nearby mountain. The objects moved noiselessly at altitudes between 150 and 800 meters. None of the eyewitnesses actually thought they were UFOs; those who observed the objects assumed they were aircraft involved in some disaster, or falling meteorites. As the objects flew over houses, they created interference with television and telegraph reception.

Officers of the Ministry of Internal Affairs who were present testified later that they observed the objects from a street at 23:30 (precise time). They saw a fiery object flying in from the direction of the Gorely settlement: in front of the fiery “flame” was a lusterless sphere, and in the middle of the object was a red sphere. Another group of eyewitnesses, workers from the Bor quarry, observed an object at 11:00 p.m.: a giant cylindrical object flying straight at the quarry. It was the size of a five-story building, its length around 200 or 300 meters. The front part of the object was lit up like burning metal. The workers were afraid that the object would crash on them. One of the managers of the quarry observed an object at 11:30 p.m.

The object was slowly moving at an altitude of 300 meters. It was huge and cigar-shaped. The manager, whose last name was Levakov, stated that he was well acquainted with aerodynamics and knew the theory and practice of flight, but had never known that a body could fly noiselessly without any wings or engines. Another eyewitness, a kindergarten teacher, saw something else: a bright, blinding sphere at the height of a nine-story building, moving noiselessly. In front of the sphere, Ms. Markina observed a dark, metallic-looking elongated object about 10 to 12 meters long. It hovered over a school, where it emitted a violet-bluish ray about half a meter in diameter. The ground below was illuminated, but the objects below cast no shadows. Then the object approached a mountain and hovered over it, illuminating the mountain with a reddish searchlight-like beam, as if searching for something, and then departed, flying over the mountain.

No rocket launches took place at any of the Soviet cosmodromes either on January 29, 1986, or November 28, 1987.

Dr. Dvuzhilni’s conclusion is that it was a malfunctioning alien space probe that crashed into Hill 611. Another hypothesis has it that the object managed to ascend and escape (almost in one piece) in a northeasterly direction, and probably crashed in the dense taiga.

To be continued
Read more at https://english.pravda.ru/society/112049-dalnegorsk_ufo_crash/

How Far Can SpaceX Starships Go?

#space #spacex #starship #range #astronomy

How far can we go in a SpaceX Starship? Let’s take a look at what the Starship can actually do when it comes to humans exploring our solar system.

Elon Musk obviously wants to send Starships to Mars, but what about missions to Jupiter, or Saturn, or even beyond? The answer can be found in the rocket science, and the first thing you need to know is that the SpaceX Starship is heavy. Just by itself, completely empty, it’s 120 tons. That’s already the weight of about 40 cars stacked on top of each other, and we call it the Starship’s dry mass.

But that 120 tons gets more than doubled if the payload bay is filled with 150 tons of… well, you know, payload: whatever’s paying for a given mission. Let’s say we’re trying to send five people to Jupiter and back, and, just for the sake of example, allocate the full 150 tons of payload to them, including their own body weight: 10 tons of oxygen, 20 tons of water, 10 tons of food, and so on. You get the point.

So now we have 270 tons total, but the Starship is sitting on the launch pad with its propellant tanks empty. To fill all the tanks, we need to add a whopping 1,230 tons of liquid methane and liquid oxygen, giving us a starting weight of 1,500 tons. And that’s just the Starship; the booster, which we’ll talk about another time, gets filled with even more propellant than that.

Once the booster lifts the Starship through the thickest part of our atmosphere and gives it a big push on its way to orbit, the Starship will spend almost all of its propellant getting to orbital speed, which is 7.8 kilometers per second. This is the velocity required by any spacecraft that aims not only to get to space, but also to remain in space and not fall back down, unlike the suborbital rockets that are now bringing billionaires into a weightless euphoria.
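The mass figures above are enough to sketch the math with the ideal rocket equation. In the snippet below, the 1,500-ton wet mass and 270-ton final mass come straight from the text; the ~380-second vacuum specific impulse for the engines is an outside assumption for illustration, not a figure from this article.

```python
import math

# Tsiolkovsky rocket equation with the mass figures from the text:
#   120 t dry + 150 t payload = 270 t final mass; + 1,230 t propellant = 1,500 t wet.
g0 = 9.81        # m/s^2, standard gravity
isp = 380.0      # s, assumed vacuum specific impulse (not a figure from this article)
m_wet = 1500e3   # kg, fully fueled Starship
m_final = 270e3  # kg, dry mass plus payload

delta_v = isp * g0 * math.log(m_wet / m_final)  # ideal velocity change, m/s
print(f"{delta_v / 1000:.1f} km/s")  # ~6.4 km/s
```

A full tank buying only about 6.4 km/s is exactly why the booster’s push matters: reaching a 7.8 km/s orbit from the ground costs well over 9 km/s once gravity and drag losses are included, so the Starship alone cannot do it.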

But the Starship doesn’t use every last drop of propellant getting to orbit. For a normal Starship mission with the parameters stated above, once in orbit there will be just enough propellant left in the main tanks to perform a deorbit burn. Then there are the special, separate header tanks that hold 30 tons of propellant, reserved only for the propulsive landing, because remember: bringing Starships back safely and reusing them is a big part of what makes the SpaceX Starship such a game changer in the world of spaceflight and space exploration.

So how exactly are we supposed to explore the solar system if our Starship is effectively running on empty and is still in low Earth orbit?

Well that’s where refilling comes into play.

Record Breaking Asteroid Super Close to the Sun Found – 2021 PH27

#asteroid #2021PH27 #space #astronomy

Astronomers Discover Fastest-Orbiting Asteroid Ever Seen

The newly-discovered asteroid 2021 PH27 has a diameter of about 1 km (3,280 feet) and orbits the Sun in just 113 days — the shortest known orbital period for an asteroid and second shortest for any object in our Solar System after Mercury.

The asteroid 2021 PH27 was imaged inside Mercury’s orbit and has been colored red and blue to show the two different times when it was imaged on August 13, 2021, just three minutes apart. Image credit: CTIO / NOIRLab / NSF / DOE / DECam / AURA / S.S. Sheppard, Carnegie Institution of Science.

2021 PH27 has a semi-major axis of 70 million km (43 million miles, 0.46 AU), giving it a 113-day orbital period on an unstable elongated orbit that crosses the orbits of both Mercury and Venus.

This means that within a few million years it will likely be destroyed in a collision with one of these planets or the Sun, or it will be ejected from its current position.
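The quoted semi-major axis and period are consistent with Kepler’s third law, which for orbits around the Sun reduces to P² = a³ with P in years and a in AU. A quick check, using only the 0.46 AU figure from the text:

```python
# Kepler's third law for orbits around the Sun: P^2 = a^3 (P in years, a in AU).
a_au = 0.46                        # semi-major axis of 2021 PH27, from the text
period_days = 365.25 * a_au ** 1.5
print(f"{period_days:.0f} days")   # ~114 days, consistent with the quoted 113
```

The one-day discrepancy simply reflects rounding the semi-major axis to two figures.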

2021 PH27 was discovered by Carnegie Institution for Science’s Dr. Scott Sheppard in images taken by Brown University astronomers Ian Dell’Antonio and Shenming Fu on August 13, 2021.

“Most likely 2021 PH27 was dislodged from the main asteroid belt between Jupiter and Mars and the gravity of the inner planets shaped its orbit into its current configuration,” Dr. Sheppard said.

“Although, based on its large angle of inclination of 32 degrees, it is possible that 2021 PH27 is an extinct comet from the outer Solar System that ventured too close to one of the planets as the path of its voyage brought it into proximity with the inner Solar System.”

An illustration of 2021 PH27’s orbit. Image credit: CTIO / NOIRLab / NSF / AURA / J. da Silva, Spaceengine.org.

Because 2021 PH27 orbits so deep within the Sun’s gravitational field, it experiences the largest general relativistic effects of any known solar system object.

This is seen in a slight angular deviation in its elliptical orbit over time, a movement called precession, which occurs at about one arcminute per century.

Observation of Mercury’s precession puzzled scientists until Albert Einstein’s theory of general relativity explained its orbital adjustments over time. The precession of 2021 PH27 is even faster than Mercury’s.
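The one-arcminute-per-century figure can be checked against the standard general-relativistic perihelion-advance formula, Δφ = 6πGM☉ / (c²a(1−e²)) per orbit. The semi-major axis and 113-day period are from the text; the eccentricity of about 0.71 reported for 2021 PH27 is an assumed input here:

```python
import math

# General-relativistic perihelion advance per orbit:
#   dphi = 6*pi*G*M_sun / (c^2 * a * (1 - e^2))
GM_sun = 1.32712e20     # m^3/s^2, standard gravitational parameter of the Sun
c = 2.99792458e8        # m/s, speed of light
AU = 1.495979e11        # m
a = 0.46 * AU           # semi-major axis, from the text
e = 0.71                # reported eccentricity of 2021 PH27 (assumed here)
period_days = 113.0     # orbital period, from the text

dphi_per_orbit = 6 * math.pi * GM_sun / (c**2 * a * (1 - e**2))  # radians
orbits_per_century = 36525.0 / period_days
arcsec_per_century = math.degrees(dphi_per_orbit * orbits_per_century) * 3600
print(f"{arcsec_per_century:.0f} arcsec per century")  # ~54 arcsec, about one arcminute
```

For comparison, Mercury’s relativistic precession is 43 arcseconds per century, so 2021 PH27 does indeed precess faster.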

“2021 PH27 gets so close to the Sun that its surface temperature gets to 482 degrees Celsius (900 degrees Fahrenheit) at closest approach, hot enough to melt lead,” Dr. Sheppard said.

The asteroid will soon pass behind the Sun and be unobservable from Earth until early next year, at which time observers will be able to refine its orbit to the precision needed to give it an official name.

The discovery of 2021 PH27 is reported in the Minor Planet Electronic Circular.

Astronomer reveals never-before-seen detail of the center of our galaxy

#Astronomer #astronomy #space #galaxy #center #chandra #milkyway #interstellarenergy

New image made using NASA’s Chandra X-Ray Observatory hints at previously unknown interstellar energy source at the Milky Way center

New research reveals, with unprecedented clarity, details of violent phenomena in the center of our galaxy.

New research by University of Massachusetts Amherst astronomer Daniel Wang reveals, with unprecedented clarity, details of violent phenomena in the center of our galaxy. The images, published recently in Monthly Notices of the Royal Astronomical Society, document an X-ray thread, G0.17-0.41, which hints at a previously unknown interstellar mechanism that may govern the energy flow and potentially the evolution of the Milky Way.

“The galaxy is like an ecosystem,” says Wang, a professor in UMass Amherst’s astronomy department, whose findings are a result of more than two decades of research. “We know the centers of galaxies are where the action is and play an enormous role in their evolution.” And yet, whatever has happened in the center of our own galaxy is hard to study, despite its relative proximity to Earth, because, as Wang explains, it is obscured by a dense fog of gas and dust. Researchers simply can’t see the center, even with an instrument as powerful as the famous Hubble Space Telescope. Wang, however, has used a different telescope, NASA’s Chandra X-Ray Observatory, which “sees” X-rays, rather than the rays of visible light that we perceive with our own eyes. These X-rays are capable of penetrating the obscuring fog — and the results are stunning.

Wang’s findings, which were supported by NASA, give the clearest picture yet of a pair of X-ray-emitting plumes that are emerging from the region near the massive black hole lying at the center of our galaxy. Even more intriguing is the discovery of an X-ray thread called G0.17-0.41, located near the southern plume. “This thread reveals a new phenomenon,” says Wang. “This is evidence of an ongoing magnetic field reconnection event.” The thread, writes Wang, probably represents “only the tip of the reconnection iceberg.”

A magnetic field reconnection event is what happens when two opposing magnetic fields are forced together and combine with one another, releasing an enormous amount of energy. “It’s a violent process,” says Wang, and is known to be responsible for such well-known phenomena as solar flares, which produce space weather powerful enough to disrupt power grids and communications systems here on Earth. They also produce the spectacular Northern Lights. Scientists now think that magnetic reconnection also occurs in interstellar space and tends to take place at the outer boundaries of the expanding plumes driven out of our galaxy’s center.

“What is the total amount of energy outflow at the center of the galaxy? How is it produced and transported? And how does it regulate the galactic ecosystem?” These, says Wang, are the fundamental questions whose answers will help to unlock the history of our galaxy. Though much work remains to be done, Wang’s new map points the way. For more information, including additional images and video, visit the Chandra X-Ray Observatory’s Galactic Center website.

Chandra Survey of Galactic Center
A panorama of the Galactic Center builds on previous surveys from Chandra and other telescopes. This latest version expands Chandra’s high-energy view farther above and below the plane of the galaxy – that is, the disk where most of the galaxy’s stars reside – than previous imaging campaigns. In the first two images, X-rays from Chandra are orange, green, and purple, showing different X-ray energies, and the radio data from MeerKAT are gray. Credit: X-ray: NASA/CXC/UMass/Q.D. Wang; Radio: NRF/SARAO/MeerKAT

Chandra Survey of Galactic Center Labeled
This version of the image highlights several key features of the new Galactic Center survey. The threads are labeled with red rectangles; X-rays reflected from dust around bright X-ray sources are marked with green circles; and purple circles and ellipses outline Sagittarius A*, the Arches and Quintuplet Clusters, DB00-58 and DB00-6, 1E 1743.1-28.43, the Cold Gas Cloud, and Sagittarius C. Credit: X-ray: NASA/CXC/UMass/Q.D. Wang; Radio: NRF/SARAO/MeerKAT

The Best Evidence for Life on Mars Might be Found on its Moons

#space #lifeonmars #mars #moons #astronomy #extraterrestrial #ET

The search for Martian life has been ongoing for decades. Various landers and rovers have searched for biosignatures or other hints that life exists now or existed in the past on the Red Planet. But so far, results have been inconclusive. That might be about to change, with a slew of missions planned to collect even more samples for testing. And Mars itself isn’t the only place scientists are looking: some think the best place to find evidence of life is one of Mars’ moons.

Phobos and Deimos are usually an afterthought in discussions of Mars exploration priorities, but interest has been growing recently because of their unique place in the overall Martian system: they may serve as a repository for material blasted off Mars’ surface in the past.

UT video discussing the possibility of life on Mars.

Many scientists think that early Mars could have been habitable, with temperatures in a biologically suitable range, an atmosphere that hadn’t yet been stripped away, and liquid water flowing on its surface, some of which carved Jezero Crater, where Perseverance is now exploring. If any life existed back in these more hospitable conditions, it would have been subjected to the catastrophes commonly thought of as extinction-level events here on Earth: asteroid impacts.

Asteroid impacts were much more common early in the solar system’s history, ejecting large amounts of Martian regolith into space. While some of that ejecta takes the form of meteorites that eventually wind up on Earth, a large amount is absorbed by the moons, particularly Phobos. Scientists estimate that over 1 billion kg of ejected material was deposited relatively evenly across Phobos’ surface, making up over 1,000 parts per million of the material on the small moon.
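A rough consistency check of those two numbers: spreading 10⁹ kg evenly over Phobos and requiring a 1,000 ppm mixing ratio implies the ejecta is blended into roughly the top few tens of centimeters of regolith. The mean radius (~11 km) and regolith density (~1,900 kg/m³) below are assumed round values for illustration, not figures from the article.

```python
import math

ejecta_mass = 1e9        # kg of Mars ejecta on Phobos (from the text)
mixing_ratio = 1000e-6   # 1,000 parts per million (from the text)
radius = 11e3            # m, assumed mean radius of Phobos
rho_regolith = 1900.0    # kg/m^3, assumed regolith density

surface_area = 4 * math.pi * radius**2       # ~1.5e9 m^2, treating Phobos as a sphere
areal_ejecta = ejecta_mass / surface_area    # kg of ejecta per square meter
# Depth of regolith into which the ejecta must be mixed to dilute it to 1,000 ppm:
layer_depth = areal_ejecta / (mixing_ratio * rho_regolith)
print(f"{layer_depth:.2f} m")  # ~0.35 m
```

A sub-meter mixing depth is plausible for an impact-gardened surface, so the two quoted figures hang together under these assumptions.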

UT video discussing how life on Mars and Earth could be related.

The moon itself is incapable of supporting life: it has no water to speak of and is constantly irradiated by the sun and by galactic cosmic rays. No life could survive on its surface, yet searching for life on Phobos still has some major advantages over searching on Mars itself.

While Mars doesn’t have a traditional weather cycle like Earth’s, its surface changes regularly, with dust storms and wind eroding and burying long-standing geological features. Both Martian moons lack any such system, so any biosignature that arrived there after an asteroid impact would likely still be in the same position today, in much the same shape it was in when it was blasted into space.

UT video discussing the Mars Sample Return mission.

This is all great in theory, but getting data to prove it is another matter entirely. Luckily, a series of missions in the works will attempt to do just that. The Mars Sample Return (MSR) campaign is already underway, with Perseverance’s jaunt in Jezero Crater as the first step, and the Japanese space agency’s Martian Moons eXploration (MMX) mission plans to return a regolith sample from Phobos to Earth in 2029.

Another advantage MMX would have over MSR is that the debris spread across Phobos’ surface isn’t specific to a particular area of Mars, unlike the Jezero samples Perseverance is currently collecting. Because asteroid impacts strike Mars essentially at random, even life that arose in only one region would likely have been caught up in an impact and partially deposited on Phobos. Scientists have a much better chance of finding that evidence there than of guessing the right place to look on Mars with no prior knowledge.

UT video on why it might be better to send humans to Mars’ moons first.

No matter where they look, and no matter what they find, scientists working on both the MSR and MMX missions will be adding valuable knowledge to humanity’s stockpile.  And if they happen to find evidence of one of the most important discoveries in history, so much the better.

Dyson Spheres Around Super Massive Black Holes

#DysonSphere #BlackHoles #space #astronomy #physics #aliens #extraterrestrial

An artist’s impression of a Dyson sphere surrounding a star DOTTED YETI/SHUTTERSTOCK

Black holes surrounded by massive, energy-harvesting structures could power alien civilizations!

Listen to this article

In the long-running TV show Doctor Who, aliens known as Time Lords derive their power from the captured heart of a black hole, which supplies energy for their planet and their time-travel technology. The idea has merit, according to a new study. Researchers have shown that a highly advanced alien civilization could theoretically build a megastructure called a Dyson sphere around a black hole to harness its energy, which can be 100,000 times that of our Sun. The work could even give us a way to detect such extraterrestrial societies.

“I like these speculations about what advanced civilizations might do,” says Tomáš Opatrný, a physicist at Palacký University Olomouc, who was not involved with the work but agrees that a Dyson sphere around a black hole would provide its builders with lots of power.

If humanity’s energy demands continue to grow, a point will come when our power consumption approaches, or even exceeds, the total energy available to our planet. So argued physicist Freeman Dyson way back in 1960. Borrowing from British sci-fi author Olaf Stapledon, Dyson proposed that any sufficiently advanced civilization that wanted to survive would need to build massive structures around stars that could harness their energy.

Most of these Dyson spheres involve numerous satellites orbiting or sitting motionlessly around a star. (A solid shell totally encasing a solar body—as envisioned in a Star Trek: The Next Generation episode—is considered mechanically impossible, because of the gravity and pressure from the central star.) Such megastructures would have to transform that solar energy into usable energy, a process that creates waste heat. This heat shows up in the midinfrared spectrum, and stars with an excess infrared signal have become a key target in the search for extraterrestrial life.

But astronomer Tiger Hsiao of National Tsing Hua University says we might be looking for the wrong thing. In a new study, he and colleagues set out to calculate whether it would also be possible to use a Dyson sphere around a black hole. They analyzed black holes of three different sizes: five, 20, and 4 million times the mass of our Sun. The first two reflect the lower and upper limits of black holes known to have formed from the collapse of massive stars; the third is the far greater mass of Sagittarius A*, the supermassive black hole thought to lurk at the center of the Milky Way.

Black holes are typically thought of as consumers rather than producers of energy. Yet their huge gravitational fields can generate power through several theoretical processes. These include the radiation emitted from the accumulation of gas around the hole, the spinning “accretion” disk of matter slowly falling toward the event horizon, the relativistic jets of matter and energy that shoot out along the hole’s axis of rotation, and Hawking radiation—a theoretical way that black holes can lose mass, releasing energy in the process.

From their calculations, Hsiao and colleagues concluded that the accretion disk, surrounding gas, and jets of black holes can all serve as viable energy sources. In fact, the energy from the accretion disk alone of a stellar black hole of 20 solar masses could provide the same amount of power as Dyson spheres around 100,000 stars, the team will report next month in the Monthly Notices of the Royal Astronomical Society. Were a supermassive black hole harnessed, the energy it could provide might be 1 million times larger still.

If such technology is at work, there may be a way to spot it. According to the researchers, the waste heat signal from a so-called “hot” Dyson sphere—one somehow capable of surviving temperatures in excess of 3000 kelvin, above the melting point of known metals—around a stellar-mass black hole in the Milky Way would be detectable at ultraviolet wavelengths. Such signals might be found in the data from various telescopes, including NASA’s Hubble Space Telescope and Galaxy Evolution Explorer, Hsiao says.

Meanwhile, a “solid” Dyson sphere—operating below 3000 kelvin—could be picked up in the infrared by, for example, the Sloan Digital Sky Survey or the Wide-field Infrared Survey Explorer. The latter is no stranger to looking for the infrared signals of traditional, star-based Dyson spheres. But, like all other such searches, it has yet to find anything conclusive.

Opatrný says using the radiation from accretion disks would be particularly clever, because the disks convert energy more efficiently than the thermonuclear reactions in conventional stars. Aliens concerned with the sustainability of their power supply, he suggests, might be better off encapsulating small stars that burn their fuel slowly. However, he adds, “The fast-living civilizations feeding on black hole accretion disks would be easier to spot from the huge amount of waste heat they produce.”
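Opatrný’s efficiency point can be made concrete with textbook figures: hydrogen fusion releases about 0.7% of the fuel’s rest-mass energy, while accretion onto a black hole can release roughly 6% (non-spinning) to around 40% (rapidly spinning). A sketch comparing energy yield per kilogram of fuel, using these standard approximate efficiencies (they are not taken from the study itself):

```python
C = 299_792_458  # speed of light, m/s

def energy_per_kg(efficiency):
    """Energy released per kilogram of fuel, E = eta * m * c^2."""
    return efficiency * C**2

fusion = energy_per_kg(0.007)         # H -> He fusion, ~0.7% of rest mass
schwarzschild = energy_per_kg(0.057)  # accretion onto a non-spinning hole
kerr = energy_per_kg(0.42)            # accretion onto a maximally spinning hole

print(f"Fusion:               {fusion:.2e} J/kg")
print(f"Accretion (no spin):  {schwarzschild:.2e} J/kg "
      f"(~{schwarzschild / fusion:.0f}x fusion)")
print(f"Accretion (max spin): {kerr:.2e} J/kg "
      f"(~{kerr / fusion:.0f}x fusion)")
```

Per kilogram of infalling matter, even a non-spinning black hole’s disk outperforms stellar fusion roughly eightfold, and a fast-spinning one by a factor of about 60.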

Inoue Makoto, an astrophysicist from the Academia Sinica Institute of Astronomy and Astrophysics, says regular black holes could support so-called type II civilizations, whose total energy requirements match those of an entire star system. Supermassive black holes, he adds, could fuel type III civilizations, whose power consumption would equal that emitted by an entire galaxy.

As for what the aliens might use this energy for, Opatrný has some thoughts. “Mining cryptocurrency, playing computer games, or just feeding the ever-growing bureaucracy?” he jokingly muses. Either way, maybe the time lords were onto something after all.

Fireball blazes across Texas sky

#Fireball #Texas #sky #astronomy #space

NASA has programs devoted to tracking the exceptionally bright meteors.

Listen to this article

Texas residents were stunned to see a fireball blaze across the sky on Sunday night. 

According to NASA Meteor Watch, the celestial spectacle passed overhead just before 9 p.m. CT. 

“Hundreds of eyewitnesses in the states of Texas, Louisiana, Arkansas and Oklahoma report seeing a very bright fireball last night at 8:58 PM Central Daylight Time,” the agency said in a Facebook post on Monday. “Analysis of their reports, combined with information obtained from a couple of videos from public/amateur cameras, shows that the meteor was first seen 48 miles above Texas Highway 11, between Sulphur Springs and Winnsboro. Moving northeast at 30,000 miles per hour, it traveled 59 miles through the upper atmosphere before fragmenting 27 miles above U.S. 82, east of Avery.”

“The fireball was at least as bright as a quarter moon, which translates to something bigger than 6 inches in diameter with a weight of 10 pounds. The slow speed (for a meteor) suggests a small piece of an asteroid produced the fireball,” it added. 

Hundreds of witnesses uploaded reports to the nonprofit American Meteor Society (AMS), including three videos, and CBSDFW.com reported Monday that some claimed to have heard a “sonic boom.”

Fireballs are a common occurrence, and NASA has programs devoted to tracking these exceptionally bright meteors.

With explosive new result, laser-powered fusion effort nears ‘ignition’

#laser-powered #fusion #ignition #NIF #physics

An artist’s rendering shows how the National Ignition Facility’s 192 beams enter an eraser-size cylinder of gold and heat it from the inside to produce x-rays, which then implode the fuel capsule at its center to create fusion. LAWRENCE LIVERMORE NATIONAL LABORATORY

Listen to this article

More than a decade ago, the world’s most energetic laser started to unleash its blasts on tiny capsules of hydrogen isotopes, with managers promising it would soon demonstrate a route to limitless fusion energy. Now, the National Ignition Facility (NIF) has taken a major leap toward that goal. Last week, a single laser shot sparked a fusion explosion from a peppercorn-size fuel capsule that produced eight times more energy than the facility had ever achieved: 1.35 megajoules (MJ)—roughly the kinetic energy of a car traveling at 160 kilometers per hour. That was also 70% of the energy of the laser pulse that triggered it, making it tantalizingly close to “ignition”: a fusion shot producing an excess of energy.
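Both of those headline comparisons are easy to sanity-check with a line of arithmetic; the 1500 kg car mass below is an illustrative assumption, not a figure from the article:

```python
# Sanity check of the article's two comparisons (car mass is an assumption).
YIELD_MJ = 1.35   # fusion energy released by the record shot
LASER_MJ = 1.9    # energy delivered by the laser pulse

car_mass_kg = 1500                  # a typical car (assumed)
v = 160 / 3.6                       # 160 km/h converted to m/s
kinetic_mj = 0.5 * car_mass_kg * v**2 / 1e6

gain = YIELD_MJ / LASER_MJ          # ignition requires this to exceed 100%

print(f"Car kinetic energy: {kinetic_mj:.2f} MJ (vs. {YIELD_MJ} MJ yield)")
print(f"Yield / laser energy: {gain:.0%}")
```

The kinetic energy comes out just under 1.5 MJ, matching the article’s comparison, and the yield-to-laser ratio lands at the quoted ~70%.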

“After many years at 3% of ignition, this is superexciting,” says Mark Herrmann, head of the fusion program at Lawrence Livermore National Laboratory, which operates NIF.

NIF’s latest shot “proves that a small amount of energy, imploding a small amount of mass, can get fusion. It’s a wonderful result for the field,” says physicist Michael Campbell, director of the Laboratory for Laser Energetics (LLE) at the University of Rochester.

“It’s a remarkable achievement,” adds plasma physicist Steven Rose, co-director of the Centre for Inertial Fusion Studies at Imperial College London. “It’s made me feel very cheerful. … It feels like a breakthrough.”

And it is none too soon, as years of slow progress have raised questions about whether laser-powered fusion has a practical future. Now, according to LLE Chief Scientist Riccardo Betti, researchers need to ask: “What is the maximum fusion yield you can get out of NIF? That’s the real question.”

Fusion, which powers stars, forces small atomic nuclei to meld together into larger ones, releasing large amounts of energy. Extremely hard to achieve on Earth because of the heat and pressure required to join nuclei, fusion continues to attract scientific and commercial interest because it promises copious energy, with little environmental impact.

Yet among the many approaches being investigated, none has yet generated more energy than was needed to cause the reaction in the first place. Large doughnut-shaped reactors called tokamaks, which use magnetic fields to cage a superhot plasma for long enough to heat nuclei to fusion temperatures, have long been the front-runners to achieve a net energy gain. But the giant $25 billion ITER project in France is not expected to get there for more than another decade, although private fusion companies are promising faster progress.

NIF’s approach, known as inertial confinement fusion, uses a giant laser housed in a facility the size of several U.S. football fields to produce 192 beams that are focused on a target in a brief, powerful pulse—1.9 MJ over about 20 nanoseconds. The aim is to get as much of that energy as possible into the target capsule, a diminutive sphere filled with the hydrogen isotopes deuterium and tritium mounted inside a cylinder of gold the size of a pencil eraser. The gold vaporizes, producing a pulse of x-rays that implodes the capsule, driving the fusion fuel into a tiny ball hot and dense enough to ignite fusion. In theory, if such tiny fusion blasts could be triggered at a rate of about 10 per second, a power plant could harvest energy from the high-speed neutrons produced to generate electricity.
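The 10-shots-per-second figure gives a feel for the scale such a plant would reach at today’s yields. A quick estimate, where the 40% thermal-to-electric conversion efficiency is an assumed round number and the (currently enormous) energy consumed by the laser itself is ignored:

```python
SHOT_YIELD_J = 1.35e6      # fusion yield per capsule (NIF's recent record shot)
SHOTS_PER_SECOND = 10      # repetition rate quoted for a notional power plant
THERMAL_TO_ELECTRIC = 0.4  # assumed turbine conversion efficiency

fusion_power_mw = SHOT_YIELD_J * SHOTS_PER_SECOND / 1e6
electric_power_mw = fusion_power_mw * THERMAL_TO_ELECTRIC

print(f"Gross fusion power:   {fusion_power_mw:.1f} MW thermal")
print(f"Gross electric power: {electric_power_mw:.1f} MW")
```

A few megawatts gross, before even paying for the laser, which is why per-shot yields would need to rise dramatically before such a plant becomes plausible.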

When NIF launched, computer models predicted quick success, but fusion shots in the early years generated only about 1 kilojoule (kJ) each. A long effort to better understand the physics of implosions followed, and by last year shots were producing 100 kJ. Key improvements included smoothing out microscopic bumps and pits on the fuel capsule surface, reducing the size of the hole in the capsule used to inject fuel, shrinking the holes in the gold cylinder so less energy escapes, and extending the laser pulse to keep driving the fuel inward for longer. The progress was sorely needed, as NIF’s funder, the National Nuclear Security Administration, was reducing shots devoted to ignition in favor of using its lasers for other experiments simulating the workings of nuclear weapons.

Earlier this year, combining those improvements in various ways, the NIF team produced several shots exceeding 100 kJ, including one of 170 kJ. That result suggested NIF was finally creating a “burning plasma,” in which the fusion reactions themselves provide the heat for more fusion—a runaway reaction that is key to getting higher yields. Then, on 8 August, a shot generated the remarkable 1.35 MJ. “It was a surprise to everyone,” Herrmann says. “This is a whole new regime.”

Exactly which improvements had the greatest impact and what combination will lead to future gains will take a while to unravel, Herrmann says, because several were tweaked at once in the latest shot. “It’s a very nonlinear process. That’s why it’s called ignition: It’s a runaway thing,” he says. But, “This gives us a lot more encouragement that we can go significantly farther.”

Herrmann’s team is a long way from thinking about fusion power plants, however. “Getting fusion in a laboratory is really hard, getting economic fusion power is even harder,” Campbell says. “So, we all have to be patient.” NIF’s main task remains ensuring the U.S. nuclear weapons stockpile is safe and reliable; fusion energy is something of a sideline. But reaching ignition and being able to study and simulate the process will also “open a new window on stewardship,” Herrmann says, because uncontrolled fusion powers nuclear weapons.

Herrmann admits that, when he got a text last week from colleagues saying they’d gotten an “interesting” result from the latest shot, he was worried something might be wrong with the instruments. When that proved not to be the case, “I did open a bottle of champagne.”

Searching for life on Mars and its moons

#life #Mars #moon #space #Extraterrestrial

Listen to this article

The scientific exploration of Mars over the past several decades has resulted in increasing evidence that the martian surface hosted habitable environments early in its history, as well as evidence of the building blocks of life in the form of organic molecules. Habitats on Mars that could harbor extant martian life have been hypothesized, such as subsurface environments, caves, and ice deposits. Mars is currently recognized as a “paleo-habitable” planet, reflecting its ancient habitability. Fully understanding the evolution of habitability and whether Mars has ever hosted life will be essential to understanding and exploring other extraterrestrial habitable environments and potential life-forms. Flagship missions of multiple space agencies in the 2020s will play essential and complementary roles and could finally provide an answer to these long-standing questions.

The planned Mars Sample Return (MSR) mission of NASA and the European Space Agency should reveal more about the habitability of Mars by helping to determine the geologic evolution of Jezero crater and its surrounding areas, which are believed to be the site of an ancient lake (see the photo). The Mars 2020 Perseverance rover will attempt to collect samples that will allow scientists to explore the evolution of Jezero crater and its habitability over time, as well as samples that may contain evidence of biosignatures. A high-priority science objective for MSR returned-sample science is to understand the habitability of Mars and look for potential signs of both extinct and extant life.

Mars is not alone: it has two small moons, Phobos and Deimos. Throughout the history of Mars, numerous asteroidal impacts have produced martian impact ejecta, and a fraction of the ejected material has been delivered to its moons. Phobos is closer to Mars, so it has received more martian ejecta than Deimos. Numerical simulations show that >10⁹ kg of martian material could be uniformly mixed in the regolith of Phobos; the resultant martian fraction is >1000 parts per million.

Even if martian life-forms existed and could survive the transport to Phobos without suffering impact-shock decomposition (peak pressure <5 GPa), the Phobos environment is highly inhospitable. Phobos has no air or water, and its surface is constantly bathed in solar and galactic cosmic radiation. This indicates that martian materials on Phobos’ surface almost certainly do not contain any living microorganisms.

Jezero crater on Mars is believed to be the site of an ancient lake. The Mars 2020 Perseverance rover aims to collect samples from the crater to analyze for evidence of life.

Instead, there may be dead biosignatures on Phobos, which we have called “SHIGAI” (Sterilized and Harshly Irradiated Genes, and Ancient Imprints); the acronym in Japanese means “dead remains.” SHIGAI includes any potential microorganisms that could have been alive on Mars and were recently sterilized during or after the delivery to Phobos, and the microorganisms and biomarkers that had been processed on ancient Mars before the delivery to Phobos, including potential DNA fragments. The Mars-moon system is an ideal natural laboratory for the study of interplanetary transport and sustainability of SHIGAI on airless bodies in the Solar System.

Should a martian biosphere exist, any biosignatures or biomarkers observed in the samples from Jezero crater could be widespread elsewhere on Mars and possibly occur on the surface of Phobos. Because martian ejecta has been delivered to Phobos by impact-driven random sampling, the biosignatures and biomarkers that may be contained in the Phobos regolith could reflect the diversity and evolution of a potential martian biosphere.

Martian Moons eXploration (MMX), developed by the Japan Aerospace Exploration Agency, plans to collect a sample of >10 g from the Phobos surface and return it to Earth in 2029. Detection of a “fingerprint” of martian life and SHIGAI should be achievable through comprehensive comparative studies of martian material from the Phobos surface returned by MMX and samples from Jezero crater returned by MSR.

The MSR samples have the potential to contain a variety of biomarker molecules (e.g., lipids, such as hopanoids, sterols, and archaeols, and their diagenetic products). The sample could include modern living organisms from Jezero crater, if they are present. Of course, MSR could return samples without any evidence of life because of the focus on a single location. A distinct advantage for MMX is the ability to deliver martian materials derived from several regions. The random nature of the crater-forming impacts on Mars statistically delivers all possible martian materials, from sedimentary to igneous rocks that cover all of its geological eras.

Mutual international cooperation on MSR and MMX could answer questions such as how martian life, if present, emerged and evolved in time and place. If Mars never had life at all, these missions would then be absolutely vital in unraveling why Mars is lifeless and Earth has life. Therefore, the missions may eventually provide the means to decipher the divergent evolutionary paths of life on Mars and Earth.

Toward next-generation brain-computer interface systems

A new kind of neural interface system that coordinates the activity of hundreds of tiny brain sensors could one day deepen understanding of the brain and lead to new medical therapies

#Brain #BCI #computer #interface #sensor

Listen to this article

Brain-computer interfaces (BCIs) are emerging assistive devices that may one day help people with brain or spinal injuries to move or communicate. BCI systems depend on implantable sensors that record electrical signals in the brain and use those signals to drive external devices like computers or robotic prosthetics.

Most current BCI systems use one or two sensors to sample up to a few hundred neurons, but neuroscientists are interested in systems that are able to gather data from much larger groups of brain cells.

Now, a team of researchers has taken a key step toward a new concept for a future BCI system — one that employs a coordinated network of independent, wireless microscale neural sensors, each about the size of a grain of salt, to record and stimulate brain activity. The sensors, dubbed “neurograins,” independently record the electrical pulses made by firing neurons and send the signals wirelessly to a central hub, which coordinates and processes the signals.

In a study published on August 12 in Nature Electronics, the research team demonstrated the use of nearly 50 such autonomous neurograins to record neural activity in a rodent.

The results, the researchers say, are a step toward a system that could one day enable the recording of brain signals in unprecedented detail, leading to new insights into how the brain works and new therapies for people with brain or spinal injuries.

“One of the big challenges in the field of brain-computer interfaces is engineering ways of probing as many points in the brain as possible,” said Arto Nurmikko, a professor in Brown’s School of Engineering and the study’s senior author. “Up to now, most BCIs have been monolithic devices — a bit like little beds of needles. Our team’s idea was to break up that monolith into tiny sensors that could be distributed across the cerebral cortex. That’s what we’ve been able to demonstrate here.”

The team, which includes experts from Brown, Baylor University, the University of California, San Diego and Qualcomm, began developing the system about four years ago. The challenge was twofold, said Nurmikko, who is affiliated with Brown’s Carney Institute for Brain Science. The first part required shrinking the complex electronics involved in detecting, amplifying and transmitting neural signals into the tiny silicon neurograin chips. The team first designed and simulated the electronics on a computer, then went through several fabrication iterations to develop operational chips.

The second challenge was developing the body-external communications hub that receives signals from those tiny chips. The device is a thin patch, about the size of a thumbprint, that attaches to the scalp outside the skull. It works like a miniature cellular phone tower, employing a network protocol to coordinate the signals from the neurograins, each of which has its own network address. The patch also supplies power wirelessly to the neurograins, which are designed to operate using a minimal amount of electricity.

“This work was a true multidisciplinary challenge,” said Jihun Lee, a postdoctoral researcher at Brown and the study’s lead author. “We had to bring together expertise in electromagnetics, radio frequency communication, circuit design, fabrication and neuroscience to design and operate the neurograin system.”

The goal of this new study was to demonstrate that the system could record neural signals from a living brain — in this case, the brain of a rodent. The team placed 48 neurograins on the animal’s cerebral cortex, the outer layer of the brain, and successfully recorded characteristic neural signals associated with spontaneous brain activity.

The team also tested the devices’ ability to stimulate the brain as well as record from it. Stimulation is done with tiny electrical pulses that can activate neural activity. The stimulation is driven by the same hub that coordinates neural recording and could one day restore brain function lost to illness or injury, researchers hope.

The size of the animal’s brain limited the team to 48 neurograins for this study, but the data suggest that the current configuration of the system could support up to 770. Ultimately, the team envisions scaling up to many thousands of neurograins, which would provide a currently unattainable picture of brain activity.

“It was a challenging endeavor, as the system demands simultaneous wireless power transfer and networking at the mega-bit-per-second rate, and this has to be accomplished under extremely tight silicon area and power constraints,” said Vincent Leung, an associate professor in the Department of Electrical and Computer Engineering at Baylor. “Our team pushed the envelope for distributed neural implants.”

There’s much more work to be done to make that complete system a reality, but researchers said this study represents a key step in that direction.

“Our hope is that we can ultimately develop a system that provides new scientific insights into the brain and new therapies that can help people affected by devastating injuries,” Nurmikko said.

Other co-authors on the research were Ah-Hyoung Lee (Brown), Jiannan Huang (UCSD), Peter Asbeck (UCSD), Patrick P. Mercier (UCSD), Stephen Shellhammer (Qualcomm), Lawrence Larson (Brown) and Farah Laiwalla (Brown). The research was supported by the Defense Advanced Research Projects Agency (N66001-17-C-4013).

Japan tests rotating detonation engine for the first time in space

#Japan #tests #rotating #detonationengine #space #JAXA #Rocketengine

Japan tests rotating detonation engine for the first time in space. Credit: JAXA
Listen to this article

The Japan Aerospace Exploration Agency (JAXA) has announced that it has successfully demonstrated the operation of a “rotating detonation engine” for the first time in space. The appeal of the technology is that such engines generate substantial thrust while using much less fuel than conventional rocket engines, a significant advantage for space exploration.

On July 27, the Japanese agency launched a pair of futuristic propulsion systems into space for their first tests. They lifted off from the Uchinoura Space Center aboard the S-520-31, a single-stage rocket capable of lofting a 220-pound (100 kg) payload well above 186 miles (300 km). After recovering the rocket from the ocean, JAXA engineers analyzed the data and confirmed the success of the mission, which carried the new system to an estimated altitude of 234.9 km (146 miles).

The rotating detonation engine uses a series of controlled explosions that travel around an annular channel in a continuous loop. This process generates a large amount of highly efficient thrust from a much smaller engine using significantly less fuel, which also means launching less weight. According to JAXA, it has the potential to be a game-changer for deep space exploration.

The rocket began the test demonstrations after the first stage separated, burning the rotating detonation engine for six seconds, while a second pulse detonation engine operated for two seconds on three occasions. The pulse engine uses detonation waves to combust the fuel and oxidizer mixture.

When the rocket was recovered after the demonstration, the team found that the rotating detonation engine had produced about 500 newtons of thrust, only a fraction of what conventional rocket engines achieve in space.

According to JAXA engineers, the successful in-space test has greatly increased the likelihood that the detonation engine will see practical applications, including rocket motors for deep space exploration, first- and second-stage engines, and more. Such engines could allow us to travel deep into space using a fraction of the fuel and weight, which will be critical on interplanetary journeys.

Study Uncovers Mysterious Radio Objects, Some Hard to Explain

Listen to this article

#space #astronomy #radioobjects #radioastronomy #FRB #LOFARradioastronomy

FM radio waves reveal a side of the universe invisible to the human eye.

The Hercules A black hole jets captured in a high-resolution image captured by the LOFAR radio telescope. The images revealed that the jet grows stronger and weaker every few hundred thousand years. This variability produced the structure of the jet. (Image credit: R. Timmerman; LOFAR & Hubble Space Telescope)

The most detailed radio images of galaxies outside the Milky Way have been captured by a network of 70,000 radio antennas spread over nine European countries.

The images reveal a side of the universe invisible to optical telescopes and provide a glimpse into some of the most mysterious cosmic phenomena, such as the activity of supermassive black holes at galactic centers. 

A team of astronomers behind the Low Frequency Array (LOFAR), a radio telescope network managed by the Netherlands Institute for Radio Astronomy (Astron), worked for 10 years to produce the images. 

Leah Morabito, assistant professor of physics at the University of Durham in England, led the effort to improve the standard resolution of LOFAR images. By including more antennas and with the help of supercomputers, they improved the resolution by a factor of 20. 

Morabito told Space.com in an email that the images provide the highest-ever resolution in the FM radio frequency band, between 88 and 108 megahertz, which is used for radio broadcasting on Earth. The biggest accomplishment, however, was being able to combine this high resolution with a wide field of view, she added.

“That’s totally unique,” Morabito said. “It will allow us to survey the entire northern sky in just a few years. Telescopes with comparable resolution have a field of view almost 20 times smaller, and therefore an all-sky survey isn’t logistically possible. No other current or planned radio telescope will have this combination of field of view and resolution.”

Celestial objects including stars, some planets and black holes emit radio waves, which are not visible to optical telescopes. Unlike visible light, these radio waves penetrate through clouds of dust and gas, revealing a picture of the universe that would otherwise be hidden. 

The GIF shows the difference between standard resolution images and the new high resolution images captured by the radio telescope LOFAR.
(Image credit: L.K. Morabito; LOFAR Surveys KSP)

Supermassive black holes are among the most powerful sources of radio waves in the universe. The LOFAR imaging campaign therefore focused on them, looking for jets of material ejected from these black holes, which can’t be detected in the optical spectrum.

“These high-resolution images allow us to zoom in to see what’s really going on when supermassive black holes launch radio jets, which wasn’t possible before at frequencies near the FM radio band,” Neal Jackson of the University of Manchester in England, who cooperated on the project, said in a statement issued by Astron.

LOFAR usually uses only antennas in the Netherlands, but that limits the effective aperture of the virtual telescope to only 75 miles (120 kilometers). The diameter of the telescope, in turn, limits its resolution. 

A galaxy imaged by the LOFAR radio telescope. (Image credit: LOFAR)

The astronomers, however, found a way to integrate antennas in nine European countries, which enabled them to increase the diameter to 1,200 miles (2,000 km) and achieve 20 times better resolution. 
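As a rough sanity check of those numbers, a telescope's diffraction-limited angular resolution scales as wavelength over aperture diameter, θ ≈ λ/D. The sketch below assumes a wavelength of about 3 m (roughly 100 MHz, near the FM band mentioned earlier); the resulting ~17x improvement is close to the factor of 20 the team reports, with the remainder presumably coming from processing gains.

```python
# Diffraction-limited angular resolution: theta ~ wavelength / diameter.
# Wavelength of 3 m (~100 MHz) is an assumption for illustration.
wavelength_m = 3.0
d_netherlands_m = 120e3   # Netherlands-only baseline, 120 km
d_european_m = 2000e3     # full European baseline, 2,000 km

theta_nl = wavelength_m / d_netherlands_m   # radians
theta_eu = wavelength_m / d_european_m

# Convert to arcseconds (1 radian = 206,265 arcsec).
print(f"NL-only resolution:  {theta_nl * 206265:.1f} arcsec")
print(f"European resolution: {theta_eu * 206265:.2f} arcsec")
print(f"Improvement factor:  {theta_nl / theta_eu:.1f}x")
```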

Observations made by the individual antennas were digitized and combined into the final high-resolution images. But that was no easy feat. The scientists had to process 1.6 terabytes of data per second, the equivalent of more than 300 DVDs every second.
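The DVD comparison checks out, assuming the standard 4.7 GB single-layer disc capacity (an assumption; the article doesn't specify):

```python
# Rough check of the "300 DVDs per second" comparison.
data_rate_bytes = 1.6e12      # 1.6 terabytes per second
dvd_capacity_bytes = 4.7e9    # single-layer DVD (assumed)

dvds_per_second = data_rate_bytes / dvd_capacity_bytes
print(f"~{dvds_per_second:.0f} DVDs' worth of data per second")  # ~340
```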

“To process such immense data volumes, we have to use supercomputers,” Frits Sweijen of Leiden University in the Netherlands, said in the statement. “These allow us to transform the terabytes of information from these antennas into just a few gigabytes of science-ready data, in only a couple of days.” 

Morabito added that it would take 3,000 observations to image the entire northern sky. The images and the scientific papers they spawned were published in a special edition of the journal Astronomy and Astrophysics on Tuesday (Aug. 17).

Are we ready for another Carrington Level Event?

#Carrington #Level #SolarStorm #CME #space #astronomy

Preparedness is one of those attributes that has been sorely tested in recent times and in many ways found wanting, but there are plenty of bullets out there with our name potentially on them. One of them we have touched on before: solar storms, and the coronal mass ejections (CMEs) that usually follow shortly afterwards. These have the potential to create havoc with our modern technological lifestyle, affecting not only satellites but also power generation, with all the knock-on effects that losing either could bring; one even affected the operations of the US Navy in the Vietnam War. So I thought it would be interesting to look at just how prepared we are, and whether it really would be as bad as the popular media makes out.

Just as the Earth has weather, so does the sun, but on a much, much larger scale. And whereas our weather systems are restricted to the Earth, the sun's weather affects the whole solar system; when the sun sneezes in our direction, we catch a cold.

The Earth is exposed to a continuous stream of energetic charged particles called the solar wind, which travel at up to 3.2 million km/h and flow out into the solar system to well beyond the outer planets.

As these particles are affected by magnetism, some are trapped by the Earth's magnetic field and channelled to the poles, where they interact with the oxygen and nitrogen in the upper atmosphere to create the auroras, the northern and southern lights.

Now, the sun rotates once every 27 days, but different areas of the sun rotate at different speeds, and this causes the sun's magnetic field to twist and contort.

The sun also goes through cycles of activity approximately every 11 years. During the periods of peak activity, the solar maximum, disturbances on the sun's surface called sunspots become much more common, along with more violent disturbances called solar flares.

If a flare is powerful enough, it will often eject huge quantities of the plasma, or charged particles, that make up the sun's outer atmosphere, the corona; these events are called coronal mass ejections, or CMEs.

During a solar flare there is an initial sudden burst of X-rays and ultraviolet light, which reaches Earth in about 8 minutes and interacts with the ionosphere to affect radio communications.

About 30 minutes later, a flood of high-energy electrons and protons travelling at nearly the speed of light hits the Earth's magnetic field and any spacecraft outside its protection. This can cause computer errors and failures of electronic circuits, making satellites glitch or fail, and can expose astronauts to high levels of ionising radiation. These charged particles are drawn into the magnetosphere and channelled to the poles, creating intense auroras that can be seen much farther from the poles than normal.

In the most violent solar flares, huge magnetic loops many times the size of the Earth bulge out from the sun's surface. When these loops break, a billion tons or so of plasma are ejected into space: this is the coronal mass ejection. If the Earth is in the wrong place at the wrong time, the plasma, along with part of the sun's magnetic field, will come barreling through space to hit us between about 14 and 40-plus hours later.

It's the polarity of the CME's magnetic field that determines how much damage it does when it gets to Earth. If it is opposite to the Earth's magnetosphere, the two are drawn together just like two magnets, dumping energy all around the Earth. If they are the same, they repel each other and the damage is much less.

The problem is that CMEs travel at very high speed, and it's only in the last 15 minutes of their journey that we can measure their polarity, which leaves very little time in which to prepare.

When a large CME hits the Earth's magnetic field, it's a bit like a hammer hitting a bell: the magnetic field rings, compressing and stretching, and when magnetic field lines break, the charged particles trapped in them travel back down to the Earth, creating auroras and inducing electrical currents in the Earth's surface and anything running over it, like power lines.

Lines that run north to south, parallel to the Earth's magnetic field, are the most affected; those that run east to west are less so.

This fluctuating magnetic field can induce DC voltages into the high-voltage AC power lines, causing the step-down transformers to saturate, overheat and burn out in a matter of seconds. To help protect the transformers against these geomagnetically induced currents, giant series capacitors that block DC but allow AC to flow are installed. However, series capacitors are very expensive, and while they may protect one power line, the DC can end up rerouted and concentrated in unprotected lines, causing more damage than if no capacitors were used.

Although there are backups, if too many transformers fail, entire grids can shut down. And because there is much more interconnectivity than ever before, with smaller grids, sometimes from different countries, linked together to form super grids, a shutdown in one area can ripple through and cause power outages hundreds to thousands of kilometres away.

CMEs hit the Earth all the time, about two per week on average, but these are small and we barely notice them. It's when a really big one comes along that we are in for a problem. We have now been watching the sun for long enough to know that the largest CME it can produce would be about three times the largest we have seen so far, but these are extremely rare, on the order of one every few thousand years.

The first recorded CME to cause us a problem was the "Carrington Event", a super solar flare seen by the British astronomer Richard Carrington on 1 September 1859. Over the next couple of days there were reports of amazing auroral displays, with the northern lights reaching as far south as Mexico, Cuba and Hawaii, and the southern lights as far north as Queensland, Australia.

The CME reached the Earth 17.6 hours after Carrington saw the initial flare, quicker than the usual day or two. This is because CMEs sometimes come in a series of bursts, with the first usually being smaller but clearing the path ahead, allowing the following ones to arrive faster.
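Assuming the CME covered the full Sun-Earth distance of roughly one astronomical unit in those 17.6 hours (a simplification; the article gives only the travel time), its average speed works out as:

```python
# Back-of-the-envelope average speed for the Carrington CME.
au_km = 149.6e6        # Sun-Earth distance, km (1 AU, assumed)
travel_hours = 17.6    # time from flare sighting to arrival

avg_speed_kmh = au_km / travel_hours
print(f"Average speed: ~{avg_speed_kmh / 1e6:.1f} million km/h")  # ~8.5
print(f"               (~{avg_speed_kmh / 3600:.0f} km/s)")       # ~2361
```

For comparison, that is more than twice the speed of the fastest solar-wind particles quoted earlier.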

As there was very little electrical infrastructure at the time, the telegraph system was the first to show the electrical effects, with sparks jumping from switches, shocking the operators, and some sections of routes even continuing to operate after their battery power was removed.

There have been several superstorms since the Carrington Event, though none as large. The one that stands out is the March 1989 geomagnetic storm, which blacked out large parts of Canada and very nearly blacked out the northeastern United States. It is memorable because it was the first to have a big impact on our modern infrastructure, and it revealed the very real threat that space weather and events like CMEs pose here on Earth.

Since then our power usage has increased, but so has our understanding of how solar storms affect us here on Earth, with much of this data coming from a lucky escape the Earth had in 2012; more on that in a moment.

Geomagnetic storms and CMEs are measured using the DST, or Disturbance Storm Time, index. In fact, it's only since 1957 that we have had proper records of the DST; before then we had to rely on a few magnetometers scattered around the globe.

The DST index measures the ring current around the Earth, which is created by solar protons and electrons trapped by the Earth's magnetosphere. The ring current produces a magnetic field that protects the lower-latitude regions around the equator but also opposes the Earth's magnetic field, so during geomagnetic storms and CMEs, an increase in the number of charged particles trapped here weakens the Earth's geomagnetic field.

The DST is measured in nanoteslas (nT): the lower the negative DST value, the weaker the Earth's magnetic field and the more the Earth is affected by the solar storm.

The typical quiet-time measurement of DST is between plus and minus 20 nT. An intense geomagnetic storm might decrease that to around -300 nT; the Carrington Event is believed to have been between -900 and -1750 nT. The reason for this wide range is the very limited data recorded in 1859, so much of it has to be estimated from contemporary observations of things like auroras.

Although DST is a good measure for recording events, for measuring real-time changes in the magnetic field, which is what electrical grid companies need to know, the Kp-index is used. This uses continuous measurements from 13 measuring stations in the auroral zones around the world. The Kp-index utilises a quasi-logarithmic scale of 0 to 9, where 0 is calm, 5 is a solar storm and 9 is an extreme solar storm.

The map above shows the Kp-index that would be needed to see the aurora overhead at a given location.
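For readers used to news reports of "G1" or "G5" storms: NOAA's G-scale is a common relabelling of the upper end of the Kp-index (this mapping comes from NOAA's space weather scales, not from the article). A minimal sketch:

```python
# Map a Kp reading to NOAA's geomagnetic storm category.
# Kp 0-4 has no storm category; Kp 5-9 map to G1-G5.
def kp_to_g_scale(kp: int) -> str:
    categories = {
        5: "G1 (minor)",
        6: "G2 (moderate)",
        7: "G3 (strong)",
        8: "G4 (severe)",
        9: "G5 (extreme)",
    }
    return categories.get(kp, "below storm level")

print(kp_to_g_scale(5))  # G1 (minor)
print(kp_to_g_scale(9))  # G5 (extreme)
```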

In 2012 we dodged a Carrington-sized bullet when a -1200 nT CME crossed the Earth's orbit. The lucky thing for us was that it was a week late: had it erupted seven days earlier, it would have been a direct hit. It did, however, hit probably the best-equipped satellite for this very issue, the STEREO-A solar observatory.

This is one of two nearly identical satellites designed to image the sun, and in particular events like solar storms and CMEs. The data collected from this event gave us a huge amount of information and greatly increased our knowledge of how to protect our Earth-based systems.

Whilst a large CME will be big enough to completely engulf the earth, where you live can have a major impact on how bad the effects could be.

One of the reasons Canada was affected so much by the 1989 solar storm is the long stretches of power lines it has: the longer the lines, the more electrical energy can be induced in them. But it has now been discovered that the type of rock the lines run over can also make a big difference.

It's not just metal power cables that the magnetic disturbance can induce currents in; it's the ground itself.

In recent years it has been found that the type of rock under where you live can magnify the effect a CME has on things like the power grid by up to 100 times. Igneous and metamorphic rocks have a very high electrical resistance, while sedimentary rocks, which hold water, have a very low electrical resistance and allow induced currents to flow.

Now, whilst it might seem that highly resistive igneous and metamorphic rocks would be a good thing, acting like a giant insulator, the power lines that cross them provide a short circuit through their ground connections, allowing currents to build up and flow through them and damage things like transformers. The northeastern US was also badly hit by the 1989 storm and was on the verge of a shutdown; much of that area is covered by the Appalachian mountain range, which is igneous and metamorphic in its makeup.

In the United Kingdom, the highland area of Scotland is igneous and metamorphic rock, whereas the farther south-east you go in England, the rocks are mostly sedimentary, so Scotland could be affected much more than England.

According to recently declassified US Navy documents, the crew of a US Task Force 77 aircraft saw a group of 20-25 magnetic sea mines, which had been laid by the US off the coast of Vietnam at Hai Phong, detonate over a 30-second period on 4 August 1972. At the time there was no obvious reason why this should have happened: the mines had a self-destruct timer built in, but that was not due for another 30 days or so.

However, the US Navy noticed that an X-class solar flare had been detected earlier that day, and in a record 14.6 hours a CME hit the Earth. Although the DST value was only -125 nT, it's thought that the speed at which it hit the Earth's magnetosphere caused the field to compress in a similar way to a larger storm, and it was this rapid change in the Earth's magnetic field that triggered the magnetic mines.

By mapping the resistance of the rocks and the local magnetic hot spots in the US and other countries, it's possible to work out where large currents could build up, and thus to make provisions in the power grid's connectivity.

In the UK, the National Grid has been replacing high-voltage transformers with newer designs that are more resilient to extra current surges. The strategy in the UK is that if a large CME is expected and its polarity is opposite to the Earth's, they will energise as much of the UK's 8,000 km of power lines as possible to spread the energy over the entire system and drain it back to earth, rather than allowing it to overload a few key areas and cause costly and lengthy repairs.

Places like the US, Canada and even Australia, where very long high-voltage cable runs stretch north to south over varying geologies, are more susceptible. Even with blocking capacitors installed, early warnings from satellite observations will be key to knowing which parts might be affected more than others, and hence which to protect or temporarily shut down to avoid long-term damage.

With our much-increased knowledge of how solar weather affects us here on Earth, and of how the Earth itself reacts, it is much less likely that even a Carrington-class event would have much impact on countries like the UK that have prepared for this kind of situation. In the end, though, it's down to individual countries and their power companies to make sure that when that once-in-a-hundred-years CME comes along, the lights won't go out.

Space collision: Chinese satellite got whacked by hunk of Russian rocket

#Space #collision #Chinese #satellite #Russian #rocket

In March, the U.S. Space Force’s 18th Space Control Squadron (18SPCS) reported the breakup of Yunhai 1-02, a Chinese military satellite that launched in September 2019. It was unclear at the time whether the spacecraft had suffered some sort of failure — an explosion in its propulsion system, perhaps — or if it had collided with something in orbit.

We now know that the latter explanation is correct, thanks to some sleuthing by astrophysicist and satellite tracker Jonathan McDowell, who’s based at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts.
On Saturday (Aug. 14), McDowell spotted an update in the Space-Track.org catalog, which the 18SPCS makes available to registered users. The update included “a note for object 48078, 1996-051Q: ‘Collided with satellite.’ This is a new kind of comment entry — haven’t seen such a comment for any other satellites before,” McDowell tweeted on Saturday.

He dove into the tracking data to learn more. McDowell found that Object 48078 is a small piece of space junk — likely a piece of debris between 4 inches and 20 inches wide (10 to 50 centimeters) — from the Zenit-2 rocket that launched Russia’s Tselina-2 spy satellite in September 1996. Eight pieces of debris originating from that rocket have been tracked over the years, he said, but Object 48078 has just a single set of orbital data, which was collected in March of this year.

“I conclude that they probably only spotted it in the data after it collided with something, and that’s why there’s only one set of orbital data. So the collision probably happened shortly after the epoch of the orbit. What did it hit?” McDowell wrote in another Saturday tweet.

Yunhai 1-02, which broke up on March 18, was “the obvious candidate,” he added — and the data showed that it was indeed the victim. Yunhai 1-02 and Object 48078 passed within 0.6 miles (1 kilometer) of each other — within the margin of error of the tracking system — at 3:41 a.m. EDT (0741 GMT) on March 18, “exactly when 18SPCS reports Yunhai broke up,” McDowell wrote in another tweet.
Thirty-seven debris objects spawned by the smashup have been detected to date, and there are likely others that remain untracked, he added.

Despite the damage, Yunhai 1-02 apparently survived the violent encounter, which occurred at an altitude of 485 miles (780 kilometers). Amateur radio trackers have continued to detect signals from the satellite, McDowell said, though it’s unclear if Yunhai 1-02 can still do the job it was built to perform (whatever that may be).
McDowell described the incident as the first major confirmed orbital collision since February 2009, when the defunct Russian military spacecraft Kosmos-2251 slammed into Iridium 33, an operational communications satellite. That smashup generated a whopping 1,800 pieces of trackable debris by the following October.

However, we may be entering an era of increasingly frequent space collisions — especially smashups like the Yunhai incident, in which a relatively small piece of debris wounds but doesn’t kill a satellite. Humanity keeps launching more and more spacecraft, after all, at an ever-increasing pace.

“Collisions are proportional to the square of the number of things in orbit,” McDowell told Space.com. “That is to say, if you have 10 times as many satellites, you’re going to get 100 times as many collisions. So, as the traffic density goes up, collisions are going to go from being a minor constituent of the space junk problem to being the major constituent. That’s just math.”
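McDowell's square law is just pair counting: with n objects in orbit there are n(n-1)/2 possible colliding pairs, which grows roughly as n² for large n. A quick sketch:

```python
# Number of possible collision pairs among n orbiting objects.
def collision_pairs(n: int) -> int:
    return n * (n - 1) // 2

for n in (1_000, 10_000):
    print(f"{n:>6} objects -> {collision_pairs(n):>12,} possible pairs")

# Ten times the objects yields ~a hundred times the pairs.
print(f"ratio: {collision_pairs(10_000) / collision_pairs(1_000):.1f}x")
```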

We may reach that point in just a few years, he added.

The nightmare scenario that satellite operators and exploration advocates want to avoid is the Kessler syndrome — a cascading series of collisions that could clutter Earth orbit with so much debris that our use of, and travel through, the final frontier is significantly hampered.
Our current space junk problem is not that severe, but the Yunhai event could be a warning sign of sorts. It’s possible, McDowell said, that Object 48078 was knocked off the Zenit-2 rocket by a collision, so the March smashup may be part of a cascade.

“That’s all very worrying and is an additional reason why you want to remove these big objects from orbit,” McDowell told Space.com. “They can generate this other debris that’s smaller.”

Small debris is tough to track, and there’s already a lot of it up there. About 900,000 objects between 0.4 inches and 4 inches wide (1 to 10 cm) are whizzing around our planet, the European Space Agency estimates. And Earth orbit hosts 128 million pieces of junk 0.04 inches to 0.4 inches (1 mm to 1 cm) in diameter, according to ESA.

Orbiting objects move so fast — about 17,150 mph (27,600 kph) at the altitude of the International Space Station, for example — that even tiny shards of debris can do serious damage to a satellite.
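That ISS figure follows from the circular-orbit formula v = sqrt(GM/r); the ~420 km altitude below is an assumption for illustration, not a value from the article.

```python
import math

# Circular orbital speed v = sqrt(GM / r) at roughly ISS altitude.
GM_earth = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
r_earth = 6.371e6           # mean Earth radius, m
altitude = 420e3            # approximate ISS altitude, m (assumed)

v = math.sqrt(GM_earth / (r_earth + altitude))
print(f"~{v:.0f} m/s  (~{v * 3.6:.0f} km/h, ~{v * 3.6 / 1.609:.0f} mph)")
```

The result is close to the article's quoted 27,600 kph / 17,150 mph.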

Rare Natural Event in Alaska Sees 3 Volcanoes Erupting at The Same Time

#Alaska #Volcanoes #Eruption #disaster #Aleutian

Three volcanoes in the Alaskan chain of Aleutian islands are currently erupting, and two others are rumbling with disquiet.

According to a report by NBC, it’s been at least seven years since three Aleutian volcanoes erupted simultaneously. This increased volcanic activity, at this point, is not causing any disruptions, but it is an interesting situation; since volcanoes can be unpredictable, scientists are keeping a careful watch.

The Great Sitkin volcano, Mount Pavlof, and the Semisopochnoi volcano are all at an orange volcano alert level as of Sunday 15 August, according to the Alaska Volcano Observatory.

This means that eruptions are currently underway, but they’re relatively small, rumbly ones with minimal ash.

Only minute amounts of ash have been detected at Mount Pavlof and Semisopochnoi, and none from Great Sitkin. However, lava is flowing from Great Sitkin, and large seismic tremors and several explosions have been detected at Semisopochnoi.

In addition, Mount Cleveland and the volcanic complex on Atka have been showing signs of activity – increased heat under Mount Cleveland, and small earthquakes under Atka. Both are at a yellow volcano alert level.

Although such simultaneous volcanic activity in the Aleutians is uncommon, it’s not unheard of. The Aleutian Arc is a chain of volcanoes spread along the subduction boundary between two tectonic plates – the Pacific Plate pushing beneath the North American Plate. The chain stretches from the Alaskan Peninsula to the Kamchatka Peninsula in Russia.

Often when volcanoes erupt, other volcanoes in close proximity can be roused, but it's not always clear why. The Aleutian Arc is home to a different kind of mystery.

In 1996, volcanic and seismic activity was spread across 870 kilometers (540 miles) of the arc, which scientists concluded had to be more than coincidental, although the trigger is unknown.

In this case, it's not entirely clear what's going on either. Nearly 290 kilometers (180 miles) separate the two outermost volcanoes in this spate of activity, Great Sitkin and Semisopochnoi.

Last year, researchers found that a collection of volcanoes along the Aleutian Arc may be part of a larger supervolcano, but only one of the currently rumbly beasts, Mount Cleveland, is among the specified group.

Although there’s nothing to worry about at this point, the event could turn out to be very scientifically interesting.

Geologists and volcanologists will no doubt be monitoring the situation to see if they can find a link to earlier outbreaks of simultaneous activity, and to try to learn more about this mysterious arc of volcanoes.

Is Ganymede – Not Mars Or Europa – The Best Place To Look For Alien Life?

This could be a trend for icy bodies throughout the solar system and beyond.

#ganymede #space #alien #life #jupiter

The Jupiter moon Ganymede, the largest satellite in the solar system, as seen by NASA’s Voyager 2 spacecraft on July 7, 1979, from a distance of 745,000 miles (1.2 million kilometers). (Image credit: NASA)

In the wisp-thin sky of Jupiter’s moon Ganymede, the largest satellite in the solar system, astronomers have for the first time detected evidence of water vapor, a new study finds.

The discovery could shed light on similar watery atmospheres that may envelop other icy bodies in the solar system and beyond, researchers said.

Previous research suggested that Ganymede — which is larger than Mercury and Pluto, and only slightly smaller than Mars — may contain more water than all of Earth’s oceans combined. However, the Jovian moon is so cold that water on its surface is frozen solid. Any liquid water Ganymede possesses would lurk about 100 miles (160 kilometers) below its crust.

Prior work suggested that ice on Ganymede’s surface could turn from a solid directly to a gas, skipping a liquid form, so that water vapor could form part of the giant moon’s thin atmosphere. However, evidence of this water has proved elusive — until now.

In the new study, researchers analyzed old and new data of Ganymede from NASA’s Hubble Space Telescope. In 1998, Hubble captured the first ultraviolet images of Ganymede, including pictures of its auroras, the giant moon’s versions of Earth’s northern and southern lights. Colorful ribbons of electrified gas within these auroras helped provide evidence that Ganymede has a weak magnetic field.

Ultraviolet signals detected in these auroral bands suggested the presence of oxygen molecules, each made of two oxygen atoms, which are produced when charged particles erode Ganymede’s icy surface. However, some of these ultraviolet emissions did not match what one would expect from an atmosphere of pure molecular oxygen. Previous research suggested these discrepancies were linked to signals from atomic oxygen — that is, single atoms of oxygen.

As part of a large observing program to support NASA’s Juno mission to Jupiter, researchers sought to measure the amount of atomic oxygen in Ganymede’s atmosphere using Hubble. Unexpectedly, they discovered there is hardly any atomic oxygen there, suggesting there must be another explanation for the earlier ultraviolet signals.

The scientists focused on how the surface temperature of Ganymede varies strongly throughout the day, with highs of about minus 190 degrees Fahrenheit (minus 123 degrees Celsius) at noon at the equator and lows of about minus 315 degrees Fahrenheit (minus 193 degrees Celsius) at night. At the hottest spots on Ganymede, ice may become warm enough to convert directly into vapor. They noted that differences seen between a number of ultraviolet images of Ganymede closely match where one would expect water in the moon's atmosphere based on its climate.
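A quick conversion check of those two temperature extremes confirms the paired Fahrenheit and Celsius figures:

```python
# Fahrenheit-to-Celsius conversion for Ganymede's quoted extremes.
def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(f"noon:  {f_to_c(-190):.0f} degrees C")   # ~ -123
print(f"night: {f_to_c(-315):.0f} degrees C")   # ~ -193
```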

“Water vapor in the atmosphere matches the data very well,” study lead author Lorenz Roth, a planetary scientist at the KTH Royal Institute of Technology in Stockholm, told Space.com.

The main reason previous research failed to detect water in Ganymede's atmosphere is that the ultraviolet signal from molecular oxygen is very strong. "Within this stronger oxygen signal, it's hard to find other signals," Roth said.

“These findings suggest that water vapor actually exists in the atmospheres of icy bodies in the outer solar system,” Roth said. “Now we might see it more places.”

The scientists detailed their findings online Monday (July 26) in the journal Nature Astronomy.

Asteroid Bennu Earth impact probability increases

#OSIRIS-Rex #Bennu #Earthimpact #space #astronomy #asteroid

Asteroid Bennu is one of the two most hazardous known asteroids in our Solar System. Luckily, the OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer) spacecraft orbited Bennu for more than two years and gathered data that has allowed scientists to better understand the asteroid’s future orbit, trajectory and Earth-impact probability, and even rule out some future impact possibilities.

In the most precise calculations of an asteroid's trajectory ever made, researchers determined that Bennu's total impact probability through the year 2300 is very small: about 1 in 1,750 (or 0.057%). The team's paper says the asteroid will make a close approach to Earth in 2135, during which it will pose no danger. But Earth's gravity will alter the asteroid's path, and the team identifies Sept. 24, 2182 as the most significant single date in terms of a potential impact, with an impact probability of 1 in 2,700 (or about 0.037%).
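The percentages quoted are just the odds rewritten, and they also give the complementary "no impact" figure Farnocchia cites below:

```python
# The quoted odds expressed as percentages.
p_total = 1 / 1750   # cumulative impact probability through 2300
p_2182 = 1 / 2700    # probability for the Sept. 24, 2182 approach

print(f"through 2300: {p_total:.3%}")      # 0.057%
print(f"2182 alone:   {p_2182:.3%}")       # 0.037%
print(f"no impact:    {1 - p_total:.2%}")  # 99.94%
```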

“The impact probability went up just a little bit, but it’s not a significant change,” said Davide Farnocchia, lead author of the paper, and scientist at the Center for Near Earth Object Studies at NASA’s Jet Propulsion Laboratory, speaking at a press briefing this week. Farnocchia added that means there is a 99.94% probability that Bennu is not on an impact trajectory.

“So, there is no particular reason for concern,” he said. “We have time to keep tracking the asteroid and eventually come to a final answer.”

101955 Bennu was discovered in 1999 by the Lincoln Near-Earth Asteroid Research Team. Since its discovery, Bennu has been extensively tracked with 580 ground-based optical astrometric observations. The asteroid made three relatively close passes of Earth in 1999, 2005, and 2011, during which the Arecibo and Goldstone radar stations collected a wealth of data about Bennu’s motion.

OSIRIS-REx discovered particles being ejected from asteroid Bennu shortly after arriving at the asteroid. Image Credit: NASA/Goddard/University of Arizona/Lockheed Martin

But OSIRIS-REx’s two-year reconnaissance and sample collection has provided crucial data about the 500-meter-wide asteroid, including some surprises. Scientists expected Bennu’s surface to be smooth and sandy, but the first images from OSIRIS-REx revealed a rugged boulder-field, littered with large rocks and loose gravel. The team also expected the asteroid to be geologically quiet, but just six days after arriving in orbit, the spacecraft observed the asteroid ejecting bits of rock, due to rocks on the asteroid cracking because of the day-night heat cycle. We also learned that Bennu has pieces of Vesta on it. The spacecraft also scooped up a sample of rock and dust from the asteroid’s surface in October of 2020, which it will deliver to Earth on Sept. 24, 2023, for further scientific investigation.

But OSIRIS-REx’s precise observations of Bennu’s motions and trajectory allowed for the best estimate yet of the asteroid’s future path.

“The OSIRIS-REx mission has provided exquisitely precise data on Bennu’s position and motion through space to a level never captured before on any asteroid,” said Lindley Johnson, planetary defense officer at NASA’s Planetary Defense Coordination Office.

The researchers took into account all kinds of small influences, including the tiny gravitational pull of more than 300 other asteroids, and the drag caused by interplanetary dust. They even checked to see if OSIRIS-REx pushed the asteroid off course when the spacecraft briefly touched its rocky surface with its Touch-And-Go (TAG) sample collection maneuver. But that event had a negligible effect, as expected.

The researchers especially focused on a phenomenon called the Yarkovsky effect, in which an object in space is nudged in its orbit by the slight push created when it absorbs sunlight and then re-emits that energy as heat. Over short timeframes this thrust is minuscule, but over long periods the effect on the asteroid's position builds up and can play a significant role in changing an asteroid's path.

“The Yarkovsky effect will act on all asteroids of all sizes, and while it has been measured for a small fraction of the asteroid population from afar, OSIRIS-REx gave us the first opportunity to measure it in detail as Bennu travelled around the Sun,” said Steve Chesley, senior research scientist at JPL and study co-investigator, in a press release. “The effect on Bennu is equivalent to the weight of three grapes constantly acting on the asteroid – tiny, yes, but significant when determining Bennu’s future impact chances over the decades and centuries to come.”
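Chesley’s “three grapes” figure can be turned into a rough order-of-magnitude sketch. The grape mass and Bennu’s bulk mass below are assumptions (typical published values, not figures from this article), and the “drift” is a deliberately naive constant-push estimate; in real orbital mechanics an along-track push slowly changes the semi-major axis rather than displacing the asteroid in a straight line:

```python
# Order-of-magnitude sketch of the Yarkovsky acceleration on Bennu.
# Assumptions: a grape weighs ~5 g, and Bennu's mass is ~7.3e10 kg
# (its commonly published estimate) -- neither figure is from the article.
g0 = 9.81                      # m/s^2, Earth surface gravity
force = 3 * 0.005 * g0         # weight of three grapes, ~0.15 N
bennu_mass = 7.33e10           # kg
accel = force / bennu_mass     # m/s^2 -- on the order of 1e-12

# Naive cumulative displacement if this tiny push acted in one fixed
# direction for a century (real perturbations accumulate differently,
# but this shows why "tiny" still matters over long timescales).
seconds_per_century = 100 * 365.25 * 86400
drift = 0.5 * accel * seconds_per_century**2
print(f"acceleration ~ {accel:.1e} m/s^2, naive century drift ~ {drift/1e3:.0f} km")
```

Even this crude estimate makes the point: a force you could balance on a fingertip, applied for a century, moves a mountain-sized body by thousands of kilometers.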

A diagram showing OSIRIS-REx’s sampling maneuver on October 20th, 2020. Image Credit: NASA/GSFC/UA

The researchers were also able to better determine how the asteroid’s orbit will evolve over time and whether it will pass through a “gravitational keyhole” during its 2135 close approach to Earth. These keyholes are regions of space where, due to the effect of Earth’s gravitational pull, a pass at the wrong time would set Bennu on a path toward a future impact with Earth.

The team wrote in their paper that “compared to the information available before the OSIRIS-REx mission, the knowledge of the circumstances of the scattering Earth encounter that will occur in 2135 improves by a factor of 20, thus allowing us to rule out many previously possible impact trajectories.”

“The orbital data from this mission helped us better appreciate Bennu’s impact chances over the next couple of centuries and our overall understanding of potentially hazardous asteroids – an incredible result,” said Dante Lauretta, OSIRIS-REx principal investigator and professor at the University of Arizona. “The spacecraft is now returning home, carrying a precious sample from this fascinating ancient object that will help us better understand not only the history of the solar system but also the role of sunlight in altering Bennu’s orbit since we will measure the asteroid’s thermal properties at unprecedented scales in laboratories on Earth.”

Manned version of X-37 space plane in the works?

X-37B on runway at Vandenberg AFB (Image: USAF)



#X37B #X37C #NASA #Space #Spaceplane #Shuttle #Boeing

When the Space Shuttle Atlantis touched down for the final time on July 21, 2011, it looked as if the notion of a manned spacecraft capable of going into orbit and then landing like a conventional airplane had been abandoned. The U.S. government appeared to favor a return to Apollo-style space capsules, with anything like the Shuttle relegated to the private sector. But at the American Institute of Aeronautics and Astronautics’ (AIAA) recent Space 2011 conference, Arthur Grantz, chief engineer of Space and Intelligence Systems’ Experimental Systems Group at Boeing, delivered a paper indicating that the U.S. Air Force and Boeing are already on the way toward developing a manned Shuttle replacement based on the X-37B robot space plane.


The X-37B is one of the US Air Force’s most highly visible yet most secret projects of recent years. A robot spacecraft that looks like a miniature Space Shuttle without a flight deck was bound to attract public attention, but its mission has remained hidden behind the blanket word “classified.” The government has released some information about the X-37B. In part, it’s an experimental test bed based on the Boeing X-40 lifting body. With an overall length of a little under 30 ft (9 m) and a wingspan of just under 15 ft (4.5 m), it’s small enough to fit easily into the Shuttle’s cargo bay, but it’s still capable of acting like a robot version of the larger, older spacecraft. Launched atop an Atlas booster, it can carry payloads into space, return them to Earth and then land like a conventional aircraft. The difference is that it doesn’t require a pilot or ground control, because it can land by itself.

It also has much more endurance than the old Shuttle. While the Shuttle never remained in orbit for as long as three weeks, the X-37B has already broken the record for a reusable spacecraft in orbit: 244 days. It’s rated to remain on station for 270 days if needed.

Shrouded in Secrecy

But for all that the public knows about the X-37B’s capabilities, its mission remains a secret. Part of the X-37B’s purpose is experimenting with new technologies, and it’s clear that the Air Force wants its own way of getting into and back from space. What it intends to do once in orbit is another question. One thing the X-37 is designed to do is release satellites that it can rendezvous with at a later date and retrieve. On today’s cyber-heavy battlefields, that is a considerable advantage. Beyond that, the space-plane configuration echoes the Air Force’s Dyna-Soar program of the early 1960s, which was also a space plane (in that case manned) intended to be launched atop a booster rocket. Its intended roles were hypersonic reconnaissance platform and bomber, roles that a variant of the X-37 could also fulfill.


The paper that Grantz delivered to the AIAA, “X-37B Orbital Test Vehicle and Derivatives” [PDF summary], provides further insights into the X-37, including the possibility of a manned version. Grantz says that the X-37B’s successor, the larger X-37C, could be used as a cargo ship for the International Space Station. There are already several spacecraft in service and under development that could do that, but Grantz says the X-37C could also be easily modified to carry up to six passengers. Unlike other cargo carriers slated to become manned spacecraft, this doesn’t necessarily require major design changes. One proposed version of the X-37 shows the interior significantly altered, with the fuel tanks and operating systems pushed aft to make way for a traditional flight deck in the bow; the alternative is simply to place a self-contained pressurized cylinder in the cargo bay and install television cameras fore and aft so the crew can see what’s going on.

X-37B size comparison (Image: USAF)

This “plug and play” feature speaks volumes about the X-37 and the X-37C in particular. Human beings are fragile creatures and space engineers have to bear in mind that the human body can only tolerate a narrow range of variables. Too heavy acceleration on takeoff, too sharp a turn, too much vibration or too hard a landing can injure or kill a person. The X-37 is what is called a “1.5 g” spacecraft. In other words, it operates only within an acceleration range of one and a half times the pull of Earth’s gravity. This means that it can carry delicate instruments into space and return them safely to the ground. It also means that it operates safely within the range of what is called “man rated” or “human rated” flight.

Being human rated also implies that the craft must have a failure rate of less than one percent, whereas unmanned spacecraft are allowed a failure rate of ten percent. Moreover, the ability of the X-37 to launch and retrieve satellites as well as to land autonomously suggests a navigation and guidance system sophisticated enough for manned flight, one that can be adapted for the manual-control option a human rating requires.

The future of the X-37 program is not certain, but the fact that a new manned spacecraft can result from modifying existing technology rather than starting from scratch shows that the grounding of the Shuttle fleet wasn’t just the end of an era; it was the start of a new one.

Why won’t Starship share the fate of the Space Shuttles?

#Starship #SpaceShuttle #NASA #SpaceX #Heatshield

Many of you probably know of the Space Shuttle Columbia disaster, in which seven NASA astronauts lost their lives while performing the re-entry maneuver after an otherwise successful orbital mission. The disaster was caused by damage to the heat shield from a piece of foam that detached from the external fuel tank during Columbia’s launch from Kennedy Space Center, about two weeks before the crash.


Heat shields were a pain point for the Space Shuttle from the beginning. It was the thermal protection system that limited the shuttles to a maximum of four flights a year and drove their launch cost to over 1.5 billion dollars.

So what about these shields?

The shuttle was built mostly of aluminum, which melts at relatively low temperatures, so every square millimeter of the leading side had to be protected to prevent disaster. Unfortunately, as a winged vehicle that relies on aerodynamic lift, the shuttle had very complicated geometry: so complicated that it needed hundreds, if not thousands, of different types of TPS (thermal protection system) tiles. Additionally, installing a single tile took approximately 40 hours.
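The 40-hour figure above implies a staggering amount of labor. As a back-of-envelope sketch, assume the orbiter carried roughly 24,000 TPS tiles (a commonly cited figure, not stated in this article):

```python
# Back-of-envelope labor implied by the installation time quoted above.
# Assumption (not from the article): the orbiter carried roughly 24,000
# TPS tiles, a commonly cited figure for the Shuttle.
tiles = 24_000
hours_per_tile = 40              # installation time quoted in the text
total_hours = tiles * hours_per_tile
work_years = total_hours / 2000  # ~2000 working hours per person-year
print(f"{total_hours:,} labor-hours ~ {work_years:,.0f} person-years")
```

Even if the real per-tile average were far lower, the arithmetic makes clear why thermal protection dominated the Shuttle’s turnaround time and cost.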

If that wasn’t enough, mounting the tiles on the shuttle’s structure was very complicated. One reason was aluminum’s high expansion when heated: there was a real possibility that a tile would simply pop off. The way the space shuttle and rocket were configured for launch also left much to be desired.

The shuttle flew into space attached to a large foam-insulated fuel tank, with its heat shield facing the tank. As a result, a piece of ice or foam detaching from the main tank at supersonic speeds was enough to literally tear a tile out of its fastening.

NASA never fixed the problem before the shuttle program ended in 2011, even though it cost the lives of seven people and endangered virtually all of the astronauts who flew the vehicle. The preventative measure turned out to be a heat-shield inspection by… astronauts on the ISS! The shuttle rotated 180 degrees so the station crew could review the condition of its tiles before it returned to Earth.

Okay, we know what went wrong. So how is SpaceX going to apply the lessons of the Space Shuttle program and create the truly reusable vehicle that NASA so desperately wanted?

The whole thing can be divided into several subsections:

– One: Ship has less complex geometry. Except for the four flaps, it’s essentially a plain cylinder. It doesn’t need to generate sustained lift, only to slow down for landing, so it doesn’t need complicated wings, a tail, stabilizers, or a contoured nose.

– Two: In the event that a tile falls off, the Ship has a good chance of surviving re-entry into the atmosphere and returning safely to Earth. That’s because the SpaceX vehicle is made of stainless steel instead of aluminum, so without heat shields, it can withstand more than twice the temperature.

– Three: Steel’s lower thermal expansion relative to aluminum. This keeps changes in mechanical stress small, making the tiles less likely to detach from the vehicle.

– Four: Ship’s simple tile mounting system. The heat-insulating mat allows some of the heat to be absorbed and the clip system means that each tile should take literally minutes to install. Additionally, the tile mounts are designed in such a way that each tile has some play, which should prevent the tiles from cracking during the temperature swings they experience during flight.

– Five: Steel has another interesting property during re-entry: while the tile-covered windward side absorbs most of the heat, hot plasma flows around Ship’s sides and heats the leeward side. On the Space Shuttles, this side was painted white to reflect heat radiation into space as effectively as possible. Ship needs no such treatment, as it just so happens that stainless steel itself is an almost perfect reflector of thermal radiation!
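The material argument behind points Two and Three can be made concrete with rough, textbook-level property values. The specific alloys and numbers below are illustrative assumptions (2024 aluminum and 304 stainless are plausible stand-ins, not figures confirmed by SpaceX or NASA):

```python
# Rough material comparison behind points Two and Three above.
# Values are textbook approximations, not SpaceX or NASA figures.
materials = {
    #                               melting point (C), thermal expansion (1/K)
    "aluminum (2024-type alloy)":   (640, 23e-6),
    "stainless steel (304-type)":   (1450, 17e-6),
}
al_melt, al_cte = materials["aluminum (2024-type alloy)"]
st_melt, st_cte = materials["stainless steel (304-type)"]

# Steel tolerates roughly twice the temperature before melting...
print(f"melting-point ratio: {st_melt / al_melt:.1f}x")

# ...and grows less when heated: stretch of a 1 m panel over a 500 K swing.
print(f"aluminum growth: {al_cte * 500 * 1000:.1f} mm, "
      f"steel growth: {st_cte * 500 * 1000:.1f} mm")
```

The smaller thermal stretch is what makes a rigid clip-mounted tile system viable: the skin underneath the tiles simply moves less as it heats and cools.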

Considering the above changes, I’m sure SpaceX has learned from the failure of the STS program.