This week, we’re attending the International Consumer Electronics Show (CES), where we’re joining industry pioneers and business leaders from across the globe to showcase our space technology. Since 1967, CES has been the place to be for next-generation innovations to get their marketplace debut.
Our technologies are driving exploration and enabling the agency’s bold new missions to extend the human presence beyond the moon, to an asteroid, to Mars and beyond. Here’s a look at five technologies we’re showing off at #CES2017:
Our Integrated Display and Environmental Awareness System (IDEAS) is an interactive optical computer that works with smart glasses. The idea behind IDEAS is to enhance real-time operations by providing augmented reality data to field engineers here on Earth and in space.
This device would allow users to see and modify critical information on a transparent, interactive display without taking their eyes or hands off the work in front of them.
This wearable technology could dramatically improve the user’s situational awareness, thus improving safety and efficiency.
For example, an astronaut could see health data, oxygen levels or even environmental emergencies like “invisible” ethanol fires right on their helmet view pane.
And while the IDEAS prototype is an innovative solution to the challenges of in-space missions, it won’t just benefit astronauts—this technology can be applied to countless fields here on Earth.
Engineers at our Ames Research Center are developing robots to work as teammates with humans.
They created a user interface called the Visual Environment for Remote Virtual Exploration (VERVE) that allows researchers to see from a robot’s perspective.
Using VERVE, astronauts on the International Space Station remotely operated the K10 rover—designed to act as a scout during NASA missions to survey terrain and collect science data to help human explorers.
This week, Nissan announced that a modified version of our VERVE software is being used in its Seamless Autonomous Mobility (SAM) platform, which supports the integration of autonomous vehicles into society. For more on this partnership: https://www.nasa.gov/ames/nisv-podcast-Terry-Fong
Did you know that we are leveraging technology from virtual and augmented reality apps to help scientists study Mars and to help astronauts in space?
The Ops Lab at our Jet Propulsion Laboratory is at the forefront of deploying these groundbreaking applications to multiple missions.
One project we’re demonstrating at CES is how our OnSight tool—a mixed reality application developed for the Microsoft HoloLens—enables scientists to “work on Mars” together from their offices.
Supported by the Mars 2020 and Curiosity missions, it is currently in use by a pilot group of scientists for rover operations. Another HoloLens project is being used aboard the International Space Station to empower the crew with assistance when and where they need it.
At CES, we’re also using the Oculus Rift virtual reality platform to provide a tour of our Space Launch System (SLS) from the launch pad at our Kennedy Space Center. SLS will be the world’s most powerful rocket and will launch astronauts in the Orion spacecraft on missions to an asteroid and eventually to Mars. Engineers continue to make progress toward delivering the first SLS rocket to Kennedy in 2018.
The Pop-Up Flat Folding Explorer Robot, PUFFER, is an origami-inspired robotic technology prototype that folds into the size of a smartphone.
It is a low-volume, low-cost enhancement whose compact design means that many little robots could be packed into a larger “parent” spacecraft and deployed on a planet’s surface to increase surface mobility. It’s like a Mars rover Mini-Me!
Our Remote Operated Vehicle for Education, or ROV-E, is a six-wheeled rover modeled after our Curiosity and the future Mars 2020 Rover.
It uses off-the-shelf, easily programmable computers and 3D-printed parts. ROV-E has four modes, ranging from user-controlled driving to sensor-based hazard avoidance and a “follow me” mode. ROV-E can answer questions about Mars and follow voice commands.
ROV-E was developed by a team of interns and young, up-and-coming professionals at NASA’s Jet Propulsion Laboratory who wanted to build a Mars rover from scratch to help introduce students and the public to Science, Technology, Engineering & Mathematics (STEM) careers, planetary science and our Journey to Mars.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
We're on the verge of launching a new spacecraft to the Sun to take the first-ever images of the Sun's north and south poles!
Credit: ESA/ATG medialab
Solar Orbiter is a collaboration between the European Space Agency (ESA) and NASA. After it launches — as soon as Feb. 9 — it will use Earth's and Venus's gravity to swing itself out of the ecliptic plane — the swath of space, roughly aligned with the Sun’s equator, where all the planets orbit. From there, Solar Orbiter's bird’s eye view will give it the first-ever look at the Sun's poles.
Credit: ESA/ATG medialab
The Sun plays a central role in shaping space around us. Its massive magnetic field stretches far beyond Pluto, paving a superhighway for charged solar particles known as the solar wind. When bursts of solar wind hit Earth, they can spark space weather storms that interfere with our GPS and communications satellites — at their worst, they can even threaten astronauts.
To prepare for potential solar storms, scientists monitor the Sun’s magnetic field. But from our perspective near Earth and from other satellites roughly aligned with Earth's orbit, we can only see a sidelong view of the Sun's poles. It’s a bit like trying to study Mount Everest’s summit from the base of the mountain.
Solar Orbiter will study the Sun's magnetic field at the poles using a combination of in situ instruments — which study the environment right around the spacecraft — and cameras that look at the Sun, its atmosphere and outflowing material in different types of light. Scientists hope this new view will help us understand not only the Sun's day-to-day activity, but also its roughly 11-year activity cycles, thought to be tied to large-scale changes in the Sun's magnetic field.
Solar Orbiter will fly within the orbit of Mercury — closer to our star than any Sun-facing cameras have ever gone — so the spacecraft relies on cutting-edge technology to beat the heat.
Credit: ESA/ATG medialab
Solar Orbiter has a custom-designed titanium heat shield with a calcium phosphate coating that withstands temperatures of more than 900 degrees Fahrenheit — 13 times the solar heating that spacecraft face in Earth orbit. Five of the cameras look at the Sun through peepholes in that heat shield; one observes the solar wind out the side.
Over the mission’s seven-year lifetime, Solar Orbiter will reach an inclination of 24 degrees above the Sun’s equator, increasing to 33 degrees with an additional three years of extended mission operations. At closest approach the spacecraft will pass within 26 million miles of the Sun.
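That “13 times” figure follows from a quick back-of-the-envelope estimate (standard inverse-square arithmetic, not a number pulled from mission documents): sunlight intensity falls off with the square of distance from the Sun, so comparing Solar Orbiter's 26-million-mile closest approach with Earth's roughly 93-million-mile distance gives

$$\frac{F_\text{closest approach}}{F_\text{Earth}} \approx \left(\frac{93 \times 10^6\ \text{miles}}{26 \times 10^6\ \text{miles}}\right)^2 \approx 13.$$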
Solar Orbiter will be our second major mission to the inner solar system in recent years, following on August 2018’s launch of Parker Solar Probe. Parker has completed four close solar passes and will fly within 4 million miles of the Sun at closest approach.
Solar Orbiter (green) and Parker Solar Probe (blue) will study the Sun in tandem.
The two spacecraft will work together: As Parker samples solar particles up close, Solar Orbiter will capture imagery from farther away, contextualizing the observations. The two spacecraft will also occasionally align to measure the same magnetic field lines or streams of solar wind at different times.
The booster of a United Launch Alliance Atlas V rocket that will launch the Solar Orbiter spacecraft is lifted into the vertical position at the Vertical Integration Facility near Space Launch Complex 41 at Cape Canaveral Air Force Station in Florida on Jan. 6, 2020. Credit: NASA/Ben Smegelsky
Solar Orbiter is scheduled to launch on Feb. 9, 2020, during a two-hour window that opens at 11:03 p.m. EST. The spacecraft will launch on a United Launch Alliance Atlas V 411 rocket from Space Launch Complex 41 at Cape Canaveral Air Force Station in Florida.
Launch coverage begins at 10:30 p.m. EST on Feb. 9 at nasa.gov/live. Stay up to date with the mission at nasa.gov/solarorbiter!
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
While it’s familiar to us, our solar system may actually be a bit of an oddball. Our Milky Way galaxy is home to gigantic worlds with teeny-tiny orbits and planets that circle pairs of stars. We’ve even found planets that don’t orbit stars at all! Instead, they drift through the galaxy completely alone (unless they have a moon to keep them company). These lonely island worlds are called rogue planets.
The planet-building process can be pretty messy. Dust and gas around a star clump together to form larger and larger objects, like using a piece of play-dough to pick up other pieces.
Sometimes collisions and close encounters can fling a planet clear out of the gravitational grip of its parent star. Rogue planets may also form out in space on their own, like the way stars grow.
We’ve discovered more than 4,000 exoplanets, but only a handful are rogue planets. That’s because they’re superhard to find! Rogue planets are almost completely invisible to us because they don’t shine like stars and space is inky black. It’s like looking for a black cat in a dark room without a flashlight.
Some planet-finding methods involve watching to see how orbiting planets affect their host star, but that doesn’t work for rogue planets because they’re off by themselves. Rogue planets are usually pretty cold too, so infrared telescopes can’t use their heat vision to spot them either.
So how can we find them? Astronomers use a cool cosmic quirk to detect them by their effect on starlight. When a rogue planet lines up with a more distant star from our vantage point, the planet bends and magnifies light from the star. This phenomenon, called microlensing, looks something like this:
Imagine you have a trampoline, a golf ball, and an invisible bowling ball. If you put the bowling ball on the trampoline, you could see how it made a dent in the fabric even if you couldn’t see the ball directly. And if you rolled the golf ball near it, it would change the golf ball’s path.
A rogue planet affects space the way the bowling ball warps the trampoline. When light from a distant star passes by a rogue planet, it curves around the invisible world (like how it curves around the star in the animation above). If astronomers on Earth were watching the star, they’d notice it briefly brighten. The shape and duration of this brightness spike lets them know a planet is there, even though they can’t see it.
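For the curious, that brightness spike follows a simple textbook formula (the standard point-lens magnification, included here for context rather than taken from the mission):

$$A(u) = \frac{u^2 + 2}{u\sqrt{u^2 + 4}},$$

where $u$ is the apparent separation between the background star and the rogue planet on the sky, measured in units of the planet's Einstein radius. The closer the alignment (the smaller $u$), the bigger the magnification, which is exactly the brief brightening astronomers look for.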
Telescopes on the ground have to look through Earth’s turbulent atmosphere to search for rogue planets. But when our Nancy Grace Roman Space Telescope launches in the mid-2020s, it will give us a much better view of distant stars and rogue planets because it will be located way above Earth’s atmosphere — even higher than the Moon!
Other space telescopes would have to be really lucky to spot these one-in-a-million microlensing signals. But Roman will watch huge patches of the sky for months to catch these fleeting events.
Scientists have come up with different models to explain how different planetary systems form and change over time, but we still don’t know which ones are right. The models make different predictions about rogue planets, so studying these isolated worlds can help us figure out which models work best.
When Roman spots little microlensing starlight blips, astronomers will be able to get a pretty good idea of the mass of the object that caused the signal from how long the blip lasts. Scientists expect the mission to detect hundreds of rogue planets that are as small as rocky Mars — about half the size of Earth — up to ones as big as gas giants, like Jupiter and Saturn.
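The reason duration reveals mass comes from standard microlensing relations (again, textbook physics rather than mission-specific details): each event lasts roughly the Einstein radius crossing time,

$$t_E = \frac{\theta_E}{\mu_\text{rel}}, \qquad \theta_E = \sqrt{\frac{4GM}{c^2}\,\frac{D_S - D_L}{D_L D_S}},$$

where $M$ is the lens mass, $\mu_\text{rel}$ is the relative proper motion, and $D_L$ and $D_S$ are the distances to the lens and the source star. Because $t_E \propto \sqrt{M}$, a Jupiter-mass rogue planet typically brightens a background star for a day or two, while a Mars-mass one produces a blip lasting only a few hours.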
By design, Roman is only going to search a small slice of the Milky Way for rogue planets. Scientists have come up with clever ways to use Roman’s future data to estimate how many rogue planets there are in the whole galaxy. This information will help us better understand whether our solar system is pretty normal or a bit of an oddball compared to the rest of our galaxy.
Roman will have such a wide field of view that it will be like going from looking at the cosmos through a peephole to looking through a floor-to-ceiling window. The mission will help us learn about all kinds of other cool things in addition to rogue planets, like dark energy and dark matter, that will help us understand much more about our place in space.
Learn more about the Roman Space Telescope at: https://roman.gsfc.nasa.gov/
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
This year marks the 50th anniversary of Earth Day, and to commemorate the big day we’re bringing you exclusive access to our Acting Director of Earth Sciences, Sandra Cauffman, and Associate Administrator for the Science Mission Directorate, Dr. Thomas Zurbuchen! They will be teaming up to take your questions in an Answer Time session on Earth Day, April 22, from 12-1pm EDT here on NASA’s Tumblr! Make sure to ask your question now by visiting http://nasa.tumblr.com/ask!
Our investment in space – both the unique Earth science we conduct from orbit and the technology we’ve developed by living in space and exploring our solar system and universe – is returning benefits every day to people around the world, particularly those who are working on environmental issues. From documenting Earth’s changing climate to creating green technologies to save energy and natural resources, we’re working to help us all live more sustainably on our home planet and adapt to natural and human-caused changes.
From space we study: dust storms, volcanoes, flooding, coral reefs, night lights, wildfires, urban growth, food production, mosquito tracking and other human health issues, precipitation across the world, hurricanes and typhoons, soil moisture, land and sea ice, and changes to the land and sea surfaces.
From airborne research planes we track: changes in polar ice, glaciers, sea level rise, cloud formation, storms and Earth’s changing landscape.
Our Earth science focus areas include: Atmospheric Composition; Weather and Atmospheric Dynamics; Climate Variability and Change; Water and Energy Cycle; Carbon Cycle and Ecosystems; and Earth Surface and Interior.
Keep up to date with all our Earth Science missions and research by following NASA Earth on Twitter, Facebook and Instagram.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
On Aug. 21, 2017, a total solar eclipse passed over North America. People throughout the continent captured incredible images of this celestial phenomenon. We and our partner agencies had a unique vantage point on the eclipse from space. Here are a few highlights from our fleet of satellites that observe the Sun, the Moon and Earth.
Our Solar Dynamics Observatory, or SDO, which watches the Sun nearly 24/7 from its geosynchronous orbit about 22,000 miles above Earth, saw a partial eclipse on Aug. 21.
SDO sees the Moon cross in front of the Sun several times a year. However, these lunar transits don’t usually correspond to an eclipse here on Earth, and an eclipse on the ground doesn’t guarantee that SDO will see anything out of the ordinary. In this case, on Aug. 21, SDO did see the Moon briefly pass in front of the Sun at the same time that the Moon’s shadow passed over the eastern United States. From its view in space, SDO only saw 14 percent of the Sun blocked by the Moon, while most U.S. residents saw 60 percent blockage or more.
Six people saw the eclipse from the International Space Station. Viewing the eclipse from orbit were NASA’s Randy Bresnik, Jack Fischer and Peggy Whitson, the European Space Agency’s Paolo Nespoli, and Roscosmos’ Commander Fyodor Yurchikhin and Sergey Ryazanskiy. The space station crossed the path of the eclipse three times as it orbited above the continental United States at an altitude of 250 miles.
From a million miles out in space, our Earth Polychromatic Imaging Camera, or EPIC, instrument captured 12 natural color images of the Moon’s shadow crossing over North America. EPIC is aboard NOAA’s Deep Space Climate Observatory, or DSCOVR, where it photographs the full sunlit side of Earth every day, giving it a unique view of the shadow from total solar eclipses. EPIC normally takes about 20 to 22 images of Earth per day, so this animation appears to speed up the progression of the eclipse.
A ground-based image of the total solar eclipse – which looks like a gray ring – is superimposed over a red-toned image of the Sun’s atmosphere, called the corona. This view of the corona was captured by the European Space Agency and our Solar and Heliospheric Observatory, or SOHO. At center is an orange-toned image of the Sun’s surface as seen by our Solar Dynamics Observatory in extreme ultraviolet wavelengths of light.
During a total solar eclipse, ground-based telescopes can observe the lowest part of the solar corona in a way that can’t be done at any other time, as the Sun’s dim corona is normally obscured by the Sun’s bright light. The structure in the ground-based corona image — defined by giant magnetic fields sweeping out from the Sun’s surface — can clearly be seen extending into the outer image from the space-based telescope. The more scientists understand about the lower corona, the more they can understand what causes the constant outward stream of material called the solar wind, as well as occasional giant eruptions called coronal mass ejections.
As millions of Americans watched the total solar eclipse that crossed the continental United States, the international Hinode solar observation satellite captured its own images of the awe-inspiring natural phenomenon. The images were taken with Hinode's X-ray telescope, or XRT, as it flew above the Pacific Ocean, off the west coast of the United States, at an altitude of approximately 422 miles. Hinode is a joint endeavor by the Japan Aerospace Exploration Agency, the National Astronomical Observatory of Japan, the European Space Agency, the United Kingdom Space Agency and NASA.
During the total solar eclipse our Lunar Reconnaissance Orbiter, or LRO, in orbit around the Moon, turned one of its instruments towards Earth to capture an image of the Moon’s shadow over a large region of the United States.
As LRO crossed the lunar south pole heading north at 3,579 mph, the shadow of the Moon was racing across the United States at 1,500 mph. A few minutes later, LRO began a slow 180-degree turn to look back at Earth, capturing an image of the eclipse very near the location where totality lasted the longest. The spacecraft’s Narrow Angle Camera began scanning Earth at 2:25:30 p.m. EDT and completed the image 18 seconds later.
Sensors on the polar-orbiting Terra and Suomi NPP satellites gathered data and imagery in swaths thousands of miles wide. The Moderate Resolution Imaging Spectroradiometer, or MODIS, sensor on Terra and the Visible Infrared Imaging Radiometer Suite, or VIIRS, on Suomi NPP captured the data used to make this animation, which alternates between two mosaics. Each mosaic is made with data from different overpasses, collected at different times.
This full-disk geocolor image from NOAA/NASA’s GOES-16 shows the shadow of the Moon covering a large portion of the northwestern U.S. during the eclipse.
Our Interface Region Imaging Spectrograph, or IRIS, mission captured this view of the Moon passing in front of the Sun on Aug. 21.
Check out nasa.gov/eclipse to learn more about the Aug. 21, 2017, eclipse along with future eclipses, and follow us on Twitter for more satellite images like these: @NASASun, @NASAMoon, and @NASAEarth.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
What are you most excited for in 2020?
We've created a virtual Mars photo booth, 3D rover experience and more for you to put your own creative touch on wishing Perseverance well for her launch to the Red Planet! Check it out HERE.
Don’t forget to mark the July 30 launch date on your calendars!
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
Our massive James Webb Space Telescope just recently emerged from about 100 days of cryogenic testing to make sure it can work perfectly at incredibly cold temperatures when it’s in deep space.
Webb is a giant infrared space telescope that we are currently building. It was designed to see things that other telescopes, even the amazing Hubble Space Telescope, can’t see.
Webb’s giant 6.5-meter diameter primary mirror is part of what gives it superior vision, and it’s coated in gold to optimize it for seeing infrared light.
Lots of stuff in space emits infrared light, so being able to observe it gives us another tool for understanding the universe. For example, sometimes dust obscures the light from objects we want to study – but if we can see the heat they are emitting, we can still “see” the objects to study them.
It’s like if you were to stick your arm inside a garbage bag. You might not be able to see your arm with your eyes – but if you had an infrared camera, it could see the heat of your arm right through the cooler plastic bag.
Credit: NASA/IPAC
With a powerful infrared space telescope, we can see stars and planets forming inside clouds of dust and gas.
We can also see the very first stars and galaxies that formed in the early universe. These objects are so far away that…well, we haven’t actually been able to see them yet. Also, their light has been shifted from visible light to infrared because the universe is expanding, and as the distances between the galaxies stretch, the light from them also stretches towards redder wavelengths.
We call this phenomenon “redshift.” This means that for us, these objects can be quite dim at visible wavelengths, but bright at infrared ones. With a powerful enough infrared telescope, we can see these never-before-seen objects.
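Putting a number on it (a standard relation, included for context): redshift $z$ stretches wavelengths according to

$$1 + z = \frac{\lambda_\text{observed}}{\lambda_\text{emitted}},$$

so ultraviolet light emitted at 121.6 nanometers (hydrogen's Lyman-alpha line) by a galaxy at $z = 10$ arrives at about 1.3 microns, squarely in the infrared range Webb is designed to see.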
We can also study the atmospheres of planets orbiting other stars. Many of the elements and molecules we want to study in planetary atmospheres have characteristic signatures in the infrared.
Because infrared light comes from objects that are warm, in order to detect the super faint heat signals of things that are really, really far away, the telescope itself has to be very cold. How cold does the telescope have to be? Webb’s operating temperature is under 50 K (about -370°F or -223°C). As a comparison, water freezes at 273 K (32°F or 0°C).
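The link between “warm” and “infrared” comes from Wien's displacement law (textbook physics, included for context): a body at temperature $T$ glows most brightly at a wavelength of about

$$\lambda_\text{peak} \approx \frac{2898\ \mu\text{m}\cdot\text{K}}{T}.$$

At room temperature (about 300 K) that peak sits near 10 microns, well inside the infrared, while at 50 K the telescope's own glow peaks near 58 microns, far enough from the near-infrared wavelengths Webb studies that the observatory doesn't blind itself with its own heat. (The one mid-infrared instrument mentioned below needs to be even colder, which is why it gets its own cryocooler.)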
Because there is no atmosphere in space, as long as you can keep something out of the Sun, it will get very cold. So Webb, as a whole, doesn’t need freezers or coolers - instead it has a giant sunshield that keeps it in the shade. (We do have one instrument on Webb that does have a cryocooler because it needs to operate at 7K.)
Also, we have to be careful that no nearby bright things can shine into the telescope – Webb is so sensitive to faint infrared light, that bright light could essentially blind it. The sunshield is able to protect the telescope from the light and heat of the Earth and Moon, as well as the Sun.
Out at what we call the second Lagrange point (L2), where the telescope will orbit the Sun in line with the Earth, the sunshield is able to always block the light from bright objects like the Earth, Sun and Moon.
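For a sense of the geometry (a standard approximation, not an official mission figure), L2 sits along the Sun-Earth line at a distance from Earth of roughly

$$r \approx R\left(\frac{M_\oplus}{3M_\odot}\right)^{1/3} \approx 1.5 \times 10^6\ \text{km},$$

where $R \approx 1.5 \times 10^8$ km is the Earth-Sun distance. That's about four times farther away than the Moon, and because the Sun, Earth and Moon all stay on the same side of the spacecraft out there, one fixed sunshield can shade the telescope from all three at once.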
How do we make sure it will all work in space? By lots of testing on the ground before we launch it. Every piece of the telescope was designed to work at the cold temperatures it will operate at in space and was tested in simulated space conditions. The mirrors were tested at cryogenic temperatures after every phase of their manufacturing process.
The instruments went through multiple cryogenic tests at our Goddard Space Flight Center in Maryland.
Once the telescope (instruments and optics) was assembled, it even underwent a full end-to-end test in our Johnson Space Center’s giant cryogenic chamber, to ensure the whole system will work perfectly in space.
Next, the telescope will move to Northrop Grumman, where it will be mated to the sunshield and to the spacecraft bus, which provides support functions like electrical power, attitude control, thermal control, communications, data handling and propulsion.
Learn more about the James Webb Space Telescope HERE, or follow the mission on Facebook, Twitter and Instagram.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
We’re set to launch the Mars 2020 Perseverance rover mission from Cape Canaveral, Florida, on July 30. The rover is loaded with scientific instruments and advanced technology, making it the largest, heaviest and most sophisticated vehicle ever sent to the Red Planet.
What is Perseverance’s mission and what will it do on Mars? Here are seven things to know:
Not only does it have to launch during a pandemic and land on a treacherous planet, it also has to carry out its science goals:
Searching for signs of past microbial life
Mapping out the planet’s geology and climate
Collecting rock and other samples for future return to Earth
Paving the way for human exploration
We chose the name Perseverance from among the 28,000 essays submitted during the "Name the Rover" contest. Because of the coronavirus pandemic, the months leading up to the launch in particular have required creative problem solving, teamwork and determination.
In 1997, our first Mars rover – Sojourner – showed that a robot could rove on the Red Planet. Spirit and Opportunity, which both landed in 2004, found evidence that Mars once had water before becoming a frozen desert.
Curiosity found evidence that Mars’ Gale Crater was home to a lake billions of years ago and that there was an environment that may have sustained microbial life. Perseverance aims to answer the age-old question – are there any signs that life once existed on Mars?
The rover will land in Jezero Crater, a 28-mile wide basin north of the Martian equator. A space rock hit the surface long ago, creating the large hole. Between 3 and 4 billion years ago, a river flowed into a body of water in Jezero the size of Lake Tahoe.
Mars orbiters have collected images and other data about Jezero Crater from about 200 miles above, but finding signs of past life will need much closer inspection. A rover like Perseverance can look for those signs that may be related to ancient life and analyze the context in which they were found to see if the origins were biological.
This is the first rover to bring a sample-gathering system to Mars that will package promising samples of rocks and other materials for future return to Earth. NASA and ESA are working on the Mars Sample Return campaign, so we can analyze the rocks and sediment with tools too large and complex to send to space.
Two packages – one that helps the rover autonomously avoid hazards during landing (TRN) and another that gathers crucial data during the trip through Mars’ atmosphere (MEDLI2) – will help future human missions land safely and with larger payloads on other worlds.
There are two instruments that will specifically help astronauts on the Red Planet. One (MEDA) will provide key information about the planet’s weather, climate and dust activity, while a technology demonstration (MOXIE) aims to extract oxygen from Mars’ mostly carbon-dioxide atmosphere.
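For context, MOXIE's approach is solid oxide electrolysis (our summary of the publicly described technique, not wording from the mission team): it splits carbon dioxide into breathable oxygen and carbon monoxide, with the overall reaction

$$2\,\mathrm{CO_2} \longrightarrow 2\,\mathrm{CO} + \mathrm{O_2}.$$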
Perseverance and other parts of the Mars 2020 spacecraft feature 23 cameras, which is more than any other interplanetary mission in history. Raw images from the cameras are set to be released on the mission website.
There are also three silicon chips with the names of nearly 11 million people who signed up to send their names to Mars.
And you can continue to follow the mission on Twitter and Facebook.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
Earlier this year, we selected the Lucy mission to make the first-ever visit to a group of asteroids known as the Trojans. This swarm of asteroids orbits in two loose groups around the Sun, with one group always ahead of Jupiter in its path, and the other always behind. The bodies are stabilized by the Sun and Jupiter in a gravitational balancing act, gathering at locations known as Lagrange points: in this case the L4 and L5 points, about 60 degrees ahead of and behind Jupiter in its orbit.
Jupiter's swarms of Trojan asteroids may be remnants of the material that formed our outer planets more than 4 billion years ago—so these fossils may help reveal our most distant origins. "They hold vital clues to deciphering the history of the solar system," said Dr. Harold F. Levison, Lucy principal investigator from Southwest Research Institute (SwRI) in Boulder, Colorado.
Lucy takes its name from the fossilized human ancestor, called "Lucy" by her discoverers, whose skeleton provided unique insight into humanity's evolution. On the night it was discovered in 1974, the team's celebration included dancing and singing to The Beatles' song "Lucy In The Sky With Diamonds." At some point during that evening, expedition member Pamela Alderman named the skeleton "Lucy," and the name stuck. Jump ahead to 2013 and the mission's principal investigator, Dr. Levison, was inspired by that link to our beginnings to name the spacecraft after Lucy the fossil. The connection to The Beatles' song was just icing on the cake.
One of two missions selected in a highly competitive process, Lucy will launch in October 2021. With boosts from Earth's gravity, it will complete a 12-year journey to seven different asteroids: a Main Belt asteroid and six Trojans.
No other space mission in history has been launched to as many different destinations in independent orbits around the Sun. Lucy will show us, for the first time, the diversity of the primordial bodies that built the planets.
Lucy's complex path will take it to both clusters of Trojans and give us our first close-up view of all three major types of bodies in the swarms (so-called C-, P- and D-types). The dark-red P- and D-type Trojans resemble those found in the Kuiper Belt of icy bodies that extends beyond the orbit of Neptune. The C-types are found mostly in the outer parts of the Main Belt of asteroids, between the orbits of Mars and Jupiter. All of the Trojans are thought to be abundant in dark carbon compounds. Below an insulating blanket of dust, they are probably rich in water and other volatile substances.
This diagram illustrates Lucy's orbital path. The spacecraft's path (green) is shown in a slowly turning frame of reference that makes Jupiter appear stationary, giving the trajectory its pretzel-like shape.
This time-lapsed animation shows the movements of the inner planets (Mercury, brown; Venus, white; Earth, blue; Mars, red), Jupiter (orange), and the two Trojan swarms (green) during the course of the Lucy mission.
Lucy and its impressive suite of remote-sensing instruments will study the geology, surface composition, and physical properties of the Trojans at close range. The payload includes three imaging and mapping instruments, among them a color imaging and infrared mapping spectrometer and a thermal infrared spectrometer. Lucy also will perform radio science investigations using its telecommunications system to determine the masses and densities of the Trojan targets.
Several institutions will come together to successfully pull off this mission. The Southwest Research Institute in Boulder, Colorado, is the principal investigator institution. Our Goddard Space Flight Center will provide overall mission management, systems engineering, and safety and mission assurance. Lockheed Martin Space Systems in Denver will build the spacecraft. Instruments will be provided by Goddard, the Johns Hopkins Applied Physics Laboratory and Arizona State University. Discovery missions are overseen by the Planetary Missions Program Office at our Marshall Space Flight Center in Huntsville, Alabama, for our Planetary Science Division.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
On June 24, 2020, NASA announced the agency’s headquarters building in Washington, D.C., was to be named after Mary W. Jackson to celebrate her life and legacy. We collaborated with Events DC to create artwork inspired by Jackson’s story as the agency’s first Black female engineer.
Take a look at how six local female artists interpreted Jackson’s place in history through their individual creative lenses.
“To see Mary [W.] Jackson be so successful and to get the recognition that she deserves, it hits home for me in a couple ways.”
Tenbeete Solomon AKA Trap Bob is a visual artist, illustrator, and animator based in Washington, D.C.
“Art is so important across the board because it’s really a form of documentation,” says Trap Bob. “It’s creating a form of a history… that’s coming from the true essence of what people feel in the communities.”
“People can relate to things that may seem foreign to them through imagery.”
Jamilla Okubo is an interdisciplinary artist exploring the intricacies of belonging to an American, Kenyan, and Trinidadian identity.
“I wanted to create a piece that represented and celebrated and honored Mary [W.] Jackson, to remember the work that she did,” says Okubo.
“This is a figure who actually looks like us, represents us.”
Tracie Ching is an artist and self-taught illustrator working in Washington, D.C.
“The heroes and the figures that we had presented to us as kids didn’t ever look like me or my friends or the vast majority of the people around me,” says Ching.
"To be even a Black artist making artwork about space — it’s because of her triumphs and her legacy that she left behind.”
Jennifer White-Johnson is an Afro-Latina, disabled designer, educator, and activist whose work explores the intersection of content and caregiving with an emphasis on redesigning ableist visual culture.
“My piece is… a take on autistic joy because my son is autistic," says White-Johnson. "And I really just wanted to show him… in a space where we often don’t see Black disabled kids being amplified.”
“In my art, I try to highlight really strong and empowering women."
Julia Chon, better known by her moniker “Kimchi Juice,” is a Washington, D.C.-based artist and muralist.
“As minority women, we are too often overlooked and under recognized for the work and time that we give," says Kimchi Juice. "And so to see Mary W. Jackson finally being given this recognition is fulfilling to me.”
“I wanted when one listens to it, to feel like there is no limit.”
OG Lullabies is a Washington, D.C.-based songwriter and multi-instrumentalist whose work spans violin and electronics.
“When you look back at history… art is the color or the sound in the emotions that encapsulated the moment,” says OG Lullabies. “It’s the real human experience that happens as time passes.”
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.