Scientists at the University of Minnesota Duluth's Large Lakes Observatory have discovered a new microorganism in an unexpected place—hiding in the oily recesses of a Great Lakes research vessel.
The researchers made the discovery last fall, after crew members aboard the R/V Blue Heron noticed a strange knocking sound coming from the ship’s propeller system while on a research expedition on Lake Erie.
They hauled the ship out of the water at the Great Lakes Shipyard in Cleveland. That’s when the Large Lakes Observatory’s Marine Superintendent Doug Ricketts saw a black, tar-like goop oozing out of the ship’s rudder shaft.
Scientists at the University of Minnesota Duluth's Large Lakes Observatory discovered novel microbes hiding in the warm, oxygen-free environment of the rudder shaft housing of the research vessel Blue Heron. Courtesy of UMD's Large Lakes Observatory
He had never seen the stuff before, and thought it was odd. So he filled a red plastic cup full of the substance, and gave it to UMD professor Cody Shiek, a biologist who focuses on microbial ecology. Shiek decided to sample it. "I was completely like, we're not going to get anything off of this,” he remembers thinking. “But surprisingly, we found DNA and it wasn’t too destroyed, nor was the biomass too low."
After sequencing the DNA and comparing it with global databases, the team confirmed they had discovered an entirely new organism, a microbial species that appeared to thrive in the warm, oily, oxygen-free environment within the ship’s rudder shaft.
Shiek and his team temporarily dubbed the substance “ShipGoo001.” "And we don't know exactly what ShipGoo001 is good for right now,” said Catherine O'Reilly, director of the Large Lakes Observatory. “But there's a good chance that we'll learn more about it, and it might turn out to have applications to things that we care about as a society."
For example, some organisms in the goo appear to be methane producers, O’Reilly said, potentially useful for biofuel production.
This isn’t the first time Shiek has discovered a new organism. Far from it. On research trips on Lake Superior aboard the Blue Heron, he said it’s common to find new species, especially in the sediment of the lake. “When we go out into the environment, we're constantly finding new organisms, and that's just because things are very undersampled,” Shiek said.
What’s different and exciting about the ship goo is that the researchers were not looking for these organisms. It was an accidental discovery. “We weren’t supposed to probably see this,” he said.
Shiek has studied microbes living in extreme environments, from Lake Superior to deep-ocean hydrothermal vents and hot springs.
This discovery highlights how much remains unknown, even in familiar, built-up environments like ships.
“I think it tells us we can discover new things everywhere. We don't have to go to Mars necessarily to find brand new things right under our noses,” O’Reilly said. She added that it’s important to allow scientists to pursue new things without necessarily having a goal in mind.
“This shows us how important it is to be creative as a scientist, to be open-minded, to take advantage of opportunities that come to you and just explore what's right in front of you, because you really don't know what you're going to find.”
Researchers at the University of Minnesota Duluth's Large Lakes Observatory discovered new microorganisms dubbed "ShipGoo001" in a black tar-like substance on the rudder shaft of the research vessel Blue Heron. Courtesy of UMD's Large Lakes Observatory
One mystery Shiek is still trying to solve is where the organisms originated. He speculates they may have been dormant in the oil used to grease the rudder, waiting until conditions were right for growth.
While ShipGoo001 is new to science, similar species have been found in tar pits and petroleum wells around the world.
Shiek said the next step in the research is to try to decipher the metabolic processes of the microbes.
“Does it eat oil? Does it breathe in metal, like iron? And so that's where we're at right now. Thinking about how these organisms are surviving, or maybe even just thriving, in this built environment that we very rarely think about.”
And soon, the substance will receive a new scientific name. Participants in the Large Lakes Observatory’s Freshwater Discovery Day aboard the Blue Heron on July 30 will have a chance to join Shiek in coming up with an official name for what remains, for now, ShipGoo001.
Archaeologist Greer Jarrett at Lund University in Sweden has been sailing in the footsteps of Vikings for three years.
He can now show that the Vikings sailed farther away from Scandinavia, and took routes farther from land, than was previously believed possible.
In his latest study, he has found evidence of a decentralised network of ports, located on islands and peninsulas, which probably played a central role in trade and travel in the Viking era.
Greer Jarrett has identified four possible small ports, or "havens," used by Vikings along the Norwegian coast.
If you want to learn more about how and where the Vikings sailed, making the journey through the fjords yourself in replica boats is a practical, hands-on approach to achieving that end. Greer Jarrett, an archaeologist at Lund University in Sweden, has spent the last three years doing just that, sailing more than 5,000 kilometers along known Viking trade routes in open, spare-rigged clinker boats similar to those used by the Vikings.
Not only has Jarrett learned a great deal about the boats themselves, he also identified four possible havens along the Norwegian coast, part of what may have been a decentralized network that played a crucial role in trade and travel during that period. And those ports are located farther out to sea than other major ports and hubs known to date, according to a paper he published in the Journal of Archaeological Method and Theory.
It's just the latest intriguing discovery enabled by the growing field of experimental archaeology, whereby researchers seek to reverse-engineer all manner of ancient technologies. Experimental archaeologists have, for instance, built their own versions of Early Upper Paleolithic adzes, axes, and chisels. The resulting fractures and wear enabled them to develop new criteria for identifying the likely functions of ancient tools. Others have tried to cook like the Neanderthals, concluding that flint flakes were surprisingly effective for butchering birds, and that roasting the birds damages the bones to such an extent that it's unlikely they would be preserved in the archaeological record.
Kent State University's Metin Eren has conducted practical experiments to study, for instance, the trajectories of atlatls attached to spears tipped with replica Clovis points, and how their performance compares to javelins used by Neanderthals. He even fashioned rudimentary blades out of his own frozen feces to test whether they could cut through pig hide, muscle, and tendon—solely to test a famous anthropological legend about an elderly Inuit man in the 1950s who purportedly did the same to kill and skin a dog, using its rib cage as a makeshift sled to venture off into the Arctic. (It did not work, so myth: busted. But it did snag Eren an Ig Nobel prize.)
Taking a hands-on, experimental archaeological approach to studying the Vikings makes sense in light of the dearth of contemporary written sources. "We have a few things written by outsiders, but there's very, very few accounts written or delivered by people from Scandinavia during that period," Jarrett told Ars. "We normally rely on indirect forms of evidence, be that genetics or archaeology or linguistics, which show strong, very frequent connections across maritime areas in the North Atlantic. But because traveling by boat is kind of an archaeologically invisible act, you don't leave any footprints. So we have very little information about the voyages between these points."
The sailing voyages made by Greer Jarrett during the research project, as well as the four possible Viking harbors he identified.
Credit: Greer Jarrett
Jarrett and his crew used four or five different replica boats for their test voyages. Most were built by volunteers, enthusiasts, or students Jarrett had met during his considerable time in the field. They then sailed along the west coast of the Scandinavian Peninsula, a core area of Viking seafaring.
"These are reconstructions of traditional Norwegian boats from the 1800s and early 1900s," said Jarrett. "My idea was, because of this really long-term continuity in traditional boat building practices, especially in Norway, it might be possible to use these later boats which have lots of similarities to try and work out the potentials of where people might have gotten out. It's the idea of suggesting potentials based on practical experience to try and join those dots between the different evidence we have across the Viking world."
That decision has led to some criticism from colleagues because of the enormous gap in time, but Jarrett defends his choice. "The Viking Age ends in the 11th century, and we're talking about boats from 800 years later," he said. "But the construction techniques and the way they are rigged and their general performance characteristics are similar enough. Because this is a project about voyages and not a project about boat building, it seemed like a defensible analogy."
Seeking safe harbor
"On the long-range voyages, we worked in watches of four hours on and four hours off, and that is just about long enough to get some sleep on your off watch, but also just about short enough that you don't get really, really, really cold, which is obviously a risk," said Jarrett. "It was manageable, but we looked like penguins. I mean, we're wearing six layers of wool at any time and sleeping all stacked together for warmth. But other times it's really nice. The spring and the autumn in Scandinavia, there's much more likelihood of high-pressure cycles, which means that it's clearer and sunnier than in the summer itself."
Nonetheless, there were some rough moments, such as when the mast spar holding up the mainsail snapped, forcing the crew to improvise and lash two oars together to hold the sail so they could continue their journey. It took several days to repair the boat so it could sail again. There was no safety boat following along in case the crew got into trouble, and no engine, although they did have a life raft, which the crew has yet to use.
Based on his sailing trials, Jarrett believes that the Vikings had no need for navigational tools like maps, a compass, or a sextant, relying instead on what he calls "mental maps"—or a "maritime cultural mindscape"—based on sailors' memories and experiences passed down orally through generations. Those maps might also be informed by the myths linked to well-known coastal landmarks, such as skerries, small islets, or reefs.
"People had been moving by boat along the west coast of Scandinavia for a really, really, really long time, probably since the late Neolithic, if not earlier—thousands of years before the Viking age," said Jarrett. "There are big trading networks in place beforehand, and that is reflected in the names, place names along the west coast. My primary argument is if you spend 3,000 years traveling up and down a coastline in which you can use the coast at all times for navigation, then it's unnecessary to develop instrumentation.
"Instruments are used when you are in a place out in the open sea that you don't know," Jarrett continued. "We definitely know they didn't have compasses because those don't arrive from China until the 1200s. There are these ideas about sunstones and sundials, or little sun compasses, which are entirely possible. But there's no legitimate proof of either of them archaeologically yet. I may well be proved wrong if we find them at some point, but I don't think they're necessary for this at all."
This type of sailing boat is known as a faering.
It was built at a folk high school in Norway as part of Greer Jarrett's research project.
The boat at sea.
Greer Jarrett
Based on the sailing trials, archaeological and documentary evidence of Viking Age maritime centers, and digital reconstructions of past sea levels, Jarrett was able to develop a useful set of criteria for evaluating potential havens. For instance, the site should be reachable in low visibility, with land or sea marks that sailors could use as bearings; large enough to accommodate multiple vessels of at least the size of a faering (which can house a crew of four to ten people); provide good protection from sea swell and storm surges; and have access to fresh water, among other criteria. Four sites scored sufficiently high by those criteria to qualify as possible Viking havens.
The four sites are Smørhamn, located at the confluence of Oldersund and the Frøysjø, where an inn and trading post are known to have existed since at least the late 17th century; the archipelago of Sørøyane between Stad and Ålesund, near where the sea battle of Hjörungavágr was fought circa 986 CE; Bjørnsund, a number of small islands off the southwestern tip of Hustadvika; and the island of Storfosna, which appears on 16th and 17th century charts.
"I'm not saying, 'This is where they went,'" said Jarrett. "I'm saying that, with these kinds of boats under these conditions, it would be possible to go to these places. And it's much more difficult—not impossible, but much more difficult—to go to these other places or to sail in these other conditions."
Pining for the fjords
The next step is for Jarrett and other archaeologists to hunt for evidence in support of his hypothesis. "Most of these sites have never been excavated," said Jarrett. "There's been a long assumption that these are landing places with the idea that you are dragging your boat ashore. I'm very opposed to that idea because these are two-and-a-half-ton boats, let alone the cargo. Unless you have a team of oxen and 20 people at your command, there is no way you're getting them on the beach. I'm very convinced that these places have jetties and mooring posts likely preserved underwater. All of that organic material survives much better underwater than it does on land. So I think that's very possible."
They might also find smaller items suggestive of a thriving harbor community. "Whenever you go into land, you've got something that's broken, so you need to do repairs," said Jarrett. "So things like clink nails or piles of ballast stones or signs of smithing—the typical kind of things you'd use for repairing your ship, I think are possible to find." Jarrett's methodology might also prove useful for studying other seafaring communities.
Cold winds off the coast of northern Norway.
Greer Jarrett
A katabatic wind caused this fembøring to capsize off the coast of Kunna in 2023.
HRS Nord-Norge
The practical experience of sailing the same seas as the Vikings naturally led to some surprising insights. "You are able to ask very different questions the minute you walk away from your desk and get on a boat," said Jarrett. "I think it's essential to do that because you think in new ways. In terms of the results themselves, the boats are extremely seaworthy crafts. When you get in them for the first time, you don't think that, because they're very, very light. They feel very flimsy, and they're very low in the water compared to a modern sailing boat. So you feel really in touch with the wave, which is kind of scary. But because they're so flexible and because of the way they're rigged, they're actually really stable, even in big waves.
"We kept going out thinking, 'Oh, this is maybe the limit of what this boat can tolerate,' and then it would be fine, and we'd be, 'Okay, let's go a little bit in slightly bigger waves with slightly stronger wind,'" Jarrett continued. "So I think our comfort zones definitely visibly expanded during that period. And I had the chance to work with the same crews over three years. By the end of those three years, we were doing stuff that we would never have been able to do at the beginning."
Another big difference from modern boats, Jarrett discovered, is that one cannot sail a traditional Viking craft alone. "It has to be a collaborative effort because of how you need a person at the front and the back of the boat basically at all times," he said. "So developing the crew together and gaining not only skills, but also trust between us meant that we could do things in 2024 that seemed completely insane just a couple of years earlier. I cannot imagine what that is like if you have an entire lifetime of Viking sailors working together for 30 years. It must be an incredible way of creating social bonds."
Why raster attribute tables carrying geospatially distributed metadata and bathymetry are a match made in hydro-heaven that will turbo-charge data-driven maritime navigation.
Disclaimer: These views represent my own personal thoughts on this subject as a private citizen, and do not represent the views of NOAA or the US Government.
I. Introduction
Although I feel that it’s an overall healthy thing that official NOAA paper nautical charts have ridden off into the sunset to make way for newer electronic versions, I sometimes still miss the thoughtful cartographic touches and historical legacy of old paper charts. One thing I don’t miss is an interesting item of “chart furniture” called the source diagram, which isn’t present in modern electronic files. Tucked away in a corner, this patchwork of what often looked like hand-drawn polygons was a mariner's only way to gauge the quality of the seafloor data beneath their keel. A pilot navigating a channel had to mentally cross-reference their position with this diagram, trying to decipher whether the soundings in one area came from a 1930s lead-line survey or a 2011 multibeam sonar survey. It was a generalized solution that offered a vague sense of confidence but little in the way of concrete, queryable data. Plus, it was a bit onerous to use.
Today, we are moving into the world of modern electronic navigation with the S-100 standard. High-resolution bathymetry, delivered as an S-102 data product, can now be displayed directly on a vessel’s navigation screen. This gives pilots and mariners an unprecedented view of the seafloor. But it also raises the old question in a new, more urgent way.
I was recently part of a discussion with some fellow NOAA scientists who were working with harbor pilots testing these new S-102 products. A consistent point of discussion is the “survey vintage” of any particular area. One pilot might see a survey launch working in the harbor one week. The next week, they want to know if that new data is already in the product they are using.
This need for current data is critical for modern navigation. A pilot’s navigation system can combine the S-102 surface with real-time water level data from S-104 products. It uses this information, along with the ship’s draft measurements, to calculate a dynamic safety contour and a realistic understanding of their underkeel clearance. Having the absolute latest bathymetry in this system improves a pilot’s ability to navigate the largest ships safely and efficiently. It makes sense to want the newest data in order to make the best real-time navigation decisions.
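To make the under-keel clearance idea concrete, here is a back-of-the-envelope sketch in Python. The formula and the numbers are illustrative assumptions on my part; operational systems also account for squat, vessel motion, and uncertainty margins derived from the S-102 and S-104 products.

```python
# Back-of-the-envelope under-keel clearance (UKC). Illustrative only:
# real navigation systems add squat, motion allowances, and uncertainty
# terms from the S-102 bathymetry and S-104 water level products.
def under_keel_clearance(depth_m: float, water_level_m: float, draft_m: float) -> float:
    """Water under the hull: charted depth + real-time water level - draft."""
    return depth_m + water_level_m - draft_m

# Hypothetical numbers: 14.2 m charted depth, +0.8 m water level, 13.5 m draft.
print(round(under_keel_clearance(14.2, 0.8, 13.5), 2))  # -> 1.5 (meters)
```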
This is where the true value of a Raster Attribute Table, or RAT, becomes clear. In the S-102 standard, this concept is implemented through a feature attribute table called “QualityOfBathymetryCoverage.”
Instead of a static source diagram, hydrographic offices have the option to build S-102 datasets with a built-in attribute table. Each group of pixels in the grid is linked to a rich set of metadata attributes describing the source and quality of the bathymetry.
This allows the pilot’s navigation system to do something that was impossible with a paper chart. It can query the bathymetry directly and dynamically color-code the seafloor based on its age or quality, giving the mariner instant and intuitive feedback. The RAT transforms the raster from a simple grid of depths into a smart, queryable database… a huge upgrade to the source diagram.
Figure 2: Top: NOAA BlueTopo bathymetry at St. Andrew Sound colored by Source Survey ID, an attribute found in the Raster Attribute Table. Bottom Left: The paper chart’s source diagram. Bottom Right: Hillshaded BlueTopo bathymetry of St. Andrew Sound rendered in QGIS with ENC land and ATON overlay.
II. What Exactly is a Raster Attribute Table?
At its most basic level, a raster is just a grid of numbers composed of rows and columns of data. The magic of a Raster Attribute Table (RAT) is that it gives those numbers meaning beyond their value. Think of a standard raster as a “dumb” grid. If a pixel has a value of 5, it’s just the number 5. A raster with a RAT is a “smart” grid. A pixel value of 5 acts as a key, linking to a row in a table that can tell you anything you want about that pixel’s category.
Figure 3: NOAA BlueTopo bathymetry tile of St. Andrew Sound colored by Band 2: Contributor layer showing the queried integer value 65840. This value acts as the primary key to the look-up table (RAT).
Figure 4: The RAT showing the source and quality metadata associated with the contributor layer 65840. Turns out this area was last surveyed in 1924, from survey H04444.
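The lookup shown in Figures 3 and 4 can be scripted. Below is a minimal sketch using GDAL's Python bindings; the file name, band index, and the "value" key column are assumptions for illustration, not the definitive BlueTopo layout.

```python
# Minimal sketch of a RAT lookup with GDAL's Python bindings.
# Assumptions: the file name, band index, and a key column named "value"
# are illustrative -- inspect a real tile's RAT for the actual names.
from osgeo import gdal

ds = gdal.Open("bluetopo_tile.tiff")      # hypothetical BlueTopo tile
band = ds.GetRasterBand(3)                # e.g., the contributor/key band
rat = band.GetDefaultRAT()                # the attached Raster Attribute Table

# The pixel value at a location of interest acts as the primary key.
key = int(band.ReadAsArray(xoff=100, yoff=100, win_xsize=1, win_ysize=1)[0, 0])

# Find the RAT row whose key column matches, then print all its attributes.
cols = [rat.GetNameOfCol(i) for i in range(rat.GetColumnCount())]
key_col = cols.index("value")
for row in range(rat.GetRowCount()):
    if rat.GetValueAsInt(row, key_col) == key:
        for i, name in enumerate(cols):
            print(f"{name}: {rat.GetValueAsString(row, i)}")
        break
```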
The concept becomes clearer with the Bathymetric Attributed Grid, or BAG, format. The BAG format was designed specifically to be more than just a grid of depths. While not yet universally implemented, the emerging BAG v2.0 standard, published by the Open Navigation Surface Working Group, aims to solve a classic hydrographic problem: how to create a single, cohesive surface from a survey that used multiple sensors and methods. For example, a survey might include areas of full-coverage multibeam echosounder (MBES), flanked by lanes of single-beam echosounder (SBES), with side-scan sonar (SSS) used to cover the gaps. The BAG format can hold all these different data types, including any interpolated data if used to fill the space between survey lines, within a single data structure.
The S-102 bathymetry standard uses this same philosophy to create a seamless surface model for modern navigation systems. This seamless model, which includes interpolated data between actual soundings, is what allows for powerful functions like generating user-customized safety contours on the fly, and selecting soundings at a custom spacing density.
But this raises an important question for those making navigation decisions, whether through route planning or real-time piloting: how does the system know which depths are real versus interpolated? This is where the RAT, in the form of the “QualityOfBathymetryCoverage” (QOBC) layer in S-102 datasets, becomes valuable. One of the attributes in the QOBC layer, bathymetryCoverageAchieved, acts as a simple boolean flag. This flag allows a system to perform user-customized sounding selections, ensuring that only soundings from actual, observed bathymetry are displayed, filtering out any interpolated values.
This structure transforms a simple raster grid into an intelligent data product. It allows a user or a software system to not just see a depth, but to instantly ask, “Where did this depth value come from, is it real or interpolated, and how good is it?”
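As a rough illustration of that kind of query, here is a small numpy sketch. It assumes the QOBC IDs and their attribute records have already been pulled out of the S-102 HDF5 file into plain Python structures; the grids, IDs, and attribute values below are invented for the example.

```python
# Illustrative sketch: masking interpolated depths with the coverage flag.
# The grids and records are invented; in practice they would be read from
# the S-102 file's bathymetry and QualityOfBathymetryCoverage datasets.
import numpy as np

depth = np.array([[5.2, 5.0, 4.8],
                  [5.1, 4.9, 4.7],
                  [5.0, 4.8, 4.6]])   # gridded depths (m)

qobc_id = np.array([[1, 1, 2],
                    [1, 2, 2],
                    [2, 2, 3]])       # per-cell quality-record IDs

# One attribute record per ID; bathymetryCoverageAchieved distinguishes
# measured cells from interpolated ones.
records = {
    1: {"bathymetryCoverageAchieved": True,  "surveyDateEnd": "2023-06-01"},
    2: {"bathymetryCoverageAchieved": False, "surveyDateEnd": None},  # interpolated
    3: {"bathymetryCoverageAchieved": True,  "surveyDateEnd": "1994-08-15"},
}

# Boolean mask of measured cells; keep only real soundings for display.
measured = np.vectorize(lambda i: records[i]["bathymetryCoverageAchieved"])(qobc_id)
real_depths = np.where(measured, depth, np.nan)
print(real_depths)
```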
III. A Tale of Three Implementations
Linking rich metadata to a raster grid creates many new uses. Modern data standards and products handle this task in a few different ways. We can look at three important examples in the hydrographic world: the Bathymetric Attributed Grid (BAG) v2.0, the IHO’s S-102 standard, and NOAA’s BlueTopo product.
1. The BAG v2 Method: Georeferenced Metadata
The BAG format is designed as a self-contained data package. It uses the HDF5 file structure, which holds multiple datasets like elevation and uncertainty within a single file. The v2 standard sets rules for how to store metadata for different parts of the grid. It specifies a “georeferenced metadata layer.” This layer has two parts: a “keys” band and a “values” table. The keys band is a raster the same shape as the elevation grid. Each pixel in the keys band contains an integer ID. This ID links that specific pixel to a row in the separate “values” table. The values table can hold detailed information, often following established specifications like the NOAA Office of Coast Survey (OCS) metadata profile.
2. The S-102 Method: Quality of Bathymetric Coverage
The IHO’s S-102 standard for bathymetric data uses a very similar system. It is also based on the HDF5 file structure and was adapted from the BAG format to fit the S-100 hydrographic data model. The standard defines an optional feature called “QualityOfBathymetryCoverage.” This feature is a layer of integer IDs. Each ID corresponds to a region of the seafloor surveyed under a unique set of conditions. These IDs link to a feature attribute table that contains specific quality information. This table includes attributes like the survey date and the bathymetryCoverageAchieved flag, which confirms if a grid cell contains measured or interpolated data.
3. The NOAA BlueTopo Method: The Contributors Layer
NOAA’s BlueTopo dataset provides a real-world example of this principle applied to the common GeoTIFF format. BlueTopo is delivered as a multi-band GeoTIFF with three main bands: elevation, uncertainty, and a “Contributors” layer. The Contributors layer is a grid of integer IDs, much like the keys band in a BAG. Each integer corresponds to a unique survey that contributed data to the final BlueTopo model. The GeoTIFF file has a standard Raster Attribute Table (RAT) attached to it. Any user or software can read this table to look up a contributor ID and get the full metadata for that source survey.
This GeoTIFF-plus-RAT method has a key advantage, as it uses a file format that most GIS software can open directly. A user can immediately view the elevation and uncertainty data, and if their software also supports reading a RAT, they can instantly perform the complex queries and visualizations mentioned earlier.
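One example of such a query is recoloring the grid by survey age, as in Figure 2. The sketch below builds a per-pixel survey-year array from the Contributors band and its RAT; the column names ("value", "survey_date_end") are hypothetical stand-ins for whatever a real BlueTopo RAT uses.

```python
# Sketch: deriving a per-pixel survey-year grid from the Contributors
# band and its RAT, e.g. to color bathymetry by vintage. Column names
# ("value", "survey_date_end") are hypothetical; check a real tile's RAT.
import numpy as np
from osgeo import gdal

ds = gdal.Open("bluetopo_tile.tiff")      # hypothetical tile name
contrib = ds.GetRasterBand(3)             # Contributors layer
rat = contrib.GetDefaultRAT()
ids = contrib.ReadAsArray()               # grid of contributor IDs

cols = [rat.GetNameOfCol(i) for i in range(rat.GetColumnCount())]
key_col, date_col = cols.index("value"), cols.index("survey_date_end")

# ID -> survey year lookup built once from the table...
year_of = {
    rat.GetValueAsInt(r, key_col): int(rat.GetValueAsString(r, date_col)[:4])
    for r in range(rat.GetRowCount())
}
# ...then remapped across the whole grid in one vectorized pass.
years = np.vectorize(year_of.get)(ids)
print("survey years span", years.min(), "to", years.max())
```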
To access BlueTopo bathymetry GeoTIFFs, you can visit NOAA’s NowCOAST web platform to both visualize the bathymetry (including curated views that use the RAT) and download the GeoTIFFs or the standalone RATs directly.
IV. Final Thoughts: A Smarter Seafloor for Smarter Ships
The transition from a small, hand-drawn source diagram on a paper chart to a dynamic, queryable bathymetric surface is quite remarkable. The core challenge, however, has not changed. A mariner needs to know the quality and origin of the data they are using to make critical decisions. The simple, straightforward structure of the Raster Attribute Table is a key idea that makes answering this question possible in the modern digital era.
Whether it is implemented within an S-102 high resolution bathymetry dataset, a BAG file, or a GeoTIFF like in NOAA’s BlueTopo, the function of the RAT is the same. It transforms a raster from a simple grid of values into an intelligent dataset. It allows software, and by extension the user, to perform complex queries on the fly. This capability directly addresses the harbor pilot’s need for confidence and is also a powerful tool for creating labeled datasets to train machine learning models.
This becomes even more critical when we look to the future of shipping with Maritime Autonomous Surface Ships (MASS). An autonomous vessel’s navigation system cannot interpret a chart with human intuition; it must rely on clear, machine-readable data to make decisions. The detailed metadata within an S-102 RAT provides exactly this.
A MASS path-planning algorithm could directly query the RAT to perform its own automated risk assessment. It could be programmed to understand the difference between a high-confidence depth value measured last week and a low-confidence one interpolated from a 1940s survey. For example, the system could make a calculated decision following a train of reasoning like this: The shortest route has an under-keel clearance of ~2 meters, but it crosses an area with high bathymetric uncertainty. The alternate route is longer, but it remains entirely within areas surveyed last year with high-resolution sonar. If risk thresholds are properly chosen, it should choose the safer alternate route.
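A toy version of that decision rule might look like the following. This is my own illustrative sketch, not a real MASS planner; the thresholds and per-route statistics are invented.

```python
# Toy illustration of the route-selection reasoning above -- not a real
# MASS path planner. Thresholds and per-route statistics are invented;
# in practice the uncertainty figures would be queried from the RAT.
def route_is_acceptable(route, min_ukc_m=3.0, max_uncertainty_m=1.0):
    """Reject a route whose under-keel clearance is tight in areas of
    high bathymetric uncertainty."""
    tight = route["ukc_m"] < min_ukc_m
    risky = route["uncertainty_m"] > max_uncertainty_m
    return not (tight and risky)

shortest  = {"name": "shortest",  "ukc_m": 2.1, "uncertainty_m": 4.0}  # 1940s survey
alternate = {"name": "alternate", "ukc_m": 2.3, "uncertainty_m": 0.3}  # recent MBES

chosen = next(r for r in (shortest, alternate) if route_is_acceptable(r))
print("chosen route:", chosen["name"])    # -> alternate
```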
The RAT, with its rich metadata on data quality, source, and uncertainty, provides the ground truth needed for an autonomous system to navigate safely and efficiently. The underlying technology is not flashy, but it is fundamental for the future of both human-centered and autonomous marine navigation.
Imagine being on a boat in the middle of the ocean on a seemingly calm day.
There are waves in the water as usual, and there doesn't seem to be anything out of the ordinary. Suddenly, you notice that in the middle of the waves, a vast void more than 80 feet (24 meters) deep forms, and the water seems to sink as if something were absorbing it from below.
But you're on the high seas.
Beneath that void forming on the water's surface, there is just more water, yet the void creates a current that drags in everything around it.
The waves, the foam of the water, and the boat you are on begin to be swept into the void as if they are being sucked into a giant whirlpool that appeared out of nowhere.
The boat begins to sink into that void, and you notice that you are now below sea level while the boat is about to be entirely covered by the waves you used to see from above.
Although it sounds like a science fiction movie, this scenario could happen.
What is happening is nothing more than an "oceanic rogue hole," an anomaly in the ocean that causes the formation of a large hole in the surface of the water.
This oceanographic phenomenon represents a great danger to ships as it could cause even a huge modern ship to capsize.
Could it be that this phenomenon is responsible for many shipwrecks that remain unsolved?
Or is this kind of phenomenon completely impossible?
Oceanic rogue holes are the opposite of the oceanic rogue waves we discussed in previous videos.
They are oceanic phenomena in which a trough in the water can reach twice or more the depth of the surrounding troughs, occurring when a wave collapses under its own weight.
When a wave occurs in the ocean, it draws water from its surroundings in order to grow upward.
As it does so, the water around the wave descends below sea level, creating what we know as an oceanic void.
Such a void also forms when a wave collapses under its own weight, pushing water downward and opening another new void.
More about them:
Normal Seastate: Represents typical ocean wave activity with small, relatively uniform wave heights.
Such conditions are common during calm weather or under normal sea-state conditions.
Waves in this category are predictable and exhibit low energy compared to extreme waves.
Rogue Wave: A large and unexpected wave that is much higher than the surrounding waves.
Often referred to as “freak waves,” rogue waves can appear suddenly and are a significant hazard to ships. These waves occur due to constructive interference, where multiple wave crests align to form a single massive wave.
Rogue Hole: This is an inverse phenomenon of a rogue wave, where an unusually deep trough forms in the sea surface.
Such “holes” in the sea can appear abruptly, posing risks to vessels as they may encounter sudden destabilization. Rogue holes are thought to result from destructive interference or particular wave dynamics (see the toy simulation after this list).
“Three Sisters”: Refers to a sequence of three large waves occurring in close succession.
This phenomenon is particularly dangerous because vessels that survive the first wave might be overwhelmed by the subsequent waves. The “Three Sisters” phenomenon is often associated with storms or nonlinear interactions in wave trains.
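To make the interference mechanism behind rogue waves and rogue holes concrete, here is a toy Python sketch of linear superposition. It is purely illustrative, with invented parameters, not real ocean-wave physics or the model from the report below.

```python
# Toy superposition demo: many small sinusoidal wave trains with random
# phases mostly cancel, but occasionally align constructively into an
# outsized crest -- or destructively into an outsized trough (a "hole").
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 600, 6000)                  # 10 minutes of "sea", seconds

n = 30                                         # elementary wave trains
amps = rng.uniform(0.1, 0.4, n)                # small amplitudes (m)
freqs = rng.uniform(0.08, 0.2, n)              # swell-like frequencies (Hz)
phases = rng.uniform(0, 2 * np.pi, n)

# Sea-surface elevation as the linear sum of all components.
eta = sum(a * np.sin(2 * np.pi * f * t + p) for a, f, p in zip(amps, freqs, phases))

# Rogue events are conventionally flagged when a wave's height exceeds
# about twice the significant wave height (~4x the elevation std dev).
sig_height = 4 * eta.std()
print(f"significant height proxy: {sig_height:.2f} m")
print(f"highest crest: {eta.max():.2f} m, deepest trough: {eta.min():.2f} m")
```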
Here’s a simplified summary of the original report:
What Was Studied?
Scientists studied how big, dangerous waves (called “rogue waves”) form in the ocean. They used a simple model where smaller waves combine in random ways.
This study helps predict when and where these dangerous waves might happen.
Key Findings:
How Rogue Waves Form:
When fewer than 10 small waves combine, no rogue waves happen. If more than 10 waves combine, rogue waves are much more likely.
Why This Matters:
Rogue waves can happen more often than we thought, depending on how waves mix and how the wind affects them.
How We Can Predict Rogue Waves:
Scientists used a tool (called the Grassberger-Procaccia analysis) to study ocean wave patterns.
This tool helps figure out if the right number of waves is combining to create a rogue wave.
Wind conditions can also affect how rogue waves form, especially if winds come from many directions or are very strong.
What’s New?
The study found that understanding how waves mix is key to predicting rogue waves. The shape and number of waves are more important than just their size.
How This Helps Us:
Scientists might be able to use this information to create devices that monitor ocean waves in real-time. These devices could warn ships about dangerous rogue waves.
Next Steps:
More research is needed to understand how wind patterns and other ocean conditions affect rogue waves. This could make warnings even better.
In 2024, E/V Nautilus and the Corps of Exploration embarked on one of our widest-ranging seasons ever, spending eight months exploring the Central, Eastern, and Western Pacific, mapping unsurveyed seafloor, and bringing new views of never-before-seen seafloor to the world.
From mapping nearly 150,000 km² of seafloor to conducting 62 ROV dives (totaling over 670 hours on the ocean floor), this was one of our most groundbreaking seasons yet!
Our 10 expeditions and 163 days at sea carried us to Hawaiʻi, American Samoa, the US Pacific Remote Islands, Palau, and Canada, with expedition goals based on community input from scientists and resource managers as well as from local communities and stakeholders in the regions where the expeditions took place.
Thanks to YOU—our amazing community—we reached over 56 million views streaming and on social media worldwide, sharing a love for ocean exploration and science!
This highlight reel captures the breathtaking discoveries, inspiring people, and cutting-edge technologies that powered this unforgettable year.
Thanks to our many partners who have made the 2024 E/V Nautilus expeditions possible with their support including NOAA Ocean Exploration via the Ocean Exploration Cooperative Institute, Ocean Networks Canada, the Bureau of Ocean Energy Management, the NOAA Office of Marine and Aviation Operations Uncrewed Systems Operations Center, the Office of Naval Research, the National Marine Sanctuary of American Samoa, and Palau International Coral Reef Center.
Our season would also not be possible without the more than 200 professional mariners, scientists, engineers, tech specialists, educators, and students who joined the vessel this season.
“This hidden water beneath Antarctica plays a much more significant role than we thought,” says Chen Zhao, a glaciologist at the University of Tasmania in Hobart, Australia.
Policymakers are basing their decisions on current projections of sea level rise, she says, “but what if we largely underestimated it?”
While the Antarctic Ice Sheet may be frozen, it is not static.
The ice sheet deforms under its own weight, and in some places its frigid base slides along the ground like a sled on snow.
This process, known as basal sliding, accounts for most of the movement of the fastest Antarctic glaciers flowing into the ocean.
Understanding basal sliding is crucial for estimating future sea level rise.
The Antarctic Ice Sheet holds around 90 percent of all ice on Earth.
But human-caused climate change is driving it to shed an average of 150 billion metric tons of ice each year, raising sea levels around the world.
Researchers already knew that subglacial water can boost a glacier’s basal sliding speed.
Similar to how an air hockey table pushes out a thin cushion of air for a puck to glide over, the pressure exerted by subglacial water counters some of the overlying glacier’s weight, easing its flow toward the sea.
“It’s sort of lubricating the ground for the ice,” says glaciologist Mathieu Morlighem of Dartmouth College, who was not involved in the study.
But it remains uncertain just how much basal water enhances glacier flow and sea level rise.
Many Antarctic Ice Sheet computer simulations — or models — that predict sea level rise ignore the effects of subglacial water and are probably underestimating its impact, Zhao says.
She and her colleagues simulated the Antarctic Ice Sheet’s evolution as it flowed over channels and lakes.
Because so little is known about the distribution of the water beneath the ice sheet, they tested different ways of simulating the pressure it exerts.
For instance, in one test, the researchers assumed the water under the ice sheet could flow essentially unhindered into the ocean.
In others, they factored in the topography beneath the ice sheet, considering places where water would accumulate to construct a more intricate picture of how pressure was distributed.
And in some tests, the team increased the water pressure near the grounding line, where the ice sheet meets the ocean.
“It makes physical sense,” Morlighem says.
“They’re making the bed more slippery … as the ice starts to float.”
This increased slipperiness contributed to the most extreme result.
Compared with the standard approach used in current ice sheet models, one simulation with a slippery grounding line generated 2.2 additional meters of sea level rise by 2300.
“It’s not crazy at all,” Morlighem says. Those two meters represent only 4 percent of what the Antarctic Ice Sheet — which contains about 90 percent of all ice on Earth — could deliver if it all melted, he explains. The rest of the tests yielded a wide range of contributions to sea level rise.
Determining exactly how much subglacial water will contribute to future sea level rise will require further investigation into what lies beneath the Antarctic Ice Sheet.
“Without knowing what’s under the ice, we have to make assumptions in our simulations that can have big impacts on the predictions,” Zhao says.
Using a global ocean model, researchers from the Institute of Industrial Science, The University of Tokyo, and the Institute of Environmental Radioactivity, Fukushima University, found that the short- and long-term contribution of treated water released from the Fukushima Daiichi Nuclear Power Plant to oceanic tritium concentration beyond the vicinity of the discharge site is negligible, even under climate change scenarios.
From Phys by Stephanie Baum, reviewed by Robert Egan
Operators have pumped water to cool the nuclear reactors at the Fukushima Daiichi Nuclear Power Plant (FDNPP) since the accident in 2011, treating this cooling water with the Advanced Liquid Processing System (ALPS), a state-of-the-art purification system that removes radioactive materials, except tritium.
Because tritium becomes part of the water molecule itself, the radionuclide, which has a half-life of 12.32 years, is very costly and difficult to remove.
The ALPS-treated water has been accumulating in storage at the FDNPP site, where there is limited space to hold it.
Therefore, in 2021, the Government of Japan announced a policy that included discharging the ALPS-treated water via an approximately one-kilometer-long tunnel into the ocean. Planned releases of the ALPS-treated water diluted with ocean water began in August 2023 and will be completed by 2050.
In a new numerical modeling study, researchers have revealed that the simulated increase in tritium concentration in the Pacific Ocean due to tritium originating from the ALPS-treated water is about 0.1% or less of the tritium background concentration of 0.03-0.2 Bq/L, both in the vicinity of the discharge site (within 25 km) and beyond.
This is below detection limits (i.e., so small that the difference due to the presence or absence of ALPS-treated water added to the original seawater cannot be measured), far below the WHO international safety standard of 10,000 Bq/L and consistent with the results of tritium concentration monitoring in seawater conducted in conjunction with the discharge of ALPS-treated water.
The results appear in the Marine Pollution Bulletin.
The researchers are from the Institute of Industrial Science, The University of Tokyo, in collaboration with Fukushima University.
"Since the government's announcement in 2021 to discharge the ALPS-treated water, several studies have investigated the radiological impact of ALPS-treated water discharge on tritium concentration in seawater and marine biota, but there were no global ocean simulations with anthropogenic tritium concentration using a realistic discharge scenario and for a period long enough to consider long-term impacts such as global warming," explains lead author of the study Alexandre Cauquoin.
"In our global ocean simulations, we could investigate how ocean circulation changes due to global warming and representation of fine-scale ocean eddies influence the temporal and spatial distribution of tritium originating from these treated-water releases."
Climate change and eddies in the water currents speed up the tritium movement through the ocean. However, the researchers found that the concentrations of tritium from ALPS-treated water discharge remain similar and very low.
"Our simulations show that the anthropogenic tritium from the discharge of ALPS-treated water would have a negligible impact on the tritium concentration in the ocean, both in the short and long term," says Maksym Gusyev from the Institute of Environmental Radioactivity, Fukushima University.
This study may help in building models to understand how tritium as a tritiated water molecule moves through water vapor and ocean water.
Tritium is useful for tracing the dynamics of the water cycle, so climate models that can simulate tritiated water can help studies of precipitation patterns, atmospheric and oceanic circulation, moisture sources, river catchments, and groundwater flow in the future.
Links:
Ocean general circulation model simulations of anthropogenic tritium releases from the Fukushima Daiichi Nuclear Power Plant site, Marine Pollution Bulletin (2025). DOI: 10.1016/j.marpolbul.2025.118294
Oil tankers pass through the Strait of Hormuz, December 21, 2018.
REUTERS/Hamad I Mohammed/File Photo
From Reuters by Gram Slattery and Phil Stewart in Washington; additional reporting by Michelle Nichols at the United Nations and Jonathan Saul in London; editing by Don Durfee and Matthew Lewis
Mines loaded last month, raising fears of blockade
Mining would have severely harmed global commerce
U.S. has not ruled out possibility that loading the mines was a ruse
The Iranian military loaded naval mines onto vessels in the Persian Gulf last month, a move that intensified concerns in Washington that Tehran was gearing up to blockade the Strait of Hormuz following Israel's strikes on sites across Iran, according to two U.S. officials. The previously unreported preparations, which were detected by U.S. intelligence, occurred some time after Israel launched its initial missile attack against Iran on June 13, said the officials, who requested anonymity to discuss sensitive intelligence matters.
The loading of the mines - which have not been deployed in the strait - suggests that Tehran may have been serious about closing one of the world's busiest shipping lanes, a move that would have escalated an already-spiraling conflict and severely hobbled global commerce.
About one-fifth of global oil and gas shipments pass through the Strait of Hormuz and a blockage would likely have spiked world energy prices. Global benchmark oil prices have instead fallen more than 10% since the U.S. strikes on Iran's nuclear facilities, driven in part by relief that the conflict did not trigger significant disruptions in the oil trade.
On June 22, shortly after the U.S. bombed three of Iran's key nuclear sites in a bid to cripple Tehran's nuclear program, Iran's parliament reportedly backed a measure to block the strait. That decision was not binding, and it was up to Iran's Supreme National Security Council to make a final decision on the closure, Iran's Press TV said at the time. Iran has over the years threatened to close the strait but has never followed through on that threat.
Reuters was not able to determine precisely when during the Israel-Iran air war Tehran loaded the mines, which - if deployed - would have effectively stopped ships from moving through the key thoroughfare. It is also unclear if the mines have since been unloaded. The sources did not disclose how the United States determined that the mines had been put on the Iranian vessels, but such intelligence is typically gathered through satellite imagery, clandestine human sources or a combination of both methods.
Iran has in the past threatened to close the Strait of Hormuz but has never followed through on the move, which would restrict trade and impact global oil prices.
But what is the strait and why is it so important for oil?
Asked for comment about Iran's preparations, a White House official said: "Thanks to the President’s brilliant execution of Operation Midnight Hammer, successful campaign against the Houthis, and maximum pressure campaign, the Strait of Hormuz remains open, freedom of navigation has been restored, and Iran has been significantly weakened." The Pentagon did not immediately respond to a request for comment.
The Iranian mission at the United Nations also did not respond to requests for comment.
KEY THOROUGHFARE
The two officials said the U.S. government has not ruled out the possibility that loading the mines was a ruse. The Iranians could have prepared the mines to convince Washington that Tehran was serious about closing the strait, but without intending to do so, the officials said. Iran's military could have also simply been making necessary preparations in the event that Iran's leaders gave the order.
A map showing the Strait of Hormuz and Iran is seen in this illustration created on June 22, 2025. REUTERS/Dado Ruvic/Illustration/File Photo
The Strait of Hormuz lies between Oman and Iran and links the Persian Gulf with the Gulf of Oman to the south and the Arabian Sea beyond. It is 21 miles (34 km) wide at its narrowest point, with the shipping lane just 2 miles wide in either direction. OPEC members Saudi Arabia, the United Arab Emirates, Kuwait and Iraq export most of their crude via the strait, mainly to Asia. Qatar, among the world's biggest liquefied natural gas exporters, sends almost all of its LNG through the strait.
Iran also exports most of its crude through the passage, which in theory limits Tehran's appetite to shut the strait.
But Tehran has nonetheless dedicated significant resources to making sure it can do so if it deems necessary. As of 2019, Iran maintained more than 5,000 naval mines, which could be rapidly deployed with the help of small, high-speed boats, the U.S. Defense Intelligence Agency estimated at the time. The U.S. Fifth Fleet, which is based in Bahrain, is charged with protecting commerce in the region.
The U.S. Navy has typically kept four mine countermeasure vessels, or MCM vessels, in Bahrain, though those ships are being replaced by another type of vessel called a littoral combat ship, or LCS, which also has anti-mine capabilities. All anti-mine ships had been temporarily removed from Bahrain in the days leading up to the U.S. strikes on Iran in anticipation of a potential retaliatory attack on Fifth Fleet headquarters. Ultimately, Iran's immediate retaliation was limited to a missile attack on a U.S. military base in nearby Qatar. U.S. officials, however, have not ruled out further retaliatory measures by Iran.
New analysis of ancient writings suggests that sailors from the Italian hometown of Christopher Columbus knew of America 150 years before its renowned ‘discovery’.
Sailors from the same region as Christopher Columbus may have known about North America more than a century before his 1492 voyage.
A newly analyzed medieval manuscript suggests that sailors from the same region as Christopher Columbus may have known about North America more than a century before his 1492 voyage.
The revelation stems from a little-known text written in the 14th century that challenges long-held views about when Europeans first learned of lands across the Atlantic.
The document, Cronica universalis, was written around 1345 by a Milanese friar named Galvaneus Flamma. Professor Paolo Chiesa, a scholar of Medieval Latin literature, recently brought it to light through a detailed study.
“The reference is astonishing,” Chiesa said, noting that the text appears to describe a part of the North American continent.
In the chronicle, Flamma refers to a distant land called “Marckalada,” a name scholars have connected to “Markland,” which appears in Icelandic sagas.
This term has long been associated with parts of the Atlantic coast such as Newfoundland and Labrador. Chiesa’s interpretation positions this as the earliest Mediterranean mention of the Americas.
Landing of Columbus, oil on canvas by John Vanderlyn, 1846 (CREDIT: Architect of the Capitol)
Trade and travel likely played a role in spreading the story.
According to Chiesa, Genoese sailors—well-traveled and deeply involved in maritime commerce—may have picked up tales of this distant land from their northern counterparts.
These rumors could have been passed through docks and taverns, woven into the everyday conversations of seafarers.
The findings, published in Terrae Incognitae, offer more than just an intriguing footnote.
They contribute to ongoing debates over what Columbus actually hoped to find.
Was his voyage purely exploratory, or was it guided by fragments of preexisting knowledge?
This rediscovered manuscript adds weight to the idea that the Americas weren’t entirely unknown.
These questions are especially relevant today. Each year, Columbus Day draws new scrutiny, with many cities and states instead observing Indigenous Peoples’ Day.
Statues of Columbus have been toppled or defaced, reflecting deeper reassessments of European colonization and its aftermath.
Though the Cronica universalis offers only glimpses, Chiesa believes the fragmentary references are significant.
They suggest that Europe’s awareness of the Western lands may have been more widespread—and far earlier—than most history books admit.
He explains that Genoa, the city where Columbus was born, was a hub of maritime trade, serving as a gateway for news and stories from far-off lands, including Greenland and other northern territories.
Galvaneus Flamma, the author of the document, was well-connected to the ruling family in Milan and wrote extensively on historical subjects.
His writings provide a unique perspective on Milanese history and beyond.
The document itself was unfinished, but it attempted to chronicle the history of the world from creation onward.
In one passage, Galvaneus refers to rumors of lands to the northwest, which were seen as holding potential for commercial gain.
He describes these lands as being “rich in trees” and inhabited by animals, characteristics not unlike those of the Markland mentioned in the Grœnlendinga Saga, a medieval Icelandic text.
Galvaneus’s detailed description of Greenland, coupled with his mention of Marckalada, reflects the knowledge circulating among Genoese sailors at the time.
Miniature from a 14th-century Italian manuscript identified as Triv.1438 by the Archivio Storico Civico e Biblioteca Trivulziana, entitled Cronica de antiquitatibus civitatis mediolanensis, depicting Italian chronicler and friar Galvaneus Flamma. (CREDIT: CC BY-SA 4.0)
“What makes the passage exceptional is its geographical provenance: not the Nordic area, as in the case of the other mentions, but northern Italy,” Chiesa says.
He adds that while many of the rumors Galvaneus recorded were too vague to be included in maps or scholarly texts, this discovery demonstrates how information from Nordic sources may have reached Italy long before Columbus set sail.
The mention of Marckalada suggests that tales of this distant land traveled from the northern harbors, carried by Scottish, British, Danish, and Norwegian sailors trading with Genoese merchants.
Chiesa emphasizes that Cronica universalis is a reliable source, as Galvaneus is careful to note when he is recounting oral stories, often supporting his claims with elements from both legendary and factual accounts.
This level of detail, Chiesa believes, lends credibility to the friar’s account of the Genoese sailors having heard about North America from their northern trading partners.
In Milan, Italy, a 14th-century manuscript was found to contain a mention of America 150 years before Columbus.
(CREDIT: CC BY-SA 4.0)
Though there is no evidence that Italian sailors themselves reached these northern lands, the Genoese were well positioned to gather news and goods from northern Europe and transport them to the Mediterranean.
Chiesa also highlights the advanced geographical knowledge of the north possessed by Genoese and Catalan sailors, as evidenced by their detailed nautical charts from the fourteenth century.
“It has long been noticed that the fourteenth-century portolan charts drawn in Genoa and in Catalonia offer a more advanced geographical representation of the north,” Chiesa explains, indicating that this knowledge likely came from direct contact with northern European traders.
Although the exact extent of Genoese sailors’ understanding of the American continent remains uncertain, the discovery of Cronica universalis opens the door to new interpretations of European exploration in the Middle Ages.
It suggests that Italian merchants and sailors may have been aware of lands beyond Greenland long before Columbus embarked on his voyage.
While the information in Cronica universalis is fragmentary, it provides important context to the notion that knowledge of the Americas may have been circulating in Europe well before Columbus. (CREDIT: Archivos Estatales, mecd.es)
As of now, Cronica universalis remains unpublished, but there are plans for a future edition as part of a scholarly program at the University of Milan.
This forthcoming publication will likely spur further discussion on the global exchanges of knowledge that took place in the centuries leading up to the age of exploration.
5 key facts about the Cronica Universalis:
Authorship and Period: The Cronica Universalis was written around 1345 by Galvaneus Flamma, a Milanese friar and chronicler. It offers a historical perspective on global events and geography during the medieval period.
Geographical Knowledge: The text reflects Galvaneus's knowledge of geography, highlighting early European understanding of distant lands, including Africa, Asia, and even hints of lands across the Atlantic.
Mentions of the New World: It is notable for its early references to regions west of Greenland, possibly alluding to Norse exploration of North America. This predates Columbus's voyages by more than a century.
Integration of Maps and Text: Giovanni combined written accounts with cartographic insights, creating a blend of historical narrative and geographical representation. This approach was advanced for his time, showing an effort to document the known world comprehensively.
Historical Significance: The Cronica Universalis provides valuable insights into medieval European worldviews and early knowledge of lands beyond Europe. It is considered a precursor to later explorations and expansions of global awareness.
Feedback has been a science journalist for more years than we care to remember, and as a result we have come across our fair share of bizarre units of measurement.
The human mind struggles with the very large and the very small, so as a writer it is tempting to say that huge icebergs have an area that is X times the size of Wales, or a mountain is Y times the height of the Burj Khalifa, or a bad book contains Z times more plot holes than Fourth Wing.
In this spirit, Christopher Dionne wrote in to highlight a CNN article about the Blue Ghost lunar lander sending its final message from the moon.
He notes that the writer tries to convey the amount of data the probe sent by saying it “beamed a total of about 120 gigabytes of data — equivalent to more than 24,000 songs — back to Earth”.
“This got me thinking,” says Dionne. Nowadays a lot of music is streamed, so the size of the song files “doesn’t generally matter”.
The size of the files will also vary depending on the compression method, and on a song’s length.
We can surely all agree that All Too Well (10-minute Version) is going to be a slightly larger file than Love Me Do – so you can’t use songs as a standardised unit of dataset size.
Happily, Dionne has come up with a solution.
“Why don’t we use the internationally agreed upon standard of measurement – the blue whale?” The blue whale genome is 2.4 billion bases in length. “So it seems that the Blue Ghost has sent back 50 blue whales of data from the moon.”
Feedback likes the idea, partly because we enjoy the Douglas-Adams-esque image of a torrent of whales hurtling to Earth from the moon. But we are going to quibble with Dionne’s maths. A base in a genome isn’t equivalent to a byte in a dataset. Each byte is eight bits, and it is the bits that are analogous to bases. DNA isn’t binary, either: there are four possible choices (A, C, G or T) for each position in the genome, so each base carries two bits. That means you can encode a byte using half as many bases as bits. So, multiply by 8 and divide by 2, and we think Blue Ghost sent back about 200 blue whales.
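For anyone who wants to check that arithmetic, here is a minimal sketch in Python, using only the figures quoted above and treating a gigabyte as a round billion bytes (the exact convention barely changes the answer):

```python
# Convert a dataset size into "blue whales" of data, treating each
# genome base as two bits of information (four choices: A, C, G or T).

DATA_BYTES = 120e9          # Blue Ghost's downlink: about 120 gigabytes
WHALE_GENOME_BASES = 2.4e9  # blue whale genome length, in bases
BITS_PER_BYTE = 8
BITS_PER_BASE = 2           # log2 of the four possible bases

data_bits = DATA_BYTES * BITS_PER_BYTE
bases_needed = data_bits / BITS_PER_BASE
whales = bases_needed / WHALE_GENOME_BASES

print(f"{whales:.0f} blue whales of data")  # prints: 200 blue whales of data
```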
We encourage readers to submit, as Dionne suggests, “other comparative units of digital measurement… which may be even better at communicating the scale of information”, and we look forward to “a thoughtful discourse around this most pressing issue”.
Goodbye, Alice and Bob
Few things are more likely to kill a joke than the need to meticulously explain it.
So Feedback is a bit nervous about this one, since it involves both a topical event and a cryptographic in-joke.
Let’s start with the cryptography thing, because we think this is the one readers might need a refresher on. When explaining how secure messaging systems work, it has become traditional to refer to the two main agents as “Alice” and “Bob”.
For example: “How can Alice send a secure message to Bob using the Signal messaging app?”
The names have been in use since 1978 and are so widespread they have their own Wikipedia page.
As well as describing the history of this naming device, the page delineates a hugely extended cast of additional characters who can become involved in these thought experiments: from Chad, “a third participant, usually of malicious intent”, all the way to Wendy, “a whistleblower”.
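To see the convention in action, here is a toy sketch in Python. It is emphatically not a real cryptosystem (a one-time pad stands in for anything Signal actually does, and the names and helper are ours), but it shows the standard cast in their usual roles:

```python
# Toy illustration of the Alice-and-Bob convention: Alice encrypts a
# message with a key she shares with Bob; an eavesdropper intercepts
# only unreadable ciphertext. A one-time pad stands in for a real
# protocol; do not use this for anything that matters.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"Meet me at the observatory"
key = secrets.token_bytes(len(message))  # Alice and Bob's shared secret

ciphertext = xor_bytes(message, key)     # what Alice sends (and Eve sees)
recovered = xor_bytes(ciphertext, key)   # what Bob recovers with the key

assert recovered == message
print(ciphertext.hex())  # gibberish without the key
```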
Basically, if you are a regular New Scientist reader, you will probably have read stories that used Alice and Bob (and their friends/enemies/acquaintances/lovers) to explain complicated ideas in cryptography and physics.
You are familiar with this.
Parodies of it are therefore amusing.
We aren’t going to bother naming the relevant news event.
It was widely covered and discussed.
Although, who knows: we are writing this on 27 March, so by the time you read this you might have forgotten about it, because the news moves so fast these days.
Maybe the US has invaded Svalbard in the interim because Donald Trump forgot which Arctic landmass he wanted.
Anyway, here we go. Posting on Bluesky, software developer John VanEnk shared a screenshot of a Wikipedia page.
It read: “Hegseth and Waltz are fictional characters commonly used as placeholders in discussions about cryptographic systems and protocols, and in other science and engineering literature where there are several participants in a thought experiment.
The Hegseth and Waltz characters were created by Jeffrey Goldberg in his 2025 article ‘The Trump Administration Accidentally Texted Me Its War Plans’.
Subsequently, they have become common archetypes in many scientific and engineering fields…”
This was accompanied by a diagram, described as an “example scenario where communication between Hegseth and Waltz is intercepted by Goldberg”.
If, after all that buildup, you didn’t find it funny, Feedback encourages you to send your comments to our Signal account, which we don’t have.
What a lark
Readers Patrick Fenlon and Peter Slessenger both wrote in to highlight the same article in The Guardian, on how migrating birds use quantum mechanics to navigate.
Apparently most “migrate at night and by themselves, so they have no one to follow”, according to a biologist quoted in the article.
Her name is Miriam Liedvogel, which of course is German for “songbird”.
For much of his career, George Tselioudis, a researcher at NASA’s Goddard Institute for Space Studies, has analyzed decades of satellite observations to understand where clouds occur, whether their distribution has changed, and how changes might affect Earth’s energy budget and climate.
The implications of two of his recent studies, he says, are worrying.
Clouds are common on Earth, but they are ephemeral and challenging to study.
Remote sensing has helped scientists tremendously by enabling consistent, global tracking of these elusive features, even over inaccessible areas like the poles and open ocean.
The first study, published in August 2024, showed that Earth’s cloudiest zones over the oceans have shifted and contracted over the past 35 years, allowing more of the Sun’s energy to reach and warm the ocean instead of being reflected back to space by storm clouds.
“The pattern is clear. Where storm clouds form has changed,” Tselioudis said. The implications for the climate, he added, are significant: “This has added a large amount of warming to the system.”
The analysis focused on three major storm zones: a band of thunderstorms near the equator known as the ITCZ (Intertropical Convergence Zone), and two broader mid-latitude zones in the northern and southern hemispheres between roughly 30 and 60 degrees latitude, where comma-shaped storm systems—sometimes called extratropical cyclones—swirl over the oceans.
The illustration at the top of the page depicts where clouds typically form, based on several years (2002-2015) of averaged cloud fraction observations from the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor on NASA’s Aqua satellite.
Stormy areas where the sensor observed clouds more frequently are shown with white; less cloudy areas appear in shades of blue.
[Chart: cloudiness over Earth’s oceans, 1984–2018]
Earth’s storm clouds typically form near the edges of certain large-scale atmospheric circulation features—the Hadley, Mid-latitude (also called Ferrel), and Polar cells, depicted above—where winds converge and air is forced upward.
Clouds are less common and less reflective where dry air descends in the subtropics, near 30 degrees north and south of the equator.
Though tropical cyclones—hurricanes, typhoons, and cyclones—can reach the subtropics, they do so infrequently.
The image below highlights different storm types as seen by the EPIC (Earth Polychromatic Imaging Camera) on the DSCOVR satellite on June 10, 2025.
The new research shows that areas over the ocean where storm clouds often form have contracted by 1.5 to 3 percent per decade.
The ITCZ narrowed, and the mid-latitude storm zones moved poleward as they contracted.
Meanwhile, subtropical zones with fewer clouds expanded.
These changes are depicted in the chart above: white areas show where storm clouds were common, and shades of blue indicate less cloudy areas.
The trend line colors depict the degree of cloudiness.
Areas that were cloudy on 85 percent or more of days are bounded by black dotted lines.
Drier areas that were cloudy on 65 percent or less of days are bounded by white dotted lines.
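As a rough back-of-envelope (our arithmetic, not the study’s, and it assumes the reported per-decade rates held steadily across the 35-year record), the cumulative shrinkage is considerable:

```python
# Cumulative contraction of the storm-cloud zones over the 35-year
# record, compounding the reported per-decade rates. Illustrative
# arithmetic only; the study reports rates, not a total.

decades = 35 / 10

for rate in (0.015, 0.03):  # reported range: 1.5 to 3 percent per decade
    remaining = (1 - rate) ** decades
    print(f"{rate:.1%} per decade -> {1 - remaining:.1%} total contraction")

# -> roughly 5 to 10 percent of the stormy area lost since the mid-1980s
```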
In a second analysis published in May 2025, Tselioudis and colleagues examined the effect of these cloud changes on Earth's energy budget.
They found that the shift increased the amount of energy absorbed by the oceans by about 0.37 watts per square meter per decade—a substantial amount on a planetary scale.
Past analysis of data from NASA’s CERES (Clouds and the Earth’s Radiant Energy System) instruments has shown that since 2005, there has been an increase of 0.47 watts per square meter per decade in the amount of solar energy Earth absorbs compared to the amount that exits as thermal infrared radiation.
This increase is part of a growing energy imbalance that has more than doubled since 2001, heating the oceans and contributing to global warming.
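To put the two trend figures side by side, here is a back-of-envelope comparison of our own (not from either paper; the time periods and definitions differ somewhat, so treat the numbers as indicative, and the surface-area and energy-use figures are standard round values):

```python
# Scale check on the two reported trends. Assumes both fluxes are
# global means, so we scale by Earth's total surface area.

EARTH_SURFACE_M2 = 5.1e14  # roughly 5.1e8 square kilometers

cloud_trend = 0.37  # W/m^2 per decade, from the May 2025 cloud study
ceres_trend = 0.47  # W/m^2 per decade, from the CERES record since 2005

# Extra power the planet absorbs after one decade at the cloud-driven rate:
extra_watts = cloud_trend * EARTH_SURFACE_M2
print(f"{extra_watts:.2e} W")  # ~1.9e14 W, roughly ten times humanity's
                               # total energy use (~1.9e13 W)

# Share of the CERES trend the cloud shift could plausibly account for:
print(f"{cloud_trend / ceres_trend:.0%}")  # ~79%
```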
While the accumulation of greenhouse gases in the atmosphere and changes in sea ice cover explain a portion of that imbalance, pinpointing how much other processes have contributed has been a source of uncertainty and debate among scientists.
Changes to airborne particles, the reflectivity of land surfaces, clouds, and oceans are among several possible factors that may be contributing to the growing energy imbalance.
These new findings suggest that the loss of oceanic storm clouds is a key driver of the imbalance, Tselioudis said.
He describes the loss of the reflective clouds detailed in his papers as a “crucial missing piece” in the puzzle of the 21st century energy imbalance.
He added that this change in clouds is also likely a key reason that ocean and global temperatures in 2023 exceeded scientists’ expectations by so much.
“These findings will provide a key test for the latest generation of climate models to see whether they are getting the right answer for the right reasons,” said Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies.
The big question now, Tselioudis said, is what has caused the reduction in reflective storm clouds and whether the trend will continue.
One possibility, long predicted by climate models, is that different rates of warming in the Arctic compared to the equator in recent decades may be expanding the size of Hadley cells and driving the poleward shift of the storm zones. “We can't prove this yet,” said Tselioudis.
“The system is complicated, and other dynamics might be in play.”