Andrew Christ remembers the day he became part of “this 60-year, weird, wild Cold War story.”
It was 2019, and the University of Vermont researcher was just four days away from defending his dissertation. He was beyond stressed and had better things to do than help examine an ice core sample drilled decades earlier.
The core was subglacial sediment and rock, taken from below a mile of ice in 1966 at Camp Century, an American research base in Greenland that had served as cover for a secret—and failed—military project. Since being pulled from beneath the ice sheet, the sample had been separated from the rest of the core, had criss-crossed the Atlantic, was lost, and then rediscovered. But it had never been analyzed.
“Miraculously, it had stayed frozen all that time,” says Christ. “The first thing we did was melt it.”
Christ and other geology department colleagues were sorting through sediment from the core sample, washing it off before the next stage of analysis, when he noted peculiar black specks floating in the water. He collected a few and put them under the microscope for a better look. “Oh my God, these are plants,” he remembers exclaiming. “I went full-on mad scientist.”
After his initial giddiness, the significance of the specks sank in. Christ, the lead author on a paper published this month in Proceedings of the National Academy of Sciences, had found in the sediment “freeze-dried fossils” and other direct evidence that Greenland was ice-free in the last million years.
The finding is more than an academic curiosity: It has direct implications for our future. “It’s not if Greenland is melting, but how fast,” says Joerg Schaefer, a coauthor and climate geochemist at Columbia University’s Lamont-Doherty Earth Observatory. Together with a sample from central Greenland that he and colleagues analyzed in 2016, he says, the Camp Century material shows that “there is no question: Greenland is an unstable ice sheet.”
For Schaefer, analyzing the Camp Century subglacial sample after it languished for more than half a century is a thrill, even though his team’s results are bad news. “As a scientist, it’s exciting,” he says. “As a citizen of the planet, it’s horrifying.”
Researchers had long thought that Greenland’s ice sheet, more than 2 miles thick in places, was essentially permanent and had blanketed the island for more than 2 million years. The subglacial sample suggests the massive ice sheet can melt far more easily than most models predict; melting it entirely would dump enough water into the oceans to raise sea levels by up to 20 feet, all but wiping major cities like London and Boston off the map.
“This study is very important. It shows the Greenland Ice Sheet can disappear with the kind of climate warming we’re projecting over the next century,” says William Colgan, a climatologist for the Geological Survey of Denmark and Greenland who was not involved in the research.
Earth’s polar regions are warming much faster than the rest of the planet, with most models suggesting a rise of at least 14 degrees Fahrenheit (nearly 8 degrees Celsius) in the next century. Together with the 2016 analysis, the new Camp Century paper shows that such a temperature bump is enough to melt the ice sheet and cause catastrophic sea rise. “The Greenland Ice Sheet can disappear,” says Colgan. “It is remarkably climate-sensitive.”
The Camp Century sample’s role in rethinking the impact of climate change is just the latest twist in its strange history. In 1959, the American army set up Camp Century in northwestern Greenland, ostensibly for scientific research. The site’s true purpose, however, was Project Iceworm: a secret Cold War plan to build hundreds of miles of tunnels about 25 feet into the ice to store nuclear missiles within striking range of the Soviet Union.
The secret military plan never happened—engineers quickly learned how rapidly and unpredictably the ice can shift, making the site highly unstable and wholly unsuitable for nuclear weapons. Colgan, the project manager for the Camp Century Climate Monitoring Program, is one of a handful of people who have been to the site of the former Army installation, now buried under more than 100 feet of accumulated snow and ice. “The tunnels are collapsed and compressed,” he says. “The snow has turned to ice with pancakes of debris.”
Camp Century was abandoned in 1967, just a year after its engineers managed a true scientific feat: drilling the first ice cores. Together with more recent cores from Antarctica and elsewhere in Greenland, these slim cylinders of ice provide a crucial record of ancient climate conditions that researchers have since used both to understand our past and model our future. Colgan says Camp Century has been invaluable for science, now more than ever.
“Camp Century was the first ice core program, and we’re still learning from it,” Colgan says, adding that the Cold War–era team probably realized the site’s unsuitability as a missile base very early in their work, but persevered in the name of science. The subglacial sample, he says, “only exists because they wouldn’t take no for an answer. They punched all the way into the bedrock and even then kept going.”
Some of the mile-long Camp Century ice core had been previously studied. After being collected in 1966, however, the subglacial core sample—about 12 feet of frozen mud and bedrock from below the ice—was stored in an Army lab freezer, then at the University at Buffalo. The sample was eventually sent to Denmark, where it languished yet again, at the University of Copenhagen’s ice core archive.
In 2017, as staff prepared to upgrade the facility, someone noticed unopened boxes of Camp Century core samples. Inside, rather than the slim cylinders typical of ice cores, they found glass jars of subglacial rock and clumps of frozen sediment. Almost immediately, the find became a sensation in the field. Getting a comparable subglacial sample today using modern drilling technology would have been prohibitively expensive.
“We knew how important these samples would be. All of us started shaking and even drooling a bit,” says Schaefer. As word of the samples spread, he flew to Copenhagen with University of Vermont geologist Paul Bierman in hopes of negotiating for some of the material. “We were trying not to let them see how excited we were. We just tried to keep it together.”
Subglacial material, collected from where the drill hit sediment and bedrock below the ice sheet, contains information the ice does not. Exposed rock, like everything else on Earth’s surface, gets bombarded with cosmic rays, producing chemical signatures, called cosmogenic nuclides, that can be used to establish whether, and when, an area was ice-free. “The nuclides are only produced if the rock sees open sky,” Schaefer says. The work of dating the material is “really, really hard,” says Colgan, but the Camp Century sample has been initially dated, with confidence, as less than a million years old, lining up with the previously studied sample from central Greenland.
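The logic of cosmogenic-nuclide dating can be sketched with a simple decay model. The calculation below is purely illustrative, not the team’s actual method: it assumes rounded values for beryllium-10 (¹⁰Be), a nuclide commonly used in exposure dating, and ignores real-world complications like erosion and altitude corrections.

```python
import math

# Assumed, rounded values for illustration (not from the study):
HALF_LIFE_YR = 1.39e6                 # approximate half-life of 10Be, in years
LAMBDA = math.log(2) / HALF_LIFE_YR   # decay constant (1/yr)
PRODUCTION = 4.0                      # atoms per gram of quartz per year at the surface

def apparent_exposure_age(concentration):
    """Invert N = (P / lambda) * (1 - exp(-lambda * t)) for t.

    `concentration` is the measured nuclide count (atoms/g). A rock
    shielded by ice accumulates no new nuclides, so this gives an
    estimate of how long the site 'saw open sky'.
    """
    ratio = LAMBDA * concentration / PRODUCTION
    if ratio >= 1.0:
        raise ValueError("concentration at or above saturation")
    return -math.log(1.0 - ratio) / LAMBDA

# A sample that accumulated ~390,000 atoms/g implies roughly
# 100,000 years of ice-free exposure under these assumptions.
age = apparent_exposure_age(3.9e5)
```

Because burial under ice also lets existing nuclides decay away, real studies combine two nuclides with different half-lives to untangle exposure from burial, which is part of why the dating is “really, really hard.”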
Christ, Schaefer, and their colleagues continue to analyze the Camp Century material to narrow its age range and learn more about the plant material it preserved, which is unique, since massive ice deposits usually destroy organic material. The next phase of research, already underway, includes searching for traces of DNA that could be used to determine the species present, and even reconstruct the entire ecosystem. So far it appears similar to modern Arctic tundra.
There’s yet more to the Camp Century core to explore. The very bottom layers of the sample include sediment that may be up to 3 million years old, Christ says, and may include more organic matter that could be “the oldest material ever recovered from under the ice.”
Camp Century may never have hosted nuclear weapons, but it is proving to be far more significant than even its planners imagined.
The blue lines indicate all locations that are considered mapped by GEBCO and are included in the Seabed 2030 initiative. Note that GEBCO’s definition of “mapped” requires only 100 m resolution, which is far coarser than navigation chart standards. (image credit: Seabed 2030)
For many owners and guests, the allure of yachting is the adventure of cruising through pristine waters and voyaging to some of the most exclusive locations in the world. These unique experiences stem from a fundamental urge to explore our world and see its beauty. To be immersed in unspoiled beauty means venturing into the unknown. According to Seabed 2030 (a global initiative by the Nippon Foundation and GEBCO to map the world’s oceans by 2030 and make the data available to all), we’ve mapped less than 24% of the world’s sea floor. As the initiative puts it, “We know the topography of the Moon and Mars in greater detail than that of our own planet.”
The most fundamental requirement for safety when sailing the oceans is knowing how deep the water is. Mapping the seafloor’s bathymetry is critical both to safety and to scientific endeavors to understand ocean circulation, tides, tsunami forecasting, fishing resources, sediment transport, and environmental changes. It’s also important for commercial endeavors such as infrastructure construction, cable laying, and pipeline routing.
Of course, knowing what’s underwater ahead of your vessel is also paramount to safe navigation. Even in well-charted areas, you might think that everything you need to know is already on your nautical charts. If so, you would be mistaken. The US has some of the best charts in the world, yet according to the National Oceanic and Atmospheric Administration (NOAA), “about half of the depth information found on NOAA charts is based on hydrographic surveys conducted before 1940” and “in too many cases, the data is more than 150 years old. Sometimes, particularly in Alaska, the depth measurements are so old that they may have originated from Captain Cook in 1778.”
Take a moment to think about the reliability of the chart data when you’re navigating in the “exotic” locations your guests’ itineraries demand.
Fortunately, yachts have a wide range of navigation sensors they can use, in conjunction with their charts, to help them navigate such waters more safely. While navigating in these locations, yachts can also be part of the solution: through a worldwide crowdsourcing initiative, they have the opportunity to contribute to the global community by recording their depth and position observations along the way.
Jennifer Jencks, Director of the IHO Data Center for Digital Bathymetry (DCDB) at NOAA’s National Centers for Environmental Information, is the Chair of the IHO’s Crowdsourced Bathymetry Working Group (CSBWG). She shares that “measurements collected by the yachting community during the course of their normal operations are a valuable contribution to the IHO’s crowdsourced bathymetry efforts. Contributions to the IHO’s Data Centre for Digital Bathymetry are made available for public use and are included in the Seabed 2030 initiative.”
Participation in these types of initiatives is an easy way for yachts to contribute to the wider global community while operating in their “typical” manner. One example of such industry participation is highlighted by a recent collaboration between the IHO and the Yacht Club de Monaco. Participating vessels are outfitted with a simple NMEA 2000 data logger which records the single beam echosounder measurements.
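As a rough illustration of what such a logger captures, the sketch below parses the older, text-based NMEA 0183 `DPT` (depth) sentence. NMEA 2000 itself is a binary CAN-bus protocol, but the recorded content—depth below the transducer plus a transducer offset—is analogous. This is a hypothetical example, not the actual software used by FarSounder or the Yacht Club de Monaco.

```python
def nmea_checksum(payload):
    """XOR of all characters between '$' and '*', rendered as two hex digits."""
    value = 0
    for ch in payload:
        value ^= ord(ch)
    return f"{value:02X}"

def parse_dpt(sentence):
    """Parse an NMEA 0183 DPT sentence, e.g. '$SDDPT,15.2,0.5*64'.

    Returns (depth_below_transducer_m, transducer_offset_m).
    Raises ValueError if a checksum is present and does not match.
    """
    body, _, checksum = sentence.lstrip("$").partition("*")
    if checksum and nmea_checksum(body) != checksum.upper():
        raise ValueError("bad checksum")
    fields = body.split(",")
    depth = float(fields[1])
    offset = float(fields[2]) if fields[2] else 0.0
    return depth, offset

depth, offset = parse_dpt("$SDDPT,15.2,0.5*64")
# Pairing each depth reading with a GPS position and timestamp yields
# exactly the kind of crowdsourced sounding the DCDB collects.
```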
Yet another example is collecting recordings from FarSounder’s Argos 3D Forward Looking Navigation sonars. More advanced sensors such as these are able to record more sophisticated information and can measure a wider swath of depths as the vessel travels as compared to a standard single beam echosounder.
FarSounder’s Argos series sonars are designed primarily as real-time, forward-looking sensors for obstacle avoidance. These navigation systems can provide a 3D image of the water and bottom ahead of the vessel out to a 1,000 m range. However, the sonars also include a Local History Mapping feature, which builds a map of the bathymetry everywhere the yacht transits. The size of this map is limited only by the hard drive space available on the bridge computer.
FarSounder’s display software includes both a 3D view of the sonar data and a chart view with sonar, AIS, and ARPA data as overlays on standard S57/S63 format charts. Real-time forward-looking data is available inside the sonar’s field of view (i.e., the “pie wedge”), while Local History Mapping data is stored indefinitely and displayed anywhere the vessel has previously transited.
The standard configuration of FarSounder’s sonars is a standalone system that keeps all the data within the software. However, FarSounder customers can choose to participate in the company’s Expedition Sourced Ocean Data Collection Program. Participants are sent a USB hard drive that records all the raw data received by their system; when the drive is full, it is sent back to FarSounder for compilation.
“Observations collected and contributed by the yachting community can provide depth measurements from locations often not covered by formal surveys,” says Dr. Mathias Jonas, Secretary General of the IHO, “These unique contributions play a key role in our crowdsourcing efforts and can help increase our knowledge of what is happening below the surface.”
FarSounder is an official trusted node for the DCDB, and all contributions to the IHO’s database are available for public use. Through the DCDB, the data is also shared with Seabed 2030. The program is opt-in for select current customers and is focused on those traveling to exotic locations (though data from any location is of value to Seabed 2030).
Jamie McMichael-Phillips, Director of Seabed 2030, echoes the sentiment of Dr. Jonas, “Many yacht captains don’t realize that they have an opportunity to provide real value to the global community without making changes to their itineraries and daily activities. For those vessels where privacy is a concern, their contributions can even be anonymized before submission through the IHO’s network of Trusted Nodes.”
Crowdsourced data has many known uses and new applications for the data are being developed by engineers, scientists and hobbyists around the world. One example is using crowdsourced data from trusted sources to help fill in the gaps in traditional hydrographic surveys. The Canadian Hydrographic Service (CHS) recently produced an update to chart 7053 using data collected by a vessel equipped with FarSounder sonar and made available through the DCDB. In this case, the CHS had no survey data from this part of the Northwest Passage. Using the customer submitted recordings and metadata about the vessel, the CHS was even able to assess the quality and reliability of the measurements.
The Canadian Hydrographic Service does not have comprehensive official surveys in much of the Northwest Passage. The inset shows the location of Chart 7053 which incorporates depth measurements from trusted community contributions including some from a FarSounder customer. (image credit: Fisheries and Oceans Canada)
In another example, five vessels operating off the coast of Antarctica this past season participated in the FarSounder program. When those drives are returned, the hope is that recordings of multiple voyages over similar locations will allow the generation of a large surveyed area. There are plans to repeat this effort in subsequent years, not only expanding the coverage of the surveyed area but also producing information about the seafloor as it changes over time. Such observations of the Antarctic seafloor have never previously been collected and could offer a unique perspective for scientists studying climate change and the polar ice caps.
Heath Henley, PhD, Senior Application Engineer at FarSounder, notes that “yachts often operate in locations that are outside the standard, commercial shipping routes and can offer access to scientific observations which may otherwise not be made. It would be a shame to waste such opportunities, especially when they can be achieved with no significant cost while the vessel operates normally. We’re proud that our customers are able to contribute in this way.”
Through participation in crowdsourcing activities, the yachting industry has an opportunity to provide unique and valuable contributions to the global community and expand the limits of our understanding of our world. FarSounder is proud to do its part in connecting its yacht customers with the Seabed 2030 project. Keeping the oceans safe is a goal shared by all, and the company is pleased to have a partnership in place to provide Seabed 2030 with bathymetric data.
A self-powered robot inspired by a fish can survive the extreme pressures at the bottom of the ocean’s deepest trench, thanks to its soft body and distributed electronic system — and might enable exploration of the uncharted ocean.
Writing in Nature, Li et al. report a robot made from soft materials that can brave the unexplored depths of the sea. Remarkably, the authors demonstrate that their robot can operate in the Mariana Trench, the deepest part of the ocean. Conventional underwater vehicles require watertight enclosures made of metallic materials to withstand the high pressures of the deep ocean — the thickness and dimensions of these enclosures have to be increased to cope with greater depths. But in Li and colleagues’ robot, the delicate electronic components are embedded and distributed in soft silicone, a design that removes the need for pressure-resistant cases.
Largely inspired by living organisms, the field of soft robotics involves making robots from pliable materials. Polymers such as silicone are often used, as well as highly deformable structures such as braids and textiles. Soft robots are intrinsically safer than their conventional rigid counterparts in interactions with humans, and their pliability can boost many capabilities — such as their dexterity when manipulating objects, and their ability to squeeze into tight spaces or to travel across uneven surfaces. Marine species such as squid and octopuses were among the original inspirations for soft-robotics research2, but soft robotics, in turn, offers a new approach for tackling marine applications of robots. Li and colleagues’ work is a powerful demonstration of this.
The authors’ robot is designed to have a fish-like body shape and two flapping side fins (Fig. 1). The authors used a well-established mechanism to drive flapping. The fins are attached to ‘muscles’ on the robot’s body; these are made of a soft material that converts electrical energy into mechanical work — when an electric current from the robot’s battery is applied to the muscles, they contract. Tiny solid structures mechanically connect the contracting muscles to the fins, making them flap.
Figure 1 | Designed for the deep.
Li et al.1 have developed a robot made from soft materials that is designed to withstand the extreme pressures of the deep ocean. The robot has a fish-like shape consisting of an elastic frame to which two thin flapping side fins are attached; the fins have leading edges made from a stiffer material. ‘Muscles’ on the frame are made of materials that convert electrical energy into mechanical work, and are attached to the fins (attachment structures not shown). When an electric current from the robot’s battery is applied to the muscles, they contract. The electronic components of the robot and the battery are embedded in the central silicone body; their distributed arrangement in the silicone protects them from high pressures.
One of the challenges faced by Li and co-workers was finding a way to protect the robot’s electronic components from high pressures. Taking inspiration from the bones in the skull of the hadal snailfish (Pseudoliparis swirei), the authors spaced the electronic components apart, rather than packing them together as is typically done in electronic devices. Laboratory tests and simulations demonstrated that this arrangement reduces the stress at the interfaces between components under pressure. The distributed electronics were then embedded in silicone for incorporation into the robot. This approach is more practical, and cheaper, than other methods for protecting the electronics in deep-sea devices.
Li et al. first tested the swimming ability of the robot in the laboratory, in a pressurized water chamber — the robot was connected to a pole, which it swam around in a circle. The machine was then tested in a lake at a depth of 70 metres, where it swam freely at a speed of 3.16 centimetres per second, and then in the South China Sea at a depth of about 3,200 m. It reached a speed of 5.19 cm s–1 (equivalent to 0.45 body lengths per second), which is in line with the capabilities of other soft robots3. Finally, the flapping movement and pressure resistance of the robot were tested in the Mariana Trench, where it was connected to a conventional underwater robot for support, which also took images of the test.
Several previous attempts have been made to develop soft robots for applications underwater — a realm in which it is challenging for robots to interact delicately with objects, because robotic sensors don’t work well in this environment. Soft robotic grippers4 offer substantial advantages over rigid grasping devices when collecting and handling delicate sea organisms for study by marine biologists. And bio-inspired soft robotic fishes5 can swim among other animals without distressing them, thereby allowing close-up study. Li and co-workers’ research now pushes the boundaries of what can be achieved: the replacement of rigid protective enclosures for electronic components by distributed electronics embedded in a soft material paves the way to a new generation of deep-sea explorers.
There is, however, more work to do before the ocean can be populated with robots of this design. Li and co-workers’ machine is slower than previously reported underwater robots6, and cannot withstand sizeable disturbances — it could easily be swept away by underwater currents. Its locomotor capabilities will also need to be optimized for practical applications. Nevertheless, Li and colleagues’ approach lays the foundations for future generations of resilient and reliable deep-sea explorers.
In the long term, one can predict avenues of research being opened up for marine biology, in which soft robots safely navigate coral reefs or underwater caves, to collect delicate specimens without damaging them. Swarms of underwater soft robots, with the ability to crawl on the seabed, anchor themselves on to specific structures, or swim over particular areas, could contribute to the development of technologies for various other applications. These might include monitoring the ocean, cleaning up and preventing sea pollution or preserving marine biodiversity. More fundamentally, they could help researchers to explore the vast uncharted depths of the oceans.
Hurricane Ian left an extraordinarily broad path of destruction across much of South Florida. That was evident in reports from the ground, but it also shows up in satellite data. Using a new method, our team of spatial and environmental analysts was able to quickly provide a rare big picture view of damage across the entire state.
Satellite images and artificial intelligence reveal Hurricane Ian’s widespread damage. The dark areas have a high probability of damage. Su Ye
By using satellite images from before the storm and real-time images from four satellite sensors, together with artificial intelligence, we created a disaster monitoring system that can map damage in 30-meter resolution and continuously update the data.
It’s a snapshot of what faster, more targeted disaster monitoring can look like in the future – and something that could eventually be deployed nationwide.
How artificial intelligence spots the damage
Satellites are already used to identify high-risk areas for floods, wildfires, landslides and other disasters, and to pinpoint the damage after these disasters. But most satellite-based disaster management approaches rely on visually assessing the latest images, one neighborhood at a time.
Our technique automatically compares pre-storm images with current satellite images to spot anomalies quickly over large areas. Those anomalies might be sand or water where that sand or water shouldn’t be, or heavily damaged roofs that don’t match their pre-storm appearance. Each area with a significant anomaly is flagged in yellow.
Damage detected in the same area of Matlacha as in the photo. Su Ye
Five days after Ian lashed Florida, the map showed yellow alert polygons all over South Florida. We found that it could spot patches of damage with about 84% accuracy.
A natural disaster like a hurricane or tornado often leaves behind large areas of spectral change at the surface, meaning changes in how light reflects off whatever is there, such as houses, ground or water. Our algorithm compares the reflectance in models based on pre-storm images with reflectance after the storm.
Punta Gorda, Florida, was hit by storm surge and high winds from Hurricane Ian.
Damage in the same part of Punta Gorda shown in the photo. Su Ye
The system spots both changes in physical properties of natural areas, such as changes in wetness or brightness, and the overall intensity of the change. An increase in brightness often is related to exposed sand or bare land due to hurricane damage.
Using a machine-learning model, we can use those images to predict disturbance probabilities, which measure the influence of natural disasters on land surfaces. This approach allows us to automate disaster mapping and provide full coverage of an entire state as soon as the satellite data is released.
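A minimal version of this kind of pre/post comparison can be sketched as per-pixel spectral change detection. This is a toy example with made-up reflectance bands and a simple statistical threshold, not the team’s trained model: it flags pixels whose spectral distance from the pre-storm baseline is anomalously large for the scene.

```python
import numpy as np

def flag_anomalies(pre, post, z_threshold=3.0):
    """Flag pixels whose spectral change is anomalously large.

    pre, post: float arrays of shape (rows, cols, bands) holding
    surface reflectance before and after the storm.
    Returns a boolean (rows, cols) mask of candidate damage pixels.
    """
    # Per-pixel magnitude of spectral change across all bands.
    change = np.linalg.norm(post - pre, axis=-1)
    # Standardize against the whole scene, then threshold: most of the
    # scene is assumed undamaged, so large z-scores mark likely disturbance.
    z = (change - change.mean()) / change.std()
    return z > z_threshold

# Toy scene: 8x8 pixels, 3 bands, nearly unchanged except one pixel
# that brightened sharply (e.g. exposed sand where a roof once was).
rng = np.random.default_rng(0)
pre = rng.uniform(0.1, 0.3, size=(8, 8, 3))
post = pre + rng.normal(0.0, 0.005, size=pre.shape)
post[2, 5] += 0.4
mask = flag_anomalies(pre, post)
```

A production system replaces the fixed threshold with a model trained to output a disturbance probability per pixel, but the core idea — comparing post-storm reflectance against a pre-storm baseline — is the same.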
Extreme storms with destructive flooding have been documented with increasing frequency over large parts of the globe in recent years.
While disaster response teams can rely on airplane surveillance and drones to pinpoint damage in small areas, it’s much harder to see the big picture in a widespread disaster like a hurricane or other tropical cyclone, and time is of the essence. Our system provides a fast way to see the big picture using free government-produced images. One current drawback is the timing of those images, which often aren’t released publicly until a few days after the disaster.
We are now working on developing near real-time monitoring of the whole conterminous United States to quickly provide the most up-to-date land information for the next natural disaster.
Poised to be the largest boat ever built, the ship is also, apparently, a floating city
Among many things that weren’t on our Bingo cards for 2022, a giant turtle-shaped yacht—which is also being described as a floating city—is certainly at the top of the list. And yet, here we are.
Pangeos, a turtle-shaped yacht, cruising through the ocean.
Photo: All photos courtesy of Lazzarini
As CNN reported, an Italian design firm, Lazzarini Design Studio, has drafted plans for a massive boat shaped like an oversized tortoise with outstretched flippers. The vessel is so big that it’s not even being called a superyacht, but rather designers have described it as a terayacht. As such, it was given an appropriately big name, Pangeos, a nod to Pangea, the ancient supercontinent that once incorporated almost all of Earth’s landmasses. According to Lazzarini, the ship will be big enough to hold various hotels, apartments, condos, shopping centers, parks, and other amenities for up to 60,000 guests. The yacht will even have ship and aircraft ports so guests can visit when Pangeos isn’t docked near land. Which is helpful since, as of now, the company doesn’t foresee any strict itinerary, but rather expects the turtle to simply coast around the world.
Many of the amenities and homes are designed around a central town square.
Extending 1,800 feet in length and measuring 2,000 feet at its widest point, the boat is poised to be the largest floating structure ever built. That is, of course, if it does get built, which is a whole other obstacle in and of itself. As the Pangeos website explains, “the terayacht needs a terashipyard,” which doesn’t currently exist. The project requires a yard that is about 200 miles wide and 180 miles long, which also means dredging about 0.4 square miles of the sea with a circular dam. Once the yacht is constructed, the dam can open, flooding the area and essentially launching the vessel. Lazzarini has currently suggested the coast of Saudi Arabia as the ideal location for construction.
Since the goal is typically to minimize drag, boats rarely stray from the standard V shape—and are even less frequently formed like marine animals—but Pangeos’s flippers are more than just a peculiarity. According to the company, the extremities will capture kinetic energy from the waves, allowing the boat to cruise perpetually without emissions. The rooftop area would also be equipped with solar panels for additional power, should it be necessary, and electric engines are also included in the design.
Within the terraced villa, guests can find houses, buildings, and rooftop terraces, with an upper zone that offers landing of various flying vehicles.
According to the firm, the total cost to build the massive vessel will be nothing short of $8 billion. Construction will also take, at minimum, eight years, and Lazzarini doesn’t expect it could even begin until 2025. With a hefty price tag and a long wait, the company has devised a plan to alleviate both. Designed as an NFT crowdfunding project, the firm is selling “Unreal Estate,” allowing interested buyers to purchase virtual boarding tickets, hotel rooms, and even houses for a metaverse version of the yacht, which the company says will be ready for (virtual) boarding by 2023.