Saturday, November 20, 2021

NVIDIA to build Earth-2 supercomputer to see our future

Climate change is arguably the greatest threat facing humanity today.
Accurately predicting climate change is critical to plan for its disastrous impacts well in advance and to adapt to sea level rise, ecosystem shifts, and food and water security needs.
The Fourier Neural Operator (FNO) -- a novel AI model -- learns complex physical systems accurately and efficiently.
Here we see the FNO emulate a high-resolution Earth dataset, ERA5, and predict the behavior of extreme weather events across the globe days in advance in just 0.25 seconds on NVIDIA GPUs.
At 100,000 times faster than traditional numerical weather models, this is a significant step towards building digital twin Earth.
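The core trick inside an FNO layer is a convolution carried out in Fourier space: transform the input field, keep only a handful of low-frequency modes, multiply them by learned complex weights, and transform back. A minimal NumPy sketch of that spectral convolution in one dimension (the grid size, mode count, and identity weights here are illustrative toys, not the published model's):

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """Fourier layer core: FFT -> truncate to low modes -> weight -> inverse FFT."""
    u_hat = np.fft.rfft(u)                         # field in Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = u_hat[:n_modes] * weights  # keep and weight low frequencies
    return np.fft.irfft(out_hat, n=u.size)         # back to physical space

# Toy field on 64 grid points; identity weights make this a pure low-pass filter
rng = np.random.default_rng(0)
u = rng.standard_normal(64)
w = np.ones(16, dtype=complex)  # learned in a real FNO; identity here
v = spectral_conv_1d(u, w, n_modes=16)
```

Because the learned operator acts on frequencies rather than grid points, a trained FNO can be evaluated at resolutions other than the one it was trained on, which is part of why it emulates datasets like ERA5 so cheaply.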

The earth is warming.
The past seven years are on track to be the seven warmest on record.
The emissions of greenhouse gases from human activities are responsible for approximately 1.1°C of average warming since the period 1850-1900.
What we’re experiencing is very different from the global average.
We experience extreme weather — historic droughts, unprecedented heatwaves, intense hurricanes, violent storms and catastrophic floods.
Climate disasters are the new norm.
We need to confront climate change now.
Yet, we won’t feel the impact of our efforts for decades.
It’s hard to mobilize action for something so far in the future. But we must know our future today — see it and feel it — so we can act with urgency.
To make our future a reality today, simulation is the answer.
"The hardware needed to “see” just how bad it’s gonna be." @AnthonyDiMare
To develop the best strategies for mitigation and adaptation, we need climate models that can predict the climate in different regions of the globe over decades.
Unlike predicting the weather, which primarily models atmospheric physics, climate models are multidecade simulations that model the physics, chemistry and biology of the atmosphere, waters, ice, land and human activities.
Climate simulations are configured today at 10- to 100-kilometer resolutions.
But greater resolution is needed to model changes in the global water cycle — water movement from the ocean, sea ice, land surface and groundwater through the atmosphere and clouds.
Changes in this system lead to intensifying storms and droughts.
Meter-scale resolution is needed to simulate clouds that reflect sunlight back to space.
Scientists estimate that these resolutions will demand millions to billions of times more computing power than what’s currently available.
It would take decades to achieve that through the ordinary course of computing advances, which accelerate 10x every five years.
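The "decades" estimate falls straight out of the stated scaling rate: at 10x every five years, the wait for a given speedup is just a logarithm.

```python
import math

def years_to_speedup(factor, rate=10.0, period_years=5.0):
    """Years of ordinary computing advances (rate-x per period) to reach `factor`."""
    return period_years * math.log10(factor) / math.log10(rate)

print(years_to_speedup(1e6))  # 30.0 -> three decades for a million-x
print(years_to_speedup(1e9))  # 45.0 -> four and a half decades for a billion-x
```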
For the first time, we have the technology to do ultra-high-resolution climate modeling, to jump to lightspeed and predict changes in regional extreme weather decades out.
We can achieve million-x speedups by combining three technologies: GPU-accelerated computing; deep learning and breakthroughs in physics-informed neural networks; and AI supercomputers, along with vast quantities of observed and model data to learn from.
And with super-resolution techniques, we may have within our grasp the billion-x leap needed to do ultra-high-resolution climate modeling.
Countries, cities and towns can get early warnings to adapt and make infrastructures more resilient.
And with more accurate predictions, people and nations will act with more urgency.
So, we will dedicate ourselves and our significant resources, directing NVIDIA’s scale and expertise in computational sciences to join with the world’s climate science community.
NVIDIA this week revealed plans to build the world’s most powerful AI supercomputer dedicated to predicting climate change.
Named Earth-2, or E-2, the system would create a digital twin of Earth in Omniverse.
All the technologies we’ve invented up to this moment are needed to make Earth-2 possible.
I can’t imagine a greater or more important use.

Friday, November 19, 2021

Croatia (HHI) layer update in the GeoGarage platform

58 new rasterised ENCs added
No, this isn't a fingerprint, it's the little uninhabited island of Baljenac!
This aerial shot by the Daily Overview shows off the roughly 14-mile network of low stone walls that line this little corner of the Adriatic Sea.
They were built by residents of a nearby island to separate crop fields and vineyards.
Source Imagery: @Maxartechnologies

Localization of island of Baljenac on the GeoGarage platform

NOAA Ocean Exploration meets major mapping milestone on NOAA Ship Okeanos Explorer

Image courtesy of NOAA Ocean Exploration.

 From NOAA
Two million square kilometers.
Or 772,204 square miles.
That’s more than one quarter the size of the contiguous United States.
And it’s the area of seafloor mapped by NOAA Ocean Exploration using the modern, high-resolution multibeam sonar system aboard NOAA Ship Okeanos Explorer since the ship was commissioned in 2008.
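The conversion behind those figures is simple arithmetic; the contiguous-US land area used below, roughly 7.65 million square kilometers, is an assumed round figure for illustration:

```python
KM2_TO_MI2 = 0.386102    # square miles per square kilometer
mapped_km2 = 2_000_000   # seafloor mapped by Okeanos Explorer since 2008
conus_km2 = 7_650_000    # assumed land area of the contiguous United States

mapped_mi2 = mapped_km2 * KM2_TO_MI2
print(round(mapped_mi2))                  # 772204 square miles
print(round(mapped_km2 / conus_km2, 2))   # 0.26 -> more than one quarter
```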

From 2008 through early November 2021, NOAA Ocean Exploration mapped 2 million square kilometers (772,204 square miles) of seafloor aboard NOAA Ship Okeanos Explorer. Okeanos Explorer is equipped with state-of-the-art multibeam sonar systems that use beams of sound to map the ocean floor.
This map shows the cumulative multibeam mapping coverage.
The gray lines indicate the boundaries of the U.S. Exclusive Economic Zone.
Image courtesy of NOAA Ocean Exploration.

Figure 1: During the 2010 Indonesia-USA Deep-Sea Exploration of the Sangihe Talaud Region, NOAA Ocean Exploration mapped Kawio Barat, a volcano in the Celebes Sea of Indonesia, in detail for the first time. The conical volcano is over 3,000 meters (9,843 feet) tall and is the site of black smoker hydrothermal vents and chemosynthetic communities.
Image courtesy of NOAA Ocean Exploration.

Figure 2: Three NOAA Ocean Exploration mapping expeditions in 2016 focused on mapping the numerous seamounts in the U.S. Exclusive Economic Zone around Wake Island in the Pacific Ocean. Each time we learned something new about their height, size, and type.
The “wireframe” in this image shows the predicted bathymetry of the seamounts based on satellite data. After we mapped them with the modern sonar system on NOAA Ship Okeanos Explorer, we found many to be flat topped guyots with tops hundreds of meters higher or lower than predicted.
Image courtesy of NOAA Ocean Exploration. 

Figure 3: Three NOAA Ocean Exploration mapping expeditions focused on mapping the numerous seamounts in the U.S. Exclusive Economic Zone around Wake Island in the Pacific Ocean.
One seamount, seen here, was found to be a guyot larger than the state of Rhode Island.
Image courtesy of NOAA Ocean Exploration.  

Figure 4: During the Atlantic Canyons Undersea Mapping Expeditions campaign, NOAA Ocean Exploration and partners focused on canyons and landslides, mapping every major submarine canyon from North Carolina to the U.S.-Canada maritime border in high resolution.
Image courtesy of NOAA Ocean Exploration.

Figure 5: While exploring the waters off Puerto Rico during Océano Profundo in 2015, NOAA Ocean Exploration mapped a section of the tilted carbonate platform, which includes several major canyons, north of the island.
Our modern high-resolution bathymetry is seen here overlaid on satellite data of the territory and the surrounding waters. Image courtesy of NOAA Ocean Exploration.

Figure 6: In 2012, NOAA Ocean Exploration mapped these salt domes in the Gulf of Mexico.
Salt domes are formed when mounds or columns of salt beneath the seafloor push rocks and sediments above them upward into hill-like structures that rise from the seafloor.
Salt domes have been associated with oil and gas seeps as well as a variety of marine life, including chemosynthetic communities, making them important features to locate and document.
Image courtesy of NOAA Ocean Exploration.
Why We Map

There’s so much to thank the ocean for, from the air that we breathe to the food on our plates to the climate that makes our planet habitable.
Nevertheless, so much of our ocean, the deep ocean in particular, remains unexplored.

Exploration leads to discovery, and the first step in exploration is mapping.
Seafloor mapping provides a sense of the geological features and animals of our deep ocean and sets the stage for discoveries to come.
But, it’s not just about the thrill of discovery.
Deep-ocean seafloor mapping has many benefits.
Broadly, it provides insight into geological, physical, and even biological and climatological processes, helping us better understand, manage, and protect critical ocean ecosystems, species, and services for the benefit of all. In addition, it supports navigation, national security, hazard detection (e.g., earthquakes, submarine landslides, and tsunamis), telecommunications, offshore energy, and more.

NOAA Ocean Exploration’s Role

For years, NOAA has been leading efforts to complete seafloor mapping in U.S. waters, and NOAA Ocean Exploration has been leading these efforts in deep waters (below 200 meters, 656 feet).
Now, all of this work is being done in support of the National Strategy for Mapping, Exploring, and Characterizing the United States Exclusive Economic Zone (national strategy), which calls for completing mapping of the U.S. Exclusive Economic Zone by 2040 (waters deeper than 40 meters (131 feet) by 2030 and the near shore by 2040).

As such, mapping is a critical part of every NOAA Ocean Exploration expedition on Okeanos Explorer. In addition, decisions about where we go and where we dive all start with a map.
We pursue every opportunity to collect and archive data from largely unmapped areas of the seafloor using the ship’s suite of modern sonars.
Over the years, mapping data collected aboard the ship in the Atlantic, Pacific, and Indian oceans as well as the Gulf of Mexico and Caribbean Sea have revealed numerous new features and spurred major discoveries.
Seamounts, trenches, canyons, ridges, banks, hydrothermal vents, brine pools, methane seeps, coral mounds, shipwrecks — we’ve mapped them all and more.

NOAA Ocean Exploration has been mapping the seafloor of the Blake Plateau off the Southeast United States aboard NOAA Ship Okeanos Explorer (left) since 2011.
As a result of this mapping, NOAA Ocean Exploration and partner scientists discovered mounds of extensive, dense populations of the deep-sea, reef-building coral Lophelia pertusa (middle and right) — some in areas previously believed to be flat and featureless.
These mounds have been growing for thousands, perhaps millions, of years and provide shelter and habitat to a variety of marine life.
Now, at 28,047 square kilometers (10,829 square miles), this area is considered to be the largest known deep-sea coral province in U.S. waters, possibly the world.
Additional mapping data collected with the support of NOAA Ocean Exploration (not shown here) have contributed to a near complete map of the deep waters of the Blake Plateau, an area of interest for the National Strategy for Mapping, Exploring, and Characterizing the United States Exclusive Economic Zone.
Images courtesy of NOAA Ocean Exploration. 

NOAA Ocean Exploration also uses the multibeam sonar system on NOAA Ship Okeanos Explorer to search for shipwrecks and other submerged cultural resources that help us better understand the past. An anomaly in seafloor mapping data (inset) turned out to be a shipwreck, likely the World War II-era oil tanker SS Bloody Marsh.
Images courtesy of NOAA Ocean Exploration, 2021 U.S. Blake Plateau Mapping 2 (inset) and Windows to the Deep 2021.

The free and publicly accessible data collected aboard Okeanos Explorer have been used by countless scientists, ocean resource managers (e.g., managers of fisheries and marine protected areas), and students.
These data also contribute to Seabed 2030, a collaborative, international effort to produce the definitive map of the world ocean floor by 2030.

Always Evolving Technologies

Mapping technologies have come a long way in recent decades, and NOAA Ocean Exploration is consistently at the forefront of the deep-ocean mapping community in testing and implementing new and emerging technologies.
In 2008, NOAA Ocean Exploration and NOAA’s Office of Marine and Aviation Operations outfitted Okeanos Explorer with a state-of-the-art deepwater multibeam sonar system — the very first of its kind in the world — capable of collecting high-resolution mapping data across large areas.
And, in 2021, we replaced this system, again with the first of its kind, to enable even greater mapping data quality and coverage.
In addition, NOAA Ocean Exploration recently piloted the remote processing of mapping data in the cloud, allowing mappers and interns to participate from shore. 

What’s Next

Seafloor mapping is difficult, time consuming, and expensive.
To meet the ambitious goals set by the national strategy and Seabed 2030, the mapping community is going to need to increase the pace, efficiency, and affordability of seafloor mapping.
Organizations will need to work together, leverage their resources and expertise, and increase the use of new tools and technologies like autonomous vehicles, innovative telepresence technologies, and the cloud.

Yes, 2 million square kilometers is a lot, and it’s an accomplishment to be proud of, but approximately 50% of the seafloor beneath U.S. waters remains to be mapped to modern standards.
NOAA Ocean Exploration is up to the challenge, and we will continue to evolve our operations, on Okeanos Explorer and through other mechanisms, to help meet it.
With so much to map, it’s a good time to be an ocean explorer.

This seamount was mapped for the first time during NOAA Ocean Exploration’s 2016 Hohonu Moana expedition on NOAA Ship Okeanos Explorer.
Subsequently, the seamount was named Okeanos Explorer Seamount to honor the ship for its key role in the discovery.
Image courtesy of NOAA Ocean Exploration, 2016 Hohonu Moana.

This milestone was met on November 1, 2021, on the Blake Plateau off the coast of the Southeast United States during the Windows to the Deep 2021 expedition.

Published: November 15, 2021
Contributed by: Christa Rabenold, NOAA Ocean Exploration
Relevant Expedition: Windows to the Deep 2021

Thursday, November 18, 2021

This intrepid robot is the Wall-E of the deep sea

With extra-wide tracks and a bunch of other clever features, the Benthic Rover II can roam the seafloor for years at a time.

From Wired by Matt Simon

Here's how engineers got the car-sized Benthic Rover II to roam the seafloor 13,000 feet deep without immediately breaking down.

THE BENTHIC ROVER II is the size of a compact car, although it rocks fat treads, making it more like a scientific tank.
That, along with the two googly-eye-like flotation devices on its front, gives it a sort of WALL-E vibe.
Only instead of exploring a garbage-strewn landscape, BR-II roams the Pacific seafloor, 13,000 feet deep.
The robot’s mission: To prowl the squishy terrain in search of clues about how the deep ocean processes carbon.

That mission begins with a wild ride, 180 miles off the coast of Southern California.
Scientists at the Monterey Bay Aquarium Research Institute lower BR-II into the water and then … drop it.
Completely untethered, the robot free-falls for two and a half hours, landing on the abyssal plains—great stretches of what you might generously call muck.
“It's mushy and dusty at the same time,” says MBARI electrical engineer Alana Sherman, coauthor on a new paper in Science Robotics describing findings from the robot’s adventures.
“Which is part of the reason it’s a tracked vehicle, and it has these really wide treads.” That extra surface area distributes the robot’s weight so it doesn’t sink into the sand.

If you wanted to devise the perfect way to torture a robot, the deep sea would be it.
At these depths the water is cold, salty (and therefore corrosive), and highly pressurized; there’s a whole lot of liquid pushing down on the robot.

Like the Mars rovers, this robot must be autonomous.
In fact, in some ways it’s even more difficult to keep tabs on a rover 13,000 feet deep than it is on a rover on another planet.
Radio waves travel well in space; they just take up to 20 minutes each way to make the trip between Earth and Mars, and good luck remotely piloting a rover in real time with that kind of delay.
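That delay is just distance divided by the speed of light. The Earth-Mars separation swings between roughly 55 million and 400 million kilometers over the orbital cycle, which is where the minutes-long range comes from:

```python
C_KM_PER_S = 299_792.458  # speed of light in km/s

def one_way_delay_min(distance_km):
    """One-way light-travel time in minutes."""
    return distance_km / C_KM_PER_S / 60

print(one_way_delay_min(55e6))   # ~3 minutes at closest approach
print(one_way_delay_min(400e6))  # ~22 minutes near maximum separation
```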
But radio waves hate water.
So, instead, BR-II uses acoustic signals to talk to another robot, a floating glider that MBARI scientists release from shore four times a year.
The glider, essentially a very expensive surfboard, travels to the rover’s approximate location, pings it, collects status updates and sample data, and fires that information to a satellite for the researchers to access.

A rattail fish captured on BR-II's camera PHOTOGRAPH: © 2021 MBARI

Notice the simplistic muckiness of the seafloor.

Since MBARI scientists can’t just sit in their labs and pilot the rover, it’s on its own.
But its directives are simple.
Parked on the seafloor, it lowers two oxygen sensors into the muck.
This gives the robot a measure of the biological activity in the sediment, as microbes consume oxygen and spit out carbon dioxide.
The rover also has a fluorescence camera system that casts a blue light, which makes the chlorophyll in organic matter glow.
This gives the robot an idea of how much detritus from surface waters, known as “marine snow,” is making its way down to the seafloor.

The rover sits in one place like this for 48 hours, then moves forward 33 feet.
That’s all.
“It would not know if it drove off a cliff—all it knows is I'm supposed to drive forward 10 meters,” says Sherman.
“But luckily, there are no cliffs around, so we take advantage of the simplicity of the environment to keep the robot more simple.”
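That park-measure-move directive amounts to a very small control loop. A purely illustrative sketch of one cycle (every function name here is an invented placeholder, not MBARI's actual software):

```python
DWELL_HOURS = 48
STEP_METERS = 10  # roughly 33 feet

def survey_cycle(read_current_heading, measure_site, drive):
    """One park-measure-move cycle of the rover's simple autonomy."""
    readings = measure_site(duration_hours=DWELL_HOURS)  # oxygen + fluorescence
    heading = read_current_heading()  # face into the current so the stirred-up
    drive(heading, STEP_METERS)       # sediment plume drifts behind the rover
    return readings

# Stubbed-out "hardware" for illustration
moves = []
data = survey_cycle(
    read_current_heading=lambda: 90.0,
    measure_site=lambda duration_hours: {"o2_uptake": 0.8, "hours": duration_hours},
    drive=lambda heading, meters: moves.append((heading, meters)),
)
```

Keeping the loop this simple is exactly the design choice Sherman describes: a flat, cliff-free environment lets the robot get by without any path planning at all.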

Still, there’s a problem: The oversized treads make a mess of the seafloor.
“Even though it is moving very slowly, it doesn't take much to create this huge dust storm,” says Sherman.
“We always want to be driving into the current, so that it can push the sediment that is disturbed behind us.” So before the rover moves, it uses a sensor to get an idea of the current direction of the … er, current, then heads straight for it.

You can see the two oxygen samplers beneath the eyeball floats.

The benthic rover does this for a whole year, unsupervised: park, take measurements, move 33 feet, repeat.
Then the scientists steam out in their research boat to give it a battery change.

At the back of the robot are two titanium spheres—each somewhere between the size of a yoga ball and a beach ball—filled with batteries that power a year of continuous operation.
When it’s time to resupply power, the scientists retrieve BR-II by sending it a signal that releases a 250-pound weight attached to the robot’s belly.
Once the weight is dropped, those flotation devices that look like eyes begin to do their work.
They are actually “syntactic” foam: instead of being mushy, porous plastic filled with air, they are made of hard material and filled with small glass spheres, each containing air.
Under pressures that would collapse typical foam in on itself, the syntactic foam stays buoyant, and propels the robot to the surface.

The scientists haul the rover up on board their boat, download BR-II’s data, swap out its batteries, and check it for problems.
If everything’s good, they release it to spend another year roaming the abyssal plains.
The last time the scientists went out, though, they discovered that one of BR-II’s motors had failed, so they had to bring it ashore for repairs.
That ended an incredible seven years of continuous operation, which they recapped in their current paper.

This long observation period has given MBARI scientists unprecedented insight into the goings-on of the deep, across both wide stretches of the seafloor and across long timescales.
That will be critical to understanding our planet’s carbon cycle.
At the ocean’s surface, a galaxy of algae known as phytoplankton sequesters carbon, the way plants do on land.
Then the algae get eaten by tiny animals known as zooplankton.
When these creatures poop, the carbon-rich pellets descend through the water column as marine snow.
Some of the waste gets eaten, either along the way or by bottom-dwelling creatures, but the rest becomes sequestered in the sediment, locking the carbon far away from Earth’s atmosphere.

Yet just how much carbon gets trapped can vary from ocean to ocean and from season to season.
In general, researchers just don’t have a good handle on the biological and chemical processes going on down there.
“The rover helps us understand how much of that carbon might actually make its way into the sediments in the deep sea,” says MBARI marine biologist Crissy Huffard, who coauthored the new paper.
“It's our only view into how much carbon might actually get stored into the sediments, versus how much actually is consumed and probably contributing to acidification in the deep sea.” (When carbon dioxide dissolves in seawater, it forms carbonic acid.)

Here’s a tricky example of one of those seafloor carbon mysteries.
In California, the land is heating up much faster than the adjacent ocean, a differential that intensifies seasonal winds.
That could be driving more upwelling—wind pushes the surface water away, and water from below rushes up to fill the void.
This would bring up more nutrients that feed phytoplankton, which bloom in surface waters, and then die and become marine snow.
Between the years 2015 and 2020, for instance, BR-II’s fluorescence camera detected a massive increase in the amount of phytoplankton reaching the seafloor in big pulses.
Simultaneously, its sensors detected a decrease in oxygen, meaning the microbes in the seafloor were busy processing the bonanza of organic material.

That raises some questions for Huffard.
“Just in general, the area's becoming a lot more erratic in its food supply—it can be years’ worth of food coming down in a few weeks.
So how is that changing the whole ecosystem?” she asks.
"The response by the animal community is almost instant.
They start consuming it right away, there's no big lag.
The microbes are just primed and ready to go.”

What does this mean for the carbon cycle? Theoretically, the more organic material that’s raining down, the more that’s getting sequestered away from the atmosphere.
But at the same time, organisms on the seafloor that are eating this bonus buffet are also using up oxygen and spitting out carbon dioxide, which may be acidifying deeper waters.
And because the ocean is constantly churning, some of that carbon may even make it back up to surface waters and into the atmosphere.
“We’re showing that more and more carbon than would have otherwise been predicted is making its way to the deep sea,” says Huffard.
“The rover adds the dimension to tell us that most of that carbon is actually getting eaten once it's down there, not being stored in the sediment.”

Are these extra-large pulses of marine snow now a permanent feature of the deep waters off California, or an aberration? With the benthic rover, scientists can gather the long-term data required to start providing answers.
“The deep sea is largely understudied and under-appreciated, despite the fact that it is critical to keeping the planet healthy and combating climate change,” says Lisa Levin, who studies the seafloor at the Scripps Institution of Oceanography but wasn’t involved in this work.
“An army of such devices could help us better understand biogeochemical changes—critical to improving climate models, ecosystem models, fisheries models, and more.” Rovers might also help scientists study the effects of deep-sea mining operations.

For now, Huffard and Sherman will keep BR-II rolling off the coast of California—the first of hopefully many such autonomous bottom-dwelling robots that could roam the depths of the world’s oceans.
They say they’ve been approached by other scientists interested in the system, but so far BR-II is more or less one-of-a-kind, both because it’s expensive and requires a lot of engineering know-how to operate.
(Researchers in Germany have developed a similar oxygen-sampling benthic rover called Tramper, which has been roaming the Arctic since 2016.)

“It's almost like if you were an astronomer and if you have the best telescope in the world, but it could only look at one star,” Huffard says.
“If you had more telescopes out there looking at more stars, you'd be able to see a much more complete picture of the sky.”

“I think we would both love it if more people wanted to build more rovers,” she adds.

Sherman laughs.
“As long as they don't call us to fix them.”

Wednesday, November 17, 2021

eLoran: Part of the solution to GNSS vulnerability

From GPSWorld by Matteo Luccio

Opposite and complementary

Though marvelous, GNSS are also highly vulnerable.
eLoran, which has no common failure modes with GNSS, could provide continuity of essential timing and navigation services in a crisis.

GPS fits Arthur C. Clarke’s famous third law: “Any sufficiently advanced technology is indistinguishable from magic.” Yet, it also has several well-known vulnerabilities — including unintentional and intentional RF interference (the latter known as jamming), spoofing, solar flares, the accidental destruction of satellites by space debris and their intentional destruction in an act of war, system anomalies and failures, and problems with satellite launches and the ground segment.

Over the past two decades, many reports have been written on these vulnerabilities, and calls have been made to fund and develop complementary positioning, navigation and timing (PNT) systems.
In recent years, as vast sectors of our economy and many of our daily activities have become dependent on GNSS, these calls have intensified.

A key component of any continent-wide complementary PNT would be a low-frequency, very high power, ground-based system, because it does not have any common failure modes with GNSS, which are high-frequency, very low power and space-based.
Such a system already exists, in principle: it is Loran, which was the international PNT gold standard for almost 50 years prior to GPS becoming operational in 1995.
At that point, Loran-C was scheduled for termination at the end of 2000.

However, beginning in 1997, Congress provided more than $160M to convert the U.S. portion of the North American Loran-C service to enhanced Loran (eLoran).
In 2010, when the U.S. Loran-C service ended, its modernized and upgraded successor was almost completely built out in the continental United States and Alaska.
During the following five years, Canada, Japan, and European countries followed the United States’ lead in terminating their Loran-C programs.

Today, however, eLoran is one of several PNT systems proposed as a backup for GPS.

The National Timing Resilience and Security Act of 2018 required the Secretary of the U.S. Department of Transportation (DOT) to “provide for the establishment, sustainment, and operation of a land-based, resilient, and reliable alternative timing system” as a backup to GPS.
In January 2020, the DOT awarded contracts to 11 companies to demonstrate their technologies’ ability to act as a backup for GPS.
Of these companies, two were working on eLoran projects.

Technical advisers to the federal PNT Executive Committee have been advocating and recommending that the government implement eLoran for the past 11 years.
Yet, while the U.S. government announced in 2008, and again in 2015, its intention to build an eLoran system, it has not done so yet.

Not Your Grandfather’s Loran

In the 1980s, I used Loran-C to navigate on sailing trips off the U.S. East Coast.
It had an accuracy of a few hundred feet and required interpreting blue, magenta, black and green lines that were overprinted on nautical charts.
The system was a modernized version, launched in 1958, of a radio navigation system first deployed for U.S. ship convoys crossing the Atlantic during World War II.
Its repeatability was greater than its accuracy: lobster trappers could rely on it to return to the same spots where they had been successful before, though they may have had some offset from the actual latitude and longitude.

By contrast, eLoran has an accuracy of better than 20 meters, and in many cases, better than 10 meters.
It was developed by the U.S. and British governments, in collaboration with various industry and academic groups, to provide coverage over extremely wide areas using a part of the RF spectrum protected worldwide.
Unlike GNSS, eLoran can penetrate to some degree indoors, under very thick canopy, underwater and underground, and it is exceptionally hard to disrupt, jam or spoof.

Unlike Loran-C, eLoran is synchronized to UTC and includes one or more data channels for low-rate data messaging, added integrity, differential corrections, navigation messages, and other communications.
Additionally, modern Loran receivers allow users to mix and match signals from all eLoran transmitters and GNSS satellites in view.

Finally, eLoran can be used for integrity monitoring of GPS — and vice versa.
“Think of a resiliency triad, consisting of GNSS (global), eLoran (continental), and an inertial measurement unit, a precise clock, or a fiber connection,” said Charles A. Schue, CEO of UrsaNav.
“It is extremely difficult to jam or spoof all three sources at the same time, in the same direction, and to the same amount.”

For the eLoran system to cover the contiguous United States, between four and six transmission sites could provide overlapping timing coverage, and 18 transmission sites could provide overlapping positioning and navigation.

U.S. Developments

The INVEST in America Act authorizes $157 million for the Department of Homeland Security to conduct research in five separate areas, one of which is positioning, navigation and timing resiliency; however, none of this money is for eLoran per se.
The regular DOT appropriation for next year has $17 million for PNT-related research, $10 million of which is for “GPS Backup/Complementary PNT Technologies Research.” However, neither of these bills has yet been finalized, let alone passed into law, so they may change.

“These are very complex systems, with five- to seven-year sales cycles,” pointed out Schue, “and the process is even slower now due to the pandemic.
With adequate funding, eLoran signals could start becoming available in the contiguous United States within a year of a service contract being signed.
We should recall that GPS — as, indeed all of the GNSS — was brought online gradually as satellites were developed and launched into space.
There should be no expectation that any other nationwide system would be available at the flip of a switch instead of through gradual implementation.”

The former Loran-C transmission antenna at Værlandet, Norway.
(Photo: UrsaNav)

International Developments

Loran-C and eLoran operate internationally.
Saudi Arabia, China and Russia continue to operate Loran-C or Chayka systems.
In October 2020, a Chinese paper described how the nation is expanding Loran to its west to cover the whole country to protect itself from disruptions of space-based services.
A previously published report made it clear that they are upgrading or have upgraded from Loran-C to eLoran.
South Korea has an ongoing project to upgrade its Loran-C to eLoran.
It also seems the project will ensure that the South Korean system will be useable on its own, even if the Russian and Chinese systems with which it normally cooperates are not available for some reason, according to Dana Goward, president of the Resilient Navigation and Timing Foundation.

The United Kingdom is still committed to eLoran, and operates one station that has been used as an alternative time reference to GNSS.
“However, as the sole station still transmitting in that area of Europe it’s of no use for positioning,” said Nunzio Gambale, CEO of Locata Corporation.
“Unfortunately, the EU’s shutdown of their old Loran sites seems to have been completed, and no EU-based Loran sites remain operational.
Their actions leave scant hope for Loran’s resurrection any time soon as an alternative to GNSS positioning in Europe.
That’s a shame, because eLoran has beneficial PNT characteristics that other alternate technologies will struggle to replicate.”

A deck officer on a ship takes a relative bearing using a pelorus.
Loran-C was developed in large part for maritime navigation.
(Photo: aytugaskin/iStock/Getty Images Plus/Getty Images)


“There is fairly good agreement across the PNT community that there is no sole solution [to GPS vulnerabilities],” Schue said.
“It needs to be a system of systems.”

The PNT community, he said, is working with Congress and the administration “to move ahead with actual RFPs to start the contracting process — instead of continuing to admire the problem.” UrsaNav, NextNav, OPNT and other companies and organizations “are working together as best as we can to tell the federal government that we all believe in a system-of-systems approach and that there ought to be some tangible forward motion.”

While DOT has the lead on providing PNT resiliency, it and the departments of Defense and Homeland Security need to cooperate on this, Schue argued.
“Many, if not all, of the other departments — such as Commerce, Energy, State, Interior and Agriculture — also have a stake.”

GNSS will remain dominant for a reason.
“Unless a new national terrestrial PNT system moves the game forward for many markets, it’s just far too easy to remain with the GNSS system, which is fundamentally free,” Gambale said.
“That’s a really difficult price point to compete with, unless you’re delivering significant new value to the market.”

The time to act is now.
“This issue has been studied to death for more than 20 years,” Goward said.
“There are technologies ready to deploy. It is time for action. A failure of national PNT will be catastrophic.”


Tuesday, November 16, 2021

Norwegian undersea surveillance network had its cables mysteriously cut

LoVe Ocean Observatory

From The Drive by Thomas Newdick

The seafloor ocean observatory off the coast of northern Norway can detect submarine traffic, which could make it a prime target for the Russians.

Undersea sensors off the coast of northern Norway that are able to collect data about passing submarines, among other things, have been knocked out, the country’s state-operated Institute of Marine Research, or IMR, has revealed.
The cause of the damage is unknown, but the cables linking the sensor nodes to control stations ashore are said to have been cut and then disappeared.
This has raised suspicions about deliberate sabotage, possibly carried out by the Russian government, which definitely has the means to do so.

The IMR, one of the biggest marine research institutes in Europe, described “extensive damage” to the outer areas of the Lofoten-Vesterålen (LoVe) Ocean Observatory, putting the system offline.
LoVe, which was only declared fully operational in August 2020, consists of a network of underwater cables and sensors located on the Norwegian Continental Shelf, an area of strategic interest for both Norway and Russia.

LoVe Ocean Observatory
Cables associated with the LoVe ocean observatory are prepared for deployment in May 2020.

Norway’s military and the country's national Police Security Service are reportedly investigating what happened to the research surveillance system.
LoVe's stated purpose is to use its sensors to monitor the effects of climate change, methane emissions, and fish stocks, providing scientists with a live feed of imagery, sound, and other data.

LoVe Ocean Observatory
A map of the LoVe cable transect and scientific nodes (left) and its position off the coast of Norway (insert), and a schematic layout of the sensors in the LoVe network (right).
Localization with the GeoGarage platform

Of course, the system also monitors submarine activity in the area, so it will immediately be of interest to the Russian Navy in particular.
Indeed, data gathered by its sensors is first sent to the Norwegian Defence Research Establishment, also known by its Norwegian acronym FFI, before being handed over to the IMR for further study.
“FFI is believed to routinely remove traces of any submarine activity in the area before turning over the observatory’s data to IMR so that it only contains fishing, currents, and climate information,” according to a report from Norway’s News in English website.

“We don’t care so much about the submarines in the area (located not far from onshore military installations at Andøya, Evenes and other bases in Northern Norway), but we know the military is,” IMR director Sissel Rogne told the Norwegian newspaper Dagens Næringsliv.
“You could see what’s going on down there regarding all types of U-boats [submarines] and all other countries’ U-boats.
That’s why I didn’t think this was just a case for the police but a case for [the police security agency].”

“Something or someone has torn out cables in outlying areas,” Geir Pedersen, the LoVe project leader, said in a press statement last Friday.
Reports indicate that more than 2.5 miles of fiber optic and electrical cables were severed and then removed.
In total, LoVe uses more than 40 miles of cables in the Norwegian Sea.

Based on reports in Dagens Næringsliv, the LoVe observatory has been affected by interference since at least April, when the connection between the sensor network and the control station at Hovden on the northern island of Langøya was lost.
An unmanned submarine subsequently traced the cause of the breakdown to one of the underwater surveillance platforms, Node 2, which had been dragged away from its normal location with its connecting cable severed and removed.

LoVe Ocean Observatory
A platform within the LoVe network undergoes routine maintenance in 2019.

A follow-up mission in September attempted to trace the cable connecting Node 2 with Node 3, only to find that this platform too had been moved, its components damaged, and its cable missing.

Sveter/Wikimedia Commons
The IMR research vessel G.O. Sars has been used to examine the area of the breach in the undersea surveillance network, including hunting for the lost section of cable.

Meanwhile, News in English reports that the surveillance system has not been online since the initial disturbances to its operations in April.

Rogne told Dagens Næringsliv that the size and weight of the cable running between Nodes 2 and 3 was so great it would have required something with considerable power to have severed it.

IMR’s Øystein Brun told the same newspaper that the institute was now assessing whether the cables were cut deliberately, but suggested that seems the most likely explanation since the crew of a vessel should have noticed if they had accidentally become entangled with them and would likely have reported it.

It’s also unclear what has happened to the missing cable, around 9.5 tons in all, which has not been recovered.

Part of the investigation has sought to identify the vessels that were active in the area in question as of April this year.
According to the IMR, that has been made more difficult by the fact that some of them likely were underway without transponders activated, meaning they would not have been broadcasting their positions to the Coast Guard or other agencies.
Any vessel attempting to tamper with the cables would probably have had its transponder off, consistent with a deliberate act by a foreign power.
In the meantime, at least some of the vessels in the area at the time have been identified, although no more details have been disclosed.
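Hunting for vessels that went "dark" typically means scanning archived AIS tracks for unexplained silences in a vessel's position reports. A minimal sketch of that idea follows; the six-hour gap threshold, the data layout, and the sample track are illustrative assumptions, not details from the Norwegian investigation.

```python
from datetime import datetime, timedelta

def find_ais_gaps(reports, max_gap_hours=6):
    """Return (start, end) pairs where a vessel's AIS reports stop
    for longer than max_gap_hours -- a possible sign the transponder
    was switched off.

    reports: iterable of (timestamp, lat, lon) tuples, any order.
    """
    gaps = []
    ordered = sorted(reports, key=lambda r: r[0])
    for prev, curr in zip(ordered, ordered[1:]):
        if curr[0] - prev[0] > timedelta(hours=max_gap_hours):
            gaps.append((prev[0], curr[0]))
    return gaps

# Hypothetical track near the Lofoten-Vesterålen area:
track = [
    (datetime(2021, 4, 2, 0), 68.9, 14.5),
    (datetime(2021, 4, 2, 1), 68.9, 14.6),
    (datetime(2021, 4, 2, 12), 69.0, 14.9),  # 11 hours of silence
]
print(find_ais_gaps(track))
```

In practice, investigators would also cross-reference such gaps against radar and coastal-watch records, since a gap alone does not prove the transponder was deliberately disabled.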

There are several reasons why a foreign nation may have attempted to sever the cables and take them away.
First, as we have already seen, the surveillance system is an important means for Norway to track foreign submarine activity in the Norwegian Sea, potentially restricting certain operations in these waters.
Second, this power may have wanted to explore the type of information that the LoVe system is capable of gathering, to give an idea of the sorts of capabilities available to Norway and, by extension, NATO.
Third, as IMR director Rogne pointed out, the cables themselves may yield valuable technical information, for anyone wanting to install a similar system, for example.

The video below contains a recording from one of LoVe’s hydrophones of a larger ship passing by, highlighting the kind of acoustic data that the system can collect.
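Acoustic data of this kind is usually examined in the frequency domain, where a passing ship's propeller and machinery show up as strong low-frequency tonal lines. The sketch below uses a synthetic 100 Hz tone as a stand-in for real hydrophone samples; the tone frequency, sample rate, and noise level are invented for illustration.

```python
import numpy as np

fs = 1000  # sample rate in Hz (illustrative; real hydrophones vary)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic "hydrophone" signal: a 100 Hz tone plus background noise,
# standing in for a ship's machinery tonal.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 100 * t) + 0.1 * rng.standard_normal(t.size)

# Magnitude spectrum; the strongest bin reveals the dominant tonal.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)
```

Real systems run this continuously on overlapping windows (a spectrogram), so that a tonal drifting in frequency as a vessel approaches and recedes can be tracked over time.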

With the Russian Navy’s Northern Fleet on Norway’s doorstep, it would not be unexpected to see suspicion fall upon some kind of Russian espionage or sabotage activity, although the IMR is so far being circumspect on this matter.
While we don’t know what happened, there could also be a more banal explanation, perhaps an unintended tangling of the cables with some kind of vessel or as the result of deep-sea dredging during oil exploration.

However, News in English reports that there has been “lots of Russian shipping activity in the area of late, often cropping up around Norwegian offshore infrastructure.” In this case, the activity referred to is legal, but the implication is that this is an area in which Russia has a particular interest, and in which its vessels, naval and civilian, operate routinely.

While the area is very close to the Norwegian coast, it’s also adjacent to the Greenland, Iceland, United Kingdom Gap, or GIUK Gap, a major strategic bottleneck through which Russian submarines would need to break through undetected if they wish to move out into the wider Atlantic without being traced.

A Cold War-era map of the GIUK Gap.

Norwegian authorities have publicly disclosed Russian interference with and otherwise aggressive actions toward other sensor and communications networks in the region in the past.
In 2018, the Norwegian Intelligence Service (NIS) disclosed three separate instances where Russian aircraft had flown mock attack profiles against a secretive radar station in the northern part of the country.
The year before, the NIS blamed Russian jamming for disruptions in cell phone and GPS service in the region, though it said this was a byproduct of an exercise and not a deliberate attack.

Prior to these new developments involving the LoVe network, various reports have suggested that Russia has been deploying vessels close to undersea cables in the North Atlantic, part of a general uptick in its naval operations in those waters.
Recent activity has included the presence of the survey ship Yantar off the Atlantic coast of Ireland in August.
As well as carrying deep-sea submersibles and sonar systems, the Yantar has been repeatedly suspected of covert operations involving undersea cables.

“We are now seeing Russian underwater activity in the vicinity of undersea cables that I don’t believe we have ever seen,” U.S. Navy Rear Adm. Andrew Lennon, then serving as NATO’s top submarine officer, told The Washington Post in December 2017.
“Russia is clearly taking an interest in NATO and NATO nations’ undersea infrastructure.”

Certainly, Russia possesses special mission submarines that could well be equipped to both cut and tap cables, or even remove them for further study, as seems to have been the case with the LoVe network.
In particular, U.S. Northern Command has highlighted the potential threat posed by the Russian Navy’s nuclear-powered midget submarine Losharik, compounded by the fact that it’s judged especially difficult to detect and monitor.


A slide from a U.S. Northern Command briefing in 2016 showing an artist’s conception of the Losharik special missions submarine.

In an earlier piece on the Losharik, The War Zone described its covert role and unique capabilities as follows:

The Main Directorate of Deep-Sea Research, Russia’s main naval intelligence entity, also known by the Russian acronym GUGI, operates Losharik and its primary missions are investigating, manipulating, and recovering objects on the seabed, such as hunting for items of intelligence value or tapping or cutting seabed cables.
The small submarine is also designed to ride underneath a larger submarine mothership to get closer to the target area.
GUGI has a number of motherships converted from ballistic missile and cruise missile submarines.

Losharik has been laid up since suffering extensive damage in a fatal fire in 2019, but Russia has other similar special-mission boats available, as well as large mothership submarines capable of bringing them covertly to and from a mission area.

The capabilities thought to be provided by the Losharik, and others like it in Russian service, have long been a worry for NATO officials.
Their concerns include hostile submarines operating close to their coasts and undetected while carrying out missions including tapping cables, deploying sensors, or otherwise collecting intelligence.
Even one Russian submarine could potentially wield power far greater than its size, representing a powerful asymmetric naval threat by cutting cables completely as an information warfare tactic.
While we have no evidence that this is what happened off the coast of Norway, it would be expected that this scenario is at least a line of inquiry.

A reported picture of Losharik running on the surface.

That the North Atlantic is an area of renewed interest for the Russian military is also no surprise, given the establishment of a new Northern Fleet Joint Strategic Command in 2014, responsible for the Arctic, North Atlantic, and Scandinavian regions.
It includes the Northern Fleet, assets of which are concentrated on the Kola Peninsula, as well as military garrisons, and airbases, including a growing number of forward-located airfields in the High North.
The Russian Navy has been exploring establishing underwater sensor networks and other infrastructure, including nuclear reactors on the seafloor to provide power, in the region, as well.

The United States has also stepped up its military presence in this wider region, with a particular tilt toward cooperation with Norway.
In recent years, this has included joint exercises in the air and on the ground, and consideration has also been given to operating U.S. Navy submarines from a cavernous naval base built under a Norwegian mountain.
U.S. Navy submarines have been a more visible presence in the region.
This includes a rare publicized appearance by the first-in-class USS Seawolf surfaced in a fjord near Tromsø last year and an actual port visit there by the Virginia class attack submarine USS New Mexico in May of this year. 

U.S. Navy USS Seawolf on the surface in a fjord near Tromsø, Norway on Aug. 21, 2020.

Seemingly, the LoVe case has proven very puzzling — and costly — for the IMR, although it’s not clear what kinds of evidence have been gathered by the Norwegian military or intelligence services.
As it stands, however, for the time being, Norway has lost a very important source of surveillance for all manner of underwater activities.
While the IMR will hope to bring at least a part of the system back online as soon as it can, the Norwegian Armed Forces will likely also be eager to have this source of underwater intelligence restored as soon as possible.


Monday, November 15, 2021

Sofar nets a $39M round B to grow its ocean-monitoring autonomous buoy network

Image Credits: Sofar Ocean
From TechCrunch by Devin Coldewey

The ocean is vast and mysterious … but rather less so when you have thousands of little autonomous buoys reporting back interesting info to you every day.
That’s just what Sofar Ocean has, and it just raised $39 million to scale up its vision of real-time understanding of the seven seas.

The company operates what it calls an “ocean intelligence platform,” essentially a real-time map of various important oceanic metrics like currents, temperature, weather and so on.
While some of this information is easy enough to get from satellites or from the large network of shipping vessels on the water at any given time, the value of the granularity and ground truth you get from thousands of dedicated observers riding the waves is clear.

If you have data that’s 15 minutes old rather than yesterday’s reading or an estimate by a passing satellite, you can simply make more informed decisions about things like shipping routes, weather predictions (even on land), and of course there are the innumerable scientific applications of such a large amount of data.

There are, so far, if you will, some unspecified thousands of “Spotters,” as they call them, out there.

“One might argue this number still feels small when you think about the size of the oceans,” said CEO Tim Janssen, but it’s both more than others have accomplished and still not enough.
“We’ve already got all five oceans covered, but now it’s time to kick it into even higher gear to improve the density of this distributed platform for the most powerful sensing capability possible.
That’s why we anticipate rapidly adding many more sensors over the next couple of years to expand the data we collect and get even more accurate ocean insights.”

Sofar and DARPA recently announced a hardware standard called Bristlemouth intended to serve as a reference design for people designing their own ocean-going data collection devices.
The idea is to make the growing autonomous presence in the water as interoperable as possible to avoid the bother of overlapping yet incompatible networks.

The challenges from running a network of thousands of presumably barnacle-encrusted, fish-nibbled, weather-beaten robo-buoys are what you might expect.
Janssen said the Spotters require “minimal maintenance,” having been designed to survive the open ocean for long periods of time.
“We recently had a Spotter that was covered in ice because of harsh weather conditions and once the ice melted months later, it automatically started sharing data again,” he recalled.
If one washes up on shore, Sofar helps the finder return it to where it needs to be.

The devices report not through manual data offloading or mesh networking (though this is an option) but through the Iridium satellite network — though Janssen said the company is “starting to lean into some of the latest technologies, like Swarm, that are revolutionizing the satellite communication space.” (Swarm, as we’ve followed since its early days, is a low-bandwidth satcom network focused on IoT-type applications rather than consumer internet. SpaceX is in the process of acquiring them.)
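Pay-per-byte satellite links like these also cap message sizes, so buoy readings are typically packed into compact fixed-width binary records rather than sent as text. The encoding below is hypothetical; the field choices, scaling factors, and record layout are assumptions for illustration, not Sofar's actual wire format.

```python
import struct

# Hypothetical 14-byte record: uint32 Unix timestamp, int32 lat/lon
# scaled to 1e-5 degrees, uint16 wave height in centimeters.
FMT = "<IiiH"

def pack_reading(unix_ts, lat, lon, wave_height_m):
    """Pack one buoy observation into 14 bytes for a narrowband link."""
    return struct.pack(FMT, unix_ts,
                       round(lat * 1e5), round(lon * 1e5),
                       round(wave_height_m * 100))

def unpack_reading(blob):
    """Invert pack_reading, restoring degrees and meters."""
    ts, lat, lon, cm = struct.unpack(FMT, blob)
    return ts, lat / 1e5, lon / 1e5, cm / 100

blob = pack_reading(1636934400, 59.91234, -3.45678, 2.37)
print(len(blob), unpack_reading(blob))
```

Integer scaling like this trades a bounded precision (here about a meter of position and a centimeter of wave height) for a record small enough to send many observations per message.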

Sofar’s interface for showing currents and other ocean conditions.
Image Credits: Sofar

The $39 million round was led by Union Square Ventures and the Foundry Group, both of which expressed (in a press release) the clear need for more data in both present enterprises like shipping and future work like studying climate change.

“What we’re seeing now, especially in light of COP26, is that climate change discussions are finally taking center stage as governments across the globe adjust and plan ahead for more intense hurricanes and storms, rising sea levels and threatened ecosystems like coral reefs,” Janssen explained.
“Any clarity that can be provided regarding these changing weather patterns, currents and temperatures, and sensitive marine ecosystems isn’t just a win for us or our partners; it’s truly a win for each individual on this planet as we all collectively work together to beat the ticking clock.”

While governments think about whether they should do anything, of course, shipping and supply chain management companies are willing to pay for Sofar’s data, in the hopes of better routing, which minimizes fuel use and improves logistics generally.

“Having access to real-time data is going to help reduce uncertainty across all these industries to be more efficient, make better business decisions, and even save fuel to reduce emissions — all to establish a more sustainable and more prepared planet,” Janssen said.

Sunday, November 14, 2021

Wave tank demonstration showing the impact of coastal defences on flood risk

The JBA Trust wave tank shows how different combinations of coastal defences and wave and tide conditions affect the potential for overtopping and flood risk.
Overtopping rates can be measured for the following defences and conditions: 
- beach during a storm surge 
- vertical and recurved sea walls
- stepped and sloped revetments
- rock armour
- submerged near-shore breakwater
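The overtopping rates such a tank demonstrates are conventionally expressed in dimensionless form, with mean discharge falling off roughly exponentially as the crest freeboard grows relative to the incident wave height. The sketch below shows that general relationship; the coefficients are placeholders for illustration, whereas real design work uses fitted, structure-specific values such as those tabulated in the EurOtop manual.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def overtopping_rate(Hm0, Rc, a=0.05, b=2.5):
    """Mean overtopping discharge q (m^3/s per m run of defence) from
    the generic exponential form q / sqrt(g*Hm0^3) = a * exp(-b*Rc/Hm0).

    Hm0: significant wave height at the structure toe (m)
    Rc:  crest freeboard, crest level above still water (m)
    a,b: structure-dependent coefficients (placeholder values here)
    """
    return a * math.exp(-b * Rc / Hm0) * math.sqrt(G * Hm0 ** 3)

# Raising the freeboard sharply cuts overtopping:
low = overtopping_rate(Hm0=2.0, Rc=1.0)   # low wall
high = overtopping_rate(Hm0=2.0, Rc=3.0)  # wall raised by 2 m
print(low, high, low / high)
```

The exponential form is why the tank demonstrations are so striking: a modest change in freeboard, or in effective wave height at the toe (for example from a submerged breakwater or rock armour), changes the overtopping discharge by an order of magnitude.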