Tuesday, September 22, 2020

Autumn equinox

Today is the autumn equinox.
Perfect symmetry between the two hemispheres, which receive the same amount of solar energy over a day of 12 hours of daylight and 12 hours of night.
Can you see the sun's track exactly on the equator in this satellite animation?

This year's Autumnal Equinox falls on 22 September at 14:30 BST.
The equinox is when the centre of the Sun (as viewed from Earth) crosses the Earth's equator.
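For readers who like to check this numerically, the Sun's declination (its latitude as seen from Earth) can be estimated with a standard textbook approximation; the sketch below is only good to about a degree, not an ephemeris:

```python
import math

def solar_declination_deg(day_of_year):
    """Approximate solar declination (degrees): the Sun's latitude as
    seen from Earth. Standard cosine approximation, good to ~1 degree."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

# Around the September equinox the declination changes sign as the
# Sun's centre crosses the equator heading south.
# 22 September 2020 is day 266 of the (leap) year.
for day in (260, 266, 272):
    print(day, round(solar_declination_deg(day), 2))
```

The sign flip between the printed values marks the crossing; this simple formula places it within a couple of days of the true instant.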

Curiosities at the chart makers Imray

The protractor engraved with William Heather's name, which was probably his personal instrument

From Yachting Monthly by Katy Stickland

Katy Stickland goes behind the scenes at Imray and discovers the treasures held by the nautical publisher

‘Great Andaman where Inhabitants are said to be Cannibals’.
These ominous words hang in the air, which is thick with the smell of old paper, ink and dust as we gingerly leaf through piles of ‘blueback’ charts in the basement of Imray, Laurie, Norie and Wilson Ltd.

Yachting Monthly is being given a tour of the nautical publisher’s HQ in Wych House in St Ives, Cambridgeshire.

We are poring over some of the company’s oldest charts, including this one from 1784 of the Andaman and Nicobar Islands in the Indian Ocean.

The chart is marked in black and red ink, with everything in red highlighting an amendment.

A 1784 chart of the Andaman Islands with amendments in red ink. Credit: Katy Stickland

It also includes remarks from one Captain Phineas Hunt, detailing the Nicobar Islands’ channels and harbours as well as where there is an ‘abundance of Hogs & Fowls’, vital information for Merchant Navy ship captains in the 18th and 19th centuries.

Although Imray as it is today was incorporated in 1904, the roots of the three chart publishers that formed it – James Imray & Son, Norie & Wilson and RH Laurie – can be traced back to the mid 1700s.

For centuries these London-based firms, along with a few other companies, were responsible for producing what became known as blueback charts because of the distinct blue manila paper which was used to back them.

This backing not only strengthened the charts but also distinguished them from the British Admiralty charts, which were published on heavier-weight paper.

These privately printed charts were mainly used by the Merchant Navy; publications for recreational sailors only really became available from the 1890s.

Imray director Lucy Wilson shows YM some of the firm’s old blueback charts.
Credit: Theo Stocker

By then, competition from Admiralty charts was damaging the traditional private chart trade, a threat that would eventually lead to the amalgamation of the three firms.

It was Norie & Wilson which took the prudent step of publishing Fore and Aft Seamanship for Yachtsmen: With Names of Ropes, Sails, and Spars in a Cutter, Yawl, or Schooner in 1878.

But it wasn’t until after the First World War that recreational sailing started to become the focus of Imray’s business, partly because of the growth of yachting in the UK.

The C and Y series charts were launched in the late 1920s.

Opposite Wych House’s basement are a framed 1962 C4 chart of the Needles Channel to Portland, and a 1947 chart of the London Docks and the River Thames.

Decades later, the yellow and green colouring on the C4 is still almost psychedelic, competing for attention with the vibrant reds and greens used for colouring the London chart.

Some of the early charts feature the routes of ships

The C4 also shares some features with the original bluebacks.

Within the chart are smaller charts for Weymouth Harbour, Christchurch and Lulworth Cove.

The addition of large-scale harbour plans was always considered good value for money by merchant seamen, as it meant they didn’t have to buy additional charts, a tradition that was warmly welcomed by recreational sailors.

Today, Imray C charts cover the whole of the British Isles and parts of Europe.

Imray also rewrote some of its pilot books to include anchorages and passages accessible by smaller sailing boats.

Imray still retains the ethos of its founders: to produce charts using accurate hydrographic data

Previously, pilot books had just concentrated on the needs of larger ships.

The Pilot’s Guide to the English Channel and The Pilot’s Guide to the Thames Estuary and the Norfolk Broads were both rewritten by Eric Wilson and published in 1932 and 1934.

The initial success of Imray’s foray into the yachting market was abruptly halted by the start of the Second World War, which saw Imray move its offices from London to Cambridgeshire.

St Ives was chosen because the print works Enderby & Co were based there, and it had lithographic printers large enough to print charts.

Modern and older versions of Norie’s Nautical Tables for astro navigation

Throughout the 1960s and 1970s Imray established itself as a publisher for cruising sailors.

It started producing folded charts, and in 1979 ended the lining of charts with blue manila paper.

In 1999, the first digitally drawn charts using GIS (geographic information system) software were produced.

Recently Imray has undergone another major change in its production system, allowing it to develop new products such as Imray Electronic Navigation Charts (ENCs), update products more frequently, align book, chart and digital products more closely with each other, and receive and manipulate data more easily.

Despite advances in technology, Imray still retains the ethos of those three original publishers: to produce charts using accurate hydrographic data.

It is appropriate our tour ends in the boardroom, where the portraits of those early founders stare down at us, alongside the tools of their trade.

The brass protractor belonging to Norie’s founder William Heather; Lord Nelson’s favourite chair, which was given to Heather by a friend who served with Nelson on HMS Boreas; and early editions of Norie’s Nautical Tables, still produced by the company.

Links :

NASA-led study reveals the causes of sea level rise since 1900

This aerial photograph shows fast-moving meltwater rivers flowing across the Greenland Ice Sheet, a region that, combined with Antarctic meltwater and thermal expansion, accounts for two-thirds of observed global mean sea level rise.
Credits: NASA

From NASA by Ian J. O'Neill / Jane J. Lee

Scientists have gained new insights into the processes that have driven ocean level variations for over a century, helping us prepare for the rising seas of the future.

To make better predictions about the future impacts of sea level rise, new techniques are being developed to fill gaps in the historic record of sea level measurements.
We know the factors that play a role in sea level rise: Melting glaciers and ice sheets add water to the seas, and warmer temperatures cause water to expand.
Other factors are known to slow the rise, such as dams impounding water on the land, stymying its flow into the sea.

When each factor is added together, this estimate should match the sea level that scientists observe.
Until now, however, the sea level "budget" has fallen short of the observed sea level rise, leading scientists to question why the budget wouldn't balance.

A new study published on Aug. 19 seeks to balance this budget.
By gaining new insights into historic measurements, scientists can better forecast how each of these factors will affect sea level rise and how this rise will impact us in the future.

For example, in its recent flooding report, the National Oceanic and Atmospheric Administration (NOAA) noted a rapid increase in sea level rise-related flooding events along U.S. coasts over the last 20 years, and they are expected to grow in extent, frequency, and depth as sea levels continue to rise.

Factors Driving Our Rising Seas 

The research, led by NASA's Jet Propulsion Laboratory in Southern California, reexamines each of the known contributors to sea level rise from 1900 to 2018, using improved estimates and applying satellite data to better understand historic measurements.

This infographic shows the rise in sea levels since 1900.
Pre-1940, glaciers and Greenland meltwater dominated the rise; dam projects slowed the rise in the 1970s.
Now, ice sheet and glacier melt, plus thermal expansion, dominate the rise.
Tide-gauge data shown in blue and satellite data in orange.
Credits: NASA/JPL-Caltech

The researchers found that estimates of global sea level variations based on tide-gauge observations had slightly overestimated global sea levels before the 1970s.
(Located at coastal stations scattered around the globe, tide gauges are used to measure sea level height.) They also found that mountain glacier meltwater was adding more water to the oceans than previously realized but that the relative contribution of glaciers to sea level rise is slowly decreasing.
And they discovered that glacier and Greenland ice sheet mass loss explain the increased rate of sea level rise before 1940.

In addition, the new study found that during the 1970s, when dam construction was at its peak, sea level rise slowed to a crawl.
Dams create reservoirs that can impound freshwater that would normally flow straight into the sea.

"That was one of the biggest surprises for me," said lead researcher Thomas Frederikse, a postdoctoral fellow at JPL, referring to the peak in global dam projects at that time.
"We impounded so much freshwater, humanity nearly brought sea level rise to a halt."

Since the 1990s, however, Greenland and Antarctic ice sheet mass loss and thermal expansion have accelerated sea level rise, while freshwater impoundment has decreased.
As our climate continues to warm, the majority of this thermal energy is absorbed by the oceans, causing the volume of the water to expand.
In fact, ice sheet melt and thermal expansion now account for about two-thirds of observed global mean sea level rise.
Mountain glacier meltwater currently contributes another 20%, while declining freshwater storage on land adds the remaining 10%.

All told, sea levels have risen on average 1.6 millimeters (0.063 inches) per year between 1900 and 2018.
In fact, sea levels are rising at a faster rate than at any time in the 20th century.
But previous estimates of the mass of melting ice and thermal expansion of the ocean fell short of explaining this rate, particularly before the era of precise satellite observations of the world's oceans, creating a deficit in the historic sea level budget.

Ice shelves in Antarctica, such as the Getz Ice Shelf seen here, are sensitive to warming ocean temperatures.
Ocean and atmospheric conditions are some of the drivers of ice sheet loss that scientists considered in a new study estimating additional global sea level rise by 2100.

Credits: Jeremy Harbeck/NASA

Finding a Balance

In simple terms, the sea level budget should balance if the known factors are accurately estimated and added together.
It's a bit like balancing the transactions in your bank account: Added together, all the transactions in your statement should match the total.
If they don't, you may have overlooked a transaction or two.

The same logic can be applied to the sea level budget: When each factor that affects sea level is added together, this estimate should match the sea level that scientists observe.
Until now, however, the sea level budget has fallen short of the observed sea level rise.
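In code, the budget check the analogy describes is nothing more than a sum and a residual. The contribution values below are illustrative round numbers chosen to total the article's 1.6 mm/yr average, not the study's actual estimates:

```python
# Sea level "budget": the sum of the known contributions should
# match the observed rate of global mean sea level rise.
# The numbers are illustrative round values (mm/yr), chosen to
# total the article's 1.6 mm/yr average -- not the study's estimates.
contributions = {
    "thermal_expansion": 0.5,
    "glaciers": 0.3,
    "greenland_ice_sheet": 0.3,
    "antarctic_ice_sheet": 0.2,
    "land_water_storage": 0.3,  # dams made this negative in some decades
}

observed_rate = 1.6  # mm/yr, the 1900-2018 average quoted in the article

budget_sum = sum(contributions.values())
residual = observed_rate - budget_sum  # ~0 when the budget closes

print(f"sum of contributions: {budget_sum:.1f} mm/yr")
print(f"observed rate:        {observed_rate:.1f} mm/yr")
print(f"residual:             {residual:.1f} mm/yr")
```

A non-zero residual plays the role of the overlooked bank transaction: some process is mis-estimated or missing.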

"That was a problem," said Frederikse.
"How could we trust projections of future sea level change without fully understanding what factors are driving the changes that we have seen in the past?"

Frederikse led an international team of scientists to develop a state-of-the-art framework that pulls together the advances in each area of study – from sea level models to satellite observations – to improve our understanding of the factors affecting sea level rise for the past 120 years.

The latest satellite observations came from the pair of NASA/German Aerospace Center (DLR) Gravity Recovery and Climate Experiment (GRACE) satellites that operated from 2002 to 2017, and their successor pair, the NASA/German Research Centre for Geosciences (GFZ) GRACE Follow-On (launched in 2018).
Additional data from the series of TOPEX/Jason satellites – a joint effort of NASA and the French space agency Centre National d'Etudes Spatiales – that have operated continuously since 1992 were included in the analysis to enhance tide-gauge data.

"Tide-gauge data was the primary way to measure sea level before 1992, but sea level change isn't uniform around the globe, so there were uncertainties in the historic estimates," said Sönke Dangendorf, an assistant professor of oceanography at Old Dominion University in Norfolk, Virginia, and a coauthor of the study.
"Also, measuring each of the factors that contribute to global mean sea levels was very difficult, so it was hard to gain an accurate picture."

The sea ice is surprisingly weak, has lots of melt ponds, and the expedition ship Polarstern was able to break through easily.
Photo: Steffen Graupner / MOSAiC

But over the past two decades, scientists have been "flooded" with satellite data, added Dangendorf, which has helped them precisely track the physical processes that affect sea levels.

For example, GRACE and GRACE-FO measurements have accurately tracked global water mass changes, melting glaciers, ice sheets, and how much water is stored on land.
Other satellite observations have tracked how regional ocean salinity changes and thermal expansion affect some parts of the world more than others.
Up-and-down movements of Earth's crust influence the regional and global levels of the oceans as well, so these aspects were included in the team's analysis.

"With the GRACE and GRACE-FO data we can effectively back-extrapolate the relationship between these observations and how much sea level rises at a particular place," said Felix Landerer, project scientist at JPL for GRACE-FO and a coauthor of the study.
"All observations together give us a pretty accurate idea of what contributed to sea level change since 1900, and by how much."

The study, titled "The Causes of Sea Level Rise Since 1900," was published Aug. 19 in Nature.
In addition to scientists from JPL and Old Dominion University, the project involved researchers from Caltech, Université Catholique de Louvain in Belgium, University of Siegen in Germany, the National Oceanography Centre in the United Kingdom, Courant Institute in New York, Chinese Academy of Sciences, and Academia Sinica in Taiwan.
JPL managed the GRACE mission and manages the GRACE-FO mission for NASA's Earth Science Division of the Science Mission Directorate at NASA Headquarters in Washington.
Based in Pasadena, California, Caltech manages JPL for NASA.

Links :

Monday, September 21, 2020

Tracking undersea earthquakes helps scientists study ocean heating

A team of seismologists and oceanographers has shown that small earthquakes repeatedly emanating from the same spot beneath the ocean floor can help measure changes in ocean temperature.
The quakes generate reliable acoustic signals for measuring ocean temperatures, including at depths below 2000 meters, beyond the reach of other techniques.
If validated, the approach, published this week in Science, could open an entirely new ocean observation system for understanding past and future climate change, says Frederik Simons, a geophysicist at Princeton University unaffiliated with the study.
“There’s a potential treasure trove of data waiting to be analyzed.”

From CleanTechnica by Steve Hanley

Almost every vehicle has a cooling system.
Whether it uses an internal combustion engine or an electric motor, it creates heat while in operation, and that heat must be managed to keep the machinery working properly.
If it is not, the radiator may boil over or the battery could catch fire.
Either way, the consequences may be catastrophic.
The world’s oceans are the cooling system for the Earth.
In fact, they absorb up to 95% of the excess heat in the atmosphere.
And if the oceans overheat, the consequences for humanity will be dire indeed.

It is easy to install a temperature gauge in a vehicle’s cooling system to warn if the coolant is getting too hot.
Measuring the temperature of the oceans is much more difficult.
Surface temperatures are relatively simple to monitor but determining the temperature of the deepest parts of the ocean has been almost impossible up until now.
In truth, we know more about the moon, Mars, and the rings of Saturn than we do about the deepest parts of the oceans, where the water can be several miles deep.

In 1951, Rachel Carson published The Sea Around Us, a beautifully written book about the oceans that explained in plain language everything we knew about them at the time.
I read it over the past summer and learned much that I didn’t know about the ocean.
In truth, even though people have stood on the moon since then, our knowledge of the deep oceans has advanced hardly at all since Carson’s book was published.

Oceanographers now have access to the ARGO system, a collection of buoys that can descend up to 2000 meters to measure things like salinity and temperature, then rise to the surface, where solar-powered transmitters send the data to oceanographers around the world.
ARGO is a powerful tool but it covers only a tiny portion of the world’s oceans.
Now scientists at Caltech say they have devised a system that could provide data about deep ocean temperatures and track changes going back decades.

 A global map of earthquake activity.
The earthquakes occur at the boundaries between Earth's tectonic plates.
The colors indicate the depth of the earthquakes, with red being the shallowest and green the deepest. [USGS earthquake catalogue from 2000 to 2008, magnitude of 5.0 M and above.]

How is that possible?
When you hear a siren in the distance, the pitch rises if the source is approaching and falls if it is moving away.
That is known as the Doppler shift.
That shift can be decoded electronically to determine how fast the source of the sound is travelling.
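As an aside, that decoding is simple to sketch. The formula below is the generic textbook Doppler relation for a moving source, not part of the Caltech technique, which (as described next) relies on travel-time changes rather than frequency shifts:

```python
# Generic Doppler relation for a moving source and stationary observer:
#   f_observed = f_source * c / (c - v_source)
# with c the speed of sound and v_source > 0 for an approaching source.
def source_speed_from_shift(f_source, f_observed, c=343.0):
    """Invert the Doppler formula to recover the source speed in m/s.

    c defaults to the speed of sound in air (~343 m/s at 20 C).
    Positive result: approaching; negative: receding.
    """
    return c * (1.0 - f_source / f_observed)

# A 700 Hz siren heard at 730 Hz is closing at roughly 14 m/s.
print(source_speed_from_shift(700.0, 730.0))
```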
What the scientists at Caltech discovered is that the sounds of undersea earthquakes travel great distances through the water.
But how fast they travel is a function of the temperature of the water they are travelling through.
Carefully measure the amount of time it takes for those sounds to travel underwater and you can calculate the temperature of the water.

And here’s the kicker.
There are recordings of those sounds from some areas of the world that go back decades, so now for the first time it is possible to perform the necessary calculations over a significant period of time and plot the changes.
Jörn Callies is an assistant professor of environmental science and engineering at Caltech and co-author of a study published in the September 18 edition of Science.
He says earthquake sounds are powerful and travel long distances through the ocean without significantly weakening, which makes them easy to monitor.

Wenbo Wu, postdoctoral scholar in geophysics and lead author of the paper, explains that when an earthquake happens under the ocean, most of its energy travels through the earth but a portion of it is transmitted into the water as sound.
These sound waves propagate outward from the quake’s epicenter just like seismic waves that travel through the ground, but the sound waves move at a much slower speed.
As a result, ground waves will arrive at a seismic monitoring station first, followed by the sound waves, which will appear as a secondary signal of the same event.
The effect is roughly similar to how you can often see the flash from lightning seconds before you hear its thunder.

 An artist’s rendering of undersea earthquake waves. Credit: Caltech

“These sound waves in the ocean can be clearly recorded by seismometers at a much longer distance than thunder — from thousands of kilometers away,” Wu says.
“Interestingly, they are even ‘louder’ than the vibrations traveling deep in the solid Earth, which are more widely used by seismologists.” The speed of sound in water increases as the water temperature rises, so the length of time it takes a sound to travel a given distance in the ocean can be used to deduce the water’s temperature.

“The key is that we use repeating earthquakes — earthquakes that happen again and again in the same place,” Wu adds.
“In this example we’re looking at earthquakes that occur off Sumatra in Indonesia and we measure when they arrive in the central Indian ocean.
It takes about a half hour for them to travel that distance, with water temperature causing about one tenth of a second difference.
It’s a very small fractional change, but we can measure it.”
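Wu's numbers can be turned into a back-of-the-envelope temperature estimate. The sound-speed figures below are rough, generic seawater values assumed for illustration, not the study's calibration:

```python
# Back-of-the-envelope: travel time t = D / c, so a small sound-speed
# change dc shows up as a travel-time change dt with dt/t = -dc/c.
# Assumed nominal values (illustrative, not the study's calibration):
c = 1500.0   # m/s, typical speed of sound in seawater
dc_dT = 4.0  # m/s per degree C, rough sensitivity of c to temperature

t = 1800.0   # s, about half an hour, the Sumatra-to-receiver path
dt = -0.1    # s, the sound arriving a tenth of a second earlier

dc = -c * dt / t   # implied change in average sound speed
dT = dc / dc_dT    # implied change in average water temperature

print(f"implied sound-speed change: {dc:.4f} m/s")
print(f"implied temperature change: {dT:.4f} C")
```

With these rough numbers, a tenth of a second corresponds to a few hundredths of a degree averaged over the whole path, which is why averaging many repeating quakes is needed to reach the fine sensitivity the researchers report.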

Because the researchers are using a seismometer that has been in the same location in the central Indian Ocean since 2004, they can look back at the data it collected each time an earthquake occurred in Sumatra, for example, and determine the temperature of the ocean at that time.
“We are using small earthquakes that are too small to cause any damage or even be felt by humans at all,” Wu says.
“But the seismometer can detect them from great distances, thus allowing us to monitor large scale ocean temperature changes on a particular path in one measurement.”

The process can detect temperature changes as little as a thousandth of a degree.
Using just the data from the Sumatra area, the researchers say the temperature of the Indian Ocean has risen 0.044 degrees Celsius over the past decade, a result that correlates well with the data provided by the ARGO network.
The findings even suggest the oceans are warming faster than predicted, although those results are preliminary.

The new research tool is “quite extraordinary and very promising,” says Susan Wijffels, a leader of the ARGO project at the Woods Hole Oceanographic Institution.
What excites her about the new technique is that it allows researchers to examine old seismic records that predate the beginning of the ARGO project.
“What a gift to the climate community that would be,” she says.

Because undersea earthquakes happen all over the world, Callies thinks it should be possible to expand the system to monitor water temperatures in all the world’s oceans. “We think we can do this in a lot of other regions and, by doing this, we hope to contribute to the data about how our oceans are warming.” Wu adds that because the technique makes use of existing infrastructure and equipment, it would be quite inexpensive to implement on a global basis.

Of course, having such information available is one thing.
Acting on it is another.
In today’s world, there are any number of powerful corporations who prefer to put masking tape over any global temperature gauge to disguise the danger ahead.
They are only too happy to put profits ahead of people, little realizing that if there are no people, there will be no profits.
There is also an anti-science cult that actively denigrates any and all science.
Either of those forces could delay action to address the factors causing our planet to overheat to the point where corrective action becomes impossible.

Knowing how ocean temperatures are rising is vital information but overcoming the cabal of ignorance and stupidity dedicated to ignoring global heating is even more important.
In the upcoming election, vote as if your life depends upon it, because it does.

Links :

Sunday, September 20, 2020

Huge waves crash against swaying North Sea oil rig

This video could make you seasick...
Huge waves crash against a swaying oil rig, as a severe storm which swept across parts of Scotland hits the North Sea.
The footage of the Borgholm Dolphin installation was captured at the weekend by James Eaton, an offshore worker on the nearby Lomond Platform, around 145 miles east of Aberdeen.

Saturday, September 19, 2020

Image of the week : 5 tropical cyclones over the Atlantic basin

We are issuing advisories on five tropical cyclones over the Atlantic basin. This ties the record for the most tropical cyclones in that basin at one time, last set in Sept 1971.

See http://hurricanes.gov for the latest updates. #Paulette #Rene #Sally #Teddy #Vicky

Links :

Friday, September 18, 2020

Chemical tanker grounding and ENC data accuracy

Navigational chart BA2910 (scale 1:500,000)

 Localization with the GeoGarage platform (NGA raster chart)

From TankerOperator

A report by BSU into a chemical tanker grounding gives interesting insight into the importance of understanding the accuracy of electronic chart data.

 Chemical tanker Pazifik (Liquefied gas carrier)

A report by the German Federal Bureau of Maritime Casualty Investigation (BSU) into the grounding of the 42,000 dwt chemical tanker PAZIFIK in Indonesia in July 2018, published in January 2020, gives interesting insights into the importance of crews understanding the accuracy of official electronic navigation chart (ENC) data.

 Voyage planned using ChartCo

The root cause of the accident could be described as the vessel hitting a rock.
The rock was shown on the ECDIS display with a note “underwater rock (always underwater / submerged 1 MAR 2017)”.

 Passage planned using ChartCo

From this, the crew assumed that the rock was not a hazard, since the surrounding water was a comfortable 100m deep.

The crew also thought the vessel was at a safe distance from the underwater rock, since the electronic chart display had a “cross track distance” of 182m either side of the vessel, and the rock was much further away than 182m, according to the ECDIS display.

(The cross track distance is a system on ECDIS displays where vessels are given a safe corridor shown by red and green lines, rather than a specific course, taking uncertainty into account).

 Indonesian navigational chart : ID295 (scale 1:200,000)

 Indonesian navigational chart : ID268-2 (scale 1:50,000)

But in reality the rock was only 9m below the water surface, and located 400m away from where it was stated to be on the ENC.
The ENC’s stated accuracy was +/- 500m.


The rock’s location was also shown accurately on a small scale paper chart mapped in a 1904 Dutch survey, and warnings were published in “Sailing Directions” available onboard. BSU heard from local sources that several other ships have run aground on the same rock.

The route chosen was recommended by the vessel’s passage planning software.
The vessel's master was familiar with a route through the Lombok Strait, which would have added 200 nautical miles to the voyage.
He decided to take the route recommended by the software, via the Selat Sape strait between Komodo and Banta, to save the 200nm.

The vessel was loaded with 18,000 tonnes of ammonia – although no cargo escaped because only the forepeak / ballast water tanks were damaged.

It was able to refloat 5 days later after transferring cargo and ballast water to other tanks, and could proceed to a shipyard in Singapore under its own power, supported by a tug.
The repair included renewing 50m of the double bottom.

The company has decided that the vessel will avoid the Selat Sape passage from now on.

Voyage according to route planning

Deviation from route planning up until grounding
Navigation background

In its original plan (which was changed due to fishing vessels), the vessel had planned to pass the rock at a distance of 0.7 nautical miles (1300m).

Information stored in the ENC
The following supplementary information is stored for this isolated danger: "Underwater rock (always under water / submerged 1 MAR 2017)"

Its ENC was classified as “Zone of Confidence Category C”, which means a position accuracy of +/- 500m horizontally, and “full area search not achieved".

But the ECDIS was set to a cross track distance of 0.1nm (180m) on each side.

This fits company procedures, which recommend keeping a "cross track distance setting" of 2 x the vessel's beam in confined waters, or just 64.4m; this passage is considered "confined waters" in the procedural specifications, so the 0.1nm (180m) cross track was considered within limits.

There could have been an alarm in the ECDIS that the cross track was set to 180m, while the chart had an accuracy of 500m.
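Such an alarm would amount to a one-line comparison. The sketch below uses the standard CATZOC horizontal-accuracy figures; the function and table are hypothetical illustrations, not part of any real ECDIS interface:

```python
# Sketch of a plausibility check: warn when the ECDIS cross-track
# corridor is narrower than the chart's stated position uncertainty.
# Standard CATZOC horizontal position accuracies in metres (ZOC A1 is
# really 5 m + 5% of depth; simplified here for illustration).
CATZOC_HORIZONTAL_ACCURACY_M = {
    "A1": 5.0,
    "A2": 20.0,
    "B": 50.0,
    "C": 500.0,
    "D": 500.0,  # "worse than ZOC C"; treated as at least 500 m
}

def xtd_warning(xtd_m, catzoc):
    """Return a warning when the cross-track distance is smaller than
    the chart's horizontal position accuracy, as in the Pazifik case."""
    accuracy = CATZOC_HORIZONTAL_ACCURACY_M[catzoc]
    if xtd_m < accuracy:
        return (f"WARNING: cross-track distance {xtd_m:.0f} m is less "
                f"than the chart's +/-{accuracy:.0f} m accuracy (ZOC {catzoc})")
    return "OK"

# The Pazifik grounding: a 180 m corridor on a Category C chart.
print(xtd_warning(180.0, "C"))
```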

The crew could have brought up data about the chart accuracy on the ECDIS display, including both horizontal and vertical accuracy, such as for submerged rocks.
But it was quite hard to understand how to use it, BSU says.

 Chart p. 138 UKHO Sailing Directions NP34
The electronic version of the UKHO's sailing directions available on board (e-NP 34, Indonesia Pilot Volume 2) states the following for the Selat Sape route (6.99):
"The passage E of Pulau Banta is navigable but is seldom used, other than by ferries and other local craft, as tidal streams are strong and fewer anchorages are available [...]."

If the vessel had been navigating with paper charts, the crew would probably have been more considerate of possible inaccuracies in the chart, and looked up all the “Sailing Directions” if the vessel was going to an area the master was not familiar with.

Or concerns about using paper charts in an unknown area may have led the crew to take on a pilot, who may have had his own accurate soundings map, or had better local knowledge, BSU said.

Sailing directions

The relevant section of Sailing Directions for the strait between Komodo and Banta states "The passage E of Pulau Banta is navigable but is seldom used, other than by ferries and other local craft, as tidal streams are strong and fewer anchorages are available."

The Sailing Direction for the island of Tokohgilibanta states, "a drying rock, 1 mile farther NNW, is small and dangerous; the breakers on it being indistinguishable from the normal overfalls and sea conditions in the area."
(Confusingly, on the ENC, the name of the island changes from Tokohgilibanta to Nisabedi when the viewer zooms in).
This description reflects the location of the rock where the vessel ran aground.

A digital version of these sailing directions would have been available onboard, but without any reference to the ENC, which would be required for the computer to connect them.

BSU says that the ECDIS could be described as "not fully engineered", since it displaces sources of information such as paper sailing directions without being a consistent replacement for them.

 Indonesian ENC

"There are significant differences between traditional voyage planning using paper charts and digital voyage planning using ENCs. Planning a voyage using paper charts often entails referring to sailing directions, the list of lights and pilot charts with proposed routes plotted.”

“Besides drawing on their experience, officers of the navigational watch therefore refer to sources of data other than the navigational chart. Paper charts and sailing directions have developed over centuries and became more accurate in many areas."

"Most of the world's sea areas are looked upon as being inaccurately surveyed, while paper charts only provide an indication of the data of a survey."

"The accident is therefore attributable to the ECDIS and settings specified," BSU says.

More details

The vessel ran aground on a shoal between the islands of Komodo and Banta, Indonesia.

 Waves breaking on rocks

 Rocks above water

It was using a Transas ECDIS, with PassageManager software from ChartCo, with ENCs supplied by ChartCo using data from the Indonesian Hydrographic Office.
It was using voyage planning software "BonVoyage System" (BVS) from StormGeo.

The ChartCo software proposed a route via Selat Sape, passing between Banta and Komodo, going between the tiny islands of Nisabedi and Lubuhtare, which have only 1.5nm between them.
The master and officer decided not to take this route, but instead take a route between the islands of Nisabedi and Banta, which have 2.5nm between them.

BSU looked at the Indonesian and UK Hydrographic paper charts of the region.
The UKHO charts (both 1:500,000) show a "rock awash" symbol, denoting a rock that is awash at chart datum and covers and uncovers with the tide.

The Indonesian smaller scale chart (1:200,000) shows a rock symbol without specifying whether it is sometimes submerged, while the larger scale Indonesian chart (1:50,000) shows a shallow area with water depth of 9m.
This chart was drawn from Dutch surveys carried out in 1904.

 Proper rock designation in official paper charts according to INT1

The ENC shows the shoal 2 cable lengths (400m) from the scene of the accident, with a note saying, "underwater rock (always underwater / submerged 1 MAR 2017)". The general water depth around the rock is about 100m.

The reason for the discrepancy between the ENC and paper chart is not clear.

The ENC has a "Zone of Confidence Category C" (CATZOC), which means a position accuracy of +/- 500m horizontally and "full area search not achieved". This data could be fed into the ECDIS to illustrate the cross-track distance margin needed.
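To make the CATZOC point concrete, here is a minimal sketch of how passage-planning software could turn a zone-of-confidence category into a minimum cross-track margin. The CATZOC accuracy values follow the standard IHO definitions; the margin formula and base figure are illustrative assumptions, not from the BSU report.

```python
# Horizontal position accuracy per Zone of Confidence category, in metres
# (standard IHO CATZOC values).
CATZOC_POSITION_ACCURACY_M = {
    "A1": 5,
    "A2": 20,
    "B": 50,
    "C": 500,
    "D": None,   # worse than ZOC C
    "U": None,   # unassessed
}

def minimum_xtd_m(zoc: str, base_margin_m: float = 200.0) -> float:
    """Return a minimum cross-track margin: an assumed base navigational
    margin plus the chart's stated horizontal uncertainty.  Unassessed
    data falls back to a conservative default."""
    accuracy = CATZOC_POSITION_ACCURACY_M.get(zoc)
    if accuracy is None:
        accuracy = 1000.0  # conservative assumption for ZOC D/U
    return base_margin_m + accuracy

# For ZOC C (+/- 500 m), a 200 m base margin becomes a 700 m cross-track distance.
print(minimum_xtd_m("C"))  # 700.0
```

The point of the sketch: on ZOC C data, the chart's own stated uncertainty dwarfs any ordinary navigational margin, which is why BSU highlights feeding CATZOC into route checking.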

IHO has a Data Quality Working Group looking at options for improving user awareness and the presentation of data quality information.

Thursday, September 17, 2020

Robots go their own way deep in the ocean

Firms like Rovco are trying to make underwater vehicles more intelligent

From BBC by Ben Morris

"It's very common," says Jess Hanham casually, when asked how often he finds suspected unexploded bombs.
Mr Hanham is a co-founder of Spectrum Offshore, a marine survey firm that does a lot of work in the Thames Estuary.
His firm undertakes all sorts of marine surveying, but working on sites for new offshore wind farms has become a big business for him.
Work in the Thames Estuary, and in other areas that were targets of bombing in World War Two, is likely to involve picking up signals from unexploded munitions.
"You can find a significant amount of contacts that need further investigation and for a wind farm that will be established in the initial pre-engineering survey," he says.

Image copyright Spectrum Offshore
Wind farm operators are important customers for Jess Hanham

With that information project managers can decide whether to place turbines and other equipment a safe distance from the suspected bombs, or have them blown up by a specialist firm.

At the moment marine surveying is done by teams who go out on boats, collect the data and bring it back for analysis.

Sometimes that will involve a relatively small vessel with two crew members, a surveyor and his kit. But bigger inspection projects further out to sea can involve much larger boats, with dozens of crew members, costing in the region of £100,000 per day.

The sensor equipment varies according to the job.
Sometimes it might be a sonar array towed behind the boat, for other jobs it might be an underwater unmanned vehicle, which can be controlled by surveyors on the surface.

Bad weather can disrupt the work and make life uncomfortable.
"I've been at sea in force nine and force 10 gales and they're not nice places to work," says Brian Allen, chief executive of Rovco.

His company is one of several looking to disrupt that market using artificial intelligence (AI) systems. They see a future where underwater robots, known as autonomous underwater vehicles (AUVs), will be able to do the surveying work without much human oversight, and send the data back to surveyors in the office.

Bristol-based Rovco is working on key pieces of the technology.
It has trained an AI system to recognise objects on the seabed from data collected at sea, a process which took four years.
Adding AI means the data does not have to be analysed by a human on the ship, or taken back to shore for assessment.
That work is actually being done there and then by the AI, which can operate on the ship, or soon on the underwater robot itself.
"Without AI autonomous underwater robots are pretty dumb - only being able to follow pipelines and cables in pre-programmed lines," says Mr Allen.
"Enabling the AUV to analyse data in real time means you can actually instruct the robot to do other things. If you come across a problem, the survey can be stopped, and more data collected, with the robot making decisions for itself," he says.
So, for example, if the AI flagged up something that looked like an unexploded bomb it could stop, go back, and do further analysis.

Image copyright ROVCO
The goal is for robots to survey underwater structures with minimal human oversight

For some jobs, like dismantling underwater oil and gas infrastructure, engineers need to know the exact dimensions and locations of the equipment.

To help with this Rovco has also developed a vision system that produces accurate maps of underwater infrastructure.
The system generates a 3D cloud of individual data points, a format used in computer-aided design (CAD) and other modelling software.
It combines those points with camera images to generate a realistic 3D reconstruction.
Rovco is currently bringing together the vision system, the AI and the underwater vehicle into one package.
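The core step behind such a vision system can be sketched simply: project each 3D point into a camera image and attach the pixel's colour to it. The pinhole-projection sketch below is a generic illustration of that fusion step, not Rovco's actual pipeline; all names and numbers are invented for the example.

```python
def project_point(point, fx, fy, cx, cy):
    """Project a 3D point (camera coordinates, metres) to pixel (u, v)
    with a pinhole camera model: focal lengths fx, fy and principal
    point (cx, cy)."""
    x, y, z = point
    if z <= 0:
        return None  # point is behind the camera
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

def colourise(points, image, fx, fy, cx, cy):
    """Attach an RGB colour from `image` (a list of pixel rows) to each
    3D point that projects inside the image."""
    h, w = len(image), len(image[0])
    coloured = []
    for p in points:
        uv = project_point(p, fx, fy, cx, cy)
        if uv is None:
            continue
        u, v = int(uv[0]), int(uv[1])
        if 0 <= u < w and 0 <= v < h:
            coloured.append((p, image[v][u]))
    return coloured

# Tiny 2x2 "image" and two points one metre in front of the camera.
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
points = [(-0.5, -0.5, 1.0), (0.5, 0.5, 1.0)]
print(colourise(points, image, fx=1.0, fy=1.0, cx=1.0, cy=1.0))
```

Repeating this across many camera views, with surface reconstruction on top, is what turns a raw point cloud into the "realistic 3D reconstruction" the article describes.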

Other companies are also racing to introduce AI into the industry.

Image copyright MODUS 
Companies are racing to introduce AI into their underwater vehicles

Jake Tompkins is the chief executive of UK-based Modus, which owns a fleet of 12 unmanned underwater vehicles. It is about to start a two-year programme with Durham University to develop an artificial intelligence control system that would allow some of its underwater vehicles to recognise their location, objects and anomalies during a survey.

He says that partnering with Durham is a very efficient way to develop the technology, because the university already has proven AI systems for the car and aerospace industries.

Using autonomous subsea robots to survey the seabed and inspect underwater structures would be a "game changer", according to Mr Tompkins and should "significantly" cut costs.

He thinks it won't be long before underwater robots will be stationed out at sea, perhaps at an offshore wind farm, or at an oil or gas facility.

When needed, they will be woken up and sent to harvest data, which will be sent back to an onshore control centre for processing.

Image copyright MODUS 
In the coming years underwater vehicles will be based offshore

"I think we're probably two or so years away from the first commercial deployment of field-resident autonomous vehicles, but that is certainly where we are heading," says Mr Tompkins.

His company is currently working on ways to keep the AUVs charged while they are out at sea and on technology that allows them to send back data.

There is a juicy prize for the firms that can make such intelligent underwater robots work. Over the next decade the offshore wind market is expected to see "quite extraordinary" growth, according to Søren Lassen, head of offshore wind research at the consultancy Wood Mackenzie.

At the moment only six countries have a commercial-scale offshore wind power industry.
In 10 years' time, he forecasts that 20 countries will have joined that club.

Image copyright ROVCO 
Rapid installation of wind farms is likely to drive demand for autonomous underwater vehicles

Last year 29 gigawatts of offshore wind capacity was connected to electricity grids around the world.
In 2029 Wood Mackenzie forecasts that number will hit 180 gigawatts.
That will involve building thousands of wind turbines and laying thousands of kilometres of cable to connect them, and all of that work will need the services of underwater surveyors.
By 2029, much of that work at sea might be done by autonomous systems, with humans back at the office.
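A quick back-of-the-envelope check (my calculation, not from the article) shows what growth rate that forecast implies:

```python
# Implied compound annual growth rate (CAGR) if connected offshore wind
# capacity rises from 29 GW to the forecast 180 GW over ten years.
start_gw, end_gw, years = 29, 180, 10
cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 20% per year
```

Sustaining roughly 20% annual growth for a decade is what Wood Mackenzie's "quite extraordinary" label amounts to in numbers.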

Jess Hanham will keep his business up to date with the latest technology, but fears the work will become less rewarding.
"I love the variety. For me being stuck in the office - I'd hate that. Going out and doing survey work, coming back and seeing the whole thing from start to finish - it gives you ownership of the work. I thoroughly enjoy that. If we were to lose part of that, I think that would be a real shame."

Links:

Wednesday, September 16, 2020

The origin of North Korea's 'ghost boats'

From BBC by Ian Urbina

After exhausting areas close to home, China’s vast fishing fleet has moved into the waters of other nations, depleting fish stocks.
But more than seafood is at stake.

For years, no one knew why dozens of battered wooden “ghost boats” – often along with corpses of North Korean fishermen whose starved bodies were reduced to skeletons – were routinely washing ashore along the coast of Japan.

A recent investigation I did for NBC News, based on new satellite data, has revealed, however, what marine researchers now say is the most likely explanation: China is sending a previously invisible armada of industrial boats to illegally fish in North Korean waters, forcing out smaller North Korean boats and leading to a decline in once-abundant squid stocks of more than 70%.
The North Korean fishermen washing up in Japan apparently ventured too far from shore in a vain search for squid and perished.

The Chinese vessels – more than 700 of them last year – appear to be in violation of United Nations sanctions that prohibit foreign fishing in North Korean waters.
The sanctions, imposed in 2017 in response to the country’s nuclear tests, were aimed at punishing North Korea by not allowing it to sell fishing rights in its waters in exchange for valuable foreign currency.

The South China Sea is home to some of the world's largest fisheries, but they have been severely overfished
(Credit: Getty Images)
The new revelations cast light on the lack of governance of the world’s oceans and raise thorny questions about the consequences of China’s expanding role at sea, and how it is connected to the nation’s geopolitical aspirations.

Estimates of the total size of China’s global fishing fleet vary widely.
By some calculations, China has anywhere from 200,000 to 800,000 fishing boats, accounting for nearly half of the world’s fishing activity.
The Chinese government says its distant-water fishing fleet, or those vessels that travel far from China’s coast, numbers roughly 2,600.
But other research, such as this study by the Overseas Development Institute, puts this number closer to 17,000, with many of these ships being invisible like those that satellite data discovered in North Korean waters.
By comparison, the United States’ distant-water fishing fleet has fewer than 300 vessels.

China is not only the world’s biggest seafood exporter; its population also accounts for more than a third of all fish consumption worldwide.
Having depleted the seas close to home, the Chinese fishing fleet has been sailing farther afield in recent years to exploit the waters of other countries, including those in West Africa and Latin America, where enforcement tends to be weaker as local governments lack the resources or inclination to police their waters.
Most Chinese distant-water ships are so large that they scoop up as many fish in one week as local boats from Senegal or Mexico might catch in a year.

Many of the Chinese ships combing Latin American waters target forage fish, which are ground into fishmeal, a protein-rich pelletised supplement fed to aquaculture fish.
The Chinese fleet has also focused on shrimp.
There is also a large appetite in China for the endangered totoaba fish, much prized in Asia for the alleged medicinal properties of their swim bladders, which can sell for between $1,400 and $4,000 (£1,080 and £3,090) each.

Nowhere at sea is China more dominant than in squid fishing, as the country’s fleet accounts for 50-70% of the squid caught in international waters, effectively controlling the global supply of the popular seafood.
At least half of the squid landed by Chinese fishermen pulled from the high seas is exported to Europe, north Asia and the United States.

To catch squid, the Chinese fleet typically uses trawl nets stretched between two vessels, a practice widely criticized by conservationists because it inadvertently and wastefully kills large numbers of other fish.
Critics also accuse China of keeping high-quality squid for domestic consumption and exporting lower-quality products at higher prices.
In addition, critics say, China overwhelms vessels from other countries in major squid breeding grounds and is in a position to influence international negotiations about conservation and distribution of global squid resources for its own interests.

Squid are highly sought-after in many countries, and China is the most active nation in squid fishing (Credit: Alamy)

China’s global fishing fleet did not grow into a modern behemoth on its own.
The government has robustly subsidised the industry, spending billions of yuan annually.
Chinese boats can travel so far partly because of a tenfold increase in diesel fuel subsidies between 2006 and 2011 (Beijing stopped releasing statistics after 2011, according to a Greenpeace study).

For over a decade, the Chinese government has helped pay to construct bigger, more advanced steel-hulled trawlers, even sending medical ships to fishing grounds to enable the fleet to stay at sea longer.
The Chinese government supports the squid fleet in particular by providing it with an informational forecast of where to find the most lucrative squid stocks, using data gleaned from satellites and research vessels.

On its own, distant-water squid fishing is a money-losing business, according to research by Enric Sala, founder and leader of the National Geographic Society’s Pristine Seas project.
The sale price of squid typically does not come close to covering the cost of the fuel required to catch the fish, Sala found.

Still, China is hardly the worst offender when it comes to such subsidies, which conservationists say, along with over-capacity of fishing vessels and illegal fishing, are a major reason that the oceans are rapidly running out of fish.
The countries that provide the largest subsidies to their high-seas fishing fleets are Japan (20% of the global subsidies) and Spain (14%), followed by China, South Korea, and the US, according to Sala’s research.

More recently, the Chinese government has stopped calling for an expansion of its distant-water fishing fleet and released a five-year plan in 2017 that restricts the total number of offshore fishing vessels to under 3,000 by 2021.
Daniel Pauly, a marine biologist and principal investigator for The Sea Around Us Project at The University of British Columbia, said he believes that the Chinese government is serious in wanting to restrict its distant-water fleet.
“Whether they can enforce the planned restrictions onto their fleet is another question,” he added.

Other attempts to rein in China’s fishing fleet, however, have been slow.
Imposing reforms and policing them is difficult partly because laws are lax, much of the workforce on vessels is illiterate, many ships are unlicensed or lack unique names or the identifying numbers needed for tracking, and the country’s fishery research institutions often refuse to standardise or share information domestically or abroad.

The South China Sea is a hotly contested ocean, with several countries laying claim to regions of the sea (Credit: Getty Images)

Chinese fishing boats are notoriously aggressive and often shadowed, even on the high seas or in other countries’ national waters, by armed Chinese Coast Guard vessels.
While reporting at sea, my photographer and I filmed 10 illegal Chinese squid ships crossing into North Korean waters.
Our reporting team was forced to divert its course to avoid a dangerous collision after one of the Chinese fishing captains suddenly swerved toward the team’s boat, coming within 10 metres, likely intending to ward off the boat.

China has sought to extend its maritime reach through more traditional means, too.
The government has been expanding its naval force, while also dispatching at least a dozen advanced research vessels that prospect for minerals, oil and other natural resources.

But China’s fishing fleet, too, is routinely cast by Western military analysts as a vanguard “civilian militia” that functions as “a non-uniformed, unprofessional force without proper training and outside of the frameworks of international maritime law, the military rules of engagement, or the multilateral mechanisms set up to prevent unsafe incidents at sea”, Greg Poling, director of the Asia Maritime Transparency Initiative at the Center for Strategic and International Studies, wrote recently in Foreign Policy.

Nowhere is China’s fishing fleet more omnipresent than in the South China Sea, which is among the most hotly contested regions in the world, with competing historical, territorial and even moral claims from China, Vietnam, the Philippines, Malaysia, Brunei, Taiwan and Indonesia.
Aside from fishing rights, the interests in these waters stem from a tangled morass of national pride, lucrative subsea oil and gas deposits, and a political desire for control over a region through which a third of the world’s maritime trade flows.

In the South China Sea, the Spratly islands have attracted most attention as the Chinese government has built artificial islands on reefs and shoals in these waters, and militarised them with aircraft strips, harbours and radar facilities.
Chinese fishing boats bolster the effort by swarming the zone, crowding and intimidating potential competitors, as they did in 2018, when more than 90 fishing ships were suddenly dispatched to drop anchor within several miles of Philippines-held Thitu Island immediately after the Philippine government began modest upgrades to the island’s infrastructure.

The overfishing of waters close to North Korea is thought to be the reason that boats containing the bodies of fishermen washed up on the shores of Japan (Credit: Getty Images)

In justifying its rights over the region, Beijing usually makes a so-called “nine-dash line” argument, which relies on maps of historic fishing grounds that feature a line made of nine dashes encompassing most of the South China Sea as belonging to China.
Partly because China ignores most of the criticism, and partly because China is economically and otherwise dominant on the global stage, there is a tendency in Western media to lay blame on China for many of the same actions of which the US and Europe have been guilty – in the past or presently.
And while defining what is true or fair in the South China Sea may be no easier than it has proven to be in places like the Middle East, most legal scholars and historians say the nine-dash line argument has no basis in international law, and it was found to be invalid in a 2016 international court ruling.

Clashes over fishing grounds involving the Chinese are not limited to the South China Sea.
Japan and China are at odds over the Senkaku Islands, known in Chinese as the Diaoyu or “fishing” islands.
Elsewhere, an Argentine Coast Guard vessel fired a warning shot to halt a Chinese ship’s escape to international waters in March 2016.
When the Chinese ship, the Lu Yan Yuan Yu, responded by trying to ram the Argentine vessel, the Coast Guard ship capsized the fishing vessel.
Some of the Chinese crew escaped by swimming out to other Chinese vessels, while others were rescued by the Coast Guard.

From the waters of North Korea to Mexico to Indonesia, incursions by Chinese fishing ships are becoming more frequent, brazen and aggressive.
It hardly takes a great feat of imagination to picture how a seemingly civilian clash could rapidly escalate into a bigger military conflict.
Such confrontations also raise humanitarian concerns about fishermen becoming collateral damage, and environmental questions about the government policies accelerating ocean depletion.
But above all, the reach and repercussions of China’s at-sea ambitions highlight anew that the real price of fish is rarely what appears on the menu.

Links:

Tuesday, September 15, 2020

Microsoft finds underwater datacenters are reliable, practical and use energy sustainably

 Submarine datacenter : project Natick

From Microsoft by John Roach 

Earlier this summer, marine specialists reeled up a shipping-container-size datacenter coated in algae, barnacles and sea anemones from the seafloor off Scotland’s Orkney Islands.

The retrieval launched the final phase of a years-long effort that proved the concept of underwater datacenters is feasible, as well as logistically, environmentally and economically practical.

Microsoft’s Project Natick team deployed the Northern Isles datacenter 117 feet deep to the seafloor in spring 2018.
For the next two years, team members tested and monitored the performance and reliability of the datacenter’s servers.

The team hypothesized that a sealed container on the ocean floor could provide ways to improve the overall reliability of datacenters.
On land, corrosion from oxygen and humidity, temperature fluctuations and bumps and jostles from people who replace broken components are all variables that can contribute to equipment failure.

The Northern Isles deployment confirmed their hypothesis, which could have implications for datacenters on land.

Lessons learned from Project Natick also are informing Microsoft’s datacenter sustainability strategy around energy, waste and water, said Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick.

What’s more, he added, the proven reliability of underwater datacenters has prompted discussions with a Microsoft team in Azure that’s looking to serve customers who need to deploy and operate tactical and critical datacenters anywhere in the world.
“We are populating the globe with edge devices, large and small,” said William Chappell, vice president of mission systems for Azure.
“To learn how to make datacenters reliable enough not to need human touch is a dream of ours.”

Phase 2 Project Natick, a successful challenge for Naval Group

Proof of concept

The underwater datacenter concept splashed onto the scene at Microsoft in 2014 during ThinkWeek, an event that gathers employees to share out-of-the-box ideas.
The concept was considered a potential way to provide lightning-quick cloud services to coastal populations and save energy.

More than half the world’s population lives within 120 miles of the coast.
By putting datacenters underwater near coastal cities, data would have a short distance to travel, leading to fast and smooth web surfing, video streaming and game playing.
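The latency argument is easy to quantify. The sketch below is my own rough calculation (not Microsoft's figures), using the usual rule of thumb that light in optical fibre travels at about two-thirds of its vacuum speed:

```python
# Round-trip propagation delay over optical fibre as a function of distance.
FIBRE_SPEED_KM_S = 200_000  # approx. speed of light in fibre (~2/3 c)

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

# A user ~120 miles (about 193 km) from a coastal datacenter,
# versus one served from 2,000 km inland.
print(round(round_trip_ms(193), 2))   # ~1.93 ms
print(round(round_trip_ms(2000), 2))  # ~20.0 ms
```

Propagation delay is only one component of real-world latency, but it sets a hard floor that shrinks with proximity, which is the case for coastal siting.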

The consistently cool subsurface seas also allow for energy-efficient datacenter designs.
For example, they can leverage heat-exchange plumbing such as that found on submarines.

Microsoft’s Project Natick team proved the underwater datacenter concept was feasible during a 105-day deployment in the Pacific Ocean in 2015.
Phase II of the project included contracting with marine specialists in logistics, shipbuilding and renewable energy to show that the concept is also practical.

“We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more,” Cutler said.
“We have done what we need to do. Natick is a key building block for the company to use if it is appropriate.”

Microsoft’s Project Natick team used a gantry barge to retrieve the Northern Isles datacenter from the seafloor off Scotland’s Orkney Islands.
A coat of algae, barnacles and sea anemones grew on the datacenter during its two-year deployment.
Photo by Simon Douglas. 

The Northern Isles datacenter was retrieved from the seafloor off Scotland’s Orkney Islands and towed partially submerged between the pontoons of a gantry barge to a dock in Stromness, Orkney. A coat of algae, barnacles and sea anemones grew on the datacenter during its two-year deployment. Photo by Jonathan Banks.

The Northern Isles datacenter was attached to a ballast-filled triangular base for deployment 117 feet deep on the seafloor off the Orkney Islands in Scotland in 2018.
The Project Natick team retrieved the datacenter this summer.
Two years underwater provided time for a thin coat of algae and barnacles to form on the steel tube, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled triangular base.
Photo by Jonathan Banks. 

The Northern Isles was gleaming white when deployed.
Two years underwater provided time for a thin coat of algae and barnacles to form on the steel tube.
Microsoft Project Natick team members said swift ocean currents at the deployment site limited growth of marine life.
Photo by Jonathan Banks. 

Stephane Gouret of Naval Group checks out a sea anemone that grew in a sheltered nook of the ballast-filled base for the Northern Isles underwater datacenter.
Microsoft’s Project Natick team deployed the datacenter to the seafloor off the coast of the Orkney Islands in Scotland where it operated for two years.
Photo by Jonathan Banks.

Members of the Project Natick team power wash the Northern Isles underwater datacenter, which was retrieved from the seafloor off the Orkney Islands in Scotland.
Two years underwater provided time for a thin coat of algae and barnacles to form on the steel tube, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled triangular base.
Photo by Jonathan Banks. 

A seabird alights atop the Northern Isles underwater datacenter after it was retrieved from the seafloor off the Orkney Islands in Scotland and cleaned.
The datacenter operated on the seafloor for two years as part of Microsoft’s Project Natick.
The project is a years-long effort to prove the underwater datacenter concept is feasible as well as logistically, environmentally and economically practical.
Photo by Jonathan Banks. 

Members of Microsoft’s Project Natick team use a bucket lift to collect air samples from the Northern Isles underwater datacenter, which was filled with dry nitrogen and sealed prior to deployment to the seafloor off the Orkney Islands in Scotland.
Another team member cuts the datacenter from its ballast-filled base in preparation for transport to the mainland. Photo by Jonathan Banks. 

Members of the Project Natick team remove the endcap from the Northern Isles underwater datacenter at Global Energy Group’s Nigg Energy Park facility in the North of Scotland.
The datacenter was filled with dry nitrogen and spent two years on the seafloor off the Orkney Islands as part of a years-long effort to prove the underwater datacenter concept is feasible as well as logistically, environmentally and economically practical. Photo by Jonathan Banks. 

Members of the Project Natick team inspect the inside of the Northern Isles underwater datacenter at Global Energy Group’s Nigg Energy Park facility in the North of Scotland after the endcap’s removal. When deployed on the seafloor, cabling connected the underwater datacenter to the Orkney Island power grid, which is supplied 100% by renewable energy technologies. 

Spencer Fowers, a principal member of technical staff for Microsoft’s Special Projects research group, removes a server from the Northern Isles datacenter at Global Energy Group’s Nigg Energy Park facility in the North of Scotland.
Project Natick researchers will analyze it to help determine why the servers in the underwater datacenter were eight times more reliable than those in a replica datacenter on land.
Photo by Jonathan Banks. 

Members of the Project Natick team remove 12 racks of servers and related cooling system infrastructure from the Northern Isles underwater datacenter.
The servers in the underwater datacenter were eight times more reliable than those in a replica datacenter on land.
Photo by Jonathan Banks. 

Mike Shepperd, a senior research and development engineer with Microsoft’s research organization, stands in front of the barnacle-encrusted Northern Isles underwater datacenter.
The datacenter was deployed to the seafloor off the Orkney Islands in Scotland as part of Project Natick, a years-long effort to prove the underwater datacenter concept is feasible as well as logistically, environmentally and economically practical. Photo by Jonathan Banks.

Algae, barnacles and sea anemones

The Northern Isles underwater datacenter was manufactured by Naval Group and its subsidiary Naval Energies, experts in naval defense and marine renewable energy. Green Marine, an Orkney Island-based firm, supported Naval Group and Microsoft on the deployment, maintenance, monitoring and retrieval of the datacenter, which Microsoft’s Special Projects team operated for two years.

The Northern Isles was deployed at the European Marine Energy Centre, a test site for tidal turbines and wave energy converters.
Tidal currents there travel up to 9 miles per hour at peak intensity and the sea surface roils with waves that reach more than 60 feet in stormy conditions.

The deployment and retrieval of the Northern Isles underwater datacenter required atypically calm seas and a choreographed dance of robots and winches that played out between the pontoons of a gantry barge. The procedure took a full day on each end.

The Northern Isles was gleaming white when deployed. Two years underwater provided time for a thin coat of algae and barnacles to form, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled base.

“We were pretty impressed with how clean it was, actually,” said Spencer Fowers, a principal member of technical staff for Microsoft’s Special Projects research group. “It did not have a lot of hardened marine growth on it; it was mostly sea scum.”

A member of the Project Natick team power washes the Northern Isles underwater datacenter, which was retrieved from the seafloor off the Orkney Islands in Scotland.
Two years underwater provided time for a thin coat of algae and barnacles to form on the steel tube, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled triangular base. Photo by Simon Douglas.

Power wash and data collection

Once it was hauled up from the seafloor and prior to transportation off the Orkney Islands, the Green Marine team power washed the water-tight steel tube that encased the Northern Isles’ 864 servers and related cooling system infrastructure.

The researchers then inserted test tubes through a valve at the top of the vessel to collect air samples for analysis at Microsoft headquarters in Redmond, Washington.
“We left it filled with dry nitrogen, so the environment is pretty benign in there,” Fowers said.

The question, he added, is how gases that are normally released from cables and other equipment may have altered the operating environment for the computers.

The cleaned and air-sampled datacenter was loaded onto a truck and driven to Global Energy Group’s Nigg Energy Park facility in the North of Scotland.
There, Naval Group unbolted the endcap and slid out the server racks as Fowers and his team performed health checks and collected components to send to Redmond for analysis.

Among the components crated up and sent to Redmond are a handful of failed servers and related cables.
The researchers think this hardware will help them understand why the servers in the underwater datacenter are eight times more reliable than those on land.
“We are like, ‘Hey this looks really good,’” Fowers said.
“We have to figure out what exactly gives us this benefit.”

The team hypothesizes that the atmosphere of nitrogen, which is less corrosive than oxygen, and the absence of people to bump and jostle components, are the primary reasons for the difference.
If the analysis proves this correct, the team may be able to translate the findings to land datacenters.
“Our failure rate in the water is one-eighth of what we see on land,” Cutler said.
“I have an economic model that says if I lose so many servers per unit of time, I’m at least at parity with land,” he added.
“We are considerably better than that.”
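Cutler's parity argument can be sketched numerically. In this illustrative snippet (not Microsoft's actual model), the server count of 864 comes from the article, while the 2% annual land failure rate is a hypothetical figure chosen purely for demonstration:

```python
# Illustrative sketch of the parity comparison (hypothetical numbers,
# not Microsoft's actual economic model).
def failure_ratio(failures_underwater: float, failures_land: float) -> float:
    """Ratio of underwater to land failure rates.

    A ratio <= 1.0 means the underwater deployment is at least at
    parity with land; the article reports roughly 1/8.
    """
    return failures_underwater / failures_land

# Assumed land annual failure rate of ~2% across the 864 servers,
# versus the reported one-eighth of that rate underwater.
land_failures = 864 * 0.02            # ~17.3 servers lost per year on land
underwater_failures = land_failures / 8  # ~2.2 servers lost per year underwater

print(failure_ratio(underwater_failures, land_failures))  # 0.125
```

Under these assumptions, the underwater vessel loses far fewer servers per year than the break-even point, which is the "considerably better than parity" claim in numeric form.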

Members of the Project Natick team power wash the Northern Isles underwater datacenter, which was retrieved from the seafloor off the Orkney Islands in Scotland.
Two years underwater provided time for a thin coat of algae and barnacles to form on the steel tube, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled triangular base.
Photo by Jonathan Banks.

Energy, waste and water

Other lessons learned from Project Natick are already informing conversations about how to make datacenters use energy more sustainably, according to the researchers.

For example, the Project Natick team selected the Orkney Islands for the Northern Isles deployment in part because the grid there is supplied entirely by wind and solar power, along with experimental green energy technologies under development at the European Marine Energy Centre.

“We have been able to run really well on what most land-based datacenters consider an unreliable grid,” Fowers said.
“We are hopeful that we can look at our findings and say maybe we don’t need to have quite as much infrastructure focused on power and reliability.”

Cutler is already thinking of scenarios such as co-locating an underwater datacenter with an offshore windfarm.
Even in light winds, there would likely be enough power for the datacenter.
As a last resort, a powerline from shore could be bundled with the fiber optic cabling needed to transport data.

Other sustainability-related benefits may include eliminating the need for replacement parts.
In a lights-out datacenter, all servers would be swapped out only about once every five years; the high reliability of the servers means that the few that fail early are simply taken offline.

In addition, Project Natick has shown that datacenters can be operated and kept cool without tapping freshwater resources that are vital to people, agriculture and wildlife, Cutler noted.
“Now Microsoft is going down the path of finding ways to do this for land datacenters,” he said.

Go anywhere

Early conversations about the potential future of Project Natick centered on how to scale up underwater datacenters to power the full suite of Microsoft Azure cloud services, which may require linking together a dozen or more vessels the size of the Northern Isles.

“As we are moving from generic cloud computing to cloud and edge computing, we are seeing more and more need to have smaller datacenters located closer to customers instead of these large warehouse datacenters out in the middle of nowhere,” Fowers said.

That’s one of the reasons Chappell’s group in Azure is keeping an eye on the progress of Project Natick, including tests of post-quantum encryption technology that could secure data from sensitive and critical sectors.
The ability to protect data is core to the mission of Azure in multiple industries.

“The fact that they were very quickly able to deploy it and it has worked as long as it has and it has the level of encryption on the signals going to it combines to tell a pretty compelling vision of the future,” Chappell said.
