Thursday, June 21, 2018

Aeolus: wind satellite weathers technical storm

ESA’s Earth Explorer Aeolus satellite will be launched later this year to measure the world’s winds from space.
The satellite carries one of the most sophisticated instruments ever to be put into orbit: Aladin, which includes two powerful lasers, a large telescope and very sensitive receivers.
The laser generates ultraviolet light that is beamed down into the atmosphere to profile the world’s winds – a completely new approach to measuring the wind from space.
These vertical slices through the atmosphere, along with information it gathers on aerosols and clouds, will improve our understanding of atmospheric dynamics and contribute to climate research.
As well as advancing science, Aeolus will play an important role in improving weather forecasts.
The mission will also complement information about the atmosphere being provided by the Copernicus Sentinel missions.

From BBC by Jonathan Amos

They say there is no gain without pain, but when the European Space Agency (Esa) set out in 2002 to develop its Aeolus satellite, no-one could have imagined the grief the project would bring.

Designed to make the most comprehensive maps of winds across the Earth, the mission missed deadline after deadline as engineers struggled to get its key technology - an ultraviolet laser system - working for long enough to make the venture worth flying.

But now, 16 years on, the Aeolus satellite is finished and ready to ship to the launch pad.
And far from being snuck out the back door at night in embarrassment at the huge delay, the spacecraft will be mated to its launch rocket with something of a fanfare.

Esa is taking pride in the fact that it overcame a major technical challenge.

"Many times I remember people saying, 'there's just no point in continuing because it is simply not possible to build a UV laser for space'. But this is the DNA of Esa - we do the difficult things and we don't give up," said the agency's Earth observation director, Dr Josef Aschbacher.

It helped of course that Aeolus promises data that many experts still believe will be transformative.
From its vantage point some 320km above the planet, the laser will track the movement of molecules and tiny particles to get a handle on the direction and speed of the wind.

Currently, we measure the dynamics of the atmosphere using an eclectic mix of tools - everything from whirling anemometers to other types of satellite that judge wind behaviour from the choppiness of seawater.
But these are all limited indications, telling us what is happening in particular places or at particular heights.

Aeolus, on the other hand, will attempt to build a truly global view of what the winds are doing on Earth, from the surface of the planet all the way up through the troposphere and into the stratosphere (from 0km to 30km).

How to measure the wind from space :
  • Aeolus will fire a laser through the atmosphere and measure the return signal
  • The light will scatter back off air molecules and particles moving in the wind
  • Meteorologists will adjust their numerical models to match this information
  • The biggest benefits should be in medium-range forecasts - a few days hence
  • Aeolus should pave the way for operational weather satellites with lasers
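The physics behind the bullets above can be sketched with a quick calculation. Aladin's laser operates in the ultraviolet at 355 nm, and the line-of-sight wind speed is recovered from the Doppler shift of the backscattered light (the factor of two accounts for the round trip out and back). The wind speed below is illustrative:

```python
# Back-of-the-envelope Doppler shift for a wind lidar.
# Assumes Aladin's 355 nm ultraviolet wavelength; the wind speed is illustrative.
C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 355e-9      # Aladin's UV laser wavelength, m

def doppler_shift_hz(v_los_ms: float) -> float:
    """Frequency shift of light backscattered from air moving at v_los m/s.
    Factor 2: the light travels out and back, so the shift doubles."""
    return 2.0 * v_los_ms / WAVELENGTH

shift = doppler_shift_hz(10.0)    # a 10 m/s line-of-sight wind
carrier = C / WAVELENGTH          # the optical carrier frequency, ~844 THz
print(f"{shift / 1e6:.1f} MHz shift on a {carrier / 1e12:.0f} THz carrier")
```

Resolving a shift of tens of MHz on a carrier of hundreds of THz — roughly a part in ten million — is what makes the receivers so demanding to build.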

"The lack of wind profile observations is one of the most important gaps to fill in order to improve numerical weather prediction," Dr Florence Rabier, director-general of the European Centre for Medium-Range Weather Forecasts (ECMWF), told BBC News.
"The Aladin Doppler wind lidar instrument onboard Aeolus will be the first satellite instrument that provides wind profiles from space.
"We have very high expectations regarding the quality of the Aeolus wind profile data, and we anticipate that forecast quality will increase by 2-4% in the extra-tropics and by up to 15% in the tropics. Aeolus is paving the way for significant improvements in weather forecasting."

There is an example that meteorologists quote from March 2014 - storminess that led to flooding in northern Europe.

When they did the post-event analysis to figure out why no-one had seen it coming, the conclusion was that inaccurate wind data six days previously had been used in the models.
Dr Alain Dabas from MeteoFrance explained: "The error was in the central Pacific at an altitude of about 11km. There was a mistake in the initial winds given to the models and that propagated to Europe.
"The question now is would Aeolus have solved this problem? Probably, yes."

It goes without saying that knowing what the wind is going to do reaches beyond just the nightly weather forecast on TV.
How it blows affects the distribution and transport of pollutants, and how quickly bad air in a hazy city, say, can be cleared away.

The first Doppler Wind-Lidar in Space
Aeolus will measure global wind speeds in horizontal slices up to 30 km above the Earth’s surface and improve the performance of numerical weather forecasts.
Aeolus will bring improvement for climate research and modelling.
Aeolus will be the first satellite capable of observing wind activity in our atmosphere using laser technology to produce dynamic 3D maps.

Then there are the requirements of safety to consider - think sailors at sea, or construction on high-rise buildings. And don't forget the sectors whose whole reason to exist rests on the wind.

"For instance, the wind energy industry," said Dr Anne Grete Straume, Esa's Aeolus mission scientist.
"They're exploiting the winds and they need to know how much energy they can produce at any point in time. For that they need very accurate forecasts and we hope that our mission can help them with their management."

But all this depends on the UV laser doing its job.
The engineers are very confident now that it can.
They recently put the finished Aeolus satellite in a space chamber for six months to simulate the conditions of being in orbit.
The whole system passed with flying colours.

It is worth recalling some of the past frustrations.
The first problem was in finding diodes to generate laser light with a long enough lifetime.
When those were identified, the mission looked in great shape until engineers discovered their design wouldn't actually operate in a vacuum - a significant barrier for a space mission.

Tests revealed that in the absence of air, the laser was degrading its own optics; as the high-energy light hit the lenses and mirrors, it would blacken them.

Companies across Europe were pushed to develop new coatings for the various elements.
The key breakthrough, however, was to introduce a small amount of oxygen to the instrument to prevent surfaces carbonising.
It's a tiny puff of gas - 40 pascals' worth; the same pressure you might expect to develop from the presence of a photosynthesising plant.
But it is sufficient to oxidise contaminants and remove them.

"When we started, the only references we had were classified because these types of lasers are used to represent atomic bombs, and those technologies were totally locked out," said Anders Elfving, Esa's Aeolus project manager.
"The motivation for my team all these years was that there is no alternative, and of course the user community is still so enthusiastic for what we've built.
"We want to see what is invisible - to see the wind in clear skies. And I think active lidars like Aladin are the future - for much more accurate measurements of CO2 and other trace gases in the atmosphere."

The launch of Aeolus on a Vega rocket is currently set for 21 August.


Wednesday, June 20, 2018

Germany BSH layer update in the GeoGarage platform

86 nautical charts updated & 11 new charts added

Sailing the mysteries of old maps

From ERC

Dr Joaquim Alves Gaspar is a man of the sea.
After many years in the Portuguese Navy, he gave up plans to become an admiral in favour of pursuing a PhD in the History of Cartography.
This second career led him to receive an ERC Starting Grant, the first awarded in this budding discipline.
With his highly multidisciplinary team (he likes to say that, to work with him, one must be a mathematician fluent in Latin), and the experience obtained as a navigator and navigational instructor, Dr Gaspar hopes to understand how and when the first nautical charts were created.
His MEDEA-CHART team is the foremost group in Portugal, and probably in the world, for studying the history of nautical cartography, and he hopes this work will win the domain its rightful recognition within world history.

What is your research project about?

Our project is about the origin, the technical evolution and the use of nautical cartography in Europe.
This includes the medieval charts of the Mediterranean, what historians call portolan charts, and the early modern charts, first of the Atlantic and then of the whole world.
These charts, which preceded the Mercator projection (designed in 1569, and on which current navigation is based), didn't even consider the Earth as round!
In fact, although people of course knew about it, the constraints of navigational methods dictated that a flat-earth model be used until the mid-18th century.
The MEDEA-CHART project is about studying these apparently naïve forms of cartographic representation, which were used for so much discovery and exploration.

What do you hope to achieve with your grant?

We hope to resolve some historiographical issues which have eluded scholars of cartography for a very long time.
For example, when and how were the first nautical charts constructed?
The earliest extant chart is the Carta Pisana (1275-1280).
But we suspect a long tradition before that, and we know nothing about its development.
Also, how were they updated with new geographical information?
These issues are particularly relevant for the medieval ones, but similar questions could be asked for the more recent, so-called latitude charts of the Atlantic, which were developed by the Portuguese following the introduction of astronomical navigation.
This new model was based on the traditional charts of the Mediterranean but we don’t know exactly how it evolved from them.
These are two aspects we want to explore.
In addition, we'd like to understand how those charts were used to navigate.
We know almost nothing about that but we hope to by the end of this project.

We hope to resolve some historiographical issues which have eluded scholars of cartography for a very long time.
For example, when and how were the first nautical charts constructed?
How were they updated with new geographical information? (Carta Pisana – 1275-1280)

This research is quite unique. Was the ERC support important for the discipline itself?

Absolutely, it was the very first ERC grant in the field of the History of Cartography.
My biggest wish is to include the History of Cartography, now a bit of a niche subject, into the History of Science.
I believe it belongs in this field because of its extraordinary relevance in the period of geographic discovery and maritime expansion.
Nautical charts weren't used only for navigation but also for the construction of the first coherent image of the whole world.
They were the most important source of geographical information during a period when the world was being discovered, explored and mapped by Europeans.
When we see those lavish atlases and maps of the world of the 16th and 17th century, we don’t realise that most of that information came from nautical charts, which were instruments for navigation not intended to depict the world.
Even more surprisingly, nautical charts were constructed not by scholars, but by artisans.
They were scientific tools made and used by illiterate workers, and this is in itself quite notable for the History of Science.
Finally, for the first time, we are using a multidisciplinary approach to study these maps, an approach which is extremely powerful and has already proven its potential.

Tell us more about this multidisciplinarity.

Essentially, not only do we study the sources using the traditional methods of historical research, but we also use geometrical analysis, mathematical modelling, radiocarbon dating and multispectral imaging technology.
Seven people work with me in the team; only one is a traditional historian.
We have three physicists, a philosopher, a computer science engineer, a neuroscientist and a navy officer.
One of them is an American senior investigator and the world's leading expert on the Piri Reis map (a well-known Turkish portolan chart from the 16th century).
We look at the charts themselves, lots of them.
No written sources explaining how those charts came to be have survived, so we try to understand the creation process by examining the charts themselves, physically and mathematically, as well as interpreting the few textual sources where they are mentioned.

My biggest wish is to include the History of Cartography, now a bit of a niche subject, into the History of Science.
I believe it belongs in this field because of its extraordinary relevance in the period of geographic discovery and maritime expansion.
Nautical charts weren't used only for navigation but also for the construction of the first coherent image of the whole world.
(Anonymous Atlantic Chart – 1560)

How did you develop this passion for cartography?

I have been connected to the sea since I was a child.
I was always fascinated by maps and charts.
Charts and maps were part of my professional life in the navy but this particular interest in the History of Cartography began when I was sent to the Portuguese Navy Academy to teach cartography and hydrographic surveying.
Then I published two books on theoretical modern cartography.
That, at the time, was my real interest.
When the time came to decide about my career in the navy, about 15 years ago, I could have become an admiral but I realised that I had a bigger ambition.
I decided to start a PhD instead for which my background in the navy was ideal.
I was an expert in navigation, in hydrographical surveying and also in mathematical cartography, which are very powerful tools to approach the study of old nautical charts.

How did your career in the navy develop?

My experience in the navy was very rich.
I spent several years at sea in different kinds of ships, as a deck officer when I was very young, as an operations officer, a navigator, and then as a commanding officer.
But I also had the opportunity to study a lot.
I have a Master's in Physical Oceanography, which I obtained in the United States; I taught for many years in the Naval Academy and I served in the Hydrographic Institute as an oceanographer and an expert in navigation.
Most of what I know directly related to my research subject I learned from the navy.

I have been connected to the sea since I was a child.
I was always fascinated by maps and charts.
Charts and maps were part of my professional life in the navy.
At a point in my career, I could have become an admiral but I realised that I had a bigger ambition so I decided to start a PhD instead and study the history of cartography.
(Diogo Homem portolan – 1563).

What motivated you to apply for the ERC?

Simply put, to pass a message.
To make a significant contribution to the training of a new generation of historians of cartography.
Not traditional historians, but researchers prepared to apply a multidisciplinary approach, including physical and numerical methods.
As far as I know, there is no undergraduate degree in the History of Cartography, and the only research team in Europe solely dedicated to the subject is mine.
Being awarded an ERC grant was the only way to have the resources to pass this message.


Tuesday, June 19, 2018

Flooding from sea level rise threatens over 300,000 US coastal homes – study

Sea levels are rising. For many cities on the eastern shores of the United States, the problem is existential.
Miami and Atlantic City fight to stay above water

From The Guardian by Oliver Milman

Climate change study predicts ‘staggering impact’ of swelling oceans on coastal communities within next 30 years

Sea level rise driven by climate change is set to pose an existential crisis to many US coastal communities, with new research finding that as many as 311,000 homes face being flooded every two weeks within the next 30 years.

The swelling oceans are forecast repeatedly to soak coastal residences collectively worth $120bn by 2045 if greenhouse gas emissions are not severely curtailed, experts warn.
This will potentially inflict a huge financial and emotional toll on the half a million Americans who live in the properties at risk of having their basements, backyards, garages or living rooms inundated every other week.

“The impact could well be staggering,” said Kristina Dahl, a senior climate scientist at the Union of Concerned Scientists (UCS).
“This level of flooding would be a tipping point where people in these communities would think it’s unsustainable.
“Even homes along the Gulf coast that are elevated would be affected, as they’d have to drive through salt water to get to work or face their kids’ school being cut off. You can imagine people walking away from mortgages, away from their homes.”

The UCS used federal data from a high sea level rise scenario projected by the National Oceanic and Atmospheric Administration, and combined it with property data from the online real estate company Zillow to quantify the level of risk across the lower 48 states.

Under this scenario, where planet-warming emissions are barely constrained and the seas rise by about 6.5ft globally by the end of the century, 311,000 homes along the US coastline would face flooding on average 26 times a year within the next 30 years – a typical lifespan for a new mortgage.

The losses would multiply by the end of the century, with the research warning that as many as 2.4m homes, worth around a trillion dollars, could be put at risk.
Low-lying states would be particularly prone, with a million homes in Florida, 250,000 homes in New Jersey and 143,000 homes in New York at risk of chronic flooding by 2100.

With scientists' predictions starting to come true, Miami Beach residents must decide how to respond to the water that's invading their homes.

This persistent flooding is likely to rattle the housing market by lowering property prices and making mortgages untenable in certain areas.
Flood insurance premiums could rise sharply, with people faced with the choice of increasing clean-up costs or retreating to higher ground inland.

“Unfortunately, in the years ahead many coastal communities will face declining property values as risk perceptions catch up with reality,” said Rachel Cleetus, an economist and climate policy director at UCS.
“In contrast with previous housing market crashes, values of properties chronically inundated due to sea level rise are unlikely to recover and will only continue to go further underwater, literally and figuratively.”

The report does not factor in future technological advances that could ameliorate the impact of rising seas, although the US would be starting from a relatively low base compared with some countries given that it does not have a national sea level rise plan.
And the current Trump administration has moved to erase the looming issue from consideration for federally funded infrastructure.

The oceans are rising by about 3mm a year due to the thermal expansion of seawater that’s warming because of the burning of fossil fuels by humans.
The melting of massive glaciers in Greenland and Antarctica is also pushing up the seas – Nasa announced last week that the amount of ice lost annually from Antarctica has tripled since 2012 to an enormous 241bn tons a year.

This slowly unfolding scenario is set to pose wrenching choices for many in the US. Previous research has suggested that about 13 million Americans may have to move due to sea level rise by the end of the century, with landlocked states such as Arizona and Wyoming set for a population surge.

“My flood insurance bill just went up by $100 this year, it went up $100 the year before,” said Philip Stoddard, the mayor of South Miami.
“People on the waterfront won’t be able to stay unless they are very wealthy. This isn’t a risk, it’s inevitable.
“Miami is a beautiful and interesting place to live – I’m looking at a lizard on my windowsill right now. But people will face a cost to live here that will creep up and up. At some point they will have to make a rational economic decision and they may relocate. Some people will make the trade-off to live here. Some won’t.”


Monday, June 18, 2018

Norway NHS layer update in the GeoGarage platform

105 nautical raster charts updated

Hacking, tracking, stealing and sinking ships

Further illustrating the real-world implications, Pen Test Partners has managed to link version details for ships’ satcom terminals to live GPS position data, producing a clickable map on which vulnerable ships are highlighted with their real-time position
(the data is deliberately not refreshed, ensuring it remains out of date and useless to hackers).

From PenTestPartners by Ken Munro

Pen testers find several ways to hijack, track, steal and even sink shipping vessels

At Infosecurity Europe this year, we demonstrated multiple methods to disrupt the shipping industry, several of which haven’t been demonstrated in public before, to our knowledge.

Some of these issues were simply through poor security hygiene on board, but others were linked to the protocols used and systems provided by maritime product vendors.

Tracking and hacking ships: satellite communications

Our earlier satcom work is here but we took this much further at the show:
Shodan already publishes a ship tracker.
We think this only uses AIS data, publicly available.
We’ve broken new ground by linking satcom terminal version details to live GPS position data.

This, we think, is the first ever VULNERABLE ship tracker.
Two public data sets have been linked, so we now have a clickable map where vulnerable ships are highlighted with their real-time position.

It’s here – note that we deliberately haven’t refreshed the data in use, ensuring it is out of date so that it can’t be used by hackers.
We’ll refresh it in time.

Many satcom terminals on ships are available on the public internet.
Many have default credentials, admin/1234 being very common.
These passwords were found on a ship only two weeks ago:

So that’s an easy way to hijack the satellite communications and take admin rights on the terminal on board.

Hardware hacking the satellite terminal

We applied our expertise in IoT, automotive and SCADA hardware security to a Cobham (Thrane & Thrane) Fleet One satellite terminal.
We haven’t seen much evidence in public of anyone looking hard at maritime satcom terminal hardware security before.
They’re expensive, which may explain it!

Caveat: all of the vulnerabilities we cover here are resolved by setting a strong admin password, as per the manufacturer's guidance.
Either that, or they aren’t particularly significant.
We found much more, but the more significant findings have to be disclosed privately to Cobham first!

First, we found that the admin interfaces were over telnet and HTTP.
Pulling the firmware, we found a lack of firmware signing – the validation check was simply a CRC.
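A CRC is an error-detection code, not an integrity protection: anyone who modifies the image can simply recompute a matching check value, whereas a cryptographic signature would require the vendor's private key. A minimal sketch (the firmware bytes here are made up):

```python
# Sketch: why a CRC check does not stop firmware tampering.
import zlib

firmware = bytearray(b"ORIGINAL FIRMWARE IMAGE ...")
good_crc = zlib.crc32(firmware)

firmware[:8] = b"PATCHED!"            # attacker modifies the image...
forged_crc = zlib.crc32(firmware)     # ...and recomputes a "valid" check value

# The device's validation check passes for the tampered image too:
assert zlib.crc32(firmware) == forged_crc
```

With a signed image, the second step would fail: the attacker can recompute a CRC, but cannot produce a valid signature over the modified bytes.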

Then, we discovered that we could edit the entire web application running on the terminal.
That lends itself to attacks.

Further, there was no rollback protection for the firmware.
This means that a hacker with some access could elevate privilege by installing an older more vulnerable firmware version.
Finally, we found the admin interface passwords were embedded in the configs, hashed with unsalted MD5.
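Unsalted MD5 means identical passwords always produce identical hashes, so a precomputed dictionary reverses them instantly. A minimal sketch (the wordlist and the "stolen" hash are illustrative):

```python
# Sketch: dictionary attack against unsalted MD5 password hashes.
import hashlib

def md5_hex(pw: str) -> str:
    return hashlib.md5(pw.encode()).hexdigest()

# The attacker precomputes hashes for common passwords once...
wordlist = ["admin", "1234", "password", "letmein"]
lookup = {md5_hex(pw): pw for pw in wordlist}

# ...then instantly reverses any hash pulled from a captured config:
stolen_hash = md5_hex("1234")     # stands in for a hash found in the config
print(lookup.get(stolen_hash))    # -> 1234
```

A per-password random salt would defeat the precomputed table; a slow, purpose-built password hash (bcrypt, scrypt, Argon2) would make even targeted guessing expensive.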

Hardly ‘defence in depth’!
Reminder: these are all fixed by setting a strong admin password.
We found lots more, but can’t disclose these yet.

Sending a ship the wrong way: hacking the ECDIS

We often find a lack of network segregation on the vessel.
Hack the satcom terminal and you’re on the vessel network.

ECDIS are the electronic chart systems that are needed to navigate.
They can slave directly to the autopilot – most modern vessels are in ‘track control’ mode most of the time, where they follow the ECDIS course.

Hack the ECDIS and you may be able to crash the ship, particularly in fog.
Younger crews get ‘screen fixated’ all too often, believing the electronic screens instead of looking out of the window.

We tested over 20 different ECDIS units and found all sorts of crazy security flaws.
Most ran old operating systems, including one popular in the military that still runs Windows NT!

One interesting example had a poorly protected configuration interface.
Using this, we could ‘jump’ the boat by spoofing the position of the GPS receiver on the ship.
This is not GPS spoofing, this is telling the ECDIS that the GPS receiver is in a different position on the ship.
It’s similar to introducing a GPS offset (which we can also do!)
Here’s it jumping from one side to the other of Dover Harbour:

Blocking the English Channel?

Worse, we could reconfigure the ECDIS to make the ship appear to be a kilometre square:

This doesn’t sound bad, until you appreciate that the ECDIS often feeds the AIS transceiver – that’s the system that ships use to avoid colliding with each other.

So, simply spoof the ECDIS using the vulnerable config interface, ‘grow’ the ship and ‘jump’ it into the shipping lanes.

Other ships' AIS will alert their captains to a collision scenario.
It would be a brave captain indeed to continue down a busy, narrow shipping lane whilst the collision alarms are sounding.
Block the English Channel and you may start to affect our supply chain.

Going the wrong way: hacking NMEA 0183 messages

A completely different technique is to exploit the serial networks on board that control the Operational Technology (OT).
The ethernet and serial networks are often ‘bridged’ at several points, including the GPS, the satcom terminal, the ECDIS and many other points.

OT systems are used to control the steering gear, engines, ballast pumps and lots more.
They communicate using NMEA 0183 messages.
Here are several such messages including steering heading, GPS, AIS and Bridge alarm data.

There is no message authentication, encryption or validation of these messages.
They’re plain text.
All we need to do is man-in-the-middle the connection and modify the data.
This isn’t GPS spoofing, which is well known and easy to detect, this is injecting small errors to slowly and insidiously force a ship off course.

If the autopilot is engaged, one could change the rudder command by modifying a GPS autopilot command like this:
Change R to L (right to left rudder command!) and then update the two-character hexadecimal XOR checksum at the end.
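The sentence structure makes this trivial to automate. An NMEA 0183 checksum is just the XOR of every character between the '$' and the '*', written as two hex digits, so an attacker who flips a field can re-validate the sentence in a couple of lines. A sketch (the heading-control sentence layout here is illustrative, not captured from a real vessel):

```python
# Sketch: tampering with an NMEA 0183 sentence and fixing up its checksum.
def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two uppercase hex digits."""
    csum = 0
    for ch in body:
        csum ^= ord(ch)
    return f"{csum:02X}"

def tamper(sentence: str, old: str, new: str) -> str:
    """Replace a field in an NMEA sentence and recompute its checksum."""
    body = sentence[1:sentence.index("*")]   # strip leading '$' and trailing '*XX'
    body = body.replace(old, new)
    return f"${body}*{nmea_checksum(body)}"

body = "AGHTC,A,090.0,T,R"                   # hypothetical rudder-command fields
sentence = f"${body}*{nmea_checksum(body)}"
forged = tamper(sentence, ",R", ",L")        # right rudder becomes left rudder
print(forged)                                # valid checksum, opposite rudder
```

Because the receiving equipment only verifies that the checksum matches the (attacker-controlled) body, the forged sentence is accepted exactly like a genuine one.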


Ship security is in its infancy – most of these types of issues were fixed years ago in mainstream IT systems.

The advent of always-on satellite connections has exposed shipping to hacking attacks.
Vessel owners and operators need to address these issues quickly, or more shipping security incidents will occur.
What we’ve only seen in the movies will quickly become reality.


Sunday, June 17, 2018

Friday, June 15, 2018

Seafloor cables that carry the world’s internet traffic can also detect earthquakes

Seafloor cables, such as this link between the United States and Spain, can serve as seismic sensors. (Run Studios)

From ScienceMag by Eric Hand

Some 70% of Earth's surface is covered by water, and yet nearly all earthquake detectors are on land.
Aside from some expensive battery-powered sensors dropped to the sea floor and later retrieved, and a few arrays of near-shore detectors connected to land, seismologists have no way of monitoring the quakes that ripple through the sea floor and sometimes create tsunamis.

Now, a technique described online in Science this week promises to take advantage of more than 1 million kilometers of fiber optic cables that criss-cross the ocean floors and carry the world's internet and telecom traffic.
By looking for tiny changes in an optical signal running along the cable, scientists can detect and potentially locate earthquakes.

The technique requires little more than lasers at each end of the cable and access to a small portion of the cable's bandwidth.
Crucially, it requires no modification to the cable itself and does not interfere with its everyday use.
The method "could be a game-changer," says Anne Sheehan, a seismologist at the University of Colorado in Boulder who wasn't involved in the work.
"More observations from oceanic regions could fill in some pretty big gaps."

Motion of the ocean floor: the network of submarine fiber-optic cables that deliver work emails and cat videos to computers around the world could double as undersea earthquake detectors.
Existing cables are shown in purple; planned cables are in blue.
G. Marra et al./Science 2018

It began with an accidental discovery, says Giuseppe Marra, a metrologist at the National Physical Laboratory in Teddington, U.K., who works on the fiber optic links that connect atomic clocks in labs across Europe.
He was testing a connection on a 79-kilometer buried cable that runs from Teddington to Reading, U.K., and relies on a stable, resonating loop of laser light.
Vibrations near the cable—even the noise of traffic above—can bend it imperceptibly.
That can shorten or lengthen the light's travel distance by less than the width of a human hair, shifting the resonating light beams slightly out of phase.
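The sensitivity involved is easy to quantify: for light of wavelength λ in fibre of refractive index n, a one-radian phase shift corresponds to a path-length change of λ/(2πn), hundreds of times smaller than the width of a human hair. A rough calculation (the wavelength and index are typical telecom values, assumed here rather than taken from the paper):

```python
# Sketch: path-length change behind a 1-radian phase shift in optical fibre.
import math

WAVELENGTH = 1550e-9   # typical telecom laser wavelength, m (assumption)
N_FIBRE = 1.468        # approximate refractive index of a silica fibre core

# Path-length change that shifts the resonating light by one radian of phase
delta_L = WAVELENGTH / (2 * math.pi * N_FIBRE)
print(f"{delta_L * 1e9:.0f} nm per radian of phase")   # ~170 nm

HAIR = 70e-6           # typical human hair width, m
print(f"a hair is ~{HAIR / delta_L:.0f}x larger")
```

So a deformation far smaller than a hair's width produces a clearly measurable phase change, which is why passing traffic, let alone an earthquake, shows up in the signal.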

Marra was accustomed to background noise on his fiber optics.
But when he reviewed data from October 2016, he saw more than the average amount of noise.
It turned out to be the local effects of 5.9- and 6.5-magnitude quakes that struck central Italy late that month.
"It was quite a revealing moment," Marra says.
That noise, he realized, pointed to a new way to detect earthquakes.

Submarine seismology
An underwater fiber-optic cable stretching from Malta to Sicily sensed a magnitude 3.4 quake in the Mediterranean Sea on September 2, 2017.
Researchers confirmed this detection with two nearby seismometers.
One seismometer near the Malta end of the cable, closer to the earthquake’s epicenter, detected the quake shortly before the cable, and a seismometer near the Sicily end identified it shortly after.
Marra wondered whether the technique could be extended to the ocean, where the environment might be quieter.
Using a 96-kilometer submarine cable connecting Malta and Sicily in Italy, he and his colleagues detected a magnitude-3.4 earthquake in the Mediterranean Sea.
They couldn't pinpoint it.
But by shooting lasers down a cable from both ends, he says, scientists could detect differences in the travel times of the out-of-phase signals, which would reveal just where the earthquake first caused a disruption along the cable.
With three or more cables outfitted this way, he says, the earthquake's exact location in the crust could be triangulated.
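The two-ended timing idea reduces to simple arithmetic. Light travels the fibre at v = c/n; a disturbance at distance x from end A perturbs the counter-propagating signals at moments that differ by Δt = (L − 2x)/v, so x falls straight out of the measured delay. A sketch using the Malta–Sicily cable length (the fibre refractive index is an assumed typical value):

```python
# Sketch: locating a disturbance along a cable from two-way timing.
C = 299_792_458.0        # speed of light in vacuum, m/s
N_FIBRE = 1.468          # assumed refractive index of silica fibre
V = C / N_FIBRE          # light speed in the fibre, ~204,000 km/s
L = 96_000.0             # Malta-Sicily cable length, m

def locate_disturbance(delta_t_s: float) -> float:
    """Distance (m) from end A, given the arrival-time difference of the
    perturbation signature in the two counter-propagating signals."""
    return (L - V * delta_t_s) / 2.0

print(locate_disturbance(0.0))   # equal delays -> midpoint of the cable, 48 km
```

A microsecond of timing difference corresponds to roughly 100 km of fibre, so even coarse timing pins the disturbance down to a short stretch of cable; combining three or more cables then triangulates the epicentre.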

By filling in the "seismic desert" in the ocean crust and showing where seafloor earthquakes occur and how often, the method could illuminate new fault structures and regions where tectonic plates are colliding or rifting apart, says Charlotte Rowe, a seismologist at Los Alamos National Laboratory in New Mexico.
It could also help with tsunami warning systems, she says, provided the strength of the optical signal reveals an earthquake's size.

Besides mapping earthquakes, Rowe thinks the cable networks could sharpen pictures of Earth's interior.
Like x-rays in a computerized tomography (CT) scan, seismic waves from big earthquakes carry clues to the density of rock they pass through.
From crisscrossing waves received by multiple sensors, seismologists can construct 3D pictures of mantle convection, in which hot plumes well up and cold tectonic plates plunge toward Earth's core.
Data from seafloor cables could fill in blind spots in these seismic CT scans.
But Rowe says investigators will have to get better at interpreting the cable signals before using them to peer into deep Earth.

Marra says the new technique is sensitive enough to work across ocean basins thousands of kilometers wide.
It requires adding a small cabinet of lasers and optical equipment that costs about $50,000 at each end of the cable, and access to just one of the hundreds of channels in a typical cable.
Renting a dedicated channel might cost about $100,000 a year on a transpacific cable, and much less on one between North America and Europe, says Stephen Lentz, who works with the cable industry as director of network development for Ocean Specialists, Inc., based in Stuart, Florida.
"Frankly, this is the kind of thing where the cable owner could donate the service and take the tax write-off.
It costs them little or nothing to share unused wavelengths."

That's significant, says Bruce Howe, a physical oceanographer at the University of Hawaii in Honolulu, who leads a task force exploring how to stud new ocean cables with seismic, pressure, and temperature sensors every 50 to 100 kilometers.
Although the add-on sensors, at roughly $200,000 apiece, are cheaper than operating stand-alone ocean bottom detectors, cable owners have been wary of affecting cable performance.
The new technique offers a cheaper and less disruptive way to listen to the ocean floor.
Howe calls the results "intriguing" and says his task force will advocate for a longer test.
"It should absolutely be pursued."


Thursday, June 14, 2018

Ramp-up in Antarctic ice loss speeds sea level rise

Changes in the Antarctic ice sheet’s contribution to global sea level, 1992 to 2017.
According to research from the Ice Sheet Mass Balance Inter-comparison Exercise (IMBIE), published today in Nature, the Antarctic ice sheet’s contribution to global sea level was 7.6 mm since 1992, with two fifths of this rise (3.0 mm) coming in the last five years alone. 

Credits: IMBIE/Planetary Visions

From NASA by Steve Cole and Alan Buis

Ice losses from Antarctica have tripled since 2012, increasing global sea levels by 0.12 inch (3 millimeters) in that timeframe alone, according to a major new international climate assessment funded by NASA and ESA (European Space Agency).

According to the study, ice losses from Antarctica are causing sea levels to rise faster today than at any time in the past 25 years.
Results of the Ice Sheet Mass Balance Inter-comparison Exercise (IMBIE) were published Wednesday in the journal Nature.

“This is the most robust study of the ice mass balance of Antarctica to date,” said assessment team co-lead Erik Ivins at NASA’s Jet Propulsion Laboratory (JPL).
“It covers a longer period than our 2012 IMBIE study, has a larger pool of participants, and incorporates refinements in our observing capability and an improved ability to assess uncertainties.”

The Antarctic Peninsula from the air: although the mountains are plastered in snow and ice, measurements tell us that this region is losing ice at an increasing rate.
Credits: University of Durham/Pippa Whitehouse

This latest IMBIE is the most complete assessment of Antarctic ice mass changes to date, combining 24 satellite surveys of Antarctica and involving 80 scientists from 42 international organizations.

The team looked at the mass balance of the Antarctic ice sheet from 1992 to 2017 and found ice losses from Antarctica raised global sea levels by 0.3 inches (7.6 millimeters), with a sharp uptick in ice loss in recent years.
They attribute the threefold increase in ice loss from the continent since 2012 to a combination of increased rates of ice melt in West Antarctica and the Antarctic Peninsula, and reduced growth of the East Antarctic ice sheet.

Prior to 2012, ice was lost at a steady rate of about 83.8 billion tons (76 billion metric tons) per year, contributing about 0.008 inches (0.2 millimeters) a year to sea level rise.
Since 2012, the amount of ice loss per year has tripled to 241.4 billion tons (219 billion metric tons) – equivalent to about 0.02 inches per year (0.6 millimeters) of sea level rise.
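The tons-to-millimeters conversions above are easy to sanity-check: one gigatonne (a billion metric tons) of melted ice adds roughly one cubic kilometer of water to an ocean surface of about 3.61 × 10⁸ square kilometers (a commonly quoted figure, assumed here), i.e. about 1/361 of a millimeter of sea level per gigatonne. A short sketch:

```python
# Back-of-the-envelope check of the article's figures: gigatonnes of ice
# loss per year converted to millimeters of sea level rise per year.
# Assumes a global ocean surface area of ~3.61e8 km^2 (commonly quoted value).

OCEAN_AREA_KM2 = 3.61e8

def gt_per_year_to_mm_per_year(gt):
    water_km3 = gt          # 1 Gt of water occupies ~1 km^3
    rise_km = water_km3 / OCEAN_AREA_KM2
    return rise_km * 1e6    # km -> mm

print(round(gt_per_year_to_mm_per_year(76), 2))   # pre-2012 rate  → 0.21
print(round(gt_per_year_to_mm_per_year(219), 2))  # post-2012 rate → 0.61
```

Both results match the article's figures of about 0.2 and 0.6 millimeters per year.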

Crevasses near the grounding line of Pine Island Glacier, Antarctica.
Credits: University of Washington/I. Joughin

West Antarctica experienced the greatest recent change, with ice loss rising from 58.4 billion tons (53 billion metric tons) per year in the 1990s, to 175.3 billion tons (159 billion metric tons) a year since 2012.
Most of this loss came from the huge Pine Island and Thwaites Glaciers, which are retreating rapidly due to ocean-induced melting.

 Pine Island Glacier calving front with the GeoGarage platform (NGA chart)

At the northern tip of the continent, ice-shelf collapse at the Antarctic Peninsula has driven an increase of 27.6 billion tons (25 billion metric tons) in ice loss per year since the early 2000s.

Meanwhile, the team found the East Antarctic ice sheet has remained relatively balanced during the past 25 years, gaining an average of 5.5 billion tons (5 billion metric tons) of ice per year.

Rapid collapse of Antarctic glaciers could flood coastal cities by the end of this century.
Based on an article written by Eric Holthaus

Antarctica’s potential contribution to global sea level rise from its land-held ice is almost 7.5 times greater than all other sources of land-held ice in the world combined.
The continent stores enough frozen water to raise global sea levels by 190 feet (58 meters), if it were to melt entirely.
Knowing how much ice it’s losing is key to understanding the impacts of climate change now and its pace in the future.

“The datasets from IMBIE are extremely valuable for the ice sheet modeling community,” said study co-author Sophie Nowicki of NASA’s Goddard Space Flight Center.
“They allow us to test whether our models can reproduce present-day change and give us more confidence in our projections of future ice loss.”

The satellite missions providing data for this study are NASA’s Ice, Cloud and land Elevation Satellite (ICESat); the joint NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE); ESA’s first and second European Remote Sensing satellites (ERS-1 and ERS-2), Envisat and CryoSat-2; the European Union’s Sentinel-1 and Sentinel-2 missions; the Japan Aerospace Exploration Agency’s Advanced Land Observing Satellite; the Canadian Space Agency’s RADARSAT-1 and RADARSAT-2 satellites; the Italian Space Agency’s COSMO-SkyMed satellites; and the German Aerospace Center’s TerraSAR-X satellite.

Tom Wagner, cryosphere program manager at NASA Headquarters, hopes to welcome a new era of Antarctic science with the May 2018 launch of the Gravity Recovery and Climate Experiment Follow-on (GRACE-FO) mission and the upcoming launch of NASA’s Ice, Cloud and land Elevation Satellite-2 (ICESat-2).

“Data from these missions will help scientists connect the environmental drivers of change with the mechanisms of ice loss to improve our projections of sea level rise in the coming decades," Wagner said.


Wednesday, June 13, 2018

SeaFi sets course for marine communications revolution

SeaFi Horizon: the future of lighthouses
With ECDIS, the future of lighthouses is somewhat compromised:
ships hardly need lighthouses anymore to find their position at sea.
Once used to flash light across the water, a lighthouse can easily be turned into a powerful data hotspot for ships; from flashing light to wireless marine data communication, the conversion is painless...
To prove it, we are working on setting a world record for the longest ship-to-shore marine (big) data communication without assistance from a satellite or cellular connection...

From Irish Examiner by Eoin English

A marine engineer and inventor has set what he hopes could be a world record for the longest ship-to-shore email using his own special WiFi system.

The Ocean Spey vessel just passed Roches Point Lighthouse

While 200 metres is considered long-distance in most ordinary WiFi networks, like a modern-day Marconi, Arnaud Disant used his SeaFi network to send an email at 4.18pm yesterday from a ship about 15.5 nautical miles, almost 29km, off Roche’s Point in Cork Harbour to a computer in the Roche’s Point lighthouse.

 Lt Cdr Martin Brett's email setting out a distance of 19.4 nautical miles...

The email, which included data and maps charting the vessel’s precise location, was sent without any support from satellites or cellular network.

And just as Marconi’s pioneering work in the early 1900s on long-distance radio transmission led to the telegraph system and ultimately radio, Mr Disant hopes his work will lead to a revolution in marine communications.

“People have little interest for marine telecommunications. It’s not trendy,” he said.
“Beside few people have ever spent more than a few hours at sea — heading to the UK or France on a car ferry is as far as it gets. And who needs to keep in touch with the office while heading on holiday?
“So understanding the needs for modern data communication at sea is really remote for most people.
“But SeaFi is something that could put Ireland on the forefront of modern marine technology. It really takes Marconi into the 21st century.”

Mr Disant, a lecturer in marine data communication at the National Maritime College of Ireland, founded SEATech Evolution, a company which specialises in network infrastructure and electronic engineering, in 2007.
The firm has spent the last decade developing SeaFi.

 Official distance of the furthest maritime broadband transmission without satellite or cellular network using SeaFi wireless maritime communication system: 19.4 nautical miles (35.9 kilometres). The world record was established on 6 June 2018
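The distances quoted here are straightforward to verify, since the international nautical mile is defined as exactly 1.852 kilometres; a quick sketch:

```python
# Checking the distances quoted in the article: nautical miles to kilometres.
# The international nautical mile is defined as exactly 1.852 km.

NM_TO_KM = 1.852

def nm_to_km(nm):
    return nm * NM_TO_KM

print(round(nm_to_km(19.4), 1))  # record distance → 35.9 km
print(round(nm_to_km(15.5), 1))  # ship's offing   → 28.7 km ("almost 29 km")
```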

The key to the system is powerful ship antennas which transmit wirelessly and securely over a private wireless network to special receiving shore antennas, one of which is attached to the Roche’s Point lighthouse.

Roche’s Point lighthouse

“To put it simply, WiFi is more like a bare light bulb while SeaFi is more like a focused flashlight,” said Mr Disant.

The system has been tested successfully for the last few years in Cork Harbour in a partnership with the Port of Cork.
The LE Orla and LE Niamh helped test SeaFi ship antennas in seagoing conditions.

SeaFi is the main link to shore for the Port of Cork’s vessels, MV Denis Murphy and MT Gerry O’Sullivan; is used by crews of cruise ships visiting Cork; and has been successfully tested on a data buoy for two years.

Its network covers the navigational areas from 10 nautical miles off Roche’s Point all the way up the river Lee to Tivoli Dock container terminal.

It has been tested to a distance of 15 nautical miles and has achieved constant speeds of between 5 and 15 Mbps — about three times faster than real 3G speeds and five times faster than the latest generation of satellite communications, and at a fraction of the cost.

And while final confirmation is expected today, it is believed Mr Disant sent his last ship-to-shore email yesterday as Ocean Spey sat just over 15.5 nautical miles off Roche’s Point.

The record attempt was witnessed by several independent experts both on board and on shore.
A raft of technical data will now be submitted to the Guinness Book of Records for consideration.


Tuesday, June 12, 2018

New Zealand Linz layer update in the GeoGarage platform

2 nautical raster charts updated

‘Unofficial Charts’ on the horizon?

Atlantic Ocean (1786)
Dépôt général des Cartes, Plans et Journaux de la Marine, published by order of the Minister for the service of French vessels

From Hydro by Gilles Bessero

In the article entitled ’How Blockchain Will Have an Impact on Navigation’ published in the March/April 2018 issue of 'Hydro International', Gert Büttgenbach explains how the new blockchain technology could be potentially beneficial for the production and distribution of nautical charts.
One of the conclusions of the article indicates that the new technological environment calls for reconsidering the ’exclusive domain of national Hydrographic Offices’ (HOs) and suggests that the private sector could in future produce ‘unofficial charts’ that would be superior to ’official’ charts produced by the HOs.
But these views reflect a misunderstanding of the situation, according to Gilles Bessero.

I believe that encouraging the community to think about the impact of new technologies is always a good thing, especially in an environment that is often considered, whether rightly or wrongly, as rather conservative.
The question as to which organizations should be entrusted with the production of nautical charts as a key enabler of safe navigation is the subject of recurrent debate.
As a matter of fact, this was originally an activity run mostly by private chartmakers and chart information was considered a trade secret.

Detailed Depot De La Marine's sea chart of the region centered on Jamaica, with the southern part of Cuba and western end of Hispaniola.
This detailed map of Jamaica, shown divided into parishes, includes strong topographical details with many coastal toponyms, as well as the coastlines of southern Cuba and western Haiti.
The Dépôt de la Marine, known more formally as the Dépôt des cartes et plans de la Marine, was the central charting institution of France.
The centralization of hydrography in France began in earnest when Jean-Baptiste Colbert became First Minister of France in 1661.
Under his watch, the first Royal School of Hydrography began operating, as did the first survey of France’s coasts (1670-1689).
In 1680, Colbert consolidated various collections of charts and memoirs into a single assemblage, forming the core of sources for what would become the Dépôt.
The Dépôt itself began as the central deposit of charts for the French Navy.
In 1720, the Navy consolidated its collection with those government materials covering the colonies, creating a single large repository of navigation.
By 1737, the Dépôt was creating its own original charts and, from 1750, it participated in scientific expeditions to determine the accurate calculation of longitude.
In 1773, the Dépôt received a monopoly over the composition, production, and distribution of navigational materials, solidifying its place as the main producer of geographic knowledge in France.
Dépôt-approved charts were distributed to official warehouses in port cities and sold by authorized merchants.
The charts were of the highest quality, as many of France’s premier mapmakers worked at the Dépôt in the eighteenth century, including Philippe Buache, Jacques-Nicolas Bellin, Rigobert Bonne, Jean Nicolas Buache, and Charles-François Beautemps-Beaupré.
The Dépôt continued to operate until 1886, when it became the Naval Hydrographic Service.
In 1971, it changed names again, this time to the Naval Hydrographic and Oceanographic Service (SHOM).
Although its name has changed, its purpose is largely the same: to provide high-quality cartographic and scientific information to France’s Navy and merchant marine.

France was the first country to establish a national Hydrographic Office in 1720.
The rationale behind this initiative was that more warships were being lost at sea because of lack of access to charts than in combat.
The benefit of assigning a dedicated public organization to the task of collecting all available information, compiling it and making it available through ’official’ nautical charts was progressively recognized and all maritime nations followed the lead of France more or less rapidly.
Some private chartmaking continued into the 20th century but it was generally focused on the specific needs of the leisure market.
The obligation for ships to carry adequate and up-to-date nautical charts and publications was introduced in the International Convention for the Safety of Life at Sea (SOLAS) of 1974 (regulation V/20) but the provisions related to the production of adequate nautical charts and publications were left at the discretion of the Contracting Governments.

Chart of the roadstead of Brest in 1779 (Carte de la rade de Brest).
 Chart of Brest with the GeoGarage platform (SHOM 2018)

In the late 1980s, the advent of the digital era created a new opportunity for private entrepreneurs who were keen to develop electronic chart systems (ECS) and proposed digital nautical charts generally obtained simply through digitising the paper charts produced by HOs.
When the progress of ECS technology led to the consideration of using such systems not only as navigation aids complementing paper charts but as meeting as such the SOLAS chart carriage requirement, the International Maritime Organization adopted Performance Standards for Electronic Chart Display and Information Systems (ECDIS) in 1995.
Considering the liability aspects, the Performance Standards included a provision that the associated Electronic Navigational Charts (ENCs) had to be issued ’on the authority of government-authorized hydrographic offices’.
This provision was refined in the amendments to the SOLAS Convention that were adopted in 2000 and entered into force on 1 July 2002.
These amendments include a definition of a nautical chart or publication as ’a special-purpose map or book, or a specially compiled database from which such a map or book is derived, that is issued officially by or on the authority of a Government, authorized hydrographic office or other relevant government institution and is designed to meet the requirements of marine navigation.’ (regulation V/2.2).
They include also the requirement that ’Contracting Governments undertake to arrange for the collection and compilation of hydrographic data and the publication, dissemination and keeping up to date of all nautical information necessary for safe navigation.’ (regulation V/9).

Now it is up to each Contracting Government to decide which arrangements best suit its circumstances.
The requirement is solely that nautical charts and publications should be produced on the authority of a Government and this is justified by the liability issue, noting the extent and cost of the damages that could be caused by a ship’s grounding due to a charting error.
As explained in Publication M-2 of the International Hydrographic Organization on ’The need for hydrographic services’, ’Coastal States can satisfy their hydrographic needs and obligations through a variety of arrangements (…).
The use of bilateral arrangements with established Hydrographic Services and the use of commercial contract support are alternatives to establishing a full in-country Hydrographic Service.’
The reality is that a number of HOs do outsource production activities to the private sector.
Therefore, one should not pit HOs against the private sector, or ’official’ against ’unofficial’ charts, but encourage both sides to imagine together the most efficient ways to improve future ’official’ charts, for which governments continue to accept full responsibility.

In that perspective, it is worth noting that HOs are evolving from a traditional chart-centric model to a data-centric model in order to address the variety of hydrographic requirements associated with all human activities that take place in, on or under the sea and support the sustainable development of the oceans.
This means that delivering a portfolio of nautical charts covering the waters of a country is no longer an end in itself but one of the many applications of a national marine spatial data infrastructure (MSDI) that must be considered as a public good.
The private sector can and should play a major role in developing tools to manage efficiently the MSDI as well as in inventing and developing a variety of value-added products and services derived from that infrastructure.
But as long as shipping remains a significant component of the world trade infrastructure, there will continue to be a substantiated need for ’official’ nautical charts.

What rules apply to migrants rescued at sea?

Photo: Karpov/SOS Mediterranee/AFP

From The Local by AFP

In the standoff between Italy and Malta over a migrant ship stranded in the Mediterranean, both have insisted on their right to refuse a vessel entry to their ports.

Although Spain offered safe harbour to the boat and the 629 migrants on board, the episode has raised questions about a country's legal obligations towards those rescued at sea.

courtesy of Le Monde (Francesca Fattori & Xemartin Laborde)

Here are a few key questions.

Are the rules clear?

Generally speaking, no.
"International maritime law does not provide for specific obligations which would determine in all cases which state is responsible to allow disembarkation on its territory," the United Nations refugee agency (UNHCR) says.

But that does not mean a country can simply hold up a stop sign and wash its hands of the situation when a vessel packed with vulnerable migrants approaches its shores. UNHCR also pointed to "key treaties" stating that a nation which has responsibility for an area in which a search-and-rescue operation takes place is required to "exercise primary responsibility" for coordinating the migrants' safe disembarkation.

Photo: Louisa Gouliamaki/AFP

The International Organization for Migration also said that while states are not forced to accept specific vessels, there is a collective duty to ensure a humane outcome.

"Regarding disembarkation, states are obliged to cooperate to find a safe place to disembark migrants rescued in their search and rescue area," IOM spokesman Leonard Doyle told AFP, citing legal experts.

What if there's an emergency on board?

This could arguably compel a state to grant access to its ports.

“If the country has control over the ship and there are migrants in dire straits aboard and no agreement with another state to take them can be found, they should not delay but accept them,” Doyle said.

In the case of the Aquarius, which is operated by SOS Méditerranée, UNHCR said that the dwindling provisions on board created "an urgent humanitarian imperative" for Italy and Malta to allow the boat to dock.

Spain's intervention later appeared to defuse the crisis.

Photo: Louisa Gouliamaki/AFP

What happens after the migrants disembark?

In an apparent attempt to justify Rome's stance, far-right Interior Minister Matteo Salvini said Italy's new populist government could not be forced to turn the country into "a huge refugee camp".

But UNHCR said letting a boat dock did not mean a country would have to take long-term responsibility for those on board.

"A state which allows disembarkation on its territory of rescued persons – particularly in situations involving large numbers of people – need not, in UNHCR's view, be solely responsible for providing durable solutions on its own territory."


Monday, June 11, 2018

Canada CHS layer update in the GeoGarage platform

42 nautical raster charts updated

The 'dark fleet': Global Fishing Watch shines a light on illegal catches

 Global Fishing Watch's new night light vessel detection layer uses satellite imagery from the US National Oceanic and Atmospheric Administration (NOAA) to reveal the location and activity of brightly lit vessels operating at night.
Because the vessels are detected solely based on light emission, we can detect individual vessels and even entire fishing fleets that are not broadcasting AIS and so are not represented in the AIS-based fishing activity layer.

From The Guardian by Justin McCurry

Low light imaging data being used to expose unregulated and unreported fishing on the high seas

New data is being used to expose fleets of previously unmonitored fishing vessels on the high seas, in what campaigners hope will lead to the eradication of illegal, unregulated and unreported fishing.
Global Fishing Watch (GFW) has turned low light imaging data collected by the US National Oceanic and Atmospheric Administration (NOAA) into the first publicly available real-time map showing the location and identity of thousands of vessels operating at night in waters that lie beyond national jurisdiction.
More than 85% of the “dark fleet” detections include smaller vessels that are not fitted with transponders and larger ones that have switched off their tracking systems to avoid detection, according to GFW, which launched the map on Friday to mark World Oceans Day.

The pink circles on the map represent two fishing vessels in close proximity to one another.

The data, collected by NOAA’s Visible Infrared Imaging Radiometer Suite (VIIRS), is being used to track a fleet of about 200 mostly Chinese vessels at the edge of Peru’s exclusive economic zone.
The monitoring, conducted by GFW, a non-profit organisation campaigning for greater transparency in the fishing industry, and the conservation group Oceana, reveals that about 20% of the Chinese vessels are not broadcasting via automatic tracking systems, raising suspicions they are operating illegally.
The report on the high seas activity coincides with the launch by GFW of the first ever real-time view of transshipment, which enables fishing boats to transfer their catch to refrigerated cargo vessels and remain at sea for months, or even years, at a time but still get their catch to the market.

“By harnessing big data and artificial intelligence, we’re able to generate a clearer view into the often shady practice of transshipment,” said Paul Woods, chief technology officer at GFW.
“This data is now freely available to governments, NGOs and academia to use and interrogate, and support global efforts to strengthen monitoring and enforcement to eradicate illegal fishing.”
Four countries – China, Taiwan, Japan and South Korea – account for well over two-thirds of high seas fishing, including 500 vessels belonging to Japan’s distant water fleet.
“If you could get the North Asian countries fully engaged in strengthening regulation of high seas fisheries, you would go a long way towards solving the problem,” said Quentin Hanich, head of the fisheries governance research programme at the Australian National Centre for Ocean Resources and Security.

Global Fishing Watch’s new encounters layer reveals for the first time where and when thousands of vessels are involved in close encounters at sea.
To detect pairs of vessels meeting at sea, analysts applied machine learning algorithms to more than 30 billion Automatic Identification System (AIS) messages from ocean-going boats to find tell-tale transshipment behaviour, such as two vessels alongside each other long enough to transfer catch, crew, or supplies. 
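Global Fishing Watch's real pipeline applies machine learning to billions of AIS messages, but the core heuristic the caption describes can be sketched in a few lines: flag a pair of vessels when they stay alongside each other long enough to transfer catch. The distance and duration thresholds, the hourly track layout, and the function names below are illustrative assumptions, not GFW's actual parameters or code.

```python
# Minimal sketch of the encounter heuristic described above (not GFW's
# actual pipeline): flag two vessels as a possible transshipment when they
# stay within DIST_KM of each other for at least MIN_HOURS. Thresholds and
# the toy data layout are illustrative assumptions.
from math import radians, sin, cos, asin, sqrt

DIST_KM = 0.5    # "alongside each other"
MIN_HOURS = 3.0  # long enough to transfer catch, crew, or supplies

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def encounter_hours(track_a, track_b):
    """Hours two vessels spend within DIST_KM of each other.
    Tracks are time-aligned lists of (hour, lat, lon) fixes, one per hour."""
    close = [ta for (ta, la, lo), (_, lb, lob) in zip(track_a, track_b)
             if haversine_km(la, lo, lb, lob) <= DIST_KM]
    return float(len(close))

def is_possible_transshipment(track_a, track_b):
    return encounter_hours(track_a, track_b) >= MIN_HOURS

# Toy example: a reefer and a fishing vessel drifting together for 4 hours.
reefer = [(t, 10.000, 120.000) for t in range(4)]
fisher = [(t, 10.001, 120.001) for t in range(4)]
print(is_possible_transshipment(reefer, fisher))  # → True
```

In practice the hard part is exactly what GFW's analysts tackle with machine learning: distinguishing genuine transfers from vessels that merely pass close by, which simple thresholds like these cannot do reliably.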

 As a major market for Chinese processed and re-exported seafood, Japan is well placed to use its influence to improve traceability and transparency, Hanich added.
“China is still in an expansionist stage when it comes to high seas fisheries, and it’s still reluctant to agree to many of the types of measures we need to put in place,” he told the Guardian.
“Japan really is the pathway to bringing China in. It’s crucial that we collaboratively develop high seas governance that China is fully engaged in.”
The need for fleets to cut fuel and other costs was highlighted in a new report claiming that fishing in more than half the world’s high seas fishing grounds would be unprofitable without billions of dollars in government subsidies.
“Governments subsidised high seas fishing with $4.2bn in 2014, far exceeding the net economic benefit of fishing in the high seas,” said the report, published this week in the journal Science Advances.
Its lead author Enric Sala, a National Geographic explorer-in-residence, said: “Governments are throwing massive amounts of taxpayer money into a destructive industry.”
