Tuesday, January 17, 2017

How warming seas are forcing fish to seek new waters

 Warming oceans and marine species migration: poleward bound
The fallout from climate change is often framed as a terrestrial problem, yet global warming is having profound effects on marine life.
(courtesy of The Economist)

From The Guardian by Robin McKie

Rising sea temperatures are pushing shoals hundreds of miles from native grounds

Scottish fishermen have uncovered an intriguing way to supplement their income: they have added squid to the menu of marine creatures they regularly pull from the sea.
A species normally associated with the warmth of the Mediterranean, rather than the freezing north, may seem an odd addition to their usual catches of cod and haddock.
Nevertheless, squid has become a nice little earner for fishing boats from Aberdeen and the Moray Firth in recent years.

Thirty years ago, squid was a rarity in the North Sea.
Today, boats bring back thousands of tonnes a year – though cod and haddock still dominate catches.
Nor is this warm-water addition to northern fish menus unique.
Red mullet, sardines and sea bass have also appeared with increasing frequency in North Sea fishermen’s nets in recent years. All of them are associated with warmer waters, and their appearance is seen by many scientists as a sign that climate change is beginning to have a serious impact on our planet’s oceans.
For Scottish lovers of fresh squid, this is good news.
However, in many other parts of the world, rising sea temperatures – triggered by climate change – are providing fishing industries and governments with major headaches.
Fish are moving hundreds of miles from their old grounds, sometimes out of zones that had been set up to protect them.

 Climate Change Effects on Fisheries
source FAO

In other cases, fish are simply disappearing from nets.
Part of the problem has its roots in past overfishing.
But now climate change is exacerbating the issue.
Last week, scientists revealed that a vast chunk of ice was set to break away from the Antarctic Larsen C ice shelf, while Arctic sea ice extent is now at its lowest level for this time of year since records began.
And if sea temperatures continue to rise, even greater disruption will be caused to fishing stocks.
Fishermen will lose their livelihoods and communities will be deprived of a vital source of food.
“There is an unambiguous trend,” said marine biologist Andrew Bakun of the University of Miami.
“If you look at the world’s fish catches as a whole, you find they are made up, more and more, of warm-water species as opposed to catches in previous years which had more species that were from cooler waters.”

Seafood is the critical source of protein for more than 2.5 billion people today.
However, over-exploitation in the past has resulted in a crash in fish stocks, with the result that the world’s annual catch is now decreasing by more than 1 million tonnes every year – despite the availability of the latest fishing technology: nets big enough to engulf cathedrals, echo locators, satellite navigation, and powerful engines to drive boats.
Now climate change is making the management of this threatened supply even more difficult.
“All the world’s oceans are facing intense problems but the problem is going to be particularly serious for tropical countries, which are often underdeveloped and are far less able to maintain sustainable management regimes for their fisheries,” said marine biologist Callum Roberts, of the University of York.

An example is provided by Bangladesh.
Fish provides 60% of the nation’s animal protein and is vital to the 16 million Bangladeshis living near the coast, a number that has doubled since the 1980s.
However, a study – led by Jose Fernandes, of the Plymouth Marine Laboratory – of two key fish species, Hilsa shad (Tenualosa ilisha) and Bombay duck (Harpadon nehereus), showed that stocks of both could be devastated by climate change that would affect nutrient flows in coastal waters, ocean temperatures and sea levels.

The introduction of sustainable management measures would offset some of these impacts but stocks still face being cut significantly, the group added.
“Both the sea and land environment are changing,” said Fernandes.
“The problem is that we know much less about the sea than the land, so it is harder to observe and to intervene.”
Think of the problem as a double whammy, said marine ecologist Malin Pinsky, of Rutgers University, New Jersey.
“Fish have already been reduced to low numbers by intense overfishing and that makes them far less able to deal with increasing temperatures or other effects of climate change.”
Pinsky points to the example of the Atlantic cod in the Gulf of Maine.
“It has been badly hit by intense overfishing.
Now it appears that warmer waters have been reducing survival even further.
The trouble is that fisheries management in the area did not realise this and allowed fishing to continue there at too high a level.”

 Fishermen in South Italy - Climate Change in European Marine Ecosystems
Global warming is profoundly changing the seas and oceans that surround us.
Fishermen in Milazzo (southern Italy) catch fish they never used to catch before.
90% of the alien fish species in Milazzo have a tropical or a subtropical origin.

Managing fish stocks in a warming world is proving to be a particularly thorny problem, he added.
“Fish management maps have lines drawn on them but it turns out fish don’t see those lines.”
As waters warm, fish seek cooler waters and head to higher latitudes, a problem that has also been highlighted in the North Sea.
There, closure areas have been set up to protect spawning and nursery grounds of plaice, herring and sandeel from intense fishing.
“But if species shift their distribution in response to climate change it is possible such measures will become less effective in the future,” says a study by a group of scientists led by John Pinnegar, of the government-funded Centre for Environment, Fisheries and Aquaculture Science (Cefas).

Another example of the problem was highlighted last week by the New York Times, which noted that the centre of the US black sea bass population is now found in waters off New Jersey.
In the 1990s, it was hundreds of miles further south.
Under fishing rules that were laid down then, North Carolina fishermen are still entitled to the largest share of black sea bass catches – which requires them to steam north for 10 hours to reach the black sea bass’s current fishing grounds.
By contrast, local New England fishermen are allowed to catch a small fraction of the black sea bass now found in their own neighbourhood and must throw all excess overboard.
The issue has already caused international discord, as the example of the humble mackerel reveals.
“Until recently, mackerel in the Atlantic were fished mainly by Britain, Ireland and Norway and stocks were protected by an EU quota system,” said Roberts.
“Then stocks began to head north, most probably because sea temperatures were rising.
Eventually, mackerel reached Iceland – at which point Iceland asked to be included in fishing quotas.
This request was rejected – so Iceland went ahead and started catching mackerel in any case.”
The result was a drop in mackerel stocks and an international dispute that lasted several years and which has only recently been resolved – though this respite may only be temporary.
“Unless we find ways to adapt quota agreements speedily and efficiently, we are going to see a lot more disputes like this one in future,” Roberts said.

This point is highlighted in the study led by Pinnegar, which revealed that anchovy stocks are now spreading along the south coast of England.
Talks are taking place to determine whether French or Spanish boats can fish for these – on the grounds that these stocks are extensions of existing populations from the south.
Others argue that the new anchovy stocks are a separate population that is only now rebounding in numbers thanks to greatly improved climatic conditions, and that French and Spanish boats should be allowed only restricted access to them.
The “anchovy wars” are looming, it would seem.

In addition to overfishing and warming sea temperatures, marine creatures face a further danger: ocean acidification.
Increased amounts of carbon dioxide, pumped into the atmosphere from cars and factories, are being absorbed by the oceans, making their waters more acidic.
The impact on coral reefs, which provide homes to thousands of different species of fish, is already being felt.
Last year, it was reported that a rare underwater heatwave, combined with an increase in ocean acidity, had destroyed swaths of Australia’s Great Barrier Reef.
This has led marine biologists to warn that all coral reefs risk being destroyed by the end of the century even if carbon dioxide emissions are kept to relatively low levels in future decades.
Apart from the impact on one of the world’s greatest natural wonders, the effect on fish stocks, and in particular shellfish, could be grim.
Shells of marine creatures are made from calcium carbonate and their formation is disrupted by acidic water.

 
Climate Change Hits Home - Warming Waters, Fewer Fish 
This video shows how climate change is causing waterways to warm, eroding fish populations from the Pacific Northwest to the Midwest and Maine.
A warming world means warmer waters, threatening the livelihood of our fishermen, our traditions, and what we can serve on our dinner tables.

An example is provided by oyster farms on the Oregon coast.
These farms regularly suffer from upwellings of acidic water from deep regions of the Pacific.
When this happens, larval oysters die at the point when they have to form their first shells.
“From the time eggs are fertilised, Pacific oyster larvae precipitate roughly 90% of their bodyweight as calcium carbonate shell within 48 hours,” George Waldbusser at Oregon State University told the Climate News Network.
“They must build their first shell quickly on a limited amount of energy – and, along with the shell, comes the organ to capture external food.
It becomes a death race of sorts.
Can the oyster build its shell quickly enough to allow its feeding mechanism to develop before it runs out of energy from the egg?”
Increasingly, the answer to this question appears to be no.
This point is summed up by Roberts.
“Prawns, lobsters, clams and scallops – which now dominate our intensively fished seas – all lay down carbonate shells.

“The fishing industry is therefore badly exposed to risk from more acidic seas.
Not only that, acidification threatens the important role that filter-feeding shellfish play in cleansing ocean water.
Quite frankly, increased acidity is the last thing marine life needs given all of the other ways in which we are making oceans a tougher place to live.”

And then there is the question of just how much seafood is actually eaten today.
This turns out to be an issue of considerable controversy, one that was stoked last year by a study – by Daniel Pauly and Dirk Zeller of the University of British Columbia – published in the online journal Nature Communications.
It indicates that the UN’s Food and Agriculture Organisation (FAO) has seriously underestimated the world’s appetite for fish and miscalculated global annual catches.
The FAO – using figures provided by individual governments – had suggested that annual catches began rising significantly over the 20th century, peaked at 96m tonnes in 1996 and have been declining slowly since then – largely due to the fact that fish stocks had been so seriously overfished.
Pauly and Zeller put the annual “peak fish” figure for 1996 at 130m tonnes while adding that levels have fallen off far more dramatically and worryingly as stocks have become depleted at a rate that is far sharper than realised previously.
In other words, far more fish – millions of tonnes – is being taken from the seas than has been recorded by official statistics.
This extra annual catch is made up of small-scale and subsistence fisheries and fish thrown back into the sea as discards, according to Pauly and Zeller.
What is particularly worrying about this discovery is the sharp rate of decline of fish catches in recent years.

Despite sending out more boats, fitted with advanced fish detection technologies, fishermen are unable to catch as much as they used to.
Nor do Pauly and Zeller anticipate that the decline will stop.
“I expect a continued decline because I don’t expect countries to realise the need to rebuild stocks,” Pauly told the Guardian.
“I don’t see African countries, for example, rebuilding their stocks, or being allowed to by the foreign fleets that are working there, because the pressure to continue to fish is very strong.
We know how to fix this problem but whether we do it or not depends on conditions that are difficult.”
It is against this grim background that the world’s oceans are warming significantly, with temperature rises of several degrees being forecast by the end of the century.
Inexorably, fish stocks will be pushed further towards high latitudes, confounding attempts to manage and protect them, while the make-up of local fisheries will undergo drastic changes.
The stress on one of the world’s most important resources is going to be intense.
The great fish migration has begun.

Links :

Monday, January 16, 2017

Taming oceans of data with new visualization techniques

Climate change research relies on models to better understand and predict the complex, interdependent processes that affect the atmosphere, ocean, and land.
These models are computationally intensive and produce terabytes to petabytes of data.
Visualization and analysis are increasingly difficult, yet critical for gaining scientific insights from large simulations.
The recently developed Model for Prediction Across Scales-Ocean (MPAS-Ocean) is designed to investigate climate change at high global resolution (5 to 10 km grid cells) on high-performance computing platforms.
In the accompanying video, we use state-of-the-art visualization techniques to explore the physical processes in the ocean relevant to climate change.
This project exemplifies the benefits of tight collaboration among scientists, artists, computer scientists, and visualization specialists.

From Princeton University by Catherine Zandonella

The global ocean is the Earth's heating and cooling system, pushing balmy tropical waters toward the poles and bringing back colder, nutrient-rich waters.
But modeling this system is extremely complex, resulting in billions of data points.
To tackle the complexity, researchers at three Princeton-area institutions have transformed complex modeling data into an easily understandable animated movie showing how ocean temperatures and saltiness change over time.
The animation could help climate researchers explore how factors such as rising carbon dioxide levels alter the ocean's ability to transport heat.
The animation illustrates the power of visualization techniques for presenting complex data in ways that are readily understandable, said Eliot Feibush, leader of the Princeton Visualization Consortium, which brings together researchers from the University's Princeton Institute for Computational Science and Engineering (PICSciE), the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL), and the U.S. Department of Commerce's National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory (GFDL).

 A Gyre of Salt: A Climate Model Visualization
A new video animation demonstrates the power of data visualization techniques to make sense of vast amounts of information.
The animation, which reveals how ocean temperatures and salinity change over the course of a year, is based on data from global climate models.
These models aid our understanding of the physical processes that create the Earth's climate, and inform predictions about future changes in climate.

"People are working with increasing amounts of data in all areas of science, and they need better ways to evaluate their results," said Feibush, who divides his time between PPPL and PICSciE.
"The techniques we developed are being applied to climate modeling but the methods can be used for other complex data sets that change over time," he said.
Data visualization techniques make it easier to comprehend information, spot trends and even identify mistakes, Feibush said.
"Visualization helps us to understand complexity — it is more than just a pretty picture."

 Hindcast of the peak of the 2008 hurricane season, one of the most active on record, simulated by an FV3-powered GFDL model at 13-km resolution.
FV3 improves representation of small-scale weather features such as hurricanes while maintaining the quality of large-scale global circulation.

 Matthew Harrison, a climate scientist at GFDL, worked with Feibush to adapt results of climate models into formats that could be used to generate the animation.
Climate models are computer programs that combine real-world observations of temperature, salinity, rainfall amounts and other factors with physical laws.
The models can help researchers make better long-term climate predictions and short-term weather forecasts.
"Understanding how heat moves through the ocean is essential for predicting the behavior of the climate we experience on land," Harrison said.

The process starts when tropical waters soak up the sun's heat.
Ocean currents push heated water toward the poles, warming not only the northern and southern oceans but also the air and land.
This "ocean heat engine" makes northern Europe considerably more habitable than it otherwise would be.
In the North Atlantic, warm water from the tropics rides the Gulf Stream extension northward toward the Norwegian Sea and mixes with cold water from the Arctic.
Cold water is denser than warm water, so the mixed water sinks and eventually makes its way southward, bringing nutrients to fisheries off the coast of North America.
The water's saltiness, or salinity, plays a significant role in this ocean heat engine, Harrison said.
Salt makes the water denser, helping it to sink.
As the atmosphere warms due to global climate change, melting ice sheets have the potential to release tremendous amounts of fresh water into the oceans.
Climate visualizations can help researchers see how the influx of fresh water affects global ocean circulation over time.

The animation reveals how factors like evaporation, rainfall and river runoff affect salinity. For example, the Mediterranean Sea, which lies in an arid region and has only a narrow outlet, is much saltier than the nearby Atlantic Ocean.
In contrast, over 250 rivers flow into the Baltic Sea between mainland Europe and Scandinavia, so the sea is about one-seventh as salty as the Atlantic Ocean.

One of the special aspects of this video animation is its high resolution, Harrison said.
The simulation's resolution is six million pixels, which is like dividing up the world's ocean surface into a grid consisting of six million sectors.
Each sector corresponds to an ocean area of about 10 kilometers on each side.
The model calculates the temperature and salinity for each sector, which becomes a pixel, or a colored spot on the screen.
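As a concrete illustration of this sector-to-pixel mapping, here is a minimal sketch that renders a synthetic global field one pixel per ~10 km grid cell. The data are invented stand-ins, not MPAS-Ocean or GFDL output:

```python
# Minimal sketch of the sector-to-pixel mapping described above,
# using synthetic stand-in data rather than actual model output.
import numpy as np
import matplotlib.pyplot as plt

# A hypothetical global grid of roughly six million cells (~10 km each).
nx, ny = 3600, 1800
lat = np.linspace(-90, 90, ny)

# Synthetic sea-surface temperature: warm at the equator, cold at the poles.
sst = 28.0 * np.cos(np.radians(lat))[:, None] ** 2 * np.ones((ny, nx))

# Each grid cell becomes one pixel; color encodes the cell's value.
plt.imshow(sst, origin="lower", extent=[-180, 180, -90, 90], cmap="plasma")
plt.colorbar(label="temperature (°C)")
plt.title("One pixel per ~10 km grid cell (synthetic data)")
plt.show()
```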
In the real world, weather conditions change from moment to moment.
To capture this variability, GFDL's climate models incorporate weather conditions collected from ground stations and satellites to update the model hourly with near-surface wind speeds, temperature, rainfall and solar radiation.
The calculations run on supercomputers at Oak Ridge National Laboratory in Tennessee.
"There is an art to handling large amounts of data," said Whit Anderson, deputy director of GFDL and himself an oceanographer.
"You cannot just brute-force the large and complex data produced at facilities like GFDL and PPPL through a commercial product.
"The increase in the amount of data is due to our better understanding of climate and weather," Anderson continued.
"These large amounts of data in turn are giving us improved skill in predicting future climate and weather."

The video animation could help climate researchers explore how factors such as rising carbon dioxide levels alter the ocean's ability to transport heat.
(Photo by Nick Donnoli, Office of Communications)
Student involvement

Feibush credits the project’s success to his student interns, some of whom started on the visualization project at PPPL while in high school.
"Without the students, this wouldn't have happened," Feibush said.
One of these students, Matthew Lotocki, started working with Feibush while a senior at the Bergen County Academy for Technology and Computer Science, a public magnet high school in New Jersey.
"It was an amazing opportunity," said Lotocki, a member of Princeton's Class of 2017.
"You get to work with cutting-edge computer clusters and systems, really interesting projects, and with a mentor who teaches you how to make the tools to create really cool visualizations."

One of the challenges was figuring out how to combine different types of computer processors to work on the task.
Today's scientists often take advantage of the power of video-game graphics processing units (GPUs) to do their computations.
Lotocki had to get the GPU to do the calculations and generate the graphics on the screen.
Another student intern, Michael Knyszek, who attends the University of California, Berkeley, and who had an internship with Feibush as part of the Department of Energy Science Undergraduate Laboratory Internship program, programmed the GPU to combine layers of data.
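As a loose sketch of this kind of GPU layer combination (the article does not show the interns' actual code), the following blends two invented data layers on the GPU using CuPy, a NumPy-like GPU array library, and copies the finished frame back to the host:

```python
# Minimal sketch of combining two data layers on a GPU. CuPy stands in
# for the project's actual GPU code; the layers and weight are invented.
import cupy as cp

h, w = 1800, 3600
ocean_layer = cp.random.rand(h, w)     # stand-in gridded ocean field
background  = cp.random.rand(h, w)     # stand-in terrain/background layer

alpha = 0.6  # blend weight for the ocean layer
# The per-pixel blend executes in parallel across GPU threads.
composite = alpha * ocean_layer + (1.0 - alpha) * background

frame = cp.asnumpy(composite)          # copy the finished frame to host memory
```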

The developers made the video animation look more realistic by incorporating NASA satellite data of the changing colors of the terrain to show typical seasonal changes.
Zachary Stier, Princeton Class of 2020, also worked on the project as a student at the Bergen Academies.
"A lot of the challenge was figuring out the right tools to address the questions we had at hand," Stier said.
"There were some tasks for which there was no documentation for what we wanted to do."
The experience working at PPPL was one of the factors that influenced Stier's decision to come to Princeton.
"I am much more able to look at a problem and do the research into what tools are available to attack the problem," he said.

The consortium combines three institutions, each with a different research focus. Princeton University is home to expert scientists in a wide range of disciplines.
Scientists at PPPL, which is managed by Princeton University, are developing fusion energy, which involves creating charged gases known as plasmas in a confined reactor to provide a safe and abundant source of electricity. GFDL's expertise is climate modeling.
"Our organizations all work on very different things but one thing that we all have in common is the need to visualize large and complex data," Anderson said.
Funding was supplied by the U.S. Department of Energy and the National Science Foundation.

Links :

Sunday, January 15, 2017

Saturday, January 14, 2017

High tide in Saint Malo

15 December 2016

see maree.info

  Saint Malo with the GeoGarage platform (SHOM chart)

Friday, January 13, 2017

Australia AHS update in the GeoGarage platform

2 new nautical raster charts added & 20 charts updated

Effective surveying tool for shallow-water zones

 EOMAP is patenting technology that can map the water depth and chlorophyll content of lakes in satellite photographs, providing quality control for environmental projects that clean algae from lakes.
Such software processing is challenging to describe in patents, but the company made the investment to protect its innovations from competitors.

From Hydro by Dr. Knut Hartmann, Dr. M. Wettle, Dr. Thomas Heege, EOMAP GmbH & Co.KG, Germany

A recent article provides an overview of satellite-derived bathymetry methods, shows how the data can be integrated into survey campaigns, and showcases three use cases.
Bathymetric data in shallow-water zones is of increasing importance to support various applications such as safety of navigation, reconnaissance surveys, coastal zone management or hydrodynamic modelling.
A gap was identified between data demand, costs and the ability to map with ship and airborne sensors.
This has led to the rise of a new tool to map shallow-water bathymetry using multispectral satellite image data, widely known as satellite-derived bathymetry (SDB).

 Figure 1: The diagram shows the relative amount of measured light energy that contains water depth information.

Strictly speaking, methods to derive information on seafloor topography from reflected sunlight date back to the 1970s, but iterative improvements in algorithms, computational power, satellite sensors and processing workflows were required to produce the current state-of-the-art tool.
Today, a range of different methods exist under the umbrella of the SDB term.
However, as with traditional survey methods, it is imperative to understand the advantages, disadvantages and overall feasibility in order to evaluate the suitability and fit-for-purpose of a given SDB application.

Bathymetric Data Production using Optical Satellite Imagery

Historically, empirical methods were used, which require known depth information over the study area.
By comparing these known depths with the satellite signal, a statistical relationship can be derived that adequately describes depth as a function of the signal.
Aside from requiring known depth data, these methods will only work for a given satellite image.
A subsequent satellite scene, even of the same location, may contain different atmospheric and in-water parameters, and thus the statistical relationship needs to be re-calculated.

Another aspect of these methods is that the statistical relationship is only valid for one water type and one seafloor type.
Therefore, if an area contains different types such as coral, sediment, algae and rubble, the statistical relationship needs to be calculated for each of these substrate types.
The correct formula then needs to be applied to each pixel in the image, i.e. the algorithm needs to be informed a priori which substrate type it is encountering in that image pixel.
This brings the problem full circle back to one of the fundamental challenges of satellite-derived bathymetry: how do you know whether a darker signal is due to deeper water, a darker substratum, or a bit of both?
These methods can still be useful as they are relatively straightforward to implement (see The IHO-IOC GEBCO Cook Book, 2016).
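For illustration, here is a minimal sketch of one widely used member of this empirical family, the band log-ratio regression of Stumpf et al. (2003). It is a generic example, not EOMAP's algorithm, and all reflectances and depths below are synthetic:

```python
# Minimal sketch of an empirical SDB method: depth is regressed against
# the ratio of log-transformed blue and green bands at calibration pixels.
import numpy as np

N = 1000.0  # fixed scaling constant keeping the log arguments positive

def log_ratio(blue, green):
    return np.log(N * blue) / np.log(N * green)

# Calibration pixels: stand-in band reflectances plus known depths
# (e.g. from echosounder soundings).
rng = np.random.default_rng(0)
blue  = rng.uniform(0.01, 0.20, 50)
green = rng.uniform(0.01, 0.20, 50)
known_depth = 10.0 * log_ratio(blue, green) - 5.0 + rng.normal(0, 0.1, 50)

# Fit the linear relation depth = m0 * ratio + m1.
m0, m1 = np.polyfit(log_ratio(blue, green), known_depth, 1)

# Apply the fitted relation to any pixel of *this* image. As noted above,
# the coefficients do not transfer to other scenes, water or substrate types.
depth = m0 * log_ratio(blue, green) + m1
print(f"fitted: depth = {m0:.2f} * ratio + {m1:.2f}")
```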

Physics-based methods, on the other hand, do not require known depth information for the study area, and can therefore be applied independently of satellite data type and study area.
These methods rely on fully describing the physical relationship between the measured light signal and the water column depth.
Optical variability in the atmosphere and water column is accounted for within the algorithm inversion, and no 'tuning' to known depths is required.

Therefore, an area which is physically inaccessible and for which there is no previous information known can be targeted.
Not surprisingly, these physics-based methods require more sophisticated algorithms and powerful processing capacity.
The benefit is that they typically prove to be more accurate, especially in areas with varying substrate types, turbidity and/or atmospheric conditions.
This is of particular importance because only a small fraction of the sunlight recorded by the satellite’s sensor originates from the source that can be associated with water depth.
Depending on the wavelength channel, this fraction typically ranges from less than 1% up to a maximum of about 20%, going from near-infrared to green/blue light energy.
It is critical to accurately account for the other sources of light energy in order to separate out the relevant water column depth contribution to the measured signal.
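A back-of-the-envelope radiance budget, with invented numbers chosen to fall in the range quoted above, shows how small the depth-bearing share of the measured signal can be:

```python
# Illustrative at-sensor radiance budget (hypothetical numbers): only the
# bottom-reflected component carries water-depth information.
L_total      = 100.0   # total measured radiance (arbitrary units)
L_atmosphere = 85.0    # atmospheric path radiance
L_surface    = 5.0     # sun and sky glint at the air-water interface
L_column     = 7.0     # scattering within the water column itself

L_bottom = L_total - L_atmosphere - L_surface - L_column
print(f"depth-bearing fraction: {L_bottom / L_total:.0%}")  # -> 3%
```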


Data Integration

The integration of SDB data into daily use can be straightforward if the bathymetric data quality and delivery formats follow best practice.
Hence the file formats typically follow industry standards (OGC) and enable direct use in current GIS or online visualisation tools through Web Map and Web Coverage Service (WMS, WCS) interfaces, hydrographic software or scripting tools.
ISO-conformant metadata, including important information on tidal corrections, processing levels, and the date and time of satellite recording, is essential for geodata and mandatory for all SDB data.
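As a sketch of what such standards-based access can look like from a scripting tool, the following requests an SDB layer over WMS with OWSLib; the service URL and layer name are hypothetical placeholders, not a real endpoint:

```python
# Minimal sketch of pulling an SDB layer via an OGC WMS interface.
from owslib.wms import WebMapService

wms = WebMapService("https://example.com/sdb/wms", version="1.3.0")  # hypothetical
response = wms.getmap(
    layers=["sdb_depth_grid"],        # hypothetical layer name
    srs="EPSG:4326",
    bbox=(52.0, 24.0, 53.0, 25.0),    # (minx, miny, maxx, maxy)
    size=(1024, 1024),
    format="image/png",
)
with open("sdb_tile.png", "wb") as f:
    f.write(response.read())
```

A WCS request would work the same way where the gridded depth values themselves, rather than a rendered image, are needed.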

Furthermore, it is important to understand the uncertainties in the data as well as the limitations of SDB for a given application in order to integrate the data appropriately.
Such information needs to be expressed in uncertainty layers which should ideally include quantitative information.
For some applications, such as safety of navigation, additional information such as the ability to identify obstructions of different sizes needs to be included as well.

Use Case: Safety of Navigation

Satellite-derived Bathymetric information supports safety of navigation by providing up-to-date and high-resolution grids of the shallow-water zone.
This is of particular importance in areas with outdated charts or dynamic seafloor.
In addition to the bathymetric information itself, the identification of obstructions that could pose a risk to navigation is of particular importance.

 Figure 2: Current ENC (March 2016, left) and the same ENC overlaid with SDB data (right), showing shoal misplacement and the low level of detail of the ENC compared to the Satellite-derived Bathymetry ENC.

Ideally the bathymetric data are provided in the form of electronic navigational charts (ENCs) for use in ECDIS (Electronic Chart Display and Information System), the main navigation device and the standard aboard the majority of vessels.
Satellite-derived Bathymetry data cannot immediately be used for navigation with ECDIS – however, it can serve as an additional data source when updating the bathymetric information of nautical charts (paper or digital).
ENC Bathymetry Plotter, a recently finished software product of SevenCs’ chart production suite, represents a powerful tool to create depth-related information objects for inclusion in ENCs which fulfill all relevant IHO quality standards.
SevenCs and EOMAP have teamed up to provide an innovative service: up-to-date shallow-water bathymetry delivered as a standard ENC.
This can therefore be used immediately on board vessels.
An update of official ENCs incorporating satellite-derived bathymetric data is therefore possible at the commencement of a voyage, but also during the vessel’s journey via satellite communication, allowing for the planning of more efficient shipping routes, increased safety, and improved situational awareness to react to a forced change of shipping route (e.g. weather events or other threats).

The need to update ENCs for safety of navigation is clearly most pressing in poorly mapped areas.
SDB should not be understood as a replacement for recent, high-resolution, high-quality ENCs where these are available.

In 2016, bathymetric data were provided to Van Oord covering several atolls in the Maldives.
The data were used to enhance safe navigation by charting all shoals, whether or not they were indicated on Electronic Navigational Charts.
This contributed to efficient planning of the project’s activities.
Data were provided within a few days of ordering, covering an area of several hundred sq km, which showcases the flexibility of the technique.

 Figure 3: Baseline data on seafloor information based on satellite images and physics-based algorithms.

Use Case: Reconnaissance Survey

Satellite-derived Bathymetry can play a role as a reconnaissance survey tool in applications ranging from shallow-water seismic surveys and coastal engineering to the optimal planning of acoustic surveys.
Although different in usage, all of these applications have in common that they require bathymetric data which is:
  • spatial
  • high resolution
  • rapidly available
  • affordable within a typical planning phase budget.
Reconnaissance surveys are usually relevant for areas which are poorly surveyed, where charts are outdated or where bathymetric data are simply not accessible.
Many examples for these kinds of applications have already been published and two showcases are summarised in the following paragraphs.

In 2013, EOMAP mapped the shallow-water bathymetry of the entire Great Barrier Reef, Australia, at 30m grid resolution.
This was the first depth map of its kind for the entire Great Barrier Reef, and also the largest optical SDB dataset ever made.
In 2014, Shell published a paper on the use of EOMAP’s Satellite-derived Bathymetry (delivered at 2m grid resolution) to support their shallow-water seismic campaign in northwest Qatar (Siermann et al., 2014).
Shell summarised the benefits of using the satellite technique over more traditional methods, citing cost savings of one million USD and very timely delivery of the data.

 Figure 4: Example of the seamless multisource bathymetric grid for the Persian Gulf, including Satellite-derived bathymetric data (left) and the GEBCO dataset (right).

Use Case: Basis Data for Hydrodynamic Modelling

Hydrodynamic modelling exercises, such as generating tsunami forecast models, are typically not the type of applications with budgets that allow for purchasing bathymetric survey campaigns using more traditional methods.
Commonly, very coarse resolution bathymetric grids such as GEBCO are used instead, but this has limited validity in coastal areas.
By using Satellite-derived Bathymetry, shallow-water depth data can be derived at a fit-for-purpose grid resolution within a limited budget.
As a standalone dataset it does not fulfil the modellers’ requirements, but when merged with up-to-date information on the coastline (also derived from the satellite imagery) and with survey and chart information, a seamless shoreline-to-deep-water dataset can be created, which greatly improves on currently available datasets.
Such a dataset was created for the Gulf region, which now serves as bathymetric dataset for tsunami modelling in the area.

Future Perspectives

Over the medium term it is expected that satellite-derived mapping of the seafloor will continue to be increasingly accepted and integrated as a survey tool - as is already the case for a number of innovative user groups.
Developments are still needed in areas such as how to best quantify uncertainties and small scale obstructions.
One likely development will be a multitemporal and sensor-agnostic mapping approach, which can be oversimplified as: use all available image data to the best possible extent and quality.
With advances in cloud computing, physics-based algorithms and an increasing selection of image data, this would be a natural evolution for Satellite-derived Bathymetry.

Links :

Thursday, January 12, 2017

Communicating under sea ice

The long-range under-ice sound communication system developed by WHOI engineer Lee Freitag and his colleagues.
In the Arctic Ocean, a cold water layer bounded above and below by warmer layers acts as a "sound duct" that channels sound waves over long distances.
Sound beacons suspended in the channel emit information-carrying sound signals that travel to other buoys and to autonomous underwater vehicles under the ice.
Data is relayed from the buoys to scientists via satellite.
(Illustration by Eric Taylor, WHOI Graphic Services)

From WHOI Oceanus magazine by Kate Madin

Engineers use ocean channel to efficiently relay sound

Banks Island is one of 36,563 ice-covered islands sprinkled in the Arctic Ocean north of Canada.
It is home to the world’s largest population of muskoxen—about 68,000—one tiny village with a population of slightly more than 100 people, and an airport, which during the spring and summer of 2014 bustled with researchers poised to jump into the vast white Arctic.
Peter Koski and John Kemp, two engineers at Woods Hole Oceanographic Institution (WHOI), waited in the isolated village through days of high winds and frozen fog.
Finally, on a Saturday in March, the weather cleared.
A pilot gave the OK.
Then pilot, co-pilot, mechanic, Koski, and Kemp took off and flew out over the frozen Beaufort Sea in a small red Twin Otter plane packed full of cables and buoys.
During two days of hectic, hopscotching flights, taking off and landing on patches of floating sea ice, they set up equipment at eight remote sites to carry out a long-awaited experiment.
Their goal was to establish a long-distance communications system that would transmit and receive signals under water and under ice.
Like the telegraph in the Old West, such a system could open up this previously inaccessible ocean to exploration, allowing fleets of autonomous underwater vehicles to navigate and collect data in ice-covered areas where ships and people cannot easily go.
Such data are essential for scientists and the Navy to gain a better understanding of the Arctic, a rapidly changing region that is critical for both environmental and military reasons.
The key to the experiment lay in taking advantage of a naturally occurring layer of water that forms in the Arctic and efficiently channels sound over long ranges—a sound duct within the ocean.
Scientists and the Navy had exploited similar sound ducts in other oceans to measure the water temperatures and find distant submarines.
Would it work in the Arctic Ocean, where the upper 3,280 feet (1,000 meters) of the ocean is completely different than anywhere else in the world?

 A possible future integrated acoustic communications system in the Arctic.
Autonomous vehicles and gliders transmit data via sound signals to transponders suspended beneath the ice.
The transponders send the data to the buoys’ antennas, and from there via satellite to scientists in other locations.
Scientists can control the vehicles’ movements by communicating via satellite with the buoys, which send sound signals to the vehicles.
(Painting by E. Paul Oberlander, Woods Hole Oceanographic Institution)

Sound pipelines within the ocean

Many transmission options available on land, such as light and radio waves, don’t work under water.
But as whales know well, sound travels far under water, especially low-frequency sound.
Indeed, scientists with acoustic receivers can sometimes hear the deep tones of whale songs or sound waves from earthquakes from thousands of miles away.
During World War II, two scientists, Maurice Ewing and J. Lamar Worzel, conducted basic research at WHOI on sound wave propagation in the ocean—seeking any advantages that would help the Navy detect enemy submarines or mines, or help American subs avoid detection.
In a critical experiment, they detonated one pound of TNT under water near the Bahamas and detected the sound 2,000 miles away near West Africa.
The test confirmed Ewing’s theory that low-frequency sound waves were less easily scattered or absorbed by water and could travel very far.
The scientists discovered a layer of water, between 2,000 and 4,000 feet deep in the ocean, that acted like a pipeline to channel low-frequency sound and transmit it over long distances: the SOFAR (Sound Fixing and Ranging) channel.
The explanation for the SOFAR channel is that the ocean settles into either denser or more buoyant layers of water based on their salinity and temperature.
Sound energy travels in waves that speed up in waters near the surface, where temperatures are warmer, or near the bottom, where water pressure is higher.
In between lies the SOFAR channel, which is bounded top and bottom by water layers where sound velocities are high and sound dissipates quickly.
The boundaries act like a ceiling and floor.
When sound energy enters the channel from below, it slows down.
When it interacts with the ceiling, it is refracted back downward.
Eventually it reaches the bottom boundary of the channel, the high-pressure water near the seafloor, and is refracted back upward again.
In this way, sound is efficiently channeled horizontally with minimal loss of signal.
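To make this concrete, here is a minimal sketch that evaluates a standard simplified sound-speed formula (Medwin's approximation) on a synthetic temperature profile and locates the speed minimum that defines the channel axis. The profile is illustrative, not measured data:

```python
# Why a sound channel forms: speed is high near the warm surface, high
# again at depth (pressure term), and lowest in between.
import numpy as np

def sound_speed(T, S, z):
    """Medwin's simplified formula: c in m/s from temperature T (deg C),
    salinity S (psu) and depth z (m)."""
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

z = np.linspace(0.0, 4000.0, 401)      # depth, m
T = 2.0 + 20.0 * np.exp(-z / 600.0)    # synthetic thermocline: warm to cold
c = sound_speed(T, 35.0, z)

axis = z[np.argmin(c)]                 # sound refracts back toward this depth
print(f"sound-speed minimum (channel axis) near {axis:.0f} m")
```

With this particular synthetic profile the minimum falls near 1,300 m; the real channel depth depends on the actual temperature and salinity structure.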
The Navy immediately saw the value of the SOFAR channel.
It deployed a network of underwater microphones, called hydrophones, to optimally exploit the SOFAR channel to listen for submarines.
More than six decades later, WHOI researchers explored whether something similar might work in the Arctic.
A WHOI engineering team led by Lee Freitag and including Keenan Ball, James Partan, Peter Koski, and Sandipa Singh developed a system to achieve long-distance sound communication under the ice, enabling the navigation and control of autonomous vehicles.
Koski and Kemp brought it to Banks Island to put it to the ultimate test.

On the U.S. Coast Guard icebreaker Healy during a 2016 follow-on experiment in the Arctic, WHOI research engineer Lee Freitag examines the electronics of a new sound-based communications and navigation system that he and his colleagues developed and used in the Arctic.
(Photo courtesy of Lee Freitag, Woods Hole Oceanographic Institution)

A multi-layered Arctic Ocean

The reasons to study the Arctic are compelling.
It is the region of the globe that is warming fastest, causing rapid changes in air-ice-ocean dynamics that not only change the Arctic’s climate but also have cascading impacts on global climate.
Arctic sea ice is diminishing in summer, opening navigation routes and changing the naval theater of operations.
Barriers to studying the Arctic are numerous: 24-hour darkness in winter, severe weather and safety concerns, high expense, and few ships capable of moving through ice.
Autonomous underwater vehicles (AUVs) offer a way around these difficulties, since they could work under the ice without scientists or ships present.
The biggest obstacle has been communications and navigation.
Even in summer, ice makes it impossible for an AUV to come to the surface, take a GPS reading, transmit its data and position, and receive commands.
“We wanted to learn whether we could use acoustic communication in the Arctic to support autonomous vehicles and sensors,” Freitag said.
“We’re exploiting the propagation of sound in the ocean to build a navigation and communications system in the Arctic, so we can tell the vehicles where the ice boundary is, whether they should go north or south, east or west.”
The system is designed to take advantage of a unique combination of conditions that creates a sound channel in the Arctic Ocean.
At the top of the world, water enters the Arctic Ocean from both the Atlantic and Pacific.
Both incoming water masses are warmer than the water residing in the central Arctic Ocean.
“A deeper layer of warm water comes in on the Atlantic side through the Fram Strait,” Freitag said, “and circulates around the Arctic Ocean at about three hundred meters depth.
A different current of warm water comes in from the Pacific side, from the Bering Sea, in the summer, and it goes to about fifty to a hundred meters deep.”
These different incoming currents create a watery “layer cake” of different densities and temperatures in the Canada Basin, where Freitag’s team worked.
“You have very cold Arctic air above the surface, causing very cold water at the surface,” Freitag said.
“Then a warmer layer originally from the Pacific at fifty to a hundred meters.
Then a layer of colder central Arctic Ocean water below that, and finally at three hundred meters, there’s the layer of warmer Atlantic water.”
The two warm layers create top and bottom boundaries to a colder layer, which is the sound duct.
While narrower in depth than the SOFAR channel of the temperate ocean, the sound channel in this area north of Alaska and Canada acts similarly.
“Sound stays in this duct, bounded by these two warm layers,” Freitag said.
“Warmer water above and below results in a faster sound speed.
Sound bends away from the faster water and the sound in the duct travels farther.
Nothing magic, it’s just physics.”
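The same kind of calculation on a synthetic Arctic "layer cake" profile, with invented warm layers near 75 m (standing in for the Pacific-origin water) and 300 m (Atlantic-origin water), reproduces a sound-speed minimum between them, i.e. the duct Freitag describes:

```python
# Synthetic Arctic profile: two warm layers bound a cold layer whose
# sound-speed minimum forms the duct described above.
import numpy as np

def sound_speed(T, S, z):
    # Medwin's simplified formula (m/s), as in the previous sketch.
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

z = np.linspace(0.0, 500.0, 501)
T = (-1.5
     + 2.5 * np.exp(-((z - 75.0) / 30.0) ** 2)     # warm Pacific-origin layer
     + 1.5 * np.exp(-((z - 300.0) / 60.0) ** 2))   # warm Atlantic-origin layer
c = sound_speed(T, 32.0, z)

between = (z > 100.0) & (z < 300.0)
duct_axis = z[between][np.argmin(c[between])]
print(f"duct axis near {duct_axis:.0f} m, between the two warm layers")
```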

 A lone buoy sits atop Arctic ice in the Canada Basin.
In the water below the buoy, a sound beacon in the cold-water "sound duct" sends out sound-wave signals to communicate with other buoys and autonomous underwater vehicles hundreds of miles away, part of a new long-range under-ice acoustic communication system.
(Photo courtesy of Peter Koski, Woods Hole Oceanographic Institution)

Hopscotching on sea ice

Back on Banks Island, Koski and Kemp waited to test the new acoustic communications system as poor weather canceled takeoffs and research teams stacked up waiting for flights.
“It’s late into March, and we had to do it before the ice condition deteriorated,” Koski said.
Sea ice begins to melt more as 24-hour sunlight returns in summer.
“Every day you wake up, and the pilots decide if the day is good to fly,” he said.
“Everything’s ready—you pack up and go.”
“The pilots have done it before, and they know what they’re looking at,” Koski said.
“They land somewhere and walk the ice, putting out black trash bags filled with snow to mark a runway—in case they need to take off in bad weather or another plane needs to find the runway the next day.”
Twin Otters carry a 2,000-pound payload, including people, equipment, and fuel, Koski said, so five people made up about half the load.
Each flight to an ice location took two hops, with a stop to refuel on ice five to ten feet thick.
“Sometimes, when a team intended to overnight on the ice, they took a bear dog in the plane,” Koski said.
“They hired a trapper or hunter from town, and his dog, to go out with them.
The dog sleeps on a pallet outside the tents and will whine or bark if it smells a polar bear.”
Koski and Kemp—WHOI’s Moorings Operations Group leader—used an auger to drill holes through the ice and install pairs of small buoys at intervals from 24 to 240 miles away from Banks Island.
“We did the farthest point first, so we didn’t get stranded, and made hops on the way back where we could refuel,” Koski said.
“The pilots like to help out.
They’re interested, and everyone depends on each other.
If anything happens to you, you’re two days from medical help.”
Each buoy was connected to a long cable fed through the drill hole.
Each cable carried a transducer suspended within the sound duct, 328 feet down.
Every four hours, the transducers sent out sound signals at a frequency of 900 hertz, about the top of a soprano’s singing range.
“The signal levels are kept as low as possible to conserve energy, and span less than one minute every four hours, minimizing potential environmental impact.
In addition, the sources are never operated in sensitive areas near the Alaska coast,” Freitag said.
“We’d land, get the auger set up, twenty minutes to drill the hole,” Koski said.
“Someone stretches out the equipment and cable, so the buoy is a hundred meters away from the hole.
Make final electrical connections at that point, then put the transducer into the water and turn it on.
When we hear that it’s working, we drag the buoy to the hole, which lowers the transponder as you walk, and we set the buoy onto the hole.”
“The ‘go, no-go’ point is if you can hear the sound signal with your ears,” he said.
“If it’s working, you can hear it.
If yes, then you get back on the plane and go.”

Warming above and below the ice

Freitag was watching on his laptop from the United States, and WHOI scientist Steve Jayne was on Banks Island, when the first signals from the ice buoys deployed by Kemp and Koski reached them.
Signals transmitted via satellite from all eight buoys came through.
“In the course of a weekend, they had put eight buoys in, and the buoys were all able to talk to each other,” Freitag said.
“In a short time, we went from not being positive that it would work for more than a hundred kilometers, to ‘Wow, this works at a few hundred kilometers!’ We were all very, very pleased!”
That July, researchers from the University of Washington launched gliders from a boat out of Prudhoe Bay, Alaska, to test whether the gliders would communicate with the buoy system.
The gliders traveled up and down through the ocean gathering temperature data.
They detected and responded to signals from the WHOI buoy system—but only when they were within the boundaries of the 328- to 984-foot (100- to 300-meter) sound duct.
“We learned that you have to be able to synchronize the time when the transducer’s beacons transmit to the time when the gliders are up in that layer of water—otherwise, they don’t hear it,” Freitag said.
Ironically, the Arctic sound duct that researchers may now use to gain understanding of the region’s rapid climate change is being strengthened by that very same climate change.
“Data show that over the last thirty years this warmer layer has gotten warmer,” Freitag said.
“And so the strength of this sound propagation duct in this part of the Arctic has grown, enabling this sound propagation.”
“What happens in the future, that’s not clear,” he said.
“But regardless, the warm-layer sound channel took some time to form, and it’s not going to go away very soon—given that the temperature of water in the Bering Sea coming into the Arctic has gone up as well.”
“The change in Arctic temperature is absolutely what has enabled this Arctic acoustic network to actually work the way that it does,” Freitag said.
“But in the middle of the winter, there’s still going to be ice.
No matter how open the Arctic gets in the summer due to melt-back, it’s still going to freeze in the winter.”
Robotic vehicles could be gathering data under that winter ice, navigating via an under-ice communications system that transmits the data back to scientists warm and snug in their labs.