Monday, January 16, 2017

Taming oceans of data with new visualization techniques

Climate change research relies on models to better understand and predict the complex, interdependent processes that affect the atmosphere, ocean, and land.
These models are computationally intensive and produce terabytes to petabytes of data.
Visualization and analysis are increasingly difficult, yet they are critical for gaining scientific insight from large simulations.
The recently developed Model for Prediction Across Scales-Ocean (MPAS-Ocean) is designed to investigate climate change at global high resolution (5 to 10 km grid cells) on high-performance computing platforms.
In the accompanying video, we use state-of-the-art visualization techniques to explore the physical processes in the ocean relevant to climate change.
This project exemplifies the benefits of tight collaboration among scientists, artists, computer scientists, and visualization specialists.

From Princeton University by Catherine Zandonella

The global ocean is the Earth's heating and cooling system, pushing balmy tropical waters toward the poles and bringing back colder, nutrient-rich waters.
But modeling this system is extremely complex, resulting in billions of data points.
To tackle the complexity, researchers at three Princeton-area institutions have transformed complex modeling data into an easily understandable animated movie showing how ocean temperatures and saltiness change over time.
The animation could help climate researchers explore how factors such as rising carbon dioxide levels alter the ocean's ability to transport heat.
The animation illustrates the power of visualization techniques for presenting complex data in ways that are readily understandable, said Eliot Feibush, leader of the Princeton Visualization Consortium, which brings together researchers from the University's Princeton Institute for Computational Science and Engineering (PICSciE), the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL), and the U.S. Department of Commerce's National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory (GFDL).

 A Gyre of Salt: A Climate Model Visualization
A new video animation demonstrates the power of data visualization techniques to make sense of vast amounts of information.
The animation, which reveals how ocean temperatures and salinity change over the course of a year, is based on data from global climate models.
These models aid our understanding of the physical processes that create the Earth's climate, and inform predictions about future changes in climate.

"People are working with increasing amounts of data in all areas of science, and they need better ways to evaluate their results," said Feibush, who divides his time between PPPL and PICSciE.
"The techniques we developed are being applied to climate modeling but the methods can be used for other complex data sets that change over time," he said.
Data visualization techniques make it easier to comprehend information, spot trends and even identify mistakes, Feibush said.
"Visualization helps us to understand complexity — it is more than just a pretty picture."

 Hindcast of the peak of the 2008 hurricane season, one of the most active on record, simulated by an FV3-powered GFDL model at 13-km resolution.
FV3 improves representation of small-scale weather features such as hurricanes while maintaining the quality of large-scale global circulation.

 Matthew Harrison, a climate scientist at GFDL, worked with Feibush to adapt results of climate models into formats that could be used to generate the animation.
Climate models are computer programs that combine real-world observations of temperature, salinity, rainfall amounts and other factors with physical laws.
The models can help researchers make better long-term climate predictions and short-term weather forecasts.
"Understanding how heat moves through the ocean is essential for predicting the behavior of the climate we experience on land," Harrison said.

The process starts when tropical waters soak up the sun's heat.
Ocean currents push heated water toward the poles, warming not only the northern and southern oceans but also the air and land.
This "ocean heat engine" makes northern Europe considerably more habitable than it otherwise would be.
In the North Atlantic, warm water from the tropics rides the Gulf Stream extension northward toward the Norwegian Sea and mixes with cold water from the Arctic.
Cold water is denser than warm water, so the mixed water sinks and eventually makes its way southward, bringing nutrients to fisheries off the coast of North America.
The water's saltiness, or salinity, plays a significant role in this ocean heat engine, Harrison said.
Salt makes the water denser, helping it to sink.
As the atmosphere warms due to global climate change, melting ice sheets have the potential to release tremendous amounts of fresh water into the oceans.
Climate visualizations can help researchers see how the influx of fresh water affects global ocean circulation over time.

The animation reveals how factors like evaporation, rainfall and river runoff affect salinity. For example, the Mediterranean Sea, which lies in an arid region and has only a narrow outlet, is much saltier than the nearby Atlantic Ocean.
In contrast, over 250 rivers flow into the Baltic Sea between mainland Europe and Scandinavia, so the sea is about seven times less salty than the Atlantic Ocean.

One of the special aspects of this video animation is its high resolution, Harrison said.
The simulation's resolution is six million pixels, which is like dividing up the world's ocean surface into a grid consisting of six million sectors.
Each sector corresponds to an ocean area of about 10 kilometers on each side.
The model calculates the temperature and salinity for each sector, which becomes a pixel, or a colored spot on the screen.
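As a rough illustration of that sector-to-pixel mapping (not the project's actual rendering code; the array and file names below are hypothetical), a single frame could be produced along these lines:

# Rough sketch: turn a gridded ocean field into one frame of colored pixels.
# "sst" stands in for a 2-D model field (one value per ~10 km sector);
# real model output would be read from simulation files instead.
import numpy as np
import matplotlib.pyplot as plt

ny, nx = 2000, 3000                                   # roughly six million sectors/pixels
sst = 15 + 10 * np.random.rand(ny, nx)                # placeholder temperatures, deg C

fig, ax = plt.subplots(figsize=(12, 8))
im = ax.imshow(sst, cmap="viridis", origin="lower")   # one sector -> one colored pixel
fig.colorbar(im, ax=ax, label="temperature (deg C)")
fig.savefig("frame_0001.png", dpi=200)                # frames are later assembled into the animation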
In the real world, weather conditions change from moment to moment.
To capture this variability, GFDL's climate models incorporate weather conditions collected from ground stations and satellites to update the model hourly with near-surface wind speeds, temperature, rainfall and solar radiation.
The calculations run on supercomputers at Oak Ridge National Laboratory in Tennessee.
"There is an art to handling large amounts of data," said Whit Anderson, deputy director of GFDL and himself an oceanographer.
"You cannot just brute-force the large and complex data produced at facilities like GFDL and PPPL through a commercial product.
"The increase in the amount of data is due to our better understanding of climate and weather," Anderson continued.
"These large amounts of data in turn are giving us improved skill in predicting future climate and weather."

The video animation could help climate researchers explore how factors such as rising carbon dioxide levels alter the ocean's ability to transport heat.
(Photo by Nick Donnoli, Office of Communications)
Student involvement

Feibush credits the project's success to his student interns, some of whom started on the visualization project at PPPL while in high school.
"Without the students, this wouldn't have happened," Feibush said.
One of these students, Matthew Lotocki, started working with Feibush while a senior at the Bergen County Academy for Technology and Computer Science, a public magnet high school in New Jersey.
"It was an amazing opportunity," said Lotocki, a member of Princeton's Class of 2017.
"You get to work with cutting-edge computer clusters and systems, really interesting projects, and with a mentor who teaches you how to make the tools to create really cool visualizations."

One of the challenges was figuring out how to combine different types of computer processors to work on the task.
Today's scientists often take advantage of the power of video-game graphics processing units (GPUs) to do their computations.
Lotocki had to get the GPU to do the calculations and generate the graphics on the screen.
Another student intern, Michael Knyszek, who attends the University of California, Berkeley, and who interned with Feibush through the Department of Energy's Science Undergraduate Laboratory Internship program, programmed the GPU to combine layers of data.
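The article does not show the students' GPU code, but the layer combination it describes is essentially per-pixel compositing, the operation a fragment shader performs when stacking a data layer over a background. A minimal CPU sketch of the standard "over" blend, with hypothetical RGBA arrays:

# Sketch of per-pixel "over" compositing for stacking image layers
# (hypothetical arrays; the actual project ran this kind of work on the GPU).
import numpy as np

def composite_over(top, bottom):
    """Blend RGBA layer 'top' over 'bottom'; values are floats in 0..1."""
    a_top, a_bot = top[..., 3:4], bottom[..., 3:4]
    a_out = a_top + a_bot * (1.0 - a_top)
    rgb = (top[..., :3] * a_top + bottom[..., :3] * a_bot * (1.0 - a_top)) / np.maximum(a_out, 1e-8)
    return np.concatenate([rgb, a_out], axis=-1)

# e.g. an ocean-temperature layer (transparent over land) composited over a terrain layer:
# frame = composite_over(ocean_rgba, terrain_rgba)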

The developers made the video animation look more realistic by incorporating NASA satellite data of the changing colors of the terrain to show typical seasonal changes.
Zachary Stier, Princeton Class of 2020, also worked on the project as a student at the Bergen Academies.
"A lot of the challenge was figuring out the right tools to address the questions we had at hand," Stier said.
"There were some tasks for which there was no documentation for what we wanted to do."
The experience working at PPPL was one of the factors that influenced Stier's decision to come to Princeton.
"I am much more able to look at a problem and do the research into what tools are available to attack the problem," he said.

The consortium combines three institutions, each with a different research focus. Princeton University is home to expert scientists in a wide range of disciplines.
Scientists at PPPL, which is managed by Princeton University, are developing fusion energy, which involves confining hot, charged gases known as plasmas in a reactor to provide a safe and abundant source of electricity. GFDL's expertise is climate modeling.
"Our organizations all work on very different things but one thing that we all have in common is the need to visualize large and complex data," Anderson said.
Funding was supplied by the U.S. Department of Energy and the National Science Foundation.

Saturday, January 14, 2017

High tide in Saint Malo

15 December 2016

see maree.info

  Saint Malo with the GeoGarage platform (SHOM chart)

Friday, January 13, 2017

Australia AHS update in the GeoGarage platform

2 new nautical raster charts added & 20 charts updated

Effective surveying tool for shallow-water zones

 EOMAP is patenting technology that can map the water depth and chlorophyll content of lakes in satellite photographs, providing quality control for environmental projects that clean algae from lakes.
Such software processing is challenging to describe in patents, but the company made the investment to protect its innovations from competitors.

From Hydro by Dr. Knut Hartmann, Dr. M. Wettle, Dr. Thomas Heege, EOMAP GmbH & Co.KG, Germany

A recent article provides an overview of satellite-derived bathymetry methods and how data can be integrated into survey campaigns, and showcases three use cases.
Bathymetric data in shallow-water zones is of increasing importance to support various applications such as safety of navigation, reconnaissance surveys, coastal zone management or hydrodynamic modelling.
A gap has been identified between the demand for such data and the cost and feasibility of mapping with ship-based and airborne sensors.
This has led to the rise of a new tool to map shallow-water bathymetry using multispectral satellite image data, widely known as satellite-derived bathymetry (SDB).

 Figure 1: The diagram shows the relative amount of measured light energy that contains water depth information.

Strictly speaking, methods to derive seafloor topography from reflected sunlight date back to the 1970s, but iterative improvements in algorithms, computational power, satellite sensors and processing workflows were needed to arrive at the current state-of-the-art tool.
Today, a range of different methods exist under the umbrella of the SDB term.
However, as with traditional survey methods, it is imperative to understand the advantages, disadvantages and overall feasibility in order to evaluate the suitability and fit-for-purpose of a given SDB application.

Bathymetric Data Production using Optical Satellite Imagery

Historically, empirical methods were used, which require known depth information over the study area.
By comparing these known depths with the satellite signal, a statistical relationship can be derived that adequately describes depth as a function of the signal.
Aside from requiring known depth data, these methods will only work for a given satellite image.
A subsequent satellite scene, even of the same location, may contain different atmospheric and in-water parameters, and thus the statistical relationship needs to be re-calculated.

Another aspect of these methods is that the statistical relationship is only valid for one water type and one seafloor type.
Therefore, if an area contains different types such as coral, sediment, algae and rubble, the statistical relationship needs to be calculated for each of these substrate types.
The correct formula then needs to be applied to each pixel in the image, i.e. the algorithm needs to be informed a priori which substrate type it is encountering in that image pixel.
This brings the problem full circle back to one of the fundamental challenges of satellite-derived bathymetry: how do you know whether a darker signal is due to deeper water, a darker substratum, or a bit of both?
These methods can still be useful as they are relatively straightforward to implement (see The IHO-IOC GEBCO Cook Book, 2016).
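For illustration, one widely used empirical variant described in the GEBCO Cook Book is the log-band-ratio method, in which a linear relationship is fitted between known depths and the ratio of log-transformed blue and green reflectances. A minimal sketch with made-up numbers (not EOMAP's method):

# Minimal sketch of an empirical log-band-ratio depth model fitted to known depths.
# All numbers below are invented for illustration only.
import numpy as np

def band_ratio(blue, green, n=1000.0):
    # Ratio of log-transformed reflectances; n keeps the logarithms positive.
    return np.log(n * blue) / np.log(n * green)

rrs_blue  = np.array([0.012, 0.010, 0.008, 0.006, 0.005])   # reflectance at calibration points
rrs_green = np.array([0.015, 0.012, 0.009, 0.006, 0.004])
z_known   = np.array([2.0, 4.0, 7.0, 11.0, 15.0])           # known depths (m), e.g. sparse soundings

m1, m0 = np.polyfit(band_ratio(rrs_blue, rrs_green), z_known, 1)   # depth ~ m1 * ratio + m0

# The fitted coefficients are only valid for this scene, water type and seafloor type;
# a new scene or a different substrate requires re-fitting, as noted above.
# depth_map = m1 * band_ratio(scene_blue, scene_green) + m0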

Physics-based methods, on the other hand, do not require known depth information for the study area, and can therefore be applied independently of satellite data type and study area.
These methods rely on fully describing the physical relationship between the measured light signal and the water column depth.
Optical variability in the atmosphere and water column is accounted for within the algorithm inversion, and no 'tuning' to known depths is required.

Therefore, an area that is physically inaccessible and for which no previous information is known can be targeted.
Not surprisingly, these physics-based methods require more sophisticated algorithms and powerful processing capacity.
The benefit is that they typically prove to be more accurate, especially in areas with varying substrate types, turbidity and/or atmospheric conditions.
This is of particular importance because only a small fraction of the sunlight recorded by the satellite's sensor originates from the water column and seafloor and can therefore be associated with water depth.
Depending on the wavelength channel, this fraction typically ranges from less than 1% in the near-infrared up to a maximum of about 20% in the green/blue part of the spectrum.
It is critical to accurately account for the other sources of light energy in order to separate out the relevant water column depth contribution to the measured signal.
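For context, physics-based approaches typically invert a shallow-water reflectance model of roughly the following form (a simplified version of widely used semi-analytical models; not necessarily EOMAP's exact formulation):

$$
r_{rs}(\lambda) \approx r_{rs}^{\mathrm{deep}}(\lambda)\left[1 - e^{-2K(\lambda)\,z}\right] + \frac{\rho_b(\lambda)}{\pi}\, e^{-2K(\lambda)\,z}
$$

where $r_{rs}$ is the subsurface remote-sensing reflectance, $K(\lambda)$ an effective attenuation coefficient of the water column, $\rho_b(\lambda)$ the seafloor albedo and $z$ the water depth; the inversion solves for depth, water constituents and bottom reflectance after atmospheric and water-surface effects have been removed from the measured signal.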


Data Integration

The integration of SDB data into daily use can be straightforward if the bathymetric data quality and delivery formats follow best practice.
Hence the file formats typically follow OGC industry standards and enable direct use in current GIS or online visualisation tools through Web Map Service (WMS) and Web Coverage Service (WCS) interfaces, hydrographic software or scripting tools.
ISO-conformant metadata, including important information on tidal corrections, processing levels, and the date and time of satellite acquisition, are essential for geodata and mandatory for all SDB data.
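As one illustration of what such standards-based access looks like in practice, an SDB layer published through a WMS endpoint could be pulled into a script with the Python OWSLib package (the service URL and layer name below are placeholders, not a real EOMAP service):

# Sketch: request an SDB layer through a standard OGC WMS interface using OWSLib.
# The service URL and layer name are placeholders, not a real endpoint.
from owslib.wms import WebMapService

wms = WebMapService("https://example.com/sdb/wms", version="1.1.1")
img = wms.getmap(layers=["sdb_depth"],                # hypothetical layer name
                 srs="EPSG:4326",
                 bbox=(50.0, 24.0, 51.0, 25.0),       # lon/lat box, e.g. part of the Persian Gulf
                 size=(1024, 1024),
                 format="image/png",
                 transparent=True)
with open("sdb_depth.png", "wb") as f:
    f.write(img.read())

For the gridded depth values themselves, a WCS request would be used instead of a rendered WMS image.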

Furthermore, it is important to understand the uncertainties in the data as well as the limitations of SDB for a given application in order to integrate the data appropriately.
Such information should be expressed in uncertainty layers, which should ideally be quantitative.
For some applications, such as safety of navigation, additional information such as the ability to identify obstructions of different sizes needs to be included as well.

Use Case: Safety of Navigation

Satellite-derived Bathymetric information supports safety of navigation by providing up-to-date and high-resolution grids of the shallow-water zone.
This is of particular importance in areas with outdated charts or a dynamic seafloor.
In addition to the bathymetric information itself, identifying obstructions that could pose a risk to navigation is essential.

 Figure 2: Current ENC (March 2016, left) and the same ENC overlaid with SDB data (right), showing misplaced shoals and the low level of detail in the ENC compared with the satellite-derived bathymetry ENC.

Ideally, the bathymetric data are provided in the form of electronic navigational charts (ENCs) for ECDIS (Electronic Chart Display and Information System), the main navigation device and the standard on the majority of vessels.
Satellite-derived bathymetry data cannot immediately be used for navigation within ECDIS; however, they can serve as an additional data source when updating the bathymetric information of nautical charts (paper or digital).
ENC Bathymetry Plotter, a recently completed software product in SevenCs' chart production suite, is a powerful tool for creating depth-related information objects for inclusion in ENCs that fulfil all relevant IHO quality standards.
SevenCs and EOMAP have teamed up to provide an innovative service: up-to-date shallow-water bathymetry delivered as a standard ENC.
This can therefore be used immediately on board vessels.
Official ENCs that include satellite-derived bathymetric data can therefore be updated at the commencement of a voyage, and also during the vessel's journey via satellite communication. This allows more efficient shipping routes to be planned, increases safety, and improves situational awareness when a forced change of route (e.g. due to weather events or other threats) is required.

The need for updating ENCs for safety of navigation is clearly greatest in poorly mapped areas.
SDB should not be understood as a replacement for recent, high-resolution, high-quality ENCs where these are available.

In 2016, bathymetric data were provided to Van Oord covering several atolls in the Maldives.
The data were used to enhance safe navigation by charting all shoals, whether or not they were indicated on electronic navigational charts.
This contributed to efficient planning of the project's activities.
Data covering an area of several hundred square kilometres were delivered within a few days of ordering, which showcases the flexibility of the technique.

 Figure 3: Baseline data on seafloor information based on satellite images and physics-based algorithms.

Use Case: Reconnaissance Survey

Satellite-derived bathymetry can play a role as a reconnaissance survey tool in applications ranging from shallow-water seismic surveys and coastal engineering to the optimal planning of acoustic surveys.
Although different in usage, all of these applications have in common that they require bathymetric data which are:
  • spatial
  • high resolution
  • rapidly available
  • affordable within a typical planning phase budget.
Reconnaissance surveys are usually relevant for areas which are poorly surveyed, where charts are outdated or where bathymetric data are simply not accessible.
Many examples of these kinds of applications have already been published, and two showcases are summarised in the following paragraphs.

In 2013, EOMAP mapped the shallow-water bathymetry of the entire Great Barrier Reef, Australia, at 30m grid resolution.
This was the first depth map of its kind for the entire Great Barrier Reef, and also the largest optical SDB dataset ever made.
In 2014, Shell published a paper on the use of EOMAP's satellite-derived bathymetry (delivered at 2 m grid resolution) to support its shallow-water seismic campaign in northwest Qatar (Siermann et al., 2014).
Shell summarised the benefits of using the satellite technique over more traditional methods, citing cost savings of 1 million USD and very timely delivery of the data.

 Figure 4: Example of the seamless multisource bathymetric grid for the Persian Gulf, including Satellite-derived bathymetric data (left) and the GEBCO dataset (right).

Use Case: Basis Data for Hydrodynamic Modelling

Hydrodynamic modelling exercises, such as generating tsunami forecast models, are typically not the type of applications with budgets that allow for purchasing bathymetric survey campaigns using more traditional methods.
Commonly, very coarse resolution bathymetric grids such as GEBCO are used instead, but this has limited validity in coastal areas.
By using satellite-derived bathymetry, shallow-water depth data can be derived at a fit-for-purpose grid resolution within a limited budget.
As a standalone dataset it does not fulfil the modellers' requirements, but when merged with up-to-date coastline information (also derived from the satellite imagery) and with survey and chart data, a seamless shoreline-to-deep-water dataset can be created that greatly improves on currently available datasets.
Such a dataset was created for the Gulf region, which now serves as bathymetric dataset for tsunami modelling in the area.
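A minimal sketch of the merging idea, assuming all sources have already been re-gridded to the same lattice and vertical datum (real products involve considerably more careful tide, datum and overlap handling):

# Sketch: merge bathymetry sources by priority into one seamless grid.
# All grids are assumed to be co-registered 2-D arrays on the same lattice,
# with NaN where a source has no data (hypothetical inputs).
import numpy as np

def merge_by_priority(*grids):
    """First grid wins where it has data; later grids fill the remaining gaps."""
    merged = np.full_like(grids[0], np.nan, dtype=float)
    for g in grids:
        merged = np.where(np.isnan(merged), g, merged)
    return merged

# priority: surveyed depths > satellite-derived bathymetry > coarse GEBCO grid
# seamless = merge_by_priority(survey_grid, sdb_grid, gebco_grid)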

Future Perspectives

Over the intermediate term, satellite-derived mapping of the seafloor is expected to become increasingly accepted and integrated as a survey tool, as is already the case for a number of innovative user groups.
Developments are still needed in areas such as how best to quantify uncertainties and how to detect small-scale obstructions.
One likely development will be a multitemporal and sensor-agnostic mapping approach, which can be summarised, somewhat simplistically, as: use all available image data to the best possible extent and quality.
With the advances in cloud computing, physics-based algorithms and an increasing selection of image data, this would be a natural evolution for satellite-derived bathymetry.
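One plausible reading of such a multitemporal, sensor-agnostic approach (an assumption on our part, not a description of a specific product) is an uncertainty-weighted per-pixel combination of every available scene:

# Sketch: combine depth estimates from several scenes per pixel,
# weighting each scene by the inverse variance of its uncertainty estimate.
# Inputs are hypothetical stacks of shape (n_scenes, ny, nx) with NaN gaps.
import numpy as np

def multitemporal_composite(depths, uncertainties):
    w = 1.0 / np.square(uncertainties)            # inverse-variance weights per scene
    w = np.where(np.isnan(depths), 0.0, w)        # ignore cloud/data gaps
    d = np.where(np.isnan(depths), 0.0, depths)
    wsum = w.sum(axis=0)
    weighted = (w * d).sum(axis=0) / np.maximum(wsum, 1e-12)
    return np.where(wsum > 0, weighted, np.nan)   # NaN where no scene had data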
