Thursday, November 21, 2019

The world’s largest caldera discovered in the Philippine Sea



From Forbes by David Bressan

A team of marine geophysicists recently published a paper describing a large igneous massif east of the island of Luzon, located on the bottom of the Philippine Sea.

 Location of Benham Bank with the GeoGarage platform (NGA nautical raster chart)

Based on the morphology, the research suggests that the submarine mountain massif represents the remains of a volcanic caldera with a diameter of ~150 km (93 miles), twice the size of the famous Yellowstone caldera in Wyoming (U.S.).


Bathymetry map showing major undersea features in the West Philippine Basin (WPB): Benham Rise, Central Basin Spreading Center, Palau-Kyushu Ridge, Philippine Trench, East Luzon Trough (ELT), Luzon-Okinawa Fracture Zone, Gagua Ridge, Ryukyu Trench, Urdaneta Plateau, Oki-Daito Rise, Oki-Daito Ridge, Daito Ridge, and Amami Plateau.
Inset shows map location relative to the Philippine Sea Plate.
Background is predicted bathymetry

Gravimetric analysis shows that the Benham Rise, as the submarine mountain massif is named, consists of a nine-mile-thick layer of magmatic and volcanic rocks.

Rock samples range in age from 47.9 to 26 million years, the period when volcanic activity built up the massif.
Sonar surveys of the seafloor also revealed the morphology for the first time.


Bathymetry map of the Benham Rise region produced from NAMRIA's multibeam grid overlaid on predicted bathymetry grid.
Thin grey line marks the extent of multibeam data.
The thin grey lines on the profiles mark slope values and the blue arrows point to where the platform base transitions to the crest.
Abbreviated feature names are: AB = Aurora Bank, BS = Bicol Saddle, BB = Benham Bank, B = Bayog Seamount, PH=Peña Hill, VS=Vinogradov Seamount, MS=Molave Saddle, PT = Philippine Trench, LOFZ = Luzon-Okinawa Fracture Zone, and CBSC = Central Basin Spreading Center. 

 
Locations of bathymetry profiles A-A′, B-B′ and C-C′ are plotted on the map together with locations DSDP 31 Site 292 and seismic lines RC2006-61 and GC31.
 Profile C-C′ showing a cross-section of the circular outer ridge and basin floor at the NW corner of the crest. Vertical dashed grey lines mark where the profile line changes direction.

The Benham Rise rises from the 5,200-meter-deep (~17,000 ft) seafloor to about 2,500 meters (roughly 8,200 ft) beneath the sea surface, with a depression in its central portion that is likely a volcanic caldera.
When large volumes of magma are erupted over a short time, a volcano may collapse downward into the emptied or partially emptied magma chamber, leaving a massive depression at the surface, from one to dozens of kilometers in diameter.
The circular depression on the Benham Rise is surrounded by a crest with scarps as high as 100 to 300 meters (roughly 330 to 980 ft).

 Bathymetry map of Benham Bank, a guyot shallowing to ~50 m below sea level.
Its summit and flanks exhibit evidence of mass wasting such as scarps and embayments.
Contour interval at 100 m.

It may be the world's largest known caldera with a diameter of ~150 km (93 miles).
For comparison, the famous caldera of Yellowstone in Wyoming is only about 60 km (37 miles) wide.
The researchers named the caldera Apolaki, meaning “giant lord”, after the Filipino god of the sun and war.

Links : 

Wednesday, November 20, 2019

Climate change: speed limits for ships can have 'massive' benefits

Cornwall, UK, example of ships’ tracks.
Image courtesy of Edward Gryspeerdt, with data from NASA

From BBC

Cutting the speed of ships has huge benefits for humans, nature and the climate, according to a new report.

A 20% reduction would cut greenhouse gases but also curb pollutants that damage human health such as black carbon and nitrogen oxides.
This speed limit would cut underwater noise by 66% and reduce the chances of whale collisions by 78%.

UN negotiators will meet in London this week to consider proposals to curb maritime speeds.
Ships, of all sorts and sizes, transport around 80% of the world's goods by volume.
However, they are also responsible for a significant portion of global greenhouse emissions thanks to the burning of fuel.

Shipping generates roughly 3% of the global total of warming gases - that's roughly the same quantity as emitted by Germany.
While shipping wasn't covered by the Paris climate agreement, last year the industry agreed to cut emissions by 50% by 2050 compared to 2008 levels.


This new study, carried out for the campaign groups Seas at Risk and Transport & Environment, builds on existing research suggesting that slowing down ships is a good idea if you want to curb greenhouse gases.

The report though also considers a range of other impacts of a speed cut such as on air pollution and marine noise.

As ships travel more slowly they burn less fuel, which means there are also savings in black carbon, sulphur and nitrogen oxides.
The last two in particular have serious impacts on human health, particularly in cities and coastal areas close to shipping lanes.

The report found that cutting ship speed by 20% would cut sulphur and nitrogen oxides by around 24%.
There are also significant reductions in black carbon, the tiny dark particles contained in the smoke from ship exhausts.

Cutting black carbon helps limit climate warming in the Arctic: when ships burn fuel in the icy northern waters, the particles often fall on snow and reduce its ability to reflect back sunlight, accelerating regional heating.

The study also says that a 20% cut in speed would reduce noise pollution by two thirds - while the same speed limitation would reduce the chances of a ship colliding with a whale by 78%.
"It's a massive win, win, win, win," said John Maggs from Seas at Risk.
"We've got a win from a climate point of view, we've got a win from a human health point of view, we've got a win for marine nature, we've got a potential safety gain, and up to a certain point we are saving the shipping industry money.
"It is also of course by far the simplest of the regulatory options.  Thanks to satellites and transponders on commercial vessels it really is quite easy to track their movements and the speed they are travelling."

Proposals to reduce the speed of ships are among the ideas that will be considered at this week's meeting of the International Maritime Organisation (IMO) in London.

Experts believe that in the medium to long term, the industry will move to alternative fuels.
But there is considerable pressure, including from many countries and shipping companies, for effective short term measures to curb emissions.

One proposal from France would focus on oil tankers and bulk carriers but not container or cruise ships.
Denmark is proposing that the industry has a goal-based standard, where it is up to the individual shipping companies as to how they meet it.

Many shipping companies are in favour of slowing down.
"Slow steaming not only reduces the fuel costs but its application does not require time-consuming procedures as it can be implemented instantly, it requires no investment from ship owners, can be easily monitored and is the most efficient means of slashing CO2 emissions," said Ioanna Procopiou, a Greek shipping company owner.

But the idea is not supported by some of the biggest names in the trade.
"Maersk remains opposed to speed limits," said Simon Christopher Bergulf, who is Regulatory Affairs Director with the giant Danish shipping conglomerate.
"We rather support the principle of applying power limitation measures.
Focusing on power instead of speed limits will help deliver on the CO2 reduction targets set by the IMO, whilst rewarding the most efficient ships."

What gives campaigners hope is that shipping has already tried out the concept of going slow - back in 2008, during the global financial crisis, cargo ships slowed down to cut costs.
With average speeds dropping by 12%, this helped cut daily fuel consumption by 27%, which equated to a significant drop in emissions.
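Those 2008 numbers are roughly consistent with the "propeller law" rule of thumb used in naval engineering, under which the power a ship needs, and hence the fuel it burns per day, grows roughly with the cube of its speed. The short sketch below only illustrates that assumption (the cube law and the helper function are ours, not figures from the report), but it shows why modest speed cuts yield outsized savings.

# Back-of-the-envelope check of why "slow steaming" saves so much fuel.
# Assumption (not from the article): fuel burned per day at sea scales
# roughly with the cube of speed ("propeller law"); real ships deviate.

def daily_fuel_ratio(speed_cut: float) -> float:
    """Relative fuel burned per day after a fractional speed reduction."""
    return (1.0 - speed_cut) ** 3

for cut in (0.12, 0.20):
    print(f"{cut:.0%} speed cut -> ~{1 - daily_fuel_ratio(cut):.0%} less fuel per day")

# 12% cut -> ~32% less fuel per day (the 2008 slowdown achieved ~27%, in the
# same ballpark); 20% cut -> ~49% less per day under this idealisation.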

Campaigners believe that whatever decision the IMO eventually comes to will involve slower steaming.
"The short term measure, whatever it is, is going to reduce ship speed," said John Maggs.
"We think the best way to do this most effectively is with a direct speed limit, whether we get that or not is unknown, but ships will have to slow down in the future."

Links :

Tuesday, November 19, 2019

Mapping global wind power potential


Offshore wind to become a $1 trillion industry

From IEA


Offshore wind power will expand impressively over the next two decades, boosting efforts to decarbonise energy systems and reduce air pollution as it becomes a growing part of electricity supply, according to an International Energy Agency report published today.

Offshore Wind Outlook 2019 is the most comprehensive global study on the subject to date, combining the latest technology and market developments with a specially commissioned new geospatial analysis that maps out wind speed and quality along hundreds of thousands of kilometres of coastline around the world.
The report is an excerpt from the flagship World Energy Outlook 2019, which will be published in full on 13 November.

The IEA finds that global offshore wind capacity may increase 15-fold and attract around $1 trillion of cumulative investment by 2040.
This is driven by falling costs, supportive government policies and some remarkable technological progress, such as larger turbines and floating foundations.
That’s just the start – the IEA report finds that offshore wind technology has the potential to grow far more strongly with stepped-up support from policy makers.

 CGI of Eolfi wind farm with Naval Energies floating platforms
Photo: Naval Energies

Europe has pioneered offshore wind technology, and the region is positioned to be the powerhouse of its future development.
Today, offshore wind capacity in the European Union stands at almost 20 gigawatts.
Under current policy settings, that is set to rise to nearly 130 gigawatts by 2040.
However, if the European Union reaches its carbon-neutrality aims, offshore wind capacity would jump to around 180 gigawatts by 2040 and become the region’s largest single source of electricity.

An even more ambitious vision – in which policies drive a big increase in demand for clean hydrogen produced by offshore wind – could push European offshore wind capacity dramatically higher.

China is also set to play a major role in offshore wind’s long-term growth, driven by efforts to reduce air pollution.
The technology is particularly attractive in China because offshore wind farms can be built near the major population centres spread around the east and south of the country.
By around 2025, China is likely to have the largest offshore wind fleet of any country, overtaking the United Kingdom.
China’s offshore wind capacity is set to rise from 4 gigawatts today to 110 gigawatts by 2040.
Policies designed to meet global sustainable energy goals could push that even higher to above 170 gigawatts.

The United States has good offshore wind resources in the northeast of the country and near demand centres along the densely populated east coast, offering a way to help diversify the country’s power mix.
Floating foundations would expand the possibilities for harnessing wind resources off the west coast.

“In the past decade, two major areas of technological innovation have been game-changers in the energy system by substantially driving down costs: the shale revolution and the rise of solar PV,” said Dr Fatih Birol, the IEA’s Executive Director.
“And offshore wind has the potential to join their ranks in terms of steep cost reduction.”

 Offshore wind farm in Denmark (Horns Rev II) with the GeoGarage platform (DGA nautical chart)

Dr Birol launched this special report today in Copenhagen, Denmark – the birthplace of offshore wind – alongside the Danish Minister for Climate, Energy and Utilities, Dan Jørgensen.

The huge promise of offshore wind is underscored by the development of floating turbines that could be deployed further out at sea.
In theory, they could enable offshore wind to meet the entire electricity demand of several key electricity markets several times over, including Europe, the United States and Japan.

“Offshore wind currently provides just 0.3% of global power generation, but its potential is vast,” Dr Birol said.
“More and more of that potential is coming within reach, but much work remains to be done by governments and industry for it to become a mainstay of clean energy transitions.”

Governments and regulators can clear the path ahead for offshore wind’s development by providing the long-term vision that will encourage industry and investors to undertake the major investments required to develop offshore wind projects and link them to power grids on land.
That includes careful market design, ensuring low-cost financing and regulations that recognise that the development of onshore grid infrastructure is essential to the efficient integration of power production from offshore wind.

Industry needs to continue the rapid development of the technology so that wind turbines keep growing in size and power capacity, which in turn delivers the major performance and cost reductions that enable offshore wind to become more competitive with gas-fired power and onshore wind.

What’s more, huge business opportunities exist for oil and gas sector companies to draw on their offshore expertise.
An estimated 40% of the lifetime costs of an offshore wind project, including construction and maintenance, have significant synergies with the offshore oil and gas sector.
That translates into a market opportunity of USD 400 billion or more in Europe and China over the next two decades.

Links :

Monday, November 18, 2019

New Zealand (Linz) layer update in the GeoGarage platform

3 nautical raster charts updated

A naturalist figured out climate change in 1799. The world forgot him.

Horses rear in a shallow pond filled with electric eels in this illustration from "The Adventures of Alexander von Humboldt," written by Andrea Wulf and illustrated by Lillian Melcher

From CSMonitor by Anna Tarnow

Driven by a restless curiosity that resisted the confines of any one scientific discipline, Alexander von Humboldt offered the world a kaleidoscopic view of the wonders of nature.

Andrea Wulf and Lillian Melcher bring this “forgotten father of environmentalism” to life in a lush graphic novel.

Few people today remember Alexander von Humboldt, but the Prussian naturalist predicted climate change back in the early 19th century.
“He’s the forgotten father of environmentalism,” says historian Andrea Wulf.


During Humboldt’s travels through Venezuela in 1799, he noticed that farmers in the Aragua valley were deforesting the region to grow indigo.
As a result, the nearby lake was drying up.
Later, in a letter to President Thomas Jefferson dated June 1804, he wrote, “The wants and restless activity of large communities of men gradually despoil the face of the Earth.”

It was one of the first Western observations of human-caused climate change, according to Wulf.
Environmentalists and scientists like Charles Darwin, John Muir, and Henry David Thoreau were heavily influenced by his writings, which were widely read during his lifetime.
Wulf wanted to raise Humboldt’s profile for today’s readers.
So she wrote “The Adventures of Alexander von Humboldt,” a lush and meticulously illustrated history of his South American expedition.

Courtesy of Penguin Random House
“The Adventures of Alexander von Humboldt” written by Andrea Wulf and Lillian Melcher, Pantheon, 272 pp.

“I grew up in Germany, so we heard about [Humboldt] as an adventurer, or maybe a botanist,” Wulf says.
“But no one talked about him as the man who had predicted harmful, human-induced climate change.
So that became the thing that got me going.”
She’s also the author of “The Invention of Nature,” a 2015 New York Times bestselling nonfiction book that delves more deeply into Humboldt’s life and influence.

“The Adventures of Alexander von Humboldt” stands apart with its rich visual presentation: It’s filled with Humboldt’s own drawings, maps, and writings, all sourced from the Berlin State Library’s digitized collection of his journals.
Those are juxtaposed with a dizzying array of reproductions, including pressed botanical samples, landscape paintings, and photos.
And it’s all stitched together by the artwork of Lillian Melcher, a recent Parsons School of Design graduate.

Early adopter: Alexander von Humboldt on the Orinoco River in Venezuela.
Portrait by Friedrich Georg Weitsch, 1806
(creditline: Staatliche Museen zu Berlin, Nationalgalerie /
photographer: Karin März / montage: Raufeld Medien)

“[Humboldt] was one of the first people to make science popular and accessible, because he used infographics in all of his books,” says Melcher.
She believes strongly in following in his footsteps to increase scientific literacy.
“I think that combination of science and art is a better way to learn,” she says.

Wulf and Melcher collaborated to storyboard the book, but “Humboldt was our third collaborator,” Melcher says.

Each page represents weeks’ worth of research.
“Andrea and I are definitely the same kind of nerdy, where we just want accuracy.
We want to know all the little details,” Melcher says.

 Map of Rio Orinoco designed by Humboldt

For example, the scanned pages of Humboldt’s diary that appear as background images on most pages of the book actually correspond to the events taking place in the story.
When Humboldt’s boat capsized in the Orinoco River, his journals were stained with river water.
Wulf and Melcher used reproductions of those diary pages to collage an image of the river, and Melcher drew Humboldt jumping into the pages to rescue his belongings.

“He’s jumping through that watermark to rescue his diary, but it’s [also] the real watermark,” says Wulf.
“It’s this double sense and I just love it.”


Humboldt’s interests were so wide-ranging that he found it hard to settle into a specific discipline.
(That’s perhaps one of the reasons he fell into obscurity: as scientific thought progressed, narrower specializations took precedence.)
The structure of “The Adventures of Alexander von Humboldt” is a true reflection of his restless curiosity – always varied, sometimes digressive, it’s a kaleidoscopic view into the diverse, fascinating, and occasionally brutal landscape of the South America that he encountered.

Intrinsic to Humboldt’s writings were his critiques of imperialism and slavery, as well as of environmental degradation.
With startling prescience, he pointed out the economic, environmental, and human costs of slavery and silver mining in “Essay on the Kingdom of New Spain” and “Political Essay on the Island of Cuba.”
Both are introduced in Wulf’s and Melcher’s book.


Importantly, he allowed his passion for nature to influence and color his work.
“If I look at [today’s] climate change debate in the political arena ...
what I’m really missing is that no one dares to talk about the wonders of nature,” says Wulf.
“[Humboldt] says we need to feel nature.
We need to use our imagination to understand nature.
...
And this aspect of his work, I think, is what makes it incredibly relevant today.”

There’s no doubt Humboldt was intrepid.
He fearlessly placed himself in harm’s way to gather knowledge, even if that meant climbing active volcanoes, crawling into mines, and prodding electric eels.
When a ship he was on sailed into a hurricane with 40-foot waves, he sat down to calculate the exact angle at which the boat would capsize.
Death would be better experienced methodically, he reasoned.
The ship stayed afloat.

Links :

Sunday, November 17, 2019

NOAA seeks public comment on ending production of traditional paper nautical charts

NOAA cartographers review a traditional printed nautical chart

From NOAA

NOAA is initiating a five-year process to end all traditional paper nautical chart production and is seeking the public’s feedback via a Federal Register Notice published on November 15, 2019.
Chart users, companies that provide products and services based on NOAA raster and electronic navigational chart (NOAA ENC®) products, and other stakeholders can help shape the manner and timing in which the product sunsetting process will proceed.
Comments may be submitted through NOAA’s online ASSIST feedback tool.

 How nautical charts are created.

A long tradition in transition

For nearly 200 years, NOAA’s Office of Coast Survey has produced traditional paper nautical chart products.
Originally, this took the singular form of hard copy paper charts; today, there are several raster digital chart formats available to download or print through a NOAA certified agent.
Similar to the transition from road atlases to GPS navigation systems that we have witnessed in this digital era, we are also seeing the increased reliance on NOAA electronic navigational charts (ENC) as the primary navigational product and the decreased use of traditional raster chart products.
Since 2008, ENC sales have increased by 425%, while sales of paper charts have dropped by half.

The International Maritime Organization now mandates that all large commercial vessels on international voyages use ENCs.
In 2016, the U.S. Coast Guard started allowing regulated commercial vessels on domestic voyages to use ENCs in lieu of paper charts.
Recreational boaters are also increasingly using electronic chart displays.

NOAA is in the midst of a multi-year program to improve its ENC coverage by replacing over 1,200 irregularly shaped ENC cells, compiled in over 130 different scales, with a standard gridded layout of ENCs, compiled in just a dozen standard scales.
This will increase the number of ENC cells to about 9,000 and significantly improve the level of detail and consistency among ENCs.
More information about improvements being made is in Transforming the NOAA ENC®.

 Electronic navigational chart displayed on an Electronic Chart Display and Information System (ECDIS) on NOAA Ship Thomas Jefferson.

Another option for paper nautical charts

Ultimately, production will be shut down for all raster chart products and services associated with traditional NOAA paper nautical charts.

Cancellation of these products and services will start in mid to late 2020 and be completed by January 2025.
More detailed information regarding this transition is explained in the document Sunsetting Traditional NOAA Paper Charts: End of Paper and Raster Nautical Chart Production and Introduction of NOAA Custom Charts.

Over the next five years, NOAA will work to ease the transition to ENC-based products while continuing to support safe navigation.
NOAA will focus on improving data consistency and providing larger scale coverage of NOAA ENC, as well as providing access to paper chart products based on ENC data, either through the NOAA Custom Chart prototype or third-party commercial data providers.

The online NOAA Custom Chart (NCC) application enables users to create their own charts from the latest NOAA ENC data.
Users may define the scale and paper size of custom-made nautical charts centered on a position of their choosing.
NCC then creates a geospatially referenced Portable Document Format (GeoPDF) image of a nautical chart.
Chart notes and other marginalia are placed on a separate PDF page.
Users may then download, view, and print the output.
NCC is an easy way to create a paper or digital backup for electronic chart systems or other Global Positioning System (GPS) enabled chart displays.

 Traditional 1:100,000 scale NOAA paper nautical chart 16204.

 16204 raster chart in the GeoGarage platform (NOAA layer)
 A 1:80,000 scale NOAA Custom Chart of Port Clarence, Alaska.

 Port Clarence ENC US4AK81M

A comparison of NOAA Chart 16204 and the corresponding NOAA Custom Chart is shown below.
Although they look a bit different from traditional NOAA charts, NCCs show the latest data as compiled in the NOAA ENCs.
The prototype is in the early phases of development and many improvements are needed to make NCC a viable replacement for traditional paper nautical charts.
We hope you will try out the NOAA Custom Chart prototype and tell us what you think through NOAA’s online ASSIST feedback tool.
Historical editions of nautical charts, dating back to the mid-1800s and suitable for framing, may also be downloaded for free from the Coast Survey Historical Map & Chart Collection website.

Links :

Three-masted Belem

Manon embarks for the first time on board the three-masted Belem,
an unprecedented 8-day experience at sea between Denmark and Sweden.
We're taking you on board for a six-episode series, starting Friday, December 6, 2019.

Saturday, November 16, 2019

Norway (NHS) layer update in the GeoGarage platform

120 nautical raster charts updated & 1 new inset added

Basque Country and the sea

This promotional film (fiction) highlights the men and women who make fishing a unique profession every day.
A profession pursued with passion and respect for its environment and resources, told here through a beautiful story of family, heritage and know-how, intertwined with another passion: surfing.
This short film features a heroine, both surfer and fisherwoman, a worthy representative of the new generation of fishers.
These are entrepreneurs who are first and foremost lovers of the sea and well aware of their responsibilities, young people and women who always want to go further in reinventing a fishery that is more respectful of the ocean and of us all.
The heroine has been fascinated by the sea since she was a child.
She likes to think that by taking over from her father at the head of the family fishing company, she will do even more for the sea and for the next generations, because "the future of the land is also at stake at sea".
with Lee-Ann Curren, filmed in the Basque Country


Vizcaya: construction of a three-master for the Basque Country
The association "Trois-mâts basque" wishes to rebuild a tall ship, "Bizkaia", to highlight the Basque maritime heritage and allow as many people as possible to sail on these witnesses of the past.
The four phases of the overall project are:
1) The reconstruction of Alba, a sardine boat emblematic of the port of Saint-Jean-de-Luz - Ciboure, which will take visitors to Socoa.
2) The reconstruction of Bizkaia, a three-masted schooner that will be the focus of a flagship project, as the Hermione was in Rochefort and the San Juan still is in Pasajes.
3) The creation of an immersive exhibition space designed to highlight the Basque maritime heritage: the history of seafarers' families, holograms, and the participation of local associations.
4) Sailing for as many people as possible aboard Alba (in the bay) and Bizkaia: day trips, sail-handling courses, longer voyages...

Friday, November 15, 2019

A sharper view of the world’s oceans

A high-resolution ocean model reveals surface current flows off the coast of South Africa.
Credit: NASA, Goddard space flight center scientific visualization studio

From Scientific American by Conor Purcell

Models of the behaviour of the oceans with higher spatial resolution could lead to more accurate climate predictions.

When news broke in March 2014 that Malaysian Airlines flight MH370 had gone missing, Jonathan Durgadoo watched in shock with the rest of us.
The Boeing 777-200ER had departed from Kuala Lumpur for Beijing, before unexpectedly turning west about halfway between Malaysia and Vietnam.
An hour-and-a-half after take-off, the plane disappeared from radar over the Andaman Sea, southwest of Thailand.
There were 239 people on board.

Some 16 months later, a piece of debris from the aircraft—a hinged flap known as a flaperon that had broken away from the wing—was found on Réunion Island in the western Indian Ocean.
Durgadoo, an ocean modeller at the GEOMAR Helmholtz Centre for Ocean Research Kiel, Germany, quickly realised he was able to contribute to the search effort.
If his team could describe the movement of water in the Indian Ocean from the time of the aircraft’s disappearance to when the debris was washed up on Réunion Island, they might be able to track its journey and lead investigators to the crash site.

“From an oceanographic perspective, the question was straightforward yet difficult to answer,” Durgadoo says.
“Could we track the flaperon back in time to establish where the plane had crashed? And if so, would that position coincide with the priority search area?”

To track the flaperon, Durgadoo’s team would need a data set of currents in the Indian Ocean during that 16-month period that had no gaps in space or time.
Observational data over such a wide area and long period of time were not available; instead, the team decided to use a high-resolution model of the ocean.

Ocean models describe the movement of water using the mathematical equations of fluid motion, calculated on powerful supercomputers.
The models divide the ocean into a grid of 3D volumetric elements, and calculate the movement of the water between cubes for each time step the researchers choose.
Durgadoo’s team opted for a model with a horizontal grid length of one-twelfth of a degree—about 9 kilometres in every direction.
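As a quick sanity check on that figure (a minimal sketch using the standard spherical-Earth approximation, not code from the study): one degree of latitude spans about 111 km, so a 1/12-degree cell is roughly 9 km on a side, with the east-west spacing shrinking towards the poles.

import math

# Approximate size of a 1/12-degree ocean-model grid cell on a spherical
# Earth (mean radius ~6371 km). Illustrative only, not the model's grid code.
EARTH_RADIUS_KM = 6371.0
DEG = math.pi / 180.0

def cell_size_km(resolution_deg, latitude_deg):
    """Return (north-south, east-west) extent of one grid cell in km."""
    ns = EARTH_RADIUS_KM * resolution_deg * DEG
    ew = ns * math.cos(latitude_deg * DEG)
    return ns, ew

ns, ew = cell_size_km(1 / 12, latitude_deg=20.0)  # a mid-Indian-Ocean latitude
print(f"~{ns:.1f} km north-south, ~{ew:.1f} km east-west")  # ~9.3 km x ~8.7 km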

“The ocean model provided us with consistent data for the entire time period, and all over the Indian Ocean,” says Durgadoo.
They then used the data to simulate virtual objects back in time from July 2015 to the aircraft’s disappearance in March 2014.
In total, Durgadoo and his team launched almost five million virtual flaperons and tracked their likely journey back from the island.
The simulation resulted in almost five million possible trajectories, but by making certain assumptions, such as that the plane could have travelled a maximum of 500 km from its last known position, the researchers were able to whittle their results down to around 800,000 possible starting points.
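Conceptually, tracking a virtual flaperon backwards in time means advecting a particle with the model's surface currents while stepping the clock in reverse. The fragment below is a heavily simplified sketch of that idea; the current-lookup function, time step and starting point are illustrative assumptions, whereas the actual study used the full model velocity fields, millions of particles, and additional drift and uncertainty terms.

import numpy as np

# Minimal Lagrangian back-tracking sketch: step one particle backwards in
# time through a placeholder surface-current field. Illustrative only.

def surface_current(lon, lat, t):
    """Placeholder (u, v) surface velocity in degrees/day at (lon, lat, t).
    A real run would interpolate the ocean model's stored velocity fields."""
    return np.array([0.05 * np.sin(np.radians(lat)), -0.02])

def backtrack(lon, lat, t_end_days, dt_days=0.25):
    """Integrate a particle's position backwards from t_end_days to t=0."""
    pos = np.array([lon, lat], dtype=float)
    t = t_end_days
    trajectory = [pos.copy()]
    while t > 0:
        pos -= surface_current(pos[0], pos[1], t) * dt_days  # minus sign: time runs backwards
        t -= dt_days
        trajectory.append(pos.copy())
    return np.array(trajectory)

# Start near Réunion Island (~55.5E, 21.1S) and integrate ~490 days backwards.
path = backtrack(lon=55.5, lat=-21.1, t_end_days=490)
print(len(path), "positions; earliest estimated position:", path[-1])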

These points still covered thousands of square kilometres of the southeast Indian Ocean, but this was largely a different area to where the search teams were looking.
According to the models, the probability that the flaperon started its journey in the patch of ocean off southwestern Australia where crash investigators were searching was less than 1.3%.
The more likely origin of the flaperon—and therefore the crash site, assuming it broke off on impact with the water, as is generally accepted—was further north, says Durgadoo.

Durgadoo’s work to track a two-metre-long flaperon across the ocean would not have been possible without considerable advances in the spatial resolution of ocean models.
Increases in computing power over the past few decades have spurred the development of models that use tighter grids and can therefore capture the movement of the ocean at the mesoscale, on the order of 100 km or less.
At this scale, swirling, circulating currents of water can be modelled.
Ocean models with high-enough resolution to represent these eddies can account for parameters such as volumetric flow rate, temperature and salinity, and can therefore reproduce more realistic ocean behaviour than models with lower resolution.

The emergence of high-resolution ocean models raises questions about using models with a coarser resolution, particularly for climate projections into future decades and beyond.
Every climate projection is a result of simulations that use models developed by various research centres around the world.
These models seek to incorporate and couple each component of the Earth system, from the cryosphere (the planet’s ice-covered regions) and biosphere to the atmosphere and ocean.
Because the accuracy of climate projections depends on how well each component of the model represents reality, incorporating ocean models with a higher spatial resolution into climate simulations should provide a better picture of how the climate is likely to change in the coming decades (see Q&A).
But modelling at higher resolutions carries a cost and might not be the only way to improve simulations of the ocean.

Current simulations

“Climate models don’t correctly simulate many aspects of the global ocean,” says Lisa Beal, a physical oceanographer at the University of Miami in Florida.
In particular, she is interested in western boundary current systems, which are deep, narrow, fast-flowing currents on the western side of ocean basins.
These currents carry huge amounts of heat from the tropics to the poles and have a large impact on global climate.
But so far, she says, they have not been correctly simulated in the models that underpin climate projections reported by the Intergovernmental Panel on Climate Change (IPCC).

The panel’s most recent report generally used ocean models with a resolution of around 1 degree.
This is because the models must simulate the ocean, atmosphere, land and ice as coupled systems feeding back on one another, and must simulate change over a period of 200 years.
Both of these objectives are computationally expensive, even at a resolution of 1 degree.
But at this resolution, the entire spatial scale of a western boundary current is covered by a single data point.

“This is our frontier,” says Beal.
“We need to be able to resolve crucial ocean features such as eddies and western boundary current systems in the global climate models that are used to predict the climate of the twenty-first century.”

Resolving a current problem

Whereas a typical climate-prediction model used by the Intergovernmental Panel on Climate Change shows the Agulhas Current off South Africa’s coast flowing freely into the South Atlantic (top), a simulation with ten-times-higher resolution (bottom) reveals swirling currents that more closely match real-world observations.

Credit: Ben Kirtman University of Miami-RSMAS

In 2016, Beal and her colleagues resolved the Agulhas western boundary current system, which lies off the coast of South Africa, for the first time in a climate model.
They used a coupled climate model with a global ocean resolution of one-tenth of a degree.
They then looked at what kind of ocean behaviours changed compared with lower-resolution models, and observed the effect on the heat and salt content of the South Atlantic.
They also calculated the water transport rate into the South Atlantic by so-called Agulhas leakage.
In both cases, the simulation matched real-world observations more closely than did models at lower resolution (see ‘Resolving a current problem’).

The next IPCC report, to be published in 2022, will use some ocean models with a resolution of one-quarter of a degree.
Beal thinks that using higher-resolution ocean models in global climate simulations is likely to change the representation of the way heat is transported from the equator to the poles.
Today’s climate models simulate western boundary current systems as being broader and slower than they really are.
As a result, they might underestimate the efficiency with which heat is carried from the equator towards the poles in the real ocean—a faster current boundary system loses less heat to the atmosphere and so delivers more to the poles.
Such an error in the way the model exchanges heat between the ocean and the atmosphere could result in an inaccurate simulation of global climate.

Levke Caesar: a climate of uncertainty

In 2018, Levke Caesar, a climate scientist at Maynooth University in Ireland, used a high-resolution model and observations of ocean circulation to show that the Atlantic meridional overturning circulation (AMOC)—a complex system of ocean currents that moves heat between the tropics and the North Atlantic—is losing strength.
This weakening could considerably change the climate of the Northern Hemisphere.
At the 2019 Lindau Nobel Laureate Meeting, Caesar spoke to Nature about the role of ocean models in climate science.

Why do climate scientists need to model the ocean?

Ocean circulations move carbon and heat into the deep ocean and transport heat around the globe—up to 1.3 petawatts from the tropics to the poles in the case of the AMOC.
That is as much power as about 1 million average-sized nuclear reactors.
So climate models improve a lot when you include the ocean.
And the longer the timescales you look at, the more important the ocean becomes.

How accurate are ocean models now?

There is a saying that all models are wrong, but some are useful, and it’s true—every model is an approximation.
The observed strength of the AMOC is around 16 million cubic metres per second.
But some climate models put it at nearly 30 million cubic metres per second.
So they’re definitely not perfect, but they can still help us to understand the mechanisms that drive ocean circulation, and how changes in the ocean feed back into the atmosphere.

How can you verify a model?

One reason why models of ocean currents diverge so much is that we are not completely sure what’s true.
But there are more observational systems being set up that we can use to validate our models with real data.
An array of moorings that spans the Atlantic at a latitude of 26 degrees north, from Morocco to Florida, has shown us how the AMOC has behaved in real life since 2004.
Observational work is really important in putting together a broad picture of the ocean.

What makes the AMOC so hard to model?

One problem is that it is difficult to model processes that take place at small scales.
At the surface you can get spatial resolution down to just a few metres, but that increases with depth.
Our high-resolution model has a horizontal scale of about 10 kilometres.
So there will always be processes that have to be represented by coding in the mathematical description of their physics.

Is uncertainty in models a problem?

Yes, definitely.
Some things that are difficult to implement in models could have a big impact on our predictions.
Most models suggest that the AMOC will continue to slow, but will not collapse this century.
But those models don’t account for fresh water coming from the Greenland Ice Sheet.
The omission of that destabilizing force probably makes the AMOC look more stable than it actually is.
Uncertainty in results also makes it harder to convince people about what is happening.
When scientists communicate, we try to be as honest as possible, and that means including all the uncertainties.
The problem is, when non-scientists claim that climate change is not that bad, they don’t say “but we could be wrong”—they say it convincingly.

What can be done about that?

We should probably emphasize the costs more.
If the negative effects of something happening are great, then no matter what the percentage risk is, you should take action to prevent it.
Some climate models suggest that as the AMOC weakens, storm tracks could become more prominent going towards the United Kingdom.
We’re not certain about that, but there’s an indication.
 Do we really want to test it?

Mixing it up

Even when resolution is not high enough to model small-scale processes, steps can be taken to represent them by alternative means.
For mesoscale ocean processes, this can be done by parameterization—a method by which ocean processes are represented by coding in the mathematical description of their physics.
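To make the idea concrete, the toy fragment below contrasts a constant sub-grid mixing coefficient with one that varies in space, in the spirit of the internal-wave-driven mixing work described next. The functional form, decay scale and numbers are invented for illustration and are not the CLIVAR scheme itself.

import numpy as np

# Toy contrast between a constant mixing parameterization and a spatially
# varying one. Coefficients and the 500 m decay scale are made up; real
# schemes are tied to maps of internal-wave energy and seafloor roughness.

def mixing_constant(depth_m):
    """Old-style choice: one background diffusivity everywhere (m^2/s)."""
    return np.full_like(depth_m, 1e-5, dtype=float)

def mixing_variable(depth_m, rough_seafloor=True):
    """Diffusivity enhanced near a rough seafloor, decaying upwards (m^2/s)."""
    background = 1e-5
    bottom_boost = 1e-3 if rough_seafloor else 1e-4
    height_above_bottom = depth_m.max() - depth_m
    return background + bottom_boost * np.exp(-height_above_bottom / 500.0)

depths = np.linspace(0, 4000, 5)  # surface (0 m) down to the seafloor (4000 m)
print(mixing_constant(depths))
print(mixing_variable(depths))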

Jennifer MacKinnon, a physical oceanographer at the Scripps Institution of Oceanography in San Diego, California, studies internal waves that oscillate deep in the ocean.
These waves have an important effect on mesoscale turbulent mixing processes in the ocean, which are known to affect the way the entire ocean works, and therefore influence the global climate.

“Because ocean models have a certain resolution which tend to be many kilometres, or tens of kilometres, in scale,” she says, “they cannot resolve and simulate many of the processes in the ocean.” Even for higher-resolution models, these ‘sub-grid-scale processes’ might still be too small to be explicitly resolved.

In 2017, MacKinnon co-authored a paper on internal wave-driven ocean mixing that was the culmination of a five-year study by the Climate and Ocean: Variability, Predictability and Change (CLIVAR) project.
“Models had previously set the mixing rate as a constant, or at least something that was not spatially variable,” she says.
But MacKinnon had seen that this was not really the case.
“Our observations showed us something that the models are not yet incorporating,” she explains.
The researchers tweaked the models to represent those turbulent mixing processes, and then looked at how the ocean models behaved differently over timescales of decades compared with those that took ocean mixing to be a constant.

The results showed that the deep-water mixing parameterization had a significant effect on the ocean’s overturning circulation, which in turn affects the atmosphere and therefore global climate.
So, to have a realistic climate projection, MacKinnon argues, models that are too coarse to include these internal mixing processes should at least include a parameterization to represent them.

Beyond resolution

Despite the benefits that higher resolution and parameterization offer climate modellers, it is not always feasible to use them.
The time required to compute the simulations grows with increasing resolution, as does the quantity of data generated.
“This problem just grows and grows the longer you want your simulations to be,” Durgadoo says.

The temporary nature of a research workforce comprising graduate students and postdocs means there is not time to run ocean models at ever-higher resolution.
And crucially, it is not clear that continuously ramping up model resolution will always bring greater benefits.
At very high resolutions, Beal says, the performance of the models could become unstable.

Researchers also need to think about other factors besides resolution.
One study showed that coupling the ocean with the atmosphere gives a more realistic simulation of the Gulf Stream than that achieved by simply increasing the model’s resolution.
“If you keep turning up the resolution, there comes a point where you can’t really improve,” Beal says.
Durgadoo agrees.
“Resolution is definitely a limiting factor, but only up to a point,” he says.
The simulations that he and his colleagues performed to trace the missing Malaysian Airlines flight, for example, had a high resolution, but their study had many other limitations.
“It’s not only a problem of model resolution across the surface of the ocean and through time—there are other unknowns,” he says.
For example, researchers have a limited understanding of the physics of fluid mechanics.
It does not matter how high a model’s spatial resolution is if the underlying physics is lacking in detail.
The only way to overcome this problem is by further observational research.

The scientists all agree that better models require collaboration between those who observe the ocean and those who attempt to simulate it.
But that interdisciplinary communication can be lacking.
Beal and MacKinnon are physical oceanographers who lead ocean cruises to deploy measuring devices into the abyssal depths, whereas ocean modellers such as Durgadoo are almost always office-based and often work at different institutes.
Without effort, they might never meet.

Beal says that programmes such as CLIVAR and the Global Ocean Observing System (GOOS) are extremely useful for bringing researchers together, and MacKinnon’s climate process team is an example of a positive outcome from that process.
By grouping observational scientists and modellers together, MacKinnon says, the community can improve its understanding of the physical ocean and refine the performance of the models.

As models improve, so too might confidence in the conclusions that can be drawn from them.
Such a boost might have benefited Durgadoo’s team in the search for MH370.
Although they recognized the limitations of their study, they contacted the search authorities in 2015 with the finding that they were probably looking in the wrong place.
The authorities acknowledged receipt of their correspondence, but there was no discussion or action around shifting the search site.
“More recently, we’ve conducted further research on the matter, but decided not to send it to the authorities,” says Durgadoo.
The current focus of their work is improving the method, he explains.

For now, the disappearance and whereabouts of the aircraft remain a mystery.

Links :

Thursday, November 14, 2019

ENC catalogue in Google Earth

17654 ENC cells from 68 HOs (date 13/11/2019 week 46)
download kmz file (weekly updated) : ENC world coverage in Google Earth

Electric car future may depend on deep sea mining

Apollo II is a prototype deep sea mining machine being tested off the coast of Malaga

From BBC by David Shukman

The future of electric cars may depend on mining critically important metals on the ocean floor.

That's the view of the engineer leading a major European investigation into new sources of key elements.
Demand is soaring for the metal cobalt - an essential ingredient in batteries and abundant in rocks on the seabed.

Laurens de Jonge, who's running the EU project, says the transition to electric cars means "we need those resources".
He was speaking during a unique set of underwater experiments designed to assess the impact of extracting rocks from the ocean floor.

In calm waters 15km off the coast of Malaga in southern Spain, a prototype mining machine was lowered to the seabed and 'driven' by remote control.
Cameras attached to the Apollo II machine recorded its progress and, crucially, monitored how the aluminium tracks stirred up clouds of sand and silt as they advanced.


An array of instruments was positioned nearby to measure how far these clouds were carried on the currents - the risk of seabed mining smothering marine life over a wide area is one of the biggest concerns.

What is 'deep sea mining'?

It's hard to visualise, but imagine opencast mining taking place at the bottom of the ocean, where huge remote-controlled machines would excavate rocks from the seabed and pump them up to the surface.

The vessel used for the underwater research off Spain, the Sarmiento de Gamboa, is operated by CSIC, the Spanish National Research Council.

The concept has been talked about for decades, but until now it's been thought too difficult to operate in the high-pressure, pitch-black conditions as much as 5km deep.

Now the technology is advancing to the point where dozens of government and private ventures are weighing up the potential for mines on the ocean floor.


Why would anyone bother?

The short answer: demand.
The rocks of the seabed are far richer in valuable metals than those on land and there's a growing clamour to get at them.

Billions of potato-sized rocks known as "nodules" litter the abyssal plains of the Pacific and other oceans and many are brimming with cobalt, suddenly highly sought after as the boom in the production of batteries gathers pace.

At the moment, most of the world's cobalt is mined in the Democratic Republic of Congo where for years there've been allegations of child labour, environmental damage and widespread corruption.

Current technology for electric car batteries requires cobalt, thought to be abundant on the sea floor
The demand for metals—such as lithium, cobalt and nickel—means that there is only one way to feed an electrifying world: higher prices. 

Expanding production there is not straightforward, which is leading mining companies to weigh the potential advantages of cobalt on the seabed.

Laurens de Jonge, who's in charge of the EU project, known as Blue Nodules, said: "It's not difficult to access - you don't have to go deep into tropical forests or deep into mines.
"It's readily available on the seafloor, it's almost like potato harvesting only 5km deep in the ocean."

And he says society faces a choice: there may in future be alternative ways of making batteries for electric cars - and some manufacturers are exploring them - but current technology requires cobalt.

Laurens de Jonge likens the process to "potato harvesting" 5km down in the ocean

"If you want to make a fast change, you need cobalt quick and you need a lot of it - if you want to make a lot of batteries you need the resources to do that."
His view is backed by a group of leading scientists at London's Natural History Museum and other institutions.

They recently calculated that meeting the UK's targets for electric cars by 2050 would require nearly twice the world's current output of cobalt.

So what are the risks?

No one can be entirely sure, which makes the research off Spain highly relevant.
It's widely accepted that whatever is in the path of the mining machines will be destroyed - there's no argument about that.
But what's uncertain is how far the damage will reach, in particular the size of the plumes of silt and sand churned up and the distance they will travel, potentially endangering marine life far beyond the mining site.
The chief scientist on board, Henko de Stigter of the Dutch marine research institute NIOZ, points out that life in the deep Pacific - where mining is likely to start first - has adapted to the usually "crystal clear conditions".


So for any organisms feeding by filter, waters that are suddenly filled with stirred-up sediment would be threatening.
"Many species are unknown or not described, and let alone do we know how they will respond to this activity - we can only estimate."
And Dr de Stigter warned of the danger of doing to the oceans what humanity has done to the land.
"With every new human activity it's often difficult to foresee all the consequences of that in the long term.
"What is new here is that we are entering an environment that is almost completely untouched."
Could deep sea mining be made less damaging?

Ralf Langeler thinks so.
He's the engineer in charge of the Apollo II mining machine and he believes the design will minimise any impacts.
Like Laurens de Jonge, he works for the Dutch marine engineering giant Royal IHC and he says his technology can help reduce the environmental effects.

The machine is meant to cut a very shallow slice into the top 6-10cm of the seabed, lifting the nodules.
Its tracks are made with lightweight aluminium to avoid sinking too far into the surface.

David Shukman (R) talks to Ralf Langeler, the engineer in charge of the Apollo II mining machine

Silt and sand stirred up by the extraction process should then be channelled into special vents at the rear of the machine and released in a narrow stream, to try to avoid the plume spreading too far.
"We'll always change the environment, that's for sure," Ralf says, "but that's the same with onshore mining and our purpose is to minimise the impact."

I ask him if deep sea mining is now a realistic prospect.
"One day it's going to happen, especially with the rising demand for special metals - and they're there on the sea floor."

Who decides if it goes ahead?

Mining in territorial waters can be approved by an individual government.
That happened a decade ago when Papua New Guinea gave the go-ahead to a Canadian company, Nautilus Minerals, to mine gold and copper from hydrothermal vents in the Bismarck Sea.
Since then the project has been repeatedly delayed as the company ran short of funds and the prime minister of PNG called for a 10-year moratorium on deep sea mining.

A Nautilus Minerals representative has told me that the company is being restructured and that they remain hopeful of starting to mine.
Meanwhile, nearly 30 other ventures are eyeing areas of ocean floor beyond national waters, and these are regulated by a UN body, the International Seabed Authority (ISA).
It has issued licences for exploration and is due next year to publish the rules that would govern future mining.

The EU's Blue Nodules project involves a host of different institutions and countries.

Links :

Wednesday, November 13, 2019

These maps show how many people will lose their homes to rising seas—and it’s worse than we thought

Based on new data available through CoastalDEM
 Data comparing CoastalDEM to SRTM 30 m models

From Popular Science by Sara Chodosh

New elevation data triples the number of people at risk.

When you hear how many people are living on land that might be underwater by 2100, you might wonder how we know exactly how high the sea level will be so far into the future.
That kind of modeling is incredibly complex, and involves countless calculations and assumptions that influence the outcome.
But you probably don’t wonder how we know the elevation of the land.
In many parts of the world a quick glance at Google Maps can tell you how many feet above sea level you are at any given time.

But like anything we measure, our estimations of elevation are inherently error-prone.
When you’re measuring how high a mountain is, being off by two meters (that’s 6.56 feet) is not a huge deal.
But rising seas can make the same margin of error deadly for coastal areas.
And that’s exactly what’s happening.

When researchers at Climate Central used a new method called CoastalDEM to estimate the elevations of the world’s coastal areas, the number of people vulnerable to sea level rise nearly tripled previous calculations.
The new projection suggests up to 630 million people live in places that could be underwater by 2100, with more than half of those slipping under the rising seas by 2050.
They published their findings in Nature Communications.

It may be hard to believe our elevation data could be so far off, but consider how we get it.
NASA’s SRTM model calculates the elevation of upper surfaces, not the actual earth itself, which means it’s especially inaccurate anywhere that’s either densely vegetated or populated, since there are physical objects protruding from the ground in both cases.

In places like the U.S. and Australia, high-quality LIDAR data enables us to see the actual elevations, but that data simply didn’t exist for most of the world.

CoastalDEM is essentially a neural network that’s trained by looking at the differences in the LIDAR elevations versus the SRTM elevations in the U.S.
Across the country, SRTM is off by an average of 3.7 meters (12.1 feet), but peaks nearly a meter higher in coastal cities.
By analyzing the patterns in these discrepancies, the CoastalDEM model can reduce the error in elevation data down to less than 0.1 meters (0.3 feet).
The researchers then used a sea-level projection in line with IPCC findings to estimate how many people might be living on land that will be underwater in the near future.
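In spirit, CoastalDEM is a supervised error-correction model: wherever both datasets exist, the gap between SRTM and LIDAR elevations becomes the training target, and the fitted model is then used to correct SRTM where no LIDAR is available. Below is a minimal sketch of that workflow with synthetic data and invented feature names; the real model is a neural network trained on many more inputs (vegetation, population density, slope and so on).

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy version of the CoastalDEM idea: learn SRTM's elevation error from areas
# with LIDAR "ground truth", then subtract the predicted error elsewhere.
# The synthetic data and feature choices below are illustrative assumptions.
rng = np.random.default_rng(0)
n = 5000
srtm_elev = rng.uniform(0, 20, n)        # SRTM surface elevation (m)
veg_density = rng.uniform(0, 1, n)       # proxy for vegetation cover
building_density = rng.uniform(0, 1, n)  # proxy for built-up areas

# Synthetic bias: SRTM reads high where vegetation and buildings are dense.
srtm_error = 3.0 * veg_density + 2.0 * building_density + rng.normal(0, 0.3, n)
lidar_elev = srtm_elev - srtm_error      # pretend LIDAR "bare earth" elevation

X = np.column_stack([srtm_elev, veg_density, building_density])
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, srtm_elev - lidar_elev)     # target: SRTM minus true elevation

corrected = srtm_elev - model.predict(X)
print(f"mean abs error before: {np.abs(srtm_error).mean():.2f} m, "
      f"after: {np.abs(corrected - lidar_elev).mean():.2f} m")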

 CoastalDEM (first image) vs SRTM (image below)
In Bangkok, Thailand, CoastalDEM reveals significant increases in areas below projected average annual flood heights in 2050.
*Maps do not factor in potential coastal defenses, such as seawalls or levees, and are based on elevation, rather than flood models.
Emissions pathway: moderate emissions cuts (RCP 4.5) roughly consistent with the Paris climate agreement’s two-degree celsius target.
Sea level rise model: Kopp et al. 2014, median climate sensitivity.
source : Climate Central's Coastal Risk Screening Tool interactive map

These new estimates don’t affect everyone equally, though.
As the map above shows, the vast majority of affected people will be in Asia.
The authors calculate that more than 70 percent of people living on threatened land are in just eight countries: China, Bangladesh, India, Vietnam, Indonesia, Thailand, the Philippines, and Japan.

The change in risk based on this new research isn’t spread evenly around the world, either.
Egypt and Cote d’Ivoire saw 428- and 708-fold increases, respectively—so high we had to adjust the graph below so as not to throw the entire scale out of whack.
The next highest—Liberia—still clocked in above a 75-fold increase.
Most of the countries poised to lose the greatest percentages of their land are island nations, 13 of which are still developing states.

Despite these massive upticks in the at-risk population, the authors note this could still be an underestimation.
CoastalDEM isn’t perfect, especially in dense cities, so even more people could be in danger.
Plus, the estimates are based on current populations, and the global count is likely only going up.
If we continue living in coastal areas, we’ll likely see even more people at risk of losing their homes to climate change.

Another comparison, between Airbus WorldDEM 12 m DSM results and SRTM 30 m results.
SRTM shows completely different results.

Links :