Saturday, November 16, 2019

Norway (NHS) layer update in the GeoGarage platform

120 nautical raster charts updated & 1 new inset added

Basque Country and the sea

This promotional film (fiction) highlights the men and women who make fishing a unique profession every day.
A profession pursued with passion, respectful of its environment and its resources, told here through a beautiful story of family, transmission and know-how that also goes hand in hand with another passion: surfing.
This short film features a heroine, both surfer and fisherwoman, a worthy representative of the new generation of fishermen.
These are entrepreneurs who are first and foremost lovers of the sea and well aware of their responsibilities, young people and women who want to go ever further in reinventing a new fishery, one more respectful of the ocean and of us all.
The heroine has been fascinated by the sea since she was a child.
She likes to think that by taking over from her father at the head of the family fishing company, she will do even more for the sea, for the next generations, because "the future of the land is also at stake at sea".
with Lee-Ann Curren, filmed in the Basque Country
see also on Vimeo


Vizcaya: construction of a three-masted ship for the Basque Country
The association "Trois-mâts basque" wishes to rebuild a tall ship, "Bizkaia", to highlight the Basque maritime heritage and allow as many people as possible to sail on these witnesses of the past.
The four phases of the overall project are:
1) The reconstruction of Alba, a sardine boat emblematic of the port of Saint-Jean-de-Luz - Ciboure, which will take visitors to Socoa.
2) The reconstruction of Bizkaia, a tall ship (a three-masted schooner) which will be the subject of a flagship project, as the Hermione was in Rochefort and as the San Juan still is in Pasajes.
3) The creation of a scenographic space designed to highlight the Basque maritime heritage: the history of seafarers' families, holograms, and the participation of local associations.
4) Sailing for as many people as possible aboard Alba (in the bay) and Bizkaia: day trips, training courses in manoeuvring, longer trips...

Friday, November 15, 2019

A sharper view of the world’s oceans

A high-resolution ocean model reveals surface current flows off the coast of South Africa.
Credit: NASA, Goddard Space Flight Center Scientific Visualization Studio

From Scientific American by Conor Purcell

Models of the behaviour of the oceans with higher spatial resolution could lead to more accurate climate predictions.

When news broke in March 2014 that Malaysia Airlines flight MH370 had gone missing, Jonathan Durgadoo watched in shock with the rest of us.
The Boeing 777-200ER had departed from Kuala Lumpur for Beijing, before unexpectedly turning west about halfway between Malaysia and Vietnam.
An hour-and-a-half after take-off, the plane disappeared from radar over the Andaman Sea, southwest of Thailand.
There were 239 people on board.

Some 16 months later, a piece of debris from the aircraft—a hinged flap known as a flaperon that had broken away from the wing—was found on Réunion Island in the western Indian Ocean.
Durgadoo, an ocean modeller at the GEOMAR Helmholtz Centre for Ocean Research Kiel, Germany, quickly realised he was able to contribute to the search effort.
If his team could describe the movement of water in the Indian Ocean from the time of the aircraft’s disappearance to when the debris was washed up on Réunion Island, they might be able to track its journey and lead investigators to the crash site.

“From an oceanographic perspective, the question was straightforward yet difficult to answer,” Durgadoo says.
“Could we track the flaperon back in time to establish where the plane had crashed? And if so, would that position coincide with the priority search area?”

To track the flaperon, Durgadoo’s team would need a data set of currents in the Indian Ocean during that 16-month period that had no gaps in space or time.
Observational data over such a wide area and long period of time were not available; instead, the team decided to use a high-resolution model of the ocean.

Ocean models describe the movement of water using the mathematical equations of fluid motion, calculated on powerful supercomputers.
The models divide the ocean into a grid of 3D volumetric elements, and calculate the movement of the water between cubes for each time step the researchers choose.
Durgadoo’s team opted for a model with a horizontal grid length of one-twelfth of a degree—about 9 kilometres in every direction.
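To get a feel for what a one-twelfth-degree grid means on the ground, the short back-of-the-envelope sketch below converts an angular resolution into kilometres. The 111.3 km-per-degree constant and the 20-degree example latitude are illustrative assumptions, not figures from the study.

```python
import math

def grid_spacing_km(resolution_deg, latitude_deg):
    """Approximate horizontal spacing of an ocean-model grid cell.

    One degree of latitude is roughly 111.3 km everywhere; one degree of
    longitude shrinks with the cosine of latitude.
    """
    km_per_deg_lat = 111.3
    dy = resolution_deg * km_per_deg_lat
    dx = resolution_deg * km_per_deg_lat * math.cos(math.radians(latitude_deg))
    return dx, dy

# A 1/12-degree grid at an example tropical latitude:
dx, dy = grid_spacing_km(1 / 12, latitude_deg=20)
print(f"~{dx:.1f} km (east-west) x {dy:.1f} km (north-south)")
# -> roughly 9 km in both directions, matching the figure quoted above
```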

“The ocean model provided us with consistent data for the entire time period, and all over the Indian Ocean,” says Durgadoo.
They then used the data to simulate virtual objects back in time from July 2015 to the aircraft’s disappearance in March 2014.
In total, Durgadoo and his team launched almost five million virtual flaperons and tracked their likely journey back from the island.
The simulation resulted in almost five million possible trajectories, but by making certain assumptions, such as that the plane could have travelled a maximum of 500 km from its last known position, the researchers were able to whittle their results down to around 800,000 possible starting points.
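The whittling-down step described above can be illustrated with a minimal distance filter: given a cloud of back-tracked origin points, keep only those within 500 km of a reference position. The random origins and the placeholder reference coordinates below are invented for illustration only; they are not the team's data or method.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between points given in degrees."""
    r = 6371.0
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * np.arcsin(np.sqrt(a))

# Hypothetical back-tracked origins of the virtual flaperons (degrees)
rng = np.random.default_rng(0)
origins_lat = rng.uniform(-45, 5, size=5_000_000)
origins_lon = rng.uniform(60, 120, size=5_000_000)

# Placeholder reference position and the 500 km cut-off quoted above
ref_lat, ref_lon = -30.0, 95.0
dist = haversine_km(origins_lat, origins_lon, ref_lat, ref_lon)
plausible = dist <= 500.0
print(f"{plausible.sum():,} of {plausible.size:,} trajectories kept")
```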

These points still covered thousands of square kilometres of the southeast Indian Ocean, but this was largely a different area to where the search teams were looking.
According to the models, the probability that the flaperon started its journey in the patch of ocean off southwestern Australia where crash investigators were searching was less than 1.3%.
The more likely origin of the flaperon—and therefore the crash site, assuming it broke off on impact with the water, as is generally accepted—was further north, says Durgadoo.

Durgadoo’s work to track a two-metre-long flaperon across the ocean would not have been possible without considerable advances in the spatial resolution of ocean models.
Increases in computing power over the past few decades have spurred the development of models that use tighter grids and can therefore capture the movement of the ocean at the mesoscale, on the order of 100 km or less.
At this scale, swirling, circulating currents of water can be modelled.
Ocean models with high-enough resolution to represent these eddies can account for parameters such as volumetric flow rate, temperature and salinity, and can therefore reproduce more realistic ocean behaviour than models with lower resolution.

The emergence of high-resolution ocean models raises questions about using models with a coarser resolution, particularly for climate projections into future decades and beyond.
Every climate projection is a result of simulations that use models developed by various research centres around the world.
These models seek to incorporate and couple each component of the Earth system, from the cryosphere (the planet’s ice-covered regions) and biosphere to the atmosphere and ocean.
Because the accuracy of climate projections depends on how well each component of the model represents reality, incorporating ocean models with a higher spatial resolution into climate simulations should provide a better picture of how the climate is likely to change in the coming decades (see Q&A).
But modelling at higher resolutions carries a cost and might not be the only way to improve simulations of the ocean.

Current simulations

“Climate models don’t correctly simulate many aspects of the global ocean,” says Lisa Beal, a physical oceanographer at the University of Miami in Florida.
In particular, she is interested in western boundary current systems, which are deep, narrow, fast-flowing currents on the western side of ocean basins.
These currents carry huge amounts of heat from the tropics to the poles and have a large impact on global climate.
But so far, she says, they have not been correctly simulated in the models that underpin climate projections reported by the Intergovernmental Panel on Climate Change (IPCC).

The panel’s most recent report generally used ocean models with a resolution of around 1 degree.
This is because the models must simulate the ocean, atmosphere, land and ice as coupled systems feeding back on one another, and must simulate change over a period of 200 years.
Both of these objectives are computationally expensive, even at a resolution of 1 degree.
But at this resolution, the entire spatial scale of a western boundary current is covered by a single data point.
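A rough comparison, assuming a 100 km-wide current and about 111 km per degree (both round, illustrative numbers), makes the point: at 1-degree resolution the current spans less than one grid cell, while at a tenth of a degree it spans roughly ten.

```python
KM_PER_DEG = 111.3          # approximate length of one degree of latitude
CURRENT_WIDTH_KM = 100.0    # typical width of a western boundary current

for res_deg in (1.0, 1.0 / 4, 1.0 / 10, 1.0 / 12):
    spacing_km = res_deg * KM_PER_DEG
    points = CURRENT_WIDTH_KM / spacing_km
    print(f"{res_deg:>6.3f} deg grid -> {spacing_km:6.1f} km spacing, "
          f"~{points:4.1f} grid points across the current")
# 1 degree: less than one grid point across the current
# 1/10 to 1/12 degree: roughly 9-11 points, enough to capture its structure
```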

“This is our frontier,” says Beal.
“We need to be able to resolve crucial ocean features such as eddies and western boundary current systems in the global climate models that are used to predict the climate of the twenty-first century.”

Resolving a current problem

Whereas a typical climate-prediction model used by the Intergovernmental Panel on Climate Change shows the Agulhas Current off South Africa’s coast flowing freely into the South Atlantic (top), a simulation with ten-times-higher resolution (bottom) reveals swirling currents that more closely match real-world observations.

Credit: Ben Kirtman, University of Miami-RSMAS

In 2016, Beal and her colleagues resolved the Agulhas western boundary current system, which lies off the coast of South Africa, for the first time in a climate model.
They used a coupled climate model with a global ocean resolution of one-tenth of a degree.
They then looked at what kind of ocean behaviours changed compared with lower-resolution models, and observed the effect on the heat and salt content of the South Atlantic.
They also calculated the water transport rate into the South Atlantic by so-called Agulhas leakage.
In both cases, the simulation matched real-world observations more closely than did models at lower resolution (see ‘Resolving a current problem’).

The next IPCC report, to be published in 2022, will use some ocean models with a resolution of one-quarter of a degree.
Beal thinks that using higher-resolution ocean models in global climate simulations is likely to change the representation of the way heat is transported from the equator to the poles.
Today’s climate models simulate western boundary current systems as being broader and slower than they really are.
As a result, they might underestimate the efficiency with which heat is carried from the equator towards the poles in the real ocean—a faster boundary current system loses less heat to the atmosphere and so delivers more to the poles.
Such an error in the way the model exchanges heat between the ocean and the atmosphere could result in an inaccurate simulation of global climate.

Levke Caesar: a climate of uncertainty

In 2018, Levke Caesar, a climate scientist at Maynooth University in Ireland, used a high-resolution model and observations of ocean circulation to show that the Atlantic meridional overturning circulation (AMOC)—a complex system of ocean currents that moves heat between the tropics and the North Atlantic—is losing strength.
This weakening could considerably change the climate of the Northern Hemisphere.
At the 2019 Lindau Nobel Laureate Meeting, Caesar spoke to Nature about the role of ocean models in climate science.

Why do climate scientists need to model the ocean?

Ocean circulations move carbon and heat into the deep ocean and transport heat around the globe—up to 1.3 petawatts from the tropics to the poles in the case of the AMOC.
That is as much power as about 1 million average-sized nuclear reactors.
So climate models improve a lot when you include the ocean.
And the longer the timescales you look at, the more important the ocean becomes.
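As a rough sanity check of that comparison, assuming an average reactor output of about 1 gigawatt (a figure not given in the interview):

```python
# Rough check of the reactor comparison above
amoc_heat_transport_w = 1.3e15   # 1.3 petawatts
reactor_output_w = 1.0e9         # assumed average reactor output, ~1 GW
print(f"{amoc_heat_transport_w / reactor_output_w:,.0f} reactor-equivalents")
# -> 1,300,000, i.e. on the order of a million reactors
```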

How accurate are ocean models now?

There is a saying that all models are wrong, but some are useful, and it’s true—every model is an approximation.
The observed strength of the AMOC is around 16 million cubic metres per second.
But some climate models put it at nearly 30 million cubic metres per second.
So they’re definitely not perfect, but they can still help us to understand the mechanisms that drive ocean circulation, and how changes in the ocean feed back into the atmosphere.

How can you verify a model?

One reason why models of ocean currents diverge so much is that we are not completely sure what’s true.
But there are more observational systems being set up that we can use to validate our models with real data.
An array of moorings that spans the Atlantic at a latitude of 26 degrees north, from Morocco to Florida, has shown us how the AMOC has behaved in real life since 2004.
Observational work is really important in putting together a broad picture of the ocean.

What makes the AMOC so hard to model?

One problem is that it is difficult to model processes that take place at small scales.
At the surface you can get spatial resolution down to just a few metres, but that increases with depth.
Our high-resolution model has a horizontal scale of about 10 kilometres.
So there will always be processes that have to be represented by coding in the mathematical description of their physics.

Is uncertainty in models a problem?

Yes, definitely.
Some things that are difficult to implement in models could have a big impact on our predictions.
Most models suggest that the AMOC will continue to slow, but will not collapse this century.
But those models don’t account for fresh water coming from the Greenland Ice Sheet.
The omission of that destabilizing force probably makes the AMOC look more stable than it actually is.
Uncertainty in results also makes it harder to convince people about what is happening.
When scientists communicate, we try to be as honest as possible, and that means including all the uncertainties.
The problem is, when non-scientists claim that climate change is not that bad, they don’t say “but we could be wrong”—they say it convincingly.

What can be done about that?

We should probably emphasize the costs more.
If the negative effects of something happening are great, then no matter what the percentage risk is, you should take action to prevent it.
Some climate models suggest that as the AMOC weakens, storm tracks could become more prominent going towards the United Kingdom.
We’re not certain about that, but there’s an indication.
 Do we really want to test it?

Mixing it up

Even when resolution is not high enough to model small-scale processes, steps can be taken to represent them by alternative means.
For mesoscale ocean processes, this can be done by parameterization—a method by which ocean processes are represented by coding in the mathematical description of their physics.

Jennifer MacKinnon, a physical oceanographer at the Scripps Institution of Oceanography in San Diego, California, studies internal waves that oscillate deep in the ocean.
These waves have an important effect on mesoscale turbulent mixing processes in the ocean, which are known to affect the way the entire ocean works, and therefore influence the global climate.

“Because ocean models have a certain resolution which tend to be many kilometres, or tens of kilometres, in scale,” she says, “they cannot resolve and simulate many of the processes in the ocean.” Even for higher-resolution models, these ‘sub-grid-scale processes’ might still be too small to be explicitly resolved.

In 2017, MacKinnon co-authored a paper on internal wave-driven ocean mixing that was the culmination of a five-year study by the Climate and Ocean: Variability, Predictability and Change (CLIVAR) project.
“Models had previously set the mixing rate as a constant, or at least something that was not spatially variable,” she says.
But MacKinnon had seen that this was not really the case.
“Our observations showed us something that the models are not yet incorporating,” she explains.
The researchers tweaked the models to represent those turbulent mixing processes, and then looked at how the ocean models behaved differently over timescales of decades compared with those that took ocean mixing to be a constant.
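A toy one-dimensional sketch, not the CLIVAR parameterization itself, shows what it means to replace a constant mixing rate with a spatially variable one: the same vertical diffusion step is run with a uniform diffusivity and with one that varies with depth, and the resulting temperature profiles diverge. All numbers below are invented for illustration.

```python
import numpy as np

def diffuse_column(temp, kappa, dz, dt, steps):
    """Explicit vertical diffusion of a temperature profile.

    `kappa` may be a scalar (the old constant-mixing assumption) or an
    array with one value per layer interface (spatially variable mixing).
    """
    temp = temp.copy()
    kappa = np.broadcast_to(kappa, (temp.size - 1,)).astype(float)
    for _ in range(steps):
        flux = kappa * np.diff(temp) / dz          # heat flux at layer interfaces
        temp[1:-1] += dt / dz * np.diff(flux)      # update interior layers
    return temp

# Toy 20-layer column: warm at the surface, cold at depth
z = np.arange(20)
temp0 = 20.0 - 0.8 * z

constant_kappa = 1e-4                                      # uniform mixing rate
variable_kappa = 1e-4 * (1 + 9 * np.exp(-z[:-1] / 3.0))    # mixing enhanced near the surface in this toy profile

dz, dt, steps = 10.0, 3600.0, 1000
print(diffuse_column(temp0, constant_kappa, dz, dt, steps)[:5])
print(diffuse_column(temp0, variable_kappa, dz, dt, steps)[:5])
```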

The results showed that the deep-water mixing parameterization had a significant effect on the ocean’s overturning circulation, which in turn affects the atmosphere and therefore global climate.
So, to have a realistic climate projection, MacKinnon argues, models that are too coarse to include these internal mixing processes should at least include a parameterization to represent them.

Beyond resolution

Despite the benefits that higher resolution and parameterization offer climate modellers, it is not always feasible to use them.
The time required to compute the simulations grows with increasing resolution, as does the quantity of data generated.
“This problem just grows and grows the longer you want your simulations to be,” Durgadoo says.

The temporary nature of a research workforce comprising graduate students and postdocs means there is not time to run ocean models at ever-higher resolution.
And crucially, it is not clear that continuously ramping up model resolution will always bring greater benefits.
At very high resolutions, Beal says, the performance of the models could become unstable.

Researchers also need to think about other factors besides resolution.
One study showed that coupling the ocean with the atmosphere gives a more realistic simulation of the Gulf Stream than that achieved by simply increasing the model’s resolution.
“If you keep turning up the resolution, there comes a point where you can’t really improve,” Beal says.
Durgadoo agrees.
“Resolution is definitely a limiting factor, but only up to a point,” he says.
The simulations that he and his colleagues performed to trace the missing Malaysia Airlines flight, for example, had a high resolution, but their study had many other limitations.
“It’s not only a problem of model resolution across the surface of the ocean and through time—there are other unknowns,” he says.
For example, researchers have a limited understanding of the physics of fluid mechanics.
It does not matter how high a model’s spatial resolution is if the underlying physics is lacking in detail.
The only way to overcome this problem is by further observational research.

The scientists all agree that better models require collaboration between those who observe the ocean and those who attempt to simulate it.
But that interdisciplinary communication can be lacking.
Beal and MacKinnon are physical oceanographers who lead ocean cruises to deploy measuring devices into the abyssal depths, whereas ocean modellers such as Durgadoo are almost always office-based and often work at different institutes.
Without effort, they might never meet.

Beal says that programmes such as CLIVAR and the Global Ocean Observing System (GOOS) are extremely useful for bringing researchers together, and MacKinnon’s climate process team is an example of a positive outcome from that process.
By grouping observational scientists and modellers together, MacKinnon says, the community can improve its understanding of the physical ocean and refine the performance of the models.

As models improve, so too might confidence in the conclusions that can be drawn from them.
Such a boost might have benefited Durgadoo’s team in the search for MH370.
Although they recognized the limitations of their study, they contacted the search authorities in 2015 with the finding that they were probably looking in the wrong place.
The authorities acknowledged receipt of their correspondence, but there was no discussion or action around shifting the search site.
“More recently, we’ve conducted further research on the matter, but decided not to send it to the authorities,” says Durgadoo.
The current focus of their work is improving the method, he explains.

For now, the disappearance and whereabouts of the aircraft remain a mystery.


Thursday, November 14, 2019

ENC catalogue in Google Earth

17654 ENC cells from 68 HOs (date 13/11/2019 week 46)
download kmz file (weekly updated) : ENC world coverage in Google Earth

Electric car future may depend on deep sea mining

Apollo II is a prototype deep sea mining machine being tested off the coast of Malaga

From BBC by David Shukman

The future of electric cars may depend on mining critically important metals on the ocean floor.

That's the view of the engineer leading a major European investigation into new sources of key elements.
Demand is soaring for the metal cobalt - an essential ingredient in batteries and abundant in rocks on the seabed.

Laurens de Jonge, who's running the EU project, says the transition to electric cars means "we need those resources".
He was speaking during a unique set of underwater experiments designed to assess the impact of extracting rocks from the ocean floor.

In calm waters 15km off the coast of Malaga in southern Spain, a prototype mining machine was lowered to the seabed and 'driven' by remote control.
Cameras attached to the Apollo II machine recorded its progress and, crucially, monitored how the aluminium tracks stirred up clouds of sand and silt as they advanced.


An array of instruments was positioned nearby to measure how far these clouds were carried on the currents - the risk of seabed mining smothering marine life over a wide area is one of the biggest concerns.

What is 'deep sea mining'?

It's hard to visualise, but imagine opencast mining taking place at the bottom of the ocean, where huge remote-controlled machines would excavate rocks from the seabed and pump them up to the surface.

The vessel used for the underwater research off Spain, the Sarmiento de Gamboa, is operated by CSIC, the Spanish National Research Council.

The concept has been talked about for decades, but until now it's been thought too difficult to operate in the high-pressure, pitch-black conditions as much as 5km deep.

Now the technology is advancing to the point where dozens of government and private ventures are weighing up the potential for mines on the ocean floor.


Why would anyone bother?

The short answer: demand.
The rocks of the seabed are far richer in valuable metals than those on land and there's a growing clamour to get at them.

Billions of potato-sized rocks known as "nodules" litter the abyssal plains of the Pacific and other oceans and many are brimming with cobalt, suddenly highly sought after as the boom in the production of batteries gathers pace.

At the moment, most of the world's cobalt is mined in the Democratic Republic of Congo where for years there've been allegations of child labour, environmental damage and widespread corruption.

Current technology for electric car batteries requires cobalt, thought to be abundant on the sea floor
The demand for metals—such as lithium, cobalt and nickel—means that there is only one way to feed an electrifying world: higher prices. 

Expanding production there is not straightforward, which is leading mining companies to weigh the potential advantages of cobalt on the seabed.

Laurens de Jonge, who's in charge of the EU project, known as Blue Nodules, said: "It's not difficult to access - you don't have to go deep into tropical forests or deep into mines.
"It's readily available on the seafloor, it's almost like potato harvesting only 5km deep in the ocean."

And he says society faces a choice: there may in future be alternative ways of making batteries for electric cars - and some manufacturers are exploring them - but current technology requires cobalt.

Laurens de Jonge likens the process to "potato harvesting" 5km down in the ocean

"If you want to make a fast change, you need cobalt quick and you need a lot of it - if you want to make a lot of batteries you need the resources to do that."
His view is backed by a group of leading scientists at London's Natural History Museum and other institutions.

They recently calculated that meeting the UK's targets for electric cars by 2050 would require nearly twice the world's current output of cobalt.

So what are the risks?

No one can be entirely sure, which makes the research off Spain highly relevant.
It's widely accepted that whatever is in the path of the mining machines will be destroyed - there's no argument about that.
But what's uncertain is how far the damage will reach, in particular the size of the plumes of silt and sand churned up and the distance they will travel, potentially endangering marine life far beyond the mining site.
The chief scientist on board, Henko de Stigter of the Dutch marine research institute NIOZ, points out that life in the deep Pacific - where mining is likely to start first - has adapted to the usually "crystal clear conditions".


So for any organisms feeding by filter, waters that are suddenly filled with stirred-up sediment would be threatening.
"Many species are unknown or not described, and let alone do we know how they will respond to this activity - we can only estimate."
And Dr de Stigter warned of the danger of doing to the oceans what humanity has done to the land.
"With every new human activity it's often difficult to foresee all the consequences of that in the long term.
"What is new here is that we are entering an environment that is almost completely untouched."
Could deep sea mining be made less damaging?

Ralf Langeler thinks so.
He's the engineer in charge of the Apollo II mining machine and he believes the design will minimise any impacts.
Like Laurens de Jonge, he works for the Dutch marine engineering giant Royal IHC and he says his technology can help reduce the environmental effects.

The machine is meant to cut a very shallow slice into the top 6-10cm of the seabed, lifting the nodules.
Its tracks are made with lightweight aluminium to avoid sinking too far into the surface.

David Shukman (R) talks to Ralf Langeler, the engineer in charge of the Apollo II mining machine

Silt and sand stirred up by the extraction process should then be channelled into special vents at the rear of the machine and released in a narrow stream, to try to avoid the plume spreading too far.
"We'll always change the environment, that's for sure," Ralf says, "but that's the same with onshore mining and our purpose is to minimise the impact."

I ask him if deep sea mining is now a realistic prospect.
"One day it's going to happen, especially with the rising demand for special metals - and they're there on the sea floor."

Who decides if it goes ahead?

Mining in territorial waters can be approved by an individual government.
That happened a decade ago when Papua New Guinea gave the go-ahead to a Canadian company, Nautilus Minerals, to mine gold and copper from hydrothermal vents in the Bismarck Sea.
Since then the project has been repeatedly delayed as the company ran short of funds and the prime minister of PNG called for a 10-year moratorium on deep sea mining.

A Nautilus Minerals representative has told me that the company is being restructured and that they remain hopeful of starting to mine.
Meanwhile, nearly 30 other ventures are eyeing areas of ocean floor beyond national waters, and these are regulated by a UN body, the International Seabed Authority (ISA).
It has issued licences for exploration and is due next year to publish the rules that would govern future mining.

The EU's Blue Nodules project involves a host of different institutions and countries.


Wednesday, November 13, 2019

These maps show how many people will lose their homes to rising seas—and it’s worse than we thought

Based on new data available through CoastalDEM
 Data comparing CoastalDEM to SRTM 30 m models

From Popular Science by Sara Chodosh

New elevation data triples the people at risk.

When you hear how many people are living on land that might be underwater by 2100, you might wonder how we know exactly how high the sea level will be so far into the future.
That kind of modeling is incredibly complex, and involves countless calculations and assumptions that influence the outcome.
But you probably don’t wonder how we know the elevation of the land.
In many parts of the world a quick glance at Google Maps can tell you how many feet above sea level you are at any given time.

But like anything we measure, our estimations of elevation are inherently error-prone.
When you’re measuring how high a mountain is, being off by two meters (that’s 6.56 feet) is not a huge deal.
But rising seas can make the same margin of error deadly for coastal areas.
And that’s exactly what’s happening.

When researchers at Climate Central used a new method called CoastalDEM to estimate the elevations of the world’s coastal areas, the number of people vulnerable to sea level rise came out at nearly triple previous calculations.
The new projection suggests up to 630 million people live in places that could be underwater by 2100, with more than half of those slipping under the rising seas by 2050.
They published their findings in Nature Communications.

It may be hard to believe our elevation data could be so far off, but consider how we get it.
NASA’s SRTM model calculates the elevation of upper surfaces, not the actual earth itself, which means it’s especially inaccurate anywhere that’s either densely vegetated or populated, since there are physical objects protruding from the ground in both cases.

In places like the U.S. and Australia, high-quality LIDAR data enables us to see the actual elevations, but that data simply didn’t exist for most of the world.

CoastalDEM is essentially a neural network that’s trained by looking at the differences in the LIDAR elevations versus the SRTM elevations in the U.S.
Across the country, SRTM is off by an average of 3.7 meters (12.1 feet), but peaks nearly a meter higher in coastal cities.
By analyzing the patterns in these discrepancies, the CoastalDEM model can reduce the error in elevation data down to less than 0.1 meters (0.3 feet).
The researchers then used a sea-level projection in line with IPCC findings to estimate how many people might be living on land that will be underwater in the near future.
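A minimal sketch of that general approach, using invented features and synthetic data rather than the real CoastalDEM inputs, would look something like this: learn the SRTM error where LIDAR "truth" is available, then subtract the predicted error elsewhere. The scikit-learn MLPRegressor used here stands in for the paper's actual network, and the feature choices are purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic training set: where both SRTM and LIDAR elevations are known,
# the SRTM error (SRTM minus LIDAR) can be learned from auxiliary features.
rng = np.random.default_rng(42)
n = 5000
srtm_elev = rng.uniform(0, 20, n)        # metres above sea level (SRTM)
vegetation = rng.uniform(0, 1, n)        # e.g. a vegetation index
pop_density = rng.uniform(0, 1, n)       # e.g. normalised population density
# Fake "true" error: taller canopies and denser settlement bias SRTM upwards
srtm_error = 2.0 * vegetation + 1.5 * pop_density + rng.normal(0, 0.2, n)

features = np.column_stack([srtm_elev, vegetation, pop_density])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
model.fit(features, srtm_error)

# Apply the correction: subtract the predicted error from the SRTM elevation
corrected = srtm_elev - model.predict(features)
residual = np.mean(np.abs(srtm_error - model.predict(features)))
print(f"mean residual error after correction: {residual:.2f} m")
```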

CoastalDEM (first image) vs SRTM (image below)
In Bangkok, Thailand, CoastalDEM reveals significant increases in areas below projected average annual flood heights in 2050.
*Maps do not factor in potential coastal defenses, such as seawalls or levees, and are based on elevation, rather than flood models.
Emissions pathway: moderate emissions cuts (RCP 4.5) roughly consistent with the Paris climate agreement’s two-degree Celsius target.
Sea level rise model: Kopp et al. 2014, median climate sensitivity.
source : Climate Central's Coastal Risk Screening Tool interactive map

These new estimates don’t affect everyone equally, though.
As the map above shows, the vast majority of affected people will be in Asia.
The authors calculate that more than 70 percent of people living on threatened land are in just eight countries: China, Bangladesh, India, Vietnam, Indonesia, Thailand, the Philippines, and Japan.

The change in risk based on this new research isn’t spread evenly around the world, either.
Egypt and Cote d’Ivoire saw 428- and 708-fold increases, respectively—so high we had to adjust the graph below so as not to throw the entire scale out of whack.
The next highest—Liberia—still clocked in above a 75-fold increase.
Most of the countries poised to lose the greatest percentages of their land are island nations, 13 of which are still developing states.

Despite these massive upticks in the at-risk population, the authors note this could still be an underestimation.
CoastalDEM isn’t perfect, especially in dense cities, so even more people could be in danger.
Plus, the estimates are based on current populations, and the global count is likely only going up.
If we continue living in coastal areas, we’ll likely see even more people at risk of losing their homes to climate change.

Another comparison: Airbus WorldDEM 12 m DSM results vs SRTM 30 m results.
SRTM shows completely different results.


Tuesday, November 12, 2019

Canada (CHS) layer update in the GeoGarage platform

62 nautical raster charts updated & 1 new inset added

'The perfect combination of art and science': mourning the end of paper maps

World Map circa 1900: Mercator Projection of the World.
Photograph: Buyenlarge/Getty Images

From The Guardian by Jeff Sparrow

Digital maps might be more practical in the 21st century, but the long tradition of cartography is magical

“Some for one purpose and some for another liketh, loveth, getteth, and useth Mappes, Chartes, & Geographicall Globes.”

So explained John Dee, the occult philosopher of the Tudor era.
The mystical Dr Dee would, perhaps, have understood the passion stirred by Geosciences Australia’s recent decision to stop producing or selling paper versions of its topographic maps in December, citing dwindling demand.

In the 21st century, digital files might be more practical, particularly for a cash-strapped federal government agency.
But not everyone loveth and getteth their maps for purely practical reasons.

Just ask Brendan Whyte.
He’s a curator at the National Library of Australia, responsible for acquiring a copy of every map published in Australia, as well as managing a collection of perhaps a million or so charts and about the same number of aerial photographs.

A geographer by training, he knows that some people don’t appreciate electronic cartography.
“One of the problems, with the development of GIS [geographic information systems] and everyone making their own maps is that people just dump their data without thinking about the aesthetics or what the map is trying to tell a reader.”
A map, he says, needs beauty so that users want to look at it and absorb what it contains.

But that’s less about particular platforms, whether digital or otherwise, than the cartographer’s skill.

Figurative map of the successive losses in men of the French army in the Russian campaign 1812-1813, Charles Minard, 1869.
Photograph: Wikimedia

Whyte gives, as an example of what can be done with data, Minard’s famous map of Napoleon’s Russian campaign, on which a band represents both the progress and the extent of the Grande Armée juxtaposed against temperatures in 1812 and 1813, so that the viewer necessarily imagines the privations of a disintegrating army throughout a terrible Russian winter.

Whyte also admires the artistry of a Marshall Islands stick chart from the NLA’s own collection.
It’s believed to be from the early 1970s.

“They’re thin pieces of coconut wood put together in a sort of lattice like a cat’s cradle and wherever there was an atoll they’d lash on [a] little cowrie shell to represent that island.
The bits of coconut wood represent the routes, the wave patterns, the winds, the currents.
So they’re not a geographical representation like a modern map, but more how a navigator might get from one island to another via the sea route in a canoe with a big sail.”

Stick chart of the Marshall Islands.
Photograph: National Library of Australia

His favourite catalogued item might also be one of the smallest, an atlas from Queen Mary’s dollhouse.

“A lot of publishing companies and authors produced real books for her dollhouse.
The famous map shop and publishers Stanfords made her the Atlas of the British Empire, reducing it to about two inches high.”

At the State Library of Victoria, the librarian Sarah Ryan also nominates an atlas – albeit a rather bigger one – as a particularly treasured item.

“It’s known as the first modern day atlas, even though it was produced in 1572: Ortelius’s Mirror of the World.
The printing of that volume is beautiful and the maps are very colourful, and you’ve got lots of iconography like sea monsters and ships and compass roses.”

She agrees that, while digital maps can be more convenient, a lot of people still prefer paper, particularly for recreational uses.
Ultimately, though, it’s relationships that matter.

“What pulls you to maps,” she says, “is that connection with people and place and culture.”

 A miniaturised Atlas of the British Empire, made for Queen Mary’s dollhouse.
Photograph: National Library of Australia

By way of example, she talks of looking at the SLV’s Atlas of Paris from 1739, just after her own first trip to the city.
“I’d visited all those places, so that has a strong connection to me.”

Maps, of course, also document territorial claims.

David Rumsey Map Collection – Turgot Michel Etienne, Paris 1739.
Photograph: David Rumsey Map Collection

The catalogue describes the SLV’s rarest map as coming “from the survey of Mr Wedge and others”.

The attribution sounds innocuous until you identify the surveyor as the John Wedge who accompanied John Batman on his expedition across Bass Strait.

The yellowing paper thus signals the plans for a township in Port Phillip – and the beginning of Indigenous dispossession.

Yet, if maps represent power, they can also show change.

It’s a point made by Kay Dancey, the CartoGIS Services Manager at the College of Asia and the Pacific at the Australian National University.

Map of Port Phillip from the survey of Mr Wedge and others.
Photograph: State Library Victoria

A cartographer by training, Dancey provides data visualisation for ANU researchers, as well as managing a collection of hardcopy and digital maps.
Her holdings feature items dating back to the 17th century, and include 18th-century works by the French hydrographer and philosopher Jacques-Nicolas Bellin.
“The sheer craft of how they produced these maps … They’re invariably copper engravings, and there’s such skill required in this process.
And then there’s the beauty: the fabulous colours and cartouches that they employ.”

But, when asked to describe a favoured map, she nominates something very different.
“There’s a lovely map here,” she says, “a wall map of Africa from the 1950s.
It’s one of my favourites because it has hand made corrections to the country name Zaire or the Democratic Republic of the Congo, as it is now.
I particularly love that because it has been in this mapping unit for 60 years and these hand annotations have been made by cartographers over that time.
So it’s a gorgeous link back to the people who have worked as cartographers at the ANU, a reminder of changing sovereignties and what a map is: a snapshot of time and an abstraction of place.”

In that spirit, she notes the real innovations of digital technology making both data and mapping platforms more widely available, and thus facilitating what she calls “a democratising phase of cartography”.

The maps of Adam Mattinson provide an obvious example.
By day, Mattinson works as a geospatial analyst for an engineering firm.
In his own time, he uses his cartographic training to represent the local landscape in strange and fantastical forms.

In one project, he depicts a Melbourne constructed on Port Phillip as it looked in the Ice Age; in another work in progress, he imagines the city after massive rises in sea levels.

In the book How to Lie with Maps, Mark Monmonier notes what he calls “the cartographic paradox”.
To present complex information from a three-dimensional world clearly in two-dimensional format, the surveyor must abstract and distort.
In other words, as Monmonier says, “to present a useful and truthful picture, an accurate map must tell white lies.”

Mattinson’s work takes that idea to its logical conclusion.

He’s probably best known for his Tolkienesque depiction of the Yarra Ranges, in which The Patch looks almost like Hobbiton and the warning “Puffing Billy Roams this Area” appears alongside a depiction of a dragon.
“It’s covering an area in which I grew up,” he says.
“To me, the landscape there really lends itself to a Middle Earth kind of quality.”

As a child, Mattinson used to open the street directory at a random page and then trace his finger to find his way home.
“I think that’s where the passion for maps springs for a lot of kids.
They look at a map and they see that the world is bigger than they had thought.”

Though he works with digital platforms, he loves how physical maps encourage people to gather and discuss the landscape.
“It really is a perfect combination of art and science, cartography.
To have something that’s beautiful to look at but also an object you can look at and think, ‘Oh, I used to live near here’ or ‘I want to go there’ and so on.
It’s part of the beauty of mapping, the shared experience.”

A kind of magic, as John Dee understood.


Monday, November 11, 2019

AC75 foiling Ineos TeamUK America's Cup

Latest footage of America's Cup team INEOS TEAMUK foiling on the Solent, Portsmouth UK.

AC75 Britannia helmed by Sir Ben Ainslie.

Sunday, November 10, 2019

Diving below the Antarctic ice sheet with no escape

The Seven Worlds, One Planet crew dived beneath the surface of the Antarctic ice sheet with only a tiny bore hole for escape.
Discover the wonders they found in the frozen seas.