Thursday, October 19, 2017

Mapping the Great Barrier Reef with cameras, drones and NASA tech

In this special report, CNET dives into the ways scientists and innovators are using technology to rescue the Great Barrier Reef.

From CNET by Jennifer Bisset

New and old technologies reveal what's killing Australia's great marine wonder.

Richard Vevers, a British underwater photographer, was horrified when he returned in 2015 to a colourful reef in American Samoa he had shot a year earlier.
It had turned pure white.

Vevers, who runs a marine advocacy group called The Ocean Agency, knew bleaching, a process caused by global warming that starves coral, was the cause.
He also knew the public didn't understand the ocean's sorry shape because it couldn't see what was going on.
Cameras, he reckoned, could help.

 Great Barrier Reef with the GeoGarage platform (AHS chart)

So The Ocean Agency partnered with Google to take the search giant's Street View concept underwater at Australia's Great Barrier Reef.
It designed a military-grade underwater scooter, worth AU$50,000 (about $39,000 or £29,700), with a camera mounted on top.
The thousands of photographs the camera took were then processed by image-recognition software the group wrote for the project.

"As soon as we designed that [technology], the scientists all realised that this could revolutionise the study of coral reefs," Vevers says.
"You could suddenly look at coral reefs at a scale that was really unprecedented."

Conservationist Richard Vevers goes deep beneath the waves with his underwater camera. (The Ocean Agency/XL Catlin Seaview Survey)

Vevers and The Ocean Agency aren't the only researchers mapping the Great Barrier Reef, which is dying as man-made climate change wreaks havoc on its beautiful but delicate ecosystem.
Teams from Australia and around the world have projects to chart a complex, dynamic habitat that covers an area as big as Germany.
Their work is crucial to efforts to save the reef, which saw 29 percent of its shallow-water coral die last year.
After all, how can you save something if you can't see it?

The teams are using a range of technology both new and old in their projects.
At Archer Point in North Queensland, a team of Indigenous rangers deploys drones to monitor the health of the reef and map the surrounding country.
A team from the University of Sydney takes thousands of photographs of the reef one second apart with GoPro cameras, stitching them into giant high-res images.
Off the reef's Whitsunday coast, marine biologist Johnny Gaskell used Google Maps to pinpoint the location of a blue hole filled with healthy coral protected from the bleaching.

 Google Maps helped spot what has been dubbed "Gaskell's Blue Hole".

The task is daunting.
We think of the Great Barrier Reef as a single expanse of coral, but it's actually a network of 3,000 reefs that spans 2,300 kilometres (1,430 miles) along Australia's eastern coast.
Roughly 9,000 species live on it and 2 million tourists visit every year to drink in its splendour.

The sheer size of the reef makes mapping expeditions expensive, and, no surprise, funding is hard to come by.
Large ships that use remote sensing in deep waters can cost AU$40,000 (about $32,000 or £24,000) a day.
A big drilling ship used to recover parts of fossilized reefs, which shed light on past changes in climate, can cost up to AU$12 million per expedition (about $9.5 million or £7 million).
The Royal Australian Navy and Maritime Safety Queensland, a state department, shoulder some of the costs, as does the International Ocean Discovery Program, an international consortium that's the largest marine geoscience program in the world.

It's still not enough.
One estimate reckons $1 billion a year is needed for the next 10 years to have a chance of saving the reef.

In the Navy

The Royal Australian Navy has protected the country's waters since before its formal establishment in 1911.
For the past three decades, it's used a homegrown technology to map the ocean floor around the country, too.

 The Royal Australian Navy uses LADS laser technology to map the ocean floor.

Known as Laser Airborne Depth Sounder (LADS), or airborne lidar, it took 20 years to develop and is now a fixture in marine mapping efforts around the world.
The concept is similar to multibeam sonar, but uses light rather than sound.

A LADS system shoots two beams of amplified light -- a green laser and a red one -- at the water below.
The green laser pierces the water and reaches the ocean floor.
The red one stops at the surface.
Each is precisely measured.
Subtract the red measurement from the green measurement and you have the depth.
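The arithmetic behind that subtraction can be sketched in a few lines. The timings and the seawater refractive index below are illustrative values, not real LADS parameters:

```python
# Minimal sketch of the LADS-style depth calculation described above.
# The green beam's extra travel time happens in water, where light
# moves more slowly than in air.

C = 299_792_458.0   # speed of light in a vacuum, m/s
N_WATER = 1.33      # approximate refractive index of seawater

def depth_from_returns(t_red_s: float, t_green_s: float) -> float:
    """Estimate water depth from the round-trip times of the two lasers.

    t_red_s   -- round-trip time of the red beam (reflects off the surface)
    t_green_s -- round-trip time of the green beam (reflects off the seabed)
    """
    extra_time = t_green_s - t_red_s   # time spent travelling in water
    v_water = C / N_WATER              # light slows down in water
    return extra_time * v_water / 2.0  # halve for the round trip

# Example: the green return arrives 266 nanoseconds after the red one,
# which works out to roughly 30 metres of water.
print(round(depth_from_returns(6.0e-6, 6.0e-6 + 266e-9), 1))  # → 30.0
```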

Each survey is run by a team of specialists: two officers, three senior sailors and three junior sailors.
The pilots need that support to endure flights of up to seven hours at roughly a kilometre above the ground, unusually low by aviation standards.

The Navy needs to know where the coral reefs are to protect them, as well as the boats that ply the waters above them.
Chunks of the Great Barrier Reef, it turns out, are under busy shipping lanes.
An Australian government study showed that 3,947 ship voyages called at reef ports in 2012.
Port authorities and corporations expect that number to hit 5,871 in 2017, at a yearly growth rate of between 4 and 5 percent.

Alex Cowdery flew LADS sorties for the Navy when the tech debuted in the 1990s and now works for a company that sells the service to other navies.
He keeps charts from his early flights on his office walls to remind his clients this is a well-tested technology.

Since 1993, the Navy has mapped roughly 240,000 square kilometres of the Great Barrier Reef, more than half of its 347,800-square-kilometre area, he says.
The Navy flies an average of one sortie every other day to keep its charts updated.
Mapping takes discipline.

"It was exciting back then," Cowdery tells me of his early days flying LADS planes.
"And it's still exciting right now."

NASA took to the air with a new instrument that measures the wavelength of light, gathering data on healthy and unhealthy coral.
A space agency maps coral

A NASA-run project called CORAL, the Coral Reef Airborne Laboratory, took to the air from September to October in 2016 to map what's underwater.
Developed by the Jet Propulsion Laboratory, CORAL tests a next-generation hyperspectral instrument important to scientists for distinguishing healthy coral reef systems from unhealthy ones.

The project uses PRISM, an acronym for the JPL's Portable Remote Imaging Spectrometer, to capture spectrum data that previously couldn't be collected.
Spectrometers measure the wavelength of light, which provides insight into what something is made of.
The PRISM system was mounted in the belly of an airplane that flew at low altitudes above the reef, capturing high-resolution spectrum data from beneath the water's surface.
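As a toy illustration of how per-pixel spectra can tell apart healthy reef, bleached reef and bare seabed, here is a nearest-reference classifier. The wavelength bands, reflectance values and labels are invented for illustration and are far coarser than anything PRISM actually records:

```python
# Toy per-pixel classifier: match a measured reflectance spectrum to the
# closest reference spectrum. All numbers are invented for illustration;
# a real hyperspectral instrument records hundreds of bands and needs
# careful atmospheric and water-column corrections.
import math

# Reflectance at four illustrative wavelengths (blue, green, red, near-IR)
REFERENCES = {
    "healthy coral":  [0.05, 0.10, 0.06, 0.02],  # pigments absorb strongly
    "bleached coral": [0.30, 0.35, 0.33, 0.10],  # white skeleton reflects
    "sand":           [0.40, 0.45, 0.44, 0.20],
}

def classify(spectrum):
    """Return the reference label with the smallest Euclidean distance."""
    def dist(ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(spectrum, ref)))
    return min(REFERENCES, key=lambda label: dist(REFERENCES[label]))

print(classify([0.28, 0.34, 0.30, 0.09]))  # → bleached coral
```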

The US space agency and Australia's CSIRO proposed the project five years ago but it stalled amid funding issues, a common hurdle for scientific research.
Then in 2016, money was found and the project was back on.
By September, a NASA team decked out in caps and T-shirts bearing the organisation's iconic logo had arrived in an Australian government facility in Brisbane.

"It's pretty cool," Tim Malthus, who runs a CSIRO coastal studies group, says of working with the NASA team, which he describes as extremely well-organised.The PRISM system is a prototype that's much smaller than previous hyperspectral systems and that will eventually be deployed on satellites.
The PRISM technology has an advantage over multibeam and lidar surveys, Malthus says, because it's uniform and doesn't require corrections.

Multibeam sonar technology bounces acoustic pulses off the seabed to reveal the shape of the seafloor.

Beaman, the James Cook University researcher, brings multibeam and satellite technologies together to generate his 3D models of the reef.
Those models are now being used in a range of projects aimed at saving the reef.

The accuracy of 3D models helps researchers understand natural hazards and features of the seafloor.
For example, E-reefs, a government modelling program, uses the 3D landscapes to model particles as they move through the reef to predict how pollution travels through the area.
The model is also used to make decisions on where fishing can be allowed.
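The particle idea can be sketched in miniature: step a virtual particle through a field of ocean currents and record its path. The current field and time step below are invented for illustration; the real modelling program works over the full 3D reef landscape at far higher resolution:

```python
# Minimal sketch of particle tracking over a current field, the kind of
# simulation reef pollution models run at far higher resolution.
# The current field and time step are invented for illustration.

def advect(position, current_at, dt, steps):
    """Step a particle forward through a velocity field.

    position   -- starting (x, y) in metres
    current_at -- function (x, y) -> (u, v) velocity in m/s
    dt         -- time step in seconds
    """
    x, y = position
    path = [(x, y)]
    for _ in range(steps):
        u, v = current_at(x, y)
        x, y = x + u * dt, y + v * dt
        path.append((x, y))
    return path

# A uniform 0.5 m/s eastward current carries a particle 1.8 km in an hour.
path = advect((0.0, 0.0), lambda x, y: (0.5, 0.0), dt=60.0, steps=60)
print(path[-1])  # → (1800.0, 0.0)
```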

"Unless you map it, you don't know what it is," Beaman says.
"That's really what it comes down to."

Vevers, the head of The Ocean Agency, says that's what the Google Street View project was all about.
If people see the Great Barrier Reef and other reefs around the world, they'll be more emotionally involved in saving them.

The Ocean Agency technology that was designed for the Street View project is now being used to document the state of reefs in 25 countries around the world.
Vevers says the results will be made public so everyone can see what's at stake.

"Everyone feels like a child when they see that new environment for the first time," Vevers says.
"You just get excited."
