Monday, October 28, 2019

Decades of detailed weather reports pulled from old sailors' logs

Polar explorer Robert Falcon Scott's ship Terra Nova is moored amid pack ice in Antarctica in the early 1900s.
Photograph by Herbert Ponting, Nat Geo Image Collection

From National Geographic by Madeleine Stone

A database created in part from 19th-century maritime records sharpens our view of climate change over the past 150 years.

In September of 1879, the Arctic-exploring USS Jeannette was sailing north of the Bering Strait when it was surrounded by ice floes and frozen in place.
Imprisoned at sea, the 33-person crew struggled to survive for nearly two years before their ship sank, forcing them to embark on a perilous journey back to civilization.
While they were stranded, the crew took down regular observations of the weather—winds, clouds, air pressure, temperature—creating a detailed meteorological record where no others existed.

One hundred and forty years later, that record is now helping scientists reconstruct Earth’s weather and climate history in unprecedented detail.

The USS Jeannette's logs, which eventually made their way back to the United States along with 13 haggard crewmen led by chief engineer George Melville, were among the very first to be rescued as part of the Old Weather: Arctic project, a citizen science-fueled effort to digitize and transcribe the weather observations made by U.S. military vessels that sailed the Arctic in the 19th and 20th centuries.
Those records, along with similar data housed in many other archives, are being fed into the 20th Century Reanalysis, a sophisticated weather reconstruction database developed by the National Oceanic and Atmospheric Administration that allows scientists to characterize floods, droughts, storms, and other extreme events from history—and use the violent weather of the past to understand the present.


This woodcut depicts Lieutenant Commander George DeLong and his party wading ashore from the Jeannette in 1881.
Their ship sank after being trapped in ice for two years.
Woodcut By George T. Andrew, Designed By M.J. Burns,
Photograph Courtesy U.S. Naval History And Heritage Command

Earlier this month, that reconstruction received a major update when scientists infused it with millions of new observations from old ships' logs and weather stations around the world.
Now, NOAA’s souped-up “weather time machine” can produce snapshots of Earth’s atmosphere eight times a day going all the way back to 1836.
“Every three hours, we’re providing an estimate of what the weather was, anywhere in the world,” says Laura Slivinski, a research scientist at the University of Colorado Boulder's Cooperative Institute for Research in Environmental Sciences (CIRES) and NOAA’s Earth System Research Laboratory.
“It’s pretty unique.”

The ‘fog of ignorance’

Today, scientists have myriad satellites and weather stations at their disposal to study the weather.
But satellite record keeping only began about 40 years ago, and prior to the mid-20th century there were far fewer weather stations.
Scientists can use models to “hindcast” the weather further back in time, but without data to feed into those models, their reconstructions are murky.

“We call it the fog of ignorance,” says Gilbert Compo, a senior research scientist at NOAA’s CIRES.

To cut through that fog, researchers at NOAA have spent more than a decade gathering data on surface pressure, temperature, and sea ice conditions from archives around the world that are being digitized and transcribed with the help of volunteers.
These data rescue efforts include several iterations of the Old Weather project, an initiative that digitized handwritten weather reports from 19th-century England, another focused on logbooks kept by Australian sea captains, and many more.


Logbook for the Jeannette, a ship that was imprisoned in ice for two years before it sank.
Its crew carried the log to safety.
Photograph Courtesy National Archives And Records Administration

Once written records have been converted into a format NOAA can use, they can be added to the 20th Century Reanalysis, which feeds them into a model similar to the one the National Weather Service relies on for forecasts, producing snapshots of the atmosphere going back in time.
The latest version of the reanalysis includes 25 percent more observations for years prior to 1930, resulting in far more reliable hindcasts, particularly for the 19th century.

Every additional ship’s log helps.
For instance, in October 1880, a famously powerful cyclone made landfall in the town of Sitka in the Alaska panhandle.
Older versions of the 20th Century Reanalysis couldn’t recreate this storm at all.
But the latest update includes observations from the USS Jamestown, a vessel that was moored offshore at the time.
With its pressure readings, the weather time machine is now able to produce a storm in the right location at the right time.
“It feels like a drop in the bucket, but those observations add up,” Slivinski says.
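
For readers curious what querying the reanalysis for an event like the Sitka storm might look like, here is a minimal sketch in Python with xarray. The file name, variable name ("prmsl"), coordinate names, and the exact storm date are assumptions to be checked against the actual 20CRv3 files NOAA distributes, not a confirmed recipe.

```python
# A minimal sketch, not an official NOAA example: inspecting sea-level
# pressure from a hypothetical local copy of a 20CRv3 netCDF file.
# Variable/coordinate names and the date are assumptions to verify.
import xarray as xr

ds = xr.open_dataset("prmsl.1880.nc")  # hypothetical 20CRv3 pressure file

# Pick the 3-hourly analysis nearest the October 1880 Sitka storm
# (Sitka sits near 57.1 N, 135.3 W; with 0-360 longitudes, 224.7 E).
snapshot = ds["prmsl"].sel(time="1880-10-26T12:00:00", method="nearest")
sitka = snapshot.sel(lat=57.1, lon=224.7, method="nearest")

# 20CR pressures are stored in pascals; convert to hectopascals.
print(f"Sea-level pressure near Sitka: {float(sitka) / 100:.1f} hPa")
```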

Weather past and present

Scientists are already putting their new weather time machine to good use.

Barbara Mayes Boustead, a meteorologist and climatologist with NOAA's National Weather Service, is using it to study the winter of 1880-81 famously described in The Long Winter, a fictionalized memoir by Laura Ingalls Wilder that draws on her memories of growing up in southeastern Dakota Territory.
She’s discovered that much of what Wilder wrote about that winter some 50 years later is accurate.
“The first snow fell in October exactly as Laura described it,” Boustead says.
“Her depiction of what happened is as good as a weather journal.”

Frequent snowstorms and brutal cold snaps gripped the central United States from October to April.
Using reanalysis, Boustead has managed to reconstruct the global atmospheric setup responsible for this terrible winter, including an extremely negative phase of the North Atlantic Oscillation, a pattern that’s “strongly tied to cold weather anywhere east of the Rockies in the U.S.,” she says.
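
The NAO index behind that diagnosis is conventionally computed as the difference of normalized sea-level pressure anomalies between a southern station (in the Azores or Portugal) and a northern one (in Iceland). The sketch below illustrates that station-based definition with made-up numbers; it is not Boustead's code, and the synthetic data merely stand in for real pressure records.

```python
# A minimal sketch of the classic station-based NAO index: the difference
# of normalized sea-level pressure anomalies between an Azores-region
# station and an Icelandic one. The data below are synthetic placeholders.
import numpy as np

def nao_index(slp_south: np.ndarray, slp_north: np.ndarray) -> np.ndarray:
    """Normalized south-minus-north pressure difference; strongly
    negative values signal a weak Azores High / Icelandic Low gradient,
    the pattern tied to cold outbreaks east of the Rockies."""
    z_south = (slp_south - slp_south.mean()) / slp_south.std()
    z_north = (slp_north - slp_north.mean()) / slp_north.std()
    return z_south - z_north

# Twelve hypothetical monthly-mean pressures (hPa) for each station:
rng = np.random.default_rng(0)
azores = 1020 + rng.normal(0, 3, 12)
iceland = 1005 + rng.normal(0, 5, 12)
print(nao_index(azores, iceland))
```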

Boustead is continuing to use NOAA’s weather time machine to study other extreme events described in Wilder’s books.
Other scientists, meanwhile, are interested in what it can tell us about Earth’s most violent weather today.

Colorado State University hurricane researcher Phil Klotzbach is hoping to use the new reanalysis to see how well today’s hurricane prediction tools work for 19th century storms.
For instance, forecasters often use El Niño to predict hurricane activity, since more intense El Niño years tend to coincide with more cyclones in the eastern Pacific and fewer in the Atlantic.
If those relationships were weaker 150 years ago, that may tell us something about how climate change is affecting hurricanes today.
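
One simple way to frame Klotzbach's question is as a comparison of correlations across eras: compute the correlation between an ENSO index and seasonal hurricane counts for the modern period and again for the 19th century, then see whether it has weakened. The sketch below shows that calculation in spirit only, with synthetic stand-in data rather than real indices or storm counts.

```python
# A minimal sketch of an era-by-era check, using synthetic placeholder
# data: how strong is the ENSO-hurricane-count relationship in a given
# period? Real work would use actual indices and storm records.
import numpy as np

def enso_hurricane_corr(enso: np.ndarray, counts: np.ndarray) -> float:
    """Pearson correlation between an ENSO index and hurricane counts."""
    return float(np.corrcoef(enso, counts)[0, 1])

rng = np.random.default_rng(1)
enso = rng.normal(0, 1, 60)  # hypothetical seasonal ENSO index values
# Atlantic counts tend to drop in El Nino years, hence the minus sign:
counts = np.clip(np.round(6 - 1.5 * enso + rng.normal(0, 1.5, 60)), 0, None)

print(f"correlation: {enso_hurricane_corr(enso, counts):+.2f}")
```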

Scientists can use the weather time machine to ask even bigger questions—like whether climate change is disrupting the Gulf Stream.
A lot of the studies investigating this topic “are based on reconstructions that only go back 50-60 years,” Slivinski says.
“If you go back 100 years before that, you get a bigger picture of: is this a trend, or is this just a blip?
So we’re really excited to look at that.”

Historical weather data remains pretty sparse for the Southern Hemisphere, particularly around Antarctica.
And the farther back in time we go, the scantier the records are in general.
That means there’s still plenty of room for NOAA to improve its time machine.

And with millions of weather records still gathering dust in archives around the world, scientists are hoping to do exactly that.

“Are [storms] faster or slower, stronger or weaker? Are heat waves lasting longer?” Compo says.
“We really want as many ships as we can to clear that fog.”


Sunday, October 27, 2019

Largest ship ever to transit Corinth Canal

On October 9, the cruise ship Braemar made history when it became the longest ship ever to pass through the Corinth Canal in Greece.
It takes precision navigation to steer a ship this size through such a narrow canal.
The 24-meter-wide canal connects the Saronic Gulf with the Gulf of Corinth.

Corinth canal in the GeoGarage platform (NGA nautical chart)

The Corinth Canal is just large enough to accommodate the 22.5-meter-wide Braemar, leaving roughly 75 centimeters of clearance on each side.

While transiting, the ship passed so close to the canal's sides that guests could almost touch the walls, says Clare Ward, director of product for Fred. Olsen Cruise Lines.
Braemar is currently on a 25-night tour of the Greek islands and the Peloponnese.
In the following video you can watch a first-person time-lapse of the ship's transit through the canal.


Saturday, October 26, 2019

Nazaré wave in numbers

Of all the surf spots on Earth, Nazaré is one of the most intriguing, and it has raised plenty of questions over the years.
How did this small fishing town become the European capital of big-wave surfing?
Through what mechanisms does the swell that strikes Praia do Norte produce such monstrous waves?
What force does the wave deliver when its lip crashes down?
What speed does the wave reach when it breaks?
To shed some light on these questions, the Brazilian production company "Canvas 24p", which also produced "Maya Gabeira, Return to Nazaré", broke the Nazaré wave down in numbers.
The lip strikes with an impact of 5 to 10 tons.
The wave breaks at more than 50 km/h.
Its height is the equivalent of a building of 8 to 10 floors, roughly 24 to 30 meters.

Friday, October 25, 2019

Source of vast oil spill covering Brazil's Northeast coast unknown

What is probably the biggest environmental disaster ever to affect the Brazilian coast continues to unfold, met largely with silence from the international community.
Countless gallons of crude oil spilled off Brazil's coast.
No one knows who did it, how or when.
Brazilian Navy: the oil is definitely not from Brazil.
A commander says the spill is drifting in from the Atlantic Ocean, with its source likely 500 to 600 kilometers off the Brazilian coastline.


From EcoWatch by Deutsche Welle

Brazil's main environmental agency said the source of a sprawling oil spill along the northeast coast remains unknown, but that the crude oil was not produced in the country.

The spill stretches over 1,500 kilometers (932 miles) of Brazil's northeast coast and has affected 46 cities and around one hundred of the country's finest beaches since it was first detected on Sept. 2.

 A boy emerges from the sea covered in oil at Cabo de Santo Agostinho, Pernambuco
Photo: Leo Malafaia / AFP

Brazilian television has shown slicks at sea and oil puddles along shores, as well as turtles covered in black tar.
Other marine life has also been found dead.


The Brazilian Institute of the Environment and Renewable Natural Resources, Ibama, said state oil company Petrobras analyzed the spill and determined it came from a single source.
However, it said, a molecular analysis of the crude showed that it was not produced in Brazil, the world's 9th largest crude producer at 3.43 million barrels a day.

Petrobras reported that "the oil found is not produced in Brazil.
Ibama requested support from Petrobras to work on beach cleaning.
In the coming days, the company will make available a contingent of about 100 people," the environmental institute announced in a statement.

 see Sentinel Vision portal
200: number of distinct locations with crude oil in the ocean and/or on beaches (confirmed locations; there are probably more not yet recorded).
77: number of cities that have seen oil on their beaches.
9: number of states affected.

Extent of Damage

The tests were done at the Petrobras Research Center (Cenpes) in Rio de Janeiro.
So far, 105 crude oil spills have been detected.

Since the beginning of September, Ibama, together with the Federal District Fire Department, Brazil's navy and Petrobras, has been investigating the causes.
Ninety-nine locations in 46 municipalities in 8 states have been affected, including Maranhão, Piauí, Ceará, Rio Grande do Norte, Paraíba, Pernambuco, Alagoas and Sergipe. In the Northeast, only the state of Bahia has not been affected yet.

 Imagery captured by Planet SkySats on September 27th.

Authorities were still conducting cleaning procedures on the Potiguar coast earlier this week.


Thursday, October 24, 2019

Massive citizen science effort seeks to survey the entire Great Barrier Reef

By collecting images and GPS data from citizen divers, scientists can get a better sense of the health of the entire Great Barrier Reef.
(Damian Bennett)

From Smithsonian by Jessica Wynne Lockhart

Only about 1,000 of 3,000 individual reefs have been documented, but the Great Reef Census hopes to fill in the gaps

In August, marine biologists Johnny Gaskell and Peter Mumby and a team of researchers boarded a boat headed into unknown waters off the coast of Australia.
For 14 long hours, they ploughed through more than 200 nautical miles, with a cached Google Maps image as their only guide.
Just before dawn, they arrived at their destination of a previously uncharted blue hole—a cavernous opening descending through the seafloor.

 Great Barrier Reef in the GeoGarage platform

After the rough night, Mumby was rewarded with something he hadn’t seen in his 30-year career.
The reef surrounding the blue hole had nearly 100 percent healthy coral cover.
Such a find is rare in the Great Barrier Reef, where coral bleaching events in 2016 and 2017 led to headlines proclaiming the reef “dead.”
“It made me think, ‘this is the story that people need to hear,’” Mumby says.

The expedition from Daydream Island off the coast of Queensland was a pilot program to test the methodology for the Great Reef Census, a citizen science project headed by Andy Ridley, founder of the annual conservation event Earth Hour.
His latest organization, Citizens of the Great Barrier Reef, has set the ambitious goal of surveying the entire 1,400-mile-long reef system in 2020.
“We’re trying to gain a broader understanding on the status of the reef—what’s been damaged, where the high value corals are, what’s recovering and what’s not,” Ridley says.

High-resolution bathymetry data trove released
A raft of detailed new bathymetry datasets has been published on the AusSeabed Marine Discovery Portal.
 
While considered one of the best managed reef systems in the world, much of the Great Barrier Reef remains un-surveyed, mainly owing to its sheer size.
Currently, data (much of it outdated) only exists on about 1,000 of the Great Barrier’s estimated 3,000 individual reefs, while a mere 100 reefs are actively monitored.

Researchers instead rely on models, an approach that has left gaps in knowledge.
In the last two years, our understanding of how ocean currents dictate the reef’s ability to survive has improved.
According to Mumby, spawn from as few as three percent of sites provides new life to over half of the reef.
Those key reefs, however, still need to be identified.
“You can’t prevent bleaching or cyclones, but you can protect critically important sources of larvae,” he says.
An accurate survey will help to manage coral-hungry Crown-of-thorns starfish, as well as inform future restoration project sites.

The majority of individual reefs that make up the Great Barrier Reef have not been directly surveyed.
(Damian Bennett)

The Great Reef Census is not the first attempt to use citizen science to survey the reef.
One such program, Reef Check, has been relying on citizens for 18 years—but it only monitors 40 key sites.
Eye on the Reef, an app from the Great Barrier Reef Marine Park Authority, encourages users to upload significant sightings, such as bleaching events, Crown-of-thorns starfish and mass spawning events.
But the new census will mark the first attempt to survey the entire reef system.

The ambitious research program, however, hinges on laypeople, meaning the data gathered could be of questionable scientific value.
Citizen science is notoriously problematic, owing to deviations from standard procedures and biases in recording.
For example, contributors to Eye on the Reef are more likely to record the spectacular (whale sharks, dugongs and humpback whales) than the common (starfish).

In 1992, Mumby’s first research project was analyzing reef survey data from citizen scientists in Belize.
The results, he admits, were less than brilliant.
“There are many citizen programs where the pathway between the data collected and the actual usage by management can be somewhat opaque,” he says.

Yet Mumby believes that the Great Reef Census is different.
The program has a clear connection to both research and policy, he says.
Unlike other citizen science efforts, unskilled volunteers won’t be asked to estimate or monitor coral cover.
Participants will do the simplest of grunt work: uploading 10 representative photos of their diving or snorkelling site with a corresponding GPS tag.
This basic field data will then be used by the University of Queensland, which is already using high-resolution satellite images and geomorphic modelling to map the reef and predict the types of local ecosystems present.
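
To give a sense of how little the volunteers are being asked to supply, the sketch below shows one way a photo's GPS tag can be read from its EXIF metadata in Python with the Pillow library. It illustrates the general technique rather than the census's actual ingestion pipeline, and the file name is hypothetical.

```python
# A minimal sketch, not the census's actual pipeline: recovering the
# GPS tag embedded in a photo's EXIF metadata with Pillow.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_from_photo(path: str):
    """Return (latitude, longitude) in decimal degrees, or None."""
    exif = Image.open(path).getexif()
    gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the GPSInfo IFD pointer
    if not gps_ifd:
        return None
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    def to_decimal(dms, ref):
        # EXIF stores degrees, minutes, seconds as three rationals.
        degrees = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -degrees if ref in ("S", "W") else degrees

    lat = to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"])
    lon = to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"])
    return lat, lon

# Hypothetical usage: a Great Barrier Reef photo would yield roughly
# (-18.3, 147.7).
# print(gps_from_photo("reef_site_photo.jpg"))
```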

National Oceanic and Atmospheric Administration diver Kelly Gleason injects a crown-of-thorns starfish with ox bile, a natural substance that kills the creature but does not harm the reef.
(Greg McFall, NOAA Dive Program)

The project is critically important to understanding the reef, but it comes with limitations, says David Kline, a coral reef ecologist at the Smithsonian Tropical Research Institute.
According to Kline, satellite imaging is only capable of penetrating to depths of about 5 meters, although some satellite mapping has achieved about 20 meters in ideal conditions.
This leaves the deep-water mesophotic reefs, which are less likely to suffer from bleaching and may be critical for reef recovery, under-studied.
Some are located as deep as 2,000 meters underwater.
“To really [survey] the entire Great Barrier Reef in a meaningful way, you need AUVs [autonomous underwater vehicles], drones, airplanes with multi-spectral imagery, and high-resolution satellites—and you need to be able to link the data between these different levels,” Kline says.

Kline is currently working with the University of Sydney's Australian Centre for Field Robotics, where engineers are training AUVs to gather high-resolution imagery of the reefs, including mesophotic reefs.
This information can then be used to train machine learning algorithms to map the entire system.

However, Kline says it will likely be another 5 to 10 years before a fleet of AUVs is ready to efficiently map large areas such as the Great Barrier Reef.
“Until then, we need ambitious projects to start making progress toward that goal,” he says.
The Great Reef Census and the satellite mapping from the University of Queensland are a good start.

But even if the census’s methodology leads to stronger scientific data than previous efforts, the reef’s prognosis is still bleak.
If global greenhouse emissions continue to rise at their current rate, it’s predicted that mass bleaching events, which have occurred four times in the past 20 years, will occur annually from 2044 onward.

If successful, the Great Reef Census will be the world's largest collaborative scientific survey.
And Ridley thinks if reports of the reef’s alleged death didn’t propel people to action, maybe reports of its ability to survive in the face of adversity will.

“We want the citizens to be helpful from a science perspective—but we also want people to give a shit,” Ridley says.
“The world’s not moving fast enough toward net-zero emissions.
Can the Great Barrier Reef be a point of inspiration, rather than a point of doom? I don’t know.
But we’re giving it a bloody shot.”
