Wednesday, July 28, 2021

Deadly coral disease sweeping Caribbean linked to wastewater from ships

 
A researcher off the Virgin Islands swims past a pillar coral showing signs of stony coral tissue loss disease (SCTLD).
Photographs: Lucas Jackson/Reuters


From The Guardian by Jewel Fraser


Researchers find ‘significant relationship’ between stony coral tissue loss disease and nearby shipping

A virulent and fast-moving coral disease that has swept through the Caribbean could be linked to waste or ballast water from ships, according to research.

The deadly infection, known as stony coral tissue loss disease (SCTLD), was first identified in Florida in 2014, and has since moved through the region, causing great concern among scientists.

It spreads faster than most coral diseases and has an unusually high mortality rate among the species most susceptible to it, making it potentially the most deadly disease ever to affect corals.
More than 30 species of coral are susceptible.
It was found in Jamaica in 2018, then in the Mexican Caribbean, Sint Maarten and the Bahamas, and has since been detected in 18 other countries.

In Mexico, more than 40% of reefs in one study had at least 10% of coral infected by SCTLD, and nearly a quarter had more than 30%.
In Florida, regional declines in coral density approached 30% and live tissue loss was upward of 60%.

Biologist Emily Williams moves corals between tanks as researchers try to find out more about an outbreak of SCTLD in Florida in 2019

Scientists have not yet been able to determine whether the disease is caused by a virus, a bacterium, a chemical or some other infectious agent, but the peer-reviewed study in the journal Frontiers in Marine Science supports the theory that ballast water from ships may be involved.
Conducted in the Bahamas by scientists at the Perry Institute for Marine Science, it found that SCTLD was more prevalent in reefs that were closer to the Bahamas’ main commercial ports, in Nassau and Grand Bahama, suggesting a likely link between the disease and ships.

Judith Lang, scientific director at the Atlantic and Gulf Rapid Reef Assessment project, which has been tracking the disease, said: “The prevailing currents in the Caribbean push seawater to Florida and not in the reverse direction, and the predominant wind direction is westward.
So human dispersal [to those three territories] in 2018 seems necessary.”

In 2017, the spread of deadly pathogens by ships when they discharge ballast water prompted the International Maritime Organization to implement the Ballast Water Management Convention, which requires that ships discharge their ballast water – used to maintain the ship’s stability – 200 nautical miles from shore in water at least 200 metres deep before entering port, to ensure they do not bring in harmful foreign pathogens.
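The two thresholds cited above lend themselves to a simple position check. The sketch below is purely illustrative, not an official IMO tool; the `exchange_permitted` helper and the example positions are invented for the example.

```python
# Illustrative sketch of the ballast-water exchange rule described above:
# discharge at least 200 nautical miles from shore, in water at least
# 200 metres deep. Not an official IMO tool; values are hypothetical.
def exchange_permitted(distance_from_shore_nm: float, water_depth_m: float) -> bool:
    """Return True if an exchange at this position meets the 200 nm / 200 m rule."""
    return distance_from_shore_nm >= 200 and water_depth_m >= 200

# Hypothetical example positions:
print(exchange_permitted(250, 3500))  # open ocean: True
print(exchange_permitted(150, 3500))  # too close to shore: False
print(exchange_permitted(250, 120))   # too shallow: False
```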

A research technician cuts a coral with a steel chisel to remove the section being killed by SCTLD, US Virgin Islands

In the Bahamas, SCTLD has spread rapidly since first being identified in December 2019.

Krista Sherman, senior scientist at the Perry Institute and a co-author of the recently published paper, said: “The disease is spread along about 75km of reef tract, about 46 miles – so for Grand Bahama that is a large structure of reef.
We’re talking about mostly covering the entire southern coastline of the island.”

The disease is also widespread in the coral reefs of New Providence, where the Bahamas’ capital, Nassau, and main port are located.
The study notes the presence of international container ships, cruise ships and pleasure boats at that location, as well as a fuel shipping station.

Infection rates among the most susceptible species were 23% in New Providence and 45% in Grand Bahama, and recent mortality rates have reached almost 43%.

With the exception of two species, the researchers found “there was a significant relationship” between the disease and proximity of reefs to the major shipping ports.
They noted “an increasing proportion of healthy colonies as distance from the port increased on both islands, and a greater proportion of recently dead colonies closer to the port than farther away”.
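The kind of distance-prevalence pattern the researchers report can be illustrated with a toy calculation. The survey records below are invented for the example and are not data from the study.

```python
# Toy illustration of comparing disease prevalence near and far from a
# port. Each invented record is (distance_to_port_km, is_diseased).
surveys = [
    (1, True), (2, True), (3, True), (4, False), (5, True),
    (10, False), (12, True), (15, False), (20, False), (25, False),
]

def prevalence(records):
    """Fraction of surveyed colonies showing disease."""
    return sum(d for _, d in records) / len(records)

near = [r for r in surveys if r[0] <= 5]   # reefs close to the port
far  = [r for r in surveys if r[0] > 5]    # reefs farther away

print(f"near-port prevalence: {prevalence(near):.0%}")      # 80%
print(f"far-from-port prevalence: {prevalence(far):.0%}")   # 20%
```

In this made-up sample, prevalence falls with distance from the port, which is the shape of the relationship the paper describes.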

The locations where SCTLD is prevalent in the Bahamas are all popular with tourists, recreational fishers and divers, Sherman said.

 
A research assistant applies an antibiotic ointment to a mountainous star coral affected by SCTLD near Key West, Florida

There are concerns that the coral disease could affect the country’s main fishery export, spiny lobster, said Adrian LaRoda, president of the Bahamas Commercial Fishers Alliance.
Although the lobster fishers work further out to sea, the industry would be affected if the reefs die.
The spiny lobster fishery brings in $90m (£66m) a year and employs 9,000 people.

“Any negative impact on our reefs would definitely drastically affect our spiny lobsters because the mature animals migrate [from the reefs] to the fish aggregating devices [a technique for catching fish],” LaRoda said.
He added that the lobsters’ reproduction rate and the food supply for juvenile lobsters in the reef would also be affected.

The Bahamian government has set up a national taskforce to tackle the problem.
Currently, the most effective treatment for the disease is the application of the antibiotic amoxicillin directly to the corals, which has seen some success in reducing mortality, but no realistic permanent solution is available.

According to Lang, rather than treating the symptoms, there is a need to tackle the possible human-made causes.
“Given a chance, nature can heal naturally,” she said.


Tuesday, July 27, 2021

The importance of surveying relic munitions and unexploded ordnance

Relic munitions and unexploded ordnance are a global problem, ubiquitously affecting European coastal waters.
The risk of possible detonations and environmental contamination hinders the development of many sectors of the blue economy — including offshore energy, shipping, aquaculture and tourism. 

From Euronews

For the police divers who work for the Schleswig-Holstein Bomb disposal unit in Kiel, their daily job is to go down into the murky cold sea to find lost weapons of war, a deadly legacy of the 20th century.

The coastal waters of Germany and other European countries are scattered with old munitions.
They rarely explode, but some can detonate if hit by an anchor.

Measures to protect the seafloor
 
On the day we visit Schleswig-Holstein's special unit, the bomb hunters are heading to the military port of Kiel.
Navy specialists have found a submerged explosive device close to the pier there.


 
As a rule, the divers try to extract the weapons for proper on-land disposal.
Only when that is not possible, are the bombs detonated on the seafloor.

Frank Ketelsen, Head of the diving operations at the Schleswig-Holstein Bomb disposal unit, tells us that if they must detonate a bomb in the water, they set up "air bubble curtains to protect marine mammals".

The tip of the iceberg
 
At the bomb unit headquarters, there are many samples of munitions from various periods and of different origins.
The collection is used to train new police officers.

Unexploded bombs found on land often make the news, but munitions on the seabed are rarely heard of; yet their quantity is staggering.

Oliver Kinast, Head of the Schleswig-Holstein bomb disposal unit, says that there's an estimated "1.6 million tonnes of munition from the World Wars in the North and Baltic Sea, 300,000 tonnes of which are in the Baltic Sea alone".
According to him though, those figures don't fully take into account munitions lost during battle operations.

The hunt for underwater munitions
 
Littorina, a scientific vessel from the GEOMAR institute, takes us along with a team of scientists to a large munitions dumpsite a few kilometres off the Baltic coast of Germany.
Two EU-funded projects, BASTA and ExPloTect, are testing new methods of finding bombs there: relic munitions are becoming a growing problem for marine industries and underwater ecosystems.

Aaron Beck is a researcher in aquatic biogeochemistry for the GEOMAR Helmholtz Centre for Ocean Research Kiel.
He has realised that the more they develop offshore resources, the more munitions they encounter and "the more they have to be cleaned up".
He thinks that "the biggest impetus for cleaning them up is wind farm installation, cable laying and so forth".

Polluting the sea
 
However, that is not the only problem these munitions pose. 
They are becoming big pollutants. 
"All of these munitions are in metal casings, and they all have been corroding for 70-80 years. We're coming up to a point where all the chemicals that have been inside will all start to come out", Aaron adds.

Much of the munitions on the seabed, both conventional and chemical, were deliberately disposed of in large numbers by the armed forces of many different countries.
Our knowledge of where exactly all these dumpsites are is patchy.

 
An AUV image of underwater munitions, chunks of TNT and other explosives
GEOMAR
 
Vehicles adapted for seafloor exploration
 
Autonomous underwater vehicles explore the seafloor quickly and efficiently.
They take pictures and measurements using a magnetometer.
Several of these devices can work simultaneously, which greatly reduces the costs.

On the seafloor, we see a hoard of decaying munitions that includes two-meter-long bombshells and bare chunks of toxic explosives.
Similar dumpsites can be found off the coasts of various countries in Europe and around the world.

AUV LUISE being lowered into the sea. Credit: Euronews
 
The BASTA project vehicle, LUISE, explores the seafloor along a programmed trajectory, transmitting collected data to a ship.
The detailed photos and magnetic measurements, together with results of previously conducted acoustic scanning, reveal the exact shape of the suspicious objects and the presence of metal in their composition.

Marc Seidel, a Geophysicist for GEOMAR tells us that by combining the camera footage and the magnetic signatures that they measure, they get a good idea of what the object might be. Chemical analysis gives even more clarity.

'Silver bullet' technology

Scientists from the ExPloTect project are developing a sampling system with special filters for catching dissolved particles of explosive materials from the seawater.
Back on the ship, the samples are further concentrated and analysed with a compact mass spectrometer that indicates the presence of various explosives.
This method can drastically speed up detection of underwater munitions.

According to Aaron Beck, the new chemical analysis cuts the whole process, from collecting a sample to getting the data, from two to three months to potentially just 15 minutes.
"We need that kind of rapid response", he says.

 
The ExPloTect system visualisation. Credit: K.U.M. Umwelt- und Meerestechnik Kiel

Developers call this new weapon in the fight against underwater munitions a "silver bullet".
It hits the target for many industrial sectors that now spend a lot of time and resources clearing unexploded ordnance (UXO) off the seabed.

The technology is also helping out the environment.
Onno Bliss, Business Development Manager for K.U.M. Umwelt und Meerestechnik, says that they will adapt the technology to different kinds of structures.
It will enable them to do long-term "permanent environmental monitoring at known UXO fields".
It is also a way to prioritize where munitions clearance should begin.

But how can the huge amount of data collected by underwater vehicles be processed?

Artificial intelligence

Egeos, a company based in Kiel, is developing a software platform that brings together new scientific data and relevant historic records like old archives documenting coastal military operations.
The algorithms look for relevant data patterns, suggesting areas that are likely to be contaminated with munitions.
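A toy version of that kind of pattern scoring might look like the following. The weights, grid cells and `contamination_score` helper are invented for illustration and do not reflect Egeos's actual software.

```python
# Toy illustration (not Egeos's actual algorithm): score seabed grid
# cells by combining historical-record mentions with new sensor hits.
# All weights and caps are invented for the example.
def contamination_score(historical_mentions: int, magnetic_anomalies: int,
                        chemical_hits: int) -> float:
    """Weighted score in [0, 1]; higher means more likely contaminated."""
    return (0.3 * min(historical_mentions, 3) / 3
            + 0.4 * min(magnetic_anomalies, 10) / 10
            + 0.3 * min(chemical_hits, 5) / 5)

# Hypothetical evidence per grid cell: (mentions, anomalies, chemical hits)
cells = {"A1": (2, 8, 4), "A2": (0, 1, 0), "B1": (3, 10, 5)}
ranked = sorted(cells, key=lambda c: contamination_score(*cells[c]), reverse=True)
print(ranked)  # most suspicious cells first: ['B1', 'A1', 'A2']
```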

To the CEO and founder of Egeos, Jann Wendt, "automation is definitely helping".
The process is still quite manual, but they're improving every day. 
"We are getting smarter from the side of data analytics. We are getting smarter from the perspective of autonomous underwater vehicles, autonomous sensors that are capturing this data and that makes the whole process cheaper", he explains.

Clearing the seabed is a task with huge economic potential.
Private companies are already developing large-scale projects for the recovery and proper disposal of underwater munitions.
According to Aaron Beck, there's a whole industry of people able to find and clean up munitions on the seabed.
All they really need is the funding to be able to do so.

Huge masses of underwater munitions are rusting and will release toxic content into the seas in the near future.
Can we stop this ticking time bomb before it’s too late? 
 

Monday, July 26, 2021

Getting to the bottom of trawling’s carbon emissions


Trawling nets like these disturb delicate ocean floor ecosystems and inadvertently release stored carbon. Credit: Alex Proimos, CC BY 2.0

From EOS by Nancy Averett

A new model shows that bottom trawling, which stirs up marine sediments as weighted nets scrape the ocean floor, may be releasing more than a billion metric tons of carbon every year.

Bottom trawling, a controversial fishing practice in which industrial boats drag weighted nets through the water and along the ocean floor, can unintentionally dig up seafloor ecosystems and release sequestered carbon within the sediments.
For the first time, researchers have attempted to estimate globally how this fishing technique may be remineralizing stored carbon that, as the seabed is tilled, ends up back in the water column and possibly the atmosphere, where it would contribute to climate change.
“The ocean is one of our biggest carbon sinks,” said Trisha Atwood, who researches aquatic ecology at Utah State University.
“So when we put in more human-induced CO2 emissions, whether that’s directly dumping CO2 into deep waters or whether that’s trawling and enhancing remineralization of this carbon, we’re weakening that sink.”

Atwood helped build a model that shows that bottom trawling may be releasing as much as 1.5 billion metric tons of aqueous carbon dioxide (CO2) annually, equal to what is released on land through farming.
Her work was part of a paper recently published in Nature that presents a framework for prioritizing the creation of marine protected areas to restore ocean biodiversity and maximize carbon storage and ecosystem services.

Estimating Carbon Loss from the Ocean Floor


To create the model, Atwood and her coauthors first needed to figure out how much of the ocean floor is dredged by trawlers.
They turned to data from the nonprofit Global Fishing Watch, which recently began tracking fishing activity around the world and compiled data on industrial trawlers and dredgers from 2016 to 2019.

The next step was to find data on how much carbon is stored in the world’s ocean sediments.
Because that information was not readily available, Atwood and colleagues built a data set by analyzing thousands of sediment cores that had been collected over the decades.

Last, they dug through the scientific literature, looking at studies that examined whether disturbances to the soil in coastal ecosystems, such as seagrasses, mangroves, and salt marshes, exposed carbon that was once deep in marine sediments and enhanced carbon production in the ocean.
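The three steps above (trawled area, sediment carbon, remineralization) combine into a first-order estimate of the kind the model produces. Every input value in the sketch below is a placeholder invented for illustration, not a figure from the paper.

```python
# First-order sketch of the estimate structure described above.
# All input values are hypothetical placeholders, not figures from the
# Nature paper; only the multiplication structure reflects the text.
trawled_area_km2 = 4.9e6           # hypothetical: seafloor trawled per year
carbon_density_t_per_km2 = 8000.0  # hypothetical: organic C in the disturbed layer
remineralized_fraction = 0.01      # hypothetical: share converted back to CO2

carbon_released_t = trawled_area_km2 * carbon_density_t_per_km2 * remineralized_fraction
co2_released_t = carbon_released_t * 44.0 / 12.0  # convert mass of C to mass of CO2

print(f"{co2_released_t:.2e} tonnes CO2 per year")
```

With these made-up inputs the result lands in the billions of tonnes, the order of magnitude the study reports; the real model replaces each placeholder with mapped trawling effort and sediment-core data.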
 
 
A group of twin-rigged shrimp trawlers in the northern Gulf of Mexico off the coast of Louisiana.
The trawlers are trailed by a plume of sediment, suggesting that their nets are scraping against the seafloor.
Credit: SkyTruth Galleries, CC BY-NC-SA 2.0


“We lean really heavily on that literature,” said Atwood.
“We used a lot of the equations [in previous papers] to build our model and extend it into the seabeds in these more open ocean locations.
And from there, we were able to come up with this first estimate.”

Their investigation did not attempt to determine whether sequestered carbon that has been released by bottom trawling remains in the water column or is released into the atmosphere, although they noted potential problems either way.
In the paper, the authors noted that it is likely to increase ocean acidification, limit the ocean’s buffering capacity, and even add to the buildup of atmospheric CO2.

Atwood and the lead author of the paper, Enric Sala, a conservation ecologist who is also a National Geographic Explorer-in-Residence, are working with Tim DeVries, who studies ocean biogeochemistry at the University of California, Santa Barbara, and scientists at NASA’s Goddard Space Flight Center to build atmospheric models to try to figure out where the released carbon goes.

Existing Trawling Data May Be Too Scant


Not everyone, however, is convinced that Atwood and Sala’s model on bottom trawling and loss of carbon sequestration in marine sediments is accurate.
Sarah Paradis, who is studying the effects of bottom trawling on the seafloor for her Ph.D. at the Institute of Environmental Science and Technology in Barcelona, is skeptical.

In an email to Eos, Paradis noted that since the 1980s, there have been fewer than 40 studies that address the impacts that bottom trawling has on sedimentary organic carbon.
These few studies are not enough to build a model on, she said, and in addition, the studies reach different conclusions.
Some studies have observed that bottom trawling decreases the organic carbon content of the seafloor, whereas others show it increases organic carbon.
In addition, Paradis wrote that lower organic carbon on the seafloor does not necessarily mean its remineralization to CO2.
Rather, it could simply mean loss of organic carbon through erosion, which means the carbon moves to another area of the seabed but very little is remineralized into CO2.
She pointed to several studies, including one that she was a part of, that showed loss of organic carbon through erosion.

“I want to emphasize that [the authors] address a very important issue regarding how bottom trawling, a ubiquitous and very poorly-regulated anthropogenic activity, is affecting the seafloor,” she wrote.
“But the values they propose are far from being credible.”

Atwood disagreed.
“We don’t need lots of studies on the effects of trawling because we built our model using decades of carbon cycling research,” she wrote in an email to Eos.
“Trawling is simply a perturbation that mixes and re-suspends sediments, leading to increases in carbon availability.
All we needed to know about trawling to apply a carbon model to it is where trawling occurs and how deep in the sediment the trawls go.”

In addition, Atwood said, “We in no way intended our model to be the end-all in the trawling conversation.
We hope that many more studies will come along that help produce more localized results.”


Sunday, July 25, 2021

Barrels



 
Five barrels on one wave!

Friday, July 23, 2021

First map of marine structures shows how much we've modified the oceans


Through structures like oil rigs, humans have made a big imprint on the world's oceans

From NewAtlas by Nick Lavars

With our long history of altering the environment through manmade structures, we humans sure have made our mark on the Earth in our relatively short time here.
Scientists in Australia have turned their attention to what this perpetual development means for the world’s marine environments, calculating the extent of our construction footprint on the oceans for the first time ever.

The research was carried out at Australia’s University of Sydney and the Sydney Institute of Marine Science, with the team collating data on marine-built structures of all kinds.
These include oil rigs, wind farms, the length of telecommunication cables, commercial ports, bridges and tunnels, artificial reefs and aquaculture farms, with the data painstakingly sourced from the individual sectors of these different industries.

The result is what the scientists call the first map of human development in the world’s oceans, revealing how much of the marine environment had been altered by our activity.
According to the team, a total of around 30,000 sq km (11,600 sq mi) has been modified by human construction, which amounts to 0.008 percent of the entire ocean.
But as lead author Dr Ana Bugnot explains, the effects are a lot more far-reaching than that.

“The effects of built structures extend beyond their direct physical footprint,” she tells New Atlas.
“Marine construction can modify surrounding environments by changing ecological and sediment characteristics, water quality and hydrodynamics, as well as noise and electromagnetic fields.”

Scientists have pieced together the first global map of marine construction
Bugnot et al., 'Current and projected global extent of marine built structures', Nature Sustainability

Dr Bugnot and her team drew on existing data and research to quantify the impact of these types of flow-on effects, and found that the footprint of these structures is actually two million square kilometers (770,200 sq mi), more than 0.5 percent of the ocean as a whole.
Among the more surprising revelations from the analysis were that 40 percent of the physical footprint of all structures can be attributed to aquaculture farms in China, and that noise pollution can carry up to 20 km (12 mi) from commercial ports.
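Both percentages follow from dividing the study's footprint figures by the total ocean surface area, roughly 361 million square kilometres:

```python
# Quick arithmetic check of the two footprint figures quoted above,
# using ~361 million km2 as the ocean's total surface area.
OCEAN_AREA_KM2 = 3.61e8

direct_footprint_km2 = 30_000      # physically modified area from the study
flow_on_footprint_km2 = 2_000_000  # footprint including flow-on effects

print(f"direct: {direct_footprint_km2 / OCEAN_AREA_KM2:.3%}")    # ~0.008%
print(f"flow-on: {flow_on_footprint_km2 / OCEAN_AREA_KM2:.2%}")  # ~0.55%
```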

While evidence of manmade alterations to the oceans dates back thousands of years, to the early construction of ports and breakwaters to protect low-lying coasts, the phenomenon began to accelerate around the mid-point of the 20th century, according to the team.
This construction mostly takes place in coastal areas, and to better understand this trend the team cast an eye to the future, assessing data on planned projects and assuming a business-as-usual approach.

"The numbers are alarming," Dr Bugnot says.
"For example, infrastructure for power and aquaculture, including cables and tunnels, is projected to increase by 50 to 70 percent by 2028.
Yet this is an underestimate: there is a dearth of information on ocean development, due to poor regulation of this in many parts of the world.”

The team hopes the study can draw attention to the importance of conserving marine environments, and that the findings can provide a starting point for further investigation and for tools to track these types of ocean construction projects on an ongoing basis.

“The estimates of marine construction obtained are substantial and serve to highlight the urgent concern and need for the management of marine environments,” says Dr Bugnot.
“We hope these estimates will trigger national and international initiatives and boost global efforts for integrated marine spatial planning.
To achieve this, it is important to ramp up efforts for detailed mapping of historical and existing marine habitats and ocean construction.”


Thursday, July 22, 2021

Canada (CHS) layer update in the GeoGarage platform

 
2 new nautical raster charts added

Ocean Data for All: Why we need to break down barriers for the global sharing of ocean data


From Institute of Oceanography, National Taiwan University by Linwood Pendleton

The ocean works at a global scale.
To understand changes at a global scale, we need to ensure that ocean data are readily available in global data sets.
Unfortunately, ocean data are often siloed – trapped in national databases, on laptops, and in logbooks.
How does Ocean Data For All solve the problem?

We have only one ocean

The Ocean supplies the air we breathe, regulates the climate, feeds billions of people, and supports an important and growing global economy.
There are 150 ocean countries, 83 of which are more ocean than land.
To manage this ocean, we need a robust global set of ocean data from all of this ocean.
Ocean data are used to monitor oceanic and climatic change, to assess fisheries health, to track biodiversity, and to alert the public of hazards like harmful algal blooms.
These data are also used to create models that warn of storms, tsunamis and changes in fish abundance, and to plan for ocean change.
Ocean data and the science needed to produce them are essential to plan for sustainable development and are so important that the United Nations declared that the next decade will be the UN Decade of Ocean Science for Sustainable Development.

“The ocean works at a global scale. To understand changes at a global scale, we need to ensure that ocean data are readily available in global data sets. Artificial intelligence and machine learning methods are needed to understand and predict global ocean processes. Yet, AI can only be applied if the data are all in one place, organized and easy to read by computers.”

Unfortunately, ocean data are often siloed – trapped in national databases, on laptops, and in logbooks.
Even when data are shared online, it is often difficult to find and access these online databases.
The United Nation’s Intergovernmental Oceanographic Commission has long been a home to such a global database of ocean data.
Stewarded by the International Ocean Data and Information Exchange, the World Ocean Database (WOD) and the Ocean Biodiversity Information System (OBIS) have provided global sources of ocean data.
But, even with the backing of most of the world’s ocean countries, the WOD and OBIS still do not fully reflect ocean conditions around the planet.

Head of northern right whale with NOAA Ship DELAWARE II in background. 
(Source: NOAA on Unsplash)

We don’t have the global data we need to manage the ocean

Today, fewer than 30 countries regularly contribute to the World Ocean Database – the oldest and most global set of essential ocean data (compared to more than 60 countries just 2 decades prior).
In 2018, the World Ocean Database had fewer than one data station per 1000 km2 for nearly three-quarters of the world’s coastal waters (the blue in the first map).
Without these data, global ocean and climate models do not account for conditions in many of the world’s sovereign ocean areas and we are unable to assess the impacts of climate change or progress towards certain Sustainable Development Goals.
Data regarding biodiversity are similarly lacking (second map).
We have a higher density of biodiversity data for Antarctic waters than for nearly all of the Global South.


“Barriers to data sharing must be removed.
Many countries have ocean data, but do not share these data.
In some cases, lack of human, technical, and financial capacity may limit data sharing.
Formal policies and informal practices may also slow data flow.
National security interests, real or perceived, may prevent ocean data sharing.
Of course, in some places there are simply no data to share.”

Leadership is needed to overcome barriers to ocean data sharing

With more than 120 ocean countries failing to regularly share ocean data and virtually no shared data for the vast majority of ocean countries, steps need to be taken to fill the gaps in our understanding of ocean conditions and change.
Without these data, it will be difficult to achieve sustainable development goals and a sustainable blue economy.

To begin to address these hurdles to ocean data sharing, a unique partnership – Ocean Data For All - has been created by the Intergovernmental Oceanographic Commission, the High Level Panel for a Sustainable Ocean Economy, the World Economic Forum’s Future of a Connected Planet Program, and the Centre for the 4th Industrial Revolution – Ocean.


These partners are working to analyze patterns of ocean data sharing to understand whether, when, and where increased sharing of ocean data could be achieved through technological, political, legal, cultural, financial, or other means.
While the UN Decade of Ocean Science for Sustainable Development is focused on increasing the capacity to create new data through science, the Ocean Data for All project seeks to make sure that these new data and knowledge are shared in ways that are global and open – allowing scientists, planners, and industries around the world to benefit from a more global understanding of ocean processes and conditions.
To go from analysis to action, however, will require the involvement of leaders in government, industry, technology, and philanthropy all of whom are needed to supply the intellectual and financial resources to break down these barriers.

Through opportunities like Taiwan’s Ocean Challenge event in Kaohsiung, I am reaching out to students, governments, and leaders in technology and business to enlist their help in this global effort.
We all have a role to play in ocean science.

Wednesday, July 21, 2021

Great white sharks at times enter San Francisco…

From MercuryNews by Paul Rogers

Windsurfers, fishing boats and cargo ships aren’t the only traffic in San Francisco Bay.
Great white sharks are there sometimes, too.

In what is believed to be the first scientific confirmation of white sharks in San Francisco Bay, researchers from Stanford University, the University of California-Davis and other organizations put satellite and acoustic tags on 179 white sharks in Northern California waters from 2000 to 2008.
They found that most of the sharks migrated thousands of miles every year, from California to as far away as Hawaii, and that five of them swam underneath the Golden Gate Bridge and into bay waters in 2007 and 2008.

It isn’t known exactly where the sharks went once in the bay, only that their acoustic tags were detected by receivers anchored to the bay floor between the Golden Gate Bridge and Alcatraz Island.

 
“All we know so far is that they are poking their heads across the Golden Gate,” said Barbara Block, a professor of biology at Stanford’s Hopkins Marine Station in Pacific Grove who helped lead the tagging study.
“I doubt they are coming in very far because they are salt water animals.”

There are nearly a dozen species of sharks known to live in the bay, including leopard, sevengill and other, mostly docile bottom-dwelling varieties.
Researchers have wondered for years whether great whites — apex ocean predators that can reach 15 feet long or more and weigh 4,000 pounds — are there too.
But there have been no documented cases of white sharks ever attacking seals, sea lions or other animals in the bay, let alone people, said John McCosker, a veteran shark researcher at the California Academy of Sciences.

“The conditions aren’t good. The water isn’t clear,” McCosker said.
“You can’t see your dinner in San Francisco Bay.”

Contrary to sensationalism from Hollywood films, attacks by white sharks are rare.
Far more people die each year from dog bites or hitting deer with cars.
Since 1952, there have been 99 white shark attacks in all of California, according to McCosker’s records, and 10 fatalities.
Most were surfers or divers in places where elephant seals and sea lions — some of the white sharks’ primary prey — are frequent.

The only documented white shark fatality in San Francisco came on May 7, 1959, when Albert Kogler Jr., 18, died while swimming in less than 15 feet of water after he was attacked off Baker Beach, about one mile west of the Golden Gate Bridge.

The president of one prominent swimming club said Tuesday that the news that white sharks occasionally come into the bay is interesting, but not that surprising.
“The conventional wisdom has always been that they don’t come beyond the bridge.
I don’t think I ever believed it,” said Ken Coren, president of the Dolphin Club, in San Francisco.

Coren, an open water swimmer for more than 25 years, said bay swimmers think about sharks, but know the risk is very low.
He said he doesn’t think the news will scare many swimmers from their usual routines.
“The most deadly thing we face are propellers,” he said.

The latest research was published today in the “Proceedings of the Royal Society B,” which is the biological research journal of Great Britain’s national academy of sciences, the Royal Society.

In it, scientists used high-tech acoustic tags to find that white sharks display amazingly precise migration patterns.
They are most numerous off Northern and Central California between September and November, and almost all gone from April to July.

While in the area, white sharks congregate at four key sites, each of which supports large colonies of seals and sea lions: Southeast Farallon Island, Tomales Point, Año Nuevo Island, and Point Reyes National Seashore.

When the sharks leave each spring, satellite tags show, they swim as far as Midway Island and Hawaii, but also mass in large numbers in an area of open ocean between Hawaii and Baja California known as “the White Shark Café,” where researchers believe they may mate and forage for food.
In the fall, they return to the same places they left a year earlier.

“We tend to think of white sharks as animals that wander oceans aimlessly,” said Block.
“What we’re learning is how selective a predator they are.
They go up to 4,000 miles in a trip and come back to within half a mile of where they left.”

It isn’t known how white sharks navigate, although their ability to sense electromagnetic energy in the ocean may play a role.

By taking DNA samples, the researchers also found that white sharks in Northern California are a genetically distinct population from other groups of white sharks in the world, such as those off Australia and South Africa, and that they may have descended from Australian great whites 200,000 years ago.

The five sharks that came into San Francisco Bay represent less than 1 percent of the nearly 64,000 detections researchers picked up along the California coast from acoustic tags.
Some details are known about the bay visits, however.
One shark was detected one time inside the Golden Gate Bridge in December 2008; a second shark was detected one time, in April 2008; a third shark was detected four times, between November 2007 and November 2008; and a fifth shark was detected three times, in August and September 2007.

Salvador Jorgensen, a postdoctoral researcher at Stanford who is the lead author on the study, said that having tagging data could in the future help better inform the public around the world about white shark behavior, reducing the risk of encounters.
It may also help researchers one day get a population estimate for great whites, whose numbers are believed to be on the decline.

In addition to Jorgensen and Block, the research team included Carol Reeb and Christopher Perle of Stanford; Peter Klimley and Taylor Chapple of UC-Davis; Scot Anderson of Point Reyes National Seashore; Adam Brown of PRBO Conservation Science; and Sean Van Sommeran and Callaghan Fritz-Cope of the Pelagic Shark Research Foundation.


Tuesday, July 20, 2021

‘They just left us’: Greece is accused of setting migrants adrift at sea

A raft carrying Afghan women and children who were rescued by the Turkish Coast Guard in the Aegean Sea this month.Credit...Ivor Prickett for The New York Times

From NYTimes by Carlotta Gall

Frustrated by more than a year of picking up people they say Greece has illegally pushed out, Turkish officials invited journalists to witness rescues firsthand.

ABOARD A TURKISH COAST GUARD VESSEL — Wet and shaken, women and children were pulled aboard the Turkish patrol boat first, then the men and more children.

A 7-year-old girl in striped leggings, Heliah Nazari, shivered uncontrollably as she was set down on the deck. An older woman retched into a plastic bag.

They were two of 20 asylum seekers from Afghanistan who had been drifting in the dark, abandoned in rudderless rafts for four hours before the Turkish Coast Guard reached them.

Just hours earlier they had been resting in a forest on the Greek island of Lesbos when they were caught by Greek police officers who confiscated their documents, money and cellphones and ferried them out to sea.

“They kicked us all, with their feet, even the children, women, men and everyone,” said Ashraf Salih, 21, recounting their story.
 “They did not say anything, they just left us. They weren’t humane at all.”

The Turkish Coast Guard officials described it as a clear case — rarely witnessed by journalists — of the illegal pushbacks that have now become a regular feature of the dangerous game of cat and mouse between the two countries over thousands of migrants who continue to attempt the sea crossing from Turkey to the Greek islands as a way into Europe.
 
Migrants who had been set adrift in the Aegean were rescued by the Turkish Coast Guard this month. Thousands of people continue to attempt the sea crossing from Turkey to the Greek islands as a way into Europe.
Credit...Ivor Prickett for The New York Times

Since a mutual agreement broke down last year, Turkey and Greece have been at loggerheads over how to deal with the continuing flow of migrants along one of the most frequented routes used since the mass movement boomed in 2015.

Then, one million migrants, mostly Syrians fleeing the war in their country, led the surge into Europe. The flow is much reduced — 40,000 have arrived by sea into Europe so far this year — but it is now dominated by Afghans, raising fears that the escalating conflict there and the American withdrawal of troops could bring larger numbers.

For more than a year, Turkey has turned a blind eye to the migrants, allowing them to try the sea crossing to Greece.
That country has resorted to expelling migrants forcibly, disabling their boats and pushing them back to Turkey when they are caught at sea.

Increasingly, Greece is even removing asylum seekers who have reached its islands, forcing them into life rafts and towing them into Turkish waters, as the compassion many Greeks had shown during earlier waves of migration has given way to anger and exhaustion.

The tactic of so-called pushbacks has been roundly denounced by refugee organizations and European officials as a violation of international law and of fundamental European values. 
The Greek government denies that it has pushed back any migrants, while insisting on its right to protect its borders.

 “It is obvious they were pushed back,” said Senior Lt. Cmdr. Sadun Ozdemir, the Northern Aegean group commander in the Turkish Coast Guard, after his crew rescued a group of Afghans in the Aegean. Credit...Ivor Prickett for The New York Times

“Numerous cases have been investigated, including by the European Union,” Notis Mitarachi, minister for migration and asylum in Greece, said last week, “and reports have found no evidence of any breach of E.U. fundamental rights.”

Philippe Leclerc, head of the United Nations refugee agency in Turkey, said his office had presented evidence, including “accounts of violence and family separations” to the Greek ombudsman, requesting the cases be investigated, without result.

The two countries are at an impasse, with Turkey demanding that Greece end the pushbacks first, and Greece demanding that Turkey first take back 1,400 migrants whose asylum requests have been rejected, Mr. Leclerc said.

President Recep Tayyip Erdogan of Turkey has been widely accused of precipitating the crisis, when in February last year he announced he was opening his country’s borders for migrants to travel to Europe.

Turkish officials, who spoke on condition of anonymity because they were not permitted to speak to the news media, said the step was taken to draw world attention to Turkey’s own burden in hosting some four million asylum seekers from other nations’ wars — more than 3.6 million Syrians, along with 400,000 other people from Afghanistan, Asia and the Middle East.
It is the single largest refugee community in the world, and has taken over whole suburbs of Istanbul and the capital, Ankara.

But the action was interpreted in Greece as a kind of blackmail to extort money and other concessions from the European Union on a range of issues.
 
Migrants trying to cross the Aegean from Turkey to the Greek island of Lesbos were rescued by the Turkish Coast Guard after their engine stalled this month.
Credit...Ivor Prickett for The New York Times

It led to clashes between migrants and Greek border guards on the Turkish-Greek border and caused the conservative Greek government to adopt aggressive new measures against migrants, including the pushbacks.

Greece has struggled to handle the influx of more than 100,000 asylum cases and overcrowded refugee camps on its islands while other European countries have done little to share the burden.

But Turkish officials stress that the numbers Greece is handling are nothing compared with the scale of the strain on Turkey.
Resentment against the migrants in Turkey has grown as economic conditions have worsened, threatening Mr. Erdogan’s political standing.
He, in turn, has railed against wealthier states shirking their responsibilities toward the world’s refugees and not doing enough to end the conflicts that cause them to flee.

Frustrated after more than a year of picking up thousands of migrants left by their Greek counterparts, the Turkish Coast Guard invited journalists recently on a patrol boat to witness what they said were the Greek violations.

“It is obvious they were pushed back,” Senior Lt. Cmdr. Sadun Ozdemir, the Northern Aegean group commander of the Turkish Coast Guard, said after his crew had rescued the 20 Afghans. 
“They did not come from the sky.”

Some of the rescued Afghans back on shore in Dikili, Turkey. Heliah Nazari, right, was recovering in dry clothes supplied by the International Organization for Migration
Credit...Ivor Prickett for The New York Times
 
He said the Greek vessel had probably towed the rafts deep into Turkish territorial waters before cutting them adrift, which he said was an additional violation.

One raft was overloaded and the thin bottom leaking, he said. 
“That boat could have sunk in one or two minutes, and possibly they do not know how to swim and they could have drowned.”

As often happens, the Turkish crew received an email from their Greek counterparts that migrants were drifting in the area — a seeming effort by the Greeks to mitigate loss of life but something the Turks say is an implicit sign of Greek culpability.

Tommy Olsen, who runs the Aegean Boat Report, a Norwegian nonprofit that tracks arrivals of migrants on the Greek islands, confirmed through photographs and electronic data that members of the group had been on the island of Lesbos that day.

A local photographer also took pictures of some of them in front of a Greek church, a landmark on the south of the island.
Another photograph showed Mr. Salih and his mother resting beside the fence of a house, with the girl in striped leggings drinking juice and smiling.

Rescued migrants on a Turkish Coast Guard ship. Stranded after their engine failed, most of them said they had been pushed back multiple times by the Greek authorities. 
Credit...Ivor Prickett for The New York Times
 
The pushbacks also damage relations between the Greek and Turkish Coast Guards and interfere with work against drug and people trafficking, Commander Ozdemir said.

Commercial ships as well as navy and coast guard vessels pass through the northern Aegean and could easily hit the small rafts and boats, which have no lights or means of navigation, he added.
“This thing we call ‘pushback’ in English is a very innocent expression,” he said. But the action was anything but, he said, hoping to convey “how desperate the situation is.”

Interviews with migrants rescued by the Turkish Coast Guard in several incidents over the course of four nights revealed the scale of Greek violations and the growing desperation of migrants.

One group of 18 people, from Africa and the Middle East, were rescued after their engine broke down.
 
A Turkish Coast Guard ship confronted a Greek Coast Guard vessel near the island of Lesbos this month. Commander Ozdemir said the pushbacks damaged relations between the two services.
Credit...Ivor Prickett for The New York Times

Muhammad Nasir, 29, said he had fled the war in Yemen after his father was killed. He was trying to join his uncle in Britain.
He said he had been pushed back seven times by the Greek Coast Guard; this was his eighth attempt.

“For three months, I have been trying,” he said, his voice cracking. “I feel disappointed. I cannot stay in Turkey, I do not have a job and my family are waiting for me to help them.”

An Afghan teenager with an injured leg, Reyhan Ahmedi, 16, was picked up after six hours at sea alone after being expelled by Greece.
He said he had fled his home in the town of Gereshk, in southern Afghanistan, as attacks from the Taliban escalated.
When he got news that his home had been bombed and was unable to reach his parents, he decided to make a bid to reach Europe.

“I thought I should take myself away from Afghanistan and find a better future for myself,” he said. 
“I want to get an education.”

The Aegean, near Cunda, Turkey. Some 40,000 migrants have arrived by sea into Europe so far this year, mostly Afghans.
Credit...Ivor Prickett for The New York Times


Monday, July 19, 2021

British Isles & misc. (UKHO) layer update in the GeoGarage platform

Mapping quest edges past 20% of global ocean floor

image copyrightGEBCO / Vicki Ferrini
Seamounts off Brazil: It's around underwater mountains that marine life will congregate

From BBC by Jonathan Amos

The quest to compile the definitive map of Earth's ocean floor has edged a little nearer to completion.

Modern measurements of the depth and shape of the seabed now encompass 20.6% of the total area under water.
It's only a small increase from last year (19%), but like everyone else, the Nippon Foundation-GEBCO Seabed 2030 Project has had to cope with a pandemic.
The extra 1.6% is an expanse of ocean bottom that equates to about half the size of the United States.

The progress update on Seabed 2030 is released on World Hydrography Day.

image copyright Nippon Foundation-GEBCO Seabed 2030 Project
The black is where we still need modern measurements at a reasonable resolution


The achievement to date still leaves, of course, four-fifths of Earth's oceans without a contemporary depth sounding.
But the GEBCO initiative is confident the data deficit can be closed this decade with a concerted global effort.
"It doesn't matter whether you operate a high-tech fleet of ships or you're just a simple boat-owner - every piece of data matters in this giant jigsaw we're making," said project director, Jamie McMichael-Phillips.
"If you're going to sea for whatever reason, switch on your echosounder.
Even if you're just a yachtsman, a recreational sailor - then low-tech data-logging equipment is only a few hundred dollars, with the price coming down all the time.
Fit it, plug in your GPS, plug in your echosounder and help us get to 100% coverage by 2030," he told BBC News.

 
An elevated section of seafloor in front of a Greenland glacier helps protect it from incursions of warm ocean water that would otherwise melt the ice

When Seabed 2030 was launched in 2017, only 6% of the oceans had been mapped to modern standards.
So, it is possible to make swift and meaningful gains.

For example, a big jump in coverage would be achieved if all governments, companies, and research institutions released their embargoed data.
There's no estimate for how much bathymetry (depth data) is hidden away on private web servers, but the volume may be very considerable indeed.

Those organisations holding such information are being urged to think of the global good and to hand over, at the very least, de-resolved versions of their proprietary maps.

Seabed 2030 is not seeking 5m resolution of the entire floor (close to something we already have of the Moon's surface).
One depth sounding in a 100m grid square down to 1,500m will suffice; even less in much deeper waters.
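As a back-of-the-envelope illustration of that criterion (a hypothetical sketch, not Seabed 2030's actual tooling), one can bin soundings into 100 m cells and count the fraction of cells holding at least one measurement:

```python
# Toy illustration: a 100 m grid cell counts as "mapped" once it
# contains at least one depth sounding (the criterion for <= 1,500 m).

CELL = 100.0  # metres

def coverage(soundings, width_m, height_m, cell=CELL):
    """Fraction of grid cells holding at least one (x, y) sounding."""
    mapped = {(int(x // cell), int(y // cell)) for x, y in soundings}
    total = (width_m // cell) * (height_m // cell)
    return len(mapped) / total

# Three soundings in a 1 km x 1 km patch; two fall in the same cell,
# so only 2 of the 100 cells count as mapped.
pts = [(50, 50), (60, 70), (950, 950)]
print(coverage(pts, 1000, 1000))  # -> 0.02
```

The same logic, run over real bathymetry archives, is what lets the project quote a single global coverage percentage.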

The UK's new polar ship, the RRS Sir David Attenborough, is equipped to map millions of square km of ocean bottom over its career.
The above image shows the ship's hull in dry dock.
The yellow rectangle in the centre is a cover made of a synthetic material over the 8m-long array of transmit transducers for the deep-water multibeam echosounding system.
The receive array is at right angles to this, just behind the people in the photo, but difficult to see because it is covered by a material that has much less contrast with the rest of the hull.
Transducers for several other systems are also visible.
The yellow square next to the individuals' heads is the transmit transducer for the acoustic sub-bottom profiling system, which provides a profile showing the layers in the upper few metres of sediment.
The past 12 months have been hampered by the limitations Covid has placed on research cruises.
This is unfortunate because it's the science expeditions that will often visit those parts of the oceans where few other ships venture - into the Southern Ocean, for example.

That said, there've been some notable contributions of late from the research sector.
One of the most significant has come from the DSSV Pressure Drop, a ship funded by the Texan billionaire and adventurer Victor Vescovo.
His expeditions to the very deepest parts of Earth's oceans mapped an area equivalent to the size of France in just 10 months (an area the size of Finland within this had never been seen before).

Better seafloor maps are needed for a host of reasons.
They are essential for navigation, of course, and for laying underwater cables and pipelines.
They are also important for fisheries management and conservation, because it is around the underwater mountains that wildlife tends to congregate.
Each seamount is a biodiversity hotspot.

In addition, the rugged seafloor influences the behaviour of ocean currents and the vertical mixing of water.
This is information required to improve the models that forecast future climate change - because it is the oceans that play a pivotal role in moving heat around the planet.

Fugro is operating 12m-long USVs as part of its Blue Essence fleet

If Seabed 2030 is to meet the end-of-decade target, it will have to leverage the emergence of robotic ships and boats.
Uncrewed Surface Vessels (USVs) are becoming increasingly popular.
Fugro, one of the world's leading marine geophysical survey companies, is building a fleet of USVs based on the Ocean X-Prize-winning SeaKit roboboat.

Fugro has two such boats surveying and inspecting oil-and-gas, wind farm, and power installations - in European waters and off Western Australia.
Together, they are known as the Blue Essence fleet.
These boats will even deploy and recover robotic subs.

"These drone-type, or uncrewed-type, solutions will make it easier to gather more data.
They can do a lot more work in the same amount of time," said Ivar de Josselin de Jong, Fugro's solution director for remote inspection.
"Technology is moving very fast. It's unbelievable what we do now compared with what we did 10 years ago.
"We can operate a USV in Australia from our remote operations centre in Aberdeen.
"We've got 26 crewed vessels at the moment and we want to gradually replace them - partially at least.
Where we don't need 'hands' in an offshore environment, we will move to uncrewed solutions, to reduce the health and safety exposure and to reduce carbon footprints," he told BBC News.
 
Unmapped areas of the ocean floor, as per the 2020 dataset. Due to COVID, the coverage only progressed from 19 percent last year to 20.6 percent this year  
Credit: Andrew Douglas-Clifford / The Map Kiwi.

To mark World Hydrography Day, The Nippon Foundation-GEBCO Seabed 2030 Project has entered a technical cooperation agreement with the UK Hydrographic Office and Teledyne CARIS, a leading developer of marine mapping software.
Building the definitive map of Earth's ocean floor means collating colossal volumes of data.
The new tie-up will see a new artificial intelligence tool being used to clean bathymetric data of "noise", making it easier to pull out reliable depth soundings.


Sunday, July 18, 2021

Interactive visualization of GEBCO 2021 grid released

Building on Colin Ware's work developing BathyGlobe, Paul Johnson of the University of New Hampshire has created an interactive globe of the GEBCO 2021 grid which is publicly available to use as a scientific tool or educational aid.
The web app allows users to quickly zoom in on any location, make perspective plots, change basemaps, and turn on and off a layer showing the direct measurements.
The layers are tile services to speed up transfer and are overlaid on a GEBCO 2021 elevation layer with a 7x vertical exaggeration.
If you wish to make perspective plots you will need to use the 3rd button on your mouse or else click on the 4th icon down on the left to put the viewer in "Navigate" mode.
Once it is in that mode you will be able pan/zoom/tilt the globe to your liking.
Seabed 2030 Project Director Jamie McMichael-Phillips says: "This web app is a fantastic contribution to GEBCO from UNH, and with the potential for increased functionality in the future, it will be an important tool for scientists and non-experts alike as the GEBCO grid grows.
The power of visualization helps us to understand how far we have already got and how much further we have to go."

Saturday, July 17, 2021

A Beautiful, high-resolution map of the Internet (2021)

 

This is the extraordinary work of Martin Vargic, an artist and writer who spent more than a year creating a map of the internet in 2021, graphically inspired by historical maps.
The map is packed with detail: not only websites, but also technologies, companies and digital concepts.
The size of each territory reflects the site's popularity (its Alexa ranking), and the colors follow each site's own branding. The map is so rich that one could study it for hours.


Friday, July 16, 2021

How weather forecasts are made

 
 
From DiscoverMag  by Allison Klesman

Meteorologists are better at their jobs than you might think.
Here's how heaps of data are turned into a forecast relevant to you.


Expect rain.
Those two simple words can ruin picnic plans or herald rescue for drought-stricken crops.
Few things in our lives are as universal as the weather.

“It’s what’s going on in the atmosphere all around us all the time,” says Russ Schumacher, Colorado State climatologist and director of the Colorado Climate Center.
“Storms and all the other interesting things that Earth’s atmosphere brings us have this big effect on our daily lives in a lot of ways.” But even though we tune in to local news stations or check apps to find out what the weather will bring, we don’t always trust the forecasts.
You’ve probably heard the joke: Meteorology is the only occupation where you can be wrong all the time and still get paid for it.

In reality, weather forecasts have improved in leaps and bounds in just the past few decades.
And meteorologists in pursuit of an ever-more-perfect forecast continue to push what’s possible toward its theoretical limit.

Making the Weather

Before we can predict the weather, we have to understand where it comes from.
To do that, we must look to the sky.

Earth is enveloped in an atmosphere of mostly nitrogen, oxygen and water vapor.
This air, like liquid water, behaves as a fluid.
As air flows from one place to another, it carries its properties with it, changing the temperature, humidity and more.
Weather is simply the byproduct of our atmosphere moving heat from one place to another.

Cooler air is dense and can’t hold much moisture; warmer air is less dense and can hold more water.
When regions of air with different temperatures and densities meet, the boundary is called a front.
Sometimes these cloudy clashes can cause rain, as the cooling warm air is forced to drop its water.
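The rule that warmer air can hold more moisture can be quantified with the empirical Tetens approximation for saturation vapour pressure. This is a sketch using commonly quoted constants, which are an assumption of the example rather than figures from the article:

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Tetens approximation, in hPa (valid for temperatures above freezing)."""
    return 6.1078 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

for t in (0, 10, 20, 30):
    print(f"{t:2d} C -> {saturation_vapour_pressure(t):5.1f} hPa")
```

Air at 30 °C can hold roughly seven times the moisture of air at 0 °C, which is why a warm air mass forced to cool at a front has so much water to drop.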

It’s not just fronts that can make it rain; convection can also drive precipitation.
As warm, moist air rises, it also cools, and its water condenses onto airborne particles such as dust.
These droplets are carried aloft by rising air, growing larger and larger until they become too heavy and fall back to Earth.

When that happens, grab your umbrella.
Once a storm has formed, if there’s nowhere for it to get more moisture from the ground or the air, it will peter out as it lumbers along.
If it finds more warm air and moisture — like a hurricane does as it moves across the ocean — it will grow and grow.

Forecasting Basics

With so many factors involved, it may seem impossible to predict what weather is on the horizon.
But that’s far from the case.
“Weather forecasting is one of only a few fields where we can accurately forecast the evolution of a system.
We cannot do that in economics or sports,” says Falko Judt, a research meteorologist at the National Center for Atmospheric Research in Boulder, Colorado.

Doing so depends on reliable observations.
Scientific weather observations began in the Renaissance, when barometers and thermometers were invented.
European scientists of old, like Galileo, used these instruments to take the types of measurements that would one day explain weather events.
By the late 1800s, rudimentary weather maps had come into common use.

But early forecasts were limited and relied on persistence, or the assumption that a system’s past would dictate its future behavior.
“If a storm system is in Kansas one day and Missouri the next, then by persistence you can say it’ll be in Illinois the next day,” explains Bob Henson, a meteorologist who writes for Weather Underground.
Persistence is an OK way to predict the weather when conditions are constant — when a storm trundles along without breaking up or the local climate changes little day to day, say, in Southern California.
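A persistence forecast is almost trivial to write down. This sketch, with made-up numbers, simply carries the latest observation forward:

```python
def persistence_forecast(observations, days_ahead):
    """Persistence: tomorrow looks like today, and so does every day after."""
    latest = observations[-1]
    return [latest] * days_ahead

# Hypothetical daily highs in degrees C for a stable climate
# like Southern California's.
highs = [24, 25, 24, 24]
print(persistence_forecast(highs, 3))  # -> [24, 24, 24]
```

The method's weakness is visible in the code itself: nothing in it can represent a front moving in or a storm forming by convection.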

But this simple technique doesn’t account for changing conditions, such as storms that form quickly through convection (typical for thunderstorms) or moving fronts that change the temperature.
Luckily, we have newer, better ways to predict the future.
Today’s weather forecasts aren’t made by people looking at weather maps and yesterday’s highs and lows — they’re made by machines.

Modern Weather Prediction

Meteorologists use a process called numerical weather prediction to create forecasts by inputting current conditions — which they call the “nowcast” — into computer models.
The more current and accurate information available to these models, the better the forecast will be.
Ground radar, weather balloons, aircraft, satellites, ocean buoys and more can provide three-dimensional observations that a model can use.
This allows meteorologists to simulate what the atmosphere is currently doing and predict what will happen in the next few days or, for some models, hours.

Weather models divide a region, say a single state or even the whole globe, into a set of boxes, or cells.
The size of these cells — the resolution of the model — affects its forecasting accuracy.
Large boxes mean poor resolution, an inability to tell what's happening over small areas, but they give a broad picture of large-scale weather trends over long timelines.
This big-picture forecast is helpful when you want to know how a big storm will move across the U.S. over the course of a week.

Smaller boxes mean higher resolution, which can forecast smaller storms.
These models are more expensive in terms of computing power, and only run to the one- or two-day mark to tell people whether it might storm in their local area.
Although all models are based on the same physics, each translates those physics into computer code differently, says Judt.
Some models might prioritize certain kinds of data — such as wind speed, temperature and humidity — over others to generate predictions, or simulate physical processes slightly differently than another model.
That’s why two models might spit out slightly different results, even with exactly the same starting observations.
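The cost of higher resolution can be made concrete with a rule of thumb (an assumption of this sketch, not a figure from the article): halving the cell size quadruples the number of cells per horizontal layer, and because the time step must usually shrink in proportion to the cell size, the total compute cost grows roughly eightfold:

```python
def relative_cost(base_km, new_km):
    """Rough compute-cost multiplier for refining horizontal resolution.

    Cells per layer scale with (base/new)^2, and the time step shrinks
    roughly linearly with cell size, adding one more factor of base/new.
    """
    refinement = base_km / new_km
    return refinement ** 3

print(relative_cost(25, 12.5))  # halving the cell size -> 8.0x the cost
print(relative_cost(25, 5))     # five-fold refinement -> 125.0x the cost
```

This cubic scaling is why high-resolution models are run only over small regions and short lead times.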

(Credit: Jay Smith)

The Human Touch

With computers now running the show, what’s left for human forecasters to do?

In terms of day-to-day weather like temperatures, perhaps not much.
“For a lot of the routine weather, the forecast models are so good now that there’s really not that much that the human forecasters are going to add,” says Schumacher, who is also an associate professor in the Department of Atmospheric Science at Colorado State University.

But don’t think humans are unnecessary just yet.
“A forecaster might tweak what the computer tells you if they know their area really well and they know that models struggle with a certain kind of weather situation,” says Henson.

One such situation is precipitation, which is more challenging to forecast than temperature, says Matt Kelsch, a hydrometeorologist at the University Corporation for Atmospheric Research in Boulder.
“Temperature is a continuous field, meaning there’s a temperature everywhere,” he explains.
“But precipitation is a discontinuous field, meaning there’s a lot of places there is none, and then some places that it can be raining or snowing very hard.” And local geography — mountain ranges, coastlines or the Great Lakes — can affect precipitation in ways that models may not handle well.
Particularly for forecasts within 24 to 36 hours, Kelsch says, a meteorologist’s experience with the forecasting area comes into play.

Forecasting high-impact situations such as hurricanes, tornadoes and floods is more challenging and comes with much higher stakes.
“Especially when it comes to extreme weather, human judgment is really important,” Henson says.
 
What Are the Chances?

The further in the future your picnic is scheduled, the harder it is to predict rain or shine.
But since the 1950s, ever faster computers have been producing increasingly accurate weather forecasts.
“Many of the world’s largest and most powerful supercomputers are devoted to atmospheric research — to forecasting [weather] and to studying climate change,” Henson says.

According to the National Oceanic and Atmospheric Administration, today’s five-day forecast is accurate about 90 percent of the time.
The seven-day forecast is correct 80 percent of the time, and a 10-day forecast reflects the weather that actually occurs about 50 percent of the time.

What about major events? Based on National Hurricane Center forecasts since 2010, a hurricane’s eye made landfall, on average, just 47 miles from where a prediction 24 hours earlier said it would.
That’s only about one-sixth of an average hurricane’s total size.
“Twenty-four hours before a hurricane strikes land, we’ve already pretty much nailed down where it will go,” says Judt.
Going out to five days, the error in the forecasts since 2010 is about 220 miles.

These stats are more impressive when you consider how much meteorologists have improved the number of days out to which an accurate forecast can be made.
For instance, today’s five-day hurricane forecast is more reliable than the four-day forecast in the early 2000s, and more reliable than a three-day forecast in the 1990s.
And a 2015 Nature paper revealed that three- to 10-day forecasts have been improving by about a day per decade — meaning a modern six-day forecast is as accurate as a five-day forecast 10 years ago.
 
Chaos Rules

As forecasts improve, one question naturally arises: How much better can they get?

Unfortunately, the chaotic nature of our atmosphere seriously limits our ability to model it — and therefore to predict what it will do next.
You’ve probably heard that a butterfly flapping its wings in Hong Kong might cause the weather to change in New York.
The term for this “butterfly effect” — in which minuscule changes can have huge impacts on the development of a dynamic system — was coined in 1972 by mathematician and meteorologist Edward Lorenz.

In practice, this means that a single weather model run more than once with even the most subtle differences in starting conditions can produce very different predictions.
Since no measurement is perfect — every observation has an associated uncertainty — these small imperfections can cause big changes in what a model predicts.
These changes get bigger and bigger the further ahead you try to predict.
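You can see this runaway divergence in Lorenz’s own toy model of atmospheric convection. The sketch below (a classic chaos demo, not an actual weather model) integrates his 1963 equations twice from starting points that differ by one part in a hundred million — far smaller than any real measurement error — and watches the two “forecasts” drift apart:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of Lorenz's 1963 convection equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

# Two runs from nearly identical initial conditions.
a = (1.0, 1.0, 1.05)
b = (1.0 + 1e-8, 1.0, 1.05)

separation = []
for _ in range(3000):  # 30 units of model time
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    separation.append(abs(a[0] - b[0]))

# Early on the two runs are indistinguishable; by the end of the
# simulation they have diverged completely -- the butterfly effect.
print(f"after 1 time unit:   {separation[99]:.2e}")
print(f"after 30 time units: {separation[-1]:.2e}")
```

The tiny initial gap grows roughly exponentially, which is exactly why forecast errors balloon the further ahead you look.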

Because of this, the potential predictability limit of weather is about two weeks, says Henson.
“[Lorenz] essentially said there’s just no way you can predict weather features beyond that time because those little butterfly wing flaps and countless other little things will add up to so many big changes, and there’s so much uncertainty beyond that range, that it’s just impossible to say anything,” he says.

Judt, whose work focuses on the theoretical limit of accuracy in weather forecasting, says we’ll never be able to predict thunderstorms more than a couple of hours in advance, regardless of how good observations become.
For hurricanes and winter storms, which are much bigger and therefore easier to spot in advance, the theoretical limit is two to three weeks — “so there’s still a couple of days to be gained, if not a whole week,” he says.

“We could forecast perfectly if we had perfect knowledge of the atmosphere and if we had perfect weather models,” Judt says.
But we will never be able to measure everything about every point in the atmosphere all the time with ultimate precision, and our models will never be flawless.
“So we will never be able to actually achieve perfect forecasts.”

Building a Better Forecast

Taking better observations and improving weather models aren’t the only ways to improve forecasts.
Understanding how people use forecasts and warnings allows meteorologists to provide information in the most useful way.
One of the biggest challenges for meteorologists is condensing a forecast, which represents a spread of possible weather conditions to expect, into a single icon or a few sentences that appear in your weather app.
(Credit: Roen Kelly/Discover)

Take, for instance, today’s chance of rain in your area.
This could mean slightly different things coming from different meteorologists, but in general, it’s not simply the odds that you, personally, will witness rain that day.
Most forecasters calculate this number by multiplying their confidence that rain will occur by the area in which the rain might happen.
So a 40 percent chance of rain might be a 100 percent chance in 40 percent of your county, or a 50 percent chance across 80 percent of your county.
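That multiplication rule is simple enough to write down directly. This minimal sketch (the function name is my own, not an official formula implementation) shows how two very different rain scenarios produce the same headline number:

```python
# "Probability of precipitation" (PoP) as commonly computed:
# forecaster confidence that rain will occur somewhere in the area,
# times the fraction of the area expected to see rain.

def probability_of_precipitation(confidence, area_fraction):
    """PoP = confidence that rain occurs * fraction of area affected."""
    return confidence * area_fraction

# Same 40 percent headline, two very different days:
certain_but_local = probability_of_precipitation(1.0, 0.4)  # sure bet, small area
likely_and_wide = probability_of_precipitation(0.5, 0.8)    # coin flip, most of the county
```

Both calls return 0.4 — which is why the single percentage in your app can hide so much about what the day will actually look like.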

What this number doesn’t tell you, though, is how much it will rain, how hard, when or for how long.
So the next time you see a low chance of rain in your forecast, check the full weather report before you leave the umbrella at home.

“The science has outrun our communications skills and knowledge, to a certain extent.
So a lot of the challenge now is, how do we get people what they need?” says Henson.
That’s because more information isn’t always the best way to communicate.
“If people don’t understand it, then it doesn’t help,” he says.

NOAA is working with social scientists to develop forecasts that are more relevant and better targeted.
This is especially important because of how the internet has changed the way people obtain and share information, Kelsch says.


(Credit: Roen Kelly/Discover)

For instance, when creating the official forecast, meteorologists account for uncertainties by running a model several times.
Each time, the model will give a slightly different result, but most results will be very similar.
This ensemble of predictions is what becomes the official forecast.

But outlying, low-probability results occur in the ensemble, too.
Since these data are accessible to the public, there’s always a risk the data will be shared out of context on social media.
“That’s not a challenge that’s going away,” says Kelsch.

And though forecasts have improved dramatically, meteorologists are still blamed when they are wrong.
“We always need to remember that there never will be perfect forecasts, but we’re still improving them,” Judt says.

Because for all of us, “the most salient weather forecast is the one that was wrong — when you expected something and you were surprised, those are the ones you remember.
You don’t remember all the times that it was just as we expected because that’s not news,” Henson says.

For meteorologists, then, the end goal is to make almost every day’s forecast an utterly forgettable one.
 
Where the Magic (Forecast) Happens

In many countries, a single public weather service is the only source available for forecasts, warnings and alerts.
These meteorologists work for public (government) organizations or universities.
By contrast, the United States has strong public, private (commercial) and university-based weather observation and forecasting programs.


Forecasters at the National Weather Service are all hands on deck during a major storm.
Here, meteorologists monitor Hurricane Irma in September 2017 at the hurricane center in Miami.
(Credit: Andy Newman/Associated Press)


“We also are a large country and a populous country, and one with a great deal of weather variation.
I think all those things have strengthened our interest in weather and our support for weather research and forecasting,” says Weather Underground’s Bob Henson.
In other words, the U.S. is a bit of a weather powerhouse.
Here, most forecasts originate at the National Centers for Environmental Prediction (NCEP).

These centers are part of the National Weather Service (NWS), which itself is a part of the National Oceanic and Atmospheric Administration (NOAA).
The NCEP runs weather models, then disseminates the results — as well as forecasts — to NWS offices, which may customize the forecasts for their region.

For long-term, large area predictions, the most popular U.S. model is the Global Forecast System, or GFS.
On June 12, 2019, NOAA announced its first major upgrade for GFS in nearly 40 years.
The upgrade incorporates a new dynamical core, which is the model’s description of how the atmosphere behaves.
The new system, called GFS-FV3, is better at modeling moisture and clouds, allowing meteorologists to forecast storms with greater accuracy than ever before.

Commercial weather providers typically have some weather modeling capabilities of their own.
For example, Weather Underground refines the official forecast to a neighborhood scale by adding information from its network of over a quarter-million personal weather stations.
This gives you accurate weather information for your exact location when you open the service’s app, rather than what the weather is doing across town.

Each company fills a different niche, providing different forecasts that focus on, say, surfing conditions, fire conditions or transportation concerns, based on specific observations and models that refine the broad public-sector data.
These differences are also why you might prefer using one app or service over another.