Sunday, December 16, 2018

Sailing school (1956)

Newton Ferrers, Devon. 
The Newton Ferrers School of Yachting is run by Lt. Commander Rab Moore and his partner Dennis Montgomery.
It is the only school where students learn to sail on dry land first.
The students are seen gathered around the model of a yacht and Rab Moore points to various parts of the boat.
M/S of the land trainer as two women students get in and learn to sail without the hazards of the water.
They jump from one side to the other with the instructor watching.
Beautiful setting at Newton Ferrers on River Yealm as sailing boat sets out from harbour.
Man and woman take small dinghy out.
Then students are on board a large yacht being shown how to tie knots. On board the 30-foot gaff cutter "Ravenswing" the students learn how to put the sails up correctly.
M/S "Ravenswing" sails along the River Yealm. 

Friday, December 14, 2018

Why deep oceans gave life to the first big, complex organisms

Fossil photo from the Ediacara Biota.
(Photo by James Gehling)

From Phys

In the beginning, life was small.
For billions of years, all life on Earth was microscopic, consisting mostly of single cells.
Then suddenly, about 570 million years ago, complex organisms including animals with soft, sponge-like bodies up to a meter long sprang to life.
And for 15 million years, life at this size and complexity existed only in deep water.

Scientists have long questioned why these organisms appeared when and where they did: in the deep ocean, where light and food are scarce, in a time when oxygen in Earth's atmosphere was in particularly short supply.
A new study from Stanford University, published Dec. 12 in the peer-reviewed Proceedings of the Royal Society B, suggests that the more stable temperatures of the ocean's depths allowed the burgeoning life forms to make the best use of limited oxygen supplies.

Graphic showing origins of different Ediacarans
Thermal stability in the deep ocean fostered complex life
All of this matters in part because understanding the origins of these marine creatures from the Ediacaran period is about uncovering missing links in the evolution of life, and even our own species.
"You can't have intelligent life without complex life," explained Tom Boag, lead author on the paper and a doctoral candidate in geological sciences at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

The new research comes as part of a small but growing effort to apply knowledge of animal physiology to understand the fossil record in the context of a changing environment.
The information could shed light on the kinds of organisms that will be able to survive in different environments in the future.

"Bringing in this data from physiology, treating the organisms as living, breathing things and trying to explain how they can make it through a day or a reproductive cycle is not a way that most paleontologists and geochemists have generally approached these questions," said Erik Sperling, senior author on the paper and an assistant professor of geological sciences.

Playful illustration shows the appearance of life on Earth as well as the events that preceded it (and were necessary for it).
Complex life first develops in the ocean, but before long it ventures onto land.
Sea animals come ashore in search of food and new experiences.

Goldilocks and temperature change

Previously, scientists had theorized that animals have an optimum temperature at which they can thrive with the least amount of oxygen.
According to the theory, oxygen requirements are higher at temperatures either colder or warmer than a happy medium.
To test that theory in an animal reminiscent of those flourishing in the Ediacaran ocean depths, Boag measured the oxygen needs of sea anemones, whose gelatinous bodies and ability to breathe through the skin closely mimic the biology of fossils collected from the Ediacaran oceans.

"We assumed that their ability to tolerate low oxygen would get worse as the temperatures increased.
That had been observed in more complex animals like fish and lobsters and crabs," Boag said.
The scientists weren't sure whether colder temperatures would also strain the animals' tolerance.
But indeed, the anemones needed more oxygen when temperatures in an experimental tank veered outside their comfort zone.
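The U-shaped demand curve the theory describes can be sketched in a few lines; the quadratic form, the 20 °C optimum, and all constants below are illustrative assumptions, not values from the paper:

```python
# Toy model of the "Goldilocks" hypothesis: an ectotherm's oxygen demand
# is lowest near an optimum temperature and rises on either side of it.
# The quadratic form and all constants here are illustrative only.

def oxygen_demand(temp_c, t_opt=20.0, base=1.0, sensitivity=0.05):
    """Relative O2 required to sustain metabolism at a given temperature."""
    return base + sensitivity * (temp_c - t_opt) ** 2

# Demand rises whether the tank veers colder or warmer than the optimum:
demand_cold = oxygen_demand(10.0)  # above baseline
demand_opt = oxygen_demand(20.0)   # minimum: the "happy medium"
demand_warm = oxygen_demand(30.0)  # above baseline again
```

Under this picture, any environment that regularly swings away from the optimum forces the animal to find more oxygen than a stable one would.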

Together, these factors made Boag and his colleagues suspect that, like the anemones, Ediacaran life would also require stable temperatures to make the most efficient use of the ocean's limited oxygen supplies.

Factors governing oxygen supply to animals.
(a) Average annual partial pressure of O2 (pO2) in the global ocean at the surface.
(b) Average annual solubility of O2 (αO2) in the global ocean at the surface. Values increase with latitude owing to thermal effects on Henry's solubility coefficient.
(c) Average annual diffusivity of O2 (DO2) in the global ocean at the surface.
(d) Average annual bioavailability of O2 in the global ocean at the surface, expressed using the oxygen supply index (OSI).
Despite the increased solubility of O2 in cold water, the kinematic viscosity also increases substantially, reducing the diffusivity of O2 at a rate greater than the offsetting effect on solubility.
As a result, the supply of O2 to respiratory surfaces actually decreases approximately linearly as water becomes colder.
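This tradeoff can be illustrated with a toy calculation. The parameterizations below (an exponential solubility decay and a Stokes-Einstein-style diffusivity) are rough stand-ins chosen only to reproduce the qualitative behavior, not the paper's actual formulas:

```python
import math

def o2_solubility(t_c):
    # Illustrative: O2 dissolves more readily in cold water, so solubility
    # falls as temperature rises (arbitrary units, not a real fit).
    return 1.7 * math.exp(-0.02 * t_c)

def water_viscosity(t_c):
    # Kinematic viscosity of water drops steeply as it warms
    # (Poiseuille-style empirical form, mm^2/s).
    return 1.79 / (1 + 0.0337 * t_c + 0.000221 * t_c ** 2)

def o2_diffusivity(t_c):
    # Stokes-Einstein: diffusivity scales with absolute temperature
    # over viscosity, so it rises quickly with warming.
    return (t_c + 273.15) / water_viscosity(t_c)

def oxygen_supply_index(t_c):
    # Bioavailable supply ~ solubility x diffusivity: the viscosity effect
    # outweighs the solubility gain, so cold water delivers less O2.
    return o2_solubility(t_c) * o2_diffusivity(t_c)
```

Running the index at 0 °C versus 25 °C reproduces the study's qualitative point: despite holding more dissolved oxygen, colder water supplies it to respiratory surfaces more slowly.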

Refuge at depth

It would have been harder for Ediacaran animals to use the little oxygen present in cold, deep ocean waters than in warmer shallows because the gas diffuses into tissues more slowly in colder seawater.
Animals in the cold have to expend a larger portion of their energy just to move oxygenated seawater through their bodies.

But what it lacked in useable oxygen, the deep Ediacaran ocean made up for with stability.
In the shallows, the passing of the sun and seasons can deliver wild swings in temperature—as much as 10 degrees Celsius (18 degrees Fahrenheit) in the modern ocean, compared to seasonal variations of less than 1 degree Celsius at depths below one kilometer (0.62 mile).
"Temperatures change much more rapidly on a daily and annual basis in shallow water," Sperling explained.

Impact of seasonal temperature variation on aerobic respiration in low pO2 conditions
In a world with low oxygen levels, animals unable to regulate their own body temperature couldn't have withstood an environment that so regularly swung outside their Goldilocks temperature.

The Stanford team, in collaboration with colleagues at Yale University, propose that the need for a haven from such change may have determined where larger animals could evolve.
"The only place where temperatures were consistent was in the deep ocean," Sperling said.
In a world of limited oxygen, the newly evolving life needed to be as efficient as possible and that could only be achieved in the relatively stable depths.
"That's why animals appeared there," he said.


Thursday, December 13, 2018

Mechanics of Nazaré

Nazare in full glory.
Photo: Andre Bothelo

From Surfline

How one break in Portugal creates the world's largest waves

A wave that produces the 80-foot Guinness world record for largest wave ever ridden needs no introduction.
Even to the non-surfing community, little is needed when mainstream media regularly runs photos and videos of every XXL swell that hits the small Portuguese fishing town.
Hell, CNN’s Anderson Cooper even rode through the rocks on the back of a ski — piloted by none other than previous world record holder (also caught at Nazaré), Garrett McNamara.

Above: Under the right conditions, XL Nazare looks almost inviting.
Photo: Jeremiah Klein

Nazaré is well known for good reason.
It regularly produces the largest rideable waves on planet Earth.
And thanks to the ultimate deepwater canyon setup, Nazaré’s surf size potential is bound only by the size and direction of the swell it receives.

Swell Source
  • Strongest swells of the year arrive from October through April, when intense mid-latitude frontal lows track eastward across the North Atlantic, interacting with adjacent high pressure.
  • Typical storm track moves towards Europe helping maximize swell potential.
  • Strongest swells arrive from the WNW to NW, are often consistent, and range from short to long period.
  • Travel time from one to five days.
  • Peak hurricane season from mid-August through mid-October can offer a variety of swell directions. Recurving tropical cyclones often undergo extratropical transition (most common, October) or enhance developing winter storms. Tropical systems can impact the region with wind and weather, like Cyclone Leslie in October 2018.
  • Local windswell events do occur and can provide fun surf. Events are not as strong as above mentioned swell sources and do not produce the signature XL surf.
The preferred swell source and swell window for Nazare.

Swell Window
  • Nazaré’s swell window is technically open from SW (226°) to N (357°). West to NW angled swells are strongest and most common; WNW swells are ideal.
  • Between the Peniche peninsula at 226° and 251° lies a small group of islands known as the Berlengas Archipelago. A fraction of swell energy filters through these islands, arriving weaker than the more prominent west to NW swells.
  • Southerly angled swells are usually local windswell events (e.g., ahead of an approaching front), often coinciding with unfavorable onshore wind.
  • Nazaré receives more northerly angled swells up to 357°. North-northwest to N swells are not a favorable direction — shorter period swells generally sweep across the beach, longer period swells see an occasional canyon set that is too crossed up. Wave amplification through refraction by the canyon and associated constructive interference is not as impactful from the north.
Doesn’t look very playful now, does it? And good luck timing the sets.
Photo: Andre Bothelo


Bathymetry is vital in how waves behave when approaching and breaking along shore, refracting energy into or away from different locations with each variation in swell direction or period.
The surf at certain points can be amplified to greater heights, while other spots are left in a swell void.
And the best spot on the planet to observe extreme wave refraction is Nazaré.

The large, deepwater Nazaré Canyon has the potential to significantly amplify the surf at the beach just to the north of the bay.
Wave-face height can reach three, four, even five times the offshore deepwater swell height.
But this magnification is highly dependent on the incoming swell angle and period.
Generally, Nazaré favors a longer period swell from the WNW.

Energy in longer period swells extends deeper within the water column, feeling the contours of the ocean bottom sooner, and with a greater degree of effect.
Since swells always refract toward shallower water, longer period swells start to turn and bend sooner and more effectively.
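This depth-of-influence effect follows from linear wave theory: a swell's deep-water wavelength grows with the square of its period, and it begins to "feel" the bottom at roughly half that wavelength. A minimal sketch:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def deepwater_wavelength(period_s):
    """Linear wave theory: L0 = g * T^2 / (2 * pi), in meters."""
    return G * period_s ** 2 / (2 * math.pi)

def wave_base(period_s):
    """Approximate depth at which a swell starts to feel the bottom (~L0/2)."""
    return deepwater_wavelength(period_s) / 2

# A 10-second windswell feels bottom around 78 m of depth, while a
# 20-second groundswell starts refracting in water roughly four times
# deeper (~312 m), well out over the Nazaré Canyon.
short_period_base = wave_base(10)
long_period_base = wave_base(20)
```

The quadratic dependence on period is why long-period groundswells bend toward the canyon's walls so much earlier and more effectively than short-period windswell.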

For Nazaré, there is a steep contrast between the large and deep canyon running offshore and the much shallower ridge that lines the northern slope.
This canyon/ridge relationship extends far offshore and runs all the way up to the break.
The portion of the swell running through the deep canyon maintains a greater percentage of its raw open ocean energy and forward speed closer to shore.
And upon interacting with the adjacent ridge, much of this energy will refract out of the canyon and focus back in toward the break.

The various bends of the canyon also play a role, helping create a more complex scenario of refracting and converging waves.
Meanwhile, the inbound swell traveling over the shallower water north of the canyon starts to gradually slow down and shoal when nearing the coast — and much of this energy focuses toward Nazaré as well.
The result is a compression of these refracting swell lines as they converge at the break, amplifying the waves.

However, there is another key factor at work besides just refractive pileup that helps contribute to the extreme magnification of the waves here, and that is constructive interference (Note – A spot like the Wedge also has this X-factor going on).
After extensive research on the bathymetry, various swell scenarios run through our high-powered computer simulations, and athlete observations, we do know the “magic numbers” for the canyon to perform at its maximum potential.
And the direction is just as important as the period.
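The interference effect itself is easy to quantify: for two wave trains of the same frequency meeting at a point, the combined crest height depends on their phase difference. A small sketch, with hypothetical amplitudes:

```python
import math

def combined_amplitude(a1, a2, phase_diff):
    """Peak height of two equal-frequency wave trains superposed at a point."""
    return math.sqrt(a1 ** 2 + a2 ** 2 + 2 * a1 * a2 * math.cos(phase_diff))

# Nearly in phase, the canyon-refracted train and the shoaled train
# stack almost linearly into a "mutant" peak...
peak_aligned = combined_amplitude(2.0, 1.5, 0.1)     # close to 2.0 + 1.5

# ...while out of phase they largely cancel, which is why the biggest
# peaks appear only sporadically and timing the sets is so hard.
peak_crossed = combined_amplitude(2.0, 1.5, math.pi)  # close to 2.0 - 1.5
```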

From Surfline Labs: Animation (4x normal speed) shows the swell that provided the world record wave on November 8th, 2017, and the Nazaré canyon's role in creating mutant, XXL peaks.
The reds indicate peaks, the blues show the troughs — the biggest (red) peaks only appear sporadically when a multitude of factors come perfectly together.
It was timing, and luck, that allowed Rodrigo Koxa to catch the record-breaking wave.

Given the unique layout of this underwater landscape, incoming long period swells from the WNW are ideal for Nazaré.
These swells have just enough west in them to allow the canyon to refract at its fullest potential, yet just enough north that most of the swell is refracting back toward this particular stretch of beach, instead of away to the south.
The north component allows the portion of the swell not running through the canyon to converge with the waves refracting out of the canyon — a combo of NW and SW waves in the surf zone.

If there is too much north in the swell, then the canyon has difficulty refracting swell back toward the north, thus providing less energy and lowering the potential for larger surf.
The sets that do refract from the canyon are almost too peaky with more slopey, mushy shoulders.
There is often more current running on these more northerly angled swells as well.

For W to WSW swells, the refracting energy from the canyon is more evenly split to the north and south, also lowering the potential for larger surf at Nazaré.
SW swells are partially shadowed by offshore islands.
The surf is not as peaky on these swells, as west lines north of the canyon square up more to the coast with less convergence from waves refracting from the canyon.
For more southerly angled swells, the canyon refracts more energy to areas to the south, considerably lowering the refracting factor and peaky nature of Nazaré.

A ski is not required to ride this train — until it gets to a certain size.
Photo: Klein


Like most spots, Nazaré prefers calm or light to moderate offshore wind (east to southeast).
Strong offshore wind can create hazardous conditions and is almost as problematic as an onshore wind, especially in big surf.
Strong offshores make it very difficult to paddle into waves and create surface chop running up the wave faces.
Bigger, faster-moving waves meet a strong offshore flow at even greater relative speed, aggravating the sea surface even more.

High pressure overhead or to the north to northeast of Portugal sets up offshore flow for Nazare.

Located on the far southwestern edge of Europe, Nazaré fares better than breaks at higher latitudes when it comes to severe winter weather.
Systems tracking through the higher latitudes, or storms that lift northward before nearing Europe, can provide good swell with less adverse local weather.

But storms tracking through the lower latitudes can bring poor wind and weather along with swell.
Approaching fronts often bring onshore winds and stormy conditions to the region.
High pressure building in behind these fronts, either over the region or to the north or northeast, turns the wind offshore and improves local weather.
Nazaré can handle light onshores as the waves themselves block the wind on big days and the cliffs shelter the waves from a southerly wind.

The view from the cliff.
Safer than a view from the water.
Photo: Klein

Best Conditions for Nazaré
  • Best Tide: Mid, prefers incoming
  • Best Swell Direction: West-Northwest to Northwest
  • Best Swell Period: Longer period
  • Best Wind: Calm or light to moderate offshore (east-southeast)
  • Best Size: Works on all sizes, no limit on max size
  • Best Season: Fall generally best, winter and spring very solid too

Wednesday, December 12, 2018

Sails make a comeback as shipping tries to go green

Car manufacturer Groupe Renault is partnering with NeoLine, a French designer and operator of cargo sailing ships, to reduce the carbon footprint of the Group’s supply chain.
NeoLine has designed a 136-meter ro-ro with 4,200 square meters of sail area that it says has the potential to reduce CO2 emissions by up to 90 percent, primarily through the use of wind power combined with cost-cutting speeds and an optimized energy mix. NeoLine aims to commission the vessels by 2020-2021 on a pilot route joining Saint-Nazaire in France, the U.S. Eastern seaboard, and Saint-Pierre and Miquelon (off the coast of Newfoundland in Canada).

From The Sentinel by Kelvin Chan

As the shipping industry faces pressure to cut climate-altering greenhouse gases, one answer is blowing in the wind.

European and U.S. tech companies, including one backed by airplane maker Airbus, are pitching futuristic sails to help cargo ships harness the free and endless supply of wind power.
While they sometimes don't even look like sails -- some are shaped like spinning columns -- they represent a cheap and reliable way to reduce CO2 emissions for an industry that depends on a particularly dirty form of fossil fuels.

The merchant shipping industry releases 2.2% of the world’s carbon emissions, about the same as Germany, and the International Maritime Organization estimates that could increase up to 250% by 2050 if no action is taken.
Finnish company Norsepower may have a solution in the spinning cylinders they’ve designed for ships to harness wind power and produce forward thrust.
The result is a ship that needs less fuel to travel the seas - a major boost to the industry that transports 90% of international trade.
VICE News took a ride on the Estraden, a cargo ship fitted with Norsepower Rotor Sails, to see the technology that can reduce a ship’s carbon emissions by 1000 tons per year.
If all 50,000 merchant ships adopted Norsepower Rotor Sails, the costs saved on fuel would be over $7 billion a year, and the emissions prevented would equal more than 12 coal fired power plants.
While zero emission ships could be achieved using Rotor Sails paired with other alternative fuel sources, the economic incentives haven’t been strong enough to mobilize the industry just yet.
But strides such as those taken by Norsepower could help kickstart a widescale greening of the industry.

"It's an old technology," said Tuomas Riski, the CEO of Finland's Norsepower, which added its "rotor sail" technology for the first time to a tanker in August.
"Our vision is that sails are coming back to the seas."

Denmark's Maersk Tankers is using its Maersk Pelican oil tanker to test Norsepower's 30 meter (98 foot) deck-mounted spinning columns, which convert wind into thrust based on an idea first floated nearly a century ago.

Separately, A.P. Moller-Maersk, which shares the same owner and is the world's biggest container shipping company, pledged this week to cut carbon emissions to zero by 2050, which will require developing commercially viable carbon neutral vessels by the end of next decade.

This is Enercon's E-Ship 1, a 128 m cargo vessel built in 2010 for the transportation of wind turbine components. She is a most unusual-looking ship, featuring four 27 m tall Flettner rotor sails which rotate rapidly; thanks to the Magnus effect, the design helps reduce engine fuel costs.

The shipping sector's interest in "sail tech" and other ideas took on greater urgency after the International Maritime Organization, the U.N.'s maritime agency, reached an agreement in April to slash emissions by 50 percent by 2050.

Transport's contribution to earth-warming emissions is in focus as negotiators in Katowice, Poland, gather for U.N. talks to hash out the details of the 2015 Paris accord on curbing global warming.

Beluga Projects SkySails

Shipping, like aviation, isn't covered by the Paris agreement because of the difficulty attributing their emissions to individual nations, but environmental activists say industry efforts are needed.
Ships belch out nearly 1 billion tons of carbon dioxide a year, accounting for 2-3 percent of global greenhouse gases. The emissions are projected to grow between 50 and 250 percent by 2050 if no action is taken.

Notoriously resistant to change, the shipping industry is facing up to the need to cut its use of cheap but dirty "bunker fuel" that powers the global fleet of 50,000 vessels -- the backbone of world trade.

The IMO is taking aim more broadly at pollution, requiring ships to start using low-sulfur fuel in 2020 and sending ship owners scrambling to invest in smokestack scrubbers, which clean exhaust, or looking at cleaner but pricier distillate fuels.

The GoodShipping Program is the world’s first initiative to decarbonize container shipping by changing the marine fuel mix – switching from heavy fuel oil towards sustainable marine fuel.
The Program enables cargo owners to make a change: their footprint from shipping will be reduced significantly, regardless of existing contracts, cargo routes and volumes.

A Dutch group, the GoodShipping Program, is trying biofuel, which is made from organic matter.
It refueled a container vessel in September with 22,000 liters of used cooking oil, cutting carbon dioxide emissions by 40 tons.

In Norway, efforts to electrify maritime vessels are gathering pace, highlighted by the launch of the world's first all-electric passenger ferry, Future of the Fjords, in April.
Chemical maker Yara is meanwhile planning to build a battery-powered autonomous container ship to ferry fertilizer between plant and port.
Ship owners have to move with the times, said Bjorn Tore Orvik, Yara's project leader.
Building a conventional fossil-fueled vessel "is a bigger risk than actually looking to new technologies ... because if new legislation suddenly appears then your ship is out of date," said Orvik.

Batteries are effective for coastal shipping, though not for long-distance sea voyages, so the industry will need to consider other "energy carriers" generated from renewable power, such as hydrogen or ammonia, said Jan Kjetil Paulsen, an advisor at the Bellona Foundation, an environmental non-government organization.
Wind power is also feasible, especially if vessels sail more slowly.
"That is where the big challenge lies today," said Paulsen.

The performance of the EcoFlettner, which has been tested on the MV Fehn Pollux since July, clearly exceeds the expectations of the scientists.
“The data we have evaluated so far significantly outmatch those of our model calculations,” says Professor Michael Vahs, who has been researching the topic of wind propulsion for seagoing vessels at the University of Applied Science Emden / Leer for more than 15 years.
“In perfect conditions, this prototype delivers more thrust than the main engine.”
15 companies from around Leer have been involved in the development and construction of the sailing system. The whole project is funded by the EU and coordinated by Mariko in Leer.
The rotor is 18 meters high and has a diameter of three meters.
After lengthy test runs ashore, the rotor is now being tested under real conditions aboard the 90-meter-long multi-purpose freighter MV Fehn Pollux.
On board MV Fehn Pollux, more than 50 different data streams are continuously collected and processed in real time by the Flettner control system on the bridge.
The computer uses the data to calculate the optimum settings for the rotor under the current conditions.

Wind power looks to hold the most promise.
The technology behind Norsepower's rotor sails, also known as Flettner rotors, is based on the principle that airflow speeds up on one side of a spinning object and slows on the other.
That creates a force that can be harnessed.

Rotor sails can generate thrust even from wind coming from the side of a ship.
German engineer Anton Flettner pioneered the idea in the 1920s but the concept languished because it couldn't compete with cheap oil.
On a windy day, Norsepower says rotors can replace up to 50 percent of a ship's engine propulsion. Overall, the company says it can cut fuel consumption by 7 to 10 percent.
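Flettner's idea can be sized up with the textbook Kutta-Joukowski relation for a spinning cylinder. The sketch below is an idealized potential-flow upper bound, not a Norsepower figure; the rotor dimensions are borrowed from the EcoFlettner prototype described above, and the spin rate and wind speed are assumptions:

```python
import math

RHO_AIR = 1.225  # air density at sea level, kg/m^3

def rotor_thrust(radius_m, height_m, spin_rpm, wind_ms):
    """Idealized Kutta-Joukowski lift on a spinning cylinder, in newtons.

    Circulation of a cylinder spun at surface speed r*omega:
        Gamma = 2 * pi * r^2 * omega
    Lift per unit span: L' = rho * V * Gamma. Real rotors deliver well
    below this potential-flow ceiling, so treat the result as an upper bound.
    """
    omega = spin_rpm * 2 * math.pi / 60  # rad/s
    circulation = 2 * math.pi * radius_m ** 2 * omega
    return RHO_AIR * wind_ms * circulation * height_m

# An 18 m tall, 3 m diameter rotor (EcoFlettner-scale) spun at an assumed
# 180 rpm in a 10 m/s crosswind yields a force on the order of tens of kN:
thrust_n = rotor_thrust(radius_m=1.5, height_m=18.0, spin_rpm=180.0, wind_ms=10.0)
```

Because the lift scales with wind speed and circulation rather than fuel burned, the thrust comes essentially free once the modest cost of spinning the rotor is paid.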

Maersk Pelican: Trialling a pair of Norsepower Rotors under trading conditions

Maersk Tankers said the rotor sails have helped the Pelican use less engine power or go faster on its voyages, resulting in better fuel efficiency, though it didn't give specific figures.

One big problem with rotors is they get in the way of port cranes that load and unload cargo.
To get around that, U.S. startup Magnuss has developed a retractable version.
The New York-based company is raising $10 million to build its concept, which involves two 50-foot (15-meter) steel cylinders that retract below deck.
"It's just a better mousetrap," said CEO James Rhodes, who says his target market is the "Panamax" size bulk cargo ships carrying iron ore, coal or grain.

High tech versions of conventional sails are also on the drawing board.
Spain's bound4blue is developing an aircraft wing-like sail that collapses like an accordion, according to a video of a scaled-down version from a recent trade fair.
The first two will be installed next year followed by five more in 2020.
The company is in talks with 15 more ship owners from across Europe, Japan, China and the U.S. to install its technology, said co-founder Cristina Aleixendrei.


Tuesday, December 11, 2018

Can Artificial Intelligence help build better, smarter climate models?

A computer simulation of carbon dioxide movement in the atmosphere.
The ‘Cloud Brain’ might make it possible to tighten up the uncertainties of how the climate will respond to rising carbon dioxide.

From e360 by Nicola Jones

Researchers have been frustrated by the variability of computer models in predicting the earth’s climate future.
Now, some scientists are trying to use the latest advances in artificial intelligence to home in on clouds and other factors that may provide a clearer view.

Look at a digital map of the world with pixels that are more than 50 miles on a side and you’ll see a hazy picture: whole cities swallowed up into a single dot; Vancouver Island and the Great Lakes just one pixel wide.
You won’t see farmers’ fields, or patches of forest, or clouds.
Yet this is the view that many climate models have of our planet when trying to see centuries into the future, because that’s all the detail that computers can handle.
Turn up the resolution knob and even massive supercomputers grind to a crawl.
“You’d just be waiting for the results for way too long; years probably,” says Michael Pritchard, a next-generation climate modeler at the University of California, Irvine.
“And no one else would get to use the supercomputer.”

Earth recently experienced its largest annual increases in atmospheric carbon dioxide levels in at least 2,000 years.
These exchanges vary from year to year, and scientists are using OCO-2 data to uncover the reasons.
The many and varied uses of OCO-2 data will continue to be essential to understanding the dynamics of carbon dioxide across our planet and will help contribute to improved long-term climate forecasting.
NASA has released a video that explains the study and shows the changing levels of CO2.

The problem isn’t just academic: It means we have a blurry view of the future.
It is hard to know, for instance, whether a warmer world will bring more low-lying clouds that shield Earth from the sun, cooling the planet, or fewer of them, warming it up.
For this reason and more, the roughly 20 models run for the last assessment of the Intergovernmental Panel on Climate Change (IPCC) disagree with each other profoundly: Double the carbon dioxide in the atmosphere and one model says we’ll see a 1.5 degree Celsius bump; another says it will be 4.5 degrees C.
“It’s super annoying,” Pritchard says.
That factor of three is huge — it could make all the difference to people living on flooding coastlines or trying to grow crops in semi-arid lands.

Pritchard and a small group of other climate modelers are now trying to address the problem by improving models with artificial intelligence.
(Pritchard and his colleagues affectionately call their AI system the “Cloud Brain.”) Not only is AI smart; it’s efficient.
And that, for climate modelers, might make all the difference.

Computer hardware has gotten exponentially faster and smarter — today’s supercomputers handle about a billion billion operations per second, compared to a thousand billion in the 1990s.
Meanwhile a parallel revolution is going on in computer coding.
For decades, computer scientists and sci-fi writers have been dreaming about artificial intelligence: computer programs that can learn and behave like real people.
Starting around 2010, computer scientists took a huge leap forward with a technique called machine learning, specifically “deep learning,” which mimics the complex network of neurons in the human brain.

Traditional computer programming is great for tasks that follow rules: if x, then y.
But it struggles with more intuitive tasks for which we don’t really have a rule book, like translating languages, understanding the nuances of speech, or describing what’s in an image.
This is where machine learning excels.
The idea is old, but two recent developments finally made it practical — faster computers, and a vast amount of data for machines to learn from.
The internet is now flooded with pre-translated text and user-labelled photographs that are perfect for training a machine-learning program.

Companies like Microsoft and Google jumped on deep learning starting in the early 2010s, and have used it in recent years to power everything from voice recognition on smart phones to image searches on the internet.
Scientists have started to pick up these techniques too.
Medical researchers have used it to find patterns in datasets of proteins and molecules to guess which ones might make good drug candidates, for example.
And now deep learning is starting to stretch into climate science and environmental projects.

Researchers hope incorporating artificial intelligence into climate models will further understanding of how clouds, shown here over Bangladesh, will act in a warmer world.
Typical global climate models have pixel sizes far too large to see individual clouds or storm fronts.
The ‘Cloud Brain’ tends to get confused when given scenarios outside its training, such as a much warmer world.
NASA/International Space Station

Microsoft’s AI for Earth project, for example, is throwing serious money at dozens of ventures that do everything from making homes “smarter” in their use of energy for heating and cooling, to making better maps for precision conservation efforts.
A team at the National Energy Research Scientific Computing Center in Berkeley is using deep learning to analyze the vast reams of simulated climate data being produced by climate models, drawing lines around features like cyclones the way a human weather forecaster might do.
Claire Monteleoni at the University of Colorado, Boulder, is using AI to help decide which climate models are better than others at certain tasks, so their results can be weighed more heavily.

But what Pritchard and a handful of others are doing is more fundamental: inserting machine learning code right into the heart of climate models themselves, so they can capture tiny details in a way that is hundreds of times more efficient than traditional computer programming.
For now they’re focused on clouds — hence the name “Cloud Brain” — though the technique can be used on other small-scale phenomena.
That means it might be possible to tighten up the uncertainties of how the climate will respond to rising carbon dioxide, giving us a clearer picture of how clouds might shift, how temperatures and rainfall might vary — and how lives are likely to be affected from one small place to the next.

So far these attempts to hammer deep learning code into climate models are in the early stages, and it’s unclear if they’ll revolutionize model-making or fall flat.

The problem that the Cloud Brain tackles is a mismatch between what climate scientists understand and what computers can model — particularly with regard to clouds, which play a huge role in determining temperature.

While some aspects of cloud behavior are still hard to capture with algorithms, researchers generally know the physics of how water evaporates, condenses, forms droplets, and rains out.
They’ve written down the equations that describe all that, and can run small-scale, short-term models that show clouds evolving over short time periods with grid boxes just a few miles wide.
Such models can be used to see whether clouds will grow wispier and let in more sunlight, or thicken and cool the ground by shielding it from the sun.
But try to stick that much detail into a global-scale, long-term climate model, and it will go about a million times slower.
The general rule of thumb, says Chris Bretherton at the University of Washington, is that if you want to cut your grid box dimensions in half, the computation will take 10 times as long.
“It’s not easy to make a model much more detailed,” he says.
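Bretherton's rule of thumb can be reproduced with a back-of-the-envelope scaling argument. Halving the grid spacing quadruples the number of grid columns (two horizontal dimensions), and numerical stability roughly halves the usable time step as well, so the naive cost factor is 2 × 2 × 2 = 8 — in the same ballpark as the quoted ~10, which presumably folds in real-world overhead. A minimal sketch of that arithmetic:

```python
# Back-of-the-envelope cost scaling for refining a climate model grid.
# Refining the horizontal grid spacing by a factor r multiplies the number
# of grid columns by r*r, and the stability (CFL) condition shrinks the
# usable time step by roughly another factor of r.

def cost_factor(refinement):
    """Approximate cost multiplier for grid spacing `refinement` times finer.

    Two horizontal dimensions plus a proportionally smaller time step.
    """
    return refinement ** 3

if __name__ == "__main__":
    print(cost_factor(2))   # halving the grid spacing: 8x the work
    print(cost_factor(10))  # ten times finer: ~1000x the work
```

The exact exponent depends on the model (a fully three-dimensional refinement would scale even faster), but the cubic growth is why cloud-resolving detail is unaffordable in global runs.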

The supercomputers that crunch these models cost somewhere in the realm of $100 million to build, says David Randall, a Colorado State University climate modeler; a month's worth of time on such a machine could cost millions.
Those fees don’t actually show up in an invoice for any given researcher; they’re paid by institutions, governments, and grants.
But the financial investment means there’s real competition for computer time.
For this reason, typical global climate models like the ones used thus far in IPCC reports have pixel sizes tens of miles wide — far too large to see individual clouds or even storm fronts.

The trick that Pritchard and others are attempting is to train deep learning systems with data from short-term runs of fine-scale cloud models.
This lets the AI basically develop an intuitive sense for how clouds work.
That AI can then be jimmied into a bigger-pixel global climate model, to shove more realistic cloud behavior into something that’s cheap and fast enough to run.
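The train-then-embed workflow can be illustrated with a toy sketch: generate training pairs from an "expensive" fine-scale routine, fit a cheap emulator to them, and let the emulator stand in for the routine inside a coarse model. Everything here is invented for illustration — the stand-in `fine_scale_tendency` function and the simple linear fit are placeholders for a cloud-resolving simulation and the deep neural network the actual Cloud Brain uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Fine-scale model": an expensive routine returning the heating tendency
# produced by clouds for a given column state. Here it is just a stand-in
# nonlinear function; in reality it is a cloud-resolving simulation.
def fine_scale_tendency(temp, humidity):
    return 0.5 * humidity - 0.02 * temp * humidity

# 1) Generate training data from short fine-scale runs.
temp = rng.uniform(250.0, 310.0, size=2000)   # kelvin
humidity = rng.uniform(0.0, 1.0, size=2000)   # relative humidity (0-1)
target = fine_scale_tendency(temp, humidity)

# 2) Fit a cheap emulator. A least-squares fit on nonlinear features keeps
#    the sketch dependency-free; the real projects use deep neural nets.
features = np.column_stack([np.ones_like(temp), temp, humidity, temp * humidity])
coef, *_ = np.linalg.lstsq(features, target, rcond=None)

def emulated_tendency(temp, humidity):
    return coef[0] + coef[1] * temp + coef[2] * humidity + coef[3] * temp * humidity

# 3) The emulator now stands in for the expensive routine inside a coarse
#    model's time loop, at a fraction of the cost.
t, h = 280.0, 0.6
print(abs(emulated_tendency(t, h) - fine_scale_tendency(t, h)))
```

The payoff is that evaluating the fitted emulator is orders of magnitude cheaper than rerunning the fine-scale physics at every grid box and time step.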

Pritchard and his two colleagues trained their Cloud Brain on high-resolution cloud model results, and then tested it to see if it would produce the same simulated climates as the slower, high-resolution model.
It did, even getting details like extreme rainfalls right, while running about 20 times faster.

Others — including Bretherton, a former colleague of Pritchard’s, and Paul O’Gorman, a climate researcher at MIT — are doing similar work.
The details of the strategies vary, but the general idea — using machine learning to create a more-efficient programming hack to emulate clouds on a small scale — is the same.
The approach could likewise be used to help large global models incorporate other fine features, like miles-wide eddies in the ocean that bedevil ocean current models, and the features of mountain ranges that create rain shadows.

The scientists face some major hurdles.
The fact that machine learning works almost intuitively, rather than following a rulebook, makes these programs computationally efficient.
But it also means that mankind’s hard-won understanding about the physics of gravitational forces, temperature gradients, and everything else, gets set aside.
That’s philosophically hard to swallow for many scientists, and also means that the resulting model might not be very flexible: Train an AI system on oceanic climates and stick it over the Himalayas and it might give nonsense results.
O’Gorman’s results hint that his AI can adapt to cooler climates but not warmer ones.
And Cloud Brain tends to get confused when given scenarios outside its training, such as a much warmer world.
“The model just blows up,” says Pritchard.
“It’s a little delicate right now.”
Another disconcerting issue with deep learning is that it’s not transparent about why it does what it does, or how it arrives at its results.
“Basically it’s a black box; you push a bunch of numbers in one end and a bunch of numbers come out the other end,” says Philip Rasch, chief climate scientist at the Pacific Northwest National Laboratory.
“You don’t know why it’s producing the answers it’s producing.”

“In the end, we want to predict something that no one has observed,” says Caltech’s Tapio Schneider.
“This is hard for deep learning.”
For all these reasons, Schneider and his team are taking a different approach.
He is sticking to physics-based models, and using a simpler variant of machine learning to help tune the models.
He also plans to use real data about temperature, precipitation, and more as a training dataset.
“That’s more limited information than model data,” he says.
“But hopefully we get something that’s more predictive of reality when the climate changes.”
Schneider’s well-funded effort, called the Climate Machine, was announced this summer but hasn’t yet been built.
No one yet knows how the strategy will pan out.
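Schneider's calibration idea — keep the physics, use machine learning only to tune uncertain parameters against observations — can be sketched in miniature. The toy "physics model" below, its single tunable parameter, and the brute-force grid search are all invented stand-ins for the far more sophisticated statistical machinery a real calibration effort would use.

```python
import numpy as np

# Toy "physics model": predicted rainfall as a function of humidity, with
# one uncertain efficiency parameter. The functional form is invented.
def physics_model(humidity, efficiency):
    return efficiency * humidity ** 2

# Pretend observations, generated here with a "true" parameter of 0.7 that
# the calibration is supposed to recover.
humidity_obs = np.linspace(0.1, 1.0, 50)
rain_obs = physics_model(humidity_obs, efficiency=0.7)

# Calibration: pick the parameter value that minimizes squared error
# against the observations.
candidates = np.linspace(0.0, 2.0, 2001)
errors = [np.sum((physics_model(humidity_obs, e) - rain_obs) ** 2)
          for e in candidates]
best = candidates[int(np.argmin(errors))]
print(round(float(best), 3))
```

Because the physical equations stay in the model, the tuned result keeps its grounding in known physics — the property Schneider is betting will make it more trustworthy under a changed climate.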

A combination of cloud data, such as this satellite observation of a tropical storm over South America, and machine learning could help fine-tune climate models.
The ‘Cloud Brain’ tends to get confused when given scenarios outside its training, such as a much warmer world. 
NASA/Goddard Space Flight Center/Scientific Visualization Studio

The utility of these models for predicting the future climate is the biggest uncertainty.
“That’s the elephant in the room,” says Pritchard, who remains optimistic that he can do it, but accepts that we’ll simply have to wait and see.
Randall, who is watching the developments with interest from the sidelines, is also hopeful.
“We’re not there yet,” he says, “but I believe it will be very useful.”

Climate scientist Drew Shindell of Duke University, who isn’t working with machine learning himself, agrees.
“The difficulty with all of these things is we don’t know that the physics that’s important to short-term climate are the same processes important to long-term climate change,” he says.
Train an AI system on short-term data, in other words, and it might not get the long-term forecast right.
“Nevertheless,” he adds, “it’s a good effort, and a good thing to do.
It’s almost certain it will allow us to improve coarse-grid models.”

In all these efforts, deep learning might be a solution for areas of the climate picture for which we don’t understand the physics.
No one has yet devised equations for how microbes in the ocean feed into the carbon cycle and in turn impact climate change, notes Pritchard.
So, since there isn’t a rulebook, AI could be the most promising way forward.
“If you humbly admit it’s beyond the scope of our physics, then deep learning becomes really attractive,” Pritchard says.

Bretherton makes the bullish prediction that in about three years a major climate-modeling center will incorporate machine learning.
If his forecast prevails, global-scale models will be capable of paying better attention to fine details — including the clouds overhead.
And that would mean a far clearer picture of our future climate.

Links :

Monday, December 10, 2018

How ordinary ship traffic could help map the uncharted Arctic Ocean seafloor

A cargo ship sails through multi-year ice in Canada’s Northwest Passage.
(Timothy Keane / Fednav)

From Arctic Today by Melody Schreiber

Equipping every ship that enters the Arctic with sensors could help fill critical gaps in maritime charts.

Throughout the world, the ocean floor’s details remain largely a mystery; less than 10 percent has been mapped using modern sonar technology.
Even in the United States, which has some of the best maritime maps in the world, only one-third of the ocean and coastal waters have been mapped to modern standards.

This map shows unique ship visits to Arctic waters
between September 1, 2009, and December 31, 2016.

But perhaps the starkest gaps in knowledge are in the Arctic.
Only 4.7 percent of the Arctic has been mapped to modern standards.

“Especially when you get up north, the percentage of charts that are basically based on Royal Navy surveys from the 19th century is terrifying — or should be terrifying,” said David Titley, a retired U.S. Navy Rear Admiral who directs the Center for Solutions to Weather and Climate Risk at the Pennsylvania State University.
Titley spoke alongside several other maritime experts at a recent Woodrow Wilson Center event on marine policy, highlighting the need for improved oceanic maps.

 GeoGarage nautical raster chart coverage with material from international Hydrographic Offices
red : US NOAA / grey : Canada CHS / black : Denmark Greenland DGA / yellow : Norway NHS

 GeoGarage nautical raster chart coverage (NGA material)

 Catalogue of charts from
Department of Navigation and Oceanography of the Russian Federation

When he was on active duty in the Navy, Titley said, “we were finding seamounts that we had no idea were there.
And conversely, we were getting rid of seamounts on charts that weren’t there.”
The problem, he said, comes down to accumulating — and managing — data. But there could be an intriguing solution: crowdsourcing.
“How does every ship become a sensor?” Titley asks.
Ships outfitted with sensors could provide the very information they need to travel more effectively.

Each ship would collect information on oceans, atmosphere, ecosystems, pollutants and more.
As the ships traverse the ocean, they would help improve existing maps and information about the waters they tread.
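Turning opportunistic soundings into a chart means aggregating many noisy, overlapping reports into grid cells. A minimal sketch of that aggregation step — the positions and depths below are invented, and a real bathymetric pipeline would also correct for tides, sound speed, and sensor quality:

```python
from collections import defaultdict
from statistics import median

# Each sounding is (latitude, longitude, depth_in_meters), as might be
# reported by a ship's echo sounder along its transit. Values are invented.
soundings = [
    (70.51, -52.03, 412.0),
    (70.52, -52.04, 405.0),
    (70.58, -52.01, 398.0),
    (71.02, -51.63, 150.0),
    (71.03, -51.61, 148.0),
]

CELL = 0.1  # grid cell size in degrees

def cell_of(lat, lon):
    """Snap a position to the corner of its grid cell."""
    return (round(lat // CELL * CELL, 4), round(lon // CELL * CELL, 4))

# Aggregate overlapping reports with a median, which is robust to the
# occasional bad echo-sounder reading.
by_cell = defaultdict(list)
for lat, lon, depth in soundings:
    by_cell[cell_of(lat, lon)].append(depth)

grid = {cell: median(depths) for cell, depths in by_cell.items()}
for cell, depth in sorted(grid.items()):
    print(cell, depth)
```

Each ship that transits a cell adds another depth estimate, so heavily trafficked routes would converge on reliable values first — exactly where accurate charts matter most.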

Maps are becoming more important as shipping activity increases — both around the world and in the Arctic.

In August, the Russian research ship Akademik Ioffe ran aground in Canada’s Arctic. In 2015, the Finnish icebreaker Fennica ripped a three-foot gash in its hull — while sailing within the relatively better charted waters of Alaska’s Dutch Harbor.

“The traditional way that we have supplied these ships with information — with nautical charts and predicted tides and tide tables, and weather over radio facts — are not anywhere near close to being what’s necessary,” said Rear Admiral Shep Smith, director of NOAA’s Office of Coast Survey.
The “next generation of services” would go much further, predicting the water level, salinity, and other information with more precision and detail.
One of NOAA’s top priorities, Smith said, is “the broad baseline mapping of the ocean — including the hydrography, the depth and form of the sea floor, and oceanography.”
Such maps are necessary to support development, including transportation, offshore energy, fishing and stewardship of natural resources, he said.

 A team of engineers and students from the University of New Hampshire’s Center for Coastal and Ocean Mapping recently returned from a voyage that deployed the first autonomous (robotic) surface vessel — the Bathymetric Explorer and Navigator (BEN) — from a NOAA ship far above the Arctic Circle. Credit: Courtesy Christina Belton, NOAA

In its records of U.S. waters and coasts, NOAA has at least one piece of information for only 41 percent of the ocean.
“The other 59 percent, there’s potentially a gold mine of economically important information in there,” he continued. “Or environmentally important information.”
NOAA struggles even to model how water moves in the ocean without more information, he said.

They are turning to crowdsourcing, satellite-derived bathymetry — and the idea of turning every ship into a sensor.
Projects like Seabed 2030 — a worldwide effort to map the seabed — will be crucial to these efforts, Smith said.
“It’s hard to map the bottom of the ocean,” said Rear Admiral Jon White, president and CEO of the Consortium for Ocean Leadership.
“It’s like trying to map your backyard with ants, with the ships that we have.”

However, he said, the technology to do so is improving.
“There are great opportunities for the people who understand this technology, to make new ways, better ways to actually map it faster,” White said.
Moving forward, he said, both federal investment and public-private partnerships should focus on “getting every ship to be a sensor in the ocean.”
That effort will be crucial for accomplishing “all the things that we’re trying to do in the maritime environment,” he said.


Sunday, December 9, 2018

Pearl Harbor WWII maps

On the 77th anniversary of the attack on Pearl Harbor:
Japanese Commander Mitsuo Fuchida’s after-action damage assessment map of the 1941 attack, which was presented to Emperor Hirohito

Japanese map of Pearl Harbor that was found in a captured midget sub after the attack
77 years ago the Empire of Japan attacked the US Pacific Fleet at Pearl Harbor.

 by Brenda Lewis & Rupert Matthews 
This map and aerial photo show the catastrophic damage to Battleship Row. Source: National Geographic

 NOAA map 19366 with the GeoGarage platform

Links :

Saturday, December 8, 2018

Bathymetry in Australia

Flythrough movie of Gifford Marine Park, which is located 600 km east of Brisbane, Australia.
The park is situated about halfway along the Lord Howe Rise seamount chain on the western flank of the Lord Howe Rise. 
Seamounts along this chain formed from Miocene volcanism via a migrating magma source (“hotspot”) after the opening of the Tasman Sea. 
Two large, flat-topped volcanic seamounts dominate the park. 
Their gently sloping summits have accumulated veneers of sediment, which in places have formed fields of bedforms. 
Steep cliffs, debris and large mass movement scars encircle each seamount, and contrast with the lower gradient abyssal plains from which they rise. 
Spanning over 3 km of ocean depths, the seamounts are likely to serve multiple and important roles as breeding locations, resting areas, navigational landmarks or supplementary feeding grounds for some cetaceans (e.g. humpback whales, sperm whales). 
They may also act as important aggregation points for other highly migratory pelagic species. 
The bathymetry shown here was collected on two surveys - the first in 2007 by Geoscience Australia and the second in 2017 by Geoscience Australia in collaboration with the Japan Agency for Marine-Earth Science and Technology. 
The Gifford Marine Park has also been the focus of a study undertaken by the Marine Biodiversity Hub as part of the National Environmental Science Program.

Flythrough movie of Perth Canyon Marine Park, southwest Western Australia showing seafloor bathymetry and marine life that occurs within the park.
The park encompasses a diversity of geomorphic features, ranging from gently sloping soft sediment plains to near-vertical towering cliffs of exposed bedrock.
This geodiversity extends from the head of Perth Canyon at the shelf break to the slope-confined submarine canyons that dissect the lower continental slope.
Spanning almost 4.5 km of ocean depths, the Perth Canyon has a significant influence on the local ecosystem across the food chain.
The size and location of the canyon is such that it promotes upwelling from the deep ocean, leading to plankton blooms that attract seasonal aggregations of larger pelagic fish, including whales.
Over geological time, the canyon has evolved to provide extensive areas of potential seabed habitat suitable for deep-sea corals and sponges.
The Perth Canyon has been the focus of a study undertaken by the Marine Biodiversity Hub as part of the National Environmental Science Program.

Flythrough movie of Bremer Commonwealth Marine Reserve, southwest Western Australia showing bathymetry of Bremer Canyon, Hood Canyon, Henry Canyon and Knob Canyon.
These canyons are part of the Albany Group of 81 canyons that extend along the continental margin of southwest Australia reaching to water depths of 4000 m.
The Bremer Canyon is one of the few canyons in the group that have incised into the continental shelf, providing a pathway for upwelling of nutrient rich waters to the shelf.
This upwelling is thought to form the basis for aggregations of marine life around the Bremer and adjacent canyons, including orca whales and giant squid.
The Bremer offshore region has been the focus of a study undertaken in 2017 by the Marine Biodiversity Hub as part of the National Environmental Science Program.

Friday, December 7, 2018

The new American weather model shone during Hurricane Lane

Satellite view of Hurricane Lane on Aug. 21.
(Cooperative Institute for Meteorological Satellite Studies)

From WashingtonPost by Jason Samenow

It’s well established that the European weather model, on average, produces the most accurate weather forecasts in the world.
For years, the American model, run by the National Weather Service, has ranked third-best.

The also-ran status of the American model, known as the Global Forecast System (GFS), has caught the attention of Congress, which has appropriated money to the Weather Service to improve our nation’s weather modeling on multiple occasions.
In addition, the Trump administration has stated that building the best prediction model in the world is a “top priority.”
[Trump administration official says it’s a ‘top priority’ to improve U.S. weather forecasting model]

A new analysis of model performance during Hurricane Lane, which unloaded historic amounts of rain on Hawaii’s Big Island, shows that the Weather Service may be making progress.

The Weather Service has developed a new version of the GFS, known as the FV3 (which stands for Finite Volume Cubed-Sphere dynamical core), which it touts as “its next-generation global prediction system.”
While still considered experimental, the FV3 produced the most consistently accurate forecasts of Lane’s track.

 Second only to Katrina in damage cost, Harvey hit the Texas coast as expected.
It stalled for four days, dumped over 60 inches of rain and caused severe flooding

Despite warnings that Maria would hit Puerto Rico, emergency responders were not prepared.
The entire island lost power, clean water and cell service
Five days before hitting Florida, models showed Irma going east.
As it veered west, so too did evacuation orders.
All told, a third of Floridians were mandated to leave 

We obtained a Weather Service chart displaying the track errors for each of the models at different points in time.
Track errors tend to be large for forecasts made several days ahead of time and shrink as the forecast lead time grows shorter.
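Track error itself is a simple quantity: the great-circle distance between a model's forecast storm center and where the storm actually ended up at the valid time. A minimal sketch using the standard haversine formula (the coordinates in the example are invented, not Lane's actual positions):

```python
import math

EARTH_RADIUS_KM = 6371.0

def track_error_km(forecast, observed):
    """Great-circle distance (km) between forecast and observed storm centers.

    Each argument is a (latitude, longitude) pair in decimal degrees,
    compared with the haversine formula.
    """
    lat1, lon1, lat2, lon2 = map(math.radians, (*forecast, *observed))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Hypothetical 96-hour forecast position vs. where the storm actually was:
print(round(track_error_km((17.5, -150.0), (18.3, -151.2)), 1))
```

Averaging this distance over many forecasts at each lead time (24, 48, 72 hours and so on) yields exactly the kind of model-by-model comparison shown in the Weather Service chart.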

(National Weather Service)

NOAA's Geophysical Fluid Dynamics Laboratory (GFDL) Research Team leader, Shian-Jiann Lin, Ph.D, is behind the new FV3-powered GFDL model.
The GFDL model is designed to improve the global weather forecast model by enhancing short-term forecasts and long-term climate prediction.
For more information about the GFDL model and how it will improve the Global Forecast System, see NOAA’s GFDL website.

The FV3 produced the most accurate forecasts (or smallest track errors) made four (96 hours) and five (120 hours) days into the future, and was neck and neck with the European model and National Hurricane Center forecasts within 72 hours.

The European model, which is run by the European Center for Medium-Range Weather Forecasts in Reading, United Kingdom, had large errors in its forecasts four and five days out but exhibited the skill it is known for within 72 hours as the top performer.

The U.K. Met model, which is the second-most-accurate model in the world and is run by the U.K. Met Office in Exeter, trailed the performance of the European, Hurricane Center and FV3 model forecasts at all times.

 National Weather Service

The current, operational version of the American GFS model had just about the worst forecast performance at every step.
The related American HWRF model, which is a specialized model for hurricanes, also performed poorly, ranking second to last.
Some of its input data come from the GFS, which explains why both models performed comparably poorly.

Although the FV3’s results were very promising for Hurricane Lane, they reflect just one very limited case.
To be convinced that this new modeling system might close the gap with the European model, we will need to see such performance repeated storm after storm and in everyday weather situations, from the tropics to the poles.

The target date for the FV3 to become operational is late 2019.

Links :