Wednesday, August 9, 2017

Robot revolution: new generation of cheap drones to explore the seas

The Spotter’s sensors collect data on ocean conditions and beam the information via satellite to scientists’ laptops and smartphones
Spoondrift

From NewsDeeply by Matthew O. Berger

Blue technologies being developed in the San Francisco Bay Area aim to give scientists and citizens low-cost tools to gather and share high-quality data on ocean conditions.

While the waves that once a year become the monster swells ridden by surfers in the Mavericks surf contest roll toward the harbor of this small fishing town south of San Francisco, oceanographer Tim Janssen sits in an office a block from the sea with a handful of colleagues and two dogs.
They’re working on a small sensor-laden device he hopes to deploy by the thousands to gather data on those waves and other ocean conditions.
Called the Spotter, the yellow, space-capsule-shaped float is about the size of a beach ball.
Solar panels keep its batteries charged and the data gathered by its sensors is beamed via satellite to scientists’ laptops and smartphones.
The Spotter is part of an explosion of new, cheaper tools for oceanographic research, giving scientists access to more real-time data about the ocean.

“There’s no better time to have this tech revolution happen than right now,” says Douglas McCauley, a marine biologist at the University of California, Santa Barbara.
He also serves as a director of the Benioff Ocean Initiative, which aims to spur technological innovation to address ocean acidification, rising water temperatures, overfishing and other threats to the ocean.

The types of technologies being developed mirror terrestrial innovations – drones, autonomous vehicles, smartphones.
“Oceanographers, because of limited resources, have always tried to get by with less,” says Mark Schrope, program director of Schmidt Marine Technology Partners in San Francisco, which funds ocean technology startups.
“Whatever it is – ocean conservation, ocean data – there’s some technology on land that could really transform that area.”

The San Francisco Bay Area, home to a concentration of engineers, entrepreneurs and marine scientists, is emerging as a center of this new wave of blue technology.

On the other side of the peninsula from Silicon Valley, Janssen’s startup, Spoondrift, will start shipping its $6,000 Spotter this fall.
His ultimate vision is a constellation of data-gathering Spotters deployed across the ocean that send back a wealth of high-resolution information that can be analyzed in real time.
The current version of the Spotter gathers data on wave height, peak period, peak direction and location.
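A report from a buoy like this can be pictured as a small timestamped record. The sketch below is illustrative only; the field names are assumptions that mirror the quantities listed above, not Spoondrift’s actual data format.

```python
from dataclasses import dataclass

@dataclass
class WaveReport:
    """One hypothetical report from a wave-measuring buoy.

    Field names are assumptions mirroring the quantities the Spotter
    is said to measure; they are not an actual Spoondrift schema.
    """
    timestamp_utc: str          # e.g. "2017-08-09T14:30:00Z"
    latitude: float             # decimal degrees
    longitude: float            # decimal degrees
    wave_height_m: float        # significant wave height, meters
    peak_period_s: float        # peak wave period, seconds
    peak_direction_deg: float   # peak wave direction, degrees true

report = WaveReport("2017-08-09T14:30:00Z", 37.46, -122.43, 2.1, 14.0, 285.0)
print(report)
```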


Janssen pulls up a map of a network of data-collecting buoys operated by the National Oceanic and Atmospheric Administration.
The red and yellow dots representing the buoys are clustered along the coasts.
But most of the open ocean remains a blue void.
Even closer to shore, a buoy may be the only one for miles collecting data on sea surface temperatures and wave activity.

Janssen, an oceanographer at San Francisco State University before starting Spoondrift in 2016, says marine scientists have “learned to live with very sparse data.”
“Everyone is building their own instruments but building them for themselves,” he says.
“We’re taking it one step further.”

More sensors would mean higher-resolution data on ocean acidification, surface temperatures and other marine conditions, showing how variables differ from one spot to another (or telling you the exact conditions at your favorite surf break or fishing spot).
“If you want to find why, say, this coral is bleaching and that isn’t, you’ll need lots of sensors to be able to rule out temperature,” says Schrope.
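To make the density argument concrete, here is a minimal sketch of inverse-distance weighting, one standard way to estimate conditions between sensors; the buoy positions and temperatures are made up. With only a handful of distant stations, every estimate is an average of far-away readings, which is exactly the problem more sensors solve.

```python
import math

def idw_estimate(point, stations, power=2):
    """Inverse-distance-weighted estimate at `point`.

    `stations` is a list of (x, y, value) tuples. Nearby sensors
    dominate the estimate, so a denser network resolves finer
    spatial variation. Planar geometry, for simplicity.
    """
    num = den = 0.0
    for x, y, value in stations:
        d = math.hypot(point[0] - x, point[1] - y)
        if d == 0:
            return value          # query point sits on a sensor
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Made-up sea-surface temperatures (deg C) from three sparse buoys:
buoys = [(0.0, 0.0, 14.2), (80.0, 10.0, 15.1), (30.0, 60.0, 13.8)]
print(round(idw_estimate((25.0, 20.0), buoys), 2))
```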


Real-time data access

Janssen brings up a map created by a fleet of sensors Spoondrift sent out from the mouth of the Columbia River in Oregon, a notoriously choppy and unpredictable patch of water to navigate and study.
They sent out so many sensors that the data collected was sufficiently high-resolution for the team to recreate a simulated image of what the waves looked like.
Making images like that isn’t in the company’s near-term plans as it is focused on obtaining high-resolution data that can yield a complete picture of ocean surface conditions.


Easy to use and deploy

Spoondrift says the Spotter costs about a tenth of the price of a weather buoy and does not require a large vessel and winch to deploy it.
The 12lb (5.4kg) device can simply be picked up and dropped into the water.

McCauley notes that his research has been hobbled by a lack of high-quality data about what’s happening physically in the habitats he studies as the cost of sensors prevents their widespread deployment.

The vastness of the ocean – covering more than two-thirds of the planet’s surface – means that scientists need help from both new technologies and a growing legion of smartphone-wielding citizen scientists.

To try to determine the size of giant sea bass populations, McCauley has been tracking fish based on the unique markings each individual sports.
With the help of software, his team has reviewed thousands of photos of giant sea bass posted online by recreational divers, vacationers and whoever else shares photos of big fish.
“It’s about being able to leverage the power of people and our tendencies to post what we see,” he says.

OpenROV, a Berkeley, California startup funded by Schmidt Marine Technology Partners, hopes its technology will spur many more ocean images.
The company makes a tethered “underwater drone” that streams video of life under the sea to the operator’s smartphone or tablet, which is also used to control the device.
OpenROV’s newest model, the Trident, costs $1,500 and looks like a swimming Wi-Fi router with headlights.
It can venture to depths of 328ft (100m) and hits the market in August.


Cofounder David Lang was working at a sailing school when the urge to explore under the waves hit him. But he and his friends couldn’t afford a remotely operated vehicle (ROV).
“The technology had been around for a while, but it was expensive,” he says.
“So we started creating our own,” starting with a DIY kit and later a consumer product.
“But now we’re hearing from lots of people who actually need it,” adds Lang, including climate change researchers in Maine and biologists studying the foraging of minke whales.

A few years ago underwater videos like those taken with OpenROV’s drones would have only been affordable for a select few.
Today, they’re “within the reach of well-funded elementary schools and Girl Scouts who are really good at selling cookies,” McCauley says.

And more people taking and sharing underwater images would mean more data points for his research.
“If you think about engaging citizen scientists, you have to figure out what kind of data they would find interesting,” McCauley says.
“They’re not necessarily interested in salinity data, but are interested in photos and videos.”

These new exploration technologies are taking a range of forms.
A humanoid robot diver developed by a Stanford University team can spend more time at depths than would be safe for a human diver.
Autonomous ships are set to start sailing the seas in 2018.
Saildrone’s autonomous wind-powered mini-sailboats can be launched to collect a range of data.
A surfboard fin that contains a sensor leverages the amount of time surfers spend in the water to collect data on ocean salinity, pH, temperature and waves.
The wave-and-solar-powered Wave Glider made by Boeing’s Liquid Robotics can roam the world’s oceans for up to a year at a time collecting data on climate change and other conditions.

Kipp Shearman, a physical oceanographer at Oregon State University, has used the Wave Glider in his research, outfitting it with different sensors to monitor ocean-atmospheric interactions and deploying the surfboard-sized robot for long periods of time.
But he’s also noticed advances in more traditional technology.
Since 2006, he has used undersea gliders, autonomous missile-shaped capsules that can stay in the water for weeks, transmitting data back to shore.
In recent years, he’s seen onboard computing power increase, allowing the gliders to “fly” in more complex ways and avoid shipping lanes.
High-density lithium batteries now allow them to operate autonomously for about a year.
“It’s been a remarkable evolution in that technology over the last 10 years,” Shearman says.
“It’s getting us to the point where I think we’re going to see a lot more persistent observation out in the ocean.”
Shearman says his colleague Jonathan Nash recently showed him pictures taken at the base of glaciers, where it would be too dangerous to send a research ship.
Nash took the photos remotely with a motorized, autonomous kayak.

The challenge now is to make sure all this data is of sufficiently high quality to be useful and that there are adequate resources to process the information.
“If I’m out there collecting pH data with a pH sensor,” McCauley says, “I know exactly what went wrong” and can disregard suspect data or factor in margins of error or variance.
But that’s harder to do with a stream of crowd-sourced data from an array of sensors.
“If some of the sensors are good and some aren’t, for instance, that could be a real train wreck,” he says.
Care will also need to be taken to roll out the new technology in a way that doesn’t interfere with the environments it is studying.
“Drones are an amazing resource for learning about terrestrial wildlife, but if everyone has one they begin to disrupt the ecosystems they’re trying to study,” says McCauley.
“The oceans are big, but often smaller than we expect, at least the parts that we find interesting. But right now the problem is not sensor traffic.”
A nearer-term challenge is processing all the data that’s starting to come in.
“One of the great challenges now on the backend, on my side, is how to deal with that fire hose of data,” says McCauley.
“Data is only good if you can process it to get an answer out of it.”
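A standard first defense against the mixed-quality feeds McCauley describes is robust outlier screening. Below is a minimal sketch using the median absolute deviation; the cutoff value is an assumption to be tuned per deployment, not anything from his lab.

```python
import statistics

def flag_outliers(readings, cutoff=3.5):
    """Flag readings far from the median, in robust units.

    The median absolute deviation (MAD) is used instead of the
    standard deviation because a few bad sensors cannot drag it
    around; 1.4826 scales MAD to match a normal distribution.
    """
    med = statistics.median(readings)
    mad = statistics.median(abs(r - med) for r in readings)
    if mad == 0:
        return [False] * len(readings)
    return [abs(r - med) / (1.4826 * mad) > cutoff for r in readings]

# Nine plausible temperatures (deg C) and one failing sensor:
temps = [14.1, 14.3, 14.0, 14.2, 14.4, 14.1, 14.2, 14.3, 14.0, 21.7]
print(flag_outliers(temps))  # only the 21.7 reading is flagged
```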

Schrope sees a “ton of potential” to develop artificial intelligence to catalog and analyze ocean data and make it easily searchable.
That may be the next step in the blue tech revolution.


Tuesday, August 8, 2017

Cyber threats prompt return of radio for ship navigation

Nautical chart including Loran TD lines for ocean approaches to New York Harbor.
The chart shows TD lines, apparently for LORAN-A, which would make it the Nantucket-Chatham-Montauk-Sandy Hook-Fenwick-Bodie Is.-Cape Hatteras chain.
Note that the printed TD lines do not extend into inland waterway areas, as LORAN propagates poorly over land.
The green 1000 lines curve heavily in this area.
Note the "LORAN TR" mark at the tip of Sandy Hook near the focus of the curves.
This would be Station "J" (3H5).
The ochre 4000 lines (3H4) would correspond to the TD between the master and Station "H" at Cape Hatteras.
The master station of this chain was at Sankaty Head on Nantucket, Massachusetts.
The sharp angle between these sets of TD rings, especially to the east and north, would make it a poor pair for precise navigation.
Note: this is an old chart (the current chart, available in the GeoGarage platform, no longer shows the Loran hyperbolas)
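For readers new to the chart above: each printed TD (time difference) line is a hyperbola, the set of positions where the master’s signal leads the secondary’s by one fixed interval. A minimal sketch of the geometry, on a flat plane with idealized propagation (real Loran corrects for geodesics and for land versus seawater paths):

```python
import math

C_KM_PER_US = 0.299792458  # free-space speed of light, km per microsecond

def time_difference_us(receiver, master, secondary):
    """TD (microseconds) a receiver would observe for one station pair.

    All positions sharing a TD lie on one hyperbola -- one printed TD
    line; crossing lines from a second pair gives a position fix.
    Coordinates are planar (km), for illustration only.
    """
    d_master = math.dist(receiver, master)
    d_secondary = math.dist(receiver, secondary)
    return (d_secondary - d_master) / C_KM_PER_US

# Made-up planar positions (km): master, a secondary, and a ship.
print(round(time_difference_us((120.0, 45.0), (0.0, 0.0), (250.0, 10.0)), 1))
```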

From Reuters by Jonathan Saul

The risk of cyber attacks targeting ships' satellite navigation is pushing nations to delve back through history and develop back-up systems with roots in World War Two radio technology.

Ships use GPS (Global Positioning System) and other devices that rely on satellite signals, which many experts say are vulnerable to jamming by hackers.

About 90 percent of world trade is transported by sea and the stakes are high in increasingly crowded shipping lanes.
Unlike aircraft, ships lack a back-up navigation system and if their GPS ceases to function, they risk running aground or colliding with other vessels.


South Korea is developing an alternative system using an earth-based navigation technology known as eLoran, while the United States is planning to follow suit.
Britain and Russia have also explored adopting versions of the technology, which works on radio signals.

The drive follows a series of disruptions to shipping navigation systems in recent months and years.
It was not clear if they involved deliberate attacks; navigation specialists say solar weather effects can also lead to satellite signal loss.

Last year, South Korea said hundreds of fishing vessels had returned early to port after their GPS signals were jammed by hackers from North Korea, which denied responsibility.

In June this year, a ship in the Black Sea reported to the U.S. Coast Guard Navigation Center that its GPS system had been disrupted and that over 20 ships in the same area had been similarly affected.
U.S. Coast Guard officials also said interference with ships' GPS disrupted operations at a port for several hours in 2014 and at another terminal in 2015. They did not name the ports.

A cyber attack that hit A.P. Moller-Maersk's IT systems in June 2017 and made global headlines did not involve navigation but underscored the threat hackers pose to the technology dependent and inter-connected shipping industry. It disrupted port operations across the world.

The eLoran push is being led by governments who see it as a means of protecting their national security.
Significant investments would be needed to build a network of transmitter stations to give signal coverage, or to upgrade existing ones dating from the decades when radio navigation was standard.

U.S. engineer Brad Parkinson, known as the "father of GPS" and its chief developer, is among those who have supported the deployment of eLoran as a back-up.

"ELoran is only two-dimensional, regional, and not as accurate, but it offers a powerful signal at an entirely different frequency," Parkinson told Reuters.
"It is a deterrent to deliberate jamming or spoofing (giving wrong positions), since such hostile activities can be rendered ineffective," said Parkinson, a retired U.S. airforce colonel.

 This is the way we used to find our way around.

Korean stations

Cyber specialists say the problem with GPS and other Global Navigation Satellite Systems (GNSS) is their weak signals, which are transmitted from 12,500 miles above the Earth and can be disrupted with cheap jamming devices that are widely available.

Developers of eLoran - the descendant of the loran (long-range navigation) system created during World War II - say it is difficult to jam as the average signal is an estimated 1.3 million times stronger than a GPS signal.
To do so would require a powerful transmitter, large antenna and lots of power, which would be easy to detect, they add.
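That power ratio is easier to read in decibels, the usual unit for comparing signal strengths; the quick conversion below shows the quoted figure amounts to roughly a 61 dB advantage.

```python
import math

# "1.3 million times stronger" expressed in decibels:
ratio = 1.3e6
print(f"{10 * math.log10(ratio):.1f} dB")  # ~61.1 dB above a GPS signal
```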

Shipping and security officials say the cyber threat has grown steadily over the past decade as vessels have switched increasingly to satellite systems; paper charts have largely disappeared, and traditional navigation skills among seafarers have faded.
"My own view, and it is only my view, is we are too dependent on GNSS/GPS position fixing systems," said Grant Laversuch, head of safety management at P&O Ferries.
"Good navigation is about cross-checking navigation systems, and what better way than having two independent electronic systems."

Lee Byeong-gon, an official at South Korea's Ministry of Oceans and Fisheries, said the government was working on establishing three sites for eLoran test operations by 2019 with further ones to follow after that.
But he said South Korea was contending with concerns from local residents at Gangwha Island, off the west coast.
"The government needs to secure a 40,000 pyeong (132,200 square-meter) site for a transmitting station, but the residents on the island are strongly opposed to having the 122 to 137 meter-high antenna," Lee told Reuters.

In July, the United States House of Representatives passed a bill which included provisions for the U.S. Secretary of Transportation to establish an eLoran system.
"This bill will now go over to the Senate and we hope it will be written into law," said Dana Goward, president of the U.S. non-profit Resilient Navigation and Timing Foundation, which supports the deployment of eLoran.
"We don't see any problems with the President (Donald Trump) signing off on this provision."
The previous administrations of Presidents George W. Bush and Barack Obama both pledged to establish eLoran but never followed through.
However, this time there is more momentum.
In May, U.S. Director of National Intelligence Daniel Coats told a Senate committee the global threat of electronic warfare attacks against space systems would rise in coming years.
"Development will very likely focus on jamming capabilities against ... Global Navigation Satellite Systems (GNSS), such as the U.S. Global Positioning System (GPS)," he said.

 Differential eLoran operation concept
(graphic courtesy UrsaNav).

Spoofing dangers

Russia has looked to establish a version of eLoran called eChayka, aimed at the Arctic region as sea lanes open up there, but the project has stalled for now.
"It is obvious that we need such a system," said Vasily Redkozubov, deputy director general of Russia's Internavigation Research and Technical Centre.
"But there are other challenges apart from eChayka, and (Russia has) not so many financial opportunities at the moment."

Cost is a big issue for many countries.
Some European officials also say their own satellite system, Galileo, is more resistant to jamming than other systems.
But many navigation technology experts say the system is hackable.
"Galileo can help, particularly with spoofing, but it is also a very weak signal at similar frequencies," said Parkinson.

 The red track is based on raw eLoran data without any corrections.
The transparent blue line is made by GPS-RTK and is widened to 10 meters giving the required ± 5 meter limits of eDLoran.
The white line is output from the eDLoran receiver which stays within the borders of the 10 meter wide transparent blue line.
Source: GPS World

The reluctance of many countries to commit to a back-up means there is little chance of unified radio coverage globally for many years at least; instead there will be disparate areas of coverage, including across some national territories and shared waterways.

The General Lighthouse Authorities of the UK and Ireland had conducted trials of eLoran but the initiative was pulled after failing to garner interest from European countries whose transmitters were needed to create a signal network.
France, Denmark, Norway and Germany have all decided to turn off or dismantle their old radio transmitter stations.
Britain is maintaining a single eLoran transmitter in northern England.

Taviga, a British-U.S. company, is looking to commercially operate an eLoran network, which would provide positioning, navigation and timing (PNT).
"There would need to be at least one other transmitter probably on the UK mainland for a timing service," said co-founder Charles Curry, adding that the firm would need the British government to commit to using the technology.

Andy Proctor, innovation lead for satellite navigation and PNT with Innovate UK, the government's innovation agency, said: "We would consider supporting a commercially run and operated service, which we may or may not buy into as a customer."
Current government policy was "not to run large operational pieces of infrastructure like an eLoran system", he added.


Monday, August 7, 2017

Planet has just 5 percent chance of reaching Paris goal

Earth's horizon.
Credit: NASA Goddard/flickr

From The Guardian by Oliver Milman

There is only a 5% chance that the Earth will avoid warming by at least 2C come the end of the century, according to new research that paints a sobering picture of the international effort to stem dangerous climate change.

Global trends in the economy, emissions and population growth make it extremely unlikely that the planet will remain below the 2C threshold set out in the Paris climate agreement in 2015, the study states.

The Paris accord, signed by 195 countries, commits to holding the average global temperature to “well below 2C” above pre-industrial levels and sets a more aspirational goal to limit warming to 1.5C.
This latter target is barely plausible, the new research finds, with just a 1% chance that temperatures will rise by less than 1.5C.
“We’re closer to the margin than we think,” said Adrian Raftery, a University of Washington academic who led the research, published in Nature Climate Change.
“If we want to avoid 2C, we have very little time left. The public should be very concerned.”


Visualization based on GISTEMP data.
Credit: Antti Lipponen

Governments settled on the 2C threshold partly through political expediency but also because scientists have warned of severe consequences from sea level rise, drought, heatwaves and social unrest should the temperature rise beyond this.

Such risks have been underscored by a separate study, also released on Monday, that shows unabated climate change will cause around 60,000 deaths globally in 2030 and 260,000 deaths by 2100.
The study, by the University of North Carolina, found that rising temperatures will exacerbate air pollutants that will particularly threaten those with existing conditions.

According to the University of Washington study, there is a 90% likelihood that temperatures will rise between 2C and 4.9C by 2100.
This would put the world in the mid-range warming scenarios mapped out by the UN’s Intergovernmental Panel on Climate Change.
It rules out the most optimistic outcome as well as the worst case, which would see temperatures climb nearly 6C beyond the pre-industrial era.

Exploring extreme sea level in 3D and the stakes for America
Step-by-step instructions on how to use Climate Central's NOAA extreme sea level rise layer within Google Earth for Chrome.

Rather than look at how greenhouse gases will influence temperature, the new research analyzed the past 50 years of trends in world population, per capita gross domestic product (GDP) and carbon intensity, which is the amount of carbon dioxide emitted for each dollar of economic activity.

After building a statistical model covering a range of emissions scenarios, the researchers found that carbon intensity will be a crucial factor in future warming.
Technological advances are expected to cut global carbon intensity by 90% over the course of the century, with sharp declines in China and India – two newly voracious consumers of energy.
However, this decline still will not be steep enough to avoid breaching the 2C limit.
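The bookkeeping behind such projections is commonly written as the Kaya identity: emissions equal population times GDP per capita times carbon intensity. A minimal sketch with round, illustrative world-scale numbers (not the study’s data):

```python
def annual_emissions_gt(population, gdp_per_capita_usd, kg_co2_per_usd):
    """Kaya-identity emissions estimate, in gigatonnes of CO2 per year.

    emissions = population x (GDP per person) x (CO2 per dollar of GDP).
    Inputs below are round illustrative figures, not the study's data.
    """
    kg = population * gdp_per_capita_usd * kg_co2_per_usd
    return kg / 1e12  # kilograms -> gigatonnes

# Roughly world-scale 2017 values: 7.5 billion people, ~$10,000 GDP
# per person, ~0.5 kg of CO2 emitted per dollar of output.
print(round(annual_emissions_gt(7.5e9, 10_000, 0.5), 1))  # ~37.5 Gt/yr
```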

The world’s population is expected to grow to about 11 billion people by 2100, but the research found that this will have a relatively small impact upon temperatures as much of this growth will take place in sub-Saharan Africa, which is a minor contributor of greenhouse gas emissions.

It has long been acknowledged that emissions cuts promised under the Paris agreement would not be sufficient to avoid 2C warming.
However, it is hoped that periodic reviews of commitments will result in more severe reductions.

Donald Trump’s pledge to remove the US, the world’s second-largest emitter, from the accord has cast a large shadow over these ambitions.

“Even if the 2C target isn’t met, action is very important,” said Raftery.
“The more the temperature increases, the worse the impacts will be.
“We would warn against any tendency to use our results to say that we won’t avoid 2C, and so it’s too late to do anything. On the contrary, avoiding the higher temperature increases that our model envisages is even more important, and also requires urgent action.”

Raftery acknowledged that a breakthrough technology could “dramatically” change the outlook but noted that major advances of the past 50 years, such as the computer, robotics, hybrid cars, the internet and electronic fuel injection, have improved carbon efficiency steadily at around 2% a year, rather than in huge jumps.
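The arithmetic behind that observation is simple compounding: a steady 2 percent annual improvement, held for a century, removes roughly 87 percent of carbon intensity, close to the ~90 percent decline the study projects.

```python
# Fraction of carbon intensity left after 100 years of steady 2%/yr gains:
remaining = (1 - 0.02) ** 100
print(f"{remaining:.2f}")  # ~0.13, i.e. roughly an 87% decline
```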

Andrew Dessler, a climate scientist at Texas A&M University who was not involved in the study, said the research’s conclusions were “reasonable” but said it was difficult to assign a precise probability to future temperature rises.
“I agree that staying below 2C and 1.5C are unlikely and very, very unlikely, respectively,” he said. “But this research gives a false sense of rigor. Tomorrow someone could invent a carbon-free energy source that everyone adopts.
“If you look at technology adoption and action taken on the ozone layer and acid rain, it’s clear these things can change faster than people predict.”

Dessler said the falling cost of renewable energy would be a major factor in reducing emissions but further impetus would be needed through new actions such as a price on carbon.
“It’s like you’re driving and about to collide with the car in front of you,” he said.
“You want to hit the brakes as fast as you can. The later you wait, the more painful it’s going to be.”

John Sterman, an academic at the MIT Sloan Sustainability Initiative, said the research was an “urgent call to action”.
MIT research has shown that emissions cuts in the Paris agreement would stave off around 1C of temperature increase by 2100 – findings misrepresented by Trump when he announced the US departure from the pact.

Sterman said the US must “dramatically speed the deployment of renewable energy and especially energy efficiency.
Fortunately, renewables, storage and other technologies are already cheaper than fossil energy in many places and costs are falling fast.
More aggressive policies are urgently needed, but this study should not be taken as evidence that nothing can be done.”


Sunday, August 6, 2017

Fractal : storm lapse

The ingredient-based explanation for supercell thunderstorms cites moisture, wind shear, instability and lift as the reasons for their formation.
I prefer to focus on the big picture.
Supercell thunderstorms are a manifestation of nature's attempt to correct an extreme imbalance.
The ever-ongoing effort to reach equilibrium, or maximum entropy, is what drives all of our weather, and the force with which the atmosphere tries to correct this imbalance is proportional to the gradient.
In other words, the more extreme the imbalance, the more extreme the storm.

This collection of timelapses was gathered over the last six years from Texas to North Dakota and everywhere in between.
The project started out as a way to see the life-cycles of these storms, just for my own enjoyment and to increase my understanding of them.
Over time, it morphed into an obsession with documenting as many photogenic supercells as I could, in as high a resolution as possible, so as to share with those who couldn't see first-hand the majestic beauty that comes alive in the skies above America's Great Plains every spring.
More than 100,000 miles on the road and tens of thousands of shutter clicks later, this is the result.
I hope you enjoy watching it as much as I enjoyed creating it.

Saturday, August 5, 2017