Wednesday, February 3, 2016

Coast Survey announces surveys by navigation response teams : NRT data will be used to update nautical charts


From NOAA

Coast Survey’s navigation response teams have proven their value time and again, especially after hurricanes, when ports suspend operations and shipping (or naval movements) ceases until Coast Survey’s small boats can locate underwater dangers to navigation.
But what do the six navigation response teams (NRTs) do during those long periods between deployments for maritime emergencies?
They are busy, mostly year-round, collecting hydrographic data for updating nautical charts.

Plans for 2016

Responding to requests from mariners around the country, Coast Survey has set some aggressive projects for the NRTs this year.
Starting from the Northeast and working our way around the coasts…

NEW YORK

Beginning in June and throughout the summer, NRT5 will survey the Hudson River, with a focus on the area from Albany to Kingston.
This is a continuation of the project started at the request of the Hudson River Pilots (as reported in NOAA plans multiyear project to update Hudson River charts).
We are planning to have Coast Survey research vessel Bay Hydro II join the NRT for most of the summer, to get as much new charting data as possible.
In October, NRT5 will move to Eastern Long Island Sound, to finish up some shallow survey work adjacent to recent NOAA Ship Thomas Jefferson’s extensive survey project.
The officer-in-charge of NRT5 is NOAA Lt. Andrew Clos.
The officer-in-charge of Bay Hydro II will be NOAA Ensign Sarah Chappel.

GEORGIA

In March, NRT2 starts a 16-month survey project in Saint Andrew Sound.
The area, which has significant traffic from small boats, tugs, and barges, is reportedly experiencing small boat groundings, and Coast Survey’s navigation manager in the area has received several requests for a modern survey.
Coast Survey will use the data to update NOAA chart 11504 and ENC US5GA12M, as well as other charts covering portions of the specific surveyed areas.
The existing charted soundings are from partial bottom coverage surveys dating back to the early 1900s.
NRT2 is led by Erik Anderson.


NRT1 will check 18-year-old reported depths to update the chart 11376 inset.


EASTERN GULF OF MEXICO – BILOXI AND MOBILE

NRT1 will spend March and April acquiring data off the coast of Biloxi, Mississippi, to update the Intracoastal Waterway chart 11372.
They will then move to Alabama for some long-overdue “chart clean up” work at the northern end of the Mobile Ship Channel, outside of the area controlled by the Army Corps of Engineers.
The Mobile project will investigate charted items, verify reported depths, and update older NOAA bathymetry (vintage 1961) that is depicted in the inset area of NOAA chart 11376.
Because the Mobile survey probably will not take the entire rest of the season (barring interruptions for hurricane response), we are assessing additional survey needs in the area.
NRT1 is led by Mark McMann.

WEST GULF OF MEXICO – TEXAS

NRT4 will spend all of 2016 surveying in Galveston Bay, including the bay entrance and newly charted barge channels along the Houston Ship Channel.
The team is working with Coast Survey’s navigation manager for Texas to identify additional charted features that require investigation to reduce localized chart clutter and improve chart adequacy.
NRT4 is led by Dan Jacobs.

NORTHERN CALIFORNIA

NRT6 is slated to survey the Suisun Bay anchorage used by MARAD’s National Defense Reserve Fleet, to acquire updated depths.
Afterwards, NRT6 will move throughout the bay area to address charting concerns reported by the San Francisco Bar Pilot Association near Pittsburg, Antioch, San Joaquin River, and Redwood City. Coast Survey will use the data to generally update NOAA chart 18652 and ENC US5CA43M, as well as larger scale charts of the specific surveyed areas.
NRT6 is led by Laura Pagano.

PACIFIC NORTHWEST

It has been a while since Coast Survey has had an operational NRT presence for Oregon and Washington, but this is the year we are bringing NRT3 back on line.
Team lead Ian Colvert is shaking the dust off NRT3 and preparing to restart survey operations.
He is working with the Coast Survey navigation manager to develop survey priorities for this summer and fall.

Charting the data

Once the navigation response teams process and submit the data acquired during the surveys, the information is further processed in Coast Survey’s Atlantic and Pacific hydrographic branches, and then submitted to our cartographers for application to the charts.
The turnaround time for updating the chart depends on the update calendars for each regional cartographic branch.
If the NRTs find any dangers to navigation, the information will be relayed to mariners through the Local Notice to Mariners postings and will be applied to NOAA’s electronic navigational charts (NOAA ENC®), online products, and print-on-demand paper charts.
Critical updates will be applied to charts more quickly than normal depth adjustments.

Studying the heart of El Niño, where its weather begins

The weather phenomenon known as El Niño can cause dramatic effects around the world.
Henry Fountain explains where it comes from.
By Henry Fountain, Aaron Byrd and Ben Laffin. Published September 9, 2014.

From NYTimes by Henry Fountain

A thousand miles south of Hawaii, the air at 45,000 feet above the equatorial Pacific was a shimmering gumbo of thick storm clouds and icy cirrus haze, all cooked up by the overheated waters below.
In a Gulfstream jet more accustomed to hunting hurricanes in the Atlantic, researchers with the National Oceanic and Atmospheric Administration were cruising this desolate stretch of tropical ocean where the northern and southern trade winds meet.
It’s an area that becalmed sailors have long called the doldrums, but this year it is anything but quiet.
This is the heart of the strongest El Niño in a generation, one that is pumping moisture and energy into the atmosphere and, as a result, roiling weather worldwide.

 A satellite image of the area of the Pacific where a NOAA research team would be flying. 
Kent Nishimura for The New York Times

The plane, with 11 people aboard including a journalist, made its way Friday on a long westward tack, steering clear of the worst of the disturbed air to the south.
Every 10 minutes, on a countdown from Mike Holmes, one of two flight directors, technicians in the rear released an instrument package out through a narrow tube in the floor.
Slowed by a small parachute, the devices, called dropsondes, fell toward the water, transmitting wind speed and direction, humidity and other atmospheric data back to the plane continuously on the way down.

Leonard Miller, a NOAA technician, left, testing an instrument package called a dropsonde that was set to be launched from the plane’s delivery system, right.
Credit Left, Henry Fountain/The New York Times; right, Kent Nishimura for The New York Times

The information, parsed by scientists and fed into weather models, may improve forecasting of El Niño’s effect on weather by helping researchers better understand what happens here, at the starting point.
“One of the most important questions is to resolve how well our current weather and climate models do in representing the tropical atmosphere’s response to an El Niño,” said Randall Dole, a senior scientist at NOAA’s Earth System Research Laboratory and one of the lead researchers on the project. “It’s the first link in the chain.”
An El Niño forms about every two to seven years, when the surface winds that typically blow from east to west slacken.
As a result, warm water that normally pools along the Equator in the western Pacific piles up toward the east instead.
Because of this shift, the expanse of water — which in this El Niño has made the central and eastern Pacific as much as 5 degrees Fahrenheit hotter than usual — acts as a heat engine, affecting the jet streams that blow at high altitudes.
That, in turn, can bring more winter rain to the lower third of the United States and dry conditions to southern Africa, among El Niño’s many possible effects.
Aided by vast processing power and better data, scientists have improved the ability of their models to predict when an El Niño will occur and how strong it will be.
As early as last June, the consensus among forecasters using models developed by NOAA, as well as other American and foreign agencies and academic institutions, was that a strong El Niño would develop later in the year, and it did.

But scientists have been less successful at forecasting an El Niño’s effect on weather.
This year, for instance, most models have been less certain about what it will mean for parched California.
So far, much of the state has received higher than usual precipitation, but it is still unclear whether Southern California, especially, will be deluged as much as it was during the last strong El Niño, in 1997-98.
Anthony Barnston, the chief forecaster at the International Research Institute for Climate and Society at Columbia University, who has studied the accuracy of El Niño modeling, said that so-called dynamical models, which simulate the physics of the real world, have recently done a better job in predicting whether an El Niño will occur than statistical models, which rely on comparisons of historical data.
With a dynamical model, Dr. Barnston said, data representing current conditions is fed into the model, and off it goes.
“You plug it in and you crank it forward in time,” he said.
This can be done dozens of times — or as often as money will allow — tweaking the data slightly each time and averaging the outcomes.
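The perturb-and-average procedure Dr. Barnston describes can be illustrated with a toy ensemble run. The model dynamics, member count, and perturbation size below are made-up stand-ins for illustration only, not anything NOAA or the IRI actually uses:

```python
import random

def toy_model(sst_anomaly_c, days):
    """Illustrative stand-in for a dynamical model: damped growth of a
    sea-surface-temperature anomaly. Real models integrate physics."""
    state = sst_anomaly_c
    for _ in range(days):
        state += 0.01 * state - 0.0002 * state ** 3  # toy dynamics
    return state

def ensemble_forecast(initial_anomaly_c, days, members=30, jitter=0.1, seed=42):
    """Run the model many times, tweaking the initial data slightly each
    time, then average the outcomes -- the procedure described above."""
    rng = random.Random(seed)
    outcomes = [
        toy_model(initial_anomaly_c + rng.gauss(0.0, jitter), days)
        for _ in range(members)
    ]
    return sum(outcomes) / len(outcomes)

forecast = ensemble_forecast(initial_anomaly_c=2.5, days=90)
print(f"ensemble-mean anomaly after 90 days: {forecast:+.2f} °C")
```

Averaging over many slightly perturbed runs smooths out sensitivity to the initial data, which is why ensembles are run "as often as money will allow."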
With any model, good data is crucial.
El Niño models have been helped by the development of satellites and networks of buoys that can measure sea-surface temperatures and other ocean characteristics.
When it comes to forecasting El Niño’s weather effects, however, good data can be harder to come by.
That’s where the NOAA research project aims to help, by studying a key process in the El Niño-weather connection: deep tropical convection.

 Alan S. Goldstein, the radar monitor for the mission, with other researchers before the flight.
Credit Kent Nishimura for The New York Times 

The clouds that the NOAA jet cruised past on Friday were a result of this process, in which air over the warm El Niño waters picks up heat and moisture and rises tens of thousands of feet.
When the air reaches high altitudes — about the flight level of the Gulfstream — the moisture condenses into droplets, releasing energy in the form of heat and creating winds that flow outward.
Scientists know that the energy released can induce a kind of ripple in a jet stream, a wave that as it travels along can affect weather in disparate regions around the world.
And they know that the winds that are generated can add a kick to a jet stream, strengthening it. That’s a major reason California and much of the southern United States tend to be wetter in an El Niño; the winds from convection strengthen the jet stream enough that it reaches California and beyond.
But to study convection during an El Niño, data must be collected from the atmosphere as well as the sea surface.
That’s a daunting task, because the convection occurs in one of the most remote areas of the planet. As a result, there has been little actual data on convection during El Niño events, Dr. Dole said, and most models, including NOAA’s own, have had to make what amount to educated guesses about the details of the process.
“Our strong suspicion is that our models have major errors in reproducing some of these responses,” he said.
“The only way we can tell is by going out and doing observations.”
When forecasters last year began to predict a strong El Niño, the NOAA scientists saw an opportunity and started making plans for a rapid-response program of research.

Dr. Dole estimated that a program like the one they assembled in about six months would normally take two or three years to put together.
In a way, he said, they were helped by the developing El Niño, which suppressed hurricane activity in the Atlantic last fall.
The Gulfstream flew fewer missions, and the available flight hours, as well as extra dropsondes, were transferred to the project.
In addition to the jet — which is also equipped with Doppler radar to study wind — the program is launching other sondes, from a ship and a small atoll near the Equator.
A large remotely piloted aircraft from NASA, the Global Hawk, has also been enlisted to study the Pacific between Hawaii and the mainland.
The Gulfstream flight Friday was the researchers’ fourth so far, out of nearly two dozen planned over the next month.
The day began at Honolulu International Airport five hours before the 11:30 a.m. takeoff, when Ryan Spackman, the other lead investigator, and NOAA colleagues sat down for a weather briefing with Dr. Dole and other scientists at the agency’s offices in Boulder, Colo.
The original plan was to fly due south from Honolulu and around an area of convection — a “cell” in meteorological terms — near the Equator.
But when the plane’s three pilots arrived for their briefing several hours later, the plan was changed out of safety concerns.
There was a risk they would have no way to get back from the south side of the convection area without going through a storm, and the Gulfstream, unlike NOAA’s other hurricane-hunting planes, cannot do that.


Tuesday, February 2, 2016

Microsoft plumbs ocean’s depths to test underwater data center


Introducing Microsoft Project Natick, a Microsoft research project to manufacture and operate an underwater datacenter.
The initial experimental prototype vessel, christened the Leona Philpot after a popular Xbox game character, was operated on the seafloor approximately one kilometer off the Pacific coast of the United States from August to November of 2015.
Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.

From NYTimes by John Markoff

Taking a page from Jules Verne, researchers at Microsoft believe the future of data centers may be under the sea.
Microsoft has tested a prototype of a self-contained data center that can operate hundreds of feet below the surface of the ocean, eliminating one of the technology industry’s most expensive problems: the air-conditioning bill.
Today’s data centers, which power everything from streaming video to social networking and email, contain thousands of computer servers generating lots of heat.
When there is too much heat, the servers crash.
Putting the gear under cold ocean water could fix the problem.
It may also answer the exponentially growing energy demands of the computing world because Microsoft is considering pairing the system either with a turbine or a tidal energy system to generate electricity.
The effort, code-named Project Natick, might lead to strands of giant steel tubes linked by fiber optic cables placed on the seafloor.
Another possibility would suspend containers shaped like jelly beans beneath the surface to capture the ocean current with turbines that generate electricity.

 Ben Cutler, left, and Norman Whitaker, both of Microsoft Research, with the “Leona Philpot,” a prototype underwater data center, at the company’s headquarters in Redmond, Wash.
Credit Matt Lutton for The New York Times

“When I first heard about this I thought, ‘Water ... electricity, why would you do that?’ ” said Ben Cutler, a Microsoft computer designer who is one of the engineers who worked on the Project Natick system.
“But as you think more about it, it actually makes a lot of sense.”
Such a radical idea could run into stumbling blocks, including environmental concerns and unforeseen technical issues.
But the Microsoft researchers believe that by mass producing the capsules, they could shorten the deployment time of new data centers from the two years it now takes on land to just 90 days, offering a huge cost advantage.
The underwater server containers could also help make web services work faster.
Much of the world’s population now lives in urban centers close to oceans but far away from data centers, which are usually built in out-of-the-way places with lots of room.
The ability to place computing power near users lowers the delay, or latency, people experience, which is a big issue for web users.
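A back-of-envelope sketch shows why distance dominates this latency. The ~200,000 km/s figure is the usual approximation for light in optical fiber, about two-thirds of its speed in vacuum:

```python
def fiber_rtt_ms(distance_km, speed_km_per_s=200_000):
    """Rough round-trip time over fiber: light travels at roughly
    two-thirds of c in glass, i.e. ~200,000 km/s (an approximation
    that ignores routing detours and switching delays)."""
    return 2 * distance_km / speed_km_per_s * 1000

# A user 100 km from a coastal data center vs. 3,000 km from an inland one:
print(f"nearby:  {fiber_rtt_ms(100):.1f} ms")   # 1.0 ms
print(f"distant: {fiber_rtt_ms(3000):.1f} ms")  # 30.0 ms
```

Real-world round trips are longer still, so shaving thousands of kilometers off the path is a meaningful win for interactive web services.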
“For years, the main cloud providers have been seeking sites around the world not only for green energy but which also take advantage of the environment,” said Larry Smarr, a physicist and scientific computing specialist who is director of the California Institute for Telecommunications and Information Technology at the University of California, San Diego.
Driven by technologies as varied as digital entertainment and the rapid arrival of the so-called Internet of Things, the demand for centralized computing has been growing exponentially.
Microsoft manages more than 100 data centers around the globe and is adding more at a rapid clip. The company has spent more than $15 billion on a global data center system that now provides more than 200 online services.

The “Leona Philpot” prototype was deployed off the central coast of California on Aug. 10, 2015.
In 2014, engineers in a branch of Microsoft Research known as New Experiences and Technologies, or NExT, began thinking about a novel approach to sharply speed up the process of adding new power to so-called cloud computing systems.
“When you pull out your smartphone you think you’re using this miraculous little computer, but actually you’re using more than 100 computers out in this thing called the cloud,” said Peter Lee, corporate vice president for Microsoft Research and the NExT organization.
“And then you multiply that by billions of people, and that’s just a huge amount of computing work.”
The company recently completed a 105-day trial of a steel capsule — eight feet in diameter — that was placed 30 feet underwater in the Pacific Ocean off the Central California coast near San Luis Obispo.
Controlled from offices here on the Microsoft campus, the trial proved more successful than expected.
The researchers had worried about hardware failures and leaks.
The underwater system was outfitted with 100 different sensors to measure pressure, humidity, motion and other conditions to better understand what it is like to operate in an environment where it is impossible to send a repairman in the middle of the night.
The system held up.
That led the engineers to extend the time of the experiment and to even run commercial data-processing projects from Microsoft’s Azure cloud computing service.
The research group has started designing an underwater system that will be three times as large.
It will be built in collaboration with a yet-to-be-chosen developer of an ocean-based alternative-energy system.
The Microsoft engineers said they expected a new trial to begin next year, possibly near Florida or in Northern Europe, where there are extensive ocean energy projects underway.
The first prototype, affectionately named Leona Philpot — a character in Microsoft’s Halo video game series — has been returned, partly covered with barnacles, to the company’s corporate campus here.
It is a large white steel tube, covered with heat exchangers, with its ends sealed by metal plates and large bolts.
Inside is a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips while the system was tested on the ocean floor.
The idea for the underwater system came from a research paper written in 2014 by several Microsoft data center employees, including one with experience on a Navy submarine.
Norman A. Whitaker, the managing director for special projects at Microsoft Research and the former deputy director at the Pentagon’s Defense Advanced Research Projects Agency, or Darpa, said the underwater server concept was an example of what scientists at Darpa called “refactoring,” or completely rethinking the way something has traditionally been accomplished.
Even if putting a big computing tube underwater seems far-fetched, the project could lead to other innovations, he said.
For example, the new undersea capsules are designed to be left in place without maintenance for as long as five years.
That means the servers inside it have to be hardy enough to last that long without needing repairs.
That would be a stretch for most servers, but they will have to improve in order to operate in the underwater capsule — something the Microsoft engineers say they are working on.

Project Natick vessel being deployed.
They’re also rethinking the physical alignment of data centers.
Right now, servers are put in racks so they can be maintained by humans.
But when they do not need maintenance, many parts that are just there to aid human interaction can be removed, Mr. Whitaker said.
“The idea with refactoring is that it tickles a whole bunch of things at the same time,” he said.
In the first experiment, the Microsoft researchers said they studied the impact their computing containers might have on fragile underwater environments.
They used acoustic sensors to determine if the spinning drives and fans inside the steel container could be heard in the surrounding water.
What they found is that the clicking of the shrimp that swam next to the system drowned out any noise created by the container.
One aspect of the project that has the most obvious potential is the harvest of electricity from the movement of seawater.
This could mean that no new energy is added to the ocean and, as a result, there is no overall heating, the researchers asserted.
In their early experiment the Microsoft engineers said they had measured an “extremely” small amount of local heating of the capsule.
“We measured no heating of the marine environment beyond a few inches from the vessel,” Dr. Lee said.


Monday, February 1, 2016

US NOAA update in the GeoGarage platform

4 nautical raster charts updated

Less than 1 percent of the world's shipwrecks have been explored

US AWOIS database (extract)

From Popular Mechanics by Jay Bennett

A rough estimate puts more than three million shipwrecks on the ocean floor.
This number represents ships throughout the entirety of human history, from 10,000-year-old dugout canoes preserved in the muck to 21st century wrecks that you might have read about in the news.
There are so many shipwrecks, in fact, that a search operation for the missing Malaysia Airlines Flight 370 has discovered two by accident.
The Battle of the Atlantic alone, which spanned nearly six years during World War II, claimed over 3,500 merchant vessels, 175 warships, and 783 submarines.
Particularly interesting are the cargo ships that literally contain treasure, such as Spanish galleons that transported gold and jewels across the Atlantic.
The Uluburun shipwreck off the coast of southwestern Turkey is roughly 3,300 years old, and that Late Bronze Age vessel contained gold, silver, jewels, copper and tin ingots, tools, swords and other weapons, and much more trade cargo—all of it hauled up over the course of 10 years and 22,413 dives.
But most wrecks don't receive that kind of attention.
In fact, less than 10 percent of the shipwrecks that we’ve located—which account for just 10 percent of all shipwrecks in the world—have been surveyed or visited by divers.
Fishing trawlers snag on sunken ships, sonar readings pick them up, historical records tell us where they should be, harbor dredging operations uncover wrecks that have long been lost below the seafloor—but there simply isn't enough time and money to explore the vast majority of them.
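The headline figure follows directly from multiplying those two round percentages:

```python
# The article's round numbers: ~10% of all wrecks have been located,
# and <10% of those have been surveyed or visited by divers.
located_fraction = 0.10
surveyed_fraction = 0.10
explored = located_fraction * surveyed_fraction
print(f"explored share of all wrecks: {explored:.0%}")  # about 1%
```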

The Sweepstakes was built in 1867 as a two-masted schooner in Burlington, Ontario by John Simpson.
The ship is 36.3 m (119 ft) long and lies just below the surface in Big Tub harbor, at a maximum depth of 7 m (20 ft).
The Sweepstakes was damaged off Cove Island, then towed to Big Tub harbor, where she sank in 1885.
At times the shipwreck sits well below the surface of Lake Huron; when the lake level drops, sections of the Sweepstakes rise out of the water, making parts clearly visible.

Throughout Fathom Five National Marine Park, there are 22 shipwrecks and many people come here to snorkel and scuba dive in these pristine waters.

Daunting Task

James Delgado, the Director of Maritime Heritage at the National Oceanic and Atmospheric Administration (NOAA), says that there are an estimated 4,300 shipwrecks within NOAA's 14 National Marine Sanctuaries.
Of these, 432 have been dived on and surveyed.
And these are shipwrecks within a mapped area set aside for preservation.
"There are laws and regulations directing NOAA to find what lies in those waters and assess it," Delgado said in an email.
Similar to other marine preservation organizations around the world, NOAA is not only devoted to discovering what the ships are, but also how their presence might affect the ecology of the marine environments they lie within.
Outside of marine sanctuaries, there isn't as much of an incentive.
Most shipwrecks are documented for a much simpler reason: to avoid collisions or other incidents. NOAA's Office of Coast Survey maintains a database of about 20,000 ships that is available to the public, primarily for the benefit of navigators and researchers.
The information for that database comes from two sources within NOAA: the electronic navigational charts (ENC) and the Automated Wrecks and Obstructions Information System (AWOIS).
Still, it's difficult to pinpoint exactly where a shipwreck is on the ocean floor.
The database lists some limitations, including that it "contains wreck features from two different sources that were created for different purposes, and there is not a perfect match of features from either source. The same wreck may be found in both the ENC wrecks and AWOIS wrecks layers, although the positions may not necessarily agree."
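One common way to reconcile two overlapping point layers like these is to match features within a position tolerance rather than by exact coordinates. The sketch below illustrates the idea; the field names, sample coordinates, and the 0.5-nautical-mile tolerance are assumptions for illustration, not AWOIS or ENC conventions:

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 3440.065 * math.asin(math.sqrt(a))  # Earth radius ~3440 nm

def likely_duplicates(enc_wrecks, awois_wrecks, tolerance_nm=0.5):
    """Pair each AWOIS wreck with any ENC wreck inside the tolerance.
    Positions from the two sources 'may not necessarily agree', so a
    distance threshold, not exact equality, is used."""
    pairs = []
    for a in awois_wrecks:
        for e in enc_wrecks:
            if haversine_nm(a["lat"], a["lon"], e["lat"], e["lon"]) <= tolerance_nm:
                pairs.append((a["id"], e["id"]))
    return pairs

enc = [{"id": "ENC-1", "lat": 30.100, "lon": -88.500}]
awois = [{"id": "AWOIS-7", "lat": 30.104, "lon": -88.503},   # ~0.3 nm away
         {"id": "AWOIS-9", "lat": 30.300, "lon": -88.900}]   # far away
print(likely_duplicates(enc, awois))  # [('AWOIS-7', 'ENC-1')]
```

Tightening the tolerance trades missed matches for fewer false pairings; real charting workflows also compare attributes such as depth and feature type before merging.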

Riches Under the Sea

Still, there is an estimated $60 billion in sunken treasure around the world, just waiting at the bottom of the ocean.
And that doesn't include the historical and cultural value of excavating shipwreck sites.
So why don't we explore more of them?
For one thing, it's hard to know what's worth the time.
Diving operations can cost millions of dollars, and before we go down there, we have no idea what the ship is, what it was carrying, and what condition the cargo is in. In some cases, we are not even 100 percent sure that the identified object is a ship at all.
"Not many people follow up on a target to determine if it is a wreck, and if so what type it is, and then if possible, which ship it is," says Delgado.
It is possible, however, that the situation will improve.
As Delgado points out, 90 to 95 percent of the sea floor itself remains unexplored.
There are a number of efforts to change that, such as the Ocean Discovery XPrize that is offering $7 million in prize money for private teams that build an autonomous underwater vehicle (AUV) and create a bathymetric map (like a topographic map, but of the sea floor).
The Schmidt Ocean Institute, founded by former Google CEO Eric Schmidt, maintains a 272-foot vessel outfitted with modern oceanographic equipment that scientists can apply to use for various research expeditions. 
The good news, for shipwreck explorers at least, is that the majority of shipwrecks are actually near the coast, with a large percentage of incidents occurring in and around the entryways to ports and harbors.
"Some harbors are tough to enter, like Oregon's Columbia River Bar, or leave, like San Francisco's Golden Gate and Bar, due to shifting winds, shifting sands, fog, storms, or strong tides," says Delgado.
"But also for the same reason that most auto accidents seem to happen within a mile of home, and there are many accidents coming in and out of parking lots, people seem to be less cautious or more aggressive."
With most shipwrecks so close to the shore, and multiple examples of wealthy patrons sponsoring exploration and research expeditions, we could see many of these unexplored shipwrecks investigated in the coming years.

Saturday, January 30, 2016

Two Miles deep trailer



Syndicated cartoonist Jim Toomey is best known for his daily comic strip “Sherman’s Lagoon,” which explores themes ranging from pop culture to ocean conservation through the eyes of a cast of sea creatures living in an imaginary lagoon.
In June of 2014, Jim was invited by the Duke University Marine Lab to be a “cartoonist-in-residence” aboard the famed deep submersible vehicle Alvin.
“Two Miles Deep” is an account of his dive to the bottom of the Gulf of Mexico.
In this 27-minute film, we discover, from the perspective of a cartoonist, through video and animation, that the deep ocean is a world full of beauty and complexity.

Friday, January 29, 2016

Cape Horn discovered 400 years ago

Cabo de Hornos with the GeoGarage platform (SHN Argentina chart)

From Maritime Executive by Niek Boot

January 29, 2016, marks exactly 400 years since a Dutch merchant ship, the Eendracht, sailed past Cape Horn, the southernmost point of South America.
When Fernando Magallanes discovered and sailed the Strait of Magellan in 1520, it was still assumed that Tierra del Fuego, the southern bank of the strait, was part of Terra Australis, the unknown continent. Maps of the era show no passage south of the Strait of Magellan.
Some 80 years later, in 1602, the Dutch established the Dutch East India Company (the Verenigde Oostindische Compagnie or VOC) and granted it a monopoly to trade with the “Spice Islands” east of Cape of Good Hope and west of the Strait of Magellan.

 NGA chart with the GeoGarage platform

One of the founders and the first president of the VOC was Isaac Le Maire.
He soon fell out with the board and was expelled in 1605, prohibited from ever again trading in VOC territory.
For a number of years he complied, but then the temptation became too great and he got permission to establish an “Australische Compagnie” or “South Company” and to launch an expedition to investigate the possibility of trade with the unknown Southern Continent.
His intention, from the start, was to find a new way to the East Indies, bypassing the exclusive routes of the VOC.
He purchased two vessels, the Eendracht (about 40m (130 feet) long with a crew of 65) and the Hoorn (about 30m (98 feet) long with a crew of 22) and had them fitted out by Captain Willem Schouten.

 Jacob Le Maire 

Le Maire appointed his son Jacob as leader of the expedition.
They sailed from the city of Hoorn, which was an important investor in the venture, in June 1615. After calling at Cape Verde and Sierra Leone in Africa to replenish stores, water in particular, they arrived at what is today Puerto Deseado in the south of Argentina in early December.
It is a protected inlet with a tidal range of over five meters, ideal for grounding the vessels and cleaning their hulls of molluscs and other growth.
The cleaning was done by scorching the hulls with burning grass and shrubs.
During this work the Hoorn caught fire, and when the flames reached the gunpowder room, the vessel exploded and was irretrievably lost.
All of the crew survived, and they then spent some weeks recovering what could be saved and putting it on board the Eendracht.

Beagle canal (SHN nautical charts)

On January 13, 1616, they set sail on the next leg of the trip.
They continued south past the latitude of the Strait of Magellan.
Here the coast of Tierra del Fuego forced them to sail eastbound in bad and cold weather.
Captain Schouten was tempted to abandon the search and set sail for Cape of Good Hope, unconvinced of the existence of a passage to the east and less secure without the assistance of his support vessel Hoorn. 
Jacob Le Maire insisted, and they continued.
On January 24, they found an opening and against current, waves and wind they managed to sail through.
Isla de los Estados with the GeoGarage platform (UKHO chart)
To the west was Tierra del Fuego, to the east there was land which they called Staten Land, not knowing it was an island.
Today it is called Staten Island, just like the island at the entrance of the Hudson River in New York, both named in honor of the Staten-Generaal (States General), the Dutch government at the time.

 1633 map of Strait of Magellan, showing Strait Le Maire at the right, marked Fretum le Maire (Latin) and Straet Le Maire (Dutch)

They called the passage “Strait Le Maire.”
Continuing south, they sailed by various islands, some of which still today carry the names they were given then.
On the afternoon of January 29, 1616, they came by a cape which they realized was the southernmost of all and called it Kaap Hoorn in honor of the city they had sailed from.
They crossed the Pacific Ocean and arrived in Djakarta on the island of Java at the end of October 1616.
Instead of congratulating them on their discovery, the VOC-appointed governor did not believe their story and confiscated their ship and the goods on board.
Le Maire, Schouten and some of the crew were shipped to Holland as criminals for having infringed the monopoly of the VOC.
Jacob Le Maire died on board at the end of December.
The others arrived in Holland by July 1617. Isaac Le Maire was, of course, deeply distressed at having lost his son and his ships.
He claimed against the VOC for the confiscated vessel.
He won the case and recovered 65,000 florins.
But in the meantime the Dutch had set up a new company, the West India Company, and granted it a monopoly on trade with the Americas, including the route via Cape Horn.
As a result Le Maire could not take advantage of his son’s discovery.
He died a bitter man in 1624, but his name lives on, 400 years later.

Thursday, January 28, 2016

Why we’ve been hugely underestimating the overfishing of the oceans


What if everything we know about the amount of fish in the ocean isn't true?

What if the quantity of fish we catch is much higher than we realize?
What if we're heading for a global fishing catastrophe that could trigger a food crisis for millions?
In a multi-year investigation, an international team of scientists led by Dr. Daniel Pauly has set out to challenge dangerous assumptions about the amount of fish we remove from the oceans.

Dr. Pauly contends that when governments and regulators report on commercial fishing and claim the oceans can handle the huge catches, they're wrong.
The official data fail to account for entire categories of fishing, including small-scale, recreational, and illegal fishing (the latter part of what the industry calls IUU fishing: illegal, unreported, and unregulated).
If we don't know how many fish we catch, how can we know there are enough left?
The fate of one of humanity's most important food sources depends on convincing governments and industry to finally take stock of the missing fish.
"The Missing Fish" film will follow the journey of Dr. Daniel Pauly and his team as they gather information to calculate the world’s total fish catch.

The film is scheduled for release later this year.

From The Washington Post by Chelsea Harvey

The state of the world’s fish stocks may be in worse shape than official reports indicate, according to new data — a possibility with worrying consequences for both international food security and marine ecosystems.
A study published Tuesday in the journal Nature Communications suggests that the national data many countries have submitted to the UN’s Food and Agriculture Organization (FAO) has not always accurately reflected the amount of fish actually caught over the past six decades.
And the paper indicates that global fishing practices may have been even less sustainable over the past few decades than scientists previously thought.

The FAO’s official data report that global marine fisheries catches peaked in 1996 at 86 million metric tons and have since slightly declined.
But a collaborative effort from more than 50 institutions around the world has produced data that tell a different story altogether.
The new data suggest that global catches actually peaked at 130 million metric tons in 1996 and have declined sharply ever since, on average by about 1.2 million metric tons every year.

 In this April 27, 2011 photo, Atlantic bluefin tuna are corralled by fishing nets during the opening of the season for tuna fishing off the coast of Barbate, Cadiz province, southern Spain.
(AP Photo/Emilio Morenatti)

The effort was led by researchers Daniel Pauly and Dirk Zeller of the University of British Columbia’s Sea Around Us project.
The two were interested in investigating the extent to which data submitted to the FAO was misrepresented or underreported.
Scientists had previously noticed, for instance, that when nations recorded “no data” for a given region or fishing sector, that value would be translated into a zero in FAO records — not always an accurate reflection of the actual catches that were made.
Additionally, recreational fishing, discarded bycatch (that is, fish that are caught and then thrown away for various reasons) and illegal fishing have often gone unreported by various nations, said Pauly during a Monday teleconference.
“The result of this is that the catch is underestimated,” he said.

So the researchers teamed up with partners all over the world to help them examine the official FAO data, identify areas where data might be missing or misrepresented and consult both existing literature and local experts and agencies to compile more accurate data.
This is a method known as “catch reconstruction,” and the researchers used it to examine all catches between 1950 and 2010.  
Ultimately, they estimated that global catches during this time period were 50 percent higher than the FAO reported, peaking in the mid-1990s at 130 million metric tons, rather than the officially reported 86 million.
As of 2010, the reconstructed data suggest that global catches amount to nearly 109 million metric tons, while the official data only report 77 million metric tons.
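The gap between the two data sets can be checked with quick arithmetic on the figures quoted above. The numbers below are the article's; the simple endpoint calculation is an illustration only, not the authors' reconstruction method.

```python
# Figures quoted in the article, in million metric tons (Mt).
reported = {"peak_1996": 86, "year_2010": 77}
reconstructed = {"peak_1996": 130, "year_2010": 109}

# How far above the official figures the reconstruction sits.
for key in reported:
    ratio = reconstructed[key] / reported[key]
    print(f"{key}: reconstructed is {100 * (ratio - 1):.0f}% above reported")

# Average annual decline implied by the reconstructed endpoints.
# (The article quotes ~1.2 Mt/year as the average across the whole series;
# the two endpoints alone imply a slightly steeper rate.)
decline = (reconstructed["peak_1996"] - reconstructed["year_2010"]) / (2010 - 1996)
print(f"Implied decline: {decline:.1f} Mt/year")
```

The ~50 percent figure in the text corresponds to the peak-year ratio; the 2010 ratio is somewhat lower, which is why the reconstructed series also declines faster than the official one.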

Overfishing causing global catches to fall 3X faster than estimated

This news can be interpreted as both good and bad news.
On the one hand, “it means that fisheries are more important than we think,” Pauly said — in other words, when catches were at their highest, they were producing more food for the world than scientists previously thought.
This is a plus for global food security in the authors’ eyes.
Overfishing and the subsequent decline of the world’s fish stocks can be a threat to the food security of cultures that rely heavily on fish — but Pauly suggests that if we implement better management techniques in the future that allow these stocks to replenish themselves, we may be able to feed more people than we thought, as the new data suggest.

On the other hand, the higher catch numbers also suggest that fishing has been even more unsustainable in the past than scientists thought.
And the world is now suffering the consequences, as the authors point out.

Their second major finding was that fish catches have been sharply declining from the 1990s up through 2010 — much more severely than the FAO has reported.
At first, the authors thought that these declines might be due to increased restrictions by certain countries on fishing quotas in recent years.
But when the researchers removed those countries from their calculations, they found that the catch data still showed a downward trend.
“Our results indicate that the declining is very strong and the declining is not due to countries fishing less,” Pauly said during the teleconference.
“It is due to the countries fishing too much and having exhausted one fish after the other.”
The data indicate that the largest of these declines come from the industrial fishing sector.
To be clear, the research is not meant to assess the state of the world’s fisheries, Pauly added — but, nonetheless, the study does raise some important questions about fisheries management moving forward. 

Russian giant ships drain tons of fish off the coast of Dakhla, Morocco

The authors suggest that, in the future, the FAO might consider requiring nations to submit catch statistics separately for both large-scale and small-scale fisheries in order to ensure that small-scale fisheries don’t fly under the radar.
They also point out the importance of stock rebuilding — that is, enacting fishing quotas to cut down on overfishing and allow fish stocks to replenish themselves.
Such action may become even more important in the future, as additional factors — most notably, the effects of climate change — place even more pressure on global fish stocks, Pauly noted.
“In the future there will be another mechanism that will begin to play a role [in catch declines] — that is global warming — and it will be very difficult to separate from the effects of fishing,” he said. 

So while a few countries have already implemented fishing caps, he predicted that the world will continue to see a sharp and continual decline in catch until better practices are enacted worldwide.
And this will be important to consider, not only for the health of the oceans, but for the health of the millions of people worldwide who depend on fish for their food and their livelihoods.
With good management, though, there’s room for optimism, Pauly suggested.
“The fact that we catch far more than we thought is, if you like, a positive thing,” he said during the teleconference.
“If we rebuild stocks, we can rebuild to more than we thought before. Basically, the oceans are more productive than we thought before.”

-> FAO’s response to the Nature Communications article “Catch reconstructions reveal that global marine fisheries catches are higher than reported and declining"

Wednesday, January 27, 2016

Henry Worsley’s journey wasn’t foolhardy – it was tremendous

Cold, bleak and deadly: Antarctica is little changed since the days of Scott and Shackleton
Photo: Global Book Publishing Photo Library

From The Telegraph by Paul Rose (Base Commander of Rothera Research Station, Antarctica, for the British Antarctic Survey for 10 years)

In Antarctica, making the slightest mistake can put your life at risk.
It is an unforgiving place.
Colder than cold, bleak, a vast wasteland of iciness, its deadliness stretches for thousands of miles.
True, it has been explored and mapped.
Yet the minute you step out of your modern base, regardless of all your hi-tech equipment, you’re in exactly the same Antarctica that Scott and Shackleton travelled in.
It’s remote and it is hostile.

That’s why Henry Worsley’s attempt to follow in Shackleton’s footsteps and travel across the Antarctic alone, pulling his own supplies, was so impressive.
He was a formidable explorer: well-organised, determined and incredibly powerful – not one of those people who just goes off with a dream and not much of a plan.
His was a good expedition, and I followed him all the way.
It looked as if he was cruising it and sometimes he was even going like the clappers.

 Antarctica from space (NASA)

But you’ve got to remember those conditions.
Even walking outside at minus 40 degrees when you’re well-rested is a very, very cold, potentially deadly experience.
For Henry to face those conditions alone every day would have been incredibly tough.

 Pulling a sledge full of supplies is brutal  Photo: PA
The former Army officer turned explorer died just 30 miles short of completing his attempt to become the first person to cross the Antarctic alone

The final expedition:
A solo 943-mile coast-to-coast trek across the Antarctic, pulling a sledge with everything he needed. He collapsed 71 days into the anticipated 80-day journey, and later died of organ failure
Bear in mind that he had to carry everything he needed.
He couldn’t take anything that would add unnecessary weight – such as a spare pair of gloves.
And everything you do in those bitter conditions takes effort.
Say you’re thirsty and want to get some water out of your bag.
You’ve got to get the bag off the sledge and unzip it.
But you’re wearing thick mittens for travelling – warmer than gloves, but offering less dexterity – and you’ve got to take the outer mitten off to reach the zip.
Where do you put that outer mitten to make sure it doesn’t blow away?
Even the simplest task can be fraught with danger, and the only way to stay alive is with a severe amount of discipline.

His lifelong hero was Ernest Shackleton, and it was Shackleton's journey across the Antarctic that Henry Worsley was trying to recreate, with the huge added challenge that Worsley was entirely alone.
Like Shackleton, his bravery and his willingness to endure endless, uncharted terrain led him into a desperate race for survival that ended in his death
The British explorer died, tragically, of organ failure when the end of the mission was almost in sight: just 30 miles remained of his 1,000-mile journey.

It’s bloody hard at the end of a long day spent pulling that sledge.
All you want to do is get the tent up, get in and have a warm drink.
But the tent doesn’t go up by magic.
First you’ve got to secure the sledge, skis and poles so they don’t blow away.
You also have to bear in mind that the moment you stop you are instantly cold, so you have to put on a thicker, insulating down layer.
Then you find the tent and secure it – but it’s still just a shelter and minus 40 inside.
So you put the sleeping bag in, find the stove and melt some snow.
From stopping to getting a cup of instant soup takes an hour and a half.
Mornings are the worst, as you lie there, very hungry, tired and cold and have to force yourself to get up and start the routine over again: melt snow, make food, load sledge.
You love the sledge – because all that equipment is keeping you alive – but you are also beginning to hate the thing, the feeling of it rubbing on your hips as you struggle to put one foot in front of the other.

For all its harshness, though, Antarctica has something we love. Frank Wild, Shackleton’s right-hand man, said that it calls you back with little white voices, and he was spot on.
Once you’ve worked there, it’s hard to resist its siren call.
Some people may say that Henry’s journey was foolhardy.
But it wasn’t.
For me it is only natural that we should want to explore new ground, no matter the dangers.
It is good for us to discover the “ground truth” of the planet for ourselves.
Henry’s was a tremendous journey and he very nearly made it.
For that, I salute him.

Tuesday, January 26, 2016

Were Portuguese explorers the first Europeans to find Australia?

Is this the first map of Australia?
Jave La Grande's east coast: from Nicholas Vallard's atlas, 1547.
This is part of an 1856 copy of one of the Dieppe Maps.
Copy held by the National Library of Australia
(Photo: Wikipedia)

From Atlas Obscura by Eric Grundhauser

Did a secret search for Marco Polo’s islands of gold lead Portuguese explorers to be the first Europeans to discover Australia?
According to some theories, the Dieppe maps, a series of artful 16th-century maps, say yes.
Operating in the mid-1500s, the Dieppe mapmakers created elaborate, hand-made world maps for wealthy patrons and royals.
The French artists who created the maps were just that: artists, leaving the actual exploration to others and simply translating more utilitarian nautical charts into things of beauty.
The surviving maps are beautifully rendered, although their exact cartographic sources seem to have been lost to time.
This becomes most problematic in the case of "Java la Grande", a giant landmass unique to the maps that was drawn between Antarctica and what we would today consider to be Indonesia.
According to some modern researchers, this mystery island is actually the first record of Europeans seeing Australia.

 The map has been inverted to represent the modern view, but Java la Grande can be found where Australia would be.
World map of Nicolas Desliens, 1566.
(Photo: Wikipedia)

The maps, with their fancy compass roses and detailed illustrations, were intended to be pieces of art, rather than navigational aids, but their information had to come from somewhere.
The names and script on the charts are written out in a mix of French and Portuguese, giving rise to the theory, which was popularized in Kenneth McIntyre's 1977 book, The Secret Discovery of Australia, that the mapmakers of Dieppe were getting their view of the world, at least in part, from Portuguese expeditions.
In particular, one of the maps that came out of Dieppe, which survives only through a faithful recreation, depicts the east coast of the fabled Java la Grande with place names almost exclusively in Portuguese.
Given the vagaries of the Dieppe map sources, this has led to the theory that it was the Portuguese who were the first Europeans to spy the Australian coast.
In addition to the general location of Java la Grande on the maps, there are certain features that adherents to the theory claim are unmistakably bits of Australia, such as an inlet that looks like Botany Bay and the Abrolhos island chain.

 Java la Grande was thought to be so big the map was awkwardly extended. 
World map, by Guillaume Brouscon, 1543
(Photo: Wikipedia)

As to what expedition could have seen the coast, McIntyre suggests it was a search for Marco Polo’s fabled Isles of Gold that led to the discovery.
Wealthy Portuguese explorer Cristóvão de Mendonça is recorded as having been tasked by King Manuel with sailing out in search of Polo’s treasure islands, but actual record of this voyage has been lost, if there ever was one.
Manuel was notoriously secretive about the findings of his exploration teams.
According to popular history, Australia was first visited by Europeans when Dutch explorer Willem Janszoon “discovered” the continent in the early 17th century, and later fully explored by Captain Cook.

On the left, first Portuguese chart designed in Dieppe by Jean Rotz in 1542.
On the right, Australia seen by Dutch in 1628...

While no direct evidence of Portuguese discovery exists, there have been other findings that seem to support the theory of their early Australian discovery.
Various ruins, cannons, and other archeological artifacts have been found on the Australian continent that believers say point to Portuguese discovery, but the Dieppe maps remain the prime source of speculation.

Monday, January 25, 2016

France SHOM update in the GeoGarage platform

12 nautical charts updated

Opinion: Were US sailors 'spoofed' into Iranian waters?

A riverine patrol boat from Coastal Riverine Squadron 2 escorts the guided-missile cruiser USS Bunker Hill (CG 52) while in the Arabian Gulf in this November 15, 2014 handout photo, provided by the U.S. Navy, January 12, 2016.
Ten sailors aboard two U.S. Navy riverine patrol boats were seized by Iran in the Gulf on Tuesday, and Tehran told the United States the crew members would be promptly returned, according to U.S. officials.
REUTERS/Mass Communication Specialist 1st Class LaTunya Howard/U.S. Navy/Handout via Reuters

From CSMonitor by Dana A. Goward

In 2011, Iran spoofed – or faked – Global Positioning System signals to send a CIA drone off course.

Did it do the same to trick Navy vessels into Iranian waters?

As images of captured American sailors competed with those of President Obama during the State of the Union address Tuesday, viewers across the world asked: "How could this happen?"
The world’s most powerful nation with the most advanced navy had been embarrassed on the same day as the president's speech.

After a series of other implausible explanations, the Department of Defense settled on the explanation that the crews on both boats "misnavigated": that in the middle of their trip between Kuwait and Bahrain, the two boats accidentally went more than 50 miles out of their way and ventured into Iranian waters.
But were they really that poorly trained and inattentive?
Is the navigation equipment in the world’s best navy that poor?
And was it just a coincidence it all happened on the day of the president’s address?
Or was something much more deliberate – and potentially troubling – to blame?

Iran has demonstrated in the past that it has the capability – and the will – to exploit a critical and broad vulnerability in our key navigation system – the Global Positioning System, or GPS.
In 2011, Iran manipulated GPS systems on a CIA surveillance drone to send it off course and capture it.

Now, at a time when elements in Iran are feeling their power and prestige diminish after Tehran agreed to the US-led pact to limit the country's nuclear program, the Islamic Republic could once again flex its muscles and show it has the wherewithal to toy with nearby Navy crews.
And, as the US government is well aware, the GPS network that both drivers and sailors rely on remains vulnerable to attacks.
Powered by solar panels and some 12,000 miles above the earth, GPS satellites broadcast very weak signals that are easy to block or jam.
Over the past few years, illegal jamming by criminals and terrorists trying to hide their whereabouts has become an increasing threat to those signals.
But perhaps more worrisome, GPS signals and receivers can also be spoofed, or faked.
This involves the spoofer sending a bogus signal that can fool GPS receivers, allowing the attacker to trick the device into thinking it's in another location.
Iran claims to have used that technique in 2011 to redirect a CIA surveillance drone from Afghanistan.
Their claim was credible at the time as they clearly had possession of the undamaged drone.
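Because civil GPS cannot easily be authenticated, downstream software can at least sanity-check the fixes a receiver reports. The sketch below is purely illustrative (the 45-knot threshold and the fix format are assumptions, not any real navigation suite's logic): it rejects a new fix whose implied speed from the previous fix is impossible for a small patrol boat.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def plausible_fix(prev_fix, new_fix, max_speed_knots=45.0):
    """Reject a GPS fix implying an impossible speed for a small boat.

    Each fix is (lat_degrees, lon_degrees, unix_time_seconds).
    The speed threshold is a hypothetical parameter for illustration.
    """
    lat1, lon1, t1 = prev_fix
    lat2, lon2, t2 = new_fix
    dt_hours = (t2 - t1) / 3600.0
    if dt_hours <= 0:
        return False  # non-monotonic timestamps are suspect too
    speed_knots = haversine_km(lat1, lon1, lat2, lon2) / 1.852 / dt_hours
    return speed_knots <= max_speed_knots

# A fix that jumps roughly 49 nautical miles in ten minutes is rejected.
prev = (26.0, 51.0, 0)
spoofed = (26.8, 51.2, 600)
print(plausible_fix(prev, spoofed))  # prints False
```

A spoofer who walks the reported position off course slowly defeats a check like this, which is one reason the arguments below favor a backup signal that is fundamentally harder to fake rather than smarter filtering alone.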

 Demonstration of a Remote Unmanned Aerial Vehicle Hijacking via GPS Spoofing
Military Global Positioning System (GPS) signals have long been encrypted to prevent counterfeiting and unauthorized use.
Civil GPS signals, on the other hand, were designed as an open standard, freely-accessible to all. These virtues have made civil GPS enormously popular, but the transparency and predictability of its signals give rise to a dangerous weakness: they can be easily counterfeited, or spoofed. Like Monopoly money, civil GPS signals have a detailed structure but no built-in protection against counterfeiting.
Civil GPS is the most popular unauthenticated protocol in the world.
The vulnerability of civil GPS to spoofing has serious implications for civil unmanned aerial vehicles, or UAVs.
This was demonstrated in June, 2012 by a dramatic remote hijacking of a UAV at White Sands Missile Range.
The demonstration was conducted by the University of Texas Radionavigation Laboratory at the invitation of the Department of Homeland Security.

Iran's claim became much more credible several months later when Prof. Todd Humphreys and his students at the University of Texas showed how it could have been done.
In a live demonstration in 2013, they took over the navigation system of a large yacht in the Mediterranean.
Now, hackers are even selling spoofing kits.

For the 2015 DEF CON hacking conference in Las Vegas, a Chinese researcher sold equipment and published step-by-step instructions for building a spoofing device for about $300.
The loss of the CIA drone in 2011 should have been a wake-up call for the US military that GPS needs more safeguards.
That incident was yet another warning sign that's gone ignored.
But even presidential mandates meant to protect GPS have been ignored over the years.



In 1998, President Clinton became concerned about America’s growing reliance on GPS for navigation.
He directed the Department of Transportation to study the issue and make recommendations.
Those recommendations, which called for improving receivers, developing interference detection networks, and developing non-satellite navigation systems for use alongside GPS, came out just 12 days before 9/11.
Most of them, understandably, were tabled.

Then, in 2004, the Bush administration began to focus on GPS's other functions – providing highly precise timing signals for synchronizing telecommunications and IT networks, financial systems, and power grids.
President Bush issued a presidential directive that identified GPS services as essential to the nation’s critical infrastructure, security, and economy.
Among its provisions to protect GPS, it directed acquisition of a "back-up system" to serve the nation in the event of a GPS disruption.
President Obama later reaffirmed that directive and has issued several additional presidential orders designed to make the nation’s critical infrastructure more resilient.
The Obama administration has also continued to voice significant concerns about GPS vulnerability. Department of Homeland Security officials have called GPS "a single point of failure for critical infrastructure."
Secretary of Defense Ashton Carter has said he wants to "unplug the military from GPS."

But plans to construct a land-based GPS backup system remain dormant.
Studies have shown that, for about $50 million a year, a system known as eLoran could provide a signal more than 1.3 million times stronger than GPS.
And, importantly, the signal is incredibly difficult to jam or spoof.
The deputy secretaries of both the Department of Defense and Department of Transportation have spoken out in favor of such a system.
Yet nothing has been done.
Similar systems are currently being used by Russia, China, South Korea, Britain, Saudi Arabia, and even Iran.
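The "1.3 million times stronger" figure is easier to compare with RF link budgets when expressed in decibels. A quick conversion (the ratio is the article's; the decibel formula is standard):

```python
import math

# eLoran vs. GPS received signal strength, as quoted above.
ratio = 1.3e6

# Power ratios convert to decibels as 10 * log10(ratio).
db = 10 * math.log10(ratio)
print(f"{db:.1f} dB")  # about 61 dB
```

Sixty-one decibels of margin is why a ground-based, high-power signal like eLoran is so much harder to jam than the faint transmissions arriving from GPS satellites some 12,000 miles up.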

We may never know what truly led two Navy vessels into Iranian waters; the Iranians confiscated the boats’ GPS navigation suites before the crews were released.
But all the reasons that have been offered to the press seem unlikely.
Small Navy vessels like these have multiple and redundant systems, and usually travel in pairs or small groups specifically to avoid having a single point of failure threaten their mission.
But the incident is once again an important reminder that GPS as a single point of failure can cause significant problems for America, the least of which are minor embarrassments like this one.
Officials in the Obama administration have said they are going to act and address this problem.
Let’s hope that they – and the administration that comes next – follow through on presidential commitments and finally do something to safeguard GPS for everyone.
