Wednesday, February 3, 2016

Coast Survey announces surveys by navigation response teams: NRT data will be used to update nautical charts


From NOAA

Coast Survey’s navigation response teams have proven their value time and again, especially after hurricanes, when ports suspend operations and shipping (or naval movements) ceases until Coast Survey’s small boats can locate underwater dangers to navigation.
But what do the six navigation response teams (NRTs) do during those long periods between deployments for maritime emergencies?
They are busy, mostly year-round, collecting hydrographic data for updating nautical charts.

Plans for 2016

Responding to requests from mariners around the country, Coast Survey has set some aggressive projects for the NRTs this year.
Starting from the Northeast and working our way around the coasts…

NEW YORK

Beginning in June and throughout the summer, NRT5 will survey the Hudson River, with a focus on the area from Albany to Kingston.
This is a continuation of the project started at the request of the Hudson River Pilots (as reported in NOAA plans multiyear project to update Hudson River charts).
We are planning to have Coast Survey research vessel Bay Hydro II join the NRT for most of the summer, to get as much new charting data as possible.
In October, NRT5 will move to Eastern Long Island Sound to finish up some shallow survey work adjacent to NOAA Ship Thomas Jefferson’s recent, extensive survey project.
The officer-in-charge of NRT5 is NOAA Lt. Andrew Clos.
The officer-in-charge of Bay Hydro II will be NOAA Ensign Sarah Chappel.

GEORGIA

In March, NRT2 starts a 16-month survey project in Saint Andrew Sound.
The area, which has significant traffic from small boats, tugs, and barges, is reportedly experiencing small boat groundings, and Coast Survey’s navigation manager in the area has received several requests for a modern survey.
Coast Survey will use the data to update NOAA chart 11504 and ENC US5GA12M, as well as other charts covering portions of the specific surveyed areas.
The existing charted soundings are from partial bottom coverage surveys dating back to the early 1900s.
NRT2 is led by Erik Anderson.


NRT1 will check out 18-year-old reported depths to update the inset of chart 11376.


EASTERN GULF OF MEXICO – BILOXI AND MOBILE

NRT1 will spend March and April acquiring data off the coast of Biloxi, Mississippi, to update the Intracoastal Waterway chart 11372.
They will then move to Alabama for some long-overdue “chart clean up” work at the northern end of the Mobile Ship Channel, outside of the area controlled by the Army Corps of Engineers.
The Mobile project will investigate charted items, verify reported depths, and update older NOAA bathymetry (vintage 1961) that is depicted in the inset area of NOAA chart 11376.
Because the Mobile survey probably will not take the rest of the season (depending on interruptions for hurricane response), we are assessing additional survey needs in the area.
NRT1 is led by Mark McMann.

WEST GULF OF MEXICO – TEXAS

NRT4 will spend all of 2016 surveying in Galveston Bay, including the bay entrance and newly charted barge channels along the Houston Ship Channel.
The team is working with Coast Survey’s navigation manager for Texas to identify additional charted features that require investigation to reduce localized chart clutter and improve chart adequacy.
NRT4 is led by Dan Jacobs.

NORTHERN CALIFORNIA

NRT6 is slated to survey the Suisun Bay anchorage used by MARAD’s National Defense Reserve Fleet, to acquire updated depths.
Afterwards, NRT6 will move throughout the bay area to address charting concerns reported by the San Francisco Bar Pilot Association near Pittsburg, Antioch, the San Joaquin River, and Redwood City. Coast Survey will use the data to generally update NOAA chart 18652 and ENC US5CA43M, as well as larger scale charts of the specific surveyed areas.
NRT6 is led by Laura Pagano.

PACIFIC NORTHWEST

It has been a while since Coast Survey has had an operational NRT presence for Oregon and Washington, but this is the year we are bringing NRT3 back on line.
Team lead Ian Colvert is shaking the dust off NRT3 and preparing to restart survey operations.
He is working with the Coast Survey navigation manager to develop survey priorities for this summer and fall.

Charting the data

Once the navigation response teams process and submit the data acquired during the surveys, the information is further processed in Coast Survey’s Atlantic and Pacific hydrographic branches, and then submitted to our cartographers for application to the charts.
The turnaround time for updating the chart depends on the update calendars for each regional cartographic branch.
If the NRTs find any dangers to navigation, the information will be relayed to mariners through the Local Notice to Mariners postings and will be applied to NOAA’s electronic navigational charts (NOAA ENC®), online products, and print-on-demand paper charts.
Critical updates will be applied to charts more quickly than normal depth adjustments.

Studying the heart of El Niño, where its weather begins

The weather phenomenon known as El Niño can cause dramatic effects around the world.
Henry Fountain explains where it comes from.
Video by Henry Fountain, Aaron Byrd and Ben Laffin.

From NYTimes by Henry Fountain

A thousand miles south of Hawaii, the air at 45,000 feet above the equatorial Pacific was a shimmering gumbo of thick storm clouds and icy cirrus haze, all cooked up by the overheated waters below.
In a Gulfstream jet more accustomed to hunting hurricanes in the Atlantic, researchers with the National Oceanic and Atmospheric Administration were cruising this desolate stretch of tropical ocean where the northern and southern trade winds meet.
It’s an area that becalmed sailors have long called the doldrums, but this year it is anything but quiet.
This is the heart of the strongest El Niño in a generation, one that is pumping moisture and energy into the atmosphere and, as a result, roiling weather worldwide.

 A satellite image of the area of the Pacific where a NOAA research team would be flying. 
Kent Nishimura for The New York Times

The plane, with 11 people aboard including a journalist, made its way Friday on a long westward tack, steering clear of the worst of the disturbed air to the south.
Every 10 minutes, on a countdown from Mike Holmes, one of two flight directors, technicians in the rear released an instrument package out through a narrow tube in the floor.
Slowed by a small parachute, the devices, called dropsondes, fell toward the water, transmitting wind speed and direction, humidity and other atmospheric data back to the plane continuously on the way down.

Leonard Miller, a NOAA technician, left, testing an instrument package called a dropsonde that was set to be launched from the plane’s delivery system, right.
Credit Left, Henry Fountain/The New York Times; right, Kent Nishimura for The New York Times
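As an illustration of the kind of data stream a dropsonde produces, here is a minimal, hypothetical sketch in Python; the record fields, values, and wind-component conversion are my own illustration of the standard meteorological convention, not NOAA’s actual telemetry format or processing code.

```python
import math
from dataclasses import dataclass

@dataclass
class DropsondeLevel:
    """One hypothetical dropsonde report transmitted on the way down."""
    pressure_hpa: float      # atmospheric pressure (hPa)
    wind_speed_ms: float     # wind speed (m/s)
    wind_dir_deg: float      # direction the wind blows from (degrees clockwise from north)
    rel_humidity_pct: float  # relative humidity (%)

def wind_components(speed_ms, dir_deg):
    """Convert meteorological speed/direction to u (eastward) and v (northward) components."""
    rad = math.radians(dir_deg)
    return -speed_ms * math.sin(rad), -speed_ms * math.cos(rad)

# Illustrative values only, not real flight data.
profile = [
    DropsondeLevel(150.0, 22.0, 250.0, 40.0),
    DropsondeLevel(500.0, 15.0, 240.0, 65.0),
    DropsondeLevel(850.0, 8.0, 90.0, 85.0),
]

for level in profile:
    u, v = wind_components(level.wind_speed_ms, level.wind_dir_deg)
    print(f"{level.pressure_hpa:6.1f} hPa  u={u:+6.1f} m/s  v={v:+6.1f} m/s  "
          f"RH={level.rel_humidity_pct:.0f}%")
```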

The information, parsed by scientists and fed into weather models, may improve forecasting of El Niño’s effect on weather by helping researchers better understand what happens here, at the starting point.
“One of the most important questions is to resolve how well our current weather and climate models do in representing the tropical atmosphere’s response to an El Niño,” said Randall Dole, a senior scientist at NOAA’s Earth System Research Laboratory and one of the lead researchers on the project. “It’s the first link in the chain.”
An El Niño forms about every two to seven years, when the surface winds that typically blow from east to west slacken.
As a result, warm water that normally pools along the Equator in the western Pacific piles up toward the east instead.
Because of this shift, the expanse of water — which in this El Niño has made the central and eastern Pacific as much as 5 degrees Fahrenheit hotter than usual — acts as a heat engine, affecting the jet streams that blow at high altitudes.
That, in turn, can bring more winter rain to the lower third of the United States and dry conditions to southern Africa, among El Niño’s many possible effects.
Aided by vast processing power and better data, scientists have improved the ability of their models to predict when an El Niño will occur and how strong it will be.
As early as last June, the consensus among forecasters using models developed by NOAA, as well as other American and foreign agencies and academic institutions, was that a strong El Niño would develop later in the year, and it did.

But scientists have been less successful at forecasting an El Niño’s effect on weather.
This year, for instance, most models have been less certain about what it will mean for parched California.
So far, much of the state has received higher than usual precipitation, but it is still unclear whether Southern California, especially, will be deluged as much as it was during the last strong El Niño, in 1997-98.
Anthony Barnston, the chief forecaster at the International Research Institute for Climate and Society at Columbia University, who has studied the accuracy of El Niño modeling, said that so-called dynamical models, which simulate the physics of the real world, have recently done a better job in predicting whether an El Niño will occur than statistical models, which rely on comparisons of historical data.
With a dynamical model, Dr. Barnston said, data representing current conditions is fed into the model, and off it goes.
“You plug it in and you crank it forward in time,” he said.
This can be done dozens of times — or as often as money will allow — tweaking the data slightly each time and averaging the outcomes.
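To make that ensemble procedure concrete, here is a minimal sketch under loose assumptions: a deliberately toy "dynamical model" (a stand-in, not anything NOAA or Dr. Barnston actually runs) is cranked forward in time from slightly perturbed copies of the observed initial state, and the outcomes are averaged.

```python
import random

def toy_model_step(state):
    """One time step of a toy stand-in for a dynamical model:
    a damped anomaly nudged toward a warm background. Not real physics."""
    return 0.9 * state + 0.2

def run_member(initial_state, steps):
    """Crank a single ensemble member forward in time."""
    state = initial_state
    for _ in range(steps):
        state = toy_model_step(state)
    return state

def ensemble_forecast(observed_state, members=30, steps=90, perturbation=0.1):
    """Perturb the observed initial conditions slightly for each member,
    run every member forward, and average the outcomes."""
    outcomes = [run_member(observed_state + random.gauss(0.0, perturbation), steps)
                for _ in range(members)]
    return sum(outcomes) / len(outcomes)

# Illustrative numbers only: a +1.5 unit warm initial anomaly.
print(f"Ensemble-mean forecast: {ensemble_forecast(1.5):.2f}")
```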
With any model, good data is crucial.
El Niño models have been helped by the development of satellites and networks of buoys that can measure sea-surface temperatures and other ocean characteristics.
When it comes to forecasting El Niño’s weather effects, however, good data can be harder to come by.
That’s where the NOAA research project aims to help, by studying a key process in the El Niño-weather connection: deep tropical convection.

 Alan S. Goldstein, the radar monitor for the mission, with other researchers before the flight.
Credit Kent Nishimura for The New York Times 

The clouds that the NOAA jet cruised past on Friday were a result of this process, in which air over the warm El Niño waters picks up heat and moisture and rises tens of thousands of feet.
When the air reaches high altitudes — about the flight level of the Gulfstream — the moisture condenses into droplets, releasing energy in the form of heat and creating winds that flow outward.
Scientists know that the energy released can induce a kind of ripple in a jet stream, a wave that as it travels along can affect weather in disparate regions around the world.
And they know that the winds that are generated can add a kick to a jet stream, strengthening it. That’s a major reason California and much of the southern United States tend to be wetter in an El Niño; the winds from convection strengthen the jet stream enough that it reaches California and beyond.
But to study convection during an El Niño, data must be collected from the atmosphere as well as the sea surface.
That’s a daunting task, because the convection occurs in one of the most remote areas of the planet. As a result, there has been little actual data on convection during El Niño events, Dr. Dole said, and most models, including NOAA’s own, have had to make what amount to educated guesses about the details of the process.
“Our strong suspicion is that our models have major errors in reproducing some of these responses,” he said.
“The only way we can tell is by going out and doing observations.”
When forecasters last year began to predict a strong El Niño, the NOAA scientists saw an opportunity and started making plans for a rapid-response program of research.

Dr. Dole estimated that it would normally take two or three years to put together a program they assembled in about six months.
In a way, he said, they were helped by the developing El Niño, which suppressed hurricane activity in the Atlantic last fall.
The Gulfstream flew fewer missions and the available flight hours, as well as extra dropsondes, were transferred to the project.
In addition to the jet — which is also equipped with Doppler radar to study wind — the program is launching other sondes, from a ship and a small atoll near the Equator.
A large remotely piloted aircraft from NASA, the Global Hawk, has also been enlisted to study the Pacific between Hawaii and the mainland.
The Gulfstream flight Friday was the researchers’ fourth so far, out of nearly two dozen planned over the next month.
The day began at Honolulu International Airport five hours before the 11:30 a.m. takeoff, when Ryan Spackman, the other lead investigator, and NOAA colleagues sat down for a weather briefing with Dr. Dole and other scientists at the agency’s offices in Boulder, Colo.
The original plan was to fly due south from Honolulu and around an area of convection — a “cell” in meteorological terms — near the Equator.
But when the plane’s three pilots arrived for their briefing several hours later, the plan was changed out of safety concerns.
There was a risk they would have no way to get back from the south side of the convection area without going through a storm, and the Gulfstream, unlike NOAA’s other hurricane-hunting planes, cannot do that.


Tuesday, February 2, 2016

Microsoft plumbs ocean’s depths to test underwater data center


Introducing Microsoft Project Natick, a Microsoft research project to manufacture and operate an underwater datacenter.
The initial experimental prototype vessel, christened the Leona Philpot after a popular Xbox game character, was operated on the seafloor approximately one kilometer off the Pacific coast of the United States from August to November of 2015.
Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, and high responsiveness, and that are more environmentally sustainable.

From NYTimes by John Markoff

Taking a page from Jules Verne, researchers at Microsoft believe the future of data centers may be under the sea.
Microsoft has tested a prototype of a self-contained data center that can operate hundreds of feet below the surface of the ocean, eliminating one of the technology industry’s most expensive problems: the air-conditioning bill.
Today’s data centers, which power everything from streaming video to social networking and email, contain thousands of computer servers generating lots of heat.
When there is too much heat, the servers crash.
Putting the gear under cold ocean water could fix the problem.
It may also answer the exponentially growing energy demands of the computing world, because Microsoft is considering pairing the system with either a turbine or a tidal energy system to generate electricity.
The effort, code-named Project Natick, might lead to strands of giant steel tubes linked by fiber optic cables placed on the seafloor.
Another possibility would suspend containers shaped like jelly beans beneath the surface to capture the ocean current with turbines that generate electricity.

 Ben Cutler, left, and Norman Whitaker, both of Microsoft Research, with the “Leona Philpot,” a prototype underwater data center, at the company’s headquarters in Redmond, Wash.
Credit Matt Lutton for The New York Times

“When I first heard about this I thought, ‘Water ... electricity, why would you do that?’ ” said Ben Cutler, a Microsoft computer designer who is one of the engineers who worked on the Project Natick system.
“But as you think more about it, it actually makes a lot of sense.”
Such a radical idea could run into stumbling blocks, including environmental concerns and unforeseen technical issues.
But the Microsoft researchers believe that by mass producing the capsules, they could shorten the deployment time of new data centers from the two years it now takes on land to just 90 days, offering a huge cost advantage.
The underwater server containers could also help make web services work faster.
Much of the world’s population now lives in urban centers close to oceans but far away from data centers, which are usually built in out-of-the-way places with lots of room.
The ability to place computing power near users lowers the delay, or latency, people experience, which is a big issue for web users.
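As a rough, back-of-the-envelope illustration of why distance matters (my own figures, not Microsoft’s), the floor on round-trip latency can be estimated from the one-way distance and the speed of light in fiber, roughly 200,000 km per second:

```python
# Illustrative propagation-latency estimate only: ignores routing, switching,
# and server processing time, and assumes ~200,000 km/s signal speed in fiber.
SPEED_IN_FIBER_KM_PER_MS = 200.0

def min_round_trip_ms(distance_km):
    """Lower bound on round-trip time over fiber for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

for km in (50, 500, 5000):  # hypothetical user-to-datacenter distances
    print(f"{km:>5} km away -> at least {min_round_trip_ms(km):5.1f} ms round trip")
```

Even under these idealized assumptions, a data center 5,000 km away adds about 50 milliseconds before any real-world network overhead is counted.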
“For years, the main cloud providers have been seeking sites around the world not only for green energy but which also take advantage of the environment,” said Larry Smarr, a physicist and scientific computing specialist who is director of the California Institute for Telecommunications and Information Technology at the University of California, San Diego.
Driven by technologies as varied as digital entertainment and the rapid arrival of the so-called Internet of Things, the demand for centralized computing has been growing exponentially.
Microsoft manages more than 100 data centers around the globe and is adding more at a rapid clip. The company has spent more than $15 billion on a global data center system that now provides more than 200 online services.

The “Leona Philpot” prototype was deployed off the central coast of California on Aug. 10, 2015.
In 2014, engineers in a branch of Microsoft Research known as New Experiences and Technologies, or NExT, began thinking about a novel approach to sharply speed up the process of adding new power to so-called cloud computing systems.
“When you pull out your smartphone you think you’re using this miraculous little computer, but actually you’re using more than 100 computers out in this thing called the cloud,” said Peter Lee, corporate vice president for Microsoft Research and the NExT organization.
“And then you multiply that by billions of people, and that’s just a huge amount of computing work.”
The company recently completed a 105-day trial of a steel capsule — eight feet in diameter — that was placed 30 feet underwater in the Pacific Ocean off the Central California coast near San Luis Obispo.
Controlled from offices here on the Microsoft campus, the trial proved more successful than expected.
The researchers had worried about hardware failures and leaks.
The underwater system was outfitted with 100 different sensors to measure pressure, humidity, motion and other conditions to better understand what it is like to operate in an environment where it is impossible to send a repairman in the middle of the night.
The system held up.
That led the engineers to extend the time of the experiment and to even run commercial data-processing projects from Microsoft’s Azure cloud computing service.
The research group has started designing an underwater system that will be three times as large.
It will be built in collaboration with a yet-to-be-chosen developer of an ocean-based alternative-energy system.
The Microsoft engineers said they expected a new trial to begin next year, possibly near Florida or in Northern Europe, where there are extensive ocean energy projects underway.
The first prototype, affectionately named Leona Philpot — a character in Microsoft’s Halo video game series — has been returned, partly covered with barnacles, to the company’s corporate campus here.
It is a large white steel tube, covered with heat exchangers, with its ends sealed by metal plates and large bolts.
Inside is a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips while the system was tested on the ocean floor.
The idea for the underwater system came from a research paper written in 2014 by several Microsoft data center employees, including one with experience on a Navy submarine.
Norman A. Whitaker, the managing director for special projects at Microsoft Research and the former deputy director at the Pentagon’s Defense Advanced Research Projects Agency, or Darpa, said the underwater server concept was an example of what scientists at Darpa called “refactoring,” or completely rethinking the way something has traditionally been accomplished.
Even if putting a big computing tube underwater seems far-fetched, the project could lead to other innovations, he said.
For example, the new undersea capsules are designed to be left in place without maintenance for as long as five years.
That means the servers inside it have to be hardy enough to last that long without needing repairs.
That would be a stretch for most servers, but they will have to improve in order to operate in the underwater capsule — something the Microsoft engineers say they are working on.

Project Natick vessel being deployed.
They’re also rethinking the physical alignment of data centers.
Right now, servers are put in racks so they can be maintained by humans.
But when they do not need maintenance, many parts that are just there to aid human interaction can be removed, Mr. Whitaker said.
“The idea with refactoring is that it tickles a whole bunch of things at the same time,” he said.
In the first experiment, the Microsoft researchers said they studied the impact their computing containers might have on fragile underwater environments.
They used acoustic sensors to determine if the spinning drives and fans inside the steel container could be heard in the surrounding water.
What they found is that the clicking of the shrimp that swam next to the system drowned out any noise created by the container.
The aspect of the project with the most obvious potential is harvesting electricity from the movement of seawater.
This could mean that no new energy is added to the ocean and, as a result, there is no overall heating, the researchers asserted.
In their early experiment the Microsoft engineers said they had measured an “extremely” small amount of local heating of the capsule.
“We measured no heating of the marine environment beyond a few inches from the vessel,” Dr. Lee said.


Monday, February 1, 2016

US NOAA update in the GeoGarage platform

4 nautical raster charts updated

Less than 1 percent of the world's shipwrecks have been explored

US AWOIS database (extract)

From Popular Mechanics by Jay Bennett

A rough estimate puts more than three million shipwrecks on the ocean floor.
This number represents ships throughout the entirety of human history, from 10,000-year-old dugout canoes preserved in the muck to 21st century wrecks that you might have read about in the news.
There are so many shipwrecks, in fact, that a search operation for the missing Malaysia Airlines Flight 370 has discovered two by accident.
The Battle of the Atlantic alone, which spanned nearly six years during World War II, claimed over 3,500 merchant vessels, 175 warships, and 783 submarines.
Particularly interesting are the cargo ships that literally contain treasure, such as Spanish galleons that transported gold and jewels across the Atlantic.
The Uluburun shipwreck off the coast of southwestern Turkey is roughly 3,300 years old, and that Late Bronze Age vessel contained gold, silver, jewels, copper and tin ingots, tools, swords and other weapons, and much more trade cargo—all of it hauled up over the course of 10 years and 22,413 dives.
But most wrecks don't receive that kind of attention.
In fact, less than 10 percent of the shipwrecks that we’ve located—which account for just 10 percent of all shipwrecks in the world—have been surveyed or visited by divers. In other words, less than 1 percent of the world’s shipwrecks have been explored.
Fishing trawlers snag on sunken ships, sonar readings pick them up, historical records tell us where they should be, harbor dredging operations uncover wrecks that have long been lost below the seafloor—but there simply isn't enough time and money to explore the vast majority of them.

The Sweepstakes was built in 1867 as a two-masted schooner in Burlington, Ontario by John Simpson.
The ship is 36.3m (119ft) long and lies just below the surface in Big Tub harbor, at a maximum depth of 7m (20ft).
The Sweepstakes was damaged off Cove Island, then towed to Big Tub harbor where she sank in 1885.
At times the shipwreck sits well below the surface of Lake Huron; when the lake becomes shallower, sections of the Sweepstakes rise out of the water, making parts of the wreck clearly visible.

Throughout Fathom Five National Marine Park, there are 22 shipwrecks and many people come here to snorkel and scuba dive in these pristine waters.

Daunting Task

James Delgado, the Director of Maritime Heritage at the National Oceanic and Atmospheric Administration (NOAA), says that there are an estimated 4,300 shipwrecks within NOAA's 14 National Marine Sanctuaries.
Of these, 432 have been dived on and surveyed.
And these are shipwrecks within a mapped area set aside for preservation.
"There are laws and regulations directing NOAA to find what lies in those waters and assess it," Delgado said in an email.
Similar to other marine preservation organizations around the world, NOAA is not only devoted to discovering what the ships are, but also how their presence might affect the ecology of the marine environments they lie within.
Outside of marine sanctuaries, there isn't as much of an incentive.
Most shipwrecks are documented for a much simpler reason: to avoid collisions or other incidents. NOAA's Office of Coast Survey maintains a database of about 20,000 ships that is available to the public, primarily for the benefit of navigators and researchers.
The information for that database comes from two sources within NOAA: the Electronic Navigational Charts (ENC) and the Automated Wreck and Obstruction Information System (AWOIS).
Still, it's difficult to pinpoint exactly where a shipwreck is on the ocean floor.
The database lists some limitations, including that it "contains wreck features from two different sources that were created for different purposes, and there is not a perfect match of features from either source. The same wreck may be found in both the ENC wrecks and AWOIS wrecks layers, although the positions may not necessarily agree."
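A hedged sketch of what reconciling those two layers can involve: hypothetical wreck records from the ENC and AWOIS sources are compared by great-circle distance, and pairs closer than a chosen tolerance are flagged as likely the same wreck. The field names, coordinates, and threshold below are assumptions for illustration, not the actual schema of NOAA’s database.

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in nautical miles."""
    earth_radius_nm = 3440.1
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlon / 2) ** 2)
    return 2 * earth_radius_nm * math.asin(math.sqrt(a))

# Hypothetical records; the real ENC and AWOIS layers use different schemas.
enc_wrecks = [{"id": "ENC-001", "lat": 30.251, "lon": -88.012}]
awois_wrecks = [{"id": "AWOIS-9001", "lat": 30.253, "lon": -88.015},
                {"id": "AWOIS-9002", "lat": 30.900, "lon": -88.500}]

MATCH_TOLERANCE_NM = 0.5  # assumed tolerance: "same wreck, positions may not agree"

for e in enc_wrecks:
    for a in awois_wrecks:
        d = haversine_nm(e["lat"], e["lon"], a["lat"], a["lon"])
        if d < MATCH_TOLERANCE_NM:
            print(f"{e['id']} and {a['id']} are {d:.2f} nm apart: likely the same wreck")
```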

Riches Under the Sea

Still, there is an estimated $60 billion in sunken treasure around the world, just waiting at the bottom of the ocean.
And that doesn't include the historical and cultural value of excavating shipwreck sites.
So why don't we explore more of them?
For one thing, it's hard to know what's worth the time.
Diving operations can cost millions of dollars, and before we go down there, we have no idea what the ship is, what it was carrying, and what condition the cargo is in. In some cases, we are not even 100 percent sure that the identified object is a ship at all.
"Not many people follow up on a target to determine if it is a wreck, and if so what type it is, and then if possible, which ship it is," says Delgado.
It is possible, however, that the situation will improve.
As Delgado points out, 90 to 95 percent of the sea floor itself remains unexplored.
There are a number of efforts to change that, such as the Ocean Discovery XPrize, which is offering $7 million in prize money for private teams that build an autonomous underwater vehicle (AUV) and create a bathymetric map (like a topographic map, but of the sea floor).
The Schmidt Ocean Institute, founded by former Google CEO Eric Schmidt, maintains a 272-foot vessel outfitted with modern oceanographic equipment that scientists can apply to use for various research expeditions. 
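For a sense of what producing a bathymetric map like that involves computationally, here is a tiny, hypothetical sketch that bins scattered depth soundings into a regular grid of average depths; real bathymetric processing uses far more sophisticated gridding, filtering, and uncertainty handling.

```python
from collections import defaultdict

def grid_soundings(soundings, cell_deg=0.01):
    """Average scattered (lat, lon, depth_m) soundings into cells of a regular
    grid. A toy stand-in for real bathymetric gridding algorithms."""
    cells = defaultdict(list)
    for lat, lon, depth in soundings:
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells[key].append(depth)
    return {key: sum(depths) / len(depths) for key, depths in cells.items()}

# Illustrative soundings only (lat, lon, depth in meters).
samples = [(36.800, -122.001, 95.0), (36.801, -122.002, 97.0), (36.850, -122.050, 410.0)]
for cell, mean_depth in grid_soundings(samples).items():
    print(f"cell {cell}: mean depth {mean_depth:.1f} m")
```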
The good news, for shipwreck explorers at least, is that the majority of shipwrecks are actually near the coast, with a large percentage of incidents occurring in and around the entryways to ports and harbors.
"Some harbors are tough to enter, like Oregon's Columbia River Bar, or leave, like San Francisco's Golden Gate and Bar, due to shifting winds, shifting sands, fog, storms, or strong tides," says Delgado.
"But also for the same reason that most auto accidents seem to happen within a mile of home, and there are many accidents coming in and out of parking lots, people seem to be less cautious or more aggressive."
With most shipwrecks so close to the shore, and multiple examples of wealthy patrons sponsoring exploration and research expeditions, we could see many of these unexplored shipwrecks investigated in the coming years.