Saturday, April 13, 2013

‘Lost’ sailing pioneer photo archive uncovered after 4 decades

'Circa 22nd April 1969: Robin Knox-Johnston with champagne aboard his 32ft yacht SUHAILI off Falmouth, England after becoming the first man to sail solo non-stop around the globe. Knox-Johnston was the sole finisher in the Sunday Times Golden Globe solo round the world race, having set out from Falmouth, England on 14th June 1968'    Bill Rowntree - PPL ©

From PPLmedia

A 3,000-strong archive of pictures, taken during Sir Robin Knox-Johnston’s pioneering solo non-stop round the world voyage in 1968/9, has been discovered and saved for posterity by specialist photo library PPL.
The story is published for the first time in the May issue of Classic Boat magazine.

The pictures cover Knox-Johnston's entry in the Sunday Times Golden Globe Race in 1968/9, when Robin was one of nine starters attempting to become the first to sail alone non-stop around the world. The event was billed as one of the last great challenges left to man.
For the other eight it proved a challenge too far: Robin was the only finisher, completing the 27,000-mile circumnavigation aboard his self-built 32ft ketch Suhaili in 312 days.



The valuable archive had been gathering dust in the library of the Sunday Mirror newspaper, which had published Robin's stories during the voyage.
The negatives were about to be dumped in a skip when the London newspaper moved from Fleet Street to Canary Wharf, but by pure chance, Bill Rowntree, the Mirror photographer who had covered Knox-Johnston's departure and return to Falmouth, happened to be at the newspaper on the day of the big clear-out.

Robin Knox-Johnston celebrates aboard

Bill, now 73, recalls, “The picture library manager poked his head round the door and asked if I had any use for them.”
Rowntree put the box under his arm and took it back to his home at Tunbridge Wells, Kent, and there the negatives might have lain, had it not been for a particular picture request from Henri-Lloyd, the clothing manufacturer that had supplied Knox-Johnston's original oilskins, to mark its 60th anniversary celebrations this year.

A TV helicopter hovers overhead as Robin Knox-Johnston sails his 32ft yacht SUHAILI off Falmouth

Many of the colour pictures in Knox-Johnston's bestselling book A World of My Own were lost after publication, but PPL Photo Agency holds what was left of the Knox-Johnston archive within its Pictures of Yesteryear Library.
This also contains the archives of other famous sailing pioneers including Sir Francis Chichester, Sir Chay Blyth, and photographer Eileen Ramsay, who captured many of these great events between 1949 and 1970.
It was a call to Rowntree from PPL's Barry Pickthall, to see if he still had the negative that Henri-Lloyd needed, that prompted the question: “What should we do with the other 3,000 pictures I have here?”

Robin Knox-Johnston regaling friends with stories of his solo circumnavigation, in the bar of the Royal Cornwall Yacht Club

Sir Robin Knox-Johnston, who is now inspiring hundreds of amateur sailors to follow in his adventurous wake aboard a 12-strong fleet of yachts in the Clipper Round the World Race starting from the UK in August, says of the find, "I thought these pictures were lost, so it was a wonderful surprise to discover that this unique record of my return to Falmouth in 1969 still exists."


Dan Houston, the Editor of Classic Boat Magazine, says:

"There is something quite awe-inspiring when you look at these photographs. They capture a moment in sailing history when Sir Robin, then a young man, almost unwittingly stepped into the history books. His boat Suhaili was not the fastest – he never expected to win, but as the other competitors were beaten by the elements and gear failure, it was Robin and his seaworthy little ketch that won the day. Sir Robin went on to become Britain's most endearing and enduring sailing hero – and gave his £5,000 prize money from the race to the family of Donald Crowhurst who committed suicide during the race.


“These photographs show the man as much as the hero: his parents' tense faces as he reassures them saying goodbye; life aboard in his untidy cabin; his first unsteady steps as he gets ashore; his first beer ashore. It is all such a contrast with how things are done today – and it makes his feat of seamanship all the more remarkable.”


Links :

Friday, April 12, 2013

Liquid Robotics unveils Wave Glider SV3 ocean robot which brings the Cloud to the waves

Liquid Robotics debuted Monday the Wave Glider SV3, the world's first hybrid wave- and solar-propelled unmanned ocean robot, and with it, the technology to explore portions of the world previously too challenging or costly to reach. (Photo : Liquid Robotics)

From CNN

Where data is concerned -- that is, usable, potentially profitable data -- the world's oceans are somewhat akin to black holes: We know they are out there, but beyond what we can see at the horizon, we really have no idea what's happening in any one place from one moment to the next.
The amount of useful information streaming back to shore from the world's oceans, home to critical food stocks, abundant energy reserves, vital shipping lanes, and the engine driving global climate, is so thin as to be meaningless for all but the most academic purposes.

Liquid Robotics' Wave Gliders are the first unmanned autonomous marine robots that use the ocean's waves for energy.

"Do you realize that in the ocean today there is often one sensor for an area the size of California?" says Liquid Robotics CEO Bill Vass.
He likens this to standing in Death Valley and trying to determine the local temperature via a thermometer that is hundreds of miles away.
"It may not feel like 58 degrees to you," he says, capping off the analogy.
"But that's what your sensor says because your sensor is in San Francisco."

This dearth of data places Liquid Robotics in a truly unique position.
Its seaworthy, sensor-laden, surfboard-shaped Wave Glider robots use a novel propulsion system to convert the rolling motion of ocean waves into energy for forward thrust, creating a self-contained system that requires no refueling and very little maintenance as long as the ocean continues to move. 

The company proved this last December when one of its Wave Gliders -- launched from San Francisco a year prior -- arrived in Brisbane, Australia after autonomously completing a 9,000-mile trans-Pacific crossing.
It proved its durability again when one of NOAA's Wave Gliders traveled right through the center of Hurricane Sandy in October.
But with the release of its latest iteration of Wave Glider this week at the Navy's Sea-Air-Space expo near Washington, D.C., Liquid Robotics has more or less completed a transition from robot manufacturer to one of the world's more interesting big data companies.

Why?
Because the Wave Glider's trip across the Pacific was little more than a warm-up lap.
Liquid Robotics has previously sold and leased its robotic sensing platforms to the U.S. Navy and the National Oceanic and Atmospheric Administration, but for most applications -- oil and gas exploration, tropical storm tracking and prediction, fisheries management, maritime threat interdiction -- data, rather than a robot, is what the customer really wants.

Wave Gliders can already provide data by the terabyte.
But the brand-new Wave Glider SV3 also processes data by the terabyte and networks with other Wave Gliders in its vicinity, basically creating an information-rich cloud stretching across the high seas.


Roger Hine, co-founder and CTO of Liquid Robotics, talks about the process of developing the Wave Glider SV3, the next generation of the autonomous ocean surface vehicle

Key to this is a proprietary cloud-based operating system called Regulus (designed for Liquid Robotics by Java creator James Gosling). It allows the SV3 to exhibit a fairly dazzling degree of autonomy while also maintaining an open-source component that allows for rapid deployment of new sensor payloads and software packages, as well as rapidly swappable software and hardware to run them.
Sensor payloads can include virtually anything that fits in the SV3's seven modular payload units: atmospheric and oceanographic sensors applicable to ocean and climate science, video cameras and acoustic sensors useful for national security and marine environment protection purposes, or instruments for mapping and evaluating geography on the seafloor and below.
And thanks to the cloud-based software architecture, some SV3s could carry all of these sensors and more while offloading some of the data processing to another nearby Wave Glider serving as a central data hub.

In other words, Liquid Robotics can deploy something like a floating server farm to process the data collected by other Wave Gliders in the area, then supply customers on shore with only the refined, processed data that they want -- something oil and gas exploration companies in particular have been quick to embrace.
(Alongside NOAA and the U.S. Navy, Liquid Robotics' client list includes names like Schlumberger (SLB) and BP (BP).)
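Liquid Robotics has not published the internals of Regulus, but the hub-and-spoke idea described above is easy to sketch. In the toy Python below, every glider name, sensor field and value is invented for illustration; this is not the company's API.

# Minimal sketch of the "floating data hub" idea: nearby gliders stream raw
# samples to one hub, which reduces them to a compact summary so that only
# the refined product has to cross the expensive satellite link ashore.
# All identifiers and numbers are invented for illustration.
from statistics import mean

raw_samples = [
    {"glider": "SV3-01", "sensor": "sea_surface_temp_c", "value": 17.2},
    {"glider": "SV3-02", "sensor": "sea_surface_temp_c", "value": 17.8},
    {"glider": "SV3-01", "sensor": "wave_height_m", "value": 2.4},
    {"glider": "SV3-02", "sensor": "wave_height_m", "value": 2.1},
]

def summarize(samples):
    """Return one compact record per sensor type instead of every raw sample."""
    by_sensor = {}
    for s in samples:
        by_sensor.setdefault(s["sensor"], []).append(s["value"])
    return {name: {"mean": round(mean(vals), 2), "max": max(vals), "n": len(vals)}
            for name, vals in by_sensor.items()}

print(summarize(raw_samples))
# {'sea_surface_temp_c': {'mean': 17.5, 'max': 17.8, 'n': 2}, 'wave_height_m': {'mean': 2.25, 'max': 2.4, 'n': 2}}

The point of summarising at sea is bandwidth: only the refined product, not every raw sample, has to be sent ashore.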

"If you look at an ocean-rated crew vessel that can do this, it costs about $150,000 a day in a commercial environment," Vass says.
"An ocean-rated research vessel is about $40,000 a day. We do the same kind of data collection -- usually denser data collection actually, because we move more slowly -- at about a tenth of that cost, and we don't pollute or put people at risk when we do it."

 SV3 & SV2 side by side

Currently Liquid Robotics is operating 200 Wave Gliders at sea in every ocean on Earth, Vass says, a number that is growing 60% year over year.
The company has provided data to about 100 customers thus far, and when its current fleet of SV2s -- which largely stream raw data back to shore for processing -- are replaced and upgraded with the SV3's onboard processing capability (the new SV3 begins shipping in Q3, though most of its upgrades are retrofittable to existing SV2s) Liquid Robotics' ability to provide companies with dense but highly refined data sets will likely grow exponentially.

Vass and his colleagues envision a globe swimming with Wave Gliders, creating a mesh network that spans the 70% of the Earth that is, as yet, largely unwired.
"Our customer is anyone who moves over the ocean or extracts value from it," Vass says.
"Or anyone who deals with weather," he adds, more or less tying up what the company sees as its real value proposition.
Not every company needs high-resolution data streaming in from far out at sea, but the data Liquid Robotics provides could have impacts far beyond its client base (think anyone who relies on NOAA or the National Weather Service to make decisions).
And those entities that directly need this kind of data -- whether oil and gas outfits, national security agencies, wildlife managers, oceanographic researchers, or international shipping concerns -- have never been able to access it before.
At least not like this.
"Ten years ago this company would've been science fiction," Vass says.
" Bringing all of this technology together is really going to change the world."

Links :

Thursday, April 11, 2013

Probing the reasons behind the changing pace of warming


From YaleEnvironment360

A consensus is emerging among scientists that the rate of global warming has slowed over the last decade.
While they are still examining why, many researchers believe this phenomenon is linked to the heat being absorbed by the world’s oceans.

Whatever happened to global warming? Right now, that question is a good way of starting a heated argument. Some say it is steaming ahead.
But others say it has stalled, gone into reverse, or never happened at all — and they don’t all run oil companies or vote Republican.

So what is going on?

First, talk of global cooling is palpable nonsense.
This claim relies on the fact that no year has yet been hotter than 1998, an exceptional year with a huge planet-warming El Nino in the Pacific Ocean.
Naysayers pretend that 1998 was typical, when it was anything but, and that temperatures have been declining since, which is statistical sleight of hand.

Meanwhile consider this.
According to the National Oceanic and Atmospheric Administration (NOAA), all 12 years of the new century rank among the 14 warmest since worldwide record-keeping began in 1880. The second-warmest year on record, after 1998, was 2010.
This is not evidence of cooling.

But there is a growing consensus among temperature watchers that the pace of warming in the atmosphere, which began in earnest in the 1970s and seemed to accelerate in the 1990s, has slackened, or stalled, or paused, or whatever word you choose.
It may turn out to be a short blip; but it is real.
“Although the first decade of the 21st century was the warmest on record, warming has not been as rapid since 2000,” says Pete Stott, head of climate monitoring and attribution at the UK’s Met Office, one of the leading keepers of the global temperature.
He calls it a “hiatus” in warming.

In a blog post last week, James Hansen, the retiring head of NASA's Goddard Institute for Space Studies (GISS), agreed that “the rate of global warming seems to be less this decade than it has been during the prior quarter century.”

Something is going on.
With heat-trapping greenhouse gases accumulating in the atmosphere ever faster, we might expect accelerated warming.
So it needs explaining.

There are a number of theories.
Hansen suggested that extra emissions of particles in Asian smogs could be shading the Earth and camouflaging the greenhouse effect.
In a February post on RealClimate, his Goddard Institute colleague Gavin Schmidt instead pointed to fewer warming El Ninos and more cooling La Ninas.
He suggested that adjusting for their influence produced an unbroken pattern of warming.


Schmidt’s analysis certainly hints at a role for the oceans in all this.
And most researchers on the case argue that, one way or another, the most likely explanation for the heating hiatus is that a greater proportion of the greenhouse warming has been diverted from the atmosphere into heating the oceans.
A new study from Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado, published online in Geophysical Research Letters, found that ocean warming has been accelerating over the last 15 years.

Richard Allan of the University of Reading in England says simply: “Warming over the last decade has been hidden below the ocean surface.”
If you take the oceans into account, he says, “global warming has actually not slowed down.”

This should not come as a surprise, notes Chris Rapley of University College London.
The oceans are the planet’s main heat sinks.
More than 90 percent of the extra heat trapped in the atmosphere by greenhouse gases ends up there.
But, while climate models are good at calculating atmospheric processes, they are poorer at plumbing the ocean-atmosphere interactions that determine how fast and how regularly this happens.

That makes those interactions a big source of uncertainty about atmospheric global warming, especially over the short term.
If oceans grab a bit more heat one year, they can shut down that year’s warming. Equally, if they release a bit more they can accelerate atmospheric warming.
This matters. “The way the ocean distributes the extra energy trapped by rising greenhouse gases is critical... [to] global surface temperatures,” says Allan.
For forecasters trying to figure out the next decade or so, oceans could screw it all up.
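A back-of-the-envelope way to see the arithmetic is the simple energy-balance relation dT ~ (F - N) / lambda, where F is the greenhouse forcing, N is the heat flowing into the ocean and lambda is the climate feedback parameter. The sketch below uses round, purely illustrative numbers, not figures from the article.

# Toy energy-balance illustration: surface warming dT ~ (F - N) / lambda.
# More ocean heat uptake N leaves less warming to appear at the surface.
# All values are round illustrative numbers (W/m^2 and W/m^2 per K).
def surface_warming(forcing, ocean_uptake, feedback=1.3):
    return (forcing - ocean_uptake) / feedback

for uptake in (0.5, 0.7, 0.9):
    print(f"N = {uptake} W/m2 -> dT ~ {surface_warming(2.0, uptake):.2f} K")
# Larger N, smaller dT: the heat is still in the system, just not at the surface.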

Some bits of the puzzle have been known for a while.
For instance, during El Nino years, warm water spreads out across the equatorial Pacific and the ocean releases heat into the air, warming the air measurably.
That is what happened in 1998.

But while El Ninos come and go within a year or so, there are other cycles in heat distribution and circulation of the oceans that operate over decades.
They include the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation (AMO), both of which have been implicated in climate fluctuations in the 20th century.
So have these or other ocean cycles been accelerating the uptake of heat by the oceans?


Virginie Guemas of the Catalan Institute of Climate Sciences in Barcelona believes so.
In a paper published in Nature Climate Change this week, she claims to provide the first “robust” evidence linking ocean uptake of heat directly to what she calls the recent “temperature plateau” in the atmosphere.

By plugging detailed measurements of recent atmospheric and sea temperatures into EC-Earth, a European model of interactions between atmosphere, oceans, ice and land surfaces, Guemas found that about 40 percent of the take-up was in the tropical Pacific, and another 40 percent in the tropical and North Atlantic.

She told me that it seems likely the changing thermohaline ocean circulation, which starts in the North Atlantic, plus the cycles of El Nino and perhaps the AMO, may play a prominent role.
She thinks her model could have predicted the recent slowdown of atmospheric warming ahead of time.

That would be a breakthrough, but nobody has done it yet.
Meanwhile, the climate modellers are skating on thin ice when they make predictions that play out over the timescales of a decade or so on which ocean cycles seem to operate.
These forecasters can claim that, all things considered, they have done pretty well.
But the forecasts remain hostages to fortune.

If anything, the recent pause shows the model forecasts in a good light. Myles Allen, a climate modeller at Oxford University in England, reported in Nature Geoscience last month on an audit of one of his own forecasts, which he made in 1999.
He had predicted a warming of a quarter-degree Celsius between the decade that ended in 1996 and the decade that ended in 2012.
He found that, in the real world, temperatures got too warm too soon during the 1990s; but the slackening pace since had brought the forecast right back on track.

That shows the forecast is performing well so far, but Allen admitted it might not stay that way.
If temperatures flat-line out to 2016, his model’s prediction for that year will look no better than a forecast based on a series of random fluctuations.

Some in the mainstream climate community privately admit that they were caught out by the slackening pace of warming in the past decade or so.
Back in the 1990s, some suggested — or at least went along with — the idea that all the warming then was a result of greenhouse gases.
They were slow to admit that other factors might also be at work, and later failed to acknowledge the slowdown in warming. As Allen pointed out earlier this year: “A lot of people were claiming in the run-up to the Copenhagen 2009 [climate] conference that warming was accelerating. What has happened since then has demonstrated that it is foolish to extrapolate short-term climate trends.”

Not surprisingly they have been taken to task for this hubris.
Roger Pielke Jr., an environmental studies professor at the University of Colorado at Boulder, who enjoys baiting the mainstream, told me last month: “It is good to see climate scientists catching up with the bloggers. They should ask why it took so long to acknowledge what has been apparent to most observers for some time.”

But modellers are now responding more actively to the new real-world temperature data.
For instance, the Met Office’s Stott reported last month that global temperatures were following the “lower ranges” of most model forecasts, and that higher projections were now “inconsistent” with the temperature record.

And last December, the Met Office downgraded its best guess for temperatures in the five years to 2017 from 0.54 degrees C above the late-20th-century average to 0.43 degrees above. It said the new forecast was the first output of its latest climate model, HadGEM3, which incorporates new assessments of natural cycles.

But the problem is that these cycles are not well integrated into most climate models.
Natural cycles could switch back to warming us again at any time, admits Stott.
But he has no clear idea when.

The stakes for the climate forecasting community are high.
It may be unfair, but the brutal truth is that if the climatologists get their forecasts for the coming decade badly wrong, then a great many in the public will simply not believe what they have to say about 2050 or 2100 – even though those forecasts may well be more reliable.

Forecasters badly need a way to forecast the ocean fluctuations, and it could just be that Guemas’s new study will help them do that.
She claims that her findings open the way to the future delivery of “operational decadal climate predictions.”
For now she is cautious about making firm predictions, but told me she believes that “the heat that has been absorbed recently by the ocean might very well be released back to the atmosphere soon. This would be the scenario of highest probability. It would mean an increased rate of [atmospheric] warming in the next decade.”

It would indeed.
If natural cycles start pushing towards strong warming, they will add to the continued inexorable upward push from rising concentrations of heat-trapping greenhouse gases.
In that case, we would see climate change returning to the rapid pace of the 1990s.
Whatever happened to global warming?
The odds may be that by 2020 it will have come roaring back.

Links :
  • DiscoveryNews : Think the Planet Isn't Warming? Check the Ocean

Wednesday, April 10, 2013

France SHOM update in the Marine GeoGarage

26 charts have been withdrawn since the last update :

  • 1619    Mouillages de Tarifa   
  • 3357    De la Pte Banda à la Riv. Coanza   
  • 3462    Baie de Ba (Baie Le Bris)   
  • 3519    Delta du Tonkin   
  • 3865    De l'île Hon Tseu au Cap Lay   
  • 4697    Baie de Diego-Suarez   
  • 5414    Baie d'Halong   
  • 5427    Baie de Cam-Ranh   
  • 5509    Du Cap Padaran à la Baie de Cam-Ranh   
  • 5549    De la Baie d'Halong à Pak-Ha-Mun   
  • 5559    Chenaux entre Quang-Yen et la Baie d'Halong   
  • 5563    Baie de Nhatrang   
  • 5571    Cambodge et Cochinchine Mékong  
  • 5652    Grande Baie des Faï Tsi Long   
  • 5676    Abords de Poulo Condore et embouchure du Bassac   
  • 5691    Annam et Cochinchine   
  • 5809    Du Cap Varella à l'île Nuoc   
  • 5892    De la Pointe Samit à Tian-Moi (Ile à l'Eau) Koh Prins, Koh Tang, Poulo Wai  
  • 5899    De Hon-Tseu à Hon-Matt   
  • 6150    Mouillage et Passes de Tamatave   
  • 6238    Iles Anjouan et Mohéli   
  • 6239    Grande Comore   
  • 6290    Abords de Sihanoukville (Kompong Som)   
  • 6527    Port de Tamatave   
  • 6666    De l'estuaire du Gabon à l'estuaire du Congo   
  • 7520    INT 7115 Abords de Djibouti    

and 6 charts have been added :

  • 7547    Abords de Djibouti
  • 7678    Îles Anjouan et Mohéli
  • 7679    Îles Grande Comore et Mohéli
  • 7681    Baie d'Antsiranana (Diégo-Suarez)
  • 7683    Port et Passes de Toamasina (Tamatave)
  • 7791    De Gamba à Luanda 

so 600 charts from SHOM are displayed in the Marine GeoGarage

Sharper view of the Southern Ocean: New chart shows the entire topography of the Antarctic seafloor in detail for the first time


 The new IBCSO map of Antarctica (Alfred-Wegener-Institut)

From AWI

Reliable information on the depth and floor structure of the Southern Ocean has so far been available for only a few coastal regions of the Antarctic.
An international team of scientists under the leadership of the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, has for the first time succeeded in creating a digital map of the entire Antarctic seafloor.
The International Bathymetric Chart of the Southern Ocean (IBCSO) for the first time shows the detailed topography of the seafloor for the entire area south of 60°S.
The article presenting IBCSO to the scientific world has now appeared online in the scientific journal Geophysical Research Letters.
The IBCSO data grid and the corresponding Antarctic chart will soon be freely available on the internet and are intended to help scientists, among others, to better understand and predict sea currents, geological processes and the behaviour of marine life.

 Multibeam bathymetric survey techniques provide a rapid means of determining the morphology and nature of the seafloor.
The Hydrosweep DS-2 system currently onboard R/V Polarstern provides 59 individual soundings of the water depth and echo strength for each ping.
In addition, sidescan information (2,048 echoes per ping) is retrieved.
The system can be operated with a 90 or 120 degree fan angle and is designed for deep-sea observations.
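For a rough sense of the coverage such a fan gives, the swath width over an idealised flat seafloor is about 2 x depth x tan(fan angle / 2). In the sketch below the 90 and 120 degree angles come from the description above, while the 4,000 m depth is simply an assumed example.

# Idealised multibeam swath width over a flat seafloor:
#   width = 2 * depth * tan(fan_angle / 2)
# Fan angles are from the text; the 4,000 m depth is an assumed example.
import math

def swath_width_m(depth_m, fan_angle_deg):
    return 2 * depth_m * math.tan(math.radians(fan_angle_deg / 2))

for fan_deg in (90, 120):
    print(f"{fan_deg} deg fan at 4000 m depth -> ~{swath_width_m(4000, fan_deg):,.0f} m swath")
# 90 degrees gives roughly 8,000 m of coverage per ping, 120 degrees roughly 13,900 m.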

The new bathymetric chart of the Southern Ocean is an excellent example of what scientists can achieve if researchers from around the world work across borders.
“For our IBCSO data grid, scientists from 15 countries and over 30 research institutions brought together their bathymetric data from nautical expeditions. We were ultimately able to work with a data set comprising some 4.2 billion individual values”, explains IBCSO editor Jan Erik Arndt, bathymetric expert at the Alfred Wegener Institute in Bremerhaven.

This video shows typical use cases of a modern multibeam echosounder, such as the third-generation HYDROSWEEP, on a modern research vessel such as the German research icebreaker POLARSTERN.

Collecting bathymetric data, as the German research vessel Polarstern does with its multibeam echo sounding system, was nowhere near enough, however, to develop a useful, three-dimensional model of the seafloor:
“The ocean south of the 60th parallel extends over an area of some 21 million square kilometres and is therefore around 60 times as large as the Federal Republic of Germany. Reliable bathymetric data have so far existed for only 17 per cent of this area. The largest data gaps, for example, are in the deep sea regions of the south Indian Ocean and the South Pacific and in areas which experience difficult sea ice conditions throughout the year in some places, such as in the Weddell Sea”, says Jan Erik Arndt.

The IBCSO database currently consists of more than 4,200 million data points contributed by more than 30 institutions from 15 countries.
In total, 177 multibeam cruises were available, forming the nucleus of the database.
Single beam echo sounding data, digitized soundings from nautical charts and regional bathymetric compilations round off the database.

For this reason the mappers did not just take the trouble to digitize old Antarctic nautical charts and to convert satellite data.
They also used a mathematical trick: they interpolated the data set.
“We treated every existing measurement point like a tent pole to a certain extent and arithmetically covered these poles with a tarpaulin. In this way we obtained approximate values about the height of the tarpaulin between the poles”, explains the AWI specialist for data modeling.
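In computing terms the "tent pole" trick is ordinary spatial interpolation of scattered soundings onto a regular grid. The sketch below, using made-up soundings and SciPy's generic griddata routine, shows the idea only; the actual IBCSO processing chain is considerably more involved.

# Gridding scattered depth soundings ("tent poles") onto a regular grid.
# The soundings are made up; this is not the IBCSO processing chain.
import numpy as np
from scipy.interpolate import griddata

# Scattered soundings: (x, y) positions and depths, all in metres.
points = np.array([[0, 0], [1000, 0], [0, 1000], [1000, 1000], [600, 400]])
depths = np.array([-3200.0, -3150.0, -3300.0, -3280.0, -3050.0])

# Regular grid with a 500 m cell size, matching the IBCSO grid spacing.
gx, gy = np.meshgrid(np.arange(0, 1001, 500), np.arange(0, 1001, 500))
grid = griddata(points, depths, (gx, gy), method="linear")
print(grid)  # interpolated depths between the "poles"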

 Marie Byrd seamounts
Depiction of the Marie Byrd seamounts in the IBCSO grid (bottom) compared to its depiction in the older GEBCO_08 data grid (top).
The lines one can see in the IBCSO-graph are the multibeam tracklines of the research vessels.

The work was worth it: the IBCSO data grid has a resolution of 500 by 500 metres.
This means that one data point reflects the depth of a sea area of 500 by 500 metres – a feature that leads to an impressive degree of detail.
Where older models only offer a glimpse of a mountain in the deep sea, IBCSO shows an elevation with sharp ridge crests and deep channels in the slopes.

 Depiction of the region around Ritscher Canyon in the IBCSO grid (bottom) compared to its depiction in the older GEBCO_08 data grid (top).

A formerly flat spot at the bottom of the Riiser-Larsen Sea can now be identified as an offshoot, some 300 metres deep, of the underwater Ritscher Canyon, which runs for over 100 kilometres from the south-west to the north.
And not far from today's ice edge of the large Getz Ice Shelf, the furrows ploughed into the seafloor by the growing ice tongue can be seen quite clearly.

Depiction of the region north of Neumayer station III (Weddell sea) in the IBCSO grid (bottom) compared to its depiction in the older GEBCO_08 data grid (top).

With this degree of detail, IBCSO is primarily intended to push research ahead: “The depth data of the Southern Ocean are of great interest to polar researchers from many disciplines. The 3D data grids of the seafloor enable oceanographers to model currents and the movement of the deep Antarctic water which is of such great importance. Geologists are able to recognise the structures of geological processes more easily. Biologists may be able to better estimate the regions in which certain biological communities may emerge or whether, for example, seals dive to the bottom of the sea in a certain area in search of food”, explains Jan Erik Arndt.

Depiction of the region around Dotson Getz trough in the IBCSO grid (bottom) compared to its depiction in the older GEBCO_08 data grid (top).

However, despite the elation about the new model and its chart, it should not be forgotten that more than 80 per cent of the area of the South Polar Sea is still uncharted.
Jan Erik Arndt: “We hope that as our data grid becomes better known in the scientific world, other scientists will be more willing to provide us with their data of current and future depth measurements in the South Polar Sea. The chances are not bad. A few new research ice breakers are currently being built around the world and every one of them will presumably be equipped with a modern multibeam echo sounder in the same way as Polarstern.”

IBCSO is a project of the General Bathymetric Chart of the Oceans (GEBCO).
It is supported by the Intergovernmental Oceanographic Commission (IOC) of UNESCO, the International Hydrographic Organization (IHO), the Hydrographic Commission on Antarctica (HCA) and by the Scientific Committee on Antarctic Research (SCAR).
The geodesy and bathymetry working group of the Alfred Wegener Institute coordinates the project and is responsible for the entire modelling work.

Links :

Mysterious structure discovered beneath Sea of Galilee

This shot of the Sea of Galilee was taken near the old city of Tiberias.
The newly discovered structure is located just to the south. Credit : Deror Avi

From LiveScience

A giant "monumental" stone structure discovered beneath the waters of the Sea of Galilee in Israel has archaeologists puzzled as to its purpose and even how long ago it was built.
The mysterious structure is cone shaped, made of "unhewn basalt cobbles and boulders," and weighs an estimated 60,000 tons, the researchers said.
That makes it heavier than most modern-day warships.
Rising nearly 32 feet (10 meters) high, it has a diameter of about 230 feet (70 meters).
To put that in perspective, the outer stone circle of Stonehenge has a diameter just half that with its tallest stones not reaching that height.


 >>> geolocalization with the Marine GeoGarage <<<

It appears to be a giant cairn, rocks piled on top of each other.
Structures like this are known from elsewhere in the world and are sometimes used to mark burials.
Researchers do not know if the newly discovered structure was used for this purpose.
The structure was first detected in the summer of 2003 during a sonar survey of the southwest portion of the sea.
Divers have since been down to investigate, they write in the latest issue of the International Journal of Nautical Archaeology.

 "Close inspection by scuba diving revealed that the structure is made of basalt boulders up to 1 m (3.2 feet) long with no apparent construction pattern," the scientists stressed.
"The boulders have natural faces with no signs of cutting or chiseling. Similarly, we did not find any sign of arrangement or walls that delineate this structure."
 
They say it is definitely human-made and probably was built on land, only later to be covered by the Sea of Galilee as the water level rose.
"The shape and composition of the submerged structure does not resemble any natural feature. We therefore conclude that it is man-made and might be termed a cairn," the researchers write.
The circular structure was first detected in a sonar survey of part of the sea in the summer of 2003.
CREDIT: Shmuel Marco

More than 4,000 years old? 

Underwater archaeological excavation is needed so scientists can find associated artifacts and determine the structure's date and purpose, the researchers said.
Researcher Yitzhak Paz, of the Israel Antiquities Authority and Ben-Gurion University, believes it could date back more than 4,000 years.
"The more logical possibility is that it belongs to the third millennium B.C., because there are other megalithic phenomena [from that time] that are found close by," Paz told LiveScience in an interview, noting that those sites are associated with fortified settlements.
The researchers list several examples of megalithic structures found close to the Sea of Galilee that are more than 4,000 years old.
One example is the monumental site of Khirbet Beteiha, located some 19 miles (30 kilometers) north-east of the submerged stone structure, the researchers write.
It "comprises three concentric stone circles, the largest of which is 56 m [184 feet] in diameter."

An ancient city

If the third-millennium B.C. date idea proves correct it would put the structure about a mile to the north of a city that researchers call "Bet Yerah" or "Khirbet Kerak."

 Putting all the data together researchers found that the structure is cone shaped, about 230 feet (70 meters) in diameter and nearly 32 feet (10 meters) tall.
It weighs an estimated 60,000 tons.
CREDIT: Diagram courtesy Shmuel Marco

During the third millennium B.C. the city was one of the biggest sites in the region, Paz said.
"It's the most powerful and fortified town in this region and, as a matter of fact, in the whole of Israel."
Archaeologist Raphael Greenberg describes it in a chapter of the book "Daily Life, Materiality, and Complexity in Early Urban Communities of the Southern Levant" (Eisenbrauns, 2011) as being a heavily fortified 74-acre (30 hectares) site with up to 5,000 inhabitants.
With paved streets and towering defenses, its people were clearly well organized.
"They also indicate the existence of some kind of municipal authority able to maintain public structures ..." Greenberg writes.
The research team says that, like the leaders of Bet Yerah, whoever built the newly discovered Sea of Galilee structure needed sophisticated organization and planning skills to construct it.
The "effort invested in such an enterprise is indicative of a complex, well-organized society, with planning skills and economic ability," they write in their journal paper.
Paz added that "in order to build such a structure a lot of working hours were required" in an organized community effort.

Future exploration

Paz said he hopes that an underwater archaeological expedition will soon set out to excavate the structure.
They can search for artifacts and try to determine its date with certainty.
He said that the Israel Antiquities Authority has a research branch capable of excavating it.
"We will try to do it in the near future, I hope, but it depends on a lot of factors."

Tuesday, April 9, 2013

Coast Survey unveils easier access to wreck information

AWOIS on Google Maps

From NOAA

Maintaining documentation for features depicted on nautical charts is more complicated than you probably imagine.
For instance, Coast Survey maintains information on more than 10,000 submerged wrecks and obstructions in U.S. coastal waters – and it just got easier for the public to access that free information.

Coast Survey uses our Automated Wreck and Obstruction Information System (AWOIS) to help plan hydrographic survey operations and to catalog the many reported wrecks and obstructions considered navigational hazards within U.S. coastal waters.
The public also has access to this rich information source.
Marine archaeologists and historians, fishermen, divers, salvage operators, and others in the marine community find AWOIS valuable as an historical record of selected wrecks and obstructions.

Information contained in the database includes latitude and longitude of each feature, along with brief historic and descriptive details.
Until recently, that information was available for download in Microsoft Access MDB and Adobe PDF format.
However, these formats were difficult to search.

As of today, AWOIS information will no longer be available in MDB or PDF format.
Instead, users can download AWOIS files in the more useful Google Earth Keyhole Markup Language (KML) format.
KML is an XML grammar and file format for modeling and storing geographic features such as points, lines, images, polygons, and models for display in Google Earth, Google Maps, and other applications. (KML is an international standard, maintained by Open Geospatial Consortium, Inc.)

Example of the information that will be displayed by clicking on an AWOIS item

Once you download an AWOIS file, you can open that file directly in a mapping application, such as Google Earth or Google Maps.
You can then navigate directly to your area of interest and obtain information about individual features.
Clicking on any AWOIS item will bring up additional information, such as feature type, position, and history.
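Once a KML file has been downloaded, its placemarks can also be read programmatically. The short sketch below uses only Python's standard library; the file name is an example, and the exact tags and fields in NOAA's AWOIS export may differ.

# List placemark names and coordinates from a downloaded AWOIS KML file.
# The file name is an example; field names in the real export may differ.
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def list_placemarks(path):
    root = ET.parse(path).getroot()
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", default="(unnamed)", namespaces=KML_NS)
        coords = pm.findtext(".//kml:coordinates", default="", namespaces=KML_NS)
        print(name, coords.strip())

list_placemarks("AWOIS_Wrecks.kml")  # hypothetical file name for the downloaded export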


AWOIS file opened in Google Earth

In wake of Sandy, NOAA alters hurricane warning policy

NOAA's GOES-13 satellite captured this visible image of the massive Hurricane Sandy on Oct. 28 at 1615 UTC (12:15 p.m. EDT).
The line of clouds from the Gulf of Mexico north are associated with the cold front that Sandy is merging with.
Sandy's western cloud edge was already over the Mid-Atlantic and northeastern U.S.
Credit: NASA GOES Project

From ClimateCentral 

Ahead of the 2013 Atlantic Hurricane Season and in the wake of Hurricane Sandy, the National Weather Service announced Thursday that it is changing its policy on the issuance of tropical storm and hurricane watches and warnings.
The changes will give forecasters more flexibility in issuing hurricane warnings, and streamline the authority for issuing such warnings.

Beginning on June 1, the agency will be permitted to leave watches and warnings in effect even if a hurricane transitions into a post-tropical cyclone, which technically speaking is a different type of storm than a storm of purely tropical origins, provided that the storm still poses “a significant threat to life and property.”

The agency’s procedures came under criticism after forecasters decided not to issue hurricane warnings as Hurricane Sandy approached the Mid-Atlantic coast, since the storm was transitioning from a hurricane into a post-tropical storm.
Based on policies in place at the time, forecasters would have had to cancel hurricane warnings once the storm completed that transition, and they feared that would confuse the public and the emergency management community.

However, critics argue that by not issuing hurricane warnings at all, and instead posting a plethora of high wind, coastal flooding, and other watches and warnings, the Weather Service may have downplayed the risks from Sandy, and caused some people to remain in their homes that were then subject to the storm’s record storm surge along the New Jersey, New York, and Connecticut coastlines.
According to a recent report from the reinsurance company Swiss Re, Sandy caused an estimated $70 billion in total damage and $35 billion in insured losses.

The lack of hurricane warnings during Sandy also has had implications for the insurance industry, since most hurricane-insurance policies contain deductibles that only kick in if a hurricane warning is in effect at the time that the home is damaged.
This means that insurance companies may be on the hook for a greater percentage of losses than if a warning had been issued.

In addition to allowing tropical storm and hurricane watches and warnings to remain in effect after a hurricane transitions into a different type of storm, the new policy would keep the National Hurricane Center in Miami in charge of all forecast products until the storm no longer poses a significant threat. During Hurricane Sandy, responsibility was passed from the Hurricane Center to local NWS offices, which resulted in inconsistent communications and warnings being issued from one state to another.

“Our forecasters now have more flexibility to effectively communicate the threat posed by transitioning tropical systems,” NWS director Louis W. Uccellini said.
“Sandy’s forecast was remarkably accurate and under a similar situation in the future, forecasters will be able to choose the best option to underscore the urgency involved.”

According to an NWS statement, the policy change, which was first proposed during an annual NOAA hurricane conference in November, is supported by preliminary findings from NOAA’s service assessment on Sandy, which is slated to be released in May.
The NWS released an example of a statement that would be issued for a hypothetical hurricane named "Mandy," which states: "Mandy loses tropical characteristics but hurricane warnings remain in effect. Mandy expected to bring life-threatening storm surge and hurricane-force winds to the coast this afternoon."

Links :
  • LiveScience : Perfect Storm: Climate Change and Hurricanes
  • Accuweather : No hurricane warning for what could be the most expensive storm in history

Monday, April 8, 2013

Canada CHS update in the Marine GeoGarage


23 charts have been updated (March 29, 2013) :
    • 1312 LAC SAINT-PIERRE
    • 1313 BATISCAN TO LAC SAINT-PIERRE
    • 1430 LAC SAINT-LOUIS
    • 2100 LAKE ERIE
    • 2121 LONG POINT TO PORT GLASGOW
    • 2181 HARBOURS IN LAKE ERIE
    • 2202A PORT SEVERN TO TOMAHAWK ISLAND
    • 2202B TOMAHAWK ISLAND TO TWELVE MILE BAY
    • 2202C TWELVE MILE BAY TO ROSE ISLAND
    • 2202D SOUTH CHANNEL AMANDA ISLAND TO PARRY SOUND
    • 2202E MOON ISLAND AND SURROUNDING AREAS
    • 3424 APPROACHES TO OAK BAY
    • 3440 RACE ROCKS TO D'ARCY ISLAND
    • 3462 JUAN DE FUCA STRAIT TO STRAIT OF GEORGIA
    • 4012 YARMOUTH TO HALIFAX
    • 4013 HALIFAX TO SYDNEY
    • 4115 PASSAMAQUODDY BAY AND ST CROIX RIVER
    • 4116 APPROACHES TO SAINT JOHN
    • 4237 APPROACHES TO HALIFAX HARBOUR
    • 4320 EGG ISLAND TO WEST IRONBOUND ISLAND
    • 4375 GUYON ISLAND TO FLINT ISLAND
    • 4462 ST. GEORGE'S BAY
    • 4850 CAPE ST FRANCIS TO BACCALIEU ISLAND AND HEART'S CONTENT
So 688 charts (1658 including sub-charts) are available in the Canada CHS layer. (see coverage)

Note : don't forget to visit 'Notices to Mariners' published monthly and available from the Canadian Coast Guard either online or through a free hardcopy subscription service.
This essential publication provides the latest information on changes to the aids to navigation system, as well as updates from CHS regarding CHS charts and publications.
See also written Notices to Shipping and Navarea warnings : NOTSHIP

Each year, a land area larger than Manhattan disappears off Louisiana's coast

>>> geolocalization with the Marine GeoGarage <<<

From NOAA

Every year, 25-35 square miles of land off the coast of Louisiana—an area larger than Manhattan—disappears into the water due to a combination of subsidence (soil settling) and global sea level rise.
The maps above show how much land has been lost to the Gulf of Mexico in the past 80 years.

The first image shows the state of the coast in 2011.

Based on a NASA satellite image, gray and white areas show land and blue indicates open water. New land—mainly coastal improvements such as shoreline revetments and enriched beach areas—that built up since 1932 is shown in green.

How much of what is now open water was once land?

The second image shows the state of the coast in 1932.

The image combines the 2011 satellite image with a U.S. Geological Survey map in which land areas that were present in 1932 are light gray.
Since the 1930s, Louisiana's coast has lost 1,900 square miles of land, primarily marshes.
The two maps reveal the dramatic coastal change.

In Southeast Louisiana, relative sea level is rising at a rate of three feet every one hundred years, according to sixty years of tidal gauge records.
Relative sea level refers to the change in sea level compared to the elevation of the land, which can be due to a combination of global sea level rise and subsidence—the settling and sinking of soil over time.
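For scale, three feet per hundred years is roughly 9 mm per year. The split between the global and the subsidence component in the sketch below is illustrative only; the article does not give one.

# Three feet per century expressed in mm/yr, with an illustrative split into
# global sea level rise plus local subsidence (the split is not from the article).
FEET_TO_MM = 304.8

relative_rise_mm_yr = 3 * FEET_TO_MM / 100        # about 9.1 mm/yr
assumed_global_mm_yr = 2.0                        # illustrative global component
implied_subsidence_mm_yr = relative_rise_mm_yr - assumed_global_mm_yr

print(f"relative rise   ~ {relative_rise_mm_yr:.1f} mm/yr")
print(f"implied sinking ~ {implied_subsidence_mm_yr:.1f} mm/yr (with {assumed_global_mm_yr} mm/yr global rise)")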

Storm surge—the water from the ocean that is pushed toward the shore by the force of storm winds—takes advantage of the problems caused by subsidence and global sea level rise.
Because much of the Louisiana coast is very low in elevation and gradually converting to open water, entire neighborhoods, roads, and other structures are vulnerable to even small storm events.

At the very tip of the coast lies Port Fourchon—one of the country's major ports serving the deepwater oil and gas industry in the Gulf of Mexico.
Next to it is Grand Isle, the last inhabited barrier island in Louisiana.
Various beach restoration projects over the years have helped build up and maintain Grand Isle and Louisiana's other barrier islands.
They are the first line of defense against storms headed toward the mainland and New Orleans.

Louisiana Highway 1 (LA-1) is the only road leading to Port Fourchon and Grand Isle.
While the port sits on a five-foot ridge, much of the LA-1 highway is built on land only two feet in elevation.
The highway is growing increasingly vulnerable to sea level rise, subsidence, and storm surge every year.
One section of the road is so low that even small storm events cause flooding that makes it impassable.
Disruptions to the infrastructure surrounding the port have the potential to impact every American at the gas pump.

Sea level rise in Louisiana is a challenge today, not just one for the future, and a wide group of people and organizations are helping develop solutions.
Want to know more?
Our recent feature story, "Thriving on a Sinking Landscape," provides an in-depth look at what is at stake for locals and the rest of the country if the LA-1, 'America's longest Main Street,' fails to stay above water.

Sunday, April 7, 2013

Tidal flats and channels on Long Island, Bahamas

Tidal flats and channels on Long Island, Bahamas are featured in this image
photographed by an Expedition 26 crew member on the International Space Station
(NASA, 27 November 2010)

The islands of the Bahamas, in the western Atlantic Ocean, are situated on large depositional platforms (the Great and Little Bahama Banks) composed mainly of carbonate sediments ringed by fringing reefs – the islands themselves are only the parts of the platform currently exposed above sea level.
The sediments are formed mostly from the skeletal remains of organisms settling to the sea floor; over geologic time, these sediments will consolidate to form carbonate sedimentary rocks such as limestone.

>>> geolocalization with the Marine GeoGarage <<<

This detailed photograph provides a view of tidal flats and tidal channels near Sandy Cay on the western side of Long Island, located along the eastern margin of the Great Bahama Bank.
The continually exposed parts of the island have a brown coloration in the image, a result of soil formation and vegetation growth (left).

To the north of Sandy Cay, an off-white tidal flat composed of carbonate sediments is visible; light blue-green regions indicate shallow water on the tidal flat.
Tidal flow of seawater is concentrated through gaps in the anchored land surface, leading to the formation of relatively deep tidal channels that cut into the sediments of the tidal flat.
The channels, and areas to the south of the island, have a vivid blue coloration that provides a clear indication of deeper water (center).