Tuesday, February 16, 2016

Greenville Collins’ Coasting Pilot (1693)



Before the late 1700s, sailors couldn't accurately fix their position at sea and had to sail close to land, increasing the risk of shipwreck.
So when Greenville Collins' charts of Britain's coastline were published in 1693, hundreds of lives and ships were saved.
Explorer Nicholas Crane navigates Cornish waters in a square-rigger of the period to reveal the extent of Collins's achievement.


Perhaps the best-known surveyor of Great Britain’s coastline, Greenville Collins was an English captain serving in the Royal Navy.
In 1683, King Charles II appointed Collins to the role of Hydrographer to the King and placed him in command of the Royal yacht Merlin.
Collins was commissioned to survey, draw and publish the most complete charts of the British coastline.
Over the eight or so years between 1681 and 1689, Collins commanded the Merlin, the vessel from which he proceeded to chart the waters of Britain.
These working charts were released and put into practice aboard ship very quickly after publication.
The collection was eventually finished, collated and published in large folio by 1693.
They were released for publication by Freeman Collins (possibly a relation) as Great Britain's Coasting Pilot and were initially sold by Richard Mount of London.


True to map making of the time, a lot of the basic detail was procured and re-edited from earlier Dutch charts.
Collins, being a working naval officer, ensured the charts were of practical purpose: durable and accurate as required.
Though fairly scant compared to modern navigational charts, these initial workable sheets were an extraordinary achievement for the time.


The remarkable archive from this point in naval history confirms that the Collins charts were frequently cited as the best charting available for many years after their publication.
The Merlin is cited as the first British warship dedicated to marine survey work as opposed to exploration.
Somewhat ironically, it was also known to have been involved in the 1671 incident that helped spark the Third Anglo-Dutch War.

Links :

Admiralty e-Nautical publications - NMs in seconds

e-Nautical Publications (e-NPs) are electronic versions of official ADMIRALTY Nautical publications.
Easy to use and update, they bring improved efficiency, accuracy and access to information bridge crews need. 
Find out how easy it is to add NMs with e-NPs by watching the above short video

 From UKHO


Offering:
  • Weekly Notices to Mariners added accurately in seconds to ensure ongoing safety and compliance.
  • Simple search functionality for easier access to information the mariner needs.
  • Approved for use by the Flag States of over three quarters of ships trading internationally, with clear display of NM updates to aid inspections.
  • 86 official ADMIRALTY Nautical Publications available in an electronic format. The range includes Sailing Directions, the Mariner’s Handbook and many more.
Why e-Nautical Publications

e-NPs are designed to meet SOLAS carriage requirements, contain the same information as their paper equivalents and are approved for use by the Flag States of over three quarters of ships trading internationally.
Unlike their paper counterparts, each e-NP allows bridge officers to take advantage of accurate electronic updating and quick information access through simple search functionality.
Additionally, the new e-Reader snapshot function allows crews to view, save and print e-NP pages and any applicable NMs and addendums, which can be used to support passage planning.

Fast and accurate NM updates

e-NPs are updatable, electronic versions of official ADMIRALTY Nautical Publications such as Sailing Directions and the Nautical Almanac.
This means that bridge crews can download and apply electronic weekly Notices to Mariners (NMs) updates to publications in just a few seconds every week, freeing their time to focus on other important duties.
This functionality can also help to improve the accuracy of passage plans; giving decision makers more confidence on the bridge.

Easier access to important information

Simple search functionality gives users quick access to important planning information.
The e-Reader snapshot functionality also helps bridge crews to view, save and print e-NP pages and any applicable NMs and addendums, which can be used to support passage planning.​

Carriage compliance
e-NPs are designed to meet SOLAS carriage requirements, contain the same official information as their paper equivalents, and have been approved by the Flag States of over three quarters of ships trading internationally.
e-NPs can also aid inspections by clearly showing when a publication was last updated with weekly NMs.​

A growing list of e-Nautical Publications
86 of the world's leading Nautical Publications will be available in an e-NP format in February 2015. 

They include:

  • Mariner’s Handbook (NP100)
  • Ocean Passages for the World (NP136)*
  • Symbols and Abbreviations used on ADMIRALTY Charts (NP5011)*
  • IALA Maritime Buoyage System (NP735)*
  • Annual Summary of ADMIRALTY Notices to Mariners*
  • Cumulative List of ADMIRALTY Notices to Mariners*
  • Sailing Directions
  • Nautical Almanac (NP314)
  • Guide to the Practical Use of ENCs (NP231)*
  • Guide to ENC Symbols used in ECDIS (NP5012)*
  • Guide to ECDIS Implementation, Policy and Procedures (NP232)*
  • How to Keep Your ADMIRALTY Products Up-to-Date (NP294)*


(*available from February)


Monday, February 15, 2016

These Terabit satellites will bring Internet to the remotest places on Earth

ViaSat offers today's fastest service over land and water for business jets and VIP aircraft.
Today’s service delivers an unmatched internet experience to hundreds of aircraft, and our Ka-band service, available to commercial airline passengers since 2013 and for business aviation since 2015, is winning awards.
ViaSat keeps everyone on board productive and entertained like no one else.

From FastCompany by Michael Grothaus

The three new ViaSat-3s will deliver twice the combined network capacity of all the connected satellites in space.

The U.S.-based satellite company ViaSat has announced that it has teamed up with aerospace giant Boeing to create three new satellites that will bring high-speed Internet to the remotest parts of the world.

ViaSat said it would be spending about $1.4 billion over five years to provide inexpensive bandwidth to terrestrial consumers, business and commercial aviation passengers and government mobile platforms.
Also in its cross-hairs, ViaSat said, are maritime and offshore-energy markets, which are now paying far too much for their broadband connectivity.

The three ViaSat-3 satellites will join the roughly 400 other connected satellites already in space.
However, the ViaSat-3s will deliver twice the network capacity of the other 400—combined.
The satellites will be capable of 1 terabit speeds each (that’s 1,000 gigabits per second).
That amount of bandwidth will be able to provide fast enough Internet to reliably deliver bandwidth-hogging 4k video to isolated areas—and in the sky.

These three new satellites, named ViaSat-3, will carry a total network capacity of a whopping 1 terabit per second of internet bandwidth to remote regions, triple the capacity of ViaSat-2.

The satellites will offer residential service of 100 megabits or more per second in areas that are so rural or remote they don’t have the infrastructure to support hardwired Internet services.
The company says this will enable billions more people who don’t have access to the web today to get online.
The ViaSat-3 satellites will also deliver in-flight Internet access operating at hundreds of megabits per second to commercial airlines, business jets, and high-value government aircraft.
Additionally, the new satellites will deliver Internet at speeds up to 1 gigabit per second to maritime operations, including freighter ships, and oil and gas platforms.

"The innovations in the ViaSat-3 system do what until now has been impossible in the telecommunications industry—combining enormous network capacity with global coverage, and dynamic flexibility to allocate resources according to geographic demand," Mark Dankberg, chairman and CEO of ViaSat, said in a statement.
"While there are multiple companies and consortia with ambitions to connect the world with telecom, satellite and space technologies, the key technologies underlying ViaSat-3 are in hand today, enabling us to move forward in building the first broadband platform to bring high-speed Internet connectivity, including video streaming, to all."

ViaSat isn’t alone in the race to supply high speed Internet from above.
Companies from Google to Facebook have looked into satellite Internet technology, but both have abandoned those plans in favor of other methods.
SpaceX and Virgin Galactic are in various stages of development and deployment when it comes to satellite Internet services.
And ViaSat itself already has one satellite in the sky capable of delivering 100Mbps Internet to users in the U.S.
It will also launch a ViaSat-2 satellite, capable of delivering speeds of up to 300Mbps, on a SpaceX Falcon 9 rocket in the next few months.

As for the ViaSat-3 satellites, the first two will be completed and delivered into space by Boeing Satellite Systems in 2019 and will provide service for users in the Americas and in Europe, the Middle East, and Africa (EMEA).
The third satellite will go up sometime after 2019 and provide service to users in Asia.

Links :

Sunday, February 14, 2016

Big surf in Nazare : a closer look

 
A giant swell hits the Portuguese coast, producing waves of indescribable strength and size!
Surfers such as Garrett McNamara, Andrew Cotton, Hugo Vau, Eric Rebiere, Carlos Burle, Maya Gabeira, Pedro Scooby, Felipe Cesarano, Nitzan Benhaim and Sylvio Mancusi went into the sea at Nazaré willing to write a new chapter in the history of surfing.


From the near drowning of Maya to possible records for the largest wave surfed, fortune favored all the surfers who risked riding a wave on this day!
Nazaré once again shows the world its size and power ...
A Closer Look is an introspective and clear vision, which seeks to show in a neat way the waves of Nazaré and their sound...


Nazaré is the world’s stage for the biggest waves ever ridden. Most of the memorable rides so far came on big days with favourable, or at least sufficient, conditions for surfers and safety teams to ride and operate... when conditions go beyond that point, we call it Black Naza.
During the session of February 9th, 2016, things were far from ideal in Praia do Norte: a big storm, a big swell with a high period, but the wind was too strong and onshore; most of the waves were too bumpy, the big ones were closing out, and the inside looked like nothing less than a war zone.
Despite those conditions, Australian hardcore surfers Mick Corbett and Jarryd Foster decided to have a go anyway and test the waters. These guys made a name for themselves charging “The Right” in Western Australia, one of the most dangerous slabs on the planet, and it’s such a thrill to watch them here testing the limits even on days like this, on a literally unridable Nazaré.

 From IHPT, Instituto Hidrográfico da Marinha de Portugal :
a scientific perspective of the Nazaré wave

Links :

Valentine's day : earth imagery, map and nature shaped hearts

 Heart cloud.
The first day of Hurricane Season 2011 brings a fast moving surface low tracking west-southwestward near 20 mph.
This image was taken by GOES-East at 1315Z on June 1, 2011.

 Bonne projection
World map by Bernard Sylvanus, 1511

Werner projection
 Heart-shaped map projections are known as cordiform map projections.
(another example: the Bottomley projection)

Heart wave

Saturday, February 13, 2016

M.V. Barzan time-lapse


This is the story of the world’s greenest ultra-large container vessel M.V. Barzan.

Friday, February 12, 2016

Russia submits revised claims for extending Arctic shelf to UN

© Valeriy Melnikov / Sputnik

From RT

Russia has submitted a revised application to the United Nations to extend its share of the Arctic continental shelf.
Though the 2001 application was rejected, Moscow is now confident that it has provided enough data to back up its claims.
“Russia has presented its application to extend its territory on the Arctic continental shelf and it was registered. The work we have done to gather material is extensive,” said the Russian Minister for Natural Resources and the Environment Sergey Donskoy.
He added that the documents had been well received by the UN commission, which will now look at Russia’s application.

Moscow's application to extend its share of the Arctic continental shelf has been registered by the United Nations.
The claim was made at a session of the UN Commission on the Limits of the Continental Shelf.

Moscow’s first bid in 2001 was rejected due to a lack of scientific proof.
Russia has been aiming to prove that its claims are in fact correct ever since.
"From 2002 to 2014, nine geological and geophysical expeditions took place in the central part of the Arctic Basin, using atomic-powered icebreakers, as well as research submarines," Donskoy said.
The UN commission could start studying the documentation provided by Russia later this month, Donskoy noted, adding it could be between two and four years before a decision is made.
The minister also added that Canada, Denmark and the US, who also have borders with the Arctic region, did not oppose the UN consideration of Russia’s proposals.

Territorial claims on the Arctic shelf

Moscow is currently abiding by the 1982 UN Convention on the Law of the Sea.
The convention says the Arctic countries are entitled to a 200-nautical mile economic zone over the continental shelf abutting their shores.  

Russia currently lays claim to areas outside the established 200-mile economic zone.
The claimed area covers the geomorphological shelf of the Russian Arctic marginal seas, part of the Eurasian Basin (the Nansen and Amundsen basins and the Gakkel Ridge) and the central part of the Amerasian Basin, including the Makarov Basin and the complex of Central Arctic submarine elevations.

Moscow’s revised bid covers an underwater space area of about 1.2 million square kilometers and goes beyond 350 nautical miles from the shore.
It includes claims for the Lomonosov Ridge, the Mendeleev-Alpha Rise and the Chukchi Plateau.


“In particular, the revised application includes the areas of the southern end of the Gakkel Ridge and the Podvodnikov Basin,” the ministry said in a statement posted on Facebook.
The Podvodnikov Basin is another name given to the Wrangel Abyssal Plain, as named by the International Bathymetric Chart of the Arctic Ocean (IBCAO).


Russia is trying to prove that these ranges are actually a continuation of the Russian continental shelf. If the UN rules in its favor, Moscow will be entitled to the exclusive rights to develop vast resources, the volume of which, according to the Ministry of Natural Resources, may reach five billion tons of untapped oil and natural gas reserves – worth as much as $30 trillion.
“All collected data confirm the continental nature of the Lomonosov ridge, Mendeleev-Alpha Rise, Chukchi Plateau, as well as a continuous extension of these elements from the shallow shelf of Eurasia,” the statement said on Tuesday.

Russia Arctic control to expand?
Canada, Denmark, Norway also submit claims to territory

On Tuesday, Donskoy also raised the question of the maritime delimitations in the Arctic Ocean, which are still pending.
In particular, several disputed areas in parts of Amundsen Basin, the Lomonosov Ridge, Makarov-Podvodnikov Basins and Mendeleev Rise are also claimed by Denmark as an extension of Greenland. 
In December, Denmark filed a claim for the territory of approximately 895,541 square kilometers of the Arctic seabed – an area 20 times larger than Denmark itself – after 12 years of research and over $55 million worth of investments.
In addition, there are unresolved issues of maritime delimitation between Russia and Canada in the areas of Makarov Basin and Mendeleev Rise.
According to international law, if the UN does not issue recommendations on the maritime demarcation, the countries themselves could delimit the disputed territories bilaterally.

However, Russia could face problems further down the line even if the UN eventually rules in its favor, due to current geo-politics, says Conn Hallinan, a columnist for Foreign Policy in Focus.
“I think you have to see this as part of a wider global issue. The fact is that the Russians are being demonized right now. I think you have to see this Arctic push back by the United States as part of that whole process, which is to try and isolate Russia internationally because Russia represents an independent force in the world and the United States does not like that,” Hallinan told RT.

Russia’s military aspirations in the Arctic

Since 2012, Russia has been undergoing a process of re-establishing military bases in its Arctic regions, which also includes introducing mobile nuclear power plants.
Defense Minister Sergey Shoigu said that Russian troops will be deployed in the Arctic by 2018, equipped with all the necessary high-tech weaponry in a number of bases across the polar region.
Washington is closely watching Moscow’s activity in the Arctic region, according to US Secretary of State John Kerry.

Russia deploys S-400 missile defense systems in Arctic
source : RT

“Economic riches tend to attract military interest as nations seek to ensure their own rights are protected. And we know, because we track it, that these countries – like Russia, China, and others – are active in the Arctic,” Kerry said.

Meanwhile, the director of the US Defense Intelligence Agency said that “containing Russia’s presence in the Arctic is crucial to the interests of the US.”
Lieutenant General Vincent Stewart was testifying on Tuesday before the Senate Armed Services Committee, providing an annual assessment of top global threats.
“The Russians intend to increase their ability to control the Arctic regions. They have built airbases, they are building missile defense capability – both costal and naval missile defense capability and they are doing that for economic and military reasons,” Stewart said.
“In the absence of something that counters that, they will continue to expand, so I think it is imperative that we have the willingness and the ability to be able to push back their control or dominance of the Arctic region.”

Links :

Thursday, February 11, 2016

Arctic shipping passage 'still decades away'

Russia, Europe and the shipping industry have been waiting
for a major ice-free shipping lane to open through the Arctic.
Photograph: Sergio Pitamitz/Robert Harding World Imagery/Corbis

From The Guardian by John Vidal

Ordinary merchant ships will not be able to take an ice-free shortcut from China to Europe until at least 2040, report predicts

It will be decades before big cargo ships link China and northern Europe by taking a shortcut through the Arctic Ocean, a report predicts.
Climate change, retreating summer ice and the prospect of shorter journey times and 40% lower fuel costs have led Russia, European governments and some industries to expect a major ice-free shipping lane to open above Russia, allowing regular, year-round trade between the Atlantic and Pacific oceans within a few years.


But, says the Arctic Institute in a new paper, low bunker fuel prices, a short sailing season and continuing treacherous ice conditions in the Arctic even in summer months mean it could be 2040 at the earliest before it is commercially viable for ordinary merchant ships to pass through what is known as the northern sea route.
Until then it will remain cheaper to send trade between Europe and the east via the Suez canal, it says.
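The fuel-cost claim above can be sanity-checked from route lengths alone. The distances below are rough, commonly quoted figures for a Rotterdam–Shanghai voyage, assumed here purely for illustration (they are not taken from the Arctic Institute report):

```python
# Rough, commonly quoted distances for a Rotterdam-Shanghai voyage
# (illustrative assumptions, not figures from the Arctic Institute report).
SUEZ_KM = 19_500   # via the Suez canal
NSR_KM = 13_000    # via the northern sea route

saving = 1 - NSR_KM / SUEZ_KM
print(f"Distance saving: {saving:.0%}")  # roughly a third shorter

# At equal speed, fuel burn scales with distance, so the distance saving
# is a first-order proxy for the fuel saving -- before icebreaker fees,
# ice-class hulls and slower ice-zone speeds erode it, which is the
# report's core argument for why the Suez route stays cheaper for now.
```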


The conclusions of the report were backed this month by the powerful Danish Shipowners’ Association, which includes 40 major shipping companies such as Maersk, the world’s largest. Denmark has the eighth largest fleet in the world and would stand to gain the most in Europe if the northern sea route opened.
“We have gone from hyper-optimism to total realism. The world economy was developed on the basis of a high oil price. The northern sea route seemed viable [a few years ago] but now it’s not the case. The route has vast potential but it will take a long time to open up,” said Anne H. Steffensen, director of the association at a meeting of Arctic country ministers and industry in Tromsø.

Russia has tried to open up the Arctic to international traffic by offering icebreaker service and better port facilities.
But cargo in transit along the northern sea route dropped from 1.3m tonnes in 2013 to 300,000 tonnes in 2014.
Last year only 100,000 tonnes was transported between Asia and Europe on the route.
However, there was a big rise in the number of vessels going to and from Russian Arctic ports.


The Arctic Institute report, which compares the costs of building ice-reinforced ships suitable for the northern sea route with the existing costs of using the Suez canal, takes in fuel prices, wait times, lengths of journey, canal fees and different sea conditions.
It concludes that trade is unlikely to open up the northern route for decades.

It expects the Arctic sea ice to be too thick and treacherous for many years, requiring expensive ice breakers and strengthened hulls.
“The Arctic navigation season is currently too short and ice conditions are too unpredictable for liner shipping to be feasible. Arctic liner shipping will only become a viable alternative to the contemporary shipping lanes if global warming continues to melt the ice cover along the North-west passage and the Northern sea route.
“It is highly unlikely that large-scale containerised cargo transports will appear in the near future. The question then arises: when, if ever, will the ice conditions allow for continuous and economically feasible container transport along the route?”


The greatest potential for the use of ice-reinforced container ships was found in the scenario where global warming accelerates and the price of fuel is high.
But even in this scenario, the cost per container was about 10% higher than going via the Suez canal route.
Scientists have predicted that ordinary vessels will be able to travel easily along the northern sea route, and that moderately ice-strengthened ships should be able to pass over the pole itself, by 2050.
Russian authorities still see a bright future for shipping along the country’s northern shoreline, but not as a busy international shipping route.
“It is 100% sure that the northern sea route will be no alternative to the Suez Canal,” Russia’s deputy minister of transport, Viktor Olersky, told the Arctic Circle 2015 assembly.

Holland America Line, cruise liner for penguins (but in Antarctica)

Unusually high temperatures led to January having the lowest Arctic sea ice extent for the month in the satellite record.
The ice extent averaged 13.53 million square kilometres (5.2 million square miles), which is 1.04 million sq km (402,000 sq miles) below the 1981 to 2010 average, according to the US government’s National Snow and Ice Data Centre.

Links :

Wednesday, February 10, 2016

Measuring ocean heating is key to tracking global warming

Sun shining over the sea.
Photograph: Alamy Stock Photo

From The Guardian by John Abraham

Taking the Earth’s temperature is a challenge, but a critically important one if we are to better understand the nature of climate change

Human emissions of greenhouse gases such as carbon dioxide are causing the Earth to warm.
We know this, and we have known about the heat-trapping nature of these gases for over 100 years.
But scientists want to know how fast the Earth is warming and how much extra energy is being added to the climate because of human activities.

If you want to know about global warming and its future effects, you really need to answer these questions.
Whether this year was hotter than last year, or whether next year breaks a new record, is merely one symptom of a warming world.
Sure, we expect records to be broken, but they are not the most compelling evidence.

The most compelling evidence we have that global warming is happening is that we can measure how much extra heat comes in to the Earth’s climate system each year.
Think of it like a bank account.
Money comes in and money goes out each month.
At the end of the month, do you have more funds than at the beginning?
That is the global warming analogy.
Each year, do we have more or less energy in the system compared to the prior year?



The answer to this question is clear, unassailable and unequivocal: the Earth is warming because the energy is increasing.
We know this because the heat shows up in our measurements, mainly in the oceans.
Indeed the oceans take up more than 92% of the extra heat.
The rest goes into melting Arctic sea ice, land ice, and warming the land and atmosphere.
Accordingly, to measure global warming, we have to measure ocean warming.
Results for 2015 were recently published by Noaa and are available here.


A recent paper by Karina von Schuckmann and her colleagues appeared in Nature Climate Change, and provides an excellent summary of our knowledge of the energy balance of the Earth and recent advances that have been made.
The article describes the complexity of the situation.
The Earth is continuously gaining energy from greenhouse gases, but there are also natural fluctuations that cause both increases and decreases to the energy flows.

For instance, volcanic eruptions may temporarily reflect some solar energy back to space.
Natural variability like the El Niño/La Niña cycle can change heat flows and how deep the heat is buried in the ocean.
The energy from the sun isn’t constant either; it varies on an 11-year cycle, but by less than 0.01%.
With all of this and more happening, how do we know if an energy imbalance is natural or human caused?
How do we separate these effects?

The effort to separate human from natural effects is seen to be possible when one considers how the imbalance is measured in the first place.
There are multiple complementary ways to make these measurements.
Each technique has advantages and disadvantages and they have to be considered together.

One way is through satellites that orbit the Earth.
These satellites can measure the heat entering the atmosphere and the heat leaving the system.
The difference between them is the imbalance.
Currently, the longest operating satellite measurement for this is from Nasa and is named Ceres (Clouds and the Earth’s Radiant Energy System).
The difficulty is that the energy imbalance is only about 0.1% of the actual energy flows in and out, and while the changes can be tracked, their exact values are uncertain.
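To see the scale of the problem, compare the imbalance with the gross fluxes it is the difference of. The numbers below are typical published orders of magnitude, assumed here for illustration only (they are not CERES calibration figures):

```python
# Typical orders of magnitude, assumed for illustration
# (not actual CERES calibration figures).
absorbed = 340.0    # W/m^2, roughly the gross energy flux at top of atmosphere
imbalance = 0.6     # W/m^2, approximate net energy gain by the Earth

print(f"Imbalance is ~{imbalance / absorbed:.2%} of the gross flux")

# Even a 1% absolute error in measuring the gross fluxes is several
# times larger than the signal being sought:
error = 0.01 * absorbed
print(f"A 1% calibration error ({error:.1f} W/m^2) is "
      f"~{error / imbalance:.0f}x the imbalance itself")
```

This is why the satellites can track *changes* in the imbalance well while its absolute value remains uncertain.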

Another way to measure the imbalance is to actually take the ocean’s temperature.
Temperature tells us how much heat a system has. If the temperature is increasing, it means the energy within the system is increasing as well – the system is out of balance.
Not only do we have to measure ocean temperatures accurately, but we also need to measure them year after year with a consistency much better than a 0.1°C margin.
What really matters is how the temperature is changing over long periods of time.

While it may sound easy to measure the oceans, it is actually quite challenging.
The oceans are huge (and deep) and difficult to access.
The need is for enough measurement locations at enough depths and with enough precision to get an accurate temperature.

 Argo Program: Deep sea probes (drifters) drift with ocean currents at a depth of around 2000 meters, surface every ten days and send their data on temperature and salinity to satellites. Afterwards they sink down again into the "tranquil depths".
The probes are part of the international ARGO program and measure the upper 2000 meters of the Earth's oceans.
The data is used by scientists, fishers and the military, for example for research on climate change, prediction of the seasons and "ocean weather".

This visualization shows the locations of the ARGO buoy array over time.
When the buoys are above water, the lines are brighter; when the buoys are under water, the lines are fainter.
The ARGO buoys measure ocean salinity, column temperature, and current velocities.
This version of the visualization uses a slow camera move.
Credit: NASA/Goddard Space Flight Center Scientific Visualization Studio

In recent years, we have relied upon a system of automated ocean measurement devices called the Argo fleet.
These devices are scattered across the globe and they autonomously rise and sink (down to 2,000 meters) and record temperatures and salinity during their travels.
Because of the Argo fleet, we know a lot more about our oceans, and this new knowledge helps us ask better questions.
But the fleet could be made even better.
They do not measure the bottom half of the ocean (below 2,000m depth) and they do not fully cover regions near or under ice or near shores.

 Argo buoy

Furthermore, a 10-year trend is much too short to make long-term climate conclusions.
We have to stitch Argo temperatures to other instruments, which have been measuring the oceans for decades.
That stitching process has to be done carefully so that a false cooling or warming trend is not introduced.

Another way is through ocean levels.
As the oceans warm, the water expands and sea levels rise.
So, just by measuring the changing water levels, it is possible to assess how much heat the oceans are absorbing.
The drawback to this method is that oceans are also rising because ice around the world is melting, particularly in Greenland and Antarctica.
As this melted ice water flows into the oceans, it too causes sea levels to rise.
So, it’s important to separate how much of ocean level rise is from heat-expansion and how much is from ice melting.
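That separation amounts to a simple budget: the total rise seen by altimetry equals the mass added (ice melt and land water) plus thermal expansion, so the thermal part can be inferred as a residual. The rates below are illustrative values of the right order of magnitude, not official altimetry or gravimetry numbers:

```python
# Sea-level budget closure with illustrative rates in mm/yr
# (right order of magnitude only, not official altimetry/GRACE values).
total_rise = 3.3   # satellite altimetry: total observed sea-level rise
mass_gain = 2.0    # ocean-mass component: ice melt plus land-water inflow

# Whatever is left over is attributed to thermal (steric) expansion,
# and hence to heat going into the ocean.
steric = total_rise - mass_gain
print(f"Thermosteric component: ~{steric:.1f} mm/yr")
```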

And another way is through the use of climate models, which are computer simulations of the environment.
Very powerful computers are used to calculate the state of the climate at millions of locations across the globe, in both the oceans and in the atmosphere.
The calculations use basic physics and thermodynamics equations to track the thermal energy at each of the locations.

 When CO2 rises, wet and wild planets may lose their oceans to space. (see Nature)
(demabg/iStock)

So, there are many ways to measure the Earth’s energy imbalance.
While all methods are telling us the Earth has a fever, they differ in details and better synthesis of all the information is essential to improve the knowledge of what Earth’s energy imbalance is.
Right now, the Earth is gaining perhaps as much as 1 Watt of heat (a Joule per second) for every square meter of surface area.
Considering how large the Earth is, this is an incredible amount of heat being gained day and night year after year.
This is over 1 zettajoule (a sextillion joules) per year.
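The arithmetic behind that figure is just the per-square-meter rate multiplied by the Earth's surface area and the seconds in a year; at the upper-end 1 W/m² quoted above the total is on the order of ten zettajoules, and even a few tenths of a watt per square meter clears the one-zettajoule mark:

```python
import math

# Heat gained in a year at a given top-of-atmosphere imbalance.
EARTH_RADIUS_M = 6.371e6              # mean Earth radius
SECONDS_PER_YEAR = 365.25 * 86400

area = 4 * math.pi * EARTH_RADIUS_M ** 2   # ~5.1e14 m^2 of surface

def annual_heat_zj(imbalance_w_per_m2):
    """Joules gained per year, expressed in zettajoules (1e21 J)."""
    return imbalance_w_per_m2 * area * SECONDS_PER_YEAR / 1e21

print(f"At 1.0 W/m^2: ~{annual_heat_zj(1.0):.0f} ZJ/yr")
print(f"At 0.1 W/m^2: ~{annual_heat_zj(0.1):.1f} ZJ/yr")
```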

What I like about this new paper is the recommendations for the future.
Perhaps the most important recommendation is that we need to continue to make accurate measurements of the Earth’s temperatures, especially in the oceans.
We need to extend those temperature measurements to deeper locations (below 2,000 m) and make measurements near shores, in the polar regions, underneath ice, etc.
This will require a sustained funding of our measurement systems and a long-term view of the Earth’s changing climate.

Fully understanding where the excess heat is going in the Earth system is a first step to making good predictions as to what its consequences are for the future climate and the oceans.
It is an essential activity to enable planning for the future. Dr von Schuckmann summarized her work nicely when she told me, 
"Advancing our capability to monitor the Earth’s Energy Imbalance means increasing our knowledge on the status of global climate change - and the global ocean plays a crucial role. A concerted multi-disciplinary and international effort is needed to improve our ability to monitor this fundamental metric defining global warming."

Links :

    Tuesday, February 9, 2016

    The truth about politics and cartography: mapping claims to the Arctic seabed

    New Arctic map, with August 2015 Russian claims shown in pale yellow.

    From The Conversation

    While maps can certainly enlighten and educate, they can just as easily be used to support certain political narratives.
    With this in mind, Durham University’s Centre for Borders Research (IBRU) has updated its map showing territorial claims to the Arctic seabed following a revised bid submitted by Russia to the United Nations on August 4.
    The decision to release the map was not made lightly.
    The map of “Maritime jurisdiction and boundaries in the Arctic region” by IBRU depicts the claims to Arctic seabed resources that have been made, or could potentially be made, by Canada, Denmark, Russia, Norway, and the USA.
    IBRU has also created a simplified map showing the old and new Russian claims from 2001 and 2015 – and the differences between them.

    The myth of a “Cold War”

    We created our first Arctic map in 2008 to dispel reports that the region was about to erupt in a “new Cold War”.
    As the map’s notes explain, nothing could be further from the truth.
    Since 2001, Arctic states have been engaging in scientific research – often in cooperation with each other – to gather the data that would enable them to make submissions to the Commission on the Limits of the Continental Shelf (CLCS).

     IBRU map comparing the 2001 and 2015 Russian claim areas.
    Areas in green are in the 2015 claim only. Areas in red are in the 2001 claim only.
    Areas in pale yellow are in both claims. 
    Author provided


    The CLCS is empowered by the UN to assess whether areas of the seabed meet a complicated series of bathymetric and geological criteria which can permit coastal states to claim exclusive rights to the non-living resources of the seabed, beyond 200 nautical miles from coastal baselines.
    The original Arctic map denoted the maximum claims that could be made given the scientific data that was then publicly available.
    The map’s accompanying notes clearly stated, however, that these were hypothetical maximums and that the actual extent of each state’s extended continental shelf would likely be reduced once more data were gathered.
    States around the world have been making these submissions, with some 77 filed to date for seas ranging from Oceania to the Caribbean.
    The CLCS has reached decisions on about a quarter of them.
    In the Arctic, Norway’s submission has been approved, Denmark’s is under review, Canada’s is being prepared, and Russia has just deposited a revised submission after its original 2001 submission was returned with a request for more detailed scientific evidence.
    The United States is the sole Arctic state frozen out of the process because it has failed to ratify the United Nations Convention on the Law of the Sea.
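    The 200-nautical-mile threshold mentioned above is, at bottom, a distance from coastal baselines. A minimal sketch of such a check might look like the following; the coordinates are invented, and real delimitation uses legally defined baselines and proper geodesy rather than a spherical-Earth haversine.

```python
import math

# Sketch only: is a seabed point more than 200 nautical miles from a
# coastal baseline point? Coordinates below are made up for illustration.
NM_PER_KM = 1 / 1.852          # one nautical mile is exactly 1.852 km

def distance_nm(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance in nautical miles (spherical Earth)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a)) * NM_PER_KM

# A seabed point ~4 degrees of latitude due north of a baseline point:
d = distance_nm(70.0, 40.0, 74.0, 40.0)
print(d > 200)   # True: beyond 200 nm, so rights require an extended-shelf submission
```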

    The new Russian claim adds two new areas and subtracts one from the original 2001 claim.
    In total, it adds about 103,000 square kilometres to what had been a claim of 1,325,000 square kilometres.
    The new Russian claim crosses into the Canadian and Danish sides of the North Pole for the first time. While this may have symbolic impact (especially for Canadians and Danes), it has no legal significance.
    In short, little is actually happening on the international seabed – in the Arctic or elsewhere – other than states using science to claim the limited economic rights that are reserved for them by international law.
    These filings should therefore be celebrated as reaffirmations of the will toward peace and stability, rather than feared as unilateral acts of aggression.
    All too often, however, states’ CLCS filings have been interpreted as territorial “land grabs” (or, more correctly, “sea grabs”).
    The most recent Russian claim has been met with a predictable round of defensive sabre rattling.
    The IBRU map may inadvertently aid this impression.
    Solid lines and bright colours imply that vast areas of ocean are being claimed by individual states as sovereign territory, while overlapping areas appear as spaces where conflict already exists.
    News stories that reprint the map rarely include the notes that explain what its colours and shadings actually mean.
    The medium of the map – which appears to communicate a world of states “owning” territory and keeping others out – has in some senses overtaken the message of states working together.

    The Russians are coming … or are they?

    In the context of Russia’s expansion into non-Arctic territories (notably in Crimea), the revised Russian claim has struck the media as another tale of Russian expansion. Provocative headlines noted that, with the filing, “Russia claims North Pole for itself” in a “Move to seize oil and gas rights”.
    Having drawn the revised map, IBRU had a difficult decision: Do we issue a new map and potentially add fuel to this misleading narrative or do we wait for the story to die down so that lawyers, diplomats and scientists can work quietly with the data?
    We soon concluded that, even when they misinform, maps provide an opportunity for education.
    Therefore, IBRU chose to release not just the revised version of the general map, but also the second map showing the difference between the two Russian claims.
    Recent cartographic theorists have stressed that maps are not the static representations that they purport to be.
    Rather, they are living documents that are remade with each reading.
    In one reading, the IBRU Arctic map may “prove” that there is a “scramble for the Arctic”.
    But the map may also be read as testament to the world’s commitment to the rule of law and the orderly settlement of disputes.
    The stories within – and about – the IBRU Arctic map illustrate not just how we think about the Arctic and its resources, but also how we think about the map as a tool of science, politics, and law.

    Links :

    Monday, February 8, 2016

    USCG: Guidance on the use of Electronic Charts and Publications

    Official nautical raster charts from original material coming from international Hydrographic Offices displayed online with mobile marine planning applications (W4D screenshot) and GeoGarage platform

    The US Coast Guard has issued a Navigation and Vessel Inspection Circular (NVIC) providing USCG marine inspectors and the maritime industry with uniform guidance on Coast Guard policy regarding the use of electronic charts and publications in lieu of paper charts, maps, and publications.

    As per the reference, US flagged vessels may maintain in electronic format the navigation publications required by 33 C.F.R. §§
    • 164.33 (Charts and publications),
    • 164.72 (Navigational-safety equipment, charts or maps, and publications required on towing vessels),
    • 161.4 (Requirement to carry the rules),
    and by SOLAS Chapter V Regulation 27 (Nautical charts and nautical publications). 

    The following guidance applies to US flagged vessels subject to US domestic chart (or map) and publication carriage requirements codified in Titles 33 and 46 of the C.F.R., and provides a voluntary equivalency for complying with those requirements.

    Click to read USCG NVIC 01-16

    Extracts :

    "Due to the current state of technology, the Coast Guard believes that official electronic charts provide substantially more information to the mariner, and therefore may enhance navigational safety beyond that of official paper charts.
    Official electronic charts, when displayed on electronic charting systems (with integrated systems such as Electronic Position-Fixing Devices, Automatic Identification System, gyro, radar), can provide the mariner with substantially more navigational information than a paper chart. These enhancements better facilitate voyage planning and monitoring and thus may reduce the potential for marine accidents."

    Links :

    New Air Force satellites launched to improve GPS

     Final 12th GPS II satellite goes into orbit as Air Force gets ready for GPS III

    From Techcrunch

    Last Friday, the United Launch Alliance (ULA) successfully launched a Boeing-built satellite into orbit as part of the U.S. Air Force’s Global Positioning System (GPS).

    This $131 million satellite was the final addition to the Air Force’s most recent 12-satellite GPS series, known as the Block IIF satellites.

    GPS satellites are operated by the Air Force and provide global positioning, navigation and timing services for both military and civilian users.
    We can all access GPS from our phones because of this very constellation.
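    Under the hood, a GPS receiver solves for its position from ranges to satellites at known positions. The stripped-down 2-D sketch below shows the core idea (least squares on range residuals); it ignores receiver clock error and everything else a real receiver handles, and all the numbers are invented.

```python
import numpy as np

# Toy 2-D multilateration: given ranges to transmitters ("anchors") at
# known positions, solve for the receiver position by Gauss-Newton
# least squares on the range residuals.
def locate(anchors, ranges, guess=(50.0, 50.0), iters=20):
    x = np.asarray(guess, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                      # vectors anchor -> guess
        dists = np.linalg.norm(diffs, axis=1)    # predicted ranges (must be nonzero)
        J = diffs / dists[:, None]               # Jacobian of predicted ranges
        residual = ranges - dists                # measured minus predicted
        x = x + np.linalg.lstsq(J, residual, rcond=None)[0]
    return x

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
true_pos = np.array([30.0, 40.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free "measurements"
print(locate(anchors, ranges))                       # converges to roughly [30, 40]
```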

    Back in 1978, the first GPS satellite was launched into orbit.
    Since then, the Air Force has improved its satellite design and released new versions of GPS satellites in blocks.
    Starting with Block I, the Air Force has moved through Block IIA, Block IIR, Block IIR-M, and today they’ve completed the launch of their Block IIF series.

    While only 30 GPS satellites are currently operational, 50 have been launched in total.
    The most recent group of Block IIF satellites was launched between May 2010 and today.

    Col. Steve Whitney, the director of the Global Positioning System Directorate, said that the last leg of launches had “one of the most aggressive launch schedules of the last 20 years.”
    There were 7 Block IIF satellites launched in just over 21 months.


    The GPS Block IIF satellites were launched to improve the accuracy of GPS. Whitney said that before the Block IIF series, GPS accuracy could be off by 1 meter.
    With the new Block IIF satellites in place, that error is down to 42 centimeters.

    The change won’t mean much to the average civilian, but it could mean the difference between life and death for the military, which uses GPS to guide munitions to specific targets.

    In order to make room for today’s satellite, the Air Force will move one of the older Block IIA satellites that was launched in 1990 out of its orbit.
    Impressively, the satellite is still operational and will continue to serve the GPS constellation as a back-up satellite.

    Now that Block IIF is up and running, the Air Force will shift its focus to the next series of Block III satellites for the GPS-3 constellation.
    Block III satellites will continue to improve the accuracy and reliability of GPS navigation and will have upgraded anti-jamming and security capabilities for military signals.

    Maintaining an up-to-date, fully functioning GPS is critical to national security.
    For that reason, selecting a company to launch these assets is an important decision.
    There’s been some controversy recently over which company (ULA or SpaceX) should launch the Air Force’s next block of satellites.
    The decision has not yet been made.

    The first GPS 3 satellites are scheduled to be launched in 2018.

    Link :

    Sunday, February 7, 2016

    Watch all of 2015’s weather in super high-def

     This visualisation, comprising imagery from the geostationary satellites of EUMETSAT, NOAA and the JMA, shows an entire year of weather across the globe during 2015, with audio commentary from Mark Higgins, Training Manager at EUMETSAT.

    From ClimateCentral by Brian Kahn 

    Another year of wild weather is behind us. But thanks to EUMETSAT, you can now relive it in amazing high-definition video from space.
    The new visualization uses geostationary satellite data from EUMETSAT, the Japan Meteorological Agency and the National Oceanic and Atmospheric Administration to stitch together 365 days of data into one stunning highlight reel of 2015’s weather.
    And what a year it was. You’ll definitely want to keep your eye on the tropics throughout the animation as the northern hemisphere set a record for the most major tropical cyclones to form in a year.
    Around the 6:30 mark, you can see the evolution of Hurricane Joaquin, the strongest Atlantic hurricane of 2015. It went from a tropical depression in late September to a Category 4 storm that battered the Bahamas and menaced the East Coast before steering all the way across the Atlantic and plowing into the U.K.

     

    The transition of Hurricane Joaquin near the Bahamas to an extratropical storm that hit the U.K.
    Hurricane Patricia became the strongest hurricane ever recorded in October and at the 6:55 mark, you can see it quickly slam into Mexico’s west coast before heading inland to inundate parts of Texas.

    But beyond the highlights, there’s also the yearly ebb and flow of weather on our fair planet. During the southern Amazon’s rainy season, which lasts from December to April, you can see clouds pop up almost daily to spread rain across the region.
    Clouds become far less plentiful during the region’s dry season.

    And more broadly, you can see weather patterns flow across continents and oceans.
    Today’s storm in the Southeast U.S. is next week’s rain in Spain.
    By putting together a global view of our planet, EUMETSAT’s video shows how our atmosphere is the common tie that binds humanity together.
    A few things have been updated since last year’s version.
    For one, EUMETSAT has cranked the resolution to 4K for truly epic detail.
    And more importantly, the quality of satellites in space has improved.
    Both Japan and EUMETSAT launched new satellites last year that have higher resolutions than their predecessors. The National Oceanic and Atmospheric Administration plans to launch a new high resolution geostationary satellite this year, adding even more detailed coverage of the planet.
    That’s good news if you want an even sharper 4K experience or improved forecasts.
    And if you want both, well, then life is really good.

    Links :

    Saturday, February 6, 2016

    A plastic ocean

    A Plastic Ocean is an adventure documentary shot in more than 20 locations over the past 4 years. Explorers Craig Leeson and Tanya Streeter and a team of international scientists reveal the causes and consequences of plastic pollution and share solutions.



    This film, directed by Emily V. Driscoll, is an award-winning short documentary that follows NYC sci-artist Mara G. Haseltine as she creates a sculpture to reveal a microscopic threat beneath the surface of the ocean.
    During a Tara Oceans expedition to study the health of the oceans, Haseltine finds an unsettling presence in samples of plankton she collected.
    The discovery inspires her to create a sculpture that shows that the microscopic ocean world affects all life on Earth.
    Watch Mara G. Haseltine's art film featuring her sculpture and opera singer Joseph Bartning: La Boheme- A Portrait of Our Oceans in Peril vimeo.com/128797284 

    Friday, February 5, 2016

    Swarming robot boats demonstrate self-learning


     The video above describes how the sea swarm works.
    Bio-inspired Computation and Intelligent Machines Lab, Lisbon, Portugal; Instituto de Telecomunicações, Lisbon, Portugal; University Institute of Lisbon (ISCTE-IUL), Lisbon, Portugal

    From Gizmag by David Szondy

    Robots may be the wave of the future, but it will be a pretty chaotic future if they don't learn to work together.
    This cooperative approach is known as swarm robotics and in a first in the field, a team of engineers has demonstrated a swarm of intelligent aquatic surface robots that can operate together in a real-world environment.

     The sea-going robots are made using digital manufacturing techniques
    (Credit: Biomachines Lab)
     
    Using "Darwinian" learning, the robots are designed to teach themselves how to cooperate in carrying out a task.
    A major problem facing the navies of the world is that as ships become more sophisticated they also become much more expensive.
    They are packed with highly trained personnel who cannot be put at risk, except in the most extreme circumstances, and even the most advanced ship suffers from not being able to be in two places at once.
    One solution to this dilemma is to augment the ships with swarms of robot boats that can act as auxiliary fleets at much lower cost and without risk of life.
    The tricky bit is figuring out how to get this swarm to carry out missions without turning into a robotic version of the Keystone Cops.
    The approach being pursued by a team from the Institute of Telecommunications at University Institute of Lisbon and the University of Lisbon in Portugal is to rely on self-learning robots.
    Led by Dr. Anders Christensen, the team recently demonstrated how up to ten robots can operate together to complete various tasks.
    The small robots are made of CNC-machined polystyrene foam and 3D-printed components at a materials cost of about €300 (US$330).
    The electronics package includes GPS, compass, Wi-Fi, and a Raspberry Pi 2 computer.
    However, the key is their decentralized programming.

    "Swarm robotics is a paradigm shift: we rely on many small, simple, and inexpensive robots, instead of a single or a few large, complex, and expensive robots," says Christensen.

    "Controlling a large-scale swarm of robots cannot be done centrally. Each robot must decide for itself how to carry out the mission, and coordinate with its neighbors."
    Instead of using a central computer or programming each robot individually, the swarm operates on what the team calls a Darwinian approach.
    In other words, each robot is equipped with a neural network that mimics the operations of a living brain.
    The robots are given a simple set of instructions about how to operate in relationship to one another as well as mission goals.
    The robots are then allowed to interact with one another in a simulated environment and those that display successful mission behavior are allowed to proceed.
    The "fittest" robots from the simulations are then tested in the real world.
    According to the team, the clever bit about the swarm is that, like schools of fish or flocks of birds, none of the robots know of or "care" about the other robots beyond their immediate neighbors. Instead, they react to what their immediate neighbors do as they determine the best way to fulfill their mission objectives such as area monitoring, navigation to waypoint, aggregation, and dispersion.
    In a sense, they learn to cooperate with one another.
    The team is currently working on the next generation of aquatic robots with more advanced sensors and the ability to handle longer missions.
    Eventually, they could be used in swarms numbering hundreds or thousands of robots for environmental monitoring, search and rescue, and maritime surveillance.
    The team's research is being peer reviewed and is available here.
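    The "Darwinian" loop described above can be caricatured in a few lines: score candidate controllers in a simulated task, keep the fittest, and refill the population with mutated copies of the survivors. Everything below (the stand-in "weights", the fitness function, the parameters) is invented for illustration and is not the team's actual code.

```python
import random

random.seed(1)  # make the sketch reproducible

def fitness(weights):
    # Simulated mission score: how close the controller's parameters come
    # to an (arbitrary, invented) ideal behaviour. Higher is better.
    ideal = [0.5, -0.2, 0.8]
    return -sum((w - i) ** 2 for w, i in zip(weights, ideal))

def evolve(pop_size=30, generations=40, keep=10, noise=0.1):
    # Random initial population of controller parameter vectors.
    population = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:keep]            # the "fittest" controllers survive
        population = survivors + [               # mutated copies refill the swarm
            [w + random.gauss(0, noise) for w in random.choice(survivors)]
            for _ in range(pop_size - keep)
        ]
    return max(population, key=fitness)

best = evolve()
print(fitness(best))   # near zero: the evolved controller matches the ideal
```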

    Thursday, February 4, 2016

    Eyes in the sky: Green groups are harnessing data from space


    From Yale e360 by Jacques Leslie

    An increasing number of nonprofit organizations are relying on satellite imagery to monitor environmental degradation.
    Chief among them is SkyTruth, which has used this data to expose the extent of the BP oil spill, uncover mining damage, and track illegal fishing worldwide.

    When Brian Schwartz, a Johns Hopkins University epidemiologist researching the public health impacts of hydraulic fracturing, read about an environmental group that uses satellite imagery and aerial photography to track environmental degradation, he was intrigued.

    It was the summer of 2013, and the group, SkyTruth, had just launched a crowdsourcing project on its website to map fracking activity in Pennsylvania.
    The site provided volunteers with U.S. government aerial images from across the state and a brief tutorial on how to identify fracking locations.
    Within a month, more than 200 volunteers sorted through 9,000 images to pinpoint 2,724 fracking wellpads.
    Schwartz ended up using this data in a study published last October in the journal Epidemiology, showing that women living near hydraulic fracturing sites in 40 Pennsylvania counties faced a significantly elevated risk of giving birth prematurely.



    That’s precisely the sort of result that John Amos, SkyTruth’s president, envisioned when he founded the group in 2001.
    He has since become part data analyst, part environmental advocate, and part satellite-imagery proselytizer as he looks for ways to use remote sensing to call attention to little-noticed environmental damage.
    This month, SkyTruth’s website is displaying a map showing the global prevalence of flaring, the wasteful and carbon-spewing oil industry practice of burning natural gas and other drilling byproducts.
    Through most of December, SkyTruth and another satellite-focused nonprofit, Moscow-based Transparent World, displayed images of a burning oil platform and a 2,300-barrel oil slick in the Caspian Sea.
    The platform’s owner, Azerbaijan’s state-owned oil company, SOCAR, denied that any spill had occurred.


    In the 5 years since BP, there have been nearly 10,000 spills reported in the Gulf of Mexico

    SkyTruth’s defining moment came in 2010, when Amos — analyzing satellite photographs — sounded the alarm that the Deepwater Horizon oil spill in the Gulf of Mexico was far larger than the petroleum company, BP, and the U.S. government were acknowledging.
    “If you can see it,” says SkyTruth’s motto, displayed at the top of its website, “you can change it.”

    One indication of SkyTruth’s influence is a cautionary headline that appeared after SkyTruth formed a partnership with Google and the nonprofit Oceana in November 2014 to launch a system called Global Fishing Watch, which uses the satellite transponders found aboard most large fishing vessels to track the activities of the world’s fishing fleets.
    “Big Brother is watching,” warned World Fishing & Aquaculture, a trade journal.



    That admonition could be extended to all the extractive industries — oil and gas, mining, logging, and fishing — whose operations can be tracked by remote sensing.
    A growing number of governments now conduct environmental observation by satellite, including Brazil, which monitors deforestation in the Amazon.
    And environmental groups now commonly use remote sensing tools.
    One prominent example is Global Forest Watch, a system launched two years ago by the Washington-based World Resources Institute to monitor logging and fires in the world’s forests.
    Russia-based Transparent World employs satellite imagery for many purposes, including monitoring of protected areas and observing the impacts of dam construction.

    Amos, 52, says he considered himself an environmentalist even while he spent a decade working for oil and gas companies as a satellite imagery analyst looking for drilling sites.
    He quit in 2000 to start a non-profit that would apply his skills to environmental protection.
    For years he ran SkyTruth from the basement of his Shepherdstown, West Virginia home on an annual budget of less than $100,000, and he still speaks of “begging” satellite images from commercial providers.

    Although SkyTruth has expanded in recent years to eight employees supported by a $600,000 budget, it is still tiny, particularly compared to the U.S. government’s massive satellite resources.
    Nevertheless, SkyTruth has delved into realms that the government has avoided.
    One reason, Amos says, is that satellite imagery analysis is so unfamiliar that “nobody has known what to ask for” — thus, one of SkyTruth’s missions is to show what’s possible.
    Its usual method is to release a trove of environment-related data, then invite researchers and crowdsource amateurs to analyze it.


    SkyTruth has benefited enormously from the explosion in the last 15 years in satellite imagery and other digital technologies.
    When Amos started SkyTruth, a single Landsat satellite image cost $4,400; now the entire U.S. government collection — more than 4.7 million images and growing daily — is available free of charge.
    Not only have satellites and satellite imagery become cheap, but the capacity to analyze, duplicate, send, and store satellite data has expanded by orders of magnitude.
    In fact, satellite technology is now considered a subset of a larger field, geospatial intelligence, which has tens of thousands of practitioners around the world employing an array of optical, thermal, radar, and radiometric remote sensing tools.
    “It’s evolved from a problem of getting imagery to deciding which image do I want to pluck out of this massive cloud,” Amos told me.

    The finding by Schwartz, the Johns Hopkins epidemiologist, on premature births suggests a correlation between fracking and poor human health; but because the chemical trigger wasn’t identified, the link isn’t regarded as causal.
    From more than 1,000 available chemicals, fracking operators select a dozen or so that fit the geological challenges of a particular site.
    People living near the site typically can’t find out whether their wells and aquifers have been contaminated because the cost of testing for all 1,000 chemicals is prohibitive, and operators treat each site’s chemical recipe as a trade secret.

    The quandary led Amos to venture beyond satellite imagery into the larger field of geospatial data. Along with several better-known environmental groups, SkyTruth argued for disclosure of the recipe used at each fracking site.
    Two industry lobbying groups, the American Petroleum Institute and America’s Natural Gas Alliance, defused mounting Congressional pressure for mandatory disclosure by launching a website, FracFocus, where operators could post their recipes voluntarily.
    But soon after the site’s launch in 2011, users found that information posted on it was entered in the wrong fields, used misspelled chemical trade names, or omitted key facts deemed proprietary.
    The site thwarted researchers by requiring postings in a format that computers couldn’t read.
    Although 23 states require fracking companies to use FracFocus to disclose their chemical use, a 2013 Harvard Law School report concluded that FracFocus “fails as a regulatory compliance tool.”

    SkyTruth’s lead programmer, Paul Woods, devised a way around some of FracFocus’ barriers by writing software that “scraped” all the chemical data from the tens of thousands of reports posted on the site.
    Then he posted it in a database on SkyTruth’s website.
    In addition, under pressure from SkyTruth, other environmental groups, and an Energy Department advisory board, FracFocus agreed to make its data available in machine-readable form beginning in May 2015.
    These developments have yielded more and more information for researchers, such as Schwartz, who are investigating fracking’s health impact.
    “This is a very wonky issue that makes people’s eyes glaze over,” Amos said.
    “But it’s where the rubber meets the road in terms of understanding if fracking is bad for you.”
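    The scraping step Woods performed can be sketched with just the standard library: pull the text of every table cell out of an HTML page. The snippet and tag structure below are invented; FracFocus's real pages were more complicated (and, as noted, often not machine-readable at all).

```python
from html.parser import HTMLParser

# Minimal stdlib scraper: collect the text content of every <td> cell.
class CellScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

# Invented example page listing a chemical and its CAS number:
page = "<table><tr><td>Hydrochloric acid</td><td>7647-01-0</td></tr></table>"
scraper = CellScraper()
scraper.feed(page)
print(scraper.cells)   # ['Hydrochloric acid', '7647-01-0']
```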

    The first time that SkyTruth attracted national attention was in April 2010, when Amos received a Google alert that an oil platform called Deepwater Horizon, 50 miles off the Louisiana coast, had exploded and burned.
    Amos knew explosions like this one were uncommon and usually led to spills.

    He began searching for satellite photos, but the first ones he found were obscured by clouds. Meanwhile, BP, which leased the rig, and the Coast Guard, echoing BP, maintained that the ruptured well beneath the rig was leaking oil at a rate of 1,000 barrels a day— a major spill but perhaps not a catastrophic one.
    The number was vital, for it would help determine the scale and strategy of the leak containment effort, the eventual cost to BP in fines and damages, and the scope of preparations for the next spill.


    It took Amos six days to acquire clear images.
    His first thought, he says, was: “Oh my God! This is much bigger than anybody realizes.”
    He calculated that the slick was 50 miles long and covered 817 square miles.
    He outlined the slick, along with his calculations, and posted both on SkyTruth’s website.

    Within a day, Ian MacDonald, a Florida State University oceanographer and oil slick authority, notified Amos that the leak’s flow rate was much bigger than a thousand barrels a day.
    Using Amos’ calculations of the slick’s size and conservative assumptions about its thickness, MacDonald concluded that it was “not unreasonable” that the leak was 20 times BP’s initial estimate.
    Undermined by SkyTruth’s numbers, the National Oceanic and Atmospheric Administration conceded the next day that BP’s initial estimate was too low: over BP’s public objections, NOAA revised the government estimate to 5,000 barrels a day.
    Two months later — prodded, in part, by SkyTruth — government scientists concluded that the initial flow rate was 62,000 barrels a day, 62 times BP’s initial estimate.
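    MacDonald's style of estimate is simple arithmetic: slick volume is area times an assumed average film thickness. The thickness below (1 micron) is an illustrative low-end value, not his actual assumption.

```python
# Back-of-the-envelope spill estimate: volume = slick area * film thickness.
SQ_MILE_TO_M2 = 2.59e6       # square metres per square mile
M3_TO_BARRELS = 1 / 0.159    # one barrel of oil is ~0.159 m^3

area_m2 = 817 * SQ_MILE_TO_M2            # slick area from the satellite image
thickness_m = 1e-6                        # assumed (illustrative) film thickness
volume_barrels = area_m2 * thickness_m * M3_TO_BARRELS
print(round(volume_barrels))              # ~13,300 barrels visible on the water

# Spread over the ~6 days it took to acquire clear images, even this
# thin-film assumption implies well over 1,000 barrels a day on the
# surface alone, before accounting for evaporation and dispersal.
per_day = volume_barrels / 6
print(round(per_day))
```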

    SkyTruth has also affected the course of mountaintop removal coal mining.
    Appalachian states have issued hundreds of permits for mountaintop removal mines, but they’ve rarely checked to see whether the mines have stayed within the permitted boundaries.
    Permits are supposed to be issued only after assessing impacts on downstream waterways, and a 2004 study published by West Virginia’s environmental protection department found that nearly 40 percent of mines in ten counties were situated outside permitted locations.

    Acting on a request from Appalachian Voices, a North Carolina-based nonprofit that opposes mountaintop removal mining, SkyTruth devised a technique for identifying the mines from satellite images, then mapped their growth over three decades and posted the results on its website in 2009.
    The information was used in six peer-reviewed academic articles, including a Duke University study that found that once five percent of a watershed is mined, water quality in its rivers and streams usually fails to meet state standards.
    That study in turn provided empirical backing for the U.S. Environmental Protection Agency’s 2011 revocation of a mine permit in West Virginia that had been issued by the U.S. Army Corps of Engineers.
    The decision marked the first time the EPA had ever reversed a coal mine’s permit under the Clean Water Act.

     This June 21 2014 satellite photo from NASA, annotated by SkyTruth, shows an oil slick extending in an arc at least 8.3 miles (13.4 km) long from a well site at a Taylor Energy Co. platform that was toppled in an underwater mudslide triggered by Hurricane Ivan's waves in September 2004.

    In search of images that tell environmental stories, SkyTruth pays close attention to news reports, but occasionally it finds stories of its own.
    One example is what is probably the Gulf of Mexico’s longest-running commercial oil spill, at the site of a rig destroyed by an underwater mudslide during Hurricane Ivan in 2004.
    The slide buried 28 wells on the sea floor under 100 feet of mud, which made sealing them extremely difficult.
    The rig’s owner, Taylor Energy Company, went bankrupt trying.
    Amos discovered the leaks in 2010 while studying Hurricane Katrina’s impacts, and has been sounding an alarm ever since.
    The leaks have trickled steadily into the Gulf’s waters since 2004 at a rate Amos estimates at between one and 20 barrels a day, creating a slick that is sometimes 20 miles long.
    The wells are ten miles offshore in federally managed water, but no federal agency has tried to seal the leak.

    Given the controversial issues SkyTruth has been involved with, the group has attracted surprisingly little criticism, perhaps because so much of its work is grounded in visual data— for SkyTruth, seeing really is believing.
    A notable exception occurred in 2009 when Amos testified at a U.S. Senate subcommittee hearing on the under-appreciated risks of deepwater oil drilling.
    Senator Mary Landrieu, a Louisiana Democrat, attacked Amos for overlooking the oil industry’s safety record and economic benefits.
    “You do a great disservice by not telling the American people the truth about drilling and putting it in the perspective it deserves,” Landrieu told Amos.

    Landrieu didn’t give Amos a chance to respond, but, as it turned out, he didn’t have to.
    The BP spill occurred five months later.

    Links :