The Mediterranean Sea concentrates many pressures: tourism, shipping, fishing and threats to its biodiversity. Considerable efforts have been made over the past 30 years to preserve it.
Scientists monitor the health of the Mediterranean Sea through biological monitoring of its richest and most sensitive ecosystems, such as Posidonia meadows and coralligenous reefs, as well as by listening to underwater sounds and mapping the seabed.
Through the magnificent underwater images of Roberto Rinaldi and Laurent Ballesta, you will dive into the heart of the Mediterranean Sea and its biodiversity to better understand why and how it is monitored.
You will also discover the latest improvement actions implemented and their results.
Professional fishermen, scientists and coastal managers share their expertise on global warming in the Mediterranean...
Automated vessels, digital dependence put paper charts in rearview, agency says at Juneau meetings
The tan and blue paper nautical charts that line wheelhouses and galleys on Alaska ships will soon be a relic of the past.
In an effort to increase automation and adapt to digital navigation, the National Oceanic and Atmospheric Administration is rebuilding its chart products for digital use, said Rear Admiral Shep Smith, Director of the Office of Coast Survey.
The effort will take about 10 years and will allow for more seamless navigation and a greater level of detail, Smith said.
“The paper charts that we all grew up with and know and love were the traditional way of capturing the survey information and making it actionable for mariners,” Smith explained.
Not so anymore.
The move isn’t just a matter of digitizing its paper charts.
NOAA and its partners already do that.
Many navigation systems used on vessels simply layer the ship’s GPS position on top of NOAA’s charts.
The Office of Coast Survey will have to totally revamp how it builds charts to properly adapt them for digital and automated uses.
“We’re basically starting over and redesigning a suite of nautical charts that is optimized and designed from the ground up to be used digitally. It’s not intended to be used on paper,” Smith said.
A closer look at NOAA Ship Thomas Jefferson’s project area highlights its navigational characteristics.
Right now, NOAA periodically produces new editions of its nautical charts.
There are over 1,000 of these charts, each detailing a certain area at a certain scale.
Each chart has an edition date and uses certain contour lines for water depth — similar to the contour lines used on land maps to show the height of a mountain at a certain position, for instance.
But those lines — and other features of the paper charts — are inconsistent, a function of the charts being designed as paper products.
The first contour line on one nautical chart may indicate a water depth of 10 fathoms (60 feet).
On another chart, the first contour line might indicate a depth of 30 fathoms (180 feet).
There are discrepancies in scale, too. NOAA uses over 100 different scales in paper charts of U.S. waters.
At each different scale, there’s a difference in the level of detail.
When crossing open water, the Gulf of Alaska, for instance, a mariner might use a small-scale chart, which covers a wide area in less detail; that fine detail isn’t needed offshore.
When a ship approaches an anchorage, where knowing the exact location of a rock in a small area becomes important, they might switch to a larger-scale chart, which shows a small area in much greater detail.
Bathymetric data collected by NOAA Ship Rainier in Tracy Arm Fjord.
With something like Google Earth, the transition between the two scales is seamless.
Just zoom in and greater levels of detail reveal themselves.
That’s one of the hopes of the new system, Smith said.
“If you use the nautical charts that we have now, derived from the paper in the digital format, there are discontinuities … they don’t line up,” Smith said.
“There’s all these artifacts of the fact that they’re derived from paper.”
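To make the contrast concrete, here is a minimal sketch of how a chart system designed for digital use might choose its level of detail, assuming the generic web-mapping convention of a tile pyramid in which each zoom level doubles the resolution of the one above. The tile size, screen resolution and example scales are illustrative assumptions, not anything NOAA has published.

```python
import math

# Sketch: derive a tile-pyramid zoom level from a chart display scale.
# Each zoom level doubles the resolution of the previous one, so detail
# increases smoothly as the mariner zooms in. The 256 px tile size and
# 96 dpi screen are generic web-mapping assumptions, not NOAA figures.
EARTH_CIRCUMFERENCE_M = 40_075_017
TILE_SIZE_PX = 256

def zoom_for_scale(display_scale: float, screen_dpi: float = 96.0) -> int:
    """Pick the zoom level matching a chart scale such as 1:80,000."""
    metres_per_pixel = display_scale * 0.0254 / screen_dpi
    world_px = EARTH_CIRCUMFERENCE_M / metres_per_pixel
    return round(math.log2(world_px / TILE_SIZE_PX))

print(zoom_for_scale(5_000_000))  # open-ocean passage -> coarse level (7)
print(zoom_for_scale(10_000))     # approaching an anchorage -> fine level (16)
```

Under a scheme like this, a mariner zooming from an ocean passage down to an anchorage slides smoothly through detail levels instead of swapping between charts drawn at incompatible scales.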
The redesign is intended to make automated and unmanned vessel navigation easier.
Smith used the example of cruise ships.
(Five cruise ships were in Juneau on Wednesday when Smith spoke to the Empire about the project at a meeting of the Hydrographic Services Review Panel.)
Cruise ship captains have to know their vessel’s exact location in relation to the shore to be sure they’re discharging waste in a legal area.
But discharge regulations aren’t delineated on a map.
Instead, they’re written down and enforced by the Alaska Department of Environmental Conservation.
Right now, cruise ships take DEC’s written regulations and translate them into action plans so their engineers can avoid discharging waste too close to shore.
With NOAA’s digitally built charts, the system could be more foolproof.
“In an automated world, the ship itself and the ship’s systems could know whether it was in a discharge zone or what kind of discharge regime it’s in,” Smith said.
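As a rough illustration of what it would mean for the ship itself to know its discharge regime, the sketch below tests a GPS fix against a machine-readable zone boundary using the Shapely geometry library. The rectangle and its coordinates are invented for the example; real boundaries would have to be encoded from DEC’s regulations into the chart data.

```python
from shapely.geometry import Point, Polygon

# Sketch: test a ship's GPS fix against a machine-readable zone boundary.
# This rectangle near Juneau is invented for the example; a real boundary
# would be encoded from the Alaska DEC discharge regulations.
no_discharge_zone = Polygon([
    (-134.60, 58.25),  # (longitude, latitude) corners
    (-134.20, 58.25),
    (-134.20, 58.45),
    (-134.60, 58.45),
])

def discharge_allowed(lon: float, lat: float) -> bool:
    """Return False when the fix falls inside the restricted zone."""
    return not no_discharge_zone.contains(Point(lon, lat))

print(discharge_allowed(-134.40, 58.35))  # inside the zone  -> False
print(discharge_allowed(-135.50, 58.00))  # well outside it  -> True
```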
As is happening on terrestrial highways, driverless technology may soon take to ocean shipping lanes. Smith said Alaska could see an increase in unmanned — or less-manned — shipping in coming years.
Those vessels will need purpose-built charts to move about safely, Smith said.
The Office of Coast Survey is working with a crew from the University of New Hampshire to make unmanned survey vessels aware of where they are in the ocean.
“Within the lifespan of the charts that we’re beginning to make now, there will be unmanned ships … The algorithms that would control an unmanned ship would have to interact with the chart,” Smith said.
UK ministers are setting aside £92m to study the feasibility of building a sovereign satellite-navigation system. (see Gov.uk press release)
The new network would be an alternative to the European Union's Galileo project, in which Britain looks set to lose key roles as a result of Brexit.
The UK Space Agency will lead the technical assessment.
Officials will engage British industry to specify a potential design, its engineering requirements, schedule and likely cost.
The first contracts for this study work could be issued as early as October.
The UKSA expects the assessment to take about a year and a half.
Ministers could then decide if they really want to proceed with a venture that will have a price tag in the billions.
Seeking a deal
London and Brussels are still negotiating Britain's future participation in Galileo.
The parties are currently in dispute over the UK's access, and industrial contribution, to the system's Public Regulated Service (PRS) beyond March of next year.
PRS is a special navigation and timing signal intended for use by government agencies, the armed forces and emergency responders. Expected to come online in 2020, it is designed to be available and robust even in times of crisis.
Brussels says London cannot immediately have access to PRS when the UK leaves the European bloc because it will become a foreign entity.
London says PRS is vital to its military and security interests and warns that if it cannot use and work on the signal then it will walk away from Galileo in its entirety.
Prime Minister Theresa May, presently on a tour of Africa, told the BBC it was "not an idle threat to achieve our negotiating objectives".
The UK did not want to be simply an "end user" and needed full access to Galileo if it was to continue to contribute to the system, she added.
UK firms have been integral to the development of Galileo
Galileo at a glance:
- A project of the European Commission and the European Space Agency
- 24 satellites constitute a full system, with six in-orbit spares as well
- 26 spacecraft are in orbit today; the figure of 30 is likely to be reached in 2021
- The original budget was 3bn euros, but the system will now cost more than three times that
- Works alongside the US GPS, Chinese Beidou and Russian Glonass systems
- Promises eventual real-time positioning down to a metre or less
'Best of British'
The UK as an EU member state has so far invested £1.2bn in Galileo, helping to build the satellites, to operate them in orbit, and to define important aspects of the system's encryption, including for PRS itself.
"Due to the European Commission post-Brexit rules imposed on UK companies, Airbus Defence and Space Ltd was not able to compete for the Galileo work we had undertaken for over the last decade," Colin Paynter, MD of Airbus DS in the UK, said.
"We therefore very much welcome the UK Space Agency's announcement today which we believe will allow Airbus along with other affected UK companies to bring together an alliance of the Best of British to produce innovative solutions for a possible future UK navigation system."
Analysis - Could the UK go it alone?
Few people doubt Britain is capable of developing its own satellite-navigation system.
But the task would not be straightforward. Here are just four issues that will need to be addressed before ministers can sign off on such a major project:
COST:
The initial estimate given for a sovereign system when first mooted was put in the region of £3bn-5bn.
But major space infrastructure projects have a history of under-estimating complexities.
Both GPS and Galileo cost far more - and took much longer - to build than anyone expected.
In addition to the set-up cost, there are the annual running costs, which in the case of Galileo and GPS run into the hundreds of millions of euros/dollars.
A sat-nav system needs long-term commitment from successive governments.
BENEFIT:
Just the year-to-year financing for a sat-nav system would likely dwarf what the UK government currently spends on all other civil space activity - roughly £400m per year.
The question is whether investments elsewhere, in either the space or military sectors, would bring greater returns, says Leicester University space and international relations expert Bleddyn Bowen: "We could spend this £100m [feasibility money] doubling what the government is giving to develop launcher capability in the UK, which is only £50m - it could make a real difference. You could also spend that money buying some imagery satellites for the MoD, which would transform their capabilities overnight."
SKILLS:
Britain has a vibrant space sector.
It has many of the necessary skills and technologies to build its own sat-nav system, but it does not have them all. Many of the components for Galileo satellites, for example, have single suppliers in Europe.
If Britain cannot develop domestic supply chains for the parts it needs, there may be no alternative but to bring them in from the continent.
Spending the project's budget in the EU-27 may not be politically acceptable given the state of current relations on Galileo.
FREQUENCIES:
The UKSA says a British system would be compatible with America's GPS - and by extension with Galileo - because both these systems transmit their timing and navigation signals in the same part of the radio spectrum.
This simplifies receivers and allows manufacturers to produce equipment that works with all available systems.
This is the case for the chips in the latest smartphones, for instance.
But America and the EU had a huge row in 2003 over frequency compatibility and the potential for interference.
It was British engineers who eventually showed the two systems could very happily co-exist.
They would have to do the same again for a UK sovereign network.
Without international acceptance on the frequencies in use, no project could proceed.
Some analysts believe the most fruitful approach now for the UK would be to extend its space expertise and capabilities in areas not already covered by others - in space surveillance, or in secure space communications, for example.
This would make Britain an even more compelling partner for all manner of projects, including Galileo.
Alexandra Stickings from the Royal United Services Institute for Defence and Security Studies said: "Working its way to a negotiated agreement on Galileo would allow the UK to then focus its space budget and strategy to build UK capabilities and grow the things we're able to offer as an international partner."
The science of weather forecasting falls to public scrutiny every single day.
When the forecast is correct, we rarely comment, but we are often quick to complain when the forecast is wrong.
Are we ever likely to achieve a perfect forecast that is accurate to the hour?
There are many steps involved in preparing a weather forecast.
It begins its life as a global "snapshot" of the atmosphere at a given time, mapped onto a three-dimensional grid of points that span the entire globe and stretch from the surface to the stratosphere (and sometimes higher).
Using a supercomputer and a sophisticated model that describes the behaviour of the atmosphere with physics equations, this snapshot is then stepped forward in time, producing many terabytes of raw forecast data.
It then falls to human forecasters to interpret the data and turn it into a meaningful forecast that is broadcast to the public.
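A toy version of that step-forward process fits in a few lines. The sketch below advects a one-dimensional temperature field around a ring of grid points with a simple upwind scheme; the grid spacing, wind speed and time step are invented for illustration and bear no resemblance to an operational model, which solves far richer physics on a three-dimensional global grid.

```python
import numpy as np

# Toy "atmosphere": one temperature variable on a ring of grid points,
# stepped forward in time with an upwind advection scheme. Every number
# here is invented for illustration only.
n_points = 360                 # grid points around a latitude circle
dx = 111_000.0                 # grid spacing in metres (~1 degree)
wind = 10.0                    # constant westerly wind, m/s
dt = 600.0                     # time step, s (keeps wind*dt/dx < 1)

x = np.arange(n_points) * dx
temp = 15.0 + 10.0 * np.exp(-((x - x.mean()) / 5e5) ** 2)  # initial snapshot

for _ in range(144):           # 144 steps of 10 min = a 24-hour forecast
    temp -= wind * dt / dx * (temp - np.roll(temp, 1))

print(f"warm anomaly has moved to grid point {temp.argmax()}")
```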
Find out what the lines, arrows and letters mean on synoptic weather charts.
The word 'synoptic' simply means a summary of the current situation.
In weather terms, this means the pressure pattern, fronts, wind direction and speed, and how they will change and evolve over the coming few days.
Temperature, pressure and winds are all in balance and the atmosphere is constantly changing to preserve this balance.
The whether in the weather
Forecasting the weather is a huge challenge.
For a start, we are attempting to predict something that is inherently unpredictable.
The atmosphere is a chaotic system: a small change in the state of the atmosphere in one location can have remarkable consequences over time elsewhere, a sensitivity famously summed up by meteorologist Edward Lorenz as the butterfly effect.
Any error that develops in a forecast will rapidly grow and cause further errors on a larger scale.
And since we have to make many assumptions when modelling the atmosphere, it becomes clear how easily forecast errors can develop.
For a perfect forecast, we would need to remove every single error.
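Lorenz's famous three-variable system makes this error growth visible. In the hedged sketch below, two runs that begin one part in a million apart drift until they are completely different; the forward-Euler integrator and step size are crude choices made for brevity, not accuracy.

```python
import numpy as np

# Lorenz's 1963 three-variable system: two runs that differ by one part
# in a million in the initial state diverge completely over model time.
def step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return s + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])   # the "butterfly": a tiny perturbation

for i in range(1, 3001):
    a, b = step(a), step(b)
    if i % 1000 == 0:
        print(f"t = {i * 0.01:4.0f}  separation = {np.linalg.norm(a - b):.2e}")
```

The printed separation grows by orders of magnitude, which is exactly why a tiny error in the initial snapshot eventually swamps a forecast.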
Forecast skill has been improving.
Modern forecasts are certainly much more reliable than they were before the supercomputer era.
The UK's earliest published forecasts date back to 1861, when Royal Navy officer and keen meteorologist Robert Fitzroy began publishing forecasts in The Times.
His methods involved drawing weather charts using observations from a small number of locations and making predictions based on how the weather had evolved in the past when the charts looked similar.
But his forecasts were often wrong, and the press were usually quick to criticise.
A great leap forward was made when supercomputers were introduced to the forecasting community in the 1950s.
The first computer model was much simpler than those of today, predicting just one variable on a grid with a spacing of over 750km.
This work paved the way for modern forecasting, the principles of which are still based on the same approach and the same mathematics, although models today are much more complex and predict many more variables.
Nowadays, a weather forecast typically consists of multiple runs of a weather model.
Operational weather centres usually run a global model with a grid spacing of around 10km, the output of which is passed to a higher-resolution model running over a local area.
To get an idea of the uncertainty in the forecast, many weather centres also run a number of parallel forecasts, each with slight changes made to the initial snapshot.
These small changes grow during the forecast and give forecasters an estimate of the probability of something happening – for example, the percentage chance of it raining.
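In miniature, that ensemble idea looks something like the sketch below: the same (deliberately trivial) stand-in model is run 50 times from perturbed initial states, and the share of members crossing a threshold becomes the forecast probability. The humidity values, noise scales and rain threshold are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Sketch of an ensemble forecast: run a trivial stand-in model many times
# from perturbed initial states and turn the spread into a probability.
def toy_model(initial_humidity):
    """Stand-in for a full model run; returns forecast humidity in %."""
    return initial_humidity + rng.normal(0.0, 5.0)

analysis = 78.0                                   # best-guess initial state
members = [toy_model(analysis + rng.normal(0.0, 1.0)) for _ in range(50)]
chance_of_rain = np.mean([m > 80.0 for m in members])
print(f"chance of rain: {100 * chance_of_rain:.0f}%")
```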
The future of forecasting
The supercomputer age has been crucial in allowing the science of weather forecasting (and indeed climate prediction) to develop.
Modern supercomputers are capable of performing thousands of trillions of calculations per second, and can store and process petabytes of data.
The Cray supercomputer at the UK's Met Office has the processing power and data storage of about a million Samsung Galaxy S9 smartphones.
A weather chart predicting atmospheric pressure over Europe, December 1887.
This means we have the processing power to run our models at high resolutions and include multiple variables in our forecasts.
It also means that we can process more input data when generating our initial "snapshot", creating a more accurate picture of the atmosphere to start the forecast with.
This progress has led to an increase in forecast skill.
A neat quantification of this was presented in a Nature study from 2015 by Peter Bauer, Alan Thorpe and Gilbert Brunet, describing the advances in weather prediction as a "quiet revolution".
They show that the accuracy of a five-day forecast nowadays is comparable to that of a three-day forecast about 20 years ago, and that each decade, we gain about a day's worth of skill.
Essentially, today's three-day forecasts are as precise as the two-day forecast of ten years ago.
But is this skill increase likely to continue into the future?
This partly depends on what progress we can make with supercomputer technology.
Faster supercomputers mean that we can run our models at higher resolution and represent even more atmospheric processes, in theory leading to further improvement of forecast skill.
According to Moore's Law, our computing power has been doubling every two years since the 1970s.
However, this has been slowing down recently, so other approaches may be needed to make future progress, such as increasing the computational efficiency of our models.
So will we ever be able to predict the weather with 100% accuracy? In short, no.
There are 2×10⁴⁴ (200,000,000,000,000,000,000,000,000,000,000,000,000,000,000) molecules in the atmosphere in random motion – trying to represent them all would be unfathomable.
The chaotic nature of weather means that as long as we have to make assumptions about processes in the atmosphere, there is always the potential for a model to develop errors.
Progress in weather modelling may improve these statistical representations and allow us to make more realistic assumptions, and faster supercomputers may allow us to add more detail or resolution to our weather models, but at the heart of the forecast is a model that will always require some assumptions.
However, as long as there is research into improving these assumptions, the future of weather forecasting looks bright.
How close we can get to the perfect forecast, however, remains to be seen.