Tuesday, May 19, 2020

Underwater telecom cables make superb seismic network

The oceans are criss-crossed by telecommunications cables, as illustrated by this graphic showing the fiber-optic cables expected to be operational by 2021, many of them (yellow) owned by private companies like Google and Microsoft.
These cables could serve a dual purpose as seismic stations to monitor earthquakes and fault systems over the 70% of Earth covered by water.
(Graphic courtesy of New York Times)

From Berkeley by Robert Sanders

Fiber-optic cables that constitute a global undersea telecommunications network could one day help scientists study offshore earthquakes and the geologic structures hidden deep beneath the ocean surface.

In a paper appearing this week in the journal Science, researchers from the University of California, Berkeley, Lawrence Berkeley National Laboratory (Berkeley Lab), Monterey Bay Aquarium Research Institute (MBARI) and Rice University describe an experiment that turned 20 kilometers of undersea fiber-optic cable into the equivalent of 10,000 seismic stations along the ocean floor.
During their four-day experiment in Monterey Bay, they recorded a magnitude 3.4 earthquake and seismic scattering from underwater fault zones.

Their technique, which they had previously tested with fiber-optic cables on land, could provide much-needed data on quakes that occur under the sea, where few seismic stations exist, leaving 70% of Earth’s surface without earthquake detectors.

“There is a huge need for seafloor seismology.
Any instrumentation you get out into the ocean, even if it is only for the first 50 kilometers from shore, will be very useful,” said Nate Lindsey, a UC Berkeley graduate student and lead author of the paper.

Lindsey and Jonathan Ajo-Franklin, a geophysics professor at Rice University in Houston and a faculty scientist at Berkeley Lab, led the experiment with the assistance of Craig Dawe of MBARI, which owns the fiber-optic cable.
The cable stretches 52 kilometers offshore to the first seismic station ever placed on the floor of the Pacific Ocean, put there 17 years ago by MBARI and Barbara Romanowicz, a UC Berkeley Professor of the Graduate School in the Department of Earth and Planetary Science.
A permanent cable to the Monterey Accelerated Research System (MARS) node was laid in 2009; 20 kilometers of it were used in this test while the cable was offline for annual maintenance in March 2018.

Researchers employed 20 kilometers (pink) of a 52-kilometer undersea fiber-optic cable, normally used to communicate with an off-shore science node (MARS, Monterey Accelerated Research System), as a seismic array to study the fault zones under Monterey Bay.
During the four-day test, the scientists detected a magnitude 3.4 earthquake 45 kilometers away in Gilroy, and mapped previously uncharted fault zones (yellow circle).
(Image by Nate Lindsey)

“This is really a study on the frontier of seismology, the first time anyone has used offshore fiber-optic cables for looking at these types of oceanographic signals or for imaging fault structures,” said Ajo-Franklin.
“One of the blank spots in the seismographic network worldwide is in the oceans.”

The ultimate goal of the researchers’ efforts, he said, is to use the dense fiber-optic networks around the world — probably more than 10 million kilometers in all, on both land and under the sea — as sensitive measures of Earth’s movement, allowing earthquake monitoring in regions that don’t have expensive ground stations like those that dot much of earthquake-prone California and the Pacific Coast.

“The existing seismic network tends to have high-precision instruments, but is relatively sparse, whereas this gives you access to a much denser array,” said Ajo-Franklin.
Photonic seismology

The technique the researchers use is Distributed Acoustic Sensing, which employs a photonic device that sends short pulses of laser light down the cable and detects the light scattered back by strain in the cable caused by stretching.
With interferometry, they can measure the backscatter every 2 meters (6 feet), effectively turning a 20-kilometer cable into 10,000 individual motion sensors.

The Monterey Accelerated Research System (MARS) cabled observatory, a node for science instruments on the ocean floor 891 meters (2,923 feet) below the surface of Monterey Bay, is connected to shore by a 52-kilometer (32-mile) undersea cable that carries data and power.
About 20 kilometers of the cable was used to test photonic seismology on the seafloor.
(Copyright MBARI, 2009)

“These systems are sensitive to changes of nanometers to hundreds of picometers for every meter of length,” Ajo-Franklin said.
“That is a one-part-in-a-billion change.”
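Both figures quoted above follow from simple arithmetic; a minimal sketch, using only the parameter values stated in the article:

```python
# Back-of-the-envelope check on the figures quoted above.

# A 20 km cable interrogated every 2 m yields the stated channel count.
cable_length_m = 20_000      # 20 km of the MARS cable used in the test
channel_spacing_m = 2        # backscatter measured every 2 m
n_channels = cable_length_m // channel_spacing_m
print(n_channels)            # 10000 virtual motion sensors

# A length change of ~1 nanometer per meter of fiber is a strain of
# one part in a billion, as Ajo-Franklin notes.
delta_length_m = 1e-9        # ~1 nm of stretch
gauge_length_m = 1.0         # per meter of cable
strain = delta_length_m / gauge_length_m
print(strain)                # 1e-09
```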

Earlier this year, they reported the results of a six-month trial on land using 22 kilometers of cable near Sacramento emplaced by the Department of Energy as part of its 13,000-mile ESnet Dark Fiber Testbed.
Dark fiber refers to optical cable that has been laid but sits unused or is leased out for short-term use, in contrast to the actively used “lit” fibers that carry internet traffic.
The researchers were able to monitor seismic activity and environmental noise and obtain subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network.

“The beauty of fiber-optic seismology is that you can use existing telecommunications cables without having to put out 10,000 seismometers,” Lindsey said.
“You just walk out to the site and connect the instrument to the end of the fiber.”

During the underwater test, they were able to measure a broad range of frequencies of seismic waves from a magnitude 3.4 earthquake that occurred 45 kilometers inland near Gilroy, California, and map multiple known and previously unmapped submarine fault zones, part of the San Gregorio Fault system.
They also were able to detect steady-state ocean waves — so-called ocean microseisms — as well as storm waves, all of which matched buoy and land seismic measurements.

“We have huge knowledge gaps about processes on the ocean floor and the structure of the oceanic crust because it is challenging to put instruments like seismometers at the bottom of the sea,” said Michael Manga, a UC Berkeley professor of earth and planetary science.
“This research shows the promise of using existing fiber-optic cables as arrays of sensors to image in new ways.
Here, they’ve identified previously hypothesized waves that had not been detected before.”

According to Lindsey, there’s rising interest among seismologists in recording Earth’s ambient noise field caused by interactions between the ocean and the continental land: essentially, waves sloshing around near coastlines.

“By using these coastal fiber optic cables, we can basically watch the waves we are used to seeing from shore mapped onto the seafloor, and the way these ocean waves couple into the Earth to create seismic waves,” he said.

To make use of the world’s lit fiber-optic cables, Lindsey and Ajo-Franklin need to show that they can ping laser pulses through one channel without interfering with other channels in the fiber that carry independent data packets.
They’re conducting experiments now with lit fibers, while also planning fiber-optic monitoring of seismic events in a geothermal area south of the Salton Sea in Southern California, in the Brawley seismic zone.


Monday, May 18, 2020

35 years of submarine cables in one map

Some 400 submarine cables weave an invisible yet crucial network of the contemporary world. Totaling 1.3 million kilometers, they are essential to the smooth functioning of the Internet and carry 99% of our intercontinental data traffic.

From VisualCapitalist by Nick Routley

You could be reading this article from nearly anywhere in the world and there’s a good chance it loaded in mere seconds.

Long gone are the days when images would load pixel row by pixel row. Now, even high-quality video is instantly accessible from almost everywhere. How did the internet get so fast? Because it’s moving at the speed of light.


The Information Superhighway

The miracle of modern fiber optics can be traced to a single man, Narinder Singh Kapany.
The young physicist was skeptical when his professors asserted that light ‘always travels in a straight line’.
His explorations into the behavior of light eventually led to the creation of fiber optics—essentially, beaming light through a thin glass tube.

The next step to using fiber optics as a means of communication was lowering the cable’s attenuation rate.
Throughout the 1960s and ’70s, companies made gains in manufacturing, reducing the number of impurities and allowing light to travel great distances without a dramatic decrease in signal intensity.
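Why attenuation mattered so much: optical loss compounds exponentially with distance. A rough sketch; the 20 dB/km (early fiber) and 0.2 dB/km (modern telecom fiber) attenuation figures are typical published values, not numbers from this article:

```python
# Received optical power falls exponentially with distance.
# Attenuation is usually quoted in decibels per kilometer.

def received_fraction(atten_db_per_km: float, distance_km: float) -> float:
    """Fraction of launched optical power remaining after distance_km."""
    total_loss_db = atten_db_per_km * distance_km
    return 10 ** (-total_loss_db / 10)

# Over just 1 km, early high-loss fiber (~20 dB/km) kept 1% of the light...
print(received_fraction(20.0, 1.0))           # 0.01
# ...while modern fiber (~0.2 dB/km) keeps roughly 95% of it.
print(round(received_fraction(0.2, 1.0), 3))  # 0.955
```

That difference is what pushed long-distance fiber from impossible to practical by the mid-1980s.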

By the mid-1980s, long distance fiber optic cables had finally reached the feasibility stage.
Crossing the Pond

The first intercontinental fiber optic cable was strung across the floor of the Atlantic Ocean in 1988.
The cable—known as TAT-8*—was spearheaded by three companies: AT&T, France Télécom, and British Telecom.
The cable was able to carry the equivalent of 40,000 telephone channels, a ten-fold increase over its copper coaxial predecessor, TAT-7.

Once the kinks of the new cable were worked out, the floodgates opened.
During the course of the 1990s, many more cables hit the ocean floor.

Deep on the ocean floor you will find communication cables made to carry signals from one landmass to another.
The first undersea communications cables, laid in the 1850s, carried telegraphy.
Now these cables carry our phone and Internet traffic.
Yet, they remain relatively hidden in the depths of the ocean.

By the dawn of the new millennium, every populated continent on Earth was connected by fiber optic cables.
The physical network of the internet was beginning to take shape.

As today’s video from ESRI shows, the early 2000s saw a boom in undersea cable development, reflecting the uptick in internet usage around the globe.
In 2001 alone, eight new cables connected North America and Europe.

From 2016-2020, over 100 new cables were laid with an estimated value of $14 billion.
Now, even the most remote Polynesian islands have access to high-speed internet thanks to undersea cables.

*TAT-8 does not appear in the video above as it was retired in 2002.
The Shifting Nature of Cable Construction

Even though nearly every corner of the globe is now physically connected, the rate of cable construction is not slowing down.

This is due to the increasing capacity of new cables and our appetite for high-quality video content.

New cables are so efficient that the majority of potential capacity along major cable routes will come from cables that are less than five years old.

Traditionally, a consortium of telecom companies or governments would fund cable construction, but tech companies are increasingly funding their own submarine cable networks.


Amazon, Microsoft and Google own close to 65% market share in cloud data storage, so it’s understandable that they’d want to control the physical means of transporting that data as well.

These three companies now own 63,605 miles of submarine cable.
While laying cable is a costly endeavor, it’s necessary to meet surging demand—content providers’ share of data transmission skyrocketed from around 8% to nearly 40% over the past decade.
A Bright Future for Dark Fiber

At the same time, more aging cables will be taken offline.
Even though signals are no longer traveling through this network of “dark fiber”, it’s still being put to productive use.
It turns out that undersea telecom cables make a very effective seismic network, helping researchers study offshore earthquakes and the geologic structures on the ocean floor.


Sunday, May 17, 2020

Saturday, May 16, 2020

Arctic

Short film released in 2015,
on the occasion of the publication of the book ARCTIQUE
by French photographer Vincent Munier (Kobalann Publishing).

Friday, May 15, 2020

Coronavirus could disrupt weather forecasting

Data on temperature, wind and humidity collected by commercial airline flights has fallen sharply.

From NYTimes by Henry Fountain

The drop in airline travel caused by the coronavirus pandemic has sharply reduced the amount of atmospheric data routinely gathered by commercial airliners, the World Meteorological Organization said Thursday, adding that it was “concerned about the increasing impact” on weather forecasts worldwide.

The agency said data on temperature, wind and humidity from airplane flights, collected by sensors on the planes and transmitted in real time to forecasting organizations around the world, has been cut by nearly 90 percent in some regions.

In the United States, airlines are operating skeletal schedules and have opted to store much of their fleets rather than continuing to lose money by flying near-empty planes.
(Nick Oxford/Reuters)

In the United States, data declined by 75 percent during the pandemic, according to the National Oceanic and Atmospheric Administration.
Under the observational program, established in the 1960s, data from 3,500 aircraft operated by Delta, United, American and Southwest, and by the cargo carriers United Parcel Service and FedEx, is transmitted directly to National Weather Service forecasting operations.

Christopher Vaccaro, a NOAA spokesman, said the decline “does not necessarily translate into a reduction in forecast accuracy since National Weather Service meteorologists use an entire suite of observations and guidance to produce an actual forecast.”
That includes data from satellites, radar and other land- and sea-based instruments and radiosondes, small instruments that are launched into the upper atmosphere on a daily schedule and provide data as they descend.

 Aviation-gathered observations via the AMDAR network on Jan 31. (WMO)
 Aviation-derived weather observations on May 3-4, via the AMDAR network. (WMO)

But fewer than 200 radiosondes are launched each day.
Observations from aircraft have been far more abundant, said William R. Moninger, a retired NOAA physicist who now works at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado.
Not every airplane supplies data, but those that do transmit readings as often as every few seconds, depending on altitude, he said.

At this time last year, Dr. Moninger said, aircraft in the United States provided nearly 600,000 observations a day.
Now, with far fewer flights, on a recent day in April there were 180,000 observations, he said.

The observational data is fed into weather service computer models that forecast conditions anywhere from several hours to days ahead.
Dr. Moninger and some colleagues are currently studying whether short-term forecasts have been affected.
“The short answer is we haven’t seen an unequivocal impact yet,” he said, noting they have yet to complete their analysis.

 The steep drop in aviation observations from various networks, including AMDAR.
(WMO/Canadian Meteorological Center)

The World Meteorological Organization, an arm of the United Nations that coordinates a global observing system for 193 member nations, said that in addition to aircraft data, surface-based weather observations have been affected in some parts of the world, including Africa and Central and South America, where many weather instruments are not automated and must be visited regularly to obtain readings.

The agency said that automated instruments should continue to function well for some time, but that if the pandemic is prolonged, lack of maintenance and repair may become a problem.

The agency also said some countries, especially in Europe, were launching more radiosondes to partially make up for the loss of aircraft data.

National weather agencies “are facing increasingly severe challenges as a result of the coronavirus pandemic, especially in developing countries,” said Petteri Taalas, the agency’s director-general, in a statement.

“As we approach the Atlantic hurricane season, the Covid-19 pandemic poses an additional challenge, and may exacerbate multi-hazard risks at a single country level,” he added.

The aircraft data is also used by airlines as they manage daily flight operations both in the air and at airports.
Observations of wind speeds at cruising altitudes of about 30,000 feet and higher, for example, can help plan for refueling needs. And if observations during ascent or descent suggest that icing conditions are likely to develop at an airport, an airline can save money by moving some aircraft elsewhere.

The data can sometimes be a lifesaver.
In a 2003 paper, Dr. Moninger and others wrote about a 1998 incident in which a plane nearing Miami on a flight from Europe radioed that it was nearly out of fuel because it had encountered strong headwinds.
A quick check of data from three other aircraft showed an alternative track, not far away, with calmer air.

For weather service forecasting, observations during takeoffs and landings are especially useful, Dr. Moninger said.
Data is collected more frequently than at cruising altitude: advanced instruments take readings every 300 feet between the ground and 10,000 feet.
The resulting collection of readings is called a profile and typically there would be about 12,000 of them a day.
On March 23 this year, there were 3,500.
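The numbers in the paragraph above can be sanity-checked directly; a minimal sketch using only the figures quoted there:

```python
# One ascent/descent "profile" samples every 300 feet between the
# surface and 10,000 feet, per the paragraph above.
readings_per_profile = 10_000 // 300   # ~33 readings per profile

profiles_typical = 12_000   # profiles on a typical day
profiles_march23 = 3_500    # profiles on March 23, 2020

drop = 1 - profiles_march23 / profiles_typical
print(readings_per_profile)   # 33
print(round(drop * 100))      # 71 -> roughly a 70% drop in profiles
```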

The data can help forecasters better understand the vertical structure of the lower atmosphere and how it may change in the near term.
“If you’re looking for things like the likelihood of thunderstorms, the vertical structure of the atmosphere is important,” Dr. Moninger said.
“I expect the decrease in weather data could also make a big impact on things like predicting when fog is going to break,” he said.

Air-quality experts also use the data in their forecasts because it can help predict the appearance of near-surface inversions, when air temperatures, which are normally warmest at the ground, flip and become warmer at higher altitudes.
Inversions trap pollutants, greatly worsening air quality for up to a day or longer.
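Spotting an inversion in an aircraft-derived profile amounts to finding layers where temperature rises with altitude instead of falling. A toy illustration; the profile values here are made up for the example, not real observations:

```python
# Hypothetical temperature profile from an aircraft ascent:
# (altitude_ft, temp_C) pairs, made-up example values.
profile = [
    (0, 12.0), (300, 11.5), (600, 13.0), (900, 14.2), (1200, 13.1),
]

def inversion_layers(profile):
    """Return (lower_alt, upper_alt) pairs where temperature increases
    with altitude, i.e. the signature of an inversion layer."""
    return [
        (a1, a2)
        for (a1, t1), (a2, t2) in zip(profile, profile[1:])
        if t2 > t1
    ]

print(inversion_layers(profile))  # [(300, 600), (600, 900)]
```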
