Tuesday, March 2, 2021

ClimaCell, an ambitious private weather firm, plans to launch its own satellites

 Artistic rendering of a ClimaCell satellite. (ClimaCell)
 
From WP by Andrew Freedman

The company would be the first to operate a satellite fleet to improve its own forecasts, rather than selling the data to others


ClimaCell, a growing private weather company based in Boston whose customers include airlines, maritime shipping firms and everyday consumers, plans to spend $150 million during the next few years to launch its own satellite radar constellation.
The goal, company leaders said in an interview, is to make its own forecasts more reliable, thereby benefiting its clients, the public (through its weather app) and policymakers.

This aim contrasts with the business of most, if not all, space companies today that are pursuing weather applications. 
These firms, such as GeoOptics and Spire, have business models based on selling the data for others to use in forecasting the weather, with customers that include federal agencies.
However, ClimaCell would use its own technology, which already includes proprietary weather modeling, to take advantage of the data it gathers from space.

The end result, if all goes well, would be a vertically integrated weather company whose operations range from generating its own data to sifting through that information using computer models and turning that into products aimed at improving how businesses operate.

According to ClimaCell co-founder and chief executive Shimon Elkabetz, ClimaCell has several dozen scientists and engineers now dedicated to developing and eventually deploying a fleet of small space-based weather radars that could gather real-time data of every location on the globe at any time.
This would be a major leap forward for radar coverage over data-sparse regions, he said, such as Africa, South America and the oceans.

The satellites would carry a Ka-band radar instrument, Elkabetz said, which he compared to a research mission that NASA has carried out known as the Global Precipitation Measurement (GPM) satellite.

GPM consists of a dual-frequency radar that allows it to get a three-dimensional view of precipitation falling within a storm, including by seeing the distribution of different droplet sizes within the clouds, according to Dalia Kirschbaum, who heads NASA’s hydrological sciences lab at the Goddard Space Flight Center in Greenbelt, Md.

The downside to GPM is that it is just one satellite. 
“When you have a single orbiting spacecraft, if you don’t get a good [pass over] a storm, then you just miss it,” she said.

The space agency has also launched small satellites, such as RainCube, which was deployed from the International Space Station, to help solve the challenge of building powerful radars in small boxes, Kirschbaum said.
“The instrument will offer similar capabilities” to the radar aboard GPM, Elkabetz said, “in terms of both resolution and sensitivity, but exceed the swath,” or scan footprint, by a factor of more than two.

To accomplish this, the company is planning to use its own technologies to develop a new radar and antenna. ClimaCell is seeking to lower the cost per satellite by at least half compared with the NASA satellite, which revisits a given location on Earth only once every three days.
The cost savings, Elkabetz said, “will allow us to scale this from a single-satellite mission to a constellation of dozens of satellites that enables global coverage with high revisit rates.”

Rei Goffer, co-founder and chief strategy officer at ClimaCell, said revisit times, the interval between instances when the satellite passes over the same location on Earth, would be one hour in the company’s planned satellite constellation.
“We are not going to space just because it’s cool,” Goffer said in an interview, but instead are trying to solve a data gap that could allow the company to make far more accurate forecasts.
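Goffer's one-hour figure can be sanity-checked with a toy coverage model. Below is a minimal sketch, assuming a low Earth orbit altitude, swath width and satellite count chosen purely for illustration; it is not ClimaCell's actual orbit design, only a back-of-envelope estimate.

```python
import math

def mean_revisit_hours(n_sats, swath_km, alt_km=500.0, r_earth_km=6371.0):
    """Toy estimate of the mean revisit time for a radar constellation.

    Assumes each satellite sweeps an independent ground swath and
    ignores orbit design, swath overlap and latitude effects.
    """
    mu = 398600.4418  # Earth's gravitational parameter, km^3/s^2
    a = r_earth_km + alt_km
    period_hr = 2 * math.pi * math.sqrt(a ** 3 / mu) / 3600.0

    # Area imaged per satellite per orbit ~ swath width x ground-track length
    area_per_orbit = swath_km * (2 * math.pi * r_earth_km)
    earth_area = 4 * math.pi * r_earth_km ** 2

    orbits_to_cover_earth = earth_area / (n_sats * area_per_orbit)
    return orbits_to_cover_earth * period_hr

# One GPM-class radar (~250 km swath): roughly a three-day revisit
print(f"{mean_revisit_hours(1, 250):.0f} h")   # ~80 h
# A few dozen satellites with double the swath: roughly hourly
print(f"{mean_revisit_hours(32, 500):.1f} h")  # ~1.3 h
```

With these assumed numbers, the crude model lands near both reference points in the article: about three days for a single GPM-class radar and about an hour for a few dozen satellites with twice the swath.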

Outside experts, such as Brian Weeden of the Secure World Foundation, questioned whether the new satellites would interfere with other spacecraft also operating within the Ka band of spectrum, including planned 5G satellites and other weather satellites already in low Earth orbit.

Elkabetz said he expects to encounter skepticism from those who may not believe that ClimaCell has solved some of the technical challenges in developing and deploying these satellites.
If he were not involved in the project already, he would not believe it, either, he said.

“I respect anyone who thinks it’s difficult, and as we are able to reveal in the future how it works, hopefully people will be able to witness it themselves,” he said in an interview.

Marshall Shepherd, director of the University of Georgia’s atmospheric sciences program, said he sees this project as a way to better predict weather extremes.
“Precipitation is at the heart of many weather-related extremes ranging from flooding to hurricanes, yet is very difficult to measure on global scales,” Shepherd said in an email.

Estimates of rainfall rates on July 8, 2019, a day that brought flooding to the D.C. region, from NASA's Global Precipitation Measurement Core Observatory. (NASA)

“I am not surprised that scholars are exploring new ways to provide measurements with the accuracy and resolution useful for applications.”

ClimaCell has raised a substantial amount of money for a recent entrant into the weather forecasting business: about $112 million in venture capital funding, with the most recent round closing in July 2020.

Elkabetz noted that most of the world still does not have radar coverage, including in Latin America, Africa, the Middle East and Asia.
“The system’s capabilities will enable new modeling and analytics with precision never before available in the developing world,” he said.

“The data will power applications such as monitoring the conditions favorable for locust reproduction and migrations, as well as conditions that lead to devastating infectious diseases such as malaria, which put millions of lives and livelihoods at risk,” Elkabetz said in a statement.

The satellites could significantly help hurricane forecasts, he said, since they would provide details about the structure and evolution of such storms.
The National Hurricane Center has utilized data from the GPM mission and previous weather satellites for forecasting purposes.

The chief engineer for the program is John Springmann, who has worked with private sector space firms including Spaceflight Industries, which launched the BlackSky constellation.
The team has also been working with Kerri Cahoy, co-director of the small-satellite center at MIT.

ClimaCell is aiming to launch its first radar satellite in the third quarter of 2022.

Through the company’s nonprofit arm known as ClimaCell.org, the satellite data could flow to areas where improved forecasts are desperately needed, mainly in the developing world, Goffer and Elkabetz said.


Monday, March 1, 2021

Brunt Ice Shelf in Antarctica calves

First image of the newly calved Brunt Ice Shelf from Sentinel-2!
Taken 26th Feb at 09:40:19 UTC.

From BAS

A huge iceberg (1270 km²) the size of the county of Bedfordshire has broken off the 150-m thick Brunt Ice Shelf, almost a decade after scientists at British Antarctic Survey (BAS) first detected growth of vast cracks in the ice.


The Brunt Ice Shelf is the location of British Antarctic Survey’s (BAS) Halley Research Station.
BAS glaciologists, who have been expecting a big calving event for at least a decade, say that the research station is unlikely to be affected by the current calving.
The 12-person team working at the station left mid-February by BAS Twin Otter aircraft.
The station is now closed for the Antarctic winter.

North Rift crack photographed by Halley team in January 2021

The first indication that a calving event was imminent came in November 2020 when a new chasm – called North Rift – headed towards another large chasm near the Stancomb-Wills Glacier Tongue 35 km away.
North Rift is the third major crack through the ice shelf to become active in the last decade.

During January, this rift pushed northeast at up to 1 km per day, cutting through the 150-m thick floating ice shelf.
The iceberg was formed when the crack widened several hundred metres in a few hours on the morning of 26th Feb, releasing it from the rest of the floating ice shelf.
 
Map of Brunt ice shelf and Halley Research Station

The glaciological structure of this vast floating ice shelf is complex, and the impact of ‘calving’ events is unpredictable.
In 2016, BAS took the precaution of relocating Halley Research Station 32 km inland to avoid the paths of ‘Chasm 1’ and ‘Halloween Crack’.

Since 2017, staff have been deployed to the station only during the Antarctic summer, because during the dark winter months evacuation would be difficult.
‘Chasm 1’ and ‘Halloween Crack’ have not grown in the last 18 months.

Professor Dame Jane Francis, Director of British Antarctic Survey says:
“Our teams at BAS have been prepared for the calving of an iceberg from Brunt Ice Shelf for years.
We monitor the ice shelf daily using an automated network of high-precision GPS instruments that surround the station; these measure how the ice shelf is deforming and moving.
We also use satellite images from ESA, NASA and the German satellite TerraSAR-X.
All the data are sent back to Cambridge for analysis, so we know what’s happening even in the Antarctic winter, when there are no staff on the station, it’s pitch black, and the temperature falls below minus 50 degrees C (or -58F).

“Over coming weeks or months, the iceberg may move away; or it could run aground and remain close to Brunt Ice Shelf.
Halley Station is located inland of all the active chasms, on the part of the ice shelf that remains connected to the continent.
Our network of GPS instruments will give us early warning if the calving of this iceberg causes changes in the ice around our station.”

Simon Garrod, Director of Operations at British Antarctic Survey adds:
“This is a dynamic situation. Four years ago we moved Halley Research Station inland to ensure that it would not be carried away when an iceberg eventually formed. That was a wise decision.
Our job now is to keep a close eye on the situation and assess any potential impact of the present calving on the remaining ice shelf.
We continuously review our contingency plans to ensure the safety of our staff, protect our research station, and maintain the delivery of the science we undertake at Halley.”

More information


About Halley VI 

Halley VI Research Station is an internationally important platform for global earth, atmospheric and space weather observation in a climate-sensitive zone.
In 2013, the station attained the World Meteorological Organization (WMO) Global Atmosphere Watch (GAW) Global station status, becoming the 29th in the world and 3rd in Antarctica.

Halley VI Research Station sits on Antarctica's up to 150-m-thick Brunt Ice Shelf.
This floating ice shelf flows at a rate of up to 2 km per year west towards the sea where, at irregular intervals, it calves off as icebergs.

Long-term monitoring of the natural changes that occur in the ice shelf has revealed the growth of a recently formed chasm, the North Rift.
Halley VI Research Station has been unoccupied during the last four winters because of the complex and unpredictable glaciological situation.

Change in the ice at Halley is a natural process and there is no connection to the calving events seen on Larsen C Ice Shelf, and no evidence that climate change has played a significant role.

During the 2016-17 Antarctic Summer season (Nov-March), in anticipation of calving, the eight station modules were uncoupled and transported by tractor to a safer location upstream of Chasm-1.

Over the 2018/19 summer, BAS installed an autonomous power generation and management system – the Halley Automation project – which provides a suite of scientific instruments with power even when there are no staff at the station.
This system has proved effective in running through more than eight months of darkness, extreme cold, high winds and blowing snow, and in delivering important data back to the UK.

There have been six Halley research stations on the Brunt Ice Shelf since 1956.

About Chasm 1


In 2012, satellite monitoring revealed the first signs of change in a chasm (Chasm 1) that had lain dormant for at least 35 years.  This change had implications for the operation of Halley VI Research Station.
In the 2015/16 field season, glaciologists used ice penetrating radar technologies to ‘ground truth’ satellite images and to calculate the most likely path and speed of Chasm 1.
Chasm 1 grew up until 2019 but has not moved for the past 18 months.
There is now 2 km of ice holding this iceberg in place.

About Halloween Crack

In October 2016, a new crack was detected some 17 km to the north of the research station, across the route sometimes used to resupply Halley. The ‘Halloween Crack’ continues to widen, and a second large iceberg may calve to the north.
The tip of Halloween Crack is also currently static.

About North Rift Crack


In November 2020, a new chasm, known as the North Rift, opened and started extending towards Brunt-Stancomb Chasm.
The Brunt Ice Shelf is probably the most closely monitored ice shelf on Earth.
A network of 16 GPS instruments measures the deformation of the ice and reports this back on a daily basis.
European Space Agency satellite imagery (Sentinel 2), TerraSAR-X, NASA Worldview satellite images, US Landsat 8 images, ground penetrating radar, and on-site drone footage have been critical in providing the basis for early warning of changes to the Brunt Ice Shelf.
These data have provided science teams with a number of ways to measure the width of Chasm 1 and changes to the Halloween Crack and North Rift crack, with very high precision.
In addition, scientists have used computer models and bathymetric maps to predict how close the ice shelf was to calving.
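As a rough illustration of how such a GPS network can give early warning, one simple approach is to track the distance between pairs of stations and flag baselines that stretch abnormally fast. The sketch below is a minimal toy version, not BAS's actual processing pipeline; the station names, coordinates and alert threshold are invented for illustration.

```python
import math

def baseline_m(p, q, r_earth_m=6371000.0):
    """Distance in metres between two (lat, lon) fixes, using a local
    flat-Earth approximation (adequate for km-scale baselines)."""
    dlat = math.radians(q[0] - p[0])
    dlon = math.radians(q[1] - p[1]) * math.cos(math.radians((p[0] + q[0]) / 2))
    return r_earth_m * math.hypot(dlat, dlon)

def rift_alerts(yesterday, today, threshold_m=0.5):
    """Flag station pairs whose separation grew by more than the
    threshold in one day, a crude signature of a widening rift."""
    alerts = []
    names = sorted(yesterday)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            growth = baseline_m(today[a], today[b]) - baseline_m(yesterday[a], yesterday[b])
            if growth > threshold_m:
                alerts.append((a, b, growth))
    return alerts

# Invented daily fixes: gps02 drifts east as a rift opens between it and the others
day1 = {"gps01": (-75.600, -26.200), "gps02": (-75.605, -26.180), "gps03": (-75.610, -26.210)}
day2 = {"gps01": (-75.600, -26.200), "gps02": (-75.605, -26.168), "gps03": (-75.610, -26.210)}
for a, b, growth in rift_alerts(day1, day2):
    print(f"baseline {a}-{b} grew {growth:.1f} m in one day")
```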

About Halley science 
 
Ozone measurements have been made continuously at Halley since 1956; these led to the discovery of the ozone hole in 1985 and have since tracked its slow progress towards recovery.
Monitoring of space weather undertaken at Halley contributes to the Space Environment Impacts Expert Group, which advises the UK Government on the impact of space weather on UK infrastructure and business.
 

Sunday, February 28, 2021

The Long Leg, Edward Hopper (1935)

Messing about on boats all summer is a holiday lived in the present; that’s the sense of this picture.
The sailor remains still; it is the wind and water that move the boat along.
Hopper’s coolly beautiful painting of a sailboat off the New England shore perfectly expresses this curious fact about sailing.
And in conditions like these – a hot blue day, windless, the sun beating down on the blanched sand – the boat is lying almost motionless to one side, solitary as the characteristic Hopper lighthouse in the background.
Like the water itself, the painting is almost entirely composed in shades of blue.
 

Saturday, February 27, 2021

Underwater photographer of the year 2021

Renee Capozzola (USA), for her graceful image: “Shark’s Skylight”.
 
The full set of UPY2021 results is now available to view in the Winners’ gallery, and the complete collection is available to download and keep in the free Yearbook.
The UPY team would like to thank all the talented photographers who supported this year’s competition with their pictures, especially in these challenging times.
We hope that this year’s stunning collection of winning images provides a welcome escape to everyone who enjoys them and a chance to reconnect with the underwater world.

Friday, February 26, 2021

The Titanic disaster and its aftermath

View of the bow of the RMS Titanic photographed in June 2004 by the ROV Hercules during an expedition returning to the shipwreck of the Titanic.
(Courtesy of NOAA/Institute for Exploration/University of Rhode Island)


From Hydro by Albert E. Theberge

Understanding the Unthinkable

On the night of 14 April 1912, the unthinkable happened.
The mightiest ship afloat, the brand new White Star Line ship Titanic, was on its maiden voyage from Southampton, England, to New York.
The ship was advertised as unsinkable.
And, if unsinkable, why should there be adequate lifeboats for all of the passengers and crew? The ship departed from Southampton on 10 April.
Less than five days later, it was at the bottom of the Atlantic Ocean.
More than 1,500 people perished within three hours after the ship struck an iceberg that ripped the bottom out of her hull.

How this happened is a story told many times.
Human hubris, unswerving trust in the infallibility of technology, and the commercial impetus of fast Atlantic passages all contributed to the loss of the ship and the accompanying loss of life.
Even as the ship was settling in the waters of an icy North Atlantic, some survivors reported that there was a belief among many passengers that the ship was the safer place to be; accordingly, not all the lifeboats were filled to capacity.

This accident shocked the international community.
The British and American governments investigated the accident – the British determined: “That the loss of said ship was due to collision with an iceberg, brought about by the excessive speed at which the ship was being navigated.” Certainly, that was the major factor.
However, like many accidents, there were a number of contributing causes.
These included: watertight bulkheads that were improperly designed; an insufficient number of lifeboats and life rafts; apparent lack of concern by the captain concerning reports of ice prior to collision with the iceberg; little training of crew in emergency procedures including lowering of lifeboats; no radio watches on nearby ships which could have assisted in lifesaving efforts; and, remarkably, not even binoculars for the ship’s lookouts.

Steamship Titanic showing length as compared with highest buildings.

Both the British and American governments arrived at similar conclusions and recommendations following the loss of the Titanic.
The chief recommendations were that all ships be equipped with sufficient lifeboats for passengers and crew, that all ocean-going ships maintain 24-hour radio-telegraph watches, and that bulkheads be designed such that the flooding of any two adjacent compartments would not result in the sinking of a vessel.
These recommendations and others were adopted by the first International Convention for the Safety of Life at Sea (SOLAS) at a conference held in London in 1914.

Development of Seafloor Mapping Technologies


Commercial concerns saw an opportunity in the Titanic disaster and began searching for a means to determine the presence of icebergs and other unseen or submerged obstructions forward of moving vessels.
European and North American inventors joined the race.
In 1912, Reginald Fessenden, a Canadian inventor and radio pioneer, joined Submarine Signal Company, a forerunner of today’s Raytheon, and began work on an electro-acoustic oscillator similar to a modern transducer.
This oscillator was originally designed for both ship-to-ship communication and to receive reflected sound from an underwater object.
In late April 1914, Fessenden tested this device off the Grand Banks on the US Revenue Cutter Miami and succeeded in reflecting sound off an iceberg at a range of approximately two miles and hearing the return echo.
A second echo was heard that was determined to be from the bottom.
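Both echoes rest on the same time-of-flight principle: sound covers the out-and-back path, so the range to the reflector is half the round-trip delay times the sound speed. A quick sanity check of the two-mile figure; the 1,500 m/s sound speed is a typical value for seawater, and the 4.3 s delay is chosen here to match the reported range.

```python
SOUND_SPEED_SEAWATER_M_S = 1500.0  # typical value; varies with temperature and salinity

def echo_range_m(round_trip_s, c=SOUND_SPEED_SEAWATER_M_S):
    """One-way distance to a reflector (iceberg or seabed) from the
    round-trip delay of its echo."""
    return c * round_trip_s / 2.0

print(f"{echo_range_m(4.3):.0f} m")  # ~3225 m, about two miles
```

The acoustic depth sounders described below apply the same relation vertically, with the seabed as the reflector.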

Submarine warfare during World War I accelerated research into the field of acoustics.
By the end of the war, the use of acoustics for both detection of objects in the water and measuring depth had been proven.
In 1922, the USS Stewart, equipped with a Hayes Sonic Depth Finder that utilized a Fessenden oscillator, ran a line of more than 900 individual soundings across the Atlantic Ocean.
The profile obtained from these soundings was published in the first issue of the International Hydrographic Review.
Piano-wire sounding systems became obsolete overnight.
Although leadline sounding continued for a number of years in shallow water, acoustic sounding systems replaced the leadline for most purposes within two decades.

World War II further accelerated the development of directional sonar systems (called Asdic in England).
Although meant originally for detection of submarines, these systems ultimately developed into modern side-scan sonar systems.
Underwater photography equipment and magnetic anomaly detection (MAD) instruments were in their infancy during this period.
MAD systems proved effective in detecting submarines.
An early hydrographic application combining sonar, underwater photography and MAD gear was the charting of ships torpedoed off the United States East Coast.
This was done by Coast and Geodetic Survey (C&GS) officers working off the Coast Guard buoy tender Gentian in 1944.

Following the war, there were further advances, including the development of an early side-scan sonar system called Shadowgraph in 1954 by German scientist Julius Hagemann, who was working at the United States Navy Mine Defense Laboratory.
This system remained classified for many years, but civil use of side-scan began developing shortly after this advance.
In the commercial sector, Harold Edgerton of the Massachusetts Institute of Technology (MIT) and Martin Klein, also of MIT, were early pioneers.
Edgerton turned a bottom-penetration sonar on its side in 1963 and imaged a sunken lightship from a C&GS vessel.
Edgerton was a founder of EG&G and discovered the Civil War era USS Monitor off Cape Hatteras with an EG&G commercial side-scan system.
Martin Klein began his career with EG&G but left to found Klein Associates, a name synonymous with side-scan technology.

Advances in depth measurement technology paralleled the development of side-scan technology.
In April 1961, engineers at General Instruments Corporation developed a proposal for BOMAS, Bottom Mapping Sonar.
Quoting from the proposal: “BOMAS derives bottom profile information from the intersection of the ocean bottom with a vertical plane perpendicular to the heading of a ship.
The sonar data is processed automatically and in real time to provide a depth contour strip map….
A sonar intensity map can be provided simultaneously….” 
Multi-beam sounding with its attendant bottom reflectivity mapping capability was born.
Two years later, the first prototype multi-beam system was installed on the USS Compass Island, and subsequent units were installed on Navy survey ships.
In the meantime, the acronym had changed to SASS (Sonar Array Sounding System).
By the late 1970s, the technology had migrated to the civil community and has since displaced single beam sounding systems as the standard seafloor mapping tool.

Painting of the Titanic sinking by the bow, with people rowing a lifeboat in the foreground and other people in the water.
Icebergs are visible in the background.
(Engraving by Willy Stöwer: Der Untergang der Titanic) 

Finding Titanic and the Aftermath of the Discovery


In the immediate aftermath of the sinking, proposals to locate the sunken Titanic were discussed and ultimately dismissed because the wreck lay well beyond the limits of technology at that time.
Through the decades, the development of subsea technology finally provided the means to locate the wreck and subsequently to not only investigate it using remote technology, but also to dive to the wreck and conduct a series of investigations that included surveys of the interior of the ship.

In July 1985, the final search began, with Ifremer deploying their newly developed side-scan sonar SAR vehicle on a mission led by Jean-Louis Michel on the research vessel Le Suroit.
That survey covered 70% of a 150 square nautical mile survey box without locating the Titanic.
Picking up the search in August, the WHOI team, led by Robert Ballard aboard the research vessel Knorr, utilized the towed vehicle Argo, with a 100 kHz side-scan sonar and three low-light black-and-white video cameras.
Ballard’s team relied on the optical system to locate the Titanic, and in the early morning hours of 1 September, the unmistakable form of a boiler made it clear that the search was over.
Titanic’s final resting place had been found.

Since the discovery in 1985, a series of expeditions have visited the Titanic with a variety of goals.
Ballard and Woods Hole returned to the wreck in July 1986 on the WHOI research vessel Atlantis II, with the submersible Alvin, and the ROV Jason Jr.
The 1986 expedition photographed and filmed the wreck, focusing on the largely intact bow section.
Working from the data collected from the 1985 Argo survey as well as 1986 data, WHOI’s William Lange and others assembled a preliminary site map of the Titanic wreck site that delineated the site from the bow to the stern section and plotted a wide range of features scattered on the seabed.
A private venture funded and led by RMS Titanic, Inc. (RMST), the salvor-in-possession of the wreck, and technically supported by Ifremer, returned to the wreck in July 1987 and made 32 dives, recovering some 1,800 artifacts from the seabed. It was the first of a series of recovery expeditions made by RMST until 2004, which ultimately salvaged nearly 5,000 artifacts.

The remotely operated vehicle (ROV) Hercules exploring the bow of the Titanic, 2004.
(Courtesy: Institute for Exploration/University of Rhode Island/NOAA)


Dives made by documentary film crews and by James Cameron (whose first dives were in 1995), working with the P.P. Shirshov Institute and its Mir submersibles, captured dramatic images of the wreck as well as additional technical information and a more detailed view of aspects of the wreck site.
In particular, Cameron’s extensive documentation and penetration of the interior of the bow with small ROVs known as ‘bots’ provided incredible insights into the ongoing processes of environmental change and preservation inside the ship, as well as evidence of what had occurred during the sinking of the Titanic.
Cameron’s work has arguably done more than anyone else’s to share the Titanic as a wreck site with a wider audience.

The scientific products of the various expeditions include a detailed analysis of the microbiological corrosion of the ship’s steel (led by Roy Cullimore), geological studies of the sediments and current studies (by the Shirshov Institute), a detailed sonar survey of the bow where the Titanic struck the iceberg, photo mosaics of the bow section, and forensic studies of the ship’s sinking sequence and break-up.
In addition, RMS Titanic, Inc. commissioned the creation of an ‘archaeological GIS’ map delineating where the 5,000 artifacts had been recovered from between 1987 and 2004.
That GIS, which the Center for Maritime & Underwater Resource Management of Michigan, a private non-profit, is completing under contract to RMST, is reported to be nearly finished.

The National Oceanic & Atmospheric Administration’s Office of Ocean Exploration conducted two missions to the Titanic in 2003 and 2004.
As the nation’s ocean agency, NOAA has an interest in the scientific and cultural aspects of the Titanic.
NOAA’s focus is to build a baseline of scientific information from which we can measure the processes and deterioration of the Titanic, and apply that knowledge to many other deepwater shipwrecks and submerged cultural resources.
The 2003 mission, with the Shirshov Institute, had several key goals, the first being to catalogue any anthropogenic activities currently impacting the wreck site, or evidence of such activity since its discovery in 1985.
Digital imagery was obtained and a deck-view mosaic of the bow section was created.
Additionally, ongoing bacteriological analysis was conducted as well as basic oceanographic research.

The 2004 mission, conducted on board the NOAA research vessel Ronald H. Brown in cooperation with Robert Ballard, then (and now) with the University of Rhode Island and the Institute of Archaeological Oceanography, utilized an ROV to continue the assessment of the wreck’s ongoing environmental changes and the bacteriological work of Roy Cullimore.
One other key achievement of the 2004 mission was the completion of a topographic map of Titanic Canyon and the surrounding area, including the wreck of the Titanic, with a Seabeam 2112 multi-beam sonar system.
The digital terrain model of this large area of seabed places the Titanic within a larger geological and geographical context.

NOAA also participated, as did Woods Hole, the National Park Service, the Institute of Nautical Archaeology, the Waitt Institute and contracted partners such as Phoenix International, Ltd., in RMS Titanic, Inc.’s last (to date) expedition to the wreck in August 2010.
This mission had a non-recovery, scientific focus, centered on the work of William Lange and the WHOI Advanced Imaging and Visualization Laboratory to create a detailed 2D and 3D visual mosaic of the site.
To do so, it made a detailed survey, using the Waitt Institute’s REMUS 6000 autonomous underwater vehicles, of an approximately ten-square-nautical-mile zone around the wreck site, with a series of closer, higher-resolution surveys of the area delineated in the 1986 WHOI map of the site and even closer surveys of key features and areas of the site.
That project was successful in generating the mapping data as well as comprehensive visual coverage of the wreck, including detailed photo mosaics of a number of features in the artifact scatter, which included sections of the ship’s hull, machinery and equipment and other artifacts.

This composite image, released by RMS Titanic Inc., and made from sonar and more than 100,000 photos taken in 2010 by unmanned, underwater robots, shows a small portion of a comprehensive map of the 3-by-5-mile debris field surrounding the bow of the Titanic on the bottom of the North Atlantic Ocean (Courtesy: AP Photo/RMS Titanic Inc.)

What is clear in this brief overview is that the last few decades have witnessed a revolutionary expansion of humanity’s capacity to not only locate deep-sea shipwrecks, but increasingly to capture imagery and data that essentially ‘virtually raises’ these wrecks for ongoing research as well as public education.
In many ways, the Titanic and the surrounding area are likely to be the best-studied section of the deep ocean floor.
That status has come because of the iconic nature of the wreck and the potential for profit from the opportunity to connect to this ship and its tragic loss either through a tour of the recovered artifacts or a virtual tour on film or in a photograph.
At the same time, measurable and important science has been conducted, and in that, a way forward for not only this site but others has been demonstrated, especially in the adaptation and adoption of technology to access and learn from sites once thought unreachable.


Thursday, February 25, 2021

Scientists use nuclear reactor to investigate Amelia Earhart’s mysterious disappearance

Earhart beneath the nose of her Lockheed Model 10-E Electra, March 1937, Oakland, California, before departing on her final round-the-world attempt prior to her disappearance.
Credit: Wikimedia Commons.

From ZMEScience by Tibi Puiu

A metal plate thought to have once belonged to Earhart's plane was probed for hidden secrets using neutron beams.

One of the bravest women of the 20th century, Amelia Earhart, vanished unexpectedly during her attempt to fly around the world.
Now, scientists have turned to nuclear technology to analyze a piece of metal debris that some suspect was part of Earhart’s wrecked plane. In doing so, they hope to piece together the pioneering aviator’s final hours.

A tragic end to a brave pioneer

Amelia Earhart was the first woman to fly solo across the Atlantic Ocean.
In 1937, Earhart and her navigator, Fred Noonan, were flying their Lockheed Model 10-E Electra on an even more ambitious quest: flying around the world.
On July 2, 1937, they were about six weeks and 20,000 miles into their journey when their plane suddenly crashed en route to Howland Island in the Pacific, which is halfway between Hawaii and Australia. 
 

Howland Island with the GeoGarage platform (NGA/NOAA source)
 
Howland Island is a flat sliver of land about 2,000 meters (6,500 feet) long and 460 meters (1,600 feet) wide, so it must have been very difficult to distinguish from the shapes of similar-looking clouds at Earhart’s altitude.
Of course, Earhart and Noonan were well aware of the challenges, which is why they had an elaborate plan that involved tracking their routes using celestial navigation and linking to a U.S. Coast Guard vessel stationed off Howland Island using radios.

But despite their well-thought-out contingency plans, the pair were simply flat out of luck.
When they took off, witnesses reported, a radio antenna may have been damaged. That morning there were also extensive overcast conditions.
Later investigations also showed that the fliers may have been using outdated, inaccurate maps.
 
 
On the morning of July 2, 1937, at 7:20 AM, Earhart reported her position to the crew of the Coast Guard vessel, placing her plane some 32 kilometers (20 miles) southwest of the Nukumanu Islands.
“We must be on you, but we cannot see you. Fuel is running low. Been unable to reach you by radio. We are flying at 1,000 feet.”

The ship replied but there was no indication that the signal ever reached Earhart’s plane.
The Coast Guard ship fired its oil burners to send up a plume of smoke in an attempt to signal the flyers, but by all accounts it was never seen.
Noonan’s chart of the island’s position was off by about five nautical miles, subsequent investigations showed, and it seems likely that the plane ran out of fuel.

Despite a huge search and rescue mission involving 66 aircraft and nine ships, the fate of the two flyers remains a mystery to this day.
Over the years, the mystery only intensified, amplified by countless conspiracy theories surrounding Earhart’s last days.

Neutrons and dirty metal plates

While watching a National Geographic documentary on the disappearance of Earhart, Daniel Beck, a pilot who also manages the engineering program for the Penn State Radiation Science and Engineering Center (RSEC), home to the Breazeale Nuclear Reactor, was shocked by a particular scene discussing an aluminum panel believed to be part of the wrecked airplane.
The documentary ended with the idea that, perhaps, sometime in the future, technology would advance to the point where scientists could extract more information from the panel.
“I realized that technology exists. I work with it every day,” Beck said.

The scientist got ahold of Richard “Ric” Gillespie, who leads The International Group for Historic Aircraft Recovery (TIGHAR) and was featured in the documentary, and offered to analyze the metal part using neutron technology at his lab. 
 

Kenan Ünlü, director of the Penn State Radiation Science and Engineering Center, holds a metal patch that might be from Amelia Earhart’s airplane.
Credit: Kenan Ünlü/Penn State.

The metal panel had been recovered in storm debris on Nikumaroro, a Pacific island located about 480 kilometers (300 miles) away from Howland Island.
Some have suggested before that Earhart’s plane made an emergency landing on the reef surrounding the small, uninhabited island.
A human skeleton was even found in 1940, and although the bones were lost, a 2018 study found that the historical records of the bones’ measurements matched Earhart more closely than 99% of the general population.

A skull fragment that may be from the original skeleton was found in a storage facility in a museum on a nearby island and is currently being tested to see if it is a genetic match for any of Earhart’s relatives. Beck’s goal was to perform a similar investigation, only instead of genetics, he wanted to use the reactor’s neutron beams to reveal the history of the metal patch.
Perhaps they could find a long-faded serial number or other marks that might link the debris to the Electra.

Beck and colleagues placed the sample in front of the neutron beam, with a digital imaging plate positioned behind it.
As the neutron beam passed through the sample and struck the imaging plate, an image was recorded and digitally scanned.

“As the beam passes through, if it were uniform density, we wouldn’t see anything,” Beck said.
“If there’s paint or writing or a serial number, things that have been eroded so we can’t see with the naked eye, we can detect those.”
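What Beck describes is, at heart, attenuation contrast: transmitted neutron intensity falls off exponentially with each material's attenuation coefficient and thickness (the Beer-Lambert law), so even a faint hydrogen-rich paint film on an otherwise neutron-transparent aluminum sheet changes the intensity reaching the imaging plate. A minimal sketch; the coefficients and thicknesses below are placeholder values for illustration, not measured ones.

```python
import math

def transmitted_fraction(layers):
    """Beer-Lambert attenuation through stacked layers.

    layers: list of (attenuation_coefficient_per_cm, thickness_cm).
    Returns I/I0, the fraction of the beam reaching the detector.
    """
    return math.exp(-sum(mu * t for mu, t in layers))

# Placeholder coefficients: bare aluminum vs. aluminum plus a trace paint film
bare    = transmitted_fraction([(0.10, 0.1)])                 # 1 mm Al sheet
painted = transmitted_fraction([(0.10, 0.1), (3.0, 0.002)])   # + 20 µm paint residue
print(f"bare: {bare:.4f}  painted: {painted:.4f}  contrast: {bare - painted:.4f}")
```

Even this toy contrast of a fraction of a percent is the kind of signal a long exposure can pull out of a plate where the eye sees nothing.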

This investigation revealed that the metal plate had axe marks along the edges, except for one of the edges where the metal must have snapped from whatever it was attached to.
In other words, not much linking it back to Earhart.

“It doesn’t appear that this patch popped off on its own,” Beck said.
“If it was chopped with an axe, we should see peaks for iron or nickel left by the axe along that edge. Neutron activation analysis gives us that detail at a very fine resolution.”

For now, the researchers plan on performing more examinations using more comprehensive experiments, including adjusting the irradiation time and power level of the reactor.

Even if they never find anything connecting the plate to Earhart, the inquiry is still valuable.
For one, it could disqualify the object, so other people don’t waste time on it in the future.
Secondly, it sets a precedent that may spur more research with neutron radiography.
“It’s possible we’ll learn something that actually disqualifies this artifact from being part of Earhart’s plane, but I prefer the knowing! It is so exciting to work with scientists who share our passion for getting to the truth, whatever it is,” Gillespie said in a statement.


Wednesday, February 24, 2021

Princeton astrophysicists re-imagine world map, designing a less distorted, ‘radically different’ way to see the world


From Princeton by Liz Fuller-Wright

How do you flatten a sphere?

For centuries, mapmakers have agonized over how to accurately display our round planet on anything other than a globe.

Now, a fundamental re-imagining of how maps can work has resulted in the most accurate flat map ever made, from a trio of map experts: J. Richard Gott, an emeritus professor of astrophysics at Princeton and creator of a logarithmic map of the universe once described as “arguably the most mind-bending map to date”; Robert Vanderbei, a professor of operations research and financial engineering who created the “Purple America” map of election results; and David Goldberg, a professor of physics at Drexel University.

Their new map is two-sided and round, like a phonograph record or vinyl LP.
Like many radical developments, it seems obvious in hindsight.
Why not have a two-sided map that shows both sides of the globe? It breaks away from the limits of two dimensions without losing any of the logistical convenience — storage and manufacture — of a flat map.

“This is a map you can hold in your hand,” Gott said.

In 2007, Goldberg and Gott invented a system to score existing maps, quantifying the six types of distortions that flat maps can introduce: local shapes, areas, distances, flexion (bending), skewness (lopsidedness) and boundary cuts (continuity gaps).
The lower the score, the better: a globe would have a score of 0.0.

“One can’t make everything perfect,” said Gott, who is also a 1973 graduate alumnus of Princeton.
“A map that is good at one thing may not be good at depicting other things.”

The Mercator projection, popular on classroom walls and used as the basis for Google maps, is excellent at depicting local shapes, but it distorts surface areas so badly near the North and South Poles that polar regions are usually simply chopped off.
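The polar blow-up is easy to quantify. The Mercator projection stretches both the east-west and north-south scales by sec φ at latitude φ, so areas are inflated by the square of that factor:

```latex
\frac{A_{\text{map}}}{A_{\text{true}}} = \sec^2\varphi,
\qquad \sec^2 60^\circ = 4, \qquad \sec^2 80^\circ \approx 33.2
```

This is why Greenland, which lies mostly above 60°N, rivals Africa on a Mercator wall map despite being roughly one-fourteenth its area.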

 

In the Mercator projection, the polar regions are completely distorted — Antarctica looks bigger than all other continents combined — and distances are misleading: Japan and Hawaii look very far apart.
Under the system designed by Goldberg and Gott to quantify map errors, where lower numbers represent less distortion, the Mercator projection receives a score of 8.296.
Map by Daniel R. Strebe via Wikimedia Commons


Using their metrics, the best previously known flat map projection was the Winkel Tripel, with a Goldberg-Gott score of 4.563.
But that still had the “boundary cut” problem of splitting the Pacific Ocean and creating the illusion of great distance between Asia and Hawaii.


The Winkel Tripel projection, chosen by the National Geographic for its world maps, represents the poles more accurately than the Mercator, but it still distorts Antarctica badly and creates the illusion that Japan is hugely to the east of California, instead of its nearest neighbor to the west.
Goldberg-Gott score: 4.563
Map by Daniel R. Strebe via Wikimedia Commons

Clearly, a completely new approach was needed.
Gott drew a comparison to Olympic high jumpers: In 1968, Dick Fosbury shocked sports fans by arching his back and jumping over the bar backwards.
He set a new record and won a gold medal, and high jumpers have jumped backwards ever since.

“We’re like Mr. Fosbury,” Gott said.
“We’re doing this to break a record, to make the flat map with the least error possible.
So, like him, we’re surprising folks.
We’re proposing a radically different kind of map, and we beat Winkel Tripel on each and every one of the six errors.”

The inspiration came from Gott’s work on polyhedra — solid figures with many faces.

Polyhedral maps are nothing new — in 1943, Buckminster Fuller broke the world into regular shapes, and provided instructions for how to fold it up and assemble it as a polyhedral globe — but while he could protect the shapes of continents, Fuller shredded the oceans and increased many distances, such as between Australia and Antarctica.


Buckminster Fuller popularized the “Dymaxion” polyhedral projection, based on an unfolded icosahedron.
Antarctica is “round, as it properly should be,” said Gott, but this projection “shatters” the oceans.
Goldberg-Gott score: greater than 15
Imagery courtesy NASA’s Earth Observatory, with modifications by Mapthematics LLC
 
A presentation copy of the rare reprint of R. Buckminster Fuller’s “Fluid Geography,” describing his marvelous Dymaxion Map and illustrated by a large folding example thereof.

In a recent paper, Gott began considering “envelope polyhedra,” with regular shapes glued together back-to-back, which led to the breakthrough idea for the double-sided map.

It can be displayed with the Eastern and Western Hemispheres on the two sides, or in Gott’s preferred orientation, the Northern and Southern Hemispheres, which conveniently allows the equator to run around the edge.
Either way, this is a map with no boundary cuts.
To measure distances from one side to the other, you can use string or measuring tape reaching from one side of the disk to the other, he suggested.
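The projection behind the disk (identified in the caption below as azimuthal equidistant) is simple to write down: each hemisphere goes on one face, a point's distance from the face's centre (the pole) is proportional to its angular distance from that pole, and its direction is its longitude. A minimal sketch; the unit disk radius and the hemisphere-per-face convention are illustrative assumptions.

```python
import math

def double_sided_map(lat_deg, lon_deg, disk_radius=1.0):
    """Azimuthal equidistant projection onto a two-sided disk:
    northern hemisphere on the front face, southern on the back.
    The equator lands on the rim of both faces, so the map is
    continuous over the edge.
    (A printed version would mirror the back face so the two sides
    register; that detail is omitted here.)
    """
    side = "front/N" if lat_deg >= 0 else "back/S"
    colat = math.radians(90.0 - abs(lat_deg))   # angular distance from the pole
    r = disk_radius * colat / (math.pi / 2.0)   # 'equidistant': r grows linearly
    theta = math.radians(lon_deg)
    return side, r * math.cos(theta), r * math.sin(theta)

print(double_sided_map(90.0, 0.0))    # North Pole -> centre of the front face
print(double_sided_map(0.0, 45.0))    # equator -> the rim, shared by both faces
print(double_sided_map(-33.9, 18.4))  # Cape Town -> on the back face
```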

“If you’re an ant, you can crawl from one side of this ‘phonograph record’ to the other,” Gott said.
“We have continuity over the equator.
Africa and South America are draped over the edge, like a sheet over a clothesline, but they’re continuous.”

This double-sided map has smaller distance errors than any single-sided flat map — the previous record-holder being a 2007 map by Gott with Charles Mugnolo, a 2005 Princeton alumnus.
In fact, this map is remarkable in having an upper boundary on distance errors: It is impossible for distances to be off by more than ± 22.2%.
By comparison, in the Mercator and Winkel Tripel projections, as well as others, distance errors become enormous approaching the poles and essentially infinite from the left to the right margins (which are far apart on the map but directly adjacent on the globe).
In addition, areas at the edge are only 1.57 times larger than at the center.
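That 1.57 is no accident: it is π/2, and it follows directly from the projection. On an azimuthal equidistant hemisphere the radial scale is exactly 1, while the scale along circles of latitude at angular distance θ from the pole is θ/sin θ; the area scale is their product, which is largest at the equator (θ = π/2):

```latex
% r = R\theta \;\Rightarrow\; \text{radial scale } 1,\ \text{tangential scale } \theta/\sin\theta
\frac{A_{\text{map}}}{A_{\text{globe}}}(\theta) = 1 \cdot \frac{\theta}{\sin\theta}
\;\le\; \frac{\pi/2}{\sin(\pi/2)} = \frac{\pi}{2} \approx 1.571
```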


Gott, Goldberg and Vanderbei’s revolutionary, double-sided disk map minimizes all six types of map distortions. 
They used an equidistant azimuthal projection: a compromise projection, like the Winkel Tripel, with small errors in both local shapes and areas, instead of optimizing one at the expense of the other. 
Antarctica and Australia are more accurately represented than in most other maps, and distances across oceans or across poles are both accurate and easy to measure, unlike one-sided flat maps. 
Goldberg-Gott error score: 0.881
Map by J. Richard Gott, Robert Vanderbei and David Goldberg

The map can be printed front-and-back on a single magazine page, ready for the reader to cut out.
The three cartographers imagine printing their maps on cardboard or plastic and then stacking them like records, to be stored together in a box or slipped inside the covers of textbooks.

“A thin box could hold flat, double-sided maps of all the major planets and moons in the solar system,” Gott said, “or a stack of Earth maps giving physical data, political boundaries, population density, climate, languages, explorers’ voyages, empires at different historical periods or continents at different geological epochs.”

To the best of their knowledge, no one has ever made double-sided maps for accuracy like this before.
A 1993 compendium of nearly 200 map projections dating back 2,000 years did not include any, nor did they find any similar patents.

“Our map is actually more like the globe than other flat maps,” Gott said.
“To see all of the globe, you have to rotate it; to see all of our new map, you simply have to flip it over.”

“Flat maps that improve on the Winkel Tripel,” by J. Richard Gott III, David M. Goldberg and Robert J. Vanderbei, was published on arXiv on Feb. 15.
You can see their double-disk maps of Earth, Mars, Jupiter, the sun, and other heavenly bodies here.
