IBM Research in Dublin has demonstrated that Simulating WAves Nearshore (SWAN) models that previously required high-performance computing can be run on lower-end devices such as a Raspberry Pi.
From IBM by Fearghal O'Donncha
Scientists have made amazing advances enabling machines to understand language and process images for such applications as facial recognition, image classification (e.g., “cat” or “dog”) and translation of texts.
Work in the IBM Research lab in Dublin this summer was focused on a very different problem: using AI techniques such as deep learning to forecast a physical process, namely, ocean waves.
Traditional physics-based models are driven by external forces: The tides rise and fall, winds blow in different directions, the depth and physical properties of water influence the speed and height of the waves.
These physical processes and their relationships are encapsulated in the differential equations that are coded into numerical models of wave transport.
The nature of these computations typically demands high-performance computing (HPC) infrastructure to solve the equations.
This high computational expense limits the spatial resolution, physical processes and time-scales that can be investigated by a real-time forecasting platform.
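To see why computational expense scales so steeply with resolution, a toy solver helps. The sketch below is my own illustration, not the SWAN code: it advects a 1-D pulse of wave energy with a first-order upwind finite-difference scheme. Refining the grid forces a smaller time step through the CFL condition, so work grows rapidly even in this simple setting, and SWAN solves a far richer 2-D spectral action-balance equation with many source terms.

```python
import numpy as np

# Toy 1-D advection of wave energy density E(x, t):
#   dE/dt + c * dE/dx = 0
# solved with a first-order upwind scheme on a periodic grid.
# This is NOT the SWAN action-balance equation, just a sketch of
# why grid resolution drives cost: halving dx forces halving dt
# (CFL condition), so work grows ~4x in 1-D and far faster in the
# multi-dimensional spectral models used for real wave forecasts.

def advect(E0, c=1.0, dx=0.1, dt=0.05, steps=100):
    """March the upwind scheme forward `steps` time levels."""
    assert c * dt / dx <= 1.0, "CFL condition violated"
    E = E0.copy()
    for _ in range(steps):
        # periodic upwind update: E_i -= CFL * (E_i - E_{i-1})
        E = E - (c * dt / dx) * (E - np.roll(E, 1))
    return E

if __name__ == "__main__":
    x = np.linspace(0, 10, 101)              # dx = 0.1
    E0 = np.exp(-((x - 2.0) ** 2))           # Gaussian packet of wave energy
    E = advect(E0, steps=40)                 # travels c * dt * steps = 2 units
    print(f"peak moved to x = {x[np.argmax(E)]:.1f}")
```

Note that the periodic upwind update conserves total energy exactly while the packet propagates, which is a useful sanity check on any such scheme.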
Representative heat maps of the difference between SWAN- and machine-learning-simulated Hs. The wave-height snapshot on the left shows some trends of local discrepancy (RMSE of 6 cm in this image) not evident in the right figure, which actually has a higher RMSE (14 cm).
We developed a deep learning framework that provides a 12,000 percent (roughly 120-fold) acceleration over these physics-based models at comparable levels of accuracy.
The validated deep-learning framework can be used to perform real-time forecasts of wave conditions using available forecasted boundary wave conditions, ocean currents, and winds.
The huge reduction in computational expense means that
- simulations can be run on a Raspberry Pi rather than an HPC centre, and
- a vastly larger set of physical conditions, geometries and time-scales can be investigated simply by amending the input datasets to the deep learning model.
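The surrogate idea can be sketched in a few lines. The example below is my own minimal illustration, not IBM's actual network: a tiny one-hidden-layer NumPy net learns a mapping from forcing inputs (boundary wave height, wind, current) to wave heights at a handful of grid points, with synthetic targets standing in for the SWAN outputs used in the real study.

```python
import numpy as np

# Minimal surrogate sketch (NOT the study's architecture): learn a
# mapping from forcing inputs to wave heights at grid points with a
# one-hidden-layer network trained by full-batch gradient descent.

rng = np.random.default_rng(0)

# Synthetic "training set": 500 forcing snapshots -> 10 grid points.
# In the real study the targets come from SWAN runs, not a formula.
X = rng.uniform(0, 1, size=(500, 3))     # [Hs_boundary, wind, current]
true_W = rng.normal(size=(3, 10))
Y = np.tanh(X @ true_W)                  # stand-in "wave height field"

# One hidden tanh layer, linear output, mean-squared-error loss.
W1 = rng.normal(scale=0.5, size=(3, 32))
W2 = rng.normal(scale=0.5, size=(32, 10))
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1)                  # hidden activations
    P = H @ W2                           # predicted wave heights
    G = 2.0 * (P - Y) / len(X)           # dMSE/dP
    W2 -= lr * (H.T @ G)                 # output-layer gradient step
    W1 -= lr * (X.T @ ((G @ W2.T) * (1.0 - H ** 2)))  # backprop step

rmse = float(np.sqrt(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)))
print(f"training RMSE: {rmse:.3f}")
```

Once trained, a forward pass is just two small matrix products, which is why inference fits comfortably on a Raspberry Pi while generating the training data still requires the full physics model.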
We use the physics-based Simulating WAves Nearshore (SWAN) model to generate training data for the deep learning network.
The model — driven by measured wave conditions, ocean currents from an operational forecasting system, and wind data from The Weather Company — was run between April 1, 2013 and July 31, 2017, generating forecasts at three-hour intervals to provide a total of 12,400 distinct model outputs.
Specifically, 3,111 images of wave heights and periods could be replicated by the deep-learning algorithm with errors smaller than those of the SWAN model-verification exercise.
Outputs from SWAN and the deep learning network were compared to observed buoy wave data within the model domain, demonstrating that, despite the huge reduction in computational expense, the new approach provides accuracy comparable to the traditional physics-based SWAN model.
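The comparison itself reduces to a root-mean-square-error computation against the buoy time series. The sketch below uses made-up numbers purely to show the metric; the RMSE values reported in the study were computed from the real buoy records.

```python
import numpy as np

# Skill metric used to compare SWAN and the surrogate against buoy
# observations: root-mean-square error of significant wave height Hs.
# The values below are illustrative only, not the study's data.

def rmse(predicted, observed):
    """Root-mean-square error between two wave-height series."""
    predicted, observed = np.asarray(predicted), np.asarray(observed)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

buoy_hs = [1.10, 1.25, 1.40, 1.30, 1.15]   # observed Hs (m)
swan_hs = [1.05, 1.30, 1.35, 1.32, 1.20]   # physics-model forecast (m)
ml_hs   = [1.02, 1.33, 1.31, 1.35, 1.22]   # surrogate forecast (m)

print(f"SWAN RMSE: {rmse(swan_hs, buoy_hs):.3f} m")
print(f"ML   RMSE: {rmse(ml_hs, buoy_hs):.3f} m")
```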
Accurate forecasts of ocean wave heights and directions are a valuable resource for many marine-based industries.
Many of these industries operate in harsh environments where power and computing facilities are limited.
A solution that provides highly accurate wave-condition forecasts at low computational cost is essential for improved decision making.
As an example, shipping companies can use highly accurate forecasts to determine the best voyage route in rough seas, minimising a chosen metric such as fuel consumption or voyage time. Aquaculture operators require timely, continuously updating forecasts to inform decision-making related to high-margin activities such as feeding and harvesting.
This study extends and builds on a collaboration between IBM Research – Ireland, Baylor University and the University of Notre Dame. Prof. Scott James from Baylor, who has extensive industry experience in wave forecasting applications, specifically for wave energy, joined the IBM Dublin Research Lab for a summer sabbatical to further an existing research collaboration.
The objective of the sabbatical was to leverage IBM’s skills in AI to extend wave forecasting capabilities beyond current state-of-the-art.
Yushan Zhang, a Ph.D. candidate at the University of Notre Dame, brought experience applying machine learning analytics in a number of research studies.
Together, the blend of modelling skills, machine learning capabilities and industry experience from the three institutions resulted in innovative deep learning solutions to enable wave forecasting at a fraction of the computational cost of current state-of-the-art methods. This method is illustrated in our paper “A Machine Learning Framework to Forecast Wave Conditions.”