The science of weather forecasting has a long history. As early as 650 BC, the Babylonians used cloud patterns and astrology to make predictions, but it was not until the nineteenth century that weather forecasting gained momentum.
Fast-forward 150 years and the science of weather forecasting has changed beyond recognition. Today, thanks to supercomputers, far more accurate forecasts are possible. Armed with current knowledge and, above all, current data, these powerful machines can make informed statements about the weather tomorrow, the day after tomorrow and, who knows, even next month.
Faster, better results
“The data used for weather forecasts comes from a variety of sources, provided by satellites, weather stations, balloons, aircraft and even ships.”
To do their work, supercomputers need data, and weather forecasting is no exception. The data used for forecasts come from a variety of sources: satellites, weather stations, balloons, aircraft and even ships. Forecasters also have access to the Global Telecommunication System (GTS), which collects and distributes data four times a day at six-hour intervals.
The data from these sources can range from five hundred gigabytes to a terabyte and undergo quality control before they can be used. Once that process is complete, mathematical models make the predictions. These models, in use since the nineteenth century, are equations that describe the state, motion and evolution over time of atmospheric parameters such as wind and temperature.
Unprecedented computing power
Converting these equations into accurate predictions requires computational power. Dividing a large country into a grid of ten-kilometer blocks already demands significant computing power to make a local prediction within each block, but predictions for smaller blocks are the real challenge. Thunderstorms and other small-scale effects are closely tied to local weather, and a grid that is too coarse for them is easy to overlook them with. It is like fishing: to catch small fish, you need a finer net.
A finer grid requires an extraordinary amount of computing power. It may take 100 computer nodes, for example, to make a prediction on a grid of ten-kilometer blocks, but a prediction on a five-kilometer grid of the same area requires sixteen times as much computing power, and refining the grid further to 2.5 kilometers multiplies the requirement by another factor of sixteen.
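The scaling described above can be made concrete in a few lines. The factor of sixteen per halving and the 100-node baseline are taken from the figures in the text; treat them as illustrative, not as a property of any specific forecast model.

```python
# Sketch of the cost scaling in the text: each halving of the horizontal
# grid spacing multiplies the required compute by roughly 16.
# Baseline (100 nodes at 10 km) and factor (16) come from the article.

def required_nodes(spacing_km, base_spacing_km=10.0, base_nodes=100, factor=16):
    """Nodes needed at a given grid spacing under the article's scaling."""
    halvings = 0
    spacing = base_spacing_km
    while spacing > spacing_km:
        spacing /= 2.0
        halvings += 1
    return base_nodes * factor ** halvings

for spacing in (10.0, 5.0, 2.5):
    print(spacing, required_nodes(spacing))
# 10.0 → 100, 5.0 → 1600, 2.5 → 25600
```

Two halvings of the grid spacing already push the requirement from 100 nodes to 25,600, which is why finer grids quickly become impractical without enormous machines.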
AI as a solution
“Thanks to advances in technology, the 4-day forecast is now as accurate as the 1-day forecast was 30 years ago.”
Because of the amount of computing power these calculations require, scientists are turning to artificial intelligence (AI). Instead of using brute-force calculations to predict the weather from current conditions, AI systems learn from data from the past. This has a big impact on forecasting. The UK Met Office, for example, recently experimented with AI technology to predict floods and storms: trained on radar maps from 2016 to 2018, the system accurately predicted precipitation patterns for 2019 in nearly 90 percent of cases. Thanks to such technological advances, the 4-day forecast is now as accurate as the 1-day forecast was 30 years ago.
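To give a feel for how such a system is evaluated, the sketch below scores a "persistence" forecast (tomorrow's rain map is simply today's) on synthetic rain grids. Learned models like the one described are judged against exactly this kind of baseline. Everything here is made up for illustration; none of the numbers come from the Met Office experiment.

```python
# Evaluate a persistence nowcast on synthetic binary rain maps.
# Each day's map mostly resembles the previous one (slow-moving weather),
# which is what makes persistence a surprisingly strong baseline.
import random

random.seed(0)
CELLS = 32 * 32   # one flattened 32x32 rain/no-rain map per day
DAYS = 10

frames = [[random.random() < 0.3 for _ in range(CELLS)]]
for _ in range(DAYS - 1):
    prev = frames[-1]
    # ~90% of cells keep yesterday's value; the rest change at random
    frames.append([prev[i] if random.random() < 0.9 else (random.random() < 0.3)
                   for i in range(CELLS)])

correct = total = 0
for t in range(DAYS - 1):
    for today, tomorrow in zip(frames[t], frames[t + 1]):
        correct += (today == tomorrow)   # forecast for tomorrow = today
        total += 1

accuracy = correct / total
print(f"persistence accuracy: {accuracy:.2f}")
```

A learned model only earns its keep if it beats baselines like this, which is why headline accuracy figures are always read against what naive forecasting already achieves.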
New technologies herald an era of more accurate weather forecasting. Even so, long-term forecasts will never be 100 percent accurate. This is because the equations used to make weather forecasts are non-linear: tiny errors in the initial conditions grow rapidly, so small uncertainties in today's measurements can lead to very different outcomes a week later.
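This sensitivity can be demonstrated with the Lorenz equations, a classic toy model of atmospheric convection (a standard illustration, not a real weather model): two simulations that start almost identically end up far apart.

```python
# The Lorenz system: two runs differing by one part in a hundred million
# at the start drift completely apart, which is why nonlinear forecasts
# degrade with lead time. Toy illustration only, not a weather model.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # perturbed by 0.00000001
max_sep = 0.0
for _ in range(3000):        # integrate both trajectories forward
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    max_sep = max(max_sep, abs(a[0] - b[0]))

print(f"largest x-separation over the run: {max_sep:.2f}")
```

The separation grows from a hundred-millionth to the full width of the system's behaviour, which is the essence of why forecast skill runs out after a week or two no matter how fast the computer is.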
In addition, the science of weather forecasting lags behind the hardware. Although computing power doubles roughly every two years, it takes science considerably longer to actually apply that power in its models and research.
Still, weather forecasting has improved tremendously thanks to today's computing power. To illustrate the difference between then and now: a weather model that would have taken 600 years to run on the computer systems of the 1960s now takes only 15 minutes on a standard server.
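The figures quoted above imply a speedup that can be checked with quick arithmetic (the Moore's-law comparison is a rough sanity check of the article's numbers, not an exact derivation):

```python
# Back-of-the-envelope check: 600 years on a 1960s machine versus
# 15 minutes on a modern server, using the article's figures.
import math

MINUTES_PER_YEAR = 365.25 * 24 * 60

speedup = (600 * MINUTES_PER_YEAR) / 15
print(f"speedup factor: {speedup:.2e}")            # roughly 2 x 10^7

# Doublings implied if capability doubles every two years:
print(f"doublings needed: {math.log2(speedup):.1f}")  # about 24
```

About 24 doublings at one every two years is roughly fifty years, which lines up reasonably well with the 1960s-to-today window the article describes.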
As computing power and scientific knowledge of weather patterns continue to evolve in the coming years, even more accurate predictions will become possible. This is crucial, because by predicting extreme weather, supercomputers can save lives and have a major impact on the world.