Weather forecasting is notoriously difficult, but in recent years experts have suggested that machine learning could help sort the sunshine from the sleet. Google is the latest firm to get involved, and in a blog post this week shared new research that it says enables “nearly instantaneous” weather forecasts.

The work is in the early stages and has yet to be integrated into any commercial systems, but early results look promising. In the non-peer-reviewed paper, Google’s researchers describe how they were able to generate accurate rainfall predictions up to six hours ahead of time at a 1 km resolution from just “minutes” of calculation.

That’s a big improvement over existing techniques, which can take hours to generate forecasts, although they do so over longer time periods and generate more complex data.

Speedy predictions, say the researchers, will be “an essential tool needed for effective adaptation to climate change, particularly for extreme weather.” In a world increasingly dominated by unpredictable weather patterns, they say, short-term forecasts will be crucial for “crisis management, and the reduction of losses to life and property.”

Google’s work used radar data to predict rainfall. The top image shows cloud location, while the bottom image shows rainfall. Credit: NOAA, NWS, NSSL

The biggest advantage Google’s approach offers over traditional forecasting techniques is speed. The company’s researchers compared their work to two existing methods: optical flow (OF) predictions, which look at the motion of phenomena like clouds, and simulation forecasting, which creates detailed physics-based simulations of weather systems.
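To make the comparison concrete, here is a minimal sketch of the optical-flow idea, assuming two consecutive radar frames are available as 8-bit grayscale NumPy arrays. It illustrates the general technique only; the function and parameter choices are our own, not code from Google’s paper.

```python
import cv2
import numpy as np

def nowcast_one_step(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Advect the latest radar frame forward one time step along the
    motion field estimated between the two most recent frames."""
    # Dense optical flow gives a per-pixel (dx, dy) motion vector.
    flow = cv2.calcOpticalFlowFarneback(
        prev_frame, curr_frame, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    h, w = curr_frame.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Warp the current frame along the flow field: each output pixel is
    # sampled from where the estimated motion says it came from.
    map_x = (grid_x - flow[..., 0]).astype(np.float32)
    map_y = (grid_y - flow[..., 1]).astype(np.float32)
    return cv2.remap(curr_frame, map_x, map_y, cv2.INTER_LINEAR)
```

Applied repeatedly, this kind of extrapolation pushes the current precipitation field forward in time, which is cheap but degrades quickly as storms grow, decay, or change direction.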

The problem with these older methods, particularly the physics-based simulation, is that they’re incredibly computationally intensive. Simulations made by US federal agencies for weather forecasting, for example, have to process up to 100 terabytes of data from weather stations every day and take hours to run on expensive supercomputers.

“If it takes 6 hours to compute a forecast, that allows only 3–4 runs per day and resulting in forecasts based on 6+ hour old data, which limits our knowledge of what is happening right now,” wrote Google software engineer Jason Hickey in a blog post.

Google’s methods, by comparison, produce results in minutes because they don’t try to model complex weather systems, but instead make predictions about simple radar data as a proxy for rainfall.
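As a rough illustration of what predicting radar data directly can look like, the sketch below frames nowcasting as an image-to-image problem: a small convolutional network (written here in PyTorch, and far simpler than anything in Google’s paper) takes a stack of recent radar frames and outputs a single frame in the near future.

```python
import torch
import torch.nn as nn

class RadarNowcaster(nn.Module):
    """Toy model: map the last few radar frames to one future frame."""
    def __init__(self, n_input_frames: int = 4):
        super().__init__()
        # Recent frames are stacked along the channel dimension.
        self.net = nn.Sequential(
            nn.Conv2d(n_input_frames, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, n_input_frames, height, width)
        return self.net(frames)

# Example usage with random tensors standing in for radar imagery.
model = RadarNowcaster()
past_frames = torch.rand(1, 4, 256, 256)   # four recent radar frames
predicted = model(past_frames)             # one predicted future frame
```

Because the model only has to learn patterns in radar imagery rather than simulate the atmosphere, inference is a single forward pass, which is where the minutes-instead-of-hours speedup comes from.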

The company’s researchers trained their AI model on historical radar data collected between 2017 and 2019 in the contiguous US by the National Oceanic and Atmospheric Administration (NOAA). They say their forecasts were as good as or better than three existing methods making predictions from the same data, though their model was outperformed when attempting to make forecasts more than six hours ahead of time.
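The article doesn’t spell out the scoring protocol, but a common way to compare rainfall forecasts, and a reasonable mental model for “as good as or better than,” is to binarize predicted and observed rainfall at a threshold and score the result as a rain/no-rain classification. The sketch below shows that idea; the threshold and function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rain_precision_recall(predicted_mm: np.ndarray,
                          observed_mm: np.ndarray,
                          threshold: float = 1.0):
    """Precision and recall for 'rain above threshold' events."""
    pred_rain = predicted_mm >= threshold
    obs_rain = observed_mm >= threshold

    true_pos = np.logical_and(pred_rain, obs_rain).sum()
    false_pos = np.logical_and(pred_rain, ~obs_rain).sum()
    false_neg = np.logical_and(~pred_rain, obs_rain).sum()

    # Guard against division by zero when no rain is predicted or observed.
    precision = true_pos / max(true_pos + false_pos, 1)
    recall = true_pos / max(true_pos + false_neg, 1)
    return precision, recall
```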

This seems to be the sweet spot for machine learning in weather forecasts right now: making speedy, short-term predictions, while leaving longer forecasts to more powerful models. NOAA’s weather models, for example, create forecasts up to 10 days in advance.

While we’ve not yet seen the full effects of AI on weather forecasting, plenty of other companies are also investigating this same area, including IBM and Monsanto. And, as Google’s researchers point out, such forecasting techniques are only going to become more important in our daily lives as we feel the effects of climate change.

AI could be particularly useful for short-term forecasts. Photo by APU GOMES/AFP via Getty Images