One of the most promising aspects of a Science 2.0 future is not just analyzing trillions of data points or getting the public to help with biology, but building more accurate models from much larger data sets. Big data.

This is not new in commercial physics software, where Monte Carlo analysis stopped being how products were designed by the mid-1990s. If a semiconductor company is going to tape out a product, it wants accurate models, and that is why physics and engineering were among the rare tools that got leaner and faster as computing power increased. But it's new for climate science, where models have been parameter-based and plagued by researcher and sample bias. In the Science 2.0 world, one of those problems can be fixed - if all the data can be used, and normalized to account for different levels of accuracy, we can more accurately predict the effects of climate change.

One of the advantages academics have is government funding - they can spend taxpayer money in ways Science 2.0 cannot. Aided by part of a $10-million multi-university Expeditions in Computing grant, Northeastern researchers Evan Kodra and Auroop Ganguly did a robust analysis and found that global temperature is increasing, but so too is the variability in temperature extremes. For instance, while each year's average hottest and coldest temperatures will likely rise, those averages will also tend to fall within a wider range of potential high and low temperature extremes than are currently being observed. This means that even as overall temperatures rise, we may still continue to experience extreme cold snaps, said Kodra.
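To make the "wider range" idea concrete, here is a minimal sketch with synthetic numbers - the periods, temperatures, and spreads below are invented for illustration and are not taken from the paper. The point is simply that the mean of the annual extremes can rise while the spread around that mean also widens, so unusually cold years stay on the table.

```python
# Minimal sketch with synthetic data (not the study's actual numbers):
# a rising mean of annual extremes alongside a widening spread.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual hottest-day temperatures (deg C) for two 30-year periods.
early = 38 + rng.normal(0, 1.2, size=30)   # baseline period: cooler mean, tighter spread
late  = 39 + rng.normal(0, 2.0, size=30)   # later period: warmer mean, wider variability

for label, series in [("baseline period", early), ("later period", late)]:
    print(f"{label}: mean annual max = {series.mean():.1f} C, "
          f"spread (std) = {series.std(ddof=1):.1f} C, "
          f"range = [{series.min():.1f}, {series.max():.1f}] C")
```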

So an extremely cold year is part of the new normal, just like an extremely hot year is.

More powerful numerical modeling will also continue to make climate science more accurate, because new methods will be able to show, rather than speculate about, whether the natural processes that drive weather anomalies cause ice melt in hotter years to lead to colder subsequent winters. Those hypotheses can only be confirmed in physics-based studies.

The authors used simulations from the most recent climate models used by the Intergovernmental Panel on Climate Change and "reanalysis data sets," which are generated by blending weather observations with numerical weather models. The team combined a suite of methods to characterize extremes and explain how their variability is influenced by things like the seasons, geographical region, and the land-sea interface. The analysis of multiple climate model runs and reanalysis data sets was necessary to account for uncertainties in the physics and model imperfections. 
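The paper itself does not publish its analysis code, but the general workflow it describes - computing extremes from many model runs and reanalysis data sets, then looking at how much they disagree - can be sketched as below. The file layout, the `tas` variable name, and the xarray-based approach are assumptions for illustration only.

```python
# Hypothetical illustration of the general workflow: compute annual temperature
# extremes from several data sets (climate model runs or reanalyses) and
# summarize how much they disagree. Paths and variable names are assumptions.
import glob
import xarray as xr

annual_max = {}
for path in glob.glob("ensemble/*.nc"):              # one NetCDF file per model run / reanalysis
    ds = xr.open_dataset(path)
    tas = ds["tas"]                                   # near-surface air temperature
    # Block maxima: hottest value in each calendar year, per grid cell
    annual_max[path] = tas.groupby("time.year").max("time")

# Stack the runs along a new ensemble dimension and examine the spread:
ens = xr.concat(list(annual_max.values()), dim="run")
ens_mean = ens.mean("run")      # ensemble-mean annual maxima
ens_spread = ens.std("run")     # disagreement across runs, i.e. model/reanalysis uncertainty
print(ens_mean.dims, ens_spread.dims)
```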

A Science 2.0 approach in a Big Data world will even, yes, predict the weather next year. It sounds impossible, but nothing is impossible with enough data.

Published in Scientific Reports. Source: Northeastern University