The weather sure has been hot lately. Want a good picture of the weather over the past year? Then take a look at the report NOAA's National Climatic Data Center recently published on the weather and climate around the world in 2009. Because weather fascinates many of us and is experienced by all of us, the report provides good examples of how a federal agency can analyze, visualize, predict and model data – not just weather data.
The first observation I’d make is that the report focuses on unusual or anomalous events and tries to put them in historical context. For example, I learned that the U.S. had its wettest October since records began 115 years ago, and that Toronto had a snow-free November for the first time in recorded history. In all data analysis – weather data, financial data, or performance data – it is important to pull the significant events out of the rest of the data, or the “noise” as we say. NOAA does this by comparing the past year’s data with its historical record to see where the year stands relative to all the other years. Other agencies can do similar analysis, whether the metric is road-miles constructed or the percentage of student-aid recipients who graduate. What is key is weighing the results against the historical data and trying to gain insight into what the trend is and what it means.
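To make the idea concrete, here is a rough sketch (my own illustration in Python, not NOAA's method, and with made-up placeholder numbers rather than real precipitation data) of ranking a new observation against a historical record to see how unusual it is:

```python
# A rough illustration (not NOAA's code or data): rank one year's value
# against the full historical record to see how unusual it is.
# The numbers are made-up placeholders for October precipitation totals.

historical = {year: 70.0 + (year % 7) * 3.0 for year in range(1895, 2009)}  # placeholder values
latest = 95.0  # placeholder value for the year being assessed

# Count how many past years exceeded the latest observation.
wetter = sum(1 for value in historical.values() if value > latest)
rank = wetter + 1
print(f"Latest year ranks #{rank} out of {len(historical) + 1} years on record")
```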
Because 2009 was the end of the decade, NOAA has also compiled some data at the decade level rather than at the year level. While 2009 had the fifth warmest average global temperature on record, the 2000–2009 decade was the warmest on record for the globe. And the decade before, 1990–1999, was the warmest on record at that time. The use of multi-year averages is a good example of the smoothing that can help ferret out significant information and remove the year-to-year fluctuation in large collections of time-series data.
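As a simple sketch of that kind of smoothing (again my own Python illustration, using synthetic numbers rather than the report's data), a decadal average is just the mean of each ten-year block of an annual series:

```python
# A simple sketch of multi-year smoothing: average an annual series by decade.
# The annual values below are synthetic placeholders, not real temperatures.
from statistics import mean

annual_means = {year: 14.0 + 0.01 * (year - 1950) for year in range(1950, 2010)}

# Group years into decades (1950s, 1960s, ...) and average each block.
decadal = {}
for year, value in annual_means.items():
    decadal.setdefault((year // 10) * 10, []).append(value)

for decade in sorted(decadal):
    print(f"{decade}s average: {mean(decadal[decade]):.2f}")
```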
The State of the Climate report shows good examples of many data analysis techniques, including historical analysis, near-real-time reporting, reanalysis of past data using newer, improved techniques, averaging of multiple datasets to improve reliability, and drill-down capabilities from decades to years to seasons to months, and from global to regional to country and state. They also use interesting visualization techniques.
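For readers who want a feel for what “drill-down” means in practice, here is a minimal sketch (my own Python/pandas illustration with a few placeholder rows, not the report's data or tools) of the same observations aggregated at a coarse level and then broken out by region and month:

```python
# A minimal sketch of drill-down aggregation, assuming observations stored
# with time and place columns; the rows here are illustrative placeholders.
import pandas as pd

obs = pd.DataFrame({
    "year":      [2008, 2008, 2009, 2009],
    "month":     [1, 7, 1, 7],
    "region":    ["North America", "Europe", "North America", "Europe"],
    "anomaly_c": [0.3, 0.5, 0.4, 0.6],   # placeholder temperature anomalies
})

# Coarsest view: one number per year (the average across the whole sample).
print(obs.groupby("year")["anomaly_c"].mean())

# Drill down: the same data broken out by region, year, and month.
print(obs.groupby(["region", "year", "month"])["anomaly_c"].mean())
```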
Those interested in data analysis, as well as weather, should see my full blog article here. Also, download the report from the NOAA website (Arndt, D. S., M. O. Baringer, and M. R. Johnson, Eds., 2010: State of the Climate in 2009, Bull. Amer. Meteor. Soc.). Note: While NOAA does use IBM technology in its research, the report does not state which technology was used in the reported climate studies, and I do not intend to imply any relationship between this report and IBM.
For further information on analytics, including our fall schedule of seminars, please visit the Analytics Solution Center website at www.ibm.com/ASCdc.
If your agency uses analytics in interesting and novel ways, I’d like to hear from you. Please write to me at [email protected].
– Frank Stein, Director
– Analytics Solution Center
– Washington, D.C.