Drought in South Dakota

South Dakota drought map

Western South Dakota is one of those classic drought cases. It was extraordinarily wet there during the 1990s, well above the long-term mean. Of the 10 wettest years in a century of records in northwest South Dakota, four came during the 1990s. An astonishing six years during the 1990s were more than one standard deviation wetter than the long-term mean in the southwest South Dakota climate division.
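To make the standardized-anomaly arithmetic concrete, here's a minimal Python sketch: standardize a century of annual totals against the long-term mean and count the 1990s years that land more than one standard deviation above it. The precipitation values are randomly generated stand-ins, not the actual climate division data.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1905, 2005)                 # a century of records
annual_precip = rng.gamma(shape=20.0, scale=0.8, size=years.size)  # stand-in data

# Standardized anomaly: (value - long-term mean) / long-term std deviation
z = (annual_precip - annual_precip.mean()) / annual_precip.std()

in_1990s = (years >= 1990) & (years <= 1999)
wet_1990s = np.sum(z[in_1990s] > 1.0)
print(f"1990s years more than 1 sigma wetter than the mean: {wet_1990s}")
```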

Andrea Cook at the Rapid City Journal gets points for trying to explain this to folks:

South Dakota benefited from relatively wet periods through most of the 1990s and in 2001, Todey said.

But her story reflects what seems to be an inevitable mistake in thinking about drought in this country (not her mistake, but the mistake of the people she’s talking to). Variation around the mean is normal. When we’re on the wet side of the mean – even way on the wet side of the mean – agriculturalists in the United States treat it as “normal,” expanding herds, planting, etc. (See my previous discussion of the fact that we don’t really have a word for the opposite of “drought”.) When we slip to the dry side of the mean, that’s a “drought”. There’s no question that western South Dakota has been on the dry side of the mean for the last five years. But within the ups and downs, the Standardized Precipitation Index for the region shows “near normal” conditions.

4 Comments

  1. How long a span of time does the Standardized Precipitation Index measure? (Excuse my ignorance.) And how long a span do you think we should be looking at? 25 years? 100 years? 250 years? 10,000 years? I take your point with short-term thinking versus the long-term mean, but as John Maynard Keynes pointed out, in the long run, we’re all dead. I’m not sure how to bridge this gap between scientific thinking and popular thinking, and I’m not sure who is working on the problem, outside of a handful of journalists. Your general thoughts on this would be appreciated.

  2. Kit –

    Thanks for the question (and thanks for exposing my sleight of hand – drat you!). The SPI was developed back in the early 1990s as an alternative to the Palmer Drought Index precisely because Palmer is a black box – it’s essentially impossible to tell what a Palmer number represents. The SPI is calculated from the instrumental record, so it is essentially based on the variability over the last century. It tells you where a particular time period falls within that historical distribution: an SPI of -1 means you’re roughly one standard deviation drier than the mean over the time period in question, and +1 means one standard deviation wetter.
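    For the curious, here’s a rough Python sketch of the standard SPI recipe (fit a gamma distribution to the accumulation series, take each value’s cumulative probability, and map it onto a standard normal). The totals and the spi() helper are illustrative, not the official code, and this skips the zero-precipitation handling the full method includes.

    ```python
    import numpy as np
    from scipy import stats

    def spi(accumulations):
        """SPI for a series of precipitation accumulations (e.g. 12-month sums)."""
        # Fit a gamma distribution to the accumulations (location pinned at 0).
        shape, loc, scale = stats.gamma.fit(accumulations, floc=0)
        # Cumulative probability of each value under the fitted gamma...
        cdf = stats.gamma.cdf(accumulations, shape, loc=loc, scale=scale)
        # ...mapped onto a standard normal: that z-score is the SPI.
        return stats.norm.ppf(cdf)

    # Hypothetical 12-month precipitation totals, in inches:
    totals = np.array([14.2, 18.9, 11.7, 22.4, 16.1, 19.8, 13.3, 25.0, 15.5, 12.8])
    print(np.round(spi(totals), 2))
    ```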

    The beauty of the SPI is that you can pick the relevant time scale you want to measure. So if you’re a farmer dependent on what fell in the last three months, you can do the 3-month SPI. If you’re dependent on soil moisture that builds up over a year, you can use a 12-month SPI. If you depend on multi-year precipitation, you can do 24, 48 months, etc. The link I gave above is a five-year SPI, based on the article’s contention that South Dakota was in the midst of a multi-year event. But at all the relevant time scales, the SPI values for the three climate divisions on the western side of South Dakota are all in the normal range.
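    The time-scale knob is just the width of the accumulation window. A sketch, again with made-up monthly data: sum precipitation over trailing 3-, 12- and 24-month windows and run each series through the same gamma-fit-and-transform recipe.

    ```python
    import numpy as np
    from scipy import stats

    def spi(accum):
        # Same gamma-fit-then-normal-transform recipe as the sketch above.
        a, loc, scale = stats.gamma.fit(accum, floc=0)
        return stats.norm.ppf(stats.gamma.cdf(accum, a, loc=loc, scale=scale))

    rng = np.random.default_rng(7)
    monthly = rng.gamma(shape=2.0, scale=0.7, size=240)  # 20 years of stand-in data

    for window in (3, 12, 24):
        # The time scale is just the width of the trailing accumulation window.
        accum = np.convolve(monthly, np.ones(window), mode="valid")
        print(f"{window:>2}-month SPI, most recent value: {spi(accum)[-1]:+.2f}")
    ```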

    As to your underlying question, I don’t think there’s a single “right” answer to the question of what time scale is relevant when you’re thinking about these things. It’s domain-specific. But I do think it’s important for the people making the decisions to be aware of what the data says across the time scales that matter. In the case of drought, there’s a tendency to forget that the unusually wet periods, on decadal scales, are just that: unusual. For the kind of example we’re talking about here, you don’t need to go back to the paleo record. A century of instrumental record is sufficient to ‘splain to people that the 1990s were really unusually wet, and the last five years were not that unusually dry.

Comments are closed.