Inside Track

Prediction and Its Follies

By Eric P. Harding

We humans are creatures of pattern. We learn by noticing patterns in our surroundings, then applying those patterns to new scenarios.

One common example of an observed pattern that lends itself to misapplication is the preterite “-ed.” As toddlers, we discover that verbs take tenses—I turn the page now, I turned it yesterday, I’ll turn it tomorrow. But that pesky past-tense “-ed” doesn’t apply to many irregular verbs … verbs that are among the most commonly used words in our language. So when a kid says “I goed down the slide,” we may chuckle at the error, but we understand why the mistake was made (and such overgeneralization errors are actually a sign of a deeper grasp of linguistic rules).

Just because we’ve noticed a pattern doesn’t necessarily mean we can successfully predict outcomes in new scenarios. Actuaries, of course, are adept at examining past experience and applying the lessons learned to new situations … but there’s a limit to how precise, how perfect, such a prediction can be.

This issue’s features explore that idea of prediction—how themes interrelate, the practical limitations of prognosticating, and when forecasting falls apart.

In “An Imperfect Storm” (page 20), author Jeff Reeves looks at the technological state of the weather forecasting system in the United States. Weather modeling helps predict where and when cataclysmic storms will strike, allowing those in the predicted path of the destruction to better prepare. But the United States has fallen behind other countries when it comes to forecasting infrastructure, Reeves illustrates—and the unpredictable nature of a changing climate only amplifies the need for better forecasting technology.

Our second feature, “The Ticking Clock” (page 28), looks at so-called doomsday factors—geopolitical risk, a growing population, pandemics, economic inequality, and so on—and explores how these factors may intertwine in unpredictable ways. Author Wes Edwards paints a dire picture of an uncertain world, but he stresses that understanding these risk factors—and pondering how they may interact—can lead to better and more thoughtful risk-mitigation efforts.

Our final feature in this issue, “Patterns and Noise” (page 34), looks at an informational metric called Kolmogorov complexity, exploring how this theoretical concept can be applied to real-world tasks. Actuaries are accustomed to working with complex models and predictive tools. To improve a business unit’s profits or productivity, actuaries are often asked to refine those models to increase the quality of the output. But sometimes the incremental gain—in terms of increased profits or predictive power—does not justify the additional costs—in terms of time or opportunity cost.
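Kolmogorov complexity itself is uncomputable, but a rough intuition for it can be sketched with an off-the-shelf compressor: highly patterned data admits a short description and compresses well, while random noise does not. The following minimal Python illustration is my own sketch of that intuition, not anything drawn from the feature itself:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size -- a crude, computable
    stand-in for Kolmogorov complexity, which is uncomputable."""
    return len(zlib.compress(data)) / len(data)

patterned = b"ab" * 500   # a simple repeating pattern: short description
noisy = os.urandom(1000)  # random bytes: no shorter description exists

# The patterned string compresses dramatically (low complexity);
# the random string barely compresses at all (high complexity).
print(compression_ratio(patterned))  # well under 0.1
print(compression_ratio(noisy))      # close to (or slightly above) 1.0
```

The gap between the two ratios is the point: past a certain threshold, squeezing more predictive signal out of data that is mostly noise costs more than it returns.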

Also in this issue, Academy President Bob Beuerlein provides the final installment in his “Professionalism in Action” series, “Setting the Temperature to ‘Ethical’” (page 16). In it, he closes the circle on a theme he’s been discussing throughout his presidential term—why it’s important to act more like a thermostat than a thermometer. To do so, Beuerlein says, actuaries can use the vast professionalism resources established by the Academy—including the Actuarial Standards Board and the Actuarial Board for Counseling and Discipline—to resolve ethical dilemmas, and they can rely on the interconnected professionalism framework to underpin their work.

Thank you for picking up this issue. I predict you’ll find something interesting on the pages that follow.
