The 2014 Edge Annual Question (EAQ) is out. This year, the question posed to the contributors is: What scientific idea is ready for retirement?
As usual with the EAQ, it provokes thought and promotes discussion. I have only read through a fraction of the responses so far, but I want to highlight a few Edge contributors who answered with a common, and in my opinion very important and timely, theme. The responses that initially caught my attention came from Laurence Smith (UCLA), Gavin Schmidt (NASA), Giulio Boccaletti (The Nature Conservancy) and Danny Hillis (Applied Minds). Had I been asked this question, my contribution for idea retirement would likely align most closely with these four responses: Smith and Boccaletti want to see the same idea disappear — stationarity; Schmidt's response focused on the abolition of simple answers; and Hillis wants to do away with cause and effect.
In the age of big data, from a decision-making standpoint, all of these responses address the complex nature of interconnected scientific topics and the search for one-size-fits-all answers. The conclusions all point toward what science is supposed to do in the first place: generate knowledge. Of course, every experiment has the objective of answering a specific question, but each experiment, if performed properly, should fundamentally serve to generate more questions, not answers. When newly minted PhDs successfully defend their dissertations, for a brief moment in time they are the world's experts on their particular subject. The next day, that may no longer hold true. This is progress and should be embraced. If we can apply the findings of a study to address a specific problem, great. But science should be a humbling endeavor — each day we should realize how much we actually don't know.
As more and more universities crank out graduates under the data-science rubric, I hope their curricula stress that while machine learning and advanced algorithms can uncover new, useful and novel patterns within large datasets, these same tools and techniques can also be trained to flag when false positives might lead down dark data alleys. This is all part of a proper lens through which to view scientific risk management. Taking a complex adaptive systems approach to data analysis will better prepare decision makers to identify tipping points and non-stationarity, while providing a foundation to continuously challenge assumptions and, at the same time, to embrace the notions of complexity, shifting baselines, and ambiguity.
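To make the idea of non-stationarity concrete, here is a minimal toy sketch (every name, threshold, and number below is hypothetical, not from the essay): a simulated series whose baseline mean shifts partway through, and a crude trailing-window drift check that flags roughly where the shift occurs. Real change-point detection uses far more rigorous statistical tests; this is only an illustration of the concept.

```python
import random
from statistics import mean, stdev

random.seed(42)

# A toy series whose mean jumps from 0 to 3 at index 200 --
# a simple example of a non-stationary process.
series = [random.gauss(0.0, 1.0) for _ in range(200)]
series += [random.gauss(3.0, 1.0) for _ in range(200)]

def detect_shift(data, window=50, threshold=2.0):
    """Return the first index where the mean of a trailing window drifts
    more than `threshold` baseline standard deviations away from the
    mean of the initial window; None if no drift is found.
    A crude heuristic, not a formal change-point test."""
    baseline = data[:window]
    base_mean, base_sd = mean(baseline), stdev(baseline)
    for i in range(window, len(data) - window):
        if abs(mean(data[i:i + window]) - base_mean) > threshold * base_sd:
            return i
    return None

# Flags a point near the true change at index 200.
shift_at = detect_shift(series)
```

The point of the sketch is the one the essay makes: an analyst who assumes stationarity would keep using the pre-shift baseline long after the world had changed, whereas a routine check like this prompts the question "has my baseline moved?" rather than delivering a final answer.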