While the evidence-based approach of science is lauded for introducing objectivity to processes of investigation, the role of subjectivity in science is less often highlighted in scientific literature. Nevertheless, the scientific method comprises at least two components: forming hypotheses, and collecting data to substantiate or refute each hypothesis (Descartes’ 1637 discourse [Olscamp, 1965]). A hypothesis is a conjecture of a new theory that derives from, but by definition is unproven by, known laws, rules, or existing observations. Hypotheses are always made by one individual or by a limited group of scientists, and are therefore subjective: they are based on the prior experience and processes of reasoning employed by those individuals, rather than solely on objective external processes. Such subjectivity and its concomitant uncertainty lead to competing theories that are subsequently pared down as some are proved to be incompatible with new observations.

Allowing subjectivity is a positive aspect of the scientific method: it allows for leaps of faith which occasionally lead to spell-binding proposals that prove to be valid. Some scientific studies have analyzed how subjectivity contributes to the progression of ideas, several of them in the geological sciences (Aspinall, 2010). Bond et al. (2012, p. 75 in this issue of Geology) showed a computer-generated seismic cross section, created from an underlying (invented) geological model, to several hundred individual geologists. The model included structural deformation and inversion of faults, with pre-, syn-, and post-deformational stratigraphic development. Each geologist interpreted the cross section to hypothesize a geological model; they also provided information about their academic and professional background. The concepts employed by each geologist were categorized (e.g., as dominantly diapirism, thrusting, extension, or inversion) and analyzed statistically. Importantly, the geologists’ background and experience correlated significantly with their likelihood of having invoked the correct concepts. Those with Master's or doctoral (Ph.D.) degrees were most likely to make a successful interpretation. Analysis of the techniques employed (e.g., feature identification, horizon picking, annotation, evolutionary sketches) showed that successful interpretations were most often obtained by using multiple techniques, particularly if these included evolutionary sketches; academic staff were notably successful because they tended to use multiple techniques. Thus, variations in prior experience are shown to bias the formation of evidence-based geological hypotheses.
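By way of illustration, the sketch below (with invented counts, not Bond et al.'s data) shows the kind of tabulation that underlies such an analysis: interpretation outcomes are tallied against the interpreters' qualifications, so that any association between experience and success can then be tested statistically.

```python
# Hedged sketch with hypothetical data: tally interpretation success by
# the interpreters' highest qualification, as a precursor to a formal
# statistical test of association.
from collections import defaultdict

# (highest qualification, invoked the correct tectonic concepts?) per hypothetical respondent
responses = [
    ("BSc", False), ("BSc", True), ("MSc", True), ("MSc", False),
    ("PhD", True), ("PhD", True), ("PhD", False), ("BSc", False),
]

counts = defaultdict(lambda: [0, 0])   # qualification -> [n_correct, n_total]
for qualification, correct in responses:
    counts[qualification][1] += 1
    if correct:
        counts[qualification][0] += 1

for qualification, (n_correct, n_total) in counts.items():
    print(f"{qualification}: {n_correct}/{n_total} correct ({n_correct / n_total:.0%})")
```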

Such biases in geologists are to be expected, because the processes by which they develop in experts in any field are well known to cognitive psychologists. These biases include over-confidence, anchoring and adjustment, availability, and motivational bias; definitions of each can be found in Kahneman et al. (1982) or O'Hagan et al. (2006). All such biases occur in situations of uncertainty (such as when forming hypotheses), when various heuristics (rules of thumb) are employed subconsciously.

Bond et al. use large numbers of geologists to identify such biases, which is not usually practical in interpretational settings. An alternative approach to analyzing bias is to use the theory of Elicitation: how to interrogate people in a manner designed to obtain the most reliable information (O'Hagan et al., 2006). Elicitation theory is cross-disciplinary, combining elements of statistics, cognitive psychology, and the field under investigation (here, geology). Structured elicitation methods, and even real-time optimization of the questions posed during elicitation, have been used to assess uncertainty and bias in expert opinions (e.g., Rankey and Mitchell, 2003; Curtis and Wood, 2004; Polson et al., 2009; Aspinall, 2010). In all cases, a facilitator manages the process of elicitation, and the entire system of facilitator, experts, and information flow may be analyzed using statistical techniques (Lindley et al., 1979).
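As a concrete illustration of such statistical treatment, the sketch below (not drawn from any of the cited studies; all probabilities and weights are hypothetical) shows a linear opinion pool, one common way a facilitator might combine experts' elicited probabilities, optionally weighting each expert by an assessed calibration score.

```python
# Minimal sketch of a linear opinion pool: a facilitator combines experts'
# elicited probabilities for a single event, weighting each expert (e.g., by
# an assessed calibration score). All numbers here are hypothetical.

def linear_opinion_pool(probabilities, weights=None):
    """Weighted average of experts' probabilities for one event."""
    if weights is None:
        weights = [1.0] * len(probabilities)   # equal weights by default
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, probabilities)) / total

# Hypothetical example: four experts' probabilities that a sealing cap rock exists.
p_seal = [0.9, 0.7, 0.55, 0.8]
print(linear_opinion_pool(p_seal))                        # equal weights -> 0.7375
print(linear_opinion_pool(p_seal, weights=[2, 1, 1, 1]))  # facilitator up-weights expert 1 -> 0.77
```

Note that the choice of pooling rule, and of the weights themselves, is a further subjective decision made by the facilitator, but one that is at least explicit and auditable.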

To reduce biases associated with individual experts, information is often elicited from groups of experts simultaneously. However, groups of experts are subject to additional biases caused by social influence, resulting in convergence, divergence, or herding behavior (Kahneman et al., 1982; Baddeley et al., 2004). Polson and Curtis (2010) conducted an experiment representing an asset-team environment in the hydrocarbon industry, in which a range of experts were asked to assess the potential of a prospective reservoir stratum (in their case, for CO2 storage). Four experts were asked to interpret existing geological and geophysical data to assess the likelihood of the existence of a particular fault, a specific reservoir stratum, and a sealing cap rock. The experts’ individual levels of certainty were quantified three times: days before the group meeting, just after the beginning of the meeting, and ∼5 min after the end of the meeting. During the meeting, the geologists were asked to reach a consensus position on their joint level of certainty through reasoned discussion.

Figure 1 shows the range of group-averaged individual experts’ uncertainties in whether the reservoir, seal, and fault exist, and the respective group consensus positions. Expert opinion changed significantly during the process, even in the absence of new information. For case C, the group consensus position was both the most extreme position and the one held with the highest degree of certainty (the narrowest range). The final positions shown in Figure 1 were obtained ∼10 min after the final consensus position had been agreed, and at this point one particular geologist was even shown to disagree with the consensus to which he had just agreed. In this case, there is a clear lack of objectivity in hypothesis formation due to group dynamics. However, subjectivity is also shown to be important: the consensus position in case C was not adopted by any geologist before the meeting, and without group dynamics it might not have been considered.
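For illustration only, the short sketch below uses invented numbers (not Polson and Curtis's data) to show how tracking individual probabilities before and after a meeting, as in Figure 1, exposes convergence toward, or divergence from, a consensus even when no new information has been introduced.

```python
# Hedged illustration with hypothetical numbers: compare each expert's elicited
# probability before and after a group meeting against the agreed consensus.

consensus = 0.9  # agreed group probability that, say, the fault exists

# hypothetical individual probabilities elicited days before and minutes after the meeting
before = {"expert_A": 0.55, "expert_B": 0.70, "expert_C": 0.60, "expert_D": 0.80}
after  = {"expert_A": 0.85, "expert_B": 0.90, "expert_C": 0.50, "expert_D": 0.90}

for name in before:
    # negative shift = the expert moved closer to the consensus after discussion
    shift = abs(consensus - after[name]) - abs(consensus - before[name])
    trend = "converged toward" if shift < 0 else "diverged from"
    print(f"{name}: {before[name]:.2f} -> {after[name]:.2f} ({trend} the consensus)")

# The spread (max - min) of individual opinions before and after the meeting gives a
# crude measure of how the apparent range of uncertainty changed around the consensus.
print("range before:", max(before.values()) - min(before.values()))
print("range after: ", max(after.values()) - min(after.values()))
```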

Similar dynamism of opinion was observed in a study by Phillips (1999): two sets of 10 experts estimated corrosion time scales of containers to be used for geological storage of nuclear waste. Even accounting for the experts’ own uncertainty estimates, the final results from the two groups were barely consistent. Also, in a computerized laboratory at the University of Edinburgh (UK) that tracks opinions dynamically throughout group elicitation sessions, all studies to date have exhibited similar dynamism in the opinions of geoscientific experts during discussion and scientific exchange.

The above studies significantly influence the way one should interpret consensus-driven results. Consensus positions clearly may represent the group opinion at only one instant in time, and may not represent the true range of uncertainty about the issue at hand (e.g., Fig. 1C). This is disturbing because consensus is often used in the geosciences. For example, the Intergovernmental Panel on Climate Change (IPCC; http://www.ipcc.ch/) has effected a significant shift in public opinion toward acceptance of the anthropogenic origin of relatively rapid, current climatic variations. However, IPCC conclusions are all consensus driven: positions agreed between groups of scientists. Group interactions might reduce individual biases, such as anchoring and over-confidence (such biases have nevertheless been recognized in IPCC results [Oppenheimer et al., 2007]). Yet the group consensus approach may also introduce dynamic biases (recognized by Mastrandrea et al. [2011]), which are more difficult to detect without tracking the dynamics of opinion.

It is interesting to note that Bayesian methods now enjoy widespread acceptance within the geological and other sciences (Tarantola, 2005). These inference methods are objective, being governed by mathematical laws, yet they explicitly represent (possibly subjective) prior information. From that prior position, inferences are formed by assimilating new data in a quantitative, probabilistic manner. Some non-Bayesian methods also allow geologists to influence quantitative inferences about the Earth directly, for example by interacting intuitively with the optimization of Earth model parameter values (Boschetti and Moresi, 2001; Curtis and Wood, 2004). Thus, a range of methods explicitly facilitate the use of subjective geological information within otherwise objective processes of scientific inference.
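As a minimal illustration of this Bayesian logic (with hypothetical numbers and a hypothetical seismic indicator, not an example from the cited works), the sketch below applies Bayes' rule to a binary hypothesis: a subjective prior degree of belief is updated by new data of assumed reliability, so two interpreters with different priors obtain different, but reproducibly computed, posteriors.

```python
# Hedged sketch of Bayes' rule for a binary hypothesis, with invented numbers:
# a subjective prior is combined objectively with new data of known reliability.

def posterior(prior, p_data_given_h, p_data_given_not_h):
    """P(hypothesis | data) for a binary hypothesis via Bayes' rule."""
    evidence = p_data_given_h * prior + p_data_given_not_h * (1.0 - prior)
    return p_data_given_h * prior / evidence

# Subjective prior: an interpreter's degree of belief that a sealing cap rock exists.
prior = 0.6

# Assumed reliability of a (hypothetical) seismic amplitude anomaly:
# observed above 80% of known seals, but also above 30% of non-seals.
p_anomaly_if_seal = 0.8
p_anomaly_if_no_seal = 0.3

print(posterior(prior, p_anomaly_if_seal, p_anomaly_if_no_seal))  # ~0.80

# A more skeptical interpreter (prior 0.3) reaches a different posterior from the
# same data, making the role of subjective prior information explicit and traceable.
print(posterior(0.3, p_anomaly_if_seal, p_anomaly_if_no_seal))    # ~0.53
```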

While Polson and Curtis (2010) and now Bond et al. (2012) show clearly that subjectivity affects geologists’ interpretations, whether made individually or in groups, the existence of subjectivity in forming hypotheses does not necessarily imply a lack of scientific rigour. When recognized explicitly, subjectivity may properly influence scientific inferences, and can also lead to novel hypotheses. Scientists should therefore not be ashamed of subjectivity, but should strive to develop methods to quantify, and sometimes to reduce, its effects.