Earth systems thinking (EST), or thinking of the Earth as a complex system made up of interworking subsystems, has been shown to reflect the highest level of knowing and understanding in the geosciences. Previous work has found four frameworks of EST that repeatedly appear in the geoscience education literature. This study aims to quantitatively build on this work by employing structural equation modeling to understand the current state of EST teaching as shown by the 2016 iteration of the National Geoscience Faculty Survey (United States; n = 2615). Exploratory and confirmatory factor analyses were conducted on survey items to understand and develop three models: one for EST teaching practices, one for course changes, and one for active-learning teaching practices. Analyses revealed that reported EST teaching practices relate back to the four EST frameworks proposed in the literature. The three models explored in this study were used to build a full structural model, in which we hypothesized that active-learning teaching practices would predict EST course changes and EST teaching. However, the model revealed that EST course changes mediate the relationship between active-learning teaching practices and EST teaching; that is, the relationship between active learning and EST teaching practices is not direct but operates through course changes. This implies the need for continued efforts to provide professional development opportunities in both active-learning teaching practices and EST, as active-learning practices alone are not sufficient to implicitly teach EST skills. Results also revealed that teaching approaches emphasizing modeling and complexity sciences had the weakest relationship to the broader EST teaching practices, suggesting a need for more professional development opportunities related to systems modeling, quantitative reasoning, and complexity sciences in the context of the Earth sciences.

The teaching of geosciences takes many forms and covers a broad range of disciplines. Unifying all geoscience teaching is the understanding that the Earth itself is a system composed of interacting subsystems. In order to understand and improve the current state of post-secondary geoscience education and the impact of professional development aimed to improve undergraduate classroom teaching, it is necessary to understand how current geoscience instructors are teaching about the Earth system and incorporating aspects of systems thinking into their classrooms. “Systems thinking” is a term originally used to indicate a holistic approach through which to account for dynamic interdependencies among parts, or in other terms, seeing a whole as more than the sum of its parts (Arnold and Wade, 2015). This way of thinking is transferable to a variety of fields and disciplines, including the geosciences; however, this way of thinking is commonly challenging for undergraduate students (Stillings, 2012). One of the major challenges in teaching the geosciences is helping students to develop Earth systems thinking (EST) skills.

To address these needs, we conducted a study to investigate how geoscience instructors are engaging in teaching systems thinking in their classrooms and how teaching practices, particularly active-learning practices, are related to EST teaching. We analyzed the results of the U.S. 2016 National Geoscience Faculty Survey (https://serc.carleton.edu/NAGTWorkshops/about/evaluation.html) to understand what EST teaching currently looks like in post-secondary geoscience classrooms. Based on work by Scherer et al. (2017), we wanted to see if the frameworks described in the literature manifested themselves in how survey items about EST teaching related to each other. We also examined the latent or underlying structure of participant responses to items related to changes or innovations that instructors were making to their courses, and if those changes were related to EST teaching practices.

Additionally, this study aims to understand whether instructors who are more engaged in active-learning strategies are more likely to engage in EST teaching practices. We hypothesized that they would be, based on the challenges associated with systems thinking (e.g., Herbert, 2006; Stillings, 2012), which are much more likely to be addressed by active learning than by traditional lecture. Lastly, this study aims to understand whether other factors relate to, influence, or bring about this hypothesized relationship. We hypothesized that active-learning practices would influence both EST teaching as well as changes or innovations made to current courses. In summary, our research questions ask: (1) What is the current state of EST teaching in geoscience classrooms?; (2) How do instructor strategies for teaching EST relate to instructors’ broader instructional practices?; and (3) How do instructor strategies for teaching EST relate to recent changes instructors have made to their courses?

Perspectives on Earth Systems Thinking (EST)

The geosciences encompass a broad range of disciplines and subdisciplines, all surrounding the study of the Earth. Phenomenographic work by Stokes (2011) demonstrated that the most advanced and complex conceptions of the geosciences involve the conceptions of interacting systems and the relationship between Earth and society. Thus, increasing interest has been paid to the development of EST in geoscience courses. This Earth systems perspective allows for an integrated view of the Earth as interacting parts, different from approaches to geoscience that involve presenting facts about the Earth, ocean, atmosphere, and life without highlighting the interactions between all components (Ireton et al., 1997). This Earth systems approach to education (Mayer, 1991; Ireton et al., 1997) has been particularly well documented in literature about primary and secondary science education (NGSS Lead States, 2013; Orion and Libarkin, 2014; College Board, 2019) and geoscience workforce expertise (U.S. Bureau of Labor Statistics, 2015).

Naturally, EST is predicated on the learner’s systems thinking abilities (Orion and Libarkin, 2014). Systems thinking and its development are not unique to the geosciences and have been explored in a wide range of fields, highlighting the transdisciplinary nature of systems thinking (Trujillo and Long, 2018). The idea of EST, however, goes beyond pure systems thinking and considers the role that humans play within the Earth system and the inherent complexities that come along with those interactions (Manduca and Kastens, 2012; Gosselin et al., 2013; Orion and Libarkin, 2014; InTeGrate Program, 2015; Orr et al., 2016; Kastens and Manduca, 2017). Stillings (2012) synthesized many of the ideas about EST, particularly as they relate to complexity, and outlined many future pathways for curriculum, instruction, and research on complex Earth systems.

Earth Systems Frameworks

A literature review was conducted by Scherer et al. (2017) to determine the current state of the study of learning and teaching the Earth system in geoscience education research. This work built on foundations laid down by Stillings (2012) and identified four conceptual frameworks (capitalized for clarity) relating to complex systems within the geoscience education research literature: Earth Systems Perspective, Earth Systems Thinking Skills, Complexity Sciences, and Authentic Complex Earth and Environmental Systems (Fig. 1). The Earth Systems Perspective framework focuses on the interactions of the four major spheres of the Earth system (lithosphere: solid Earth; biosphere: life; atmosphere: gaseous envelope surrounding the Earth; and hydrosphere: water and ice) and their complex interconnections. This framework is concerned with the interdisciplinary nature of the Earth system and limits systems thinking to conceptualizing the Earth system as a whole and commonly includes aspects of human interactions and environmental decision-making (Davies, 2006).

The second framework, Earth Systems Thinking Skills, emphasizes systems thinking skills, particularly as they relate to cyclic and dynamic thinking. Work by Assaraf and Orion (2005) on applying systems thinking to the transformation of matter in Earth cycles—e.g., the water cycle—exemplifies this framework. This framework also includes feedback loop identification as well as the understanding of underlying causes of processes. Thus, this Earth Systems Thinking Skills perspective can be differentiated from the Earth Systems Perspective by its emphasis on the inclusion of specific systems thinking skills and abilities. While systems thinking is highly emphasized in this framework, heavy use of computer modeling or consideration of complex systems or chaos theory are not included.

The framework of Complexity Sciences largely pulls from the theoretical tradition of the interdisciplinary study of complex systems. This framework is embodied by a wide array of studies that have considered complexity science from the lens of systems dynamics (Shepardson et al., 2014), complex systems theory (including mathematical approaches) (Fichter et al., 2010), and Gaia theory (explaining the environmental conditions of Earth in terms of biological forcing) (Haigh, 2001, 2014). Structure-behavior-function analysis, which emphasizes thinking about how a system works and its function rather than focusing on the components, is also included in this framework and is evidenced in work by Hmelo-Silver et al. (2014). Computer modeling work commonly falls within this framework.

The final framework of Authentic Complex Earth and Environmental Systems pulls its systems ideas from the study of real-world environmental or ecological activities. This framework commonly involves intentional connections to human activities and environmental decision-making. The systems thinking emphasis is on a real-world environmental system or phenomenon, thus incorporating complexity and looking beyond any one process or single component (Herbert, 2006). This framework, more so than the other three, is highly contextualized. Examples of work done in this area include studies on student reasoning on real-world systems like coastal eutrophication (Sell et al., 2006; McNeal et al., 2008), ecosystem dynamics (Grotzer et al., 2013; Sutter et al., 2018), soil microbial activity (Appel et al., 2014), and socio-hydrologic systems (Gunckel et al., 2012; Sabel et al., 2017; Forbes et al., 2018; Petitt and Forbes, 2019).

These four frameworks, each with strengths and limitations, represent four ways in which educators and researchers are employing EST. These frameworks also offer a more focused way to consider EST instructional practices than the often-nebulous umbrella of complex systems.

EST and Active Learning

The complex nature of Earth systems, regardless of framework, makes it challenging to teach and to understand students’ learning and conceptual change. Herbert (2006) identified three challenges associated with reasoning about complex systems. The first is that many Earth processes occur at spatial and temporal scales beyond human experience (Dodick and Orion, 2003; Giorgi and Avissar, 1997). The second is that it is difficult to develop accurate conceptual models of complex systems in which a number of variables control the behavior of the system (Berger, 1998). The third is the tendency of individuals to oversimplify systems as being at or near equilibrium, disregarding far-from-average observations as mere “noise.” This is problematic because many systems are not at or near equilibrium, and the Earth system in particular moves farther from equilibrium due to human influence (Goldenfeld and Kadanoff, 1999). Considering this, Herbert (2006) emphasized inquiry-based and other active-learning strategies as necessary for helping students understand complex Earth systems.

Active learning is typically defined as any instructional method that engages students in the learning process (Bonwell and Eison, 1991). Frequently, we contrast active-learning practices with more passive traditional lecture; thus, active learning is sometimes considered anything that is not traditional lecture (Prince, 2004). Work by Macdonald et al. (2004, 2005) has explored active-learning teaching practices within the geoscience community through other iterations of the National Geoscience Faculty Survey. Additional work by Kastens et al. (2009) has explored how geoscientists think and learn and the role active-learning teaching practices can play in geoscience classrooms. Active learning, particularly problem-based learning, has been shown to positively affect student achievement, minimize misconceptions, and positively contribute to students’ conceptual development (Akınoğlu and Tandoğan, 2007). Work by Holder et al. (2017) emphasized the importance of problem-solving and its relationship to student conceptualization of the Earth as a system. Problem-solving is a skill that cannot be learned passively and is a central component of understanding complex systems; thus, active learning is essential to helping students experience actual problem-solving tasks and develop this problem-solving expertise. Due to the complexity of the Earth system, we hypothesize that active-learning practices may be key in helping students begin to make sense of the interrelationships among the Earth systems and their interdisciplinary nature.

In this study, we aim to understand the current state of EST teaching by post-secondary geoscience instructors based on results of the 2016 administration of the National Geoscience Faculty Survey. The National Geoscience Faculty Survey was initially developed in 2004 as part of On the Cutting Edge, a U.S. National Science Foundation (NSF)–funded professional development program for geoscience faculty sponsored by the National Association of Geoscience Teachers (NAGT). In our analysis of the survey results, we took a multipronged approach looking at distinct parts of the survey. We analyzed survey items relating to the teaching of EST, course change, and active-learning practices using exploratory and confirmatory factor-analytical procedures to find the latent structure of items relating to these overlying themes or constructs. This was used to develop models of various teaching practices related to teaching EST skills, course change, and active-learning practices. Through exploratory factor analysis, we identified items that grouped together on overlying constructs. For clarity’s sake, these constructs identified through factor analysis will be italicized (e.g., EST Teaching is a construct made up of correlated survey items relating to it, as identified through exploratory factor analysis). We then developed a full structural equation model to understand how these constructs relate to each other.

Participants

The target audience was instructors teaching college-level geoscience courses. The survey was refined over multiple iterations (discussed below), and for this study, the 2016 results were used (Manduca et al., 2017; Macdonald et al., 2005; Lally et al., 2019). The sampling frame for 2016 was composed of the following lists of geoscience faculty: the American Geosciences Institute membership list obtained with permission for this use (AGI; Alexandria, Virginia, USA, https://www.americangeosciences.org); the On the Cutting Edge professional development program participant list obtained with permission from the PIs; a list of faculty at Texas two-year colleges generated from public websites of two-year colleges; the Supporting and Advancing Geoscience Education at Two-Year Colleges (SAGE 2YC) list obtained with permission from the PIs; an additional set of On the Cutting Edge participants specific to the Early Career workshop obtained with permission from the PIs; a geosciences two-year colleges list composed of instructors from two-year colleges generated from public institutional websites with guidance from regional contacts in New York, Wisconsin, Oregon, Washington State, Idaho, and Illinois; and a list of atmospheric science faculty generated from public institutional websites linked from the American Meteorological Society website. After removing 2116 duplicates and 81 names without e-mail addresses, the total number of eligible individuals was 10,910. The survey was piloted in September 2016 with a sample of 200 individuals randomly selected from the survey sampling frame. A total of 33 individuals completed at least one question of the pilot survey. Based on the results of the pilot survey, a few minor changes were made to the final survey. As none of these changes were sufficient to alter the meaning or order of the questions, the results of the 33 completed pilot surveys were included in the data set.

The survey was conducted with the remaining sample of individuals between 19 October and 6 November 2016. Individuals were contacted up to four times or until they took the survey. Messages to 1296 e-mail addresses were returned as bad or invalid. From this sample, 9596 individuals were identified as eligible to participate in the survey, as they had a legitimate, functioning e-mail address and were actively teaching post-secondary geoscience courses. Of the survey respondents, 60.9% reported having a geology or geophysics disciplinary focus, 8.0% reported an oceanography or marine science disciplinary focus, 9.1% identified an atmospheric science or meteorology disciplinary focus, 8.9% reported a geoscience education or science education disciplinary focus, and 13.0% indicated some other disciplinary focus. Among the respondents, 88.9% reported that a Ph.D. or doctorate was their highest completed degree level, while 11.1% indicated that a master’s degree was their highest degree level. In terms of courses taught, 2290 participants reported teaching undergraduate classes, and 123 reported teaching graduate classes. Regarding undergraduate geoscience courses, 539 participants reported teaching introductory courses targeted toward a general audience, 570 reported teaching a major (non-introductory) course, and 1053 reported teaching introductory courses geared primarily toward majors. This study looked at all instructors, regardless of faculty type, university type, education level, or type or level of courses taught.

There was some slight response bias, as survey respondents were more likely to be tenured or tenure-track faculty rather than instructors, lecturers, adjunct faculty, or other faculty types (28% of contacted professors, associate professors, and assistant professors responded to the survey, versus 21% of contacted instructors, lecturers, adjuncts, and others; χ2 = 33.38, df [degrees of freedom] = 1, p < 0.001). Survey respondents were also less likely to teach at research and/or doctoral institutions and more likely to teach at master’s, baccalaureate, two-year college, and other institution types (23% of contacted faculty from research and/or doctoral institutions responded to the survey, while 28% of contacted faculty from the other institution types responded; χ2 = 36.64, df = 1, p < 0.001).

Materials

On the Cutting Edge developed the National Geoscience Faculty Survey, and NAGT conducted this national survey in 2004, 2009, 2012, and 2016 (Manduca et al., 2017; Macdonald et al., 2005). The instrument was initially developed in 2003, and On the Cutting Edge leadership modified it in 2009 and 2012 based on the results of prior administrations. Revisions to the survey took place after each iteration, and revisions for the 2016 survey were developed by leadership from On the Cutting Edge, InTeGrate (https://serc.carleton.edu/integrate/index.html), Supporting and Advancing Geoscience Education at Two-Year Colleges (SAGE 2YC), and NAGT with expertise from Greenseid Consulting Group, LLC (http://www.greenseidgroup.com/), and Professional Data Analysts, Inc. (https://www.pdastats.com/). The survey instruments for all administrations can be viewed from the On the Cutting Edge Evaluation summary web page (https://serc.carleton.edu/NAGTWorkshops/about/evaluation.html). The survey broadly explores three questions: (1) How are faculty teaching undergraduate courses?; (2) How do faculty learn about the content and methods that they use in their teaching?; and (3) How do faculty share with their colleagues what they learn about teaching? The survey follows a similar structure across years and has three parts:

  1. The first section consists of demographic questions about education and experience teaching, disciplinary focus, and position and teaching responsibilities.

  2. The second section asks respondents to self-report about specific courses they have taught in the past two years, the design of these courses, and the teaching methods, strategies, content, and assessment approaches they used in their implementation of the course. It is from this section that we conducted exploratory factor analyses to understand EST teaching strategies as well as active-learning teaching styles.

  3. The third section asks questions about how participants learned content and methods as well as information about any changes that were made to a course in the past year. We derived information on incorporating EST course change elements from this section.

The survey consisted of 209 questions with a median completion time of 14.4 min. Respondents answered questions about (1) disciplinary focus, teaching background, and institution; (2) introductory-level course teaching strategies; (3) major and minor course teaching; (4) learning new teaching methods, including active-learning strategies, and course changes; (5) communication within the geosciences community and their reasons for attending teaching workshops; and (6) use of online resources, articles published, and conference presentations (Macdonald et al., 2005). Survey questions included a variety of item types with varied response options, including open response, yes/no, and frequency responses.

Structural Equation Modeling as a Tool for Assessing Survey Results

Surveys are an excellent tool for gathering a wide variety of information from a broad cross-section of a target population. Survey data can be vast in terms of both scope and number of responses, and there are many tools with which to analyze the data. One way to examine the current state of teaching in the geosciences is to utilize structural equation modeling to make sense of the latent or underlying structure of the survey results. This approach also makes apparent the complex relationships between responses to survey items and allows researchers to identify direct and indirect relationships between various items and the overlying constructs they represent. Structural equation modeling (SEM) encompasses a variety of techniques that allow researchers to model the relationships among both observed and latent (unobserved) variables. The unobserved variables are commonly referred to as constructs (Pituch and Stevens, 2016). This methodology takes a confirmatory, or hypothesis-testing, approach to understanding causal processes through a series of regression, or structural, equations. These structural relations are also modeled pictorially in order to provide a clearer conceptualization of the theory being studied (Byrne, 2016).

Though based on regression, SEM has the advantage of allowing analysis of all of the variables in a model simultaneously instead of separately (Fornell and Larcker, 1987; Chin, 1998). Due to the large number of participants in this survey (n = 2615), SEM can be used to understand the relationships between participant responses to various items and in this case to quantify and examine how EST teaching practices are manifesting themselves in current practice by a broad swath of geoscience instructors. SEM also allows us to examine what other latent variables can be gleaned from survey responses and how those variables influence EST teaching.
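The core idea behind the latent variables SEM estimates can be sketched with simulated data: a single unobserved construct generates several observed survey items, and the items correlate with one another only because they share that latent cause. All names, loadings, and values below are hypothetical illustrations, not the survey's data.

```python
import numpy as np

# Illustrative sketch only (synthetic data, not the survey's): one
# unobserved construct drives three observed "survey items."
rng = np.random.default_rng(0)
n = 2615  # matches the survey's number of respondents

latent = rng.normal(size=n)            # the unobserved construct
loadings = np.array([0.8, 0.7, 0.6])   # strength of each item's link to it
items = latent[:, None] * loadings + rng.normal(size=(n, 3))

# The items correlate only through their shared latent cause; factor
# analysis and SEM work backward from such correlation patterns.
corr = np.corrcoef(items, rowvar=False)
print(corr.round(2))
```

Factor analysis and SEM invert this generative picture, estimating the loadings and latent structure from the observed correlation matrix.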

Statistical Analysis

The statistical software used to analyze the data was IBM SPSS Statistics 23 and SPSS Amos 23. The data were used to develop four models: EST teaching, teaching changes, active learning, and a full structural model of the relationships among the first three. The whole survey data set was randomly split in half using SPSS in order to use one half for exploratory factor analyses and the other for confirmatory factor analyses, establishing cross-validation. The items that were used in the exploratory factor analyses were then analyzed using Little’s MCAR (missing completely at random) test, which found that missing data could be treated as missing completely at random (p = 0.255). Expectation maximization was then used to impute missing data. For the confirmatory factor analyses, full information maximum likelihood in Amos was used to create unbiased estimates of missing data. For all exploratory factor analyses, a combination of Kaiser’s criterion and a scree analysis was used to determine the number of factors. Because the factors were expected to be correlated, an oblique (direct oblimin) rotation was used in all cases. A pattern loading of 0.30 or higher was the criterion used to determine which items loaded onto which factors for all exploratory factor analyses (Byrne, 2016).
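Kaiser's criterion can be sketched numerically: retain as many factors as there are eigenvalues of the item correlation matrix greater than 1 (a scree plot inspects the same eigenvalues visually). The data below are synthetic stand-ins with assumed cluster sizes and noise levels, not the survey items.

```python
import numpy as np

# Hedged sketch of Kaiser's criterion with synthetic data: two underlying
# factors each drive three "survey items," so two eigenvalues of the item
# correlation matrix should exceed 1.
rng = np.random.default_rng(1)
n = 1300  # roughly one randomly selected half of the sample

f1, f2 = rng.normal(size=(2, n))
items = np.column_stack(
    [f + rng.normal(scale=0.8, size=n) for f in (f1, f1, f1, f2, f2, f2)])

# Eigenvalues of the correlation matrix, largest first.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
n_factors = int((eigvals > 1).sum())   # Kaiser's criterion
print(eigvals.round(2), "-> retain", n_factors, "factors")
```

In practice Kaiser's criterion and the scree plot are used together, since the eigenvalue-greater-than-1 rule alone can over- or under-extract factors.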

Initially, we identified items related to EST. The survey included nine items written to address systems thinking skills (Table 1), which we analyzed in a previous study (Lally et al., 2019). However, based on the work of Scherer et al. (2017), it was clear that these only addressed the frameworks of Earth Systems Thinking Skills and Complexity Sciences. Therefore, we examined the survey for other items relating to the remaining frameworks (Earth Systems Perspective and Authentic Complex Earth and Environmental Systems, specifically including human interactions with the Earth system and interdisciplinary thinking). Items related to quantitative reasoning and data analysis were also included in order to align with the Complexity Sciences framework of EST (Scherer et al., 2017). After this inspection of the survey, the 12 items in Table 2 were incorporated into the exploratory factor analysis. The 21 items in Tables 1 and 2 were shown to have reasonable internal consistency with a Cronbach’s alpha of 0.77. All items were included with the assumption that they would not be utilized if they did not load on a factor. For all exploratory factor analyses, a maximum likelihood analysis was used.
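The internal-consistency values reported throughout (e.g., α = 0.77 here) follow the standard Cronbach's alpha formula, which can be computed directly from an item-score matrix. The data below are simulated for illustration, not the survey responses.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic illustration (hypothetical values): six items sharing a common
# factor yield moderate-to-high internal consistency.
rng = np.random.default_rng(2)
common = rng.normal(size=500)
scores = np.column_stack([common + rng.normal(size=500) for _ in range(6)])
print(round(cronbach_alpha(scores), 2))
```

Alpha approaches 1 as items become interchangeable measures of one construct, which is why the lower values reported later (e.g., 0.54 for the course-change items) signal weaker internal consistency.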

After a latent structure was hypothesized using exploratory factor analysis, we imposed this structure on the second half of the data in a confirmatory factor analysis using Amos. We examined this model for fit and significance of all paths (p < 0.05). In all confirmatory factor analyses, unit loading identification was used to ensure that the model was identified. Throughout fit analyses, the following fit indices were used: ratio of χ2 to degrees of freedom, comparative fit index (CFI), and root mean square error of approximation (RMSEA). We also used the Akaike information criterion (AIC) to compare initial models with pruned models if pruning was necessary. Pruning means removing components of a model in order to improve fit. Table 3 summarizes the various fit statistics employed.
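The CFI and RMSEA values reported later can be computed from model and baseline χ² statistics using the standard formulas, sketched below. The numeric inputs in the example are illustrative assumptions, not this study's actual model and baseline-model output.

```python
import math

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index: 1 minus the model's excess chi-square
    relative to the baseline (independence) model's excess."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m, 1e-12)
    return 1.0 - d_m / d_b

def rmsea(chi2, df, n):
    """Root mean square error of approximation for sample size n."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values (assumed, not from this study's output):
print(round(cfi(100, 50, 1000, 60), 3))   # closer to 1 indicates better fit
print(round(rmsea(100, 50, 1300), 3))     # closer to 0 indicates better fit
```

Both indices penalize the amount by which the model χ² exceeds its degrees of freedom, so a χ²/df ratio near 1 drives RMSEA toward 0 and CFI toward 1.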

We used a similar procedure with items relating to changes instructors made in their courses (Table 4). The survey did not include specific items targeting EST teaching changes; therefore, we identified items related to course changes connected with Scherer et al.’s (2017) EST frameworks. These items had a Cronbach’s alpha of 0.54, indicating low internal consistency. These items were included in an exploratory factor analysis with the assumption that we would eliminate items that did not load from the confirmatory factor analysis. We analyzed the confirmatory factor analysis for significance of all paths as well as the model fit statistics discussed earlier.

Previous work by Manduca et al. (2017) identified a three-factor structure that characterized learning profiles as active learning, active lecture, and traditional lecture (Table 5). Because this structure has already been established in the literature, we did not conduct an exploratory factor analysis of these constructs. Rather, in this study we imposed this structure on the 2016 data in a confirmatory factor analysis. The traditional lecture factor was not used, as only two inversely related items loaded onto it; additionally, it is not related to active learning and thus not relevant to this study. We analyzed the confirmatory factor analysis for the significance of all paths as well as the fit statistics discussed previously. These items had a Cronbach’s alpha of 0.49.

Upon the completion of the confirmatory factor analyses, we developed a full structural model using the three previously mentioned measurement models (EST teaching, teaching changes, and active learning). To understand model fit, an initial model was developed using active learning as a predictor for EST teaching. We assessed this model for fit and significance of paths. We then added EST teaching changes to the model to analyze its relationship to the other two latent variables. Other demographic information from the survey, such as teaching experience and percent time spent in active learning, was also included in the structural model to evaluate best fit. Throughout the process, we analyzed the model to see how to modify it to improve fit. Throughout fit analyses, we used the same fit indices as for the confirmatory factor analyses: ratio of χ2 to degrees of freedom, CFI, and RMSEA. We also used the AIC to compare initial models with pruned models.
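The mediation logic examined in the full structural model (does active learning relate to EST teaching only through course changes?) can be sketched with simple regressions on simulated data. The variable names, path coefficients, and the purely indirect effect below are assumptions for illustration, not the study's estimates, and ordinary least squares stands in for the full SEM machinery.

```python
import numpy as np

# Hypothetical mediation sketch: Active Learning -> EST Course Changes ->
# EST Teaching, with no direct path from active learning to teaching.
rng = np.random.default_rng(3)
n = 2615
active = rng.normal(size=n)                     # Active Learning
changes = 0.6 * active + rng.normal(size=n)     # EST Course Changes (mediator)
teaching = 0.7 * changes + rng.normal(size=n)   # EST Teaching (outcome)

def slope(x, y):
    """OLS slope of y regressed on x."""
    x = x - x.mean()
    return float(x @ (y - y.mean())) / float(x @ x)

total_effect = slope(active, teaching)  # nonzero: active learning "predicts" EST

# Direct effect of active learning once the mediator is controlled for:
X = np.column_stack([np.ones(n), active, changes])
beta = np.linalg.lstsq(X, teaching, rcond=None)[0]
print(round(total_effect, 2), round(beta[1], 2))  # direct effect shrinks toward 0
```

When the coefficient on `active` collapses once `changes` enters the model, the association is carried by the mediator, which mirrors the indirect (mediated) relationship between active learning and EST teaching reported in this study.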

Model 1: EST Teaching

An exploratory factor analysis was used to explore the factor structure of items relating to EST teaching practices. This factor analysis was conducted on one randomly selected half of the data, and a three-factor structure was explored (Table 6). Items loading onto factor 1, which we name Systems Thinking Elements, represented classroom teaching practices that included: discussing a change that has multiple effects in a system, analyzing feedback loops, discussing complexity of scale and interactions, and describing a system in terms of its parts and relationships. Items loading onto factor 2, which we name Systems Model Elements, represented classroom teaching practices that included: building predictive models, exploring systems behaviors using computer models, having students collect their own data and analyze them to solve a problem, and addressing uncertainty, non-uniqueness, and ambiguity when interpreting data. Items loading onto factor 3, which we name Real-World Application Elements, represented classroom teaching practices that included: addressing a problem of global or national interest, working on a problem of interest to the local community, addressing environmental justice issues, and addressing a problem that required bringing together geoscience knowledge with knowledge from another discipline. A four-factor structure was examined, but it included an unstable fourth factor, with only one item loading on it. In both structures, discussing relationships between implications and predictions failed to load, and thus we excluded it. After eliminating the excluded items, the remaining items had a Cronbach’s alpha of 0.66.

The second half of the data was used to perform a confirmatory factor analysis for cross-validation before putting this measurement model into a full structural model (Fig. 2). Because this measurement model was used in a full structural model, the factors, rather than being correlated, were represented as making up a broader latent factor of EST Teaching. The confirmatory factor analysis indicated that all loadings were significant. It also indicated acceptable fit with a χ2 to degrees of freedom ratio of 4.682, a CFI of 0.86, and an RMSEA of 0.054 (0.047, 0.061). (Parentheses indicate the confidence interval for RMSEA values.)

This model can be interpreted to mean that instructors engaging in EST teaching practices are employing three broad instructional techniques: incorporating systems thinking elements into their content, incorporating modeling and quantitative systems approaches, and bringing in real-world system application elements. Based on its lower factor loading, it is clear that the systems model elements are the most distinct from the other instructional practices.

Model 2: EST Teaching Changes

An exploratory factor analysis was conducted on the survey items relating to changes that instructors had recently made to their courses. In this case, we explored a two-factor structure (Table 7). Items loading onto factor 1, which we name Adding Environment and Society Elements, were: including recent geological events, increasing emphasis on environmental issues, and adding content linking geoscience to societal issues. Items loading onto factor 2, which we name Adding Quantitative and Systems Thinking Elements, were: increasing emphasis on systems thinking, increasing focus on quantitative skills, and increasing focus on communication skills. We explored a three-factor structure, but it featured an unstable third factor, with only one item loading on it, so the two-factor structure best fit the data. In both structures, several items failed to load, including updating content with the latest research findings, changing textbooks, and reorganizing topics covered. After excluding the items that failed to load, the remaining items had a Cronbach’s alpha of 0.55.

We used the second half of the data to perform a confirmatory factor analysis for cross-validation before putting this measurement model into a full structural model. Because we would be using this measurement model in a full structural model, rather than correlating the factors, they were set to make up a broader latent factor of EST Teaching Changes (Fig. 3). The confirmatory factor analysis indicated that all loadings were significant. It also indicated good model fit, with a χ2 to degrees of freedom ratio of 3.110 and an RMSEA of 0.041 (0.023, 0.059). A CFI of 0.937 indicated acceptable fit.

The results can be interpreted to mean that when it comes to making changes in course content, instructors are engaging in two main practices to incorporate more systems thinking: (1) adding elements relating to the environment and society, and (2) adding explicit quantitative and systems thinking elements.

Model 3: Active Learning

We used the full data set to perform a confirmatory factor analysis for cross-validation before putting the active-learning measurement model into a full structural model. This was based on an exploratory factor analysis previously completed by Manduca et al. (2017) with a past iteration of the survey. Based on that work, which identified the factors as active learning and active lecture, in this study we name the factors Student-Centered Practices and Mixed-Centered Practices (student- and instructor-centered) based on the items themselves. Because we were using these in the full structural model, we hypothesized that they would make up a broader construct of Active Learning (Fig. 4), which relates to the use of active-learning teaching practices. The confirmatory factor analysis indicated that all loadings were significant. The χ2 to degrees of freedom ratio of 14.084 did not indicate good fit; however, an RMSEA of 0.071 (0.055, 0.088) and a CFI of 0.914 indicated acceptable fit. Thus, because the model was based on a hypothesized structure from earlier work (Manduca et al., 2017), it was included in the full structural model.

This model, based on the survey results, shows two main active-learning approaches employed by geoscience instructors: those that are very student centered, and those that are a compromise between student and instructor (or mixed) instruction.

Model 4: Full Structural Model

A full structural model (Fig. 5) was developed using the aforementioned measurement models. It was hypothesized that Active Learning (model 3) teaching practices would predict both EST Teaching Changes (model 2) and EST Teaching (model 1). The full model (model 4), however, showed all paths to be significant (p < 0.001) except for Active Learning as a predictor of EST Teaching; thus, the relationship between Active Learning and EST Teaching was mediated by EST Teaching Changes. The initial model’s fit was not ideal, with a χ2 to degrees of freedom ratio of 5.607, a CFI of 0.812, and an AIC of 1274.987. To address this, the weakest loading, that of Systems Model Elements as a latent variable underlying EST Teaching, was pruned. This decision was made based on the theoretical basis laid out by Scherer et al. (2017), in which quantitative reasoning and computer modeling are largely aspects of the Complexity Sciences framework of EST, and on its relatively weaker loading. A comparison of fit between the two models can be seen in Table 8. Thus, this pruned model may be more applicable than the initial model to the other three frameworks or to EST practices that do not include modeling or extensive use of quantitative data. The new model’s fit was improved, with a χ2 to degrees of freedom ratio of 4.272, a CFI of 0.894, and an AIC of 655.205. An RMSEA of 0.035 (0.032, 0.038) also indicated good fit. In this model, Active Learning predicted 9% of the variance in EST Teaching Changes (r2 = 0.09), and EST Teaching Changes accounted for 91% of the variance in EST Teaching (r2 = 0.91).

In this model, the path between Active Learning and EST Teaching is significant at p = 0.02; however, all other paths are significant at p < 0.001. This noticeably smaller path coefficient suggests a mediation effect of EST Teaching Changes on the relationship between Active Learning and EST Teaching. A model that excludes EST Teaching Changes and models only the relationship between Active Learning and EST Teaching has a much higher path coefficient of 0.45 (p < 0.001), indicating that EST Teaching Changes is a variable mediating that relationship.
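The mediation logic described here can be illustrated with a toy linear simulation (entirely synthetic, not the survey data): when a predictor X affects an outcome Y mostly through a mediator M, the total effect of X on Y decomposes exactly into the direct path plus the product of the X→M and M→Y paths, and the direct path shrinks once M enters the model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
active = rng.normal(size=n)                   # X: "Active Learning" (synthetic)
changes = 0.3 * active + rng.normal(size=n)   # M: "EST Teaching Changes"
teaching = (0.9 * changes + 0.05 * active     # Y: "EST Teaching", mostly via M
            + rng.normal(scale=0.3, size=n))

def ols(y, *xs):
    """OLS coefficients (after the intercept) of y on the given predictors."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

c_total = ols(teaching, active)[0]            # total effect of X on Y
a = ols(changes, active)[0]                   # X -> M path
c_prime, b = ols(teaching, active, changes)   # direct X -> Y and M -> Y paths
indirect = a * b                              # mediated (indirect) effect

# For linear models the decomposition is exact: c_total = c' + a*b,
# and in this simulation the indirect effect dominates the small direct effect
```

This mirrors the pattern reported above: the Active Learning → EST Teaching path weakens sharply once EST Teaching Changes is in the model, while the indirect path through course changes carries most of the relationship.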

Adding the construct of EST Teaching Changes to the model reveals that Active Learning is indirectly, rather than directly, related to EST Teaching. This indicates that instructors who are involved in practices that support active learning are more likely to make changes to the curriculum that involve practices related to EST. In turn, it is the element of course change that brings about enhanced EST teaching practices. It is also important to note that the Active Learning construct from work by Manduca et al. (2017) was a superior predictor compared to the self-reported percentage of time spent on active learning when the latter was incorporated into the model.

Earth systems thinking (EST) reflects an advanced state of competency in the geosciences (Stokes, 2011) and is a core focus of both standards and outcomes for geoscience teaching and learning (NGSS Lead States, 2013; Orion and Libarkin, 2014; College Board, 2019) and preparation for the geoscience workforce (U.S. Bureau of Labor Statistics, 2015). However, EST has been shown to be challenging for students given the inherent complexity of Earth systems involving both natural and human dimensions (Berger, 1998; Dodick and Orion, 2003; Giorgi and Avissar, 1997; Goldenfeld and Kadanoff, 1999; Herbert, 2006; Stillings, 2012). Scherer et al.’s (2017) literature review on EST in geoscience education foregrounded four conceptual frameworks supported by empirical research. The purpose of our study was to investigate these frameworks for EST and examine the current state of EST teaching by geoscience faculty members using data from the 2016 National Geoscience Faculty Survey (Lally et al., 2019; Manduca et al., 2017; Macdonald et al., 2005). Specifically, we sought to better understand how EST teaching, innovative teaching (i.e., active learning; Bonwell and Eison, 1991), and course innovation (EST teaching changes) are related to each other. Structural equation modeling including exploratory and confirmatory factor analyses allowed us to discover how survey items relate to these constructs, and in turn how these constructs relate to each other.

Model 1: EST Teaching

We found that three main teaching strategies make up the broader concept of EST teaching: incorporating systems thinking elements, incorporating systems model elements, and incorporating real-world application elements. These general practices relate back to the Earth systems frameworks identified by Scherer et al. (2017) (Table 9). The first practice of adding systems thinking elements corresponds to aspects of both the Earth Systems Perspective framework and the Earth Systems Thinking Skills framework, which together involve interconnections between parts of a system and applying systems-thinking vocabulary and concepts to a system. The practice of adding systems model elements corresponds very well to the Complexity Sciences framework of EST, which includes a strong quantitative and modeling component. Finally, the practice of adding real-world application elements corresponds to the Authentic Complex Earth and Environmental Systems framework as well as to aspects of the Earth Systems Perspective framework. This provides an excellent quantitative analogue to the qualitative work that was based on the existing literature on EST teaching and learning. It also confirms that these frameworks, which are grounded in the literature, are being expressed in the self-reporting of many practicing post-secondary geoscience educators. Based on the structure of the survey, it is not surprising that the Earth Systems Perspective framework was not distinct, as no survey items seemed to represent it explicitly. It is also interesting to note that of the three general teaching practices identified from the survey (incorporating systems thinking elements, systems model elements, and real-world application elements), practices involving systems model elements had the weakest relationship to the other EST instructional practices. This indicates that incorporating systems models into teaching is not as strongly correlated with other EST teaching practices, meaning that these practices are either done in isolation or not done as frequently as other EST teaching practices.

Model 2: EST Teaching Changes

When we examined items related to teaching changes, we initially considered a variety of changes to curricula that survey respondents indicated they had made to their reported courses in the past two years. Interestingly, only items related to EST loaded on broader constructs, while items like changing content sequence or textbook failed to load. This suggests that when making course changes, instructors are intentionally or unintentionally implementing a suite of changes that enhance the teaching of EST, changes that are unrelated to simpler modifications like reordering course topics or switching textbooks. This has positive implications, as it means that instructors are collectively making changes that bring both more quantitative and more society-based elements into their courses. These changes corresponded to the frameworks previously discussed, with one change featuring a clearly quantitative and data-based component and another related to interdisciplinary and environmentally based teaching (Scherer et al., 2017; Table 10). Instructors who are adding more environment and society elements to their courses are likely making changes that reflect the Earth Systems Perspective and the Authentic Complex Earth and Environmental Systems frameworks. Those who are adding quantitative and systems thinking elements are incorporating components of the Earth Systems Thinking Skills and the Complexity Sciences frameworks. It was surprising that increased focus on communication skills was part of the broader construct of adding quantitative and systems thinking elements, but this may be due to the relationship between communication and problem solving (Holder et al., 2017). It may also indicate that as instructors add quantitative and systems thinking elements, perhaps in the context of problem solving, they also increase their emphasis on communication skills.

Model 3: Active Learning

The confirmatory factor analysis on teaching style, based on earlier work with past iterations of the survey by Manduca et al. (2017), was verified and applied in this study. While Manduca et al. used their exploratory factor analysis to create teaching profiles (e.g., active lecture, active learning), our study interpreted these profiles as two aspects of active learning, one being highly student centered and the other mixing instructor- and student-centered practices. The Mixed-Centered Practices, or active-lecture, construct included lectures with demonstrations or lectures mixed with individual and whole-class questions. The Student-Centered Practices, or active-learning, construct from Manduca et al. included in-class exercises and whole-group discussions. As these did not include traditional lecture, in our study they were deemed part of the broader construct of Active Learning (Bonwell and Eison, 1991). Manduca et al. noted that the underlying structure is not what would necessarily be hypothesized (think-pair-share and small-group discussion were not statistically related to the other items) and thus may reflect something about the survey, how participants responded to it, or how they engage in those practices. While the full range of active-learning strategies may not be truly captured in the survey, the combination of items listed above served as a much better predictor of EST teaching changes and EST teaching than the self-reported percentage of time engaged in active learning (also collected in the survey) when inserted into the full model (model 4). This indicates that responses to those survey items better captured participant behavior than self-reported time spent engaging in active-learning teaching practices. So while the construct of Active Learning as found in this study might not be the ideal measure of actual active-learning practices, it relates well to this particular model based on the significance of paths and model fit.

Model 4: Full Structural Model

The full structural model revealed that active-learning teaching practices did not directly predict EST teaching, as hypothesized; rather, EST teaching changes made by instructors mediated the relationship. Thus, instructors who engage in active-learning practices are more likely to make changes to their curriculum, and those changes in turn relate to the construct of EST Teaching, meaning that instructors who are making changes are more likely to be engaging in EST teaching practices. This means that engaging in active-learning practices alone is not sufficient to predict an instructor’s engagement in EST teaching practices. This has implications for the importance of training instructors in both active-learning practices and EST. Geoscience faculty need to have the opportunity to learn about the challenges associated with systems thinking (Herbert, 2006; Stillings, 2012) so that they can make appropriate changes to their courses. As suggested by Holder et al. (2017), faculty would then likely be better able to use active-learning strategies, like group problem solving, to enhance EST teaching.

Part of building this model involved putting together the previous three models, thereby making the model more complex. The initial EST Teaching measurement model (model 1) did have significant paths; however, when it was placed in model 4, the model fit was not acceptable. This indicates that though the relationships between the survey items are significant, this model in this circumstance did not fit the data well. In order to prune the model and improve the fit, we chose to eliminate the construct of Systems Model Elements from the broader EST Teaching construct, as it had the weakest loading, meaning it was the most distinct of the EST teaching practices. Upon doing that, the fit statistics improved, meaning that the model now better fit the data. This tells us that teaching using systems modeling is a much more distinct teaching practice than adding systems-thinking elements or real-world applications. Additionally, that the eliminated construct did not fit well in the model indicates that Active Learning and EST Teaching Changes may not currently influence the likelihood of instructors engaging in systems-modeling practices or implementing aspects of the associated Complexity Sciences framework of EST (Scherer et al., 2017). However, the model does seem to correspond with the frameworks of Earth Systems Perspective, Earth Systems Thinking Skills, and Authentic and Complex Earth and Environmental Systems.

The distinctness of the systems modeling in the full model demonstrates a greater need for resources on quantitative reasoning as it relates to systems as well as system modeling resources in the vein of work by Shepardson et al. (2014), Fichter et al. (2010), and Hmelo-Silver et al. (2014). It appears that it is the quantitative reasoning skills as well as modeling skills that make this construct unique. The relationship between EST Teaching Changes and increasing the focus on quantitative skills was much weaker than the relationship between EST Teaching Changes and increasing emphasis on systems thinking. This suggests that while instructors are adding systems-thinking elements to their courses, they are not complementing that by adding the quantitative component that may be critical to understanding complex systems during the classroom deployment of systems-modeling approaches.

The fact that systems modeling is the most distinct of the EST teaching constructs indicates that it is not employed as frequently as the other practices of including systems-thinking elements or real-world elements. It also indicates that using active-learning practices and making EST teaching changes to courses are not as predictive of including systems-modeling elements as they are of including the other two EST teaching practices. Thus, the community must continue and expand its discussion on how best to incorporate quantitative reasoning, modeling, and complexity sciences into resources for practicing Earth science educators. Survey respondents included a large number of instructors who taught introductory courses, so it is particularly important to attend to introductory geoscience courses in this discussion. InTeGrate, a National Science Foundation–funded geoscience project, continues to work to address this problem and has assessed faculty and student weaknesses in teaching and learning about systems thinking (InTeGrate Program, 2015).

Limitations and Future Research

Limitations of this study include the self-reported nature of the data, which may not reflect what instructors actually do. As with all surveys, participation was voluntary, a natural limitation. These factors may have resulted in a participant pool skewed toward those with an interest in teaching and learning in the geosciences. As with any survey, there may be issues with the instrument itself or fatigue associated with its length. Some item groupings did not have good internal consistency, as evidenced by lower Cronbach’s alpha values, indicating that the results for some item groupings may not be as reliable as intended. Responses were also not evenly distributed among tenured and tenure-track faculty, instructors, lecturers, adjunct faculty, and other faculty types, as survey respondents were more likely to be tenured or tenure-track faculty.

Survey bias also occurred in several other ways. Survey respondents were not evenly distributed among institution types, as respondents were more likely to teach at master’s, baccalaureate, two-year, and other institution types than at research and/or doctoral institutions. Additionally, individuals with disciplinary focuses in oceanography (9.3%) and atmospheric science (9.5%) were far less represented than those in geology or associated fields (81.2%). Thus, while the sample size is large, it is likely not truly representative of all geoscience instructors. It is also notable that the majority of instructors indicated teaching introductory courses, primarily to majors, while only about a quarter of respondents indicated that they taught upper-level geoscience courses. This means that the sample is slightly biased toward EST teaching practices in introductory courses. It is also important to note that the models developed in this study are just models, and while they fit the data, they may not necessarily be true. As Rasch (1960, p. 37–38) and Tukey (1963) noted, no model is perfect, but it is the insight that models give researchers that is valuable. Similarly, it is worth noting that the EST Teaching Changes items specifically asked instructors about changes made within the past two years. It is possible that some instructors made changes to their courses related to EST prior to the period indicated by the survey prompt. In this case, we may have lost the data of a small group of participants who had already made these changes.

In terms of future directions for research, an obvious step is to use these measurement and structural models on future iterations of this survey to ensure that they hold up to scrutiny. While this work confirms many of the qualitative findings reported in the literature by Scherer et al. (2017), it is important that this work be taken into the field and examined both qualitatively and quantitatively. The qualitative approach calls for work examining EST teaching and its relationship to teaching approach in the classroom, as well as work aiming to understand instructors’ conceptions and understanding of EST. Quantitative work calls for the development of instruments that can help researchers measure learner development of systems thinking (in general or within each of the four frameworks) to complement and expand on existing instruments (Jordan et al., 2014; Grohs et al., 2018). This step will be essential in measuring the effectiveness of teaching EST, both in terms of the courses and contexts that may be most effective and in terms of the specific teaching strategies used. Assaraf and Orion (2005) noted the difficulty of conducting EST research due to the need to evaluate the strengths and weaknesses of each research tool with respect to what skills it is actually measuring. Thus, it is difficult to assess EST skills across many courses or instructors without a more streamlined instrument. There is a need for more classroom-based studies in which EST teaching practices are observed and coupled with student learning outcomes to complement and build upon previous studies based on self-reporting.

Additional structural equation modeling could also be completed to understand whether the model is invariant across a variety of groups (discipline, years teaching, training, etc.). Work to understand differences in EST teaching practices between introductory and upper-level courses would also be worthwhile. Additionally, future work should use the survey to understand why individuals who make teaching changes to their courses seem to be moving toward more of an Earth systems approach, and whether this is a result of nationwide influences such as InTeGrate, the Next Generation Science Standards, and Quantitative Reasoning across the Curriculum (Numeracy Infusion Course for Higher Education [NICHE], a project of the City University of New York [CUNY] Quantitative Reasoning Alliance; https://serc.carleton.edu/NICHE/qr_across_curriculum.html). This is a crucial step in informing professional development around EST teaching and learning. Structural equation modeling is an excellent tool for analyzing large data sets, and the survey studied here certainly contains many more constructs waiting to be explored.

This study, which used the 2016 National Geoscience Faculty Survey, found that the current state of Earth systems teaching in American colleges and universities is largely consistent with the qualitative Earth systems frameworks proposed by Scherer et al. (2017). Current innovation in geoscience teaching also tends to revolve around the major frameworks of EST and is predictive of individuals incorporating EST elements into their courses. These results also suggest that the Complexity Sciences framework of EST (involving computer modeling and quantitative data) is much more distinct than the other three frameworks, meaning that instructors are commonly not engaging in teaching practices that involve modeling or complexity sciences in conjunction with other EST teaching practices. Active-learning teaching practices do share a relationship with EST teaching practices; however, it is mediated through the course changes instructors are making. Thus, individuals who engage in active learning are more likely to make changes to their curriculum that incorporate more EST elements, rather than naturally including more EST teaching practices in their courses. This also indicates that active-learning practices alone are not sufficient to bring about EST teaching practices. This study gives a snapshot of the current state of EST teaching in higher education, suggests some interesting relationships between active learning, course changes, and EST instruction, and carries implications for additional professional-development opportunities.

This study also draws attention to the importance of geoscience educators receiving resources and training in EST, as active-learning practices by themselves are not enough to ensure that instructors explicitly teach EST. Thus, continued professional development in not only active-learning practices but also systems thinking and its teaching is essential. This is particularly true for pedagogical content knowledge and Earth systems content knowledge, which points to the significance of work being done by NAGT, the On the Cutting Edge professional development programs, InTeGrate workshops, and others in bringing professional development to a wide range of geoscience educators. Professional development and training in the complexity sciences, quantitative skills, and systems modeling are especially essential, as in our model the construct of Systems Model Elements is the most distinct and least related to the other constructs that make up EST Teaching. Given the large number of participants who reported teaching introductory courses, it is particularly important to consider and research how to enhance systems-modeling practices in these courses. As instructors continue to implement EST teaching strategies in their courses, it is important that researchers take these findings from the literature and the survey studied here and begin exploring them in practice.

This work was supported by the National Science Foundation’s Division of Undergraduate Education (DUE) under awards 0127310, 0127141, 0127257, 0127018, 0618482, 0618725, 0618533, 1022680, 1022776, 1022844, 1022910, 1125331, 1525593, 1524605, 1524623, and 1524800. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

We acknowledge and thank the following individuals for other contributions that made this study possible: Raymond Y. Chu, Julius Dollison, and Roman Czujko of the Statistical Research Center of the American Institute of Physics for helping to develop the 2004 and 2009 survey instruments, administering these surveys, and doing the initial analysis of the results; Diane Ebert-May and colleagues in biology for providing an unpublished copy of a similar survey developed for biology from which the 2004 geoscience faculty survey leadership team benefited; staff including Nick Claudy and Christopher Keane from the American Geological Institute for working through permissions to provide the initial set of geoscience faculty email addresses; John McLaughlin, the On the Cutting Edge external evaluator, for contributions to the development of the 2004 and 2009 survey instruments; experts from Professional Data Analysts, Inc., including Michael Luxenberg, Becky Lien, Eric Graalum, and Mao Thao, for work on the analysis of the 2009 survey and development and analysis of the 2012 and 2016 surveys; Lija Greenseid of Greenseid Consulting Group, LLC, for facilitating survey design and implementation and contributing to the interpretation of data analysis (2012 and 2016); On the Cutting Edge Principal Investigators R. Heather Macdonald, Cathryn A. Manduca, David W. Mogk, Barbara J. Tewksbury, Rachel Beane, David McConnell, Katryn Wiese, and Michael Wysession for their work on the survey; Joni Lakin, Kim Kastens, Rachel Beane, Kathleen Quadorkus Fisher, and Professional Data Analysts, Inc., for reviews and suggestions that strengthened this article; the geoscience survey working group, Greenseid Consulting Group, and Professional Data Analysts for support and guidance in working with the survey data; leadership from NAGT, On the Cutting Edge, InTeGrate, and SAGE 2YC for their work in administering the 2016 geoscience faculty survey; and the editors and reviewers of Geosphere for their constructive and thoughtful feedback, which strengthened this manuscript.

Science Editor: David E. Fastovsky
Associate Editor: Lesli Wood
Gold Open Access: This paper is published under the terms of the CC-BY-NC license.