Internet-based information and knowledge systems developed under the umbrella of geoinformatics have the potential to revolutionize traditional classroom practice. As the availability of computers and Internet connections in classrooms increases, the potential of harnessing such resources to advance teaching and learning provides unprecedented opportunities for both researchers and educators. Numerous online resources are currently used in classrooms. This study focuses on one such resource, Discover Our Earth (http://www.discoverourearth.org), and evaluates its use in classroom environments at the undergraduate and middle school levels. Discover Our Earth is designed around the theme of plate tectonics and covers topics such as earthquakes, volcanoes, topography, and sea level change. It is designed to promote hands-on experiences and inquiry-based learning. Users can access relevant data, mapping tools, and additional interactive virtual experiments from Web browsers without installing any additional software. To fully understand the impact of technology tools in the classroom, it is critical that they be evaluated thoroughly. To evaluate the effectiveness of this online resource, we undertook both formative and summative mixed-method evaluations. Using a combination of quantitative (pretests and posttests) and qualitative (questionnaires and observations) methods in our evaluation design enabled us to evaluate the Discover Our Earth tools in our targeted user groups. We found that the Discover Our Earth resources offer significant advantages over commercially available software in these settings. Formative evaluations in undergraduate classes helped us correct a series of user interface design issues at the early stages of software development. We also observed that Discover Our Earth resources enabled students to engage in inquiry and to develop their own understanding of plate tectonics through the exploration and visualization of the data sets. Summative tests conducted at middle schools showed that students paid more attention to interactive tools and remained engaged in the learning activities even after class ended.

In our fast-paced, ever-changing society, we rely on technology to an ever-increasing degree. In recent years, remarkable advances in communication and computing have enabled society to use technology in new ways. These changes have begun to have a substantial impact on scientific education and research, but adopting new technology brings its own challenges. Several current education and research initiatives aim to make effective use of these technological advances while addressing the challenges that accompany them.

At the same time, a new and comprehensive approach to technology has been introduced to research communities by national and international cyberinfrastructure (CI) initiatives. These CI efforts will develop technological infrastructures, or “backbones,” that provide services for the integration of data sets and resources for both research and education. By providing efficient mechanisms for handling data and computing resources, CI will enable scientists to conduct research that was previously impossible because the appropriate resources were lacking. In the past, out of necessity, scientists created their own systems to integrate technology into their research; however, attempts to apply these systems to one another's research have generally been ineffective. CI has the potential to profoundly affect scientific research by providing comprehensive and easy-to-use work environments and by shifting scientists' focus toward multidisciplinary analysis, resulting in more rapid discoveries. Educational environments supported by CI can provide “interesting resources, exciting experiences, and expert mentoring to students, faculty, and teachers anywhere there is access to the Web” (Atkins et al., 2003, p. 17).

While research and development communities invest extensively in building information networks, educational communities attempt to take advantage of these resources (e.g., Manduca and Mogk, 2002). There have been significant increases in the availability of, and access to, technology within schools in recent years. The National Center for Education Statistics reports that in 1994 only 35% of public schools had access to the Internet, with only 3% of that access in instructional rooms; in 2002, 99% of public schools had access to the Internet and 92% of that access was in instructional rooms (Kleiner and Lewis, 2003). There has also been a considerable increase in the Internet connection speed used in public schools in recent years, allowing students to access more information at faster transfer rates than ever before. In 1996, dial-up Internet connections were used in 74% of the public schools with Internet access (Heaviside et al., 1997), while in 2001, 55% of schools reported using T1 and/or digital subscriber line (DSL) Internet connections (Kleiner and Farris, 2002). These encouraging statistics illustrate that most schools are now in a position to effectively utilize information technology (IT) resources.

Given the effectiveness of inquiry-based pedagogy and the potential of IT resources to support and extend that approach, educational institutions must now address the question of how effective current classroom technology is. Specifically, how much does it aid student learning? This question is a current topic of discourse in research circles. In this paper we discuss a technology-based learning tool, Discover Our Earth (http://www.discoverourearth.org/), and evaluate its use in classroom settings.

Discover Our Earth is a Web-based learning tool that incorporates advanced technology, research-quality data sets, and supportive materials in order to facilitate effective inquiry-based science education for learners at all levels. It focuses primarily on geology-related topics (specifically plate tectonics). Geologic concepts can be difficult for students to grasp because they generally involve large scales and four dimensions: x, y, z, and time. Students studying plate tectonics cannot walk outside and record plate motions in the way students studying biology can record the growth of a plant—it would be prohibitively expensive and time consuming to collect enough data to determine, for instance, the rate of motion of a plate or the geometry of a subduction zone. In these cases, research-quality data sets can replace student-collected data, and computer technology can be used to represent the data visually in an easy-to-comprehend manner.

The Discover Our Earth learning tool is the offspring of the Geoscience Information System Project, which originated in 1995 at Cornell University (Seber et al., 1997, 2000) and now continues at the San Diego Supercomputer Center as a collaborative research activity. The original project developed a comprehensive information system for the geosciences and was initiated primarily to handle the changing needs of the geoscience research community as it studied Earth's complex system of interrelated mechanisms. The result of this effort was the Geoscience Interactive Database (GEOID), which contains solid-earth data sets on both regional and global scales. The GEOID is accessed through an easy-to-use interactive Web mapping tool that allows users to study and map its massive collection of data sets through the Internet.

Although the GEOID was originally designed as a tool for research scientists, it quickly became apparent that educators, students, and lifelong learners were using this Internet mapping tool. While the GEOID was a direct route for researchers to access and display data in meaningful ways, we found that students were overwhelmed by the number of data sets available and could not distinguish between similar data sets and their uses. In order to make the information easier to use for students and other nonexperts, Discover Our Earth was developed to provide supportive educational materials and easy-to-use tools for database access at no cost to its users (Moore et al., 1999; Moore, 2000). While commercial geographic information system (GIS) software is a powerful instrument for scientific study, it is relatively expensive and therefore not often available at small undergraduate institutions and secondary schools. Discover Our Earth is a set of Internet-accessible tools that offers much of the functionality of commercial GIS software at no cost. By storing the data remotely instead of providing it on CD-ROM or DVD, we give students access to a much larger database. This also allows us to easily update the data sets and system tools without the time-consuming and costly process of redistributing them to users.

Discover Our Earth (Fig. 1) was originally developed primarily for use by undergraduate students, allowing students to engage in science learning that models the actual practice of science through the use of data (Seber et al., 2003). It has since been expanded to accommodate primary and secondary school teachers and students as well as lifelong learners. Discover Our Earth comprises three main components: (1) a series of educational Web pages designed as user guides for both teachers and students; (2) the Quick Use Earth Study Tool (QUEST), an interactive mapping tool designed specifically for educational use; and (3) a collection of software tools designed to perform interactive experiments.

Traditional learning materials are often static and allow for little or no interactivity. “Cookbook” exercises, paper maps, and video and computer animations are all passive learning experiences in which students do not have the opportunity to ask their own questions, design their own experiments, or collect their own data. Discover Our Earth not only allows but also requires students to follow the methods of practicing earth scientists. Students work with actual data sets acquired by researchers and have the ability to manipulate, query, and display the data. They may do this in any manner they choose, rather than only by following a specific set of instructions supplied by their teachers. Thus, students are driven by their own curiosity and are empowered to learn. The design of Discover Our Earth provides students with the tools to learn about earth processes through inquiry and discovery.

The user guide pages, the first component, are divided into two sections: one for teachers and one for students. Both consist primarily of text and images presented in hypertext markup language (HTML) format. These pages focus on the subject of plate tectonics and cover such topics as earthquakes, volcanoes, topography, and sea level change. The supportive materials include background information, lesson plans, activity outlines, directions for activities, and assessment suggestions. They are not comprehensive guides to the broad topics; instead, they are designed to provide information and opportunities for discovery on focused topics. The lesson plans, activities, guides, and student directions give specific examples of how best to explore each topic using the QUEST tool.

The second component, QUEST, is the second generation of our Web-based interactive mapping tool; it is written as a Java applet and designed to meet the educational needs of teachers and students. QUEST allows access to three spatially referenced data sets within the database created and managed by the Geoscience Information System Project. This interactive Web mapping tool gives students the ability to create maps by querying and overlaying earthquake, volcano, and topography data. Here, students control the region they want to study and the data sets they wish to view; they can query the data to display certain criteria, and they can alter the appearance of the display. The QUEST mapping tool can be used independently or paired with the supportive materials provided in the user guides to create meaningful, inquiry-based activities for students.

QUEST operates as part of a three-tiered system that allows users to access the data in an orderly, easy-to-analyze fashion. The base tier contains the database of primarily solid-earth data at the regional scale. Most of the data stored in this database have been collected from various data providers, and in many cases these data must be reformatted to conform to our standardized database before they are entered. The middle tier contains the “middleware,” a suite of software that manages communication between the base tier and the top tier. Using commercial GIS software as a component of the middleware enables a high level of data management functionality; ArcINFO®, the commercial GIS software, gives the system substantial control over how the data are accessed and manipulated before they are displayed to the users. The top tier of the system is the QUEST user interface, where users interact with the database through the QUEST Java applet, selecting the data sets they wish to access and building queries for the data they wish to view.
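To make the division of labor among the three tiers concrete, the following minimal Java sketch (Java 16+) mimics the request flow described above. It is purely illustrative: the names MapQuery, DataTier, Middleware, and ThreeTierSketch are our own stand-ins, and the real system performs these steps with the QUEST applet, ArcINFO® middleware, and the GEOID database rather than with this toy code.

```java
import java.util.ArrayList;
import java.util.List;

// Query assembled by the top tier from the user's selections (region, data set, filter).
record MapQuery(String dataSet, double minLat, double maxLat,
                double minLon, double maxLon, String filter) {}

// Base tier: the standardized database of solid-earth data sets.
interface DataTier {
    List<String> fetch(MapQuery query);   // return the records matching the query
}

// Middle tier: middleware that brokers requests between the user interface and the database.
class Middleware {
    private final DataTier database;
    Middleware(DataTier database) { this.database = database; }

    // Delegate the query to the data tier and package the result for display.
    String requestMap(MapQuery query) {
        List<String> records = database.fetch(query);  // in QUEST, GIS software handles this step
        return "Map of " + query.dataSet() + " with " + records.size() + " features";
    }
}

// Top tier stand-in: builds a query from the user's selections and asks the middleware for a map.
public class ThreeTierSketch {
    public static void main(String[] args) {
        DataTier toyDatabase = query -> new ArrayList<>(List.of("eq-1", "eq-2", "eq-3"));
        Middleware middleware = new Middleware(toyDatabase);
        MapQuery query = new MapQuery("earthquakes", -10, 10, 120, 160, "magnitude >= 5");
        System.out.println(middleware.requestMap(query));  // "Map of earthquakes with 3 features"
    }
}
```

The point of the layering is that the user interface never touches the raw data directly; every request passes through the middleware, which decides how the data are retrieved and prepared before display.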

The last component of Discover Our Earth is a collection of virtual experiments: interactive programs created in Macromedia Flash and designed to complement the QUEST mapping tool. These virtual experiments cover such topics as continental drift, United States geology, isostasy, and viscosity (Fig. 2). Users manipulate the experiments in order to better understand earth processes. For example, using the Continental Drift Virtual Experiment, students can click on continental puzzle pieces to move and rotate them, recreating the supercontinent of Pangea.

The Discover Our Earth educational system focuses on plate tectonics because it is essential to the study of earth science and is studied by students at many educational levels. QUEST accesses three data sets that are integral to the understanding of plate tectonics: earthquakes, volcanoes, and topography. These are the data sets most commonly requested by the user community. The virtual experiments support the study of plate tectonic processes through the use of data. The combination of the QUEST mapping tools and the virtual experiments gives students several different ways to explore plate tectonics and supports students with different learning styles.

In an effort to evaluate Discover Our Earth, we engaged in both formative and summative evaluations over the past four years. We used a combination of quantitative (pretests and posttests) and qualitative (questionnaires and observations) methods to investigate the usage and effectiveness of the Discover Our Earth learning tools. However, because of limitations in time and funding, the rubrics we used relied more on observational techniques than on objective, quantitative analysis. Student interest and interaction with the tools and with instructors were observed, compared, and analyzed alongside pretest and posttest assessments. Since the same person conducted all the classroom work and served as the lead observer, we believe our results are internally consistent and valid. However, more comprehensive, multiyear, quantitative observations will still be needed to demonstrate concrete progress and learning gains with statistical validity. Our evaluation methods were primarily based on resources published by the National Science Foundation and augmented with other published works that outline and discuss evaluation in more detail (e.g., Altschuld and Kumar, 2002; Campbell and Stanley, 1966; Frechtling and Sharp, 1997; Hannah, 1996; Stevens et al., 1993). A number of evaluation studies have employed these research methods to study the effectiveness of other educational programs, including Bartholomew et al. (2003), Chang (2000), Hall-Wallace and Regens (2003), Kali and Orion (1996), Kastens et al. (2001), Kiboss (2002), and Moss (2003). This design allowed us to evaluate the Discover Our Earth tools in our targeted user groups: undergraduate-level university students (formative evaluation) and middle school students (summative evaluation).

Formative Evaluation: An Undergraduate Study

Our formative evaluation began when Discover Our Earth was piloted at the undergraduate level in an introductory geology course. Students were asked to analyze, synthesize, and evaluate earthquake hypocenter data and to discern plate geometry in a subduction zone. In EAS 101 (an undergraduate course at Cornell University), we first employed primarily qualitative evaluation methods in all sections of a 3 h laboratory activity devoted to plate tectonics. Over the course of three semesters, this study included 120 undergraduate students. In this phase, our goals were to learn how well the QUEST tool functioned in a classroom environment, how it was perceived by students, and how it compared with commercial GIS software. While it is the goal of many undergraduate institutions to expose their students to powerful software such as ArcView® (a commercial GIS software package), we investigated whether the use of such software was too time-consuming in the context of a survey course such as introductory-level geology, where the primary goal is to teach science rather than tools for future use. We accomplished this with a student questionnaire and with observations made by the developers, faculty, and teaching assistants, who kept careful notes as both groups completed the lab activity.

In the first evaluation (semester 1), the lab sections were split into control and study groups. During this phase of the formative evaluation, students performed an inquiry-based laboratory exercise that utilized research-quality data sets to explore plate tectonics. The control group did this with commercial GIS software (ArcView®), while the study group utilized the QUEST mapping tool available as a resource in Discover Our Earth. Students in the control group used a standard edition of ArcView® that required them to download, format, and display the data. While this added an additional step to the tasks of the ArcView® users, we opted not to provide an already formatted data set because downloading and reformatting data sets is usually a necessary part of this type of work. While it might have been preferable for all students to have the opportunity to complete the same plate tectonics activities using both QUEST and ArcView® so they could compare the software, this was not possible in a controlled study. After the completion of the plate tectonics laboratory exercise, students completed a simple questionnaire that asked them to assess their experience.

Because of the results from our first semester of evaluation, which are discussed in the analysis and results section, we chose to forgo the control group in the second and third evaluations (semesters 2 and 3); all students participated in the study using the QUEST mapping tool in the lab activity. We used a combination of observations and student questionnaires to evaluate how well the QUEST tool functioned in a classroom environment and how students perceived it.

Summative Evaluation: A Study Involving Middle School Students

The summative evaluation took place in 2004 and focused on middle school students. We employed a pretest-posttest evaluation design and qualitative observations (Brindisi, 2004). In the summative evaluations, students demonstrated knowledge, comprehension, application, and analysis through the production of annotated maps. After creating several maps that plotted earthquake data over increasing amounts of time, students were asked to use the earthquake data to distinguish plate boundaries. In this phase, we assessed the educational impact and evaluated the usability of Discover Our Earth in 7th and 8th grade science classes. We worked with two teachers and 162 students from two middle schools: one in a large metropolitan area in Texas, and one in a semi-rural area of upstate New York. The evaluation process consisted of an interactive lesson flanked by a pretest and a posttest. While it would have been preferable to use exactly the same lesson plans and testing instruments in both schools, this was not possible given each school's class schedules, curricular requirements, scheduled unit of study, and student population. Thus, we adapted our assessment instruments. All students participated in an inquiry-based lesson that utilized earthquake and volcano data sets to map plate boundaries. Students read about the relationship between the occurrence of earthquakes, volcanoes, and plate boundaries, and then proceeded to make their own maps of the plate boundaries in the same manner scientists did in the late 1960s, i.e., during the discovery of plate tectonics. This lesson was used because it is an introductory-level lesson that can be recreated as an analogous paper-based lesson.

In Texas, the evaluation took place over the course of three days in January 2004 in the classroom of one 7th grade science teacher in an urban school. Here, we used a control group pretest-posttest assessment of student learning coupled with observations made by the classroom teacher and the investigator. The computer-based study group (61 students) completed the mapping activity using the QUEST mapping tool, while the paper-based control group (44 students) worked through an analogous lesson using only paper-based materials and therefore could not query the data or plot the earthquakes interactively. Six separate classes were tested; class sizes varied from 16 to 32 students. The students were split into study and control groups by class period in order to minimize disruption to the daily routine. The pretest and posttest were developed and reviewed by one classroom teacher, two university faculty members, and one researcher.

The evaluation in the upstate New York school took place in May 2004 in two 8th grade New York State earth science Regents classes of 30 students each. In this evaluation, we used the one-group pretest-posttest evaluation design (Campbell and Stanley, 1966) because the teacher asked that both classes have the opportunity to do the activity in exactly the same manner. The lesson, designed by the classroom teacher, is titled “Finding Evidence of Plate Tectonics.” In this lab, students first mapped the plate boundaries using the QUEST activity and then investigated a series of Web links that presented them with evidence supporting the theory of plate tectonics; they were required to answer several questions regarding each piece of evidence. The plate boundary mapping activity performed by the New York students was nearly identical to that performed by the computer-based study group in Texas. The pretest and posttest were altered at the request of the local teacher to fit the needs of the students; specifically, the earth science teacher in New York felt that the original pretest and posttest were too easy for the advanced students taking the New York State Regents curriculum.

In the formative evaluation, carried out at the undergraduate level, we compared the use of the QUEST tool to commercially available GIS software (i.e., ArcView®). The students were divided into control (ArcView®) and study (QUEST) groups and completed a series of inquiry-based activities as part of a plate tectonics laboratory. Students finished the plate tectonics lab in a 3 h laboratory period and also completed a user survey. Although the surveys were not identical, several questions were the same, which allowed the groups to be compared with each other. Both groups rated the lab 4.1 on a scale of 1–5. Both groups also rated the first five labs of the semester in a similar, but not identical, manner. We also asked the students which software they would prefer to use based on the information provided and their limited experience. In this evaluation, all students in the control and study groups had gained experience with ArcView® in the first laboratory exercise of the semester, when they studied global positioning systems (GPS), maps, and digital mapping tools. Therefore, the students in the study group were able to adequately judge which software they preferred. The overwhelming majority of students from both groups chose the QUEST tool.

The remainder of the survey questions were more qualitative in nature and required full-length, written answers. Several students in the control group commented about the retrieval and formatting of data. One student wrote, “I disliked the menial aspect of importing and formatting the data before it was mapped out because—I admit—I am impatient.” Another student commented, “It wasn't that it was difficult or easy, just tedious and boring. I don't believe it helped the learning process.”

Students in the study group, which used the QUEST tool, answered a slightly different set of questions and were not asked to comment on data formatting and retrieval since it was not required for their lab. These students commented, “Good tool to compare and contrast activities of the Earth throughout the years. One can also explore interests on his or her own, and not be working a fast/slow pace during class” and “I liked learning at my own pace and having both illustrations and information.”

During the lab periods, the instructor, teaching assistants (TAs), and the investigators observed the students as they worked. We observed several differences, not captured by the survey instruments, in the way students completed the activities. The control group asked many questions about downloading data, formatting it properly in a spreadsheet program, and creating maps using GIS software. Approximately 90% of all questions asked by these students concerned technical aspects of the GIS or spreadsheet software. The instructor and TA spent much of the lab period helping students properly format and display their data. These students spent considerably more time preparing the data and less time engaged in the inquiry activities of the laboratory exercise than their counterparts in the study group. It took the students in the control group longer to complete the entire lab than the study group, and several control group students expressed concern that they would not complete the laboratory on time. Students in the QUEST group generally asked questions related to the scientific content of the laboratory activity and spent more time attempting to answer the lab questions by exploring the data through the use of the QUEST tool. The instructor and TA spent most of their time with this group helping students to understand the science of plate tectonics, rather than the mechanics of computer software.

In our second and third semesters of evaluation at the undergraduate level, we used only the QUEST mapping tool and abandoned the use of commercial GIS software as a control because of the overwhelming results from the first semester of evaluation. In the survey, students were asked to list what they liked most and least about the laboratory exercise. Most often, students commented that they (1) enjoyed interacting with the data, and (2) liked learning with imagery. Students also commented that they liked learning about plate tectonics, enjoyed drawing their own conclusions from the data, and enjoyed working with computers during the laboratory. Students least enjoyed the perceived slowness of the mapping tool. Many students also commented that they would like to see several features added to the QUEST tool, including a “zoom” function, added functionality in the “filmstrip,” and more color choices for the data symbols.

In the summative, control group pretest-posttest evaluation, we observed 7th grade students in Texas as they used QUEST or a paper-based activity to plot plate boundaries using earthquake and volcano data. The first several questions in the pretest-posttest provided us with baseline information about the students. Notably, only 11% of students in both groups reported some familiarity with the concept of plate tectonics prior to the lesson. To quantify the learning gains, we used the Hake factor (Hake, 1998). The Hake factor measures student improvement by comparing the observed gain to the possible gain in student scores and is calculated using the standard normalized-gain formula (Hake, 1998):
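\[
\langle g \rangle = \frac{\%\langle \text{posttest} \rangle - \%\langle \text{pretest} \rangle}{100\% - \%\langle \text{pretest} \rangle}
\]

For example, a group that averages 40% on the pretest and 60% on the posttest realizes 20 of the 60 possible percentage points, giving a Hake factor of approximately 0.33.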

Figure 3 graphs the Hake factor for the Texas middle school students by control and study group; the study group averaged a Hake factor of 0.353 and the control group 0.343. There was more variability in the Hake factors when the students were separated into pre-advanced placement (pre-AP) and non-AP groups (Fig. 4). Overall, the pre-AP students experienced greater gains than the non-AP students, with the pre-AP students in the study group scoring an average Hake factor of 0.422. When asked to explain the theory of plate tectonics in their own words, 92% of the study group and 84% of the control group could not answer the question on the pretest. When asked the same question on the posttest, 43% of the study group and 57% of the control group improved their answers from 0 points to 2 points.

The observations and the more qualitative questions were enlightening. Students in both groups were asked what they liked best and least about the laboratory activity. Although many students in both groups listed drawing the plate boundary maps as what they liked least, many others commented that they enjoyed drawing and making their own maps based on observations of the data. Some interesting observations were made by the teacher and evaluator during the lab activity. Most notably, the study group students spent extra time working on the computers, requesting additional maps after they had completed the required tasks of the activity. In at least two cases, student groups had to be told by their teacher to “move on” with the lab activity because of time constraints. The students in the control group finished the lab more quickly than their counterparts in the study group. Students in the control group expressed frustration that they were presented with only one map of earthquakes and one of volcanoes and that they did not receive a third map with both earthquakes and volcanoes.

The separate, but similar, summative evaluation carried out in the New York middle school employed a one-group pretest-posttest design and observations. The students in this evaluation were 8th graders taking the New York State earth science Regents curriculum. When asked in the pretest if they were familiar with plate tectonics, 55% of the New York students responded “yes.” On average, the New York students achieved a Hake factor of 0.441. In the pretest, 30% of students were unable to provide an answer when asked to explain the theory of plate tectonics; in the posttest, 68% of students were able to provide a sufficient answer. The students were also asked, on both the pretest and the posttest, to name two pieces of evidence that help scientists locate plate boundaries. Nineteen percent of students received full credit for their answers on the pretest, compared with 58% on the posttest.

During our observations in New York, we found several instances of student groups (three in one class and four in the other) making extra maps of earthquakes and volcanoes. When asked why they were making extra maps, one group of students responded, “because we wanted to know what two years of earthquakes would look like.” Another group responded that they did not feel the map showing six months of earthquakes gave them enough information to draw the plate boundaries; they continued to make maps of longer time periods until they felt they had enough information to draw clear plate boundaries. This process indicates that the students were actively engaged in the exercise and used their own judgment to decide when enough data had been analyzed. This is a critical learning gain and an indicator that Discover Our Earth empowers students to conduct self-directed inquiry by pursuing answers to satisfy their curiosity.

Since our sample sizes were limited (i.e., two schools for the summative evaluations), it is difficult to attach statistical certainty to the quantitative results provided above or to validate their statistical significance. Our main goal in this study has been to begin developing metrics and to establish some baseline measurements. It is clear that measuring learning gains is a very difficult task and will require much larger and more comprehensive studies. However, our study is a first and important step toward a comprehensive analysis of how Discover Our Earth resources support learning and of whether a measurable learning gain can be obtained without introducing significant misconceptions.

We found that our formative and summative evaluations provided information instrumental to the enhancement of our project. Our first-time evaluation also provided valuable experience that can be applied by other researchers in their own evaluations. Although we invested a significant amount of time and energy in our evaluation design, we found that our testing instrument was not one-size-fits-all and did not serve the needs of both middle school populations. To address this deficiency, we suggest that evaluators involve classroom teachers closely in the development of testing instruments. Had we been able to use the same testing instruments in both of our middle school assessments of learning, we would have produced a much richer data set. In addition, the testing instrument assessed the students' mastery of the lesson content but did not quantitatively assess the amount of inquiry that the students performed. We observed many instances in which the study and control groups differed in their use of inquiry but did not have an adequate method of measuring the amount of inquiry in which students engaged.

While our time frame and resources did not permit it, we suggest that evaluators consider an extended evaluation period, perhaps one that spans an entire school year. Such a period would have allowed us to gain a better understanding of each classroom environment and to test for long-term retention of the information acquired during the plate tectonics lesson.

Our evaluation could also have benefited from a series of semi-structured interviews with students both before and after the evaluation lesson. These interviews would have provided us with the opportunity to follow up on student perceptions of Discover Our Earth and on their understanding of the concepts covered in the plate tectonics lesson. Incorporating these alterations would provide evaluators with a more comprehensive evaluation plan, one that offers an in-depth look at the impact of using information technology as a tool for earth science education.

The initial formative evaluation (of undergraduate students) provided our first classroom-based field test of the QUEST tool and was extremely beneficial to the project. Through the use of control and study groups in the first semester of evaluation, we determined QUEST to be a more efficient method of data delivery and display than commercial GIS software for the purposes of undergraduate, introductory-level geology classes. Undergraduate students using the QUEST tool spent more time thinking about the science of plate tectonics than about data acquisition, formatting, and display. Commercial GIS software such as ArcView® is a powerful and valuable tool in many learning situations; however, its use is too time-consuming within the constraints of a survey course such as introductory-level geology. QUEST was developed with the intention of moving students through the data acquisition and display process quickly so they could spend more time engaged in the meaningful study of earth science phenomena. Commercial GIS software is better suited to a dedicated course setting, where students have the opportunity to use the software many times over the course of a semester, taking greater advantage of its power and flexibility. QUEST users found the tool intuitive and were able to use it expertly within the lab period, which was extremely encouraging to the software developers.

Students made many suggestions for the improvement of the tool, and several of these suggestions were implemented. By far, the most common suggestion was to improve the speed at which map requests are delivered. It is important to note that Discover Our Earth allows users to request custom maps, which requires the server to create and serve each map individually from the large underlying data library; this multipart process sometimes takes 30 seconds or more. Students who used ArcView® had to invest a significant amount of time up front to acquire and format their data before they could begin making maps, but they then queried a much smaller data set stored on their local computers, so their maps displayed more quickly. The length of time from start to finish was nonetheless greater for the ArcView® students than for the QUEST group. In addressing the concerns of QUEST users, we improved the response time of QUEST by introducing a distributed system on the server side that utilizes up to five different computers when the number of requests exceeds the capacity of the primary server. Next, we enlarged the main map area by modifying the layout of the QUEST tool. Last, we added functionality to the tool, including more selectable map areas, the ability to delete or reorganize maps in the filmstrip, and additional color choices for the data symbols. With these improvements to the tool, we were prepared for the impact evaluation.
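We do not describe the dispatch mechanism of this distributed setup in detail, so the short Java sketch below is only a hypothetical illustration of one way such an overflow scheme could work: map requests go to the primary server until it reaches capacity and then spill over to the secondary machines. The names (MapServer, OverflowDispatcherSketch) and the capacity bookkeeping are assumptions for illustration, not the project's actual implementation.

```java
import java.util.List;

// Hypothetical map server with a fixed capacity of concurrent map requests.
class MapServer {
    final String name;
    private final int capacity;
    private int assigned = 0;   // toy bookkeeping: requests currently assigned

    MapServer(String name, int capacity) { this.name = name; this.capacity = capacity; }

    boolean hasCapacity() { return assigned < capacity; }

    String accept(String request) {
        assigned++;
        return name + " rendering: " + request;
    }
}

// Sends each request to the primary server unless it is saturated, then spills over.
public class OverflowDispatcherSketch {
    private final MapServer primary;
    private final List<MapServer> secondaries;

    OverflowDispatcherSketch(MapServer primary, List<MapServer> secondaries) {
        this.primary = primary;
        this.secondaries = secondaries;
    }

    String dispatch(String request) {
        if (primary.hasCapacity()) return primary.accept(request);
        for (MapServer server : secondaries) {
            if (server.hasCapacity()) return server.accept(request);
        }
        return primary.accept(request);   // everything busy: fall back to the primary queue
    }

    public static void main(String[] args) {
        MapServer primary = new MapServer("primary", 2);
        List<MapServer> backups = List.of(new MapServer("backup-1", 2), new MapServer("backup-2", 2));
        OverflowDispatcherSketch dispatcher = new OverflowDispatcherSketch(primary, backups);
        for (int i = 1; i <= 5; i++) {
            System.out.println(dispatcher.dispatch("map request " + i));
        }
        // Requests 1-2 stay on the primary; 3-5 spill over to the backup servers.
    }
}
```

Whatever the exact mechanism, the design goal is the one the students' feedback identified: keep the wait for a custom map short even when many requests arrive at once.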

The summative pretest and posttest studies, carried out in middle schools in Texas and New York, provided valuable evaluation information. The Texas and New York students entered the evaluation process from two very different starting points. The 8th grade New York students were following an advanced New York State earth science Regents curriculum and were in the middle of a unit on plate tectonics, while the 7th grade Texas students were following a general science curriculum and had not yet started a unit covering plate tectonics. This difference was evident in the students' answers to the pretest and posttest question regarding their familiarity with plate tectonics. These differences in the student populations, as well as the differences in the evaluation methodology, make it impossible to compare the summative evaluations to each other; instead, they should be considered separate but similar evaluations.

The Texas study group used the computers within their regular classroom, which cut down on travel time to and from the school computer lab but was limited by large group sizes, since only eight computers were available in the room. Some computer work groups in Texas swelled to five students, and although the classroom itself was quite large, the computers were clustered in one corner of the room. In this setting, the working groups were literally bumping into each other, and students were straining to hear one another as they worked. In New York, the students worked within a computer lab managed by the school library, where the computers were set up on tables along the perimeter of a large room. The group sizes in New York were smaller, with students working in groups of one to three, primarily in pairs. In the larger groups in Texas, the students were apt to argue over whose turn it was to “drive,” and the “passengers” tended to be marginalized and often could not see the full computer screen. Marginalized students often turned to see what the neighboring group was doing, which caused some behavioral problems for the classes. From our observations, smaller groups (fewer than three students) proved to be the most effective, since all students were able to see the computer screen easily and had a chance to use the QUEST tool themselves.

While it is difficult to compare the two summative evaluation studies, we found that all middle school students were able to master the QUEST tool quickly and that it was an effective method for students to access and display data. In Texas, the students were only supplied with a set of directions and a short introduction to the tool by the teacher. In New York, the students were supplied with a set of written instructions and the teacher also performed a short demonstration on how to make a map using QUEST. All middle school students, from both Texas and New York, were able to immediately make maps and experienced few difficulties with the QUEST interface.

Because of differences in the testing instruments and lessons, the test scores of the students in Texas and New York cannot be compared directly. Instead, their learning gains can be compared using the Hake factor. Interestingly, the pre-AP students in the Texas study group and the New York students experienced the largest learning gains. Though the statistical validity of the Hake factor numbers remains unresolved, we observed an average Hake factor of 0.422 for the Texas pre-AP study group students and 0.441 for the New York students (all of whom are considered above average). It is difficult to say why these students experienced greater learning gains than the average students; more investigation into this difference would be beneficial. One possible explanation may be rooted in the cognitive development of the students. In general, the students from New York exhibited a better understanding of the concepts than their counterparts in Texas. This can be attributed to differences in age, course curriculum, and the amount of previous instruction on plate tectonics between the two study populations.

This plate boundary mapping lesson was selected as the basis of our evaluations because it can be recreated in ArcView® and as an analogous paper-based lesson. Although we could have chosen a more complex lesson, one that better highlighted the strengths of QUEST and of computer visualization in general, we felt strongly that the control group should be able to participate in an analogous lesson. In addition, this lesson is the first in a series of Discover Our Earth activities that explore the phenomena of plate tectonics using data. Thus, it is an appropriate starting point for learners at all levels, including middle school students and undergraduates who may go on to complete several other plate tectonics lessons. While middle school classes may not have a need for the more advanced mapping activities of Discover Our Earth because of time and curriculum constraints, it is valuable to evaluate these young users during their first exposure to inquiry tools like Discover Our Earth in an effort to understand how best to address their needs.

While it is nearly impossible to compare the quantitative data from the assessments of learning, the qualitative observations can readily be compared to each other. Here, there was a significant difference in the style of learning between the control and computer-based learning groups. Across the board, students using QUEST were able to engage in self-guided inquiry. In both Texas and New York, students continued making maps after they had completed the required set and expressed interest in how different data requests would look on their maps. It was exciting for both the investigators and the classroom teachers to watch the students engage in inquiry and become interested in what the data could tell them. While this inquiry was often at a low level (for example, some students wanted to map the earthquakes that occurred on their birthday), it highlighted their interest in exploring what the data could tell them and therefore affected their perception of the nature of science and scientific inquiry. The open-ended nature of the activities clearly fostered inquiry and curiosity (Edelson, 2001). The Hake values of the students using QUEST in Texas indicate that these students learned slightly more than their control group counterparts, who were unable to become engaged in the activity in the same manner as the study group students.

We suspect that the students who used QUEST will retain their understanding longer than the students who used the paper-based lesson because they were more involved in the process of making maps. Future work should include longer-term evaluation studies that assess the retention of student learning. Future evaluations should also carefully assess gains in understanding by asking students novel questions that require them to apply their understanding.

Discover Our Earth is a valuable technology-based educational tool that provides access to research-quality data sets through Internet mapping software and inquiry-based activities. Experience with technology tools benefits students as they come of age in today's fast-paced, technology-reliant society. Discover Our Earth's emphasis on inquiry-based learning allows students to develop their own understanding of plate tectonics through the exploration and visualization of the varied data sets. Programs like Discover Our Earth are poised to have a significant impact on education as the availability of computer technology and Internet access in the classroom continues to grow rapidly. We note, for example, that students who used Discover Our Earth to ask a question such as, “How many earthquakes occurred on my birthday?” could never have found the answer so quickly before the development of the Internet. Now, this question and many others can be answered in a matter of seconds.

This work would have been impossible without the cooperation of Anastasia Furitsch of the Houston Independent School District and Laurie VanVleet of the Ithaca School District, who were kind enough to allow us to conduct research within their classrooms. We benefited greatly from countless discussions with Muawia Barazangi, Nimat Hafez Barazangi, and Larry Brown as well as from the technical support of Steven Gallow. Many thanks to Dan Danowski, whose technical and artistic expertise contributed to the creation of the outstanding interfaces and graphics on the Discover Our Earth Web site. This research is partially supported by National Science Foundation grants EAR-0353590 and DUE-0121390.