Initial Publication Date: March 7, 2018

Geoscience Literacy Exam (GLE)

Purpose of the GLE:

  • Project-level benchmark measure used to compare student populations pre/post across institutions
  • First-order measure to gauge content knowledge gains over the course of one term

Instrument Format

The GLE content framework is provided by the four geoscience literacy documents: Earth Science, Ocean Science, Atmospheric Science, and Climate Science. The resulting exam contains 60 multiple-choice and 30 essay questions that align with the content in the literacy documents.

The GLE testing schema is organized into three levels of increasing complexity; a brief illustrative scoring sketch follows the list below.

  • Level 1 questions (1 point each) are single-answer, understanding- or application-level multiple-choice questions; for example, selecting which type of energy transfer is most responsible for the movement of tectonic plates. They are designed such that most introductory-level students should be able to answer them correctly after taking an introductory geoscience course.
  • Level 2 questions (up to 2 points each) are more advanced multiple-answer or matching questions at the understanding through analysis level. Students might be asked to determine the types of earth-atmosphere interactions that could result in changes to global temperatures in the event of a major volcanic eruption. Because the answers are more complex, only some introductory students, but most advanced students, should be able to respond correctly.
  • Level 3 questions (up to 3 points each) are analyzing- to evaluating-level short essays, such as 'describe the ways in which the atmosphere sustains life on Earth.' These questions are designed such that introductory students could formulate at least a rudimentary response. We anticipate that the detail and sophistication of responses will increase as students progress through the InTeGrate curriculum.
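The point structure above can be summarized, purely for illustration, as a small scoring table. This is not part of the instrument itself; the level keys, response format, and function below are hypothetical.

```python
# Hypothetical sketch of the GLE's three-level point structure (illustration only).
MAX_POINTS = {1: 1, 2: 2, 3: 3}  # Level 1: 1 pt; Level 2: up to 2 pts; Level 3: up to 3 pts

def total_score(responses):
    """Sum awarded points, capping each item at its level's maximum.

    responses: list of (level, points_awarded) tuples, e.g. [(1, 1), (2, 1.5), (3, 2)].
    """
    return sum(min(points, MAX_POINTS[level]) for level, points in responses)

print(total_score([(1, 1), (2, 1.5), (3, 2)]))  # -> 4.5
```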

Current Version of the Instrument

The following secure files contain the current set of GLE questions. This set contains questions that passed a preliminary item analysis following field tests in InTeGrate classrooms. Level 3 items did not have sufficient responses to analyze and have been removed. Access is restricted to preserve the validity of the questions; follow the instructions after clicking on a file. To maintain control of these questions, any GLE questions must be administered in a secure environment: they must be given as a paper-and-pencil quiz or on a secure LMS and proctored in either case. Questions must NOT be returned to students.

How the GLE is used:

Eight common multiple-choice questions were piloted as pre- and post-instruction exams in all InTeGrate materials enactments. This subset provides comparable project-level data, along with the interdisciplinary and systems-thinking essay questions and the InTeGrate Attitudinal Instrument. Any of the remaining 44 questions can be used in addition to the common eight, as relevant to course content.

  • Proctored and secure environment: The GLE is most often administered on paper in a proctored setting to keep the questions secure. There is also an option to deliver the GLE online, if a secure and proctored environment is possible.
  • Data acquisition and storage: Paper copies are submitted to Serckit, the secure content management system at the Science Education Resource Center (SERC). Through Serckit, data are anonymized and encrypted, and any data from students who do not consent to the research study are removed, in accordance with institutional IRB requirements. Pairing of student data through Serckit allows for longitudinal comparisons and for linking responses across multiple instruments (an illustrative sketch of anonymizing and pairing records follows this list).
  • Pre-post: The GLE is administered pre-instruction (during the first week of class) and then again post-instruction (during the final week of class).
  • Undergraduates: The GLE was designed for and tested with undergraduate students.
  • IRB: Development and project-level data analysis for the GLE have been under the jurisdiction of the Institutional Review Board on Human Subjects Research of Carleton College. On individual campuses, instructors have obtained either implied consent or signed consent (as required by their institution's IRB).
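The anonymize-and-pair step described under "Data acquisition and storage" can be sketched as follows. This is an assumption-laden illustration, not Serckit's actual workflow: the field names, salt, and hashing scheme are invented, and only the general idea (hash identifiers, drop non-consenting students, join pre/post records) reflects the description above.

```python
# Illustrative sketch of anonymizing and pairing pre/post records (not Serckit's
# real implementation; field names, salt, and hashing scheme are assumptions).
import hashlib

SALT = "project-specific-secret"  # hypothetical salt stored separately from the data

def anon_key(student_id: str) -> str:
    """Replace an identifiable student ID with a one-way hashed key."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def pair_records(pre, post, consented):
    """Pair pre/post rows by anonymized key, keeping only consenting students."""
    pre_by_key = {anon_key(r["student_id"]): r for r in pre if r["student_id"] in consented}
    pairs = []
    for row in post:
        if row["student_id"] not in consented:
            continue  # drop students who did not consent to the research study
        key = anon_key(row["student_id"])
        if key in pre_by_key:
            pairs.append({"key": key, "pre": pre_by_key[key]["score"], "post": row["score"]})
    return pairs
```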

Usage to date:

The GLE-8 has been used in the following contexts:

  • Pilots: Courses that piloted InTeGrate curriculum materials, a required step in the InTeGrate materials development process (100+ enactments at 100+ institutions).
  • IPs: Many of InTeGrate's Implementation Programs taught with InTeGrate materials and collected and submitted GLE data. By that point, the materials had already been piloted in at least three courses, the data analyzed, and revision plans put in place.
  • Post-publication enactments: After InTeGrate materials were piloted and revised, data have been collected on the published modules:
    • InTeGrate Research Team: Seven instructors taught without InTeGrate materials in Fall 2015 and then taught the same courses incorporating InTeGrate materials in Spring and Fall 2016.
    • Ongoing non-InTeGrate courses: Courses enlisted to test InTeGrate's Geoscience Literacy Exam and the InTeGrate Attitudinal Instrument (IAI).

Development of the GLE:

The GLE was developed through a community approach to maximize adoption by ensuring relevance across departments and programs, institution types, and InTeGrate materials. Development was led by the InTeGrate assessment team, with revisions informed by InTeGrate materials authors and the leadership team, as well as by outside expert reviews. The content was guided by a materials development rubric (guiding principles, learning outcomes, assessment and measurement, active learning pedagogies, and alignment) and encodes the "big ideas" within the first four earth science literacy documents (atmospheric, climate, earth, and ocean sciences).

Process of selecting, vetting, and testing items:

Expert review
After items were reviewed by the InTeGrate Assessment Team and first content authors, all questions were also sent out for two types of expert review: content and assessment. The content reviewers were content experts who were asked to 1) respond to the clarity of the relationship of each question to the relevant literacy document, 2) confirm that the questions go beyond memorization, and 3) confirm that the questions were scientifically correct. Multiple-choice distractors were also reviewed to ensure that the given options were reasonable, and experts were asked for any other alternate conceptions that could be added to the list. Content reviewers were also asked to provide their expert answer to the essay questions.

  1. Does the individual question clearly address a core concept in the literacy document? If not, what core concept could best be assessed using this approach for [Earth/Atmosphere/Ocean/Climate] literacy?
  2. Does the question probe the concept at a level that requires insight beyond simple memorization?
  3. Is the question scientifically correct (are there any science errors in the questions or answers)?
  4. Are the distractors reasonable for the question (any missing alternative conceptions)?
  5. Please provide your thoughts on the key conceptual elements an expert would expect for the essay questions.

Pilot testing
The eight common GLE questions were tested as a pre- and post-course measure both in control courses taught without InTeGrate materials and in courses taught with InTeGrate materials. Two essays were also tested as a post-course measure. At least three instructors at three institutions piloted GLE questions for each of the InTeGrate modules and courses. The courses also included science and non-science InTeGrate participants.

Item response and test theory

Item difficulty (via classical test theory, CTT, and item response theory, IRT), answer distributions, and distractor performance were examined. The eight common multiple-choice questions used across all pilot tests passed validity and reliability testing, while the other tested questions appear promising (Steer, 2013). Student gains were seen from the pre-test to the post-test, with the highest gains in the lowest quartile (Iverson et al., 2016).
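For readers unfamiliar with these statistics, the sketch below shows two standard measures in minimal form: CTT item difficulty (the proportion of students answering an item correctly) and a normalized pre/post gain. The sample data are invented, and this is not the project's analysis code; it only illustrates the kind of quantities discussed above.

```python
# Minimal illustration of two standard item/pre-post statistics (invented data).

def item_difficulty(item_responses):
    """CTT item difficulty p: proportion of students answering the item correctly."""
    return sum(item_responses) / len(item_responses)

def normalized_gain(pre_pct, post_pct):
    """Normalized gain g = (post - pre) / (100 - pre), with scores in percent."""
    return (post_pct - pre_pct) / (100 - pre_pct)

# One multiple-choice item scored 1 = correct, 0 = incorrect across eight students
print(item_difficulty([1, 0, 1, 1, 0, 1, 1, 1]))  # 0.75 -> a relatively easy item
print(round(normalized_gain(45.0, 70.0), 2))      # 0.45 -> moderate pre-to-post gain
```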

References: