“Science in Action”: New Forms of Student Assessment

By Eric Wearne, Senior Fellow, Georgia Public Policy Foundation

Each year the National Assessment Governing Board (NAGB) and the National Center for Education Statistics (NCES) announce results for the tests that make up the National Assessment of Educational Progress (NAEP).  This week, results from the 2009 science assessment in grades four, eight, and 12 were announced in a report titled “Science in Action.”  This announcement was different because, for the first time, the results included student performance on interactive computer tasks (ICTs).

The results include both hands-on and interactive computer tasks.  Students have been doing hands-on activities on NAEP science tests since the 1990s. A video describing the hands-on tasks is here.  In these tasks, students receive a kit with materials and lab equipment, and, in 40 minutes, they conduct a scientific investigation, apply their knowledge by working through a problem, and describe their results.

The ICTs, however, were new in 2009.  These are 20- or 40-minute tasks that, like the hands-on tasks, ask students to watch a tutorial, predict the results of a real-world science situation, conduct an investigation and observe what happens, and then explain the results.  The obvious difference is that all of the experimentation, data collection, and writing are done in a simulated computer environment.

In the 4th grade task called “Mystery Plants,” for example, students are presented with a greenhouse allowing varying levels of light, different types of plants, and different types of fertilizer.  Students are able to move the plants around in the greenhouse and experiment with various levels of sunlight and types of nutrients.  They set up their own experimental design and describe their results.  To the student, the task looks (in part) like this:

[Screenshot from the NAEP report: the “Mystery Plants” greenhouse interface as students see it.]
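
For readers curious about what such a simulation involves, here is a minimal, hypothetical sketch in Python of a “Mystery Plants”-style experiment.  It is not NAEP’s actual task software; the growth model, the grow_plant function, and the fertilizer names are invented purely to illustrate the kind of variable manipulation (light, nutrients) that students carry out in the ICTs.

import random

def grow_plant(light_hours, fertilizer, weeks=4, seed=None):
    """Return a simulated plant height in cm after a few weeks of growth."""
    rng = random.Random(seed)
    # Made-up effectiveness values, for illustration only
    fertilizer_boost = {"none": 0.0, "type_a": 1.5, "type_b": 2.5}[fertilizer]
    height = 0.0
    for _ in range(weeks):
        # Diminishing returns on light beyond about 8 hours per day
        light_effect = min(light_hours, 8) * 0.5
        height += light_effect + fertilizer_boost + rng.uniform(-0.5, 0.5)
    return round(height, 1)

# A student-style experiment: hold light constant, vary only the fertilizer
for fert in ("none", "type_a", "type_b"):
    print(fert, grow_plant(light_hours=6, fertilizer=fert, seed=42))

In the real ICTs, of course, students work in a graphical greenhouse rather than with code, but the underlying idea is the same: change one variable at a time, observe the outcome, and explain it.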

As for the results of the assessments, NCES came away with three “key discoveries”:

1. Students were successful on parts of the investigations that involved limited sets of data and making straightforward observations of that data.

For example: In 8th grade, “84% of students could correctly use a simulated laboratory to test how much water flowed through two different types of soil.”

2. Students were challenged by parts of investigations that contained more variables to manipulate or involved strategic decision making to collect appropriate data.

Again in 8th grade, “24% of students could appropriately decide how to manipulate four metal bars made of unknown materials to determine which ones were the magnets” and “…apply their knowledge of how to test for magnetic properties.”

3. The percentage of students who could select correct conclusions from an investigation was higher than the percentage who could both select correct conclusions and explain their results.

This is to be expected, but the gap between being able to select a correct answer and then also explain it was large.  In 8th grade, “88% of students could select the liquid that flowed at the same rate as water at a given temperature, while 54% could select the correct liquid and support this conclusion in writing using evidence from their own investigation.”

Overall, students in 4th grade answered 42% of the questions on their ICTs correctly. Eighth graders answered 41% correctly, and 12th graders answered 27% correctly.  Girls scored slightly better than boys on the hands-on tasks; boys scored slightly better than girls on the traditional paper-and-pencil forms; and there was no difference between boys and girls on the ICTs.

So why do scores seem to decline the longer students stay in school?  NAEP also includes a questionnaire as part of each assessment, and the results suggest some possible, partial reasons.  Ninety-two percent of 4th graders said they did “hands-on activities or investigations in science” at least once a month.  Ninety-eight percent of 8th graders said the same.  In 12th grade, however, just 51% of students said they “designed a science experiment” at least once every few weeks.  And this probably underestimates 12th graders’ lack of science participation, as the questionnaire also found that only 53% of them were even taking a science course.

What might this mean for online learning?  We know we can do multiple-choice tests and even written assessments online now.  These results show that we are beginning to be able to do richer, more complex assessments as well.  As an added bonus, students actually seem to like them.  Anecdotally, students told test administrators, “This is fun!” and asked, “Are we allowed to do as many as we want?”  Anyone can take any of the published ICTs and test themselves by going to the NAEP website here.

(Eric Wearne is a Georgia Public Policy Foundation Senior Fellow and Assistant Professor at the Georgia Gwinnett College School of Education.  Previously he was Deputy Director of the Governor’s Office of Student Achievement.)
