Aug 08 2016
Classroom

Test Takers Continue to Face Challenges When Using Some Technology

Reports suggest that familiarity with devices is key to better performance on standardized tests.

The average U.S. student completes 112 mandatory standardized tests between pre-K and graduation, according to the Council of the Great City Schools, and last year, for the first time, a majority of those tests were given using technology. Now comes research suggesting that “device effects” may be hindering performance for some students.

There is still much to learn about how digital and paper tests compare, but Government Technology magazine reports that students who took an electronic version of the Common Core State Standards exams scored lower than their peers who took a pencil-and-paper version.

Research on multiple exams compiled by the Council of Chief State School Officers found that device use made it difficult to compare test results across students, schools, districts and states. Students experienced issues with screen size — particularly when they had to scroll to read large passages — and had trouble with touch screens when questions required a precise answer. The CCSSO also found that students who used a device with an onscreen keyboard to answer essay questions wrote less than students who used a desktop computer keyboard.

“Because students cannot rest their fingers on the onscreen keyboard, students’ keyboarding skills are restricted and they instead defer to the ‘hunt-and-peck’ method to input their responses to essay items,” the CCSSO report states.

Not surprisingly, younger students didn’t have issues typing with an onscreen keyboard, CCSSO found, likely because they are inexperienced typists and hunt-and-peck is the method they regularly use.

Further study is needed on device comparability, the CCSSO report summary states, but familiarizing students with the technology they will use for tests is key: “The current literature suggests that the difference in devices can be minimized if all students are sufficiently fluent with the functionality of the device on which they are testing.”

Other recommendations include standardizing across devices for the amount of content that can be read on a screen without scrolling; larger items for easier touch-screen input; and external keyboards for answering essay questions.

Data from the Partnership for Assessment of Readiness for College and Careers showed that the 5 million students who took the 2014–2015 PARCC assessments on a computer scored lower than those who took them with paper and pencil.

PARCC’s data also indicated that problems were dependent on the type of device students were using and the type of questions they were answering. Students who took the Algebra 1 and Geometry exams, for example, struggled with certain tasks on tablets, while students taking the Algebra 2 exam found some questions difficult to answer using a desktop or notebook computer.
