This spring, the new Common Core Standards took their first major exam. It was a test of the test, with assessments rolled out in a trial run involving millions of students in 36 states and the District of Columbia.

These were field tests only: schools won’t see the results and they won’t show up on kids’ school records. The idea was to identify problems now and resolve them in time for next year’s assessments — when the results will matter.

Unlike previous standardized tests, which a child could complete at her desk with just a #2 pencil, the Common Core assessments are taken online. This means that not only must schools have enough computers and broadband access, but their students need the right technical skills as well.

So — was the test of the test a pass or a fail?

Better than expected (but not so great)

The field tests wrapped up in early June, and the early consensus is that they passed — if not with flying colors, then certainly with a solid C+.

Education Week’s report on 29 school districts in 24 states found many small problems, but concluded that the tests went better than many educators expected. “There were a few hiccups, but overall it went well,” said Joe Willhoft, executive director of the Smarter Balanced Assessment Consortium, in the careful language of a longtime education administrator. (Smarter Balanced was one of two state-led consortia that designed the Common Core assessments; the other was the Partnership for Assessment of Readiness for College and Careers (PARCC). PARCC officials were also positive about the field tests.)

John Burke, who supervises the San Francisco Unified School District’s Achievement Assessments office, agreed that the field tests went well overall. “It was pretty much what I expected… I would have liked everything to work better, with no glitches, but the glitches fit my expectations of what a field test should be.”

Jackie King, Smarter Balanced’s Director of Higher Education Collaboration, kept in touch with education administrators around the country throughout the testing period. “There were no system failures,” she reported. “Just little niggling problems that were minor, but disruptive.” Case in point: if an iTunes auto-update message popped up during the test and a student accepted the update, it would kick the student out of the assessment server. “So we had to get the word out to turn off the auto update function,” said King.

That’s the big picture. But a close-up look at the testing process at a few randomly selected schools underscores the tech challenges many schools face — not just to provide the infrastructure and hardware, but to make sure some students are not at a drastic disadvantage.

Notes from the field test

At some schools, glitches had more to do with the hardware than with the tests themselves. At one San Francisco school, for example, outdated computers shut down mid-test, and students lost their work.

At Millikan Middle School in Sherman Oaks, California, some students used laptops; others used iPads supplied by the Los Angeles Unified School District. Kim Estrada, Millikan’s testing coordinator, said many students had trouble using the iPads. “The kids did better when they could use the external mouse, versus an iPad where there was so much room for user error.” Millikan won’t be using iPads next time around, according to Estrada: the school has purchased laptops and laptop carts for next year’s test.

Three San Francisco elementary schools didn’t participate in the field test at all, according to San Francisco Unified School District’s John Burke. Monroe Elementary School, for one, opted out — in part because the school didn’t have a single computer that met the requirements for the assessments, according to parent Kentaro Iwasaki, a member of the school’s tech committee.

Iwasaki says the district offered to provide laptops for the tests, but Monroe didn’t have sufficient WiFi capacity. A renovation at the school also complicated space and connectivity issues.

Iwasaki worries about equity issues at schools like Monroe, where 77 percent of students are low-income. “Well-funded schools where parents make big contributions to the PTA have computer labs with the latest technological equipment. Their kids are tech-savvy,” he says. “A school like ours has to struggle to catch up. There’s a real equity gap in terms of resources at different schools around the city.”

Tech skills gap

Millikan’s Kim Estrada also observed glaring gaps in her students’ skills. “Some kids didn’t even know how to find the on and off switch, or how the touch screen worked, while others were completely comfortable with the devices,” she said.

Estrada points out that the tests require fairly sophisticated computer skills. Kids must know how to drag-and-drop — to plot points on a graph, for example — and have strong typing skills since they’re required to write essays.

“We had a lot of kids who knew the content,” adds Estrada, “but didn’t have the skills because they don’t have computers at home.”

The connectivity gap

For many schools, connectivity presented the biggest hurdle during the field test. “[B]andwidth was more of a problem than having enough computers,” says Jackie King.

At Monroe Elementary, spotty WiFi was one of the reasons the school didn’t test students this year. At San Francisco’s Gateway High School, administrators asked teachers not to use computers during the testing period, just in case there wasn’t enough Internet capacity. “We didn’t know if this was a real concern, but we wanted to play it safe,” says Vice Principal Shawna Gallo, who said that the field tests went smoothly at Gateway.

At California’s Natomas School District, schools didn’t have sufficient bandwidth to test adjoining classrooms without causing computers to crash, so the district scheduled testing in every other classroom, according to EdSource.

Education technology advocate Evan Marwell argues that such connectivity constraints in U.S. schools are unacceptable. According to his nonprofit EducationSuperHighway, 72 percent of public schools have insufficient Internet access — not just for the online assessments, but for day-to-day operations.

“The typical U.S. school has the same bandwidth as a four- to five-person home — even though the average school has 600 students,” Marwell points out. “Internet access and speed that adults take for granted at work isn’t available to our kids, even as education content just keeps getting better.” Low-poverty schools are three times more likely to have broadband than schools with high rates of poverty, according to Marwell.

What’s next?

With field tests complete, Smarter Balanced and PARCC staff will pore over test results, evaluating test questions and figuring out what worked and what didn’t.

Jackie King reported that Smarter Balanced also plans to evaluate how the tests worked for different groups of students, in terms of technology skills and other issues, and make recommendations to states and districts based on these conclusions.

District administrators like SFUSD’s Chief Technology Officer Matt Kinzie will use the field test experience to inform tech decisions for next year. “We didn’t want to risk spending public funds until we knew what we needed,” says Kinzie. “Now that we’ve been through the field test, we’re putting together a plan.” Under the plan, schools will be able to choose one of three computer devices (Chromebook, MacBook, or a Windows machine) for assessments and regular instruction. Every San Francisco elementary school will also get wireless connectivity.

Such piecemeal improvements are essential, but schools still have a long way to go to meet the technological demands of the 21st century, according to Marwell. The Common Core Standards didn’t create the digital divide — they’re just throwing it into stark relief by increasing the tech demands schools face.

“The Common Core in concept is about creating equity, so every kid gets the same level of education,” Marwell says. “But if schools don’t have the technology or the connectivity, and students don’t have the skills, they aren’t going to benefit — no matter how great the intentions.”
