Top Ed-Tech Trends of 2011
A Hack Education Project
Data (Which Still Means Mostly Standardized Testing)
This post first appeared on Hack Education on December 6, 2011. Part 4 in my Top 10 Ed-Tech Trends of 2011 series.
If data was an important trend for 2011, I predict it will be even more so in 2012. That's the world we're living in. That's the world we're moving into.
More of our activities involve computers and the Internet, whether it's for work, for school, or for personal purposes. Thus, our interactions and transactions can be tracked. As we click, we leave behind a trail of data--something that's been dubbed "data exhaust." It's information that's ripe for mining and analysis, and thanks to new technology tools, we can do so in real time and at massive Web scale.
There's incredible potential for data analytics to impact education. We already collect a significant amount of data about schools and students (attendance, demographics, test scores, free and reduced lunches, and the like), but much of it is administrative and/or siloed and/or unexamined.
Earlier this year, I interviewed George Siemens for O'Reilly Radar about the ways in which data and analytics have the potential to improve teaching and learning. He argued that
Education is, today at least, a black box. Society invests significantly in primary, secondary, and higher education. Unfortunately, we don't really know how our inputs influence or produce outputs. We don't know, precisely, which academic practices need to be curbed and which need to be encouraged. We are essentially swatting flies with a sledgehammer and doing a fair amount of peripheral damage. Learning analytics are a foundational tool for informed change in education. Over the past decade, calls for educational reform have increased, but very little is understood about how the system of education will be impacted by the proposed reforms. I sometimes fear that the solution being proposed to what ails education will be worse than the current problem. We need a means, a foundation, on which to base reform activities. In the corporate sector, business intelligence serves this "decision foundation" role. In education, I believe learning analytics will serve this role. Once we better understand the learning process--the inputs, the outputs, the factors that contribute to learner success--then we can start to make informed decisions that are supported by evidence.
But Siemens cautions,
We have to walk a fine line in the use of learning analytics. On the one hand, analytics can provide valuable insight into the factors that influence learners' success (time on task, attendance, frequency of logins, position within a social network, frequency of contact with faculty members or teachers). Peripheral data analysis could include the use of physical services in a school or university: access to library resources and learning help services. On the other hand, analytics can't capture the softer elements of learning, such as the motivating encouragement from a teacher and the value of informal social interactions. In any assessment system, whether standardized testing or learning analytics, there is a real danger that the target becomes the object of learning, rather than the assessment of learning. (emphasis mine)
Despite the promise of personalized learning through analytics and data, what we've actually seen this year is an increasing emphasis on standardization (or rather, standardized testing). And as such, most of the stories about education data this year have been stories about testing. Stories about dismal test scores. Stories about teachers' performance tied to those student test scores. Stories about cheating.
Hand-wringing about test scores seems to accompany many of the narratives about "our failing schools." And there was plenty of hand-wringing to be had in 2011, some of it left over from the December 2010 release of the PISA (Program for International Student Assessment) scores, tests administered internationally by the OECD. The U.S.'s latest scores were decidedly average. Average in reading (500 compared to 493 among OECD member countries). Average in science (502 for the US versus the average of 502). And below average in math (487 compared to the average score of 496). The latest SAT score data released by the College Board this fall was even grimmer: both math and reading scores were down from the previous year. Reading scores fell to their lowest level since 1972. And just 43% of test-takers had scores that indicated they were college-ready.
The reason for the declining scores, according to the College Board at least, was the increased number of students participating. But there was other finger-pointing too: at poverty, at budget cuts, at No Child Left Behind (which was up for re-authorization this year), at parents, and of course, at teachers.
Despite being widely challenged, The Los Angeles Times' 2010 series on the "value-added" model for grading teachers (which looked at test scores as a way to measure a teacher's impact) continued to be pointed to as a new assessment tool. Indeed, one of the major "data" stories of 2011 may be the push for tying students' standardized test scores to teacher evaluations and teacher pay. No longer was this just the purview of the media's analysis of teachers either; this was a requirement of the Obama Administration's Race to the Top state school funding competition.
But in 2011, many people started pushing back. In February, Penn State Altoona education professor Timothy Slekar wrote in a Huffington Post op-ed about his plans to have his 11-year-old child opt out of standardized tests. Slekar questioned the ways in which testing dictates much of the curriculum and classroom time. And more recently, in late November, some 658 New York administrators came out in opposition to tying test scores to teacher evaluations, arguing that the plan was poorly designed and amounted to "education by humiliation."
But perhaps the most damning condemnation of high stakes testing this year could be found in the number of cheating scandals: Atlanta. Washington DC. Philadelphia. Long Island.
But all of this testing is, of course, big business--for test- and curriculum makers, for test-prep companies, and for learning analytics companies.
Two of the most interesting ed-tech startups in those latter categories this year were Knewton and Grockit, and both seemed to have a good 2011. In October, both companies announced major funding: Knewton raising $33 million (bringing the total amount raised by the company to $54 million) and Grockit announcing a $7 million round (bringing its total investment to over $24 million).
Both offer adaptive learning platforms--that is, the content their software presents changes based on a student's performance. Algorithms select the most appropriate next question for each student: the right content at the right level of difficulty.
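Neither company publishes the details of its algorithms, but the general idea behind adaptive item selection can be sketched in a few lines. The toy example below--a hypothetical item bank, a crude "staircase" ability update, and a simulated student--is only an illustration of the concept, not a description of how Knewton or Grockit actually works: estimate the learner's ability, serve the unanswered question whose difficulty best matches that estimate, and revise the estimate after each response.

```python
# A deliberately simplified illustration of adaptive item selection.
# This is NOT Knewton's or Grockit's algorithm -- just the generic idea:
# match item difficulty to an evolving estimate of the student's ability.

import random

def pick_next_item(items, ability, answered):
    """Choose the unanswered item whose difficulty is closest to the
    current ability estimate."""
    remaining = [i for i in items if i["id"] not in answered]
    return min(remaining, key=lambda i: abs(i["difficulty"] - ability))

def update_ability(ability, correct, step=0.5):
    """Nudge the estimate up after a correct answer, down after a miss
    (a crude staircase update)."""
    return ability + step if correct else ability - step

if __name__ == "__main__":
    # Hypothetical item bank: each item has an id and a difficulty rating.
    items = [{"id": n, "difficulty": d}
             for n, d in enumerate([-2, -1, -0.5, 0, 0.5, 1, 2])]
    ability = 0.0          # start from an average ability estimate
    answered = set()

    for _ in range(5):
        item = pick_next_item(items, ability, answered)
        answered.add(item["id"])
        # Simulate a response: items harder than the current estimate are
        # less likely to be answered correctly.
        correct = random.random() < 1 / (1 + 2.7 ** (item["difficulty"] - ability))
        ability = update_ability(ability, correct)
        print(f"item {item['id']} (difficulty {item['difficulty']:+.1f}): "
              f"{'correct' if correct else 'wrong'}, new estimate {ability:+.1f}")
```

Real adaptive platforms are, of course, far more sophisticated--modeling concepts rather than single difficulty scores, and drawing on data from many students--but the basic loop of select, respond, and re-estimate is the same.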
But Grockit and Knewton are quite different. Grockit describes itself as a social learning company, supplementing the individualized practice for the SATs, GREs, APs and the like (the adaptive learning piece) with social learning opportunities. That means it isn't simply one student working alone with the software. Grockit also offers a way for students to work together--via the Grockit website or via its new Facebook app, as well as with tutors.
Knewton's path this year became more tightly entwined with Pearson's. The education giant was one of the investors in Knewton's funding round this fall, and in November the two announced a partnership that would enable Knewton's adaptive platform to power several of Pearson's higher ed MyLab and Mastering courses. It's just the math, reading, and writing MyLabs for now, but it's worth pointing out that Pearson boasts that it's the content provider for some 9 million college students.
This sort of pairing--adaptive learning plus textbook content--will probably be one of the major trends of 2012 too. But as I noted when I wrote about this topic for Inside Higher Ed, we need to ask a lot of questions if this is indeed what we want to construe as "personalized learning." And as student data becomes more of a goldmine for schools, researchers, and companies alike, those questions will only multiply.