Education Technology and the Ideology of "Personalization"
This is part nine of my annual review of the year in ed-tech.
Facebook’s Plans to “Personalize” Education
Facebook, like many digital technology companies, promises that in exchange for collecting your personal data – your name, your age, your gender, your photos, metadata on your photos, your location, your preferences, your browsing and clicking habits, your friends’ names – it will deliver “personalization.” A personalized news feed, for example. Personalized ads.
“Personalization” is also the cornerstone of the investment strategy for Mark Zuckerberg’s new venture philanthropy firm, the Chan Zuckerberg Initiative. And “personalization” is the underlying promise of the new education software Facebook is itself building.
Facebook worked with the Summit Public Schools charter chain in order to develop this “personalized learning platform,” which it released last year and now licenses to other schools under the product name “Basecamp.” Some 20 Facebook engineers work on the software, and according to The Washington Post, the student information it tracks and stores is not housed on Facebook servers, although Facebook does have access to the data.
Parents must sign away the privacy rights for their children who use Basecamp, as is required under COPPA. But in this case, they must also sign away their right to sue Facebook or Summit Public Schools in case of a problem (like, say, a data breach). Basecamp’s Terms of Service “require disputes to be resolved through arbitration, essentially barring a student’s family from suing if they think data has been misused. In other realms, including banking and health care, such binding arbitration clauses have been criticized as stripping consumers of their rights.” Data can be shared with any company that Facebook deems necessary. “A truly terrible deal,” says Cathy O’Neil, author of Weapons of Math Destruction.
Basecamp is essentially a learning management system (with the adjective “personalized” appended to it). According to The New York Times, “The software gives students a full view of their academic responsibilities for the year in each class and breaks them down into customizable lesson modules they can tackle at their own pace. A student working on a science assignment, for example, may choose to create a project using video, text or audio files. Students may also work asynchronously, tackling different sections of the year’s work at the same time.” I’ll discuss some of the competing definitions of what “personalization” might mean, but in this case, it’s the emphasis on working “at your own pace” on school assignments.
According to Summit’s own reports on those piloting the Basecamp software, “student growth has been positive amongst the cohort schools thus far. Specifically, students who were the furthest behind (in that lowest [Measure of Academic Progress] testing bracket) outperformed the national U.S. average by 1.23 in math and 1.95 in reading, shown below. Translation: if the average American student grew by 1 point in math, the average Basecamp student grew by 1.23 points.” It’s meager growth, but as the CEO of Summit Public Schools contends, it’s better than what traditional schools are doing. (Stanford historian Larry Cuban has also written a number of articles this year on his observations of the instructional practices and technology usage at the charter school chain.)
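The “translation” in that report is just a ratio of score gains. A trivial sketch (the function name and the second pair of numbers below are invented for illustration, not from Summit’s report):

```python
# Hypothetical illustration of the relative-growth framing quoted above:
# a cohort's gain expressed as a multiple of the national average gain.
# Only the 1.23 and 1.95 figures come from the report; the rest is invented.

def relative_growth(cohort_gain: float, national_gain: float) -> float:
    """Cohort gain as a multiple of the national average gain."""
    return cohort_gain / national_gain

math_ratio = relative_growth(1.23, 1.0)     # 1.23x the national average in math
reading_ratio = relative_growth(1.95, 1.0)  # 1.95x the national average in reading
```

Note that a ratio like this says nothing about the absolute size of the gain, which is part of why “it’s meager growth” and “it outperformed the national average” can both be true.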
Charter schools have been a core part of Mark Zuckerberg’s investment in education reform since the Facebook founder famously donated $100 million to the Newark, New Jersey school system – well, not to the Newark school system, but rather to a local foundation in charge of handling the money. Despite the press coverage – the funding announcement was made on Oprah – things didn’t really go as planned, as journalist Dale Russakoff has recounted in her 2015 book The Prize.
Since his Newark fumbles, Zuckerberg has continued to fund charter school chains, but most of his investment has gone towards software companies that he hopes will help bring about structural changes in the education system, specifically through personalization. Perhaps the most high-profile of these: AltSchool.
More details on Mark Zuckerberg’s education investment portfolio can be found at funding.hackeducation.com.
Personalized Surveillance at AltSchool
AltSchool, a private school startup, was founded in 2014 by Max Ventilla, a former Google executive. AltSchool has raised $133 million in venture funding from Zuckerberg Education Ventures, the Emerson Collective (the venture philanthropy firm founded by Steve Jobs’ widow Laurene Powell Jobs), Founders Fund (Peter Thiel’s investment firm), Andreessen Horowitz, and others.
None of that funding came in 2016, and there were rumors of layoffs at the startup as it pivoted towards a focus on selling its “personalized learning” software to other schools – the seat license will cost $1000 per student – rather than opening more schools of its own. Or “Phase 2,” as TechCrunch politely called it.
In April, Edsurge reported that AltSchool had hired a new chief operating officer, Coddy Johnson, a former executive at Activision who’d been in charge of the Call of Duty video game line. Call of Duty is often criticized for its ultra-violence, but hey! Max Ventilla told Edsurge that “There aren’t a lot of people who have multiple times, managed a thousand-plus organization and done it in a way where anyone you talked to says they’re absolutely incredible from a leadership perspective.” (Of course, AltSchool is nowhere near a thousand-plus person organization, even if you count the students as workers, which perhaps you should.) Johnson’s experience with education-focused companies amounts to a seat on the board of Twigtale, a “personalized” children’s book startup. That’s his wife Carrie Southworth’s company, and its investors include Ivanka Trump and Rupert Murdoch’s ex-wife Wendi Deng. Johnson himself is the godson of George W. Bush. (Johnson’s dad was roommates with the former President while at Yale.) It’s a small world, I guess, when one is disrupting education via “personalization.”
Everything at AltSchool is driven by data. As Education Week’s Benjamin Herold observed in January at a product team meeting for the startup school’s software, Stream, the following information was analyzed by developers:
- "Parent usage, measured by total views per week, as well as the percentage of parents who viewed the app at least once each week;
- Teacher adoption, measured by the frequency with which each teacher in each classroom posted updates to the app;
- Personalization, measured by the number of student-specific posts and “highlights” per student shared over the previous two weeks;
- Quality, measured by a review of the content of every single post that every teacher had made to Stream;
- Parent and teacher satisfaction, measured through constant AltSchool surveys of each group."
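Metrics like the ones in that list are simple aggregations over an event log. A minimal sketch of how two of them might be computed – the event fields and metric definitions here are hypothetical, not AltSchool’s actual schema:

```python
# A sketch of engagement metrics like those reviewed by the Stream team:
# parent view rate and student-specific post counts. The event format is
# invented for illustration; AltSchool's actual data model is not public.
from collections import defaultdict

def weekly_parent_view_rate(events, all_parents):
    """Fraction of parents who viewed the app at least once this week."""
    viewers = {e["parent_id"] for e in events if e["type"] == "view"}
    return len(viewers) / len(all_parents)

def posts_per_student(events):
    """Count of student-specific posts, keyed by student."""
    counts = defaultdict(int)
    for e in events:
        if e["type"] == "post" and e.get("student_id"):
            counts[e["student_id"]] += 1
    return dict(counts)

events = [
    {"type": "view", "parent_id": "p1"},
    {"type": "view", "parent_id": "p1"},
    {"type": "post", "student_id": "s1"},
    {"type": "post", "student_id": "s1"},
    {"type": "post", "student_id": "s2"},
]
rate = weekly_parent_view_rate(events, ["p1", "p2"])  # 0.5
counts = posts_per_student(events)                    # {"s1": 2, "s2": 1}
```

The point of the sketch is how little “personalization” means at the level of the dashboard: it is a count of posts per student, nothing more.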
The AltSchool classroom is one of total surveillance: cameras and microphones and sensors track students and teachers – their conversations, their body language, their facial expressions, their activities. The software – students are all issued computing devices – tracks their clicks. Everything is viewed as a transaction that can be monitored and analyzed and then re-engineered.
AltSchool has attempted to brand this personalized surveillance as progressive education – alternately Montessori or Reggio Emilia (because apparently neither Silicon Valley tech executives nor education technology marketers know the difference between their Italian learning theories).
Defining Personalization (and Insisting It “Works”)
There isn’t one agreed-upon definition of “personalization” – although there were lots and lots and lots and lots and lots and lots of articles published this year that tried to define it (and at least one that said “stop trying”).
That fuzziness – “moving goalposts” as math educator Dan Meyer has called it – does not stop the word “personalization” from being used all the time in policy documents and press releases: “personalized test prep” and “personalized CliffsNotes” and the like. These two examples highlight quite well the mental gymnastics necessary to believe that a “personalized” product is actually personalized. This isn’t about a student pursuing her own curiosity – the topics covered by both CliffsNotes and standardized tests are utterly constrained. Personalization is not about the personal; it does not involve students controlling the direction or depth of their inquiry.
It’s just the latest way to describe what B. F. Skinner called “programmed instruction” back in the 1950s.
There were several attempts this year to link the history of “personalized learning” to recent education reforms (but not surprisingly, not to Skinner). “The hottest trend in education actually started in special-ed classrooms 40 years ago,” as Business Insider contended in October. These sorts of articles, many parroting the quite paltry historical knowledge of ed-tech investors, tend to argue that personalization has its roots in the 1970s, in the work of educational psychologists like Benjamin Bloom, for example. Alas, no one reads Rousseau anymore, do they? Or more likely, Rousseau’s vision of education is harder to systematize and monetize and turn into “personalized” flashcards. “Can Venture Capital Put Personalized Learning Within Reach of All Students?” Edsurge asked in June. Poor Rousseau. Without NewSchools Venture Fund, he never had a chance.
Many of the discussions about “personalized learning” insist that technology is necessary for “personalization,” often invoking stereotypes of whole class instruction and denying the myriad ways that teachers have long tailored what they do in the classroom to the individual students in it. Teachers look for interpersonal cues; they walk around the classroom and check on students’ progress; they adjust their lessons and their assignments in both subtle and conspicuous ways. In other words, “personalization” need not rely on technology or on data-mining; it does, however, demand that teachers attend to students’ needs and to students’ interests.
But “personalization” – at least as it’s promoted by education technology companies and their proponents – requires data collection, and it requires algorithms and analytics. The former, as a practice, is already in place in education. Indeed, in April, the Data Quality Campaign issued a report claiming that schools have collected plenty of data, and now it’s time to use it to “personalize learning.”
But again, what does that phrase “personalize learning” mean?
Education technology companies hope it means that schools buy their products. In a Data & Society report on personalized learning – probably the most helpful guide on the topic – Monica Bulger has identified five types of products that market themselves as “personalized”:
- Customized learning interface: Invites students to personalize the learning experience by selecting colors and avatars, or uses interest, age or geographic indicators to tailor the interface.
- Learning management: Platforms that automate a range of classroom management tasks.
- Data-driven learning: A majority of platforms described as ‘adaptive’ fall into this category of efficient management systems that provide materials appropriate to a student’s proficiency level.
- Adaptive learning: Data-driven learning that potentially moves beyond a pre-determined decision tree and uses machine learning to adapt to a student’s behaviors and competency.
- Intelligent tutor: Instead of providing answers and modular guidance, inspires questions, interacts conversationally and has enough options to move beyond a limited decision tree.
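The line Bulger draws between “data-driven” and “adaptive” systems – a pre-determined decision tree versus a policy that updates from observed behavior – can be made concrete with a toy contrast. Everything below (the item pool, both policies, the 70% target) is invented for illustration; no actual product’s logic is being reproduced:

```python
# Toy contrast between Bulger's categories: a fixed decision tree vs. an
# epsilon-greedy policy that updates from observed success rates. All
# names, thresholds, and items here are hypothetical.
import random

ITEMS = {"easy": ["e1", "e2"], "medium": ["m1", "m2"], "hard": ["h1", "h2"]}

def decision_tree_pick(score: float) -> str:
    """'Data-driven': a pre-determined rule maps a proficiency score to a bucket.
    The same input always yields the same item."""
    if score < 0.4:
        return ITEMS["easy"][0]
    elif score < 0.7:
        return ITEMS["medium"][0]
    return ITEMS["hard"][0]

def adaptive_pick(history, epsilon=0.1, rng=random):
    """'Adaptive': pick the bucket whose observed success rate is closest to a
    target (here ~70%), occasionally exploring at random. `history` maps each
    bucket to a list of past outcomes (1 = success, 0 = failure)."""
    rates = {b: (sum(h) / len(h) if h else 0.5) for b, h in history.items()}
    if rng.random() < epsilon:                        # explore
        bucket = rng.choice(list(ITEMS))
    else:                                             # exploit
        bucket = min(rates, key=lambda b: abs(rates[b] - 0.7))
    return ITEMS[bucket][0]

print(decision_tree_pick(0.55))  # "m1" -- a fixed rule, however data-fed, is not adaptive
```

The distinction matters for Bulger’s taxonomy because most products marketed as “adaptive” behave like the first function, not the second.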
Bulger’s report also underscores one of the most important caveats for “personalized learning” products: they aren’t very good.
While the responsiveness of personalized learning systems holds promise for timely feedback, scaffolding, and deliberate practice, the quality of many systems is low. Most product websites describe the input of teachers or learning scientists into development as minimal and after the fact. Products are not field tested before adoption in schools and offer limited to no research on the efficacy of personalized learning systems beyond testimonials and anecdotes. In 2010, Houghton Mifflin Harcourt commissioned independent randomized studies of its Algebra 1 program: Harcourt Fuse. The headline findings reported significant gains for a school in Riverside, California. The publicity did not mention that Riverside was one of four schools studied, that the other three showed no impact, and that in Riverside, teachers who frequently used technologies were selected for the study, rather than being randomly assigned. In short, very little is known about the quality of these systems or their generalizability.
Nevertheless, Knewton claims Knewton’s personalized learning products work. Pearson claims Pearson’s personalized learning products work. Blackboard claims Blackboard’s personalized learning products work. McGraw-Hill claims McGraw-Hill’s personalized learning products work. Front Row claims Front Row’s personalized learning products work. Organizations in the business of lobbying for and investing in “personalized” ed-tech claim personalized ed-tech works. And so on.
IBM Watson and the “Cognitive Era”
Perhaps the company with the biggest advertising budget for promoting its version of “personalized learning” is IBM, which has been running TV spots for about a year now touting the capabilities of Watson, its artificial intelligence product. Watson famously won Jeopardy! in 2011, a PR stunt that the company hoped would demonstrate how well it could handle Q&A. Since then, IBM has moved to commercialize Watson, particularly in healthcare and education.
This year, IBM announced Watson would be used to power an advising system at the University of Michigan. IBM released a Watson-powered iPad app. IBM partnered with the American Federation of Teachers. It partnered with Sesame Street. It partnered with Blackboard. It partnered with Pearson.
University of Stirling’s Ben Williamson has described the partnership between IBM and Pearson as “part of a serious aspiration to govern the entire infrastructure of education systems through real-time analytics and machine intelligences, rather than through the infrastructure of test-based accountability that currently dominates. … IBM and Pearson are seeking to sink a cognitive infrastructure of accountability into the background of education – an automated, data-driven, decision-making system which is intended to measure, compare, reorganize and optimize whole systems, institutions and individuals alike.”
For its part, IBM says that, with Watson, it will bring education into the “cognitive era” through personalization: “Cognitive solutions that understand, reason and learn help educators gain insights into learning styles, preferences, and aptitude of every student. The results are holistic learning paths, for every learner, through their lifelong learning journey.” Its product, Watson Element, “is designed to transform the classroom by providing critical insights about each student – demographics, strengths, challenges, optimal learning styles, and more – which the educator can use to create targeted instructional plans, in real-time.”
Roger Schank, a pioneer of “cognitive computing,” doesn’t buy it. “Could IBM stop lying about Watson already? I guess not,” he wrote in April. “Is IBM trying to kill off AI research by misusing the word ‘cognitive?’” he wrote in May. The word “cognitive,” he argues, no longer has any meaning.
I am trying to understand what IBM could possibly mean when it uses the word cognitive and announces that we are now in the “cognitive era”. Do they think that Watson is actually thinking? I certainly hope not.
Do they think that Watson is imitating how people think in some way? I can’t believe that they think that either. No one has ever proposed that machines that can search millions of pages of text are smart. Matching key words, no matter how well you do it, is not even a human capability much less one that underlies the human ability to think.
The use of Watson at Georgia Tech to create a “robot teaching assistant” garnered lots of headlines about the possibilities for automation and artificial intelligence to “save education.” But it also confirms some of Schank’s arguments about how truly overrated Watson is as any sort of pedagogical agent. Jill Watson, as the program was called (of course it’s a woman’s name), answered students’ questions on a course website – or rather, answered those questions when it had a confidence rate of 97% that it could respond correctly. “Most chatbots operate at the level of a novice,” Ashok Goel, the CS professor who built the program, told The Wall Street Journal. “Jill operates at the level of an expert.” What Jill demonstrates isn’t really “smarts” or “intelligence,” and it isn’t “pedagogical”; it’s just a more efficient (and expensive) Q&A system.
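The confidence-gated pattern described there – answer only when a match score clears a threshold, otherwise stay silent and let a human respond – is a simple retrieval trick, not “expertise.” A sketch, where the scoring (crude bag-of-words overlap) and the FAQ entries are invented stand-ins for whatever matching IBM actually does:

```python
# A sketch of the confidence-gated Q&A pattern attributed to "Jill Watson":
# return the best-matching canned answer only if confidence clears a
# threshold, otherwise defer to a human TA. The scoring function and FAQ
# below are hypothetical, not IBM's actual retrieval machinery.

FAQ = {
    "when is the project proposal due": "The proposal is due in week 3.",
    "what format should the proposal use": "Submit the proposal as a PDF.",
}

def overlap_confidence(question: str, candidate: str) -> float:
    """Crude confidence: fraction of the candidate's words found in the question."""
    q, c = set(question.lower().split()), set(candidate.lower().split())
    return len(q & c) / len(c)

def answer(question: str, threshold: float = 0.97):
    """Answer only when the best match clears the threshold; else stay silent."""
    best_q = max(FAQ, key=lambda k: overlap_confidence(question, k))
    if overlap_confidence(question, best_q) >= threshold:
        return FAQ[best_q]
    return None  # below threshold: a human TA answers instead

print(answer("when is the project proposal due"))        # near-exact match: answered
print(answer("can I get an extension on the proposal"))  # low confidence: None
```

Setting the threshold very high is what makes such a bot look “expert”: it simply refuses every question it might get wrong.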
Nevertheless, how IBM imagines intelligence – how it imagines the human brain works and how the brain learns – will shape the cognitive systems it builds. And the marketing – all those TV ads – will shape our understanding of “intelligence” in turn.
The goal, says IBM: to “achieve the utopia of personalised learning.”
Marketing the Mindsets
Intertwined with the push for “personalization” in education are arguments for embracing a “growth mindset.” The phrase, coined by Stanford psychologist Carol Dweck, appears frequently alongside talk of “personalized learning” as students are encouraged to see their skills and competencies as flexible rather than fixed. (Adaptive teaching software. Adaptive students.)
The marketing of mindsets was everywhere this year: “How to Develop Mindsets for Compassion and Caring in Students.” “Building A Tinkering Mindset In Young Students Through Making.” “6 Must-Haves for Developing a Maker Mindset.” The college president mindset. Help wanted: must have an entrepreneurial mindset. The project-based learning mindset. (There’s also Gorilla Mindset, a book written by alt-right meme-maker Mike Cernovich, just to show how terrible the concept can get.)
“Mindset” joins “grit” as a concept that’s quickly jumped from the psychology department to (TED Talk to) product. Indeed, Angela Duckworth, who popularized the latter (and had a new book out this year on grit), now offers an app to measure “character growth.” “Don’t Grade Schools on Grit,” she wrote in an op-ed in The New York Times. But there are now calls that students should be tested – and in turn, of course, schools graded – on “social emotional skills.”
Promising to measure and develop these skills are, of course, ed-tech companies. Pearson even has a product called GRIT™. But it’s probably ClassDojo, a behavior tracking app, that’s been most effective in marketing itself as a “mindset” product, even partnering with Carol Dweck’s research center at Stanford.
The startup, which has raised $31.1 million in venture funding ($21 million of that this year), is “teaching kids empathy in 90% of K–8 schools nationwide,” according to Fast Company. Edsurge says ClassDojo is used by two-thirds of schools, and Inc says it’s used by one out of four students, but hey. What’s wrong with a little exaggeration, right? It’s only “character education.”
More details on who’s funding “character education” startups are available at funding.hackeducation.com.
Ben Williamson argues that ClassDojo exemplifies the particularly Silicon Valley bent of “mindset” management:
The emphasis … is on fixing people, rather than fixing social structures. It prioritizes the design of interventions that seek to modify behaviours to make people perform as optimally as possible according to new behavioural and psychological norms. Within this mix, new technologies of psychological measurement and behaviour management such as ClassDojo have a significant role to play in schools that are under pressure to demonstrate their performance according to such norms.
In doing so, ClassDojo – and other initiatives and products – are enmeshed both in the technocratic project of making people innovative and entrepreneurial, and in the controversial governmental agenda of psychological measurement. ClassDojo is situated in this context as a vehicle for promoting the kind of growth mindsets and character qualities that are seen as desirable behavioural norms by Silicon Valley and government alike.
ClassDojo’s popularity is down to its meeting of teachers’ concerns about behaviour management. But, it has fast become part of a loose network of governmental, academic and entrepreneurial agendas focused on behavioural measurement and modification.
ClassDojo is, Williamson argues, “prototypical of how education is being reshaped in a ‘platform society.’”
Personalization in a Platform Society
Media scholars José van Dijck and Thomas Poell have argued that “Over the past decade, social media platforms have penetrated deeply into the mechanics of everyday life, affecting people’s informal interactions, as well as institutional structures and professional routines. Far from being neutral platforms for everyone, social media have changed the conditions and rules of social interaction.” In this new social order – “the platform society” – “social, economic and interpersonal traffic is largely channeled by an (overwhelmingly corporate) global online infrastructure that is driven by algorithms and fueled by data.”
We readily recognize Facebook and Twitter as these sorts of platforms; but I’d argue that they’re more pervasive and more insidious, particularly in education. There, platforms include the learning management systems and student information systems, which fundamentally define how teachers and students and administrators interact. They define how we conceive of “learning”. They define what “counts” and what’s important.
They do so, in part, through this promise of “personalization.” Platforms insist that, through data mining and analytics, they offer an improvement over existing practices, existing institutions, existing social and political mechanisms. This has profound implications for public education in a democratic society. More accurately perhaps, the “platform society” offers merely an entrenchment of surveillance capitalism, and education technologies, along with the ideology of “personalization”, work to normalize and rationalize that.
This post first appeared on Hack Education on December 19, 2016. Financial data on the major corporations and investors involved in this and all the trends I cover in this series can be found on funding.hackeducation.com. Icon credits: The Noun Project