Study Touting iPad Textbooks Raises More Questions Than It Answers (UPDATED 1/27)

The latest Apple innovation everyone is talking about is interactive textbooks for the iPad, which Apple and others are promising will revolutionize education. So far, the buzz seems to be working: in the first three days after the project was unveiled last week, 350,000 textbooks were downloaded, along with 90,000 copies of the program authors can use to create them. And on Friday, Houghton Mifflin Harcourt, one of the nation’s largest textbook publishers and a key participant in the new initiative, released a pilot study touting the benefits of its HMH Fuse: Algebra 1 iPad textbook. The press release points to a twenty-percentage-point increase in math proficiency among students who used the app and claims these students were “more motivated, attentive, and engaged than traditionally educated peers.” The findings have been picked up by Wired, MarketWatch, and a host of other technology and news sites.

Unfortunately, this seems to be another case of poorly done research being used to promote a product or policy to journalists and consumers who may not have the statistical background to question the evidence they’re being presented with.

Here’s the basic gist of the study: During the 2010-2011 school year, two randomly selected algebra classes at a middle school in Riverside, California, used HMH’s Algebra 1 textbook, loaded on iPads donated by the company, as their main instructional tool. The teachers of these two classes taught the rest of their algebra classes using traditional textbooks, and the students in those classes served as a comparison group. At the end of the year, 78% of the students using HMH’s e-textbook were deemed proficient or advanced on the state standards test, compared with 59% of the students taught by the same teachers using traditional textbooks. This sounds very impressive, but anyone with some statistical know-how who actually reads the study will realize it isn’t. It’s not that the study is bad; pilot studies are often small and simple, which is why they’re called “pilots.” The problem is that HMH, and the media outlets covering the research, are presenting very tentative findings as if they were solid evidence of this product’s effectiveness.

Here are some of the study’s limitations:

  • The sample size is small. The study doesn’t tell us the exact number of students who used the e-textbook, but if it’s only two classes I would guess somewhere between 50 and 60. The strength of the findings depends on how the data look, but usually only very tentative conclusions can be drawn from a sample this size (see the sketch after this list for just how much statistical noise a group that small carries).
  • No statistical analysis is provided. It’s possible that the HMH e-textbook helped boost student proficiency. It’s also possible that the researchers happened to randomly select a group of students who would have done better than their peers without the e-textbook. Simple statistical tests would tell us how likely a difference this large would be if the e-textbook made no difference at all, but HMH doesn’t provide any such analysis. Participating students’ scores from the previous year’s state standards exam would also provide valuable context, but they aren’t included either.
  • The sample doesn’t represent typical California students. Even before the study, students at the middle school where the program was piloted were much more proficient in math (around 25 percentage points more) than students in the rest of the district and state. So the study tells us nothing about how e-textbooks might work in average- or low-performing classrooms.
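To put the sample-size concern in perspective, here’s a minimal sketch, in Python, of the statistical noise attached to a proficiency rate measured on a class-sized group. The counts are my assumptions, not figures from the study: 43 proficient students out of 55 (my guess above) gives roughly the reported 78%.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Assumed counts, not study data: ~55 e-textbook students (my guess above),
# of whom 43 are proficient, which matches the reported 78% (43/55 ≈ 0.78).
p, lo, hi = proportion_ci(successes=43, n=55)
print(f"observed {p:.0%}, 95% CI roughly {lo:.0%} to {hi:.0%}")
```

Even before comparing groups, the 95% confidence interval on that single percentage runs from roughly 67% to 89%: more than twenty points of uncertainty from sampling alone, which is exactly why small pilots can only support tentative conclusions.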

And the biggest red flag that makes me suspicious of this “research”:

  • Previous news reports indicated that the program was being tested with over 400 students at middle schools in Riverside, Fresno, Long Beach, and San Francisco. Yet the materials just released by HMH don’t mention any school other than the one in Riverside, nor do they discuss the 350 or so other students who participated in the pilot. Maybe the researchers haven’t analyzed that data yet, but it’s also possible that they didn’t get the findings HMH wanted, and HMH decided to report only on the study site where things did work.

For the record, I think interactive computer and tablet-based textbooks sound really cool and have a lot of potential to improve learning. There is already some evidence that computer technologies can have a small positive impact on mathematics performance. My problem is with the way HMH is presenting this study (to their credit, I haven’t found any mention of the study on Apple’s website). HMH claims that the “results demonstrated that HMH Fuse is an effective means of improving students’ Algebra achievement.” School administrators and politicians might be excited to hear that they can increase math scores significantly simply by switching to iPad textbooks. But a lot more research is needed before we can determine whether e-textbooks really work, and trying to bypass this process by promoting very limited findings as solid evidence rubs me the wrong way.

I emailed HMH to ask about some of the issues I outlined above. I’ll update this post if I get a response from them.

UPDATE 1/27/12 @ 2:15pm

Since I wrote this piece on Monday, I’ve seen the study covered in a few other places, without any criticism of the research methods or (lack of) statistical analysis. Even The Economist mentioned it, which breaks my heart a little because I think of them as having some of the highest standards in journalism.

Meanwhile, a Houghton Mifflin Harcourt representative responded to my questions without giving me any real answers. She did tell me that 72 students used the iPad textbook and that the 252 students in the comparison group used traditional textbooks. HMH claims the statistical analysis was completed by the school district (which is odd, because the company’s website says it hired a third-party research firm to conduct the study) and said I would have to contact the district for the details of the analysis. One would hope that a company promoting research findings about its products would at least look at the statistical analysis behind them, not to mention make that analysis available to people who want to understand how the findings were reached.
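For what it’s worth, the most basic version of the missing analysis takes only a few lines to run yourself. Here’s a sketch of a two-proportion z-test in Python, assuming the reported rates correspond to 56 of 72 iPad students and 149 of 252 comparison students; those counts are my rounding of the reported 78% and 59%, since HMH hasn’t published the raw numbers.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns z and a two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided, normal approximation
    return z, p_value

# Assumed counts (my rounding of the reported 78% and 59%; HMH has not
# published the raw numbers): 56 of 72 iPad students proficient vs. 149 of 252.
z, p = two_proportion_ztest(56, 72, 149, 252)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
# Caveat: this treats all 324 students as independent draws. The iPad group is
# just two classrooms sharing teachers and peers, so classroom-level clustering
# can shrink the effective sample size and make a naive p-value look better
# than it should.
```

On those assumed counts the naive test comes out nominally significant (z of about 2.9, p under 0.01), but as the comments note, it treats 324 students as independent when the treatment group is just two classrooms taught by the same teachers, so the number shouldn’t be taken at face value. That’s precisely why the details of the district’s analysis matter.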

As for the other three study sites that weren’t mentioned anywhere in the report, the HMH rep says they were dropped from the study because of implementation issues, “including differences in the way students were allowed to access the iPads (some were only allowed to use them in school, others brought them home, for instance), lack of administration of final testing, differing levels of teacher experience and other variables.” If you drop three-quarters of your study sites because they can’t implement the program the way you think it should be implemented, even with the extensive guidance that would be provided during a pilot study, that suggests few real-world schools will be able to duplicate the results. Of course, it’s also possible that HMH dropped the sites because they weren’t seeing the results they wanted and knew the final data wouldn’t be in their favor.

I’ve followed up asking for more detail on the statistical analysis and research methods, and will update again if I hear anything.

Image Credit: Apple
