It started last night: links showing up on Twitter and elsewhere to articles about how Facebook users do worse in school. From there it’s not hard for people to jump quickly to the conclusion that Facebook use results in worse grades (e.g., Study: Facebook Hurts Grades). Unfortunately, I know of no data set that could help us answer that question. The few people who have relevant data sets could establish correlation at best. I myself have not found such a connection in my data, but let’s back up a bit.
Reading the press coverage about this recent study by a researcher at Ohio State and one at Ohio Dominican University, it’s difficult to get enough information to offer a careful critique. All we’re told is that the findings concern “219 U.S. undergraduates and graduates”, with no indication of how they were sampled or how the survey was administered. Additionally, the articles give no detail as to how either Facebook use or grades were measured. Is this good and responsible reporting? Hardly.
Doing a search on the AERA’s annual meeting Web site for study author Aryn Karpinski brings up the abstract of the paper “A Description of Facebook Use and Academic Performance Among Undergraduate and Graduate Students”. It’s reasonable to assume that this is the study upon which the press coverage is based, as the articles mention AERA. The abstract for a poster to be presented this Thursday reveals a bit more information about the study than the press coverage: a survey was administered to 71 undergraduate and 43 graduate students. It’s not clear how those 114 respondents add up to the 219 reported in the press coverage. Perhaps this is the wrong abstract, but I don’t see anything else that would fit the description better. Perhaps the study has been updated since the abstract was initially submitted. Either way, this doesn’t help with transparency about the project.
The abstract suggests that the study compares the GPA of users vs. non-users without regard to amount of time spent online. Comments by Karpinski in the press coverage, however, suggest measures of amount of time spent on the site: “Our study shows people who spend more time on Facebook spend less time studying.” Of course, it wouldn’t be the first time a researcher is misquoted in the press, so it’s not clear whether she really said this (or perhaps the abstract doesn’t include everything covered in the paper). Alternatively, “more time” here may simply mean “any time at all”; that’s not exactly how I’d describe having “any use” data, but technically any use is more than no use. The point being, we’re no closer to understanding the study’s scope or the extent to which we should put much faith in its findings.
Having done related work, I didn’t recall any such relationship between Facebook use and grades, so I went back to my data set this morning to check. Indeed, based on data about 1,060 first-year students at the University of Illinois, Chicago collected via a paper-and-pencil survey in Winter 2007 (data set described in detail here), I find no relationship between whether someone uses Facebook and self-reported GPA (collected in categories, not as a specific grade-point average). I also have data on the number of times the respondent used a social networking site the day before taking the survey, and there is no correlation between that measure and grades either.
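To illustrate the general shape of such a check (this is not the post’s actual analysis or data; all counts below are invented), one can test for an association between a binary use/non-use variable and categorical GPA with a chi-square test of independence:

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = Facebook user / non-user,
# columns = self-reported GPA category (e.g., A, B, C-or-below).
# The counts are made up so that the two rows have identical
# proportions, i.e., no association between use and grades.
table = [
    [200, 300, 100],  # Facebook users
    [ 40,  60,  20],  # non-users
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
```

With proportional rows like these, the statistic is essentially zero and the test (rightly) finds no relationship; with real survey data the interesting question is whether a large p-value survives once covariates are considered.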
It is also worth noting that an important finding of my study was that Facebook use is not randomly distributed among participants (e.g., parental education, race, and ethnicity predicted adoption), so it’s helpful to look at the relationship between Facebook use and factors such as grades (or whatever else) while controlling for other variables.
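As a sketch of why controls matter (with entirely synthetic data; none of these numbers come from the study or the data set described above), suppose both adoption and grades track parental education. A naive comparison then shows a spurious Facebook “effect” that vanishes once the confound is held constant:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000

# Synthetic confound: adoption depends on parental education,
# mimicking the non-random adoption pattern described above.
parent_ed = rng.integers(0, 2, n)                       # 1 = parent has a degree
fb_use = (rng.random(n) < 0.5 + 0.3 * parent_ed).astype(int)
# Grades are driven by background, not by Facebook use at all.
gpa = 2.5 + 0.6 * parent_ed + rng.normal(0, 0.5, n)

df = pd.DataFrame({"gpa": gpa, "fb_use": fb_use, "parent_ed": parent_ed})

naive = smf.ols("gpa ~ fb_use", data=df).fit()
controlled = smf.ols("gpa ~ fb_use + parent_ed", data=df).fit()
print(naive.params["fb_use"], controlled.params["fb_use"])
```

In this simulation the naive coefficient on `fb_use` is positive even though use has no effect by construction; adding the control shrinks it toward zero.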
There are lots of reasons why one may or may not find a relationship between Facebook use and grades. I won’t get into them here; that could make for a very long essay. The point of this post is mainly to suggest a careful approach to what we see in the press and at conferences.
A caveat: I woke up this morning with a million immediate things to do and happy that I’d finally get to do them. Then I realized this story had kept spreading since last night and some people asked me to blog about it. I may have missed some relevant resources in my search for background material and others may show up after I post this. Feel free to post updates below with relevant information.
UPDATE (4/16/09): I’m working on a longer response piece with Josh Pasek (Stanford) and Eian More (Penn). Will update this post and probably put up a new entry when that’s out.
{ 13 comments }
Ankit Guglani 04.13.09 at 3:58 pm
Thanks for the post. None of the versions of the media articles about this study I’ve come across so far go into detail on survey method, sampling method, or anything else. CNet also called a “technically incorrect” on it: http://news.cnet.com/8301-17852_3-10217704-71.html.
– Ankit.G
Paul Gowder 04.13.09 at 4:44 pm
Is there a relationship between facebook use and dissertation completion/quality?
Oh god I hope not I hope not Ihopenot hopenothopenothopenot…
Anomie 04.13.09 at 5:09 pm
If they looked at grad students and undergrads, I can maybe see why they’d get a correlation. Grad students tend to get better grades just as a result of the way the system is set up (I mean, a C is failing for us). Grad students also are probably still less likely to be on Facebook or, at the least, use it less. But that’s more of a spurious effect based on cohort, not a causal one based on study habits.
And aren’t like 90% of all undergrads on Facebook now? Isn’t it hard to make relevant distinctions about an action’s effects when damn near everyone is engaging in said action?
Fred 04.13.09 at 9:02 pm
Following Anomie, I also wonder about how this study was powered. In my research, using a list-based SRS at UNC, I found 95% adoption among undergraduates (+/-3). I’m also working on a meta-analysis and have found similar adoption rates at other universities. I can think of no reason to suspect that OSU or ODU differ. If the sample comprises 200 undergraduates, you’re talking about 10 non-users.
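The arithmetic behind that last point can be made explicit (using hypothetical numbers matching the comment, not the study’s actual sample):

```python
import math

n = 200            # hypothetical sample of undergraduates
adoption = 0.95    # adoption rate reported for UNC above

# Expected size of the non-user cell.
non_users = round(n * (1 - adoption))

# Wald 95% margin of error for estimating the adoption proportion.
margin = 1.96 * math.sqrt(adoption * (1 - adoption) / n)

print(f"expected non-users: {non_users}")
print(f"95% margin of error: +/- {margin:.3f}")
```

At n = 200 the margin of error on the proportion is about three percentage points, but the comparison cell holds only around ten people, which is what strains any users-vs.-non-users test.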
In an analysis of variance, you can get significance between unbalanced groups if the variance is small. For example, if all ten of those non-users report an A GPA, and the 190 users have a more normally distributed B GPA, an ANOVA will find significance. This problem is compounded when the researcher asks for GPA data as interval (I’m not sure if that was the case here). For this reason, ANOVA (and other tests of difference) generally assume homogeneity of variances.
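The scenario above can be simulated with entirely made-up data: ten non-users who all report a 4.0 GPA against 190 users with grades scattered around a B. A one-way ANOVA flags the difference as highly significant, even though the tiny zero-variance cell violates the homogeneity-of-variances assumption:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Ten non-users who all report a 4.0 GPA (zero within-group variance)...
non_users = np.full(10, 4.0)
# ...versus 190 users whose GPAs are roughly normal around a B average,
# clipped to the valid 0.0-4.0 range.
users = np.clip(rng.normal(loc=3.0, scale=0.5, size=190), 0.0, 4.0)

f_stat, p_value = f_oneway(non_users, users)
print(f"F = {f_stat:.2f}, p = {p_value:.2g}")
```

The p-value comes out tiny, illustrating how badly unbalanced cells with unequal variances can produce a headline-ready "significant" difference.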
The tremendous uptake of Facebook on most college campuses hampers interesting research. In our study we looked at privacy behaviors by length of adoption; the number of subjects in the <1yr cell was so small that it caused some analytic issues.
Eszter Hargittai 04.13.09 at 9:42 pm
Fred, Facebook use in my study was 78.8%. We’re in the midst of entering data for another round this year; based on 650 cases entered so far, it looks like it’s going to be below 90%. Unlike most studies, my data collection is done on paper surveys, so I may be getting more of the low-level Internet users, who are much more likely than others not to be Facebook users. (I don’t know how the studies you’re looking at collected their data, but my impression is that data collection is often done online, which is problematic when online engagement is one of the variables of interest.) Just a thought as you work on your piece.
Fred 04.13.09 at 11:36 pm
Eszter – yes, most of these high-engagement studies use online surveys, so coverage is certainly an issue. Reading my comment, I realize I didn’t mean to emphasize the particular percentage, but that the non-user cell is typically small. For the record, I don’t think that 95% adoption is typical, but rather that high adoption is typical. Does that make sense…looking forward to your data!
HolfordWatch 04.14.09 at 10:13 am
Just to add to the moral alarm about Twitter, the Daily Mail reports that an upcoming PNAS publication reveals that Twitter can make users immoral.
Declining grades, declining moral sensation, next thing it will be announced that it rots teeth (too busy tweeting to floss).
Sandeep Ray 04.14.09 at 9:52 pm
I should really be writing my paper instead of reading this which was a link off of Facebook.
Eszter Hargittai 04.15.09 at 11:38 am
Sandeep, that may be the case, but you may end up writing a better paper by having considered the details of this press phenomenon, uhm, not that I want to draw a direct link between your paper and this issue. :-)
Nicole Ellison 04.17.09 at 12:30 am
E, We also found no relationship between self-reported GPA and FB use in our 2006 ICA paper, which later became the JCMC article. We dropped it because it didn’t seem that interesting. Little did we know! I’ve also surveyed students about other educational uses of Facebook and have found that many use FB to set up study groups, talk about course material, and arrange project meetings. All the focus on this one conference paper seems a bit crazy to me!
Thanks for this thoughtful piece.
Barry Wellman 04.17.09 at 12:51 am
I also wonder if the small minority who don’t use Facebook are equivalent in other characteristics to those who do use it. I’d suspect they are atypical in other ways.
Kevin R. Guidry 04.17.09 at 2:56 am
I stopped by this poster session today and, for what it’s worth, Aryn seemed aghast at the press coverage and how this is being (mis)spun (her first words to me after I introduced myself were to timidly ask, “Are you going to tear [my research] apart, too?” My heart went out to her!). From what I saw in her poster and what I heard in our brief discussion, she wasn’t making any causal links or any really strong conclusions whatsoever, given the limitations of this small study. This seems to be a tempest in a teapot stirred vigorously by our friends in the media. Of all the interesting, exciting, and important research that could (and should) have been (mis)reported at AERA, it’s sad that this seems to have been one of the bigger stories.
But I’m making lemonade from this lemon by using it as another example of how the media report on and write about research. It’s a very timely example, as this is exactly what I’m discussing with my students right now. And having all of this expert commentary makes it an incredibly rich example!
jkd 04.17.09 at 5:19 pm
All the focus on this one conference paper seems a bit crazy to me!
Well yes, but this result does (purportedly) confirm the media’s favorite storyline around SNS (i.e., Facebook is teh suxxor!!!1!).
And as Fred points out, our research on FB at UNC showed close-enough-to-universal adoption rates such that non-adopters are, as Barry notes, probably atypical in other ways. I really would like to see the full numbers on this one, though.
Comments on this entry are closed.