USING A SEEDED SAMPLE: Are Homeschoolers Over or Under Represented in Surveys?

Record:  Stacey Bielick, Lina Guzman, Astrid Atienza, and Andrew Rivers, “Using a Seeded Sample to Measure Response among Homeschooling Households,” Survey Practice 2, no. 9 (2009).

Summary:  Bielick and colleagues used a complex and creative sampling strategy to determine whether homeschooling families are likely to be underrepresented or overrepresented in the National Center for Education Statistics’ National Household Education Survey (NHES).

Bielick and colleagues begin by explaining how the NHES works.  A first round of eligibility screening is conducted on households whose phone numbers are randomly selected, to determine whether they are a fit for the education survey (basically, whether school-aged children live in the home).  Homeschooling status is determined at this stage.  At a later date a much more elaborate main interview is conducted.  The NHES has followed this basic strategy every four years since 1999, and its results provide the best and most frequently cited data on basic homeschooling demographics and parental motivation in the United States.  But NHES conclusions about homeschooling are often greeted with skepticism by researchers and others who wonder whether homeschoolers, given their frequent animus against government and invasive research, might be less willing than other groups to participate in the surveys.  If so, the NHES might seriously underestimate the number of homeschoolers in the country, and its conclusions about motivation might not be representative either, since parents with certain motivations might be less willing to be interviewed than homeschoolers with other motivations.  Alternately, homeschoolers, profoundly concerned as they are with their children’s education, might be more likely to accept an interview out of interest in the subject.  Bielick and colleagues wanted to find out whether either of these things is true.

To do that they created a “seeded sample” to compare with the randomized sample from the 2007 NHES.  Though it was harder than they had originally anticipated, the researchers combined three lists of homeschoolers acquired in various ways and randomly selected 2,400 households to contact.  The response rate of these 2,400 households to a request for interviews was then compared to the overall NHES response rate.

On its face the response rate for the seeded sample was significantly better (76% positive) than that for the general survey (62%).  But a few complications make it difficult to compare the groups.  Bielick and colleagues discuss some of these complications (the most important being their difficulty in obtaining good, current mailing lists of homeschoolers) and conclude modestly that, at the very least, “cooperation rates of homeschooling households were not likely lower than that of non-homeschooling households.”
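For readers curious how large a gap like 76% versus 62% is in statistical terms, the difference can be checked with a standard two-proportion z-test. This is an illustrative sketch only: the seeded-sample size (2,400) comes from the review, but the NHES comparison-group size used below is a hypothetical placeholder, since the review does not report it, and the authors themselves caution that the two groups are not directly comparable.

```python
import math

def two_prop_ztest(p1, n1, p2, n2):
    """Two-proportion z-test statistic for a difference in response rates.

    p1, p2: observed proportions; n1, n2: sample sizes.
    Uses the pooled-proportion standard error.
    """
    x1, x2 = p1 * n1, p2 * n2          # implied counts of responders
    p_pool = (x1 + x2) / (n1 + n2)     # pooled response proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Seeded sample: 76% of 2,400 households (from the review).
# NHES group: 62%, with n = 50,000 as a purely hypothetical size.
z = two_prop_ztest(0.76, 2400, 0.62, 50_000)
print(f"z = {z:.1f}")  # well above the 1.96 cutoff for p < .05
```

With samples this large, even a much smaller gap would register as statistically significant, which is why the authors' caveats about comparability (not raw significance) are the crux of their modest conclusion.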

Appraisal:  While not perfect, this study is the only one of its kind ever attempted and likely the best that could be expected given the inherent limitations of studying homeschoolers.  I’m personally comfortable using it as evidence that homeschoolers are not by definition less likely than other Americans to agree to a phone interview about education.  If that’s true, it gives us more confidence that the National Household Education Survey data on homeschooling is pretty accurate.  That’s a nice thing to be able to say.

Milton Gaither, Messiah College, author of Homeschool: An American History.

Disclaimer:  The views expressed in reviews are not the official views of ICHER or of its members.  For more information about ICHER’s Reviews, please see the “About these Reviews” section.

 

This entry was posted in Research Methodology.

1 Response to USING A SEEDED SAMPLE: Are Homeschoolers Over or Under Represented in Surveys?

  1. Isaac D says:

    Thank you for posting this!
    I had not seen this paper before, but I found it very helpful in deciding how heavily I could lean on the 2007 PFI data in particular, and PFI data in general.
    In general it seems to confirm my working hypothesis that the factors biasing survey-based research toward homeschooler overreporting and underreporting tend to cancel each other out in practice.

    I assume that since Ms. Bielick was involved with the decision to conduct the 2012 PFI survey by mail, she and/or her colleagues were able to conduct a similar study on mail response rates for the 2012 survey. I suspect that the change in format will not significantly affect homeschooler nonresponse, but I would be very interested in seeing the results of such a study if one was conducted.
