Hillary Clinton leads Donald Trump by 17 percentage points in the race for California’s 55 electoral votes, down from 24 points in July, while Kamala Harris has expanded her lead over Loretta Sanchez in the U.S. Senate race to 22 percentage points, up from 15 points in the same time span, according to new survey data from the Field Poll and the Institute of Governmental Studies at UC Berkeley.
Nothing too surprising there: Clinton 50-33%, with 17% for others or undecided; Harris 42-20% with 26% undecided and 12%, mostly Republicans, not voting since no GOP contender made it into the finals.
Is it real or Memorex? What’s horrifying shocking stunning surprising is that the Field Poll, widely regarded as one of the most accurate survey firms in the country, suddenly switched up its methodology, abandoning calls to known registered voters with a history of voting in favor of an internet panel provided by YouGov.
For those who care about polling – including Calbuzz – this is shocking. Not that we have anything but the highest regard for Mark DiCamillo at the Field Poll and Professor Jack Citrin at UC Berkeley’s IGS. But even with YouGov, a creation of Professor Doug Rivers at Stanford, providing the panel, and with DiCamillo weighting the data to approximate California’s voting population, this is like Ghirardelli suddenly announcing it has switched from sugar to Splenda.
The overall results may turn out to be OK in terms of who wins and who loses California. That’s easy. But explaining how or why or what the state’s voters tell us about what is happening nationally is far less satisfying. For example, how do you explain that Latinos in California favored Clinton 71-9% over Trump in July but now it’s 61-21%? Do we really believe Clinton lost 10 points and Trump picked up 12 points among Latinos? Or that blacks went from 80-5% for Clinton in July to 77-13% in September?
No, we do not.
But the Field/IGS/YouGov poll says so.
Switch hitter. DiCamillo – who for years has argued that no online panel can actually stand in for a random sample of actual voters – says he was persuaded to give it a try because there are so many ballot propositions in California and they are so complex that a pollster who tries to read the ballot summaries to telephone respondents can’t keep them focused and engaged.
He readily acknowledges there are weaknesses to the methodology – and he had to weight the final data for Spanish speakers, older voters, non-party-preference voters and others in order to match known frequencies. But since 90% or more of California registered voters have internet access, he thought it was worth a try to see if he could get more accurate measures on the ballot props.
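The weighting DiCamillo describes is standard post-stratification: each respondent counts for more or less depending on how over- or under-represented their group is relative to known population shares. A minimal sketch of the idea – the cell names and every number below are hypothetical illustrations, not Field Poll figures:

```python
# Post-stratification sketch: reweight an opt-in sample so subgroup shares
# match known population targets. All categories and numbers are hypothetical.

from collections import Counter

# Hypothetical respondents, each tagged with one stratification cell.
respondents = (
    ["spanish_speaker"] * 50 + ["age_65_plus"] * 300 +
    ["no_party_preference"] * 150 + ["other"] * 500
)

# Hypothetical known shares of the registered-voter population.
targets = {
    "spanish_speaker": 0.10,
    "age_65_plus": 0.25,
    "no_party_preference": 0.20,
    "other": 0.45,
}

n = len(respondents)
sample_share = {cell: count / n for cell, count in Counter(respondents).items()}

# Each respondent's weight is (target share / sample share) for their cell:
# under-represented groups count for more, over-represented ones for less.
weights = {cell: targets[cell] / sample_share[cell] for cell in targets}

for cell, w in sorted(weights.items()):
    print(f"{cell}: weight {w:.2f}")
```

Here Spanish speakers, at 5% of the sample but 10% of the target population, each get weight 2.0, while the over-represented "other" cell is down-weighted to 0.9. The catch, as the Pew findings below suggest, is that weighting can only fix imbalances on the variables you weight on.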
And hey, lots of respected pollsters are wrestling with declining response rates for telephone surveys and struggling to recreate the accuracy of probability sampling with cheaper internet surveys.
As Pew researchers reported after evaluating nine nonprobability online samples: “We found that not all online surveys perform equally well. A key reason seems to be the different methods employed by vendors. One of the nine nonprobability samples clearly performed better than the others, and it seems to be related to the fact that they use a more sophisticated set of statistical adjustments, both in selecting their sample and weighting their results. [This, we understand, was YouGov.] Our conclusions about why that sample performed the best are preliminary, though, because we have just one survey from that vendor and the relevant design features were not experimentally tested within that survey.

“One of the other major findings was that, in general, subgroup estimates from these samples for blacks and Hispanics were very inaccurate. Almost all the samples were off by double digits – more than 10 percentage points – on average in their estimates for blacks and Hispanics. That is quite concerning for researchers like us who study the experiences of different groups within the U.S. population.”
Quick PEW: there may be hope for online surveys, but the jury’s still out.
Chicken hawks come home to roost. Even with all our reservations about the Field/IGS polling methodology, it makes sense that, overall, Clinton stayed at 50% and Trump rose to 33%, up from 26% in July. Why? Because Republican voters – apparently – are accepting that they have no other real choice.
Back in July, Clinton was pulling 81% of Democrats in a three-way race and Trump was winning just 64% of Republicans. In the new survey, Clinton has 85% of Democrats and Trump has 84% of Republicans. (Of course, because the sample is not of actual registered voters, the pollsters are relying on respondents themselves to say how they’re registered to vote – not always a reliable factoid, especially without sophisticated questions to weed out those who aren’t really registered or who don’t know how they’re registered).
Here’s how the pollsters describe their methodology: The findings in this report come from a survey of California voters conducted jointly by The Field Poll and the Institute of Governmental Studies at the University of California, Berkeley. The survey was completed online by YouGov September 7-13, 2016 in English and Spanish among 1,800 registered voters in California, including 1,426 considered likely to vote in the November 2016 general election.
In order to cover a broad range of issues and still minimize possible respondent fatigue, some of the questions included in this report are based on a random subsample of voters statewide. YouGov administered the survey among a sample of the California registered voters who were included as part of its online panel of over 1.5 million U.S. residents.
Eligible panel members were asked to participate in the poll through an invitation email containing a link to the survey. YouGov selected voters using a proprietary sampling technology frame that establishes interlocking targets, so that the characteristics of the voters selected approximate the demographic and regional profile of the overall California registered voter population. To help ensure diversity among poll respondents, YouGov recruits its panelists using a variety of methods, including web-based advertising and email campaigns, partner-sponsored solicitations, and telephone-to-web recruitment or mail-to-web recruitment. Difficult-to-reach populations are supplemented through more specialized recruitment efforts, including telephone and mail surveys.
The Field Poll and the Institute of Governmental Studies were jointly responsible for developing all questions included in the survey. After survey administration, YouGov forwarded its data file to The Field Poll for processing. The Field Poll then took the lead in developing and applying post-stratification weights to more precisely align the sample to Field Poll estimates of the demographic characteristics of the California registered voter population both overall and by region. The Field Poll was also responsible for determining which voters in the survey were considered most likely to vote in this year’s election.
Polls conducted online using an opt-in panel do not easily lend themselves to the calculation of sampling error estimates as are traditionally reported for random sample telephone surveys.
Bottom line. That last sentence is important. It says, in essence: because this was not a probability sample, we have no way of actually telling you what the margin of error is.