The Today newspaper ran an exclusive story on 28 March 2011 about a survey of “heartland” voters it had commissioned. Voter surveys are relatively rare in Singapore’s politics — the reasons for which can be debated separately — and for this reason alone, this one deserves attention.
Naturally, the first thing anyone does before trying to understand a survey’s findings is to look at its design parameters, as these may skew the results. It is also important to understand what exactly the survey was trying to achieve. In Part 1, I will share with you my thoughts on the overall shape of the survey; in Part 2, I will discuss its specific findings.
The four articles within that edition of Today did not state explicitly in any single sentence the objective of the study. However, the biggest headline within was “What are the hot-button issues?” and the opening paragraph under that said:
As the drumbeats of the general election get louder by the day, a survey of 618 voters commissioned by Today has found that the rising cost of living is more likely to influence how the voters cast their ballot than supposedly hot-button issues such as influx of foreigners or housing.
Immediately, questions rained down in my brain.
1. By its use of the word “supposedly”, the opening paragraph cast doubt on whether “influx of foreigners” and housing are hot-button issues at all.
2. That left only “cost of living” as the single “hot-button issue”, which in turn contradicted the plural in the headline: “What are the hot-button issues?”
3. The survey was not open-ended. It didn’t seem ever to have asked respondents to name issues. It merely asked them for their views on three pre-selected issues. Effectively then, the survey only found that cost of living was “hotter” than the other two. It didn’t ask if there might be other concerns on voters’ minds.
This is not to say that cost of living isn’t an important issue. It always is and quite plausibly would be the topmost issue even if survey participants had 20 – 30 other issues to choose from. Surveys from other countries too indicate that cost of living is an important issue everywhere. However, we must be careful about saying it is THE top issue when the survey was not designed to discover that.
Further down in the story it said that the 618 were “statistically representative of HDB dwellers”. HDB stands for Housing and Development Board and is commonly used to denote public housing, which over 80 percent of Singaporeans live in. Nonetheless, a significant minority of Singaporeans live in private housing and why they were excluded from the survey was not explained.
If a survey is meant to fathom voter concerns, then it ought to be representative of all voters.
The survey method was also described in the news story as one done “via telephone”. There are different ways of conducting surveys by telephone, and the method should have been described in greater detail. Traditionally, pollsters call numbers randomly selected from a phone book, but nowadays this produces skewed results, because a significant share of households (mine included) no longer have a landline. People who choose to do away with a landline may differ markedly in demographic profile, and thus in opinions, from those who keep one.
Or did the survey randomly dial mobile numbers? This too can give distorted results: some people carry more than one mobile number, others share one, and willingness to answer an unknown caller varies across demographic groups.
Since the news story did not describe the method, it is hard to estimate what effect the choice of method might have had on the results.
It was reported that the findings were re-weighted by gender, age, race and HDB housing type. This goes some way to correct for skewing. In the newspaper’s writeup, there was a hint that the survey also collected data about the personal income of respondents, but oddly, no statement was made that the overall results were re-weighted for this.
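To make concrete what re-weighting does, here is a minimal post-stratification sketch in Python. Every number below is invented purely for illustration — the survey’s actual strata and shares were not published — and only one demographic variable (gender) is used, whereas the survey reportedly weighted by four.

```python
# Toy post-stratification example. Each respondent is weighted by
# (population share / sample share) of their demographic cell, so that
# over- or under-sampled groups are corrected back to census proportions.
# All figures are made up for illustration.
population_share = {"male": 0.49, "female": 0.51}   # assumed census shares
sample_share     = {"male": 0.60, "female": 0.40}   # assumed raw sample shares

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Suppose 60% of sampled men and 40% of sampled women answered "Yes":
yes_rate = {"male": 0.60, "female": 0.40}

# Weighted overall "Yes" share: each cell's sample share, corrected by its
# weight, multiplied by that cell's response rate.
weighted_yes = sum(sample_share[g] * weights[g] * yes_rate[g] for g in yes_rate)
print(f"Re-weighted 'Yes' share: {weighted_yes:.1%}")  # → 49.8%
```

The same arithmetic applies to any variable the pollster re-weights by, which is why the omission of income from the stated weighting variables matters: if income was collected but not weighted for, an income skew in the sample would pass straight through to the headline figures.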
Yet another problem I had in trying to understand the findings was the way all findings were presented as simple Yes/No binaries. Weren’t there “Don’t knows”? How many were there? Moreover, did all 618 answer all questions?
Today published some results in tables, giving Yes/No answers as percentages to one decimal place. Nowhere did it state the margin of error.
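For reference, if the 618 respondents were treated as a simple random sample — a generous assumption, given the unexplained telephone method — the worst-case margin of error is easy to estimate:

```python
import math

# Worst-case margin of error for a simple random sample of n = 618 at the
# 95% confidence level (z ≈ 1.96). A reported proportion of 50% maximises
# the variance term p(1 - p), so this is the largest the error gets.
n = 618
z = 1.96
p = 0.5  # worst-case proportion

moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: ±{moe * 100:.1f} percentage points")  # → ±3.9
```

In other words, differences of a few percentage points between answers may fall entirely within sampling noise — which is precisely why reporting results to one decimal place, without stating the margin of error, gives a false impression of precision.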
* * * * *
The survey questions seemed to have been structured in four sections. Section 1 was about housing costs, Section 2 about the influx of foreigners, and Section 3 about the cost of living. Each contained a series of questions of the form “Are you concerned that…?”, followed by a final question about whether the government’s corrective measures were considered adequate. Sections 1 to 3 therefore measured respondents’ views about various facets of these three issues, and about the government’s response.
Section 4 appeared to consist of three questions:
Putting aside twiddly questions like (a) what the margin of error was, (b) how many “don’t knows” there were, and (c) how many respondents even answered each question, I had some difficulty anticipating how respondents might react to the wording of the questions. What does “influence how you vote” mean?
Would the average participant take that to mean whether this issue would cause him to vote against the ruling People’s Action Party (PAP)? Or would he take it to mean that this issue would be decisive in how he decides on his vote? They may sound similar, but they are quite different things.
For example, someone very concerned about housing costs, foreigner influx and the cost of living, but who has long been determined to vote for an opposition party, might just answer “No” to all three questions. No, these issues won’t influence how he is going to vote, because he decided long ago that he would vote for an opposition party (e.g. because he just can’t stand the PAP’s arrogance, or he was once wrongly accused of robbery by a bungling police investigation).
This in fact points to the most curious thing about this survey: why didn’t it ask voters which party they intended to vote for? Why not ask a direct question? Why ask such a roundabout question about “influence”?
Only with this question added can we really make sense of all the other questions about housing costs, etc, because if the avowed intention of the survey is to discover which issues impact ballot choices, there is first a need to determine which voters are swing voters. For example, the opening paragraph dismissed housing costs as less important than cost of living generally, but perhaps for swing voters they are more important?
Here again, the exclusion of other issues diminishes the utility of this survey. Perhaps for swing voters, civil liberties or the institutionalisation of the income gap are the burning issues?
Part 2 will discuss in more detail the findings of the survey.