On Mediacorp’s heartland voter survey, part 1

On 28 March 2011, the Today newspaper ran an exclusive story on a survey of “heartland” voters it had commissioned. Voter surveys are relatively rare in Singapore’s politics (the reasons can be debated separately), and for this reason alone, this one deserves attention.

Naturally, the first thing anyone does before trying to understand a survey’s findings is to look at its design parameters, as these may skew the results. It is also important to understand what exactly the survey was trying to achieve. In Part 1, I will share with you my thoughts on the overall shape of the survey; in Part 2, I will discuss its specific findings.

The four articles in that edition of Today did not explicitly state, in any single sentence, the objective of the study. However, the biggest headline among them was “What are the hot-button issues?” and the opening paragraph under it said:

As the drumbeats of the general election get louder by the day, a survey of 618 voters commissioned by Today has found that the rising cost of living is more likely to influence how the voters cast their ballot than supposedly hot-button issues such as influx of foreigners or housing.

Immediately, questions rained down in my brain.

1. The opening paragraph dismissed “influx of foreigners” and housing as hot-button issues by its use of the word “supposedly”.

2. That left only “cost of living” as the single “hot-button issue”, which in turn contradicted the plural in the headline: “What are the hot-button issues?”

3. The survey was not open-ended. It didn’t seem ever to have asked respondents to name issues. It merely asked them for their views on three pre-selected issues. Effectively then, the survey only found that cost of living was “hotter” than the other two. It didn’t ask if there might be other concerns on voters’ minds.

This is not to say that cost of living isn’t an important issue. It always is, and quite plausibly would be the topmost issue even if survey participants had 20 to 30 other issues to choose from. Surveys from other countries, too, indicate that cost of living is an important issue everywhere. However, we must be careful about calling it THE top issue when the survey was not designed to discover that.

You will also have noticed that the opening paragraph mentioned “618 voters”. A closer reading of the news report qualifies this severely, and raises more questions.

Further down in the story it said that the 618 were “statistically representative of HDB dwellers”. HDB stands for Housing and Development Board and is commonly used to denote public housing, in which over 80 percent of Singaporeans live. Nonetheless, a significant minority of Singaporeans live in private housing, and why they were excluded from the survey was not explained.

If a survey is meant to fathom voter concerns, then it ought to be representative of all voters.

The survey method was also described in the news story as one done “via telephone”. There are different ways of conducting surveys by telephone, and the method should have been described in greater detail. Traditionally, pollsters call numbers randomly selected from a phone book, but nowadays this produces skewed results, because a significant portion of households (mine included) no longer have a landline. People who do away with a landline may differ distinctly from those who keep one in demographic profile, and thus in opinions.

Or did the survey randomly dial mobile numbers? Yet this too can give distorted results.

Since the news story did not describe the method, it is hard to estimate what effect the choice of method might have had on the results.

It was reported that the findings were re-weighted by gender, age, race and HDB housing type. This goes some way to correct for skewing. In the newspaper’s write-up, there was a hint that the survey also collected data about respondents’ personal incomes, but oddly, no statement was made that the overall results were re-weighted for this.
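To illustrate what such re-weighting typically involves, here is a minimal sketch of post-stratification weighting. The population and sample shares below are invented for the example; only the technique, giving each respondent a weight equal to his or her group’s population share divided by its sample share, is the standard one. Income, had it been collected, could have been handled the same way.

```python
# Minimal sketch of post-stratification re-weighting (all figures invented).
# Each respondent is weighted by population share / sample share for his or
# her demographic cell, so over-represented groups count for less.

population_share = {"male": 0.49, "female": 0.51}  # assumed census shares
sample_share = {"male": 0.55, "female": 0.45}      # assumed survey skew

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# (group, answered_yes) pairs for four hypothetical respondents:
responses = [("male", 1), ("female", 0), ("female", 1), ("male", 1)]

weighted_yes = sum(weights[g] * r for g, r in responses)
total_weight = sum(weights[g] for g, _ in responses)
print(f"re-weighted 'yes' share: {weighted_yes / total_weight:.1%}")
```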

Yet another problem I had in trying to understand the findings was the way they were all presented as simple Yes/No binaries. Weren’t there “don’t knows”? How many were there? Moreover, did all 618 respondents answer every question?

Today published some results in tables, giving Yes/No answers as percentages to one decimal place. Nowhere did it state the margin of error.
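For what it is worth, a ballpark figure can be computed from the sample size alone. This is a sketch under textbook assumptions, namely a simple random sample and the worst-case proportion of 50%; the re-weighting described above would modify the effective figure somewhat:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample.

    n: sample size; p: assumed true proportion (0.5 maximises the margin);
    z: z-score for the confidence level (1.96 for 95%, 2.576 for 99%).
    """
    return z * math.sqrt(p * (1 - p) / n)

n = 618  # the sample size Today reported
print(f"95% confidence: +/-{margin_of_error(n):.1%}")           # about 3.9%
print(f"99% confidence: +/-{margin_of_error(n, z=2.576):.1%}")  # about 5.2%
```

That works out to roughly plus or minus four percentage points at 95% confidence, and about five at 99%; results quoted to one decimal place convey rather more precision than the sample can support.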

* * * * *

The survey questions seem to have been structured in four sections. Section 1 was about housing costs, section 2 about the influx of foreigners, and section 3 about the cost of living. In each was a series of questions of the form “Are you concerned that…?”, followed by a final question about whether the government’s corrective measures were considered adequate. Sections 1 to 3 therefore measured respondents’ views about various facets of these three issues, and about the governmental response.

Section 4 appears to have consisted of three questions, seemingly asking in turn whether housing costs, the influx of foreigners and the cost of living would “influence how you vote”.

Putting aside twiddly questions (What was the margin of error? How many “don’t knows” were there? How many respondents even answered each question?), I had some difficulty anticipating how respondents might react to the wording of the questions. What does “influence how you vote” mean?

Would the average participant take that to mean whether this issue would cause him to vote against the ruling People’s Action Party (PAP)? Or would he take it to mean that this issue would be decisive in how he decides on his vote? They may sound similar, but they are quite different things.

For example, someone very concerned about housing costs, foreigner influx and the cost of living, but who is in any case determined to vote for an opposition party, might just answer “No” to all three questions. No, they won’t influence how he is going to vote, because he long ago decided he would vote for an opposition party (e.g. because he just can’t stand the PAP’s arrogance, or because he was once wrongly accused of robbery by a bungling police investigation).

This in fact points to the most curious thing about this survey: why didn’t it ask voters which party they intended to vote for? Why not ask a direct question? Why ask such a roundabout question about “influence”?

A voting intention question is perhaps best given as a five-option question, say one ranging from a firm intention to vote for the ruling party, through undecided, to a firm intention to vote for the opposition.

Only with this question added can we really make sense of all the other questions about housing costs, etc., because if the avowed intention of the survey is to discover which issues impact ballot choices, there is first a need to determine which voters are swing voters. For example, the opening paragraph dismissed housing costs as less important than cost of living generally, but perhaps for swing voters they are more important?
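To see why the intention question unlocks the rest, here is a minimal sketch with invented data: once respondents can be split by declared intention, concerns can be tabulated for the undecided separately, and they may look quite different from the overall figures.

```python
# Sketch: why a voting-intention question matters (all data invented).
# Each respondent: (declared intention, concerned about housing costs?)
respondents = [
    ("firm_pap", False), ("firm_pap", False), ("firm_opposition", True),
    ("undecided", True), ("undecided", True), ("undecided", True),
]

def share_concerned(group):
    """Share of a given intention group concerned about housing."""
    subset = [concerned for g, concerned in respondents if g == group]
    return sum(subset) / len(subset)

overall = sum(c for _, c in respondents) / len(respondents)
print(f"all voters: {overall:.0%} concerned about housing")             # 67%
print(f"undecided only: {share_concerned('undecided'):.0%} concerned")  # 100%
```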

Here again, the exclusion of other issues diminishes the utility of this survey. Perhaps for swing voters, civil liberties or the institutionalisation of the income gap are burning issues?

Part 2 will discuss in more detail the findings of the survey.

6 Responses to “On Mediacorp’s heartland voter survey, part 1”


  1. Gard 29 March 2011 at 20:37

    1) I am a little bewildered by how ‘cost of living’ got into the survey design. Isn’t it a topic that is quite broad in the economic sense (broader than foreigners and housing), dealing not just with inflation but also with savings, employment and wages, the cost of running a business, health care costs, etc., and including housing costs and the cost of hiring foreign maids and workers?

    Singaporeans are not stupid. Raising the foreign worker levy will hit their pockets somewhere if businesses have to raise wages to attract local workers and pass some of these higher costs to consumers.

    And since over 50% answered ‘No’ to the three issues, you have to wonder what this majority is concerned about or influenced by.

    2) Mediacorp has to plan for the contingency that the results are not ‘favorable’ (in whatever sense of the word).

    So, actually, the first thing is to check for any inherent bias on the part of the survey commissioner, because results, if they are not favorable, will not be published; and any favorable result would not be believed.

    3) Actually, it is possible to estimate the margin of error based on the information given. At 99% confidence, it is about 5%. Care must be used in interpreting the margin.

  2. Criticalist 30 March 2011 at 00:21

    I want to put a caveat upfront: I’m not a survey specialist, although I am familiar with both quantitative and (especially) qualitative research methods. Now having said that, I’d like to offer the following thoughts on this matter. These considerations were discussed with a colleague who *is* stats-trained, and proficient in survey design.

    1. It *is* strange that the design of the ‘hot button’ survey is almost amateurish, considering that it was conducted by a company that supposedly deals with “research studies” and some complex statistical analyses. Using simple “yes”/“no” responses oversimplifies what is a complex matter. At the least, a survey should be Likert in nature, and include a ‘don’t know’ category.
    2. I should add that Media Research Consultants (MRC) is a subsidiary of Mediacorp, so it is NOT an independent research company. See http://www.mrconsultants.sg/services.htm.
    3. We’re not cognisant of the survey design, but it plays a significant role in how respondents answer the questions. For example, were the interviewees explicitly informed that the survey was ONLY about housing, foreigners and living costs? If they weren’t, they would have answered the three hot-button questions differently, because they might have expected the interviewer to ask about other issues (e.g. government performance, freedom of speech, etc.). They’d then potentially be responding ‘no’ to certain issues because they were saving their ‘yes’ responses for ones that, in the end, never turned up.
    4. There’s also a sequence effect (as I’m told by my stats friend) which will skew your results. For example, if I were asked about housing influencing my vote, I might say no, because I might expect a more important issue to emerge in the next few questions. Then I’m asked about “foreigners” and again I say no, and then suddenly I’m told the last question is “living costs” and I’d have to say “yes” because I suddenly realise I have run out of issues to choose from. Surveys try to resolve this problem by randomising the sequencing of questions (see the sketch after this list), by alerting interviewees that these are the only issues, or by letting them say yes to multiple issues. Of course we don’t know whether they did any of these.
    5. I’m surprised that there does not seem to have been any question asking interviewees to rank-order these issues, which would give a better sense of what’s important to people than a yes-or-no response. The survey could have asked, “In decreasing order of importance, which would influence you most in your voting: housing, foreigners or living costs?”, and the results would then indicate (e.g.) that 59% ranked housing as their no. 1 concern, etc.
    6. The cynic in me has to ask this question. Maybe the *reporting* of the survey data simplified the actual content and findings of the survey; we won’t know unless the survey report is provided for the public to see. But given the rather simplistic questions asked and the dubious design, and given the supposed (but non-independent) professionals employed to conduct the survey, I could be led to believe that the survey was designed intentionally to lead to a specific set of results. Results designed to counteract the internet’s supposed hot-button issues, as if to say: what you read online is not *true* after all.
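    (On point 4: to make the randomisation remedy concrete, here is a minimal sketch; the question wordings are stand-ins, since we don’t know the survey’s actual text.)

    ```python
    import random

    # Hypothetical questions; the real survey's wording is unknown.
    questions = [
        "Will housing costs influence how you vote?",
        "Will the influx of foreigners influence how you vote?",
        "Will the cost of living influence how you vote?",
    ]

    def interview_order(questions):
        """Return a fresh random question ordering for one respondent."""
        order = questions[:]   # copy, so the master list stays untouched
        random.shuffle(order)
        return order

    print(interview_order(questions))
    ```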

    • yawningbread 30 March 2011 at 00:47

      In my younger days, I had a short-term job with a survey company, and all the things you said were among the issues I learnt about during that stint. My experience thus supports the argument that these are important factors. It is indeed a pity that a full survey report has not (yet?) been published.

    • Gard 30 March 2011 at 09:40

      1) Yes, survey design is important, and there is a trade-off between simplicity and precision. To get a truly representative sample, I can imagine the surveyor having to conduct the survey (on a Likert scale?) in various dialects and languages for older folks over the telephone.

      (On 5) Assuming the three concerns were not anywhere near respondents’ top concerns, getting them to rank-order the three would not support any intended inference, e.g. “59% ranked housing as the no. 1 concern.”

      2) Let’s suppose all the flaws in the survey design exist. What would be the best inference to be drawn from the survey results? Good enough only to pack roasted peanuts?

  3. Amused 30 March 2011 at 09:42

    How can you be sure that they never asked about party preference? Perhaps they are publishing only results favorable to the ruling party. (Surprise!)

    Seriously, if some stranger asks you which party you will vote for, will you be comfortable giving away your true intention? There is still fear among voters, especially those who worry about their job security in the civil service or state enterprises. And government-related jobs make up a huge share of the Singapore economy.

  4. Walau 30 March 2011 at 13:16

    I think Alex’s analysis is correct. It makes the whole research exercise pretty meaningless!

