Prizes and Incentives for completing surveys

There are a number of things to think about when deciding whether to offer a prize for participating in a social research survey.

The three main things to consider are: is it really necessary; is it an appropriate prize; and does it comply with lottery laws? You must also comply with privacy law when collecting, storing and using personal information (name, phone, email) supplied to enter the draw.

Definition: An incentive is an item (product or voucher) offered to a research participant either at the time of participation (offered to all participants) or as a prize draw (occurring at the end of the fieldwork).

  1. Is it really necessary?
    Our community should want to give us their feedback without us having to pay for it, and we don’t want to set a precedent that feedback is only worth providing if there is a reward. Offering an incentive also increases the likelihood that people will complete the survey just to enter the prize draw (producing less honest answers), or complete it multiple times. When offering an incentive, we need to build in methods to identify people who are ‘skimming’ (completing the survey only for the prize), as well as those who complete it multiple times for extra chances at winning.
  2. Is the prize appropriate?
    When offering an incentive, it is important to ensure that it isn’t going to skew who participates in the survey. For instance, if you offer a public transport voucher, you will get a disproportionate number of respondents who use public transport, while people in areas without access to public transport, who might otherwise have participated, may not bother. Try to pick something that would be of value across all demographics and locations.
  3. Does the prize comply with lottery laws?
    The following information is specific to Victoria, Australia. If people are likely to be responding from other states or countries, you will need to review the laws in each location.
    If offering an incentive as a prize draw (that is, people provide their details and then a random entrant is drawn to win) it is recognised under Victorian lottery law as a ‘trade promotion lottery’. When running a trade promotion lottery, entering must be free and you must include the following information when the respondent enters the prize draw:

    • Closing date
    • Where and when the prize will be drawn
    • Where the names will be published (If the prize is over $1,000 you must publish the name of the winner.)
    • Any other entry requirements (for example, that at least 80% of the survey must be completed and only one entry per person will be accepted).

Other items for consideration are:

    • You must notify the winner in writing.
    • Records must be kept for 3 years to prove random selection.
    • The winner must be selected using a randomisation algorithm, so each entrant has an equal chance of being selected.
    • The prize must be delivered to the winner within 28 days of being drawn.
    • A redraw may be held only if reasonable efforts have been made to contact the winner and were unsuccessful.
    • If you need to change the prize after the survey has commenced, the new prize must be of equal or greater value, and the winner needs to agree in writing, or you need to make reasonable attempts to provide the alternative.
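The ‘equal chance’ requirement above is easy to get wrong with ad-hoc spreadsheet methods. As a minimal sketch only (the entry fields `email` and `completion` are hypothetical, and real draws should follow your state regulator’s record-keeping rules), a draw that enforces one entry per person and the 80% completion requirement before selecting uniformly at random might look like this:

```python
import secrets

def draw_winner(entries):
    """Pick one winner uniformly at random from eligible entries.

    `entries` is a list of dicts with hypothetical keys:
    'email' (used to enforce one entry per person) and
    'completion' (fraction of the survey answered).
    """
    # Entry requirements from the draw terms: one entry per person,
    # and at least 80% of the survey completed.
    seen = set()
    eligible = []
    for entry in entries:
        key = entry["email"].strip().lower()
        if entry["completion"] >= 0.8 and key not in seen:
            seen.add(key)
            eligible.append(entry)
    if not eligible:
        raise ValueError("no eligible entries")
    # secrets draws from the operating system's randomness source,
    # so every eligible entrant has an equal chance of selection.
    return secrets.choice(eligible)
```

Keeping the eligibility filtering in one auditable function also makes it straightforward to retain the records (entries, eligibility decisions, winner) needed to prove random selection.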

Privacy law

To comply with privacy law, you must take the following steps regarding the personal information collected to enter the draw (name, email, phone number, address etc.):

  1. In the introduction to the section asking for personal information for the prize draw, include a link to your privacy policy.
  2. When collecting this information, it must not be physically stored outside of Australia (that is, you can’t use SurveyMonkey or Google Forms). ASDF research has a locally installed online surveying tool, hosted in Australia; please see our Online Surveying information sheet for further details.
  3. Ensure that you do not store the contact information in the same data file as the survey responses.
  4. The contact information provided must not be used for any other purpose unless the individual gives written permission. That is, if you collect their name and email address for a prize draw, you are not allowed to add them to your enewsletter list. You can include a checkbox asking whether they would like to be added to the list, but it must be unchecked by default.
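Step 3 above (keeping contact details out of the response data file) can be automated at export time. As a rough sketch, assuming your survey tool produces one combined CSV and that the contact columns are named `name`, `email` and `phone` (hypothetical names — adjust to your export):

```python
import csv

# Hypothetical column names for prize-draw contact details.
CONTACT_COLS = {"name", "email", "phone"}

def split_export(combined_path, responses_path, contacts_path):
    """Split a combined survey export into two files so the prize-draw
    contact details never sit in the same data file as the answers."""
    with open(combined_path, newline="") as f:
        rows = list(csv.DictReader(f))
    fields = list(rows[0].keys()) if rows else []
    response_cols = [c for c in fields if c not in CONTACT_COLS]
    contact_cols = [c for c in fields if c in CONTACT_COLS]

    with open(responses_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=response_cols)
        writer.writeheader()
        for row in rows:
            writer.writerow({c: row[c] for c in response_cols})

    with open(contacts_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=contact_cols)
        writer.writeheader()
        for row in rows:
            writer.writerow({c: row[c] for c in contact_cols})
```

Note that the two output files share no respondent identifier, which is the point: analysis happens on the responses file, and the contacts file exists only for the draw.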

The cost factor: cutting corners in design to reduce cost

I am going to make my position on this very clear from the outset… Don’t do it!

This is perhaps one of the worst trends I have seen in research in recent years. As it becomes easier to do research at low cost (SurveyMonkey etc.), I see a lot of organisations running sub-standard research. Not only does this devalue research as a whole (respondents who receive poorly designed surveys develop cynical views about participating in research), it results in organisations making decisions based on unsound data. This has dire ramifications both for the future of social research (as a way to give the community an avenue to have their say on important issues concerning them) and for the functioning of businesses that make important decisions based on poor data.

Some of the most common cost-cutting mistakes I see are:

  • Conducting surveys in-house without the expertise to adequately design questionnaires/methodology or analyse findings. This is a particular challenge for the industry today as commissioning a research company to conduct research is often prohibitively expensive, while many organisations are required to undertake research to meet funding / board obligations. Furthermore, research is usually the first item to be reduced or removed to meet budgets, whilst the requirement for delivering evidence of progress remains.
  • SurveyMonkey (or similar). I cannot express enough how dangerous SurveyMonkey is to the research industry, and to organisations that use it without drawing on any expertise in research design. It has made it incredibly easy for anyone to run a survey without any knowledge of how to design questions or, indeed, even reach a representative target market.
  • Combining two surveys to reduce administration costs, resulting in prohibitively long surveys (some more than 100 questions!). This hurts response rates (reducing representativeness) and also the accuracy of results in the later questions of the survey (response fatigue).
  • Long barrages of statements to be rated, used to reduce survey length. In a telephone survey environment, this is taxing on both the interviewer and the respondent; and in a self-completion environment (online or paper-based) there is a risk of ‘skimming’ (that is, people circling the same option, or random options, for each statement just to finish the question – there are methods to identify and remove people who do this, but that is for another post).
  • Using poor respondent sourcing methodology. This deserves its own post later, but the two cheapest options at present are online research panels and random digit dialling (RDD) of landlines. Online research panels are self-selected (people choose to join) and are populated with professional respondents (people who complete lots of surveys, and who are therefore not necessarily typical of the general population). In Australia, recruiting survey respondents using random digit dial landline numbers or White Pages listings (including listed mobiles) will not achieve a representative sample: less than half of adults under 40 have a landline telephone, and less than 8% of mobile telephones are listed in the White Pages (mostly trades businesses). Unfortunately, mobile phone RDD in Australia is not feasible unless the survey is national, as mobile numbers are not assigned by region, and screening for the defined region would result in a very low response rate and consequently high cost.
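One of the simplest screening checks hinted at above — catching respondents who ‘straight-line’ a long rating battery — can be sketched in a few lines. This is only a sketch of one such method (real screening would also consider completion times and patterned responses, and the threshold below is an assumption, not a standard):

```python
def flags_straight_lining(ratings, min_items=5):
    """Flag a respondent who gave the identical rating to every
    statement in a rating battery (a common sign of 'skimming').

    `ratings` is the list of scale answers one respondent gave to
    a single battery of statements. Short batteries are ignored
    (min_items is an assumed cut-off), since uniform answers there
    can easily be genuine.
    """
    return len(ratings) >= min_items and len(set(ratings)) == 1
```

Flagged cases would normally be reviewed rather than deleted automatically, since a respondent may legitimately hold the same view on every statement.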

Survey sampling: Is telephone research no longer viable in Australia?

Conducting research using random digit dial (RDD) landline numbers has for decades been a staple of the research industry. In recent years the effectiveness of this methodology has been in significant decline: first due to the withdrawal of the electronic White Pages from public access in 2002, followed by a significant decline in home landline installation (no longer necessary now that nearly everyone has a mobile phone).

The ACMA Communications report for 2011/2012 shows that only 22% of Australian adults with a fixed-line telephone or mobile mostly use the fixed line at home to communicate, meaning that even when people have a fixed line, it is usually not their primary method of phone communication. Furthermore, access to a fixed-line telephone is low amongst younger adults: in June 2011, only 63% of 18–24 year olds (mostly those still living in the parental home) and 64% of 25–34 year olds claimed to have a fixed-line telephone at home. These figures have been falling over the years, so are most likely much lower now. [1]

Research conducted by the Social Research Centre reveals statistically significant variations in the populations reached by different telephone sampling methodologies. Specifically, those contacted on a mobile phone who did not have a landline were more likely to be male, in younger age groups, living in capital cities, born overseas, living in rental accommodation, and to have lived in their neighbourhood for less than five years. [2] In addition, significant biases were observed in the landline sample, which showed a lower incidence of a variety of important variables, including health issues, public transport usage, smoking and alcohol consumption. [3]

There are telephone number list providers that claim to include mobile numbers by region. These are ‘listed’ mobile numbers: when someone obtains a new mobile number, it is unlisted by default unless the owner requests a listing, and many mobile providers don’t actively prompt people to list their numbers. Mobile numbers that are listed are highly likely to belong to home businesses (these are the people who go out of their way to have their numbers listed), thereby skewing the ‘mobile population’ in the survey.

Conclusion

Using random digit dial (RDD) with a mix of mobile and landline numbers can achieve representative samples. However, this only works for national surveys, as mobile phone numbers are not assigned by region. Local area telephone surveys using RDD landlines or White Pages phone numbers (even if listed mobiles are included) will miss large, and often critical, chunks of the community.

It should be noted, however, that telephone surveys are still viable when you have phone numbers for the entire population you are sampling (e.g. using a client list).

References
[1]  Australian Communication and Media Authority (2012) Communications report 2010–11 series Report 2: Converging communications channels: Preferences and behaviours of Australian communications users, ACMA.
[2] Penney, D (2012), Second national dual-frame omnibus survey announced, www.srcentre.com.au, accessed 21 August 2013.
[3] Penney, D & Vickers, N (2012), Dual Frame Omnibus Survey: Technical and Methodological Summary Report, The Social Research Centre.