I am going to make my position on this very clear from the outset… Don’t do it!!!
This is perhaps one of the worst trends I have seen in research in recent years. As it becomes easier to run research at low cost (SurveyMonkey and the like), I see lots of organisations running sub-standard research.

Not only does this devalue research as a whole (that is, respondents who receive poorly designed surveys develop cynical views towards participating in research), it results in organisations making decisions based on unsound data. This has dire ramifications both for the future of social research (as a way of giving the community an avenue to have their say on issues that concern them) and for the functioning of businesses that make important decisions based on poor data.

Some of the most common cost-cutting mistakes I see are:
- Conducting surveys in-house without the expertise to adequately design the questionnaire and methodology or analyse the findings. This is a particular challenge for the industry today: commissioning a research company is often prohibitively expensive, while many organisations are required to undertake research to meet funding or board obligations. Furthermore, research is usually the first item to be reduced or removed to meet budgets, whilst the requirement to deliver evidence of progress remains.
- SurveyMonkey (or similar). I cannot express enough how dangerous SurveyMonkey is to the research industry, and to organisations that use it without any expertise in research design. It has made it incredibly easy for anyone to run a survey without any knowledge of how to design questions, or indeed of how to reach a representative target market.
- Combining two surveys to reduce administration costs, resulting in prohibitively long surveys (some more than 100 questions!). This affects response rates (reducing representativeness) and the accuracy of responses to the later questions in the survey (response fatigue).
- Long batteries of statements to be rated, used to reduce survey length. In a telephone survey this is taxing on both the interviewer and the respondent; in a self-completion environment (online or paper-based) there is a risk of ‘skimming’ – that is, people circling the same option, or random options, for each statement just to get through the question. There are methods to identify and remove respondents who do this (see the sketch after this list), but the details are for another post.
- Using a poor respondent sourcing methodology. This deserves a post of its own, but the two cheapest options at present are online research panels and random digit dialling (RDD) of landlines. Online research panels are self-selected (people choose to join) and are populated with professional respondents (people who complete lots of surveys and are therefore not necessarily typical of the general population). In Australia, recruiting survey respondents using random digit dialling of landline numbers, or White Pages listings (including listed mobiles), will not achieve a representative sample: less than half of adults under the age of 40 have a landline telephone, and less than 8% of mobile telephones are listed in the White Pages (mostly trades businesses). Unfortunately, mobile phone RDD in Australia is not feasible unless the survey is national, as mobile numbers are not assigned by region; screening for the defined region would result in a very low response rate and consequently a high cost.
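On the skimming point above, here is a minimal sketch of one common screening approach, assuming the battery of rating-scale responses sits in a pandas DataFrame with one column per statement. The column names, example data and the 0.5 variance threshold are illustrative assumptions only, not figures from any survey described in this post.

```python
import pandas as pd

# Hypothetical battery of rating statements (1-5 scale), one column per statement.
# Column names and the example data are illustrative only.
ratings = pd.DataFrame(
    {
        "q1_service":  [3, 5, 2, 4],
        "q2_value":    [3, 1, 2, 4],
        "q3_staff":    [3, 4, 2, 5],
        "q4_location": [3, 2, 2, 1],
    },
    index=["resp_a", "resp_b", "resp_c", "resp_d"],
)

# Flag respondents who gave the same answer to every statement
# ("straight-lining"), the most obvious form of skimming.
straight_lined = ratings.nunique(axis=1) == 1

# A softer check: flag respondents whose answers barely vary across the
# battery (standard deviation below an arbitrary threshold of 0.5).
low_variance = ratings.std(axis=1) < 0.5

flagged = straight_lined | low_variance
print(ratings.loc[flagged])   # respondents to review or exclude before analysis
```

In practice, checks like these are usually combined with other indicators (for example, unusually fast completion times) before any respondent is actually excluded from the data.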