How to write a survey questionnaire for evaluation: A guide for beginners
About this resource
This resource provides basic information and practical tips to help you design and implement simple survey questionnaires for your program evaluation activities. It is a companion to the resource Using a survey to collect data for evaluation: a guide for beginners. These resources are intended for people working in or with child and family services who are considering writing a survey questionnaire to collect data for program evaluation but who have limited experience or training in writing survey questions.
What is a survey questionnaire?
A survey questionnaire is a data collection tool: a set of questions used to collect information from individuals. In this resource we focus on ‘structured’ survey questionnaires: that is, questionnaires that ask questions in a specific way and order to collect primarily quantitative data. Sometimes people use the term ‘survey’ to refer to the list of questions; however, a ‘survey’ more accurately refers to both the questions and the processes related to participant recruitment, data collection, analysis and reporting. This resource focuses on the basics of creating a simple survey questionnaire.
A survey questionnaire can be administered in a variety of ways including in person or on the telephone, as handouts or by physical mail, or electronically using email or online questionnaires. The questions in a survey questionnaire are usually either ‘closed-ended’ or ‘open-ended’, with many questionnaires using a mix of both question types. Table 1 summarises the difference between closed-ended and open-ended questions. We provide more information about each question type later in this resource.
Table 1: Differences between closed-ended and open-ended questions

| Closed-ended questions | Open-ended questions |
| --- | --- |
| Limited set of options from which to choose a response | No preset responses or lists |
| Response options include yes/no, multiple choice, scales and rankings | Respondents use their own words to provide information, comments or opinions |
| Often focused on ‘what’ and ‘when’ | Often focused on ‘how’ and ‘why’ |
Designing a survey questionnaire
Typically, you would choose a survey method – and use a survey questionnaire for data collection – when you need to systematically collect data (using the same set of questions) from a relatively large group of people. A survey method would be an appropriate choice if you’ve decided you need to collect quantitative data or a mix of quantitative and qualitative data. If you haven’t made these decisions yet, pause here and refer to these AIFS resources: Using a survey to collect data for evaluation: a guide for beginners and Evaluation design.
Once you have decided to do a survey, a crucial part of the process is determining whether you can use (or adapt) an existing questionnaire or if you need to create your own. Ideally, there will be an existing standardised and validated questionnaire that you can use. This saves time, increases the reliability of your results and allows you to compare your findings with those of other surveys. For details on existing standardised measures, refer to the AIFS resource: How to choose an outcomes measurement tool.
However, there may be times when no existing questionnaire suits your purpose, or when you need to supplement standardised measures with survey questions that address your needs (e.g. evaluation questions about your program) or the needs of your target population. The remainder of this guide briefly describes the basics of survey questionnaire design, whether you are writing all new questions or using a combination of new and pre-existing ones.
How to write your survey questionnaire
Writing a survey questionnaire is a complex task. It requires a specific skill set and can take time to master. That is why we recommend using existing standardised measures where possible.
However, if there is not a suitable questionnaire, or you need to supplement your use of standardised measures, the following tips may help you think through the design process:
- Seek out help when writing your questionnaire. Even if you do have some experience, it is usually a good idea to workshop your questions and ideas with others.
- Review existing, well-tested questionnaires in a similar sector rather than starting from scratch. For example, the Australian National University (ANU), the International Social Survey Programme (2019) Australian Social Inequality Questionnaire and the Culture Counts Evaluation Platform all provide questions you can draw on for your questionnaire, while also giving you an idea of how a questionnaire is structured. Keep in mind that you may need to purchase a questionnaire or get permission from the developer before you can use it. You can also refer to the Further Reading section of this resource for additional information.
- Pilot-test any new questionnaire on a sample of your target population. This helps ensure your respondents will understand the questionnaire in the way you intend, that it is appropriate for this target group, and that you are receiving the kinds of responses you are expecting.
Steps to write your survey questionnaire
Step 1: Plan your survey questions
While it might be tempting to jump ahead and start writing your questions, it is important to spend some time planning. As part of the planning, brainstorm with evaluators and other key stakeholders, including service users (where appropriate), to ensure the questionnaire meets your needs and those of your stakeholders (Salabarría-Peña et al., 2007). Your questionnaire is likely to be improved if you capture a range of views in the planning process (Preskill & Jones, 2009). The following questions can help guide your brainstorm:
- What do I need to know?
- Why do I need to know it?
- What will happen as a result of this questionnaire?
- Do I really need to do a survey or can I get the information from other existing data sources?
You should also refer to any other documents you have that describe what the program is intending to achieve, including your program logic (read more in the AIFS resource How to develop a program logic for planning and evaluation) or any outcomes or reporting frameworks that apply to your service, including the AIFS Families and Children Activity outcomes framework. These documents will help you to identify what is important to know about your program or service.
It is also important to consider:
- Who should be surveyed. It isn’t always possible to survey everyone in your target population. As a result, you often need to survey a subset or sample of the population of interest (for more on sampling, refer to the AIFS practice guide: Using a survey to collect data for evaluation: a guide for beginners). To ensure that any data you collect is meaningful, it is important to think about who your sample is and how well they represent the population of interest.
- What type of responses you want to collect, and what skills you have to analyse the data. There are lots of different ways to answer a question – for example, answers may be a numerical rating, numerical ranking, yes/no, multiple choice, open-ended or a mixture of all of these. The type of answers you collect will influence the skills, time and other resources you need to analyse and report on the data.
During the planning phase, it is also a good idea to look at other surveys and talk to your colleagues and networks about what approaches have worked for them in similar programs or services.
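To make the analysis point above concrete, the sketch below (a hypothetical example, assuming you have exported responses into a Python list; your survey tool and data format may differ) shows why closed-ended answers are quick to tally automatically, while open-ended answers need manual review:

```python
from collections import Counter

# Hypothetical pilot data: answers to a closed-ended (Likert) question.
# Closed-ended responses can be tallied automatically in one step.
likert_answers = [
    "Agree", "Strongly agree", "Agree",
    "Neither agree nor disagree", "Agree",
]

tally = Counter(likert_answers)
for option, count in tally.most_common():
    print(f"{option}: {count}")

# Hypothetical open-ended answers, by contrast, must be read and coded
# into themes by the analyst before they can be counted - plan time for this.
open_ended = [
    "The playgroup times suited us well.",
    "More evening sessions would help working parents.",
]
print(f"{len(open_ended)} open-ended responses to review manually")
```

The point is not the tooling but the resourcing: the closed-ended tally takes seconds, while each open-ended response adds reading and coding time to your analysis budget.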
Step 2: Write your survey questions
When writing survey questions, you need to start by considering what you want to know and the type of question you need to ask to get the answer. Simple questionnaires generally feature two broad types of questions: closed-ended and open-ended questions.
Closed-ended questions are questions that have a set of predetermined responses or answers from which participants can choose (Centers for Disease Control and Prevention, 2018). Closed-ended questions can include yes/no questions, multiple choice questions, scales and ranking questions.
Closed-ended questions are easier to administer and analyse than open-ended questions and are usually easier for respondents to answer. However, they limit people’s response choices, which can mean some information is missed if the categories do not include the response a person wants to provide. As a result, careful thought needs to be given to the range of response options provided to ensure they are as accurate and comprehensive as possible. To ensure that response lists are comprehensive – without being so long they are difficult to read or to analyse and interpret – it is common to include ‘other (please describe)’ as a response option. This allows survey respondents to record other answers that aren’t listed.
Examples of some common closed-ended question types
Below are some common closed-ended questions used in survey questionnaires. These are not the only question types, but they are the ones most likely to be used in the simple surveys used to evaluate child and family services.
Limited choice questions (also called Yes/No or dichotomous questions)
Have you ever attended a child and family centre?
- Yes
- No
Multiple choice questions (either single answer or multiple answer)
Which Australian state or territory do you work in?
- Australian Capital Territory
- New South Wales
- Northern Territory
- Queensland
- South Australia
- Tasmania
- Victoria
- Western Australia
Partially closed multiple choice question (single answer)
Which of the following best describes the services you use at the child and family centre you attend?
- Parenting advice
- Parenting programs
- Supported playgroups
- Peer group support
- Other (please describe...)
Rating scale (Likert)
I can access the services I need.
- Strongly disagree
- Disagree
- Neither agree nor disagree
- Agree
- Strongly agree
Rating scale (numerical)
On a scale of 1 to 10, how likely are you to recommend this service to a friend?
1 = Not at all likely; 10 = Extremely likely
Open-ended questions require participants to provide answers in their own words. They can be short – e.g. simply asking for a number such as the respondent’s age or postcode – or long and designed to prompt people to answer with sentences, lists, opinions and stories. Long open-ended questions can provide rich data and new insights into the respondents’ views and opinions. However, open-ended questions can be more difficult and time-consuming for respondents to answer, and some people may choose to skip these questions because they require too much effort. They can also be time-consuming to analyse because they require the survey analyst to review the responses, identify themes and categories, and summarise the answers.
It is common for a survey questionnaire to include a combination of closed-ended and open-ended questions. Combining question types can help draw out richer information and allow you to delve deeper into closed-ended answers. For example, you might choose to follow up a closed-ended question asking the respondent to ‘rate their experience of service delivery’ with an open-ended question asking them to explain their answer (e.g. ‘What is the main reason for your answer?’). This can help you understand more about the respondent’s experience.
Tips for writing survey questions
The information you wish to collect with your questionnaire is likely to be determined by your program logic, outcomes framework or organisational reporting domains. However, it is usually not advisable to put your overarching evaluation questions or reporting domains as questions in your questionnaire. This is because those questions or domains are usually very broad and may require multiple questions or data sources to answer properly. Instead, when writing your questionnaire you should aim to write short, specific questions that focus on one key concept at a time.
Consider the following as you write your questions:
- Avoid writing overly complex questions. Make them simple (i.e. do not use complex or ambiguous vocabulary or grammar) and short (i.e. they can be easily read and understood). Fewer than 25 words per question is a good guide.
- Use language and vocabulary appropriate to your target group.
- Avoid double-barrelled questions: that is, a question that covers two different issues or topics while only allowing a single answer (e.g. ‘How satisfied are you with the program’s service delivery and customer support?’).
- Avoid using words that may have different interpretations or might be hard for some respondents to understand.
- Avoid jargon and acronyms. If you need to use acronyms, spell them out at first use.
- Think about the length of the questionnaire. People are less likely to complete long surveys (e.g. if it takes more than about 20 minutes to complete) but may be willing to spend more time on a survey if they know and trust the person or organisation administering it. For online questionnaires, it is best to provide a progress bar that shows how much of the survey has been completed. Where possible, it also helps to allow participants to save their progress and come back later.
- To help prevent surveys becoming too long, focus on ‘need to know’ questions and minimise ‘nice to know’ questions. Ensure your survey questions focus on the information you need to answer your overall evaluation questions and address your purpose.
- Use neutral language that doesn’t predict or signal a ‘correct’ answer. For example, instead of asking, ‘How much did you benefit from our program?’, you could ask respondents to rate their experience of the program using a rating scale (Likert); e.g. ‘On a scale of 1–5, please rate your experience of the program’.
- Ensure the questions in your survey apply to everyone you survey. You can create specific sets of questions for specific participants; however, this requires more advanced survey programming skills and makes the survey harder to administer and analyse. If you choose to do this, it’s important to ensure participants are responding to the right questions.
Other considerations when developing a survey questionnaire include:
- Think about your question sequence, layout and appearance. It is often a good idea to place sensitive questions near the end of a questionnaire so that participants have a good sense of what the survey is about before being asked to consider difficult questions. Most surveys collect basic information about the person completing the questionnaire at the start, before going on to more detailed questions.
- Make sure you have a clear introduction and cover letter that includes all the information needed to gain informed consent from participants. It is important the cover letter includes what type of topics will be covered so that respondents don’t become distressed or upset if they are asked about sensitive or personal topics without warning. Cover letters include details about:
- the purpose of the study
- how the participants may obtain additional information about the study
- the study’s importance
- the types of questions that will be asked
- how the researchers/evaluators will ensure confidentiality and anonymity
- the time required to complete the survey.
Step 3: Write your questionnaire responses
Writing questionnaire responses usually happens when writing the questions, but we have included it here as a separate step because the response options you provide in closed-ended questions are just as important as the questions you ask. Response options that do not allow respondents to select an option that matches what they really think can produce poor quality or biased data. Poorly worded responses may also lead respondents to respond in a particular way. For example, a question such as ‘do you think we should deliver this program to more people?’ assumes the program is successful and may influence respondents to answer in a positive way (regardless of what they think) or not answer at all.
The following tips can help you write better responses for closed-ended questions:
- Make the survey responses clear and concise.
- When writing multiple choice responses, try to include answer options that cover a range of reasonable and meaningful answers. This does not mean you need to include every possible answer in your list – you may supplement the key choices with an ‘other’ response option. If you are not sure what the best selection of responses is, you can test the topic using an open-ended survey question with a sample of your target group. You can then identify common answers from those test surveys to include in a closed-ended question’s response list.
- Although a list should cover the full breadth of response options, do not offer too many or too few options (e.g. in single choice or multiple-choice lists). Too few choices can be restrictive, while too many can be hard to read and more difficult to analyse. Where possible, limit the response options to 4 or 5 and provide an ‘other’ response option to allow respondents to add their own answer.
- Make sure response options are mutually exclusive. For example, multiple-choice questions about the respondent’s age bracket should be written as 18–25, 26–35, 36–45, rather than, 18–25, 25–35, 35–45.
- Provide ‘No opinion’ responses such as ‘I don’t know’, ‘Undecided’ or ‘Not Applicable’ when possible.
- For multiple-choice questions, decide if you want to allow respondents to select more than one response (e.g. ‘Check all that apply’) or if you want them to indicate only one answer. Make it clear what responses are allowed.
- For scale questions (such as Likert scales), it is better to have an odd number of responses because it allows a neutral midpoint. For example, in a question asking a respondent how much they agreed with a statement about their satisfaction with a service, you might have ‘strongly disagree’, ‘disagree’, ‘neither agree nor disagree’, ‘agree’, ‘strongly agree’. Unless you have a specific reason for a more nuanced scale, it is generally more straightforward to keep your scales limited to 5 points.
- It is usually easier for respondents to complete scale questions (such as Likert scales) if you label all response options (e.g. from ‘strongly disagree’ to ‘strongly agree’) rather than using numerical signifiers (e.g. where 1 = strongly disagree and 5 = strongly agree).
- Avoid unequal comparisons such as having one end of a scale worded very negatively (e.g. strongly disagree) and the other only mildly positively (e.g. agree).
- Where possible (and meaningful), when asking questions about frequency or the time something happened, use precise timescale options such as ‘last week’ or ‘last month’ rather than vague terms such as ‘recently’ or ‘often’.
- For online surveys, consider whether you want to require respondents to answer all questions by making them mandatory (which yields more complete data) or to allow them to skip questions (which lets respondents pass over questions that might make them feel uncomfortable).
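The tip above about mutually exclusive response options can even be checked mechanically. The sketch below is a hypothetical illustration (assuming age brackets written as inclusive `(low, high)` ranges in a Python script; a visual check of your list works just as well) of why 18–25, 26–35, 36–45 is sound while 18–25, 25–35, 35–45 is not:

```python
def mutually_exclusive(brackets):
    """Return True if no two (low, high) ranges share a value."""
    ordered = sorted(brackets)
    # Each bracket must end strictly before the next one starts.
    return all(prev_high < next_low
               for (_, prev_high), (next_low, _) in zip(ordered, ordered[1:]))

# Correct: no age falls into two brackets.
print(mutually_exclusive([(18, 25), (26, 35), (36, 45)]))  # True

# Incorrect: a 25-year-old (or 35-year-old) matches two options.
print(mutually_exclusive([(18, 25), (25, 35), (35, 45)]))  # False
```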
Step 4: Test your survey questionnaire
Before rolling out your questionnaire, you should test it with other staff or a group of participants from your target group who have agreed to pilot the survey. Key questions to ask people piloting or testing your survey include:
- Was the content what you expected?
- Were any questions confusing?
- Were any questions redundant?
- Were response choices clear?
- Do you have any feedback about topics that were not included?
- Were there any spelling, grammar or printing errors?
As the person administering the survey, you should also use a test or pilot of the survey to assess how easy it is to administer and whether it collects the data you need. First, ask whether the questions and answers were interpreted the way you intended. It is also important to assess how long respondents took to complete the survey, including the shortest and longest completion times. If using an online survey, the software will usually provide this information for you.
Where needed, revise the questionnaire based on test results.
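If your pilot produces completion times, a quick summary helps you judge questionnaire length against the rough 20-minute guide mentioned earlier. The sketch below uses hypothetical times in minutes (online survey tools usually export these for you; a spreadsheet gives the same answers):

```python
from statistics import median

# Hypothetical completion times (in minutes) from a pilot of the survey.
times = [8.5, 12.0, 9.0, 21.5, 10.0, 11.5]

print(f"shortest: {min(times)} min")
print(f"longest:  {max(times)} min")
print(f"median:   {median(times)} min")

# A median well above about 20 minutes suggests the questionnaire may be
# too long for many respondents to complete.
```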
A survey questionnaire is an important data collection tool for evaluation. Questionnaires tend to be relatively cost-effective to administer and allow you to systematically collect both quantitative data (using closed-ended questions) and qualitative data (using open-ended questions) from a large number of people. However, writing survey questions can be difficult and time-consuming, and poor-quality questions can mean you do not get meaningful data. When you can’t use a pre-existing questionnaire, or when you need to supplement standardised measures with your own questions, investing the time to develop good questions (e.g. questions that are easy to complete and appropriately worded and structured) will increase the likelihood that you get the data you need for a meaningful evaluation.
Further reading
The following resources provide more information on the topics covered in this resource. They should be used as a supplement to the works in the reference list below.
Australian Institute of Family Studies (2021). Ethics in evaluation. Melbourne: Australian Institute of Family Studies.
Australian Institute of Family Studies (2020). Planning an evaluation: step by step. Melbourne: Australian Institute of Family Studies.
Australian Institute of Family Studies (2016). How to develop a program logic for planning and evaluation. Melbourne: Australian Institute of Family Studies.
Australian Institute of Family Studies (2019). Identifying evaluation questions. Melbourne: Australian Institute of Family Studies.
Australian National University. Personality & Total Health (PATH) Through Life, National Centre for Epidemiology and Population Health, ANU College of Health & Medicine.
BetterEvaluation (website). Provides a list of resources for different types of survey methods, including Designing the Face-to-Face Survey and Collecting Evaluation Data: Surveys, as well as advice for planning and designing surveys.
Culture Counts Evaluation Platform (website). Question Bank.
Department of Communities and Justice (2021). TEI guide to developing surveys. Sydney: NSW Department of Communities and Justice Targeted Early Intervention program.
Trochim, W. M. K. (online resource). Research Methods Knowledge Base: a comprehensive web-based textbook of social research methods. Retrieved from https://conjointly.com/kb/survey-research/
Australian Bureau of Statistics. (2023). Questionnaire Design. Canberra: Australian Bureau of Statistics.
Australian Institute of Family Studies (AIFS). (2021). Evaluation design (Practice Guide). Melbourne: Australian Institute of Family Studies.
Australian Institute of Family Studies (AIFS). (2011). Collecting data from parents and children for the purpose of evaluation (CAFCA Practice sheet). Melbourne: Australian Institute of Family Studies.
Centers for Disease Control and Prevention. (2018). Data collection methods for program evaluation: questionnaires. Evaluation Briefs No 14. US Department of Health and Human Services.
Salabarría-Peña, Y., Apt, B. S., & Walsh, C. M. (2007). Practical use of program evaluation among sexually transmitted disease (STD) programs. Atlanta, GA: Centers for Disease Control and Prevention.
Preskill, H., & Jones, N. (2009). A practical guide for engaging stakeholders in developing evaluation questions. Princeton, NJ: Robert Wood Johnson Foundation Evaluation Series.
1 Unstructured questionnaires may have a list of loose questions and prompts but focus on more open-ended, qualitative questions. This kind of questionnaire is similar to a qualitative interview topic guide.
Contributions and acknowledgements
This practice guide was written by Dr Megerssa Walo, with key contributions from Dr Stewart Muir, Sharnee Moore, Dr Jasmine B. MacDonald and Dr Lakshmi Neelakantan.
Walo, M. (2023). How to write a survey questionnaire for evaluation: A guide for beginners. Melbourne: Australian Institute of Family Studies.