
ESRA 2019 at-a-glance programme


Surveying Children and Young People 1

Session Organisers: Ms Kate Smith (Centre for Longitudinal Studies, UCL Institute of Education, London)
Dr Emily Gilbert (Centre for Longitudinal Studies, UCL Institute of Education, London)
Time: Wednesday 17th July, 09:00 - 10:30
Room D22

Many large-scale surveys successfully collect a variety of distinct types of data from children and young people (up to the age of 25). However, there is relatively little methodological evidence in this area. Much of the literature relating to children and young people’s participation in research focuses on small-scale qualitative studies and tends to concentrate on ethical issues relating to the rights of children and young people in research. This session will cover the challenges and experiences of including children and young people in surveys as they move from childhood to adulthood, and related survey design issues. A major challenge when interviewing teenagers is that while children’s participation in surveys is often mediated by and involves their parents, teenagers and young people make autonomous decisions, bringing challenges particularly in terms of engagement. The session aims to explore a variety of methodological issues around surveying young people. Submissions are particularly welcomed on:
- designing questionnaires for children and young people, including question testing methods
- collecting data on sensitive topics from young people, including methods for ensuring privacy and encouraging accurate reporting
- collecting different types of data from children and young people including physical measurements and cognitive assessments
- using different methods of data collection, including the use of innovative technology such as the web and mobile phones
- inclusivity in data collection methods, including facilitating the participation of children and young people with lower literacy levels
- assessing the reliability and validity of children and young people’s self-reports
- preventing non-response by engaging young people in research, including designing survey materials to appeal to young people and using new technology and digital media for participant engagement
- the challenges of retaining young people’s contact and interest in surveys over time
- ethical issues in involving children and young people in surveys, including gaining informed consent and protecting young people’s rights and well-being

Keywords: Children, young people, surveys

Finding Effective Ways to Encourage Teens to Participate in a Web-Push Survey (and Getting Their Parents to Let Them)

Mr Brian Wells (UCLA Center for Health Policy Research)
Mr Todd Hughes (UCLA Center for Health Policy Research) - Presenting Author
Mr Royce Park (UCLA Center for Health Policy Research)
Ms Kathy Langdale (SSRS)
Dr Suzanne Ryan-Ibarra (Public Health Institute)
Mrs Kyli Gallington (Public Health Institute)
Ms Rebecca Garrow (Public Health Institute)


Surveys of young people often face the difficult task of obtaining parental permission to interview a child and then getting those children to respond. The California Health Interview Survey (CHIS) has seen permission rates for its CATI interview cut in half over the last five years. As surveys consider moving from CATI to a mixed-mode design, studies aimed at surveying youth need to better understand how the permission-cooperation dynamic shifts with frame and mode changes. This paper will discuss the results of two field experiments that sought to obtain teen interviews using an ABS mail push-to-web design with CATI follow-up, and highlight insights gained from focus groups with eligible parents.
The first field experiment invited parents of teens (age 12 to 17) to provide permission to contact their teen via text, email, and telephone. Overall permission rates were similar to those achieved in CHIS 2017 using only CATI. Parents were generally willing to provide permission to text their teen an invitation if they had a personal cell phone. However, the teen cooperation rate was less than 30% for this experiment as compared to 90% in production.
The second experiment tried to improve on these methods by changing the nature of the permission request, removing the more invasive and less applicable contact techniques, and giving the parent a more active role in getting the teen to participate. We introduced a monetary incentive for teens who completed the survey and experimentally tested a monetary incentive for permission-granting parents.
This paper will postulate why the frame and mode changes resulted in reduced levels of response, evaluate the impact of both the teen and parental incentives, discuss the importance of a parent’s role in teen participation, and lay out warnings for researchers trying to survey teens.


Effects of Differences in Data Collection Method in a Youth Programme Evaluation

Mr David Andersson (Kantar Public) - Presenting Author
Mr Sinan Fahliogullari (Kantar Public)
Mr Peter Matthews (Kantar Public)

This paper will look at an experiment investigating the effects of differences in methods of data collection within a major youth programme evaluation in the UK.

The National Citizen Service (NCS) is a youth programme for 15-17-year-olds in the UK, aimed at bringing together young people from different backgrounds to build social cohesion, social responsibility and social mobility.

Since 2012 the Department for Digital, Culture, Media and Sport (DCMS) has commissioned annual evaluations of the programme. These evaluations have been based on baseline and follow-up surveys with NCS participants and a comparison group of young people who did not attend the programme. For practical reasons, the participants’ baseline survey is completed on paper at the start of the programme, while the comparison group is invited to an online survey. Both groups then complete a follow-up survey online, around three months later.

In 2018, a new approach was tested in which some NCS participants were invited to complete the baseline survey online prior to the start of their course. We assess the viability of this new approach, comparing the results against those of the established evaluation design. We look at the representativeness of the sample profiles, differences in data quality, and differences in the measurement of programme outcomes. Young people in the online trial were randomly assigned to one of six experimental conditions determining the contact strategy followed. We also estimate the sample sizes that could be achieved through the online approach under different contact strategies, and the implications for study costs.


Comparability of Online and Paper Questions and Most Suitable Online Formats According to Children’s Opinions: Results of Focus Groups

Dr Mònica González-Carrasco (University of Girona) - Presenting Author
Dr Ferran Casas (University of Girona)

Online questionnaires are increasingly used for surveys with children and adolescents because of their many advantages (e.g. reduced errors, greater attractiveness). However, the format of an online questionnaire can hardly be the same as that of a paper questionnaire, and its administration is not without difficulties (e.g. insufficient computers available in schools, problems with the Internet connection). Because of these difficulties, data from children are often collected in both formats, but strict equivalence between them has seldom been checked.

We have experience improving paper questionnaire formats with children as advisers, but we lack that experience for online formats. Taking into account children’s opinions about a questionnaire is always helpful for improving its format and, therefore, the quality of the data collected.

The study aims to qualitatively investigate children’s opinions about the different possible formats used in online questions and to explore the perceived equivalence between online and paper format questions in each age group.

Focus groups were conducted with students in the 3rd and 5th grades of primary education (8- and 10-year-olds, respectively) and in the 1st grade of secondary education (12-year-olds). These focus groups took place in three public schools (a primary, a secondary, and a combined primary-secondary school) in Catalonia (north-east Spain).

A total of 91 primary and secondary education students took part. Each group was composed of seven or eight participants (four girls and four boys). The children were between 8 and 12 years of age.

Results show that the preferred format for online questions depends mainly on the age of the participants, the number of response options offered, and how understandable the question is perceived to be. Some participants prefer online formats that can also be represented in paper questionnaires, while others do not; equivalence between online and paper questions is only possible with some formats.


Challenges and Weaknesses in Using Online Surveys to Conduct Public Opinion Research of Teenagers

Ms Laura Wronski (SurveyMonkey) - Presenting Author

Measuring public opinion among young people is a newly resonant endeavor. As research organizations and polling firms look to expand into surveys of teenagers, they face many difficulties due to inherent differences in how teens and adults approach surveys.

This paper will present new findings from research on how teens understand surveys, why they decide to participate in surveys, and how they take them. We focus specifically on web surveys, which are seemingly the perfect survey mode for teenagers, given the prevalence of smartphones among this group and their familiarity with the internet from a young age. We conduct several survey experiments, varying the survey topic, question wording, question order, and visual format to determine whether well-established best practices for online surveys of adults also apply to teens. We also present findings from focus groups we conducted with teenagers to better understand, in qualitative terms, their feelings about online surveys. Finally, we examine data from SurveyMonkey’s database to profile the types of surveys that are sent to teens and the types of teens who participate in an online panel, both of which can be compared to the same profiles for adults.

We find that the online mode presents many advantages for surveys of teenagers relative to other modes, but that it introduces new challenges that are specific to this young population.