

The effect of interviewer-respondent age difference on the reporting of sexual activity in the Demographic and Health Surveys: Analysis of data from 21 countries

Jeffrey W Rozelle1, Mark J Meyer2, Anne H McKenna3, Hawa Obaje4, John D Kraemer5

1 Spatial Sciences Institute, University of Southern California, Los Angeles, California, USA
2 Georgetown University, Department of Mathematics and Statistics, Washington D.C., USA
3 Independent Researcher, Oakland, California, USA
4 Last Mile Health, Tubman Blvd., Monrovia, Liberia
5 Georgetown University, Department of Health Management and Policy, Washington D.C., USA

DOI: 10.7189/jogh.13.04002




Interviewer effects can have consequential impacts on survey data, particularly for reporting sensitive attitudes and behaviours such as sexual activity and drug use, yet these effects remain understudied in low- and middle-income countries. The Demographic and Health Surveys (DHS) present a unique opportunity to study interviewer effects on the self-report of sensitive topics in low- and middle-income countries by including interviewer characteristics data. This paper aims to narrow the gap in research on interviewer effects by studying the effects that age difference between interviewer and respondent and interviewer survey experience have on the reporting of ever having sexual intercourse.


We used DHS data from 91 066 women and 56 336 men in 21 countries where the standard DHS was implemented among all women of reproductive age and interviewer characteristics were included in the data set. Using a Bayesian cross-classified model with random intercepts for interviewer and cluster, we assessed whether an interviewer-respondent age difference of 10 years or greater was associated with a difference in self-report of ever having sexual intercourse, adjusting for respondent demographics.


There was a meaningful association between an interviewer-respondent age difference of 10 or more years and reporting of ever having had sexual intercourse in most countries for both genders after adjusting for interviewer age and experience, rural or urban cluster, and individual-level characteristics. Among women, the marginal posterior probability of reporting ever having sexual intercourse was lower when the interviewer was 10 or more years older in 17 of 19 countries (country-level differences ranged from -12.50 to 3.90 percentage points). Among men, the marginal posterior probability was lower in 16 of 20 countries, ranging from -18.30 to 17.10 percentage points.


In most countries, women and men were less likely to report ever having had sexual intercourse if the interviewer was 10 or more years older than them, adjusting for potential confounders. These findings have important implications for interpreting numerous sexual health indicators, such as unmet family planning needs and human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) risk. Survey administrators may consider more careful interviewer-respondent characteristic matching or novel approaches like Audio Computer Assisted Self Interview to minimize interviewer-induced variance.


Interviewer effects on survey responses are well documented, particularly when seeking to elicit true responses about sensitive behaviour such as sexual activity or substance use [1-4]. Researchers widely acknowledge the difficulty and potential data quality problems in collecting survey data about socially undesirable attitudes and behaviour in the United States and Europe, yet critical examination of survey data from low- and middle-income countries (LMICs) has often been slower to materialize [5,6]. A growing body of research on survey methods (with particular interest in Audio Computer Assisted Self Interview (ACASI)) suggests that there are opportunities for novel survey administration techniques [3,5,7-9]. Nevertheless, researchers, policy makers, and public health practitioners will likely rely heavily on interviewer-collected survey data for the foreseeable future to develop and target programming, especially in LMICs. It is therefore vital to develop a deeper understanding of the impact of interviewer effects on response accuracy.

Interviewer-induced bias is more likely to occur for sensitive questions such as those about sexual behaviour and falls into two categories: role-restricted and role-independent [10,11]. Role-restricted interviewer effects refer to the introduction of bias as a result of conscious and unconscious interviewer conduct, such as reacting to responses, probing sensitive questions differently within or between interviewers, or modifying questions to reduce workload [6,12]. Role-independent effects result from respondent bias toward the interviewer's social characteristics, such as gender, ethnicity, or age, or from a lack of trust in the interviewer, and must be observable by the respondent [10,13]. Even with training, participants often detect these characteristics and edit their responses as a result [1]. It can be difficult to discern whether interviewer effects are role-restricted or role-independent, but an interviewer's age, education, race, gender, and other characteristics have been associated with non-response to questions about sexual behaviour [8].

The Demographic and Health Surveys Program (DHS) is one of the most widely cited data sources in LMICs. It is a gold-standard data set for many population statistics, particularly where few other similar sources of data may be available. Since 2015, DHS data sets have included interviewer characteristics, which creates an opportunity to study interviewer effects at a global scale [14]. The DHS has published two reports on interviewer effects. Its 2018 report focused on data quality issues like non-response, time to completion, and other general indicators of quality [12]. A recent report extended this work and found that more sensitive and complicated questions were associated with larger interviewer-induced variance when models considered the cross-classified structure of interviews within communities and interviewers [15].

There are few peer-reviewed studies of interviewer effects in the DHS. Leone et al. found that variance attributable to interviewers was usually larger than variance attributed to clusters [16], though they called for more research into the specific interviewer characteristics that caused this variance. Metheny and Stephenson found that most interviewer characteristics, except for interviewer experience, did not significantly affect reporting of intimate partner violence [17]. Each of these studies was an important milestone in studying the DHS, and this paper seeks to extend these efforts still further as the first study to investigate specific interviewer characteristics across all DHS countries with available data.

Interviewer effects on the reporting of sexual activity in the DHS do not yet appear in the literature, despite calls for research into them [12]. Because pre-marital sexual activity is stigmatized in many contexts [18-20], it is especially vulnerable to interviewer-induced social desirability bias [21,22]. It is likely that some interviewer characteristics, and how they interact with respondent characteristics, may exacerbate the risk of bias. For example, there is robust evidence in higher-income countries and some low-income countries that responses about sexual activity can be modified by interviewer and respondent gender [23,24]. Age has also long been identified as a potential source of interviewer effects, though age difference is less thoroughly studied [24,25]. Accurate reporting of sexual behaviour is valuable both on its own, as an important public health indicator, and as a gateway question for other questions and indicators about contraception use and sexual health [26]. Bias in respondents' reporting of sexual activity may propagate to other items that are important for assessing other domains of sexual and reproductive health.

This study aims to determine associations between interviewer characteristics and respondents’ reporting of ever having sexual intercourse in the DHS, using data from 91 066 respondents to the women’s and 56 336 respondents to the men’s questionnaires across 21 countries.


Data source

This study makes secondary use of data from 21 countries’ cross-sectional surveys that were administered through the DHS program. The DHS is a nationally representative household survey administered in low- and middle- income countries with publicly available data. Participants are typically selected using a two-stage cluster sampling design with probability of selection proportional to size of the cluster at the first stage [27]. The sampling and data collection approach is fully described on the DHS program web portal [28]. In all countries, respondents and interviewers are gender-matched: women are interviewed by female interviewers and men by male interviewers.

We used the DHS Survey Search tool to identify all current and future standard surveys that included the interviewer characteristics data set. We then included all surveys that a) used the standard DHS questionnaire, b) collected data from all women of reproductive age, and c) included the requisite data to link interviewer characteristics to respondents. Additionally, we excluded data sets in which fewer than three percent of never-married, never-in-union respondents reported ever having sexual activity, because at DHS sample sizes, models failed to converge when reporting was exceptionally rare. Twenty-one DHS countries met the inclusion criteria (Table 1).

Table 1.  Included surveys and descriptive statistics


All male and female respondents who had never been in a marriage or union and who had complete data were included. We excluded respondents who reported ever being married or in a union because essentially all such respondents reported sexual activity. Although the inclusion criterion is never having been married or in a union, we use "never-unioned" for ease of expression. Detailed reports of all respondents are available from the DHS Program [28].

Outcome of interest

The outcome of interest is whether a never-unioned respondent reported ever having sexual intercourse. It is measured dichotomously with a standard DHS question: "How old were you when you had sexual intercourse for the very first time?", one response option of which is "Never had intercourse."

Explanatory variables

We examined all potential interviewer-level factors to identify those that were a) available across all surveys, b) had any variance between interviewers, and c) fit theory for factors that could influence reporting of having sexual intercourse. Ultimately, our final model included interviewer age (continuous), age difference between interviewer and respondent (dichotomized to interviewers being 10+ years older or not), and any previous DHS or other survey experience (dichotomous).

We hypothesized that the relationship between age difference and the probability of reporting sensitive behaviour is not linear, and that there is likely a threshold at which respondents perceive an interviewer as older and become more inclined to edit their responses. We calculated the difference between interviewer age and respondent age and coded the age difference as fewer than 10 years older or 10 or more years older. We also constructed a variable of five or more years older for sensitivity analyses.
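The coding described above can be sketched as follows. This is an illustrative Python snippet, not the authors' analysis code (which was written in R); the function name and default threshold are ours:

```python
# Illustrative sketch: dichotomise the interviewer-respondent age gap.
def age_diff_indicator(interviewer_age, respondent_age, threshold=10):
    """Return 1 if the interviewer is `threshold` or more years older.

    The main analysis uses threshold=10; the sensitivity analysis
    re-codes the same gap with threshold=5.
    """
    return 1 if (interviewer_age - respondent_age) >= threshold else 0

print(age_diff_indicator(45, 30))               # interviewer 15 years older -> 1
print(age_diff_indicator(32, 25))               # 7 years older -> 0
print(age_diff_indicator(32, 25, threshold=5))  # sensitivity coding -> 1
```

Note the boundary: a gap of exactly 10 years counts as "10 or more years older".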

The interviewer characteristics data set also includes information on previous DHS experience and “other survey experience” collecting data. We collapsed this into a single “any survey experience” variable.

We also adjusted for several potential confounders: respondent age (continuous), education level (categorized as no school, primary, secondary, more than secondary), wealth index in five quintiles, and residence type (dichotomous as rural or urban). These variables were included because there is a well-established link between these predictors and age at first sexual activity, and because of their potential to confound the relationship between the outcome and variables of interest [29].


We first performed Bayesian simple logistic regression analyses with reporting ever having had sexual intercourse as the dependent variable for the men's and women's questionnaires in each country. Each model had a single independent variable: any previous survey experience (DHS or other), wealth index quintile, rural residence, respondent age in years, respondent education level, interviewer age in years, interviewer age difference of 10 or more years, interviewer education level, or a difference between interviewer and respondent native language (Table S3 and Table S4 in the Online Supplementary Document).

Interviews are nested within interviewers and within sampling clusters; every cluster has multiple interviewers, who in turn work across multiple clusters. We therefore fit Bayesian multilevel, cross-classified logistic regression models to analyse the effects of interviewer characteristics on respondent-level responses, as cross-classified models are conventional for this type of analysis [11,15]. An interviewer-specific intercept was included in the model along with cluster-specific intercepts. The inclusion of cluster- and interviewer-specific intercepts accounts for potential additional sources of variability induced by the sampling design and controls for the probable scenario in which interviewers are assigned to clusters with different latent propensities of reporting ever having sexual activity. Sampling weights were not used in the present analysis, as the research question does not pertain to a population estimate but to the equally weighted interviewer-respondent interaction.

We also checked for interpenetration of interviewers across clusters to understand the potential for cluster level variance to confound the conclusions. Prior research suggests that three or more clusters per interviewer would yield sufficient interpenetration for modelling [30], and that interviewer effects often account for more homogenization than sampling or spatial clustering [31].

We let yijk indicate whether respondent "i", interviewed by interviewer "j" in cluster "k", reported ever having intercourse. The difference in the log odds associated with an interviewer-respondent age difference of 10 or more years is represented by β1, while the log odds associated with interviewer age and survey experience are represented by β5 and β6 respectively. The level-one respondent characteristics (respondent age, education level, and wealth index) are included as fixed effects (β2, β3, β4). The cluster-specific fixed effect of residence type (urban or rural) is included as β7. Intercepts ςj and ςk are the cross-classified interviewer- and community-specific intercepts.

logit (Pr(yijk = 1)) = β0 + β1 (age diff. ≥10 yrs) i(jk) + β2 (resp. age) i(jk) + β3 (resp. edu) i(jk) + β4 (wealth index) i(jk) + β5 (int. age) j + β6 (svy. exp.) j + β7 (residence type) k + ςj + ςk
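To make the cross-classified structure concrete, the following sketch simulates the model's linear predictor under invented parameter values. It is written in Python purely for illustration; the actual models were fitted with brms in R, and every effect size and variance component below is hypothetical:

```python
import math
import random

random.seed(1)

def inv_logit(x):
    """Inverse of the logit link: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical sizes and parameters, for illustration only.
n_interviewers, n_clusters = 5, 8
sigma_j, sigma_k = 0.4, 0.3            # SDs of the two random intercepts
zeta_j = [random.gauss(0, sigma_j) for _ in range(n_interviewers)]
zeta_k = [random.gauss(0, sigma_k) for _ in range(n_clusters)]
beta0, beta1 = 0.2, -0.5               # intercept; age-gap (>=10 y) effect

def p_report(age_gap_10plus, j, k):
    """Pr(report = 1) for a respondent interviewed by j in cluster k.

    Cross-classification: both an interviewer-specific and a
    cluster-specific intercept enter the same linear predictor.
    """
    eta = beta0 + beta1 * age_gap_10plus + zeta_j[j] + zeta_k[k]
    return inv_logit(eta)

# With beta1 < 0, a much older interviewer lowers the reporting probability:
print(p_report(0, j=0, k=0) > p_report(1, j=0, k=0))  # True
```

Respondent-level fixed effects (β2 to β4) and the residence-type effect (β7) would simply add further terms to `eta` in the same way.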

Because there is little relevant published literature on our research question in this context, we used non-informative priors. Specifically, we placed an improper flat prior over the reals for population-level effects. For the interviewer- and cluster-specific intercepts, we placed mean zero normal priors with half Student-t3 priors on the standard deviations (the subscript on t3 denotes the degrees of freedom).

Models for each country and each respondent gender were fitted separately. Estimates for all models were produced using eight Markov Chain Monte Carlo chains, with 6000 warm-up iterations and 6000 retained posterior samples per chain resulting in 48 000 total posterior samples available for analysis. The Gelman-Rubin potential scale reduction factor was used to assess convergence across chains. All covariates across all included models had a potential scale reduction factor, sometimes referred to as R-hat, of 1.00 – demonstrating model convergence [32]. Two key sensitivity analyses were performed. First, we fitted the models using a half-Cauchy distribution for the interviewer- and cluster-specific intercept prior standard deviations, and results were similar (Table S7 and Table S8 in the Online Supplementary Document). Second, we changed the interviewer-respondent age difference to five years. Associations held in most countries, with the strength of the association somewhat muted, as expected (Table S9 and Table S10 in the Online Supplementary Document).
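The convergence check can be illustrated with a minimal version of the Gelman-Rubin statistic. This is a textbook between/within-chain formulation in Python, not the exact split-chain implementation brms uses:

```python
import random
import statistics

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for equal-length chains.

    Classic formulation: compare between-chain variance (B) to the
    average within-chain variance (W); values near 1.00 suggest the
    chains have mixed.
    """
    m = len(chains)
    n = len(chains[0])
    means = [statistics.fmean(c) for c in chains]
    grand = statistics.fmean(means)
    b = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)  # between-chain
    w = statistics.fmean([statistics.variance(c) for c in chains])  # within
    var_hat = (n - 1) / n * w + b / n
    return (var_hat / w) ** 0.5

# Chains sampling the same distribution give R-hat near 1.00:
random.seed(0)
chains = [[random.gauss(0, 1) for _ in range(2000)] for _ in range(4)]
print(round(gelman_rubin(chains), 2))
```

Chains stuck in different regions inflate the between-chain variance and push R-hat well above 1, which is why an R-hat of 1.00 across all covariates is reported as evidence of convergence.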

We summarize the results using the median of the posterior samples, the middle 95% of the posterior samples (95% credible interval (CrI)), and the posterior probability that the difference is greater than zero. Posterior probabilities close to either zero or one indicate statistical significance [32].
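These summaries amount to simple operations on the vector of posterior draws. A hedged Python sketch (the function name is ours, and the quantile indexing is deliberately crude, suitable only for large sample counts like the 48 000 draws described above):

```python
import statistics

def summarise_posterior(samples):
    """Median, central 95% credible interval, and Pr(coefficient > 0).

    `samples` is a flat list of posterior draws for one coefficient.
    """
    s = sorted(samples)
    n = len(s)
    lo = s[int(0.025 * (n - 1))]   # 2.5th percentile (nearest-rank)
    hi = s[int(0.975 * (n - 1))]   # 97.5th percentile (nearest-rank)
    return {
        "median": statistics.median(s),
        "cri_95": (lo, hi),
        "p_gt_zero": sum(x > 0 for x in s) / n,  # posterior probability > 0
    }

# Ten toy draws: 4 of 10 are above zero.
draws = [-0.8, -0.5, -0.4, -0.3, -0.2, -0.1, 0.05, 0.1, 0.2, 0.6]
print(summarise_posterior(draws)["p_gt_zero"])  # 0.4
```

A posterior probability near 0 or 1 means nearly all draws fall on one side of zero, the criterion the text treats as analogous to statistical significance.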

Because our models estimate odds ratios, we also used average marginal effects to convert the odds ratios into differences in the probability of reporting sexual activity between interviewers fewer than 10 years older and those 10 or more years older, with the other covariates held at their observed values. We did this to reduce the risk that effect sizes would be misinterpreted, which is common with odds ratios.
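The average marginal effect logic, on the probability scale, can be sketched as follows. This is illustrative Python with invented coefficients, not the brmsmargins implementation:

```python
import math

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

def average_marginal_effect(beta0, beta1, covariate_effects):
    """Average marginal effect of the age-gap indicator on Pr(report).

    For each respondent's observed linear-predictor contribution `c`
    (all other covariates held at their observed values), contrast the
    fitted probability with the indicator set to 1 versus 0, then
    average the contrasts across respondents.
    """
    contrasts = [
        inv_logit(beta0 + beta1 + c) - inv_logit(beta0 + c)
        for c in covariate_effects
    ]
    return sum(contrasts) / len(contrasts)

# Hypothetical values: a negative log-odds effect (beta1) translates into
# a negative percentage-point difference on the probability scale.
ame = average_marginal_effect(beta0=0.1, beta1=-0.5,
                              covariate_effects=[-1.0, 0.0, 0.4, 1.2])
print(round(100 * ame, 1), "percentage points")
```

Averaging on the probability scale is what lets the paper report country-level contrasts in percentage points rather than odds ratios.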

All Bayesian regression analyses were performed with the brms package in R 4.1.1, and the posterior package was used for summary statistics of the posterior distributions [33-35]. We used the brmsmargins function in the brmsmargins package to estimate average marginal effects [36,37].


This analysis used data from 21 countries, 91 066 women, and 56 336 men (Table 1). There were more women (median per country (mdn) = 4567) than men (mdn = 2745.50) included for all countries except Ethiopia, Zambia, and Zimbabwe, where more men than women met the inclusion criteria. Correspondingly, there were more unique interviewers for women (mdn = 71 interviewers) than for men (mdn = 38 interviewers) in all countries except Zimbabwe. The mean number of women's interviews per interviewer ranged from 25.60 in Mali to 106.40 in Rwanda. For the men's questionnaire, the mean number of interviews per interviewer ranged from 33.90 in Myanmar to 153.60 in Haiti.

Among those who were never married, there were also inter-country variations in reporting of ever having had sex. In most countries, more never-unioned men (mdn = 49.90%) than women (mdn = 36.80%) reported ever having intercourse. Among included countries, Timor-Leste had the lowest proportion of never-unioned female respondents who reported ever having sexual intercourse (3.80%), followed by the Philippines, with 15% of never-unioned respondents reporting ever being sexually active. Liberia had the highest proportion of never-unioned women reporting sexual activity (81.30%). The proportion of never-unioned male respondents who reported ever having intercourse ranged from 12.20% in Myanmar to 80.70% in South Africa. The majority of included respondents were younger than 25 across contexts (Table S1 and Table S2 in the Online Supplementary Document).

In the final model, several independent variables are associated with reporting ever having had sex among never-unioned respondents in the countries we analysed (see Table S5 and Table S6 in the Online Supplementary Document for full models). In all countries, respondent age was a critical predictor of reporting ever having sexual intercourse. Among the covariates that describe interviewer effects, an age difference of 10 or more years generally maintains a consistent, negative trend (Table 2 and Table 3).

Table 2.  Posterior median adjusted odds for an age difference of 10 or more years and interviewer- and community-level predictors of reporting ever having sexual intercourse among never-unioned women


aOR – adjusted odds ratio, CrI – credible intervals, P.P. – posterior probability

*Adjusted odds ratio (95% credible intervals). Odds ratios are estimated from the median of the posterior samples, adjusted for the other model predictors: interviewer-respondent age difference of 10 or more years, interviewer age in years, previous interviewer survey experience, rural residency, respondent age, wealth index, and education level.

†Posterior probability that the odds ratio is greater than one. Constructed as the proportion of posterior samples where the odds are greater than 1. Posterior probabilities of exactly 0 or 1 are when no or all samples, respectively, were above 1.

Table 3.  Posterior median adjusted odds for an age difference of 10 or more years and interviewer- and community-level predictors of reporting ever having sexual intercourse among never-unioned men


aOR – adjusted odds ratio, CrI – credible intervals, P.P. – posterior probability

*Adjusted odds ratio (95% credible intervals). Odds ratios are estimated from the median of the posterior samples, adjusted for the other model predictors: interviewer-respondent age difference of 10 or more years, interviewer age in years, previous interviewer survey experience, rural residency, respondent age, wealth index, and education level.

†Posterior probability that the odds ratio is greater than one. Constructed as the proportion of posterior samples where the odds are greater than 1. Posterior probabilities of exactly 0 or 1 are when no or all samples, respectively, were above 1.

In most countries, both women (17 of 19 countries) and men (16 of 20 countries) are less likely to report ever having sexual intercourse if the interviewer is 10 or more years older than they are, adjusting for interviewer age, survey experience, and respondent age and other demographics. We consider a posterior probability of 5% or lower that the coefficient was greater than zero to suggest statistical significance, although this is not strictly the same as frequentist statistical significance [32]. For female respondents, having an interviewer who was 10 or more years older had a significantly negative effect on the odds of reporting ever having sexual activity in Benin, Burundi, Cameroon, Ethiopia, Liberia, Malawi, Rwanda, Sierra Leone, Timor-Leste, Uganda, South Africa, and Zimbabwe (12 of 19 included countries). For men, the effect was significantly negative in Benin, Cameroon, Ethiopia, The Gambia, Haiti, Mali, Malawi, Myanmar, Nepal, Timor-Leste, Uganda, South Africa, and Zimbabwe (13 of 20 countries). In most countries, the posterior median for the adjusted odds ratio is lower for men than for women.

The influence of previous survey experience is less clear (Table 2 and Table 3, and Figure S1 in the Online Supplementary Document). Similarly, rural residency and interviewer age in years do not exhibit a consistent association. While living in a rural community does seem to have more influence in some countries than other community- and interviewer-level variables, the direction of the association is inconsistent between countries. In seven of 19 women's and nine of 20 men's questionnaires, the medians of the posterior adjusted odds suggest a positive relationship between living in a rural community and reporting intercourse; all others are negative.

Figure 1 shows the difference in the average marginal effect of having an interviewer who is 10 or more years older, adjusting for interviewer experience and age and respondent characteristics, and integrating out the interviewer- and cluster-specific intercepts. The median of the posterior distribution of the average marginal effect contrasts among female respondents ranged from 3.90% (95% CrI = 1.20% to 6.60%) in Nigeria to -12.50% (95% CrI = -17.80% to -7.20%) in Zimbabwe. Among male respondents, excluding Zambia, an interviewer who is 10 or more years older than the respondent was associated with a difference in the probability of reporting sexual activity ranging from 0.90% (95% CrI = -2.70% to 4.30%) in Nigeria to -17.10% (95% CrI = -22.90% to -11.40%) in South Africa.

Figure 1.  Average marginal effect contrasts of reporting ever having sexual intercourse among never-unioned respondents when an interviewer is 10 or more years older versus fewer than 10 years older, adjusting for respondent and interviewer characteristics.

The posterior medians of the difference in marginal effects were positive for Nigeria and Zambia. Given the vast differences in cultures and settings, this is not unexpected, and the 95% credible intervals for male respondents in Nigeria and female respondents in Zambia included zero.

One notable exception is Zambian men, for whom the association is in the opposite direction and has a wide credible interval. The coefficient estimate for an age difference of 10 or more years is substantially higher when the random intercept for interviewers is included.


To our knowledge, this is the first study to leverage the DHS interviewer characteristics data across multiple countries. Across the countries and genders included in this analysis, respondents are less likely to report ever having had sex when an interviewer is 10 or more years older, controlling for respondent characteristics, interviewer age and experience, and cluster-level variance. The other interviewer characteristic included in this analysis, previous survey experience, appears to have little association with reporting ever having sexual intercourse in most included countries.

Respondents in Nigeria and Zambia, and respondents to the men's questionnaire in Guinea, were more likely to report ever having sexual activity to interviewers 10 or more years older than them. Analyses with future DHS rounds in these and other included countries will shed light on the stability of the observed trends.

Response bias in reporting sexual activity prior to a union is well established in many contexts [5,8,38]. While much of the extant literature has focused on non-response, scholars note that although cooperation is often more likely when interviewers and respondents are alike, cooperation does not necessarily imply candor [2,5,12]. Our findings align with literature from higher-income countries showing that older interviewers elicit more conservative responses to sexual behaviour questions [24]. Studies have found inconsistencies even from the same respondent across self-reports, with differences that vary in directionality and magnitude by gender across contexts [39,40].

Previous research has also identified differences between the reporting behaviour of men and women. Upchurch et al. found that both men and women can be inconsistent in their reporting of ever having sexual activity, but men were more likely to be inconsistent [41], whereas Soler-Hampejsek et al. found that women were more likely to be inconsistent in self-report [40]. Mensch et al. suggested that boys are more likely to exaggerate their sexual activity [5]. The results from the present analysis suggest a more nuanced perspective: this bias is likely moderated by external factors, including interviewer characteristics, particularly the age difference between interviewer and respondent. These inconsistencies highlight the importance of better understanding interviewer characteristics in relation to the validity and reliability of self-reported outcomes.

The biases we detected from interviewer effects will propagate to other population metrics that use sexual activity as a numerator or denominator component. These include common indicators of family planning use and human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) prevention behaviour among young or never-married segments of the population. Although we did not examine other outcomes, it is plausible that interviewer effects are present in other sensitive outcomes, and those who report inconsistently on sexual behaviour are likely to report inconsistently on other measures [4]. Metheny and Stephenson found that previous survey experience influenced reporting of intimate partner violence in Zimbabwe [17], and others have detected interviewer effects in the reporting of abortion [16,42]. This bias may lead to inaccurate estimates of the need for interventions among the populations that have the most to gain from such programs.

Population estimates may be improved with protocols that are more sensitive to interviewer effects and with novel collection modes. Mitigation strategies might include matching interviewers to respondents on age and other characteristics where feasible. Survey administrators may also wish to review interviewer training with a view to minimizing role-restricted bias. ACASI is another promising approach for addressing both role-independent and role-restricted interviewer biases. ACASI has been shown to improve accuracy and reduce inter-interviewer variance on sensitive questions in rural and semi-literate settings, though it may not be suitable in all contexts [5,8,43]. The DHS program may examine how to integrate these modalities in a manner that is accessible across populations.

There are salient limitations to this analysis. Although we were able to examine some key background effects, there was little information on potential mediators, such as interviewer attitudes, beliefs, or behaviours. Nor can this analysis determine which reports are true: although underreporting is typically considered the greater threat to validity for questions about sexual behaviour [24], it is also plausible that respondents exaggerate their sexual experience to someone within 10 years of their age.

Key data limitations must also be acknowledged. Several data sets either did not include characteristics for some interviewers or had improperly entered interviewer IDs, although the proportion of affected observations in those data sets was small. It is also possible that, at the country level, models could be improved by variables that we did not include. Additionally, upon advice from the DHS team, we removed observations where interviewers were misclassified; missingness of individual variables is described in the respondent characteristics.

Finally, we note that this research is exploratory in nature. While there is sufficient consistency across surveys that the risk of age discrepancy-induced response bias should be taken seriously, this research would benefit from replication across contexts. In particular, because age can be a proxy for other sociodemographic characteristics, it would be valuable to extend this analysis with surveys that collect more detailed interviewer data than the DHS is able to. We suspect that additional contextual factors may modify interviewer effects, and this is an area ripe for future research.

Kianersi et al. noted that much of the scholarship on interviewer effects is now relatively old while attitudes, sensitivities, and norms are shifting, and that the public health community should maintain a contemporary understanding of interviewer effects on the reporting of sexual activity [23]. Future research into the effects of interviewer characteristics on responses to sensitive questions in LMICs should be ongoing and should focus on improving collection methods, better understanding which interviewer characteristics make response editing more likely, and re-evaluating estimates of indicators that may be vulnerable to interviewer effects. The DHS program could facilitate more robust research into interviewer-induced bias by standardizing and expanding interviewer characteristics data sets and matching response categories with the survey questionnaires where relevant (e.g. interviewer native language and ethnicity). This practice would also support replication of these results and analysis of trends in these effects over time.


In most countries, women and men are less likely to report ever having had sex when an interviewer is 10 or more years older, controlling for respondent characteristics. This has meaningful implications for the interpretation of this indicator, but it also makes a strong case that the impact of interviewer effects should be considered more broadly in the DHS and similar surveys. The DHS could improve our understanding of these issues by collecting existing interviewer characteristics more consistently and adding several more dimensions. Survey implementers may consider better matching, careful training, or the assistance of technology like ACASI to reduce interviewer effects. Analysts may consider including a random intercept for interviewers when using DHS and similar survey data.

Additional material

Online Supplementary Document


We would like to specifically thank Dr Thomas Pullum and Dr Sara Riese from the DHS Program for their support, insight, and availability. Dr Kristin Johnson also provided early feedback on the research question.

Ethics statement: The de-identified secondary data were obtained from the DHS program, which makes them publicly available. The DHS program secures appropriate ethical approval in each of the countries where it collects data, which is detailed in the final reports produced for each country.

Data availability: All data used in this analysis are available upon request from the DHS Program.

[1] Funding: None.

[2] Authorship contributions: All authors contributed to the draft and final manuscript. Jeffrey W. Rozelle performed all analysis.

[3] Disclosure of interest: The authors completed the ICMJE Disclosure of Interest Form (available upon request from the corresponding author) and disclose no relevant interests.


[1] RE Davis, MP Couper, NK Janz, CH Caldwell, and K Resnicow. Interviewer effects in public health surveys. Health Educ Res. 2010;25:14-26. DOI: 10.1093/her/cyp046. [PMID:19762354]

[2] GB Durrant, RM Groves, L Staetsky, and F Steele. Effects of Interviewer Attitudes and Behaviors on Refusal in Household Surveys. Public Opin Q. 2010;74:1-36. DOI: 10.1093/poq/nfp098

[3] CA Kelly, E Soler-Hampejsek, BS Mensch, and PC Hewett. Social Desirability Bias in Sexual Behavior Reporting: Evidence from an Interview Mode Experiment in Rural Malawi. Int Perspect Sex Reprod Health. 2013;39:14-21. DOI: 10.1363/3901413. [PMID:23584464]

[4] LA Palen, EA Smith, LL Caldwell, AJ Flisher, L Wegner, and T Vergnani. Inconsistent Reports of Sexual Intercourse Among South African High School Students. J Adolesc Health. 2008;42:221-7. DOI: 10.1016/j.jadohealth.2007.08.024. [PMID:18295129]

[5] BS Mensch, PC Hewett, and AS Erulkar. The reporting of sensitive behavior by adolescents: A methodological experiment in Kenya. Demography. 2003;40:247-68. DOI: 10.1353/dem.2003.0017. [PMID:12846131]

[6] S Randall, E Coast, N Compaore, and P Antoine. The power of the interviewer. Demogr Res. 2013;28:763-92. DOI: 10.4054/DemRes.2013.28.27

[7] T Gnambs and K Kaspar. Disclosure of sensitive behaviors across self-administered survey modes: a meta-analysis. Behav Res Methods. 2015;47:1237-59. DOI: 10.3758/s13428-014-0533-4. [PMID:25410404]

[8] Mensch B, Hewett P. Obtaining more accurate and reliable information from adolescents regarding STI/HIV risk behaviors. New York: Population Council, 2007. Brief No.: 25.

[9] BS Mensch, PC Hewett, R Gregory, and S Helleringer. Sexual Behavior and STI/HIV Status Among Adolescents in Rural Malawi: An Evaluation of the Effect of Interview Mode on Reporting. Stud Fam Plann. 2008;39:321-34. DOI: 10.1111/j.1728-4465.2008.00178.x. [PMID:19248718]

[10] S Bignami-Van Assche, G Reniers, and AA Weinreb. An Assessment of the KDICP and MDICP Data Quality: Interviewer Effects, Question Reliability and Sample Attrition. Demogr Res. 2003;S1:31-76. DOI: 10.4054/DemRes.2003.S1.2

[11] BT West and AG Blom. Explaining Interviewer Effects: A Research Synthesis. J Surv Stat Methodol. 2017;5:175-211.

[12] Pullum TW, Juan C, Khan N, Staveteig S. The Effect of Interviewer Characteristics on Data Quality in DHS Surveys. Rockville, Maryland, USA: ICF; 2018. (DHS Methodological Reports). Report No.: 24.

[13] AA Weinreb. The Limitations of Stranger-Interviewers in Rural Kenya. Am Sociol Rev. 2006;71:1014-39. DOI: 10.1177/000312240607100607

[14] DHS. A New DHS Questionnaire: Interviewing Fieldworkers Available: Accessed: 26 September 2021.

[15] Elkasabi M, Khan A. Modeling interviewer effects in DHS surveys. 2022. Available: Accessed: 3 October 2022.

[16] T Leone, L Sochas, and E Coast. Depends Who’s Asking: Interviewer Effects in Demographic and Health Surveys Abortion Data. Demography. 2021;58:31-50. DOI: 10.1215/00703370-8937468. [PMID:33834247]

[17] N Metheny and R Stephenson. Interviewer effects on the reporting of intimate partner violence in the 2015 Zimbabwe Demographic and Heath Survey. J Gend-Based Violence. 2020;4:241-58. DOI: 10.1332/239868020X15881856964966

[18] LM Kaljee, M Green, R Riel, P Lerdboon, LH Tho, and LTK Thoa. Sexual Stigma, Sexual Behaviors, and Abstinence Among Vietnamese Adolescents: Implications for Risk and Protective Behaviors for HIV, Sexually Transmitted Infections, and Unwanted Pregnancy. J Assoc Nurses AIDS Care. 2007;18:48-59. DOI: 10.1016/j.jana.2007.01.003. [PMID:17403496]

[19] JN Leerlooijer, AE Bos, RA Ruiter, MA van Reeuwijk, LE Rijsdijk, and N Nshakira. Qualitative evaluation of the Teenage Mothers Project in Uganda: a community-based empowerment intervention for unmarried teenage mothers. BMC Public Health. 2013;13:816 DOI: 10.1186/1471-2458-13-816. [PMID:24011141]

[20] AG Nmadu, S Mohamed, and NO Usman. Adolescents’ utilization of reproductive health services in Kaduna, Nigeria: the role of stigma. Vulnerable Child Youth Stud. 2020;15:246-56. DOI: 10.1080/17450128.2020.1800156

[21] D Beguy, CW Kabiru, EN Nderu, and MW Ngware. Inconsistencies in Self-Reporting of Sexual Activity Among Young People in Nairobi, Kenya. J Adolesc Health. 2009;45:595-601. DOI: 10.1016/j.jadohealth.2009.03.014. [PMID:19931832]

[22] B Houle, N Angotti, SJ Clark, J Williams, FX Gómez-Olivé, and J Menken. Let’s Talk about Sex, Maybe: Interviewers, Respondents, and Sexual Behavior Reporting in Rural South Africa. Field Methods. 2016;28:112-32. DOI: 10.1177/1525822X15595343. [PMID:28190977]

[23] S Kianersi, M Luetke, R Jules, and M Rosenberg. The association between interviewer gender and responses to sensitive survey questions in a sample of Haitian women. Int J Soc Res Methodol. 2020;23:229-39. DOI: 10.1080/13645579.2019.1661248

[24] SR Wilson, NL Brown, C Mejia, and PW Lavori. Effects of Interviewer Characteristics on Reported Sexual Behavior of California Latino Couples. Hisp J Behav Sci. 2002;24:38-62. DOI: 10.1177/0739986302024001003

[25] M Benney, D Riesman, and SA Star. Age and Sex in the Interview. Am J Sociol. 1956;62:143-52. DOI: 10.1086/221954

[26] Croft TN, Marshall AMJ, Allen CK, et al. Guide to DHS Statistics. Available: Accessed: 8 December 2022.

[27] ICF International. Survey Organization Manual for Demographic and Health Surveys. Calverton, Maryland: ICF International: MEASURE DHS. Available: Accessed: 8 December 2022.

[28] The DHS Program. Final DHS Reports. Available: Accessed: 2 April 2022.

[29] J Amo-Adjei and DA Tuoyire. Timing of sexual debut among unmarried youths aged 15-24 in Sub-Saharan Africa. J Biosoc Sci. 2018;50:161-77. DOI: 10.1017/S0021932017000098. [PMID:28382871]

[30] R Vassallo, G Durrant, and P Smith. Separating interviewer and area effects by using a cross-classified multilevel logistic model: simulation findings and implications for survey designs. J R Stat Soc Ser A Stat Soc. 2017;180:531-50. DOI: 10.1111/rssa.12206

[31] Schnell R, Kreuter F. Separating Interviewer and Sampling-Point Effects. 2003. Available: Accessed: 26 September 2021.

[32] Gelman A, Carlin JB, Stern HS, Dunson DB, Vehtari A, Rubin DB. Bayesian data analysis. Third edition. Boca Raton: CRC Press; 2014. 661 p. (Chapman & Hall/CRC texts in statistical science).

[33] Bürkner P-C. Advanced Bayesian Multilevel Modeling with the R Package brms. arXiv:1705.11123 [stat]. 2017. Available: Accessed: 8 December 2022.

[34] Bürkner P-C, Gabry J, Kay M, Vehtari A. posterior: Tools for Working with Posterior Distributions. 2022. Available: Accessed: 8 December 2022.

[35] R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2022. Available: Accessed: 8 December 2022.

[36] M Pavlou, G Ambler, S Seaman, and RZ Omar. A note on obtaining correct marginal predictions from a random intercepts model for binary outcomes. BMC Med Res Methodol. 2015;15:59 DOI: 10.1186/s12874-015-0046-6. [PMID:26242875]

[37] Wiley J. brmsmargins: Bayesian Marginal Effects for “brms” Models. 2021. Available: Accessed: 8 December 2022.

[38] MS Fabic and A Jadhav. Standardizing Measurement of Contraceptive Use Among Unmarried Women. Glob Health Sci Pract. 2019;7:564-74. DOI: 10.9745/GHSP-D-19-00298. [PMID:31874938]

[39] E Eggleston, J Leitch, and J Jackson. Consistency of Self-Reports of Sexual Activity among Young Adolescents in Jamaica. Int Fam Plan Perspect. 2000;26:79-83. DOI: 10.2307/2648271

[40] E Soler-Hampejsek, MJ Grant, BS Mensch, PC Hewett, and J Rankin. The Effect of School Status and Academic Skills on the Reporting of Premarital Sexual Behavior: Evidence From a Longitudinal Study in Rural Malawi. J Adolesc Health. 2013;53:228-34. DOI: 10.1016/j.jadohealth.2013.03.008. [PMID:23688856]

[41] DM Upchurch, LA Lillard, CS Aneshensel, and NF Li. Inconsistencies in reporting the occurrence and timing of first intercourse among adolescents. J Sex Res. 2002;39:197-206. DOI: 10.1080/00224490209552142. [PMID:12476267]

[42] K Footman. Interviewer effects on abortion reporting: a multilevel analysis of household survey responses in Côte d’Ivoire, Nigeria and Rajasthan, India. BMJ Open. 2021;11:e047570. DOI: 10.1136/bmjopen-2020-047570. [PMID:34799361]

[43] G Harling, D Gumede, T Mutevedzi, N McGrath, J Seeley, and D Pillay. The impact of self-interviews on response patterns for sensitive topics: a randomized trial of electronic delivery methods for a sexual behaviour questionnaire in rural South Africa. BMC Med Res Methodol. 2017;17:125 DOI: 10.1186/s12874-017-0403-8. [PMID:28818053]

Correspondence to:
Jeffrey W Rozelle
Spatial Sciences Institute, University of Southern California
3616 Trousdale Parkway, Los Angeles, California
[email protected]