Announcements

Call for Early Career Reviewer (ECR) College Members

The IJSRM is a leading methods journal in the field, publishing articles across the methodological spectrum. The Journal has established a Reviewer College as part of its commitment to supporting early career social researchers and methodologists, offering a pathway to full editorial board membership, and it values the new ideas and approaches they bring.

We welcome applications from early career researchers with demonstrable expertise in:

  • quantitative research design and analysis
  • online research design
  • Indigenous methodologies
  • social media research
  • ethnography
  • computational social science

As a member of the IJSRM Reviewer College you will be invited to review up to four submitted papers a year, and will receive a discount voucher for Taylor & Francis products for each review you undertake. To apply for the College, please submit your CV and a covering letter outlining your methods expertise to the IJSRM Administrator at tsrm-editor@tandf.co.uk.

featured, Notebook

Bystanders and response bias in face-to-face surveys in Africa

By Zack Zimbalist

Public opinion surveys are crucial sources for understanding the public’s perceptions, values, and attitudes across the world. By conducting such surveys repeatedly with random samples, social scientists are able to track how responses change over time. This allows researchers to capture the dynamics of social perceptions on a range of interesting topics, from the economy to health, education, crime, corruption, democracy, and government performance.

Ideally, respondents feel secure enough to disclose accurate information (avoiding reporting bias and item non-response) in the context of a face-to-face interview. Yet survey research in political science seldom accounts for peer effects caused by bystanders. Much of the existing research focuses primarily on the effects of parents and spouses on self-reports of illicit activities or marriage-related issues. Moreover, these studies have mainly been carried out in industrialized countries; the few conducted in developing countries are likewise confined to similar survey questions and a small sample of countries.

This is thus the first study to investigate bystander effects across a large sample of developing countries for a broad set of questions related to social, political, and economic outcomes. Studying the presence of bystanders is important because third parties are often present in population surveys, especially in developing-country contexts where extended family members and communities live in close proximity. For example, a bystander is present at 34% of interviews conducted by the Afrobarometer survey. Across the total sample, 16% of respondents are accompanied by non-familial bystanders, 6% by their spouses, and 12% by their children.

Using survey data from over 45,000 households across 34 African countries (collected by the Afrobarometer), my new article “Bystanders and response bias in face-to-face surveys in Africa” finds that bystanders, especially non-familial ones, substantially affect responses to an array of questions (some sensitive and some not). The paper also demonstrates that these biased responses run counter to biases due to fear (linked to the perception of a state interviewer) and are most likely explained by social desirability when one is among one’s peers (a few people or a small crowd). The biases are far rarer for interviews conducted with only a spouse or children present.

Let me provide a few examples from the article. First, in the presence of non-familial bystanders, respondents understate the extent (or supply) of democracy and their satisfaction with democracy and report higher levels of fear of political violence. These results run counter to respondents’ overstatement of supply and satisfaction with democracy with respect to the perception of a government interviewer. I argue that these overstatements correspond to the fear of criticizing the government’s performance on this dimension. By contrast, in the presence of non-familial bystanders, the opposite effect is most likely driven by social desirability concerns around what one’s neighbors believe to be the appropriate answer.

Second, respondents observed by their peers express more disapproval of the performance of their MPs, local government, and the mayor. Here, again, it seems likely that expressing disapproval of politicians and government is, on average, the socially desirable response. This result contrasts with the systematically higher approval reported when respondents perceive a state interviewer (a fear-induced bias).

Third, in line with the social desirability of reporting disapproval of elected officials, respondents observed by non-familial bystanders report higher levels of corruption in the presidential office and among MPs and government officials. Again, this result runs counter to the fear-induced state interviewer effect, whereby respondents underreport corruption levels to state interviewers.

In addition to misreporting, bystanders of both kin and non-kin are strongly associated with higher rates of item nonresponse. The levels of nonresponse and the gaps between bystander and non-bystander interviews were largest for arguably sensitive questions wherein a “don’t know” answer could be seen as satisficing and socially desirable.

This article’s results suggest the need for additional steps to measure and mitigate bystander presence. To measure bystander bias in contexts outside of Africa, other surveys such as the other regional barometers and Pew polls would do well to include a question on the presence of bystanders. Mitigating bystander-induced biases is a thornier challenge that requires further experimentation across contexts. One alternative approach is self-administration in high-literacy contexts (which eliminates the biases caused by bystanders overhearing answers), as some research has shown that respondents are more willing to answer sensitive questions when they are self-administered (see Krumpal, 2013 for a review). In addition, indirect modes of administration such as endorsement experiments, list experiments (also called the item count or unmatched count technique) (Glynn, 2013) or randomized response techniques (RRT) (Coutts & Jann, 2011; Rosenfeld, Imai, & Shapiro, 2015) could also be tested. Despite their limitations, indirect methods may improve data collection on sensitive questions. Moreover, they are more implementable than self-administration in low-literacy contexts. Further research would be helpful in bolstering our understanding of whether, and to what extent, these methods obtain more reliable estimates across different contexts.
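The intuition behind randomized response can be made concrete with a small simulation. The sketch below illustrates a forced-response RRT design (the function name, design parameters, and prevalence values are hypothetical, not drawn from the article or the cited studies): each respondent answers truthfully with a known probability and is otherwise forced to a random “yes” or “no”, so no single answer is incriminating, yet the aggregate prevalence of the sensitive trait can be recovered by inverting the known randomization.

```python
import random

def forced_response_estimate(true_prevalence, n, p_truth=2/3,
                             p_forced_yes=1/6, seed=0):
    """Simulate a forced-response RRT survey of n respondents and
    back out the prevalence of the sensitive trait.

    Each respondent privately randomizes: with probability p_truth
    they answer truthfully; with probability p_forced_yes they must
    say "yes"; otherwise they must say "no"."""
    rng = random.Random(seed)
    yes = 0
    for _ in range(n):
        has_trait = rng.random() < true_prevalence
        draw = rng.random()
        if draw < p_truth:                    # truthful answer
            yes += has_trait
        elif draw < p_truth + p_forced_yes:   # forced "yes"
            yes += 1
        # else: forced "no" adds nothing
    observed_yes_rate = yes / n
    # E[observed] = p_truth * prevalence + p_forced_yes, so invert:
    return (observed_yes_rate - p_forced_yes) / p_truth

# With a large sample the estimator recovers a 30% true prevalence
# even though no individual answer reveals the respondent's status.
estimate = forced_response_estimate(0.30, n=100_000)
```

The privacy protection comes at a cost: the added noise inflates the variance relative to direct questioning, which is one reason such methods need testing across contexts before being relied upon.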

Overall, the article provides new evidence of substantial effects of bystanders across a range of survey questions in a large sample of democratic and non-democratic African countries. Securing private interviews is a sine qua non for obtaining accurate data. In the absence of this, alternative techniques could be deployed to ensure that respondents are free to provide honest assessments and perspectives on important economic, political and social questions.

Read the full article on IJSRM here.


Testing key underlying assumptions of respondent driven sampling within a real-world network of people who inject drugs

By Ryan Buchanan, Charlotte Cook, Julie Parkes, & Salim I Khakoo

The World Health Organization has recently set a target for the global elimination of Hepatitis C. However, to monitor progress it is necessary to have accurate methods to track the changing prevalence of Hepatitis C in populations that are most affected by the virus. People who inject drugs are a marginalized and often hidden population with a high prevalence of Hepatitis C. As such, tracking Hepatitis C infections in these populations can be difficult. One method to do just this is Respondent Driven Sampling, or RDS. However, prevalence estimates made using RDS rest on several assumptions, and it is difficult to test whether these assumptions have been met.

Our recently published article in the International Journal of Social Research Methodology describes a novel way to do just this. This blog shares some of the challenges faced in doing this work and how, by using novel social network data collection techniques, we were able to test some of the assumptions of RDS in an isolated population of people who inject drugs on the Isle of Wight in the United Kingdom. Before delving into how we did this, however, a brief introduction to the RDS method is necessary.

RDS requires that researchers start with a carefully selected sample within a target population. These individuals (called seeds) are asked to refer two or three friends or acquaintances who are also eligible to take part. These new participants are asked to do the same, and recruitment continues in this way through ‘waves’ until the desired sample size is achieved. Then, using appropriate software, the data collected during the survey about each person’s social network allows for the estimation of population prevalence (e.g., how common Hepatitis C is in the population of people who inject drugs).
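The wave-by-wave recruitment just described can be sketched in a few lines of code. This toy simulation (the network, coupon count, and target size are all invented for illustration; it is not the software used in the study) shows how recruitment radiates out from seeds through successive referral waves:

```python
import random
from collections import deque

def rds_recruit(network, seeds, coupons=3, target=30, seed=0):
    """Simulate RDS recruitment on a known network.

    Each recruited person passes up to `coupons` referral coupons to
    not-yet-recruited acquaintances; recruitment proceeds breadth-first
    through successive waves until the target sample size is reached.
    Returns a dict mapping each recruit to their wave number."""
    rng = random.Random(seed)
    waves = {s: 0 for s in seeds}            # seeds form wave 0
    queue = deque(seeds)
    while queue and len(waves) < target:
        person = queue.popleft()
        candidates = [p for p in network[person] if p not in waves]
        for referral in rng.sample(candidates,
                                   min(coupons, len(candidates))):
            waves[referral] = waves[person] + 1
            queue.append(referral)
            if len(waves) >= target:
                break
    return waves

# Toy network: 60 people in a ring, each knowing their 4 nearest others.
N = 60
network = {i: [(i - 2) % N, (i - 1) % N, (i + 1) % N, (i + 2) % N]
           for i in range(N)}
waves = rds_recruit(network, seeds=[0])
```

On a real network the same process is what the RDS estimators model; whether its assumptions (a connected population, comparable participation chances) actually hold is exactly what the article sets out to test.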

Using RDS to estimate the prevalence of Hepatitis C among the population of the Isle of Wight, we hypothesized that the treatment program was closer to achieving the elimination of the virus than the available data suggested.

However, concerns remained about the potential flaws of RDS and we were interested in how one could develop methods to assess these flaws. Here our study on the Isle of Wight presented a unique opportunity. The small island population made it possible to map the social networks connecting people who inject drugs through which the sampling process passes. With this network ‘map’ it would then be possible to test whether some of the assumptions underlying the method had been met.

To achieve this, a mixed methods social network study was run alongside the main survey. Interviews were conducted with people who inject drugs on the Island as well as the service providers who worked with them. These interviews explored how they were all interconnected. Survey participants were also asked about their social networks, which then aided the construction of a representation of the network through which the ‘waves’ of the sampling process passed.

Unsurprisingly, many survey participants were unenthusiastic about identifying friends and acquaintances who also inject drugs. Instead, unique codes for each individual described were used. These comprised their initials, age, hair colour, gender, and the village or town where they lived. Participants were asked about each individual they described (e.g., how frequently they inject, or whether they use needle exchange services). In this way a picture of the social network of people who inject drugs on the Isle of Wight was gradually built up, providing insights into this population even though some of the target population had not come forward to participate directly in the survey.
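As an illustration of how such codes can work, the sketch below (the field format, normalisation, and names are hypothetical, not the study’s actual coding scheme) builds a pseudonymous identifier from the descriptive fields and uses it to merge independent reports about the same person into a single network node:

```python
def make_code(initials, age, hair, gender, town):
    """Build a pseudonymous identifier from descriptive fields,
    normalised so independent reports of the same person collide."""
    return "-".join([initials.upper(), str(age),
                     hair[:2].lower(), gender[0].upper(), town.lower()])

class Network:
    """Undirected network keyed by pseudonymous codes."""
    def __init__(self):
        self.contacts = {}    # code -> set of contact codes
        self.attributes = {}  # code -> reported attributes

    def add_report(self, reporter, contact, **attrs):
        # Record the tie in both directions and merge attributes.
        self.contacts.setdefault(reporter, set()).add(contact)
        self.contacts.setdefault(contact, set()).add(reporter)
        self.attributes.setdefault(contact, {}).update(attrs)

net = Network()
# Two participants independently describe the same acquaintance; the
# normalised codes match, so both reports attach to one node.
p1 = make_code("AB", 41, "grey", "female", "Newport")
p2 = make_code("CD", 29, "brown", "male", "Ryde")
shared = make_code("JS", 34, "brown", "male", "ryde")
net.add_report(p1, shared, injects_weekly=True)
net.add_report(p2, make_code("js", 34, "Brown", "Male", "RYDE"),
               uses_needle_exchange=True)
```

The point of the normalisation is that descriptions gathered from different participants, with different capitalisation or phrasing, still resolve to the same node, letting the network map include people who never directly took part.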

With this ‘map’ in hand and the personal information collected, it was possible to test some of the assumptions of RDS, such as (1) whether the target population for the survey is all socially interconnected, or (2) whether members of the population are equally likely to participate in the survey.

Read the full article in the IJSRM here.

The researchers would like to thank the individuals who came forward and took part in this study, the community pharmacists who provided research venues in towns and villages across the Island, and the study funders (NIHR CLAHRC Wessex).


Analysing complexity: Developing a modified phenomenological hermeneutical method of data analysis for multiple contexts

By Debra Morgan

Qualitative data analysis has been criticised in recent years for a lack of credibility when the reporting of how findings are attained is vague. In response, a growing body of literature has emphasised the need to detail methods of qualitative data analysis. As a seasoned academic in nurse education, comfortable with the concept of evidence-based practice, I was concerned as a PhD researcher (exploring student nurse experiences of learning whilst studying abroad) to select a sound approach to data analysis that would ensure transparency at each stage of the analysis process, so preventing the aforementioned criticism from being levelled against my research.

The analytical journey began well, with the selection of the ‘phenomenological hermeneutical method for interpreting interview texts’ (Lindseth and Norberg, 2004). This method appeared ideally suited to my research methodology (hermeneutic phenomenology), and the process is well described by the authors, so offering the level of transparency desired. Briefly, this analysis method comprises three stages (Lindseth and Norberg, 2004):

  • naïve reading: the text is read many times in an open-minded manner so that a first ‘naïve understanding’ is arrived at;
  • structural analysis: commences with the identification of ‘meaning units’ from the text, which are condensed so that sub-themes and themes emerge;
  • comprehensive understanding: themes are further considered in relation to the research question and wider texts, and a comprehensive understanding emerges.

Analysis of each individual research participant’s data progressed well following these stages. However, I found it difficult to see how I could then combine individual data to develop core phenomenon themes without losing the individual student voice and the learning experiences embedded in diverse contexts. This concerned me, as my research gathered student experiences of learning in multiple contexts. For example, student experiences ranged from short placements in low- and middle-income countries in Africa and Asia to longer placements in high-income countries in Europe. Whilst the phenomenon of learning may exist in each diverse context, each experience is unique; therefore illuminating and preserving experiences in context was important to me. I was concerned to ensure that each individual ‘lived experience’, within each of these different contexts, was explored so that an understanding of learning in each type of placement could be revealed prior to developing a comprehensive understanding of the phenomenon more generically.

Whilst Lindseth and Norberg suggest reflecting on the emergent themes in relation to the context of the research (such as different study abroad types) at the final ‘comprehensive understanding’ stage, I felt it was important to preserve experiences specific to each study abroad type throughout each stage of data analysis so that they were not ‘lost’ during this process. To capture such contextual elements, Bazeley (2009) recommends describing, comparing and relating the characteristics or situation of the participants during analysis. I therefore incorporated these aspects into Lindseth and Norberg’s approach so that the varied study abroad types could be explored individually before being combined with the other types. To achieve this, I developed a modified approach to analysis. In particular, I further differentiated the stage of structural analysis, introducing two sub-stages: an ‘individual structural analysis’ and an additional ‘combined structural analysis’. This development represents a refinement in moving from the individual participant experience (which I have termed ‘the individual horizonal perspective’ or ‘individual horizon’) to combined experiences of the phenomenon (respectively termed ‘the combined horizonal perspective’ or ‘combined horizon’).

This modified qualitative data analysis method ensures that the subjective origins of student, or research participant, experience and contexts remain identifiable throughout each stage of the data analysis process. Consequently, I feel this modification to an existing qualitative data analysis method permits greater transparency when dealing with data that relates to multiple contexts. I have called this approach the ‘Modified Phenomenological Hermeneutical Method of Data Analysis For Multiple Contexts’ and I have also developed a visual model to further illuminate this modified approach.

The ‘modified phenomenological hermeneutical method of data analysis for multiple contexts’ holds utility, and whilst my research is focused upon student nurse education, it is transferable to other subject and research areas that involve multiple research contexts. To this effect, I have shared a reflexive review (including data extracts) of the development and application of this approach in my article ‘Analysing complexity: Developing a modified phenomenological hermeneutical method of data analysis for multiple contexts’, published in The International Journal of Social Research Methodology. This paper also presents the visual model. Additionally, an exploration of the underpinning theoretical basis of this data analysis method and its modification is provided, so adding to the expanding body of evidence and reflecting the ethos of transparency in qualitative data analysis.

Read the full IJSRM article here.

References:

Bazeley, P. (2009). Analysing qualitative data: More than ‘identifying themes’. Malaysian Journal of Qualitative Research, 2, 6-22.

Lindseth, A. & Norberg, A. (2004). A phenomenological hermeneutical method for researching lived experience. Scandinavian Journal of Caring Sciences, 18(2), 145-153.


Decentering methodological nationalism to study precarious legal status trajectories

By Patricia Landolt, Luin Goldring & Paul Pritchard

Our paper tackles the knowledge gap between people’s experiences of immigration status and quantitative research on the matter, and proposes a survey design solution for the gap. We consider how this knowledge gap is produced and perpetuated in Canada, a traditional country of permanent immigration in which temporary migration has become a core feature of immigration management. We review the most important survey and administrative data sources used in Canada to study the relationship between immigration and social inequality. We identify gaps and omissions in these data sources and show why the loyal application of state legal status categories to the empirical study of the relationship between changes in immigration status and social inequality is flawed. In the second half of the paper, we present the design lessons learned from the Citizenship and Employment Precarity project. We discuss our community consultation process, the survey questions we developed to capture respondents’ precarious legal status trajectories, and some of the limitations of our approach.

Research shows that precarious legal status situations are proliferating globally and have profound long-term impacts on migrant life chances and social inequality. Precarious legal status is characterized by a lack of permanent authorized residence, partial and temporary authorized access to social citizenship and employment, dependence on a third party for authorized presence or employment, and deportability (Goldring et al., 2009). Across the world, migrants move between immigration status categories, often in unpredictable ways that do not fit easily within state categories or the design of international migration management or state policy. Migrant precarious legal status trajectories crisscross jurisdictions and programs. There is movement between authorized and unauthorized legal status situations, failed transitions, denied and repeat applications. Yet the bulk of quantitative research on immigration status is organized around state legal status categories and programs. As a result, analysis cannot take into account potentially life altering moments in a person’s precarious legal status trajectory that fly under or around the radar of the state.

The research design process is a crucial moment with the potential to generate methodological autonomy from state classification and counting systems. In our work, the community social service agency-centred knowledge network generated conceptual and analytical autonomy from state practice. Active consultation with stakeholders to incorporate their insights and priorities into the survey design was essential throughout the research process.

The target population for the survey comprised employed, working-age residents of the Greater Toronto Area who had entered Canada with precarious legal status. Respondents’ current legal status was not a sampling criterion, which means that the sample ranges from illegalized, unauthorized migrants to permanent residents and naturalized citizens. In addition to questions on precarious legal status trajectories, the survey includes demographic data and modules on pre-arrival planning and financing of migration, education pre- and post-arrival, early settlement and work, current work, income and financial security, and a self-rated health score with questions about access to healthcare. The twenty-minute, self-administered online survey includes a mixture of closed-response and multiple-choice items and a limited number of open-response items.

The survey has three sets of questions that together capture a respondent’s precarious legal status trajectory in Canada. Respondents are asked for their immigration status at entry, that is, the first time they entered Canada. Current status is captured with three questions. One establishes the respondent’s immigration status category. Another asks if and when the respondent transitioned to permanent residence. Third, the respondent is asked if they currently have a valid work permit, with a probe for work permit type (open or closed). Work permit type is associated with varying degrees of labour market access, rights and entitlements. The three-part question is a check on the potential mismatches between authorization to be present and authorization to work that are built into the immigration system.

A separate set of questions captures movement within and between legal status categories including successful and failed attempts to change status. The respondent is asked to identify all immigration statuses and work permits held, including a deportation order and the total number of attempts to gain permanent resident status via humanitarian applications. They are also asked if they ever left and returned to Canada under a different or renewed immigration status. The survey also asks for costs incurred by a family unit to finance migration and efforts to regularize their legal status.

The survey has limitations that truncate the temporal and directional complexity of precarious legal status trajectories. These limitations result from balancing the practical requirement of a manageable survey length with comprehensiveness. The survey does not capture sequential data on legal status trajectories, only which statuses a respondent applied for and held. It is also not possible to establish whether the total period without status or a work permit is continuous or discontinuous. We also note that survey pilot testing confirmed the inherent difficulties of collecting sequential data, given lacunae in respondent recall and gaps in knowledge, particularly when applications are handled by a third party such as an immigration lawyer, consultant or family member. A second limitation involves legal status trajectories outside of Canada, before first arriving in the country, or between arrival and the survey date. The survey does not address precarious transit zones.

You can read the full article in IJSRM here.