
Testing key underlying assumptions of respondent driven sampling within a real-world network of people who inject drugs

By Ryan Buchanan, Charlotte Cook, Julie Parkes, & Salim I Khakoo

The World Health Organization has recently set a target for the global elimination of Hepatitis C. However, to monitor progress it is necessary to have accurate methods to track the changing prevalence of Hepatitis C in the populations most affected by the virus. People who inject drugs are a marginalized and often hidden population with a high prevalence of Hepatitis C. As such, tracking Hepatitis C infections in these populations can be difficult. One method for doing just this is Respondent Driven Sampling (RDS). However, prevalence estimates made using RDS rest on several assumptions, and it is difficult to test whether these assumptions have been met.

Our recently published article in the International Journal of Social Research Methodology describes a novel way to do exactly that. This blog shares some of the challenges faced in doing this work and how, by using novel social network data collection techniques, we were able to test some of the assumptions of RDS in an isolated population of people who inject drugs on the Isle of Wight in the United Kingdom. Before delving into how we did this, a brief introduction to the RDS method is necessary.

RDS requires that researchers start with a carefully selected sample within a target population. These individuals (called seeds) are asked to refer two or three friends or acquaintances who are also eligible to take part. These new participants are asked to do the same, and recruitment continues in this way through ‘waves’ until the desired sample size is achieved. Then, using appropriate software, the data collected during the survey about each person’s social network allow for the estimation of population prevalence (e.g., how common Hepatitis C is in the population of people who inject drugs).
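To make the estimation step concrete, below is a minimal sketch of one widely used approach, the RDS-II (Volz–Heckathorn) estimator, which weights each respondent by the inverse of their reported network size. The data and field names are illustrative placeholders; dedicated RDS software implements this and more refined estimators.

```python
# A minimal sketch of the RDS-II (Volz-Heckathorn) estimator; the field
# names and data are illustrative, not the study's actual records.

def rds_ii_prevalence(respondents):
    """Estimate prevalence, weighting each respondent by 1/degree.

    Inverse-degree weighting corrects for the fact that well-connected
    people are more likely to be recruited into the sample. 'degree' is
    the respondent's reported number of contacts in the target
    population; 'positive' marks a positive test result.
    """
    weights = [1.0 / r["degree"] for r in respondents if r["degree"] > 0]
    positive_weights = [1.0 / r["degree"] for r in respondents
                        if r["degree"] > 0 and r["positive"]]
    return sum(positive_weights) / sum(weights)

sample = [
    {"degree": 10, "positive": True},
    {"degree": 3, "positive": False},
    {"degree": 5, "positive": True},
]
print(f"Estimated prevalence: {rds_ii_prevalence(sample):.2f}")
```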

We used RDS to estimate the prevalence of Hepatitis C among people who inject drugs on the Isle of Wight, hypothesizing that the local treatment program was closer to achieving the elimination of the virus than the available data suggested.

However, concerns remained about the potential flaws of RDS, and we were interested in how one could develop methods to assess them. Here our study on the Isle of Wight presented a unique opportunity. The small island population made it possible to map the social networks connecting people who inject drugs, through which the sampling process passes. With this network ‘map’ it would then be possible to test whether some of the assumptions underlying the method had been met.

To achieve this, a mixed-methods social network study was run alongside the main survey. Interviews were conducted with people who inject drugs on the Island, as well as with the service providers who worked with them. These interviews explored how they were all interconnected. Survey participants were also asked about their social networks, which aided in the construction of a representation of the network through which the ‘waves’ of the sampling process passed.

Unsurprisingly, many survey participants were unenthusiastic about identifying friends and acquaintances who also inject drugs. Instead, unique codes were used for each individual described. These comprised their initials, age, hair colour, gender, and the village or town where they lived. Participants were asked about each individual they described (e.g., how frequently they injected and whether they used needle exchange services). In this way a picture of the social network of people who inject drugs on the Isle of Wight was gradually built up, providing insights into this population even though some of its members had not come forward to participate directly in the survey.
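As a hypothetical illustration of how such codes can link nominations from different participants without collecting names, consider the sketch below; the exact attributes, code format and matching rules used in the study may differ.

```python
# A hypothetical sketch of pseudonymous contact codes; the attributes and
# format are illustrative, not the study's actual coding scheme.

def contact_code(initials, age, hair, gender, town):
    """Build a pseudonymous identifier from non-identifying attributes."""
    return f"{initials.upper()}-{age}-{hair.lower()}-{gender.lower()}-{town.lower()}"

# Two participants independently describe the same acquaintance; the
# matching codes allow the two nominations to be merged into one node.
edges = {
    ("participant_01", contact_code("JD", 34, "brown", "m", "Ryde")),
    ("participant_02", contact_code("JD", 34, "brown", "m", "Ryde")),
}
nodes = {code for _, code in edges}
print(nodes)  # {'JD-34-brown-m-ryde'}: one merged node in the network
```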

With this ‘map’ in hand, together with the personal information collected, it was possible to test some of the assumptions of RDS, such as (1) whether the target population for the survey is fully socially interconnected, or (2) whether members of the population are equally likely to participate in the survey.
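The sketch below shows, under stated assumptions, how such checks might look in code using the networkx library; the toy network and the set of enrolled participants are placeholders, not the study's data.

```python
# A minimal sketch of two assumption checks on a reconstructed network,
# assuming networkx is installed; all data here are illustrative.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("B", "C"), ("C", "D"),  # one connected cluster
    ("E", "F"),                          # an isolated pair
])

# Assumption 1: the target population forms a single connected component,
# so recruitment chains can in principle reach everyone.
print("Fully interconnected:", nx.is_connected(G))  # False for this toy graph

# Assumption 2: participation is unrelated to network position. One crude
# check: compare mean degree of those who enrolled against those who did not.
enrolled = {"A", "B", "E"}
degree = dict(G.degree())
mean = lambda values: sum(values) / len(values)
print("Mean degree, enrolled:    ", mean([degree[n] for n in enrolled]))
print("Mean degree, not enrolled:", mean([degree[n] for n in G if n not in enrolled]))
```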

Read the full article in the IJSRM here.

The researchers would like to thank the individuals who came forward and took part in this study, the community pharmacists who provided research venues in towns and villages across the Island, and the study funders (NIHR CLAHRC Wessex).


Analysing complexity: Developing a modified phenomenological hermeneutical method of data analysis for multiple contexts

By Debra Morgan

Qualitative data analysis has been criticised in recent years for a lack of credibility where the reporting of how findings are attained is vague. In response, a growing body of literature has emphasised the need to detail methods of qualitative data analysis. As a seasoned academic in nurse education, comfortable with the concept of evidence-based practice, I was concerned as a PhD researcher (exploring student nurse experiences of learning whilst studying abroad) to select a sound approach to data analysis that would ensure transparency at each stage of the analysis process, so avoiding the aforementioned criticism being levelled against my research.

The analytical journey began well, with the selection of the ‘phenomenological hermeneutical method for interpreting interview texts’ (Lindseth and Norberg, 2004). This method appeared ideally suited to my research methodology (hermeneutic phenomenology) and the process is well described by the authors, so offering the level of transparency desired. Briefly, this analysis method comprises three stages (Lindseth and Norberg, 2004):

  1. Naïve reading: the text is read many times in an open-minded manner so that a first ‘naïve understanding’ is arrived at.
  2. Structural analysis: commences with the identification of ‘meaning units’ from the text; these are condensed, and sub-themes and themes emerge.
  3. Comprehensive understanding: themes are further considered in relation to the research question and wider texts, and a comprehensive understanding emerges.

Analysis of each individual research participant’s data progressed well following these stages. However, I found it difficult to understand how I could then combine individual data to develop core phenomenon themes without losing the individual student voice and learning experiences, which were embedded in diverse contexts. This concerned me because my research comprised gathering student experiences of their learning in multiple contexts. For example, student experiences ranged from short placements in low- and middle-income countries in Africa and Asia to longer placements in high-income countries in Europe. Whilst the phenomenon of learning may exist in each diverse context, each experience is unique; therefore illuminating and preserving experiences in context was important to me. I wanted to ensure that each individual ‘lived experience’, within each of these different contexts, was explored so that an understanding of learning in each type of placement could be revealed prior to developing a comprehensive understanding of the phenomenon more generically.

Whilst Lindseth and Norberg suggest reflecting on the emergent themes in relation to the context of the research (such as different study abroad types) at the final ‘comprehensive understanding’ stage, I felt it was important to preserve experiences specific to each study abroad type throughout each stage of data analysis so that they were not ‘lost’ during this process. To capture such contextual elements, Bazeley (2009) recommends describing, comparing and relating the characteristics or situation of the participants during analysis. I therefore incorporated these aspects into Lindseth and Norberg’s approach so that the varied study abroad types could be explored individually before being combined with the other types. To achieve this, I developed a modified approach to analysis. In particular, I further differentiated at the stage of structural analysis by introducing two sub-stages: an ‘individual structural analysis’ and an additional ‘combined structural analysis’. This development represents a refinement in moving from the individual participant experience (which I have termed ‘the individual horizonal perspective’ or ‘individual horizon’) to combined experiences of the phenomenon (respectively termed ‘the combined horizonal perspective’ or ‘combined horizon’).

This modified qualitative data analysis method ensures that the subjective origins of student, or research participant, experience and contexts remain identifiable throughout each stage of the data analysis process. Consequently, I feel this modification to an existing qualitative data analysis method permits greater transparency when dealing with data that relates to multiple contexts. I have called this approach the ‘Modified Phenomenological Hermeneutical Method of Data Analysis For Multiple Contexts’ and I have also developed a visual model to further illuminate this modified approach.

The ‘modified phenomenological hermeneutical method of data analysis for multiple contexts’ holds utility, and whilst my research is focused upon student nurse education, it is transferable to other subject and research areas that involve multiple research contexts. To this effect, I have shared a reflexive review, including data extracts, of the development and application of this approach in my article ‘Analysing complexity: Developing a modified phenomenological hermeneutical method of data analysis for multiple contexts’, published in The International Journal of Social Research Methodology. This paper also presents the visual model. Additionally, an exploration of the underpinning theoretical basis of this data analysis method and its modification is provided, so adding to the expanding body of evidence and reflecting the ethos of transparency in qualitative data analysis.

Read the full IJSRM article here.

References:

Bazeley, P. (2009). Analysing qualitative data: More than ‘identifying themes’. Malaysian Journal of Qualitative Research, 2, 6-22.

Lindseth, A. & Norberg, A. (2004). A phenomenological hermeneutical method for researching lived experience. Scandinavian Journal of Caring Sciences, 18(2), 145-153.


Decentering methodological nationalism to study precarious legal status trajectories

By Patricia Landolt, Luin Goldring & Paul Pritchard

Our paper tackles the knowledge gap between people’s experiences of immigration status and quantitative research on the matter, and proposes a survey design solution to this gap. We consider how this knowledge gap is produced and perpetuated in Canada, a traditional country of permanent immigration in which temporary migration has become a core feature of immigration management. We review the most important survey and administrative data sources used in Canada to study the relationship between immigration and social inequality. We identify gaps and omissions in these data sources and show why the loyal application of state legal status categories to the empirical study of the relationship between changes in immigration status and social inequality is flawed. In the second half of the paper, we present the design lessons learned by the Citizenship and Employment Precarity project. We discuss our community consultation process, the survey questions we developed to capture respondents’ precarious legal status trajectories, and some of the limitations of our approach.

Research shows that precarious legal status situations are proliferating globally and have profound long-term impacts on migrant life chances and social inequality. Precarious legal status is characterized by a lack of permanent authorized residence, partial and temporary authorized access to social citizenship and employment, dependence on a third party for authorized presence or employment, and deportability (Goldring et al., 2009). Across the world, migrants move between immigration status categories, often in unpredictable ways that do not fit easily within state categories or the design of international migration management or state policy. Migrant precarious legal status trajectories crisscross jurisdictions and programs. There is movement between authorized and unauthorized legal status situations, failed transitions, denied and repeat applications. Yet the bulk of quantitative research on immigration status is organized around state legal status categories and programs. As a result, analysis cannot take into account potentially life altering moments in a person’s precarious legal status trajectory that fly under or around the radar of the state.

The research design process is a crucial moment with the potential to generate methodological autonomy from state classification and counting systems. In our work, the community social service agency-centred knowledge network generated conceptual and analytical autonomy from state practice. Active consultation with stakeholders to incorporate their insights and priorities into the survey design was essential throughout the research process.

The target population for the survey included employed, working-age residents of the Greater Toronto Area who had entered Canada with precarious legal status. Respondents’ current legal status was not a sampling criterion, which means that the sample ranges from illegalized, unauthorized migrants to permanent residents and naturalized citizens. In addition to questions on precarious legal status trajectories, the survey includes demographic data and modules on pre-arrival planning and financing of migration, education pre- and post-arrival, early settlement and work, current work, income and financial security, and a self-rated health score and questions about access to healthcare. The twenty-minute, self-administered online survey includes a mixture of closed-response and multiple-choice items and a limited number of open-response items.

The survey has three sets of questions that together capture a respondent’s precarious legal status trajectory in Canada. Respondents are asked for their immigration status at entrance, that is, the first time they entered Canada. Current status is captured with three questions. One establishes the respondent’s immigration status category. Another asks if and when the respondent transitioned to permanent residence. Third, the respondent is asked if they currently have a valid work permit, with a probe for work permit type (open or closed). Work permit type is associated with varying degrees of labour market access, rights and entitlements. The three-part question is a check on the potential for mismatches between authorization to be present and authorization to work that are built into the immigration system.
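As a hypothetical illustration of how answers to this three-part question might be stored and checked, consider the sketch below; the field names and status categories are placeholders, not the project's actual codebook.

```python
# A hypothetical record for the three current-status questions; field
# names and categories are illustrative, not the project's codebook.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CurrentStatus:
    status_category: str                # e.g. "temporary resident", "permanent resident"
    pr_transition_year: Optional[int]   # year of transition to permanent residence, if any
    has_valid_work_permit: bool
    work_permit_type: Optional[str]     # "open" or "closed", asked only if a permit is held

    def present_but_not_authorized_to_work(self) -> bool:
        """Flag the mismatch the three-part question is designed to catch:
        authorized presence without authorization to work."""
        return (self.status_category != "permanent resident"
                and not self.has_valid_work_permit)

respondent = CurrentStatus("temporary resident", None, False, None)
print(respondent.present_but_not_authorized_to_work())  # True
```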

A separate set of questions captures movement within and between legal status categories, including successful and failed attempts to change status. The respondent is asked to identify all immigration statuses and work permits ever held, including any deportation order, and the total number of attempts to gain permanent resident status via humanitarian applications. They are also asked if they ever left and returned to Canada under a different or renewed immigration status. The survey also asks about the costs incurred by the family unit to finance migration and efforts to regularize their legal status.

The survey has limitations that truncate the temporal and directional complexity of precarious legal status trajectories. These limitations result from balancing the practical requirement of a manageable survey length with comprehensiveness. The survey does not capture sequential data on legal status trajectories, only which statuses a respondent applied for and held. It is also not possible to establish whether the total period without status or a work permit was continuous or discontinuous. We also note that survey pilot testing confirmed the inherent difficulties of collecting sequential data, given lacunae in respondent recall and gaps in knowledge, particularly when applications are handled by a third party such as an immigration lawyer, consultant or family member. A second limitation involves legal status trajectories outside of Canada, whether before first arriving in the country or between arrival and the survey date. The survey does not address precarious transit zones.

You can read the full article in IJSRM here.


Remote qualitative data collection: Lessons from a multi-country qualitative evaluation

By Mehjabeen Jagmag *

Like most researchers who had planned to begin their research projects earlier this year, our research team found our data collection plans upended by the pandemic. We had designed our research guides, received ethical clearance and completed training our research teams for a multi-country endline evaluation of an education programme in Ghana, Kenya and Nigeria well before we heard of the COVID-19 pandemic.

A few days before our teams travelled to their respective data collection sites, phone calls started pouring in – schools were closed indefinitely, travel between cities was restricted, and we were beginning to understand how much the COVID-19 pandemic would change our lives. After a few weeks of waiting and watching, it became apparent that we could not continue in-person data collection.

We revised our research guides and prepared to conduct remote phone interviews with our research participants. Given that this was the third and last round of data collection in our multi-year panel research, we had previously collected phone numbers of our research participants and obtained permission to contact them for further research. We set up remote research desks for the team and began preparation for data collection.

What we were unsure about was whether our research plans would be successful. Accounts of fraudulent callers promising medical remedies and peddling fake health insurance packages had made people wary of responding to unknown phone numbers. We were not sure how many of the phone numbers we had collected in the previous year would still be working, and, most importantly, we were not sure how our research participants were faring under the lockdown and whether they would want to speak with us. Finally, our research participants included primary school students, who were an essential voice in our study. We were keen to conduct interviews but were not sure if this would be feasible: would parents trust us enough to speak to us and consent to their children speaking to us? Once we secured consent from parents, would children provide assent? As trust was the key element to completing our research successfully, we devised a data collection plan that included the following elements, which are likely to be necessary for future remote data collection as well.

Training and retraining for remote data collection

We spent time discussing as a team what the potential challenges might be and how we planned to respond to them. We drew up a collective list of answers that we could draw on to communicate clearly and effectively about the evaluation, respond to any queries and alleviate any concerns that our participants had. This list grew as we collected data, and debrief meetings with the teams at the end of each day of data collection helped ensure it remained a live document.

Seek feedback from key informants

We contacted community leaders and headteachers to enquire about how we should approach data collection with school and community participants. They provided important contextual information, some of it specific to each community. We used this information to improve our introductory messages, the times and dates we called, and how we approached research participants.

Seek introductions from trusted leaders

We also asked community leaders and headteachers to support our recruitment process by sending messages to the participants about our research before it began. Doing so helped minimise any uncertainty about the veracity of our calls. Where relevant, we compensated them for their airtime.

Give participants time to prepare for the interview

We shared information about our organisation and the research objective over text messages or calls, which gave research participants enough time to decide whether they wanted to participate. It also helped us plan to speak at a time that suited them best for a discussion, and gave them the chance to consult with their family and children about whether they wanted to take part in the research.

Ensure continuity of research teams

As this was an endline evaluation, we had research team members who participated in previous rounds of data collection calling the participants they were likely to have visited in the past. Where this was possible, it increased trust and facilitated easy conversations.

Prepare case-history notes

For each school, we prepared short case-history notes about the programme and what we had learned from previous research rounds, to build confidence that our intentions and credentials were genuine. These notes helped remind research participants of our last conversation and helped us focus on what had changed since then, which in turn kept conversations short and in general proved to be a useful conversation starter.

Save time at the beginning and end for questions

We ensured research participants had enough time to ask us about the programme and our motivations, to go over the consent form, to understand why we wanted to speak with the children, and for children to ask parents for their permission before we began our interviews. To ensure that the conversation did not feel rushed, we designed shorter research guides.

Plan for breaks or changes when interviewing with young participants

When speaking with students, we anticipated breaks and distractions during the call, which helped maintain a relaxed pace during the interview. If students were uncomfortable with phone interviews, we eased the conversation to a close to minimise any distress caused to the participant.

Summary and Conclusion

We completed data collection in all three countries, albeit with a less ambitious research plan than we originally intended for an in-person research study. The key objective of our research was to collect the optimal amount of data to inform the programme evaluation while making the interview process convenient and comfortable for the research participants involved. To do so, we learned, it is vital for participants to have confidence in the researchers and the motive for collecting data. Planning before we began data collection and updating our knowledge as the research progressed proved invaluable.

* Mehjabeen Jagmag is a Senior Consultant with Oxford Policy Management.


“Who says what” in multiple choice questions. A comprehensive exploratory analysis protocol

By M. Landaluce-Calvo, Ignacio García-Lautre, Vidal Díaz de Rada, & Elena Abascal

The aim of much sociological research is to assess public opinion, and the data are often collected by the survey method. This enables the detection of different response, behaviour or opinion profiles and the characterization of groups of respondents with similar views on a certain topic or set of questions. As is widely known, however, different types of question not only yield different qualities of response, but also require different methods of analysis.

Any attempt to classify survey question types requires consideration of five criteria: 1) degree of freedom in the response; 2) type of content; 3) level of sensitivity/threat; 4) level of measurement; and 5) number of response options per question. The last criterion (the number of responses) first differentiates between single-response and multiple-response questions. Herein lies the main objective of our article in IJSRM: how to extract maximum information from multiple-response questions.

There are two broad types of multiple-response questions. One is the categorical response question, where the respondent is instructed to “check all that apply” (the categories are exhaustive, but not mutually exclusive). The other is the binary response question, where the respondent is required to check yes or no for each response option. Respondents find “check-all-that-apply” questions more difficult to answer because the multiple options require more use of memory. Under the binary-response format the respondent must consider pairs of options (yes/no), one by one, and check one option in each case. Each pair of options requires an answer, so only a minimal demand is placed on memory. This procedure yields more responses, in both telephone and online surveys, and requires less effort on the part of the respondent, although it may lengthen the questionnaire.

Questions admitting various response options can be further classified into grid or check-all-that-apply questions; in the case of the latter, the categories are exhaustive but not mutually exclusive. A notable feature of this multiple-response question format is its widespread use, both in the field of opinion polling and in sociological and marketing research. International research projects such as the European Social Survey and the World Values Survey, for example, contain large numbers of multiple-response questions.

All the above considerations relate to the stages of data collection and participant opinion retrieval, but what about the analysis? A review of the specialist literature reveals a lack of attention to the specific data-processing treatment and a failure to use a multidimensional exploratory approach that would enable the maximum amount of information to be extracted from the response options. The analysis is limited mainly to calculating one-dimensional frequencies (the frequency with which a given response occurs over the total number of respondents or total number of responses) or two-dimensional frequencies resulting from crossing the chosen response option with other socio-demographic or socio-economic characteristics; in other words, a partial approach in either case.

Our article in IJSRM presents a multidimensional analysis protocol that provides the researcher with tools to identify more and better profiles of “who says what”. The underlying philosophy of this approach is to “let the data speak for themselves” and to learn from them. The strategy begins by coding the response options as a set of metric binary variables (presence/absence). The ideal methodological duo for the exploration of the resulting data is Principal Component Analysis coupled with an Ascending Hierarchical Cluster Analysis, incorporating, in addition, supplementary variables (gender, age, marital status, educational attainment, etc.).
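The sketch below illustrates the core steps of this strategy in Python, assuming the scikit-learn and scipy libraries are available; the data are random placeholders rather than the CIS survey responses, and the article's actual analysis involves more careful choices of components, clusters and software.

```python
# A minimal sketch of the protocol's core steps on placeholder data.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Step 1: code each response option as a binary presence/absence variable
# (rows = respondents, columns = options of the multiple-response question).
X = rng.integers(0, 2, size=(200, 17)).astype(float)

# Step 2: principal component analysis on the binary indicator matrix.
pca = PCA(n_components=5)
scores = pca.fit_transform(X)

# Step 3: ascending (agglomerative) hierarchical clustering on the
# component scores, using Ward's criterion.
Z = linkage(scores, method="ward")
clusters = fcluster(Z, t=4, criterion="maxclust")

# Step 4: profile the clusters with supplementary variables (here, a
# placeholder gender variable) to describe "who says what".
gender = rng.choice(["F", "M"], size=200)
for c in np.unique(clusters):
    share = (gender[clusters == c] == "F").mean()
    print(f"Cluster {c}: n={np.sum(clusters == c)}, share female={share:.2f}")
```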

We apply this protocol to the analysis of three different multiple-response questions included in a Spanish national sociological survey (CIS, Centro de Investigaciones Sociológicas):

  1. “How do you usually spend your free time?”, the respondent has 17 options and can select as many as desired; no order of preference is required and the categories are not mutually exclusive.
  2. “During 2017, how have you spent or do you intend spending your leisure periods?”, with 10 options, there is no limit on the number of them that can be checked, but there are two which automatically exclude the rest: “I haven’t thought about it yet” and “I have no leisure periods”.
  3. “When deciding how to spend your days off, what are your top three priorities?”, there is a limit of three options, out of 10 possible, and no order of preference is required.

This empirical analysis provides evidence not only of the interpretive potential of the coding/analysis protocol, but also of the limitations of some multiple-response question formats. Specifically, it is shown that multi-response with limited options is not a suitable format for detecting response patterns or overall tendencies leading to the identification of global respondent profiles. In addition, this study corroborates that in both the “forced choice” and “check all that apply” formats, respondents are more likely to choose from the options presented at the beginning of a list (primacy effect), a phenomenon that early theories attributed to such questions requiring deeper cognitive processing.

Read the full article in IJSRM here.