Remote qualitative data collection: Lessons from a multi-country qualitative evaluation

By Mehjabeen Jagmag *

Like most researchers who had planned to begin their research projects earlier this year, our research team found our data collection plans upended by the pandemic. We had designed our research guides, received ethical clearance and completed training our research teams for a multi-country endline evaluation of an education programme in Ghana, Kenya and Nigeria well before we heard of the COVID-19 pandemic.

A few days before our teams travelled to their respective data collection sites, phone calls started pouring in – schools were closed indefinitely, travel between cities was restricted, and we were beginning to understand how much the COVID-19 pandemic would change our lives. After a few weeks of waiting and watching, it became apparent that we could not continue in-person data collection.

We revised our research guides and prepared ourselves for conducting remote phone-interviews with our research participants. Given that this was the third and last round of data collection in our multi-year panel research, we had previously collected phone numbers of our research participants and acquired permission to be able to contact them on the phone for further research. We set up remote research desks for the team and began preparation for data collection.

What we were unsure about was whether our research plans would be successful. Accounts of fraudulent callers promising medical remedies and peddling fake health insurance packages had made people wary of responding to unknown phone numbers. We were not sure how many of the phone numbers we had collected in the previous year would still be working, and most importantly, we were not sure how our research participants were faring under the lockdown and whether they would want to speak with us. Finally, our research participants included primary school students, who were an essential voice in our study. We were keen to conduct interviews but were not sure if this would be feasible – would parents trust us enough to speak to us and consent to their children speaking to us? Once we secured consent from parents, would children provide assent? As trust was the key element to completing our research successfully, we devised a data collection plan that included the following elements, which are likely to be necessary for future remote data collection.

Training and retraining for remote data collection

We spent time discussing as a team what the potential challenges might be and how we planned to respond to them. We drew up a collective list of answers that we could draw on to communicate clearly and effectively about the evaluation, respond to any queries and alleviate any concerns that our participants had. This list grew as we collected data, and debrief meetings with the teams at the end of each data collection day helped ensure it remained a live document.

Seek feedback from key informants

We contacted community leaders and headteachers to enquire about how we should approach data collection with school and community participants. They provided important contextual information that was occasionally specific to each community. We used this information to improve our introductory messages, the time and dates we called and how we approached research participants.

Seek introductions from trusted leaders

We also asked community leaders and headteachers to support our recruitment process by sending messages to the participants about our research before it began. Doing so helped minimise any uncertainty of the veracity of our calls. Where relevant, we compensated them for their airtime.

Give participants time to prepare for the interview

We shared information about our organisation and the research objective over text messages or calls, which gave research participants enough time to decide whether they wanted to participate. It also helped them plan a time that would suit them best for a discussion, and to consult with their family and children about whether they wanted to participate in the research.

Ensure continuity of research teams

As this was an endline evaluation, we arranged for research team members who had participated in previous rounds of data collection to call the participants they were likely to have visited in the past. Where this was possible, it increased trust and facilitated easy conversations.

Prepare case-history notes

We prepared short case history notes about the programme and school, and what we had learned from previous research rounds for each school, to build confidence that our intentions and credentials were genuine. These notes helped remind research participants of our last conversation and helped us focus on what had changed since then, which in turn kept conversations short and in general proved a useful conversation starter.

Save time at the beginning and end for questions

We ensured research participants had enough time to ask us about the programme and our motivations, go over the consent form, understand why we wanted to speak with the children, or for children to ask parents for their permission before we began our interviews. To ensure that the conversation did not feel rushed, we designed shorter research guides.

Plan for breaks or changes when interviewing with young participants

When speaking with students, we anticipated breaks and distractions during the call, which helped maintain a relaxed pace during the interview. If students were uncomfortable with phone interviews, we eased the conversation to a close to minimise any distress caused to the participant.

Summary and Conclusion

We completed data collection in all three countries, albeit with a less ambitious research plan than we had originally intended for an in-person research study. The key objective of our research was to collect the optimal amount of data that would inform the programme evaluation while making the interview process convenient and comfortable for the research participants involved. To do so, we have learned that it is vital for participants to have confidence in the researchers and the motive for collecting data. Planning before we began data collection and updating our knowledge as the research progressed proved invaluable to our experience.

* Mehjabeen Jagmag is a Senior Consultant with Oxford Policy Management.

Teaching online research methods online with asynchronous international distance learning students during Covid-19

By Elizabeth Hidson and Vikki Wynn

Challenges in asynchronous international distance learning pre-Covid

Working on an international distance learning teacher training programme brings multiple challenges, the biggest of which had previously been the asynchronous pattern of teaching and learning for the academic elements. Teaching is based on a systematic instructional design approach adopted by our university and broken down into weekly thematic units to support acquisition, discussion, investigation, collaboration, practice and production to meet learning outcomes. Recorded micro-lectures, learning activities and discussion boards are accessed asynchronously, with face-to-face online group sessions for further consolidation. The assessed teaching practice element of the programme had always been carried out in the host international schools, facilitated by school-based mentors and in-country professional practice tutors.

Developing research-informed practitioners

The importance of developing research capacity in trainee teachers stems from the expectation that they will become research-informed practitioners who can use evidence to inform decision-making (Siddiqui and Wardle, 2020). Being consumers of research is not enough, however: teachers need to also develop the tools to carry out their own research in school settings. The first MA-level module that our trainees encounter requires a case study approach to explore specific interventions that their schools implement to address targeted pupils’ learning needs. Typically, our trainee teachers undertake observations, conduct interviews and collect a range of data in their settings to understand how and why this additional support is provided and discuss it in relation to ‘what works’ in education, using initial sources such as the Education Endowment Foundation and the What Works Clearinghouse portals.

Establishing the heritage of research methods and methodology

Good teaching is good teaching, and it follows therefore that good research practice is still good research practice, irrespective of a global pandemic. Early rapid evidence assessments concluded that teaching quality was more important for remote teaching and learning than how it was delivered (Education Endowment Foundation, 2020), which had also been our starting point when considering our own research methods pedagogy. The initial teaching of research methods starts on our programme with key concepts and expectations: conceptualisation, literature, developing research questions, justification of research methods, consideration of ethics, all designed to ensure that the student teacher can apply theory to practice. We start with a formative proposal assignment to ensure early engagement with methodology and methods.

Our face-to-face online group sessions, themed as weekly ‘coffee shop’ meetings, provide a collaborative forum for knowledge exchange and trouble-shooting. Trainee teachers join to listen, to share ideas, to pose questions and problems and the module leaders respond with a dialogic teaching approach, helping to contextualise research methods in school settings and develop knowledge and understanding in a supportive online space.

Elizabeth Hidson promoting the weekly ‘coffee shop’ meeting

The ‘hybrid’ assignment and hybrid research methods

As teaching practice became hybrid for trainee teachers, so did research and assessment. Schooling around the world moved in and out of face-to-face, hybrid and fully online modes over the course of 2020, with the realities of the pandemic hitting earliest in the Far East, where half of our students are based. As physical access to schools and participants fluctuated with local restrictions and impacted on students’ research plans, our alternative assignment pathways opened out to include hybrid and hypothetical assignments designed to act as a safety net for completion.

A key feature of the hybrid assignment was the shift to online and alternative research methods, building on the core research methods pedagogy we had established. Where face-to-face interviews were not an option, we promoted video calling and desktop-sharing (Hidson, 2020), while maintaining the spirit of semi-structured or artefact-based interviewing. Where classroom observations were no longer possible, we promoted fieldnotes captured from hybrid or online teaching sessions, urging a re-think of ethics and the collection of additional secondary data in various forms to attempt triangulation.

The outcomes in terms of the final case studies produced have been pleasing: creative and thoughtful academic discussions that responded to the unique challenges of each setting. We regularly quoted Hamilton and Corbett-Whittier (2013) to our trainees, where they advised thinking of a case study as a living thing and ensuring that it made “as much sense to the reader as it did to the researcher” (p.179). The act of thinking in detail about the research methods seemed to have been beneficial to the understanding of research methods and real-world research.

Developing resilient research capability as a factor of resilient teaching

Although our programme continues to respond to the global challenges of Covid-19, we are keen to retain what has worked into the future. Trainee teachers’ ability to embrace the need for resilience in teaching as well as in research is a benefit. Developing their capacity to see research as a live and responsive part of their practice has always been our intention; we believe that the response to research during Covid will itself be a case study for future cohorts.

References

Education Endowment Foundation (2020). Remote Learning, Rapid Evidence Assessment. London: Education Endowment Foundation.

Hamilton, L., and Corbett-Whittier, C. (2013). Using Case Study in Education Research. London: Sage.

Hidson, E. (2020). Internet Video Calling and Desktop Sharing (VCDS) as an Emerging Research Method for Exploring Pedagogical Reasoning in Lesson Planning. Video Journal of Education and Pedagogy, 5(1), pp. 1-14. https://doi.org/10.1163/23644583-00501001

Siddiqui, N. and Wardle, L. (2020). Can users judge what is ‘promising’ evidence in education? Research Intelligence, 144 (Autumn 2020). London: BERA.

“Who says what” in multiple-choice questions: a comprehensive exploratory analysis protocol

By M. Landaluce-Calvo, Ignacio García-Lautre, Vidal Díaz de Rada, & Elena Abascal

The aim of much sociological research is to assess public opinion, and the data are often collected by the survey method. This enables the detection of different response, behaviour or opinion profiles and the characterization of groups of respondents with similar views on a certain topic or set of questions. As is widely known, however, different types of question not only yield different qualities of response, but also require different methods of analysis.

Any attempt to classify survey question types requires consideration of five criteria: 1) degree of freedom in the response; 2) type of content; 3) level of sensitivity/threat; 4) level of measurement; and 5) number of response options per question. The last criterion (the number of responses) first differentiates between single-response and multiple-response questions. This is the main objective of our article in IJSRM: how to extract maximum information from multiple-response questions.

There are two broad types of multiple-response questions. One is the categorical response question, where the respondent is instructed to “check all that apply” (the categories are exhaustive, but not mutually exclusive). The other is the binary response question, where the respondent is required to check yes or no to each response option. Respondents find “check-all-that-apply” questions more difficult to answer because the multiple options require more use of memory. Under the binary-response format the respondent must consider the options one by one and check yes or no in each case. Each option requires an answer, so only a minimal demand is placed on memory. This procedure yields more responses, in both telephone and online surveys, and requires less effort on the part of the respondent, although it may lengthen the questionnaire.

Questions admitting multiple response options can be further classified into grid or check-all-that-apply questions. In the latter, the categories are exhaustive but not mutually exclusive. This multiple-response question format sees widespread use both in the field of opinion polling and in sociological and marketing research. International research projects such as the European Social Survey and the World Values Survey, for example, contain large numbers of multiple-response questions.

All the above considerations relate to the stages of data collection and participant opinion retrieval, but what about the analysis? A review of the specialist literature reveals a lack of attention to the specific treatment of such data, and a failure to use a multidimensional exploratory approach that would enable the maximum amount of information to be extracted from the response options. The analysis is limited mainly to calculating one-dimensional frequencies (the frequency with which a given response occurs over the total number of respondents or total number of responses) or two-dimensional frequencies resulting from crossing the chosen response option with other socio-demographic or socio-economic characteristics, etc.; in other words, a partial approach in either case.

Our article in IJSRM presents a multidimensional analysis protocol that provides the researcher with tools to identify more and better profiles of “who says what”. The underlying philosophy in this approach is to “let the data speak for themselves”, and to learn from them. The strategy begins by coding the response options as a set of metric binary variables (presence/absence). The ideal methodological duo for the exploration of the resulting data is Principal Component Analysis coupled with an Ascending Hierarchical Cluster Analysis, incorporating, in addition, supplementary variables (gender, age, marital status, educational attainment, etc.).
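The coding-and-analysis strategy above can be sketched in a few lines of Python. This is an illustrative sketch only, using synthetic data and standard scikit-learn/SciPy routines rather than the specific software used in the article; the option labels and cluster count are invented for the example.

```python
# Sketch of the protocol: binary (presence/absence) coding of a
# check-all-that-apply question, PCA, then ascending hierarchical
# clustering, characterised by a supplementary variable.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Synthetic data: 200 respondents, 6 leisure-activity options;
# a 1 means the respondent ticked that option.
options = ["reading", "sport", "tv", "travel", "music", "cooking"]
X = rng.integers(0, 2, size=(200, len(options))).astype(float)

# Step 1: PCA on the binary indicator matrix.
pca = PCA(n_components=3)
coords = pca.fit_transform(X)

# Step 2: ascending (agglomerative) hierarchical clustering with
# Ward linkage on the principal coordinates.
Z = linkage(coords, method="ward")
clusters = fcluster(Z, t=4, criterion="maxclust")

# Step 3: describe each cluster with a supplementary variable
# (e.g. age group), which plays no part in building the clusters.
age_group = rng.choice(["18-34", "35-54", "55+"], size=200)
for k in sorted(set(clusters)):
    mask = clusters == k
    print(k, mask.sum(), X[mask].mean(axis=0).round(2))
```

Each printed row gives a cluster label, its size, and the share of its members ticking each option, which is the raw material for a “who says what” profile.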

We apply this protocol to the analysis of three different multiple-response questions included in a survey by the Spanish Centro de Investigaciones Sociológicas (CIS):

  1. “How do you usually spend your free time?” The respondent has 17 options and can select as many as desired; no order of preference is required and the categories are not mutually exclusive.
  2. “During 2017, how have you spent or do you intend spending your leisure periods?” There are 10 options with no limit on the number that can be checked, but two automatically exclude the rest: “I haven’t thought about it yet” and “I have no leisure periods”.
  3. “When deciding how to spend your days off, what are your top three priorities?” There is a limit of three options, out of 10 possible, with no order of preference required.

This empirical analysis provides evidence not only of the interpretation potential of the coding/analysis protocol, but also of the limitations of some multiple-response question formats. Specifically, it shows that multi-response with limited options is not a suitable format for detecting response patterns or overall tendencies leading to the identification of global respondent profiles. In addition, this study corroborates that in both “forced choice” and “check-all-that-apply” formats, respondents are more likely to choose from the options presented at the beginning of a list (primacy effect). Early theories attributed this phenomenon to such questions requiring deeper cognitive processing.

Read the full article in IJSRM here.

The feasibility and challenge of using administrative data: a case study of historical prisoner surveys

By Anthony Quinn, David Denney, Nick Hardwick, Rahul Jalil, & Rosie Meek

This research note arose from a collaboration between researchers at HM Inspectorate of Prisons and Royal Holloway University of London. Within the last two decades, HM Inspectorate of Prisons [HMIP] has collected a vast array of survey data from detainees in England and Wales. As part of HMIP inspections, the detainee voice is captured via a survey and it is triangulated with other evidence. The survey data inform the inspection report of each establishment as well as annual reports and thematic studies. 

These survey data are important because they provide detainees with a rare opportunity to voice their experiences of incarceration. There are questions about a range of aspects of prison life. For instance, participants are asked about the amount of time that they spend outside of their cells, the quality of the prison food and how often they are able to receive visits from their family and friends. Currently there are 159 question items in total.

It goes without saying that in twenty years a dedicated research team has amassed a huge volume of data from detainees within the 117 prisons in England and Wales. HMIP have retained these data for inspectorate purposes and digital records of the paper surveys have been stored in an electronic archive. So, to analyse these historical survey data is it just a case of logging in to the archive and inputting the data into a statistical computing program?

Well, no… far from it. There are a number of complexities that must be addressed to make just one or a few larger datasets with these data. With these prisoner survey data, a major sticking point is the number of iterations of the questionnaires that there have been since the year 2000; that is, there have been several different versions of the questionnaire, and so questions and their response options have varied. This establishes the need to create metadata, such as inventories and timelines, so that available data can be easily identified.
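One way to picture such an inventory is as a mapping from a harmonised variable name to the question ID and response options used in each questionnaire iteration. The sketch below is purely hypothetical: the version years, question IDs, wordings and options are invented for illustration and do not reflect the actual HMIP surveys.

```python
# Hypothetical question inventory for harmonising several
# questionnaire iterations. All IDs and options are invented.
inventory = {
    "cell_time": {  # harmonised variable name
        "2004": {"id": "Q31", "options": ["<2h", "2-6h", ">6h"]},
        "2009": {"id": "Q28", "options": ["<2h", "2-6h", "6-10h", ">10h"]},
        "2017": {"id": "Q45", "options": ["<2h", "2-6h", "6-10h", ">10h"]},
    },
    "family_visits": {
        "2009": {"id": "Q52", "options": ["never", "monthly", "weekly"]},
        "2017": {"id": "Q60", "options": ["never", "monthly", "weekly"]},
    },
}

def comparable_versions(variable):
    """Return groups of questionnaire versions whose response
    options match exactly, and so could be pooled without recoding."""
    entries = inventory[variable]
    by_options = {}
    for version, meta in entries.items():
        by_options.setdefault(tuple(meta["options"]), []).append(version)
    return [sorted(v) for v in by_options.values() if len(v) > 1]

print(comparable_versions("cell_time"))      # [['2009', '2017']]
print(comparable_versions("family_visits"))  # [['2009', '2017']]
```

Even a minimal structure like this makes it immediately visible which survey years can be pooled for a given item and which require recoding, which is the practical point of building inventories and timelines before any statistical analysis.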

A wealth of literature explains the challenges of opening up data to secondary users. Primarily, curating and maintaining datasets is costly and time-consuming. This is not an undertaking that should be taken lightly or, where it can be avoided, alone. Secondly, survey data can contain personal information and so it needs to be ensured that data are sufficiently anonymised. Thirdly, data can easily be misused or misinterpreted so it is vital to document and explain the data and their limitations for secondary users.

Any research involving places of detention and detainees raises significant ethical considerations. In this case, detainees had not explicitly agreed that data from the surveys they returned to the inspectorate could be made more widely accessible. So, we conducted some focus groups with current long-serving prisoners (we would have conducted more if the pandemic had not halted our efforts) to ask what they thought – they were emphatic that the data should be shared if it would help improve prison conditions. Indeed, some said they would not have taken part in the survey if they had thought the data were NOT going to be used in this way. Further qualitative research with data subjects in order to ascertain their perspectives is certainly an endeavour to be pursued.

Within our Research Note we have put a spotlight on the intricacies involved in identifying administrative data, aggregating them and fully understanding the context within which they were collected. To achieve the latter aim it is vital that, where possible, those who have collected data play a prominent role in the collation of administrative data. This is not a task that should simply be outsourced. Rather, to do justice to such a potentially valuable resource, the expertise of a diverse collaboration of professionals is vital.

Oh, and there’s COVID-19 as well… this has prevented researchers from gaining access to prisons to talk directly to detainees. It has also highlighted the importance of making better use of existing operational and administrative data sources.

Read the full IJSRM article here.

Qualitative health research beyond and alongside COVID-19

By Sue Chowdhry, Emily Ross and Julia Swallow

As qualitative researchers in academia, our practice, like that of many others, has been transformed in light of the global coronavirus pandemic. The ‘lockdowns’ enforced across the world have introduced greater awareness of our proximity to others in everyday life, and of the need to maintain a prescribed distance between bodies. This has implications for our work as researchers in the field of health and illness, whose tools include face-to-face methods such as focus groups, interviews and ethnography.

In this blog post, we reflect on the meaning and implications of doing qualitative health research beyond and alongside COVID-19. Drawing on examples from our individual research projects, we first focus on who and what might be excluded or silenced through the changes to our research environments and practices prompted by the pandemic. We then reflect on several implications of the ruptures caused by the pandemic for qualitative research in health more widely.

Exclusions and silences

Research interactions

As researchers in medical sociology and science and technology studies, we had been undertaking separate projects at the time of the pandemic. Sue’s research concerned pregnant women with experience of pre-term birth, and Emily and Julia’s considered patient and practitioner engagement with novel cancer treatments (genomic techniques and immunotherapies respectively). All three of our projects thus involved individuals classified as especially ‘vulnerable’ to COVID-19 by the UK Government (see Ganguli-Mitra’s opinion piece for a wider discussion of the classification of ‘vulnerability’ as related to COVID-19). As a result, in addition to the restrictions imposed by Institutional and NHS bodies on research practice, we were particularly mindful of the potential consequences of face-to-face methods for our participants.

The prospect of continuing our research in the absence of physical proximity to our participants was daunting. Viewing interactions between researcher and participant as sites for the active co-creation of qualitative data, we were concerned that the inability to conduct research encounters in person, and loss of the intersubjective encounter, could be detrimental to our practice. As Alondra Nelson’s blog post on this issue points out, valuable research insights can be gained from being close enough to observe gestures such as “toes tapping and nervous hands”. Sue interpreted her physical presence as key to the success of focus groups she had conducted prior to COVID-19 restrictions. For example, her occupation of the physical space was performed so as to signal to participants that they controlled the discussion. Equally, participants orientated their bodies to each other in ways that indicated interest and support, through spontaneous shared laughter, eye contact and sometimes fleeting touches of hands at emotional junctures.

We have also been reflecting on the implications of a move away from face-to-face methods for relations of power within qualitative research practice. Our research projects have often focused on life-events that can be distressing and emotional for participants, and throughout we have all maintained a commitment to democratising the research process. We have endeavoured to forge reciprocal relationships with participants, and adopted forms of practice that more equitably distribute control during qualitative interviewing. Reflection on the issue of power in research is significant for those turning to online methods in light of the pandemic, particularly where online material pre-exists the research encounter. Here the intimacy of face-to-face methods, which feminist scholars have claimed better allow for the involvement of participants in the production of knowledge (Ramazanoğlu and Holland, 2002), is absent. Having used pre-existing online material in previous projects, Emily felt that the creativeness of qualitative research practice as a shared project between researcher and participant was not as achievable in online research, nor was the closeness that comes from being a key participant in the creation of qualitative data. As such, those adopting online methods in light of the pandemic may try to re-craft participant involvement and reciprocity in other ways. This may be through initiating contact with authors of online material, sharing information about the research with them, and if appropriate seeking consent from authors to use online posts in research.

Research landscapes and spaces

The concerns discussed above are further situated within the landscapes and spaces in which qualitative research takes place. In the example of focus groups, the ‘affective atmospheres’ (Anderson, 2009) shaping the research encounter provided the backdrop for Sue and her participants’ responses to the research and each other. Sue offered refreshments to her focus group participants, and vividly recalls the smells, tastes and sounds of this shared experience. The atmosphere was carefully fashioned for respondents to feel valued and at ease, and to allow for the exchange of intimate reflections on experience.

The arrangements of care provision, and situated contexts in which care is given and received, shape patient, clinician and researcher accounts of disease and treatment. In the time of COVID-19, the research spaces with which we as health researchers had been familiar are being re-shaped, with this particularly visible in cancer care. Before the pandemic, the settings for Julia’s ethnographic research were already stratified and fragmented, with consequences for the practice of healthcare and patients’ biosocial experiences of cancer. Novel immunotherapies could not be accessed by all, raising questions around the inclusions, exclusions and silences provoked by these therapies – who has access and who benefits? COVID-19 is potentially (re)producing or exacerbating existing inequities. As researchers, who we are able to observe and engage in our projects is a key concern, as we ask what and whose realities, experiences and practices might be privileged over others in the context of contemporary cancer care, and in relation to the healthcare worlds (re)shaped by COVID-19.

Responding and intervening

Although the current situation has prompted us to halt or reformulate our ongoing research, in our experience the need to reflect on and adapt our methodologies has also provided opportunities. Importantly, recognising and responding to the methodological restrictions prompted by the pandemic has encouraged us to think about the inclusions, exclusions and silences that already exist in healthcare worlds, which have been exacerbated or magnified by COVID-19. Attention to these issues through an alternative lens has prompted us to question how we can use method to respond and intervene. Method as practice is a means of understanding, rather than organising, complexity and uncertainty, and a way to respond to the disruptions, inclusions/exclusions and silences which are rendered visible and exacerbated by COVID-19. Method produces particular realities and as such, drawing on feminist STS scholars, we have the opportunity to intervene, and to do what Alondra Nelson describes as ‘creating knowledge pathways to a better world’. If methods shape how and what we know and are always political (Annemarie Mol (1999) would describe this as ‘ontological politics’) – what kinds of social realities do we want to create or bring into being?

Online methods afford possibilities for responding to the contemporary challenges we face as researchers. Qualitative analysis of pre-existing blog posts, solicited online diaries and other methods helpfully detailed by Deborah Lupton and colleagues allows us to continue research projects disrupted by the pandemic. Further, online spaces present opportunities to intervene; to engage with those typically excluded from qualitative research due to geographical location or accessibility – with this even more pronounced in a time of ‘shielding’ those deemed most vulnerable. Online approaches can capture forms of networking and support-seeking around experiences of ill health which have been obscured by the pandemic, but which continue to be shaped by inequalities in access and survival.

As another approach, the benefit of doing ethnography, however limited this might be and whatever this might look like in the future, is that it is about opening space for complexity and uncertainty. It allows us to acknowledge and respond to the messiness of practice as an attempt to understand, rather than organise, the uneven and unpredictable ways in which knowledge is produced in research (Law, 2004). It is about taking the world as it is, whilst also keeping in mind the importance of doing what Donna Haraway would describe as critical, political, partial and situated work which is always on-going.

Reflecting

In an academic environment which emphasises activity and impact, COVID-19 has forced upon us ‘space to breathe’ (Will, 2020). The restrictions imposed by our governments and institutions have demanded an additional layer of reflexivity as we contemplate our research projects in light of the pandemic. In some cases, this has entailed the adaptation of research questions, as well as consideration of how alternative methods align with our wider research paradigms. With restrictions in our ability to engage in face-to-face research methods, we lose key aspects of the relational qualitative research encounter, and are pulled away from the research atmospheres, landscapes and spaces with which we are familiar. However, the loss of face-to-face methods has provided us with an unexpected opportunity to explore new approaches, encouraged tough reflection on our research questions and methodologies, and prompted deeper contemplation of the worth of our research itself.

The authors of this post are supported by the Wellcome Trust (grants 104831/Z/14/Z and 218145/Z/19/Z) and NIHR (grant 17/22/02).

References

Anderson, B. (2009). Affective atmospheres. Emotion, Space and Society, 2, pp. 77-81. doi:10.1016/j.emospa.2009.08.005

Haraway, D. (1988) Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies, Vol. 14, No. 3, pp. 575-599.

Law, J. (2004) After Method: Mess in social science research. London: Routledge.

Mol, A. (1999), Ontological politics. A word and some questions. The Sociological Review, 47: 74-89. doi:10.1111/j.1467-954X.1999.tb03483.x

Ramazanoğlu, C and Holland, J (2002) Feminist methodology: challenges and choices. London: Sage.

Will, C.M. (2020). ‘And breathe…’? The sociology of health and illness in COVID-19 time. Sociology of Health & Illness, 42: 967-971. doi:10.1111/1467-9566.13110