Kennedy, Jenny, Indigo Holcombe-James, and Kate Mannell. "Access Denied." M/C Journal 24.3 (21 June 2021). <http://dx.doi.org/10.5204/mcj.2785>.
Introduction

As social-distancing mandates in response to COVID-19 restricted in-person data collection methods such as participant observation and interviews, researchers turned to socially distant methods such as interviewing via video-conferencing technology (Lobe et al.). These were not new tools or methods, but the pandemic muted any bias towards face-to-face data collection methods. As exemplified in crowd-sourced documents such as Doing Fieldwork in a Pandemic, researchers were encouraged to pivot to digital methods as a means of fulfilling research objectives, “specifically, ideas for avoiding in-person interactions by using mediated forms that will achieve similar ends” (Lupton). The benefits of digital methods for expanding participant cohorts and the scope of research were touted long before 2020 and COVID-19, and, as noted by Murthy, are “compelling” (“Emergent” 172). Researchers using digital methods can expect to reap benefits such as “global datasets/respondents” and “new modalities for involving respondents” (Murthy, “Emergent” 172). The pivot to digital methods is not in and of itself an issue. What concerns us is that dialogues about shifting to digital methods during COVID-19 do not yet appear to have critically considered how participant samples and collected data will be impacted or skewed towards recording the experiences of advantaged cohorts. Existing literature focusses on the time-saving benefits for the researcher, the reduction of travel costs (Fujii), and the minimal costs for users of specific platforms (e.g. Skype), and presumes ubiquity of device access for participants (Cater). We found no discussion of the data costs of accessing such services as a potential barrier to participation in research, although Deakin and Wakefield did share our concern that: Online interviews may ... 
mean that some participants are excluded due to the need to have technological competence required to participate, obtain software and to maintain Internet connection for the duration of the discussion. In this sense, access to certain groups may be a problem and may lead to issues of representativeness. (605)

We write this as a provocation to our colleagues conducting research at this time to consider the cultural and material capital of their participants and how that capital enables them to participate in digitally-mediated data gathering practices, or not, and to what extent. Despite highlighting the potential benefits of digital methods within a methodological tool kit, Murthy previously cautioned against the implications posed by digital exclusion, noting that “the drawback of these research options is that membership of these communities is inherently restricted to the digital ‘haves’ ... rather than the ‘have nots’” (“Digital” 845). In this article, we argue that while tools such as Zoom have indeed enabled fieldwork to continue despite COVID disruptions, this shift to online platforms has important and under-acknowledged implications for who is and is not able to participate in research. In making this argument, we draw on examples from the Connected Students project, a study of digital inclusion that commenced just as COVID-19 restrictions came into effect in the Australian state of Victoria at the start of 2020. We draw on the experiences of these households to illustrate the barriers that such cohorts face when participating in online research. We begin by providing details about the Connected Students project and then contextualising it through a discussion of research on digital inclusion. We then outline three areas in which households would have experienced (or still do experience) difficulties participating in online research: data, devices, and skills. 
We use these findings to highlight the barriers that disadvantaged groups may face when engaging in data collection activities over Zoom and to question how this is impacting on who is and is not being included in research during COVID-19.

The Connected Students Program

The Connected Students program was conducted in Shepparton, a regional city located 180km north of Melbourne. The town itself has a population of around 30,000, while the Greater Shepparton region comprises around 64,000 residents. Shepparton was chosen as the program’s site because it is characterised by a unique combination of low income and low levels of digital inclusion. First, Shepparton ranks in the lowest interval for the Australian Bureau of Statistics’ Socio-Economic Indexes for Areas (SEIFA) and the Index of Relative Socioeconomic Advantage and Disadvantage (IRSAD), as reported in 2016 (Australian Bureau of Statistics, “Census”; Australian Bureau of Statistics, “Index”). Although Shepparton has a strong agricultural and horticultural industry with a number of food-based manufacturing companies in the area, including fruit canneries, dairies, and food processing plants, the town has high levels of long-term and intergenerational unemployment and jobless families. Second, Shepparton is in a regional area that ranks in the lowest interval for the Australian Digital Inclusion Index (Thomas et al.), which measures digital inclusion across dimensions of access, ability, and affordability. Funded by Telstra, Australia’s largest telecommunications provider, and delivered in partnership with Greater Shepparton Secondary College (GSSC), the Connected Students program provided low-income households with a laptop and an unlimited broadband Internet connection for up to two years. Households were recruited to the project via GSSC. To be eligible, households needed to hold a health care card and have at least one child attending the school in year 10, 11, or 12. 
Both the student and a caregiver were required to participate in the project to be eligible. Additional household members were invited to take part in the research, but were not required to. (See Kennedy and Holcombe-James; and Kennedy et al., “Connected Students”, for further details regarding household demographics.) The Australian Digital Inclusion Index identifies affordability as a significant barrier to digital inclusion in Australia (Thomas et al.). The project’s objective was to measure how removing affordability barriers to connectivity impacts on households’ digital inclusion. By providing participating households with a free unlimited broadband Internet connection for the duration of the research, the project removed the costs associated with digital access. Access alone, however, is not enough to resolve the digital exclusion confronted by these low-income households: their exclusion derives not simply from the cost of Internet access, but also from the cost of digital devices, which these households typically lacked. Each household was therefore provided with both a high-speed Internet connection and a brand-new laptop with built-in camera, microphone, and speakers (a standard tool kit for video conferencing). Data collection for the Connected Students project was intended to be conducted face-to-face. We had planned in-person observations including semi-structured interviews with household members conducted at three intervals throughout the project’s duration (beginning, middle, and end), and technology tours of each home to spatially and socially map device locations and uses (Kennedy et al., Digital Domesticity). As we readied to make our first research trip to commence the study, COVID-19 was wreaking havoc. It quickly became apparent that we would not be travelling to work, much less travelling around the state. 
We thus pivoted to digital methods, with all our data collection shifting online to interviews conducted via digital platforms such as Zoom and Microsoft Teams. While the pivot to digital methods saved travel hours, allowing us to scale up the number of households we planned to interview, it also revealed unexpected aspects of our participants’ lived experiences of digital exclusion. In this article, we draw on our first round of interviews, which were conducted with 35 households over Zoom or Microsoft Teams during lockdown. The practice of conducting these interviews revealed insights into the barriers that households faced to digital research participation. In describing these experiences, we use pseudonyms for individual participants and refer to households using the pseudonym for the student participant from that household.

Why Does Digital Inclusion Matter?

Digital inclusion is broadly defined as universal access to the technologies necessary to participate in social and civic life (Helsper; Livingstone and Helsper). Although recent years have seen an increase in the number of connected households and devices (Thomas et al., “2020”), digital inclusion remains uneven. As elsewhere, digital disadvantage in the Australian context falls along geographic and socioeconomic lines (Alam and Imran; Atkinson et al.; Blanchard et al.; Rennie et al.). Digitally excluded population groups typically experience some combination of education, employment, income, social, and mental health hardship; their predicament is compounded by a myriad of important services moving online, from utility payments, to social services, to job-seeking platforms (Australian Council of Social Service; Chen; Commonwealth Ombudsman). In addition to challenges in using essential services, digitally excluded Australians also miss out on the social and cultural benefits of Internet use (Ragnedda and Ruiu). 
Digital inclusion – and the affordability of digital access – should thus be a key concern for researchers looking to apply online methods. Households in the lowest income quintile spend 6.2% of their disposable income on telecommunications services, almost three times more than wealthier households (Ogle). Those in the lowest income quintile also pay a “poverty premium” for their data, almost five times more per unit of data than those in the highest income quintile (Ogle and Musolino). As evidenced by the Australian Digital Inclusion Index, this is driven in part by a higher reliance on mobile-only access (Thomas et al., “2020”). Low-income households are more likely to access critical education, business, and government services through mobile data rather than fixed broadband data (Thomas et al., “2020”). For low-income households, digital participation is the top expense after housing, food, and transport, and is higher than domestic energy costs (Ogle). In the pursuit of responsible and ethical research, we caution against assuming that research participants are able to bear the brunt of access costs in terms of having a suitable device, expending their own data resources, and having adequate skills to complete the activity without undue stress. We draw examples from the Connected Students project to support this argument below.

Findings: Barriers to Research Participation for Digitally Excluded Households

If the Connected Students program had not provided participating households with a technology kit, their preexisting conditions of digital exclusion would have limited their research participation in three key ways. First, households with limited Internet access (particularly those reliant on mobile-only connectivity, with only a few gigabytes of data per month) would have struggled to provide the data needed for video conferencing. Second, households would have struggled to participate due to a lack of adequate devices. 
Third, and critically, although the Connected Students technology kit provided households with the data and devices required to participate in the digital ethnography, this did not necessarily resolve the skills gaps that our households confronted.

Data

Prior to receiving the Connected Students technology kit, many households in our sample had limited modes of connectivity and access to data. For households with comparatively less or lower-quality access to data, digital participation – whether for the research discussed here, or in contemporary life – came with very real costs. This was especially the case for households that did not have a home Internet connection and instead relied solely on mobile data. For these households, who carefully managed their data to avoid running out, participating in research through extended video conferences would have been impossible unless adequate financial reimbursement was offered. Households with very limited Internet access used a range of practices to manage and extend their data access by shifting Internet costs away from the household budget. This often involved making use of free public Wi-Fi or library Internet services. Ellie’s household, for instance, spent their weekends at the public library so that she and her sister could complete their homework. While laborious, these strategies worked well for the families in everyday life. However, they would have been highly unsuitable for participating in research, particularly during the pandemic. On the most obvious level, the expectations of library use – if not silent, then certainly quiet – would have prohibited a successful interview. Further, during COVID-19 lockdowns, public libraries (and other places that provide public Internet) became inaccessible for significant periods of time. Lastly, for some research designs, the location of participants is important even when participation is occurring online. 
In the case of our own project, the house itself as the site of the interview was critical, as our research sought to understand how the layout and materiality of the home impacts on experiences of digital inclusion. We asked participants to guide us around their home, showing where technologies and social activities are colocated. In using the data provided by the Connected Students technology kit, households with limited Internet were able to conduct interviews from within their homes. For these families, participating in online research would have been near impossible without the Connected Students Internet.

Devices

Even with adequate Internet connections, many households would have struggled to participate due to a lack of suitable devices. Laptops, which generally provide the best video conferencing experience, were seen as prohibitively expensive for many families. As a result, many families did not have a laptop or were making do with a laptop that was excessively slow, unreliable, and/or had very limited functions. Desktop computers were rare and generally outdated to the extent that they were not able to support video conferencing. One parent, Melissa, described their barely-functioning desktop as “like part of the furniture more than a computer”. Had the Connected Students program not provided a new laptop with video and audio capabilities, participation in video interviews would have been difficult. This is highlighted by the challenges students in these households faced in completing online schooling prior to receiving the Connected Students kit. A participating student, Mallory, for example, explained that she had previously not had a laptop, relying only on her phone and an old iPad:

Interviewer: Were you able to do all your homework on those, or was it sometimes tricky?

Mallory: Sometimes it was tricky, especially if they wanted to do a call or something ... 
Then it got a bit hard because then I would use up all my data, and then didn’t have much left.

Interviewer: Yeah. Right.

Julia (Parent): ... But as far as schoolwork, it’s hard to do everything on an iPad. A laptop or a computer is obviously easier to manoeuvre around for different things.

This example raises several common issues that would likely present barriers to research participation. First, Mallory’s household did not have a laptop before being provided with one through the Connected Students program. Second, while her household did prioritise purchasing tablets and smartphones, which could be used for video conferencing, these were more difficult to navigate for certain tasks and used up mobile data, which, as noted above, was often a limited resource. Lastly, it is worth noting that in households which did already own a functioning laptop, it was often shared between several household members. As one parent, Vanessa, noted, “yeah, until we got the [Connected Students] devices, we had one laptop between the four of us that are here. And Noel had the majority use of that because that was his school work took priority”. This lack of individuated access to a device would make participation in some research designs difficult, particularly those that rely on regular access to a suitable device.

Skills

The Connected Students program’s provision of data and device access did not, however, ensure successful research participation. Many households struggled to engage with video research interviews due to insufficient digital skills. While a household with Internet connectivity might be considered on the “right” side of the digital divide, connectivity alone does not ensure participation. People also need to have the knowledge and skills required to use online resources. 
Brianna’s household, for example, had downloaded Microsoft Teams to their desktop computer in readiness for the interview, but had neglected to consider whether that device had video or audio capabilities. To work around this restriction, the household decided to complete the interview via the Connected Students laptop, but this too proved difficult. Neither Brianna nor her parents were confident in transferring the link to the interview between devices, whether by email or otherwise, requiring the researchers to talk them through the steps required to log on, find, and send the link via email. While Brianna’s household faced digital skills challenges that affected both parent and student participants, in others, such as Ariel’s, these challenges were focussed at the parental level. In these instances, the student participant provided a vital resource, helping adults navigate platforms and participate in the research. As Celeste, Ariel’s parent, explained:

it’s just new things that I get a bit – like, even on here, because your email had come through to me and I said to Ariel “We’re going to use your computer with Teams. How do we do this?” So, yeah, worked it out. I just had to look up my email address, but I [initially thought] oh, my god; what am I supposed to do here?

Although helpful in our own research given its focus on school-aged young people, this dynamic of parents being helped by their dependents illustrates that the adults in our sample were often unfamiliar with the digital skills required for video conferencing. Research focussing only on adults, or on households in which students have not developed these skills through extended periods of online education such as occurred during the COVID-19 lockdowns, may find participants lacking the digital skills to participate in video interviews. Participation was also impacted upon by participants’ lack of more subtle digital skills around the norms and conventions of video conferencing. 
Several households, for example, conducted their interviews in less-than-ideal situations, such as from both moving and parked cars. A portion of the interview with Piper’s household was completed as they drove the 30 minutes from their home into Shepparton. Due to living out of town, this household often experienced poor reception. The interview was thus regularly disrupted as they dropped in and out of range, with the interview transcript peppered with interjections such as “we’re going through a bit of an Internet light spot ... we’re back ... sorry ...” (Karina, parent). Finally, Piper switched the device on which they were taking the interview to gain a better connection: “my iPad that we were meeting on has worse Internet than my phone Internet, so we kind of changed it around” (Karina). Choosing to participate in the research from locations other than the home provides evidence of the limited time available to these families, and of the onerousness of research participation. These choices also indicate unfamiliarity with video conferencing norms. As digitally excluded households, these participants were likely not the target of popular discussions throughout the pandemic about optimising video conferences through careful consideration of lighting, background, make-up, and positioning (e.g. Lasky; Niven-Phillips). This unfamiliarity was often evident in how participants positioned themselves in front of the camera, often choosing not to sit squarely within the camera’s frame. Sometimes this was because several household members were participating and struggled to all sit within view of the single device, but awkward camera positioning also occurred with only one or two people present. A number of interviews were initially conducted with shoulders, or foreheads, or ceilings rather than “whole” participants until we asked them to reposition the device so that the camera was pointing towards their faces. 
In noting this unfamiliarity we do not seek to criticise participating households, or to apportion to them the responsibility for accruing such skills, but rather to highlight the impact this had on the type of conversation possible between researcher and participant. Such practices offer valuable insight into how digital exclusion impacts on individuals’ everyday lives as well as on their research participation.

Conclusion

Throughout the pandemic, digital methods such as video conferencing have been invaluable for researchers. However, while these methods have enabled fieldwork to continue despite COVID-19 disruptions, the shift to online platforms has important and under-acknowledged implications for who is and is not able to participate in research. In this article, we have drawn on our research with low-income households to demonstrate the barriers that such cohorts experience when participating in online research. Without the technology kits provided as part of our research design, these households would have struggled to participate due to a lack of adequate data and devices. Further, even with the kits provided, households faced additional barriers due to a lack of digital literacy. These experiences raise a number of questions that we encourage researchers to consider when designing methods that avoid in-person interactions, and when reviewing studies that use similar approaches: who doesn’t have the technological access needed to participate in digital and online research? What are the implications of this for who and what is most visible in research conducted during the pandemic? Beyond questions of access, to what extent will disadvantaged populations not volunteer to participate in online research because of discomfort or unfamiliarity with digital tools and norms? When low-income participants are included, how can researchers ensure that participation does not unduly burden them by using up precious data resources? 
And, how can researchers facilitate positive and meaningful participation among those who might be less comfortable interacting through mediums like video conferencing? In raising these questions we acknowledge that not all research will or should be focussed on engaging with disadvantaged cohorts. Rather, our point is that through asking questions such as these, we will be better able to reflect on how data and participant samples are being impacted upon by shifts to digital methods during COVID-19 and beyond. As researchers, we may not always be able to adapt Zoom-based methods to be fully inclusive, but we can acknowledge this as a limitation and keep it in mind when reporting our findings, and later when engaging with the research that was largely conducted online during the pandemic. Lastly, while the Connected Students project focusses on impacts of affordability on digital inclusion, digital disadvantage intersects with many other forms of disadvantage. Thus, while our study focussed specifically on financial disadvantage, our call to be aware of who is and is not able to participate in Zoom-based research applies to digital exclusion more broadly, whatever its cause.

Acknowledgements

The Connected Students project was funded by Telstra. This research was also supported under the Australian Research Council’s Discovery Early Career Researcher Award funding scheme (project number DE200100540).

References

Alam, Khorshed, and Sophia Imran. “The Digital Divide and Social Inclusion among Refugee Migrants: A Case in Regional Australia.” Information Technology & People 28.2 (2015): 344–365.
Atkinson, John, Rosemary Black, and Allan Curtis. “Exploring the Digital Divide in an Australian Regional City: A Case Study of Albury.” Australian Geographer 39.4 (2008): 479–493.
Australian Bureau of Statistics. “Census of Population and Housing: Socio-Economic Indexes for Areas (SEIFA), Australia, 2016.” 2016. 
<https://www.abs.gov.au/ausstats/abs@.nsf/Lookup/by%20Subject/2033.0.55.001~2016~Main%20Features~SOCIO-ECONOMIC%20INDEXES%20FOR%20AREAS%20(SEIFA)%202016~1>.
———. “Index of Relative Socio-Economic Advantage and Disadvantage (IRSAD).” 2016. <https://www.abs.gov.au/ausstats/abs@.nsf/Lookup/by%20Subject/2033.0.55.001~2016~Main%20Features~IRSAD~20>.
Australian Council of Social Service. “The Future of Parents Next: Submission to Senate Community Affairs Committee.” 8 Feb. 2019. <http://web.archive.org/web/20200612014954/https://www.acoss.org.au/wp-content/uploads/2019/02/ACOSS-submission-into-Parents-Next_FINAL.pdf>.
Beer, David. “The Social Power of Algorithms.” Information, Communication & Society 20.1 (2017): 1–13.
Blanchard, Michelle, et al. “Rethinking the Digital Divide: Findings from a Study of Marginalised Young People’s Information Communication Technology (ICT) Use.” Youth Studies Australia 27.4 (2008): 35–42.
Cater, Janet. “Skype: A Cost Effective Method for Qualitative Research.” Rehabilitation Counselors and Educators Journal 4.2 (2011): 10–17.
Chen, Jesse. “Breaking Down Barriers to Digital Government: How Can We Enable Vulnerable Consumers to Have Equal Participation in Digital Government?” Sydney: Australian Communications Consumer Action Network, 2017. <http://web.archive.org/web/20200612015130/https://accan.org.au/Breaking%20Down%20Barriers%20to%20Digital%20Government.pdf>.
Commonwealth Ombudsman. “Centrelink’s Automated Debt Raising and Recovery System: Implementation Report, Report No. 012019.” Commonwealth Ombudsman, 2019. <http://web.archive.org/web/20200612015307/https://www.ombudsman.gov.au/__data/assets/pdf_file/0025/98314/April-2019-Centrelinks-Automated-Debt-Raising-and-Recovery-System.pdf>.
Deakin, Hannah, and Kelly Wakefield. “Skype Interviewing: Reflections of Two PhD Researchers.” Qualitative Research 14.5 (2014): 603–616.
Fujii, LeeAnn. Interviewing in Social Science Research: A Relational Approach. Routledge, 2018.
Helsper, Ellen. “Digital Inclusion: An Analysis of Social Disadvantage and the Information Society.” London: Department for Communities and Local Government, 2008.
Kennedy, Jenny, and Indigo Holcombe-James. “Connected Students Milestone Report 1: Project Commencement.” Melbourne: RMIT, 2021. <https://apo.org.au/node/312817>.
Kennedy, Jenny, et al. “Connected Students Milestone Report 2: Findings from First Round of Interviews.” Melbourne: RMIT, 2021. <https://apo.org.au/node/312818>.
Kennedy, Jenny, et al. Digital Domesticity: Media, Materiality, and Home Life. Oxford UP, 2020.
Lasky, Julie. “How to Look Your Best on a Webcam.” New York Times, 25 Mar. 2020. <http://www.nytimes.com/2020/03/25/realestate/coronavirus-webcam-appearance.html>.
Livingstone, Sonia, and Ellen Helsper. “Gradations in Digital Inclusion: Children, Young People and the Digital Divide.” New Media & Society 9.4 (2007): 671–696.
Lobe, Bojana, David L. Morgan, and Kim A. Hoffman. “Qualitative Data Collection in an Era of Social Distancing.” International Journal of Qualitative Methods 19 (2020): 1–8.
Lupton, Deborah. “Doing Fieldwork in a Pandemic (Crowd-Sourced Document).” 2020. <http://docs.google.com/document/d/1clGjGABB2h2qbduTgfqribHmog9B6P0NvMgVuiHZCl8/edit?ts=5e88ae0a#>.
Murthy, Dhiraj. “Digital Ethnography: An Examination of the Use of New Technologies for Social Research.” Sociology 42.2 (2008): 837–855.
———. “Emergent Digital Ethnographic Methods for Social Research.” Handbook of Emergent Technologies in Social Research. Ed. Sharlene Nagy Hesse-Biber. Oxford UP, 2011. 158–179.
Niven-Phillips, Lisa. “‘Virtual Meetings Aren’t Going Anywhere Soon’: How to Put Your Best Zoom Face Forward.” The Guardian, 27 Mar. 2021. <http://www.theguardian.com/fashion/2021/mar/27/virtual-meetings-arent-going-anywhere-soon-how-to-put-your-best-zoom-face-forward>.
Ogle, Greg. “Telecommunications Expenditure in Australia: Fact Sheet.” Sydney: Australian Communications Consumer Action Network, 2017. <https://web.archive.org/web/20200612043803/https://accan.org.au/files/Reports/ACCAN_SACOSS%20Telecommunications%20Expenditure_web_v2.pdf>.
Ogle, Greg, and Vanessa Musolino. “Connectivity Costs: Telecommunications Affordability for Low Income Australians.” Sydney: Australian Communications Consumer Action Network, 2016. <https://web.archive.org/web/20200612043944/https://accan.org.au/files/Reports/161011_Connectivity%20Costs_accessible-web.pdf>.
Ragnedda, Massimo, and Maria Laura Ruiu. “Social Capital and the Three Levels of Digital Divide.” Theorizing Digital Divides. Eds. Massimo Ragnedda and Glenn Muschert. Routledge, 2017. 21–34.
Rennie, Ellie, et al. “At Home on the Outstation: Barriers to Home Internet in Remote Indigenous Communities.” Telecommunications Policy 37.6 (2013): 583–593.
Taylor, Linnet. “What Is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally.” Big Data & Society 4.2 (2017): 1–14.
Thomas, Julian, et al. Measuring Australia’s Digital Divide: The Australian Digital Inclusion Index 2018. Melbourne: RMIT University, for Telstra, 2018.
———. Measuring Australia’s Digital Divide: The Australian Digital Inclusion Index 2019. Melbourne: RMIT University and Swinburne University of Technology, for Telstra, 2019.
———. Measuring Australia’s Digital Divide: The Australian Digital Inclusion Index 2020. Melbourne: RMIT University and Swinburne University of Technology, for Telstra, 2020.
Zuboff, Shoshana. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30 (2015): 75–89.