Lee, Ashlin. "In the Shadow of Platforms." M/C Journal 24, no. 2 (April 27, 2021). http://dx.doi.org/10.5204/mcj.2750.
Introduction
This article explores the changing relational quality of “the shadow of hierarchy”, in the context of the merging of platforms with infrastructure as the source of the shadow of hierarchy. In governance and regulatory studies, the shadow of hierarchy (or variations thereof) describes the space of influence that hierarchical organisations and infrastructures have (Héritier and Lehmkuhl; Lance et al.). A shift in who/what casts the shadow of hierarchy will necessarily result in changes to the attendant relational values, logics, and (techno)socialities that constitute the shadow, and a new arrangement of shadow that presents new challenges and opportunities. This article reflects on relevant literature to consider two different ways the shadow of hierarchy has qualitatively changed as platforms, rather than infrastructures, come to cast it – an increase in scalability, and new socio-technical arrangements of (non)participation – and the opportunities and challenges therein. The article concludes that more concerted efforts are needed to design the shadow, given a seemingly directionless desire to enact data-driven solutions.
The Shadow of Hierarchy, Infrastructures, and Platforms
The shadow of hierarchy refers to how institutional, infrastructural, and organisational hierarchies create a relational zone of influence over a particular space. This commonly refers to executive decisions and legislation created by nation states, which are cast over private and non-governmental actors (Héritier and Lehmkuhl 2). Lance et al. (252–53) argue that the shadow of hierarchy is a productive and desirable thing. Exploring the shadow of hierarchy in the context of how geospatial data agencies govern their data, Lance et al. find that the shadow of hierarchy enables the networked governance approaches that agencies adopt. This is because operating in the shadow of institutions provides authority, confers bureaucratic legitimacy and top-down power, and offers financial support. The darkness of the shadow is thus less a moral or ethicopolitical statement (such as that suggested by Fisher and Bolter, who use the idea of darkness to unpack the morality of tourism involving death and human suffering) than a relationality: an expression of differing values, logics, and (techno)socialities internal and external to those infrastructures and institutions that cast it (Gehl and McKelvey).
The shadow of hierarchy might therefore be thought of as a field of relational influences and power that a social body casts over society, by virtue of a privileged position vis-à-vis society. It modulates society’s “light”: the resources (Bourdieu) and power relationships (Foucault) that run through social life, as parsed through a certain institutional and infrastructural worldview (the thing that blocks the light to create the shadow). In this way the shadow of hierarchy is not a field of absolute blackness that obscures, but instead a gradient of light and dark that creates certain effects.
The shadow of hierarchy is now, however, also being cast by decentralised, privately held, and non-hierarchical platforms that are replacing or merging with public infrastructure, creating new social effects. Platforms are digital, socio-technical systems that create relationships between different entities. They are most commonly built around a relatively fixed core function (such as a social media service like Facebook) that then interacts with a peripheral set of complementors (advertising companies and app developers in the case of social media; Baldwin and Woodard) to create new relationships, forms of value, and other interactions (van Dijck, The Culture of Connectivity). In creating these relationships, platforms become inherently political (Gillespie), shaping relationships and content on the platform (Suzor) and in embodied life (Ajunwa; Eubanks).
While platforms are often associated with optional consumer services (such as the streaming service Spotify), they have increasingly come to occupy the place of public infrastructure, and act as a powerful enabler of different socio-technical, economic, and political relationships (van Dijck, Governing Digital Societies). For instance, Plantin et al. argue that platforms have merged with infrastructures, and that once publicly held and funded institutions and essential services now share many characteristics with for-profit, privately held platforms. For example, Australia has had a long history of outsourcing employment services (Webster and Harding), and nearly privatised its entire visa processing data infrastructure (Jenkins). Platforms therefore have a greater role in casting the shadow of hierarchy than before. In doing so, they cast a shadow that is qualitatively different, modulated through a different set of relational values and (techno)socialities.
Scalability
A key difference and selling point of platforms is their scalability: they can rapidly and easily up- and down-scale their functionalities in a way that traditional infrastructure cannot (Plantin et al.). The ability to respond “on-demand” to infrastructural requirements has made platforms the go-to service delivery option in the neo-liberalised public infrastructure environment (van Dijck, Governing Digital Societies). For instance, service providers like Amazon Web Services or Microsoft Azure provide on-demand computing capacity for many nations’ most valuable services, including their intelligence and security capabilities (Amoore, Cloud Ethics; Konkel). The value of such platforms to government lies in the reduced cost and risk that comes with using rented capabilities, and the enhanced flexibility to increase or decrease their usage as required, without any of the economic sunk costs attached to owning the infrastructure.
Scalability is, however, not just about on-demand technical capability, but about how platforms can change the scale of socio-technical relationships and services that are mediated through the platform. This changes the relational quality of the shadow of hierarchy, as activities and services occurring within the shadow are now connected into a larger and rapidly modulating scale. Scalability allows the shadow of hierarchy to extend from those in proximity to institutions to the broader population in general. For example, individual citizens can more easily “reach up” into governmental services and agencies as a part of completing their everyday business, through platforms such as MyGov in Australia (Services Australia). Using a smartphone application, citizens are afforded a more personalised and adaptive experience of the welfare state, as engaging with welfare services is no longer tied to specific “brick-and-mortar” locations, but constantly available through a smartphone app and web portal. Multiple government services, including healthcare and taxation, are also connected to this platform, allowing users to reach across multiple government service domains to complete their personal business, seeking information and services that would have once required separate communications with different branches of government. The individual’s capacities to engage with the state have therefore scaled up with this change in the shadow, retaining a productivity- and capacity-enhancing quality reminiscent of older infrastructures and institutions, as the individual and their lived context is brought closer to the institutions themselves.
Scale, however, comes with complications. The fundamental driver for scalability and its adaptive qualities is datafication. This means individuals and organisations are inflecting their operational and relational logics with the logic of datafication: a need to capture all data, at all times (van Dijck, Datafication; Fourcade and Healy). Platforms, especially privately held platforms, benefit significantly from this, as they rely on data to drive and refine their algorithmic tools, and ultimately create actionable intelligence that benefits their operations. Thus, scalability allows platforms to better “reach down” into individual lives and different social domains to fuel their operations. For example, as public transport services become increasingly datafied into mobility-as-a-service (MaaS) systems, ride-sharing and on-demand transportation platforms like Uber and Lyft become incorporated into the public transport ecosystem (Lyons et al.). These platforms capture geospatial, behavioural, and reputational data from users and drivers during their interactions with the platform (Rosenblat and Stark; Attoh et al.). This generates additional value, and profits, for the platform itself, with limited value returned to the user or the broader public it supports, outside of the transport service. It also places the platform in a position to gain wider access to the population and their data, by virtue of operating as a part of a public service.
In this way the shadow of hierarchy may exacerbate inequity. The (dis)benefits of the shadow of hierarchy become unevenly spread amongst actors within its field, a function of an increased scalability that connects individuals into much broader assemblages of datafication. For Eubanks, this can entrench existing economic and social inequalities by forcing those in need to engage with digitally mediated welfare systems that rely on distant and opaque computational judgements. Local services are subject to increased digital surveillance, a removal of agency from frontline advocates, and algorithmic judgement at scale. More fortunate citizens are also still at risk, with Nardi and Ekbia arguing that many digitally scaled relationships are examples of “heteromation”, whereby platforms convince actors on the platform to labour for free, such as by providing the ratings that establish a platform’s reputational economy. Such labour fuels the operation of the platform by exploiting users, who become both a product/resource (as a source of data for third-party advertisers) and a performer of unrewarded digital labour, such as providing user reviews that help guide a platform’s algorithm(s).
Both these examples represent a particularly disconcerting outcome for the shadow of hierarchy, which has its roots in public sector institutions that operate for a common good through shared and publicly held infrastructure. In shifting towards platforms, especially privately held platforms, value is transmitted to private corporations rather than to the public or the commons, as was the case with traditional infrastructure. The public also comes to own the risks attached to platforms if they become tied to public services, placing a further burden on the public if the platform fails, while reaping none of the profit and value generated through datafication. This is a poor bargain at best.
(Non)Participation
Scalability forms the basis for a further predicament: a changing socio-technical dynamic of (non)participation between individuals and services. According to Star (118), infrastructures are defined through their relationships to a given context. These relationships, which often exist as boundary objects between different communities, are “loosely structured in common use, and become tightly bound in particular locations” (Star 118). While platforms are certainly boundary objects and relationally defined, the affordances of cloud computing have enabled a decoupling from physical location, and the operation of platforms across time and space through distributed digital nodes (smartphones, computers, and other localised hardware) and powerful algorithms that sort and process requests for service. This does not mean location is unimportant for the cloud (see Amoore, Cloud Geographies), but platforms are less likely to have a physically co-located presence in the way traditional infrastructures did. Without the same institutional and infrastructural footprint, the modality for participating in and with the shadow of hierarchy that platforms cast becomes qualitatively different and predicated on digital intermediaries. Replacing a physical and human footprint with algorithmically supported and decentralised computing power allows scalability and some efficiency improvements, but it also removes taken-for-granted touchpoints for contestation and recourse.
For example, the ride-sharing platform Uber operates globally, and has expressed interest in operating in complement to (and perhaps in competition with) public transport services in some cities (Hall et al.; Conger). Given that Uber would come to operate as a part of the shadow of hierarchy that transport authorities cast over said cities, it would not be unreasonable to expect Uber to be subject to comparable advocacy, adjudication, transparency, and complaint-handling requirements. Unfortunately, it is unclear if this would be the case, with examples suggesting that Uber would use the scalability of its platform to avoid these mechanisms. This is revealed by ongoing legal action launched by concerned Uber drivers in the United Kingdom, who have sought access to the profiling data that Uber uses to manage and monitor its drivers (Sawers). The challenge has relied on transnational law (the European Union’s General Data Protection Regulation), with UK-based drivers lodging claims in Amsterdam to initiate proceedings. Such costly and complex actions are beyond the means of many, but demonstrate how reasonable participation in socio-technical and governance relationships (like contestations) might become limited, depending on how the shadow of hierarchy changes with the incorporation of platforms. Even if legal challenges for transparency are successful, they may not produce meaningful change.
For instance, O’Neil links algorithmic bias to mathematical shortcomings in the variables used to measure the world; to the creation of irrational feedback loops based on incorrect data; and to the use of unsound data analysis techniques. These three factors contribute to inequitable digital metrics like predictive policing algorithms that disproportionately target racial minorities. Large amounts of selective data on minorities create myopic algorithms that direct police to target minorities, creating more selective data that reinforces the spurious model. These biases, however, are persistently inaccessible, and even when visible are often unintelligible to experts (Ananny and Crawford). The visibility of the technical “installed base” that supports institutions and public services is therefore not a panacea, especially when the installed base (un)intentionally obfuscates participation in meaningful engagement like complaints handling.
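The feedback loop described above can be made concrete with a short simulation. The sketch below is not drawn from O’Neil or from any real policing system; the districts, rates, and allocation rule are hypothetical, and serve only to show how patrols allocated on the basis of recorded arrests reproduce an initial disparity in the data even when the underlying offence rates are identical.

```python
# Illustrative sketch only: a toy model of a predictive policing feedback loop.
# All district names, rates, and parameters are hypothetical.
import random

random.seed(42)

TOTAL_PATROLS = 100
ENCOUNTERS_PER_PATROL = 50

districts = {
    # Both districts have the same underlying offence rate; the only difference
    # is the historical arrest record the "model" learns from.
    "district_a": {"true_rate": 0.05, "recorded_arrests": 120},  # historically over-policed
    "district_b": {"true_rate": 0.05, "recorded_arrests": 40},
}

def allocate_patrols(districts):
    """Allocate patrols in proportion to recorded arrests (the 'myopic' model)."""
    total = sum(d["recorded_arrests"] for d in districts.values())
    return {name: round(TOTAL_PATROLS * d["recorded_arrests"] / total)
            for name, d in districts.items()}

def simulate_year(districts):
    """One turn of the loop: allocation produces arrests, which update the data."""
    patrols = allocate_patrols(districts)
    for name, d in districts.items():
        # Arrests scale with patrol presence, not with any difference in the
        # true offence rate, so the recorded data keeps mirroring the allocation.
        encounters = patrols[name] * ENCOUNTERS_PER_PATROL
        detected = sum(random.random() < d["true_rate"] for _ in range(encounters))
        d["recorded_arrests"] += detected
    return patrols

for year in range(1, 6):
    print(f"Year {year}: patrol allocation {simulate_year(districts)}")
```

Because the quantity being measured (recorded arrests) is itself a product of the allocation decision, the skew in the data never corrects itself in this toy model, which is the self-reinforcing dynamic the predictive policing example points to.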
A negative outcome is, however, not inevitable. It is entirely possible to design platforms that allow individual users to scale up and have opportunities for enhanced participation. For instance, the eGovernance and mobile governance literature has explored how citizens engage with state services at scale (Thomas and Streib; Foth et al.), and the open government movement has demonstrated the effectiveness of open data in understanding government operations (Barns; Janssen et al.), although both have their challenges (Chadwick; Dawes).
It is not a fantasy to imagine alternative configurations of the shadow of hierarchy that allow more participatory relationships. Open data could facilitate the governance of platforms at scale (Box et al.), where users are enfranchised into a platform by some form of membership right and given access to financial and governance records, in the same way that corporate shareholders are enfranchised, facilitated by the same app that provides a service. This could also be extended to decision making through voting and polling functions. Such a governance form would require radically different legal, business, and institutional structures to create and enforce this arrangement. Delacroix and Lawrence, for instance, suggest that data trusts, where a trustee is assigned legal and fiduciary responsibility to achieve maximum benefit for a specific group’s data, can be used to negotiate legal and governance relationships that meaningfully benefit the users of the trust. Trustees can be instructed to share data only with services whose algorithms are regularly audited for bias, and to provide datasets that are accurate representations of their users, avoiding, for instance, erroneous proxies that disrupt algorithmic models. While these developments are in their infancy, it is not unreasonable to reflect on such endeavours now, as the technologies to achieve them are already in use.
Conclusions
There is a persistent myth that data will yield better, faster, more complete results in whatever field it is applied (Lee and Cook; Fourcade and Healy; Mayer-Schönberger and Cukier; Kitchin). This myth has led to data-driven assemblages, including artificial intelligence, platforms, surveillance, and other data-technologies, being deployed throughout social life. The public sector is no exception to this, but the deployment of any technological solution within the traditional institutions of the shadow of hierarchy is fraught with challenges, and often results in failure or unintended consequences (Henman). The complexity of these systems, combined with time, budgetary, and political pressures, can create a contested environment. It is this environment that moulds society’s light and resources to cast the shadow of hierarchy. Relationality within a shadow of hierarchy that reflects the complicated and competing interests of platforms is likely to present a range of unintended social consequences that are inherently emergent, because they enter into a complex system – society – that is extremely hard to model. The relational qualities of the shadow of hierarchy are therefore now more multidimensional and emergent, and experiences relating to socio-technical features like scale and, as a follow-on, (non)participation are evidence of this.
Yet by being emergent, they are also directionless, a product of complex systems rather than designed and strategic intent. This is not an inherently bad thing, but given the potential for data systems and platforms to have negative or unintended consequences, it is worth considering whether remaining directionless is the best outcome. There are many examples of data-driven systems in healthcare (Obermeyer et al.), welfare (Eubanks; Henman and Marston), and economics (MacKenzie) having unintended and negative social consequences. Appropriately guiding the design and deployment of these systems also represents a growing body of knowledge and practical endeavour (Jirotka et al.; Stilgoe et al.). Armed with the knowledge of these social implications, constructing an appropriate social architecture (Box and Lemon; Box et al.) around the platforms and data systems that form the shadow of hierarchy should be encouraged. This social architecture should account for the affordances and emergent potentials of a complex social, institutional, economic, political, and technical environment, and should assist in guiding the shadow of hierarchy away from egregious challenges and towards meaningful opportunities.
To be directionless is an opportunity to take a new direction. The intersection of platforms with public institutions and infrastructures has moulded society’s light into an evolving and emergent shadow of hierarchy over many domains. With the scale of the shadow changing, and shaping participation, who benefits and who loses out in the shadow of hierarchy is also changing. Equipped with insights into this change, we should not hesitate to shape it, creating or preserving relationalities that offer the best outcomes. Defining, understanding, and practically implementing what the “best” outcome(s) are would be a valuable next step in this endeavour, and should prompt considerable discussion. If we wish the shadow of hierarchy to continue to be productive, then finding a social architecture to shape the emergence and directionlessness of socio-technical systems like platforms is an important step in the continued evolution of the shadow of hierarchy.
References
Ajunwa, Ifeoma. “Age Discrimination by Platforms.” Berkeley Journal of Employment and Labor Law 40 (2019): 1-30.
Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Durham: Duke University Press, 2020.
———. “Cloud Geographies: Computing, Data, Sovereignty.” Progress in Human Geography 42.1 (2018): 4-24.
Ananny, Mike, and Kate Crawford. “Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability.” New Media & Society 20.3 (2018): 973–89.
Attoh, Kafui, et al. “‘We’re Building Their Data’: Labor, Alienation, and Idiocy in the Smart City.” Environment and Planning D: Society and Space 37.6 (2019): 1007-24.
Baldwin, Carliss Y., and C. Jason Woodard. “The Architecture of Platforms: A Unified View.” Platforms, Markets and Innovation. Ed. Annabelle Gawer. Cheltenham: Edward Elgar, 2009. 19–44.
Barns, Sarah. “Mine Your Data: Open Data, Digital Strategies and Entrepreneurial Governance by Code.” Urban Geography 37.4 (2016): 554–71.
Bourdieu, Pierre. Distinction: A Social Critique of the Judgement of Taste. Cambridge, MA: Harvard University Press, 1984.
Box, Paul, et al. Data Platforms for Smart Cities – A Landscape Scan and Recommendations for Smart City Practice. Canberra: CSIRO, 2020.
Box, Paul, and David Lemon. The Role of Social Architecture in Information Infrastructure: A Report for the National Environmental Information Infrastructure (NEII). Canberra: CSIRO, 2015.
Chadwick, Andrew. “Explaining the Failure of an Online Citizen Engagement Initiative: The Role of Internal Institutional Variables.” Journal of Information Technology & Politics 8.1 (2011): 21–40.
Conger, Kate. “Uber Wants to Sell You Train Tickets. And Be Your Bus Service, Too.” The New York Times 7 Aug. 2019. 19 Jan. 2021 <https://www.nytimes.com/2019/08/07/technology/uber-train-bus-public-transit.html>.
Dawes, Sharon S. “The Evolution and Continuing Challenges of E‐Governance.” Public Administration Review 68 (2008): 86–102.
Delacroix, Sylvie, and Neil D. Lawrence. “Bottom-Up Data Trusts: Disturbing the ‘One Size Fits All’ Approach to Data Governance.” International Data Privacy Law 9.4 (2019): 236-252.
Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press, 2018.
Fisher, Joshua A., and Jay David Bolter. “Ethical Considerations for AR Experiences at Dark Tourism Sites.” IEEE Xplore 29 Apr. 2019. 13 Apr. 2021 <https://ieeexplore.ieee.org/document/8699186>.
Foth, Marcus, et al. From Social Butterfly to Engaged Citizen: Urban Informatics, Social Media, Ubiquitous Computing, and Mobile Technology to Support Citizen Engagement. Cambridge, MA: MIT Press, 2011.
Foucault, Michel. Discipline and Punish. London: Penguin, 1977.
Fourcade, Marion, and Kieran Healy. “Seeing like a Market.” Socio-Economic Review 15.1 (2017): 9–29.
Gehl, Robert, and Fenwick McKelvey. “Bugging Out: Darknets as Parasites of Large-Scale Media Objects.” Media, Culture & Society 41.2 (2019): 219–35.
Gillespie, Tarleton. “The Politics of ‘Platforms.’” New Media & Society 12.3 (2010): 347–64.
Hall, Jonathan D., et al. “Is Uber a Substitute or Complement for Public Transit?” Journal of Urban Economics 108 (2018): 36–50.
Henman, Paul. “Improving Public Services Using Artificial Intelligence: Possibilities, Pitfalls, Governance.” Asia Pacific Journal of Public Administration 42.4 (2020): 209–21.
Henman, Paul, and Greg Marston. “The Social Division of Welfare Surveillance.” Journal of Social Policy 37.2 (2008): 187–205.
Héritier, Adrienne, and Dirk Lehmkuhl. “Introduction: The Shadow of Hierarchy and New Modes of Governance.” Journal of Public Policy 28.1 (2008): 1–17.
Janssen, Marijn, et al. “Benefits, Adoption Barriers and Myths of Open Data and Open Government.” Information Systems Management 29.4 (2012): 258–68.
Jenkins, Shannon. “Visa Privatisation Plan Scrapped, with New Approach to Tackle ‘Emerging Global Threats’.” The Mandarin 23 Mar. 2020. 19 Jan. 2021 <https://www.themandarin.com.au/128244-visa-privatisation-plan-scrapped-with-new-approach-to-tackle-emerging-global-threats/>.
Jirotka, Marina, et al. “Responsible Research and Innovation in the Digital Age.” Communications of the ACM 60.6 (2016): 62–68.
Kitchin, Rob. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. Thousand Oaks, CA: Sage, 2014.
Konkel, Frank. “CIA Awards Secret Multibillion-Dollar Cloud Contract.” Nextgov 20 Nov. 2020. 19 Jan. 2021 <https://www.nextgov.com/it-modernization/2020/11/exclusive-cia-awards-secret-multibillion-dollar-cloud-contract/170227/>.
Lance, Kate T., et al. “Cross‐Agency Coordination in the Shadow of Hierarchy: ‘Joining Up’ Government Geospatial Information Systems.” International Journal of Geographical Information Science 23.2 (2009): 249–69.
Lee, Ashlin J., and Peta S. Cook. “The Myth of the ‘Data‐Driven’ Society: Exploring the Interactions of Data Interfaces, Circulations, and Abstractions.” Sociology Compass 14.1 (2020): 1–14.
Lyons, Glenn, et al. “The Importance of User Perspective in the Evolution of MaaS.” Transportation Research Part A: Policy and Practice 121 (2019): 22-36.
MacKenzie, Donald. “‘Making’, ‘Taking’ and the Material Political Economy of Algorithmic Trading.” Economy and Society 47.4 (2018): 501-23.
Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Change How We Live, Work and Think. London: John Murray, 2013.
Nardi, Bonnie, and Hamid Ekbia. Heteromation, and Other Stories of Computing and Capitalism. Cambridge, MA: MIT Press, 2017.
Obermeyer, Ziad, et al. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations.” Science 366.6464 (2019): 447-53.
O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Penguin, 2017.
Plantin, Jean-Christophe, et al. “Infrastructure Studies Meet Platform Studies in the Age of Google and Facebook.” New Media & Society 20.1 (2018): 293–310.
Rosenblat, Alex, and Luke Stark. “Algorithmic Labor and Information Asymmetries: A Case Study of Uber’s Drivers.” International Journal of Communication 10 (2016): 3758–3784.
Sawers, Paul. “Uber Drivers Sue for Data on Secret Profiling and Automated Decision-Making.” VentureBeat 20 July 2020. 19 Jan. 2021 <https://venturebeat.com/2020/07/20/uber-drivers-sue-for-data-on-secret-profiling-and-automated-decision-making/>.
Services Australia. About MyGov. 19 Jan. 2021 <https://www.servicesaustralia.gov.au/individuals/subjects/about-mygov>.
Star, Susan Leigh. “Infrastructure and Ethnographic Practice: Working on the Fringes.” Scandinavian Journal of Information Systems 14.2 (2002): 107-122.
Stilgoe, Jack, et al. “Developing a Framework for Responsible Innovation.” Research Policy 42.9 (2013): 1568-80.
Suzor, Nicolas. Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge: Cambridge University Press, 2019.
Thomas, John Clayton, and Gregory Streib. “The New Face of Government: Citizen‐Initiated Contacts in the Era of E‐Government.” Journal of Public Administration Research and Theory 13.1 (2003): 83-102.
Van Dijck, José. The Culture of Connectivity: A Critical History of Social Media. Oxford: Oxford University Press, 2013.
———. “Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology.” Surveillance & Society 12.2 (2014): 197–208.
———. “Governing Digital Societies: Private Platforms, Public Values.” Computer Law & Security Review 36 (2020). 13 Apr. 2021 <https://www.sciencedirect.com/science/article/abs/pii/S0267364919303887>.
Webster, Elizabeth, and Glenys Harding. “Outsourcing Public Employment Services: The Australian Experience.” Australian Economic Review 34.2 (2001): 231-42.