Data and Afrofuturism: an emancipated subject?

Abstract

The concept of an individual, liberal data subject, traditionally at the centre of data protection efforts, has recently come under scrutiny. At the same time, the particularly destructive effect of digital technology on Black people establishes the need for an analysis that not only considers racial dimensions but brings them to the forefront. I argue that because Afrofuturism situates the Black struggle in persistent, yet continuously changing, structural disparities and power relations, it offers a powerful departure point for re-imagining data protection. Sketching an Afrofuturist data subject then centres on radical subjectivity, collectivity, and contextuality.

Citation & publishing information

Received: November 18, 2020 Reviewed: April 2, 2021 Published: December 7, 2021
Licence: Creative Commons Attribution 3.0 Germany
Competing interests: The author has declared that no competing interests exist that have influenced the text.
Keywords: Data, Afrofuturism, Data protection, Race, Racism
Citation: Kadiri, A. P. (2021). Data and Afrofuturism: an emancipated subject? Internet Policy Review, 10(4). https://doi.org/10.14763/2021.4.1597

This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster.

Introduction

“The data thief travels light. A small multi-functional flat black box, the size of a CD, is all s/he needs.” (Akomfrah & George, 2013)

Central to the Black Audio Film Collective’s documentary ‘The Last Angel of History’ is a time-traveling protagonist named the ‘data thief’ (Akomfrah, 1996). The movie is part of a phenomenon that has been called Afrofuturism, which is, broadly speaking, an aesthetic and philosophy that centres around themes of technology, diaspora, Blackness, and speculative temporalities. What is interesting about the figure is the positive connotation of the term ‘data thief’ and the accompanying emancipatory, that is, self-determined, approach to data. S/he travels through time to excavate fossils, collecting pieces and knowledge—data—of a past that has been lost in order to assemble a future that will be gained. Afrofuturism has its roots in response and resistance to persistent structural racism and the grim future that results from it. An at first sight unlikely match to this is the so-called ‘Afro-census’ (Aikins et al., 2021), the first nationwide survey of Black people in Germany, published in November 2021, which aims “to obtain as comprehensive a picture as possible of the experiences of people of African descent in Germany” (Afrozensus, n.d.). I will later explore the connections between Afrofuturism and the Afro-census by illustrating how an Afrofuturist data subject can guide a critical approach to data protection. Both Afrofuturism and the Afro-census find very distinct ways to oppose and question the status quo. Furthermore, what unites them is the re-imagination of the relationship to data in a positive manner in order to assemble a better future.

In contrast to the liberated ‘data thief’ above, today’s digital reality is much bleaker. We are increasingly becoming aware of our data-saturated world’s dark underbelly, where, as for instance seen in the case of the Facebook-Cambridge Analytica scandal, data traded as a commodity can be transformed into a political weapon (Confessore, 2018). By the same token, it becomes more and more apparent that many injustices in the digital sphere share a very distinct common characteristic: racial discrimination. Algorithms and data are neither neutral nor objective. In fact, they are fed by and feed back into structural racism1 and are linked to forms of knowledge production, which are deeply situated in oppressive structures. Browne (2015) for instance delineates a genealogy of surveillance and data collection that builds firmly on race and Blackness. Equally important, Raval (2019) highlighted the colonial context of knowledge and data production in which “both colonial and postcolonial bodies are inseparable from the past and contemporary technoscientific innovation and production.” (p. 5)

With this in mind, my aim is to call into question a crucial concept within data protection, that is, the data subject. The “subject[] of the right to personal data protection” (González Fuster & Gutwirth, 2017, p. 184) is defined by the European General Data Protection Regulation (GDPR) as “a[] […] natural person […]; who can be identified, directly or indirectly, in particular by reference to an identifier” (Regulation 2016/679) and forms the starting point from which to search for an emancipatory engagement with data. Specifically, how can we rethink the data subject in light of current digital discrimination and racism? Examining the data subject makes it necessary to recognise long-standing notions of who counts as human and therefore enjoys protection within that category. In concurrence with Benjamin’s (2019a) call for rethinking the role of technoscience, Afrofuturism will help re-imagine data protection and form new narratives while building on the specific nature of the injustices. I argue that looking at Afrofuturism can help us move toward an emancipatory approach to the data subject in our digital age.

In the following, I will, first, explore why our current digital working models are inadequate to address digital racism and injustices and how Afrofuturism may fill a conceptual gap. Second, I will sketch out an Afrofuturist data subject, guided by the main themes of radical subjectivity, collectivity, and contextuality and illustrated by the Afro-census, before, third, delineating how the Afrofuturist data subject emancipates through subjectification.

Section 1: #Data, something’s off

As more and more examples of digital injustices accumulate, it becomes clear that something is awfully off in the world of data. Facial recognition systematically misclassifies darker-skinned women (Buolamwini & Gebru, 2018), popular speech recognition systems have shown racial bias (C. Metz, 2020), and the algorithms that feed into credit scores perpetuate racism and marginalisation (Singletary, 2020). In short, digital biases operate along racial and gender lines, with the creation of data knowledge neatly falling into and extending existing forms of oppression. When looking back in history, these accounts do not come as a surprise. Browne (2015) powerfully illustrates that the violent surveillance of Black people persisted before, throughout and after the ‘pre-technological’ era. Today’s digital injustices are then possibly just a logical continuation of this racial surveillance. And yet, the scale and scope of what we currently see unfolding seem heightened, perhaps simply due to the sheer extent to which data now permeates our lives. In effect, we need a more fundamental engagement with how datafication, that is, “the wider transformation of human life so that its elements can be a continual source of data” (Mejias & Couldry, 2019, p. 2), perpetuates and builds on racism.

Equally important, data is also off because strict digital categories, and the quest for what Cohen (see 2012; 2019, p. 5) calls “conceptual coherence”, run the risk of obscuring the fact that the common denominator for all of these categories remains data. Nowadays, the digital is everywhere, but it is not clear where privacy ends, algorithms intersect, and data protection starts. In response to student protests in the United Kingdom against school performance decisions made by predictive algorithms, Amoore (2020) posed the interesting question of whether we are moving into new territory, that is to say, past the former opposition to privacy intrusion and into opposition to the algorithmic models themselves. Significantly, the protesters “[...] weren’t focused on how their data might be used in the future, but how their data had been actively used to change their future. The potential pathways open to young people were reduced, limiting their life chances according to an oblique prediction” (Amoore, 2020, emphasis added). Risking repetition, I want to reiterate the importance of Amoore’s observation for understanding what is at stake with these digital injustices. The students were protesting algorithmic determination of their futures and deep intrusions into their life realities. Their protests targeted the profoundly structural character and the inherent power imbalance of those algorithmic decisions.2 The new character of these protests is also interesting because the line between using data in the future and using data to change the future is a difficult one to draw; yet what is remarkably clear is the power and impact that data and algorithms now have on our (future) lives. In light of racially biased algorithms in medical patient care (Begley, 2020), arrests being made solely due to biased facial recognition (Hill, 2020), Noble’s (2018) analysis of destructive algorithmic potential in “Algorithms of Oppression”, and Browne’s (2015) illustration of an ongoing history of racialised surveillance and privacy intrusions, this foreclosure of futures has a distinctively racial character.

In fact, the analytical lens of the data subject risks being lost between different fields of knowledge. The lines between data protection, privacy, and algorithmic governance are difficult to draw because algorithms are fed by data and the foreclosure of futures transcends these established categories. Furthermore, the classification and attribution of digital injustices are remarkably elusive. Thus, from a legal perspective, distinctions are difficult to make. For instance, data protection and privacy are, although overlapping, not the same (see González Fuster, 2014), since “privacy protection can aim at a different kind of protection than data protection does, and the scope of data protection covers personal information in a distant or indirect relation with the private sphere” (Somody et al., 2017, p. 161). Which categories do these racially biased, future-changing digital injustices belong to? Are they matters of data protection? Privacy? Surveillance? Algorithmic equality? Scientific knowledge production requires definitional specificity, yet these disciplinary boundaries also risk obscuring the fact that the data subject is affected by all of these categories and that what needs to be corrected—race and gender bias—as well as the damaging scope of the injustices, is structural.

Importantly, the paradigm brought forward in the current legal provisions centres on the individual human. For instance, Article 4 of the European General Data Protection Regulation (Regulation 2016/679) defines the data subject as “a[] […] natural person […]; who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.

Yet, this focus on the traditional individual data subject is outdated and inadequate. First, a group of authors highlights the complicated role of consent to personal data collection and processing, which is envisioned to speak to the ‘empowerment’ of data subjects (González Fuster & Gutwirth, 2017, p. 185). Peña and Varon (2019) demonstrate that in its current form the concept of consent neglects power dynamics, and, similarly, both Solove (2013) and Cohen (2019) illustrate the inadequacy of consent-focused solutions for effective privacy protection. Second, critical legal literature has focused on anti-discrimination. For instance, Hintz et al. (2018) have drawn attention to the fact that through the “reconstitut[ion] [of] populations as data subjects […], questions of discrimination, exclusion and inequality have (re)emerged in both existing and new incarnations, […]” (p. 61). Similarly, Taylor (2017) makes the case for international data justice that partly builds on anti-discrimination. Third, the individual-focused provisions have been criticised for their neglect of group-level privacy. For instance, Van der Sloot (2017) argues that in light of “the current technological reality, often referred to as big data, […] threats to privacy increasingly do not take place on an individual level, but on a general or group level” (p. 216). And fourth, the empirical realities paint a grim picture of the dominant human rights approach focusing on each individual’s rights to privacy and data protection. A recent United Nations report on ‘Racial discrimination and emerging digital technologies’ (Achiume, 2020) has shown that specifically the right to non-discrimination and racial equality is often de-prioritised, leading to ‘techno-racism’.

Most importantly for my argument, and connected to the previous point, the individual starting point also harbours a darker limitation, that is, exclusionary categories of citizenship and humanity. The individual agent, citizen, and, more fundamentally, human is one of the core concepts within the theoretical structure of liberal democracy (see Mill, 1859/2015). Although often portrayed as universal, these categories are in fact exclusionary. Notably, the often explicit racism of key Enlightenment figures like Immanuel Kant, who were at this concept’s drawing board, has been widely illustrated (see Boxill, 2017). Moreover, the curious concurrence of the development of human rights and slavery was predicated on the denial of humanity. Only assigning slaves the status of “incomplete humans” (Sala-Molins & Conteh-Morgan, 2006, pp. xxxii–xxxiii) could uphold Enlightenment ideals while prioritising slave owners’ property rights over slaves’ right to freedom (see Därmann, 2020). With this in mind, the conflict within and between human rights has been present since their conceptual beginnings. For that reason, scholars such as Sylvia Wynter (1994, 2003) and Carole Pateman and Charles Mills (2007) have highlighted the fundamental exclusion from the realm of citizenship and humanity along racial and gender lines.

To summarise, I am arguing, first, that in order to critically assess digitally-enabled racism it is paramount to look through the lens of the data subject. And second, more fundamentally, that the specific kind of “data subject lens” employed needs to build on a different form of knowledge, one distinct from the liberal data subject. In the following, I will provide an overview of Afrofuturism, before outlining how it can inform data protection.

Section 2: What Afrofuturism can do

Broadly speaking, Afrofuturism is an aesthetic and philosophy that centres around themes of technology, diaspora, Blackness, and speculative temporalities (Eshun, 2003; Womack, 2013; Youngquist, 2016; Quan, 2017; Avanessian & Moalemi, 2018; Steinskog, 2018).3 Mark Dery (1994) first coined the term Afrofuturism in the essay “Black to the Future”. Among many others, examples include jazz musician Sun Ra, who created an alternative planetary universe and accompanying cosmic philosophy (Youngquist, 2016), pioneering science fiction author Octavia Butler, whose stories and lasting legacy continue to inspire young writers (Imarisha & Brown, 2015), and, more recently, singer Janelle Monáe’s android alter-ego Cindi Mayweather, who fights for justice in a dystopian future (Aghoro, 2018). Although it is by no means a homogeneous category, this array of music, literature, and visual arts is unified by its effort to imagine the future and thereby also re-imagine the past. Thus, Afrofuturism is political because the future and imaginations thereof are inherently political. Importantly, in contrast to other futurisms, Afrofuturism does not have fascist roots (R. Anderson, 2015; Avanessian & Moalemi, 2018; Quan, 2017) but developed to provide “radical democratic politics and life-forms” (Quan, 2017, An Afrofuturist prelude to traveling, para. 9). In light of the recent increase in right-wing ethno-futurisms, Avanessian and Moalemi (2018, p. 38) argue that we are in dire need of visions, like Afrofuturism, which explore a politically and ethnically inclusive technological future while acknowledging its political past. Furthermore, Afrofuturism’s political potential lies in making sense of the inequalities and traumas of the past and present. Thus, as much as Afrofuturism reclaims narratives, it also offers a regenerative reading of technology and the Black community.

Afrofuturism’s complexity and its multitude of layers make it difficult to pin down. In fact, the term itself contains different layers. For one thing, there is the musical, visual, and literary work that has been described as Afrofuturism. Additionally, there are scholarly dissections of specific works as well as (meta)analyses of the synthesised Afrofuturist canon more broadly. Yet, Steinskog (2018) remarks that “[i]t would be better to think of Afrofuturism as an emergent phenomenon” (p. 22) and that “[t]he term Afrofuturism has gotten a life of its own and will continue to be used” (p. 25). Moreover, Afrofuturism also describes two different, yet intertwined, trajectories. The first is the cultural escape from a grim present into the futurist realm, that is to say, futurism as a tool of refuge. The second, arguably more recent, trajectory addresses the lack of Black presence in science fiction, cyberspace, and digital realms such as the digital humanities (McPherson, 2012). For example, in her book “Afrofuturism: The World of Black Sci-Fi and Fantasy Culture”, Womack (2013) shows that the white and male dominance in Silicon Valley’s decision-making processes, its ‘geek culture’, and its technological imaginations are intertwined. Afrofuturism’s complex quest to counteract Black people’s absence from technological history and future is thus intricately linked to opening up new digital future possibilities.

In essence, there is no uniform theory of Afrofuturism, but rather a blend of cultural phenomenon and philosophy. This disregard of categorisations provides it with its richness, but might at times, for instance in this article, also provoke analytical unease due to its evasion of the clearly definable that academic inquiry usually demands. And it is exactly this unease that raises important questions about which forms of knowledge are considered legitimate, by whom, and through what processes.4 In its refusal to be separated into exclusionary categories, Afrofuturism can provide the lens that creates a liberated and intersectional data subject. In light of emerging digital injustices and racism, we need a theoretical framework that is able to render the systemic factors visible while at the same time providing a platform for action. Likewise, we need an emancipatory approach to technology that goes beyond the current data subject. Addressing what Benjamin (2019b) calls “The New Jim Code” (p. 3) means scrutinising what has been overlooked so far and dissecting the underlying criteria and working models which rendered these injustices invisible. Equally important, apart from deconstructing concepts, “[a] liberatory engagement with technology” (Applegate, 2020, p. 139) also entails building new ones. Due to the crucial role of imagination, Afrofuturism is an interesting starting point for this endeavour because it “operates from a standpoint that intersects theories of time and space, technology, class, race, gender, and sexuality and delineates a general economy of racialization in relation to force of production and apocalyptic, dystopian, and utopian futures” (R. Anderson, 2015, p. 183).

Only an intersectional data subject can fully address the unique challenges we face. As outlined, data is neither objective nor neutral, since variables such as race and gender, but also economic status, as most powerfully illustrated by Eubanks (2018), play a role. This corresponds to what Crenshaw (1989) termed intersectionality, the interaction of different demographic factors that work together to either empower or oppress. Significantly, this is far from a purely theoretical exercise; overlooking these interactions has real-life consequences. Deeply rooted in Black feminism (see Carastathis, 2016, pp. 15-68), Crenshaw’s concept highlights the failure of legal protection frameworks against discrimination to account for Black women (Crenshaw, 1989, p. 140). Importantly, Crenshaw stresses that “intersectionality is not being offered as some new, totalizing theory. [...] [Instead, her] focus on the intersections of race and gender only highlights the need to account for multiple grounds of identity when considering how the social world is constructed” (1991, pp. 1244-1245). Similarly, in a close reading of Crenshaw’s texts, Carastathis (2016) positions intersectionality as a mode of thinking and “as a ‘provisional concept’ that disorients cognitive habits” (p. 3) to shine light on categorisations and the blind spots in between. “[M]eant to get us to think about how we think” (Carastathis, 2016, p. 4), intersectionality thus makes possible a knowledge production that brings into focus rather than overlooks marginalisation in order “to resist efforts to compartmentalise experiences and undermine potential collective action” (Crenshaw, 1989, p. 167). The intersectional data subject, therefore, is one that accounts for interconnectedness, unsettles predefined categories, and acknowledges the structural aspect of discrimination.

Structural problems such as racism are embedded in foundational concepts, go unnoticed by dominant paradigms, and are thus not acted upon. The question of whether analyses of (in)justice should orient themselves by the theoretical future ideal or the practical present non-ideal has been a philosophical debate for some time now (Valentini, 2012). Both feminist theorists (E. Anderson, 2009) and critical race scholars (Mills, 2017) have argued for using the non-ideal as an analytical vantage point, thereby centring on present wrongs. Similarly, the subject that is affected by data goes beyond the individualist, neutral conception that we have of it today. It is racialised, it is gendered, and it is collective. Buolamwini and Gebru (2018) illustrated that in the case of facial recognition “[…] gender and skin type analysis by themselves do not present the whole story […] [,] [t]he intersectional error analysis […] provides more answers” (p. 11). Thus, taking into account the unique type of bias data subjects face requires recognising intersectionality from the get-go.
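
To make the quoted point about intersectional error analysis concrete, the following is a minimal Python sketch with entirely hypothetical data, group labels, and function names; it is not Buolamwini and Gebru’s methodology or code. It simply shows how error rates computed for gender or skin type separately can understate errors that are concentrated in one intersectional subgroup:

```python
from collections import defaultdict

# Hypothetical classifier outcomes: (gender, skin_type, correctly_classified)
records = [
    ("male", "lighter", True), ("male", "lighter", True),
    ("male", "darker", True), ("male", "darker", True),
    ("female", "lighter", True), ("female", "lighter", True),
    ("female", "darker", False), ("female", "darker", False),
]

def error_rate(rows):
    """Share of misclassified records in a group."""
    return sum(1 for *_, correct in rows if not correct) / len(rows)

def error_by(key_fn):
    """Group the records by key_fn and compute each group's error rate."""
    groups = defaultdict(list)
    for row in records:
        groups[key_fn(row)].append(row)
    return {key: error_rate(rows) for key, rows in groups.items()}

# Marginal analysis: splitting by gender or skin type alone reports at worst
# a 50% error rate for 'female' and for 'darker'...
print("by gender:   ", error_by(lambda r: r[0]))
print("by skin type:", error_by(lambda r: r[1]))

# ...while the joint, intersectional analysis reveals that every error falls
# on darker-skinned women (a 100% error rate in that subgroup).
print("intersection:", error_by(lambda r: (r[0], r[1])))
```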

Section 3: The Afrofuturist data subject

From the above, it follows that we need to reframe the data subject. What I call the Afrofuturist data subject has three specific characteristics: it is radically subjective, collective, and contextual. Furthermore, I will examine how these ideas are present within the Afro-census.

3.1 The (Afrofuturist) data subject is radically subjective

The first characteristic is a radical opposition to the historical trajectory that has conceptually removed Black people from the subject category.

Afrofuturism’s radicalism stems from its ‘countering’ of existing concepts and paradigms by establishing counter-temporalities (Avanessian & Moalemi, 2018, p. 14), “counter-histories” (Steinskog, 2018, p. 4), “counter-memory” (Eshun, 2003, p. 287), “countertradition” (Youngquist, 2016, p. 79), “counter-futures” (Eshun, 2003, p. 301), and “counter-wisdom” (Youngquist, 2016, p. 91). These oppositions are radical in the sense that, as Angela Davis stated, “radical simply means ‘grasping things at the root’” (Griffin, 1996, p. 69). In other words, they accept neither the status quo nor the limits that the prevailing paradigm places on imagination and political action. Or, as Andrews (2018) frames it, “Black radicalism [...] calls for an overturning of the system that oppresses Black people [...]” (p. xvii). Through its potential to transform (political) reality by offering counter-knowledge and counter-concepts,5 Afrofuturism “[...] is resistance at its most creative [...]” (Youngquist, 2016, p. 60). In the context of the data subject, I employ the term radicalism to rethink the data subject’s fundamental premises, that is, its grounding in individuality, and more specifically, in an individuality that is historically built on exclusionary notions.

Moreover, determining the Afrofuturist data subject requires acknowledging the historical exclusion of Black people from the category of subjects. Traditionally, the data subject is understood as an individual, a human. Yet, what happens if who is considered human is itself conditional and one is assigned the status of an object by the classification “into full humans, not-quite-humans, and nonhumans” (Weheliye, 2014, p. 3)? In Toni Morrison’s (2019) words: “There is no reliable literary or journalistic or scholarly history available to them, to help them, because they are living in a society and a system in which the conquerors write the narrative of their lives. They are spoken of and written about — objects of history, not subjects within it” (p. 324, emphasis added). Corresponding to this notion of denied subjectivity, one of the central premises of what has come to be known as ‘Afro-pessimism’ is that the act of slavery constituted a preclusion from humanity, as it robbed slaves of “their very being” and of their status “as a social subject” (Wilderson III et al., 2017, p. 8, emphasis added). And as an even more explicit indictment, Wynter (1994) puts the categorisation “NHI – No Humans Involved” (p. 42) at the centre of persistent systemic racism in the United States.

We can use this radicalism to rethink our engagement with the digital world generally, and data protection more specifically, by moving the Black ‘object’ into the realm of the subject. If the digital injustices we see today are a continuation of the historical objectification that Morrison (2019) decried, then we need what David (2007) calls “radical black subjectivity” (p. 182). That is, “[...] Afrodiasporic subjects live the estrangement that science-fiction writers envision. Black existence and science fiction are one and the same” (Greg Tate in Eshun, 2003, p. 298, emphasis added). The value in the knowledge that arises from Afrofuturism’s ‘counter’-traditions is that its subjective opposition makes visible what has largely been invisible objectively. Rewriting the data subject thus entails the recognition of who has been an object for far too long in order to break down underlying exclusionary notions of who counts as human and therefore enjoys protection within that category.6

As an illustration, by collecting information on Black people, the Afro-census (Aikins et al., 2021) is creating a category, and thereby a data subject, that has not existed before. Through data, it constructs “[…] a demographic group in Germany which is severely affected by intersectional discrimination [that] can finally attain the public visibility that is needed for a better representation of their interests” (Afrozensus, n.d.). In its emphasis on self-determination and the subsequent need for appropriate data, the Afro-census also resembles recent movements for Indigenous data sovereignty, which focus “on the need for data, which meet Indigenous data needs and aspirations” (Walter & Suina, 2019, p. 236). Moreover, making visible what is often deemed invisible actively opposes the denial of Black people’s existence in Germany as well as the denial of structural racism within the country (Arte Tracks, 2020; Deutsche Welle, 2020). With that, the Afro-census goes beyond the creation of statistical group categories. Data in its broadest sense has always been entangled in the power and the politics of knowledge (see Tuhiwai Smith, 1999). Far too often, the credo seems to be that there is no racism where it is not documented (or recorded), and efforts to begin such documentation in turn encounter strong resistance.7 With that in mind, the Afro-census collects extensive data on the lived experience of Black Germans (Aikins et al., 2021), thereby making visible, as well as creating a vocabulary for, the Black experience in Germany.

3.2 The (Afrofuturist) data subject is collective

Collective trauma, community-building, collective healing, collective memory, and identity formation play a prominent role in Afrofuturism. The idea of a collective trauma which has engraved itself into the minds and hearts of the community is a recurring theme. Whether the historic trauma of slavery or the ongoing trauma of systemic racism, “[...] the propensity to run toward freedom and community building away from conditions of bondage has barely diminished within the context of persistent labor exploitation, hyper-surveillance and unending incarceration” (Quan, 2017, Flight, fugitivity and time travels, para. 5). The collective unearthing and memory-building unifies. Yet, not only does Afrofuturism engage in the process of making formerly unseen conditions visible, its storytelling also serves a collective healing function by making sense of the inequalities and traumas. Interestingly, the process of collective healing within Afrofuturism seems to have a distinctly feminine character. Womack (2013) outlines its “divine feminine principle” (p. 103) and states that “many women artists and writers use the aesthetic [Afrofuturism] as a healing device” (p. 114). Equally important, Etter-Lewis (1993) illustrates that identity is chiefly formed and strengthened within the community. In light of the absence of a written literary history during slavery, Afrofuturism establishes identity through music (Steinskog, 2018) as well as creativity more generally (Youngquist, 2016). In this manner, “Afrofuturist thought posits a reconciliation between an imagined disembodied identity-free future and the embodied identity-specific past and present [...]” (David, 2007, p. 697). Within these interwoven endeavours of community and collectivity lies a distinctive sense of defiance and resistance.

Corresponding to Afrofuturism’s collective identity formation, there is a need for data protection to move toward an acknowledgment of collectivity (see Van der Sloot, 2017; Tisné, 2020). Strauß (2019) highlights the complex interplay between privacy and (digital) identity, stating that “there is a certain demand to re-conceptualize privacy with respect to the informational nature of humans and the representation of their (digital) identities [...] because threats to privacy can threaten identity-building of the individual concerned as well [...]” (p. 6). As Afrofuturism forms, reclaims, and enacts identity as a collective rather than an individual, and with no claim to universality, it poses the question of whose digital identity exactly data protection is protecting. Moreover, can this digital identity be collective? Given these points, data protection needs to be thought beyond the current legal focus on individual rights and towards an acknowledgement of the collective. In opposition to traditional conceptions of data protection, which focus on individual rights and processes, “[c]ollectivity is an alternative institutional principle — an alternative mode of connection and self-determination” (Applegate, 2020, p. 140). More practically, it becomes clear that certain individual-focused data protection measures cannot live up to the new realities of digital developments. For instance, data anonymisation may protect the individual’s privacy but does not protect against data aggregation targeting collective subjects (see the sketch following this paragraph). Similarly, Tisné (2020) recently argued for thinking data protection in collective terms, criticising that there are “systemic mismatches between individual privacy law and the value of collective data processing” (p. 10). Notably, the notion of a collective viewpoint is starting to have an effect within policy, where, for instance, bans on the use of facial recognition technology cite the detrimental effects on whole population groups as opposed to individuals (R. Metz, 2020).
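
As a toy illustration of the anonymisation point above (a hedged sketch with hypothetical records and field names, not a description of any real system or dataset), dropping direct identifiers protects individual records, yet aggregation still constructs and targets a collective subject:

```python
from collections import Counter

# Hypothetical individual records; the name is the only direct identifier.
records = [
    {"name": "A", "postcode": "12049", "denied_credit": True},
    {"name": "B", "postcode": "12049", "denied_credit": True},
    {"name": "C", "postcode": "12049", "denied_credit": False},
    {"name": "D", "postcode": "10115", "denied_credit": False},
]

# 'Anonymisation': remove the direct identifier from every record.
anonymised = [{k: v for k, v in r.items() if k != "name"} for r in records]

# Aggregation still singles out a collective subject: everyone who shares a
# postcode can be scored and targeted as a group, with no individual
# re-identification required.
denials = Counter(r["postcode"] for r in anonymised if r["denied_credit"])
totals = Counter(r["postcode"] for r in anonymised)
for postcode, total in totals.items():
    print(postcode, round(denials[postcode] / total, 2))
```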

With this in mind, collectivity also plays a major role within the Afro-census (Aikins et al., 2021). The most apparent observation is the aim to collect data on a group rather than on individuals, thereby creating not an individual data subject but a collective Black one. Moreover, collectivity shows not only in the numbers but also in the collective meaning-making of Black citizens’ role in German society. Who are we, what do we encounter, and what do we want? The Afro-census is thus a distinctively community-focused effort in which individuals voluntarily provide data to a campaign that started in civil society (Gaul & Vooren, 2020) rather than as a government initiative.

3.3 The (Afrofuturist) data subject is contextual

Contextualising data protection means seeing the data subject within structural power dynamics. The role of contextuality within privacy and surveillance has most notably been explored through Nissenbaum’s (2010) concept of ‘contextual integrity’. More specifically, Nissenbaum (2010) proposes that the appropriateness of information and data flows ought to be judged by their violation of, or accordance with, social norms. Thus, “[c]ontextual integrity is achieved when actions and practices comport with informational norms” (Nissenbaum, 2015, p. 840). This notion of context provides nuance in the analysis of privacy and a conceptual toolset with which to investigate data and information flows. While I agree with Nissenbaum’s (2015) emphasis on contexts, that is, social spheres, in guiding policy, it is also important to take into account that dominant norms in social spheres such as healthcare often contain deeply entrenched structural racism (see Feagin & Bennefield, 2014), which would then in turn inform the boundaries of contextual integrity. It is therefore paramount to carefully scrutinise whose voices are heard and amplified in the “processes of norm discovery, articulation and formation” (Nissenbaum, 2015, p. 840). Moreover, as Rule (2019) points out, contextual integrity does not “transcend personal mindsets and convictions to identify the uniquely correct privacy norms in contested situations” (p. 276), since its normative claims rest on social norms that can, arguably, never be “unambiguous or uncontested” (p. 260). Thus, rather than a panacea to resolve privacy disputes and data violations, the value of stressing context may lie more in “‘friendly persuasion,’ than revelations of unique ‘right answers’” (Rule, 2019).

In light of this subjectivity and the potential of persuasion, the Afro-census (Aikins et al., 2021) illustrates what knowledge production on contextual norms with regard to the data subject can mean. In its quest to collect data on minorities, it diverges from the German policy of not collecting data on racial categories, which relies instead on the broad category of “migration background” (Farkas, 2017, p. 12). Notably, the absence of a race-based census is a political choice which has to be seen in light of the historic role of data collection under the Third Reich (see Black, 2012). In a similar vein, the politics of counting, as well as the demarcation and creation of racial categories within the context of state power, are inextricably linked with censuses (see Thompson, 2016), making them a sensitive if not problematic exercise. With this in mind, linking data protection to the call for more data collection seems counter-intuitive. But what seems paradoxical at first glance is, upon closer inspection, only a consistent call for action based on the knowledge that the current system is flawed. In order to substantiate political demands against racism quantitatively, one simply needs more data. The Afro-census explicitly aims to make the matter politically actionable in order to reduce racial discrimination based on the data collected (EOTO, 2020). In fact, the census has formulated concrete policy recommendations spanning a variety of areas such as healthcare, police reform, media coverage, asylum law, and university curricula (Aikins et al., 2021, pp. 261-279). Within this ‘paradox’, data protection then transforms from the protection of data to protection through data. It produces knowledge and creates a different context in which to judge data and information flows.

Another example of contextuality’s importance is the call to ‘Abolish Big Data’ by the US-American activist group ‘Data 4 Black Lives’, which denounces the ‘weaponization’ of data (Watson-Daniels et al., 2020, p. 26). Principally, the demand to abolish big data is not about abolishing technology, but about abolishing the current flawed system in exchange for an emancipated version of it. Calls for abolition are calls for a genuine recognition of wrongs, the subsequent redirection of resources, and radical renewal. With this in mind, data protection emerges as a fluid endeavour, in which what is emancipatory and what is oppressive depends on the context. The need for contextuality hence stems from the structural disparities and power relations within the digital sphere that currently go unnoticed in traditional views on data protection. Re-imagining the data subject along Afrofuturist lines is then about opening up futures that transcend injustices in the overlapping categories of privacy, data protection, surveillance, and algorithmic governance, because the data subject is involved in all of them.

Section 4: The digital subject’s emancipation

Having outlined the three main characteristics—radical subjectivity, collectivity, and contextuality—I will now turn to the broader implications: first, how the data subject is emancipated through ‘subjectification’ and, second, how we can find universality in the specific and what ‘Blackness’ can mean.

4.1 Emancipation through subjectification

What follows from the Afrofuturist data subject is an emancipatory approach to data through subjectification. The word emancipation has been used in multiple ways, for instance to describe the movement for women’s suffrage (Evans, 2013), the US-American Civil Rights movement (Branch & Edwards, 2013), or Marxist theory (Marx, 1843/1994). Yet, at the heart of the matter are power struggles: emancipation is “the process of giving people social or political freedom and rights” (‘Emancipation’, n.d.).8 What we see today is a digital continuation of the historical objectification that Black people have experienced for centuries. It can be argued that objectification is an inherent part of digital data management and affects everybody, as it “enframes the human subject in a new and different way […] [in] which an entity is recognisable and can be indexed” (Ansorge, 2016, p. 95). Yet, combined with the historic objectification as a denial of humanness, the current digital injustices suggest that objectification via data sorting has a distinct character for Black people. Countering this doubled objectification requires subjectification, which means de- and re-constructing the data subject, reclaiming it, and thereby transforming the ‘object’ into a subject.

Afrofuturism emancipates both from a prescribed identity (Gipson, 2019, p. 102) and through the use of time: past, present, and future (Eshun, 2003; Avanessian & Moalemi, 2018; Womack, 2013). Significantly, as Strauß (2019) highlights, escaping a prescribed identity poses an arguably more critical but also more challenging task in today’s digitalised world. Furthermore, subjectification does not have to be a process of individualisation but, quite the contrary, focuses on a collectivity that broadens the data subject. “[A]cts of preservation and self-defense are not simply expressions of the negative; they define a practice of differential articulation that refuses to be separated and contained” (Applegate, 2020, p. 141). Correspondingly, activist groups such as ‘Data 4 Black Lives’ are preserving and giving voice to data subjects through their critical inquiry into whose data is being weaponised by whom. Likewise, as Black German Asa Awad-Bergström remarks, the Afro-census “empowers” (Deutsche Welle, 2020) Black Germany. A section within the Afro-census, for instance, takes stock of Black Germans’ “Selbstpositionierungen” (Aikins et al., 2021, p. 244), that is, the self-chosen terminology one identifies with, such as “Black”, “Afro-German”, “Person of Colour”, etc. Arguably, both the Afro-census and Data 4 Black Lives aim for the production of knowledge that works toward the formation of subjects and subjectivities. Overall, it is this emancipatory approach to the entanglements of technology, race, and the political economy that makes Afrofuturism so powerful in reimagining data protection in ways which take into account its complexity.

4.2 Making the subjective universal?

Importantly, another subtext of subjectivity is its deeply personal, intuition-based rather than fact-based character. A remaining question is then: what is the, for lack of a better term, universal9 in the extreme cultural specificity of Afrofuturism, and why is it insightful for data protection in general?

In order to counteract the unique digital injustices affecting the data subject, we need to acknowledge the structural problems and intersectional differences. The Afrofuturist data subject is about what is made visible as well as what can be imagined, based on the recognition that in the current conceptual models certain things are invisible. Thus, as Benjamin (2019a, p. 10) remarked, the deconstruction of a priori exclusionary categories benefits us all. Or, as Dratwa (2017) pointedly framed it in the context of the security-surveillance nexus: “What do we want to secure and surveil? Why and how, and at what price? What do we want to make or keep safe? And who is in the ‘we’? This also traces the early connection between surveillance and citizenship, indeed between empowerment, participation and subjection.” (p. xix). Accordingly, the value of this deliberately subjective approach to the data subject, and data protection more generally, lies in the examination of who the most vulnerable data subjects are because, “[w]hen they enter, we all enter” (Crenshaw, 1989, p. 167).

By the same token, ‘Blackness’ as the ultimately subjective aspect of subjectification can point towards the digitally marginalised more generally. But what exactly is ‘Black’, and what does its at once intangible and yet undeniable history of oppression stand for? As Steinskog (2018) remarks, in Afrofuturism “‘black’ does not refer to any universal entity, but is an umbrella term for a diversity of different positions all having in common that they are described or describe themselves as ‘black’” (p. 21). Furthermore, “Sun Ra’s musical teleportation holds out the potential for everyone on Earth to become black. No longer merely a historical identity, blackness becomes a transformative effect of space music, the cultural means of inhabiting a new world. Blacks may be in the best social position to make the most of this opportunity, but Space Is the Place holds it open to others, too, at least theoretically. Astro-black mythology can transport people to a brave new black world.” (Youngquist, 2016, p. 213). By searching for universality in the specific, Blackness can be the departure point from which to examine who is marginalised, in other words, identified as ‘Black’, in our society.10

Painted with a broad brush, this analysis calls for further inquiry into the data subject. The argument brought forward in this article provides several avenues for further investigation; in many places it deserves more nuance than this scope can provide. A non-exhaustive list includes, first, with regard to contextuality, a further exploration of the possibility of an anti-racist critique within Helen Nissenbaum’s (2010, 2015) concept of contextual integrity, particularly the process of assessing a potential violation of the data subject’s information. This could, for instance, be an examination of structural disparities within the production, visibility, and evaluation of social norms and knowledge. Second, further research could focus on a more elaborate investigation into the protection of the data subject that does not rely on the subject/object binary. And third, future examinations should also look into what Chude-Sokei (2016) calls “false universalism” (p. 168), that is, the dominance of the African-American context in discussions of Blackness and Afrofuturism. Despite potential similarities (see Chander, 2020), other diaspora and population groups face different challenges and discriminatory digital practices than those in the United States. In order to prevent and act on digital injustices, we need a nuanced understanding of distinct cases such as the Afro-census. Indeed, “America possesses no monopoly on either blackness or death. A planetary perspective on black suffering and human subjugation would provide globalisation with a conscience irreducible to national identity or imperial ambition.” (Youngquist, 2016, p. 205). Nonetheless, theories of Blackness and the Afrofuturist data subject can teach us about marginalisation, defiance, and empowerment. The value in this exercise thus lies in starting to reimagine and deconstruct the data subject, not in conclusively determining it now.

Conclusion

Starting from the question of how we can rethink the data subject in light of digital racism and discrimination, I have looked towards Afrofuturism. Recent examples of racism and algorithmic bias have shown the inadequacy of the traditional data subject paradigm, making it necessary to recognise long-standing notions of who counts as human and therefore enjoys protection within that category. Furthermore, to recognise the interacting gender and race biases, we need to re-conceptualise the data subject along intersectional lines. Consequently, the Afrofuturist data subject has three specific characteristics. First, the Afrofuturist data subject is a radical opposition to the historical trajectory that has conceptualised Black people as objects rather than subjects. Second, the Afrofuturist data subject goes beyond the current individual data subject paradigm and highlights the collective factor that should inform data protection going forward. Third, the Afrofuturist data subject needs to be contextual because it recognises the societal and political power dynamics that data protection is set within. This radical subjectivity, collectivity, and contextuality can be seen in the German civil society initiative Afro-census (Aikins et al., 2021). Taken together, and in an attempt to reclaim digital humanity, the three characteristics of the Afrofuturist data subject emancipate the data subject through subjectification. Certainly, the Afrofuturist data subject sketched out here is only a beginning: a call for more subjectivity in data protection in order to counter the digital racism we see right now with an emancipated data subject.

The Afrofuturist data subject is about resistance in the form of knowledge. According to Womack (2013), “the absence of Africa’s contribution to global knowledge in history, science, and beyond is a gaping hole so expansive it almost feels like a missing organ in the planet’s cultural anatomy” (p. 80). Similar to Afrofuturism’s resulting quest to fill that hole by looking towards African mythology and mysticism, it is now upon us to fill the knowledge gap that characterises our conceptions of data and the digital world. As a movement by the African diaspora, Afrofuturism can help us re-think our engagement with data through the cultural and political power of imagination. In light of the persistent injustices, it is exactly this speculative technological self-empowerment as a form of resistance that can counter the foreclosure of futures by opening up new ones. Racism is a sometimes blatant, sometimes shape-shifting beast that lives in almost all of our institutions. More important to this discussion, though, is that it is also installed in the new digital institutions we build. At times this beast is not easily detectable, yet that should not prevent us from challenging it. Indeed, this fight is now more urgent than ever. Afrofuturism is suited to it because it encapsulates both the historicity and the future of structural racism. There is continuity, both in racism and in the struggle against it.

Acknowledgment

I would like to thank the editors of this special issue as well as the editors of Internet Policy Review for their encouraging comments and their help in the publishing process. Furthermore, I would like to thank Smarika Lulz and Francesca Sobande for both the comprehensiveness and the thoughtfulness of their review.

References

Achiume, E. T. (2020). Racial discrimination and emerging digital technologies: A human rights analysis [Report]. United Nations.

Afrozensus. (n.d.). Afrozensus. https://afrozensus.de

Aghoro, N. (2018). Agency in the Afrofuturist Ontologies of Erykah Badu and Janelle Monáe. Open Cultural Studies, 2(1), 330–340. https://doi.org/10.1515/culture-2018-0030

Ahmed, S. (2017). Living a feminist life. Duke University Press.

Aikins, M. A., Bremberger, T., Aikins, J. K., Gyamerah, D., & Yildirim-Caliman, D. (2021). Afrozensus 2020: Perspektiven, Anti-Schwarze Rassismuserfahrungen und Engagement Schwarzer, afrikanischer und afrodiasporischer Menschen in Deutschland. Each One Teach One (EOTO) e.V. & Citizens For Europe (CFE). https://afrozensus.de/reports/2020/Afrozensus-2020.pdf

Akomfrah, J. (1996, September 11). The Last Angel of History. Icarus Films.

Akomfrah, J., & George, E. (2013, December 20). The Last Angel of History. The Chimurenga Chronic. https://chimurengachronic.co.za/the-last-angel-of-history/

Amoore, L. (2020, August 19). Why ‘Ditch the algorithm’ is the future of political protest. The Guardian. https://www.theguardian.com/commentisfree/2020/aug/19/ditch-the-algorithm-generation-students-a-levels-politics

Anderson, E. (2009). Toward a Non-Ideal, Relational Methodology for Political Philosophy: Comments on Schwartzman’s Challenging Liberalism. Hypatia, 24(4), 130–145. https://doi.org/10.1111/j.1527-2001.2009.01062.x

Anderson, R. (2019). Critical Afrofuturism: A Case Study In Visual Rhetoric, Sequential Art, And Postapocalyptic Black Identity. In F. Gateward & J. Jennings (Eds.), The Blacker the Ink (pp. 171–192). Rutgers University Press. https://doi.org/10.36019/9780813572369-010

Andrews, K. (2018). Back to Black: Retelling Black radicalism for the 21st century. Zed books.

Ansorge, J. T. (2016). Identify and Sort: How Digital Power Changed World Politics. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780190245542.001.0001

Applegate, M. (2019). Guerrilla Theory: Political Concepts, Critical Digital Humanities. Northwestern University Press. https://doi.org/10.2307/j.ctvwh8f6x

Arte TRACKS. (2020, January 31). Afrozensus—Mit Megaloh, Aminata Belli und Aminata Touré. YouTube. https://www.youtube.com/watch?v=I57LePt0cgc&feature=emb_logo

Avanessian, A., & Moalemi, M. (Eds.). (2018). Ethnofuturismen (R. Voullié, Trans.; Originalausgabe). Merve Verlag.

Balakrishnan, S. (2018). Afropolitanism and the End of Black Nationalism. In G. Delanty (Ed.), Routledge Handbook of Cosmopolitan Studies (2nd ed., pp. 575–585). Routledge.

Begley, S. (2020, June 17). Racial bias skews algorithms widely used to guide patient care. Stat News. https://www.statnews.com/2020/06/17/racial-bias-skews-algorithms-widely-used-to-guide-patient-care/

Benjamin, R. (Ed.). (2019a). Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life. Duke University Press. https://doi.org/10.1215/9781478004493

Benjamin, R. (2019b). Race after technology: Abolitionist tools for the new Jim code. Polity.

Black, E. (2012). IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America’s Most Powerful Corporation (3rd pbk. ed.; Expanded pbk. ed). Dialog Press.

Boxill, B. (2017). Kantian Racism and Kantian Teleology (N. Zack, Ed.; Vol. 1). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190236953.013.46

Braidotti, R. (2019). Posthuman knowledge. Polity.

Branch, T., & Edwards, H. S. (2013, January). A Second Emancipation. Washington Monthly, 45(1/2), 27–29.

Brouwer, E. (2020). Large-Scale Databases and Interoperability in Migration and Border Policies: The Non-Discriminatory Approach of Data Protection. European Public Law, 21(2), 71–92.

Browne, S. (2015). Dark Matters: On the Surveillance of Blackness. Duke University Press. https://doi.org/10.1215/9780822375302

Bundesministerium des Innern, für Bau und Heimat. (2020, October 20). Seehofer: “Keine Rassismus-Studie in der Polizei” [Seehofer: “No racism study within the police”] [Press release]. https://www.bmi.bund.de/SharedDocs/pressemitteilungen/DE/2020/10/keine-studie-rechtsextremismus-polizei.html

Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research, 81, 1–15. http://proceedings.mlr.press/v81/buolamwini18a.html

Carastathis, A. (2016). Intersectionality: Origins, contestations, horizons. University of Nebraska Press.

Chander, S. (2020). Data racism: A new frontier. European Network Against Racism. https://www.enar-eu.org/Data-racism-a-new-frontier

Christian, B. (1987). The Race for Theory. Cultural Critique, 6, 51. https://doi.org/10.2307/1354255

Chude-Sokei, L. O. (2016). The sound of culture: Diaspora and black technopoetics. Wesleyan University Press.

Clifford, D., Graef, I., & Valcke, P. (2019). Pre-formulated Declarations of Data Subject Consent—Citizen-Consumer Empowerment and the Alignment of Data, Consumer and Competition Law Protections. German Law Journal, 20(05), 679–721. https://doi.org/10.1017/glj.2019.56

Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practice. Yale University Press.

Cohen, J. E. (2019). Turning Privacy Inside Out. Theoretical Inquiries in Law, 20(1), 1–31. https://doi.org/10.1515/til-2019-0002

Confessore, N. (2018). Cambridge Analytica and Facebook: The Scandal and the Fallout So Far. The New York Times. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html

Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989(1), 139–167.

Crenshaw, K. (1991). Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color. Stanford Law Review, 43(6), 1241. https://doi.org/10.2307/1229039

Därmann, I. (2020). Undienlichkeit: Gewaltgeschichte und politische Philosophie (1st ed.). Matthes & Seitz.

David, M. (2007). Afrofuturism and Post-Soul Possibility in Black Popular Music. African American Review, 41(4), 695. https://doi.org/10.2307/25426985

De, E. N. (2002). Decolonizing Universality: Postcolonial Theory and the Quandary of Ethical Agency. Diacritics, 32(2), 42–59. https://www.jstor.org/stable/1566286

Deutsche Welle. (2020, June 13). ‘Afro census’ in Germany: Black lives count. DW News. https://www.dw.com/en/afro-census-in-germany-black-lives-count/av-53740117

Diagne, S. B. (2013). On the Postcolonial and the Universal? Rue Descartes, 78(2), 7. https://doi.org/10.3917/rdes.078.0007

DiAngelo, R. J. (2016). What does it mean to be white? Developing white racial literacy (Revised edition). Peter Lang.

DiAngelo, R. J. (2018). White fragility: Why it’s so hard for white people to talk about racism. Beacon Press.

Dratwa, J. (2017). Foreword: Ethical experimentations of security and surveillance as an inquiry into the Open Beta Society. In M. Friedewald, J. P. Burgess, J. Čas, R. Bellanova, & W. Peissl (Eds.), Surveillance, Privacy and Security: Citizens’ Perspectives (1st ed.). Routledge. https://doi.org/10.4324/9781315619309

Dery, M. (1994). Black to the Future: Interviews with Samuel R. Delany, Greg Tate, and Tricia Rose. In M. Dery (Ed.), Flame Wars: The Discourse of Cyberculture (pp. 179–222). Duke University Press. http://post-what.com/2014/09/what-is-afrofuturism-black-to-the-future-by-mark-dery/

Each One Teach One. (2020, January 4). #Afrozensus FAQ. https://eoto-archiv.de/neuigkeiten/afrozensus-faq/

Emancipation. (n.d.). Cambridge Dictionary. https://dictionary.cambridge.org/dictionary/english/emancipation

Erasmus, Z. (2020). Sylvia Wynter’s Theory of the Human: Counter-, not Post-humanist. Theory, Culture & Society, 37(6), 47–65. https://doi.org/10.1177/0263276420936333

Eshun, K. (2003). Further Considerations of Afrofuturism. CR: The New Centennial Review, 3(2), 287–302. https://doi.org/10.1353/ncr.2003.0021

Etter-Lewis, G. (1993). My Soul is My Own: Oral Narratives of African American Women in the Professions. Routledge.

Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor (First Edition). St. Martin’s Press.

European Parliament, Council of the European Union. (2016). Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

Evans, R. J. (2014). The feminists: Women’s emancipation movements in Europe, America and Australasia 1840–1920. Routledge.

Fanon, F. (1952). Black skin, white masks (1st ed., new ed.). Grove Press; distributed by Publishers Group West.

Farkas, L. (2017). Data collection in the field of ethnicity: Analysis and comparative review of equality data collection practices in the European Union [Report]. European Commission, Directorate General for Justice and Consumers. https://ec.europa.eu/info/sites/default/files/data_collection_in_the_field_of_ethnicity.pdf

Feagin, J., & Bennefield, Z. (2014). Systemic racism and U.S. health care. Social Science & Medicine, 103, 7–14. https://doi.org/10.1016/j.socscimed.2013.09.006

Fraser, N. (1990). Rethinking the Public Sphere: A Contribution to the Critique of Actually Existing Democracy. Social Text, 25(26), 56–80. https://doi.org/10.2307/466240

Gaul, S., & Vooren, C. (2020, June 12). Afrozensus: Endlich sichtbar. Die Zeit. https://www.zeit.de/gesellschaft/2020-06/afrozensus-rassismus-deutschland-aufklaerung-schwarze-community

Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies (pp. 167–194). The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009

Gipson, G. (2019). Creating and Imagining Black Futures through Afrofuturism. In A. D. Kosnik & K. P. Feldman (Eds.), #identity (pp. 84–103). University of Michigan Press. https://www.jstor.org/stable/j.ctvndv9md.9

González Fuster, G., & Gutwirth, S. (2017). The legal significance of individual choices about privacy and personal data protection. In M. Friedewald, J. P. Burgess, J. Čas, R. Bellanova, & W. Peissl (Eds.), Surveillance, Privacy and Security: Citizens’ Perspectives. Routledge.

González Fuster, G. (2014). The emergence of personal data protection as a fundamental right of the EU. Springer.

Griffin, C. L. (1996). Angela Y. Davis. In R. W. Leeman (Ed.), African-American orators: A bio-critical sourcebook. Greenwood Press.

Hill, K. (2020, August 3). Wrongfully Accused by an Algorithm. The New York Times. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html

Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2018). Digital Citizenship in a Datafied Society. Polity Press.

Imarisha, W., & Brown, A. M. (2015). Octavia’s Brood: Science Fiction Stories from Social Justice Movements. AK Press.

Jackson, Z. I. (2020). Becoming human: Matter and meaning in an antiblack world. New York University Press.

Marx, K. (1994). ‘On the Jewish Question.’ In J. J. O’Malley (Ed.), Marx: Early Political Writings (1st ed., pp. 28–56). Cambridge University Press. https://doi.org/10.1017/CBO9781139168007.006

McPherson, T. (2012). Why Are the Digital Humanities So White? Or Thinking the Histories of Race and Computation. In M. K. Gold (Ed.), Debates in the Digital Humanities (pp. 139–160). University of Minnesota Press. https://doi.org/10.5749/minnesota/9780816677948.003.0017

Mejias, U. A., & Couldry, N. (2019). Datafication. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1428

Metz, C. (2020, March 23). There Is a Racial Divide in Speech-Recognition Systems, Researchers Say. The New York Times. https://www.nytimes.com/2020/03/23/technology/speech-recognition-bias-apple-amazon-google.html

Metz, R. (2020). Portland passes broadest facial recognition ban in the US. CNN. https://edition.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html

Mignolo, W. D. (2015). Sylvia Wynter: What Does It Mean to Be Human? In K. McKittrick (Ed.), Sylvia Wynter (pp. 106–123). Duke University Press. https://doi.org/10.1215/9780822375852-004

Mill, J. S. (1859). On Liberty. Philosophical Library/Open Road.

Mills, C. W. (2017). Black Rights/White Wrongs: The Critique of Racial Liberalism. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780190245412.001.0001

Morrison, T. (2019). The source of self-regard: Selected essays, speeches, and meditations (First edition). Alfred A. Knopf.

Moten, F. (2018). The Universal Machine. Duke University Press.

Nazir, F. (2018). Humanism with a Difference: Universality and Cultural Difference in Postcolonial Theory. Journal of Contemporary Poetics, 2(1), 1–18. https://doi.org/10.54487/jcp.v2i1.774

Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press.

Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Okorafor, N. (2019, October 19). Africanfuturism Defined. Nnedi’s Wahala Zone Blog. http://nnedi.blogspot.com/2019/10/africanfuturism-defined.html

Paquette, E. (2020). Universal emancipation: Race beyond Badiou. University of Minnesota Press.

Pateman, C., & Mills, C. W. (2013). The Contract and Domination. https://public.ebookcentral.proquest.com/choice/publicfullrecord.aspx?p=1180907

Peña, P., & Varon, J. (2019, October 3). A feminist approach to consent in digital technologies. Dones Tech. https://donestech.net/noticia/feminist-approach-consent-digital-technologies-paz-pena-and-joana-varon

Quan, H. L. T. (2017). ‘It’s Hard to Stop Rebels That Time Travel’: Democratic Living and the Radical Reimagining of Old Worlds. In G. T. Johnson & A. Lubin (Eds.), Futures of Black Radicalism. Verso.

Raval, N. (2019). An Agenda for Decolonizing Data Science. Spheres - Journal for Digital Cultures, #5 Spectres of AI. https://mediarep.org/bitstream/handle/doc/14422/spheres_5_0202_Raval_Agenda-for-Decolonizing-Data-Science.pdf?sequence=1

Roy, A. (1999). The Cost of Living (Modern Library Pbk. ed). Modern Library.

Rule, J. B. (2019). Contextual Integrity and its Discontents: A Critique of Helen Nissenbaum’s Normative Arguments. Policy & Internet, 11(3), 260–279. https://doi.org/10.1002/poi3.215

Sala-Molins, L. (2006). Dark Side of the Light: Slavery and the French Enlightenment (J. Conteh-Morgan, Trans.). University of Minnesota Press.

Singletary, M. (2020). Credit scores are supposed to be race-neutral. That’s impossible. The Washington Post.

Solove, D. J. (2013). Introduction: Privacy Self-Management and the Consent Dilemma. Harvard Law Review, 126, 1880–1903.

Somody, B., Szabó, M. D., & Székely, I. (2017). Moving away from the security-privacy trade-off: The use of the test of proportionality in decision support. In M. Friedewald, J. P. Burgess, J. Čas, R. Bellanova, & W. Peissl (Eds.), Surveillance, Privacy and Security: Citizens’ Perspectives (1st ed.). Routledge. https://doi.org/10.4324/9781315619309

Steinskog, E. (2018). Afrofuturism and Black Sound Studies. Springer International Publishing. https://doi.org/10.1007/978-3-319-66041-7

Strauß, S. (2019). Privacy and Identity in a networked society: Refining privacy impact assessment. Routledge.

Thompson, D. (2016). The Schematic State: Race, Transnationalism, and the Politics of the Census. Cambridge University Press. https://doi.org/10.1017/CBO9781316442951

Tisné, M. (2020). The Data Delusion [White Paper]. Stanford Cyber Policy Center. https://cyber.fsi.stanford.edu/publication/data-delusion

Tuhiwai Smith, L. (1999). Decolonizing Methodologies. Zed Books.

Valentini, L. (2012). Ideal vs. Non-ideal Theory: A Conceptual Map: Ideal vs Non-ideal Theory. Philosophy Compass, 7(9), 654–664. https://doi.org/10.1111/j.1747-9991.2012.00500.x

Van der Sloot, B. (2017). Do Groups Have a Right to Protect Their Group Interest in Privacy and Should They? Peeling the Onion of Rights and Interests Protected Under Article 8 ECHR. In L. Taylor, L. Floridi, & B. V. Sloot (Eds.), Group Privacy: New Challenges of Data Technologies. Springer International Publishing.

Wabuke, H. (2020). Afrofuturism, Africanfuturism, and the Language of Black Speculative Literature. Los Angeles Review of Books. https://lareviewofbooks.org/article/afrofuturism-africanfuturism-and-the-language-of-black-speculative-literature/

Walter, M., & Suina, M. (2019). Indigenous data, indigenous methodologies and indigenous data sovereignty. International Journal of Social Research Methodology, 22(3), 233–243. https://doi.org/10.1080/13645579.2018.1531228

Watson-Daniels, J., Milner, Y., Triplett, N., Headen, I., Day, D., Bailey, Z., Styles, M., Clinton, L., Andrews, C., Wilson, M., Ezeokoli, N., Jebbett Bullard, S., & Mason-Brown, L. (2020). Data for Black Lives COVID-19 Movement Pulsecheck and Roundtable Report. Data for Black Lives. http://d4bl.org/reports

Weheliye, A. G. (2014). Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human. Duke University Press.

Wilderson, F. B., Hartman, S., Martinot, S., & Sexton, J. (2019). Afro-Pessimism: An Introduction (H. J. Spillers, Ed.). https://libcom.org/library/afro-pessimism-introduction

Womack, Y. (2013). Afrofuturism: The world of black sci-fi and fantasy culture (First edition). Chicago Review Press.

Wynter, S. (1992). Beyond the Categories of the Master Conception: The Counterdoctrine of the Jamesian Poiesis. In P. Henry & P. Buhle (Eds.), C. L. R. James’s Caribbean (pp. 63–91). Duke University Press. https://doi.org/10.1215/9780822382386-009

Wynter, S. (1994). No Humans Involved: An Open Letter to My Colleagues. Forum N.H.I.: Knowledge for the 21st Century, 1(1), 42–73.

Wynter, S. (1995). 1492: A New World View. In V. L. Hyatt & R. M. Nettleford (Eds.), Race, discourse, and the origin of the Americas: A new world view. Smithsonian Institution Press.

Wynter, S. (2003). Unsettling the Coloniality of Being/Power/Truth/Freedom: Towards the Human, After Man, Its Overrepresentation--An Argument. CR: The New Centennial Review, 3(3), 257–337. https://doi.org/10.1353/ncr.2004.0015

Youngquist, P. (2016). A pure solar world: Sun Ra and the birth of Afrofuturism (First edition). University of Texas Press.

Footnotes

1. Sociologist Robin DiAngelo (2016) differentiates between racist prejudice, racist discrimination, and racism. Whereas prejudice is an individual “learned prejudgment” (p. 46) and racist discrimination consists of “action based upon prejudice” (p. 52), “racism […] occurs when a racial group’s prejudice is backed by legal authority and institutional control” (DiAngelo, 2018, p. 21). Understanding racism as a system of oppression, rather than as discrimination or prejudice at the individual level, is thus paramount for counteracting it.

2. The tricky thing about digitally enabled power imbalances is that they are hidden behind seemingly objective methodologies and epistemic claims. What Gillespie (2014) termed “[t]he promise of algorithmic objectivity” (p. 179) is thus a legitimacy derived from the apparent absence of human subjectivity. This, in turn, leads to the perception of an automated world of serene objectivity, impartiality, and even fairness that is difficult to challenge.

3. The term Afrofuturism has recently been criticised for its Western lens and its decentering of African imaginaries (see Okorafor, 2019; Wabuke, 2020). Indeed, the critique that academic production centres African-American rather than African narratives and imaginaries is a valid and important one. More specifically, Okorafor (2019) explicitly links her terminological choice of Africanfuturism and Africanjujuism to her artistic and writing practice and its mismatch with Afrofuturist themes. Building on Okorafor’s argument and her work, Wabuke (2020) proposes the term “Black speculative fiction” to encompass all three terminologies, and draws attention to the prominence in Afrofuturist analyses of Mark Dery (1994), the white scholar who coined the term. While I agree that uncovering the “white gaze” (Wabuke, 2020) potentially inherent in Afrofuturism and analyses thereof is important, I would also contend that a singular focus on Dery’s (1994) essay leaves little room for Black musicians such as Sun Ra, Black feminist writers such as Octavia Butler, or Black scholars such as Kodwo Eshun, and thus risks disregarding their work, their struggle, and the richness with which they have given the term meaning of their own. Ultimately, the question of who and what is African in Afrofuturism is beyond the scope of this article, and I am by no means attempting to speak of a universal (Black) experience. A further analysis might consult the concept of “Afropolitanism” (see Balakrishnan, 2018 for an overview). Last but not least, the choice of Afrofuturism in this article rather than Africanfuturism or Africanjujuism is also a pragmatic one: there simply exist more scholarly analyses to draw upon. Hopefully, this will change soon.

4. Here it is important to note that the term “theory” is not entirely innocent. Christian (1987) provides a thoughtful problematisation of the theorising process and of what is at stake when theory serves a privileged ivory tower by claiming universality. Indeed, where the aim is to make literature and art useful, usable, applicable, and manageable, there is a risk of erasure as well as dominance. At the same time, theorising can open new avenues of inquiry when it “is based on our multiplicity of experiences” (Christian, 1987, p. 60). Similarly, Ahmed (2017, p. 8) draws attention to the arguments by which theory is often placed outside of politics and empirical work, while stressing that, in order to thoroughly understand racism and sexism, we may have to question as well as go beyond the pre-defined boundaries of theory. My aim is thus not to lose the radical potential inherent in Afrofuturism, that is, a different way of knowing, and not to co-opt it by losing nuance, but rather to start conversations. Or, in Arundhati Roy’s (1999) words, “[t]o never simplify what is complicated or complicate what is simple” (p. 105).

5. Similar to Fraser’s (1990) illustration of counter-publics.

6. Several authors argue that empowerment and emancipation from oppressive and exclusionary demarcations of who is and who is not human ought to stem from entirely different avenues. Jackson (2020), for instance, advances an approach that “neither rel[ies] on animal abjection to define being (human) nor reestablish ‘human recognition’ within liberal humanism as an antidote to racialization” (p. 1), whereas Braidotti (2019) probes the potential of posthuman subjects. Acknowledging the inevitable linkage between the concepts of the human and the subject, my usage of the terms subject and subjectification relies on Sylvia Wynter’s (1992; 2003) writings on the human. Erasmus (2020) points out that Wynter’s “conception of the human being [is] a hybrid being: both biological/organic and symbolic/myth-making” (p. 48). Wynter thereby refuses the traditional subject/object binary from which to understand the human (Erasmus, 2020, p. 61). Furthermore, Mignolo (2015) illustrates how “Wynter [2003] refuses to embrace the entity of the Human independently of the epistemic categories and concepts that created it” (p. 108), that is to say, its colonial origins. In my aim to re-think the subject in data subject, I therefore do not seek to replicate the traditional liberal creation of the hu(man), or the subject for that matter.

7. In Germany, an example of this was the public debate about the proposal for a study investigating racism within law enforcement. The effort was ultimately rejected by the responsible Minister of the Interior, Building and Community, Horst Seehofer, who maintained that the very aim of such an investigation was itself a priori biased (Bundesministerium des Innern, für Bau und Heimat, 2020).

8. Throughout the article, I understand emancipation as political self-determination and the political subject’s liberation from systems of colonial, racial, and patriarchal oppression and exploitation. Building on Fanon (1952/2008) and Wynter (1992; 1995; 2003), I contend that emancipation can have a distinct racial character (see Paquette, 2020). This clarification is important since the term emancipation can also be associated with liberalism and the Enlightenment (see Israel, 2006), arguably the precise target of Fanon’s and Wynter’s critique and, moreover, contrary to this article’s understanding of emancipation. Importantly, “for Wynter, universal emancipation requires that one address multiple forms of oppression through a multiplicity of forms of resistance” (Paquette, 2020, p. 162).

9. Within postcolonial thought, theory’s aim for, and claim to, universality is contested (see De, 2002; Diagne, 2013; Nazir, 2018). Furthermore, Moten (2018) provides a salient critique of the universal within the context of Blackness. Therefore, the use of the term universality in this article acknowledges “how whiteness matures and ascends the throne of universalism by maintaining its powers to describe and to enforce descriptions” (Morrison, 2019, p. 273).

10. For instance, to analyse the use of data with regard to other diaspora demographics or to specifically vulnerable groups such as refugees (see Brouwer, 2020).
