Abstract

The Black Technical Object retraces and problematizes the entanglements between technology, data and race. Ramon Amaro’s genealogy of racial sorting argues that machine learning is preconditioned by a prototypical Whiteness that relegates Black beings to the status of objects measured against a White norm. In raising salient questions about technological, political and legal agency – which intersect with various modes of critique of algorithmic governance – I explore three main contributions the book makes to international law. First, machine learning and algorithmic decision-making are increasingly deployed in different domains of international law. The Black Technical Object speaks to emergent strands of scholarship that map how these developments reconfigure and disrupt key legal concepts and categories. Second, both the individual ‘subject’ and the collective ‘public’ of (international) law come into formation differently through these technological systems. Amaro helps us engage with these modalities of subject-making and understand their lineages and political stakes. Finally, the book provides a distinct political perspective of refusal and resistance – a refusal of recognition to resist the reinforcement of racial constructs (re)produced in the digital space.

1 Introduction

In international law today, a growing body of literature is engaging with the racialization (re)produced in our digitalized societies. To counter racial discrimination and bias, so the argument tends to go, the computational milieu and algorithmic decision-making processes must go through greater diversification and inclusion of data. But what if, as Ramon Amaro asks, the legibility and recognition of the subject – taken for granted both in the analogue society and in the digital society that mirrors or reflects it – were dissolved by way of generative misreadings of the liberal ‘subject’? What if, rather than focusing on better representing Black subjects, we orientated our critical attention to an emancipatory misrecognition of Black subjectivities that machine-learning systems and artificial intelligence (AI) technologies perform? Amaro shows how such technologies are complicit in maintaining and reinforcing racial hierarchies. He embeds these developments into a longer history of philosophical, scientific and technological constructions of race. Yet, in addition to traditional critiques of the perpetuation of racism in the digital space, Amaro explores how Black existence is represented and manipulated within technical frameworks and how it could be actualized differently, if only Black life and agency were affirmed and repossessed.

I first came across Amaro’s work in 2020 via e-flux – an online platform that spans numerous strains of critical discourse in art, architecture, film and theory and connects many of the most significant art institutions with audiences around the world.1 I was immediately hooked by the theoretically sophisticated and original interventions he was articulating, and happy to discover two further contributions to edited volumes that came out around the same time in More-than-Human and Atlas of Anomalous AI.2 Aware that his book was due to come out in 2020, I hit the pre-order button and suggested discussing it as part of the Black Anthropocene Working Group,3 as it promised to explore the ‘incompatible relation between machine learning, data, and race’4 – questions that lie at the heart of our interest in the meaning of Blackness, ontological politics and the formation and deformation of the ‘subject’ through emerging technologies.5 Due to a paper shortage, however, the book’s publication was significantly delayed. While Amaro wrote The Black Technical Object as a lecturer in art and visual cultures of the global South at University College London, expanding the doctoral dissertation he wrote at Goldsmiths, the book was only published after Amaro had left academia to become a senior researcher in digital cultures at the Nieuwe Instituut in Rotterdam – the national institute for architecture, design and digital culture in the Netherlands – a position he holds to this day.

When the book finally came out in 2022, I knew I was in for an inspiring yet challenging treat: ‘What if the Black technical object was to interact with the logics of machine learning beyond the desire for recognition and reinforcement of its existing rudimentary operations?’, Amaro asks.6 ‘What if, as Stefano Harney and Fred Moten argue, the Black technical object were to take a right of refusal to racial perception, and aspire to be that which is out of reach of the negating factors of race?’.7 These are the main questions that Amaro seeks to answer in The Black Technical Object. The formulation of these questions matters, since the ‘what if’ – in line also with the title of Chapter 1: ‘As If: Critical Thoughts on Gaining Access to Black Aspiration’ – hints at the speculative nature of the argument (a point to which I will return). The terms of these questions – the ‘Black technical object’, the ‘logics of machine learning’, and the ‘desire for recognition’ – are the main themes around which the argumentative thread of the book revolves.

The ‘Black technical object’ refers to the convergence between the Black object, on the one hand, which Amaro refers to as being dually fragmented – the fragmented body as a ‘racialized object’ and the ‘fragmented psyche’ of Black beings or non-White subjectivities – and the ‘technical object’, on the other hand.8 As Amaro puts it, the book traces an ‘unwitting link between Black pathology and the technical object, what – in its convergence – we might call the “Black technical object”’.9 In his engagement with what he calls ‘Black pathology’, Amaro draws on Frantz Fanon’s understanding of the operation of race and its impact on the psyche of racialized beings.10 In Black Skin, White Masks, Fanon terms this process ‘sociogeny’ – building on, yet also departing from, Sigmund Freud’s work – referring to the ‘product of the historical realities of anti-Blackness, whose outcome is the epidermalization, or internalization, of the racial being’s psychic inferiority’.11 As a result of social factors such as colonialism and racism, the racialized being internalizes how they are perceived by the dominant Other and the dominant order – a sense that W.E.B. Du Bois originally referred to as ‘double consciousness’ in The Souls of Black Folk in relation to the lived experience of African Americans.12 It is, as such, the sociogenesis of race that conditions individual and collective psychic individuation, thereby entrenching constructed forms of racialized social objectification that lead to conditions of ‘psychic fragmentation’ between body and mind and between internal and external perceptions of oneself.13

This is the ‘Black pathology’ that Amaro takes as the starting point to explore its encounter with, and effects on, machinic alienation. Staying true to Fanon’s and Du Bois’s sensibilities, Amaro rejects and refuses the desire for representation, for recognition and for inclusion into a societal order that creates and sustains this psychic fragmentation of racialized beings, and expands this politics of refusal against the backdrop of machine learning. The premise is thus simple: ‘The technology of machine learning is preconditioned by a process of human relations that has already conceived of Black beings as objects among other objects, following Frantz Fanon’.14 This conceptual apparatus composes the first Act of the book, constituted of Chapter 1, ‘As If’, and Chapter 2, ‘Sociogeny’. The following five chapters in Act II and Act III – a stylistic choice of literary genre that leaves the reader wondering whether Amaro wants the book to read like a play, which is perhaps better aligned to the speculative gesture of the argument itself – offer a genealogy of racial sorting. Here, Amaro investigates the entanglements between technology, data and race to show how machine learning is preconditioned by a prototypical Whiteness that relegates Black beings to the status of objects measured against a White norm, in line with the works of Frantz Fanon and Sylvia Wynter.15 Whiteness, in this techno-ontological infrastructure, is always and already positioned as the ‘baseline of social and phenotypical measurement’.16

Readers of the European Journal of International Law (EJIL) may wonder what the ‘Black technical object’ might have to do with law and international law specifically. I see at least three strands of discussions in and of (international) law that the book engages. First, machine learning and algorithmic decision-making are increasingly deployed in different domains of international law. The Black Technical Object speaks to the emergent strands of scholarship that map how these developments reconfigure and disrupt key legal concepts and categories.17 Second, both the individual ‘subject’ and the collective ‘public’ of (international) law come into formation differently through these technological systems.18 Amaro’s writing helps us to engage with these new modalities of subject-making and understand their lineages and political stakes. Finally, the book provides a distinct political perspective of refusal and resistance – a refusal of recognition to resist the reinforcement of racial constructs. This raises salient questions about political, legal and technological agency that intersect with various strands of critical international legal scholarship.19 As such, The Black Technical Object speaks to these three emergent debates in international law and provides important insights that run against the grain of both international law’s critical and regulatory repertoires.

2 The Belief in the ‘Universal Computational Gaze’: Countering Racial Biases in Machine Learning

Much of the current debate on the racial inequalities engendered by machine learning, facial recognition and algorithmic decision-making focuses on ‘de-biasing’ these systems by including more diverse training data or expunging specific protected categories of data from the algorithmic calculus. This is echoed in calls to enhance the diversity of coding communities and the data on which their algorithms feed – an attempt to reverse how ‘the White world buil[t] technology in its own image’.20 Amaro draws on computer scientist Joy Buolamwini and her work on countering racial biases in machine learning perception, but we could add authors like Ruha Benjamin who have long called for the inclusion and recognition of Black subjects against the racial discrimination that computer codes reproduce.21 In his foreword in EJIL, Eyal Benvenisti likewise underlined the need for algorithms to be made more ‘inclusive’ or ‘transparent’ to counter prevailing ‘biases’.22 This problematization points to a logic of reproduction: ‘While the algorithm itself does not comprehend, at least computationally, the complexities of race or racism, it can be perceived as racist when outputs simulate existing racialized human dynamics’.23 In this sense, racial discrimination is often framed as ‘algorithmic error’ or ‘computational inefficiency’ that can and must be corrected through further investment in technological solutions, by ‘de-biasing’ or reverse-engineering the computational codes.

Yet, pointing to the severe political stakes in these operations, Amaro guides us in precisely the opposite direction, arguing that a ‘call to make Black technical objects compatible with computer vision algorithms risks the further reduction of the lived potentiality of Black individuals and collectives’.24 Solutions premised on the inclusion, recognition and participation of excluded bodies in computer vision algorithms reinforce, in other words, ‘the presupposition that coherence and detectability are necessary components of techno-human relations’.25 Amaro rejects this position, which proceeds as if computational logics could be freed from racial logics – something that would require ‘dismantling the [social] racial schema in order to rebuild a new relation between the Black psyche and technology’.26 Against this backdrop, in one of the book’s key lines, he holds that ‘[t]o merely include a representational object in a computational milieu that has already positioned the white object as the prototypical characteristic, catalyzes disruption superficially’: the ‘white object remains whole, while the object of difference is seen as alienated, fragmented, and lacking in comparison’.27

Seen this way, attempts to clean or de-bias the computational milieu are a trap: an erasure rather than an affirmation of difference, an enclosure rather than an opening. It is the belief in the ‘universal computational gaze’ – and the coherence and order of the computational milieu in the first place – that clashes with ‘the dynamism of Black life’.28 Black existence, instead, strives for incoherence and undetectability, akin to ‘fugitive’ modes of being and survival in oppressive socio-legal orders.29 Translated to more familiar terms in international law, this raises the question of what a right to self-determination (can) entail(s) in relation to algorithmic designation. Does it entail a right to be seen or sensed correctly or appropriately? Or does it entail a claim to potentialities and political possibilities residing outside the coherent computational calculus?30 Instead of pretending to revalorize Black life from within the racialized logics of the computational milieu, the Black technical object generates new possibilities outside of the prototypical logics of machine learning that are entangled with the substance of race. In doing so, it ‘dislodges both the ontological and functional process of machine perception from its roots in substantialist metaphysics’.31

3 The Formation and Deformation of the Individual ‘Subject’ and the Collective ‘Public’ in the Computational Milieu

The nature of the ‘dislodging’ of machine perception provides a distinct perspective on prospects of subjectivity and collectivity – or the formation and deformation of the individual ‘subject’ and the collective ‘public’ in the computational milieu. Amaro invokes the notion of ‘sociogeny’ developed by Fanon and Wynter, where anti-Blackness is a historical reality that results in ‘the epidermalization, or internalization, of the racial being’s psychic inferiority’.32 I understand this claim about ‘dislodging’ to refer to the disruption of this psychic inferiority, which demands a confrontation with the duress of the ‘racial subjection that “divides the soul”’ – drawing on Du Bois – to achieve a re-evaluation of the self: a prioritization of ‘self-actualization’ at the psychic level.33 I emphasize the ‘psychic’ here as Amaro is not advocating merely for an optimization of the design of machine learning as such, but arguing that different modes of becoming can be enacted by an actualization of and within the self when a ‘subject’ is interpellated by the racialized logics of machine learning.

It is here that the argument about Black subject-formation – as a ‘process of ongoing self-making built upon the efficacy of Black consciousness’ – comes to the fore to affirm ‘the potential of self-formation for a Black sense of the self’.34 This Black self-actualization enables a mode of subject-formation that exceeds and exhausts the modernist, liberal ideal of the legal ‘subject’ as one that is always and already actualized in the image of the White norm. Tying it back to the computational milieu, Amaro refers to ‘sociogeny’ as the ‘context through which the generation of Black being can be conceptualized’ when aiming to ‘disrupt[] the durability of racial sorting and the negation of Black being in techno-racial ecologies’.35 In this perspective on ‘sociogeny’, he therefore sees the possibility to ‘provide alternative ways to frame the contemporary relationship between race and machine learning’.

Yet, rather than rejecting the processes of machine learning as such, the argument cuts deeper and considers technology as a ‘means by which the colonized body can auto-generate new forms of radical subjectivity’ and gain ‘new methods to liberate Black psychic generation from negating forms of power’.36 While this refers to the promise of machine learning to fracture or displace existing racialized representational categories and the forms of social sorting tied to them, it is not entirely clear how the Black technical object generates new modes of self-formation or auto-actualization through machine learning. On the one hand, Amaro perceives what he calls a convergence between machine learning and the Black psyche, since both are subjected to, and expressions of, pre-established racial categorizations entangled with practices of power. On the other hand, however, he sees machine learning as enabling a dissonance between self-perception and any externally constructed view of Black life, which is overdetermined from the outside, as Fanon would have it.37 As such, it is at the moment of interpellation between the Black object and the technical machine that ‘new iterations of the self are generated, exceeding the reductions of representation and visibility’.38 This reveals the ‘false assumption of coherence’ that the computational milieu pursues – an assumption that merely mirrors and reproduces the ‘false assumption of coherence’ underpinning the social order through ‘fictive substances of race and racialization’.39 For Amaro, then, as much as this is the case in society or the world as such, we must take as a starting point that machine learning is ‘always already informed by racial classification and preemptive sorting’.40 Rather than pretending or promising to overcome this reality, we must affirm it to explore what alternative modes of being – individually and collectively – can emerge from within this order of things by disrupting rather than ‘correcting’ it from within. This, I believe Amaro to argue, opens up a possibility to ‘set the Black technical object free’.41 It is here that we reach, in my view, the most interesting, yet perhaps also the most opaque, part of the book.

4 Refusing Recognition, or: ‘If Only We Were Able to Set the Black Technical Object Free’

In the final chapter, The Black Technical Object outlines a distinct critical project of refusal and resistance. In Act III, Chapter 7, ‘A Correction of Metaphysics and the Concept of Black Substance’, Amaro titles one section ‘If Only We Were Able to Set the Black Technical Object Free’.42 Here, he voices what I believe to be his main aspiration: ‘[i]f only we could set Black life – as well as machine learning technology – free’.43 This ‘if only’ returns us to the speculative nature of the overall argument of the book – a speculative turn that is also increasingly taking hold in (international) law.44 Politics of refusal, as a distinctive mode of critique, are indeed speculative in orientation, which distinguishes them from affirmative and negative critiques (as I explored elsewhere in relation to practices of ‘disordering international law’).45 The freedom Amaro aspires to – of and for the Black technical object – is orientated towards both the Black object and the technical object, which he sees as subjected to the same oppressive representation of racial categorizations. The capacity to auto-actualize the Black object turns it into a ‘subject’, yet not a liberal subject modelled on the normatively White subjectivity that exhausts the possibility of being in this modernist anti-Black world.46 Amaro conceives of such acts – refusing to be included in the technical object as or in the image of the liberal ‘subject’, which does not accommodate Black life in the first place, and self-actualizing a distinctive Black consciousness that rejects the epidermalization of objectified inferiority – as forms of ‘techno-resistance’.47

It remains difficult to grasp how Amaro concretizes the actual refusal to be included in the machine to reject Black objectification. How can one prevent one’s data from becoming part of the techno-machinery of data consumption and commodification, which is part and parcel of what Harney and Moten call the ‘compulsion of logistical capitalism’?48 Does this refusal not risk overburdening racialized individuals by demanding that they prevent the internalization of a socially constructed inferiority, whilst resisting demands from and upon the social order that objectifies them? Amaro seems to be aware of these tensions when reckoning that ‘[t]his move, however, necessitates a pathological viewpoint that resists the temptations of representation and revision in computational culture’.49 He offers one example of where he sees potential for emancipation and freedom, suggesting that the ‘power of contingency’ – which he associates with the dynamics of Blackness or Black life – can emerge in machine learning from the latter’s ‘capacity to fail’ in its operation.50 This ‘power of error’ can join with the ‘power of misrecognition’ of Black life to interrupt and distract the order to which the computational milieu aspires.51 Here, Amaro invites us to think of the potential of errors and of misrecognition – those failures of the system that he sees as moments evading White comprehension, akin to the dynamics of Blackness that thrive outside of, or in the fugitivity of, governance.

Indeed, Amaro finds emancipatory power in the potentiality of self-actualization of Blackness in the computational milieu, in those very moments, those micro-seconds, when the ‘computational logic cannot be comprehended by humans’, opening up ‘something that exists in our world outside of human cognition, of white comprehension, and hence lives outside of the understanding of bias’.52 While this does not operate functionally, it does ‘provide potential for expression that resides in the gaps of existing nontechnical knowledge for the generation of new meaning’. It is in these gaps, these moments of absence, of failure and of misrecognition, that we find a distinct and radical opening of contingency. This is not the agency of awareness so often celebrated in international law’s critical canon,53 as Fleur Johns observed,54 but perhaps its opposite – an engagement with contingency through and within the dissonant and illegible; a contingency that does not strive for its revelation or resolution.

5 Conclusion: A ‘Self-affirmative and Optimistic View of Black Life’

What emerges from the analysis – coming back to the speculative – is a liberation of the Black psyche through the imagining of socialities or modes of being outside of the categories and categorizations of being that are already pre-given, and taken as a given, in this anti-Black world. The aspirational, the speculative and the figurative appear here as the only way to seek freedom of and for the ‘Black technical object’. This might explain why Amaro ends with a quantum understanding of his position, specifying that ‘this techno-resistance is a quantum generation of the self, a consistency in and of itself that thrives within as well as in excess of modes of domination’.55 While the Black technical object might be ‘externally positioned outside of ontology’, it is in ‘its pre-individuated state’ – before, in other words, the ontological inferiority imposed on Black beings in the modernist world as we know it – that potential exists ‘prior to the amplification of machinic perception or the colonial logics of racial substance’.56 The Black technical object, Amaro concludes, thereby ‘prefigures the constitution of any white prototypicality and instead resides in a continual process of becoming that which is fictively perceived as an imposition of ontological truth’.57 From this, a ‘self-affirmative and optimistic view of Black life’ emerges, where Black life is seen as ‘generative, affirmative, and fundamental to a technically mediated life’.58

While the book is certainly not easy to read or digest, it offers an innovative view of the risks that algorithms and machine learning pose to racialized beings and flips the script on the current tendency to call for better awareness, recognition and inclusion of racial difference, calling instead for a refusal of racial constructs and their reinforcement through algorithmic logics. There are important lessons to be learned from its questioning of the premises, assumptions and terms of the debate, lessons that deserve to be (re)considered by legal scholars and practitioners from across various fields of (international) law – from specific questions of AI regulation or non-discrimination law to general writing on global governance by data.59 The book also contributes to a growing literature on regimes of visibility and politics of refusal in (international) law,60 and the emerging trend of scholarship on speculative legal theory,61 by inviting readers to consider figurative ways of becoming and acting in algorithmic times against the backdrop of an anti-Black world. Finally, the openings of the book render us more attuned to the beings, experiences and aspirations beyond the liberal promises of formal equality and emancipation. Ultimately, the book reads as an invitation to consider options often unthought in our field of scholarship and practice: what if by going against the grain of demanding more representation, more inclusion, more recognition of those subjected to, and objectified by, a predominantly White socio-legal and techno-legal order, we could revisit, refuse and reject the very terms and processes that hold this order together?

Let us then conclude by tying this to the theme of international law with Fleur Johns, who reckons: ‘What if? Those two little words can, and have, carried international lawyers so far, so high. What if the actual world with which we feel compelled to grapple – in all its inequality, intolerability, and inertia – might quite plausibly have been, and might yet be, otherwise?’.62 Contrary to Johns, who wants ‘to steel [her]self against the appeal of the “what if?”’, Amaro dwells with the ‘as if’. He does so while being conscious of and foregrounding that ‘there are many modes of power and hierarchy prevailing on the global plane that insist and thrive upon contingency’ – as he shows is also the case with machine learning – thereby suggesting a ‘“what if?” that one could direct against the standard versions of the “what if?” in international legal writing’, as Johns invites us to consider.63 Amaro’s ‘what if’ is not orientated towards a revelation of contingency – invested in making visible invisibilized forms of agency, paths not taken, or determinate courses of action that demand political and legal re-actions – but lingers with indeterminacy as an onto-epistemological condition that ought not to be resolved or reduced. Contingency, in this take, works with speculation rather than historicization, with figuration rather than pre-figuration, with misrecognition rather than recognition.

Footnotes

1

R. Amaro, As If (2019); R. Amaro, Threshold Value (2020); R. Amaro and M. Khan, Towards Black Individuation and a Calculus of Variations (2020).

2

Amaro, ‘As If’, in A. Jaque, M. Otero Verzier and L. Pietroiusti (eds), More-than-Human (2020) 300; Amaro, ‘Machine Learning, Surveillance and the Politics of Visibility’, in B. Vickers and K. Allado-McDowell (eds), Atlas of Anomalous AI (2021) 152.

3

‘Black Anthropocene Working Group’, David Chandler, available at www.davidchandler.org/black-anthropocene-working-group.

4

R. Amaro, The Black Technical Object: On Machine Learning and the Aspiration of Black Being (2022), at 13.

5

The book we discussed in the prior session of the Black Anthropocene Working Group was S. Browne, Dark Matters: On the Surveillance of Blackness (2015), which shows how contemporary surveillance technologies and practices are informed by the long history of racial formation and by methods of policing Black life under slavery. Black Anthropocene Working Group, supra note 3.

6

Amaro, supra note 4, at 14.

7

Ibid., at 15.

8

Ibid., at 26, 47.

9

Ibid., at 46.

10

F. Fanon, Black Skin, White Masks (1952).

11

Amaro, supra note 4, at 70.

12

W.E.B. Du Bois, The Souls of Black Folk (1903).

13

Amaro, supra note 4, at 47.

14

Ibid., at 13.

15

Fanon, supra note 10; and Wynter, ‘Human Being as Noun? Or Being Human as Praxis? Towards the Autopoietic Turn/Overturn: A Manifesto’ (unpublished essay, 2007), available at https://bcrw.barnard.edu/wp-content/uploads/2015/10/Wynter_TheAutopoeticTurn.pdf.

16

Amaro, supra note 4, at 29.

17

Johns, ‘Data, Detection, and the Redistribution of the Sensible in International Law’, 111(1) American Journal of International Law (2017) 57; Kingsbury and Maisley, ‘Infrastructures and Laws: Publics and Publicness’, 17 Annual Review of Law and Social Science (2021) 353; Van Den Meerssche, ‘Virtual Borders: International Law and the Elusive Inequalities of Algorithmic Association’, 33(1) European Journal of International Law (EJIL) (2022) 171; R. Mignot-Mahdavi, Drones and International Law: A Techno-Legal Machinery (2023).

18

Petersmann and Van Den Meerssche, ‘On Phantom Publics, Clusters and Collectives: Be(com)ing Subject in Algorithmic Times’, 39 AI and Society (2024) 107.

19

Chandler, ‘The Black Anthropocene: And the End(s) of the Constitutionalizing Project’, 15(1) Journal of Human Rights and the Environment (2024) 37; Petersmann, ‘In the Break (of Rights and Representation): Sociality beyond the Non/Human Subject’, 28(8–9) International Journal of Human Rights (2023) 1279.

20

Amaro, supra note 4, at 71.

21

J. Buolamwini, ‘Aspire Mirror’, available at www.aspiremirror.com; R. Benjamin, Race after Technology: Abolitionist Tools for the New Jim Code (2019).

22

Benvenisti, ‘Upholding Democracy amid the Challenges of New Technology: What Role for the Law of Global Governance?’, 29 EJIL (2018) 9, at 58ff. This is a common refrain in legal scholarship. Cf. Endicott and Yeung, ‘The Death of Law? Computationally Personalized Norms and the Rule of Law’, 72 University of Toronto Law Journal (2021) 373.

23

Amaro, supra note 4, at 21.

24

Ibid., at 48.

25

Ibid.

26

Ibid., at 93.

27

Ibid., at 52–53.

28

Ibid., at 48.

29

Best and Hartman, ‘Fugitive Justice’, 92(1) Representations (2005) 1.

30

Lahmann, ‘Algorithmic Warfare, Spontaneity, and the Denial of the Right to Self-Determination’, European Journal of Legal Studies (forthcoming).

31

Amaro, supra note 4, at 58.

32

Ibid., at 70.

33

Ibid., at 74, 75. On Du Bois, see supra note 12.

34

Ibid., at 76, 86.

35

Ibid., at 92.

36

Ibid., at 27, 34.

37

As Fanon put it: ‘I am overdetermined from the outside. I am a slave not to the “idea” others have of me, but to my appearance’. Fanon, supra note 10, at 95.

38

Amaro, supra note 4, at 61.

39

Ibid.

40

Ibid., at 103.

41

Ibid., at 218.

42

Ibid., at 218.

43

Ibid.

44

D. Cooper, ‘Prefigurative Law Reform: Creating a New Research Methodology of Radical Change’, Critical Legal Thinking (3 March 2023), available at https://criticallegalthinking.com/2023/03/03/prefigurative-law-reform-creating-a-new-research-methodology-of-radical-change/; Cohen and Morgan, ‘Prefigurative Legality’, 48(3) Law and Social Inquiry (2023) 1053; N. Rogers and M. Maloney (eds), The Anthropocene Judgments Project: Futureproofing the Common Law (2024).

45

Petersmann, ‘“Re/de/composing” International Law’, Völkerrechtsblog (2024).

46

An interesting tension underpins the book: it qualifies Black beings as ‘objects’, on the one hand, thereby hinting at an Afro-pessimist understanding of Black social life as politically dead and ontologically non-existent, yet it calls, on the other, for a self-actualization of Black subjectivity and an embrace of Black life as generative and affirmative.

47

Amaro, supra note 4, at 221.

48

As Stefano Harney and Fred Moten put it, ‘[l]ogistical capitalism subjects that formula [movement + access] to the algorithm: total movement + total access’. They ask: ‘Can we dodge and blur an algorithmic syntax that straightjackets and atomizes us into total access, so we can get back to rebuilding our atrophied habits of assembly?’ S. Harney and F. Moten, All Incomplete (2021), at 38, 171.

49

Amaro, supra note 4, at 219.

50

Ibid., at 218.

51

Ibid., at 219.

52

‘Haunting, Blackness, and Algorithmic Thought’, Recursive Colonialism, available at https://recursivecolonialism.com/topics/haunting.

53

I. Venzke and K.J. Heller (eds), Contingency in International Law: On the Possibility of Different Legal Histories (2021).

54

Johns, ‘On Dead Circuits and Non-events’, in Venzke and Heller, ibid., 25, at 25.

55

Amaro, supra note 4, at 221.

56

Ibid.

57

Ibid.

58

Ibid., at 226.

59

F. Johns, G. Sullivan and D. Van Den Meerssche (eds), Global Governance by Data: Infrastructures of Algorithmic Rule (forthcoming).

60

Nesiah, ‘A Double Take on Debt: Reparations Claims and Regimes of Visibility in a Politics of Refusal’, 59(1) Osgoode Hall Law Journal (2022) 153.

61

‘What Is Speculative Legal Theory?’, Critical Legal Thinking (2023), available at https://criticallegalthinking.com/2023/10/06/conference-what-is-speculative-legal-theory.

62

Johns, supra note 54, at 25.

63

Ibid., at 26.
