Abstract

This special issue is grounded in the belief that theoretically informed, methodologically diverse, and sociotechnically inspired research is our best approach for understanding contemporary entanglements between the technological and social aspects of work, and for grappling with what that means for our futures. In this Editors’ Introduction to JCMC’s Technology and the Future of Work special issue, we synthesize emergent themes across the eleven papers included and reflect on productive analytic lenses for anticipating how technologies may shape social and work practices, and vice versa. We identify four themes woven across the papers—visibility, relationships, boundaries, and power—and explicate some of the ways that social, technical, temporal, and communicative dimensions of work emerge across a variety of work contexts. Together, these papers highlight the creative, sense-making, and collaborative dynamics of the technologically infused workplace while acknowledging the amorphous nature of work and place, past, present, and future.

Toward work’s new futures

Technological innovation has influenced how and where the diverse set of practices we call “work” unfolds since at least the advent of axes and arrowheads. The COVID-19 pandemic offered a stark reminder of this truth, as it forced many to replace face-to-face work with mediated interactions, bringing computer-mediated communication (CMC) into new work contexts and centering its presence in others. As we finalize this introduction in June of 2023, some of these disruptions remain in place. Amplifying already existing practices, people are increasingly working from home or other noncentralized locations, collaborating across countries and time zones, and stitching together schedules that combine time in and out of traditional workplaces and hours. Work processes, organizational forms, and standards for what constitutes “good” work are evolving rapidly, as are threats to autonomy, privacy, knowledge-sharing, and other contemporary challenges.

In addition to reshaping where and how work unfolds, newer information and communication technologies evoke reconsideration of what, for whom, and—perhaps most importantly—why. Focusing for the moment on traditional office work, people from a wide spectrum of industries are reconsidering the place—both literal and metaphorical—that work should occupy in their daily lives. Those who realized they could be as productive at home—maybe even more so—are chafing at calls to return to “the office.” Hybrid or online meetings, conferences, and healthcare appointments are now as routine as their in-person-only counterparts. Job titles that would have been inconceivable even a few years ago—such as AI prompt engineer—are hyped as poised to redefine fields such as software engineering and design.

Given all this, penning predictions of the future of work in a persistent and public venue is a futile, and potentially humiliating, endeavor. Consider the terms “future” and “work.” The notion that there is a singular, knowable future is laughable. New technologies are being introduced at a dizzying pace, continuously changing the landscapes of possible futures. In the time between the call for papers for this issue and its publication, we’ve seen “generative Artificial Intelligence” based on Large Language Models (LLMs) adopted at an unprecedented rate, with ramifications yet to be seen. “Work” is itself a catch-all category for countless dynamic and ever-evolving practices at scales ranging from the micro to the global, solitary and large-scale, paid and unpaid. Describing this complexity fully will require years of interdisciplinary scholarship. As such, this special issue is necessarily limited.

Futures are full of possibilities and promise, but they are also intimately tied to our past. The LLMs that threaten to upend both quotidian tasks and billion-dollar industries are trained on data consisting of scores of historical utterances and archived written materials, offering historic prejudices and injustices a fresh invitation to be reborn in amplified form. We cannot understand technology as independent from the histories of the social worlds in which it is situated, given the ways these two forces shape, enable, constrain and even come to define one another. As we elaborate below, the future of work is both social and technical: it is sociotechnical, “a web-like arrangement of the technological artefacts, people, and the social norms, practices, and rules” (Sawyer & Tyworth, 2006, p. 51). The articles in this issue offer fruitful approaches for understanding how these sociotechnical arrangements evolve to change work, even as these changes are still unfolding.

Empirically, each article examines concrete arrangements of people in situated contexts working with the particular technologies they encounter, or expect to encounter, at that moment in time. Most of the workers represented in this issue are knowledge workers, such as radiologists, scientists, data analysts, and software engineers. They work for technology sales organizations, telecommunication corporations, natural resource companies, and the like, some with offices in multiple countries and time zones. Others are care workers dealing directly with the public. Many were deeply familiar with technologies of work long before the pandemic. Others were among the many forced by pandemic restrictions to shift overnight to exclusive use of communication technologies for work.

As a whole, they work with and through an extraordinary range of communication technologies. Among those covered here are email, instant messaging, texting, phone calls, shared calendars, video conferencing platforms like Zoom, enterprise social media like Microsoft Teams and Salesforce Chatter, direct-to-client service platforms, online collaborative competition platforms like Kaggle, data analytic and surveillance monitoring software, and AI job interviewers and imaging tools. Whether well-versed in technologies or new to using them in work, none of the participants whose stories are told here can be accurately described as passive recipients of technology’s effects. They are envisioning, interpreting, selectively appropriating, enriching, rejecting, and intentionally jamming and distorting the tools they encounter.

Prediction may be futile—but the future is not entirely unknowable. Though these papers represent just small slivers of the universe of possible technology and work configurations, by diving deep into their contexts while drawing on existing theory and findings about technology and work, they show that many of the dynamics at play are familiar. Hence, while this issue may be specific to this period in history, and to these particular workers, technologies, and contexts, we are hopeful that it offers enduring insights into contemporary work and what that may portend for work’s futures.

The future is sociotechnical

Public discourse often centers technology as the cause of societal change. This deterministic construction can limit people’s sense of agency. David Nye (1997) described the typical stories people tell about new technologies (electricity in his case) as falling along a spectrum of determinism ranging from utopian hopes to dystopian fears; often the introduction of new technologies is met with moral panics that amplify feelings of fear and anxiety (Orben, 2020). In contemporary public narratives around the future of work we see these deterministic tendencies, with some extolling how digital media enable greater flexibility, autonomy, relationship building, information discovery, access to work and services, and diverse perspectives, while others point out threats of surveillance, potential erosion of social ties, and harms to well-being. Of course, all of these can be and are true at once. Yet this need not foreclose the kind of mindful intervention we hope research like the papers included here may spark.

Decades of scholarship have made it clear that determinism is not helpful for understanding or predicting the complex relationship between technology and society. The “effects” of technology are malleable, non-uniform, and sometimes unanticipated. Humans have agency. Sociotechnical networks are less things to be predicted than dynamics and processes to be understood and, hopefully, shaped with care and intent. This issue’s papers point to futures of work that will emerge from what new and old technologies make possible and constrain, but also from how they are (and are not) interpreted, (not) put to use, and creatively reshaped by human actors. These articles take nuanced approaches, predicting neither good nor bad, but interrogating the tensions technologies raise and how people manage them.

The role of human interpretation in shaping how a technology is understood and then used (or not used) is central to a sociotechnical perspective and to the pieces in this issue. Often interpretation begins before people experience technologies, as collective norms emerge through communication. Very early work on organizational uses of email—a new technology at the time—found that perceptions of its usefulness were influenced by how co-workers used and talked about the technology (e.g., Schmitz & Fulk, 1991). Rezazade Mehrizi’s paper in this issue shows similar anticipatory sense-making processes in the field of radiology. Radiology is often one of the first professions listed when people discuss AI replacing jobs, yet few of the radiologists he studied had used AI. Nonetheless, at professional conferences and in interviews, they co-constructed frames for understanding how it would shape their work, ranging from expectations that it would automate them away (most common among those who had not used it), to envisioning AI as likely to enhance or rearrange their work, to expecting that their work would become increasingly about communicating with the AI in order to make it work more effectively. Vitak and Zimmer also point to the importance of interpretation. They show that whether employees see surveillance as appropriate hinges on factors such as whether data are collected from spaces traditionally understood as ‘private,’ the rationale offered for data collection, and the sense of control people have over how such information is used. Liu, Wei, Wu, and Luo’s experimental analysis of conversations with AI interviewers points to a domain where we still know little about how people perceive a particular technology, let alone how communicating with it may influence their behavior and the outcomes that ensue.

The notion of an “affordance” (e.g., Gibson, 1977)—sitting between the material characteristics of a particular technological artifact and a user’s perception of that artifact’s potential uses—connects the ways that people interpret specific tools to what those tools do and don’t do in practice. Studying technologies through an affordance lens also enables us to see patterns across diverse technologies that may otherwise appear as idiosyncratic, chaotic collections of rapidly changing feature sets that require a new study with each iteration. As these articles show, people perceive and discuss both affordances and their absences, then exert agency by changing their behaviors, working around technologies, intentionally shaping what technologies and others who use them will see, and using technology’s outputs to tell the stories they need told. Treem, Barley, Weber, and Barbour, for example, show through a series of case studies that when the priorities of an organization and those of its employees did not match the priorities built into the analytic software that monitored them, people worked around the software, changing work practices in ways the developers most certainly did not seek.

Huber and Pierce argue in their contribution that technologies shape practices and norms not only through the affordances they provide, but also through those they do not provide. In the telehealth direct-to-client platform they studied, the platform’s business model and need to scale meant that affordances critical to therapists’ work were omitted. Therapists found that the platform afforded more flexibility and autonomy, and enabled some clients to access therapy who could not have done so previously. But they also found that the strategic absence of affordances designed to ensure that therapy was delivered in professionally compliant ways made the platform an “empty shell,” demanding more invisible labor. This kind of “articulation work,” which pervades platform labor, is an important dimension of work’s futures.

To say the futures of work are sociotechnical is thus to acknowledge the qualities of specific technologies, while also attending to the social contexts of their use and the accompanying human processes: the agency, interpretation, adoption, articulation, and work-arounds workers engage in to get work done, or at least to signal the appearance of work. In the remainder of this brief introduction, we highlight four important themes emerging from the confluence of technologies, affordances, contexts, and agency that this issue positions as central to the sociotechnical arrangements that will constitute work’s futures: visibility, relationships, boundaries, and power.

Shifting dynamics

Visibility

Technologies often change the available signals through which people can make meaning, heightening the potential visibility of some things and diminishing the visibility of others. As Vitak and Zimmer put it, technologies can rearrange social contexts, changing what can be seen by whom, where, and why. Treem, Leonardi, and Van den Hooff (2020) argue that visibility is a “root affordance,” which describes both the “central and fundamental nature of visibility in CMC, but also the ways that other possible affordances spring from and are dependent upon visibility” (p. 48). In a parallel work stream, scholars like Crystal Abidin (2016) have emphasized the importance of “visibility labor” in online creator work. Many of the articles in this issue show how visibility affordances can shape how people present themselves and how they gain awareness of and form impressions of others.

Visibility affordances influence patterns of collaboration, shaping how people see one another, who interacts with whom, and how. In this issue, Keppler and Leonardi, for example, show how enterprise social media offer people many ways to observe and form impressions of more distant colleagues, giving them new pathways to gradually gain confidence about interacting with them, with potentially positive consequences for help seeking and organizational knowledge sharing. Twyman, Murić, and Zheng focus on data scientists using the Kaggle “Machine Learning and Data Science Community” platform to collaborate with people they find on the platform. They show how different patterns of collaboration correspond to different levels of competition awards, another visibility affordance of the platform that shapes but does not determine how people work together.

Ciccone highlights some of the more problematic elements of visibility in her article, showing how affordances can lead to unexpected modes of interpersonal surveillance. Her interviews about digital calendaring practices illustrate how people in the organization she studied formed inequitable judgments of one another based on whether or not their appointment details could be seen. Those of higher status were seen as important when they obscured such details, while those of lower status were viewed with suspicion when they did the same. Organizational norms and social hierarchies are embedded in the design of technologies like calendars whose default settings leave appointments open and viewable. Those organizational contexts and values then shape work in the aftermath of technological change.

Technologies also “see” and track things humans cannot, making it possible for machines to observe phenomena that would otherwise go unnoticed. Radiologists in Rezazade Mehrizi’s piece, for instance, note that machine-learning models may read things in images that they themselves cannot, even if their own expertise still exceeds that of those machines. Those who deploy AI interviewers, as Liu et al. discuss, no doubt hope that those technologies will “see” qualities humans do not while avoiding the biases of human perception. Technologies make new visibilities possible as they collect the digital detritus of our interactions and actions, filtering them through algorithms, resulting in, among other things, workplace analytics such as those discussed by Treem and his collaborators. While it is tempting to treat algorithms as a computational rather than a communicative problem, they argue, these analytics are fundamentally communication phenomena. It is communication that is made visible to algorithms and analytics, and those analytics are made meaningful through people’s interpretations of—and ensuing responses to—the signals they provide.

Ironically, as central as technological visibility affordances are to work’s futures, humans often have very little visibility into how technologies work. This is especially true when they rely on machine-learning algorithms (sometimes trained on unknown data) that continuously self-optimize while offering neither transparency nor accountability. As many in this issue mention, whether it is monitoring software, analytics, algorithms, or platform politics, technologies are often opaque, hiding their own mechanisms from those whose careers may depend upon them. Increasingly, as a result, work requires new skills for interpreting how algorithms might actually work, with the intent of becoming either more or less visible.

Relationships

New technologies can destabilize existing relational arrangements and give rise to new ones. The articles in this issue show relationships changing at multiple levels. Dynamics among colleagues, between employees and organizations, workers and platforms, and workers and machines are all shifting. Interpersonal patterns of information and advice seeking and of collaboration amongst colleagues can be reshaped by technological practices and perceptions. For instance, in their study of workers in China immediately before and during the pandemic, Wu, Antone, DeChurch, and Contractor found that colleagues were most likely to continue advice-seeking relationships after shifting to remote work when their own sense of how easy it was to use enterprise social media matched that of their potential interaction partners. Dynamics of trust, approachability, and confidence are simultaneously enhanced and undermined by technological shifts. New relational arrangements are also seen with the increasing globalization of work, as addressed most directly in this issue by Sivunen, Gibbs, and Leppäkumpu. One of the insights they offer in their analysis of employees at one company’s sites in different countries and time zones is that the shift to remote work changed power dynamics in relationships, lessening some of the status differences previously in place.

Relationships between workers and those for whom they work, whether organizations or intermediating platforms that allow them to be “self-employed,” are also shifting. Employers who once kept their oversight of employees bound to the worksite are now pushing the limits of employee autonomy as in-home remote work becomes more normal, as the contributions by Golden, Jorgenson, and Williams and by Vitak and Zimmer make clear. As we have seen with gig work platforms, and now increasingly in the professions, some workers are forgoing direct employment, creating new relationships with platforms that can be simultaneously empowering and fraught. As Huber and Pierce explain, telehealth direct-to-client delivery platforms structure interactions between therapists and clients, but they also structure interactions between therapists and platforms, and much of the work lies there. A site like Kaggle, the platform from which Twyman et al. draw their data, offers yet another kind of worker-platform relationship, serving as an intermediary between data scientists and organizations in search of their talents.

As the emerging field of Human–Machine Communication explores, relationships between people and machines are also at stake. As we see in Liu et al.’s article about AI job interviewers, some roles formerly filled by humans can now be performed by AI. What happens, for example, if the ways people respond to AI systems—because they are nonhuman—feed back into opaque systems that falsely interpret their responses as evidence of undesirable personality traits? People will have to learn to work in new ways with AI in order to make this new technology serve their needs. Some of Rezazade Mehrizi's radiologists, for instance, expected AI to change their work by increasing the need for them to teach the AI so it could best assist them. As the new career of prompt engineer suggests, the future of work will certainly entail more machine whispering, as people learn to forge relationships with machines in the daily course of work.

Boundaries

These shifts in visibility and relationships are intertwined with a third theme throughout these articles: the continuous pressures that new technologies place on boundaries. Sivunen and colleagues outline two foundational boundaries disrupted by communication tools that connect people anytime, anywhere: global/temporal and work/home. The workers and teams in this issue respond to this troubling of boundaries by seeking to create new norms for themselves at each of the relational levels just discussed, even when societal consensus has yet to emerge. Golden et al.’s community health workers, newly cast as telecare providers, struggled to work out strategies for keeping the difficult emotions of their clients’ experiences out of their home life. With multiple communication channels available during the pandemic, just managing which channel they would use when, and with which client, became yet another professional obligation. The global workers Sivunen et al. studied, like Huber and Pierce’s therapists, also made explicit rules about their own accessibility via different communication media to keep work from spilling into their home life. Importantly, this boundary blurring was not all bad. The introduction of home into work could be exploited to increase relational connection and trust with colleagues.

Power

These papers show the importance of understanding how power relations run through all sociotechnical networks. It is often those with the least power who are the most affected by technological change, whether as targets of surveillance technologies or through interpretations of their calendars that turn a technology ostensibly intended to coordinate meetings into a contested site of hierarchy, social performance, and status-based judgments. Golden et al. are among many scholars who have noted the gendered impacts of the disruption of home/work boundaries, as women, more likely to be tasked with domestic duties and caring for any on-site children, found flexibility newly empowering but also found their doubly continuous accessibility an ongoing problem. If those with less structural power are most likely to be negatively impacted in new sociotechnical arrangements, as we see in this issue, those with more power and those who already have technological capital and digital skills have an advantage when work becomes more digitized.

We would be remiss in a discussion of power not to acknowledge that those whose voices are represented in this issue hold more privilege than the majority of humans whose livelihoods will be affected by sociotechnical changes in the next decade. The workers in these articles are primarily professionals. Americans are overrepresented. They may experience work precarity, but in general, they enter the future with a good deal of social, technological, and other forms of capital, in contrast to most of the world’s workers. Migrant laborers, gig workers, truck drivers, and those in the agricultural, service, and care sectors are among the many others at the front lines of integrating technology into their daily work practices.

The ideal version of this special issue would include a broader range of workers, as well as creators, artists, and those in jobs that involve directly engaging with the public. It would also have authors and contexts from a broader geographical range. Understanding the future of work will require immersion in a huge range of contexts where the enormous varieties of sociotechnical arrangements unfold. It is essential to understand power and the other dynamics at play in the forms of work this issue missed. We hope this issue encourages readers to seek out scholarship already being done in these areas and spurs more work along these lines in the future.

Conclusion

Some of us are drawn to study technology because its introduction can unsettle social practices, make what has been tacit explicit, and open the possibility of transformation. In this short introduction, we’ve sought to outline what it means to take a sociotechnical approach to the future of work and to highlight some of the cross-cutting themes in this issue. Much contemporary rhetoric is about how technologies may automate and replace. This perspective is important, but inadequate. The future of technologies in work is also about how people will exchange information and build relationships with one another, how they will reconfigure their own practices to make new technologies fit their needs, how norms around those technologies will evolve as people negotiate new boundaries, and how developers will iterate upon those technologies in response. The future is about how configurations of power that predate technologies will be amplified and countered.

It's important to realize that we are at a time of rapid transformation. While the sociotechnical dynamics of visibility, relationships, boundaries, and power discussed here have existed for as long as tool-using humans have lived in community, they are at inflection points (again). New technologies are being invented, deployed, and adapted at record rates. The norms and practices we create, and the values embedded in future technologies, are still open to shaping. We need to be asking not what the future of work will be, but what we want to make it and what we must do to make that happen.

Acknowledgments

We would like to thank Brenna Davidson and Elizabeth Fetterolf, without whom this special issue would not have happened. We deeply appreciate the efforts of Ifeoma Ajunwa, who had to step away from the issue but whose vision and networks helped shape it. We thank all the reviewers, especially our Friends of the Special Issue—Dan Greene, Arturo Arriagada, Jeremy Bailenson, Solon Barocas, Jeremy Birnholtz, danah boyd, Tarleton Gillespie, Ashley Mears, Cameron Piercy, Rida Qadri, Karina Rider, and Sean Rintel—whose willingness to review multiple papers, many twice, helped move this toward publication quickly. And of course, we thank the authors, without whom there would be nothing to reflect on, and the countless participants whose presence is reflected in these papers and who were so generous with their time and insights.

References

Abidin, C. (2016). Visibility labour: Engaging with Influencers’ fashion brands and #OOTD advertorial campaigns on Instagram. Media International Australia, 161(1), 86–100.

Gibson, J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting, and knowing: Toward an ecological psychology (pp. 67–82). Lawrence Erlbaum Associates.

Nye, D. E. (1997). Narratives and spaces: Technology and the construction of American culture. Columbia University Press.

Orben, A. (2020). The Sisyphean cycle of technology panics. Perspectives on Psychological Science, 15(5), 1143–1157.

Sawyer, S., & Tyworth, M. (2006). Social informatics: Principles, theory, and practice. In J. Berleur, M. I. Nurminen, & J. Impagliazzo (Eds.), Social informatics: An information society for all? In remembrance of Rob Kling (IFIP International Federation for Information Processing, Vol. 223, pp. 49–62). Springer.

Schmitz, J., & Fulk, J. (1991). Organizational colleagues, media richness, and electronic mail: A test of the social influence model of technology use. Communication Research, 18(4), 487–523.

Treem, J. W., Leonardi, P. M., & Van den Hooff, B. (2020). Computer-mediated communication in the age of communication visibility. Journal of Computer-Mediated Communication, 25(1), 44–59.
