I am sitting at my workstation in the factory hall, assembling plug sockets. As I am completely inexperienced, I lean over to the worker sitting next to me every once in a while to ask him what to do. In one of these moments, a foreman walks by on his patrol through the factory hall. He immediately stops and launches into a critique. “There is a prescribed distance between the workers that is to be kept,” he says. “Otherwise, as one can see here, there is always talking instead of working. Time is money, after all.” As soon as he moves on, the two workers next to me lean over again. One says that I should not take the foreman too seriously. The other imitates a Hitler moustache with two fingers and says: “He is playing Führer again.”

Half a year later, I am still sitting at the same workstation. But now the foreman has been replaced by a computer screen showing me exactly what to do and a “pick-by-light” system indicating where the respective parts can be found. I have to confirm every work step on the display. As soon as I click, tenths of a second start to race at the bottom of the screen. I am still finding it hard to keep up the pace. The system tells me I have fallen below my average working speed. I lean over to my coworker to make a joke, but he stares at his own screen and does not notice me.

Both situations contain the core elements of workplace control: evaluation and feedback. In the first situation, the foreman embodies both. In the second, this role is taken over by a system that management refers to as “AI”. It aims to make the labor process easier, so that, as a manager put it, “anybody from the street” could do it. This would enable the company to use unskilled or semi-skilled workers instead of skilled ones and thereby save considerably on labor costs. Furthermore, the system employs a “cybernetic” mode of control: instead of merely standardizing the labor process, it aims to create a continuous process of optimization based on automatic feedback on the workers’ performance.

But there is another difference: in the first situation, the “feedback” takes the form of a confrontation. My coworkers responded by expressing their solidarity with me and against the foreman, whom they ridiculed. In the second situation, there was no such confrontation, for who could be ridiculed in the absence of a foreman embodying factory discipline?

Like much of the critical literature on algorithmic management and workplace surveillance, I initially saw this as confirmation that digitalization leads to atomization and a withering of solidarity among workers. Yet, in the course of my ethnographic research in this German mechanical engineering factory, I realized that this was not the case.

There was already a confrontation when the pick-by-light system was installed on the shop floor: the system did not work properly and instead blinked wildly. One of the workers who had been watching commented: “At least now we have a replacement Christmas tree!” Another responded: “Or a disco!” Immediately, several workers joined him in imitating dance moves. From this point onward, workers referred to the system as “the disco”, and it became common to say that one worked “at the disco”.

This was not an isolated case. Workers appropriated most of the newly introduced digital technologies in a humorously critical way. This appropriation also had a practical side. For example, workers used a digitalized tool cabinet to store their snacks and referred to it as “the candy machine”. Another example was a smart glove that tracked workers’ hand movements and vibrated when they did something wrong. This became known among the workers as “the sex toy” or “the electro shocker”. These names were not situational jokes but turned into stable identities for the tools.

This technique of ascribing “cultural identities” to tools is well known in ethnographic research. For example, Fél and Hofer observed it regularly while studying Hungarian peasants. However, unlike the peasants, my coworkers did not ascribe affectionate or even tender identities to their tools. Instead, the names they gave them were always meant to ridicule digital technology, emphasizing either the control function of the devices – as in the electro shocker – or their dysfunctionality. The latter became especially evident in the case of a robot that was meant to transport components to the workstations. The robot kept colliding with racks while continuously making announcements in Dutch, which nobody understood. It therefore always had to be accompanied by two apprentices for “training” and became known among the workers as “Fiffi”, the stereotypical German name for a stupid dog.

This clashed with the identity that management tried to ascribe to the robot: either “our Mir”, in reference to the space station, or “our AI”. Thus, while the identities ascribed by the workers emphasized dysfunctionality, management tried to emphasize technological progress. At first, both names circulated among the workers, but in the end “Fiffi” prevailed.

The workers thus applied the same cultural techniques to the digital technologies that they had previously used to criticize their human superiors. In both cases, ridicule allowed workers to assure one another of a common critical stance. Here, humor has the advantage of being relatively risk-free, as one can always retreat to the position that “it was only a joke”.

The practice of ridicule proved crucial to creating a culture of resistant solidarity among the workers. Within this solidarity, workers dared to engage in more practical acts of resistance. These included manipulating the algorithms or exploiting gaps in control for unauthorized breaks. In some cases, workers even engaged in outright sabotage. These “technopolitics from below” had very real consequences for the course of digitalization. For example, the implementation of the smart glove was abandoned due to the workers’ resistance.

This shows that digital technologies, or “AI”, by no means eliminate workplace conflict or resistance. They might foreclose some ways of communicating or acting, but they also open up a plethora of new opportunities for resistant solidarity and autonomous practices.

Dr. Simon Schaupp is senior researcher and lecturer (Oberassistent) in sociology at the University of Basel, Switzerland.

Image: StockSnap via Pixabay

To read more, see Simon Schaupp, “Ridiculing the artificial boss: organizational technocultures and the humorous criticisms of AI at work”, Work in the Global Economy, 2023.