I hired into a “Big Box” warehouse in the summer of 2006. That is what megastores, usually shaped like square or rectangular boxes, are called. In March of 2016, I, along with my advisor, published an ethnographic account of working there (Elliott and Long 2016). Each day of the four-year observation period, half a million cases moved through the facility, yet the 200 workers could, if they chose, go the entire shift barely speaking a word to one another. A computer coordinated the work.
The debate
How does technological progress affect workers? Many argue that there are some positives and some negatives, and that they cancel each other out for a largely neutral effect. Take job creation, for example. Metal tools used to be shaped by hand on a machine called a lathe. Once numerically controlled machines were developed, those skills were no longer necessary (Shaiken). However, new skills were needed to control those machines. We have seen this pattern ever since centralised power changed the organisation of textile mills, closing down the system of small-batch clothing production in the home but paving the way for management to become a profession.
A similar pattern of “some good, some bad” may be seen with managerial control. In many cases, such as with Ford’s assembly line, workers came under greater control. However, new technology can also subvert workplace hierarchy. When radiology was being developed, doctors would sometimes defer to the expertise of the technicians running the machines in order to make a diagnosis. These contradictory examples have long been a source of ammunition for those who wish to argue that the effects of technological change on workplaces are largely neutral.
Should we expect a different pattern with information technology? Perhaps. The above arguments assume a rather narrow understanding of the potential for technology to shape human behaviour, and they tend to presume a time frame of only a few hundred years. When we are considering something as fundamental as information, the cornerstone of any cognitive processing of reality, we should broaden our thinking a bit. As computers become better at “thinking,” or using algorithms to make decisions (a phenomenon which, once networked with systems of memory, is not all that different from human cognition), more and more types of work have the potential to become “computer controlled.” What can that mean for the human beings controlled by computers?
Computers and workers in the warehouse
Let us consider the case of warehouse work, particularly in grocery distribution. Following the “logistics revolution” of the late 1980s (Bonacich and Wilson 2008), underpinned primarily by containerisation and the Universal Product Code (UPC), grocery stores began offering more and more products on their shelves. Along with the growth of discount stores and “Big Box” companies, the logistical pressures on distributors increased. Offering a wide array of products to an increasingly sophisticated customer base creates a tremendous challenge: getting the right product to the right store at the right time. These pressures have created a strong impetus toward using information technology to make warehouse work more efficient.
What kind of workplace culture can develop in such a context? Our study attempts to answer that question. First, let me explain the physically demanding, isolating, and repetitive nature of the work. We examined order fillers, the modal group of workers. After clocking into a shift, an order filler would first get their “unit”: a wireless mini-computer that hooked onto a belt worn about the waist and plugged into a headset. Workers logged onto the unit by speaking their employee ID number as a voice command. At the same time, they would find a “jack,” or vehicle for moving pallets of freight around the warehouse. Once logged onto the system, the worker instantly received a set of cases to be collected, or a “trip.” How do workers know which cases they will be retrieving? They don’t. The unit tells them: one case at a time.
The grocery stores served by this warehouse required customised orders of freight: a case of cereal, a case of sport drink, et cetera. The computer compiled these orders and sorted them into trips: routes through the facility for filling a customer’s order. For each of the two to four hundred cases that might comprise a single trip, the device would tell the worker where to go. Once there, the worker would read a code signifying the slot’s “address” to the device, and wait to be told how many cases to retrieve from the slot. These would be manually lifted and stacked onto the pallet. Cases weighed between 4 and 30 pounds. More fascinating still, the computer generated a standard time for each trip. The formula factored in distance travelled, as well as the size, weight, and number of cases, and workers were expected to complete each trip within its standard time.
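The paper does not publish the formula behind these standards, but a minimal sketch can show what a weighted-sum model over the stated inputs might look like. Everything below, from the function name standard_time_minutes to the coefficient values, is a hypothetical illustration rather than the warehouse’s actual calculation.

```python
# Hypothetical sketch only: the text says the standard time factored in
# distance travelled plus the size, weight, and number of cases, but the
# real formula is not public. The coefficients below are invented.

def standard_time_minutes(distance_feet: float,
                          num_cases: int,
                          total_weight_lbs: float,
                          total_cube_cuft: float) -> float:
    travel = distance_feet * 0.004       # assumed minutes per foot of travel
    handling = num_cases * 0.12          # assumed fixed handling time per case
    lifting = total_weight_lbs * 0.002   # assumed allowance for heavier freight
    bulk = total_cube_cuft * 0.01        # assumed allowance for bulkier cases
    return travel + handling + lifting + bulk

# A 300-case trip covering roughly 5,000 feet with 3,000 lbs of freight:
print(round(standard_time_minutes(5000, 300, 3000, 400), 1))  # 66.0 minutes
```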
The computer’s voice
Once a trip was completed (by manually stacking cases of differing sizes, weights, and textures into a cube-like tower roughly six feet tall that had to withstand starting, stopping, and turning), workers would take it to the loading dock. A new trip was then assigned. At that point, the digitised voice in their ear might say something like this: “Okay, the actual time for your last trip was twenty-six minutes and thirteen seconds. Your per cent performance was one-one-four point three. The standard time for this trip is forty-five minutes and fifty-six seconds. Proceed to aisle H:VA slot zero one three.” The worker would then drive toward that location. A typical shift might involve interacting with the unit 1,500 to 2,000 times and lifting about 30,000 pounds of freight. Workers needed to maintain a 95 per cent performance to avoid disciplinary action, which could, in time, result in dismissal.
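The paper does not spell out how the per cent performance figure is computed, but the numbers in the quote are consistent with the usual engineered-standards ratio of standard time to actual time. The sketch below assumes that ratio; the function name percent_performance and the inferred standard of roughly 30 minutes are mine, not the warehouse’s.

```python
# Illustrative only: assumes per cent performance is the ratio of the
# engineered standard time to the actual time taken, expressed as a
# percentage. Times are in seconds.

def percent_performance(standard_seconds: float, actual_seconds: float) -> float:
    return 100.0 * standard_seconds / actual_seconds

# The quoted trip: an actual time of 26 minutes 13 seconds and a reported
# "one-one-four point three" imply a standard of roughly 30 minutes.
actual = 26 * 60 + 13      # 1,573 seconds
standard = 29 * 60 + 58    # 1,798 seconds (inferred, not reported)
print(round(percent_performance(standard, actual), 1))  # 114.3
```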
Computer control created a job that was isolating, repetitive, and physically taxing. As you might expect, the intrinsic quality of the work was very low. To overcome this, Big Box management used the data generated by computer control to create extrinsic incentives. First and foremost was the bonus. Every two weeks, workers received extra pay: for every percentage point of performance above 100, workers received the same percentage increase in pay. The bonus was capped, however, at 130 per cent. Management also created daily awards. Workers who pulled the most cases in a shift would receive easier clean-up duties, or no clean-up at all. The next day they could choose easier jobs and “get out of the pick.” Finally, Big Box arranged shifts so that a certain amount of volume had to be completed before the shift ended. Because of this, an eight-hour shift might stretch into 10 or 11 hours.
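As a rough illustration of that bonus rule as described, each percentage point of performance above 100 adds the same percentage to pay, up to the 130 cap. The function name pay_with_bonus, the hourly rate, and the pay-period length below are hypothetical.

```python
# Rough illustration of the bonus rule described above: every percentage
# point of performance above 100 adds the same percentage to pay, with the
# bonus capped at a performance level of 130. Figures are hypothetical.

def pay_with_bonus(base_pay: float, performance_pct: float) -> float:
    effective = min(performance_pct, 130.0)                # cap at 130
    bonus_fraction = max(effective - 100.0, 0.0) / 100.0   # e.g. 114.3 -> 0.143
    return base_pay * (1.0 + bonus_fraction)

base = 15.00 * 80  # hypothetical $15/hour over an 80-hour, two-week pay period
print(round(pay_with_bonus(base, 114.3), 2))  # 1371.6  (a 14.3% bonus)
print(round(pay_with_bonus(base, 150.0), 2))  # 1560.0  (capped at 130, a 30% bonus)
```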
Social isolation
We labelled these extrinsic incentives the “digital arena”: a social space where performance numbers took on a life of their own. Computer control, developed primarily for moving freight efficiently about the warehouse, led to an isolating, physically demanding job. The data generated by that process, however, were repurposed by management to create a social context in which the meaning of performance numbers became symbolic material for face-to-face interaction. In turn, workers generated and maintained an informal status hierarchy. Within this highly masculine culture, the mostly male workers ranked one another, competed, and derided those who failed to meet informal standards. “Running” was the term workers used to connote commitment to achieving a high performance number. By “running,” order fillers attempted to complete trips as quickly as their skills and stamina would allow. Workers who could only “run production,” or only perform at the level of 95, were at the bottom of this hierarchy.
The effort to make distinctions via performance numbers became evident on day one of my stint at Big Box. In the break room, after an initial conversation in which order fillers guiltily confessed “one-fourteen” or “one-oh-three,” numbers well above the minimum production standard, the trainer began lamenting the fact that “no one (on this shift) really runs anymore.” The trainer explained what it meant to “really run.”
“It depends on the shift. Like the guys on this shift, no one can really run that well. Maybe 140 or 150. I can run, [he started pointing people out in the break room,] he can run, him, he can’t run. [He went through the ten males who were in the booth tables in the break room. They just nodded or smiled, embarrassed.] Now, if you want to see some people who can really run, come in on day shift. Those guys clock out with 160 or 170.”
The bonus pay alone cannot explain this production game. The informal culture erected within the digital arena pushed workers to achieve 200 per cent, well above the 130 per cent bonus cap. Our paper documents a number of ways in which Big Box management did not enjoy perfect control over the workers’ culture. Still, the use of “official” production performance to create an informal status hierarchy is rare. The use of performance standards in the workplace normally spawns the derision of “rate busters” (Dalton 1948). At Big Box, rate busters were celebrated as legends. How did Big Box management do it?
Loss of control over how the work is done
In our paper, we argue that the social space erected by the digital arena offers a potential platform for rescuing meaning and sociability from an otherwise isolating, repetitive and meaningless task. Computer control creates a dehumanising work task—and simultaneously offers a reprieve from that task. However, to engage this reprieve, one must accept the political conditions of the workplace. To engage the digital arena is to accept a complete loss of control over how the work is actually done. The control of information, then, is potentially the control over how social space is perceived. To control perception is to control the terms of political contestation.
Is this case generalisable? Electronic performance monitoring is being used in more and more types of work, from truck driving to nursing. The impetus to control work is inherent to capitalism, but it is even more pressing where competition turns on “economies of scale.” We should expect to see computer control, and perhaps even digital arenas, in diverse and unexpected industries.
What do these developments in information technology mean for the future of work? Even as we hope to see the “creative destruction” that such gains in efficiency might entail, we are also seeing an ever-widening “digital divide.” The polarisation of work is a largely unexpected development of the past decade. As computers become better tools for shaping social space, who can say what it will take to bridge this divide, or what the long-term consequences might be?
♣♣♣
Notes:
- This post is based on the author’s paper “Manufacturing rate busters: computer control and social relations in the labour process,” co-authored with Gary Long, Work, Employment & Society, February 2016, vol. 30, no. 1, pp. 135-151
- This post gives the views of its author, not the position of LSE Business Review or the London School of Economics.
- Featured image credit: Nick Saltmarsh CC-BY-2.0; Forklift photo: FEMA Public Library Public Domain; Conveyor belt: Tecnowey, Wikimedia Commons, CC BY-SA 3.0
Christopher “Shane” Elliott is a PhD candidate in the Department of Sociology at the University of North Carolina at Chapel Hill. He is interested in how external environments such as institutions, culture and markets impact the subjective dimensions of organizational life, especially management-worker dynamics, workplace culture, and identity. His dissertation, “Capitalizing Craft: The Intersection of Production and Consumption in Craft Beer Markets of North Carolina,” examines the intersection of consumer culture discourse and craft beer workers’ identity. He plans to defend it in the winter of 2016.