Abstract
Discussions of automation in the workplace typically omit policing. This is a mistake. The increasing combination of artificial intelligence and robotics will provide us with social benefits, but it will also create new problems as automation replaces human labor. Mass unemployment may be one consequence. Another is deskilling, the loss of the skills and knowledge needed to perform a job when automation takes over. We need to ask: What will be the consequences of police deskilling?
The issue of deskilling matters because automation is already a part of ordinary policing. Many police departments around the United States use tools that rely on artificial intelligence, such as automatic license plate readers and predictive policing software. Robotics will also likely play a role in policing just as it will in other employment contexts. This means that increasing automation will replace some traditional policing altogether. As a result, police officers will face a deskilling problem: the prospect of losing some of the core skills we associate with policing.
These changes may lie in the future, yet we can already identify some of the law and policy questions they will raise. First, automation and deskilling would undermine the Fourth Amendment’s premise that the police possess a specialized skill requiring judicial deference. Second, automation and deskilling may lead to dramatically different ways of organizing the police. Third, automation and deskilling force us to consider whether diminishing the social aspects of policing is worth the benefit of increased automation.
Introduction
Almost no one thinks about the consequences automation will bring to policing.[1] This is a mistake. Automation—the combination of artificial intelligence and robotics—is spreading everywhere.[2] Whatever automation’s benefits, we will also face problems as it displaces human labor. One consequence may be mass unemployment in sectors like truck driving, agriculture, and health care.[3] Another consequence is sometimes referred to as deskilling: the loss of the skills and knowledge needed to perform a job when automation takes over. What happens when policing becomes deskilled?
The deskilling of the police is inevitable because automation is increasingly becoming a part of policing. Many American police departments already use artificial intelligence: It drives automatic license plate recognition, social media threat analysis, predictive policing software, facial recognition technology, and autonomous drones.[4] Observers of the military have already begun to discuss the legal and policy effects similar automation will have on soldiers, and on the nature of waging war itself.[5] Ordinary street policing has thus far received much less attention, even though some of the tools of automation come directly from the military.
This Article contends that because the deskilling associated with automation will soon apply to the police,[6] omitting the police from the automation debate is a mistake. The increasing role that artificial intelligence and robotics will play in policing will mean not only that automation will help traditional policing, but also that it will replace some of it altogether.[7] That replacement will mean that human police officers will face a deskilling issue: They will lose some of the core skills we associate with policing.
While these technological changes may lie in the future, we can identify some of the law and policy questions that they will raise now. First, automation and deskilling in policing would undermine the Fourth Amendment’s premise that the police possess a specialized skill requiring judicial deference. Second, automation and deskilling may lead to dramatically different ways of organizing the police. Third, automation and deskilling force us to ask whether diminishing the social aspects of policing is worth the benefits of automation.
The Article proceeds as follows. Part I presents the case for thinking of automation and deskilling as a set of challenges for the police. Part II examines how deskilling might compel courts to reexamine their presumptions about police expertise. Part III looks at the potential effects that automation and deskilling would have on the profession of policing. Part IV takes a broader view and identifies the potential effects, both positive and negative, that deskilling will have on police-community relationships.
I. The Context of Police Automation and Deskilling
Policing may not be an obvious candidate for automation. When asked which jobs will be subject to automation, Americans point to fast food workers and insurance claims processors.[8] This makes sense. Repetitive, low-skill tasks are amenable to the first waves of workplace automation in a way that jobs requiring creative thinking and problem solving are not. Automation of the workplace is a trend that, by some estimates, may affect nearly half of all American jobs within the next twenty years.[9] Policing, however, rarely appears in discussions about automation in the workplace. This Part discusses why policing will be changed by automation, which policing tasks are likely to be automated first, and how deskilling might arise.
A. The Police as Candidates for Automation
Omitting policing from the debate on automation suggests some misunderstandings about the police. This is a problem of perception: There is a gap between what we think the police do and what they really do. To be sure, the police are tasked with enforcing the law, investigating crimes, and maintaining social order in sometimes unpredictable and violent situations, all tasks that seem unlikely candidates for automation because they require high levels of discretion and problem solving.
In reality, ordinary street policing is not concerned primarily with crime fighting. Most police resources are directed at patrol, and most police patrol is, to paraphrase criminologist David Bayley, trivial, noncriminal, and boring.[10] The average police officer rarely fires his gun.[11] Arrests are infrequent.[12] In any given shift, the average police officer spends his time not catching criminals but working in a service capacity.[13] This encompasses activities such as directing traffic, responding to accidents, resolving disputes with nonlegal methods, addressing matters of homelessness and mental illness, and sometimes, just driving around.[14]
Police also engage in a great deal of order maintenance and surveillance, tasks that overlap with the duties of private guards,[15] who have, by contrast, already been discussed as candidates for automation.[16] Autonomous security robots that can record information and relay possible threats are already available for hire, and at far cheaper rates than human security guards.[17] Capabilities that security robots may possess in the future include weapons detection, facial recognition, and electric shock weapons.[18]
And while talk of deskilling has not yet reached policing, tools of automation that rely upon artificial intelligence are already widespread in American police departments. Automatic license plate readers that can scan hundreds of license plates per minute are common.[19] Many departments use predictive algorithms to determine where crime might occur in the future in order to deploy officers efficiently.[20] Predictive algorithms developed in the private sector also help some departments identify high-risk persons worthy of extra police scrutiny.[21] Some departments are already piloting facial recognition programs that would instantly identify faces in a crowd.[22]
B. Automating Police Tasks
Recall that most of policing is patrol work, and most of that patrol work is spent on tasks other than crime fighting. When we look at the tasks that the police do perform, many of them would seem to be natural candidates for automation.
Consider the example of traffic enforcement. Every state’s vehicle code contains numerous possible violations that the police can enforce, including speeding, expired registration, and broken brake lights. The police may rely on the pretext of enforcing these minor offenses even if their true interests lie in looking for illegal guns and drugs.[23] In other words, car stops can begin with a possible ticket for speeding and end with a search of the car or its occupants, and then an arrest. Such stops are the most common type of police-civilian interaction Americans experience.[24]
Car manufacturers and policymakers are preparing, however, for a future of autonomous and connected cars that changes this premise. When self-driving cars are commonplace, there will be fewer car crashes and fewer violations of the traffic laws. Yet even if most of these cars are programmed to obey the speed limit and to drive safely, there will still be reasons for the police to stop them.[25] Autonomous cars may, for instance, facilitate crimes like illegal drug delivery and prostitution.[26] But enforcing the law against an autonomous car may not require a human police officer. Tickets for speeding and vehicle defects could be issued automatically by a smart infrastructure that identifies these offenses. An autonomous police car could also be programmed to identify an autonomous car that is speeding because a person inside has overridden its controls.[27]
Both the police and the public may balk at this sort of automation. If every motor vehicle law can be enforced, then the public may object to the resulting mass enforcement. The police, too, may object to the loss of a significant instrument of policing: relying upon traffic stops as lawful pretexts for the investigation of more serious crimes. Whatever the objection, automation in this context will mean that there will be less work for human police officers.
Another mundane aspect of policing amenable to automation is paperwork. Police officers fill out a lot of forms.[28] They do paperwork to impound vehicles, document traffic accidents, and arrest people. Post-arrest paperwork can often cost an officer several hours in a shift.[29] These administrative burdens can in turn influence whether an officer decides to make an arrest at all—depending on whether he wishes to work overtime to process the paperwork.[30]
Responsibility for generating paperwork could be assumed by artificial intelligence. The Axon corporation, the largest vendor of police body cameras, has announced its interest in “automated reporting” generated directly from the audio and video data its cameras record.[31] Few people would consider paperwork a core police function. But when we take into account how much time officers spend on it now, administrative work turns out to be a large portion of what the police do. Automation could significantly reduce direct human involvement in generating the forms and the data police must process.
Advances in robotics and artificial intelligence may also eliminate the police role in transporting and processing people who have been arrested. Imagine an autonomous police car dispatched to meet a police officer who has a recent arrestee in custody. Instead of a police officer driving the arrestee back to the station house, the autonomous car would transport the person. In addition, the autonomous police car might perform other functions: testing the arrestee for alcohol or drugs, scanning for weapons, conducting records checks for outstanding warrants, reading the arrestee Miranda warnings, and even arranging for defense attorneys, arraignments, and bail payments. A patent that outlines this scenario already exists.[32]
Figure 1: Mobile Law Enforcement Communication System and Method
Perhaps traffic law enforcement, paperwork, and transport seem to be peripheral functions that should be automated instead of performed by the police. But what about the tasks we associate with core police functions: investigating crime, catching offenders, and maintaining order? It turns out that automation has begun to encroach upon these roles as well.
Consider the identification of criminally suspicious behavior. Police departments have begun to adopt artificial intelligence tools that analyze massive amounts of data and identify or predict potentially suspicious persons and places. Many police departments around the United States have adopted predictive policing software that forecasts the specific areas in a city where crime is likely to happen.[33] The prediction is meant to help departments direct the attention of their officers. Another program used by the Chicago Police Department identifies individuals most at risk for gun violence.[34] The Department uses that “hot list” to intervene preemptively, with warning visits to those identified as being at heightened risk.[35] Other companies promote risk assessment tools that comb through private and public databases to identify whether persons the police encounter may pose a high risk of violence.[36]
In other words, the very development of suspicion—a skill we may once have deemed an exclusively human capability—is increasingly the responsibility of artificial intelligence. These trends are likely to accelerate. Police in the U.K. and Belgium, for instance, are piloting a system that uses artificial intelligence to assist in criminal investigation.[37] Assuming the role of a crime analyst, the system scans millions of police records, pictures, videos, and other data to generate plausible leads about how to investigate a particular crime.[38] The VALCRI program[39] uses machine learning to offer ideas about “how, when and why a crime was committed as well as who did it.”[40]
One final role we might characterize as a core feature of policing is the capacity to use justifiable force, even lethal force. Here, too, however, developments in artificial intelligence and robotics suggest that the use of force may one day be subjected to some degree of automation. Consider the military’s increasing interest in robots that might replace human soldiers on the battlefield.[41] For now, military robots are controlled directly by human operators. Current policy debates, however, include discussions about the kinds of rules that will be necessary for autonomous and lethal military robots, and even whether they should be permitted at all. Technologies and tools first adopted by the military often find their way to domestic policing, so we should not be surprised by the possibility of autonomous, armed robots playing a role in future policing.
Indeed, we can already see these automation conversations happening within the private security industry, whose services functionally overlap to a considerable degree with public policing. Autonomous security patrol robots equipped with surveillance cameras, GPS, and license plate readers are already available for hire for as little as seven dollars an hour,[42] and robots with facial recognition and perhaps even nonlethal weapons may be available soon.[43] They might one day receive tips about suspicious activity submitted through cellphone apps and investigate them autonomously. Cost-conscious customers may find security robots that need no “raises and healthcare” far preferable to human guards.[44]
C. Deskilling
If many of the most common and important tasks the police perform can be assumed by machines, officers are at a high risk of deskilling. The deskilling of labor occurs when machines take over so much work previously performed by people that workers begin to lose their acquired skills and fail to learn new ones.[45] Sociologist Harry Braverman famously critiqued the increasing automation of the labor market in the 1970s.[46] Braverman argued that the drive to increase productivity resulted in the increased routinization and deskilling of workers.[47] As a result, employers could more easily keep wages low and control their employees, who had lost their craft skills.[48]
An example described in the 1950s illustrates the deskilling process.[49] A metalworker who once applied her knowledge and dexterity to a job such as filing with a hand tool finds those skills increasingly less important as machines take over the task.[50] As machines incorporate even more of the processes of metalwork, the former metalworker becomes a machine operator, who continues to lose her technical knowledge.[51] The continued process of automation relieves “the operator of manual effort and then [relieves her] of the need to apply continuous mental effort.”[52] The same concerns raised about the metalworker could easily apply to cooks, carpenters, and security guards: all jobs identified as at high risk for automation today.[53] There are few skills necessary when “the ‘skill’ can be built into the machine.”[54]
Although concerns about the effects of automation on skilled labor are not new, the deskilling debate has attracted new attention as rapid advances in artificial intelligence have allowed more tasks to become automated. Cashiers, truck drivers, and cooks are not the only workers at risk. Algorithms can return medical diagnoses with high degrees of accuracy,[55] and new software can review contracts.[56] This in turn has led to considerable “automation anxiety,”[57] as survey research suggests that much of the public is worried about the impact of automation on their jobs.[58]
The police cannot be immune to the effects of automation that are present in nearly every sector of the economy. Even if no police department today expects a fully automated policing system in the near future, we can already see that many of the core aspects of policing—conducting surveillance, finding patterns to generate investigative leads, enforcing minor offenses, managing traffic, writing reports—are being, and will continue to be, assumed by automation. In particular, the identification of suspicious behavior, perhaps the key skill most often associated with policing, is already subject to automation, as police departments around the country rely on predictions, alerts, and assessments by machines to help them in their basic policing tasks.
The social benefits of increasing automation in policing may be significant. Automation might provide us with lower crime rates, greater transparency through data gathering, and less unnecessary police violence. Heavier reliance on the algorithmic analysis of big data might force the Supreme Court to rethink its refusal to quantify the Fourth Amendment standards of reasonable suspicion and probable cause.[59] But for the police themselves, automation introduces risks to their own livelihoods. These risks will also create ripple effects in law and policy.
II. The Fourth Amendment and Police Deskilling
Police are expert crime fighters: Modern search and seizure law presumes that policing is a profession characterized by specialized knowledge and experience. Courts have justified a high degree of deference to the police based on that presumption. If increasing automation changes what the police do, courts will need to reconsider how the craft of policing ought to figure into the Fourth Amendment calculus.
First, courts engaged in Fourth Amendment analysis tend to characterize policing as a profession in which officers exercise specialized skills based on training and experience. This perspective did not characterize policing at its inception. The earliest urban police officers of the nineteenth century were sent out with little more than a weapon and a uniform,[60] and were beholden to local political machines.[61] Such training as existed focused on acclimating officers to the quasimilitaristic organization of the police.[62]
These relaxed standards would not last. By the early twentieth century, the police had begun to conceive of themselves as professionals.[63] Progressive Era activists and police reformers advocated for efficiently organized police departments, with the division of labor and specialization familiar to us today.[64] In this approach to police reform, standardized measures should guide officer recruitment and training; scientific methods should guide criminal investigations.[65] The embrace of these beliefs meant that the professional era of policing would become the dominant form of American police departments by the middle of the twentieth century.[66]
Today American police departments share a common professional culture, even if policing itself is highly decentralized. Professional organizations issue best practices.[67] Federal agencies like the Bureau of Justice Statistics publish empirical studies on policing.[68] States define education and training minimums for local departments.[69] And while most departments require only a high school diploma for entry level positions,[70] a college degree is often important for promotion to higher ranks.[71] Policing varies in its details, but it is now overwhelmingly a professional occupation.
By the middle of the twentieth century, the Supreme Court had embraced this characterization of policing.[72] In the case that both defined reasonable suspicion and legitimated stop-and-frisks, the Court noted that officer Martin McFadden detained John Terry and two associates he had observed casing a store in Cleveland, Ohio, because he believed the “defendants were conducting themselves suspiciously.”[73] In upholding the temporary seizure and limited search of the three men, the Terry Court even observed that it “would have been poor police work indeed for an officer of 30 years’ experience in the detection of thievery from stores in this same neighborhood to have failed to investigate this behavior further.”[74]
And while the Terry decision is best known for its recognition of police stops and frisks, the Court grounded its decision in police expertise. The police are entitled to conduct a limited search and temporary detention “where a police officer observes unusual conduct which leads him reasonably to conclude in light of his experience that criminal activity may be afoot.”[75] In subsequent cases on reasonable suspicion, the Court would emphasize the special ability of the police to distinguish suspicious from innocent behavior. The reasonable suspicion standard depends on the “observations of a trained, experienced police officer who is able to perceive and articulate meaning in given conduct which would be wholly innocent to the untrained observer.”[76] In the Court’s view, what distinguishes the police from ordinary civilians is both their professional training and experience.[77]
The Court has also extended that recognition of police expertise to the probable cause standard. The probable cause required for a warrant is to be assessed not in terms of “library analysis by scholars, but as understood by those versed in the field of law enforcement.”[78] Moreover, the Court has frequently observed that these skills allow the police to identify suspicious behavior that civilians might not notice.[79]
That expertise, in turn, has justified deference. While this judicial deference to police expertise has attracted substantial criticism,[80] it remains the prevailing view of the courts.[81] Those professional judgments, which are granted the benefit of the doubt, span a range of Fourth Amendment questions: whether probable cause or reasonable suspicion has been met,[82] whether a suspect is dangerous, whether an area represents a “high crime neighborhood,” and whether and how to enforce the law at all.
Consequently, automation-driven deskilling may dramatically alter the relationship between ordinary policing and legal doctrine. Much of modern search and seizure law assumes that human police officers possess special investigatory skills that courts should not second guess except in extreme cases. Should machines assume many of these tasks—both at the core and the periphery of the job—the presumption of deference to human decision making in policing should look different to courts than it does now.
III. Deskilling and the Reorganization of Policing
How would deskilling change the organization of policing? While little attention has been paid to the effects of increasing automation on the police, we might make some predictions by looking at an analogous debate over automation in the military. The U.S. military has devoted considerable research funding to artificial intelligence and robotics.[83] Today the American military operates robots with varying degrees of human control. Military forces around the world employ surveillance drones as well as drones equipped with lethal force, although these are operated remotely by human pilots.[84] The Army and Marine Corps have awarded contracts for autonomous and semiautonomous ground support tanks and other vehicles for demonstration projects.[85] We can expect developments in artificial intelligence to allow military robots to become increasingly autonomous. For instance, in a report on unmanned flight systems, the U.S. Air Force suggested that advances in artificial intelligence would “enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.”[86]
Automation will change how we think of warfare.[87] Robots equipped with artificial intelligence could one day replace some human soldiers and support personnel in some of the most dangerous jobs in the field.[88] Robot medics could identify wounded soldiers, provide temporary treatment, and return them to safety. Supply robots could identify and provision soldiers in the field. Most controversially, autonomous robots could serve in combat roles alongside human soldiers.[89]
While the greatest public attention to automation in the military has focused on the future use of autonomous weapons systems,[90] the increasing use of robotics and artificial intelligence suggests sweeping changes throughout the military. New forms of combat may emerge, including the use of swarms of autonomous robots that corral and target adversaries.[91] Human soldiers might fight alongside robots that assist or augment their capabilities. One report asks us to imagine “[a] highly capable and sustainable land combat battlegroup in 2030 [that] may consist of as few as 250–300 human soldiers and several thousand robotic systems of various sizes and functions.”[92] The pressure to increase military automation comes not just from home, but also from other countries and nonstate groups eager to adopt these means.[93]
Automation will likely change not only military tactics but also military organization. Increasing reliance on robotics and artificial intelligence will keep human soldiers out of danger.[94] But saving lives in this way may also reduce the absolute number of human soldiers needed, just as automation has affected employment needs in the civilian workforce.[95] Alternatively, even if total personnel numbers do not change, the military may find itself with the challenge of recruiting and retaining people with advanced technical training.[96] Most people entering the armed forces today have no more than a high school diploma (enlisted members) or bachelor’s degree (officers).[97]
The military’s future may well find an analogue in ordinary policing. If many of the basic tasks assumed by the police today could be automated, there may be less need for human police officers. If surveillance, the identification of suspicious behavior, car stops, traffic management, administrative tasks, and even arrests could be automated, police officers might face some of the same questions that cashiers, accountants, fast food workers, and security guards face now. Automation may create less work for police officers: not the absence of humans, but many fewer of them.
External pressures may compel even reluctant agencies to embrace increasing degrees of automation. We have already witnessed some of these changes. First, there are the financial pressures that face local police departments—more than half of which employ fewer than ten officers.[98] The 2007 recession led to budget cuts to police departments throughout the United States. Technological solutions have looked increasingly attractive as “force multipliers”[99] to police administrators faced with layoffs and cutbacks.[100]
Second, attention to complaints of racial bias and unnecessary force intensified after the August 2014 fatal shooting of Michael Brown in Ferguson, Missouri.[101] The subsequent protests, social media attention, and calls for policy changes played a significant role in the embrace of police body cameras by reformers and police leaders alike.[102] And while they were initially embraced as a means of police accountability, these body cameras have also become a vehicle for introducing even more automation into policing.[103] Body cameras will one day be equipped with automated identification systems like facial recognition[104] or gait recognition,[105] and will generate the massive amounts of data that could feed machine learning algorithms designed to identify suspicious behavior and persons.[106]
In other words, there may be structural conditions having little to do with technology that will encourage the embrace of automation in policing. Exactly how people will fit into this reorganization remains to be seen. The demand for traditional police officers (and administrators) may diminish with automation. But there may also be new specialized roles that do not yet exist. Just as military planners forecast the need for human operators to monitor swarms of drones “without micromanaging” them,[107] there may be roles for police operators to oversee the autonomous or semiautonomous police robots in the air and on the streets that assume traditional policing tasks, to manage large-scale crime analytics recommendations, and to complete other supervisory tasks.
And one particularly controversial debate from the military context that applies to policing concerns the degree to which there should be a “human in the loop.”[108] Having a human in the loop generally refers to having some human involvement in the decision making and execution of an automated system.[109] This can refer to many different kinds of human involvement.[110] In the military, one of the most debated questions is the degree to which a machine could act on a “kill” mission without human approval. To what degree can an autonomous system execute a decision without any human involvement? What types of decisions may be delegated completely to machines? What does meaningful control over automated systems look like?
Even if there is human involvement in the machine decision making and action process, there may be reasons to doubt that this provides meaningful oversight. Military researchers have observed that as we grow increasingly dependent on ever more complex machines, human operators may lack the ability to perceive all the factors the machine observes, react as quickly as a machine does, or even determine whether an error is being made.[111] To further complicate matters, artificially intelligent machines may display unexpected emergent behaviors that are wholly unanticipated by any preexisting policy or legal framework.[112]
Increasing reliance on automated systems by the police might lead to two types of reorganization. First, small police agencies might rely heavily on automated systems while retaining a core group of officers and administrators as systems operators. The result would be very thinly staffed agencies. Cities and counties—the largest employers of police—might well embrace this version of policing as cheaper and more efficient than the traditional model.
Second, large urban departments may also embrace replacing many of their people with machines, but the effects may differ from those in small agencies. Police automation may increase the overall policing presence in cities. When the activity of policing becomes cheaper through technology, police presence may grow dramatically. People may have a hard time perceiving these changes. Silent but certain automated enforcement of regulatory offenses, along with the surveillance, detection, and analysis embedded into the infrastructure of cities, looks nothing like police officers. The “limited resources and community hostility”[113] that have served as a nonlegal check on traditional policing will have little effect. And if perceptions about crime rates spur claims of overenforcement now, we might expect similar complaints in the future about the heavy presence of policing machines embedded within neighborhoods.
Finally, increasing degrees of automation in policing will affect the police as workers. Just as automation is now a problem for trade unions,[114] we should expect it to become a divisive labor issue for the police. Imagine a future case in which an urban police department wants to reduce its ranks by acquiring a combination of autonomous and networked robots.[115] The police union might argue that such an acquisition violates its collective bargaining agreement with the department.
Private sector employees have faced these issues. Under section 8(d) of the National Labor Relations Act, 29 U.S.C. § 158(d), both private employers and employee representatives shoulder an obligation to meet and to confer in good faith with respect to conditions of employment when a change to a facet of a collective bargaining agreement is proposed.[116] Technological changes that affect unionized employees have been considered “conditions of employment”[117] that are subject to mandatory bargaining.[118] Even if a technological change in the workplace might have broad social benefits, “the impact of automation on a specific category of employees is a matter of grave concern to them. It may involve not only their present but their future employment in the skills for which they have been trained.”[119]
A police department’s decision to increase automation may encounter similarly strident union opposition. Such objections will be rooted in precedent. Around the country, police unions have objected to the imposition of new technologies on their officers. Police unions in Massachusetts have objected to the introduction of GPS trackers on squad cars.[120] Police unions in Chicago and Massachusetts have filed lawsuits over the introduction of body-worn cameras.[121] These rifts suggest that labor-related complaints about increasing reliance on artificial intelligence and robotics will likely arise in policing.
IV. The Desocialization of the Police
By substituting machines for people, automation might dramatically restructure policing. Some of that transformation will be nearly invisible—traditional surveillance and investigative legwork may fade into a “smart” landscape.[122] Other forms of automation may mean that robotic and automated systems will perform some tasks formerly performed by police officers. Whatever the details of this scenario, one likely result may be fewer police officers. What will that mean for the public?
Civilians might feel the impact of police automation first in their perception of community policing. The term “community policing” can seem amorphous: It is sometimes used to describe very specific tactics and in other contexts to refer to practices some police officers and departments have always used.[123] For policing researchers, though, community policing has as its central premise the active engagement of the public in increasing public safety.[124] Community policing can take many forms, including partnerships with local residents to identify and to address problems of crime and disorder.[125]
As a matter of policing history, we can identify community policing as a reform movement that arose partly in response to the civil unrest of the 1960s.[126] Newark and Detroit saw some of the worst violence in the summer of 1967.[127] The Kerner Commission’s report in 1968 declared that police were “not merely a ‘spark’ factor” in the unrest.[128] Informally, many police engaged in what racial minorities viewed as double standards of justice: one for them and another for white Americans.[129] Moreover, the professional era of policing—dominant in the first half of the twentieth century—favored an approach that discouraged police engagement with the public. Urban policing in the professional model involved car patrols driving from one service call to the next.[130] Not only did this approach create distance between the public and the police, but the crime fighting image of the police also suffered from another critical defect: Randomized car patrols and shows of force did not appear to have any serious effect on crime rates.[131]
Community policing arose as a response to the shortcomings of the professional model. Under this approach, the police were not to be crime fighters working at a distance from residents but rather a normal presence in any neighborhood.[132] Enforcing the law was still the responsibility of the police, but the police could also encourage requests for nonemergency services and remain a visible and accessible presence. The 1982 publication of “Broken Windows,” in which George Kelling and James Q. Wilson emphasized the importance of a policing style focused on maintaining order and reducing the fear of crime that leads neighbors to social withdrawal, was especially influential.[133] Such an approach required increased foot patrols and a focus on offenses previously deemed by the police too minor to enforce. The police in New York City would later adopt this style of community policing, first to great fanfare and later to criticism.[134]
Although its effect on reducing crime rates remains disputed, community policing spread throughout North America, Western Europe, and Australia in the 1990s.[135] In the United States, the federal government played a pivotal role in supporting the community policing model. The federal Community Oriented Policing Services program funded more than 100,000 police officer positions by the year 2000.[136] Having more officers on the ground supports the community policing emphasis on devolving decision making to individual officers. Working with the public requires a large number of ordinary patrol officers who are communicative, flexible, and empathetic.[137]
In theory, the fruit of community policing is not just the reduction of crime and disorder, but also enhanced legitimacy and trust in police-civilian relationships.[138] Under this view, repeated positive contacts with the police should increase the public’s perception that the police are partners and community members, not just an invading force.[139] And this perception of police legitimacy is not just a symbolic victory. The field of procedural justice studies has underscored the practical impacts of police legitimacy. People are more likely to obey the law and to cooperate with legal authorities—including the police—if they perceive them as legitimate.[140] That perception of fairness has a more significant impact on legitimacy than a decision in one’s favor, like escaping a traffic ticket or an arrest.[141]
Increased automation poses complications for this style of policing. In the most optimistic version of this future, automation might assume some of the mundane and repetitive tasks so that police officers—even if there are fewer of them—will spend more time within the community. Examples of such minor tasks include providing the public with nonemergency information, issuing tickets, and assisting at traffic accidents. Increased automation may free police officers to be more communicative and creative in their interactions with the public.
There are good reasons, though, to be skeptical about this sunny view of police automation. First, the community policing model is premised, in part, on mundane, ordinary, and often noncriminal tasks that serve as the vehicle for increased communication between police officers and the public.[142] Transferring more tasks to machines may reduce these opportunities for contact with the public and, consequently, the opportunities for establishing trust and legitimacy.
Second, even when police officers continue to make decisions and spend time in the community, their increasing reliance on artificial intelligence for determinations about whom to stop and where to patrol may undermine community relationships. In other areas of government decision making, we can see increasing concerns about algorithmic accountability, including demands that decisions made by artificial intelligence be capable of explanation. The European Union’s General Data Protection Regulation, for instance, establishes a right to an explanation of individual decisions made by algorithm.[143] Among the principles of accountability suggested by those working in the Fairness, Accountability, and Transparency in Machine Learning (FATML) community is the need to “[e]nsure that algorithmic decisions as well as any data driving those can be explained to end-users and other stakeholders in non-technical terms.”[144]
In this way, a decision about suspicious behavior that originates with a machine may appear opaque and untrustworthy. No court decision has yet determined whether the Fourth Amendment should permit police reliance on a suspicion algorithm, but we do know something about public attitudes. Survey data suggests that a significant percentage of Americans view automated decision making in other realms like financial scoring and résumé screening with suspicion and skepticism.[145] Even if such automated decisions were not the sole reason for stops and arrests, the perception that machines determine or guide policing behavior may distance the public from the police.
***
Concerns about the effect of increasing automation on work are not new, nor are they unprecedented in policing. Sociologist Sid Harring observed in 1981 that police work was “especially vulnerable” to automation.[146] Harring felt it “critically important” to understand this trend. What is different today is the scale and speed of the changes being wrought upon the workplace and on the police in particular.[147] If twentieth century automation increasingly replaced bodies with machines, twenty-first century automation attempts to replace bodies and minds.[148] This is why automation now poses such a difficult challenge for the workplace.
Conclusion
No one knows precisely how automation will change the workplace,[149] but omitting policing from these discussions is a mistake. Indeed, the police are already experiencing the automation of some tasks we associate with the core skills of policing, such as surveillance and identifying suspicious behavior. Police departments will likely increase their reliance on artificial intelligence and robotics, especially if these tools are deemed superior to traditional methods and cheaper than hiring more officers.
The changes wrought by increasing automation will mean not just a larger array of tools for the police, but a challenge to how we regulate and organize conventional policing. Our ways of regulating, organizing, and explaining policing are premised on human decision making. As more policing is performed by machine, we will urgently need to revisit those assumptions and rules.
[1]. For instance, a recent RAND Corporation report discusses the impact automation will have on domestic security but does not address the possibility of deskilling in policing. See Osonde A. Osoba & William Welser IV, RAND Corp., The Risks of Artificial Intelligence to Security and the Future of Work 7–17 (2017).
[2]. Neither “artificial intelligence” nor “robotics” has a universally agreed upon meaning, but we can define each here. In this Article, “artificial intelligence” refers to “a set of techniques aimed at approximating some aspect of human or animal cognition using machines.” Ryan Calo, Artificial Intelligence Policy: A Primer and Roadmap, 51 U.C. Davis L. Rev. 399, 404 (2017). A “robot” here is defined as a machine that can collect information, process it, and use it to act upon the world. See P.W. Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century 67 (2009) (offering this definition).
[3]. See, e.g., James Manyika et al., McKinsey Glob. Inst., Jobs Lost, Jobs Gained: Workforce Transitions in a Time of Automation 11 (2017) (“We estimate that between 400 million and 800 million individuals could be displaced by automation and need to find new jobs by 2030 around the world, based on our midpoint and earliest (that is, the most rapid) automation adoption scenarios.”); Martin Ford, Driverless Trucks: Economic Tsunami May Swallow One of Most Common US Jobs, Guardian (Feb. 16, 2017, 7:00 AM), https://www.theguardian.com/technology/2017/feb/16/self-driving-trucks-automation-jobs-trucking-industry [https://perma.cc/U84V-EBS7] (“While truck driving may eventually become the poster child for the automation wave, the disruption will, of course, be far broader, eventually encompassing the fast food, retail and office jobs that currently employ tens of millions of Americans.”). Such concerns have prompted calls for a universal basic income (UBI). See, e.g., Peter S. Goodman, Free Cash in Finland. Must Be Jobless, N.Y. Times (Dec. 17, 2016), https://nyti.ms/2hZr1AS [https://perma.cc/FY6J-HQQG] (“Universal basic income is a catchall phrase that describes a range of proposals, but they generally share one feature: All people in society get a regular check from the government—regardless of their income or whether they work.”).
[4]. Kathleen Walch, The Growth of AI Adoption in Law Enforcement, Forbes (July 26, 2019, 5:25 AM), https://www.forbes.com/sites/cognitiveworld/2019/07/26/the-growth-of-ai-adoption-in-law-enforcement/#27e9778435dd [https://perma.cc/G2C2-YBXG].
[5]. See, e.g., Singer, supra note 2, at 41 (“Man’s monopoly of warfare is being broken. We are entering the era of robots at war.”).
[6]. A short version of this question is discussed in Elizabeth E. Joh, Automated Policing, 15 Ohio St. J. Crim. L. 559, 561 (2018).
[7]. Automation can also play a role in complementing (“allowing people to achieve more or do better quality work”) and creating (“doing work that was never previously done by humans”). Benedict Dellot & Fabian Wallace-Stephens, Action & Research Ctr., The Age of Automation: Artificial Intelligence and the Future of Low-Skilled Work 31 (2017).
[8]. See, e.g., Aaron Smith & Monica Anderson, Automation in Everyday Life, Pew Research Ctr. (Oct. 4, 2017), http://www.pewinternet.org/2017/10/04/automation-in-everyday-life [https://perma.cc/JGK6-A9CL?type=image].
[9]. Carl Benedikt Frey & Michael A. Osborne, The Future of Employment: How Susceptible Are Jobs to Computerisation?, 114 Technological Forecasting & Soc. Change 254, 268 (2017) (“According to our estimates around 47% of total US employment is in the high risk category.”).
[10]. See, e.g., David H. Bayley, Police for the Future 17–21, 23 (1994) (“Patrol officers spend a lot of time simply waiting for something to happen—a summons from dispatch, a supervisor to show up, ambulances to arrive, detectives to finish with a crime scene, tow trucks to haul a car away, relatives to be summoned, and the fire department to flush gasoline off the street.”).
[11]. While three out of ten adults believe that the police fire their weapons a few times a year, only twenty-seven percent of “all officers say they have ever fired their service weapon while on the job.” Rich Morin & Andrew Mercer, A Closer Look at Police Officers Who Have Fired Their Weapon on Duty, Pew Research Ctr. (Feb. 8, 2017), http://www.pewresearch.org/fact-tank/2017/02/08/a-closer-look-at-police-officers-who-have-fired-their-weapon-on-duty [https://perma.cc/9WQP-H5LB].
[12]. See, e.g., Bayley, supra note 10, at 20.
[13]. See id. at 15–35.
[14]. The sociologist Richard Ericson famously described the boredom inherent in ordinary patrol work. See Richard V. Ericson, Reproducing Order: A Study of Police Patrol Work 62 (1982).
[15]. For a longer discussion of the importance of surveillance and order maintenance in private security work, see Elizabeth E. Joh, The Paradox of Private Security, 95 J. Crim. L. & Criminology 49 (2004).
[16]. Frey and Osborne’s study ranked 702 jobs by order of their susceptibility to automation within twenty years. The job of security guard qualified as one highly susceptible to automation, while the job of patrol officer was not. Frey & Osborne, supra note 9, at 275.
[17]. For example, the Knightscope security robot can patrol independently for $7.00 an hour. Nicky Woolf, RoboCop Is Real—And Could Be Patrolling a Mall Near You, Guardian (May 20, 2016), https://www.theguardian.com/us-news/2016/may/20/robocop-robot-mall-security-guard-palo-alto-california [https://perma.cc/CP28-C2FY] (“They are completely autonomous, navigating like self-driving cars.”).
[18]. Stephen Chen, Meet China’s RoboCop: The Robot Police Officer Who Doesn’t Tire—Or Second-Guess, South China Morning Post (May 5, 2016, 11:20 AM), https://www.scmp.com/news/china/policies-politics/article/1941394/meet-chinas-robocop-robot-police-officer-who-doesnt [https://perma.cc/6QJZ-JXWD] (describing a robot that can detect faces and weapons, and can deliver shocks).
[19]. Automatic license plate readers (ALPR) rely upon cameras and algorithms to identify individual license plates, sometimes hundreds per minute. Private companies also collect ALPR data accessible by police departments. See, e.g., Julia Angwin & Jennifer Valentino-DeVries, New Tracking Frontier: Your License Plates, Wall St. J. (Sept. 29, 2012), https://www.wsj.com/articles/SB10000872396390443995604578004723603576296 [https://perma.cc/Q4HU-T5QR] (describing both private and public use of license plate readers).
[20]. For an overview of predictive policing programs, see generally Elizabeth E. Joh, Policing by Numbers: Big Data and the Fourth Amendment, 89 Wash. L. Rev. 35 (2014). The Department of Justice has also announced its support of predictive policing software. See, e.g., Deputy Attorney General Rod J. Rosenstein Delivers Remarks at the Project Safe Neighborhoods National Conference, U.S. Dep’t of Justice (Dec. 5, 2018), https://www.justice.gov/opa/speech/deputy-attorney-general-rod-j-rosenstein-delivers-remarks-project-safe-neighborhoods [https://perma.cc/2MTD-HL7K] (“We also support ‘predictive policing,’ which involves analyzing data so police can anticipate crime and preempt it.”). The incorporation of bias into predictive policing has also been raised as a concern. See, e.g., Kristian Lum & William Isaac, To Predict and Serve?, Significance (Oct. 7, 2016), https://doi.org/10.1111/j.1740-9713.2016.00960.x [https://perma.cc/2RPP-H2FS] (analyzing use of biased data in predictive policing).
[21]. See, e.g., Justin Jouvenal, The New Way Police are Surveilling You: Calculating Your Threat ‘Score,’ Wash. Post (Jan. 10, 2016), https://www.washingtonpost.com/local/public-safety/the-new-way-police-are-surveilling-you-calculating-your-threat-score/2016/01/10/e42bccac-8e15-11e5-baf4-bdf37355da0c_story.html?utm_term=.5b25f38ea6b4 [https://perma.cc/7GFC-3NUB] (describing piloting of threat scoring software, “Beware,” in Fresno, California).
[22]. The Metropolitan Police (U.K.) piloted ten facial recognition trials in December 2018 and January 2019. Live Facial Recognition Technology to be Deployed in Romford, Metropolitan Police (Jan. 30, 2019), http://news.met.police.uk/news/live-facial-technology-to-be-deployed-in-romford-356772 [https://perma.cc/YDU7-PBH2].
[23]. Whren v. United States, 517 U.S. 806 (1996) (holding that a traffic stop supported by probable cause to believe a civil traffic violation has occurred is a reasonable Fourth Amendment seizure regardless of any subjective police motive).
[24]. Elizabeth Davis, Anthony Whyde & Lynn Langton, Contacts Between Police and the Public, 2015, at 1 (2018), https://www.bjs.gov/content/pub/pdf/cpp15.pdf [https://perma.cc/XUQ4-2NJQ].
[25]. See Elizabeth E. Joh, Automated Seizures: Police Stops of Self-Driving Cars, 94 N.Y.U. L. Rev. Online 292 (2019).
[26]. Id. at 306–07.
[27]. See Autonomous Police Vehicle, U.S. Patent No. 10,269,242 (filed July 12, 2016). For discussion of the patent, see Joh, Automated Seizures, supra note 25, at 302.
[28]. Bayley, supra note 10, at 40 (noting “universal” complaints from officers about paperwork); House of Commons Home Affairs Committee, Policing in the 21st Century: Seventh Report of Session 2007–08 52 (2008), https://publications.parliament.uk/pa/cm200708/cmselect/cmhaff/364/36408.htm [https://perma.cc/MZM8-HKWJ] (“We had previously raised concerns about the amount of time police officers spend completing paperwork at the station at the expense of time spent on patrol or investigating incidents. . . . Sir Ronnie Flanagan admitted to us that in conducting his Review of Policing, he was ‘quite staggered at the bureaucratic burden’ on officers, compared to 30 years ago. He told us that it was possible a police officer could spend 25–30% of his or her time on paperwork.”).
[29]. The best study of the process of arrest decisionmaking is Edith Linn, Arrest Decisions: What Works for the Officer? 1–5 (2009).
[30]. Linn described the NYPD’s arrest processing as a “system hobbled by redundant paperwork, misused personnel, broken equipment, backward technology, dispersed facilities, and conflicts among police units and justice agencies.” Id. at 2.
[31]. See Axon AI Research, Axon, https://www.axon.com/info/ai [https://perma.cc/RF9T-SKTP] (“Today, officers spend as much as two-thirds of their day on paperwork. We at Axon consider this a failure of their technology.”); Dana Goodyear, Can the Manufacturer of Tasers Provide the Answer to Police Abuse?, New Yorker (Aug. 20, 2018), https://www.newyorker.com/magazine/2018/08/27/can-the-manufacturer-of-tasers-provide-the-answer-to-police-abuse [https://perma.cc/627C-JGQC] (“Already, [Axon] is testing software, aided by artificial intelligence, that can automatically transcribe dialogue and collect identification information, capabilities that could one day obviate written reports.”); Beryl Lipton, Shifting from Tasers to AI, Axon Wants to Use Terabytes of Data to Automate Police Records and Redactions, Muckrock (Feb. 12, 2019), https://www.muckrock.com/news/archives/2019/feb/12/algorithms-ai-task-force [https://perma.cc/SY5T-Z8QU] (“‘Axon is focused on developing AI that will simplify the writing of police reports based on body-worn camera footage . . . .’”) (quoting email from Axon spokesperson Carley Partridge).
[32]. Mobile Law Enf’t Commc’n Sys. & Method, U.S. Patent No. 10,049,419 (filed Sept. 6, 2017) (issued Aug. 14, 2018).
[33]. Locational predictive policing programs rely on factors like historical crime data to predict where crime might occur in the future. See, e.g., Andrew G. Ferguson, Policing Predictive Policing, 94 Wash. U. L. Rev. 1109, 1127 (2017) (describing the first iteration of predictive policing). Some departments have formally adopted the software, and dozens more have experimented with the product; the dominant vendor is PredPol. Caroline Haskins, Dozens of Cities Have Secretly Experimented with Predictive Policing Software, Vice: Motherboard (Feb. 6, 2019, 7:00 AM), https://motherboard.vice.com/en_us/article/d3m7jq/dozens-of-cities-have-secretly-experimented-with-predictive-policing-software [https://perma.cc/7RS9-2BXP]; see also Ali Winston & Ingrid Burrington, A Pioneer in Predictive Policing is Starting a Troubling New Project, Verge (Apr. 26, 2018), https://www.theverge.com/2018/4/26/17285058/predictive-policing-predpol-pentagon-ai-racial-bias [https://perma.cc/5UVQ-LMZX] (describing PredPol as a market leader in predictive policing).
[34]. The Department’s Strategic Subject List applies an algorithm to hundreds of thousands of arrestees and assigns risk scores indicating who is most likely to be involved in a shooting, either as a victim or perpetrator. Jeff Asher & Rob Arthur, Inside the Algorithm That Tries to Predict Gun Violence in Chicago, N.Y. Times (June 13, 2017), https://nyti.ms/2tgi63U [https://perma.cc/7LDR-2JUZ]. The City of Chicago posts a deidentified list of arrest data used for the Strategic Subject Algorithm. See Strategic Subject List, City of Chi.: Chi. Data Portal (last updated Dec. 7, 2017), https://data.cityofchicago.org/Public-Safety/Strategic-Subject-List/4aki-r3np [https://perma.cc/U6JD-M3TP]. The way the list is compiled and what may be inferred from it has drawn criticism. See, e.g., Brianna Posadas, How Strategic is Chicago’s “Strategic Subjects List”? Upturn Investigates., Medium (June 22, 2017), https://medium.com/equal-future/how-strategic-is-chicagos-strategic-subjects-list-upturn-investigates-9e5b4b235a7c [https://perma.cc/K4K6-7W3V].
[35]. Jeremy Gorner, Chicago Police Use ‘Heat List’ as Strategy to Prevent Violence, Chi. Tribune (Aug. 21, 2013), https://www.chicagotribune.com/news/ct-xpm-2013-08-21-ct-met-heat-list-20130821-story.html [https://perma.cc/5ADB-K4PD] (“The strategy calls for warning those on the heat list individually that further criminal activity, even for the most petty offenses, will result in the full force of the law being brought down on them.”).
[36]. Justin Jouvenal, The New Way Police are Surveilling You: Calculating Your Threat ‘Score’, Wash. Post (Jan. 10, 2016), https://www.washingtonpost.com/local/public-safety/the-new-way-police-are-surveilling-you-calculating-your-threat-score/2016/01/10/e42bccac-8e15-11e5-baf4-bdf37355da0c_story.html?utm_term=.ed71a697156e [https://perma.cc/7GFC-3NUB] (describing Beware software that provides threat scoring based on data including “arrest reports, property records, commercial databases, deep Web searches, and [a person’s] social media postings”).
[37]. Timothy Revell, AI Detective Analyses Police Data to Learn How to Crack Cases, New Scientist (May 10, 2017), https://www.newscientist.com/article/mg23431254-000-ai-detective-analyses-police-data-to-learn-how-to-crack-cases [https://perma.cc/PG27-A4HH].
[38]. See id.
[39]. VALCRI stands for Visual Analytics for Sense-Making in Criminal Intelligence Analysis. The “system features an AI algorithm that is capable of learning as more data is added, so that its ability to see patterns only grows more robust as data is added.” Deloitte, Artificial Intelligence Innovation Report 16 (2018).
[40]. See id.
[41]. See, e.g., Justin Bachman, The U.S. Army Is Turning to Robot Soldiers, Bloomberg (May 18, 2018, 12:00 AM), https://www.bloomberg.com/news/articles/2018-05-18/the-u-s-army-is-turning-to-robot-soldiers [https://perma.cc/C5QF-GUZ7].
[42]. Woolf, supra note 17.
[43]. Kashmir Hill, Uber Hired a Robot to Patrol Its Parking Lot and It’s Way Cheaper Than a Security Guard, Splinter News (July 5, 2016), https://splinternews.com/uber-hired-a-robot-to-patrol-its-parking-lot-and-its-wa-1793859990 [https://perma.cc/EDZ3-4FEU] (noting robots can be rented for seven dollars an hour compared to twenty-five dollars an hour for a human guard).
[44]. Shan Li, Robots Are Becoming Security Guards. ‘Once It Gets Arms . . . It’ll Replace All of Us’, L.A. Times (Sept. 2, 2016, 3:00 AM), https://www.latimes.com/business/la-fi-robots-retail-20160823-snap-story.html [https://perma.cc/FY6Q-MH7A] (statement of Steve Claton, President of Allied Universal southwest security services).
[45]. See generally Harry Braverman, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century (1974). There is also a body of scholarship critiquing Braverman’s thesis, but it lies beyond the scope of this Article. See Osoba & Welser IV, supra note 1, at 10 (noting that “near-term labor market effects of artificial agents include a deskilling effect by which automation leads to the loss of specialized human abilities or skills. Automation reduces labor demand for people with the skills in question.”).
[46]. See Braverman, supra note 45, at 3 (describing contradiction between increasing demands of automation and decreasing emphasis on skills).
[47]. See id.
[48]. See id. at 155–56.
[49]. This example is taken from James R. Bright, Automation and Management 188 (1958).
[50]. See id.
[51]. See id.
[52]. See id.
[53]. Carl Benedikt Frey & Michael A. Osborne, The Future of Employment: How Susceptible Are Jobs to Computerisation?, 114 Tech. Forecasting & Soc. Change 254, 274–78 (2017).
[54]. James R. Bright, The Relationship of Increasing Automation and Skill Requirements, in The Employment Impact of Technological Change 203, 218 (1966).
[55]. Andrew Burt & Samuel Volchenboum, How Health Care Changes When Algorithms Start Making Diagnoses, Harv. Bus. Rev. (May 8, 2018), https://hbr.org/2018/05/how-health-care-changes-when-algorithms-start-making-diagnoses [https://perma.cc/5QTD-CJBE].
[56]. Beverly Rich, How AI Is Changing Contracts, Harv. Bus. Rev. (Feb. 12, 2018), https://hbr.org/2018/02/how-ai-is-changing-contracts [https://perma.cc/EY4N-XDES].
[57]. Robert E. Litan, Four Cures for Automation Anxiety, Brookings (June 21, 2018), https://www.brookings.edu/blog/up-front/2018/06/21/four-cures-for-automation-anxiety [https://perma.cc/53MH-SGN7] (describing automation anxiety as worries about the “impact of automation on jobs”).
[58]. Smith & Anderson, supra note 8 (reporting that 72 percent of American adults surveyed are “worried” about a “[f]uture where robots and computers can do many human jobs”). There is considerable disagreement as to whether automation will lead to widespread unemployment or create new job opportunities. Compare, e.g., Litan, supra note 57 (“The middle class justifiably is worried about automation.”), with David Autor, Why Are There Still So Many Jobs? The History and Future of Workplace Automation, J. Econ. Persp., Summer 2015, at 3, 5 (arguing that we “tend to overstate the extent of machine substitution for human labor and ignore the strong complementarities between automation and labor that increase productivity, raise earnings, and augment demand for labor”).
[59]. See Andrew Guthrie Ferguson, Big Data and Predictive Reasonable Suspicion, 163 U. Pa. L. Rev. 327, 405 (2015) (“If big data makes more information available with relatively little effort, then big data should be required to be part of the reasonable suspicion calculus.”).
[60]. See, e.g., Mark H. Haller, Historical Roots of Police Behavior: Chicago, 1890–1925, 10 Law & Soc’y Rev. 303, 306 (1976) (describing the early years of the Chicago Police Department as those in which “little attention was given to formal training; when such training did appear, it had low priority and reflected the department’s military conception of organization. . . . A 1929 study of the recruit school found not only that the instruction was inadequate but, as has generally been the case with police training, no recruit had ever failed the course.”).
[61]. See id. at 306–07.
[62]. See id.
[63]. See David Sklansky, The Persistent Pull of Police Professionalism, Harv. Exec. Session on Policing and Pub. Safety (Mar. 2011), https://www.ncjrs.gov/pdffiles1/nij/232676.pdf [https://perma.cc/96FU-WUV3]; see also Samuel Walker, A Critical History of Police Reform: The Emergence of Professionalism (1977).
[64]. Michael D. Reisig, Community and Problem-Oriented Policing, 39 Crime & Just. 1, 12–13 (2010).
[65]. See id.
[66]. See id. at 13.
[67]. See, e.g., Police Foundation, Open Data and Policing: PDI Best Practice Series (2018), https://www.policedatainitiative.org/resources/open-data-and-policing [https://perma.cc/9KQL-SAXK] (publishing a five-part best practices guide).
[68]. See, e.g., Law Enforcement, Bureau of Justice Statistics, https://www.bjs.gov/index.cfm?ty=tp&tid=7 [https://perma.cc/9J9F-4Z5C] (collecting and publishing empirical data on policing).
[69]. Rachel Harmon, Reconsidering Criminal Procedure: Teaching the Law of the Police, 60 St. Louis U. L.J. 391, 394 (2016) (“[S]tate statutes create police officers, determine what qualifications and training they possess, and empower and command officers to coerce citizens.”).
[70]. According to the most recent survey by the Bureau of Justice Statistics, eighty-four percent of the nation’s local police departments require no more than a high school diploma to be eligible for employment. Brian A. Reaves, U.S. Dep’t of Justice, Bureau of Justice Statistics, Local Police Departments, 2013: Personnel, Policies, and Practices 7 (2015).
[71]. Christie Gardiner, Policing around the Nation: Education, Philosophy, and Practice 3 (2017) (reporting results of nationwide law enforcement survey).
[72]. Cf. Terry v. Ohio, 392 U.S. 1 (1968). Although the Terry decision is commonly understood as the moment when the Court embraced the notion of police expertise, Anna Lvovsky has demonstrated that the judicial acceptance of police as experts had started earlier, in the lower courts, and in substantive areas other than search and seizure law. Anna Lvovsky, The Judicial Presumption of Police Expertise, 130 Harv. L. Rev. 1995, 2015–24, 2036–51 (2017).
[73]. Terry, 392 U.S. at 8.
[74]. Id. at 23 (emphasis added).
[75]. Id. at 30 (emphasis added).
[76]. Brown v. Texas, 443 U.S. 47, 52 n.2 (1979).
[77]. Cf. United States v. Brignoni-Ponce, 422 U.S. 873, 885 (1975) (“In all situations the officer is entitled to assess the facts in light of his experience in detecting illegal entry and smuggling.”) (citing Terry, 392 U.S. at 27).
[78]. Illinois v. Gates, 462 U.S. 213, 232 (1983) (quoting United States v. Cortez, 449 U.S. 411, 418 (1981)); see also Ornelas v. United States, 517 U.S. 690, 700 (1996) (Scalia, J., dissenting) (noting that police officers may establish the probable cause necessary for a search or seizure by drawing “inferences based on [their] own experience”).
[79]. See, e.g., United States v. Arvizu, 534 U.S. 266, 273 (2002) (quoting Cortez, 449 U.S. at 417–18) (noting that the reasonable suspicion standard is analyzed under a totality of the circumstances standard, which “allows officers to draw on their own experience and specialized training to make inferences from and deductions about the cumulative information available to them that ‘might well elude an untrained person.’”); New Jersey v. T.L.O., 469 U.S. 325, 353 (1985) (“A teacher has neither the training nor the day-to-day experience in the complexities of probable cause that a law enforcement officer possesses, and is ill-equipped to make a quick judgment about the existence of probable cause.”).
[80]. See, e.g., Anthony O’Rourke, Structural Overdelegation in Criminal Procedure, 103 J. Crim. L. & Criminology 407, 429 n.79 (2013) (“For example, police officers receive an extremely high level of deference about their determinations whether there was probable cause to conduct a stop, so long as they are prepared to invoke their ‘experience and expertise’ as the basis of their decision.”); L. Song Richardson, Police Efficiency and the Fourth Amendment, 87 Ind. L.J. 1143, 1155 (2012) (emphasis omitted) (“A further problem with the reasonable suspicion standard is that courts often defer to officer judgments of criminality without any criteria for determining whether deference is justifiable. Instead, courts repeatedly defer to the judgements of all officers, with no inquiry into the particular officer’s training, experience, and skill.”); David A. Sklansky, Traffic Stops, Minority Motorists, and the Future of the Fourth Amendment, 1997 Sup. Ct. Rev. 271, 301 (noting of the Ornelas decision, “the Court in effect declared that police officers should receive as much deference as trial judges”); Tracey Maclin, The Central Meaning of the Fourth Amendment, 35 Wm. & Mary L. Rev. 197, 248–49 (1993) (asserting that the Supreme Court’s “degree of deference to police searches is at odds with the central purpose of the Fourth Amendment”).
[81]. See Milton Hirsch & David Oscar Markus, Fourth Amendment Forum, Champion, Nov. 2005, at 65 (“Appellate opinions are replete with admonitions that deference should be afforded the on-the-spot judgments of highly-trained, highly-experienced police officers, and that the true test of Fourth Amendment reasonableness is not what any objective observer would have believed in the circumstances but what a highly-trained, highly-experienced objective police officer would have believed in the circumstances.”). Judicial deference to police expertise has its academic defenders as well. See, e.g., Craig S. Lerner, Reasonable Suspicion and Mere Hunches, 59 Vand. L. Rev. 407, 472 (2006) (arguing that “the mere fact that a police officer cannot glibly articulate his suspicions does not mean that these suspicions are not reasonable”).
[82]. See, e.g., Ornelas, 517 U.S. at 699 (cautioning reviewing courts to defer to police, because “a police officer views the facts through the lens of his police experience and expertise”).
[83]. E.g., Bachman, supra note 41 (“Over the next few years, the Pentagon is poised to spend almost $1 billion for a range of robots designed to complement combat troops. Beyond scouting and explosives disposal, these new machines will sniff out hazardous chemicals or other agents, perform complex reconnaissance and even carry a soldier’s gear.”).
[84]. See, e.g., Mark Mazzetti, The Drone Zone, N.Y. Times Mag. (July 6, 2012), https://www.nytimes.com/2012/07/08/magazine/the-drone-zone.html [https://perma.cc/6D94-HZP9] (describing a remote drone piloting facility in New Mexico); see also Eyal Press, The Wounds of the Drone Warrior, N.Y. Times Mag. (June 13, 2018), https://www.nytimes.com/2018/06/13/magazine/veterans-ptsd-drone-warrior-wounds.html [https://perma.cc/S48H-KXZB] (observing that “targeted killings by drones have become the centerpiece of U.S. counterterrorism policy”).
[85]. See Andrew Feickert et al., Cong. Research Serv., U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress 22–26 (2018) (describing selected unclassified military projects in robotics and autonomous systems).
[86]. U.S. Air Force, Unmanned Aircraft Systems Flight Plan 2009–2047, at 41 (2009), https://fas.org/irp/program/collect/uas_2009.pdf [https://perma.cc/8J97-MSGE].
[87]. See Daniel S. Hoadley & Kelley M. Sayler, Cong. Research Serv., Artificial Intelligence and National Security 33 (2019) (“Most analysts believe that AI will at a minimum have significant impact on the conduct of warfare.”).
[88]. See Feickert et al., supra note 85, at 1 (noting that robotics and autonomous systems “offer[] the possibility of a wide range of platforms—not just weapon systems—that can perform ‘dull, dangerous, and dirty’ tasks—potentially reducing the risks to soldiers and Marines.”).
[89]. Kelsey D. Atherton, Are Killer Robots the Future of War? Parsing the Facts on Autonomous Weapons, N.Y. Times Mag. (Nov. 15, 2018), https://www.nytimes.com/2018/11/15/magazine/autonomous-robots-weapons.html [https://perma.cc/V7YJ-4XPP] (“Modern advancements in artificial intelligence, machine image recognition and robotics have poised some of the world’s largest militaries on the edge of a new future, where weapon systems may find and kill people on the battlefield without human involvement.”).
[90]. In response to concerns about lethal autonomous weapons systems (LAWS) in war, the United Nations established a Group of Governmental Experts (GGE) on LAWS in 2016, with a mandate to review emerging LAWS technologies with a view to identifying appropriate rules and principles. 2017 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS), United Nations, https://www.unog.ch/80256EE600585943/(httpPages)/F027DAA4966EB9C7C12580CD0039D7B5?OpenDocument [https://perma.cc/35DY-YR68].
[91]. See Feickert et al., supra note 85, at 11.
[92]. Mick Ryan, Ctr. for Strategic & Budgetary Assessments, Human-Machine Teaming for Future Ground Forces 20 (2018).
[93]. See id. at 13 (citing DoD Directive 3000.09, Autonomy in Weapon Systems (Nov. 21, 2012)) (noting that “although the U.S. Department of Defense has enacted restrictions on the use of autonomous and semi-autonomous systems wielding lethal force, hostile nations and non-state actors may not exercise such self-restraint.”).
[94]. See, e.g., Robert O. Work & Shawn Brimley, Ctr. for a New Am. Sec., 20YY: Preparing for War in the Robotic Age 29 (2014), https://www.cnas.org/publications/reports/20yy-preparing-for-war-in-the-robotic-age [https://perma.cc/QZ3A-ZDA5] (“Networked, cooperative swarms of unmanned systems that can maneuver and engage targets collectively also have the potential to achieve reaction times much faster than that of human operators. . . . Human controllers, safely removed from harm’s way, would provide mission-level control over the swarm, but the leading edge of the battlefront across all domains would be unmanned, networked, intelligent and autonomous.”).
[95]. See Feickert et al., supra note 85, at 6 (noting that several studies “raise additional questions that may have implications for both civilian and military workers . . . . If entire jobs are eliminated, employers could be reluctant to maintain the size of their workforce and pay for employee re-skilling while also investing in AI and RAS [robotics and autonomous systems] technologies”).
[96]. See id. at 27–28 (“The introduction of RAS and AI brings with it a greater need for military personnel with advanced technical knowledge.”).
[97]. Id. at 28.
[98]. Reaves, supra note 70, at 4.
[99]. See U.S. Dep’t of Justice, Office of Cmty. Oriented Policing Servs., The Impact of the Economic Downturn on American Police Agencies 26 (2011), http://www.ncdsv.org/images/COPS_ImpactOfTheEconomicDownturnOnAmericanPoliceAgencies_10-2011.pdf [https://perma.cc/X97F-S759] (“Around the country, cash-strapped communities are looking for any way to boost efficiency and cut spending. . . . [Some] police agencies are shifting their operational models to include the use of technology systems that can help agencies to improve outcomes and increase efficiency.”).
[100]. See Andrew Guthrie Ferguson, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement 20–33 (2018).
[101]. See, e.g., Monica Davey & Julie Bosman, Protests Flare After Ferguson Police Officer Is Not Indicted, N.Y. Times (Nov. 24, 2014), https://nyti.ms/1yNsywu (“The killing . . . set off weeks of civil unrest—and a national debate—fueled by protesters’ outrage over what they called a pattern of police brutality against young black men.”).
[102]. See, e.g., Alan Gomez, After Ferguson, Police Rush to Buy Body Cameras, USA Today (Oct. 11, 2014, 3:22 PM), https://www.usatoday.com/story/news/nation/2014/10/11/police-body-cameras-ferguson-privacy-concerns/16587679 [https://perma.cc/6Y3J-3XUL]; Drew Harwell, The Body-Camera Industry Is “Feeling Phenomenal” After Ferguson, Wash. Post (Dec. 3, 2014), https://www.washingtonpost.com/news/wonk/wp/2014/12/03/the-body-camera-industry-is-feeling-phenomenal-after-ferguson/?utm_term=.4580c656e08a [https://perma.cc/R7Q7-DZ4S] (noting “few rejoiced [at calls for officer transparency] like the body-camera industry, which stands to make a fortune off police forces nationwide with newfound millions to spend”); Josh Sanburn, The One Battle Michael Brown’s Family Will Win, Time, Nov. 26, 2014, http://time.com/3606376/police-cameras-ferguson-evidence [https://perma.cc/5E4Q-NW62] (noting the Brown family’s call for “every police officer working the streets in this country [to wear] a body camera.”).
[103]. See Elizabeth E. Joh, Beyond Surveillance: Data Control and Body Cameras, 14 Surveillance & Soc’y 133 (2016).
[104]. Patrick Tucker, Facial Recognition Coming to Police Body Cameras, Defense One (July 17, 2017), https://www.defenseone.com/technology/2017/07/facial-recognition-coming-police-body-cameras/139472 [https://perma.cc/A74N-3W8D] (describing announcement of a partnership between Motorola’s body camera division and AI startup Neurala to build “‘real-time learning for a person of interest search’”) (statement of Neurala CEO Massimiliano Versace). But see First Report of the Axon AI & Policing Technology Ethics Board: Overview, Policing Project, https://www.policingproject.org/axon [https://perma.cc/9DWG-S5L7] (noting Axon’s agreement not to “proceed with the development of face matching products” because “[f]ace recognition technology is not yet reliable enough to justify its use on body-worn cameras . . .”).
[105]. Cf. Dake Kang, Chinese ‘Gait Recognition’ Tech IDs People by How They Walk, Associated Press (Nov. 6, 2018), https://apnews.com/bf75dd1c26c947b7826d270a16e2658a? [https://perma.cc/7VVT-WBZ3] (discussing potential of gait recognition to be used with facial recognition).
[106]. See, e.g., Ava Kofman, Taser Will Use Police Body Camera Videos “to Anticipate Criminal Activity,” The Intercept (Apr. 30, 2017, 6:29 AM), https://theintercept.com/2017/04/30/taser-will-use-police-body-camera-videos-to-anticipate-criminal-activity [https://perma.cc/WNX9-TGZC] (“Taser is betting that its artificial intelligence tools might be useful not just to determine what happened, but to anticipate what might happen in the future.”); Doug Wyllie, What TASER’s Acquisition of 2 AI Companies Means for the Future of Policing, PoliceOne (Feb. 9, 2017), https://www.policeone.com/police-products/less-lethal/TASER/articles/289203006-What-TASERs-acquisition-of-2-AI-companies-means-for-the-future-of-policing [https://perma.cc/GZB9-3ESN] (quoting Axon CEO Rick Smith as stating, “We’ve got all of this law enforcement information with these videos, which is one of the richest treasure troves you could imagine for machine learning.”).
[107]. See Atherton, supra note 89.
[108]. See, e.g., John Markoff, Report Cites Dangers of Autonomous Weapons, N.Y. Times (Feb. 28, 2016), https://www.nytimes.com/2016/02/29/technology/report-cites-dangers-of-autonomous-weapons.html?searchResultPosition=10 [https://perma.cc/VTH7-JZBM] (describing problem of having human input into robotic decisionmaking).
[109]. This is generally seen as an issue of whether humans should be in the loop, not whether they must. Work & Brimley, supra note 94, at 25 (“The number of instances where humans must remain in the loop will likely shrink over time.”).
[110]. See William C. Marra & Sonia K. McNeil, Understanding “The Loop”: Regulating the Next Generation of War Machines, 36 Harv. J.L. & Pub. Pol’y 1139, 1179 (2013) (arguing that the in-the-loop argument is “too simplistic”).
[111]. Cf. Work & Brimley, supra note 94, at 24 (“While human decision-making will likely retain advantages in situations that are complex, ambiguous, require understanding of context or require judgment for some time, steadily improving autonomous logic will certainly be useful in situations where simple, predictable tasks are being performed, where reaction time is critical, or where communications links with human controllers are fragile.”).
[112]. Ryan Calo, Robotics and the Lessons of Cyberlaw, 103 Cal. L. Rev. 513, 515 (2015) (“Robots display increasingly emergent behavior, permitting the technology to accomplish both useful and unfortunate tasks in unexpected ways.”).
[113]. United States v. Jones, 565 U.S. 400, 416 (2012) (Sotomayor, J., concurring) (quoting Illinois v. Lidster, 540 U.S. 419, 426 (2004)).
[114]. David Shepardson, Teamsters Chief Fears U.S. Self-Driving Trucks May Be Unsafe, Hit Jobs, Reuters (Sept. 12, 2017, 3:50 PM), https://www.reuters.com/article/us-autos-selfdriving-teamsters/teamsters-chief-fears-u-s-self-driving-trucks-may-be-unsafe-hit-jobs-idUSKCN1BN337 [https://perma.cc/WYN2-KJ3H].
[115]. This example is derived from Patrick T. Wilson, Competing With a Robot: How Automation Affects Labor Unions, J. Bus. & Intellectual Prop. L. (Aug. 22, 2017), http://ipjournal.law.wfu.edu/2017/08/competing-with-a-robot-how-automation-affects-labor-unions [https://perma.cc/K682-G37S] (“What would happen if a trucking company wished to acquire a fleet of Otto [self-driving] trucks, which would replace the majority of their truckers?”).
[116]. 29 U.S.C. § 158(d) (2012) (“For the purposes of this section, to bargain collectively is the performance of the mutual obligation of the employer and the representative of the employees to meet at reasonable times and confer in good faith with respect to wages, hours, and other terms and conditions of employment, or the negotiation of an agreement, or any question arising thereunder, and the execution of a written contract incorporating any agreement reached if requested by either party . . . .”).
[117]. See id.
[118]. E.g., Leach Corp., 312 N.L.R.B. 990 (1993) (holding that the employer engaged in unfair labor practices under the Act by refusing to recognize the union regarding a technological upgrade to the production process); Columbia Tribune Publ’g Co., 201 N.L.R.B. 538, 538 (1973) (holding that the employer engaged in unfair labor practices after instituting a technological production change that might “require lesser skills” and that might “justif[y] . . . paying lower rates”).
[119]. Renton News Record, 136 N.L.R.B. 1294, 1297 (1962).
[120]. See Kay Lazar & Matt Rocheleau, State Police to Activate GPS in Cruisers, Restructure Staff, Bos. Globe (May 2, 2018), https://www.bostonglobe.com/metro/2018/05/02/policereforms/GrDXW7qM1I5wt2mdutWfhM/story.html?p1=Article_Inline_Bottom [https://perma.cc/7Z8J-LSZU] (“The State Police Association of Massachusetts, which represents the bulk of the force, said last month it believes the union contract calls for such sweeping changes to be negotiated and not thrust on troopers.”); see also Maria Cramer, Boston Police Officers Wary of GPS for Cruisers, Bos. Globe (Nov. 18, 2013), https://www.bostonglobe.com/metro/2013/11/18/gps-now-monitor-bpd/Vc6qOHTlvehT2YzYWIQkiP/story.html [https://perma.cc/W72L-DR23] (noting concerns by the Boston Police Patrolmen’s Association about the introduction of GPS).
[121]. See Zuri Berry, Judge Denies Police Union Injunction on Body Cameras, Bos. Herald (Sept. 9, 2016), https://www.bostonherald.com/2016/09/09/judge-denies-police-union-injunction-on-body-cameras [https://perma.cc/GNK8-Q693] (reporting that a state judge denied the Boston Patrolmen’s Association request for an injunction on a body camera pilot program); Tony Briscoe, Judge Says CPD Violated State Labor Laws in Body Cam Expansion, Chi. Trib. (Jan. 4, 2018), https://www.chicagotribune.com/news/local/breaking/ct-met-body-camera-ruling-20180103-story.html [https://perma.cc/2WRC-SBW6] (“An administrative law judge says the Chicago Police Department violated state labor law when it failed to negotiate expansion of its body camera program with the city’s largest police union . . . .”).
[122]. Generally speaking, smart cities are those that rely heavily on sensors to manage functions like traffic and parking. See, e.g., Sophie Quinton, What Is a Smart City?, GovTech (Apr. 26, 2016), https://www.govtech.com/fs/What-Is-a-Smart-City.html [https://perma.cc/F9Y6-QGBB].
[123]. See Vincent J. Webb & Charles M. Katz, Citizen Ratings of the Importance of Community Policing Activities, 20 Policing: Int’l J. Police Strategy & Mgmt. 1, 7 (1997) (“Ironically, given the attention and resources that community policing has received, there is limited agreement among police professionals and academics as to the definition of community policing.”).
[124]. Jerome Skolnick & David H. Bayley, Theme and Variation in Community Policing, 10 Crime & Just. 1, 4–5 (1988).
[125]. See id. at 15–18.
[126]. Reisig, supra note 64, at 13.
[127]. Nat’l Advisory Comm’n on Civil Disorders, Report of the National Advisory Commission on Civil Disorders 1 (1968).
[128]. Id. at 5. The Commission identified policing as a symbol of racial prejudice as perceived by African Americans: “The atmosphere of hostility and cynicism is reinforced by a widespread belief among Negroes in the existence of police brutality and in a ‘double standard’ of justice and protection—one for Negroes and one for whites.” Id.
[129]. Id.
[130]. See, e.g., Reisig, supra note 64, at 13.
[131]. See Skolnick & Bayley, supra note 124, at 2 (citing Police Foundation, The Newark Foot Patrol Experiment (1981)) (noting “randomized motorized patrolling neither reduces crime nor improves the chances of catching criminals.”).
[132]. Herman Goldstein, Toward Community-Oriented Policing: Potential, Basic Requirements, and Threshold Questions, 33 Crime & Delinq. 6, 8–9 (1987).
[133]. George Kelling & James Q. Wilson, Broken Windows: The Police and Neighborhood Safety, Atlantic (Mar. 1982), https://www.theatlantic.com/magazine/archive/1982/03/broken-windows/304465 [https://perma.cc/JP89-EYDF].
[134]. And while the broken windows thesis may have spurred the broader movement to adopt a community policing model, some have disputed whether the particular strain of order maintenance policing that enforced the smallest of offenses (zero tolerance policing) can be considered a form of community policing at all. See, e.g., RAND Corp., Zero Tolerance and Aggressive Policing (And Why to Avoid It), https://www.rand.org/pubs/tools/TL261/better-policing-toolkit/all-strategies/zero-tolerance.html [https://perma.cc/TQ4B-XG7T] (“Zero tolerance policing[, which] is sometimes known as ‘aggressive policing’ or ‘aggressive order maintenance[,]’ is sometimes incorrectly tied to ‘broken windows’ policing.”).
[135]. See, e.g., Stephen Mastrofski, Critic: Community Policing: A Skeptical View, in Police Innovation 45, 58 (David Weisburd & Anthony Braga eds., 2d ed. 2019) (“Studies do not identify a consistent or strong crime control effect issuing from community policing.”).
[136]. Reisig, supra note 64, at 20.
[137]. Skolnick & Bayley, supra note 124, at 34.
[138]. Tom Tyler, credited with founding the procedural justice studies field, defined legitimacy as “a property of an authority or institution that leads people to feel that that authority or institution is entitled to be deferred to and obeyed.” Jason Sunshine & Tom R. Tyler, The Role of Procedural Justice and Legitimacy in Shaping Public Support for Policing, 37 L. & Soc’y Rev. 513, 514 (2003).
[139]. See, e.g., Skolnick & Bayley, supra note 124, at 33 (“Community policing probably raises the morale of the police involved because it multiplies the positive contact they have with those supportive people in a community who welcome police presence and activity.”).
[140]. See Tom R. Tyler, Why People Obey the Law 5 (2006) (observing that “legitimacy in the eyes of the public is a key precondition to the effectiveness of authorities”).
[141]. See, e.g., Tom R. Tyler, Social Justice: Outcome and Procedure, 35 Int’l J. Psychol. 117, 119–20 (2000) (concluding that procedural justice factors are more important than the outcomes of police interactions for assessing legitimacy).
[142]. See U.S. Dep’t of Justice, Bureau of Justice Assistance, Understanding Community Policing: A Framework for Action 1, 15 (1994) (“Community partnership means adopting a policing perspective that exceeds the standard law enforcement emphasis. This broadened outlook recognizes the value of activities that contribute to the orderliness and well-being of a neighborhood. These activities could include: helping accident or crime victims, providing emergency medical services, helping resolve domestic and neighborhood conflicts (e.g. family violence, landlord-tenant disputes, or racial harassment), working with residents and local businesses to improve neighborhood conditions, controlling automobile and pedestrian traffic, providing emergency social services and referrals to those at risk (e.g. adolescent runaways, the homeless, the intoxicated, and the mentally ill), protecting the exercise of constitutional rights (e.g. guaranteeing a person’s right to speak, protecting lawful assemblies from disruption), and providing a model of citizenship (helpfulness, respect for others, honesty, and fairness).”). Such contacts may also help police morale. See Skolnick & Bayley, supra note 124, at 33 (noting same).
[143]. Council Regulation 2016/679, The Protection of Natural Persons With Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC, 2016 O.J. (L 119) 1, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679 [https://perma.cc/T487-2J3K]. The scope and interpretation of this right remain unsettled. For one interpretation, see Margot E. Kaminski, The Right to Explanation, Explained, 34 Berkeley Tech. L.J. 189, 204 (2019) (asserting that “an individual has a right to explanation of an individual decision because that explanation is necessary for her to invoke the other rights—e.g., to contest a decision, to express her view—that are explicitly enumerated in the text of the GDPR”).
[144]. Nicholas Diakopoulos et al., Principles for Accountable Algorithms and a Social Impact Statement for Algorithms, FAT/ML, https://www.fatml.org/resources/principles-for-accountable-algorithms [https://perma.cc/9A2J-8WX7] (last visited Sept. 19, 2019).
[145]. See Aaron Smith, Pew Research Ctr., Public Attitudes Toward Computer Algorithms 1, 9 (2018) (“Americans are largely skeptical about the fairness of [certain types of algorithmic decisionmaking]: None is viewed as fair by a clear majority of the public.”).
[146]. Sid Harring, Taylorization of Police Work: Prospects for the 1980s, 11 Critical Soc. 25, 28 (1981).
[147]. Id. at 30.
[148]. Many thanks to Professor Geoffrey Rockwell for this observation.
[149]. See, e.g., Osoba & Welser IV, supra note 1, at 11 (“We . . . have not always done a stellar job forecasting what tasks are difficult for artificial agents to learn . . . . [W]e are poor at objective estimations of cognitive and processing difficulty. This bias in judgment makes forecasting the evolution of work error-prone.”).