Amazon's facial recognition technology isn't the only tool that Orlando is considering in its efforts to monitor the public using street cameras – the University of Central Florida has also developed a mass surveillance system for the city.
UCF researchers have installed artificial intelligence-powered software that can recognize facial characteristics and body movements to detect assaults, robberies and even explosions in real time.
Orlando was the subject of blistering criticism in May after reports revealed that the police department was piloting Amazon's Rekognition, a facial-recognition technology that plugs into security cameras to identify and track people of interest as they walk down the street. Rekognition is currently being tested on a small number of cameras throughout the city with volunteer Orlando police officers, but once it becomes fully operational, it will look for "persons of interest" by tapping into Orlando's network of surveillance cameras and essentially scanning everyone it can see until it finds a match.
But in early 2016, students and professors from UCF's Center for Research in Computer Vision were already using a $1.3 million federal grant to start testing software for the city's surveillance network that would instantly flag suspicious activities.
The Orlando Police Department assigns officers to watch dozens of screens displaying live feeds from the city's network of an estimated 180 security cameras in a second-floor hub called the IRIS room. But the department eventually realized its surveillance capabilities were ineffective, says Raymond Surette, a criminal justice professor at UCF and one of the project's researchers.
"There are simply too many cameras and too few monitors," UCF researchers wrote in a proposal for a grant from the National Institute of Justice. "Thousands of cameras go unwatched and hours of video unviewed."
The IRIS room was also inconsistently staffed with officers assigned to light duty, Surette says.
"The goal is to get the human out of the task of sitting there and watching the screens," Surette says. "Once you get above like 10 or 15 monitors, people just start missing things like crazy. There's this thing called inattentional blindness that kicks in ... psychologically, it's just a boring task."
With surveillance technology, the ultimate goal for police is predicting escalating threats to public safety so that "an assault on the street doesn't turn into a murder on the street," Surette says.
To bolster OPD's surveillance capabilities, UCF researchers proposed a four-function computer vision workstation with analytical software for anomaly detection, face-attribute prediction, body-attribute prediction and action detection to test on cameras they originally planned to install in Orlando's Rosemont neighborhood. First, though, researchers had to develop and test algorithms to train the software using hundreds of video clips containing simple actions like jumping, kicking and punching as well as more intricate activities like playing the violin or sumo wrestling.
Like a human, the software needs to be trained to accurately distinguish the normal from the abnormal. "You need to anticipate that these things will happen, so you show enough examples of it to the system," says Mahdi Kalayeh, a computer vision researcher at UCF whose software is embedded on OPD's workstation.
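The training approach Kalayeh describes can be illustrated with a toy sketch. The real UCF system uses deep video models trained on hundreds of clips; here, hypothetical motion-feature vectors stand in for extracted video features, and a simple nearest-centroid classifier stands in for the learned model. All names and values are illustrative, not drawn from the actual software.

```python
# Illustrative toy: a nearest-centroid classifier over "feature vectors"
# standing in for motion features extracted from video clips. The real
# system uses deep learning; this only sketches the train-on-examples idea.
from statistics import mean
from math import dist

def train(examples):
    """examples: list of (feature_vector, label). Returns per-label centroids."""
    by_label = {}
    for vec, label in examples:
        by_label.setdefault(label, []).append(vec)
    # Centroid = component-wise mean of all vectors sharing a label
    return {label: tuple(mean(col) for col in zip(*vecs))
            for label, vecs in by_label.items()}

def classify(centroids, vec):
    """Assign the label whose centroid is nearest to the clip's features."""
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Hypothetical features: (average speed of motion, burstiness of motion)
training_clips = [
    ((0.2, 0.1), "normal"), ((0.3, 0.2), "normal"),
    ((0.9, 0.8), "fight"),  ((1.0, 0.9), "fight"),
]
centroids = train(training_clips)
print(classify(centroids, (0.95, 0.85)))  # → fight
```

The more labeled examples the system sees, the better its internal model of each activity becomes, which is why the researchers fed it hundreds of clips per category.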
Action recognition and facial attribute software, Kalayeh says, fall under a field of artificial intelligence called object recognition. Unlike facial recognition, which looks for facial traits to identify a person in real time within a camera's field of view, action recognition is broader – looking for abnormal movements and gestures to alert police to suspicious events, or "anomalies," like fights, drug deals or robberies.
UCF researchers included anomalies such as abuse, arrest, arson, assault, burglary, explosion, fighting, road accidents, robbery, shooting, shoplifting, stealing and vandalism to train the software. The software could also label actions the surveillance videos recorded, creating a database officers could search by activity.
To get ahead of a potential crime, Kalayeh says, the software is predictive – it's designed to learn, through thousands of examples, what events and conditions precede an anomaly so it can alert law enforcement before future events happen.
Kalayeh wrote the workstation's facial-attribute algorithms, which are tailored to seek more specific characteristics of a person's face as shown in a mugshot or portrait. The software can detect up to 40 different attributes, ranging from large noses, stubble and high cheekbones to whether the person is "attractive," according to a report researchers submitted to the National Institute of Justice in October.
Officers can describe what someone looks like to the software – in essence, an advanced, law enforcement-tuned search engine for faces – and it will display a bank of people who match that description, pulling results from the police department's live surveillance feeds or a predefined bank of images.
"Let's say I couldn't take a photo, but the person was wearing a hoodie, he had black hair, glasses and a mustache," Kalayeh says. An officer could use the program to search those traits within the live video feed of a neighborhood. "And of course it doesn't directly give you the final person, but it narrows down from probably thousands of people to let's say 100 people."
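The kind of attribute search Kalayeh describes can be sketched as a filter over a bank of detections. This is a minimal illustration, not the UCF implementation: the detection records and attribute names below are hypothetical, standing in for the roughly 40 attributes the real software predicts per face.

```python
# Illustrative sketch of attribute-based face search: given a bank of
# detections (each with predicted attributes), return only those matching
# every attribute the officer describes. Data is hypothetical.
detections = [
    {"id": 1, "attrs": {"hoodie", "black_hair", "glasses", "mustache"}},
    {"id": 2, "attrs": {"hat", "blond_hair"}},
    {"id": 3, "attrs": {"hoodie", "black_hair", "glasses"}},
]

def search(detections, wanted):
    """Return detections whose predicted attributes include all wanted ones."""
    return [d for d in detections if wanted <= d["attrs"]]

matches = search(detections, {"hoodie", "black_hair", "glasses", "mustache"})
print([d["id"] for d in matches])  # → [1]
```

Each attribute added to the query narrows the candidate pool, which is how a description can cut thousands of faces down to a reviewable hundred without ever identifying anyone by name.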
Kalayeh's facial-attribute software looks for the same set of features as facial recognition software, but the two are used for slightly different reasons, he says. Facial-attribute software is more predictive, spotting clues to build a digital picture of a person's mood in order to anticipate their actions. An angry face could lead to hostility, while a happy face wouldn't. Facial recognition software studies the same traits in a surveillance feed to identify a person by name, using an image to match.
"We are not involved in facial-recognition work," Mubarak Shah, the founding director of UCF's CRCV and the study's principal investigator, tells Orlando Weekly. But despite the differences, Kalayeh's facial-attribute algorithms at OPD could be used to supplement Orlando's facial recognition pilot with Amazon.
Similarly, the "body-attribute prediction" recognizes characteristics like "male" and "long hair," and knows if the body it's analyzing is wearing "sunglasses, hat, T-shirt, long sleeve, formal, shorts, jeans, long pants, skirt, face mask, logo [or] stripe," according to the report.
The software was better at recognizing some activities than others – covert drug deals, for example, were hard to distinguish, Surette says, but the more example videos are given to the software, the better it gets at identifying any activity.
"If somebody walks up to their trunk, opens their trunk, and puts in their groceries, that quantitatively is pretty similar to somebody walking up to a trunk, popping it with a crowbar and taking out something," he says. "The videos from these cameras are often obscured or not the best. So you're often dealing with a very limited number of pixels. And pixels translate into data, and the less data you have, the more uncertainty you have."
Photo courtesy of Mahdi M. Kalayeh, researcher on the UCF CRCV paper "Improving Facial Attribute Prediction using Semantic Segmentation."
UCF researchers purchased 10 cameras and related equipment with the NIJ grant money, including a server and storage, to establish a "community computer vision enhanced camera network" in the Rosemont neighborhood of west Orlando.
But starting in 2017, the project was plagued by delays. The city's technology management division took over the camera network, which meant agreements had to be renegotiated. OPD had to move the IRIS room to its new headquarters on South Street. Researchers struggled to keep the workstation staffed, and expanding construction sites on Interstate 4 repeatedly disrupted the test cameras' video streams, Surette says. And, as with the city's first pilot of Amazon Rekognition, limited bandwidth on the police department's computers meant researchers at the workstation could run the software on no more than two cameras for real-time analysis, according to Surette.
Emails requested by Orlando Weekly show Rosa Akhtarkhavari, Orlando's chief information officer and the driving force behind Rekognition inside City Hall, tried to involve UCF researchers in the Rekognition project by including them in at least one general discussion in which Amazon developers talked about how the city's police, traffic and fire departments could use their technology. But ultimately, Surette says, UCF decided not to become involved in the pilot.
"The move to the new building in combination with local construction projects resulted in the significant disruption and downtime of the existing police IRIS camera network," the study said. "The purchase of cameras and support technology was additionally delayed by a lag in camera equipment specifications from the City of Orlando Technology Management Division."
Due to the delays, the equipment purchased by UCF was ultimately transferred to OPD. Before the project ended that month, researchers debuted a working computer vision workstation on June 12, running UCF's software and streaming and duplicating nine video feeds.
"The project cumulated with a well-received demonstration of the developed computer vision capabilities and workstation to the Orlando Police Department staff at project's conclusion," the study said. "OPD is considering means to continue to staff the workstation in partnership with UCF and negotiations with OPD regarding on-going utilization ... are in progress."
While UCF researchers turned in their study to the NIJ in October, Orlando was still assessing whether to continue funding students to monitor the computer vision workstation for research purposes. It currently sits in the IRIS room, unused in the department's day-to-day policing, former Deputy Chief Mark Canty, who has since resigned from OPD to join former Chief John Mina at the Orange County Sheriff's Office, told the Weekly.
"Right now, it's not being monitored," Canty says. "When the grant ended, that kind of ended [UCF's] funds to pay for those students. We're trying to assess whether we're going to continue on and keep the program going."
Despite the UCF workstation hitting a financial roadblock and Amazon's Rekognition program still being in the pilot phase, Akhtarkhavari sees no limits to the technologies Orlando might use to improve policing and advance itself as one of the nation's "smart cities" – urban areas that invest in technology and intelligent design to create sustainable high-quality housing and jobs.
"I'm envisioning a future where the city's going to use whatever is available to address the city's needs," she says. "Whether that's going to be UCF, Amazon, Microsoft, you know, the kid who's writing a solution in their garage – whatever's going to address our business needs can be stacked and support our mission. ... We have to test it, we have to prove it works, and then we have to look at the cost for that and decide if the value and accuracy is worth it for us."
Potentially, OPD could use both Amazon Rekognition and UCF's software to analyze the same video streams, Akhtarkhavari adds.
"Obviously none of that is in production," she says. "At this point, they're working on two separate projects. Now, the city needs might intersect, so I'm not saying there is any defined role for one or the other at this point. Each project on its own has to show that it is providing value and it is cost-effective. Then we'll figure out which one we want to implement and how."
Since the city's partnership with Amazon was revealed, civil liberties advocates have condemned the use of facial recognition and similar technologies by law enforcement, calling them intrusive surveillance techniques rife with potential for abuse and unregulated by any federal or state law.
"We're basically talking about attaching artificial intelligence to surveillance cameras," says Jay Stanley, a senior policy analyst with the ACLU Speech, Privacy and Technology Project. "As the AI gets better and better, it will allow individuals to be monitored in more intrusive and accurate ways, and that has the potential to create real, chilling effects on our society. How would you feel driving home with a police car behind you? I don't want to feel that way all the time."
Canty argues that while OPD has noted the public's concern, the department is committed to ensuring it uses the technology in a proper way that allows for community input. As he puts it, "This is not Minority Report."
"The goal is to look for people who are out there trying to hurt other people," he says. "We're looking for violent criminals and people who are in danger, in cases where we have information. It's not just scanning a bunch of photos and hopefully we get something. We're working a specific case, we have a specific suspect or endangered person, and we're going to try to find them to prevent them from either harming somebody or to be recovered."
Civil liberties organizations including the ACLU and the Project on Government Oversight, as well as Amazon's own employees, have pleaded with the company to stop selling its product to law enforcement agencies. But the retail and shipping giant has ignored these requests, says Jake Laperruque, senior counsel with POGO.
Documents uncovered by POGO reveal that officials with U.S. Immigration and Customs Enforcement met with Amazon last summer so that the company could pitch its Rekognition program to the federal agency.
Laperruque argues that real-time facial recognition surveillance could help the government track undocumented immigrants simply because of their legal status and scare them away from public places where cameras are located. Misidentification is a well-documented risk in facial recognition systems, particularly for dark-skinned women, and advocates fear a mistake could lead to a wrongful arrest or a deadly encounter.
"City governments and police departments need to put the question before their citizens who are going to be affected by these programs before they roll them out," Laperruque says. "Police and federal agencies spy first, ask permission later. This is highly advanced surveillance technology. These are the types of questions that should be going to the public before law enforcement is putting these tools into practice."
At a December meeting with Florida lawmakers, Mina was asked whether the Orange County Sheriff's Office would be participating in any facial recognition programs. State Rep. Anna Eskamani, D-Orlando, who was at the meeting, said Mina confirmed twice that "no facial recognition" would be brought to the county.
OCSO spokesperson Michelle Guido also told Orlando Weekly that "there were no plans in the immediate future for the Sheriff's Office to undertake any kind of facial recognition program." Asked if the Sheriff's Office would consider using UCF's software, Guido said there were "no plans to adopt any of those programs at this time."
Facial recognition technology is virtually unregulated by state and federal laws, and there are currently no bills in the Legislature that address the issue. Eskamani said that while the use of this technology is being watched in Central Florida because of Orlando's involvement with Amazon, it's "not on everybody else's radar."
Still, Eskamani is encouraging organizations to take the "Safe Face Pledge" started by the Algorithmic Justice League and the Center on Technology & Privacy at Georgetown Law. The pledge promises to mitigate abuse of facial analysis technology. Among other things, it asks tech companies to refrain from selling facial recognition software to law enforcement "unless a governing legislative body has explicitly and publicly considered all potential harms and authorized use of the technology."
"This has to lead to a larger conversation on policy," Eskamani says. "With this technology, we have to make sure we don't oppress people's privacy rights."