Atlanta is Watching You

--

“Surveillance states become accepted after claims of a threat of terrorism. Perhaps the most iconic story of mass surveillance is George Orwell’s 1949 novel Nineteen Eighty-Four, which depicts a dystopian surveillance state (Oceania) created ‘to protect’ its citizens from all types of evil.” This is how Atlanta’s Creative Loafing newspaper describes how surveillance came to be accepted in Atlanta, GA.

For better or worse, fears of Big Brother have diminished, but that doesn’t mean widespread monitoring has disappeared from Atlanta. In fact, it has increased. Atlanta is now under more surveillance than at any point in the city’s history. And if things go according to Operation Shield’s plan, more than $50 million will be spent over the next five years to install and link 10,000 cameras (like the one pictured above) across the city, roughly one for every 43 Atlantans.

The APD’s (Atlanta Police Department) Loudermilk Video Integration Center (pictured here) went live in August 2011 under the direction of now-Chief Erika Shields.

On an average day, four active-duty officers watch over video feeds covering primarily Downtown, Midtown, and Buckhead. Atlanta Police Foundation President and CEO Dave Wilkinson, who has masterminded Atlanta’s surveillance plan since 2007, contends that Operation Shield will soon be the most “effective” and “robust” video surveillance system in the United States.

Currently, Operation Shield appears to be focused on “large-grained” photographs, such as those of automobiles and license plates. But nothing prevents the technology from being applied to pictures of individuals as well, and that is when real issues are likely to arise, issues like those now being created in Los Angeles County.

The Los Angeles County Sheriff’s Department has made public its partnership with NEC Corporation of America (NEC), which will allow the department to utilize NEC’s facial recognition program.

The FBI is currently facing lawsuits attempting to force transparency regarding its Next Generation Identification system, which contains the faces of almost half of all Americans. Will NEC and the L.A. County Sheriff’s Department also fight against transparency when it comes to their new facial recognition toy?

It was recently reported that “more than two dozen civil rights organizations are calling on Amazon CEO Jeff Bezos to stop selling its facial recognition technology to the government, according to a letter recently made public by the American Civil Liberties Union (ACLU).” (See the ACLU letter: https://www.aclunc.org/docs/20180522_AR_Coalition_Letter.pdf)

The problem with the Amazon technology, along with that offered by NEC, IBM, and other technology suppliers, is that it is unreliable when matching photographs of individual human beings against those stored in various databases.
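To see why such matching is brittle, consider a toy sketch of how these systems typically decide a “match”: each face is reduced to a numeric vector (an embedding), and two faces are declared the same person when their similarity clears a fixed threshold. The vectors, dimensions, and threshold below are all invented for illustration; they are not drawn from Amazon’s, NEC’s, or IBM’s actual systems.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, gallery_entry, threshold=0.8):
    """Declare a match when similarity clears the decision threshold."""
    return cosine_similarity(probe, gallery_entry) >= threshold

# Toy 4-dimensional "embeddings" (real systems use 128+ dimensions).
probe = [0.9, 0.1, 0.3, 0.2]       # face captured by a street camera
same = [0.85, 0.15, 0.28, 0.25]    # same person, database mugshot
other = [0.1, 0.9, 0.4, 0.1]       # a different person

print(is_match(probe, same))   # True  -> declared a match
print(is_match(probe, other))  # False -> rejected
```

Everything hinges on that threshold: set it low and innocent people are “matched”; set it high and real matches are missed. Poor-quality camera images push genuine and impostor similarity scores closer together, so no threshold works well.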

Image quality is the largest contributing factor to recognition accuracy. Research results note a “four fold reduction in miss rates using mugshots vs. webcam images,” which says a great deal about the viability of deploying facial recognition applications using webcams or other real-time photographic devices like street and crowd-control cameras.

A receiver operating characteristic curve, i.e. ROC curve, is a graphical plot that illustrates the diagnostic capability of a binary classifier system, like facial recognition. It plots the true positive rate against the false positive rate at each possible cutpoint of a diagnostic test. The graph at left shows three ROC curves representing excellent, good, and worthless tests plotted on the same graph. Accuracy is measured by the area under the ROC curve: an area of 1 represents a perfect test (yellow line); an area of 0.5 represents a worthless test (dark blue line).
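The construction just described can be sketched in a few lines of Python: sweep a decision threshold over match scores, record the (false positive rate, true positive rate) point at each cutpoint, and integrate. The scores and labels below are hypothetical, chosen only to show how the curve and its area are computed.

```python
def roc_curve(scores, labels):
    """Sweep a threshold over match scores (label 1 = genuine pair,
    0 = impostor pair) and record (FPR, TPR) at each cutpoint."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)           # genuine (same-person) pairs
    neg = len(labels) - pos     # impostor (different-person) pairs
    tp = fp = 0
    points = [(0.0, 0.0)]
    for score, label in pairs:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve: 1.0 is perfect, 0.5 worthless."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2
    return area

# Hypothetical match scores from eight face-pair comparisons.
scores = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40, 0.30, 0.20]
labels = [1,    1,    1,    0,    1,    0,    0,    0]
print(auc(roc_curve(scores, labels)))  # -> 0.9375, between "good" and "excellent"
```

A classifier that scored every genuine pair above every impostor pair would yield an area of exactly 1; one that assigned scores at random would hover around 0.5.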

Researchers at the National Institute of Standards and Technology (NIST) studied the facial recognition performance of humans and seven algorithms on the “difficult face pairs” immediately below and the “easy face pairs” further below, using ROC curves.

Only three of the algorithms outperformed humans (dark black line) on the difficult face pairs at most or all combinations of verification rate and false accept rate; humans outperformed the other four algorithms on those pairs. Algorithms did better on the easy face pairs, most likely because of similar frontal comparisons and lighting, with all but one algorithm performing more accurately than humans.
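The comparison above is made at fixed false accept rates: for a tolerated rate of wrongly accepted impostors, how many genuine matches does each system verify? A minimal sketch of reading that figure off from score lists (the genuine and impostor scores below are invented for illustration):

```python
def verification_rate_at_far(genuine, impostor, target_far):
    """Find the most permissive threshold whose false accept rate stays
    at or below target_far, and report the verification rate there."""
    best = 0.0
    for t in sorted(set(genuine + impostor), reverse=True):
        far = sum(s >= t for s in impostor) / len(impostor)
        if far <= target_far:
            best = sum(s >= t for s in genuine) / len(genuine)
    return best

genuine = [0.9, 0.8, 0.7, 0.4]   # same-person comparison scores
impostor = [0.6, 0.5, 0.3, 0.2]  # different-person comparison scores
print(verification_rate_at_far(genuine, impostor, 0.25))  # -> 0.75
```

Comparing two algorithms at the same target false accept rate is what makes the verification rates in the study directly comparable; one point on the ROC curve, rather than the whole area, is what matters operationally.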

The problem of producing the same levels of certainty about a facial match across different frontal comparisons is cleverly illustrated by Abhishek Dutta in his PhD thesis, using the three illustrations below. They also explain the difficulty of using both the Amazon and NEC algorithms: neither controls facial image 1, taken by a camera, or facial image 2, stored in a database. When the input picture and the database picture present different facial views, facial recognition applications encounter difficulties that should disqualify them for law enforcement purposes.

As MIT’s Joy Buolamwini explains, a facial recognition system can also suffer from what she calls “algorithmic bias” when the database against which input pictures are compared is insufficiently robust.
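One simple way to surface this kind of bias is to break a system’s accuracy down by demographic group, in the spirit of Buolamwini’s audits. The sketch below uses invented audit results; the group labels and outcomes are hypothetical, not measurements of any real product.

```python
def accuracy_by_group(results):
    """results: list of (group, correct) pairs, where correct is 1 or 0.
    Returns per-group classification accuracy."""
    totals, correct = {}, {}
    for group, ok in results:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + ok
    return {g: correct[g] / totals[g] for g in totals}

# Hypothetical audit: each tuple is (demographic group, classified correctly?).
results = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
           ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(accuracy_by_group(results))  # -> {'A': 0.75, 'B': 0.25}
```

A single aggregate accuracy number would hide the gap entirely: here the overall rate is 50%, yet one group is misclassified three times as often as the other.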

In 2015, a black software developer embarrassed Google by tweeting that the company’s Photos service had labeled photos of him with a black friend as “gorillas.” Google declared itself “appalled and genuinely sorry.” An engineer who became the public face of the clean-up operation said the label gorilla would no longer be applied to groups of images, and that Google was “working on longer-term fixes.”

More than two years later, one of those fixes was erasing gorillas, and some other primates, from the service’s vocabulary. The awkward workaround illustrates the difficulties Google and other tech companies have in advancing face-recognition technology, which they hope to use in self-driving cars, personal assistants, and other products.

The closer a system’s ROC curve lies to the worthless diagonal, and the higher its algorithmic bias, the more concern everyone, including the ACLU and law enforcement agencies, should have about the law enforcement utility of any facial recognition system, including Amazon’s Rekognition, NEC’s NeoFace Match, and IBM’s Intelligent Video Analytics.

___________________________________________________________________

Notes:

  1. https://www.buckhead.com/buckhead-blue-light-police-cameras-do-they-really-work/
  2. https://creativeloafing.com/content-170670-Atlanta-under-surveillance
  3. https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/
  4. https://www.nist.gov/sites/default/files/documents/2016/12/14/facerecognitionalgorithmssurpasshumans.pdf
  5. Ibid.
  6. https://ris.utwente.nl/ws/portalfiles/portal/6053415

Originally published at neutec.wordpress.com on May 29, 2018.
