Wednesday, May 15, 2019
So a cop using the old Mark I Eyeball can still recognize crooks. A device that captures images of faces and presents them to a person for identification seems to be outlawed too.
San Francisco Bans Facial Recognition Use by Police
San Francisco on Tuesday became the first US city to ban use of facial recognition technology by police or other government agencies.
Backers of the legislation argued that using software and cameras to positively identify people is, as Supervisor Aaron Peskin put it, "not ready for prime time."
… "The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring," read the legislation passed Tuesday.
The ban was part of broader legislation setting use and auditing policies for surveillance systems, creating high hurdles and requiring board approval before any city agency can acquire or use such technology.
"It shall be unlawful for any department to obtain, retain, access, or use any Face Recognition Technology or any information obtained from Face Recognition Technology," reads a paragraph tucked into the lengthy document.
"Face recognition technology" means an automated or semi-automated process that assists in identifying or verifying an individual based on an individual's face.
A useful(?) quick summary.
What is the California Consumer Privacy Act and Does it Apply to Me?
Quit worrying about killer robots, they are coming whether you like it or not – and they absolutely will not stop
The use of fully automated AI systems in military battles is inevitable unless strict regulations are put in place through international treaties, eggheads have opined.
Their paper, which popped up on arXiv [PDF] last week, discusses the grim outlook of developing killing machines for armed forces. The idea of keeping humans in the loop has always been favoured because modern AI systems like neural networks are black boxes: their inner workings are inherently difficult to understand. Plus, you know, we've all seen Terminator.
Counter suit anyone?
Adobe Warns Users Someone Else Might Sue Them For Using Old Versions Of Photoshop
For years we've noted repeatedly how in the modern era you no longer truly own the things you buy. From game consoles that magically lose important functionality post-purchase, to digital purchases that just up and disappear, we now live in an era where a quick firmware update can erode functionality and overlong EULAs can strip away all of your rights in an instant, leaving you with a hole in your pocket and a glorified paperweight.
The latest case in point: Adobe this week began warning users of its Creative Cloud software applications that they are no longer authorized to use older versions of the company's software platforms (Lightroom Classic, Photoshop, Premiere, Animate, and Media Director). In the letter, Adobe rather cryptically implied that users could risk copyright infringement claims by mysterious third parties if they continued using older versions of these platforms and refused to update them. End users, not surprisingly, were equal parts confused and annoyed:
… While Adobe couldn't be bothered to clarify this fact, the company was apparently making a vague reference to its ongoing legal dispute with Dolby Labs. Dolby sued Adobe last year (pdf) for copyright violations after it wasn't happy with the new revenue sharing arrangement crafted in the wake of Adobe's 2013 shift toward its controversial cloud-based "software as a subscription" model. There's really no indication that Dolby would actually sue Adobe customers, and it seems more than likely that Adobe was just interested in throwing some shade at Dolby -- without making it entirely clear that's what they were doing.
Regardless, copyright experts were quick to point out that given the overbroad nature of modern EULAs, users are completely out of luck when it comes to having any real legal recourse:
For our programmers.
Building and training machine-learning models using a web-scripting language might seem ambitious, but in 2019 it's perfectly feasible.
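To give a flavour of what "machine learning in a web-scripting language" can look like, here's a minimal sketch in plain JavaScript: a one-feature linear model trained by gradient descent, with no framework at all. The dataset, learning rate, and function name are illustrative assumptions, not taken from the article.

```javascript
// Minimal sketch (illustrative, not from the article): fit y = w*x + b
// by batch gradient descent on mean-squared error, in vanilla JavaScript.
function trainLinearModel(xs, ys, { lr = 0.01, epochs = 2000 } = {}) {
  let w = 0, b = 0;
  const n = xs.length;
  for (let epoch = 0; epoch < epochs; epoch++) {
    let gradW = 0, gradB = 0;
    for (let i = 0; i < n; i++) {
      const err = (w * xs[i] + b) - ys[i]; // prediction error
      gradW += (2 / n) * err * xs[i];      // d(MSE)/dw
      gradB += (2 / n) * err;              // d(MSE)/db
    }
    w -= lr * gradW;
    b -= lr * gradB;
  }
  return { w, b };
}

// Toy data drawn from y = 2x + 1; the fitted parameters should approach 2 and 1.
const xs = [0, 1, 2, 3, 4];
const ys = [1, 3, 5, 7, 9];
const { w, b } = trainLinearModel(xs, ys);
console.log(w.toFixed(2), b.toFixed(2));
```

Real projects would of course reach for a library such as TensorFlow.js rather than hand-rolling the gradients, but the point stands: nothing here needs more than the language itself.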
Redefine reading practice with Rivet
… Rivet is a new reading app from Area 120, Google’s workshop for experimental projects, that addresses the most common barriers to effective reading practice through a free, easy-to-use reading experience optimized for kids. Evidence shows that one of the major differences between poor and strong readers is the amount of time spent reading, so we're introducing Rivet to make high-quality reading practice available to all.
… Rivet is now available on Android smartphones, tablets, iPads, iPhones and Chromebooks in eleven countries worldwide. If you know a little reader who could benefit from better reading practice, check us out in the Play Store or App Store today.