Sunday, September 17, 2023

In the future, will everyone have their own personal surveillance drone?

https://link.springer.com/chapter/10.1007/978-3-031-40118-3_3

Facial Recognition Technology, Drones, and Digital Policing: Compatible with the Fundamental Right to Privacy?

Drones are the new gadget law enforcement agencies cannot get enough of. These agencies widely deploy drones for, among other purposes, search and rescue operations or in response to natural disasters. The benefits these drones offer are unquestionable. However, drones are increasingly being deployed for a less self-evident and legitimate purpose: surveillance. The recourse to drones for surveillance operations is highly problematic, given their intrusiveness on citizens’ fundamental right to privacy. This intrusiveness becomes even more worrisome when drones are equipped with facial recognition technology. Consequently, this paper critically examines law enforcement’s recourse to facial recognition technology in drones and the worrying consequences of such deployment for citizens’ fundamental right to privacy.





Lazy government. What else is new?

https://www.pogowasright.org/gao-reports-shortcomings-in-federal-law-enforcement-on-privacy-and-civil-liberties/

GAO reports shortcomings in federal law enforcement on privacy and civil liberties

Joe Cadillic sends along this concise write-up from Matthew Casey at KJZZ:
The first says a Homeland Security intelligence office that shares sensitive data with police and others has not done audits to make sure employees accessing the info have permission.
The second says a number of federal law enforcement agencies use facial recognition technology but don’t have a corresponding policy to protect civil rights and liberties.
And the last report says Homeland Security needs to do much more to protect the privacy of those whose info eventually goes into a delayed and over-budget system expected to store biometrics for hundreds of millions of people.





Could a standalone AI serve as the “house ethicist”?

https://www.sciencedirect.com/science/article/abs/pii/S1546084323001128

Socrates in the Machine: The “House Ethicist” in AI for Healthcare

The physical presence of an on-site ethics expert has emerged as one way of addressing current and future ethical issues associated with the use of artificial intelligence in healthcare. We describe and evaluate different roles of the “house ethicist,” a figure we consider beneficial to both artificial intelligence research and society. However, we also argue that the house ethicist currently needs further professionalization to meet a number of challenges outlined here, and that both individual and concerted community efforts will be required to ensure the flourishing of the embedded ethics approach.





Redefining privacy.

https://kilthub.cmu.edu/articles/thesis/Reframing_Privacy_in_the_Digital_Age_The_Shift_to_Data_Use_Control/24097575

Reframing Privacy in the Digital Age: The Shift to Data Use Control

This dissertation is concerned with privacy: specifically, vulnerabilities to harm that stem from infringements on individuals’ privacy that are unique to or exacerbated by the modern power of predictive algorithms in the contemporary digital landscape. For one ubiquitous example, consider how facial recognition technology has evolved over the past decade and fundamentally altered the sense in which our personal image is exposed, searchable, and manipulable in digital spaces. Modern algorithms are capable, based on relatively few data points (often in the form of photos freely uploaded to online albums), of identifying individuals in pictures in a variety of contexts—in different settings, from different angles, even at vastly different ages, sometimes including baby pictures! Relatedly, reverse image search is now quite an effective tool, in many cases allowing anyone with access to easily ascertain someone’s identity from a single photograph. And of course, image manipulation has progressed by leaps and bounds in recent years, approaching a point where predictive algorithms can, for instance, generate false but eerily accurate portrayals of people in situations they may never have actually been in.
