Sunday, August 06, 2023

Might be interesting to follow…

https://www.commondreams.org/news/chinook-center-aclu-lawsuit

ACLU Sues Colorado Springs, FBI Over 'Unconstitutional' Spying on Activists' Devices

The ACLU of Colorado on Tuesday filed a federal lawsuit against the city of Colorado Springs, four members of the Colorado Springs Police Department, and the Federal Bureau of Investigation, accusing them of illegally spying on the private communications of a local activist arrested on minor—and critics say dubious—charges during a 2021 housing rights protest.

The suit continues:

Ultimately, a CSPD commander ordered arrests of prominent Chinook Center members for marching in the street, even after the protestors complied with police requests to move onto the sidewalk.
Colorado Springs police then obtained a search warrant—one of several that are the subject of this lawsuit—to search the Chinook Center's private chats on Facebook Messenger. The warrant did not even purport to be supported by probable cause. It was not limited to a search for any particular evidence, let alone evidence of a particular crime, and it was unlimited as to topics.





Gotta love these “How To” manuals…

https://www.researchgate.net/profile/Abu-Rayhan-11/publication/372775589_THE_DARK_SIDE_OF_INTELLIGENCE_HOW_TO_MANIPULATE_AND_CONTROL_WITH_CHATBOTS/links/64c7c2a13d1a321c1b4cf3b3/THE-DARK-SIDE-OF-INTELLIGENCE-HOW-TO-MANIPULATE-AND-CONTROL-WITH-CHATBOTS.pdf

THE DARK SIDE OF INTELLIGENCE: HOW TO MANIPULATE AND CONTROL WITH CHATBOTS

Chatbots are becoming increasingly sophisticated, and with this sophistication comes the potential for misuse. In this paper, we explore the dark side of chatbot intelligence, examining the ways in which chatbots can be used to manipulate and control users. We begin by discussing the psychology of intelligence, the ethics of artificial intelligence, and the implications of dark intelligence. We then explore the power of chatbots to manipulate and control users, examining the psychology of persuasion, the manipulative power of chatbots, and the use of chatbots in advertising and marketing. We also examine the role of chatbots in political campaigns, highlighting the ways in which chatbots can be used to sway public opinion and influence election outcomes. Finally, we discuss the art of deception and the use of chatbots for fraud and scams. This paper provides a comprehensive overview of the dark side of chatbot intelligence. It discusses the potential for chatbots to be used for malicious purposes, and it provides insights into how these dangers can be mitigated. The paper is intended for researchers, developers, and policymakers who are interested in the ethical and legal implications of chatbot technology.





This makes it even more difficult to determine if the training data is likely to produce a “true” outcome.

https://mindmatters.ai/2023/08/the-secret-ingredient-for-ai-ergodicity/

THE SECRET INGREDIENT FOR AI: ERGODICITY

Before applying AI in deep convolutional neural networks, practitioners need to address whether the problem under consideration is “ergodic.”

We are rightly amazed when deep learning wins at Atari arcade games using only display pixels. But in doing so, the AI is exposed to the same game again and again (and again). The scenarios change, but the game and its rules remain static. The same is true with chess or Go. When trained and tested against a human opponent, we know that the AI will be playing the same game.

Statisticians know that ergodicity comes in many flavors. Here ergodicity simply means that the data used to train AI must characterize similar data not yet seen. [i.e., a representative sample - Bob]
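
To make that "representative sample" point concrete, here is a toy sketch of my own (not from the article): a classifier fit on data drawn from one distribution keeps working on fresh data drawn the same way, but its accuracy drops once the underlying distribution drifts. The distributions, the shift, and the threshold rule are all invented purely for illustration.

# Toy illustration of (non-)ergodicity as "training data must represent future data".
# Assumed/synthetic: two 1-D Gaussian classes and a shift that moves the test data.
import numpy as np

rng = np.random.default_rng(0)

def sample(n, shift=0.0):
    """Two classes as 1-D Gaussians; `shift` moves the whole distribution."""
    x0 = rng.normal(loc=-1.0 + shift, scale=1.0, size=n)   # class 0
    x1 = rng.normal(loc=+1.0 + shift, scale=1.0, size=n)   # class 1
    x = np.concatenate([x0, x1])
    y = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])
    return x, y

# "Training": learn the midpoint between the two class means.
x_train, y_train = sample(5000)
threshold = (x_train[y_train == 0].mean() + x_train[y_train == 1].mean()) / 2

def accuracy(x, y):
    pred = (x > threshold).astype(int)
    return (pred == y).mean()

x_same, y_same = sample(5000, shift=0.0)    # drawn like the training data
x_drift, y_drift = sample(5000, shift=1.5)  # the world has moved

print(f"accuracy on data like the training set: {accuracy(x_same, y_same):.2f}")
print(f"accuracy after distribution shift:      {accuracy(x_drift, y_drift):.2f}")

On data drawn like the training set the toy model scores around 84%; after the shift it falls to roughly 65%, even though nothing about the model changed. That is the Atari/chess point in miniature: the game stayed the same during training, so the model only ever learned that one game.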





Does it work?

https://dataspace.princeton.edu/handle/88435/dsp01dn39x481s

Faces at Face Value: An Analysis of Face Recognition Technology Policy and Performance

Facial recognition is one of the best-developed and most widely used applications of machine learning and artificial intelligence in 2023. What was once a distant idea of futurism is now used in nearly every smartphone for identity verification. Alongside the convenient uses of the technology stand its more Orwellian counterparts - most notably, the use of face recognition for public surveillance. The development of the technology and the ubiquity of high-quality video recording devices like traffic cameras, surveillance cameras, and police body cameras enable the permeation of this technology throughout all spheres of life. Such technology invites the fear of constant surveillance and the decline in individual privacy, particularly in public areas. While the field has been subject to significant research and policy interest, it remains insufficiently regulated and misunderstood from a technological perspective. This study aims to comprehensively gauge the current policies surrounding the use of face recognition - specifically regarding its implementation on police body cameras and public footage, as well as the use of personal photos in face recognition databases. Additionally, this study aims to quantify its accuracy in the face of adversarial factors - specifically, age-invariant cross-demographic identity tracking, i.e., identity matching with photos from different ages across different ethnic groups.
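
For a feel of what "quantifying accuracy across demographic groups" looks like in practice, here is a rough sketch of my own, not the thesis's method: it reports the true-match rate (genuine pairs accepted) and false-match rate (impostor pairs accepted) per group at one fixed similarity threshold. The embeddings, the per-group noise levels, and the threshold are synthetic stand-ins; a real evaluation would use a trained face encoder and labeled photo pairs.

# Per-group face-matcher scoring with synthetic embeddings (all values assumed).
import numpy as np

rng = np.random.default_rng(1)
DIM, THRESHOLD = 128, 0.75          # embedding size and similarity cutoff (assumed)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def genuine_pair(noise):
    """Two 'photos' of the same person: shared identity vector plus noise."""
    base = rng.normal(size=DIM)
    return base, base + rng.normal(scale=noise, size=DIM)

def impostor_pair():
    """Two different people: independent identity vectors."""
    return rng.normal(size=DIM), rng.normal(size=DIM)

# Pretend the encoder is noisier for group "B" (e.g. due to training-data skew).
for group, noise in {"A": 0.3, "B": 0.8}.items():
    trials = 2000
    tmr = np.mean([cosine(*genuine_pair(noise)) > THRESHOLD for _ in range(trials)])
    fmr = np.mean([cosine(*impostor_pair()) > THRESHOLD for _ in range(trials)])
    print(f"group {group}: true-match rate {tmr:.2%}, false-match rate {fmr:.2%}")

In this made-up setup, group B's genuine pairs clear the threshold noticeably less often than group A's while false matches stay rare for both, which is the kind of per-group gap a study like this one would be trying to measure on real data.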


