Monday, May 09, 2022

Mission creep. You knew it would happen.

https://www.theregister.com/2022/05/08/pegasus_india_data_law_controversy/

India's ongoing outrage over Pegasus malware tells a bigger story about privacy law problems

NSO Group's Pegasus spyware-for-governments keeps returning to the headlines thanks to revelations such as its use against Spain's prime minister and senior British officials. But there's one nation where outrage about Pegasus has been constant for nearly a year and shows little sign of abating: India.

A quick recap: Pegasus was created by Israeli outfit NSO Group, which marketed the product as "preventing crime and terror acts" and promised it would only sell the software to governments it had vetted, and for approved purposes like taking down terrorists or targeting criminals who abuse children.

Those promises are important because Pegasus is very powerful: it can be delivered via "zero click" exploits that require no action from the target, after which their smartphones are an open book.

In July 2021, Amnesty International and French journalism advocacy organisation Forbidden Stories claimed Pegasus had been used well beyond its intended purpose, and claimed to have accessed a list of over 50,000 phone numbers NSO clients had targeted for surveillance.

Many were politicians, activists, diplomats, or entrepreneurs – not the sort of roles NSO said it would let governments target with Pegasus.

Over 300 Indian residents made that list – among them opposition politicians, activists, and officers of the Tibetan government in exile.

NSO has offered no explanation, or theory, for how its promises turned to dust.

The New York Times reported that Prime Minister Narendra Modi's government purchased Pegasus in 2017 as part of a weapons deal worth roughly $2 billion, but Indian politicians have resisted admitting to its acquisition or use.

The mere implication that India's government had turned Pegasus against political opponents was dynamite, and complaints poured in from those who felt they had been targeted.





A new “Most Wanted?”

https://www.axios.com/2022/05/06/data-company-headache-user-nightmare-abortion-roe

Without Roe, data will become a company headache and a user nightmare

The treasure troves of data tech companies have spent decades accumulating could put them right in the middle of efforts to prosecute people if the Supreme Court eliminates federal guarantees of abortion rights.

Why it matters: If Monday's leaked draft opinion becomes law, court orders could soon arrive at tech firm offices seeking info about individuals searching for emergency contraception, those seen near a suspected abortion clinic and more.

In addition to non-medical information such as location, shopping and search data, medical records themselves could be targeted. And those records are far more digitized than they were in the pre-Roe era.

While HIPAA restricts how providers share medical information, it doesn't prevent them from sharing it with law enforcement. "I don't think people can rely on HIPAA as being a defense in these cases if there were a criminal prosecution," said Jennifer Granick, surveillance and cybersecurity counsel at the ACLU.





Another way of saying ‘human error.’ Perhaps the instructions should be disclosed along with the algorithm?

https://techcrunch.com/2022/05/08/perceptron-ai-bias-can-arise-from-annotation-instructions/

Perceptron: AI bias can arise from annotation instructions

This week in AI, a new study reveals how bias, a common problem in AI systems, can start with the instructions given to the people recruited to annotate data from which AI systems learn to make predictions. The coauthors find that annotators pick up on patterns in the instructions, which condition them to contribute annotations that then become over-represented in the data, biasing the AI system toward these annotations.

Many AI systems today “learn” to make sense of images, videos, text, and audio from examples that have been labeled by annotators. The labels enable the systems to extrapolate the relationships between the examples (e.g., the link between the caption “kitchen sink” and a photo of a kitchen sink) to data the systems haven’t seen before (e.g., photos of kitchen sinks that weren’t included in the data used to “teach” the model).

As it turns out, annotators’ predispositions might not be solely to blame for the presence of bias in training labels. In a preprint study out of Arizona State University and the Allen Institute for AI, researchers investigated whether a source of bias might lie in the instructions written by data set creators to serve as guides for annotators. Such instructions typically include a short description of the task (e.g. “Label all birds in these photos”) along with several examples.
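The skew the researchers describe – labels that mirror the instructions' examples becoming over-represented – can be illustrated with a small check. This is a sketch, not the paper's methodology; the labels, counts, and threshold below are hypothetical:

```python
from collections import Counter

def overrepresented_labels(annotations, reference, threshold=1.5):
    """Flag labels whose share among crowd annotations exceeds their
    share in a reference sample (e.g. expert labels) by more than
    `threshold` times -- a crude signal of instruction-induced bias."""
    ann_counts = Counter(annotations)
    ref_counts = Counter(reference)
    ann_total = sum(ann_counts.values())
    ref_total = sum(ref_counts.values())
    flagged = {}
    for label, count in ann_counts.items():
        ann_share = count / ann_total
        ref_share = ref_counts.get(label, 0) / ref_total
        if ref_share > 0 and ann_share / ref_share > threshold:
            flagged[label] = round(ann_share / ref_share, 2)
    return flagged

# Hypothetical: annotators whose instructions showed mostly "bird" examples
crowd = ["bird"] * 70 + ["cat"] * 20 + ["dog"] * 10
# A smaller expert-labeled sample of the same kind of images
experts = ["bird"] * 40 + ["cat"] * 35 + ["dog"] * 25

print(overrepresented_labels(crowd, experts))  # {'bird': 1.75}
```

A distribution check like this only catches gross skew; the study's point is subtler – the bias is baked in before any such audit, via the examples annotators internalize from the task instructions.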





Mr Zillman assembles very complete lists of resources.

https://www.bespacific.com/web-guide-for-the-new-economy-2022/

Web Guide for the New Economy 2022

Via LLRX – Web Guide for the New Economy 2022: Accurate and actionable data on the economy is critical to many aspects of our research and scholarship. This guide by research expert Marcus P. Zillman provides researchers with links to information on a range of sources focused on new economy data and analysis from the public and private sectors, as well as scholarly work, news, government information, reports and alerts. Many of these sources should find a place in your customized research toolkit. The sites recommended in this guide are all free to use, and they are published by advocacy, government, corporate, academic, and international financial groups and research experts. Many of the sites are updated on a regular basis, so it is recommended that you use RSS feeds or alerts to remain abreast of changes.





Tools & Techniques.

https://finance.yahoo.com/news/turn-smartphone-flatbed-scanner-sign-163309887.html

How to turn your smartphone into a flatbed scanner to sign forms or digitize text

… Obviously, a “scan” really means taking a “photo” of what you’re pointing the camera at, but the technology can go beyond that.

Along with adding color and lighting correction to photos, today's phones also boast OCR ("optical character recognition") technology, which can translate typewritten (and even handwritten) words into editable and searchable text.


