Never assume that ‘someone will ask eventually.’
The Project for Privacy and Surveillance Accountability (PPSA) issued a press release this week. It begins:
Former U.S. Sen. Mark Udall (D-CO) and former House Judiciary Committee Chairman Bob Goodlatte (R-VA) are leading an effort by Demand Progress Education Fund (DPEF) and the Project for Privacy and Surveillance Accountability (PPSA) to compel the government to come clean about the legal basis for mass domestic surveillance of Americans in the absence of Congressional authorization.
“Our request follows months of efforts by Members of Congress and civil liberties organizations to get the government to explain on what authority the government bases domestic surveillance of U.S. persons,” said Bob Goodlatte, senior policy advisor to PPSA who joined with former Sen. Mark Udall to add their names to the Freedom of Information Act (FOIA) request submitted today to the Department of Justice, FBI and other agencies.
Section 215, known as the “business records provision” of the PATRIOT Act (later amended and reauthorized by the USA FREEDOM Act), governed the warrantless surveillance of a wide range of personal information held by businesses. To acquire such sensitive records, all the FBI had to do was assert the data sought was relevant to a foreign intelligence investigation. With the expiration of Section 215 on March 15, Members of Congress and civil liberties organizations want to know the current legal basis for government surveillance.
Read more on PPSA. I would really encourage readers who are new to the issue of massive domestic surveillance of U.S. persons to read the entire announcement and to follow some of the links to find out more. Are you really okay with the idea of the government buying huge databases with tons of personal information about you, like enriched voter databases? How about if they go and buy hacked databases from sites where people seek support for mental health, disability, or gender-identity related issues? Even if you’re “just curious” about how the government might justify such acquisitions or programs, find out more.
Legislation as a ‘we gotta do something!’ reaction.
French politicians urge deployment of surveillance technology after series of attacks
France has so far resisted a broad rollout of surveillance technology in public spaces. That could be about to change.
After a series of bloody attacks, right-wing politicians and a minister in President Emmanuel Macron's government have called for increased use of surveillance technology, breaking with privacy advocates in the name of tracking would-be assailants and preventing further violence.
… "The idea is to use artificial intelligence to track suspicious behavior, and it's already being done in several countries," Djebbari told a national radio station on Sunday.
The comment seemed to go against recommendations from France's privacy regulator, the CNIL, which has blocked attempts to deploy facial recognition cameras in public spaces as being "neither necessary, nor proportionate" to their aims of boosting security.
Growing concern, as expected.
https://fpf.org/2020/10/30/exploring-consumer-attitudes-about-privacy/
Exploring Consumer Attitudes About Privacy
… A new study, “Privacy Front and Center,” from Consumer Reports’ Digital Lab with support from Omidyar Network, found that American consumers are increasingly concerned about privacy and data security when purchasing new products and services, which may be a competitive advantage to companies that raise the bar for privacy. A majority of smart product owners (62%) worry about potential loss of privacy when buying them for their home or family.
… The Cisco 2020 Consumer Privacy Survey, “Protecting Data Privacy to Maintain Digital Trust,” found that protecting data privacy remains important to consumers during the pandemic.
… In addition, Deloitte recently released its 2020 Digital Consumer Trends survey, which focused on the growth in smart device use and data in the United Kingdom. It found that UK consumers have become less concerned about the use of their data.
Be careful what you promise or don’t promise?
Experian’s GDPR violation leaves companies scrambling to understand ‘legitimate interest’
A General Data Protection Regulation enforcement notice from United Kingdom regulators could leave credit reporting giant Experian on the hook for as much as $24 million – baffling U.S. and European Union companies alike, say legal experts.
The investigation that led to the notice found issues in each of the big three credit reporting agencies, and in the data brokerage economy in general. While Experian, TransUnion and Equifax received praise for working with regulators on several of the problems apparently endemic to the industry, Experian reportedly failed to meet all of the regulator’s requests.
An enforcement notice is a warning that a fine will come should a company not take action. Experian now has nine months to do so, pending appeal.
The key issue flagged in the Experian enforcement is one that all companies that handle data from brokers need to consider when establishing data privacy practices.
“At a high level, the issue is transparency. It’s one of the key pillars of data protection,” said Sarah Pearce, an attorney at Paul Hastings’ London offices. “You need a lawful basis for each use of data.”
Under the GDPR, there are several lawful bases for obtaining and processing data. Companies can ask users outright for permission to store and process data, for example. Or, companies can claim “legitimate interest,” where the data use is necessary for business purposes that aren’t seen as threats to privacy.
Direct marketing via mail is considered legitimate interest. But, in this case, the consent to use the data had been obtained by a broker that hadn’t specified the data would be sold. That prevents the buyer (in this case Experian) from claiming direct marketing as a legitimate interest.
The result of a perception of ‘doing a bad job?’
Big tech’s ‘blackbox’ algorithms face regulatory oversight under EU plan
Major Internet platforms will be required to open up their algorithms to regulatory oversight under proposals European lawmakers are set to introduce next month.
In a speech today Commission EVP Margrethe Vestager suggested algorithmic accountability will be a key plank of the forthcoming legislative digital package — with draft rules incoming that will require platforms to explain how their recommendation systems work as well as offering users more control over them.
“The rules we’re preparing would give all digital services a duty to cooperate with regulators. And the biggest platforms would have to provide more information on the way their algorithms work, when regulators ask for it,” she said, adding that platforms will also “have to give regulators and researchers access to the data they hold — including ad archives”.