Saturday, January 16, 2021

Who gains from this? Anti-vaxxers making up their justification? Other vaccine makers?

https://www.databreaches.net/hackers-leaked-altered-pfizer-data-to-sabotage-trust-in-vaccines/

Hackers leaked altered Pfizer data to sabotage trust in vaccines

Sergiu Gatlan reports:

The European Medicines Agency (EMA) today revealed that some of the stolen Pfizer/BioNTech vaccine candidate data was doctored by threat actors before being leaked online with the end goal of undermining the public’s trust in COVID-19 vaccines.
EMA is the decentralized agency that reviews and approves COVID-19 vaccines in the European Union, and the agency that evaluates, monitors, and supervises any new medicines introduced to the EU.

Read more on BleepingComputer.





Should we make smartwatches mandatory?

https://www.cbsnews.com/news/covid-symptoms-smart-watch/

Smartwatches can help detect COVID-19 days before symptoms appear

Devices like the Apple Watch, Garmin and Fitbit watches can predict whether an individual is positive for COVID-19 even before they are symptomatic or the virus is detectable by tests, according to studies from leading medical and academic institutions, including Mount Sinai Health System in New York and Stanford University in California. Experts say wearable technology could play a vital role in stemming the pandemic and other communicable diseases.





One man’s ‘feature’ is another man’s downfall?

https://www.makeuseof.com/bumble-turns-off-politics-filter-hunt-capitol-rioters/

Bumble Turns Off Politics Filter Amid Hunt for Capitol Rioters

Bumble, a widely popular dating app, has disabled the feature that lets users search for dates according to political viewpoint. This comes after users took advantage of this feature to find and report rioters who stormed Capitol Hill.

Following the pro-Trump protests at Capitol Hill, Bumble users reported finding a number of potential dates who bragged about attending the Washington D.C. riots on the app.





If you clean up your act now, can you avoid these lawsuits?

https://www.pogowasright.org/new-york-could-become-the-next-hotbed-of-class-action-litigation-over-biometric-privacy/

New York Could Become the Next Hotbed of Class Action Litigation Over Biometric Privacy

Joseph Lazzarotti of Jackson Lewis writes:

Dubbed the “Biometric Privacy Act,” New York Assembly Bill 27 (“BPA”) is virtually identical to the Biometric Information Privacy Act in Illinois, 740 ILCS 14 et seq. (BIPA). Enacted in 2008, BIPA only recently triggered thousands of class actions in Illinois. If the BPA is enacted in New York, it likely will not take as long for litigation to begin under the new privacy law. Interestingly, late last year, Governor Cuomo signed AB A6787D which, among other things, prohibited the use of biometric identifying technology in schools at least until July 1, 2022.
BPA contains a private right of action. If it’s enacted, well, it will likely result in a spurt of new litigation.

Read more on Workplace Privacy, Data Management & Security Report.





Is this going to become a “thing”? Too much for the CIO to handle?

https://www.nextgov.com/emerging-tech/2021/01/hhs-names-first-ever-chief-artificial-intelligence-officer/171439/

HHS Names First Ever Chief Artificial Intelligence Officer

“AI is playing and will continue to play a significant role in overall technology modernization,” HHS Chief Information Officer Perryn Ashmore told Nextgov via email Thursday. “As such, I have named Oki Mek the Chief Artificial Intelligence Officer (CAIO) for the Office of the Chief Information Officer.”





Is “clearly overlooked” an oxymoron?

https://www.law.com/corpcounsel/2021/01/15/rush-to-use-artificial-intelligence-creates-privacy-headaches-for-gcs/

Rush to Use Artificial Intelligence Creates Privacy Headaches for GCs

“AI is a technology that can do lots of good. But in that rush, a lot of the liabilities and risks have been very clearly overlooked,” Andrew Burt, managing partner at bnh.ai, said.

According to a November 2020 study published by McKinsey & Co., 50% of 2,395 respondents indicated that at least one function in their corporations has incorporated artificial intelligence or machine learning. However, the rush to implement artificial intelligence and machine learning will leave many legal departments spending 2021 finding ways to solve problems that arise across the technology’s life cycle, experts say.

The liabilities that corporations and their legal departments are now realizing are associated with artificial intelligence and machine learning span the technology’s entire life cycle. The only real solution is to create a governance plan before the technology is implemented.

… Sarah Pearce, a partner at Paul Hastings in London and co-chair of the firm’s artificial intelligence practice group, said issues such as data privacy, security, and ethics will be something in-house counsel have to consider throughout the AI life cycle.

“This is not just privacy. This is privacy as it pertains to data collection and extraction; model training and deployment; and where prediction data is stored,” Burt added.

… Governance will be critical to avoid long-term issues with artificial intelligence and machine learning technology. The recognition of this fact has led to the development of a new in-house legal role for AI, Burt said.

“Part of that is hiring an in-house lawyer who can monitor and mitigate the liabilities associated with the AI the organization is adopting,” Burt said. “It’s kind of like what happened with privacy 10 or so years ago. Where you’d have privacy counsel or privacy officers that would be embedded in the legal department.”


