Wednesday, August 02, 2023

It sounds so simple…

https://www.cpomagazine.com/data-protection/lessons-learned-from-gdpr-fines-in-2023/

Lessons Learned From GDPR Fines in 2023

In a year marked by record-breaking GDPR fines against companies like Meta and Amazon, Criteo, the French ad tech giant, is the latest to find itself on the receiving end of a GDPR penalty: €40 million ($44 million) for failing to obtain users’ consent for targeted advertising. This case serves as a reminder to companies worldwide of the importance of GDPR compliance. As businesses grapple with the repercussions of non-compliance, it becomes crucial to identify and avoid the three common mistakes that have landed countless organizations in hot water:

Not obtaining informed user consent

Data transfers outside the EU

Illegally processing children’s data





Typical arguments, but in the end an interesting question...

https://www.databreaches.net/the-plaintiffs-have-standing-to-sue-court-no-they-dont-appeals-court/

The plaintiffs have standing to sue — court. No, they don’t — appeals court.

Here’s yet one more case to note about standing and how cases may get dismissed before they even really get started. This case involved Syracuse ASC, LLC. In 2021, they experienced a cyberattack and notified 24,891 patients. A copy of their notification was posted to the Vermont Attorney General’s website at the time.

In due course, a patient sued, seeking potential class-action status (Greco v. Syracuse ASC LLC).

As Jeffrey Haber of Freiberger Haber LLP reminds us, in order to have Article III standing to sue, a plaintiff must allege the existence of an injury-in-fact that ensures that s/he has some concrete interest in prosecuting the action. That

necessitates a showing that the party has “an actual legal stake in the matter being adjudicated”[3] and that the party has suffered a cognizable harm that is not “‘tenuous,’ ‘ephemeral,’ or ‘conjectural,’” but is, instead, “sufficiently concrete and particularized to warrant judicial intervention.”[4] Notably, an alleged injury will not confer standing if it is based on speculation about what might occur in the future or what future harm might be incurred.[5]

Somewhat surprisingly, the motion court denied the defendant’s motion to dismiss for lack of standing, finding that the plaintiff had established a risk of imminent future harm.

The defendant appealed and the Fourth Department “unanimously reversed.”

The Court held, after considering “all relevant circumstances,” that plaintiff failed to allege “an injury-in-fact and thus lack[ed] standing.” [9] “[I]mportantly,” explained the Court, “plaintiff ha[d] not alleged that any of the information purportedly accessed by the unknown third party ha[d] actually been misused.”[10] Similarly, the Court noted that “Plaintiff ha[d] not alleged that her own information ha[d] been misused or that the data of any similarly situated person ha[d] been misused in the over one-year period between the alleged data breach and the issuance of the trial court’s decision.”[11] The absence of such allegations, held the Court, was fatal to the survival of the pleading.
Further, the Court noted that, according to the complaint, only health information was accessed by a third party.[12] The complaint did not, said the Court, “allege that a third party accessed data more readily used for financial crimes such as dates of birth, credit card numbers, or social security numbers.”[13]

Read more at JDSupra.

Here’s a Thought

So a data breach by itself, without any evidence of misuse of data, does not demonstrate “injury-in-fact” or imminent risk of harm, and so does not confer standing?

Would a court agree that criminals leaking the data on the dark web changes the risk of imminent harm or injury?

If so, then an entity’s failure to notify those affected that their data is on the dark web or any leak site or forum is essentially withholding information that would likely give people standing to sue.

DataBreaches has been a vocal proponent of transparency in disclosing leaks or listings of breached data on the dark web or clear net. And maybe it’s time for all law firms in the business of suing over data breaches to make a point of checking this site and other sites that expose these leaks before filing any complaint. They could then argue that the leak of the data makes the risk of harm imminent, or more imminent, and that the entity’s failure to disclose it to victims is an attempt to cover up the risk of harm the incident has caused.

Just a thought…





An argument from ‘the other side?’ He may have a point.

https://www.politico.com/news/2023/08/01/ai-politics-eric-wilson-00109214

The case for more AI in politics

Eric Wilson thinks AI has an important, not-at-all scary role to play in professional politics. The tech platforms just need to loosen up.





Uncommon opinion?

https://www.pcmag.com/opinions/why-ai-is-the-nemesis-of-truth-itself

Why AI Is the Nemesis of Truth Itself

AI isn’t going to take over the world. It probably won’t even take your job. The real threat is far more insidious—the AI boom heralds the erosion of truth and fact, and it’s already happening.

Stephen Wolfram, mathematician and founder of Wolfram Research, has written an extensive description of just how a large language model turns its corpus of data into rules for generating text. Not prepared to read 20,000 or so words on the subject? I'll try to break it down.

… When ChatGPT does something like write an essay what it’s essentially doing is just asking over and over again “given the text so far, what should the next word be?”
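Wolfram’s point can be illustrated with a toy next-word predictor. The sketch below is a hypothetical bigram model, not how ChatGPT actually works (real LLMs use neural networks trained on enormous corpora and sample from a probability distribution over a huge vocabulary), but it repeats the same loop: given the text so far, what should the next word be?

```python
from collections import Counter, defaultdict

def build_bigram_model(corpus):
    """Count which word follows which in the corpus."""
    words = corpus.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def generate(model, start, length=5):
    """Repeatedly ask: given the text so far, what is the likeliest next word?"""
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break  # no known continuation; stop generating
        # take the most frequent follower (an LLM instead samples from
        # probabilities computed over its entire context, not one word)
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

model = build_bigram_model("the cat sat on the mat and the cat sat on the grass")
print(generate(model, "the"))  # "the cat sat on the cat"
```

Even this crude version shows the insidious part: the output is fluent-looking word sequences chosen for statistical plausibility, with no notion of whether the result is true.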


