Sunday, July 21, 2024

It became necessary to destroy the town to save it. Have we given too much access for “fairness”?

https://www.wsj.com/tech/cybersecurity/microsoft-tech-outage-role-crowdstrike-50917b90?st=pkas1bzrhcoj0os&reflink=desktopwebshare_permalink

Blue Screens Everywhere Are Latest Tech Woe for Microsoft

A Microsoft spokesman said it cannot legally wall off its operating system in the same way Apple does because of an understanding it reached with the European Commission following a complaint. In 2009, Microsoft agreed it would give makers of security software the same level of access to Windows that Microsoft gets.
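
To see why that level of access matters: a crash in ordinary user-mode software kills only that one process, while the same class of bug inside a kernel-mode driver (the access at issue here) halts the whole machine with a blue screen. A minimal sketch in Python, assuming a POSIX or Windows system where reading address zero triggers a hard memory fault:

# Sketch: user-mode faults are contained to a single process; an
# equivalent fault in a kernel-mode driver crashes the entire OS.
import subprocess
import sys

# Child process deliberately reads memory address 0, a classic invalid
# memory access, roughly the class of bug behind kernel-driver crashes.
crasher = "import ctypes; ctypes.string_at(0)"

result = subprocess.run([sys.executable, "-c", crasher])

# The parent is untouched: the OS killed only the child. (On POSIX the
# return code is the negated signal number, e.g. -11 for SIGSEGV; on
# Windows it is a nonzero access-violation code.)
print(f"child crashed with return code {result.returncode}; parent still running")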



(Related)

https://www.nytimes.com/2024/07/19/us/politics/crowdstrike-outage.html?unlocked_article_code=1.8k0._ZDj.e5unf_bqIJNo&smid=url-share

What Happened to Digital Resilience?

With each cascade of digital disaster, new vulnerabilities emerge. The latest chaos wasn’t caused by an adversary, but it provided a road map of American vulnerabilities at a critical moment.





You can’t object to an AI?

https://scholarlycommons.law.case.edu/jolti/vol15/iss2/6/

Artificial Intelligence in the Courtroom: Forensic Machines, Expert Witnesses, and the Confrontation Clause

From traditional methods like ballistics and fingerprinting, to the probabilistic genotyping models of the twenty-first century, the forensic laboratory has evolved into a cutting-edge area of scientific exploration. This rapid growth in forensic technologies will not stop here. Considering recent developments in artificial intelligence (“AI”), future forensic tools will likely become increasingly sophisticated. To be sure, AI-enabled forensic tools are far from theoretical; AI applications in the forensic sciences have already emerged in practice. Machine learning-enabled acoustic gunshot detectors, facial recognition software, and a variety of pattern recognition learning models are already disrupting law enforcement operations across the country.

Soon, criminal defendants will need to learn how to navigate a courtroom dominated by AI-enabled expert systems. Unfortunately, there is little guidance in the caselaw or in the Federal Rules of Evidence on how exactly criminal defendants should approach AI as evidence in the courtroom. Although a handful of scholars have taken up the task of exploring the intersection of AI and evidence law, these studies have primarily focused on issues in authentication or issues with applying the Daubert standard to AI evidence.

This study contributes to this ongoing exploration of AI in the courtroom by providing an analysis of the rights of criminal defendants facing AI-generated testimony under the Confrontation Clause of the Sixth Amendment. This study will illustrate that, in a future where AI-enabled forensic tools are increasingly used to inculpate defendants in criminal prosecutions, the right to confrontation will become increasingly eroded. This is largely because courts have carved out a broad “machine-generated data” exception to the Confrontation Clause. Under this exception, data generated by a sufficiently autonomous machine will fall outside the ambit of constitutional protection. The rationale is that such transmissions are too autonomous to be attributed to any human actor, and the Confrontation Clause protects only statements made by a human rather than a machine learning model.

This exception to the right to confrontation is significant. Practically, these limitations could have a measurable negative impact on a defendant’s capacity to test the reliability of an AI model in court. Normatively, this study illustrates that, in a world where AI algorithms proffer inculpatory evidence of criminal wrongdoing, the right to confrontation adds little value for criminal defendants. As courts and scholars reinterpret and refine the rules of evidence to better reflect technological realities, some attention should be given to the proper place of the right to confrontation.
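
As a concrete illustration of what “testing the reliability of an AI model” can mean in practice, here is a minimal sketch, with entirely hypothetical numbers, of the kind of validation-error analysis a defense expert might demand of a forensic classifier: an observed false-positive rate with a confidence interval. No real tool or dataset is referenced.

# Hypothetical sketch: estimating a forensic classifier's false-positive
# rate from vendor validation data, the sort of reliability question a
# defendant might want to put to the tool's maker. Numbers are invented.
import math

def wilson_interval(errors: int, trials: int, z: float = 1.96):
    """95% Wilson score interval for an observed error proportion."""
    p = errors / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    margin = (z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))) / denom
    return center - margin, center + margin

# Suppose a validation study ran 500 known-negative samples through the
# tool and it flagged 12 of them as matches (false positives).
false_positives, negatives = 12, 500
low, high = wilson_interval(false_positives, negatives)
print(f"observed false-positive rate: {false_positives / negatives:.1%}")
print(f"95% confidence interval: {low:.1%} to {high:.1%}")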


