Does this question work for both people and AIs?
Software As a Profession

Choi, Bryan H., Software As a Profession (2019). Harvard Journal of Law & Technology, Vol. 33, 2020, Forthcoming. Available at SSRN: https://ssrn.com/abstract=3467613
“When software kills, what is the legal responsibility of the software engineer?

Discussions of software liability have avoided assessing the duties of “reasonable care” of those who write software for a living. Instead, courts and legal commentators have sought out other workarounds—like immunity, strict liability, or cyber insurance—that avoid the need to unpack the complexities of the software development process. As software harms grow more severe and attract greater scrutiny, prominent voices have called for software developers to be held to heightened duties of “professional care”—like doctors or lawyers. Yet, courts have long rejected those claims, denying software developers the title of “professional.” This discord points to a larger confusion within tort theory regarding the proper role of “professional care” relative to “reasonable care.”
This Article offers a reconceptualized theory of malpractice law that treats the professional designation as a standard of deference, not a standard of heightened duty. This new theoretical framework rests not on the virtues of the practitioner, but on the hazards of the practice. Despite best efforts, doctors will lose patients; lawyers will lose trials. Accordingly, the propriety of the practitioner’s efforts cannot be judged fairly under an ordinary negligence standard, which generates too many occasions to second-guess the practitioner’s performance. Instead, the professional malpractice doctrine substitutes a customary care standard for the reasonable care standard, and thereby allows the profession to rely on its own code of conduct as the measure of legal responsibility…”
A “security committee” makes sense.
Equifax Lawsuit Reveals Embarrassingly Lax Security Protections
In 2017, the Equifax data breach affecting over 147 million people in the United States, Canada and UK quickly made history as the first-ever “mega-breach.” Two years later, it still ranks as one of the worst data breach violations in history. Unfortunately, as the details of a new Equifax lawsuit reveal, there is a very strong likelihood the entire data breach could have been avoided in the first place if the company had adopted even the most basic security protocols.
… If nothing else, the Equifax lawsuit – and all of the embarrassing security weaknesses that it is revealing – should be a wakeup call to C-suite executives and board members. If cybersecurity was not yet a board-level priority, it should be now. In the future, a mega-breach of the same scale might do more than just result in huge financial losses and damaging lawsuit claims – it might also end up with those same executives and board members headed to prison.
Using machines to catch errors made by machines is not as effective as using humans.
Mae Anderson reports:
Apple is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update.
In August, Apple suspended the practice and apologized for the way it used people, rather than just machines, to review the audio.
While common in the tech industry, the practice undermined Apple’s attempts to position itself as a trusted steward of privacy.
Read more on APNews.
I wonder how many people will read Apple’s notice about having a choice on this. According to the AP, you supposedly have a choice when installing the iOS 13.2 update:
Individuals can choose “Not Now” to decline audio storage and review. Users who enable this can turn it off later in the settings.
So I went and looked at my settings. I haven’t gotten that update yet, so when I do, I will look to see how that choice is presented.