No doubt an unintended consequence.
More than 38,000 people will stand in line this week to get a new password
All of this is going on at the Justus Liebig University (JLU) in Gießen, a town north of Frankfurt, Germany.
The university suffered a malware infection last week. While neither the name nor the nature of the malware strain was disclosed, the university's IT staff considered the infection severe enough to take down its entire IT and server infrastructure.
… Furthermore, JLU staff also believed the malware infection impacted the university's email server, and, as a precautionary measure, they reset the passwords for all email accounts used by students and staff alike.
But in a bizarre twist of German law, the university couldn't send out the new passwords to the students' personal email accounts.
Instead, citing legal requirements imposed by the German National Research and Education Network (DFN), JLU professors and students had to pick up their passwords from the university's IT staff in person, after providing proof of their identity using an ID card.
Here we go again!
How India Plans to Protect Consumer Data
The Indian government looks set to legislate a Personal Data Protection Bill (DPB), which would control the collection, processing, storage, usage, transfer, protection, and disclosure of personal data of Indian residents. Despite its regional nature, DPB is an important development for global managers. The digital economy in India is expected to reach a valuation of $1 trillion by 2022, and it will attract numerous global players who must comply with DPB.
… Yet the Indian DPB carries additional provisions beyond the EU regulation. Because India is a nation state, it would treat the data generated by its citizens as a national asset, store and guard it within national boundaries, and reserve the right to use that data to safeguard its defense and strategic interests.
There are a number of features of the DPB that will require companies to change their business models, practices, and principles.
… Ownership of personal data: In principle, DPB proposes that the data provider is the owner of their own personal data. While simple in idea, this notion could impose an enormous implementation burden on digital companies.
… Three classes of data: DPB has identified three categories of data from which a principal can be identified. Sensitive data includes information on financials, health, sexual orientation, genetics, transgender status, caste, and religious belief. Critical data includes information that the government stipulates from time to time as extraordinarily important, such as military or national security data. The third is a general category, which is not defined but contains the remaining data. DPB prescribes specific requirements that data fiduciaries must follow for the storage and processing of each data class (a minimal code sketch follows this list).
All sensitive and critical data must be stored on servers located in India.
Data sovereignty: DPB reserves the right to access the locally stored data to protect national interests.
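To make that classification-plus-localization scheme concrete, here is a minimal sketch of how a compliance check might look in code. This is purely illustrative: the class names, the field-to-class mapping, and the region codes are my own hypothetical assumptions, since the bill prescribes obligations, not an API.

```python
# Hypothetical sketch of the DPB's three data classes and the rule that
# sensitive and critical data must live on servers located in India.
# All names and mappings below are illustrative assumptions, not the bill's.
from enum import Enum

class DataClass(Enum):
    SENSITIVE = "sensitive"  # financials, health, caste, religious belief, ...
    CRITICAL = "critical"    # government-stipulated, e.g. national security data
    GENERAL = "general"      # undefined remainder category

# Hypothetical mapping from record fields to DPB classes.
FIELD_CLASS = {
    "bank_account": DataClass.SENSITIVE,
    "health_record": DataClass.SENSITIVE,
    "defense_clearance": DataClass.CRITICAL,
    "shipping_address": DataClass.GENERAL,
}

def storage_allowed(field: str, server_region: str) -> bool:
    """Return True if storing this field in the given region complies."""
    cls = FIELD_CLASS.get(field, DataClass.GENERAL)
    if cls in (DataClass.SENSITIVE, DataClass.CRITICAL):
        return server_region == "IN"  # localization requirement
    return True  # general data may be stored anywhere

assert storage_allowed("bank_account", "IN")
assert not storage_allowed("bank_account", "US")
assert storage_allowed("shipping_address", "US")
```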
Why we need AI lawyers! Why would this be different from today’s ‘reverse engineering’?
Researchers were about to solve AI’s black box problem, then the lawyers got involved
… The downside of transparency
This is fine when we’re using black box AI to determine whether something is a hotdog or not, or when Instagram uses it to determine if you’re about to post something that might be offensive. It’s not fine when we can’t explain why an AI sentenced a black man with no priors to more time than a white man with a criminal history for the same offense.
The answer is transparency. If there is no black box, then we can tell where things went wrong. If our AI sentences black people to longer prison terms than white people because it’s over-reliant on external sentencing guidance, we can point to that problem and fix it in the system.
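To see why transparency makes that fix possible, here is a minimal sketch, on entirely synthetic data, of auditing an interpretable model. The feature names and the "guidance" signal are hypothetical assumptions of mine; the point is only that an inspectable model lets an auditor spot over-reliance on one input.

```python
# Minimal sketch: a transparent model makes over-reliance auditable.
# Features, data, and the "guidance" signal are synthetic illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
severity = rng.normal(size=n)   # hypothetical offense-severity score
priors = rng.normal(size=n)     # hypothetical prior-record score
guidance = rng.normal(size=n)   # hypothetical external sentencing guidance
X = np.column_stack([severity, priors, guidance])

# Synthetic outcome driven almost entirely by the guidance score.
y = (3.0 * guidance + 0.2 * severity +
     rng.normal(scale=0.5, size=n) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(["severity", "priors", "guidance"], model.coef_[0]):
    print(f"{name:>9}: {coef:+.2f}")
# The dominant weight on "guidance" is visible in the coefficients, so the
# over-reliance can be pointed to and corrected; a black box offers no such handle.
```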
But there’s a huge downside to transparency: if the world can figure out how your AI works, it can figure out how to make it work without you. The companies making money off of black box AI, especially those like Palantir, Facebook, Amazon, and Google that have managed to entrench biased AI within government systems, don’t want to open the black box any more than they want their competitors to have access to their research. Transparency is expensive and, often, exposes just how unethical some companies’ use of AI is.
As legal expert Andrew Burt recently wrote in Harvard Business Review:
To start, companies attempting to utilize artificial intelligence need to recognize that there are costs associated with transparency. This is not, of course, to suggest that transparency isn’t worth achieving, simply that it also poses downsides that need to be fully understood. These costs should be incorporated into a broader risk model that governs how to engage with explainable models and the extent to which information about the model is available to others.
Why fast is not always best.
Are California's New Data Privacy Controls Even Legal?
A new paper raises constitutional questions about expansive state-level regulations that reach beyond their borders.
Data privacy hardliners [???] are pretty jazzed about the California Consumer Privacy Act (CCPA), which is slated to take effect on the first of next year. While many outside of the Golden State may not have heard of this bold foray into computing regulation, activists hope that it will soon effectively control how much of the country is allowed to process data. If they can't have a European Union-level General Data Protection Regulation (GDPR), then at least this state law can kind of regulate through the back door, without the pesky need to go through Congress.
Of course, any strong enough data controls imposed in California would inevitably affect everyone else in the US. Most technology companies are based there, and even those in other states would be fools to lock themselves out of California's population of almost 40 million.
And CCPA supporters know this. In fact, many of them see this as a feature.
… A new Federalist Society Regulatory Transparency Project paper by my colleague Jennifer Huddleston and TechFreedom's Ian Adams suggests that state data controls like the CCPA raise serious legal questions about potential free speech and dormant commerce clause violations.
There's this thing called "the Constitution…"
In the rush to get a GDPR-style regulatory framework in place in California, no one seemed to stop and ask whether what they were doing was actually legal. Indeed, many of the controls enshrined in the European law are fundamentally at odds with American principles of permissionless innovation and open interstate commerce. Huddleston and Adams point out that state laws like the CCPA may run into constitutional problems concerning speech and interstate trade.
Data is often speech. Laws that regulate speech are subject to a high level of legal scrutiny because of our First Amendment protections.
States don't get to ignore the First Amendment just because they really don't like Facebook. If they try to regulate data-as-speech, the courts may promptly strike them down.