Learn from the mistakes of others (or make the same mistakes yourself).
The UK’s contact tracing app fiasco is a master class in mismanagement
There are advantages to being one of the world’s largest single-payer health-care systems. For the UK’s National Health Service, the NHS, big data is increasingly one of them. Its Recovery Trial, launched early in the coronavirus outbreak to collect information from across the system, has led to the discovery of dexamethasone as one of the most promising life-saving treatments for the novel coronavirus. In other areas of medicine, its cancer data store, now nearly a decade old, is one of the world’s richest sources of clinical data for cancer research.
So it was hardly surprising that when UK ministers proposed a contact tracing smartphone app, NHS officials saw an opportunity to create a world-leading piece of technology.
But on Thursday the British government announced that it was ditching its original plan in favor of a much simpler backup option, drawing criticism and anger, and leaving many concerned about the prospect of contact tracing technology in general. What happened?
(Related)
Maryam Casbarro of Davis Wright Tremaine takes a look at potential risks for firms, writing, in part:
The nature of contact tracing apps provides a number of parties in the data ecosystem with a broad set of data that could be used either purposefully or unintentionally and shared for purposes other than contact tracing. For example, it was recently revealed that public health authorities in North Dakota and South Dakota had rolled out a contact tracing app that shared location data with an outside location data aggregator, contrary to the app’s Privacy Policy.
Surreptitious data sharing (or other such practices) may expose the companies developing and/or deploying the contact tracing apps to typical privacy claims: collecting data beyond the scope of what the individual agreed to may lead to claims of intrusion upon seclusion, violation of constitutional rights to privacy, and breach of contract. Moreover, state laws, such as the California Consumer Privacy Act, permit consumers to prohibit sharing their data with third parties.
If the contact tracing technology collects more data than was consented to by consumers, or if the data—without notice or consent—is linked with other information about an individual to create profiles of specific individual consumers, the entities that develop and deploy the app may be subject to state “unfair and deceptive acts and practices” claims.
Keep up!
French Privacy Watchdog Offers New Guidance for Web Scraping and Its Use in Direct Marketing
Explicit General Data Protection Regulation (GDPR) guidance on web scraping for direct marketing purposes has finally been laid out to the public, following the publication of a set of guidelines by France’s data watchdog, the CNIL.
According to the new recommendations, published on April 20, publicly available contact information belonging to individual people that is gathered online by companies with the intention of selling it on to third parties for direct marketing purposes (a process known as ‘web scraping’ or ‘data extraction’) should still be regarded as personal data, even though it is publicly available.
Link to the sessions...
Frank Ready reports:
On Thursday, data protection provider WireWheel continued its two-day privacy technology virtual conference—Spokes 2020—with a series of webinars examining how compliance professionals are adapting to some of the significant cultural shifts taking place across the nation. Topics ranged from the impact of COVID-19 on privacy programs to the challenges that poor data practices pose to diversity and inclusion.
While data and tech factored into the discussion, the “Privacy Leaders Panel” spent a significant portion of its runtime mulling some of the very human problems impacting the space. Panelist Barbara Lawler, chief privacy and data ethics officer at Looker, addressed the challenges of attempting to maintain team unity when COVID-19 makes physical proximity a liability.
Read more on Law.com.
Is this the model law I’ve been waiting for?
From EPIC.org:
[On June 18], the New York City Council passed the Public Oversight of Surveillance Technology (POST) Act, a law that enables public oversight of surveillance technologies used by the New York Police Department. The POST Act will require the police to publish documents explaining their use of surveillance technologies, accept public comments about them, and provide a final surveillance impact and use policy to the public. EPIC has worked for years to focus public attention on the privacy impact of emerging surveillance technologies, and has pursued open government cases against the FBI and other law enforcement agencies to release information about cell site simulators and other surveillance technologies. EPIC has recently launched a project to track and review algorithms used in the criminal justice system.
The New York Times covered the new law here.
Yes, it’s a nit but we choose to pick it.
Daniel R. Stoller reports:
Alphabet Inc.’s Google said a California federal court should dismiss a lawsuit alleging the company violated an Illinois law that protects biometric identifiers such as fingerprints, because the statute doesn’t pertain to photographs.
The plaintiffs accuse the company of creating faceprints from consumer photos, not actual people, so the Illinois Biometric Information Privacy Act doesn’t apply, Google told the U.S. District Court for the Northern District of California.
Read more on Bloomberg Law.
Perspective. The clearest indication yet that software (think: self-driving) has moved to a place as significant as engine design. Manufacturers no longer feel comfortable outsourcing this work.
VW vows to go it alone on software despite mishaps
Carmaker rules out working with tech companies in order to “retain control” of vehicle data