Something for my deep-thinking students.
It's really hard to estimate the cost of an
insecure Internet. Studies are all over the map. A methodical study
by RAND is the best work I've seen at trying to put a number on this.
The results are, well, all over the map:
"Estimating the Global Cost of Cyber Risk: Methodology and Examples":
Here's
Rand's risk calculator, if you want to play with the parameters
yourself.
Note: I was an advisor to the project.
Separately, Symantec has published
a new cybercrime report with their own statistics.
Something for my students to reverse-engineer so they can
de-anonymize.
On January 25, the Personal Data Protection
Commission of Singapore issued a guide to basic anonymization
techniques. You can access the guide here
(pdf).
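The guide covers the usual basics; for students who have never applied them, here is a minimal sketch of two such techniques, pseudonymization with a keyed hash and generalization of quasi-identifiers. It is a generic illustration rather than code from the PDPC guide, and the field names and key are hypothetical.

```python
# Minimal sketch of two basic anonymization techniques: keyed pseudonymization
# and generalization of quasi-identifiers. All names and the key are hypothetical.
import hmac
import hashlib

SECRET_KEY = b"keep-this-key-out-of-the-dataset"  # hypothetical key, stored separately

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def generalize_age(age: int) -> str:
    """Coarsen an exact age into a 10-year band."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

record = {"name": "Alice Tan", "age": 34, "postcode": "059893"}
anonymized = {
    "person_id": pseudonymize(record["name"]),
    "age_band": generalize_age(record["age"]),
    "postcode_prefix": record["postcode"][:2],  # truncate to reduce uniqueness
}
print(anonymized)
```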
(Related) Why anonymization is important.
David Gershgorn reports:
Some of Google’s top AI researchers are trying to predict your medical outcome as soon as you’re admitted to the hospital.
A new research paper, published Jan. 24 with 34 co-authors and not peer-reviewed, claims better accuracy than existing software at predicting outcomes like whether a patient will die in the hospital, be discharged and readmitted, and their final diagnosis. To conduct the study, Google obtained de-identified data of 216,221 adults, with more than 46 billion data points between them. The data span 11 combined years at two hospitals, University of California San Francisco Medical Center (from 2012-2016) and University of Chicago Medicine (2009-2016).
Read more on Quartz.
OK, now if this is accurate, it sounds really
promising, right? But I wondered how they got so much de-identified
medical data on so many people. So I took a look at the paper’s
methods section, and here’s what it says:
We included EHR data from the University of California, San Francisco (UCSF) from 2012-2016, and the University of Chicago Medicine (UCM) from 2009-2016. We refer to each health system as Hospital A and Hospital B. All electronic health records were de-identified, except that dates of service were maintained in the UCM dataset. Both datasets contained patient demographics, provider orders, diagnoses, procedures, medications, laboratory values, vital signs, and flowsheet data, which represents all other structured data elements (e.g. nursing flowsheets), from all inpatient and outpatient encounters. The UCM dataset (but not UCSF) additionally contained de-identified, free-text medical notes. Each dataset was kept in an encrypted, access-controlled, and audited sandbox.
Ethics review and institutional review boards approved the study with waiver of informed consent or exemption at each institution.
So if you went to either of these hospitals, the
hospital might have subsequently waived your informed consent and
just turned over data on you that everyone believes is de-identified.
Now it’s great that it was kept encrypted, access-controlled,
and in an audited sandbox, but here’s the thing: are you okay with
a hospital waiving your informed consent? How difficult might it be
to re-identify the data?
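To make the re-identification question concrete, here is a minimal sketch of a linkage attack: if quasi-identifiers such as dates of service and coarse demographics survive de-identification, they can be joined against an outside source that still carries names. The records, column names, and match below are entirely hypothetical.

```python
# Sketch of why "de-identified" records can still be re-identified: surviving
# quasi-identifiers can be joined to an auxiliary source that carries names.
import pandas as pd

# "De-identified" hospital extract: no names, but dates of service kept.
hospital = pd.DataFrame([
    {"service_date": "2015-03-02", "zip3": "606", "birth_year": 1961, "sex": "F",
     "diagnosis": "CHF"},
    {"service_date": "2015-03-02", "zip3": "606", "birth_year": 1984, "sex": "M",
     "diagnosis": "asthma"},
])

# Auxiliary data an adversary might hold (social posts, voter rolls, etc.).
auxiliary = pd.DataFrame([
    {"name": "J. Doe", "service_date": "2015-03-02", "zip3": "606",
     "birth_year": 1961, "sex": "F"},
])

# Join on the quasi-identifiers; a unique match re-attaches a name to a diagnosis.
linked = hospital.merge(auxiliary, on=["service_date", "zip3", "birth_year", "sex"])
print(linked[["name", "diagnosis"]])
```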
I know a lot of people feel that it’s okay for
entities to do this (waive consent) because it’s in the best
interests of public health and progress, but of course, I focus on
the individual’s rights. So think about it… is this okay, and if
it’s not, how does that affect your use of a particular hospital?
Would you say or do anything different?
They have never done this before. Perhaps they
are concerned about GDPR?
Facebook
marks Data Privacy Day by sharing its 7 privacy principles
With the European Union’s new data protection
laws coming into force this year, Facebook has begun preparing for
the General Data Protection Regulation (GDPR) by publishing its
privacy principles for the first time.
The company also announced that it will push
videos into users’ news feeds detailing how they can manage their
privacy on the social network, and it recently
revealed plans to roll out a new privacy center later this year
that pulls together key settings into a single hub.
The announcement was timed to coincide with Data
Privacy Day, an occasion
marked every January 28 to promote best practices around online
data privacy and security.
When Privacy equals Targeting.
Strava
fitness tracking app reveals movements on remote military bases
A fitness tracking app that maps people's exercise
habits could pose security risks for security forces around the
world.
Strava, which bills itself as "the social
network for athletes" and allows its users to share their
running routes, released a newly updated global heatmap last
November. But experts and keen observers have recently realized its
potential to reveal location patterns of security forces working out
at military bases in remote locations.
Nathan Ruser, a 20-year-old Australian student and
analyst for the Institute for United Conflict Analysts, noted
on Twitter on Saturday that the map made US bases "clearly
identifiable and mappable."
… In a post
about the update in November, Strava said the update would include
"six times more data than before – in total one billion
activities from all Strava data through September 2017." Strava
boasts "tens
of millions" of users, and according to the company, marked
three trillion latitude/longitude points on the updated map. It
tracks location data using GPS from FitBits, cellphones, and other
fitness tracking devices.
… Scott LaFoy, an open-source imagery analyst,
told CNN it's too early to truly assess how useful the data is.
"In terms of strategic stuff, we know all the
bases there, we know a lot of the positions, this will just be some
nice ancillary data," said LaFoy.
… "If the data is not actually anonymous,
then you can start figuring out timetables and like some very
tactical information, and then you start getting into some pretty
serious issues," LaFoy said.
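To see why a fitness heatmap can leak this kind of tactical detail, here is a minimal sketch of the aggregation step: GPS points are snapped to small grid cells and counted, so a perimeter loop run daily at an otherwise empty location stands out. The coordinates are made up, and this is not Strava's actual pipeline.

```python
# Sketch of turning raw GPS traces into a heatmap: bin points into grid cells
# and count. Repeated activity in an otherwise empty area becomes visible.
from collections import Counter

def grid_cell(lat: float, lon: float, cell_deg: float = 0.001):
    """Snap a point to a roughly 100 m grid cell (at mid latitudes)."""
    return (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)

# Hypothetical GPS points logged by fitness trackers over many runs.
points = [
    (34.5521, 69.1601), (34.5522, 69.1603),   # repeated laps of the same
    (34.5521, 69.1602), (34.5523, 69.1601),   # perimeter loop
    (40.7128, -74.0060),                      # one stray point elsewhere
]

heat = Counter(grid_cell(lat, lon) for lat, lon in points)
for cell, count in heat.most_common():
    print(cell, count)  # the repeatedly-visited cell is the pattern analysts spotted
```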
Do most companies measure downtime? I doubt it.
https://smallbiztrends.com/2018/01/cost-of-a-tech-fail-small-business.html?google_editors_picks=true
IT Downtime
Costs Businesses $1.55 Million Per Year, Report Says (INFOGRAPHIC)
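For a business that does want to measure it, the arithmetic is not hard; here is a back-of-envelope sketch in which every figure is a hypothetical placeholder rather than a number from the report.

```python
# Back-of-envelope downtime cost estimate; all figures are assumed placeholders.
hourly_revenue_at_risk = 9_000      # $/hour the affected systems support (assumed)
staff_idle_cost_per_hour = 2_500    # $/hour of unproductive labor (assumed)
outages_per_year = 14               # incident count (assumed)
avg_hours_per_outage = 7.5          # mean time to restore (assumed)

annual_downtime_hours = outages_per_year * avg_hours_per_outage
annual_cost = annual_downtime_hours * (hourly_revenue_at_risk + staff_idle_cost_per_hour)
print(f"{annual_downtime_hours} hours/year, about ${annual_cost:,.0f}/year")
```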
Ready or not, here it comes.
NIST Report
on Blockchain Technology Aims to Go Beyond the Hype
“Beguiling, baffling or both—that’s
blockchain. Aiming to clarify the subject for the benefit of
companies and other organizations, the National Institute of
Standards and Technology (NIST) has released a straightforward
introduction to blockchain, which underpins Bitcoin and other digital
currencies. Virtual barrels of digital ink are flowing in the media
nowadays about these cryptocurrencies and the underlying blockchain
technology that enables them. Much of the attention stems either
from the giddy heights of value attained lately by the most
well-known of these currencies, Bitcoin, or from the novelty of
blockchain itself, which has been described as the most disruptive technology since the
internet. Blockchain’s proponents believe it lets individuals
perform transactions safely without the costs or security risks that
accompany the intermediaries that are required in conventional
transactions. The NIST report’s authors hope it will be useful to
businesses that want to make clear-eyed decisions about whether
blockchain would be an asset to their products.”
Blockchain Technology Overview, NISTIR 8202 (Draft), January 2018.
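For students who want the idea without the hype, here is a minimal sketch of the core mechanism the NIST overview describes: blocks chained together by hashes, so altering an earlier record invalidates everything after it. It is an illustration only, not Bitcoin's actual block format.

```python
# Minimal hash-chained ledger: tampering with any earlier block breaks verification.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "transactions": transactions}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash and link; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block_hash(body) != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))                        # True
chain[0]["transactions"][0]["amount"] = 500
print(verify(chain))                        # False: the tamper is detected
```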
Something to cheer up my students? Power to the
programmers!