Measuring
the “harm” of a security breach.
PayTime
Data Breach Hits Some Workers Hard
When
we think about the consequences of hacks or breaches, let’s not lose
sight of the fact that people may lose their jobs simply because
their data was caught up in an incident – even if there was no
evidence that their information was misused. idRADAR.com
has a good example of that in the aftermath of the PayTime hack.
They previously reported other
examples of how becoming the victim of a hack can cost someone a
security clearance and/or a job, with a follow-up
on one such case.
I
don't think I'd put it that way. Still, some interesting assertions.
Upsurge
in hacking makes customer data a corporate time bomb
…
The reality, cyber security experts say, is that however much they
spend, even the largest companies are unlikely to be able to stop
their systems from being breached. The best defense may simply be either
to reduce the data they hold or to encrypt
it so well that, if stolen, it will remain useless. [Or take most of
it off-line? Bob]
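Here
is a minimal sketch of the “encrypt it so well” option: encrypting
records at rest so that a stolen copy is useless without the key. It
assumes Python with the third-party cryptography package; the file
and field names are invented for illustration.

    # Minimal sketch: encrypt customer records at rest so a stolen file
    # is useless without the key. Assumes the third-party "cryptography"
    # package (pip install cryptography); all names are hypothetical.
    from cryptography.fernet import Fernet

    # In practice the key would live in a KMS/HSM, never beside the data.
    key = Fernet.generate_key()
    f = Fernet(key)

    record = b'{"name": "Jane Doe", "ssn": "000-00-0000"}'
    token = f.encrypt(record)  # authenticated encryption (AES-CBC + HMAC)

    with open("customers.db.enc", "wb") as out:
        out.write(token)

    # An attacker who exfiltrates customers.db.enc gets only ciphertext;
    # decryption requires the key, which is held elsewhere.
    assert f.decrypt(token) == record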
…
A report from cyber security think tank the Ponemon Institute showed
the average cost of a data breach in the last year grew by 15 percent
to $3.5 million. The likelihood of a company having a data breach
involving 10,000 or more confidential records over a two-year period
was 22 percent, it said.
… Still, a study of 102 UK financial institutions and 151 retail
organizations conducted earlier this year by Tripwire showed that 40
percent said they would
need 2 to 3 days to detect a breach.
So,
if I search for information on a company and Google indicates they
have “something to hide,” I will expand my search by using search
engines that do not comply with the EU rule. Or I may just not
invest in that company. (Imagine the impact on politicians!)
Google
may soon let you know when it’s required to hide something from you
A
European Union court recently ruled that Google must respect the EU’s
“right to be forgotten” and remove links to web
pages that individuals find embarrassing.
Now,
the Guardian
reports, Google may soon add a note to its edited search results,
indicating that something is missing.
Google
already does this on search pages from which it has removed results
in response to DMCA takedown requests, usually over alleged
copyright violations.
Interesting
idea. Automate the Privacy Policy review. (Maybe I see it as an
Audit tool because of 35 years of auditing?)
Bootstrapping
Privacy Compliance in Big Data Systems
by
Sabrina I.
Pacifici on June 8, 2014
“In
this
paper, we demonstrate a collection of techniques to transition to
automated privacy compliance checking in big data systems.
To this end we designed the LEGALEASE language, instantiated for
stating privacy policies as a form of restrictions on information
flows, and the GROK data inventory that maps
low-level data types in code to high-level policy concepts.
We show that LEGALEASE is usable by non-technical privacy champions
through a user study. We show that LEGALEASE is expressive enough to
capture real-world privacy policies with purpose, role, and storage
restrictions with some limited temporal properties, in particular
that of Bing and Google. To build the GROK data flow graph we
leveraged past work in program analysis and data flow analysis. We
demonstrate how to bootstrap labeling the graph with LEGALEASE policy
datatypes at massive scale. We note that the structure of the graph
allows a small number of annotations to cover a large fraction of the
graph. We report on our experiences and learnings from operating the
system for over a year in Bing. — Shayak Sen (Carnegie Mellon
University), Saikat Guha (Microsoft Research, India), Anupam Datta
(Carnegie Mellon University), Sriram Rajamani (Microsoft Research,
India), Janice Tsai (Microsoft Research, Redmond), and Jeannette Wing
(Microsoft Research), Bootstrapping Privacy Compliance in Big Data
Systems, IEEE Security and Privacy Symposium 2014, Best Student Paper
(1 of 2) – See more at:
https://www.cylab.cmu.edu/news_events/news/2014/ieee-sp-2014.html”
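To
make the idea concrete, here is a toy sketch of the kind of check
such a system performs: policy clauses deny certain (data type,
purpose) pairs, and a labeled data-flow graph is walked to find
violations. The clause format and labels below are invented for
illustration; the real LEGALEASE grammar is defined in the paper.

    # Toy illustration of automated policy checking over a labeled
    # data-flow graph, in the spirit of LEGALEASE/GROK. The policy
    # format and labels are hypothetical, not the paper's actual syntax.

    # Policy: (datatype, purpose) pairs that are denied.
    DENY = {
        ("IPAddress", "Advertising"),
        ("Location", "Advertising"),
    }

    # Data-flow graph nodes, each labeled (by GROK-style inference in
    # the paper, by hand here) with datatypes read and purpose served.
    flows = [
        {"node": "ads_targeting_job", "datatypes": {"IPAddress"},
         "purpose": "Advertising"},
        {"node": "abuse_detection", "datatypes": {"IPAddress"},
         "purpose": "Security"},
    ]

    def violations(flows, deny):
        # Report every node whose (datatype, purpose) pair hits a DENY clause.
        for f in flows:
            for dt in f["datatypes"]:
                if (dt, f["purpose"]) in deny:
                    yield f["node"], dt, f["purpose"]

    for node, dt, purpose in violations(flows, DENY):
        print("policy violation:", node, "uses", dt, "for", purpose)
    # -> policy violation: ads_targeting_job uses IPAddress for Advertising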
Another
programming-inspired paper?
Location
Tracking, Mosaic Theory, and Machine Learning
by
Sabrina I.
Pacifici on June 8, 2014
Enough
is Enough - Location Tracking, Mosaic Theory, and Machine Learning
- Steven M. Bellovin, Renée M. Hutchins, Tony Jebara, Sebastian
Zimmeck. New York University Journal of Law & Liberty, vol
8:555, 2014.
“Since
1967, when it decided Katz
v. United States, the Supreme Court has tied the right to be free
of unwanted government scrutiny to the concept of reasonable
expectations of privacy. An evaluation of reasonable expectations
depends, among other factors, upon an assessment of the intrusiveness
of government action. When making such an assessment, the Court has
historically considered police conduct with clear temporal, geographic,
or substantive limits. However, in an era where new technologies
permit the storage and compilation of vast amounts of personal data,
things are becoming more complicated. A school of thought known as
“mosaic theory” has stepped into the void, ringing the alarm that
our old tools for assessing the intrusiveness of government conduct
potentially undervalue privacy rights. Mosaic theorists advocate a
cumulative approach to the evaluation of data collection. Under the
theory, searches are “analyzed as a collective sequence of steps
rather than as individual steps.” The
approach is based on the recognition that comprehensive aggregation
of even seemingly innocuous data reveals greater insight than
consideration of each piece of information in isolation.
Over time, discrete units of surveillance data can be processed to
create a mosaic of habits, relationships, and much more.
Consequently, a Fourth Amendment analysis that focuses only on the
government’s collection of discrete units of trivial data fails to
appreciate the true harm of long-term surveillance — the composite.
In the context of location tracking, the Court has previously
suggested that the Fourth Amendment may (at some theoretical
threshold) be concerned with the accumulated information revealed by
surveillance. Similarly, in the Court’s recent decision in United
States v. Jones, a majority of concurring justices indicated
willingness to explore such an approach. However, in general, the
Court has rejected any notion that technological enhancement matters
to the constitutional treatment of location tracking. Rather, it has
found that such surveillance in public spaces, which does not require
physical trespass, is equivalent to a human tail and thus not
regulated by the Fourth Amendment. In this way, the Court has
avoided quantitative analysis of the amendment’s protections. The
Court’s reticence is built on the enticingly direct assertion that
objectivity under the mosaic theory is impossible. This is true in
large part because there has been no rationale yet offered to
objectively distinguish relatively short-term monitoring from its
counterpart of greater duration. As Justice Scalia recently observed
in Jones: “it remains unexplained why a 4-week investigation is
‘surely’ too long.” This article suggests that by combining
the lessons of machine learning with the mosaic theory and applying
the pairing to the Fourth Amendment we can see the contours of a
response. Machine learning makes clear that mosaics can be created.
Moreover, there are also important lessons to be learned on when that
is the case… In five parts, this article advances the conclusion
that the duration of
investigations is relevant to their substantive Fourth
Amendment treatment because
duration affects the accuracy of the predictions. Though
it was previously difficult to explain why an investigation of four
weeks was substantively different from an investigation of four
hours, we now have a better understanding of the value of aggregated
data when viewed through a machine learning lens. In some
situations, predictions of startling accuracy can be generated with
remarkably few data points.”
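The
“remarkably few data points” claim is easy to illustrate. A minimal,
hypothetical sketch: given a handful of timestamped location fixes,
even trivial aggregation predicts a person’s likely home and
workplace. All coordinates and the hour-of-day rule are invented.

    # Hypothetical sketch of the mosaic point: a few timestamped
    # location fixes, trivially aggregated, already suggest a person's
    # home and workplace. All data below is invented.
    from collections import Counter

    # (hour_of_day, rounded_lat, rounded_lon) observations.
    fixes = [
        (2, 40.71, -74.01), (3, 40.71, -74.01), (23, 40.71, -74.01),
        (10, 40.75, -73.99), (11, 40.75, -73.99), (15, 40.75, -73.99),
    ]

    def most_common_place(fixes, hours):
        # Most frequent location among fixes falling in the given hours.
        places = Counter((lat, lon) for h, lat, lon in fixes if h in hours)
        return places.most_common(1)[0][0]

    home = most_common_place(fixes, hours=set(range(0, 6)) | {22, 23})
    work = most_common_place(fixes, hours=set(range(9, 18)))
    print("inferred home:", home, "inferred work:", work)
    # Six data points already support a confident guess; longer
    # surveillance only sharpens the mosaic, which is the duration
    # argument the article advances.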
See
also Orin Kerr’s rebuttal of some interpretations of this paper – No,
machine learning doesn’t resolve how the mosaic theory applies