A leading US supplier of voting machines confirmed on
Thursday that it exposed the personal information of more than 1.8 million
Illinois residents.
State authorities and the Federal Bureau of Investigation
were alerted this week to a major data leak exposing the names, addresses,
dates of birth, partial Social Security numbers, and party affiliations of over
a million Chicago residents. Some
driver’s license and state ID numbers were also exposed.
Jon Hendren, who works for the cyber resilience firm UpGuard,
discovered the breach on an Amazon Web Services (AWS) device that was not secured by a password. The voter data was then downloaded by cyber
risk analyst Chris Vickery, who determined that Election Systems & Software
(ES&S) controlled the data. ES&S provides voting machines and services
in at least 42 states.
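A classroom aside: "not secured by a password" on AWS typically means a storage bucket whose access control list grants read access to everyone. As a hedged sketch (the helper name and the exact failure in the ES&S case are assumptions; only the ACL document shape matches what boto3's `get_bucket_acl` returns), here is how one might audit an ACL for world-readability:

```python
# Sketch: detect an S3 ACL that grants READ to a public group.
# The grantee URIs below are the real AWS "AllUsers" / "AuthenticatedUsers"
# group identifiers; is_publicly_readable() is an illustrative helper,
# not part of any AWS SDK.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def is_publicly_readable(acl):
    """Return True if any grant gives READ or FULL_CONTROL to a public group."""
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            if grant.get("Permission") in ("READ", "FULL_CONTROL"):
                return True
    return False
```

With credentials configured, you would feed it a live ACL via `is_publicly_readable(boto3.client("s3").get_bucket_acl(Bucket="some-bucket"))`.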
Perfect for my Software Assurance class.
Well, this sounds like an epic FAIL on the City of
Yonkers’ part, doesn’t it?
City
of Yonkers – Information Technology (Westchester County)
The IT department’s acceptable computer use policy was not signed or acknowledged by all employees, and city officials have also not classified personal, private, and sensitive information based on its level of sensitivity and the potential impact should that data be disclosed, altered, or destroyed without authorization. In addition, city officials have not ensured that employees received adequate cybersecurity training and have not adopted a breach notification policy or a disaster recovery plan.
You can access the full report here
(.pdf).
Gosh, you don’t think the government would lie do you? (Me too!)
Dems want independent probe into FCC cyberattack
Democratic lawmakers are calling for an independent
investigation into how the Federal Communications Commission responded to a
reported cyberattack in May that crippled the agency’s comment filing system.
Sen. Brian Schatz (D-Hawaii) and Rep. Frank
Pallone Jr. (D-N.J.) sent a letter
to the Government Accountability Office (GAO) on Thursday that cast doubt on
the FCC’s version of the incident.
“While the FCC and the FBI have responded to Congressional
inquiries into these [distributed denial of service] attacks, they have not
released any records or documentation that would allow for confirmation that an
attack occurred, that it was effectively dealt with, and that the FCC has begun
to institute measures to thwart future attacks and ensure the security of its
systems,” the letter reads.
“As a result, questions remain about the attack itself and
more generally about the state of cybersecurity at the FCC — questions that
warrant an independent review.”
Perspective. A partial
list of victims.
NotPetya Attack Costs Big Companies Millions
Obvious security?
Facebook Awards $100,000 Prize for Spear-Phishing Detection
Method
… To test their
method, the researchers analyzed
more than 370 million emails received by a large enterprise’s employees between
March 2013 and January 2017.
The first part of the detection method relies on the
analysis of two key components: domain reputation features and sender
reputation features. The domain
reputation feature involves analyzing the link included in an email to see if
it poses a risk. A URL is considered
risky if it has not been visited by many employees from within an organization,
or if it has never been visited until very recently.
The sender reputation feature aims to identify spoofing of
the sender’s name in the From header, a previously unseen attacker using a name
and email address closely resembling a known or authoritative entity,
exploitation of compromised user accounts, and suspicious email content (i.e.
messages that reference accounts and credentials, or ones that invoke a sense
of urgency).
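The two reputation features lend themselves to a compact sketch. Everything below is an illustrative assumption on my part (the class names, thresholds, and scoring rule are invented; the researchers' actual features are richer): a linked domain is flagged when few employees have visited it or it was first seen only recently, and a sender is flagged when a known display name arrives from an address never seen with that name.

```python
from dataclasses import dataclass, field

@dataclass
class OrgHistory:
    # domain -> number of distinct employees who have visited it
    visit_counts: dict = field(default_factory=dict)
    # domain -> day index (relative to the log's start) when first seen
    first_seen_day: dict = field(default_factory=dict)
    # display name -> set of email addresses previously seen with that name
    known_senders: dict = field(default_factory=dict)

def domain_is_risky(history, domain, today, min_visitors=10, min_age_days=30):
    """Domain reputation: flag a linked domain that few employees have
    visited, or that was never visited until very recently."""
    visitors = history.visit_counts.get(domain, 0)
    first_seen = history.first_seen_day.get(domain, today)
    return visitors < min_visitors or (today - first_seen) < min_age_days

def sender_is_suspicious(history, display_name, address):
    """Sender reputation: flag a From header whose display name matches a
    known sender but whose address has never been paired with that name
    (the spoofing / lookalike-entity case)."""
    seen_addresses = history.known_senders.get(display_name)
    return seen_addresses is not None and address not in seen_addresses

def score_email(history, linked_domain, display_name, address, today):
    # Crude combination rule: either feature firing marks the email for review.
    return domain_is_risky(history, linked_domain, today) or \
           sender_is_suspicious(history, display_name, address)
```

For example, a message signed "IT Support" but sent from an unfamiliar address, linking to a domain nobody in the organization has visited, would trip both heuristics.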
If it’s good enough for Russia…
Natalia Gulyaeva, Maria Sedykh, and Bret Cohen write:
On 31 July, the Russian data
protection authority, Roskomnadzor, issued guidance for data operators on the
drafting of privacy policies to comply with Russian data protection law. Russia’s 2006 privacy law – Federal Law No.
152-FZ of 27 July 2006 “On Personal Data” (Personal Data Law) – requires, among
other things, that Russian data operators must adopt a privacy policy that
describes how they process personal data. This notice requirement is similar to the
approach in Europe. Furthermore, data
operators shall publish such a policy online when personal data is collected
online or otherwise provide unrestricted access to the policy when personal
data is collected offline. The guidance
– although non-binding and recommendatory in nature – emphasizes the
regulator’s compliance expectations and should therefore be taken into account
by organizations acting as data operators in Russia.
Read more on Chronicle
of Data Protection.
How to write Terms of Service? More important: How to read them!
2nd Circuit’s Uber arbitration ruling huge win for app
industry
On Thursday, the 2nd U.S. Circuit Court of Appeals ruled
that Uber user Spencer Meyer assented to the company’s mandatory arbitration
requirement when he clicked a button to complete his registration for the Uber
smartphone app. The 2nd Circuit’s
decision, written by Judge Denny Chin for a panel that also included Judges Reena Raggi
and Susan
Carney, rejected Meyer’s argument that he wasn’t on fair notice
of the arbitration provision because the Uber registration process presented
the app’s terms of service only via hyperlink.
That’s great news for companies with
smartphone apps – and not just because the court held that app purchasers can
be bound by a “sign-in wrap” that folds assent to terms of service into
registration for the app. The 2nd
Circuit also confirmed the obvious: Now that Internet-connected devices have
become nearly ubiquitous, smartphone
users ought to know that registering for an app has legal consequences.
A project for my students.
Algorithmic Transparency for the Smart City
Brauneis, Robert and Goodman, Ellen P., Algorithmic
Transparency for the Smart City (August 2, 2017). Available at SSRN: https://ssrn.com/abstract=3012499
“Emerging across many disciplines are questions about
algorithmic ethics – about the values embedded in artificial intelligence and
big data analytics that increasingly replace human decision making. Many are concerned that an algorithmic society
is too opaque to be accountable for its behavior. An individual can be denied parole or denied
credit, fired or not hired for reasons she will never know and cannot be
articulated. In the public sector, the
opacity of algorithmic decision making is particularly problematic both because
governmental decisions may be especially weighty, and because
democratically-elected governments bear special duties of accountability. Investigative journalists have recently
exposed the dangerous impenetrability of algorithmic processes used in the
criminal justice field – dangerous because the predictions they make can be
both erroneous and unfair, with none the wiser. We set out to test the limits of transparency
around governmental deployment of big data analytics, focusing our
investigation on local and state government use of predictive algorithms. It is here, in local government, that
algorithmically-determined decisions can be most directly impactful. And it is here that stretched agencies are
most likely to hand over the analytics to private vendors, which may make
design and policy choices out of the sight of the client agencies, the public,
or both. To see just how impenetrable
the resulting “black box” algorithms are, we filed 42 open records requests in
23 states seeking essential information about six predictive algorithm
programs. We selected the most
widely-used and well-reviewed programs, including those developed by for-profit
companies, nonprofits, and academic/private sector partnerships. The goal
was to see if, using the open records process, we could discover what policy
judgments these algorithms embody, and could evaluate their utility and
fairness. To do this work, we
identified what meaningful “algorithmic transparency” entails. We found that in almost every case, it wasn’t
provided. Over-broad assertions of trade
secrecy were a problem. But contrary to
conventional wisdom, they were not the biggest obstacle. It will not usually be necessary to release
the code used to execute predictive models in order to dramatically increase
transparency. We conclude that
publicly-deployed algorithms will be sufficiently transparent only if (1)
governments generate appropriate records about their objectives for algorithmic
processes and subsequent implementation and validation; (2) government
contractors reveal to the public agency sufficient information about how they
developed the algorithm; and (3) public agencies and courts treat trade secrecy
claims as the limited exception to public disclosure that the law requires. Although it
would require a multi-stakeholder process to develop best practices for record
generation and disclosure, we present what we believe are eight principal types
of information that such records should ideally contain.”
Keeping my students busy.
For my Geeks.
A reminder.
Last chance to get eclipse glasses?
Community College of Denver Solar Eclipse Party
Community College of Denver will be setting up two
telescopes to safely view the 93% partial solar eclipse on August 21st. One telescope is a Coronado Solarmax 60mm with
an H-alpha solar filter, the other is a 6" Celestron scope with a
broadband solar filter. Safe
viewing glasses provided.