Thursday, July 04, 2019

Whatever you do, don’t act like criminals covering their tracks...
Georgia Failed to Subpoena Image of Wiped Elections Server
Nearly two years ago, state lawyers in a closely watched election integrity lawsuit said they intended to subpoena the FBI for the forensic image, or digital snapshot, the agency made of a crucial server before state election officials quietly wiped it clean. Election watchdogs want to examine the data to see if there might have been tampering, given that the server was left exposed by a gaping security hole for more than half a year.
A new email obtained by The Associated Press says state officials never did issue the subpoena.
The FBI's data is central to activists' challenge to Georgia's highly questioned, centrally administered elections system, which lacks an auditable paper trail and was run at the time by Gov. Brian Kemp, then Georgia's secretary of state.
The plaintiffs contend Kemp's handling of the wiped server is the most glaring example of mismanagement that could be hiding evidence of vote tampering. They have been fighting for access to the state's centralized black-box voting systems and to individual voting machines, many of which they say have also been wiped clean.

Shucks! Now I’ll have to make my own.
On July 2nd, YouTube Help added “more examples of content that violates” its policy regarding “harmful or dangerous content.” The list includes “Extremely dangerous challenges,” “Violent events,” and “Eating disorders.”
The newest item is “Instructional hacking and phishing,” which the video site defines as “showing users how to bypass secure computer systems or steal user credentials and personal data.”

Coming soon to a country near me?
Cookie consent – What “good” compliance looks like according to the ICO
On 3 July 2019, the UK data protection authority (the ICO) updated its guidance on the rules that apply to the use of cookies and other similar technologies. The ICO has also changed the cookie control mechanism on its own website to mirror the changes in the new guidance.
  • The use of cookie walls as a blanket approach to restrict access to a service until users consent will not comply with the cookie consent requirements.
  • Implied consent is also a no-go.
  • The ICO also views consent mechanisms that emphasise that users should ‘agree’ or ‘allow’ cookies over ‘reject’ or ‘block’ as non-compliant. It calls this ‘nudge behaviour’ which influences users towards the ‘accept’ option.
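The second point above, that silence is not consent, translates directly into server logic. Here is a minimal, hypothetical sketch (the cookie names and consent values are invented for illustration; this is not legal advice) of a check that sets non-essential cookies only after an explicit, affirmative opt-in:

```python
# Hypothetical sketch: non-essential cookies are set only on an explicit
# "accepted" choice -- never on silence (implied consent) or a pre-ticked
# default. All names here are invented for illustration.

ESSENTIAL = {"session_id"}                    # strictly necessary, no consent needed
NON_ESSENTIAL = {"analytics_id", "ad_profile"}

def cookies_allowed(requested, consent):
    """Return the subset of `requested` cookies that may be set.

    `consent` is None when the user has not interacted with the banner:
    per the ICO, that silence is NOT consent.
    """
    allowed = requested & ESSENTIAL
    if consent == "accepted":                 # explicit, affirmative choice only
        allowed |= requested & NON_ESSENTIAL
    return allowed

# Silence and "rejected" both block the analytics cookie;
# only an explicit "accepted" allows it.
print(cookies_allowed({"session_id", "analytics_id"}, None))
print(cookies_allowed({"session_id", "analytics_id"}, "accepted"))
```

Note that the essential session cookie is set in every case, which matches the ICO position that strictly necessary cookies are exempt from the consent requirement.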

Because it came up in both my classes yesterday.
US Journalist Detained When Returning to US
Pretty horrible story of a US journalist who had his computer and phone searched at the border when returning to the US from Mexico.
The EFF has extensive information and advice about device searches at the US border, including a travel guide.
If you are a U.S. citizen, border agents cannot stop you from entering the country, even if you refuse to unlock your device, provide your device password, or disclose your social media information. However, agents may escalate the encounter if you refuse. For example, agents may seize your devices, ask you intrusive questions, search your bags more intensively, or increase by many hours the length of detention. If you are a lawful permanent resident, agents may raise complicated questions about your continued status as a resident. If you are a foreign visitor, agents may deny you entry.

Another take on the topic.
Ethics in the Age of Artificial Intelligence
If we don’t know how AIs make decisions, how can we trust what they decide?
We are standing at the cusp of the next wave of the technological revolution: AI, or artificial intelligence. The digital revolution of the late 20th century brought us information at our fingertips, allowing us to make quick decisions, while the agency to make decisions, fundamentally, rested with us. AI is changing that by automating the decision-making process, promising better qualitative results and improved efficiency.
Unfortunately, in automating that decision-making process, AI also takes away the transparency, explainability, predictability, teachability and auditability of a human decision, replacing them with opacity. The logic behind a given decision is often unknown not only to the people affected by it, but also to the creators of the program. As AI makes decisions for us, transparency and predictability of decision-making may become a thing of the past.

Anyone can (and will) play.
Make: a machine-learning toy on open-source hardware
In the latest Adafruit video, the proprietors, Limor "ladyada" Fried and Phil Torrone, explain the basics of machine learning, with particular emphasis on the difference between computing a model (hard) and implementing the model (easy and simple enough to run on relatively low-powered hardware), and then they install and run TensorFlow Lite on a small, open-source handheld and teach it to distinguish between someone saying "No" and someone saying "Yes," in just a few minutes. It's an interesting demonstration of the theory that machine learning may be most useful in tiny, embedded, offline processors.
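The last stage of a keyword spotter like that one, turning the model's raw per-class scores into a "Yes"/"No" decision, is simple enough to sketch in plain Python. The scores and threshold below are invented for illustration; on the actual device the scores would come from the TensorFlow Lite model:

```python
import math

# Hypothetical last stage of a tiny "yes"/"no" keyword spotter:
# the model emits one raw score (logit) per class, a softmax turns
# them into probabilities, and we only report a word when the top
# class is confident enough. The logits below are made up.

LABELS = ["yes", "no"]

def classify(logits, threshold=0.8):
    """Softmax the logits; return (label, prob), or (None, prob)
    when the model is not confident enough to commit."""
    shifted = [x - max(logits) for x in logits]   # numerically stable softmax
    exps = [math.exp(x) for x in shifted]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    label = LABELS[best] if probs[best] >= threshold else None
    return label, probs[best]

print(classify([2.5, -1.0]))   # clearly "yes"
print(classify([0.1, 0.0]))    # too close to call: no label
```

The confidence threshold is what keeps background chatter from being misread as a command, which matters on an offline device with no server-side second opinion.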

“Siri, draft a smarter bill.”
California’s AB-1395 Highlights the Challenges of Regulating Voice Recognition
Under the radar of ongoing debates over the California Consumer Privacy Act (CCPA), the California Senate Judiciary Committee will also soon be considering, at a July 9th hearing, an unusual sectoral privacy bill regulating “smart speakers.” AB-1395 would amend California’s existing laws to add new restrictions for “smart speaker devices,” defined as “standalone devices with an integrated virtual assistant connected to a cloud computing storage service that uses hands-free verbal activation.” Physical devices like the Amazon Echo, Google Home, Apple HomePod, and others (e.g. smart TVs or speakers produced by Sonos or JBL that have integrated Alexa or Google Assistant) would be included, although the bill exempts the same cloud-based voice services when they are integrated into cell phones, tablets, or connected vehicles.

“Let’s replace all that new technology we don’t understand with old technology we can understand, like locks and keys!” Not very urgent if it took three years to pass.
U.S. Government Makes Surprise Move To Secure Power Grid From Cyberattacks
Homeland Security officials say that Russian hackers used conventional tools to trick victims into entering passwords in order to build out a sophisticated effort to gain access to control rooms of utilities in the U.S. The victims included hundreds of vendors that had links to nuclear plants and the electrical grid.
Nations have been trying to secure the industrial control systems that power critical national infrastructure (CNI) for years. The challenge lies in the fact that these systems were not built with security in mind, because they were not originally meant to be connected to the internet. [They were not built with Internet security in mind. The new bits were! Bob]
It is with this in mind that the U.S. has responded with a new strategy: rather than bringing in new technology and skills, it will use analog and manual technology to isolate the grid's most important control systems. This, the government says, will limit the reach of a catastrophic outage.
"This approach seeks to thwart even the most sophisticated cyber-adversaries who, if they are intent on accessing the grid, would have to actually physically touch the equipment, thereby making cyberattacks much more difficult," said a press release as the Securing Energy Infrastructure Act (SEIA), passed the Senate floor.
When introducing the bill in 2016, U.S. Senators Angus King (I-Maine) and Jim Risch (R-Idaho) said: "Specifically, it will examine ways to replace automated systems with low-tech redundancies, like manual procedures controlled by human operators."
