Sunday, September 29, 2019


What are the minimum requirements for a disclosure order?
The U.S. and UK governments are expected to sign a treaty in October that will force social media platforms based in either country to “disclose encrypted messages from suspected terrorists, paedophiles and other serious criminals” to police in the other, according to the Times of London.
According to the paper, Home Secretary Priti Patel said UK authorities’ hands are currently tied by arrangements that limit transnational data-sharing to emergencies and a slow-moving treaty process:
At present the security services are only able to obtain data if there is a need for an “emergency disclosure” due to an imminent threat to life. The police and prosecutors can also request data under the “mutual legal assistance” treaty but the process is highly bureaucratic and can take up to two years.
Under the new treaty, the police, prosecutors and the security services can submit requests for information to a judge, magistrate or “other independent authority”. The process will be overseen by the investigatory powers commissioner.
Under the terms of the proposed arrangement, both governments will agree not to investigate each other’s citizens. The U.S. won’t be able to use data obtained from companies based in the UK in death penalty cases unless UK authorities have explicitly given permission to do so. Bloomberg confirmed news of the data-sharing agreement as well.
It’s not clear whether the proposed arrangement actually requires companies to build backdoors into their encrypted products, something that law enforcement and intelligence agencies have been demanding for years, but which has been resisted by tech firms.




A bit snarky, don’t you think?
Mike Masnick writes:
California is inching ever closer to having its very problematic privacy law take effect. As we’ve noted, while good privacy legislation would be desirable, this is not it. Indeed, this law is woefully undercooked by design. If you don’t remember, the process by which we got here dictated terrible results. A wealthy real estate developer, Alastair Mactaggart, decided that he was going to “fix” internet privacy, by putting a truly bad proposal regarding internet privacy to a public vote, using California’s somewhat horrific public referendum system — that allows for the public to effectively modify California’s constitution by popular vote.
Read more on TechDirt.
[From the article:
But, here's the thing, after agreeing to pull that referendum from the ballot, Mactaggart has now announced that he's bringing it back for the next ballot.]




Legal creep?
Odia Kagan of Fox Rothschild writes:
The Danish Data Protection Authority has changed its position regarding the legal basis for posting pictures online under the General Data Protection Regulation (GDPR). Rather than a distinction between “situational” and “portrait” pictures, Datatilsynet now requires a case-by-case analysis.
Read more on Fox Rothschild.




I’ve come to expect poor reporting. Did anyone notice that this was implemented without notice? Or that it does not scan the Internet, as the school board seems to claim? And did anyone check that claim about preventing suicides?
Safety versus privacy: Williamson County School introduces new 'Gaggle' student surveillance program
After the Williamson County School District implemented a threat surveillance computer program, some parents are concerned about protecting students’ privacy.
WCS Superintendent Jason Golden announced the use of the online safety program at a school board work session in August. Without giving details due to federal family protection laws, Golden explained the new program had already “paid for itself” this year by identifying a threat of student self-harm, allowing district officials to intervene.
According to the Gaggle distributor, during the 2018-19 school year the program helped school districts prevent 722 students from carrying out acts of suicide.
Gaggle operates using a mathematical algorithm to identify high-risk words and phrases when students are logged into the district’s server.
Gaggle scans WCS student accounts in G Suite for Education within the district for inappropriate or concerning words and images.
Systems monitored by Gaggle in WCS include Gmail, Google Docs, Google Drive and other programs using myplace.wcs.edu accounts provided to students.
The district says Gaggle is aimed at protecting students on the internet, and that it is the district’s role to do so under federal law.
Gaggle does not monitor student internet use.
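The article gives no detail on Gaggle's internals, but keyword-and-phrase flagging of the sort it describes is conceptually simple. A minimal sketch of the general technique, assuming a plain substring watchlist (the phrase list and function name here are my own illustration, not Gaggle's actual terms, scoring, or code):

```python
# Hypothetical watchlist -- Gaggle's real term lists are proprietary.
HIGH_RISK_PHRASES = ["hurt myself", "kill myself", "bring a gun"]

def flag_text(text, phrases=HIGH_RISK_PHRASES):
    """Return the watchlist phrases found in a piece of student text."""
    lowered = text.lower()
    return [p for p in phrases if p in lowered]

doc = "I can't take it anymore, I want to hurt myself."
print(flag_text(doc))  # ['hurt myself']
```

Real systems would layer context analysis and human review on top of a raw match like this, which is presumably why Gaggle describes its approach as an "algorithm" rather than a simple word list.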




Just because I found it…
The Ethics of Artificial Intelligence
Undergraduate Thesis




I’ll definitely use Dilbert’s explanation of blockchain.


