As I try to teach my students how to manage Computer Security, I find it all too easy to come across really good, really bad examples of security failures for them to analyze.
So yeah, I’d
say this is pretty bad. Cassie Williams reports:
Security protocols on surveillance cameras at a Cape Breton school remain out of date, months after images of its students were unintentionally broadcast on the internet, Nova Scotia’s privacy commissioner Catherine Tully says.
Tully’s report on the matter found there were “inadequate passwords and insufficient technical controls” behind the initial breach. While passwords have been changed, Tully said the school has still not placed the streams behind a firewall or equivalent protection, and two of the cameras are no longer supported by manufacturer security updates.
Read more on CBC.ca.
Something to amuse (not inspire) my Computer
Security students?
Ghost in the cell
How an inmate hacker hid computers in the ceiling and turned his prison upside down
So far, none of my students thought this was a
good idea either.
Joe Cadillic writes:
Recently, Walmart shocked privacy conscious Americans by announcing they wanted customers to let Walmart employees inside their homes.
Then….
Three days ago, Amazon shocked privacy conscious Americans by announcing that they also want to deliver packages inside people’s homes and cars.
Who wouldn’t want to let two of the largest corporations in the world have access to your car and home, what could possibly go wrong?
If you want to find out what could go wrong, read Computerworld’s article that describes how letting corporations inside your home and car is one of the worst ideas ever.
Read more on MassPrivateI.
Would this hold up in court? (I think so!)
Perhaps someone at Gizmodo thought this would be a
great attention-getter of a headline. And maybe they’re right as
far as that goes, but this is not about Facebook outing sex workers.
It’s about Facebook being able to make connections – and expose
connections – that perhaps you do not want made or exposed.
Kashmir Hill reports:
Leila has two identities, but Facebook is only supposed to know about one of them.
Leila is a sex worker. She goes to great lengths to keep separate identities for ordinary life and for sex work, to avoid stigma, arrest, professional blowback, or clients who might be stalkers (or worse).
Her “real identity”—the public one, who lives in California, uses an academic email address, and posts about politics—joined Facebook in 2011. Her sex-work identity is not on the social network at all; for it, she uses a different email address, a different phone number, and a different name. Yet earlier this year, looking at Facebook’s “People You May Know” recommendations, Leila (a name I’m using in place of either of the names she uses) was shocked to see some of her regular sex-work clients.
Despite the fact that she’d only given Facebook information from her vanilla identity, the company had somehow discerned her real-world connection to these people—and, even more horrifyingly, her account was potentially being presented to them as a friend suggestion too, outing her regular identity to them.
Read more on Gizmodo.
Something to share with my students.
Top Tools
for Learning 2017
Jane Hart, of the Centre for Learning and Performance Technologies, has published her annual list of Top Tools for Learning.
For the toolkit. If I can’t give my students
real Intelligence, perhaps AI will do?
AWS &
Microsoft Build an AI Starter Kit
Amazon Web Services (AWS) has launched an
artificial intelligence library, continuing the competitive AI push
among the public cloud giants.
The Gluon
library, created by Amazon
Web Services Inc. with help from Microsoft
Corp., is an AI beginner's kit. It lets developers build models
by coding in Python and assembling pre-built chunks of code.
The idea is to bring AI, or at least machine learning, into the hands
of programmers who aren't experts in the subject, as AWS explained in
a blog
posting yesterday.
… Gluon is particularly interesting because
it's not a research effort, but a tool for reducing AI into normal
programming. It fits into Microsoft's goal of democratizing
AI. And it wouldn't be surprising to see more efforts emerge
along these lines, giving developers more ways to add some
intelligence to their cloud applications.
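The excerpt only describes Gluon at a high level, so here is a minimal sketch of what "assembling pre-built chunks of code" can look like with the MXNet Gluon API. This is my own illustration, not code from the AWS announcement: the layer sizes, dummy batch, and training settings are arbitrary assumptions chosen just to show the shape of the API.

import mxnet as mx
from mxnet import autograd, gluon, nd

# Assemble a small classifier out of Gluon's pre-built building blocks.
net = gluon.nn.Sequential()
with net.name_scope():
    net.add(gluon.nn.Dense(64, activation='relu'))   # hidden layer
    net.add(gluon.nn.Dense(10))                      # 10-way output
net.initialize(mx.init.Xavier())

loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})

# One training step on a made-up batch: 32 examples, 784 features each.
data = nd.ones((32, 784))
label = nd.array([i % 10 for i in range(32)])

with autograd.record():              # record the forward pass for autodiff
    loss = loss_fn(net(data), label)
loss.backward()                      # compute gradients
trainer.step(batch_size=32)          # update the parameters

print(loss.mean().asscalar())

Even this toy example shows the appeal for non-experts: the network, the loss, and the optimizer are each a line or two of pre-built components, and the only "machine learning" the programmer writes is the forward-and-backward training step.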
Another strange day here at Centennial-Man: