It will be interesting to see how big a fuss this makes. There appears to be no specific benefit to Facebook (it is unlikely to attract new users or increase profits). Are there enough Facebook friends to tip the “emotional contagion” into a class action lawsuit, or perhaps a lynch mob? (Was the announcement timed for a weekend to reduce the number of people who see it?)
Facebook
is learning the hard way that with great data comes great
responsibility
Facebook
released the results
of a study where its data scientists skewed the
positive or negative emotional content that appeared in the news
feeds of nearly 700,000 users over the course of a week in order to
study their reaction. The study found evidence of “emotional
contagion,” in other words, that the emotional content of posts
bled into users’ subsequent actions.
Professions
online.
Maanvi
Singh reports on the uptick in online psychotherapy services, but
notes that there are concerns not only about efficacy, but also about
licensing and privacy. With respect to privacy, Singh reports:
Some studies
suggest that therapy online can be as effective as it is face to
face. “We have a lot of promising data suggesting that technology
can be a very good means of providing treatment,” says Lynn Bufka,
a clinical psychologist who helps develop health-care policy for the
American Psychological Association.
“I don’t think we have all the answers yet,” Bufka says. There
are cases where therapy online may not work, she notes. Therapists
usually don’t treat people with severe issues online, especially if
they are contemplating suicide. That’s
because in case of a crisis, it’s much harder for online therapists
to track down their patients and get them help. [Clearly,
they have not been paying attention to GPS tracking. Bob]
Privacy is another concern. Instead of Skype, many online therapy
companies choose to use teleconferencing software with extra
security. Arthur at Pretty Padded Room says her company takes
measures to protect her clients’ records.
But it can be hard for people to know exactly how secure the website
they’re using really is, Bufka says.
Read
more on NPR.
Worth
reading
Last week’s National Post featured an op-ed written by Ontario’s
Information and Privacy Commissioner, Dr. Ann Cavoukian, and
Christopher Wolf, founder and co-chair of the Future of Privacy Forum
think tank, commenting on whether a recent European Court of Justice
judgement requiring Internet search providers to remove links to
embarrassing information should also be applied to Canadian citizens.
The full article is below:
A man walks into a library. He asks to see the librarian. He tells
the librarian there is a book on the shelves of the library that
contains truthful, historical information about his past conduct, but
he says he is a changed man now and the book is no longer relevant.
He insists that any reference in the library’s card catalog and
electronic indexing system associating him with the book be removed,
or he will go to the authorities.
The librarian refuses, explaining that the library does not make
judgments on people, but simply offers information to readers to
direct them to materials from which they can make their own judgment
in the so-called “marketplace of ideas.” The librarian goes on
to explain that if the library had to respond to such requests, it
would become a censorship body — essentially the arbiter of what
information should remain accessible to the public. Moreover, if it
had to respond to every such request, the burden would be enormous
and there would be no easy way to determine whether a request was
legitimate or not. The indexing system would become Swiss cheese,
with gaps and holes. And, most importantly, readers would be
deprived of access to historical information that would allow them to
reach their own conclusions about people and events.
The librarian gives this example: What if someone is running for
office but wants to hide something from his unsavory past by blocking
access to the easiest way for voters to uncover those facts? Voters
would be denied relevant information, and democracy would be
impaired.
The man is not convinced, and calls a government agent. The
government agent threatens to fine or jail the librarian if he does
not comply with the man’s request to remove the reference to the
unflattering book in the library’s indexing system.
Is this a scenario out of George Orwell’s Nineteen Eighty-Four?
No, this is the logical extension of a recent ruling from Europe’s
highest court, which ordered Google to remove a link to truthful
information about a person, because that person found the information
unflattering and out of date. (The scale of online indexing would of
course be dramatically more comprehensive than a library indexing
system.)
The European Court of Justice ruled that Google has a legal
obligation to remove, from a search result of an individual’s name,
a link to a newspaper containing a truthful, factual account of the
individual’s financial troubles years ago. The individual, a
Spanish citizen, had requested that Google remove the newspaper link
because the information it contained was “now entirely irrelevant.”
This concept has been described as the “right to be forgotten.”
While one may have sympathy for the Spanish man who claimed he had
rehabilitated his credit and preferred that his previous setback be
forgotten, the rule of law that the highest European Court has
established could open the door to unintended consequences such as
censorship and threats to freedom of expression.
The European Court relied on the fundamental rights to privacy and to
the protection of personal data contained in the Charter of
Fundamental Rights of the European Union, without so much as citing,
much less analyzing, one of the other fundamental rights contained in
the Charter, namely the right to free expression.
Moreover, the Court did not provide sufficient instruction on how the
“right to be forgotten” should be applied. When do truthful
facts become “outdated” such that they should be suppressed on
the Internet? Do online actors other than search engines have a duty
to “scrub” the Internet of unflattering yet truthful facts? The
Court didn’t say. The European Court of Justice has mandated that
the Googles of the world serve as judge and jury of what legal
information is in the public interest, and what information needs to
be suppressed because the facts are now dated and the subject is a
private person. Under penalty of fines and possibly jail time,
online companies may err on the side of deleting links to
information, with free expression suffering in the process.
The European Court’s own Advocate General argued that a right to be
forgotten “would entail sacrificing pivotal rights such as freedom
of expression and information” and would suppress “legitimate and
legal information that has entered the public sphere.” Further,
the Advocate General argued, this would amount to “censuring”
published content. In the First Amendment parlance of the U.S.
Supreme Court, the European Court’s decision may amount to “burning
the house to roast the pig.” [Being quite literate, I recognize this as a reference to “A Dissertation Upon Roast Pig,” by Charles Lamb. (Just showing off.) Bob]
You might think this problem is limited to Europe, and that the
search results in North America will remain unaffected by the Court’s
ruling. But earlier European efforts to cleanse the Internet (in the
context of hate speech) suggested that even materials on North
American domains would be subject to European law.
As privacy advocates, we strongly support rights to protect an
individual’s reputation and to guard against illegal and abusive
behaviour. If you post something online about yourself, you should
have the right to remove it or take it somewhere else. If someone
else posts illegal defamatory content about you, as a general rule,
you have a legal right to have it removed. But while personal
control is essential to privacy, empowering individuals to demand the
removal of links to unflattering, but accurate, information arguably
goes far beyond protecting privacy. Other solutions should be
explored to address the very real problem posed by the permanence of
online data.
The recent extreme application of privacy rights in such a vague,
shotgun manner threatens free expression on the Internet. We cannot
allow the right to privacy to be converted into the right to censor.
A
few items in the slideshow that I hadn't thought of. I'm clearly not
thinking “ubiquitously” enough.
The
Internet of Things at home: Why we should pay attention
What is the Internet of Things (IoT), exactly? If you're a consumer,
then the first thing that leaps to mind might be a Nest
Wi-Fi thermostat, or perhaps those smart
health bands that let you monitor your activity level from an app
on your smartphone.
That's
part of it. But if you're an engineer, you might think of the smart
sensors that General Electric embeds in locomotives
and wind turbines, while a city manager might be considering
smart parking meters,
and a hospital administrator might envision swallowable smart
pill sensors that monitor how much medication you've taken or
blood pressure
cuffs and blood glucose monitors that can monitor patient health
in the field and wirelessly stream updates into clinical systems.
[[Note:
This article accompanies our slideshow The
Internet of Things at home: 14 smart products that could change your
life; you can get more info about these products by checking out
The Internet
of Things at home: 14 smart products compared.]]
Perhaps the MPAA has outlived its usefulness? (Was it ever more than a source of amusement?)
The
MPAA Targets A Subreddit & Opens Everyone’s Eyes To Free Movies
SOPA
and PIPA terrified those of us who cherish the Internet for what
it has become today. In light of these bills, the MPAA
embarrassed itself on numerous occasions, once
even citing countries like China, Iran, and Syria as role models of
sorts when it comes to how they think the Internet should be
censored by the US.
This
week, they’re at it again, opening our eyes to a beautiful example
of the Streisand
effect
…
The MPAA’s
latest attempt at thwarting the piracy of movies on the Internet
sent them after a subreddit that many people never even knew existed.
That
subreddit is /r/fulllengthfilms,
which, up until a couple of hours before I started this post, was
unkempt, had an obscenity in the header, and included CSS and a
general color scheme that made you want to claw your eyes out.
…
In this particular case, what the MPAA needs to consider is that
Reddit is not a content delivery network. The things that are posted
and linked on Reddit are not uploaded or hosted on Reddit. The only
things that you’ll find on Reddit are links and text. Everything
linked in the /r/fulllengthfilms subreddit is hosted on servers away
from Reddit, and these are the websites that they should first be
targeting. [If they had
gone after the hosting servers, there would have been no “Reddit
kerfuffle.” Makes you wonder if this was deliberate rather than
merely stupid. Bob]
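The article’s point that Reddit hosts only links and text is easy to check for yourself. Below is a minimal sketch (my own illustration, not from the article) assuming Python 3 with the requests package installed; it pulls Reddit’s public JSON listing for the subreddit named above and prints the external domains each post actually points to.

```python
# Minimal sketch: show that a subreddit listing contains only links, not hosted files.
# Assumes Python 3 with the `requests` package installed.
import requests

SUBREDDIT = "fulllengthfilms"  # the subreddit named in the article

# Reddit exposes a public JSON listing for any subreddit at /r/<name>.json;
# a descriptive User-Agent helps avoid being rate-limited.
resp = requests.get(
    f"https://www.reddit.com/r/{SUBREDDIT}.json",
    headers={"User-Agent": "link-audit-sketch/0.1 (illustrative example)"},
    timeout=10,
)
resp.raise_for_status()

for child in resp.json()["data"]["children"]:
    post = child["data"]
    # `domain` is the host the link points to; self posts show up as "self.<subreddit>",
    # everything else lives on servers outside Reddit.
    print(f'{post["domain"]:30} {post["title"][:60]}')
```

Every row is just a title plus an outbound domain; the files themselves sit on those outside hosts, which is exactly the author’s point.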
We push our students to use LinkedIn...
New
on LLRX – Fourteen LinkedIn Tips for (the Rest of) 2014
by
Sabrina I.
Pacifici on Jun 29, 2014
With
over 300 million users, LinkedIn is the most popular social media
platform for business and professional use, and attorneys Dennis
Kennedy and Allison
C. Shields clearly and concisely outline smart, targeted and effective
ways to leverage this space: positively identifying yourself in
communities of best practice, proactively communicating with peers and
potential clients, and expanding your business reach.