Monday, October 31, 2011


Okay, who tipped the guys at Stanford? I hope you got something cool in trade...
Outsmarted: Captcha security not much of a gotcha
PALO ALTO--A team of Stanford University researchers has bad news to report about Captchas, those often unreadable, always annoying distorted letters that you're required to type in at many a Web site to prove that you're really a human.
Many Captchas don't work well at all. More precisely, the researchers invented a standard way to decode those irksome letters and numbers found in Captchas on many major Web sites, including Visa's Authorize.net, Blizzard, eBay, and Wikipedia.
… "Most Captchas are designed without proper testing and no usability testing," Elie Bursztein, 31, a postdoctoral researcher at the Stanford Security Laboratory, told CNET yesterday. "We hope our work will push people to be more rigorous in their approach in Captcha design." Captcha stands for Completely Automated Public Turing test to tell Computers and Humans Apart.
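The article doesn't publish the Stanford decoder itself, but tools of this kind typically start the same way: clean up the distorted image, then split it into individual characters before any recognition step. A toy sketch of those first two stages (purely illustrative, not the researchers' actual code) might look like this, binarizing a small grayscale grid and segmenting characters at blank columns:

```python
# Toy illustration of Captcha-decoder preprocessing, NOT the Stanford tool:
# 1) binarize a grayscale image, 2) split it into per-character slices
# wherever a column contains no ink.

def binarize(pixels, threshold=128):
    """Convert a grayscale grid (0-255) to a 0/1 grid (1 = ink)."""
    return [[1 if p < threshold else 0 for p in row] for row in pixels]

def segment_columns(binary):
    """Return (start, end) column ranges for each inked region."""
    width = len(binary[0])
    has_ink = [any(row[x] for row in binary) for x in range(width)]
    segments, start = [], None
    for x, ink in enumerate(has_ink):
        if ink and start is None:
            start = x                      # a character region begins
        elif not ink and start is not None:
            segments.append((start, x))    # blank column ends the region
            start = None
    if start is not None:
        segments.append((start, width))
    return segments

# Two "characters" of ink separated by one blank (255) column.
image = [
    [0,   0, 255, 0,   0],
    [0, 255, 255, 255, 0],
    [0,   0, 255, 0,   0],
]
print(segment_columns(binarize(image)))  # -> [(0, 2), (3, 5)]
```

The point the researchers make is that many deployed Captchas fall to exactly this kind of mechanical pipeline: if blank-column segmentation works, the hard part is already over.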


It's better than that. Since your signal is more powerful than the local cell towers, you become the phone system, allowing you to locate and track any or all phones in real time and listen in. Don't believe it? I'll have my Ethical Hackers demonstrate their version...
"UK Metropolitan Police have purchased a 'covert surveillance technology that can masquerade as a mobile phone network, transmitting a signal that allows authorities to shut off phones remotely, intercept communications and gather data about thousands of users in a targeted area.' Other customers apparently include 'the U.S. Secret Service, the Ministry of Defence and regimes in the Middle East.'"


Attention Ethical Hackers: Think of the white areas as a “free fire zone”
Where in the world are there data protection laws?
October 30, 2011 by Dissent
I stand in awe of how much some folks accomplish. Dave Banisar alerts me that he has updated the global map showing which countries have comprehensive data protection laws. The number is now up to 70.
You can see the map on SSRN (click the download link).
And yes, that glaring white area in North America where there is no comprehensive data protection law is the U.S.

(Related)
October 30, 2011
Privacy and Security in the Implementation of Health Information Technology: U.S. and EU Compared
  • "The importance of the adoption of Electronic Health Records (EHRs) and the associated cost savings cannot be ignored as an element in the changing delivery of health care. However, the potential cost savings predicted in the use of EHR are accompanied by potential risks, either technical or legal, to privacy and security. The U.S. legal framework for healthcare privacy is a combination of constitutional, statutory, and regulatory law at the federal and state levels. In contrast, it is generally believed that EU protection of privacy, including personally identifiable medical information, is more comprehensive than that of U.S. privacy laws. Direct comparisons of U.S. and EU medical privacy laws can be made with reference to the five Fair Information Practices Principles (FIPs) adopted by the Federal Trade Commission and other international bodies. The analysis reveals that while the federal response to the privacy of health records in the U.S. seems to be a gain over conflicting state law, in contrast to EU law, U.S. patients currently have little choice in the electronic recording of sensitive medical information if they want to be treated, and minimal control over the sharing of that information. A combination of technical and legal improvements in EHRs could make the loss of privacy associated with EHRs de minimis. The EU has come closer to this position, encouraging the adoption of EHRs and confirming the application of privacy protections at the same time. It can be argued that the EU is proactive in its approach; whereas because of a different viewpoint toward an individual’s right to privacy, the U.S. system lacks a strong framework for healthcare privacy, which will affect the implementation of EHRs. If the U.S. is going to implement EHRs effectively, technical and policy aspects of privacy must be central to the discussion."


Who is interpreting this law? If someone who had been threatened is later injured, wouldn't the school's liability be practically infinite?
WA: Federal student privacy law blamed for school’s failure to prevent attack?
October 30, 2011 by Dissent
@EducationNY pointed me to this news story by Christine Clarridge:
Five months before she allegedly attacked two schoolmates with a knife, nearly killing one, a Snohomish High School student underwent counseling after she threatened to kill another student’s boyfriend.
The 15-year-old Snohomish girl was allowed to return to school only after she presented proof she had attended counseling.
The earlier threats would have never been made public if the information wasn’t contained in court documents charging the girl with first-degree attempted murder and first-degree assault in last Monday’s attack.
Some Snohomish parents were surprised to learn of the earlier threat and have expressed concern that they weren’t notified.
But student information, including mental-health records, is tightly held by school districts because of federal privacy laws. The district says it cannot even discuss whether counselors or teachers were made aware of the earlier threats because of privacy laws.
The case underscores the delicate and complicated balancing act faced by schools in their efforts to meet the educational and privacy rights of individual students, as well as their need to ensure the safety of the larger student body.
Read more on Seattle Times.
Back in 2007, following the “Virginia Tech Shooter” case, that college also alleged that it could not do certain things because of federal privacy laws. There was tremendous public debate, and FERPA was subsequently amended to make clear that schools could protect student safety and what the exemptions to privacy were. Past media coverage can be found in the archives for this blog.
So FERPA was amended to prevent another “hands tied” Virginia Tech type of situation and yet Snohomish claims that its hands were still tied?
Back in 2007 and 2008, both Daniel Solove and I argued that the problem was not with FERPA but with how schools were interpreting it. The amendments enacted in 2008 should have made clear that student privacy does not trump public safety. So why this failure – again? Well, what if you don’t really know whether the student poses a current threat? What if a psychiatrist sent a letter asserting that the student is not a risk to others? Under such circumstances, can a school district disclose past problems? Some would argue – and I would agree – that they cannot disclose under such circumstances. But what they probably can do is not rely on a psychiatrist’s one-time assessment and require regular re-assessments as a precondition of returning to and staying in school. And what they probably can do is ensure that the student is seen each week for counseling in school by a professional who can assess whether there is any change in mood or behavior. And what they probably can do is follow up regularly with the student’s parents, offering them resources and support to ensure that the student’s mental-health needs are being met.
There is no way to predict with absolute certainty whether a student will become violent, but there are some factors or predictors. FERPA and HIPAA should not be obstacles to getting students needed care or help, and I don’t think that they are. I tend to think that districts are so afraid of litigation that [they do nothing... Bob] in some cases, they err on the side of protecting privacy instead of student safety.
A district’s duty to provide safe schools applies not only to the students who might become victims but also to the student who may become the attacker. Just as my patients know that as a mandated reporter, there are certain secrets I can’t keep, students need to understand that there are certain secrets their schools can’t keep secret, either, and that their school will do what it has to do to keep them and their peers safe.


Oh look Ethical Hackers, a challenge!
Suppose you're working and you need to search for some sites, and these are the kind of sites that, for one reason or another, you just don't want in your browsing history. What can you do to get there without anybody being able to trace your steps afterwards? You can delete all your cookies, have your history wiped away and use a tool like CCleaner. Or you can take a shortcut and use a service like Stealth instead.
Stealth is an anonymous search engine. It allows you to find any site that crosses your mind without compromising your privacy in any way. Unlike other search engines, when you click on a link produced by Stealth your browser and computer information is kept private; it isn't sent to the site you're visiting. And information like your interests, your family circumstances, your political leanings and your medical conditions (which other search engines can quickly figure out, use and store) is also kept in your possession only. No marketers, no government officials and no hackers can access it.
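Stealth doesn't document its internals, but the basic idea behind any anonymizing search proxy is simple: forward the query while stripping the request headers that identify the user's browser, language, session, and the page they came from. A minimal sketch of that filtering step (hypothetical, just to show the mechanism) could be:

```python
# Illustrative sketch only -- Stealth's actual implementation is not
# published. An anonymizing proxy forwards a search request after
# removing the headers that identify the user.

IDENTIFYING_HEADERS = {"user-agent", "referer", "cookie", "accept-language"}

def anonymize_headers(headers):
    """Return a copy of the request headers with identifying ones removed."""
    return {k: v for k, v in headers.items()
            if k.lower() not in IDENTIFYING_HEADERS}

browser_request = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1)",
    "Referer": "http://example.com/private-page",
    "Cookie": "session=abc123",
    "Accept": "text/html",
}
print(anonymize_headers(browser_request))  # -> {'Accept': 'text/html'}
```

Of course, a real service also has to avoid logging queries server-side; header stripping only covers what the destination site would otherwise see.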


Geeky stuff...
CircuitBee: Upload & Share Electronic Circuit Diagrams
Similar tool: DZ863
