Monday, September 25, 2017

Why can’t it happen here?
It was not the first time Muhammad Rabbani had problems when returning to the United Kingdom from travels overseas. But on this occasion something was different — he was arrested, handcuffed, and hauled through London’s largest airport, then put into the back of a waiting police van.
… What is particularly unusual about Rabbani’s case is that he had been stopped on many prior occasions — dating back to 2008 — and never before had police arrested him when he declined to turn over his phone or laptop passwords. He is already well known to the authorities due to his employment with Cage, and he has never been accused of involvement in any sort of terrorism plot.
… While the existence of Schedule 7 is widely known in the U.K., the government has kept secret some significant details about its function.
Those who are examined under the law will usually be searched and questioned by officers. Like Rabbani, they may also have any cellphones or laptops they are carrying inspected or confiscated.
What people who have gone through this process do not know, however, is that police may also have covertly downloaded the contents of their phones and sent copies to the British eavesdropping agency Government Communications Headquarters, or GCHQ.
Every month the agency was receiving a copy of phone data that had been “downloaded from people stopped at U.K. ports (i.e. sea, air and rail),” according to a classified GCHQ document obtained by The Intercept from Edward Snowden.




As long as you think it through…
The Ethics of Running a Data Breach Search Service
No matter how much anyone tries to sugarcoat it, a service like Have I Been Pwned (HIBP), which deals with billions of records hacked out of other people's systems, is always going to sit in a grey area. There are degrees, of course; at one end of the spectrum you have the likes of Microsoft and Amazon using data breaches to better protect their customers' accounts. At the other end, there are services like the now-defunct LeakedSource, which happily sold our personal data (including mine) to anyone willing to pay a few bucks for it.




Even systems that seem quite complex may be simple to break. Just saying.
Distrustful U.S. allies force spy agency to back down in encryption fight
An international group of cryptography experts has forced the U.S. National Security Agency to back down over two data encryption techniques it wanted set as global industry standards, reflecting deep mistrust among close U.S. allies.
In interviews and emails seen by Reuters, academic and industry experts from countries including Germany, Japan and Israel worried that the U.S. electronic spy agency was pushing the new techniques not because they were good encryption tools, but because it knew how to break them.
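According to contemporaneous reporting, the algorithms at issue in the ISO process were Simon and Speck, a pair of NSA-designed "lightweight" block ciphers. As a rough illustration of what such a cipher looks like, here is a minimal, unvetted Python sketch assuming the published Speck128/128 parameters (64-bit words, rotation amounts 8 and 3, 32 rounds, two key words); word and byte ordering are simplified and it is not checked against official test vectors. It is meant only to show how compact these add-rotate-xor designs are, which is exactly why the experts quoted above wanted more independent analysis before trusting them as standards.

```python
# Illustrative sketch of a Speck-style block cipher (assumed Speck128/128
# parameters: 64-bit words, rotations 8 and 3, 32 rounds, 2 key words).
# For reading along only; not a reference or production implementation.

MASK = (1 << 64) - 1          # arithmetic is modulo 2^64
ALPHA, BETA, ROUNDS = 8, 3, 32

def ror(x, r):
    """Rotate a 64-bit word right by r bits."""
    return ((x >> r) | (x << (64 - r))) & MASK

def rol(x, r):
    """Rotate a 64-bit word left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK

def expand_key(k0, l0):
    """Derive the per-round keys by running the round function on the key words."""
    keys = [k0]
    k, l = k0, l0
    for i in range(ROUNDS - 1):
        l = ((ror(l, ALPHA) + k) & MASK) ^ i
        k = rol(k, BETA) ^ l
        keys.append(k)
    return keys

def encrypt(x, y, round_keys):
    """Apply the add-rotate-xor round to the word pair (x, y) once per round key."""
    for rk in round_keys:
        x = ((ror(x, ALPHA) + y) & MASK) ^ rk
        y = rol(y, BETA) ^ x
    return x, y

def decrypt(x, y, round_keys):
    """Invert each round in reverse order."""
    for rk in reversed(round_keys):
        y = ror(y ^ x, BETA)
        x = rol(((x ^ rk) - y) & MASK, ALPHA)
    return x, y

# Round-trip demo with arbitrary sample values: decrypt(encrypt(m)) == m.
rks = expand_key(0x0706050403020100, 0x0F0E0D0C0B0A0908)
ct = encrypt(0x6C61766975716520, 0x7469206564616D20, rks)
assert decrypt(*ct, rks) == (0x6C61766975716520, 0x7469206564616D20)
```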




As in ‘Standing?’
The Federal Trade Commission will host a workshop on informational injury on December 12, 2017. The FTC’s three main goals for hosting the workshop are to:
  1. “Better identify the qualitatively different types of injury to consumers and businesses from privacy and data security incidents;”
  2. “Explore frameworks for how the FTC might approach quantitatively measuring such injuries and estimate the risk of their occurrence;” and
  3. “Better understand how consumers and businesses weigh these injuries and risks when evaluating the tradeoffs to sharing, collecting, storing and using information.”
FTC Acting Chairwoman Maureen Ohlhausen announced the workshop during her speech [PDF] to the Federal Communications Bar Association, titled “Painting the Privacy Landscape: Informational Injury in FTC Privacy and Data Security Cases.” The speech focused on the five different types of consumer informational injury alleged in the FTC’s body of privacy and data security case law: (1) deception injury or subverting consumer choice; (2) financial injury; (3) health or safety injury; (4) unwarranted intrusion injury; and (5) reputational injury.
Acting Chairwoman Ohlhausen noted that the FTC initiates many of its cases under the agency’s deception authority, stating that “from an injury standpoint, a company’s false promise to provide certain privacy or data security protections harms consumers like any false material promise about a product.” She further highlighted that the most commonly alleged injuries in the FTC’s body of privacy and data security case law are financial injury and health and safety injury. She also emphasized that the type of injury is not dispositive in the FTC’s decision whether to bring a privacy or data security case; the FTC also evaluates the strength of the evidence linked to the consumer injury, the magnitude of the injury (both to individuals and to groups of consumers), and the likelihood of future consumer injury.
In closing her speech, Acting Chairwoman Ohlhausen rhetorically posed three questions: (1) whether the list of consumer informational injuries is representative; (2) whether these or other informational injuries require government intervention; and (3) how the list maps to the FTC’s statutory deception and unfairness standards. She plans to address these issues in depth at the December 12 workshop.
SOURCE: FTC




Too late for summer reading?
‘Life 3.0’ gives you a user’s guide for superintelligent AI systems to come
Do we need to be concerned about the rapid rise of artificial intelligence? Some people say there’s nothing to worry about, while others warn that a Terminator-level nightmare is dead ahead.
MIT physicist Max Tegmark says both sides of that argument are exaggerations.
In his newly published book, “Life 3.0: Being Human in the Age of Artificial Intelligence,” Tegmark lays out a case for what he calls “mindful optimism” about beneficial AI — artificial intelligence that will make life dramatically better for humans rather than going off in unintended directions.
Tegmark, who’s also the co-founder and president of the Future of Life Institute, says AI won’t be beneficial unless it incorporates safety measures yet to be developed.

