Friday, May 22, 2020


A really simple question.
Who Owns Privacy?
With GDPR, CCPA, and a US federal bill being actively considered by Congress, we’ve reached a regulatory ‘point of no return’ with privacy compliance. GDPR alone has generated over 30 large fines worth more than 400 million euros in less than 24 months…. And we’ve yet to observe the initial cost of non-compliance with CCPA.
Regulation aside, we’re seeing a dramatic increase in awareness among customers, employees, and ‘data subjects’ about how their information is used in the data economy. This rising awareness has spurred demands for more transparency and control over data access, deletion, and rectification — with our recent DataGrail study finding that 65% of participants desire to know what information is collected on them.
As of today, most of the means for organizations to deliver on privacy expectations are unsustainable. Another DataGrail survey from 2019 found that the average company involved 26 different stakeholders across almost as many functional groups to deliver an access request. The pervasiveness of personal data across a modern business — from marketing, to customer support, to finance, to business intelligence — has forced a sprawl in responsibility.




The GDPR doesn’t apply, except when it does.
Grandmother ordered to delete Facebook photos under GDPR
It ended up in court after a falling-out between the woman and her daughter.
The judge ruled the matter was within the scope of the EU's General Data Protection Regulation (GDPR).
The case went to court after the woman refused to delete photographs of her grandchildren which she had posted on social media.
The mother of the children had asked several times for the pictures to be deleted.
The GDPR does not apply to the "purely personal" or "household" processing of data.
However, that exemption did not apply because posting photographs on social media made them available to a wider audience, the ruling said.
"With Facebook, it cannot be ruled out that placed photos may be distributed and may end up in the hands of third parties," it said.




Interesting. Like twisting the knob to see if a door is locked?
Just turning your phone on qualifies as searching it, court rules
Smartphones are a rich data trove not only for marketers but also for law enforcement. Police and federal investigators love to get their hands on all that juicy personal information during an investigation. But thanks to the Fourth Amendment of the US Constitution and all the case law built upon it, police generally need a warrant to search your phone—and that includes just looking at the lock screen, a judge has ruled (PDF).
Usually when the topic of a phone search comes up in court, the question has to do with unlocking. Generally, courts have held that law enforcement can compel you to use your body, such as your fingerprint (or your face), to unlock a phone but that they cannot compel you to share knowledge, such as a PIN. In this recent case, however, the FBI did not unlock the phone. Instead, they only looked at the phone's lock screen for evidence.
In his ruling, the judge determined that the police looking at the phone at the time of the arrest and the FBI looking at it again after the fact are two separate issues. Police are allowed to conduct searches without a search warrant under special circumstances, Coughenour wrote, and looking at the phone's lock screen may have been permissible as it "took place either incident to a lawful arrest or as part of the police's efforts to inventory the personal effects" of the person arrested. Coughenour was unable to determine how, specifically, the police acted, and he ordered clarification to see if their search of the phone fell within those boundaries.
But where the police actions were unclear, the FBI's were both crystal clear and counter to the defendant's Fourth Amendment rights, Coughenour ruled. "Here, the FBI physically intruded on Mr. Sam's personal effect when the FBI powered on his phone to take a picture of the phone's lock screen." That qualifies as a "search" under the terms of the Fourth Amendment, he found, and since the FBI did not have a warrant for that search, it was unconstitutional.
Attorneys for the government argued that Sam should have had no expectation of privacy on his lock screen—that is, after all, what everyone who isn't you is meant to see when they try to access the phone. Instead of determining whether the lock screen is private or not, though, Coughenour found that it doesn't matter. "When the Government gains evidence by physically intruding on a constitutionally protected area—as the FBI did here—it is 'unnecessary to consider' whether the government also violated the defendant’s reasonable expectation of privacy," he wrote.




Sounds a lot like phrenology to me.
Artificial intelligence can make personality judgments based on photographs
Russian researchers from HSE University and Open University for the Humanities and Economics have demonstrated that artificial intelligence is able to infer people's personality from 'selfie' photographs better than human raters do. Conscientiousness emerged as more easily recognizable than the other four traits. Personality predictions based on female faces appeared to be more reliable than those for male faces. The technology can be used to find the 'best matches' in customer service, dating or online tutoring.
The article, "Assessing the Big Five personality traits using real-life static facial images," will be published on May 22 in Scientific Reports.
The average effect size of r = .24 indicates that AI can make a correct guess about the relative standing of two randomly chosen individuals on a personality dimension in 58% of cases as opposed to the 50% expected by chance.
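For context on how an effect size of r = .24 maps onto "58% of cases": assuming the authors used the standard bivariate-normal conversion from a correlation to a common language effect size (CL = 0.5 + arcsin(r)/π, per Dunlap 1994; the excerpt doesn't say which conversion they applied, and the helper name below is just illustrative), the arithmetic works out as in this minimal Python sketch:

import math

def common_language_effect_size(r: float) -> float:
    # Probability that, of two randomly chosen people, the one the model
    # rates higher on a trait really is higher, given a bivariate-normal
    # relationship between predicted and true trait scores.
    return 0.5 + math.asin(r) / math.pi

print(round(common_language_effect_size(0.24) * 100, 1))  # ~57.7, i.e. roughly 58%

In other words, r = .24 is a modest signal: the model orders a random pair correctly about 58 times out of 100, versus 50 by coin flip.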




Worth considering.
A Buyer’s Guide to AI and Machine Learning
One limitation of some AI or ML products is that for certain applications of the technology, there is no source of absolute truth against which to judge the accuracy of the output. For example, neither humans nor machines know how to produce the perfect set of end-to-end tests for any given application. This is the test oracle problem: there is no objective standard of truth. No one wants to introduce this kind of uncertainty into their sales process. Yet, our buyers deserve well-informed answers about our products.
Regardless of how you plan to use a product, it’s important to ask the right questions to understand the product and build resiliency around its accuracy levels. The next time a seller tells you “AI is doing this,” you can ask the following:


(Related)
Six things CCOs need to know about ICO’s AI guidance
The 122-page publication, called "Explaining decisions made with AI" and written in conjunction with The Alan Turing Institute, the U.K.'s national center for AI, hopes to ensure organizations can be transparent about how AI-generated decisions are made, as well as ensure clear accountability about who can be held responsible for them so that affected individuals can ask for an explanation.


