Friday, July 05, 2019

For both my Security classes. (I do like good bad examples) If facial recognition is used to say, “Hey, check this out!” I’d be happy. If it says, “Found target, launching weapons,” I’m a bit more concerned.
Biased and wrong? Facial recognition tech in the dock
The Californian city of San Francisco recently banned the use of FR by transport and law enforcement agencies in an acknowledgement of its imperfections and threats to civil liberties. But other cities in the US, and other countries around the world, are trialling the technology.
In the UK, for example, police forces in South Wales, London, Manchester and Leicester have been testing the tech to the consternation of civil liberties organisations such as Liberty and Big Brother Watch, both concerned by the number of false matches the systems made.
Just this week, academics at the University of Essex concluded that matches in the London Metropolitan police trials were wrong 80% of the time, potentially leading to serious miscarriages of justice and infringements of citizens' right to privacy.

A bit of Computer Security history.
Getting up to speed with AI and Cybersecurity
Many people are unaware that the first computer virus predates the public internet.
In 1971 Bob Thomas, an American IT academic, wrote Creeper, the first computer program that could migrate across networks. It would travel between terminals on the ARPANET printing the message “I’m the creeper, catch me if you can”. Creeper was made self-replicating by fellow academic and email inventor Ray Tomlinson, creating the first documented computer virus.
In order to contain Creeper, Tomlinson wrote Reaper, a program that would chase Creeper across the network and erase it – creating the world’s first antivirus cybersecurity solution.

Something for my Computer Security discussions.

A “fake news” law? Think Russia will comply?
Will California’s New Bot Law Strengthen Democracy?
The New Yorker – “When you ask experts how bots influence politics—that is, what specifically these bits of computer code that purport to be human can accomplish during an election—they will give you a list: bots can smear the opposition through personal attacks; they can exaggerate voters’ fears and anger by repeating short simple slogans; they can overstate popularity; they can derail conversations and draw attention to symbolic and ultimately meaningless ideas; they can spread false narratives. In other words, they are an especially useful tool, considering how politics is played today.
On July 1st, California became the first state in the nation to try to reduce the power of bots by requiring that they reveal their “artificial identity” when they are used to sell a product or influence a voter. Violators could face fines under state statutes related to unfair competition. Just as pharmaceutical companies must disclose that the happy people who say a new drug has miraculously improved their lives are paid actors, bots in California—or rather, the people who deploy them—will have to level with their audience… By attempting to regulate a technology that thrives on social networks, the state will be testing society’s resolve to get our (virtual) house in order after more than two decades of a runaway Internet…”

Another good bad example. What is the ‘balance of power’ when companies have “Big Data”?
Public Management of Big Data: Historical Lessons from the 1940s
By Margo Anderson – Distinguished Professor, History and Urban Studies, University of Wisconsin-Milwaukee.
“At its core, public-sector use of big data heightens concerns about the balance of power between government and the individual. Once information about citizens is compiled for a defined purpose, the temptation to use it for other purposes can be considerable, especially in times of national emergency. One of the most shameful instances of the government misusing its own data dates to the Second World War. Census data collected under strict guarantees of confidentiality was used to identify neighborhoods where Japanese-Americans lived so they could be detained in internment camps for the duration of the war.” – Executive Office of the President, Big Data: Seizing Opportunities, Preserving Values, May 2014

(Related) Will the lawyer with the “bigger Data” always win?
Methods of Data Research for Law
Custers, Bart, Methods of Data Research for Law (October 28, 2018). Custers B.H.M. (2018), Methods of data research for law. In: Mak V., Tjong Tjin Tai E., Berlee A. (Eds.) Research Handbook in Data Science and Law. Research Handbooks in Information Law Cheltenham: Edward Elgar. 355-377. Available at SSRN:
“Data science and big data offer many opportunities for researchers, not only in the domain of data science and related sciences, but also for researchers in many other disciplines. The fact that data science and big data are playing an increasingly important role in so many research areas raises the question whether this also applies to the legal domain. Do data science and big data also offer methods of data research for law? As will be shown in this chapter, the answer to this question is positive: yes, there are many methods and applications that may be also useful for the legal domain. This answer will be provided by discussing these methods of data research for law in this chapter. As such, this chapter provides an overview of these methods.”

Crisis or not?
Opinion: Legislative Fix Needed to Keep Internet Applications Free in California
No matter where you go in California, you’re likely to see someone on their smartphone, tablet or computer using an app or other online service to look something up, watch a video, send an email, check social media or the weather, or any number of the dozens of things people do online every day.
Today, most of these online activities are free, meaning whether you are a college student researching a paper, a single mom looking for a job, or a small employer sending an email to your staff, the internet provides everyone from all walks of life equal and free access to information and services critical to our everyday lives. But these free services that have helped level the socio-economic playing field are at risk unless a policy fix is passed in Sacramento this year.
In 2018, the State Legislature hastily passed a sweeping measure known as the California Consumer Privacy Act (CCPA). This law was intended to give consumers more understanding and control of their online personal data and information, something we all support.
Many flaws did come to light. One of the most significant has to do with language in the CCPA that hinders tailored online advertising by prohibiting the sharing of technical information necessary to make the ads work. These ads are a major reason why many online services are free now, and unless fixed this year, this flaw in the CCPA could result in new costs for online services we take for granted and get for free today.
A policy fix would clarify that when a consumer opts out of the “sale” of their personal information, it does not restrict the ability of companies to continue to market targeted ads to that same consumer, as long as those ads rely on sharing technical information only, not personally identifiable information.

(Related) Is this “personal information” or merely technical information?
‘Fingerprinting’ to Track Us Online Is on the Rise. Here’s What to Do.
The New York Times – Advertisers are increasingly turning to an invisible method that pulls together information about your device to pinpoint your identity. “Fingerprinting involves looking at the many characteristics of your mobile device or computer, like the screen resolution, operating system and model, and triangulating this information to pinpoint and follow you as you browse the web and use apps. Once enough device characteristics are known, the theory goes, the data can be assembled into a profile that helps identify you the way a fingerprint would.
And here’s the bad news: The technique happens invisibly in the background in apps and websites. That makes it tougher to detect and combat than its predecessor, the web cookie, which was a tracker stored on our devices. The solutions to blocking fingerprinting are also limited…”
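The core idea – many individually common characteristics becoming unique in combination – is easy to sketch. The snippet below is a hypothetical illustration, not code from any real tracker; the attribute names and values are assumptions chosen for the example.

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Combine device characteristics into a single stable identifier.

    Each attribute on its own (screen size, OS, model) is shared by
    many users, but the combination is often distinctive enough to
    follow one device across sites without storing a cookie.
    """
    # Sort keys so the same device always produces the same hash,
    # regardless of the order in which attributes were collected.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical profile: the more attributes collected, the more
# distinctive the resulting fingerprint becomes.
profile = {
    "screen": "2560x1440",
    "os": "macOS 10.14.5",
    "model": "MacBookPro15,1",
    "timezone": "America/Chicago",
    "language": "en-US",
}
print(device_fingerprint(profile))
```

Note that nothing is stored on the device itself – the identifier is recomputed from observable characteristics each visit, which is why fingerprinting is harder to block than the cookie it replaces.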

I had hoped to stay away from the whole blockchain / Bitcoin kerfuffle.
Facebook’s Libra Cryptocurrency Could Have Profound Implications for Personal Privacy
In the never-ending search for new revenue streams, social media giant Facebook is now looking to launch its very own cryptocurrency, known as Libra. While the stated goal of Facebook’s Libra cryptocurrency is to bring financial services to the world’s 1.7 billion unbanked population, and to make it easier and more convenient than ever before to send and receive money around the world, data privacy experts, politicians, regulators, central bankers and government officials are already warning that the latest innovation from Facebook may result in a confusing headache of privacy, financial, political, and socio-economic issues.
