Friday, August 26, 2022

After cashing our TABOR checks, should we look into a law like this to keep the money flowing?

https://www.nbcchicago.com/news/local/heres-a-look-at-all-the-settlements-stemming-from-illinois-biometric-privacy-act/2922736/

Here's a Look at Settlements Stemming From Illinois' Biometric Privacy Act

Illinois residents are likely familiar with the state privacy law that was behind checks worth hundreds of dollars for many Facebook users.

Now, that same law is behind a number of other potential settlements, and more checks could soon be arriving.

Earlier this year, more than one million Illinois Facebook users began receiving checks following a $650 million settlement in a class-action suit alleging the company violated residents' rights by collecting and storing digital scans of their faces without permission.

Another lawsuit, which mirrors the one settled with Facebook, claimed Google violated the Illinois Biometric Information Privacy Act by "collecting and storing biometric data of individuals who, while residing in Illinois, appeared in a photograph in the photograph sharing and storage service known as Google Photos, without proper notice and consent."

Most recently, a federal judge in Illinois granted final approval for a $92 million class-action lawsuit settlement between the social media network TikTok and users of the platform, with Illinois residents set to receive the largest share of the payout due to BIPA.

Snapchat, the social media company, has agreed to a $35 million settlement in a class-action lawsuit in Illinois.





When you are near panic, this may look like a solution. After all, everyone knows AI can do anything!

https://www.vice.com/en/article/5d3dw5/the-least-safe-day-rollout-of-gun-detecting-ai-scanners-in-schools-has-been-a-cluster-emails-show

‘The Least Safe Day’: Rollout of Gun Detecting AI Scanners in Schools Has Been a ‘Cluster,’ Emails Show

There is currently no peer-reviewed research showing that AI gun detection is effective at preventing shootings, and Evolv has offered little evidence supporting claims of its system’s effectiveness in meeting these objectives. Schools have also encountered problems with the scanners confusing laptops and other everyday items with guns.

But the documents obtained by Motherboard provide a more detailed look into how Evolv scanners are actually deployed and the problems they actually face. On the ground, the reality of deploying Evolv scanners is very different than marketing materials suggest. Some school administrators are reporting that the scanners have caused “chaos”—failing to detect common handguns at commonly-used sensitivity settings, mistaking everyday school items for deadly weapons, and failing to deliver on the company’s promise of frictionless school security.

“Today was probably the least safe day,” one principal observed the day scanners were deployed at her school, because the machines were triggering false alarms and requiring manual searches on “almost every child as they walked through,” monopolizing the attention of safety officers who would otherwise be monitoring the halls and other entrances.





Will they now come fast and furious?

https://www.insideprivacy.com/ccpa/california-attorney-general-announces-first-ccpa-settlement/

California Attorney General Announces First CCPA Settlement

Today, the California Attorney General announced the first settlement agreement under the California Consumer Privacy Act (“CCPA”). The Attorney General alleged that online retailer Sephora, Inc. failed to disclose to consumers that it was selling their information and failed to process user requests to opt out of sale via user-enabled global privacy controls. The Attorney General also alleged that Sephora did not cure these violations within the cure period.





Perspective.

https://fpf.org/blog/looking-back-to-forge-ahead-challenges-of-developing-an-african-conception-of-privacy/

LOOKING BACK TO FORGE AHEAD: CHALLENGES OF DEVELOPING AN “AFRICAN CONCEPTION” OF PRIVACY

Few things depend on context, like privacy, which strongly hinges on how people within various communities and other social organizations perceive it. While the need for privacy may be universal, the particularities of its social acceptance and articulation differ depending on cultural norms that vary among communities. Whitman succinctly captured the cultural cause of the diverse forms of privacy when he posited that “culture informs greatly the different intuitive sensibilities that cause people to feel that a practice is privacy invasive while others do not feel that way”.





What other labels might Google provide? Home of undercover narc? Law offices: will defend drug dealers?

https://techcrunch.com/2022/08/25/google-search-and-maps-will-now-clearly-label-if-a-healthcare-facility-provides-abortions/

Google Search and Maps will now clearly label if a healthcare facility provides abortions

Google will start adding clear labels to Search and Map listings for healthcare facilities that provide abortions. The change comes in light of the Supreme Court’s decision to strip federal abortion rights. The company said on Thursday that if it has received confirmation that a healthcare facility provides abortions, the label for the center will say “Provides abortions.” In cases where Google doesn’t have that confirmation, the label for relevant searches will say “Might not provide abortions.”





The quest continues. Would you reject an AI Mozart?

https://kotaku.com/ai-art-dall-e-midjourney-stable-diffusion-copyright-1849388060

AI Creating 'Art' Is An Ethical And Copyright Nightmare

If a machine makes art, is it even art? And what does this mean for actual artists?

If you haven’t read or seen anything about the subject, AI art—or at least as it exists in the state we know it today—is, as Ahmed Elgammal writing in American Scientist so neatly puts it, made when “artists write algorithms not to follow a set of rules, but to ‘learn’ a specific aesthetic by analyzing thousands of images. The algorithm then tries to generate new images in adherence to the aesthetics it has learned.”

The worry over young, up-and-coming and part-time artists is one shared by Karla Ortiz, who has worked for Ubisoft, Marvel and HBO. “The technology is not quite there yet in terms of a finalized product”, she tells Kotaku. “No matter how good it looks initially, it still requires professionals to fix the errors the AI generates. It also seems to be legally murky territory, enough to scare many major companies.”

“However, it does yield results that will be ‘good enough’ for some, especially those less careful companies who offer lower wages for creative work. Because the end result is ‘good enough’, I think we could see a lot of loss of entry level and less visible jobs. This would affect not just illustrators, but photographers, graphic designers, models, or pretty much any job that requires visuals. That could all potentially be outsourced to AI.”





Resources!

https://www.bespacific.com/us-government-to-make-all-research-it-funds-open-access-on-publication/

US government to make all research it funds open access on publication

Ars Technica: “Many federal policy changes are well known before they are announced. Hints in speeches, leaks, and early access to reporters at major publications all serve to pave the ground for the eventual confirmation. But on Thursday, the White House Office of Science and Technology Policy (OSTP) dropped a big one that seemed to take everyone by surprise. Starting in 2026, any scientific publication that receives federal funding will need to be openly accessible on the day it’s published. The move has the potential to further shake up the scientific publishing industry, which has already adopted preprint archives, similar mandates from other funding organizations, and greatly expanded access to publications during the pandemic. The change was announced by Alondra Nelson, acting head of the OSTP (a permanent Director is in the process of Senate confirmation). The formal policy is laid out in an accompanying memorandum …”


