Perhaps it’s not cyber war, but I find it interesting that bad things happen after missile tests.
https://www.databreaches.net/n-korean-internet-downed-by-suspected-cyber-attacks-researchers/
N.Korean internet downed by suspected cyber attacks -researchers
Josh Smith reports:
North Korea’s internet appears to have been hit by a second wave of outages in as many weeks, possibly caused by a distributed denial-of-service (DDoS) attack, researchers said on Wednesday.
The latest incident took place for about six hours on Wednesday morning local time, and came a day after North Korea conducted its fifth missile test this month.
Junade Ali, a cybersecurity researcher in Britain who monitors a range of different North Korean web and email servers, said that at the height of the apparent attack, all traffic to and from North Korea was taken down.
Read more at Reuters.
Anyway, the questions are useful.
FIVE BURNING QUESTIONS (AND ZERO PREDICTIONS) FOR THE U.S. STATE PRIVACY LANDSCAPE IN 2022
Entering 2022, the United States remains one of the only major economic powers that lacks a comprehensive, national framework governing the collection and use of consumer data throughout the economy. An ongoing impasse in federal efforts to advance privacy legislation has created a vacuum that state lawmakers, seeking to secure privacy rights and protections for their constituents, are actively working to fill.
Last year we saw scores of comprehensive privacy bills introduced in dozens of states, though when the dust settled, only Virginia and Colorado had joined California in successfully enacting new privacy regimes. Now, at the outset of a new legislative calendar, many state legislatures are positioned to make progress on privacy legislation. While stakeholders are eager to learn which (if any) states will push new laws over the finish line, it remains too early in the lawmaking cycle to make such predictions with confidence. So instead, this post explores five key questions about the state privacy landscape that will determine whether 2022 proves to be a pivotal year for the protection of consumer data in the United States.
Perhaps what we need is a statement from an organization’s Privacy Officer (or Inspector General) detailing the privacy risks? That may force them to actually look at the risks… Maybe even catch vendors who are lying to them.
https://gizmodo.com/how-id-me-irs-face-recognition-works-1848429342
The IRS Needs to Stop Using ID.me's Face Recognition, Privacy Experts Warn
Privacy groups are demanding transparency following news that ID.me—the biometric identity verification system used by the IRS and over 27 states—has failed to be entirely transparent in how its facial recognition technology works.
In a LinkedIn post published on Wednesday, ID.me founder and CEO Blake Hall said the company verifies new enrolling users’ selfies against a database of faces in an effort to minimize identity theft. That runs counter to the more privacy-preserving ways ID.me has pitched its biometric products in the past and has drawn scrutiny from advocates who argue that members of the public compelled to use ID.me for basic government tasks have been given unclear information about how it works.
On the company’s website and in white papers shared with Gizmodo, ID.me suggests its services rely on 1:1 face match systems that compare a user’s biometrics to a single document. That’s opposed to so-called 1:many facial recognition systems (the kind deployed by the likes of now-notorious firms like Clearview AI) that compare users to a database of (many) faces.
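The 1:1 versus 1:many distinction is easier to see in code. A minimal sketch of the two modes, where the embedding function, similarity measure, and threshold are all simplified placeholders and not ID.me's actual system:

```python
# Hypothetical illustration of 1:1 face verification vs. 1:many identification.
# The "embedding" here is just a normalized vector; a real system would use a
# trained face-embedding model. Names and thresholds are invented for the sketch.
import math

def embed(face_pixels):
    # Stand-in for a face-embedding model: normalize the input vector.
    norm = math.sqrt(sum(p * p for p in face_pixels)) or 1.0
    return [p / norm for p in face_pixels]

def similarity(a, b):
    # Cosine similarity of two already-normalized embeddings.
    return sum(x * y for x, y in zip(a, b))

def verify_1_to_1(selfie, document_photo, threshold=0.9):
    """1:1 match: is this selfie the same person as ONE reference photo?"""
    return similarity(embed(selfie), embed(document_photo)) >= threshold

def identify_1_to_many(selfie, gallery, threshold=0.9):
    """1:many search: compare the selfie against an entire database of faces."""
    probe = embed(selfie)
    return [name for name, ref in gallery.items()
            if similarity(probe, embed(ref)) >= threshold]
```

The privacy difference falls out of the signatures: `verify_1_to_1` only ever touches the single document the user supplied, while `identify_1_to_many` requires the operator to maintain and search a gallery of other people's faces.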
Perspective.
Personal identifying information for 1.5 billion users was stolen in 2021, but from where?
TechRepublic: “It was a big year for cybercriminals, who made off with personal identifying information (PII) belonging to somewhere in the neighborhood of 1.5 billion users in 2021, according to a report from threat intelligence company Black Kite. Black Kite looked at 81 third-party breaches that accounted for over 200 public disclosures, and its top findings are unsurprising for anyone who lived through the past year: Ransomware attacks were the most common, healthcare providers were the most popular target, and attackers mostly exploited software vulnerabilities to accomplish their goals. Bob Maley, chief security officer at Black Kite, said that the trends it identified in the report show that threat actors, like many companies, are becoming more agile and capable of launching quick, devastating attacks. “[Increased attacker agility] is not just a change from 2021, but an overall message. Attack methods are becoming more clever, more detailed, with flexibility and dexterity. If agile attack methods are improving, our response must match, if not counter their growth,” Maley said in the report…”
Since we are not going to get a federal privacy law…
Consumer Reports and EPIC release paper calling on the Federal Trade Commission to pursue a privacy rulemaking
Consumer Reports and the Electronic Privacy Information Center (EPIC) today released a white paper that provides a detailed roadmap for how the Federal Trade Commission (FTC) should issue privacy rules under its unfair practices authority.
Justin Brookman, director of technology policy at Consumer Reports, said, “We have been waiting decades for Congress to provide baseline privacy protections over our data. Given the continued erosion of consumer privacy, the FTC should press forward in crafting rules that prohibit by default unnecessary data collection, use, and disclosure.”
Alan Butler, executive director and president of EPIC, said, “The Federal Trade Commission has the authority and the ability to make the internet safer and more private for everyday people. For too long the data practices of companies online have been dictated by large and powerful corporations, and users have been subject to invasive surveillance and dangerous profiling. It is time for rules that put users first and end these invasive and unfair business practices.”
The paper urges the FTC to establish a Data Minimization Rule to prohibit all secondary data uses with limited exceptions, ensuring that people can safely use apps and online services without having to take additional action. It also lays out two additional options to consider should the FTC decline to prohibit all secondary uses: prohibit specific secondary data uses, such as behavioral advertising or the use of sensitive data; or mandate a right to opt out of secondary data use, including through global opt-out controls and databases.
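The “global opt-out controls” the paper mentions already have one concrete web mechanism: the Global Privacy Control signal, which participating browsers send as a `Sec-GPC: 1` request header. A minimal server-side sketch of honoring that signal (the function and policy logic are hypothetical, not part of any proposed FTC rule):

```python
def secondary_use_allowed(request_headers, user_opted_in=False):
    """Treat a Sec-GPC: 1 header as a valid opt-out of secondary data use.

    Hypothetical policy check for illustration only: a real service would
    wire this into its consent and data-processing pipeline.
    """
    gpc = request_headers.get("Sec-GPC", "").strip()
    if gpc == "1":
        return False          # global opt-out signal takes precedence
    return user_opted_in      # otherwise fall back to the user's own setting
```

The point of a global control is visible in the precedence order: a single browser-level signal overrides per-site settings, so users do not have to opt out service by service.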
Additionally, the paper encourages the FTC to adopt data transparency obligations for primary use of data; civil rights protections over discriminatory data processing; nondiscrimination rules, so that users cannot be charged for making privacy choices; data security obligations; access, portability, correction, and deletion rights; and to prohibit the use of dark patterns with respect to data processing.
As outlined in the paper, the FTC has wide authority to issue prescriptive rules in order to forestall business practices that can cause consumer injury. With respect to judicial interpretation, the courts generally give broad deference to expert agencies’ interpretation of their substantive statutes, and these privacy regulations are likely to withstand First Amendment scrutiny.
The two groups submitted the paper to the FTC in support of the privacy rulemaking petition from Accountable Tech, which calls on the FTC to prohibit surveillance advertising under its authority to regulate unfair competition in the marketplace. Last year, CR and EPIC joined over 40 groups in calling on the FTC to begin a privacy rulemaking.
The groups also support Congressional efforts to provide $500 million to the FTC over ten years to fund an office focused on policing privacy abuses and other data violations. New funding will be crucial in enabling the FTC to meet its responsibilities to protect consumer privacy, including pursuing a privacy rulemaking. CR and EPIC, along with dozens of other groups, recently called on Congress to adequately fund the FTC, and urged Congress to support the provision, currently in the Build Back Better Act, that gives the FTC civil penalty authority for first-time violations.
DOWNLOAD: How the FTC Can Mandate Data Minimization Through a Section 5 Unfairness Rulemaking
Source: advocacy.consumerreports.org.
Not all new tech needs new law. (What percentage of the cost of a self-driving car will be insurance?)
Who’s to blame for self-driving vehicle accidents? UK says it’s on automakers
… the Law Commission of England and Wales and the Scottish Law Commission have completed a review of the Automated and Electric Vehicles Act 2018, recommending a new system of legal accountability.
Drivers shouldn’t be held responsible for accidents
As a matter of fact, the person sitting in the driver’s seat while a self-driving feature is engaged will no longer be a “driver.” Instead, they’ll be considered a “user-in-charge.”
The users-in-charge can’t be prosecuted for offences which arise directly from the driving task: exceeding the speed limit, running a red light, causing an accident, etc.
Perspective.
https://www.fool.com/investing/2022/01/26/ebay-sees-extra-risks-around-artificial-intelligen/
eBay Sees Extra Risks Around Artificial Intelligence Software
An eBay executive who focuses on artificial intelligence recently sat down for an interview during which he described some of the unique challenges facing IT managers who use AI software.
In this video from "The Virtual Opportunities Show," recorded on Jan. 18, Fool.com analyst Asit Sharma and Fool.com contributor Demitri Kalogeropoulos discuss the extra risks for tech companies as they use more AI across their systems.