Wednesday, July 24, 2024

Once again an assumption has unintended consequences.

https://www.theregister.com/2024/07/24/crowdstrike_preliminary_incident_report/

CrowdStrike blames a test software bug for that giant global mess it made

Whatever the Validator does or is supposed to do, it did not prevent the release of the July 19 Template Instance, despite it being a dud. That happened because CrowdStrike assumed that the tests passed by the IPC Template Type delivered in March, and by subsequent related IPC Template Instances, meant the July 19 release would be OK.

History tells us that was a very bad assumption. It "resulted in an out-of-bounds memory read triggering an exception."

"This unexpected exception could not be gracefully handled, resulting in a Windows operating system crash."

On around 8.5 million machines.
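To make the failure class concrete, here is a minimal hypothetical sketch in C. None of these names or numbers are CrowdStrike's, and this is not their code: it simply shows what happens when a parser trusts a field count shipped inside the content file itself, on the assumption that earlier instances of the same template type passed testing, so a dud instance whose count exceeds its buffer walks straight past the end of the array.

```c
/*
 * Hypothetical sketch of the failure class described above. This is NOT
 * CrowdStrike's code; every name and number here is invented.
 *
 * The parser trusts a field count carried inside the new content file,
 * because earlier instances of the same template type passed testing,
 * so it never re-checks the count against the buffer it actually has.
 */
#include <stdint.h>
#include <stdio.h>

#define MAX_FIELDS 8  /* capacity the earlier, tested instances stayed within */

struct template_instance {
    uint32_t field_count;        /* supplied by the content update itself */
    uint32_t fields[MAX_FIELDS];
};

static uint32_t read_field(const struct template_instance *ti, uint32_t i)
{
    /* Missing guard: if (i >= MAX_FIELDS) the instance should be rejected. */
    return ti->fields[i];        /* i beyond MAX_FIELDS reads out of bounds */
}

int main(void)
{
    /* A "dud" instance: its count exceeds the array it ships with. */
    struct template_instance dud = { .field_count = 12, .fields = {0} };
    uint32_t sum = 0;

    for (uint32_t i = 0; i < dud.field_count; i++)
        sum += read_field(&dud, i);  /* undefined behavior past fields[7] */

    /* In user space this "merely" reads garbage; in a kernel-mode driver
     * the same out-of-bounds access raises an exception that cannot be
     * gracefully handled, and the operating system crashes. */
    printf("sum = %u\n", sum);
    return 0;
}
```

In user space an out-of-bounds read like this is garden-variety undefined behavior; in a kernel-mode driver the resulting fault cannot be caught, which is why the quoted exception took the whole OS down with it.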





Why would anyone want to open this can of worms?

https://therecord.media/ftc-surveillance-pricing-inquiry

FTC launches probe into how companies use data to tailor what each customer pays

The Federal Trade Commission (FTC) announced Tuesday that it has launched an inquiry into how companies surveil consumers to set individualized pricing for the same products and services based on private data from their financial profiles.

The profiles are built from consumer demographics as well as web browsing, credit and geolocation histories, and sometimes even harness real-time data, to determine what the agency refers to as “surveillance pricing.”

Eight companies — including Fortune 500 firms Mastercard, JPMorgan Chase and Accenture as well as the consulting firm McKinsey and Co. — have been ordered to explain how they gather and use consumers’ “characteristics and behavior” to set pricing, potentially undermining consumer privacy and marketplace competition, according to an FTC announcement.





When educators fail to learn…

https://pogowasright.org/uk-essex-school-reprimanded-after-using-facial-recognition-technology-for-canteen-payments/

UK: Essex school reprimanded after using facial recognition technology for canteen payments

From the Information Commissioner’s Office:

We have issued a reprimand to a school that broke the law when it introduced facial recognition technology (FRT).
Chelmer Valley High School, in Chelmsford, Essex, first started using the technology in March 2023 to take cashless canteen payments from students.
FRT processes biometric data to uniquely identify people and is likely to result in high data protection risks. To use it legally and responsibly, organisations must have a data protection impact assessment (DPIA) in place. This is to identify and manage the higher risks that may arise from processing sensitive data.
Chelmer Valley High School, which has around 1,200 pupils aged 11-18, failed to carry out a DPIA before starting to use the FRT. This meant no prior assessment was made of the risks to the children’s information. The school had not properly obtained clear permission to process the students’ biometric information and the students were not given the opportunity to decide whether they did or didn’t want it used in this way.
[...]
In March 2023, a letter was sent to parents with a slip for them to return if they did not want their child to participate in the FRT. Affirmative ‘opt-in’ consent wasn’t sought at this time, meaning until November 2023 the school was wrongly relying on assumed consent. The law does not deem ‘opt out’ a valid form of consent and requires explicit permission. Our reprimand also notes most students were old enough to provide their own consent. Therefore, parental opt-out deprived students of the ability to exercise their rights and freedoms.
Ms Currie added:
“A DPIA is required by law – it’s not a tick-box exercise. It’s a vital tool that protects the rights of users, provides accountability and encourages organisations to think about data protection at the start of a project.”
We have provided Chelmer Valley High School with recommendations for the future.


