A mere $2.78 per credit card (if we don’t pay the lawyers)?
Dena Aubin reports:
Restaurant company Wendy’s has agreed to pay $50 million to resolve a 2016 lawsuit by financial institutions nationwide alleging that the company’s negligence allowed hackers to steal credit and debit card information in a 2015 data breach.
Disclosed in a filing on Wednesday in Pittsburgh federal court, the settlement will be paid to approximately 7,500 banks and credit unions that issued about 18 million credit or debit cards exposed in the data breach. The deal must still be approved by the court.
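The headline arithmetic, using the reported figures (both the settlement amount and the card count are approximations from the filing):

```python
settlement = 50_000_000      # reported settlement, USD
cards_exposed = 18_000_000   # credit/debit cards exposed in the breach
institutions = 7_500         # banks and credit unions in the class

per_card = settlement / cards_exposed
per_institution = settlement / institutions

print(f"per card: ${per_card:.2f}")                  # ≈ $2.78
print(f"per institution: ${per_institution:,.2f}")   # ≈ $6,666.67
```

Spread across the issuers instead of the cards, the settlement is a little under $6,700 per institution — before the lawyers.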
To read the full story, you’ll need an account
on WestlawNext Practitioner Insights.
I try to teach my Computer Security students that
you need to think about everything you do, and ask: What could go
wrong? Unencrypted? No contact information on the disk drive?
Someone asked me today about the lack of W-2
phishing reports or W-2 incidents that we’ve seen so far this year.
I responded that I hadn’t really had time to research W-2 attacks
yet, but a reader, “DLOW,” has now kindly submitted a news story
by Mary Richards of KSL in Utah. The kinds of tax documents involved
in this incident do not contain full Social Security numbers like W-2
forms do, but it’s still a tax document incident:
Forty-two thousand students at Salt Lake Community College are learning that their tax documents got lost.
An email sent to students and obtained by KSL Newsradio explained that a memory drive with tax documents for the students somehow fell out of an envelope on its way from a contracted company to the college.
SLCC spokesman Joy Tlou said that when the college processes these documents that deal with the 1098-T tax form used for getting educational tax credits, the college goes through a third-party vendor and uses a secured cloud server to access the information. That information is then also backed up on a memory drive and sent to the college.
Read more on KSL
and see the FAQ on the incident.
(Related) Easy to program if you ignore the
security requirements.
Julia Ingram and Hannah Knowles report:
Before this week, Stanford students could view the Common Applications and high school transcripts of other students if they first requested to view their own admission documents under the Family Educational Rights and Privacy Act (FERPA).
Accessible documents contained sensitive personal information including, for some students, Social Security numbers. Other obtainable data included students’ ethnicity, legacy status, home address, citizenship status, criminal status, standardized test scores, personal essays and whether they applied for financial aid. Official standardized test score reports were also accessible.
Students’ documents were not searchable by name, but were instead made accessible by changing a numeric ID in a URL.
Read more on The
Stanford Daily.
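What the Daily describes is a textbook insecure direct object reference (IDOR): the server trusts a client-supplied numeric ID and never checks who is asking. A minimal sketch of the missing server-side check (the data and function names here are invented for illustration, not Stanford's actual system):

```python
# Illustrative IDOR fix: authorize the requester against the document's
# owner before serving it, instead of trusting the URL's numeric ID alone.

DOCUMENTS = {
    1001: {"owner": "alice", "body": "alice's admission file"},
    1002: {"owner": "bob",   "body": "bob's admission file"},
}

def fetch_document(doc_id: int, requester: str) -> str:
    doc = DOCUMENTS.get(doc_id)
    if doc is None:
        raise KeyError("no such document")
    # The check that was missing: a valid ID says nothing about
    # who is entitled to read the document behind it.
    if doc["owner"] != requester:
        raise PermissionError("requester is not the document owner")
    return doc["body"]

print(fetch_document(1001, "alice"))    # served: it's her own file
try:
    fetch_document(1002, "alice")       # the "change the ID" trick: denied
except PermissionError as e:
    print("denied:", e)
```

Without that ownership check, walking sequential IDs in the URL enumerates everyone's files — which is exactly what happened.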
(Related) Why I recommend regular reporting of
“who can access what” to managers.
Ugh. DutchNews.nl reports:
Students working for extra cash at Amsterdam’s OLVG hospital group have for years been given complete access to the medical records system, allowing them to read personal information about friends, family and famous people, the Volkskrant said on Friday.
The leak was made public by a philosophy student who made telephone appointments for the hospital. Fellow students recommended digging up ‘juicy details’ in the files while doing boring jobs, she told the paper.
Read more at DutchNews.nl.
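The "who can access what" report I keep recommending can be as simple as inverting the permission table into a per-resource listing for a manager to sign off on. A toy sketch (the users and resources are invented, loosely modeled on the OLVG situation):

```python
from collections import defaultdict

# Hypothetical user -> granted-resources map, e.g. pulled from an IAM export.
grants = {
    "scheduler_student_1": {"appointments", "medical_records"},
    "scheduler_student_2": {"appointments", "medical_records"},
    "attending_physician": {"appointments", "medical_records"},
}

def who_can_access_what(grants):
    """Invert grants into resource -> sorted list of users holding access."""
    report = defaultdict(list)
    for user, resources in grants.items():
        for resource in resources:
            report[resource].append(user)
    return {resource: sorted(users) for resource, users in report.items()}

for resource, users in sorted(who_can_access_what(grants).items()):
    print(f"{resource}: {', '.join(users)}")
# A manager scanning this report would spot immediately that the
# appointment schedulers also hold medical_records access.
```

The point isn't the code; it's that the report exists, goes to someone accountable, and goes regularly.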
Dr. Michelle Post shares this Call for Speakers.
Syndicated
Radio Show Safety Talk needs Safety & Security Experts
The show is seeking public safety and security guests, including law
enforcement professionals, security experts, and campus safety
professionals, as well as those who represent security products and
solutions.
This includes representatives of companies that
deal with video surveillance, access control, cybersecurity, personal
safety products, and others.
How to face down facial recognition? What else
should we ban?
Facial
Recognition Surveillance Now at a Privacy Tipping Point - CPO
Magazine
… San Francisco is now considering an outright
ban on facial recognition surveillance. If pending legislation known
as “Stop Secret Surveillance” passes, this would make San
Francisco the first city ever to ban (and not just regulate) facial
recognition technology.
… One reason why the outright ban on facial
recognition technology is so important is that it fundamentally
flips the script on how to talk about the technology. Previously,
the burden of proof was on the average citizen and advocacy groups –
it was up to them to show the hazards and negative aspects of the
technology. Now, the burden of proof is on any city agency
(including local police) that would like to implement the technology
– they not only have to demonstrate that there is a clear use case
for the technology, but also demonstrate that the pros far outweigh
the cons for any high-tech security system (including a facial
recognition database).
An observation: Facebook takes its own security
seriously. Would they offer this tracking as a service to other
organizations or individuals?
Salvador Rodriguez reports:
In early 2018, a Facebook user made a public threat on the social network against one of the company’s offices in Europe.
Facebook picked up the threat, pulled the user’s data and determined he was in the same country as the office he was targeting. The company informed the authorities about the threat and directed its security officers to be on the lookout for the user.
“He made a veiled threat that ‘Tomorrow everyone is going to pay’ or something to that effect,” a former Facebook security employee told CNBC.
Read more on CNBC.
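The workflow CNBC describes — flag a threatening post, then correlate the poster's location with the threatened office — can be caricatured in a few lines. Everything below is invented for illustration (the phrases, the office geodata, the escalation labels); a real system would be vastly more nuanced:

```python
import re

# Hypothetical threat phrasing to flag, compiled once.
THREAT_PATTERNS = [re.compile(p, re.IGNORECASE) for p in (
    r"\beveryone is going to pay\b",
    r"\byou will regret\b",
)]

# Hypothetical office -> country mapping.
OFFICE_COUNTRIES = {"dublin_office": "IE", "london_office": "GB"}

def assess(post_text: str, poster_country: str, target_office: str) -> str:
    """Return an escalation level for a post mentioning an office."""
    if not any(p.search(post_text) for p in THREAT_PATTERNS):
        return "none"
    # Physical proximity raises the priority, as in the CNBC account.
    if poster_country == OFFICE_COUNTRIES.get(target_office):
        return "notify_authorities_and_site_security"
    return "review"

print(assess("Tomorrow everyone is going to pay", "IE", "dublin_office"))
```

Which circles back to my question: once you have built this for yourself, selling it to others is mostly a packaging problem.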
Reminds me of a paper by Paul David (The Dynamo
and the Computer) drafted in 1990 but never finalized. There is an
entire infrastructure that has to change to fully utilize AI.
This is why
AI has yet to reshape most businesses
The art of making perfumes and colognes hasn’t
changed much since the 1880s, when synthetic ingredients began to be
used. Expert fragrance creators tinker with combinations of
chemicals in hopes of producing compelling new scents. So Achim
Daub, an executive at one of the world’s biggest makers of
fragrances, Symrise, wondered what would happen if he injected
artificial intelligence into the process. Would a machine suggest
appealing formulas that a human might not think to try?
… Daub is pleased with progress so far. Two
fragrances aimed at young customers in Brazil are due to go on sale
there in June.
… However, he’s careful to point out that
getting this far took nearly two years—and it required investments
that still will take a while to recoup. Philyra’s initial
suggestions were horrible: it kept suggesting shampoo recipes. After
all, it looked at sales data, and shampoo far outsells perfume and
cologne. Getting it on track took a lot of training by Symrise’s
perfumers. Plus, the company is still wrestling with costly IT
upgrades that have been necessary to pump data into Philyra from
disparate record-keeping systems while keeping some of the
information confidential from the perfumers themselves. “It’s
kind of a steep learning curve,” Daub says. “We are nowhere near
having AI firmly and completely established in our enterprise
system.”
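The shampoo anecdote is a plain training-data skew problem: if the model learns from raw sales volume, the highest-volume category dominates its suggestions. The usual first fix is to restrict (or reweight) the training set before the model ever sees it. A sketch of the filtering step, with invented records — the article doesn't say how Symrise actually corrected Philyra:

```python
# Hypothetical sales records feeding a formula-suggestion model.
sales_records = [
    {"category": "shampoo",        "formula": "S-1", "units": 900_000},
    {"category": "shampoo",        "formula": "S-2", "units": 750_000},
    {"category": "fine_fragrance", "formula": "F-1", "units": 40_000},
    {"category": "fine_fragrance", "formula": "F-2", "units": 35_000},
]

def training_set(records, wanted=frozenset({"fine_fragrance"})):
    """Keep only the target category so volume elsewhere can't dominate."""
    return [r for r in records if r["category"] in wanted]

kept = training_set(sales_records)
print([r["formula"] for r in kept])   # ['F-1', 'F-2']
```

The expensive part, as Daub says, isn't this filter — it's the IT plumbing to get clean, category-labeled data out of disparate record-keeping systems in the first place.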
(Related)
Gartner
debunks five Artificial Intelligence misconceptions