External (portable) drives are often used to store evidence…
https://gizmodo.com/western-digital-confirms-my-book-live-drives-are-being-1847171372
Western Digital Confirms 'My Book Live' Drives Are Being Deleted Remotely
Unplug your drives from the internet right now to keep any data safe.
Western Digital’s popular My Book Live hard drives are being deleted remotely by an unknown attacker, according to the company. And there’s not much anyone can do at this point but unplug their drives from the internet.
“We have determined that some My Book Live devices have been compromised by a threat actor,” Western Digital’s Jolin Tan told Gizmodo early Friday by email. “In some cases, this compromise has led to a factory reset that appears to erase all data on the device.”
A whole new field to hack. Be the first one on your block to brick your neighbor’s Tesla.
https://www.cpomagazine.com/cyber-security/protecting-the-software-defined-vehicle/
Protecting the Software-Defined Vehicle
The move toward software-defined vehicles is enabling a wealth of safety, comfort and convenience innovations – and the innovations do not stop when those vehicles leave a dealership. Through over-the-air (OTA) updates, the software that runs the vehicle can continue to evolve and improve throughout its lifecycle, delighting consumers for years to come.
This is a powerful capability, but it requires an approach to development that always has cybersecurity in mind. Attacks might come from physical access to a vehicle, or even via Wi-Fi or Bluetooth, but cellular connections mean an attacker could potentially access the vehicle’s systems from anywhere in the world.
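The article's point — that OTA capability is only safe if the vehicle can tell an authentic update from a tampered one — can be illustrated with a toy sketch. Everything below is invented for illustration (the key, the function names, the firmware bytes); real automotive OTA systems use asymmetric signatures and rollback protection rather than a shared HMAC key, but the accept/reject logic is the same shape:

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture. In practice the vehicle would
# hold only a public verification key, never a signing secret.
VEHICLE_KEY = b"example-provisioning-key"

def sign_firmware(image: bytes, key: bytes = VEHICLE_KEY) -> bytes:
    """Update server side: compute an authentication tag over the image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_and_stage(image: bytes, tag: bytes, key: bytes = VEHICLE_KEY) -> bool:
    """Vehicle side: refuse to install an image whose tag does not match.
    compare_digest avoids leaking match position through timing."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"\x7fELF...new-engine-control-build"
tag = sign_firmware(firmware)

ok = verify_and_stage(firmware, tag)                 # authentic update
tampered = verify_and_stage(firmware + b"\x00", tag) # one byte flipped in transit
```

The design choice worth noticing is that verification happens on the vehicle before anything is staged, so a cellular-delivered payload from "anywhere in the world" buys an attacker nothing without the key.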
Could this idea spread?
https://www.schneier.com/blog/archives/2021/06/banning-surveillance-based-advertising.html
Banning Surveillance-Based Advertising
The Norwegian Consumer Council just published a fantastic new report: “Time to Ban Surveillance-Based Advertising.” From the Introduction:
The challenges caused and entrenched by surveillance-based advertising include, but are not limited to:
privacy and data protection infringements
opaque business models
manipulation and discrimination at scale
fraud and other criminal activity
serious security risks
At least it was unarmed.
Baltimore spy plane program was invasion of citizens’ privacy, court rules
Kim Lyons reports:
The city of Baltimore’s spy plane program was unconstitutional, violating the Fourth Amendment protection against illegal search, and law enforcement in the city cannot use any of the data it gathered, a court ruled Thursday. The Aerial Investigation Research (or AIR) program, which used airplanes and high-resolution cameras to record what was happening in a 32-square-mile part of the city, was canceled by the city in February.
Local Black activist groups, with support from the ACLU, sued to prevent Baltimore law enforcement from using any of the data it had collected in the time the program was up and running.
Read more on The Verge
EPIC.org writes:
The en banc 4th Circuit ruled today that Baltimore’s warrantless aerial surveillance program violates the Fourth Amendment because it “enables police to deduce from the whole of individuals’ movements[.]” The Aerial Investigation Research program was a public-private partnership with Persistent Surveillance Systems that flew several surveillance planes above Baltimore, capturing detailed video of 32 square miles of the city per second. Using the AIR pilot program, Baltimore Police were able to track individual movements throughout the city for up to 12 hours a day. The pilot program was not renewed at the end of its 6-month term last year. EPIC joined an amicus brief in the case, arguing that under Carpenter v. United States the Baltimore Police Department’s ability to track individuals with at least 45 days of flight video augmented by automated license plate reader systems constituted a search. EPIC previously filed an amicus brief in Carpenter v. United States and has long fought to limit drone surveillance and other forms of aerial spying.
Resource?
FPF Partners with Penn State and University of Michigan Researchers on Searchable Database of Privacy-Related Documents
FPF is collaborating with a team of researchers to build a searchable database of privacy policies and other privacy-related documents. The PrivaSeer project, led by researchers from Penn State and the University of Michigan, has received a $1.2 million grant from the National Science Foundation (NSF) to ease the process of collecting and utilizing privacy documents and privacy-related data.
AI has a negative impact?
https://hai.stanford.edu/news/new-approach-mitigating-ais-negative-impact
A New Approach To Mitigating AI’s Negative Impact
Stanford launches an Ethics and Society Review Board that asks researchers to take an early look at the impact of their work.
… The Ethics and Society Review (ESR) requires researchers seeking funding from the Stanford Institute for Human-Centered Artificial Intelligence (HAI) to consider how their proposals might pose negative ethical and societal risks, to come up with methods to lessen those risks, and, if needed, to collaborate with an interdisciplinary faculty panel to ensure those concerns are addressed before funding is received.
(Related) If you can’t lick ‘em, join ‘em?
https://www.protocol.com/workplace/twitter-ethical-ai-meta
How Twitter hired tech's biggest critics to build ethical AI
… One year later, Twitter's commitment to Font's team has convinced even the most skeptical people in tech — the ethics research community itself. Rumman Chowdhury, notorious and beloved by her fellow researchers for her commitment to algorithmic auditing, announced that she would be leaving her new startup to become Twitter's META leader. Kristian Lum, a University of Pennsylvania professor renowned for her work building machine-learning models that could reshape criminal justice, will join Twitter at the end of June as their new head of research. And Sarah Roberts, famous for her critiques of tech companies and the co-director of the Center for Critical Internet Inquiry at UCLA, will become a consultant for the META team this summer, researching what Twitter users actually want from algorithmic transparency.
I need this technology! It will also be fun to reverse the transformation…
Nvidia AI could let you make video calls in your PJs without anyone knowing
… Nvidia has unveiled an AI model that converts a single 2D image of a person into a “talking head” video.
Known as Vid2Vid Cameo, the deep learning model is designed to improve the experience of videoconferencing.
If you’re running late for a call, you could roll out of bed in your pajamas and disheveled hair, upload a photo of you dressed to impress, and the AI will map your facial movements to the reference image — leaving the other attendees unaware of the chaos behind the camera. That could be a boon for the chronically unkempt, but you should probably test the technique before you turn up in your birthday suit.