Thursday, April 01, 2021

For my security students.

https://www.tripwire.com/state-of-security/security-data-protection/role-of-encryption-in-gdpr-compliance/

Role of Encryption in GDPR Compliance

Today’s article is about one such data privacy law that repeatedly mentions encryption: the GDPR, the EU’s data privacy regulation. Although encryption is not mandatory under the GDPR, it is widely seen as a best practice for protecting personal data. So let us first understand what data encryption is, and then look at the role it plays in GDPR compliance.
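
For my students, a minimal sketch of what symmetric encryption of personal data looks like in practice. It assumes the third-party Python "cryptography" package is installed; the sample record is made up for illustration and is not from the article.

    # Minimal symmetric-encryption sketch (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # secret key -- must itself be stored securely
    f = Fernet(key)

    record = b"name=Jane Doe; email=jane@example.com"   # hypothetical personal data
    token = f.encrypt(record)            # ciphertext, safe to store or transmit
    print(f.decrypt(token) == record)    # True: only the key holder can recover it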





The role keeps changing…

https://www.csoonline.com/article/3332026/what-is-a-ciso-responsibilities-and-requirements-for-this-vital-leadership-role.html#tk.rss_all

What is a CISO? Responsibilities and requirements for this vital leadership role

Learn what it takes to land a CISO job and how to be successful in the role.





Timely.

https://www.bespacific.com/crs-biometric-technologies-and-global-security/

CRS – Biometric Technologies and Global Security

CRS In Focus – Biometric Technologies and Global Security March 30, 2021: “Biometric technologies use unique biological or behavioral attributes—such as DNA, fingerprints, cardiac signatures, voice or gait patterns, and facial or ocular measurements—to authenticate an individual’s identity. Although biometric technologies have been in use for decades, recent advances in artificial intelligence (AI) and Big Data analytics have expanded their application. As these technologies continue to mature and proliferate, largely driven by advances in the commercial sector, they will likely hold growing implications for congressional oversight, civil liberties, U.S. defense authorizations and appropriations, military and intelligence concepts of operations, and the future of war…”



(Related)

https://www.cpomagazine.com/data-privacy/white-collar-blue-collar-schism-at-apple-factory-workers-subject-to-collection-of-biometric-data-extra-security-measures/

White Collar / Blue Collar Schism at Apple: Factory Workers Subject to Collection of Biometric Data, Extra Security Measures

Apple has branded itself as the company that puts user privacy front and center, making bold moves to that end with the release of iOS 14. It has called privacy a “fundamental human right” and has said that its internal human rights policy applies to “… business partners and people at every level of its supply chain.” However, it seems that some elements of the chain are more equal than others. A new company policy forbids manufacturing partners from collecting the biometric data of visiting Apple employees, but says nothing about the over one million workers who put Apple’s products together in these facilities.

These workers will also now be subject to tighter security controls mandated by Apple, which include criminal background checks, an expansion of surveillance cameras and new systems that track components during the assembly process and issue alerts when something is in one place for too long or not moving as expected.
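
A hypothetical sketch (not Apple’s actual system, which the article does not describe in detail) of how a dwell-time alert like the one mentioned above might work:

    # Flag any component whose last station scan is older than an assumed maximum
    # dwell time. The threshold and component IDs are invented for illustration.
    from datetime import datetime, timedelta

    MAX_DWELL = timedelta(minutes=30)    # assumed threshold, purely illustrative

    def dwell_alerts(last_scans, now):
        """last_scans maps component IDs to the time they were last scanned."""
        return [cid for cid, seen in last_scans.items() if now - seen > MAX_DWELL]

    scans = {"C-001": datetime(2021, 4, 1, 8, 0), "C-002": datetime(2021, 4, 1, 9, 55)}
    print(dwell_alerts(scans, now=datetime(2021, 4, 1, 10, 0)))   # ['C-001']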





Not yet a tool, but the start of specifications for a tool.

https://www.infoworld.com/article/3613832/4-key-tests-for-your-ai-explainability-toolkit.html

4 key tests for your AI explainability toolkit

Until recently, explainability was largely seen as an important but narrowly scoped requirement towards the end of the AI model development process. Now, explainability is being regarded as a multi-layered requirement that provides value throughout the machine learning lifecycle.

An enterprise-grade explainability solution must meet four key tests:

    1. Does it explain the outcomes that matter?

    2. Is it internally consistent?

    3. Can it perform reliably at scale?

    4. Can it satisfy rapidly evolving expectations?
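
A rough sketch of what tests 1 and 2 might look like in code, using scikit-learn’s permutation importance as a stand-in for whatever explainer your toolkit actually provides. The dataset, model, and pass/fail check are invented for illustration.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    X, y = make_classification(n_samples=500, n_features=8, n_informative=3, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X, y)

    def top_features(seed, k=3):
        # Test 1: do the highlighted features actually drive the outcome we care about?
        imp = permutation_importance(model, X, y, n_repeats=10, random_state=seed)
        return list(np.argsort(imp.importances_mean)[::-1][:k])

    # Test 2: is the explanation internally consistent across repeated runs?
    run1, run2 = top_features(seed=1), top_features(seed=2)
    print("Run 1 top features:", run1)
    print("Run 2 top features:", run2)
    print("Consistent?", set(run1) == set(run2))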





I think their definition is unworkable.

https://www.natlawreview.com/article/cpsc-digs-artificial-intelligence

The CPSC Digs In on Artificial Intelligence

… On March 2, 2021, at a virtual forum attended by stakeholders across the entire industry, the Consumer Product Safety Commission (CPSC) reminded us all that it has the final say on regulating the safety of consumer products that use AI and machine learning.

… The CPSC defines AI as “any method for programming computers or products to enable them to carry out tasks or behaviors that would require intelligence if performed by humans” and machine learning as “an iterative process of applying models or algorithms to data sets to learn and detect patterns and/or perform tasks, such as prediction or decision making that can approximate some aspects of intelligence.”
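
To see why I find the definition unworkable, consider a hypothetical, deliberately dumb product routine (invented here, not from the article). Read literally, even this fixed threshold rule “carries out a task that would require intelligence if performed by humans,” with no learning involved at all.

    def should_shut_off(temperature_c, limit_c=90.0):
        """Decide whether a heating element should shut off -- a judgment a human
        operator would otherwise make."""
        return temperature_c >= limit_c

    print(should_shut_off(95.0))   # True: the product "decides", so is it AI?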


