Tuesday, June 04, 2019


If you’re good, your hack goes undetected.
ABC in Australia reports:
The Australian National University has been hit by a massive data hack, with unauthorised access to significant amounts of personal details dating back 19 years.
A sophisticated operator accessed the ANU’s systems illegally in late 2018 but the breach was only detected two weeks ago, the university said in a statement.
Based on student numbers over that time, as well as staff turnover, the university has estimated approximately 200,000 people were affected by the breach.
Read more on ABC.




Maury Nichols, who is absolutely, positively not an evil hacker and had nothing whatsoever to do with various hacks I may have mentioned in this blog, forwarded this article for my Ethical Hackers. It might be a pretty good deal at $39.00 (8 courses divided into 384 lessons).
Ethical Hacking A to Z Training Bundle
Break Into the Lucrative World of Ethical Hacking with Over 45 Hours of Immersive Content




Some background for my Computer Security students.




For my HIPAA lecture.
Aimee Jachym and Samantha A. Kopacz of Miller Canfield PLC write:
New guidance issued by the U.S. Department of Health & Human Services (HHS) Office for Civil Rights (OCR) reaffirms that business associates must have proper HIPAA compliance practices, safeguards and documentation in place in order to avoid costly penalties.
OCR recently released a Fact Sheet summarizing the instances in which a business associate is directly liable for HIPAA violations. While nothing in the HIPAA Privacy, Security, Breach Notification, and Enforcement Rules (HIPAA Rules) has changed at this time, the Fact Sheet, released on May 24, 2019, aims to make it easier for regulated entities to understand and comply with their obligations under the law.
Read more on Miller Canfield.




Not so much mini-GDPRs, more like easy-to-digest bits and pieces.
New Nevada Privacy Law With “Sale” Opt-Out Right Will Take Effect Before the CCPA
Nevada has a new privacy law. On May 29, Nevada Governor Steve Sisolak signed Senate Bill 220 (SB-220) into law, making Nevada the first state to join California in granting consumers the right to opt out of the sale of their personal information. The act, which amends an existing online privacy notice law, is significantly narrower than the California Consumer Privacy Act (CCPA). It applies only to online activities, defines “consumer” and “sale” in a much more limited manner than the CCPA, and includes broad exceptions for financial institutions subject to the Gramm-Leach-Bliley Act, entities subject to the Health Insurance Portability and Accountability Act, and vehicle manufacturers and vehicle service and repair entities that collect covered information from vehicles through connected or subscription services.


(Related) Hey! It’s not a race!
Issie Lapowsky reports:
As tech giants and lobbying groups race to defang California’s landmark consumer privacy law before it takes effect next year, lawmakers on the other side of the country are considering a bill that’s even more drastic.
The New York Privacy Act, introduced last month by state senator Kevin Thomas, would give residents there more control over their data than in any other state. It would also require businesses to put their customers’ privacy before their own profits. The bill is still seeking a cosponsor in the state assembly, but Thomas says he is confident that he has majority support in the senate and hopes to pass the bill this summer. The Committee on Consumer Protection, which Thomas chairs, is scheduled to hold a hearing on the bill Tuesday.
Read more on Wired.




Perspective.
Consumer Surveillance Enters Its Bargaining Phase
Amazon and Google are happy to give users the option to pause tracking. Why can’t we stop it entirely?
The best measure of whether a company cares about privacy is what it does by default. Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them, and the whole world can see who you split utilities with until you take your Venmo private. Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but there’s no way for users to untether themselves from the larger ad-targeting ecosystem.




Someone in Congress asked for this?
Enforcing Federal Privacy Law – Constitutional Limitations on Private Rights of Action
CRS Legal Sidebar – Enforcing Federal Privacy Law—Constitutional Limitations on Private Rights of Action, May 31, 2019: “Over the last two years, the prospect of a comprehensive federal data privacy law has been the subject of considerable attention in the press and in Congress. Some Members of Congress and outside groups have developed many proposals in the last six months alone. Some of the proposed legislation would limit companies’ ability to use personal information collected online, require that companies protect customers from data breaches, provide certain disclosures about their use of personal information, or allow users to opt out of certain data practices. Some proposals combine all of those elements or take still different approaches.
One overarching question that every data privacy proposal raises is how to enforce any new federal rights or obligations that a given bill would impose. One traditional method of enforcement would be by a federal agency, such as the Federal Trade Commission or Department of Justice, through civil penalties or criminal liability. A bill could also provide for enforcement in civil lawsuits brought by state attorneys general. Along with these methods, several outside commentators have recently called for any new federal privacy legislation to include a federal private right of action—a right that would allow individuals aggrieved by violations of the law to file lawsuits against violators in order to obtain money damages in federal court. At least one bill proposed in Congress includes such a right: the Privacy Bill of Rights Act, S. 1214.
Such proposals for judicial enforcement by individual lawsuits must necessarily tangle with the constitutional limits on when federal courts can hear such claims. This Sidebar considers how the lower courts have addressed such questions in the wake of the Supreme Court’s 2016 decision in Spokeo v. Robins. As is discussed in detail below, these cases reveal some common principles on the limits of federal justiciability that might inform Congress’s efforts to craft a private right of action in the data privacy context…”




A solution that should work, if nothing changes.
How federated learning could shape the future of AI in a privacy-obsessed world
You may not have noticed, but two of the world’s most popular machine learning frameworks — TensorFlow and PyTorch — have taken steps in recent months toward privacy with solutions that incorporate federated learning.
Instead of gathering data in the cloud from users to train data sets, federated learning trains AI models on mobile devices in large batches, then transfers those learnings back to a global model without the need for data to leave the device.
As part of the latest release of Facebook’s popular deep learning framework PyTorch last month, the company’s AI Research group rolled out Secure and Private AI, a free two-month Udacity course on the use of methods like encrypted computation, differential privacy, and federated learning.
Google AI researchers first introduced federated learning in 2017, and since then it’s been cited more than 300 times by research scientists, according to arXiv. In March, Google released TensorFlow Federated to make federated learning easier to perform with its popular machine learning framework.
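For students who want to see the mechanics behind that description, here is a minimal, framework-free sketch of the federated averaging idea. It is not the TensorFlow Federated or PySyft API; the simulated “devices,” the linear model, and every number in it are assumptions made up for illustration. The point is that only locally updated weights, never the raw data, reach the central averaging step.

    import numpy as np

    rng = np.random.default_rng(0)

    def local_update(weights, X, y, lr=0.1, epochs=5):
        # One device's training pass; the raw data (X, y) never leaves the device.
        w = weights.copy()
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
            w -= lr * grad
        return w

    # Hypothetical per-device datasets standing in for data held on phones.
    true_w = np.array([2.0, -1.0])
    devices = []
    for _ in range(10):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        devices.append((X, y))

    # Federated averaging: each round, every device trains locally and sends back
    # only its updated weights; the server averages them into the global model.
    global_w = np.zeros(2)
    for _ in range(20):
        local_weights = [local_update(global_w, X, y) for X, y in devices]
        global_w = np.mean(local_weights, axis=0)

    print("learned weights:", global_w)  # should approach [2.0, -1.0]

Real deployments add secure aggregation and differential privacy on top of this averaging step, but the data-stays-on-device structure is the same.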




Something for my Architecture students.
The Risks and Rewards of Digital Maturity




Perspective. Would Adam Smith agree? Would this apply to individual companies?
A Nation’s Wealth May Depend on How Much Its Workers Can Learn on the Job
New research suggests that formal schooling is not the panacea to global inequality that many have long believed it to be.
When economists have added up the components known to make up physical and human capital, they haven’t been able to explain as much of the wealth gap as one might expect. “It seemed like [these factors] should matter more,” says Nancy Qian, a professor of managerial economics and decision sciences at Kellogg.
So could one of those elements be missing a crucial factor? In a recent study, Qian and her collaborators investigated a factor in human capital that hadn’t received much attention: on-the-job learning. The researchers found that the rate at which people acquire skills at work seemed to be substantially different in rich versus poor countries.
“In poor countries, workers are not learning nearly as much on the job as in rich countries,” Qian says. And because on-the-job learning is the primary way people gain new skills after their formal schooling ends, this can have dramatic consequences for a nation’s economic development.




Updating an oldie.
IBM aims to meld Db2 with machine learning, data science workflows
Db2 version 11.5 features a series of new drivers for multiple open source programming languages and frameworks. The idea is that developers can build machine learning models into applications via Db2. In addition, Db2 integrates with Jupyter Notebooks.
Features and integrations include:
    • Support for Go, Ruby, Python, PHP, Java, Node.js, Sequelize.
    • Framework support for Jupyter and Visual Studio Code.
    • Augmented Data Explorer, a natural language querying tool built to resemble a traditional search engine.
    • Data Virtualization to search across multiple diverse data sources.
    • Blockchain support.
    • A common SQL engine that can access all IBM Db2 offerings as well as Oracle, Teradata, Microsoft SQL Server and cloud databases such as Amazon Redshift.
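To make “drivers for multiple open source programming languages” concrete, here is a minimal sketch of querying Db2 from Python with the ibm_db driver. The hostname, credentials, and the DEPARTMENT table (from Db2’s SAMPLE database) are placeholder assumptions for illustration, not details from the IBM announcement.

    import ibm_db

    # Placeholder connection values (hypothetical host and credentials).
    dsn = (
        "DATABASE=SAMPLE;"
        "HOSTNAME=db2.example.com;"
        "PORT=50000;"
        "PROTOCOL=TCPIP;"
        "UID=db2user;"
        "PWD=secret;"
    )

    conn = ibm_db.connect(dsn, "", "")

    # DEPARTMENT ships with Db2's SAMPLE database; swap in your own query.
    stmt = ibm_db.exec_immediate(
        conn, "SELECT deptname FROM department FETCH FIRST 5 ROWS ONLY"
    )

    row = ibm_db.fetch_assoc(stmt)  # dict per row; returns False when exhausted
    while row:
        print(row["DEPTNAME"])
        row = ibm_db.fetch_assoc(stmt)

    ibm_db.close(conn)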




Technology moves on…


