The welcome screen on the prison laptop was simple to navigate. Prison officials clicked on the dog icon, inmates clicked on the cat. Clicking on the dog – and entering the password – allowed access to a section with administrator privileges and access to the internet. The cat was a gateway to little more than a basic word processor.
Unlocking the “dog” was key to the plotters’ attempts to use the computer to smuggle drugs. With the help of an eastern European hacker inside the prison, the gang obtained a coded pen drive, which a visitor smuggled in.
Frequent visitors to the Hustler Club, a gentlemen’s entertainment venue in New York, could not have known that they would become part of a debate about anonymity in the era of “big data”. But when, for sport, a data scientist called Anthony Tockar mined a database of taxi-ride details to see what fell out of it, something intriguing emerged: even though the data included no direct identification of the customer, drop-off points for journeys that began at the club clustered at particular private addresses. Stir voter-registration records into the mix to identify who lives at those addresses (which Mr Tockar did not do) and you might end up creating some rather unhappy marriages.
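The re-identification risk described above can be sketched in a few lines of code. This is not Mr Tockar’s actual analysis; the coordinates and records below are invented purely to illustrate how “anonymous” trip data plus a public lookup can point at a person.

```python
# Illustrative sketch only: how anonymised taxi records can be linked
# back to individuals. All data here is invented for demonstration.
from collections import Counter

# Hypothetical anonymised records: (pickup, dropoff) coordinates,
# with no names or payment details attached.
rides = [
    ((40.768, -73.993), (40.712, -73.958)),  # club -> address A
    ((40.768, -73.993), (40.712, -73.958)),  # club -> address A again
    ((40.768, -73.993), (40.731, -73.989)),  # club -> address B
    ((40.750, -73.990), (40.712, -73.958)),  # unrelated pickup
]

CLUB = (40.768, -73.993)  # assumed coordinates of the venue

# Step 1: keep only rides that started at the club.
from_club = [drop for pick, drop in rides if pick == CLUB]

# Step 2: count repeated drop-off points; a private address that
# recurs is a strong candidate for a regular patron's home.
clusters = Counter(from_club)
repeat_dropoffs = {addr: n for addr, n in clusters.items() if n > 1}
print(repeat_dropoffs)  # the recurring private address, with its count
```

A voter-registration lookup on the recurring address would then supply a name, completing the re-identification, which is precisely the step Mr Tockar declined to take.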
Earlier this week, the Online Trust Alliance released a draft framework of best practices for manufacturers and developers of Internet of Things devices, such as connected-home devices and wearable fitness and health technologies. The OTA is seeking comments on its draft framework by September 14.
The framework acknowledges that, owing to technical limitations and firmware constraints, not every requirement will apply to every product. It nonetheless proposes a number of specific security requirements, including encryption of personally identifiable data at rest and in transit, password-protection protocols, and penetration testing.
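To make one of those requirements concrete, here is a minimal sketch of a password-protection practice a device maker might adopt: storing credentials only as salted, deliberately slow hashes rather than plaintext. The iteration count and salt size are illustrative choices, not figures taken from the OTA draft.

```python
# Sketch of salted password hashing for a connected device.
# Parameters (16-byte salt, 200,000 PBKDF2 iterations) are illustrative.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a PBKDF2-HMAC-SHA256 digest; many iterations slow brute force."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Constant-time comparison avoids leaking information via timing."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("letmein", salt, stored))                       # False
```

The device would persist only `salt` and `stored`; even if its flash memory is dumped, the original password is not directly recoverable.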
Privacy is breached at several levels: at the time of data collection (especially when biometrics are involved); at the time of storage by multiple actors (which the federated and decentralised enrollment apparatus facilitates by design); and at the time of use (especially when Aadhaar is tagged for banal everyday activities that are low-risk from an identity-theft or benefits-fraud point of view, risking an allegedly secure system being devalued through ubiquity and compromised through biometric overuse). All of this is compounded by the lack of a statutory frame for the Unique Identification Authority of India and/or a dedicated privacy law.
When the Attorney General contends, as he did during the ongoing matter before the Supreme Court, and as referenced in Tuesday’s order, that there is no privacy violation if the data is not shared, he fails to acknowledge the very complex network of transactions and uses that the scheme is predicated on. When the Supreme Court misses the opportunity to put the brakes on the continued collection of data, it opens the door for the government to rely on the rhetoric of Too Big To Fail, Too Late to Turn Back.