A little slow to disclose?
McKesson: Stolen Computers Contain Patient Information
Health-care services company McKesson is alerting thousands of its patients that their personal information is at risk after two of its computers were stolen from an office.
The company, which helps pharmaceutical manufacturers set up assistance programs for patients in need, sent out a letter alerting patients that the computers were stolen on July 18. The names of the people being alerted were on one of the two PCs, but it's not known how much of their accompanying identifying information was also contained on the machines.
Source - InformationWeek
[From the article:
The company representative said it's not clear if the data on the machines was encrypted. [“We don't know what the hell we're doing...” Bob]
Clueless in Canada?
Ca: CHR patient data stolen
Patient information has been compromised after Calgary Health Region computers were stolen in a sophisticated break-and-enter early yesterday, officials said.
... "Apart from other electronics, seven laptops were stolen, two of which contained patient information."
How sensitive the information on the stolen machines is, and how much of it there is, isn't yet known, [“We have no idea what our employees do, and they have no idea what data they do it with...” Bob] said Rougeau.
Source - Calgary Sun
This is interesting...
ID Theft Research Group to Come Out of the Shadows
The Center for Identity Management and Information Protection (CIMIP) has kept a low profile since its inception over a year ago, but that's about to change: The public-private partnership that includes IBM, the U.S. Secret Service, and the FBI, has just broken ground on a new multi-million dollar secured facility, and next month will release some surprising findings about the bad guys behind identity theft.
Source - Dark Reading
So is this (some of the same folks as in the previous story)
International Journal of Digital Evidence
The major drawback is that US “broadband” is much slower than broadband in third world countries. That will need to change.
Analysts Predict Death of Traditional Network Security
By Brian Prince September 7, 2007
As the number of mobile workers grows, businesses will be forced to opt for desktop virtualization, Forrester analysts say.
Robert Whiteley and Natalie Lambert have seen the future—and in it, traditional network security is dead. At least that is the message the two Forrester Research analysts delivered to a crowd at the Forrester Security Forum in Atlanta Sept. 6.
According to them, in the next five years the Internet will be the primary connectivity method for businesses, replacing their private network infrastructure as the number of mobile workers, contractors and other third-party users continues to grow. In this new world, which Whiteley and Lambert called "Internet Everywhere," corporations will have to redefine network security and focus on data encryption, managing risk at the endpoint and having strict data access controls, they said.
Some corporations, such as the energy giant BP, have already taken big steps towards deperimeterization—a term created by the Jericho Forum to describe a strategy that focuses on protecting data with tactics such as encryption rather than traditional efforts aimed at fending off attacks from intruders at the network's boundary. BP has taken some 18,000 of its 85,000 laptops off its LAN and allowed them to connect directly to the Internet, the two said.
... Desktop virtualization allows a PC's operating system and applications to execute in a secure area separate from the underlying hardware and software platform. Its security advantages have become a major selling point, as all a virtualized terminal can do is display information; if it is lost or stolen, no corporate data would likely be compromised since it wouldn't be stored on the local hard drive.
This wouldn't be interesting except for the “We didn't know... “ aspect. Is this a one-in-a-billion situation? (see next article)
National Intelligence Web site no longer invisible to search engines
Posted by Declan McCullagh September 7, 2007 4:30 PM PDT
Until a few hours ago, the Web site of National Intelligence Director Mike McConnell had been invisible in Google, MSN and Yahoo searches. That's because dni.gov's robots.txt file told search engines to stay away. [This is not a default, it requires action by someone. Bob]
Now it's been fixed. DNI spokesman Ross Feinstein told me, apologetically, a moment ago: "When we saw your story posted, I asked our developers to look into it... We certainly appreciate you bringing it to our attention. It's a public Web site. We want it to be indexed. We're not even sure how (the robots.txt file) got there." [The Tooth Fairy strikes again! Bob]
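As Bob notes, a robots.txt that blocks indexing has to be written deliberately. A minimal sketch of the effect, using Python's standard-library `urllib.robotparser` (the deny-all file shown is illustrative, not the actual dni.gov file):

```python
from urllib.robotparser import RobotFileParser

# A two-line robots.txt like this is enough to hide an entire site
# from every well-behaved crawler -- it does not appear by default.
DENY_ALL = [
    "User-agent: *",
    "Disallow: /",
]

blocked = RobotFileParser()
blocked.parse(DENY_ALL)
# A crawler asking whether it may fetch the homepage is refused:
print(blocked.can_fetch("Googlebot", "https://www.dni.gov/"))

# Remove the rules (serve an empty robots.txt) and indexing is allowed:
open_site = RobotFileParser()
open_site.parse([])
print(open_site.can_fetch("Googlebot", "https://www.dni.gov/"))
```

The first check prints `False`, the second `True` — which is why "we're not sure how it got there" is hard to square with how the file works.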
Typically, updates to a database are fed back to the updater, allowing them to confirm that ALL the updates were made. This is Programming 101.
Database Glitch Trips Up Terrorist Screening
By Lara Jakes Jordan AP 09/07/07 8:25 AM PT
A database mistake on the part of the FBI resulted in the records of 20 terror suspects not being available to front-line screeners, an audit found. The problem is that records for two systems that feed to and from the central terror watch list database don't match. The FBI says it is working on the problem and should have it fixed within six months.
I wonder what this cost the taxpayers...
High technology off menu
After 1 day, Wilmette district's use of pupils' fingerprints to pay for lunches is put on hold because of privacy and legal concerns
By Lisa Black Tribune staff reporter September 7, 2007
Shortly after rolling out a new lunch program that allows pupils to pay for hot meals with a scan of their fingerprint, Wilmette school officials put the system on hold after learning that a new Illinois law limits the use of biometric information to protect children's privacy.
That, and the system didn't work, perhaps because of grubby fingers or a computer glitch, said officials from Wilmette Elementary School District 39.
"The jury is still out. We tried it just one day, and it was unsuccessful," said interim Supt. Ray Lechner. [Translation: “We're not done being stupid yet!” Bob]
The US isn't the only clueless government.
September 7, 2007 By Lisa Vaas
On July 18, Sunbelt Software came across a SQL command passed as a query within a URL belonging to an arm of a European country's military. With that, any visitor can pass queries in the URL straight to the back-end database and squeeze out any data, no password required.
At the time, the URL displayed what Sunbelt President Alex Eckelberry calls an "infantile" security screw-up: Namely, putting production code and a back-end database into the hands of anybody who wanders by. It was, in other words, a serious security vulnerability that even the most basic security policy should have forbidden, never mind the security policy of a major defense agency.
Sunbelt, of Clearwater, Fla., alerted security researchers from the country in question. They in turn assured Sunbelt that they would notify the defense agency.
End of story? Unfortunately not. Six weeks later, Sunbelt checked the site and found it was still a sitting duck, serving up military base information to any visitor who knows how to frame a SQL query, and telling potential attackers exactly which database it was running and what operating system it was using, thereby painting a Day-Glo arrow toward the exact class of known vulnerabilities and exploits that could bring it to its knees.
Sunbelt alerted security researchers from the country in question. Again. They in turn assured Sunbelt that they would notify the defense agency. Again.
This is far from an anomaly. As evidenced by the recent attack on a portion of the Pentagon's network—allegedly perpetrated by the Chinese People's Liberation Army—continued vulnerability in defense establishments is leaving governments exposed and populaces at risk. What's worse, much of it is due to sheer sloppiness: Poor security policies, unpatched systems, you name it—nothing glamorous, nothing cutting-edge, just run-of-the-mill slacker lack of attention.
... But even without specifics from the horses' mouths, finding specific vulnerabilities on these sites isn't particularly difficult. Eckelberry directed eWEEK to simply Google "sex porn site:.gov." Out of the 10 top hits Sept. 6 at 4:13 EDT, eight were for pornography somehow tied in to Web servers hosted by the government of California.
Same questions I asked only better...
Why Is The Justice Department Commenting On Net Neutrality?
from the not-really-their-area-of-interest dept
There's been a fair amount of chatter over the Justice Department's decision to comment to the FCC about network neutrality, but there's been almost no discussion as to why the Justice Department should be involved at all. It's true that the DOJ covers anti-trust issues, but this isn't about a merger or the potential to create a monopoly. While I'm not in favor of regulating network neutrality, there are a bunch of really questionable statements in the DOJ's filing that simply don't make much sense. Take, for example, the following statement: "Regulators should be careful not to impose regulations that could limit consumer choice and investment in broadband facilities." If the DOJ really feels that way, then shouldn't it have also come out against the FCC's decision to do away with line sharing rules that actually did allow for competition? Does the DOJ not realize that the market for broadband is already heavily regulated, which is why most consumers here only have one or two choices -- compared to other countries that have created more open markets on top of the infrastructure, allowing for competition, faster speeds and increased innovation? Does the DOJ really not realize how many gov't subsidies and handouts have been given to the telcos so that they could build networks where no one else could enter the market in the same manner?
The DOJ also makes the bizarre argument that without breaking net neutrality, broadband providers will never make enough money to upgrade their networks. It's a dumb argument for the same reason that it's a dumb argument to claim that without network neutrality, it'll be too costly for certain sites to make enough money to offer cool services to users. Both arguments are ridiculous because they focus on the specific benefits to one private party and not how they impact the rest of the market -- and the DOJ shouldn't have any interest in focusing on the benefits of a single private party (and it's even worse for the DOJ to do so under the false guise of "free market" economics). Sure, without network neutrality telcos might be able to make more money in the short term. But you could just as easily argue that if network neutrality remains, it'll be easier (and cheaper) to create the next generation of killer apps that will make more bandwidth more valuable (allowing the telcos to profit handsomely). And, it's not even worth going into the DOJ's use of the thoroughly debunked claim comparing network neutrality to different delivery speeds at the post office. Basically, the DOJ brief (and, again, it's still not clear why they even have an opinion on this) repeats a bunch of the misleading half-truths that the telcos have spouted for months. Yet, it doesn't touch on the really key issue: there simply isn't real competition in the broadband market. Allowing the telcos to break network neutrality doesn't change that.
Free is good!
Taking the Open Road: University Libraries Explore Options
By Tracey Caldwell Information World Review 09/08/07 4:00 AM PT
The virtual learning environment could be where university libraries first encounter open source. An increasing number of content management and portal systems are also open source, and many university libraries are involved in setting up open source repositories. As acceptance of open source grows, the next step will be to consider open source solutions for the core integrated library system.
... The open source learning management system (LMS) Moodle is now used in 56 percent of universities, just three years after its introduction, and the Open University has moved over to it wholesale. Supporters say open source LMSes tend to be more modular and make it much easier for libraries to contribute content than is the case with commercial solutions.