The Canadian Privacy Law Blog: Developments in privacy law and writings of a Canadian privacy lawyer, containing information related to the Personal Information Protection and Electronic Documents Act (aka PIPEDA) and other Canadian and international laws.


About this page and the author

The author of this blog, David T.S. Fraser, is a Canadian privacy lawyer who practices with the firm of McInnes Cooper. He is the author of the Physicians' Privacy Manual. He has a national and international practice advising corporations and individuals on matters related to Canadian privacy laws.

For full contact information and a brief bio, please see David's profile.

Please note that I am only able to provide legal advice to clients. I am not able to provide free legal advice. Any unsolicited information sent to David Fraser cannot be considered to be solicitor-client privileged.


Small Print

The views expressed herein are solely the author's and should not be attributed to his employer or clients. Any postings on legal issues are provided as a public service, and do not constitute solicitation or provision of legal advice. The author makes no claims, promises or guarantees about the accuracy, completeness, or adequacy of the information contained herein or linked to. Nothing herein should be used as a substitute for the advice of competent counsel.

This web site is presented for informational purposes only. These materials do not constitute legal advice and do not create a solicitor-client relationship between you and David T.S. Fraser. If you are seeking specific advice related to Canadian privacy law or PIPEDA, contact the author, David T.S. Fraser.

Thursday, May 22, 2008

Schneier calls for a data privacy law 

In Wired, security and privacy guru Bruce Schneier is calling for a comprehensive privacy law in the United States: Our Data, Ourselves.


Saturday, May 17, 2008

Cleanse or secure your electronics before crossing the border 

Over the past few weeks, I've done a lot of travelling, first to Geneva and then to the US. On both occasions, I had to be very mindful of what information I had on my laptop and my USB drives, since I am subject to the Personal Information International Disclosure Protection Act.

This new law prohibits the export of personal information by Nova Scotia public bodies and their service providers. Since I act as a lawyer to a number of public bodies and teach at Dalhousie Law School, my laptop and BlackBerry are subject to those laws. Since I didn't want to go to the bother of asking the chief executive of each public body I work for whether I could have one-off permission to take their data with me (and since I wouldn't need their data on the road), I had to delete all traces of such personal information from my portable electronics. While this is a concern for public bodies in Nova Scotia and their service providers, it's also a concern for anyone crossing the border into the United States, as customs officers are increasingly scrutinizing laptops at the border.
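For readers facing a similar pre-travel cleanup, here is a minimal sketch (not the author's actual process) of how one might sweep a drive for obvious personal-information patterns before crossing the border. It assumes Python 3, uses only simple regular expressions for SIN-style and card-style numbers, and the directory path is a placeholder; a real compliance review under the Act would need far more than pattern matching.

```python
import re
from pathlib import Path

# Illustrative patterns only: SIN-style (123-456-789 or 123456789)
# and 16-digit card-style numbers. Real personal information takes many more forms.
PATTERNS = {
    "SIN-like":  re.compile(r"\b\d{3}[- ]?\d{3}[- ]?\d{3}\b"),
    "card-like": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan_for_personal_info(root: str, extensions=(".txt", ".csv", ".eml")):
    """Walk a directory and flag files containing personal-looking numbers."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix.lower() not in extensions:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than crash
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                hits.append((str(path), label))
    return hits

if __name__ == "__main__":
    # "/media/usb-drive" is a hypothetical mount point.
    for file_name, label in scan_for_personal_info("/media/usb-drive"):
        print(f"Review before travel: {file_name} ({label})")
```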

Bruce Schneier, who always has interesting things to say, has an article in the Guardian on how to secure your laptop if you're taking it into the US. It's a good read and probably something to bookmark for the next time you're crossing the frontier: Read me first: Taking your laptop into the US? Be sure to hide all your data first | Technology | The Guardian.


Sunday, February 03, 2008

Schneier on Security vs. Privacy 

Here's a really great read from Bruce Schneier:

Schneier on Security: Security vs. Privacy

If there's a debate that sums up post-9/11 politics, it's security versus privacy. Which is more important? How much privacy are you willing to give up for security? Can we even afford privacy in this age of insecurity? Security versus privacy: It's the battle of the century, or at least its first decade.

In a Jan. 21 New Yorker article, Director of National Intelligence Michael McConnell discusses a proposed plan to monitor all -- that's right, all -- internet communications for security purposes, an idea so extreme that the word "Orwellian" feels too mild.

The article (now online here) contains this passage:

In order for cyberspace to be policed, internet activity will have to be closely monitored. Ed Giorgio, who is working with McConnell on the plan, said that would mean giving the government the authority to examine the content of any e-mail, file transfer or Web search. "Google has records that could help in a cyber-investigation," he said. Giorgio warned me, "We have a saying in this business: 'Privacy and security are a zero-sum game.'"

I'm sure they have that saying in their business. And it's precisely why, when people in their business are in charge of government, it becomes a police state. If privacy and security really were a zero-sum game, we would have seen mass immigration into the former East Germany and modern-day China. While it's true that police states like those have less street crime, no one argues that their citizens are fundamentally more secure.

We've been told we have to trade off security and privacy so often -- in debates on security versus privacy, writing contests, polls, reasoned essays and political rhetoric -- that most of us don't even question the fundamental dichotomy.

But it's a false one.

Security and privacy are not opposite ends of a seesaw; you don't have to accept less of one to get more of the other. Think of a door lock, a burglar alarm and a tall fence. Think of guns, anti-counterfeiting measures on currency and that dumb liquid ban at airports. Security affects privacy only when it's based on identity, and there are limitations to that sort of approach.

Since 9/11, approximately three things have potentially improved airline security: reinforcing the cockpit doors, passengers realizing they have to fight back and -- possibly -- sky marshals. Everything else -- all the security measures that affect privacy -- is just security theater and a waste of effort.

By the same token, many of the anti-privacy "security" measures we're seeing -- national ID cards, warrantless eavesdropping, massive data mining and so on -- do little to improve, and in some cases harm, security. And government claims of their success are either wrong, or against fake threats.

The debate isn't security versus privacy. It's liberty versus control.

You can see it in comments by government officials: "Privacy no longer can mean anonymity," says Donald Kerr, principal deputy director of national intelligence. "Instead, it should mean that government and businesses properly safeguard people's private communications and financial information." Did you catch that? You're expected to give up control of your privacy to others, who -- presumably -- get to decide how much of it you deserve. That's what loss of liberty looks like.

It should be no surprise that people choose security over privacy: 51 to 29 percent in a recent poll. Even if you don't subscribe to Maslow's hierarchy of needs, it's obvious that security is more important. Security is vital to survival, not just of people but of every living thing. Privacy is unique to humans, but it's a social need. It's vital to personal dignity, to family life, to society -- to what makes us uniquely human -- but not to survival.

If you set up the false dichotomy, of course people will choose security over privacy -- especially if you scare them first. But it's still a false dichotomy. There is no security without privacy. And liberty requires both security and privacy. The famous quote attributed to Benjamin Franklin reads: "Those who would give up essential liberty to purchase a little temporary safety, deserve neither liberty nor safety." It's also true that those who would give up privacy for security are likely to end up with neither.

This essay originally appeared on Wired.com


Thursday, December 27, 2007

Identity Theft Cartoon 

Thanks to Schneier on Security for the link.


Thursday, November 29, 2007

Whole disk encryption made easy 

If you have a laptop, you should read Bruce Schneier's commentary in Wired: How Does Bruce Schneier Protect His Laptop Data? With His Fists -- and PGP.
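Whole-disk encryption is an operating-system-level job (PGP Whole Disk Encryption, BitLocker, dm-crypt and the like), but for a sense of the underlying idea, here is a minimal file-level sketch using the third-party Python cryptography package and its Fernet recipe. The file name and key handling are illustrative assumptions, not a recommendation; as Schneier's piece stresses, the hard part is protecting the key or passphrase, not running the cipher.

```python
# pip install cryptography  -- third-party package, assumed available
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> None:
    """Encrypt a single file to path + '.enc' (file-level illustration, not whole-disk)."""
    with open(path, "rb") as f:
        plaintext = f.read()
    token = Fernet(key).encrypt(plaintext)   # authenticated symmetric encryption
    with open(path + ".enc", "wb") as f:
        f.write(token)

def decrypt_file(path: str, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted file."""
    with open(path, "rb") as f:
        return Fernet(key).decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()              # in real life: derive from a strong passphrase
    encrypt_file("client-notes.txt", key)    # hypothetical file name
    print(decrypt_file("client-notes.txt.enc", key)[:40])
```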


Saturday, November 10, 2007

Salesforce.com leak leads to targeted phishing attacks

An employee of Salesforce.com was taken in by a phishing scam and had his credentials compromised. The fraudsters have since used data from the vast ASP in an attempt to defraud a handful of its users. See Schneier on Security: Targeted Phishing from Salesforce.com Leak and Salesforce.com Acknowledges Data Loss - Security Fix.


Thursday, October 18, 2007

NJ hospital suspends 27 for peeking at celebrity's medical record 

CNN is reporting that 27 employees of the Palisades Medical Center in North Bergen, New Jersey, have been suspended for a month without pay for looking at actor George Clooney's medical records without a valid reason for doing so. See: 27 suspended for Clooney file peek - CNN.com. (Via Schneier on Security: 27 Suspended for Looking at George Clooney's Personal Data.)


Friday, August 10, 2007

US unveils more privacy-friendly no-fly list 

Apparently the American government is about to implement its latest version of the no-fly list, without data mining using commercial sources. It looks a lot like the Canadian "Passenger Protect" program:

Even Bruce Schneier thinks it shows common sense.

Feds offer simpler flight screening plan on Yahoo! News

By MICHAEL J. SNIFFEN, Associated Press Writer

Thu Aug 9, 6:34 PM ET

The government proposed a new version of its airline passenger screening program Thursday, stripped of the data mining that aroused privacy concerns and led Congress to block earlier versions.

It's been three years since the Sept. 11 Commission recommended and Congress ordered that the government take over from the airlines the job of comparing passenger lists with watch lists of known terrorist suspects to keep them off flights. Even this new version of the Secure Flight program is open for public comment and will be tested this fall before it can be implemented fully in 2008.

The third version of the program, once known as CAPPS II, drew positive reviews from privacy advocates and members of Congress who had objected to more elaborate earlier versions. Congress enacted legislation blocking earlier plans to collect private commercial data — like credit card records or travel histories — about all domestic air travelers in an effort to predict which ones might be terrorists.

The new plan would require passengers to give their full name when they make their reservations — either in person, by phone or online. They also will be asked if they are willing to provide their date of birth and gender at that time to reduce the chance of false positive matches with names on the watch lists.

"Finally, this appears to have a coherent, narrow and rational focus," said James Dempsey of the Center for Democracy and Technology, a privacy advocacy group. "This is a vast improvement over what we've seen before."

Even Democrats in Congress were cautiously positive.

"They've been slow to admit that minimizing invasions and breaches of Americans' privacy is part of their job," said Senate Judiciary Committee Chairman Patrick Leahy, D-Vt. "We will evaluate these steps to see if they measure up."

House Homeland Security Chairman Bennie Thompson, D-Miss., said he hoped the administration would stay alert to privacy issues. "I am extremely disappointed it has taken three years and passage of several pieces of legislation to get us to step one."

Thompson added that he hoped it was a sign of foresight that the new plan was announced along with new screening arrangements for international travelers.

At a news conference at Reagan National Airport, Homeland Security Secretary Michael Chertoff also announced that starting six months from now airlines operating international flights will be required to send the government their passenger list data before the planes take off rather than afterward, as is now the case.

Earlier sharing of passenger information is designed to give U.S. authorities more time to identify terrorists like Richard Reid, who attempted to light a shoe bomb on a trans-Atlantic flight in December 2001, and keep them off planes.

"Now the airlines give us their manifests after the plane has left the ground and that is too late," Chertoff said.

The Homeland Security chief said he was unaware of any specific, credible threat against airlines. But based on recent car bomb attempts in Great Britain and public statements by terrorists, he repeated his view that "we are entering a period where the threat is somewhat heightened."

"Look at the history of al-Qaida," Chertoff said. "The airplane has been a consistent favorite target of theirs."

On the domestic side, transferring watch-list checks to Transportation Security Administration officers "should provide more security and more consistency, and thus reduce misidentifications" that have frustrated passengers, Chertoff said.

Existing screening has been widely ridiculed because people like Sen. Edward M. Kennedy, D-Mass., other members of Congress and even infants have been blocked from boarding or delayed because their names are similar to names on the lists.

Chertoff said the new domestic system will avoid activities envisioned earlier that raised privacy concerns.

"Secure Flight will not harm personal passenger privacy," Chertoff said. "It won't collect commercial data (about passengers). It will not assign risk scores and will not attempt to predict behaviors."

Such plans alarmed Congress so much that it barred implementing the program until it passed 10 tests to ensure privacy and accuracy. The Government Accountability Office, Congress' auditing arm, found the previous version failed almost all of them.

Currently, only a passenger's full name is required when reservations are made although date of birth and gender usually become known to transportation security officers later in the boarding process.

Transportation Security Administrator Kip Hawley said volunteering those two items earlier would reduce misidentifications in watch-list matching.

"With the full name, we can resolve 95 percent of the cases correctly. The date of birth adds 3.5 percent to that, and the gender adds another one percent," Hawley said.

Privacy advocates like Dempsey and Bruce Schneier, chief technology officer at the security company BT Counterpane, also were pleased with limits on how long most records will be kept. A check that produces no match — which will be the case for the vast majority of travelers — would be kept only seven days. A false positive match would be kept seven years. Confirmed matches would be kept 99 years.

"On the surface, it looks pretty good," Schneier said. "I'm cautiously optimistic. It's nice to see some common sense."


Tuesday, July 24, 2007

The "but I've got nothing to hide" argument 

Daniel Solove, of the George Washington University Law School, has written an interesting article on the "But I've got nothing to hide" argument. Here's a link to the download site and the introduction:

SSRN-'I've Got Nothing to Hide' and Other Misunderstandings of Privacy by Daniel Solove

INTRODUCTION

Since the September 11 attacks, the government has been engaging in extensive surveillance and data mining. Regarding surveillance, in December 2005, the New York Times revealed that after September 11, the Bush Administration secretly authorized the National Security Administration (NSA) to engage in warrantless wiretapping of American citizens’ telephone calls.2 As for data mining, which involves analyzing personal data for patterns of suspicious behavior, the government has begun numerous programs. In 2002, the media revealed that the Department of Defense was constructing a data mining project, called “Total Information Awareness” (TIA), under the leadership of Admiral John Poindexter. The vision for TIA was to gather a variety of information about people, including financial, educational, health, and other data. The information would then be analyzed for suspicious behavior patterns. According to Poindexter: “The only way to detect . . . terrorists is to look for patterns of activity that are based on observations from past terrorist attacks as well as estimates about how terrorists will adapt to our measures to avoid detection.”3 When the program came to light, a public outcry erupted, and the U.S. Senate subsequently voted to deny the program funding, ultimately leading to its demise. Nevertheless, many components of TIA continue on in various government agencies, though in a less systematic and more clandestine fashion.4

In May 2006, USA Today broke the story that the NSA had obtained customer records from several major phone companies and was analyzing them to identify potential terrorists.5 The telephone call database is reported to be the “largest database ever assembled in the world.”6 In June 2006, the New York Times reported that the U.S. government had been accessing bank records from the Society for Worldwide Interbank Financial Transactions (SWIFT), which handles financial transactions for thousands of banks around the world.7 Many people responded with outrage at these announcements, but many others did not perceive much of a problem. The reason for their lack of concern, they explained, was because: “I’ve got nothing to hide.”

The argument that no privacy problem exists if a person has nothing to hide is frequently made in connection with many privacy issues. When the government engages in surveillance, many people believe that there is no threat to privacy unless the government uncovers unlawful activity, in which case a person has no legitimate justification to claim that it remain private.

Thus, if an individual engages only in legal activity, she has nothing to worry about. When it comes to the government collecting and analyzing personal information, many people contend that a privacy harm exists only if skeletons in the closet are revealed. For example, suppose the government examines one’s telephone records and finds out that a person made calls to her parents, a friend in Canada, a video store, and a pizza delivery shop. “So what?” that person might say. “I’m not embarrassed or humiliated by this information. If anybody asks me, I’ll gladly tell them what stores I shop at. I have nothing to hide.”

The “nothing to hide” argument and its variants are quite prevalent in popular discourse about privacy. Data security expert Bruce Schneier calls it the “most common retort against privacy advocates.”8 Legal scholar Geoffrey Stone refers to it as an “all-too-common refrain.”9 The “nothing to hide” argument is one of the primary arguments made when balancing privacy against security. In its most compelling form, it is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security. Sometimes the “nothing to hide” argument is posed as a question: “If you have nothing to hide, then what do you have to fear?” Others ask: “If you aren’t doing anything wrong, then what do you have to hide?”

In this essay, I will explore the “nothing to hide” argument and its variants in more depth. Grappling with the “nothing to hide” argument is important, as the argument reflects the sentiments of a wide percentage of the population. In popular discourse, the “nothing to hide” argument’s superficial incantations can readily be refuted. But when the argument is made in its strongest form, it is far more formidable.

In order to respond to the “nothing to hide” argument, it is imperative that we have a theory about what privacy is and why it is valuable. At its core, the “nothing to hide” argument emerges from a conception of privacy and its value. What exactly is “privacy”? How valuable is privacy and how do we assess its value? How do we weigh privacy against countervailing values? These questions have long plagued those seeking to develop a theory of privacy and justifications for its legal protection. This essay begins in Part I by discussing the “nothing to hide” argument. First, I introduce the argument as it often exists in popular discourse and examine frequent ways of responding to the argument. Second, I present the argument in what I believe to be its strongest form. In Part II, I briefly discuss my work thus far on conceptualizing privacy. I explain why existing theories of privacy have been unsatisfactory, have led to confusion, and have impeded the development of effective legal and policy responses to privacy problems. In Part III, I argue that the “nothing to hide” argument—even in its strongest form—stems from certain faulty assumptions about privacy and its value. The problem, in short, is not with finding an answer to the question: “If you’ve got nothing to hide, then what do you have to fear?” The problem is in the very question itself.


Sunday, July 01, 2007

Respectful surveillance? 

Last week, Bruce Schneier linked to an article on "respectful cameras" that can recognize faces and obscure them with an oval. The oval can be removed in the event of an investigation. See: Schneier on Security: Surveillance Cameras that Obscure Faces. But one commentator says it's just the "illusion of privacy".
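The coverage describes the mechanics only loosely. As a rough, hedged sketch of the general idea (detect a face, paint an oval over it), here is what it might look like with the third-party OpenCV library. The Haar cascade detector and the note about retaining an original for investigators are standard techniques, not details of the actual "respectful cameras" project.

```python
# pip install opencv-python  -- third-party library, assumed available
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def obscure_faces(frame):
    """Return a copy of the frame with every detected face covered by a solid oval."""
    out = frame.copy()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        center = (x + w // 2, y + h // 2)
        axes = (w // 2, h // 2)
        cv2.ellipse(out, center, axes, 0, 0, 360, (0, 0, 0), -1)  # filled black oval
    # A real "respectful camera" would also have to store the unobscured frame under
    # strong access controls so it could be recovered for an investigation.
    return out

if __name__ == "__main__":
    frame = cv2.imread("bar-entrance.jpg")      # hypothetical input image
    cv2.imwrite("bar-entrance-masked.jpg", obscure_faces(frame))
```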


Wednesday, January 24, 2007

This is staggeringly stupid and dangerous 

A company has developed an RFID tattoo that offers all the benefits of RFID implantation without the messy chip: the chip is replaced by a tattoo. The company is touting its benefits for traceability in the meat supply, but is also suggesting that it may be useful for soldiers:

Industrial Control Designline RFID Ink

... The ink also could be used to track and rescue soldiers, Pydynowski said.

"It could help identify friends or foes, prevent friendly fire, and help save soldiers' lives," he said. "It's a very scary proposition when you're dealing with humans, but with military personnel, we're talking about saving soldiers' lives and it may be something worthwhile."

I can't imagine anything more dangerous than tagging all soldiers with a tracking device that may be hacked by the other side. Instead of saving lives, it may result in wholesale destruction. I wonder how long it would be before we saw RFID-activated IEDs? Not long, I expect.

Thanks to Schneier for the link.


Saturday, December 23, 2006

You shall know them by their sneakers 

Bruce Schneier recently linked to an interesting project that set up a surveillance system to track people using the Nike/iPod Sport Kit. The kit's intended use is to have your shoes talk to your iPod Nano to track your run. However, the shoes transmit a unique ID more than sixty feet to whoever may be listening in. With less than $250 in equipment, the researchers were able to track unwitting joggers.

As one commenter noted at Bruce's blog, perhaps we can't call them sneakers any more...
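To make the concern concrete: the surveillance side of this amounts to nothing more than logging each broadcast ID with a time and a receiver location and grouping the sightings. The hedged sketch below, with invented IDs and receiver names (nothing to do with the researchers' actual code), shows how little is needed to turn an "anonymous" serial number into a movement profile.

```python
from collections import defaultdict
from datetime import datetime

# Invented observations: (sensor transmitter ID, receiver location, timestamp).
sightings = [
    ("0x13F7A2", "park-entrance", datetime(2006, 12, 20, 7, 2)),
    ("0x13F7A2", "coffee-shop",   datetime(2006, 12, 20, 7, 41)),
    ("0x13F7A2", "office-lobby",  datetime(2006, 12, 20, 8, 15)),
]

def build_profiles(observations):
    """Group sightings by transmitter ID: a per-person movement log, no name required."""
    profiles = defaultdict(list)
    for tag_id, place, seen_at in observations:
        profiles[tag_id].append((seen_at, place))
    for track in profiles.values():
        track.sort()                      # chronological order reveals a daily routine
    return dict(profiles)

for tag_id, track in build_profiles(sightings).items():
    print(tag_id, "->", [f"{t:%H:%M} {p}" for t, p in track])
```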


Saturday, November 11, 2006

"Streeet Sweeper" deployed in British Columbia 

Thanks to a friend in BC for sending me this:

Police in the Lower Mainland of British Columbia have just completed a trial of license plate recognition technology and are planning a widespread rollout of the technology in Vancouver. It consists of a camera mounted atop a police car that looks up license plates, checks them against a database and alerts the officer at the wheel if the car is "suspicious". The technology can look up about 3,000 plates an hour, and police apparently find themselves overwhelmed by the number that turn up as suspicious - about one in fifty.
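Mechanically, this is just a high-speed lookup plus an alert; what makes it "wholesale" is the volume. A minimal sketch using invented plate numbers (only the 3,000-per-hour and one-in-fifty figures come from the reports): at those rates an officer would be handed roughly sixty alerts an hour to triage.

```python
# Invented plates for illustration; a real hotlist would come from police databases.
HOTLIST = {"ABC123": "stolen vehicle", "XYZ789": "unpaid fines"}

def check_plate(plate: str) -> str | None:
    """Return the reason a plate is flagged, or None if it is not on the hotlist."""
    return HOTLIST.get(plate.upper().replace(" ", ""))

def expected_alerts(plates_per_hour: int = 3000, hit_rate: float = 1 / 50) -> float:
    """Back-of-the-envelope: alerts per hour at the reported read and hit rates."""
    return plates_per_hour * hit_rate

if __name__ == "__main__":
    for plate in ("abc 123", "DEF456"):
        print(plate, "->", check_plate(plate) or "no hit")
    print("expected alerts/hour:", expected_alerts())   # 3000 * 1/50 = 60.0
```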

The technology has apparently had great success in the UK (you might remember this), and some people have concerns about the privacy aspects of what the BC Attorney General has characterised as a "street sweeper".

Sgt. Gord Elias said at a press conference: "The potential of ALPR is up to everyone's imagination. There is absolutely no end to what you could do with it."

Like any surveillance technology, it would be prone to "mission creep". It can target child abductors and terrorists, or people with unpaid parking tickets and those defaulting on student loans. Or it could be used for profiling, by flagging people who choose to drive in suspicious places at suspicious times. The reports I have seen do not say whether the system keeps a record of the cars looked up or the locations where this takes place. If it does, the police would be able to create a log of where you (or your car) have been and at what time.

There was no mention of privacy issues in any of the reports, or how these might have been addressed. I wonder whether a privacy impact assessment was carried out as part of the roll-out. Check out: CTV Video - licence plate scanning. And Vancouver Sun: 1 in 50 drivers 'commits crime' on roads.

The technology isn't exactly brand new. Check out what Bruce Schneier had to say when the technology was rolled out in Connecticut in 2004:

Schneier on Security: License Plate "Guns" and Privacy:

... On the face of it, this is nothing new. The police have always been able to run a license plate. The difference is they would do it manually, and that limited its use. It simply wasn't feasible for the police to run the plates of every car in a parking garage, or every car that passed through an intersection. What's different isn't the police tactic, but the efficiency of the process.

Technology is fundamentally changing the nature of surveillance.... It's wholesale surveillance.

And it disrupts the balance between the powers of the police and the rights of the people....

Like the license-plate scanners, the electronic footprints we leave everywhere can be automatically correlated with databases. The data can be stored forever, allowing police to conduct surveillance backwards in time.

The effects of wholesale surveillance on privacy and civil liberties is profound; but unfortunately, the debate often gets mischaracterized as a question about how much privacy we need to give up in order to be secure. This is wrong. It's obvious that we are all safer when the police can use all techniques at their disposal. What we need are corresponding mechanisms to prevent abuse, and that don't place an unreasonable burden on the innocent.

...

For license-plate scanners, one obvious protection is to require the police to erase data collected on innocent car owners immediately, and not save it. The police have no legitimate need to collect data on everyone's driving habits. Another is to allow car owners access to the information about them used in these automated searches, and to allow them to challenge inaccuracies.

We need to go further. Criminal penalties are severe in order to create a deterrent, because it is hard to catch wrongdoers. As they become easier to catch, a realignment is necessary. When the police can automate the detection of a wrongdoing, perhaps there should no longer be any criminal penalty attached. For example, both red light cameras and speed-trap cameras all issue citations without any "points" assessed against the driver.

Wholesale surveillance is not simply a more efficient way for the police to do what they've always done. It's a new police power, one made possible with today's technology and one that will be made easier with tomorrow's. And with any new police power, we as a society need to take an active role in establishing rules governing its use. To do otherwise is to cede ever more authority to the police.


Friday, August 25, 2006

Promiscuous pluggers beware 

Don't plug your USB drive in an unknown port! Not only could you get (or give) cooties, but Schneier on Security writes about software out in the wild that allows the covert copying of everything on your drive. Not a good thing for those who carry their lives on their USB drives and plug them into untrusted PCs in internet cafes, at conferences, at hotel business centres and the like.


Tuesday, June 20, 2006

How to Build a Low-Cost, Extended-Range RFID Skimmer 

My kids just taught me that Darth Vader can read my mind. (It appears to work even if you're wearing your tinfoil hat.) I thought that was bad. Now Schneier and Boing Boing are telling me that any nerd with a soldering iron and directions to Radio Shack (or the Force, I guess) can read the RFIDs in my pocket. What is the world coming to? Check this out: How to Build a Low-Cost, Extended-Range RFID Skimmer.


Sunday, June 11, 2006

Beware of strangers bearing USB drives 

The fact that Microsoft Windows will automatically run software from a USB drive with no user intervention is a well-known security vulnerability. For example, the autorun function is how the infamous Sony rootkit gets its hooks into your system. With this feature enabled (or, rather, not blocked) on PCs, it's an easy way for malware to be installed on your desktops via USB (a quick way to check whether autorun is restricted on your own machine is sketched at the end of this post). Read this chilling example:

Dark Reading - Host security - Social Engineering, the USB Way - Security:

... Once I seeded the USB drives, I decided to grab some coffee and watch the employees show up for work. Surveillance of the facility was worth the time involved. It was really amusing to watch the reaction of the employees who found a USB drive. You know they plugged them into their computers the minute they got to their desks.

I immediately called my guy that wrote the Trojan and asked if anything was received at his end. Slowly but surely info was being mailed back to him. I would have loved to be on the inside of the building watching as people started plugging the USB drives in, scouring through the planted image files, then unknowingly running our piece of software.

After about three days, we figured we had collected enough data. When I started to review our findings, I was amazed at the results. Of the 20 USB drives we planted, 15 were found by employees, and all had been plugged into company computers. The data we obtained helped us to compromise additional systems, and the best part of the whole scheme was its convenience. We never broke a sweat. Everything that needed to happen did, and in a way it was completely transparent to the users, the network, and credit union management.

Of all the social engineering efforts we have performed over the years, I always had to worry about being caught, getting detained by the police, or not getting anything of value. The USB route is really the way to go. With the exception of possibly getting caught when seeding the facility, my chances of having a problem are reduced significantly.

You’ve probably seen the experiments where users can be conned into giving up their passwords for a chocolate bar or a $1 bill. But this little giveaway took those a step further, working off humans' innate curiosity. Emailed virus writers exploit this same vulnerability, as do phishers and their clever faux Websites. Our credit union client wasn’t unique or special. All the technology and filtering and scanning in the world won’t address human nature. But it remains the single biggest open door to any company’s secrets.

Disagree? Sprinkle your receptionist's candy dish with USB drives and see for yourself how long it takes for human nature to manifest itself.

Also read Bruce Schneier on this avenue of attack: Schneier on Security: Hacking Computers Over USB.
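As a defensive aside (my hedged sketch, not anything from the Dark Reading piece): the autorun behaviour described above is governed on Windows by the NoDriveTypeAutoRun registry policy, where a value of 0xFF disables it for every drive type. The check below uses Python's standard winreg module and assumes it is run on a Windows machine.

```python
import winreg  # Windows-only standard library module

POLICY_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"
ALL_DRIVES_DISABLED = 0xFF  # bitmask covering every drive type

def autorun_policy(hive=winreg.HKEY_LOCAL_MACHINE):
    """Return the NoDriveTypeAutoRun bitmask, or None if the policy is not set."""
    try:
        with winreg.OpenKey(hive, POLICY_KEY) as key:
            value, _type = winreg.QueryValueEx(key, "NoDriveTypeAutoRun")
            return value
    except FileNotFoundError:
        return None  # key or value absent: the OS default applies

if __name__ == "__main__":
    value = autorun_policy()
    if value is None:
        print("NoDriveTypeAutoRun is not set; autorun may be enabled for some drives.")
    elif value & ALL_DRIVES_DISABLED == ALL_DRIVES_DISABLED:
        print("Autorun is disabled for all drive types.")
    else:
        print(f"Autorun is only partially restricted (mask=0x{value:02X}).")
```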


Thursday, May 18, 2006

Schneier on The Eternal Value of Privacy 

Run, do not walk, to read this very interesting comment by Bruce Schneier: Wired News: The Eternal Value of Privacy. Here's a taste:

The most common retort against privacy advocates -- by those in favor of ID checks, cameras, databases, data mining and other wholesale surveillance measures -- is this line: "If you aren't doing anything wrong, what do you have to hide?"

Some clever answers: "If I'm not doing anything wrong, then you have no cause to watch me." "Because the government gets to define what's wrong, and they keep changing the definition." "Because you might do something wrong with my information." My problem with quips like these -- as right as they are -- is that they accept the premise that privacy is about hiding a wrong. It's not. Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.


Tuesday, February 28, 2006

Facial recognition for banning bar "troublemakers" 

Bars seem to be on the cutting edge of identification technology. Regular readers of the blog probably have noted references to bars scanning identification documents of visitors and some using external databases to keep track of banned patrons. (see Swiping driver's licenses - instant marketing lists?, Calgary student challenges nightclub over scanning ID, Alberta bar to continue scanning IDs despite Commissioner's advice not to, New technologies for scanning IDs.) Now, Wired News is reporting on facial recognition software that takes a picture of visitors to bars and matches them against a database of banned patrons. The technology was born in Toronto, Canada:

Wired News: BioBouncer Might Make Bars Safer

Privacy watchdog groups, however, don't like the sound of it, and it's not clear club patrons will dig it, either. Many people are already accustomed, or oblivious, to cameras recording their every move at ATMs and 7-11s. But in a bar's let-loose environment the sign Dussich wants posted at the entrance announcing that BioBouncer is recording their faces might send customers running.

Lee Tien, a staff attorney with the Electronic Frontier Foundation, said people may find BioBouncer insulting or invasive. Facial recognition software is notoriously inaccurate, he said, and he is concerned that data-sharing could be used to blackball innocent partiers.

"Think about it: Someone doesn't like you, your photo gets in there, you walk in someplace and they're telling you, 'You're a troublemaker, you got bounced from that other bar.'"

BioBouncer was born when a Toronto club owner asked if Dussich could help curb a burgeoning crime problem. Dussich may be on to something, as crime is plaguing the club scene nationwide, said Robert Smith, a police officer and nightclub security expert, who runs the Hospitality and Security Alliance.

Update: Bruce Schneier has some things to say about this:

Schneier on Security: Face Recognition Comes to Bars:

And the data will be owned by the bars that collect it. They can choose to erase it, or they can choose to sell it to data aggregators like Acxiom.

It's rarely the initial application that's the problem. It's the follow-on applications. It's the function creep. Before you know it, everyone will know that they are identified the moment they walk into a commercial building. We will all lose privacy, and liberty, and freedom as a result.



Saturday, January 28, 2006

RFID Cartoon 

Thanks to Schneier on Security for leading me to this great cartoon.

It's funny because it's true.


Saturday, December 24, 2005

New TSA passenger screening guidelines, courtesy of the Onion 

Here are the latest TSA guidelines for the traveling public, courtesy of The Onion, which bills itself as America's finest news source. Thanks to Schneier on Security for the link.


Story about feds visiting after request for Mao book is a hoax 

From the same source that originally reported the story comes news that the story about a visit from federal agents following an interlibrary loan request for Mao's Little Red Book is a hoax. The student now admits making up the story:

Federal agents' visit was a hoax: 12/ 24/ 2005

Student admits he lied about Mao book

By AARON NICODEMUS, Standard-Times staff writer

NEW BEDFORD -- The UMass Dartmouth student who claimed to have been visited by Homeland Security agents over his request for "The Little Red Book" by Mao Zedong has admitted to making up the entire story.

The 22-year-old student tearfully admitted he made the story up to his history professor, Dr. Brian Glyn Williams, and his parents, after being confronted with the inconsistencies in his account.

Had the student stuck to his original story, it might never have been proved false.

But on Thursday, when the student told his tale in the office of UMass Dartmouth professor Dr. Robert Pontbriand to Dr. Williams, Dr. Pontbriand, university spokesman John Hoey and The Standard-Times, the student added new details.

The agents had returned, the student said, just last night. The two agents, the student, his parents and the student's uncle all signed confidentiality agreements, he claimed, to put an end to the matter.

But when Dr. Williams went to the student's home yesterday and relayed that part of the story to his parents, it was the first time they had heard it. The story began to unravel, and the student, faced with the truth, broke down and cried.

It was a dramatic turnaround from the day before.

For more than an hour on Thursday, he spoke of two visits from Homeland Security over his inter-library loan request for the 1965, Peking Press version of "Quotations from Chairman Mao Tse-Tung," which is the book's official title.

His basic tale remained the same: The book was on a government watch list, and his loan request had triggered a visit from an agent who was seeking to "tame" reading of particular books. He said he saw a long list of such books.

In the days after its initial reporting on Dec. 17 in The Standard-Times, the story had become an international phenomenon on the Internet. Media outlets from around the world were requesting interviews with the students, and a number of reporters had been asking UMass Dartmouth students and professors for information....

I reported on the original story (The Canadian Privacy Law Blog: Borrow the wrong book and get it personally delivered by the feds), as did hundreds of other blogs, assuming it to be true. Well, it simply was not true, which shows the risk of believing what you read on a blog, or in the conventional media (since the story originated with The Standard-Times of New Bedford, Massachusetts).

I tend to agree with most of what Bruce Schneier observes on this latest turn of events:

"I don't know what the moral is, here. 1) He's an idiot. 2) Don't believe everything you read. 3) We live in such an invasive political climate that such stories are easily believable. 4) He's definitely an idiot."

I won't tell which parts I agree with most ...


Wednesday, December 14, 2005

Korea Solves the Identity Theft Problem 

Rob Hyndman is pointing to Schneier on Security: Korea Solves the Identity Theft Problem. Apparently, Korea is about to pass a law placing full responsibility on banks for losses from identity theft and online financial fraud, even if the bank is only partially responsible. This will give banks an incentive to put fraud-blocking measures in place.

The next questions are: (i) will it work? and (ii) will it only be a Korean phenomenon?


Thursday, November 24, 2005

Entertainment industry accused of 'trying to hijack data retention directive' 

Many people are willing to sacrifice some privacy to gain increased security. In this "age of terrorism", initiatives such as the European Data Retention Directive and the Canadian Lawful Access proposals seem more palatable when we are told they are essential to protecting against serious crimes such as terrorism. The Data Retention Directive has consistently been "sold" as being limited to protecting the continent against terrorism. Now, representatives of the entertainment industry are asking that the retained information be made available for investigations of copyright and other IP violations. Critics are saying that the entertainment industry is trying to hijack the directive. See: Entertainment industry 'trying to hijack data retention directive' - ZDNet UK News.

Also, check out the discussion on Slashdot: Slashdot | Music Industry 'trying to hijack EU data laws'.

Update (20051127) from Schneier on Security: European Terrorism Law and Music Downloaders:

"Our society definitely needs a serious conversation about the fundamental freedoms we are sacrificing in a misguided attempt to keep us safe from terrorism. It feels both surreal and sickening to have to defend out fundamental freedoms against those who want to stop people from sharing music. How is possible that we can contemplate so much damage to our society simply to protect the business model of a handful of companies."


Saturday, November 19, 2005

Cartoon: False sense of security 

Thanks to Bruce Schneier for pointing to this great cartoon: False sense of security.


Thursday, November 03, 2005

Wired News: Fatal Flaw Weakens RFID Passports 

Bruce Schneier has a great article at Wired News on the new RFID-enabled passports that the US government is introducing. It chronicles the security problems and the (halfway) solutions offered by the US State Department. It is very interesting reading, both for those interested in the actual project and for those interested in the problems that can arise in projects with privacy issues that require a high level of technical expertise:

Wired News: Fatal Flaw Weakens RFID Passports

"...The State Department has done a great job addressing specific security and privacy concerns, but its lack of technical skills is hurting it. The collision-avoidance ID is just one example of where, apparently, the State Department didn't have enough of the expertise it needed to do this right.

Of course it can fix the problem, but the real issue is how many other problems like this are lurking in the details of its design? We don't know, and I doubt the State Department knows either. The only way to vet its design, and to convince us that RFID is necessary, would be to open it up to public scrutiny.

The State Department's plan to issue RFID passports by October 2006 is both precipitous and risky. It made a mistake designing this behind closed doors. There needs to be some pretty serious quality assurance and testing before deploying this system, and this includes careful security evaluations by independent security experts. Right now the State Department has no intention of doing that; it's already committed to a scheme before knowing if it even works or if it protects privacy."


Saturday, October 29, 2005

Through-the-wall audio surveillance 

Thanks to Bruce Schneier (Schneier on Security: Eavesdropping Through a Wall) for directing me to an interesting US patent application for through-the-wall audio surveillance technology that was developed in association with NASA: United States Patent Application: 0050220310.


Thursday, October 20, 2005

Schneier on Private Webcams and the Police 

Since the Corona (Southern California) Chamber of Commerce and the local police have begun asking local businesses to provide law enforcement with access to the feeds from their web-enabled security cameras, Bruce Schneier asks how long it will be before a law is passed requiring a backdoor for police: Schneier on Security: Private Webcams and the Police.

"Lawful access" to the next level? "We're just keeping up with technology. In the olden days, we'd be able to post a cop in front of your store so this is no different ..."


Sunday, August 28, 2005

Privacy Risks of Used Cell Phones 

A few months ago, I used a loaner Blackberry for a week or so. When I was bored and fiddling around with it, I discovered the "saved messages" folder on the device had about a dozen e-mails in it from a previous user. Not good. I deleted them all and then did a bit of research to make sure that I didn't leave any data behind when I returned it.

This has just happened on a massive scale, according to Schneier on Security. He's blogging about a recent incident that has more than a few cellular customers hopping mad. When trading up, customers of a certain cellular provider were asked if they wanted to donate their older phones to charities, such as local women's shelters. The phones ended up on eBay, and the company didn't even bother purging them of data. Not good in more ways than one. See Schneier on Security: Privacy Risks of Used Cell Phones.


Friday, July 29, 2005

CardSystems made its choices clear 

The chorus in favour of stronger privacy protections is getting louder. Daniel Handson at SecurityFocus has written an opinion piece calling for stronger laws to deal with incidents like the CardSystems breach:

CardSystems made its choices clear

"... The latest news in this escapade is that CardSystems has now lost the contracts it had, and also faces corporate extinction. Now some reading this may be cheering a little, or perhaps a lot, at the karmic balance of CardSystems potentially paying the ultimate price for their cavalier attitude. However other people are suggesting that this corporate extinction might come as a result of misguided notification laws implemented in California, and that without the mandated public disclosure and the resulting firestorm of controversy, the company could have fixed its problems quietly and kept on serving its shareholders and customers. I think that both of these views are misguided and miss the truth.

CardSystems violated a contractual agreement that was put in place by the companies it served. It's that simple. CardSystems kept data in an insecure fashion, with no concern given to the minimum security and encryption standards that it was required to implement. I fail to see why legislation on data protection would change this situation. CardSystems was already required to maintain a certain level of security and failed to do that. In one report, Bruce Schneier, mentioned that this was a common problem with contractual obligations: the fact that auditing is hard. Therefore I cannot see why changing a contractual agreement into a legislated law will make auditing any easier. To draw another comparison, did the fact that they were violating laws affect the behavior of the people at Enron?

Many companies have a long way to go in the security world, and yet the one sector of our civilian society that tends to get information security is the banking and financial industry. Sure they aren’t perfect, but in my experience they are heads and tails better than almost anyone else that I deal with at understanding data privacy. In the case of CardSystems, however, the industry insisted that minimum standards be maintained, outlined what those minimum standards were, and yet much of that was ignored. CardSystems, if it does go bankrupt, will have done so because they willfully violated a contractual obligation, not because of disclosure laws, or public pressure. Would you use a company that had willfully violated previous contracts? Would you want your credit card company to supply your data to that company? I cannot see why repealing disclosure laws and helping to mitigate the lynch mob mentality that can follow a mistake changes the fact that CardSystems violated a contract, and that contract violation is what has brought about this imminent death. I await the forthcoming laws that attempt to prevent something like this from ever happening again. Meanwhile, I continue to check my credit-card statement, bank statements and never give out my Social Insurance Number (or SSN) unless I absolutely have to. I wonder if any of the legislators who are outraged by this would give me their mother’s maiden name, birth-date and the name of their first pet? ..."


Thursday, July 28, 2005

Automatic Surveillance Via Cell Phone 

Bruce Schneier always has interesting things to say about privacy and security. Today, he points to a research project carried out at MIT in which volunteers allowed their cell phones to report back tracking data. The aggregated data was mined to reveal interesting insights into the individual phone users.

Schneier on Security: Automatic Surveillance Via Cell Phone:

"...This is worrisome from a number of angles: government surveillance, corporate surveillance for marketing purposes, criminal surveillance. I am not mollified by this comment:

People should not be too concerned about the data trails left by their phone, according to Chris Hoofnagle, associate director of the Electronic Privacy Information Center.

'The location data and billing records is protected by statute, and carriers are under a duty of confidentiality to protect it,' Hoofnagle said.

We're building an infrastructure of surveillance as a side effect of the convenience of carrying our cell phones everywhere."

There's some interesting discussion in the post's comments, too.


Saturday, July 23, 2005

Changing credit card numbers won't help 

Over at Schneier on Security, there's been a bit of a discussion in the comments about how to deal with the increasingly reported security incidents involving credit card processors. One commentator suggested a novel approach to protecting his own accounts:

Schneier on Security: Visa and Amex Drop CardSystems:

"Me? I request replacement credit and debit card numbers every six months, and watch my account activity carefully."

Interestingly, Dr. Don at Bankrate.com just fielded a question on the practice:

Changing credit card numbers won't help:

"Dear Kim,

Your idea about rotating credit card numbers is inventive but it could actually wind up increasing the odds that you find yourself a victim of identity theft or credit card theft. Getting a new credit card number every quarter would mean that you will have credit cards in your mailbox four times a year vs. once every three to four years, and fraud programs that recognize when your spending patterns don't jibe with past purchases aren't going to be effective, because the account won't have a transaction history for comparison.

It's also likely to hurt your credit rating because your credit history will show a series of accounts closed at your request every three months -- unless the series of account numbers is treated as a single account relationship by the credit card provider. For this to happen it would have to be a practice established by the credit card provider in reporting your history to the credit bureaus. It isn't something that you can do on your own...."


Thursday, June 09, 2005

Public Disclosure of Personal Data Loss 

Since the ChoicePoint fiasco, the hot topic in privacy has been the question of public notification of security breaches. California has led the way on this, and many state and federal legislators are looking to follow California's lead. The federal Privacy Commissioner in Canada has suggested that notification should be given, but our privacy laws contain no such obligation (except for Ontario's Personal Health Information Protection Act).

Bruce Schneier always has interesting things to say, and this topic is no exception:

Schneier on Security: Public Disclosure of Personal Data Loss:

"... As a security expert, I like the California law for three reasons. One, data on actual intrusions is useful for research. Two, alerting individuals whose data is lost or stolen is a good idea. And three, increased public scrutiny leads companies to spend more effort protecting personal data.

Think of it as public shaming. Companies will spend money to avoid the PR cost of public shaming. Hence, security improves.

This works, but there's an attenuation effect going on. As more of these events occur, the press is less likely to report them. When there's less noise in the press, there's less public shaming. And when there's less public shaming, the amount of money companies are willing to spend to avoid it goes down...."

The attenuation effect may be true, but I don't think we've peaked on this yet. If you search Google News for "citigroup tape", you get well over 360 news stories about the incident. Eventually the media's interest will trail off, but I don't think it has happened yet.


Tuesday, June 07, 2005

Data Aggregators: A Study on Data Quality and Responsiveness 

PrivacyActivism.org has conducted a study of the accuracy of the information contained in the records of ChoicePoint and Acxiom.

In light of the use that employers and others make of this information, the results are troubling....

Data Aggregators: A Study on Data Quality and Responsiveness:

"...100% of the reports given out by ChoicePoint had at least one error in them. Error rates for basic biographical data (including information people had to submit in order to receive their reports) fared almost as badly: Acxiom had an error rate of 67% and ChoicePoint had an error rate of 73%. In other words, the majority of participants had at least one such significant error in their reported biographical data from each data broker...."

Thanks to Schneier on Security: Accuracy of Commercial Data Brokers for the link.


U.S. Medical Privacy Law Gutted 

There are a number of articles on this that I was going to link to, but once again Schneier on Security has it all summed up:

Schneier on Security: U.S. Medical Privacy Law Gutted:

"In the U.S., medical privacy is largely governed by a 1996 law called HIPAA. Among many other provisions, HIPAA regulates the privacy and security surrounding electronic medical records. HIPAA specifies civil penalties against companies that don't comply with the regulations, as well as criminal penalties against individuals and corporations who knowingly steal or misuse patient data...."


Monday, May 09, 2005

Schneier on REAL ID 

Bruce Schneier is a consistently good source of information on privacy and security. Today, he has posted a summary of the issues related to the proposed American REAL ID Act. This law imposes a uniform standard on states' driver's licenses, which states may object to as being outside the federal government's jurisdiction, a thinly veiled anti-immigrant measure and the first step toward a national ID card: Schneier on Security: REAL ID.


Friday, May 06, 2005

The Five Most Shocking Things About the ChoicePoint Debacle 

Thanks to Bruce Schneier (Schneier on Security: Lessons of the ChoicePoint Theft) for the pointer to this very interesting essay from CSO (Chief Security Officer) magazine about the ChoicePoint and related privacy incidents: The Five Most Shocking Things About the ChoicePoint Debacle - CSO Magazine - May 2005.


Tuesday, April 19, 2005

Describing privacy 

When I give presentations and teach about privacy, I always start with a discussion of "what is privacy?" The concept means very different things to different people, depending upon their background and the baggage they bring to the discussion. To help us wade through this, Daniel Solove of George Washington University Law School has written an article in the U. Penn Law Review that addresses the vocabulary and taxonomy of the slippery concept of privacy:

SSRN-A Taxonomy of Privacy by Daniel Solove:

"Privacy is a concept in disarray. Nobody can articulate what it means. As one commentator has observed, privacy suffers from 'an embarrassment of meanings.' Privacy is far too vague a concept to guide adjudication and lawmaking, as abstract incantations of the importance of 'privacy' do not fare well when pitted against more concretely-stated countervailing interests.

In 1960, the famous torts scholar William Prosser attempted to make sense of the landscape of privacy law by identifying four different interests. But Prosser focused only on tort law, and the law of information privacy is significantly more vast and complex, extending to Fourth Amendment law, the constitutional right to information privacy, evidentiary privileges, dozens of federal privacy statutes, and hundreds of state statutes. Moreover, Prosser wrote over 40 years ago, and new technologies have given rise to a panoply of new privacy harms.

A new taxonomy to understand privacy violations is thus sorely needed. This article develops a taxonomy to identify privacy problems in a comprehensive and concrete manner. It endeavors to guide the law toward a more coherent understanding of privacy and to serve as a framework for the future development of the field of privacy law. "

Thanks to Bruce Schneier for the link: Schneier on Security: A Taxonomy of Privacy.

Labels: , ,

Tuesday, April 05, 2005

Technological responses to ID theft 

Thanks to the ever-useful beSpacific for the link to a new US Treasury Department Report on The Use of Technology to Combat Identity Theft. A pretty hefty 116 pages, but an interesting addition to the library.

On a related note, thanks to Schneier on Security for leading me to an equally lengthy research report by the London School of Economics on the proposed national ID card scheme for the UK. It tips the scales at 117 pages and also promises to be a good read while I curl up in front of the fireplace this evening.

Labels: , ,

Thursday, March 10, 2005

Schneier on Security: ChoicePoint Says "Please Regulate Me" 

Schneier on Security has posted an extract from ChoicePoint's most recent 8K filing with the SEC and suggests that the company is just crying out to be regulated. The post itself is worth reading, but there are also a wide range of comments posted that are also worth a look: Schneier on Security: ChoicePoint Says "Please Regulate Me".

Labels: , ,

Tuesday, March 01, 2005

Schneier on Security: Choicepoint's CISO Speaks 

Bruce Schneier has some interesting comments flowing from an interview with the CISO of ChoicePoint that appeared in SearchSecurity.com:

Schneier on Security: Choicepoint's CISO Speaks: "Choicepoint's CISO Speaks Richard Baich, Choicepoint's CISO, is interviewed on SearchSecurity.com:
This is not an information security issue. My biggest concern is the impact this has on the industry from the standpoint that people are saying ChoicePoint was hacked. No we weren't. This type of fraud happens every day.

Nice spin job, but it just doesn't make sense. This isn't a computer hack in the traditional sense, but it's a social engineering hack of their system. Information security controls were compromised, and confidential information was leaked.

It's created a media frenzy; this has been mislabeled a hack and a security breach. That's such a negative impression that suggests we failed to provide adequate protection. Fraud happens every day. Hacks don't.

So, Choicepoint believes that providing adequate protection doesn't include preventing this kind of attack. I'm sure he's exaggerating when he says that 'this type of fraud happens every day' and 'fraud happens every day,' but if it's true then Choicepoint has a huge information security problem."

The article and interview are worth reading on their own, as well.

Labels: , ,

Sunday, February 27, 2005

NYT: Some Sympathy for Paris Hilton 

The most recent Sunday New York Times has an article on the past week in privacy. Both the Paris Hilton and ChoicePoint incidents are discussed. The Times also quotes Bruce Schneier, the author of Schneier on Security.

The New York Times > Week in Review > Some Sympathy for Paris Hilton:

"...But the implications of the problem at ChoicePoint are enormous, said Daniel J. Solove, an associate professor of law at George Washington University and author of 'The Digital Person: Technology And Privacy in The Information Age.' The company, he noted, has collected information on practically every adult American, and 'these are dossiers that J. Edgar Hoover would be envious of.' Government has looked into ways to mine commercial data to detect patterns of suspicious activity, he noted, and it will continue to do so. But who watches the watchers? Lawmakers like Senators Charles Schumer of New York and Dianne Feinstein of California are calling for tighter regulation of data brokers. That would be a good idea, said Marc Rotenberg, executive director of the Electronic Privacy Information Center in Washington. 'It's a big, largely unregulated industry that doesn't bear consequences when things go wrong.' Even those who pursue fame, he noted, deserve a measure of privacy...."

Labels: , ,

Wednesday, February 23, 2005

Schneier on Security: ChoicePoint 

Bruce Schneier has a good comment on the ChoicePoint fiasco and the lessons to be learned about incident response:

Schneier on Security: ChoicePoint:

"...This story would have never been made public if it were not for SB 1386, a California law requiring companies to notify California residents if any of a specific set of personal information is leaked.

ChoicePoint's behavior is a textbook example of how to be a bad corporate citizen. The information leakage occurred in October, and it didn't tell any victims until February. First, ChoicePoint notified 30,000 Californians and said that it would not notify anyone who lived outside California (since the law didn't require it). Finally, after public outcry, it announced that it would notify everyone affected...."

Labels: ,

You too can be hacked when the answer to your secret question is the name of your famous, book-writing dog 

How secret is your "secret question" when you are famous for being famous and your life is an open book? It is looking more and more like Paris Hilton's Sidekick II was hacked into thanks to really, really bad password protection. Or, as MacDevCenter points out, a really obvious "secret question" designed to make it really easy for users who have forgotten their passwords.

"Like many online service providers, T-Mobile.com requires users to answer a 'secret question' if they forget their passwords. For Hilton's account, the secret question was 'What is your favorite pet's name?' By correctly providing the answer, any internet user could change Hilton's password and freely access her account. "

Apparently her dog, Tinkerbell, is almost as famous as she is. He is an author (The Tinkerbell Hilton Diaries: My Life Tailing Paris Hilton), a fashion accessory and a dog-about-town. Anybody with more interest in inane celebrities than I have would have been able to get her secret question and log into the T-Mobile system.

For a good review of the inherent weakness of these systems, see Schneier on Security: The Curse of the Secret Question.
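
To make the weakness concrete, here is a minimal sketch of my own (the names and flow are invented for this post and are not a description of T-Mobile's, or anyone else's, actual reset system): when the "secret answer" is public knowledge, a question-based password reset amounts to no authentication at all.

```python
# Toy model of a "secret question" password reset.
# Purely illustrative -- the account names and flow are invented and are not
# based on any real provider's implementation.

accounts = {
    "celebrity": {
        "password": "s0mething-strong",
        "secret_question": "What is your favorite pet's name?",
        "secret_answer": "tinkerbell",  # widely known to the public
    }
}

def reset_password(username: str, answer: str, new_password: str) -> bool:
    """Reset the password if the supplied 'secret answer' matches.

    The flaw: the reset path bypasses the password entirely, so the
    account is only as secure as the secrecy of the answer.
    """
    account = accounts.get(username)
    if account is None:
        return False
    if answer.strip().lower() == account["secret_answer"]:
        account["password"] = new_password
        return True
    return False

# An attacker who merely knows the famous pet's name now owns the account.
if reset_password("celebrity", "Tinkerbell", "attacker-chosen"):
    print("Password reset with publicly known information -- account taken over.")
```

Schneier's post suggests treating the answer like a second password -- something random that only you know -- rather than a true fact about your life.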

Labels: ,

Monday, February 21, 2005

Paris Hilton's Sidekick gets hacked 

The Internet is abuzz this morning with the exciting contents of Paris Hilton's T-Mobile Sidekick. It appears that someone hacked into the T-Mobile system and was able to get the contents of her address book, notepad and the photos she had taken with the gadget. Most of the links earlier today were to the photos themselves, which are not "safe for work".

Most of the discussion about it suggests that it may be related to the recent hacking of T-Mobile's systems (see PIPEDA and Canadian Privacy Law: Incident(s): Hacker breaches T-Mobile systems, reads US Secret Service email), but it could just as easily have been a result of someone guessing her password and accessing the system via the T-Mobile login page. I wouldn't be surprised if her password was "password".

This incident does, however, highlight the vulnerability of personal information when it is in the possession of third parties. Our e-mail and address books are held by Yahoo! or Hotmail or whoever. Our voice mail resides on some telco server and our instant messages are archived. It used to be that the bad guys had to break into our homes and offices for this stuff. Now they just have to hack into one of dozens of systems. (See Schneier on Security: T-Mobile Hack).

For (safe for work) coverage of the incident, see Paris Hilton's Sidekick gets hacked. What is T-Mobile going to do about it? - Engadget - www.engadget.com and Hackers post Paris Hilton's address book online - Computerworld:

"Hackers post Paris Hilton's address book online

A copy of her T-Mobile USA cell phone address book appeared on the Web

News Story by Paul Roberts

FEBRUARY 21, 2005 (IDG NEWS SERVICE) - Hackers penetrated the crystalline ranks of Hollywood celebrity Saturday, posting the cellular phone address book of hotel heiress and celebrity Paris Hilton on a Web page and passing the phone numbers and e-mail addresses of some of Tinsel Town's hottest stars into the public realm.

A copy of Hilton's T-Mobile USA Inc. cell phone address book appeared on the Web site of a group calling itself 'illmob.' The address book contains information on over 500 of Hilton's acquaintances, including super celebrities such as Eminem and Christina Aguilera. It is not known how the information was obtained, but the release of the contact book may be further fallout from a hack of T-Mobile's servers that came to light in January...."

Labels: , ,

Saturday, December 11, 2004

Insightful blog-post on the nature of privacy 

Bruce Schneier, one of the leading thinkers on security, has recently had some interesting things to say about privacy. In my experience, most IT types usually think about privacy as being primarily a security issue: you keep information private by keeping the baddies out. But privacy is more than that. It's about giving people control over their own personal information....

Schneier on Security: The Digital Person:

"Last week, I stayed at the St. Regis hotel in Washington, DC. It was my first visit, and the management gave me a questionnaire, asking me things like my birthday, my spouse's name and birthday, my anniversary, and my favorite fruits, drinks, and sweets. The purpose was clear; the hotel wanted to be able to offer me a more personalized service the next time I visited. And it was a purpose I agreed with; I wanted more personalized service. But I was very uneasy about filling out the form.

It wasn't that the information was particularly private. I make no secret of my birthday, or anniversary, or food preferences. Much of that information is even floating around the Web somewhere. Secrecy wasn't the issue.

The issue was control. In the United States, information about a person is owned by the person who collects it, not by the person it is about. There are specific exceptions in the law, but they're few and far between. There are no broad data protection laws, as you find in the European Union. There are no Privacy Commissioners, as you find in Canada. Privacy law in the United States is largely about secrecy: if the information is not secret, there's little you can do to control its dissemination...."

If you aren't a regular reader of Schneier on Security, I highly recommend adding it to your blogroll.

Labels: ,

Sunday, October 24, 2004

Bruce Schneier on RFID passports 

Most followers of computer security and privacy news know about Bruce Schneier. (He is the author and editor of the Crypto-Gram newsletter and of Beyond Fear: Thinking Sensibly about Security in an Uncertain World.) In his recent blog entry about the security and privacy issues related to the reports that RFID will be added to American passports (see Wired News: American Passports to Get Chipped), he very clearly articulates the perceived privacy risks of adding this technology to passports. I would only add that the same risks are inherent in adding RFID to any identity document.

Schneier on Security: RFID Passports:

"October 04, 2004
RFID Passports

... But the Bush administration is advocating radio frequency identification (RFID) chips for both U.S. and foreign passports, and that's a very bad thing.

These chips are like smart cards, but they can be read from a distance. A receiving device can "talk" to the chip remotely, without any need for physical contact, and get whatever information is on it. Passport officials envision being able to download the information on the chip simply by bringing it within a few centimeters of an electronic reader.

Unfortunately, RFID chips can be read by any reader, not just the ones at passport control. The upshot of this is that travelers carrying around RFID passports are broadcasting their identity.

Think about what that means for a minute. It means that passport holders are continuously broadcasting their name, nationality, age, address and whatever else is on the RFID chip. It means that anyone with a reader can learn that information, without the passport holder's knowledge or consent. It means that pickpockets, kidnappers and terrorists can easily--and surreptitiously--pick Americans or nationals of other participating countries out of a crowd.

...

The Bush administration is deliberately choosing a less secure technology without justification. If there were a good offsetting reason to choose that technology over a contact chip, then the choice might make sense.

Unfortunately, there is only one possible reason: The administration wants surreptitious access themselves. It wants to be able to identify people in crowds. It wants to surreptitiously pick out the Americans, and pick out the foreigners. It wants to do the very thing that it insists, despite demonstrations to the contrary, can't be done.

Normally I am very careful before I ascribe such sinister motives to a government agency. Incompetence is the norm, and malevolence is much rarer. But this seems like a clear case of the Bush administration putting its own interests above the security and privacy of its citizens, and then lying about it."
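
To illustrate Schneier's point about remote readability, here is a deliberately oversimplified sketch of my own (not a model of the actual passport specification or of any real RFID library): a contact chip releases its data only when the holder physically presents the document, while a contactless chip answers any reader that comes within range, with or without the holder's knowledge.

```python
# Oversimplified thought experiment contrasting a contact chip with a
# contactless (RFID) chip. Illustrative only; real e-passports involve
# access-control protocols and are far more complicated than this.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ContactChip:
    holder_data: str

    def read(self, physically_presented: bool) -> Optional[str]:
        # Data is released only when the holder hands the document over.
        return self.holder_data if physically_presented else None

@dataclass
class ContactlessChip:
    holder_data: str
    range_cm: float = 10.0  # assumed read range, chosen for the illustration

    def read(self, reader_distance_cm: float) -> Optional[str]:
        # Any reader within range gets an answer; the holder's knowledge
        # or consent never enters into it.
        if reader_distance_cm <= self.range_cm:
            return self.holder_data
        return None

passport = ContactlessChip("name, nationality, date of birth, photo")
print(passport.read(reader_distance_cm=5.0))  # border control -- intended read
print(passport.read(reader_distance_cm=8.0))  # stranger in a crowd -- also works
```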

Labels: , ,
