The Canadian Privacy Law Blog: Developments in privacy law and writings of a Canadian privacy lawyer, containing information related to the Personal Information Protection and Electronic Documents Act (aka PIPEDA) and other Canadian and international laws.
The author of this blog, David T.S. Fraser, is a Canadian privacy lawyer who practices with the firm of McInnes Cooper. He is the author of the Physicians' Privacy Manual. He has a national and international practice advising corporations and individuals on matters related to Canadian privacy laws.
For full contact information and a brief bio, please see David's profile.
The views expressed herein are solely the author's and should not be attributed to his employer or clients. Any postings on legal issues are provided as a public service, and do not constitute solicitation or provision of legal advice. The author makes no claims, promises or guarantees about the accuracy, completeness, or adequacy of the information contained herein or linked to. Nothing herein should be used as a substitute for the advice of competent counsel.
This web site is presented for informational purposes only. These materials do not constitute legal advice and do not create a solicitor-client relationship between you and David T.S. Fraser. If you are seeking specific advice related to Canadian privacy law or PIPEDA, contact the author, David T.S. Fraser.
Monday, April 06, 2009
As of today, all internet service providers in Europe are required by law to retain information about every e-mail and VOIP call made by their users thanks to the European Data Retention Directive.
BBC NEWS Technology Net firms start storing user data
And, as an aside, I'm not sure many will find comfort in the idea that RIPA will act to protect privacy: RIPA surveillance may break human rights laws - ZDNet.co.uk.
Details of user e-mails and net phone calls will be stored by internet service providers (ISPs) from Monday under an EU directive.
The plans were drawn up in the wake of the London bombings in 2005.
ISPs and telecoms firms have resisted the proposals while some countries in the EU are contesting the directive.
Jim Killock, executive director of the Open Rights Group, said it was a "crazy directive" with potentially dangerous repercussions for citizens.
All ISPs in the European Union will have to store the records for a year. An EU directive which requires telecoms firms to hold on to telephone records for 12 months is already in force.
The data stored does not include the content of e-mails or a recording of a net phone call, but is used to determine connections between individuals.
Authorities can get access to the stored records with a warrant.
Governments across the EU have now started to implement the directive into their own national legislation.
The UK Home Office, responsible for matters of policing and national security, said the measure had "effective safeguards" in place.
There is concern that access to our data is widening to include many public bodies.
ISPs across Europe have complained about the extra costs involved in maintaining the records. The UK government has agreed to reimburse ISPs for the cost of retaining the data.
Mr Killock said the directive was passed only by "stretching the law".
The EU passed it by "saying it was a commercial matter and not a police matter", he explained.
"Because of that they got it through on a simple vote, rather than needing unanimity, which is required for policing matters," he said.
Sense of shock
He added: "It was introduced in the wake of the London bombings when there was a sense of shock in Europe. It was used to push people in a particular direction."
Sweden has decided to ignore the directive completely while there is a challenge going through the German courts at present.
"Hopefully, we can see some sort of challenge to this directive," said Mr Killock.
Isabella Sankey, Policy Director at Liberty, said the directive formalised what had already been taking place under voluntary arrangement for years.
"The problem is that this regime allows not just police to access this information but hundreds of other public bodies."
In a statement, the Home Office said it was implementing the directive because it was the government's priority to "protect public safety and national security".
It added: "Communications data is the where and when of the communication and plays a vital part in a wide range of criminal investigations and prevention of terrorist attacks, as well as contributing to public safety more generally.
"Without communications data, resolving crimes such as the Rhys Jones murder would be very difficult if not impossible.
"Access to communications data is governed by the Regulation of Investigatory Powers Act 2000 (Ripa) which ensures that effective safeguards are in place and that the data can only be accessed when it is necessary and proportionate to do so."
Friday, January 02, 2009
Just posted on Slaw:
Slaw: Log retention initiatives
I wrote two weeks ago about privacy issues related to the log files that are created and retained by internet companies. The moral of that story was that a significant amount of information is collected in these logs and, when the logs are retained and collated, they can reveal a lot of personal information. I concluded by saying:
I don't think it's too far-fetched to think of a day when it will become standard for all investigations involving the internet to include a warrant served on Google or Yahoo! or Microsoft for all logs related to a particular user or IP address or both.
In Canada, many may remember "lawful access", which was the subject of a number of consultations beginning in 2002. The consultation backgrounder and FAQ solicited comment on preservation orders (here) but the topic was not addressed when the Liberal government introduced the Modernization of Investigative Techniques Act (MITA). I am sure that preservation orders remain on the wish lists for law enforcement in Canada, but they're not here yet.
Europe has taken a different path. In 2006, the European Union adopted Directive 2006/24/EC, entitled "on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks". The Directive is meant to harmonize the retention rules of the members of the European Union. It requires that member states adopt rules or legislation making it mandatory for communications providers to retain certain log-type data for periods of not less than six months and not more than two years. From the "Subject Matter and Scope" clause of the Directive:
1. This Directive aims to harmonise Member States' provisions concerning the obligations of the providers of publicly available electronic communications services or of public communications networks with respect to the retention of certain data which are generated or processed by them, in order to ensure that the data are available for the purpose of the investigation, detection and prosecution of serious crime, as defined by each Member State in its national law.
The Directive goes beyond web communications and includes e-mail, telephone, VOIP and mobile phones. The sort of data that has to be collected and retained is that which identifies the source of the communication, the destination of the communication, the device that was used to make the communication and the "user ID" (defined to mean "a unique identifier allocated to persons when they subscribe to or register with an Internet access service or Internet communications service"). The Directive makes it plain that communications providers are not to retain the content of the communication (Article 5(2)).
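As a rough sketch of what a retained record under the Directive might look like, the categories above can be modelled as a simple data structure. The field names here are illustrative only, not drawn from the Directive's text; the one firm point, per Article 5(2), is that there is no content field:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative sketch of the categories of data the Directive requires
# providers to retain: source, destination, equipment and user ID.
# Field names are my own, not the Directive's.
@dataclass
class RetainedCommunicationRecord:
    source: str        # who initiated the communication
    destination: str   # who it was addressed to
    timestamp: datetime
    device: str        # equipment used to make the communication
    user_id: str       # unique subscriber/registration identifier
    # Deliberately no 'content' field: Article 5(2) forbids retaining
    # the content of the communication itself.

record = RetainedCommunicationRecord(
    source="alice@example.com",
    destination="bob@example.com",
    timestamp=datetime(2009, 4, 6, 12, 0),
    device="mobile handset",
    user_id="subscriber-12345",
)
print(record.user_id)
```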
While the Directive is aimed at saving information so that it can be obtained after the fact in connection with investigations, the debate over data retention in the United States has mainly focused on what has been reported to be informal and secret arrangements made by the National Security Agency and various telephone companies to save telephone calling information. This story was broken by USA Today: USATODAY.com - NSA has massive database of Americans' phone calls.
In addition, US criminal law permits law enforcement to make a written request for the preservation of records for 90 days (renewable for a further 90 days) (US CODE: Title 18, s. 2703(f)):
(f) Requirement To Preserve Evidence.—
(1) In general.— A provider of wire or electronic communication services or a remote computing service, upon the request of a governmental entity, shall take all necessary steps to preserve records and other evidence in its possession pending the issuance of a court order or other process.
(2) Period of retention.— Records referred to in paragraph (1) shall be retained for a period of 90 days, which shall be extended for an additional 90-day period upon a renewed request by the governmental entity.
More recently, the Bush Administration has been pushing for broader retention requirements: FBI, politicos renew push for ISP data retention laws | Politics and Law - CNET News.
This posting has presented a brief snapshot of some legal initiatives that affect internet log retention in a selection of countries. It does not seem likely to me that the debate is over; we will likely see EU-type proposals put forward in both Canada and the US in the coming years.
Friday, December 19, 2008
In the past two weeks, the New York Times reported that Microsoft has made a minor concession to European privacy authorities about how long it retains its log files. A committee of European privacy regulators had asked that these logs be kept for only six months. Microsoft's response? Eighteen months. Yahoo used to keep them for thirteen months and just announced it will cut retention to 90 days. Google keeps them for nine.
The privacy implications of these innocuous log files have been underestimated, particularly when you consider the full picture of your private life that companies like Google may be assembling about you. The information in an ordinary web-server log usually contains just a tidbit of information. One "hit" on a website may look like this (but all on one line):
127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
The first bundle of numbers is the IP address of the computer that requested a particular web page. "Frank" refers to a userid, which is usually not enabled. The next field is the date. Following that, and usually preceded by "GET", is the command your web browser sent to the server. The next bits are the status code returned by the server and the size of the entity requested. Next is something called a "referer" (mis-spelled), followed by details about your browser.
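For illustration, those fields can be pulled apart with a short script. This is a sketch against the standard Apache "combined" log format; the group names are my own labels for the fields just described:

```python
import re

# Apache "combined" log format: IP, identity, userid, timestamp,
# request, status code, response size, referer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) (?P<identity>\S+) (?P<userid>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" '
        '"Mozilla/4.08 [en] (Win98; I ;Nav)"')

hit = LOG_PATTERN.match(line).groupdict()
print(hit['ip'])       # the requesting IP address
print(hit['userid'])   # the userid field ("frank" in the example)
print(hit['referer'])  # the page that sent you here
```

Even this one line links an IP address, a userid, a timestamp, the page requested and the page you came from.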
Since many people often share the same IP address (it could be one IP for an entire company, or for a group of people in a house using the same internet connection), some have argued that an IP address is not personal information and that a log file therefore doesn't contain any. The problem is that even if an IP address is not directly connected to one individual, some easy analysis can make the connections. After AOL released supposedly de-identified search logs to researchers, an intrepid reporter was able to track down at least one of the users who had some very personal health-related searches in the logs (see: Users identifiable by AOL search data).
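To see how easily that kind of linkage happens, here is a minimal sketch (with made-up log entries) of collating search records by IP address. Even a handful of queries from one shared address starts to look like a profile of a household:

```python
from collections import defaultdict

# Hypothetical search-log entries: (ip_address, query).
# No name appears anywhere in this data.
entries = [
    ("24.0.113.7",   "real estate halifax"),
    ("24.0.113.7",   "symptoms of arthritis"),
    ("198.51.100.2", "weather toronto"),
    ("24.0.113.7",   "divorce lawyer nova scotia"),
]

# Collate queries by IP: each address accumulates a profile.
profiles = defaultdict(list)
for ip, query in entries:
    profiles[ip].append(query)

# The queries from one address, read together, narrow down who is
# behind it far more than any single query would.
print(profiles["24.0.113.7"])
```

This is essentially the analysis the AOL reporter performed by hand: the de-identified logs grouped queries under a numeric user ID, and the queries themselves did the identifying.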
What's additionally troubling from a privacy point of view is that the large internet companies, like Google, Yahoo and Microsoft, don't just have your search queries. Increasingly, they have a huge trove of data sources in their logs.
Take Google, for example. There is the famous Google search, but also GMail, Google Analytics, Google AdSense, Google Documents, Google Toolbar and more. Each time you "hit" one of their sites, you're in their logs. Most internet users land in Google's logs dozens of times a day, and on many of those occasions aren't even aware that they're using a Google service. Google has what is probably the most popular and widely used network of online advertising: AdSense. Each time you go to a website that features Google's ads, your computer sends a request to Google's servers and that "hit" goes into their logs, along with information about what site you were visiting, when you visited and what ad was served. If you click on the ad, even more information is collected and logged. But even if you don't visit a site with Google's ads, there's a very good chance that the webmaster is using Google Analytics to find out about usage of his or her site. (Full disclosure: I use Google Analytics for my site at www.privacylawyer.ca.) I should also note that Yahoo! and MSN have advertising networks that collect the same sort of information.
What this means is that Google, Yahoo and Microsoft register in their logs a significant portion of your usage of the internet.
And if you have a Google, Yahoo! or MSN account, that hit can be connected to your account details, including your name.
I don't think it's too far-fetched to think of a day when it will become standard for all investigations involving the internet to include a warrant served on Google or Yahoo! or Microsoft for all logs related to a particular user or IP address or both.
Next week, I'll discuss efforts being made by governments and law enforcement to make log retention mandatory.
Friday, December 05, 2008
The European Court of Human Rights has ruled that the indiscriminate retention of DNA samples by UK law enforcement is illegal. See: Spy Blog - ECHR judgment on the Marper case - rules that UK Government and Police indefinate retention of innocent people's tissue samples, DNA profiles and fingerprints is illegal.
Tuesday, September 09, 2008
Google has just announced that it is cutting its log retention period in half: from 18 months to 9 months.
From the Official Google Blog:
Official Google Blog: Another step to protect user privacy
Today, we're announcing a new logs retention policy: we'll anonymize IP addresses on our server logs after 9 months. We're significantly shortening our previous 18-month retention policy to address regulatory concerns and to take another step to improve privacy for our users.
Back in March 2007, Google became the first leading search engine to announce a policy to anonymize our search server logs in the interests of privacy. And many others in the industry quickly followed our lead. Although that was good for privacy, it was a difficult decision because the routine server log data we collect has always been a critical ingredient of innovation. We have published a series of blog posts explaining how we use logs data for the benefit of our users: to make improvements to search quality, improve security, fight fraud and reduce spam.
Over the last two years, policymakers and regulators -- especially in Europe and the U.S. -- have continued to ask us (and others in the industry) to explain and justify this shortened logs retention policy. We responded by open letter to explain how we were trying to strike the right balance between sometimes conflicting factors like privacy, security, and innovation. Some in the community of EU data protection regulators continued to be skeptical of the legitimacy of logs retention and demanded detailed justifications for this retention. Many of these privacy leaders also highlighted the risks of litigants using court-ordered discovery to gain access to logs, as in the recent Viacom suit.
Today, we are filing this response (PDF file) to the EU privacy regulators. Since we announced our original logs anonymization policy, we have had literally hundreds of discussions with data protection officials, government leaders and privacy advocates around the world to explain our privacy practices and to work together to develop ways to improve privacy. When we began anonymizing after 18 months, we knew it meant sacrifices in future innovations in all of these areas. We believed further reducing the period before anonymizing would degrade the utility of the data too much and outweigh the incremental privacy benefit for users.
We didn't stop working on this computer science problem, though. The problem is difficult to solve because the characteristics of the data that make it useful to prevent fraud, for example, are the very characteristics that also introduce some privacy risk. After months of work our engineers developed methods for preserving more of the data's utility while also anonymizing IP addresses sooner. We haven't sorted out all of the implementation details, and we may not be able to use precisely the same methods for anonymizing as we do after 18 months, but we are committed to making it work.
While we're glad that this will bring some additional improvement in privacy, we're also concerned about the potential loss of security, quality, and innovation that may result from having less data. As the period prior to anonymization gets shorter, the added privacy benefits are less significant and the utility lost from the data grows. So, it's difficult to find the perfect equilibrium between privacy on the one hand, and other factors, such as innovation and security, on the other. Technology will certainly evolve, and we will always be working on ways to improve privacy for our users, seeking new innovations, and also finding the right balance between the benefits of data and advancement of privacy.
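Google's post doesn't spell out how the IP anonymization is done. One common approach (an assumption here, not necessarily Google's method) is to zero out the trailing bits of the address, which keeps the data useful in aggregate while coarsening the identifier. A sketch:

```python
def anonymize_ipv4(ip: str) -> str:
    """Coarsen an IPv4 address by zeroing its final octet.

    One common log-anonymization technique; a sketch, not a claim
    about what any particular company actually does.
    """
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not a dotted-quad IPv4 address: {ip!r}")
    octets[-1] = "0"  # drop the host portion, keep the /24 network
    return ".".join(octets)

# The result still says roughly where the request came from
# (network/region) but no longer pins down a single machine.
print(anonymize_ipv4("203.0.113.42"))
```

Note the trade-off the post describes: the more bits you zero, and the sooner you do it, the less useful the log becomes for fraud detection and quality work, which is exactly the balance Google says it is trying to strike.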
Thursday, July 31, 2008
The Province of Nova Scotia has for some time been consulting with stakeholders inside the health system on the development of health information legislation. It has just launched a public consultation, seeking input from interested parties. I haven't had a chance to look at the discussion paper yet, but I understand they've been using Ontario's PHIPA as the model:
Personal Health Information Legislation for Nova Scotia Department of Health Government of Nova Scotia
For the past several years the Department of Health has been working with health sector partners on initiatives related to the protection and use of personal health information. As part of the evolution of standards, policy and law on these issues, the Department is developing a Personal Health Information Act for the province.
The Department is pleased to present the Discussion Paper Personal Health Information Legislation for Nova Scotia (PDF: 70p). Throughout the Discussion Paper, key issues related to the collection, use, disclosure, retention and destruction of personal health information are discussed, and legislative provisions for a Personal Health Information Act are proposed.
Public and stakeholder input to this legislation is critical to its success. Any feedback on the issues raised in the paper, and on any issues related to the management of personal health information in Nova Scotia, can be submitted through the online questionnaire, by e-mail to firstname.lastname@example.org, or by regular mail to the Personal Health Information Project, Department of Health, 1690 Hollis Street, P.O. Box 488, Halifax, Nova Scotia, B3J 2R8.
The deadline for comments is November 1, 2008.
- Personal Health Information Legislation for Nova Scotia Discussion Paper (PDF:70p)
- Frequently Asked Questions - Foire aux questions (PDF)
- Questionnaire (MS Word) Questionnaire French (MS Word)
- Personal Health Information Legislation Online Questionnaire
Friday, July 11, 2008
More commentary on the Viacom v. Google/YouTube case, this time from MIT's Technology Review:
Technology Review: Privacy protections disappear with a judge's order
Privacy protections disappear with a judge's order
By Associated Press
NEW YORK (AP) _ Credit card companies know what you've bought. Phone companies know whom you've called. Electronic toll services know where you've gone. Internet search companies know what you've sought.
It might be reassuring, then, that companies have largely pledged to safeguard these repositories of data about you.
But a recent federal court ruling ordering the disclosure of YouTube viewership records underscores the reality that even the most benevolent company can only do so much to guard your digital life: All their protections can vanish with one stroke of a judge's pen.
"Companies have a tremendous amount of very sensitive data on their customers, and while a company itself may treat that responsibly ... if the court orders it be turned over, there's not a lot that the company that holds the data can do," said Jennifer Urban, a law professor at the University of Southern California.
In the past, court orders and subpoenas have generally been targeted at records on specific individuals. With YouTube, it's far more sweeping, covering all users regardless of whether they have anything to do with the copyright infringement that Viacom Inc., in a $1 billion lawsuit, accuses Google Inc.'s popular video-sharing site of enabling.
It's a scenario privacy activists have long warned about.
"What we're seeing is (that) the theoretical is becoming real world," said Lauren Weinstein, a veteran computer scientist. "The more data you've got, the more data that's going to be there as an attractive kind of treasure chest (for) outside parties."
U.S. District Judge Louis L. Stanton dismissed privacy arguments as speculative.
Last week, Stanton authorized full access to the YouTube logs -- which few users even realize exist -- after Viacom and other copyright holders argued that they needed the data to prove that their copyright-protected videos for such programs as Comedy Central's "The Daily Show with Jon Stewart" are more heavily watched than amateur clips.
"This decision makes it absolutely clear that everywhere we go online, we leave tracks, and every piece of information we access online leaves some sort of record," Urban said. "As consumers, we should all be aware of the fact that this sensitive information is being collected about us."
Mark Rasch, a former Justice Department official who is now with FTI Consulting Inc., said the ruling could open the floodgates for additional disclosures.
Though lawyers have known to seek such data for years, Rasch said, judges initially hesitant about authorizing their release may look to Stanton's ruling for affirmation, even though U.S. District Court rulings do not officially set precedent.
The YouTube database includes information on when each video gets played. Attached to each entry is each viewer's unique login ID and the Internet Protocol, or IP, address for that viewer's computer -- identifiers that, while seemingly anonymous, can often be traced to specific individuals, or at least their employers or hometowns.
Elsewhere, search engines such as Google and Yahoo Inc. keep more than a year of records on your search requests, from which one can learn of your diseases, fetishes and innermost thoughts. E-mail services are another source of personal records, as are electronic health repositories and Web-based word processing, spreadsheets and calendars.
One can reassemble your whereabouts based on where you've used credit cards, made cell phone calls or paid tolls or subway fares electronically. One can track your spending habits through loyalty cards that many retail chains offer in exchange for discounts.
Though companies do have legitimate reasons for keeping data -- they can help improve services or protect parties in billing disputes, for instance -- there's disagreement on how long a company truly needs the information.
The shorter the retention, the less tempting it is for lawyers to turn to the keepers of data in lawsuits, privacy activists say.
With some exceptions in banking, health care and other regulated industries, requests are routinely granted.
Service providers regularly comply with subpoenas seeking the identities of users who write negatively about specific companies, at most warning them first so they can challenge the disclosure themselves. The music and movie industries also have been aggressive about tracking individual users suspected of illegally downloading their works.
Law enforcement authorities also turn to the records to help solve crimes.
The U.S. Justice Department had previously subpoenaed the major search engines for lists of search requests made by their users as part of a case involving online pornography. Yahoo, Microsoft Corp.'s MSN and Time Warner Inc.'s AOL all complied with parts of the legal demand, but Google fought it and ultimately got the requirement narrowed.
In the YouTube case, Viacom largely got the data it wanted.
Google has said it would work with Viacom on trying to ensure anonymity, and Viacom has pledged not to use the data to identify individual users to sue. The YouTube logs will also likely be subject to a confidentiality order.
But privacy advocates warn that there's no guarantee that future litigants will be as restrained or that data released to lawyers won't inadvertently become public -- through their inclusion as an attachment in a court filing, for instance.
And retailers, government agencies and others are regularly announcing that personal information, stored without adequate safeguards, is being stolen by hackers or lost with laptops or portable storage drives.
"You just never know," said Steve Jones, an Internet expert at the University of Illinois at Chicago. "There are some circumstances under which what seems to be private information is going to be shared with a third party, and the court says it's OK to do that."
Copyright Technology Review 2008.
Saturday, April 12, 2008
CityNews in Toronto is reporting that the city's chief of police is calling for forced DNA samples for a national database even before an individual is convicted, and the retention of those samples even if the individual is acquitted. See: CityNews: Toronto Police Chief Calls For Forced DNA Samples.
Monday, April 07, 2008
The BBC is reporting that the Article 29 Working Group in Europe is calling on search engines to render their logs anonymous after six months.
BBC NEWS Technology Search engines warned over data
... The report from the Article 29 Data Protection Working Party said search engine providers had "insufficiently explained" to their users why they were storing and processing personal data.
It said "search engine providers must delete or irreversibly anonymise personal data once they no longer serve the specified and legitimate purpose they were collected for".
The report said the personal data of users should not be stored or processed "beyond providing search results" if the user had not created an account or registered with the search engine.
The advisory body also said it preferred search engines did not collect and use personal data to serve personalised adverts unless the user had consented and signed up to the service....
Google has recently reduced its log retention to eighteen months while other search engines are in the one year to one-and-a-half year ballpark.
Via the ever vigilant Slaw. For Google's previous announcement on retention, check out Canadian Privacy Law Blog: Google to anonymize older data.
Monday, March 03, 2008
The Information and Privacy Commissioner of Ontario has released an extensive report on the use of video surveillance by the Toronto Transit Commission. The report can be found here: Privacy and Video Surveillance in Mass Transit Systems: A Special Investigation Report - Privacy Investigation Report MC07-68.
From the media release:
TTC’s surveillance cameras comply with privacy Act, but additional steps needed to enhance privacy protection, says Privacy Commissioner Ann Cavoukian
TORONTO – Ontario Information and Privacy Commissioner Ann Cavoukian ruled today that the Toronto Transit System’s expansion of its video surveillance system, for the purposes of public safety and security, is in compliance with Ontario’s Municipal Freedom of Information and Protection of Privacy Act – but she is calling on the TTC to undertake a number of specific steps to enhance privacy protection.
The Commissioner’s office conducted a four-month special investigation that went beyond the scope of the usual privacy investigation in that it included:
- A detailed review of the literature and analysis from various parts of the world on the effectiveness of video surveillance;
- An examination of the role that privacy-enhancing technologies can play in mitigating the privacy-invasive nature of video surveillance cameras; and
- A detailed investigation into a privacy complaint by U.K.-based Privacy International about the expansion of the TTC’s video surveillance system.
“Video surveillance presents a difficult subject matter for privacy officials to grapple with impartially because, on its face, it is inherently privacy-invasive due to the potential for data capture – despite that fact, there are legitimate uses for video surveillance … that render it in compliance with our privacy laws,” said the Commissioner. “Mass transit systems like the TTC, that are required to move large volumes of people, in confined spaces, on a daily basis, give rise to unique safety and security issues for the general public and operators of the system.”
“The challenge we thus face is to rein in, as tightly as possible, any potential for the unauthorized deployment of the system. We have attempted to do this by ensuring that strong controls are in place with respect to its governance (policy/procedures), oversight (independent audit, reportable to my office) and, the most promising long-term measure, the introduction of innovative privacy-enhancing technologies to effectively eliminate unauthorized access or use of any personal information obtained.”
While the expectation of privacy in public places is not the same as in private places, it does not disappear. People have the right, the Commissioner stresses in her report, to expect the following when it comes to video surveillance:
- That their personal information will only be collected for legitimate, limited and specific purposes;
- That the collection will be limited to the minimum necessary for the specified purposes; and
- That their personal information will only be used and disclosed for the specified purposes.
“These general principles,” said Commissioner Cavoukian, “should apply to all video surveillance systems. Where developments such as video surveillance in mass transit systems, like the TTC, can be shown to be needed for public safety, you must also ensure that threats to privacy are kept to an absolute minimum.”
Among the 13 recommendations the Commissioner is making to the TTC are the following:
- That the TTC reduce its retention period for video surveillance images from a maximum of seven days to a maximum of 72 hours (the same standard as the Toronto Police), unless required for an investigation;
- That the TTC’s video surveillance policy should specifically state that the annual audit must be thorough, comprehensive, and must test all program areas of the TTC employing video surveillance to ensure compliance with the policy and the written procedures. The initial audit should be conducted by an independent third party using Generally Accepted Privacy Principles, and should include an assessment of the extent to which the TTC has complied with the recommendations made in this special report;
- That the TTC should select a location to evaluate the privacy-enhancing video surveillance technology developed by University of Toronto researchers, K. Martin and K. Plataniotis; and
- That, prior to providing the police with direct remote access to the video surveillance images, the TTC should amend the draft memorandum of understanding (MOU) with the Toronto Police to require that the logs of disclosures be subjected to regular audits, conducted on behalf of the TTC. A copy of the revised draft MOU should be provided to the Commissioner prior to signing.
EMERGING PRIVACY-ENHANCING TECHNOLOGY
The Commissioner devotes part of her 50-page special report, and a specific recommendation, to the area of emerging privacy-enhancing video surveillance technology.
“In light of the growth of surveillance technologies, not to mention the proliferation of biometrics and sensoring devices, the future of privacy may well lie in ensuring that the necessary protections are built right into their design,” said the Commissioner. “Privacy by design may be our ultimate protection in the future, promising a positive sum paradigm instead of the unlikely obliteration of a given technology.”
As an example of the research being conducted into privacy-enhancing technologies, the Commissioner cites the work of researchers Karl Martin and Kostas Plataniotis at the University of Toronto, who used cryptographic techniques to develop a secure object-based coding approach. While the background image captured by a surveillance camera can be viewed, the sections where individuals are caught in the image would automatically be encrypted by the software. Designated staff could monitor the footage for unauthorized activity, but would not be able to identify anyone. Only a limited number of designated officials with the correct encryption key could view the full image.
The Commissioner is recommending that the TTC select a location to evaluate the video surveillance technology developed by Martin and Plataniotis.
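The encrypted-region approach described above can be illustrated with a toy sketch. Assuming a grayscale frame represented as rows of pixel values and a hypothetical bounding box for a detected person, the following Python (a simplified XOR construction for illustration only, not Martin and Plataniotis's actual cryptographic scheme) masks the region while leaving the background viewable; only a holder of the key can reverse the masking:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive a pseudo-random keystream by hashing key + counter.
    # Toy construction for illustration, not production cryptography.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_region(frame, top, left, bottom, right, key):
    # XOR the pixel bytes inside the detected-person bounding box; the
    # background stays viewable. XOR is symmetric, so the same call with
    # the same key decrypts.
    enc = [row[:] for row in frame]
    for y in range(top, bottom):
        row_bytes = bytes(enc[y][left:right])
        ks = keystream(key + y.to_bytes(4, "big"), len(row_bytes))
        enc[y][left:right] = [b ^ k for b, k in zip(row_bytes, ks)]
    return enc

frame = [[10] * 8 for _ in range(6)]  # a 6x8 grayscale frame
key = b"key-held-by-designated-officials"
masked = encrypt_region(frame, 1, 2, 4, 6, key)    # monitoring staff see this
restored = encrypt_region(masked, 1, 2, 4, 6, key)  # key holders recover the full image
```

The design point is that staff can still watch the scene for unauthorized activity, while identification requires the key held by a limited number of designated officials.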
A copy of the special report is available on the IPC’s website, www.ipc.on.ca.
Monday, August 27, 2007
DP Thinker has posted a few developments in UK data protection law:
DP thinker: A few developments
Just a few developments to note on data protection in the UK:
1) The draft Data Retention (EC Directive) Regulations 2007 will take effect on 1st October 2007. These regulations implement the Data Retention Directive 2006/24/EC and will apply to public electronic communications providers. Data will be retained for a period of 12 months from the date of communication (Regulation 4(2)). The types of data to be retained are telephone numbers and mobile numbers (Regulation 5(1) and 5(2)). The regulations do not apply to data from internet access, e-mail and internet telephony (VoIP). The Information Commissioner will monitor the application of these regulations (Regulation 8). A comparison of the other European Member States' laws implementing the Data Retention Directive 2006/24/EC can be found here.
2) On 24 October 2007, the transitional exemptions under the UK Data Protection Act 1998 will end. This means that structured manual filing systems containing personal records will be covered by the Data Protection Act, including data that was held before October 1998. The Durant case, which took the view that most manual files are not relevant filing systems, will be relevant here.
3) Draft Freedom of Information and Data Protection (Appropriate Limit and Fees) Regulations 2007 - The Government has drafted amended freedom of information (FOI) fees regulations which will allow public authorities to take into account more comprehensively the work involved in dealing with an FOI request. The consultation was completed in June, but further details can be found here.
Thursday, August 23, 2007
Friday, July 20, 2007
While Ask.com is not at the top of the heap of search engines, I think this is an interesting development. Ask.com announced today that it will not store users' queries. What I find most interesting is this observation:
"The number of people this is important to is small," said Doug Leeds, Ask's vice president of product management. "But to these people, it's very important."
This has been my experience. Despite polls and surveys, people who truly care about privacy are a minority, though a growing one, and it isn't black and white: there are degrees of privacy interest, and within that segment is a group that cares about privacy a great deal. Many people will use the privacy characteristics of a service as a distinguishing feature between products or services, so it makes sense to address that group of potential customers.
Monday, July 09, 2007
Chris Graves, over at Citadel of the Blogs, has some interesting things to say about Google and privacy. He's obviously done some thinking about issues such as Google's retention policy, its participation in the Safe Harbor program and its cooperation with law enforcement. Take a look: Citadel of the Blogs » Google & Privacy.
Saturday, July 07, 2007
Earlier this week, in Cash Converters Canada Inc. v. Oshawa (City) (July 4, 2007) (an appeal from Cash Converters Canada Inc. v. Oshawa (City), 2006 CanLII 3469 (ON S.C.)), the Ontario Court of Appeal struck down a City of Oshawa bylaw that required second-hand dealers to collect detailed personal information about those who sell goods to their stores. The bylaw was inconsistent with the Municipal Freedom of Information and Protection of Privacy Act.
Here's what the Toronto Star had to say about it:
TheStar.com - News - Oshawa second-hand store bylaw invades privacy: Court
LEGAL AFFAIRS REPORTER
The Ontario Court of Appeal has struck down sections of a controversial Oshawa bylaw that require second-hand dealers to collect detailed personal information from people who sell them goods and transmit the data to police.
The bylaw conflicts with provincial privacy legislation, which requires the collection and retention of personal information to be strictly controlled, the court ruled Wednesday. The 3-0 decision could influence challenges to similar bylaws in other parts of the country, including Alberta and British Columbia.
“This decision comes at a time when cities are gaining broader law-making powers,” said David Sterns, a lawyer representing the Oshawa franchise of Cash Converters Canada Inc., a second-hand store that challenged the bylaw.
“The court has sent a strong signal that all forms of information gathering and surveillance by municipalities are subject to the public’s overriding right to privacy.”
Under the Oshawa bylaw, passed by the city in 2004 as part of a new licensing system for second-hand dealers, stores were required to record the name, address, sex, date of birth, phone number and height of their vendors, who also had to produce three pieces of identification, such as a driver’s licence, birth certificate or passport.
“This information is then transmitted and stored in a police database and available for use and transmission by the police without any restriction and without any judicial oversight,” Justice Kathryn Feldman said, writing on behalf of Associate Chief Justice Dennis O’Connor and Justice Paul Rouleau.
Store owners were required to send reports to police at least daily, in some cases at the time of purchase. The city argued the bylaw was meant to protect consumers from purchasing stolen goods.
But the municipality offered no evidence of a growing problem involving the sale of stolen goods to second-hand dealers, said Feldman.
Nor is there evidence that unscrupulous people are more likely to be deterred by the electronic collection and transmission of personal information, she said.
In 2003, Cash Converters purchased more than 28,000 used items from members of the public. About 30 of those were seized by police in connection with criminal investigations.
It’s unknown whether any were confirmed as stolen, the court said.
The bylaw did not apply to pawn shops, which are provincially regulated.
See, also, James Daw's column: TheStar.com - columnists - New ruling stands up for privacy.
Thursday, July 05, 2007
According to the Financial Times, both Yahoo and Microsoft will soon announce changes in their privacy policies regarding the retention of search users' information. "Changes" may be stretching it a bit, since neither has publicly come out to say what its retention policies are:
FT.com / Companies / Media & internet - Web search groups to yield on privacy
Web search groups to yield on privacy
By Maija Palmer in London
Published: July 5 2007 22:02 | Last updated: July 5 2007 22:02
Yahoo and Microsoft are preparing to announce concessions in their privacy policies in the next few weeks, as pressure mounts in Europe over the length of time internet search companies should be allowed to hold personal data.
The Working Party has already been in discussions with Google over its policies for keeping data, and intends to widen scrutiny to the rest of the market.
The Article 29 group is concerned that data kept by search engine companies can be used to identify individuals and create profiles of their preferences....
I was just browsing Google Inc.'s Form 10-K for 2006 and happened upon this little tidbit under "Risk Factors":
Google Form 10-K for 2006
Privacy concerns relating to our technology could damage our reputation and deter current and potential users from using our products and services.
From time to time, concerns have been expressed about whether our products and services compromise the privacy of users and others. Concerns about our practices with regard to the collection, use, disclosure or security of personal information or other privacy-related matters, even if unfounded, could damage our reputation and operating results. While we strive to comply with all applicable data protection laws and regulations, as well as our own posted privacy policies, any failure or perceived failure to comply may result in proceedings or actions against us by government entities or others, which could potentially have an adverse effect on our business.
In addition, as nearly all of our products and services are web based, the amount of data we store for our users on our servers (including personal information) has been increasing. Any systems failure or compromise of our security that results in the release of our users’ data could seriously limit the adoption of our products and services as well as harm our reputation and brand and, therefore, our business. We may also need to expend significant resources to protect against security breaches. The risk that these types of events could seriously harm our business is likely to increase as we expand the number of web based products and services we offer as well as increase the number of countries where we operate.
A large number of legislative proposals pending before the United States Congress, various state legislative bodies and foreign governments concern data protection. In addition, the interpretation and application of data protection laws in Europe and elsewhere are still uncertain and in flux. It is possible that these laws may be interpreted and applied in a manner that is inconsistent with our data practices. If so, in addition to the possibility of fines, this could result in an order requiring that we change our data practices, which could have a material effect on our business. Complying with these various laws could cause us to incur substantial costs or require us to change our business practices in a manner adverse to our business.
Just for fun, I thought I'd check out the 10-K's for Yahoo and DoubleClick.
Yahoo Inc. Form 10-K for 2006
Changes in regulations or user concerns regarding privacy and protection of user data could adversely affect our business.
Federal, state, foreign and international laws and regulations may govern the collection, use, retention, sharing and security of data that we receive from our users and partners. In addition, we have and post on our website our own privacy policies and practices concerning the collection, use and disclosure of user data. Any failure, or perceived failure, by us to comply with our posted privacy policies or with any data-related consent orders, Federal Trade Commission requirements or other federal, state or international privacy-related laws and regulations could result in proceedings or actions against us by governmental entities or others, which could potentially have an adverse effect on our business.
Further, failure or perceived failure to comply with our policies or applicable requirements related to the collection, use, sharing or security of personal information or other privacy-related matters could result in a loss of user confidence in us, damage to the Yahoo! brands, and ultimately in a loss of users, partners or advertisers, which could adversely affect our business.
A large number of legislative proposals pending before the United States Congress, various state legislative bodies and foreign governments concern data privacy and retention issues related to our business. It is not possible to predict whether or when such legislation may be adopted. Certain proposals, if adopted, could impose requirements that may result in a decrease in our user registrations and revenues. In addition, the interpretation and application of user data protection laws are in a state of flux. These laws may be interpreted and applied inconsistently from country to country and inconsistently with our current data protection policies and practices. Complying with these varying international requirements could cause us to incur substantial costs or require us to change our business practices in a manner adverse to our business.
Doubleclick Form 10-K for 2006
Privacy and Data Protection
We continue to be a leader in promoting consumers’ privacy and understanding the technologies that our clients, marketers, advertising agencies and data companies use to communicate with their existing customers and to acquire new customers. Our Chief Privacy Officer leads our privacy and data protection efforts. Our privacy team focuses on ensuring that we are effectively implementing our privacy policies and procedures and works with our clients to institute and improve their privacy procedures, while helping them to educate their customers about the privacy issues applicable to them. In addition, our privacy team actively participates in a number of industry privacy organizations.
Our business may be materially adversely affected by lawsuits related to privacy, data protection and our business practices.
We have been a defendant in several lawsuits and governmental inquiries by the Federal Trade Commission and the attorneys general of several states alleging, among other things, that we unlawfully obtain and use Internet users’ personal information and that our use of ad serving “cookies” violates various laws. Cookies are small pieces of data that are recorded on the computers of Internet users. Although the last of these particular matters was resolved in 2002, we may in the future be subject to additional claims or regulatory inquiries with respect to our business practices. Class action litigation and regulatory inquiries of these types are often expensive and time consuming and their outcome may be uncertain.
Any additional claims or regulatory inquiries, whether successful or not, could require us to devote significant amounts of monetary or human resources to defend ourselves and could harm our reputation. We may need to spend significant amounts on our legal defense, senior management may be required to divert their attention from other portions of our business, new product launches may be deferred or canceled as a result of any proceedings, and we may be required to make changes to our present and planned products or services, any of which could materially and adversely affect our business, financial condition and results of operations. If, as a result of any proceedings, a judgment is rendered or a decree is entered against us, it may materially and adversely affect our business, financial condition and results of operations and harm our reputation.
All three seem relatively boilerplate-ish, but what's interesting is that none of the 10-Ks goes to any length to discuss how privacy and customer trust might be a real driver for their brands. Privacy and trust are taken for granted. Some discussion elsewhere in each document includes privacy as part of their brands, but mainly in the context of risks to those brands.
Tuesday, June 12, 2007
I spoke with Briony Smith of IT Business about the recent Privacy International report that put Google at the bottom of their study on the privacy practices of online businesses. She also spoke with Philippa Lawson and Richard Rosenberg.
Here's a bit:
David T.S. Fraser, a privacy lawyer with the Halifax-based firm McInnes Cooper, is unsurprised that Google is coming under fire. Said Fraser: “This is probably inevitable because of their size and the diversity of their business interests: e-mail, social networking, search, classified ads, Google Documents.”
There are also no overarching privacy laws, comparable to PIPEDA, in the United States, according to the Vancouver-based Richard Rosenberg, president of the British Columbia Freedom of Information and Privacy Association.
Lawford said that Google’s business seems to be set up to cull the maximum amount of information about its users, and that he wouldn’t be at all surprised to find that Google was farming out profiled information to outside parties. Proving this can be difficult, according to Lawford. “Following the information through the chain can be hard,” he said.
Fraser suggests that Google’s privacy policies be made much more transparent, and that it tell its users just how long their information will be retained (which, in North America, is indefinitely, according to Rosenberg).
One minor correction: Google has recently announced their retention schedule for their log information, but it still is likely beyond what's reasonably necessary (Canadian Privacy Law Blog: Why does Google remember information about searches?).
Sunday, June 10, 2007
Michel-Adrien Sheppard, aka Library Boy, is linking to a new report by Privacy International that ranks the privacy practices of online companies. What's most interesting is that Google is at the bottom and merits special mention:
A Race to the Bottom - Privacy Ranking of Internet Service Companies
We are aware that the decision to place Google at the bottom of the ranking is likely to be controversial, but throughout our research we have found numerous deficiencies and hostilities in Google's approach to privacy that go well beyond those of other organizations. While a number of companies share some of these negative elements, none comes close to achieving status as an endemic threat to privacy. This is in part due to the diversity and specificity of Google's product range and the ability of the company to share extracted data between these tools, and in part it is due to Google's market dominance and the sheer size of its user base. Google's status in the ranking is also due to its aggressive use of invasive or potentially invasive technologies and techniques.
The view that Google "opens up" information through a range of attractive and advanced tools does not exempt the company from demonstrating responsible leadership in privacy. Google's increasing ability to deep-drill into the minutiae of a user's life and lifestyle choices must in our view be coupled with well defined and mature user controls and an equally mature privacy outlook. Neither of these elements has been demonstrated. Rather, we have witnessed an attitude to privacy within Google that at its most blatant is hostile, and at its most benign is ambivalent. These dynamics do not pervade other major players such as Microsoft or eBay, both of which have made notable improvements to the corporate ethos on privacy issues.
In the closing days of our research we received a copy of supplemental material relating to a complaint to the Federal Trade Commission concerning the pending merger between Google and DoubleClick. This material, submitted by the Electronic Privacy Information Center (EPIC) and coupled with a submission to the FTC from the New York State Consumer Protection Board, provided additional weight for our assessment that Google has created the most onerous privacy environment on the Internet. The Board expressed concern that these profiles expose consumers to the risk of disclosure of their data to third-parties, as well as public disclosure as evidence in litigation or through data breaches. The EPIC submission set out a detailed analysis of Google's existing data practices, most of which fell well short of the standard that consumers might expect. During the course of our research the Article 29 Working Group of European privacy regulators also expressed concern at the scale of Google's activities, and requested detailed information from the company.
In summary, Google's specific privacy failures include, but are by no means limited to:
- Google account holders that regularly use even a few of Google's services must accept that the company retains a large quantity of information about that user, often for an unstated or indefinite length of time, without clear limitation on subsequent use or disclosure, and without an opportunity to delete or withdraw personal data even if the user wishes to terminate the service.
- Google maintains records of all search strings and the associated IP-addresses and time stamps for at least 18 to 24 months and does not provide users with an expungement option. While it is true that many US based companies have not yet established a time frame for retention, there is a prevailing view amongst privacy experts that 18 to 24 months is unacceptable, and possibly unlawful in many parts of the world.
- Google has access to additional personal information, including hobbies, employment, address, and phone number, contained within user profiles in Orkut. Google often maintains these records even after a user has deleted his profile or removed information from Orkut.
- Google collects all search results entered through Google Toolbar and identifies all Google Toolbar users with a unique cookie that allows Google to track the user's web movement. Google does not indicate how long the information collected through Google Toolbar is retained, nor does it offer users a data expungement option in connection with the service.
- Google fails to follow generally accepted privacy practices such as the OECD Privacy Guidelines and elements of EU data protection law. As detailed in the EPIC complaint, Google also fails to adopt additional privacy provisions with respect to specific Google services.
- Google logs search queries in a manner that makes them personally identifiable but fails to provide users with the ability to edit or otherwise expunge records of their previous searches.
- Google fails to give users access to log information generated through their interaction with Google Maps, Google Video, Google Talk, Google Reader, Blogger and other services.
Saturday, May 26, 2007
From the Independent (UK):
Google is watching you - Independent Online Edition > Science & Technology
'Big Brother' row over plans for personal database
By Robert Verkaik, Law Editor
Published: 24 May 2007
Google, the world's biggest search engine, is setting out to create the most comprehensive database of personal information ever assembled, one with the ability to tell people how to run their lives.
In a mission statement that raises the spectre of an internet Big Brother to rival Orwellian visions of the state, Google has revealed details of how it intends to organise and control the world's information.
The company's chief executive, Eric Schmidt, said during a visit to Britain this week: "The goal is to enable Google users to be able to ask the question such as 'What shall I do tomorrow?' and 'What job shall I take?'."
Speaking at a conference organised by Google, he said : "We are very early in the total information we have within Google. The algorithms [software] will get better and we will get better at personalisation."
Google's declaration of intent was publicised at the same time it emerged that the company had also invested £2m in a human genetics firm called 23andMe. The combination of genetic and internet profiling could prove a powerful tool in the battle for the greater understanding of the behaviour of an online service user.
Privacy protection campaigners are concerned that the trend towards sophisticated internet tracking and the collating of a giant database represents a real threat, by stealth, to civil liberties.
That concern has been reinforced by Google's $3.1bn bid for DoubleClick, a company that helps build a detailed picture of someone's behaviour by combining its records of web searches with the information from DoubleClick's "cookies", the software it places on users' machines to track which sites they visit.
The Independent has now learnt that the body representing Europe's data protection watchdogs has written to Google requesting more information about its information retention policy.
The multibillion-pound search engine has already said it plans to impose a limit on the period it keeps personal information.
A spokesman for the Information Commissioner's Office, the UK agency responsible for monitoring data legislation, confirmed it had been part of the group of organisations, known as the Article 29 Working Group, which had written to Google.
It is understood the letter asked for more detail about Google's policy on the retention of data. Google says it will respond to the Article 29 request next month when it publishes a full response on its website.
The Information Commissioner's spokeswoman added: "I can't say what was in it, only that it was written in response to Google's announcement that it will hold information for no more than two years."
A spokeswoman for the Information Commissioner said that because of the voluntary nature of the information being targeted, the Information Commissioner's Office had no plans to take any action against the databases.
Peter Fleischer, Google's global privacy counsel, said the company intended only to do what its customers wanted it to do. He said Mr Schmidt was talking about products such as iGoogle, where users volunteer to let Google use their web histories. "This is about personalised searches, where our goal is to use information to provide the best possible search for the user. If the user doesn't want information held by us, then that's fine. We are not trying to build a giant library of personalised information. All we are doing is trying to make the best computer guess of what it is you are searching for."
Privacy protection experts have argued that law enforcement agents - in certain circumstances - can compel search engines and internet service providers to surrender information. One said: "The danger here is that it doesn't matter what search engines say their policy is because it can be overridden by national laws."
Monday, May 14, 2007
Straight from Google's official blog:
Official Google Blog: Why does Google remember information about searches? 5/11/2007 11:21:00 AM Posted by Peter Fleischer, Global Privacy Counsel
We recently announced a new policy to anonymize our server logs after 18–24 months. We’re the only leading search company to have taken this step publicly. We believe it’s an important part of our commitment to respect user privacy while balancing a number of important factors.
In developing this policy, we spoke with various privacy advocates, regulators and others about how long they think the period should be. There is a wide spectrum of views on this – some think data should be preserved for longer, others think it should be anonymized almost immediately. We spent a great deal of time sorting this out and thought we’d explain some of the things that prompted us to decide on 18-24 months.
Three factors were critical. One was maintaining our ability to continue to improve the quality of our search services. Another was to protect our systems and our users from fraud and abuse. The third was complying—and anticipating compliance—with possible data retention requirements. Here’s a bit more about each of these:
- Improve our services: Search companies like Google are constantly trying to improve the quality of their search services. Analyzing logs data is an important tool to help our engineers refine search quality and build helpful new services. Take the example of Google Spell Checker. Google’s spell checking software automatically looks at your query and checks to see if you are using the most common version of a word’s spelling. If it calculates that you’re likely to generate more relevant search results with an alternative spelling, it will ask “Did you mean: (more common spelling)?” We can offer this service by looking at spelling corrections that people do or do not click on. Similarly, with logs, we can improve our search results: if we know that people are clicking on the #1 result we’re doing something right, and if they’re hitting next page or reformulating their query, we’re doing something wrong. The ability of a search company to continue to improve its services is essential, and represents a normal and expected use of such data.
- Maintain security and prevent fraud and abuse: It is standard among Internet companies to retain server logs with IP addresses as one of an array of tools to protect the system from security attacks. For example, our computers can analyze logging patterns in order to identify, investigate and defend against malicious access and exploitation attempts. Data protection laws around the world require Internet companies to maintain adequate security measures to protect the personal data of their users. Immediate deletion of IP addresses from our logs would make our systems more vulnerable to security attacks, putting the personal data of our users at greater risk. Historical logs information can also be a useful tool to help us detect and prevent phishing, scripting attacks, and spam, including query click spam and ads click spam.
- Comply with legal obligations to retain data: Search companies like Google are also subject to laws that sometimes conflict with data protection regulations, like data retention for law enforcement purposes. For example, Google may be subject to the EU Data Retention Directive, which was passed last year, in the wake of the Madrid and London terrorist bombings, to help law enforcement in the investigation and prosecution of “serious crime”. The Directive requires all EU Member States to pass data retention laws by 2009 with retention for periods between 6 and 24 months. Since these laws do not yet exist, and are only now being proposed and debated, it is too early to know the final retention time periods, the jurisdictional impact, and the scope of applicability. It's therefore too early to state whether such laws would apply to particular Google services, and if so, which ones. In the U.S., the Department of Justice and others have similarly called for 24-month data retention laws.
At the same time, regulators in other parts of government have argued for shorter retention periods, reflecting the conflicts in every country between privacy and data protection objectives on the one hand, and law enforcement objectives on the other. Companies like Google are trying to be responsible corporate citizens, and sometimes we are told to do different things by different government entities, or to follow conflicting legal obligations. It's hard enough to get different government entities to talk to each other inside one country. When you multiply this by all the countries where Google must comply with the laws, the potential conflicts are enormous. Nonetheless, Google is committed to providing its users around the world with one consistent high level of data protection.
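The anonymization Google announced can be pictured with a minimal Python sketch. The field names, the 18-month window, and the specific coarsening steps here (zeroing the last IPv4 octet and dropping the cookie identifier) are assumptions for illustration; Google has not published its exact implementation:

```python
from datetime import datetime, timedelta

# Assumed policy window: roughly 18 months, the low end of the announced range.
RETENTION = timedelta(days=18 * 30)

def anonymize_ip(ip: str) -> str:
    # Coarsen an IPv4 address by zeroing the final octet, so the log entry
    # no longer points at a single machine.
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

def scrub_logs(entries, now):
    # Keep recent entries intact; coarsen identifiers on anything older
    # than the retention window.
    scrubbed = []
    for e in entries:
        if now - e["time"] > RETENTION:
            e = dict(e, ip=anonymize_ip(e["ip"]), cookie_id=None)
        scrubbed.append(e)
    return scrubbed

now = datetime(2007, 7, 1)
logs = [
    {"ip": "192.0.2.44", "cookie_id": "abc123",
     "time": datetime(2005, 1, 1), "query": "pipeda"},
    {"ip": "198.51.100.7", "cookie_id": "def456",
     "time": datetime(2007, 6, 1), "query": "privacy"},
]
result = scrub_logs(logs, now)
```

The point of the design is that fresh logs remain usable for the three purposes listed above, while aged entries can no longer be tied to an individual user.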
It’s also worth reiterating that we do not ask our users for their names, address, or phone numbers to use most of our services. For those who want to see what their logs history looks like, we offer transparent access via a Google Account to their own personal Web History.
Finally, we maintain rigorous internal controls of our logs database. We look forward to an ongoing discussion with privacy stakeholders around the world as we pursue a common goal of improving privacy protections for everyone on the Internet.
Wednesday, March 14, 2007
This is an interesting development, though some think it comes too late and doesn't go far enough:
Official Google Blog: Taking steps to further improve our privacy practices
3/14/2007 03:00:00 PM
Posted by Peter Fleischer, Privacy Counsel-Europe, and Nicole Wong, Deputy General Counsel
Just as we continuously work to improve our products, we also work toward having the best privacy practices for our users. This includes designing privacy protections into our products (like Google Talk's “off the record” feature or Google Desktop’s “pause” and “lock search” controls). This also means providing clear, easy to understand privacy policies that help you make informed decisions about using our services.
After talking with leading privacy stakeholders in Europe and the U.S., we're pleased to be taking this important step toward protecting your privacy. By anonymizing our server logs after 18-24 months, we think we’re striking the right balance between two goals: continuing to improve Google’s services for you, while providing more transparency and certainty about our retention practices. In the future, it's possible that data retention laws will obligate us to retain logs for longer periods. Of course, you can always choose to have us retain this data for more personalized services like Search History. But that's up to you.
Our engineers are already busy working out the technical details, and we hope to implement this new data policy over the coming months (and within a year's time). We’ll communicate more as we work out these details, but for now, we wanted you to know that we’re working on this additional step to strengthen your privacy.
If you want to know more, read the log retention FAQ (PDF).
There's more here: WIRED Blogs: 27B Stroke 6: Google To Anonymize Data -- Updated. And here: Google adopts tougher privacy measures.
Thanks to Boing Boing for the tipoff.
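Google hasn't published the technical details of its planned anonymization, but log anonymization of this kind is commonly implemented by truncating part of each stored IP address so it no longer identifies a single household. A minimal sketch of that general technique, not Google's actual method; the log format and field layout here are invented for illustration:

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Zero the host bits: last octet of an IPv4 address, or everything
    past the /48 prefix of an IPv6 address."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)

def anonymize_log_line(line: str) -> str:
    # Hypothetical log format: "<ip> <timestamp> <query>"
    ip, rest = line.split(" ", 1)
    return f"{anonymize_ip(ip)} {rest}"

print(anonymize_log_line("203.0.113.42 2007-03-14T15:00:00Z privacy law"))
# -> 203.0.113.0 2007-03-14T15:00:00Z privacy law
```

Note that truncation is weaker than deletion: a /24 still narrows the searcher to 256 possible addresses, which is partly why some critics quoted above say the step doesn't go far enough.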
Tuesday, September 19, 2006
Once again, the US Attorney General is calling for a law requiring internet service providers to collect and keep logs for law enforcement purposes:
USATODAY.com - Gonzales calls for law to require Internet companies to preserve customer data:
... 'We respect civil liberties but we have to harmonize this so we can get more information,' he said.
The subject has prompted some alarm among Internet service provider executives and civil liberties groups after the Justice Department took Google to court earlier this year to force it to turn over information on customer searches. Civil liberties groups also have sued Verizon and other telephone companies, alleging they are working with the government to provide information without search warrants on subscriber calling records.
Justice Department officials have said that any proposal would not call for the content of communications to be preserved and would keep the information in the companies' hands. The data could be obtained by the government through a subpoena or other lawful process....
Friday, September 15, 2006
An Irish activist group has begun a court challenge to the European Union's data retention rules, arguing they are contrary to the Irish Constitution and the European Convention on Human Rights. See: Digital rights activists take aim at EU data laws The Register.
Friday, August 11, 2006
The AOL search data blunder (see below) has revived discussion and interest in an American law that was proposed after the earlier fight with the Department of Justice over search data:
Rep. Ed Markey, a Massachusetts Democrat, said Wednesday that AOL's disclosure of the search habits of more than 650,000 of its users demonstrates that new laws are necessary. AOL has apologized for the disclosure.
"We must stop companies from unnecessarily storing the building blocks of American citizens' private lives," Markey said.
Markey's proposal, called the Eliminate Warehousing of Consumer Internet Data Act (EWOCID), was introduced in February after Google's courtroom tussle over search records with the U.S. Department of Justice.
Republicans have kept it bottled up in a House of Representatives subcommittee ever since, but a Markey representative said Wednesday that he hoped "this most recent breach will light a fire under the GOP leadership."
Tuesday, August 01, 2006
Among the ten commandments of protecting consumer privacy is the admonition "don't keep it." It appears that the search engine Ixquick is following that commandment:
Ixquick.com eliminates 'Big Brother'
First search engine to stop recording privacy details
HAARLEM, The Netherlands, June 27, 2006
As personal privacy concerns create growing alarm about the freedom of the Internet, the Ixquick metasearch engine (www.ixquick.com) has taken a pioneering step: starting today, Ixquick will permanently delete all personal search details gleaned from its users from the log files.
"This new feature of our search engine ensures both optimal privacy protection and maximum search performance for our customers, since they will be able to search using the 11 best search engines without their personal data being recorded," says Ixquick spokesman Alex van Eesteren.
As digital technology increasingly pervades our world, more and more personal details are being stored electronically, many of them by search engines. While you are searching the internet, these engines register the time of your searches, the terms you used, the sites you visited and your IP address. In many cases this IP address makes it possible to trace the computer, and in turn the household, that carried out the search.
These personal details are often retained for long periods by search engines and are of interest to commercial parties, governments and even criminals. "Many search engines openly use this data for commercial purposes. It seems only to be a question of time before the data gets misused," alleges Van Eesteren. "Therefore we have decided to permanently delete all personal search records. If the data is not stored, users' privacy can't be breached".
Ixquick's Meta Search feature enables the user to simultaneously search 11 of the best search engines. However, Ixquick does not share the user's personal data with these individual search engines in any circumstances. In addition, as of this week, Ixquick will delete the users' IP addresses and 'unique user IDs' from its own 'Log Files'.
"Therefore, any user can use Ixquick.com to search in a combination of the best search engines secure in the knowledge that they can enjoy complete protection of their privacy," continues Mr. van Eesteren.
For more information, please visit www.ixquick.com.
This makes sense in so many ways: First, they save cash since they don't have to store the information. Second, they don't have to worry about a privacy breach. Third, they won't get dragged into a fight over customer information. Finally, it'll excite privacy-concerned web surfers without alienating the others.
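The "don't keep it" approach Ixquick describes amounts to a logging layer that simply never writes the identifying fields in the first place, rather than deleting them later. A purely illustrative sketch; the field names are assumptions, not Ixquick's actual schema:

```python
from datetime import datetime, timezone

# Fields the press release says Ixquick no longer records.
IDENTIFYING_FIELDS = {"ip_address", "unique_user_id"}

def log_search(event: dict) -> dict:
    """Return the log record with identifying fields dropped entirely."""
    record = {k: v for k, v in event.items() if k not in IDENTIFYING_FIELDS}
    record["logged_at"] = datetime.now(timezone.utc).isoformat()
    return record

record = log_search({
    "ip_address": "198.51.100.7",
    "unique_user_id": "abc123",
    "query": "privacy law",
})
assert "ip_address" not in record  # nothing identifying ever hits disk
```

The design point is the one made above: data that is never stored can't be breached, subpoenaed, or misused.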
Friday, July 14, 2006
From GameSpot (via the always interesting Video Game Law Blog):
Japanese MMOG suffers privacy leak - News at GameSpot
Game Garden warns that e-mail addresses and game logs of hundreds of thousands of Xenepic Online players may have been compromised.
By Walt Wyman, GameSpot
Posted Jun 28, 2006 11:41 am PT
Game Garden, an online game developer and provider, announced today that personal user information from Xenepic Online, a free massively multiplayer online role-playing game for PCs, was inadvertently compromised. Game Garden manages the server on behalf of NHN Japan Corporation, the game's provider.
The information was mistakenly stored on an open download server, potentially allowing anyone to access it using certain exploits. Data for 297,805 users was put at risk, including their game-server usernames and passwords, e-mail addresses, and game log files, which contain information on items purchased and chat history.
However, it seems that no payment information, such as credit card information, was among the compromised data. In a press release, Game Garden apologized to Xenepic users for the security failure and pledged to "further consolidate internal management to prevent similar incidents in the future."
Wednesday, June 07, 2006
If you don't need (really, really need) a particular type of personal information and it is at all sensitive, do not collect it. Do not keep it. If you have it, securely destroy it.
Privacy best practices world wide are pretty clear that you should only collect and retain personal information that is necessary for a clearly articulated purpose. In the CSA Model Code for the Protection of Personal Information, it is articulated thusly:
4.4 Principle 4 - Limiting Collection
The collection of personal information shall be limited to that which is necessary for the purposes identified by the organization. Information shall be collected by fair and lawful means.
This goes hand-in-hand with the principle that you should only keep information for as long as is reasonably necessary to fulfil those clearly articulated purposes. Take it away, CSA Code:
4.5 Principle 5 - Limiting Use, Disclosure, and Retention
Personal information shall not be used or disclosed for purposes other than those for which it was collected, except with the consent of the individual or as required by law. Personal information shall be retained only as long as necessary for the fulfilment of those purposes.
The Generally Accepted Privacy Principles produced by the Canadian Institute of Chartered Accountants and the American Institute of Certified Public Accountants include variations on these general rules:
4. Collection. The entity collects personal information only for the purposes identified in the notice.
5. Use and Retention. The entity limits the use of personal information to the purposes identified in the notice and for which the individual has provided implicit or explicit consent. The entity retains personal information for only as long as necessary to fulfill the stated purposes.
So I guess you can draw from these examples that you should not collect or keep someone's social security number or social insurance number unless you really need it.
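In practical terms, the retention half of these principles usually translates into a scheduled purge keyed to each record's stated purpose. A minimal sketch, assuming a simple record layout; the retention periods here are invented for illustration, not drawn from any statute:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule: one period per identified purpose.
RETENTION = {
    "billing": timedelta(days=365 * 7),   # e.g. a tax-law requirement
    "support": timedelta(days=90),
    "marketing": timedelta(days=30),
}

def expired(record: dict, now: datetime) -> bool:
    """True if the record has outlived the period for its stated purpose."""
    limit = RETENTION[record["purpose"]]
    return now - record["collected_at"] > limit

now = datetime(2006, 6, 7, tzinfo=timezone.utc)
records = [
    {"id": 1, "purpose": "marketing",
     "collected_at": datetime(2006, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "purpose": "support",
     "collected_at": datetime(2006, 5, 1, tzinfo=timezone.utc)},
]
kept = [r for r in records if not expired(r, now)]
# record 1 (marketing data older than 30 days) is purged; record 2 is kept
```

The hard part in practice is not the purge job but the first step the CSA Code demands: articulating the purposes clearly enough that a retention period can be attached to each one.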
Interestingly and ironically, this lesson has just been learned the hard way by the American Institute of Certified Public Accountants. The AICPA has just reached the conclusion that it should apply at least a portion of its own Generally Accepted Privacy Principles with respect to the personal information about its members that it collects and retains. It appears that a hard drive containing personal information on 330,000 members, including social security numbers, has gone missing while in the custody of an overnight courier. While it is very easy to blame the courier, it is clear that the AICPA had no compelling reason to collect SSNs in the first place, and certainly none that comes close to justifying the risk of keeping such data around, let alone couriering it to a service provider.
To read more, check out: CPA group says hard drive with data on 330,000 members missing.
Tuesday, May 23, 2006
With 70% of critical business information contained in email, small and medium sized companies face numerous challenges. Legal concerns including privacy, retention, and accountability are at the forefront, but improper use, hardware requirements, and the ability to recover old emails are also highly important to today’s business owner.
Join Toby Keeping (IronSentry Inc.) and David Fraser (McInnes Cooper) in an information session as they discuss these and other issues that small and medium sized companies have to address with electronic information.
For more information, or to register, click here.
Contact: Toby Keeping, 902.463.4485 x1401 or email@example.com
Sunday, May 14, 2006
In a finding by the Office of the Privacy Commissioner released on Friday, two individuals complained that a credit bureau was keeping positive credit information on file for too long. Retention of negative information is limited by provincial law, but there was no self-imposed retention period for favourable information. During the course of the investigation, the bureau decided on twenty years and also decided to give individuals the right to have it removed before then. The Commissioner therefore considered the complaint to be resolved.
Thursday, March 23, 2006
The Privacy Commissioner of Canada's contributions program has been renewed for another year. Check out the press release:
Privacy Commissioner's Office renews its cutting-edge privacy research program:
Ottawa, March 22, 2006 – The Privacy Commissioner of Canada, Jennifer Stoddart, today announced the renewal of funding through her Office's Contributions Program which, for the last three years, has allowed some of Canada's brightest privacy experts to develop a wealth of information on various privacy challenges of the 21st century.
"Knowledge is the ultimate currency, and with the research developed through our Contributions Program we will be in a position to further strengthen our mission of safeguarding and preserving privacy rights that are cherished in our democracy," said Ms. Stoddart. "It will also shed light on new approaches to dealing with critical privacy issues."
This is the third year of the Program, which was launched in June 2004 to further the development of a national research capacity in Canada on the broad spectrum of issues that have an impact on privacy. The Office is mandated to undertake and publish research related to the protection of personal information, and the Program was set up as part of the Office's budget pursuant to its program/legislative authority under federal private sector privacy legislation.
- The protection of personal health information
- Strategies for making individuals more aware of their privacy rights. Do we need more consumer friendly privacy policies? Do organizations need to do a better job of disseminating their policies?
- The professionalization of privacy specialists—what requirements or standards exist and what processes are in place to accredit and certify these individuals?
- The storage and retention of personal information—the Personal Information Protection and Electronic Documents Act requires that information only be retained as long as necessary to fulfill the stated purposes. What does this mean in practical terms and how should this requirement be assessed?
- Aspects of surveillance:
- New technologies: What does the public comprehend about the collection, use, and transmission of personal data generated from new technology?
- What use is made of transactional data generated by retail transactions, telecommunications devices, or video surveillance?
- Workplace surveillance
- The tracking of individuals’ interactions with the Internet
The Office will also consider requests to fund research on issues that fall outside the priority areas.
According to Michael Geist, a leading privacy expert and member of the Office's External Advisory Committee, the continuation of the Contributions Program will advance and foster the promotion and understanding of privacy rights of Canadians.
“There is an increased burden on us to be aware of threats to our privacy before they become realized. Research projects funded through this Program will go a long way in promoting greater knowledge,” said Mr. Geist.
Professor Geist is a law professor at the University of Ottawa where he holds the Canada Research Chair in Internet and E-commerce law. He is also a nationally syndicated columnist on technology law issues and the author of the Canadian Privacy Law Review.
Organizations that are eligible for funding under the Program include not-for profit organizations, such as educational institutions and industry and trade associations, as well as consumer, voluntary and advocacy organizations.
The maximum amount that can be awarded for any single research project is $50,000. Organizations are eligible to receive funding for only one project.
Projects must be completed within the fiscal year in which the funding was provided. The deadline to submit applications is May 5, 2006.
Links to the projects completed under the previous Contributions Programs are available on the OPC Web site at http://www.privcom.gc.ca/information/cp/index_e.asp.
The Office of the Privacy Commissioner of Canada is mandated by Parliament to act as an ombudsman, advocate and guardian of the privacy and protection of personal information rights of Canadians.
Monday, March 13, 2006
The Canadian Imperial Bank of Commerce is involved in a new incident of misdirected faxes. But, I hasten to add, the misdirected faxes do not appear to be the bank's fault. According to the Globe and Mail, faxes from CIBC to a sporting equipment supplier from Toronto have been sent to Christine Soda. The CIBC sent the faxes to the number it had on record for its customer, but the customer had moved and had not advised the bank of the new fax number. Once the number was released by the phone company, it was assigned to Ms. Soda.
Now, to make it more interesting, Ms. Soda has apparently refused to return the faxes to CIBC and both the bank and its customer are taking Ms. Soda to court for their return, according to the Toronto Star. Ms. Soda says her husband needs the documents for his own lawsuit. (He took them to his workplace and says he was fired because the faxes made the employer think he had another job. He is suing the former employer and needs the faxes as evidence.) The Privacy Commissioner is apparently on the case of this retention of personal information.
Here's a free piece of common sense that I routinely share with my clients: Never surrender your fax number. You can usually pay the phone company a reasonable fee so that the number is not reassigned to another person for an interval of time.
Another freebie: Make sure your contacts know your updated information.
Here's my two cents' worth: This situation does not seem to engage PIPEDA. The information on the fax was about money transfers between two businesses. PIPEDA only deals with personal information, which means information about one or more individuals, not companies. It may be a breach of policy and a breach of bank secrecy, but it doesn't look like there was any personal information involved.
Saturday, February 18, 2006
Sabrina Pacifici at beSpacific (beSpacific: Google Responds to DOJ 's Motion to Comply With Data Demand) has posted a link to Google's response to the US Department of Justice's demand for its search data.
The Google Opposition to Motion is actually very interesting reading. Though part of Google's opposition is based on the argument that complying with the government's request would compromise its trade secrets, the document itself provides some interesting insights into what goes on under the hood at Google and also argues that bare URLs can be very misleading.
Saturday, February 11, 2006
From the Federal Communications Commission (via beSpacific: FCC Proposes Rulemaking to Prevent Sale of Cell Phone Records):
FCC EXAMINES NEED FOR TOUGHER PRIVACY RULES
Comment Sought On Measures Proposed by EPIC, Commission
Washington, D.C. – The Federal Communications Commission today launched a proceeding to examine whether additional security measures could prevent the unauthorized disclosure of sensitive customer information held by telecommunications companies.
In a Notice of Proposed Rulemaking (NPRM) adopted today, the Commission seeks comment on a variety of issues related to customer privacy, including what security measures carriers currently have in place, what inadequacies exist in those measures, and what kind of security measures may be warranted to better protect consumers’ privacy. The Notice grants a petition for rulemaking filed by the Electronic Privacy Information Center (EPIC) expressing concerns about whether carriers are adequately protecting customer call records and other customer proprietary network information, or CPNI. EPIC claims that some data brokers have taken advantage of inadequate security standards to gain access to the information under false pretenses, such as by posing as the customer, and then offering the records for sale on the Internet. The practice is known as “pretexting.”
In its petition, EPIC proposed five additional security measures that it says will more adequately protect CPNI. The NPRM specifically seeks comment on these five measures, which are:
- Passwords set by consumers.
- Audit trails that record all instances when a customer’s records have been accessed, whether information was disclosed, and to whom.
- Encryption by carriers of stored CPNI data.
- Limits on data retention that require deletion of call records when they are no longer needed.
- Notice provided by companies to customers when the security of their CPNI may have been breached.
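Of EPIC's five proposed measures, the audit trail is the most mechanical: every access to a customer's records gets logged, along with whether anything was disclosed and to whom. A minimal sketch of what such a trail might capture; the structure is assumed for illustration, not taken from the NPRM:

```python
from datetime import datetime, timezone

audit_trail: list[dict] = []

def access_cpni(customer_id: str, requester: str, disclosed: bool) -> None:
    """Append one audit entry per access, per EPIC's proposed measure."""
    audit_trail.append({
        "customer_id": customer_id,
        "accessed_at": datetime.now(timezone.utc).isoformat(),
        "requester": requester,
        "disclosed": disclosed,
    })

access_cpni("555-0100", requester="support-agent-17", disclosed=True)
assert len(audit_trail) == 1
```

A trail like this doesn't prevent pretexting on its own, but it makes after-the-fact investigation possible, which is why EPIC pairs it with passwords, encryption, retention limits, and breach notice.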
Section 222 of the Communications Act requires carriers to take specific steps to ensure that CPNI is adequately protected from unauthorized disclosure. Current rules require carriers to certify compliance with the Commission’s CPNI rules and make that certification available to the public, but the Commission observes that a lack of uniformity in these certifications could be an obstacle to effective enforcement. The Commission seeks comment on a tentative conclusion that it should amend its rules to require carriers to file annual compliance certificates with the Commission, along with a summary of all consumer complaints received in the past year concerning the unauthorized release of CPNI and a summary of any actions taken against data brokers during the preceding year.
The Commission also seeks comment on other ways to protect customer privacy, including whether carriers should be required to take the additional step of calling a subscriber’s registered telephone number before releasing CPNI in order to verify that the caller requesting the information is actually the subscriber.
Action by the Commission, February 10, 2006 by Notice of Proposed Rulemaking (FCC 06-10). Chairman Martin, Commissioners Copps, Adelstein and Tate.
The Canadian Privacy Law Blog is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 2.5 Canada License.