technologies

This page highlights privacy-enhancing and privacy-eroding technologies such as the proposed P3P standard, PGP, biometrics and cookies.

Specific features of the European Commission's 1997 Working Documents on Privacy Enhancing Technologies (WDPET) have been superseded but the set remains a useful introduction to concepts and terminologies.

P3P

June 2000 saw the release of a draft of the Platform for Privacy Preferences (P3P) standard.

P3P, developed under the auspices of the World Wide Web Consortium and described in the P3P Toolbox site, attempts to provide a global standard that would allow users to restrict their browsers to those sites that abide by specific limits on data collection. Essentially, P3P acts as a translator, converting a site's privacy policy statement (often long, legalistic or difficult to find) into XML. It might be tied to trustmarks - discussed later in this guide and in more detail in a supplementary profile.

Proponents argue that visitors to a P3P-enabled site would get a "virtual red light" on their browser if that site's policy did not satisfy their own standards, expressed in XML.
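The "virtual red light" idea - a browser parsing a site's machine-readable policy and comparing it against the user's stated limits - can be sketched in a few lines. The element names, attribute names and preference keys below are simplified stand-ins, not the real P3P vocabulary (which defines fixed elements such as POLICY, STATEMENT, PURPOSE and RETENTION):

```python
# Illustrative sketch only: the XML format here is a made-up
# simplification of a P3P-style policy, not real P3P syntax.
import xml.etree.ElementTree as ET

SITE_POLICY = """
<policy site="example.com">
  <collects type="email"/>
  <collects type="clickstream"/>
  <retention period="indefinite"/>
</policy>
"""

# The user's limits, as a browser might store them (hypothetical keys).
USER_PREFS = {
    "blocked_types": {"email"},                            # refuse sites collecting these
    "allowed_retention": {"no-retention", "stated-purpose"},
}

def policy_acceptable(policy_xml: str, prefs: dict) -> bool:
    """Return False (a 'virtual red light') if the policy exceeds the
    user's stated limits, True otherwise."""
    root = ET.fromstring(policy_xml)
    collected = {el.get("type") for el in root.iter("collects")}
    if collected & prefs["blocked_types"]:
        return False
    retention = root.find("retention")
    if retention is not None and retention.get("period") not in prefs["allowed_retention"]:
        return False
    return True

print(policy_acceptable(SITE_POLICY, USER_PREFS))  # False: site collects email
```

Note that, as the critics quoted below point out, nothing in such a check constrains what the site actually does with the data once collected.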

It has, however, been widely criticised as complex, confusing and, in practice, likely to undermine the privacy protection of individual internet users.

Some note the scope for a site to change its policies after obtaining consumer data, or to relinquish data provided in good faith when the business fails or is acquired. The Berkman Center's Jonathan Zittrain for example says "I like P3P but I think it is a red herring," as individual preferences frequently change and data supplied now may be more sensitive later.

For a sample of writings by legal and technical advocates and critics of P3P see Ruchika Agrawal's P3P Viewpoints site.

A detailed paper on P3P by the Center for Democracy & Technology (CDT) and the Ontario Information & Privacy Commissioner is available on the CDT site. The 1998 European Commission report Platform for Privacy Preferences (P3P) & the Open Profiling Standard (OPS) also expressed concern about implementation issues. 

Marc Rotenberg of the Electronic Privacy Information Center (EPIC) offered a sharp critique of P3P, self-regulation and Lessig's Code & Other Laws of Cyberspace (New York: Basic Books 1999) earlier in 2000. There is similar criticism in Karen Coyle's 1999 P3P: Pretty Poor Privacy? statement, questioning the enthusiasm evident in works such as Lorrie Faith Cranor's Web Privacy with P3P (Sebastopol: O'Reilly 2002) - online here.

An Intellectual Capital article around the same time characterised it as DOA, despite frantic efforts at resuscitation.

As things stand, it is difficult to disagree with Yair Galil's 2001 assessment that

P3P is an interesting tool with considerable promise, especially if non-repudiability mechanisms are developed for it, but it is no substitute for privacy legislation. It is a protocol for describing privacy practices; in itself, it does not constrain the use of personal information, and therefore it should not be taken into account by legislatures in assessing the degree of privacy enforced "by the market".

If and when the use of P3P spreads, users and lawyers would do well to scrutinize its specifications with the same care as they now devote to the privacy policies posted on websites.

For the people most concerned about their privacy, other tools available today will provide a better array of protections than P3P-based schemes.

Encryption and Anonymity

There's information about anonymity tools in our separate Security guide.

The Pretty Good Privacy (PGP) standard, developed by Phil Zimmermann in the early 1990s, uses public key cryptography, with one key 'locking' a message and a different key unlocking it.

In principle, if you want to receive encrypted email you simply distribute the public key that locks messages - preventing them from being read by unwanted readers. The sender uses your public key to encrypt the message; you unlock it with your private key.
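The two-key idea can be illustrated with textbook RSA arithmetic on deliberately tiny numbers. This is a toy, nothing like a real PGP key: in practice PGP encrypts the message with a one-off symmetric session key and uses public key cryptography only to lock that session key.

```python
# Textbook RSA with tiny numbers - an illustration of the asymmetric
# 'lock with one key, unlock with the other' idea only.
p, q = 61, 53                 # two secret primes
n = p * q                     # 3233, the modulus shared by both keys
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent: the 'locking' key is (e, n)
d = pow(e, -1, phi)           # 2753, private exponent: the 'unlocking' key (d, n)

message = 65                  # a message, encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone holding the public key can lock
recovered = pow(ciphertext, d, n)  # only the private key holder can unlock

print(ciphertext, recovered)  # 2790 65
```

Distributing (e, n) reveals nothing useful about d as long as n is too large to factor - which is why real keys run to thousands of bits rather than four digits.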

In practice there have been difficulties with escrow agents (entities that safeguard keys in case they get lost) and user-friendliness, highlighted for example in Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0 (PDF) by Alma Whitten & JD Tygar.

As a result it is believed that as of 2002 well under 4 million of the several hundred million email users rely on PGP or competing systems such as S/MIME (Secure Multipurpose Internet Mail Extensions) or SSL (Secure Sockets Layer).

Cookies

Wondering about the mechanics of tracking?  Cookies (New York: McGraw-Hill 1998), by Simon St Laurent, won't satisfy system administrators and those who eat, drink and breathe code but in 500 pages offers an introduction to scripting, architecture and management of the ubiquitous tools for tracking who's visiting sites.
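The mechanics are a simple round trip: the server issues a Set-Cookie header with a persistent identifier, and the browser returns it on every later request to that site, letting the server link those requests to one visitor. A minimal sketch using Python's standard library (the cookie name and value are made up for illustration):

```python
from http.cookies import SimpleCookie

# Server side: issue a persistent identifier with the first response.
out = SimpleCookie()
out["visitor_id"] = "abc123"
out["visitor_id"]["path"] = "/"
out["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # persists for a year
print(out.output())   # the Set-Cookie header sent to the browser

# Browser side: the value comes back in the Cookie header of each
# subsequent request, which is what makes per-visitor tracking possible.
incoming = SimpleCookie()
incoming.load("visitor_id=abc123")
print(incoming["visitor_id"].value)
```

Third-party tracking works the same way; the cookie is simply set from a domain (an advertising network, say) whose content is embedded across many sites.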

A West Virginia Journal of Online Law & Technology article by Viktor Mayer-Schönberger examines cookies and privacy legislation, arguing that companies that set them without consent may violate the European Union Directive on the Protection of Personal Data.

Cookie Monsters? Privacy in the Information Society, a 2001 report of the Senate inquiry into internet privacy legislation, argues that new Australian legislation will "not protect consumers' personal details from information-hungry web bugs" and fails to measure up to global standards. The report calls for a national site certification scheme and for limiting exemptions for small business and the media.

Biometric and other authentication schemes

This site features a more detailed note dealing with biometric technologies and issues.

Particular issues are discussed in the Security & Authentication guide.

version of October 2003
© Caslon Analytics