When you build websites that rely on cookies and need them to work under privacy settings other than the default, you'll have to deal with P3P. Read on to learn the cornerstones of the Platform for Privacy Preferences, and get your hands dirty with an example that guides you from a blank slate to a complete basic implementation.
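The core mechanic behind P3P deployment is simple: any response that sets a cookie should also carry a `P3P` header pointing user agents at the site's full policy file and supplying a compact policy. A minimal sketch in Python follows; the policy path and the compact-policy tokens shown are illustrative placeholders, and a real deployment must use tokens that match the site's actual P3P policy XML.

```python
def p3p_headers(cookie_value: str):
    """Return the HTTP headers for a cookie-setting response with a P3P policy.

    The policyref path and CP tokens below are placeholders for illustration;
    the compact policy must be generated from (and agree with) the full
    machine-readable policy the site publishes.
    """
    return [
        # policyref points browsers at the full P3P policy document;
        # CP is the compact policy evaluated for cookie decisions.
        ("P3P", 'policyref="/w3c/p3p.xml", CP="NOI DSP COR NID CUR OUR"'),
        ("Set-Cookie", f"session={cookie_value}; Path=/"),
        ("Content-Type", "text/html; charset=utf-8"),
    ]


# Example: inspect the headers a response would carry.
for name, value in p3p_headers("abc123"):
    print(f"{name}: {value}")
```

Browsers with elevated cookie-privacy settings (notably Internet Explorer 6 for third-party cookies) evaluate the compact policy before accepting the cookie, which is why the header must accompany the `Set-Cookie` rather than being served only with the policy file.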
Governments and large organizations, driven by legal and administrative concerns such as taxation and security, typically address the practical aspects of identity we experience daily: issuing IDs and credentials and deciding the mechanisms for their verification. This division of responsibility for defining and executing the construct of personal identity is nearly as old as the mind/body schism at the heart of Western culture.
Not all Facebook users appreciated the September 2006 launch of the 'News Feeds' feature. Concerned about privacy implications, thousands of users vocalized their discontent through the site itself, forcing the company to implement privacy tools. This essay examines the privacy concerns voiced following these events. Because the data made easily visible were already accessible with effort, what disturbed people was primarily the sense of exposure and invasion. In essence, the 'privacy trainwreck' that people experienced was the cost of social convergence.
As we set out to enhance personalization on Marriott.com, we realized we needed guidelines to inform our thinking and shape our decisions, particularly decisions related to customer privacy. Our earlier user research revealed the need for greater personalization and helped us understand customer attitudes towards privacy. From there, we sought to build customer trust and loyalty by addressing concerns about privacy and security in every aspect of the user experience. In creating the Guiding Principles outlined here, we conducted a thorough analysis of eight major websites and then merged the findings with what we already knew. These principles apply specifically to 'remember me' personalization.
Several surveys attest to growing public concerns regarding privacy, aggravated by the diffusion of information technologies. A policy of self-regulation that allows individual companies to implement self-designed privacy statements is prevalent in the United States. These statements rarely provide specific privacy guarantees that personal information will be kept confidential. This study provides a discourse analysis of such privacy statements to determine their overall efficacy as a policy measure. The in-depth analysis of privacy statements revealed that they offer little protection to the consumer, instead serving to authorize business practices which allow companies to profit from consumer data. Using public good theory as a foundation, policy implications are discussed.
For those of us who work with computers, the value of identifying ourselves to Web sites is increasingly obvious: no more retyping our name and address information, less need to memorize dozens of log-in passwords and paths to specific Web pages, less spam, and fewer irrelevant banner ads. But even those of us who appreciate the value of sharing some personal information—both users and those who run the sites—are growing increasingly uncomfortable with the potential for abuse inherent in having information about our identities and preferences broadly available.
The following is an attempt to outline a charter of rights for the user of web applications. These rights are, of course, unenforceable, but compliance with them would represent best practice in the design of user-centred interfaces. More significantly, any violation of the charter would indicate the presence of significant usability problems detrimental to the user experience, and failing to address the requirements of the user leads to frustration, irritation and, consequently, lost business.