CHAPTER 12
Privacy

“If everybody minded their own business,” the Duchess said in a hoarse growl, “the world would go round a deal faster than it does.”
—Lewis Carroll, Alice’s Adventures in Wonderland

LEARNING OBJECTIVES
After you have read this chapter, you should be able to:
• Explain the American approach to the regulation of privacy
• Understand constitutional sources of the right to privacy
• Discuss common law torts for the invasion of privacy
• Explore the privacy concerns arising out of online marketing, including online behavioral advertising, unsolicited commercial email, and the use of web beacons
• Explain the key federal laws that regulate privacy, including the GLB Act, COPPA, HIPAA, and the Electronic Communications Privacy Act
• Analyze emerging data security requirements
• Identify key cases in privacy law

Introduction
One well-regarded definition of privacy classifies it as the right “to be let alone.”1 In a Harvard Law Review article from 1890, Samuel D. Warren and Louis D. Brandeis contend:

Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing to the individual what Judge Cooley calls the right “to be let alone.”

Looking back, it seems that Warren and Brandeis were prophetic. Considering their references to “recent inventions and business methods,” one can’t help but wonder whether they could have foreseen a time when the right “to be let alone” would be increasingly threatened by complex online social networks; global positioning systems (GPS) that allow rental car companies, employers, and others to track one’s location and speed; surveillance cameras in public places; massive data aggregation services; and other modern privacy threats.
The scope of modern data collection practices is evident with startling clarity in Exhibit 12.1, which depicts a personal data ecosystem flowchart distributed at a series of recent workshops on privacy conducted by the Federal Trade Commission (FTC).

1 Samuel D. Warren and Louis D. Brandeis, “The Right to Privacy,” 4 Harv. L. Rev. 193 (1890).

Copyright 2010 Cengage Learning. All Rights Reserved. May not be copied, scanned, or duplicated, in whole or in part. Due to electronic rights, some third party content may be suppressed from the eBook and/or eChapter(s). Editorial review has deemed that any suppressed content does not materially affect the overall learning experience. Cengage Learning reserves the right to remove additional content at any time if subsequent rights restrictions require it.

[EXHIBIT 12.1 Personal Data Ecosystem — a flowchart tracing personal data from the individual through data collectors (sources), including Internet services (social networking services, retail and content websites, search engines), medical providers (hospitals, doctors and nurses), financial and insurance companies (stock companies, insurers, banks), retail (retail stores, airlines, credit card companies), telecommunications and mobile carriers (mobile providers, cable companies), government agencies, and media; through data brokers (information brokers, list brokers, catalog co-ops, ad networks and analytics companies, credit bureaus, healthcare analytics, media archives, websites, affiliates); to data users, including employers, banks, marketers, government, law enforcement, lawyers/private investigators, product and service delivery firms, media, utility companies, and individuals.]
This chapter provides an overview of existing privacy law and examines privacy concerns arising out of a variety of online activities. In doing so, students are encouraged to consider whether the existing legal framework adequately protects the privacy of those active on the Internet.

Sources of the Right to Privacy

U.S. Constitution
Although the U.S. Constitution does not specifically recognize a right to privacy, there are important privacy protections enshrined in it. Consider, first, the Ninth Amendment, which reads:

The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.

This amendment was likely the genesis of the privacy right that has evolved in the courts and through the teachings of scholars such as Warren and Brandeis. In addition, the Fourth and Fifth Amendments are also sources of the “right to privacy.” The Fourth Amendment provides:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

In Griswold v. Connecticut,2 the U.S. Supreme Court declared unconstitutional a state law prohibiting the use of birth control devices and the provision of advice concerning their use. The Court also recognized that the Bill of Rights provides what it deemed to be “zones of privacy,” or areas or locations where privacy is expected. Later cases held that an important element of this right was to establish the existence of a “reasonable expectation of privacy” in the particular zone of privacy.
The following are the minimum requirements for establishing a “reasonable expectation of privacy”:

1. The person exhibits an actual expectation of privacy. To understand this concept, consider what you expect when entering an area or location, such as your bedroom, that you desire to be off-limits to others. Alternatively, consider the level of privacy that an employee should anticipate with respect to his or her computer, email, or voicemail.
2. Society recognizes the expectation as reasonable. In addition to your own expectations regarding privacy, what do others believe to be your expectation of privacy when you close the door to your bedroom or office, or when you send an email or surf a website?

For the purpose of our discussion of privacy rights online, these requirements must be satisfied with respect to the mass of information (much of which is personal in nature) being disseminated over the Internet.

Next, consider the Fifth Amendment, which protects us from government action that could result in self-incrimination. That provision reads in part:

No person … shall be compelled, in any criminal case, to be a witness against himself.

This does not apply when a person voluntarily turns over documents, records, files, and papers to a law enforcement agency or official. Similarly, the public records of a corporation are not subject to this provision, even if they contain incriminating evidence.

2 381 U.S. 479 (1965).
Although the Fifth Amendment may be most commonly associated with the right to refrain from testifying against oneself in a criminal trial, it also has application to the online world. An interesting application of the Fifth Amendment to cyberlaw involves the act of encrypting a file that contains possibly incriminating information. Encryption involves encoding methods that block access to certain documents. In Doe v. United States,3 the Supreme Court held that an individual could “be forced to surrender a key to a strongbox containing incriminating documents, but not to reveal the combination to his wall safe … by word or deed.” This case seems to imply that a law enforcement agency, pursuant to a valid search warrant, could obtain an encrypted file. However, the decision in Doe would likely prevent the agency from forcing a defendant to supply the private key, password, or code that could enable decryption or decoding.

Doe raises questions regarding employees who store potentially criminal information on their employers’ computers. If the information belongs to the employer and not the employee, it is possible that a court would allow the employer to access it and use it not only to fire the employee but also in connection with any law enforcement action. Of course, this presupposes that the company has the ability to require the employee to allow access, either through company policy or applicable law.

State Constitutions
In addition to the U.S. Constitution, state constitutions are a source of important privacy rights. In general, rights established by state constitutions mirror the amendments discussed above in content and, similarly, apply only to public employees. However, some states afford greater protection against government violations of privacy. Beyond constitutional protections, states are playing a significant role in privacy regulation, with many enacting laws to address a range of privacy and data security considerations.
Several of these measures are discussed further below under State Privacy Laws.

Common Law Torts for the Invasion of Privacy
There are four main privacy-related torts recognized by common law and by the Restatement (Second) of Torts. These torts provide monetary and injunctive relief for an unreasonable or unwarranted invasion of the right to privacy. They could also provide remedies for a cause of action in cases involving privacy rights online.4 The four torts are: (1) intrusion upon seclusion; (2) public disclosure of private facts causing injury to reputation; (3) publicly placing another in a false light; and (4) appropriation of a person’s name or likeness causing injury to reputation, each as discussed further below.

Intrusion Upon Seclusion
The right of each of us to go to a place of seclusion and to be left alone is not absolute. But when another individual, without permission or legal justification, violates a place of seclusion, that individual may be found to have committed this tort. The Restatement (Second) of Torts defines intrusion upon seclusion5 as:

Intentionally intruding, physically or otherwise, upon the solitude of another or his private affairs or concerns.

3 487 U.S. 201 (1988).
4 For example, in the Boring v. Google case, discussed later in this chapter, the plaintiffs claimed that Google had committed certain privacy torts in connection with its Street View service.
5 Restatement (Second) of Torts § 652B (1977).
In order to proceed, a plaintiff would need to prove the following elements:
• There was an intent to intrude, or knowledge that the intrusion would be wrong;
• There was a reasonable expectation of privacy; and
• The intrusion was substantial and highly offensive to a reasonable person.

The tort of intrusion upon seclusion can apply in the online world. Consider the following case.

STEINBACH v. VILLAGE OF FOREST PARK
No. 06-4215, 2009 WL 2605283 (N.D. Ill. Aug. 25, 2009)

FACTS
[Steinbach, a local elected official, had an email account issued by the municipality. A third-party company, Hostway, provided the technology for the account. Steinbach logged in to her Hostway webmail account and noticed eleven messages from constituents that had been forwarded by someone else to a political rival. Steinbach sued the municipality, her political rival, and an IT professional employed by the municipality. She brought numerous claims, including violation of the Federal Wiretap Act, the Stored Communications Act, and the Computer Fraud and Abuse Act. She also brought a claim under Illinois common law for intrusion upon seclusion.]

JUDICIAL OPINION (JUDGE ZAGEL)
[The plaintiff in this case brought a number of claims. Of most relevance here is the Court’s discussion of the intrusion upon seclusion tort:]

While it is true that the Illinois Supreme Court has not explicitly recognized the tort of intrusion upon seclusion, this Court has found that the tort does exist. Ludemo v. Klein, 771 F. Supp. 260 (N.D. Ill. 1991) (Zagel, J.). When state tort law is unclear, a federal district court can follow the decisions of the state appellate court governing the geographical area where the alleged tort took place, unless there is reason to believe the Supreme Court of Illinois would decide the question differently. Id. at 261-62. In diversity cases it is the determination of what the highest court of the state would do that is the key question.
Because Forest Park is located in Cook County, which is in the First District, case law from the First District Appellate Court governs. The First District formally recognized the existence of the tort in Illinois. Busse v. Motorola, Inc., 813 N.E.2d 1013 (Ill. App. Ct. 2004). Therefore, this Court recognizes the existence of the tort under the First District ruling, since there is no hint of contrary reasoning from the highest court in Illinois. (Footnotes omitted.)

Under Busse, all four elements of the tort must be satisfied. Id. at 1017. These elements are: (1) defendant committed an unauthorized prying into the plaintiff’s seclusion; (2) the intrusion would be highly offensive to the reasonable person; (3) the matter intruded upon was private; and (4) the intrusion caused plaintiff to suffer. Id.

Calderone first argues that his access to Steinbach’s email was authorized by Illinois statute. Section 3.1-35-20 states that “the mayor or president at all times may examine and inspect the books, records and papers of any agent, employee or officer of the municipality.” 65 ILL. COMP. STAT. 5/3.1-35-20. It is true that “records” includes all “digitized electronic material,” thus including email within the definition. 5 ILL. COMP. STAT. 160/2. While there is minimal case law interpreting the statute, it is clear that it does not give a mayor carte blanche to use the records for his own purposes. In one case, the court found a city mayor was authorized under the statute to access private juvenile records which discussed questionable police officer behavior. People v. Urbana, 338 N.E.2d 220, 221, 222 (Ill. App. Ct. 1975). That access is different from what Calderone is accused of here. In Urbana, the mayor was not using the access for his own personal use, but for an official city investigation related to his position as mayor. Id. at 221. Here, Calderone did not access Steinbach’s email for any official purpose, but rather for his own personal gain.
Therefore, Defendant’s argument that his actions were statutorily authorized fails.

Calderone also argues that Steinbach did not sufficiently plead the privacy of the emails. In the Verified Third Amended Complaint, Steinbach alleges that the emails were meant and intended to be private. Complaint, ¶¶ 28, 40, 43. Steinbach sufficiently pled the third element of the tort to survive a Motion to Dismiss. Whether the emails actually were private is a question of fact which cannot be determined at this stage. Therefore, Count 8 must stand.

CASE QUESTIONS
1. What are the elements of the tort of intrusion upon seclusion?
2. Ethical Consideration: Do you think there is a role for the tort of intrusion upon seclusion in the online world, or is it better limited to the “real” world?
3. While some plaintiffs have had success bringing claims for intrusion upon seclusion based on online activities, others have not enjoyed similar success. What are the particular challenges of bringing claims for the tort of intrusion upon seclusion for privacy violations occurring online?

Public Disclosure of Private Facts Causing Injury to Reputation
Public disclosure of private facts causing injury to reputation concerns the public disclosure or transmission of highly personal facts or information about an individual that results in injury to reputation.
In some instances, the tort is associated with the tort of defamation, and both may be used as separate causes of action arising out of the same case. In addition to the elements of “intent or knowledge” and “highly offensive to a reasonable person,” the public disclosure of private facts causing injury to reputation requires that: (1) the facts be private; and (2) the communication or publicity be disclosed to a significant segment of the community. Further, this tort is committed when the facts publicized would be (1) highly offensive to a reasonable person and (2) not of legitimate concern to the public.

Publicly Placing Another in False Light
According to the Restatement (Second) of Torts:

One who gives publicity to a matter concerning another that places the other before the public in a false light is subject to liability to the other for invasion of his privacy, if (a) the false light in which the other was placed would be highly offensive to a reasonable person, and (b) the actor had knowledge of or acted in reckless disregard as to the falsity of the publicized matter and the false light in which the other would be placed.6

Publicly placing another in false light is also associated with the tort of defamation and involves falsely connecting a person to an immoral, illegal, or embarrassing situation resulting in injury to one’s reputation. To date, this tort has not been the subject of much litigation concerning online activities.

Misappropriation of a Person’s Name or Likeness Causing Injury to Reputation
The appropriation of a person’s name or likeness causing injury to reputation7 typically involves using a living person’s name or likeness for a commercial and non-newsworthy purpose without the individual’s permission. There are three elements that must be shown to prove that this tort has been committed: (1) the person’s name, portrait, or picture was used; (2) for purposes of trade or advertising; (3) without the person’s written consent.
6 Restatement (Second) of Torts § 652E (1977).
7 Restatement (Second) of Torts § 652C (1977).

An example from the Restatement (Second) of Torts is as follows: “A is an actress, noted for her beautiful figure. B, seeking to advertise his bread, publishes in a newspaper a photograph of A, under the caption, ‘Keep That Sylph-Like Figure by Eating More of B’s Rye and Whole Wheat Bread.’ B has invaded A’s privacy.”8

Federal Privacy Laws
As compared with other countries, including the member states of the European Union, the United States has adopted a rather piecemeal approach to data privacy. Instead of having a single comprehensive data privacy law that applies across all industries, American lawmakers provide special protections to certain types of data, such as consumer credit information, medical records, and even video rental data. While U.S. legislators have not yet elected to enact privacy legislation that applies to all types of personal data, lawmakers have passed significant federal laws that apply to particular types of personal data. Although these laws have a limited scope, they generally establish broad requirements and stringent restrictions with respect to the collection and use of the particular types of personal data to which they apply.

Privacy Protection Act
Congress enacted the Privacy Protection Act (PPA)9 to reduce the chilling effect of law enforcement searches and seizures on publishers.
The PPA prohibits government officials from searching or seizing any work product or documentary materials held by a “person reasonably believed to have a purpose to disseminate to the public a newspaper, book, broadcast, or other similar form of public communication,”10 unless there is probable cause to believe the publisher has committed or is committing a criminal offense to which the materials relate. The PPA effectively forces law enforcement to use subpoenas or voluntary cooperation to obtain evidence from those engaging in First Amendment activities.

Privacy Act of 1974
The Privacy Act of 197411 covers nearly all personal records maintained by federal agencies and some federal contractors. It applies to military health records, veterans’ records, Indian Health Service records, Medicare records, and medical records of other federal agencies. The Privacy Act of 1974 does not apply to most hospitals, clinics, or physicians, even if they receive federal funds or are tax-exempt. Generally speaking, the Privacy Act of 1974 grants people four rights: (1) to find out what information the government has collected about them; (2) to see, and have a copy of, that information; (3) to correct or amend that information; and (4) to exercise limited control over the disclosure of that information to other parties.

Cable Communications Policy Act
The Cable Communications Policy Act12 (“Cable Act”) was enacted to amend the Communications Act of 1934.13

8 Restatement (Second) of Torts § 652C (1977).
9 Privacy Protection Act, 42 U.S.C.A. §§ 2000aa to 2000aa-12 (1980).
10 Id.
11 5 U.S.C. § 552a (1974).
12 Cable Communications Policy Act of 1984, Pub. L. No. 98-549, 98 Stat. 2779 (codified at 47 U.S.C. § 521).
13 47 U.S.C. § 151 (1934).

The Cable Act establishes a comprehensive framework
for cable regulation and establishes strong protections for subscriber privacy by restricting the collection, maintenance, and dissemination of subscriber data. It prohibits cable operators from using the cable system to collect “personally identifiable information” concerning any subscriber without prior consent, unless the information is necessary to render service or detect unauthorized reception. It also prohibits operators from disclosing personally identifiable data to third parties without consent, unless the disclosure is either necessary to render a service provided by the cable operator or is made to a government entity pursuant to a court order.

Video Privacy Protection Act
The Video Privacy Protection Act14 aims to protect the privacy of consumers’ video rental histories. Subject to certain limited exceptions, the Act prohibits videotape service providers from disclosing to third parties personally identifiable information about individuals who rent or buy videos. The Act provides that customers may bring an action against any video store that discloses personally identifiable information. Actual damages are recoverable under the Act, but must not be less than liquidated damages in the amount of $2,500. In addition, punitive damages, as well as reasonable attorney’s fees and litigation costs, and “preliminary and equitable” relief may be recoverable as deemed appropriate. There is a two-year statute of limitations on proceedings brought pursuant to the Act.
Telephone Consumer Protection Act
The Telephone Consumer Protection Act of 1991 (TCPA)15 was enacted in response to consumer complaints about intrusive telemarketing practices and concerns about the impact of such practices on consumer privacy. The Act amends Title II of the Communications Act of 1934 and requires the Federal Communications Commission (FCC) to promulgate rules “to protect residential telephone subscribers’ privacy rights.” In response to the TCPA, the FCC issued a Report and Order requiring any person or entity engaged in telemarketing to maintain a list of consumers who request not to be called. The TCPA further restricts the use of automated dialing systems, artificial or prerecorded voice messages, SMS text messages sent to cell phones, and the use of fax machines to send unsolicited advertisements. It also restricts solicitors from calling before 8 a.m. or after 9 p.m. Individuals are entitled to collect damages directly from the solicitor of $500 to $1,500 for each violation, or to recover actual monetary loss, whichever is higher.

Electronic Communications Privacy Act
The Electronic Communications Privacy Act (ECPA)16 places restrictions on the interception of electronic communications and creates privacy protections for stored electronic communications. The ECPA was built on past law and was enacted to guard against potential abuses and constitutional violations in the area of electronic surveillance.

14 Video Privacy Protection Act of 1988, Pub. L. No. 100-618, 102 Stat. 3195 (codified at 18 U.S.C. § 2710).
15 Telephone Consumer Protection Act of 1991, Pub. L. No. 102-243 (codified at 47 U.S.C. § 227).
16 Electronic Communications Privacy Act, Pub. L. No. 99-508, 100 Stat. 1848 (1986) (codified at 18 U.S.C. §§ 2510–2521, 2701–2710, 3117, 3121–3126).
Prior to the ECPA, the key legislation applicable to communications surveillance was Title III of the Omnibus Crime Control and Safe Streets Act of 1968. This measure codified rules regulating the use of wiretaps by law enforcement and provided that wiretaps may be used only (1) in the investigation of certain identified categories of offenses, including counterfeiting, fraud, and offenses punishable by death; (2) when the government can show probable cause that the suspect has committed, is committing, or is about to commit an allowable offense; and (3) when there is probable cause that a wiretap will reveal communications concerning the offense. This measure also required the government to show that other means of obtaining the information had been exhausted. Under the terms of the law, court orders could authorize surveillance for a maximum of 30 days (with the possibility of a 30-day extension). Furthermore, authorities conducting surveillance were required to provide reports to the court every 7–10 days.

Generally, the ECPA modifies Title III by imposing higher standards that must be met before the government is allowed to conduct electronic surveillance. In order to obtain a court order authorizing electronic surveillance under Title III, the law enforcement agency must state the alleged offense being committed, the interception point for the communications, a description of the types of conversations to be intercepted, and the identity of the persons anticipated to be intercepted.
Probable cause must be demonstrated with particularity, and it must be shown that normal investigative techniques are not effective in the investigation. The requirements for obtaining an order to allow the use of a pen register are less stringent: law enforcement need only show that the information likely to be obtained is relevant to an ongoing criminal investigation. Although the rules governing electronic surveillance were applied by the courts to Internet communications, the Patriot Act (discussed in further detail in the next chapter) amended the statutes governing electronic surveillance to extend explicitly to Internet communications.

The ECPA is broken down into individual titles that cover different areas related to electronic communications.

Title I concerns the acquisition and disclosure of communication streams, with the aim of protecting both voice and data communications while in transit. The legislation’s coverage of wire communications is limited to aural transfers made through cable, wire, and similar transmission media maintained by persons engaged in the business of providing or operating facilities for interstate or foreign communications. The ECPA prohibits the interception of oral, wire, and electronic communications by private and public parties unless specifically authorized by statute or by a court order.

Title II governs both the acquisition and disclosure of stored information. It prohibits unauthorized access to, or use of, stored communications, and it prohibits service providers from disclosing the content of such stored communications except in certain limited circumstances. Disclosures are permissible when authorized by the sender or receiver of the message; when necessary for the effective rendition of the service or system; or when pertaining to the commission of a crime or law enforcement.

Title III concerns the acquisition and disclosure of transactional information.
The provisions of this title contain restrictions on the use of mobile tracking devices, pen registers, and trap-and-trace devices. Such restrictions were modified by the Patriot Act. The statute authorizes individuals or entities aggrieved by any intentional violation of the ECPA to commence a civil action. Appropriate relief for individuals and/or entities damaged as a result of a violation of the statute may include preliminary, equitable, or declaratory relief, as appropriate, actual damages, attorney’s fees, and court costs.

Fair Credit Reporting Act and FACTA
On December 4, 2003, the Fair and Accurate Credit Transactions Act of 2003 (FACTA)17 was signed into law, amending the then soon-to-expire Fair Credit Reporting Act (FCRA), which had created national credit reporting standards. The FCRA imposed duties on consumer reporting agencies. Specifically, the law requires every consumer credit reporting agency to take appropriate measures to prevent any inappropriate disclosure of information. For their part, prospective users of information must identify themselves, certify the purposes for obtaining the information, and certify that the information will not be used for any unauthorized purposes. Under the FCRA, consumers have the right to opt out of receiving pre-approved credit card offers in the mail. FACTA includes a number of consumer protections, including new tools to improve the accuracy of credit information and to help fight identity theft.
The Act includes several notable provisions regarding general consumer rights, as summarized below:
• adding several responsibilities for companies that furnish information to the credit bureaus;
• providing more specific standards for accuracy of data maintained by reporting agencies;
• giving consumers the right to correct inaccuracies in data profiles; and
• providing consumers the right to opt out of unsolicited offers.
Given the emphasis placed on data security during this time period, it is not surprising that FACTA includes several notable provisions regarding identity theft, including the following:
• Procedures at credit bureaus to handle fraud alerts were required by December 1, 2004.
• Consumers are given the right to place fraud alerts on their credit reports and to block credit bureaus from reporting information in their credit files that resulted from identity theft.
• Credit bureaus must implement a 90-day alert on a consumer's credit file when the consumer's wallet or purse is lost or stolen.
• A consumer may request a 7-year alert where, in addition to the loss or theft of a wallet or purse, evidence exists that a thief opened credit accounts in the consumer's name.
• A military duty alert can be requested because military personnel serving abroad would not be opening many new accounts during the time of service.
• Consumers are entitled to obtain information about accounts or transactions in their name that result from identity theft.
Although FACTA establishes certain important consumer protections, it has been criticized on a number of significant grounds, including that it preempts states from implementing more stringent—and perhaps more effective—laws concerning identity theft.

17 Fair and Accurate Credit Transactions Act of 2003, Pub. L. No. 108-159, 117 Stat. 195.

Gramm-Leach-Bliley Act

The United States also has strong privacy legislation in the financial sector. Although the Gramm-Leach-Bliley Act (GLB Act)18 is a very broad measure, subtitle A of Title V
focuses specifically on the data privacy of financial institution consumers. Specifically, the GLB Act focuses on the privacy of nonpublic information of individuals who are customers of financial institutions, restricting those institutions' ability to disclose a consumer's personal financial information to nonaffiliated third parties. Furthermore, the GLB Act also obliges financial institutions to provide notices about their information-collection and information-sharing practices to their customers and, subject to certain limited exceptions, to provide such individuals the opportunity to "opt out" if they do not wish for their information to be shared with nonaffiliated third parties. The GLB Act, however, does provide specific exceptions. For example, a financial institution must provide notice—but not the right to opt out—when it provides nonpublic personal information to a third-party service provider that performs services for that financial institution, or to another financial institution with which the financial institution has entered into a joint marketing agreement. A third-party service provider may market the financial institution's own products and services or the financial products or services offered under a "joint marketing agreement" between the financial institution and one or more other financial institutions.

18 Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801–6809.
A joint marketing agreement with other financial institutions means a written contract pursuant to which those institutions jointly offer, endorse, or sponsor a financial product or service. However, to take advantage of this exception the financial institution must: (1) provide the initial notice as required to consumers and customers; and (2) enter into a contract with the third-party service provider or financial institution under a joint marketing agreement that prohibits the disclosure or use of the information other than for the purpose for which it was disclosed. There are also certain exceptions to the notice and opt-out requirements. These exceptions are:
• disclosures necessary to effect, administer, or enforce a transaction that a consumer requests or authorizes;
• disclosures made in connection with servicing or processing a financial product or service that a consumer requests or authorizes; maintaining or servicing a consumer's account; or a proposed or actual securitization, secondary market sale (including the sale of servicing rights), or similar transactions;
• disclosures made with consumer consent;
• to protect the confidentiality or security of records;
• to protect against or prevent actual or potential fraud;
• for required institutional risk control or for resolving consumer disputes or inquiries;
• to persons holding a legal or beneficial interest relating to the consumer;
• to persons acting in a fiduciary or representative capacity on behalf of the consumer (e.g., the consumer's attorney);
• to provide information to insurance rate advisory organizations, persons assessing compliance with industry standards, or the financial institution's attorneys, accountants, or auditors;
• to law enforcement entities or self-regulatory groups (to the extent permitted or required by law);
• to comply with federal, state, or local laws;
• to comply with a subpoena or other judicial process;
• to respond to summonses or other requests from authorized government authorities;
• pursuant to the Fair Credit Reporting Act, to a consumer reporting agency or from a consumer report generated by a consumer reporting agency; or
• in connection with a proposed or actual sale, merger, transfer, or exchange of all or a portion of a business or operating unit.

The GLB Act also contains significant security provisions requiring the FTC and certain other federal agencies to establish standards with which financial institutions must comply in order to protect the security of their customers' nonpublic information. Furthermore, the FTC has issued a separate rule on Standards for Safeguarding Customer Information, which requires financial institutions to develop, implement, and maintain a comprehensive information-security program that contains administrative, technical, and physical safeguards.
As part of such a security program, each financial institution must:
(1) designate an employee or employees to coordinate the information security program;
(2) identify reasonably foreseeable internal and external risks to the security, confidentiality, and integrity of customer information that could result in the unauthorized disclosure, misuse, alteration, destruction, or other compromise of information, and assess the sufficiency of any safeguards in place to control those risks;
(3) ensure that contractors or service providers are capable of maintaining appropriate safeguards for the customer information and require them, by contract, to implement and maintain such safeguards; and
(4) adjust the information security program in light of developments that may materially affect the entity's safeguards.

Health Insurance Portability and Accountability Act

The United States has comprehensive federal health privacy legislation, the key measure being the Health Insurance Portability and Accountability Act of 1996 (HIPAA).19 This legislation, which saw substantial recent revisions via the Health Information Technology for Economic and Clinical Health Act (HITECH),20 establishes a number of rules and requirements relating to the privacy and security of individually identifiable health information, which is defined as:

[I]nformation that is a subset of health information, including demographic information collected from an individual, and:
1. Is created or received by a health care provider, health plan, employer, or health care clearinghouse; and
2. Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual; and (i) that identifies the individual; or (ii) with respect to which there is a reasonable basis to believe the information can be used to identify the individual.
The data privacy and security requirements of HIPAA apply to health plans, healthcare providers, and healthcare clearinghouses (collectively known as Covered Entities). Among its many requirements, HIPAA mandates the creation and distribution of privacy policies that explain how all individually identifiable health information is collected, used, and shared, and it establishes strict controls on how that information is used and disclosed. Entities subject to HIPAA's requirements are not permitted to use or disclose individually identifiable health information except as expressly permitted by the provisions of the HIPAA privacy rule. Although the requirements of HIPAA apply specifically to Covered Entities, other types of entities may be impacted by HIPAA because HIPAA requires Covered Entities to execute Business Associate Agreements with all third-party service providers that will have access to individually identifiable health information. Accordingly, in addition to impacting Covered Entities directly, the law has indirect implications for a range of vendors providing services to Covered Entities. In 2009, fairly significant changes to HIPAA were enacted.

19 Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936 (codified as amended in scattered sections of 42 U.S.C.).
20 Health Information Technology for Economic and Clinical Health Act, Pub. L. No. 111-5, 123 Stat. 226 (2009).
In response to concerns over the recession, on February 17, 2009, President Obama signed into law the American Recovery and Reinvestment Act of 2009 (ARRA).21 The ARRA received considerable attention for its tax and spending provisions, but it also made very significant changes to certain aspects of healthcare regulation, in particular the privacy and security of health information. Title XIII of ARRA, the Health Information Technology for Economic and Clinical Health Act (HITECH Act), dedicated $22 billion in federal funding to advance the use of health information technology. Recognizing that effective data privacy and security is a necessary prerequisite to the digitization of our healthcare system, Subtitle D of the HITECH Act also modified many of HIPAA's privacy and security provisions in fairly significant ways, as discussed further below.

The HITECH Act implements many noteworthy changes that impact Business Associates of HIPAA Covered Entities. Most notably, the legislation makes Business Associates (and not just the Covered Entities to which they provide services, as had been the case) directly subject to HIPAA's privacy and security requirements, as well as to the penalties for violating those requirements. This expansion of the government's jurisdiction over HIPAA enforcement is a dramatic shift from former policy. Prior to the HITECH Act, Business Associates were not directly subject to HIPAA. Rather, HIPAA required Covered Entities to contract with Business Associates to ensure that they would protect all protected health information (PHI) obtained from the Covered Entity in accordance with HIPAA's requirements. This requirement was part of HIPAA prior to the enactment of the HITECH Act and continues under the HITECH Act. Before the HITECH Act, however, a Business Associate that failed to comply with HIPAA's security and/or privacy requirements faced only the threat of contractual liability to the Covered Entity, not direct enforcement action by regulators.
Under the changes ushered in by the HITECH Act, Business Associates are now subject to the same government civil and criminal penalties as Covered Entities. This is clearly a significant shift that results in an entirely new group of entities having the potential to be subject to HIPAA's civil and monetary penalties. In addition to this major shift, the HITECH Act subjects Business Associates to a number of substantive provisions of the regulations, including the requirements to implement administrative, physical, and technical safeguards to protect PHI. Although the HITECH Act directly regulates the conduct of Business Associates, they are still required to enter into Business Associate Agreements with the Covered Entities to which they provide services. These contracts, whether new or already in existence, must reflect the policy shift described above. Accordingly, Covered Entities and Business Associates will need to reevaluate and revise existing Business Associate Agreements. Additionally, any organization that transmits PHI to a Covered Entity or its Business Associate and requires routine access to such PHI, or any vendor that contracts with a Covered Entity to offer personal health records to patients as part of the Covered Entity's electronic health record, will be required to enter into a contractual agreement with the Covered Entity and will be treated as a Business Associate.

Quite significantly, the HITECH Act also imposes new data breach notification requirements. Although certain states (e.g., California) with breach notification laws have recently begun to extend the reach of those laws to include healthcare information, there had been no federal requirement to date. Moreover, although the majority of

21 American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, 123 Stat. 115.
states have breach notification laws for data that can be used to commit financial identity theft, only a small minority of states extend these notification requirements to breaches involving health information. Issues concerning data breaches, including the duties of Covered Entities, Business Associates, and vendors of Personal Health Records to notify impacted individuals of data breaches, will be discussed further in the section of this publication concerning data security. However, it is important to note that under the HITECH Act, Covered Entities will be required to notify individuals upon any compromise of their unsecured PHI, and Business Associates will be required to notify Covered Entities of such a breach. The breach notification must be made without unreasonable delay and within no more than 60 days following the detection of the breach. Furthermore, if the breach involves the data of more than 500 individuals, the Covered Entity must notify the Department of Health and Human Services (HHS) at the time of the discovery, as well as "prominent media outlets" in the applicable area. Significantly, details of such large breaches will be posted on the HHS website for public viewing. With regard to breaches involving fewer than 500 individuals, in addition to the notification obligations, Covered Entities suffering a breach will be required to maintain a log of such breaches to be submitted annually to HHS.

Children's Online Privacy Protection Act

In the late 1990s, there was considerable concern regarding privacy on the Internet.
There was even talk that the United States might enact a very broad law that would apply generally to privacy online. While U.S. legislators did not pass such a measure, they did enact a law targeted toward a particular area of concern—the privacy of children online. The Children's Online Privacy Protection Act (COPPA)22 was signed into law on October 21, 1998. Its goals are: (1) to enhance parental involvement in order to protect the privacy of children in the online environment; (2) to help protect the safety of children in online forums such as chat rooms, home pages, and pen-pal services in which children may make public postings of identifying information collected online; and (3) to limit the collection of personal information from children without parental consent.

Under COPPA, operators of websites directed to children under 13, or operators who knowingly collect personal information from children under 13 on the Internet, must provide parents with notice of their information practices. Subject to certain very limited exceptions, such operators must also obtain prior, verifiable parental consent for the collection, use, and/or disclosure of personal information from children. Furthermore, upon request, operators must provide a parent with the ability to review the personal information collected from his or her child. The legislation also compels operators to provide parents with the opportunity to prevent the further use of personal information that has already been collected, or the future collection of personal information from that child. In addition, website operators must limit the collection of personal information for a child's online participation in a game, prize offer, or other activity to information that is reasonably necessary for the activity. Finally, the legislation mandates the establishment and maintenance of reasonable procedures to protect the confidentiality, security, and integrity of the personal information collected.
The passage of COPPA is further evidence of the commitment on the part of U.S. legislators and regulators to protect the privacy of children. The FTC has long demonstrated an interest in protecting children from unfair information collection practices online, investigating and even commencing enforcement actions against companies alleged to have engaged in unfair or deceptive information collection practices involving children. For example, in 1998, the FTC commenced a proceeding against GeoCities for deceptive practices in connection with GeoCities' collection and use of personal identifying information. In addition, around the same time, the FTC also commenced an action against Liberty Financial Co. Inc. based upon allegations that the company was collecting data from children and using such data in a manner that was inconsistent with the company's stated policies. Since the passage of COPPA, the FTC has brought additional enforcement actions against entities that have violated the privacy rights of children.

The FTC is considering possible revisions to COPPA. In March of 2010, the FTC released a request for public comments.23 In seeking comments from the public, the FTC focused on the issue of whether technology warrants changes to the COPPA Rule.

22 Children's Online Privacy Protection Act of 1998, Pub. L. No. 105-277, 112 Stat. 2581 (codified at 15 U.S.C. §§ 6501–6506).
Among other questions, the FTC sought comment on the following:
• the implications for COPPA enforcement raised by mobile communications, interactive television, interactive gaming, or other similar interactive media;
• the use of automated systems—those that filter out any personally identifiable information prior to posting—to review children's Web submissions;
• whether operators have the ability to contact specific individuals using information collected from children online, such as persistent IP addresses, mobile geolocation data, or information collected in connection with behavioral advertising, and whether the Rule's definition of "personal information" should be expanded accordingly;
• whether there are additional technological methods to obtain verifiable parental consent that should be added to the COPPA Rule, and whether any of the methods currently included should be removed;
• whether parents are exercising their right under the Rule to review or delete personal information collected from their children, and what challenges operators face in authenticating parents; and
• whether the Rule's process for FTC approval of self-regulatory guidelines—known as safe harbor programs—has enhanced compliance, and whether the criteria for FTC approval and oversight of the guidelines should be modified in any way.

Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003

The Controlling the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM Act)24 requires that senders of unsolicited commercial email messages, or spam, label them as such and include the sender's physical address as well as instructions about how recipients of the message can opt out of future mailings. The use of false headers and deceptive subject lines is prohibited.
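The statutory requirements just described, labeling, a physical address, opt-out instructions, and truthful headers and subject lines, can be sketched as a simple rule check. This is a hypothetical illustration only, not a compliance tool or legal advice; the field names are assumptions, not anything defined by the statute:

```python
# Hypothetical sketch of the CAN-SPAM requirements described above.
# Field names are illustrative assumptions; this is not legal advice.

def can_spam_issues(message: dict) -> list[str]:
    """Return a list of apparent problems for an unsolicited
    commercial email represented as a simple dict."""
    issues = []
    if not message.get("labeled_as_ad"):
        issues.append("not labeled as an advertisement")
    if not message.get("physical_address"):
        issues.append("missing sender's physical postal address")
    if not message.get("opt_out_instructions"):
        issues.append("missing opt-out instructions for future mailings")
    if message.get("header_from") != message.get("actual_sender"):
        issues.append("false or misleading header information")
    if message.get("subject_is_deceptive"):
        issues.append("deceptive subject line")
    return issues

msg = {
    "labeled_as_ad": True,
    "header_from": "deals@example.com",
    "actual_sender": "deals@example.com",
    "subject_is_deceptive": False,
    "physical_address": "",  # omitted: flagged below
    "opt_out_instructions": "Reply STOP to unsubscribe.",
}
print(can_spam_issues(msg))  # → ["missing sender's physical postal address"]
```

The point of the sketch is only that each statutory duty is an independent condition; a message must satisfy all of them, so the checks accumulate rather than short-circuit.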
State Privacy Laws

While the most significant data privacy laws in the United States have been enacted at the federal level, there is also a growing body of state privacy laws concerning specific matters such as the use of spyware and the protection of consumer credit information, school records, and financial data. Other states have adopted more general privacy protection legislation. The following sections present an introduction to several key laws at the state level that are relevant to privacy protection.

Among all states, California has certainly been the most active, enacting numerous laws concerning the privacy of personal information. One of the most notable measures is California's Online Privacy Protection Act of 200325 (the Online Privacy Protection Act), which became effective on July 1, 2004. The law requires all operators of commercial websites or online services (Operators) that collect Personally Identifiable Information from California residents through websites or other similar online services (Web Sites) to post a privacy policy and to comply with the same.

23 Federal Trade Commission, FTC Seeks Comment on Children's Online Privacy Protections; Questions Whether Changes to Technology Warrant Changes to Agency Rule, accessed at: http://www.ftc.gov/opa/2010/03/coppa.shtm
24 15 U.S.C. § 7709.
The Online Privacy Protection Act adopts a broad view of "Personally Identifiable Information," defining it as any identifiable information about an individual collected online, including any of the following: (1) first and last name; (2) a home or other physical address; (3) an email address; (4) a telephone number; (5) a Social Security number; (6) any other identifier that permits the physical or online contacting of a specific individual; and (7) information concerning a user that the Operator collects online from the user and combines with any of the identifiers described above. The breadth of the definition of Personally Identifiable Information ensures that the vast majority of Operators collecting information online will fall under the requirements of the legislation.

In addition to requiring all Operators collecting personally identifiable information from California residents to post a privacy policy, the Online Privacy Protection Act sets forth specific requirements about the content of such a privacy policy. Specifically, it must:
• Identify the categories of information that the Operator collects through the Internet and the categories of persons or entities with which the Operator may share the information;
• Disclose whether or not the Operator maintains a process for an individual user of and/or visitor to the Operator's websites to review and request changes to his or her personally identifiable information and, if so, provide a description of such process;
• Disclose whether the Operator reserves the right to change its privacy policy without notice to the individual user of, or visitor to, the websites; and
• Identify the effective date of the privacy policy.
Quite significantly, the legislation also requires that privacy policies be posted conspicuously on the Operator's website, with very detailed requirements on what that entails.
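The four required content elements lend themselves to a simple checklist. The sketch below is purely illustrative (the dictionary keys are assumptions, not statutory terms) and shows only how an Operator might audit a policy document against the enumerated requirements:

```python
# Hypothetical checklist of the four content requirements summarized
# above. Keys and descriptions are illustrative assumptions.

REQUIRED_ELEMENTS = {
    "categories_collected_and_shared":
        "categories of information collected and of parties it may be shared with",
    "review_process_disclosed":
        "whether a process exists to review and request changes to one's information",
    "change_without_notice_disclosed":
        "whether the policy may change without notice to the user",
    "effective_date":
        "the policy's effective date",
}

def missing_policy_elements(policy: dict) -> list[str]:
    """Return descriptions of required elements absent from a policy."""
    return [desc for key, desc in REQUIRED_ELEMENTS.items()
            if not policy.get(key)]

policy = {
    "categories_collected_and_shared": "name, email; shared with ad partners",
    "review_process_disclosed": True,
    "change_without_notice_disclosed": True,
    "effective_date": "2004-07-01",
}
print(missing_policy_elements(policy))  # → []
```

An empty result means every enumerated element is present; a non-empty list identifies which disclosures are missing.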
The Online Privacy Protection Act provides that an Operator will be considered in violation of the act if (1) the Operator fails to post a privacy policy within thirty days of being notified that it is not in compliance with the requirements of the legislation, and/or (2) the Operator either knowingly and willfully, or negligently and materially, fails to comply with the provisions of its own privacy policy. The legislation is enforced through California's unfair competition law, which provides for civil fines and injunctive relief.

Another interesting law in California is the Personal Information and Privacy Protection Act.26 This legislation mandated the creation of an Office of Privacy Protection within the Department of Consumer Affairs to protect "the privacy of individuals' personal information in a manner consistent with the California Constitution by identifying consumer problems in the privacy area and facilitating development of fair information practices…." The Personal Information and Privacy Protection Act contains a number of other interesting provisions concerning privacy, including the requirement that all state governmental departments and agencies enact and maintain permanent privacy policies.

25 Online Privacy Protection Act of 2003, Cal. Bus. & Prof. Code §§ 22575–22579 (2004).
26 Personal Information and Privacy Protection Act, Cal. Bus. & Prof. Code §§ 350–352.
Data Security Overview

Data security is an increasing concern for individuals and businesses alike. After all, without sufficient data security, there can be no privacy. When individuals' private information is breached, they may suffer a number of negative consequences, including identity theft and embarrassment. When companies fail to properly secure the data in their possession, they may suffer legal liability, breach of contract claims, negative publicity, and other harms.

Data Breach Notification Laws

One of the most notable developments in this area has been the emergence of a duty to notify consumers of breaches involving their data. At the state level, much of the activity has focused upon addressing issues related to data security breaches. California was again a leader here, enacting its own comprehensive data security breach notification law in 2002. California's law, often referred to as S.B. 1386, applies to any online business having customers in California—even if the business itself is not based in California. Pursuant to S.B. 1386, which entered into force on July 1, 2003, all agencies, persons, or businesses that conduct business in California and that own or license computerized data containing personal information are required to report breaches in the security of such data to any resident of California whose unencrypted personal information has been compromised as a result of the breach. In order to trigger the law's notification requirements, a security breach must involve personal information, which is defined as an individual's first name or first initial and last name combined with one or more of the following pieces of data: (1) Social Security number; (2) driver's license number or California Identification Card number; or (3) account number, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual's financial account.
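The definitional structure, a name combined with at least one sensitive data element, can be illustrated with a short sketch. This is a hypothetical rendering of the definition for teaching purposes (field names are assumptions), not a determination of when notice is legally required:

```python
# Hypothetical sketch of S.B. 1386's "personal information" definition:
# a first name (or initial) and last name combined with at least one
# sensitive data element. Field names are illustrative assumptions.

SENSITIVE_ELEMENTS = (
    "ssn",                 # Social Security number
    "drivers_license",     # driver's license or CA Identification Card number
    "financial_account",   # account/card number plus required access credential
)

def is_personal_information(record: dict) -> bool:
    """True if the record pairs a name with any sensitive data element."""
    has_name = bool(record.get("first_name_or_initial")) and \
               bool(record.get("last_name"))
    has_element = any(record.get(key) for key in SENSITIVE_ELEMENTS)
    return has_name and has_element

print(is_personal_information(
    {"first_name_or_initial": "J", "last_name": "Doe", "ssn": "123-45-6789"}))  # → True
print(is_personal_information(
    {"first_name_or_initial": "J", "last_name": "Doe"}))  # → False
```

Note the conjunctive structure: a name alone, or a sensitive element alone, does not satisfy the definition; both halves must be present.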
Furthermore, the notification requirements will only be triggered in situations in which either the name or the additional data elements are not encrypted. When an agency, person, or business is processing such personal information and suffers a breach, it must notify the affected customers in “the most expedient time possible and without unreasonable delay.” Significantly, the law defines a breach of security broadly as an “unauthorized acquisition of computerized data that compromises the security, confidentiality, or integrity of personal information maintained by the agency, person or business.” Individuals or entities required to provide such notice may do so in writing or electronically. However, all electronic notices must be in compliance with the federal Electronic Signatures in Global and National Commerce Act of 2000.27 Notwithstanding the foregoing, in instances where (1) the cost of providing the requisite notice would exceed $250,000, (2) the number of people to be notified exceeds 500,000, or (3) there is no sufficient contact information available, the affected individual or entity may provide substitute notice, which would consist of providing all of the following: (1) email 27Electronic Signatures in Global and National Commerce Act, Pub. L. No. 106-229, 14 Stat. 464 (codified at 15 U.S.C. § 96 (2000)). Chapter 12: Privacy 379 Copyright 2010 Cengage Learning. All Rights Reserved. May not be copied, scanned, or duplicated, in whole or in part. Due to electronic rights, some third party content may be suppressed from the eBook and/or eChapter(s). Editorial review has deemed that any suppressed content does not materially affect the overall learning experience. Cengage Learning reserves the right to remove additional content at any time if subsequent rights restrictions require it. 
notice if email addresses are available; (2) website notice, provided there is a website that can be used to post such notice; and (3) notification to major statewide media.

After a series of high-profile data security breaches in early 2005, a number of other states rushed to follow suit, enacting breach notification measures mostly modeled closely after California's. Today, the vast majority of U.S. states have enacted their own data breach notification laws. Although similar, there are some important procedural differences in the laws, leading to compliance difficulties for companies that operate in more than one state and suffer a breach.

Laws Mandating Specific Security Requirements

Another emerging trend in state privacy law is statutes that mandate specific information security measures and encryption of personal data, including Social Security numbers (SSNs). Nevada is one notable state in this regard, with a 2008 law requiring that "[a] business in this State shall not transfer any personal information of a customer through an electronic transmission other than a facsimile to a person outside of the secure system of the business unless the business uses encryption to ensure the security of electronic transmission."28 In addition, as of January 1, 2010, a new measure requires "data collectors" (a broad term that can include businesses and governmental agencies) who do business in the state and accept payment cards to comply with the Payment Card Industry's security standard, known as PCI DSS. The new measure also requires data collectors who do not accept payment cards to use encryption when transferring sensitive personal information "outside of the secure system." Even more notable is Massachusetts, where the Office of Consumer Affairs and Business Regulation recently enacted regulations pertaining to identity theft and data security.
The regulations have broad coverage because they apply to all entities that "own[], license[], store[] or maintain[] personal information about a resident of the Commonwealth," not only those entities located or operating in the state.29 Additionally, the definition of "person" in the regulations is broad, including "a natural person, corporation, association, partnership or other legal entity, other than an agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof." The regulations require that "[e]very person that owns, licenses, stores or maintains personal information about a resident of the Commonwealth shall develop, implement, maintain and monitor a comprehensive, written information security program applicable to any records containing such personal information." The Massachusetts regulations also have provisions governing the encryption of data, including SSNs. The regulations require that all transmitted records and files that contain personal information be encrypted when transmitted wirelessly or over a public network. The regulations also require encryption of all personal information that is "stored on laptops or other portable devices." The Massachusetts measure is one of the broadest encryption laws passed thus far. The new Massachusetts privacy regulations do not contain an exemption for compliance with other regimes, such as compliance with the federal Gramm-Leach-Bliley Act30 or other federal statutes, instead requiring that all persons comply with the stringent requirements. Because the Massachusetts regulations have no exemption for persons who comply with Gramm-Leach-Bliley, even companies already in compliance may have to rework their privacy policies to comply with the Massachusetts regulations.

28 Nev. Rev. Stat. Ann. § 597.970 (West 2007).
29 201 Mass. Code Regs. 17.01–17.05 (2009).
30 Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801–6809.

Enforcing Privacy and Data Security Rights

Although enforcement actions involving data privacy violations are still relatively new, there have been a number of cases that serve to emphasize that such violations may come with considerable penalties. To illustrate this point, this section examines several representative cases and enforcement actions. In addition to complying with requirements set forth in applicable data privacy laws and regulations, entities also have to ensure that their data collection, use, and disclosure practices are conducted in accordance with their stated policies. There have been a number of cases that clearly demonstrate that companies are bound by the terms of their stated policies. Such cases are of particular importance when one considers the kinds of demands for information that governmental authorities have been making on private entities, including by requiring such companies to disclose customer data to the government, even where doing so would cause the companies to be in violation of their own policies. The FTC's power as an agency that enforces privacy promises emerged in 2000 and has been growing ever since. In July 2000 the FTC commenced an enforcement action against bankrupt online toy store Toysmart.com, LLC and Toysmart.com, Inc. (collectively, "Toysmart").
The FTC was alerted when, in conjunction with its dissolution, Toysmart attempted to sell personal data collected via the Internet, even though the privacy policy posted at the time that data was collected assured customers that the information would never be shared with third parties. Specifically, the privacy policy contained this provision: “[P]ersonal Information, voluntarily submitted by visitors to our site, such as name, address, billing information and shopping preferences, is never shared with a third party.” The policy continued: “[W]hen you register with toysmart.com, you can rest assured that your information will never be shared with a third party.” On May 22, 2000, Toysmart announced that it was closing its operations and selling its assets. Despite the assurances in Toysmart’s privacy policy, Toysmart offered personal data collected via its website as part of the assets it was selling. As a result of Toysmart’s actions, the FTC initiated an enforcement action against the company, charging that it had violated Section 5 of the FTC Act by misrepresenting to customers that personal data would never be shared with third parties and then disclosing, selling, and offering for sale that personal data in violation of the company’s stated privacy policy. This action eventually ended in a settlement, and Toysmart was prohibited from selling its customer list as a stand-alone asset. The settlement permitted Toysmart to sell such customer lists containing personal data only (1) as part of a package which included the entire website; (2) to an entity that was in a related market; and (3) to an entity that expressly agreed to be Toysmart’s successor-in-interest as to the personal data. 
Under the terms of the settlement, the buyer of Toysmart's assets would have to agree to abide by Toysmart's privacy policy and to obtain the affirmative consent (opt-in) of the data subjects prior to using their personal data in any manner that was inconsistent with Toysmart's original privacy policy. Toysmart's difficulties with the FTC clearly illustrate the hazards of posting a privacy policy that is not completely accurate. For Toysmart, as well as many other companies, personal data is a major asset. By drafting a privacy policy in a very restrictive manner, Toysmart effectively limited its business plan and was not able to use one of its primary assets as it had intended. When the company attempted to transfer the personal data it had collected in contravention of its privacy policy, the FTC prevented it from doing so. Thereafter, a series of FTC enforcement actions demonstrated that the FTC had begun to place particular emphasis on enforcing data security assurances made in online privacy policies. In early 2001, pharmaceutical giant Eli Lilly became the subject of an FTC enforcement action as a result of the security guarantees made in its online privacy policy. Eli Lilly manufactures a number of pharmaceutical products, including the antidepressant Prozac. In marketing Prozac, Lilly operates a Prozac website, through which it collects various personal data from visitors to the site.
From March 2000 to June 2001, Eli Lilly offered a service called “Medi-Messenger” through its Prozac website, which enabled registered users to receive individualized email reminders from Lilly concerning their Prozac medication or other matters. On June 27, 2001, Lilly sent a form email message to subscribers to the service. The message included, in the “To:” entry line, the email addresses of every individual subscriber. The FTC commenced an action against Eli Lilly, alleging that it made false or misleading representations in the privacy policy for the Medi-Messenger service. The privacy policy posted on the website at the time the information was collected stated that Eli Lilly employed measures and took steps appropriate under the circumstances to maintain and protect the privacy and confidentiality of personal data obtained from or about consumers through the Prozac site. The FTC alleged that Lilly had not employed such measures or taken such steps. Further, it contended that Lilly failed to provide appropriate training for its employees regarding consumer privacy and information security; failed to provide appropriate oversight and assistance for the employee who sent out the email, an individual who had no prior experience in creating, testing, or implementing the computer program used; and failed to implement appropriate checks and controls on the process, such as reviewing the computer program with experienced personnel and testing the program internally before broadcasting the email. Eli Lilly eventually settled the matter with the FTC and signed a consent order containing provisions intended to prevent the company from engaging in similar acts and practices in the future. The consent order applies broadly to the collection of personal data from or about consumers in connection with the advertising, marketing, offering for sale, or sale of any pharmaceutical, medical, or other health-related product or service by Eli Lilly. 
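The underlying failure in the Lilly incident is a familiar bulk-mail programming mistake: placing every subscriber's address in a single message's "To:" line instead of sending one message per recipient. A hedged sketch of the safer pattern follows; the SMTP host and addresses are placeholders, not details from the case:

```python
# Sketch of the safer bulk-reminder pattern: one message per recipient, so no
# subscriber ever sees another subscriber's address in the "To:" line. The
# SMTP host and addresses are placeholders, not details from the case.
import smtplib
from email.message import EmailMessage

def build_reminder(recipient: str) -> EmailMessage:
    msg = EmailMessage()
    msg["From"] = "reminders@example.com"   # placeholder sender
    msg["To"] = recipient                   # exactly one visible recipient
    msg["Subject"] = "Your reminder"
    msg.set_content("This is your individual reminder.")
    return msg

def send_reminders(subscribers, smtp_host="smtp.example.com"):
    # Each subscriber gets a separate message; addresses are never aggregated
    # into a single "To:" header.
    with smtplib.SMTP(smtp_host) as server:
        for addr in subscribers:
            server.send_message(build_reminder(addr))
```

Because each message is built and sent individually, no recipient's address ever appears in another recipient's copy, which is exactly the disclosure that occurred in the June 27, 2001 email.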
The consent order consists of six parts, but the most significant to the current discussion are Parts I and II. Part I prohibits misrepresentations regarding the extent to which Lilly maintains and protects the privacy or confidentiality of any personal data collected from or about consumers. Part II of the consent order requires Eli Lilly to implement a four-stage information security program to protect the confidentiality and security of consumers' personal data and to protect it against unauthorized access, use, or disclosure. The four stages require Lilly to: (1) designate appropriate personnel to coordinate and oversee the program; (2) identify foreseeable risks to the security, confidentiality, and integrity of personal data, and address these risks in each relevant area of its operations; (3) conduct an annual written review by qualified persons that monitors and documents compliance with the program, evaluates its effectiveness, and recommends changes to it; and (4) adjust the program in light of any findings and recommendations resulting from reviews or ongoing monitoring. The FTC has also brought similar enforcement actions against software giant Microsoft and clothing manufacturer Guess?. The Guess? action marked the third time that the FTC settled an enforcement action against a company that allegedly made false assurances regarding the level of security it provided to individuals' personal data. In this most recent case, the FTC alleged that the company failed to use reasonable or appropriate measures to protect consumers' personal data and thereby exposed such consumers' information to commonly known attacks by hackers, in contravention of Guess?'s assurances that the data collected through its website would be protected. Guess? has sold clothing and accessories through its website (www.guess.com) since 1998.
The FTC alleged that even though the site had been vulnerable to a number of commonly known web-based application attacks, Guess?'s online statements assured consumers that their information would be protected. Specifically, according to the FTC, at the time the Guess? website was attacked, it contained the following statements: "This site has security measures in place to protect the loss, misuse and alteration of information under our control" and "All of your personal information, including your credit card information and sign-in password, are stored in an unreadable, encrypted format at all times." The FTC alleged that, despite these assurances, Guess did not store consumers' information in an unreadable, encrypted format at all times and, in fact, the security measures implemented by Guess failed to protect against Structured Query Language (SQL) injection and other commonly known attacks. The settlement agreement prohibits Guess from misrepresenting the degree to which it protects the security of personal information collected from consumers. It also requires Guess to establish and maintain a comprehensive data security program. Furthermore, Guess is also required to have its security program certified annually by an independent security professional. The FTC has also entered into a consent agreement with Petco Animal Supplies, Inc.
(Petco) after alleging that Petco had engaged in deceptive trade practices by including various statements in its online privacy policy, including: "At PETCO.com, protecting your information is our number one priority, and your personal information is strictly shielded from unauthorized access. Entering your credit card number via our secure server is completely safe. The server encrypts all of your information; no one except you can access it." According to the FTC, these statements were a deceptive trade practice because Petco was unable to completely protect the data it received on its computer servers. The FTC settlement prohibits Petco from misrepresenting the extent to which it maintains and protects sensitive consumer information, and requires the company to establish and maintain a comprehensive information security program and arrange biennial audits of that program by an independent third party. The settlement also contains recordkeeping provisions to allow the FTC to monitor compliance. The FTC has also targeted Gateway Learning Corporation (Gateway), alleging that the company had engaged in unfair and deceptive trade practices by sharing customer information it had collected, after explicitly promising on its website not to do so. Initially the Gateway privacy policy stated: "We do not sell, rent or loan any personally identifiable information regarding our consumers with any third party unless we receive a customer's explicit consent." Gateway then decided to sell customer information collected on its website, and altered its privacy policy to state that "from time to time" Gateway would provide consumers' personal information to "reputable companies" whose products or services consumers might find of interest.
The FTC charged that: (1) Gateway's claims that it would not sell, rent, or loan to third parties consumers' personal information unless it received the consumers' consent, and that it would never share information about children, were false; (2) Gateway's retroactive application of a materially changed privacy policy to information it had previously collected from consumers was an unfair practice; and (3) Gateway's failure to notify consumers of the changes to its privacy policy and practices, as promised in the original policy, was a deceptive practice. The two parties settled, and the agreement bars misrepresentations about how Gateway will use data it collects from consumers and prohibits the company from sharing any personal information collected from consumers on its website under the earlier privacy policy, unless it first obtains "opt-in" consent from consumers. It also prohibits Gateway from applying future material changes to its privacy policy retroactively without consumers' consent and requires Gateway to give up the $4,608 it earned from renting consumers' information. In April 2005, the FTC settled charges against BJ's Wholesale Club (BJ's), after alleging that the company's computer network, used to obtain bank authorization for credit and debit card purchases and to track inventory, was not secure enough.
For credit and debit card purchases at its stores, BJ's collects information—such as name, card number, and expiration date—from the magnetic stripe on the back of the cards. The information is sent from the computer network in the store to BJ's central datacenter computer network and from there through outside computer networks to the bank that issued the card. The FTC charged that BJ's engaged in a number of practices that, taken together, did not provide reasonable security for sensitive customer information. These practices included: (a) BJ's failure to encrypt consumer information when it was transmitted or stored on computers in its stores; (b) BJ's creation of unnecessary risks to the information by storing it for up to 30 days; and (c) storing the information in files that could be accessed using commonly known default user IDs and passwords. The settlement requires BJ's to establish and maintain a comprehensive information security program that includes administrative, technical, and physical safeguards. The settlement also requires BJ's to obtain an audit from a qualified, independent, third-party professional attesting that its security program meets the standards of the order and to comply with standard bookkeeping and recordkeeping provisions.

Select Privacy Issues

Behavioral Advertising

Behavioral advertising, also known as behavioral targeting, uses technology to anonymously track and tabulate clicks in order to understand an individual consumer's online activities. Cookies are used to monitor and track web surfing habits, including the websites visited, the length of time spent on a given web page, and the content viewed. Although the consumer information collected may not seem to be personally identifiable, as it does not identify individuals by name or address, the practice can collect and aggregate extensive amounts of personal information.
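Mechanically, the tracking-and-tabulation step can be pictured as keying page-view events to a cookie identifier and tallying time-weighted interest categories. The following is only a toy sketch; the event fields and categories are invented for illustration, and real systems are far more elaborate:

```python
# Toy illustration of behavioral-profile aggregation: page-view events keyed
# to a cookie ID are tallied into interest categories, weighted by time spent.
# Event fields and categories are invented; real systems are far more elaborate.
from collections import Counter, defaultdict

def build_profiles(events):
    """events: iterable of (cookie_id, page_category, seconds_on_page)."""
    profiles = defaultdict(Counter)
    for cookie_id, category, seconds in events:
        # Weight each visit by time spent, a crude proxy for interest.
        profiles[cookie_id][category] += seconds
    return profiles

def top_interest(profiles, cookie_id):
    """The category this cookie's holder appears most interested in, if any."""
    counts = profiles.get(cookie_id)
    return counts.most_common(1)[0][0] if counts else None
```

A visitor who spends most of her time on sports pages would be served sports-related ads, even though the profile never records her name or address. This is why aggregated "anonymous" profiles can still be privacy-sensitive.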
The inventory of data collected by behavioral advertising is analyzed in order to predict a consumer's future behavior and to target future advertising to that consumer based on his or her web surfing history. Behavioral advertising is a widespread practice used by web publishers, Internet marketers, and service providers to increase the effectiveness of advertising campaigns by tracking consumer activities online and thereby serving up more targeted content to consumers. Behavioral advertising is an important component of many companies' marketing strategies but has come under fire due to privacy concerns. Consumers have expressed concerns about being tracked online and about how their personal information, especially sensitive information, is collected, used, and distributed on the Internet. The FTC and Congress have been closely monitoring this issue, and the FTC recently issued revised self-regulatory guidelines for online behavioral advertising. Privacy advocates, such as the Center for Digital Democracy, are not satisfied with the new rules and are pushing for online advertising legislation. In response to consumer privacy concerns raised by behavioral advertising, the FTC has spent the last decade investigating, studying, and enforcing privacy developments. As part of its efforts to protect online consumer privacy, the FTC has hosted several events that brought consumers together with consumer and privacy advocates, government representatives, and Internet companies, including a three-day public hearing titled "Protecting Consumers in the Next Tech-ade" in the fall of 2006,31 and a Town Hall Meeting titled "Ehavioral Advertising: Tracking, Targeting, and Technology" in the fall of 2007.32 These discussions culminated in a set of draft guidelines, "Behavioral Advertising: Moving the Discussion Forward to Possible Self-Regulatory Principles," which were issued by the FTC in December 2007.33 The proposed guidelines express the FTC's optimism that privacy concerns raised by behavioral advertising can be addressed and monitored by self-regulation.34 On February 12, 2009, the FTC issued revised guidelines, "Self-Regulatory Principles for Online Behavioral Advertising," and responded to the main issues raised by more than 60 comments it received.35 The primary focus of the revised guidelines was narrowed to cover behavioral advertising in which a website uses information collected from other sources (as opposed to behavioral advertising by and at a single website or contextual advertising). The four self-regulatory principles included in the report are as follows:

• Any website collecting data for the purpose of behavioral advertising must provide clear, consumer-friendly, and prominent notice of the practice and an easily accessible way for consumers to choose whether to have their information collected for such purpose;

• Any business that collects or stores data for behavioral advertising purposes should provide reasonable security for that data.
Such protections should be based on the sensitivity of the data, the nature of a company's business operations, the types of risks a company faces, and the reasonable protections available to a company;

• A company must keep any promises that it makes with respect to how it will handle or protect consumer data and must obtain express consent from consumers prior to using previously collected data in a materially different way than previously promised;

• Companies should collect sensitive data for behavioral advertising only after they obtain affirmative express consent from the consumer to receive such advertising.

The FTC is committed to protecting consumers' privacy and has put the industry on notice that it will be watching and evaluating the development and enforcement of self-regulatory programs. The FTC report and revised guidelines represent a warning to the online advertising industry that regulation or legislation will come if it fails to develop, implement, and enforce meaningful self-regulatory programs. Given the FTC's recent challenge to the advertising, marketing, and Internet industry, the interesting issue is whether the industry will do a better job of meaningful, rigorous self-regulation. The online advertising industry has recently taken steps to demonstrate to the FTC and Congress that it is serious about privacy. That said, the industry has voiced general concerns that restricting online advertising could negatively impact electronic commerce and the Internet. Here is a summary of recent efforts:

• Joint Industry Task Force (January 2009). Four major marketing and advertising industry associations announced their commitment to collaborate to develop enhanced cross-sector self-regulatory principles for online behavioral advertising to address privacy concerns.
This working group represents the first time the media and marketing industry have joined forces to develop an expansive self-regulatory effort for interactive advertising. The associations include the American Association of Advertising Agencies, the Association of National Advertisers, the Direct Marketing Association, and the Interactive Advertising Bureau.

• NAI Revised Privacy Principles (December 2008). The Network Advertising Initiative, a self-regulatory group for companies in the online marketplace, released an updated version of its NAI Principles self-regulatory standards, which govern its members.36 Members of the NAI include Google, Yahoo!, Tacoda, and Fox Audience Network. Under the new privacy principles:

• Members that target Internet users anonymously are required to notify users of these practices and allow them to opt out;

• Members that serve ads based on "sensitive" personal information (e.g., Social Security numbers, financial account information, medical data, and geographic location) must obtain users' express consent, even if targeting is anonymous;

• Members may not target children under 13 without verifiable parental consent; and

• Members may not retain data longer than necessary.

31 See http://www.ftc.gov/opa/2007/08/ehavioral.shtm.
32 Id.
33 See http://www.ftc.gov/opa/2007/12/principles.shtm.
34 Id.
35 See http://www.ftc.gov/os/2009/02/P085400behavadreport.pdf.
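The NAI notice-and-opt-out principle implies that tracking code consult a visitor's stated preference before recording anything. A toy sketch of that check follows; the cookie name and value are invented for illustration:

```python
# Toy sketch of honoring a behavioral-advertising opt-out: tracking code
# checks the visitor's stated preference before recording anything. The
# cookie name and value are invented for illustration.
OPT_OUT_COOKIE = "ad_optout"   # hypothetical opt-out cookie name

def should_track(cookies: dict) -> bool:
    """Track only visitors who have not set the opt-out cookie."""
    return cookies.get(OPT_OUT_COOKIE) != "1"

def record_view(profiles: dict, cookies: dict, cookie_id: str, category: str) -> None:
    if not should_track(cookies):
        return  # opted-out visitors are never profiled
    profiles.setdefault(cookie_id, []).append(category)
```

One practical weakness of this cookie-based design, often noted by privacy advocates, is that the opt-out itself lives in a cookie: a visitor who clears her cookies also clears her opt-out.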
In addition to recent developments in FTC regulation of behavioral advertising activity, three states have been investigating and addressing behavioral advertising on the Internet. New York, Connecticut, and Massachusetts have all recently drafted legislative proposals regarding behavioral advertising. Although none of these proposals has made it into law, it can be expected that the states will continue to focus on this issue. Given the nature of the Internet, the enactment of differing requirements at the state level would have profound implications for the development of the industry and would complicate compliance efforts. With so many different constituencies involved and the stakes so high, it is unlikely that the controversy surrounding behavioral advertising will be resolved any time soon. For the immediate future, the emphasis will be on self-regulation. However, one cannot be certain that this will continue to be the case, and evolutions in this area should be monitored closely, particularly within the next 12 to 18 months, as further important developments should be expected. Companies' online advertising strategies are also giving rise to privacy concerns and, in some cases, are resulting in legal actions. Consider Facebook's experiences with its Beacon offering, a part of Facebook's advertising system that was recently the subject of a large class action lawsuit.37 Beacon was a system through which Facebook's affiliated websites communicated information to each other about Facebook users. An illustration of the system would be a Facebook user purchasing movie tickets on Fandango, with Facebook receiving information on the purchase. When Facebook users learned of this system, there was a tremendous outcry, and a Facebook group of approximately 50,000 users soon formed, arguing that Beacon's services invaded their privacy. The class action lawsuit was later filed.
In 2009, soon after the lawsuit was filed, Facebook shut down its Beacon operations. In addition to shutting down the Beacon service, Facebook proposed to spend $9.5 million to set up a privacy foundation focused on online privacy issues as a settlement offer to the class action members. Most members were glad to accept the offer, mainly because of the termination of the Beacon service. However, several members, as well as Public Citizen, a Washington, D.C.-based consumer advocacy group, moved that the court reject the settlement offer. Their chief complaint was that Facebook would have control over drafting the mission and bylaws of the foundation as well as appointing board members. Facebook maintained that this claim stemmed from a misunderstanding and that the foundation would be a completely separate entity run by well-known and respected privacy advocates. Judge Seeborg of the United States District Court for the Northern District of California presided over the matter, and he ruled in favor of the settlement despite the negative feedback from several of the class members.

36 See http://www.networkadvertising.org/networks/principles_comments.asp.
37 PC World, Facebook Beacon Lawsuit Settled, http://www.pcworld.com/article/191936/facebook_beacon_lawsuit_settled.html (last visited August 25, 2010).
He was of the opinion that concerns about the foundation's potential ties to Facebook were unlikely to materialize, as it would not be in Facebook's business interests to create a privacy foundation that would harm it. He also maintained that class members were not entitled to direct compensation and, therefore, fully accepted Facebook's proposed settlement.

Privacy at Work

Workplace privacy is a critical issue for the business manager. Employees' inappropriate use of email and computer systems owned and maintained by employers continues to provide a fertile environment for privacy issues and litigation. From the employer's perspective, employee workplace privacy interests are often outweighed by the employer's vested economic interest in ensuring employee productivity and in protecting itself from potential liability for harassment, discrimination, obscenity, and defamation, as well as protecting its intellectual property and proprietary information. There have been a number of cases that have explored the scope of employees' privacy rights in the workplace. Generally, these cases have established that employees should have little expectation of privacy in the workplace. Recently, however, there have been some indications that this well-established principle may not be bullet-proof and that, in some cases, employees may have certain privacy rights. Consider the following recent case.

STENGART v. LOVING CARE AGENCY
201 N.J. 300, 990 A.2d 650 (2010)

FACTS

[Marina Stengart worked for Loving Care Agency, Inc. (Loving Care), a home healthcare agency, as an Executive Director of Nursing. The company provided her with a laptop computer from which she could access her own personal email as well as the company's server to access company emails.]

[At the end of 2007, Stengart accessed her own personal Yahoo email account on the company's laptop to communicate with her attorney concerning her situation at work.
She did not save her email user ID or password on the company laptop and did not save any private information on the computer. The company's browser software, however, automatically saved a copy of each web page visited in a temporary cache folder. She was not aware of this.] [Soon after, Stengart left Loving Care and returned the laptop computer. A few months later, she filed a lawsuit with claims of discrimination, harassment, and retaliation. After the lawsuit was filed, Loving Care hired a company to retrieve all files from the laptop's hard drive. The files that were retrieved included email communications that Stengart had with her attorney through her personal email account.] [Loving Care's attorneys used the email communications in the lawsuit. Stengart's attorneys argued that the email communications were private. Loving Care's attorneys countered that there was no reasonable expectation of privacy based on the company's policy, which stated:]

Loving Care may review, access, and disclose all matters on the company's media systems and services at any time. E-mails, Internet communications and computer files are the company's business records and are not to be considered private or personal to any individual employee. Occasional personal use of the computer is permitted.

[Stengart's attorneys requested that the judge disqualify Loving Care's attorneys and exclude the emails from evidence.
The judge denied the request, holding that Stengart waived the attorney-client privilege. Stengart appealed the denial of the request. The appellate court reversed and remanded the decision to the trial court. Loving Care appealed.]

JUDICIAL OPINION (JUDGE RABNER)
Loving Care argues that its employees have no expectation of privacy in their use of company computers based on the company's Policy. In its briefs before this Court, the company also asserts that by accessing e-mails on a personal account through Loving Care's computer and server, Stengart either prevented any attorney-client privilege from attaching or waived the privilege by voluntarily subjecting her e-mails to company scrutiny. Finally, Loving Care maintains that its counsel did not violate RPC 4.4(b) because the e-mails were left behind on Stengart's company computer—not "inadvertently sent," as per the Rule—and the Firm acted in the good faith belief that any privilege had been waived. Stengart argues that she intended the e-mails with her lawyer to be confidential and that the Policy, even if it applied to her, failed to provide adequate warning that Loving Care would save on a hard drive, or monitor the contents of, e-mails sent from a personal account. Stengart also maintains that the communications with her lawyer were privileged. When the Firm encountered the arguably protected e-mails, Stengart contends it should have immediately returned them or sought judicial review as to whether the attorney-client privilege applied.

A
We start by examining the meaning and scope of the Policy itself. The Policy specifically reserves to Loving Care the right to review and access "all matters on the company's media systems and services at any time." In addition, e-mail messages are plainly "considered part of the company's business … records." It is not clear from that language whether the use of personal, password-protected, web-based e-mail accounts via company equipment is covered.
The Policy uses general language to refer to its “media systems and services” but does not define those terms. Elsewhere, the Policy prohibits certain uses of “the e-mail system,” which appears to be a reference to company e-mail accounts. The Policy does not address personal accounts at all. In other words, employees do not have express notice that messages sent or received on a personal, web-based e-mail account are subject to monitoring if company equipment is used to access the account. The Policy also does not warn employees that the contents of such e-mails are stored on a hard drive and can be forensically retrieved and read by Loving Care. The Policy goes on to declare that e-mails “are not to be considered private or personal to any individual employee.” In the very next point, the Policy acknowledges that “[o]ccasional personal use [of e-mail] is permitted.” As written, the Policy creates ambiguity about whether personal e-mail use is company or private property. The scope of the written Policy, therefore, is not entirely clear. B The policies underlying the attorney-client privilege further animate this discussion. The venerable privilege is enshrined in history and practice…. Its primary rationale is to encourage “free and full disclosure of information from the client to the attorney.” That, in turn, benefits the public, which “is well served by sound legal counsel” based on full, candid, and confidential exchanges. “For a communication to be privileged it must initially be expressed by an individual in his capacity as a client in conjunction with seeking or receiving legal advice from the attorney in his capacity as such, with the expectation that its content remain confidential.” E-mail exchanges are covered by the privilege like any other form of communication. The e-mail communications between Stengart and her lawyers contain a standard warning that their contents are personal and confidential and may constitute attorney-client communications. 
The subject matter of those messages appears to relate to Stengart's working conditions and anticipated lawsuit against Loving Care.

IV
Under the particular circumstances presented, how should a court evaluate whether Stengart had a reasonable expectation of privacy in the e-mails she exchanged with her attorney?

A
Preliminarily, we note that the reasonable-expectation-of-privacy standard used by the parties derives from the common law and the Search and Seizure Clauses of both the Fourth Amendment and Article I, paragraph 7 of the New Jersey Constitution. The latter sources do not apply in this case, which involves conduct by private parties only. The common law source is the tort of "intrusion on seclusion," which … provides that "[o]ne who intentionally intrudes, physically or otherwise, upon the solitude or seclusion of another or his private affairs or concerns, is subject to liability to the other for invasion of his privacy, if the intrusion would be highly offensive to a reasonable person." A high threshold must be cleared to assert a cause of action based on that tort. A plaintiff must establish that the intrusion "would be highly offensive to the ordinary reasonable man, as the result of conduct to which the reasonable man would strongly object." As is true in Fourth Amendment cases, the reasonableness of a claim for intrusion on seclusion has both a subjective and objective component.
B A number of courts have tested an employee’s claim of privacy in files stored on company computers by evaluating the reasonableness of the employee’s expectation. No reported decisions in New Jersey offer direct guidance for the facts of this case. In one matter, the Appellate Division found that the defendant had no reasonable expectation of privacy in personal information he stored on a workplace computer under a separate password. The defendant had been advised that all computers were company property. His former employer consented to a search by the State Police, who, in turn, retrieved information tied to the theft of company funds. The court reviewed the search in the context of the Fourth Amendment and found no basis for the defendant’s privacy claim in the contents of a company computer that he used to commit a crime. Certain decisions from outside New Jersey, which the parties also rely on, are more instructive. Among them, National Economic Research Associates v. Evans, 21 Mass. L. Rptr. No. 15, at 337, 2006 WL 2440008 (Mass.Super.Ct. Aug. 3, 2006), is most analogous to the facts here. In Evans, an employee used a company laptop to send and receive attorney-client communications by e-mail. In doing so, he used his personal, password-protected Yahoo account and not the company’s e-mail address. The e-mails were automatically stored in a temporary Internet file on the computer’s hard drive and were later retrieved by a computer forensic expert. The expert recovered various attorney-client e-mails; at the instruction of the company’s lawyer, those e-mails were not reviewed pending guidance from the court. A company manual governed the laptop’s use. 
The manual permitted personal use of e-mail, to "be kept to a minimum," but warned that computer resources were the "property of the Company" and that e-mails were "not confidential" and could be read "during routine checks." The court denied the company's application to allow disclosure of the e-mails that its expert possessed. The court reasoned:

Based on the warnings furnished in the Manual, Evans could not reasonably expect to communicate in confidence with his private attorney if Evans e-mailed his attorney using his [company] e-mail address through the NERA Intranet, because the Manual plainly warned Evans that e-mails on the network could be read by NERA network administrators. The Manual, however, did not expressly declare that it would monitor the content of Internet communications…. Most importantly, the Manual did not expressly declare, or even implicitly suggest, that NERA would monitor the content of e-mail communications made from an employee's personal e-mail account via the Internet whenever those communications were viewed on a NERA-issued computer. Nor did NERA warn its employees that the content of such Internet e-mail communications is stored on the hard disk of a NERA-issued computer and therefore capable of being read by NERA.

As a result, the court found the employee's expectation of privacy in e-mails with his attorney to be reasonable. According to some courts, employees appear to have a lesser expectation of privacy when they communicate with an attorney using a company e-mail system as compared to a personal, web-based account like the one used here…. As a result, courts might treat e-mails transmitted via an employer's e-mail account differently than they would web-based e-mails sent on the same company computer. Courts have also found that the existence of a clear company policy banning personal e-mails can diminish the reasonableness of an employee's claim to privacy in e-mail messages with his or her attorney….
We recognize that a zero-tolerance policy can be unworkable and unwelcome in today's dynamic and mobile workforce and do not seek to encourage that approach in any way. The location of the company's computer may also be a relevant consideration. We realize that different concerns are implicated in cases that address the reasonableness of a privacy claim under the Fourth Amendment. This case, however, involves no governmental action. Stengart's relationship with her private employer does not raise the specter of any government official unreasonably invading her rights.

V
A
Applying the above considerations to the facts before us, we find that Stengart had a reasonable expectation of privacy in the e-mails she exchanged with her attorney on Loving Care's laptop. Stengart plainly took steps to protect the privacy of those e-mails and shield them from her employer. She used a personal, password-protected e-mail account instead of her company e-mail address and did not save the account's password on her computer. In other words, she had a subjective expectation of privacy in messages to and from her lawyer discussing the subject of a future lawsuit. In light of the language of the Policy and the attorney-client nature of the communications, her expectation of privacy was also objectively reasonable. As noted earlier, the Policy does not address the use of personal, web-based e-mail accounts accessed through company equipment. It does not address personal accounts at all.
Nor does it warn employees that the contents of e-mails sent via personal accounts can be forensically retrieved and read by the company. Indeed, in acknowledging that occasional personal use of e-mail is permitted, the Policy created doubt about whether those e-mails are company or private property. Moreover, the e-mails are not illegal or inappropriate material stored on Loving Care's equipment, which might harm the company in some way. They are conversations between a lawyer and client about confidential legal matters, which are historically cloaked in privacy. Our system strives to keep private the very type of conversations that took place here in order to foster probing and honest exchanges. In addition, the e-mails bear a standard hallmark of attorney-client messages. They warn the reader directly that the e-mails are personal, confidential, and may be attorney-client communications. While a pro forma warning at the end of an e-mail might not, on its own, protect a communication, other facts present here raise additional privacy concerns. Under all of the circumstances, we find that Stengart could reasonably expect that e-mails she exchanged with her attorney on her personal, password-protected, web-based e-mail account, accessed on a company laptop, would remain private. It follows that the attorney-client privilege protects those e-mails. In reaching that conclusion, we necessarily reject Loving Care's claim that the attorney-client privilege either did not attach or was waived. In its reply brief and at oral argument, Loving Care argued that the manner in which the e-mails were sent prevented the privilege from attaching. Specifically, Loving Care contends that Stengart effectively brought a third person into the conversation from the start—watching over her shoulder—and thereby forfeited any claim to confidentiality in her communications. We disagree.
Stengart has the right to prevent disclosures by third persons who learn of her communications "in a manner not reasonably to be anticipated." That is what occurred here. The Policy did not give Stengart, or a reasonable person in her position, cause to anticipate that Loving Care would be peering over her shoulder as she opened e-mails from her lawyer on her personal, password-protected Yahoo account. The language of the Policy, the method of transmittal that Stengart selected, and the warning on the e-mails themselves all support that conclusion. Loving Care also argued in earlier submissions that Stengart waived the attorney-client privilege. For similar reasons, we again disagree. A person waives the privilege if she, "without coercion and with knowledge of [her] right or privilege, made disclosure of any part of the privileged matter or consented to such a disclosure made by anyone." Because consent is not applicable here, we look to whether Stengart either knowingly disclosed the information contained in the e-mails or failed to "take reasonable steps to insure and maintain their confidentiality." As discussed previously, Stengart took reasonable steps to keep discussions with her attorney confidential: she elected not to use the company e-mail system and relied on a personal, password-protected, web-based account instead. She also did not save the password on her laptop or share it in some other way with Loving Care. As to whether Stengart knowingly disclosed the e-mails, she certified that she is unsophisticated in
the use of computers and did not know that Loving Care could read communications sent on her Yahoo account. Use of a company laptop alone does not establish that knowledge. Nor does the Policy fill in that gap. Under the circumstances, we do not find either a knowing or reckless waiver.

B
Our conclusion that Stengart had an expectation of privacy in e-mails with her lawyer does not mean that employers cannot monitor or regulate the use of workplace computers. Companies can adopt lawful policies relating to computer use to protect the assets, reputation, and productivity of a business and to ensure compliance with legitimate corporate policies. And employers can enforce such policies. They may discipline employees and, when appropriate, terminate them, for violating proper workplace rules that are not inconsistent with a clear mandate of public policy. For example, an employee who spends long stretches of the workday getting personal, confidential legal advice from a private lawyer may be disciplined for violating a policy permitting only occasional personal use of the Internet. But employers have no need or basis to read the specific contents of personal, privileged, attorney-client communications in order to enforce corporate policy. Because of the important public policy concerns underlying the attorney-client privilege, even a more clearly written company manual—that is, a policy that banned all personal computer use and provided unambiguous notice that an employer could retrieve and read an employee's attorney-client communications, if accessed on a personal, password-protected e-mail account using the company's computer system—would not be enforceable.

VI
We next examine whether the Firm's review and use of the privileged e-mails violated RPC 4.4(b).
The Rule provides that "[a] lawyer who receives a document and has reasonable cause to believe that the document was inadvertently sent shall not read the document or, if he or she has begun to do so, shall stop reading the document, promptly notify the sender, and return the document to the sender." According to the ABA Model Rules on which RPC 4.4(b) is patterned, the term "'document' includes e-mail or other electronic modes of transmission subject to being read or put into readable form." Loving Care contends that the Rule does not apply because Stengart left the e-mails behind on her laptop and did not send them inadvertently. In actuality, the Firm retained a computer forensic expert to retrieve e-mails that were automatically saved on the laptop's hard drive in a "cache" folder of temporary Internet files. Without Stengart's knowledge, browser software made copies of each webpage she viewed. Under those circumstances, it is difficult to think of the e-mails as items that were simply left behind. We find that the Firm's review of privileged e-mails between Stengart and her lawyer, and use of the contents of at least one e-mail in responding to interrogatories, fell within the ambit of RPC 4.4(b) and violated that rule. To be clear, the Firm did not hack into plaintiff's personal account or maliciously seek out attorney-client documents in a clandestine way. Nor did it rummage through an employee's personal files out of idle curiosity. Instead, it legitimately attempted to preserve evidence to defend a civil lawsuit. Its error was in not setting aside the arguably privileged messages once it realized they were attorney-client communications, and failing either to notify its adversary or seek court permission before reading further. There is nothing in the record before us to suggest any bad faith on the Firm's part in reading the Policy as it did. Nonetheless, the Firm should have promptly notified opposing counsel when it discovered the nature of the e-mails.
The Appellate Division remanded to the trial court to determine the appropriate remedy. It explained that a hearing was needed in that regard to consider

the content of the emails, whether the information contained in the emails would have inevitably been divulged in discovery that would have occurred absent [the Firm's] knowledge of the emails' content, and the nature of the issues that have been or may in the future be pled in either this or the related Chancery action.

We agree. The forensically retrieved version of the e-mails submitted to the Court is not easy to read or fully understand in isolation, and no record has yet been developed about the e-mails' full use. For the same reason, we cannot determine how confidential or critical the messages are. In deciding what sanctions to impose, the trial court should evaluate the seriousness of the breach in light of the specific nature of the e-mails, the manner in which they were identified, reviewed, disseminated, and used, and other considerations noted by the Appellate Division. As to plaintiff's request for disqualification, the court should also "balance competing interests, weighing the 'need to maintain the highest standards of the profession' against 'a client's right freely to choose his counsel.'" We leave to the trial court to decide whether disqualification of the Firm, screening of attorneys, the imposition of costs, or some other remedy is appropriate.
Under the circumstances, we do not believe a remand to the Chancery judge is required; the matter may proceed before the Law Division judge assigned to the case.

VII
For the reasons set forth above, we modify and affirm the judgment of the Appellate Division and remand to the trial court for further proceedings.

CASE QUESTIONS
1. The trial court found that Stengart did not have a legitimate expectation of privacy when sending personal emails through the company's laptop. The appellate court and the Supreme Court disagreed. Discuss the reasoning that the higher courts had in reversing the trial court.
2. What effect would this decision have on a company when thinking about forming company policy regarding personal communications?
3. Ethical Consideration: Did the Supreme Court come to the right conclusion from an ethical and moral point of view? What if the communications really did show that Loving Care was not guilty of discrimination and retaliation?

The decision in this case has important implications for privacy in the workplace. Companies should take this case into account when drafting their email and Internet use policies.

Privacy in Satellite and Aerial Photograph Images

Advances in technology are also leading to new privacy concerns in the area of aerial images. Using satellites and other technology, parties are able to take aerial images without the knowledge and consent of those on the ground. "Online services like Google and Bing give users very detailed images of practically any location on the planet."38 By and large, these services are making their images available online, where they can be accessed by a wide range of public and private entities. This raises a number of privacy concerns. For example, some municipalities are using the images to enforce laws. Such uses, if left unchecked, may raise serious Fourth Amendment concerns.
As demonstrated in the following case, the first case to challenge Google's Street View offering on privacy grounds, the use and display of such images by private entities is also giving rise to concerns.

38Frank Eltman, "Smile! Aerial images being used to enforce laws," Associated Press, August 14, 2010, accessed at: http://www.google.com/hostednews/ap/article/ALeqM5iCbUn8MLjuZctBtGu8eCrgYXDPAgD9HJC3B00

BORING v. GOOGLE, INC.
598 F. Supp. 2d 695 (2009)

FACTS
[Plaintiffs in this case, Mr. and Mrs. Boring (the Borings), filed suit against Google, Inc. (Google) for invasion of privacy after Google used images of their house on the maps portion of its website. Google Street View is used by Internet viewers who wish to see the actual street on the Internet as opposed to a regular map view. Google provides this service by having drivers travel through the streets with 360-degree cameras that continuously film the streets; the footage is then posted on the Internet. The Borings allege that their street is clearly marked as private property and that it was an invasion of privacy for Google to enter their property and film it. The Borings did not, however, take action to remove the pictures from the Internet, a service that is easily accessed through Google's website.]

JUDICIAL OPINION (JUDGE HAY)
The action for invasion of privacy embraces four analytically distinct torts: (1) intrusion upon seclusion; (2) publicity given to private life; (3) appropriation of
name or likeness; and (4) publicity placing a person in a false light. Borse v. Piece Goods Shop, Inc., 963 F.2d 611, 621 n. 9 (3d Cir.1992). The Borings do not identify the tort or torts underlying their invasion of privacy claim. Appropriation of name or likeness and false light publicity clearly do not apply. Since the remaining torts have an arguable relationship to the facts alleged, the Court will discuss each. [The court first examined the tort of intrusion upon seclusion.] This tort is established where a plaintiff is able to show: (1) physical intrusion into a place where he has secluded himself; (2) use of the defendant's senses to oversee or overhear the plaintiff's private affairs; or (3) some other form of investigation into or examination of the plaintiff's private concerns. Id. at 621. "Liability attaches only when the intrusion is substantial and would be highly offensive to 'the ordinary reasonable person.'" Id. (quoting Harris by Harris v. Easton Publ'g Co., 335 Pa.Super. 141, 483 A.2d 1377, 1383-84 (1984)). See also Restatement (Second) of Torts § 652B (same). In order to show that an intrusion was highly offensive, the plaintiff must allege facts sufficient to establish that the intrusion could be expected to cause "mental suffering, shame, or humiliation to a person of ordinary sensibilities." Pro Golf Mfg., Inc. v. Tribune Review Newspaper Co., 570 Pa. 242, 809 A.2d 243, 247 (2002) (quoting McGuire v. Shubert, 722 A.2d 1087 (Pa.Super.1998)). This is a stringent standard. Wolfson v. Lewis, 924 F.Supp. 1413, 1420 (E.D.Pa.1996). While it is easy to imagine that many whose property appears on Google's virtual maps resent the privacy implications, it is hard to believe that any—other than the most exquisitely sensitive—would suffer shame or humiliation. The Plaintiffs have not alleged facts to convince the Court otherwise.
Although the Plaintiffs have alleged intrusion that was substantial and highly offensive to them and have asserted that others would have a similar reaction, they have failed to set out facts to substantiate this claim. This is especially true given the attention that the Borings have drawn to themselves and the Street View images of their property. The Borings do not dispute that they have allowed the relevant images to remain on Google Street View, despite the availability of a procedure for having them removed from view. Furthermore, they have failed to bar others' access to the images by eliminating their address from the pleadings, or by filing this action under seal. "Googling" the name of the Borings' attorney demonstrates that publicity regarding this suit has perpetuated dissemination of the Borings' names and location, and resulted in frequent re-publication of the Street View images. The Plaintiffs' failure to take readily available steps to protect their own privacy and mitigate their alleged pain suggests to the Court that the intrusion and their suffering were less severe than they contend. [The court then examined the Borings' claim that Google had committed the tort of publicity given to private life.] The Amended Complaint, insofar as it purports to state a claim for publicity given to the Borings' private life, is similarly flawed. Under Pennsylvania law, this claim comprises four elements: (1) publicity; given to (2) private facts; (3) which would be highly offensive to a reasonable person; and (4) are not of legitimate public concern. See Harris by Harris, 483 A.2d at 1384. Because the Plaintiffs have not alleged facts sufficient to establish the third element of this tort, the Court need not address the other requirements.
As the Court has already discussed, the Amended Complaint is devoid of facts sufficient to indicate that the photographs of the Borings' property revealed private facts such that a reasonable person would be highly offended. The Plaintiffs do not allege that their situation is unique or even unusual. Yet, it does not appear that the viability of Street View has been compromised by requests that images be removed, nor does a search of relevant legal terms show that courts are inundated with—or even frequently consider—privacy claims based on virtual mapping. Furthermore, as was true with the intrusion upon seclusion claim, the Plaintiffs have done little to limit—and seem to have heightened intentionally—public interest in and access to the allegedly private information. (Footnotes omitted). [Next, the court examined the plaintiffs' negligence claim.] In order to state a claim based on negligence, a plaintiff must allege facts sufficient to show: (1) a duty of care; (2) breach of the duty; (3) actual loss or damage; and (4) a causal connection between the breach of duty and the resulting injury. Farabaugh v. Pa. Turnpike Com'n, 590 Pa. 46, 911 A.2d 1264, 1272-73 (2006) (citing R.W. v. Manzek, 585 Pa. 335, 888 A.2d 740, 746 (2005)). The Borings' negligence claims are set out in the Amended Complaint as follows:

Defendant has a duty of care to the public to utilize proper internal controls to avoid trespassing on private property. Additionally, Defendant has a duty to utilize proper methods and controls to avoid publishing data over Street View, irrespective of how the date is
Cengage Learning reserves the right to remove additional content at any time if subsequent rights restrictions require it. [sic] captured, for the whole world to see without some advance method of filtering. Defendant breached its duty by its aforesaid actions. Plaintiffs have been injured, and such breach was the proximate cause of Plaintiffs’ injuries. These allegations are insufficient to state a viable claim. Simply stating that there is or ought to be a duty is not enough; the duty alleged must be one recognized by the law…. (Internal citations omitted; footnotes omitted). [Thereafter, the court analyzed the claim of trespass.] … The Borings have not alleged facts sufficient to establish that they suffered any damages caused by the alleged trespass. They do not describe damage to or interference with their possessory rights. Instead, they claim, without factual support, that mental suffering and a diminution in property value were caused by Google’s publication of a map containing images of their home. While, arguendo, trespass was the “but for” cause of their alleged harm, it was not the proximate cause required to establish indirect and consequential damages. (footnotes omitted) The Court need not consider whether the Borings have alleged facts sufficient to support a claim for nominal damages, because the Amended Complaint does not contain a nominal damages claim. [Next, the court considered the Borings’ claim for unjust enrichment.] In order to establish a claim for unjust enrichment, a plaintiff must allege facts showing that: (1) he conferred a benefit upon the defendant; (2) the defendant appreciated the benefit; and (3) the defendant accepted and retained the benefit under circumstances making it inequitable for defendant to retain the benefit without compensating the plaintiff for its value. Lackner v. Glosser, 892 A.2d 21, 34 (Pa.Super.2006). 
The doctrine of unjust enrichment is “typically invoked … when plaintiff seeks to recover from defendant for a benefit conferred under an unconsummated or void contract.” Steamfitters Local Union No. 420 Welfare Fund v. Philip Morris, Inc., 171 F.3d 912, 936 (3d Cir.1999) (citing Zvonik v. Zvonik, 291 Pa.Super. 309, 435 A.2d 1236, 1239-40 (1981)). In this event, the law implies a quasi-contract, requiring that the defendant compensate the plaintiff for the value of the benefit conferred. In other words, the defendant makes restitution to the plaintiff in quantum meruit. See Hershey Foods Corp. v. Ralph Chapek, Inc., 828 F.2d 989, 998 (3d Cir.1987); AmeriPro Search, Inc. v. Fleming Steel Co., 787 A.2d 988, 991 (Pa.Super.2001) (citations omitted). In this case, there was no relationship between the parties that could be construed as contractual. It cannot fairly be said that the Borings conferred anything of value upon Google. The entire thrust of the Borings’ allegations is that Google took something from the Borings without their consent, and should be held liable for having done so. There is, therefore, no basis for applying a quasi-contractual remedy.

The Borings argue that unjust enrichment is not an exclusively quasi-contractual remedy, but may stand alone as an independent tort. The Court of Appeals for the Third Circuit addressed this issue in Steamfitters, writing: “In the tort setting, an unjust enrichment claim is essentially another way of stating a traditional tort claim (i.e., if defendant is permitted to keep the benefit of his tortious conduct, he will be unjustly enriched.)” Id. 171 F.3d at 936….

… The Plaintiffs have failed to plead—much less set out facts supporting—a plausible claim of entitlement to injunctive relief …

CASE QUESTIONS

1. Ethical Consideration: With advances in technology, individuals are able to capture images that just a few short years ago would have been very difficult to capture. What do you think about the propriety of this?
Should companies like Google, and even private photographers, be able to capture images from the air without the knowledge and consent of the individuals they are photographing?

2. The court held in this case that the Borings were not able to overcome the hurdles of showing the offensive nature of the alleged invasion of privacy committed by Google. The fact that the Borings did not even try to remove their house from Google Maps was indicative of the fact that they did not really find it that offensive. In addition, the court found that a reasonable person would not find such a thing particularly offensive and they therefore did not satisfy the burden of proof for a claim of invasion of privacy. Do you agree with the court’s conclusions on these points? Why or why not?

3. What are the elements that must be demonstrated in order to establish a claim for “intrusion upon seclusion?”

Google Street View (GSV) also allows users of Google’s location-based services, such as Google Maps and Google Earth, to view panoramic street-level photographs of different locations from across the world. GSV works because cameras atop GSV cars capture the images as the cars roam through different cities. WiFi antennas and software in the cars connect to local WiFi networks in order to pinpoint the cars’ locations and provide accurate information for Google’s location-based services.
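The WiFi-based positioning just described can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Google's actual method: the MAC addresses and coordinates below are invented, and a real system would, among other things, weight access points by signal strength rather than take a plain centroid of known locations.

```python
from statistics import mean

# Hypothetical database mapping access-point MAC addresses (BSSIDs) to
# known coordinates, of the kind a mapping car builds up by logging which
# networks are visible at each point. All entries here are invented.
ap_locations = {
    "00:11:22:33:44:55": (40.4406, -79.9959),
    "66:77:88:99:aa:bb": (40.4410, -79.9950),
    "cc:dd:ee:ff:00:11": (40.4402, -79.9962),
}

def estimate_position(visible_bssids):
    """Estimate a device's position as the centroid of the known
    locations of the access points it can currently see."""
    known = [ap_locations[b] for b in visible_bssids if b in ap_locations]
    if not known:
        return None  # no recognized access points in range
    lats, lons = zip(*known)
    return (mean(lats), mean(lons))

# A device that sees two known access points is placed between them.
pos = estimate_position(["00:11:22:33:44:55", "66:77:88:99:aa:bb"])
```

The point of the sketch is that no GPS is involved: merely observing which networks are broadcasting nearby, against a previously collected database, is enough to locate a device.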
Google’s actions of connecting to local WiFi networks without permission, however, raise significant privacy concerns. WiFi networks allow different devices to communicate with one another. To establish communications, the WiFi network broadcasts a radio signal with its “service set identifier” (SSID) and its own unique hardware identifier, known as a “media access control” address (MAC address). WiFi-enabled devices pick up on the broadcasted information, establishing network connections. These connections can either be open (i.e., not password protected and thus accessible to any device), or closed (i.e., password protected and accessible only to authorized devices). As GSV cars roam the streets, their WiFi antennas detect WiFi network data, including SSIDs and MAC addresses, which are then relayed to Google-developed software that processes the data for storage.

The privacy controversy arises from computer code in the Google-developed software. The code, which Google contends was written by a lone, rogue engineer, collects samples of WiFi network data known as “payload data.” Payload data can include confidential information, such as passwords, credit card numbers, Social Security numbers, and email addresses. Initially, Google stated that its software neither collected nor stored payload data. However, following an audit by an international data protection agency, Google admitted that it “mistakenly” collected samples of payload data from open WiFi networks.39 Google explained that the code in question was first written for an experimental WiFi project in 2006, but when a new project was launched, engineers incorporated the old code into the new project without realizing that it collected payload data. Consequently, Google accumulated approximately 600 gigabytes of payload data transmitted over open WiFi networks across the world.
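The open/closed distinction above is what made the payload collection possible: on an open network, frames travel unencrypted, so a passive scanner that retains more than the network identifiers ends up holding readable traffic. The following Python sketch is purely illustrative, with an invented frame structure and invented data; it is not Google's code and not the real 802.11 frame format. It models how a scanner that legitimately records SSIDs and MAC addresses could, with one errant branch, also retain payload from unencrypted networks.

```python
from dataclasses import dataclass

# Simplified stand-in for a captured 802.11 frame. Real frames carry far
# more structure; an SSID, a BSSID (the access point's MAC address), an
# encryption flag, and a payload are enough to illustrate the issue.
@dataclass
class Frame:
    ssid: str
    bssid: str
    encrypted: bool       # False => "open" network: payload readable by anyone
    payload: bytes = b""

def scan_identifiers(frames):
    """Record only network identifiers (SSID and MAC address) -- the data
    needed for WiFi-based location services."""
    return {(f.ssid, f.bssid) for f in frames}

def scan_with_rogue_code(frames):
    """The same scan, plus the problematic branch: retaining samples of
    payload data from open (unencrypted) networks."""
    identifiers = scan_identifiers(frames)
    payload_samples = [f.payload for f in frames
                       if not f.encrypted and f.payload]
    return identifiers, payload_samples

frames = [
    # Closed network: the payload is ciphertext, useless without the key.
    Frame("HomeWiFi", "00:11:22:33:44:55", encrypted=True,
          payload=b"\x8f\x1a\x3c"),
    # Open network: the payload travels in the clear.
    Frame("CoffeeShop", "66:77:88:99:aa:bb", encrypted=False,
          payload=b"user=alice&pass=hunter2"),
]

ids, samples = scan_with_rogue_code(frames)
# Identifiers are collected from both networks; readable payload is
# retained only from the open one.
```

The sketch also shows why only open networks were implicated in Google's admission: even a scanner that stored every frame would hold only ciphertext for password-protected networks.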
Google states that maintaining the public’s trust is crucial to all that it does, but, in the case of payload data collected by its GSV cars, it fell short.40 Google subsequently grounded its entire GSV fleet, ceased collecting WiFi network data, and launched an internal investigation.

39Posting of Alan Eustace, Official Google Blog: WiFi data collection, http://googleblog.blogspot.com/2010/05/wifi-data-collection-update.html (May 14, 2010, 01:44 EST) (last visited July 26, 2010).
40See Posting of Alan Eustace, Official Google Blog: WiFi data collection, http://googleblog.blogspot.com/2010/05/wifi-data-collection-update.html (May 14, 2010, 01:44 EST) (last visited July 26, 2010).

Google’s practice of unauthorized wireless data collection is stirring up privacy concerns in both the domestic and international contexts. The domestic response to Google’s actions includes a direct inquiry by Congress into Google’s wireless data collection, a multistate investigation of Google’s actions, and several privacy-related lawsuits filed against Google. The international community is following suit. The federal response to Google’s data collection was swift. Initially, two lawmakers who co-chair the House of Representatives’ Privacy Caucus, Representatives Joe Barton and Edward Markey, wrote a letter to Jon Leibowitz, Chairman of the FTC, asking him to investigate the matter. In response, Mr.
Leibowitz pledged that the FTC will “take a very close look at exactly what’s going on.” 41 Other federal agencies have similarly voiced their opinions. For instance, in a posting on the FCC’s official blog, Joel Gurin, Chief of the Consumer and Governmental Affairs Bureau of the FCC, noted that “[w]hether intentional or not, collecting information sent over WiFi networks clearly infringes on consumer privacy.” 42

The federal response to Google seeks to address two concerns. The first is whether Google’s actions are legal under different statutory regimes, including Section 5 of the Federal Trade Commission Act, the Communications Act of 1934, and the Electronic Communications Privacy Act. The second is whether Google analyzed and profiled any of the payload data that it collected in the time period between when it first captured the data and when it eventually grounded its GSV fleet. In furtherance of both of these concerns, a 38-state coalition—led by Connecticut Attorney General Richard Blumenthal—sent a letter to Google inquiring into the company’s actions. Specifically, the coalition wants Google to answer whether its software was designed to collect random pieces of information broadcast over open wireless networks or to download specific types of data, and whether it has sold or otherwise used any of the payload data that it collected. Ultimately, the coalition seeks to “determine whether laws were broken and whether legislation is necessary to prevent future privacy breaches.” 43

Notwithstanding the federal and state inquiries into its actions, Google also faces a growing number of privacy-related lawsuits. Currently, Google faces eight suits in different states and the District of Columbia. Most of these suits claim that Google violated federal and state wiretapping laws while its GSV cars collected payload data from open WiFi networks.
Notably, some of these lawsuits seek class-action certification, which may potentially force Google to pay up to $10,000 for each time that it collected data from a class member.44

In addition to the domestic backlash that Google faces, several different countries are inquiring into Google’s actions, as its GSV cars were present in at least thirty-three nations. There are two main international responses to Google’s actions. First, nations such as Ireland, Denmark, and Austria have asked Google to delete the data it collected from within the nations’ borders. Google has obliged and provided proof that hard drives containing data from those nations have been destroyed. Similarly, the United Kingdom’s Information Commissioner’s Office noted that Google breached data protection requirements, but acknowledged that it will not pursue further investigations if Google agrees to delete its data. The second international response is more troublesome for Google. Regulators in nations such as Germany, France, and Spain have requested that Google turn over hard drives of wireless data that it collected. These nations seek to investigate whether Google violated any of their domestic data protection laws. Notably, regulators in Germany have launched a criminal probe into Google’s actions and are considering whether to file formal charges against Google.45

41Chloe Albanesius, FTC Vows to Take ‘Very Close Look’ at Google Wi-Fi Sniffing, PC Magazine, May 24, 2010, http://www.pcmag.com/article2/0,2817,2364126,00.asp.
42Consumer View: Staying Safe from Cyber Snoops, Posting of Joel Gurin to the Official Blog of the Federal Communications Commission, http://reboot.fcc.gov/blog?entryId=493624 (June 11, 2010) (last visited July 26, 2010).
43Press Release, Connecticut Attorney General’s Office, CT Attorney General Seeks Additional Information On Google Street View Snooping, Announces 38 States Have Joined Multistate Investigation, http://www.ct.gov/ag/cwp/view.asp?A=2341&Q=463406 (July 21, 2010) (last visited July 26, 2010).
44Gregg Keizer, Google’s Wi-Fi Snooping Earns it a Class-Action Lawsuit, InfoWorld, May 20, 2010, http://www.infoworld.com/d/security-central/googles-wi-fi-snooping-earns-it-class-action-lawsuit-629?page=0,1.
45Jeremy Kirk, Germany Launches Criminal Investigation of Google, Computerworld, May 20, 2010, http://www.computerworld.com/s/article/9177019/Germany_launches_criminal_investigation_of_Google.

Summary

While today most jurisdictions around the world recognize privacy in some respect, the right to privacy is a relatively new concept. Moreover, there continue to be tremendous differences in how privacy is viewed, not to mention how individuals may enforce their privacy rights. In certain jurisdictions, including Europe, privacy is viewed as a fundamental human right. In other jurisdictions, especially the United States, the right to privacy is more often conceptualized as the right to be left alone from interference.

Generally, privacy rights developed slowly but somewhat steadily, at least until the latter part of the twentieth century. During the mid- to late-1990s, privacy rights became the subject of increased focus and concern. One significant reason for this new focus was the advent of new technologies. Throughout this time period, new technological developments were having a profound impact on the way in which we worked, played, and lived. They were enhancing our ability to exchange information, content and data rapidly and across vast distances and national boundaries. Efforts to collect, process, and mine data, which were once extraordinarily time-consuming processes, could now be accomplished almost instantaneously. As a result, there was increasing concern about how such technologies might adversely affect individual privacy. In essence, we wondered how to preserve this vague right to be left alone from interference, as the new technologies made it less and less likely that we would indeed be left alone.

Around the same time period, a comprehensive data protection regime with elements of extraterritoriality was just coming into force in Europe. Europe’s main data protection directive (the “Data Protection Directive”), which was passed in 1995 and took effect in 1998, severely limited the ability of enterprises and organizations to transfer any personal data outside of the European Economic Area (EEA) unless the country to which the data would be transferred provided “adequate” protection to such personal data. As a result, many jurisdictions were compelled to implement new privacy and data protection laws so as to ensure that entities within their national boundaries would be able to continue to receive personal data from within the EEA. The United States, of course, opted for a different approach and undertook efforts to negotiate a self-regulatory scheme as an alternative to comprehensive data privacy legislation.

All of the foregoing factors contributed to an increased demand for comprehensive privacy laws. And many jurisdictions responded. During this time period, around the globe, a great number of privacy laws were introduced and/or enacted. At the same time, governmental and regulatory authorities in various jurisdictions directed attention towards launching investigations and commencing enforcement actions against companies that had been viewed as violating individual privacy rights.

Today, more than a decade after the enactment of the Data Protection Directive, concerns over privacy continue to grow. Technological developments, including the advancement of social media applications, the more widespread use of telemedicine and electronic health records, and the expanded use of surveillance technology, are all increasing the threats to individual privacy. In response to these threats, jurisdictions around the world are proposing new legislative means for protecting information privacy. Businesses must keep pace with the ever-evolving legal and regulatory environment applicable to online privacy.

Key Terms

personal data or personal information, p. 363 Federal Trade Commission (FTC), p. 363 Intrusion upon Seclusion, p. 366 public disclosure of private facts causing injury to reputation, p. 368 publicly placing another in a false light, p. 368 misappropriation of a person’s name or likeness causing injury to reputation, p. 368 Privacy Protection Act (PPA), p. 369 Cable Communications Policy Act (CCPA), p. 369 Video Privacy Protection Act, p. 370 Telephone Consumer Protection Act of 1991 (TCPA), p.
370 Federal Communications Commission (FCC), p. 370 Electronic Communications Privacy Act (ECPA), p. 370 Fair Credit Reporting Act (FCRA), p. 372 identity theft, p. 372 Gramm-Leach-Bliley Act (GLB Act), p. 372 Health Insurance Portability and Accountability Act (HIPAA), p. 374 data breach, p. 375 Children’s Online Privacy Protection Act (COPPA), p. 376 geolocation data, p. 377 spam, p. 377

Manager’s Checklist

• Privacy and data security laws continue to evolve, and business managers must remain vigilant about applicable changes that can impact their businesses.
• Business managers should establish and implement effective policies regarding email usage, data retention and destruction, information security, and Internet usage.
• If your company collects and stores information about its customers, suppliers, employees or others, make sure you provide them with a privacy policy that is clear, easy to understand, and that complies with all applicable laws and regulations.
• Businesses must be aware that national approaches to data privacy vary widely from country to country, and many countries outside of the United States have laws that are more stringent than those in place in the United States.
• When launching new products or services, companies should consider having the products and services evaluated to ensure that they comply with privacy and security requirements before they are made available to the public.

Questions and Case Problems

1. In this case, plaintiffs, Chinese residents, brought suit against Yahoo! and Yahoo! Hong Kong (“Yahoo”) for a violation of the Electronic Communications Privacy Act (“ECPA”). Plaintiffs allege that defendants disclosed specific personal information about plaintiffs such as Internet user information. The ECPA prohibits unauthorized access to electronic communication as well as the interception of electronic communication.
The issue that the plaintiffs must overcome is whether Congress intended the ECPA to apply overseas (as it is an American law and the alleged violation took place in China). The case is presented before the U.S. District Court for the Northern District of California. The ECPA is an amendment to an earlier act called the Wiretap Act. The Wiretap Act only prohibited the interception of wire and oral communications, to which the ECPA added electronic communications. The District Court pointed out that, although there is no decision on whether the ECPA applies to foreign states, the Wiretap Act was specifically held to not apply to overseas communications. Moreover, there is no provision of the ECPA that specifically amends the Wiretap Act to apply overseas. The plaintiffs argue, however, that even if it is true that the ECPA only applies to violations in the United States, the communications in all probability travelled through Yahoo’s servers in the United States and therefore the statute should apply. Decide the outcome that the District Court should adopt and discuss reasons.46

2. In this case, plaintiff Stacy Guin brought suit in the District Court of Minnesota against defendant Brazos Higher Education Service Corporation (“Brazos”) for negligence in keeping his personal information safe and secure.
The action arose from a situation in which John Wright, a financial analyst for Brazos, had his home burglarized and his laptop stolen with applicants’ information on it, possibly including Guin’s. Guin learned about this breach of privacy through a letter that Brazos sent out to all of its customers within the state, as required by law, since Brazos did not know specifically whose information had been on the laptop computer. Guin sued Brazos for negligence, alleging that Brazos had a duty of care to secure his personal information and ensure it was not released. Brazos argues that Guin did not sufficiently establish a claim for negligence. The four elements of a negligence claim are: (1) duty of care, (2) breach of the duty of care, (3) an injury, and (4) the breach was the proximate cause of the injury. Brazos argues that one of the key components, injury, was not satisfied, as it is not even clear if his information was on the computer and therefore it is possible that his privacy was not breached whatsoever. Guin bases the first and second prongs—duty and breach—on the Gramm-Leach-Bliley Act (“GLB Act”), which imposes a duty “to protect the security and confidentiality of customers’ nonpublic personal information.” He also argues that Brazos had its own self-imposed duty that had the same standards as the GLB Act. He claims that Brazos breached both of those duties by allowing his information to be released into the public from Wright’s computer. Guin also claims that he was injured in the form of identity theft since his information is now public (although there is no proof that his information was in fact used or even released, for that matter). To be the proximate cause of the injury, the last of the components of a negligence claim, one must show that the breaching party breached a duty that was reasonably foreseeable to lead to an injury to the other party.
Discuss and state reasons for what the District Court should conclude on the issue of negligence in this case.47

3. Customer lists are key assets for many businesses. At the same time, however, when compiling customer lists, many companies promise customers that their information will be protected and held in confidence. Consider, for instance, the recent case involving the now defunct XY Magazine, a magazine targeted towards a youthful gay male market. When subscribers subscribed for the magazine, the applicable subscription form said it “never sells its list to anybody.” XY.com told prospective subscribers that their magazines would be mailed in shrink-wrapped black plastic so that subscribers’ parents couldn’t tell what they were getting. When, in connection with the bankruptcy proceeding, certain parties attempted to assert ownership over the personal data that had been collected by the magazine, the FTC warned the parties that selling the data would violate the terms of the privacy policy. Discuss whether you agree with the FTC’s determination in this matter. Also, identify the previous case in which the FTC considered a similar issue and identify the outcome of that case.48

4. In connection with the settlement of claims of alleged intellectual property infringement, a company agreed to transfer to another company certain customer lists containing customer information. However, the privacy policy of the company who agreed to the data transfer promised customers that it would not share its customers’ information with third parties. Should the company be able to comply with the settlement agreement, in contravention of its stated privacy policy, or should it be required to comply with its privacy policy, and thereby violate the settlement agreement?49

5. HHS and the FTC recently launched an investigation into a major pharmacy chain for its information disposal practices. The regulators claimed that the pharmacy chain failed to protect customers’ sensitive financial and medical information by disposing of prescriptions and labeled pill bottles in dumpsters that were accessible by the public. What consequences should a company face for failing to properly dispose of customer information?50

46Zheng v. Yahoo, No. C-08-1068, 2009 WL 4430297 (N.D. Ca. Dec. 2, 2009).
47Guin v. Brazos Higher Education Services Corp. Inc., No. Civ. 05-668, 2006 WL 288483 (D. Minn. Feb. 7, 2006).
48Correspondence from the FTC to the representatives of the potential buyer of the customer data, http://www.ftc.gov/os/closings/100712xy.pdf.
49FenF, LLC v. Healio Health, Inc., No. 5:08-CV-404 (N.D. Ohio, July 8, 2010).
50In the Matter of Rite Aid Corp., FTC, No. 072-3121, proposed settlement announced July 27, 2010.

Additional Resources

• United States Federal Trade Commission, www.ftc.gov
• Department of Health and Human Services, www.hhs.gov
• Privacy Rights Clearinghouse, www.privacyrights.org
• International Association of Privacy Professionals, www.privacyassociation.org
• Electronic Privacy Information Center, www.epic.org
• Privacy.org, www.privacy.org