
Best Practices for OSPs: Defamation and the Communications Decency Act

Presentation Transcript


  1. Best Practices for OSPs: Defamation and the Communications Decency Act Jennifer Lloyd Kelly, Fenwick & West LLP Marcia Hofmann, EFF

  2. Presenters • Jennifer Lloyd Kelly, Litigation Associate, Fenwick & West LLP • Practice focuses on intellectual property and commercial disputes, primarily for local technology companies. • Marcia Hofmann, Staff Attorney, EFF • Practice focuses on information-related policy and litigation.

  3. Overview • What kinds of problems arise when your website hosts user-generated content? • Basics of defamation law – what kinds of statements could subject the user (and potentially your company) to liability? • Communications Decency Act Section 230 – what steps can you take to protect your company from liability?

  4. User-Generated Content – Common Problem #1 • User posts nasty or inflammatory statements about another person, company, or product on your website. • Examples: eBay seller reviews, Amazon product reviews, statements on blogs, message boards, and discussion forums. • Risk: the subject of the statements takes offense and threatens to sue, or actually sues, for defamation.

  5. User-Generated Content – Common Problem #1 Example

  6. User-Generated Content – Common Problem #2 • User posts another person’s private information – personally identifying information (name, address, SSN), saucy photographs, pornographic or otherwise private videos.

  7. User-Generated Content – Common Problem #2 Example • Barnes v. Yahoo: • Woman’s ex-boyfriend created and posted unauthorized “profiles” about her on Yahoo’s website that contained nude photographs and contact information, and, while impersonating her, solicited men for sex in chat rooms. • Woman contacted Yahoo and demanded that the offending materials be removed. Yahoo agreed to do so, but did not actually remove them until after she filed a lawsuit. • The lawsuit was premised on this broken promise.

  8. User-Generated Content – Common Problem #3 • User solicits in a discriminatory manner. • Examples: • Person posts a housing ad on Craigslist that allegedly violates the federal Fair Housing Act. • Company runs a roommate-matching service that allows users to advertise openings for roommates through structured data fields with pull-down menus.

  9. User-Generated Content – Common Problem #3 Example

  10. User-Generated Content – Common Problem #4 • Employee uses company email/computer/network to send emails and post information that threatens or harasses others. • Example: Delfino v. Agilent (using his work computer, an Agilent employee sent emails and posted information that harassed the plaintiff).

  11. Defamation • Elements of claim: • A statement about a person to someone other than that person • That is a false statement of fact • Which tends to harm the reputation of the person

  12. Defamation • Per se defamation: harm to reputation is presumed: • Attacks on a person’s professional character or standing • Allegations that an unmarried person is unchaste • Allegations that a person has an STD • Allegations that a person has committed a crime of moral turpitude (murder, adultery)

  13. Defamation • Important limitations/defenses: • Statement must actually be false. • Statement must be of a verifiable fact, not an opinion (“George stole the painting” vs. “That painting is hideous” or “George is a fool”). • Public figures must prove “actual malice.” • Privilege applies to certain types of statements made in official proceedings or public meetings.

  14. Defamation • Republication: • Absent immunity, anyone who repeats or republishes a defamatory statement made by another can be held liable for defamation if they knew or had reason to know the statement was defamatory.

  15. Content From Third Parties • Can your company be held liable for unlawful material that users post on its website? • Do you have any obligation to police user-generated content? • Can you alter content that users post on your site? If so, how much?

  16. Communications Decency Act Section 230 “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

  17. Communications Decency Act Section 230 “No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

  18. History of Section 230 • Much of the Communications Decency Act has been found unconstitutional, but Section 230 still survives. • Congress passed Section 230 to encourage the growth of the Internet and keep government interference to a minimum.

  19. Elements Section 230 immunity requires that: • the defendant be a provider or user of an interactive computer service, • the cause of action treat the defendant as a publisher or speaker of information, and • the information be provided by another information content provider.
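
For engineers building a moderation or legal-intake workflow, the three prongs can be thought of as a conjunctive checklist. The Python sketch below is a toy illustration only, not legal advice: all names are hypothetical, and whether each prong is actually satisfied is a legal judgment for counsel, not a boolean your code can compute.

    # Toy sketch (not legal advice): the three conjunctive prongs of
    # Section 230 immunity, modeled as a checklist. Names hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Claim:
        provider_or_user_of_ics: bool       # prong 1: interactive computer service
        treated_as_publisher_or_speaker: bool  # prong 2: claim targets publishing role
        content_from_another_provider: bool    # prong 3: third-party content

    def immunity_may_apply(claim: Claim) -> bool:
        """Immunity is available only if all three prongs are met."""
        return (claim.provider_or_user_of_ics
                and claim.treated_as_publisher_or_speaker
                and claim.content_from_another_provider)

    # Example: a defamation suit against a forum operator over a user's post.
    print(immunity_may_apply(Claim(True, True, True)))  # True

Failing any single prong defeats immunity, which is why the test is a logical AND rather than a balancing of factors.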

  20. What’s a “provider or user of an interactive computer service”? A broad variety of online publishers. Examples: • Traditional ISPs • Website operators (including bloggers) • Listserv operators • Users of online services

  21. What kinds of suits does Section 230 protect against? A wide range of legal claims. Examples: • Defamation • Unfair competition • Invasion of privacy • Breach of contract • State intellectual property claims • Federal civil rights claims

  22. Limitations • Section 230 doesn't apply to federal criminal claims, intellectual property claims, or electronic communications privacy laws. • You can select, withdraw, or edit user content, but immunity may not apply if you significantly change the meaning of the content.

  23. Other Things to Know • Efforts to police content are irrelevant for purposes of § 230 immunity. • If a situation is a close call, you should check with a lawyer.

  24. Best Practices • If someone threatens to sue your company for publishing content protected by Section 230, you can send a letter asserting that you have immunity. • You might work with an attorney to develop a form letter for such situations.

  25. To Sum Up • Section 230 protects OSPs from a wide range of claims based on content provided by others (though it does not extend to federal criminal, IP, or electronic communications privacy claims). • You won’t lose immunity for selecting and editing user content, but you can’t significantly change its meaning.

  26. To Sum Up • Immunity may apply regardless of whether you choose to police content on your site. • Work with a lawyer to develop a form letter for Section 230 situations. If you have questions about the law or are unsure whether it protects you in a certain situation, consult an attorney.

  27. Questions? • Jennifer Lloyd Kelly, Associate, Fenwick & West LLP – jkelly@fenwick.com • Marcia Hofmann, Staff Attorney, Electronic Frontier Foundation – marcia@eff.org – http://www.eff.org
