
Best Practices for OSPs: Defamation and the Communications Decency Act

Jennifer Lloyd Kelly, Fenwick & West, LLP

Marcia Hofmann, EFF


Presenters

  • Jennifer Lloyd Kelly, Litigation Associate, Fenwick & West LLP

    • Practice focuses on intellectual property and commercial disputes, primarily for local technology companies.

  • Marcia Hofmann, Staff Attorney, EFF

    • Practice focuses on information-related policy and litigation.


Overview

  • What kinds of problems arise when your website has user-generated content?

  • Basics of defamation law – which kinds of statements could subject the user (and potentially your company) to liability?

  • Communications Decency Act Section 230 – what steps you can take to protect yourself from liability.


User-Generated Content – Common Problem #1

  • User posts nasty or inflammatory statements about another person, company, or product on your website.

  • Examples: eBay seller reviews, Amazon product reviews, statements on blogs, message boards, and discussion forums.

  • Risk: subject of statements is offended and threatens to or actually sues for defamation.


User-Generated Content – Common Problem #1 Example


User-Generated Content – Common Problem #2

  • User posts another person’s private information – personally identifying information (name, address, SSN), saucy photographs, pornographic or otherwise private videos.


User-Generated Content – Common Problem #2 Example

  • Barnes v. Yahoo:

    • Woman’s ex-boyfriend created and posted unauthorized “profiles” about her on Yahoo’s website that contained nude photographs and contact information, and, while impersonating her, solicited men for sex in chat rooms.

    • Woman contacted Yahoo and demanded that the offending materials be removed. Yahoo agreed to do so, but did not actually remove them until after she filed a lawsuit.

    • Her lawsuit was premised on this broken promise.


User-Generated Content – Common Problem #3

  • User solicits in a discriminatory manner.

  • Examples:

    • Person posts a housing ad on Craigslist that allegedly violates the federal Fair Housing Act.

    • Company runs a roommate-matching service that lets users advertise openings for roommates through structured data fields with pull-down menus.


User-Generated Content – Common Problem #3 Example


User-Generated Content – Common Problem #4

  • Employee uses company email/computer/network to send emails and post information that threatens or harasses others.

  • Example: Delfino v. Agilent (an Agilent employee used his work computer to send emails and post information that harassed the plaintiff).


Defamation

  • Elements of claim:

    • A statement about a person to someone other than that person

    • That is a false statement of fact

    • Which tends to harm the reputation of the person


Defamation

  • Per se defamation: harm to reputation is presumed for:

    • Attacks on a person’s professional character or standing

    • Allegations that an unmarried person is unchaste

    • Allegations that a person has an STD

    • Allegations that a person has committed a crime of moral turpitude (murder, adultery)


Defamation

  • Important limitations/defenses:

    • Statement must actually be false.

    • Statement must be of a verifiable fact, not an opinion (“George stole the painting” vs. “That painting is hideous” or “George is a fool”).

    • Public figures must prove “actual malice.”

    • Privilege for certain types of statements made in official proceedings or public meetings.


Defamation

  • Republication:

    • Absent immunity, anyone who repeats or republishes a defamatory statement made by another can be held liable for defamation if they knew or had reason to know the statement was defamatory.


Content From Third Parties

  • Can your company be held liable for unlawful material that users post on its website?

  • Do you have any obligation to police user-generated content?

  • Can you alter content that users post on your site? If so, how much?


Communications Decency Act Section 230

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”


Communications Decency Act Section 230

“No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”


History of Section 230

  • Much of the Communications Decency Act has been found unconstitutional, but Section 230 still survives.

  • Congress passed Section 230 to encourage the growth of the Internet and keep government interference to a minimum.


Elements

Section 230 immunity requires that:

  • the defendant be a provider or user of an interactive computer service,

  • the cause of action treat the defendant as a publisher or speaker of information, and

  • the information be provided by another information content provider.


What’s a “provider or user of an interactive computer service”?

A broad variety of online publishers. Examples:

  • Traditional ISPs

  • Website operators (including bloggers)

  • Listserv operators

  • Users of online services


What kinds of suits does Section 230 protect against?

A wide range of legal claims. Examples:

  • Defamation

  • Unfair competition

  • Invasion of privacy

  • Breach of contract

  • State intellectual property claims

  • Federal civil rights claims


Limitations

  • Section 230 doesn't apply to federal criminal claims, intellectual property claims, or electronic communications privacy laws.

  • You can select, withdraw, or edit user content, but immunity may not apply if you significantly change the meaning of the content.


Other Things to Know

  • Efforts to police content are irrelevant for purposes of Section 230 immunity.

  • If a situation is a close call, you should check with a lawyer.


Best Practices

  • If someone threatens to sue your company for publishing content protected by Section 230, you can send a letter asserting that you have immunity.

  • You might work with an attorney to develop a form letter for such situations.


To Sum Up

  • Section 230 protects OSPs from a wide range of claims based on content provided by others (though it does not extend to federal criminal, IP, or electronic communications privacy claims).

  • You won’t lose immunity for selecting and editing user content, but you can’t significantly change its meaning.


To Sum Up

  • Immunity may apply regardless of whether you choose to police content on your site.

  • Work with a lawyer to develop a form letter for Section 230 situations. If you have questions about the law or are unsure whether it protects you in a certain situation, consult an attorney.


Questions?

Jennifer Lloyd Kelly, Associate

Fenwick & West LLP

[email protected]

Marcia Hofmann, Staff Attorney

Electronic Frontier Foundation

[email protected]

http://www.eff.org

