Editor’s note: Sid Venkatesan is an IP partner specializing in high stakes IP disputes and IP counseling for technology companies in the Silicon Valley office of Orrick, Herrington & Sutcliffe LLP. James Freedman is an associate in Orrick’s IP group and a recent Stanford Law School graduate.
Online content providers and aggregators are well aware of the potential penalties that can result from a copyright infringement lawsuit. In addition to being expensive to litigate, a copyright lawsuit can result in statutory damages (which can range between $750 and $30,000 for each infringing work found on a website), some or all of an infringer’s profits and even steeper penalties for willful infringement. A peer-to-peer platform relying on user-uploaded content, for example, can face nearly unlimited liability under this regime. Clearly, a copyright suit can have a crippling effect on an early-stage tech company.
One of Congress’ goals when it passed the Digital Millennium Copyright Act in 1998 was to insulate certain digital content providers (called “service providers” in the statute) as long as they promptly took down infringing works on notice from the copyright holder of those works.
This protection is called a “safe harbor” and can be used as a defense to a copyright infringement claim. There are several safe harbors in the DMCA (they are contained in Title 17 of the United States Code, Section 512). One important one, set forth in Section 512, subpart (c), protects service providers that offer “storage at the direction of a user” on the provider’s network, i.e. a platform for user-uploaded content. These service providers can rely on the safe harbor as long as they:
- Promptly take down allegedly infringing material upon receipt of a statutorily compliant notice;
- Do not have “actual knowledge” of copyright infringement;
- Are unaware of “facts or circumstances from which infringing activity is apparent” (this is called “red flag” knowledge in case law);
- Remove or disable access to infringing material upon learning about it (whether or not a takedown notice has been received);
- And do not receive a “financial benefit directly attributable to the infringing activity” where the “service provider has the right and ability to control the activity.”
Multi-part statutory tests are often a litigator’s delight, but they do not always provide clarity for businesses trying to comply with the law. Indeed, Section 512(c) raises a number of questions. For example, what are “facts or circumstances from which infringing activity is apparent”? Does an online ad-supported content-hosting platform “receive financial benefit directly attributable to the infringing activity”? Does a service provider that processes or tags user-uploaded content perform “storage at the direction of a user” or something else?

Unpacking Safe Harbor
The boundaries of the safe harbor have been tested in litigation, and as a result, recent federal Court of Appeals decisions have resolved some of the questions around Section 512(c). Most recently, the Ninth Circuit Court of Appeals – which covers nine states, including the West Coast – ruled that the video streaming site Veoh was protected by the safe harbor in a case brought by Universal Music Group (though Veoh’s successor won the appeal, the original company went bankrupt and was sold in a fire sale in 2010).
The reasoning of the Ninth Circuit was aligned with a 2012 decision in the Second Circuit (which covers Connecticut, New York and Vermont) in Viacom v. YouTube, meaning there is agreement as to some of the rules of the road for courts covering the largest media and technology hubs in the country.
Specifically, the Veoh and YouTube cases make clear that:
- “Storage at the direction of a user” includes more than simply storing user-uploaded content. Veoh and YouTube both automatically processed uploaded videos for hosting and converted them to Flash. These activities were found to be protected as they were logically related to “storage.” On the other hand, YouTube’s alleged practice of syndicating clips to third parties might not fall within the safe harbor, and this issue was sent back to the trial court for further litigation.
- Neither “actual knowledge” nor “red flag” knowledge can be based on a general awareness that infringing works may be on the service provider’s system. For example, there were news articles in 2007 indicating that Veoh had been lax in policing infringing content, but the Ninth Circuit found that such general awareness was not tied to the allegedly infringing UMG works at issue in the case. In reaching this conclusion, the Ninth Circuit appeared to rely on Veoh’s policy of promptly taking down infringing content on notice and the absence of internal emails or documents showing that Veoh knew of specific infringing works on its system.
- A service provider has no affirmative duty to continuously monitor for potentially infringing copyrighted material. This conclusion is consistent with the statutory notice-and-takedown scheme, which places the burden of identifying infringing material on the copyright holders.
- That being said, a service provider is not allowed to simply sit back and wait for a takedown notice if it is aware of specific infringing material. Some emails presented in the YouTube case suggested that the YouTube founders may have been aware that infringing material was on the site yet elected to wait for a takedown notice before removing it. This was one of the reasons the YouTube case was sent back to the trial court for further proceedings, whereas Veoh obtained a summary judgment victory [the lesson here, as always: bad evidence can rarely be covered up with legal doctrine]. Also, on Thursday, the Ninth Circuit concluded that the BitTorrent site isoHunt was not entitled to safe harbor protection because its operator “actively encourag[ed] infringement,” including by “urging his users to both upload and download” works, actions that showed he had at least red flag knowledge of infringing works.
- Finally, the safe harbor does not apply where a service provider receives a financial benefit and “exerts substantial influence on the activities of the users.” Veoh’s model of letting users decide what to upload, while retaining the discretion to take down content, was found not to “exert substantial influence on its users” and therefore fell within the safe harbor.
Generally, these cases show that a platform that leaves content uploads to the discretion of its users, performs processing specifically related to the display of and access to that content, promptly abides by DMCA takedown requests, and does not close its eyes to specifically infringing works can probably take comfort from Section 512(c).
Less clear is how business models that use user-uploaded content for purposes other than general ad-supported storage and viewing, or models that limit or direct what users upload, will fare. Moreover, though affirmative monitoring is not required under these cases, we can reasonably expect that what constitutes “red flag” knowledge may change over time as technology improves and service providers become more easily able to identify and analyze content on their platforms.
[This column reflects Sid’s and James’ general views and does not constitute legal advice or the views of Orrick or its clients.]