Study on Expanded “User Rights” Fails Econometric Scrutiny

Earlier this month, scholars at the American University Washington College of Law’s Program on Information Justice and Intellectual Property (PIJIP) published a paper suggesting that governments around the world should consider weakening copyright protection in favor of expanded “user rights.” The Google-funded report presents an index purporting to show a positive correlation between broad fair use and safe harbor laws and certain economic and scholastic benefits. But, as economist George Ford explains in an essay published last week, the report is an exercise in flawed design and misapplied empirical analysis that cannot be relied upon for informed policymaking.

The PIJIP report, The User Rights Database: Measuring the Impact of Copyright Balance, written by Sean Flynn and Michael Palmedo, highlights a database of exceptions and limitations based on responses to copyright-related survey questions posed to both wealthy and developing countries over a sixteen-year period. Flynn and Palmedo then average the responses to construct an index of “user rights” and employ regression analyses to show a positive relationship between greater limitations and exceptions to copyright law and positive economic and scholastic outcomes. The authors claim that this index is a first-of-its-kind tool that can be used to measure the social and economic impact of expanding user rights in the digital era, and that the “findings are relevant to several major arguments for or against expansions of copyright user rights.”

In an essay published in critical response to the report, titled The Vanishing Benefits of Fair Use: A Review of the Flynn-Palmedo Study on “User Rights” in Copyright Law, George Ford begins by questioning the very purpose of the regression models, arguing that because no conceptual framework is offered to explain why a correlation between the database’s variables is desirable, the analysis is purely ad hoc, arranged to reach a specific outcome. Making matters worse, Ford points out that the report reveals evidence of selection bias that precludes any causal interpretation of the results.

Ford then takes the regression equation thoroughly to task, and though his analysis delves into econometric theory and practice beyond this author’s familiarity, he explains that the methods used by Flynn and Palmedo deviate from those recognized by statistical experts as most effective and reliable. In fact, he describes the empirical work as having a “willy-nilly, unprofessional feel,” based on unsound statistical methods that produce misleading correlations.

Perhaps Ford’s biggest concern is with the study’s poor specification of the variables it measures. Describing the authors’ choice to include sales and income totals from affiliates of US-owned multinational corporations, Ford explains, “no explanation is given as to why this sector is chosen or why it would be affected much by fair use, or why increasing the sector’s profits suggests expanding fair use is a good policy.” The result of this misspecification is that the positive effects reported in the study are largely spurious and vanish when the models are re-estimated with both country and time fixed effects.
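Why fixed effects matter here can be illustrated with a small synthetic example (entirely hypothetical data, not the study’s): if countries that happen to score high on a “user rights” index also happen to have high outcomes for unrelated country-level reasons, a pooled regression shows a strong positive slope, while the within (two-way demeaned) estimator, which strips out country and year averages, shows essentially nothing.

```python
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_years = 10, 10

# Hypothetical country-level heterogeneity: some countries happen to have
# both a higher "user rights" index and higher outcomes, for reasons
# unrelated to copyright policy.
country_effect = np.arange(n_countries, dtype=float)

# Index and outcome both driven by the country effect, plus noise.
x = country_effect[:, None] + rng.normal(0, 0.3, (n_countries, n_years))
y = 2.0 * country_effect[:, None] + rng.normal(0, 0.3, (n_countries, n_years))

def ols_slope(x, y):
    """Simple bivariate OLS slope on the pooled panel."""
    x, y = x.ravel(), y.ravel()
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

def two_way_demean(z):
    # The "within" transformation equivalent to country and year fixed
    # effects: subtract country means and year means, add back grand mean.
    return z - z.mean(axis=1, keepdims=True) - z.mean(axis=0, keepdims=True) + z.mean()

pooled = ols_slope(x, y)                                   # strongly positive
within = ols_slope(two_way_demean(x), two_way_demean(y))   # near zero
print(f"pooled slope: {pooled:.2f}, within slope: {within:.2f}")
```

The pooled slope is driven entirely by cross-country differences; once country and year averages are removed, nothing is left, which is the pattern Ford reports when the study’s results are re-estimated with fixed effects.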

The Flynn-Palmedo study also measures the number of citable documents published in the countries it surveys in an attempt to show that expanded limitations and exceptions to copyright law result in increased scholarly output. Again, Ford points out that the regression analysis employed is inconsistent, with no explanation given as to why, and fails to hold up when subjected to more rigorous methods.

Though Ford argues the study “should be dismissed on statistical errors alone,” he goes on to identify additional problems that render the work unreliable. Taking issue with the study’s survey questions, Ford explains that the authors arbitrarily assign numeric values to questions that can be interpreted in a variety of ways and are likely to confuse respondents. Calculating a simple mean of these numbers is a careless approach to the database’s construction, one that renders the results “dubious at best.”
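The problem with averaging arbitrary numeric codes can be shown in a few lines (a hypothetical illustration; the scale, labels, and answers below are invented, not taken from the study). Survey answers are ordinal, so any monotone recoding of the scores is equally defensible, yet different recodings can reverse which country “scores higher” on the index:

```python
# Hypothetical ordinal survey answers, coded 1 = "no exception",
# 2 = "narrow exception", 3 = "open exception" (invented scale).
country_a = [3, 1, 1]
country_b = [2, 2, 2]

def mean(xs):
    return sum(xs) / len(xs)

# Under the original coding, country B ranks above country A
# (roughly 1.67 vs 2.0).
print(mean(country_a), mean(country_b))

# An equally arbitrary monotone recoding of the same answers
# (1 -> 1, 2 -> 2, 3 -> 10) flips the ranking: A now ranks above B.
recode = {1: 1, 2: 2, 3: 10}
a_recoded = [recode[v] for v in country_a]
b_recoded = [recode[v] for v in country_b]
print(mean(a_recoded), mean(b_recoded))
```

Since nothing in the data dictates which coding is correct, an index built by averaging such scores, and any regression run on it, inherits that arbitrariness.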

Ford also calls out the authors for claiming their results speak to “innovation and creativity.” Insisting that the results do not, and that their relationship to general social wellbeing is left unexplained, Ford states:

“[T]he outcomes studied include mostly the revenues and profits of firms presumably exploiting copyright’s exceptions. I have no doubt that weaker copyright enforcement will increase the size and profits of some firms, and that these firms will encourage governments to expand fair use and safe harbor.”

Ford acknowledges that copyright law is too intricate to be condensed into a single index. Unfortunately, that doesn’t stop someone from claiming to have created a comprehensive and superior database every few years.

In the current case, despite suggesting positive correlations among the inexplicably selected variables, the data compiled by the Flynn-Palmedo study do not survive scrutiny when subjected to Ford’s better-specified models.

Ford concludes his critique by reaffirming that the Flynn-Palmedo study offers no theoretical basis for its regression models and does not meet minimum professional econometric standards. At a time when many nations are reviewing their copyright laws, it’s imperative to understand that the Flynn-Palmedo results are in no way causal and that they should not form the basis of policymaking efforts. In fact, according to Ford, the study should be retracted and substantially revised before any weight is afforded to it whatsoever.

