Think Tanks by the Numbers

Guest blogger Donald Abelson discusses the impact of think tanks, and how impact can be quantified. Transparify does not edit the content of guest blogs; the views expressed in this blog are those of the author alone, and may not reflect the views of Transparify.

When it comes to assessing the impact or influence of their organizations, directors of think tanks generally have two prepared responses.  The first, which is directed to scholars and to investigative journalists familiar with the complex world of think tanks, tends to be more circumspect. As the head of a Canadian think tank said to me in a recent interview, “measuring our impact on public policy is virtually impossible.”  But what directors of think tanks are willing to concede behind closed doors is a far cry from the message they convey in public. Indeed, the narrative that is carefully crafted for stakeholders prepared to support and endorse the work of think tanks is very different.  When potential funding dollars are on the line, think tanks can ill afford to be modest. “We have enormous influence when it comes to shaping public opinion and public policy,” directors of several US-based think tanks often claim. “Just look at the numbers.”

What kind of numbers are directors of think tanks referring to, and do they help us to better understand how much or how little impact think tanks wield?  Recognizing the importance of convincing donors that public visibility or exposure should be equated with policy influence, think tanks go to great lengths, and often to great expense (by hiring media consulting firms), to monitor how often they are cited in newspapers, on television, and on the internet.  Many organizations also keep a close watch on how many publications are downloaded from their website, as well as the frequency with which their experts are called upon to testify before legislative committees.

While these and other indicators of public exposure might be useful in highlighting how active certain think tanks are in attempting to shape the parameters around important policy debates, they tell us little about the actual impact of think tanks in influencing policy decisions.  After all, policy outputs (such as publications and testimony) are very different from policy outcomes (the decisions made by policy makers). Yet, rather than asking directors of think tanks to explain, in concrete terms, how and to what extent their organizations contributed to particular policy outcomes, those of us who monitor the activities of think tanks have in some ways helped them to foster the illusion of policy influence. This needs to change.

Several scholars and journalists familiar with the complex world of think tanks participate in the annual Global Go To Think Tank Index Report, an initiative of the Think Tanks and Civil Societies Program at the University of Pennsylvania. Released since 2006, the report both tracks the number of think tanks worldwide and ranks the top think tanks (in various categories) according to over a dozen criteria.  Although an ambitious undertaking, the report’s rankings are widely seen as arbitrary and impressionistic. Not only is the number of think tanks reported worldwide inflated (indeed, many of the organizations included in the study are not think tanks), but the manner in which the rankings are conducted needs to be far more transparent.  Although scholars can debate the strengths and limitations of the survey, what is more important is the fact that the top-ranked think tanks (usually those that are the largest and best funded) use the rankings to mobilize more support for their work. If think tanks do indeed matter, then the question of how numbers are used to validate their activities needs to be explored further.

Donald E. Abelson teaches at the University of Western Ontario in Canada. His work focuses primarily on the role of think tanks and their efforts to influence public opinion and public policy.

How We Rate Think Tanks’ Financial Transparency

Transparify rates the extent to which think tanks disclose who funds them, with how much, to do what work. While looking into think tank finances is not a new idea – a variety of other players have done so in the past and some continue to do so today – our initiative differs in two regards.

First, we exclusively look at what information think tanks disclose publicly through their websites. We do not contact institutions asking them to provide us with lists of their funders or similar funding data because we believe that transparency involves being transparent towards all interested stakeholders by default, rather than constituting a favour that an institution may choose to bestow upon request. In our view, if information is not accessible, it is not public. For example, a journalist may not have time to wait for a think tank’s clarification of who funded its research on a given issue before her deadline expires and her piece goes to press.

Second, our scope is global. We rate think tanks across dozens of countries, applying the same rating criteria to all. A policy research institution based in a small and poor nation may not enjoy a high profile on the international stage, but within its home country, its findings and recommendations are more likely to remain unchallenged by other researchers and thus may have significant impact on policy formulation. Plus, our findings to date indicate that many well-funded think tanks in rich countries can learn a lot about transparency from some of their smaller peers in Africa, Asia and Latin America. We want to recognize excellence wherever it occurs.

So how does it work? Each think tank is assessed by two or more raters following a standard protocol. Working independently from each other, they award between zero and five stars according to the type and extent of financial information available on the think tank’s website.

Think tanks that score the maximum possible five stars enable stakeholders to see clearly and in detail who funds them, how much each donor contributed, and what projects or activities that money went towards (some great examples are listed here). Think tanks that do not provide any up-to-date information on where they get their money from receive zero stars. Most institutions we have looked at so far fall in between these two extremes.

We pre-tested this methodology in late 2013 and found that the results returned by different raters tend to be highly consistent. In the few cases in which raters do assign different scores, an experienced external adjudicator reviews their findings, revisits the institution’s website, and determines the final score.
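For readers who like to see the protocol spelled out step by step, here is a minimal sketch of how independent scores could be combined under the rules described above. The function names and data structures are illustrative assumptions made for this example, not our actual rating tool.

```python
# Minimal sketch of the two-rater protocol described above.
# Names and structures are illustrative assumptions, not Transparify's actual tooling.

def combine_ratings(scores, adjudicate):
    """Return a final 0-5 star score from independent rater scores.

    scores     -- list of integer star ratings (0-5), one per rater
    adjudicate -- callback used only when raters disagree; it revisits the
                  think tank's website and returns the final score
    """
    if not scores or any(s not in range(6) for s in scores):
        raise ValueError("each rater must award between zero and five stars")
    if len(set(scores)) == 1:       # raters agree: their common score stands
        return scores[0]
    return adjudicate(scores)       # raters disagree: external adjudicator decides


# Example usage: agreement passes through; disagreement goes to adjudication.
print(combine_ratings([4, 4], adjudicate=max))        # -> 4
print(combine_ratings([2, 3], adjudicate=lambda s: 3))  # -> 3 (adjudicator's call)
```

The key design point, as in our pre-test, is that raters never see each other's scores before submitting them; the adjudicator only enters the picture when their independent judgments diverge.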

Best of all, anyone can visit the website of any think tank we rate and compare the information provided there using our rating tool and criteria – so our findings will be easy to check up on.

By the way, in case you were wondering – we plan to release our final rating results before the end of this month.