THE GLOBAL “GO-TO THINK TANKS 2009” and Why I do not believe in it!

And… can we trust this ranking?

The Think Tanks and Foreign Policy Program at the University of Pennsylvania and its director, James G. McGann, have done it again! The new ranking for 2009, THE GLOBAL “GO-TO THINK TANKS”, is out. And I suppose anyone in the world with access to the Internet knows about it by now! I received this information eight times: five times directly from the author and his team alone, then from a friend of mine who deals with NGOs (not think tanks), another whose primary interest is independent media and the Internet, and a third forward from a not-for-profit lawyer. And these messages were for the occasion of the report launch only. The same process is repeated a few times a year. I am sure most of those who deal with think tanks in this region have been subject to a similar ‘bombardment’ of information.

Setting aside my dissatisfaction with the communication strategy and distribution of the product (which almost reaches spamming dimensions), my main concerns are with the contents of this ranking. From the soundness of the methodology, including the rigor of the selection, to the imputed conclusions, to the lack of accuracy in the presented information, this ranking is riddled with mistakes and inconsistencies.

I. The Methodology and the set up of the Ranking

(Full disclosure: McGann invited me to be on the panel of experts, an invitation I declined last summer.)

Let’s be fair: assembling an expert panel is a common method for rankings of all sorts. From corruption to think tanks to the selection of the best sportsmen and sportswomen, it is the experts whose word is final.

In this ranking ‘the members of the Expert Panel were asked to nominate regional or global centers of excellence that they felt should be recognized for producing rigorous and relevant research, publications and programs in one or more substantive areas of research’.

Theoretically, this is all fine. But once we scratch the surface of the methodology of this ranking, three immediate problems appear: a broad definition of think tanks that allows for curious entries; the names of the experts are not disclosed (so it is not clear whether the group as a whole is unbiased); and there is no vetting of the inputs by individual experts. While I am not fully aware of all the details of the design of this study, many things become as obvious as they are problematic just from reading the report (page 8 depicts the process).

1. Definition of think tanks – McGann operates with a very broad definition of think tanks that encompasses not-for-profit think tanks, some general non-governmental organizations that engage in policy-relevant research, for-profit consultancies, university-based research centers, and governmental and inter-governmental research departments. Such loose boundaries work against the very notion of a think tank and admit organizations that are not think tanks. The Open Society Institute, while engaged in advocacy and in supporting research and research organizations on many subjects, is definitely not a think tank. Likewise, I have not seen Transparency International or Amnesty International referred to as think tanks anywhere else. Then there are the World Bank Research Department and the Overseas Development Institute (ODI), which are listed as independent think tanks. While these organizations play an extremely important role in conducting their own policy research and supporting that of others, it is disputable whether they can be listed as (independent) think tanks. Should such departments and governmental institutes be listed, I am afraid the same criteria would have to be applied to a number of countries and their research departments listed too (many Western European countries have research centers within their ministries).

2. According to the report, 298 individuals served on the panel of experts. While I am sure that each of these individuals is a reputable expert in her/his respective discipline, the design is flawed in that experts from one region could nominate and select think tanks for other regions. It is highly debatable to what extent 300 people could be acquainted with all the world’s think tanks. For example, I was invited to serve on the panel for European think tanks, but could have voted for think tanks from all regions. Frankly, I am not familiar with most of the rest, specifically the think tanks operating in Africa, Asia and South America. And by no means would I be able to assess the quality of those organizations, a notion hard to pinpoint when discussing think tanks.

Next, why is the list of experts not public? Such a list would add to the credibility of the study. Finally, given the low response rate (750 out of 8,500 potential respondents), it would be necessary to have the list of respondents available in order to check the overlap between the ranked think tanks and the respondents. While think tanks explicitly could not ‘vote’ for themselves, schemes for mutual voting among friends, colleagues or networks could easily be devised.

Let me draw a funny, if not awkward, parallel between this ranking and the world of sports. The awards for best football player of the year, be they by FIFA or by L’Équipe, the eminent French sports magazine, publish the names of the coaches and journalists who vote for the winner and the runners-up. In the NBA, the nomination process for the All-Star game mitigates the challenge of distinguishing between the popularity and the quality of players through a two-tier system: the public votes for the five most popular players in each conference, the East and the West, while the coaches of all NBA teams select the remaining 10 players on each side. And we are not talking about research here, just credible rankings in the real world. Are these biased? Probably yes. Are they fair? Yes.

3. The vetting process: There is very little of this, and maybe even none, in the described process. For example, the experts and peers are encouraged to be fair and not to nominate their own centers. But in this interlinked world, who checks the multiple roles and affiliations these experts and peers have? An employee of one institution (the one in the questionnaire) could be a board member or external collaborator of another. And how does one check for networks of think tanks that could vote en bloc for a fellow member? The potential for skewing the data is considerable (as with many rankings of this sort), calling for more scrutiny and control on the ‘exit side’. Lamentably, this survey does not stipulate a single control procedure as a final stage, only the compilation of results and their dissemination (at least, none is publicly acknowledged).

II. Imputed conclusions

The study brings about a number of partial conclusions and suggestions that cannot withstand simple scrutiny, let alone a more rigorous standard.

1. One of the most distorted conclusions concerns the size of the sector, which is blown out of proportion. Given that I know the think tanks of Central and Eastern Europe well, let’s look at some of the data there: 54 think tanks in Romania, 45 in Poland, and particularly 27 in the Czech Republic strike me as exaggerated numbers (even taking into consideration the broad definition McGann uses). By adding organizations that defy the definition of a think tank, the report inflates its overall count. Unfortunately, the ranking does not provide the full list of think tanks; I would gladly go through these country lists and examine the extent to which all the listed organizations are think tanks. If similar criteria were used for the rest of the world, the total of 6,000+ think tanks is most likely a grossly exaggerated number.

2. Specific assertions. For example, the study argues:

‘Analogous to a “canary in the coal mine”, the indigenous think tank sector can also function as a key indicator for the state of the civil society in that country. If analysts and critics associated with think tanks are allowed to operate freely, so too can the rest of civil society.’

Theoretically, this causality sounds right. Moreover, such a notion, most probably underpinned by the concept of a pluralist society, refers to an open space in which civil society actors have the right and the room to suggest policy alternatives. Theoretical concepts covering other ways of influencing the policy debate in less open societies (elitist or corporatist, for example) are not part of the analysis, not even considered; and the latter might be more at play in most of the world. But this is a side point. The main question is: are the listed think tanks NGOs?

Because many of those listed are not non-governmental organizations, the suggested causality and the stipulated link are very hard to establish for the set of organizations ranked in the report. I am sure that the World Bank Research Department, allegedly the 13th best think tank in the world, does not perceive itself as an NGO. Likewise, I have never seen the Overseas Development Institute (ODI), UK, listed anywhere else as an NGO. Furthermore, in Europe, university-based research centers, party-affiliated think tanks and governmentally funded research institutes are not considered part of civil society. In many countries of Central and Eastern Europe, think tanks have distanced themselves from civil society (a negative development in my opinion, but the reality in the field), something that again goes against the report’s stipulation. This is not to claim that the organizations that do not represent civil society should be written off, but counting their work as a contribution by civil society, and as an illustration of the functioning of broader civil society, is an overstatement, or a misleading step to say the least.

3. The Brookings Institution is listed in the Top 10 under ALL thematic categories. While there is no doubt that Brookings is one of the world’s finest think tanks, it is questionable to find it listed under thematic areas that are not defined as priorities by the think tank itself (check the Brookings website). Its high ranking in the ‘Environment Think Tank Rankings’ is the best case in point. It seems that the rankings under thematic priorities are biased in favor of multi-thematic think tanks over specialized and less visible ones.

4. The categorization becomes even more blurred later in the rankings. For example, what should one understand by ‘Outstanding Policy Oriented – Public Policy Research Program’? Likewise, how was category No. 26, ‘Most Impact on Public Policy or Policy Debates’, measured? The tables and the accompanying description do not hint at the criteria underpinning the selection in each category.

…. And there are several other imputed conclusions, but let’s not dwell on every detail.

III. Accuracy of the presented information

Some of my comments above could themselves be listed as inaccuracies. However, wherever an issue could be subject to a definitional or conceptual debate, I still leave some room for doubt. Regrettably, there are some examples of inaccuracy which are not open to interpretation but are rather the result of ignorance or omission:

1. The Institute for Public Policy – Kyrgyzstan appears among the top 30 Central and Eastern European think tanks, although Kyrgyzstan is in Central Asia.

2. In Table 1 on page 15, Armenia and Azerbaijan appear under Eastern Europe, while in the very next table, on page 16, they are listed under Asia (a weird classification, to say the least).

3. I apologize for being too detailed, but after all a good study should be accurate. Adam Smith is one of the most famous economists in human history; the second best think tank in Western Europe should presumably be the Adam Smith Institute. The Carnegie Endowment for International Peace, Belgium does not operate as a separate organization; it is rather one of the global hubs / advocacy outposts of the CEIP. Ranking it as the 26th Western European think tank is a very questionable decision, and it reveals even bigger methodological concerns.

Instead of a conclusion

The report stipulates:

‘I would like to point out that the inclusion of an institution in the universe of leading think tanks does not indicate a seal of approval or endorsement for the institution, its publications or programs. Likewise a failure to be nominated does not necessarily indicate a lack of a quality and effectiveness or poor performance.’

Furthermore it has the following disclaimer:

‘Despite our best efforts to consult widely and create a rigorous and inclusive process we can not eliminate all bias from the selection of the top think tanks in the world. We fully recognize that personal, regional, ideological, and discipline biases may have been introduced into the nomination and selection process by some of those consulted for this study. We are confident, however, that our efforts to create a detailed set of selection criteria, an open and transparent process, and an increase in the participation from underrepresented regions of the world has served to insulate the nomination and selection process from serious problems of bias and under representation.’

This is a fair stipulation. But then, WHY this RANKING? What does it stand for? And whom does it help? As it now stands, it does not help me as a donor in Central and Eastern Europe at all. I know an increasing number of think tanks in the region that snub it and pay it no attention. However, I am worried that many ‘non think tankers’ learn about think tanks from this study and perceive the sector as depicted here, which in many ways is misleading. What is your opinion?




  1. says

    Goran, in addition to asking experts to assess think-tanks from regions they do not know, Jim McGann also asks them to rank them all on thematic areas they do not know – which is why, as you point out, the top think-tanks such as Brookings get high marks for all subject areas. This thoroughly discredits the whole ranking.

    A waiter in a burger restaurant is no more qualified than an interior designer to be an expert on the optimal cooking time and temperature for preparing coquilles St Jacques.

    As you know, I also have repeatedly turned down the offer to be an expert on McGann’s rankings, as Director of a network of think-tanks (disclosure: funded by OSI), because I cannot possibly vote objectively between members of a network of 40 think-tanks, let alone between them and other think-tanks in the region (Central and Eastern Europe and Central Asia – which is Asia in the case of Kyrgyzstan, as you also noted).
    McGann failed to see why I would not want to join other think-tank network representatives in exercising my unbiased judgement …

    And then, how could I vote on the best environmental policy institutes in Latin America? or the best policy outputs in Arabic?

    Back to what I know …. there are think-tanks in the top 30 in McGann’s ranking for Central and Eastern Europe who have produced only one or two publications (hardly top-notch ones at that) in the past three years, so what were the experts assessing? Their own contacts books? (yes, in some cases).

    Time to bury the rankings – and seek out individual quality initiatives and policy publications. Let Jim embrace the world (even if he and his team are not quite sure where the countries are), and we can savour the cherries on a smaller cake.

  2. says

    Thanks Jeff. These are additional valuable comments. I agree the minimum we need to do regarding this ranking is alert the general public about its shortcomings.

  3. Rico says


    Well done; it’s about time this whole survey was debunked. I might title your piece something like: “The University of Pennsylvania’s Global Think Tank Survey Ruse”. Doing so would de-personalize it slightly and perhaps create some internal discussion in Pennsylvania about the merits of continuing to disseminate this hogwash. I would suggest having it copyedited and then sent to McGann for a response. If you do, can you please remind him that spam is generally not appreciated other than on the Monty Python set…


  4. says

    Good things first: I didn’t know about this report until an hour ago. Given that I read all sorts of media widely and intensely and am involved in a small think tank myself, it may be an encouraging sign that—no matter how much noise the authors try to create—this dubious ‘study’ hasn’t reached all that far yet.

    Beyond that initial hope, and beyond all your very valid methodological concerns: what exactly could any ranking of think tanks—even if terminologically correct and methodologically consistent—ever achieve?

    I have no extensive consolidated global knowledge on think tanks, but I know a few – every single one of them has a different set of aims and objectives and operates in different domains of policy intervention…

    I just don’t see how any kind of ranking could be useful. In other words: to establish a ranking of think tanks, your methodology must be flawed. If it weren’t, you wouldn’t even be able to start, would you?

    As a sidenote: I just realised that Wikipedia is as fuzzy on think tanks as McGann et al.

  5. says

    Andreas, thanks for the comment on my blog. Your comment goes straight to the core of the matter. Why start a ranking of a type of organization that is hard to define in the first place? Moreover, these organizations’ impact and work products are hard to measure, let alone compare.

    So my objection starts from the very outset of the ranking. I know one might say the same of democracy, corruption or human development, and still the world is not short of rankings (by the way, having served on some of the democracy panels Freedom House organizes, I can assure you they face big challenges in comparing apples and oranges; still, the term democracy is more precisely defined, as are its tenets). With think tanks, however, we eventually come down to brand name and reputation as the main determinants of a good ranking for a given think tank. Not to bore you with too many details – I have been studying the subject for the last four years – let me illustrate with an important distinction. There are reputable think tanks that specialize as conveners of high-ranking officials and experts (with no in-house analysis). On the other side of the spectrum there are think tanks that focus on producing original research and policy analysis and on putting forth new alternatives (with little or no convening function). Third, there are centers that employ their analysis towards a particular advocacy goal (legitimate goals most of the time, e.g. the accession of the Western Balkan countries into the EU). How can you lump them all together in one ranking? Simply by measuring their reputation. However, the University of Pennsylvania ranking claims to measure more, and thereby runs into a wide array of flaws and inconsistencies.

  6. Bobi says

    Well, all right; but while you criticize the definition of think tanks, what is yours? I went through the whole blog and could not find it.

