THE GLOBAL “GO-TO THINK TANKS 2009” and Why I do not believe in it!
And… could we trust this ranking?
Setting aside my dissatisfaction with the communication strategy and distribution of the product (which almost reaches spamming dimensions), my main concerns are with the contents of this ranking. From the soundness of the methodology, including the rigor of the selection, to imputed conclusions, to the lack of accuracy in the presented information, this ranking is riddled with mistakes and inconsistencies.
I. The Methodology and the Set-up of the Ranking
(Full disclosure: McGann invited me to be on the panel of experts, an invitation that I declined last summer.)
Let’s be fair: gathering an expert panel is a common method for rankings of all sorts. From corruption indices to think tanks to the selection of the best sportsmen and sportswomen, the experts are those whose word is final.
In this ranking ‘the members of the Expert Panel were asked to nominate regional or global centers of excellence that they felt should be recognized for producing rigorous and relevant research, publications and programs in one or more substantive areas of research’.
Theoretically, this is all fine. But once we scratch the surface of this ranking’s methodology, three immediate problems appear: a definition of think tanks broad enough to allow for some curious entries; the names of the experts are not disclosed (so it is not clear whether the group as a whole is unbiased); and there is no vetting process for the inputs of individual experts. While I am not aware of all the details of the study’s design, many problems become obvious just from reading the report (page 8 of the report depicts the process).
1. Definition of think tanks – McGann operates with a very broad definition of think tanks that encompasses not-for-profit think tanks, some general non-governmental organizations that engage in policy-relevant research, for-profit consultancies, university-based research centers, and governmental and inter-governmental research departments. Such loose boundaries work against the very notion of a think tank and admit organizations that are not think tanks. The Open Society Institute, while engaged in advocacy and supporting research and research organizations on many subjects, is definitely not a think tank. Likewise, I have never seen Transparency International or Amnesty International referred to as think tanks anywhere else. Then there are the World Bank Research Department and the Overseas Development Institute (ODI), which are listed as independent think tanks. While these organizations play an extremely important role in developing their own policy research and supporting that of others, it is disputable whether they can be listed as (independent) think tanks. If such departments and governmental institutes are to be listed, I am afraid one will have to apply the same criteria to a number of countries and list their research departments too (many Western European countries have research centers within their ministries).
2. According to the report, 298 individuals served on the panel of experts. While I am sure that each of these individuals is a reputable expert in her or his respective discipline, the design is flawed in that experts from one region could nominate and select think tanks for other regions. It is highly debatable to what extent roughly 300 people could be acquainted with all the world’s think tanks. For example, I was invited to serve on the panel for European think tanks, but could have selected and voted for think tanks from all regions. Frankly, I am not familiar with most of the rest, particularly think tanks operating in Africa, Asia and South America. And by no means would I be able to assess the quality of those organizations, a notion hard to pinpoint when discussing think tanks.
Next, why is the list of experts not public? Such a list would add to the credibility of the study. Finally, given the low response rate (750 out of 8,500 potential respondents), it would be necessary to have the list of respondents available in order to check the overlap between the ranked think tanks and the respondents. While think tanks explicitly could not ‘vote’ for themselves, schemes for mutual voting among friends, colleagues or networks could easily be designed.
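For scale, the response rate quoted above works out to under nine percent, which can be checked in a line of arithmetic (the figures are the report's, as cited above):

```python
# Figures as quoted from the report: 750 responses out of 8,500 invited.
invited = 8500
responded = 750
rate = responded / invited
print(f"Response rate: {rate:.1%}")  # roughly 8.8%
```

With more than nine out of ten invitees silent, knowing exactly who did respond matters all the more.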
Let me draw a funny, if somewhat awkward, parallel between this ranking and the world of sports. The awards for the best football player of the year, whether from FIFA or from L’Équipe, the eminent French sports publisher, publish the names of the coaches and journalists who vote for the winner and the runners-up. In the NBA, the nomination process for the All-Star game mitigates the challenge of distinguishing between the popularity and the quality of players through a two-tier system: the public votes for the five most popular players of each conference, East and West, and the coaches of all NBA teams select the remaining ten players on each side. And we are not talking about research here, just credible rankings in the real world. Are these biased? Probably yes. Are they fair? Yes.
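The two-tier logic behind the All-Star example can be sketched in a few lines. This is a minimal illustration with made-up vote counts, not a description of the NBA's actual procedure: the public's tally fills the first slots, and the experts' tally fills the rest, so raw popularity alone cannot crowd out expert judgment.

```python
def two_tier_selection(fan_votes, coach_votes, starters=5, reserves=10):
    """Pick `starters` by popular vote, then `reserves` by expert vote.

    fan_votes / coach_votes: dicts mapping candidate name -> vote count.
    Returns the selected names, starters first.
    """
    # Tier 1: the public picks the most popular candidates.
    by_fans = sorted(fan_votes, key=fan_votes.get, reverse=True)
    selected = by_fans[:starters]

    # Tier 2: the experts fill the remaining slots from everyone else,
    # so popularity cannot monopolize the list.
    remaining = [c for c in coach_votes if c not in selected]
    by_coaches = sorted(remaining, key=coach_votes.get, reverse=True)
    return selected + by_coaches[:reserves]
```

The point of the parallel: even a popularity contest becomes credible once a disclosed second tier of named experts corrects for it. The think tank ranking has neither the disclosure nor the correction.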
3. The vetting process: there is very little of it, and maybe none, in the described process. For example, the experts and peers are encouraged to be fair and not to nominate their own centers. But in this interlinked world, who checks the multiple roles and affiliations of these experts and peers? An employee of one institution (the one named in the questionnaire) could be a board member or an external collaborator of another. And who checks for networks of think tanks that could vote as a bloc for a fellow member? The potential for skewing the data is enormous (as with many rankings of this sort), calling for more scrutiny and control on the ‘exit side’. Lamentably, this survey does not stipulate a single control procedure as a final stage, only the compilation of results and their dissemination (at least none is publicly acknowledged).
II. Imputed conclusions
The study offers a number of partial conclusions and suggestions that cannot withstand simple scrutiny, let alone a more rigorous standard.
1. One of the most distorted conclusions concerns the size of the sector, which is blown out of proportion. Since I know the think tanks of Central and Eastern Europe well, let’s look at some of the data there: 54 think tanks in Romania, 45 in Poland, and especially 27 in the Czech Republic strike me as exaggerated numbers (even taking McGann’s broad definition into consideration). By adding organizations that defy the definition of a think tank, the report inflates its overall count. Unfortunately, the ranking does not provide the full list of think tanks; I would gladly go through these country lists and examine the extent to which all the listed organizations are think tanks. If similar criteria were used for the rest of the world, the total of 6,000+ think tanks is most likely grossly exaggerated.
2. Specific assertions. For example, the study argues:
‘Analogous to a “canary in the coal mine”, the indigenous think tank sector can also function as a key indicator for the state of the civil society in that country. If analysts and critics associated with think tanks are allowed to operate freely, so too can the rest of civil society.’
Theoretically, this causality sounds right. Moreover, such a notion, most probably underpinned by the concept of a pluralist society, refers to an open space in which civil society actors have the right and the room to suggest policy alternatives. Theoretical concepts of other ways of influencing policy debate in less open societies (elitist or corporatist models, for example) are not part of the analysis, not even considered, and the latter may well be more at play in most of the world. But this is a side point. The main point is: are the listed think tanks NGOs?
Because many of those listed are not non-governmental organizations, the suggested causality and the stipulated link are very hard to sustain for the set of organizations ranked in the report. I am sure that the World Bank Research Department, allegedly the 13th best think tank in the world, does not perceive itself as an NGO. Likewise, I have never seen the Overseas Development Institute (ODI), UK listed anywhere else as an NGO. Furthermore, in Europe, university-based research centers, party-affiliated think tanks and governmentally funded research institutes are not considered part of civil society. In many countries of Central and Eastern Europe, think tanks have distanced themselves from civil society (a negative development in my opinion, but a reality in the field), which again contradicts the report’s stipulation. This is not to claim that organizations which do not represent civil society should be written off, but counting their work as a contribution by civil society, and as an illustration of the functioning of broader civil society, is an overstatement, or a misleading step to say the least.
3. The Brookings Institution is listed in the Top 10 under ALL thematic categories. While there is no doubt that Brookings is one of the world’s finest think tanks, it is questionable to find it listed under thematic areas that the think tank itself does not define as priorities (check the Brookings website). Its high position in the ‘Environment Think Tank Rankings’ is the best case in point. It seems that the thematic rankings are biased in favor of multi-thematic think tanks over specialized and less visible ones.
4. The categorization becomes even more blurred later in the rankings. For example, what should one understand by ‘Outstanding Policy Oriented – Public Policy Research Program’? Likewise, how was category No. 26, ‘Most Impact on Public Policy or Policy Debates’, measured? The tables and the accompanying description do not hint at the criteria underpinning the selection in each category.
… And there are several other imputed conclusions. But let’s not dwell on every detail.
III. Accuracy of the presented information
Some of my comments above could be listed as inaccuracies. However, wherever an issue could be the subject of a definitional or conceptual debate, I still leave some room for doubt. Regrettably, there are examples of inaccuracy that are not subject to any interpretation, but are rather the result of ignorance or omission:
1. The Institute for Public Policy, Kyrgyzstan appears in the top 30 Central and Eastern European think tanks, although Kyrgyzstan is in Central Asia, not in Central and Eastern Europe.
2. In Table 1 on page 15, Armenia and Azerbaijan appear under Eastern Europe, while in the very next table, on page 16, they are listed under Asia (an inconsistent classification, to say the least).
3. I apologize for being too detailed, but after all, a good study should be accurate: Adam Smith is one of the most famous economists in human history, while the second-best think tank in Western Europe should be listed as the Adam Smith Institute. The Carnegie Endowment for International Peace, Belgium does not operate as a separate organization; it is rather one of the global hubs and advocacy outposts of the CEIP. Ranking it as the 26th Western European think tank is a very questionable decision and reveals even bigger methodological concerns.
Instead of a conclusion
The report stipulates:
‘I would like to point out that the inclusion of an institution in the universe of leading think tanks does not indicate a seal of approval or endorsement for the institution, its publications or programs. Likewise a failure to be nominated does not necessarily indicate a lack of a quality and effectiveness or poor performance.’
Furthermore, it includes the following disclaimer:
‘Despite our best efforts to consult widely and create a rigorous and inclusive process we can not eliminate all bias from the selection of the top think tanks in the world. We fully recognize that personal, regional, ideological, and discipline biases may have been introduced into the nomination and selection process by some of those consulted for this study. We are confident, however, that our efforts to create a detailed set of selection criteria, an open and transparent process, and an increase in the participation from underrepresented regions of the world has served to insulate the nomination and selection process from serious problems of bias and under representation.’
This is a fair stipulation. But then, WHY this RANKING? What does it stand for? And whom does it help? As it now stands, it does not help me as a donor in Central and Eastern Europe at all. I know an increasing number of think tanks in the region that snub it and pay it no attention. However, I worry that many ‘non-think-tankers’ will first learn about think tanks from this study and perceive the sector as it is depicted here, which is in many ways misleading. What is your opinion?