Sunday, August 30, 2009

The 2009 Lodewijk Meijer Top-1000 of Universities in Developing Countries; Part 1: Our Methodology

Education is essential for development, and development is important for good education. For many years education specialists and governments in developed nations have concentrated on primary and secondary education when providing aid and support to developing nations.

However, a growing number of recent studies indicates that it might be more than worthwhile to pay more attention to educational support at the tertiary level as well.
We have just finalized our calculations for the Top-1000 list of universities from developing countries. A total of more than 1,100 universities from about 135 countries qualified.
We looked at 30 variables from various providers. Our basic philosophy was that all these providers had created indicators that contained value. However, each of them either derived a global, regional, or country ranking, and/or focused on just one specific field of research (there were, for instance, quite a few ratings available for business schools), and/or had a prime focus on developed nations. Our goal was to create general/overall and partial rankings specifically for universities from developing countries. We felt that the focus on 'global' and 'developed' had led to a situation in which a) the average level of the top schools in developing nations was underestimated; and b) it was quite hard for potential students and other interested parties to get a good overview. We believe that our list provides a solution.
We incorporate analysis by the following providers:
  • The French rating list for global business schools, EDUNIVERSAL
    • Two variables with a combined weight of 6.25 percent
  • A second set of business school rankings, consisting of four different lists that are equally weighted (2.5 percent each), for a total of 10 percent.
The combined 16.25 percent for these business school related lists might seem high, but business education was clearly further along than other areas with respect to internationalization and comparability.
The other four providers that we incorporated generated broad and/or field-specific rating lists. We incorporate both the detailed and the broad rankings from these providers. They are:
  • Top Universities (TU), a large advisory firm from the UK, with separate rankings for the top 20-40 schools per country overall and for separate fields of research, such as Arts & Humanities, Life Sciences & Bio Medicine, Natural Sciences, Social Sciences and Technology (Engineering & Computer Science). We incorporate both the broad rating and the field-related ratings in our overall rating scheme. TU builds its rating from 6 different variables: 1) Academic Peer Review (40%); 2) Employer Review (10%); 3) Faculty-Student Ratio (20%); 4) Citations per Faculty (20%); 5) International Faculty (5%); and 6) International Students (5%). The TU broad rating gets a 5% weight. The individual field ratings get a 2.5% weight each, with the exception of Arts & Humanities and Social Sciences, which each get a 1.25% weight. We decided to give Arts & Humanities and Social Sciences half the weight of the other fields, because the other providers with field-specific information (see below) decided either to combine these two fields or - in the case of Shanghai Jiao Tong - to leave out Arts & Humanities altogether. The overall combined weight for provider TU is therefore 15%.

  • The so-called ARWU list, prepared by Shanghai Jiao Tong University (itself a school in our top-1000!). The ARWU list is one of the better-known global rating lists of universities. It is an impressive piece of work, in which the Chinese rank schools according to 6 variables: 1) Nobel Prizes or Fields Medals won by Alumni (10%); 2) the same, but won by Current Staff (15%); 3) Citations per Faculty (25%); 4) Publications (25%); 5) Top Articles published in journals that are considered top-quintile in their field (25%); and 6) for Engineering only, Funding / Research Budget (25%). As you can see, the total weight adds up to 125% for Engineering; in that case the total is re-weighted to 100%. In our overall rating we incorporated both the ARWU broad and the ARWU field-specific rankings. There are field-specific ratings for Natural Science/Physics, Engineering & IT, Life Sciences, Clinical Medicine and Social Sciences. They are all part of our overall rating system. The ARWU variables get a combined weight of 17.5%. The ARWU broad index gets a 5% weight, while the field-specific ARWU rankings get a 2.5% weight each.
  • Spanish provider Webometrics went a totally different way. The Spanish institution generated a rating that is fully automated. Using search engines like Google, Yahoo, Live Search and Exalead, it extracted information about more than 10,000 universities all over the world. This rating was quite important for us in that it was probably the only one that did not really discriminate against universities from developing countries. The Webometrics rating does not provide a sub-categorization into specific fields, but because it was the only one that was readily available for almost all schools in our sample, we decided to incorporate both its overall scoring variable and its components. They are: 1) Size (weight 20%), defined by Webometrics as the number of pages online about the university (written either by the university itself or by others); 2) Visibility (50%), measured as the total number of unique external links received by the university's home page; 3) Rich Files (15%), a count of the number of online PowerPoint, Word, Adobe PDF and PostScript (PS) files in which a university was mentioned; and 4) Scholar (15%), which was used to derive the number of papers and citations per domain. When analyzing the Webometrics rating we found a certain flaw: it seemed to discriminate against specific languages. We decided not to remove the rating, but re-weighted the variables and adjusted the overall weight. We assigned an overall weight of 11.25% to the rating. The WEBO-OVERALL and WEBO-SCHOLAR ratings got a weight of 3.75% each, and the WEBO-SIZE, WEBO-VISIBILITY and WEBO-RICH FORMAT variables got a weight of 1.25% each.
  • Last but not least, we assigned a 40% total weight to the rating developed by the Taiwanese accreditation commission. We felt that it was the rating with the most rigorous approach, with a focus on both short-term developments and longer-term (last 12 years) trends. It rates the following variables: i) Research Productivity; ii) Research Impact; and iii) Research Excellence. Research productivity is analyzed by looking at the number of papers written during the last 12 years and the last year respectively; both get a 10 percent weight. Research impact is measured by looking at the number of citations during the last 12 years and the last year respectively, together with the average number of citations per article written during the last 12 years; all three sub-variables get a 10 percent weight each. Research excellence, finally, is measured by the value of the so-called H-index over the last two years (20%), the number of highly cited papers over the last 12 years (15%), and the number of articles in so-called high-impact journals in the current year (15%). The H-index is calculated as follows: a school gets an H-index of h if at least h of its N papers have each received at least h citations in high-level, prestigious journals. It is clear that the Taiwanese rating ensures that top universities are excellent research institutions in every respect. We assign a weight of 10% to the broad Taiwanese index and 5% to each of its 6 field-specific indices.
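The ARWU re-weighting step described above (scaling Engineering's 125% total back to 100%) can be sketched as follows. The variable names and the dictionary layout are illustrative assumptions, not part of the ARWU methodology itself:

```python
# Hypothetical dictionary of ARWU variable weights for Engineering.
# The extra Funding variable pushes the total to 125 percent, so all
# weights are rescaled proportionally back to 100 percent.
arwu_engineering = {
    "alumni_awards": 10,
    "staff_awards": 15,
    "citations_per_faculty": 25,
    "publications": 25,
    "top_articles": 25,
    "funding": 25,  # Engineering only
}

total = sum(arwu_engineering.values())  # 125
rescaled = {name: w * 100.0 / total for name, w in arwu_engineering.items()}

print(sum(rescaled.values()))             # 100.0
print(rescaled["citations_per_faculty"])  # 20.0
```

Proportional rescaling keeps the relative importance of the variables unchanged while restoring a 100% total.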
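The H-index definition above can be made concrete with a short sketch; the citation counts used here are made-up examples:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    # Sort citation counts from high to low and walk down the ranked list.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A school whose papers were cited 10, 8, 5, 4 and 3 times has an
# H-index of 4: at least 4 papers have at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```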
Using the aforementioned weights, we derived the 2009 Lodewijk Meijer Institute rating list. During the coming weeks we will first present the top of the overall rating list and then move on to the field-specific lists. After that we will analyze the results on a country-by-country and regional basis. We will also present remarkable universities in separate articles.
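The overall weighting scheme amounts to a weighted sum of per-provider scores. The sketch below uses the provider weights stated in this article (16.25% business school lists, 15% TU, 17.5% ARWU, 11.25% Webometrics, 40% Taiwan), which sum to exactly 100%; the normalized 0-100 scores for the example university are hypothetical:

```python
# Provider weights as fractions of the overall rating; they sum to 1.0.
provider_weights = {
    "business_schools": 0.1625,
    "tu": 0.15,
    "arwu": 0.175,
    "webometrics": 0.1125,
    "taiwan": 0.40,
}

def overall_score(scores):
    """Weighted sum of normalized (0-100) per-provider scores."""
    return sum(provider_weights[p] * s for p, s in scores.items())

# Hypothetical normalized scores for one university:
example = {
    "business_schools": 60.0,
    "tu": 70.0,
    "arwu": 55.0,
    "webometrics": 80.0,
    "taiwan": 65.0,
}
print(round(overall_score(example), 3))  # 64.875
```

Note how the 40% Taiwanese weight dominates: a strong research profile moves the overall score far more than any single business school list.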
Tomorrow, in Part 2 of this series: the OVERALL RATING LIST.
