Capability Deficit in Leadership of HEIs

Being a Vice-Chancellor or the Head of an Institution of Higher Education is not the bed of roses that many people, including many aspirants for such jobs, imagine. These positions are extremely difficult, and few bright people want them. Unfortunately, the system we have set up in higher education recruits for such positions from a pool of candidates who have neither been trained for academic leadership nor been given any incentive to develop the skills it requires. With the rise of alternative education options, crises in financial outlays and the devaluation of formal college degrees, HEIs face challenging decades ahead, and there is a greater need than ever to hire the right leaders with the right experience and the right skill sets.

The media has repeatedly flagged the leadership crisis in HEIs for public attention, though it has long been known to people in academia and in government. News18 ran a story in 2012, “Unfit dozens in the Vice Chancellor, Pro Vice Chancellor race” (https://www.news18.com/news/india/unfit-dozens-in-the-vice-chancellor-pro-vice-chancellor-race-515870.html). The Hindu ran a story on JNU V-C Jagadesh Kumar in October 2017, titled “‘Public inquiry’ by JNUTA finds V-C unfit for position” (https://www.thehindu.com/news/cities/Delhi/public-inquiry-by-jnuta-finds-v-c-unfit-for-position/article19935324.ece). The Times of India also reported it prominently. It cannot be a mere accident that Prof. Jagadesh Kumar now heads the UGC. In 2019, Times News Network published a research finding that 75% of Vice-Chancellors in the country were unfit for the jobs they held.

In academic institutions, faculty usually begin their careers as entry-level assistant professors after their Ph.D. They are appointed on the strength of their peer-reviewed publications and teaching skills, rarely for their leadership and administrative skills. A few years later, the assistant professor applies for promotion, presenting a docket of more than 100 pages of documentation consisting almost entirely of research publications, teaching evaluations, letters of recommendation, and grants and awards received. Particularly in top institutions, most of the weight is placed on research publications, then teaching, then service; once again, leadership and administrative experience are rarely given serious weight in promotion decisions. Without strong research publications, faculty cannot be promoted, regardless of their excellence in teaching and leadership. Some faculty do stay on purely as research and teaching faculty, but upward career mobility is usually possible only after one has reached the rank of full Professor.

Faculty positions such as Professor of Psychology require people who love analysing data, investigating phenomena, and communicating results in writing or in the classroom. Educational administrator positions such as Dean, Provost or Vice-Chancellor, on the other hand, require people who love solving problems, making difficult decisions, managing teams and projects, and evaluating and taking risks. Yet it is very rare for a college or university to hire a principal or a Vice-Chancellor who has not been a lifelong academic.

Academics sometimes have a bit of an unfortunate reputation of being big picture thinkers, with their heads in the clouds (or ivory tower) and disconnected from the realities of everyday life. They start a research project, and then get excited by another new idea several days later, only to end up after several months with a dozen great ideas yet none close to being completed.

Faculty do not learn how to make decisions as Assistant Professors, when their main concern is to complete a research project and get it published in a top journal that only a handful of other academics in their field will read. Research publications take months, if not years, to go through peer review and editing. Decisions in higher education leadership, especially in the face of crises such as a pandemic, must be made within days, if not hours. The work context is completely different as well, even though both jobs are in academia.

One reason why leadership in HEIs has been losing credibility is that so many academic leaders are poor at making long-run decisions for the health of their institutions. The most obvious example is their failure to protect the integrity of the curriculum in the face of faculty desires to teach whatever the faculty finds interesting. Higher education is quickly losing its value proposition, becoming out-of-date, inefficient, and losing credibility in the workplace, thanks to mindless tactical tinkering with the curriculum and its processes. We may have been so focused on hiring high-quality researchers and teachers that we forgot they also need to be high-quality leaders and administrators.

So what is the solution?

First and foremost, early career faculty, regardless of their core field of study, must receive training on leadership, team development, risk management and related skills required for higher education administration.

Second, tenure and promotion criteria need to change so that faculty are rewarded for pursuing such training. Unless one wants to remain a research or teaching professor for the rest of one's career, tenure and promotion should be granted only to faculty who can also lead and administer.

Third and finally, academia should consider outside leaders and business people who have the necessary skill sets to lead large, complex organizations. There is a whole community of people who earned a PhD but decided against a traditional research and teaching career. They may well prove qualified and exceptional in academic leadership positions.

*****

See Behind the Curtain of QS World University Rankings 2022

I will begin on a lighter note because what follows is serious and may be tough, harsh and unsavoury for quite a few learned people.

There is a joke about a man asking his son about his school results, narrated in nearly all parts of the country. Rendered in the local dialect with local nuances and cultural flavour, the outcome is always hilarious. The joke goes something like this –

Man (to his son Ramu) – tell me, did you pass this time, or have you failed the exams once again?

Ramu (replying to his father) – I have stood fourth in the class

Man – very good, Ramu, but did you pass?

Ramu – Gopal (Head master’s son) has stood sixth in the class, I have done better than Gopal

Man – Poor Gopal, he remained behind you, but did he pass or not?

Ramu – only Dheeru and Golu passed, they stood first and second. Don’t get angry with me, I am better than 36 in my class. Only 3 are better than me.

Man (in angry and abusive tone) – Idiot, you failed again

Clearly, the class had only a 5% pass rate (2 out of 40).

Let us now look at the QS World University Rankings 2022. India has celebrated that three of our institutions – IIT Bombay (shared rank 177), IIT Delhi (rank 185) and IISc Bangalore (shared rank 186) – remain among the top 200 ranked universities of the world in 2022. The Prime Minister (https://twitter.com/narendramodi/status/1402628065474203650) and the Education Minister (https://twitter.com/DrRPNishank/status/1402559433259962371) congratulated these institutions, and rightly so; rankings do give us a sense of achievement. We need to be careful, however, that our euphoria (https://timesofindia.indiatimes.com/india/india-emerging-a-vishvaguru-says-ramesh-pokhriyal-after-3-indian-institutes-figure-in-top-200-qs-world-university-rankings/articleshow/83373333.cms) is not like that of a Ramu or a Gopal.

QS World University Rankings 2022 feature 1,300 universities from around the world. There are 35 Indian Universities in this list of 1300. (https://www.topuniversities.com/university-rankings/world-university-rankings/2022 )

Universities were evaluated according to a weighted average of the six metrics – Academic Reputation (40%), Employer Reputation (10%), Faculty/Student Ratio (20%), Citations per faculty (20%), International Faculty Ratio (5%), and International Student Ratio (5%).

The metrics are reported as measurements on a continuous interval scale (0–100), which are then aggregated into an overall score as a weighted average. The overall score is therefore also on a continuous interval scale (0–100).

The overall scores were then ordered from high to low and discrete ranks awarded as 1, 2, 3, 4 and so on. Universities tied at the same overall score share the same rank, and the subsequent rank(s) are skipped to account for the ties. In this ranking, three institutions from India figure among the top 200.
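The tie-handling just described is standard competition ranking. A minimal sketch in Python (the function name and sample scores are illustrative, not QS's actual code):

```python
def competition_ranks(scores):
    """Standard competition ranking: tied scores share a rank,
    and the rank(s) immediately after a tie are skipped."""
    ordered = sorted(scores, reverse=True)
    first_position = {}
    for position, score in enumerate(ordered, start=1):
        # Record the first (best) position at which each score appears.
        first_position.setdefault(score, position)
    return [first_position[s] for s in scores]

# Two universities tied at 98 both take rank 2; rank 3 is skipped.
print(competition_ranks([100.0, 98.0, 98.0, 95.0]))  # [1, 2, 2, 4]
```

This is why shared ranks such as 177 and 186 appear in the QS list without every intermediate rank being used.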

Let us try to see beneath the veil of these ranks.

  • MIT, which ranks first, has an overall score of 100 (rounded up), composed of Academic Reputation (40% of 100), Employer Reputation (10% of 100), Faculty/Student Ratio (20% of 100), Citations per Faculty (20% of 100), International Faculty Ratio (5% of 100), and International Student Ratio (5% of 91.4).
  • The overall scores are thus some kind of ratings for the Universities. Interestingly, as we go down the ranking list, the overall score drops very fast – Carnegie Mellon University, Pittsburgh United States scores less than 75% but ranks at 53; Hanyang University, Seoul South Korea scores less than 50% but ranks at 156; Maastricht University, Maastricht Netherlands scores less than 50% but ranks at 156; and University of Missouri, Columbia United States scores less than 25% but ranks at 476.
  • Overall scores for universities ranked 501 or lower are not reported (they scored 24 or less out of 100).

Let us return to the performance of the institutions from India. There are 35 institutions from India in the list of 1,300 ranked institutions: 3 are in the top 200, 5 more are in the 201–500 group, another 14 are in the next 500 ranks, while the remaining 13 are in the last 300 ranks. The top 3 institutions from India are rated and ranked as under:

  • IIT Bombay (Academic Reputation: 51.3, Employer Reputation: 79.6, Faculty/Student Ratio: 32.5, Citations per Faculty: 55.5, International Faculty Ratio: 1.5, International Student Ratio: 1.6; Overall score: 46.4; rank 177)
  • IIT Delhi (Academic Reputation: 45.8, Employer Reputation: 70.8, Faculty/Student Ratio: 30.9, Citations per Faculty: 70.0, International Faculty Ratio: 1.2, International Student Ratio: 1.7; Overall score: 45.9; rank 185)
  • IISc Bangalore (Academic Reputation: 34.2, Employer Reputation: 19.2, Faculty/Student Ratio: 48.8, Citations per Faculty: 100.0, International Faculty Ratio: 1.2, International Student Ratio: 1.8; Overall score: 45.7; rank 186)
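Using the published weights, the overall score can be recomputed from the metric scores. A quick check in Python with IIT Bombay's figures (the dictionary names are illustrative; the raw weighted average comes out slightly below QS's published 46.4, presumably because of QS's internal rounding or normalization):

```python
# QS metric weights: Academic Reputation 40%, Employer Reputation 10%,
# Faculty/Student Ratio 20%, Citations per Faculty 20%,
# International Faculty Ratio 5%, International Student Ratio 5%.
weights = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "intl_faculty_ratio": 0.05,
    "intl_student_ratio": 0.05,
}

# IIT Bombay's metric scores as listed above.
iit_bombay = {
    "academic_reputation": 51.3,
    "employer_reputation": 79.6,
    "faculty_student_ratio": 32.5,
    "citations_per_faculty": 55.5,
    "intl_faculty_ratio": 1.5,
    "intl_student_ratio": 1.6,
}

overall = sum(weights[k] * iit_bombay[k] for k in weights)
print(round(overall, 1))  # 46.2, against a published overall of 46.4
```

Note how the two 5% international metrics, on which Indian institutions score in the low single digits, contribute almost nothing to the total, while the 40% Academic Reputation metric dominates it.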

The next 5 ranked institutions are:

  • IIT Madras (Overall score – 38.1, rank 255),
  • IIT Kanpur (Overall score – 36.4, rank 277),
  • IIT Kharagpur (Overall score – 36.3, rank 280),
  • IIT Guwahati (Overall score – 28.3, rank 395) and
  • IIT Roorkee (Overall score – 28.0, rank 400).

Here is what the rating data displays:

  • Only public institutions of technology and science find a place in the top-500 club. These are deemed universities, but not universities in the real sense of the term. A university is multi-disciplinary, spanning the humanities, sciences, commerce and social sciences, rather than being confined to a narrow focus on technology.
  • There is no real Indian University in the top-500 ranks. South Africa has 4 real universities in the top-500 club.
  • As against 8 institutions from India in the top-500 club, Europe has 212 institutions, United States has 87 institutions while Rest of Asia has 117 institutions (includes 26 from mainland China, 16 from Japan).
  • These 8 institutions do not account for even 1% of the total university enrolment in India.
  • The best of the best in India scores only 46%, compared to the world's best score of 100%.
  • There are large variances in the scores for Academic Reputation, Employer Reputation, Faculty/Student Ratio and Citations per faculty within the top 3 whose ranks are spread over only 9 ranks.
  • Employer reputation seems to exceed Academic Reputation for the high ranked institutions in India. IISc turns out to be an exception in reputation as well as in its Citation score.

Makeup is a beauty aid that helps build the self-esteem and confidence of an individual. Like the NIRF Rankings (https://www.researchgate.net/publication/350354434_NIRF%27s_India_Rankings_Are_Ludicrous), the QS World University Rankings 2022 are makeup for educational institutions. This makeup conceals the ugly pockmarks on the face of universities in India. It is unfortunate that the Education Minister has used this makeup to beat the harsh lights and camera flashes that would expose the rot in the education system.

By calling these rankings a testimony that India has taken a “leap in the field of Education & Research and is emerging as a VISHVAGURU,” the Education Minister is only proving his lack of understanding. Surely he remembers – “Parde Mein Rehne Do Parda Na Uthao, Parda Jo Uth Gaya To Bhedh Khul Jayega, Allah Meri Tauba – Allah Meri Tauba” (परदे में रहने दो पर्दा न उठाओ, पर्दा जो उठ गया तो भेद खुल जायेगा, अल्लाह मेरी तौबा – अल्लाह मेरी तौबा): let it stay behind the curtain, do not lift the curtain; if the curtain is lifted, the secret will be out. Allah, I repent – Allah, I repent.

**

First published 12 June 2021

***

“Likes” “Follows” “Shares” and “Comments” are welcome.

We hope to see energetic, constructive and thought provoking conversations. To ensure the quality of the discussion, we may edit the comments for clarity, length, and relevance. Kindly do not force us to delete your comments by making them overly promotional, mean-spirited, or off-topic.

***

NIRF Rankings Are Ludicrous

On November 30, 2020, the National Institutional Ranking Framework (NIRF) invited applications for India Rankings 2021, the sixth edition of this annual exercise. NIRF was launched in 2015 to rank higher educational institutions in the country. NIRF loudly proclaims that its purpose is “promoting competitive excellence in the higher educational institutions” and that its process is “based on objective criteria,” and it is approved, endorsed and supported by the Ministry of Education of the Government of India.

Rankings of educational institutions are accorded great significance by institutional staff and leadership teams, and when awarded by the Government itself, the outcomes have significant material consequences. Year after year, NIRF has published its annual rankings, inciting excitement across academic social media. There is nothing wrong in the celebratory and congratulatory banter that follows; what is unsettling is that academic scholars take a ranking as confirmation, or evidence, of how well, or badly, they are doing compared to everyone else.

Let me make it clear from the start that my intention here is not to criticise rankings as such. This is not a story about flawed methodologies or their adverse effects, about how some rankings, other than the NIRF, are produced for profit, or about how opaque or poorly governed they are. The intent is to draw attention to a highly problematic assumption: that there is, or could be, a meaningful relationship between a ranking, on the one hand, and what an educational institution is and does in comparison to others, on the other.

To avoid any embarrassment to the Indian ranking systems, let us take examples from three of the most popular rankings – the Academic Ranking of World Universities 2020 (Shanghai Ranking), The Times Higher Education World University Rankings 2021 and the QS World University Rankings 2021. Furthermore, to avoid any embarrassment to Indian educational institutions, the cases taken are of educational institutions from our neighbouring country.

Quaid-e-Azam University, Islamabad does not have a mechanical engineering department; in fact, it does not offer engineering of any kind. Yet the department of mechanical engineering at Quaid-e-Azam University was rated 76-100 in 2017. (http://www.shanghairanking.com/Shanghairanking-Subject-Rankings-2017/mechanical-engineering.html).

This placed it just below Tokyo University and just above Manchester University. Wow! Thereafter every year QAU improved its score and in 2020 it jumped into the 51-75 range putting it under McGill University but higher than Oxford University. (http://www.shanghairanking.com/Shanghairanking-Subject-Rankings/mechanical-engineering.html).

The Times Higher Education World University Rankings 2021 declared the Abdul Wali Khan University in Mardan as Pakistan’s top university (https://www.timeshighereducation.com/world-university-rankings/2021/world-ranking#!/page/0/length/25/locations/PK/sort_by/rank/sort_order/asc/cols/stats).

The Times Higher Education World University Rankings 2020 did not even list The Abdul Wali Khan University Mardan. (https://www.timeshighereducation.com/world-university-rankings/2020/world-ranking#!/page/0/length/25/locations/PK/sort_by/rank/sort_order/asc/cols/stats).

The QS World University Rankings 2021, which were released soon after the release of The Times Higher Education World University Rankings 2021, put National University of Sciences And Technology (NUST) Islamabad at Pakistan’s number one and The Abdul Wali Khan University Mardan was not even on the list. (https://www.topuniversities.com/university-rankings/world-university-rankings/2021).    

There is nothing exceptional about these examples; they are simply striking illustrations of how arbitrary rankings are.

Most ranking organisations, including the NIRF, never send assessors to the thousands of educational institutions they rank. Instead, they simply design forms for the officials of the institutions to fill and submit. The ranking criteria are periodically adjusted (for whose benefit?). Everyone (except the student) has something in the rankings for them.

Across the world, ranking organisations have been exposed as inconsistent, changing metrics from year to year, and omitting critical pieces of information. Smart academics and administrators have also learned to game the system. This speeds up their promotions and brings in recognitions and rewards.

Rankings are artificial zero-sum games. Artificial because they force a strict hierarchy upon educational institutions; artificial also because it is unrealistic that an institution can improve its reputation for performance only at the expense of the reputations of other institutions. The most ludicrous aspect of it all is the seemingly rational belief that when an institution goes “up,” it must be because it has actually improved, and when it goes “down,” it is being punished for underperforming. Such linear-causal reasoning is absurd.

One of the hallmarks of any ranking is the number of research publications and citations.

Hundreds of Indian scientists and academics (the precise number is 1,494) were chosen from nearly 160 thousand (1,59,683, to be precise) scientists in universities across the world, ranked by their number of research publications and how often they were cited (https://www.nationalheraldindia.com/science-and-tech/1494-indians-among-top-2-scientists-in-world-stanford). Stanford University reportedly declared these hundreds of Indian luminaries to be in the world's top two per cent of scientists.

THAT IS A TOTAL LIE! Stanford University has not sanctioned any such report. This doctored news wrongly draws upon the enormous prestige of Stanford. Only one of the four authors, John P.A. Ioannidis, has a Stanford affiliation. He is a professor of medical statistics, while the other three authors are from the private sector. Their published work feeds numbers from an existing database into a computer that crunches them into a list. That list is meaningless for India. It does not represent scientific acumen or achievement.

Generating scientific research papers without knowing any science or doing actual research has been honed into a fine art by academic smarties at home and abroad. The stuff produced has to be published for which smart professors have developed many tricks including a membership to the cartel of international referees. The next and most difficult stage is to generate citations after the paper is published.

At this point, the smart professor relies upon smart friends to cite him and boost his ratings. Those friends have their friends in India, China, or elsewhere. This international web of connections is known as a citation cartel. Cartel members generate reams of scientific gibberish that the world of mainstream science refuses to even notice. Some of the individuals who made it to the exalted ‘Stanford scientist list’ would surprise people if they could pass a tough high-school-level exam for entering undergraduate studies in a decent university like Stanford. Others could certainly be genuine. No one would be able to tell.

Yet in India the rewards are handsome, and the smart professor soon becomes chairperson, dean, vice-chancellor, or an influence peddler. One can expect nothing from the present gatekeepers of academia, because fraud is a way of life for most. These gatekeepers shunt out all genuine academics lest they be challenged from below. This is creating a downward spiral of mediocrity and an upward spiral of favouritism. Many ‘Category A’ NAAC accreditations of educational institutions are mere self-congratulation and reflect official policies that encourage academic dishonesty, all of which has inflicted massive damage upon the Indian higher education system.

Rankings are released and presented with much fanfare. Numbers, calculations, tables and other visual devices, “carefully calibrated” methodologies, and all that, are there to convince us that rankings are rooted in logic and quasi-scientific reasoning. Rankings are made to appear as if they were works of science; they most definitely are not. Maintaining the appearance of being factual is, however, crucial for rankings.

The policy regime in India places a lot of importance on rankings. That creates a problem, as more than a few educational institutions have started hiring consultants to help them raise their rankings. When a measure becomes a target, it ceases to be a good measure. This generalized form of Goodhart's law comes from Strathern's paper, not from any of Goodhart's own writings [Strathern, Marilyn (1997). “‘Improving ratings’: audit in the British University system”. European Review. John Wiley & Sons. 5 (3): 305–321].

To assume that a rank, in any ranking, could possibly say anything meaningful about the quality of an educational institution relative to other institutions, is downright irrational. It is, however, precisely this assumption that makes rankings highly consequential, especially when it goes not only unchallenged, but also openly and publicly embraced, by the scholars themselves.

*

First published 24 Mar 2021

*

*