Thursday, November 10, 2011

Stop wasting researchers' time: Drastically simplify grant selection

Today, someone sent me this excellent blog post by UBC mathematics professor N. Ghoussoub discussing the immense amount of valuable time professors waste applying for grants with low acceptance rates.


I have made the conscious decision not to even bother applying for grants with low acceptance rates. Even though I think my research is quite respectable, and I have received a decent number of grants over the years, simple economics tells me that my time is better invested in doing actual research.


What is the solution? Clearly professors need grants to hire graduate students, buy equipment and travel to conferences. Junior professors also need prestigious grants to boost their careers. However, application processes for all grants should be streamlined.


There are excellent moves in this direction underway. The Canadian Common CV system will soon be in use by most granting agencies. Hopefully that will eliminate the need for researchers to spend any time in grant applications describing their past research. Grant applications, at least for established researchers, should be based primarily on the CV (papers published, graduate students trained, etc.).


Let's take this three steps further:

  • There should be a 'common research proposal system'. A researcher would write a set of descriptions, in a standard format, of pieces of research they want to accomplish -- a maximum of 2-3 pages each, with at most one page devoted to literature review, and few or no budget details. The proposals would be open to public scrutiny and comment. A proposal could be as focused as a specific experiment the researcher wishes to conduct, or could outline a general set of research objectives with their rationale. Proposals could be refined or withdrawn as time goes by, for example as the professor finishes some units of work or improves their ideas.
  • The common-CV system should be enhanced to automatically tag each publication with citation data. It should also compute indexes like the H-Index (excluding self-citations) and the G-Index, as well as variants of these that weight more recent work more highly (a minimal sketch of such computations appears after this list).
  • There should be a common, simplified grant-application system. A researcher would select the grants they want to apply for, indicate which of their research proposals they would like to work on under each applied-for grant, and the amount of time they would want to spend on each task. Except for expensive equipment that requires quotes, the budget would simply indicate the number of master's, PhD and postdoctoral students, plus technical and administrative assistants, that would be needed. The system would compute the budget from standard salary rates plus a per-researcher allowance for conference travel, basic equipment and supplies, with the amounts standardized for each field (a sketch of this computation also follows the list). This point about budgets is important: in current application processes, professors have to write detailed budgets, but then almost never receive funds matching the budget; the current process simply forces professors to write essentially fictitious budgets.
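To make the citation-index idea concrete, here is a minimal sketch in Python. It assumes each publication record carries its own author list, publication year, and a list of citing papers with their authors; the recency-weighted variant and its half-life parameter are illustrative choices of mine, not an established index.

```python
def h_index(counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(counts, reverse=True)
    return sum(1 for rank, c in enumerate(counts, start=1) if c >= rank)

def g_index(counts):
    """Largest g such that the g most-cited papers have at least g^2 citations in total."""
    counts = sorted(counts, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(counts, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def non_self_citation_counts(papers):
    """Per-paper citation counts, ignoring citations that share an author with the cited paper."""
    counts = []
    for paper in papers:
        own_authors = set(paper["authors"])
        counts.append(sum(1 for citing in paper["cited_by"]
                          if not own_authors & set(citing["authors"])))
    return counts

def recency_weighted_counts(papers, counts, current_year=2011, half_life=5.0):
    """Discount each paper's count by its age (a hypothetical 'recent work' variant)."""
    return [c * 0.5 ** ((current_year - p["year"]) / half_life)
            for p, c in zip(papers, counts)]
```

The common-CV system could then report, say, h_index(non_self_citation_counts(papers)) alongside the raw indexes, with no effort from the researcher.

And here is a sketch of the formula-based budget described in the last bullet. The rates and allowance are made-up numbers purely for illustration; a real agency would publish field-specific tables.

```python
# Illustrative annual rates only; a granting agency would set these per field.
STANDARD_RATES = {
    "masters": 20000,
    "phd": 25000,
    "postdoc": 45000,
    "technician": 55000,
    "admin": 50000,
}
PER_RESEARCHER_ALLOWANCE = 5000  # travel, basic equipment and supplies

def compute_budget(headcounts, quoted_equipment=0):
    """Annual budget from headcounts alone, e.g. {"phd": 2, "postdoc": 1}."""
    salaries = sum(STANDARD_RATES[role] * n for role, n in headcounts.items())
    allowances = PER_RESEARCHER_ALLOWANCE * sum(headcounts.values())
    return salaries + allowances + quoted_equipment

# compute_budget({"phd": 2, "postdoc": 1}, quoted_equipment=30000) -> 140000
```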
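The point of the second sketch is that once the headcounts and any equipment quotes are known, the arithmetic is trivial and leaves nothing for the professor to invent.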

The above would drastically shorten the time a professor wastes applying for grants they have a low probability of receiving. Professors would essentially make 'standing offers' or 'standing requests' to do certain research.


Much of the computation of criteria for grant selection should be automated: selection committees would give high weight in their decisions to data such as the citation indexes mentioned above, and to versions of those indexes computed over publications relevant to the research field of the grant. They would also factor in trends in graduate-student training and the professor's available time (a professor with many other grants would have less available time).
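As a rough sketch of how such automated criteria might be combined, consider the following. The field names, weights and scaling are entirely hypothetical; any real agency would have to calibrate and publish its own formula.

```python
def selection_score(applicant, weights=None):
    """Combine automatically computed signals into one ranking score.

    `applicant` is assumed to carry pre-computed values such as
    h_index_in_field (restricted to the grant's research field),
    g_index, students_graduated_recently, and committed_fraction
    (the share of time already promised to other grants).
    """
    w = weights or {"h_field": 0.4, "g": 0.2, "training": 0.2, "time": 0.2}
    available_time = max(0.0, 1.0 - applicant["committed_fraction"])
    return (w["h_field"] * applicant["h_index_in_field"]
            + w["g"] * applicant["g_index"]
            + w["training"] * applicant["students_graduated_recently"]
            + w["time"] * 10 * available_time)  # scaled to roughly match index magnitudes
```

Committees would still exercise judgment over the weights and the final ranking; the point is only that the data gathering and arithmetic need not consume researchers' time.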


For established researchers, the peer-review process would be limited to two things: 1) brief comments on the extent to which the researcher is continuing an established line of research, and 2) suggestions on how the research could be improved. If a professor is branching out in a new direction, or is a new researcher, more detailed comments would be needed. But for researchers who are continuing in the same broad line of research, the track record should largely speak for itself, and the suggestions for improvement should serve primarily to guide the researcher.


With the above systems in place, the overhead of applying for and peer-reviewing grants could be drastically reduced.


Some would argue that citation indexes can be faulty. So can peer reviews. The above system would clearly have different biases, but I do not think it would introduce any major new sources of unfairness, and it would make the research process a whole lot more productive.


Ghoussoub's comments about the waste of research resources also apply to conferences and journals with low acceptance rates. For those, however, the solution is completely different: simply increase acceptance rates so that they are generally at least in the 40% range. Far too many decent papers in my field are rejected because of arbitrarily imposed low acceptance rates.

1 comment:

  1. Another relevant standard is CASRAI: http://casrai.org/
    It is still not clear which one will win, CCV or CASRAI; maybe both will coexist for some time.

