- CCV interface indeed sucks, and must be redesigned
- Terrible interface does not obviate the need for tracking data
- Free-format CVs are not a viable solution
- Tri-council must facilitate and lead a fully open discussion to completely redesign its interface, architecture and services around it
- Tri-council must harmonize the CV format across all agencies
- I am happy to help with the CCV redesign or its public consultation.
Some Canadian researchers have had enough of the CCV, and are petitioning to abandon it entirely as it exists. Given its buggy, unfriendly and unreliable nature, they may be right. However, before we discuss its limitations, it is important to recall that the goal of the CCV is laudable: one common CV for all agencies and academic activities. You prepare it once, and reuse it in as many applications as you need, each potentially requiring a different format or a different portion of the CV. It needs updating as more papers are published, grants are obtained, and so on, but these iterative additions should ideally be trivial compared to the initial task of entering one’s CV. Let me state the key point clearly so it won’t be lost: we must evaluate the savings of time and cost in view of the full breadth of a researcher’s domain, activities and lifecycle. Given the LARGE number of funding agencies, the VARIOUS types of peer-review activities involved, and the need for SCALABLE and AUTOMATED reporting, this CCV-type standardization and consolidation is essential to reduce the OVERALL amount of time and EFFORT spent across ALL agencies, over a period of 3 to 5 YEARS. With that in mind, the goals of the CCV are laudable, and the need for such a centralized system is clear.
These laudable goals might have been achieved if the CCV interface were not so poor and researchers did not constantly run into issues, especially during the already-frustrating periods around grant submission deadlines. Although it was launched back in 2002, it does not seem to have improved significantly or become more user-friendly over time. That is quite a shame. That Canada’s highest scientific bodies cannot find a decent software solution for what is essentially a text-entry system with a usable interface is frankly terrible, and reflects poorly on them.
The CCV’s terrible interface is the root cause of great frustration for all stakeholders involved and, to my understanding, the main reason the petition to abandon it appeared. I have read that petition with interest, and I empathize with and share the frustrations expressed in the letter. However, I do not support the proposal to abandon the CCV altogether, for the following reasons:
- A terrible interface does not obviate the need for standardizing and tracking data:
- Tracking researchers’ data (various CV metrics and details) is crucial for several informed decisions and debates, including quantitative analyses of the effectiveness of funding decisions over the years, and establishing or debunking bias in the peer-review system. For example, two of the petitioners, Drs. Holly Witteman and Michael Hendricks, studied the gender gaps in CIHR grant success and provided an important insight: “Gender gaps in [CIHR] grant funding are attributable to less favourable assessments of women as PIs, not of the quality of their proposed research.” [link at the end] They end that paper with: “To further advance knowledge, future research should investigate why female principal investigators might be evaluated less favourably, forms of bias beyond gender, and the effectiveness of strategies for facilitating peer review that consistently supports the best research.” The emphasized phrases are mine, and they are two key reasons FOR the CCV (not against it), or a similar standardized data collection and management system. Reason 1: to study all possible confounders and potential sources of bias, one must collect them first. The collection must happen in a standardized way, and the consolidation in a harmonized way, which, to the best of my knowledge, can only be achieved with a system like the CCV. Potential alternative sources such as funding and bibliometric datasets have been mentioned, but I don’t believe they are yet a sufficient replacement for the CCV, and that assumes those databases are actually of better quality and more user-friendly than the CCV, which is yet to be tested in the real world! Reason 2: to quantitatively evaluate the effectiveness of peer-review strategies, or to estimate the return on investment from different programs/streams of funding allocation, we would need fully digitized, high-quality databases.
- Another aspect I am interested in for the future, as an ECR and a visible minority, is the role of race in peer review and academic success, and finding ways to ensure that peer-review processes are free from bias. Did you know that applicants with ethnic names are rated as less hirable and less competent than their White and Asian counterparts, even when all of them have identical CVs? [link at the end] Wouldn’t you want to help build a database that enables high-quality quantitative studies to establish, or debunk, such subtle forms of bias? Imagine how many more studies like the gender-gap study cited above could be conducted with CCV data.
- Free-format CVs are not a viable solution
- They are harder to parse, introduce variability in presentation, and are difficult to validate automatically against specifications, e.g. eligibility criteria
- They introduce undesirable noise, such as the design, style or font of the CV, that can bias the reviewer’s evaluation
- They are also biased against
- non-native speakers of English, many of them minorities, whom the lack of standardization puts at a disadvantage
- persons with disabilities, as these formats require more work to “polish” in order to improve their chances, again for lack of a standardized format.
- CCV can accommodate atypical CVs
- Although I support the need for some free-form expression to describe atypical trajectories or unconventional contributions from researchers on atypical career paths, the unstructured alternatives proposed may create fragmentation, and might end up being more time-consuming to manage (for reviewers and others). While we can debate the right architecture for CCV databases and the types of data needed, a standardized format is essential, and we can certainly accommodate free-text entry within it!
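To make the parsing argument above concrete, here is a minimal Python sketch of the kind of automated check that structured CV data enables and free-format PDFs do not. The field names and the early-career rule are hypothetical, purely for illustration; the real CCV schema and eligibility criteria will differ.

```python
from datetime import date

def check_ecr_eligibility(entry: dict, reference_date: date,
                          cutoff_years: int = 5) -> list:
    """Validate a structured CV entry against a hypothetical
    early-career rule: highest degree awarded within `cutoff_years`
    of `reference_date`. Returns a list of problems (empty = eligible)."""
    problems = []
    awarded = entry.get("highest_degree_date")
    if awarded is None:
        problems.append("missing field: highest_degree_date")
    elif (reference_date - awarded).days > cutoff_years * 365.25:
        problems.append("degree awarded outside the eligibility window")
    return problems

# One function call on structured data; a free-format PDF would first
# need fragile text extraction to recover the same field.
entry = {"name": "A. Researcher", "highest_degree_date": date(2017, 6, 1)}
print(check_ecr_eligibility(entry, date(2019, 7, 1)))  # -> []
```

The same idea extends to any machine-checkable criterion (citizenship status, number of eligible publications, etc.), which is exactly what a standardized schema buys us.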
Keeping those in mind, I suggest the Canadian Tri-council pursue the following, which may hopefully help us achieve an Ideal CV (ICV) system:
- Facilitate and lead a fully open discussion:
- By presenting their own case for the need for the CCV and their stance on it. As it stands, there is no public information on this. At all
- More importantly, they must share all related information publicly, e.g. the costs they are incurring, the resources they plan to invest going forward, and how decisions were made. This does not seem to be the case currently
- Share detailed statistics on the coverage of the CCV across Canada, the number of international collaborators, PIs with disabilities and patient partners, the various formats or services working with the CCV, and their changes over time. This would help estimate the effort and impact on the different stakeholders involved, and help us decide which categories would benefit most from easy-to-get-started templates and tutorials
- As it stands, discussions are in the abstract, without concrete examples or technical/design details, which presents a barrier to a high-quality debate. For example, when the petitioners say “The data is very low quality and extremely incomplete”, or “it’s not possible for a system to do everything that the CCV is supposed to do”, it is difficult to follow what they are referring to, or how the data are incomplete or low-quality, without specific, concrete examples and a clear schematic of all the CCV-related services. I think most participants in this debate are right, are participating with good intentions, and are trying to find an acceptable solution.
It is just that different folks are focusing on different parts of this multifaceted issue! Public sharing of all the related information, as noted above, will improve the quality of the CCV debate by enabling the various stakeholders to understand each other
- An open discussion will also improve researchers’ trust that the agencies are doing what all the stakeholders want and need
- It is possible that such a public review may conclude that the CCV cannot easily be redeemed, or that fixing it would cost much more than starting from scratch. However, IMHO, that is unlikely, given that most researchers’ CVs are already in the database and the effort would go into the centralized software (design, quality control, etc.)
- Completely redesign the user interface:
- in consultation with all stakeholders and researchers at all career stages. If this requires reforming or replacing CCV-related committees or contractors (IT, support, etc.), or throwing in 10 additional developers and servers regardless of the sunk cost, so be it
- Establish focus groups, and rely on continuous testing to ensure the design decisions made are indeed user-friendly and time-saving
- It’s 2019, and it would be a giant shame if we can’t provide user-friendly data-entry forms. I absolutely love the TurboTax interface and the design decisions that make it user-friendly for a comparably complex and recurring task – why can’t we simply adopt something similar?
- Given the number of outages we’ve seen in the past, especially around deadlines, stress testing must be done well ahead of time.
- Harmonize the formats across different funding agencies:
- Even with a better, usable UI, as Dr. Paul Minda says, “as long as CCV is used 30 different unique ways for each agency and each program within each agency, it’s going to be doomed”
- The lack of harmonization across agencies not only creates issues but also wastes users’ time. If the Tri-council wants to keep calling it a COMMON CV, it had better actually be common across all the funding agencies in Canada. Otherwise, why does it even exist?
- Accessibility and inclusion must be a priority: special efforts must be made to reduce the amount of time needed for everyone, especially differently abled and non-academic researchers, as well as to recognize unconventional contributions properly. There can be no excuse in these aspects!
- The web interface should be designed so that all clicks and navigation are tracked, to enable unambiguous identification of where the pain points are and what the common issues are. This would help quantitatively measure the effort required for different parts of the form, which in turn helps identify ways to simplify the interface or expedite/remove some subparts. Any such tracking must be completely ANONYMOUS
- When most of the scientific world is marching towards standardizing and harmonizing data in every domain, to enable and improve reproducibility, proposing to end an already reasonably standardized system like the CCV – one that was arguably ahead of its time back in 2002 – is not the right step forward
- You might be wondering whether the investment in the suggestions above is worth it, and that’s a fair question. I believe it is, to achieve a fair and equitable science funding system. Given the ideals of Equity, Diversity and Inclusion (EDI) that Canada espouses, and in which we are already a world leader, this investment is necessary and well worth it in the long term.
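The anonymous usage tracking suggested above could be quite simple. Here is a rough Python sketch (the event names, form sections and salting scheme are my own assumptions, not anything the CCV actually does) of aggregating interface events into per-section pain-point counts without storing any raw user identifiers:

```python
import hashlib
from collections import Counter, defaultdict

SALT = "rotate-me-daily"  # a rotating salt limits long-term re-identification

def anonymize(session_id: str) -> str:
    """One-way hash of a session ID; raw IDs are never retained."""
    return hashlib.sha256((SALT + session_id).encode()).hexdigest()[:12]

def aggregate(events):
    """events: iterable of (session_id, section, event_type) tuples.
    Returns per-section counts of validation errors, a crude
    pain-point score, keyed only by anonymized sessions."""
    error_counts = Counter()
    sessions_seen = defaultdict(set)
    for session_id, section, event_type in events:
        sessions_seen[section].add(anonymize(session_id))
        if event_type == "validation_error":
            error_counts[section] += 1
    return {s: error_counts[s] for s in sessions_seen}

log = [
    ("u1", "publications", "click"),
    ("u1", "publications", "validation_error"),
    ("u2", "publications", "validation_error"),
    ("u2", "funding_history", "click"),
]
print(aggregate(log))  # sections with high error counts are pain points
```

Sections where many sessions trip validation errors, or spend disproportionate time, are exactly the subparts worth simplifying first.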
Some of these ideas may indeed have been discussed before (although there is no public record of those discussions, beyond an article here and there), and the lack of any improvement so far does invite valid skepticism. For example, Dr. Michael Hendricks says, “a significant part of my motivation [in leading and signing the petition] here is not related to whether or not a ‘perfect CCV’ would be useful (I am not convinced), but whether the tri-councils can or will ever deliver one. For many reasons, I do not believe they can.” This is perfectly fair skepticism, given the CCV’s history and the Tri-council’s failure to improve it over its 15+ years of existence. That comment should give the Tri-council a clear indication of how bad the situation is; they need to take this reform very seriously, as their credibility rests on delivering an easy and acceptable CCV.
That said, given that we don’t know why the Tri-council has failed so far, how much has already been invested, or what its reform plans are going forward, and keeping in mind the overall long-term value of an “Easy CCV”, I say let’s not throw the baby out with the bathwater, and instead work with the Tri-council towards making the CCV great.
Thank you for checking out my blogpost. I’d love to hear thoughts and constructive feedback.
Hello Tri-council folks, I will be happy to volunteer on the redesign or steering committees to help achieve a better CCV.
- I am an able-bodied scientist, and hence may not be able to fully empathize with the barriers faced by persons with disabilities. This is why I strongly recommend that the Tri-council make it a priority to ensure the CCV presents no barriers to them. A few ways to achieve that are to consult persons with disabilities to identify the barriers as well as acceptable solutions, followed by sufficient testing to ensure those barriers are indeed removed. Where that is not possible, additional support must be provided
- I am also a technically savvy scientist who breathes, eats and spits out data. Although this helps me appreciate the essential need for standardization in a scalable, automated and fully quantitative solution, it also biases me towards such solutions within the broader space of all possible solutions. Hence, brainstorming sessions on reducing data collection where possible would help, e.g. by avoiding the collection of redundant info, or by providing alternative solutions (e.g. using Google Scholar to identify all published papers so no one has to enter them by hand).
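To illustrate that last point, here is a small Python sketch (the record fields and the DOI-based matching rule are my own assumptions) of merging publication records pulled from an external export, e.g. a Google Scholar or ORCID listing, into a CV while skipping entries that are already present, so no one retypes their papers:

```python
def normalize_doi(doi):
    """Lower-case a DOI and strip the resolver prefix, if any."""
    if not doi:
        return None
    return doi.lower().replace("https://doi.org/", "")

def merge_publications(existing, imported):
    """Append imported records whose DOI is not already in the CV."""
    known = {normalize_doi(p.get("doi")) for p in existing} - {None}
    merged = list(existing)
    for rec in imported:
        doi = normalize_doi(rec.get("doi"))
        if doi not in known:
            merged.append(rec)
            known.add(doi)
    return merged

cv = [{"title": "Paper A", "doi": "10.1000/xyz"}]
scholar = [
    {"title": "Paper A", "doi": "https://doi.org/10.1000/XYZ"},  # duplicate
    {"title": "Paper B", "doi": "10.1000/abc"},                  # new entry
]
print(len(merge_publications(cv, scholar)))  # -> 2, duplicate skipped
```

A real implementation would also need fuzzy title matching for records without DOIs, but even this simple DOI check would spare researchers most of the manual re-entry.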
References and Links:
- University Affairs article covering the discussion on petition to abandon the CCV
- Witteman, Holly O., et al., “Are Gender Gaps Due to Evaluations of the Applicant or the Science? A Natural Experiment at a National Funding Agency.” The Lancet, vol. 393, no. 10171, 2019, pp. 531–40. doi:10.1016/S0140-6736(18)32611-4
- Eaton, Asia A., et al. “How Gender and Race Stereotypes Impact the Advancement of Scholars in STEM: Professors’ Biased Evaluations of Physics and Biology Post-Doctoral Candidates.” Sex Roles, June 2019. doi:10.1007/s11199-019-01052-w.
I thank Prof. Paul Minda for his helpful comments on the initial draft.