Hcéres, a partner of choice

An evaluation tool designed to serve stakeholders

The activities of Hcéres reflect the public authorities’ dual commitments to:

  • make a single body responsible for evaluating clusters of higher education and research institutions, individual institutions, research units and study programmes, or, where applicable, for verifying the quality of the evaluations carried out by other bodies;
  • provide higher education and research institutions, their groupings, and stakeholders in general with a dedicated evaluation tool at the national level in France.

ESG

Hcéres is also committed to building the European Higher Education Area. Through the Bologna Process, this area has enabled the implementation of numerous schemes, including the development in each Member State of national quality assurance systems (evaluation, accreditation, auditing, etc.) based on the European standards (ESG: Standards and Guidelines for Quality Assurance in the European Higher Education Area).

These standards define two aspects of quality assurance:

  • internal quality assurance, which covers all the continuous improvement schemes for the activities carried out by the evaluated entities;
  • external quality assurance, which organises the external evaluation of these activities by peers, through quality assurance agencies such as Hcéres. In particular, external quality assurance relies on the self-evaluation processes carried out by the evaluated entities.

The European standards combine the practice of self-evaluation by institutions with external evaluation by peers. 

What are the objectives of evaluations carried out by Hcéres?

Hcéres’ role is not to criticise, but to act as a partner of choice: it is chosen by the evaluated entities and committed to supporting them. That is why a constructive approach is a cornerstone of its methodology. For the High Council, it is not a question of judging, but of advising in a completely impartial manner.

The evaluations carried out by Hcéres are designed to serve the evaluated entity and:

  • provide information for the management teams of higher education and/or research institutions, or their groupings, on which they can base their future training and research strategies;
  • provide teaching and research staff with comparative data they can use to improve the quality of the service they provide;
  • offer supervising ministries the information they require for decision-making (allocation of human and financial resources, accreditation of study programmes, official certification of research units, etc.);
  • give students relevant information to help with their educational choices;
  • meet companies’ needs for information about the quality of study programmes and degrees, and about the competencies of graduates;
  • inform civil society about the activities of higher education and research institutions or their groupings, in a reliable and transparent manner.

Evaluation principles and methods adopted by Hcéres

“Demanding” is the word that best describes the Hcéres methodology. In line with the European standards, evaluation by Hcéres combines two complementary processes:

  • Self-evaluation, which enables each entity to characterise its development trajectory.
  • External evaluation, which is based on the self-evaluation report submitted by the evaluated entity and information gathered during the visit by a panel of experts.

On the basis of this information, Hcéres develops an “integrated evaluation” using a method that draws heavily on the combined expertise of its four evaluation departments: Institutions, Research Bodies, Research Entities, and Study Programmes.

This approach requires a continuous dialogue between the High Council, the evaluated entity and its supervising ministry:

  • A preliminary consultation with stakeholders at the national level;
  • Transmission of information to the evaluated entities during the preliminary phase of the evaluation;
  • Consideration of the institutions’ comments about their evaluation, which are combined with the final external evaluation report;
  • Feedback from the evaluated entities to validate the consistency and relevance of the procedures.

Evaluation campaigns cover a period of around 12 months from the submission of the evaluation file; campaigns for clusters of higher education and research institutions generally last 18 months.