Principles and methodology

In the context of the autonomy and empowerment of each institution, evaluations organised by Hcéres perform the following key functions:

  • evaluating all activities carried out by the institution (partnerships, research, commercial development, training, student pathways, internationalisation, etc.);
  • analysing the procedures for the development of the institution’s strategy and its operational implementation;
  • analysing the institution’s governance strategies and its ability to control the management of its activities, in line with its strategy and in support of a clearly identified Quality policy;
  • analysing the institution’s ability to monitor and describe the progress it has made during the reference period (the period under evaluation, see below) in its different activities, and to identify its strengths and weaknesses on that basis;
  • analysing the overall coherence of the institution’s policies in the different activities, in accordance with its strategic choices.

The evaluation must make it possible to judge the institution’s awareness of its own qualities, its ability to set and monitor targets, and its capacity to implement improvement measures.

The evaluations are carried out by a panel of experts on a collegial basis, in compliance with the Hcéres Evaluation Charter. These experts are peers with experience of governance and institutional management. All evaluation reports are available to the public: they enable the supervising ministries, an institution’s partners (local authorities, partners from the socio-economic sector, etc.) and users to obtain information presented in an identical format for every institution.

The evaluation procedures for institutions can be adapted (smaller panel of experts, streamlined visits, etc.) according to the specific characteristics of institutions, including their missions and fields of activity. Although the methodology can be adapted, the framework defined by the Hcéres institutional evaluation standard remains the same for all institutional evaluations.

The Department of Evaluation of Higher Education and Research Institutions (DEE) is experimenting with “coordinated” or “joint” methodological adaptations through partnerships within and outside Hcéres. For example, the evaluations of national schools of architecture (ENSA) are carried out jointly with the Hcéres Department of Programmes Evaluation, and certain bodies may be evaluated in collaboration with the Department of Research Evaluation. Lastly, the evaluations of certain schools of engineering are carried out in coordination with the Commission for Engineering Qualifications (CTI).