The evaluation of Curing the Limbo was carried out by the five implementing partners: the Municipality of Athens, Athens Development and Destination Management Agency S.A. (ADDMA), the National and Kapodistrian University of Athens (UoA), the Catholic Relief Services (CRS) and the International Rescue Committee (IRC Hellas).
Each entity was responsible for collecting and analysing data pertaining to the activities it implemented. Consequently, for some time, the implementing partners carried out monitoring and evaluation of their respective components and services, each drawing on its own beneficiary database. Such an approach did not allow the partners to capture the project's impact on the integration of the refugees; capturing that impact required integrating the evaluation processes across the different project components and project partners.
Integration takes a very long time; it takes twenty, thirty years. It was a very big challenge to see how we can set up monitoring and evaluation for integration as a whole […] Three years is not integration; it is early support for having better access, but it is not truly integration.
Source: Curing the Limbo project hearing
While initially there was no clear leadership of the evaluation among the partners, this role was eventually taken up by the quality assurance team. Originally tasked with ensuring the quality of project deliverables, activities and risk management, the team took it upon themselves to integrate and standardise the evaluation approaches employed by the five cooperating partners. This change was motivated by several factors, as follows.
Firstly, it was established that the project lacked a clearly developed, overarching evaluation approach. Rather, each implementing partner focused on collecting and analysing data specific only to its own line of activities, in response to indicators relevant to its segment of the project. As such, the project lacked an integrative evaluation framework in which all the data collected by the various partners could be assimilated and analysed, giving insight into the project's impact in its totality. The quality assurance team took the lead in developing such a framework by guiding a process of collaboration and consultation.
Secondly, all five partners in the project brought significant expertise in evaluation, each with its own experiences, methodological approaches and understanding of the process. While such rich combined expertise was certainly an advantage, it also created divergences in how certain methods or theoretical approaches were interpreted among the evaluation actors. Here, the quality assurance team played an important role in arriving at shared definitions and harmonised approaches to data collection and analysis.
Everybody had their own evaluation frameworks for each pillar, output and outcome; some had their theory of change. We were all measuring what we were doing, but separately. Then we realised that we were not measuring the common goal of the programme.
Source: Curing the Limbo project hearing
The process of developing a harmonised, integrated approach to evaluation was conducted in a consultative manner, building on extensive discussions among the partners. This strong collaborative process was both an asset and a great challenge. While the quality assurance team supported integration, arriving at a consensus was time- and energy-consuming. On the one hand, such discussions were intellectually highly stimulating, enhancing the quality of the final results and fostering a strong learning culture among the implementing partners. On the other hand, at times the partners felt fatigued by the process, for instance when delays hindered the collection of adequate baseline data. Reflecting on the process, the evaluation team wished they had clearly designated an evaluation leader from the project's outset.
The first thing to do was to go beyond the measuring indicators and to go deep into a more essential way of evaluating. The words we usually used during the meetings such as ‘process’, ‘theory’, ‘culture’; they don’t have the same, common meaning to all of us. Step one was to establish common definitions.
Source: Curing the Limbo project hearing
The fact that the evaluation was carried out directly by the project implementing partners, in close and frequent dialogue, allowed learning loops to be integrated efficiently and effectively into project activities. The baseline survey, for instance, revealed that the initially planned six months of housing support for refugees should be extended to one year, and this was consequently applied. Similarly, close monitoring of community activities showed that, for this element to succeed, more empowering work needed to be done first. As a result, the project component was prolonged to ensure meaningful implementation. Such adjustments were possible thanks both to the governance model and to the choice of action research as the evaluation's governing principle. At a late stage of the project (November 2020), an external collaborator, the National Centre of Social Research (EKKE), was contracted to provide expertise and contribute to drafting the final evaluation report. EKKE will draw extensively on the collected data and will additionally conduct semi-structured interviews and focus group discussions.