DEBATE

Ligia Maria Vieira da Silva

Debate on the paper by Zulmira M. A. Hartz

The field of evaluation and the "sur mesure" strategy

Zulmira Hartz has launched a timely debate on the institutionalization of evaluation for health policies and programs. The author provides an extensive review of international experiences, focusing particularly on the French case, and raises key issues for the debate over the current Brazilian health agenda: the use of evaluation to back decision-making and its incorporation into health reform experiences, the relationship between policies and programs, and especially the field's current trend towards methodological pluralism.

I would start by reflecting on the field's specificity and the opposition between the structured or prêt-à-porter and the non-structured or sur mesure approaches. Despite the various limitations posed by experimental designs, mainly with regard to ethical and operational problems, they have been used to support health systems and services management (an aspect of the institutionalization of evaluation), particularly in relation to the efficacy of technologies. In addition, building information systems to monitor health situations requires defining problems, criteria, and standards on a national and international scale, an approach that has made it possible to control some diseases in the past. If we define, measure, and evaluate problems only on the basis of local criteria and standards, not only do comparisons become impossible, but so does the possibility of articulating control measures such as those that led to the eradication of smallpox worldwide and of polio in the Americas. This is not to deny the social and cultural nature of the health/disease phenomenon, several aspects of which require a local and decentralized focus for diagnosis and intervention, in addition to negotiated evaluation. Evaluation of program coverage can only be performed in a quantified, structured way. Yet the meaning of this coverage with regard to the degree of implementation and to technical and scientific quality is revealed more accurately through loosely structured approaches that draw on qualitative techniques to obtain information. Likewise, evaluation of effectiveness, which until recently required an exclusively experimental design, can now be conducted with loosely structured strategies.

I should add that the choice of approach does not always obey a theoretical and methodological logic. One can now recognize the existence of a field of evaluation in the sense ascribed to it by Bourdieu, i.e., a network of relations among agents, evaluators, and institutions (Bourdieu & Wacquant, 1992). The field's make-up derives precisely from the institutionalization of evaluation, a result of governments' demands in various industrialized countries for judgments of social programs' performance and effectiveness. The material expression of the field can be seen in the make-up of the International Conference on Evaluation held in Vancouver in 1995, with 1,600 evaluators, five associations, and 66 countries participating (Chelimsky, 1997). This field has several intersections, including those with the fields of science, health, and other professional fields linked to social programs, in addition to its relations with the field of power. What is at stake in this field is the dispute over scientific competence (knowledge) and technical competence (know-how). Thus, the dispute over which methodologies are most valid gains special relevance, since the controversy over what counts as scientific in the field is linked to the struggle over the market for evaluation projects. In addition, the object of evaluation involves interests linked to the sphere of power, especially when it is a matter of public policy evaluation. A policy's success or failure means the accumulation or loss of political and symbolic capital, either by those who govern or by those in opposition. Evaluation can also be used by managers inside institutions as a tool to control subordinates. Thus, resistance to the outside evaluator, to quantification, and to objectification can be greater, while strategies to expand individual power (empowerment) may be more readily accepted and more organic when there is a trend towards decentralization and democratization of decision-making processes.

Analyzed from a different angle, the sur mesure approach may be seen as adjusting the evaluation's methodological strategy to its object. In this sense, it should indeed be preferred, although the main problems have more to do with the construction of the object (Bourdieu, 1989) and with the treatment of theory as a guide for evaluation (Chen, 1990) than with the opposition between qualitative and quantitative techniques, which (as Hartz points out quite appropriately) can be articulated in actual studies.

Another important point raised by the author is the distinction between evaluating programs and evaluating policies. Although the bibliography she quotes does not make this distinction, I believe it is necessary from both a theoretical and a methodological point of view. Public policies relate to the state, i.e., the field of power; evaluating policies involves not only judging the adequacy, pertinence, effectiveness, efficiency, and legitimacy of governmental intentions and actions, but especially analyzing the nature of the state and the political power involved in drafting them. Programs, meanwhile, relate more to a policy's technical and operational dimension, i.e., its material manifestation as objectives, goals, resources, and activities, and their evaluation requires a set of methods and techniques different from those needed to analyze policies. By combining policies and programs in the same object, one runs the risk of reducing politics to technique, or even to planned policy, or (conversely) of focusing only on the political side of technique. This does not mean that program evaluation, especially at the local governmental or even institutional level, is a merely technical issue. For example, depending on the object or issue to be evaluated, contextual analysis can be an investigation strategy for explaining the processes involved in implementing a program.

Last comes the relationship between evaluation and the decision-making process. More than a theoretical, technical, or methodological question, this is a political and ethical issue, involving choices. That is, faced with the various rationales that interfere with the management process, institutionalizing evaluation in a public health system means seeking to ensure the hegemony of the technical/health rationale in the decision-making process, i.e., prioritizing health needs over institutional, corporatist, or even external pressures. It means developing management based on the identification of problems, organizing supply through programmed actions, and emphasizing control of risks and causes through a territorial focus, with social participation. In other words, it means changing the current health care model. This demands not only evaluating, but especially intervening to change the country's health reality sur mesure.

BOURDIEU, P., 1989. O Poder Simbólico. Rio de Janeiro: Bertrand Brasil/Lisboa: Difel.

BOURDIEU, P. & WACQUANT, L. J. D., 1992. Réponses. Pour Une Anthropologie Réflexive. Paris: Seuil.

CHELIMSKY, E., 1997. The coming transformations in evaluation. In: Evaluation for the 21st Century: A Handbook (E. Chelimsky & W. R. Shadish, eds.), pp. 1-26, Thousand Oaks: Sage Publications.

CHEN, H., 1990. Theory-Driven Evaluations. Newbury Park: Sage Publications.

Publication Dates

  • Publication in this collection
    18 Mar 2003
  • Date of issue
    Apr 1999