
HOSPITAL MANAGEMENT INFORMATION BENCHMARK PROJECT
Tanzania
END OF PROJECT EVALUATION

Final Report 15 September 2011
Cordaid, Tragpi, CSSC


Cordaid Project Reference: 1000971
Country: Tanzania
Organization: CSSC
Author: Pieter-Paul Gunneweg

Disclaimer: The nature of Imprint's work can imply that customers request Imprint to come forward with innovative ideas or suggestions to bring about change. Imprint takes full responsibility for the observations and advice expressed in this document; however, Imprint disclaims any responsibility for the way the advice is interpreted or implemented.


Table of Contents

Executive Summary
Map of Tanzania
Acronyms and Abbreviations
Acknowledgement
1. Introduction
1.1 The Management Information Benchmark Project
1.2 Objectives, Rationale & Assumptions
1.3 Terms of Reference of the Evaluation
1.4 Methodology of the Evaluation
2. Project Context
2.1 Tanzania Country Profile
2.2 The Tanzanian Health System
2.3 Relevant Health Sector Policies and Programmes
2.3.1 National Health Policy
2.3.2 Health Sector Reform
2.3.3 Health Sector Strategic Plan III
2.3.4 Primary Health Care Service Development Programme
2.3.5 Hospital Reform Programme
2.3.6 M&E Strengthening Initiative
2.4 Situation Analysis Church Hospitals
3. Project Description
3.1 Partnership
3.2 Project Management & Set-up
3.3 Project Area & Beneficiaries
3.4 Main Project Activities
3.4.1 Performance Benchmarking
3.4.2 Hospital Support Visits
4. Assessment Data Management
4.1 Data Recording & Registration
4.2 Organization of Data Management
4.3 Data Analysis
5. Assessment Hospital Performance
5.1 Commitment to Benchmark Methodology
5.2 Use of Benchmark Methodology
5.3 Hospital Performance
6. Main Conclusions & Recommendations
6.1 Relevance
6.2 Impact
6.3 Effectiveness
6.4 Efficiency
6.5 Sustainability
6.6 Lessons Learnt & Recommendations
Annex 1: Terms of Reference
Annex 2: Logical Framework
Annex 3: Evaluation Indicators & Relative Score per Hospital
Annex 4: Resource Persons
Annex 5: Key Performance Indicators
Annex 6: KPI Time Series 2008-2010 Lake Zone Hospital
Annex 7: Production Figures 2008-2010 Selected Hospitals
Annex 8: Itinerary Consultant
Annex 9: Resource Documents

Executive Summary

This chapter presents a summary of the main findings and recommendations of the evaluation of the Management Information Benchmark Project (MIBP). The objective of the MIBP is to improve the performance of 17 selected Church-owned hospitals in the Lake and Northern Zones through capacity building in performance measurement and data management. The MIBP started in 2007 and comes to a close at the end of 2011. The purpose of the evaluation is to establish the extent to which the MIBP has realized its objectives. The evaluation was conducted from 15 to 29 August 2011.

The MIBP is in line with the national health sector strategies, in which improved hospital performance is a key priority. The MIBP also suits the priorities of the recipient hospitals, which have a stake in improving their overall performance. The MIBP has been successful in introducing an innovative and objective framework to monitor, assess and advance hospital performance, including the establishment and improvement of a database of hospital performance indicators in the course of the project. Hospital management teams widely appreciate the relevance and usefulness of the newly introduced performance benchmarking methodology, coupled with the regular zonal exchange and feedback meetings.

Apart from this achievement, the actual outcome, effect and impact of the MIBP are limited and vary between hospitals. The MIBP contributed positively to raising the consciousness of hospital management of the importance of data management for rational decision making. Some aspects of data management, such as data registration, recording and supervision, have clearly improved. However, only restricted progress is seen in other aspects, such as data analysis and the routine use of data for informed and more result-oriented management decisions. By and large, enhanced hospital performance in terms of increased productivity and improved trends in performance indicators could not be attributed to the MIBP.

Limiting factors to the success of the MIBP are both internal and external. The MIBP extended its project area from 9 hospitals in the Lake Zone to an additional 8 hospitals in the Northern Zone, clearly limiting its capacity to provide the required intensive follow-up support to the individual hospitals. Another main limiting factor is the worsening financial situation of all hospitals, constraining resources and staff. This is coupled with considerable turnover in senior management positions and considerable numbers of inexperienced managers.

It is recommended to further develop and mainstream the benchmark methodology. However, rather than in isolation, this should be done in the context of strengthening the national health system, most notably the ongoing hospital reform agenda and the new initiative to harmonize the national health sector monitoring and evaluation system. It is suggested that the benchmark methodology is harmonized, and possibly integrated, with other software applications in support of improving the routine data management of health institutions.

Map of Tanzania

Acronyms and Abbreviations

AIDS      Acquired Immune Deficiency Syndrome
CCT       Christian Council of Tanzania
CHMT      Council Health Management Team
Cordaid   Catholic Organization for Relief and Development Aid
CSSC      Christian Social Services Commission
DDH       District Designated Hospital
DMO       District Medical Officer
FBO       Faith Based Organisation
HMIS      Health Management Information System
HMT       Hospital Management Team
HSRS      Health Sector Resource Secretariat
HSR       Health Sector Reforms
HSSP      Health Sector Strategic Plan
HSPS-IV   Health Sector Programme Support programme, phase IV
ICT       Information and Communication Technology
IT        Information Technology
KPI       Key Performance Indicator
LGA       Local Government Authority
LGRP      Local Government Reform Programme
ELCT      Evangelical Lutheran Church in Tanzania
MMAM      Mpango wa Maendeleo ya Afya ya Msingi (Primary Health Care Service Development Programme)
MIBP      Management Information Benchmark Project
MOHSW     Ministry of Health and Social Welfare
M&E       Monitoring and Evaluation
MSD       Medical Stores Department
NACP      National AIDS Control Programme
NCD       Non-Communicable Disease
P4P       Pay for Performance
PMO-RALG  Prime Minister's Office, Regional Administration and Local Government
RHMT      Regional Health Management Team
RS        Regional Secretariat
TEC       Tanzania Episcopal Conference
OC        Operational Costs
POW       Plan of Work
SWAp      Sector Wide Approach
TCMA      Tanzania Christian Medical Association
Tragpi    Trag Performance Intelligence Group
TQIF      Tanzania Quality Improvement Framework
VA        Voluntary Agency Hospital

Acknowledgement

The evaluator would like to express his sincere appreciation for all the support and input received from the various stakeholders and resource persons of the project. Without this support it would not have been possible to conduct the evaluation in such a participatory manner. In particular, I would like to thank Ms. Meclina Isasi, the project coordinator based at CSSC in Dar es Salaam, for the very valuable information and time she provided to me. I would also like to thank Mrs. Natasha Lako (Cordaid) and Mr. Steven Lugard (Tragpi) for their input and direction. I am also very thankful for all the logistical support from the CSSC Lake Zone office. Lastly, I would like to express my gratitude to all hospital staff interviewed for their time, insights and suggestions. I hope this evaluation report is useful and provides sensible ideas and suggestions to further advance hospital care in Tanzania.

15th September 2011 Pieter-Paul Gunneweg [email protected]

1. Introduction

1.1 The Management Information Benchmark Project

The Management Information Benchmark Project (MIBP) was initiated in June 2007 with a tri-partite agreement between the Christian Social Services Commission (CSSC), the Catholic Organization for Relief and Development Aid (Cordaid) and the Trag Performance Intelligence Group (Tragpi). The project evolved from another project supported by Cordaid to strengthen the output and performance of selected hospitals in critical health services, the Pay-for-Performance project (P4P). During the implementation of the P4P project, the need for a hospital performance measurement methodology was identified to assist hospitals in comparing performance within and between hospitals. The MIBP was to deliver on this demand. The actual start of the MIBP was in September/October 2007.

1.2 Objectives, Rationale & Assumptions

The initial objectives of the MIBP were to strengthen the financial management capabilities of the health facilities, specifically in the areas of cost accounting and productivity measurement. The MIBP was to assist and capacitate the CSSC and the health facilities in the usage of the newly developed financial management manual for CSSC health facilities. Specific objectives of the project were: (i) to build financial management capacity within CSSC and 20 selected hospitals, 10 of which were also participating in the P4P project; (ii) to assess the impact of emerging new financing modalities in the health sector on the financial accounting requirements of health facilities and; (iii) to develop and run a modified training curriculum on financial management and cost accounting for health facilities.

However, in the first year of implementation, it became clear that the initial MIBP objectives were not feasible. It was agreed to re-focus the objectives of the MIBP on improving the management performance of selected hospitals. The originally envisaged capacity support for improved financial management and cost accounting was left out. The newly agreed objective of the MIBP was to improve the performance of selected hospitals and CSSC through capacity building in performance measurement and data management (November 2009 - October 2011). A revised logical framework was agreed upon (Annex 2).

The hypothesis of the MIBP is that improved measurement holds the key to improved management, which in turn leads to enhanced hospital performance. Benchmarking is seen as an important tool to increase transparency within the hospital, between the various departments, and between hospitals of the same kind. It is assumed that it will enhance financial management and hospital governance through the promotion of transparency in and between hospitals. Hospital management is sensitized and mobilized to: (i) learn from peer managers how to handle and overcome common challenges; (ii) instill a sense of competition among managers and hospital departments, benefitting performance; (iii) improve cost consciousness and cost efficiency.

1.3 Terms of Reference of the Evaluation

The main objective of the evaluation of the MIBP is to determine the extent to which the MIBP enhanced data management and the routine application of performance benchmarking by the recipient hospitals, and to what extent this contributed to improved health care performance of the same institutions. If possible, the evaluation had to distinguish between hospitals involved in the MIBP alone, those hospitals which only
participated in P4P, and those hospitals which were simultaneously benefitting from the MIBP and P4P. A comprehensive list of additional and specific evaluation questions was part of the TOR (Annex 1). The involvement of Cordaid with the MIBP comes to a close at the end of 2011. The purpose of this end-of-project evaluation is to catalog the successes and challenges of the MIBP and to assess its relevance and applicability for other countries in which Cordaid is involved in strengthening the health sector. Furthermore, this evaluation may be instrumental for the other implementing agencies (CSSC and Tragpi) in providing direction for a possible continuation of the MIBP in a next phase.

1.4 Methodology of the Evaluation

A variety of evaluation methods was applied. Primary data were collected from document review as well as from interviews with important stakeholders in the MIBP. Interview tools were developed, including an assessment framework of critical outcome indicators based on the project logical framework and the specific objectives of the TOR of this evaluation. This framework was used to assess the relative score of each hospital (Annex 3).

Eight hospitals were visited, all situated in the Lake Zone. Of those eight, seven hospitals participated in the benchmark project whereas one hospital participated in the P4P project only. Four of the seven hospitals participating in the MIBP took part in the P4P project simultaneously. Members of the hospital management teams as well as technical staff in charge of data recording and analysis were interviewed. Typically, the management team was interviewed on the purpose of the MIBP and the impact of the project on the routine data management of the hospital. If possible, a meeting was conducted with the full hospital management team. Findings and observations were validated through interviews with hospital staff, data clerks, heads of departments and IT staff. Next to interviews, hospital documents were scrutinized, such as minutes of hospital management meetings, annual reports and strategic plans. The recording of data was verified in selected wards and OPDs. The actual follow-up of the recommendations and suggestions made by the project team during their visits in 2009, 2010 and 2011 was verified. Last but not least, trends in the scores of the KPIs were assessed over the period 2007 to 2010.

In addition to hospital staff, representatives of Council Health Management Teams (CHMTs) and Regional Health Management Teams (RHMTs) were interviewed, as well as representatives of the CSSC head office and the CSSC Lake Zone office. Representatives of the Ministry of Health and Social Welfare (MOHSW) were interviewed, such as the Health Sector Resource Secretariat (HSRS), the Hospital Reform Secretariat and the Health Management Information System (HMIS) department. In addition, various more general documents were reviewed, such as project reports, the logical framework, hospital visit reports and quarterly, annual and financial reports. An overview of all persons interviewed and all documents consulted is attached in Annex 4 and Annex 9 respectively.
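
The assessment framework rates each outcome indicator per hospital on a 0/1/2 scale (no change, moderate improvement, marked improvement), sums the ratings into a total score and converts totals into a relative position; judging from Tables 1-4, hospitals with equal totals share a position. The sketch below illustrates this aggregation with hypothetical hospitals and ratings; it is not the project's actual tooling.

    # Minimal sketch of the 0/1/2 scoring aggregation used in Annex 3 and
    # Tables 1-4; hospital names and ratings below are hypothetical.

    def total_score(indicator_scores):
        """Sum the 0/1/2 ratings for one hospital, skipping N/A entries."""
        return sum(s for s in indicator_scores if s is not None)

    def relative_positions(totals):
        """Rank hospitals by total score; equal totals share a position."""
        distinct = sorted(set(totals.values()), reverse=True)
        return {h: distinct.index(t) + 1 for h, t in totals.items()}

    hospitals = {
        "Hospital A": [2, 2, 2, 2, 2, 2],
        "Hospital B": [2, 1, 1, 2, 2, 2],
        "Hospital C": [2, 1, 1, None, 2, 2],  # None = indicator not applicable
        "Hospital D": [1, 1, 0, 0, 1, 1],
    }
    totals = {h: total_score(s) for h, s in hospitals.items()}
    print(totals)                      # {'Hospital A': 12, 'Hospital B': 10, 'Hospital C': 8, 'Hospital D': 4}
    print(relative_positions(totals))  # {'Hospital A': 1, 'Hospital B': 2, 'Hospital C': 3, 'Hospital D': 4}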

2. Project Context

2.1 Tanzania Country Profile

The United Republic of Tanzania is the largest country in East Africa and is classified as one of the least developed countries in the world. It covers an area of 945,087 sq km. Average national income was USD 500 per capita (2010). Rural poverty rates in districts vary from below 20% to above 50%; almost 90% of the poor live in rural areas. Life expectancy at birth is estimated at 57 years for men and 59 years for women.¹ Mainland Tanzania is divided into 26 administrative regions, 113 districts and about 10,342 villages.

2.2 The Tanzanian Health System

As a result of the decentralization policy, both the MOHSW and the Prime Minister's Office for Regional Administration and Local Government (PMO-RALG) are jointly accountable for health service delivery. Whereas the MOHSW is responsible for sector policy development, the regulatory framework, M&E and quality of care, the PMO-RALG is responsible for actual service delivery at the local level through the Local Government Authorities (LGAs). The LGA has full administrative, executive as well as legislative powers. Annually it receives Central Government grants to plan and finance health services through the Comprehensive Council Health Plan (CCHP). The CHMT of the LGA is responsible for the planning, management and delivery of health services up to and including district hospital services. The CHMT consists of a core team of 7 health professionals, headed by the District Medical Officer (DMO). At the regional level, the MOHSW and the PMO-RALG have maintained the intermediate administrative level of the RHMT under the Regional Secretariat (RS). The Regional Medical Officer (RMO) leads the RHMT. The RHMT is mandated to apply national policies and to provide technical advice to LGAs. The RHMT performs a regulatory, monitoring and evaluation function on behalf of the MOHSW.

Primary health care services form the basis of the pyramidal structure of health care services, with dispensaries, health centers and at least one hospital at the district level. Currently there are 4,679 dispensaries and 481 health centers throughout the country. About 90% of the population lives within five km of a primary health facility. There are 55 district hospitals owned by Government and 13 District Designated Hospitals (DDH), owned by various Faith Based Organizations (FBOs). Furthermore, there are 86 other hospitals at first referral level (owned by Government and the private sector). There are 18 regional hospitals, functioning as referral hospitals for district hospitals, and 8 consultancy and specialized hospitals in the country.

Tanzania's health context has a pluralistic outlook, with both public and non-public actors involved in health service delivery. It is acknowledged that the complementary strengths of different health service providers are important to realize equitable access to health services of adequate quality. Similarly, it is acknowledged that the regulation, harmonization and coordination of all these various actors involved in health care provision is of paramount importance. FBOs provide approximately half of the health care services in Tanzania.

¹ United Nations World Development Report, 2010

2.3 Relevant Health Sector Policies and Programmes

In Tanzania, a coherent system of Government policies, strategies and programmes gives direction to health sector development. Consistency between general and sector policies is increasing. The health sector reform strategy aims at improving the accessibility and quality of health services and improving health outcomes through decentralisation and deconcentration of government, in order to achieve greater responsiveness and enhanced accountability.

2.3.1 National Health Policy

The MOHSW formulated a new National Health Policy in 2007. The vision of the Government is to have a healthy society, with improved social well-being, that will contribute effectively to personal development and the nation at large. The mission is to provide basic health services in accordance with geographical conditions, which are of acceptable standards, affordable and sustainable. The health services will focus on those most at risk and will satisfy the needs of the citizens in order to increase the lifespan of all Tanzanians. Specifically, the Government wants: (i) to reduce morbidity and mortality in order to increase the lifespan of all Tanzanians by providing quality health care; (ii) to ensure that basic health services are available and accessible; (iii) to prevent and control communicable and non-communicable diseases; (iv) to sensitize the citizens about preventable diseases; (v) to create awareness among individual citizens of their responsibility for their own health and the health of their families; (vi) to improve partnership between the public sector, private sector, religious institutions, civil society and community in the provision of health services; (vii) to plan, train, and increase the number of competent health staff; (viii) to identify and maintain the infrastructure and medical equipment; and (ix) to review and evaluate health policy, guidelines, laws and standards for the provision of health services.

2.3.2 Health Sector Reform

Health Sector Reforms (HSR) started in 1994 and aim at the improvement of access, quality and efficiency of primary health (district level) services, as well as the strengthening and reorientation of secondary and tertiary service delivery in support of primary health care. The programme also aims at strengthening support services at the central level: the MOHSW, its agencies and training institutions. The HSR covers a wide range of dimensions: managerial reforms in decentralised health services; financial reforms relating to, for example, user charges, health insurance and community health funds; public/private mix reforms, e.g. encouragement of the private sector to complement public health services; organisational reforms, e.g. integration of vertical health programmes into the general health services; and health research reforms, such as the establishment of a health research users fund and the promotion of demand-oriented health research. At a later stage hospital reforms were added as an element of the reforms, because the quality of hospital services was not improving in line with the sector reforms. Coordination and involvement of major donors in the health sector is envisaged through the Sector Wide Approach (SWAp). The SWAp provides the mechanisms to manage relations between the Government and the Donor Partners (DPs) and all associated support modalities, also in relation to the HSR.

2.3.3 Health Sector Strategic Plan III

The Health Sector Strategic Plan 2009-2015 (HSSP III) maintains the emphasis on improved accessibility to district health services of good quality, but also with a view to the need for adequate referral services in secondary and tertiary hospitals. HSSP III contains eleven strategies on specific topics in health service delivery and management: district health services; referral hospital services; central level support; human resources for health; health care financing; public private partnership; maternal, newborn and child health; prevention and control of communicable and non-communicable diseases; emergency preparedness and response; social welfare and social protection; and monitoring, evaluation and research. The cross-cutting issues elaborate on the approach towards quality, equity, gender and governance.

2.3.4 Primary Health Care Service Development Programme

In 2007 the MOHSW developed the Primary Health Care Service Development Programme (Mpango wa Maendeleo ya Afya ya Msingi, MMAM) 2007-2017 to address the new Health Policy and the health-related MDGs. The objective of the MMAM programme is to accelerate the provision of primary health care services for all by 2012, with consolidation to be achieved during the remaining five years of the programme. The main areas of focus are strengthening the health systems through rehabilitation of infrastructure, human resource development, improving the referral system, increased health sector financing and improved provision of medicines, equipment and supplies. The programme will be implemented by the MOHSW in collaboration with other sectors including PMO-RALG, RSs, LGAs and Village Committees. The workforce will be increased by doubling the throughput of the existing training institutions, upgrading 4 schools for enrolled nurses, producing more health tutors and upgrading the skills of existing staff. Infrastructure development, through rehabilitation of existing health facilities, construction of new ones and improvement of outreach services, aims at having 8,107 primary health facilities, 62 district hospitals and 128 training institutions by the year 2012. The referral system will be strengthened by improving the information and communication system and transport.

2.3.5 Hospital Reform Programme

Although much of the success of the health system in Tanzania arises from its commitment to the primary health care approach, an essential role is dedicated to hospitals, both as the location of expertise to deal with complex conditions and as the source of technical expertise for lower levels of service provision. With the aim of ensuring quality, equity, efficiency, affordability and financial viability of referral and regional hospitals, a comprehensive and ambitious plan for hospital reform was laid out first in the Plan of Work (1999) and later in HSSP II (2003). The strategies included devolving management authority, broadening financing options and strengthening financial management, strengthening planning and management, increasing efficiency, improving infrastructure and concentrating on the provision of referral services. A five-year implementation plan, including a number of concrete steps, was developed. Guidelines for reforming regional and district hospitals were, however, only issued in 2005. The hospital reform agenda to improve performance remains a high priority for both the MOHSW and PMO-RALG and a key strategy in HSSP III. The continued emphasis on the hospital reform agenda is reflected in the strategic objectives formulated in HSSP III to address the challenges for hospital and referral care: (i) to increase access for patients
in need of advanced medical care; (ii) to improve the quality and performance of clinical services in hospitals; (iii) to improve management, institutional development, governance and accountability of the hospitals; and (iv) to strengthen hospital governance.

2.3.6 M&E Strengthening Initiative

In 2010/11, the MOHSW embarked on a new initiative to comprehensively streamline and update the MOHSW monitoring and evaluation system, including all elements of the national health information system at all levels of the health care system. The initiative is led by a consortium of funding and implementing partners under the SWAp. The consortium combines MOHSW leadership, ownership and sustainability with technical and financial support from funding and implementing partners. The aim of the consortium approach is to identify a regional technical support partner to provide additional technical and financial support to regional and council health management teams to implement all HMIS strengthening activities.

M&E is essential to measure the progress of health sector objectives against targets. Moreover, it can identify opportunities to refine strategies and accelerate progress by using data for decision making. In previous health sector plans, M&E was hampered by conflicting sets of indicators, inadequate data availability, poor data quality and a generalised absence of data use for planning and management at all levels. The HSSP III includes a new set of 43 national-level indicators and targets. At the local level, a sub-set of indicators will be needed for measuring performance across time, and between districts, regions and health institutions.

This initiative evolves from the HSSP III, which puts a renewed emphasis on monitoring and evaluation (M&E). It articulates a new vision to develop an M&E culture that ensures that quality data are used at all levels for the planning and management of health sector activities. Specific objectives in HSSP III with respect to M&E are: (i) to elevate the HMIS unit at the MOHSW to become a directorate; (ii) to develop new M&E policies and guidelines for increased harmonisation; (iii) to strengthen the routine HMIS to function effectively at all levels and provide timely, quality data used for planning, management and M&E; (iv) to improve data management, including ICT, and harmonise different subsystems and databases; and (v) to improve the rationalisation, harmonisation and coordination of surveys and research and ensure they complement the routine HMIS and fit in the M&E plan of the health sector.

Next to improving and harmonising the HMIS in general, the M&E initiative recognises that the existing HMIS tools do not necessarily support the requirements of (referral) hospitals. The need to support ICT capacity and information systems for hospital data management and evidence-based decisions is clearly articulated.

2.4 Situation Analysis Church Hospitals

The current financial position of many Church hospitals is troublesome and has weakened markedly over the last couple of years, bringing some hospitals to the state of near bankruptcy. The most prominent reasons have to do with decreasing and delayed levels of financial support from the Government, which has to be compensated from the hospitals' own resources. In particular the following causes are prominent: (i) reduced and delayed release of Government funds for hospital running costs (OC); (ii) delayed absorption of newly recruited medical professional staff into the Government payroll; (iii) high percentages of unavailability of drugs and medical equipment at the Government Medical Stores Department (MSD) and lastly; (iv) the newly enacted Government exemption policy which
prevents DDH hospitals from charging client fees for services rendered to approximately 80% of their clients. Other reasons contributing to the dire financial situation are both internal and external in nature: (i) increased numbers of professional medical staff terminating their employment in favor of alternative Government service with better terminal benefits and better promotion prospects; (ii) dwindling external donor support for church-owned hospitals; (iii) inadequate financial support or collateral from Dioceses; (iv) inadequate lobby and advocacy support and outcomes from important stakeholders such as Diocesan health offices, professional medical associations such as the Tanzania Christian Medical Association (TCMA), the Bishops and the CSSC.

3. Project Description

3.1 Partnership

The MIBP is implemented through a partnership of three autonomous agencies: CSSC, Cordaid and Tragpi. CSSC is an ecumenical body that was established in 1992 by the Christian Council of Tanzania (CCT) and the Tanzania Episcopal Conference (TEC). CSSC provides support on behalf of its members in the areas of primary education and health. CSSC has a head office in Dar es Salaam and 5 zonal offices providing support to its member organizations and their individual health institutions. CSSC operates within the overall policy framework of the health and education sectors of Tanzania, with a mission to support the delivery of social services through partnerships, lobbying and advocacy. CSSC represents 83 hospitals, approximately 68 health centers and more than 450 dispensaries. Among the 83 hospitals there are two consultant and teaching hospitals and 19 District Designated Hospitals. Tragpi is a Netherlands-based private company specialized in cost management and measuring quality in healthcare, most notably in general and specialized hospitals in Europe.

The tri-partite agreement between the respective partners is inspired by their mutual desire to improve health sector performance in Tanzania, most notably improvement in hospital performance. The partnership is governed by a demarcation of roles and inputs. CSSC hosts the MIBP on behalf of the beneficiary hospitals. Furthermore, CSSC provides support staff, facilities and logistical support at the central and zonal levels. Cordaid provides advisory and financial support and guidance to the implementation of the MIBP, also in the context of other support interventions such as P4P and the introduction of ICT in health facilities through projects such as AfyaPro, Care2X and Afya Matandeo. Cordaid covers all operational costs of the MIBP. Tragpi contributes the transfer of its specialized knowledge on hospital performance benchmarking. It co-finances the project on an almost equal share, covering all costs related to the involvement, travel and lodging of its own consultants.

3.2 Project Management & Set-up

The MIBP employs a part-time coordinator (0.7 FTE) at CSSC-HQ in Dar es Salaam. The project coordinator falls under CSSC's competence centre and is assisted by support staff such as a secretary and an accountant. The project coordinator is accountable to the coordinator of the competence centre and communicates and reports on all administrative and financial issues with Cordaid and Tragpi, respectively. The implementation of the MIBP is supported by the CSSC zonal offices in Mwanza and Arusha respectively, mainly for coordination, facilitation and participation in hospital support visits. A team of Tragpi consultants operates from the Netherlands on a part-time basis. A focal person within Tragpi is responsible for communication with the project coordinator at CSSC. One Tragpi consultant, based in Moshi, was assigned to the MIBP on a semi-permanent basis from June 2010.

3.3 Project Area & Beneficiaries

To date, the MIBP is implemented in 17 selected Church hospitals. After preparatory work (2007), the first batch of 9 hospitals was enrolled in the Lake Zone in 2008. A second batch of 8 hospitals was added in 2010 in the Northern Zone. All hospitals serve as District
Designated Hospitals (DDH) and are governed by a service agreement with the Government of Tanzania. All hospitals currently enrolled in the MIBP are privately owned by various Dioceses of the Catholic Church, the Evangelical Lutheran Church in Tanzania (ELCT) and the Anglican Church respectively. They provide general curative, preventive and outreach health care services to an average of approximately 350,000 people each. Most hospitals provide regular specialist medical treatment such as specialized surgery, gynecology, orthopedics, eye and pediatric services. All hospitals operate in-patient as well as out-patient departments. Registered inpatient bed capacity ranges from 175 to 224 beds, but actual bed capacity is significantly higher and can sometimes be as much as 40% above the officially registered bed capacity. All hospitals operate as referral centers for first-line health facilities in their respective districts and even beyond. The average total number of staff employed per hospital is about 200. Average bed occupancy based on registered beds varies per hospital and per season but ranges from 80 to 120 percent.

3.4 Main Project Activities

The MIBP employed two main activities: (i) the introduction and application of a benchmark tool and methodology and; (ii) the follow-up of participating hospitals through support visits. A comprehensive description of both activities is presented in paragraphs 3.4.1 and 3.4.2 respectively.

3.4.1 Performance Benchmarking

One of the main activities carried out by the MIBP during the second year (2008) was the design of a benchmark instrument and methodology. The design phase was carried out most notably by consultants from Tragpi with support from CSSC and involved: (i) evaluation of international standards and best practices and relating these to the Tanzanian health sector and hospital context and requirements; (ii) defining Key Performance Indicators (KPIs) and designing the benchmark methodology; (iii) setting up data templates and; (iv) sensitization and advocacy for the project in the project area.

After evaluation of international standards and best practices, a list of about 200 performance indicators was proposed which was, over time, condensed to 10 KPIs in a participatory manner involving the management of the hospitals. Next to the 10 KPIs applicable to all hospitals, a set of 5 additional performance indicators was agreed upon, variable for each individual hospital and matching their specific circumstances or demands. KPIs indicate levels of hospital performance with respect to efficiency, quality of care and staffing. KPIs for efficiency are: (i) number of OPD clients per clinician; (ii) occupancy rate of actual beds; (iii) average length of stay (specified for the maternity ward and pediatric ward); (iv) major surgical procedures per clinician and; (v) minor surgical procedures per clinician. KPIs for quality of care and performance are: (i) ratio of staff per actual bed; (ii) radiology, ultrasound and laboratory tests per day and; (iii) ratio of positive malaria tests to total malaria cases. KPIs for staffing are: (i) ratio of total new staff to total staff and; (ii) ratio of leaving staff to total staff (Annex 5).

The KPIs formed an integral element of the newly designed computerized template (Excel), which was consecutively refined and improved over the years and is now in its 4th edition. The template consists of the following data formats and sections: (i) to-do list; (ii) general data
of the health facility; (iii) production figures; (iv) personnel; (v) finances; (vi) data KPI graphs; (vii) KPIs; (viii) financial results; (ix) definitions; (x) dashboard; (xi) annual report. Until 2010, this template was updated by the hospitals on an annual basis and verified and corrected by the MIBP project team in consecutive support supervision visits. Consolidated reports formed the basis for defining average performance benchmarks and for comparing the performance of respective departments within a hospital as well as the performance of hospitals across the Zones. Comparison of the performance of individual hospital departments was done during hospital support visits, whereas comparison between hospitals across the two Zones was done in annual zonal feedback meetings attended by all hospitals from the respective zones. Since 2011, the use of the benchmark template has shifted from an annual reporting format to an internal monitoring tool informing the hospital management on performance and progress made against the respective KPI benchmarks. The template is to assist the hospital management in formulating an appropriate management response to the signaled challenges for better performance.

The yearly zonal feedback meeting had the purpose of creating a technical platform to discuss common challenges with respect to hospital management. Specific objectives of the zonal feedback meetings were: (i) to discuss progress and management challenges; (ii) to share knowledge and experiences in data management and its impact on result-oriented management and hospital performance; (iii) to facilitate peer reviews and, in general; (iv) to generate a culture of mutual learning between participating hospitals. Central to the zonal meeting was the presentation of the hospital benchmark results of the previous year, based on the KPIs and their respective benchmarks. The benchmark process was elaborated upon, each KPI was explained, results were presented and related challenges extensively discussed. Next to plenary sessions, one-to-one sessions were conducted with individual hospitals to discuss hospital-specific issues and management measures. The zonal benchmark meetings were attended by the hospital management teams of the participating hospitals and, if available, representatives from the MOHSW. Adult learning methods were applied as much as possible. Next to discussing benchmark results, the first meetings in each zone were also used to discuss and mutually agree on the KPIs and to introduce the monthly benchmark template. So far, three zonal feedback meetings have been conducted in the Lake Zone (2009, 2010 and 2011) and one in the Northern Zone (2011).
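
Several of the efficiency KPIs listed above are standard hospital statistics derived from routine production figures. The sketch below shows plausible formulas for three of them, applied to hypothetical monthly figures; the project's actual template is an Excel workbook and its exact indicator definitions (Annex 5) may differ.

    # Plausible formulas for three of the efficiency KPIs, applied to
    # hypothetical monthly figures; field names are illustrative only.

    def bed_occupancy_rate(inpatient_days, actual_beds, days_in_period):
        """Occupancy rate of actual beds, as a percentage."""
        return 100.0 * inpatient_days / (actual_beds * days_in_period)

    def average_length_of_stay(inpatient_days, discharges):
        """Average length of stay (ALOS) in days."""
        return inpatient_days / discharges

    def opd_clients_per_clinician(opd_visits, clinicians):
        """OPD clients seen per clinician in the period."""
        return opd_visits / clinicians

    month = {"inpatient_days": 5600, "actual_beds": 210, "days": 30,
             "discharges": 1400, "opd_visits": 6300, "clinicians": 9}

    print(f"{bed_occupancy_rate(month['inpatient_days'], month['actual_beds'], month['days']):.0f}% occupancy")
    print(f"{average_length_of_stay(month['inpatient_days'], month['discharges']):.1f} days ALOS")
    print(f"{opd_clients_per_clinician(month['opd_visits'], month['clinicians']):.0f} OPD clients per clinician")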

3.4.2 Hospital Support Visits

The second main intervention employed by the MIBP to enhance data management was support visits to participating hospitals. Whereas it was planned to conduct two visits per hospital per year, due to limited resources an average of one visit per hospital per year was realized. The main objectives of the support visits were: (i) to verify the data provided by the hospital; (ii) to instill understanding, ownership and commitment with respect to improved data management and performance benchmarking and; (iii) to agree on specific and concrete measures to improve data management and hospital performance and to mentor the HMT in achieving these. In the end, the purpose of the visits was to arrive at a complete and consistent data set for internal and external performance benchmarking. Hospital visits were
conducted by a team consisting of the MIBP project coordinator, 1 or 2 consultants from Tragpi and the CSSC zonal coordinator. Although the initial support visits (2007/8) took 4 days, more recent visits usually cover only 2 days. Typically, the first day of a visit is spent with the responsible hospital officers (e.g. data clerks, medical record departments) validating the data submitted to the project coordinator prior to the visit for accuracy, completeness, reliability and verifiability. Usually, incomplete reports are updated with data from the Health Management Information System (HMIS) books, annual reports, ward admission books and medical recording forms (patient files). Especially during the 1st round of hospital visits, templates of routine registration books and forms were modified where necessary. The second day of the visit was usually spent sensitizing the hospital staff to the use of data and the purpose of benchmarking by presenting the hospital benchmark results. These meetings were usually attended by the hospital management, heads of departments, medical doctors, clinical officers, and data recorders and clerks. Representative issues discussed at these meetings were: (i) the need for data management and the current practices; (ii) responsibility for and ownership of data management; (iii) measures for improved data management; (iv) the relation between data management and strategic management decisions. The support visit is usually completed with a meeting with the hospital management to agree on the way forward, and in particular on measures to improve data management for improved hospital performance. A detailed visit report documenting the most important findings and suggestions for immediate action is prepared by the project team and sent to the hospital for recording and follow-up.

4. Assessment Data Management

The first objective of the MIBP, for the purpose of enhancing hospital performance, was to improve the data management of the participating hospitals. This chapter provides the evaluation results based on a framework of outcome indicators derived from the MIBP logical framework. The MIBP differentiates between 3 levels of data management, for which the evaluation findings are consecutively described in the following paragraphs: data recording and registration (4.1); the organization of data management (4.2) and lastly; data analysis (4.3). A comprehensive overview of evaluation results and the relative score on outcome indicators per hospital is provided in Annex 3. The extent to which management is using data for informed management decisions is discussed in chapter 5.

4.1 Data Recording & Registration

Improvement in staff consciousness of the importance of data management is seen across all hospitals visited. This is most notably the case with the management and the heads of departments, less so with regular staff at wards and departments. Both management staff and lower staff cadres in hospitals participating in the P4P project have a higher understanding of the concept of and need for data management compared to their peers in hospitals which benefitted only from the MIBP. Obvious variation is seen in the levels of understanding of the need for and importance of proper data management among management staff members within and between hospitals (Table 1).
Table 1: Relative Score of Hospitals² on Data Consciousness, Recording & Registration

                                                 S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
P4P Project                                      No    No    No    No    Yes   Yes   Yes   Yes
Benchmark Project                                Yes   Yes   Yes   Yes   Yes   Yes   Yes   No

Consciousness of the importance of data management
a. Hospital Management Team                      1     2     1     1     2     2     2     2
b. Heads of Departments                          1     1     1     1     2     1     2     1
c. Nurses, Clinicians                            0     0     0     0     2     1     1     1
Sub-total                                        2     3     2     2     6     4     5     4

Data Recording & Registration
a. Data quality & consistency, Admission Books   0     0     0     1     2     2     2     2
b. Data quality & consistency, OPD Books         1     1     0     1     2     2     2     2
c. Quality & consistency of HMIS Reports         1     1     0     1     2     2     1     2
Sub-total                                        2     2     0     3     6     6     5     6

Total Score                                      4     5     2     5     12    10    10    10
Relative Position                                4     3     5     3     1     2     2     2

Key: 0 = No Change; 1 = Moderate Improvement; 2 = Marked Improvement.
Evaluation criteria are outcome indicators, i.e. results of the MIBP expected within the project period.

Overall, data recording and registration have improved, resulting in better quality and fewer inconsistencies. However, improvement in hospitals participating only in the MIBP is still moderate, with significant divergences between wards and departments within and between hospitals. Whereas a moderate improvement can be seen in the recording of OPD books and the general HMIS reports, the recording of ward admission books remains a challenge and there is still considerable room for improvement. In some hospitals this is done by student nurses with unsatisfactory supervision and follow-up by the respective heads of departments. A marked and more sustained progress in data recording and registration is seen in hospitals that also benefitted from the P4P project. Improvements in these hospitals are also more consistent across the various wards, OPDs and registration departments. Recording in ward admission books, OPD registers etc. in most hospitals participating in P4P was very good and in some instances even spotless. Improved data recording in these hospitals is also confirmed by the respective CHMTs, who were generally appreciative of the improved HMIS records.
² Hospital abbreviations (Tables 1-6): Sumve (S've), Sengerema (S'ma), Murgwanza (M'za), Nyakahanga (N'ga), Biharamulo (B'lo), Rubia (R'ia), Mugana (M'na), Kagondo (K'do).


Across all hospitals, management has intensified the supervision of data recording and registration. Clinicians are more regularly reminded and checked on keeping their OPD registers. Wards and departments are reminded in daily clinical meetings of the importance of data recording, of updating admission books and of including and verifying nightly admissions. Medical record departments are checked more regularly.

4.2 Organization of Data Management

The manner in which data registration and collection is organized and structured shows a marked difference between hospitals having participated in P4P and hospitals participating only in the MIBP. Whereas progress in the first category of hospitals is robust, progress in the second category is just nominal (Table 2). All hospitals but one have a data clerk responsible for overseeing, collecting, cleaning and consolidating data from all departments and preparing consolidated monthly and quarterly HMIS reports. However, hospitals having participated in P4P employ data clerks with a more specific Terms of Reference (TOR), working more exclusively on data management and with more experience. In these hospitals a data clerk visits each department for data verification and cleaning on average once every week. In hospitals benefitting from the MIBP only, data clerks have usually been allocated these tasks next to other routine responsibilities they already fulfill, and in some instances this clearly affects the intensity and outcome of the data collection and validation. In all hospitals minor modifications to data collection formats have been made, making them more complete, easier to summarize and less prone to mistakes. A marked difference between hospitals participating in P4P and hospitals benefitting only from the MIBP is the establishment of a special data task team in all P4P hospitals. The membership of this team differs per hospital but usually comprises the MO i/c, hospital secretary, matron, data clerks and heads of departments. Typically, the team meets every week to assess and correct the completeness and quality of ward admission books, OPD registers, summary sheets and other hospital records. It has a clear effect on the attitude of key hospital staff towards the need for and importance of data recording, and it has proved to strengthen the quality and consistency of the data recording.
Table 2: Relative Score of Hospitals on Data Management & Organization

                                                      S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
P4P Project                                           No    No    No    No    Yes   Yes   Yes   Yes
Benchmark Project                                     Yes   Yes   Yes   Yes   Yes   Yes   Yes   No

Data Management & Organization
A. Use of Data Recorder with TOR                      0     1     1     1     2     2     2     2
B. Professional Capacity Data Recorder                0     0     0     1     2     2     1     0
C. Improved Data Recording Formats                    1     1     1     1     2     2     2     2
D. Improved Data Recording, Collection & Supervision  1     1     0     1     2     2     1     1
E. Establishment Data Verification Task Force         0     0     0     0     2     2     2     2
F. Improved Medical Record Department                 0     0     0     0     2     2     1     0
G. Improved IT Application in Data Recording          0     1     1     0     2     1     0     1

Total Score                                           2     4     3     4     14    13    9     8
Relative Position                                     7     5     6     5     1     2     3     4

Key: 0 = No Change; 1 = Moderate Improvement; 2 = Marked Improvement.
Evaluation criteria are outcome indicators, i.e. results of the MIBP expected within the project period.

Overall, the professional qualification and competence of staff at medical record departments remain a challenge. Some hospitals stand out in reorganizing their registry departments, with specially designated rooms for new admissions and re-admissions, the introduction of new patient files (as opposed to cards) and the professional upgrading of staff. Again, this is more pronounced in hospitals having participated in P4P.

The application of IT in routine data recording and registration remains a challenge for all hospitals. Apart from operating specific data recording software applications for vertical disease programmes such as the National AIDS Control Programme (NACP), routine data recording and registration is done manually. Only one hospital (Murgwanza) operates a specifically designed electronic database for in-patients and out-patients based on HMIS requirements. In principle this database is useful; however, it appears to be obsolete, regularly defunct and inadequately used by the management. Some hospitals employ an IT specialist for more general computer backup support. A majority of hospitals have expressed an interest in the introduction of AfyaPro, but to date implementation has not started.

4.3 Data Analysis

Despite moderate to robust gains in data recording and collection, the compilation of data by means of updating the MIBP template at a regular (monthly) interval and presenting the summary overviews (e.g. KPIs) to the hospital management is only done in 3 hospitals, all of which also participated in P4P. Updating is done either by the IT staff (Biharamulo), the data collector (Rubia) or the accountant (Mugana). All other hospitals fail to update the monthly benchmark template, because they lack a competent or designated person responsible for this task. In all instances, difficulties in working with the benchmark template were cited as the most important reason for failing to produce the monthly update. This seemed to correlate with the degree of computer literacy of the staff concerned, although similar technical problems with the template were mentioned across most hospitals. In some hospitals (Murgwanza, Nyakahanga) it was not clear whether the MIBP template was in use at all (Table 3).
Table 3: Relative Score of Hospitals on Data Analysis
(Evaluation criteria: outcome indicators, i.e. results of the MIBP expected within the project period)

                                             S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
P4P project                                   No    No    No    No    Yes   Yes   Yes   Yes
Benchmark project                             Yes   Yes   Yes   Yes   Yes   Yes   Yes   No
a  Professional competence of data analyst     0     0     0     0     2     1     1     0
b  Monthly update of BM template               0     0     0     0     2     2     2    N/A
c  Competence to operate BM template           0     0     0     0     2     2     2    N/A
d  Preparing monthly consolidated reports      0     0     0     0     2     2     2     0
Total score                                    0     0     0     0     8     7     7     0
Relative position                              3     3     3     3     1     2     2     3

Key: 0 = no change; 1 = moderate improvement; 2 = marked improvement.

The 3 hospitals that do keep up with the monthly data recording in the MIBP template produce and submit monthly or quarterly overview reports for the hospital management using the various features of the template. However, an actual scrutiny of hospital performance based on trend analysis of KPIs and their underlying factors is not done; this seems clearly beyond the capacity of the (support) staff producing these reports.
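The trend scrutiny the evaluation finds missing need not be sophisticated: even a rule that flags a KPI whose latest monthly value departs notably from its recent average would give management a starting point for discussion. A minimal sketch under that assumption; the field names, figures and 10% threshold are illustrative and do not reflect the actual MIBP template layout:

```python
# Sketch: flag KPIs whose latest monthly value deviates notably from their
# running average, the kind of trend check a monthly template could support.
# Field names, figures and the 10% threshold are illustrative only.
monthly_kpis = {
    "bed_occupancy_rate": [0.78, 0.81, 0.75, 0.92],
    "average_length_of_stay": [5.2, 5.0, 5.1, 6.4],
    "outpatients_per_clinician": [310, 295, 320, 305],
}

for kpi, series in monthly_kpis.items():
    history, latest = series[:-1], series[-1]
    baseline = sum(history) / len(history)   # average of earlier months
    change = (latest - baseline) / baseline  # relative deviation
    flag = "REVIEW" if abs(change) > 0.10 else "ok"
    print(f"{kpi:28s} baseline={baseline:7.2f} latest={latest:7.2f} "
          f"change={change:+.1%} {flag}")
```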


The second main objective of the MIBP was to introduce a performance benchmarking methodology and to strengthen the hospital management in its regular use and application for the purpose of improving hospital performance. This chapter provides the evaluation results based on a framework of effect and impact indicators derived from the MIBP logical framework. Evaluation findings are described consecutively in the paragraphs on the benchmark methodology (5.1), its adoption and usage by the hospital management (5.2) and, finally, in an assessment of the actual performance improvement of the hospitals (5.3). A comprehensive overview of the evaluation results and the relative scores on effect and impact indicators per hospital is provided in Annex 3.

Overall, there is clear commitment of the management of all hospitals to the objectives and the purpose of the MIBP. There is mutual understanding of and appreciation for the MIBP's relevance and usefulness in supporting and improving hospital performance. This is usually more robust among management team members who have been part of the MIBP from the start and who attended the zonal feedback meetings. Some hospitals changed senior management positions during the course of the MIBP, and this has sometimes affected levels of understanding of and commitment to its implementation. The extent of commitment to and understanding of the objectives of the MIBP at the level of heads of departments, however, differs between hospitals that participated in P4P and those participating only in the MIBP. In the latter category, virtually none of the heads of departments are aware of or able to explain the objectives and purpose of the MIBP, whereas in the P4P hospitals the commitment and understanding of this staff cadre matched that of the hospital management staff (Table 4).
Table 4: Relative Score of Hospitals on Understanding of & Commitment to Hospital Benchmarking
(Evaluation criteria: effect indicators, i.e. effects expected as the result of realizing the project outcomes)

                                              S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
P4P project                                    No    No    No    No    Yes   Yes   Yes   Yes
Benchmark project                              Yes   Yes   Yes   Yes   Yes   Yes   Yes   No
a  Commitment of hospital management            1     2     1     1     2     2     2    N/A
b  Commitment of hospital staff (HoDs, MOs)     0     0     0     1     2     2     1    N/A
c  Hospital peer reviews, exchange visits       0     1     0     2     0     1     0     1
Total score                                     1     3     1     4     4     5     3     1
Relative position                               4     3     4     2     2     1     3     4

Key: 0 = no change; 1 = moderate improvement; 2 = marked improvement.

Appreciation of the zonal feedback meetings by the hospital management staff is very high. The zonal meetings have encouraged managers to reflect critically on relevant management issues in order to improve performance in critical areas. The fact that hospitals were compared on KPIs was appreciated as a useful, innovative and creative way to understand one's own hospital performance and its contributing factors. The zonal meetings were also found instrumental for the realization that hospital data in general were unreliable, incomplete, inconsistent and often cooked up. The selection of KPIs in a participatory manner was appreciated and found useful. Apart from the appreciated exchange of ideas between hospitals during the zonal feedback meetings, few exchange visits have taken place between the hospitals as a result of the MIBP. Whereas most hospitals planned to visit peer hospitals, this generally didn't materialize. A good exception is Nyakahanga DDH, where senior staff visited Nyakaiga hospital to learn from the P4P project. As a result, Nyakahanga DDH, encouraged by the MIBP, started its own P4P scheme (July 2011).

There is no evidence that the regular use of the benchmark methodology has been established and internalized by the hospitals. On the contrary, the BM template is not updated in the majority of hospitals, nor is it used for regular review of KPIs. The only exceptions to this general picture are Biharamulo DDH and, to a somewhat lesser extent, Rubia DDH. Here, the management is presented with regular monthly or quarterly updates of KPIs, providing a basis for more result-oriented management decisions towards improving hospital performance (e.g. more frequent and accurate staff allocations based on actual bed occupancy rates). Biharamulo DDH uses the information from the BM template for hospital board meetings as well (Table 5).
Table 5: Relative Score of Hospitals on Use of the BM Template for Management Planning, Reporting & Forecasting
(Evaluation criteria: effect indicators, i.e. effects expected as the result of realizing the project outputs)

                                                                 S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
P4P project                                                       No    No    No    No    Yes   Yes   Yes   Yes
Benchmark project                                                 Yes   Yes   Yes   Yes   Yes   Yes   Yes   No
a  Routine monthly interpretation of data/KPIs                     0     0     0     0     2     1     0    N/A
b  Result-oriented management decisions based on analysis of KPIs 0     0     0     0     2     1     0    N/A
c  Result-oriented hospital board decisions based on KPIs          0     0     0     0     2     0     0    N/A
d  Improved hospital operational plans, budgets & work planning    0     0     0     0     0     0     0     0
e  Improved internal resource allocation (staff, finances)         0     0     0     0     1     0     0     0
f  Improved annual report (2007/8 vs 2010)                         0     0     0     0     0     0     0     0
g  Application & analysis of KPIs in strategic forecasting         0     0     0     0     0     0     0    N/A
h  Overall internalization of BM methodology                       0     0     0     0     1     0     0    N/A
Total score                                                        0     0     0     0     8     2     0     0
Relative position                                                  3     3     3     3     1     2     3     3

Key: 0 = no change; 1 = moderate improvement; 2 = marked improvement.
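The staff allocation example mentioned above can be made concrete: with ward-level bed occupancy available from the template, a nursing pool can be distributed in proportion to actual occupancy rather than by fixed establishment. A stylized sketch of that kind of rule; the ward names, pool size and occupancy figures are purely illustrative and not drawn from any of the evaluated hospitals:

```python
# Stylized illustration: allocate a nursing pool across wards in proportion
# to each ward's bed occupancy, the kind of data-driven staffing decision
# described above. All figures are illustrative only.
nurse_pool = 40
ward_occupancy = {
    "medical": 0.95,
    "pediatric": 0.70,
    "maternity": 0.85,
    "surgical": 0.50,
}

total = sum(ward_occupancy.values())
# Naive rounding; a real rule would reconcile the rounded total with the pool.
allocation = {w: round(nurse_pool * occ / total) for w, occ in ward_occupancy.items()}
print(allocation)  # here: {'medical': 13, 'pediatric': 9, 'maternity': 11, 'surgical': 7}
```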

The effect of the MIBP on the extent to which hospital management routinely uses information for a more result-oriented decision-making process to improve performance has been very limited. At a more conceptual level, this can also be observed in the content and quality of annual reports and strategic business plans, which haven't significantly changed over the last few years.

The purpose of the MIBP is to improve hospital performance, measured by the overall improvement of productivity and by the improvement of KPIs against benchmarks over the project period. An overall improvement in production levels cannot be observed. Trends in the productivity of the selected hospitals, in terms of the total number of out-patients (OPD), major operations, in-patient admissions and in-patient days over the years 2007/8 to 2010, are not very consistent, with major fluctuations both upward and downward (Annex 7). Changes in production levels seem to have more to do with contextual variables. For example, the increased number of OPD visits and major operations at Mugana DDH is best explained by the increasing number of disenchanted clients from Bukoba Regional Hospital, while decreasing trends in productivity are best explained by the increasingly stressful position of most hospitals. Therefore, any attribution of changed productivity levels to the MIBP should at this point be assessed as very limited. This is substantiated by the fact that, apart from one hospital (Biharamulo DDH), none of the other hospitals have internalized the benchmark methodology to date. In addition, the evaluation didn't find convincing evidence that specific management interventions resulted in improved performance (Table 6).
Table 6: Relative Score of Hospitals on Improved Service Delivery
(Evaluation criteria: impact indicators, i.e. fundamental & sustainable improvement. K'do participated in P4P only; all other hospitals participated in the benchmark project.)

                                          Benchmark hospitals (S've, S'ma, M'za, N'ga, B'lo, R'ia, M'na)   K'do
A  Improved performance (2008-2011)       Mixed, not consistent                                             N/A
B  Improved trends in KPIs (2008-2011)    No objective basis to assess realistically                        N/A
Total score                               0                                                                 0

Key: 0 = no change; 1 = moderate improvement; 2 = marked improvement.

As with the assessment of hospital production figures, establishing improved hospital performance based on trend analysis of KPIs is ambiguous. To date, the MIBP has established a 3-year baseline for KPIs for the Lake Zone (2008-2010) and a 2-year baseline for the Northern Zone (2009-2010). An example of the baseline series for Average Length of Stay and Bed Occupancy Rate is depicted in Diagrams 1 and 2. An overview of all KPIs for the Lake Zone hospitals in a time series 2008-2010 is provided in Annex 6.
Diagram 1: Average Length of Stay, 3-year series per hospital (2008-2010)
Diagram 2: Bed Occupancy Rate, 3-year series per hospital (2008-2010)
[Charts]
Source: Feedback Meeting Lake Zone (2011)

Overall, trends in KPIs over time are mixed. As little evidence was found that hospitals have internalized the benchmark methodology to the extent that management is using data for rational management decisions, any attribution of changed KPI values to the MIBP is difficult to establish. A further difficulty in providing a clear and unambiguous assessment is that the quality of the data used for compiling the original benchmarks and KPI values is doubtful: some data are incomplete and inconsistent, some even missing. In effect, the value of the baseline benchmarking (2008) and the consecutive benchmarking (2009, 2010) should be treated with some reservation. Rather than indicating improved hospital performance, different KPI scores over time most probably point to a more accurate reflection of existing performance, most likely the result of the gradually enhanced data recording and collection practices of the hospitals under the MIBP.


This chapter summarizes the main conclusions and provides recommendations in view of the achievements, constraints and important contextual variables. Conclusions and observations are grouped under the main evaluation criteria, each presented as a separate paragraph, as follows: relevance (6.1); impact (6.2); effectiveness (6.3); efficiency (6.4); sustainability (6.5); and recommendations (6.6).

The MIBP is in line with the national agenda to improve hospital performance, which is a high priority of the MOHSW and a key strategy in the HSSP III. The MIBP clearly matches the aspirations of the MOHSW with respect to its overall strategic directions as well as its specific policies and programmes to support hospital services. Likewise, the relevance of the MIBP for CSSC is considerable, as it offers additional and complementary support to strengthen and capacitate CSSC and its members in realizing improved hospital management and performance.

The relevance of the MIBP in terms of the extent to which its objectives, activities and outputs are suited to the priorities and policies of the recipient hospitals is considerable. Obviously, hospitals have a stake in improving their overall performance, and the prime aim of the MIBP is to assist them in attaining this through improving their general data management capacity and the introduction of an innovative benchmarking methodology. However, the MIBP captures neither the full extent of the underlying factors contributing to weak hospital performance and management, nor the full capacity support agenda needed to strengthen the management to bring about fundamental and lasting change. A more multi-dimensional approach, also in the context of other CSSC support projects, would have considerably improved the relevance of the MIBP.

The relevance of the MIBP in introducing and piloting an innovative hospital performance benchmark methodology, and in using its findings to improve, contextualize and upscale the methodology with respect to national health system development, is obvious. However, so far this aspect of the MIBP has received little attention. The MIBP is implemented in relative isolation from the MOHSW administrative structures and systems at the district (CHMT) and regional (RHMT) levels, whereas it could potentially support these levels in the execution of their supervisory and advisory tasks towards the hospitals.

The extent to which the MIBP has realized its objectives to (i) improve data management and (ii) improve hospital performance is limited. The MIBP, together with the P4P, had a clear positive impact on the appreciation and consciousness of hospital management of the importance of data management for rational decision making. Some elements of data management, such as data recording and the organization of data management, have improved, although to varying degrees across hospitals. Improvement in other aspects of data management, such as data analysis and the use of data for informed management decisions, has generally not been achieved. The overall impact of the MIBP on improved hospital performance has not been realized at this point in time and probably requires a sustained and more intensive approach.


The MIBP generated increased consciousness about the importance of data management among CSSC project staff and CSSC zonal coordinators, and there is mutual appreciation of the potential of the benchmarking methodology. The appointed CSSC project staff clearly benefitted from the knowledge transfer by Tragpi and should be considered sufficiently competent to manage the project. There have been no interventions under the MIBP to support national health system development, and the MIBP is implemented in isolation from mainstream developments to improve the M&E system of the MOHSW. To date there have been no systematic efforts to include the MOHSW at its various levels in the implementation, follow-up or evaluation of the MIBP. Only recently has the MOHSW at the national level become aware of the project and its potential usefulness. The MOHSW has expressed interest in extending the pilot to include 5 government hospitals in Tanga Region in order to assess its relevance and appropriateness.

One of the major factors behind the incomplete achievement of the objectives of the MIBP appears to be the lack of sufficient follow-up of individual hospitals. Supportive visits were carried out just once a year, and this appears simply insufficient to bring about change in hospital systems and staff attitudes, given the complex and constrained context of the hospitals. This explanation is underpinned by the finding that hospitals that took part in P4P performed relatively much better than hospitals that did not participate in P4P. The main reason is that P4P hospitals received prolonged (since 2006) and more regular technical support visits (4 times yearly). In addition, and not unimportantly, their staff received cash incentives for improved data recording and service delivery levels.

The implementation of the MIBP relied most notably on a core project team consisting of a project coordinator operating from DSM, supported by Tragpi consultants. Whereas the initial focus of the MIBP was on the Lake Zone, the project extended to the Northern Zone in 2009. This extension seems to have overstretched the available budget and the MIBP's capacity to provide a sufficient number of follow-up visits to the increased number of hospitals, thereby reducing more meaningful outcomes. Responsibility for support visits has not been decentralized to the CSSC zonal offices, and this would most likely also have been beyond their technical competence.

Obviously, when considering the effectiveness of the MIBP, the environment in which the hospitals operate should be taken into consideration. As indicated, this has worsened considerably during the project period. Many hospitals struggle for survival: their financial situation has worsened, constraining resources and staff, coupled with considerable turnover in senior management positions and significant numbers of inexperienced managers. This obviously has an impact on the sound implementation of the MIBP and the effectiveness of its achievements.

The overall project budget seems fair, with limited operational costs for a small project team operating from DSM, a reasonable overhead fee for CSSC, and Tragpi bearing all of its own project-related costs. The budgets for support visits and zonal feedback meetings, although substantial, remain fair in relation to the perceived outputs. No extraordinary investment costs were necessary, as the project could make use of CSSC facilities at both national and zonal level. Although the MIBP is implemented under the responsibility of the competence centre of the CSSC, it is implemented parallel to and in isolation from other CSSC support programs of the competence centre directed at the same hospitals (e.g. P4P, AfyaPro). Improved cooperation, synergy and alignment between these various projects would have been justified and would most certainly have contributed to enhanced (cost-)efficiency of the MIBP.

The tripartite agreement will come to an end with the completion of Cordaid's involvement with CSSC. The probability that the benefits and achievements of the MIBP will be sustained after the closure of the project is questionable. A majority of hospitals require continued support to bring them to a level where they actually start using improved data for rational management decisions. Closure of the project will most certainly hinder this capacity building trajectory and will bring most of these hospitals, albeit with some improvement in data collection and recording, back to their original level of data management and usage.

Whether sufficient capacity will remain at CSSC to continue promoting and advancing the use of the benchmark methodology and tools after the closure of the MIBP is uncertain. The technical competence of the current project coordinator of the MIBP at CSSC is probably sufficient; however, her position depends on the availability of funds currently provided by Cordaid. Apart from the project coordinator, no other CSSC staff, either at CSSC headquarters or in the zones, seem sufficiently conversant with the specifics of the MIBP methodology and tools to carry it forward without continued support from Tragpi after the closure of the MIBP.

Whether CSSC will be able to continue capacity support in line with the MIBP objectives after completion of Cordaid's support remains to be seen. It will be crucial to secure sufficient additional finances for the operational costs of a continued project phase, which may be a challenge. Alternatively, there may be opportunities to link up and integrate with other CSSC support projects for which finances are secured. The recently approved comprehensive hospital capacity support project financed by the German Development Agency (GIZ) may offer clear opportunities in this regard. In addition, funding may be possible through the MOHSW, particularly through the Danida-supported hospital reform budget as well as the Public-Private-Partnership budget, of which CSSC is already a beneficiary.

The assessment of the MIBP results in a set of main lessons learnt and specific recommendations. Although these recommendations relate to the specific Tanzanian context, they may also be applicable to other countries in which Cordaid is active in supporting the health care system.

1. The tripartite agreement between Cordaid, CSSC and Tragpi has resulted in the development of an innovative, useful and objective framework to monitor, assess and advance the performance of hospitals. Although a pilot, the benchmarking methodology, if moved forward, may prove instrumental in progressing important elements of the national hospital reform agenda under the MOHSW.

2. The benchmark methodology, consisting of a set of KPIs measuring efficiency, quality and staffing, is relevant, concise and manageable, and supports transparency within and between hospitals. The framework has potential for further development, including additional indicators covering financial performance.

3. The MIBP has shown the relevance and importance of providing a forum for hospitals to reflect on and discuss issues of mutual interest, including hospital management and performance. A regional or zonal meeting of hospitals clearly provides a sensible platform to channel mutual concerns and aspirations, and offers potential to incorporate other important stakeholders as well, such as the CHMTs and RHMTs. It is recommended that an annual or bi-annual regional or zonal hospital meeting is institutionalized with a clearly agreed mandate and TOR.

4. Further development and mainstreaming of the benchmark methodology is recommended. It is best considered in the context of supporting national health system development, in particular the current new initiative to strengthen the overall M&E system of the MOHSW, the on-going hospital reform agenda and the emerging attention on Performance Based Financing (PBF) policies.

5. Within this context it is suggested that the benchmark methodology is harmonized with, and possibly integrated into, other software applications in support of improving the routine data management of health institutions. Most notably, an integration with AfyaPro could be considered.

6. Further development of AfyaPro could be considered on a purely commercial basis and shouldn't necessarily involve a continued partnership with CSSC.

7. Further development and mainstreaming of the benchmark methodology may benefit from the continuation of a pilot involving a selected number of hospitals, providing micro lessons for macro policy considerations. The pilot area should be concise and manageable and involve both government and non-government hospitals. If finances allow, health centres may be included in the pilot to test the relevance and applicability of the benchmark methodology at this level as well. In addition, it is recommended that the pilot involves the relevant MOHSW administrative and supportive levels, i.c. the CHMTs and RHMTs. Each of these levels has specific responsibilities with respect to supporting hospital management and monitoring hospital performance. The pilot should capitalize on these existing structures and, where possible, support and strengthen them.

8. A possible continued pilot may be feasible with financial support from the on-going hospital reform agenda and the Public-Private-Partnership agenda. Financial support for a continued pilot may be negotiated with the MOHSW under the Danida-supported Health Sector Program Support Programme (HSPS-IV), of which CSSC is already a beneficiary.

9. Hospitals are complex organisations that do not lend themselves to quick solutions; reforms and improvements need to be taken forward over a number of years and require a balance of various interventions. The MIBP may have been too narrow an implementation strategy, with over-emphasis on just one aspect of hospital management. It is recommended that in any continued pilot or support programme, the complexity of hospital management is taken into consideration in a more comprehensive and more intensive management support model, which is also an expressed request from the beneficiary hospitals.


Background

In 2008 CSSC, Cordaid and Tragpi signed a tripartite agreement with the aim to cooperate on improving the accessibility of hospital care in Tanzania. Cordaid and CSSC already had an established partnership. The rationale for this agreement was that Tragpi would offer general experience with financial management, specifically cost accounting and productivity measurement in health care. At the start of the cooperation, activities took place under project 154/10166: P4P program support. CSSC hired a full-time officer for the project activities; Tragpi assigned experienced consultants to assist part-time during field visits to Tanzania. Expectations were high; after a while it turned out that improving cost accounting and introducing benchmarking at the same time was not feasible within the given resources for the project. It was then decided to focus on benchmarking only, since it was assumed that benchmarking would lead to better financial management in the end.

Under the first phase of the project, 9 District Designated Hospitals (DDHs) in Lake Zone started, all faith-based organizations (FBOs). Of these 9 facilities, 3 also participate in Cordaid's P4P project. When project 154/10166 came to an end, Cordaid entered into a separate contract with CSSC for the purpose of benchmarking only (project 100971). The input of the Dutch consultants was still covered by Tragpi. The new project continued with the 9 aforementioned DDHs, and another 8 faith-based DDHs were added (1 also participating in P4P), all from Northern Zone. It was planned to involve 4 MoHSW-owned DDHs from Northern Zone as well; although government is interested in the project, no government facilities have been included so far. The project does not include lower-level facilities.

The objective of the project is to improve the performance of health facilities through data management and by comparing data of the facilities with those of peer facilities, through the use of a benchmarking tool. Data management comprises: (i) data collection and recording; (ii) data analysis; and (iii) use of data and feedback. By comparing their data with the data of similar facilities, management will become:
- eager to learn from others how to improve the management of the facility in order to offer more and better services to patients;
- competitive, improving the management of their facility and thereby improving access to the health facility; and
- aware of the costs of services and skilled in making choices that lead to an overall increase of services (both quantitative and qualitative) within the limited resources they have.

Although the project has not been linked up with the P4P project in Tanzania, it is still believed that benchmarking can be of added value to PBF projects, through:
- improved data collection;
- improved quality of care;
- stimulated competition between facilities, which might boost their performance (and thus level of funding);
- increased capacity of facility management (the above leads to an increased ability of facilities to show their results, with which a facility earns higher bonuses, resulting in higher staff motivation and better access for the community to quality health care);
- an additional tool for verification of quality indicators for (local) government;
- increased insight for the community into the quality of care and cost-effectiveness of the facility, compared to other facilities.

Objectives of the evaluation

With the project coming to an end in November 2011, the effectiveness of the project and the possibilities to continue (elsewhere) will be evaluated. The main question for the evaluation is: to what extent has the introduction of benchmarking contributed to improved health care performance of the facilities involved?

More specific questions:
1. Did the score on the Key Performance Indicators improve in the health facilities involved in benchmarking?
   1. Did the productivity of the health facilities (amount of health services delivered) improve?
   2. Did the resources (both staff and financial) required per service delivered decrease?
3. Did data management at the facilities involved improve between 2008 and the time of the evaluation?
   a. data collection and registration;
   b. data analysis.
4. If available, did plans and budgets from 2011 improve compared to those of 2008?
5. How does the management of the facilities involved judge the relevance of the project?
6. To what extent do facility managers compare their performance with the performance of other facilities? Is this due to the benchmarking project?
7. Is there proof that managerial decisions taken lately by facility managers are based upon management information available through the benchmarking project?
8. How do local Government Authorities judge the use of benchmarking? Do they see differences between facilities that are involved in benchmarking and other health facilities in their districts? If yes, what are these differences?
9. Is (local) Government interested to start using benchmarking on a wider scale if resources would be available?
10. How do management and staff of CSSC judge the use of benchmarking?
11. Can CSSC, both at central and zonal level, support facilities:
    a. to improve data collection, recording and analysis;
    b. to use benchmarking?
12. Does CSSC staff have the technical knowledge to use and adjust the benchmarking tool?
13. What is the likely sustainability of the project's results achieved, both at facility level and at CSSC?
14. Are there signs of a positive leverage effect as a result of the tripartite collaboration between CSSC, Tragpi and Cordaid?
15. Did each of the 3 parties fulfill its role as laid down in the Memorandum of Understanding?


16. How do the Dutch consultants from Tragpi assess their involvement in the project? What did they learn from the project (skills/capacities used in NL)?
17. Do the costs of the project outweigh the benefits?
18. What are the lessons learned of the project? Which recommendations can be given for continuation of the project, in both Tanzania and other countries?
19. Does the management of the facilities express further / different needs concerning capacity building of their facility, as a result of the project?

If possible (available), underpin the answers to the questions above with data from 2007 (before benchmarking) and from 2010 (or 2011 so far) and compare the outcome for:
a. health facilities involved in benchmarking only;
b. health facilities involved in P4P only;
c. health facilities involved in both benchmarking and P4P;
d. health facilities not involved in benchmarking or P4P.

In total 8 facilities in Lake Zone will be researched, all faith-based hospitals. Because the facilities in Northern Zone only started recently, it has been decided not to include them in the evaluation.

Methodology and timetable

The methodology will include:
- a study of data available;
- a study of documents and reports available;
- interviews;
- general HMIS data;
- writing a findings report.

Preferably the evaluation will take place in July/August 2011. The final document will be available by 15 September 2011 at the latest. The following maximum number of days is applicable:
- a maximum of 2 days in the Netherlands for preparation and feedback (incl. meetings with staff of Cordaid and Tragpi);
- a maximum of 16 days field visit in Tanzania (including days of travel within the country);
- a maximum of 3 days report writing;
- 2 days of international travel.

Sources of information

Documents:
- collected data, reports and plans from facilities;
- collected data and reports from CSSC;
- collected data and reports from Tragpi;
- MoUs and contracts, reports and other available documents from Cordaid.

Persons to interview:
- hospital managers and other members of staff of facilities;
- Bishops of Dioceses that own the facilities and Diocesan Health Coordinators;
- District Medical Officers and District Health Managers;
- Director and staff of CSSC involved in the project (from the Competence Centre and from Lake Zone's Health Secretary);
- Tragpi staff involved in the project;
- Cordaid staff involved in the project.

Deliverables

A synthesis report will be written (in English). This report will contain a description of the field and desk work, will answer the main and specific questions, and will propose recommendations for the continuation of the project. The report will also contain a financial paragraph with the costs of the evaluation.

Required qualifications of the evaluator:
- a degree in Public Health, Social Sciences, Business Administration or another relevant field;
- at least 5 years of experience in the area of health systems strengthening, health care economics and/or management of health facilities in developing countries;
- proven experience in monitoring and evaluation of programs in developing countries, preferably in the health sector;
- excellent communication and writing skills in English.

Based upon the above, the consultant is invited to present a detailed quotation.


HIERARCHY OF PROGRAM OBJECTIVES

GOAL (fundamental & sustainable change): Improved hospital service provision
- Indicators of impact (criteria to verify to what extent the goal is fulfilled): improved trends in the benchmarking of 10 essential KPIs.
- Assumptions (factors, conditions or decisions outside the control of the project, important for sustaining the goal):
  1. Context related to the National Health Sector Reform Agenda;
  2. Hospital Reform Agenda;
  3. Service agreements & the general relation between FBO hospitals & Government (LGAs, MOHSW, PMO-RALG).

PURPOSE (the effects expected as the result of realizing the outputs): Improving hospital management performance
- Indicators of effect (criteria to verify to what extent purposes are fulfilled):
  1. Improved DDH strategic business plans;
  2. Improved DDH operational plans, budgets & work planning;
  3. Improved resource allocation (staff & resources);
  4. Improved annual reports;
  5. Increased ICT in hospital data management.
- Assumptions:
  1. Participation of hospitals in other support projects (e.g. P4P);
  2. Human resources for health context & constraints;
  3. Funding levels of hospitals (internal & external).

OUTPUTS (products & results of the project within the specified project period):
1. Improved hospital data management;
2. Improved hospital performance benchmarking.
- Indicators of output (criteria to verify to what extent the outputs are produced):
  - Improved data recording management (organization, resources, etc.);
  - Improved performance data quality, consistency, collection, recording & analysis (HMIS books, consolidated quarterly and annual reports);
  - Accurate & regular application & recording of the benchmarking tool;
  - Improved HMIS reports & HMIS management;
  - Improved internal hospital management consultation & result-oriented decision making;
  - Improved staff consciousness towards data management.
- Assumptions/conditions: factors, within and outside the direct control of the project, necessary to achieve or produce the outputs.

MAIN ACTIVITIES (2008-2011) (interventions undertaken to produce the specified outputs):
1. Introduction of the benchmark tool (annual & monthly tool);
2. Training and backup support of CSSC & DDH management in the use of the benchmark tool & analysing its results;
3. Facilitate peer reviews, exchange learning & benchmarking between DDHs;
4. Share lessons learnt with other stakeholders (CSSC vs District/Regional/National).
- Indicators (criteria to verify to what extent activities have been implemented): targets and results/achievements: refer to progress reports.
- Strategies (the most viable implementation option, using criteria such as cost efficiency, benefits for the target group, probability to get results, risks, feasibility, institutional issues, etc.):
  1. Pilot in Lake & Northern Zones.

Note: the causal sequence runs from bottom to top: implementing (sub-)activities leads to outputs, which in turn contribute to realizing purposes, and finally to achieving the goal.


Relative Score per Hospital on Output, Effect and Impact Indicators
(Hospitals: S've = Sumve, S'ma = Sengerema, M'za = Murgwanza, N'ga = Nyakahanga, B'lo = Biharamulo, R'ia = Rubia, M'na = Mugana, K'do = Kagondo)

                                                          S've  S'ma  M'za  N'ga  B'lo  R'ia  M'na  K'do
Status                                                    DDH   DDH   DDH   DDH   DDH   DDH   DDH   VA
P4P project                                               No    No    No    No    Yes   Yes   Yes   Yes
Benchmark project                                         Yes   Yes   Yes   Yes   Yes   Yes   Yes   No

Output indicators (results of the project expected within the project period)
Consciousness towards importance of data management
a  Hospital management team                                1     2     1     1     2     2     2     2
b  Heads of departments                                    1     1     1     1     2     1     2     1
c  Nurses, clinicians                                      0     0     0     0     2     1     1     1
   Sub-total                                               2     3     2     2     6     4     5     4
Data recording & registration
a  Data quality & consistency of ward admission books      0     0     0     1     2     2     2     2
b  Data quality & consistency of OPD admission books       1     1     0     1     2     2     2     2
c  Quality, consistency & timeliness of HMIS reports       1     1     0     1     2     2     1     2
   Sub-total                                               2     3     1     3     8     7     5     7
Data management organisation
a  Use of data recorder with specific TOR                  0     1     1     1     2     2     2     2
b  Professional capacity of data recorder                  0     0     0     1     2     2     1     0
c  Improved data recording formats                         1     1     1     1     2     2     2     2
d  Improved data recording, collection & supervision       1     1     0     1     2     2     1     1
e  Establishment of data verification task force           0     0     0     0     2     2     2     2
f  Improved medical records department                     0     0     0     0     2     2     1     0
g  Improved IT application in data recording               0     1     1     0     2     1     0     1
   Sub-total                                               2     4     3     4    14    13     9     8
Data analysis
a  Professional competence of data analyst                 0     0     0     0     2     1     1     0
b  Monthly update of BM template                           0     0     0     0     2     2     2     1
c  Competence to operate BM template                       0     0     0     0     2     2     2    N/A
d  Preparing monthly consolidated reports                  0     0     0     0     2     2     2    N/A
   Sub-total                                               0     0     0     0     8     7     7     1

Effect indicators (effects expected as the result of realizing the project outputs)
Understanding & commitment to hospital benchmarking
a  Commitment of hospital management                       1     2     1     1     2     2     2    N/A
b  Commitment of hospital staff (heads of departments)     0     0     0     1     2     2     1    N/A
c  Hospital peer reviews, exchange visits                  0     1     0     2     0     1     0     1
   Sub-total                                               1     3     1     4     4     5     3     1
Management planning, reporting and forecasting
a  Routine monthly interpretation of data/KPIs             0     0     0     0     2     1     0    N/A
b  Result-oriented management decisions based on KPIs      0     0     0     0     2     1     0    N/A
c  Result-oriented hospital board decisions based on KPIs  0     0     0     0     2     0     0    N/A
d  Improved operational plans, budgets & work planning     0     0     0     0     0     0     0     0
e  Improved internal resource allocation (staff, finances) 0     0     0     0     1     0     0     0
f  Improved annual report (2007/8 versus 2010)             0     0     0     0     0     0     0     0
g  Application & analysis of KPIs in strategic forecasting 0     0     0     0     0     0     0    N/A
h  Overall internalization of BM methodology               0     0     0     0     1     0     0    N/A
   Sub-total                                               0     0     0     0     8     2     0     0

Impact indicators (fundamental & sustainable improvements)
a  Improved performance (2008-2011): mixed, not consistent across hospitals
b  Improved trends in KPIs and flexible PIs (2008-2011): no objective basis to assess realistically

Grand total                                                7    15     7    13    52    43    34    21
Relative position                                          7     5     7     6     1     2     3     4

Key: 0 = no change; 1 = moderate improvement; 2 = marked improvement.


Name                      Organisation                                    Position
Mrs Natasha Lacko         Cordaid                                         Policy Officer Sector Health & Wellbeing
Mr. Steven Ingard         Trag Performance Intelligence Group BV          CEO
Mrs Bertine Lokhorst      Trag Performance Intelligence Group BV          Consultant
Dr. A. Kimambo            Christian Social Services Commission            Director
Ms. M. Isasi              Christian Social Services Commission            Project Coordinator Benchmark Project
Dr. B. Jensen             Ministry of Health & Social Welfare             Policy Advisor
Mrs Kirstine Njogard      Danish Embassy                                  Secretary Health, Co-chair Health SWAp
Dr. R. Mwambo             Christian Social Services Commission            Zonal CSSC Secretary
Sr. R. Mhenga             Sumve District Designated Hospital              Hospital Health Secretary
Dr. E. Munda              Sumve District Designated Hospital              Medical Officer in Charge
Mr. E. Mayombya           Sumve District Designated Hospital              I/c of Medical Record Department
Sr. F. Teobaldi           Sumve District Designated Hospital              Sr i/c Female Ward
Mr. P. Sindaguru          Sumve District Designated Hospital              Hospital Accountant
Dr. M. Massi              Regional Medical Office Mwanza Region           Regional Medical Officer
Mrs. T. Gasembe           Regional Medical Office Mwanza Region           Regional Health Secretary
Dr. MJ Voeten             Sengerema District Designated Hospital          Medical Officer in Charge
Mr. Mihayo                Sengerema District Designated Hospital          Assistant Nursing Officer i/c
Mr. M. Joseph             Sengerema District Designated Hospital          Nursing Officer i/c of Medical Ward
Dr. G. Ssebuyoya          Biharamulo District Designated Hospital         Medical Officer i/c
Sr. Leocardia Bernardo    Biharamulo District Designated Hospital         Hospital Secretary
Mr. E. Mulenda            Biharamulo District Designated Hospital         Hospital Patron
Sr. Josephine Method      Biharamulo District Designated Hospital         Hospital Accountant
Mr. J. Alayala            Biharamulo District Designated Hospital         Computer System Administrator
Mr. H. Paulo              Biharamulo District Designated Hospital         Data Collector, Statistician
Sr. P. Advera             Biharamulo District Designated Hospital         Head of Department Pediatric Ward
Dr. M. Kihulya            Biharamulo District Council                     District Medical Officer (DMO)
Mr. D. Kamara             Biharamulo District Council                     District AIDS Coordinator / Ag DMO
Mr. M. Gwaho              Murgwanza District Designated Hospital          Computer Analyst
Ms L. Tobias              Murgwanza District Designated Hospital          Health Record Technician
Mrs A. Sahine             Murgwanza District Designated Hospital          Data Recorder
Mrs. N. Barungula         Murgwanza District Designated Hospital          Data Recorder
Dr. M. Barongo            Murgwanza District Designated Hospital          Medical Officer in Charge
Mr. M. Kipingili          Murgwanza District Designated Hospital          Nurse i/c School of Nursing
Mr. A. Nziko              Murgwanza District Designated Hospital          Hospital Secretary
Mrs. L. Rubagor           Murgwanza District Designated Hospital          Assistant Matron
Mr. P. Rubandwa           Murgwanza District Designated Hospital          Hospital Accountant
Mr. P. Mazige             Murgwanza District Designated Hospital          Assistant Hospital Accountant
Dr. E. Rwamugeta          Nyakahanga District Designated Hospital         Ag Medical Officer i/c
Mr. J. Kataraiye          Nyakahanga District Designated Hospital         Hospital Secretary
Mrs. U. Bwatota           Nyakahanga District Designated Hospital         Hospital Accountant
Mr. I. Agras              Nyakahanga District Designated Hospital         Data Analyst
Mr. J. Buberwa            Nyakahanga District Designated Hospital         Data Analyst
Dr. J. Kamozora           Nyakahanga District Designated Hospital         Dentist
Mr. H. Mussa              Muleba District Medical Office                  Data Analyst
Dr. D. Ngaita             Rubya District Designated Hospital              Medical Officer i/c
Mr. N. Rweyemamau         Rubya District Designated Hospital              Hospital Secretary
Mr. G. Kinyima            Rubya District Designated Hospital              HMIS Data Collector
Dr. D. Katunzi            Rubya District Designated Hospital              Previous Medical Officer i/c
Mrs. B. Rwihula           Rubya District Designated Hospital              Nursing Officer Maternity Ward
Mrs. D. Katebereza        Rubya District Designated Hospital              Head of Department OPD
Sr Angela                 Rubya District Designated Hospital              Hospital Matron
Mr. J. Sekiku             Fadeco Community Development Radio              Managing Director
Mr. P. Krijenen           Partage Orphan Support Programme                Managing Director
Mrs B. Wambura            Mugana District Designated Hospital             Hospital Accountant
Mr. A. Annestas           Mugana District Designated Hospital             Data Recorder, Record Technician
Sr. M. Kawa               Mugana District Designated Hospital             Hospital Matron
Sr. E. Karia              Mugana District Designated Hospital             Principal Nurse Training School
Fth N. Timanywn           Bukoba Diocese                                  Bishop
Dr. B. Biig               Int. Centre for AIDS Care & Treatment (ICAP)    Programme Officer Kagera Region
Dr. Modest                Kagondo Hospital                                Medical Officer i/c
Mr. J. Bahawzilen         Kagondo Hospital                                Hospital Secretary
Sr. Concesta              Kagondo Hospital                                Data Clerk
Mrs. J. Ishengoma         Kagondo Hospital                                Hospital Matron
Fth Peter                 Bukoba Diocese                                  Treasurer General
Mr. Rugarabamu            Bukoba Diocese                                  Diocesan Health Coordinator
Dr. R. Peperkorn          Royal Netherlands Embassy                       Health Secretary
Mrs. M. Lieser            Christian Social Services Commission            Capacity Support Advisor
Dr. M. Ongara             Ministry of Health and Social Welfare           Head Hospital Reform Unit
Dr. C. Kumalija           Ministry of Health and Social Welfare           Head HMIS Unit


Key Performance Indicators

Efficiency
1  Number of outpatients per clinician
   Definition: total number of patients that visited the OPD / number of medical staff (all)
   Indicator: efficiency in out-patient care
   Management issues: management of OPD, HRH, quality of care
2  Bed Occupancy Rate
   Definition: total inpatient days / (actual beds x 365 days) x 100%
   Indicator: ability to provide safe and efficient patient care
   Management issues: quality of care, efficiency, HRH, relative importance of the DDH
3  Average Length of Stay (ALOS)
   Definition: total inpatient days / total admissions
   Indicator: efficient use of resources
   Management issues: quality of care & clinical management, discharge policy, resource management
3a ALOS Pediatric ward: total inpatient days pediatric ward / total admissions pediatric ward
3b ALOS Maternity ward: total inpatient days maternity ward / total admissions maternity ward
4  Major operations per room per day
   Definition: total major operations / number of major operating rooms / 260 days per year
   Indicator: productivity of high-cost hospital services
   Management issues: efficiency, HRH, productivity, quality of care
5  Minor operations per room per day
   Definition: total minor operations / number of minor operating rooms / 260 days per year
   (Indicator and management issues as for KPI 4)

Quality
6  Staff/nurses per actual bed
   Definition: total number of nurses (registered & enrolled) / number of actual beds
   Indicator: available staff per hospital bed
   Management issues: HRH, quality of care
7  Radiology, ultrasounds & laboratory tests per day
   Definition: total number of specific tests / 365 days per year
   Indicator: availability & utilization of diagnostic capacity
   Management issues: HRH, quality of care, management of equipment & maintenance
8  Number of positive malaria tests / malaria cases
   Definition: total number of positive tests / total malaria cases
   Indicator: quality of malaria lab tests
   Management issues: quality of care & clinical management

Staffing
9  Total new staff over total staff
   Definition: newly hired staff compared to total number of staff (not staff specific)
10 Total leaving staff over total staff
   Indicator: general production and expertise level
   Management issues: HRH, personnel management, payment & incentive schemes
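These definitions translate directly into ratios of routine HMIS counts. A minimal sketch computing three of the KPIs as defined above; the input figures are illustrative only and not taken from any of the evaluated hospitals:

```python
# Compute three of the KPIs from routine HMIS counts, following the
# definitions above. Input figures are illustrative only.
inpatient_days = 64_000
admissions = 9_500
actual_beds = 220
major_operations = 450
major_o_rooms = 1

# Bed Occupancy Rate: inpatient days as a share of total available bed-days.
bor = inpatient_days / (actual_beds * 365) * 100

# Average Length of Stay: inpatient days per admission.
alos = inpatient_days / admissions

# Major operations per room per day, over 260 working days per year.
major_ops_per_room_day = major_operations / major_o_rooms / 260

print(f"Bed Occupancy Rate: {bor:.1f}%")
print(f"Average Length of Stay: {alos:.1f} days")
print(f"Major operations per room per day: {major_ops_per_room_day:.2f}")
```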


[Charts, one panel per KPI, time series 2008-2010 per hospital: Bed Occupancy Rate; Average Length of Stay (ALOS); ALOS Maternity Ward; ALOS Pediatric Ward; Major Operations per O-room per day; Minor Operations per O-room per day; No. of Outpatients per Clinician; No. of Staff per Actual Bed; No. of Radiology and Ultrasounds per Day; No. of Laboratory Tests per Day; No. of Positive Malaria Tests / Malaria Cases; No. of New Staff versus Total Staff; No. of Leaving Staff over Total Staff.]


Hospital      Production figures       2008      2009      2010
Murgwanza     OPD                    27,853    28,673    28,574
              Major operations          867       673       684
              Inpatient admissions   10,747     9,446     8,675
              In-patient days       108,209    84,219    76,730
Mugana        OPD                    15,475    16,938    17,036
              Major operations          295       435       721
              Inpatient admissions    4,059     4,255     4,040
              In-patient days        24,954    22,885    32,045
Rubia         OPD                    41,412    37,995    36,895
              Major operations          651       684       846
              Inpatient admissions   14,019    14,618    14,622
              In-patient days        76,774    59,130    51,132
Nyakahanga    OPD                    38,862    39,938    17,036
              Major operations          295       435       721
              Inpatient admissions   12,996    11,266    12,121
              In-patient days        71,685    64,409    64,498
Biharamulo    OPD                    35,921         ?    36,044
              Major operations          474       466       227
              Inpatient admissions    9,742     9,331     8,309
              In-patient days        64,218    59,829    32,061
Sumve         No data available
Sengerema     No data available
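Read together with the KPI definitions, these production figures yield the efficiency indicators directly; Murgwanza's reported figures, for example, imply an average length of stay of 108,209 / 10,747, roughly 10.1 days, in 2008, falling to about 8.8 days in 2010. A short sketch of that derivation:

```python
# Derive ALOS (inpatient days / admissions) from the production figures
# reported above for Murgwanza, 2008-2010.
murgwanza = {
    2008: {"inpatient_days": 108_209, "admissions": 10_747},
    2009: {"inpatient_days": 84_219, "admissions": 9_446},
    2010: {"inpatient_days": 76_730, "admissions": 8_675},
}

for year, fig in murgwanza.items():
    alos = fig["inpatient_days"] / fig["admissions"]
    print(f"{year}: ALOS = {alos:.1f} days")
```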


Date         Consultant activity
11.08.2011   General preparation, development of questionnaires, etc.
12.08.2011   Interview Tragpi NL staff
14.08.2011   Travel The Hague - A'dam - Dar es Salaam
15.08.2011   Interviews CSSC, MoHSW
16.08.2011   Travel DSM - Mwanza; interviews CSSC Zonal Manager and Sumve DDH
17.08.2011   Interview RHMT Mwanza; travel M'za - S'ma; interview Sengerema DDH
18.08.2011   Interview Sengerema DDH; travel & interviews S'ma - B'mulo
19.08.2011   Travel B'mulo - Karagwe; interviews Nyakahanga DDH
20.08.2011   Interviews Nyakahanga DDH
21.08.2011   Travel Karagwe - Bukoba - Rubia
22.08.2011   Interview Rubia DDH
23.08.2011   Interview Rubia DDH, Muleba CHMT
24.08.2011   Interview Mugana DDH
25.08.2011   Interview Kagondo Hospital
26.08.2011   Interview Bukoba Diocesan Health Coordinator, Bishop, RMO
27.08.2011   Interview RHMT Kagera Region
28.08.2011   Travel Bukoba - Mwanza - DSM
29.08.2011   Interview MOHSW, Hospital Reform; de-briefing CSSC
30.08.2011   Travel Dar es Salaam - A'dam - The Hague
01.09.2011   Draft report
02.09.2011   Draft report
05.09.2011   Draft report
15.09.2011   Final report


Christian Social Services Commission (2009), Management Information Benchmark Project, November 2009 - October 2011, Dar es Salaam.
Christian Social Services Commission (2011), Management Information Benchmark Project, Feedback Meeting Northern Zone, Arusha.
Christian Social Services Commission (2011), Management Information Benchmark Project, Feedback Meeting Lake Zone, Mwanza.
Christian Social Services Commission (2010), Management Information Benchmark Project, TRIP Mwanza and Kagera Regions, Murgwanza DDH, Dar es Salaam.
Christian Social Services Commission (2010), Management Information Benchmark Project, TRIP Mwanza and Kagera Regions, Sengerema DDH, Dar es Salaam.
Christian Social Services Commission (2010), Management Information Benchmark Project, TRIP Mwanza and Kagera Regions, Nyakahanga DDH, Dar es Salaam.
Christian Social Services Commission (2010), Management Information Benchmark Project, TRIP Mwanza and Kagera Regions, Rubya DDH, Dar es Salaam.
Christian Social Services Commission (2010), Management Information Benchmark Project, TRIP Mwanza and Kagera Regions, Biharamulo DDH, Dar es Salaam.
Christian Social Services Commission (2010), Progress Report Management Information Benchmark Project, November 2009 to October 2010, Dar es Salaam.
Christian Social Services Commission (2011), Building Capacity for Service Agreements at P4P Dioceses, Dar es Salaam.
Cordaid (2009), Projectenschets Management Information Benchmark Project, The Hague.
Cordaid (2007), Project Plan, Building Financial Management Capacity for Tanzania Health Facilities, CSSC & TRAG, The Hague.
Sengerema Designated District Hospital (2011), Annual Report 2010, Sengerema.
Sengerema Designated District Hospital (2008), Annual Report 2007, Sengerema.
Sengerema Designated District Hospital (2011), Hospital Strategic Plan 2011-2015, Sengerema.
Nyakahanga Designated District Hospital (2009), Annual Report 2008, Karagwe.
Nyakahanga Designated District Hospital (2011), Annual Report 2010, Karagwe.
Biharamulo Designated District Hospital (2011), Annual Report 2010, Biharamulo.


Biharamulo Designated District Hospital (2009), Annual Report 2008, Biharamulo.
Rubia Designated District Hospital (2011), Hospital Annual Report 2010, Rubia.
Rubia Designated District Hospital (2008), Hospital Annual Report 2007, Rubia.
Rubia Designated District Hospital (2011), Strategic Plan 2011-2015, Rubia.
Muleba District Council (2011), Revised Comprehensive Health Plan July 2010 - June 2011, Muleba.
Mugana Designated District Hospital (2009), Annual Report 2008, Mugana.
Mugana Designated District Hospital (2011), Annual Report 2010, Mugana.
Murgwanza Designated District Hospital (2009), Annual Report 2008, Ngara.
Murgwanza Designated District Hospital (2011), Annual Report 2010, Ngara.
Sumve Designated District Hospital (2011), Hospital Annual Report 2010, Sumve.
Sumve Designated District Hospital (2008), Hospital Annual Report 2007, Sumve.
Government of Tanzania Ministry of Health and Social Welfare (2009), Health Sector Strategic Plan III, Partnerships for Delivering the MDGs, July 2009 - June 2015, Dar es Salaam.
Government of Tanzania Ministry of Health and Social Welfare (2010), M&E Strengthening Initiative Combination Plan, 5-year Operational Plan & Year 1 Operational Plan, Dar es Salaam.
Government of Tanzania Ministry of Health and Social Welfare (2009), Health Sector Performance Profile Report 2009 Update, Dar es Salaam.
