
Article information

  • Title: Application of DEA method in efficiency evaluation of public higher education institutions.
  • Authors: Nazarko, Joanicjusz; Saparauskas, Jonas
  • Journal: Technological and Economic Development of Economy
  • Print ISSN: 1392-8619
  • Year: 2014
  • Issue: March
  • Language: English
  • Publisher: Vilnius Gediminas Technical University
  • Keywords: Data envelopment analysis; Education; Federal aid to higher education; Government aid to higher education; Universities and colleges

Application of DEA method in efficiency evaluation of public higher education institutions.


Nazarko, Joanicjusz ; Saparauskas, Jonas


Introduction

Public higher education sector is under a growing pressure to increase efficiency and improve the quality of its activities. The quality of education service has become a major issue in higher education worldwide (Zafiropoulos, Vrana 2008). Expectations of the state, society, media and other stakeholders stimulate universities to manage their resources more effectively and also cause increased transparency in state funding of the higher education sector. Another factor contributing to that phenomenon is the necessity to conform to the European Union standards.

Corporate standards and models of management are more and more frequently applied in the public sector. There is an increasing number of alternative financing schemes that rely on larger contributions from students. Those based on income-contingent loans provide insurance against uncertain educational outcomes (Del Rey, Racionero 2010). However, the specificity of the public sector often makes it impossible to copy such patterns directly. The public sector is characterised, among others, by the complexity of the sector's environment and its instability (frequent political and legal changes), by the multitude and ambiguity of goals and by the variety of stakeholders with contradicting expectations (Bonaccorsi, Daraio 2009; Nazarko et al. 2009). Another factor is a limited amount of public funds, which are distributed and supervised according to detailed regulations. Furthermore, activities of public sector institutions are not subject to high competitive pressure, nor are they as profit-oriented as their private counterparts. Additionally, there is a lack of objective criteria for the assessment of the sector. As a result, public funds may be distributed without regard to how efficiently public institutions manage them. Providing the financial resources for higher education has been a particularly sensitive issue, which influences the achievement of many goals as well as the economic and social mission entrusted to the universities (Munteanu, Andrei-Coman 2011).

It is, therefore, crucial to create stimulants for rational management of public funds and improvement in the quality of services offered by public sector academic institutions. To enhance the quality of the educational environment, higher education institutions can create and implement a strategy for their improvement--a long-term action plan, which includes management of the organisational units as interconnected and interdependent entities, and engagement of students in quality assurance activities as enthusiastic and responsible academic community members (Stukalina 2012). Systematic comparative studies of the efficiency of public sector units (Nazarko et al. 2008, 2009) can also be conducted. Such an assessment defines reference points (benchmarks) for studied activities. It may, therefore, be treated as a substitute for competition and contribute to a more efficient allocation of public funds, greater care for the efficiency of conducted processes, higher quality of offered services and improvements in management of public institutions.

The Data Envelopment Analysis (DEA) method occupies an important place in the comparative efficiency studies in the public sector worldwide (Chalos, Cherian 1995; Odeck 2005). It is also applied in the higher education sector because outcomes of DEA may provide valuable information supporting HEI management. DEA does not just enable the identification of areas requiring improvement but also describes the development possibilities in those areas. Moreover, it allows answering questions concerning HEI strengths and weaknesses, the mode of fund allocation among HEI organisational units, or the optimal size of these units.

There is one important virtue of DEA application in higher education settings, namely, it can assess the efficiency of universities from multiple viewpoints. However, when the number of evaluation criteria increases, more universities are evaluated as being efficient (Aoki 2010).

Examples of DEA application in the area of higher education from around the globe are described in works by Leitner et al. (2007), Taylor and Harris (2004), McMillan and Datta (1998), Bradley et al. (2006), and Nazarko et al. (2008).

In the UK, issues of higher education management receive a lot of attention, providing many instances of DEA application for the assessment of higher education effectiveness or productivity. The UK can therefore be regarded as a leader in the evaluation of university effectiveness.

One of the instances of British experience in that field is the comparative efficiency analysis undertaken as a response to the increased awareness of the issues of accountability, value for money and cost control. Authors of the study--Athanassopoulos and Shale--proposed concepts of cost and outcome efficiency in order to gain further insights into the university operations (Athanassopoulos, Shale 1997). The object of the research comprised 45 universities. The study revealed that a subset of six institutions showed satisfactory performance across alternative efficiency tests.

Another example is the investigation of the level of efficiency and change in productivity of nearly 200 further education providers in England over the period 1999-2003 (Bradley et al. 2010). In the course of DEA, it was found that the mean provider efficiency varied between 83 and 90 percent over the period. Over the same period, productivity change amounted to approx. 12 percent, comprising 8 percent of technology change and 4 percent of technical efficiency change. Subsequently, a multivariate analysis was performed. It showed that student-related variables--such as gender, ethnicity and age--were generally more important in determining efficiency levels than staff-related variables. It was also established that the local unemployment rate had an effect on provider efficiency. Considering the policy implications of the results, the authors of the research recommend that further education providers implement strategies to improve completion and achievement rates of white males as well as offer increased administrative support to teachers.

Another British example of DEA application is the examination of the technical efficiency of 45 universities in the period 1980/81-1992/93. The analysis indicated that there was a substantial rise in the weighted geometric mean of the technical efficiency score during the study period, although this rise was most noticeable between 1987/88 and 1990/91. The rise of technical efficiency scores was attributed largely to the gains in pure technical efficiency and congestion efficiency, with scale efficiency playing a minor role (Flegg et al. 2004).

An interesting study was conducted on a sample of 54,564 graduates from UK universities in 1993 to assess whether the choice of technique affects the measurement of universities' performance (Johnes 2006). A methodology developed by Thanassoulis et al. (2002) allows each individual's DEA efficiency score to be decomposed into two components: one attributable to the university at which a student studied, and the other attributable to the individual student. The results showed that rankings of universities derived from the DEA efficiencies, which measure university performance (having excluded the efforts of individuals), were not strongly correlated with the university rankings derived from the university effects of the multilevel models. The data were also used to perform a university-level DEA. The university efficiency scores derived were largely unrelated to the scores from the individual-level analysis, confirming a result from a smaller data set (Johnes 2006a). However, the university-level DEAs provide efficiency scores which are generally strongly related to the university effects of the multilevel models.

Another instance of DEA application in the British higher education sector is the efficiency and productivity study of more than 500 English in-service training institutions over a period of 5 years (Bradley et al. 2006). Five main types of studied units were specified: general/tertiary colleges, Sixth Form Colleges, Specialist Colleges, Specialist Designated Institutions and External Institutions. Variables describing the number and the quality of students and teachers were used as input variables for a DEA model. Student achievements, measured as the number of students continuing their education and the number of attained qualifications, were treated as output variables.

Casu, Thanassoulis (2006) evaluated cost efficiency in UK university central administration. For this purpose, researchers set up a data envelopment analysis (DEA) framework. The problems in defining the unit of assessment and the relationship between the inputs and the outputs are clearly demonstrated. Glass et al. (2006) computed DEA-based efficiency scores for policy evaluations and possible funding guidance in UK higher education.

In the period 1998-2003, the efficiency of 72 public universities in Germany was examined with the use of DEA and stochastic frontier analysis (Kempkes, Pohl 2010). The work referred to the faculty composition of universities as an essential element in the efficiency of higher education. The main finding was that East German universities performed better in terms of total factor productivity change than West German universities. However, when looking at mean efficiency scores over the sample period, West German universities still appeared at the top end of relative efficiency outcomes.

In Austria, studies with the use of DEA (Leitner et al. 2007) made it possible to assess the efficiency of natural sciences and engineering departments in HEI. The models developed there consisted of two input variables (number of academic teachers and floor area of the department) and 12 output variables (extramural grants, ratio of completed projects to the total academic staff, number of projects completed by the department, number of exams, diploma students, monographs, reports, presentations and other publications, number of patents obtained, and PhD graduates). According to the researchers, it was demonstrated that the DEA method surpassed traditional approaches based on simple calculation of indicators. Consequently, application of the DEA method not only allows determining a department's efficiency but also helps to specify the improvement possibilities of a department.

In the Netherlands, the DEA approach was utilised to estimate per-student education costs (PSCs) in higher education institutions in an effort to redress a number of methodological problems endemic to such estimations, particularly the allocation of shared expenditures between education and other institutional activities (Salerno 2006). The results were compared with PSC estimates generated by a more traditional approach. DEA was argued to increase the likelihood of producing more realistic cost estimates for individual institutions.

In Greece, 20 public universities were assessed through quantitative analysis including performance indicators, DEA and econometric procedures (Katharaki, Katharakis 2010). The findings showed inefficiency in terms of human resource management while also identifying a clear opportunity to increase research activity and hence the research income. The authors of the study hoped to contribute to the broader debate on reforming the management and administration system of Greek universities.

In the US, a multi-output production function to analyse economies of scope between patents and R&D was applied in research universities (Chavas et al. 2012). The tradeoffs and/or synergies that arise between traditional university research outputs (articles and doctorates) and academic patents were analysed. In the study, sources of economies of scope and relative roles of complementarity, scale and convexity were also investigated. DEA estimates of scope economies using R&D input and output data from 92 research universities showed significant economies of scope between articles and patents but only modest complementarities except for a few cases. The findings showed how scale effects (for small universities) and convexity effects can contribute to economies of scope.

The other instance of American research on effectiveness of education providing institutions is the work of Hirao (2012). In the study, efficiency of the top 50 public and private business schools in the United States in the year 2006 was measured with the help of DEA. It was found that although technical efficiencies of private and public schools were both high, scale and overall efficiencies of public schools were lower than those of private schools.

Breu, Raab (1994) used DEA to measure the relative efficiency of the "best" 25 U.S. News and World Report-ranked universities. Their results indicate how DEA may be used to measure relative efficiency of these higher education institutions from commonly available performance indicators.

In Canada, the efficiency of 45 HEI was studied (McMillan, Datta 1998). Three types of Canadian HEI were specified: comprehensive with a medical school, comprehensive without a medical school and primarily undergraduate. Nine different models were used in the analysis. Output variables included, among others, the number of students sorted by the field of studies, the number of sponsored research grants, etc. Input variables consisted of the number of academic staff with the division between the exact sciences and the humanities, the number of employees obtaining research grants, etc. The authors stress the utility of the DEA method as a benchmarking tool applied by HEI. They recommend that DEA be used to study more homogeneous administrative units such as departments.

Another illustrative example of efficiency assessment in Canadian universities is one using DEA and stochastic frontier methods (McMillan, Chan 2006). An analysis of the rankings revealed that the relative positions of individual universities across sets of several efficiency rankings demonstrated an underlying consistency. High-efficiency and low-efficiency groups were evidenced, but the rank of most universities was not significantly different from the others. The results emphasised the need for caution when employing efficiency scores for management and policy purposes, and the authors recommended looking for confirmation across viable alternatives.

In Australia (Madden et al. 1997), as a consequence of the 1987 Green Paper on Australian higher education, which included the recommendation to abandon the binary system and introduce the Dawkins plan for transfer of resources from established universities to the former colleges of advanced education, a comparison of the initial and subsequent performance of economics departments was completed. The findings revealed that while overall performance has improved substantially, further productivity improvements were required for new universities to achieve best practice. Avkiran (2001) used DEA to examine the relative efficiency of Australian universities. Three performance models were developed: overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings showed that the universities were performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. In the research by Abbott and Doucouliagos (2003), non-parametric techniques were used to estimate technical and scale efficiency of individual Australian universities. Various measures of output and inputs were used. The results showed that regardless of the output-input mix, Australian universities recorded high levels of efficiency relative to each other.

In South Africa, 10 out of 21 public HEI were studied from the perspective of their efficiency during a period of 4 years (Taylor, Harris 2004). Taking into account the limitations of the method, seven models were tested. In each model, the output variables consisted of the number of graduates and the indicators characterising HEI engagement in research. Input variables varied in each model and included: total costs, financial resources, number of students and employees. Demonstrated efficiency differences between HEI allowed specifying four main factors that determine HEI efficiency: increase in the number of students, quality of recruited students, quality of academic staff and the level of fixed costs.

In China, relative efficiency in the production of research of 109 regular universities in 2003 and 2004 was analysed (Johnes, Yu 2008). Output variables measured the impact and productivity of research. Input variables reflected staff, students, capital and resources. The mean efficiency was just over 90% when all input and output variables were included in the model, and this fell to just over 80% when student-related input variables were excluded from the model. The rankings of the universities across models and time periods were highly significantly correlated. Further investigation suggested that the mean research efficiency was higher in comprehensive universities compared to specialist universities, and in universities located in the coastal region compared to those in the western region of China. The aforementioned result offered support for the merger activity which took place in Chinese higher education.

In Taiwan, 18 classes of freshmen English students in the academic year 2004-2006 were examined with DEA (Montoneri et al. 2012). A diagram of teaching performance improvement mechanism was designed to identify key performance indicators for evaluation in order to help teachers concentrate their efforts on the formulated teaching suggestions. The sensitivity study highlighted the priority of the richness of course contents over the other evaluated indicators. The performance improvement mechanism was designed to help decision-makers to develop educational policies. J.-K. Chen and I.-S. Chen (2011) adopted Inno-Qual performance system (IQPS) by using DEA to evaluate the Inno-Qual efficiency of 99 Taiwanese universities divided into five types (research-intensive, teaching-intensive, profession-intensive, research & teaching-intensive, and education-in-practice-intensive). On the basis of the empirical results, researchers found that more than half (73%) of the universities were highly inefficient in improving the Inno-Qual performance. Thus, it was concluded that improving the Inno-Qual efficiency based on results would be helpful for reducing the majority of cost expenditures.

To assess the efficiency of Thai public universities at the faculty level using the DEA method, two efficiency models--the teaching efficiency model and the research efficiency model--were used (Kantabutra, Tang 2010). Further statistical analyses were performed to examine the difference in performance between two types of public universities: the government universities and the autonomous universities. Then, the differences in efficiency between university locations and types of faculties were checked. The results indicated that the autonomous universities outperformed the government universities in terms of research efficiency. It was additionally determined that universities in provincial areas and faculties attributed to the health science group were efficient in terms of teaching.

Kuah and Wong (2011) presented the DEA model for joint evaluation of the relative teaching and research efficiencies of universities in Malaysia. The inputs and outputs for university performance measurement were identified; they comprised 16 measures in total. Joint DEA maximisation was used to model and evaluate these measures. The application of DEA enabled academics to identify deficient activities in their universities and take appropriate actions for improvement.

It is worth mentioning a cross-national initiative, which focused on the comparison of the efficiency of Italian and German public universities and its evolution in the period 2001-2007 (Agasisti, Pohl 2011). The authors of the research underlined the importance of the task enumerating two main reasons: first, to assess whether the public spending for funding the universities was used efficiently; and second, the stimulation of a benchmarking exercise that could be useful for managerial and policymaking purposes in European countries. The study with the use of DEA revealed that German universities were more efficient than their Italian counterparts. However, the Italian institutions improved their efficiency rapidly in the period 2001-2007. Abramo et al. (2011) proposed an application of the DEA methodology for measurement of technical and allocative efficiency of university research activity. The analysis is based on bibliometric data from the Italian university system for the five-year period of 2004-2008. Technical and allocative efficiency was measured using university research staff classified according to academic rank as input, and the field-standardised impact of the research product realised by the staff as output. The analysis was applied to all scientific disciplines of the so-called hard sciences, and conducted at a subfield level, thus at a greater level of detail than ever before achieved in national-scale research assessments.

An interesting example of an international study using DEA in relative efficiency assessment is the project scrutinising how public education and R&D expenditures were utilised in new EU member states in comparison to the selected EU and OECD countries plus Croatia (Aristovnik 2012). In that study, the relative efficiency was defined as the deviation from the efficiency frontier, which represented the maximum output/outcome attainable from each input level. An analysis of output-oriented efficiency measures revealed that such new EU member states as Hungary, Estonia and Slovenia could be treated as good benchmark countries in the field of primary, secondary and tertiary education, respectively. Cyprus and Hungary were indicated as dominating countries in the field of the R&D sector. The empirical results also suggested that new EU member states showed relatively high efficiency in tertiary education, but lagged behind in the R&D efficiency measures.

This paper describes the application of the DEA method in a comparative efficiency study of 19 Polish universities of technology.

1. Characteristics of higher education sector in Poland

Higher education in Poland is divided into two sectors: public and private. In total, 470 HEI function in both sectors, 132 of which are public institutions. There are two main categories of higher education institutions: university-type and non-university institutions. In a university-type HEI, at least one unit is authorised to confer the academic degree of PhD. Almost all PhD-granting HEI (approximately 100), including all of the 19 universities of technology, are public (Higher Education ... 2012).

There are approx. 1,764,000 students (as of 2011) in different types of HEI in Poland: 1,245,000 in public HEI and 518,000 in private HEI. Approx. 965,000 students are full-time (public: 876,000, private: 88,000) and approx. 799,000 students are part-time (public: 369,000, private: 430,000). Universities of technology provide education for 338,000 students (full-time programmes: approx. 246,000, part-time programmes: approx. 92,000). All HEI are the primary workplace for more than 99,000 academic teachers, including 11,500 tenured professors. Universities of technology employ 15,500 academic teachers, including 2,900 tenured professors (Higher Education ... 2012).

Higher education institutions in Poland offer the following education possibilities (Higher education in Poland 2013):

--first cycle studies (equivalent to the bachelor's degree) of two kinds:

--studies leading to the professional title of "licencjat", 3 to 4 years in duration;

--studies leading to the professional title of "inzynier", 3.5 to 4 years in duration;

--second cycle studies of 1.5 to 2 years in duration (similar to the master's degree), leading to the professional title of "magister" or an equivalent degree, accessible for graduates of the first cycle studies;

--long-cycle studies of 4.5 to 6 years in duration (similar to the master's degree) leading to the professional title of "magister" or an equivalent degree;

--third cycle studies--doctoral programmes, provided by university-type higher education institutions as well as some research institutions (primarily the Polish Academy of Sciences).

Along with 29 other countries, Poland signed the Bologna Declaration, which aims to create the European Higher Education Area. The current reforms of the Polish higher education system follow the recent action lines of the Bologna Process.

Government budget subsidies are the primary funding source for public HEI. Subsidies are assigned for the education of full-time undergraduate and master's degree students, the education of full-time PhD students, salaries of academic staff and facility maintenance. The size of the subsidy depends on: (i) the number of students (with different weights given to various fields of study); (ii) the number of PhD students (with different weights assigned to various academic specialties); (iii) the number of teaching and research staff (with different weights assigned to their seniority and formal qualifications); (iv) the number of research grants obtained in a given year; (v) the number of licenses to award PhD and higher doctorate degrees; (vi) student exchanges with foreign universities (Rozporzadzenie ... 2012).

In 2011, government budget subsidy for public HEI amounted to approx. USD 4 billion, out of which USD 1 billion went to the universities of technology. There is a general consensus among scientists and politicians that the current level of financing is far from sufficient. However, the costs for maintaining the public higher education sector are increasingly difficult to bear even for rich countries' budgets (Johnes 2006; Onsel et al. 2008). Similarly to other public institutions, HEI are under the growing pressure to increase the efficiency in spending of public resources, to actively search for alternative funding sources and to compete for a good position in the educational market (Higher Education ... 2012).

2. Conceptual framework of Data Envelopment Analysis (DEA)

In this paper, M. J. Farrell's effectiveness concept was used to analyse effectiveness of higher education institutions. This concept assesses the effectiveness as a relative measure, describing the relation of inputs to outputs with respect to the maximum value possible to obtain in given technological conditions (Farrell 1957). Farrell distinguished two components of organisational effectiveness: technical and allocative effectiveness. Technical effectiveness was defined as an ability to produce a certain amount of product with minimal inputs. The allocative effectiveness was described as a reflection of the ability to absorb inputs in an optimal proportion considering prices of the inputs (inputs costs). The combination of technical and allocative effectiveness constitutes the overall economic effectiveness (Coelli et al. 2002).

The relative effectiveness measurement method Data Envelopment Analysis (DEA), developed by A. Charnes, W. W. Cooper and E. Rhodes (Charnes et al. 1978), builds on Farrell's productivity concept. In this method, the effectiveness (E) of the analysed object (j), called a Decision Making Unit (DMU), is defined as the quotient of a weighted sum of the outputs and a weighted sum of the inputs:

$$E_j = \frac{\sum_{r=1}^{s} u_{rj}\, y_{rj}}{\sum_{i=1}^{m} v_{ij}\, x_{ij}}, \qquad (1)$$

where: $y_{rj}$--the amount of the product $r$ generated by $\mathrm{DMU}_j$, output; $x_{ij}$--the amount of the resource $i$ used by $\mathrm{DMU}_j$, input; $u_{rj}$--weight of the output $y_{rj}$; $v_{ij}$--weight of the input $x_{ij}$; $r = 1, 2, \ldots, s$--number of the generated products; $i = 1, 2, \ldots, m$--number of resources used; $j = 1, 2, \ldots, n$--number of DMUs.

[FIGURE 1 OMITTED]

The concept of the DEA method is presented in Figure 1.

Application of the DEA method does not require prior determination of weights. Optimisation of weights is done for each object separately by solving a linear programming task in order to maximise the output/input relation described in Equation (1), subject to the given constraints. In this way, the strengths of each unit are exposed:

$$\max\ h_{j_0} = \frac{\sum_{r=1}^{s} u_{r j_0}\, y_{r j_0}}{\sum_{i=1}^{m} v_{i j_0}\, x_{i j_0}}, \qquad (2)$$

subject to

$$\frac{\sum_{r=1}^{s} u_{r j_0}\, y_{rj}}{\sum_{i=1}^{m} v_{i j_0}\, x_{ij}} \le 1, \quad j = 1, \ldots, j_0, \ldots, n;$$

$$u_{r j_0} \ge 0, \quad r = 1, \ldots, s;$$

$$v_{i j_0} \ge 0, \quad i = 1, \ldots, m.$$

DEA models that assume constant returns to scale are called CCR models (after the initials of the method's authors, Charnes, Cooper and Rhodes (Charnes et al. 1978)) or CRS (Constant Returns to Scale) models. Models that assume variable returns to scale are called BCC models (after the model's authors, Banker, Charnes and Cooper (Banker et al. 1984)) or VRS (Variable Returns to Scale) models. A DEA model can be input-oriented, in which case inputs are minimised subject to a lower bound on the outputs, or output-oriented, which means maximising outputs subject to an upper bound on the inputs (Guzik 2009).
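For readers who wish to experiment with the model, the sketch below solves an output-oriented CCR model in its envelopment (dual) form with a generic linear programming solver. It is a minimal illustration only, not the implementation used in this study; the function name, the use of SciPy's linprog and the sample data are assumptions for demonstration.

```python
# Output-oriented CCR (constant returns to scale) DEA in envelopment form.
# Illustrative sketch only: data, names and the use of SciPy are assumptions.
import numpy as np
from scipy.optimize import linprog

def ccr_output_oriented(X, Y):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns scores in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [phi, lambda_1, ..., lambda_n]; maximise phi == minimise -phi.
        c = np.concatenate(([-1.0], np.zeros(n)))
        # Inputs of the composite unit may not exceed those of DMU o:
        #   sum_j lambda_j * x_ij <= x_io
        A_in = np.hstack([np.zeros((m, 1)), X.T])
        b_in = X[o, :]
        # Outputs of the composite unit must reach phi times those of DMU o:
        #   phi * y_ro - sum_j lambda_j * y_rj <= 0
        A_out = np.hstack([Y[o, :].reshape(-1, 1), -Y.T])
        b_out = np.zeros(s)
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores[o] = 1.0 / res.x[0]  # phi >= 1, so the score lies in (0, 1]; 1 = efficient
    return scores

# Hypothetical example: 5 DMUs, a single input and two outputs
# (the single-input shape mirrors the model used later in the paper).
X = np.array([[10.0], [20.0], [15.0], [30.0], [25.0]])
Y = np.array([[8.0, 5.0], [15.0, 12.0], [12.0, 6.0], [20.0, 25.0], [18.0, 15.0]])
print(ccr_output_oriented(X, Y).round(3))
```

The envelopment form is the dual of the ratio model (2); both yield the same efficiency scores, but the envelopment form is usually more convenient to pass to a standard LP solver.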

3. Data analysis and selection of a model

Comparison of teaching and scholarly achievements of universities is complex and evokes a considerable amount of controversy. It is often argued that such a comparison is subjective and lacks a clear framework. DEA has its limitations and cannot pretend to be a universal and fully objective method. However, its conscious use may prove to be a source of valuable information on the HEI performance. The possibility to measure and compare values expressed in different units is an important advantage of the DEA method. Selection of variables is the primary and often the most difficult aspect of DEA application in the comparative analysis of DMUs. This paper presents two essential stages in the variables selection process: the merit-related and the statistics-related stage.

According to the DEA methodology, in order to analyse the efficiency of Polish universities of technology, it was assumed that each university (DMU--Decision Making Unit) may be characterised by its initial assets (system input), effects (results, system output) and transformation processes, which convert assets into effects (taking into account the impact of the environment, which remains outside the university's control).

15 variables concerning the financial, staff, organisational and qualitative aspects of university performance were analysed. The merit-related analysis resulted in the selection of 5 input variables, 8 output variables and 2 environmental variables. Table 1 presents the set of analysed variables with their description.

In order to detect relations between the variables, a correlation analysis was carried out in each group of variables.

All input variables are strongly and significantly correlated with each other (Table 2). The strongest correlation of all input variables may be observed with the variable [I.sub.1] (government budget subsidy obtained by a university). Thus, this variable is a very good representative of all input variables analysed initially. It is, therefore, accepted in the model as a variable representing input.

Results of a university performance should be related to the input variable. In order to determine the strength of that relation, correlation between the input variable and the output variables was calculated (Table 3).

Only four out of eight output variables are strongly and significantly (significance level p < 0.05) correlated with the input variable: [O.sub.1]--weighted number of full-time students based on their field of study; [O.sub.2]--weighted number of full-time PhD students calculated on the basis of their scholarly disciplines; [O.sub.7]--employers' preferences determined through survey research and [O.sub.8]--parametric assessment of scholarly achievements of universities carried out by the Ministry of Science and Higher Education. Correlation of the remaining output variables with the input variable is insignificant. Thus, these variables were excluded from further analysis.

In order to examine the impact of the environmental variables on the achieved results, the correlation between the environmental variables [E.sub.1] (population size of the city, in which the university is located), [E.sub.2] (percentage of students with need-based financial aid) and the output variables was calculated. It was established that the two environmental variables are characterised by a strong and significant correlation with output variables (Table 4). Variable [E.sub.2] shows negative correlation with the output variables. The obtained results indicate the need to include the environmental variables in the model.

Variables selected for the model should be characterised by a high level of variation, which enables clear diversification of HEI in respect to their input and achieved effects. All variables present in the model are characterised by a sufficiently high level of variation (coefficient of variation CV > 50%) (Table 5).

Ultimately variables [I.sub.1], [O.sub.1], [O.sub.2], [O.sub.7], [O.sub.8], [E.sub.1] and [E.sub.2] were selected for the comparative efficiency calculations with the use of DEA method (Table 6).
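As a minimal illustration of the two screening criteria used in this section (a significant Pearson correlation with the input variable and a coefficient of variation above 50%), the sketch below applies both filters to a hypothetical data set. The column names, thresholds passed as arguments and the generated data are assumptions for demonstration, not the authors' data or exact procedure.

```python
# Illustrative screening of candidate output variables: keep a variable only if it is
# significantly correlated with the input variable and sufficiently dispersed (CV > 0.5).
import numpy as np
import pandas as pd
from scipy import stats

def screen_outputs(df, input_col, output_cols, p_max=0.05, cv_min=0.5):
    kept = []
    for col in output_cols:
        r, p = stats.pearsonr(df[input_col], df[col])  # strength and significance of the link
        cv = df[col].std(ddof=1) / df[col].mean()      # coefficient of variation
        if p < p_max and cv > cv_min:
            kept.append(col)
    return kept

# Hypothetical data: 19 DMUs, one input (I1) and two candidate outputs.
rng = np.random.default_rng(0)
df = pd.DataFrame({"I1": rng.uniform(50, 400, 19)})
df["O1"] = 2.5 * df["I1"] + rng.normal(0, 30, 19)      # strongly tied to the input variable
df["O3"] = rng.uniform(4, 6, 19)                       # narrow, weakly varying variable
print(screen_outputs(df, "I1", ["O1", "O3"]))          # prints the variables that survive
```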

4. Comparative analysis of the institutional efficiency

Due to the character of the task, a CCR-CRS output-oriented model was chosen for the calculations (Eq. (2)). This model was considered suitable because universities have no direct influence on the size of the government budget subsidy. As a result of the very strong linear correlation of the output variables with the input variable and the impossibility of rapidly increasing the effects, a CRS (constant returns to scale) model was selected. Calculations were carried out with the use of the Frontier Analyst v. 4.1.0, Statistica 9 and Excel 2007 software.

During the first stage of calculations, the efficiency of the universities was determined excluding environmental variables. On the basis of the results it was found that the [O.sub.7] variable (employer hiring preferences) has a low share in the DMU's efficiency assessment. As a consequence, the calculations were repeated excluding this variable. The obtained results turned out to be practically identical with the previous ones (Table 7).

Therefore the [O.sub.7] variable was excluded from further calculations.

Since in several cases the DEA algorithm omitted some output variables (e.g. the number of students), the authors decided to impose constraints on the weights ascribed to the output variables. This is also justified by the fact that the government budget subsidy to the Polish HEI is mainly spent on educating students and that the universities of technology are required to carry out research and PhD-level education. On these premises, it was assumed that the shares of the [O.sub.1], [O.sub.2] and [O.sub.8] variables may not be lower than 30%, 10% and 20%, respectively. Calculations conducted with these assumptions slightly changed the results of particular universities; however, five out of the six universities which were previously considered efficient kept their status. In turn, the relative efficiency of some universities fell drastically (U12, U16, U13, U2), which indicates that their research strength and PhD-level education are relatively weak in comparison to other universities (Fig. 2).

[FIGURE 2 OMITTED]
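The share restrictions mentioned above can be expressed in the multiplier form as constraints on virtual output weights. The formulation below is offered only as an assumption of one common way to state such restrictions, since the paper does not give the exact form used in the software:

$$\frac{u_{r j_0}\, y_{r j_0}}{\sum_{k=1}^{s} u_{k j_0}\, y_{k j_0}} \ge \alpha_r, \qquad \alpha_{O_1} = 0.30, \quad \alpha_{O_2} = 0.10, \quad \alpha_{O_8} = 0.20.$$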

Next, the [E.sub.1] and [E.sub.2] environmental variables were introduced to the model by including them in the Frontier Analyst software as uncontrolled inputs. Due to the software requirements, the [E.sub.2] variable was replaced by the 1/[E.sub.2] variable in order to obtain a positive correlation between that variable and the outputs. During the calculations, it was observed that the introduction of the [E.sub.1] and [E.sub.2] variables resulted in the DEA algorithm assigning a zero weight to the [I.sub.1] variable. Since the utilisation of the government budget subsidy is the basis for the relative efficiency analysis of the universities, the authors decided to impose an additional constraint on the variable weights. It was assumed that the share of the [I.sub.1] variable may not be lower than 70% and the shares of the [E.sub.1] and [E.sub.2] variables may not be higher than 30%. Calculations carried out with these assumptions hardly changed the results of the analysis (except for single cases--U15) (Fig. 3). This indicates that the environment in which a university functions has no significant influence on its efficiency.

[FIGURE 3 OMITTED]

In order to study the sensitivity of the calculations to data errors, simulations were carried out in which the output variables were distorted by ±3%, ±5% and ±10%. Input variables remained unchanged since they were determined with high accuracy. The simulation demonstrated that the calculation results remained stable at the distortion level of ±3%. Distortion of ±5% caused significant shifts, but the general picture of the ranking was sustained. Distortion of ±10% caused instability of the results. The simulation results led to the conclusion that, since the weighted number of students (including PhD students) and the number of points in the parametric assessment of research achievements carried out by the Ministry of Science and Higher Education are based on factors and indicators which are set arbitrarily, one should exhibit far-reaching caution in interpreting the results of the university efficiency calculations. These results may to a large extent be determined by arbitrary assumptions. This problem may be a premise for further detailed studies in this area.
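The sketch below illustrates one way such a sensitivity check could be scripted: the output data are distorted by random multiplicative noise within ±3%, ±5% and ±10%, the scores are recomputed, and the stability of the ranking is summarised by a rank correlation. The noise model, trial count and helper names are assumptions, not the authors' simulation design; the scoring function is any function mapping (X, Y) to efficiency scores, for example the ccr_output_oriented() sketch given earlier.

```python
# Rank-stability check under random distortion of the output data (illustrative only).
import numpy as np
from scipy.stats import spearmanr

def rank_stability(score_fn, X, Y, levels=(0.03, 0.05, 0.10), trials=100, seed=1):
    rng = np.random.default_rng(seed)
    base = score_fn(X, Y)                                   # undistorted reference scores
    for eps in levels:
        rhos = []
        for _ in range(trials):
            noise = rng.uniform(1.0 - eps, 1.0 + eps, size=Y.shape)
            rho, _ = spearmanr(base, score_fn(X, Y * noise))
            rhos.append(rho)
        print(f"+/-{eps:.0%} distortion: mean Spearman rank correlation {np.mean(rhos):.3f}")

# Usage, assuming the earlier CCR sketch and its sample X, Y are in scope:
# rank_stability(ccr_output_oriented, X, Y)
```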

The last analysis aimed at studying the influence of a university size on its relative efficiency. University size (measured by the size of the government budget subsidy) shows moderate correlation (r = 0.53) with relative efficiency. It may lead to the conclusion that on average, larger universities achieve higher efficiency. This conclusion is supported by the visual analysis of the efficiency graph in the university size function (Fig. 4).

[FIGURE 4 OMITTED]

An additional element of the DEA method analysis is the possibility to develop benchmarking graphs to compare objects (Guzik 2009). The vertices of the graph represent objects and the lines visualise the relations between the objects. The arrows indicate the direction of interaction and lead from the example (model) objects to the objects following the example. The benchmarking graph indicates which objects should serve as model objects for those that are not fully efficient, so that they can work according to the optimal technology (Fig. 5).

[FIGURE 5 OMITTED]

On the basis of the benchmarking graph of the university (Fig. 5), the following conclusions can be drawn:

--universities U5 and U4 could serve as examples for HEI that are not as efficient;

--special attention should be paid to the university U5. It is present in benchmarking formulas of eight universities;

--the maximum of three (in four) efficient universities are the benchmarks for the inefficient ones.

Analysis of the benchmarking graph enables the identification of best practices in order to set criteria for improvement and to measure progress (Karlof, Ostblom 1993).
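As an illustration of how such a graph can be derived from DEA results, the sketch below extracts benchmark relations from the optimal peer weights (lambdas) of the envelopment model: an inefficient unit follows every unit that enters its reference set with a positive weight. The score and lambda values used here are hypothetical, and the helper name is an assumption; this is not the procedure reported in the paper.

```python
# Build (model, follower) edges of a benchmarking graph from DEA peer weights.
import numpy as np

def benchmark_edges(scores, lambdas, tol=1e-6):
    """Return (model, follower) pairs: an arrow leads from each benchmark unit to every
    inefficient unit whose optimal reference set includes it with a positive weight."""
    edges = []
    for j, score in enumerate(scores):
        if score < 1.0 - tol:                      # only inefficient units follow examples
            for k, lam in enumerate(lambdas[j]):
                if k != j and lam > tol:
                    edges.append((k, j))
    return edges

# Hypothetical example: unit 0 is efficient; units 1 and 2 use it as their benchmark.
scores = np.array([1.0, 0.81, 0.93])
lambdas = np.array([[1.0, 0.0, 0.0],
                    [0.7, 0.0, 0.0],
                    [1.2, 0.0, 0.0]])
print(benchmark_edges(scores, lambdas))            # -> [(0, 1), (0, 2)]
```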

Conclusions

The paper presented an example of the implementation of the DEA method in the efficiency assessment of Polish universities of technology. This example shows the usefulness and rationality of DEA application in the higher education sector. Systematic and multi-criteria assessment of public sector institutions may bring many benefits, not only to the authorities that operate with limited public funds but, first of all, to the assessed units. DEA results carry significant information on the efficiency of HEI functioning in relation to other institutions with a similar scope of activity. They point at the attainable results and at the factors which most influence the efficiency of a unit. The authors are convinced that comparative efficiency analysis may be one of the important stimuli to increase the quality of education and research, to improve the spending efficiency of public funds and their allocation, as well as to perfect HEI management. There are many good practices in the sector, but they need better dissemination.

The study presented in the paper--though limited in scope--shows that Polish universities of technology are diversified in regard to the efficiency of their performance. It is demonstrated that there are considerable reserves for efficiency improvement in particular schools. At the same time, one should warn against too hasty and straightforward reading of the calculation results obtained using the DEA method. Proper interpretation of these results requires deep knowledge of the studied area and a high degree of caution when formulating radical conclusions.

Caption: Fig. 1. Concept of the DEA method

Caption: Fig. 2. University efficiency scores: Score 1--without restrictions on the output weights, Score 2--with restrictions on the output weights

Caption: Fig. 3. University efficiency scores taking into account the environmental variables: Score 1--without restrictions on the environmental variable weights, Score 2--with restrictions on the environmental variable weights

Caption: Fig. 4. Efficiency score versus university size

Caption: Fig. 5. Benchmarking graph

doi:10.3846/20294913.2013.837116

References

Abbott, M.; Doucouliagos, C. 2003. The efficiency of Australian universities: a data envelopment analysis, Economics of Education Review 22(1): 89-97. http://dx.doi.org/10.1016/S0272-7757(01)00068-1

Abramo, G.; Cicero, T.; D'Angelo, C. A. 2011. A field-standardized application of DEA to national-scale research assessment of universities, Journal of Informetrics 5(4): 618-628. http://dx.doi.org/10.1016/j.joi.2011.06.001

Agasisti, T.; Pohl, C. 2011. Comparing German and Italian public universities: convergence or divergence in the higher education landscape?, Managerial and Decision Economics 33: 71-85. http://dx.doi.org/10.1002/mde.1561

Aoki, A. 2010. Data envelopment analysis for evaluating Japanese universities, Artificial Life and Robotics 15: 165-170. http://dx.doi.org/10.1007/s10015-010-0786-7

Aristovnik, A. 2012. The relative efficiency of education and R&D expenditures in the new EU members states, Journal of Business Economics and Management 13(5): 832-848. http://dx.doi.org/10.3846/16111699.2011.620167

Athanassopoulos, A.; Shale, E. 1997. Assessing the comparative efficiency of higher education institutions in the UK by means of data envelopment analysis, Education Economics 5: 117-133. http://dx.doi.org/10.1080/09645299700000011

Avkiran, N. K. 2001. Investigating technical and scale efficiencies of Australian universities through data envelopment analysis, Socio-Economic Planning Sciences 35(1): 57-80. http://dx.doi.org/10.1016/S0038-0121(00)00010-0

Banker, R. D.; Charnes, A.; Cooper, W. W. 1984. Some models for estimating technical and scale inefficiencies in data envelopment analysis, Management Science 30(9): 1078-1092. http://dx.doi.org/10.1287/mnsc.30.9.1078

Bonaccorsi, A.; Daraio, C. 2009. Characterising the European university system: a preliminary classification using census microdata, Science and Public Policy 36(10): 763-775. http://dx.doi.org/10.3152/030234209X475245

Bradley, S.; Johnes, J.; Little, A. 2006. The measurement and determinants of efficiency and productivity in the FE sector in England, Lancaster University Management School Working Paper 2006/036.

Bradley, S.; Johnes, J.; Little, A. 2010. Measurement and determinants of efficiency and productivity in the further education sector, Bulletin of Economic Research 62(1): 1-30. http://dx.doi.org/10.1111/j.1467-8586.2009.00309.x

Breu, T. M.; Raab, R. L. 1994. Efficiency and perceived quality of the nation's "top 25" national universities and national Liberal Arts colleges: an application of data envelopment analysis to higher education, Socio-Economic Planning Sciences 28(1): 33-45. http://dx.doi.org/10.1016/0038-0121(94)90023-X

Casu, B.; Thanassoulis, E. 2006. Evaluating cost efficiency in central administrative services in UK universities, Omega 34(5): 417-426. http://dx.doi.org/10.1016/j.omega.2004.07.020

Chalos, P.; Cherian, J. 1995. An application of Data Envelopment Analysis to public sector performance measurement and accountability, Journal of Accounting and Public Policy 160: 143-160. http://dx.doi.org/10.1016/0278-4254(94)00015-S

Charnes, A.; Cooper, W. W.; Rhodes, E. 1978. Measuring the efficiency of decision making units, European Journal of Operational Research 2(6): 429-444. http://dx.doi.org/10.1016/0377-2217(78)90138-8

Chavas, J.-P.; Barham, B.; Foltz, J.; Kim, K. 2012. Analysis and decomposition of scope economies: R&D at US research universities, Applied Economics 44: 1387-1404. http://dx.doi.org/10.1080/00036846.2010.541151

Chen, J.-K.; Chen, I.-S. 2011. Inno-Qual efficiency of higher education: empirical testing using data envelopment analysis, Expert Systems with Applications 38(3): 1823-1834. http://dx.doi.org/10.1016/j.eswa.2010.07.111

Coelli, T.; Rao, D. S. P.; Battese, G. E. 2002. An introduction to efficiency and productivity analysis. Boston: Kluwer Academic Publishers.

Del Rey, E.; Racionero, M. 2010. Financing schemes for higher education, European Journal of Political Economy 26(1): 104-113. http://dx.doi.org/10.1016/j.ejpoleco.2009.09.002

Farrell, M. J. 1957, The measurement of productive efficiency, Journal of Royal Statistical Society 120(3): 253-281. http://dx.doi.org/10.2307/2343100

Flegg, A. T.; Allen, D. O.; Field, K.; Thurlow, T. W. 2004. Measuring the efficiency of British universities: a multi-period data envelopment analysis, Education Economics 12: 231-249. http://dx.doi.org/10.1080/0904529042000258590

Glass, J. C.; McCallion, G.; McKillop, D. G.; Rasaratnam, S.; Stringer, K. S. 2006. Implications of variant efficiency measures for policy evaluations in UK higher education, Socio-Economic Planning Sciences 40(2): 119-142. http://dx.doi.org/10.1016/j.seps.2004.10.004

Guzik, B. 2009. Podstawowe modele DEA w badaniu efektywnosci gospodarczej i spotecznej, Wydawnictwo Uniwersytetu Ekonomicznego w Poznaniu, Poznan.

Higher education in Poland [online]. 2013. Erasmus. Available from Internet: www.erasmus.org.pl/higher-education-poland

Higher Education Institutions and their Finances in 2011. 2012. Central Statistical Office, Warsaw, Poland.

Hirao, Y. 2012. Efficiency of the top 50 business schools in the United States, Applied Economics Letters 19: 73-78. http://dx.doi.org/10.1080/13504851.2011.568380

Johnes, J. 2006. Data envelopment analysis and its application to the measurement of efficiency in higher education, Economics of Education Review 3: 273-288. http://dx.doi.org/10.1016/j.econedurev.2005.02.005

Johnes, J. 2006a. Measuring teaching efficiency in higher education: an application of data envelopment analysis to economics graduates from UK universities 1993, European Journal of Operational Research 174(1): 443-456. http://dx.doi.org/10.1016/j.ejor.2005.02.044

Johnes, J.; Yu, L. 2008. Measuring the research performance of Chinese higher education institutions using data envelopment analysis, China Economic Review 19: 679-696. http://dx.doi.org/10.1016/j.chieco.2008.08.004

Kantabutra, S.; Tang, J. C. S. 2010. Efficiency analysis of public universities in Thailand, Tertiary Education and Management 16(1): 15-33. http://dx.doi.org/10.1080/13583881003629798

Karlof, B.; Ostblom, S. 1993. Benchmarking: a signpost to excellence in quality and production. New York: John Wiley & Sons. 197 p.

Katharaki, M.; Katharakis, G. 2010. A comparative assessment of Greek universities' efficiency using quantitative analysis, International Journal of Educational Research 49: 115-128. http://dx.doi.org/10.1016/j.ijer.2010.11.001

Kempkes, G.; Pohl, C. 2010. The efficiency of German universities--some evidence from nonparametric and parametric methods, Applied Economics 42: 2063-2079. http://dx.doi.org/10.1080/00036840701765361

Kuah, Ch. T.; Wong, K. Y. 2011. Efficiency assessment of universities through data envelopment analysis, Procedia Computer Science 3: 499-506. http://dx.doi.org/10.1016/j.procs.2010.12.084

Leitner, K.; Prikoszovits, J.; Schaffhauser-Linzatti, M.; Stowasser, R.; Wagner, K. 2007. The impact of size and specialisation on universities' department performance: a DEA analysis applied to Austrian universities, Higher Education 53: 517-538. http://dx.doi.org/10.1007/s10734-006-0002-9

Madden, G.; Savage, S.; Kemp, S. 1997. Measuring public sector efficiency: a study of economics departments at Australian universities, Education Economics 5: 153-168. http://dx.doi.org/10.1080/09645299700000013

McMillan, M.; Chan, W. H. 2006. University efficiency, a comparison and consolidation of results from stochastic and non-stochastic methods, Education Economics 14(1): 1-30. http://dx.doi.org/10.1080/09645290500481857

McMillan, M.; Datta, D. 1998. The relative efficiencies of Canadian universities: a DEA perspective, Canadian Public Policy--Analyse de Politiques 24(4): 485-511.

Montoneri, B.; Lin, T. T.; Lee, C.-C.; Huang, S.-L. 2012. Application of data envelopment analysis on the indicators contributing to learning and teaching performance, Teaching and Teacher Education 28: 382-395. http://dx.doi.org/10.1016/j.tate.2011.11.006

Munteanu, V.; Andrei-Coman, N. 2011. Modern methods for underlying the revenue and expenses budget in higher education institutions from Romania based on the criteria of cost-efficiency, Procedia Social and Behavioral Sciences 15: 2445-2448.

Nazarko, J.; Komuda, M.; Kuimicz A. K.; Szubzda, E.; Urban, J. 2008. The DEA method in public sector institutions efficiency analysis on the basis of higher education institutions, Badania Operacyjne i Decyzje 4: 89-105 (in Polish).

Nazarko, J.; Kuimicz, A. K.; Szubzda, E.; Urban, J. 2009. The general concept of benchmarking and its application in higher education in Europe, Higher Education in Europe 3-4: 497-510. http://dx.doi.org/10.1080/03797720903356677

Odeck, J. 2005. Evaluating target achievements in the public sector: an application of a rare non-parametric DEA and Malmquist indices, Journal of Applied Economics 8: 171-190.

Onsel, S.; Ulengin, F.; Ulusoy, G.; Aktas, E.; Kabak, O.; Topcu, Y. I. 2008. A new perspective on the competitiveness of nations, Socio-Economic Planning Sciences 42: 221-246. http://dx.doi.org/10.1016/j.seps.2007.11.001

Rozporzadzenie Ministra Nauki i Szkolnictwa Wyzszego z dnia 8 lutego 2012 r. w sprawie sposobu podzialu dotacji z budzetu panstwa dla uczelni publicznych i niepublicznych. 2012. Dz.U. 2012 nr 0 poz. 202. (Feb. 8, 2012 Act of [the Republic of Poland's] Minister of Research and Higher Education regarding the method of distribution of government subsidies for public and private institutions of higher education, Republic of Poland Government Register 2012, no. 0, item 202).

Salerno, C. 2006. Using data envelopment analysis to improve estimates of higher education institution's per-student education costs, Education Economics 14(3): 281-295. http://dx.doi.org/10.1080/09645290600777485

Stukalina, Y. 2012. Addressing service quality issues in higher education: the educational environment evaluation from the students' perspective, Technological and Economic Development of Economy 18(1): 84-98. http://dx.doi.org/10.3846/20294913.2012.658099

Taylor, B.; Harris, G. 2004. Relative efficiency among South African universities: a data envelopment analysis, Higher Education 47: 73-89. http://dx.doi.org/10.1023/B:HIGH.0000009805.98400.4d

Thanassoulis, E. 2003. Introduction to the theory and application of data envelopment analysis: a foundation text with integrated software. Boston: Kluwer Academic Publishers. 281 p.

Thanassoulis, E.; Conceicao, M. da; Portela, S. 2002. School outcomes: sharing the responsibility between pupil and school, Education Economics 10(2): 183-207. http://dx.doi.org/10.1080/09645290210126913

Zafiropoulos, C.; Vrana, V. 2008. Service quality assessment in a Greek higher education institute, Journal of Business Economics and Management 9(1): 33-45. http://dx.doi.org/10.3846/1611-1699.2008.9.33-45

Received 18 April 2013; accepted 30 July 2013

Joanicjusz NAZARKO (a), Jonas SAPARAUSKAS (b)

(a) Department of Business Informatics and Logistics, Faculty of Management, Bialystok Technical University, Wiejska 45 A Street, 15-351 Bialystok, Poland

(b) Vilnius Gediminas Technical University, Sauletekio al. 11, 10223 Vilnius, Lithuania

Corresponding author Jonas Saparauskas

E-mail: [email protected]

Joanicjusz NAZARKO is a Professor at the Bialystok University of Technology in Poland. He serves as Dean of the Faculty of Management and Head of the Department of Business Informatics and Logistics. He is the author of more than 200 publications and a number of expert assessments, projects and technical and economic elaborations, and a recognised expert in the field of forecasting, foresight, benchmarking and productivity analysis in the corporate and public sectors. Nazarko is a member of the Production Engineering Committee of the Polish Academy of Sciences. He served as an expert of the EU 7th Framework Programme and is a senior member of IEEE.

Jonas SAPARAUSKAS. Doctor, Associate Professor at the Department of Construction Technology and Management, and Vice-Dean for Undergraduate Studies at the Faculty of Civil Engineering of Vilnius Gediminas Technical University. He is a member of the EURO Working Group on OR in Sustainable Development and Civil Engineering (EWG-ORSDCE). He is the author and co-author of more than 20 scientific articles and 1 book. Research interests: construction technology and organisation, construction investment management, multiple criteria decision making, sustainable development.
Table 1. Model variables

Input           [I.sub.1]   Government budget subsidy [PLN]
variables
                [I.sub.2]   Number of academic teachers

                [I.sub.3]   Number of other employees

                [I.sub.4]   Number of licenses to award PhD degrees

                [I.sub.5]   Number of licenses to award higher
                            doctorate degrees

Output          [O.sub.1]   Weighted number of full-time students
variables
                [O.sub.2]   Weighted number of full-time PhD students

                [O.sub.3]   Percentage of students studying abroad

                [O.sub.4]   Percentage of international students

                [O.sub.5]   Percentage of students with university
                            scholarships

                [O.sub.6]   Percentage of students with government
                            ministry scholarships

                [O.sub.7]   Employer preference for hiring alumni

                [O.sub.8]   Parametric assessment of scholarly
                            achievements of faculty

Environmental   [E.sub.1]   Population size of the city where the
variables                   university is located

                [E.sub.2]   Percentage of students with need-based
                            financial aid

Source: Elaborated by the authors.

Table 2. Pearson correlation coefficient of input variables

            [I.sub.1]   [I.sub.2]   [I.sub.3]   [I.sub.4]   [I.sub.5]

[I.sub.1]     1.000       0.984       0.982       0.953       0.988
[I.sub.2]     0.984       1.000       0.968       0.958       0.968
[I.sub.3]     0.982       0.968       1.000       0.942       0.953
[I.sub.4]     0.953       0.958       0.942       1.000       0.944
[I.sub.5]     0.988       0.968       0.953       0.944       1.000

Source: Calculated by the authors.

Table 3. Pearson correlation coefficient of input and output variables

            [O.sub.1]   [O.sub.2]   [O.sub.3]   [O.sub.4]

[I.sub.1]     0.97        0.96        0.22        0.15
P             0.00        0.00        0.36        0.53

            [O.sub.5]   [O.sub.6]   [O.sub.7]   [O.sub.8]

[I.sub.1]     0.18        0.43        0.93        0.96
P             0.46        0.06        0.00        0.00

Source: Calculated by the authors.

Table 4. Pearson correlation coefficient of output and environmental
variables

            [O.sub.1]   [O.sub.2]   [O.sub.7]   [O.sub.8]

[E.sub.1]    0.7186      0.8391      0.8314      0.8563
[E.sub.2]    -0.5496     -0.5803     -0.5079     -0.6368

Source: Calculated by the authors.

Table 5. Coefficient of variation of model variables

     [I.sub.1]   [O.sub.1]   [O.sub.2]   [O.sub.7]

CV     0.59        0.65        1.10        1.19

     [O.sub.8]   [E.sub.1]   [E.sub.2]

CV     0.82        0.78        0.86

Source: Calculated by the authors.

Table 6. Variables selected for DEA model

Input variable     [I.sub.1]   Government budget subsidy

Output variables   [O.sub.1]   Weighted number of full-time students

                   [O.sub.2]   Weighted number of full-time PhD
                               students

                   [O.sub.7]   Employer hiring preferences with
                               respect to alumni

                   [O.sub.8]   Parametric assessment of scholarly
                               achievements

Environmental      [E.sub.1]   Population size of the city, in which
variables                      the university is located

                   [E.sub.2]   Percentage of students with need-based
                               financial aid

Source: Elaborated by the authors.

Table 7. Efficiency scores for 19 universities

No     Univ.       Score

1    [U.sub.1]    100.00%
2    [U.sub.4]    100.00%
3    [U.sub.5]    100.00%
4    [U.sub.10]   100.00%
5    [U.sub.17]   100.00%
6    [U.sub.18]   100.00%
7    [U.sub.6]    97.30%
8    [U.sub.11]   96.60%
9    [U.sub.19]   95.70%
10   [U.sub.2]    93.90%
11   [U.sub.9]    91.10%
12   [U.sub.15]   86.50%
13   [U.sub.16]   84.10%
14   [U.sub.14]   83.30%
15   [U.sub.13]   83.10%
16   [U.sub.8]    82.80%
17   [U.sub.7]    81.20%
18   [U.sub.3]    79.80%
19   [U.sub.12]   75.00%

Source: Calculated by the authors.