
Article Information

  • Title: Assessing the impact of the Title III program on doctoral and professional programs at minority serving institutions using a multilevel Rasch Rating Scale Model.
  • Authors: Kaliba, Aloyce R.; Powell, Kimberly K.
  • Journal: International Journal of Education Research (IJER)
  • Print ISSN: 1932-8443
  • Year: 2012
  • Issue: March
  • Language: English
  • Publisher: International Academy of Business and Public Administration Disciplines
  • Keywords: African American universities and; Doctoral degrees; Educational programs; Historically black colleges and universities; Minority serving institutions; Minority serving institutions (Universities and colleges); Minority serving universities and colleges; Multilevel analysis; Professional development

Assessing the impact of the Title III program on doctoral and professional programs at minority serving institutions using a multilevel Rasch Rating Scale Model.


Kaliba, Aloyce R. ; Powell, Kimberly K.


INTRODUCTION

Critics of Historically Black Colleges and Universities (HBCUs) contend that, while these institutions have historically fulfilled their mission, their current performance is at best marginal, if not negative, in terms of educational outcomes for their students. Consequently, such critics posit that these institutions should not receive race-based special funding. Proponents of HBCUs argue that the institutions have a history of successful outreach and play a major role in developing human capital in the African American community. For example, while HBCUs constitute three percent (3%) of higher education institutions in the United States, they graduate approximately twenty-eight percent (28%) of African American undergraduates (Jackson & Nunn, 2003). African American students at HBCUs are more likely than their counterparts at other institutions to pursue postgraduate education and become professionals (Drewry & Doermann, 2001; Wenglinsky, 1996). HBCUs award nearly thirty-five percent (35%) of the bachelor's degrees earned by African American students in astronomy, biology, chemistry, mathematics, and physics, and over fifty percent (50%) of African American teachers graduated from HBCUs (Jackson & Nunn, 2003). In addition, HBCUs continue to account for a disproportionately high number of diverse minority entrants into the labor force (National Association for Equal Opportunity in Higher Education, 2009), and they have contributed significantly to helping low-income, minority students obtain postsecondary education (United States Department of Education, 1999).

HBCUs therefore remain relevant and a vital part of the American higher education landscape, as embedded in their mission and outreach. Given that HBCUs continue to serve students who will make up the majority of the future workforce, and as the number of minority students continues to grow, the practices, experiences, and contributions of HBCUs will be essential in diversifying the workforce and fostering entrepreneurial efforts (Kelderman, 2010). However, these institutions need to highlight the positive work they are doing, and research is needed to identify successes, weaknesses, and areas for improvement. This is necessitated by the current state of the economy, declining budgets, and the difficulty higher education faces in competing for public funding with other social priorities such as health care, K-12 education, crime prevention, and incarceration (Duderstadt & Womack, 2003).

Title III, Part B of the Higher Education Act of 1965 was created because a significant number of African American students who attend HBCUs come disproportionately from low-income families and enter with lower high school achievement and standardized test scores. Because of their historical background, HBCUs are better equipped than predominantly white institutions (PWIs) to prepare these students academically, psychologically, and socially (Lucas, 1994). Congress found that HBCUs had in fact contributed significantly to helping African American and low-income students obtain post-secondary degrees. Subsequently, the U.S. Congress approved financial assistance in the form of grant awards under Title III, Part B of the Higher Education Act of 1965 for graduate and professional programs at HBCUs that were making significant contributions to certain disciplines at the doctoral and professional levels (H.R. 9567, 1965; USDE, 1999). The program aims at strengthening and building the capacity of eligible doctoral programs in science, mathematics, technology, and engineering offered by HBCUs. Other eligible professional programs include law, dentistry, medicine, pharmacy, and veterinary science (USDE, 1999).

The 1986 amendments to the Higher Education Act of 1965 (P.L. 99-498) introduced major changes in funding procedures and in the use of race-specific language designating HBCUs in Title III, Part B of the Act. Under Part B, HBCUs would receive exclusive funding allocated through a formula-based method: one-half of an institution's allocation is based on its enrolled Pell grant recipients, one-fourth on its total number of graduates, and the remaining fourth on its number of graduates who go on to graduate or professional degree programs in which African Americans are underrepresented (Boren et al., 1987).
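
As an illustration of the formula-based method described above, the following minimal Python sketch applies the one-half / one-fourth / one-fourth weights to hypothetical institutional counts. The function name, institution labels, and all numbers are invented for the example and are not drawn from actual allocation data.

```python
def title3b_allocation_index(pell_recipients: int,
                             total_graduates: int,
                             grads_in_underrepresented_programs: int) -> float:
    """Weighted index for one institution under the Part B formula described by
    Boren et al. (1987): one-half Pell grant recipients, one-fourth total
    graduates, one-fourth graduates entering graduate/professional programs in
    which African Americans are underrepresented."""
    return (0.50 * pell_recipients
            + 0.25 * total_graduates
            + 0.25 * grads_in_underrepresented_programs)


# Hypothetical counts for two institutions; each share of the appropriation is
# the institution's index divided by the sum of indices across institutions.
indices = {
    "Institution A": title3b_allocation_index(4000, 1200, 300),
    "Institution B": title3b_allocation_index(2500, 900, 150),
}
total = sum(indices.values())
shares = {name: round(value / total, 3) for name, value in indices.items()}
print(shares)
```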

A number of reports give accolades to the Title III program, citing its impact on strengthening and building the capacity of eligible doctoral programs in science, mathematics, technology, and engineering, and of other eligible professional programs such as law, dentistry, medicine, pharmacy, and veterinary science. Even though the program has provided notable resources for capacity building, the extent of its success, as judged by its intended stakeholders, is not known. The lack of systematic program evaluation contributes to questions and skepticism regarding the efficacy of the program. Kendrick (1981), Norman (1985), and Patrick (1992) conducted studies that assessed the impact of Title III on academic quality, institutional viability, and survival at eligible HBCUs; their respective studies found that Title III funding contributed to strengthening these areas. However, these studies did not fill the gaps regarding the program's impact on specific areas within eligible doctoral and professional programs. ExpectMore.gov (2005) reviewed the program and its potential impact on eligible doctoral and professional programs and found that adequate documentation for assessing the program's impact was unavailable.

The U.S. Government Accountability Office (GAO) conducted studies of federal programs, including the Title III program at minority serving institutions, in 2007 and 2009. The 2007 study noted that feedback mechanisms encouraging open communication from recipient institutions were limited (Scott, 2007). The follow-up study in 2009 recommended stronger management oversight by the Department of Education and the creation of mechanisms for collecting feedback from recipient institutions (GAO, 2009). The Department of Education has since taken some steps to increase communication with recipient institutions, including exploring the use of webinars for conferences with recipient institutions and an e-mail address through which grantees can express concerns and inquiries anonymously (Scott, 2010). Although the Department of Education has taken these steps toward increased communication with stakeholders, the efforts are still limited and in the early stages of development. Therefore, there is still much to learn about the program in the context of its impact on eligible doctoral and professional programs.

This study was designed to assess the impact of the program by surveying various stakeholders. The objective was to determine whether the program is fulfilling its intended purpose. The hypothesis is that, due to limited and continually shrinking budgets, administrators across institutions are focusing on fewer activities. The specific objectives were to examine the impact of the program on: (a) enhancing research and instruction activities, (b) technology development and use in classes, (c) facilities improvement, (d) student financial assistance, (e) student services, (f) faculty development, and (g) institutional financial stability.

METHODOLOGY

Data for this study were collected from three HBCUs, selected using convenience sampling. For confidentiality reasons, the authors cannot provide any information that could be used to identify the institutions. Eligible programs and institutions include: 1) Morehouse School of Medicine; 2) Meharry Medical School; 3) Charles R. Drew Postgraduate Medical School; 4) Clark-Atlanta University; 5) Tuskegee University School of Veterinary Medicine and other qualified graduate programs; 6) Xavier University School of Pharmacy and other qualified graduate programs; 7) Southern University School of Law and other qualified graduate programs; 8) Texas Southern University School of Law and School of Pharmacy and other qualified graduate programs; 9) Florida A&M University School of Pharmaceutical Sciences and other qualified graduate programs; 10) North Carolina Central University School of Law and other qualified graduate programs; 11) Morgan State University qualified graduate program; 12) Hampton University qualified graduate program; 13) Alabama A&M qualified graduate program; 14) North Carolina A&T State University qualified graduate program; 15) University of Maryland Eastern Shore qualified graduate program; 16) Jackson State University qualified graduate program; 17) Norfolk State University qualified graduate programs; and 18) Tennessee State University qualified graduate programs (USDE, 1999).

Five criteria were used to select the eligible institutions: the amount of funding was in the upper quintile; student enrollment was over 5,000 students; the institution had at least two eligible doctoral and professional programs; more than twenty-five percent (25%) of funding was allocated toward scholarships and assistantships for students; and the institution was located in the southern region. These criteria were put in place to capture a reasonable number of students per eligible program and a sufficient institutional budget in case follow-up was needed. For each institution, the level of funding depends on the size of the eligible programs. The website of the U.S. Department of Education indicates that the fiscal year 2011 appropriation for this program was $61,302,150, a reduction of $122,850 from the fiscal year 2010 level. During this study, funding ranged between $2 and $4 million per year for 50% of the recipients, while the sample institutions received an average of $9 million per year (USDE, 2011).

E-mail addresses of administrators who manage eligible doctoral and professional programs were obtained from the websites of the three institutions. An initial letter was sent electronically to these administrators soliciting support for this study; the letter briefly detailed the purpose and significance of the study. All contacted administrators agreed to participate and directed faculty members who were knowledgeable of Title III funding to complete the electronic survey instrument. All potential respondents were given a three-week window to complete and electronically submit the survey, and measures were put in place to ensure that respondents could not submit more than one completed survey instrument. Two follow-up reminders were transmitted to administrators asking them to remind faculty members of the importance of the survey, in order to enhance the response rate. After the three-week cutoff date, all submitted surveys were reviewed for completeness. A total of 47 completed responses were received.

The questions/statements in Appendix 1 were developed by taking into account allowable expenditures and expected, measurable outcomes. The survey instrument was divided into the following subsections: (a) demographic questions regarding the respondents' gender, the institution at which they were employed, the position(s) they held, how long they had been employed at the institution, and the school/college in which they worked; (b) twenty-five questions designed to assess the impact of the Title III program, divided into seven domains: research and instruction, technology development, facilities improvement, student financial assistance, student services, faculty development, and financial stability; (c) questions designed to obtain information on strategies for improving and communicating the success of the Title III program, including views on modifications to the current legislation, performance reporting, research-based decision making, and means of increasing feedback from stakeholders through various platforms; and (d) a comment section. All questions in sections (b) and (c) used a five-category ordered Likert scale coded from 1 to 5, with a greater score indicating a more favorable impact: 1=strongly disagree, 2=disagree, 3=neither agree nor disagree, 4=agree, and 5=strongly agree. Refer to Appendix 1 for details on the statements used by faculty members of the three institutions to rank the impact of the program.

EMPIRICAL MODEL DEVELOPMENT

In this study, respondents were asked to evaluate different statements and indicate their degree of agreement using the Likert-based survey items shown in Appendix 1. Based on the responses, each item or statement may be analyzed separately, or items may be summed to create a score for a group of items or a summative scale (Bradley, Sampson & Royal, 2006). However, analyzing single-item responses pertaining to a latent variable (the impact of the program) is not reliable, and it is generally not advisable to make inferences from single-item responses when measuring a scaled latent variable (Johnson & Albert, 1999). In this study, the statements used five ordered Likert categories; such responses can be treated as either ordinal or interval-level measurements. Responses to a single Likert item are normally treated as ordinal data because one cannot assume that the response categories are equidistant (Fox, 2005); that is, one score can be higher than another, but it cannot be determined how much higher. When responses to several items are summed to measure a latent variable such as the impact of the program, and all statements in the survey instrument use the same Likert scale, the responses are treated as interval data. Interval data convey the distance between two points, and the differences between adjacent responses are equal in distance (Zheng & Rabe-Hesketh, 2009).

Interval estimates on a continuous scale can be obtained from Likert scale responses by applying polytomous Rasch models (PRM). The polytomous Rasch model is a generalization of the dichotomous Rasch model (Rasch, 1960, 1980). For Likert scale responses, the polytomous Rasch model measures the latent variable on a continuous scale and permits testing of the hypothesis that the statements in the Likert scale reflect increasing levels of the latent variable, as intended (Rabe-Hesketh & Skrondal, 2008). The most common polytomous Rasch model is the Rating Scale Model (RSM), which assumes identical threshold distances across items (Andrich, 1978). This study applied the RSM for the data analysis because the categorical responses were ordinal and the same across all items/statements used to measure the impact of the program.

DATA ANALYSES

The Econometric Model

For the data analysis, let X be a data matrix of the responses of v = 1, ..., N subjects to k = 1, ..., K polytomous items, each with the same number h = 1, ..., M of response categories. Subjects are in the rows and items (statements) in the columns. The polytomous Rating Scale Model (RSM), as suggested by Andrich (1978), is:

\[ P(X_{vk} = h \mid \theta_v) = \frac{\exp\left[\gamma_h + h\left(\theta_v - \beta_k\right)\right]}{\sum_{l=1}^{M} \exp\left[\gamma_l + l\left(\theta_v - \beta_k\right)\right]} \qquad (1) \]

Equation (1) expresses the probability that subject $v$, with parameter $\theta_v$, gives any one of the possible response categories to item $k$. Here, $\beta_k$ are the item parameters, $\theta_v$ are the subject parameters, and $\gamma_h$ are category parameters describing the scoring, which is considered to be the same for all items. The parameter $\theta_v$ equals subject $v$'s interval value on the latent scale being measured (the impact of the program); as $\theta_v$ increases, the probability of selecting a higher response category $h$ increases monotonically. The $\beta_k$ parameters measure the marginal or location effect of item $k$, which depends on both the response option and the particular item. The sum in the denominator ensures that, for individual $v$, the probabilities $P(X_{vk} = h \mid \theta_v)$ over all response options to item $k$ sum to 1.
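
As a computational illustration of Equation (1), the short Python sketch below evaluates the category probabilities for a single subject and item; the parameter values are hypothetical and chosen only to show that the probabilities are positive and sum to one.

```python
import numpy as np

def rsm_probabilities(theta: float, beta: float, gamma: np.ndarray) -> np.ndarray:
    """Category probabilities under the rating scale model in Equation (1):
    P(X = h | theta) is proportional to exp(gamma_h + h * (theta - beta))."""
    h = np.arange(1, len(gamma) + 1)        # response categories 1, ..., M
    logits = gamma + h * (theta - beta)     # numerator exponents
    logits -= logits.max()                  # subtract max for numerical stability
    probs = np.exp(logits)
    return probs / probs.sum()              # denominator: sum over all categories

# Hypothetical example: four ordered categories with sum-to-zero category
# parameters, a subject at theta = 1.0, and an item with difficulty beta = 0.4.
gamma = np.array([-1.5, -0.5, 0.5, 1.5])
print(rsm_probabilities(theta=1.0, beta=0.4, gamma=gamma))  # sums to 1
```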

In this study, individual respondents are nested in schools/faculties, which are in turn nested in institutions/universities, and individual responses may vary by school and by institution. Multilevel modeling is therefore needed to capture this variability. Following Fox (2007), a general multilevel model with covariates is defined such that respondents (v = 1, ..., N) are nested within schools (i = 1, ..., I) and within the three surveyed institutions/universities (j = 1, 2, 3). The model that quantifies the impact, which depends on individual respondent characteristics and varies by school and institution, is specified as the following multilevel structure:

\[ P(X_{vijk} = h \mid \theta_{vij}) = \frac{\exp\left[\gamma_h + h\left(\theta_{vij} - \beta_k\right)\right]}{\sum_{l=1}^{M} \exp\left[\gamma_l + l\left(\theta_{vij} - \beta_k\right)\right]}, \qquad \theta_{vij} = \sum_{q=0}^{Q} \lambda_q z_{qvij} + \zeta^{(2)}_{ij} + \zeta^{(3)}_{j} \qquad (2) \]

In Equation (2), $z$ is a matrix of level-1 covariates with a total of $Q$ variables, including a vector of ones for the intercept when $q = 0$; the $\lambda_q$ are the corresponding fixed coefficients, and $\zeta^{(2)}_{ij}$ and $\zeta^{(3)}_{j}$ are random intercepts capturing school-level and institution-level variation, respectively (their estimated variances are reported in Table 1). For strictly ordered and increasing model thresholds, the following constraints are usually imposed on Equation (2) for parameter identification purposes:

\[ \sum_{h=1}^{M} \gamma_h = 0, \qquad \gamma_1 < \gamma_2 < \cdots < \gamma_M \qquad (3) \]

When constraints in Equation (3) are imposed, all consecutive differences defining the item category response functions will be negative and the sum of the item category response function will add to one (Chajewski & Lewis, 2009).
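
As a side note, one simple way to build category parameters that respect the constraints in Equation (3) is to generate strictly positive increments, accumulate them, and recenter the result to sum to zero. The Python sketch below is only an illustrative identification device, not the estimation routine used in this study.

```python
import numpy as np

def constrained_gammas(raw: np.ndarray) -> np.ndarray:
    """Map unconstrained values to category parameters that are strictly
    increasing and sum to zero, as required by Equation (3)."""
    steps = np.exp(raw)               # strictly positive increments
    ordered = np.cumsum(steps)        # strictly increasing sequence
    return ordered - ordered.mean()   # recenter so the parameters sum to zero

gammas = constrained_gammas(np.array([0.2, -0.4, 0.9, 0.1]))
print(gammas, round(gammas.sum(), 10))  # increasing values; sum is (numerically) zero
```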

The covariates included in the level-one model are the position of the respondent (i.e., dummy variables for administrator, faculty, Title III project director, and dual position), a dummy variable for gender, and an ordinal variable for experience (i.e., 1=0-5 years, 2=6-10 years, 3=11-15 years, 4=16-20 years, and 5=over 20 years). Due to the relatively small sample size, schools were grouped into two groups: agriculture, education and graduate studies, law, and other (group 1); and dental, medicine, other medical studies, nursing, sciences, and veterinary (group 2). For the same reason, the "strongly disagree" and "disagree" responses were combined into a single "disagree" response. There were therefore four response categories per item (i.e., 1=disagree, 2=neither agree nor disagree, 3=agree, and 4=strongly agree) and twenty-five items.
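
The recoding just described can be illustrated with a short, hypothetical pandas sketch; the column names and example rows are invented and do not correspond to the actual survey data.

```python
import pandas as pd

# Hypothetical raw survey extract; column names and values are illustrative only.
raw = pd.DataFrame({
    "school": ["Law", "Pharmacy", "Education", "Veterinary"],
    "Q7":     [1, 2, 4, 5],   # original five-point codes
})

# Collapse "strongly disagree" (1) and "disagree" (2) into a single category,
# yielding the four ordered categories used in the analysis.
recode = {1: 1, 2: 1, 3: 2, 4: 3, 5: 4}
raw["Q7_4cat"] = raw["Q7"].map(recode)

# Group schools into the two broad clusters described in the text.
group1 = {"Agriculture", "Education", "Graduate Studies", "Law", "Other"}
raw["school_group"] = raw["school"].apply(lambda s: 1 if s in group1 else 2)
print(raw)
```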

Notice that Equation (2) investigates whether there are any differential school or institutional effects on individual responses; for example, do administrator responses differ by school or institution? The model was estimated using gllamm, a program that runs in the statistical package Stata and estimates Generalized Linear Latent and Mixed Models by maximum likelihood (Rabe-Hesketh, Skrondal, & Pickles, 2004a; 2004b). Level-1 covariates were later dropped from the model due to lack of statistical significance.

Validity and Reliability Test of the Survey Instrument

Validity and reliability of the survey instrument were tested using face and content validity and Cronbach's alpha, respectively. Face and content validity were substantiated by a focus group knowledgeable of Part B, section 326 of the Title III program. The focus group carefully reviewed the survey instrument and provided guidance in constructing the instrument statements. In addition, members were asked to determine whether each statement appeared valid for measuring the impact of the program and to rate how essential each statement was for that purpose. Statements of the survey instrument were added, dropped, and modified accordingly.

A pilot survey was also conducted at one of the eligible institutions to determine the reliability of the instrument. An instrument is reliable to the extent that whatever it measures, it measures consistently; this study focused on the internal consistency of the survey instrument. In general, when the items on an instrument are not scored as right versus wrong, Cronbach's alpha is often used to measure internal consistency, especially when the instrument uses a Likert scale. The responses from the pilot study were analyzed to determine the reliability of the instrument; the estimated Cronbach's alpha was 0.78, which is within the acceptable range for a survey instrument (Creswell, 2009; Fong, Ho, & Lam, 2010).
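
For reference, Cronbach's alpha can be computed directly from a respondents-by-items score matrix, as in the Python sketch below; the pilot data shown are hypothetical and are not the data used in this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                 # number of items
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of the summed score
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot responses: six respondents, four items on a 1-5 scale.
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(pilot), 2))
```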

RESULTS OF THIS STUDY

Results from the Rasch Rating Scale Model are presented in Table 1. The log-likelihood ratio testing the null hypothesis that all variables were statistically insignificant was -1154 and was statistically significant at the 1% level. The estimated item difficulty scores ranged from 0.362, representing the lowest difficulty (Q7: Title III funds have aided in the area of instruction in doctoral and/or professional programs; research and instruction domain), to 3.826, representing the highest difficulty (Q28: Title III funds have established or improved a development office to assist doctoral and/or professional programs in increasing contributions from alumni and the private sector; financial stability domain). Difficulty scores between 1.942 and 2.287 represent intermediate difficulty and include the following three statements: Title III funds have improved library holdings, particularly at the doctoral and/or professional level (Q10; research and instruction domain); Title III funds have assisted in establishing or strengthening student services (Q23; student services domain); and Title III funds have assisted in improving classrooms for doctoral and/or professional programs (Q18; facility construction, maintenance, and renovation domain).

Results in Table 1 show that the three items with the least difficulty were the following statements: Title III funds have aided in the area of instruction in doctoral and/or professional programs (Q7); Title III funds have enabled African American students to complete their respective degrees with fewer financial hindrances (Q20); and Title III funds have enabled doctoral and/or professional faculty to attend more professional conferences and workshops (Q27). The three items with the highest difficulty scores were the following statements: Title III funds have established or enhanced a teacher education program in doctoral and/or professional programs designed to qualify students to teach in public schools (Q25); Title III funds have increased the availability of online courses in doctoral and/or professional programs (Q14); and Title III funds have established or improved a development office to assist doctoral and/or professional programs in increasing contributions from alumni and the private sector (Q28).

In order to compare items across domains, we use the item characteristic curves shown in Figure 1. An Item Characteristic Curve (ICC) is also known as a trace line, category response curve, or probability curve. The theoretical range of the impact of the program runs from negative infinity (no impact) to positive infinity (high impact); for practical considerations, we limit the range of values from -3 to +3, with a midpoint of zero. The assumption behind the ICC is that each respondent has an underlying level on the impact of the program, and at each level of impact there is a certain probability that a respondent will give a certain score to a given statement/item. The probability is low for respondents who think that the program has low impact and high for respondents who think that the program has high impact. This means that a respondent with a high score will have a higher probability of endorsing response categories that are consistent with high impact (i.e., strongly agree).
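
To make the construction of such curves concrete, the Python sketch below computes the probability of a "strongly agree" response across the -3 to +3 impact range under the rating scale model of Equation (1). The category parameters are hypothetical; the two item difficulties are taken from Table 1 (Q7, the easiest item, and Q28, the hardest) purely for illustration.

```python
import numpy as np

def p_strongly_agree(theta: np.ndarray, beta: float, gamma: np.ndarray) -> np.ndarray:
    """Probability of the highest response category ("strongly agree") as a
    function of the respondent's impact score theta, under Equation (1)."""
    h = np.arange(1, len(gamma) + 1)                 # categories 1, ..., M
    logits = gamma + h * (theta[:, None] - beta)     # rows: theta values, cols: categories
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    return probs[:, -1]                              # last column = highest category

theta = np.linspace(-3, 3, 7)                        # impact scale restricted to [-3, 3]
gamma = np.array([-1.5, -0.5, 0.5, 1.5])             # hypothetical category parameters
for label, beta in [("Q7 (easy)", 0.362), ("Q28 (hard)", 3.826)]:
    print(label, np.round(p_strongly_agree(theta, beta, gamma), 2))
```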

In Figure 1, for a respondent with the lowest impact score, the probability of giving a high score to any of the statements in the survey instrument is near zero. The probability increases with the impact score, and at the highest levels of impact the probability of responding "strongly agree" approaches one. The curves in Figure 1 therefore show the relationship between the probability of scoring an item "strongly agree" and the respondent's impact rating.

In addition, Figure 1 shows two technical properties of an item characteristic curve. The first is the difficulty of the item/statement, which describes, in this case, the position of the item along the impact scale on the horizontal axis: an easy item is endorsed even by respondents with relatively low impact scores, whereas a difficult item is endorsed mainly by respondents with high impact scores. The second technical property is discrimination, which describes how well an item can differentiate between respondents with impact scores below the item location and those with impact scores above it. This property essentially reflects the steepness of the item characteristic curve in its middle section: the steeper the curve, the better the item discriminates, and the flatter the curve, the less the item is able to discriminate, since the probability of a "strongly agree" response at low impact levels is nearly the same as at high impact levels.

Based on the results in Figure 1, all items have the same level of discrimination (the curves do not cross) but differ with respect to difficulty. Within each domain, the left-hand curves represent easy items: the probability of "strongly agree" is already substantial for respondents with low impact scores and approaches one for respondents with high impact scores, meaning that the majority of respondents scored these items/statements as "strongly agree". The center curves represent items of medium difficulty: the probability of "strongly agree" is low at the lowest impact levels, around 0.5 in the middle of the impact scale, and near one at the highest impact levels; these items were scored as "agree" on average. The right-hand curves represent hard items, which few respondents scored as "strongly agree": the probability of "strongly agree" is low along most of the impact scale and increases only when the higher impact levels are reached. Even at the highest impact level shown (+3), the probability of "strongly agree" for most of these items is less than 0.8.

In general, for research and instruction, Title III funds have the highest impact on aiding the area of instruction in doctoral and/or professional programs and the least impact on improving library holdings, particularly at the doctoral and/or professional level; because these ICCs lie close together, the differences in impact across these items may not be very large. For technology improvement, the program has the highest impact on expanding technology in doctoral and/or professional programs and the lowest impact on increasing the availability of online courses in doctoral and/or professional programs; here the difference in impact is large, as the two ICCs are relatively far apart. Similarly, for facility construction, maintenance, and renovation, the program has the highest impact on improving laboratory facilities for doctoral and/or professional programs and the least impact on improving offices for doctoral and/or professional programs. All items in student financial assistance and student services have relatively similar impacts. For faculty development and financial stability, the program has the highest impact on enabling doctoral and/or professional faculty to attend more professional conferences and workshops and the lowest impact on strengthening the institution's endowment in an effort to facilitate financial independence.

Results in Table 1 and Figure 1 were combined to formulate the Item-Person map shown in Figure 2. For this study, the Item-Person map locates the items along the impact scale in relation to the respondents' estimated levels of impact, using the same metric. Item-Person maps constrain all items in the scale to have equal levels of discrimination, so items can be compared with one another in terms of difficulty or location without considering discrimination ability. Figure 2 presents the Item-Person map for the 25 items on the impact scale. The upper part of the graph represents the estimated level (score) of each person along the impact continuum; respondents with high ratings of the program's impact appear at the right of Figure 2, as do items scored in the "strongly agree" categories. Figure 2 suggests that the few respondents with very high scores (right of the scale) and very low scores (left of the scale) on the impact of the program are not measured reliably because of a lack of item coverage. However, it is clear that the program's impact was highest in aiding the area of instruction (I7) and in enabling doctoral and/or professional faculty to attend more professional conferences and workshops (I27). The least impact was on establishing or improving a development office to assist doctoral and/or professional programs in increasing contributions from alumni and the private sector (I28) and on increasing the availability of online courses in doctoral and/or professional programs (I14).

SUMMARY AND CONCLUSION

The federal funds under Part B, section 326 of the Title III program are granted to institutions that are making a substantial contribution to the legal, medical, dental, veterinary, or other graduate education opportunities in mathematics, engineering, or the physical or natural sciences for minority students. The grant may be used for: (1) purchase, rental or lease of scientific or laboratory equipment for educational purposes, including instructional and research purposes; (2) construction, maintenance, renovation, and improvement in classroom, library, laboratory, and other instructional facilities, including purchase or rental of telecommunications technology equipment or services; (3) purchase of library books, periodicals, technical and other scientific journals, microfilm, microfiche, and other educational materials, including telecommunications program materials; (4) scholarships, fellowships, and other financial assistance for needy graduate and professional students to permit the enrollment of the students in and completion of the doctoral degree in medicine, dentistry, pharmacy, veterinary medicine, law, and the doctorate degree in the physical or natural sciences, engineering, mathematics, or other scientific disciplines in which minority students are underrepresented; (5) establish or improve a development office to strengthen and increase contributions from alumni and the private sector; (6) assist in the establishment or maintenance of an institutional endowment to facilitate financial independence; and (7) funds and administrative management, and the acquisition of equipment, including software, for use in strengthening funds management and management information systems (USDE, 1999).

A survey instrument was administered electronically to administrators and faculty to estimate the impact of the program. Forty-seven responses were received from three eligible institutions, with representation from different schools and colleges. A multilevel Rasch Rating Scale model was used for the data analysis, and the results from this model were used to construct the Item Characteristic Curves and the Item-Person map.

In general, the Title III program has an intermediate to high impact on doctoral and professional programs for all intended purposes. The results show that the Title III program significantly impacts three key areas: aiding in the area of instruction in doctoral and/or professional programs; enabling African American students to complete their respective degrees with fewer financial hindrances; and enabling doctoral and/or professional faculty to attend more professional conferences and workshops. The program has a low impact on establishing or enhancing a teacher education program in doctoral and/or professional programs designed to qualify students to teach in public schools, on increasing the availability of online courses in doctoral and/or professional programs, and on establishing or improving a development office to assist doctoral and/or professional programs in increasing contributions from alumni and the private sector. Due to declining budgets and the scarcity of public funding in general, there is a need to review the objectives of the Title III program to link funding more effectively to activities with immediate and high impact.

LIMITATIONS OF THIS STUDY

Certain limitations exist within this study. The study assesses the impact of the Title III program on a limited number of doctoral and professional programs. The study only examines the impact of Title III through the lens of doctoral and professional programs. Consequently, the questions are geared toward these programs. The study is also limited to administrators and faculty who are directly involved in eligible doctoral and professional programs.

RECOMMENDATIONS FOR FUTURE RESEARCH

To address the limitations of this study, future research is recommended. For example, the survey instrument should be expanded to include more questions, and a larger sample covering all eligible institutions should be surveyed. Future studies should also examine the impact of Title III funding on other graduate programs, such as Master's-level programs, and research comparing the impact across multiple graduate-level programs (e.g., doctoral, professional, and Master's) should be undertaken as well.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]
Appendix 1

Statements and Percent Frequency of the Ordered Responses on
Perceived Impact

Variable           Item   Statement

Perceived                 Title III funds have
  Impact                  aided/assisted/improved:

Research and       Q6     in the area of research in doctoral
  Instruction             and/or professional programs.

                   Q7     in the area of instruction in
                          doctoral and/or professional
                          programs.

                   Q8     in the expansion of the curriculum
                          in doctoral and/or professional
                          programs.

                   Q9     in maintaining regional
                          accreditation at the institution.

                   Q10    library holdings particularly at the
                          doctoral and/or professional level.

Technology         Q11    in strengthening library
  Improvement             technology.

                   Q12    expanded technology in doctoral
                          and/or professional programs.

                   Q13    have expanded
                          telecommunications within
                          doctoral and/or professional
                          programs.

                   Q14    have increased the availability of
                          online courses in doctoral and/or
                          professional programs.

Facility           Q15    in improving laboratory facilities
  Construction,           for doctoral and/or professional
  Maintenance,            programs.
  and Renovation
                   Q16    in improving buildings for
                          doctoral and/or professional
                          programs.

                   Q17    in improving offices for
                          doctoral and/or professional
                          programs.

                   Q18    in improving classrooms for
                          doctoral and/or professional
                          programs.

Scholarships,      Q19    have increased enrollment of
  Fellowship,             African American students in the
  and other               respective doctoral and
  Financial               professional programs.

Assistance         Q20    have enabled African American
                          students to complete their
                          respective degree with fewer
                          financial hindrances.

                   Q21    doctoral and/or professional
                          programs in retaining African
                          American students.

                   Q22    in improving the graduation rate
                          of African American students

Student            Q23    in establishing/strengthening
  Services                student services.

                   Q24    in establishing /strengthening
                          community outreach initiatives

                   Q25    have established/enhanced a
                          teacher education program
                          designed to qualify students to
                          teach in public schools.

Faculty            Q26    faculty in obtaining more
  Development             educational hours in their
                          corresponding disciplines.

                   Q27    have enabled faculty to attend
                          more professional conferences
                          and workshops.

Financial          Q28    have established/ improved a
  Stability               development office to assist
                          doctoral and/or professional
                          programs in increasing
                          contributions from alumni and the
                          private sector.

                   Q29    in strengthening financial
                          management capabilities and
                          management information systems
                          of the institution.

                   Q30    have strengthened the institution's
                          endowment in an effort to
                          facilitate financial independence.

Variable           Item       Response Categories (%)

Perceived                 1       2       3       4       5
  Impact

Research and       Q6     6.38    4.26     4.26   27.66   57.45
  Instruction      Q7     2.13    6.38     6.38   34.04   51.06
                   Q8     4.26    4.26    19.15   29.79   42.55
                   Q9     2.13    0.00    27.66   46.81   23.40
                   Q10    4.26    4.26    31.91   36.17   23.40

Technology         Q11    4.26    4.26    38.30   34.04   19.15
  Improvement      Q12    4.26    4.26    21.28   40.43   29.79
                   Q13    2.13    8.51    38.30   36.17   14.89
                   Q14    6.38   12.77    57.45   12.77   10.64

Facility           Q15    4.26    6.38    14.89   31.91   42.55
  Construction,    Q16    2.13   12.77    34.04   25.53   25.53
  Maintenance,     Q17    8.51    6.38    44.68   25.53   14.89
  and Renovation   Q18    4.26    8.51    27.66   40.43   19.15

Scholarships,      Q19    2.13    2.13    19.15   36.17   40.43
  Fellowship,
  and other
  Financial

Assistance         Q20    2.13    2.13    17.02   29.79   48.94
                   Q21    2.13    6.38    17.02   29.79   44.68
                   Q22    2.13    4.26    19.15   31.91   42.55

Student            Q23    0.00    6.38    40.43   25.53   27.66
  Services         Q24    2.13   12.77    40.43   27.66   17.02
                   Q25    4.26    8.51    53.19   21.28   12.77

Faculty            Q26    8.51   12.77    38.30   17.02   23.40
  Development      Q27    2.13    6.38     8.51   36.17   46.81

Financial          Q28    8.51    8.51    63.83    6.38   12.77
  Stability        Q29    2.13    8.51    38.30   27.66   23.40
                   Q30    2.13    8.51    48.94   21.28   19.15


REFERENCES

Andrich, D. (1978). A rating formulation for ordered response categories. Psychometrika,43, 561-73.

Bradley, K., Sampson, S., & Royal, K. (2006). Applying the Rasch rating scale model to gain insight into students' conceptualization of quality mathematics instruction. Mathematics Education Research Journal, 18(2), 11-26.

Boren, S., Irwin, P., Lyke, B., Riddle, W., Stedman, J., Fraas, C., Jordan, K., & Gregory, W. (1987). The higher education amendments of 1986 (P.L. 99-498): A summary of revisions. 87-187 EPW. (ED294485) Retrieved from http://www.eric.ed.gov

Chajewski, M., & Lewis, C. (2009). Optimizing item exposure control algorithms for polytomous computerized adaptive tests with restricted item pools. In D. J. Weiss (Ed.), Proceedings of the 2009 GMAC Conference on Computerized Adaptive Testing.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage Publications.

Drewry, H. N, & Doermann, H. (2001). Stand and prosper: Private Black colleges and their students. Princeton, NJ: Princeton University Press.

Duderstadt, J. J., & Womack, F. W. (2003). The future of the public university in America: Beyond the crossroads. Baltimore: Johns Hopkins University Press.

Eighty-Ninth Congress H.R. 9567 (1965). The Higher Education Act of 1965 (P.L. 89-329). Retrieved from http://ftp.resource.org/gao.gov/89-329/00004C57.pdf

ExpectMore.gov. (2005). Program assessment - Strengthening historically Black graduate institutions. Retrieved from http://www.whitehouse.gov/omb/expectmore/summary/10003317.2005.html

Fox, J.P. (2005). Multilevel IRT using dichotomous and polytomous response data. British Journal of Mathematical and Statistical Psychology, 58, 145-172.

Fong, D. Y., Ho, S. Y., & Lam, T. H. (2010). Evaluation of internal reliability in the presence of inconsistent responses. Health and Quality of Life Outcomes, 8.

Jackson, C. L., & Nunn, E. F. (2003). Historically Black colleges and universities: A reference handbook. Santa Barbara, Calif: ABC-CLIO.

Johnson, V. E., & Albert, J. H. (1999). Ordinal data modeling. Statistics for Social Science and Public Policy. New York: Springer.

Kelderman, E. (2010). White house adviser urges historically black colleges to change how they are seen. The Chronicle of Higher Education, 56. Retrieved from http://chronicle.com/article/White-House-Adviser-Urges/65218/

Kendrick, C.M.C. (1981). The impact of the advanced institutional development program on the curriculum of traditionally black colleges and universities. Dissertation Abstract International, 42-04, AA118120365.

Lucas, C. (1994). American higher education: A history. New York, NY: St. Martin's Press.

National Association for Equal Opportunity in Higher Education (2009). HBCU and PBI policy recommendations for Obama economic stimulus package and first 100 days. Retrieved from http://www.nafeo.org/community/index.php.

Norman, M.K. (1985). The impact of the advanced institutional development program on the institutional viability of seven black 1890 land grant institutions (title III). Doctoral Dissertation. University of Missouri--Columbia.

Patrick, E. M. (1992). A descriptive study of Title III fund use by historically Black colleges and universities since enactment of Part B of the 1986 Higher Education Act Amendments (Doctoral dissertation). University of Connecticut, Storrs-Mansfield, CT.

Rasch, G. (1960/1980). Probabilistic models for some intelligence and attainment tests. (Copenhagen, Danish Institute for Educational Research), expanded edition (1980) with foreword and afterword by B.D. Wright. Chicago: The University of Chicago.

Rabe-Hesketh, S., Skrondal, A., & Pickles, A. (2004a). Generalized multilevel structural equation modeling. Psychometrika 69, 167-190.

Rabe-Hesketh, S., Skrondal, A., & Pickles, A. (2004b). GLLAMM manual. U.C. Berkeley Division of Biostatistics Working Paper Series, Working Paper 160. Available at http://www.gllamm.org/docum.html

Rabe-Hesketh, S., & Skrondal, A. (2008). Multilevel and longitudinal modeling using Stata. College Station, TX: Stata Press.

Scott, G. A. (2007). Low-income and minority serving institutions: Education has taken steps to improve monitoring and assistance, but further progress is needed: Testimony before the Subcommittee on Higher Education, Lifelong Learning, and Competitiveness, Committee on Education and Labor, House of Representatives. Testimony, GAO-07-926T. [Washington, D.C.]: U.S. Govt. Accountability Office. Retrieved from http://www.gao.gov/new.items/d07926t.pdf

Scott, G.A. (2010). Low-income and minority serving institutions: Sustained attention needed to improve education's oversight of grant programs. Testimony, GAO-10-659T. [Washington, DC]: U.S. Govt. Accountability Office. Retrieved from http://www.gao.gov/new.items/d10659t.pdf

United States Department of Education. (1999). Higher education act of 1965: part b strengthening historically black graduate institutions. Retrieved from http://www2.ed.gov/programs/idueshbgi/hbgi-laws326.pdf

United States Government Accountability Office. (2009). Low-Income and minority serving institutions: Management attention to long-standing concerns needed to improve education's oversight of grant programs. Report to the Chairman, Subcommittee on Higher Education, Lifelong Learning, and Competitiveness, Committee on Education and Labor, House of Representatives. GAO-09-309. Retrieved from http://www.gao.gov/new.items/d09309.pdf

United States Department of Education. (2011). Title III part b, strengthening historically black graduate institutions program--awards. Retrieved from http://www2.ed.gov/programs/idueshbgi/awards.html

Wenglinsky, H. H. (1996). The educational justification of historically Black colleges and universities: A policy response to the U.S. Supreme Court. Educational Evaluation and Policy Analysis, 18(1), 91-103.

Zheng, X, & Rabe-Hesketh, S. (2009). Estimating parameters of dichotomous and ordinal item response models with Gllamm. Stata Journal, 7(3), 313-333.

About the Authors:

Aloyce R. Kaliba holds a Ph.D. in Economics from Kansas State University, Manhattan, Kansas. Dr. Kaliba is an Associate Professor of Economics, Department of Economics and Finance, College of Business, Southern University and A&M College, Baton Rouge, Louisiana. His current research interests include using item theory models in project evaluation and impact assessment.

Kimberly K. Powell holds a Ph.D. in Urban Higher Education from Jackson State University in Jackson, Mississippi. Dr. Powell is the Grant Coordinator for the Office of Graduate Studies at Southern University and A&M College in Baton Rouge, Louisiana. Dr. Powell is also an Adjunct Professor for the College of Business at Southern University and A&M College. Her research interests include public policy, marketing and management in higher education, and diversity in higher education.

Aloyce R. Kaliba

Kimberly K. Powell

Southern University and A&M College
Table 1

Estimated Parameter Values Using the Rasch Rating Scale Model

                                          Estimated
Perceived Impact                 Item     Coefficient   Std. Err.

1. Research and Instruction      Q7          0.362        0.446
                                 Q8          0.994        0.448
                                 Q9          1.440        0.434
                                 Q10         1.942        0.442

2. Technology                    Q11         2.352        0.446
                                 Q12         1.497        0.437
                                 Q13         2.510        0.437
                                 Q14         3.808        0.455

3. Facility Construction,        Q15         1.052        0.442
  Maintenance, and               Q16         2.378        0.448
  Renovation                     Q17         2.998        0.444
                                 Q18         2.272        0.440

4. Scholarships, Fellowship,     Q19         0.788        0.442
  and other Financial            Q20         0.404        0.449
  Assistance                     Q21         0.865        0.441
                                 Q22         0.878        0.440

5. Student Services              Q23         1.968        0.449
                                 Q24         2.810        0.444
                                 Q25         3.188        0.451

6. Faculty Development           Q26         2.976        0.454
  and Financial Stability        Q27         0.576        0.450
                                 Q28         3.826        0.453
                                 Q29         2.287        0.442
                                 Q30         2.652        0.446

Category Parameters              Step 2      1.123        0.173
                                 Step 3      1.367        0.155

Intercept: category response 1               5.541        0.487

Intercept: category response 2               2.512        0.462

Intercept: category response 3               0.355        0.453

Variance: school                             0.580        1.666

Variance: institutions                       4.580        1.066

Perceived Impact                 Item     z        P>|z|

1. Research and Instruction      Q7       0.810       0.417
                                 Q8       2.220       0.026
                                 Q9       3.320       0.001
                                 Q10      4.390       0.000

2. Technology                    Q11      5.280       0.000
                                 Q12      3.430       0.001
                                 Q13      5.740       0.000
                                 Q14      8.370       0.000

3. Facility Construction,        Q15      2.380       0.017
  Maintenance, and               Q16      5.310       0.000
  Renovation                     Q17      6.760       0.000
                                 Q18      5.160       0.000

4. Scholarships, Fellowship,     Q19      1.780       0.074
  and other Financial            Q20      0.900       0.368
  Assistance                     Q21      1.960       0.050
                                 Q22      1.990       0.046

5. Student Services              Q23      4.380       0.000
                                 Q24      6.320       0.000
                                 Q25      7.070       0.000

6. Faculty Development           Q26      6.550       0.000
  and Financial Stability        Q27      1.280       0.201
                                 Q28      8.450       0.000
                                 Q29      5.180       0.000
                                 Q30      5.950       0.000

Category Parameters              Step 2   6.500       0.000
                                 Step 3   8.800       0.000

Intercept: category response 1            11.380      0.000

Intercept: category response 2             5.440      0.000

Intercept: category response 3             0.780      0.433

Variance: school

Variance: institutions

Perceived Impact                 Item     95% CI

                                          Lower    Upper

1. Research and Instruction      Q7      -0.512    1.236
                                 Q8       0.117    1.871
                                 Q9       0.589    2.291
                                 Q10      1.075    2.808

2. Technology                    Q11      1.479    3.225
                                 Q12      0.641    2.353
                                 Q13      1.653    3.366
                                 Q14      2.917    4.699

3. Facility Construction,        Q15      0.186    1.919
  Maintenance, and               Q16      1.500    3.256
  Renovation                     Q17      2.129    3.868
                                 Q18      1.410    3.135

4. Scholarships, Fellowship,     Q19     -0.077    1.653
  and other Financial            Q20     -0.476    1.284
  Assistance                     Q21      0.002    1.729
                                 Q22      0.015    1.740

5. Student Services              Q23      1.088    2.847
                                 Q24      1.939    3.681
                                 Q25      2.304    4.072

6. Faculty Development           Q26      2.086    3.866
  and Financial Stability        Q27     -0.307    1.458
                                 Q28      2.939    4.714
                                 Q29      1.421    3.153
                                 Q30      1.778    3.526

Category Parameters              Step 2   0.785    1.462
                                 Step 3   1.063    1.672

Intercept: category response 1            4.587    6.496

Intercept: category response 2            1.607    3.417

Intercept: category response 3           -0.533    1.243

Variance: school

Variance: institutions