EVALUATING A POSTGRADUATE PROGRAM SUPPORT: A DUAL APPROACH USING REALIST EVALUATION AND CIPP MODEL FOR ACCREDITATION ENHANCEMENT

Authors

DOI:

https://doi.org/10.46933/DGS.vol9i2264-277

Keywords:

postgraduate program support, realist evaluation, CIPP model, accreditation enhancement, educational program evaluation

Abstract

This study examines the effectiveness of a Graduate Program Support Program in enhancing accreditation status, combining the Realist Evaluation and CIPP (Context, Input, Process, Product) Model evaluation approaches. Addressing a research gap concerning the nuanced impact of support programs on postgraduate accreditation, the study draws on qualitative document analysis, stakeholder interviews, and field observations to investigate how contextual factors and program elements shape the support program's success in the complex landscape of postgraduate education. The primary research question concerns the interplay between Realist Evaluation and the CIPP Model in evaluating and improving accreditation outcomes, with the aim of comprehensively understanding the key elements contributing to the program's success. Findings reveal that the integrated approach enhances accreditation by addressing both program components and contextual nuances. Recommendations include tighter integration of program components, enhanced collaboration among stakeholders, and refinement of evaluation strategies. In conclusion, integrating Realist Evaluation and the CIPP Model proves valuable in designing effective support programs for advancing postgraduate accreditation, offering insights that contribute to existing knowledge in educational program evaluation.


References

Darma, I. K. (2019). The effectiveness of teaching program of CIPP evaluation model. International Research Journal of Engineering, IT & Scientific Research, 5(3), 1–13. https://doi.org/10.21744/irjeis.v5n3.619.

De Souza, D. E. (2022). A critical realist approach to systems thinking in evaluation. Evaluation, 28(1), 72–90. https://doi.org/10.1177/13563890211064639

Greenhalgh, J., & Manzano, A. (2022). Understanding ‘context’ in realist evaluation and synthesis. International Journal of Social Research Methodology, 25(5), 583–595. https://doi.org/10.1080/13645579.2021.1918484.

Gullickson, A. M., King, J. A., Lavelle, J. M., & Clinton, J. M. (2019). The current state of evaluator education: A situation analysis and call to action. Evaluation and Program Planning, 75, 20–30. https://doi.org/10.1016/j.evalprogplan.2019.02.012

Manap, R., Othman, N., Roslan, S. N., Ismail, K., & Kamarubahrin, A. F. (2019). Measuring The Effectiveness of University Programmes Based on Evaluation Models: A Meta-Analysis. ‘Abqari Journal, 20(2), 78–95. https://doi.org/10.33102/abqari.vol20no2.206

Mertens, D. M. (2019). Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative and Mixed Methods (5th ed.). SAGE Publications.

Moran, A. M., Coyle, J., Pope, R., Boxall, D., Nancarrow, S. A., & Young, J. (2014). Supervision, support and mentoring interventions for health practitioners in rural and remote contexts: An integrative review and thematic synthesis of the literature to identify mechanisms for successful outcomes. Human Resources for Health, 12(1). https://doi.org/10.1186/1478-4491-12-10

Mukumbang, F. C., Marchal, B., Van Belle, S., & van Wyk, B. (2020). Using the Realist Interview Approach to Maintain Theoretical Awareness in Realist Studies. Qualitative Research, 20(4), 485–515. https://doi.org/10.1177/1468794119881985

Palinkas, L. A., Mendon, S. J., & Hamilton, A. B. (2019). Innovations in Mixed Methods Evaluations. Annual Review of Public Health, 40, 423–442. https://doi.org/10.1146/annurev-publhealth-040218-044215.

Ronfeldt, M. (2021). Links Among Teacher Preparation, Retention, and Teaching Effectiveness. https://doi.org/10.31094/2021/3/1.

Stufflebeam, D. L., & Zhang, G. (2017). The CIPP Evaluation Model. The Guilford Press.

Tokmak, H. S., Baturay, H. M., & Fadde, P. (2013). Applying the context, input, process, product evaluation model for evaluation, research, and redesign of an online master’s program. International Review of Research in Open and Distance Learning, 14(3), 273–293. https://doi.org/10.19173/irrodl.v14i3.1485.

Ton, G., & Vellema, S. (2022). Introduction: Contribution, Causality, Context, and Contingency when Evaluating Inclusive Business Programmes. IDS Bulletin, 53(1), 1–20. https://doi.org/10.19088/1968-2022.102.

Wang, J., Veugelers, R., & Stephan, P. (2017). Bias against novelty in science: A cautionary tale for users of bibliometric indicators. Research Policy, 46(8), 1416–1436. https://doi.org/10.1016/j.respol.2017.06.006

Yew, A. C. Y., Chiu, D. K. W., Nakamura, Y., & Li, K. K. (2022). A quantitative review of LIS programs accredited by ALA and CILIP under contemporary technology advancement. Library Hi Tech, 40(6), 1721–1745. https://doi.org/10.1108/LHT-12-2021-0442.

Zhao, J., Jull, J., Finderup, J., Smith, M., Kienlin, S. M., Rahn, A. C., Dunn, S., Aoki, Y., Brown, L., Harvey, G., & Stacey, D. (2022). Understanding How and Under What Circumstances Decision Coaching Works for People Making Healthcare Decisions: a Realist Review. BMC Medical Informatics and Decision Making, 22(1), 1–20. https://doi.org/10.1186/s12911-022-02007-0


Published

2024-10-14

How to Cite

EVALUATING A POSTGRADUATE PROGRAM SUPPORT: A DUAL APPROACH USING REALIST EVALUATION AND CIPP MODEL FOR ACCREDITATION ENHANCEMENT. (2024). Diegesis: Jurnal Teologi, 9(2), 264–277. https://doi.org/10.46933/DGS.vol9i2264-277
