The Value-Added Assessment of Higher Education Learning: The Case of Nagoya University of Commerce and Business in Japan

Authors

  • Hiroshi Ito
  • Nobuo Kawazoe

Keywords:

higher education; learning assessment; PROG; value-added assessment; assessment tools; generic skills

Abstract

Assessment of higher education learning is increasingly considered
important. One of the current trends in this field is value-added
assessment: how much students learn during a given period of time at
university. In the United States, for example, Arum and Roksa (2011)
conducted a large-scale assessment of second-year university students'
learning with the Collegiate Learning Assessment (CLA) to examine how
much university students improved their generic skills during the first
two years of higher education. The findings suggested
that they did not improve much. The researchers attributed this poor
result to the fact that American university students on average study
only 12 hours a week. In Japan, the situation may be even worse, as
Japanese university students on average study only 3.5 hours a week,
much less than their counterparts in the United States. However, studies on
the value-added assessment of Japanese university students’ learning
are scarce. Using the Progress Report on Generic Skills (PROG), an
assessment tool similar to the CLA, together with interviews with
students who took the PROG, this study quantitatively examines how much
students at a Japanese university improved their generic skills during
the first two years of higher education. The findings show that, as was
the case with their US peers, the Japanese university students in this
study did not improve their generic skills very much in the first two
years of higher education.
This study also qualitatively explores possible reasons for such results.
The findings also show that the students in this study studied, on
average, only 40 minutes a week. This study suggests offering more
courses that use active learning approaches to motivate students
intrinsically so that they devote more time to learning.

References

Arum, R. and Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago and London: University of Chicago Press.

Barkley, E. F. (2010). Student engagement techniques: A handbook for college faculty. New York: Jossey-Bass.

Clouder, L., Broughan, C., Jewell, S. and Steventon, G. (2012). Improving student engagement and development through assessment. London and New York: Routledge.

Douglas, J., Thomson, G. and Zhao, C. (2012). The learning outcomes race: The value of self-reported gains in large research universities. Higher Education 64(3): 317-.

Hardison, C. M. and Vilamovska, A. (2009). The Collegiate Learning Assessment: Setting standards for performance at a college or university. Santa Monica: RAND Education.

Ito, H. (2014a). Assessing an assessment tool of higher education: The case of PROG in Japan. International Journal of Evaluation and Research in Education 3(1): 1-10.

Ito, H. (2014b). What's wrong with learning for the exam? An assessment-based approach for student engagement. Journal of Education and Learning 3(2): 145-152.

Ito, H. (2014c). Shaping the first-year experience: Assessment of the Vision Planning Seminar at the Nagoya University of Commerce and Business. International Journal of Higher Education 3(3): 1-9.

Kawaijuku and Riasec. (2013). PROG: Progress Report on Generic Skills. Tokyo: Kawaijuku and Riasec.

Kuh, G. D. (2012). What we are learning about student engagement. Change 35: 28.

Kushimoto, T. (2012). Outcome assessment and its role in self-reviews of undergraduate education: In the context of Japanese higher education reforms since the 1990s. Higher Education 59: 589-598.

McIntyre, J., Todd, N., Huijser, H. and Tehan, G. (2012). Building pathways to academic success: A practical report. The International Journal of the First Year in Higher Education 3(1): 109-118.

McVeigh, B. J. (2002). Japanese higher education as myth. New York and London: An East Gate Book.

National Survey of Student Engagement. (2011). How much time college students spend studying varies by major and corresponds to faculty expectations, survey finds. Available at: http://nsse.iub.edu/NSSE_2011_Results/pdf/NSSE_2011_Press_Release.pdf.

Peters, R. A. (2011). Enhancing academic achievement by identifying and minimizing the impediments to active learning. Public Administration Quarterly 35(4): 466-493.

Pike, G. R. (2011). Assessing the generic outcomes of college: Selections from assessment measures. San Francisco: Jossey-Bass.

Riasec. (2012). Measuring generic skills. Tokyo: Riasec.

Sambell, K., McDowell, L. and Montgomery, C. (2012). Assessment for learning in higher education. London and New York: Routledge.

Shavelson, R. (2009). Measuring college learning responsibly: Accountability in a new era. Palo Alto: Stanford University Press.

Tsuji, T. (2013). Naze nihon no daigakusei wa sekai de ichiban benkyo shinai no ka [Why Japanese university students study the least in the world]. Tokyo: Toyo keizai shinbunsya.

Published

2015-04-30
