Akihito Kamata, BRT Affiliate
Dr. Kamata is the Executive Director of the Center on Research and Evaluation (CORE). He has also been a Professor at Southern Methodist University since August 2013, with appointments in the Department of Education Policy & Leadership and the Center on Research and Evaluation (Simmons School of Education & Human Development) and in the Department of Psychology (Dedman College of Humanities and Sciences). Prior to joining SMU, Dr. Kamata was a faculty member at the University of Oregon and Florida State University.
Dr. Kamata's primary research interest is psychometrics and educational and psychological measurement, focusing on the implementation of item-level test data analysis methodology through various modeling frameworks, including item response theory, multilevel modeling, and structural equation modeling. He did pioneering work on multilevel item response theory modeling, in which item response data from individuals are nested within group units, such as schools. This line of work is represented by his 2001 publication in the Journal of Educational Measurement, a special issue on multilevel measurement modeling in the Journal of Applied Measurement in 2005, and several book chapters on the topic, including a chapter in the Handbook of Advanced Multilevel Analysis (2011). Other recent interests include developing effect size measures for testlet modeling, developing reliability measures of growth trajectories for longitudinal data modeling, and Bayesian inference for complex psychometric models.
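As a rough illustration (a sketch for readers unfamiliar with the approach, not a formula taken from Dr. Kamata's publications), the nested structure can be written as a Rasch-type model with items nested in students and students nested in schools:

\[
\Pr(y_{ijk} = 1 \mid \theta_{jk}) = \frac{\exp(\theta_{jk} - b_i)}{1 + \exp(\theta_{jk} - b_i)},
\qquad
\theta_{jk} = \gamma_{00} + u_{jk} + v_k,
\]

where \(y_{ijk}\) is the response of student \(j\) in school \(k\) to item \(i\), \(b_i\) is the item difficulty, \(u_{jk}\) is a student-level random effect, and \(v_k\) is a school-level random effect. Decomposing ability \(\theta_{jk}\) this way lets the model attribute variance in performance to students and to the groups they belong to.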
Dr. Kamata received his doctoral degree in Measurement and Quantitative Methods from Michigan State University in 1998.
Professional Website: Akihito Kamata at SMU
CORE - UO Research Supports Literacy Evaluation Program
April 2020
University research to improve reading fluency assessments will soon be helping teachers across the nation, thanks to a licensing agreement with the education technology company Analytical Measures Inc.
College of Education research associate professor Joe Nese developed the Computerized Oral Reading Evaluation (CORE) to reduce the workload for teachers who must frequently test their students’ reading levels. CORE combines an innovative psychometric model and a custom set of reading passages with speech recognition software to better evaluate student reading ability.
The automated evaluation allows teachers to simultaneously administer brief reading assessments to multiple students with fewer errors, providing a more accurate understanding of students’ reading development.
Analytical Measures will incorporate the tool into its Moby.Read application. The new tool and Moby.Read both received funding from the Institute of Education Sciences.
“It’s a good example of public grant money coming together to make research accessible and available to educators,” Nese said.
Nese and co-principal investigator Akihito Kamata plan to further improve the system by adding the capability to measure “reading prosody,” or how expressive students are when they read, such as whether they pause at commas or change their tone to emphasize dialogue and questions.
“Eventually what we really want is to better assess if students understand what they’re reading, rather than just measuring speed and accuracy,” Nese said.