Assessing undergraduate information literacy change over time
Ault A.B.; Ferguson J.
2019
Performance Measurement and Metrics
1
10.1108/PMM-02-2019-0005
Purpose: The research project assessed information literacy skill changes in college students at two points in time: as entering first-year students in 2012 and as seniors in their senior seminar capstone courses in the 2015–2016 academic year. Design/methodology/approach: The Standardized Assessment of Information Literacy Skills (SAILS) individual test was the selected instrument: Version 1 of the test was used for first-year students and Version 2 for seniors. All testing was done in person in computer labs, with a librarian or library staff member present to proctor the test. Testing yielded 330 first-year results and 307 senior results, with 161 exact matches across both administrations. Student scores were exact-matched to demographic details pulled from the college’s student information systems for the analysis. Findings: Overall, first-year students tested below the 70 percent proficiency benchmark in all eight skill sets, but by the time they were seniors they scored above 70 percent in three skill sets. Male students and students of color performed lower than their counterparts, but these groups demonstrated significant improvement in four skill sets by their senior year. Students in the Honors program, those who took longer to complete the test as seniors, those with higher GPAs, those in Humanities majors, and those with upper-level course exposure to librarian-led information literacy instruction performed better on the test. There were no statistically significant differences for students who were first-generation, Pell Grant eligible, or in-state versus out-of-state residents. Originality/value: Few published studies have used the SAILS test for longitudinal, institution-wide assessment. Most institutions that used the individual version of SAILS did so to measure change within a selected course, or set of courses, in a single semester, and very few of those studies are published. © 2019, Emerald Publishing Limited.
Information literacy; Liberal arts; Library instruction; Longitudinal; SAILS; Statistical analysis
Baker C.N., Under-represented college students and extracurricular involvement: the effects of various student organizations on academic performance, Social Psychology of Education, 11, 3, pp. 273-298, (2008); Catalano A., Phillips S., Information literacy and retention: a case study of the value of the library, Evidence Based Library and Information Practice, 11, 4, pp. 2-13, (2016); Chan Y., Investigating the relationship among extracurricular activities, learning approach and academic outcomes: a case study, Active Learning in Higher Education, 17, 3, pp. 223-233, (2016); Cowan S.A., Graham R.Y., Eva N., How information literate are they? A SAILS study of (mostly) first-year students at the U of L, OPUS: Open ULeth Scholarship, 2016-2017, pp. 17-20, (2016); Graham R.Y., Eva N., Cowan S., SAILS, take 2: an exploration of the “build your own test” standardized IL testing option for Canadian institutions, Communications in Information Literacy, 12, 1, pp. 19-35, (2018); Hill J.B., Macheak C., Siegel J., Assessing undergraduate information literacy skills using Project SAILS, Codex: The Journal of the Louisiana Chapter of the ACRL, 2, 3, pp. 23-27, (2013); Hsieh M.L., Dawson P.H., Carlin M.T., What five minutes in the classroom can do to uncover the basic information literacy skills of your college students: a multiyear assessment study, Evidence Based Library and Information Practice, 8, 3, pp. 34-57, (2013); Mery Y., Newby J., Peng K., Assessing the reliability and validity of locally developed information literacy test items, Reference Services Review, 39, 1, pp. 98-122, (2011); Background, (2018); The individual scores test, (2018); Radcliff C., Oakleaf M., Van Hoeck M., So what? The results and impact of a decade of IMLS-funded information literacy assessments, Proceedings of the 2014 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, pp. 801-809, (2015); Radcliff S., Wong E.Y., Evaluation of sources: a new sustainable approach, Reference Services Review, 43, 2, pp. 231-250, (2015); Seow P., Pan G., A literature review of the impact of extracurricular activities participation on students’ academic performance, Journal of Education for Business, 89, 7, pp. 361-366, (2014); Tong M., Moran C., Are transfer students lagging behind in information literacy?, Reference Services Review, 45, 2, pp. 286-297, (2017); Wells V.A., Report: SAILS test executive summary 2014, pp. 1-12, (2014); Williams P., Developing student competencies in information literacy sessions through web-based instruction for distance learners, (2015); Zacherman A., Foubert J., The relationship between engagement in cocurricular activities and academic performance: exploring gender differences, Journal of Student Affairs Research and Practice, 51, 2, pp. 157-169, (2014); Information Literacy Competency Standards for Higher Education, (2000)
Emerald Group Holdings Ltd.
Article
Scopus