
A Meta-analysis of Learning Evaluation Online: LEO's Useability, Adoption, and Patterns of Use

A. Ip, The University of Melbourne, Australia; D.M. Kennedy, Monash University, Australia

EdMedia: World Conference on Educational Media and Technology 1999, Seattle, WA, USA. ISBN 978-1-880094-35-8. Publisher: Association for the Advancement of Computing in Education (AACE).

Abstract

The design, development, implementation, and evaluation of Web-based courseware for teaching and learning are important issues in higher education. However, available evidence suggests that much courseware receives minimal evaluation. This paper reports a meta-analysis of the adoption, usage, and implementation of a generic, customisable, online survey and evaluation software tool, Learning Evaluation Online (LEO). The tool arose from the need to customise each survey to suit a particular project, to collect data from geographically dispersed evaluators, and to reduce the time and expense of data collection and evaluation. LEO was also designed to be customisable by individuals with minimal computing experience, and templates for basic evaluation question types are provided. Use of LEO was logged by the LEO software engine and through our LEO-on-LEO survey. An analysis of the quantitative and qualitative data generated is presented, together with recommendations for improving the implementation of similar tools.

Citation

Ip, A. & Kennedy, D.M. (1999). A Meta-analysis of Learning Evaluation Online: LEO's Useability, Adoption, and Patterns of Use. In B. Collis & R. Oliver (Eds.), Proceedings of EdMedia: World Conference on Educational Media and Technology 1999 (pp. 126-131). Association for the Advancement of Computing in Education (AACE).
