As part of a seven-university CCLI Type 3 collaborative effort focused on models and modeling, we have extended the model-eliciting activity (MEA) construct to upper-division engineering programs. Originally developed and validated by mathematics education researchers, MEAs have proven to be valuable educational tools. Our overall goal has been to use this construct to enhance engineering students' problem-solving and modeling skills as well as their conceptual understanding of selected engineering topics. In doing so, we have pursued two main research thrusts: MEAs as teaching tools and MEAs as learning assessment tools. This paper summarizes our results for these two foci as we near the conclusion of our final project year. In using MEAs as teaching tools, we have examined three activities: (1) Developing effective MEAs: We have created more than 20 MEAs for upper-level students that target problem-solving skills and conceptual learning. In doing so, we have found that MEAs also strengthen important professional skills such as communication, teamwork, and ethical understanding. (2) Implementing MEAs: We have introduced and rigorously assessed our MEAs in classroom settings to better understand students' problem-solving, modeling, and teamwork processes. (3) Enhancing the learning benefits of MEAs: Our consortium has added new conceptual dimensions to MEAs to further enhance student learning. In particular, we have introduced an ethical dimension to improve students' ability to recognize and resolve the types of ethical dilemmas that arise in the engineering workplace.
In using MEAs as learning assessment tools, we have focused on two additional activities: (1) Assessing the effectiveness of MEAs along various dimensions, including improved conceptual learning and problem solving: We have developed a series of assessment instruments to better understand and measure the educational benefits of MEAs. Specifically, we are triangulating across three instruments created for this project: (a) pre- and post-concept inventories (knowledge tests) to assess gains in conceptual understanding, (b) an online reflection tool to assess process, and (c) a grading rubric to assess the resulting artifact (the general model and the specific solution). We have also developed an instrument to measure students' self-efficacy with respect to their modeling skills. (2) Assessing the MEA-motivated problem-solving process: Using various data collection tools, including PDAs and wikis, in combination with the assessment instruments above, we are identifying the problem-solving processes used by student teams, as well as the range of problems that can be addressed, to determine how effective these processes are in improving conceptual understanding. This paper summarizes our achievements in each of these five activities. Particular emphasis is placed on our mixed-methods measurements of student learning and achievement, and on the relative conceptual gains observed across a series of MEA experiments, including those for which a comparison group was available.