Assessment Processes



Processes for Educational Objectives


The program educational objectives were developed through direct engagement of the faculty and the Mechanical Engineering Industrial Advisory Council (MEIAC), which includes representatives of major employers of the program's graduates as well as alumni. The objectives are thus designed to meet the needs of the constituents who hire our graduates (industry) and the constituents who design and deliver content (faculty), in a manner consistent with the mission of Iowa State University and beneficial to our students.

The process for reviewing and, as necessary, revising the program educational objectives, including engagement of the various program constituents, is shown in Fig. 1 and described below.

Figure 1: Process for revision of Program Educational Objectives. The revision cycle is established at 6 years. The objectives were last revised in 2010-11, with the next scheduled revision in 2016-17.


The Associate Chair for Undergraduate Studies leads the Undergraduate Education Committee, a faculty committee directly engaged in (re)defining the program educational objectives. This committee initiates the process for revising the PEOs, taking into account any changes or revisions made to the University, College, and Department missions. The committee also uses assessment data generated from alumni surveys and the MEIAC. The revised objectives are then presented to the faculty and the MEIAC for feedback and finalized by faculty vote. The student body is made aware of the objectives through an open forum, where students are also given an opportunity to provide feedback on various aspects of their program experience.

Since the program educational objectives describe accomplishments up to 5 years after graduation, our review cycle is established as 6 years. Our last assessment and evaluation occurred in 2010-11, and the next evaluation is planned for 2016-17. As situations change, an off-cycle review may be conducted as needed. This process is documented on our department's intranet.

Assessment and evaluation of program educational objectives are based on measures and processes outlined briefly in Table 1 and discussed in more detail below.

Table 1: Tools used for assessment of program educational objectives.

| Assessment tool | Description | Assessment cycle | Evaluation cycle | Notes on sample size | Documentation and maintenance |
|---|---|---|---|---|---|
| Alumni surveys | Alumni (3-5 years out of the program) self-assess the extent to which they have attained the objectives, as well as the importance and preparatory level of student outcomes in their careers. They also report their current career status. | Every 6 years (most recent in 2011) | Every 6 years (most recent in 2012) | About 150 surveys were sent out; the historical response rate has been about 20-25%. | Electronically administered by the College of Engineering; all data stored electronically. |
| Industrial Advisory Council feedback | Advisory Council members provide observations and commentary on the contributions of graduates to their companies' enterprises. | Every 3 years (most recent in 2012) | Every 3 years (most recent in 2012) | 15 council members represent major industrial recruiters of our students. | Meeting minutes and survey responses stored as hard copies and electronically. |

Alumni surveys: Alumni surveys have been administered by Engineering Career Services (ECS) since 2011. The College of Engineering ABET Committee developed a standard survey in 2010. The first portion of the survey contains questions asked of graduates of all programs in the College; the second portion contains questions developed by the individual programs for their graduates. ECS maintains and verifies a database of email contacts for all graduates. The survey is delivered electronically using the ISU Career Management System (CMS), which tracks responses, sends reminders and, at the end of the survey period, generates reports for use by the program. The 2011 survey targeted graduates of the program from the 2006-2007 academic year. The program portion of the survey includes questions that map the objectives to specific actions and outcomes within the program, and is designed to yield data that enable better interpretation and evaluation.

Industrial Advisory Council Feedback:  The Mechanical Engineering Industrial Advisory Council (MEIAC), which meets twice a year with departmental leadership, faculty and students, completes a survey and also provides anecdotal feedback on the success of graduates in attaining the objectives.

Processes for Student Outcomes


Assessment of student outcomes is based on the measures and processes indicated in Table 2.

Table 2: Tools used for outcomes assessment

| Assessment tool | Description | Assessment cycle | Evaluation cycle | Notes on sample size | Documentation and maintenance |
|---|---|---|---|---|---|
| Course surveys | Students assess their opportunities to attain student outcomes in each core course. | Every three years (most recent in 2011) | Every three years, following the assessment cycle (most recent in 2011-12) | Survey is online; response rate is ~73% of students enrolled in the program. | Departmentally administered online surveys using Class Climate software; all data stored electronically. |
| Course outcomes assessment (mapped to student outcomes)* | Faculty directly assess students' attainment of specific course outcomes based on performance in specific evaluative components of a course. | Every three years (most recent in 2011) | Every three years, following the assessment cycle (most recent in 2011-12) | Data are typically provided for all students enrolled in the courses; enrollments range from 30%-65% of graduating class size. | Faculty provide assessment data via Excel sheets, along with copies of assessment instruments and graded student work; all stored electronically. |
| Fundamentals of Engineering (FE) morning exam data | Comparison of ME graduates' morning exam performance with national performance. | Every year (most recent in 2011) | Every 5 years (most recent in 2011-12) | Over the last 10 years, about 46% of ME graduates have taken the FE exam annually. | College of Engineering obtains data from NCEES and provides summaries to programs; data stored electronically. |

*A mapping of student outcomes to various course outcomes as well as an assessment map is provided in Table 3 below.


Course Surveys:  This survey is administered online to students in all Mechanical Engineering courses using Class Climate®.  The survey consists of two parts: the first addresses evaluation of the instructor, and the second, which is pertinent to outcomes assessment, asks students to self-assess their opportunities to demonstrate the various student outcomes in a given class.

We deployed an online survey starting in Spring 2011 to facilitate data collection and reporting.  In an initial pilot in Fall 2010, the online survey achieved a 78% response rate, which was deemed very good.  Our current response rates range from 64% to 80% for a given class, averaging 73% of students enrolled in the program.  We note that the instructor evaluation portion of the survey is sent out every semester for personnel feedback and review, while the outcomes assessment portion is on a three-year assessment cycle followed by evaluation.


Course Outcomes Assessment:  Faculty teaching core classes (see Table 3, below) are charged with directly assessing attainment of specific course outcomes.  Course outcomes assessment is pertinent because, in most programs, course outcomes (established by faculty) can be mapped to student outcomes (ABET a-k and ME program (ASME) outcomes), as shown in Fig. 2.  Therefore, attainment of student outcomes can be demonstrated through attainment of course outcomes.
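This roll-up from course outcomes to student outcomes can be pictured as a simple lookup and aggregation, as in the sketch below. All course numbers, outcome statements, and attainment values here are invented for illustration; they are not actual program data.

```python
# Hypothetical mapping of course outcomes to ABET student outcomes (a-k).
# Each course outcome may support one or more student outcomes.
course_outcome_map = {
    ("ME 231", "Apply the first law to closed systems"): ["a", "e"],
    ("ME 231", "Sketch and interpret T-s diagrams"): ["a", "k"],
    ("ME 414", "Design a duct system to a specification"): ["c", "e"],
}

# Hypothetical attainment results: fraction of students meeting the
# faculty-set criterion for each course outcome.
attainment = {
    ("ME 231", "Apply the first law to closed systems"): 0.82,
    ("ME 231", "Sketch and interpret T-s diagrams"): 0.75,
    ("ME 414", "Design a duct system to a specification"): 0.68,
}

# Roll course-outcome attainment up to the student outcomes they support.
by_student_outcome = {}
for key, outcomes in course_outcome_map.items():
    for so in outcomes:
        by_student_outcome.setdefault(so, []).append(attainment[key])

for so in sorted(by_student_outcome):
    vals = by_student_outcome[so]
    print(f"Outcome {so}: mean attainment {sum(vals) / len(vals):.2f} "
          f"across {len(vals)} course outcome(s)")
```

In this toy example, outcome "e" draws evidence from two different courses, which is exactly why spreading assessment across the curriculum (as in Table 3) avoids relying on any single course.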

With the intent to spread the outcomes assessment across the curriculum and avoid unnecessary redundancy in data collection, a curricular assessment map was created (Table 3, below) involving twelve core courses ranging from the sophomore to the senior level, including the capstone design experience courses.  The choice of courses was primarily determined by the need to ensure that 1) nearly all graduating students take the course and 2) thermal systems and mechanical systems courses are sufficiently covered.

The faculty use instruments and activities that are, for the most part, already in place for evaluation of student performance, such as exams, homework, quizzes, lab activities/reports, project presentations, and design reports.  For lab reports and design projects, rubrics were the most common tool used to assign quantitative measures.  The faculty member(s) teaching the course establish(es) the criteria for attainment for the specific instrument chosen.
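As an illustration of how a faculty-set criterion might be applied to an existing instrument, the sketch below checks a course outcome against a threshold. The rubric scores and the 70% criterion are invented placeholders, not an actual program standard.

```python
# Hypothetical rubric scores (0-4 scale) for one course outcome, taken
# from a lab-report rubric already used to grade the class.
rubric_scores = [4, 3, 2, 4, 3, 1, 4, 3, 3, 2]

# Hypothetical criterion: the outcome is attained if at least 70% of
# students score 3 or higher on the rubric.
criterion = 0.70
n_meeting = sum(1 for s in rubric_scores if s >= 3)
fraction = n_meeting / len(rubric_scores)

status = "attained" if fraction >= criterion else "not attained"
print(f"{fraction:.0%} of students scored 3 or higher -> {status}")
```

The key design point is that the evaluative instrument (the rubric) already exists for grading; the criterion just converts its scores into an attained/not-attained judgment for the assessment record.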

Figure 2:  The relationship between course outcomes, student outcomes and program educational objectives.


Table 3:  Mapping of mechanical engineering course outcomes to student outcomes.



Fundamentals of Engineering (FE) morning exam data:  We also use the FE morning exam data provided by NCEES (the National Council of Examiners for Engineering and Surveying) as another direct measure of attainment of student outcomes.  These data also provide information about our students' abilities on a national level and allow a broader assessment of the effectiveness of our curriculum.  The specific components we examine include

  • Mathematics and Probability/Statistics scores (outcome a)
  • Thermodynamics and Chemistry (outcome b)
  • Ethics and Business Practice (outcome f)

We typically use the metric of meeting or exceeding the national score on each component. Based on data from the last ten years, about 46% of our graduates take the exam annually.
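The meets-or-exceeds comparison can be expressed as a short sketch. All scores below are invented placeholders, not actual NCEES results for this program.

```python
# Hypothetical average percent-correct scores by FE morning-exam component.
program_scores = {
    "Mathematics": 71.0,
    "Probability/Statistics": 66.0,
    "Thermodynamics": 63.0,
    "Chemistry": 58.0,
    "Ethics and Business Practice": 80.0,
}
national_scores = {
    "Mathematics": 68.0,
    "Probability/Statistics": 64.0,
    "Thermodynamics": 65.0,
    "Chemistry": 55.0,
    "Ethics and Business Practice": 77.0,
}

# Flag any component where the program average falls below the national average.
below_national = [
    component
    for component, score in program_scores.items()
    if score < national_scores[component]
]

for component, score in program_scores.items():
    status = "meets/exceeds" if score >= national_scores[component] else "below"
    print(f"{component}: program {score:.1f} vs national "
          f"{national_scores[component]:.1f} -> {status}")
```

A component flagged as below the national average would then be a prompt for curricular review rather than a pass/fail verdict on its own.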