Computer engineering




ABET Committee

The COE department has a standing ABET Committee that is responsible for reviewing and, whenever needed, revising the PEOs and POs, in addition to supervising the direct and indirect outcome assessment processes, data collection and presentation, and the use of assessment data for continuous program improvement. The committee's three main tasks are highlighted below:




  1. Coordinate the Departmental Assessment Process and Carry out Data Analysis. This consists of supervising, coordinating, and collecting the results of the various assessment tools described above. The committee has the primary responsibility for monitoring the success of the COE program in meeting its stated objectives. At the end of each academic year, the committee compiles and analyzes the assessment data for the year and reports its findings to the department council. Attainment is determined by comparing each outcome's current score against its performance criteria. Based on the assessment results, the committee may identify weaknesses and suggest remedial actions, which are submitted to the department council for approval.




  2. Coordinate the Review Process of the PEOs and POs. The review cycle of the PEOs is currently three years. The COE department chair initiates the program objectives review process according to the COE Assessment Plan. The Alumni and Employer assessments are used to review the PEOs. The analysis and findings are presented to the department council so that department faculty are involved in any necessary changes or revision of the PEOs. Furthermore, the decision to revise the PEOs is presented to the Industry Advisory Committee for further discussion and approval. If the PEOs are revised, the committee coordinates the revision of the POs and, consequently, of the course outcomes so that they remain consistent with the POs and PEOs.



  3. Coordinate the Continuous Improvement Process. To close the loop of the program's self-assessment, the committee analyzes all of the collected assessment data, from both direct and indirect assessment tools, and identifies program outcomes that do not meet their performance criteria. Such program outcomes become candidates for improvement. Based on the assessment results, the committee may suggest remedial actions for approval by the department council. The committee sets up a plan and a schedule to coordinate the continuous improvement process. A one-page report is written for each weak program outcome to be improved. The report presents background information on the specific program outcome, such as the performance criteria, assessment tools, data collection cycle, and evaluation of results, together with the list of proposed actions to improve the performance of the learning outcome. The committee recommends to the department that a faculty member be assigned to carry out the remedial actions in the continuous improvement report for a given program outcome. Furthermore, the committee monitors the assessment data of program outcomes for which continuous improvement actions were conducted and may repeat the above process if needed. The COE Department maintains records of all course files, assessment reports, and refinements adopted in the undergraduate program.
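To make the comparison step concrete, the following minimal Python sketch (purely illustrative; the outcome names, scores, and criteria are hypothetical and not taken from the program's assessment data) flags program outcomes whose assessed score falls below their performance criterion:

```python
# Hypothetical outcome scores and performance criteria (illustrative values only).
assessment_data = {
    # outcome: (assessed score %, performance criterion %)
    "C (design)": (58.0, 70.0),
    "G (communication)": (74.0, 70.0),
    "L (statistics)": (65.0, 70.0),
}

def outcomes_needing_improvement(data):
    """Return the outcomes whose assessed score is below their performance criterion."""
    return {name: values for name, values in data.items() if values[0] < values[1]}

for name, (score, criterion) in outcomes_needing_improvement(assessment_data).items():
    print(f"{name}: score {score}% is below the {criterion}% criterion -> candidate for remedial action")
```

Outcomes identified by such a check would then be the subject of the one-page continuous improvement reports described above.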

Program Assessment Plan

The PEOs and POs are reviewed once every three years, as shown in Table 3.6-18. The planning of the indirect assessment and of the meetings/consultations with the Industry Advisory Committee is shown in Table 3.6-19. The planning of the continuous improvement and program assessment processes is shown in Table 3.6-20.

The COE assessment cycle is three years (see Table 3.6-20):


    • First year: a subset of the program outcomes is examined for possible improvement and, where needed, actions are taken to improve the POs that require it. The committee documents its analysis and actions.

    • Second year: the remaining subset of the program outcomes is examined for possible improvement and, where needed, remedial actions are taken. The committee documents its analysis and the remedial actions.

    • Third year: data collection is carried out for all outcomes and the results of the continuous improvement efforts are analyzed. The committee documents the assessment data analysis and its conclusions.



Table 3.6-18. Planning the review process of the PEOs and POs

| Planning of the PEOs and POs Review Process and Frequency | 2006-2007 (T061-062) | 2007-2008 (T071-072) | 2008-2009 (T081-082) | 2009-2010 (T091-092) |
|---|---|---|---|---|
| Review of the Program Educational Objectives (PEOs) (every 3 years) | X | | | X (based on survey data collected in T081 and T082) |
| Review of the Program Outcomes (POs) (every 3 years) | X | | | X |
Table 3.6-19. Planning the Indirect Assessment and Consulting the Industry Advisory Committee

| Planning the POs Indirect Assessment | T081 | T082 | T091 | T092 | T101 | T102 | T111 | T112 |
|---|---|---|---|---|---|---|---|---|
| Survey of Alumni and Employers (every 3 years) | X | X | | | | | X | X |
| Survey of COE Graduates and COOP Supervisors (every semester) | X | X | X | X | X | X | X | X |
| Meeting and consulting with the Industry Advisory Committee (every year) | | X | | X | | X | X | X |
Table 3.6-20. Planning the Continuous Improvement and Program Assessment processes

| Planning Continuous Improvement and Program Assessment | T081 | T082 | T091 | T092 | T101 | T102 |
|---|---|---|---|---|---|---|
| Continuous Improvement: performance analysis of outcomes C (design), E (formulation), D (teamwork), G (communication), J (contemporary), L (statistics), and N (integration), based on direct and indirect assessment data | X | | X | | | |
| Continuous Improvement: performance analysis of outcomes A (math/science), B (experiments), F (ethics), H (eng. sol.), I (learning), K (tools), and M (dis. math), based on direct and indirect assessment data | | X | | X | | |
| Program Assessment (direct assessment: rubrics and exit exam) | | | | | X | X |
Course Outcomes Assessment (secondary)

For each course in the Computer Engineering major, faculty involved in teaching the course have prepared a Course Learning Outcomes Table that includes the following for each outcome:



  • Outcome indicators and details: this describes the main course topics that contribute to achieving the outcome.

  • Suggested assessment methods and metrics.

  • Outcome minimum weight: this indicates the importance of the outcome in the course. It is the minimum share of the total course score (out of 100) that must be used for assessing or covering the outcome in the course.

  • A mapping between the course learning outcome and ABET program outcomes.

  • Each outcome is given a level of emphasis (Low, Medium, or High) that correlates with the weight used for assessing the outcome. This weight is used in the final mapping table between courses and ABET program outcomes.

    • When the course outcome weight is < 10%, it will be given a Low rank (L).

    • When the course outcome weight is between 10% and 20% it will be given a Medium rank (M).

    • When the course outcome weight is ≥ 20%, it will be given a High rank (H).
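As an illustration of this rule, the short Python sketch below (a hypothetical helper, not part of the departmental tooling) maps a course outcome weight to its emphasis rank; a weight of exactly 20% is treated as High, following the ≥ 20% rule:

```python
def emphasis_rank(weight_percent: float) -> str:
    """Map a course outcome weight (% of the total course score) to its emphasis rank."""
    if weight_percent < 10:
        return "L"   # Low emphasis: weight below 10%
    elif weight_percent < 20:
        return "M"   # Medium emphasis: weight between 10% and 20%
    else:
        return "H"   # High emphasis: weight of 20% or more

# Example: a 15% outcome weight maps to a Medium (M) rank.
print(emphasis_rank(15))   # -> M
```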

The tables of course learning outcomes for all core Computer Engineering courses are given in Appendix A.

Course outcomes are assessed every semester by course instructors, both directly and indirectly. To report the direct assessment of course learning outcomes, instructors use the Course Learning Outcomes Evaluation Table, which includes the following for each outcome:



  • Outcome minimum weight: this indicates the importance of the outcome in the course. It is the minimum share of the total course score (out of 100) that must be used for assessing or covering the outcome in the course.

  • Outcome weight: this is to be filled by the instructor indicating how much weight was actually used by the instructor for assessing the outcome.

  • Assessment Method: this describes the methods that were used to assess the outcome, the weight of each method, and the evidence of assessment.

  • Class Average: indicates the students' average performance on the outcome.

It should be noted that the evaluation criteria for each outcome are flexible and can vary from instructor to instructor. However, they must respect the specified minimum weight. An example of a course learning outcomes evaluation table is given in Table 3.6-21.
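The parenthesized percentages in the Total column of Table 3.6-21 follow from normalizing the total class average by the total outcome weight. The Python sketch below (an illustrative calculation; the helper name is hypothetical, with the weights and averages taken from outcome O1 of the table) reproduces that figure:

```python
def outcome_achievement(weights, averages):
    """Return (total weight %, total class average %, achievement %) for one outcome."""
    total_weight = sum(weights.values())
    total_average = sum(averages.values())
    return total_weight, total_average, 100.0 * total_average / total_weight

# Outcome O1 of COE 205 (weights and class averages from Table 3.6-21).
weights = {"Assignments": 15, "Quizzes": 8, "Exam I": 15, "Exam II": 20,
           "Lab Work": 5, "Project": 8}
averages = {"Assignments": 12.1, "Quizzes": 5.3, "Exam I": 9.5, "Exam II": 12.1,
            "Lab Work": 4.1, "Project": 7.0}

w, avg, pct = outcome_achievement(weights, averages)
print(f"Weight {w}%, class average {avg:.1f}%, achievement {pct:.1f}%")
# -> Weight 71%, class average 50.1%, achievement 70.6%
```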



Table 3.6-21. Example of course learning outcomes evaluation (by Faculty).

COE 205 Computer Organization and Assembly Language Programming
(The Assignments through Total columns are the assessment methods; for each outcome, the three rows give the weight used for assessment, the class average achieved, and the evidence of assessment.)

| Outcome | Outcome Min. Weight | | Assignments | Quizzes | Exam I | Exam II | Exam III | Final Exam | Lab Work | Project | Total | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| O1 | 55% | Weight | 15% | 8% | 15% | 20% | | | 5% | 8% | 71% | 60% |
| | | Average | 12.1% | 5.3% | 9.5% | 12.1% | | | 4.1% | 7% | 50.1% (70.6%) | |
| | | Evidence | #1-4 | #1-4, 6 | Q1-5 | Q1-5 | | | #1-13 | Report | | |
| O2 | 4% | Weight | | | | | | | 5% | | 5% | 80% |
| | | Average | | | | | | | 4.1% | | 4.1% (82%) | |
| | | Evidence | | | | | | | #1-13 | | | |
| O3 | 15% | Weight | | | | | | 20% | | | 20% | 75% |
| | | Average | | | | | | 11.8% | | | 11.8% (59%) | |
| | | Evidence | | | | | | Q1-5 | | | | |
| O4 | 2% | Weight | | 2% | | | | | | | 2% | 60% |
| | | Average | | 1.3% | | | | | | | 1.3% (65%) | |
| | | Evidence | | #5 | | | | | | | | |
| O5 | 2% | Weight | | | | | | | | 2% | 2% | 55% |
| | | Average | | | | | | | | 1% | 1% (50%) | |
| | | Evidence | | | | | | | | Report | | |
| Course | | Weight | 15% | 10% | 15% | 20% | | 20% | 10% | 10% | 100% | |
| | | Average | 12.1% | 6.6% | 9.5% | 12.1% | | 11.8% | 8.2% | 8% | 68.3% | |
