Review of the computer science program


Program Outcome 4: Critical evaluation and testing




Score 4:

  • Methods of evaluating some aspects (design, data structures, algorithms, tools, or programming language) of the system/tool/project are clearly mentioned and explained.

  • Methods of testing the system/tool/project are clearly mentioned and explained.

Score 3:

  • Methods of evaluating some aspects are partially mentioned.

  • Methods of testing the system/tool/project are clearly mentioned and explained.

Score 2:

  • Methods of evaluating some aspects are partially mentioned.

  • Methods of testing the system/tool/project are partially/vaguely mentioned.

Score 1:

  • No description of evaluating or testing the system is mentioned.


Program Outcome 5: Methods and tools

Score 4:

  • Implemented a full prototype system.

  • Employed use cases for requirements analysis.

  • Employed an OO approach and diagrams in the analysis and design phases.

  • Reported the use of tools in analysis, design, implementation, and testing of the system.

Score 3:

  • Implemented a full prototype system.

  • Employed use cases for requirements analysis.

  • Employed an OO approach and diagrams in the analysis and design phases.

  • Did not report the use of tools in analysis, design, implementation, and testing of the system.

Score 2:

  • Implemented most of the prototype system.

  • Employed use cases for requirements analysis.

  • Reported the use of at most two design diagrams.

  • Did not report the use of tools in analysis, design, implementation, and testing of the system.

Score 1:

  • Did not have a working prototype system.

  • Presented very weak employment of use cases and analysis and design diagrams.

  • Did not use tools in analysis, design, implementation, and testing of the system.


Program Outcome 7: Risk analysis

Score 4:

  • Risk analysis has been identified and planned for.

  • Students have indicated some occurring risks and how they addressed them.

  • Alternative plans used to address the risks are documented.

  • Students showed the effect of the risks on the project and the lessons learned.

Score 3:

  • Risk analysis has been identified and planned for.

  • Students have indicated some occurring risks and how they addressed them.

  • Alternative plans used to address the risks are not documented.

  • Students showed the effect of the risks on the project but did not show lessons learned.

Score 2:

  • Risk analysis has been identified but not planned for.

  • Students did not report occurring risks.

  • Alternative plans used to address the risks are not documented.

Score 1:

  • Risk analysis was superficial or not present.


Program Outcome 8 (i): Communication (Written Communication)

Score 4:

  • Report is well organized.

  • Report is mostly grammatically sound; some words are misspelled.

Score 3:

  • Report is well organized.

  • Report contains major grammatical mistakes and/or some words are misspelled.

Score 1:

  • Report is not well organized.


Program Outcome 8 (ii): Communication (Oral Communication)

Score 4:

  • Clear and well reasoned.

  • Well organized (good introduction and conclusion, adequate content to support the conclusions).

  • Well-formed sentences.

  • Interesting (confident, maintains eye contact).

Score 3:

  • Clear.

  • Reasonably well organized (good introduction and conclusion, inadequate content to support the conclusions).

  • Most sentences are well-formed.

  • Interesting (confident, maintains eye contact).

Score 1:

  • Not clear; many sentences are ill-formed; conclusions not clear.


Program Outcome 11: Professional development

Score 4:

  • The project involved learning a concept and/or algorithm not part of the CS curriculum.

  • The project involved learning a development platform not part of the CS curriculum.

Score 3:

  • The project involved learning a concept and/or algorithm not part of the CS core curriculum.

  • The development platform learning experience is minimal.

Score 2:

  • The project involved learning a development platform not part of the CS curriculum.

  • The concept and/or algorithm learning experience is minimal.

Score 1:

  • The learning experiences of the project, from both a concept/algorithm point of view and a development platform point of view, are minimal.


Program Outcome 12: Computing and society

Score 4:

  • The benefit of the system/tool/project is made very clear.

  • The safety implication (risks) to society is made very clear.

  • Components re-used in the system (from other sources) are mentioned.

Score 3:

  • One of the three elements is missing.

Score 2:

  • Two of the three elements are missing.

Score 1:

  • All three elements are missing.


Rubrics Results

Based on the way each rubric value is computed, as explained in the previous section, we include the values computed for each program learning outcome. Starting with Program Outcome #1, Knowledge in Major, the following table details the results of the assessment carried out in term 061.



 

Course                       Final Average   Rubric Value
Principles of Programming    45.08           1
Discrete Structures          58.24           2
Operating Systems            63.63           2
Artificial Intelligence      66.83           2
Computer Networks            65.35           2
Programming Languages        63.2            2

Total Average                60.39           2

The average value for the Principles of Programming part is very low. The reason is the dramatic change in how the two introductory programming courses have been offered starting in term 061. It had been conveyed to the department that the programming skills of ICS students were relatively weak, so the department launched an ad hoc committee to look into the matter more closely. Some conjectured that the use of Java, a very rich object-oriented language, was the main reason. The two courses have therefore been restructured, and exams were made to concentrate heavily on problem-solving and programming questions. We hope that subsequent offerings of these courses will show an increase in performance. One other factor that should not be underestimated is that students from the Management Information Systems (MIS) department also take these courses, and MIS students have generally complained of the "difficulty" of these courses.

As for the rest of the program outcomes, the following tables include the details of each outcome. They are based on the evaluation of 20 final reports: the first 10 are cooperative work final reports (Cooperative Work Option) and the last 10 are senior project final reports (Summer Training Option).

PO #2: Modeling

Report   Eval 1   Eval 2   Decider   Value
  1        1        2        -        1.5
  2        1        1        -        1
  3        1        1        -        1
  4        2        3        -        2.5
  5        3        3        -        3
  6        3        2        -        2.5
  7        1        1        -        1
  8        1        2.5      1        1
  9        1        1        -        1
 10        1        3        1        1
 11        4        3        -        3.5
 12        4        3        -        3.5
 13        4        3        -        3.5
 14        4        3        -        3.5
 15        4        2        4        4
 16        4        2        3        3
 17        4        3        -        3.5
 18        4        3        -        3.5
 19        4        3        -        3.5
 20        1        2        -        1.5

Average (Cooperative Work Option, reports 1-10):    1.55
Average (Summer Training Option, reports 11-20):    3.30
Overall Average:                                    2.43

PO #3: Problem Solving

Report   Eval 1   Eval 2   Decider   Value
  1        1        2        -        1.5
  2        1        2        -        1.5
  3        2        1        -        1.5
  4        2        2        -        2
  5        2        2        -        2
  6        2        2        -        2
  7        1        1        -        1
  8        1        2        -        1.5
  9        1        2        -        1.5
 10        2        3        -        2.5
 11        4        1        3        3.5
 12        4        3        -        3.5
 13        4        3        -        3.5
 14        3        3        -        3
 15        3        2        -        2.5
 16        4        2        3        3
 17        4        3        -        3.5
 18        4        3        -        3.5
 19        3        2        -        2.5
 20        1        3        3        3

Average (Cooperative Work Option, reports 1-10):    1.70
Average (Summer Training Option, reports 11-20):    3.15
Overall Average:                                    2.43




PO #5: Methods & Tools

Report   Eval 1   Eval 2   Decider   Value
  1        1        2        -        1.5
  2        1        1        -        1
  3        2        1        -        1.5
  4        2        3        -        2.5
  5        2        2        -        2
  6        2        2        -        2
  7        1        1        -        1
  8        1        2        -        1.5
  9        1        1.5      -        1.25
 10        2        4        3        3
 11        3        3        -        3
 12        4        2        3        3
 13        3        2        -        2.5
 14        4        3        -        3.5
 15        3        2        -        2.5
 16        3        2        -        2.5
 17        4        3        -        3.5
 18        2        3        -        2.5
 19        3        2        -        2.5
 20        2        2        -        2

Average (Cooperative Work Option, reports 1-10):    1.73
Average (Summer Training Option, reports 11-20):    2.75
Overall Average:                                    2.24

PO #7: Risk Analysis

Report   Eval 1   Eval 2   Decider   Value
  1        1        1        -        1
  2        1        1        -        1
  3        1        1        -        1
  4        2        1        -        1.5
  5        1        1        -        1
  6        2        1        -        1.5
  7        1        1        -        1
  8        1        1        -        1
  9        1        1        -        1
 10        1        1        -        1
 11        4        3        -        3.5
 12        4        3        -        3.5
 13        4        2        2        2
 14        4        2        1        1.5
 15        4        3        -        3.5
 16        4        2        3        3
 17        4        3        -        3.5
 18        4        2        2        2
 19        3        2        -        2.5
 20        2        3        -        2.5

Average (Cooperative Work Option, reports 1-10):    1.10
Average (Summer Training Option, reports 11-20):    2.75
Overall Average:                                    1.93




PO #4: Critical Evaluation & Testing

Report   Eval 1   Eval 2   Decider   Value
  1        2        1        -        1.5
  2        1        1        -        1
  3        2        2        -        2
  4        2        1        -        1.5
  5        2        1        -        1.5
  6        2        1        -        1.5
  7        1        1        -        1
  8        1        2        -        1.5
  9        1        1        -        1
 10        2        1        -        1.5
 11        3        3        -        3
 12        4        3        -        3.5
 13        2        2        -        2
 14        2        2        -        2
 15        2        1        -        1.5
 16        2        2        -        2
 17        3        3        -        3
 18        3        2        -        2.5
 19        1        1        -        1
 20        2        2        -        2

Average (Cooperative Work Option, reports 1-10):    1.40
Average (Summer Training Option, reports 11-20):    2.25
Overall Average:                                    1.83

PO #11: Professional Development

Report   Eval 1   Eval 2   Decider   Value
  1        2        1        -        1.5
  2        2        4        4        4
  3        2        1        -        1.5
  4        2        2        -        2
  5        3        3        -        3
  6        2        2        -        2
  7        2        2        -        2
  8        1        2.5      1        1
  9        1        1        -        1
 10        1        1        -        1
 11        3        1        2        2
 12        4        1        2        1.5
 13        3        1        2        2
 14        3        1        1        1
 15        3        1        2        2
 16        3        1        3        3
 17        2        1        -        1.5
 18        2        3        -        2.5
 19        2        1        -        1.5
 20        2        1        -        1.5

Average (Cooperative Work Option, reports 1-10):    1.90
Average (Summer Training Option, reports 11-20):    1.85
Overall Average:                                    1.88




PO #12: Computing & Society

Report   Eval 1   Eval 2   Decider   Value
  1        2        2        -        2
  2        2        3        -        2.5
  3        3        2        -        2.5
  4        3        2        -        2.5
  5        2        2        -        2
  6        3        3        -        3
  7        2        3        -        2.5
  8        1        2        -        1.5
  9        1        2        -        1.5
 10        2        3        -        2.5
 11        3        2        -        2.5
 12        4        2        2        2
 13        3        2        -        2.5
 14        3        2        -        2.5
 15        3        1        3        3
 16        2        2        -        2
 17        2        2        -        2
 18        3        2        -        2.5
 19        2        2        -        2
 20        2        2        -        2

Average (Cooperative Work Option, reports 1-10):    2.25
Average (Summer Training Option, reports 11-20):    2.30
Overall Average:                                    2.28

PO #8(i): Written Communication Skills

Report   Eval 1   Eval 2   Decider   Value
  1        2        3        -        2.5
  2        2        2        -        2
  3        2        4        2        2
  4        3        3        -        3
  5        3        3        -        3
  6        3        3        -        3
  7        3        2        -        2.5
  8        2        2        -        2
  9        2        1        -        1.5
 10        1        4        3        3.5
 11        3        2        -        2.5
 12        3        3        -        3
 13        3        3        -        3
 14        3        3        -        3
 15        3        3        -        3
 16        3        2        -        2.5
 17        2        3        -        2.5
 18        2        3        -        2.5
 19        3        3        -        3
 20        2        2        -        2

Average (Cooperative Work Option, reports 1-10):    2.50
Average (Summer Training Option, reports 11-20):    2.70
Overall Average:                                    2.60

The above results show that students in the summer training option have done considerably better than cooperative work students on program outcomes 2, 3, 4, 5, and 7. The main reason is that summer training students are evaluated in the senior project course, which the coordinator runs in a way that ensures all of these outcomes are addressed: students go through the whole software engineering life cycle in one single project, producing a more focused result. Cooperative work students, by contrast, are bound by the projects given to them by their employers. The coordinator of ICS 351 puts great effort into ensuring that students go through the whole life cycle of at least one project. It is not unusual, and has happened before, that students were relocated to other employers, or even dropped from the cooperative work option and asked to transfer to the summer training option, because the employer would not cooperate in meeting the department's requirements. We believe we can still add more structure to the cooperative program, along the lines of what is done in the senior project. We do not believe that our coop students are much weaker than summer training students; they simply need to be more explicit about what they did when writing their cooperative work final reports. For program outcomes 8(i), 11, and 12 the difference between the two options is not significant. Overall, the department needs to pay more attention to Program Outcomes 4, 11, and 12, viz. critical evaluation and testing, professional development, and computing and society, for both options.
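The "Evaluation Decider" columns in the tables above follow a consistent pattern, and a minimal Python sketch of that pattern is given below. This is our reading inferred from the data, since the exact computation is defined in the previous section: when the two evaluations differ by more than one point, a third "decider" evaluation is taken and averaged with whichever original evaluation lies closer to it; otherwise the two evaluations are simply averaged.

```python
def rubric_value(eval1, eval2, decider=None):
    """Combine two rubric evaluations (1-4 scale) into a single value.

    Inferred rule: evaluations within one point of each other are
    simply averaged; otherwise a third "decider" evaluation is taken
    and averaged with whichever original evaluation lies closer to it
    (the decider stands alone when it is equidistant from both).
    """
    if abs(eval1 - eval2) <= 1:
        return (eval1 + eval2) / 2
    d1, d2 = abs(eval1 - decider), abs(eval2 - decider)
    if d1 < d2:
        return (eval1 + decider) / 2
    if d2 < d1:
        return (eval2 + decider) / 2
    return float(decider)

# A few rows from the PO #2 and PO #3 tables above:
print(rubric_value(1, 2))      # 1.5 (no decider needed)
print(rubric_value(4, 2, 4))   # 4.0 (decider sides with Eval 1)
print(rubric_value(4, 2, 3))   # 3.0 (decider equidistant)
print(rubric_value(4, 1, 3))   # 3.5 (decider closer to Eval 1)
```

Every decider row in the eight tables above reproduces under this rule, which is why we consider it a plausible reconstruction rather than a documented procedure.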



The results for oral evaluation for senior projects in 062 are as follows:


PO #8(ii): Oral Communication Skills

Presentation   Evaluator 1   Evaluator 2   Evaluator 3   Evaluator 4   Average
     1             4             3.5           -             -          3.75
     2             3             3.5           -             -          3.25
     3             3             3             3             3          3.00
     4             3.5           1             2             -          2.17
     5             3             4             4             -          3.67
     6             2.5           -             -             -          2.50
     7             2.5           -             -             -          2.50
     8             2             3             2.5           -          2.50
     9             4             4             3             4          3.75
    10             4             4             3             4          3.75
    11             3             2             -             -          2.50
    12             3             3             -             -          3.00
    13             3.5           4             -             -          3.75
    14             3.5           4             -             -          3.75
    15             2             2.5           -             -          2.25
    16             3.5           3             -             -          3.25
    17             3.5           -             -             -          3.50
    18             4             2             3             3          3.00
    19             4             4             3             4          3.75
    20             4             3             3             3          3.25

Overall Average:   3.14

As for the oral and written communication skills, the overall averages (2.60 written, 3.14 oral) indicate that our students are doing well in this regard.
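Each row's average in the oral communication table is consistent with taking the mean over only the evaluators who actually scored the presentation, with empty cells ignored. A minimal sketch, assuming that reading:

```python
def presentation_average(scores):
    """Mean oral-communication score over the evaluators present.

    Empty cells in the table are represented as None and ignored, so a
    talk seen by two evaluators is averaged over two scores, one seen
    by three over three, and so on. Rounded to two decimals as in the
    table.
    """
    present = [s for s in scores if s is not None]
    return round(sum(present) / len(present), 2)

# Rows from the oral communication table (up to four evaluators):
print(presentation_average([4, 3.5, None, None]))  # 3.75
print(presentation_average([3.5, 1, 2, None]))     # 2.17
print(presentation_average([3, 4, 4, None]))       # 3.67
```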

With regard to the soft skills represented in Program Outcomes 6, 9, and 10, since the number of students is large we include only the summarized values here; details will be available in the course display of the program.



Figure: Relative professional satisfaction
The relative professional satisfaction index suggests that punctuality and attendance may be more of a problem among cooperative work students than among summer training students. One reason for this result is that the summer training period, at 8 weeks, is much shorter than the cooperative work period of 28 weeks. Also, the number of cooperative work students is much smaller than the number of summer training students.



Figure: Average Values for Program Outcomes 6, 9 and 10

The figure above clearly shows that cooperative work students scored better than summer training students on the soft skills. This is expected given the longer exposure of cooperative work students to these skills in their work environments. Overall, our students are doing well in this regard.


Comparison of Rubrics-based Evaluation, Grade-based Evaluation and Indirect-Assessment II



Figure: Comparison of all direct and indirect measures

In the comparison figure above, the rubrics-based evaluation is the most "conservative" of all the methods for every program outcome, the notable exceptions being outcomes 9 and 10, which are based on the employer surveys. The enormous effort and time faculty spent carrying out the direct assessments in 061 and 062, together with that method's other disadvantages, led us to believe it might be unnecessary. The rubrics approach seems more informative and clearly points out areas where the department needs to improve. Given the smaller time and effort the rubrics method requires, the department decided to adopt it and drop the grade-based assessment altogether. Although the indirect assessment based on students' opinions of course outcome coverage is "inflated", we still believe the end-of-semester survey provides valuable input that helps current and future instructors offer the course better.



Conclusion

In conclusion, the department needs to pay more attention to program outcomes 4, 7, and 11 (critical evaluation and testing, risk analysis, and professional development, respectively), as each had a rubric score below 2.



Program assessment and improvement history

The BS Computer Science program was evaluated by the ABET/CSAB team in 2001. In the summer semester after the visit, the Chairman of the department assigned two faculty members to study the assessment comments made by the visiting team and develop recommendations for improving the program. A copy of their report is attached to this document, and some highlights are reproduced here.

A chronological list of the evaluation, assessment, and improvement of the CS program:

  • 14-18 April 2001: Last ABET/CSAB visit

  • 28 February – 3 March 2004: CS program self-assessment visit

  • February 2007: CS program revised, approved, and implemented


Standard I-6. The results of the program’s assessments and the actions taken based on the results must be documented.


