Review of the computer science program




C. Assessments

For each instrument used to assess the extent to which each of the objectives is being met by your program, provide the following information:




  1. Frequency and timing of assessments

  2. What data are collected (should include information on initial student placement and subsequent professional development)

  3. How data are collected

  4. From whom data are collected (should include students and computing professionals)

  5. How assessment results are used and by whom

Attach copies of the actual documentation that was generated by your data collection and assessment process since the last accreditation visit, or for the past three years if this is the first visit. Include survey instruments, data summaries, analysis results, etc.


Assessment Tools and Procedures

Program assessment and improvement have been part of the ICS department's approach since its inception in 1985. The assessment process, however, was informal and implicit rather than formal and explicitly documented. Each academic year, the ICS Curriculum Committee is assigned the task of reviewing the program, taking into consideration the following factors:



  1. Instructors’ comments in the course file,

  2. Technological developments in the computing field,

  3. Trends in computer science education as discussed in various articles published in the literature,

  4. Faculty opinion during the ICS council meetings,

  5. Input from students, gathered through meetings that the department Chair and the college Dean hold with students at least once every semester.

The department views program improvement and revision at two levels: minor, immediately needed changes and improvements every year, and major program revisions every five years.

When the ABET/CSAB criteria changed in 2001, the department adopted the new criteria and developed formal program assessment tools. These include:



  1. Graduating student exit surveys: every graduating student is required to complete a survey before his clearance form can be signed by the department. These surveys are collected every semester by the department office and passed to the Curriculum Committee for analysis.

  2. Employer surveys

  3. Alumni surveys
    Both the employer and alumni surveys are conducted online through the Web page of the Deanship of Academic Development at the request of the department. These surveys were conducted during the academic year 2002-2003 and are being conducted again in 2006-2007.

Considering that all of the above tools provide an indirect assessment of program objectives and outcomes, the department has explored two direct assessment tools for the program and, as mentioned later, is adopting one of them for its future assessments.
Indirect Assessment I

The ICS department considers the views of graduating students, alumni, and employers on its offered programs a valuable source for assessing the strengths and weaknesses of those programs. The department uses graduating student surveys, alumni surveys, and employer surveys as tools to assess its programs. The department conducted the surveys twice: once during the 2002-2003 academic year and a second time during 2006-2007. In addition, the revised program was given to internal and external reviewers for their comments before it was approved and implemented in February 2007.



  1. 2002 – 2003 Surveys

    1. Graduating student survey

    2. Employer Survey

    3. Alumni Survey

  2. 2005 – 2006 Surveys

    1. Program review comments from Academia

    2. Program review comments from Industry

  3. 2006 – 2007 Surveys

    1. Graduating student survey

    2. Employer Survey

    3. Alumni Survey




  1. 2002 – 2003 Survey Results



    1. Graduating Students Survey

During the period from September 2002 to April 2003, twenty-six graduating students participated in the survey. The results of the survey are summarized below.



 




| Question | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree |
| --- | --- | --- | --- | --- | --- |
| The work in the program is too heavy and induces a lot of pressure. | 3 | 9 | 10 | 3 | 1 |
| The program is effective in enhancing team-working abilities. | 3 | 13 | 6 | 4 | 0 |
| The program administration is effective in supporting learning. | 4 | 6 | 13 | 3 | 0 |
| The program is effective in developing analytic and problem solving skills. | 5 | 11 | 9 | 1 | 0 |
| The program is effective in developing independent thinking. | 6 | 13 | 5 | 2 | 0 |
| The program is effective in developing written communication skills. | 3 | 13 | 7 | 2 | 1 |
| The program is effective in developing planning abilities. | 2 | 14 | 8 | 2 | 0 |
| The mathematical content of the program is adequate for pursuing the advanced courses in the program. | 3 | 9 | 8 | 2 | 0 |



Question 9: The co-op training experience is effective in enhancing:





| Question | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree |
| --- | --- | --- | --- | --- | --- |
| Ability to work in teams | 8 | 7 | 2 | 0 | 2 |
| Independent thinking | 6 | 10 | 2 | 0 | 1 |
| Appreciation of ethical values | 7 | 5 | 7 | 0 | 0 |
| Professional development | 9 | 7 | 3 | 0 | 0 |
| Time management skills | 9 | 5 | 5 | 0 | 0 |
| Judgment | 5 | 7 | 7 | 0 | 0 |
| Discipline | 4 | 5 | 10 | 0 | 0 |
| The link between theory and practice | 5 | 9 | 4 | 0 | 1 |
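As a hypothetical illustration of the kind of analysis the Curriculum Committee might apply to the survey counts above, the short Python sketch below converts one row of Likert-scale responses into a mean agreement score. The function name `mean_likert` and the 1-5 weighting (5 = Strongly Agree, 1 = Strongly Disagree) are assumptions for illustration, not part of the report.

```python
# Hypothetical sketch: summarize one survey row as a mean Likert score.
# Weighting scheme (5 = Strongly Agree ... 1 = Strongly Disagree) is assumed.

def mean_likert(counts):
    """counts: (strongly_agree, agree, neutral, disagree, strongly_disagree)."""
    weights = (5, 4, 3, 2, 1)  # assumed 5-point Likert weights
    total = sum(counts)
    return sum(w * c for w, c in zip(weights, counts)) / total

# Row "Ability to work in teams" from the co-op table: 8, 7, 2, 0, 2
score = mean_likert((8, 7, 2, 0, 2))
print(round(score, 2))  # prints 4.0
```

A score near 4.0 corresponds to "Agree" on average, giving the committee a single number per question to track across survey cycles.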

