Articles:

Ceilidh: A Courseware System for the Assessment and Administration of Computer Programming Courses in Higher Education

S D Benford, E K Burke, E Foxley, C A Higgins

Learning Technology Research Computer Science Department University of Nottingham NOTTINGHAM NG7 2RD, UK


Table of contents

Abstract
Introduction
The Ceilidh System
Student facilities
Course management tutor facilities
Course management teacher facilities
Course management - course developer facilities
Setting up exercises
Tutor and teacher facilities - statistical analysis
Experiences with Ceilidh
Impact on students
Impact on teachers and tutors
Use in formal assessment
Summary
References

Abstract

This paper describes a courseware assessment and management system called Ceilidh. As well as supporting the management of courses in general, Ceilidh provides specific assessment facilities that make it especially suited to computer programming courses.

In administering courses, Ceilidh provides the ability to monitor student progress and overall course progress, to inform tutors of relevant information, and to detect defaulting students. It allows all coursework to be submitted on-line, with the system collating the work and producing relevant reports and statistics. This coursework can take the form of computer programs, multiple choice questionnaires, question/answer exercises, or essays and reports. Computer programs and multiple choice questionnaires can be automatically assessed. Ceilidh also enables the presentation of information to students in the form of lecture notes and feedback; in the case of programming exercises this can include a detailed report of the areas where marks have been lost and gained.

We have found the system to be extremely valuable. From the students' viewpoint, it has acted as a confidence builder for novice programmers and provides immediate feedback through the automarking facility. From the teachers' and tutors' viewpoint, it allows us to identify weak students at an early stage, considerably reduces the time spent on the mundane task of marking, and significantly improves general course management.

Introduction

This paper will briefly describe the Ceilidh software quality control environment for the teaching of computer programming. Ceilidh is being developed in a project involving over thirty UK institutions of higher education and is funded under the UK Universities Funding Council's Teaching and Learning Technology Programme (TLTP). Ceilidh has also been installed at sites in America, Australia, Belgium, China, India, Malaysia, New Zealand, Portugal, Russia and Spain.

The paper will give an overview of the Ceilidh system followed by a discussion of some of the insights gained from several years of use at Nottingham. For further details of the Ceilidh system, see the paper[1] or manuals such as the students' guide to the system[3].

Several previous systems have been developed in the area of automatic administration and assessment. Some, for example HANDIN[4], developed at the University of Sheffield, concentrate solely on course management, while others, for example AUTOMARK[8], developed at McMaster University in Ontario, Canada, concentrate on automatic marking. Other systems that have dealt with these problems are CENVIRON[7] and KELVIN[5], both of which were developed at Helsinki University of Technology. However, although there is much CBL software involving some form of self-assessment, there is little work involving serious assessment of solutions to complex problems, where the assessments are significant enough to count towards the student's grading. Any such system must address considerable security issues in controlling access to information, fitting in with institutional legal requirements, and authenticating users. At the recent International Conference on Computer Based Learning in Science in Vienna in December 1993, only one paper[6] other than the Ceilidh paper[2], out of 93 presented, described a system with complex assessment procedures. At the Association for Educational and Training Technology Conference in Edinburgh in April 1994, the Ceilidh paper was the only one of the 75 delivered which involved assessment. Ceilidh combines both administration and assessment by setting automated assessment tools within a generalised administration framework.

The core of our Ceilidh system is an automatic assessment facility which tests computer programs from a range of perspectives, including dynamic correctness, dynamic efficiency, typographic analysis, complexity analysis and structural weakness, based on well-known software metrics. The system may be configured by the teacher running the course so that a student is able to repeatedly re-assess the quality of their program as it is developed, and so gradually work towards a quality target. Not only does this promote awareness of quality control issues, it also trains the student in time and work management (i.e. determining when the program is of satisfactory quality and not wasting time making unnecessary "improvements"). In addition to the auto-assessment of programs, Ceilidh provides integrated facilities for managing course resources and is therefore a suitable administration tool for courses other than programming.
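
To make the typographic analysis concrete, the following is a minimal sketch, in C (the language the current version of Ceilidh supports), of one simple style measure: the proportion of commented lines in a source file. It is illustrative only and does not reproduce Ceilidh's actual metrics or their weightings.

    /*
     * Illustrative only: one simple typographic measure (the percentage
     * of non-blank lines carrying a comment).  Ceilidh's real metrics
     * and weightings are not reproduced here.
     */
    #include <stdio.h>
    #include <string.h>

    /* Return the comment density of a C source file as a percentage. */
    static double comment_density(FILE *src)
    {
        char line[1024];
        int nonblank = 0, commented = 0;

        while (fgets(line, sizeof line, src)) {
            if (line[strspn(line, " \t\r\n")] == '\0')
                continue;                      /* skip blank lines    */
            nonblank++;
            if (strstr(line, "/*") || strstr(line, "//"))
                commented++;                   /* crude comment check */
        }
        return nonblank ? 100.0 * commented / nonblank : 0.0;
    }

    int main(int argc, char **argv)
    {
        FILE *src = (argc > 1) ? fopen(argv[1], "r") : stdin;
        if (src == NULL) { perror("fopen"); return 1; }
        printf("comment density: %.1f%%\n", comment_density(src));
        return 0;
    }

A real marker would combine many such measures, each with a relative weight configurable by the teacher.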

Although the system is under continuous development and improvement, versions of Ceilidh have been in operation at Nottingham University for over five years. The first public release of Ceilidh was made available in July 1992 and it has subsequently been installed at numerous sites worldwide. The current version supports both the C and C++ languages under UNIX and is freely available to academic institutions. The UK national pilot of Ceilidh, funded by the Universities Funding Council, is supporting development effort to extend the system to other programming languages, including Pascal, Ada and SML, and to other platforms, for instance PCs and X11.

The three main areas involved in what we refer to as courseware are:

1. The administration of courses

Under this heading we include monitoring student progress and overall course progress, informing tutors of relevant information, and detecting defaulting students.

2. The assessment and collation of student achievement

Marking student work in various forms, such as computer programs, multiple choice questionnaires, question/answer exercises, and essays and reports.

3. The presentation of information to students

The traditional role of CAL has been in the presentation of information to a student, with the speed of progress determined by the student, and with different routes being followed depending on the student's choice and on the system's assessment of the student's progress.

Eventually the Ceilidh project aims to encompass all three areas, but at present it covers only the first two. The third area is being actively pursued as part of the current project, for instance with the development of a hypertext interface, but at present the system provides administration and marking support for a course of lectures.

The Ceilidh System

This section will provide an overview of the Ceilidh system by presenting a high level summary of the functionality available. The structure of the Ceilidh System is presented in Figure 1.

Figure 1: The Structure of Ceilidh

Ceilidh can support a number of concurrent courses. Each course is divided into a number of units representing different chapters or topics. Each unit includes a number of exercises, which may be programming exercises, short answer questions or essay questions. Of these, programming and short answer exercises are supported by automated assessment facilities. Essay-style exercises include facilities for on-line collection of work for later hand marking.
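
As an illustration of this hierarchy, the following sketch shows one possible way of representing courses, units and exercises; the structures, field names and exercise kinds are assumptions for illustration and are not taken from Ceilidh's implementation.

    /* Illustrative data layout only; the names and fields below are
     * assumptions and do not reproduce Ceilidh's internal representation. */
    #include <stddef.h>

    enum exercise_kind { PROGRAMMING, SHORT_ANSWER, ESSAY };

    struct exercise {
        const char        *title;
        enum exercise_kind kind;   /* only the first two kinds are auto-marked */
        int                open;   /* currently accepting submissions?         */
    };

    struct unit {                  /* one chapter or topic of a course         */
        const char      *title;
        struct exercise *exercises;
        size_t           n_exercises;
    };

    struct course {
        const char  *code;         /* e.g. a module code                       */
        struct unit *units;
        size_t       n_units;
    };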

The overall functionality of Ceilidh is categorised into student facilities and course management facilities. Students use the system to obtain, complete and assess work, and to access course resources such as notes, work and lecture schedules, and even the teachers themselves. Course management involves three processes: developing the course; setting up the course and exercises; and monitoring the running of the course. In the Ceilidh system, these three processes reflect three categories of staff involved in managing a course: course tutors represent teaching assistants and are provided with additional facilities to inspect work and to summarise the progress of individuals or groups of students; course teachers are provided with facilities to administer entire courses; and course developers can amend the course information, including notes, exercises and test data.

We now consider each class of user in turn. Note that these categories are granted increasing levels of access to the system, and that lower-level functions are always available to higher-level categories of user.

Student facilities

The student menu is presented in Figure 2.

Figure 2: The Student Menu

The system provides the following key facilities to students.

a) It allows students to access general temporal course information, such as hand-in times for coursework, and more permanent information, such as lecture notes. This information can be viewed on-line or printed.
b) It provides access to the questions set in exercises (see section 2.4.1).
c) It offers outline program source (skeleton programs to be completed by students), and associated modules and header files where appropriate, to assist in the solution of programming exercises. For an essay exercise, an outline of the main sections/headings expected in the essay may be given.
d) It allows students to edit, compile and test run their computer programs. The extent to which compilation details are hidden from the student is determined by the teacher.
e) The students can submit their work to the system. If the exercise is a programming exercise, the system will mark the program. A summary of the marks is made available to the students to help them assess their program quality (and a copy of the program and of the marks awarded is retained centrally for later reference). Marking can take place many times, providing an iterative process of development and assessment. If the exercise is an essay, the system simply stores a copy of the solution.
f) It allows them to view a model solution, to run this solution, to view test data and to run both their own solution and the model solution against the test data. The model solution can only be seen after the deadline for submission has passed.
g) It allows them to comment on a specific exercise or on the system as a whole. Comments are stored for later browsing by teachers.
h) It offers help facilities including an overview of the marking metrics employed by the system and a good programming style guide.
i) An email help facility is also available to provide additional support via teachers and tutors.

Course management tutor facilities

Ceilidh provides tutors with the following additional facilities.

a) List details of the work submitted by all of the tutor's students, or by any named student. For each item of coursework the listing gives a summary of the marks awarded and the time at which it was submitted, including whether it was early or late. Details of the work such as the program source code and a more detailed breakdown of the marks can be inspected if requested.
b) List the names of students who have not submitted work, or who have submitted late.
c) List the marks awarded to students for a particular exercise.
d) Summarise the average marks across all of the exercises on a given course.

Course management teacher facilities

Ceilidh provides teachers with the following additional facilities.

a) Declare certain exercises to be open. These are the exercises on which the student will be expected to work.
b) Declare certain previously opened exercises to be late. Students submitting after this date will be warned that their work is late. For each exercise made late, the teacher will be invited to request plagiarism tests and overall class software metrics. The latter is particularly important in enabling the teacher to keep in close touch with the class's progress, strengths and weaknesses.
c) Declare certain exercises to be closed. Students will no longer be able to submit work for these exercises.
d) Browse and respond to the students' comments.
e) Set weighting and scaling factors so as to calculate final assessment grades from marked exercises in any desired way (see the sketch following this list).
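
The following sketch illustrates the kind of weighted, scaled calculation referred to in item (e); the weights, scale factor and record layout are illustrative assumptions rather than Ceilidh's own configuration format.

    /* Illustrative only: combine per-exercise marks into a final grade
     * using relative weights and an overall scale factor.  The weights,
     * marks and layout below are assumptions, not Ceilidh's format.     */
    #include <stdio.h>

    struct marked_exercise {
        const char *name;
        double      mark;    /* raw mark, 0..100          */
        double      weight;  /* relative weight, any unit */
    };

    static double final_grade(const struct marked_exercise *ex, int n,
                              double scale)
    {
        double weighted = 0.0, total_weight = 0.0;
        for (int i = 0; i < n; i++) {
            weighted     += ex[i].mark * ex[i].weight;
            total_weight += ex[i].weight;
        }
        return total_weight > 0.0 ? scale * weighted / total_weight : 0.0;
    }

    int main(void)
    {
        struct marked_exercise course[] = {
            { "unit1.ex1", 85.0, 1.0 },
            { "unit2.ex1", 92.0, 2.0 },
            { "unit3.ex2", 74.0, 2.0 },
        };
        printf("final grade: %.1f%%\n", final_grade(course, 3, 1.0));
        return 0;
    }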

Course management - course developer facilities

Ceilidh provides course developers with the following additional facilities (see Figure 3).

Figure 3: Course Developer Facilities

a) Create new coursework, or amend existing coursework. The system prompts the user to ensure that all the necessary data items have been input (see below).
b) Set up new courses and manage registers of tutors and students.

Setting up exercises

In order to set up an exercise the course developer needs to provide a number of files:

+ A question/specification.
+ A working model solution or model answer.
+ A file of test data for each of the dynamic tests to be carried out in programming exercises. Test data may take the form of raw data to be input to the program, or of a UNIX shell script which can drive the program in a more flexible way.
+ A file of "keywords" for each test. These are matched against the program output for programming exercises and against the students responses for short answer exercises.
The keywords are actually UNIX regular expressions to be matched against the program's output. The use of regular expressions offers a high degree of flexibility in making this comparison.
Finally, the teacher needs to provide a mark weighting file which assigns a name and relative mark weighting to each of the tests involved.
The teacher may also configure the relative weighting of style and complexity tests if required and may even alter the relative weightings of the various software metrics used.
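
The following sketch illustrates the keyword check described above, treating each line of a keywords file as a POSIX extended regular expression that must match somewhere in the captured program output; the file name, the pass/fail policy and the use of extended (rather than basic) regular expressions are assumptions for illustration.

    /* Illustrative only: each line of a keywords file is treated as a
     * POSIX extended regular expression, and the test passes if every
     * expression matches the captured program output.  The file name
     * and pass/fail policy below are assumptions.                      */
    #include <regex.h>
    #include <stdio.h>
    #include <string.h>

    /* Return 1 if every pattern in keyfile matches somewhere in output. */
    static int keywords_match(const char *output, FILE *keyfile)
    {
        char pattern[512];

        while (fgets(pattern, sizeof pattern, keyfile)) {
            pattern[strcspn(pattern, "\n")] = '\0';    /* strip newline   */
            if (pattern[0] == '\0')
                continue;                              /* skip blank line */
            regex_t re;
            if (regcomp(&re, pattern, REG_EXTENDED | REG_NOSUB) != 0)
                return 0;                              /* bad pattern     */
            int hit = (regexec(&re, output, 0, NULL, 0) == 0);
            regfree(&re);
            if (!hit)
                return 0;                              /* keyword missing */
        }
        return 1;
    }

    int main(void)
    {
        const char *output = "Total = 42\nDone.\n";    /* captured output   */
        FILE *keyfile = fopen("test1.keywords", "r");  /* hypothetical name */
        if (keyfile == NULL) { perror("fopen"); return 1; }
        printf("dynamic test %s\n",
               keywords_match(output, keyfile) ? "passed" : "failed");
        fclose(keyfile);
        return 0;
    }

In practice a separate keywords file would accompany each of the test data files described above.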

Tutor and teacher facilities - statistical analysis

Course teachers and tutors can also track the progress of individuals or groups of students using statistical analysis, which is supported by a statistical summary package within Ceilidh. Three kinds of summary can be produced.

+ A student summary shows the progress of an individual student over a given course. This includes displaying their mark for each exercise along with the class average. It also includes the number of submission attempts per exercise, thus providing some indication of how hard the student found the exercise.
+ An exercise summary displays the distribution of student marks for a given exercise including the number of submission attempts per student.
+ A course summary displays the average marks across the whole course, as well as the average number of submissions across the whole course.
These facilities are available to tutors and teachers. Students are encouraged to obtain a personal summary of their own progress across a course. Summaries may be presented as tables or as graphs which may appear on-screen or be printed. Facilities are also provided to batch print summaries for each student on the course. For example, at Nottingham we have established the practice of emailing summaries to all tutors on a fortnightly basis.
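
As an indication of the calculation behind an exercise summary, the following sketch computes the average mark and average number of submission attempts for one exercise; the record layout and example figures are assumptions for illustration.

    /* Illustrative only: mean mark and mean number of submission
     * attempts for one exercise.  The record layout and figures are
     * assumptions for illustration.                                 */
    #include <stdio.h>

    struct submission_record {
        const char *student;
        double      best_mark;   /* best mark over all attempts */
        int         attempts;    /* number of marking attempts  */
    };

    static void exercise_summary(const struct submission_record *r, int n)
    {
        double mark_sum = 0.0;
        int attempt_sum = 0;

        for (int i = 0; i < n; i++) {
            mark_sum    += r[i].best_mark;
            attempt_sum += r[i].attempts;
        }
        printf("students: %d  average mark: %.1f  average attempts: %.1f\n",
               n, n ? mark_sum / n : 0.0, n ? (double)attempt_sum / n : 0.0);
    }

    int main(void)
    {
        struct submission_record ex1[] = {
            { "student-a", 85.0, 3 },
            { "student-b", 92.0, 7 },
            { "student-c", 61.0, 2 },
        };
        exercise_summary(ex1, 3);
        return 0;
    }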

Experiences with Ceilidh

This section summarises our experiences of building and using Ceilidh. This summary does not represent the results of a formal statistical/experimental evaluation of the effects of the system on the educational process. Instead, it summarises several years of feedback, observation and opinion from a wide range of sources. These include:
+ Our own subjective view as teachers and implementors.
+ Analysis of user questionnaires. Ceilidh contains questionnaires about the system and the courses supported, and these are made available on-line at the end of each course.
+ Archives of the many student comments and questions submitted through the on-line comment facility.
+ The marks obtained across three years of use.
+ Experiences of discussing Ceilidh in examiners' meetings and feedback from other members of staff passing on student comments from tutorials.
+ Feedback from other organisations involved in piloting the system worldwide.

Impact on students

We begin with observations on the ways in which Ceilidh has affected different kinds of student.

+ The system has acted as a confidence builder for novice programmers who benefit greatly from the kind of positive early feedback that arises from the early simple exercises. One particular aspect of this has been building the confidence of female students who may often initially feel intimidated by the "macho" image associated with programming.
+ The system has enabled us to spot the really weak students early on, so that we can focus effort on helping them.
+ Nearly all of the critical feedback came from experienced students, particularly over the marks assigned. The notion of good programming style was a particularly contentious issue and we had several discussions on the issues of style and standardisation of layout. This turned out to be beneficial as it is the self-taught "experts" who often need the most help in this area (even if they think that they don't!).
This last observation is of key importance. One of the most surprising and pleasing aspects of Ceilidh was its role in consciousness raising. The provision of immediate feedback by a machine produced much more discussion of programming correctness and style than did previous hand marking. Ceilidh's on-line comment facility played a key role in this discussion, and one of the early extensions to the system was to offer students the opportunity to comment on each mark as it is given. Consequently, our experience of automatic assessment is that, far from reducing contact with students, it increases the quantity and quality of discussion. Looking back, we suspect that this effect stems from a number of factors. First, immediate feedback means that marks are received while the problem is uppermost in a student's mind. Second, students may be happier to argue with a machine than with a teacher. Third, the marking process is generally more open to inspection than with hand marking (e.g. style rules are published and are applied consistently).

As a further comment, hand marking of any form of coursework can lead to a student being treated less fairly than others. For instance, coursework marked by more than one person will lead to inconsistencies in the marks awarded, due to differing ideas of what the correct answer should be. This, coupled with other problems such as racism, sexism and favouritism, can lead to certain students achieving poorer marks than they deserve. We believe that such explicit discrimination is reduced, if not eliminated, by the use of the Ceilidh system, since it marks each solution consistently. However, we recognise that implicit discrimination through inappropriately chosen questions must still be guarded against.

Of course, the introduction of Ceilidh was not without its problems and we did observe a number of more negative reactions to the system. One problem group came to be known as the perfectionists. These students seemed unable to stop working on an exercise even when a satisfactory mark had been obtained. A second problem group was the gamblers, students who iterated around the marking cycle many times, tweaking their programs in an attempt to pick up extra marks without necessarily thinking through the problem at hand. The perfectionists and gamblers led us to make two further extensions to the system. The first was to introduce a minimum time delay between markings, during which a student could not re-mark their program. This delay can be tailored by the teacher and may range from a few seconds to several days. A more general advantage of this extension is that it allows Ceilidh marking to be used in a "once-only" fashion if the teacher wishes. The second extension was to extend the progress monitoring facilities to spot the problem students. This involved two new facilities:

1. Automatically producing graphs of the number of marking attempts per student across an exercise or course.
2. Automatically producing graphs of the "development profile" for each student completing an exercise. The development profile charts the mark obtained against each marking attempt and so visually shows progress across an exercise.

Graphs such as these provide a rough guide to teachers of potential problem students (e.g. perfectionists or gamblers) who might need additional teaching support. Teachers might then encourage these students to change their working method, either by learning to manage their work against sensible quality targets or by thinking more about the problem at hand before changing their program. In more general terms, the cases of perfectionists and gamblers both highlight Ceilidh's role in encouraging students to manage their own work in an effective way. We believe that this is possible because Ceilidh devolves greater responsibility to students for determining final marks and therefore for managing their work load.
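
As an illustration of how such monitoring might be automated, the following sketch flags possible perfectionists and gamblers from simple submission statistics; the thresholds and record layout are assumptions for illustration, and go beyond the graphs that Ceilidh itself presents to the teacher.

    /* Illustrative only: flag possible perfectionists and gamblers from
     * submission counts and mark improvement.  The thresholds below are
     * arbitrary assumptions; Ceilidh presents graphs and leaves the
     * judgement to the teacher.                                         */
    #include <stdio.h>

    struct profile {
        const char *student;
        int         attempts;    /* marking attempts on one exercise */
        double      first_mark;  /* mark on the first attempt        */
        double      final_mark;  /* mark on the latest attempt       */
    };

    static void flag_problem_students(const struct profile *p, int n)
    {
        for (int i = 0; i < n; i++) {
            double gain = p[i].final_mark - p[i].first_mark;

            if (p[i].attempts > 15 && p[i].final_mark >= 90.0)
                printf("%s: possible perfectionist (%d attempts, now %.1f%%)\n",
                       p[i].student, p[i].attempts, p[i].final_mark);
            else if (p[i].attempts > 15 && gain < 5.0)
                printf("%s: possible gambler (%d attempts, only +%.1f marks)\n",
                       p[i].student, p[i].attempts, gain);
        }
    }

    int main(void)
    {
        struct profile cohort[] = {
            { "student-a",  4, 55.0, 80.0 },
            { "student-b", 22, 78.0, 81.0 },
            { "student-c", 19, 60.0, 95.0 },
        };
        flag_problem_students(cohort, 3);
        return 0;
    }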

Before passing on, we should briefly mention one other hidden problem. Ceilidh requires that students hand in their work in electronic format. Indeed, at Nottingham, we currently insist that nearly 75% of all core first year work is handed in on-line (not just programs). This makes it difficult for students to work with pen and paper in their own residences. Of course, many students own their own computers and are able to transfer their work onto the system. However, we fear that those who do not may be disadvantaged. We believe that the solution to this problem lies quite simply (in technical if not financial terms) in the provision of better computing infrastructure. In particular, we would like to see all of our students armed with personal or notebook computers within the next few years.

Impact on teachers and tutors

So far, we have considered Ceilidh's effect on the students. We also need to consider its effect on the staff. The clear and simple benefit to teachers has been a massive reduction in the time spent marking students' programs. Ceilidh has also helped with more efficient administration of courses. Collection and collation of work have been trivial, deadlines are published in advance and are stuck to, and marks are returned to examiners on time. Ceilidh also helps with the trivial, but often annoying, aspects of coursework, such as making sure that work is clearly labelled and legible. This was particularly noticeable where the collection and collation of essays was concerned. Clearly some of these issues extend to courses other than programming.

The general progress monitoring facilities within Ceilidh have been of benefit to both tutors and teachers and have allowed us to keep track of our students. As an example, some tutors actually hand out progress charts to students during tutorials to confirm progress on the course. Interestingly, far from viewing this as a draconian regime as some people feared, many students seem to appreciate regular confirmation of satisfactory progress.

Use in formal assessment

Ceilidh is used both to give feedback to students and for formal course assessment. The current first semester programming course at Nottingham is entirely coursework assessed and the second is 50% coursework assessed with the other 50% being a formal written examination. This dual use of Ceilidh has given rise to several interesting issues.

The iterative use of auto-assessment tends to result in very high raw marks, with marks of around 80% to 90% being common. The marks also tend to be far more tightly grouped (as is often the case in continuous assessment). Even with the scaling facilities mentioned previously, it may be difficult to categorise the students according to an "expected" normal distribution. This raises the issue of the relation between feedback and assessment. Indeed, it is clear that Ceilidh, by providing interactive feedback, increases the likelihood of obtaining high marks. One solution is to re-mark the students' work at the end of the course using different criteria (e.g. different and harder tests and more rigorous style checking). However, we then encounter the problem that students feel cheated because the system has been reporting excellent progress which may not match their final mark. In general, this issue relates to whether we are seeking normative or criterion assessment of students. So far we have used Ceilidh on first year foundation courses where criterion assessment is a sensible approach (i.e. if you can demonstrate the necessary skills then you can pass the course). The role of Ceilidh, and indeed of any iterative auto-assessment technique, may be limited where normative assessment is required (i.e. on courses in later years).
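
For illustration, the following sketch shows one simple normative rescaling that a teacher might apply to tightly grouped raw marks, shifting and stretching them towards a target mean and spread; the target values are arbitrary and this is not a facility we claim for Ceilidh.

    /* Illustrative only: shift and stretch a set of raw marks towards a
     * target mean and spread.  The target values are arbitrary and this
     * is not a facility claimed for Ceilidh; compile with -lm.           */
    #include <math.h>
    #include <stdio.h>

    static void rescale(double *marks, int n, double target_mean,
                        double target_sd)
    {
        double mean = 0.0, var = 0.0;

        if (n == 0)
            return;
        for (int i = 0; i < n; i++) mean += marks[i];
        mean /= n;
        for (int i = 0; i < n; i++) var += (marks[i] - mean) * (marks[i] - mean);
        double sd = sqrt(var / n);

        for (int i = 0; i < n; i++) {
            double z = (sd > 0.0) ? (marks[i] - mean) / sd : 0.0;
            marks[i] = target_mean + z * target_sd;
            if (marks[i] > 100.0) marks[i] = 100.0;   /* clamp to mark range */
            if (marks[i] < 0.0)   marks[i] = 0.0;
        }
    }

    int main(void)
    {
        double marks[] = { 88.0, 90.0, 85.0, 92.0, 89.0 };

        rescale(marks, 5, 60.0, 15.0);
        for (int i = 0; i < 5; i++) printf("%.1f\n", marks[i]);
        return 0;
    }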

Summary

Ceilidh is a system which combines tools for the automatic assessment of students' programming work with a general online administrative framework. Ceilidh has been used at Nottingham for over five years and has been distributed to many sites worldwide. Our early experiences with the system are promising in that it improves the completion rate of exercises and also, perhaps surprisingly, seems to promote dialogue with the students. However, taking a radical step such as combining instant feedback with the ability to repeatedly re-assess work also introduces some new tensions and we have had to deal with undesirable student reactions to the system (the "gamblers" and "perfectionists") and also re-examine the often difficult relationship between feedback and assessment.

References

1. Steve Benford, Edmund Burke, and Eric Foxley,
"A System to Teach Programming in a Quality Controlled Environment", The Software Quality Journal 2, pp.177-197 (1993).

2. Steve Benford, Edmund Burke, Eric Foxley, Neil Gutteridge, and Abdullah Mohd Zin,
Ceilidh: A course administration and marking system, Proceedings of the International Conference on Computer Based Learning in Science, Vienna, December 1993.

3. Steve Benford, Edmund Burke, and Eric Foxley,
Student's Guide to the Ceilidh System (2.4), LTR Report, Computer Science Dept, Nottingham University, 1995.

4. A. J. Cowling and J. J. McGregor,
"HANDIN - A System for Helping with the Teaching of Programming", Software - Practice and Experience 15(6), pp.611-622 June 1985).

5. A Eerola and L Malmi,
KELVIN - A System for Analysing and Teaching C Programming Style, Computer Learning in Complex Environments CLCE94, University of Joensuu, Finland, May 16th-19th 1994, pp.112-117.

6. Walter Friedl,
Learn - The computer assisted learning approach, Proceedings of the International Conference on Computer Based Learning in Science, Vienna, December 1993.

7. L Malmi,
CENVIRON - An Environment for Teaching and Learning C Language, Computer Learning in Complex Environments CLCE94, University of Joensuu, Finland, May 16th-19th 1994, pp.87-90.

8. K. A. Redish and W. F. Smyth,
"Program style analysis : A Natural By-product of program compilation", Communication of the ACM 29(2), pp.126-133 (Feb 1986).