Module Evaluation

The Database Disciplinary Commons

2010

Evaluation: The Formal Process

Formal evaluation is part of the life cycle of every module. The evaluation and monitoring process at Staffordshire University is very similar to that of other universities: it starts with student feedback and the teaching team's own evaluation, then moves on through faculty and university quality procedures. The final artefact for this portfolio is the Module Report, a formal document completed after each delivery of a module which reviews student feedback and module results.

This page reviews the MDS module informally, looking at it from three different perspectives:

 

    What the students said

    What the teaching team said

    Where do we go from here?

 

Evaluation: The Student Experience

The faculty has just introduced an online module evaluation form but, for a number of reasons, students tend not to complete it. The teaching team therefore asked the students to complete an anonymous paper questionnaire to help us identify any changes that should be made to the module. Approximately two-thirds of the students completed the paper questionnaire.

It is notoriously difficult to get an accurate picture of the student experience. For one thing, there are many different experiences: one student wrote:

‘very interesting module, taught brilliantly that engaged all the way providing an insight into DBs in the real world’

Other students described the module as 'very challenging', which the teaching team takes to mean 'very difficult'. Comments were generally positive, with a number of suggestions for the milestone (to increase the word limit). The survey highlighted two main issues: one we had already identified, and the other came as a bit of a surprise.

The issue we expected was a request for more practical work and more practical time. During the course of the module, it had become apparent that some students were struggling to finish the practical work in the time allowed. The practicals form the basis for the Proof of Concept artefact, so they are an important part of the assessment as well as a support for learning. We had provided extension exercises for students who wanted to go beyond the taught material, but our review of the module had identified a need for more supporting material, and this view was clearly shared by the students.

A number of students said they felt the connection between practicals and lecture material needed to be clearer. This was unexpected, as the lectures and practicals had been developed together and the practicals were written to support the lectures. The link was clear to the team, but evidently needed to be communicated better to the students.

 

Evaluation: The Teaching Team Experience

The MDS module was a good experience for the teaching team. The students, even those who found the module difficult, were committed and hard working, and some of the coursework was of a very high standard. The team felt that relationships between staff and students, and between the students themselves, were good and interactive. We covered some material that we would not normally teach, and covered other material at a higher level than is possible in other modules, so the module stretched staff as well as students.

One of our concerns was that the module might not be pitched at the correct level. Student feedback showed that about a third of students found the module challenging or very challenging - we had been concerned it might be too easy! The quality of the work produced suggests that the module stretched students but that they were able to meet the challenge. More support for the examination (see below) would probably be welcomed.

The module pass rate was high: of the students who submitted for the module, only two did not pass. There were some very good grades, but there was also a spread of results across both cohorts, suggesting that the module overall was pitched about right. For the coursework, it was satisfying to see that all the students who passed had carried out additional research to supplement the taught material. In some ways the real test is the examination, since it covers a wider field than the coursework. There was less evidence of additional research in the exam, but concepts were applied well to the scenarios.

Exam results were not as good as the coursework results - are they ever? - and exam preparation is one of the areas we want to develop. Some students performed significantly less well in the examination than in the coursework. We provided a series of online MCQ tests to help students prepare for the examination, and Blackboard statistics showed that almost all the students who submitted had attempted the tests. For the next academic year we intend to supplement the online MCQ tests with post-lecture quizzes - an idea that came out of the Commons - to reinforce concepts. One of the practicals in the 2009/10 delivery was designated as an 'exam practical', where the practical work was used to help students understand concepts that would feature in the exam. It was noticeable that the topics covered in this practical were answered better than topics which had not been supported by a practical, and we will try to provide more 'exam practicals' in the next delivery.

The team had already identified the need to give students the opportunity to do more practical work, and this was reinforced by the students' comments. In keeping with the evolutionary philosophy of the module, we are reviewing the module content. The next delivery of the module will be broadly similar, but with some fine-tuning of concepts and more discussion of database architectures.

 

Evaluation: Where Do We Go From Here?

We nominated the MDS module for the Disciplinary Commons because we wanted the opportunity to stand back and review the way the module was developing. There is some fine-tuning to be done: we need to provide more time for practical work, perhaps by swapping one of the lecture slots; we need to make the link between practicals and lectures more explicit; and we will provide more learning material to support the exam. Overall, though, the module seems to work well, and this is supported by the students' comments. The module content will continue to evolve, as this is part of the module's approach.

One of the most important elements in the module delivery was the number and calibre of the students. There were only 33 students on the module, which meant the teaching team got to know all the students individually and was able to provide individual support to those who wanted it. MDS is a computing option module, so the UK cohort is self-selecting, and the UK students have all previously worked with members of the database team: they know what they are letting themselves in for. The Stuttgart students are highly motivated and have worked hard to gain the grades required to study overseas. With a larger cohort or less motivated students, we would have a different module.


 