A Portfolio for Human Computer Interaction Design

 

Sometimes I try to improve the language, the lines, or the delivery, but I don't ad-lib because I think that makes it really hard for everybody else.  Harrison Ford

Delivery of Human Computer Interaction Design (HCID)


This portfolio has focused on a set of reasonably concrete factors, such as content, instructional design and assessment, in order to illustrate how the module has arrived at its present status and how its aims and more specific objectives are achieved.

I have done this by examining and reflecting upon the context, content, instructional design, assessment and evaluation artefacts contained within this module.  In contrast to these areas, the delivery section is different because it is more about me: how I deliver information to students in lab sessions, lectures and tutorials, and how I converse with students on a more individual basis.  It is difficult for me to state how effectively I deliver ideas, skills and concepts; the only people who can easily make this judgement are my students and, possibly, my peers during peer observation.

I therefore propose to talk briefly about the mechanics of my delivery, that is, what I actually do when I deliver.  I will then make brief reference to student feedback which touches on my delivery, and finally discuss my recent peer observation within the Disciplinary Commons.

Mechanics of My Delivery

I deliver lectures to large groups of students in the HCID module.  The lecture sessions have been examined within the Instructional Design section of this portfolio but, to elaborate further, I do not stay static when delivering these sessions; I walk across the stage and around the lecture theatre.  I use PowerPoint slides which mainly relate to HCID theory, but I move away from the projector whenever possible in order to make points.  I use objects as sample artefacts in lectures: for example, I will use a door handle, opening and shutting the door, to demonstrate affordance, i.e. the fact that this object affords pushing (or pulling).  I prompt students in lectures for answers to questions relating to content and encourage them to discuss an issue in pairs for brief bursts within a lecture.  While this is happening I move quickly around the lecture theatre in order to monitor what they are saying, then move to the front to conclude this phase and open what they have said to general discussion.

During tutorials or practical lab sessions, I do similar things to the above.  I move quickly between students in order to see and converse with every student during a practical session.  I know from my own observation that some lecturers prefer not to move around too quickly, spending more time in greater depth with each person.  However, the problem with this is that in a demanding practical session there may be students who feel they have not had any attention, and indeed a lecturer may run out of time without seeing some students at all.  This is always a possibility even if the lecturer attempts to move quickly around the room.  During a practical session or tutorial, I introduce a topic with examples, either using the data projector and online web support (VLE or other) or the smart board, and I always prompt students for answers while trying to hold eye contact with the whole room.  I then encourage students to quickly consolidate what they have just heard by making their own interpretation of it, either on screen (the usual approach) or on paper.  Again, I move quickly around the room to maintain interest, see what they are doing and give formative feedback.

Student feedback on Delivery

The artefact illustrated in Figure 16 in the Evaluation section reveals some comments which relate to my delivery.  As already mentioned, the statistics indicate that 68% of students thought that the teaching in the module helped them to learn effectively and 78% were clear about what they were meant to be learning.  Other comments provided in the same questionnaire, which were not included in the statistics but simply reported as a selection of student comments in a 'Strengths of the Unit' section, indicate that students generally viewed the topic very positively.  Example comments indicated that the module (unit), which 'could have been fairly dry and musty', was 'turned into something good by the lecturer' [sic]; that generally 'help was always at hand' with 'excellent online facilities' and 'interesting lectures'; and that because 'teaching was done in small chunks it made it easy to take in'.  On the other hand, in a 'How this Unit could be Improved' column, one student worried about his system not working and whether the focus was on presentation and related issues.

It is easy to take a few comments and some statistics, interpret them either way and, for example, be falsely inflated by faint praise or unnecessarily deflated by misunderstanding.  However, if I reflect on this artefact's statistics and these comments, I should, I feel, take heart that I am delivering the module in an appropriate and enjoyable manner and steering it in a positive direction that enables and empowers a majority of students in the area of usability.  My delivery, like anybody's, will always have room for improvement.  However, very broadly, the evaluative feedback suggests that the delivery substantially addresses the original aims of the module.

Peer Observation within the HCI Disciplinary Commons

I was observed as part of the peer observation system within the Commons by fellow commoner Fiona Fairlie.  This observation process was very helpful in terms of examining my delivery and I would like to thank Fiona for her time and comments.  The peer observation artefact is illustrated in Figure 20.

Fig. 20 Artefact: Peer Observation Feedback Form

Peer Feedback Form
Class: BSc BIT, Level 2, HCID
Session: Practical Session
Institution: Solent University
Date: 29/11/07
Aims: Design – around menus

Particular aspect on which feedback is requested: general feedback requested
Things which you did well

- knew names of all students in the class
- gave clear explanation of required tasks and of underlying concepts
- used data projector successfully to demonstrate use of software (it may be worth noting that I was sitting at the back of the classroom and could see what was happening on screen adequately, even when items were being selected from quite small menus)

Things which you did less well, and suggestions for improvement
- because the whole lab group were completing the task together, there were a couple of occasions on which one or two of the faster students had to sit and wait for slower classmates to catch up before progressing with the task. This resulted in the quicker students sitting doing nothing for a couple of minutes. It is difficult to see how this can be avoided when structuring a class in this way.
Other comments

- the software being used, Visual Basic, makes it easy to create widgets, but how easy is it to change from a standard look/layout? Left with the feeling that the programming tool is driving the design of the interface.

 

As can be seen, I was observed in late November at a point when students were consolidating and finalising their assignment.  I was visited in a session where I was partly reinforcing ideas of design, including metaphors and screen objects and partly providing an open question session on user task requirements with me as the customer/user and the students as the consultants. 

 

I felt that the reinforcement of design and the accompanying exercise went well but, unusually, I tried to deliver it to all students at the same time, with everyone working on the practical exercise at the same pace.  In retrospect, this meant that more able students were left waiting while less able students could not keep pace with the delivery.  On reflection, it would have been better to avoid this type of delivery: to provide an initial set of verbal or VLE/website information with a brief explanation and then allow students to proceed at their own pace, with me acting in a consultancy role.  In fact I wrote at the time in my self-evaluation form:

My key take-homes
- not necessarily do this as group based exercise which is worked through by all of us together
- more time necessary for assignment based questions in similar 2 hour session
What to do/not to do next time
- students do exercise at their own pace from web based support with me prompting rather than leading
- more time given to both the exercise and open student questions at end
- investigate more complete change from standard look/layout within design environment

The latter point is something I am considering in relation to the design and development environment as I do wish to avoid the development tool driving the design rather than the other way round.

Some Conclusions

Reflecting on this whole process, it focused clearly on my delivery, as all peer observations do.  It is difficult for any of us to look at ourselves and make clear decisions about how we interact with and deliver information to students.  We always need some outside constructive, objective advice in order to make rational decisions about our delivery, and without the input described above this section would have been even more difficult to construct.

However, like all other areas of this HCID portfolio, it has allowed me to reflect upon and examine aspects of my teaching in much greater depth and provide me with more insight into what I do, how I actually do it, what I need to change, and whether what I do meets my own aims.  For these reasons, the HCI Disciplinary Commons workshops and the portfolio have been a positive influence on my approach to teaching over the past year and I have no doubt that this influence will continue in future years.

David Cox, 2008

References

Ford, H. (no date) Available at: http://www.brainyquote.com/quotes/quotes/h/harrisonfo364555.html (Accessed: 3 June 2008)

 

       