This project is funded by the McGill Faculty of Engineering, the Royal Bank Teaching and Learning Improvement Fund, a Petro-Canada Young Innovator Award, the Natural Sciences and Engineering Research Council of Canada, and Formation de Chercheurs et l'Aide à la Recherche.
Information technology promised to empower us and simplify our lives. In reality, most of us can attest that the opposite is true. Modern presentation technology, for example, has made teaching in today's classrooms increasingly complex and daunting. Electronic classrooms offer instructors a variety of multimedia presentation tools such as the VCR, document camera, and computer inputs, allowing for the display of video clips, transparencies, and computer-generated simulations and animations. Unfortunately, even the most elegant user interfaces still frustrate many would-be users.
Whereas fifty years ago, the only concern a teacher had was running out of chalk, faculty now struggle to perform relatively simple tasks, such as connecting their computer to the projector, displaying a video tape, and even turning on the lights. As a result of the cognitive effort and time its use requires, the technology tends to be underutilized. Worse still, it often distracts the instructor from the primary pedagogical task. While technology's capacity to improve the teaching and learning experience is evident to many, its potential has remained largely untapped.
A related concern is the effort required to exploit the technology for novel applications, for example, distance or on-line education. The desire to provide lecture content to students who are unable to attend the class in person, as well as to those who wish to review the material at a later time, has been a driving force behind the development of videoconferencing and web-based course delivery mechanisms. Although a number of universities now offer courses on-line, the cost involved in creating high-quality content is enormous. Both videoconferencing and simple videotaping of the lectures require the assistance of a camera operator, sound engineer, and editor. For asynchronous delivery, lecture material, including slides, video clips, and overheads, must be digitized, formatted and collated in the correct sequence before being transferred. Adding any material at a later date, for example, the results of follow-up discussion relating to the lecture, is equally complicated. The low-tech solution, which offers the lecture material by videotape alone, still involves considerable effort to produce and suffers further from a lack of modifiability, a single dimension of access (tape position), and a single camera angle. This prevents random accessibility (e.g. skip to the next slide) and view control (e.g. view the instructor and overhead transparency simultaneously at reasonable resolution), thus limiting the value to the students.
Our principal concern in this project was to overcome the complexity and effort associated with the use of electronic classroom technology and the costs of on-line lecture delivery. As such, the initial objectives of this project were to develop, in a classroom setting, computer-augmented presentation technology in order to:
Through early meetings between the design team, undergraduate students, and pedagogical experts from McGill's Centre for University Teaching and Learning, we soon realized that the third of these goals was heading in the direction of "technology overkill" rather than offering a substantive improvement to the educational experience, and hence, was struck.
The project was also motivated by a desire to deploy new technology to support effective teaching, and to help professors enhance their teaching with innovative and powerful tools and approaches. This was particularly relevant in the Engineering faculty, where survey results indicated relatively low student satisfaction with the quality of teaching (In-Touch, 1998). While technology itself certainly cannot make someone a better teacher, we thought that it could be used to offer the tools necessary for improved self-evaluation and for ongoing, qualitative, student feedback regarding both the good and bad points of one's teaching style. The role of situated feedback was seen as offering the potential for sustained faculty improvement.
Initially, we considered the possibility of instructors reviewing their web-based lectures after class, or asking students to provide on-line critique of the instructor's teaching style, but these approaches seemed of questionable benefit. Furthermore, combining the functionality of viewing a lecture on-line with instructor critique was considered problematic. However, the discussion led to a new proposal that offered similar functionality but a different purpose, namely to permit students to ask questions on-line during the viewing of a recorded lecture.
The motivation for this proposal came from our observation that electronic lectures tend to be closed systems. Not only is it difficult for instructors to add to or modify existing content, but there is no mechanism for students to offer feedback concerning the lecture or ask questions relating to the material, apart from email or course newsgroups. While such tools provide communication between students and instructor, they lack both permanence and visibility. Questions posed in class become part of the entire class's experience, whereas, if asked and answered electronically, the exchange is only available to a few participants (email) or to those students who are actively following the discussion thread (newsgroup). The thread eventually disappears or is buried by ensuing discussion on unrelated topics. A further limitation is the lack of context, i.e. exactly what was the student viewing before asking the question? This led to the formulation of a new project objective, namely:
Similar to the approach taken in the design of a reactive videoconference room (Cooperstock et al, 1997), presentation technology was installed in the McConnell Engineering 13 classroom and augmented with sensors and computers for processing and control (see Figure 1). The room now activates and configures the appropriate equipment in response to instructor activity without the need for manual control. For example, when an instructor logs on to the classroom computer, the system infers that a computer-based lecture will be given, automatically turns off the lights, lowers the screen, turns on the projector, and switches the projector to computer input. The simple act of placing an overhead transparency on the document viewer causes the slide to be displayed and the room lights adjusted to an appropriate level. Similarly, audiovisual sources such as the VCR or laptop computer output are displayed automatically in response to activation cues (e.g., the play button pressed on the VCR; the laptop connected to a video port). Together, these mechanisms assume the role of skilled operator, taking responsibility for the low-level control of the technology, thereby freeing the instructor to concentrate on the lecture itself, rather than the user interface.
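The activation-cue behaviour described above can be sketched as a simple event-to-action rule table. The event and device names below are hypothetical; the deployed system drives real hardware through the AMX Accent3 controller shown in Figure 1.

```python
# Minimal sketch of the reactive-classroom inference loop.  Event and
# device names are hypothetical illustrations of the cues described above.

ACTIONS = {
    "computer_login": [("lights", "off"), ("screen", "down"),
                       ("projector", "on"), ("projector_input", "computer")],
    "transparency_placed": [("projector_input", "document_camera"),
                            ("lights", "dim")],
    "vcr_play": [("projector_input", "vcr")],
    "laptop_connected": [("projector_input", "laptop")],
}

class RoomController:
    def __init__(self):
        self.state = {}  # device -> current setting

    def handle_event(self, event):
        """Apply every device action configured for an activation cue."""
        for device, setting in ACTIONS.get(event, []):
            self.state[device] = setting

room = RoomController()
room.handle_event("computer_login")   # instructor logs on
room.handle_event("vcr_play")         # play button pressed on the VCR
print(room.state["projector_input"])  # vcr
```

The essential design point is that each cue maps to a complete room configuration, so the instructor never issues device-level commands directly.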
Figure 1. Architecture of the Intelligent Classroom connections. Green lines represent serial connections, blue lines represent IR control (for the one instance where a conduit could not accommodate an additional HD9 connector), thick black lines represent video, thin black lines represent composite or S-Video, and red lines represent audio. The large black module in the center of the image is the AMX Accent3 controller, which drives various devices under computer control, and the HC 12 module is our button-panel unit with microcontroller, pictured separately in Figure 2. The SW/2 units are video switchers, one of them running in an auto-sense mode, such that an active signal on the laptop connection is automatically selected, while the second is driven by computer control. The P/2 DA2 units are video splitters, such that either video signal can be routed to both projectors.
Along with such automation, the need for a seamless manual override mechanism becomes paramount. For example, if the instructor raises the lights, the technology must respect that preference. Furthermore, the ability to turn the lights on or off must not be dependent upon the automatic controller, as it was before this project began. As a default backup, manual controls for each device (lights, projector, VCR, etc.) must be accessible and functional at all times. Such manual controls serve as basic on/off switches as well as output enable/disable buttons. For example, a single toggle button on the VCR would allow the presenter to select whether or not the video clip being played is projected to the class. By observing the use of these manual override mechanisms, the reactive classroom system can adapt to the preferences of individual users and remember these settings for future use by the same individual. At the end of each lecture, the system resets itself to a default configuration.
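The override-and-remember behaviour can be sketched as follows: manual settings win over automation, are stored per instructor for future sessions, and the room resets to defaults after each lecture. All class, device, and user names here are hypothetical.

```python
# Sketch of manual override with per-user preference memory, under the
# assumptions stated above (hypothetical names and default settings).

class OverrideAwareController:
    DEFAULTS = {"lights": "auto", "volume": 5}

    def __init__(self):
        self.prefs = {}                  # user -> remembered overrides
        self.state = dict(self.DEFAULTS)
        self.user = None

    def start_lecture(self, user):
        self.user = user
        self.state = dict(self.DEFAULTS)
        self.state.update(self.prefs.get(user, {}))  # restore preferences

    def manual_override(self, device, setting):
        self.state[device] = setting                 # manual control wins
        self.prefs.setdefault(self.user, {})[device] = setting

    def end_lecture(self):
        self.state = dict(self.DEFAULTS)             # reset for next class
        self.user = None

ctrl = OverrideAwareController()
ctrl.start_lecture("instructor_a")
ctrl.manual_override("lights", "on")   # e.g. the instructor raises the lights
ctrl.end_lecture()
ctrl.start_lecture("instructor_a")
print(ctrl.state["lights"])            # on  (preference remembered)
```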
Early interviews with instructors revealed that for most users, manual override functions were only required for the room lights and speaker volume, so these were made a top priority. The confusing multi-layered touch-screen menu (Figure 2, left) was replaced by a simple physical button panel (Figure 2, right), consisting of six switches for the various banks of lights, another two switches for the projection screen and window blinds, and a volume control knob that adjusts VCR and microphone volume, depending on which is in use. However, advanced users wanted greater control over the selection of projector outputs, for example, display the laptop output on the main screen and the primary computer display (current lecture slide) on the side screen. Unfortunately, our existing hardware configuration did not permit this level of flexibility, but we are currently addressing this need, in part simply by repositioning the video switch so that it selects between laptop and secondary computer display (previous lecture slide).
Figure 2. Touch-screen interface involving a hierarchical menu structure (left) and the replacement button-panel interface (right) for manual override, providing manual light switches, projector, screen, and drape controls, and a context-sensitive volume dial.
In addition to automating device control, the classroom was instrumented to record a digital version of any presentation, including both the audio and video, as well as the instructor's slides and notes written during the lecture. This capability was provided by eClass, formerly known as Classroom 2000 (Abowd et al, 1998), a system developed at the Georgia Institute of Technology, which performs capture, collation, and synchronization of digital notes, written on an electronic whiteboard or digital tablet, with an audiovisual recording of the lecturer (see Figure 3).
Figure 3. A sample eClass lecture capture being viewed through a web browser and RealPlayer.
A critical component completed in the last semester involved modifying the software driver for the electronic whiteboard so that it conformed to the protocol expected by the lecture capture system. This was most important to enable the use of the eraser tool, which was previously available only on the digital tablet. A second important improvement was the move from PZM ceiling-mounted microphones to lapel-clip wireless microphones for the recording of the instructor's voice. This change came in response to numerous complaints about the poor quality of audio, resulting from the earlier microphones picking up background noise from the class. As one student recently commented, "The recorded audio is now so good that I can often hear the lecture better at home than I can in class."
The lecture capture system makes use of our presenter tracking algorithm, which follows the instructor's movements, even in front of the projected video screen, thereby obviating the need for a professional camera operator. The recorded version of the lecture is then converted into a set of web pages, in which every ink stroke written by the professor is linked to the position in the video at which that stroke was generated. The system then allows students to review lectures, either in randomly accessed portions or in their entirety, through a conventional web browser, from networked university computers or from home computers connected by modem. Again, all of this is handled automatically by our control system. The only requirement of instructors is that they confirm they want the lecture recorded.
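The stroke-to-video linking amounts to timestamping each ink stroke at capture and converting that timestamp into an offset into the recording. The data layout below is purely illustrative; eClass's actual file format differs.

```python
# Sketch of stroke-to-video linking: every ink stroke carries a capture
# timestamp, which becomes a seek offset into the recorded lecture.
# Field names are hypothetical.

def stroke_links(strokes, lecture_start):
    """Map each stroke to its offset (in seconds) into the recording."""
    return [{"stroke_id": s["id"],
             "video_offset_s": s["timestamp"] - lecture_start}
            for s in strokes]

links = stroke_links(
    [{"id": 1, "timestamp": 1005.0},    # written 5 s into the lecture
     {"id": 2, "timestamp": 1230.5}],
    lecture_start=1000.0)
print(links[1]["video_offset_s"])       # 230.5
```

Clicking a stroke in the generated web page then simply seeks the player to the stored offset, which is what gives random access by ink stroke rather than by tape position.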
It should be stressed that lecture capture, in itself, is unlikely to improve the quality of student learning. At best, the captured material offers an alternative review mechanism, similar to handwritten course notes or a textbook, but with the benefit of an accompanying soundtrack. Perhaps more substantial factors are the effect of the technology on the quality of the instructor's presentation and the students' use of the recorded material for study purposes.
We first prototyped a Java-based interactive Previously Asked Questions interface, which was manually embedded in each lecture. While reviewing a lecture on-line, students could peruse questions pertaining to any slide, or submit new questions of their own, anonymously if desired. At the click of a mouse button, a list of Previously Asked Questions (with answers) pertaining to that slide was displayed. Students could then submit their own questions at the bottom of the list, which were then stored along with a pointer to the slide being viewed. Instructors or TAs could later edit and respond to these questions, using a separate X-Windows based application. The question and answer were then incorporated into the appropriate slides, thereby forming an integrated component of the lecture.
The prototype allowed us to demonstrate the system to a few candidate users and experiment with its ease of use. As the eClass system was subsequently re-coded using the mySQL database and PHP web scripting language, we were able to exploit this to implement an elegant student and instructor user interface directly through the web browser, without requiring any special (e.g. X-windows) capabilities. Through preliminary experimentation, we soon appreciated the benefit of forwarding submitted questions to the instructor by email and copying the student on replies, provided that an email address was entered at the time the question was asked. By alerting instructors and/or TAs to postings, no special effort had to be expended to check the PAQ periodically for new questions. Similarly, students posing questions can receive responses directly via email, while all students who subsequently visit the web-based lecture can see the question and answer on-line.
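The PAQ submission flow described above can be sketched as storing each question keyed to the slide being viewed, then alerting the instructor so that nobody has to poll for new entries. The schema, addresses, and use of sqlite below are hypothetical; the deployed system used mySQL with a PHP front end.

```python
# Sketch of the PAQ submission flow under the assumptions stated above
# (hypothetical schema and email addresses; sqlite stands in for mySQL).

import sqlite3

def submit_question(db, lecture_id, slide_no, text, student_email=None,
                    notify=lambda addr, msg: None):
    # Store the question with a pointer to the slide being viewed.
    db.execute("INSERT INTO paq (lecture, slide, question, email, answer) "
               "VALUES (?, ?, ?, ?, NULL)",
               (lecture_id, slide_no, text, student_email))
    db.commit()
    # Alert the instructor by email; the student's address (if given) lets
    # the reply be copied back directly.
    notify("instructor@example.edu",
           "New PAQ question on slide %d" % slide_no)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE paq (lecture, slide, question, email, answer)")
sent = []
submit_question(db, "hci-lecture-07", 12, "Could you clarify this slide?",
                "student@example.edu",
                notify=lambda addr, msg: sent.append((addr, msg)))
print(sent[0][1])   # New PAQ question on slide 12
```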
In a preliminary evaluation of the lecture-capture tools, a serious shortcoming identified was the need for instructors to manage their own conversion and upload of slides from PowerPoint, a requirement that proved to be unreasonably complex for many users. A converter application was later developed by Georgia Tech, but could only run on a Windows platform and was incompatible with newer versions (post-97) of MS-Office.
Our solution was to develop a converter application that runs in the background on a central machine, waiting for conversion requests. When a user creates a lecture, an option is provided to select and upload a PowerPoint file, which is then transferred to the converter. The resulting slide images are automatically copied back to the appropriate lecture directory on the server, in preparation for the instructor's class. A significant improvement recently made was the incorporation of feedback on the client window, informing the instructor of the status of the conversion. Previously, there was no easy way to verify that the converter was functioning correctly. The converter uses a lock file to prevent multiple instructors from attempting to convert their lectures at the same time.
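The lock-file discipline mentioned above reduces to an atomic create-exclusive operation: the first conversion request creates the lock and proceeds, and any concurrent request is refused until the lock is released. The path and function names below are hypothetical.

```python
# Sketch of the converter's lock-file discipline (hypothetical paths and
# names): only one PowerPoint conversion may run at a time.

import errno, os, tempfile

LOCK = os.path.join(tempfile.gettempdir(), "ppt_converter.lock")

def acquire_lock():
    """Atomically create the lock file; return False if one already exists."""
    try:
        fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except OSError as e:
        if e.errno == errno.EEXIST:
            return False
        raise

def release_lock():
    os.remove(LOCK)

if os.path.exists(LOCK):        # clear any stale lock from a crashed run
    os.remove(LOCK)
print(acquire_lock())           # True   (first request wins)
print(acquire_lock())           # False  (concurrent request refused)
release_lock()
```

The O_CREAT | O_EXCL combination is what makes the check-and-create atomic; testing for the file's existence and then creating it in two steps would leave a race between competing converter requests.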
In a similar vein, we introduced feedback from the encoding machine, so that instructors could verify that the lecture is being recorded properly. This was especially important as the RealNetworks "RealProducer" system often failed to release the video frame grabber at the end of recording, thereby leaving the system inoperative at the start of the following class. More serious problems with RealProducer under the Windows NT Operating System led us to port the system to Linux during the month of December 2000, with a significant expected improvement in stability.
An extension of the eClass capture system being deployed in the undergraduate computing laboratories is the Mini-Presentation system. This tool supports the preparation and delivery of short talks, along with accompanying slides, allowing students to practice their oral presentation skills without taking up class hours. The development required a modified interface, including a timebar indicating the amount of time remaining, and a restructuring of the control flow to permit multiple recordings and reviews of the presentation until the student is satisfied with the result.
As we have only completed the evaluation of the Intelligent Classroom for its first semester of use (January-April 2000), it is somewhat premature to describe the full impact of the technology. We are presently beginning the analysis for a second semester (September-December 2000) and will report on the combined data in April (Winer, 2001). In the interim, the following material is extracted from a report prepared by Laura Winer (Winer, 2000), pertaining to our analysis of web logs and student questionnaires, and is included with her permission.
We analysed the web logs for the four classes that made extensive use of the lecture capture capabilities throughout the term: Human-Computer Interaction, Artificial Intelligence, Introduction to Computer Engineering II and Extractive Metallurgical Engineering. The logs tracked information pertaining to the number of visits by each host, whether slides or video were accessed, and the time spent during each visit. The data was then normalised by the number of students in each course, as shown in Table 1, below.
| Course | Total # of Web sessions | Avg. sessions/student | Total # of video sessions | Avg. video sessions/student | Avg. time/host (mins.) | # of unique hosts | % of McGill hosts |
| --- | --- | --- | --- | --- | --- | --- | --- |
| HCI | 845 | 32.5 | 175 | 6.73 | 295 | 97 | 41 |
| AI | 460 | 23 | 150 | 7.5 | 141 | 106 | 38 |
| ICE-II | 895 | 7.52 | 240 | 2.02 | 112 | 295 | 24 |
| EME | 364 | 11.03 | 246 | 7.45 | 181 | 123 | 26 |

Table 1: Aggregate usage statistics by course for the term
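As a consistency check on the normalisation, dividing each course's total Web sessions by its average sessions per student recovers the implied class sizes, and the four enrolments sum to the 198 registered students reported with the questionnaire data. A quick sketch:

```python
# Consistency check on Table 1: total Web sessions divided by the average
# per student recovers each class's enrolment; the enrolments sum to the
# 198 registered students reported with the questionnaires.

table1 = {  # course: (total Web sessions, avg. sessions/student)
    "HCI":    (845, 32.5),
    "AI":     (460, 23.0),
    "ICE-II": (895, 7.52),
    "EME":    (364, 11.03),
}

enrolment = {c: round(total / avg) for c, (total, avg) in table1.items()}
print(enrolment)                 # {'HCI': 26, 'AI': 20, 'ICE-II': 119, 'EME': 33}
print(sum(enrolment.values()))   # 198
```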
These data provide a global portrait of the use made by students of the review capabilities offered by the MC13 system. If one looks at the average number of Web sessions per student, one can see that the two courses taught by the professor who was the most comfortable with the system, HCI and AI, were consulted most often by the students. It is important to note, however, that the content of those courses, and especially HCI, meant that the system itself was of inherent interest to the students.
The video sessions were lower in average number per student, except for the EME course. This is not surprising given that the EME course had distance students who could be expected to rely more on video recordings to recreate a "live" experience. The campus-based students were generally using the system for review of a lecture that they had attended, and the video was most likely judged not to be worth the time required to access from off-campus over a modem.
The relatively large number of unique hosts indicates that students have Internet access from many sites, and are certainly not limited to the Faculty labs or even their dorms, if they do live in residence.
Another interesting piece of data is the percentage of McGill hosts. Not surprisingly, the EME course, which had a number of off-campus students, had roughly 75% of the host computers non-McGill. However, ICE-II, a totally campus-based course, had the same percentage of non-McGill hosts. The other two courses each had more than half of the sessions originating from non-McGill sites. This certainly supports the idea that the review system allows students to access the materials at their convenience, in terms of both time and place.
The above data give an overall picture, but do not indicate how consultations flowed over time. Figure 4, the average number of sessions per student, shows very clearly that the courses all followed a similar pattern: early interest, tailing off in the middle, and peaking near the end of term, just before final exams. It would seem, then, that students initially explored the system, consulted it seldom or not at all mid-term, but accessed it heavily in the last weeks of term. This is encouraging in that access to complete review materials appears to be useful for students. While a simple comparison of grades between this term and previous years is unlikely to show any significant difference, it would be interesting to know from the professors and/or TAs whether they noticed a change in the quantity and/or quality of consultations and questions they received just prior to finals.
Figure 4. Average number of sessions/student by week for the winter (January-April) and fall (September-December) 2000 semesters.
Figure 5. Average duration of session by week for the winter (January-April) and fall (September-December) 2000 semesters.
The following tables summarise the main points identified from the data analysis of students' responses to the questionnaires. It is important to note that in all cases, the students were asked open-ended questions, and the category labels were created when analysing the data. The questionnaires were identical for all courses: the beginning and end of term questionnaires were signed by the evaluation researcher, while the midterm questionnaire was signed by the professor teaching the two courses to which it was administered.
The first topic addressed was to determine whether students had accurate expectations of what MC13 was and could offer them as learners. Students were asked for a description of what the Intelligent Classroom was, and asked to comment on how they thought the environment would affect their learning experience. The descriptive results were classified as semi/accurate vs. mis/uninformed, and their expectations were grouped as positive/euphoric, neutral, or negative/cynical. Table 2 shows the results: the majority of students had a fairly accurate perception of what to expect from the facilities in MC13 and were positive about its potential impact on their student experience. The potential of accessing the recordings for reviewing content was explicitly mentioned by 28 of the 84 students responding, and they were distributed fairly evenly across three of the four classes (HCI, AI, and ICE-II).
| Question | Responses | | |
| --- | --- | --- | --- |
| Have you heard of the "Intelligent Classroom"? What is it? | Semi/Accurate: 73 | Mis/Uninformed: 11 | |
| How do you think it will change your experience as a student? | Positive/Euphoric: 105 | Neutral: 11 | Negative/Cynical: 27 |

Table 2: Beginning of term student questionnaire. Responses received from 84 of 198 students registered in the four courses.
The midterm questionnaire, administered only to students in HCI and AI, focused on very specific questions that the professor was interested in receiving feedback on. Many of these were related to course issues, but a few focussed explicitly on MC13 issues (see Table 3). The quality of the professor's handwriting itself was an issue; this is probably unrelated to the specific technology, as it would have been a problem with any kind of presentation medium that used handwritten annotations. Only about half of the students reported accessing the recordings; this is consistent with the data from the usage logs (data source 1). It is interesting to note that the same number of students reported taking fewer notes as those who reported accessing the recordings, suggesting that those who took fewer notes may have used the recordings to supplement their own notes for review. Note-taking is an interesting issue: students who reported taking fewer notes saw this as liberating, allowing them to concentrate more on the content itself. The use of pen colours in presenting material and annotating the slides was reported to be helpful, a strategy that could also be used in non-technologically enhanced classroom environments.
| Question | Responses | |
| --- | --- | --- |
| 1) How do you rate the legibility of the instructor's handwriting on the digital tablet? | OK: 20 | Poor: 14 |
| on the electronic whiteboard? | OK: 15 | Poor: 15 |
| 2) Have you used recordings for review? | Yes: 20 | No: 16 |
| Was audio quality sufficient? | Yes: 8 | No: 4 |
| 3) Is use of pen colours helpful or distracting? | Helpful: 32 | Distracting: 1 |
| 4) Do you take fewer notes? | Yes: 20 | No: 13 |

Table 3: Midterm AI/HCI student questionnaire. Responses received from 37 of 46 students registered in the two courses.
Table 4 provides a summary of the end of course questionnaire results. Certainly, the overwhelming number of responses indicated a positive impact of the MC13 physical and technological environment on the learning experience. When the frustrations caused by technical glitches and crashes are set aside, two-thirds of the complaints are eliminated; this is not surprising, and it suggests that many of the problems MC13 is experiencing now can be attributed to "growing pains." It is also interesting to note that the complaints about technical problems were fairly evenly distributed across the four classes, and the experience of the professor did not seem to be a factor in student reactions.
| Question | Responses |
| --- | --- |
| Did the fact that your course was in MC13 change the learning experience for you? | Changed in a positive way: 75; Changed in a negative way: 8; No change: 7 |
| Which features of MC13 did you find most useful? | Useful: 78. Previous slide: 26; Review via web: 17; Quality of visual presentations: 15; "Live" editing/annotations: 9; Variety of visual presentation modes: 7; Physical environment: 4 |
| Which did you find most irritating? | Irritating: 64. Technical problems/crashes: 39; Poor recording quality: 9; Physical environment: 6; Poor handwriting: 5; Lighting changes: 2; Professor preparation: 2; Being recorded: 1 |
| Would you like to take another course in MC13, or would you prefer not to? | Yes: 54; No: 8; Neutral: 4 |

Table 4: End of course student questionnaire data. Responses received from 67 of 198 students registered.
One point, although mentioned by only one student, does warrant further consideration: the student objected to being recorded. Recording and making publicly accessible a student's interactions in class may well change the nature of some individuals' classroom participation. This is a question that will require serious consideration as the classroom learning environment is re-engineered. However, with our recent introduction of wireless microphones for lecture recording, students' voices are now effectively eliminated from the recordings, so this concern is not presently an issue.
The single feature most often mentioned as useful was the capability of viewing the previous slide. While the current PowerPoint slide is projected onto the large screen in the centre of the room, the previous slide is automatically projected onto the Idea board, situated at the front right of the classroom. Students, and the professor, could therefore easily refer to the content of the previous slide when explaining or discussing the current slide.
As noted earlier, one of the main features of MC13 is its ability to record the lecture presentation in audio, video, and slide annotations. This feature was mentioned as the most useful by about 25% of the students responding to the questionnaire; however, another 15% mentioned the poor quality of the recordings as the most irritating feature, leading one to believe that if the quality were improved, more than a third of the students would find this feature useful. It will be interesting to verify whether this is borne out by the responses to last semester's questionnaires, following the use of wireless microphones.
[Abowd et al, 1998] Abowd, G., Atkeson, C., Brotherton, J., Enqvist, T., Gulley, P., and Lemon, J., Investigating the capture, integration and access problem of ubiquitous computing in an educational setting. In Proceedings of Human Factors in Computing Systems CHI '98. ACM Press, New York, pp. 440-447.
[Cooperstock et al, 1997] Cooperstock, J.R., Fels, S.S., Buxton, W. and Smith, K.C. (1997), Reactive Environments: Throwing Away Your Keyboard and Mouse. Communications of the ACM, 40(9), 65-73.
[In-Touch, 1998] Survey of Graduate and Undergraduate Students. In-Touch Survey Systems for the Recruitment and Liaison Office, McGill University, 1997-1998.
[Winer, 2000] Winer, L. Student use of and reaction to MC13 (The "Intelligent Classroom"): January-March 2000. CUTL Internal Report, McGill University, 2000.
[Winer, 2001] Winer, L.R. and Cooperstock, J.R. The "Intelligent Classroom": Changing teaching and learning with an evolving technological environment. International Conference on Computers and Learning, Coventry, UK, April 2001.