
Using a Supplemental Computer Screen to Scaffold Learning at a Radiology Viewing Station

Abu Aish M and Pusic M

Many learning interchanges occur around the X-ray viewing boxes. These interchanges could be scaffolded by installing a third computer screen that presents a databank of educationally relevant images for comparison with the actual case being assessed on the regular X-ray viewing screens. Objective: To design and perform a formative evaluation of a digital image database presented by way of a third screen adjacent to the existing radiology viewing screens, in an active clinical setting. Methods: We installed two Tablet computers on the walls adjacent to the PACS screens in the Pediatric Emergency Department. We tracked the utilization of the software using computer log files. We surveyed trainees as to their attitudes towards the intervention at the end of their rotations. Six months after implementation, we also assessed the faculty members' attitudes and suggestions. Results: The Tablet was used 110 times over 70 days (1.6 times/day). XX trainees and xx faculty completed the surveys. Most trainees (86%) and faculty (82%) reported using the program. The majority of subjects reported positive attitudes towards it and considered it a positive learning experience. Qualitatively, we found that our faculty variably incorporated the intervention into their teaching. They noted the potential of the device for decision support. Conclusion: The use of a third screen to support the interpretation of radiographs was feasible and positively viewed by both trainees and faculty in this non-radiology setting.

Draft for Publication2 - APA ersion.doc Page 1 of 25

Introduction

The ability to interpret radiographs is an important skill for emergency department physicians (EDPs) (Higginson et al, 2004). A significant proportion of ED patient encounters involve decision-making based on the correct interpretation of plain x-rays (Higginson et al, 2004). Radiology specialist interpretation is often not available on a continuous basis for ED radiographs. This means that many disposition decisions are based on ED physician interpretations of the films (Gatt et al, 2003). The medical literature describes a significant discrepancy between the abilities of ED physicians and radiologists. In a recent study from a pediatric center in New Zealand (Higginson et al, 2004), the overall disagreement rate between radiologists and ED doctors was 26.1%; the clinically significant disagreement rate was 4.8%. There was no relationship between factors such as years of training or the confidence of the ED doctor and the likelihood of disagreement. The types of films most likely to result in disagreement were chest films (30%), followed by abdominal films (21%) and skeletal films (13%). The risk of disagreement was highest for chest films initially thought to be normal, and lowest for skeletal films initially thought to be abnormal. Halsted et al (2004) studied radiology trainees and found that they had a 3% failure rate in interpreting pediatric x-rays. Most errors (69%) involved the diagnosis of fractures and/or dislocations. Sixty-one percent of all recurrent errors involved buckle, Salter II, avulsion, and transverse fractures. Other studies have shown variable degrees of disagreement between ED physicians and radiologists, ranging from 0.4% to 17%

(Brunswick, 1996; Espinosa, 2000; Gatt, 2003; Mayhue, 1989; Preston, 1998). Thus there is room for improvement in the education of ED physicians in interpreting radiographs. Despite the importance of radiology in medicine, fewer than 30% of the medical schools that responded to the survey by Shaffer (2000) indicated that a radiology rotation was mandatory; instead, the main radiology experience comes during medical and surgical rounds or elective rotations. The emergency department is an excellent place to teach radiology to medical students and residents. A large number of x-rays are done in the ED, and the patient is readily available for contextualization of the interpretation. The amount of radiology education that the ED trainee


receives depends on the staff physicians' interest in education, the time pressures in the ED and the time of year (e.g. more fractures are seen in summer months and more chest x-rays in winter months). There is a paucity of literature describing radiology education in the context of emergency departments compared to radiology departments. The existing literature focuses on the discrepancy between emergency residents and radiology trainees and ways to decrease these discrepancies. Most of the time, the solutions proposed focus on organizational factors without addressing the need for better radiology education in the emergency department. One possible approach to improving the situation is to use computer-aided instruction (CAI) in emergency radiology education. In a recent review, Letterie (2003) analyzed the reports published between 1988 and 2000 on CAI in medical education. While there were methodological problems with many of the 210 reports examined, "most" of the articles enthusiastically supported the use of the programs. The use of CAI in radiology education is well described (Collins, 2002; Lieberman, 2002; Shaffer, 2004; Shaffer, 2005; Su, 2004; Wagner, 2005). Different modalities have been described, including teaching websites and computer programs. These reports describe favorable outcomes, but modern technologies in radiology education are still underused (Collins, 2002; Lieberman, 2002; Shaffer, 2004; Shaffer, 2005; Su, 2004; Wagner, 2005), even in institutions where digital technology and picture archiving and communication systems (PACS) are available (Durfee et al, 2003). On a ward or in the Emergency Department, many learning interchanges occur at the x-ray viewing box or PACS station. We postulate that these learning interchanges could be made richer through the provision of a database of digital images that illustrate key aspects of normal or pathological radiological visual features.
Such a station could enable deeper processing by trainees by facilitating novel instructional strategies, including:

• Visual explanation of interpretation procedures
• Immediate side-by-side comparison of two different examples
• Repetitive practice on a number of examples
• Visualization with segmentation of the features of interest.

We have developed a computer-aided instructional tool designed to enhance medical trainees' skills of x-ray interpretation in clinical settings. The tool consists of a tablet computer bolted to the wall next to the ED X-ray viewing stations – in essence a third PACS screen. The third screen scaffolds the learning of trainees using the PACS display. We describe a software application that can be presented via this third screen. It can present a fully labeled example of any plain radiograph along with schematics showing a basic approach to the interpretation of a given film. Our objective in this article is to describe the use of this learning strategy as well as the attitudes of learners and preceptors to it.

Theoretical Background

The cognitive apprenticeship model introduced by Collins (1989) describes the different stages of the relationship between experts and novices that ultimately lead to better learning: Modeling, Scaffolding and Reflection. Modeling is when the expert performs the skill in front of the novice. Scaffolding is when the expert supports the novice in learning the skill. The support can take many forms and varying degrees of depth according to the novice's current skill level. It is at this stage that participation by the novice is encouraged, and the dialogue between the expert and the novice to construct and negotiate meaning becomes crucial to the learning process. Reflection is when the novice becomes able to critique his or her own performance and problem-solving skills and compare them with the expert's. Brown's work (1989) on the concept of Situated Cognition showed that students' learning is a social process that is enhanced by active participation within culturally organized environments and real-world activities. Learning happens best when the activity is perceived as authentic and involves interaction within the socio-cultural context. Brown's ideas are closely related to the work of Lave and Wenger (1991), who viewed learning as a dynamic process that exists in activity and in the participation of individuals within the social context. Lave and Wenger stressed the importance of dialogue between community members to negotiate meaning in the apprenticeship process. Jarvela's work (1995) clearly showed that the cognitive apprenticeship model failed to create reciprocal understanding unless both students and teachers were actively engaged in goal-directed activity. So it is not only the context that matters; interaction and participation within the context are the key to learning.
Using technology in medical education can support the cognitive apprenticeship model if it takes place in the right context, when the student actually needs the knowledge (e.g. when dealing with an actual patient). In reality, however, students most often use these technologies at home, in the library, or when they are not actively seeing patients. To some extent, this decontextualizes the learning process and makes these instructional methods just another e-textbook in a different format. Existing technologies also lack the dynamic relational aspect of the learning process that exists in the apprenticeship model. Our objective was to design a technology that enhances the role of the expert and enriches the context to support our existing apprenticeship model, rather than taking over the role of the expert. We planned to have our technology "embedded" in the ED culture so it could be used in the right situation, when it was needed, and not after the need had passed. We aimed to use the instructional tool to enhance interactions in the context of the emergency department and to encourage participation and negotiation of meaning between students and physicians.


Research Design and Methods

Setting: British Columbia's Children's Hospital Pediatric Emergency Department (ED) is a tertiary care facility with an annual census of 35,000 patient visits. The department is divided into an acute side and a non-urgent side. Patients seen on the acute side are usually the sick and the young (less than 2 years), while the non-urgent side deals with lower-acuity cases that require fewer resources (e.g. musculoskeletal complaints, simple lacerations and stable upper respiratory illnesses). The ED is staffed by Pediatric Emergency Medicine subspecialists, as well as general pediatricians on the non-urgent side. A large number of trainees rotate through the department every month, on average 3 medical students and 4-5 residents from different specialties. To ensure that students and residents receive consistent pediatric emergency teaching, an educational half day supervised by teaching staff and fellows is added to their clinical shifts to discuss topics related to pediatric emergencies. During clinical shifts, medical students attempt to read patients' x-rays by themselves before asking the senior physician to take a look and help them make clinical decisions about patients in the department. The x-ray interpretation, to some extent, determines the patient's disposition (e.g. starting antibiotics, cast application, discharge vs. admission to the hospital, etc.). To minimize medical errors, all decisions must be approved by the senior physician before any patient is discharged from the department.

Radiology Interpretation Protocols: The PED is supported by a full-service radiology department that includes 11 radiologists. At the time of the study, after-hours coverage was provided by an in-house radiology resident, with an attending radiologist available on a call-in basis 24 hours per day. For non-urgent cases, PED physicians interpret the radiographs and decide on disposition. A radiologist then interprets the films, usually within 24 hours. In cases of discrepancy, the ED is notified. In all cases, a written report is provided by the radiologist to the requesting ED physician. The hospital is equipped with a state-of-the-art Philips Medical Systems I-Site PACS.

Computer Program: The computer intervention consists of an image database and a user interface loaded onto Tablet computers fixed to the wall beside the two X-ray viewing stations in the acute and fast-track areas of the Pediatric Emergency Department at BC Children's Hospital (Figure 1). Hardware: We used 3 Tablet computers in our study; 2 were affixed to the walls of the emergency department at all times and one was kept as a backup in case of technical problems. Tablet


computers differ from regular computers in that there is not necessarily a keyboard or mouse available; instead, the user interacts directly with a computer program using a pen touching the screen, in the same manner as with a Personal Digital Assistant device. The model of Tablet computer we used was the Compaq TC-1000T Tablet PC with 256 MB RAM, a 30 GB hard drive and a 10.4-inch 800x600 pixel display (HP-Compaq: Mountain View, California). The operating system was Microsoft Windows XP for Tablet Computers, released in 2002 (Microsoft, Bellevue, WA). Radiographs: We prospectively collected a normal radiographic series (i.e. all routine views) for each of 22 body regions by maintaining a log book in the Emergency Dept and by having a study medical student attend the radiologist's film review session each weekday morning. These radiographs were downloaded from the PACS system in digital JPEG format and then reviewed by one of the investigators to ensure their clarity and representativeness of the body region in question. The radiographs were labeled according to standard radiology textbooks. Software: We created the software using an authoring program (Toolbook Instructor 8.5, Click2Learn Corp, Bellevue, WA). The program allows rapid application development of programs destined for either the Windows platform or the Internet. The database is made up of digitized images of plain x-rays commonly encountered in pediatric emergency practice. For each body part, one normal film is available to the user, annotated such that a pen-touch on the film outlines and labels the anatomical feature (Figures 2a-2c). The user interacts with the Tablet PC using an attached digital pen. The user is not required to write characters but simply points and clicks on hyperlinks, in a manner analogous to surfing the Internet.
To begin, the user clicks on any part of the screen with the pen. This activates the main screen, an HTML document showing a view of the human body. The user clicks on a body part of the drawing (e.g. the elbow), which brings up a prototypical x-ray of the elbow as well as a menu frame where the user can click on different links (buttons) to:

• see different views of the same x-ray (i.e. AP, lateral and oblique views) (Figure 2A)
• demonstrate an approach to interpreting the x-ray (Figure 2B)
• show text labels of the bone and soft tissue parts (Figure 2C)
• show a corresponding anatomy drawing (Figure 2C)
The user can return to the start point at any time from any screen by simply clicking on a special "home" button. These features are shown in Figure xx using an elbow x-ray as an example. This first version of the application did not have any categories of x-rays beyond the single set of normal views; i.e. no pathological images and no age or normal variants. We tested the usability of the application interface using four pediatric residents, four medical students, and two nursing students as subjects. We incorporated the students' suggestions and then reflected the new application back to them until there were no further suggestions. We then showed the interface to two experienced PEM physicians. Their suggestions led to minor changes. This version was frozen for the duration of the study.

Participants: All medical students and residents rotating through the ED for 2 weeks or more were eligible for the study. All faculty members who attend to patients in the ED for more than 3 shifts per month were surveyed to determine their attitudes towards the intervention. The evaluation of the program took place between June 2006 and February 2007. Participation was entirely voluntary and was not used in trainees' final rotation evaluations.

Study Design: The intervention was evaluated using a prospective survey of the trainees and faculty. Using log files, we also tracked the utilization of the software, including time of day, duration of each interaction and which body regions were most commonly accessed. The Institutional Review Boards of both the university and the hospital approved the design and implementation of the study.

Survey Development: The student survey solicited attitudes towards several aspects of the software program and its implications. It was partly based on the Student Evaluation of Educational Quality (SEEQ), a well-known higher education student satisfaction survey (Marsh, 1982). We selected specific questions from the SEEQ and minimally changed the wording to reflect our context. We had used the survey in a previous implementation of computer tutorials in the PED of another institution (Pusic, 2007). We added several questions asking the students to describe their use of the Tablet. We pilot-tested the composite survey on one rotation block of our students, asking them to comment on clarity and whether the intent of the questions was clear. We incorporated the suggested changes. The faculty survey was developed ad hoc by the investigators and pilot-tested on three physicians outside the PED. Suggestions to improve clarity and intended meaning were incorporated into a new draft of the survey and reflected back to the pilot testers. They did not suggest any further changes.

Data Collection At the beginning of each month, one of the investigators (MAA) attended the Friday teaching half day to orientate the trainees to the Radiology Tablet PC. He repeated the process the following Friday if some trainees were absent in the first week. During each trainee’s last teaching session of their rotation, we administered the survey of their attitudes towards the intervention. This was at least 12 days after the start of their rotation. We asked the trainees not to write their names on the surveys. The trainees’ usage of the Tablet PC was tracked using a feature of the Toolbook authoring program that generates time-stamped entries to a log file each time a subject accesses a new page or screen. We did this in an anonymous fashion that did not allow identification of any individual.

Faculty survey We contacted all the staff PEM physicians working in the BCCH ED by e-mail to orientate them to the program. We also described the aims and use of the program at multiple division rounds and business meetings. At the end of the evaluation period (February 2007) the faculty members were


asked to complete the survey at one of the division business meetings. Faculty who were not in attendance at that meeting were contacted separately.

Data Analysis: Survey results are reported using appropriate descriptive statistics and graphical representations. Log file measures of interest (number of screens accessed, path followed through the program, and the subjects' attitudes) are described with relevant descriptive statistics such as frequency histograms.

Qualitative Data Collection and Analysis: All physicians and students working in the department throughout the month of November 2007 were approached at one of the academic teaching half days at the beginning of November. They were told that the project investigator, Dr Mohammed Abu Aish (MAA), who works in the Pediatric Emergency Department at BC Children's Hospital, would be observing "some" of the medical students and physicians during his clinical work in the department to collect data about radiology education in the department. Consents were obtained at the same time. MAA collected field data while working, rather than as an added observer. We felt that this would avoid students and physicians changing their behaviors (e.g. showing more interest in technology, spending more time teaching, etc.) while being observed. He avoided recording his observations on paper in front of the team, to ensure spontaneous and natural behaviors; instead, he recorded his observations privately as soon as each interaction had finished. MAA observed 6 encounters that occurred at the X-ray viewing box and recorded the conversations between emergency physicians and students. He collected data on three such encounters before the implementation of our intervention, and another three after implementation. The post-implementation encounters were with the same faculty members but with different students (we were unable to include the same students due to scheduling conflicts).
Since we collected only 6 brief encounters, hand analysis was used to link the content of the discourse to the 3 themes described earlier in Collins' Cognitive Apprenticeship model, wherever a link could be made: Modeling (M), Scaffolding (S) and Reflection (R). No computer analysis, transcription or lengthy coding was used, as the aim was to provide brief examples of how different physicians used this new intervention.

Results

Log Files describing use of the Tablet Computer:


We collected log files of computer use over a 70-day period. During that time there were 110 separate accesses of the Tablet PC (1.6 per day). In each session, the median number of images accessed was three (IQR: 1, 5). Users typically looked at one body region per session (65/110). In 28 sessions, the user looked at two body regions. The maximum number in a session was eight (only once). The most commonly accessed body regions are shown in Figure 4, with chest x-rays being the most popular, followed in order by knee, C-spine and elbow. We could not precisely time the sessions, as there was no logout procedure; the user simply walked away when done. For sessions where more than one screen was accessed, the median length of time spent on each screen before the last one was 10 seconds (IQR: 5, 19). If we extrapolate that number to the last screen, then sessions generally lasted a median of 54 seconds (IQR: 30, 137). The most popular pages viewed were the labeled x-rays (84% of total pages viewed), with the remainder being views of the anatomy drawings.

Survey of Trainees: During the study period, 49 residents from different specialties and 17 medical students spent more than two weeks in the rotation and were oriented to the application. A further 11 residents and 16 students were not surveyed because they spent less than 2 weeks in the department, but they would have had the opportunity to use the Tablet Computers and would have contributed to the usage statistics. Survey participation was reasonable, with 38 residents (77%) and 12 students (70%) completing the end-of-rotation surveys. The residents came from Family Practice (17), Pediatrics (11), Emergency Medicine (6) and other subspecialties (4). The main reason trainees did not complete the surveys was non-attendance at the final teaching session of their rotation. Amongst the residents, there was no significant difference by subspecialty in the likelihood of participating.
The majority of the trainees had used the program at some point in their rotation: 33/38 residents (87%) and 10/12 students (83%). The relative frequency of their usage varied considerably (Figure 3). While 45% of trainees used the program at least once a shift, a minority did not use it even once: 5 residents (13%) and 2 students (17%). The general attitude of the trainees and students towards the application was positive; at least two-thirds found the application useful, practical and easy to use (Figure 5).

Survey of Faculty: Of the 17 faculty members approached, 14 (82%) completed the survey. They included seven of our nine full-time faculty, all six of the regular part-time physicians and both PEM fellows. None of the investigators is represented. The faculty did not use the application to the same extent as the trainees (see Figure 3), being roughly equally distributed across the three categories of use: frequent, occasional and non-user. The attitudes of the faculty towards the application were positive (Figure 5). At least two-thirds found the application "useful", "practical" and "easy to use". The majority (13/14) felt that the location of the Tablets, adjoining the PACS system, was practical. Interestingly, 9 of the 14 reported that the program "helped me interpret some difficult X-rays", suggesting that they may have used it for decision support.

Suggestions to Improve the Application:


The participants were asked, in an open-ended question at the end of their surveys, to identify ways of improving the application. Over a third of the respondents wanted representation of normal age variants within the program. Similarly, along with the labeled normal images, they would have liked representations of common pathologies such as growth plate fractures. Four of the faculty requested better Information Technology support, given that there were several instances where the computer was not available. Single respondents also suggested the following: normal variants; more clinical presentations; management information; increased font size; online availability outside the ED. Qualitative Data:

Observations: Six sample encounters were recorded, involving 3 physicians and 7 different trainees (medical students or residents); one pre-implementation and one post-implementation encounter was recorded for each physician. Figure 6 presents the encounters observed. We refer to each physician by number in order to compare his or her pre- and post-implementation behaviors later in the discussion. We used the cognitive apprenticeship model (Collins, 1989) to analyze the dialogue and behaviors between physicians and students (M for modeling, S for scaffolding and R for reflection).

Discussion

We have described a novel intervention for the learning of radiology by non-radiologists. Our trainees and faculty used the intervention regularly and reported positive attitudes towards it on our surveys. The form factor of the computers may be related to their acceptance. Reports of tablet computer use in veterinary medical education showed positive attitudes among students (Eurell, 2005). A number of studies have looked at the suitability of Tablet Computers for various health care applications (Lottridge, 2007) and have found broad acceptance. We believe that important factors in the success of the Tablet were its location, constant availability and simple interface. We carefully chose the locations of the tablets in our department so they could be used to help with decisions involving actual case radiographs. Information resources can fail if they take too long to access (Brian, 2007). We placed the computers on the wall right beside the PACS machines. Having the software application on a computer even ten feet away would have made impossible the simultaneous comparison of the labeled teaching image with the case's images. We purposefully wanted the application to be different from the other computers in the department. There was no keyboard and no mouse to cause confusion with the PACS or the Clinical Information System applications. Also, no other applications were available on the Tablet, so that it was always available for its intended use. Its default screen was the home screen of the application. In this way it came across less as a computer and more as a visual textbook or a third "help" screen for the PACS. In addition, we imposed a three-click rule whereby any required image could be accessed with only 3 clicks from any other screen. Our intervention is a natural next step in the development of radiology CAI.
While radiology educators have developed a large number of excellent applications, these are usually either presented online


for independent study or are part of the pre-clinical curriculum (Collins, 2002; Shaffer, 2005; Su, 2004). Some students and residents are fortunate to complete electives in a Radiology Department; however, considerable learning of how to interpret radiographs occurs in Emergency Departments or on the wards, sites where expert radiology educators are rarely available. One important implication of this form of distributed education is that radiology education experts will be able to influence the teaching done by non-radiologists well away from the Radiology department. Users reported using the application for decision support. Several of the faculty members used the labeled views to help them identify structures on radiographs that they found difficult to interpret. As we develop more sophisticated versions with normal variants, age variants and pathological images, the application is likely to be used in this role even more. Eleven of the 63 subjects (17%) never used the program. Some residents and faculty indicated that they did not know about it, and others mentioned that they did not have any reason to use it during their rotations (e.g. no trauma cases). We believe that some of these responses reflect the busy nature of our emergency department, but they might also indicate individuals' attitudes towards technology.

Implications of qualitative observations:

Our few short qualitative encounters flesh out the picture painted by our log file data and the surveys. We saw three distinct patterns:

• Physician 1 did not adopt the technology at all.
• Physician 2 adopted the technology immediately; she used the tablet in her instruction, including while talking about current patients.
• Physician 3 did not use the technology when he taught a real patient case; however, he did direct the students to use the technology when the department was quiet.

Only physician 2 used the program in the way we had envisioned, within the context of discussing actual patient x-rays. This physician encouraged participation and negotiation of meaning. Physician 1 did not use the intervention, while physician 3 used it as yet another inert didactic educational resource: present in the context but without much active interchange between himself, the student and the resource. These scenarios could be predicted from Rogers' model of the diffusion of innovations (2003), which states that some individuals, termed "early adopters", will be quick to integrate a new technology into their practice. In this case, Physician 2 adjusted her educational practice to incorporate the new intervention. Physicians 1 and 3 correspond, respectively, to the "technology skeptics" and the "majority" in Rogers' model. We believe there is considerable educational advantage to adopting this "Third Screen" intervention. Scaffolding is an important feature of the cognitive apprenticeship (Collins, 1989). The intervention enhanced the cognitive apprenticeship model by presenting a visual scaffold to the trainee's reasoning about their current patient's x-ray. Trainees could work on the x-ray while still remaining in the role of clinician, instead of passively waiting for the preceptor to pronounce on it.
More attention to the quality and depth of these types of interactions between students and physicians is needed; we also have to explore other solutions (e.g., time management and instructional skills workshops for faculty members) to improve these conversations in our setting.


Limitations: We used three Tablet PCs in our study: two fixed to the wall and one as a backup in case of emergencies. We had to replace one of the wall tablets, which failed due to irreversible damage to its hard disk. The most common technical problem we faced during the study period was accidental unplugging of the power cord (most likely because it gets confused with the pen). The tablet pen uses special batteries (AAAA size); when the batteries start to drain, the pen behaves erratically. We only included trainees who spent two or more weeks in the ED, as we felt this would give them enough time to use the program in different situations and for different scenarios. As a result, we lost from our sample a fair number of medical students who rotate for less than two weeks. We still believe that the students enrolled are representative of overall medical student attitudes towards our intervention. Our qualitative data gathering was based on relatively few interactions. We did not have the resources to carry out comprehensive data gathering and subsequent grounded theory or content analysis. Having incidentally observed a large number of interactions during the course of our clinical duties, however, we feel that the three vignettes we describe are representative of the spectrum of use of the application.

Future directions: Reports in the radiology and medical education literature of using digital technology and picture archiving and communication systems (PACS) in radiology education (9) describe positive attitudes among trainees. Although we believe that our intervention is more interactive than saved image libraries, we have no proof that it is superior to PACS systems in knowledge gain and transfer; future randomized studies are needed to determine the impact of each intervention. Even though our program was introduced in a tertiary care center, we believe, based on the attitude of the family practice residents towards it, that it will be acceptable in community teaching centers and rural hospitals, where the availability of radiological support can be even more challenging. Various suggestions were offered to make this program a better learning tool; the most common were adding more x-rays for different age groups and adding pathologies. We are currently developing the second version of the program, which includes different age groups, and are aiming to include pathology x-rays in the third version. The second version is expected to be up and running in July 2008. We believe that with the second version of the program, trainees' attitudes towards it will be even better.

Acknowledgements: We would like to thank Dr Amal Yusef and Dr Simi Khangura for their great help in the early stages of this project.

References:

1. Brown, J.S., Collins, A. & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher 18(1), 32–42.



2. Brunswick JE, Ilkhanipour K & Seaberg D (1996). Radiographic interpretation in the emergency room. Am J Emerg Med;14:346–8.


3. Collins, A., Brown, J.S. & Newman, S. (1989). Cognitive apprenticeship: teaching the crafts of reading, writing and mathematics. In: Resnick, L.B. (Ed.), Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser. Lawrence Erlbaum Associates, New Jersey, pp. 453–494.


4. Collins J, Dotti SL & Albanese MA (2002). Teaching radiology to medical students: an integrated approach. Acad Radiol. Sep;9(9):1046-53.


5. De Bruijn, H.F.M. (1995). Cognitive apprenticeship in a CAL environment for functionally illiterate adults. Instructional Science 23, 221–241.


6. Durfee SM, Jain S & Shaffer K (2003). Incorporating electronic media into medical student education: a survey of AMSER members on computer and web use in radiology courses. Alliance of Medical Student Educators in Radiology. Acad Radiol. Feb;10(2):205-10.


7. Espinosa JA & Nolan TW (2000). Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ;320:737–40.


8. Eurell JA, Diamond NA, Buie B, Grant D & Pijanowski GJ (2005). Tablet computers in the veterinary curriculum. J Vet Med Educ. Spring;32(1):113-6.


9. Gatt ME, Spectre G, Paltiel O, Hiller N & Stalnikowicz R (2003). Is the radiologist really necessary? Chest radiographs in the emergency department. Postgrad Med J;79:214-217.

10. Gunderman RB, Williamson KB, Fraley RE & Steele J (2004). The role of technology in radiology education. Acad Radiol. Apr;11(4):476-9.

11. Halsted M, Kumar H, Paquin J, Poe S, Bean J, Racadio J, Strife J & Donnelly L (2004). Diagnostic errors by radiology residents in interpreting pediatric radiographs in an emergency setting. Pediatr Radiol 34:331–336. DOI 10.1007/s00247-004-1150-7

12. Higginson I, Vogel S, Thompson J & Aickin R (2004) Do radiographs requested from a paediatric emergency department in New Zealand need reporting? Emergency Medicine Australasia 16, 288–294

13. Holroyd BR, Bullard MJ, Graham TAD & Rowe BH. (2007). Decision Support Technology in Knowledge Translation. Academic Emergency Medicine 14 (11) , 942–948.

14. Jarvela, S. (1995). The cognitive apprenticeship model in a technologically rich learning environment: interpreting the learning interaction. Learning and Instruction 5, 237–259.

15. Lave, J. & Wenger, E. (1991). Situated Learning: Legitimate Peripheral Participation. Cambridge University Press, New York.


16. Letterie GS (2003). Medical education as a science: the quality of evidence for computer-assisted instruction. Am J Obstet Gynecol. Mar;188(3):849-53.

17. Lieberman G, Abramson R, Volkan K & McArdle PJ (2002). Tutor versus computer: a prospective comparison of interactive tutorial and computer-assisted instruction in radiology education. Acad Radiol. Jan;9(1):40-9.

18. Lottridge DM, Chignell M, Danicic-Mizdrak R, Pavlovic NJ, Kushniruk A & Straus SE (2007). Group differences in physician responses to handheld presentation of clinical evidence: a verbal protocol analysis. BMC Med Inform Decis Mak. Jul 26;7(1):22.

19. Marsh, H.W. (1982). SEEQ: a reliable, valid, and useful instrument for collecting students' evaluations of university teaching. British Journal of Educational Psychology, 52(1), 77-95.

20. Mayhue FE, Rust DD & Aldag JC (1989). Accuracy of interpretation of emergency department radiographs: effect of confidence levels. Ann Emerg Med;18:826–3.

21. Preston CA, Marr JJ & Amaraneni KK (1998). Reduction of "callbacks" to the ED due to discrepancies in plain radiograph interpretation. Am J Emerg Med;16:160–2.

22. Pusic MV, Leblanc VR & Miller SZ (2007). Linear versus web-style layout of computer tutorials for medical student learning of radiograph interpretation. Acad Radiol. Jul;14(7):877-89.

23. Rogers, Everett M. (2003). Diffusion of Innovations (5th ed.). New York: Free Press.

24. Samuel S & Shaffer K (2000). Profile of medical student teaching in radiology: teaching methods, staff participation, and rewards. Acad Radiol. Oct;7(10):868-74.

25. Shaffer K (2005). Radiology education in the digital era. Radiology. May;235(2):359-60.

26. Shaffer K & Small JE (2004). Blended learning in medical education: use of an integrated approach with web-based small group modules and didactic instruction for teaching radiologic anatomy. Acad Radiol. Sep;11(9):1059-70.

27. Smith (2007). Patient-centred learning — back to the future. Med Teach. Feb;29(1):33-7.

28. Su TJ & Shaffer K (2004). Reinventing the apprenticeship: the hot seat in the digital era. Acad Radiol. Nov;11(11):1300.

29. Wagner M, Heckemann RA, Nomayr A, Greess H, Bautz WA & Grunewald M (2005). COMPARE/Radiology, an interactive Web-based radiology teaching program: evaluation of user response. Acad Radiol. Jun;12(6):752-60.

30. Woolley

31. Williamson. Learning theory in radiology education. Radiology. Oct;233(1):15-8.


FIGURE 1. Visual Presentation of the Tablet. The Tablet Computer, on the left, is loaded with an image database presenting labeled educational images. The user interacts with the software using a digital pen. The right-hand screen is one of the two PACS screens.


FIGURE 2.A -- Software Screen Capture -- Labeled Radiograph


FIGURE 2B -- Software Screen Capture -- Approach to Interpretation


FIGURE 2C -- Software Screen Capture -- Anatomy Image


FIGURE 3. Self-reported use of the Tablet computer application. On the self-report survey, subjects reported how often they used the Tablet Computer according to the following categories: "Frequent Use" is generally every shift; "Occasional" is every several shifts; "Never Used."


FIGURE 4. Frequency of Tablet Computer Accesses by Body Region. The Da Vinci drawing is the home page for the application. Clicking on a body region takes the user to a representative x-ray. The diameter of each circle corresponds to the number of times that region was accessed over the 70-day logging period.






Figure 5. Trainee and Faculty Attitudes Towards the Tablet Educational Intervention. The combined results of a satisfaction survey of residents (N=38) and medical students (N=12) at the end of their rotation, and of faculty (N=14), are shown. The darkest middle line is the median response, and the gray box represents the interquartile range. Whiskers are the 5th and 95th percentiles.

• I learned something valuable from the Tablet application
• The Tablet helped me teach or learn x-ray interpretation
• The Tablet helped me interpret difficult x-rays
• Use of the Tablet is practical and feasible in the ER
• There are a sufficient number of Tablet Computers
• The Tablets are easy to use
• The location of the Tablet is effective
• The images and labels are clear
• The suggested approaches to reading the x-rays are clear
• Overall, the Tablet was a positive learning experience

Response scale: 1 = Strongly Disagree to 5 = Strongly Agree


Figure 6: Recorded encounters before and after introducing the program into the ED

Pre-implementation encounters:

The first encounter (Physician No. 1 with Resident No. 1)
(Student is looking at the X-ray viewing box, which shows xx)
Physician: Is this room 6's x-ray?
Student: Yes
(Physician approaches the screen)
Physician: Holy cow, what are you going to do? [S]
Student: I think I will call surgery to see him
Physician: No kidding, sounds good [R]
Student: Did you see him?
Physician: Not yet, let me finish this chart and we will go together
(Both walk away from the corner)

The second encounter (Physician No. 2 and Student No. 2)
(Student is looking at the x-ray viewing box while the physician quietly approaches him from behind and looks at the screen)
Physician: What do you see?
(Student points to a spot on the x-ray)
Student: I think the right heart border is a bit dirty to me
Physician: Sure, but would you treat? [S]
Student: I am not sure
Physician: Do you know what we call this area on chest x-rays? [S]
Student: No
Physician: We call it the zone of radiographic bulls**t
(Both laugh)
Physician: If we treated all patients coming in with dirty borders, we would've treated thousands of viral pneumonias with antibiotics, do you see my point? [M]
Student: I see, so you think it's viral?
Physician: Yeh, I think it is, do you wanna go see him?
Student: Sure
(Both walk away)

The third encounter (Physician No. 3 and Student No. 3)
(Student is looking at the x-ray box; after a few minutes, he grabs one of the textbooks from the adjacent shelf and opens a few pages while looking again at the screen)
(Physician approaches the student)
Physician: Is this room 9's x-ray?
Student: Yes
(Student returns the book to the shelf and joins the physician)
Physician: What do you think? [S]
Student: I am not sure; I think the alignment is off [R]
(Student points to a spot on the x-ray)
Physician: Do you know what this line is called? [S]

Student (smiling): Actually, I was just looking it up before you came
Physician: That's OK, it's called the anterior humeral line; do you see how it intersects the capitellum? [S]
(Physician points to the x-ray)
Student: Yes, yes
Physician: This is where we look for elbow dislocation, looks OK here [M]
Student: So it is not dislocated!
Physician: No, I think it is OK, don't you agree? [M, S]
Student: Sure, sure. Do you want to splint him?
Physician: That's fine with me; get him to see his family doctor in a week.
(Student leaves the area)

Post-implementation encounters -- Tablet PC installed:

The fourth encounter (Physician No. 1 and Student No. 4)
(Student playing with the tablet screen)
(Physician approaches the student)
Physician: Did you see room 1's x-ray?
(Student leaves the tablet and goes to the x-ray box)
Student: Yeh, it's up, looks fine
(Physician clicks on the digital x-ray viewing box to bring up the x-ray)
(Student points to a spot on the x-ray and says): Would you call it normal?
Physician: Yup, it's fine, not uncommon for this age [M]
Student: Great, she is on her way
Physician: Awesome
(Both leave)

The fifth encounter (Physician No. 2 and Student No. 5)
(Student looking at the x-ray box, calls the physician for help)
Student: Dr ----, do you have a minute?
Physician: Sure (approaches the screen)
Student: This is the kid in 10; he is OK now but needs his c-spine cleared
Physician: What did you find on exam?
Student: Well, he is a bit tender on palpation, but everything else is negative
Physician: Anything on x-ray? [S]
Student: Well, lines seem OK, can't see a fracture
(Physician turns to the tablet, clicks on a few screens)
Physician: Did you compare the lines with the tablet's? [M, S]
Student: Actually, I haven't used it yet
(Physician hands the pen to the student; student looks at the tablet screen and looks back at the x-ray viewing box) [S]
Student: Looks fine to me [R]
Physician: I think so, I just like to keep them in mind, you know, we don't see them often [M]
Student: Oh, sure (smiles)
Physician: What do you wanna do? [S]
Student: I wanted to do flex/ex views just to confirm [R]
Physician: Sure, go ahead
(Both leave)


The sixth encounter (Physician No. 3 and Students No. 6 and 7)
(Physician talks to both students)
Physician: Did you guys use the tablet?
Student 6: I played with it earlier today, it is a great tool
Student 7: I didn't have the chance to use it yet
Student 6: It is a great program, especially the elbow page
(Physician stands and walks to the tablet, grabs the pen and clicks on the screen)
Physician: C-spine is excellent as well; you can see all the lines we talked about by clicking on this box
(Physician hands the pen to Student 7 and says): Feel free to play with it when it's quiet
Student 6: I used it yesterday to compare an elbow, helped like a comparison view
Physician (laughing): Saved the patient extra radiation
(Nurse interrupts to ask the physician to assess a patient; the physician leaves with Student 6 while Student 7 continues to click on a few screens on the tablet)

M: Modeling; S: Scaffolding; R: Reflection. Adapted from the Cognitive Apprenticeship Model (Collins, 1989)


Using Tablets to learn/teach Radiology  

My Fellowship project at BC Children's Hospital