Difficulties with “Digital Natives”: Bridging the Skills Gap Via One-Shot Library Instruction

by Emily Thompson
Studio Librarian, University of Tennessee at Chattanooga Library
and
Theo Rhodes
Assistant Professor of Psychology, State University of New York at Oswego

[peer-reviewed article]

Abstract 

Current students are very familiar with their handheld devices, but they are often thrown into productivity applications with very little instruction due to the assumption that digital natives are already proficient. This study focused on students’ ability to use PowerPoint to create and deliver a presentation. We conducted an A-B comparison with a “one-shot” instruction session by a librarian in between. Ratings from a group of objective observers showed a statistically significant improvement in the post-intervention slides. This suggests that explicit lessons in common productivity applications help students, and points to a possible new direction for library instruction.

Many librarians have anecdotes of current college students needing help with tasks that seem fairly basic, such as attaching a PDF to an email or using PowerPoint to construct anything more than a mediocre presentation. At our small, Master’s granting, comprehensive university, the librarian in the newly created role of “Learning Technologies Librarian” and a professor in the Psychology department wondered if focusing a one-shot librarian instruction session on presentations would improve student work. We put together a 50-minute instruction session on presentation skills with space to ask questions about PowerPoint, and then collected student-produced slides before and after the session in order to ascertain whether the students incorporated the advice and had begun to understand how to give a better presentation.

Literature Review

Conventional wisdom characterizes the current generation of students as extremely adept at technology. In 2001, Marc Prensky coined the terms “digital natives” and “digital immigrants,”[1] and these terms have since seeped into everyday speech. A digital native is often defined by the image of a teenager focused on a phone or engrossed in a videogame,[2] whereas the digital immigrant is usually older, confused, and embarrassed.[3] Prensky’s article details how students who grew up surrounded by media prefer to learn via short videos and are constantly networked, which can confuse older professors and teachers who still teach “step-by-step.”[4] However, current college-aged students are often unable to pinpoint when they learned to use typical productivity applications such as Word, PowerPoint, or Excel, and characterizing people in this age group as digital natives has had the adverse effect of reducing explicit instruction in the use of common applications.

In 2013, the Organisation for Economic Co-operation and Development (OECD) compiled data from its first Survey of Adult Skills. The survey measured literacy and numeracy skills, but also gathered information on the skills needed to solve problems in “technology-rich environments.”[5] The United States in particular performed poorly: 49.4% of the American adults aged 16-24 who were surveyed scored at Level 1 (the ability to navigate a single environment to accomplish a specifically stated goal) or lower.[6] To achieve Level 2, a participant needed to perform a specific task across multiple applications and deal with unexpected difficulties.[7] This result indicates that despite their presumed understanding of technology, current students of high school and college age have trouble transferring knowledge from one application to another. While today’s students are indeed constantly surrounded by technology,[8] the question remains whether immersion in smartphones and related technology results in automatic fluency in productivity applications.

An alternate theory has been put forward by a group of researchers from the Online Computer Library Center (OCLC) and the University of Oxford. Instead of digital “natives” and “immigrants,” they describe “Visitors” and “Residents.” Visitors view the web as a series of tools that will eventually yield a result. Residents see the web as a place, albeit a digital one, and as in a physical place they move around it, speaking to various groups of people and gathering information to suit their needs.[9] While this theory is particularly relevant to web-based information gathering, students regularly interact with applications beyond the browser without necessarily being “Residents” of those applications. Every assigned paper or presentation engages them with a productivity application (Word, PowerPoint, etc.), but the focus is typically on the assignment rather than the tools. While a browser interface is a familiar place that leads to both research and society, productivity applications are typically first encountered in a schoolwork context and do not lend themselves to social interaction and “Resident” engagement the way many web-based applications do. Productivity applications can be considered the airports of the digital world: places used to get from an idea to its communication, but no one actually lives there.

Students are often assigned projects with very little indication of which tools they should use to complete the task. When the Pew Internet and American Life Project surveyed high school teachers to determine the types of projects assigned (which are still overwhelmingly research papers),[11] it did not ask what applications would be used to accomplish them. When a specific application is mentioned in pedagogical analysis, such as in J. H. Bickford’s “Uncomplicated Technologies and Erstwhile Aids: How PowerPoint, the Internet, and Political Cartoons can Elicit Engagement and Challenge Thinking in New Ways,”[13] it is primarily in the context of trying to engage students in their assignments rather than teaching them how the program works. The students in that study certainly appreciated the use of technology, since they were better able to communicate their ideas without having to worry about drawing skills,[14] but this may be in part because they were able to simply gather photos and then fashion them into what were essentially memes.[15] In other words, they were allowed to bring their local (“Resident”) knowledge into an application they were just visiting. The assignment did not cover what PowerPoint is for or how to use it well. The same pattern appears in other classroom PowerPoint activities, such as creating a game[16] or interpreting literature using the program’s various features (animation, sound effects, etc.).[17] In fact, in the case of literature interpretation, some students expressed frustration that they were being asked to use technology in this way,[18] whereas others seemed to be overly engaged with the chance to use computers in class.[19] These assignments tend to pander to the idea that today’s students must use technology or they will lose interest, without bothering to teach the technology in context.

While it remains to be seen who will take up the mantle of teaching technology to today’s college students, teaching information literacy and database skills is commonplace in academic libraries, and that instruction has been shown to increase students’ self-efficacy and help resolve negative emotions towards the research process.[20] However, these efforts are typically focused on print formats, and teaching broader media literacy is still somewhat novel. The University of Michigan’s Visual Resource Center (VRC) has already made some inroads into teaching visual literacy as a complement to information literacy. By focusing on the Visual Literacy Competency Standards of the Association of College and Research Libraries,[21] the VRC has created a curriculum that helps students analyze images using a methodology similar to analyzing research articles.[22] However, the students are not asked to create their own images. In their 2014 article on digital storytelling, Scott Spicer and Charles Miller described a class in which students did create their own media. The authors noted that students gained a stronger sense of their own technical abilities by working on the project over the course of a semester.[23] They tracked specific skills, including using an external microphone to get better audio, adding music, and uploading to the Internet.[24] With some instruction and a specific assignment, students gained practical skills and confidence with digital media: they demonstrated a significant gain in confidence with audio-visual equipment and editing software (much of which was new to them) and a lesser, but still significant, improvement in project management and storyboarding (skills they may have used in other projects).[25] It should be noted that although Spicer is a librarian, this was a semester-long course rather than the “one-shot” workshop more typical of library instruction.

Why should academic libraries begin to take on this sort of instruction? Visual media are an increasingly important form of communication, and many careers require the ability to interpret and create media. To help college students acquire these skills, some professors have begun to expand the research paper into multimedia assignments. Thomas P. Mackey and Trudi E. Jacobsen point out that as students navigate these assignments, they need the skills to decide which type of media best articulates their ideas and which tools to use to create it.[26] The ACRL Plan for Excellence includes as one of its goals that “Librarians transform student learning, pedagogy, and instructional practices through creative and innovative collaborations.”[27] This is further reflected in the new Framework for Information Literacy. The second frame, “Information Creation as a Process,” speaks to the variety of ways information can exist and holds that learners need to recognize how a message can change depending on how it is packaged.[28] Unfortunately, this only addresses what students should understand when they encounter media created by someone else. Much as students first learn to read and then read to learn, when we teach students to make their own media they become better able to understand the pictures, videos, slide sets, or infographics they encounter, placing this instruction squarely in the purview of librarians.

For this study, we focused on presentations and their associated software (in particular PowerPoint and Prezi). These applications are a common entry point to multimedia, in that students are frequently assigned presentations as a complement to research papers. Despite their ubiquity, professors often complain (anecdotally) that students do not seem to improve as presenters no matter the instructions given. Our research question was simple: would students’ presentations improve if they had a class taught by a librarian who specializes in multimedia skills?

Methods

We recruited participants from psychology research methods courses at a mid-sized comprehensive northeastern public university. The classes took place in the Spring and Fall 2014 semesters and crossed six sections taught by four different professors. Class sizes ranged from 10 to 20 students, the majority of them female. The course required the creation of a research project, usually a “replication with extension,” from start to finish, culminating in a final paper with an accompanying 10- to 15-minute presentation. This course was chosen because it is required for all psychology majors and had been tailored to be consistent across professors, which allowed us to broaden our participant pool while knowing the students were in very similar situations. The librarian also happened to be the collections and instruction liaison to the Psychology department, which made it easy to recruit faculty and to establish rapport with students, many of whom had met her in a previous course.

At the beginning of the semester, the students completed a pre-survey indicating their general feelings about giving presentations and their overall skill level in creating them. They filled out these surveys during a librarian guest lecture on database searching and other research methods. (Appendix A)

The students then created a set of presentation slides. We asked the professors to give these a nominal grade to ensure some level of quality, and the students were not told that they would not actually have to present them. This assignment was not part of the original course; it was added specifically for this study. However, all students in the participating sections had to complete the slides, even if they declined to participate in the study itself. We considered these slide sets the baseline for the students’ skill level. This process was not without setbacks, which are detailed below.

Towards the end of each semester, the students had an intervention with the Learning Technologies Librarian (LTL), consisting of a second guest lecture on presentation skills. The LTL covered topics including how much text to put on slides, where to get pictures, basic photo editing using PicMonkey, stance, dress, and the mechanics of Prezi and PowerPoint.[29] After this class, the students created their final presentations, which they gave during the last week of class. Also during the last week, the students filled out a post-survey covering how helpful they thought the intervention was and their current level of confidence. (Appendix B)

Upon completion of the course, we collected both sets of slides and anonymized them by removing the students’ names and identifying details. We then converted the slides to PDFs using a naming convention of a letter (A, B, or C, depending on the professor) and a number (1-30 for the pre-intervention sets and 100-130 for the post-intervention sets). We distributed the slides in groups of six to fifteen to our 20 objective observers (all holding a master’s degree or higher), such that each set was evaluated by at least three different observers. The observers filled out a rubric for each set, and we collected the final scores for comparison and analysis (Appendix C).
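For future replications, the anonymization and observer-assignment steps lend themselves to scripting. The Python sketch below is a minimal illustration of the scheme described above; the function names, directory layout, and observers list are hypothetical, not a record of our actual workflow.

```python
import random
from pathlib import Path

def anonymize(slides_by_professor: dict[str, list[Path]],
              post: bool = False) -> dict[str, Path]:
    """Assign each student PDF an anonymous code: a professor letter
    (A, B, or C) plus a number, offset by 100 for post-intervention sets."""
    offset = 100 if post else 0
    return {
        f"{letter}{i + offset}": pdf
        for letter, pdfs in slides_by_professor.items()
        for i, pdf in enumerate(pdfs, start=1)
    }

def assign_observers(codes: list[str], observers: list[str],
                     per_set: int = 3) -> dict[str, list[str]]:
    """Give every anonymized slide set to `per_set` distinct, randomly
    chosen observers (exactly three in this simplified sketch)."""
    return {code: random.sample(observers, per_set) for code in codes}
```

A grouping step (batching six to fifteen sets per observer, as we did) could be layered on top of `assign_observers`; it is omitted here for brevity.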

Over the course of the study we had a number of setbacks. The most prominent occurred when one of the collaborating professors neglected to collect the first set of slides from their two sections, and the students subsequently used those slides as a starting point for their final presentations without saving the earlier draft. We kept the survey responses from these students; however, their final slides were not evaluated by our observers.

Unfortunately, these were the two largest classes (~20 students each), which reduced our population to 12 students. In order to get a sufficient number of slides, we expanded the study to include two additional classes in the following semester. This gave us an additional 28 students, for a total of 40. Conditions were kept the same as in the previous semester, and in the rare cases where students had already participated, they were not included.

Two students used Prezi instead of traditional slide-based presentation software. The links they submitted for their pre-intervention and post-intervention slide sets were the same, so their slides were not given to the observers.

The research methods course has a rather high attrition rate, which is why we have more pre-surveys than post-surveys and slide sets. All of the classes lost at least one student during the semester; this is normal for this particular course.

While the rubric proved to work fairly well, we designed it to be used for many types of multimedia projects rather than just presentations. This caused some frustration for our observers when presentations did not contain pictures requiring citation. The initial email to observers included instructions, but there was still some confusion, so the rubric instructions would need to be revamped for any further replications.

Results

A total of 59 students completed both the pre- and post-intervention surveys. According to the pre-intervention surveys, 35 of the students reported feeling anxious and 37 nervous when assigned a presentation. Only eight said they felt excited, and only seven confident (see Figure 1).[30] However, the post-survey responses show that the majority of students felt more confident after the librarian’s presentation, rating themselves 4 or 5 on a scale from 1 (less confident) to 5 (more confident) (see Figures 2 and 3).
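Because students could select more than one feeling (see note 30), the counts above sum to more than 59. As a minimal sketch, multi-select responses like these can be tallied as follows; the response lists shown are invented placeholders, not our survey data.

```python
from collections import Counter

# Invented placeholder answers to pre-survey question 4; each inner
# list is one student, who may have circled more than one feeling.
responses = [
    ["nervous", "anxious"],
    ["excited"],
    ["anxious", "nothing much"],
]
tally = Counter(feeling for student in responses for feeling in student)
print(tally.most_common())  # e.g. [('anxious', 2), ('nervous', 1), ...]
```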

Figure 1

Figure 2

Figure 3

As for the slides themselves, 40 students completed and turned in both the pre- and post-intervention slide sets, which we passed on to multiple independent observers for rating. We assessed interrater reliability among the observers using a one-way random, average-measures intraclass correlation.[31] This measure provides an estimate of how consistently our independent, randomly selected observers rated the student presentations. The resulting ICC was 0.755, which indicates good agreement between raters (F(82,166) = 4.084, p < 0.001). This indicates that the variability in our independent observers’ reports introduced a minimal amount of measurement error, and that their assessments are reliable measures of student performance.
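For readers who wish to reproduce this analysis, below is a minimal sketch of the one-way random, average-measures intraclass correlation (McGraw and Wong’s ICC(1,k)). It is an illustrative reimplementation, not our actual analysis script; the function name and array layout are assumptions.

```python
import numpy as np

def icc_1k(ratings: np.ndarray) -> tuple[float, float]:
    """One-way random, average-measures ICC (McGraw & Wong's ICC(1,k)).

    ratings: an (n, k) array; row i holds the k rubric totals that
    slide set i received from its randomly assigned observers.
    Returns the ICC and the F statistic (df = n - 1 and n * (k - 1)).
    """
    n, k = ratings.shape
    row_means = ratings.mean(axis=1)
    # Mean squares from a one-way ANOVA: between slide sets vs. within.
    ms_between = k * np.sum((row_means - ratings.mean()) ** 2) / (n - 1)
    ms_within = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / ms_between, ms_between / ms_within
```

An (n, k) layout with n = 83 rated slide sets and k = 3 ratings per set reproduces the degrees of freedom reported above (82 and 166).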

Not only did the students assess themselves as more confident, but our observers judged their post-intervention slides to be of higher quality. The average score for the students’ post-intervention presentations (28.3) was significantly higher (t(39) = 5.45, p < 0.001) than the score for the same students’ pre-intervention presentations (23.1; see Figure 4).
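The comparison itself is a standard paired-samples t-test on each student’s pre- and post-intervention rubric scores. The sketch below uses simulated stand-in data; the real scores are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated stand-in scores for 40 students; each value is a rubric
# total averaged over that student's observers.
pre = rng.normal(23.1, 4.0, size=40)
post = pre + rng.normal(5.2, 3.0, size=40)

t, p = stats.ttest_rel(post, pre)  # paired test, df = n - 1 = 39
print(f"t(39) = {t:.2f}, p = {p:.4g}")
```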

Figure 4

Overall, the results indicate that a class on presentations given by someone other than the usual instructor is a positive experience for students and results in higher-quality presentations. Comparing the pre- and post-surveys, we see higher self-ratings of presentation ability. Additionally, most post-survey respondents indicated it was helpful to have a class on presentations. Comments included:

“It opened my eyes to different ideas and ask more questions regarding my presentation.”

“I liked how we were shown what websites to use and how to properly present, also where to get pictures.”

“The workshop helped me with the best way to design my presentation and general dos and don’ts of presenting.”

It is clear from the surveys that students may not always know where to turn when they are having trouble with a common application. They often feel that they are “supposed” to know something. The instruction session gave them a safe space to ask questions.

Discussion

While it is apparent that the presentation skills lecture from the librarian had a positive effect on the students’ presentations, the mechanism is not completely clear. Interestingly, an informal survey of psychology faculty at the university involved in the study suggested that the librarian’s advice was not much different from what these professors already provide in a presentation assignment or in criteria on their syllabi. Furthermore, all of these students held sophomore or junior standing and had likely already completed a presentation assignment in another class. However, the scores on their first sets of slides suggest that they had either developed bad habits or lacked basic presentation skills. We lean toward the view that bringing in external expertise signals that presentation skills have broader applications; in other words, the intervention makes the assignment feel like more than just a task to get a grade in a specific class. It also helped that the librarian is a confident and experienced presenter, giving the students someone to emulate. It is not clear whether this experiment would be as successful with a different librarian.

It would be helpful to revisit these students later in their academic careers to see whether they have retained the skills over time, and thus internalized them as something worth having rather than just something they did for a grade. Further research could also help home in on whether this effect is specific to presentation skills. The rubric was designed to be extended to other multimedia assignments and has already been adapted to several classes doing videos. It also serves as a convenient artifact when professors express interest in a multimedia assignment but are not sure where to start with grading.

Conclusion

At its heart, this study suggests that while the majority of today’s college students are adept at social media and cell phone use, they are less adept with common productivity software. An intervention by an outsider can boost confidence and quality by providing an opportunity to address common mistakes and to answer questions. The intervention also has the effect of teaching students that presentation skills are interdisciplinary life skills, rather than just a task on the way to a grade for one class. While this study focused only on the use of presentation software, further research could extend to other productivity and creative applications.

Acknowledgements:

The authors wish to thank Dr. Leigh Bacher, Dr. Ceylan Cizmeli, Audrey Hager, and Dr. Roger Taylor for letting us use their classes in this study and Tasha Bergson-Michelson, Jessica Olin, and Jessica Schomberg for providing valuable advice as we worked through this paper.

The authors also wish to thank our objective observers: Bo Baker, Sarah Barbrow, Natalie Haber, Michelle Bishop, Virginia Cairns, Leah Galka, Beverly Kutz, Linda Lee, Adrienne Matteson, Kimberly Miller, Jaime Myers, Kirsten Parsons, JJ Pionke, Dr. Jennifer Rapke, Emily Petty Puckett Rodgers, Brian Rogers, Karen Shockey, Katherine Stout, Chantelle Swaren, Brandon West, and Lane Wilkinson.

Appendix A

Pre-survey:

1) How confident are you in your presentation skills? (1 = novice -> 5 = expert)

(novice) 1 - 2 - 3 - 4 - 5 (expert)

2) PowerPoint is (1 = difficult -> 5 = easy)

(difficult) 1 - 2 - 3 - 4 - 5 (easy)

3) Where do you get your images from?

[open answer]

4) When assigned a presentation I feel

a) excited
b) nervous
c) anxious
d) confident
e) nothing much

5) Who helps you with your presentation?

[open answer]

6) What programs or applications do you use for your presentations?

[open answer]


Appendix B

Post-survey:

1) How confident are you in your presentation skills?

(novice) 1 - 2 - 3 - 4 - 5 (expert)

2) Was it helpful to have a class on presentations? (1 = yes -> 5 = no)

(yes) 1 - 2 - 3 - 4 - 5 (no)

3) What did you appreciate the most about the workshop?

[open answer]

4) Was anything unhelpful?

[open answer]

5) Where did you get your images from?

[open answer]

6) Did you feel less or more confident about the second presentation?

(less confident) 1 - 2 - 3 - 4 - 5 (more confident)

7) Did you meet with the Learning Technologies Librarian outside of class? (Yes/No)

Appendix C

Each slide set was rated on the skills below, each scored on a four-point scale:

Excellent (4): The presentation has all the indicators listed.

Good (3): The presentation has most of the indicators, but may be missing one or be slightly imperfect.

OK (2): The presentation makes a good-faith effort, but is missing half of the indicators.

Poor (1): The presentation is complete, but is missing most of the indicators.

Skills:

- Citation of photos
- Permission (Does the student have a legal right to use the material? Is the material Creative Commons licensed?)
- Clarity (Is the research topic presented in an understandable manner?)
- Quality (Is the presentation easy to follow? Does the presentation follow a logical path?)
- Pleasant design aesthetic (Words are readable on the background.)
- Appropriate design (Do the images relate to the topic?)
- Consistency (Does everything look like it belongs to the same project?)
- Error prevention (Does the project work? Slides move forward in the correct order, the soundtrack syncs with the visuals, etc.)
- Relevance (The project reflects the assignment.)

Partially inspired by:

Blummer, B. A., & O. Kritskaya. (2009). “Best practices for creating an online tutorial: A literature review.” Journal of Web Librarianship, 3 (3), 199-216.

Nielsen, J. (1995). “10 usability heuristics for user interface design,” Nielsen Norman Group. Retrieved from http://www.nngroup.com/articles/ten-usability-heuristics/
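If the rubric is adapted for other projects, its scoring is straightforward to encode. The helper below is a hypothetical sketch that assumes each of the nine skills is rated 1 (Poor) through 4 (Excellent) and that a slide set’s score is the sum of its ratings, giving a 9-36 range consistent with the means reported in the Results.

```python
# Hypothetical scoring helper for the Appendix C rubric; assumes each
# skill is rated 1 (Poor) through 4 (Excellent) and that a slide
# set's score is the sum across skills (range 9-36).
INDICATORS = (
    "citation", "permission", "clarity", "quality", "aesthetic",
    "appropriateness", "consistency", "error prevention", "relevance",
)

def total_score(ratings: dict[str, int]) -> int:
    missing = set(INDICATORS) - set(ratings)
    if missing or any(not 1 <= r <= 4 for r in ratings.values()):
        raise ValueError(f"need a 1-4 rating for every skill; missing: {missing}")
    return sum(ratings[i] for i in INDICATORS)
```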

References

[1]                Prensky, M. (2001). “Digital natives, digital immigrants, part 1,” On the Horizon 9 (5), 2-3. DOI: 10.1108/10748120110424816

[2]                “Brain dead teen, only capable of rolling eyes and texting, to be euthanized.” (2015). The Onion. Retrieved from http://www.onionstudios.com/videos/brain-dead-teen-only-capable-of-rolling-eyes-and-texting-to-be-euthanized-1621

[3]                “60-year-old corporate executive grotesquely forms word ‘hashtag’” (2015). The Onion. Retrieved from http://www.theonion.com/article/60-year-old-corporate-executive-grotesquely-forms–38351

[4]                Prensky, 3-4.

[5]                OECD. (2013). OECD Skills Outlook 2013: First Results from the Survey of Adult Skills, 25. Retrieved from https://www.oecd.org/skills/piaac/Skills%20volume%201%20(eng)–full%20v12–eBook%20(04%2011%202013).pdf

[6]                OECD, 90-93.

[7]                OECD, 90.

[8]                Prensky, M. (2001). “Digital natives, digital immigrants, part 2: do they really think differently?” On the Horizon 9 (6), 4. DOI: 10.1108/10748120110424843

[9]                White, D. S. & LeCornu, A. (2011). “Visitors and Residents: A new typology for online engagement.” First Monday 16 (9). Retrieved from http://firstmonday.org/article/view/3171/3049

[11]               Purcell, K., et al. (2012). How Teens Do Research in the Digital World. Pew Research Center’s Internet & American Life Project, 42. Retrieved from http://www.pewinternet.org/2012/11/01/how-teens-do-research-in-the-digital-world/

[13]               Bickford, J. H. (2010). “Uncomplicated technologies and erstwhile aids: How PowerPoint, the internet, and political cartoons can elicit engagement and challenge thinking in new ways.” History Teacher 44 (1), 51-66.

[14]               Bickford, 60.

[15]               Bickford, 58-59.

[16]               Siko, J., Barbour, M., & Toker, S. (2011). “Beyond Jeopardy and lectures: Using ‘Microsoft PowerPoint’ as a game design tool to teach science.” Journal of Computers in Mathematics and Science Teaching 30 (3), 303-320.

[17]               Callahan, M. & King, J. M. (2011). “Classroom remix: Patterns of pedagogy in a techno-literacies poetry unit.” Journal of Adolescent & Adult Literacy 55 (2), 134-144.

[18]               Callahan & King, 140.

[19]               Callahan & King, 141-142.

[20]               Ren, W. H. (2000). “Library instruction and college student self-efficacy in electronic information searching.” The Journal of Academic Librarianship 26 (5), 326.

[21]               Association of College and Research Libraries. (2011).  ACRL Visual Literacy Competency Standards for Higher Education. Retrieved from http://www.ala.org/acrl/standards/visualliteracy

[22]               Shoen, M. (2015). “Teaching visual literacy skills in a one-shot session.” VRA Bulletin 41 (1).

[23]               Spicer, S. & Miller, C. (2014). “An exploration of digital storytelling creation and media production skill sets in first year college students.” International Journal of Cyber Behavior, Psychology and Learning 4 (1), 46-58.

[24]               Spicer & Miller. 54.

[25]               Spicer & Miller. 55

[26]               Mackey, T. P., & Jacobsen, T. E. (2011). “Reframing information literacy as a metaliteracy.” College and Research Libraries 76 (1), 74.

[27]               Association of College and Research Libraries (2015). ACRL Plan for Excellence. Retrieved from http://www.ala.org/acrl/aboutacrl/strategicplan

[28]               Association of College and Research Libraries. (2015). Framework for Information Literacy for Higher Education. Retrieved from http://www.ala.org/acrl/standards/ilframework

[29]               Thompson, E. (2014). “Presentation 101” [Slides]. Retrieved from http://prezi.com/d45f5hnsx4xh/?utm_campaign=share&utm_medium=copy&rc=ex0share

[30]               Some students listed more than one answer.

[31]               McGraw, K. O., & Wong, S. P. (1996). “Forming inferences about some intraclass correlation coefficients.” Psychological Methods 1 (1), 30.
