This content was published: February 19, 2018. Phone numbers, email addresses, and other information may have changed.
Tips from eLearning 2018
Posted by Andy Freed
I had the good fortune to attend the ITC eLearning conference last week in Tucson, AZ. Rather than a specific Best Practice, I wanted to share a few things that stood out to me from the conference that seem timely based on what has been happening here at PCC.
Online Assessments as tools for deep learning
This presentation was by Kirsten Butcher, PhD from the University of Utah. She is an instructional designer herself and also teaches aspiring instructional designers. The presentation was great (and my full notes from this session are in Spaces), but I want to focus on her defense of multiple-choice questions as a tool for deep learning. From working with faculty for years, we know that there can be problems with multiple-choice questions. However, Kirsten argued that they can be an excellent tool for formative assessment if you write them well and provide constructive feedback. And using the LMS quiz tool means instant feedback for students. To summarize her session:
- Deep, low-stakes assessment can improve long-term knowledge retention and reward motivated students
- Questions that require the student to infer from the lecture/reading force them to think harder about the content.
- Detailed (and instant) feedback gives the student the freedom to get things wrong and learn from their mistakes.
- In the quiz settings, make the quiz worth something (a few points), but don’t set a time limit and allow many attempts. Let students see the feedback to promote reflection. Show only one question at a time to reduce cognitive load.
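The quiz settings in that last bullet can be captured concretely. As a sketch only, here's what they might look like as the payload for a Canvas-style quiz API (the parameter names below are Canvas LMS conventions and are my assumption — check your own LMS's quiz settings or API for the equivalents):

```python
def deep_learning_quiz_settings(title, points=3):
    """Build low-stakes formative quiz settings per Butcher's recommendations.

    Parameter names follow the Canvas LMS Quizzes API convention; other
    LMSes expose similar toggles under different names.
    """
    return {
        "quiz[title]": title,
        "quiz[points_possible]": points,        # worth something, but low stakes
        "quiz[time_limit]": None,               # no time limit
        "quiz[allowed_attempts]": -1,           # -1 = unlimited attempts in Canvas
        "quiz[show_correct_answers]": True,     # feedback promotes reflection
        "quiz[one_question_at_a_time]": True,   # reduces cognitive load
    }

settings = deep_learning_quiz_settings("Chapter 3 check-in")
```

In Canvas this payload would be POSTed to the course's quizzes endpoint, but the point is the pattern, not the plumbing: a few points, no timer, unlimited attempts, feedback visible, one question per page.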
There was a lot more but too much for a brief post, so I suggest checking out my notes or her slide deck.
Evidence supporting the value of the High Tech, High Touch Teaching Model to Improve Student Outcomes
Christopher Roddenberry and Tom Rankin, Wake Tech College
This session detailed the efforts behind a grant-funded program to increase student retention (especially among minority and first-generation students) by using high-touch interaction facilitated by technology tools. There are many projects like this, but I appreciated that this one has been running for several terms and has useful data. I also appreciated that the co-presenter identified himself as a non-technical instructor. Some of the biggest takeaways were:
- The methods used in the pilot are effective. One instructor saw a 4% increase in student retention and noted that 92% of his students performed better than the control population.
- The first week of the term was more work than before, but it had very high returns on the time invested during that week.
- The methods used in the pilot that seemed to have the greatest impact were:
- Option for a 1st week orientation. This was often in-person and allowed the students to meet the instructor and learn the software and course in person. (note: they don’t have something like the Start Guide)
- Week 1 outreach to “at risk” students – whether flagged for attendance or grades, contacting them directly (usually via text or the Remind app) paid dividends.
- Instructors in the pilot made a commitment to have a 6-hour response time during the pilot.
- Casual, low-fidelity instructor introduction videos are powerful. They don’t need to be polished. In fact, the lack of polish makes them personal.
- Virtual office hours were popular among students
Again, this was an info-packed session. I’ll try to add the slides to my notes but they weren’t that pretty. One of the things that I think might most resonate with instructors was something Tom mentioned. “Teaching is more fun now!” He acknowledged that there was more work in the setup and first week, but overall, less time was spent on administrivia. And the course evaluations from students were much higher than those in the control group.
It shouldn’t be any surprise that students really appreciate interaction with their instructor. Time and time again, it’s one of the things online students say they value most. I also attended a panel discussion with online students from Pima Community College. Interaction and timely feedback were the most critical things for them to stay motivated and feel connected.

Thanks, Andy, for sharing your main takeaways from the conference. I’ve been pushing the quiz tool as a *learning* tool, as opposed to an assessment tool, for years. Well-crafted, automatically graded questions can be an outstanding means of helping students check their understanding. And the LMS never gets tired! Students can complete quiz after quiz, as many times as they need to, to achieve mastery in a subject-area. Nice to hear that leading thinkers support this approach. – Peter
Andy – I’m so glad you shared this! I found the information on how to effectively use multiple-choice questions intriguing. While I haven’t (yet) used this exact approach with quizzes, I do use a similar approach for online multiple-part homework problems, and it seems to work well to help students learn the material. Hmm… It sounds like I now need to make an intro video :-)
Greg,
THANKS for sharing the PPT from Kirsten Butcher. (I am still working my way through that one, and not yet ready to move on to Roddenberry and Rankin!)
The idea of deep, low-stakes formative quizzes sounds WONDERFUL to me. But I am not at all confident I have the distinction clear between deep and shallow quiz items (especially for multiple choice). Do you by any chance have any other resources to recommend? I am thinking about investing the time to change my strategies for using quizzes — which would be a LOT of time — but I want to get more confident before I go forth and put this good idea into practice!