ACTT Notes: Canvas Evaluation Review

Announcements / Updates:

Request to add “UDOIT” to Canvas: an accessibility LTI tool that generates a report on a Canvas site to let faculty know when accessibility issues are present in a course. Joe will meet with the requesting faculty to determine what the implementation would be used for and whether this should become a project.

No ACTT meeting next week as we’ll be meeting as IT-GOV-ATDL.

A couple of requests to add “Poll Everywhere” to Canvas. This would allow polls to be administered directly through Canvas in a synchronous / on-ground classroom.


Canvas Data Evaluation

How useful is the information? What can / should we use? What do we not need? And what is missing?

We have Canvas for three years, and we need to be able to evaluate its use in order to determine whether Canvas (or any LMS) is a good technological choice for Middlebury.

Is there a reliable connection between statistics and “engagement” or “quality” of learning?

Numbers are not enough to say that “Canvas is improving teaching and learning at Middlebury.”

The biggest use at Monterey is for class resource web sites and for flipped instruction models. Monterey has very few purely online courses. There’s a certificate program with several online courses, but it is blended with on-ground courses.

The College does not have online courses. The Hebrew School is “hybrid,” running fully online for part of the year.

Can we define Canvas as a learning space, as more than a platform for distribution and submission? Should it / can it be used for more than file sharing? Or should we be looking at implementation of other technologies that accomplish the same thing?

What kinds of functions can we look at to determine if Canvas is being used well for teaching and learning? How do we measure those functions?

How do we apply an analytic to determine good pedagogy or successful teaching and learning? We can’t really understand what’s happening in classes without talking to students and teachers.

Should we figure out a way to do some qualitative research with teachers and students across Middlebury to determine how Canvas is being used? What works? What doesn’t work? What does teaching with technology look like at Middlebury?

We could try to align Canvas data with our findings from doing qualitative research. Start with “power users” to begin developing stories about how Canvas is being used. Expand the view by looking at teachers who are using other digital tools in their teaching and collecting those stories, all in support of an analysis of what tools are best for Middlebury.
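For reference, a minimal sketch of how daily course activity might be pulled from the Canvas REST API so that usage numbers can be set alongside the qualitative findings. The instance URL, API token, and course ID below are placeholders, not Middlebury’s actual configuration:

```python
import requests

# Placeholders -- substitute the real Canvas instance, token, and course ID.
CANVAS_BASE = "https://example.instructure.com"
TOKEN = "YOUR_API_TOKEN"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def course_activity(course_id: int) -> list[dict]:
    """Fetch course-level participation data (daily page views and
    participations) from the Canvas analytics endpoint."""
    url = f"{CANVAS_BASE}/api/v1/courses/{course_id}/analytics/activity"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # 12345 is a hypothetical course ID used only for illustration.
    for day in course_activity(12345):
        print(day["date"], day["views"], day["participations"])
```

This kind of export would only give the quantitative side; the interviews with teachers and students would supply the context the numbers can’t.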

Look for faculty from different programs to talk about use of Canvas.

Possible plan:

  1. Invite Canvas “power users” to discuss their use of Canvas later this fall.
  2. Align findings with Canvas data.
  3. Expand that discussion to include other technologies.
  4. Begin a larger discussion about the use of digital technologies for learning and teaching at Middlebury.

This could provide a model for qualitative research for future tech evaluations.