NITLE Camp 2010 Days 1 & 2

NITLE Camp 2010 was four days of in-depth discussion and learning about assessment activities and the pedagogy and support of mobile devices. For me, it was a fantastic introduction to these topics, and I have much more to discuss than what you’ll see here (so find me and we can talk if you want to hear more!), but here are some highlights of what I learned:

Day 1: Assessment: Ideas for inquiry & student success

The focus here was on learning-centered / student-centered assessment (as opposed to teaching-centered). Ashley Finley, Director of Assessment for Learning at the Association of American Colleges and Universities, led this day’s workshop.

Assessment as a conversation

Consider the idea that both formative assessment (continual, throughout the learning process) and summative assessment (at the end of learning) have a role to play within an overall assessment program, but that assessment is inherently continual: a conversation, if you will.

Planning for assessment

Create a plan using a logic model: build it from right to left, then implement it working left to right. Start by defining the goal or outcome, then define the evidence needed to demonstrate it, then define the resources needed to effect the change; implement by working through these steps in the opposite order. Make sure the plan includes clear steps to analyze the data and share it with as broad an audience as possible, along with a clear timeline for doing so.

Making assessment a campus-wide endeavor

Approach assessment as a holistic, integrated, campus-wide activity. Many departments are already involved in assessment work, so take stock of current assessment activities in other college departments (this involves conversations). Mapping the assessment already under way helps everyone identify redundancies as well as opportunities for collaboration. Ask your institutional research, college advancement, alumni, student life, civic engagement, admissions, and (in Middlebury’s case) Commons offices what they are doing to assess student learning outcomes.

For example, say you work in LIS and talk to the campus alumni office. Imagine you learn about an annual survey, sent to alumni five years out, that asks them to reflect on the value of their college experience. LIS, which is interested in feedback on the effectiveness of its information literacy program, adds one question to this survey asking which technology skills alumni learned and found most useful (or wish they had learned) as undergraduates. This tactic doesn’t create yet another survey but piggybacks on a tool already in use. It also provides a way to measure an outcome beyond the traditional four-year time period (continuing the conversation).

Implement and adjust

Make adjustments to the assessment program as needed while it is running. After each run-through, take some time to evaluate the program’s effectiveness. Revise and amend the assessment program on a regular (yearly) basis!

Day 2: Assessing instructional technology community meeting

Examples of assessment activities at other colleges and universities

DePauw University, Carol Smith, Director: assessment as a way to inform institutional priorities in IT
Colgate University: Collaboration for enhanced learning
St. Lawrence University: ECAR, HEDS, CIRP, MISO, etc., and “run, don’t walk, to your institutional research officer”
Colgate University: Institutional research, planning, and assessment effectiveness survey review
AAC&U and MISO: Inter-institutional assessment; VALUE rubrics and the MISO survey
Stonehill College: Information literacy assessment program
Centre College: Assessing student literacy through a new first-year course
Trinity University: Information literacy quality enhancement plan, “Expanding Horizons”

Meeting participants resolved to check in on the progress of assessment activities at their home institutions sometime in September.

Poster Session

In the evening on day 2, I attended a poster session presented by other camp participants (a PDF of all the poster abstracts is available). I think I gravitated toward the posters on topics for which I wasn’t attending workshops or meetings (Moodle, digital storytelling). Two highlights:

Woodle (Moodle at Wooster) findings

I particularly enjoyed hearing from Matt Gardzina, Director of Instructional Technology at the College of Wooster, about his school’s experiences with the learning management system (LMS) Moodle (nicknamed Woodle :). As the poster abstract explains, and as he related in person, faculty at Wooster ended up not using Woodle for much more than course readings and a parking spot for their syllabi; they used Woodle elements like quizzes and forums far less. As a result, the instructional technologists at Wooster have started to downplay Woodle and amp up support for their blogging and wiki platforms as alternatives to the LMS. I mentioned the Curricular Technology team at Middlebury’s recommendation to support a suite of tools as opposed to a single LMS, and he agreed that it was a good recommendation, especially given his findings at Wooster. (Kudos to the CT team on validation of their recommendation from a comparable institution! I bet Matt would be willing to discuss this further if you want to learn more about the specifics of the Wooster findings.)

Before and After: Augmenting Digital Story Projects

When we teach with technology, how can we ensure a balance between student technology fluency and the other student learning outcomes for the course? Brett Boessen, Associate Professor of Media Studies at Austin College, shared some good examples when he explained how he has begun integrating formative accompanying materials (like storyboards) and self-reflective elements (students’ author statements) into a digital storytelling assignment in one of his classes. He played some delightful (and quite good) examples of videos, ranging from screencasts to mashups, created by students in his course on Participatory Cultures. By embedding planning and reflective elements in the assignment requirements, Brett seems to have struck a good balance between successfully engaging students with their own process of creating and sharing a story and achieving technology fluency.

6 thoughts on “NITLE Camp 2010 Days 1 & 2”

  1. Richard Jenkins

    Assessment: I especially like the idea of surveying Midd alumni 5 years after they graduate — to see what literacy skills they still use, what they wish they had more of, etc.

    1. Jess Isler (post author)

      Definitely worth considering. The point is to ask: are there other natural places where we can hook in small aspects of assessment and, in turn, perform basic outreach and strengthen relationships across campus? Not just with the library and research, but also with instructional technologies and other technologies that we support.

    1. Jess Isler (post author)

      I didn’t take notes, unfortunately, but I think Jason or Janet would be willing to share more info about their poster topic.

  2. Alex Chapin

    Jess,
    Thanks for sharing the Curricular Technology team’s initial recommendations for the future of curricular technology at Middlebury and reporting back reactions. A good source of information on alternatives to a single LMS is “7 Things You Should Know About LMS Alternatives.”

    We have configured our instance of Moodle (a.k.a. Measure) almost exclusively for online assessment, and in this capacity it has served us well, providing placement, entrance, and exit exams for many of the Language Schools as well as some academic departments. For more on this, see:
    Segue from Segue > Student Assessment

    That said, the assessments that have been created in Measure are simple knowledge assessments, examining what students know by means of multiple choice and cloze questions. These assessments usually have only one correct answer. Assessing information literacy or technology fluency is more challenging.

    1. Jess Isler (post author)

      Yes, that ELI 7 Things series paper is great. It’s almost uncannily aligned with your team’s recommendations, regarding both the suite of tools and the hub. Is the CT team moonlighting for EDUCAUSE? ;)
