Tag Archives: lis web team

Sign up for LIS Website Testing!

Students, Faculty, and Staff: Would you like the opportunity to help LIS improve its website and make it work for you?

If you agree to help, we’ll observe and record you performing some tasks on the website.

Sessions will be scheduled from Monday, February 27 through Thursday, March 8 in the Davis Family Library. If you’re interested, please sign up by Thursday, February 23rd (sign-ups are now closed), and we’ll respond with a confirmation. Details are below.

The session will take no longer than 45 minutes (15 minutes for explanation and summary, and 30 minutes for testing activities). We’ll ask you to perform specific tasks and we’ll use your responses in our work to improve the site. Your responses will be used only within LIS and will not be shared with any outside organization.

Thank you for taking this opportunity to help LIS improve its web presence!

Usability Surveys on LIS Web Pages

The LIS web team is at it again! In an effort to improve the user experience on the LIS website, we are conducting usability testing. For a few weeks you’ll find obtrusive mint green boxes in the corners of the Library, Helpdesk, and LIS pages. These are very short usability surveys. Simply click “Give Feedback” and then answer each question by clicking the spot on the page where you would go to find the piece of information it asks about. Each survey contains only a handful of questions, so please participate when you have a moment.

LIS Website Team Update: UNA

This is an update on the LIS Website Team’s progress toward the User Needs Analysis (UNA) piece of our charge. Right now we’re sharing the results and suggested changes that emerged from the UNA with the LIS Content Managers for the 4 primary LIS Homepages (Curricular Technology, Helpdesk, Library, & LIS).

The UNA results were based on a handful of focus-group participants and a relatively small number of responses to the webpage pop-up surveys (particularly for certain pages). We are now turning our focus to designing usability testing (likely based on the format used by the original Website Team), and we hope for better participation in this phase of our assessment. We’ll hold off on sharing our UNA results until the usability testing is complete and summarized, so that all of the new LIS website assessment data can be shared at once.

Does tagging content make it easier to find with search? No.

I’ve received this question from several people now. Below are two videos from Matt Cutts, who works on Google’s Webspam team, explaining how tagging content mostly does not affect Google’s search results. This also means that tagging largely will not affect how results appear on Middlebury’s site, since we use Google to provide our search results.

  • Tags (video)
  • Tag Clouds (video)

This does not mean that you shouldn’t tag content at all. Tags can still be useful for humans who want to find other posts and pages on a topic. However, if you want your page to be easier to find, your time is better invested in making sure that the content is well written, well structured, and relevant to a particular topic.

LIS web presence – marketing project

The LIS Web team – Dan Frostman, Jess Isler, Richard Jenkins, Matt La France, & Barbara Merz – has been conducting a publicity blitz for selected features of the LIS web presence. The features were chosen in consultation with other LIS staff, with the aim of drawing attention to underutilized resources available to the Middlebury community. The features we advertised were:

  • Searching: the Midd Google search & special Helpdesk Google search
  • Training: Lynda
  • Drupal: documentation and new editing interface
  • Self-service PIN and password updating
  • Media Services Event Recording & Film Screening forms
  • Middmedia

Focus Groups on LIS Website (Free pizza & cookies!)

LIS will be conducting four focus groups with students, faculty, and staff on November 16 & 17, 2011. We would like feedback on the pros and cons of accessing and navigating certain pages on the LIS website. Using the results, we hope to improve the website.

If you are interested in helping, please fill out this sign-up form. (Available dates and times are listed on the form.)

Usability Testing & Web Analytics

Last month I attended a NERCOMP presentation on usability testing and web analytics at UMass Amherst. The information from the presentation, which was led by staff from Yale University, will no doubt be valuable as the LIS web team explores further usability testing. Below is a summary of the information; my full notes are available on middfiles under “NERCOMP Events”.

Summary

Usability testing uncovers why your users behave the way they do. It can lower costs, increase participation, increase satisfaction, and provide data upon which to base decisions. It’s low cost but can be time consuming.

All that’s required is a laptop, someone to administer the test, and recording software or an observer. Do usability testing periodically when implementing new features or redesigning. Test early in a beta version of the site/application.

Once features have been implemented, collect analytics data to see what usage looks like. Form a question that can be answered by analytics data, decide how to measure it, and then see if the data makes sense. Check user behavior before and after the change. Do usability testing again to see if there are persisting or new issues.
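
As a concrete illustration, here is a minimal Python sketch of framing an analytics question as a before/after comparison. The page, metric, and figures are hypothetical placeholders, not actual LIS analytics data.

    # Question: did a redesigned Helpdesk page reduce the share of visitors
    # who leave without clicking anything? (All figures below are made up.)
    before = {"visits": 4200, "bounces": 2300}  # period before the change
    after = {"visits": 4050, "bounces": 1700}   # comparable period after the change

    def bounce_rate(stats):
        """Bounces as a fraction of total visits."""
        return stats["bounces"] / stats["visits"]

    print(f"Bounce rate before: {bounce_rate(before):.1%}")
    print(f"Bounce rate after:  {bounce_rate(after):.1%}")
    print(f"Change: {bounce_rate(after) - bounce_rate(before):+.1%}")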

The “Think Aloud Protocol”

Have the user speak their thoughts and thought process aloud as they try to complete each task. Think about what the user sees, says, and does. For each task, write a question designed to see whether the task can be accomplished. Recruit 5-10 users from the population and invite them to take a test consisting of those tasks/questions.

When writing your questions, consider the goals of the implementation from both the organization’s and the user’s perspectives. Identify the tasks that need to be performed and turn them into clear questions that each address a single task. Once you’ve identified your tasks, do a pre-test to see how well the questions work.
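
To make that concrete, here is a minimal sketch (in Python, with invented tasks) of how a test script might pair each task with a single clear question and a success criterion for the observer to record:

    # Hypothetical usability test script: one clear question per task,
    # plus what success looks like for the observer. Examples are invented.
    tasks = [
        {
            "task": "Find library hours",
            "question": "Starting from the Library homepage, find out what time "
                        "the Davis Family Library closes on Friday.",
            "success": "Reaches the hours page without resorting to search.",
        },
        {
            "task": "Reset a password",
            "question": "Show me where you would go to reset your password.",
            "success": "Reaches the self-service password page.",
        },
    ]

    for number, item in enumerate(tasks, start=1):
        print(f"Task {number}: {item['question']}")
        print(f"  Success looks like: {item['success']}\n")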

It’s important that the facilitator be impartial; the goal is to collect data, and explanations can be made after the session. Collecting general feedback can also be helpful, though it should be tempered with other sources of data.

“It’s not the customer’s job to know what he wants.” – Steve Jobs

LIS Blog Update

I met with the LIS Website team this week and proposed some minor changes to this blog, the most obvious of which is an updated theme design. The other change, which I’ve just introduced, is a set of quick links in the navigation bar below the header.

I welcome feedback about the new theme. I have about half a dozen variations of this design, each with a different image in the upper right corner, which I’ll introduce over the next month or so as a way of refreshing the appearance of this blog. If you know of other images that you think would fit this space, send them along and I’ll add them to the rotation.