
Weekly Web Development Round-up April 18-22, 2011

To give our colleagues a better idea of what’s changed in our web applications each week, we’ll be preparing this quick list for publication each Friday. Not all of the details of each change are included below, but we’ll be happy to answer any questions you might have in the comments.

Weekly Web Development Round-up April 11-15, 2011

To give our colleagues a better idea of what’s changed in our web applications each week, we’ll be preparing this quick list for publication each Friday. Not all of the details of each change are included below, but we’ll be happy to answer any questions you might have in the comments.

LIS site improvements since usability testing

Soon after the launch of the new Middlebury web site, the LIS website team conducted usability testing on the LIS parts of the site (see report 1, report 2). Many improvements have been made based on the feedback received during testing. Since part of the Team’s charge for this year included following up on these recommendations, we thought it best to share some of the highlights. These changes were made with the help of many content managers and website editors; we thank you for your contributions!

Tighter security on GO, administration enhancements

To address recent bot activity, input validation has been tightened on the form elements in GO available to non-authenticated users, including a CAPTCHA challenge on the “flag as inappropriate” feature. To avoid having to fill this in, you can always authenticate with your Middlebury account via the “log in” link at the top of GO.
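To give a rough sense of the kind of check involved, here is a minimal sketch (in Python, purely for illustration; this is not GO’s actual code, and all of the names below are made up) of how an anonymous “flag as inappropriate” submission might be validated while letting authenticated users skip the CAPTCHA:

```python
# A minimal, hypothetical sketch of the validation flow described above; GO's
# actual implementation differs, and every name here is illustrative.

import re

# Assumed format for a GO shortcut name submitted with a flag request.
SHORTCUT_PATTERN = re.compile(r"^[a-z0-9_/.-]{1,255}$")

def validate_flag_request(shortcut: str, authenticated: bool, captcha_passed: bool) -> tuple[bool, str]:
    """Decide whether a 'flag as inappropriate' submission should be accepted."""
    if not SHORTCUT_PATTERN.match(shortcut):
        return False, "Invalid shortcut name."
    if authenticated:
        # Logged-in Middlebury users skip the CAPTCHA entirely.
        return True, "Flag recorded."
    if not captcha_passed:
        return False, "Please complete the CAPTCHA or log in."
    return True, "Flag recorded."

# Anonymous submissions must pass the CAPTCHA; authenticated ones do not.
print(validate_flag_request("library", authenticated=True, captcha_passed=False))   # accepted
print(validate_flag_request("library", authenticated=False, captcha_passed=False))  # rejected
```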

Some usability enhancements have also been made to the administration back-end to make GO administration smoother.

Usability Testing & Web Analytics

Last month I attended a NERCOMP presentation on usability testing and web analytics at UMass Amherst, led by staff from Yale University. The information will no doubt be valuable as the LIS web team explores further usability testing. Below is a summary; my full notes are available on middfiles under “NERCOMP Events”.

Summary

Usability testing uncovers why your users behave the way they do. It can lower costs, increase participation, increase satisfaction, and provide data upon which to base decisions. It’s low-cost, but it can be time-consuming.

All that’s required is a laptop, someone to administer the test, and recording software or an observer. Do usability testing periodically when implementing new features or redesigning. Test early in a beta version of the site/application.

Once features have been implemented, collect analytics data to see what usage looks like. Form a question that can be answered by analytics data, decide how to measure it, and then see if the data makes sense. Check user behavior before and after the change. Do usability testing again to see if there are persisting or new issues.
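As a made-up illustration of that before/after check: suppose the question is whether a redesigned search box increased search usage, measured as searches per visit. The metric and the numbers below are invented for the example:

```python
# A made-up illustration of the before/after analytics check described above.
# The metric (searches per visit) and the numbers are invented for the example.

before = {"visits": 12000, "searches": 1800}   # month before the change
after  = {"visits": 11500, "searches": 2300}   # month after the change

def searches_per_visit(period):
    return period["searches"] / period["visits"]

rate_before = searches_per_visit(before)   # 0.15
rate_after  = searches_per_visit(after)    # 0.20

print(f"Before: {rate_before:.2f} searches/visit; after: {rate_after:.2f}")
print(f"Relative change: {(rate_after - rate_before) / rate_before:+.0%}")
```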

The “Think Aloud Protocol”

Have the user speak their thought process aloud as they try to complete the task. Think about what the user sees, says, and does. For each task, write a question designed to see if the task can be accomplished. Recruit 5-10 users from your user population and invite them to take a test consisting of tasks/questions.

When writing your questions, consider the goals of the implementation from both the organization’s and the user’s perspectives. Identify the tasks that need to be performed and form them into clear questions that each address a single task. Once you’ve identified your tasks, run a pre-test to see how well the questions work.

It’s important that the facilitator is impartial; the goal is to collect data, and explanations can be made after the session. Collecting general feedback can also be helpful, though it should be tempered with other sources of data.

“It’s not the customer’s job to know what he wants.” – Steve Jobs

LIS Content Managers Squash Errors Using SiteCheck

Now that the LIS website content managers have been established, the LIS web team has provided them with a tool to help identify and deal with errors in the content of web pages. The SiteCheck tool from Siteimprove crawls designated pages every 5 days and generates a report listing the errors it has found.

Currently a member of the web team compiles these reports and sends a monthly summary to content managers, who in turn deal with the errors. Spelling suggestions are reviewed on the SiteCheck page before they appear on the report. A single report is sufficient for all of LIS, since the number of issues is small enough that content managers can easily find the ones that pertain to their area.
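For anyone curious what that compilation step might look like, here is a hypothetical sketch of rolling per-page issue rows up into a per-area summary. The field names, paths, and report format are assumptions for illustration, not SiteCheck’s actual export:

```python
# Hypothetical sketch of rolling per-page issue rows (as a crawl report might
# provide them) up into a per-content-area summary. The field names and data
# are illustrative assumptions, not SiteCheck's actual export format.

from collections import Counter

report_rows = [
    {"page": "/offices/technology/lis/about", "area": "Administration", "issue": "broken link"},
    {"page": "/offices/technology/lis/help",  "area": "User Services",  "issue": "misspelling"},
    {"page": "/offices/technology/lis/help",  "area": "User Services",  "issue": "broken link"},
]

# Count issues per (content area, issue type) so each CM sees only their items.
summary = Counter((row["area"], row["issue"]) for row in report_rows)

for (area, issue), count in sorted(summary.items()):
    print(f"{area}: {count} x {issue}")
```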

The effort has been a great success: the 28 pages with broken links have been reduced to only 6 in the last report, and spelling errors dropped from over 20 to only 2. We are very excited to have this process in place going forward.

State of the Site

Overview

What follows is a report on the state of notable web applications and sites in use at Middlebury, including the College website, the Middlebury instance of WordPress (i.e. sites.middlebury.edu), and a variety of key web applications that provide services widely used by faculty, students, and staff.