Journal revamped: “… the members of the new editorial group … say their task is to produce copy that demonstrates that academics can do more than indulge in prolix, self-indulgent, and jargon-ridden prose that does little for readers’ edification, let alone pleasure.” (The journal referred to: Public Culture.)
Students, Faculty, and Staff: Would you like the opportunity to help LIS improve its website and make it work for you?
If you agree to help, we’ll observe and record you performing some tasks on the website.
Sessions will be scheduled from Monday, February 27 through Thursday, March 8 in the Davis Family Library. If you’re interested, please sign up by Thursday, February 23 (sign-ups are now closed), and we’ll respond with a confirmation. Details are below.
The session will take no longer than 45 minutes (15 minutes for explanation and summary, and 30 minutes for testing activities). We’ll ask you to perform specific tasks and will use your responses in our work to improve the site. Your responses will be used strictly within LIS staff and will not be shared with any outside organization.
Thank you for taking this opportunity to help LIS improve its web presence!
The LIS web team is at it again! In an effort to improve the user experience on the LIS website, we are conducting usability testing. For a few weeks you’ll find obtrusive mint green boxes in the corners of the Library, Helpdesk, and LIS pages. These are very short usability surveys. Simply click on “Give Feedback,” then answer each question by clicking the location you’d go to find various pieces of information on that page. Each survey contains only a handful of questions, so please participate when you have a moment.
This is an update on the LIS Website Team’s progress toward the User Needs Analysis (UNA) piece of our charge. Right now we’re sharing the results and suggested changes that emerged from the UNA with the LIS Content Managers for the 4 primary LIS Homepages (Curricular Technology, Helpdesk, Library, & LIS).
The UNA results were based on a handful of participants in focus groups and a relatively small number of responses to webpage pop-up surveys (particularly for certain web pages). We are now turning our focus to designing usability testing (likely based on the format used by the original Website Team). We hope to achieve better participation for this phase of our assessment. We’ll wait to share the results of our UNA until usability testing is complete and summarized, effectively sharing all the new LIS Website assessment data at once.
This is part 2 of 2 blog posts describing the usability testing methods of the LIS Website Team (as promised in the Usabilla post).
The Team presented the results of our findings at a meeting with Area Directors and since the presentation itself does a good job of providing an overview of the other tools we used, here it is: Web Team Recommendations. We will be passing the torch to a new iteration of the LIS Website team soon. They will be charged with following up on the status of these recommendations (among other tasks). In addition, we’ll be sharing these recommendations directly with the people in charge of the specific areas of the site.
This post describes the usability testing that the LIS Website Team has done with one testing method. Stay tuned for a later post that summarizes our findings from direct feedback, surveys, observational testing, and this method.
The LIS Website Team used a service called Usabilla, which lets you quickly design usability tests for web pages: respondents answer each question by clicking the spot on the page where they would go. The application is described in this video:
The information gathered with this tool is highly subjective when there are few responses. The LIS Website Team has made recommendations for action based on each of the questions, but we leave it up to the people responsible for each area of the site to decide whether to implement these recommendations based on the number of people who took the survey. We also expect that the experts in each area of the site will draw their own conclusions from the data.
Here are the numbers of respondents for each test:
Curricular Technology: 15
LIS Homepage: 38
To see a heatmap of clicks for each question, click the link for that question. Our recommendations based on the responses are included below each question.
The LIS Website team has set up four quick tests to see if we’ve placed links to resources and information in the right place on the page and used the correct labels. For each test, you’ll be asked 5 questions like, “Where would you click to find out when the next Cookie Night will be?” You can click anywhere on the screenshot and can leave multiple clicks for each question. To add a comment to one of your clicks like, “I’d click here, but only because I know to find Cookie Night information on the blog…” you can click the plus (+) sign above and to the right of your placemark.
We’ve created one test for each of the four areas of the LIS Website. Each test has a different set of five questions. A test should only take 1-2 minutes to complete. Thanks for your help!
The LIS Website team invites LIS Staff to help us out with our usability testing activities. Many of you have already been involved–either through your work during the building phases of our site, or by sending us feedback about the site–and we thank you for your input. We will be incorporating these observations into our recommendations for changes and adjustments to the site.
As we turn our focus to usability testing in the form of observational testing sessions, we want to provide the opportunity for interested LIS staff to participate. The LIS Website Team’s current usability testing plan involves using an audio/video/screen-capture tool, and coordinating and conducting testing with student, faculty, and staff testers (along with other methods).
Please note: if you submitted questions to the usability testing form that was previously advertised, you do not need to resubmit them—we have them in hand and will incorporate them into our testing. If you have additional tasks or questions you feel are essential for testing, please send them to email@example.com. If you are interested in helping us with the testing, please let us know by emailing the team. Thank you!