Usability, Experience, and Progress Study

From Wikimedia Usability Initiative
Revision as of 13:03, 2 November 2010 by 77.64.170.2 (Talk)

This is an internal document.

While making small fixes to this page like fixing typos and dead-links is encouraged, any changes which significantly modify the information of this page should be suggested on the discussion page instead, as this is an internal document.

The Usability Initiative conducted its Usability, Experience, and Progress Study in early October 2009 in San Francisco, CA. Interviews were conducted on October 14th and 15th.

The Project Team partnered with Bolt Peters, a local user experience consulting firm, and Davis Research, a California-based recruiting firm (whom we highly recommend), both of whom we worked with in our previous study in March. We conducted 8 in-person interviews focusing on the editing experience and process, and its successes and failures, particularly to evaluate the changes the Initiative implemented in its first two releases (Acai and Babaco). We also explored novice users' experiences with, use of, and interaction with templates, syntax, and other complex formatted content.

Call for Proposals

We reached out to local and North America-based UX and usability firms with a call for proposals.

Goals

The primary research goals were to:

  • Validate the changes implemented by the Usability Initiative: changes made to the editing process that were intended to improve the editing experience and reduce interface barriers to participation. These changes include the features bundled into our Acai and Babaco releases.
  • Of those changes, specifically look at site-wide navigation and skinning, the new toolbar, the content-generation dialogs, and the new table of contents (edit-page navigation).
  • Identify and evaluate patterns of reactions and obstacles that novice users encounter in editing Wikipedia with respect to 'templates' and 'Infoboxes'.
  • Continue to identify barriers that Wikipedia readers-but-not-contributors encounter in performing basic tasks associated with editing a Wikipedia article (adding personal content, fixing a typo, adding a reference, contributing to a discussion page, navigating and modifying the history).
  • Continue to discover user experience patterns and issues that have not previously been identified.


Methodology

Target Audience

The primary goal of this study was to evaluate changes made since the original study, so we took a similar approach to recruiting our study participants. In the future, we may even consider bringing back some previous subjects.

As with the previous study, the target audience was limited to regular Wikipedia readers who are willing to contribute their knowledge to the site but have expressed reservations about doing so. The team aimed to select a majority (~80%) of users who had not edited but were willing to, and a minority (~20%) of users who were novice but not new editors, with fewer than 5 contributions. We preferred potential subjects whose primary reasons for not contributing appeared to be the technical complexity of the MediaWiki interface and its markup, but understood that fear of article deletion, unwillingness to have one's work edited by others, lack of confidence in one's 'expertise', unfamiliarity or discomfort with collaborating, and philosophical differences are also significant inhibitors to Wikipedia readers' contribution.

Participant No. Has not edited, but is willing to Has made < 25 contributions Male Female Under 20 21–30 31–40 40+ Uses WP every day Uses WP > Once a week
01 X X X X
02 X X X X
03 X X X X
04 X X X X
05 X X X X
06 X X X X
07 X X X X
08 X X X X

Recruiting

Using Ethnio, we recruited 1,181 San Francisco residents for the in-person testing. A banner was displayed at the top of 1 in 1,000 Wikipedia page views for 96 hours, leading Wikipedia readers to a short welcome message and survey (shown below). Based on our criteria (see Target Audience above), the 1,181 users who responded to our survey were filtered down to 486 viable subjects.

Our team again partnered with Davis Research (whom we can't say enough good things about). Davis contacted and phone-screened these participants based on further discussion, explanation, and criteria, as well as on their age, gender, occupation, and other demographics. Davis also screens for talkativeness and openness in discussing thoughts and actions. From Davis we received our 8 study participants, with 3 wait-listed backup candidates.


Screenshot of Ethnio recruit survey
Screenshot of Ethnio screener invitation
Screenshot of Survey Gratitude

Lab Testing

This time around, because of our prior experience with remote testing, our smaller subject numbers, and our desire to conduct future studies (read: funds), we conducted all of our interviews in person. We returned to the Fleischman Field Research facilities in San Francisco, California. Participants, limited to those within driving distance, were brought into a room with the research interviewer and a Mac or PC laptop, based on their stated preference. Behind a two-way mirror, Wikimedia usability researchers could watch the interviewer and participant interact, hear the interview audio, view the participant's entire screen and computer interactions, and see their faces and expressions in real time as the interview was conducted. These videos (both the screen and the participant) were also captured under a Creative Commons license and can be viewed in the section labeled "Full Interview Videos".

Study Participants

Participant No. Age Gender Wikipedia Editing History Wikipedia Usage Occupation Other Info Location
01 "Judy" 46 Female Has made one edit (on public transportation) Every day Architect San Francisco, CA
02 "Mark" 33 Male Has not contributed, but is willing to Every day Environmental Consulting San Francisco, CA
03 "Alexandra" 23 Female Has not contributed, but is willing to Every day College Student and Video Producer Was researching corporate videos Oakland, CA
04 "Angelique" 33 Female Has not contributed, but is willing to More than once a week Executive Director of Non-Profit San Francisco, CA
05 "Cheryle" 32 Female Has not contributed, but is willing to More than once a week Homemaker San Francisco, CA
06 "Stephen" 53 Male Has made an edit Every day Freelance Writer Santa Rosa, CA
07 "Jack" 23 Male Has not contributed, but is willing to Every day Quality Control @ Biotech El Cerrito, CA
08 "Terrence" 21 Male Has not contributed, but is willing to Every day Student West Oakland, CA

Scripting

The moderator script was drafted to closely match the script from our original study, which was written collaboratively by the Wikipedia Usability Team, B|P, and the Wikimedia Foundation. The script originally aimed to see how users would attempt, and feel about, a collection of tasks or objectives that first-time and early Wikipedians frequently encounter in their editing process, including but not limited to: finding various modes of editing, adding personal content, looking at discussion pages, creating a new article, finding and using help, navigating an article with templates, and adding a reference, an external link, and formatting.

In this second-round study, we focused on the same tasks, but particularly observed the use (or lack of use) of the tools and changes the team has implemented. The changes include, but are not limited to: a "cleaned up" and less cluttered skin, updated search and search box location, an entirely new toolbar including updated buttons, actions, and dialogs, tabbed site-wide navigation, edit box navigation, and integrated help.

Click here for the full script.

Summary of Results

No Searching for Search

Despite a fairly drastic relocation of the search box, none of the 8 users stumbled when trying to use it. In fact, one user used the box successfully and only later was taken aback to realize it was in a different place than he was used to. The new position takes advantage of user expectations from across the web, aligning Wikipedia with many other major websites. The improved search algorithm was also a marked improvement over our last study, where we saw users give up and use Google to navigate to the Wikipedia page they were searching for.

Easier Navigation

“It was easy, and I wouldn’t have thought it would be that easy.”
All 8 users successfully found the Edit tab with a minimum of hunting when asked to begin editing. There were no open questions about page types or page actions, but 3 users "misused" the Edit tab from within an edit or preview process.

Decreased Intimidation

“Before there were a lot of tools, and I liked that they were all spread out in front of you, but this actually makes a lot of sense. I had to muddle my way through the older system, but this one seemed fine.”

“Websites don't have common sense, but programmers do.”

One notable observation from the previous study in this series was the degree of trepidation users expressed after reaching the editing interface. This time, although they persisted in referring to wikitext as code, they were much more willing to dive in and start editing. We primarily credit the significantly cleaner overall look of the page and the more modern toolbar, which has been tremendously simplified and de-geeked.

Toolbar: Visibility

“Links are so easy to screw up. I’m not sure if we've correctly typed the link markup. Ah, there are these buttons...”

“I'm trying to see what the code looks like for these other links. I guess these double brackets will make it search for that. ... Ah, the link button. So I guess this is what I was looking for earlier.”

While the toolbar provided some much-needed help to users who were lost in "code", we found that the #1 problem was simply getting users to notice it. 7 of the 8 users were inclined to find something in the existing text that seemed close to the end result they wanted and copy/paste it. It's likely that the coloring (very pale gray) as well as the positioning (the middle of a very busy page) contribute to this stealth-toolbar effect, but it's also possible that users simply don't understand the "show me" nature of the different tools. Make the tools/toolbar stand out.

See our prototypes to address this issue here.

Toolbar: Links

“Uh-oh, I think I may have made the wrong kind of link before. I'll go to the preview window to see if this is a link. It would have been nice to just edit it in the preview.”

Of the dialog boxes that have been implemented, the one that most consistently resulted in errors or confusion was the link insertion dialog; in fact, 6 of 8 users initially made some error in using it. The most common error was failing to notice the two top tabs for internal and external links. Not having seen these, users quite happily worked toward creating their link without any worries about the nature of the link. This behavior suggests that the distinction is a purely technical one. Unify the wiki/external link tabs in the link dialog: since the single button on the toolbar indicates one tool, and most users did not differentiate the type of link they were making and missed the tabs altogether, eliminate the two tabs in the link dialog.

See our current prototypes addressing this issue here.

Toolbar: Tables

“Table, I’m not sure what that is. I'm going to save it and then see, because this preview is too confusing.”

“This is basically the programming for the table. I don't see where to add the information. ... It came out wrong, not what I was expecting, but what I do see is a table with historical figures, so I'll go into the editor and look at that. ... If I hadn't looked at the other table, I wouldn't have known how to do this."

“Okay, I just got a bunch of code instead of a table.”

The insertion of tables proved to be one of the most problematic tasks, tripping up 4 of 8 users, even after they discovered and began to use the toolbar. The dialog box seemed straightforward enough, but once the wikitext was inserted, confusion set in. Users were not sure which part of the text to actually edit, with some interpreting the “sample text” (Row 1, cell 1) as labels, and attempting to insert table contents between the markup dividers. Confusion was exacerbated when users flipped to the article view (either in preview mode or by publishing), and couldn’t find anything resembling the table they were expecting. Bar none, they were looking for a table with grid lines and a background color.

Toolbar: Etc

The icons for "Bold" and "Italic" break the web- and application-standard convention of using the letters B and I, resulting in widespread disorientation among users. Even those who correctly guessed their purpose were not confident they had guessed correctly, and had to hover or click to confirm their assumption.

The term "Advanced" is a misnomer, implying that the functionality contained within will be primarily of interest to advanced users. In reality, the tools extracted to this area of the toolbar are simply less commonly used. The term "advanced" in this context can be a barrier to exploration, preventing some novices from ever discovering the tools tucked away there.

The "Embedded File" tool proved to be one of the least easily understood tools of this study. Users generally avoided it, finding that even the sample text, when inserted, provided no additional insight.

Navigable Table of Contents

“This is different, it's got these hot-links [the table of contents]. That's nice."

“On the right, this would take me to the live article.”

Fewer than half of our users recognized the table of contents for what it is; most did not seem to notice it at all. Those who did notice it tended to think it would bring them to that section of the live article. Additionally, when a heading is clicked in the table of contents, the article text is not correctly aligned within the editing window, sometimes being off by as much as several paragraphs.

See our current mockups addressing these issues here.

Templates

“I’m completely intimidated by that.”

“It took me a minute to figure it out, but now I know that it’s all laid out in computer editing lingo, so that it creates a box.”

“That’s a whole lot of html that makes my eyes dance all over the screen."

While the changes made as part of Babaco have significantly reduced the intimidation factor of editing ordinary pages, this finding did not hold when users encountered the more complicated template code. Users were confident using scrolling and the right-hand links to find the item they wanted to change in a single-column page, but introducing a second column flummoxed them. Their behavior returned to the uncertainty and unwillingness to make changes that we have seen in earlier releases.

Some participants were also uncertain about the language—terms like “transcluded” are new to most users and add to the intimidation factor. Make it clearer that users are in a template that requires “advanced” or “special” markup.

Previewing

“I'd want to go back to the read page and look at it without all the code, to figure out where I want to put my part.”

“I was expecting it to be more like a Word doc.”

“I might go to word, and make the paragraph there first. It's easier to spellcheck and look at it without all the links and stuff.”

Only 2 of our 8 users realized that in preview mode, the editing window was still on the page, just pushed to the bottom, and 1 of them didn’t realize it for quite a long time.

Saving

“Save puts it in some place on the wiki server where I can get in and edit it, but it’s not published.”

We’ve all had it drilled into us - “Save your work!” All 8 of our users displayed a desire to do this somehow before proceeding to a different page of any sort, and most assumed that “Save Page” was the correct way to do so - often even performing it as a preliminary step before previewing their changes.

Study I and Study II

The study confirmed for us that our changes are progressing in the right direction - toward greater ease of use for novice users. The majority of such users we interviewed in this study showed less intimidation, completed tasks faster and with greater ease (and sometimes even with pleasure!), and employed the tools and features we have implemented without instruction and with success. The users did, however, illustrate many imperfections in those tools (e.g., the separation between internal and external links in the link dialog box) along with a bug or two ;). But it should please us and you to know that we are convinced these tools are worth modifying and updating - we will continue to tweak and develop the new toolbar and its dialogs, in-situ help, the Vector skin, navigation across the site and from an edit page, search, and more.

That being said, our study illustrated how much work is still left to be done. Our script focused on tasks that allowed us to evaluate a user's ability to find and employ the tools we have deployed (in the interest of evaluating our work), but our interviews brought to the surface the myriad issues that continue to create barriers for new users. We again saw users struggle to get their questions answered (at least the ones that couldn't be solved with our toolbar cheat sheet), suffer from never-ending link options (we call them "rabbit holes"), adopt makeshift preview and save processes, become overwhelmed with textual instructions, messages, and warnings, and - within the course of a one-hour study - have their first edits reverted without dialogue or explanation.

Though we cannot address all of these issues within the time, resources, and scope of our project and grant, we hope the issues brought to the surface will be addressed through future work, other Foundation projects, new grants, and community members and volunteers. The solicited feedback and interviews, and your unsolicited feedback on this page and project wiki, continue to help us do what we do better.

Full Interview Videos

User 1
User 2
User 3
User 4
User 5
User 6
User 7
User 8
User 2 Search Clip

Our eventual goal for all Wikipedia Users


Accounts

Please note that we created user accounts to isolate the work and changes made by each test subject. These accounts were: Usability X (for X = 1, 2, ..., 8).

Notes

The team's rough notes from the interviews are here.