What happens when we don’t get to know our users?


One of the challenges I’ve experienced in my career as an information architect has been that clients don’t have confidence that IT can create technical solutions to their business pain. When solutions are developed without regular, frequent user input, we risk creating tools and systems that don’t meet users’ needs. Only when we understand how users conduct their work can we create valuable, usable solutions that address their pain. We may even be able to create more innovative solutions once we become more familiar with their day-to-day processes and workflows.

When users are left out of the solution design and development process, three things may happen:

We risk wasting time and money designing, building, and testing tools or services that will never be used. I have collaborated with clients to create online communities of practice with the goal of expanding information sharing across the institution. The key to these sites is that they are accessible to all. Yet, in many cases, they are rarely accessed. There could be many reasons users don’t visit the sites, but until we understand those reasons, we won’t be able to create a solution that becomes embedded in users’ daily practices.

This is where good data analysis comes in. With data that gives us a comprehensive picture of how our sites are being used, we can narrow the potential causes of disuse. Is the content updated regularly so users find new information each time they visit? Is it authoritative, correct, and of use to our audience? What is different between the sites that are heavily used and those that aren’t, in terms of content, maintenance, and outreach/promotions? Each question’s answer may suggest a different solution.

When we analyzed the usage data for these communities of practice, we noticed a trend: sites would have high visit numbers after their launch, but that interest was not sustained. Over time, we discovered that sites with frequently updated content sustained more interest than those with static content. Further, the sites with more authoritative content were visited more than those without. This suggested specific improvements that, when implemented, increased site usage again.

We risk frustrating users when we create systems or tools that do not address their business pain. From the outside, it may look as though we don’t care about the actual problems our users are having, or about how we can support the ways they conduct their work. They may conclude that we are creating tools arbitrarily, based on a best practice or perceived need that does not match their actual needs.

Here, user feedback techniques can help. By interviewing users about their work, we can discover what tasks they are trying to achieve, and how they achieve them. What information do they need access to? How do they find it? Can they retrieve the information they need? When they get stalled in their work, why is that? What do they do to get around it? How do they use the tools they already have, and do they kludge anything to get the results they need? Each question here gives us an opportunity to develop more needs-based solutions.

Talking directly with users about their experiences with a search tool that aggregates documents, we learned that many power users had developed their search habits around an older system that had archaic rules for formulating searches. When we shared our more simplified version, they had trouble finding what they needed. We’re in the process now of mapping the old rules to the new system. By finding out how our users worked, we’re able to deliver a more satisfactory service.

We risk losing credibility and building a reputation for being out of touch with our users. Sometimes, this can go as far as creating an environment where multiple teams work toward the same goal from different starting points. We can create multiple products to solve the same problem, which adds to users’ confusion and frustration.

This is why we need to involve clients and users in the design process. Including users throughout development and design is crucial if we want our solutions to become embedded in users’ daily work. When combined, data analysis, user feedback, and strong relationships with clients and users make it much easier to develop solutions that users find valuable.

This may mean it takes more time to discover and understand changing requirements, but in the end the solutions will be much more valuable, because users have been able to share what works for them and what doesn’t. At the end of the day, we are trying to make their work easier. They will be the best judges of whether we have succeeded. Over time, successful collaborations will build clients’ confidence that we will develop the right solutions when (or even before) users need them.

Image from Illustrated London News, 1870, by way of Wikimedia Commons

Where book history & digital preservation meet: the importance of users in meaning making

Book history and digital preservation

Book historians and digital preservationists have certain things in common. Both seek to preserve cultural heritage and information; both work to provide access to objects and materials for the future; and both are concerned with the authenticity, interpretation, context, provenance, and physicality (or intangibility) of the materials they work with and study. In this post, I examine a connection between book history and digital preservation that lies in meaning making and the interpretation of information and objects. Scholars of book history have long debated the importance of individuals other than the author (such as readers) who influence the interpretation and reception of a text, and digital preservationists are still discussing the value for users of providing contextual metadata with digital objects.

On meaning making

The meanings of objects change over time and across distance, with cultural differences, age differences, and individual world views. As Edwina Taborsky (1990) states, the meaning of an object remains stable and communicable only within a certain spatial and temporal area. Because meaning depends on social context and on the user’s reception of an object (digital or print), the user’s meaning will not be the same as the original intended meaning. Robert Darnton (1982), a well-known historian of the book, agreed that texts work on the sensibilities of readers in different ways, using as examples a 17th-century London burgher and a 20th-century American professor. Similarly, digital content is used and reused in different ways in different contexts (Beaudoin 2012). The greater the temporal and spatial distance between the original context of the content and that of the user, the greater the difference in the interpretation or meaning of the content. Therefore, the digital preservationist (or museum exhibit curator, or rare book librarian) has a professional obligation to surround digital and historical objects with contextual information, to recreate the original cultural meaning of the object as closely as possible.

For this reason, I believe in the importance of meaning making through metadata. Providing historical and cultural contextual metadata (as well as technical and other metadata) is key to improving the ability of future users to understand a digital object. If there are two separate research camps in the study of contextual metadata, as Faniel and Yakel (2011) state, then I’ll raise my tent with the reuse folks, who examine meaning making through metadata. The other camp, digital curation, focuses its attention on technical metadata (Beaudoin 2012).