User studies


Article Sections

  1. Context
  2. Methods and methodology
  3. Portrait of the user
  4. References

Methods and methodology

User studies are subject to a very wide range of research approaches and methods, from small-scale qualitative studies to deep log analysis. I shall start with attempts by Wilson and others to develop a theoretical framework, before going on to consider the main research methods.

Theoretical framework

Beaulieu (2003) describes how Sheffield academics Wilson and Ellis helped establish a conceptual basis for user studies. Wilson’s (1981) work on user studies came out of a very long project on communication and information flow in social services departments. His model of information behaviour proved highly influential: according to Bawden (2007) it has been cited over 100 times. Whereas researchers had previously examined use, Wilson preferred to look at the actual process of information seeking, in other words the why rather than the how, and at the user in his or her environment, social, personal and professional. He provided graphics of his model as follows:

 

Figure 1. Model of information behaviour (Wilson, 1981)

Figure 2. Model of information seeking (Wilson, 1981)

Ellis derived his model (1989, quoted in Beaulieu, 2003) from his own empirical work (later verifying it with further studies), and explored patterns of information seeking such as starting, chaining (following citations), browsing, differentiating (filtering out information one did not need), extracting, verifying, and ending. Wilson subsequently revised his own theory to incorporate Ellis’ research.

The last 2006 issue of the Journal of Documentation was devoted to revisiting significant research papers 25 years on, including Wilson’s paper. In the same issue, Wilson commented ironically that although 25 years ago he had pleaded for more qualitative studies, this had since become the dominant paradigm in the area, leading to a plethora of small-scale studies. What was now needed, he argued, were large-scale research programmes lasting several years.

"I do not believe that qualitative methods should be the sole means of investigating information phenomena, and I regret the extent to which researchers today carry out one qualitative study after another without attempting to put the ideas that emerge to the test of the Law of Large Numbers" (Wilson, 2006a).

He also pointed to the proliferation of theoretical perspectives in the area and called for a "common theoretical framework", offering his own work in activity theory. This sees human activity in terms of its cultural, social, and historical context, including the cultural rules, norms of behaviour, motivation, etc. which determine activity (Wilson, 2006b).

Prabha et al. (2007) provide a useful account of some of the models of information behaviour (although they do not cite Wilson’s 1981 paper), including that of Kuhlthau on information seeking as a sequence of intellectual stages – becoming aware of one’s lack of knowledge, identifying a problem area, defining the problem, collecting relevant information, and presenting what has been learnt. (Kuhlthau’s work is also discussed in the information management viewpoint, Niels Jørgen Blåbjerg and the Learning Objects Web development project).

Prabha et al. use role theory and rational choice theory to contextualize their own work. Role theory looks at the various roles people play as a result of their social positions. Rational choice theory, which has strong roots in economics, proposes that individual actions are the basic unit of social life, and that individuals make a cost-benefit analysis before selecting a particular course of action.

Because one can never know the full range of choices, the concept of "satisficing" describes the judgement that one has enough information to satisfy a need. Making that judgement also involves a cost-benefit analysis of the effort required to look for more information.
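As a purely illustrative sketch (not a model given by Prabha et al.), the satisficing decision can be written as a simple stopping rule: keep searching only while the need is unmet and the expected benefit of consulting another source outweighs its cost. All names and numbers below are hypothetical.

    # Toy satisficing rule: an illustration only, not drawn from the literature.
    def keep_searching(information_gathered, need_threshold,
                       expected_benefit, search_cost):
        """Return True while consulting another source still seems worthwhile."""
        need_unmet = information_gathered < need_threshold
        worth_the_effort = expected_benefit > search_cost
        return need_unmet and worth_the_effort

    # The searcher stops once enough information is in hand, or once another
    # search would cost more effort than it is expected to yield.
    print(keep_searching(0.6, 0.8, expected_benefit=0.3, search_cost=0.1))   # True
    print(keep_searching(0.85, 0.8, expected_benefit=0.3, search_cost=0.1))  # False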

Steinerová and Šušol (2005) put forward a model which looks at the how, rather than the why, of information seeking, based on their major study of the information behaviour of users of academic and research libraries. Their resulting data led them to propose the following types of user:

  1. User Type S: strategic and pragmatic, viewing information horizontally and preferring a broad range of information sources. They favour well-established, easily accessible sources, which suit their minimum-effort style, and like to be guided by mediators (such as librarians) and by information systems, using formal criteria to judge relevance. Surfers and bouncers, they move quickly from one type of electronic resource to another, and tend to suffer from stress over the uncertainty of searching. They are more likely to be female, and to be students.
  2. User Type A: analytical, with a deeper and more open approach to information seeking and a natural curiosity. They use multiple methods of access and search across different types of resources, drawing on both formal and informal sources. They are less likely to find information searching stressful or to begrudge the time it takes, and they make links with their own knowledge. They like to work independently, and are more likely to be male, and to be researchers or academics.

There is also a Type P, which is a mixture of the two.

Quantitative methods: surveys and experiments

The early empirical studies undertaken at Sheffield in the 1970s (Beaulieu, 2003) used a combination of methods, including questionnaires, interviews, observation, and analysis of secondary sources such as reference enquiries. Questionnaires continue to be used alongside other methods in large-scale studies, such as the one quoted above (Steinerová and Šušol, 2005), in which the authors surveyed 793 subjects across 16 academic and research libraries in Slovakia, with a 79.3 per cent response rate. The five-part questionnaire combined Likert-scale responses with open-ended questions. Interestingly, they comment that they see quantitative methods only as a starting point, and that qualitative methods should be used for more in-depth findings.

Surveys are routinely undertaken in universities in order to target resources effectively and justify expenditure, and libraries are no exception. Often a standard instrument is used, such as LibQUAL+™ or the SCONUL satisfaction survey, with a view to measuring satisfaction with services. LibQUAL+™ was initiated in the USA, but is also used in Canada, the UK and Europe. It contains a number of standard questions on service quality, grouped under headings such as "Affect of service", "Information control", and "Library as place", and it is possible for a particular library to add its own questions. The SCONUL survey contains a number of sections designed to capture demographics, use of services, and overall satisfaction with service levels. Creaser (2006) provides a useful comparison of these survey instruments, and Kyrillidou and Persson (2006) describe a survey which took place at Lund University as part of a wider application of LibQUAL+™ to five Swedish institutions (two universities and three hospital libraries), and which prompted 372 responses.

Details of both instruments can be found on their respective websites.

Usability, the extent to which a product can meet users’ goals effectively, is a popular area of user studies, and one where surveys are often combined with other methods. Kani-Zabihi et al. (2005) list these methods as:

  • diary studies,
  • questionnaires,
  • observation,
  • usability testing,
  • focus groups,
  • interviews,
  • transaction logs.

They carried out an experiment in which 48 participants completed tasks and then filled in online questionnaires. In their study of the usability of the International Bibliography of the Social Sciences database, Lockyer et al. (2006) also had small groups of students complete tasks, and then used an online survey for triangulation (chosen for its ability to reach large numbers, it returned 957 responses, of which 833 were analysed).

Qualitative methods

In their introduction to the "Human information behaviour" special issue of the Journal of Documentation, Spink and Foster (2007) comment:

"The methodologies used in the studies presented here demonstrate in particular that sample size and choice of data collection method need not be considered limiting factors to the study of complex information contexts. Indeed we can learn as much, or more, from the observation of individuals, of participation in small group, or from theoretical debates as from large-scale scientific approaches."

It is somewhat ironic that this issue, with its preponderance of qualitative methods, immediately follows the one in which Wilson (2006a) calls for more large-scale studies. However, it does bear out the trend for recent user studies to be small-scale, in-depth investigations of particular groups of users, using qualitative methods. Such studies, while they do not offer the breadth Wilson considers necessary, do offer depth and the ability to probe below the surface.

Stein et al. (2006) used focus groups to investigate their hypothesis that graduate students are "in the most pain" and have difficulty with research methods. To gain an overview, and to help them frame questions, they posted a survey on the Web. Gelfand (2005) used a variant of the focus group in the form of classroom discussion.

Semi-structured interviews are also a popular technique, used with various groups including partially sighted people (Beverley et al., 2007), a public library knitting group (Prigoda and McKenzie, 2007), and graduate students and librarians on how the former use the "affordances" the library offers (Sadler and Given, 2007).

You learn a lot from seeing how people do things, so it is not surprising that observation is a popular method. Its value in usability studies is immediate, as you can watch users interacting with the product. A couple of examples were quoted above; another, smaller-scale usability study was conducted on the University of Minnesota library's searching facilities (Sadeh, 2008). The small sample (two groups of eight) allowed a much more free-flowing and spontaneous observation, in which users were encouraged to verbalize their thought processes during a set of predetermined exercises. Observations of people at work have a self-evident value, for example McKnight's (2007) study of the information-seeking behaviour of a group of critical care nurses over a total of 50 hours.

Deep log analysis

Totally different in character from qualitative studies, and presumably going some way towards meeting Wilson's plea for more large-scale studies, is deep log analysis (DLA). Log data can be, and have been, analysed to reveal details of use, but deep log analysis is so called because it links transactional server log data with another dataset containing demographic information, thus also providing information about the user. Nicholas et al. (2008) conducted a study of users of the ScienceDirect database, a comprehensive collection of scientific, technical and medical journals, books and bibliographic information. Data derived from the logs were considered alongside data from an online questionnaire which went out in May 2005 to 49,266 scholars, to whom other (non-Elsevier) authors were added to create a representative population. IP addresses were then matched between respondents and log users. Log data showed, broadly speaking, who was viewing what, and when. There were a total of 757 matches between log data and completed questionnaires, from a range of disciplines, the most popular being the life sciences followed by the social sciences, and a wide geographical spread (the USA and Western Europe each accounted for 29 per cent of respondents).
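To make the linking step concrete, here is a minimal sketch, in Python with pandas, of joining transactional log records to questionnaire respondents on IP address. The file and column names (ip_address, discipline, and so on) are assumptions for illustration, not details reported by Nicholas et al. (2008).

    # Hypothetical sketch of the deep-log-analysis linking step.
    import pandas as pd

    logs = pd.read_csv("sciencedirect_logs.csv")    # one row per view or search
    survey = pd.read_csv("questionnaire.csv")       # one row per respondent

    # Keep only log records whose IP address matches a questionnaire respondent.
    matched = logs.merge(survey, on="ip_address", how="inner")

    # Broadly speaking: who was viewing what, and when.
    print("Matched respondents:", matched["ip_address"].nunique())
    print(matched.groupby("discipline")["ip_address"]
                 .nunique()
                 .sort_values(ascending=False))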

The profiles yielded three different types of data as follows:

1. ScienceDirect log data recording searching, browsing and downloading actions:

  • Type of article item viewed (abstract, PDF, etc.)
  • Publication status
  • Publication year
  • Number of unique journals viewed in a session
  • Subject of journals used
  • Number of visits.

2. ScienceDirect log data recording search and navigation behaviour:

  • Number of searches in a session
  • Number of returned hits.

3. Questionnaire data:

  • Subject background and other demographic data
  • Attitude towards various key statements.

The results revealed a rich portrait of electronic information behaviour by subject area (for example, over a quarter of the searches by mathematicians were for older, pre-1998 articles). The authors term this an exploratory study, but the results have already shown similarities with other findings.
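Continuing the hypothetical sketch above, a finding such as the share of use of older, pre-1998 articles in each discipline could be derived from the joined table roughly as follows (again, all file and column names are assumptions):

    # Hypothetical continuation: share of pre-1998 article views by discipline.
    import pandas as pd

    logs = pd.read_csv("sciencedirect_logs.csv")
    survey = pd.read_csv("questionnaire.csv")
    matched = logs.merge(survey, on="ip_address", how="inner")

    matched["pre_1998"] = matched["publication_year"] < 1998
    share_old = matched.groupby("discipline")["pre_1998"].mean()
    # Nicholas et al. report over a quarter for mathematicians.
    print(share_old.sort_values(ascending=False))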


