vol. 19 no. 4, December, 2014


Social question and answer services versus library virtual reference: evaluation and comparison from the users' perspective


Yin Zhang
School of Library and Information Science, Kent State University, Kent, OH 44242, USA
Shengli Deng
School of Information Management, Wuhan University, Wuhan, China


Abstract
Introduction. In recent years, the introduction of social question and answer services and other Internet tools has expanded the ways in which people have their questions answered. There has been speculation and debate over whether such services and tools are replacing library virtual reference services.
Method. Most previous research comparing social question and answer services and virtual reference has been conducted from the perspective of libraries or library users; the viewpoint of social question and answer services users has not yet been explored. This study surveyed Yahoo! Answers users on their awareness, use, and evaluation of the social question and answer services site and of virtual reference.
Analysis. The survey data from multiple-choice questions, fill-in, and open-ended questions were analysed using SPSS for a quantitative summary.
Results. The findings of this study show that social question and answer services and virtual reference have their own strengths and weaknesses in serving various user needs and in meeting the expectations of an online or virtual service, based on important user-identified considerations.
Conclusions. The notion that social question and answer services will take over virtual reference is not supported. As they stand, virtual reference and social question and answer services have great potential for collaboration in meeting user information needs. The practical implications of the findings to virtual reference are also discussed.


Introduction

The Internet has enabled people to use various tools to ask for help, and visits by Americans to question and answer websites have increased in recent years. Between 2006 and 2008, visits to social question and answer services increased by 889% (Hitwise, 2008). The number of people who visited Yahoo! Answers, as measured by unique browser sessions, totalled over 16.4 million in the month of September 2013 alone (Quantcast, 2013).

In the last decade, library virtual reference has become increasingly commonplace in providing reference services to library patrons. Despite the popularity of virtual reference, national trends show that reference librarians are answering fewer and fewer questions every year (Tyckoson, 2011). The Internet Public Library, which provides librarian-recommended Internet resources and reference help to the public, has also received fewer questions in recent years (IPL, 2011). This is backed by other reports of declining library reference transactions (Kuhl, 2012).

Library communities speculate and debate over whether search engines, social question and answer services, Wikipedia, Facebook, and other social networks are replacing library reference services. Social question and answer services have increasingly sought to supplant the expertise of the reference librarian with the wisdom of the crowd (Golbeck and Fleischmann, 2010). Some have even predicted the death of library reference services overall (Henry, 2011; Pomerantz, 2010). However, others argue that reference services are not dead but have experienced a paradigm shift from physical to virtual reference services, and virtual reference needs to prove its relevance and value as a core, high-demand reference service (Young, 2013).

Problem statement

Most previous research comparing virtual reference and social question and answer services has been conducted from the perspective of librarians, library users, or researchers. Studies from the viewpoint of social question and answer services users are sparse. A closer look at this user group, and at how they compare social question and answer services with virtual reference, will offer additional insight into user behaviour in seeking information online and into the important considerations for a virtual service. This line of research is also crucial to understanding social question and answer services users' awareness, use, and assessment of virtual reference, and to shedding light on the relationship between the two services in meeting user information needs in the Internet environment.

The study reported in this paper addresses this research gap and identifies areas for improvement within current virtual reference by exploring the following research questions:

  1. What questions do social question and answer services users ask while using the online service?
  2. How satisfied are users with answers received from social question and answer services?
  3. Are social question and answer services users aware of virtual reference and have they used the service?
  4. How do social question and answer services users evaluate and compare social question and answer services and virtual reference?
  5. What are the important considerations of a virtual service from a user's perspective?

Literature review

Virtual reference refers to all reference services provided online, whether synchronous (e.g., instant messaging) or asynchronous (e.g., e-mail and LibAnswers), or through SMS (text). Other names for the service include digital reference services, online reference services, and real-time reference services (Nicholas, 2011). This literature review provides a synopsis of previous research involving virtual reference, its role in meeting user information needs, and its connection to social question and answer services.

Change in user reference needs for virtual reference service

Library virtual reference continues to evolve along with changing technology and user expectations (Kilzer, 2011). Today's users expect to get help where they are, minimising the overall effort associated with seeking information, and to identify the most convenient source to meet their information needs (Chow and Croxton, 2012). Although they may be capable of using a search engine to answer simple fact-based queries, users may still have difficulty sorting through the results of a research-based query, and this is where virtual reference might step in (Numminen and Vakkari, 2009; Solorzano, 2013).

In a longitudinal study of live chat and instant message reference services, Radford and Connaway (2013) observed a significant decrease in subject searching but an increase in procedural questions in the same time period. The authors suggested that the downturn in the subject reference types might be partially because of online tools such as search engines, Wikipedia, or other reference guides. The authors also suggested that changes in the makeup of virtual reference users could be another contributing factor.

In response to the evolution of various communication technologies, libraries have been offering virtual reference by telephone, e-mail and online chat to make the service as convenient as possible for users (Luo, 2012). The increasing availability and use of electronic resources allow virtual reference providers to extend their reach and expand user horizons (Agosto, Rozaklis, MacDonald and Abels, 2011).

Users' preferences and satisfaction with virtual reference

Research shows that users' preferences for virtual reference are dependent upon the type of question, the type of information seeker, and the convenience offered across library user groups (Chow and Croxton, 2012). Studies have found that convenience was considered an important reason for users to access virtual reference services, and that it was critical to library patrons' willingness to use virtual reference (Connaway, Dickey and Radford, 2011; Nicol and Cook, 2013).

One challenge for current virtual reference is its low usage by patrons, despite the fact that many studies have established very high user satisfaction and willingness to recommend the service (Pomerantz and Luo, 2006). The decline in usage has been blamed on the availability of virtual reference-like services on the Internet, users' over-confidence in their own searching skills, and the library anxiety involved in talking to a librarian (Connaway, Radford and Williams, 2009). Additionally, the visibility of the service and the individual user's willingness to use it are important factors contributing to virtual reference use (Mu, Dimitroff, Jordan and Burclaff, 2012).

Connaway and Radford (2011) suggest that user reassurance and a positive user experience are key to addressing the virtual reference use challenge. They point out that without an explicit, live introduction to virtual reference, many users will not bring the trust and appreciation they feel for librarians into an online experience. Users and librarians may greatly value how they are treated, in addition to the accomplishment of the encounter's goal of facilitating information discovery and use (Radford, Radford, Connaway and DeAngelis, 2011).

Previous research has shown that user perception and user satisfaction are very important in the virtual reference process. Such studies focused on library users' satisfaction with virtual reference. However, little is known about social question and answer services users' perception of virtual reference in their searching process (especially among users who use both services) and about what role question type may play in this process.

Socialisation of virtual reference

In essence, virtual reference is an interactive process between librarians and users. Each interaction and query brings something new to the librarian and the user (Tyckoson, 2011). The effectiveness of social interaction, for example, in asynchronous e-mails, may depend on the interpersonal and effective communication that takes place between librarians and users (Park, Li and Burger, 2010). The tools to provide virtual reference will continue to change, but those key elements will remain the same—forming a connection with users and helping them find what they are looking for (Solorzano, 2013). A lack of social cues is seen as an aspect of chat reference to which librarians must adapt (Gronemyer and Deitering, 2009).

Online reference interactions present some unique challenges. Librarians who are part of a collaborative service may have difficulty answering questions about local policies at libraries in other parts of the state or country (Bishop, Sachs-Silveira and Avet, 2011). For reference research, online social reference exemplifies a new stage that involves a transition from question negotiation to an online collaborative group effort. Unlike library virtual reference, social reference is conducted through cooperative effort and open participation, making reference help much more convenient to obtain (Radford, 2008). The socialisation of virtual reference requires virtual reference providers to embrace this change and its challenges.

Differences between social question and answer services and virtual reference

At present, there is an increasing interest in understanding social question and answer services and virtual reference's similarities and differences, and most importantly the lessons that could be learned from one to enhance the other (Gazan, 2007; Kitzie and Shah, 2011). Additionally, research involving both social question and answer services and virtual reference suggests that it is important to understand how users view the two services with regard to meeting their reference needs (Shah and Kitzie, 2012).

In a recent study, Radford et al. (2013) found that users saw virtual reference services as authoritative and objective, more synchronous, and as receptive to more complex questions. Social question and answer services, however, were viewed as asynchronous, less authoritative, having simpler questions, and providing more opinionated answers. In terms of performance, Shah and Kitzie (2012) found that virtual reference outperforms social question and answer services in the following aspects: customisation, quality, relevance, accuracy, authoritativeness, and completeness. On the other hand, social question and answer services outperformed virtual reference in the following areas: cost, volume, speed, social aspects, engagement, and collaboration. Some additional differences between virtual reference and social question and answer services recognised by previous research include the dyadic reference encounter vs. teamwork, relevant boundaries vs. blurred boundaries, limit of authority vs. empowerment of authority, and institutional human and information resources vs. volunteers and free online information (Shachaf, 2010).

It should be noted that most current studies that compare social question and answer services and virtual reference have focused on librarians and college students, who are more likely to be familiar with library virtual reference but not necessarily social question and answer services. This is shown in the study by Shah and Kitzie (2012) where the majority of the students did not use social question and answer services to ask or answer a question. Studying social question and answer services users regarding their experience with social question and answer services and virtual reference can yield new insights when comparing the two services in serving user needs. Also, research samples of previous studies tended to be small due to their qualitative nature, with no more than forty participating subjects from one institution. The samples may not be representative of virtual reference and social question and answer services users, and findings may not be generalisable.

Currently, an interesting avenue to explore is how virtual reference can complement social question and answer services to provide more comprehensive information when needed. Such collaboration may allow social question and answer services to offer premium (paid) content to their large user base, and virtual reference to create a new revenue stream to support sustainability (Radford, Connaway and Shah, 2012). Kitzie, Choi and Shah (2012) conceptualised social question and answer services and virtual reference services under one umbrella, online question and answer services, and investigated the two services together. The findings indicated that social question and answer services and virtual reference services may in fact be directly comparable and, by learning from each other's strengths and weaknesses, be improved as a whole.

Methods

Yahoo! Answers as the social question and answer services site

For this study, Yahoo! Answers was chosen as the social question and answer services site for several reasons:

Sample and data collection

An online survey was conducted to understand social question and answer services uses and to seek users' input on their experiences and comparisons between virtual reference and social question and answer services. The survey sample consisted of 1,431 Yahoo! Answers users, who posted or answered a question in the Yahoo! Answers education and reference category during the time period of February 27 to March 30, 2013. The subject category was chosen because its theme is similar to virtual reference.

The survey was hosted in Qualtrics, a secure online survey tool available to the researchers through an institutional subscription. The sample subjects were invited through the Yahoo! Answers e-mail platform to participate in the survey between June 21 and July 15, with one follow-up reminder in the middle of that period. No incentive was offered for participation. The survey was designed to probe uses and comparisons of Yahoo! Answers and virtual reference, and the underlying motivations and concerns in the informational and social process. The survey, which contained multiple-choice, fill-in, and open-ended questions, covered the following areas, with details about specific questions given in the findings section where the related results are reported:

Out of the 1,431 users in the sample, thirty could not be reached because their accounts were unavailable at the time of the study. After one round of follow-up reminders, 230 responded to the online survey, a response rate of 16.4%. The basic demographics of the respondents are summarised in Table 1. Among those who participated in the study, 58% were male and 42% were female, and 65% had a college or post-graduate degree. Their ages ranged from 18 to 82, with 18% being 18-19, 15% in their 20s, 10% in their 30s, 14% in their 40s, 20% in their 50s, 19% in their 60s, and 5% being 70 or older. The participants' Yahoo! Answers user levels tended to be high, with 44% at Level 7, the highest user level, which is earned through positive contributions: answering questions, receiving best-answer recognition from question-askers, and participating actively. Among the remaining participants, 12% each were at Level 6 and Level 5, with 14% at Level 4 and 11% at Level 3. New users start at Level 1, and infrequent users remain at Level 1 or Level 2 unless they consistently answer questions and earn enough points to move up.


Table 1: Participants' profile (N=230).

Category                               Count    Percentage
Sex (n=192)
  Male                                   111       58%
  Female                                  81       42%
Age (n=186)
  18-19                                   33       18%
  20-29                                   27       15%
  30-39                                   18       10%
  40-49                                   26       14%
  50-59                                   37       20%
  60-69                                   36       19%
  70 and above                             9        5%
Education (n=192)
  Less than high school                    0        0%
  High school                             36       19%
  Some college                            32       17%
  College                                 72       38%
  Post-graduate                           52       27%
Yahoo! Answers user level (n=178)
  Level 1                                  3        2%
  Level 2                                 10        6%
  Level 3                                 19       11%
  Level 4                                 25       14%
  Level 5                                 22       12%
  Level 6                                 21       12%
  Level 7                                 78       44%

Limitations

The survey response rate of 16.4% is considered low. This is a common challenge for studies that directly survey social question and answer services' users, particularly those on Yahoo! Answers; the few studies that have done so all reported low response rates. For example, in Oh's (2011) study of health answerers' behaviour on the site, 257 of the 2,139 users contacted responded, a response rate of 12.0%. In Dearman and Truong's (2010) study of why users of the social question and answer services site do not answer questions, the researchers contacted 731 users and received 135 responses, a rate of 18.5%.
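The response rates above follow directly from the counts reported in the text; a minimal Python check (all figures are taken from the studies as cited, with unreachable accounts excluded from the denominator):

```python
# Response rate = responses / reachable sample, expressed as a percentage.
def response_rate(responses, contacted, unreachable=0):
    return round(responses / (contacted - unreachable) * 100, 1)

# This study: 1,431 sampled, 30 unreachable, 230 responses
print(response_rate(230, 1431, 30))   # 16.4

# Oh (2011): 257 responses out of 2,139 contacted
print(response_rate(257, 2139))       # 12.0

# Dearman and Truong (2010): 135 responses out of 731 contacted
print(response_rate(135, 731))        # 18.5
```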

Several factors contribute to the low survey response rate of this type of study. First, Yahoo! Answers restricts each account to ten e-mails per day, which limits the number of participants the surveyor can reach. Secondly, the site allows users to hide their account and e-mail contact information, which makes direct e-mail contact impossible. Thirdly, the site systematically eliminates accounts that send a large number of e-mails to other users; during the course of this study, several accounts used to reach the sample users were deactivated by Yahoo! Answers. Fourthly, the site's users tend to be very cautious and on high alert for spam: a handful of e-mail inquiries and a few phone calls were received asking to verify that the survey invitation was a genuine request. Finally, the lack of an incentive also contributed to the low response rate; a few e-mail inquiries asked for an incentive to participate.

To fill the gap left by previous research that focused on library users, the intended sample of this study was social question and answer services users, and participants were recruited within a social question and answer services environment. As social question and answer services users, they are likely to have more positive attitudes toward such services than the average person. This study tried to limit potential bias by asking only those who had used both social question and answer services and virtual reference to compare and evaluate the two services. A stratified sample of virtual reference and social question and answer services users and non-users would be most desirable for future research to shed additional light on the comparison.

Given these limitations, more research with multiple user samples using this direct survey method and other methods such as interviews and focus groups is needed to generate a more complete picture of virtual reference and social question and answer services comparisons.

Findings

Type of questions asked at the social question and answer services site

Two separate questions in the survey asked participants whether they had used the social question and answer services site to ask questions and to answer questions. Among those who participated in the study, most had both asked questions (85%) and answered questions (99%) at Yahoo! Answers. For those who indicated that they had asked a question, a follow-up question asked what types of questions they had asked in the past and to select all question types that applied, along with an option to enter other types if the provided types did not apply. Table 2 summarises the types of questions asked. The most common types of questions posted on the social question and answer services site are those asking for advice or opinion (60%), questions about specific facts (51%), and questions for leisure or entertainment (50%), with at least half of the participants having asked these types of questions. Questions about personal and/or private issues and for research are the next two common types of questions with about 40% of participants involved. Questions for school homework (25%) and for work (16%) are the two least common categories.


Table 2: Types of questions asked in Yahoo! Answers (n=177).

Type of question                                  No. of responses      %
1. Questions asking for advice or opinions              106            60%
2. Questions about specific facts                        91            51%
3. For leisure or entertainment                          89            50%
4. About personal and/or private issues                  70            40%
5. For research                                          68            38%
6. For school homework                                   45            25%
7. For work                                              28            16%
8. Other                                                 28            16%
Note. Percentages do not total 100% as types of questions are not mutually exclusive and participants chose all that applied.

Participants who had asked questions were given a follow-up question asking them to recall the specific question they had asked most recently. The ranking of question types recently asked by participants (see Table 3) is in line with the common practices of askers on the site, suggesting that Yahoo! Answers users mostly engage in advice-seeking and fact-finding for leisure or personal issues, and seldom use the site for work.

Other types of questions reported include specific examples of questions people encounter in their daily lives, such as advice and help with technology related issues, questions about health, motorcycle repair, study options, sports, or questions to make people think and fine-tune ideas and to create social awareness.


Table 3: Types of questions asked most recently in Yahoo! Answers (n=174).

Type of question                                  No. of responses      %
1. Questions asking for advice or opinions               76            44%
2. Questions about specific facts                        40            23%
3. For leisure or entertainment                          33            19%
4. About personal and/or private issues                  27            16%
5. For research                                          25            14%
6. For school homework                                   16             9%
7. For work                                              12             7%
8. Other                                                 28            16%
Note. Percentages do not total 100% as types of questions are not mutually exclusive and participants chose all that applied.

User satisfaction with answers received in social question and answer services

Out of 179 participants who had asked questions using Yahoo! Answers, 172 answered a follow-up question rating their satisfaction with the question asked most recently, choosing one of: a) very satisfied, b) somewhat satisfied, c) neither satisfied nor dissatisfied, d) somewhat dissatisfied, and e) very dissatisfied. As reported in Table 4, 70% of the users were either very satisfied (31%) or somewhat satisfied (39%), 12% were neither satisfied nor dissatisfied, and 18% were somewhat dissatisfied (10%) or very dissatisfied (8%). It is interesting to note that 58% of the users did not consult any other sources after receiving the answer from the Yahoo! Answers site in their most recent encounter, while a notable 26% conducted Internet searching after receiving their answer. The major reasons cited for such follow-up activity were to verify the answer received (43%), to have more options (36%), and dissatisfaction with the answer received (30%).


Table 4: User satisfaction with the answers received (n=172).

Level of user satisfaction              Responses      %
Very satisfied                              53         31%
Somewhat satisfied                          67         39%
Neither satisfied nor dissatisfied          20         12%
Somewhat dissatisfied                       18         10%
Very dissatisfied                           14          8%
Total                                      172        100%

User satisfaction with answers received in social question and answer services varies by the type of question when the responses reported in Table 3 and Table 4 are examined together. Table 5 summarises the calculated mean score of user satisfaction by question type. The mean score is calculated on the scale: 1 for very satisfied, 2 for somewhat satisfied, 3 for neither satisfied nor dissatisfied, 4 for somewhat dissatisfied, and 5 for very dissatisfied. As some respondents indicated that the question they asked belonged to multiple question types, only responses with a single question type were included in the comparison.

Overall, social question and answer services users are most satisfied with answers to their schoolwork questions, followed by questions about specific facts and questions asking for advice or opinions. Users are least satisfied with answers to research questions and questions about personal and/or private issues. A one-way ANOVA (analysis of variance) shows no significant effect of question type on the level of user satisfaction with the answer received [F(4, 102) = 1.699, p = 0.156].


Table 5: User satisfaction with the answers received by question type.

Rank   Question type                                     Mean    Standard deviation (SD)
1      For school homework (n=8)                         1.63    1.06
2      Questions about specific facts (n=15)             2.07    1.39
3      Questions asking for advice or opinions (n=46)    2.15    1.01
4      For work (n=4)                                    2.25    1.89
5      For leisure or entertainment (n=19)               2.26    1.28
6      About personal and/or private issues (n=6)        2.50    1.05
7      For research (n=9)                                3.44    1.33
Note. 1 = Very satisfied, 2 = Somewhat satisfied, 3 = Neither satisfied nor dissatisfied, 4 = Somewhat dissatisfied, 5 = Very dissatisfied.
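The per-type mean scores and the one-way ANOVA above can be reproduced with standard statistical tools. The sketch below uses hypothetical ratings (the study's raw responses are not published) purely to illustrate the computation on the 1-5 satisfaction scale:

```python
# Illustrative sketch with hypothetical ratings -- NOT the study's raw data.
# Scale: 1 = very satisfied ... 5 = very dissatisfied, as in Table 5.
from statistics import mean, stdev
from scipy.stats import f_oneway

ratings = {
    "For school homework": [1, 1, 2, 1, 3, 1, 2, 2],
    "Specific facts":      [2, 1, 3, 2, 2, 1, 4, 2, 1, 3],
    "Advice or opinions":  [2, 2, 1, 3, 2, 4, 2, 1, 2, 3],
    "For research":        [4, 3, 5, 3, 2, 4, 3],
}

# Mean and standard deviation per question type (the Table 5 statistics)
for qtype, scores in ratings.items():
    print(f"{qtype}: mean = {mean(scores):.2f}, SD = {stdev(scores):.2f}")

# One-way ANOVA: does mean satisfaction differ across question types?
f_stat, p_value = f_oneway(*ratings.values())
print(f"F({len(ratings) - 1}, {sum(map(len, ratings.values())) - len(ratings)}) "
      f"= {f_stat:.3f}, p = {p_value:.3f}")
```

A p-value below the conventional 0.05 threshold would indicate a significant effect of question type; the study's reported p = 0.156 does not cross that threshold.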

Virtual reference awareness and use

Table 6 summarises virtual reference awareness and use among participants. Both awareness and use of virtual reference are low: 68% of the users had never heard of virtual reference, 14% were aware of the service but had never used it, 13% used virtual reference only sometimes, and only 4% used it regularly.


Table 6: Virtual reference use and awareness by Yahoo! Answers users (n=202).

Virtual reference awareness and use                   Responses      %
Never heard of it                                        138         68%
I am aware of the service but have never used it          28         14%
I use it only sometimes                                   27         13%
I use it on a regular basis                                9          4%
Total                                                    202        100%

Those twenty-eight participants who were aware of the library virtual reference service but had never used it were asked the main reason for not using it. Their answers can be summarised in the following categories:

User evaluation and comparison of social question and answer services and virtual reference

A follow-up question in the survey asked respondents who had used both Yahoo! Answers and virtual reference (36 in total) to evaluate and compare the two services based on their experience. Twelve statements covering comparable characteristics of social question and answer services and virtual reference, drawn from previous literature, were provided for the comparison and evaluation (see Table 7). For each statement, participants provided their assessment by choosing one of the following: Yahoo! Answers is much better, Yahoo! Answers is slightly better, About the same, Virtual reference is slightly better, Virtual reference is much better, or No opinion.

Table 7 summarises the evaluations and comparisons by combining the two choices in favour of Yahoo! Answers and the two choices in favour of virtual reference. The results show that Yahoo! Answers stands out in the following areas over virtual reference:

On the other hand, virtual reference is considered better than Yahoo! Answers in the following areas:

Although about half of the users consider both services to be about the same in the areas of user privacy (53%), being comfortable asking any questions (53%), and overall service satisfaction (47%), virtual reference is slightly stronger in user privacy (25% vs. 11%) while Yahoo! Answers has an edge in overall user satisfaction (33% vs. 17%) and making users feel comfortable asking any questions (28% vs. 11%). Overall, the strengths and weaknesses identified by the users who have used both social question and answer services and virtual reference are consistent with previous studies as reported in the literature review.


Table 7: Evaluation and comparison of Yahoo! Answers and virtual reference (n=36 for each item).

Statement                                                  YA better   Same   VR better   No opinion   Total
The service answers advice-seeking and opinion questions      69%       19%       6%          6%       100%
It's fun                                                      69%       25%       0%          6%       100%
Service is accessible                                         49%       35%      14%          3%       100%
Prompt response                                               46%       27%      24%          3%       100%
Answers are provided by people with related experience        41%       35%      19%          5%       100%
My privacy is guarded                                         11%       53%      25%         11%       100%
Overall, I am satisfied with the service                      33%       47%      17%          3%       100%
I feel comfortable asking any questions                       28%       53%      11%          8%       100%
Answers can be trusted                                         6%       17%      75%          3%       100%
Answers are of good quality                                   16%       22%      59%          3%       100%
Answers are provided by people with expertise                 16%       27%      54%          3%       100%
The service answers factual questions                         11%       37%      49%          3%       100%
Note. YA = Yahoo! Answers; VR = virtual reference. The largest value in each row is the most common response for that item.

Considerations of a virtual service

Social question and answer service users in this study were asked to rate the importance of various considerations when using any online or virtual service to seek answers to their questions. Besides the comparable statements used for the social question and answer services and virtual reference comparison, additional statements about features unique to one service or the other were also provided for rating: the number of responses to a given question, the opportunity for friendship or connection with people, the variety of responses and opinions to a question, and answers provided directly and specifically by people rather than found by a search engine. In addition, an 'other' category allowed participants to enter and rate their own key consideration for a virtual service.

For each statement, participants were asked to provide their rating by choosing one from: a) very important, b) important, c) neither unimportant nor important, d) unimportant, e) very unimportant, and f) not applicable.

Table 8 summarises the importance ratings and their calculated mean scores and standard deviations, using a scale of 1 for very important, 2 for important, 3 for neither important nor unimportant, 4 for unimportant, and 5 for very unimportant; not applicable ratings were excluded from the calculations. As shown in Table 8, the following considerations, each with a mean score better than important (2), are the top considerations for using an online or virtual question and answer service:

  1. quality of answers (1.40)
  2. accessibility of service (1.41)
  3. trust (1.46)
  4. promptness (1.57)
  5. related experience (1.60)
  6. expertise (1.64)
  7. privacy (1.75)
  8. feeling comfortable asking questions (1.88)
  9. other (1.89)

The other important considerations suggested by users included cost, ability to evaluate results, continued learning/awareness, personal/sincere aspect, accuracy, and no jargon when using the service.

To determine the underlying latent clusters within the thirteen specific user considerations, we conducted principal components analysis with varimax rotation. The other category was not included, as it represents a small but heterogeneous set of considerations. Factors were retained if their eigenvalues were greater than 1, and items were assigned to the factor on which they loaded most heavily. We also calculated the Cronbach's alpha coefficient for each factor to assess its scale reliability. As shown in Table 9, the analysis yielded four factors that together explain 66.25% of the variance in the entire set of variables.

The communalities of the variables are all above 0.50, and six are above 0.70, meaning that the extracted factors explain most of the variance in each variable. Cronbach's alpha is over 0.72 for the thirteen items overall and for each of the first three factors, suggesting relatively high internal consistency among the items grouped within each factor. The fourth factor's alpha is below 0.7, so this factor needs to be interpreted with caution.
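The reliability coefficients in Table 9 follow the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of summed scores). The sketch below illustrates the calculation; it is not the study's data or code, and the five respondents' ratings on a three-item factor are entirely hypothetical:

```python
# Sketch: Cronbach's alpha for one factor's items, as used to check the
# scale reliability of the factors in Table 9. Data here are hypothetical.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """rows: one list of item scores (1-5) per respondent, k items each."""
    k = len(rows[0])
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Five hypothetical respondents rating a three-item factor:
ratings = [[1, 2, 1], [2, 2, 3], [1, 1, 2], [3, 3, 3], [2, 1, 2]]
print(round(cronbach_alpha(ratings), 2))  # 0.83
```

When items move together across respondents, the variance of the summed scores dominates the sum of the item variances and alpha approaches 1; weakly related items, as in the fourth factor, pull alpha down.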


Table 8: Considerations and their importance ratings for online services.

| Consideration | Very important | Important | Neither important nor unimportant | Unimportant | Very unimportant | Mean | Standard deviation (SD) |
|---|---|---|---|---|---|---|---|
| 1. Quality of answers (n=182) | 117 (64%) | 57 (31%) | 8 (4%) | 0 (0%) | 0 (0%) | 1.40 | 0.574 |
| 2. Service is accessible (n=182) | 116 (64%) | 59 (32%) | 6 (3%) | 0 (0%) | 1 (1%) | 1.41 | 0.613 |
| 3. Answers can be trusted (n=179) | 114 (64%) | 50 (28%) | 13 (7%) | 2 (1%) | 0 (0%) | 1.46 | 0.681 |
| 4. Prompt response (n=181) | 94 (52%) | 74 (41%) | 10 (6%) | 2 (1%) | 1 (1%) | 1.57 | 0.700 |
| 5. Answers are provided by people with related experience (n=179) | 94 (53%) | 65 (36%) | 17 (9%) | 3 (2%) | 0 (0%) | 1.60 | 0.730 |
| 6. Answers are provided by people with expertise (n=179) | 93 (52%) | 62 (35%) | 20 (11%) | 3 (2%) | 1 (1%) | 1.64 | 0.790 |
| 7. My privacy is guarded (n=181) | 95 (52%) | 50 (28%) | 25 (14%) | 9 (5%) | 2 (1%) | 1.75 | 0.950 |
| 8. I feel comfortable asking any questions (n=180) | 76 (42%) | 61 (34%) | 34 (19%) | 7 (4%) | 2 (1%) | 1.88 | 0.925 |
| 9. Other (please specify) (n=28) | 14 (50%) | 4 (14%) | 9 (32%) | 1 (4%) | 0 (0%) | 1.89 | 0.994 |
| 10. Answers are provided by people directly and specifically instead of found by a search engine (n=181) | 56 (31%) | 66 (36%) | 45 (25%) | 9 (5%) | 5 (3%) | 2.12 | 0.998 |
| 11. There are a variety of responses and opinions (n=184) | 42 (23%) | 80 (43%) | 40 (22%) | 20 (11%) | 2 (1%) | 2.24 | 0.962 |
| 12. Number of responses for a given question (n=181) | 35 (19%) | 63 (35%) | 67 (37%) | 13 (7%) | 3 (2%) | 2.37 | 0.932 |
| 13. It's fun (n=181) | 46 (25%) | 52 (29%) | 50 (28%) | 21 (12%) | 12 (7%) | 2.45 | 1.181 |
| 14. Opportunity for friendship or connection with people (n=172) | 23 (13%) | 26 (15%) | 60 (35%) | 33 (19%) | 30 (17%) | 3.12 | 1.253 |


Table 9: Communalities and factor loadings for considerations of virtual/online services.

| Consideration | Factor 1: Human answer | Factor 2: Answer service | Factor 3: Active response experience | Factor 4: Personal social concern | Communality |
|---|---|---|---|---|---|
| Answers are from people with related experience | 0.809 | | | | 0.747 |
| Answers are from people with expertise | 0.769 | | | | 0.725 |
| Answers are from people instead of a search | 0.759 | | | | 0.602 |
| Prompt response | | 0.801 | | | 0.721 |
| Service is accessible | | 0.793 | | | 0.645 |
| Answers can be trusted | | 0.632 | | | 0.607 |
| Quality of answers | 0.435 | 0.550 | | | 0.506 |
| A variety of responses and opinions | | | 0.838 | | 0.749 |
| Number of responses for a given question | | | 0.805 | | 0.741 |
| It's fun | | | 0.696 | 0.427 | 0.701 |
| I feel comfortable asking any questions | | | | 0.686 | 0.626 |
| My privacy is guarded | | | | 0.668 | 0.633 |
| Opportunity for friendship/connection with people | | | 0.460 | 0.574 | 0.608 |
| Eigenvalue | 2.547 | 2.290 | 2.250 | 1.525 | |
| % of total variance explained | 19.60% | 17.62% | 17.31% | 11.73% | 66.25% (total) |
| Cronbach's Alpha | 0.784 (k=3) | 0.721 (k=4) | 0.732 (k=3) | 0.469 (k=3) | 0.773 (k=13, overall) |

Note. Factor loadings <0.4 are suppressed.

Table 10 maps the top eight most important considerations for virtual or online services from Table 8 onto the results of users' evaluation and comparison of virtual reference and Yahoo! Answers summarised in Table 7. This mapping shows where the two services stand, in users' evaluations, on the eight considerations that matter most when using online services in general. It is interesting to note that these key considerations are evenly split between virtual reference and Yahoo! Answers; neither service stands out in all of them. This result suggests that each platform performs well in some of the areas users value most, while having room to improve in meeting users' needs, requirements, and expectations for online services.


Table 10: Comparison of virtual reference and social question and answer services on top eight important considerations.

| Consideration | Better performer as evaluated by users |
|---|---|
| 1. Quality of answers | Virtual reference |
| 2. Service is accessible | Yahoo! Answers |
| 3. Answers can be trusted | Virtual reference |
| 4. Prompt response | Yahoo! Answers |
| 5. Answers are provided by people with related experience | Yahoo! Answers |
| 6. Answers are provided by people with expertise | Virtual reference |
| 7. My privacy is guarded | Virtual reference |
| 8. Feeling comfortable asking questions | Yahoo! Answers |

Discussion

The objective of this study was to address the gap in user evaluation and comparison of social question and answer services and virtual reference in previous research. The results indicate that social question and answer services and virtual reference play different roles in meeting user information needs, and have their share of strengths and weaknesses individually as virtual services.

This study revealed that the types of questions asked on the social question and answer services site are mostly associated with users' daily lives, with most questions pertaining to advice or opinion and about half for leisure or entertainment. These results are consistent with findings by previous studies (e.g. Kim, Oh and Oh, 2007), which indicate that conversational questions seeking opinions or suggestions are asked more often than informational questions on Yahoo! Answers, with users seldom using the site for work. Unlike social question and answer services, virtual reference tends to elicit a high number of questions that are fact-based, informational, subject-related, and research-orientated, rather than those that focus on personal opinions or thoughts (Shah, Radford, Connaway, Choi, and Kitzie, 2012). Overall, while both social question and answer services and virtual reference have been utilised for varying purposes and queries, each has been used more specifically for certain types of questions. Users decide between social question and answer services and virtual reference based on the types of questions they have at hand.

Findings from this study concerning user satisfaction with social question and answer services, and user comparisons between social question and answer services and virtual reference, help explain user service choices and preferences. The study revealed that 70% of users are either very satisfied or somewhat satisfied with the answers they received from the social question and answer services site. This high overall user satisfaction is consistent with findings from previous social question and answer services studies and comparable to virtual reference services (e.g., Chan, Ly and Meulemans, 2012; Connaway and Radford, 2011; Pomerantz and Luo, 2006; Shah, 2011; Shah and Kitzie, 2012). Additionally, this study found that user satisfaction may vary by type of question. Users are more likely to be satisfied with answers to their schoolwork questions, questions about specific facts, and questions for advice or opinions. The questions eliciting the least satisfactory responses are those addressing research questions and personal or private issues. These results show the strengths and weaknesses of social question and answer services in answering different types of questions, and help explain why users turn to virtual reference for research or difficult questions, as found in previous studies (e.g. Numminen and Vakkari, 2009).

This study found that social question and answer services users have low awareness and usage of virtual reference, with 68% having never heard of virtual reference, 14% being aware of the service but never utilising it, 13% using virtual reference only sometimes, and only 4% using virtual reference regularly. These findings add to previous virtual reference research and discussion about low virtual reference awareness and use in general (Connaway and Radford, 2011). Attention must also be paid to those individuals who report awareness of virtual reference but have not yet used the service, citing the barriers of unfamiliarity and perceived difficulty. While reference librarians and their work are positively valued overall, social question and answer services users remain largely unaware of and are not familiar with reference services. These findings point to user awareness and service promotion as areas that libraries need to address in virtual reference.

Recent studies show that many users did not seem to recognise the difference between the functions of social question and answer services and virtual reference (Kitzie and Shah, 2011), and college students especially were less likely to be aware of virtual reference (Gazan, 2007). The lack of awareness of virtual reference found in this study may impede this group of users from taking advantage of the service. However, those who had used both social question and answer services and virtual reference showed a clear understanding of the differences between the two in their evaluations and comparisons, further confirming the different roles of the two services in meeting user information needs.

This study has identified the important considerations for an online or virtual information service, and the underlying latent clusters among them. The results show that service accessibility, quality of answers, and credibility of answers are the most important considerations, followed by promptness, answerers' expertise and related experience, and privacy. These results confirm the findings by Connaway, Lanclos and Hood (2013), which indicate that access is the key to meeting users' expectations, depending upon the context and situation of their needs. They also confirm previous findings that convenience, speed, and efficiency are critical attributes for users of virtual services in their searches for information (Nicholas, 2011; Nicol and Crook, 2013). The considerations can be explained by four underlying factors: human answer (questions are answered by humans, and by those with expertise and related experience), answer service (promptness, accessibility, trust, and quality), active response experience (number and variety of answers and a fun experience), and personal social concern (feeling comfortable asking questions, privacy, and friendship). This result suggests people still value human-provided answers despite the availability of various search tools and resources. At the same time, quality of service, user experience, and personal and social concerns are also valued.

When mapping the important considerations for an online or virtual service with user evaluations of virtual reference and social question and answer services, it is clear that neither service is scored higher in every aspect. While virtual reference outperforms social question and answer services in quality of answers, trustworthiness of answers, expertise of the people providing the answers, and privacy, social question and answer services performs better for the accessibility of the service, promptness, the related experience of those providing answers, and feeling comfortable when asking questions. The weaknesses of virtual reference services pinpoint areas where enhancements could be made to meet user needs.

Although privacy has been a core concern among librarians and a reason why some social media features have not been adopted in virtual reference (Litwin, 2006), it is surprising that those who have used both virtual reference and social question and answer services do not see a major difference in privacy between the two services (as shown in Table 7), and that privacy is ranked only seventh among the important considerations for an online service (see Table 8) among all respondents of this study. This sharp difference in viewpoint on privacy issues needs to be further explored.

Conclusions and suggestions for further research

This study contributes to the ongoing discussion about the future of library reference services and the changes they must make to meet the demands of an increasingly online information environment. Based on user evaluation and comparison of social question and answer services and virtual reference, it is clear that each currently has its own strengths and weaknesses in serving various user needs, and in meeting all the important criteria that users seek in an online or virtual information service. The notion that social question and answer services will take over virtual reference is not supported.

Virtual reference and social question and answer services have great potential to collaborate in meeting user needs in the Internet environment. We have already seen some cases of virtual reference and social question and answer services integration. For example, many reference librarians participate in Yahoo! Answers and offer quality reference services within this platform; their involvement and services have been recognised by the Yahoo! Answers user community (John, 2010). Another example is Radical Reference (http://radicalreference.info/), where a collective of volunteer library workers provides online reference services to activists, journalists, and researchers. The engagement and contributions of reference librarians provide good marketing and advertising opportunities for libraries and library services. On the other hand, while libraries have widely adopted social media and have recognised the value of such platforms for improving virtual reference services (Arya and Mishra, 2011), there has been little integration of social media and reference services beyond using one to promote the other (Benn and McLoughlin, 2013). There has been ongoing effort to explore how virtual reference services may work alongside social question and answer services, and how virtual reference librarians may collaborate in forming a community with experts outside libraries for possible new service designs (Radford et al., 2013).

Although the specific model and infrastructure for future virtual reference services remains to be developed and tested, the future direction of virtual reference is clear with regard to meeting user expectations of virtual services in the online environment: quality, accessibility, trust, prompt response, expertise, user community, privacy, and openness to various questions. Libraries need to reach out to potential users who may not have ever asked a reference question or even know that libraries offer interactive virtual reference services. Additionally, the integration of Yahoo! Answers with Google searches provides a good example for potential collaboration between virtual reference and similar services, as its questions and answers are visible within Google results. For virtual reference, this could provide the opportunity to promote user awareness of the service with quality answers provided by librarians.

This study confirms that virtual reference and social question and answer services can be chosen to meet different needs and serve different purposes. In addition, it identifies the important considerations for an online or virtual service. Future research should examine whether an online or virtual service is preferred by certain groups of users, and explore the reasons for this preference to gain a better understanding of users' information-seeking process. For example, are there users who prefer a service irrespective of their information needs and situations? What are the user profiles of those who only use social question and answer services, but not virtual reference? What are the user profiles of those who only use virtual reference, but not social question and answer services? What is their information-seeking path? Do they utilise other sources before and after using social question and answer services and/or virtual reference? Additionally, some questions about shaping future virtual reference operations also require exploration. For example, what are effective ways for virtual reference to collaborate with social question and answer services? What are effective ways to direct questions to people with the appropriate expertise? How can some social elements of social question and answer services be integrated into virtual reference? What is the impact of social question and answer services-like technology on virtual reference? The answers to these questions will help virtual reference position itself to satisfy users' information needs more effectively as part of their information-seeking process.

Acknowledgements

This research is supported in part by Wuhan University Academic Development Plan for Scholars after 1970s for the project Research on Internet User Behavior and the Chinese National Funds of Social Science (No. 14BTQ044). The authors would like to thank the three anonymous reviewers and Rick Rubin for their helpful suggestions in revising the manuscript. The assistance from Kristin Yeager for statistical analysis is greatly appreciated.

About the authors

Yin Zhang is a professor at the School of Library and Information Science at Kent State University. She received her B.S. and M.S. in Information Science from Wuhan University and Ph.D. in Library and Information Science from University of Illinois at Urbana-Champaign. Her research and teaching areas include user information-seeking behavior, information uses and services, information systems, and information organization. She can be contacted at: yzhang4@kent.edu
Shengli Deng (corresponding author) is an associate professor and deputy director of the Department of Information Management, School of Information Management at Wuhan University. He received his B.S. and M.S. in Information Management from Central China Normal University and Ph.D. in Information Science from Wuhan University. His research interests include information behavior, information interaction, and information services. He can be contacted at: victorydc@sina.com

References
  • Agosto, D., Rozklis, L., MacDonald, C. & Abels, E. (2011). A model of the reference and information service process: an educator's perspective. Reference & User Services Quarterly, 50(3), 235–244.
  • Arya, H.B. & Mishra, J.K. (2011). Oh! Web 2.0, virtual reference service 2.0, tools & techniques (I): a basic approach. Journal of Library & Information Services in Distance Learning, 5(4), 149-171.
  • Benn, J. & McLoughlin, D. (2013). Facing our future: social media takeover, coexistence or resistance? The integration of social media and reference services. Paper presented at the IFLA World Library and Information Congress, 17-23 August, 2013, Singapore. Retrieved from http://library.ifla.org/id/eprint/129 (Archived by WebCite® at http://www.webcitation.org/6Rg9fUPKH)
  • Bishop, B.W., Sachs-Silveira, D. & Avet, T. (2011). Populating a knowledge base with local knowledge for Florida's Ask a Librarian reference consortium. The Reference Librarian, 52(3), 197-207.
  • Blogger Journey (2013, January 25). Top 10 Q&A websites 2013. [Web log post]. Retrieved from http://www.bloggerjourney.com/2013/01/top-10-q-websites-2013-get-answers-for.html (Archived by WebCite® at http://www.webcitation.org/6Rg9qC2va)
  • Chan, I., Ly, P. & Meulemans, Y. (2012). Extending IM beyond the reference desk: a case study on the integration of chat reference and library-wide instant messaging network. Information Technology and Libraries, 31(3), 4-22.
  • Chow, A. S. & Croxton, R. A. (2012). Information-seeking behavior and reference medium preferences. Reference & User Services Quarterly, 51(3), 246-262.
  • Connaway, L.S., Dickey, T.J. & Radford, M.L. (2011). 'If it is too inconvenient I'm not going after it': convenience as a critical factor in information-seeking behaviors. Library & Information Science Research, 33(3), 179-190.
  • Connaway, L.S., Lanclos, D. & Hood, E.M. (2013). 'I find Google a lot easier than going to the library website.' Imagine ways to innovate and inspire students to use the academic library. In D. M. Mueller (Ed.), Proceedings of the ACRL 2013 (pp. 289-300). Chicago, IL: ACRL. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2013/papers/Connaway_Google.pdf (Archived by WebCite® at http://www.webcitation.org/6Rg9waIRw)
  • Connaway, L.S. & Radford, M.L. (2011). Seeking synchronicity: revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/default.htm (Archived by WebCite® at http://www.webcitation.org/6RgAIO3AI)
  • Connaway, L.S., Radford, M.L. & Williams, J.A. (2009). Engaging Net Gen students in virtual reference: reinventing services to meet their information behaviors and communication preferences. Paper presented at the ACRL Fourteenth National Conference, 12-15 March, 2009, Seattle, Washington. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/national/seattle/papers/10.pdf (Archived by WebCite® at http://www.webcitation.org/6RgAnBDNx)
  • Dearman, D. & Truong, K.N. (2010). Why users of Yahoo! Answers do not answer questions. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 329-332). New York, NY: ACM. Retrieved from http://dl.acm.org/citation.cfm?id=1753376&dl=ACM&coll=DL&CFID=399886409&CFTOKEN=55138757 (Archived by WebCite® at http://www.webcitation.org/6RgB4bNrc)
  • Gazan, R. (2007). Seekers, sloths and social reference: homework questions submitted to a question-answering community. New Review of Hypermedia and Multimedia, 13(2), 239-248.
  • Gazan, R. (2011). Social Q&A. Journal of the American Society for Information Science and Technology, 62(12), 2301-2312.
  • Golbeck, J. & Fleischmann, K.R. (2010). Trust in social question and answer services: the impact of text and photo cues of expertise. Proceedings of the American Society for Information Science and Technology, 47(1), 1–10. Retrieved from http://asis.org/asist2010/proceedings/proceedings/ASIST_AM10/submissions/48_Final_Submission.pdf (Archived by WebCite® at http://www.webcitation.org/6RgBBhjgW)
  • Gronemyer, K. & Deitering, A. (2009). 'I don't think it's harder, just that it's different': librarians' attitudes about instruction in the virtual reference environment. Reference Services Review, 37(4), 421-434.
  • Henry, J. (2011). Death of reference or birth of a new marketing age. Public Services Quarterly, 7(1-2), 87-93.
  • Hitwise (2008). U.S. visits to question and answer websites increased 118 percent year-over-year. Retrieved from http://www.hitwise.com/news/us200803.html (Archived by WebCite® at http://www.webcitation.org/6RgBLnO8v)
  • Internet Public Library. (2011). Timeline of ipl2/IPL history. Retrieved from http://www.ipl.org/div/about/timeline/ (Archived by WebCite® at http://www.webcitation.org/6RgBO7BKC)
  • John, J. (2010). Best answering percentage 77%. Retrieved from http://enquire-uk.oclc.org/content/view/97/55/ (Archived by WebCite® at http://www.webcitation.org/6RgBtcsOb)
  • Kilzer, R. (2011). Reference as service, reference as place: a view of reference in the academic library. The Reference Librarian, 52(4), 291-299.
  • Kim, S., Oh, J. S. & Oh, S. (2007). Best-answer selection criteria in a social question and answer services site from the user oriented relevance perspective. Proceedings of the American Society for Information Science and Technology, 44(1), 1-15.
  • Kitzie, V., Choi, E. & Shah, C. (2012). To ask or not to ask, that is the question: investigating methods and motivations for online Q&A. Poster presented at the 2012 Proceedings of Workshop on Human Computer Interaction and Retrieval, 4-5 October, 2012, Cambridge, Massachusetts. Retrieved from http://ils.unc.edu/hcir2012/hcir2012_submission_7.pdf (Archived by WebCite® at http://www.webcitation.org/6RgByaA2F)
  • Kitzie, V. & Shah, C. (2011). Faster, better, or both? Looking at both sides of the online question and answering coin. Poster presented at the 2011 Annual Meeting of the American Society for Information Science and Technology, 9-12 October, 2011, New Orleans, Louisiana. Retrieved from http://www.asis.org/asist2011/posters/180_FINAL_SUBMISSION.pdf (Archived by WebCite® at http://www.webcitation.org/6RgC0wKo2)
  • Kuhl, J. (2012). On life support, but not dead yet! Revitalizing reference for the 21st century. Paper presented at the PLA 2012 Conference, 13-17 March, 2012, Philadelphia, Pennsylvania.
  • Litwin, R. (2006, May 22). The central problem of library 2.0: privacy. [Web log post]. Retrieved from http://libraryjuicepress.com/blog/?p=68 (Archived by WebCite® at http://www.webcitation.org/6RgC2q1LX)
  • Luo, L. (2012). Professional preparation for 'Text a Librarian': what are the requisite competencies? Reference & User Services Quarterly, 52(1), 44-52.
  • McGee, M. (2008, April 27). Yahoo! Answers: 11 million answers per month. Small Business [Web log post]. Retrieved from http://www.smallbusinesssem.com/yahoo-answers-11-million-answers-per-month/1147/ (Archived by WebCite® at http://www.webcitation.org/6RgC4VUBQ)
  • Mu, X., Dimitroff, A., Jordan, J. & Burclaff, N. (2012). A survey and empirical study of virtual reference service in academic libraries. The Journal of Academic Librarianship, 37(2), 120-129.
  • Nicholas, P. (2011). Creating a digital reference agenda for academic libraries in Jamaica: an exploratory case study. Libri: International Journal of Libraries & Information Services, 61(4), 258-280.
  • Nicol, E.C. & Crook, L. (2013). Now it's necessary: virtual reference services at Washington State University, Pullman. The Journal of Academic Librarianship, 39(2), 161-168.
  • Numminen, P. & Vakkari, P. (2009). Question types in public libraries' digital reference service in Finland: comparing 1999 and 2006. Journal of the American Society for Information Science and Technology, 60(6), 1249-1257.
  • Oh, S. (2011). The relationships between motivations and answering strategies: an exploratory review of health answerers' behaviors in Yahoo! Answers. Proceedings of the American Society for Information Science and Technology, 48(1), 1-9. Retrieved from http://www.asis.org/asist2011/proceedings/submissions/136_FINAL_SUBMISSION.pdf (Archived by WebCite® at http://www.webcitation.org/6RgC7aHIl)
  • Park, J.R., Li, G. & Burger, A. (2010). Opening and closing rituals of the virtual reference service of the Internet Public Library. Journal of Documentation, 66(6), 807-823.
  • Pomerantz, J. (2010, June 1). Facebook social question and answer service is the harbinger of the death of reference. [Web log post]. Retrieved from http://jeffrey.pomerantz.name/2010/06/facebook-social-qa-service-is-the-harbinger-of-the-death-of-reference/ (Archived by WebCite® at http://www.webcitation.org/6RgCA6HPQ)
  • Pomerantz, J. & Luo, L. (2006). Motivations and uses: evaluating virtual reference service from the users' perspective. Library & Information Science Research, 28(3), 5-29.
  • Quantcast (2013). Answers.yahoo.com traffic. Retrieved from https://www.quantcast.com/answers.yahoo.com#!traffic (Archived by WebCite® at http://www.webcitation.org/6RgCBTmin)
  • Radford, M.L. (2008). A personal choice: reference service excellence. Reference & User Services Quarterly, 48(2), 108-115.
  • Radford, M.L. & Connaway, L.S. (2012). Chattin' 'bout my generation: comparing virtual reference use of Millennials to older adults. In M. L. Radford (Ed.), Leading the reference renaissance: today's ideas for tomorrow's cutting-edge services (pp. 35-46). New York, NY: Neal-Shuman.
  • Radford, M.L. & Connaway, L.S. (2013). Not dead yet! A longitudinal study of query type and ready reference accuracy in live chat and IM reference. Library & Information Science Research, 35(1), 2-13.
  • Radford, M.L., Connaway, L.S., Mikitish, S., Alpert, M., Shah, C. & Cooke, N. (2013). Conceptualizing collaboration & community in virtual reference and social question and answer services. Paper presented at the Eighth International Conference on Conceptions of Library and Information Science, 19-22 August, 2013, Copenhagen, Denmark.
  • Radford, M.L., Connaway, L.S. & Shah, C. (2012). Convergence & synergy: social question and answer services meets virtual reference service. Proceedings of the American Society for Information Science and Technology, 49(1), 1-8. Retrieved from http://www.asis.org/asist2012/proceedings/Submissions/111.pdf (Archived by WebCite® at http://www.webcitation.org/6RgCG54qY)
  • Radford, M.L., Radford, G.P., Connaway, L.S. & DeAngelis, J.A. (2011). On virtual face-work: an ethnography of communication approach to a live chat reference interaction. Library Quarterly, 81(4), 431-453.
  • Shachaf, P. (2010). Social reference: toward a unifying theory. Library & Information Science Research, 32(1), 66-76.
  • Shah, C. (2011). Effectiveness and user satisfaction in Yahoo! Answers. First Monday, 16(2). Retrieved from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/3092/2769 (Archived by WebCite® at http://www.webcitation.org/6RgCI97AA)
  • Shah, C. & Kitzie, V. (2012). Social Q&A and virtual reference—comparing apples and oranges with the help of experts and users. Journal of the American Society for Information Science and Technology, 63(10), 2020-2036.
  • Shah, C., Radford, M. L., Connaway, L. S., Choi, E. & Kitzie, V. (2012). 'How much change do you get from 40$?'—analyzing and addressing failed questions on social question and answer services. Proceedings of the American Society for Information Science and Technology, 48(1), 1-10. Retrieved from https://www.asis.org/asist2012/proceedings/Submissions/119.pdf (Archived by WebCite® at http://www.webcitation.org/6RgCJpN38)
  • Solorzano, R.M. (2013). Adding value at the desk: how technology and user expectations are changing reference work. The Reference Librarian, 54(2), 89-102.
  • Tyckoson, D.A. (2011). Issues and trends in the management of reference services: a historical perspective. Journal of Library Administration, 51(3), 259-278.
  • Young, C.L. (2013). To be discontinued: a virtual reference cautionary tale. The Reference Librarian, 54(2), 175-176.
How to cite this paper

Zhang, Y. & Deng, S. (2014). Social question and answer services versus library virtual reference: evaluation and comparison from the users' perspective. Information Research, 19(4), paper650. Retrieved from http://InformationR.net/ir/19-4/paper650.html (Archived by WebCite® at http://www.webcitation.org/6UFe4Q67M)
