FDLP Users Speak: One Library’s Experience with the FDLP User Survey

When the U.S. Government Printing Office (GPO), working with Outsell, Inc. and the Depository Library Council, developed a national Web-based survey for depository library users, the FDLP User Survey (Survey), in late 2010, our library gladly participated.

In August 2011, the national report of findings for the Survey was released online as FDLP Users Speak: The Value and Performance of Libraries Participating in the Federal Depository Library Program (National Report). In October 2011, local reports of findings became available for individual depositories to download; we named ours FDLP User Survey: Evans Library (Local Report).

The Evans Library Government Documents Task Force began to study the Local Report with hopes that the results of the Survey would provide a clear message from our users that could be used to improve their satisfaction with our resources and services. However, we found it difficult to map the graphic results in our Local Report to the corresponding Survey questions and to compare our results with the results of all depositories and other academic depositories as found in the National Report.

This article describes how we processed the National and Local Reports to make them usable and then analyzed the results in order to utilize them in depository planning. We hope that our experience may be of help to others as they work with their reports.

Background and Response Rate

The Government Documents Task Force publicized the Survey during January and February 2011, both on and off campus, through a number of print and electronic avenues. Nationally, the response rate averaged six users per library. We were pleased that the lowest number of respondents to any one question for Evans Library was 40, while the highest number of respondents to a question was 61.

Processing the Survey Reports

To analyze the results of the Local Report, we started by creating a table, Findings of FDLP User Survey: Evans Library. This table was created to better compare the Survey’s questions with each corresponding graph of results and to allow comparisons between our library’s and other libraries’ results. It accomplishes this by bringing together, in one document, the original Survey questions, the Local Report’s results, and the National Report’s results for all depositories and for academic depositories. It also compensates for a number of issues that initially confused us. These include:

  • Inconsistent numbering between the Survey questions and the National Report’s results (i.e., the charts showing responses)
  • No numbering for the Local Report’s results
  • Inconsistent naming of the results for the same question in the National and Local Reports
  • Results combining two questions or separating one question into two answers
  • Results without corresponding questions
  • Uncharted negative responses for many questions*
  • Percentages omitted for local-level results

More specifically, the table covers 14 of the 16 questions on the Survey by providing the original question and directions for answering, plus, in parentheses, the title of the corresponding chart in Evans Library’s Local Report. In addition, there is a note if the negative choices were eliminated in the presentation of results, and numerical percentages are estimated for all local responses from the graphs. Finally, the responses for each question are sorted by Evans Library patron responses from greatest to least, rather than appearing in the order given in the report. The sorting alone helps make useful sense of the local data. As in the Survey, the questions are grouped into ones about use of depository libraries and ones about patrons’ satisfaction with that use.
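For libraries assembling a similar table, the sorting step can be sketched in a few lines of Python. The response choices and percentages below are purely hypothetical placeholders, not figures from the actual reports:

```python
# Hypothetical response percentages estimated from a Local Report graph;
# these labels and numbers are illustrative only.
local_responses = {
    "Laws and regulations": 62,
    "Statistics and data": 55,
    "Maps": 18,
    "Patents and trademarks": 9,
}

# Sort choices from greatest to least local response,
# as done for each question in the comparison table.
ranked = sorted(local_responses.items(), key=lambda kv: kv[1], reverse=True)

for choice, pct in ranked:
    print(f"{choice}: {pct}%")
```

The same re-ordering, applied question by question, replaces the report's fixed choice order with one that foregrounds what local patrons actually selected most.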

Summary of Local Survey Results

The table was then used to systematically review local responses for each question on the Survey. We found that our local findings were consistent with the National Report’s findings, only more positive. Our users apparently use more types of documents more often and for more purposes than other users and are more satisfied with their outcomes. They see fewer major barriers and, except for wanting more online materials, see fewer needed improvements. We surmise that our respondents were largely a self-selected group of individuals familiar with the Federal Depository Library Program (FDLP) at Evans Library and supportive of it.

Using the Data

Collection Development and Weeding

We will use knowledge of patrons’ purposes for using Government information and their preferred types and formats of Government information to guide consideration of both acquisitions and weeding in the future. In relation to patron preferences, we discovered two useful charts in the National Report that were not generated for the Local Report: Figure A2 – Types of information used (FDLP Users Speak, p.27) and Figure 7 – Respondents’ use of print/tangible and electronic resources for each type of Government information (FDLP Users Speak, p.13).

We also created a chart in-house from the Local Report that combines the results of the Survey’s questions about use of tangible information resources (question A6) and online resources (A7), in order to compare Evans Library patrons’ use of 15 different types of Government resources by format. The resource types are listed from greatest to least online-format usage. All in all, the results of the Survey confirmed for us that our patrons want us to become a more electronic depository.
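The merge behind that in-house chart can be sketched as follows. The resource types and percentages here are hypothetical stand-ins, since the actual A6/A7 figures appear only in the reports:

```python
# Hypothetical usage percentages for question A6 (tangible formats) and
# question A7 (online formats); types and numbers are illustrative only.
tangible = {"Laws and regulations": 20, "Statistics and data": 25, "Maps": 30}
online = {"Laws and regulations": 70, "Statistics and data": 60, "Maps": 15}

# Merge the two questions into one table keyed by resource type,
# ordering rows by online usage from greatest to least.
combined = [
    (rtype, online[rtype], tangible[rtype])
    for rtype in sorted(online, key=online.get, reverse=True)
]

print(f"{'Resource type':<24}{'Online %':>10}{'Tangible %':>12}")
for rtype, on_pct, tan_pct in combined:
    print(f"{rtype:<24}{on_pct:>10}{tan_pct:>12}")
```

Placing both formats side by side, sorted by online use, is what made the patrons' preference for electronic access visible at a glance.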

Action Items

We reviewed patrons’ responses to major and minor problems/barriers/obstacles and proposed the following action items based on those responses:

  • Provide more materials online.
  • Improve finding tools, such as the catalog.
  • Provide more training in finding and using Government information through research guides, online tutorials, and on-campus sessions.
  • Publicize document delivery/interlibrary loan to promote access to older materials and to fill gaps in the tangible collection.
  • Focus on maintaining and improving the services reported as most used by Evans Library patrons.
  • Consider Evans Library patrons’ preferred methods of delivery of information when disseminating depository news, resources, and training.
  • Promote the depository to other local libraries as a place to visit virtually as well as in person for Government information.

Final Thoughts

Our library administration was extremely pleased to have the Government Documents Task Force’s final report as evidence of assessment, which is an activity valued by accreditation committees and the university alike.

The shortcomings of the Survey instrument and subsequent reports obscure the details of many user responses. Therefore, further user needs research is warranted. Fortunately, as the National Report points out, a modified Survey instrument can be used as a template for continuing assessment. Though all user comments, including negative comments, are reported, a clearer picture of what the patrons actually said could be gained if depositories were given the data about negative responses that were not conveyed in the reports. It also would be beneficial if actual response percentages appeared on the charts for all answers, to all questions, for all types of libraries. Currently, the reports provide them for all questions at the national level but for only a few questions at other levels.

If other depository libraries think the table discussed in this article would be a useful basis for their own survey analyses, the authors invite them to use it. Also, if anyone would like a more detailed account of our procedure or analysis, please feel free to contact us; we will be happy to answer any questions. We are also interested in experiences other libraries have had in processing and using their reports in the effort to hear our users speak.

* Note from the editors: When asked about this before the individual library reports were released, GPO was informed that it is an Outsell, Inc. practice to combine strongly agree and somewhat agree together when displaying data in charts. This is done for two reasons: 1) to focus on general, top-line trends and 2) there are shades of gray in various respondents’ minds when answering surveys, with much left to interpretation. The data provided allows for easy calculation of negative responses.