Features – Evaluating an Online Information Service for Management – A Fastcase Review

T. R. Halvorson is Deputy County Attorney in Sidney, Montana, President of Synoptic Text Information Services, Inc., webmaster of LexNotes, and author of Law of the Super Searchers: The Online Secrets of Top Legal Researchers.

Abstract: This article presents a method for preparing a memorandum to management evaluating an online legal information service. The ability to evaluate the quality of sources and convey the evaluation to management is a competency of law librarians. The article presents reasons for basing evaluations on standardized criteria, the advantages of the SCOUG Rating Scale, a checklist of questions for a SCOUG-inspired evaluation of an online legal information service, adapting to the management perspective when drafting a memorandum, a template of a memorandum to management, guidelines for working from the checklist to the template, and a completed example of an Inter-Departmental Memorandum evaluating Fastcase. The conjunction of the thought process behind the analysis of an online service and the thought process behind communicating with management is largely capitalized in the checklist and the template.

Oh No. Not This Again.

Suppose you are the law librarian in a firm with a number of attorneys. One of them has received the latest salvo in the email advertising barrage from Acme Legal Cybrary, Inc. The attorney clicks on the Forward command and dashes off a note to you saying, “Why aren’t we subscribing to this instead of Westlaw or Lexis? This is so much cheaper.” You quickly search for a published review and find that none exists yet. You post a message to the lists and receive stone silence.

Okay, it is your job. You must evaluate Acme and draft the memo. The onus is on us.

From Crisis to Recognized Competence

Crisis is not the only, and certainly not the best, reason you should be able to do this yourself. Competency 3.2 of the Competencies of Law Librarianship approved by the Executive Board of the American Association of Law Libraries in March 2001 says the law librarian:

Evaluates the quality, authenticity, accuracy, and cost of traditional and electronic sources, and conveys the importance of these to the client.

Who is the judge? Who in the firm, the school, or the bureau is competent to evaluate online services? The law librarian is the person with this competence. The law librarian is the one who should be recognized as the professional in this realm. The law librarian should be the esteemed leader and the valued teammate.

Core Competencies 1.8, 1.6, 1.9, and 1.13 say:

1.8 Exhibits leadership skills including critical thinking, risk taking, and creativity, regardless of position within the management structure.

1.6 Demonstrates knowledge of library and information science theory, information creation, organization, and delivery.

1.9 Demonstrates commitment to working with others to achieve common goals.

1.13 Displays excellent communication skills and is able to promote the library and advocate for its needs.

Notice the verbs. A law librarian not only possesses certain traits, but exhibits, demonstrates, and displays them. This is how the connection to employers, institutions, and the workplace is made. This is how the law librarian wins a connected position. The law librarian can exhibit, demonstrate, display, and connect by well communicated evaluations of online services.

But how to do it? We do it best when we base our process on standardized criteria.

Why Use Standardized Criteria

Though the audience, purpose, and author of the memorandum to management are different from those of the published review article, still there are good reasons to use standardized criteria for evaluation of online services. The first is circumspection. Using standardized criteria reminds us to consider all aspects before judging, deciding, or acting.

A second reason is comparability. We seldom evaluate a service in a vacuum or in isolation. The purpose of evaluation usually is to compare services. Standardized criteria allow our evaluations to be made on a comparable basis.

A third reason is to retain the classic with the contemporary. The new online legal information services tend to be delivered on the web. Relative to classic systems like Dialog, Lexis, and Westlaw in the forms delivered via direct dial and proprietary software, these systems are new and different. The web context engenders its own, contemporary evaluation perspective. If the contemporary perspective were all we had, however, we would lose track of classic quality factors that have abiding value and importance. A well-chosen set of standardized criteria is one that has classic roots and also has proven itself durable and adaptable in the contemporary context.

A fourth reason is community. We don’t use standardized criteria to put on airs of professionalism or a conceit of expert knowledge. We don’t do it to impress with formalism. We use standardized criteria as instruments of social interaction. They help us connect with the community by considering what we are evaluating from a community perspective. Using standardized criteria helps us avoid excessive focus on pet likes and pet dislikes. It keeps us from merely riding our hobby horse or projecting our personal idiosyncrasies onto our teammates. It enables us to consider factors that are important to the members of our community that might not be very important to us individually.

Standardized criteria that are classic, that came from a community context, that have proven durable and adaptable to the contemporary situation make us competent not merely by the standards of our enclave, but by connecting our enclave to the wider community.

Advantages of The SCOUG Rating Scale

Among the dozens of published rating scales, the SCOUG Rating Scale has important advantages. It is classic, it is durable, it came from community, and it is capable of complementation in the current web situation.

The SCOUG Rating Scale is the product of the Fourth Annual Southern California Online Users Group (SCOUG) Retreat. Though called a retreat, it was a fifteen-hour-a-day think tank. In the felicitously strident words of Reva Basch, “Retreat, hell — this was a full-scale advance.” 1

Participants came from academic libraries, corporate libraries, public libraries, and independent research firms. The retreat attracted searchers, industry decision makers, and media persons from all over the country — from New York to Hawaii. Collectively, they searched all major online services and some esoteric ones too. Mead Data Central, DIALOG, Maxwell Online, Access Company, Predicasts, Data Courier, AIAA (The Aerospace Database), and ABC-Clio were represented. A report of each day’s session was uploaded to The WELL. Comments from WELL-beings were relayed back to the group. The aim of the retreat was to create a framework for judging the quality and reliability of databases in terms of their design, content, and accessibility — in effect, a consumer rating scale for online information. The perspectives of both professional search intermediaries and end-users were considered expressly. The nature of the retreat and its participants allowed the outcome to be circumspect, classic, and community-oriented.

Next, to be a good choice the rating scale needs to be capable of complementation and be adaptable to the contemporary situation in online legal research. Others already have demonstrated that the SCOUG Rating Scale is capable of being complemented, added to, or adapted to the web environment. In a paper titled “Evaluation criteria for different versions of the same database – a comparison of Medline services available via the World Wide Web,” presented at the 21st International Online Information Meeting in London, Betsy Anagnostelis and Alison Cooke report on efforts to establish criteria for evaluating different offerings of the same Medline data. To develop and test criteria, they chose nine web-based Medline services: Community of Science, HealthGate, Healthworks, Helix, Internet Grateful Med, Medscape, OBGYN.net, Simon-Williamson Clinic, and PubMed. They devised a set of evaluation criteria based on a literature review. They adapted the SCOUG Rating Scale, criteria proposed by Carol Tenopir and Katie Hover for comparing databases available via different hosts, the OMNI Medline evaluation criteria, and criteria proposed by William Detmer. The OMNI criteria were themselves based primarily on the SCOUG Rating Scale. Their result has 13 categories. (In medical searching, controlled vocabulary using MeSH terms is highly important. Hence, factors relating to thesaurus searching rate a whole category.)

This illustrates nicely how the core of the SCOUG Rating Scale has enduring utility and how it can be complemented in the context of a specific subject domain and web delivery of information. Where case law, statutes, and regulations are concerned, our situation has significant parallels. Never mind whether we search VersusLaw, Loislaw, Quicklaw America, Westlaw, Lexis, or Fastcase, the core data of the court opinions, statutes, and regulations is the same. That suggested to me that the SCOUG Rating Scale could be used in a similar fashion in the context of primary legal authority delivered on the web. That is why I have based numerous reviews of online services on the SCOUG Rating Scale. 2

The Categories and Checklist

The SCOUG Rating Scale organizes quality factors into ten broad categories:

Coverage and Scope
Timeliness
Accuracy / Error Rate
Accessibility / Ease of Use
Consistency
Integration
Output
Documentation
Customer Support and Training
Value-to-Cost Ratio

Applied to legal information services, I have developed a Checklist of Questions for a SCOUG-Inspired Review of an Online Legal Information Service. We can use this Checklist in the background process of seeking information about a service and evaluating its quality. Then we transform the results of the background process into the memorandum to management.

Adapting to the Management Perspective

Besides adapting to the web environment, we also need to adapt to the management perspective.

The memorandum to management must be concise. There is a delicate balance between being concise and including everything that should be included. On the one hand, we could become so concise that our memorandum becomes incompetent by omission. On the other, we could become so complete that our memorandum becomes incompetent by failing to connect with our employers. They don’t receive the information either because they won’t finish reading or because resentment at the length shouts down whatever information they do take in. A number of law librarians tell me management will not read more than one page. Consensus sets in at two pages, while a few librarians allow lengths up to five pages.

Besides being concise, the memorandum to management should be conclusory. In what I am about to say, please recognize that I am generalizing and simplifying. These statements won’t be true of all management and may not be entirely true of any management, yet I believe these traits are widespread among management. Management does not want facts; they want conclusions. They do not want information; they want answers. They want the meaning distilled from the raw content. They want to use their computers like toasters, they want to use search engines like toasters, and they want to use you like a toaster. They pop an email into you as bread. They want you to pop a memo out to them as toast — toast that they can grab with their briefcase on the way out the door. They don’t care what happens in a toaster. They don’t care that it might not even be a toaster, nor that what they are eating is not toast and was not made with bread. They just want to believe that they are eating toast. Go ahead and give them toast, but make it the toast they should eat, and make them slow down just enough to have their toast with a bit of marmalade or preserves, not jelly or jam. A little bit of the raw material should be there. You give them what they want, but you get to be the you who does it, and you are a law librarian.

Condensed Categories

For purposes of drafting the memorandum to management, there are two reasons we need to reorganize and condense the categories of the SCOUG Rating Scale.

First, ten is more diffuse than management will assimilate. Any list with more than seven items begins to befuddle. Five or six would be better. For management, one of them must be the executive summary. That leaves room for only four or five other categories.

Second, management will not read a memorandum longer than two pages. In many settings, management prefers one page. One page will not accommodate ten category headings with as little as one sentence under each.

We need to condense and reorganize, but is it feasible? Susan B. Hersch presented an outline for evaluation of online legal information services that uses seven categories: Provider, Coverage, Source, Accuracy, Currency, Search Engines, and Cost. 3 Alyssa Altshuler has formulated what she calls Five Steps to evaluating online legal information services, and those steps hit on many of the quality factors included in the SCOUG Rating Scale under the headings: Content, Accuracy, Clarity, Reliability, and Cost.4 This suggests to me the feasibility of condensing and reorganizing the ten categories of the SCOUG Rating Scale so that management will assimilate our evaluations of online services.

An IDM Cheat Sheet

Now we can create a template of an interdepartmental memorandum to management evaluating an online legal information service, a cheat sheet, if you will.

Executive Summary
    disqualifiers
    value-to-cost ratio
Content
    coverage and scope
    timeliness
    accuracy / error rate [authenticity or validity]
Output
    output (database records or document format)
Use
    accessibility / ease of use
    consistency
    integration
    output (hit list, ranking, sorting, KWIC, etc.)
Help
    documentation
    customer support and training

Nine of the SCOUG categories condense into Content, Output, Use, and Help. The tenth SCOUG category, Value-to-Cost Ratio, goes into the Executive Summary.

The IDM cheat sheet also adds “disqualifiers” to the Executive Summary. A prime example of a disqualifier is a service that lacks coverage or scope that a firm or organization must have and without which it could not consider subscribing.

The SCOUG category of Output is split and covered partially under Output and partially under Use. From a management perspective, output issues like hit list, ranking, sorting, KWIC, etc. are part of Use. Management perceives “Output” as having to do with database records and document formats — final output, if you will, rather than intermediate output.

This cheat sheet gives us an outline or template where we can plug in answers. The conjunction of the thought process behind the analysis of an online service and the thought process behind communicating with management is largely capitalized in this tool. It is like farming with implements rather than with fingernails.

Using the Checklist and Cheat Sheet

Three key concepts guide the use of the checklist and cheat sheet.

The first key concept is:

In your own mind, you would be working from the perspective of the ten SCOUG categories, but on paper you would target management’s perspective under the condensed categories.

The cheat sheet is used in conjunction with the checklist. You perform your analysis with the whole checklist under the ten categories, and then report to management with the cheat sheet under the condensed five categories. If you try to work only from the management perspective of the condensed categories, you will stumble into the pitfall of lacking circumspection. Remember that we use an established rating scale partly for the sake of circumspection, and we’ll lose circumspection if we start with the cheat sheet instead of starting with the checklist. While the report to management is simplified, the analysis that precedes the report must not be.

The second key concept is:

After considering the questions on the checklist, one would report to management only the most significant facts relating to a particular service.

For example, Quicklaw America has some quirks with a few of its operators. For that system, the traits of those operators are significant facts that need to be reported concisely, but for other services one might report nothing about operators other than that the query language is standard Boolean.

The third key concept is:

Certain points always have such significance that they must be reported, such as coverage and scope.

Coverage and scope must be reported for all systems under Content. Coverage and scope also will show up often under Disqualifiers in the Executive Summary.

Completed Example

To demonstrate these tools in use, we need a subject online service. The rise of alternative online sources of American legal information continues with the introduction of Fastcase.com. I’ve been watching the development of Fastcase for well over a year. As the pieces jelled, I found myself repeating George Peppard’s line, “I love it when a plan comes together.” The creators of Fastcase have studied reviews of other legal information services. They have taken evaluation criteria to heart and have avoided many of the common failings of the alternative online legal information services.

The link points to a completed Inter-departmental Memorandum evaluating Fastcase. The exercise assumes that the firm is in New York.

With these capital tools and an understanding of how to use them, we can transform “Oh no, not this again” into “Oh yes, this again.”

Footnotes

1 Reva Basch, “Measuring the Quality of the Data: Report on the Fourth Annual SCOUG Retreat,” Database Searcher, vol. 6, no. 8, October 1990, pp. 18-24.

2 I had previously proposed that searchers “use the Web to conduct and publicize SCOUG-inspired quality evaluations of selected Web resources.” T. R. Halvorson, “Searcher Responsibility for Quality in the Web World,” Searcher, vol. 6 no. 9, October 1998, pp. 12-20.

Reviews taking this approach include:

Loislaw: T. R. Halvorson, “The LOIS Law Library: A View through the Southern California Online Users Group Rating Scale Lenses,” LLRX.com™, March 1, 1999.
VersusLaw: T. R. Halvorson “VersusLaw’s V.: A View through the Southern California Online Users Group Rating Scale Lenses,” LLRX.com™, March 15, 1999.
Jurisline: T. R. Halvorson, “Jurisline.com: What You See … What You Don’t See,” LLRX.com™, January 17, 2000.
National Law Library: T. R. Halvorson, “National Law Library: A View through the Southern California Online Users Group Rating Scale Lenses,” LLRX.com™, May 1, 2000.
Quicklaw America: T. R. Halvorson, “Quicklaw America: A View through the Southern California Online Users Group Rating Scale Lenses,” LLRX.com™, October 2, 2000.
RegScanLaw (formerly EastLaw): T. R. Halvorson, “EastLaw: A View through the Southern California Online Users Group Rating Scale Lenses,” LLRX.com™, January 2, 2001.
LawProbe: T. R. Halvorson, “LawProbe: SCOUG Rating Scale Review in Brief,” LLRX.com™, January 2, 2001.
CaseClerk: T. R. Halvorson, “CaseClerk.com: SCOUG Rating Scale Review,” LLRX.com™, March 1, 2002.
Casecrawler: T. R. Halvorson, “Casecrawler: SCOUG Rating Scale Review,” LLRX.com™, May 1, 2002.

3 Susan B. Hersch, “Inundated with Offers for Legal Research Services on the Internet? Sorting out the Good, the Bad, and the Ugly,” LLRX.com™, May 1, 2001.

4 Alyssa Altshuler, “An Overview of Five Internet Legal Research Alternatives to Westlaw and LexisNexis,” Virginia Lawyer, October 2001.
