From the Reference Desk

Reviewing the Internet

Column Editor: Tom Gilson (College of Charleston) (gilsont@cofc.edu)


The Infofilter Project

The Internet is like a giant "Information Bazaar" in which many of the stalls are heaped with wondrous riches. Unfortunately, many other stalls are either empty or full of useless junk.

Just as in the print world, the quality of reference materials on the Internet varies dramatically. But unlike the print world, where publications like RQ, Choice, Booklist, and ARBA monitor quality through their reviews, the Internet offers only meager evaluation of specific resources.

At this year's Charleston Conference held in November, James Rettig presented a paper entitled, "Putting the Squeeze on the Information Firehose: the Need for Net editors and Net reviewers." In that paper, Rettig did an excellent job of outlining this problem and evaluating some recent attempts at reviewing Internet resources. By far, the most promising attempt Rettig uncovered was the Infofilter Project.

The Infofilter Project is a collective effort whose major players are librarians and other information professionals interested in evaluating the quality of Internet resources. Infofilter is actually a three-tiered process, starting with the Inteval listserv, which serves as a discussion list for reviews and policy. First drafts of reviews are posted to Inteval, where Infofilter participants can make comments and suggestions, which are in turn incorporated into later revised drafts.

A quick look at Inteval shows that most reviews are submitted in HTML. According to Eileen Flick, Infofilter's HTML editor, this is not mandatory, but it has become customary, mainly because of the review template provided for reviewers. The template ensures that reviewers include specific technical information as well as the review itself, and it is made available via the second tier of the Infofilter process, the Working Home Page.

The Working Home Page contains this review template, which requires technical information such as URL, type of resource, host institution, author/manager, size, formats, and keywords. The template also provides space for the review, with the recommendation that the review be "evaluative and thorough without getting over 300-500 words." Any special notes regarding graphics, loading speed, interface, or connectivity come next, and finally a space is provided for references and links to related Internet sources. In addition, the Working Home Page lists detailed guidelines for assigning keywords when filling out the template and archives new and revised reviews, giving Infofilter participants and others easy access to reviews in process.
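To give readers a sense of how such a template might be laid out, here is a minimal hypothetical sketch in HTML. The field names follow those described above; the markup itself and the placeholder entries are invented for illustration and are not reproduced from the Project's actual template.

<!-- Hypothetical sketch of an Infofilter-style review template.
     Field names follow those described in the column; the markup
     and placeholder entries are illustrative only. -->
<html>
<head><title>Infofilter Review (sample layout)</title></head>
<body>
<h1>Review: [Resource Name]</h1>

<!-- Technical information required by the template -->
<dl>
  <dt>URL</dt>              <dd>http://example.org/resource/</dd>
  <dt>Type of resource</dt> <dd>[e.g., subject guide, index, full-text archive]</dd>
  <dt>Host institution</dt> <dd>[institution name]</dd>
  <dt>Author/manager</dt>   <dd>[name and contact]</dd>
  <dt>Size</dt>             <dd>[approximate size]</dd>
  <dt>Formats</dt>          <dd>[HTML, Gopher, FTP, etc.]</dd>
  <dt>Keywords</dt>         <dd>[assigned per the Working Home Page guidelines]</dd>
</dl>

<!-- The review itself: evaluative and thorough, roughly 300-500 words -->
<h2>Review</h2>
<p>[Evaluative review text goes here.]</p>

<!-- Special notes on graphics, loading speed, interface, connectivity -->
<h2>Special notes</h2>
<p>[Notes, if any.]</p>

<!-- References and links to related Internet sources -->
<h2>Related sources</h2>
<ul>
  <li><a href="http://example.org/related/">[Related resource]</a></li>
</ul>
</body>
</html>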

This brings us to the Infofilter Home Page itself, where the final reviews reside. After a review has been submitted to Inteval and archived on the Working Home Page, if it has drawn no comments for a few weeks, the reviewer can make a request-for-close. If the editors deem the review finished, it is then added to the official Infofilter Home Page, where it can be viewed. The reviews can be browsed in three ways: by most recent review, by subject keyword, and by resource name. (The Netscape find command allows word searching on all these lists.) Users can also search the full text of the reviews using a WAIS search engine.

The Infofilter Project is a serious and sophisticated attempt at providing reviews of Internet resources. Currently, it is limited in scope, consisting of approximately fifty reviews, counting those on both the working and official home pages. Hopefully, it will continue to grow as more people get involved with the Project. The quality of reviews is good overall, but there are no stated criteria other than being thorough and not exceeding 500 words. (Eileen Flick indicated that the "most commonly bandied-about comparison is with Choice.")

In his Charleston Conference paper, James Rettig raised a concern about criteria in Net reviewing. With its template, the Infofilter Project provides some criteria, particularly for needed technical information. But Rettig's concerns about criteria to aid in type-of-source definition and in content evaluation remain a problem for Infofilter. Rettig calls for the Infofilter team to "help settle the vexing questions about genre" and to "clarify criteria, appropriate to the Internet environment, for judging Internet resources." Rettig is right. The folks at Infofilter are making a valuable contribution to librarians and other Net users, but they can play an even more important role in setting standards for Internet reviewing. They are in on the ground floor with an idea whose time has come, but the job has just started.

Like IFB Abstracts (reviewed in this column, ATG, Sept. 1995), the Infofilter Project is an example of creative self-publishing that uses the Web as its outlet. Let's hope such efforts mark a continuing trend, because they represent a new and exciting way in which librarians can help the information community at large, as well as each other.

(The official Infofilter Home Page is at http://www.kcpl.lib.mo.us/inteval.htm, while the Working Home Page is at http://www.usc.edu/users/help/flick/Reviews/index.html.

A revised copy of James Rettig's Charleston Conference paper is available on the Working Home Page or directly from http://www.swem.wm.edu/firehose.html. Those interested in participating in the Infofilter Project can send e-mail to inteval@kcpl.lib.mo.us. Enjoy!)


Against the Grain's home page: http://www.spidergraphics.com/atg.

Reprinted with permission of the author.

Against the Grain, February 1996, vol.8 no.1, p.58.