Contents: Introduction | To the 1960s: Known by Inference Only | From the 1960s to 1972: Branching Out | 1973-1980: Split Card Catalogues | 1980-1988: GEAC and ReCon | 1988-2000: NOTIS | 2000- : Voyager
This document was written in order to record, before a looming major turnover of staff in the Cataloguing Unit, what can be remembered about the development of the Queen's University Library catalogues that may be useful or illuminating in the future, at least in accounting for various persisting peculiarities. It includes some account of technological and organizational changes, but it is not a history of the Cataloguing Unit or the people in it. No personal names are mentioned.
Some developments which overlapped more than one period are discussed only once, in what seemed the most convenient spot.
In part this document is a nostalgic [?] description of the bad old days, but it also shows how far we have come and how much change the Cataloguing Unit has dealt with over the years.
Most of this account was produced from memory, with contributions by several current and former staff members. No systematic consultation of documents was attempted, but some material has been added from documents which turned up serendipitously.
Readers who are able to suggest additions or corrections are invited to do so.
To the 1960s: Known by Inference Only
In Special Collections there are copies of book catalogues of the Queen's College library (see under title: Catalogue of books in Queen's College Library) for the dates 1853, 1860, 1863, and 1875. A note on the record has: "The 1875 edition is a cumulative catalogue, interleaved with blank pages and annotated with further additions to the collection." Eventually this type of catalogue was found to be too cumbersome to keep up to date.
There was a card catalogue at Queen's long before anyone can now remember. Early card sets produced in-house often had minimal information on all but the "main entry" card, which then had "tracings" added at the bottom, or often on the back, recording where the added entries were filed and (sometimes) the cataloguer's initials and date. In order to change a call number or a heading, it was necessary to remove, correct, and refile the entire card set, which explains why little recataloguing and reclassification was done during the card catalogue era.
The main collection in Douglas Library and its branches was classified using the Library of Congress system from an early date. However, author cutters and literary numbers often diverged considerably from LC, because a high proportion of the cataloguing was original, and because it was considered more important to fit numbers into the local shelflist than to follow any other source.
Most fiction was classed in PZ until the mid 1960s, when it was decided to class it as literature. Thousands of PZs remain today, though since at least the 1980s it has been standard practice to reclass PZs when an author's literary number is reviewed (usually when a new title is added).
LC numbers were not used for folios during a period that ended in the mid 1960s. Instead, folios were classed in sequential numbers starting with arY and arZ. Some remain today.
From the 1960s to 1972: Branching Out
The 1960s were an era of expansion and change. The library system and the Douglas Library building both more than doubled in size within a few years. Budgets and staff grew accordingly. The Douglas Library Cataloguing Department eventually grew large enough to overflow its quarters in the south end of Douglas's main floor into underground seminar rooms, which continued to be needed even after the Department moved to the north end of the ground floor in 1971.
By this time the public card catalogue had been divided into Subject and Name-Title sections (with personal name subjects filed in Name-Title). The Name-Title Catalogue included one temporary slip each for titles on order or in process, and was also the union catalogue for the library system. The Cataloguing Department also maintained an internal card shelflist (from early times; some people remember cards handwritten in "library hand") and name and subject authority files.
For some years (from about 1967 through the early 1970s) there was a subscription for LC deposit cards, which were filed by year, pulled if the title was acquired, otherwise discarded after two or three years (by which time the records were available in the printed NUC catalogs). When the deposit card was pulled it was sent with the book for cataloguing. The cards were then sorted by the number of copies required for a complete set (depending on the number of added entries), and eight cards were stuck (using wax) to a sheet labelled with the number of copies (called "8-up"). A similar procedure was followed for cards typed from original cataloguing work, and for copy photographed from the printed LC/NUC catalogs. These master sheets were reproduced by Queen's Printing and sent to Cataloguing to be resorted. Added entries were then typed across the tops of the various copies of the card (or, in one ill-advised attempt to save labour, by highlighting or checking the tracing at the bottom, which was quite obscure to users). (At another time cards were reproduced on a noisy machine kept in the Cataloguing Unit.)
Meanwhile, separate cataloguing operations were established in the three faculty libraries. Copies of main entry cards for fully catalogued book titles from these libraries were supposed to be sent to Douglas and filed into the union catalogue, though there were frequent delays in the process, and no guarantee that all of them made it. Health Sciences began by using the LC system, but in 1968 switched to the National Library of Medicine classification system and MeSH subject headings for monographs, while arranging periodicals by title. Law (the faculty was established in 1957) used an in-house adaptation of a system developed in Los Angeles for monographs (except international law, which was classed using LC's now obsolete JX class and the Cutter-Sanborn tables); periodicals, not catalogued at all, were arranged by title, and law reports, etc., also uncatalogued, by jurisdiction and subject. The Education library, which was established in Douglas Library in the late 1960s and moved to its new building in 1971, used LC for books (though they frequently catalogued the same titles in different numbers from Douglas), and from about 1971 developed an in-house classification system for audio-visual materials. Education used LC call numbers for periodicals until about 1998, when they switched to arrangement by title.
Douglas Cataloguing continued to catalogue for the numerous smaller branch libraries (nine or ten science and engineering branches, plus psychology, art, music, theology; most of which had been established for a long time), though full card sets were created only for the branch catalogues, not for the union catalogue as well. Only the main entry card in the union catalogue had reliable information about branch books. (Branch staff had to phone the Reference desk to find out if a title was available on campus. Circulation status, of course, was available only at the holding library.) The Theology Library was reduced to Reading Room status and the main collection incorporated into Douglas in the early 1970s, but a "Theology" tag was added to the records and to all subsequent records for titles bought on the Theology fund (in case the decision might be reversed?).
(During this and the following periods, other smaller libraries also existed in other centres and institutes around the campus, but as long as they were independent from the main library system they will not be dealt with here.)
Within the Douglas and branches collections, print periodicals were given full call numbers when bound (and in Douglas and some branches interfiled with monographs, though some branches filed them in separate sections, sometimes by title). However, they were not usually given subject headings, or if they were the subdivisions were often "Societies, etc." or "Yearbooks". Holdings information for current serials was not in the catalogue at all, but was recorded in the Serials Checking File (also called the Kardex), open to the public but often a challenge to interpret. Branch libraries had Kardexes that duplicated information in the one in Douglas; faculty libraries had their own versions independent of Douglas.
Microforms were not classified at all, except that microfilm monographs were given sequential numbers within broad LC classes. The Early English Books microfilm set was purchased on a subscription handled by the library, but held in the English Department without being recorded in the catalogue at all; access was by printed lists.
During the boom period Acquisitions was very well funded and purchased large quantities of books that Cataloguing, cramped for space, did not grow large enough to process. As a result a large backlog built up, accessible only through the one copy of the multi-part order slip that was filed in the card catalogue. In about 1971 it was decided that the backlog should be on open shelves: over 40,000 books were assigned sequential "PRE" numbers, with green stickers on the books and on another copy of the order slip, which replaced the first in the catalogue. Since the order of the numbers was more or less random, this did not greatly enhance access, though many of the branch libraries did select and take away titles relevant to their collections before they were hopelessly out of date.
The Documents collection had been arranged by jurisdiction and department even in an uncatalogued state. In the late 1960s another separate operation began there, "coding" titles using a modified version of the CODOC classification system (which had been created at the University of Guelph as a "quick and dirty" way to organize government publications using a computer, without benefit of authority files or printed schedules). A coding sheet was filled out by hand for each item (in upper case characters only), the coding sheets were sent to another building for key-punching, and at intervals cumulative printouts were run. Subject access was by KWIC (keyword-in-context) printouts (no subject headings as such were assigned). This was the first computerized cataloguing project at Queen's. As such, it was completely independent from the card catalogue; users had to find out about the Documents collection by other means. In 1975 the "Queen's version" of CODOC was given up and replaced by the version then in use at Guelph and elsewhere (the original CODOC modified by experience), in a project that made use for some time of shared coding among co-operating libraries, using a microfiche union list. By the 1980s this was found to be less convenient than original coding, the several CODOC libraries went separate ways, and the union list lapsed.
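For readers unfamiliar with KWIC, the technique can be illustrated with a short sketch (in Python, and entirely hypothetical; the actual CODOC programs are long gone and nothing here is taken from them). Each title is rotated so that every significant word in turn becomes the filing point:

    # Minimal keyword-in-context (KWIC) sketch. The stopword list and
    # all names are illustrative, not drawn from CODOC itself.
    STOPWORDS = {"a", "an", "and", "for", "in", "of", "on", "the", "to"}

    def kwic_index(titles):
        """Yield (keyword, rotated title) pairs, sorted by keyword."""
        entries = []
        for title in titles:
            words = title.split()
            for i, word in enumerate(words):
                if word.lower() in STOPWORDS:
                    continue
                # Rotate the title so the keyword leads; "/" marks the wrap.
                entries.append((word.upper(), " ".join(words[i:] + ["/"] + words[:i])))
        return sorted(entries)

    for keyword, line in kwic_index(["Report of the Royal Commission on Taxation"]):
        print(keyword.ljust(12), line)
    # COMMISSION   Commission on Taxation / Report of the Royal
    # REPORT       Report of the Royal Commission on Taxation /
    # ROYAL        Royal Commission on Taxation / Report of the
    # TAXATION     Taxation / Report of the Royal Commission on

Every non-trivial word of every title thus appears once in the alphabetical printout, with the rest of the title wrapped around it for context - subject access without subject headings.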
The second computerized cataloguing project was undertaken about 1970 when it was decided that Douglas Library circulation should be automated. Shelflist cards were put through a machine which created brief records by OCR (optical character recognition); white punched cards were produced and inserted in the book pockets -- a big project, and then another step in cataloguing each new book. These records typically included one author, a portion of the title, publication date, call number, and accession number. (Note: the accession numbers from the beginning were also stamped inside the books, and were the unique identifier when confusion arose among different copies or editions. They ceased to be assigned or recorded when the collection was barcoded in 1980.) The shelflist conversion project stalled at QE471 and was only resumed some years later (completed in 1978), but records from new cataloguing continued to be added to the database. Printouts of items charged out, long- or short-term, were produced daily and kept at the Circulation Desk; there were also printouts of materials on Reserve, and printouts of the whole database in shelflist order. Pink punched cards were produced for volumes of selected serials, for circulation purposes. A master record for each serial title was created with SCF (Serials Checking File) or CLDE (Closed Entry) at the end of the call number. Some of these oddities may still exist in the database.
A project was undertaken a little later to convert the Health Sciences shelflist. However, the company involved went bankrupt and the work was lost.
1973-1980: Split Card Catalogues
The era of rapid expansion ended in the early 1970s, to be followed by budget cuts and staff cuts. The most dramatic and controversial of these took effect early in 1973, when there was a large cut in Douglas Cataloguing staff, accompanied by the splitting of the Name-Title card catalogue. The official theory was that fewer staff would be needed to file into a small, "clean" catalogue than into a large and complex one. The Old NTC was frozen and the New NTC started. At the same time a separate On Order and Received file was created for temporary slips produced after that date (still one per title; filed by main entry for some years, then refiled by title). These changes were, to say the least, a considerable inconvenience for users and public service staff, and the extra staff time spent in searching offset any savings in filing.
In retrospect, there were benefits other than reduced filing time. During the preceding period (from about 1967) the introduction of the Anglo-American Cataloguing Rules (AACR1) had resulted in massive changes in cataloguing practices at LC and elsewhere, especially in the form of corporate headings (e.g. "Toronto. University" was replaced by "University of Toronto"). Douglas Library Cataloguing had not yet adopted all of these changes (though the Law Library, at least, had - the Law Library made very few changes in whatever copy was found) because it was not feasible to change headings on so many records and it was thought undesirable to create large numbers of split files. The New NTC started clean, following the new rules and LC practice. (This was not the end of changes in rules, of course. In 1975, ISBD (International Standard Bibliographic Description) punctuation changed the look of catalogue cards. By the end of the decade the first version of AACR2 had been released, with a further set of headings changes. It goes on. However, none of the later changes has been as disruptive as the one associated with AACR1).
From 1973, also, LC call numbers, if available, were followed exactly, regardless of the effect on the shelves. This was serious, especially in literature, since the literary number of an author often differed considerably from the one that had been assigned in-house, and another author or authors came in between the two sequences. (At first very little reclassing took place to rectify this situation, but in later years much more was done, and not too many inconsistencies remain today.) Another aspect of this change to strictly LC call numbers was that bibliographies, formerly classed with the subject, were now classed in Z.
A considerable number of books were still given original cataloguing in-house. To prevent locally created call numbers from conflicting with LC numbers for future acquisitions, Douglas adopted the practice of adding a "t" (for tentative? or temporary?) to the end of local numbers. This was also done when copy did not reflect a final LC decision, e.g. for LC's CIP (Cataloguing in Publication) records, since the title or main entry sometimes changed after CIP. In 2003 this is still in effect for local numbers and other numbers not clearly of LC origin, except in non-LC number ranges adopted for Canadian material.
Certain exceptions were made to the "follow LC" rule. After the 1978 publication of the PS schedule developed by the National Library of Canada, Canadian literature was reclassified from LC's PR and PQ numbers into PS8000 and PS9000 (though, since NLC's catalogue was not available to us, we assigned our own author numbers). Canadian history had originally been classified using LC's F1000-F1140 number range, but Queen's agreed with other Canadian libraries in finding this inadequate. When the F5000 classification was published in 1967, we used it, and when in 1976 the even more spacious FC class was developed we adopted it. Since it has never been possible to undertake a major reclassification of either the F1000 or F5000 books, Canadian history remains split into three sequences.
There were various arguments and policy changes over the years about the use of Canadian Subject Headings. At one time only the extreme cases were allowed (as, e.g., Federal-provincial instead of Federal-state), but more recently we have accepted Canadian period subdivisions for history etc. and a range of others where LC subject headings were felt to be inadequate or inappropriate.
Over the years the Special Collections Unit has acquired a number of specialized collections for which internal listings or partial cataloguing were done, as well as some which were fully catalogued. The dated collection (pre-1700 imprints) and the English and French political pamphlets collections (to about 1840), for example, were given full description and subject heading treatment, but their call numbers were based on publication dates, not subject. The most important example of partial cataloguing was that thousands of Canadian pamphlets within the Lorne Pierce collection were given call numbers composed of a single basic number (originally F1028, then F5012, the two being interfiled, and never switching to FC) and the date, followed by a cutter. These records were in many cases created by staff without cataloguing training (though Special Collections did have at least one trained cataloguer in the unit for several years), and often had inappropriately general subject headings. Some records were filed into the union catalogue, but many were recorded only in the unit's internal files (and are only now being added to the main catalogue, again by staff without full cataloguing training).
During the 1970s searching for copy continued to be done in the printed NUC catalogues, but the last 5-year cumulation of NUC was that for 1973-1977, with annual cumulations to 1981. Microfiche cumulations from NUC, Canadiana, and other sources came into increasing use instead, and microfiche reader-printers became essential equipment (in place of the Polaroid camera used to take pictures of records in the printed catalogues). Sometime in the later 1970s we also purchased cards (which might come with the books or be ordered by a number determined from a microfiche index) from Abel (or Blackwell / BNA).
About 1977 the LC shelflist on microfiche was purchased and proved helpful to cataloguers assigning call numbers. However, after one or two supplements it was not updated. Today LC's online shelflist serves us well, but the microfiche remains useful for the treatment of voluminous authors, where the explanation handwritten on a card may be more illuminating than the online shelflist alone.
Starting in the late 1970s and continuing, as the Douglas and other stacks became full, less-used materials were selected and moved to a succession of storage locations, which led to a great many location changes in catalogue records [recorded only on the Main Entry cards, however].
By the end of 1979, the saving resulting from reduced filing after the split of the card catalogues in 1973 had largely disappeared with the growth of the New NTC. For the record, the size in 1980 of the main card catalogues in Douglas was: 15 cabinets of Old NTC, 9 cabinets of New NTC, 12 or 14 cabinets of Subject Catalogue - each cabinet having 72 drawers containing close to 1000 cards each - and one smaller cabinet of On Order and Received. At 36 to 38 cabinets of 72 drawers each, with close to 1000 cards per drawer, that works out to something like two and a half million cards.
In the late 1970s a variety of proposals were prepared for a move toward more automation. The shelflist conversion was to be completed (this did happen), and the records upgraded to full LC-MARC by matching to another database such as Blackwell/North America, and a catalogue produced, perhaps in COM (Computer Output on Microform) format or perhaps online. Faculty shelflists were to be converted in the same way, and serial records to be added with acquisitions as well as holdings information, while an online acquisitions system would also be instituted. A draft budget proposal dated April 19, 1977, estimated that all this could be done, with a COM catalogue, in three years for $900,000, plus another $300,000 for an online circulation system, $100,000 to complete coding of the Documents collection, and $300,000 for cataloguing of the inactive backlog of 40,000 volumes (the PRE books). However, the money was not available at that time.
1980-1988: GEAC and ReCon
By the end of the 1970s the technology for a usable online catalogue was finally available. After a review of the options, it was decided in 1979 to purchase the GEAC system.
GEAC ran on its own minicomputer and was accessed on special-purpose terminals with small monitors on swivel heads; by the end of the eight GEAC years these machines were plagued by a high rate of sticky keys and other malfunctions, and many had broken down altogether. There were never enough in Cataloguing to go around, so that librarians wrote out information on a work slip to be typed in by others at the bank of terminals in the middle of the room. There was a thorough change in procedures including much new training, parts of which were spread throughout the GEAC period.
The database of brief records created for Circulation purposes was loaded into the GEAC computer in May 1980, except that the accession numbers were dropped, and ceased to be assigned. The information in the database was incomplete and full of problems. No subject headings were included. There were no cross-references. Some records had very brief author information, some had full headings, others were somewhere in between. (Bertrand Russell's name appeared under thirteen separately filed forms.) Since each record represented an item, multi-volume sets were particularly clumsy. Most records were in upper case only. The first search interfaces were also not very user-friendly - and the users of those days were often computer illiterate; many faculty members and students had never used a computer of any kind before.
In order to provide the necessary interface with Circulation, barcodes (large size and stuck down with Mylar) were produced with call numbers from the database printed on them. In a large project during the summer of 1980 these barcodes were stuck into the books (inside back cover, top right). In addition, item records were created for thousands of volumes of serials for which individual records did not exist previously. Unfortunately it turned out a little later that these barcodes did not "wand" very well, and they were replaced with smaller equivalents as they circulated, or in later projects. Some Special Collections items, and possibly others, still have the old barcodes.
From June 1980 new cataloguing was done on GEAC and card sets were no longer produced, and by 1982 even temporary paper records for new titles were no longer filed. This had the short-term effect of splitting the catalogue again!
Funding had been obtained not only to buy the system, but also to undertake a Retrospective Conversion project (ReCon). This began in June 1980. In order to input the most complete records, the "main entry" cards in the Name-Title card catalogues were used, starting with the New NTC (the titles in highest demand and the best records) and proceeding after a couple of years to the Old NTC. Unfortunately, in the early version of GEAC full MARC coding could not be used in the variable fields. It was not found practical to have the ReCon typists enter full MARC codes in the fixed fields either, and a simplified list of place and language codes was adopted. (Fixed fields in these records still contain codes like "xx" and "und" much more often than they should. The simplified list also included non-MARC codes like "eur" and "lat", as well as "ca", "us" and "uk" - since converted to "xxc", "xxu" and "xxk" respectively. It did include a code for Ontario, and the local code "kin" for Kingston imprints.)
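The later code conversion can be pictured as a simple lookup (a hypothetical Python sketch; only the three conversions named above are from the record, and the function is not the actual program that was used):

    # Simplified ReCon place codes -> MARC country codes, per the text.
    RECON_TO_MARC = {"ca": "xxc", "us": "xxu", "uk": "xxk"}

    def fix_place_code(code):
        """Map a simplified ReCon code to MARC, leaving unknown codes
        (including the local "kin" for Kingston) untouched for review."""
        return RECON_TO_MARC.get(code, code)

    assert fix_place_code("ca") == "xxc"
    assert fix_place_code("kin") == "kin"   # local code survives as-is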
The following record, transcribed from instructions written about 1980, shows approximately the format in which cataloguing (both new and ReCon) was entered in the first version of GEAC. The $ sign was the delimiter, followed by a capital letter to indicate another field of the same type, or by a lower-case letter to indicate a subfield. Note the collation included in the Imprint, and the run-on Notes.
CALL NO.: RA566.C29
AUTHOR: Calabrese, Edward J., 1946-
TITLE: Pollutants and high risk groups : the biological basis of increased human susceptibility to environmental and occupational pollutants $B Environmental science and technology
IMPRINT: c1978 $a New York :$b Wiley $c xviii, 266 p. : ill. ; 24 cm. - (Environmental science and technology)
SUBJECTS: Pollution - Toxicology $B Environmentally induced diseases - Etiology
LC CARD NO.: 77-13957
ISBN/ISSN: 0-471-02940-8
LANGUAGE: ENG
COUNTRY OF PUBLICATION: US
FORM: M
STATUS: C
NOTES: "A Wiley-Interscience publication." Bibliography: p. 206-244
This record, apparently entered as a sample for ReCon, is still (2004) in the database, with no sign that it has been changed in any way other than in the three machine conversions that have occurred since (except perhaps the removal of the series from the collation), and looking respectable apart from some missing subfield coding. Many other records have presented more problems.
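As a reading aid only (a hypothetical Python sketch, not GEAC's own software), the delimiter convention can be decoded mechanically: "$" plus a capital letter opens a repeated field, "$" plus a lower-case character opens a subfield:

    import re

    def split_geac_value(value):
        """Split an early-GEAC field value into repeated fields, each a
        (lead_text, [(subfield_code, text), ...]) tuple."""
        parts = re.split(r"\$(?=[A-Z])", value)
        fields = [parts[0]] + [p[1:].lstrip() for p in parts[1:]]
        parsed = []
        for field in fields:
            pieces = re.split(r"\$(?=[a-z0-9])", field)
            subfields = [(p[0], p[1:].strip()) for p in pieces[1:]]
            parsed.append((pieces[0].strip(), subfields))
        return parsed

    # The IMPRINT from the sample record above:
    print(split_geac_value("c1978 $a New York :$b Wiley $c xviii, 266 p."))
    # [('c1978', [('a', 'New York :'), ('b', 'Wiley'), ('c', 'xviii, 266 p.')])]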
ReCon was finished in February 1985. A Progress report on library automation prepared in 1985 gave the average ReCon cost per record as $1.18, which was considered very efficient, and the total number of records converted as 843,153 (Douglas, branches, Education, and Bracken, 702,927; Documents, 109,067; Archives, 30,218; Film Studies, 941).
Eventually the GEAC system was upgraded to a version called MRMS (MARC Record Management System), which made full MARC coding possible. A machine conversion was done in the summer of 1985 to fit the ReCon records into MARC fields. The results were useful but could not be completely satisfactory. Many of these records have scarcely been touched since (or if they have, it was most likely to correct a single heading or other problem, so that most of the ReCon marks remain). They usually have all the information that was on the card, but added entries are apt to appear as principal authors, all the notes are run on in a single paragraph, a series statement may be tacked onto the 300 field, and coding is quite basic, especially in subject fields. 600, 610, etc. fields are coded as 650, with |x subdivisions only (unless a |t was involved, in which case subsequent 650s may be coded as 600!). All records were supplied with an 040 field giving CaOKQ as the cataloguing library, though this was often untrue; no real effort has ever been made to correct this, though LC card numbers are often included in the same records.
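Records of this vintage could in principle be flagged for review automatically (a hypothetical sketch; nothing like this was actually run): an 040 claiming CaOKQ origin alongside an LC card number in the 010 is a strong hint of a converted ReCon record.

    def suspect_recon_040(record):
        """record: dict mapping MARC tag -> list of field strings.
        True if the record claims CaOKQ origin but carries an LCCN."""
        claims_local = any("CaOKQ" in f for f in record.get("040", []))
        return claims_local and bool(record.get("010"))

    sample = {"010": ["   77013957 "], "040": ["|a CaOKQ |c CaOKQ"]}
    print(suspect_recon_040(sample))   # True -> worth a second look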
The same MRMS upgrade made it possible for multiple volumes to be included in a single record. A project was undertaken to link the appropriate item records together. (It did not catch everything; a few unlinked item records still lurk in the database.)
MRMS was not very satisfactory for actual cataloguing, since each field had to be added or edited as a separate command, and response time was frequently very bad. Staff entering data spent far too much time waiting for the machine to respond.
This new GEAC system had a much improved public interface which grouped like headings together. Previously a user who typed in (a modest example) "Kingston Ont" as a subject search term would have to go forward through dozens of screens before coming to "Kingston Ont -- History"; now the same searcher could choose the desired heading after only a few screens.
GEAC index rebuilds were not very efficient. During at least part of the period, each index could be rebuilt only every three weeks, which meant a long delay before records from new cataloguing could be found.
Response time was often maddeningly slow, but experienced users could speed up the interface by "command chaining"; for instance, typing in IND/2/MAR would bring up the MARC version of the next record in the index in one step instead of three.
Patron records, Reserve records, and the "Hold" function were not available in the second version. For this reason, a few public terminals continued to be able to bring up records using the first interface. Subject and note fields were stripped out of the records for this purpose.
Towards the end of the GEAC period, an automated acquisitions system came into use. Order numbers from this system, as well as from the pre-automated system, sometimes turn up as notes in Holdings records. Most older bibliographic records still contain an 035 with a "GRSN" number, which was the record number in GEAC. (None of these mean anything now.)
In both early and later GEAC there were no diacritics. ReCon and current cataloguing simply omitted most diacritics, but umlauts etc. were spelled out with an inserted E (as they had been filed in the card catalogue). This practice continued until 1996 in Douglas and Health Sciences, though it was never followed in Law.
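The convention itself is trivial to state in code (a minimal sketch; the exact set of letters handled in practice was never written down):

    # Spell out umlauts with an inserted E, as filed in the card catalogue:
    # Mueller for Müller, Goettingen for Göttingen, and so on.
    UMLAUTS = {"ä": "ae", "ö": "oe", "ü": "ue",
               "Ä": "Ae", "Ö": "Oe", "Ü": "Ue"}

    def spell_out_umlauts(text):
        return "".join(UMLAUTS.get(ch, ch) for ch in text)

    print(spell_out_umlauts("Müller"))      # Mueller
    print(spell_out_umlauts("Göttingen"))   # Goettingen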
Incidentally, computer filing exposed a large number of inconsistencies in headings and other errors in the database which had been hidden in the card catalogue era. These were gradually dealt with, but in GEAC it was still not a very efficient process, since records could only be modified if accessed by number (the GRSN number which remained in the database post-GEAC).
GEAC never included machine authority files of any kind. Each cataloguing library at Queen's continued to maintain its own authority files (or not) on cards. (By this time LC and Canadiana authority files were available on microfiche.) Card shelflists were also maintained, as the GEAC database was not considered sufficiently reliable where copies and volumes were concerned.
Douglas still had its large backlog of PRE books as well as a current float. In a summer project about 1987 the remaining PREs were given brief entries in the database, renumbered as INVs, and, later, put back into closed stacks with significantly improved but still incomplete access through the catalogue. Over the next few years many new titles were given INV treatment when full copy could not be found, often with information input by staff not trained in MARC coding; records created in this way are marked by (among other things) invalid coding in the 245 and empty Date subfields.
Documents was still using CODOC. During the GEAC period the Documents database was dumped into the GEAC catalogue with minimal modifications. GEAC (second version) had keyword search capability (with Boolean search enabled at times on one or two machines, though it placed a very heavy load on the system), and Documents public service staff liked the fact that their all-upper-case records were instantly identifiable in key word search results (GEAC's indexes displayed in mixed case).
During the 1980s Documents also purchased records for the Microlog collection of Canadian government publications on microfiche. These records did not meet normal cataloguing standards, and modifications made as they were loaded into the GEAC catalogue made them worse. For example, in records with title main entries (something never allowed in CODOC), the publisher was automatically removed from the imprint (260) field and made into a corporate author, though the results frequently began with words like "Department" or "Government". (The basic Microlog records did not improve much over the years, but the loading process eventually corrected at least that peculiarity. In 2004 all the original records, now complete with links to electronic versions, were reloaded with some machine corrections, and the messy old records stripped out.)
At some point (perhaps post-GEAC?), records for the Canadian Institute for Historical Microreproductions (CIHM), Early English Books, and other microform collections were purchased and added to the database. Each of these packages too had its own peculiarities.
A project was done in the Film Studies Department to add a collection of National Film Board films to the database. These records were created by students with minimal training. This material is no longer in the catalogue.
The University Archives, which moved from Douglas Library to Kathleen Ryan Hall in the late 1970s, was increasingly independent of the library. However, Queen's theses were still processed and catalogued in Douglas and listed in the union catalogue (including print copies in Archives and elsewhere, and microform copies held in the library's microforms area), as were materials for the Archives reference collection. (Gradually the theses have received less cataloguing, with the assignment of unique numbers being dropped around 1990, and subject analysis for science theses with informative titles ending about 1997.)
In the 1980s Archives began putting records for manuscripts into computers. During the GEAC period Archives records co-existed with library records in the same online catalogue, but this was found unsatisfactory and confusing for users (records for books by a Canadian author were sometimes swamped in a sea of records for manuscripts). After GEAC, Archives records (except theses and reference materials) were moved into a separate database.
The separate cataloguing operation in Education was closed down in 1981; staff moved to Douglas and the shelflist was filed into the Douglas shelflist. All Education ReCon was done at Douglas. Education has continued to catalogue its own Audio-Visual and some other special materials. A special project about 1987 created records for an Education software collection (in the form of floppy disks).
ReCon projects were undertaken in the early 1980s in Health Sciences and beginning in 1985 or soon after in Law. Law ReCon began with machine searches and overlays from the National Library of Canada database, possibly including some wrong overlays which have not been found yet, but resulting in records that do not have the ReCon and other local peculiarities described above.
By the time ReCon was completed, about 1986, the Name-Title and Subject card catalogues were compacted into fewer cabinets and were relatively little used. The New NTC and the Subject card catalogues were dumped in the late 1980s; the Old NTC was moved to the third floor and retained for several more years (until Douglas Library closed for renovations in 1995), as it was thought it might still contain the only record of some materials.
1988-2000: NOTIS
By the late 1980s GEAC no longer served Queen's needs adequately. Investigation of alternative systems led to a contract with NOTIS, then based at Northwestern University, which offered what was then regarded as a state of the art integrated system. The GEAC database was converted to NOTIS in April 1988 and work began in the new system. (Each record then received a NOTIS record number comprising three letters and four numerals. Roughly, NOTIS numbers starting with AA or AB are records from the original dump into GEAC, in call number order; AC and most of AD were added during the GEAC period (including Documents, Health Sciences, and Law records); from about ADW the records were added during the NOTIS period, which ended at about AHE.)
NOTIS ran on the University's mainframe computer, and was accessed, for the first several years, through dedicated dumb terminals, which this time had full-sized monitors. (They could also access other programs running on the mainframe, including versions of word processing and e-mail.) Acquisitions used the same basic record and an attached Order/Pay/Receive (OPR) record which worked quickly and efficiently for experienced staff who had memorized the various codes used.
NOTIS was not a Windows-based system. Only one record could be displayed at any one time. On the dumb terminals, there were no Control or Alt keys, and no such functions as copy and paste. It was, however, possible to program Function Keys 13-24 to insert selected character strings.
In the bibliographic record, variable fields could be added in a continuous string in any order, with fields separated by a special character, and would then rearrange in order by tag number. Indicators were typed between a pair of colons, or defaults would be inserted automatically if only the colons were typed. If a record ran over a single screen it was necessary to page forward or backward to see it all. One of the "bugs" of the program was that it did not handle line wrap well; if a field running over one line was changed in any way, unwanted spaces would appear at the ends of the lines and then had to be closed up again; some such spaces are still to be seen.
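The rearrange-by-tag behaviour is easy to picture (a hypothetical Python sketch, not NOTIS code; the actual separator character is not recorded here, so "^" stands in for it):

    def arrange_fields(entry, sep="^"):
        """Split a continuous entry string into fields, sorted by tag
        (the first three characters of each field)."""
        fields = [f.strip() for f in entry.split(sep) if f.strip()]
        return sorted(fields, key=lambda f: f[:3])

    # Indicators sit between the pair of colons; empty colons take defaults.
    typed = "500::Includes index.^245:10:Pollutants and high risk groups^100:1 :Calabrese, Edward J."
    for field in arrange_fields(typed):
        print(field)
    # 100:1 :Calabrese, Edward J.
    # 245:10:Pollutants and high risk groups
    # 500::Includes index.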
NOTIS did allow diacritics to be entered (the exact method varied over time), though they did not display in the OPAC. Douglas (from 1995 called Central) Cataloguing, and Health Sciences, continued using spelled-out umlauts until 1996, then began to follow LC: there are still many inconsistencies in the database, particularly in personal names, titles, and notes. There is still a lack of consistency between diacritics in different systems, too, which can result in mangled overlays when the problems are not caught (e.g. Québec could become Qubec or Qubebec). Another peculiarity: square brackets did not work properly; records created on NOTIS have angle brackets where they should have square ones.
NOTIS also allowed the creation of online authority files. Online Name Authority work started in 1988, with staff looking up LC or Canadiana authorities on microfiche, copying the vital information, and typing it into a template as it was called for in a record. At one point there was a project to input the authorities from the card files (which did letters S-Z). After the first year or so of NOTIS, we were able to acquire machine files of both LC and Canadiana name authorities, which were then mounted in separate "processing units", from which records could be "derived" into the local processing unit and modified as needed. The in-house Name Authority file was gradually built up to about a quarter of a million records - with a good many changes along the way in procedures, and various policies as to which source should be used in any given case.
Sometime in this period the full LC Subject Authority file was also purchased and loaded into the main database, so that subject cross-references finally appeared in the OPAC. This was updated periodically. In the early 1990s the "global change" function became available and allowed large-scale corrections of both Subject and Name headings, though the procedure was both finicky and clumsy.
The original function of the "processing unit" was to separate the records of the various libraries. There were processing units for Health Sciences, Law, Documents, and what was confusingly called "DL", which included the central library, the branches, and Education. In the early years of NOTIS each processing unit had exclusive control over its records, so that notices of typos had to be sent from one cataloguing operation to another. Needless to say, if all four processing units happened to have copies of the same title, there were (at least) four separate records. In later years, the processing units shared permission to change one another's bibliographic records (mainly to correct typos), and in some cases holdings as well, but it remained necessary to have separate records.
In about 1993 the Law Library abandoned its in-house classification scheme (in which all the numbers began with K) and began using LC, with old and new classifications shelved separately. (Note: LC's K classification for Law had been coming out piecemeal since about 1969, though it is only approaching completion in 2003. Douglas, meanwhile, rather than adopting a non-LC scheme, had formerly classed its few law titles as just K followed by a cutter number, then followed a tentative outline from LC which assigned classes to countries. The Law Library also used this expedient in the early days of its switch to LC. Unfortunately, LC changed its plans after 1991 and the classes assigned to many countries changed when the full schedule finally came out. As a result of these and other short-term solutions, the Ks in Law and in the Central Library are rife with inconsistencies.)
In the early 1990s, mounting pressure to improve efficiency and reduce the backlog resulted in significant staff resources being reassigned from file maintenance and searching to derived cataloguing. At about the same time there was an attempt to increase output by reducing standards for some types of material, which did not have much effect and was gradually abandoned.
As part of preparing for the move of the main collection to the new Stauffer Library, various projects were undertaken, including one to convert the previous non-MARC holdings information for multi-volume records (which listed each volume on a separate line) to MARC format, using 866 tags. This produced much tidier results in the OPAC.
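The conversion can be sketched as follows (hypothetical helper and input format; only the 866 tag itself is from the project): runs of consecutive volumes collapse into ranges in a single summary statement.

    def volumes_to_866(volumes):
        """Collapse a sorted list of volume numbers into one 866
        summary-holdings statement."""
        runs, start, prev = [], volumes[0], volumes[0]
        for v in volumes[1:]:
            if v == prev + 1:
                prev = v
            else:
                runs.append((start, prev))
                start = prev = v
        runs.append((start, prev))
        text = ",".join(f"v.{a}-{b}" if a != b else f"v.{a}" for a, b in runs)
        return f"866  |a {text}"

    print(volumes_to_866([1, 2, 3, 5, 6, 9]))
    # 866  |a v.1-3,v.5-6,v.9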
The largest project was to catalogue as many as possible of the remaining INV backlog books, though to a lower standard than usual. The old records created by the summer project undertaken in the 1980s were matched by numeric fields and titles against the OCLC database and records were loaded (with some wrong matches). Call numbers from this project had "u" (for unverified) added at the end of the date. (Prior to the project, the INVs were checked for items that should be in Special Collections, and these were removed to another backlog, most of which still exists as such.) The backlog project did result in about 50,000 items being brought to open shelves, though about 10,000 volumes remained when other pressures caused the project to be abandoned.
The card shelflist was closed in the early 1990s, and was finally abandoned when Cataloguing moved out of Douglas Library in 1995. (Two cabinets of unconverted NAF cards were taken, but nothing was done with them and they were eventually dumped.)
From the mid-1980s to the mid 1990s the preferred source for searching was the Bibliofile CD-ROMs rather than microfiche or manual sources. [Douglas had two microcomputers for this purpose]. A search usually required swapping CDs in the machine. Printouts could be made from the records found, and after the first short period, records could be overlaid. The Law Library also used Bibliofile as a source, while Health Sciences used National Library of Medicine.
Following the move of the main collection to Stauffer in 1994, Cataloguing and Acquisitions from Douglas - now called Central Technical Services - moved in May 1995 to "swing space" in the former Documents Library in Mackintosh-Corry Hall, and suffered the inconveniences of being separated from the collection. It was originally planned that CTS would move back to Douglas after renovation, but this did not work out. A move to Stauffer is in prospect for 2005-06, if all goes well.
Meanwhile, in 2000 the formerly separate Documents technical services operation (now reduced in size) was amalgamated with CTS. Documents continue to be classed using CODOC, though the records are no longer all upper case and practices are coming closer to those used for other materials.
Later in 1995 the old dumb terminals were replaced by personal computers. NOTIS now ran through an interface program called Host Explorer, and behaved very much as it had before (though the line wrap problem was more or less fixed), but staff now had access to other software and to the Internet.
One important piece of other software was Cataloger's Toolkit, developed by Gary Strawn at Northwestern. Using the Toolkit, cataloguing on NOTIS became much more efficient. Among other things, it allowed downloading of individual records by all Cataloguing staff.
Bibliofile was replaced for some years by a program called Laserquest, which allowed the CD-ROMs to be networked so that they could be accessed by all cataloguing staff from their own machines (with no switching of discs). Later, when Bibliofile became available in a networked form, it replaced Laserquest again. Another program called BookWhere was introduced; this allowed searching a wide range of Z39.50-compatible catalogues at once and proved very useful for titles not in Bibliofile / Laserquest. Staff also developed skills in searching other catalogues through the Internet, which by 2000 had almost entirely superseded manual sources even for esoteric materials. Selected staff learned to search ISM (later AG Canada) and OCLC as well, but these were not considered suitable for general use because of cost and training problems.
During this period the Cataloguing Manual, which had evolved in looseleaf format over many years, was partly converted to online format, with much additional material as new issues arose, though some earlier material has not yet been converted.
The amalgamation of the science and engineering branch libraries, and related materials from Stauffer and various storage locations, in the renovated Douglas Library in the late 1990s, led to large file maintenance projects in Cataloguing to combine serial records and delete multiple copies.
In the late 1990s electronic journals and other Internet sources began to be a significant part of the collection. How to catalogue them became an important issue. Initially, if a journal was held partly in print and partly in electronic form, the information was combined on a single record, in a series of projects to enter the titles of various vendors.
2000- : Voyager
By the late 1990s NOTIS had been acquired by a commercial firm that did not actively develop new features, and its limitations were becoming increasingly obvious. In particular, the OPAC interface was by now so "antiquated" that students accustomed to Windows and the Web found it difficult to use. This led in 1999 to another replacement process, which selected Endeavor's Voyager system. After considerable preparation and training, including various projects to clean up the database, the migration to Voyager took place at the end of 2000.
Voyager is a client/server system: the database is kept in the server in ORACLE format; actual work is done on PCs, then stored in the database. The OPAC runs on the Web, while the cataloguing and acquisitions modules are Windows-based. In 2004 the potential as well as the limitations of the system are becoming clearer. (One limitation is that Global Change has not yet worked very well.)
The Web-based OPAC allows direct linking between earlier and later titles of serials, if recorded in the proper 780 and 785 fields. In anticipation of such linking, those fields had been created in all recent serial records, using the subfield |w (CaOKQ) followed by the NOTIS record number. It turned out that in the release of Voyager we started with, the |w subfield did nothing, but the links worked (more or less well) by going to the title in the index. Direct linking of records using ISSN numbers is now imminent -- but not all our records contain the required fields or the ISSNs, so results will continue to be variable. Meanwhile, Voyager's serials check-in function also required some work on titles, especially titles like "Annual report", to make them adequately distinctive.
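The pattern being relied on looks like this (a hypothetical sketch; the |w (CaOKQ) convention is as described above, while the title and record number are invented):

    def linking_field(tag, title, notis_number):
        """Build a 780 (preceding title) or 785 (succeeding title)
        linking entry carrying a local |w control number."""
        assert tag in ("780", "785")
        return f"{tag} 00 |t {title} |w (CaOKQ){notis_number}"

    # On the record for the current title, pointing back to its predecessor:
    print(linking_field("780", "Annual report of the hypothetical society", "ABC1234"))
    # 780 00 |t Annual report of the hypothetical society |w (CaOKQ)ABC1234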
The decision was made not to continue the "processing unit" concept in Voyager; instead, the various libraries' holdings can now be combined on a single bibliographic record. On the other hand, there was a trend toward using separate records for electronic journals, as records for these became available in packages which did not remain static. A wide variety of practices and policies on combining or separating records may be found in the current database (including old serial records that put multiple titles on a single record).
One of the decisions made in the migration was to load the complete LC and Canadiana Name Authority records into the same database as the bibliographic records (though they only show in the OPAC if there are matching bibliographic records). This has resulted in large numbers of duplicate records which are only slowly being tidied up, and also in many references pointing in wrong directions. A decision following from the first was to follow LC practice wherever it differs from Canadiana, except in cases where LC is clearly wrong, or in the small number of Canadian subject headings we have chosen to use (and for which we have created authority records).
Since at least the 1960s, the Industrial Relations Library had existed as a completely separate operation, hiring its own staff and cataloguing its own collections. It became a branch of the main library system about 1988 but continued cataloguing into the 1990s. In 2001 it was closed and the book and serial collections were transferred to Stauffer and elsewhere, with significant file maintenance required. The 2003 move of the Art Library into Stauffer (as a distinct collection) may be the last such branch change for a considerable time.
The proportion of Queen's cataloguing done by librarians as "original" has been dropping for at least a decade, as departing librarians were not replaced, and as technicians became able to search more widely for useful copy. By 2003 this trend had gone about as far as it can go.
Meanwhile, the number of technicians was reduced by budget pressures. With emphasis being placed on quick turnaround of materials, the accumulated "current backlog" (and a small part of the INVs) were sent to a vendor for cataloguing in 2002/03; the same vendor is also cataloguing the blanket orders before they ever arrive. Plans are in the works for a reorganization of Central Technical Services which will see more cross-training and work-sharing between Cataloguing and Acquisitions staff. The future development of the catalogue will be shaped by future changes in the organization of the staff as well as by changes in standards and technology.