Customization of Library Service in a Cross-Platform Programming Environment

Alvan Bregman and Winnie S. Chan

This paper describes how library operations can be integrated in a cross-platform programming environment. It discusses the organizational and technical issues involved in planning and designing macro applications for the technical services workstation. It then describes a variety of technical and public service macro applications and shows the efficiency and ergonomic benefits of these customized programs. It concludes with an example of how PC-based macro-programs can populate Web-accessible server-side databases with ActiveX technologies.


Programming Initiatives Background

This article describes how library operations can be integrated, regardless of the systems in use, through what we call the "desktop-programmed environment." Two keys to a desktop-programmed environment are the technical services workstation and the "macro," a computer program created either by recording a series of keystrokes or, far better, through purpose-designed algorithms. 1 While such macros are commonly in use in many libraries within a single system, our article shows how they can be used to link and integrate systems to great functional advantage.

At the University of Illinois at Urbana-Champaign (UIUC) we began developing cross-system macros in the context of our migration away from a first-generation system known as LCS/FBR, which had been largely developed in-house and used by a consortium of Illinois libraries. 2 While long-range plans call for moving to a state-of-the-art client/server-centered system, interim arrangements required the implementation of a turnkey telnet-based system, DRA Classic.

It quickly became evident that the introduction of new technology would have a profound effect on the library as an organization. Indeed, existing structures and entrenched technologies affected implementation of the system and were in turn highly affected themselves by what was implemented. For that reason, this article looks at the "programmed environment" not only as a desktop phenomenon, but also as an organizational phenomenon. To have a deep understanding of the real impact desktop computer programming can have in a library, we begin with some organizational and technical issues and how they affected systems and services.

Local Organizational Structure

The UIUC Library is highly decentralized, made up of some forty-five semi-autonomous departmental libraries. Original cataloging had been decentralized in the mid-1980s; much acquisitions and serials check-in work was also done in the departmental libraries. Serials cataloging and copy cataloging were the respective responsibilities of an Office of the Principal Cataloger unit and of an Automated Services unit. Catalog maintenance was entirely centralized for systems reasons: changes to the consortial catalog could only be performed using hardwired terminals running SuperWylbur or WLN text-editing software.

Besides using LCS/FBR, UIUC also maintained a variety of other systems, including an IBM system for acquisitions accounting and Innopac for serials check-in. All original cataloging was done through OCLC. Hence, input into multiple systems was an overall feature of the configuration. Many parts of the system were not automated; many card files were still in use, especially since retrospective conversion was incomplete.

Catalog Maintenance in a Pre-Windows (DOS) Environment

UIUC had long grappled with the problem of carrying out catalog maintenance using an inflexible line-editing system. This situation has been documented by Clark and Chan and by Henigman.3 A variety of computer programs written in BASIC and operating under DOS had been developed to facilitate accuracy and productivity in cataloging maintenance. Essentially, these programs guided inputters in the formatting of data and made possible the interface between the separate LCS and FBR databases. In implementing the new system, however, these programs became obsolete, since SuperWylbur, LCS, and FBR were all being retired.

On the cataloging side, terminals hardwired to OCLC were in use. Catalogers worked on printouts provided by searchers, and inputters put the catalog information into the OCLC databases, typing from catalogers' workforms.

Programs had been written to produce call-number labels as a by-product of OCLC work. A DOS-based BASIC program was run by each inputter to write call-number information to a disk. Every morning the label files on the separate disks were aggregated onto a single disk and sent to the marking unit, where labels were created.

New Desktop Computers Operating under Windows Installed

In the two years previous to the implementation of DRA Classic, UIUC had vigorously replaced its hardwired terminals with networked PCs that used a standard suite of Microsoft application software running under Windows. Staff were given courses on the use of PCs with Windows, but since they were not required to work with the applications directly, there was no appreciable use of the new technology.

The bank of OCLC terminals was also retired and access to this utility was transferred to the PCs without changing the way that work was carried out. Gradually, catalogers also received PCs and began limited use of such applications as Cataloger's Desktop.

New Telnet-Based Integrated Library System Implemented

In August 1998 the DRA Classic system was implemented for access to the public catalog, cataloging, and circulation functions. The statewide implementation was necessarily rushed and a variety of difficulties faced the library.

In the first place, requests for batch processing were, for all practical purposes, unavailable. The DRA Report Writer utility was not implemented by the consortium for individual libraries. Secondly, the programs written to facilitate maintenance under LCS/FBR were not functional in the new system. Thirdly, Classic is a keystroke-intensive system with an early-generation editing protocol. Moreover, the protocols and keystroke patterns required to work with Classic were very different from those staff had long used with the earlier system. Even after intensive training, staff reported difficulties and frustrations using the system. Some of these complaints diminished as staff became more familiar with Classic; nevertheless, the system remained so keystroke-intensive that some functions, such as the creation of serials analytic records, became impractical.

The DRA record structure at once simplified the situation and added further complexity to it by making use of the MARC Format for Holdings Data (MFHD). Previously, UIUC holdings records were in a nonstandard format and a separate program was required to link holdings to bibliographic information. With DRA, bibliographic and holdings information were in compatible formats, but item records were non-MARC. While DRA editing allowed for dynamic updating of the database, new records could not be added according to specifications without a consortial loading program that had not yet been written.

Managing the Transition between Systems

Loading of new records into the old system was suspended in May 1998, three months prior to the new system going live, as part of the catalog data conversion process. To deal with this, we devised a Web-based access tool, which became known as the "Gap Shelflist," as the main source for users to find new titles added to the library collection during this period. This shelflist was also designed to be the primary source for holdings data that would be entered into the new system once it became available.

To accomplish this project, we took advantage of the new hardware environment of PC workstations and developed customized OCLC Passport software macros to derive MARC-based data from each OCLC update that was done for new titles. A separate QBASIC program then processed and imported the data into a Microsoft SQL Server database. We chose Microsoft SQL Server over a PC-based Access database for its stability and fast response over the internet. Users of the Gap Shelflist were able to perform queries by author, title, call number, or location from an Active Server Pages (ASP) Web page containing VBScript code. The server-side scripting in the ASP file then processed the query and returned the output to the internet user's client machine.
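
To give a sense of the approach, the sketch below shows how such an ASP query page might be written in VBScript. The data source name "GapShelflist," the table "NewTitles," and its column names are illustrative assumptions only; this is a simplified outline, not the production code.

<%
' Simplified sketch of a Gap Shelflist query page (assumed names throughout)
Dim conn, rs, sql, author
author = Request.QueryString("author")              ' search term entered by the user
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "GapShelflist"                            ' hypothetical ODBC data source name
sql = "SELECT Title, CallNumber, Location FROM NewTitles " & _
      "WHERE Author LIKE '" & Replace(author, "'", "''") & "%' ORDER BY Author"
Set rs = conn.Execute(sql)
Do While Not rs.EOF                                 ' write one line per matching title
    Response.Write rs("CallNumber") & "  " & rs("Title") & " (" & rs("Location") & ")<br>"
    rs.MoveNext
Loop
rs.Close : conn.Close
%>

The same pattern of opening a connection, running a SQL statement, and walking the recordset reappears on the desktop side in the macros described later in this article.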

Soon acquisitions information was added to the Gap Shelflist. We had, in fact, two OPACs: a DRA main catalog that represented our holdings up to the termination of LCS/FBR, and a Web-based secondary catalog showing holdings added since that time. The duration of the Gap grew and grew as one external delay after another occurred to prevent the consortial loader from becoming operational. Staff and patrons grew used to the Web-based view of new titles. In its five-month existence, the Gap Shelflist came to include more than eighteen thousand new titles, effectively bridging the bibliographic record vacuum during this crucial systems transition period.

General Issues Regarding Macro Programming

From Recorded Keystrokes to a Programmed Environment

As we mentioned at the outset, desktop-based computer programs can be created by recording keystroke sequences using the macro utility provided as part of various application software packages. The most serious drawback of recorded macros is that they lack recursiveness and are limited in what they can do with data input and output. It is only once one begins adding custom features to the macro that a programmed environment becomes possible. Almost all desktop application macro utilities use some version of the Visual Basic programming language. This single standard means that programmers can work relatively easily with a variety of computer applications. Banerjee has described how input and output of data can be managed by the effective use of text files written to and read from a workstation drive. 4 While programmed macros can be written from scratch, the easiest approach is to record the required keystroke sequences and then edit and improve the recorded macro by adding recursive, protective, and interactive features. This is the approach we most often used.
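
To illustrate the difference, the sketch below (in SmarTerm-style Basic, the dialect used throughout this article) contrasts a macro as it might be recorded, which simply replays a fixed keystroke sequence, with the same operation after interactive, protective, and recursive features have been added. The barcode shown is invented, and EventWait refers to the timing subroutine reproduced in appendix A.

' As recorded: replays one fixed transaction and nothing else
Sub RecordedVersion
    Session.Send "30112001234567" + Chr(13)            ' the exact keystrokes captured during recording
End Sub

' After editing: prompts for input, guards against bad data, and repeats
Sub ImprovedVersion
    Dim Barcode$
    Do
        Barcode$ = AskBox$("Scan a barcode (Q to quit)")   ' interactive input
        If Barcode$ = "Q" Or Barcode$ = "q" Then Exit Do   ' protective exit
        If Len(Barcode$) = 14 Then                         ' accept only plausible barcodes
            Session.Send Barcode$ + Chr(13)                ' send the keystrokes to the host
            EventWait                                      ' wait for the host response (appendix A)
        End If
    Loop                                                   ' repeat for the next item
End Sub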

DRA Classic did not have a macro-making utility, but SmarTerm terminal emulation software, the communication package we chose to provide access to the system, did have such functionality. 5 This functionality was initially demonstrated by Chu of Western Illinois University at the 1998 ILCSO Technical Services Fall Forum following implementation of the new system. We immediately saw the potential of this feature given our situation at UIUC and proceeded to experiment along lines opened up for us by our work with the Gap Shelflist.

Solving the Timing Problem

A Macro to Create Analytics Templates

Our first major foray into the world of SmarTerm macros was to facilitate the creation of serials analytics records, a process for which templates did not exist. Our goal, then, was to build such a template, an apparently mechanical task. After reengineering the workflow to establish the most efficient order of steps in the process, we used the SmarTerm macro utility to record the required sequence of 102 keyboard strokes. 6 Unfortunately, the new macro halted at a different stage of the program every time it was run. We spent a considerable amount of time looking at the code for the recorded macros before we could diagnose the problem.

We noted that the SmarTerm utility automatically inserted blocks of Visual Basic code that were designed to account for the variable time it might take the host to respond. Accordingly, the program should have waited to receive specific information from the host before proceeding. Since this seemed reasonable, when the macro regularly failed, we continued to look for logic and other problems that might be holding up its success. The debugging utility did not locate any syntax errors (and we had not written code anyway), so we finally decided to build the program up step by step to see exactly where the failure occurred.

First, we removed all the interpolated SmarTerm code, leaving only the recorded commands in place. To our surprise, the macro was able to proceed all the way through the sequence of commands. We now had a macro that made this important part of our work more than one hundred times faster. We also had learned that the system-generated timing code seemed to be a hindrance rather than a help.

Macros for Maintenance

The next SmarTerm macros we built were needed for catalog maintenance. For example, holding lines needed to be added whenever complete volumes of serials were received after binding. Without a macro to help, each holding line needed to be constructed manually, keystroke by keystroke. We now found that timing became a more delicate problem since we were working outside the edit mode in DRA. The telnet interface required time to send information back and forth from the host, and this time could vary considerably. When the system was slowest, typically toward the end of the day, the interval between responses could be many seconds. We had already found out that the timing code provided automatically by SmarTerm (see section above) was not effective. We were repeatedly frustrated in our initial attempts to deal with this issue. The "Sleep" command, which paused execution of the macro for a set number of seconds, was unreliable except when its setting was very high, and this slowed down execution so much that there was no advantage over manual work. The "MatchString" function seemed to replicate what the original SmarTerm LockStep subroutine was meant to do, and except in very specific situations, it also was not a viable solution. Finally, we experimented with an "EventWait" function, and this provided us with a solution to our problem. The key to a successful macro execution in our environment was timing, and we managed this by including EventWait functions everywhere a response from the host was expected.
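
In outline, the solution amounted to following every Session.Send that provokes a host response with a call to the EventWait subroutine reproduced in appendix A. The sketch below shows the pattern; the command strings are placeholders rather than actual DRA keystrokes.

Sub TimedSequenceSketch
    Dim FirstCommand$, SecondCommand$
    FirstCommand$ = "..."              ' placeholder for the first keystroke sequence
    SecondCommand$ = "..."             ' placeholder for the follow-up keystrokes
    Session.Send FirstCommand$ + Chr(13)
    EventWait                          ' wait for the host screen before continuing
    Session.Send SecondCommand$ + Chr(13)
    EventWait                          ' wait again before the next step
End Sub

In the version shown in appendix A, EventWait returns either when a single page is received from the host or when its one-second timeout expires, so the macro neither races ahead of the host nor stalls indefinitely when a screen never arrives.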

With the timing issue resolved, we consistently were able to use macros to optimize the efficiency and range of work done on our new system. The next challenge we undertook was to automate more of our repetitive processes. We will outline some examples from different kinds of library operations of macros we wrote that made a big difference to our workflow. 7

Technical Services Applications

Cataloging Applications

Automation of Process

After five months in the Gap, we had a huge number of library materials (mostly books) with holding links to cataloging records in OCLC but no holding representation in our own online catalog. The books themselves were piled up everywhere in our cataloging workroom and we had great difficulty finding titles users requested after consulting the Gap Shelflist. Accordingly, we developed a simple SmarTerm macro that would allow us to add holding information in DRA where there was already a bibliographic record in the database. This exercise was meant to establish that we could use our cataloging module for direct input of holdings without going through OCLC, and it was an effective, though partial, solution to our problem.

OCLC Macros Constructed for Data Entry

We also seriously began experimenting with the OCLC Macro Language. The goal was to eliminate the need for catalogers to "hand off" work to our OCLC inputters, staff members who were skilled in using OCLC editing conventions. Since copy catalogers were working with OCLC records anyway, why couldn't they simply finish the work themselves? Our plan was to train inputters in copy cataloging and double our productivity. The writing of macros was facilitated by the similarity between the OCLC Macro Language and the SmarTerm macro language, which were both based on Visual Basic. After mapping workflow and carefully recording each keystroke, a working "input" macro was created. Timing problems luckily did not present themselves with this macro. (However, we now know that such problems can and will appear in all macros that send and receive information from a remote file or host.)

Work was begun with our children's books cataloging operation because this was a high-volume, self-contained process where errors would be both visible and correctable within manageable confines. Once such a process was working smoothly, we could go on to implement general workflow changes assisted by more general macros.

Our input process essentially involves entering a Dewey call number and location code into a bibliographic record located in OCLC. Once the bibliographic record is updated in OCLC with this information, the information is loaded into DRA through a consortial loader program, which separates out the holdings information and places it in a holdings record, linked to the downloaded bibliographic record.

Searching for the appropriate record and entering the information requires knowledge of specialized OCLC coding protocols. By using the "AskBox" or "InputBox" VB functions, we could capture input information and write it to a text file in whatever form was required to generate labels or for other purposes. Initially, we decided to write the data directly to files that could be used by the old BASIC program that had been in use long prior to the advent of our new system. Later versions of the macro integrated the label-producing function, significantly simplifying our procedures. A side benefit was that catalogers and student assistants could enter call number and location data using OCLC without specialized training in its protocols.
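
A minimal sketch of this data-capture step, written in the Basic dialect of the OCLC Macro Language, might look like the following. The prompts, the delimiter, and the file path are illustrative assumptions, not the macro we actually ran.

Sub CaptureLabelData
    Dim CallNo$, Loc$, f As Integer
    CallNo$ = InputBox("Enter the Dewey call number")     ' prompt the operator
    Loc$ = InputBox("Enter the location code")
    f = FreeFile
    Open "C:\labels\labeldata.txt" For Append As #f       ' text file later read by the label program
    Print #f, CallNo$ & "|" & Loc$                        ' one delimited line per item
    Close #f
End Sub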

Further steps were taken along these lines, as noted below.

Acquisitions Applications

Integrating the Labeling Function with Other Processes

The most important offshoot of this integrated approach to programming was the transfer of a significant portion of our copy-cataloging operation out of cataloging units. The hardware we used in creating labels was antiquated and becoming unreliable. We hoped to migrate to laser-printed labels, but there was no stand-alone program that seemed suitable for our needs. Since we were able to generate text files with label information and Microsoft Office was installed on all our networked PCs, we experimented with the labels utility in Microsoft Word as the basis for generating our labels. This involved creating yet a third kind of macro, using Visual Basic for Applications in Microsoft Word. We were surprised that working with this macro language was no simpler than, and in many cases more difficult than, working with the SmarTerm or OCLC macro languages. The Mail Merge utility in Word is somewhat intricate to set up, involving many steps, not all of which are intuitive. Templates must be designed that will accept data in the desired format. Data files must be created that correspond in design to the templates. Once that is done, information in the data files can be merged into the templates. We wanted to create reliable macros that untrained users could use to make spine labels.
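
A simplified sketch of the idea in Word Visual Basic for Applications appears below. The template and data-file names are assumptions, and the production macros added further safeguards, such as the message boxes described in the next section.

Sub MergeSpineLabelsSketch()
    ' Open a label template, attach the delimited data file produced by
    ' the cataloging macros, and merge the results into a new document.
    Dim doc As Document
    Set doc = Documents.Open("C:\labels\SpineLabelTemplate.doc")   ' assumed template
    With doc.MailMerge
        .MainDocumentType = wdMailingLabels
        .OpenDataSource Name:="C:\labels\labeldata.txt"            ' assumed data file
        .Destination = wdSendToNewDocument
        .Execute
    End With
End Sub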

Again, timing was a problem with the Word macro. The speed at which the operating system can carry out instructions far exceeds the speed at which the screens can refresh with the new data to be processed. In order to control the macros, we installed message boxes that require a human response before the program can proceed. The virtue of this for a labeling utility is that operators who know nothing about programs can feel some control over what is occurring automatically. It also will allow us to add variations to the basic program as developments proceed.

Acquisitions Order Record Notes

Another function present in the old LCS/FBR system and in our Gap Shelflist was that order records could be modified to indicate whether materials on order had been received or claimed. This function was not present in our implementation of DRA. Accordingly, we wrote a macro that automatically searched our order records on DRA and added appropriate notes to the record. This macro could be operated by students who had no formal training in DRA, MFHD coding, or other technical matters. The acquisitions notes were coded so that they did not appear to casual users of the public catalog, but could be found by staff or by specially trained users.

Other Maintenance Applications

Location Transfers

Another important use of macros was in the transfer of major portions of the collection from one location to another. Here again, the manual work involved in processing volumes one at a time would have made this task impractical to undertake. However, by running SmarTerm macros that automated DRA instructions against text files of barcodes, huge amounts of work could be done by a small complement of relatively untrained staff. In this procedure, we used hand-held scanners to record the barcodes of relevant parts of the collection and used the resulting text file as input data. The macro called up the relevant item record and made the necessary changes. In one summer project, approximately twenty-five thousand volumes were processed in the Social Sciences and Education Library using this functionality.
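
The core of such a transfer macro can be sketched as follows in SmarTerm-style Basic. The file path is an assumption, and the keystrokes that actually edit the item record, which depend on the DRA screens involved, are left as a placeholder comment.

Sub ProcessBarcodeFileSketch
    Dim Barcode$, f As Integer
    f = FreeFile
    Open "C:\transfer\barcodes.txt" For Input As #f   ' file produced from the hand-held scanners
    Do While Not EOF(f)
        Line Input #f, Barcode$                       ' next barcode in the batch
        Session.Send Barcode$ + Chr(13)               ' call up the item record
        EventWait                                     ' wait for the host screen (appendix A)
        ' ...keystrokes to change the location would be sent here...
    Loop
    Close #f
End Sub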

Material Status Changes

Macros were also developed that could batch-process large quantities of library item records in order to change the codes that regulated borrowing terms. This was necessary on a large scale in some libraries because of anomalies in the original data conversion from LCS/FBR to DRA. Our first exercise was to change a class of serials and monographs in the UIUC Library and Information Science Library that had been given incorrect borrowing types. This program operated against files of database control numbers (DBCNs), the unique numbers assigned by the system to every bibliographic and holdings record. (Using a text file of DBCNs was far inferior to using barcodes, but that was the data we had. A barcode represents a single item, but a DBCN will call up a record that might represent sometimes hundreds of items, as in a long run of serials.) The complexities of the program made development relatively slow, but finally a functioning algorithm was created. We used versions of this algorithm to process more than ten thousand items in the Grainger Engineering Library collection and more than twenty thousand items in the Veterinary Medicine Library. We divided up the files of DBCNs into ranges and ran simultaneous sessions to process the records involved. This speeded up the processing considerably, but required that an operator monitor the programs as they ran. This work would never have taken place at all without automation through macros.

Circulation Applications

Cataloging is not the only area of library operations for which macros have become essential at UIUC. Circulation activities have been substantially improved through the design and implementation of SmarTerm macros in particular. In many cases, the work of circulation units has been sped up dramatically through these small programs. Working closely with Betsy Kruger, the Head of Circulation and Bookstacks, we created a number of circulation programs, including the following:

Check-In Streamers

Our first circulation macros allowed streamers to be printed as part of the check-in process. These streamers were to be placed in books that had been returned by borrowers prior to reshelving. The DRA implementation did print out certain information each time an item was checked in, but it did not allow for the spacing between items or customization of information required to create a streamer. Staff had to hit the line feed button on the printer several times for each item received, slowing down the process greatly. Moreover, the actions were grossly inefficient in terms of ergonomics and repetitive stress concerns. Our simple macro added date and time to the streamer and operated the line feeds; staff needed only to scan in a barcode and hit one function key. Staff were highly appreciative of the help they got from this simple but effective customization.

Item Route In/Out Functions

A macro was created to allow for items to be routed in and out of temporary locations, such as binding, conservation, or a new books shelf. In such instances, the macro can change the loan default for the item, insert an item note that gives the date until which the item will be on the new book shelf, change the DRA location to the new books location, and print out information such as the title, call number, and "on new book shelf until" date. Performed as separate tasks, these operations would require a large number of keystrokes. However, with the aid of macro programming, the operator need only scan in a barcode number and enter the location code. Further developments of this kind of macro are described in the next section.

Public Service Applications

Web-Based Databases

Based on our "Gap Shelflist" experience and the success with SmarTerm macro programming, we decided to further customize our library service by integrating the two programming experiences. As more departmental libraries provide users with Web-based access to local information files, our public service applications initiative eventually led to interacting with their locally developed databases. To enable direct access to the Web-accessible databases additional codes are needed. We added ActiveX Data Object (ADO) controls, such as the Connection Object and Recordset Object, into our macro modules for this purpose. When integrating with ActiveX technologies, macro applications can be extended in a networked, cross-platform environment. We will describe the PC-based macro we first developed and then show how this was extended to become a vehicle to populate server-based databases (see figure 1).

Figure 1. SmarTerm Macro Populating a Server-Side Database
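
In outline, the ADO calls embedded in such a macro look like the following excerpt, condensed from the New Book macro in appendix A (CallNumber$ is captured earlier in that macro). The Connection object supplies the ODBC data source, and the Recordset is used here to check for a duplicate call number before any insert is attempted.

'Open a connection to the server-side database and check for duplicates
set myCONNECT = CreateObject("ADODB.Connection")
myCONNECT.open "SQLNewBooks"
set myRS = CreateObject("ADODB.Recordset")
set myRS.ActiveConnection = myCONNECT
myRS.open "Select count(*) from FullRec where CallNumber = '" + CallNumber$ + "'"
Duplicates = myRS.Fields(0)
myRS.close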

From DRA to Web-Page Display

The SmarTerm/DRA "new books" circulation macro was customized for the UIUC History and Philosophy Library so it could prepare a New Books list directly from the online catalog workstation. The macro was designed to perform the following tasks:

  • receive the barcode number of a newly received item;
  • call up the item record;
  • change the location to the New Books Shelf; and
  • write specified bibliographic information from the display to a text file as output.

During this last step, the macro also converted the data text file into a table which could be printed and/or subsequently saved into an HTML file that was posted regularly on the History Library's home page. Each month, then, a new list was accessible over the internet. While this was a very welcome new public service feature, it still had the drawback of presenting basically static lists for review.
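
One hypothetical way to carry out such a conversion, sketched here in Word Visual Basic for Applications with assumed file names, is to open the data file in Word, convert its contents to a table, and save the result as HTML. This is offered as an illustration of the idea rather than a record of the exact routine used.

Sub TextFileToHtmlTableSketch()
    ' Open the (assumed tab-delimited) data file, turn it into a table,
    ' and save the document as an HTML page ready for posting.
    Dim doc As Document
    Set doc = Documents.Open(FileName:="C:\newbooks\newbooks.txt", ConfirmConversions:=False)
    doc.Content.ConvertToTable Separator:=wdSeparateByTabs
    doc.SaveAs FileName:="C:\newbooks\newbooks.htm", FileFormat:=wdFormatHTML
    doc.Close
End Sub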

From DRA to Web-Accessible Database

At the same time, the Grainger Engineering Library had developed two Access databases in which newly cataloged books and reference materials were recorded. 8 These databases can be searched via an Active Server Pages (ASP) application from the library home page. 9 However, the information in the databases had to be entered manually. The New Books macro described above was modified to allow information from the online catalog to be added automatically into the Grainger Library's Access databases. This involved revamping the macro to include the following features:

  • the use of Microsoft Open Database Connectivity (ODBC) on Windows NT for direct access to the database from the client machine;
  • the creation of ADO Objects, namely, Connection, Command, and Recordset; and
  • the employment of ActiveX Controls, such as dialog boxes and buttons, to pause the program and allow for verifying data, editing a field, or canceling the transaction, for example, when a duplicate record is discovered.

With the macro running, staff only needed to scan in a barcode to accomplish a great deal of work. Once the macro had run, relevant catalog information, such as call number, ISBN, author(s), title, publisher, series, subject terms (for the reference materials), barcode, location, etc., was added to the database and instantly became accessible from the Grainger Library's home pages. Without a doubt this is by far the most desirable model that any macro application can hope to achieve. Figures 2 and 3 illustrate the start and pause of the New Book Macro and figures 4 and 5 illustrate how the data are searched and displayed on the Web page. 10

Figure 2. Start of the New Book Macro--Scanning a Barcode

Figure 3. Pause in the New Book Macro--Verifying/Editing

Figure 4. Performing a Query on the New Book Web Page Following the End of Macro Run in the Online Catalog

Figure 5. Search Result Returned from the SQL Server Database

Conclusion

The purpose of this article has been to demonstrate how PC-based macro-programs can be extended to provide excellent functionality within and between basically stand-alone library systems. Within the programmed environment provided by properly configured technical services workstations, libraries can customize and greatly extend the range and efficiency of all kinds of library services and operations. Not only does the programmed environment provide power and flexibility to its users; it also offers ergonomic benefits by reducing the errors and injuries caused by excessive keyboarding. Most of all, macros can facilitate new links between the library catalog, technical service modules, and independent Web-based databases. At the very least, macros in a programmed environment can improve the efficiency of work with virtually any single library system.



References and Notes

   1. For a good introduction to the concept of the technical services workstation, see Michael Kaplan, ed., Planning and Implementing Technical Services Workstations (Chicago: American Library Association, 1997).

   2. LCS stands for "Library Computer System" and FBR for "Full Bibliographic Record."

   3. Sharon E. Clark and Winnie Chan, "Maintenance of an Online Catalogue," Information Technology and Libraries 4, no. 4 (1985): 324-38; Barbara Henigman, "Using Microcomputers to Provide Efficient and Accurate Maintenance for Online Databases," Illinois Libraries 71, no. 3/4 (1989): 197-99.

   4. Kyle Banerjee, "Making Desktop Applications Communicate with Your OPAC" (paper presented at the American Library Association Annual Meeting, New Orleans, June 27, 1999). Accessed Jan. 24, 2001, http://ucs.orst.edu/~banerjek.

   5. SmarTerm is a registered trademark of Persoft, Inc., Madison, Wisc.

   6. The UIUC Library administrations commissioned a review of the Technical Services Division to coincide with the introduction of the new desktop technology and integrated library system. The purpose of the review was to recommend ways to reengineer the work of the division. It quickly became clear that the division needed to be reorganized and work processes needed to be reconfigured substantially, if only to tap the capabilities of the new technology. This was no simple task, however, since there was little, if anything, that could be brought over from the old systems to function effectively in the new environment.

   7. Many more macros are in use at UIUC than space permits us to describe here.

   8. William H. Mischo and Mary C. Schlembach, "Web-Based Access to Locally Developed Databases," Library Computing 18, no. 1 (1999): 51-58.

   9. For a similar application elsewhere, see Karen R. Harker, "Order out of Chaos: Using a Web Database to Manage Access to Electronic Journals," Library Computing 18, no. 1 (1999): 59-67.

   10. For the code for this macro see the Web version of this article at www.lita.org/ital/2001_bregman.html. For information on the programming protocols used, consult SmarTerm Macro Language A-Z Reference (Madison, Wisc.: Persoft, Inc., 1997).


Appendix A. New Book Macro

Editor's & Author's Note: This macro is provided for reference only. Results may vary if it is utilized in a system other than that for which it was designed.

'******** TIMING ********************************************
SUB EventWait
Dim EventStep as Object
Set EventStep = Session.EventWait
EventStep.EventType = smlPAGERECEIVED
EventStep.MaxEventCount = 1
EventStep.TimeoutMS = 1000
EventStep.Start
END SUB
'********* END OF TIMING ************************************
'Declare CancelButton Flag Public
PUBLIC cancelflg
'********* START OF THE MAIN PROGRAM ************************
SUB NewBooksNC
'! RUN THIS MACRO FROM THE DRA NETCAT MENU. THIS MACRO SCANS BARCODE
'! AND ADDS NEW TITLE DATA FROM MARC CATALOG RECORD TO THE GRAINGER
'! NEW BOOKS SQL DATABASE RESIDING ON WINDOWS NT SERVER.

'Scan or type in the barcode number
StartScan:
Barcode$ = AskBox$ ("Scan Barcode. Press 'Q' to Quit; S to start")

'This provides a way to stop work with this macro.

If Barcode$ = "Q" OR Barcode$ = "q" Then
GoTo Closing

' This checks for a 14-digit barcode; if a different number of digits is
' found, the user is prompted to scan the barcode again.

ElseIf LEN(Barcode$) <> 14 Then
MsgBox ("Barcode Invalid. Please scan again.")
GoTo StartScan
End If

'Declare arrays and reset variable values

Dim BIBINFO$(50), TEMP$(4)
Dim RowNumber, I, J, K as Integer
DBCN$ = "" : ISBN$ = "" : CallNumber$ = "" : PubDate$ = ""
Author1$ = "" : Title$ = "" : Edition$ = "" : Publisher$ = ""
Series$ ="" : Author2$ = "" : MainEntry$ = ""

'Send barcode and retrieve item record

Session.Send Barcode$ + Chr(13)
EventWait

'Obtain Database Control Number (DBCN) and Call Number from Terminal Display
'Capture ScreenText (Row, Column, Page, and Characters to capture)

DBCN$ = Session.ScreenText (2,11,1,8)
CallNumber$ = trim$(Session.ScreenText (20,15,1,20))

'Send DBCN and retrieve record

Session.Send DBCN$ + Chr(13)
EventWait

'Obtain Screen Number from Terminal Display

scrno$ = Session.ScreenText (1,46,1,2)

'Check whether screen number is > 9

If right$(scrno$,1) = "-" then
scrno$ = mid$(scrno$,1,len(scrno$)-1)
End If

'Obtain Publication Date from Terminal Display

PubDate$ = Session.ScreenText (7,33,1,4)

'Read Rows 9 through 21 of Terminal Display in as many screens as needed;
'Store data in array BIBINFO$(I)

ScreenNumber = 1 : I = 0
Do while ScreenNumber <= val(scrno$)
RowNumber = 9
Session.Send "s" & str$(ScreenNumber) + Chr(13)
EventWait

Do while RowNumber <= 21
I = I + 1
BIBINFO$(I) = Session.ScreenText (RowNumber, 1, 1, 80)

'Concatenating multiline fields

If trim$(left$(BIBINFO$(I),10)) = "" then    'continuation lines begin with blanks
I = I - 1
BIBINFO$(I) = trim$(BIBINFO$(I)) + " " + trim$(BIBINFO$(I+1))
End If
BIBINFO$(I) = Trim$(BIBINFO$(I))

'Remove End of File Marker

If right$(BIBINFO$(I), 2) = " $" then
BIBINFO$(I) = left$(BIBINFO$(I), (len(BIBINFO$(I)) - 2))
End If
RowNumber = RowNumber + 1
Loop
ScreenNumber = ScreenNumber + 1
Loop

' Select required fields from the BIBINFO$ (I) array
' Store required fields in individual field variables

For J = 1 to I
Select Case Left$(BIBINFO$(J),3)
Case "100"
Author1$ = mid$(BIBINFO$(J),10)
Case "110"
Author1$ = mid$(BIBINFO$(J),10)
Case "111"
Author1$ = mid$(BIBINFO$(J),10)
Case "245"
Title$ = mid$(BIBINFO$(J),10)

'Remove Statement of Responsibility Field

SubfieldCodeC = instr(Title$," / $c ")
If SubfieldCodeC <> 0 then
Title$ = mid$(Title$,1, SubfieldCodeC-1)
End If
Case "250"
Edition$ = mid$(BIBINFO$(J),10)
Case "260"
Publisher$ = mid$(BIBINFO$(J),10)
SubfieldCodeC = instr(Publisher$,"$c")
If SubfieldCodeC <> 0 then
Publisher$ = left$(Publisher$, SubfieldCodeC-1)
End If
If right$(Publisher$,1) = "," then
Publisher$ = Mid$(Publisher$,1,len(Publisher$)-1)
End If

'Concatenate repeatable fields 020, 4XX, 7XX and 8XX

Case "020"
ISBN$ = ISBN$ & "; " + mid$(BIBINFO$(J), 10)
SubfieldCodeC = instr(ISBN$, "$c")
If SubfieldCodeC <> 0 then
ISBN$ = left$(ISBN$, SubfieldCodeC - 1)
End If
Case "440"
Series$ = Series$ + "; " + mid$(BIBINFO$(J),10)
Case "490"
Series$ = Series$ + "; " + mid$(BIBINFO$(J),10)
Case "700"
Author2$ = Author2$ + "; " + mid$(BIBINFO$(J),10)
Case "710"
Author2$ = Author2$ + "; " + mid$(BIBINFO$(J),10)
Case "711"
Author2$ = Author2$ + "; " + mid$(BIBINFO$(J),10)
Case "830"
Series$ = Series$ + "; " + mid$(BIBINFO$(J),10)
End Select
next J

'Remove leading semicolon and space from ISBN, Series and Author2 fields

ISBN$ = mid$(ISBN$, 3)
Series$ = mid$(Series$, 3)
If Author1$ = "" then
Author2$ = mid$(Author2$, 3)
End If

'Concatenate Author fields

MainEntry$ = Author1$ + Author2$

'Concatenate Title and Edition fields

If Edition$ <> "" then
Title$ = Title$ + " " + Edition$
End If

'Remove subfield codes A-Z from Title, MainEntry, Publisher & Series

TEMP$(1) = Title$
TEMP$(2) = MainEntry$
TEMP$(3) = Publisher$
TEMP$(4) = Series$

For K = 1 to 4
Stripcodes:
cpchk = 1
cp = instr(cpchk, TEMP$(k), "$")
Do while cp <> 0
If cp <> 0 and (mid$(TEMP$(k),cp+1,1) >= "a" and _
mid$(TEMP$(k),cp+1,1) <= "z") then
TEMP$(k) = mid$(TEMP$(k),1, cp-2) + " " + mid$(TEMP$(k),cp+2)
Else
cpchk = cpchk + 1
End If
cp = instr(cpchk, TEMP$(k), "$")
Loop
Next K
Title$ = TEMP$(1)
MainEntry$ = TEMP$(2)
Publisher$ = TEMP$(3)
Series$ = TEMP$(4)

' START WORK WITH ACTIVEX DATA OBJECTS (ADO) TO INSERT DATA INTO AN MS-
' ACCESS DATABASE FROM WHICH INFORMATION WILL BE RETRIEVED AND
' DISPLAYED ON A WEB PAGE, ALSO VIA ADO

'Open a Connection to the SQLNewBooks database

set myCONNECT = CreateObject("ADODB.Connection")
myCONNECT.open "SQLNewBooks"

'Create the Recordset Object to retrieve records from the database

set myRS = CreateObject("ADODB.Recordset")
set myRS.ActiveConnection = myCONNECT

'Formulate SQL Statement to check for duplicate call number in FullRec table

myRS.open "Select count(*) from FullRec where CallNumber = '" + Callnumber$ + "'"
Duplicates = myRS.Fields(0)
myRS.close

'If duplicates are found, set item aside and begin to process next item

If Duplicates <> 0 then
msgbox "Duplicate Call Number. Please Process Manually."
goto DoAnother
End If

'Initialize cancelflg (CancelButton NOT clicked)

cancelflg = 0

Call DISPLAYSCREEN(Title$, MainEntry$, CallNumber$, PubDate$, ISBN$, _
Publisher$, Series$, Barcode$)

'Returning from DISPLAYSCREEN; Check CancelButton clicked or not

If cancelflg <> 1 then

'Create a Command Object to update the database

set mycmd = CreateObject("ADODB.Command")
set mycmd.ActiveConnection = myCONNECT

'Formulate SQL statement to add a new record to the database

sql$ = "Insert into FullRec (DateRec, CallNumber, Year, Author, Title,_
ISBN, Publisher, Series, Barcode)"
sql$ = sql$ + " VALUES ('" & Now & "', '" & CallNumber$ & "',_
'" & PubDate$ & "', '" & MainEntry$ & "', '" & Title$ & "',_
'" & ISBN$ & "', '" & Publisher$ & "', '" & Series$ & "', '" & Barcode$ & "')"
mycmd.CommandText = sql$
mycmd.Execute
End If

DoAnother:
Erase BIBINFO$, TEMP$
GoTo StartScan

Closing:
set mycmd = Nothing
set myrs = Nothing
myCONNECT.close
set myCONNECT = Nothing

END SUB

'************** END OF THE MAIN PROGRAM *********************


'************** FUNCTION TO EDIT RECORD *********************
FUNCTION MyDlgProc(ControlName$, Action%, SuppValue%)
'based on value of action and ControlName you want to be interactive with dialog
END FUNCTION
'************** END OF FUNCTION ***************************


'************** START OF DISPLAYSCREEN PROGRAM ************
SUB DisplayScreen(Title$, MainEntry$, CallNumber$, PubDate$, ISBN$, _
Publisher$, Series$, Barcode$)
Begin Dialog UserDialog ,,250,270,"Please Verify/Edit Record",.MyDlgProc
OKButton 12,252,40,14
CancelButton 144,252,40,14
Text 4,4,156,8,"TITLE:",.Text1,,,ebBold
TextBox 4,16,200,12,.TextBox1
Text 4,40,156,8,"AUTHORS:",.Text2,,,ebBold
TextBox 4,52,200,12,.TextBox2
Text 4,76,156,8,"CALLNUMBER:",.Text3,,,ebBold
TextBox 4,88,200,12,.TextBox3
Text 8,112,148,8,"PUBLICATION DATE:",.Text4,,,ebBold
TextBox 4,124,200,12,.TextBox4
Text 8,148,148,8,"ISBN:",.Text5,,,ebBold
TextBox 4,164,200,12,.TextBox5
Text 8,188,148,8,"PUBLISHER:",.Text6,,,ebBold
TextBox 4,200,200,12,.TextBox6
Text 8,220,148,8,"SERIES:",.Text7,,,ebBold
TextBox 4,232,200,12,.TextBox7
End Dialog

Dim dummy as UserDialog
'Initialize properties of all controls of the 'Dummy' Dialog
dummy.TextBox1 = Title$
dummy.TextBox2 = MainEntry$
dummy.TextBox3 = CallNumber$
dummy.TextBox4 = PubDate$
dummy.TextBox5 = ISBN$
dummy.TextBox6 = Publisher$
dummy.TextBox7 = Series$
'Value -1 when OKButton was clicked; Value 0 when CancelButton was clicked
rc% = Dialog(Dummy, -1, 0)
Select Case rc%
Case -1 'OKButton clicked, proofread x$ in case any changes needed
Cancelflg = 0
Title$ = dummy.TextBox1
MainEntry$ = dummy.TextBox2
CallNumber$ = dummy.TextBox3
PubDate$ = dummy.TextBox4
ISBN$ = dummy.TextBox5
Publisher$ = dummy.TextBox6
Series$ = dummy.TextBox7
Case 0 'CancelButton clicked
MsgBox "Record has been canceled. Nothing is added to the database."
cancelflg = 1
End Select
END SUB


   Alvan Bregman ( abregman@uiuc.edu) is Rare Book Collections Librarian and Assistant Professor of Library Administration and Winnie S. Chan ( w-chan2@uiuc.edu) is Assistant Engineering Librarian for Computer Services and Assistant Professor of Library Administration at the University of Illinois at Urbana-Champaign.