Volume 8, Issue 4, September 2001
Technology Electronic Reviews (TER) is a publication of the Library and Information Technology Association.
Technology Electronic Reviews (ISSN: 1533-9165) is a periodical copyright © 2001 by the American Library Association. Documents in this issue, subject to copyright by the American Library Association or by the authors of the documents, may be reproduced for noncommercial, educational, or scientific purposes granted by Sections 107 and 108 of the Copyright Revision Act of 1976, provided that the copyright statement and source for that material are clearly acknowledged and that the material is reproduced without alteration. None of these documents may be reproduced or adapted for commercial distribution without the prior written permission of the designated copyright holder for the specific documents.
REVIEW OF: Beverly Abbey. (Ed.). (2000). Instructional and Cognitive Impacts of Web-Based Education. Hershey, PA: Idea Group Publishing.
By James L. Van Roekel
As universities attempt to transform themselves into technology-based institutions, Web sites are being used more and more. This certainly includes the presence of course content and supporting media as supplemental material for classroom-based courses and distance learning.
Many works on Web-based education give "how-to" lessons on publishing Web materials and media content for class use. The authors of this volume, rather, offer insights into issues of design guidelines, integration, theory, and evaluations of Web-based instruction and learning. Three questions are dealt with: "How should instructional delivery be modified for Web access? What independent cognitive responsibilities have been placed uniquely on the learner? How may we ensure that Web instruction is more than an electronic correspondence course?" (p. i). The authors deal with the questions in a myriad of ways; however, the main themes include: 1) the responsibility of the student to interact with the material; 2) the importance of creating exciting, relevant, and dynamic material to aid the student in interactions, including cognitive mapping, concept attainment activities, and motivational media toward effective and efficient instruction; and 3) the importance of technology skills and competence of both instructor and learner. There is an admission of a lack of "best" theories and practices for content design and Web instruction. Many of the authors do not pretend to know the future of the Web and its use in instruction. There is an overwhelming sense, however, that we are at the beginning of a new educational delivery mode; a mode that, no doubt, will continue to be utilized in education and instruction.
This 270+ page volume includes a comprehensive index and references at the end of each chapter. It is organized alphabetically by author, because of the diversity of individual chapter topics. A few chapters offer illustrations that are presented very clearly in design and copy.
Instructional and Cognitive Impacts of Web-Based Education is an important, interesting, and easily readable book. Anyone beginning to develop Web-based courses or interested in Web-based education, theory, distance learning, and content design will find this an excellent resource.
James L. Van Roekel (jamesvr@hotmail.com) is Director of Academic Instructional Technology and Distance Learning at Sam Houston State University, Huntsville, TX.
Copyright © 2001 by James L. Van Roekel. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at jamesvr@hotmail.com.
REVIEW OF: Allen C. Benson and Linda M. Fodemski. (1999). Connecting Kids and the Internet: A Handbook for Librarians, Teachers, and Parents (2nd ed.). New York: Neal-Schuman.
By Roger Strode Harrison
Although no printed handbook can hope to map completely the ever-changing Internet landscape, a guide such as Connecting Kids and the Internet provides what the Internet itself so desperately needs: a starting point for those who need to guide youngsters in Internet exploration but who themselves may be unfamiliar with this terrain. This book, although published in 1999, provides a reasonably accurate and up-to-date picture of what's "out there," outlining the essential aspects of the Internet, providing pointers to further adventures, and warning of pitfalls along the way.
Due to recent federal legislation and media coverage dealing with kids and Internet access, the first issue which springs to mind for many readers may be the issue of how kids can use the Internet appropriately. After introducing the history and structure of the Internet in chapter 1, the authors dive right into "Safety@the.keyboard" in the second chapter, where they state: "...[i]n cyberspace, just as in libraries and bookstores, parents have the right and responsibility to control the flow of information to their children and only their children." While users often overlook password security, the authors introduce their discussion of Internet safety with guidelines for selecting appropriate passwords. The treatment of virus protection is scant, and unfortunately warns that the "best defense against viruses is running anti-virus software." In fact, avoiding risky behavior, which the authors do discuss but don't emphasize, should be considered the primary defense, and using anti-virus software should be considered a necessary but secondary means of protection. Filtering is discussed in enough depth to provide readers with a good idea of what they can and can't do, with the important caveat that "parents are still the best filtering mechanism." The introductory chapters conclude with recommended sites presenting what's new on the Internet.
No guide to the Internet would be complete without a list of recommended Web sites, and the list here is well chosen, with an engaging chapter devoted strictly to online "virtual field trips." To their credit, the authors pay similar attention to e-mail, including a good discussion of something with which many Internet veterans have trouble: subscribing and unsubscribing to mailing lists. The coverage of the mail clients Eudora and Netscape (with screen shots) is useful for those who may not be acquainted with any but Microsoft clients. The authors will want to add coverage of Outlook Express, missing in this edition, to future editions. The chapter "Finding Friends in Faraway Places" presented an aspect of the Internet with which I was unfamiliar: expanding the "penpal" concept into "keypals," and suggesting numerous sites where kids can find correspondents in other countries.
This book particularly shines in its chapters devoted to learning activities. "Finding the Good Stuff" includes an accessible yet thorough introduction to search strategies, which is within the reach of high-schoolers but is something that many students even in college fail to master. The chapter on searching library catalogs via the Internet, although a good overview, lacked any real discussion of what distinctly separates library databases from most Web search engines: the search precision available by using the controlled vocabulary in subject headings. Intellectual property rights and citing electronic sources are important concepts for students to master, and are intelligently placed in the chapter on "Browsing Virtual Bookstores." The chapter on "Serious Research Sites" will not be ground-breaking for librarians, but for parents and teachers unfamiliar with librarian-organized guides to Internet content, this chapter is indispensable. The six "Internet Self Study Guides" in this chapter, however, don't help the novice explore these resources, but instead function more as simple how-to guides for unrelated tasks.
The chapters on "The Internet for Teens" are not what one might expect. Rather than covering subjects of teen interest (for instance, researching colleges on the Web), they instead focus on more esoteric (and in my opinion, less generally interesting) aspects of the Internet: UNIX shell accounts, telnet, and FTP. Newsgroups are nicely covered in this section, but the topic might belong more appropriately with the chapters on e-mail and other text-based adventures.
The authors provide worthwhile supplementary material, including 14 lesson plans exploring key concepts in the book, with recommendations for the age-appropriateness of each, and a CD-ROM including "The Link Farm," consisting of hundreds of Web links arranged by subject. The lesson plans are included on the CD-ROM. The next edition of the book ought to omit the CD-ROM in favor of a Web site to perform the same function.
Although parts of this book are already outdated or superseded, its worth lies in its systematic presentation of what the Internet is, how it is navigated, and what use can be made of it. Therefore, this book will continue to be useful for years to come.
Roger Strode Harrison (rharrison@fullerton.edu) is a systems analyst at the Pollak Library, California State University, Fullerton.
Copyright © 2001 by Roger Strode Harrison. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at rharrison@fullerton.edu.
REVIEW OF: Allison J. Head. (2000). Design Wise: A Guide for Evaluating the Interface Design of Information Resources. Medford, NJ: Information Today.
By Wilfred (Bill) Drew
According to the author, Allison Head, "the purpose of Design Wise is to go beyond the typical considerations that often sway us into making a choice -- like price, availability, or content -- so that design of an interface itself is also a basis for evaluation" (p. xv). This work is intended for anyone who must evaluate information resource products. Design Wise is divided into two parts: Part 1: Interface Design Basics, and Part 2: Interface Design Analysis.
Part 1: Interface Design Basics provides an introduction to HCI (human-computer interaction) design and how to evaluate information resources. Chapter 1 is titled "Why Design Matters." Chapter 2, "Secret Shame", describes how many developers knowingly produce badly designed resources. Mitchell Kapor of Lotus fame is interviewed. This chapter also describes the trade-offs made by developers and how important usability testing is in the process. "Deconstructing Evaluation", chapter 3, examines how complex the evaluation of any computer-based resource has become. One of the most valuable parts of the book, an evaluation template, is presented in this chapter. All three chapters contain charts, tables, examples, and interviews.
Part 2: Interface Design Analyses applies the evaluation template to three formats of resources: CD-ROMs, Web sites, and online commercial vendors. Chapter 4 examines the design of interfaces to CD-ROMs. Peter Jacso is interviewed about the future of the CD-ROM and how it will fare in the new Web world. Chapter 5 looks at the Web design process and how it needs to include a more user-friendly design. Lou Rosenfeld, co-author of Information Architecture for the World Wide Web, discusses the problems of designing information systems in the current environment of the Web.
Chapter 6, "Online Commercial Databases: Power Tools Unplugged?," looks at the pluses and minuses of moving commercial services to the Web.
Chapter 7, "Four Predictions," makes four predictions based on the research conducted to write the book.
Design Wise belongs on the desk of every Web designer next to Lou Rosenfeld's book, Information Architecture for the World Wide Web. It should be required reading for every student in all master's programs in library and information studies. This book is highly recommended as a purchase for all libraries and Web design departments.
Wilfred (Bill) Drew (drewwe@morrisville.edu or http://www.morrisville.edu/~drewwe) is Associate Librarian, Systems and Reference at SUNY Morrisville College Library.
Copyright © 2001 by Wilfred (Bill) Drew. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at drewwe@morrisville.edu.
REVIEW OF: Tim Lindholm and Frank Yellin. (1999). The Java Virtual Machine Specification (2nd ed.). Reading, MA: Addison-Wesley.
By Michael B. Spring
The Java Virtual Machine is the software that allows Java programs to run on multiple platforms. It is the software layer responsible for the interface between the byte code of the Java class files and the execution of those instructions on a particular hardware platform. As such, it is what delivers on Java's promise of cross-platform capability. Its role as interface to the host system also makes it the locus of the security capabilities promised in Java. Finally, given the intent to provide small footprints on limited-capability platforms, it is the piece of the Java platform that must be developed efficiently and cleanly.
The intent of this book is to provide the technical specification for the Java Virtual Machine (JVM). At the same time, it provides an insider's perspective on building a Java Virtual Machine that makes it an enjoyable as well as an informative read. It is a standard that is enriched by the primary author's perspective and comment. The information is clearly presented and the discussion is informative without being overbearing. The caveats and assumptions are of import and well stated.
From this reader's point of view, chapter two, "Java Programming Language Concepts," is one of the most succinct and well-written overviews of Java available. It puts all of Java on the table in a clear, concise way. It is a great read after having muddled through all the dos and don'ts in the myriad of Java programming books on the market. While it is an expensive way to acquire a structural overview, the serious Java programmer who has spent hundreds of dollars on less definitive programming texts might well consider this addition to his or her library for the structural clarity it provides. Sun might consider using this chapter as an appendix in some of its supposedly definitive books like Core Java.
Chapter three addresses the structure of the JVM and chapter four goes into the binary class file format. The structure of the file as created by the compiler is defined, followed by an explanation of the role of the JVM in examining the class file. This includes a detailed discussion of the various phases of class verification as well as the bytecode verifier, instance initialization, and exception checking. Chapter five addresses the loading, linking, and initialization processes, along with issues of validity and trustworthiness.
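To give a flavor of the class file layout the book defines, the fixed header fields (magic number, version, constant pool count) can be unpacked in a few lines. This sketch uses Python rather than Java purely for brevity, and the bytes shown are a fabricated header fragment, not a real compiled class:

```python
import struct

# The first ten bytes of every class file follow the fixed layout the
# specification defines: u4 magic, u2 minor_version, u2 major_version,
# u2 constant_pool_count, all big-endian.
header = bytes.fromhex("cafebabe0000002e0010")

magic, minor, major, cp_count = struct.unpack(">IHHH", header)
assert magic == 0xCAFEBABE  # a conforming JVM rejects the file otherwise
print(f"class file version {major}.{minor}, "
      f"{cp_count - 1} constant pool entries")
```

Major version 46 corresponds to the Java 2 platform the second edition documents; the constant pool count field is one greater than the number of entries, a quirk the book explains.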
The bulk of the book is a simple and clear reference to the JVM instruction set. Each of the op codes is described in terms of the format, value, operand stack inputs and outputs, and a description of the processing. The description contains a delineation of the exceptions that can and cannot be thrown. This is the substance of the specification and constitutes about half of the book.
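A few of the actual one-byte opcodes documented in this reference (iconst_2 is 0x05, iconst_3 is 0x06, iadd is 0x60, ireturn is 0xac) can be exercised by a toy interpreter. This Python sketch is nothing like a real JVM, but it illustrates the operand-stack discipline the per-opcode descriptions assume:

```python
# Opcode values taken from the JVM instruction set reference.
ICONST_2, ICONST_3, IADD, IRETURN = 0x05, 0x06, 0x60, 0xAC

def execute(code):
    """Interpret a bytecode sequence against a single operand stack."""
    stack = []
    for op in code:
        if op == ICONST_2:
            stack.append(2)          # push the int constant 2
        elif op == ICONST_3:
            stack.append(3)          # push the int constant 3
        elif op == IADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)      # pop two ints, push their sum
        elif op == IRETURN:
            return stack.pop()       # return the int on top of the stack
    raise ValueError("fell off the end of the method without a return")

print(execute(bytes([ICONST_2, ICONST_3, IADD, IRETURN])))  # prints 5
```

Each entry in the book's reference section spells out exactly this kind of information for every opcode: what it pops, what it pushes, and which exceptions it may throw.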
Chapter seven provides some useful insight into the coding of a JVM. As with any complex programming activity, it helps to have a picture of how those who envisioned the specifications implemented them. This chapter provides just enough insight without too much detail. The authors talk about the things that they have done, as well as the things that need to be done or that will be done in future releases. As we look to a time when JVMs will be written for hundreds of different types of devices -- along the lines originally envisioned in the "Green" project -- these insights provide valuable support for those who will engage this task.
Michael B. Spring is an Associate Professor of Information Science and Telecommunications at the University of Pittsburgh. He can be reached at spring@imap.pitt.edu.
Copyright © 2001 by Michael B. Spring. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at spring@imap.pitt.edu.
REVIEW OF: Erik T. Ray. (2001). Learning XML. Sebastopol, CA: O'Reilly & Associates.
By Brian K. Yost
The acronym SGML (Standard Generalized Markup Language) is often jokingly expanded as "Sounds Good, Maybe Later." The promise of SGML is great, but its complexity and lack of practical application outside of publishing have prevented it from gaining wide general use. XML (Extensible Markup Language), which is based on SGML, is predicted to change all this. XML is touted as having limitless application and being less complex than SGML. With all this promised potential, the question I have when reading about XML is: "For what practical applications can I use XML now in terms of Web publishing and information management?"
So how does Learning XML do in providing answers to my question? Erik T. Ray does an excellent job of explaining what XML is and why it holds great potential, but the book is a bit lacking in discussing current practical uses. However, this is an intentional approach. Learning XML takes a theoretical approach to XML rather than attempting to be a cookbook of XML programs. In the preface, Ray states: "This book is intended to give you a birds-eye view of the XML landscape that is now taking shape" and "We'll concentrate on the theory and practice of document authoring without going into too much detail about writing applications or acquiring software tools" (p. x). The book meets these stated goals very well. In addition, Ray has included examples and applications to illustrate the concepts covered in the chapters.
Learning XML is organized in a logical manner. Chapter 1, "Introduction," serves as an overview of XML and includes a very good section describing just what XML is. Also included are sections on the history, goals, and current state of XML. Chapter 2, "Markup and Core Concepts," explains the structure of XML documents. Chapter 3, "Connecting Resources with Links," provides in-depth and detailed information on linking resources in XML. Chapter 4, "Presentation," explains the CSS (Cascading Style Sheets) standard and how it can be used to style XML documents for display. Chapter 5, "Document Models: A Higher Level of Control," covers DTDs (document type definitions) and XML Schema. Chapter 6, "Transformation: Repurposing Documents," covers XSLT (Extensible Stylesheet Language Transformations). Chapter 7, "Internationalization," addresses issues of XML character sets and encoding. Chapter 8, "Programming for XML," introduces using Perl, SAX, and DOM with XML. There are two appendixes: "Resources" and "A Taxonomy of Standards." A glossary and index are also included.
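The core concepts of chapter 2 -- well-formed nesting, elements, attributes, and character data -- fit in a minimal example. The document and ISBN value below are invented for illustration, and the parsing is done with Python's standard library rather than any tool from the book:

```python
import xml.etree.ElementTree as ET

# A minimal well-formed XML document: a single root element,
# properly nested child elements, one attribute, and character data.
doc = """<catalog>
  <book isbn="0-00-000000-0">
    <title>Learning XML</title>
    <author>Erik T. Ray</author>
  </book>
</catalog>"""

root = ET.fromstring(doc)
book = root.find("book")
print(book.get("isbn"))        # attribute value
print(book.findtext("title"))  # element character data
```

Anything that violates these rules (an unclosed tag, overlapping elements, an unquoted attribute) is simply not XML, and a conforming parser must reject it; that strictness, as Ray explains, is much of what distinguishes XML from the HTML browsers tolerate.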
Many of the chapters go into great detail about rules and syntax, such as the section on XPointer in chapter 3. At times, I thought there was too much detail for a book intended to be an overview, and this made it difficult to read straight through. If I am going to learn these kinds of details, I would actually like to be working with XML. However, because of the level of detail, I will find this book valuable as a reference in the future when I need a refresher on a topic -- much like I use O'Reilly's HTML & XHTML: The Definitive Guide to review a topic when needed.
Learning XML would be used best in conjunction with other titles that provide details on using XML for practical applications. Some titles that would complement it include:
- Marchal, B. (2001). XML by Example (2nd ed.). Indianapolis, IN: Que.
- Cagle, K., Gibbons, D., Hunter, D., Ozu, N., & Pinnock, J. (2000). Beginning XML. Birmingham, UK: Wrox Press.
- Harold, E.R. (2001). XML Bible (2nd ed.). New York, NY: Hungry Minds.
- Ozu, N., et al. (2001). Professional XML (2nd ed.). Birmingham, UK: Wrox Press.
With Learning XML, Ray meets his goal of providing a theoretical and practical overview of XML very well. He provides an excellent introduction to XML, as well as a reference to be used by more experienced users. Although you probably will not be able to start using XML for practical applications immediately after reading Learning XML, you will have a good understanding of the purpose, structure, and potential of XML.
Brian K. Yost (yostb@hope.edu) is Systems Librarian at Hope College in Holland, Michigan.
Copyright © 2001 by Brian K. Yost. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at yostb@hope.edu.
REVIEW OF: Julie M. Still. (Ed.). (2001). Creating Web-Accessible Databases: Case Studies for Libraries, Museums, and Other Nonprofits. Medford, NJ: Information Today.
By Lydia Ievins
In the early planning stages of a project to render an existing database Web-accessible, Still looked around for published results of others' related experiences and discovered a paucity of material. This volume has been assembled to fill that void. It presents a range of casual and accessible narratives from a dozen different writers in academic, public, and corporate environments.
The first perspective in this collection comes from Ronald C. Jantz, whose role at the Scholarly Communications Center of the Rutgers University Libraries involves fostering collaboration among faculty and librarians to enable the Web publication of existing topical databases. He describes a procedural and architectural approach emphasizing reusability, explaining that this economy of effort allows for a single straightforward technical solution across multiple projects (some half-dozen completed databases are mentioned), one even replicable at any other research library. A second Rutgers voice, that of Vibiana Bowman, provides a more detailed account of a recent project facilitated by Jantz's organization. In her case, the starting point was an old (1980s) bibliographic database of local history, a searchable index of the local newspaper created with the ProCite citation management tool. She describes a laborious conversion process, the results of which were just being readied for public release as of the publication date, and offers administrative and logistical advice to others facing similar projects.
Mary and John Mark Ockerbloom administer a pair of collaboratively developed sites, the On-Line Books Page and the Celebration of Women Writers, housed at the University of Pennsylvania. Both sites are generated from back-end databases, offering field searching and multiple browsable views of the data. This joint case study describes their scope, development, and maintenance processes. Because these sites have been up since the early '90s, the Ockerblooms offer a relatively seasoned perspective. By contrast, Melissa Doak's project at SUNY Binghamton, a historical site about Women and Social Movements in the United States, is in its relative infancy. Her experience began with a focus on document content, and more recently incorporated a complete reassessment of navigational and site design needs; the project is still seeking funding for development of an underlying database.
The Virginia Digital Library Program (VDLP) provides funding and implementation services for digitization projects across the state. Elizabeth Roderick writes that among the VDLP's accomplishments number some 20 new bibliographic databases. These started out in a wide variety of formats, from fragile hand-written card files to word-processing files to legacy databases. All these records have now been painstakingly converted to MARC; the resulting databases are published centrally and cross-searchable via a single gateway. Roderick describes the personnel and technical resources required to maintain existing projects, offering an administrative overview of such issues as usage statistics, security concerns, and the cost-benefit of providing digital library services. From a similarly administrative perspective but from the commercial sphere, Vicky H. Speck at ABC-CLIO describes her organization's experience in publishing an existing print and CD-ROM database product to the Web, addressing questions like re-keying of print indexes, using in-house development versus hiring outside help, and employing principles of iterative design.
The following several chapters stray from the specific problems of creating Web-accessible databases to take on an assortment of related scenarios, these primarily depicting the use of such databases. It seems clear that in her editorial role, Still wished to round out her collection with a broader range of experiences -- but this tactic leads to something of a fragmented reading experience. Her next two contributors hail from a bookselling background. Brian-John Riggs of Papermoon Books relates his recent personal journey to becoming an online bookseller, as part of the Advanced Book Exchange (ABE) network of retailers throughout the U.S. and Canada. Jeff Strandberg writes about the same community from a different angle: His company, 21 North Main, works in alliance with ABE to market these retailers' used, rare, and out-of-print books to libraries. Less about creating Web-accessible databases than about showcasing a single application where such tools have proven indispensable, these two case studies serve as a counterpoint from the small start-up business perspective. The next editorial leap takes us away from database creation altogether to focus instead on the user experience. Anne T. Keenan and Laura B. Spencer, from the Blair (Nebraska) Public Library and from Rutgers University, respectively, describe the issues they encounter in helping library patrons make use of databases and other resources in the electronic environment. Keenan writes about her patrons' Internet use patterns and about helping them learn to evaluate Web resources; Spencer's thoughtful analysis looks more closely at the philosophical implications of ways in which the Web has both simplified and complicated the library research environment. 
After this, in a gesture that Still must have intended as a helpful bit of conceptual framework for non-librarian readers, Aurora Ioanid of Monmouth University and Vibiana Bowman of Rutgers offer a whirlwind overview of the concept and history of metadata.
The final chapter returns to the promised program of case studies in creating Web-accessible databases, this time from Oxford University's Richard Gartner. He draws an interesting parallel between the current proliferation of different database solutions and the chaotic period in cataloging history immediately before the standardizing influence of AACR2 and MARC. The same forces that drove standardization then, he argues, can be seen at work in the arena of digital content now. He proposes XML as the current best tool for devising a standard framework to describe large, multifaceted digital collections. As examples, he cites several precedent-setting projects that have taken this approach: American Memory Project at the Library of Congress, the Online Archive of California, Harvard's Visual Information Access catalog, and the Internet Library of Early Journals at Oxford.
By design, given the ephemeral nature of individual technologies, this volume treads very lightly on technical details, choosing instead to focus on process and people. In some cases, the technically oriented reader will find not even the sparest of "techy" descriptions. Thus, the collection's primary utility will be for non-technical readers who are just beginning to gather advice and impressions about an upcoming project, or who merely want some generalist background reading about what might be involved in building a Web-accessible database.
Lydia Ievins (ievins@alumni.si.umich.edu) is a freelance information professional in western Massachusetts.
Copyright © 2001 by Lydia Ievins. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at ievins@alumni.si.umich.edu.