Telecommunications Electronic Reviews (TER)


Volume 3, Issue 4, July 1996

Telecommunications Electronic Reviews (TER) is a publication of the Library and Information Technology Association.

Telecommunications Electronic Reviews (ISSN: 1075-9972) is a periodical copyright © 1996 by the American Library Association. Documents in this issue, subject to copyright by the American Library Association or by the authors of the documents, may be reproduced for noncommercial, educational, or scientific purposes granted by Sections 107 and 108 of the Copyright Revision Act of 1976, provided that the copyright statement and source for that material are clearly acknowledged and that the material is reproduced without alteration. None of these documents may be reproduced or adapted for commercial distribution without the prior written permission of the designated copyright holder for the specific documents.


   Contents

   REVIEW OF: Yuval Fisher. Spinning the Web: A Guide to Serving Information on the World Wide Web. (Tim Bucknall)
   REVIEW OF: Richard J. Linn and M. Umit Uyar, eds. Conformance Testing Methodologies and Architectures for OSI Protocols. (Mark Hinnebusch)
   REVIEW OF: Daniel C. Lynch and Marshall T. Rose, eds. Internet System Handbook. (Mark Leggott)
   About TER


   REVIEW OF: Yuval Fisher. Spinning the Web: A Guide to Serving Information on the World Wide Web. New York: Springer, 1996.

by Tim Bucknall

This book is crammed full of information from cover to cover, discussing everything from the history of the Web and the construction of URLs to CGI scripts and server security. According to the author, this copious and extremely diverse information is intended primarily for an audience of content providers, Web page designers, and Web masters.

Spinning the Web is generally well written, and Fisher interjects enough notes of wry humor to offset the occasional turgid lapse into lengthy technical description. But the book suffers from a lack of focus stemming from an attempt to cover adequately both basic and advanced concepts. It seems highly unlikely that many individuals interested in a very basic definition of the Web and an introduction to the fundamentals of HTML would also be interested in a comprehensive listing of strftime format strings and the intricacies of JavaScript and VRML. Indeed, the book adopts a somewhat schizophrenic approach to its target audience, with much of the first half of the work devoted to topics of interest primarily to Web neophytes and the second half accessible only to those with some degree of technical savvy.

Yet even the initial chapters, which describe such basic ideas as the history and function of the Internet and of Web browsers, may prove somewhat challenging to the vast majority of the Web's users, who utilize popular GUI browsers such as Netscape or Microsoft Explorer on Macs or PCs. For these users, the book's extremely strong, but largely unannounced, UNIX bias may prove more than slightly disconcerting.

For example, without warning that this instruction is platform-specific, Fisher tells readers that the best way to change external viewers for Mosaic is to alter the ".mailcap" file. Even more inexplicably, Fisher asserts that, "for the novice, FTP can be complicated, since the FTP interface is not graphical." He then proceeds to show an example of how to download a file from the Internet using command line FTP. Given the prevalence of PC-based GUI Web browsers, it would seem much more useful and practical to de-emphasize UNIX commands and instead use descriptions and examples more relevant to the types of Web browsers commonly found on most desktops. For example, a brief tour of Webbed software archives and instructions on how to download software from within a browser would seem more useful than a description of Archie and command line FTP.
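For context, the mechanism Fisher is describing works like this: on UNIX, Mosaic (like other MIME-aware tools) reads a ".mailcap" file mapping content types to external viewer commands, one mapping per line, with %s standing for the downloaded file. A sketch of such a file follows; the viewer programs named here are common UNIX examples of the era, not prescriptions from the book:

```
# ~/.mailcap -- map MIME content types to external viewer commands.
# %s is replaced by the name of the temporary file the browser saves.
image/gif; xv %s
image/jpeg; xv %s
audio/basic; showaudio %s
video/mpeg; mpeg_play %s
```

Editing one line in this file changes which program handles a given content type, which is exactly the operation Fisher describes without flagging it as UNIX-specific.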

The next section starts well with "quick and dirty" instructions for setting up NCSA httpd and CERN httpd Web servers. These directions are short and accurate--fewer than 10 steps each. The author makes it easy for even a newbie to set up a Web server--assuming the newbie has the rights and space to do it. But then Fisher muddies things by providing an overly extensive listing of configuration directives, mixing arcane and little-used configuration commands with those that are absolutely essential. The reader is overwhelmed by the number of options, and the author offers little guidance on which directives are really necessary.
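To give a sense of scale: of the dozens of directives Fisher lists, only a handful are needed for a minimal NCSA httpd installation. A sketch follows, using directive names from NCSA's stock configuration files; the paths and addresses are hypothetical examples, not values from the book:

```
# httpd.conf -- core server directives (paths and addresses are examples only)
ServerRoot /usr/local/etc/httpd    # where the server's config and logs live
Port 80                            # TCP port the server listens on
User nobody                        # unprivileged user to run as
ServerAdmin webmaster@example.edu  # address reported in error messages

# srm.conf -- resource mapping
DocumentRoot /usr/local/etc/httpd/htdocs   # top of the document tree
```

Nearly everything else can be left at its default for a first server, which is the kind of triage the book never quite performs.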

After a brief, but generally strong, chapter on server security, Fisher spends three chapters covering HTML. He assumes the reader has no prior knowledge of HTML, and he provides excellent and extensive examples. Perhaps sensing that he is competing with hundreds of online HTML tutorials, Fisher personalizes his HTML guide by including some fun HTML tricks which he terms "truly nasty and naughty," complete with caveats and warnings.
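For readers wondering where such a tutorial begins, introductions of this kind typically open with a minimal page showing HTML's paired-tag structure. The example below is representative, not taken from the book:

```html
<!-- A minimal HTML page: each opening tag has a matching closing tag. -->
<html>
<head>
<title>My First Page</title>
</head>
<body>
<h1>Hello, Web</h1>
<p>A paragraph with a <a href="http://www.example.com/">link</a>.</p>
</body>
</html>
```

Fisher builds from this sort of skeleton up through forms, tables, and his "nasty and naughty" tricks.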

The remaining chapters of the book are quite technical in nature, covering CGI scripts, WWW utilities, Java, and VRML. Here Fisher seems most in his element, charging from topic to topic and providing accurate and well-thought-out examples. One caution, however: he does tend to move a bit quickly, and those without prior UNIX experience may be left behind.
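The CGI idea at the heart of these chapters is simple: the server runs a program, hands it request data through environment variables, and relays the program's output (an HTTP header block, a blank line, then the document) back to the browser. A minimal sketch follows in Python for compactness--Fisher's own examples use Perl and C, and the variable names below follow the CGI convention rather than any example from the book:

```python
import os

def cgi_response(body: str) -> str:
    """Build a CGI reply: header lines, a blank line, then the document."""
    return "Content-Type: text/html\r\n\r\n<html><body>" + body + "</body></html>"

if __name__ == "__main__":
    # The server passes request data via environment variables such as
    # QUERY_STRING; the script reads them and prints its reply to stdout.
    query = os.environ.get("QUERY_STRING", "")
    print(cgi_response("You sent: " + query))
```

The same pattern, with form decoding and file locking added, underlies most of the scripts Fisher works through.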

Throughout the book, Fisher refers to hundreds of Web sites containing further information that may be of interest to the reader. To simplify access and keep up with changing URLs, Fisher maintains this list as a Web site (http://inls.ucsd.edu/y/WWWBook/). The site also includes examples of VRML, HTML, Perl, and C code. This addition could prove an excellent mechanism for keeping the book up to date, although when I visited the site, several of the links were already dead.

This book is not recommended for novices. However, the often lively writing style and the generous examples do much to overcome the book's meandering focus and make Spinning the Web, at $27.95, a decent resource for intermediate and advanced Web users who do much of their work in UNIX.

Tim Bucknall (BUCKNALL@sesat.uncg.edu; URL http://www.uncg.edu/~bucknall/tim.html) is Electronic Information Resources Librarian at the University of North Carolina at Greensboro. He deals with the Internet and other electronic information networking and access management functions.

Copyright © 1996 by Tim Bucknall. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at BUCKNALL@sesat.uncg.edu.



   REVIEW OF: Richard J. Linn and M. Umit Uyar, eds. Conformance Testing Methodologies and Architectures for OSI Protocols. Los Alamitos, CA: IEEE Computer Society Press, 1994.

by Mark Hinnebusch

This large collection of 35 previously published papers (525 pages in all) reviews the state of the art in both the theoretical underpinnings and the practical application of methods for testing software and hardware products to ensure or verify conformance to communications protocols.

All of the material is extremely technical in nature and primarily of interest to those working in this specialty of telecommunications and theoretical computer science. The reader from a marginally related field may benefit from some of the articles of a tutorial nature. For instance, there are overviews of the OSI Reference Model, the ASN.1 specification language, and finite state automata theory. More specific, but still of somewhat general interest, are tutorials on state transition specification tools such as TTCN, LOTOS, and TESDL. Most of the real-world cases deal with X.25 conformance testing, but both X.400 and ISDN case histories are represented.
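The finite-state view underlying most of these methods can be summarized briefly: a protocol is modeled as a machine mapping (state, input) to (next state, output), and an implementation conforms to a test sequence when its observed outputs match the machine's. A toy illustration in Python follows; the three-state machine here is invented for the sketch, not drawn from any paper in the collection:

```python
# A protocol modeled as a Mealy machine: (state, input) -> (next state, output).
SPEC = {
    ("idle", "connect"): ("open", "ack"),
    ("open", "data"):    ("open", "ack"),
    ("open", "close"):   ("idle", "ack"),
}

def conforms(trace):
    """Check an observed (input, output) trace against the specification."""
    state = "idle"
    for inp, observed in trace:
        if (state, inp) not in SPEC:
            return False            # input not permitted in this state
        state, expected = SPEC[(state, inp)]
        if observed != expected:
            return False            # implementation produced the wrong output
    return True
```

Test-sequence generation--choosing traces that distinguish a conforming machine from every plausible faulty one--is where much of the theory in this collection lives.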

The material certainly has a limited audience, but for that audience it provides a good overview of the literature. The articles' publication dates range from 1964 through 1992, with the majority first published in the mid-1980s. This collection is a good example of the difficulty of collecting and republishing relevant material in monographic form in a timely manner. In a field as dynamic as telecommunications, such collections can only serve a retrospective function.

In the mid-1980s, and even in the early 1990s, OSI was the wave of the future, and formal conformance testing mechanisms were seen as critical. OSI has lost a bit of its luster, and formal conformance testing is viewed as much less important in the TCP/IP community. While much of the material in this book is independent of specific protocols, the emphasis on OSI certainly gives it a dated appearance.

Mark Hinnebusch (FCLMTH@NERVM.NERDC.UFL.EDU) is with the Florida Center for Library Automation. He is the long-time chair of the Z39.50 Implementors' Group and the coauthor of From A to Z39.50: A Networking Primer (Westport, CT: Mecklermedia, 1995).

Copyright © 1996 by Mark Hinnebusch. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at FCLMTH@NERVM.NERDC.UFL.EDU.



   REVIEW OF: Daniel C. Lynch and Marshall T. Rose, eds. Internet System Handbook. Reading, MA: Addison Wesley Publishing Co., 1993.

by Mark Leggott

The Internet System Handbook is a collection of essays written by 23 Internet experts, including the two editors, plus Vinton G. Cerf, Stephen Kent, and John S. Quarterman. Though the book was published in 1993, all of the essays were written in 1992. Although this publication date may seem like a big disadvantage (four years on the Internet is like 28 years anywhere else, even if no one knows you're not a dog), this volume is still a useful contribution to the literature on the ubiquitous Net.

The book is written for intermediate to advanced users (one chapter provides details for writing a sample client-server program in C), although it could also serve as an introduction to anyone wanting to learn more about the development and design of the Internet. As one would expect with a collection of essays, there is quite a range of writing styles and depth of coverage. Some essays are very general, and accessible to a wide readership. Others are quite technical, and require some knowledge of general network technologies and/or C programming. If you're one of those people who likes their reading to have flow and consistency, you may have a problem with this one. Then again, this ain't exactly Stephen King.
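The client-server chapter mentioned above walks through the standard socket calls in C. The same connect/send/recv sequence is sketched below in Python for compactness; the loopback address, ephemeral port, and payload are invented for the sketch and are not from the handbook:

```python
import socket
import threading

def echo_server(srv):
    """Accept one connection and echo back whatever the client sends."""
    conn, _addr = srv.accept()
    data = conn.recv(1024)
    conn.sendall(data)
    conn.close()

def demo():
    # Server side: bind an ephemeral loopback port so the sketch is
    # self-contained, then handle one connection in a background thread.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    threading.Thread(target=echo_server, args=(srv,)).start()

    # Client side: the same socket/connect/send/recv sequence the
    # handbook's sample program performs in C.
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"hello")
    reply = cli.recv(1024)
    cli.close()
    srv.close()
    return reply
```

Readers comfortable with this pattern will get the most out of the handbook's programming chapter; others can skip it without losing the thread of the book.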

One reason the book remains useful today is that little time is spent on discussing applications, although e-mail, FTP and Telnet are discussed at some length. The focus, instead, is on the foundations and architecture of the Internet. Although a number of the chapters are in desperate need of an update--there is no mention of Internet Protocol Next Generation or the World Wide Web, and the chapters on backbone management and network tools have little practical value--their historical/reference value remains. As someone who needs to understand how everything I use works, I find myself referring to this volume with some regularity.

This 800-page tome is divided into four main parts:

   Introduction--history, "administrative" architecture and activities, growth rate, globalization.

   Technologies--core protocols (ICMP, IGMP, UDP, DNS, TCP, IP); routers and routing protocols (distance vector, link state, multiprotocol, and interdomain protocols); foundation applications (mail, FTP, Telnet); programming TCP/IP-based applications (TCP and UDP examples).

   Infrastructure--directory services (finger, WHOIS, DNS, NETFIND, X.500); management (SNMP); tools (Internet backbone and network management); performance (IP, ICMP, ARP, TCP, and UDP layers); security.

   Directions--Internet growth and evolution, annotated bibliography.

Parts two and three, on technologies and infrastructure, account for the bulk of the book.

Bottom line assessment--this is a useful reference work for computer science teachers and students, techies, system administrators and others concerned with the technical underpinnings of the Internet, but if you want information on the latest protocols/applications/technologies, go elsewhere.

Mark Leggott (mleggott@stfx.ca) is a cybrarian at Angus L. Macdonald Library, St. Francis Xavier University. His outpost in cyberspace is at http://libwww.stfx.ca/mleggott.html.

Copyright © 1996 by Mark Leggott. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at mleggott@stfx.ca.



   About TER