TER Volume 4, Issue 7, July, 1997

ter - telecommunications electronic reviews


Telecommunications Electronic Reviews (TER) is a publication of the Library and Information Technology Association.

Telecommunications Electronic Reviews (ISSN: 1075-9972) is a periodical copyright © 1997 by the American Library Association. Documents in this issue, subject to copyright by the American Library Association or by the authors of the documents, may be reproduced for noncommercial, educational, or scientific purposes granted by Sections 107 and 108 of the Copyright Revision Act of 1976, provided that the copyright statement and source for that material are clearly acknowledged and that the material is reproduced without alteration. None of these documents may be reproduced or adapted for commercial distribution without the prior written permission of the designated copyright holder for the specific documents.


REVIEW OF: Mark Grand. Java Language Reference. Sebastopol, CA: O'Reilly & Associates, 1997.

by Craig S. Booher

Java Language Reference is designed as a reference manual for serious Java programmers. Consisting of ten chapters and an appendix, the book focuses on the syntax and lexical structure of the Java language (version 1.0.2). Early chapters discuss lexical analysis, data types, expressions, and declarations. The middle of the book covers statements and control structures, program structure, and threads. A detailed description of exception handling and an extensive discussion, representing almost half of the book, of the classes in the java.lang package complete the body of the manual. The appendix lists the Unicode character set.

Organization is one of the essential features of a good reference manual. Grand has done a superb job of organizing his information. He presents the language in a bottom-up order, starting with elementary components and building to more complicated structures. This allows the reader to quickly recognize the subunits of the language and their hierarchical relationships.

The contents of each chapter are also arranged in a hierarchical fashion. Section headings and subheadings are numerically classed with decimal notations as deep as four levels. This arrangement enables the reader to easily understand the position of any particular Java element within the overall context of the language.

Railroad diagrams are used extensively throughout the book to provide succinct representations of the syntactical and lexical constructs allowed by the language. Those versed in this convention can quickly grasp the sequence of words, symbols, and punctuation which form an acceptable construct within the Java language. Grand considerately includes a short, but extremely useful, explanation of the structure and conventions of railroad diagrams for those who are unfamiliar with this notation.

Another strength of the book is its syndetic structure. The table of contents, an index, and the systematic, thorough use of cross references are key elements in this structure. Because the book is so well organized, the table of contents, which provides entries to each chapter and main section, can often be used to locate a specific topic. A ten-page index provides very precise access to the book, including entries for non-alphanumeric elements (e.g., *, -, <, and >). Finally, most sections conclude with references to one or more related sections. In fact, these references are so common that the reader cannot help but picture this book in an online, hypertext environment.

Recognizing that Java is a relatively new programming language, the author has included several aids--familiar markers--to help readers orient themselves. Those familiar with C and its offspring will appreciate Grand's frequent comparisons of Java elements and constructs with their counterparts in C and C++. He also often provides examples to illustrate the language element or syntax being discussed.

The physical layout of the book contributes to its utility. Systematic use of font conventions allows the reader to readily interpret the significance of words and phrases. Section headings are printed in bold face and larger type, making them easy to locate. Plenty of white space, created by adequate margins, short paragraphs, bullet points, and examples, relieves the reader of potential eyestrain in such a technically dense work. One truly delightful feature is a paperback bound in such a way that it readily lies open without having to break the spine. This is an extremely useful feature when one is trying to read and keep two hands on the keyboard at the same time.

This book is not intended to be read sequentially. Nor is it an introductory book or a programming guide. As a reference manual, it is designed to provide quick, succinct answers to specific questions about the syntax and structure of the Java programming language. It will be very useful to those with intermediate or advanced experience with Java. Although beginners will probably want to include this book in their reference library, they will definitely need a tutorial or other basic work to provide them with a foundational knowledge of the language.

Craig S. Booher (cbooher@kcc.com) is the Technical Information Coordinator for the Kimberly-Clark Corporation and has been a reviewer for the Science and Technology Annual Reference Review.

Copyright © 1997 by Craig S. Booher. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at cbooher@kcc.com.

REVIEW OF: Ziad M. Awdeh and James N. Budwey. INTER COMM '97, Global Communications Congress and Exhibition: Congress Proceedings, February 24-27, 1997 Vancouver, BC, Canada. Norwood, MA: Artech House, 1997.

by Pamela Czapla

This tome disappoints.

Before going further, an overview of content is in order. This book reports the proceedings of three congresses held jointly at Inter Comm '97, in Vancouver, British Columbia. As Congress Chair James N. Budwey indicates in the introduction, two vertical markets were added: teleport and intelligent transportation systems. Thus the papers of three congresses (Inter Comm, the World Teleport Association, or WTA, and the Intelligent Transportation Society of America, or ITS America) are represented. A "blockbuster plenary" constituted the closing session.

The Inter Comm congress proceedings discuss carriers, wireless communications, enterprise networking, Global Information Infrastructure (GII), and regional infrastructure. Overviews cover new developments, convergence, and regions such as Latin America, Asia Pacific, Eastern Europe, and China.

The second block of papers features the World Teleport Association. According to Newton, teleports are high-bandwidth distribution systems that traditionally consist of a fiber optic/coaxial cable network and a collection of nearby satellite antennas. [1] The World Teleport Association papers discuss global opportunities, intelligent cities, and real estate usages. Papers represent activity in North America and the Pacific Rim. The last two papers in this section suggest growth areas for teleport technology such as multipoint conferencing.

An Intelligent Transportation System is "the application of technology and information, where appropriate, to improve the movement of people and goods." (p. 554) The papers from ITS America constitute the third section of these proceedings. Starsman ("Intelligent Transportation Systems--A Status Report") and Jaffe ("ITS Architecture Overview--Architecture Applications") provide decent status reports and architectural overviews, respectively. In "Intelligent Transportation Systems Market Opportunities," Parsons delineates six areas where he sees market opportunities. Unfortunately, none of the panelists on the future directions panel wrote up their presentations.

The closing blockbuster session featured panelists from all three congresses. However, only the President of the Intelligent Transportation Society of America submitted a paper. After providing an overview of the development of ITS, he suggests partnerships as essential to the future of ITS. He also suggests another six years of strong federal involvement to realize goals articulated in the Intermodal Surface Transportation Efficiency Act (ISTEA) of 1991.

That sums up the material in these sections. However, although the forum may be groundbreaking, the proceedings are not. They fall short in many ways. For many presenters there is only a photo, a biography, and a blank area for notes--no paper. Some papers are simply the proposals describing what the panelist will discuss, or abstracts. More complete papers often provide little more than platitudes. Promising titles turn out to be thinly veiled sales pitches. Statistics are cited without verification of source.

For those with patience and time, there are occasional gems. Thanks to this "alignment of planets," the reader has a chance to look across developments in these three areas in one sitting. Thus, these proceedings provide a quasi-state-of-the-art look at telecommunications today and in the near future. They address the advances or lack thereof. Current telecommunication confusion and uncertainties are all there.

These proceedings, like the congresses, are not for the neophyte. Acronyms abound. Few authors take the time to define terms. Panelists convey a high-level, sometimes global, understanding of the issues involved. The perspective is theoretical and philosophical rather than detailed.

The perspective is mainly North American with a sprinkling of global views. On a global basis Europe, Asia, and Latin America are best represented. There are sporadic mentions of Russia and Australia--no representation from Africa. This probably is a function of travel budgets as much as actual activity.

Like the six blind men and the elephant, statements reflect the backgrounds and professional origins of the authors. Telephone panelists extol telephone technologies such as ADSL (Asymmetric Digital Subscriber Line) and ISDN (Integrated Services Digital Network) for their contributions to telecommunications while minimizing the technologies of other industries like cable telecommunications. ITS America panelists discuss telecommunications from the standpoint of intelligent transportation systems.

Although the presentations are understandably myopic, they do demonstrate commonalities of issues and problems faced by the disparate congresses. Regulation dogs all of them. International problems know no boundaries. Difficulties of marketing to the consumer and customer satisfaction are experienced by all. Other common themes are (1) calls for a level playing field, (2) noting that national or regional standards conflict with global aspirations, (3) rising demand for telecommunications, and (4) the idea that progress should be driven by service, not technology.

In summary, the published proceedings, where papers are provided, portray the landscape of telecommunications with broad brush strokes. For the well-informed professional who would be the natural reader for these proceedings, this tome probably adds little.


[1] Newton, H. (1993). Newton's Telecom Dictionary. New York: Telecom Library Inc., p. 979.

Dr. Pamela Czapla (pjc2@psu.edu) is the Director of the National Cable Television Center Library at Pennsylvania State University.

Copyright © 1997 by Pamela Czapla. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at pjc2@psu.edu.

REVIEW OF: John Maxymuk, ed. Finding Government Information on the Internet: A How-to-do-it Manual. New York: Neal-Schuman Publishers, 1995.

by Robert H. Wittorf

This manual reviews access to government publications in electronic formats. The librarian who is used to thinking that the Government Printing Office (GPO) is the sole source of government publications (outside of the originating agency) will be surprised to find what has happened in the past several years. The trend has been away from producing paper copies.

First there were CD-ROM discs and magnetic media diskettes supplementing paper publications. Then came electronic delivery over telecommunication lines using diverse server types: Telnet, File Transfer Protocol (FTP), Gopher, and World Wide Web. With the proliferation of servers, the bibliographic control and distribution of U.S. federal publications no longer needed to pass through a single agency. Separate agencies distributed information directly to the public. Further, agencies no longer produced paper documents in many cases, substituting machine-readable copies that could be downloaded. Clearly, the Internet altered the landscape of government documents.

The early chapters of this book cover the efforts of the library community to have the U.S. government rationalize the distribution of government publications. Currently, there is no easy solution to bibliographic control, but then, there is no easy solution to bibliographic control over Internet publications generally.

A second development, occurring as part of the Paperwork Reduction Act (1980), was the privatization of U.S. federal information. "Privatization" frequently involved higher prices for formerly free or inexpensive information. In the words of the American Library Association, privatization provided "less access to less information." (p. 24) Providing a classic case study, in chapter two James Love spends several pages on the controversy surrounding HR (House of Representatives) 830 (1995), a bill which included draft provisions allowing private publishers of government information broad control over its dissemination and the prices they could charge for it. The chapter details the successful fight that librarians and others mounted to prevent government data from being priced beyond the means of ordinary citizens.

To be sure, the GPO still distributes paper and other media as part of the Federal Depository Library Program (FDLP), documenting them in the Monthly Catalog of U.S. Government Publications. In reality, however, a considerable amount of information is distributed through electronic means over telephone wires or using the Internet. In addition, the public may be charged fees to acquire some of this information.

This new means of distribution has added a dimension to servicing government document collections: the government document librarian needs to proactively prepare the library to retrieve information. Such preparations can include creating local Web sites with hypertext links to the myriad sources and variety of available government information. Chapter four covers the mechanics and decision making needed for maintaining library Web and Gopher pages on the Internet.

Chapters five and six describe the development of the GPO's efforts to provide access to electronic documents. They include information on registering with the Federal Bulletin Board (FBB), using it and the Government Locator Service, and public access. The first level of access, general access, is free; a second level is available to registered, validated users, who can download information and pay with a credit card number or a GPO deposit account. Finally, the depository librarian may access the FBB through a special password and retrieve most (but not all) files without cost as part of the FDLP.

From chapter seven on, the book surveys other electronic information sources of the varieties of government information: U.S. federal, state, and local governments, and foreign countries. The manual gives scores of uniform resource locators (URLs) to governmental and non-governmental World Wide Web sites, including law schools and commercial information providers. The sites are either collections of links to other sites or primary information sites. The authors describe each site, giving its authorship, purpose, and an overview of its contents.

The reader needs to have access to the Internet in order to use this manual with maximum effect as it contains scores of URLs for sites in the U.S. and abroad. Sites should be visited for their learning value since Web sites are dynamic and yesterday's contents may not be today's. There are the inevitable typographical errors or omissions in the URLs (p. 208--"Financenet" as http://ww.financenet.gov or p. 210--"Rice University" as http://www/rice.edu). In some cases the Web page has changed since it was cited (e.g., p. 208--the International Trade Administration Home Page should be http://www.ita.doc.gov and not http://www.doc.gov/resources/ITA_info.html).

The topics that the manual covers go beyond those of standard administrative, legislative, or judicial interest. Any documents librarian knows that U.S. federal information includes scientific, social, and historical information (to mention only three broad areas of interest). It is not surprising that the areas covered include technology and areas in which government is generally expected to be strong: patents and trademarks; agriculture; biology and chemistry; computer science; earth sciences; mathematics and physics; and space science. There are also URL references to files of graphic images at the Library of Congress, NASA (National Aeronautics and Space Administration), and the Smithsonian Institution.

Each chapter includes a short, relevant bibliography. At the end of the book is a list of all the URLs mentioned for Web access. In addition, there is a list of Telnet and FTP servers cited. The authors also provided an index.

In an otherwise outstanding manual, there is one notable omission: the publishers did not include a bookmark file on a diskette to facilitate browser access, or at least provide the file on a Web site for downloading.

This manual brings to mind the preservation issue, which strikes me whenever my browser tells me that a particular URL is no longer valid. It is just as easy to revise an online document or online statistics as it is to change a URL. Few independent individuals or agencies have the time to check on continuity of data. In the days when paper copies existed apart from the agency that produced them, it was relatively easy to prove that something had changed. It is not so simple anymore. A second preservation issue is also relevant. The information could just disappear from the server with few people the wiser. In this environment, the Internet may be characterized as delivering the greatest collection of ephemera ever assembled.

This manual is recommended for course work in government documents or for updating library skills in the area of government documents.

Robert H. Wittorf (wittorf.1@nd.edu) is the Assistant Director for Administrative Services, Planning, and Budget at the University Libraries of the University of Notre Dame.

Copyright © 1997 by Robert H. Wittorf. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at wittorf.1@nd.edu.

Life as Bandwidth | Networking is Life

by Thomas C. Wilson

In a recent conversation with a friend, I was reminded of how most of life can be seen as a bandwidth issue. We all have limits of time, money, space, energy, etc., and from time to time we may wish that we could just get some more bandwidth in these areas. In that conversation, we were discussing ATM (Asynchronous Transfer Mode) as a solution to network limitations, and I noted that even ATM--or any technology, for that matter--has a limit.

While it is true that some applications will be better served by a network architecture or design that guarantees levels of service, it is patently false that any technology can guarantee delivery of that service under any circumstance. And that point is precisely what we must keep in mind as we ponder, plan for, or evaluate new networking technologies. Certain portions of the market would have us believe that ATM--or fill in the blank for your favorite high-speed topology--is the answer for bandwidth constraints that we currently experience, or that, if we are indeed on top of things, we would prepare for migration to this technology in the near future.

The logic is flawed on two counts: (1) it appeals to the single-solution ideology that almost never holds in the real world, and (2) it simplifies the analysis to the point of virtually eliminating the inherent limitations of any technology. On the first point, suffice it to say that any argument based on the notion that there will be one and only one solution to a particular problem deserves serious critique, since it is by its nature more of a religious statement than a technological reality.

As for the second point, we seem to be doomed to be constantly barraged with "the next best thing" syndrome perpetuated by those who stand to benefit the most from wide success of a particular approach. Network bandwidth is indeed a serious challenge and new technologies should be pursued to attempt to address the portion of the problem that can be answered by throwing money at it wisely. A fundamental understanding of the problem and the solution--not necessarily technical--is critical to evaluating what might be addressed in a particular situation and to establishing realistic expectations for the performance of said solution.

In the case of ATM, the most noted characteristic is quality of service, a feature that permits an application to receive a guarantee of a certain level of bandwidth for a given time slice over a virtual circuit or channel. This attribute is particularly important for audio and video delivery over a network. The intellectual challenge is to avoid jumping to the conclusion that because ATM can offer this feature, ATM will never run out of bandwidth; no technology will ever be created that has unlimited bandwidth.

This leads to a focus on what issues can be resolved technically and which ones require the intervention of other factors. It would be possible to segment network traffic by type so that bandwidth-sensitive applications such as video would receive priority on virtual circuits that are reserved for such applications over bandwidth-insensitive applications such as email. To the extent that such segmenting can be done, it should be. Even this focus, however, is insufficient to deal with the real-world aggregation of bandwidth consumption. It is one thing to test a network architecture with twenty or so high-end users; it is quite another thing to link millions of users each demanding a slice of the pie.
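The segmentation idea described above can be sketched as a toy priority scheduler. This is a minimal illustration only; the two traffic classes, their priorities, and the strict-priority policy are my own hypothetical choices, not drawn from ATM or from any particular product:

```python
import heapq

# Toy strict-priority scheduler: latency-sensitive traffic (video) is
# served before latency-insensitive traffic (email) on a shared link.
# Class names and priority values are illustrative assumptions.
PRIORITY = {"video": 0, "email": 1}  # lower number = served first

def schedule(packets):
    """Return packets in the order a strict-priority scheduler sends them."""
    heap = []
    for seq, (kind, payload) in enumerate(packets):
        # seq preserves arrival order among packets of the same class
        heapq.heappush(heap, (PRIORITY[kind], seq, kind, payload))
    order = []
    while heap:
        _, _, kind, payload = heapq.heappop(heap)
        order.append((kind, payload))
    return order

arrivals = [("email", "msg1"), ("video", "frame1"),
            ("email", "msg2"), ("video", "frame2")]
print(schedule(arrivals))
# video frames are sent first; email messages follow in arrival order
```

Real quality-of-service machinery reserves capacity per virtual circuit rather than merely reordering a queue, but the effect on competing traffic classes is the same in spirit: the sensitive class is insulated from the insensitive one.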

At some point the technology alone will not address this issue. Quite frankly, if technology alone could, we would not have the Internet performance challenges we have at this time. Internet service providers, particularly the big ones, have every incentive to improve performance--if they could!

What we need is a better understanding of bandwidth consumption, user incentives, and technological limitations. Very few network managers probably know what crosses their wires. Yes, there are tools to digest the application protocols in use (e.g., telnet, POP mail, HTTP), but many managers have time only to deal with emergencies, not network analysis. Be that as it may, understanding bandwidth consumption also means knowing more than percent utilization; it includes understanding what that percent implies in a given network topology and what the system-wide performance is--after all, that's what our users experience!
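To see why a utilization figure alone says little about what users experience, it helps to work the aggregation arithmetic from the twenty-user versus million-user comparison. The numbers below are hypothetical round figures (a 155 Mbps link, a 1.5 Mbps per-user demand) and the fair-split assumption is mine, chosen purely for illustration:

```python
# Back-of-the-envelope aggregation: all figures are illustrative.
def per_user_share(link_capacity_mbps, users, demand_per_user_mbps):
    """Average bandwidth each user actually gets, assuming the link is
    split fairly once aggregate demand exceeds capacity."""
    aggregate_demand = users * demand_per_user_mbps
    if aggregate_demand <= link_capacity_mbps:
        return demand_per_user_mbps       # everyone is fully satisfied
    return link_capacity_mbps / users     # capacity is the ceiling

# Twenty high-end test users on a 155 Mbps link: no contention at all.
print(per_user_share(155, 20, 1.5))
# A million users demanding the same: the pie is sliced very thin.
print(per_user_share(155, 1_000_000, 1.5))
```

The same link can report healthy-looking utilization in both scenarios; only the aggregate demand reveals whether users are getting what they ask for.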

Although we don't like to talk about it, we also need to develop economic models that influence user behavior in a way that links consumption with some type of cost--not just money; there are other options available. If the "network" absorbs that consumption without some reflection back to the user of its impact, there is no way to determine user priority of service. In the absence of such incentives and disincentives, all consumption is treated equally for every user.

We also need to start approaching any technological solution not as a panacea, but as an alternative that has pros and cons. This entails recognizing that there is no perfect solution and that those who talk as though there is one are either naive or intentionally deceptive. In terms of network bandwidth, we must think in aggregations and real-world scenarios. The demand will always grow beyond our ability to provide the service. The technology cannot save us; we must save ourselves.

That point brings me back to my conversation with my friend: all of life fits this model. We have a limited number of hours in the day, a limited amount of personal energy, and a limited level of expendable income. Why should we be surprised to find that the technologies we create suffer from these same limitations?

Tom Wilson, Editor-in-Chief TER, TWilson@uh.edu

Copyright © 1997 by Thomas C. Wilson. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at TWilson@uh.edu.

About TER