
Volume 3, Issue 1, May 1, 1996
Telecommunications Electronic Reviews (TER) is a publication of the Library and Information Technology Association.
Telecommunications Electronic Reviews (ISSN: 1075-9972) is a periodical copyright © 1996 by the American Library Association. Documents in this issue, subject to copyright by the American Library Association or by the authors of the documents, may be reproduced for noncommercial, educational, or scientific purposes granted by Sections 107 and 108 of the Copyright Revision Act of 1976, provided that the copyright statement and source for that material are clearly acknowledged and that the material is reproduced without alteration. None of these documents may be reproduced or adapted for commercial distribution without the prior written permission of the designated copyright holder for the specific documents.
Contents
- REVIEW OF: John Colonna-Romano and Patricia Srite. The Middleware Source Book. by Bob Craigmile
- REVIEW OF: Brian Kahin and James Keller, eds. Public Access to the Internet. by Rebecca A. Ladew
- REVIEW OF: 1994 International Conference on Network Protocols, October 25-28, 1994, Boston, Massachusetts: Proceedings. by Tony Toyofuku
- Will the Web Save Advertising? Or Is It the Other Way Around? by Thomas C. Wilson
- About TER
REVIEW OF: John Colonna-Romano and Patricia Srite. The Middleware Source Book. Boston, MA: Digital Press, 1995.
by Bob Craigmile
This book envisions a day when information utilities will be like other utilities, offering a wide variety of services which can be accessed by a similarly wide variety of equipment. Until that time, middleware will, according to the authors, have an important role to play.
The book addresses a distinct type of software that runs across the various platforms found on networks. As it states, "Middleware is software that resides in the middle between an application program and the base-level operating system and networking capabilities of a computing system." (p. 3) Middleware seeks to make the platform transparent and the application portable. Its special emphasis is on client/server distributed processing, though it also supports peer-to-peer and distributed object programming. (p. 19)
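The general idea is easy to see in miniature. The sketch below is a hypothetical illustration rather than any vendor's actual API: an application issues requests through a middleware layer and never touches the platform-specific transport behind it.

```python
# A minimal, hypothetical sketch of the middleware idea described above: the
# application talks to a uniform service interface, and the platform- and
# network-specific details live behind it. This is an illustration only, not
# DEC's (or any other vendor's) actual product.

from abc import ABC, abstractmethod


class Transport(ABC):
    """Platform/network-specific delivery mechanism, hidden from the application."""

    @abstractmethod
    def send(self, destination: str, payload: bytes) -> bytes:
        ...


class InMemoryTransport(Transport):
    """Stand-in transport so the example runs anywhere; a real system might
    use TCP sockets, DECnet, or a message queue here instead."""

    def send(self, destination: str, payload: bytes) -> bytes:
        # Pretend the remote service simply echoes the request back, upper-cased.
        return payload.upper()


class Middleware:
    """The layer 'in the middle': applications call request() and never see
    how, or on what platform, the service actually runs."""

    def __init__(self, transport: Transport):
        self._transport = transport

    def request(self, service: str, text: str) -> str:
        reply = self._transport.send(service, text.encode("utf-8"))
        return reply.decode("utf-8")


# Application code stays portable: swapping the transport does not change it.
mw = Middleware(InMemoryTransport())
print(mw.request("catalog-lookup", "middleware makes the platform transparent"))
```

Swapping InMemoryTransport for a network-backed implementation changes nothing in the application code, which is the portability the authors have in mind.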
That the book has a bias toward Digital Equipment Corporation's systems is not a surprise, since it's published by Digital Press. This bias also explains its scant treatment of Unix. Digital middleware supports all DEC platforms, MS Windows systems, and Macintosh. (p. 22) After laying out what middleware is, the authors cover standards and services in great detail. Specifically, presentation, communication, control, information, computation, and management services are all given chapter-length treatments.
Heavy use of diagrams helps the reader grasp how middleware is put into action. The book also features a glossary, appendices, and a listing of sources for middleware packages.
The target audience is techies and "anyone who is planning, designing, building, or supporting client/server or distributed systems." (p. xviii) This book will serve as a reference for those doing this type of work, especially when DEC's equipment is involved. That said, it will have limited utility beyond experts because of its narrow audience, jargon, and tech-ese. It is clear that services like "middleware" will have a role, but books like this will not help them be easily implemented.
Bob Craigmile (librlc@emory.edu) is reference librarian at Pitts Theology Library, and writes for Computers in Libraries magazine.
Copyright © 1996 by Bob Craigmile. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at librlc@emory.edu.
REVIEW OF: Brian Kahin and James Keller, eds. Public Access to the Internet. Cambridge, MA: The MIT Press, 1995.
by Rebecca A. Ladew
This book is not about "accessing" the Internet per se; there are plenty of books on that topic. Instead, it is a discussion of the important issues involved in ensuring wide access to the Internet at a time when the network is undergoing rapid commercialization and growth. If the Internet is to continue functioning smoothly through these changes, the right questions need to be asked, and the economic and social issues need to be analyzed. Pornography, for example, is a recent and still-unresolved issue, with calls being made for censorship, and it may lead to further changes in the way policymakers make decisions. The contents of this book will be of benefit to network managers, politicians, network activists, and other professionals, and the volume is a good reference tool for anyone involved in policymaking for the Internet infrastructure.
As a model, the Internet inspired the development of the National Information Infrastructure (NII) concept, which was established by the Clinton-Gore Administration. The NII is an evolving model that encompasses not only the networks, information, software, standards, and people on the Internet, but also the seamless interconnection of all information and media, such as television, libraries, and voice communication. In this volume, editors Kahin and Keller join the other contributing authors in examining the Internet's potential for transforming communities and institutions, as well as issues in the transformation of the Internet itself.
Chapters in this book are grouped in five parts: "The Public Access Agenda," "The Sociology and Culture of the Internet," "Establishing Network Communities," "Accommodating New Classes of Users," and "Pricing and Service Models." Each section, for example "The Public Access Agenda," has several overviews by different contributors, and each overview is followed by notes and bibliographical references. At the end of this book there is a listing of the contributors, their positions and e-mail addresses, and a subject index. Dispersed throughout are several illustrations, charts, and tables.
Understanding how the information infrastructure operates, policy making, and the challenges that lie ahead are the issues discussed in "The Public Access Agenda." The culture of the Internet is being challenged by the influences of an expanding array of communities and institutions, and the people who operate in them. This trend and policy making, especially in the area of people's rights and responsibilities, are examined in "The Sociology and Culture of the Internet."
President Clinton, in his 1996 State of the Union Address, stated that there would be a computer in every classroom across the country by the turn of the century. Educational reform and the impact of networking on the learning and teaching processes are discussed in "Establishing Network Communities." Other topics discussed in this section include community cooperative networks, government funding for grassroots innovations and for citizen interaction, the influence of computerized network information on social change (e.g., the impact on Native American culture), and the role of public libraries in providing public access to the Internet.
In "Accommodating New Classes of Users," the authors discuss how the NII policy can overcome the barriers of developing a two-tier society of information haves and have-nots. The NII must demonstrate the positive effects of communication technology on low and moderate income families and the ability to lift Americans out of poverty. Meeting the challenges of business and end-user communities (e.g., what they want, what they need, and what they are doing) are studied in this chapter. Extending current Internet services to smaller users and several models for reducing current costs are also covered.
In "Pricing of the Internet," the role of the private and public sectors are reviewed. This discussion centers around efficient pricing schemes that encourage growth in network use and capacity and guide resources to the highest-value applications. Pricing policies for an integrated services Internet and competition are fully covered. An analysis of the effect of certain pricing policies is included.
This book is a publication of the Harvard Information Infrastructure Project, which began as a workshop to discuss how the federally subsidized Internet should evolve toward a commercial network. Other publications are forthcoming that will focus on the judgments and research of practitioners and scholars from a wide range of perspectives. It is hoped that these practitioners and scholars will play a vital part in the development of both policy and practice for the "information infrastructure."
Rebecca Ladew (Ladew@clark.net) is a member of the Governor's Advisory Board for Telecommunications Relay in the State of Maryland. Ms. Ladew is a representative for the speech/hard-of-hearing disabled community. When Rebecca was a VISTA Volunteer, she was the curriculum developer in an adult literacy program. Ms. Ladew has a master's degree in Instructional Technology.
Copyright © 1996 by Rebecca Ladew. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at Ladew@clark.net.
REVIEW OF: 1994 International Conference on Network Protocols, October 25-28, 1994, Boston, Massachusetts: Proceedings. Los Alamitos, CA: IEEE Computer Society Press, 1994.
by Tony Toyofuku
Today, the popular media is awash in advertisements for new high-tech devices such as cellular telephones and direct satellite broadcasts. The ads tell us of the clarity of the newest phones, or of the ability to receive 150 television channels at the push of a button. These products are relatively new to consumers, but twenty years ago, long before RCA introduced the Digital Satellite System TV receiver or some pundit came up with the term "flip phone," scientists were designing the underpinning communication protocols and discussing their research findings at engineering conferences. [1, 2]
Conferences, and the resultant proceedings, continue to play an important role in the dissemination of scientific information. One such conference, held in Boston, was the 1994 IEEE International Conference on Network Protocols (ICNP). As defined by Simon Lam, the chair of the conference, "network protocols" are:
distributed programs that offer services to users, which are either other network protocols or clients and servers of networked applications. Furthermore, ICNP is concerned with the entire development cycle of network protocols--from design and specification, to verification, testing, performance analysis, implementation, performance tuning, and management. (p. viii)
The proceedings generated from this conference are divided into nine sections: "Multicasting," "Protocol Design and Performance," "Protocol Testing," "Network Security," "Verification and Validation," "Protocol Synthesis," "Protocol Development Methods," "Protocol Design for Applications," and "Fault-tolerant Topology and Routing." The papers are presented by scientists from both academia and the telecommunications industry, and they are geared toward other scientists actively conducting research in the field of communications protocols.
Like most IEEE conferences, this one is highly technical and the topics are narrow. Such papers as "Multi-Rate Traffic Shaping and End-to-End Performance Guarantees in ATM" [3] and "Automated Synthesis of Protocol Specifications with Message Collisions and Verification of Timeliness" [4] are examples of the specificity of the topics presented. Each paper stands alone; the only relationship among papers is that they fall into the same general topic of "network protocols." The papers are all short (there are 28 papers in 247 pages), and all are brief overviews of the authors' ongoing research. Because of the brevity of the papers, the authors cannot go into tremendous detail, but there is certainly enough information to whet one's curiosity, and all of the papers have ample footnotes for further exploration.
It is difficult to weigh the merits of each individual paper, as there is a broad spectrum of topics, but overall the research is impressive. As a programmer, my research interests are firmly planted in the upper levels of network protocols; I deal only with the applications programming interface (API) and rarely do I delve into the cloistered sanctum of the lower levels of network protocols. Even so, I found all of the papers to be thought-provoking, and many of them opened my eyes to new ways of tackling programming problems. One article I found particularly provocative was "Single-Link and Time Communicating Finite State Machines." [5] I enjoyed this paper, not because it is higher in quality than the others, but because it gave me ideas for how I can better design communicating finite state machines. [6]
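For readers unfamiliar with the idea, the fragment below is a generic illustration of two communicating finite state machines exchanging messages over a single link in each direction; it is a sketch of the general technique only, not the construction analyzed in Peng's paper.

```python
# Two communicating finite state machines sharing a single link in each
# direction. A generic illustration of the technique, not Peng's construction.

from collections import deque

# Each machine is a transition table:
# {(current_state, received_message): (next_state, message_to_send)}
SENDER = {
    ("idle", None): ("waiting", "DATA"),          # spontaneously send data
    ("waiting", "ACK"): ("idle", None),           # acknowledged, return to idle
}
RECEIVER = {
    ("listening", "DATA"): ("listening", "ACK"),  # acknowledge each data message
}


def step(machine, state, inbox, outbox):
    """Consume at most one message, follow the transition table, emit any reply."""
    received = inbox.popleft() if inbox else None
    key = (state, received)
    if key not in machine:
        return state                              # no transition defined: stay put
    state, to_send = machine[key]
    if to_send is not None:
        outbox.append(to_send)
    return state


link_ab = deque()                                 # single link, direction A -> B
link_ba = deque()                                 # single link, direction B -> A
a_state, b_state = "idle", "listening"

for _ in range(4):
    a_state = step(SENDER, a_state, link_ba, link_ab)
    b_state = step(RECEIVER, b_state, link_ab, link_ba)
    print(a_state, b_state, list(link_ab), list(link_ba))
```

Because each machine's behavior is fully captured by its transition table, designs in this style lend themselves to the kind of systematic analysis the conference papers pursue.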
The one criticism I have is aimed more at the IEEE as a publisher than at the proceedings of the ICNP. With the proliferation of IEEE conferences on various aspects of telecommunication, there seems to be some duplication of papers that are presented at these conferences or that appear in other IEEE publications. In some cases, where there is substantial additional information or the research has been updated, this duplication can be helpful; for example, "A Simulation Study of the Impact of Mobility on TCP/IP" appears in the ICNP proceedings and later in the IEEE Journal on Selected Areas in Communications. In the notes of the journal article, the authors clearly state that "this paper was presented in part at the International Conference on Network Protocols." Rather than being a mere copy, the journal article is more detailed and in-depth than the paper presented at the conference. [7, 8]
In other cases, however, a nearly identical paper can be presented at more than one conference, and hence appear in more than one set of proceedings, with neither version referring to the other. This is not an indictment of any particular author. An example is the paper "Throughput Efficiency of a Link Management Procedure for LEO Satellite Systems," which, except for editorial changes, the authors also presented at the 1994 IEEE MILCOM conference under the title "Re-Examining the Reliable Link Initialization Procedure." [9, 10]
Sidestepping any real or perceived ethical questions, it is an unnecessary expense for libraries to purchase conference proceedings that contain previously published material. This is especially true when there is no clear indication that similar papers have already been printed elsewhere. As the blue-bound IEEE volumes are omnipresent in most technical libraries, one can only wonder how much redundancy there is among the many IEEE proceedings on communications that appear every year.
The question of originality aside, I enjoyed the papers; they are provocative, well-written and well-edited. While all of the ideas presented at this conference are currently found only in research laboratories, perhaps in the not-too-distant future one of these technologies will form the core of the telecommunications products of tomorrow.
Notes
[1] Fluhr, Z. & Nussbaum, E. (1973). Switching Plan for a Cellular Mobile Telephone System. In IEEE International Conference on Communications, 1, 11-13.
[2] Kohn, D. (1975). Aspects of Direct Satellite Television Broadcasting. In World Telecommunications Forum Technical Symposium (pp. 3.4.2/1-5). Geneva, Switzerland: International Telecommunications Union.
[3] Saha, D., Mukherjee, S., & Tripathi, S.K. (1994). Multi-Rate Traffic Shaping and End-to-End Performance Guarantees in ATM. In 1994 International Conference on Network Protocols, 188-195.
[4] Kakuda, Y., Igarashi, H., & Kikuno, T. (1994). Automated Synthesis of Protocol Specifications with Message Collisions and Verification of Timeliness. In 1994 International Conference on Network Protocols, 143-150.
[5] Peng, W. (1994). Single-Link and Time Communicating Finite State Machines. In 1994 International Conference on Network Protocols, 126-133.
[6] For a short discussion on FSMs, see: Shemitz, J. (1992, November). Multitasking State Machines. C Users Journal, 10(11), 23-34.
[7] Manzoni, P., Ghosal, D., & Ghafoor, A. (1994). A Simulation Study of the Impact of Mobility on TCP/IP. In 1994 International Conference on Network Protocols, 196-203.
[8] Manzoni, P., Ghosal, D., & Ghafoor, A. (1995, June). Impact of Mobility on TCP/IP: An Integrated Performance Study. IEEE Journal on Selected Areas in Communications, 13(5), 858-867.
[9] Ward, C., Mitra, S., & Phillips, T.M. (1994). Throughput Efficiency of a Link Management Procedure for LEO Satellite Systems. In 1994 International Conference on Network Protocols, 32-39.
[10] Ward, C., Mitra, S., & Phillips, T.M. (1994). Re-Examining the Reliable Link Initialization Procedure. In Proceedings of MILCOM '94 (pp. 517-521). New York, NY: IEEE.
Tony Toyofuku (toyofuku@uci.edu) is Electronic Services Librarian and Spanish and Portuguese Bibliographer at the University of California, Irvine.
Copyright © 1996 by Tony Toyofuku. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at toyofuku@uci.edu.
Will the Web Save Advertising? Or Is It the Other Way Around?
by Thomas C. Wilson
Over a year ago, I heard a radio interview with several people involved in advertising, talking about the wonderful future that the Internet held for their industry. Much of the hoopla centered around the notion that advertisers now spend time and money ineffectively trying to reach potential customers through traditional broadcast methods (e.g., radio, TV, magazines, etc.). These methods pose several serious challenges to the company wishing to sell products and to advertising agencies and publishers.
First, companies that advertise do not wish to waste money on exposing unlikely customers to their products--a fundamental flaw in broadcast media. Put another way, they wish to maximize coverage in potential markets.
Second, traditional media do not have very sophisticated or accurate means of determining who they are really reaching and what that audience wants. Yes, they do have survey data from samples of their readers, listeners, or viewers, and they have self-reported data from a small percentage of subscribers who respond to questionnaires. But statistically the conclusions that can be appropriately drawn from such instruments are relatively lame.
Third, when a company selects an advertising medium and outlet, how does it determine the "real" audience or market share of that particular outlet? To be sure, there are companies that attempt to measure such things, but they, too, fall prey to the limitations of the previous challenge.
With the wider availability of the Internet in recent years, some advertising agencies have acted as if this new medium would answer all of these challenges. To wit, the medium itself could provide a more extensive indication of "real" market share and more direct contact with the customer. Certainly, in theory a networked environment that provides some level of interactivity with the actual consumer would yield more accurate details about what s/he is seeking in a product. And it would provide opportunities for sharing information about available products at the level of detail that a particular consumer desires. But that is theory.
The interview I heard emphasized the desire that advertisers have to focus on potential customers, as opposed to boring, irritating, or entertaining an audience that has no interest in their products. The representatives from ad agencies were portrayed as victims of circumstance: they only want to get all of the information that can help a customer into his/her hands. The presentation ignored the fact that we are still talking about marketing literature, which is frequently not what the customer wants--they want factual specifications, diagrams, reviews, answers about their specific applications, etc. The network itself cannot change the content; it simply improves the delivery.
Companies have experimented with plying their trade via the net with some success. Some vendors have used creativity and a sense of humor to convey information about products to consumers on the net. This process does hold much promise for companies that decide to take the plunge, build the necessary infrastructure, and transform their thinking about marketing literature.
With the advent of video, animation, sound clips, and interactivity, the toolkit for companies wishing to convey product information via this channel has grown substantially. Having the tools, however, is just one step in the process--in fact it is the most straightforward part. Companies that are unwilling to recreate their content will struggle in this arena. For the sake of illustration, consider a company that wishes to use highly graphical animation to convince the viewer of his/her need for a product, but that refuses to buy a pipe larger than 56 Kbps to the Internet. One position influences the other negatively, and net customers will not endure the poor performance. Or how about the company that remains committed to selling products on the basis of abstract relationships between happiness and perfume or car models or vacation cruises or fruit drinks or clothing? Such an approach misses an important point in the changes underway.
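A little arithmetic shows why. The 56 Kbps link rate comes from the example above; the 2 MB clip size is a hypothetical figure chosen only for illustration:

```python
# Back-of-the-envelope check of the 56 Kbps example above. The 2 MB clip size
# is a hypothetical figure chosen for illustration.

link_bits_per_second = 56_000           # the 56 Kbps access line from the text
clip_bytes = 2 * 1024 * 1024            # a hypothetical 2 MB animation

seconds = clip_bytes * 8 / link_bits_per_second
print(f"Best-case transfer time: {seconds / 60:.1f} minutes")   # about 5 minutes
```

Even under ideal conditions the clip takes roughly five minutes to arrive, and few customers will wait that long for an advertisement.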
It may be possible in a universe of one-way broadcast television to sell products solely by appealing to human interests unrelated to the products themselves, because the advertiser has a somewhat captive audience in an atmosphere that encourages short, feel-good segments. Advertisers usually cannot afford to spend long periods of time explaining products, particularly when most of the audience may not be interested in the first place. Thus, advertisers seem to be thinking that, with the net, they will finally be able to reach an identifiable audience whom they can grab by exposing them to more of the same well-produced and slick, but trivial, marketing content.
This misguided thought is precisely what will trip up most advertisers. Mark Stahlman, president of New Media Associates, argues that the current corporate use of the Web will die because Web technology can't effectively manipulate people in the same way that TV ads can. [1] He notes that Web advertising must be informational or support-related, and that type of advertising does not generate enough revenue from consumers to warrant its continuance.
Much of the subsidized info-material we enjoy in our lives (e.g., magazines, television, and radio) is available because advertising dollars pay for most or all of the production costs. This model has moved into the Internet as a means of supporting free or low-cost access to many sites. Indeed, much of the push toward several recent technical developments has come from the desire to improve advertising effectiveness.
Once again we have an industry leaping toward a technology in an attempt to avoid a change in another area. Can advertisers migrate to a new model for content? Will the Web die as advertisers move on to more lucrative environments? How much are we willing to pay for real content? Sounds like more questions than answers in the short run.
Note
[1] Stahlman, M. (1996, April 8). Why the Web will die. InformationWeek, (574), 100.
Tom Wilson, Editor-in-Chief TER, TWilson@uh.edu
Copyright © 1996 by Thomas C. Wilson. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at TWilson@uh.edu.