Telecommunications Electronic Reviews (TER) Vol 7, Num 2, Feb 2000
Telecommunications Electronic Reviews (TER) is a publication of the Library and Information Technology Association.
Telecommunications Electronic Reviews (ISSN: 1075-9972) is a periodical copyright © 2000 by the American Library Association. Documents in this issue, subject to copyright by the American Library Association or by the authors of the documents, may be reproduced for noncommercial, educational, or scientific purposes as granted by Sections 107 and 108 of the Copyright Revision Act of 1976, provided that the copyright statement and source for that material are clearly acknowledged and that the material is reproduced without alteration. None of these documents may be reproduced or adapted for commercial distribution without the prior written permission of the designated copyright holder for the specific documents.
REVIEW OF: Robert D. Rodman. Computer Speech Technology. Boston, MA: Artech House, 1999.
by Betty Landesman
We have all seen science-fiction applications that either accept speech as input to a computer system (Star Trek IV, where Scotty refers to the 20th-century keyboard as "quaint") or turn the output of the computer system into speech (HAL in 2001: A Space Odyssey). Real applications also abound, from Kurzweil reading machines to the software used by Dr. Stephen Hawking that translates words selected from a menu into speech. Speech-output devices such as these, working for the most part from printed text, have been more successful to date than speech-recognition devices. Remember the early PDAs and how often dictated memos were garbled? What is it about human speech that makes it so difficult for computers to understand?
In this book, Robert D. Rodman of the Department of Computer Science at North Carolina State University sets out to describe the nature of language and language usage and to explain the concepts of speech processing. The book is intended not for the computer programmer but for the non-technical lay person with an interest in speech technology.
The book is divided into three sections. The first section covers "Fundamentals". Chapter 1, "About Speech," discusses the human vocal tract, phonetic alphabets, consonants and vowels, manners of articulation, the representation of sounds by letters and letter combinations, and dialects and languages. Chapter 2 covers the means by which sound vibrations are converted to electrical signals to be represented in the computer.
The second section covers "Uses". Chapters 3-5 define and explain the concepts and components of speech recognition (the ability to take speech as input and produce a transcript of that speech as output), speech synthesis (making computers talk), speaker recognition, language identification, and lip synchronization (the computer takes speech as input and outputs parameters to animate a face that appears to be speaking in sync with that speech).
The third section explores applications of these uses. Applications of speech recognition include assistive technology, telephone call processing, instruction of computers through speech by people with their hands full (e.g., military pilots and operating room physicians), and education (language learning, speech therapy, reading instruction). Applications of speech synthesis include screen readers, aids for the visually and speech impaired, foreign language learning, information retrieval and reporting, transportation, and requests for information from government services (e.g., jury duty, road closings, lottery results). Applications of speaker recognition include access, authentication, monitoring of prisoners, fraud prevention, and personal services (e.g., adjusting car seats). Applications of language identification include international telephone call routing, monitoring of communications, and public information systems. The only application of automatic lip synching seems to be animation.
The book concludes with a glossary.
I thought the overall organization of the book was somewhat uneven. The first two sections are the best, explaining many very specialized concepts in clear, non-technical, and, for the most part, non-mathematical terms. The reader will learn a great deal about the components of human speech and how they relate to speech recognition and speech synthesis systems. The section on applications, however, seems overly general, lacking in detail, and, in some cases, stretching in order to find an application. This area would have benefited from fewer but more detailed examples of specific applications: how they use (or will use) the features of human speech and/or the computer, and exactly why a particular application may be so difficult. There was some of this, but I felt there was not enough. For example, Dr. Rodman mentions the development at NCSU of a program to call to check on elderly people living alone. I would have liked to know more about how this program was designed and how well it worked.
It is extremely difficult to convey technical detail in words that someone who knows nothing about the subject can understand. Dr. Rodman succeeds admirably in this endeavor. I recommend this book to non-technical readers who wish to understand human speech production and how computer speech technology grapples with its complexities.
Betty Landesman (firstname.lastname@example.org) is Manager, Document Delivery Team at the Research Information Center of the AARP in Washington, D.C.
Copyright © 2000 by Betty Landesman. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at email@example.com.
REVIEW OF: Peter Jacso and F. W. Lancaster. Build Your Own Database. American Library Association, 1999.
by Margaret Sylvia
Written for librarians and others who may be new to database production, this is a basic text on database design and implementation. The book is divided into two parts: the first on content and organization of the database, the second on software issues. The section on content and organization of databases includes information on quality of information; accessibility of content; and the domain, scope, and currency of coverage. There is a nice explanation of the need for consistency in terminology, controlled vocabulary, and subject indexing. The creators of the major Internet search engines would do well to read this section for an understanding of why simple keyword searching is not usually adequate, particularly for large and complex databases. The discussion of quality and usability factors for database retrieval includes number of access points, consistency of terminology, specificity, retrievability factors, and automatic methods of extracting index words and phrases.
The section on software issues comprises the bulk of the book beginning with a description of different types of database software with advantages and disadvantages of software categories and tips on selecting the best alternatives for particular needs. A chapter on structuring the database and individual records leads the reader through the choices on length of fields and records, multiple types of records, linking records and fields in different databases, and the structure of data fields. Information on indexing the records follows, with a discussion of index creation and browsing. Information on searching with Boolean operators, truncation, case sensitivity, field qualifiers, proximity operators, and natural language searching is included.
Next, the various options for sorting output are covered, including a discussion of how different software sorts special characters such as punctuation. Computer programs that do not ignore punctuation or capitalization when sorting have been a constant source of irritation to librarians, particularly those of us who have lived by the American Library Association filing rules for so many years. Why can't databases be told that "Dr." and "doctor" may be sorted and filed as if they were the same? Ignoring initial articles can also be a sorting problem for some databases.
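The remedy being wished for here is a normalization step applied before comparison. As an illustrative sketch (not drawn from the book), a sort key that lowercases, strips punctuation, and drops initial articles gets most of the way to ALA-style filing; treating "Dr." and "doctor" as identical would additionally require expanding abbreviations:

```python
import re

ARTICLES = ("the ", "a ", "an ")  # initial articles to ignore (illustrative list)

def filing_key(title: str) -> str:
    """Build a sort key that ignores case, punctuation, and initial articles."""
    key = title.lower()
    for article in ARTICLES:
        if key.startswith(article):
            key = key[len(article):]
            break
    # Strip punctuation so "Dr." sorts as "dr" rather than before all letters
    key = re.sub(r"[^\w\s]", "", key)
    return key.strip()

titles = ["The Joy of Cooking", "A Tale of Two Cities", "Dr. Zhivago", "Databases 101"]
print(sorted(titles, key=filing_key))
# → ['Databases 101', 'Dr. Zhivago', 'The Joy of Cooking', 'A Tale of Two Cities']
```

A naive `sorted(titles)` would instead file every "The ..." title under T and let the period in "Dr." influence the ordering, which is exactly the irritation described above.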
Finally, there is a chapter on the displaying, printing, and downloading functions of database software. In the text, different database software packages are examined through examples, including ProCite, Reference Manager, EndNote, DB/TextWorks, Micro-CDS/ISIS, and FileMaker Pro. The focus is not on the details of how each program works, but rather on how basic principles of database creation or management could be accomplished in a particular program. At the time of publication of this book, most of the programs discussed did not have a direct method of creating HTML (hypertext markup language) output, though this is certainly a topic of interest to many librarians and others. FileMaker Pro (http://www.filemaker.com) is an exception and can create HTML databases that can be published directly on the Internet.
This book will be useful for those who are interested in creating and publishing a new database, such as an index to a local newspaper. For those who have an existing database and simply want to make it searchable via the Internet, this book may be useful as a beginning, particularly for choosing and understanding database software. The scope of the book, however, is broader and more basic than this, and those whose interests lie mainly in converting a database to a web publication will need additional information. For those who need to know how to create an excellent database, which is the stated objective of the text, this book should be your first choice.
Margaret Sylvia is the Assistant Director for Technical Services, St. Mary's University, San Antonio, Texas.
Copyright © 2000 by the American Library Association. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to Office of Rights and Permissions, 50 East Huron Street, Chicago, IL 60611.
REVIEW OF: Ellen Siever, Stephen Spainhour, and Nathan Patwardhan. Perl in a Nutshell: A Desktop Quick Reference. Sebastopol, CA: O'Reilly & Associates, 1999.
by John Wynstra
"Computer languages differ not so much in what they make possible, but in what they make easy. Perl is designed to make the easy jobs easy, without making the hard jobs impossible." (p. 3) With these introductory sentences the authors begin their task of compiling just the right amount of information to make understanding and using Perl easier. Perl has been around for a number of years and is most often thought of as a scripting language; however, as a result of the continued development of its open source code, it has grown into a sophisticated and well-supported programming language.
Perl in a Nutshell is a nicely designed and well organized book. The subtitle, "A Desktop Quick Reference," aptly describes the purpose and focus of this book. The contents are organized into the following sections: Getting Started, Language Basics, Modules, CGI, Databases, Network Programming, Perl/Tk, and Win32. These sections are easily located using the black index tabs on the right-hand side of the pages, which are also visible when the book is closed.
The Getting Started section provides a brief introduction to Perl. It covers where to get it, how to install it on both Unix and Win32 platforms, and how to use the documentation for it. This section is mostly narrative, but can be easily scanned for specific topics due to the generous use of section headings and subheadings. Several pages are dedicated to listing newsgroups and web sites that are good sources of information for Perl programmers.
The Language Basics section describes the building blocks of Perl: the data types, the operators, and the built-in functions that are available. A reference list of built-in functions is the main feature of this section. The functions are listed first by category (e.g., numeric functions, input and output, array processing) and then in alphabetical order, with the descriptions appearing in the alphabetical list. A short descriptive paragraph and an occasional example result in about five functions per page. More examples would improve the utility of this section, since examples often do the best job of clarifying how things work and are most helpful when figuring out the nuances of an unfamiliar function. A number of pages are also set aside to describe the confusing undeclared special variables, such as $. and $_, that appear in many Perl programs. Also included in this section is a chapter on how to use the Perl debugger.
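For readers who have never met these variables: in Perl, $_ holds the implicit "current" value (for example, the line just read in an input loop) and $. holds the current input line number. Since this review is aimed at general readers, here is a rough Python analogue of the canonical Perl filter loop (the function name and sample data are mine, not the book's):

```python
import re

def grep_like(stream, pattern):
    """A rough Python analogue of Perl's implicit variables in a read loop:
    the loop variable plays the role of $_ (the current line) and the
    counter plays the role of $. (the current input line number)."""
    matches = []
    for lineno, line in enumerate(stream, start=1):
        if re.search(pattern, line):
            matches.append(f"{lineno}: {line.rstrip()}")
    return matches

# Equivalent in spirit to the Perl one-liner: perl -ne 'print "$.: $_" if /error/'
lines = ["ok\n", "error: disk full\n", "ok\n"]
print(grep_like(lines, r"error"))  # → ['2: error: disk full']
```

The Perl version never names either variable explicitly, which is precisely why $_ and $. bewilder newcomers reading other people's scripts.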
One of the strengths of any programming language is the availability of pretested modules, libraries, or packages that allow programmers to focus their time on larger development tasks rather than on developing every single function and object needed in their programs. Perl has a number of well-developed and publicly available modules and packages, some of which are included in the standard distribution of Perl. The Modules section describes the most common modules; the modules and their functions are listed in alphabetical order.
One of the more popular applications of Perl has been back-end scripting on web servers. Much of this happens in the context of handling those ubiquitous forms that confront us at almost every turn on the World Wide Web. The Common Gateway Interface, CGI for short, is one of the mechanisms that make this possible. The CGI section describes how to use CGI for such things as forms processing, setting cookies, and utilizing environment variables. A portion of this section is dedicated to the mod_perl module, which is useful for fine-tuning and managing the Apache Web Server. The CGI section is short and concise, but very useful.
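To make the forms-handling idea concrete: a CGI form submission reaches the script as URL-encoded key=value pairs, which must be decoded before use. The book's examples are in Perl; the following is a minimal Python sketch of the same decoding step (the sample query string is invented):

```python
from urllib.parse import parse_qs

# A form submission reaches a CGI script as URL-encoded key=value pairs
# (via the QUERY_STRING environment variable for GET, or on stdin for POST).
query = "name=Ada+Lovelace&topic=CGI&topic=Perl"  # invented sample data
form = parse_qs(query)
print(form["name"])   # ['Ada Lovelace'] -- '+' decodes to a space
print(form["topic"])  # ['CGI', 'Perl'] -- repeated fields accumulate into a list
```

Whatever the language, every CGI script performs this decode step before it can act on the user's input, which is why the book devotes a whole section to it.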
Another section related to the Web is the Databases section. Database accessibility is a driving force on the Internet. A lot of the content that is generated on the web is the result of forms that retrieve information, process orders, and personalize content. Databases that store and process information in the background make this possible. Perl can be used to access a number of database systems including DB2, Informix, MySQL, Oracle, and Sybase to name a few.
Network Programming, the next section, delves into the mechanisms used by computers to communicate with each other. Perl offers a wide array of tools to create and customize the use of ftp, http, e-mail, and news clients and servers. Besides a number of built-in functions, there are modules available that greatly enhance network programmability.
The Perl/Tk section is mostly filled with reference material about the various methods, widgets, and options available for writing programs with a graphical user interface. Perl/Tk can be utilized on Unix as well as Windows 95/NT platforms.
The last section of the book, Win32 Modules and Extensions, is particularly helpful for those using Perl in a Windows NT environment. There are some differences between how Perl works on the Windows platform and how it works under Unix. It is important to be able to utilize the special features available only for Windows and to be aware of functions that may not cross platforms. This section also includes a chapter on PerlScript, an ActiveX scripting engine that makes it possible to use Perl with Active Server Pages (typically the domain of Microsoft tools like VBScript).
I have been a fan of the O'Reilly "Nutshell" series since I obtained my first book from this series, UNIX in a Nutshell by Daniel Gilly. All the books I have obtained from this series sit on my reference shelf close at hand, and I refer to them often. They provide a quick reminder of the syntax for a command I am trying to use or a function I would like to understand. I would highly recommend this book alongside several others in this distinguished series.
John Wynstra (firstname.lastname@example.org) is the Library Information Systems Specialist at the University of Northern Iowa.
Copyright © 2000 by John Wynstra. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at email@example.com.
REVIEW OF: Dorothy Cady and Nancy Cadjan. Network+ Certification Success Guide. New York: McGraw-Hill, 1999.
by Brian K. Yost
The CompTIA (Computing Technology Industry Association) A+ Certification program has become a de facto standard for measuring basic levels of competency for computer service technicians. CompTIA is attempting to position the Network+ Certification as a vendor-neutral standard for networking professionals. The Network+ Certification program is quite new (officially launched in April 1999), and the test has some overlap with components of vendor-specific certifications such as Novell's CNE and Microsoft's MCSE programs. The Network+ Certification Success Guide is a helpful resource for answering questions about the coverage of the Network+ exam and explaining where it fits in with the vendor-specific certification programs.
First of all, the Network+ Certification Success Guide is not a "Cram Session" or "Exam Cram" type study guide. Unlike exam study guides, it does not provide the content knowledge needed to pass the exam. It does, however, explain what that required knowledge is and provides suggested resources for locating the information. Rather than serving as a tutorial-based study guide, the Network+ Certification Success Guide explains the Network+ certification program and process. To this end, the book is mostly successful.
Each chapter in the book covers a particular aspect of the Network+ program. Chapter 1 explains what the Network+ Certification examination is and who sponsors it, and provides the details of signing up to take the test.
After introducing some of the specifics of the Network+ exam, the authors take a broader approach in chapter 2: "Evaluating Trends in the IT Industry." In this section, study after study is cited showing that networking professionals are in high demand and that job satisfaction is high in the industry. This is useful information for someone considering a career in information technology, but those already in the profession have undoubtedly heard much of this before. For example, is anyone really surprised that continued job growth is expected in the IT industry?
Chapter 3, "Valuing Your Network+ Certification," argues the value of earning the Network+ Certification, presented from the perspectives of employer, IT professional, and customer. The authors also argue the value of a vendor-neutral certification program as opposed to vendor-specific programs such as those from Novell, Cisco, and Microsoft. One of the most valuable sections of chapter 3 is the comparison between the Network+ exam and the Microsoft Networking Essentials and Novell Networking Technologies exams. At the time of the writing of this book, the Network+ exam was not accepted as an equivalent for the comparable Microsoft or Novell exams. Fortunately, this has changed according to the Network+ FAQ on the CompTIA web site (http://www.comptia.org): Novell will now accept the Network+ exam in place of the Networking Technologies exam for CNE candidates. This certainly makes sense considering the close overlap in coverage of the two exams.
Chapter 4, the longest chapter in the book, details the 15 job categories and 175 associated tasks the Network+ exam covers. These job functions are based on the IT Skills Job Task Analysis performed by CompTIA when creating the Network+ exam. I have to admit I found this chapter a bit tedious. Even though it can be a chore to wade through, it is a good introduction to the networking-related job activities that IT professionals often perform. It is also interesting for more experienced administrators to review how things really should be done if time were not always at a premium (e.g., some of the tasks associated with the setting up standard operating procedures category). This chapter is a far cry from the pragmatic focus of most "Exam Cram" type certification study guides. While the information in this chapter will not directly help one pass the Network+ exam, it will help in understanding the job activities that the Network+ exam covers, and this may help to focus studying. In addition, if actually put into practice, it could also help one to be a better network administrator.
Chapter 5, "Using the Examination Blueprint to Prepare for the Network+ Certification Exam," describes what topic areas are covered by the Network+ exam and provides the percentage of test questions in each area. This is a very useful chapter for determining what to study and where to concentrate in preparation for the exam. This chapter illustrates a major difference between this book and a study guide. The Network+ Certification Success Guide does a very good job of explaining what one needs to know to pass the exam, but doesn't provide the needed knowledge. For example, when discussing the TCP/IP fundamentals covered on the exam, the authors state, "Make sure you understand what is meant by terms such as default gateway, POP3, SMTP, SNMP, FTP, HTTP, IP and others." (p.150) This is obviously good advice, but those needing to study these concepts will not find explanations in this book. For that type of test preparation, other resources will need to be consulted. The Guide does, however, provide many recommendations for sources with this information.
Chapter 6 covers studying for the Network+ exam. The first section provides suggested resources for preparing for the exam; both books and web-based resources covering particular subject areas of the exam are included. The authors do a good job of recommending resources for the various topics of the exam. Now that several Network+ study guides have been published, however, it is far easier and more efficient to use one or two of these to learn Network+ content. Some examples of recently published Network+ study guides include:
- Meyers, M., & Schwarz, B. (1999). Network+ Certification Exam Guide. New York, NY: McGraw-Hill.
- Craft, M., Poplar, M.A., Watts, D.V., & Willis, W. (1999). Network+ Exam Prep. Scottsdale, AZ: Coriolis Group Books.
- Groth, D., Bergersen, B., & Catura-Houser, T. (1999). Network+ Study Guide. San Francisco, CA: Sybex.
The second section of chapter 6 provides help with studying for the Network+ exam. Some basic studying tips are included, as well as a tool to determine how much studying will be required and how to allocate study time among the topic areas covered by the exam.
Chapter 7, the final chapter, includes 150 sample Network+ exam questions and brief explanations of the correct answers. The authors recommend using this as a pre-study assessment tool to determine where to concentrate studying efforts. They also suggest taking it again after studying to ensure readiness for the real exam. The questions are arranged by Network+ exam topic coverage. The percentage of questions in each topic area is equivalent to the coverage on the actual exam (e.g., questions on the physical layer of the OSI model are grouped together and account for 6% of the sample test questions).
The authors are clearly well qualified. Dorothy Cady is a former technical writer for Novell and is a CNA, CNE, and CNI. Nancy Cadjan is a technical writer for Novell. The authors frequently throw in real-life examples of situations they have faced in their work as IT professionals. The examples are well selected and add interest to some rather monotonous sections, such as the job tasks listed in chapter 4.
There are quite a few typos and proofing mistakes, though no worse than in many other titles. This seems quite common for computer/technology books and is likely caused by the rush to get time-sensitive books out, over-reliance on software-based spelling and grammar checking, and editors and proofreaders who don't thoroughly understand the subject matter. A complaint about the layout of this title is the low quality of many of the figures and charts. While they are usually illustrative and well chosen, many of them are jagged and look like they were produced on an old dot-matrix printer.
The Network+ Certification Success Guide fulfills its intended purpose very well. With this book the reader will learn what the Network+ exam is, why it is beneficial to earn the certification, what the exam covers, and what areas they need to study to prepare for the exam. Even though much of this information is available from the CompTIA web site and other sources, it is very useful to have the information gathered in a single book. Because it is not a content-based study guide, however, other resources will be necessary to prepare for the Network+ exam. Combining this title with either the resources suggested by the authors or with one or two Network+ study guides would be a good strategy for approaching the Network+ examination.
Brian K. Yost (firstname.lastname@example.org) is Library Technology Coordinator at Hope College in Holland, Michigan.
Copyright © 2000 by Brian K. Yost. This document may be reproduced in whole or in part for noncommercial, educational, or scientific purposes, provided that the preceding copyright statement and source are clearly acknowledged. All other rights are reserved. For permission to reproduce or adapt this document or any part of it for commercial distribution, address requests to the author at email@example.com.