Editorial Reviews from Amazon.com

A practical book, Neural Smithing is aimed at the reader who intends to design and build neural networks for applications from forecasting to pattern recognition. The authors concentrate on multilayer perceptrons (MLPs) as the most commonly used neural network model, which adds to the book's overall clarity and focus. This textbook-style reference begins with simple, single-layer networks and the elements of supervised learning. It then builds on these basics with such topics as error surfaces, genetic algorithms, and generalization. Examples and illustrations guide the reader through the discussion, but the authors don't suggest problems for further study--a small omission in an otherwise well-constructed book. Readers must know calculus and statistics to make sense of the text, but they don't need much knowledge of neural computing. Whether used as an introductory textbook or as a professional reference, Neural Smithing is highly useful. Tightly focused and easy to use, it should have a place next to every neural toolbox. --Rob Lightner


Neural Smithing, April 26, 2002

Reviewer: George Matty from Huntsville, AL United States

The book is excellent. It covers the theory very well, such that you can write the computer code yourself. The authors also provide pseudocode. You will learn the subject better than from other books that just give you the code. I find that once you understand the theory, writing the code is easy.
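A minimal sketch of what such from-scratch code might look like, in Python/NumPy (the layer sizes and sigmoid activation are illustrative choices only, not code taken from the book):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Illustrative sizes: 3 inputs, 4 hidden units, 1 output
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(4, 3))   # hidden-layer weights
    b1 = np.zeros(4)
    W2 = rng.normal(scale=0.1, size=(1, 4))   # output-layer weights
    b2 = np.zeros(1)

    def forward(x):
        h = sigmoid(W1 @ x + b1)   # hidden activations
        y = sigmoid(W2 @ h + b2)   # network output
        return y

    print(forward(np.array([0.5, -1.0, 2.0])))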


Saves you months of information gathering

February 28, 2002 Reviewer: DK Kam from Amsterdam

Everybody who tries to use NNets for real goes through these steps. First, there is the Delta rule. Then there are overfitting, local minima, generalization problems, and frustration. The complexity of a NN is not in its math; the difficulty is in the construction of the NN. This book is excellent at providing rules of thumb for NN construction while at the same time providing the theoretical backing. Hey, I am not making money reviewing this book; it's just really good.
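For readers who have not yet met the Delta rule the reviewer mentions: for a single linear unit it adjusts the weights in proportion to the error, w = w + eta * (target - output) * input. A small sketch (the learning rate of 0.1 and the toy data are arbitrary illustrative values, not taken from the book):

    import numpy as np

    def delta_rule_step(w, x, t, eta=0.1):
        # Single linear unit: y = w . x ; move w toward the target t
        y = np.dot(w, x)
        return w + eta * (t - y) * x

    w = np.zeros(3)
    x = np.array([1.0, 0.5, -0.5])
    t = 1.0
    for _ in range(20):
        w = delta_rule_step(w, x, t)
    print(w, np.dot(w, x))   # the output approaches the target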


Run out of ideas to improve your Neural Network?

May 18, 2001 Reviewer: Kah Tong, Seow from Singapore

Many textbooks can help me understand the different concepts of neural networks, but not the practical tips needed to optimize neural network analysis and implementation.

The topics covered are reminiscent of those discussed in parts 2 and 3 of the Neural Network FAQ. In chapter 6, the relationships between learning rate, momentum, training time, and learning modes are presented graphically. This helps me rule out and avoid learning parameters that are unlikely to improve the NN's performance, which is especially important if the dataset is large and the NN program is implemented in Java.
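To make the relationship between those parameters concrete, a generic gradient-descent-with-momentum update looks roughly like the sketch below. The 0.1 learning rate, 0.9 momentum, and toy error surface are arbitrary examples for illustration, not values recommended by the book:

    import numpy as np

    def momentum_step(w, grad, velocity, lr=0.1, momentum=0.9):
        # In batch mode, grad is the gradient over the whole training set;
        # in online (per-pattern) mode, it comes from a single example.
        velocity = momentum * velocity - lr * grad
        return w + velocity, velocity

    # Toy quadratic error surface E(w) = 0.5 * ||w||^2, whose gradient is w
    w = np.array([2.0, -3.0])
    v = np.zeros_like(w)
    for _ in range(50):
        w, v = momentum_step(w, grad=w, velocity=v)
    print(w)   # moves toward the minimum at the origin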

If the aim is to develop a NN solution that will give you the best results, I find both chapter 7 (heuristics for weight initialization) and chapter 16 (heuristics for improving generalization) essential; they save me a lot of time reading many journals.
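As one example of the kind of heuristic meant here, a common rule of thumb (not necessarily the book's exact recommendation) is to draw small initial weights from a range scaled by the fan-in of each unit, so that hidden units start in the responsive region of their sigmoids:

    import numpy as np

    def init_weights(fan_in, fan_out, rng=None):
        # Uniform initialization scaled by 1/sqrt(fan_in); a widely used heuristic
        if rng is None:
            rng = np.random.default_rng(0)
        limit = 1.0 / np.sqrt(fan_in)
        return rng.uniform(-limit, limit, size=(fan_out, fan_in))

    W1 = init_weights(fan_in=10, fan_out=5)
    print(W1.shape, W1.min(), W1.max())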

In summary, this book has helped me to develop the art of NN optimization. It shows me how to visualize decision surfaces and the various graphical relationships between learning parameters and the components of NN topology. I think you will find this book very useful once your NN program is up and running and you are looking for ideas and explanations of how to improve its performance further.


The book is a five star effort.

May 19, 1999 Reviewer: A reader from Cleveland, Ohio

The book is a five star effort. Here is a review circulated in a popular neural network newsgroup:

Newsgroups: comp.ai.neural-nets
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Neural Smithing
Message-ID:
Organization: SAS Institute Inc.

I have added a new book to the list of "The best elementary textbooks on practical use of NNs" in the NN FAQ (it may not show up on the server for a few days): Reed, R.D., and Marks, R.J., II (1999), Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks, Cambridge, MA: The MIT Press, ISBN 0-262-18190-8.

After you have read Smith (1993) or Weiss and Kulikowski (1991), Reed and Marks provide an excellent source of practical details for training MLPs. They cover both backprop and conventional optimization algorithms. Their coverage of initialization methods, constructive networks, pruning, and regularization methods is unusually thorough. Unlike the vast majority of books on NNs, this one has lots of really informative graphs. The chapter on generalization assessment is a little weak, which is why you should read Smith (1993) or Weiss and Kulikowski (1991) first. There is a little elementary calculus, but not enough that it should scare off anybody. One minor complaint: "smith" is not a verb!

Warren S. Sarle, SAS Institute Inc., SAS Campus Drive, Cary, NC 27513, USA. saswss@unx.sas.com, (919) 677-8000. The opinions expressed here are mine and not necessarily those of SAS Institute.


A handbook, a reference book, and a documentary on Neural Nets

April 5, 1999 Reviewer: A reader from Seattle, WA.

It was a pleasure to have the opportunity to read "Neural Smithing" by Russell Reed and Robert Marks. I have been an engineer at Boeing for 20 years, involved in computing, CAD 3D design, and related applications. My current assignment includes supporting a sophisticated Neural Network-based design retrieval system. I am also completing my Ph.D. dissertation based on Neural Network research.

To begin with, it seems reasonable to characterize "Neural Smithing" in broad terms. The book is not just a stuffy, hypothetical, academic treatment of Neural Networks loaded with formulas and references. It does contain these elements, but they are encapsulated in a larger presentation which leads the reader on an adventure of exploration and an ultimately satisfying journey of discovery. The book is certainly well grounded in theory and motivated by the classical approach, but the overall message is: you too can make neural networks from scratch by following the principles, guidelines, suggestions, and hints presented in this handbook. What's more, your network will probably perform correctly, or at least you'll understand the reason why not. "Neural Smithing" guides the reader through various channels and pathways, around pitfalls, and ultimately to an understanding of neural networks on a personal level. The reader comes away from the first reading with a feeling of intimate knowledge and intuitive understanding of neural networks. After this, the book transforms from a required travel guide into a trusty reference book. With over 380 references, it is a veritable who's who in neural network technology and a "must have" for any serious experimenter's workbench.

The underlying assumption the authors seem to be working from is that the reader will in fact "roll up their sleeves, sit down at a computer, and get involved with neural networks at a working level". Ultimately, they will arrive at the intersections of discovery marked by the various chapters of the book. Here, the information contained in the book stands ready to guide the "user" through these intersections in an interactive way where the reader participates in the choices. The reader is made to consider their particular application and the implications of a variety of choices on the results they are interested in achieving. The authors bring a unique understanding and hands-on experience to their discussions. While the book is liberally referenced, the originality of the book is recognized at a higher level as the authors use their experience to facilitate overall integration into a coherent, comprehensive presentation.

This book gets high marks for presentation, particularly in the areas of line-of-reasoning and chain-of-development. The style will be instantly recognizable to readers versed in formal, axiomatic development and presentation. "Neural Smithing" is a sophisticated work replete with Hypercubes, Conjugate Gradient Descent, Correlation and Variance, Voronoi tessellation, and Genetic Algorithms. Yet lest the reader begin to feel that they are in peril of being lost in conceptual hyperspace, have no fear. The authors have a down-to-earth (3-dimensional) way of pulling you back. "Just think of error optimization as a marble rolling around on hills and valleys," encourage the authors. Well, when you put it that way, how can you miss? Other analogies include "pin the hyperplanes to the data" and "leap-frog weights". The bottom line is that the authors have a way of bringing you with them even when the going gets thick.
Finally, I believe this work to be comprehensive. It is well paced from beginning to end and covers all the bases in a very logical format. First, the basics are presented in excellent depth. (I believe other books miss this opportunity in many cases). Next, the heart of neural network theory and operation is presented and classically motivated. Finally, more exotic topics are covered (in equivalent depth I might add) such as Genetic Algorithms as they apply to neural networks, and generalization heuristics. The discussions are extremely well documented with references, formulas, and very relevant figures. Also, the appendixes provided are excellent and pertinent. "Neural Smithing" is a handbook, a reference book, a cookbook complete with recipes for getting the job done, and finally a documentary on the exploration of an exciting field which is clearly setting the foundations for eventual understanding of the human mind. I would strongly recommend this book to someone who feels a compelling need to have access to the tools of this realm. This book is a vehicle for exploration.

Frank S. Holman III, Boeing Commercial Airplane Company, Seattle, WA. 98124-0346