C++ Neural Networks and Fuzzy Logic
by Valluru B. Rao
M&T Books, IDG Books Worldwide, Inc.
ISBN: 1558515526   Pub Date: 06/01/95
  



Preface

The number of models available in neural network literature is quite large. Very often the treatment is mathematical and complex. This book provides illustrative examples in C++ that the reader can use as a basis for further experimentation. A key to learning about neural networks and appreciating their inner workings is to experiment. Neural networks, in the end, are fun to learn about and discover. Although the language used for description is C++, you will not find extensive class libraries in this book. With the exception of the backpropagation simulator, you will find fairly simple example programs for many different neural network architectures and paradigms. Since backpropagation is widely used and also easy to tame, a simulator is provided with the capacity to handle large input data sets. You use the simulator in one of the chapters in this book to solve a financial forecasting problem. You will find ample room to expand and experiment with the code presented in this book.

There are many different angles to neural networks and fuzzy logic. The fields are expanding rapidly with ever-new results and applications. This book presents many of the different neural network topologies, including the BAM, the Perceptron, Hopfield memory, ART1, Kohonen’s Self-Organizing map, Kosko’s Fuzzy Associative memory, and, of course, the Feedforward Backpropagation network (aka Multilayer Perceptron). You should get a fairly broad picture of neural networks and fuzzy logic with this book. At the same time, you will have real code that shows you example usage of the models, to solidify your understanding. This is especially useful for the more complicated neural network architectures like the Adaptive Resonance Theory of Stephen Grossberg (ART).

The subjects are covered as follows:

  Chapter 1 gives you an overview of neural network terminology and nomenclature. You discover that neural nets are capable of solving complex problems with parallel computational architectures. The Hopfield network and feedforward network are introduced in this chapter.
  Chapter 2 introduces C++ and object orientation. You learn the benefits of object-oriented programming and its basic concepts.
  Chapter 3 introduces fuzzy logic, a technology that is fairly synergistic with neural network problem solving. You learn about math with fuzzy sets as well as how you can build a simple fuzzifier in C++.
  Chapter 4 introduces you to two of the simplest, yet very representative, neural network models: the Hopfield network and the Perceptron network, along with their C++ implementations.
  Chapter 5 is a survey of neural network models. This chapter describes the features of several models, discusses threshold functions, and develops concepts in neural networks.
  Chapter 6 focuses on learning and training paradigms. It introduces the concepts of supervised and unsupervised learning and self-organization, along with topics including backpropagation of errors, radial basis function networks, and conjugate gradient methods.
  Chapter 7 goes through the construction of a backpropagation simulator. You will find this simulator useful in later chapters also. C++ classes and features are detailed in this chapter.
  Chapter 8 covers the Bidirectional Associative memories for associating pairs of patterns.
  Chapter 9 introduces Fuzzy Associative memories for associating pairs of fuzzy sets.
  Chapter 10 covers the Adaptive Resonance Theory of Grossberg. You will have a chance to experiment with a program that illustrates the working of this theory.
  Chapters 11 and 12 discuss the Self-Organizing map of Teuvo Kohonen and its application to pattern recognition.
  Chapter 13 continues the discussion of the backpropagation simulator, with enhancements made to the simulator to include momentum and noise during training.
  Chapter 14 applies backpropagation to the problem of financial forecasting and discusses setting up a backpropagation network with 15 input variables and 200 test cases to run a simulation. The problem is tackled with a systematic 12-step approach for preprocessing data and setting up the problem. You will find a number of examples of financial forecasting highlighted from the literature. A resource guide for neural networks in finance is included for people who would like more information about this area.
  Chapter 15 deals with nonlinear optimization with a thorough discussion of the Traveling Salesperson problem. You learn the formulation by Hopfield and the approach of Kohonen.
  Chapter 16 treats two application areas of fuzzy logic: fuzzy control systems and fuzzy databases. This chapter also expands on fuzzy relations and fuzzy set theory with several examples.
  Chapter 17 discusses some of the latest applications using neural networks and fuzzy logic.

In this second edition, we have followed readers’ suggestions and included more explanations and material, as well as updated the material with the latest information and research. We have also corrected errors and omissions from the first edition.

Neural networks are now a subject of interest to professionals in many fields, and also a tool for many areas of problem solving. Applications have become widespread in recent years, and the fruits of these applications are being reaped by many from diverse fields. This methodology has become an alternative to modeling some physical and nonphysical systems on a scientific or mathematical basis, and also to the expert systems methodology. One of the reasons is that the absence of full information is not as big a problem for neural networks as it is for the other methodologies mentioned earlier. The results with neural networks are sometimes astounding, even phenomenal, and the effort required to achieve such results is at times relatively modest. Image processing, vision, financial market analysis, and optimization are among the many areas of application of neural networks. It is somewhat exciting to think that modeling neural networks amounts to modeling a system that attempts to mimic human learning. Neural networks can learn in an unsupervised learning mode. Just as human brains can be trained to master some situations, neural networks can be trained to recognize patterns and to do optimization and other tasks.

In the early days of interest in neural networks, the researchers were mainly biologists and psychologists. Serious research is now done not only by biologists and psychologists, but by professionals from computer science, electrical engineering, computer engineering, mathematics, and physics as well. The latter have either joined forces with, or are doing independent research in parallel with, the former, who opened up a new and promising field for everyone.

In this book, we aim to introduce the subject of neural networks as directly and simply as possible for an easy understanding of the methodology. Most of the important neural network architectures are covered, and we earnestly hope that our efforts have succeeded in presenting this subject matter in a clear and useful fashion.

We welcome your comments on this book, from reports of errors and oversights to suggestions for improvements in future printings, at the following E-mail addresses:

V. Rao rao@cse.bridgeport.edu

H. Rao ViaSW@aol.com



Copyright © IDG Books Worldwide, Inc.