



C++ Neural Networks and Fuzzy Logic: Constructing a Neural Network






C++ Neural Networks and Fuzzy Logic


(Publisher: IDG Books Worldwide, Inc.)

Author(s): Valluru B. Rao

ISBN: 1558515526

Publication Date: 06/01/95



































Stability for a Neural Network
Stability refers to convergence that brings the iterative process to an end. For example, if any two consecutive cycles produce the same output from the network, there may be no need for further iterations. In that case, convergence has occurred and the network has stabilized in its operation. If weights are being modified after each cycle, then convergence of the weights constitutes stability for the network.
In some situations, it takes many more iterations than you desire before the outputs of two consecutive cycles become identical. In that case, a tolerance level can be built into the convergence criterion. With a tolerance level, you accomplish an early but still satisfactory termination of the operation of the network.
Plasticity for a Neural Network
Suppose a network is trained to learn some patterns, and in this process the weights are adjusted according to an algorithm. After learning these patterns and encountering a new pattern, the network may modify the weights in order to learn the new pattern. But what if the new weight structure is not responsive to the new pattern? Then the network does not possess plasticity—the ability to deal satisfactorily with new short-term memory (STM) while retaining long-term memory (LTM). Attempts to endow a network with plasticity may have some adverse effects on the stability of your network.
Short-Term Memory and Long-Term Memory
We alluded to short-term memory (STM) and long-term memory (LTM) in the previous paragraph. STM is basically the information that is currently and perhaps temporarily being processed. It is manifested in the patterns that the network encounters. LTM, on the other hand, is information that is already stored and is not being currently processed. In a neural network, STM is usually characterized by patterns and LTM is characterized by the connections’ weights. The weights determine how an input is processed in the network to yield output. During the cycles of operation of a network, the weights may change. After convergence, they represent LTM, as the weight levels achieved are stable.

Summary
You saw in this chapter the C++ implementations of a simple Hopfield network and of a simple Perceptron network. What has not been included in them is automatic iteration and a learning algorithm. These were not necessary for the examples used in this chapter to show C++ implementation; the emphasis was on the method of implementation. In a later chapter, you will read about learning algorithms and examples of how to implement some of them.

Considerations in modeling a neural network are presented in this chapter along with an outline of how Tic-Tac-Toe is used as an example of an adaptive neural network model.
You were also introduced to the following concepts: stability, plasticity, short-term memory, and long-term memory (discussed further in later chapters). Much more can be said about these concepts in terms of the so-called noise-saturation dilemma and the stability–plasticity dilemma, and about what research has developed to address them (for further reading, see References).











