
Index

Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks
Russell D. Reed and Robert J. Marks II
Copyright © 1999 Massachusetts Institute of Technology


B

Back-propagation
benefits of, 4
as derivative calculation, 49, 53-57, 66
dual meaning of, 49, 66
fuzzy control of, 148-149
pseudocode examples for, 63-66
as training algorithm (see Back-propagation algorithm)
Back-propagation algorithm
batch mode for, 57-58, 65, 68
benefits of, 49, 180-182
classical alternatives to, 158-183
as gradient descent, 57, 163
modifications for, 62-63
as one of many methods, 67
on-line mode for, 59-62, 65-66, 68
popularity of, 49, 155
purpose of, 49
training time for, 67-70
variations of, 135-153
Back-propagation network, 66
Batch learning, 57-58, 155
algorithm variations and, 147, 153
generalization and, 251
pseudocode example for, 65
random initialization and, 105
training time and, 68, 77-79
Bayesian methods, 258-260
Best-step steepest descent, 165-166. See also Cauchy's method
BFGS (Broyden-Fletcher-Goldfarb-Shanno) method, 174-175
Bias
generalization and, 240, 249, 256, 258, 267
in hyperplane geometry, 17-18
initialization and, 110-111
Bias weights, 125-126, 231
Bold driver method, 136-137
Boolean functions, 19, 39-41, 109
Bootstrapping, 258, 273
Bottlenecks, and pruning, 232-234
Brent's method, 159
Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, 174-175
