Artificial Neural Networks: (Record no. 295110)
000 -LEADER | |
---|---|
fixed length control field | 03805nam a2200181Ia 4500 |
020 ## - INTERNATIONAL STANDARD BOOK NUMBER | |
ISBN | 9788120332294 |
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER | |
Classification number | 006.32 |
Item number | PRI-A |
100 ## - MAIN ENTRY--AUTHOR NAME | |
Personal name | Priddy, Kevin L. |
245 ## - TITLE STATEMENT | |
Title | Artificial Neural Networks: |
Sub Title | An Introduction/ |
Statement of responsibility, etc | Kevin L. Priddy and Paul E. Keller |
250 ## - EDITION STATEMENT | |
Edition statement | Eastern Economy Edition |
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT) | |
Place of publication | New Delhi: |
Name of publisher | PHI Learning, |
Year of publication | 2009. |
300 ## - PHYSICAL DESCRIPTION | |
Number of Pages | ix, 165 p. : |
Other physical details | illustrations (some color) ; 26 cm. |
505 ## - FORMATTED CONTENTS NOTE | |
Formatted contents note | Chapter 1. Introduction. 1.1. The neuron -- 1.2. Modeling neurons -- 1.3. The feedforward neural network -- 1.4. Historical perspective on computing with artificial neurons. Chapter 2. Learning methods. 2.1. Supervised training methods -- 2.2. Unsupervised training methods. Chapter 3. Data normalization. 3.1. Statistical or Z-score normalization -- 3.2. Min-max normalization -- 3.3. Sigmoidal or SoftMax normalization -- 3.4. Energy normalization -- 3.5. Principal components normalization. Chapter 4. Data collection, preparation, labeling, and input coding. 4.1. Data collection -- 4.2. Feature selection and extraction. Chapter 5. Output coding. 5.1. Classifier coding -- 5.2. Estimator coding. Chapter 6. Post-processing. Chapter 7. Supervised training methods. 7.1. The effects of training data on neural network performance -- 7.2. Rules of thumb for training neural networks -- 7.3. Training and testing. Chapter 8. Unsupervised training methods. 8.1. Self-organizing maps (SOMs) -- 8.2. Adaptive resonance theory network. Chapter 9. Recurrent neural networks. 9.1. Hopfield neural networks -- 9.2. The bidirectional associative memory (BAM) -- 9.3. The generalized linear neural network -- 9.4. Real-time recurrent network -- 9.5. Elman recurrent network. Chapter 10. A plethora of applications. 10.1. Function approximation -- 10.2. Function approximation-Boston housing example -- 10.3. Function approximation-cardiopulmonary modeling -- 10.4. Pattern recognition-tree classifier example -- 10.5. Pattern recognition-handwritten number recognition example -- 10.6. Pattern recognition-electronic nose example -- 10.7. Pattern recognition-airport scanner texture recognition example -- 10.8. Self organization-serial killer data-mining example -- 10.9. Pulse-coupled neural networks-image segmentation example. Chapter 11. Dealing with limited amounts of data. 11.1. K-fold cross-validation -- 11.2. Leave-one-out cross-validation -- 11.3. Jackknife resampling -- 11.4. Bootstrap resampling. Appendix A. The feedforward neural network. A.1. Mathematics of the feedforward process -- A.2. The backpropagation algorithm -- A.3. Alternatives to backpropagation. Appendix B. Feature saliency. Appendix C. Matlab code for various neural networks. C.1. Matlab code for principal components normalization -- C.2. Hopfield network -- C.3. Generalized neural network -- C.4. Generalized neural network example -- C.5. ART-like network -- C.6. Simple perceptron algorithm -- C.7. Kohonen self-organizing feature map. Appendix D. Glossary of terms -- References -- Index. |
520 ## - SUMMARY, ETC. | |
Summary, etc | This tutorial text provides the reader with an understanding of artificial neural networks (ANNs) and their application, beginning with the biological systems which inspired them, through the learning methods that have been developed and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks. |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM | |
Topical Term | Neural networks (Computer science) |
700 ## - ADDED ENTRY--PERSONAL NAME | |
Personal name | Keller, Paul E. |
942 ## - ADDED ENTRY ELEMENTS (KOHA) | |
Koha item type | Book |
Withdrawn status | Lost status | Damaged status | Not for loan | Home Library | Current Location | Shelving location | Date acquired | Full call number | Accession Number | Price effective from | Koha item type | Source of classification or shelving scheme | Collection code | Cost, normal purchase price
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
 | | | | Dept. of Computational Biology and Bioinformatics | Dept. of Computational Biology and Bioinformatics | Processing Center | 01/09/2015 | 006.32 PRI-A | DCB1828 | 01/09/2015 | Book | | |
 | | | | Dept. of Futures Studies | Dept. of Futures Studies | Processing Center | 23/05/2023 | 006.32 PRI | DFSKM71 | 23/05/2023 | Book | Dewey Decimal Classification | Knowledge Management | 154.05