BackProp Task
- Date:
- 2002-11-12
- Authors:
- David Öggesjö <it1ogda@ituniv.se>
- Henric Thisell <it2thhe@ituniv.se>
- Sebastian Edman <it2edse@ituniv.se>
- Source Code:
- source.zip
- source.tar.gz
- source.tar.bz2
- Printable Version:
- backprop.pdf
- Table of Contents
- 1 Introduction
- 1.1 Problem description
- 1.2 Organization
- 2 Kohonen Data Analysis
- 2.1 Introduction to Kohonen's SOFM
- 2.2 Introduction to the Principal Component Analysis
- 2.3 The Kohonen Choices
- 2.4 Program Structure
- Readfile.c
- Kohonen.c
- Changeable Parameters
- 3 Kohonen Data Analysis Results
- 3.1 Initial Problems
- How to Present the Result?
- Huge Data Sets
- 3.2 Graphs
- Animation of Kohonen's SOFM in Progress
- Learning Rate and Sigma Function
- Change in Weights
- What Do the Graphs Show?
- 3.3 Analyzed Data Sets
- Echo
- Heart-disease
- IR-spectra
- Isolet
- Mushrooms
- Nettalk
- Proteins
- Sonar
- Wine
- Vowels
- 4 Back Propagation Network
- 4.1 Introduction to the Back Propagation Algorithm
- The Neuron
- Feed Forward Networks
- Back Propagation Training
- 4.2 Back Propagation Choices
- 4.3 Program Structure
- Readfile.c
- Backprop.c
- Changeable Parameters
- 4.4 Chosen Data Sets
- Wine
- Proteins
- Vowels
- 5 Back Propagation Network Results
- 5.1 Initial Problems
- Extra '-1' Neuron
- What is Considered to be a Correct Prediction?
- How to Present the Result?
- No Coding Structure
- 5.2 Wine, Graphs and Results
- Percentages
- 5.3 Proteins, Graphs and Results
- Varying the Number of Neurons
- Difference Between Runs
- Neuron Output Specific Graphs
- Many Iterations
- Percentages
- 5.4 Vowels, Graphs and Results
- Number of Neurons
- Learning Rate
- Momentum
- Parameters A and B
- Optimized Network
- RMS Error
- Percentages
- 6 Conclusion
- 6.1 Kohonen's SOFM
- Checking the Data Sets
- Size of the Network
- Calculations
- 6.2 Back Propagation
- Number of Neurons
- Learning Rate Impact on Huge Data Sets
- Momentum
- Optimization
- Overtraining
- Calculation Time
- 7 Further Work
- 8 Glossary