Multilayer Perceptron

Implementation of a simple artificial neural network (a perceptron with a sigmoid activation function) that is trained on an open dataset and recognizes 26 handwritten letters of the Latin alphabet.
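The sigmoid activation mentioned above squashes any real input into the open interval (0, 1). A minimal sketch in C++ (the function name is illustrative, not the project's actual API):

```cpp
#include <cmath>

// Sigmoid activation: maps any real input into the open interval (0, 1).
double Sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }
```

Its derivative, Sigmoid(x) * (1 - Sigmoid(x)), is cheap to compute from the output alone, which is one reason this activation is a common choice for simple backpropagation-trained networks.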

Contents

  1. Chapter I
    1.1. Building
  2. Chapter II
    2.1. Start
    2.2. Recognition
    2.3. Learning
    2.4. Testing

Chapter I

Building

To build the project, make and cmake must be installed.

Go to project_directory/src in a terminal and run make. This will build the project in the build directory using cmake, and then the program will start automatically.

Start

If you see the output above, everything is correct and the program works just fine.

Chapter II

Start

By default, the underlying MLP has random weights, so it will not make any useful predictions. You can switch the MLP between graph and matrix implementations (matrix is the default).

MLP type

You can also upload your own weights, but they must match the template:

  1. Configuration:
  • Input layer - 784 neurons
  • Any number of hidden layers - 100 neurons each
  • Output layer - 26 neurons
  2. Weights (double)
  3. Biases (double)
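As a sketch of how these sizes fit together, here is a forward pass through one dense sigmoid layer: with a 784-element input and 100 rows of weights it produces the 100-neuron hidden activation. The names and the row-major weight layout are assumptions for illustration, not the project's actual code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // one row of input weights per output neuron

double Sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// out[i] = sigmoid(sum_j w[i][j] * in[j] + b[i])
Vec Forward(const Mat& w, const Vec& b, const Vec& in) {
  Vec out(w.size());
  for (std::size_t i = 0; i < w.size(); ++i) {
    double sum = b[i];
    for (std::size_t j = 0; j < in.size(); ++j) sum += w[i][j] * in[j];
    out[i] = Sigmoid(sum);
  }
  return out;
}
```

Chaining such layers (784 → 100 → … → 26) yields the full network; the 26 outputs correspond to the letters of the Latin alphabet.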

To load weights, click the Import button in the weights area. Pre-trained weights are available in the project's weights directory.

After loading, the weight information will be updated:

Weights Info

You can also export weights from the program to a selected file.

Recognition

In the Recognition tab, you can draw a letter; the predicted result will be shown in the Results area.

You can also upload an image in BMP format.

Right-click to clear the drawing area.

Recognition

Learning

Before you start training, you need to load the training and test datasets (the open EMNIST dataset can be found in the datasets directory) by clicking the Import button in the corresponding areas. Wait for them to load. Then you can choose:

  • How many hidden layers your new MLP will have
  • How many epochs it will train for
  • The learning rate
  • The learning type
  • The number of folds (only for the cross-validation learning type)

After that, press the Start button and the learning process will run. You can monitor the progress in the progress bar above, and observe the MLP error at the end of each epoch on the graph.
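The learning rate chosen above scales each weight update. As a sketch, here is one stochastic-gradient step for a single sigmoid neuron with squared error; this is an illustration of the technique, not the project's actual training code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

double Sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One SGD step for a single sigmoid neuron with squared error
// E = 0.5 * (out - target)^2. Updates weights and bias in place and
// returns the error measured before the update.
double SgdStep(std::vector<double>& w, double& b,
               const std::vector<double>& in, double target, double lr) {
  double sum = b;
  for (std::size_t j = 0; j < in.size(); ++j) sum += w[j] * in[j];
  double out = Sigmoid(sum);
  double err = out - target;
  double grad = err * out * (1.0 - out);  // dE/dsum via the sigmoid derivative
  for (std::size_t j = 0; j < in.size(); ++j) w[j] -= lr * grad * in[j];
  b -= lr * grad;
  return 0.5 * err * err;
}
```

Averaging the returned error over all samples in an epoch gives the kind of per-epoch value that the error graph plots.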

If you want to stop the process prematurely, click the Stop button. In this case, the old MLP will not be overwritten by the new one.

In the case of cross-validation, the MLP with the better metrics will replace the old one.
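The fold amount controls how the dataset is partitioned for cross-validation. A contiguous k-fold index split could be sketched as follows (the partitioning scheme is an assumption for illustration):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Split n samples into k contiguous folds. Returns [begin, end) index
// ranges; during cross-validation each fold in turn serves as the
// validation set while the remaining folds are used for training.
std::vector<std::pair<std::size_t, std::size_t>> Folds(std::size_t n,
                                                       std::size_t k) {
  std::vector<std::pair<std::size_t, std::size_t>> ranges;
  std::size_t base = n / k, rem = n % k, start = 0;
  for (std::size_t f = 0; f < k; ++f) {
    std::size_t len = base + (f < rem ? 1 : 0);  // spread the remainder
    ranges.emplace_back(start, start + len);
    start += len;
  }
  return ranges;
}
```

Training k models this way and keeping the one with the best validation metrics is what makes cross-validation able to improve on the old MLP.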

At the end, the weight information area will be updated.

Learning

Cross validation

Testing

Before you start testing, you need to load the test dataset (the open EMNIST dataset can be found in the datasets directory) by clicking the Import button in the corresponding area. Wait for it to load. Then you can specify:

  • Sample rate (the fraction of the test data that will be tested; 1.0 by default)

After that, press the Start button and the testing process will run. You can monitor the progress in the progress bar above.

If you want to stop the process prematurely, click the Stop button.

At the end, the MLP metrics will be shown.
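One of the simplest such metrics is accuracy. A sketch that also honors the sample rate by evaluating only the first rate * total predictions (an assumption about how sampling works, for illustration only):

```cpp
#include <cstddef>
#include <vector>

// Accuracy over the first (rate * total) predictions, where rate is the
// sample rate in (0, 1]. Returns the share of predictions that match
// the expected labels.
double Accuracy(const std::vector<int>& predicted,
                const std::vector<int>& expected, double rate) {
  std::size_t n = static_cast<std::size_t>(predicted.size() * rate);
  std::size_t correct = 0;
  for (std::size_t i = 0; i < n; ++i)
    if (predicted[i] == expected[i]) ++correct;
  return n ? static_cast<double>(correct) / n : 0.0;
}
```

For 26-class letter recognition, per-class precision and recall are also commonly reported alongside overall accuracy.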

Testing

About

Implementation of a simple neural network (multilayer perceptron) that is trained on an open dataset (EMNIST) and recognizes letters of the Latin alphabet (including handwritten ones).
