A simple deep learning framework implemented from scratch
It provides the basic building blocks of a neural network:
- Linear layer
- Activation functions:
  - ReLU
  - LeakyReLU
  - Sigmoid
  - Softmax
- Regularization layers:
  - Dropout
  - BatchNorm (1-dimensional version)
- Criterions (loss functions):
  - MSE
  - CrossEntropy
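To illustrate the forward/backward mechanics behind the first two components, here is a minimal NumPy sketch of a linear layer and a ReLU activation. The class names, method signatures, and initialization scheme are assumptions for illustration, not necessarily the framework's actual API:

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, in_features, out_features):
        # Small random weights; a real init (e.g. Xavier) would scale by fan-in.
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients w.r.t. parameters (stored) and w.r.t. the input (returned).
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T

class ReLU:
    """Elementwise max(0, x); the backward pass reuses the forward mask."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask
```

Each layer caches whatever it needs during `forward` so that `backward` can compute gradients without recomputation; this is the pattern the other layers (Dropout, BatchNorm) typically follow as well.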
The framework was tested on the MNIST classification problem and achieved an accuracy above 98% on the test set.
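For a classification task like MNIST, the Softmax activation and CrossEntropy criterion are usually fused, because the combined gradient simplifies to `probs - one_hot(targets)` and is numerically stable. A hedged NumPy sketch of that fused loss (the function name is an assumption, not the framework's API):

```python
import numpy as np

def softmax_cross_entropy(logits, targets):
    """Fused softmax + cross-entropy.

    logits:  (batch, classes) raw scores
    targets: (batch,) integer class labels
    Returns the mean loss and the gradient w.r.t. the logits.
    """
    # Subtract the row-wise max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)

    n = logits.shape[0]
    loss = -np.log(probs[np.arange(n), targets]).mean()

    # Gradient of the mean loss: (softmax(logits) - one_hot(targets)) / n.
    grad = probs.copy()
    grad[np.arange(n), targets] -= 1.0
    return loss, grad / n
```

Feeding this gradient into the last layer's `backward` and propagating it through the stack is all a basic training loop needs, plus an SGD-style parameter update.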