Deep-Residual-Networks-using-Exponential-Linear-Unit

Very deep convolutional neural networks introduced new problems such as vanishing gradients and degradation. Recent successful contributions toward solving these problems are Residual and Highway Networks. These networks introduce skip connections that allow information (from the input, or features learned in earlier layers) to flow into the deeper layers. These very deep models have led to a considerable decrease in test error on benchmarks like ImageNet and COCO. In this paper, we propose the use of the exponential linear unit (ELU) in place of the combination of ReLU and Batch Normalization in Residual Networks. We show that this not only speeds up learning in Residual Networks but also improves accuracy as depth increases. It improves the test error on almost all datasets, including CIFAR-10 and CIFAR-100.
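To make the proposed substitution concrete, the sketch below (a minimal NumPy illustration, not code from this repository) compares the ELU activation with ReLU. ELU is the identity for positive inputs but smoothly saturates to -alpha for negative inputs, instead of ReLU's hard zero; the paper's idea is to use this activation inside residual blocks in place of the ReLU + Batch Normalization pair.

```python
import numpy as np

def elu(x, alpha=1.0):
    # Exponential Linear Unit:
    #   x                      for x > 0
    #   alpha * (exp(x) - 1)   for x <= 0  (smooth, saturates to -alpha)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def relu(x):
    # Rectified Linear Unit: hard zero for all negative inputs
    return np.maximum(x, 0.0)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(elu(x))   # negative inputs map smoothly into (-alpha, 0)
print(relu(x))  # negative inputs are clipped to exactly 0
```

Because ELU has nonzero outputs and gradients for negative inputs, it pushes mean activations toward zero, which is part of why it can stand in for the normalizing effect of Batch Normalization in these blocks.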

The details about this paper are here.

About

• Designed a deep residual learning model with exponential linear units for image classification with higher accuracy.
• Decreased the error rate to 5.62% on CIFAR-10 and 26.55% on CIFAR-100, outpacing the most competitive previously published approaches.
• Published a research paper on this work on 21st Sept 2016 at ACM co…
