In this post, we will go through all the elements required to create and train AlexNet following the original paper. We will cover data processing, architecture definition, coding of the training and validation loops, and optimizations to speed up training, achieving comparable results with a top-1 error rate of 39.9% and a top-5 error rate of 17.7%.
Let's code and train VGGNet from scratch! In this post, I will explain the process of implementing this iconic CNN, from designing a general architecture and using dense evaluation to optimizing training speed and actually training the network, obtaining validation top-1 and top-5 error rates of 28.33% and 9.66%, respectively. I will also compare the error rates and training performance against the original paper and AlexNet.
Building upon the recently implemented and trained VGGNet, we will go through and implement 'A Neural Algorithm of Artistic Style' by Gatys, Ecker and Bethge, which makes it possible to transfer the style of one image onto the content of another. I will dive into some of the complexities of the implementation and give some beautiful examples of the results this technique can offer.