Efficient Recurrent Residual Networks Improved by Feature Transfer

Author: Liu, Yue (TU Delft Electrical Engineering, Mathematics and Computer Science)
Contributors: Pintea, Silvia (mentor); van Gemert, Jan (mentor); Suveg, Ildiko (mentor)
Degree granting institution: Delft University of Technology
Date: 2017-08-31

Abstract: Over the past several years, deep and wide neural networks have achieved great success on many tasks. In real-life applications, however, these gains come at a cost in system resources (e.g., memory, computation, and power consumption), making it impractical to run top-performing but heavy networks such as VGGNet and GoogLeNet directly on mobile and embedded devices like smartphones and cameras. To tackle this problem, we propose using recurrent layers in residual networks to reduce redundant information and save parameters. Furthermore, with the help of feature-map knowledge transfer, the performance of Recurrent Residual Networks (ReResNet) can be improved to reach accuracy similar to that of some complex state-of-the-art architectures on CIFAR-10, with far fewer parameters. In this thesis, we demonstrate the efficiency of ReResNet, optionally improved by Feature Transfer, on three datasets: CIFAR-10, Scenes, and MiniPlaces.

Subject: Residual networks; Recurrent networks; Knowledge transfer
To reference this document use: http://resolver.tudelft.nl/uuid:04a446a8-546c-455d-8344-948c7e3cdff5
Part of collection: Student theses
Document type: master thesis
Rights: © 2017 Yue Liu
Files: Thesis_YueLiu.pdf (PDF, 3.14 MB)
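The core idea behind the parameter savings of ReResNet — reusing one residual block's weights across several unrolled steps instead of stacking distinct blocks — can be illustrated with a simple parameter count. This is a minimal sketch; the channel width and depth below are hypothetical illustration values, not taken from the thesis:

```python
# Sketch: parameter savings from weight sharing in a recurrent residual stage.
# A plain residual stage stacks `depth` distinct blocks; the recurrent variant
# applies ONE block's weights `depth` times, so its parameters are paid once.

def conv_params(in_ch, out_ch, k=3):
    """Parameter count of a k x k convolution with bias."""
    return in_ch * out_ch * k * k + out_ch

def block_params(ch):
    """A residual block with two 3x3 convolutions on `ch` channels."""
    return 2 * conv_params(ch, ch)

channels, depth = 64, 6  # hypothetical stage width and number of blocks

plain = depth * block_params(channels)  # `depth` distinct blocks
recurrent = block_params(channels)      # one shared block, unrolled `depth` times

print(plain, recurrent, plain / recurrent)  # → 443136 73856 6.0
```

The block parameters shrink by exactly the unrolling factor; in a full network the overall savings are smaller, since non-shared layers (stem, downsampling, classifier) are still counted once per network.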