  • CIS Members: Free
  • IEEE Members: Free
  • Non-members: Free
  • Pages/Slides: 145
  • Date: 30 Jun 2024

Deep neural networks are data hungry: they require millions of labelled examples in order to work! Really? The last decade has produced useful approaches for working with less labelled data, either by drawing on abundant data from a similar domain or by letting the network learn meaningful representations without explicit supervision. This tutorial first places self-supervised learning in the general perspective of learning with few data, covering typical transfer learning and auto-encoder approaches as well as perceptual losses. It then examines some common (mis)conceptions about these methods and offers practical tips on how to learn with few data. By participating in this tutorial, you will gain deep insights into representation learning and learning with few data, as well as practical tools to start working on data in your own domain.
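As a small illustration of the auto-encoder idea mentioned above, the sketch below learns a compressed representation purely by reconstructing its input, with no labels involved. This is a minimal linear auto-encoder trained with plain gradient descent in NumPy; the dimensions, learning rate, and synthetic data are illustrative assumptions, not part of the tutorial material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic unlabelled data lying on a low-dimensional subspace
# (n samples, d input dims, k latent dims -- all illustrative choices).
n, d, k = 200, 8, 2
X = rng.normal(size=(n, k)) @ rng.normal(size=(k, d))

# Encoder and decoder weights of a linear auto-encoder.
W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))
lr = 0.1

def loss(X, W_enc, W_dec):
    Z = X @ W_enc          # learned representation (no labels used)
    X_hat = Z @ W_dec      # reconstruction of the input
    return ((X - X_hat) ** 2).mean()

initial = loss(X, W_enc, W_dec)
for _ in range(2000):
    Z = X @ W_enc
    X_hat = Z @ W_dec
    G = 2.0 * (X_hat - X) / X.size   # gradient of the mean squared error
    W_dec -= lr * Z.T @ G            # gradient step through the decoder
    W_enc -= lr * X.T @ (G @ W_dec.T)  # gradient step through the encoder
final = loss(X, W_enc, W_dec)

print(f"reconstruction error: {initial:.4f} -> {final:.4f}")
```

After training, the encoder output `Z` is a learned representation that can be reused for a downstream task with few labels, which is the core motivation behind the self-supervised approaches the tutorial covers.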