4.1 Pre-trained Models

The way we do transfer learning in the context of deep learning is with pre-trained models. But what is a pre-trained model? In general, we are referring to a model which has already been trained on a specific dataset. We will be exploring transfer learning in the context of computer vision, so this means the model saw many images, each with a label saying what the image depicts, and over time the model learned to correctly predict the label given the image. There are a lot of different ideas going on here: computer vision, datasets, transfer learning, and more. If any of this is not making sense, that is totally expected; there is a lot of jargon as well as many new ideas being introduced. Try to stay focused on the high level, and we will come back to many of these topics in more detail.

Now that we know the high-level idea of transfer learning, let us dive into a real example using Flux. To give some context, in other machine learning frameworks like PyTorch, the pre-trained models are built right into the framework itself. In the Flux ecosystem, however, the pre-trained models live in a separate package called Metalhead.jl. Metalhead is built to work with the Flux ecosystem, so you do not need to worry about compatibility issues.
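To make this concrete, here is a minimal sketch of loading a pre-trained model from Metalhead.jl and running it on a dummy image. This assumes a recent Metalhead release (v0.7 or later) where model constructors accept a `pretrain` keyword to download ImageNet weights; check the Metalhead documentation for the exact API of your installed version.

```julia
using Flux, Metalhead

# Load a ResNet-18 with ImageNet-trained weights.
# The `pretrain = true` keyword (Metalhead v0.7+) triggers the weight download.
model = ResNet(18; pretrain = true)

# Flux expects images in WHCN order: width x height x channels x batch.
# Here we use a random 224x224 RGB "image" just to show the call shape.
x = rand(Float32, 224, 224, 3, 1)

# The output is a score for each of the 1000 ImageNet classes.
ŷ = model(x)
```

Later, when we fine-tune, we will typically keep the convolutional backbone of a model like this and replace only its final classification layer.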

CC BY-NC-SA 4.0 Logan Kilpatrick