Shapers
Imagine you are looking for a new place to live, and you need to move in quickly. Chances are you will consider renting a furnished apartment. This is convenient because you don't have to furnish it from scratch; you can start living there immediately. The apartment has basic furniture, but it isn't your home yet, so you add personal touches: you paint the walls and hang up family pictures to make the apartment feel like your home.
This is an analogy for the Shaper archetype, where you shape pre-built AI technology to fit your needs. A typical scenario is a company customizing an existing AI model to integrate with its application or other use cases. This is unlike the Taker archetype, where you move into your new apartment directly, without changing its interior.
So, how does this analogy translate to the technical view? There are many ways to shape an AI model. Here, we highlight the three most common ways of technically shaping pre-built AI technology:
- Fine-tuning*: Adapting a pre-trained model to a specific task by continuing training on a smaller dataset specific to the use case.
- Hyperparameter Optimization**: Searching for the training settings, such as the learning rate, that make the model perform best for your specific application.
- Hypernetworks: Using a secondary network to generate the parameters of another neural network, enabling quick and inexpensive adaptation to new tasks.
To tailor an AI model to fit your needs, you can employ one or a mix of these methods.
* Also sometimes referred to as transfer learning.
** A hyperparameter is a configuration variable for training an AI model.
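To make fine-tuning concrete, here is a minimal sketch in plain Python. A toy linear model stands in for a large pre-trained network: it is first "pre-trained" on a large generic dataset, and then fine-tuned by continuing gradient descent from the pre-trained weights on a small, task-specific dataset. All datasets and numbers here are illustrative, not from any real model.

```python
# Minimal sketch of fine-tuning. A "pretrained" linear model y = w*x + b
# is adapted to a new task by continuing gradient descent on a small,
# task-specific dataset. All data below is illustrative.

def train(w, b, data, lr=0.01, epochs=200):
    """One-feature linear regression trained with plain gradient descent."""
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in data:
            err = (w * x + b) - y
            grad_w += 2 * err * x / len(data)
            grad_b += 2 * err / len(data)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# "Pre-training" on a large, generic dataset (underlying relation: y = 2x).
pretrain_data = [(x, 2.0 * x) for x in range(-10, 11)]
w, b = train(0.0, 0.0, pretrain_data)

# Fine-tuning: continue from the pretrained weights on a small dataset
# specific to the new use case (underlying relation: y = 2x + 5).
finetune_data = [(x, 2.0 * x + 5.0) for x in range(5)]
w_ft, b_ft = train(w, b, finetune_data, lr=0.05, epochs=500)

print(round(w_ft, 1), round(b_ft, 1))  # → 2.0 5.0
```

The key point is that fine-tuning starts from the pre-trained weights rather than from scratch, so only a small amount of task-specific data is needed to adapt the model.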
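Hyperparameter optimization can be sketched the same way. The simplest strategy is a grid search: train the same small model once per candidate value of a hyperparameter, here the learning rate, and keep the value that yields the lowest validation error. The model, datasets, and grid are illustrative assumptions.

```python
# Minimal sketch of hyperparameter optimization via grid search over the
# learning rate, a typical hyperparameter. Each candidate value trains the
# same small model; the one with the lowest validation loss wins.

def train_and_validate(lr, train_data, val_data, epochs=100):
    """Fit y = w*x by gradient descent; return validation mean squared error."""
    w = 0.0
    for _ in range(epochs):
        grad = sum(2 * ((w * x) - y) * x for x, y in train_data) / len(train_data)
        w -= lr * grad
    return sum(((w * x) - y) ** 2 for x, y in val_data) / len(val_data)

train_data = [(x, 3.0 * x) for x in range(1, 9)]   # underlying relation: y = 3x
val_data = [(x, 3.0 * x) for x in range(9, 12)]    # held-out validation split

# Candidate learning rates: too small barely moves the weight in 100 epochs,
# too large makes training diverge; the grid search finds the sweet spot.
grid = [1e-4, 1e-3, 1e-2, 1e-1]
best_lr = min(grid, key=lambda lr: train_and_validate(lr, train_data, val_data))
print(best_lr)  # → 0.01
```

Real hyperparameter optimization tools use smarter search strategies than an exhaustive grid, but the principle is the same: evaluate candidate settings on held-out data and keep the best one.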
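Finally, a toy illustration of the hypernetwork idea: one network outputs the weights of another. In this deliberately simplified sketch, the hypernetwork is a hand-picked linear map from a task embedding to the weights of a tiny target model; in practice the hypernetwork itself is learned. All embeddings and matrices below are made-up illustrations.

```python
# Minimal sketch of a hypernetwork: instead of storing or training separate
# weights per task, a (here, linear) hypernetwork maps a task embedding to
# the weights of a target model. Adapting to a new task is a single forward
# pass, not a retraining run. All values are illustrative.

def hypernetwork(task_embedding, H):
    """Generate target-model weights as the matrix-vector product H @ embedding."""
    return [sum(h * e for h, e in zip(row, task_embedding)) for row in H]

def target_model(weights, x):
    """The target network: a linear model using the generated weights [w, b]."""
    w, b = weights
    return w * x + b

# A hand-picked hypernetwork matrix: the first embedding channel sets the
# slope of the target model, the second sets its intercept.
H = [[1.0, 0.0],   # slope row
     [0.0, 1.0]]   # intercept row

# Two tasks, described only by their embeddings; no per-task training needed.
weights_a = hypernetwork([2.0, 0.0], H)  # task A: y = 2x
weights_b = hypernetwork([2.0, 5.0], H)  # task B: y = 2x + 5

print(target_model(weights_a, 3.0), target_model(weights_b, 3.0))  # → 6.0 11.0
```

The appeal is economy: switching tasks only requires a new embedding and one cheap forward pass through the hypernetwork, rather than a full fine-tuning run per task.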