Introducing our new Dutch language model: Mistral-Small-24B-Instruct-2501-Dutch-GGUF
Today, we're excited to share our latest achievement at Aisk: a specialized Dutch language model that makes AI more accessible for Dutch-speaking users and businesses.
What we've created
We've taken the powerful Mistral-Small-24B-Instruct model (released in January 2025 by Mistral AI) and fine-tuned it specifically for Dutch language understanding and generation. Building on that strong foundation, the result is a model that can understand and respond in Dutch while keeping the instruction-following abilities of the original Mistral Small 24B.
Why we made this
While many AI models excel in English, Dutch language support often lags behind. As a Netherlands-based company, we wanted to create a tool that works well for our local community.
How we built it
We fine-tuned the Mistral-Small-24B-Instruct base model on 2,587 Dutch books, processing approximately 350 million tokens. This extensive Dutch-language dataset helps the model pick up the nuances and expressions specific to Dutch.
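We haven't spelled out the full training recipe in this post, but for readers who want a feel for what continued fine-tuning on a Dutch text corpus can look like, here is a minimal, hypothetical sketch using the Hugging Face transformers Trainer with a LoRA adapter via peft. The data path, hyperparameters, and the choice of LoRA are illustrative assumptions, not our actual setup.

```python
# Hypothetical sketch of causal-LM fine-tuning on a Dutch text corpus.
# The data path, hyperparameters, and LoRA configuration are illustrative
# assumptions, not the recipe actually used for this model.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "mistralai/Mistral-Small-24B-Instruct-2501"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# Train a small LoRA adapter instead of all 24B parameters.
model = get_peft_model(
    model,
    LoraConfig(
        r=16,
        lora_alpha=32,
        lora_dropout=0.05,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
        task_type="CAUSAL_LM",
    ),
)

# Plain-text Dutch corpus, one passage per line (hypothetical file).
dataset = load_dataset("text", data_files={"train": "dutch_books.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="mistral-small-dutch",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        learning_rate=1e-4,
        bf16=True,
        logging_steps=50,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```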
To make the model more practical for everyday use, we've released Q4_K_M and Q8_0 GGUF quantizations alongside the FP16 model.
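As an illustration of how to grab one of these files, here is a small sketch that downloads a single quantized GGUF with the huggingface_hub library; the exact filename is an assumption, so check the repository's file listing for the real names.

```python
# Sketch: fetch one of the quantized GGUF files from the Hugging Face Hub.
# The filename is an assumption; check the repository for the actual names
# of the Q4_K_M, Q8_0, and FP16 files.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="GetAisk/Mistral-Small-24B-Instruct-2501-Dutch-GGUF",
    filename="Mistral-Small-24B-Instruct-2501-Dutch-Q4_K_M.gguf",  # assumed name
)
print(model_path)  # local path to the downloaded file
```

The Q4_K_M file is the smallest and runs on the most modest hardware, while Q8_0 and FP16 trade more memory for less quantization loss.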
What you can do with it
This model is particularly good at:
- Having conversations in Dutch
- Creating and summarizing Dutch content
- Helping with translations to and from Dutch
- Answering questions in Dutch
- Understanding Dutch text
Getting started
For developers interested in using this model, we've made it available on Hugging Face at GetAisk/Mistral-Small-24B-Instruct-2501-Dutch-GGUF.
Basic code to start using the model is provided in the model documentation, making it straightforward to integrate into your projects.
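As a concrete example, here is a minimal sketch that loads a downloaded Q4_K_M GGUF with the llama-cpp-python bindings and asks it a question in Dutch; the model path and generation settings are assumptions, so adapt them to the file you downloaded and to your hardware.

```python
# Sketch: run the quantized model locally with llama-cpp-python.
# The model path is an assumption; point it at the GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="Mistral-Small-24B-Instruct-2501-Dutch-Q4_K_M.gguf",  # assumed path
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

response = llm.create_chat_completion(
    messages=[
        # "Explain in three sentences what a language model is."
        {"role": "user", "content": "Leg in drie zinnen uit wat een taalmodel is."},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

The same code works for the Q8_0 and FP16 files; only the model path changes.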
Looking forward
We're not stopping here, of course: we're already working on the next release, a model fine-tuned on over 1 billion tokens of Dutch. Stay tuned!