Description
In Domain-Specific Small Language Models you’ll discover:
- Model sizing best practices
- Open source libraries, frameworks, utilities, and runtimes
- Fine-tuning techniques for custom datasets
- Hugging Face’s libraries for SLMs
- Running SLMs on commodity hardware
- Model optimization and quantization
Perfect for cost- or hardware-constrained environments, Small Language Models (SLMs) are trained on domain-specific data to deliver high-quality results on specific tasks. In Domain-Specific Small Language Models you’ll develop SLMs that can generate everything from Python code to protein structures and antibody sequences, all on commodity hardware.
About the book
Domain-Specific Small Language Models teaches you how to create language models that deliver the power of LLMs for specific areas of knowledge. You’ll learn to minimize the computational horsepower your models require while maintaining fast performance and high-quality output. You’ll appreciate the clear explanations of complex technical concepts alongside working code samples you can run and replicate on your laptop. Plus, you’ll learn to develop and deliver RAG systems and AI agents that rely solely on SLMs, without the costs of foundation model access.
