Build a Large Language Model (from Scratch)
| AUTHOR | Raschka, Sebastian |
| PUBLISHER | Manning Publications (10/29/2024) |
| PRODUCT TYPE | Paperback |
Description
Learn how to implement LLM attention mechanisms and GPT-style transformers. In Build a Large Language Model (from Scratch), bestselling author Sebastian Raschka guides you step by step through creating your own LLM. Each stage is explained with clear text, diagrams, and examples. You'll go from the initial design and creation, to pretraining on a general corpus, and on to fine-tuning for specific tasks.

Build a Large Language Model (from Scratch) teaches you how to:
- Plan and code all the parts of an LLM
- Prepare a dataset suitable for LLM training
- Fine-tune LLMs for text classification and with your own data
- Use human feedback to ensure your LLM follows instructions
- Load pretrained weights into an LLM

Build a Large Language Model (from Scratch) takes you inside the AI black box to tinker with the internal systems that power generative AI. As you work through each key stage of LLM creation, you'll develop an in-depth understanding of how LLMs work, their limitations, and their customization methods. Your LLM can be developed on an ordinary laptop and used as your own personal assistant.

Purchase of the print book includes a free eBook in PDF and ePub formats from Manning Publications.

About the technology
Physicist Richard P. Feynman reportedly said, "I don't understand anything I can't build." Based on this same powerful principle, bestselling author Sebastian Raschka guides you step by step as you build a GPT-style LLM that you can run on your laptop. This engaging book covers each stage of the process, from planning and coding to training and fine-tuning.

About the book
Build a Large Language Model (From Scratch) is a practical and eminently satisfying hands-on journey into the foundations of generative AI. Without relying on any existing LLM libraries, you'll code a base model, evolve it into a text classifier, and ultimately create a chatbot that can follow your conversational instructions. And you'll really understand it because you built it yourself!

What's inside
- Plan and code an LLM comparable to GPT-2
- Load pretrained weights
- Construct a complete training pipeline
- Fine-tune your LLM for text classification
- Develop LLMs that follow human instructions

About the reader
Readers need intermediate Python skills and some knowledge of machine learning. The LLM you create will run on any modern laptop and can optionally utilize GPUs.

About the author
Sebastian Raschka is a Staff Research Engineer at Lightning AI, where he works on LLM research and develops open-source software. The technical editor on this book was David Caswell.

Table of Contents
1 Understanding large language models
2 Working with text data
3 Coding attention mechanisms
4 Implementing a GPT model from scratch to generate text
5 Pretraining on unlabeled data
6 Fine-tuning for classification
7 Fine-tuning to follow instructions
A Introduction to PyTorch
B References and further reading
C Exercise solutions
D Adding bells and whistles to the training loop
E Parameter-efficient fine-tuning with LoRA
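To give a flavor of the kind of component chapter 3 walks you through, here is a minimal sketch of a single-head causal self-attention module in PyTorch (the framework introduced in appendix A). It is an illustrative example only, not the book's actual code; the class name, argument names, and dimensions are assumptions.

import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    """Minimal single-head causal self-attention (illustrative sketch)."""
    def __init__(self, d_in, d_out, context_length, dropout=0.0):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        self.dropout = nn.Dropout(dropout)
        # Upper-triangular mask hides "future" tokens from each position
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1),
        )

    def forward(self, x):  # x: (batch, num_tokens, d_in)
        num_tokens = x.shape[1]
        queries = self.W_query(x)
        keys = self.W_key(x)
        values = self.W_value(x)
        scores = queries @ keys.transpose(1, 2)  # (batch, tokens, tokens)
        scores = scores.masked_fill(
            self.mask.bool()[:num_tokens, :num_tokens], float("-inf")
        )
        weights = torch.softmax(scores / keys.shape[-1] ** 0.5, dim=-1)
        return self.dropout(weights) @ values  # (batch, tokens, d_out)

# Example usage with made-up sizes: 2 sequences, 6 tokens, 8-dim embeddings
attn = CausalSelfAttention(d_in=8, d_out=4, context_length=6)
out = attn(torch.randn(2, 6, 8))  # -> shape (2, 6, 4)

The upper-triangular mask is what makes the attention causal: each token attends only to itself and earlier positions, which is the property a GPT-style decoder relies on when generating text one token at a time.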
Product Format
Product Details
ISBN-13:
9781633437166
ISBN-10:
1633437167
Binding:
Paperback or Softback (Trade Paperback (US))
Content Language:
English
More Product Details
Page Count:
368
Carton Quantity:
20
Product Dimensions:
7.40 x 0.90 x 9.20 inches
Weight:
1.35 pound(s)
Feature Codes:
Price on Product
Country of Origin:
US
Subject Information
BISAC Categories
Computers | Artificial Intelligence - Expert Systems
Computers | Data Science - Machine Learning
Computers | Languages - Python
List Price: $59.99
Your Price: $59.39
