|
Demystifying Large Language Models: Unraveling the Mysteries of Language Transformer Models, Build from Ground up, Pre-train, Fine-tune and Deployment
|
(Book) |
Published: |
April 2024
|
Genre: |
Computing / Computer Science |
ISBN: |
9781738908486 |
EAN-Code:
|
9781738908486 |
Publisher: |
James Chen |
Binding: |
Paperback |
Language: |
English
|
Dimensions: |
H 229 mm / W 152 mm / D 19 mm |
Weight: |
502 g |
Pages: |
346 |
Additional info: |
Paperback |
|
Contents: |
This book is a comprehensive guide that demystifies the Transformer, the architecture that powers Large Language Models (LLMs) like GPT and BERT. From PyTorch basics and mathematical foundations to implementing a Transformer from scratch, you'll gain a deep understanding of how these models work.
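As a taste of what "implementing a Transformer from scratch" involves, here is a minimal sketch of scaled dot-product attention, the core operation of the architecture. The book itself works in PyTorch; this standalone NumPy version, with toy shapes chosen for illustration, is not taken from the book:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Core Transformer operation: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Toy example: 3 tokens, embedding dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per token
```

Each output row is a convex combination of the value vectors, weighted by how strongly that token's query matches each key.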
That's just the beginning. You'll pre-train your own Transformer from scratch, use transfer learning to fine-tune LLMs for your specific use cases, explore PEFT (Parameter-Efficient Fine-Tuning) techniques such as LoRA (Low-Rank Adaptation), and apply RLHF (Reinforcement Learning from Human Feedback) to detoxify LLMs and align them with human values and ethical norms.
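The idea behind LoRA can be sketched in a few lines: the pretrained weight matrix is frozen, and only a small low-rank update is trained on top of it. A minimal NumPy illustration follows; the dimensions, rank, and scaling factor are toy choices for demonstration, not code from the book:

```python
import numpy as np

# LoRA: keep the pretrained weight W frozen and learn a low-rank
# update B @ A, so the adapted weight is W + (alpha / r) * B @ A.
d, r, alpha = 8, 2, 16
rng = np.random.default_rng(1)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable, small random init
B = np.zeros((d, r))                     # trainable, zero init

def lora_forward(x):
    # Frozen path plus scaled low-rank correction
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.standard_normal((1, d))
# With B zero-initialized, the adapted model starts out identical
# to the frozen pretrained model; training only updates A and B.
assert np.allclose(lora_forward(x), x @ W.T)
```

Because only A and B (2 * r * d parameters) are trained instead of the full d * d matrix, fine-tuning becomes far cheaper in memory and compute.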
Finally, step into the deployment of LLMs and deliver these state-of-the-art language models into the real world. Whether you're integrating them into cloud platforms or optimizing them for edge devices, this section equips you with the know-how to bring your AI solutions to life.
Whether you're a seasoned AI practitioner, a data scientist, or a curious developer eager to deepen your knowledge of powerful LLMs, this book is your guide to mastering these cutting-edge models. By translating convoluted concepts into understandable explanations and taking a practical, hands-on approach, it offers value to aspiring beginners and seasoned professionals alike. |
|