Building LLM Powered Applications, First Edition

This is the code repository for Building LLM Powered Applications, First Edition, published by Packt.

Create intelligent apps and agents with large language models

Valentina Alto


About the book

Building LLM Powered Applications delves into the fundamental concepts, cutting-edge technologies, and practical applications that LLMs offer, ultimately paving the way for the emergence of large foundation models (LFMs) that extend the boundaries of AI capabilities.

The book begins with an in-depth introduction to LLMs. We then explore various mainstream architectural frameworks, including both proprietary models (GPT-3.5/4) and open-source models (Falcon LLM), and analyze their unique strengths and differences. Moving ahead, with a focus on the Python-based, lightweight framework called LangChain, we guide you through the process of creating intelligent agents capable of retrieving information from unstructured data and engaging with structured data using LLMs and powerful toolkits. Furthermore, the book ventures into the realm of LFMs, which transcend language modeling to encompass various AI tasks and modalities, such as vision and audio.

Whether you are a seasoned AI expert or a newcomer to the field, this book is your roadmap to unlock the full potential of LLMs and forge a new era of intelligent machines.

Key Learnings

  • Explore the core components of LLM architecture, including encoder-decoder blocks and embeddings
  • Understand the unique features of LLMs like GPT-3.5/4, Llama 2, and Falcon LLM
  • Use AI orchestrators like LangChain, with Streamlit for the frontend
  • Get familiar with LLM components such as memory, prompts, and tools
  • Learn how to use non-parametric knowledge and vector databases
  • Understand the implications of LFMs for AI research and industry applications
  • Customize your LLMs with fine-tuning
  • Learn about the ethical implications of LLM-powered applications
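One recurring idea above, using non-parametric knowledge and vector databases, boils down to embedding text as vectors and retrieving the nearest match. As a toy sketch (the three-dimensional vectors and document names below are made up for illustration; a real application would use an embedding model and a vector store):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for rows of a vector database.
documents = {
    "doc_llms": [0.9, 0.1, 0.0],
    "doc_cooking": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # hypothetical embedding of the user's question

# Retrieve the document most similar to the query, as a vector store would.
best_doc = max(documents, key=lambda d: cosine_similarity(query, documents[d]))
```

The retrieved document's text is then placed into the LLM's prompt as context, which is the retrieval step the book builds on in the LangChain chapters.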

Chapters

| Chapters | Colab | Kaggle | Gradient | Studio Lab |
| --- | --- | --- | --- | --- |
| **Chapter 1: Introduction to Large Language Models** | | | | |
| **Chapter 2: LLMs for AI-Powered Applications** | | | | |
| **Chapter 3: Choosing an LLM for Your Application** | | | | |
| **Chapter 4: Prompt Engineering** <br>• Chapter 4 - Prompt Engineering.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 5: Embedding LLMs within Your Applications** <br>• Chapter 5 - Embedding LLMs within your Applications.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 6: Building Conversational Applications** <br>• Chapter 6 - Building conversational apps.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 7: Search and Recommendation Engines with LLMs** <br>• Chapter 7 - Building recommendation systems with LLMs.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 8: Using LLMs with Structured Data** <br>• Chapter 8 - LLMs with structured data.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 9: Working with Code** <br>• Chapter 9-Working with code.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 10: Building Multimodal Applications with LLMs** <br>• Chapter 10 - Building multi-modal agents.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 11: Fine-Tuning Large Language Models** <br>• Chapter 11 - Fine tuning LLMs.ipynb | Open In Colab | Open In Kaggle | Open In Gradient | Open In Studio Lab |
| **Chapter 12: Responsible AI** | | | | |
| **Chapter 13: Emerging Trends and Innovations** | | | | |

Requirements for this book

| Chapter | Software required | Link to the software | Hardware specifications | OS required |
| --- | --- | --- | --- | --- |
| 4-11 | Python | Download | Suitable | Windows/Linux/macOS |

Errata

* Page 8, Chapter 1 : **P(“table”), P(“chain”), and P(“roof”) are the prior probabilities for each candidate word, based on the language model’s knowledge of the frequency of these words in the training data.** _Correction:_ **P(“table”), P(“chair”), and P(“roof”) are the prior probabilities for each candidate word, based on the language model’s knowledge of the frequency of these words in the training data.**
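The corrected passage describes combining each candidate word's prior probability with how well it fits the context, in the spirit of Bayes' rule. As a toy illustration (all probability values below are invented for the example, not taken from the book):

```python
# Next-word choice via Bayes' rule: posterior ∝ P(word) × P(context | word),
# where P(word) is the prior from corpus frequency. Numbers are made up.
priors = {"table": 0.4, "chair": 0.5, "roof": 0.1}
likelihoods = {"table": 0.3, "chair": 0.6, "roof": 0.1}  # P(context | word)

# Unnormalized posteriors, then normalize so they sum to 1.
posteriors = {w: priors[w] * likelihoods[w] for w in priors}
total = sum(posteriors.values())
posteriors = {w: p / total for w, p in posteriors.items()}

# The model picks the candidate with the highest posterior probability.
best = max(posteriors, key=posteriors.get)
```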

Get to know the Author

Valentina Alto is a data science graduate who joined Microsoft Italy in 2020 as an Azure solution specialist. Since 2022, she has been focusing on data and AI workloads within the manufacturing and pharmaceutical industries. She has been working closely with system integrators on customer projects to deploy cloud architectures with a focus on modern data platforms and AI-powered applications.
In June 2024, she moved to Microsoft Dubai as an AI App Tech Architect to focus more on AI-driven projects in the Middle East.
Since commencing her academic journey, she has been writing tech articles on statistics, machine learning, deep learning, and AI in various publications. She has authored several books on machine learning and large language models.

Community Contributor

Vandervoort Patrick is an analyst programmer who primarily works in Delphi, Python, C, C++, and C#. He also installs and configures Linux servers (Ubuntu and SUSE), and he is interested in everything related to artificial intelligence.
