Prompt Engineering, ChatGPT, and Generative LLMs
(C) Copyright Elephant Scale
May 14, 2023
Course Description
- OpenAI's ChatGPT is eating the world, with over a hundred million users, and HuggingFace is the most popular deep learning library for transformer neural networks, with Keras and PyTorch under the hood.
- This course introduces students to neural networks, OpenAI, and AI in general. It then moves on to ChatGPT-4 and teaches how to use it effectively; this is called prompt engineering.
- Next, it shows how to build applications on top of ChatGPT that draw on additional private company documents.
- Finally, the students learn about the use of the HuggingFace library to evaluate and integrate ChatGPT competitors.
After the course, you will be able to do the following tasks
- Correctly use ChatGPT out of the box.
- Improve ChatGPT performance with well-crafted prompts.
- Further improve ChatGPT performance with machine-generated prompts and additional search services, such as Retrieval-Augmented Generation (RAG) on Azure (a minimal sketch follows this list)
- Fine-tune open models with HuggingFace
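For a flavor of the RAG objective above, here is a minimal stand-in sketch: retrieve the most relevant private document with TF-IDF (in place of a managed Azure search service) and pass it to ChatGPT as context. It assumes the openai (v1.x) and scikit-learn packages and an OPENAI_API_KEY environment variable; the documents and model name are made-up examples, not the course's exact setup.

```python
# Minimal retrieval-augmented generation (RAG) sketch. TF-IDF retrieval stands in
# for a managed Azure search service; documents and model name are made-up examples.
from openai import OpenAI
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday through Friday, 9 am to 5 pm Eastern.",
]

def retrieve(question: str) -> str:
    """Return the company document most similar to the question."""
    vectorizer = TfidfVectorizer()
    doc_vectors = vectorizer.fit_transform(documents)
    question_vector = vectorizer.transform([question])
    best = cosine_similarity(question_vector, doc_vectors).argmax()
    return documents[best]

def answer(question: str) -> str:
    """Ask ChatGPT, grounding the answer in the retrieved document."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model name
        messages=[
            {"role": "system",
             "content": f"Answer using only this context: {retrieve(question)}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do customers have to return an item?"))
```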
Audience
- Developers, data scientists, team leads, project managers
Skill Level
- Beginner to intermediate
Duration
- Three days
- Can be split into beginner (day 1) and intermediate (days 2 and 3) segments
Prerequisites
- General familiarity with machine learning
- Knowledge of a programming language
Format
- Lectures and hands-on labs (50% / 50%)
Lab environment
- Zero Install: There is no need to install software on students’ machines!
- A lab environment in the cloud will be provided for students.
Students will need the following
- A reasonably modern laptop with unrestricted connection to the Internet. Laptops with overly restrictive VPNs or firewalls may not work properly.
- A checklist to verify connectivity will be provided
- Chrome browser
Detailed outline
Introduction to Prompt Engineering
- Guidelines
- Iterative prompt development
- Summarizing
- Inferring
- Transforming
- Expanding
- Chatbot
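For illustration, a minimal sketch of the kind of prompt used in the summarizing topic, assuming the openai Python package (v1.x) and an OPENAI_API_KEY environment variable; the model name and review text are example placeholders.

```python
# Minimal prompt-engineering sketch (assumes the openai package v1.x and an
# OPENAI_API_KEY environment variable; the model name is an example).
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def summarize(review: str) -> str:
    """Summarize a product review in at most 20 words (the 'Summarizing' topic above)."""
    prompt = (
        "Summarize the review below, delimited by triple backticks, "
        "in at most 20 words.\n\n"
        f"Review: ```{review}```"
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",                            # example model name
        messages=[{"role": "user", "content": prompt}],
        temperature=0,                                    # deterministic output
    )
    return response.choices[0].message.content

print(summarize("The panda plush toy arrived a day early and my daughter loves it, "
                "but it is a bit small for what I paid."))
```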
Main NLP tasks
- Token classification
- Fine-tuning a masked language model
- Translation
- Summarization
- Training a causal language model from scratch
- Question answering
- Mastering NLP
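As a taste of these tasks, a minimal sketch using the HuggingFace pipeline API; the checkpoints shown are common public examples and are downloaded on first use.

```python
# Minimal sketch of HuggingFace pipelines for two of the tasks listed above.
# Checkpoints are common public examples; they are downloaded on first use.
from transformers import pipeline

# Question answering: extract an answer span from a context passage.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
result = qa(question="Which library provides the pipelines?",
            context="The HuggingFace Transformers library provides ready-made "
                    "pipelines for question answering, translation, and summarization.")
print(result["answer"], result["score"])

# Translation: English to French with a pretrained sequence-to-sequence model.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")
print(translator("Prompt engineering is a new and useful skill.")[0]["translation_text"])
```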
HuggingFace and Open Models
- Transformers
- Encoders
- Decoders
- Sequence to sequence
- Bias and limitations
- Pipeline
- Models
- Tokenizers
- Putting it all together
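To illustrate the "Putting it all together" topic, a minimal sketch of what a pipeline does internally: tokenizer, model, and post-processing, using a common sentiment-analysis checkpoint as an example.

```python
# Minimal sketch of what a pipeline does under the hood:
# tokenizer -> model -> post-processing. The checkpoint is a common example.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Tokenizer: raw text -> input IDs and attention masks as PyTorch tensors.
inputs = tokenizer(["I love this course!", "This chapter is too hard."],
                   padding=True, truncation=True, return_tensors="pt")

# Model: input IDs -> raw logits, one row per input sentence.
with torch.no_grad():
    logits = model(**inputs).logits

# Post-processing: logits -> probabilities -> human-readable labels.
probs = torch.softmax(logits, dim=-1)
for row in probs:
    label_id = int(row.argmax())
    print(model.config.id2label[label_id], round(float(row[label_id]), 3))
```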
Fine-tuning a pretrained model
- Processing the data
- Fine-tuning a model with the Trainer API or Keras
- A full training
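A minimal fine-tuning sketch with the Trainer API, assuming the transformers and datasets packages; the GLUE/MRPC dataset and BERT checkpoint are standard public examples, and a GPU is recommended but not required.

```python
# Minimal Trainer-API fine-tuning sketch (assumes the transformers and datasets
# packages; dataset and checkpoint are standard examples; a GPU is recommended).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "bert-base-uncased"
raw_datasets = load_dataset("glue", "mrpc")            # sentence-pair paraphrase dataset
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def tokenize(batch):
    # Tokenize both sentences of each pair; padding is applied per batch by the Trainer.
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

tokenized = raw_datasets.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
args = TrainingArguments(output_dir="mrpc-finetuned",
                         per_device_train_batch_size=8,
                         num_train_epochs=1)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"],
                  eval_dataset=tokenized["validation"],
                  tokenizer=tokenizer)                  # enables dynamic padding
trainer.train()
```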
ChatGPT competitors and derivatives
- Bing Chat
- Chatsonic
- Jasper Chat
- Google Bard AI
- Character AI
- YouChat
- OpenAI Playground