Research Reading: Prototyping Conversational LLMs With Alpaca

Jonathan Bechtel

Details
LLMs have taken over the world, and their usability has improved at an unprecedented rate over the last six months.
The Alpaca model, developed by Stanford's Center for Research on Foundation Models (CRFM), represents an affordable way to turn pre-trained research LLMs into conversational chatbots that behave comparably to commercial models such as OpenAI's text-davinci-003.
The project covers the key aspects of building a chatbot:
- How to generate and format the instruction-following data needed for fine-tuning
- Training scripts with built-in default hyperparameters
- Boilerplate code for serving models on Hugging Face with Gradio
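To make the data-formatting step concrete, here is a minimal sketch of how Alpaca-style instruction records (JSON objects with `instruction`, `input`, and `output` fields, the schema the repo uses) can be rendered into training prompts. The template wording below is an approximation for illustration, not copied verbatim from the project.

```python
# Hypothetical sketch of Alpaca-style prompt formatting.
# Records follow the {instruction, input, output} schema from the repo;
# the template text itself is an approximation.
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)

PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def format_example(record: dict) -> str:
    """Render one {instruction, input, output} record into a training string."""
    if record.get("input"):
        prompt = PROMPT_WITH_INPUT.format(**record)
    else:
        prompt = PROMPT_NO_INPUT.format(instruction=record["instruction"])
    # During fine-tuning, the target text (the "output" field) is appended
    # after the prompt so the model learns to complete it.
    return prompt + record["output"]

example = {
    "instruction": "Give three tips for staying healthy.",
    "input": "",
    "output": "1. Eat a balanced diet. 2. Exercise regularly. 3. Sleep well.",
}
print(format_example(example))
```

Records without an `input` field fall back to the shorter template, mirroring how the released dataset mixes both kinds of examples.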
It's quickly become the new standard for prototyping LLMs.
This talk is designed to give us a hands-on walkthrough of the project and a deep dive into its GitHub repo to dissect its most important parts.
This is meant to be a fast-paced introduction to the world of research LLMs, and to prompt an open-ended discussion about the current state of language models.
The homepage for the project can be found here: https://crfm.stanford.edu/2023/03/13/alpaca.html
The GitHub repo can be found here: https://github.com/tatsu-lab/stanford_alpaca
While this event is FREE, tickets are required & space is limited!