MyGPT Workshop: Build a ChatGPT For Your Own Data in One Hour

MyGPT

We use langchain (for document indexing) and openai to build your own generative AI application over your own documents.

Requirements

pip install -r requirements.txt
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY
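Both the training script and the app read the key from the environment. A quick fail-fast check can be sketched like this (`require_api_key` is a hypothetical helper for illustration, not part of this repo):

```python
import os

def require_api_key() -> str:
    """Fail early with a clear message if the OpenAI key is missing."""
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError("Set OPENAI_API_KEY before training or running the app.")
    return key
```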

Usage

Train

# Retrain from scratch
python src/train.py
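A retrain from scratch typically walks the data directory, splits each file into overlapping chunks, then embeds the chunks and persists an index under model/. The actual src/train.py may do this through langchain's indexing; the loading and chunking steps can be sketched with the standard library alone (a hypothetical sketch, not the repo's code):

```python
from pathlib import Path

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks, ready to embed."""
    step = size - overlap
    chunks = []
    for start in range(0, max(len(text) - overlap, 1), step):
        chunks.append(text[start:start + size])
    return chunks

def load_corpus(data_dir: str = "data") -> dict[str, str]:
    """Recursively read every .txt/.md file under data_dir."""
    docs = {}
    for path in Path(data_dir).rglob("*"):
        if path.is_file() and path.suffix in {".txt", ".md"}:
            docs[str(path)] = path.read_text(encoding="utf-8", errors="ignore")
    return docs
```

The overlap keeps sentences that straddle a chunk boundary retrievable from either side.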

Run the app

gradio src/app.py

Testing

Once the app is running, test it with a question like:

Question: What are these documents about?

Getting Data

Generally

  1. Download text files (any directory structure works)
  2. Put them into the data directory of this repository
  3. Train the app!
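After step 2, the repository might look like this (file and folder names here are hypothetical):

```text
data/
├── notes/
│   ├── meeting-notes.md
│   └── roadmap.txt
└── handbook.md
```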

Notion

  1. Follow the steps here: https://www.notion.so/help/export-your-content#export-as-markdown-&-csv
  2. Unzip the downloaded archive
  3. Move the unzipped folder/directory into the data directory of this repo and then train!

Slack

  1. Follow steps here: https://slack.com/help/articles/201658943-Export-your-workspace-data

Sample Data

A sample dataset is provided in the sample-data directory. To try the app with a very simple corpus of documents, copy the gen-ai folder into the data directory and retrain.
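End to end, using the sample corpus looks like this (assuming a POSIX shell, run from the repository root):

```shell
# Copy the sample corpus into the data directory, rebuild the index, start the app.
cp -r sample-data/gen-ai data/
python src/train.py
gradio src/app.py
```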

