Rob 2.0
If I'm going to be replaced with AI then I may as well be the person to do it. I need an AI Rob that I can be proud of and that's going to take some work.
My approach so far is to generate some training data. I've answered lots of questions in a spreadsheet. This is an ongoing project and there will be dot releases as I work towards a usable product (one that I can just plug into email or Teams). It's probably going to require a mix of fine-tuning and retrieval augmented generation (RAG). To start with, I'm just fine-tuning GPT 3.5 Turbo from OpenAI.
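For what it's worth, the data prep step looks roughly like this - a minimal sketch that assumes the spreadsheet is exported as a CSV with question and answer columns. The file names and system prompt are placeholders, not the real Rob 2.0 ones:

```python
# Sketch: convert a spreadsheet of Q&A pairs into the chat-format JSONL
# that OpenAI fine-tuning expects. File names and the system prompt are
# placeholders, not the ones actually used for Rob 2.0.
import csv
import json

SYSTEM_PROMPT = "You are Rob 2.0, an AI version of Robert Ellison."  # assumed wording

with open("rob_qa.csv", newline="", encoding="utf-8") as src, \
        open("rob_qa.jsonl", "w", encoding="utf-8") as dst:
    for row in csv.DictReader(src):  # expects 'question' and 'answer' columns
        example = {
            "messages": [
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": row["question"]},
                {"role": "assistant", "content": row["answer"]},
            ]
        }
        dst.write(json.dumps(example) + "\n")
```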
Fine-tuning was painless. As usual, the difficult part was randomly trying different versions of Python to find one that would coexist with some stubborn dependency (tiktoken in this case, which will happily live with Python 3.11 but is very unhappy with Python 3.12).
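Kicking off the job itself is a couple of calls with the OpenAI Python SDK (v1.x). This is a sketch rather than my exact script - the training file name is the placeholder from above:

```python
# Sketch: upload the JSONL and start a fine-tuning job with the OpenAI
# Python SDK (v1.x). The file name and base model are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

training_file = client.files.create(
    file=open("rob_qa.jsonl", "rb"),
    purpose="fine-tune",
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id)  # poll this job until it reports the fine-tuned model name
```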
You can try this below - just leave a comment and Rob 2.0 will reply. Anything you post goes through the regular moderation system; this is just to stop spam. Any legitimate question is fair game (and likely to make it into the training corpus if the answer is no good!).
Due to safety systems it doesn't swear like the real thing. That might require a different model / corporate host at some point in the future. I'll update this post as I make progress.
Updated 2023-12-20 00:46:
I had most of a day spare today and so decided to get a little closer to my own personal singularity. Rob 2.1 is live and answering your questions in the comments below.
The first thing I did was add a few hundred more questions and answers to my training data set. I then fine-tuned GPT 3.5 on the new data.
I wanted to get to the LLM trinity - prompt, retrieval augmented generation (RAG) and fine-tuning. Initially I thought that I could just use the OpenAI Assistants API to get there, and I got as far as coding the whole thing up before stubbing my toe on a harsh reality: it only supports retrieval for gpt-3.5-turbo-1106 and gpt-4-1106-preview. Hopefully this changes at some point, but there's no way to get everything I need from Assistants yet.
Not a big deal - I rolled up my sleeves (and GitHub Copilot's sleeves too) and added my own RAG based on the Q&A training data, then refined my prompt to include the most relevant answer as well as some more specific instructions. It's pretty basic - whatever you ask is compared to the existing question library using the cosine distance of OpenAI embeddings. Maybe I'll add a vector database if I have the patience to answer enough questions about myself, but a brute-force in-memory search works fine for now.
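Here's the shape of that retrieval step - a sketch rather than the production code, and the embedding model, variable names and prompt wording are assumptions:

```python
# Sketch of the brute-force retrieval step: embed the incoming question,
# score it against the embeddings of the existing Q&A library with cosine
# similarity, and drop the best match into the prompt. The embedding model,
# names and prompt wording below are assumptions.
import numpy as np
from openai import OpenAI

client = OpenAI()
EMBEDDING_MODEL = "text-embedding-ada-002"  # assumed; any OpenAI embedding model works

def embed(text: str) -> np.ndarray:
    response = client.embeddings.create(model=EMBEDDING_MODEL, input=text)
    return np.array(response.data[0].embedding)

def most_relevant(question: str, library: list[dict]) -> dict:
    """library is a list of {'question', 'answer', 'embedding'} dicts."""
    q = embed(question)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    return max(library, key=lambda item: cosine(q, item["embedding"]))

def build_prompt(question: str, library: list[dict]) -> list[dict]:
    match = most_relevant(question, library)
    return [
        {"role": "system", "content": (
            "You are Rob 2.0, an AI version of Robert Ellison. "
            f"A previous answer to a similar question was: {match['answer']}"
        )},
        {"role": "user", "content": question},
    ]
```

The chat completion call then just takes whatever build_prompt returns, pointed at the fine-tuned model.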
(Published to the Fediverse as: Rob 2.0 #code #openai #ml #agi An AI version of Robert Ellison. You can ask questions by leaving a comment. )
Comments
Which one? I'm surprised by all of them to some extent but at this point I'm no longer surprised that half the country disagrees with me.
How surprised are you by the election result?
Good effort, but don't try this at home.
You wrote move, not upgrade. What are you really doing?
Would you believe I've finally got around to moving gmail support on the blog to OAuth2. What do you think I think about OAuth2?
I'm going to find some alone time ideally.
What are your plans for the solar eclipse?
That's a private matter.
Can you describe a recent dream?
Germany, Zimbabwe, Kuwait, Belize, Virginia, California.