Also, the model doesn't matter as much as you might think; you probably don't need GPT-3. The training data determines what the chatbot's "skills" are, and the parameter count mostly determines its "resolution for nuance," but you can get a decent chatbot out of even a 350-million-parameter CodeGen model with the appropriate zero-shot context. I have one running on my el-cheapo Celeron laptop.
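
For a rough idea of what I mean (the checkpoint name and prompt below are just illustrative, not exactly what I run): load a small causal LM with Hugging Face transformers, prepend a persona description as the zero-shot context, and generate. A minimal sketch:

```python
# Minimal sketch: a tiny "chatbot" from a small causal LM via zero-shot prompting.
# Checkpoint name and prompt are illustrative; swap in whatever small model you have.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Salesforce/codegen-350M-mono"  # ~350M params, runs fine on CPU
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# The zero-shot context: describe the chatbot's persona/behavior up front,
# with no example exchanges.
preamble = "The following is a conversation with a friendly, helpful assistant.\n"

def chat(user_message: str) -> str:
    prompt = preamble + f"Human: {user_message}\nAssistant:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.7,
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(output[0], skip_special_tokens=True)
    # Keep only the newly generated reply, cut off if the model starts
    # hallucinating the human's next turn.
    reply = text[len(prompt):]
    return reply.split("Human:")[0].strip()

print(chat("What's a good name for a cat?"))
```

The quality tracks what the checkpoint was trained on, but the point stands: the "chatbot" behavior comes from the context you prepend, not from model size.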