Databricks' dolly-v2-12b is an instruction-following large language model trained on the Databricks machine learning platform. Based on pythia-12b, Dolly is trained on ~15k instruction/response fine-tuning records by Databricks employees in capability domains from the InstructGPT paper, including brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization. dolly-v2-12b is not a state-of-the-art model, but it does exhibit surprisingly high-quality instruction-following behavior not characteristic of the foundation model on which it is based.

Dolly v2 is also available in these smaller model sizes: dolly-v2-7b, a 6.9 billion parameter model based on pythia-6.9b, and dolly-v2-3b, a 2.8 billion parameter model based on pythia-2.8b. Please refer to the dolly GitHub repo for tips.
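Because Dolly was fine-tuned on instruction/response pairs in a fixed template, prompts at inference time should follow that same format. Below is a minimal sketch of the prompt template from the dolly GitHub repo's instruction pipeline; the `build_prompt` helper name is my own, not part of the released code.

```python
# Prompt template dolly-v2 was fine-tuned on (see the dolly GitHub repo).
# build_prompt is a hypothetical helper name used here for illustration.
INTRO = ("Below is an instruction that describes a task. "
         "Write a response that appropriately completes the request.")

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in dolly-v2's expected instruction format."""
    return f"{INTRO}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt("Explain the difference between nuclear fission and fusion.")
```

The resulting string can then be passed to the model's text-generation pipeline; generation is typically stopped when the model emits its end-of-response marker.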