Could open-source chatbots be a cheaper ChatGPT alternative?
As big tech companies continue to perfect generative AI, the open-source community may now have a chance to work with the technology as well. Big data analytics company Databricks has unveiled its own generative AI model, released for anyone to use for any purpose.
With the model, called Dolly, the Databricks team showed that anyone can take a dated, off-the-shelf open-source large language model (LLM) and give it ChatGPT-like instruction-following ability by training it for just 30 minutes on a single machine using high-quality training data. The team also pointed out that instruction-following does not seem to require the latest or the largest models.
“Our model is only 6 billion parameters, compared to 175 billion for GPT-3. We open-source the code for our model (Dolly) and show how it can be re-created on Databricks. We believe models like Dolly will help democratize LLMs, transforming them from something very few companies can afford into a commodity every company can own and customize to improve their products,” the team stated in a blog post.
Diving deeper, the team at Databricks explained that Dolly is essentially a cheaper-to-build LLM with capabilities similar to those exhibited by ChatGPT. The technology behind it builds on the Alpaca model from Stanford, which in turn is based on Meta’s LLaMA. Simply put, Dolly is an open-source clone of an Alpaca, inspired by a LLaMA.
While the Alpaca model was trained on a small dataset of 50,000 human-like questions and answers, Databricks discovered that even years-old open-source models with much earlier architectures exhibit striking instruction-following behaviours when fine-tuned on a small corpus of instruction training data.
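To give a sense of what such instruction training data looks like, each Alpaca-style record pairs an instruction (and an optional input) with a desired response. The record below is a made-up illustration of that shape, not an entry from the actual dataset:

```python
# Hypothetical Alpaca-style instruction record (illustrative only, not from the real dataset).
example_record = {
    "instruction": "Summarise the following paragraph in one sentence.",
    "input": "Databricks has released Dolly, an instruction-following model built by "
             "fine-tuning an existing open-source LLM on a small instruction dataset.",
    "output": "Databricks' Dolly shows that a modest, older open-source model can learn "
              "to follow instructions after brief fine-tuning.",
}
```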
“Dolly works by taking an existing open source six billion parameter model from EleutherAI and modifying it ever so slightly to elicit instruction following capabilities such as brainstorming and text generation not present in the original model, using data from Alpaca,” Databricks explained.
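To make that recipe concrete, here is a minimal sketch of how such a fine-tune might be set up with the Hugging Face transformers library. This is not Databricks’ actual training code: the base model name (GPT-J 6B is assumed as the EleutherAI model), the data file and the hyperparameters are all assumptions, and the prompt format simply mirrors the instruction/response pairing described above.

```python
# Minimal sketch (assumptions throughout): fine-tune an existing ~6B-parameter
# open-source model on Alpaca-style instruction data to elicit instruction following.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

MODEL_NAME = "EleutherAI/gpt-j-6B"   # assumed base model
DATA_FILE = "alpaca_data.json"       # assumed local copy of Alpaca-style records

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def to_prompt(example):
    # Join instruction and response into a single training string.
    return {"text": f"### Instruction:\n{example['instruction']}\n\n"
                    f"### Response:\n{example['output']}{tokenizer.eos_token}"}

def tokenize(example):
    return tokenizer(example["text"], truncation=True, max_length=512)

dataset = (load_dataset("json", data_files=DATA_FILE)["train"]
           .map(to_prompt)
           .map(tokenize, remove_columns=["instruction", "input", "output", "text"]))

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dolly-style-ft",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1,
                           fp16=True),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

A short run of this kind on a single high-memory GPU machine is the sort of lightweight fine-tune the article’s 30-minute figure alludes to, though the exact setup Databricks used is documented in its own blog post and code release.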
For organizations, sending data to a centralized LLM provider rather than building their own model carries risks. The datasets most likely to benefit from AI are often also their most sensitive and proprietary, and organizations would not want that data held by a third-party company.
As such, Databricks believes that organizations will eventually want models they own and operate themselves. Like the other big tech companies, however, Databricks acknowledges that generative AI is still an emerging technology: concerns about factual accuracy, bias, offensive responses, general toxicity and hallucinations apply to Dolly just as they do to other language models.
“We’re in the earliest days of democratization of AI for the enterprise, and much work remains to be done, but we believe the technology underlying Dolly represents an exciting new opportunity for companies that want to cheaply build their own instruction-following models,” stated Databricks.