Compare Models
BigScience
BLOOM
FREE
BigScience Large Open-science Open-access Multilingual Language Model (BLOOM) is a transformer-based LLM. Over 1,000 AI researchers created it to provide a free, multilingual large language model for anyone who wants to try one. BLOOM is autoregressive: it is trained to continue text from a prompt, using vast amounts of text data and industrial-scale computational resources. It can output coherent text in 46 natural languages and 13 programming languages. To interact with the API, you’ll need to request a token, which is done with a POST request to the server. Tokens are only valid for two weeks, after which a new one must be generated. With around 176B parameters, BLOOM is considered an alternative to OpenAI models. Both a downloadable model and a hosted API are available.
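A minimal sketch of calling the hosted API, assuming it is served through the Hugging Face Inference API endpoint for `bigscience/bloom` (check the provider's docs for the exact URL and token workflow; the `HF_TOKEN` environment variable is a placeholder name used here for illustration):

```python
import json
import os
import urllib.request

# Assumed hosted BLOOM endpoint on the Hugging Face Inference API.
API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"


def build_request(prompt: str, token: str) -> urllib.request.Request:
    """Build the POST request; the bearer token authenticates the call."""
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Tokens expire after two weeks, so read the current one from the environment.
token = os.environ.get("HF_TOKEN", "")
req = build_request("Translate to French: Hello, world.", token)

# Only send the request when a real token is configured:
if token:
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

The request is built separately from being sent, so the authentication headers and payload can be inspected before any network call is made.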
ChatGLM
ChatGLM-6B
FREE
Researchers at Tsinghua University in China developed the ChatGLM series of models, which offer performance comparable to models such as GPT-3 and BLOOM. ChatGLM-6B is an open bilingual language model trained on Chinese and English. It is based on the General Language Model (GLM) framework and has 6.2B parameters. With quantization, users can deploy it locally on consumer-grade graphics cards (only 6GB of GPU memory is required at the INT4 quantization level). The following models are available: ChatGLM-130B (an open-source LLM), ChatGLM-100B (not open source, but available through invite-only access), and ChatGLM-6B (a lightweight open-source alternative). ChatGLM LLMs are available under an Apache-2.0 license that allows commercial use. We have included the link to the Hugging Face page where you can try the ChatGLM-6B chatbot for free.
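The 6GB figure can be sanity-checked with back-of-the-envelope arithmetic: at INT4, each of the 6.2B parameters takes half a byte, so the weights alone need about 3 GiB, leaving headroom on a 6GB consumer card. A quick sketch (the overhead figure for activations and the KV cache is an assumption for illustration, not a measured value):

```python
PARAMS = 6.2e9      # ChatGLM-6B parameter count
BITS_PER_PARAM = 4  # INT4 quantization

weight_bytes = PARAMS * BITS_PER_PARAM / 8
weight_gib = weight_bytes / 2**30
print(f"INT4 weights: {weight_gib:.2f} GiB")  # ~2.89 GiB

# Assumed overhead for activations, KV cache, and CUDA context (illustrative):
overhead_gib = 2.0
total_gib = weight_gib + overhead_gib
print(f"Estimated total: {total_gib:.2f} GiB")
assert total_gib < 6  # fits on a 6GB consumer GPU
```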
OpenAI
ChatGPT (Web Browser Version)
FREE
The ChatGPT web browser version is an accessible, powerful online language model. The chatbot provides a user-friendly interface that lets people interact with it without any specialized programming or machine-learning knowledge. Users can apply ChatGPT to a wide range of tasks, including tutoring in academic subjects, generating creative content, drafting and editing text, providing personalized recommendations, translating languages, and even helping with programming. Businesses can use it to automate customer service, generate marketing content, and deliver personalized user experiences. ChatGPT is powered by GPT-3.5-turbo by default and is free to try. Paying customers who subscribe to ChatGPT Plus can switch the model to GPT-4 before starting a chat. The ChatGPT models currently support several languages, including English, Spanish, French, German, Portuguese, Italian, and Dutch. New features for ChatGPT Plus users have just been announced, including a web-browsing feature that provides up-to-date information (before this update, ChatGPT was limited in what it could answer, as it was only trained on data up to 2021). ChatGPT Plus users can also access third-party plug-ins for web services like Expedia, Kayak, and Instacart; with these plug-ins, users can prompt ChatGPT to perform tasks on specific websites.
Deepmind
Chinchilla AI
OTHER
Google DeepMind’s Chinchilla AI is still in the testing phase. Once released, Chinchilla will be useful for building artificial-intelligence tools such as chatbots, virtual assistants, and predictive models. It functions in a manner analogous to other large language models such as GPT-3 (175B parameters), Jurassic-1 (178B parameters), Gopher (280B parameters), and Megatron-Turing NLG (300B parameters). Because Chinchilla is smaller (70B parameters), however, inference and fine-tuning cost less, putting such models within reach of smaller companies or universities that may not have the budget or hardware to run larger ones.
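The cost advantage of a smaller model is easy to illustrate: serving weights in fp16 takes two bytes per parameter, so Chinchilla needs only a quarter of the memory Gopher does. A rough comparison using the parameter counts above (fp16 serving precision is an assumption; actual deployments vary):

```python
BYTES_PER_PARAM = 2  # assumed fp16 serving precision

# Parameter counts as quoted in the entry above.
models = {
    "Chinchilla": 70e9,
    "GPT-3": 175e9,
    "Jurassic-1": 178e9,
    "Gopher": 280e9,
    "Megatron-Turing NLG": 300e9,
}

for name, params in models.items():
    gb = params * BYTES_PER_PARAM / 1e9
    print(f"{name:>20}: {gb:,.0f} GB of fp16 weights")

ratio = models["Gopher"] / models["Chinchilla"]
print(f"Gopher holds {ratio:.0f}x the parameters of Chinchilla")
```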
Yandex
YaLM
FREE
YaLM 100B is a GPT-like neural network for generating and processing text. It can be used freely by developers and researchers all over the world. Training the model took 65 days on a cluster of 800 A100 graphics cards and 1.7 TB of online texts, books, and countless other sources in both English and Russian. Researchers and developers can use this corporate-scale solution to tackle the most complex natural-language-processing problems. Training details and best practices for acceleration and stabilization can be found in articles on Medium (English) and Habr (Russian). The model is published under the Apache 2.0 license, which permits both research and commercial use.
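The quoted training run translates into a substantial compute budget: 800 A100s running around the clock for 65 days works out to about 1.25 million GPU-hours. A quick calculation (cluster utilization is not given in the source, so this is an upper bound on wall-clock GPU time):

```python
GPUS = 800          # A100 cards in the training cluster
DAYS = 65           # wall-clock training time
HOURS_PER_DAY = 24

gpu_hours = GPUS * DAYS * HOURS_PER_DAY
print(f"Total training compute: {gpu_hours:,} GPU-hours")  # 1,248,000
```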