Compare Models
Amazon SageMaker
FREE
Amazon SageMaker enables developers to create, train, and deploy machine-learning (ML) models in the cloud. SageMaker also enables developers to deploy ML models on embedded systems and edge devices. Amazon SageMaker JumpStart helps you get started with machine learning quickly and easily. Its solutions are fully customizable and support one-click deployment and fine-tuning of more than 150 popular open-source models, including natural language processing, object detection, and image classification models that can help with extracting and analyzing data, fraud detection, churn prediction, and personalized recommendations.
The Hugging Face LLM Inference DLCs on Amazon SageMaker support the following models: BLOOM / BLOOMZ, MT0-XXL, Galactica, SantaCoder, GPT-NeoX 20B (joi, pythia, lotus, rosey, chip, RedPajama, Open Assistant), FLAN-T5-XXL (T5-11B), Llama (vicuna, alpaca, koala), StarCoder / SantaCoder, and Falcon 7B / Falcon 40B. Hugging Face's LLM DLC is a purpose-built inference container for easily deploying LLMs in a secure and managed environment.
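As a rough illustration of how deploying one of these models with the Hugging Face LLM DLC might look, the sketch below uses the sagemaker Python SDK; the model ID (tiiuae/falcon-7b-instruct), instance type, and environment settings are illustrative assumptions rather than values taken from this page.

```python
# Minimal sketch: deploying an open-source LLM with the Hugging Face LLM DLC
# on Amazon SageMaker via the sagemaker Python SDK. Model ID, GPU count, and
# instance type below are assumed examples, not prescribed values.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# Look up the URI of the Hugging Face LLM inference container
image_uri = get_huggingface_llm_image_uri("huggingface")

# Point the container at a model hosted on the Hugging Face Hub
model = HuggingFaceModel(
    role=role,
    image_uri=image_uri,
    env={
        "HF_MODEL_ID": "tiiuae/falcon-7b-instruct",  # assumed example model
        "SM_NUM_GPUS": "1",                          # GPUs per replica
    },
)

# Create a real-time endpoint and send a test prompt
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")
print(predictor.predict({"inputs": "Explain Amazon SageMaker in one sentence."}))
```

The same pattern applies to the other supported models: only the HF_MODEL_ID and, for larger models, the GPU count and instance type would typically change.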
Cohere Summarize
$0.015
Cohere is a Canadian startup that provides high-performance and secure LLMs for the enterprise. Its models work on public, private, or hybrid clouds and are available as an API that can be integrated into applications using Python, Node, or Go software development kits (SDKs). Cohere Summarize generates a succinct version of a provided text. The summary relays the most important messages of the text, and a user can configure the results with a variety of parameters to support unique use cases. It can instantly encapsulate the key points of a document and provides text summarization at scale.
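The sketch below shows one way such a summarization call might look with the cohere Python SDK; the API key placeholder, input text, and the length, format, and extractiveness values are assumptions for illustration, and parameter names may differ between SDK versions.

```python
# Minimal sketch of calling Cohere's Summarize endpoint with the cohere Python SDK.
# The key, text, and tuning parameters are illustrative assumptions.
import cohere

co = cohere.Client("YOUR_API_KEY")  # assumed placeholder API key

long_text = (
    "Amazon SageMaker enables developers to create, train, and deploy "
    "machine-learning models in the cloud, as well as on embedded systems "
    "and edge devices. SageMaker JumpStart offers one-click deployment and "
    "fine-tuning of more than 150 popular open-source models."
)

response = co.summarize(
    text=long_text,
    length="medium",       # short | medium | long
    format="bullets",      # paragraph | bullets
    extractiveness="low",  # how closely the summary sticks to the source wording
)
print(response.summary)
```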
Yandex YaLM
FREE
YaLM 100B is a GPT-like neural network for generating and processing text. It can be used freely by developers and researchers from all over the world. Training the model took 65 days on a cluster of 800 A100 graphics cards and 1.7 TB of online texts, books, and countless other sources in both English and Russian. Researchers and developers can use this corporate-scale model to tackle the most complex problems in natural language processing. Training details and best practices on acceleration and stabilization can be found in articles on Medium (English) and Habr (Russian). The model is published under the Apache 2.0 license, which permits both research and commercial use.