Featured AI Models
BERT
Bidirectional Encoder Representations from Transformers, a powerful NLP model for language understanding tasks such as text classification, question answering, and named entity recognition.
Provider: Google
Parameters: ~340 million (BERT-large)
Required RAM: ~4GB
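A minimal loading sketch for the entry above, assuming the Hugging Face transformers library and the bert-large-uncased checkpoint (the ~340M-parameter variant):

```python
from transformers import AutoModel, AutoTokenizer

# bert-large-uncased is the ~340M-parameter checkpoint on the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased")
model = AutoModel.from_pretrained("bert-large-uncased")

# Encode a sentence and inspect the contextual embeddings.
inputs = tokenizer("BERT encodes text bidirectionally.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 1024) for BERT-large
```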
ResNet
Deep residual learning framework that revolutionized image recognition and classification by making very deep networks trainable through skip connections.
Provider: Microsoft Research
Parameters: ~25 million (ResNet-50)
Required RAM: ~500MB
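A minimal inference sketch for the entry above, assuming torchvision 0.13+ (for the weights enum API) and the ResNet-50 variant:

```python
import torch
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights)
model.eval()

# The weights bundle their matching resize/normalize transforms.
preprocess = weights.transforms()
image = torch.rand(3, 256, 256)  # stand-in for a real RGB image tensor
with torch.no_grad():
    logits = model(preprocess(image).unsqueeze(0))
print(logits.shape)  # torch.Size([1, 1000]) ImageNet class scores
```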
T5
Text-to-Text Transfer Transformer, a versatile framework that casts every NLP task as text generation, so a single model can be fine-tuned for translation, summarization, question answering, and more.
Provider: Google
Parameters: ~11 billion (T5-11B)
Required RAM: ~16GB
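A minimal sketch of the text-to-text interface, again assuming Hugging Face transformers; the smaller t5-base checkpoint stands in here, since the ~11B variant (t5-11b) needs the full ~16GB of RAM listed above:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")  # swap in "t5-11b" given enough RAM
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# T5 frames every task as text-to-text, selected via a task prefix.
inputs = tokenizer("translate English to German: Hello, world!", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```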
DeepSeek LLM 7B
An open-source language model with strong performance across general language understanding and generation tasks.
Provider: DeepSeek AI
Parameters: 7 billion
Required RAM: ~16GB
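A hedged loading sketch for the entry above; the Hub checkpoint name deepseek-ai/deepseek-llm-7b-base is an assumption, and loading in bfloat16 keeps a 7B model within roughly the 16GB listed:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-llm-7b-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("The capital of France is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```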
MiniCPM
A compact language model that delivers strong performance relative to its size, making it suitable for resource-constrained NLP deployments.
Provider: OpenBMB
Parameters: 2 billion
Required RAM: ~8GB
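A hedged loading sketch for the entry above; the checkpoint name openbmb/MiniCPM-2B-sft-bf16 is an assumption, and OpenBMB models typically require trust_remote_code=True:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-2B-sft-bf16"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, trust_remote_code=True
)

inputs = tokenizer("Briefly explain what a residual connection is.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```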