text-embedding-3-large

Model Description

text-embedding-3-large is our most capable embedding model for both English and non-English tasks. An embedding is a numerical representation of text that can be used to measure the relatedness of two pieces of text. Embeddings are useful for search, clustering, recommendations, anomaly detection, and classification tasks.
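
The snippet below is a minimal sketch of how relatedness can be measured: it requests embeddings for a few strings and compares them with cosine similarity. The client setup and the placeholder API key are assumptions about an OpenAI-compatible Python SDK, not part of this model card.

```python
# Minimal sketch: embed a few strings with text-embedding-3-large and
# compare them with cosine similarity. Assumes an OpenAI-compatible SDK;
# the API key is a placeholder.
import numpy as np
from openai import OpenAI

client = OpenAI(api_key="YOUR_KEY")  # placeholder key

texts = [
    "How do I reset my password?",
    "Steps to recover a forgotten account password",
    "Best hiking trails near Kyoto",
]

# One request can embed several inputs at once.
resp = client.embeddings.create(model="text-embedding-3-large", input=texts)
vectors = np.array([item.embedding for item in resp.data])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Relatedness score; higher means more related."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two password questions should score higher than the unrelated query.
print(cosine_similarity(vectors[0], vectors[1]))  # related pair
print(cosine_similarity(vectors[0], vectors[2]))  # unrelated pair
```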

Recommended Models

DeepClaude-3-7-sonnet

DeepSeek-R1 + claude-3-7-sonnet-20250219. The Deep series combines the DeepSeek-R1 (671B) model with the chain-of-thought reasoning of other models: it makes full use of DeepSeek-R1's powerful reasoning traces and then hands off to another, more capable model to produce the final response, enhancing the capabilities of the combined system.
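
The sketch below illustrates one way such a two-stage pipeline could look: DeepSeek-R1 produces a reasoning trace, which is then passed to claude-3-7-sonnet-20250219 to write the final answer. The gateway URL, the `reasoning_content` field, and the handoff prompt are assumptions about an OpenAI-compatible endpoint, not a documented DeepClaude API.

```python
# Hypothetical two-stage pipeline: DeepSeek-R1 reasons, Claude answers.
# Base URL, API key, and the reasoning_content field are assumptions about
# an OpenAI-compatible gateway; adapt them to your actual provider.
from openai import OpenAI

client = OpenAI(base_url="https://example-gateway/v1", api_key="YOUR_KEY")  # placeholders

question = "How many prime numbers are there below 100?"

# Stage 1: ask DeepSeek-R1 and keep its chain-of-thought.
r1 = client.chat.completions.create(
    model="DeepSeek-R1",
    messages=[{"role": "user", "content": question}],
)
msg = r1.choices[0].message
reasoning = getattr(msg, "reasoning_content", None) or msg.content

# Stage 2: give the reasoning to Claude and ask for the final answer.
final = client.chat.completions.create(
    model="claude-3-7-sonnet-20250219",
    messages=[
        {"role": "user", "content": question},
        {"role": "assistant", "content": f"My reasoning so far:\n{reasoning}"},
        {"role": "user", "content": "Using the reasoning above, give a concise final answer."},
    ],
)
print(final.choices[0].message.content)
```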

QwQ-32B

QwQ-32B is a 32.5B-parameter reasoning model in the Qwen series, featuring an advanced architecture and a 131K-token context length, designed to rival state-of-the-art reasoning models such as DeepSeek-R1 on complex tasks.

gemini-2.5-flash-preview-04-17

Gemini-2.5-Flash-Preview-04-17 is a large language model that accepts text, image, video, and audio inputs, offers advanced output and code execution capabilities, and supports high token limits.