Documentation
Platform Core (Models)
The foundation of the MX4 AI platform.
Last updated on February 2, 2026
Platform Core represents our family of sovereign large language models, trained specifically for high-performance Arabic and English reasoning.
Available Models
mx4-atlas-core
Our flagship model. Best for complex reasoning, content generation, and Arabic dialect understanding.
Standard Context · General Purpose
mx4-atlas-lite
Optimized for speed and cost. Ideal for classification, extraction, and real-time chat.
Extended Context · Low Latency
Model Specifications
Model sizes, context windows, and throughput depend on hardware and deployment configuration. We share detailed sizing and benchmark guidance during pilots or under NDA.
Performance Benchmarks (Arabic Tasks)
Arabic benchmark results are available upon request. We review evaluation outcomes during pilots and align model selection to your target tasks and dialects.
Use Case Recommendations
Choose Platform Core (mx4-atlas-core) for:
- Complex reasoning and analysis tasks
- Long-form content generation
- Multi-dialect Arabic conversation
- Nuanced translation tasks
- Fine-tuning on specialized domains
Choose MX4 Platform Lite (mx4-atlas-lite) for:
- Real-time chat applications
- Text classification & extraction
- High-volume API usage (cost sensitive)
- Edge deployment / low-latency requirements
- Structured output tasks (JSON, schemas)
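For structured output tasks, it helps to pin the schema in the prompt and validate the reply before using it. The sketch below is illustrative only: the helper names, the system prompt, and the sentiment schema are examples we made up for this guide, not part of the MX4 API.

```python
import json

# Illustrative helper: builds an OpenAI-style request payload asking
# mx4-atlas-lite for strict JSON output. Prompt and schema are examples.
def build_sentiment_request(text):
    return {
        "model": "mx4-atlas-lite",
        "temperature": 0.0,  # deterministic output suits extraction tasks
        "messages": [
            {"role": "system",
             "content": 'Reply with JSON only: {"sentiment": "positive" | "negative" | "neutral"}'},
            {"role": "user", "content": text},
        ],
    }

def parse_sentiment(raw_reply):
    # Validate the model's reply before trusting it downstream.
    data = json.loads(raw_reply)
    if data.get("sentiment") not in {"positive", "negative", "neutral"}:
        raise ValueError(f"unexpected reply: {data!r}")
    return data["sentiment"]
```

Pass the payload to `client.chat.completions.create(**build_sentiment_request(text))` and run the reply through `parse_sentiment` so malformed output fails loudly instead of propagating.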
Pricing Guide
Platform Core (mx4-atlas-core)
70B parameter model
$0.50 / 1M input tokens
$1.50 / 1M output tokens
MX4 Platform Lite (mx4-atlas-lite)
13B parameter model
$0.10 / 1M input tokens
$0.30 / 1M output tokens
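To see how these rates translate into a monthly bill, here is a small worked estimate using the prices listed above. The rates are taken from this guide; actual invoices may differ, and the token volumes are purely hypothetical.

```python
# $ per 1M tokens, from the pricing guide above.
RATES = {
    "mx4-atlas-core": {"input": 0.50, "output": 1.50},
    "mx4-atlas-lite": {"input": 0.10, "output": 0.30},
}

def estimate_cost(model, input_tokens, output_tokens):
    """Rough monthly cost estimate in USD from token counts."""
    r = RATES[model]
    return (input_tokens / 1_000_000) * r["input"] \
         + (output_tokens / 1_000_000) * r["output"]

# Hypothetical workload: 10M input + 2M output tokens per month.
core = estimate_cost("mx4-atlas-core", 10_000_000, 2_000_000)  # 5.00 + 3.00 = $8.00
lite = estimate_cost("mx4-atlas-lite", 10_000_000, 2_000_000)  # 1.00 + 0.60 = $1.60
```

At this volume the Lite model costs 5x less, which is why it is the default recommendation for high-volume, cost-sensitive workloads.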
Quick Start
quick_start.py

import openai
import os

client = openai.OpenAI(
    api_key=os.getenv("MX4_API_KEY"),
    base_url="https://api.mx4.ai/v1"
)

# Use Platform Core for complex tasks
response = client.chat.completions.create(
    model="mx4-atlas-core",  # Flagship model
    messages=[
        # "Explain the concept of artificial intelligence to me in a simple way"
        {"role": "user", "content": "اشرح لي مفهوم الذكاء الاصطناعي بطريقة مبسطة"}
    ],
    temperature=0.7
)
print("Core:", response.choices[0].message.content)

# Use MX4 Platform Lite for fast responses
response = client.chat.completions.create(
    model="mx4-atlas-lite",  # Lightweight model
    messages=[
        # "Classify the sentiment: This product is excellent!"
        {"role": "user", "content": "صنف المشاعر: هذا المنتج ممتاز!"}
    ],
    temperature=0.3
)
print("Lite:", response.choices[0].message.content)
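In production, API calls like the ones above can fail transiently (rate limits, timeouts), so it is worth wrapping them in a retry with exponential backoff. Below is a minimal, stdlib-only sketch; the function name, delays, and the broad `Exception` default are our assumptions, not MX4 requirements. In a real client you would pass a narrower exception tuple such as `(openai.RateLimitError, openai.APITimeoutError)`.

```python
import random
import time

def with_retries(fn, max_attempts=4, base_delay=0.5, retryable=(Exception,)):
    """Call fn(), retrying on retryable exceptions with exponential backoff.

    Sketch only: tune max_attempts, base_delay, and the exception tuple
    to your workload. Re-raises the last error once attempts run out.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts - 1:
                raise
            # Backoff doubles each attempt (0.5s, 1s, 2s, ...) plus jitter
            # so concurrent clients do not retry in lockstep.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))
```

Usage is a thin wrapper around the quick-start call, e.g. `with_retries(lambda: client.chat.completions.create(model="mx4-atlas-lite", messages=msgs))`.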