17-April-25: Nebius AI Studio (www.nebius.com) offers an Inference Service with hosted open-source models, requiring no MLOps expertise. It features low latency (particularly for users in Europe, where its infrastructure is hosted), side-by-side model comparisons, and a choice between fast and cost-efficient processing options. Hosted models include Meta Llama 3.1 and Mistral.
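
A minimal sketch of querying one of these hosted models, assuming Nebius AI Studio exposes an OpenAI-compatible chat-completions endpoint; the base URL and model ID below are illustrative and should be checked against the current Studio documentation:

```python
# Sketch: calling a hosted open-source model via an OpenAI-compatible API.
# The base URL and model ID are assumptions, not confirmed values.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.studio.nebius.ai/v1/",   # assumed Studio endpoint
    api_key=os.environ["NEBIUS_API_KEY"],          # API key from the Studio dashboard
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct",  # example hosted model ID
    messages=[{"role": "user", "content": "Summarize MLOps in one sentence."}],
    temperature=0.2,
)

print(response.choices[0].message.content)
```

Because the service is OpenAI-compatible in this sketch, switching between the fast and cost-efficient options would amount to selecting a different model ID rather than changing application code.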