Comprehensive side-by-side LLM comparison
UI-TARS-72B-DPO supports multimodal (image and text) inputs, while Qwen2.5-14B-Instruct is text-only. Both models have their strengths depending on your specific needs.
Alibaba / Qwen
Qwen2.5-14B-Instruct is a 14-billion-parameter language model from Alibaba, released in September 2024 within the Qwen2.5 family, occupying the mid-tier of the series between the efficiency-focused small models and the high-capability 72B flagship. Trained on 18 trillion tokens with emphasis on instruction alignment, code understanding, and multilingual reasoning, it offers a strong performance-to-compute ratio for developers who need more capability than 7B but cannot serve 32B or larger models. The model supports a 128K-token context window and structured output generation out of the box.
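For context, here is a minimal sketch of prompting the instruct checkpoint through Hugging Face transformers. The repository ID Qwen/Qwen2.5-14B-Instruct is the published checkpoint name; the JSON-style system prompt is just one illustrative way to exercise its structured-output support, not a required format.

```python
# Minimal sketch: chat-style generation with Qwen2.5-14B-Instruct via transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-14B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [
    {"role": "system", "content": "You are a helpful coding assistant. Answer as a JSON object."},
    {"role": "user", "content": "Explain binary search in one field named 'summary'."},
]

# Build the chat prompt with the model's own template, then generate.
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer([text], return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))
```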
ByteDance
UI-TARS-72B-DPO, released by ByteDance in early 2025, is a 72-billion-parameter multimodal large language model from the UI-TARS family, built on Qwen-2-VL and fine-tuned for automated GUI interaction and computer control. It features native understanding of screenshots, UI elements, and web interfaces, achieving strong results across GUI benchmarks for perception, grounding, and agentic control. UI-TARS-72B-DPO targets computer-use agents, web automation, and applications requiring robust visual UI reasoning.
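As a rough sketch of how such a GUI model is driven, the snippet below loads the checkpoint with the Qwen2-VL classes it is built on and asks it to propose an action for a screenshot. The repository ID bytedance-research/UI-TARS-72B-DPO, the screenshot path, and the instruction wording are assumptions for illustration; the official agent harness adds its own prompt format and action parsing on top of this.

```python
# Minimal sketch: one screenshot-plus-instruction turn with UI-TARS-72B-DPO,
# using the Qwen2-VL model/processor classes it is derived from.
from transformers import Qwen2VLForConditionalGeneration, AutoProcessor
from PIL import Image

model_id = "bytedance-research/UI-TARS-72B-DPO"  # assumed Hugging Face repository ID
model = Qwen2VLForConditionalGeneration.from_pretrained(model_id, torch_dtype="auto", device_map="auto")
processor = AutoProcessor.from_pretrained(model_id)

screenshot = Image.open("screenshot.png")  # hypothetical local screenshot
messages = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "Click the search box and type 'weather forecast'."},
    ],
}]

# Render the chat template, bundle text and image, and generate a response.
prompt = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[prompt], images=[screenshot], return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens (the model's proposed action/answer).
print(processor.batch_decode(output[:, inputs.input_ids.shape[1]:], skip_special_tokens=True)[0])
```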
Model                   Creator           Release date
Qwen2.5 14B Instruct    Alibaba / Qwen    2024-09-19
UI-TARS-72B-DPO         ByteDance         2025-01 (3 months newer)
Available providers and their performance metrics