Research · #llm · 🏛️ Official · Analyzed: Jan 3, 2026 15:44

Image GPT

Published: Jun 17, 2020 07:00
1 min read
OpenAI News

Analysis

The article describes OpenAI's Image GPT, a transformer model trained on pixel sequences for image generation. It highlights the model's ability to generate coherent image completions and samples, and its competitive performance in unsupervised image classification compared to convolutional neural networks. The core finding is that the transformer architecture, typically used for language, can be applied directly to image generation.
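
To make the idea concrete, here is a minimal sketch of training a transformer autoregressively on pixel sequences, in the spirit of the approach described above. This is not OpenAI's Image GPT implementation: the model class `TinyPixelTransformer`, the 32x32 resolution, the 16-value pixel palette, and all layer sizes are illustrative assumptions, and the example uses random data only to show the next-pixel prediction objective.

```python
import torch
import torch.nn as nn

# Sketch: treat an image as a 1-D sequence of quantized pixel values and
# train a causally masked transformer to predict each pixel from the ones
# before it. Sizes and the coarse 16-value palette are assumptions, not
# the settings used by Image GPT.

NUM_PIXEL_VALUES = 16   # assumed coarse pixel palette (vocabulary size)
SEQ_LEN = 32 * 32       # a 32x32 image flattened in raster order
D_MODEL = 128

class TinyPixelTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.tok = nn.Embedding(NUM_PIXEL_VALUES, D_MODEL)   # pixel-value embedding
        self.pos = nn.Embedding(SEQ_LEN, D_MODEL)             # learned position embedding
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=4, dim_feedforward=256, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D_MODEL, NUM_PIXEL_VALUES)       # next-pixel logits

    def forward(self, pixels):                     # pixels: (batch, seq) of int tokens
        seq = pixels.shape[1]
        pos = torch.arange(seq, device=pixels.device)
        x = self.tok(pixels) + self.pos(pos)
        # Causal mask so each position attends only to earlier pixels.
        mask = nn.Transformer.generate_square_subsequent_mask(seq).to(pixels.device)
        x = self.blocks(x, mask=mask)
        return self.head(x)

# One training step on random data, just to show the objective:
model = TinyPixelTransformer()
batch = torch.randint(0, NUM_PIXEL_VALUES, (2, SEQ_LEN))
logits = model(batch[:, :-1])                      # predict pixel t from pixels < t
loss = nn.functional.cross_entropy(
    logits.reshape(-1, NUM_PIXEL_VALUES), batch[:, 1:].reshape(-1)
)
loss.backward()
```

For the classification result mentioned above, the learned hidden representations of such a model can be reused as image features (for example, by fitting a linear classifier on them), which is the unsupervised setting in which the article reports competitive performance with convolutional nets.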

Reference

We find that, just as a large transformer model trained on language can generate coherent text, the same exact model trained on pixel sequences can generate coherent image completions and samples. By establishing a correlation between sample quality and image classification accuracy, we show that our best generative model also contains features competitive with top convolutional nets in the unsupervised setting.