🖊️ Arabic Text Generator
NanoGPT trained from scratch on ArabicText-Large (50K articles), with no pretrained weights.
Model controls (slider ranges): 50–500 · 0.1–2 · 0–200 · 0.5–1
Outputs: generated text and the next-token probability distribution (NanoGPT only)
Example Seeds
About: NanoGPT is a character-level GPT trained from scratch on 50K Arabic Wikipedia/news articles from the ArabicText-Large corpus. Architecture: 6 layers · 6 heads · 384-dim embeddings · ~10M parameters.
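The architecture summary above (6 layers · 384-dim embeddings) can be sanity-checked against the stated ~10M parameter figure with a standard back-of-envelope GPT count. This is a sketch, not the app's actual code: the helper name is ours, and it ignores biases, layernorms, and the (small, character-level) embedding table.

```python
def approx_gpt_params(n_layer: int, d_model: int) -> int:
    """Rough weight count for a GPT stack: ~12 * d_model^2 per block."""
    attn = 4 * d_model * d_model   # Wq, Wk, Wv plus output projection
    mlp = 8 * d_model * d_model    # 4x hidden MLP: up (d x 4d) + down (4d x d)
    return n_layer * (attn + mlp)

total = approx_gpt_params(n_layer=6, d_model=384)
print(f"~{total / 1e6:.1f}M parameters")  # about 10.6M, consistent with the stated ~10M
```

With a character-level vocabulary the embedding table adds only a few tens of thousands of weights, so the block weights dominate the total.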