Links
This article breaks down Andrej Karpathy’s zero-dependency, 243-line GPT implementation in plain Python. It explains how each part—tokenizer, autograd engine, embeddings, attention mechanism, residual connections, and MLP—mirrors a full-scale transformer on a tiny dataset of baby names.
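To make the tokenizer piece concrete, here is a minimal character-level tokenizer sketch in plain, zero-dependency Python. This is an illustrative reconstruction of the general idea, not Karpathy's exact code; the names (`stoi`, `itos`, `encode`, `decode`) and the tiny name list are assumptions for the example.

```python
# Illustrative character-level tokenizer (not Karpathy's exact code).
# Builds a vocabulary from the unique characters in a tiny dataset of
# baby names, then maps strings to integer ids and back.
names = ["emma", "olivia", "ava"]  # hypothetical stand-in dataset
chars = sorted(set("".join(names)))
stoi = {ch: i for i, ch in enumerate(chars)}  # character -> integer id
itos = {i: ch for ch, i in stoi.items()}      # integer id -> character

def encode(s):
    return [stoi[c] for c in s]

def decode(ids):
    return "".join(itos[i] for i in ids)

print(encode("ava"))           # a list of integer ids
print(decode(encode("emma")))  # round-trips back to "emma"
```

The same encode/decode pair is all a tiny GPT needs on the input and output side: training sequences are lists of ids, and sampled ids are decoded back into name-like strings.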
Deep Think with Confidence (DeepConf) is introduced as a method to improve reasoning efficiency and performance in large language models by using internal confidence signals to filter out low-quality reasoning traces. It requires no additional training or tuning and can be easily integrated into existing systems. Evaluations show significant accuracy improvements and a reduction in generated tokens on various reasoning tasks.
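The core idea of using internal confidence signals to filter traces can be sketched in a few lines. This is a simplified illustration in the spirit of the method, not the paper's exact algorithm; the scoring rule (mean token log-probability), the threshold value, and the function names are assumptions for the example.

```python
# Illustrative confidence-based trace filtering (not DeepConf's exact
# algorithm). Each reasoning trace carries the model's token
# log-probabilities; traces whose mean log-prob falls below a threshold
# are dropped before majority-voting on the final answer.
def mean_logprob(token_logprobs):
    return sum(token_logprobs) / len(token_logprobs)

def filter_and_vote(traces, threshold=-1.0):
    # traces: list of (answer, token_logprobs) pairs
    kept = [ans for ans, lps in traces if mean_logprob(lps) >= threshold]
    if not kept:  # fall back to all answers if everything was filtered
        kept = [ans for ans, _ in traces]
    return max(set(kept), key=kept.count)  # majority vote on answers

traces = [
    ("42", [-0.1, -0.2, -0.3]),  # confident trace
    ("42", [-0.4, -0.5, -0.2]),  # confident trace
    ("17", [-2.5, -3.0, -2.8]),  # low-confidence trace, filtered out
]
print(filter_and_vote(traces))   # -> "42"
```

Because the signal comes from log-probabilities the model already produces during generation, filtering like this needs no extra training, which is why it can be layered onto existing inference pipelines.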