2026

2025

Paper Summary for Recursive Looped Transformers: Latent Reasoning

19 minute read

A paper-reading note on latent reasoning in Looped / Recursive Transformers: scaling test-time compute via recurrent depth, recursive latent thoughts, and large-scale looped language models.

Recursive Transformers Paper Interpretation

A One-Stop Guide to Scaling Laws in LLM Quantization

27 minute read

A comprehensive overview of quantization scaling laws. Dive deep into five papers to understand how the performance loss from quantization varies with model size and training token count.

Quantization Paper Interpretation