Scaffolded LLMs as natural language computers

Recently, LLM-based agents have been all the rage – with projects like AutoGPT showing how easy it is to wrap an LLM in a simple agentic loop and prompt it to achieve real-world tasks. More generally, we can think about the class of ‘scaffolded’ LLM systems – which wrap...
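
A minimal sketch of what such a simple agentic loop might look like (the names `call_llm`, `run_tool`, and `agent_loop` are hypothetical stand-ins for illustration, not AutoGPT's actual interface):

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a call to a language model; returns its proposed next action."""
    raise NotImplementedError

def run_tool(action: str) -> str:
    """Hypothetical stand-in for executing the proposed action (web search, code execution, etc.)."""
    raise NotImplementedError

def agent_loop(goal: str, max_steps: int = 10) -> str:
    # The scaffold: repeatedly ask the LLM for an action, execute it,
    # and feed the observation back into the prompt.
    history = f"Goal: {goal}\n"
    for _ in range(max_steps):
        action = call_llm(history + "Next action:")
        if action.strip().startswith("DONE"):
            break
        observation = run_tool(action)
        history += f"Action: {action}\nObservation: {observation}\n"
    return history
```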

The singularity as cognitive decoupling

An interesting way I like to think about the singularity is as cognitive decoupling. Specifically, the singularity is the final industrial revolution, the one that lets capital be converted directly into intellectual labour. The first industrial revolution occurred when capital became convertible into manual energy – i.e. humanity learned to...

Why GOFAI failed

Now that deep learning has been fully victorious and the reasons for its victory have started to be understood, it is interesting to revisit some old debates and return, with a new perspective, to the question of why GOFAI failed.

My PhD experience

Originally written in early 2021 while writing up my thesis; I never got around to publishing it on my blog at the time. I think it might be interesting to people wanting to see what a PhD looks like from the ‘inside’. Note that everyone’s PhD experience is highly personal to them and...

GPUs vs Brains: Hardware and Architecture

Epistemic status: I owe a lot of my thoughts to Jacob Cannell’s work on both brains and deep learning. My thinking here comes from my experience in large-scale ML, as well as my background in neuroscience and specifically my experience with analog hardware for deep learning...