Tokenization

3 videos across 2 channels

Tokenization is the process of breaking text and code into tokens that AI systems can interpret, and it influences cost, speed, and safety. The concept matters for building reliable AI agents that can interact with code and tools without relying on brittle shells, and for understanding how tokens shape memory, coding ecosystems, and regulatory and ethical considerations in digital workflows. Recent discussions connect tokenization to practical questions about tool calls, model steering, and the evolving balance between human and machine roles in software development.
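The "breaking text into tokens" step described above can be sketched with a toy byte-pair-encoding loop: start from individual characters and repeatedly merge the most frequent adjacent pair into a single token. This is a simplified illustration under stated assumptions, not the tokenizer used by any particular model:

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Return the most common adjacent token pair, or None if empty."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def bpe_tokenize(text, num_merges=3):
    """Toy byte-pair encoding: begin with one token per character,
    then greedily merge the most frequent adjacent pair num_merges times."""
    tokens = list(text)
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merged, out, i = pair[0] + pair[1], [], 0
        while i < len(tokens):
            # Collapse each occurrence of the chosen pair into one token.
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                out.append(merged)
                i += 2
            else:
                out.append(tokens[i])
                i += 1
        tokens = out
    return tokens
```

Fewer, longer tokens mean fewer units for the model to process, which is why tokenization directly affects cost and speed.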

Bash is bad for agents

The video argues that Bash alone is not enough for AI agents to safely and efficiently interact with code and systems…

00:32:22
I asked Sam Altman about the future of code

The video explores the tension between rapidly evolving AI tooling and the fundamentals of software development…

00:30:42