Hey there!
Summary of this email:
- The importance of running AI models locally (and a new course about it!)
- What is Real-Time Vibe Coding
⏱️ Estimated reading time: 1 quick minute.
🏠 The importance of running AI models locally
Just a few hours ago, the folks at Cursor made a somewhat controversial claim: we're entering the third era of programming with AI.
And that third wave is remote agents, running on their servers inside virtual machines.
They mention that 35% of their pull requests are created this way. And honestly, it makes quite a lot of sense.
But this is AI for programming. AI for delivering features is different.
As we increasingly build small applications for ourselves to make day-to-day work easier, local models make more sense than ever.
No internet connection needed, and you gain a lot in privacy and security.
That's why we've launched a new course to learn how to develop applications that can scale with local models. We hope you like it. 😊
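To make "running a model locally" concrete: a common setup is a local model server such as Ollama, which exposes an HTTP API on localhost. The sketch below follows Ollama's documented `/api/generate` endpoint; the model name `llama3.2` is just an example of a model you might have pulled, and the helper names are our own.

```python
import json
from urllib import request

def build_payload(model: str, prompt: str) -> dict:
    # Minimal non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
    # The request never leaves your machine: it hits the Ollama server on localhost.
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama server with the model already pulled.
    print(ask_local_model("In one sentence: why run models locally?"))
```

No API key, no outbound traffic: that's the privacy and offline story in a nutshell.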
✨ What is Real-Time Vibe Coding
We mentioned the three waves of programming with AI earlier. According to Cursor:
- Autocomplete with tab tab
- Local agents
- Remote agents
From our point of view, there's an even earlier stage missing: the ChatGPT moment, when you'd paste your code into the chat and ask questions about it.
That was just over a year ago... time flies!
That said, it absolutely makes sense that the next wave is remote agents, and we'll probably gradually head in that direction.
In parallel, something very interesting is brewing: Models embedded in chips.
This will greatly improve speed and efficiency, making almost all responses instantaneous (under 1s).
Imagine asking AI to do a massive refactor or add a complex feature, and one second later it hands you five fully implemented proposals.
We did this exercise of imagining what the future of programming could look like and wrote a blog post about it. Let us know if you agree or disagree with our conclusions. 🙌
And since you've made it to this part of the newsletter (very quickly today), here's the joke of the week I know you were waiting for:
> Why are programming models so buff? Because they train with weights! 😂 😂 😂
Cheers!