9 Comments
Yogesh Tiwari

Thank you. Loved every bit of this post.

Hasan Uzun

Yes — the “short leash” idea is the key. The best AI coding workflows I’ve seen look less like magic and more like tight ops: small scoped tasks, explicit constraints, quick verification, then handoff to the next step. That same pattern is spreading beyond engineering into founder ops, research, and customer workflows too.

Renowned World

Great read! AI coding certainly works best with a clear workflow and structure, especially when driven by strong product thinking.

One thing I’ve noticed though is that these tools are making it possible to ship features so quickly that teams can end up moving faster than users can meaningfully react or give feedback.

I actually wrote a short piece on that recently!

https://renownedworld.substack.com/p/the-hidden-problem-with-ai-speed?r=6esy69

Adam Howarth

You had me at iterate loop.

A S
Feb 10 (edited)

Thanks, Neo, for this wonderful blog. I'm spending my third extended work day on it! I have a question, or maybe a request for clarification, about the workflow you proposed. Up to the third step (code), the process seems the same for any kind of task: build, fix, refactor, or optimize. After the code step, we could have the AI generate tests, do a review (both me and the AI), or first iterate on the generated code for explanations, optimizations, and alternatives, and only then move on to tests and review, or vice versa. Are we meant to be that flexible, depending on the complexity of the task?

Shrijith Venkatramana

I use LiveReview, a Source-Available AI Code Reviewer, which I built for myself.

What makes LiveReview different is that it triggers automatically during git's pre-commit hook.

And it also marks a commit as reviewed or not for posterity.

To be honest, despite being the builder of the tool, I was sceptical of how useful it could be.

But with experience, I've found that the LiveReview pre-commit check has caught hundreds of issues in a matter of months.

This makes team-level PR review easier later, because the sanity checks have already been done at commit time.
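The gating mechanism described in this comment can be sketched roughly as follows. This is only an illustration of how a pre-commit review gate works in general; the reviewer command shown is a placeholder, and the real LiveReview invocation may look quite different.

```python
# Rough sketch of a git pre-commit review gate (placeholder command,
# not the actual LiveReview CLI).
import subprocess
import sys


def review_gate(reviewer_cmd):
    """Run a review command; True means the commit may proceed."""
    result = subprocess.run(reviewer_cmd)
    # git aborts the commit when a pre-commit hook exits nonzero,
    # so a failing review (nonzero exit) blocks the commit.
    return result.returncode == 0


# A real hook saved as .git/hooks/pre-commit would end with something
# like:
#   sys.exit(0 if review_gate([<reviewer>, <staged-diff-args>]) else 1)
# Here we demonstrate the gate with a command that always passes.
commit_allowed = review_gate([sys.executable, "-c", "raise SystemExit(0)"])
```

Saved as `.git/hooks/pre-commit` and made executable, a script along these lines runs before every commit and refuses the commit whenever the review command fails.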

You can check it out here:

https://hexmos.com/livereview/

Bola Ogunlana

I love the workflow. In fact, it closely mirrors the one I settled on myself in my Cloud DevOps work. I just published a book on this same topic (Vibe Coding: Build Cloud Infrastructure at the Speed of Thought). It's good to know it works across multiple realms of software engineering.

Israel

I learned a lot from this. Thank you! Question about the tests you prompt for: you ask for the happy path and two "edge" cases. Does "edge" really mean edge as in "boundary" or "uncommon", or does it actually mean error paths and exceptions?

AK DevCraft

A really well-put workflow, I appreciate it! The only thing I do differently is dictating the prompt instead of typing it. Guess I'm a bit lazy 😄