Weave Launches: AI To Quantify Engineering Work
"X-ray vision for engineering teams"
tl;dr: Weave uses AI to understand software engineering work, giving leaders X-ray vision into their teams.
Founded by Adam Cohen & Andrew Churchill
Adam’s background is in operations and sales. He led organizations of 100+ people and built an internal tool to measure performance and help individuals identify their weak spots. Measuring performance this way is common practice on revenue teams, and he wanted to bring it to engineering.
Andrew was employee #1 at Causal. He saw firsthand how subjective engineering management is and how hard it is to scale a high-performing engineering team.
They met at Causal, where Adam was hired to run the sales and customer success function. They got to talking about the big difference between how their two departments worked, and the rest is history.
The Problem
Engineering leaders are flying blind. They can’t dive in everywhere, so they need to rely on gut feel or shoddy metrics to try to get a handle on what’s going on and what needs fixing.
Engineering is unique in that there are no good metrics to solve this problem. That's why they built Weave.
The Solution
Weave uses AI to measure engineering work. They run LLMs plus their own models on every PR and review, analyzing both output and quality, then summarize the data and insights in dashboards.
They have built a custom machine learning model trained on an expert-labeled data set of PRs. The data set lets them answer the question: “how long would this PR take an expert engineer?” This lets them calculate the metric most companies care most about: how much actual work is getting done:

This isn't a lines-of-code calculator; it's an actual estimate of the key metric: "How long would it take an experienced engineer to make this change?"
They can also tell you, for example, how much time each engineer is spending on code review and how useful their reviews are:

And they classify PRs into new features, bug fixes, or "keeping the lights on", so they can tell you how much of your engineering bandwidth is going to each bucket:
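Weave's actual models aren't public, but as a rough sketch, per-PR effort estimates and work-type labels like these could roll up into dashboard numbers along the following lines. All field names, categories, and figures here are hypothetical, not Weave's real schema:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class PRRecord:
    """Hypothetical per-PR output of the models described above."""
    author: str
    category: str            # e.g. "feature", "bug_fix", or "keep_the_lights_on"
    estimated_hours: float   # model's answer to "how long for an expert engineer?"

def bandwidth_by_category(prs: list[PRRecord]) -> dict[str, float]:
    """Share of total estimated effort going to each work bucket."""
    totals: dict[str, float] = defaultdict(float)
    for pr in prs:
        totals[pr.category] += pr.estimated_hours
    grand_total = sum(totals.values()) or 1.0
    return {cat: hours / grand_total for cat, hours in totals.items()}

def output_by_engineer(prs: list[PRRecord]) -> dict[str, float]:
    """Total estimated expert-hours of merged work per engineer."""
    totals: dict[str, float] = defaultdict(float)
    for pr in prs:
        totals[pr.author] += pr.estimated_hours
    return dict(totals)

# Illustrative data only.
prs = [
    PRRecord("alice", "feature", 6.0),
    PRRecord("bob", "bug_fix", 2.0),
    PRRecord("alice", "keep_the_lights_on", 2.0),
]
print(bandwidth_by_category(prs))  # {'feature': 0.6, 'bug_fix': 0.2, 'keep_the_lights_on': 0.2}
print(output_by_engineer(prs))     # {'alice': 8.0, 'bob': 2.0}
```

The key design point is that everything aggregates from one per-PR estimate, so team-level, engineer-level, and bucket-level views all stay consistent with each other.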

Early Use Cases
- One company used Weave to see that Cursor drove a 20% lift in their team's output
- A manager noticed that code review quality had the highest correlation with output. He reset code review standards, and team output went up by 15%
- Smaller engineering teams (~5 engineers) are using the platform daily in stand-ups
Learn More
🌐 Visit workweave.dev to learn more.
✨ Are you running an engineering team of any size? You can connect your repo to Weave and get started in ~30 seconds here.
🤝 Do you know any engineering leaders at mid-market or enterprise companies who might find this valuable? They would love an intro — reach out here.
⭐ Give Weave a star on GitHub.
👣 Follow Weave on LinkedIn & X.
Simplify Startup Finances Today
Take the stress out of bookkeeping, taxes, and tax credits with Fondo’s all-in-one accounting platform built for startups. Start saving time and money with our expert-backed solutions.
