
Your Company Buys AI. That's Not Playing Offense.

Alfred Lin says the risk is moving too slowly. But buying AI tools isn't velocity — it's checking the box. What you need before you can actually play offense.


Ricardo Argüello


CEO & Founder

Business Strategy · 9 min read

A lot of companies think they are playing offense with AI right now just because they are spending money. They buy a few licenses, spin up an “innovation committee,” launch a couple of pilots. It looks like momentum — but the day-to-day reality of the business usually has not changed at all.

A couple of days ago, Alfred Lin of Sequoia Capital — the same person who sits on the boards of Airbnb and DoorDash — shared a message that a CEO from his portfolio sent to their team. The core thesis: “The biggest risk is no longer making the wrong decision. It is moving too slowly while the world moves around you.” He laid out two paths. You can play defense — protecting your turf and waiting to see how the market shakes out. Or you can play offense, which means using these new tools to aggressively rethink your old strategies before the market does it for you.

Lin is right. But here is what I keep seeing companies get wrong: they hear “play offense” and immediately start buying AI tools. That is not offense. That is the corporate version of checking a box. The actual difference between offense and defense has nothing to do with how many tools you own — it is whether you built anything underneath them.

Further down the thread, Samir Qamar made a great distinction between speed and velocity: only the latter includes direction. That is exactly the issue most enterprises are running into right now.

Speed without direction is waste

There is a massive difference between just moving fast and actually heading somewhere. In enterprise AI, we see a lot of companies doing the former — hoarding tools without building any real capability to use them.

I say this with 25 years of building enterprise software behind me: most companies I encounter have AI speed. They buy licenses, set up demos, present pilots to the board. But speed without direction produces a pattern that is already predictable.

Gartner estimated that 30% of generative AI projects would be abandoned after proof of concept. Not for lack of ambition or budget, but because the company moved fast on the purchase and never built the road for delivery: fragmented data, disconnected systems, teams that do not know how to iterate with the tool.

And yet, McKinsey documented productivity gains of 20-60% in companies that integrate AI into their workflows. The gap between that failure and that success has nothing to do with which model you picked or how much you spent. It comes down to whether the AI actually connects to how the company operates day to day.

If your company recognizes any of these symptoms, you have speed without velocity:

  • You have bought the tools, but nothing works differently. Your team might have access to half a dozen AI platforms, but if you look closely, people are still making decisions and delivering client work exactly the way they did last year.
  • You are stuck in “eternal demo” mode. It is easy to make a pilot look great in a quarterly board deck. It is much harder to make it work once it touches your actual legacy systems. If you are launching a new pilot every quarter but nothing is actually operational, you are spinning your wheels.
  • You know your OpenAI bill, but not your ROI. The monthly invoice is crystal clear. What nobody in the company can tell you is which business metric actually improved because of it.

The four foundations of real velocity

So what does “playing offense” actually look like? It starts with building the foundations that make every tool you adopt productive from day one. There are four, and the order matters.

Data infrastructure

Upgrading your AI model will not save you if the underlying data is a mess. If ~40% of your CRM records are outdated, plugging in the newest LLM just means you get bad analysis delivered with higher confidence. And if your support data lives in three systems that do not talk to each other, no AI agent is going to resolve tickets that need context from all three.
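That "40% outdated" figure is measurable before you buy anything. Here is a minimal sketch of a staleness audit; the record layout, field names, and one-year threshold are all assumptions for illustration, and in practice the records would come from your CRM's API or a database export rather than a hard-coded list:

```python
from datetime import datetime, timedelta

# Hypothetical CRM export. In a real audit these rows would come from
# your CRM's API or a database query; field names here are invented.
records = [
    {"id": 1, "email": "a@example.com", "last_updated": "2023-01-15"},
    {"id": 2, "email": "b@example.com", "last_updated": "2025-11-02"},
    {"id": 3, "email": "c@example.com", "last_updated": "2022-06-30"},
]

def stale_ratio(records, as_of, max_age_days=365):
    """Fraction of records not touched within max_age_days of as_of."""
    cutoff = as_of - timedelta(days=max_age_days)
    stale = [
        r for r in records
        if datetime.strptime(r["last_updated"], "%Y-%m-%d") < cutoff
    ]
    return len(stale) / len(records)

ratio = stale_ratio(records, as_of=datetime(2026, 1, 1))
print(f"{ratio:.0%} of CRM records are stale")
```

Run this against a real export and you have a baseline number to put in front of the board, which is a far more honest starting point than a model comparison.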

Clean data appreciates with every model generation. As I detailed in the post about AI investments with expiration dates, what ages well is not the technique or the model — it is the infrastructure underneath. Companies that invest in cleaning and connecting their data today will benefit from every model improvement for the next five years without starting over each time.

Integration layers

If your AI tool is not wired into the actual operation, what you have is a demo with a monthly invoice. You need functional APIs, configured webhooks, real connections to CRM, ERP, and support systems. When an AI agent can read a ticket, query the customer history, and update the status in the same system where your team works, that is integrated AI. When the team copies and pastes between ChatGPT and a spreadsheet, that is a demo with a monthly license.

Not the most exciting topic at a conference, but the integration layer is what makes everything else work. Without it, every new tool turns into an island — your team ends up copying and pasting between systems, manually bridging gaps that should not exist. Once the plumbing is in place, plugging in a new capability takes days instead of months because the connections are already there.
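What "wired in" means can be sketched in a few lines. Everything below is a stand-in: plain dicts play the role of the ticketing system and the CRM, and `draft_reply` stands in for the model call. The shape is the point, though — one function that reads the ticket, pulls customer context, and updates status in the same system the team already uses:

```python
# Minimal sketch of an "integrated" agent loop. TICKETS and CRM are
# stand-ins for real systems; in production each would be an API client.
TICKETS = {"T-101": {"status": "open", "customer": "C-7",
                     "text": "Invoice 4412 was charged twice"}}
CRM = {"C-7": {"plan": "enterprise", "open_invoices": ["4412"]}}

def draft_reply(ticket, history):
    # Stand-in for the LLM call: the key is that the model sees the
    # ticket *and* the customer context in a single prompt.
    return (f"Customer on {history['plan']} plan reports: {ticket['text']}. "
            f"Relevant open invoices: {history['open_invoices']}.")

def handle(ticket_id):
    ticket = TICKETS[ticket_id]            # 1. read the ticket
    history = CRM[ticket["customer"]]      # 2. query customer history
    reply = draft_reply(ticket, history)   # 3. generate with full context
    ticket["status"] = "pending_review"    # 4. update where the team works
    return reply

print(handle("T-101"))
```

The copy-paste version of this workflow has the same three pieces of information, but a human carries them between systems by hand. The integration layer is whatever lets step 2 and step 4 happen without that human courier.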

Team AI fluency

Giving the team access is not the same as giving them capability. The AI fluency gap we documented tells a clear story: teams that refine prompts, question outputs, and try different approaches get roughly twice the value out of the same tool as teams that accept whatever comes back first. Same license, wildly different results — and the gap comes down entirely to how well people know how to work with the tool.

You cannot fix this with a quick workshop. Getting real value from AI requires daily practice — your team has to integrate it into their actual workflows, treating the tool like a sounding board rather than a magic 8-ball that gives final answers.

Strategic clarity

The question nobody wants to answer: which problems are you going to say no to? The conviction cycle we described applies directly here — you write down your assumptions, test them against reality, then commit or walk away. Without that filter, “offense” turns into dispersion: ten pilots running at once, none with enough depth to produce anything meaningful.

Companies that play offense say no more often than yes. They choose three problems, go deep, measure results, and then decide whether to scale or discard. That requires a discipline that speed does not demand but velocity does.

What separates companies that actually play offense

We have worked with B2B companies at different stages of AI maturity, and the contrast between the ones that move forward and the ones that stall is fairly obvious.

| | Speed companies | Velocity companies |
|---|---|---|
| Tools | 5+ platforms, fragmented use | 2-3, connected to real workflows |
| PoCs | New ones every quarter, few reach production | Few, but each with a defined integration path |
| Team | Tool access, no training | Daily practice, iteration as habit |
| Data | Fragmented across systems | Clean pipeline, accessible, documented |
| Metric | "We are using AI" | "This process costs 30% less since we integrated AI" |
| When a new model drops | Start from zero | Plug it into existing infrastructure |

Eytan Starkman, my co-founder at IQ Source, summarizes it in a line I repeat in every client conversation: “The companies that arrive ready to move fast are never the ones that bought the most tools. They are the ones that did the foundation work first.”

This is why foundation work matters. If your data is clean and your teams know how to iterate, you do not have to start from scratch every time a new LLM drops. You just plug the new capability into your existing plumbing and keep moving.

Companies without those foundations restart every time the model or the tool changes. And restarting looks a lot like standing still — which is exactly the risk that Lin’s portfolio CEO described.

Three actions for this quarter

You do not need all four foundations to be perfect before you move. But you do need direction before velocity. These three actions can be executed in one quarter and tell you exactly where you stand.

Audit your AI tools against outcomes. List every AI tool your company pays for. Next to each one, write the measurable business outcome it produces. If you cannot write a concrete metric, that tool is spending, not investment. Consolidate or eliminate the ones that do not produce. The clarity this exercise generates is more valuable than any new tool.
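The audit itself is almost embarrassingly simple, which is the point. A toy sketch, where every tool name, cost, and metric below is invented: each tool either has a measurable outcome attached or it does not, and the ones that do not are your consolidation list.

```python
# Toy audit: a tool with no outcome attached is spending, not investment.
# Names, costs, and metrics are made up for illustration.
tools = [
    {"name": "ChatGPT Enterprise", "monthly_cost": 3000,
     "outcome": "support first-response time down 35%"},
    {"name": "AI meeting summarizer", "monthly_cost": 800, "outcome": None},
    {"name": "Code assistant", "monthly_cost": 1500,
     "outcome": "PR cycle time down 20%"},
]

def audit(tools):
    investment = [t for t in tools if t["outcome"]]
    spending = [t for t in tools if not t["outcome"]]
    return investment, spending

investment, spending = audit(tools)
unjustified = sum(t["monthly_cost"] for t in spending)
print(f"{len(spending)} tool(s), ${unjustified}/month, with no metric attached")
```

The hard part is not the arithmetic; it is forcing someone to write an honest value, or an honest `None`, in the outcome column.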

Go deep on a single process. We often see companies spinning up ten different surface-level pilots at once. It is a waste of time. Find one complete workflow and build out the data pipeline, the integrations, and the training for it. Our AI-by-department guide details how to get from pilot to production in 90 days. Nailing one actual process moves the needle; juggling ten demos does not.

Define your “no” list. Write down which AI applications your company will not pursue this quarter. This is harder than it sounds, because saying no to a shiny pilot feels like losing ground. But the companies with real velocity got there by being ruthless about focus — not by running ten experiments at once and hoping one sticks.


That CEO’s advice to “stay on the front foot” is exactly right. Just remember that being on the front foot means having a deliberate strategy for where you are going, not rushing to buy whatever tool dropped this week.

If your company has AI motion but you are not sure it has direction, send us a paragraph describing your current initiatives. We classify each one: velocity (directional, compounding) or speed (no foundations, non-compounding). We send back a one-page diagnostic.

Request a velocity diagnostic

