This week marks an important shift.
The central question for this sleeve is no longer:
“Will AI and autonomy integrate into defence procurement?”
It is now:
“Who becomes embedded infrastructure — and who doesn’t?”
That distinction changes how risk is read.
1. AI Is No Longer a Pilot Programme
Our reading of FY2026 defence budgets confirms that AI and autonomy funding is now institutionalised.
This matters.
Dedicated line items mean:
- procurement continuity
- architectural embedding
- upgrade cycles
- repeatable sustainment revenue
We are no longer observing experimental adoption.
We are observing system integration.
That lowers adoption risk.
2. Lock-In Conditions Are Forming
Several structural dynamics are now visible:
- enterprise AI frameworks embedded into command systems
- federated data architectures hardening
- certification and retraining costs rising
- iterative upgrade clauses appearing in procurement contracts
These create switching costs.
Switching costs create duration.
Duration creates concentration.
The cycle is no longer purely about growth. It is about entrenchment.
3. What Risk Looks Like Now
As adoption risk declines, the risk profile shifts.
The primary risk is no longer “AI fails to integrate.”
It is:
- vendor concentration
- execution missteps
- political scrutiny around dominant platforms
This sleeve was designed to tolerate volatility. It was not designed to avoid concentration.
Concentration is now a feature of the environment.
4. Capital Posture
No changes this week.
Structure remains:
- Command-layer concentration intact
- Autonomy exposure maintained
- Cyber exposure steady
- Compute exposure unchanged
- Prime ballast retained
- Treasury buffer intact
This is not inertia; it reflects the absence of any active structural downgrade trigger.
Until budgets reverse or procurement language reverts to pilot-only, resizing would be premature.
5. What Would Change This View
We would reconsider posture if we saw:
- removal of AI as a discrete defence budget category
- cancellation of integration programmes without replacement
- procurement language reverting to experimental framing
- regulatory intervention halting deployment
None of those conditions are present.
Closing
The Cabal is moving from an early-integration phase to an early structural lock-in phase.
That reduces adoption uncertainty.
It increases concentration risk.
For now, the correct stance is unchanged:
Hold concentration.
Avoid dilution.
Ignore hype.
Watch procurement.
The experiment is no longer about whether AI enters defence.
It is about who controls it once it does.