Collaborative Capability Development
Scaling Golden Dome's Model For Multi-Vendor Developed Solutions
Acquisition professionals traditionally only see two models for delivering a complex weapons system.
Prime Integrator: Award to a prime who selects the key suppliers and with government oversight delivers an integrated solution.
Independent Integrator: Award is made to multiple vendors with each providing a piece of the capability and the Government acts as integrator or hires a third-party as integrator to deliver a full solution.
There is a third path that gives the government greater flexibility: it keeps best-of-breed providers delivering the latest technology while mitigating the risk inherent in the government acting as integrator, or outsourcing that risk to a usually less capable company.
The third path can be called Collaborative Integrator. In this approach, the government awards contracts to multiple vendors, each responsible for a specific portion of a broader capability. Rather than relying on a single prime, the government manages the effort by coordinating contributions across vendors, maintaining regular updates, enforcing a common data architecture and incentivizing teamwork.
Golden Dome Example
This collaborative approach is now being actively used to build the foundational C2 layer for Golden Dome in what is being termed a “software development consortium.” The companies forming the consortium include Palantir, Anduril Industries, Aalyria Technologies, Scale AI and Swoop Technologies. It also includes traditional defense contractors Lockheed Martin, Northrop Grumman and RTX, although it is understood that they are operating in more of a support role and were added later.
The new consortium has already conducted a live demonstration of its capability and, according to the Golden Dome Direct Reporting Program Manager, Gen. Michael Guetlein, is on track to deliver an operational capability by 2028.
The way Gen. Guetlein has organized the different contractors and constructed the program governance is unique. As he described it:
“They operate as a unit. They decide what they’re going to build, when they’re going to build it, how they’re going to build it, and who the best athlete among them is to build it. Then they hold themselves accountable on a weekly, biweekly basis. If at any point during that week, one of them did not carry their load, they can vote that individual off the island (adding that his chief engineer makes the final call). The government owns the technical baseline, but the group of nine partners are operating independently of each other and holding each other accountable through peer pressure, if you will, to perform. So far, it’s working phenomenal.”
Industry Consortium Conducts Live C2 Demonstration for Golden Dome
The team briefs Gen. Guetlein and his staff at least once a week to gauge progress and ensure that IOC remains on track. He also works collaboratively with the industry consortium to resolve logjams above their level and ensure that bureaucracy does not inject unnecessary delays into the aggressive schedule.
Paradigm Shift
This is a paradigm shift that, when executed effectively, provides the best of all worlds and addresses the inherent weaknesses of the other models.
The Prime Integrator
This model can work effectively and is usually the approach desired by most companies. It gives them the most control and provides them the ability to make technical, cost and schedule trades internally with minimal government involvement (as long as it doesn’t impact contractual requirements). However, the weaknesses often include:
Stifling of competition as subcontractors are usually locked-in (sometimes with long-term agreements that provide the primes with stable pricing) with little room for new entrants.
Performance (using the best tech) and accountability can vary widely depending on the level of knowledgeable government oversight, the contract incentives and the company’s tech scouting capabilities.
The Independent Integrator
This model has been the most widely used approach in recent years as government acquirers attempted to identify products and vendors whose collective capabilities could be synthesized. In some cases, they employed government-led teams (the Kessel Run approach) while in others, they hired third party integrators to execute the integration functions (the ABMS model). However, the weaknesses often include:
Independent integrators have low accountability, as they are really providing “best effort,” which is often reinforced by the use of cost-type contracts.
Given that they usually don’t own any of the systems, third-party integrators often don’t understand the technology space as well (how could they?), so they lack the ability to screen low-performers until well into execution or they sub-optimize the integration if the tech turns out to be mediocre.
The government does have the ability to offload non-performers under this model, but that rarely happens, and when it does, it’s often too late in the game.
The Collaborative Integrator
This model more directly addresses the incentives that have plagued some of these other approaches by leveraging a few key practices that are built into the framework.
Continuous Competition. Competition doesn’t end at contract award or happen only at key points. Rather, vendors must consistently perform to remain part of the effort, which helps maintain high performance standards, encourage innovation and prevent complacency.
Specialized Capabilities. Instead of relying on a single provider to deliver everything or picking vendors based only on government expertise, the government benefits from being able to identify their preferred vendors while also leveraging the experience and insights of the broader consortium. This is especially valuable in highly dynamic tech areas where expertise is specialized.
Frequent Feedback. Vendors are subject to frequent testing and evaluation cycles that provide the government direct feedback on which companies are performing and which are falling short. Because underperformers can be offboarded, the rest of the consortium can keep the configuration stable and help find the right replacement to keep progress on track.
Modern Development. This model reflects how high-performing companies build complex systems today using incremental deliveries, rapid feedback loops, and continuous improvement rather than long development cycles.
There’s History
This is not an entirely new model, just one not often employed in DoD/DoW.
Project Maven used a collaborative, iterative model with vendors where the government orchestrated the ecosystem and worked with vendors to define problems, refine data, and improve models together with Palantir (who provided data integration and analytics platforms), AWS and Microsoft (who enabled scalable compute and cloud storage), L3Harris and Maxar (who contributed imagery and ISR experience) and others. Instead of a single monolithic defense contractor, Maven operated as a kind of modular consortium—lean, flexible, and fast. Maven has been an incredible success judging by its extensive use across all combatant commands and its designation as a formal program of record.
C2, Battle Management, and Communications (C2BMC) Program. This program is run by the Missile Defense Agency and was initially managed using a “National Team” construct with Lockheed Martin and Northrop Grumman as co-partners. Eventually Lockheed assumed the lead and now serves as the integrating prime. The program has had success against a technically challenging set of requirements (for its time), albeit at significantly greater cost and time.
Space Development Agency. SDA implemented a collaborative contracting model through its tranche-based approach to building a proliferated LEO satellite network. Rather than relying on a single prime contractor, SDA awarded work to multiple vendors for different capabilities across its transport and tracking layers. While the program did not use the exact same approach as Golden Dome, it did try a new model that could have achieved the same outcomes. Unfortunately, SDA was not organized as well as it could have been, which resulted in delayed fielding.
The Eurofighter. The Eurofighter programme set out to build the world’s most advanced swing-role combat aircraft. Its consortium had four partner nations (UK, Germany, Spain and Italy), three primary companies (Airbus, BAE Systems and Leonardo) and over 400 contributing companies. It achieved success in terms of performance, maintaining European componentry and exporting to other nations (with ~770 orders). While the multi-nation consortium introduced complexities that likely drove cost growth (workshare agreements), schedule delays (multi-nation decision-making) and governance complexity (shifting requirements), it was overall deemed a success in meeting its key goals.
Alliance Contracting, or Integrated Project Delivery (as termed by McKinsey & Co), is a model in which owners, contractors and engineers are integrated into a single contract. It has been heralded as a cure for what ails contracting approaches in both government and industry, and it is used to great success by many large firms in other industries, such as retail, healthcare and financial services.
When to Use
The one clear case against this model is programs with stable, well-defined requirements, where it is likely to be more hassle than help. Given the increasingly dynamic state of threats and technology, this model will become more applicable (in varying degrees) across all domains and challenges.
Today, it is best employed when dealing with rapidly evolving technologies such as AI, autonomy, software-defined systems, space, and cyber. These areas have requirements that are not fully knowable upfront and will evolve with increased use of new solutions as their potential is better understood.
This model also works best when:
specialized expertise is fragmented across multiple vendors and where it is unlikely for one vendor to be “best of breed” in all areas.
performance is hard to specify but easy to measure using frequent tests, evaluations or demos. This also helps support continuous competition.
modular architectures are feasible where the system can be broken into interchangeable components (software, subsystems, payloads). This enables multi-vendor participation without assuming overly high integration risk.
speed matters more than perfect upfront design. This model is ideal for “field early capability and iterate” style strategies.
the government wants to shape, not just select, the industrial base that it needs to solve the particular challenge. This is helpful when the government wants to enlist help from less traditional sources.
the government program office has a very deep understanding of the problem set and of current solutions (where they worked and where they fell short), along with a reasonable understanding of the technology’s potential. The program office does not need deep expertise in every technology area, since the demonstration process mitigates that gap.
the contracting office has the knowledge and courage to write adaptive contracts that can drive the right incentives without being overly inflexible.
Potential Downsides and Risks
The collaborative integrator model carries a different set of risks that must be managed.
Increased Government Integration Responsibility. Without a single prime contractor, the government must take a more active role in ensuring that all components work together. With a team of high-performing vendors incentivized to cooperate, there is less technical work for the government to manage; however, this model still requires strong government leadership and insight into how coordination across vendors and integration planning are coalescing, especially leading up to key demonstration events.
Diffused Accountability. With multiple contributors, it can be harder to assign responsibility when issues arise, particularly when problems span interdependent components. The government needs a strong understanding of where the boundary lines sit across vendors and, in demonstration planning, must ensure that each vendor is being challenged appropriately.
Vendor Turnover Challenges. Replacing vendors can improve performance but may also introduce transition delays, knowledge gaps and a storming phase as onboarding occurs within a high-trust vendor team. The government should have a plan for managing each vendor’s transition off the team so the process is as seamless and minimally disruptive as possible.
Strategy & Execution Constraints. Not all acquisition structures and personnel are designed or prepared for frequent vendor changes, iterative delivery cycles and flexible scope adjustments. Programs may need to carefully design their acquisition strategy and contracting approach to enable this model…and most importantly select the government managers very carefully.
Key Steps to Implement the Model
The bottom line is that there are certain elements that should be in place when using this model.
Break the Capability into Manageable Pieces
Define the system in terms of discrete components or functions that can be developed independently. Clear boundaries make it easier to assign work across multiple vendors.
Establish a Strong Coordination Function
Designate a team responsible for managing vendor contributions, ensuring alignment across efforts and overseeing integration and delivery. This role is critical to program success. Importantly, this role is one of coordination not micromanaged oversight. The goal should be to step in only when necessary.
Choose Flexible Contracting Approaches
Use contracting mechanisms that allow for multiple awards, support vendor on-ramps and off-ramps, and enable adjustments based on performance. A small, managed multi-award IDIQ may be the right vehicle for this approach, but to get the full flexibility required, an Other Transaction agreement is likely best.
Implement Regular Evaluation Cycles
Set up structured opportunities to regularly test vendor outputs, compare performance and gather user feedback. Use these cycles to inform decisions about continuing, expanding, or replacing vendors.
Define Clear Performance Metrics
Establish objective criteria for evaluating vendors, such as technical performance, delivery timelines and the ability to easily integrate with other components. The government should set challenging performance benchmarks with ambitious deadlines to keep the team motivated. Transparent metrics support fair and defensible decisions for awarding continued work.
Plan for Change from the Start
Expect that vendors may enter and exit over time. Build processes to onboard new vendors quickly, transition work smoothly and maintain continuity across development and test cycles.
Align Funding with Iterative Delivery
Structure funding to support incremental progress, scaling of successful components and requirement adjustments in response to real-world events.
Final Word
A collaborative contracting model provides a valuable addition to the program manager’s toolkit of options for managing highly complex technology efforts. The collaborative contracting model represents a shift from static program structures to dynamic, performance-driven ecosystems. It acknowledges that no single vendor is likely to be best at everything and that the best outcomes often come from combining strengths across multiple contributors. This model incurs different risks that must be managed, but with the right conditions offers huge opportunities.
This approach is not without challenges. It demands stronger government leadership, more deliberate understanding and coordination, and thoughtful acquisition design. It also requires a highly motivated government team that is willing to break obstacles to execution with ruthless precision. When executed well, it can deliver faster capability development, higher-quality outcomes and greater adaptability over time.
We look forward to seeing the Golden Dome team achieve great success under Gen Guetlein’s leadership using this approach. Please let us know when you have seen or executed a program using this approach and what you learned in the process.
This is a really important shift. But it also creates a new kind of risk: when responsibility is distributed across vendors, it becomes harder to see where decisions actually live, and who carries their consequences over time. Continuous competition and frequent evaluation improve performance, but they don’t solve what happens after decisions are made: how they are tracked, how they are challenged, and how learning actually changes the system without destabilizing it. That layer still seems structurally missing. And without it, we risk building highly adaptive systems that are difficult to hold accountable in practice.
Do we believe the government has enough people with the requisite passion, intellect, and energy to make this happen at scale? All the success stories here and elsewhere seem to assume a ‘workforce in waiting’ with the skills and attributes to make this happen — but never ask “if that’s so, why hasn’t it happened yet?”