What is a color review and why do most teams get it wrong?
A color review is a structured internal evaluation of a proposal draft at a specific stage of development. The idea comes from the federal proposal management discipline, and it has become standard practice across government contracting. The problem is that most teams use the name without the discipline. They schedule a "Red Team" because the capture plan says to, then hand reviewers a 200-page incomplete draft with no instructions and expect something useful to come out of it.
What comes out is usually a pile of comments about typos, formatting preferences, and complaints that the draft is "too long" or "not compelling enough." None of that helps a writer do anything on Monday morning.
The failure is almost never the reviewers' fault. It is a planning failure. Color reviews require a defined charter, the right reviewers for that stage, clear evaluation criteria, and a feedback format that maps comments to specific requirements. Without those four things, you are running a reading group, not a quality review.
What is a Pink Team and what should reviewers look for?
The Pink Team is the earliest structured review, typically held when you have an outline, a draft solution concept, or a first pass at the executive summary and win themes. The draft is incomplete by design - you are evaluating strategy, not prose.
Pink Team reviewers should be asking:
- Does the proposal respond to what the customer actually asked for? Not what we think they want, but what is in the RFP.
- Are the win themes grounded in real discriminators, or are they generic claims any competitor could make?
- Is the solution technically feasible given the performance work statement and period of performance?
- Are we compliant with the proposal instructions? Are we organized per Section L?
Pink Team is not the time to edit sentences. Reviewers who start marking up prose at this stage waste their own time, because half of what they edit will be rewritten anyway. The charter for Pink Team should explicitly prohibit line editing.
For IDIQ vehicles and task order proposals, Pink Team is especially important because teams often default to copy-pasting from prior submissions. The Pink Team's job is to verify that the approach actually fits this specific task order, this specific scope, and this customer's current priorities. Generic solutions lose task orders.
What is a Red Team and what does a useful Red Team look like?
Red Team is the major review. It happens when you have a complete or near-complete draft - all sections written, all graphics drafted, pricing in progress. The Red Team should evaluate the proposal as an evaluator would.
The most common Red Team mistake is using internal people who already know the solution. When someone has been writing sections for three weeks, they cannot read the draft as a stranger. The best Red Teams bring in people who have not been close to the capture - ideally people with proposal evaluation experience, subject matter depth in the relevant domain, or both.
Red Team reviewers should score the proposal using the actual evaluation criteria from the RFP. If the RFP says Technical Approach is evaluated on understanding of requirements, soundness of approach, and risk mitigation, reviewers should score each section on exactly those dimensions. Not on whether they liked the writing.
A Red Team report that says "Section 2.1 is weak" is useless. A Red Team report that says "Section 2.1 does not address the requirement in SOW paragraph 3.4.2 for a data quality plan, and it does not demonstrate understanding of the agency's existing data architecture" is something a writer can act on in 48 hours.
Red Team feedback should be documented at the section level, tied to specific RFP references, and prioritized by severity. Letting reviewers mark up PDFs with free-form comments creates more work than it saves. A structured scoring sheet forces discipline.
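A structured sheet amounts to a few required fields per comment. A minimal sketch in Python of what such a record might enforce (the field names and severity labels here are illustrative, not an industry standard):

```python
from dataclasses import dataclass

# Highest to lowest priority; labels are this article's triage categories.
SEVERITIES = ("compliance", "discriminator", "clarity")

@dataclass
class ReviewComment:
    section: str        # proposal section, e.g. "2.1"
    rfp_reference: str  # requirement the comment ties to, e.g. "SOW 3.4.2"
    severity: str       # one of SEVERITIES
    finding: str        # what is missing or wrong
    suggestion: str     # what a better response would look like

    def __post_init__(self):
        # Reject free-form comments: every entry must cite a requirement
        # and carry a recognized severity, or it is not actionable.
        if self.severity not in SEVERITIES:
            raise ValueError(f"unknown severity: {self.severity}")
        if not self.rfp_reference:
            raise ValueError("every comment must cite an RFP reference")

c = ReviewComment("2.1", "SOW 3.4.2", "compliance",
                  "No data quality plan",
                  "Add a data quality plan addressing SOW 3.4.2")
```

The point of the validation is that a comment with no RFP reference simply cannot be entered, which pushes reviewers toward the "Section 2.1 does not address SOW 3.4.2" style of finding rather than "Section 2.1 is weak."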
What is a Gold Team and when does it happen?
Gold Team is the final senior review before submission. By Gold Team, the draft should be essentially complete - compliant, narrative polished, graphics final, and price integrated with technical. Gold Team is a management and pricing review, not a writing review.
Gold Team participants are typically senior leadership: capture manager, BD director, program executive, pricing lead, and any partners or teaming leads whose sections need approval. The purpose is to confirm:
- The overall strategy is consistent from cover to close
- Price is competitive and defensible given the technical approach
- Risk items have been resolved or accepted at the right level
- No compliance gaps remain
- The proposal is ready to submit as-is
If Gold Team identifies major technical problems, something went wrong earlier in the process. Gold Team is not a safety net for a failed Red Team. When Gold Team becomes a rewrite session, the proposal is usually submitted late or submitted in poor condition.
On competitive procurements above $50M, Gold Team should include at least one person who has evaluated proposals at the agency or a similar agency. That perspective - what an evaluator actually sees and scores - is different from what a proposal manager sees.
How do you give feedback that writers can actually use?
The format of review feedback matters as much as the content. The goal is comments that a writer can resolve without needing to call the reviewer to find out what they meant.
Each comment should answer three questions: What is missing or wrong? Where in the RFP does this requirement appear? What would a better response look like?
"This section needs more detail" answers none of those. "This section does not address the staffing methodology required by RFP Section L.4.2. Add a paragraph explaining how the contractor will identify, vet, and onboard staff within the contract's required 30-day timeline" answers all three.
Feedback should be categorized. Compliance issues - things that will get you marked non-compliant or unacceptable - go to the top of the list. Discriminator gaps - places where you are not making the case for why you win - come next. Clarity and prose issues come last. If a writer has to choose between fixing a compliance gap and rewording an introduction, the compliance gap wins every time.
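The triage order above reduces to a simple sort key. A sketch, using this article's three category names (the dictionary-based comment shape is an assumption for illustration):

```python
# Lower number = fix first. Compliance gaps outrank everything else.
PRIORITY = {"compliance": 0, "discriminator": 1, "clarity": 2}

def triage(comments):
    """Order review comments so compliance gaps surface at the top of the list."""
    return sorted(comments, key=lambda c: PRIORITY[c["category"]])

comments = [
    {"id": 1, "category": "clarity", "note": "Introduction is wordy"},
    {"id": 2, "category": "compliance", "note": "Missing L.4.2 staffing methodology"},
    {"id": 3, "category": "discriminator", "note": "Win theme is generic"},
]

# The compliance comment sorts to the top regardless of arrival order.
assert triage(comments)[0]["id"] == 2
```

Because `sorted` is stable, comments within the same category keep the order reviewers submitted them in, which is usually the order of the sections themselves.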
What are the most common color review mistakes?
Scheduling without planning is the most common. Teams put "Pink Team" on the calendar but do not assign a reviewer charter, evaluation criteria, or feedback template. Reviewers show up and improvise.
Mixing review stages is the second most common. Running a Red Team on a 30% complete draft means reviewers are compensating for gaps in the narrative rather than evaluating the solution. You get the worst of both reviews and the value of neither.
Ignoring the proposal instructions is a quiet killer. Some teams spend hours reviewing technical narrative while missing that their proposal exceeds the page limit by 40 pages, or that their volume structure does not match what Section L requires. The compliance matrix review should happen before any color review.
Not closing the loop is how the process breaks down. Writers receive a 90-comment Red Team report and nobody tracks which comments were addressed, which were rejected, and why. The next review has no baseline to compare against. On large proposals, comment resolution should be tracked in a spreadsheet or tool, not managed from memory.
Finally, bringing the wrong people to each review. The Pink Team needs strategy thinkers. The Red Team needs evaluator-level subject matter experts. Gold Team needs decision-makers with pricing authority. Sending the same five people to all three reviews means you get the same blind spots at every stage.
FAQ
Q: How many reviewers should be on each color team? A: Pink and Red Teams typically run best with 3 to 5 reviewers. Below three, you miss perspective diversity. Above five, coordinating feedback becomes its own management problem. Gold Team is usually smaller - 2 to 4 senior stakeholders with decision authority.
Q: Can we skip a color review on a small task order? A: On a task order under $5M with a short turnaround, a formal Pink and Red Team may not be practical. But you should still have at least one structured peer review using a checklist tied to the evaluation criteria. Even a 30-minute review with the right questions beats none.
Q: What should the Pink Team charter include? A: The charter should identify reviewers, the specific sections being reviewed, the evaluation questions they should answer, the feedback format, and the deadline for comments. One page is enough. Without a charter, reviewers default to whatever seems important to them.
Q: How long should reviewers have to complete their review? A: Red Team reviewers typically need 3 to 5 business days for a complete proposal. Giving reviewers 24 hours on a large volume produces shallow feedback. Build reviewer time into the schedule from the start - it is not a detail to figure out later.
Q: What happens when Gold Team wants major changes two days before submission? A: This is a real scenario. The capture manager and volume leads have to make a fast judgment call about what is feasible, what improves the proposal, and what introduces new errors under deadline pressure. Partial rewrites done in 12 hours often create more problems than they solve. Gold Team should not be the venue for new ideas.
Q: How do we handle reviewers who dominate or derail the debrief? A: Written feedback submitted before the debrief session prevents this. When comments are already documented, the debrief becomes a prioritization and clarification conversation rather than an open brainstorm. The proposal manager should chair the debrief and keep the agenda to comment resolution, not new strategy.