RFP Screener & Go/No-Go AI

Stop Wasting BD Resources on Losing Bids

AI-powered RFP screening and Go/No-Go scoring that evaluates every opportunity's win probability in under 10 minutes.

Ready to stop wasting BD resources on losing bids?

Book a Free Consultation

The single most destructive behavior in federal business development is responding to RFPs you cannot win. The cost is not just the time spent writing — it is the opportunity cost of proposals you didn't write, the team morale erosion from repeated losses, and the distorted win-rate data that leads to progressively worse bid decisions.

The Hawary AI Federal RFP Screener & Go/No-Go Engine exists to prevent this at the source. It is a senior-level federal procurement intelligence system that performs deep Fit-Gap analysis between a client's Capability Statement and an incoming RFP or Solicitation document. The output is a structured, evidence-based recommendation that exceeds 7,000 characters of expert-grade analysis — not a checklist, not a keyword match, not a summary.

The engine emulates the judgment of experienced federal contracting advisors: it analyzes business model first (self-performer vs. prime/manager vs. product provider), evaluates technical alignment from the correct role lens, distinguishes fatal flaws from mitigatable risks, and produces a definitive Go/No-Go decision with full supporting rationale.

Auto-Disqualifiers

The following conditions generate an automatic No-Go recommendation. The engine does not proceed to scoring analysis if any of these conditions is present. These are not risks; they are fatal flaws.

| Auto-Disqualifier | Condition | Rationale |
| --- | --- | --- |
| Missing Required Certification | RFP requires a specific set-aside status (8(a), SDVOSB, WOSB, HUBZone) that the client does not hold | Non-compliance is an absolute eligibility barrier; the proposal will be rejected as technically unacceptable |
| OCONUS Location | Place of Performance is outside the continental United States (overseas, APO/FPO, territories not served by the client) | Geographic disqualifier; the client cannot legally or operationally perform |
| Expired Submission Deadline | The solicitation deadline has already passed at the time of analysis | No proposal can be submitted; any further analysis is wasted effort |
| Expired Client SAM.gov Registration | The client's SAM.gov registration is expired or inactive | FAR 4.1102 requires active registration for award eligibility; an expired registration means an ineligible contractor |
| Geographic Mismatch | Place of Performance requires physical presence in a location where the client has no office, employees, or established local operations and cannot establish presence before performance start | Fatal operational barrier with no reasonable mitigation within the timeline |
| Insufficient Past Performance | RFP requires a minimum number of past performance references at a minimum contract value that the client cannot document | Evaluators will score Past Performance as "Unknown Confidence" or below threshold, eliminating competitive viability |
| Personnel License Requirements | RFP requires state-licensed professionals (Professional Engineers, Licensed Contractors, etc.) in a state where the client holds no current licenses | The client cannot legally perform the work; licensing cannot be obtained within the proposal timeline |
| Security Clearance Level | RFP requires facility or personnel clearances that the client does not hold and cannot obtain within the performance timeline | Performance is impossible without clearance; this cannot be mitigated |
| Bonding Capacity Exceeded | RFP requires performance or payment bonds that exceed the client's established bonding capacity with their surety | Financial disqualifier for construction contracts; the client cannot perform without the bond |
| NAICS Code Mismatch | The client's size exceeds the small business size standard under the required NAICS code, rendering them other-than-small for a small business set-aside | Size status misclassification; the proposal would be rejected upon verification |

Full Scoring Framework

Overall Strategic Assessment

The Overall Strategic Assessment is the top-level Go/No-Go determination derived from aggregate scoring across all dimensions. It uses a three-tier classification:

| Assessment Level | Alignment Score | Meaning | Recommendation |
| --- | --- | --- | --- |
| HIGH | >80% alignment | Strong strategic fit across most or all dimensions; client has a high win probability if the proposal is executed well | Go: full proposal development recommended |
| MEDIUM | 50–80% alignment | Moderate fit with identified gaps that are mitigatable with specific actions before submission | Conditional Go: address identified gaps, then proceed |
| LOW | <50% alignment | Fundamental misalignment in one or more critical dimensions; probability of a competitive win is low | No-Go: resources better deployed elsewhere |

The Overall Strategic Assessment narrative explains the alignment of the RFP to the client's strategic capabilities, industry focus, and business model. It references core competencies from the Capability Statement, past performance scale, contract type familiarity, and mission alignment with the issuing agency.
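The three-tier mapping above reduces to a simple threshold function. This Python sketch is illustrative only; the function name and return shape are assumptions, not the engine's actual interface:

```python
def assessment_tier(alignment_pct: float) -> tuple[str, str]:
    """Map an aggregate alignment percentage to the three-tier
    Overall Strategic Assessment described above (illustrative)."""
    if alignment_pct > 80:       # HIGH: >80% alignment
        return ("HIGH", "Go")
    if alignment_pct >= 50:      # MEDIUM: 50-80% alignment
        return ("MEDIUM", "Conditional Go")
    return ("LOW", "No-Go")      # LOW: <50% alignment
```

Note that exactly 80% falls in the MEDIUM band, since the table defines HIGH as strictly greater than 80%.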

Dimension 1: Technical & Scope of Work Alignment

What Is Evaluated:

A line-by-line analysis of the Statement of Work or Performance Work Statement against the client's documented capabilities. This is not a keyword match — it is a substantive evaluation of whether the client can actually perform each major task area.

Scoring Criteria:

  • Can the client perform each SOW task with their existing workforce and processes?

  • Do the client's service lines match the technical requirements (e.g., IT systems integration, construction management, environmental services, professional staffing)?

  • Are there task areas where the client has no experience or track record?

  • If subcontracting is required for certain tasks, does the client have established subcontractor relationships or the capacity to establish them?

  • Do the client's proposed management systems (QA/QC, project controls, reporting) match RFP expectations?

Evidence Sources:

The analysis cites specific SOW sections and matches them against specific CS service lines. Capabilities not documented in the CS are never assumed.

Common Findings:

  • Full alignment: Client services directly mirror SOW task areas with documented past performance

  • Partial alignment: Client covers 70–80% of SOW with gaps in 1–2 task areas manageable through subcontracting

  • Low alignment: Core SOW task areas are outside the client's documented service portfolio

Dimension 2: Personnel & Expertise Match

What Is Evaluated:

Comparison of RFP-required personnel roles and certifications against the client's documented key personnel.

Key Certifications Evaluated:

| Certification | Domain | Relevance |
| --- | --- | --- |
| PMP (Project Management Professional) | Project Management | Widely required for PM roles on federal contracts |
| PE (Professional Engineer) | Engineering/Construction | Required for design-build, infrastructure, and engineering contracts |
| CCM (Certified Construction Manager) | Construction Management | Required for CM-at-risk and construction management contracts |
| CISSP | Cybersecurity | Required for IT security and information assurance contracts |
| PMI-ACP | Agile Project Management | Required for software development and IT modernization contracts |
| CAPM | Project Coordination | Entry-level PM roles on federal contracts |
| CPA | Financial/Audit | Required for financial management and audit support contracts |
| LEED | Sustainable Construction | Required for green building and energy efficiency contracts |

Scoring Criteria:

  • Does the client have a qualified candidate for the Key Personnel roles named in the RFP?

  • Are required certifications current and verifiable from the CS?

  • Is there sufficient depth in the staffing plan to cover the required labor categories?

  • Are there personnel gaps that could be filled through teaming, subcontracting, or hiring within the proposal timeline?

Note: A missing candidate for a non-key role is a mitigatable risk (a recruitment plan can address it). A missing key person with a named certification requirement is a potential fatal flaw if no alternative is available.

Dimension 3: Compliance & Certification Status

What Is Evaluated:

Verification of mandatory certifications, licenses, socioeconomic status, and regulatory compliance requirements.

Socioeconomic Set-Aside Certifications:

| Certification | Program | Eligibility Requirements |
| --- | --- | --- |
| 8(a) | SBA 8(a) Business Development Program | Socially and economically disadvantaged individuals; SBA-certified; annual revenue limits apply |
| SDVOSB | Service-Disabled Veteran-Owned Small Business | 51%+ owned/controlled by a service-disabled veteran; SBA-certified (formerly verified through VA) |
| WOSB | Women-Owned Small Business | 51%+ owned/controlled by women; certified through SBA or an approved third-party certifier |
| HUBZone | Historically Underutilized Business Zone | 35%+ of employees reside in a HUBZone; principal office in a HUBZone; SBA-certified |
| SDB | Small Disadvantaged Business | Self-certified; used in evaluation preference calculations |

Scoring Logic:

  • If RFP is set aside exclusively for a specific program (e.g., 8(a) sole source), the client must hold that certification — this is an auto-disqualifier if missing

  • If RFP is a full and open competition with small business evaluation preference, the absence of a set-aside certification is a competitive disadvantage but not disqualifying

  • Holding a relevant set-aside certification when the RFP is set aside for that category is a significant competitive advantage that increases win probability substantially

Additional Compliance Items Reviewed:

  • SAM.gov active registration status

  • Applicable state contractor licenses

  • Insurance requirements (general liability minimums, professional liability, workers' comp)

  • Any agency-specific pre-qualification requirements (e.g., GSA Schedule, CIO-SP3 vehicle)

Dimension 4: Geographic & Logistical Feasibility

What Is Evaluated:

The client's ability to perform the work at the required Place of Performance.

Analysis Framework:

  • Local Operations (client has existing office/employees in the Place of Performance area): No geographic barrier; score as fully favorable

  • Nearby Operations (within reasonable commuting distance or short mobilization): Minor barrier; mitigatable through daily/weekly travel or satellite office establishment

  • Remote Performance with Travel Requirements: Moderate barrier; evaluate travel cost impact and personnel willingness to travel; may require subcontractor with local presence

  • No Geographic Connection: Significant barrier; client must establish local presence or teaming partner before performance start; evaluate feasibility within timeline

Important Distinction:

Lack of a local office is a mitigatable risk, not a disqualifier — unless the RFP explicitly requires a local office or the nature of the work (e.g., daily on-site presence at a specific facility) makes remote performance impossible. The engine does not conflate geographic inconvenience with geographic impossibility.

OCONUS Automatic Rejection:

Any Place of Performance outside the continental United States, including APO/FPO military addresses, overseas territories, and foreign locations, results in automatic disqualification unless the client has documented operational capacity in that specific location.

Dimension 5: Financial & Operational Scale

What Is Evaluated:

Whether the client has the financial capacity and operational infrastructure to perform the contract at the required scale.

Key Financial Metrics Evaluated:

| Factor | What Is Analyzed |
| --- | --- |
| Contract Value vs. Annual Revenue | A single contract should not exceed 30–40% of the client's annual revenue without exceptional financial capacity; larger ratios create performance risk |
| Bonding Capacity | For construction contracts: does the client's surety bonding line cover the required performance and payment bond amounts? |
| Staffing Scale | Can the client staff the contract at the required personnel levels (FTEs, headcount) within the mobilization period? |
| Cash Flow / Float | Government contracts typically pay Net-30 to Net-60; does the client have working capital to bridge payroll during this period? |
| Subcontractor Management | If subcontracting is required, does the client have experience and systems for managing subcontractor performance and payments? |
| Insurance Capacity | Are the client's current insurance limits sufficient to meet minimum requirements, or will new coverage need to be obtained? |

Scoring Logic:

  • A contract that is appropriately scaled for the client (consistent with existing contract portfolio size) scores high on this dimension

  • A contract significantly larger than the client's current portfolio is a moderate-to-high risk requiring explicit mitigation planning (teaming, credit facilities, performance bonds)

  • A contract that exceeds the client's financial capacity to perform is a fatal flaw
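As a worked illustration of the contract-value-to-revenue guideline, the following sketch applies the 30–40% breakpoints from the financial metrics table; the function name and risk labels are invented for this example:

```python
def scale_risk(contract_value: float, annual_revenue: float) -> str:
    """Flag performance risk when a single award is large relative to
    annual revenue (breakpoints follow the 30-40% guideline above)."""
    ratio = contract_value / annual_revenue
    if ratio <= 0.30:
        return "appropriately scaled"
    if ratio <= 0.40:
        return "moderate risk"   # mitigation planning recommended
    return "high risk"           # teaming / financing mitigation required
```

For example, a $2M pursuit against $5M in annual revenue (a 0.40 ratio) sits at the top of the moderate band.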

Dimension 6: Competitive Positioning

What Is Evaluated:

The client's strength relative to likely competitors within the set-aside category, based on available intelligence.

Analysis Components:

  • Incumbent Status: Is there a known incumbent contractor? Incumbents win 65–70% of recompetes. The analysis flags incumbent presence and assesses whether the client has differentiators strong enough to overcome this advantage.

  • Set-Aside Competition Pool: Within the specific set-aside category (8(a), SDVOSB, etc.), how many qualified competitors likely exist in this geography and NAICS?

  • Client Differentiators: What specific, documentable advantages does the client hold? (e.g., superior past performance ratings, unique technical certifications, local presence, lower overhead rates, specialized equipment)

  • Price Competitiveness: Based on the contract type (FFP, T&M, CPFF) and estimated value, is the client likely to be price-competitive?

  • Agency Relationship: Does the client have an existing relationship or past performance with this specific agency or office?

Common Competitive Scenarios:

| Scenario | Competitive Assessment |
| --- | --- |
| Client is incumbent on recompete | Very favorable; default recommendation to pursue |
| Client has strong past performance with same agency | Favorable; agency familiarity reduces evaluator uncertainty |
| Open competition, multiple qualified competitors | Neutral; win probability depends on proposal quality and price |
| Strong incumbent identified, no client relationship with agency | Unfavorable; client must have clear discriminators to justify pursuing |
| Sole-source or limited competition | Highly favorable if the client meets all requirements |

Dimension 7: Identified Gaps & Critical Risks

This dimension produces the structured risk register that separates the analysis from a simple yes/no scoring exercise.

Risk Classification:

| Category | Definition | Effect on Recommendation |
| --- | --- | --- |
| FATAL FLAW | A gap that cannot be remediated within the proposal timeline and would render the proposal non-compliant or non-competitive | Generates a No-Go recommendation regardless of other dimension scores |
| CRITICAL RISK | A gap that is serious but potentially mitigatable with immediate, documented action | Generates a Conditional Go with a required mitigation plan |
| MITIGATABLE RISK | A gap that can be addressed through standard proposal strategies (teaming, staffing plan, subcontracting, narrative explanation) | Does not affect the Go recommendation; requires documentation in the proposal |
| MINOR GAP | A difference between the client's current capabilities and the RFP requirements that is unlikely to affect win probability | Noted for awareness; no recommendation impact |

The Most Common Mitigatable Risks (never classified as fatal flaws):

  • Absence of local office (when remote performance is operationally feasible)

  • Minor personnel certification gaps (when equivalent credentials can be documented)

  • Slightly below-threshold past performance value (when scope and complexity alignment is strong)

  • Single missing technical capability (when subcontractor can credibly fill the gap)
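The precedence among the four risk categories can be encoded directly: any fatal flaw dominates, then any critical risk, while mitigatable risks and minor gaps leave the Go recommendation intact. This is a sketch only; the names are illustrative, and the real engine also weighs the dimension scores:

```python
from enum import Enum

class Risk(Enum):
    FATAL_FLAW = 1
    CRITICAL_RISK = 2
    MITIGATABLE_RISK = 3
    MINOR_GAP = 4

def recommendation(register: list[Risk]) -> str:
    """Derive the recommendation effect from a risk register using
    the category precedence described above (illustrative)."""
    if Risk.FATAL_FLAW in register:
        return "No-Go"           # regardless of other dimension scores
    if Risk.CRITICAL_RISK in register:
        return "Conditional Go"  # requires a documented mitigation plan
    return "Go"                  # remaining risks documented in proposal
```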

Win Probability Adjustment Factors

After the seven-dimension scoring is complete, the following factors adjust the final win probability estimate:

| Factor | Direction | Magnitude |
| --- | --- | --- |
| Client is incumbent | Positive | +15% to +25% |
| Prior performance with issuing agency | Positive | +10% to +15% |
| Sole-source or limited competition environment | Positive | +20% to +35% |
| Client holds required set-aside certification | Positive | +10% to +20% (in set-aside competitions) |
| Strong agency relationship / prior teaming with agency-preferred contractor | Positive | +5% to +10% |
| Known incumbent competitor | Negative | -20% to -30% |
| RFP released with very short response timeline (under 20 days) | Negative | -5% to -10% (suggests a pre-wired procurement) |
| Very large contract value vs. client annual revenue | Negative | -10% to -15% |
| Multiple evaluation criteria where client scores below average | Negative | -5% to -10% per criterion |
| Past performance CPARS ratings below "Very Good" | Negative | -10% to -20% |
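Assuming the factors combine additively (an assumption of this sketch; the source does not specify the aggregation model), applying them to a raw alignment-derived probability looks like this:

```python
def adjusted_win_probability(base_pct: float, adjustments: list[float]) -> float:
    """Apply signed adjustment factors (in percentage points) to a raw
    win probability, clamped to the 0-100 range (illustrative model)."""
    return max(0.0, min(100.0, base_pct + sum(adjustments)))

# A 55% base with incumbent advantage (+20) and a short response
# timeline (-10) yields 65%.
```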

Complete Output Template

Every Fit-Gap Analysis follows this exact output structure, producing a minimum of 7,000 characters of substantive analysis:

```

Is this RFP a fit for the client? [Yes / No]

(One-word bold answer on its own line above all analysis)

📝 Comprehensive Fit-Gap Analysis & Reasoning

Overall Strategic Assessment

[Explain alignment of RFP to client's strategic capabilities, industry focus, and business model.

Reference core competencies, past performance, contract scale, and mission from CS.

State alignment tier: HIGH (>80%) / MEDIUM (50-80%) / LOW (<50%)]

Technical & Scope of Work Alignment

[Line-by-line SOW analysis against CS services, interpreted through client business model.

Highlight management, QA, technical, and reporting overlaps.

Cite specific SOW sections and CS service lines.]

Personnel & Expertise Match

[Compare RFP-required roles/certifications against CS team descriptions.

Focus on PMP, PE, CCM, or domain-specific credentials.

Distinguish key personnel requirements from general staffing requirements.]

Compliance & Certification Status

[Verify mandatory certs, licenses, socioeconomic statuses (8a, SDVOSB, WOSB, HUBZone).

Note competitive advantage if applicable.

Cross-reference SAM.gov registration status.]

Geographic & Logistical Feasibility

[Assess client location vs. RFP Place of Performance.

Frame lack of local office as mitigatable risk, not disqualifier.

Identify specific logistical actions required if geographic gap exists.]

Financial & Operational Scale Assessment

[Evaluate bonding capacity, staffing scale, and operational infrastructure

against RFP scope and estimated contract value.

Flag any financial capacity concerns explicitly.]

Identified Gaps & Critical Risks

[Separate into two sections:

FATAL FLAWS (no-go conditions): [List with specific evidence]

MITIGATABLE RISKS (manageable with action): [List with recommended mitigation for each]

NEVER conflate fatal flaws with mitigatable risks.]

Competitive Positioning

[Analyze strength vs. likely competition within set-aside category.

Highlight CS differentiators.

Note incumbent status if known.

Provide honest win probability assessment.]

Final Summary Recommendation

[Definitive Go / Conditional Go / No-Go statement.

For Go: list immediate next steps for proposal development.

For Conditional Go: list specific conditions that must be met before proceeding.

For No-Go: explain why resources are better deployed elsewhere.]

```

Inputs Required

The Fit-Gap Analyzer requires exactly two documents:

  1. Client Capability Statement (CS) — The client's formal federal capability statement documenting: core services and competencies, past performance references (project names, agencies, contract values, periods of performance), key personnel and their certifications (PMP, PE, CCM, CISSP, etc.), socioeconomic certifications (8(a), SDVOSB, WOSB, HUBZone), NAICS codes, SAM.gov registration status, and geographic operating footprint.

  2. RFP / Solicitation Document — The complete solicitation package including: Statement of Work (SOW) or Performance Work Statement (PWS), Section L (Instructions to Offerors), Section M (Evaluation Criteria), NAICS code, set-aside designation, place of performance, required certifications, key personnel requirements, past performance requirements, contract type, estimated contract value, and submission deadline.

Step-by-Step Workflow

Step 1 — Document Ingestion & Business Model Classification

Before any scoring begins, the engine classifies the client's primary business model against the contract requirements:

  • Self-Performer: Client directly delivers all SOW tasks with own workforce

  • Prime/Manager: Client manages subcontractors and holds the prime contract relationship

  • Product Provider: Client provides goods or systems rather than services

This classification determines the correct evaluation lens. A prime/manager who subcontracts 80% of performance is not evaluated the same way as a self-performer — their management capability becomes the primary scoring factor.

Step 2 — Auto-Disqualifier Screening

The engine applies all ten auto-disqualifiers (listed in the table above) before proceeding with scoring. Any single auto-disqualifier generates a hard No-Go recommendation. This step takes priority over all other analysis.

Step 3 — Seven-Dimension Scoring

The engine evaluates alignment across seven scoring dimensions (see the Full Scoring Framework above), generating an evidence-based score for each dimension and an overall strategic assessment.

Step 4 — Win Probability Adjustment

The raw alignment score is adjusted for win probability factors (see the Win Probability Adjustment Factors above) to produce a final bid recommendation.

Step 5 — Output Generation

A structured report exceeding 7,000 characters is generated using the Complete Output Template above.

Key Features

| Feature | Detail |
| --- | --- |
| Seven-Dimension Scoring Framework | Comprehensive evaluation beyond simple keyword matching |
| Ten Defined Auto-Disqualifiers | Prevents wasted effort on fundamentally ineligible RFPs |
| Fatal Flaw vs. Mitigatable Risk Distinction | Actionable risk classification system |
| Win Probability Adjustment Factors | Quantified adjustments based on competitive intelligence |
| Business Model-First Analysis | Evaluation from the correct operational role lens |
| FAR/DFARS Compliance Awareness | Regulatory grounding throughout the analysis |
| 7,000+ Character Output Standard | Ensures depth and detail for informed Go/No-Go decisions |
| SAM.gov Cross-Reference | Verifies registration and certification currency |
