
Walking through a customer security questionnaire, section by section

What enterprise customers are actually measuring when they send a vendor security questionnaire, and how to answer each section without overpromising or underselling.

· Jake Schaaf, Founder of Atticus Rowan

Customer security questionnaires used to be a finance department exercise. A buyer’s procurement team would forward a 20-question form, the vendor would fill in the obvious answers, and the contract would close. That model is gone in most mid-market and enterprise buying.

A 2026 vendor questionnaire from a credible enterprise customer runs 80 to 200 questions. Some are standard, drawn from CAIQ, SIG Lite or a custom hybrid. Some are bespoke, written by the buyer’s own information-security team to test specific concerns about your category. The form arrives with a deadline that is shorter than feels reasonable, and the buyer’s signal is clear. They are evaluating whether your control environment is good enough to put your software, data flow or service inside their environment.

What follows is how the modern questionnaire is actually structured, what each section is measuring and how to answer without overpromising or underselling. The structure mirrors the section order of the most common buyer-side templates, with the headings buyers tend to use.

Why questionnaires got longer

Two pressures expanded the form.

  • Third-party risk regulation. Bank examiners, healthcare regulators and SEC examiners all tightened third-party oversight expectations between 2023 and 2025. The buyer’s own examiners now ask whether the buyer can demonstrate vendor due diligence, and the questionnaire is the most legible artifact of that diligence.
  • Cyber insurance renewal pressure on the buyer. When a buyer’s own insurer asks them to attest that critical vendors have specific controls, the buyer pushes those controls down to vendors via the questionnaire. Sub-100% MFA, untested backups, missing EDR coverage and undisclosed incidents are the four findings most likely to push a buyer toward a different vendor.

Treat the questionnaire as a compressed audit, not a marketing form. Firms that approach it as evidence-backed disclosure tend to keep buyers. Firms that approach it as a sales artifact tend to lose them on the second pass.

Section 1, governance and program

Document who owns cybersecurity by name and role. If you use a fractional CISO or vCISO, name the relationship and the cadence. Document the written information security program, the framework you align to (NIST CSF 2.0 is the most common defensible answer, SOC 2 if you are running toward an audit, NIST 800-171 if you have federal supply-chain exposure) and the executive oversight cadence.

Buyers read this section to answer one question. Is cybersecurity a function with a name and a calendar, or is it a part-time chore the IT manager picks up between tickets? Either answer is honest, but only one keeps the engagement alive.

Section 2, identity and access

This is the section reviewers spend the most time on. Document the following with percentages and dates, not adjectives.

  • MFA coverage on user accounts, with type. Phishing-resistant MFA (FIDO2, hardware keys) on privileged and executive accounts is increasingly the bar.
  • Privileged-access management. Named PAM tool, vault inventory, session recording posture.
  • Joiner-mover-leaver process with timelines. Document the SLA from termination to access removal in writing.
  • Service-account inventory. The accounts machines use to talk to each other, with rotation cadence.
  • Access reviews. Quarterly is the modern bar.

Sub-100% MFA on privileged accounts is the most common reason a vendor is flagged for follow-up. If you cannot honestly answer 100%, document the exact percentage, the gap remediation plan and the date the gap closes. Buyers will accept an honest in-progress answer with a date. They will not accept a vague “we are working toward it.”
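Those coverage percentages should come from an identity-provider export, not an estimate. A minimal sketch of the calculation, assuming a hypothetical list of account records with a privileged flag and an MFA type (the field names and accounts are illustrative, not any vendor's actual schema):

```python
# Sketch: compute MFA coverage from a (hypothetical) IdP export.
# Field names and sample accounts are illustrative placeholders.
accounts = [
    {"user": "cfo", "privileged": True, "mfa": "fido2"},
    {"user": "admin1", "privileged": True, "mfa": "totp"},
    {"user": "staff1", "privileged": False, "mfa": "totp"},
    {"user": "svc-backup", "privileged": False, "mfa": None},
]

def coverage(rows, phishing_resistant=("fido2", "hardware_key")):
    """Return (any-MFA coverage, phishing-resistant coverage) as fractions."""
    covered = sum(1 for r in rows if r["mfa"])
    strong = sum(1 for r in rows if r["mfa"] in phishing_resistant)
    return covered / len(rows), strong / len(rows)

all_cov, all_strong = coverage(accounts)
priv_cov, priv_strong = coverage([r for r in accounts if r["privileged"]])

print(f"All accounts: {all_cov:.0%} MFA ({all_strong:.0%} phishing-resistant)")
print(f"Privileged:   {priv_cov:.0%} MFA ({priv_strong:.0%} phishing-resistant)")
# Accounts with no MFA belong in the remediation plan, with a close date.
print("No MFA:", [r["user"] for r in accounts if not r["mfa"]])
```

Running this against a real export gives you the exact percentages, and the list of uncovered accounts becomes the remediation plan the reviewer wants to see.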

Section 3, endpoint, network and email

Document the controls and the team behind them.

  • EDR. Named tool, percentage of managed endpoints covered, who watches the alerts, response cadence. EDR with no monitoring team behind it reads as a partial control.
  • Patch management. Cadence, coverage rate metric, exception process. “We patch when needed” is not an answer.
  • Network segmentation. User subnets, server subnets, OT or production subnets if relevant. East-west firewalling between tiers is the modern norm.
  • Perimeter. Named firewall vendor, IDS/IPS posture, remote access architecture (VPN replaced by ZTNA in many environments).
  • Email security. Phishing protection layer beyond native, outbound DLP if you handle sensitive data, BEC-specific controls.

Buyers read coverage gaps here as the most operationally exposed parts of your environment. Be specific.

Section 4, data protection and backup

Document data classification (you do not need an exhaustive scheme, just a defensible one with at least 3 tiers), encryption posture in transit and at rest, and backup architecture in detail.

The backup question that gets the most follow-up is also the simplest. When was the last successful tested restore? The right answer is a recent date and a brief description of the test scope. The wrong answer is silence or a generic “we test regularly.” Reviewers treat untested backups as unavailable backups, and that flips a routine question into a real concern.

Recovery-time objective and recovery-point objective should be numbers, not aspirations. If your RTO is 24 hours, say 24 hours and document the architecture that supports it.
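A simple restore-test log is enough to answer both questions with dates. A sketch, assuming a hypothetical record format and a 90-day test cadence (dates, scopes and the threshold are illustrative assumptions):

```python
from datetime import date, timedelta

# Sketch: a restore-test log so "last successful tested restore" has a
# concrete answer. Entries and the 90-day cadence are illustrative.
restore_tests = [
    {"date": date(2026, 1, 14), "scope": "ERP database, full restore to staging", "passed": True},
    {"date": date(2025, 10, 2), "scope": "File server, 500 GB sample restore", "passed": True},
]

def last_successful(tests):
    """Most recent passing test, or None if nothing has ever passed."""
    passed = [t for t in tests if t["passed"]]
    return max(passed, key=lambda t: t["date"]) if passed else None

def stale(tests, today, max_age_days=90):
    """True when the last passing test is missing or older than the cadence."""
    latest = last_successful(tests)
    return latest is None or (today - latest["date"]) > timedelta(days=max_age_days)

latest = last_successful(restore_tests)
print(f"Last tested restore: {latest['date']} ({latest['scope']})")
print("Needs a new test:", stale(restore_tests, today=date(2026, 4, 1)))
```

The date and scope of the most recent passing entry are exactly what the questionnaire answer should contain.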

Section 5, third-party and supply chain

The supply-chain section was a footnote until the third-party incidents of 2024 and 2025. Buyers now expect a vendor inventory with tier classification (critical, important, routine), due-diligence records on the vendors in the top tiers, contractual security obligations and ongoing oversight evidence.

For a smaller mid-market vendor, a defensible answer here is a documented vendor list with criticality tiering, security questionnaires on file from your top-tier vendors and a review cadence. You do not need a third-party-risk-management tool to answer this section credibly. You do need the discipline.
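That discipline can live in something as plain as a spreadsheet export checked on a schedule. A sketch, assuming a hypothetical inventory format with per-tier review intervals (vendor names, tiers and cadences are illustrative assumptions, not a prescribed standard):

```python
from datetime import date

# Sketch: a tiered vendor inventory with a review-cadence check.
# Tier intervals, vendors and dates below are illustrative placeholders.
REVIEW_INTERVAL_DAYS = {"critical": 365, "important": 730, "routine": 1095}

vendors = [
    {"name": "PayrollCo", "tier": "critical", "last_review": date(2025, 3, 1)},
    {"name": "CRM-SaaS", "tier": "important", "last_review": date(2024, 1, 15)},
    {"name": "SwagShop", "tier": "routine", "last_review": date(2023, 6, 1)},
]

def overdue(vendor, today):
    """True when the vendor's last review is older than its tier allows."""
    interval = REVIEW_INTERVAL_DAYS[vendor["tier"]]
    return (today - vendor["last_review"]).days > interval

today = date(2026, 2, 1)
for v in vendors:
    if overdue(v, today):
        print(f"{v['name']} ({v['tier']}): review overdue")
```

A list like this, plus the questionnaires on file for the top tier, is a credible answer to the whole section.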

Section 6, incident response and history

Document the incident-response plan, the date of the most recent tabletop, the response-provider relationships you have in place (forensics retainer, breach counsel, carrier coordination) and a truthful disclosure of material incidents in the reporting window.

Disclosure is the part that destroys engagements when handled poorly. If you had a material incident, disclose it with scope, remediation actions and current status. Buyers can work with a clean disclosure. They cannot work around discovering an incident later through a different channel.

During an incident we coordinate with the cyber insurance carrier and the third-party forensics team. We do not perform forensics directly; that work belongs with a specialist firm engaged through the carrier or breach counsel.

Section 7, training and awareness

Document the cadence (the most defensible answer is annual training plus monthly phishing simulations), the trend in simulation results over time and the tenure of the program. Reviewers weight consistency. A three-year program with documented completion reads differently than a six-month-old program created in response to the questionnaire itself.

What the questionnaire does not measure

Worth naming for context. The questionnaire is a snapshot of your control environment. It does not test whether you can recover from an incident, whether your team would actually follow the IR plan under pressure, whether your SOC 2 readiness work has produced an audit-ready evidence binder or whether your cyber insurance program would respond cleanly to a real claim. Those questions show up in the buyer’s follow-up call, in the on-site audit visit (if there is one) and in the contract negotiation.

Atticus Rowan supports companies preparing to respond to enterprise customer questionnaires. We build the control environment, document the evidence behind each answer and coordinate the response so a 150-question form does not consume the CFO’s calendar for three weeks. We are not the SOC 2 audit firm, and we are not the buyer’s procurement team; we are the MSP that turns the questionnaire from a recurring fire drill into a documented, repeatable artifact.

Our Manufacturing Cybersecurity Guide maps the IT-OT boundary, the customer-audit response cadence and the OEM remote-access governance pattern.

If your team is staring at a deadline this quarter, start a conversation before you start typing answers into the form.