Cybersecurity for community banks and credit unions, the examiner's list
What FDIC, OCC, NCUA and state examiners actually look at when they review a community bank or credit union's cybersecurity posture, and what a credible program looks like at the mid-market asset level.
· Atticus Rowan
Community banks and credit unions sit in a regulatory environment that has tightened meaningfully over the last decade. FDIC, OCC, NCUA and state examiners all treat cybersecurity as a top-tier examination focus area now, and the institutions that show up at examination with a weak answer tend to leave with a Matter Requiring Attention, a consent order or worse.
The examination experience is predictable if the institution knows what the examiners actually ask. Most of what looks intimidating is structured, tiered and documented in examination handbooks the examiners follow. Preparing well is mostly about matching the institution’s program to the categories the examiner will walk through.
Here is a working view of the examiner’s list at a community-bank or credit-union institution under $1 billion in assets, and what a credible program looks like in response.
The frameworks examiners reference
Community banks and credit unions sit under several overlapping frameworks.
- FFIEC Cybersecurity Assessment Tool (CAT). The longstanding self-assessment methodology. The FFIEC sunset the CAT on August 31, 2025, but many institutions still operate against it because its categories remain the industry vocabulary.
- FFIEC Architecture, Infrastructure and Operations (AIO) Booklet. The updated guidance that has partly replaced the CAT’s scope.
- NIST Cybersecurity Framework 2.0. Increasingly referenced by examiners, either directly or via CAT/AIO mapping.
- NCUA cybersecurity reviews. For federally-insured credit unions, the NCUA’s Automated Cybersecurity Examination Tool (ACET) follows CAT structure closely.
- State-level requirements. These vary by state. New York DFS Part 500 is the most prescriptive; several other states have adopted comparable model rules.
An institution preparing for examination does not need to pick one framework. The programs the examiners credit typically align to CAT/AIO as the core, with CSF 2.0 as the external vocabulary and state-specific requirements layered where applicable.
The five domains, compressed
Examinations walk through roughly five domains.
1. Cybersecurity governance
The first question: who is accountable for cybersecurity, what authority do they hold, and how do they report to the board?
What examiners look for:
- A named cybersecurity officer or equivalent (sometimes called a chief information security officer, sometimes called an information security officer, sometimes fractional at smaller institutions)
- A written information security program, dated and reviewed within the past 12 months
- Board-level cybersecurity reporting on a defined cadence (at least annually, often quarterly)
- A documented risk appetite statement
- Evidence that the board or a designated committee actively engages rather than passively receives reports
Weak governance answers are where examinations most often produce corrective action, because governance is how the examiner reads program maturity.
2. Threat identification and detection
How the institution identifies and monitors cybersecurity threats.
What examiners look for:
- Threat intelligence consumption (FS-ISAC for banks, NCUA’s threat bulletins for credit unions, sector-specific feeds)
- Continuous monitoring of network, endpoint and identity signals
- Security information and event management (SIEM) or equivalent log aggregation
- 24x7 monitoring coverage, either in-house or through a managed provider
- Defined incident alerting thresholds and triage process
For institutions under $1 billion in assets, the examiner expectation is almost always that 24x7 monitoring is delivered through a managed service provider (MDR or SOC-as-a-service) rather than through in-house staffing. Building an in-house SOC at that asset scale is not the expectation.
3. Cybersecurity controls
The operational controls protecting systems, data and customers.
What examiners look for:
- Multi-factor authentication enforced universally, including internal access to core systems
- Endpoint detection and response on all workstations and servers
- Privileged access management, with periodic review of privileged accounts
- Network segmentation between user, server, core processor and ATM/member service networks
- Data loss prevention controls on customer data
- Encryption of data at rest and in transit
- Vulnerability management with defined remediation SLAs
- Patching cadence with exception documentation
- Secure configuration baselines
Examiners sometimes sample specific controls with direct evidence requests (logs, configuration exports, patch records). A program that cannot produce the evidence for sampled controls signals broader gaps.
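The remediation-SLA expectation above is concrete enough to sketch. The following is a minimal illustration, not any scanner's actual export format: the SLA day counts, field names and findings are all hypothetical, and each institution sets its own thresholds in its vulnerability management policy.

```python
from datetime import date, timedelta

# Hypothetical remediation SLAs by severity, in days. These numbers are
# illustrative; the institution's policy defines the real ones.
SLA_DAYS = {"critical": 15, "high": 30, "medium": 90, "low": 180}

# Illustrative findings; in practice these come from the scanner export.
findings = [
    {"id": "CVE-2024-0001", "severity": "critical",
     "found": date(2025, 1, 2), "remediated": date(2025, 1, 10)},
    {"id": "CVE-2024-0002", "severity": "high",
     "found": date(2025, 1, 2), "remediated": None},
]

def overdue(findings, today):
    """Return (id, days_overdue) for open findings past their severity SLA."""
    out = []
    for f in findings:
        if f["remediated"] is not None:
            continue  # closed within or outside SLA; not tracked here
        deadline = f["found"] + timedelta(days=SLA_DAYS[f["severity"]])
        if today > deadline:
            out.append((f["id"], (today - deadline).days))
    return out

print(overdue(findings, date(2025, 3, 1)))  # → [('CVE-2024-0002', 28)]
```

The point of tracking days-overdue rather than a pass/fail flag is that it produces exactly the evidence an examiner samples: which findings breached the SLA and by how much.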
4. External dependency management
The institution’s third-party relationships. This domain has grown in examiner priority continuously since 2015.
What examiners look for:
- A documented inventory of all third parties with access to nonpublic information or critical systems
- Tier classification based on criticality
- Due diligence before onboarding, with documented evidence
- Ongoing monitoring of material vendors (SOC 2 review, security questionnaire, financial condition review)
- Contract language meeting regulatory expectations on data protection, breach notification, right to audit
- Core processor oversight specifically (FIS, Jack Henry, Fiserv and similar)
- Cloud service provider oversight where applicable
Community banks and credit unions are usually heavily dependent on a small number of critical vendors. The examiner is checking whether that dependency is actively managed or merely assumed.
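The tier-classification expectation can be made concrete with a sketch. The tier names, criteria and review cadences below are hypothetical examples of how a policy might combine data access and operational criticality; the actual criteria belong in the institution's third-party risk management policy.

```python
# Hypothetical tiering logic -- illustrative, not a regulatory standard.
def classify_vendor(has_npi_access: bool, critical_to_operations: bool) -> str:
    """Assign a review tier from two common criticality criteria."""
    if has_npi_access and critical_to_operations:
        return "tier-1"  # e.g. core processor: quarterly review, SOC 2 review
    if has_npi_access or critical_to_operations:
        return "tier-2"  # annual review, security questionnaire
    return "tier-3"      # onboarding due diligence only

# Illustrative inventory entries (names and fields are made up).
vendors = [
    {"name": "core-processor", "npi": True, "critical": True},
    {"name": "marketing-saas", "npi": False, "critical": False},
]
for v in vendors:
    print(v["name"], classify_vendor(v["npi"], v["critical"]))
```

Even a two-criterion rule like this forces the inventory question the examiner actually asks: which of the non-core vendors touch nonpublic information, and who reviews them on what cadence.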
5. Incident response and resilience
The institution’s ability to respond to and recover from cybersecurity incidents.
What examiners look for:
- A written incident response plan covering cybersecurity incidents specifically
- Annual tabletop exercises with documented output
- Business continuity plan covering cybersecurity scenarios (ransomware, prolonged outage, third-party failure)
- Defined and tested backup and recovery capability with immutability
- Cyber insurance in force at appropriate limits
- Coordination with the institution’s regulator, FS-ISAC and law enforcement pre-defined
Examiners increasingly ask to see the after-action review from the most recent tabletop. A plan without evidence of exercise is a weak answer.
The documentation an examiner expects
A defensible program produces a consistent set of documentation artifacts.
- Written information security program, current version
- Cybersecurity risk assessment, refreshed at least annually
- Board-level cybersecurity reporting for the past 12 months
- Incident response plan with annual tabletop output
- Business continuity plan with annual test documentation
- Third-party risk management policy with vendor inventory
- Employee training records, including role-based cybersecurity training
- Privileged access review records
- Vulnerability management reports and remediation tracking
- Backup and recovery test records
- Penetration testing or independent security assessment within the past 12-24 months
A community-bank or credit-union examiner almost always asks for a subset of these by document name. Producing them inside the examination window signals program maturity; producing them after a two-week delay signals the opposite.
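Most of the artifacts above carry an explicit freshness window (reviewed within 12 months, tested annually, assessed within 12 to 24 months). A simple freshness check can be sketched as follows; the artifact names and the dictionary shape are hypothetical, and the age limits are taken from the expectations listed above, using the 24-month outer bound for penetration testing.

```python
from datetime import date

# Maximum allowed age in days per artifact; limits mirror the list above.
MAX_AGE_DAYS = {
    "information_security_program": 365,
    "risk_assessment": 365,
    "tabletop_exercise": 365,
    "penetration_test": 730,  # 24-month outer bound
}

def stale_artifacts(last_dated: dict, today: date) -> list:
    """Return artifacts whose most recent dated version exceeds its window."""
    return [
        name for name, d in last_dated.items()
        if (today - d).days > MAX_AGE_DAYS[name]
    ]

# Illustrative check: pen test older than 24 months is flagged.
last = {"information_security_program": date(2025, 2, 1),
        "penetration_test": date(2022, 11, 15)}
print(stale_artifacts(last, date(2025, 7, 1)))
```

Running a check like this quarterly, rather than discovering a stale artifact during the examination itself, is the difference between the two signals described above.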
Where community banks and credit unions at this scale typically fall short
A representative gap list across institutions with $100 million to $1 billion in assets that have not had a recent independent review:
- Third-party risk management weak for non-core vendors. The core processor is well managed; the 40 SaaS applications with material data access are not.
- Vulnerability management cadence inconsistent. Monthly scans run, but the SLA for remediating critical vulnerabilities is vague.
- Board reporting mechanical rather than substantive. Quarterly reports happen but do not surface material decisions for board engagement.
- Tabletop exercises nominal. A tabletop happens annually, but the after-action review produces no revisions to the plan.
- Penetration testing out of date. Last independent assessment was more than 24 months ago.
- Cybersecurity training uniform rather than role-based. Everyone takes the same training regardless of role.
None of these are catastrophic individually. Several of them in combination produce the profile the examiner cites in a Matter Requiring Attention.
What a working program looks like at a $200M to $1B institution
The representative program:
- Named cybersecurity officer (often fractional vCISO for institutions under $500 million in assets)
- Written information security program documented against CAT/AIO categories with NIST CSF 2.0 mapping
- 24x7 monitoring through an MDR or SOC-as-a-service provider
- EDR on all workstations and servers
- MFA enforced universally, with phishing-resistant factors on privileged accounts
- Immutable backup with documented recovery testing
- Annual independent penetration test
- Quarterly vulnerability scans with defined remediation SLAs
- Quarterly third-party risk reviews on material vendors
- Quarterly board cybersecurity reporting
- Annual tabletop exercise with revised plan output
- Role-based cybersecurity training with documented completion
- Cyber insurance at limits appropriate to the institution’s asset size
Typical annual investment for an institution in this asset band: $200,000 to $800,000 combined across managed services, tooling, independent assessments and insurance. A finding-free examination is usually worth more than the spend.
Where we fit
We support community banks and credit unions as the IT and cybersecurity partner that builds and operates the program the examiners expect. The engagement is framework-native from day one. CAT/AIO vocabulary in the documentation, CSF 2.0 mapping where useful, vendor-specific coverage for core processor and critical third parties, evidence production calibrated to examination cadence.
We do not issue independent security assessments. Independence rules require the assessment to come from a separate party. We coordinate with the assessor and produce the evidence the assessor and the examiner both need.
If your institution is preparing for examination, responding to an MRA or simply reviewing whether the current cybersecurity program matches current examiner expectations, schedule a discovery call. We can walk through the current posture and scope the program against the specific regulator environment.