CMMC Assessment Explained
How Assessors Evaluate Evidence Using Examine, Interview, and Test
A CMMC assessment is not a conversation about your security program. Every control practice is evaluated against specific assessment objectives using three formally defined methods — Examine, Interview, and Test — applied to concrete assessment objects. Here is exactly what that means for your organization on assessment day.
Under NIST SP 800-171A — the governing methodology for all CMMC Level 2 assessments — a C3PAO assessor does not evaluate a control as a single pass/fail question. They evaluate a set of granular assessment objectives labeled [a], [b], [c] that together define what "met" actually means for that practice. Each objective must be individually satisfied using at least one of three required methods: Examine, Interview, or Test.
A thorough lead assessor will typically combine methods — examining your policy, then testing the live system, then questioning the control owner. Understanding how each method works and what it targets is the most direct path to a clean assessment.
The Three Assessment Methods: What Each One Actually Does
Each method has a specific purpose, targets specific objects, and demands specific evidence. Treating them as interchangeable — or assuming a strong policy substitutes for a system test — is a preparation mistake that surfaces on assessment day.
Examine
The assessor reviews physical or digital artifacts to verify a control is documented and that the documentation addresses the objective using matching language.
Assessors scan for thematic resonance — wording that mirrors the assessment objective almost exactly. "We maintain strong access controls" does not satisfy an objective stating "authorized users are identified."
Interview
The assessor questions the specific person responsible for a control — identified from your organizational chart. Whoever currently holds that role will be interviewed, not a designated spokesperson.
Interviews verify that the control owner understands and executes the control. A well-written policy owned by someone who cannot explain it is a gap, not a defense.
Test
The assessor exercises a live system or mechanism to verify it performs as documented. Policies and interview answers are not substitutes — the control must be demonstrably functional in the live environment.
Assessors evaluate effectiveness, not elegance. A manual weekly log report satisfies the same audit logging objective as a $50,000 SIEM — if the outcome is consistent and documented.
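The manual alternative mentioned above can be sketched as a short weekly job: summarize authentication failures from syslog-style input and emit a dated, signed review record. This is a hedged illustration only; the log format, the `rhost=` pattern, and the reviewer workflow are assumptions, not anything the CMMC methodology prescribes.

```python
"""Minimal weekly log review sketch (illustrative; the log format and
reviewer workflow are assumptions, not CMMC requirements)."""
import re
from collections import Counter
from datetime import date

def summarize_auth_failures(lines):
    """Count authentication failures per source host from syslog-style lines."""
    pattern = re.compile(r"authentication failure.*rhost=(\S+)")
    counts = Counter()
    for line in lines:
        m = pattern.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

def write_report(counts, reviewer):
    """Produce a dated, attributed review record suitable as audit evidence."""
    header = f"Weekly log review - {date.today().isoformat()}"
    body = "\n".join(f"{host}: {n} failed logins" for host, n in counts.most_common())
    return f"{header}\nReviewed by: {reviewer}\n{body or 'No failures observed.'}"

# Sample input standing in for a real log file.
sample = [
    "sshd[812]: pam_unix(sshd:auth): authentication failure; rhost=203.0.113.7",
    "sshd[813]: Accepted publickey for ops from 198.51.100.2",
    "sshd[814]: pam_unix(sshd:auth): authentication failure; rhost=203.0.113.7",
]
report = write_report(summarize_auth_failures(sample), reviewer="J. Doe")
print(report)
```

The point is the outcome, not the tooling: a consistent, dated record with a named reviewer is the evidence the objective asks for.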
The Examine Method: Policies, Procedures, and Thematic Resonance
When an assessor examines your documentation, they are matching your language against the exact wording of each assessment objective. This concept — thematic resonance — is one of the most consequential and least-understood aspects of CMMC preparation. The methodology requires the assessor to find a direct match; they cannot reframe, interpret, or credit intent.
The most common failure here is importing ISO 27001, SOC 2, or ITAR documentation into an SSP without rewriting it for CMMC vocabulary. The underlying controls may be sound — but if the language doesn't mirror the objective, the assessor cannot mark it met.
**The assessment objective:** "Authorized users of the system, the connections to other systems, and the system environment of operation are identified."

**A statement that fails (no thematic resonance):** "Access rosters are maintained. Recovery Time Objectives (RTO) are defined per system tier. Periodic access reviews are scheduled quarterly."

**A statement that passes (direct match):** "Authorized users are identified via Active Directory security groups in Asset Inventory Appendix A. System connections are enumerated in the network diagram (Appendix B). The environment of operation is described in SSP §2.1."
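The resonance idea can be made concrete with a rough self-check: extract the key terms of the objective and see how many an implementation statement echoes. This is a heuristic sketch for self-assessment only; assessors do not run keyword matching, and the stopword list and scoring are invented for illustration.

```python
"""Rough self-check for 'thematic resonance': does an implementation
statement echo the key terms of the assessment objective? A heuristic
sketch, not the assessor's actual procedure."""
import re

STOPWORDS = {"the", "of", "to", "and", "are", "is", "in", "via", "other"}

def key_terms(text):
    """Lowercased content words longer than three letters."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if w not in STOPWORDS and len(w) > 3}

def resonance(objective, statement):
    """Fraction of the objective's key terms echoed by the statement."""
    obj, stmt = key_terms(objective), key_terms(statement)
    return len(obj & stmt) / len(obj) if obj else 0.0

objective = ("Authorized users of the system, the connections to other "
             "systems, and the system environment of operation are identified.")
weak = "We maintain strong access controls and review them quarterly."
strong = ("Authorized users are identified via Active Directory security "
          "groups; system connections are enumerated in the network diagram; "
          "the environment of operation is described in SSP section 2.1.")

print(f"weak:   {resonance(objective, weak):.2f}")
print(f"strong: {resonance(objective, strong):.2f}")
```

A low score does not prove the control is unimplemented; it flags a statement whose wording an assessor cannot map to the objective.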
Who Gets Interviewed: Roles, Targets, and the Golden Rule
Assessors do not conduct general security awareness sessions. They identify the specific role responsible for each control from your organizational chart and interview whoever currently holds that seat. The person interviewed needs to be prepared for the specific controls they own — not security in general.
Answer the question and nothing but the question. Volunteering that you just completed a major network re-architecture, recently changed firewall vendors, or are mid-deployment on a new SIEM will immediately prompt the assessor to pull those threads. Every unrequested detail is a potential new line of scrutiny. Brief each control owner individually on both their specific controls and this rule.
What a "Test" Actually Looks Like
The Test method is how assessors verify that documented controls work in the live environment. Testing takes three primary forms: observed system behavior, active configuration review, and live demonstrations where the assessor attempts to bypass a control. The critical principle governing all three: effectiveness matters, elegance does not.
How to Prepare: Evidence Mapping and Avoiding Document Dumping
The most damaging preparation mistake is document dumping — submitting a large, disorganized evidence package and expecting the assessor to locate relevant artifacts. With 320 objectives to verify in a bounded assessment window, an assessor who cannot find evidence efficiently will mark the objective Not Met, regardless of whether the supporting artifact exists somewhere in the pile.
The correct approach is an Evidence Mapping File — a spreadsheet connecting every assessment objective to its supporting evidence at the exact document, page, and paragraph level.
| Practice | Obj. | Evidence Location |
|---|---|---|
| AC.L2-3.1.1 | [a] | Access Policy §3.2 ¶1 |
| AC.L2-3.1.1 | [b] | SSP §4.1 → AD screenshot |
| IA.L2-3.5.3 | [a] | MFA Config doc, p. 7 |
| AU.L2-3.3.1 | [e] | Log Review SOP §2, signed report |
| SC.L2-3.13.11 | [a] | FIPS CMVP Cert #4127 |
| CA.L2-3.12.4 | [a] | SSP §2.1 + Network Diagram B |
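A mapping file like the one above can double as machine-checkable data. The sketch below, assuming a simple CSV with `practice`, `objective`, and `location` columns, flags any expected objective with no artifact mapped; the expected-objective dictionary is an illustrative subset, not the full 320.

```python
"""Completeness check for an evidence mapping file: every expected
assessment objective must point at a concrete artifact location.
Rows and objective lists are illustrative samples."""
import csv
import io

# Expected objectives per practice (illustrative subset of 800-171A).
EXPECTED = {
    "AC.L2-3.1.1": ["[a]", "[b]", "[c]"],
    "IA.L2-3.5.3": ["[a]"],
}

evidence_csv = """practice,objective,location
AC.L2-3.1.1,[a],Access Policy §3.2 ¶1
AC.L2-3.1.1,[b],SSP §4.1 -> AD screenshot
IA.L2-3.5.3,[a],MFA Config doc p. 7
"""

def missing_objectives(csv_text, expected):
    """Return (practice, objective) pairs with no mapped evidence."""
    mapped = {(row["practice"], row["objective"])
              for row in csv.DictReader(io.StringIO(csv_text))
              if row["location"].strip()}
    return [(p, o) for p, objs in expected.items()
            for o in objs if (p, o) not in mapped]

gaps = missing_objectives(evidence_csv, EXPECTED)
print(gaps)  # objective [c] of AC.L2-3.1.1 has no evidence mapped
```

Running a check like this before assessment day surfaces unmapped objectives while there is still time to produce or relocate the artifact.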
1. Map every objective before assessment day. Work through all 320 objectives in NIST 800-171A and identify the exact artifact satisfying each — specific section and paragraph, not document category.
2. Verify thematic resonance in every policy. Read each implementation statement against the objective language side-by-side. If the wording doesn't mirror the objective, rewrite the statement — not the control.
3. Brief every control practice owner individually. Each person who will be interviewed should know which controls they own, how those controls are implemented, and the Golden Rule: answer the question, nothing more.
4. Run a pre-assessment walkthrough of high-risk Test objectives. Screen lock timers, MFA bypass attempts, banner configurations, log review procedures — fix anything that doesn't work before the assessor sees it.
5. Prepare a sampling response package for CRMAs. For every out-of-scope or CRMA asset, have VLAN segmentation evidence, ACL rules, and AUP acknowledgments ready to produce on demand when the assessor samples it.
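The briefing step in the checklist above can lean on the same mapping data. As one sketch, grouping objectives by a hypothetical `owner` field (not present in the earlier table) produces a per-person briefing sheet for interview preparation.

```python
"""Generate per-owner interview briefing sheets from the evidence map.
The 'owner' field is an assumption added for illustration; the mapping
table shown earlier does not include it."""
from collections import defaultdict

# (practice, objective, owner) - illustrative sample rows.
rows = [
    ("AC.L2-3.1.1", "[a]", "IT Admin"),
    ("AC.L2-3.1.1", "[b]", "IT Admin"),
    ("AU.L2-3.3.1", "[e]", "SOC Analyst"),
]

def briefing_sheets(rows):
    """Group owned objectives so each interviewee is briefed individually."""
    sheets = defaultdict(list)
    for practice, objective, owner in rows:
        sheets[owner].append(f"{practice} {objective}")
    return dict(sheets)

for owner, objectives in briefing_sheets(rows).items():
    print(f"{owner}: {', '.join(objectives)}")
```

Each sheet tells one person exactly which controls they will be questioned on, which is the granularity the Interview method demands.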
The Bottom Line
A CMMC assessment is a structured, objective-by-objective verification exercise governed by a published methodology. Assessors are matching your documentation language to objective wording, questioning the specific people responsible for each control, and testing whether your systems do what your policies say they do. The framework they must follow is the same one you can read and prepare against.
Organizations that map every objective to its exact evidence, brief their control owners on their specific roles, and validate live system behavior before assessment day eliminate the avoidable failures that have nothing to do with their actual security posture.
The assessors who arrive are bound to the same published methodology you can study in advance. That predictability is your most underused preparation asset — use it.