Documentation Assessment Methodology

Policy vs Procedure vs Evidence in CMMC

Three Layers. Three Independent Scores. One Keeps Failing.

A well-written policy with no corresponding procedure is an aspiration. A procedure with no supporting evidence is a claim. Assessors evaluate all three layers independently — and the layer most contractors neglect is the one that determines whether the control is scored Met or Not Met.

Why Assessors Care About the Distinction

Under NIST SP 800-171A, every control practice is decomposed into assessment objectives. Each objective is evaluated using three methods: Examine (review documents and configurations), Interview (ask personnel about their knowledge and actions), and Test (observe or execute the control in practice). These three methods map directly to three layers of documentation — and if any layer is missing, the objective cannot be scored Met.

Layer 01: Policy

What the organization has decided to do. The intent, scope, and rules. Evaluated via Examine.

Layer 02: Procedure

How the organization carries it out. The steps, the roles, the tools. Evaluated via Interview.

Layer 03: Evidence

Proof that it actually happened. Configurations, logs, records. Evaluated via Test.

These layers are not redundant — they are complementary. The policy says what. The procedure says how. The evidence proves that it happened. An assessor who finds a policy but no procedure will ask: "How do you actually do this?" An assessor who finds a procedure but no evidence will ask: "Can you show me that this was done?" A missing layer is a missing dimension of proof.

The most common documentation failure in CMMC assessments is not the absence of documentation. It is the presence of one layer with the absence of another. Contractors invest heavily in policy documents — sometimes hundreds of pages — and then cannot demonstrate that a single policy is executed through a repeatable procedure with dated evidence. The policies exist. The compliance does not.

What Belongs in Policy

A policy is an organizational decision. It declares what the organization will do, who it applies to, and under what authority. A policy does not describe how to configure a firewall, how to review logs, or how to provision a user account. Those are procedures. A policy sets the rules that procedures implement.

For CMMC, every control family requires at least one organizational policy. NIST SP 800-171A lists policy documents among the examine objects for practices throughout each family. If the policy does not exist, the entire family starts with a deficit.

A well-constructed CMMC policy includes:

  • Purpose & Scope — What the policy covers: which systems, which users, which data classifications. The scope must align with the SSP boundary. A policy that says "all company systems" when the assessment boundary is an enclave creates a contradiction the assessor will flag.
  • Roles & Responsibilities — Who is responsible for implementing, maintaining, and enforcing the policy. Named roles, not individuals, so the policy survives personnel changes. "The IT Security Manager" is a role. "John Smith" is a person who may leave.
  • Requirements — The organizational rules that map to NIST SP 800-171 controls. "All users shall authenticate using multi-factor authentication when accessing CUI systems remotely." This is a requirement, not a configuration instruction. It declares what must happen, not how to make it happen.
  • Exceptions & Enforcement — How exceptions are requested, approved, and documented. What happens when the policy is violated. A policy without an exception process implies there are no exceptions — and the first exception the assessor finds will become a finding.
  • Review Cycle — How often the policy is reviewed and updated — annually at minimum. The last review date and the approver's name must be on the document. A policy dated three years ago with no evidence of review is stale — and staleness is a finding.
The policy test: A policy should be readable by a non-technical executive and make sense as a set of organizational decisions. If the policy includes IP addresses, registry keys, or command-line syntax, it has crossed into procedure territory. Keep policies at the decision layer. Move the implementation details to procedures.
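The review-cycle check is easy to automate. A minimal sketch, assuming a hypothetical policy register kept as a CSV with `policy` and `last_review` (ISO date) columns; the register file and its layout are illustrative assumptions, not a CMMC artifact:

```python
import csv
from datetime import date, timedelta

# "Annually at minimum": anything older than a year is stale.
REVIEW_INTERVAL = timedelta(days=365)

def find_stale_policies(register_path: str, today: date) -> list[str]:
    """Return the names of policies whose last review is more than a year old."""
    stale = []
    with open(register_path, newline="") as f:
        for row in csv.DictReader(f):
            last_review = date.fromisoformat(row["last_review"])
            if today - last_review > REVIEW_INTERVAL:
                stale.append(row["policy"])
    return stale
```

Run quarterly, a check like this surfaces stale documents long before an assessor reads the date on the cover page.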

What Belongs in Procedure

A procedure is the operational implementation of a policy. It describes the specific steps, tools, configurations, and workflows that carry out the policy's requirements. Where a policy says "what must happen," a procedure says "how it happens, who does it, and what tools they use."

Procedures are evaluated during the Interview phase of the assessment. The assessor will ask a staff member to walk through the procedure — not read it from the document, but describe it from their operational experience. If the person cannot describe the procedure without consulting the document, the assessor may conclude the procedure is documented but not operationalized.

Policy Statement

"Audit logs shall be reviewed weekly for anomalous activity."

This is a decision. It declares the frequency, the scope, and the purpose. It belongs in the Audit and Accountability policy. It does not tell anyone how to perform the review.

Corresponding Procedure

"Every Friday, the IT Security Manager logs into the M365 Unified Audit Log, filters for failed sign-ins, admin changes, and DLP incidents, completes the Log Review Checklist, and saves the record to the Evidence Vault."

This is an implementation. It names the tool, the person, the steps, and the artifact produced. It belongs in the Log Review Procedure.

A well-constructed procedure includes:

  • Trigger — What initiates the procedure: a schedule (weekly, monthly), an event (new vulnerability published, user reports incident), or a condition (account lockout threshold exceeded).
  • Steps — The specific actions, in order. Detailed enough that a trained replacement could execute them. Not so detailed that they become brittle when a tool updates its interface.
  • Responsible role — Who performs each step. Named by role, not by individual.
  • Tools — Which systems or platforms are used. "Microsoft Purview Compliance Portal" is a tool. "The compliance tool" is not specific enough.
  • Output / artifact — What the procedure produces: a completed checklist, a ticket, a report, a configuration change. This artifact becomes the evidence that the procedure was executed.
  • Escalation path — What happens when the procedure encounters an anomaly or exception. Who is notified. What the next step is.
The most common procedure failure: The procedure exists but produces no artifact. The IT team performs the log review every Friday but does not complete a checklist, save a report, or create a ticket. The procedure is operationalized — the work happens — but there is no evidence it happened. From the assessor's perspective, undocumented work is unverified work. And unverified work is Not Met.
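One low-cost fix is to make the artifact a step in the procedure itself, so the weekly review cannot finish without leaving a record. A minimal sketch of a record writer the reviewer could run at the end of each review; the checklist fields and the Evidence Vault layout are illustrative assumptions, not prescribed by any framework:

```python
from datetime import date
from pathlib import Path

def write_review_record(vault: Path, reviewer: str, findings: list[str]) -> Path:
    """Write a dated log-review record so the review produces an artifact."""
    record = vault / f"log-review-{date.today().isoformat()}.txt"
    # An empty findings list still produces a record: "nothing anomalous"
    # on a given date is itself evidence the review happened.
    body = [f"  - {item}" for item in findings] or ["  - none"]
    lines = [
        f"Log Review Checklist — {date.today().isoformat()}",
        f"Reviewer: {reviewer}",
        "Scope: failed sign-ins, admin changes, DLP incidents",
        "Findings:",
        *body,
    ]
    record.write_text("\n".join(lines) + "\n")
    return record
```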

What Counts as Implementation Evidence

Evidence is the proof that a control is implemented and operating. It is the layer that assessors spend the most time evaluating — because it is the layer that cannot be fabricated retroactively. A policy can be written the week before an assessment. A procedure can be drafted from memory. But evidence of consistent operation over the past 90 days either exists or it does not.

CMMC evidence falls into three categories:

01: Configuration Evidence

Exports, screenshots, or reports showing that a technical control is configured as described in the SSP. Examples: conditional access policy JSON export, DLP rule configuration screenshot, Intune compliance baseline export, firewall rule table, sensitivity label policy settings. Configuration evidence proves the control exists in the system — but not that it is operating correctly or being maintained.

02: Operational Evidence

Records showing that a process was executed on a specific date by a specific person. Examples: completed log review checklists with dates and reviewer names, vulnerability scan reports from the past three months, patch deployment reports, training completion records, incident response case files. Operational evidence proves the control is not just configured — it is actively maintained by a human being on a defined schedule.

03: Outcome Evidence

Artifacts showing that a control produced a result. Examples: a DLP alert that fired and was investigated, a vulnerability that was remediated and confirmed cleared on rescan, an access review that resulted in permissions being removed, a phishing simulation report showing user click rates. Outcome evidence is the strongest category — it proves not just that the control runs, but that it works.

The assessor's hierarchy: Configuration evidence proves the control exists. Operational evidence proves someone maintains it. Outcome evidence proves it works. The strongest evidence packages include all three for critical controls. The weakest packages include only configuration evidence — a screenshot of a setting with no record of anyone ever reviewing whether that setting is effective.

Why Screenshots Cannot Replace Process

Screenshots are the most common evidence artifact in CMMC assessments — and the most overused. A screenshot of a conditional access policy proves the policy exists at the moment the screenshot was taken. It does not prove the policy was in effect last month. It does not prove anyone reviewed its effectiveness. It does not prove it was not temporarily disabled during a troubleshooting session and never re-enabled.

Assessors know this. That is why they do not accept screenshots alone for controls that require ongoing operation.

Insufficient: Screenshot of MFA policy

A screenshot of the Azure AD conditional access policy requiring MFA. Taken the day before the assessment. Proves the policy exists right now. Does not prove it was in effect three months ago. Does not prove it was not bypassed. Does not prove users are actually being challenged — only that the policy is configured.

Sufficient: Screenshot + sign-in logs + exclusion review

The same screenshot, plus Azure AD sign-in logs from the past 90 days showing MFA challenges being satisfied across multiple users. Plus a dated record of a quarterly review of the conditional access exclusion list showing who is excluded and why. Together, these prove the policy exists, operates, and is maintained.

The rule of thumb: a screenshot proves a point-in-time state. Logs, records, and dated reports prove continuous operation. For any control that requires ongoing execution — log review, vulnerability scanning, patch management, access reviews, training — a screenshot is the starting point, not the finish line. The assessor will ask: "This shows the configuration. Now show me that it ran last month."
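Turning an exported sign-in log into that continuous-operation summary can be scripted. A hedged sketch, assuming the log has been exported to CSV with hypothetical `date`, `user`, and `mfa_result` columns; a real Entra ID export uses different field names, so the parsing would need adjusting:

```python
import csv
from collections import Counter

def summarize_mfa_evidence(export_path: str) -> dict:
    """Summarize an exported sign-in log: how many MFA challenges were
    satisfied, across how many distinct users and days. This is the
    continuous-operation story a point-in-time screenshot cannot tell."""
    users, days = set(), set()
    results = Counter()
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            results[row["mfa_result"]] += 1
            if row["mfa_result"] == "satisfied":
                users.add(row["user"])
                days.add(row["date"])
    return {
        "satisfied": results["satisfied"],
        "distinct_users": len(users),
        "distinct_days": len(days),
    }
```

A summary covering 90 days of distinct users and dates, filed alongside the screenshot, answers "show me that it ran last month" before the assessor asks.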

How to Avoid Contradictions Across Documents

The most damaging documentation failure is not a missing document. It is a contradiction between documents that already exist. When the policy says one thing, the procedure says another, and the evidence shows a third, the assessor cannot score the control as Met — because the organization has not demonstrated a coherent implementation. It has demonstrated confusion.

Common contradictions that assessors find:

01: Policy Says "All Users" — Procedure Covers a Subset

The Access Control policy states: "All users shall use multi-factor authentication." The MFA procedure describes enforcement only for users in the "CUI Users" security group. The conditional access policy targets that same group — not "all users." The policy overpromises. The implementation underdelivers. The assessor reads the policy, checks the implementation, and finds the gap.

02: SSP Describes a Configuration That No Longer Exists

The SSP states: "DLP policies block external sharing of CUI-labeled content." The DLP policy was modified six months ago to change the action from "block" to "warn" after users complained. The SSP was never updated. The assessor reads the SSP, examines the DLP policy, and finds a mismatch. The control is either Not Met (the SSP is wrong) or the implementation is weaker than documented (the policy was degraded without authorization).

03: Retention Policy Says 12 Months — Configuration Shows 180 Days

The Audit and Accountability policy states a 12-month log retention period. The M365 Unified Audit Log is configured for 180-day retention. The SIEM retains data for 90 days before archiving. Nobody verified that the technical settings match the policy. The assessor exports the configuration, reads the number, and scores the control Not Met.

04: Incident Response Plan References a Tool That Was Replaced

The IR plan describes detection using "the Splunk SIEM." The organization migrated to Microsoft Sentinel eight months ago. The IR plan still references Splunk — including specific Splunk queries and dashboard names. The assessor reads the plan, asks the IR team about their process, and discovers the plan describes a system that no longer exists. The plan is not current. The control may be Not Met.

The prevention is straightforward but requires discipline: every time a technical configuration changes, review the policy and procedure that reference it. Every time a policy is updated, verify that the corresponding procedures and SSP descriptions still align. Build a cross-reference table — each policy section links to the procedure that implements it and the evidence artifact that proves it. When any element changes, the cross-reference flags what else must be updated.
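For parameters that live in configuration exports, such as retention periods, the cross-reference can be made machine-checkable. A sketch under stated assumptions: the entries, field names, and configuration keys below are all hypothetical placeholders for whatever an organization actually tracks:

```python
# Hypothetical cross-reference: each entry ties a policy statement to the
# parameter it promises and the value the policy declares.
CROSS_REFERENCE = [
    {"policy": "Audit & Accountability §2.1",
     "parameter": "log_retention_days", "declared": 365},
    {"policy": "Access Control §3.1",
     "parameter": "mfa_enforced_groups", "declared": "All Users"},
]

def find_contradictions(configured: dict) -> list[str]:
    """Compare what policies declare against what configuration exports show."""
    findings = []
    for entry in CROSS_REFERENCE:
        actual = configured.get(entry["parameter"])
        if actual != entry["declared"]:
            findings.append(
                f'{entry["policy"]}: declared {entry["declared"]!r}, '
                f'configured {actual!r}'
            )
    return findings
```

Fed with values pulled from the latest exports, a check like this turns the 12-months-versus-180-days mismatch from an assessment finding into a routine maintenance item.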

The most dangerous contradictions are the ones nobody notices. They happen when the IT team changes a configuration to solve an operational problem — disabling a DLP rule, adding an MFA exclusion, shortening a log retention period — without updating the documentation. The change was reasonable. The documentation gap creates the finding.

A Simple Mapping Model from Requirement to Proof

The most effective documentation strategy for CMMC is a four-column mapping table that connects every NIST SP 800-171 practice to its policy source, its implementing procedure, and its evidence artifact. This table becomes the master index for the assessment — the assessor can trace any control from requirement to proof in a single view.

AC.L2-3.1.1 — Limit system access
  • Policy: Access Control Policy §3.1 — "Access to CUI systems is restricted to authorized users based on role and least privilege."
  • Procedure: User Provisioning Procedure — Manager submits request → IT creates account in Azure AD → assigns to CUI Users group → enables MFA.
  • Evidence: Azure AD user list export showing group membership. Provisioning tickets for recent new hires. Access review record from last quarter.

AU.L2-3.3.1 — Create audit logs
  • Policy: Audit & Accountability Policy §2.1 — "All in-scope systems shall generate and retain audit logs for a minimum of 12 months."
  • Procedure: Log Management Procedure — Unified Audit Log enabled in GCC High. Retention set to 1 year. Firewall syslog forwarded to collector VM.
  • Evidence: Purview audit retention settings screenshot. Syslog collector configuration. Sample log entries from 90 days ago confirming retention.

RA.L2-3.11.2 — Vulnerability scanning
  • Policy: Risk Assessment Policy §4.2 — "Vulnerability scans shall be conducted monthly on all in-scope systems using authenticated scanning."
  • Procedure: Vulnerability Scanning Procedure — Nessus scan scheduled monthly → credentialed scan of all targets → results triaged by severity → remediated per SLA table.
  • Evidence: Scan configuration showing targets and credentials. Three consecutive monthly scan reports. Remediation tracker with dispositions. Rescan confirmation for remediated CVEs.

IR.L2-3.6.1 — Incident response
  • Policy: Incident Response Policy §1.1 — "The organization shall maintain and test an incident response capability."
  • Procedure: Incident Response Plan — Detection via Sentinel alerts → triage by IT Security Manager → containment → eradication → recovery → lessons learned. Tabletop exercise conducted annually.
  • Evidence: Sentinel alert rule configurations. Last tabletop exercise record with date, attendees, and findings. If a real incident occurred: the case file showing detection through resolution.
Build this table before the assessment — not during it. For each of the 110 practices, fill in all four columns. Any cell that is empty identifies a gap: a practice with no policy, a policy with no procedure, or a procedure with no evidence. Fix the empty cells before the assessor finds them.
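Once the table lives in a CSV, finding the empty cells is a few lines of code. A minimal sketch, assuming hypothetical column names `practice`, `policy`, `procedure`, and `evidence`:

```python
import csv

# The three layers every practice row must fill in.
LAYER_COLUMNS = ["policy", "procedure", "evidence"]

def find_gaps(table_path: str) -> list[tuple[str, str]]:
    """Return (practice, missing_layer) for every empty cell in the
    four-column mapping table."""
    gaps = []
    with open(table_path, newline="") as f:
        for row in csv.DictReader(f):
            for col in LAYER_COLUMNS:
                if not (row.get(col) or "").strip():
                    gaps.append((row["practice"], col))
    return gaps
```

An empty output means every practice traces from requirement to proof; anything else is the punch list to work before the assessment.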

This mapping table serves three purposes. First, it is a gap-finding tool during preparation — any empty cell is a gap that must be closed. Second, it is a navigation tool during the assessment — when the assessor asks about a control, you can point them to the exact policy section, the exact procedure document, and the exact evidence artifact. Third, it is a maintenance tool after certification — when anything changes, the table shows which other documents must be updated to prevent contradictions.

The mapping table is not a formal CMMC requirement. There is no assessment objective that says "produce a four-column mapping table." But the organizations that build one consistently produce cleaner assessments, faster evidence retrieval, and fewer documentation contradictions. It is the scaffolding that holds the entire documentation structure together — and it costs nothing to create.

The Bottom Line

A CMMC assessment evaluates three layers of documentation — policy, procedure, and evidence — and all three must be present, consistent, and current for every control practice. A policy without a procedure is an intent without an implementation. A procedure without evidence is a claim without proof. Evidence without a policy and procedure is an ad hoc action that cannot be sustained or repeated.

The contractors who fail assessments rarely fail because they have no documentation. They fail because the documentation they have describes a compliance posture that does not match what the assessor finds in the live environment. The policy says one thing. The system does another. Nobody updated the procedure when the configuration changed. The evidence was never collected because the procedure never defined an output artifact.

Build the mapping table. For each of the 110 practices, connect the policy to the procedure to the evidence. Every empty cell is a gap. Every contradiction is a finding. Every artifact you cannot produce on demand is a control you cannot prove. Fill the table. Align the layers. And when the assessor asks "show me how you do this" — show them all three.