
CMMC Scoring Explained

Met vs Not Met, Assessment Objectives, and Why One Gap Fails the Requirement

CMMC Level 2 findings are captured at the assessment objective level — one NOT MET objective causes the entire security requirement to be NOT MET. Per 32 CFR 170.4, an assessment objective is a set of determination statements expressing the desired outcome for a specific security requirement. Those determinations are reached through three validation methods: examine, interview, and test.

CMMC scoring is the binary evaluation of 110 control practices — decomposed into 320 discrete assessment objectives defined in NIST SP 800-171A — where each objective receives an independent finding of Met or Not Met. There is no curve. There is no "mostly compliant" outcome. A practice is Met only when every one of its objectives is individually Met. One unaddressed objective produces a Not Met finding for the entire practice, regardless of how many other objectives were satisfied. The resulting SPRS score, calculated under the DoD NIST SP 800-171 Assessment Methodology, determines certification eligibility under DFARS 252.204-7021.

Why this matters operationally: an organization that satisfies five of six objectives within a single practice has not satisfied the practice at all. Its preparation effort on those five objectives is real, but the finding is identical to that of an organization that addressed none of them. The gap does not reduce the impact of the finding — it only clarifies where remediation must focus.

What Are the Possible CMMC Scoring Outcomes?

The DoD CIO CMMC Level 2 Assessment Guide defines exactly three possible findings for each assessment objective. These are the only permitted outcomes — there is no partial credit, no conditional met, and no "in progress" status that carries scoring value.

Met

All Determination Statements Satisfied

The assessor has verified — through at least one of the three assessment methods — that the objective's determination statements are all satisfied by the current state of the environment. The evidence is adequate (right type) and sufficient (enough of it to demonstrate consistent implementation).

Not Met

One or More Statements Unsatisfied

The assessor has determined that at least one determination statement within the objective is not satisfied — whether because evidence is missing, the live system contradicts the documentation, or the control owner cannot demonstrate understanding of the requirement. The entire practice is Not Met.

Not Applicable

Objective Does Not Apply — With Documentation

The control objective does not apply to the assessed environment because the activity or technology it addresses is absent. N/A is rarely accepted and requires affirmative documentation: a policy prohibiting the activity, a technical control enforcing the prohibition, and an SSP statement explaining both. N/A is a claim that must be evidenced, not a blank field.

The No-Partial-Credit Rule
A practice with six assessment objectives where five are Met and one is Not Met is scored as Not Met for the practice. There is no weighting within the practice, no scoring of the proportion of objectives satisfied, and no mechanism to carry partial credit into the SPRS calculation. The binary nature of the finding is absolute at the practice level, even though the point system creates a range of impact at the score level.
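The no-partial-credit rule reduces to a single predicate: a practice is Met only when every objective beneath it is individually satisfied. A minimal Python sketch of that determination — the objective findings are illustrative, and the treatment of a documented, accepted N/A as satisfied is an assumption consistent with the N/A description above:

```python
def practice_finding(objective_findings):
    """A practice is Met only when every objective under it is satisfied.

    `objective_findings` maps objective IDs to "Met", "Not Met", or
    "N/A" (assumed here: an accepted, documented N/A does not fail
    the practice). One "Not Met" fails the whole practice.
    """
    satisfied = {"Met", "N/A"}
    if all(f in satisfied for f in objective_findings.values()):
        return "Met"
    return "Not Met"

# Five of six objectives Met -- the practice is still Not Met.
ac_3_1_1 = {"a": "Met", "b": "Met", "c": "Met",
            "d": "Met", "e": "Met", "f": "Not Met"}
print(practice_finding(ac_3_1_1))  # Not Met

# All objectives Met -- the practice is Met.
au_3_3_2 = {"a": "Met", "b": "Met", "c": "Met"}
print(practice_finding(au_3_3_2))  # Met
```

There is no count, ratio, or weight inside the function — the practice-level finding is a pure all-or-nothing conjunction, which is exactly the rule the text describes.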

How Does One Missed Objective Fail an Entire CMMC Requirement?

The structure that makes one missed objective consequential is the practice-to-objective decomposition defined in NIST SP 800-171A. Each practice title is an umbrella for a set of specific determination statements that together constitute full implementation. Satisfying the practice title in spirit is not the same as satisfying each determination statement in evidence.

Practice-to-Objective Decomposition — Two Representative Level 2 Practices

AC.L2-3.1.1 Authorized Access Control (5 points — cannot defer to POA&M)
[a] Authorized users of the system are identified. | Met
[b] Processes acting on behalf of authorized users are identified. | Met
[c] Devices and other systems authorized to connect to the system are identified. | Met
[d] System access is limited to authorized users. | Met
[e] System access is limited to processes acting on behalf of authorized users. | Met
[f] System access is limited to devices and other systems authorized to connect. | Not Met
✗ Practice NOT MET — Objective [f] is unaddressed. Five Met objectives do not offset one Not Met: the full 5-point deduction applies, and the practice cannot be deferred to a POA&M.

AU.L2-3.3.2 User Activity Audit (3 points — POA&M eligible)
[a] The content of audit records needed to support the monitoring, analysis, investigation, and reporting of unlawful or unauthorized system activity is defined. | Met
[b] Audit records contain the defined content. | Met
[c] Audit records support the monitoring, analysis, investigation, and reporting of unlawful or unauthorized system activity. | Met
✓ Practice MET — All three objectives independently satisfied.

Practice-to-objective decomposition showing how one Not Met objective (AC.L2-3.1.1[f]) fails a 5-point practice, while AU.L2-3.3.2, with every objective Met, passes. Not Met 5-point controls cannot be deferred to a POA&M.

How Do Assessors Determine Met vs Not Met?

NIST SP 800-171A defines three assessment methods. Formally, satisfying one of the three methods for a given objective is sufficient to mark it Met. In practice, C3PAO assessors cross-validate across at least two methods — typically Examine plus either Interview or Test — because a policy that exists on paper but is not understood by the control owner, or not enforced by the live system, is a policy that does not satisfy the objective.

Examine

Document Review

The assessor reviews documentation — policies, procedures, SSP implementation statements, configuration exports, activity records — and evaluates whether the evidence demonstrates the objective is satisfied.

Objects: Specifications (policies, procedures) · Mechanisms (configs, system outputs) · Activities (dated records of execution)

Interview

Personnel Verification

The assessor interviews the specific individual named as control owner in the SSP. The interview verifies that the control owner understands the requirement, can describe how it is implemented, and gives an account consistent with the documented procedures.

Objects: Individuals (control owners by name, not by role or proxy)

Test

Live System Verification

The assessor exercises a live system and observes whether it behaves as documented — watching a session lock, triggering a SIEM alert, reviewing active ACL enforcement, or confirming MFA is active on CUI accounts in production.

Objects: Mechanisms (systems, devices, tools) · Activities (observable processes)

Object Type | What It Is | Examples | Method
Specifications | Documented statements of policy, procedure, standard, or requirement | Access control policy, incident response procedure, acceptable use policy, SSP implementation statement | Examine
Mechanisms | Hardware, software, and firmware implementing or enforcing a requirement | Firewall ACL rules, MFA configuration, SIEM alert policy, AD group policy, EDR settings | Examine / Test
Activities | Operational processes — records demonstrating a control is consistently executed | Dated log review records, patch remediation tickets, training completion logs, signed scan reports | Examine / Test
Individuals | The specific people responsible for implementing and maintaining each control | Network administrator, HR director, ISSO, facilities manager, sampled end users | Interview
Four assessment object types defined in NIST SP 800-171A with corresponding examples and assessment methods used by C3PAO assessors during CMMC Level 2 evaluations.
The object type determines the evidence type required. An objective that targets an activity — "audit logs are reviewed on a defined schedule" — requires a dated record of that activity happening, not a policy stating it should happen. Policies are specifications. Records are activities. An assessor cannot substitute one for the other even when both exist and are consistent.

How Does the 5/3/1 Point System Affect Your SPRS Score?

The DoD NIST SP 800-171 Assessment Methodology assigns a point weight to each of the 110 control practices. These weights determine two things: how much each Not Met practice reduces the SPRS score, and whether a failing practice is eligible to be placed on a POA&M or must be fully remediated before conditional status can be issued.

5
Points — Cannot Be Deferred

Highest-consequence security capabilities. A Not Met 5-point practice cannot be placed on a POA&M. It must be fully implemented before Phase 2 begins. Examples: FIPS-validated cryptography (SC.L2-3.13.8), multifactor authentication (IA.L2-3.5.3), incident response (IR.L2-3.6.1). An open 5-point practice at Phase 2 close blocks conditional status entirely.

3
Points — POA&M Eligible

Significant security capabilities whose absence is serious but whose remediation can be tracked on a 180-day closeout timeline. A Not Met 3-point practice reduces the SPRS score by 3 and is eligible for the POA&M — provided the total score still meets the 88/110 threshold for conditional status.

1
Points — Limited Practice Deficiency

Minor deficiencies — often documentation issues such as a missing signature, an undated policy, or an incomplete procedure statement. One-point failures qualify for the 5-day Limited Practice Deficiency correction window. Uncorrected within 5 days, they move to the POA&M. A large number of 1-point failures can still add up to meaningful SPRS score reduction.

The Scoring Misunderstanding
Point values determine POA&M eligibility and SPRS score impact — they do not determine whether a practice is Met or Not Met. A 1-point practice with one unaddressed objective is just as Not Met as a 5-point practice with the same gap. The difference is what happens to the finding: the 5-point finding blocks conditional status, while the 1-point finding triggers the 5-day correction window. Both are equally binary in their Met/Not Met determination.
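The interaction between binary findings and point weights can be sketched in a few lines. The gap lists below are illustrative placeholders, not the official DoD scoring annex; the 88/110 threshold and the 5-point POA&M exclusion follow the rules described above:

```python
FULL_SCORE = 110
CONDITIONAL_THRESHOLD = 88  # 0.8 x 110, the threshold described above

def sprs_score(not_met_weights):
    """Start at 110 and subtract the full weight (5, 3, or 1) of every
    Not Met practice -- partial objective coverage changes nothing."""
    return FULL_SCORE - sum(not_met_weights)

def conditional_eligible(not_met_weights):
    """Conditional status, per the rules above, requires both:
    - no Not Met 5-point practice (those cannot go on a POA&M), and
    - a score at or above the 88/110 threshold."""
    no_five_point_gaps = all(w < 5 for w in not_met_weights)
    return no_five_point_gaps and sprs_score(not_met_weights) >= CONDITIONAL_THRESHOLD

# Illustrative: three Not Met practices weighted 3, 3, and 1.
gaps = [3, 3, 1]
print(sprs_score(gaps))            # 103
print(conditional_eligible(gaps))  # True

# A single Not Met 5-point practice blocks conditional status outright,
# even though the score (105) is well above the threshold.
print(conditional_eligible([5]))   # False

# Eight Not Met 3-point practices: no 5-point gap, but 110 - 24 = 86
# falls below 88, so conditional status is still unavailable.
print(conditional_eligible([3] * 8))  # False
```

Note that both gates are independent: the 5-point exclusion can fail an otherwise high score, and an accumulation of POA&M-eligible gaps can fail the threshold on its own.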

Why Do Well-Prepared Organizations Still Fail CMMC Objectives?

The majority of assessment failures occur not in organizations with no security program, but in organizations whose preparation was real but whose evidence — for specific objectives — was inadequate, insufficient, or internally inconsistent. These four patterns account for most Not Met findings in otherwise well-prepared environments.

Failure Pattern 01: The Interview Disconnect — Policy Exists, Control Owner Does Not Know It

A correctly written, signed access control policy satisfies the Examine method for several objectives. The assessor then interviews the network administrator who owns those controls. The administrator describes a process that contradicts the policy, cannot identify where the policy is stored, or is unaware that the policy assigns them specific implementation responsibilities. The interview produces a Not Met finding for objectives that the Examine method had satisfied.

The critical insight: each assessment method is evaluated independently. A policy that satisfies Examine does not carry over to satisfy Interview. The control owner must be able to describe the same implementation the policy documents.

⛔ Finding: Not Met — examination evidence overridden by contradictory interview evidence.

Failure Pattern 02: The Configuration Mismatch — SSP States One Thing, System Does Another

The SSP states that user accounts lock after three consecutive failed authentication attempts. The assessor tests the live system and observes that the group policy actually enforces a five-attempt threshold. The policy exists and the SSP documents the requirement. The test produces a different result than the documentation claims. The objective is Not Met because the test demonstrates the control is not implemented as documented — regardless of whether three attempts or five attempts is the "right" security posture.

⛔ Finding: Not Met — live system behavior contradicts SSP documentation. The assessment evaluates what is true, not what is intended.

Failure Pattern 03: The N/A Claim Without Supporting Evidence

An organization that bans remote access marks the remote access control objectives as Not Applicable. The assessor asks for the policy explicitly prohibiting remote access, the technical control that enforces the prohibition, and the procedure for verifying that new systems are deployed without remote access capability. None of these exist — the organization simply does not allow remote access and has never formalized the prohibition. N/A without supporting evidence is the same as Not Met in practice, because the assessor has no basis to accept the N/A claim.

⛔ Finding: Not Met — N/A is a claim that requires documentation. An undocumented prohibition provides the assessor nothing to evaluate.

Failure Pattern 04: Missing Authorizing Signature on an Otherwise Complete Policy

A policy document that is complete in content, correctly scoped, and thematically resonant with the assessment objectives it supports — but lacks an authorizing signature and date from a senior official — is not a valid policy for assessment purposes. Without an authorizing signature, the assessor cannot confirm the policy is operative rather than a draft. The finding lands on the Limited Practice Deficiency list and triggers the 5-day correction window. If not corrected, it moves to the POA&M.

⛔ Finding: Limited Practice Deficiency — 5-day correction window. All policies must bear current, dated signatures from an authorizing official before assessment day.

Which CMMC Objectives Apply to Each Asset Category?

The 320 assessment objectives do not apply equally to every asset in the environment. The DoD's five asset categories determine which practices must be demonstrated against which systems — and understanding this mapping is what makes enclave scoping a meaningful cost-reduction strategy rather than merely an organizational convenience.

Asset Categories and CMMC Practice Assessment Scope
Category 01
CUI Assets
Systems that store, process, or transmit CUI. The full 110 control practices with all applicable assessment objectives apply. Every objective for every applicable practice must be Met for these assets.
Full — All 110
Category 02
Security Protection Assets
Systems providing security functions that protect CUI — SIEM, EDR, vulnerability scanner, firewall, IAM platform, backup. Assessed against all applicable CMMC practices relevant to their security function. A SIEM that is not itself securing CUI but is securing the CUI enclave is fully in scope.
Full — Relevant
Category 03
Contractor Risk Managed Assets
Assets connected to the network but restricted from touching CUI through documented policy and technical controls. Assessed only against the SSP — the organization must demonstrate through documentation and network evidence that these assets cannot access CUI. No full-control assessment applies to CRMAs if the boundary is adequately documented.
SSP Only
Category 04
Specialized Assets
Government-owned equipment, operational technology, IoT devices, test equipment, and other systems unable to fully satisfy all CMMC requirements due to their technical constraints. Assessed against the SSP with compensating controls documented. The organization must demonstrate that compensating controls sufficiently address the security risk.
SSP + Compensating
Category 05
Out-of-Scope Assets
Assets logically and physically isolated from CUI with no connectivity to the CUI environment. Not assessed against CMMC practices. The isolation must be documented and demonstrable — an asset claimed to be out of scope but reachable from the CUI enclave is in scope by definition.
Not Assessed

How Do You Prepare for Every CMMC Assessment Objective?

The binary nature of objective-level scoring makes pre-assessment preparation a completeness exercise rather than a best-effort exercise. Every objective must be individually satisfied. These are the five preparation checks that prevent single-objective failures from producing practice-level Not Met findings in a well-prepared environment.

📋
Build an Evidence Mapping File That Points to Every Objective
Every assessment objective requires a pointer to its supporting evidence at the exact document, page, and paragraph level. Not a list of documents — a direct pointer for each of the 320 objectives. An objective without a mapped artifact is an objective the assessor will need to search for, which expands sampling and extends assessment duration.
🔤
Verify Thematic Resonance Between SSP Statements and Objective Language
Each SSP implementation statement must use the vocabulary of the assessment objective it addresses. If the objective states "authorized users are identified" and the SSP states "access rosters are maintained," the assessor cannot confirm a match without interpretation — which they are not permitted to perform. Rewrite statements that do not directly mirror the objective language before the assessment, not after.
🔁
Validate Evidence Across All Three Methods — Not Just Examine
For every objective, confirm that the live system behaves consistently with the documentation (Test) and that the control owner can describe the implementation consistent with the SSP (Interview). A policy that passes Examine but fails Interview or Test is a Not Met finding. The Rule of Three: ensure at least Examine plus one additional method is supportable for every objective.
✍️
Audit Every Policy Document for Authorizing Signature and Date
Run through every policy submitted as evidence and confirm it carries a current, dated signature from an authorizing official. Missing signatures are the most common Limited Practice Deficiency trigger — they are entirely preventable with a pre-assessment signature audit at least 30 days before the engagement begins, leaving time to obtain signatures without scheduling pressure.
Verify All 5-Point Controls Are Fully Implemented
Identify every 5-point control practice in your environment and confirm that all applicable assessment objectives are Met. No 5-point control can appear on a POA&M. A 5-point Not Met finding at Phase 2 close blocks conditional status entirely and requires full remediation before any certification can be issued — meaning the cost of the assessment is sunk until the remediation is complete and a new assessment is scheduled.
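The evidence mapping check in particular lends itself to automation. A minimal sketch, assuming a hypothetical objective-ID-to-pointer map; the IDs, document names, and section numbers below are invented for illustration, and a real file would enumerate all 320 objectives:

```python
# Hypothetical evidence map: objective ID -> (document, section, paragraph).
# Names and numbers are illustrative, not taken from a real assessment.
evidence_map = {
    "AC.L2-3.1.1[a]": ("Access Control Policy v3.2", "4.1", "2"),
    "AC.L2-3.1.1[b]": ("Access Control Policy v3.2", "4.2", "1"),
    # ... a complete file carries one entry per assessment objective ...
}

def unmapped_objectives(all_objectives, evidence_map):
    """Return every objective with no evidence pointer. Each hit is an
    objective the assessor must hunt for -- or a Not Met waiting to
    happen -- so the goal before assessment day is an empty list."""
    return sorted(obj for obj in all_objectives if obj not in evidence_map)

# Illustrative objective list (a real one holds all 320 IDs from SP 800-171A).
objectives = ["AC.L2-3.1.1[a]", "AC.L2-3.1.1[b]", "AC.L2-3.1.1[c]"]
print(unmapped_objectives(objectives, evidence_map))  # ['AC.L2-3.1.1[c]']
```

Running a check like this against the full objective list turns the completeness exercise into a deterministic pass/fail report rather than a manual review.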

The Bottom Line

CMMC scoring is a completeness test applied at the most granular level the framework defines under 32 CFR Part 170. A practice is the unit of scoring in terms of point impact on the SPRS total. An assessment objective is the unit of evaluation in terms of what the assessor must verify.

Execute this checklist before your C3PAO assessment: (1) Map evidence to every one of the 320 assessment objectives at the document-section-paragraph level. (2) Verify that every SSP implementation statement mirrors the exact vocabulary of its corresponding NIST SP 800-171A objective. (3) Test every live system configuration against its documented control — session lock timers, MFA enforcement, ACL rules — and correct any mismatch before assessment day. (4) Run mock interviews with every named control owner and confirm they can describe their specific controls without contradicting the SSP. (5) Identify all 5-point controls and verify full implementation — these cannot be deferred to a POA&M under the 180-day closeout rule. (6) Audit every policy document for a dated authorizing signature from a senior official at least 30 days before the assessment. The assessment evaluates 320 binary determination statements. Prepare for every one of them — not for most of them.

"Mostly compliant" is not an assessment outcome. It is a description of a Not Met finding that was close. The gap between Met and Not Met is the gap between a certification and a POA&M — and in some cases, the gap between a conditional status and a full reassessment from scratch. Prepare for every objective. Not for most of them.