How to Conduct a SOC 2 Gap Analysis: A Complete Guide to Audit Readiness in 2026
Written by Jon Ozdoruk
The decision to pursue SOC 2 certification rarely comes from inside the organization. It comes from outside — an enterprise prospect's security team that will not sign a contract without it, a customer success manager who has lost three deals in a row to the same question, a legal team reviewing a vendor agreement that requires it as a condition of doing business. The decision is usually made quickly. The implementation that follows is rarely as quick.
What most organizations discover when they begin SOC 2 preparation is that they lack a clear picture of where they actually stand. They have security practices — some formal, some informal, some documented, some living entirely in the engineering team's institutional memory. What they do not have is a precise understanding of how those practices map to what SOC 2 actually requires, which controls they already satisfy, which they partially address, and which are entirely absent.
That understanding is what a SOC 2 gap analysis provides. A structured gap analysis is the essential first step between the decision to pursue SOC 2 and the implementation work that closes the distance to audit readiness. It tells you where you are, what you need to build, what you need to formalize, and how long it will realistically take — before an auditor tells you the same things in a findings report that delays your certification by six months.
This guide explains exactly how to conduct a SOC 2 gap analysis — what the Trust Services Criteria require at the control level, how to assess your current practices against those requirements, how to score your findings, and how to turn the output into a prioritized remediation roadmap that gets you to a clean Type II opinion.
What a SOC 2 Gap Analysis Measures
A SOC 2 gap analysis measures the distance between your organization's current controls and processes and the requirements of the Trust Services Criteria against which you will be audited. It is structured around the criteria you have selected for your SOC 2 scope — the Security criterion, which is mandatory for every SOC 2 audit, and any additional criteria you have included based on your customers' expectations and the nature of your service.
The Security criterion — also called the Common Criteria — contains 33 criteria organized across nine categories: CC1 through CC9. It is the foundation of every SOC 2 audit and the source of most audit findings. The additional Trust Services Criteria — Availability (A1), Confidentiality (C1), Processing Integrity (PI1), and Privacy (P1 through P8) — each include a set of criteria relevant to their respective focus areas.
A gap analysis is not the same as a readiness assessment. A readiness assessment is typically conducted by the auditor or a third-party consultant close to the audit date — it tells you whether you are ready to be audited. A gap analysis is conducted by the organization itself at the beginning of the implementation process — it tells you what you need to build in order to be ready. The gap analysis drives the implementation. The readiness assessment verifies it. Conflating the two is one of the most common reasons organizations go into SOC 2 audits underprepared.
The gap analysis also establishes your baseline. As remediation activities are completed over the following months, rescore individual criteria against the gap analysis baseline to track progress, update your preparation timeline, and demonstrate to leadership and customers that the program is advancing on schedule.
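Rescoring against the baseline is mechanical once each criterion carries a numeric score (such as the four-point scale this guide describes below). A minimal sketch — the criterion IDs and scores here are illustrative, not drawn from any real assessment:

```python
def progress(baseline, current):
    """Compare a rescoring pass against the gap analysis baseline.

    Returns criteria that improved, regressed, or are now fully
    addressed (score 3 on the four-point scale used in this guide).
    """
    improved = [c for c in current if current[c] > baseline.get(c, 0)]
    regressed = [c for c in current if current[c] < baseline.get(c, 0)]
    done = [c for c in current if current[c] == 3]
    return improved, regressed, done

# Illustrative baseline and three-months-later rescoring.
baseline = {"CC3.2": 0, "CC6.3": 1, "CC8.1": 2}
current = {"CC3.2": 2, "CC6.3": 1, "CC8.1": 3}
improved, regressed, done = progress(baseline, current)
print(improved, regressed, done)  # ['CC3.2', 'CC8.1'] [] ['CC8.1']
```

The same comparison, run monthly, is the progress report leadership asks for.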
Selecting Your Scope Before Starting
Before beginning the gap analysis, the scope of your SOC 2 audit must be defined. Scope definition determines which systems, services, and criteria are covered by the audit — and therefore which criteria your gap analysis must assess.
Criteria selection is the first scope decision. Every SOC 2 audit includes the Security criterion. Beyond Security, criteria selection should be driven by your customers' requirements and the nature of your service. Availability is relevant for any SaaS company whose customers depend on the service being available, which is most of them. Confidentiality is relevant for companies that process sensitive business information under confidentiality obligations. Processing Integrity is relevant for companies that provide transaction processing, financial calculations, or other services where the accuracy and completeness of processing is critical. Privacy is relevant for companies that collect and process personal information directly from individuals.
Including criteria that are not relevant to your service creates an unnecessary audit surface without corresponding value to customers. Including criteria that are relevant to your service but excluding them from scope creates a gap between what your SOC 2 report covers and what your customers need it to cover, which defeats the commercial purpose of certification.
System boundary definition is the second scope decision. The system in scope for the SOC 2 audit must be defined before the gap analysis begins. The system boundary includes the infrastructure, software, people, procedures, and data that are used to provide the service being audited. Defining the boundary too narrowly excludes controls that auditors will expect to see. Defining it too broadly creates an audit surface that is difficult to manage and includes systems and processes not relevant to the service being evaluated.
For most SaaS companies, the system boundary includes the production application and its supporting infrastructure; the data processing and storage systems; the network and security infrastructure; the people and processes involved in operating and supporting the service; and the monitoring and incident response processes. Development environments, internal tools that do not access production data, and corporate IT systems not involved in service delivery are typically excluded.
The Gap Analysis Scoring Framework
A consistent scoring framework is what separates a gap analysis that produces actionable results from one that produces only impressions. A four-point scale, applied uniformly to every criterion, provides the granularity needed to prioritize remediation and estimate timelines accurately.
Score 0 — Not Addressed. No relevant control, process, or policy exists. This gap requires building a capability from scratch and represents the highest remediation effort.
Score 1 — Partially Addressed. Some relevant activity exists — an informal practice, a policy that addresses part of the requirement, or a technical control that partially satisfies the criterion — but significant gaps remain. The existing activity provides a foundation, but cannot be presented as compliance without substantial additional work.
Score 2 — Largely Addressed. The criterion is substantially met by existing practices. A control exists, it is implemented, and evidence is available. Minor gaps remain — documentation may be incomplete, the control may not be applied consistently, or evidence collection may not be systematic — but remediation is incremental.
Score 3 — Fully Addressed. The criterion is fully met. The control is documented, consistently implemented, and produces evidence that would withstand audit scrutiny without modification.
Every criterion in your selected scope receives a score. The distribution of scores across criteria tells you your overall gap profile — the proportion of criteria at each level — and the specific scores tell you where to focus remediation effort.
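The gap profile and the remediation ordering fall directly out of the scored criteria. A small sketch of both computations — the criterion IDs and scores are illustrative, and a real assessment would cover all 33 Security criteria plus any additional Trust Services Criteria in scope:

```python
from collections import Counter

# Illustrative scores on the 0-3 scale described above.
scores = {
    "CC1.1": 2, "CC3.2": 0, "CC5.3": 1,
    "CC6.2": 3, "CC7.4": 1, "CC9.2": 0,
}

def gap_profile(scores):
    """Count of criteria at each score level — the overall gap profile."""
    counts = Counter(scores.values())
    return {level: counts.get(level, 0) for level in range(4)}

def remediation_queue(scores):
    """Lowest-scoring criteria first: they need the most work."""
    return sorted(scores, key=lambda c: scores[c])

print(gap_profile(scores))            # {0: 2, 1: 2, 2: 1, 3: 1}
print(remediation_queue(scores)[:2])  # ['CC3.2', 'CC9.2']
```

Two criteria at score 0 and two at score 1 is a typical first-assessment profile for a company with real but undocumented security practices.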
Assessing the Common Criteria: CC1 Through CC9
The Common Criteria assessment is the core of any SOC 2 gap analysis. The 33 criteria across nine categories cover the full scope of the Security Trust Services Criterion and are the source of most audit findings.
CC1 — Control Environment
The control environment criteria address the governance and organizational foundation of the security program. They assess whether senior management has demonstrated commitment to security, whether the organizational structure supports security objectives, and whether personnel understand their security responsibilities.
CC1.1 requires that the entity demonstrate a commitment to integrity and ethical values. This is assessed through documented code-of-conduct policies, background-check procedures, and evidence that the organization's stated values are reflected in its actual practices.
CC1.2 requires that the board of directors or equivalent demonstrate independence from management and exercise oversight of the development and performance of internal controls. For SaaS companies without formal boards, oversight responsibility typically falls to the executive team and must be evidenced through management review activities and documented oversight of the security program.
CC1.3 requires that management establish structures, reporting lines, and appropriate authorities to pursue objectives. Documented organizational charts, defined security roles and responsibilities, and evidence that those roles are actively exercised satisfy this criterion.
CC1.4 requires the entity to demonstrate a commitment to attracting, developing, and retaining competent individuals aligned with its objectives. Security awareness training records, role-specific security training documentation, and evidence of security competency requirements in hiring are the primary pieces of evidence.
CC1.5 requires that the entity hold individuals accountable for their internal control responsibilities. Documented performance expectations related to security, evidence of disciplinary processes for security policy violations, and management monitoring of security responsibilities satisfy this criterion.
The most common CC1 gap is the absence of documented evidence of management oversight. Leadership commitment to security may be genuine but undocumented. SOC 2 auditors require evidence — not assertions — of management commitment. Management review meetings that address security, documented security metrics reported to leadership, and signed security policies are the most frequently missing pieces of evidence.
CC2 — Communication and Information
CC2 criteria address the flow of security-relevant information within and around the organization. They require that security policies be communicated to personnel, that information needed to support security objectives be available, and that the organization communicate appropriately with external parties.
CC2.1 requires that the entity obtain or generate relevant, high-quality information to support the functioning of internal controls. This is satisfied by documented security policies, risk assessments, and the information outputs of security monitoring activities.
CC2.2 requires that the entity internally communicate information necessary to support the functioning of internal controls. Security awareness training, policy distribution records, and security communications to relevant personnel satisfy this criterion.
CC2.3 requires the entity to communicate with external parties regarding matters that affect the functioning of internal controls. Customer-facing security documentation, incident notification procedures, and the mechanism for customers to report security concerns are the primary external communication requirements.
CC3 — Risk Assessment
The risk assessment criteria are among the most scrutinized in SOC 2 audits because they form the analytical foundation of the entire security program. A security program without a functioning risk assessment process is a collection of controls without a rationale.
CC3.1 requires that the entity specify objectives with sufficient clarity to enable the identification and assessment of risks relating to objectives. For SOC 2, the relevant objectives are the service commitments and system requirements defined in the system description — the availability, confidentiality, processing integrity, and security properties the organization has committed to delivering.
CC3.2 requires that the entity identify risks to the achievement of its objectives across the entity and analyze them as a basis for determining how to manage them. A formal risk assessment methodology, documented risk register, and evidence of regular risk assessment activities satisfy this criterion. The risk assessment must be specifically linked to the SOC 2 service commitments — it is not sufficient to have a generic IT risk assessment that does not address the risks to the services in scope for the audit.
CC3.3 requires the entity to consider the potential for fraud when assessing risks to the achievement of objectives. Fraud risk assessment must be a documented component of the risk assessment process.
CC3.4 requires the entity to identify and assess changes that could significantly affect the system of internal controls. Change management processes that include security impact assessment — for new system deployments, new supplier relationships, significant product changes, and organizational changes — satisfy this criterion.
CC4 — Monitoring Activities
CC4 criteria address whether the organization monitors the effectiveness of its controls and takes action when monitoring reveals gaps or failures.
CC4.1 requires that the entity select, develop, and perform ongoing and separate evaluations to ascertain whether the components of internal control are present and functioning. Continuous security monitoring, vulnerability scanning, and periodic internal assessments of control effectiveness satisfy this criterion.
CC4.2 requires the entity to evaluate and communicate internal control deficiencies to the parties responsible for taking corrective action in a timely manner. A documented process for tracking identified control gaps, assigning remediation owners, and verifying remediation completion is required. The absence of a formal deficiency tracking process is a consistent CC4 finding. Many organizations identify control gaps informally but lack a systematic process for tracking them through to resolution.
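The systematic tracking CC4.2 asks for does not need to be elaborate — it needs to exist. A minimal sketch of a deficiency tracker; the fields and statuses are assumptions for illustration, not anything CC4.2 prescribes:

```python
# Minimal deficiency tracker: log a gap, assign an owner, track it
# through to resolution. Field names and statuses are illustrative.
deficiencies = []

def log_deficiency(control, owner):
    deficiencies.append({"control": control, "owner": owner, "status": "open"})

def close_deficiency(control):
    for d in deficiencies:
        if d["control"] == control:
            d["status"] = "remediated"

def open_items():
    """Controls with unresolved deficiencies — the CC4.2 evidence trail."""
    return [d["control"] for d in deficiencies if d["status"] == "open"]

log_deficiency("CC6.3", "it-ops")
log_deficiency("CC8.1", "eng")
close_deficiency("CC6.3")
print(open_items())  # ['CC8.1']
```

A ticketing system serves the same purpose; what matters to the auditor is that every identified gap has an owner and a recorded resolution.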
CC5 — Control Activities
CC5 criteria address whether the organization has selected and implemented controls that address identified risks and whether those controls are reflected in policies and procedures.
CC5.1 requires that the entity select and develop control activities that contribute to mitigating risks to the achievement of objectives to acceptable levels. The controls implemented must be traceable to the risks identified in the CC3 risk assessment — controls that exist without a documented risk rationale are harder to defend in an audit.
CC5.2 requires that the entity also select and develop general control activities over technology to support the achievement of objectives. Technology controls — access management, change management, system monitoring, backup and recovery — must be implemented and documented.
CC5.3 requires that the entity deploy control activities through policies that establish what is expected and in procedures that put policies into action. Policies alone are insufficient — there must be documented procedures that operationalize each policy and evidence that those procedures are followed.
CC6 — Logical and Physical Access Controls
CC6 is the criterion category that generates the most findings in SOC 2 audits. It addresses the controls that govern who has access to what — the technical and operational measures that prevent unauthorized access to systems, data, and infrastructure.
CC6.1 requires that the entity implement logical access security software, infrastructure, and architectures over protected information assets to protect them from security events. Multi-factor authentication, role-based access control, network segmentation, and encryption at rest and in transit are the primary controls assessed under this criterion.
CC6.2 requires that, prior to issuing system credentials and granting system access, the entity register and authorize new internal and external users whose access is administered by the entity. The access provisioning process — how access is requested, approved, and granted — must be documented and consistently followed.
CC6.3 requires that the entity remove access to protected information assets when appropriate. Offboarding procedures that include timely revocation of access — within a defined timeframe following termination or a role change — are required. Timely access revocation is one of the most frequently cited CC6 findings. Many organizations eventually revoke access, but not within a defined and documented timeframe.
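Checking revocation timeliness against offboarding records can be automated once the timeframe is defined. In this sketch the 24-hour SLA and the record fields are assumptions — CC6.3 requires a defined and documented timeframe, not any specific one:

```python
from datetime import datetime, timedelta

# Assumed revocation window; your access policy defines the real one.
REVOCATION_SLA = timedelta(hours=24)

# Illustrative offboarding records.
records = [
    {"user": "alice", "terminated": datetime(2026, 3, 2, 9, 0),
     "access_revoked": datetime(2026, 3, 2, 11, 30)},
    {"user": "bob", "terminated": datetime(2026, 3, 5, 17, 0),
     "access_revoked": datetime(2026, 3, 9, 10, 0)},
]

def sla_violations(records, sla=REVOCATION_SLA):
    """Return users whose access outlived the defined revocation window."""
    return [r["user"] for r in records
            if r["access_revoked"] - r["terminated"] > sla]

print(sla_violations(records))  # ['bob']
```

Running a check like this on every offboarding, and keeping the output, produces exactly the evidence trail auditors ask for under CC6.3.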
CC6.4 requires that the entity restrict physical access to facilities and protected information assets to authorized personnel. Physical access controls to offices and any on-premises infrastructure must be documented and evidenced.
CC6.5 requires that the entity discontinue logical and physical protections over physical assets, including media, when those assets are removed from service. Secure disposal procedures for storage media and decommissioned devices are required.
CC6.6 requires the entity to implement controls to prevent, detect, and respond to the introduction of unauthorized or malicious software. Endpoint protection, email security controls, and application whitelisting or equivalent controls satisfy this criterion.
CC6.7 requires that the entity restrict the transmission, movement, and removal of information to authorized internal and external users and processes. Data loss prevention controls, encrypted transmission requirements, and controls on removable media are the primary implementation mechanisms.
CC6.8 requires the entity to implement controls to prevent, detect, and respond to unauthorized or malicious actions by software tools that may affect the subject matter of the engagement. This criterion addresses the controls applied to privileged access tools, administrative utilities, and other software with elevated system access.
CC7 — System Operations
CC7 criteria address the operational security controls that protect systems in production — vulnerability management, monitoring, incident detection and response, and recovery from security events.
CC7.1 requires that, to meet its objectives, the entity use detection and monitoring procedures to identify changes to configurations or new vulnerabilities, and use a process to address identified vulnerabilities. Vulnerability scanning, configuration monitoring, and a documented vulnerability remediation process with severity-defined SLAs are the core requirements.
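The severity-defined SLAs can be enforced with a simple overdue check against the vulnerability register. The severity-to-days mapping below is an assumption for illustration — CC7.1 requires a defined remediation process, not these particular windows:

```python
from datetime import date, timedelta

# Assumed remediation windows by severity; your vulnerability
# management policy defines the real ones.
SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

# Illustrative open findings.
findings = [
    {"id": "VULN-101", "severity": "critical", "opened": date(2026, 1, 5)},
    {"id": "VULN-102", "severity": "medium", "opened": date(2026, 1, 20)},
]

def overdue(findings, today):
    """Flag open findings past their severity-defined remediation SLA."""
    return [f["id"] for f in findings
            if today - f["opened"] > timedelta(days=SLA_DAYS[f["severity"]])]

print(overdue(findings, today=date(2026, 2, 1)))  # ['VULN-101']
```

A report like this, produced on a schedule and retained, doubles as CC7.1 evidence that identified vulnerabilities are tracked against the documented process.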
CC7.2 requires that the entity monitor system components and their operation for anomalies indicative of malicious acts, natural disasters, and errors that affect the entity's ability to meet its objectives. Security information and event management, log review processes, and alert response procedures are required.
CC7.3 requires that the entity evaluate security events to determine whether they could or have resulted in a failure to meet its objectives and, if so, take action to prevent or address such failures. The incident triage and escalation process — how the organization determines whether a security event constitutes an incident and how it responds — is assessed here.
CC7.4 requires that the entity respond to identified security incidents by executing a defined incident response program to understand, contain, remediate, and communicate about security incidents, as appropriate. The incident response plan, evidence of its testing, and records of its invocation for actual or suspected incidents satisfy this criterion.
CC7.5 requires that the entity identify, develop, and implement activities to recover from identified security incidents and, as appropriate, communicate information about the event to management and other parties. Recovery procedures, post-incident reviews, and communication records from past incidents are the evidence items for this criterion.
CC8 — Change Management
CC8 addresses the controls that govern how changes to systems, infrastructure, and configurations are managed — preventing unauthorized changes and ensuring that approved changes do not introduce new risks.
CC8.1 requires that the entity authorize, design, develop, or acquire, configure, document, test, approve, and implement changes to infrastructure, data, software, and procedures to meet its change management objectives. The change management process must address all types of production changes — code deployments, infrastructure changes, configuration changes, and database changes — and must require security review for changes that affect controls in scope for the SOC 2 audit.
Change management is a criterion category where the gap between policy and practice is most commonly found. Many SaaS companies have a change management policy, but deploy to production through processes that do not consistently follow it — emergency changes made without approval, configuration changes made directly in production without a change record, or deployments that skip required testing steps. The audit examines actual change records against the documented procedure.
CC9 — Risk Mitigation
CC9 addresses the controls applied to risks arising from business disruption and third-party relationships.
CC9.1 requires that the entity identify, select, and develop risk mitigation activities for risks arising from potential business disruptions. Business continuity planning, disaster recovery testing, and documented recovery time objectives satisfy this criterion.
CC9.2 requires that the entity assess and manage risks associated with vendors and business partners. The vendor risk management process — how suppliers are assessed before engagement, how supplier security is monitored on an ongoing basis, and how risks identified through supplier assessments are managed — is assessed here. Vendor risk management is consistently among the top five sources of CC9 findings. Many organizations have a list of vendors but lack a documented assessment process, evidence that assessments have been conducted, and contractual information security requirements in supplier agreements.
Assessing Additional Trust Services Criteria
If your SOC 2 scope includes criteria beyond Security, each additional criterion adds its own set of assessment requirements.
Availability (A1) adds three criteria addressing the availability commitments and system requirements documented in the system description. A1.1 requires that the entity maintain, monitor, and evaluate current processing capacity and use of system components to manage capacity demand and to enable the implementation of additional capacity to help meet its availability objectives. A1.2 requires that the entity authorize, design, develop or acquire, implement, operate, approve, maintain, and monitor environmental protections, software, data backup processes, and recovery infrastructure to meet its availability objectives. A1.3 requires that the entity test recovery plan procedures supporting system recovery to meet its availability objectives.
Confidentiality (C1) adds two criteria specifically addressing the protection of confidential information. C1.1 requires the entity to identify and maintain confidential information to meet its confidentiality objectives. C1.2 requires that the entity dispose of confidential information to meet its confidentiality objectives.
Processing Integrity (PI1) adds five criteria addressing the accuracy, completeness, validity, timeliness, and authorization of system processing. These criteria are most relevant to transaction processing services, financial platforms, and any service in which the correctness of data processing is a material commitment to customers.
Privacy (P1 through P8) adds criteria that map directly to privacy regulation requirements — notice, consent, collection, use, retention, disclosure, access, and monitoring — and is most relevant for organizations that collect personal information directly from individuals rather than processing it on behalf of customers.
Common SOC 2 Gap Analysis Findings
Certain gaps appear consistently across SaaS companies conducting their first SOC 2 gap analysis, regardless of their technical sophistication or security maturity.
Access reviews have never been conducted. Access control policies exist, and access is provisioned through a documented process, but periodic access reviews — the control that verifies that previously granted access remains appropriate — have never occurred. CC6.3 requires that access be removed when appropriate. Demonstrating this requires evidence of reviews, not just evidence of initial provisioning.
Vendor risk management is entirely informal. Suppliers are assessed informally during procurement, but there are no documented third-party risk management records, no standardized security questionnaire or review process, and no contractual information security requirements in most supplier agreements. CC9.2 generates findings from this gap in the majority of first-time SOC 2 audits.
Change management policy exists but is not consistently followed. Code is deployed to production outside the documented change management process for urgent fixes, minor changes, or configuration updates. The gap is not the policy — it is the evidence that the policy is being followed consistently. CC8.1 is assessed through actual change records rather than the policy document.
The incident response plan has never been tested. An incident response plan exists, but it has never been used in a real incident and has never been exercised in a simulation or tabletop exercise. CC7.4 requires evidence of testing — a plan that exists only as a document without any evidence of operational testing does not satisfy the criterion at audit.
Monitoring alerts exist, but are not reviewed. Security monitoring tools are configured and generate alerts, but there is no documented process for reviewing and responding to them, and no evidence that they are being acted upon. CC7.2 requires evidence of active monitoring — not merely the existence of monitoring tools.
The system description does not accurately reflect the system. The system description — the document that defines the scope of the SOC 2 audit — was written at the beginning of the implementation, but has not been updated as the product and infrastructure have evolved. Auditors assess the system description for accuracy against the actual system. Inaccuracies generate findings.
Building Your Remediation Roadmap
The output of a scored gap analysis is a complete map of your readiness — every criterion scored, every gap identified, every remediation need specified. The next step is to convert that map into a prioritized roadmap with a realistic timeline for audit readiness.
Prioritize by audit impact first. Some gaps will prevent the audit from proceeding. Gaps in CC3 (Risk Assessment) and CC5 (Control Activities) — specifically, the absence of a documented risk assessment and documented procedures for key controls — are structural requirements that auditors assess at the outset of the engagement, during the review of documentation and control design. These must be remediated before the audit is scheduled.
Prioritize by remediation effort second. Gaps that require only documentation and formalization of existing informal practices — access review procedures, vendor assessment documentation, incident response plan testing records — can often be closed within weeks. Gaps that require building new technical controls or operational processes require longer lead times and must be planned for in the implementation timeline. Address the quick wins first to improve the overall readiness score rapidly while longer-lead items are in progress.
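The two prioritization rules compose into a single sort: audit-blocking gaps first, then quick wins ahead of long-lead items. A sketch, with illustrative criteria, flags, and effort estimates:

```python
# Each gap carries an audit-impact flag and an effort estimate.
# All values here are illustrative.
gaps = [
    {"criterion": "CC3.2", "blocks_audit": True, "effort_weeks": 6},
    {"criterion": "CC6.3", "blocks_audit": False, "effort_weeks": 1},
    {"criterion": "CC9.2", "blocks_audit": False, "effort_weeks": 4},
    {"criterion": "CC5.3", "blocks_audit": True, "effort_weeks": 2},
]

def roadmap(gaps):
    """Audit-blocking gaps first; within each group, quick wins first."""
    return sorted(gaps, key=lambda g: (not g["blocks_audit"], g["effort_weeks"]))

for g in roadmap(gaps):
    print(g["criterion"])  # CC5.3, CC3.2, CC6.3, CC9.2
```

The same ordering works in a spreadsheet; the point is that priority is a deterministic function of impact and effort, not a debate.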
Define your observation period start date. SOC 2 Type II reports cover a defined observation period — typically six to twelve months — during which the auditor assesses whether controls were operating effectively. The observation period cannot begin until all required controls are in place. Every day that passes before controls are implemented is an observation period that cannot be recovered. The sooner gaps are closed, the sooner the observation period can begin, and a Type II report can be issued.
Build evidence collection from day one. Every control implemented must produce evidence that it is operating. Build evidence collection — logs, screenshots, records, reports — into the operational processes that implement each control from the beginning of the observation period. Attempting to reconstruct evidence retroactively results in incomplete records that do not meet audit requirements.
Set a realistic target date for the Type II audit. Given your gap analysis scores, your available remediation resources, and a realistic estimate of how long it will take to close each gap, define a target date for the start of your observation period and work backward to the completion date for remediation. A twelve-month observation period is the standard for recurring annual audits; for a first Type II, many organizations choose a shorter initial window — three to six months is common, and achievable for organizations with strong existing security practices and focused gap remediation — and move to twelve-month periods in subsequent cycles.
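The timeline arithmetic is simple enough to sketch. The durations below are illustrative planning inputs, not SOC 2 requirements, and a month is approximated as 30 days:

```python
from datetime import date, timedelta

def plan(remediation_weeks, observation_months, remediation_start):
    """Work forward from remediation start to the earliest report-ready date.

    A month is approximated as 30 days; all durations are illustrative.
    """
    observation_start = remediation_start + timedelta(weeks=remediation_weeks)
    observation_end = observation_start + timedelta(days=30 * observation_months)
    return observation_start, observation_end

# Twelve weeks of remediation, then a six-month observation period:
start, end = plan(12, 6, remediation_start=date(2026, 1, 5))
print(start, end)  # 2026-03-30 2026-09-26
```

Run the same calculation backward from a customer-imposed deadline and it tells you how many weeks of remediation you can actually afford.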
From Gap Analysis to Clean Opinion
The SOC 2 journey from gap analysis to a clean Type II opinion follows a predictable sequence for organizations that approach it systematically. Gap analysis establishes the baseline. Remediation closes the identified gaps. Evidence collection documents the operation of controls throughout the observation period. Readiness assessment verifies that controls are audit-ready before the auditor arrives. The audit itself begins with a review of documentation and control design, then tests operating effectiveness through evidence review and walkthroughs. A clean opinion confirms that controls were suitably designed and operating effectively throughout the observation period.
The organizations that reach a clean Type II opinion on their first attempt are not the ones with the most sophisticated security programs. They are the ones who understood precisely what was required, closed the gaps before the observation period started, collected evidence systematically throughout, and walked into the audit with complete documentation and no surprises.
A gap analysis is how you build that precision. It converts the vague anxiety of "we need to get SOC 2 compliant" into a specific, manageable roadmap — these are the gaps, this is the priority order, this is what it will take, and this is when you will be ready.
dsalta helps SaaS companies conduct SOC 2 gap analyses, automate evidence collection across all Trust Services Criteria, and achieve clean Type II opinions, with audit-preparation infrastructure built into the platform from day one.