The Commercial P&C Carrier Technology Stack in 2026: Where Underwriting Intelligence Fits
Mid-size commercial P&C carriers operate technology stacks that were assembled over years: systems bought from different vendors, inherited through acquisitions, and accreted as point solutions that solved specific problems without being designed to work together. The result, at most carriers of this size, is a set of systems that handle their own domains adequately but share data poorly and create manual reconciliation work at the seams between them.
Understanding how the technology stack interacts — and where its gaps are — is the prerequisite for any meaningful automation initiative. This piece maps the typical technology stack at a mid-size commercial carrier and identifies the integration challenges that affect underwriting automation most directly.
The Core Systems Layer
The technology stack at a mid-size commercial carrier rests on three core systems that manage the fundamental policy lifecycle.
The policy management system (PMS) is the system of record for policies. It manages the submission intake, rating, issuance, endorsement, and renewal workflow. Guidewire PolicyCenter and Duck Creek Policy are the most common PMS platforms among mid-size commercial carriers that have made a modernization investment in the past decade. Carriers that have not modernized may be running legacy PMS platforms — older Majesco versions, custom-built systems, or industry-specific platforms — that have limited API capability and require different integration approaches.
The claims management system (CMS) manages the full claims lifecycle from first notice of loss through payment and close. Guidewire ClaimCenter is the dominant platform among carriers that have deployed the Guidewire suite. The claims data in the CMS is the most valuable source for loss history analysis, but it is frequently isolated from the underwriting workflow — underwriters do not have direct access to the CMS and receive prior loss data only through the loss run that the insured provides.
The billing system manages premium collection, installment plans, and payment processing. It is often the oldest system in a mid-size carrier's stack and the least connected to the others. Billing is generally not relevant to the underwriting workflow, but payment history patterns in billing data can occasionally be a useful risk signal for renewal underwriting.
The Data and Analytics Layer
Above the core transactional systems, mid-size carriers typically have some combination of data warehouse infrastructure and reporting tools that aggregate data across the operational systems for management reporting, actuarial analysis, and rate filing support.
The data warehouse is supposed to supply the cross-system data integration that the operational systems do not provide natively. In practice, data warehouses at mid-size carriers frequently run behind on their ETL pipelines, carry data quality issues that limit the reliability of their reporting, and were built for backward-looking reporting rather than for real-time underwriting workflow support.
The analytics layer's relevance to underwriting automation is primarily as a source of carrier-specific training data for ML scoring models and as the destination for the submission analytics and productivity reporting that automation platforms produce.
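To illustrate what "carrier-specific training data" means in practice, the sketch below joins historical submission records from a warehouse extract to their quote outcomes, producing labeled rows for a scoring model. The table shapes, field names, and the bound/declined label are illustrative assumptions, not any particular carrier's warehouse schema.

```python
# Hypothetical sketch: turn warehouse history into training rows for a
# submission-scoring model. Field names ("naics_code", "bound", etc.) are
# assumptions for illustration.
def build_training_rows(submissions: list[dict], outcomes: dict[str, str]) -> list[dict]:
    """Join historical submissions to their quote outcomes; keep only
    rows with a known label, since unlabeled history cannot be used
    for supervised training."""
    rows = []
    for sub in submissions:
        label = outcomes.get(sub["submission_id"])
        if label is None:
            continue  # outcome never recorded — skip
        rows.append({
            "naics_code": sub.get("naics_code"),
            "tiv": sub.get("total_insured_value"),
            "prior_losses": sub.get("prior_loss_count", 0),
            "label": 1 if label == "bound" else 0,
        })
    return rows
```

Even a sketch this small surfaces the data quality dependency: every submission whose outcome was never written back to the warehouse is lost to the model.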
The External Data Integration Layer
Commercial underwriting requires external data that no carrier's internal systems contain: peril exposure data, business intelligence, property records, credit information, and loss indicator signals. Mid-size carriers access this data through a fragmented set of direct vendor relationships — a Verisk integration for ISO data, a CoreLogic account for property data, a business credit service for commercial credit reports — each with its own access method, data format, and update frequency.
The external data layer is where the largest efficiency gap in the commercial underwriting stack exists. The data is available, but accessing it systematically for every submission requires either a set of API integrations that most carriers have not built or manual lookups that underwriters perform inconsistently and only when workflow timing permits.
Underwriting intelligence platforms address this layer directly by aggregating the external data sources and delivering a structured dossier rather than requiring the carrier to build and maintain individual data source integrations. The platform becomes the single integration point for external data rather than the carrier maintaining its own integration portfolio.
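A minimal sketch of that single-integration-point pattern, assuming hypothetical per-vendor fetchers. The function names, returned fields, and error handling below are stand-ins for real API clients (a Verisk integration, a CoreLogic account, a credit bureau), not any platform's actual interface.

```python
from typing import Callable

# Hypothetical per-source fetchers: each takes a business identifier and
# returns whatever structured fields that source provides.
def fetch_peril_exposure(business_id: str) -> dict:
    return {"flood_zone": "X", "wind_tier": 2}

def fetch_property_records(business_id: str) -> dict:
    return {"year_built": 1998, "construction": "masonry"}

def fetch_commercial_credit(business_id: str) -> dict:
    return {"credit_score_band": "B"}

FETCHERS: dict[str, Callable[[str], dict]] = {
    "peril": fetch_peril_exposure,
    "property": fetch_property_records,
    "credit": fetch_commercial_credit,
}

def assemble_dossier(business_id: str) -> dict:
    """Aggregate every external source into one structured dossier,
    recording per-source failures instead of aborting the whole lookup."""
    dossier = {"business_id": business_id, "sources": {}, "errors": {}}
    for name, fetch in FETCHERS.items():
        try:
            dossier["sources"][name] = fetch(business_id)
        except Exception as exc:  # a production system would retry and log
            dossier["errors"][name] = str(exc)
    return dossier
```

The design point is that the underwriter's workflow consumes one dossier shape regardless of how many vendor relationships sit behind it, and a single source outage degrades the dossier rather than blocking it.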
Agency Management System Integration
The broker side of the submission workflow operates in agency management systems — Applied Epic, Vertafore AMS360 — that generate and transmit commercial submissions. The data quality and structure of what the carrier receives depend on how the broker's AMS has been configured and how consistently the broker's staff populates submission fields.
Carriers with high-volume broker relationships have a business reason to invest in the data quality of what they receive from those brokers. Structuring the carrier-broker integration to pass structured data rather than free-form documents — and providing feedback when submission data gaps affect processing speed — creates a data quality flywheel that benefits both parties.
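One way to make that feedback loop concrete is a gap report the carrier returns to the broker whenever required fields arrive unpopulated. The field list below is a hypothetical minimum for illustration, not an ACORD form or any AMS export schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical minimal submission record; a real carrier-broker feed
# carries far more fields than this.
@dataclass
class Submission:
    insured_name: Optional[str] = None
    fein: Optional[str] = None
    address: Optional[str] = None
    naics_code: Optional[str] = None
    effective_date: Optional[str] = None

# Fields this sketch assumes must be present for automated processing.
REQUIRED_FOR_STRAIGHT_THROUGH = ("insured_name", "address", "naics_code", "effective_date")

def submission_gaps(sub: Submission) -> list[str]:
    """Return the required fields the broker left unpopulated, so the
    carrier can feed that gap list back to the broker."""
    return [f for f in REQUIRED_FOR_STRAIGHT_THROUGH if not getattr(sub, f)]
```

Returning the gap list per submission, rather than a quarterly data quality score, is what gives the broker's staff a specific, correctable reason to populate the fields next time.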
Where the Seams Create Problems
The integration challenges that matter most for underwriting automation are at the seams between systems: between the PMS and the external data layer (where the dossier assembly happens), between the PMS and the CMS (where prior loss data needs to flow without manual intervention), and between the submission intake process and the underwriting workflow (where routing decisions need to be made before the underwriter opens the file).
Each seam represents a manual process in current operation that automation can address — but only if the underlying data quality on both sides of the seam is sufficient to support reliable automated data exchange. The data quality work is not glamorous, but it is the actual prerequisite for the automation outcomes carriers are seeking.
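A sketch of what "sufficient data quality at a seam" might mean operationally, using the PMS/CMS seam as the example: before enabling automated loss-history exchange, measure how many claim records actually key back to a known policy. The record shapes and the 98% threshold are illustrative assumptions, not an industry standard.

```python
# Pre-flight check at the PMS/CMS seam: can claims be keyed to policies?
# Record shapes here are hypothetical.
def seam_match_rate(pms_policy_numbers: set[str], cms_claims: list[dict]) -> float:
    """Fraction of claims whose policy_number resolves to a PMS policy."""
    if not cms_claims:
        return 1.0  # nothing to exchange, nothing to mismatch
    matched = sum(1 for c in cms_claims if c.get("policy_number") in pms_policy_numbers)
    return matched / len(cms_claims)

def seam_ready(pms_policy_numbers: set[str], cms_claims: list[dict],
               threshold: float = 0.98) -> bool:
    """Gate automated exchange on the match rate clearing a threshold."""
    return seam_match_rate(pms_policy_numbers, cms_claims) >= threshold
```

Running a check like this before the integration goes live turns "data quality at the seam" from an abstraction into a number the project can be gated on.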
Carriers that approach their technology modernization as a technology purchasing exercise — acquiring platforms without addressing the data quality gaps at the seams — consistently underperform on automation ROI relative to carriers that treat data quality as the first investment in the modernization sequence.