AI tools like Claude Code accelerate COBOL modernization. But code translation is the easy part. Learn how to secure mainframe data during migration.

Anthropic's Claude Code announcement triggered IBM's worst stock drop since 2000. AI tools can now accelerate COBOL analysis and migration in quarters instead of years. But code translation is the easy part of mainframe modernization.
The hard problem, and the one getting far less attention, is securing the sensitive data that has sat behind z/OS perimeter controls for decades once it starts flowing to cloud platforms, APIs, and SaaS applications.
The mainframe modernization market is projected to grow from $8.39 billion in 2025 to $13.34 billion by 2030. Every dollar of that spending creates a new attack surface if data protection is not built into the migration from day one.
Mainframe modernization is the process of updating legacy mainframe systems (primarily IBM z/OS environments) to integrate with modern cloud platforms, APIs, and development workflows. It covers a spectrum of approaches, from rehosting and replatforming to refactoring and full replacement.
COBOL (Common Business-Oriented Language) sits at the center of this conversation because it remains the dominant language on these systems.
Developed in 1959, COBOL still powers an estimated 95% of ATM transactions in the United States. Over 220 billion lines of COBOL code are in production today, and COBOL systems handle an estimated $3 trillion in daily commerce.
These are the transaction engines behind banks, insurers, airlines, and government agencies.
The scale of this dependency is the problem. Universities no longer teach COBOL. The developers who built these systems have retired, taking institutional knowledge with them.
Finding engineers who can read and maintain COBOL gets harder every quarter. This talent gap is why mainframe modernization solutions have become a strategic priority rather than a discretionary IT project.
One critical distinction: COBOL and mainframes are related but not synonymous. IBM's Rob Thomas noted that approximately 40% of COBOL does not run on mainframes at all.
COBOL on IBM Z is code optimized over decades of tight coupling between software and hardware. That coupling is what makes modernization complex, and what makes the data security challenge different from a standard cloud migration.
On February 23, 2026, Anthropic published a blog post positioning its Claude Code tool as a solution for COBOL modernization.
The company argued that AI can automate the exploration and analysis phases that consume the majority of effort in COBOL modernization. Anthropic claimed that with AI, teams can modernize their COBOL codebases in quarters instead of years.
The market reaction was severe. IBM shares dropped 13% in a single day, the stock's worst performance since October 2000. IBM stock fell 27% in February 2026, on track for its largest one-month decline since at least 1968.
The core of Anthropic's argument: modernizing a COBOL system once required armies of consultants spending years mapping workflows.
Claude Code can map dependencies across thousands of lines of code, document workflows, and identify risks faster than human analysts. This addresses the cost barrier that has stalled modernization for years.
Anthropic also published a Code Modernization Playbook with guidance on migrating COBOL to modern languages like Java or Python.
IBM responded directly. Rob Thomas, Senior Vice President of IBM Software, drew a line between translating code and modernizing a platform.
He compared IBM Z's tight hardware-software integration to the iOS and iPhone ecosystem, something a code converter cannot replicate. Evercore ISI analyst Amit Daryanani reinforced this point: clients already had the option to migrate but are sticking with the platform.
The announcement is significant. But the conversation it started is incomplete.
Everyone is debating whether AI can translate COBOL. Almost nobody is asking what happens to the data once it moves.
IBM's Rob Thomas summarized the gap in a single sentence: translation captures almost none of the actual complexity.
The real work in mainframe modernization is data architecture redesign, runtime replacement, transaction processing integrity, and hardware-accelerated performance that has been built over decades of tight software-hardware coupling.
Consider what actually runs on a mainframe. These systems process millions of transactions per second with near-zero downtime.
The mainframe security controls (RACF, ACF2, Top Secret) are deeply integrated into the operating system. Cryptographic acceleration is built into the processor itself. Batch processing, CICS transactions, and DB2 databases are all optimized for the IBM Z hardware stack.
Converting COBOL syntax to Java does not replicate any of this.
A 2025 report from Advanced (ModernSystems) found that 53% of organizations planned a hybrid modernization strategy instead: reducing mainframe dependency without full decommissioning. The industry has learned from failed migrations.
Several banks have undertaken multi-year COBOL modernization efforts that resulted in widespread service disruptions and regulatory fines.
AI tools like Claude Code accelerate the analysis phase. They can map dependencies, document workflows, and surface risks that would take human analysts months to find.
That is a genuine breakthrough. But analysis and documentation are the first step, not the last.
The migration itself – and specifically, securing the data throughout – is where modernization projects succeed or fail.
Here is the problem that the Anthropic-versus-IBM debate overlooks entirely. The mainframe has historically been one of the most secure computing environments in enterprise IT.
IBM z/OS systems protected by RACF, ACF2, or Top Secret enforce granular access controls at the dataset level. Data sits behind a perimeter that has been hardened for decades. Cryptographic hardware acceleration handles encryption at near-wire speed.
The moment data leaves that perimeter, the security model changes. And every mainframe modernization initiative, by definition, moves data out of the mainframe.
When COBOL applications are migrated to cloud platforms, the data they process (e.g., credit card numbers, Social Security numbers, health records, financial transactions) starts flowing through APIs, message buses, cloud storage, and analytics pipelines.
Each of these creates new data breach risks.
The 2025 NetSPI Mainframe Security Report found that network segregation between mainframe infrastructure and corporate environments was virtually nonexistent.
Among dozens of client engagements, only one organization had effectively segregated their mainframe environment.
The data itself is often the vulnerability. COBOL applications frequently store sensitive information in cleartext on mainframe databases.
Modifying legacy application code to protect this data is risky and expensive: the same talent shortage that drives modernization also prevents on-mainframe remediation. So the data moves to the cloud unprotected.
The global average cost of a data breach reached $4.4 million in 2025, according to IBM.
The compliance scope under PCI DSS, HIPAA, and GDPR extends to sensitive data as soon as it leaves its original environment. You are not just migrating code. You are migrating risk.
You cannot protect what you cannot find.
Before any COBOL code is analyzed, mapped, or translated, you need a complete inventory of the sensitive data that flows through those applications.
This means scanning mainframe datasets, DB2 tables, VSAM files, and IMS databases for patterns tied to PCI, PII, and PHI. Data discovery is the foundation of every secure modernization initiative.
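As a first-pass illustration of what pattern-based discovery looks like, here is a minimal sketch that scans flat-file records (as you might get from a VSAM or DB2 export) for card numbers and Social Security numbers. The patterns and the `scan_record` helper are hypothetical simplifications; production discovery tools also handle packed-decimal fields, copybook layouts, and contextual validation.

```python
import re

# Hypothetical first-pass patterns; real discovery tooling goes far
# beyond regexes (copybook-aware parsing, packed decimal, context).
PATTERNS = {
    "PAN": re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # candidate card numbers
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # US Social Security numbers
}

def luhn_valid(number: str) -> bool:
    """Luhn checksum to cut false positives on candidate card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan_record(record: str) -> list[tuple[str, str]]:
    """Return (label, match) pairs for sensitive values found in a record."""
    hits = []
    for label, pattern in PATTERNS.items():
        for match in pattern.finditer(record):
            value = match.group()
            if label == "PAN" and not luhn_valid(value):
                continue  # digits that don't checksum are not a card number
            hits.append((label, value))
    return hits

# Example: one flat-file record exported from a mainframe dataset
record = "CUST 0042  4111 1111 1111 1111  123-45-6789  ACTIVE"
print(scan_record(record))
```

Running this kind of scan across exported datasets, before any code translation begins, is what turns "we think the payments system touches cardholder data" into a concrete inventory you can protect.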
The safest place to secure data is before it leaves the mainframe.
Tokenization replaces sensitive values with non-reversible tokens before data enters the migration pipeline. This means that even if the pipeline is compromised – or the cloud destination is breached – the exposed data has no exploitable value.
This is fundamentally different from encryption, which is reversible. Both are essential, but they solve different problems. Encryption protects data in transit and at rest. Tokenization removes the sensitive data from the environment entirely.
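To make the distinction concrete, here is a sketch of vault-based tokenization (a generic illustration under assumed names, not DataStealth's implementation). The token is a random digit string of the same length as the input, so downstream systems expecting a 16-digit field keep working, but there is no mathematical relationship between token and value: recovering the original requires access to the vault, which stays inside the protected environment.

```python
import secrets

class TokenVault:
    """Illustrative vault-based tokenizer. Tokens are random digits of
    the same length as the input (format-preserving), with no
    mathematical link back to the original value."""

    def __init__(self) -> None:
        self._value_to_token: dict[str, str] = {}
        self._token_to_value: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]  # same value, same token
        token = value
        while token == value or token in self._token_to_value:
            # Cryptographically random digits, retried on collision
            token = "".join(str(secrets.randbelow(10)) for _ in range(len(value)))
        self._value_to_token[value] = token
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with vault access can recover the original value.
        return self._token_to_value[token]

vault = TokenVault()
pan = "4111111111111111"
token = vault.tokenize(pan)
print(token != pan, len(token) == len(pan), vault.detokenize(token) == pan)
```

An attacker who breaches the cloud destination gets tokens, not card numbers; without the vault, there is nothing to decrypt and nothing to brute-force.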
Mainframe-to-cloud data pipelines are persistent bridges, not one-time transfers.
They create three distinct attack surfaces: data in transit (network links, APIs, message buses), data at rest (cloud storage, staging areas), and data in use (ETL jobs, analytics pipelines).
Each requires its own controls. TLS 1.3 for transport. Encryption for storage. Masking or tokenization for processing environments.
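The data-in-use control is the least familiar of the three, so here is a minimal sketch of static masking for a processing environment, using a hypothetical `mask_pan` helper. The idea: analytics and test pipelines usually only need the last few digits for joins and support lookups, so everything else can be irreversibly blanked before the data enters those environments.

```python
def mask_pan(pan: str, visible: int = 4) -> str:
    """Irreversibly mask a card number, keeping only the trailing
    digits. Separators are stripped so masking is consistent whether
    the source stored '4111111111111111' or '4111 1111 1111 1111'."""
    digits = [c for c in pan if c.isdigit()]
    keep = "".join(digits[-visible:])
    return "*" * (len(digits) - visible) + keep

print(mask_pan("4111111111111111"))       # → ************1111
print(mask_pan("4111 1111 1111 1111"))    # → ************1111
```

Unlike tokenization, masking is one-way by design: there is no vault and no way back to the original value, which is exactly what you want in environments that should never handle live cardholder data.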
When sensitive data moves from a mainframe to a cloud environment, your PCI DSS, HIPAA, or GDPR compliance scope expands to include every system that touches that data.
PCI tokenization can dramatically reduce this scope by ensuring that the cloud-side systems never see actual cardholder data. This is a direct cost savings: fewer systems in scope means fewer audit requirements and lower compliance overhead.
Installing security agents on a mainframe is risky and expensive. It can destabilize the same legacy applications you are trying to modernize.
Agentless mainframe security operates in the network traffic flow, intercepting and protecting data without any code changes to COBOL applications or mainframe infrastructure.
This removes the "security versus stability" conflict that stalls most mainframe security initiatives.
AI is accelerating COBOL modernization. The analysis work that once took years now takes weeks. But faster code translation without data protection built into the pipeline creates a new category of risk.
DataStealth secures sensitive data during every phase of mainframe modernization: discovering it at the source, tokenizing it before it leaves the mainframe, and protecting it in flight without agents or application code changes.
Lindsay Kleuskens is a data security specialist helping enterprises reduce risk and simplify compliance. At DataStealth, she supports large organizations in protecting sensitive data by default, without interrupting user workflows. Her work focuses on PCI DSS scope reduction, preventing client-side attacks, and enabling secure third-party integrations. Lindsay regularly shares practical insights on modern data protection challenges and helps organizations navigate evolving compliance standards with confidence.