September 14, 2026 edition

hypercubic

AI for COBOL maintenance and modernization

Hypercubic Is Coming for COBOL, and Honestly It Is About Time

AI · Enterprise · Developer Tools · Legacy Systems

The Macro: The Mainframe Is Not Dead, It Is Just Forgotten

There is a popular narrative in tech that mainframes are relics. Legacy systems waiting to be replaced by cloud-native microservices. The reality is almost the opposite. Mainframes process 95 percent of ATM transactions globally. They handle 70 percent of Fortune 500 workloads. The IRS runs on COBOL. Major airlines, insurance companies, and banks depend on mainframe code that was written before most of their current engineers were born. These systems are not going away. They are too important to replace and too dangerous to touch.

The problem is not the technology. COBOL works. Mainframes are remarkably reliable. The problem is the people. The average COBOL developer is over 55 years old. They are retiring at an accelerating rate, and they are taking decades of institutional knowledge with them. There are roughly 800 billion lines of COBOL in production worldwide, and the pool of humans who understand any given system is shrinking every year. This is not a hypothetical future crisis. It is happening right now.

The modernization consulting industry has tried to address this for decades and has largely failed. Accenture, IBM, and Deloitte all have mainframe modernization practices. They charge enormous fees, take years to deliver, and frequently produce results that do not work as well as the original system. The full-rewrite approach, where you reimplement the COBOL in Java or Python, has a failure rate that nobody likes to talk about publicly. (True "lift and shift," rehosting the system as-is, just relocates the maintenance problem.) The Commonwealth Bank of Australia spent $1.5 billion modernizing its core banking platform. Most companies cannot afford that kind of bet.

Raincode and Heirloom Computing have tried to build compilers and translators that convert COBOL to modern languages automatically. The results are mixed. The output code works but is often unreadable, which means you have traded one maintenance problem for a different maintenance problem. The fundamental issue is that the knowledge embedded in these systems is not just in the code. It is in the heads of the people who have been maintaining the code for 30 years.

The Micro: Two Ex-Apple Engineers With a Hackathon Habit

Hypercubic has two products. HyperDocs ingests COBOL, JCL, and PL/I codebases and generates documentation, architecture diagrams, and dependency graphs. HyperTwin captures institutional expertise from retiring engineers by observing their workflows and interactions, creating what is essentially a digital clone of their knowledge. The combination is smart. You need both the code-level understanding and the human-level context to maintain these systems.
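To make the code-level half of this concrete, here is a toy sketch of what dependency extraction from COBOL looks like at its very simplest. This is not Hypercubic's implementation, which is not public; real tooling would use a full parser and handle dynamic CALLs, copybooks, and JCL. The program text and names below are invented for illustration.

```python
import re
from collections import defaultdict

# Toy static-dependency extractor: scan each COBOL source for its
# PROGRAM-ID and any statically named CALL statements.
CALL_RE = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)
PROGRAM_ID_RE = re.compile(r"\bPROGRAM-ID\.\s+([A-Z0-9-]+)", re.IGNORECASE)

def dependency_graph(sources):
    """Map each program to the set of subprograms it statically CALLs."""
    graph = defaultdict(set)
    for text in sources:
        match = PROGRAM_ID_RE.search(text)
        name = match.group(1).upper() if match else "UNKNOWN"
        for callee in CALL_RE.findall(text):
            graph[name].add(callee.upper())
    return dict(graph)

# Invented example program.
payroll = """
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL.
       PROCEDURE DIVISION.
           CALL 'TAXCALC' USING WS-GROSS WS-TAX.
           CALL 'PRINTCHK' USING WS-NET.
"""

print(dependency_graph([payroll]))
# PAYROLL depends on TAXCALC and PRINTCHK
```

Even this trivial pass shows why the problem is tractable for machines and tedious for humans: the facts are all in the source, spread across millions of lines.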

Sai Gurrapu and Aayush Naik are both ex-Apple engineers. Sai was a machine learning engineer there. He has won 18 hackathons, published NLP research, and bootstrapped a software engineering interview platform to six-figure ARR before starting Hypercubic. Aayush was an engineer at Apple specializing in robotics and AI. They are based in San Francisco and came through Y Combinator's Fall 2025 batch.

The testimonial on their site comes from Alexander Kolterer, a mainframe modernization leader who previously worked at AWS and IBM. He called Hypercubic “the most comprehensive AI-powered modernization solution I’ve seen for mainframes in my 30+ years in the industry.” That kind of endorsement from someone with deep domain credibility is worth more than any customer logo wall. United Airlines also appears on their site, which suggests at least exploratory conversations with exactly the kind of enterprise that needs this product.

Forbes named them among the top Y Combinator Fall 2025 startups. That is nice for credibility but irrelevant to whether the product works. What matters is the hybrid approach. Most AI coding tools use purely generative methods, which means they hallucinate. Hypercubic uses a hybrid deterministic and generative AI approach specifically to prevent hallucinations in documentation and code understanding. When you are dealing with systems that process billions of dollars in transactions, hallucinated documentation is not just annoying. It is dangerous.
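One way to picture the hybrid idea, purely as a sketch since Hypercubic's actual design is not public: a deterministic pass extracts ground-truth symbols straight from the source, and any generated documentation sentence that names a program the deterministic pass never saw gets rejected. All names here are invented.

```python
import re

def extract_symbols(cobol_text):
    """Deterministic pass: program names defined or CALLed in the source."""
    defined = re.findall(r"PROGRAM-ID\.\s+([A-Z0-9-]+)", cobol_text, re.I)
    called = re.findall(r"CALL\s+'([A-Z0-9-]+)'", cobol_text, re.I)
    return {s.upper() for s in defined + called}

def ground(draft_sentences, symbols):
    """Grounding check: keep only sentences whose quoted program names
    all exist in the deterministically extracted symbol set."""
    kept, rejected = [], []
    for sentence in draft_sentences:
        names = {n.upper() for n in re.findall(r"'([A-Z0-9-]+)'", sentence)}
        (kept if names <= symbols else rejected).append(sentence)
    return kept, rejected

# Invented source and a stubbed "generative" draft of its documentation.
source = "PROGRAM-ID. PAYROLL.\n    CALL 'TAXCALC' USING WS-GROSS."
draft = [
    "'PAYROLL' computes gross pay and delegates tax to 'TAXCALC'.",
    "'PAYROLL' also calls 'AUDITLOG' for compliance.",  # hallucinated
]
kept, rejected = ground(draft, extract_symbols(source))
```

The generative model drafts the prose; the deterministic layer vetoes anything it cannot verify. That veto is the whole point when the system under the documentation moves billions of dollars.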

The competitive landscape is surprisingly thin for a market this large. IBM has Watson-based tools for mainframe analysis but they are expensive and require significant professional services. Micro Focus (now part of OpenText) does mainframe development tools but has not made a serious AI play. Phase Change Software attempted AI-driven code understanding but never gained significant traction. The opportunity is genuinely open for a startup that can move fast and build trust with enterprise buyers simultaneously.

The Verdict

I think Hypercubic is solving one of the most important and least glamorous problems in enterprise software. Nobody starts a company to work on COBOL because it is exciting. They start a company to work on COBOL because the alternative is watching critical infrastructure slowly become unmaintainable, and that scares them more than the unglamorous pitch.

At 30 days, I want to see how quickly HyperDocs can process a real enterprise COBOL codebase. Not a demo system. A messy, undocumented, million-line production system. At 60 days, the question is whether HyperTwin is actually capturing useful knowledge or just recording conversations that get filed and forgotten. Knowledge capture tools have failed before because the output was not actionable. At 90 days, I want to know whether enterprises are buying this as a product or engaging it as a consulting project. If it is the former, this scales. If it is the latter, they have built a boutique consultancy with good marketing. The difference matters enormously for what happens at month twelve.