IBM is giving the classic mainframe a high-tech reboot for the age of artificial intelligence. On Monday, the company unveiled the IBM z17, the latest iteration of its mainframe hardware, designed with a clear mission: make AI enterprise-ready at scale.
At the heart of the z17 is IBM’s Telum II processor, backed by enhanced AI acceleration capabilities and full encryption. According to IBM, the new system supports over 250 AI use cases — from AI agents to generative AI applications — making it not just a data processing machine, but a flexible AI-ready computing platform for the world’s largest businesses.
Mainframes may feel like relics of the past, but they’re far from obsolete. In fact, 71% of Fortune 500 companies still rely on them today, and the market for mainframes reached $5.3 billion in 2024, according to Market Research Future.
The z17 is built for this moment. It can process 450 billion AI inference operations per day, a 50% boost over its predecessor, the z16, which debuted in 2022. With integrated support for open-source tools, modern software, and diverse hardware, the z17 is intended to be an agile, future-ready infrastructure core — not a siloed legacy system.
IBM didn’t jump on the AI bandwagon post-ChatGPT. According to Tina Tarquinio, VP of Product Management and Design for IBM Z, this upgrade has been in the works for over five years — long before the AI gold rush began in late 2022.
“It’s been wild knowing that we were introducing an AI accelerator, and then seeing the explosion of AI unfold,” Tarquinio told TechCrunch. What’s more surprising, she added, is how closely the early customer feedback IBM gathered while developing the z17 aligned with where the AI industry actually ended up.
With support for 48 IBM Spyre AI accelerator chips at launch — and a roadmap to scale up to 96 within a year — IBM is leaving plenty of headroom for the next wave of massive AI models. Tarquinio emphasized the need for “AI agility,” noting that the z17 is architected to accommodate larger models, greater memory needs, and new approaches yet to emerge.
Also notable: energy efficiency. IBM claims the z17 boosts AI acceleration performance by 7.5x while using 5.5x less energy than other platforms running similar multi-model workloads. This aligns with growing enterprise concerns around the carbon footprint of AI infrastructure.
The z17 isn’t just another hardware refresh; it’s IBM staking a claim in the AI race with a combination of reliability, security, and adaptability. For large-scale enterprises still reliant on mainframes, the z17 could be the bridge between traditional IT systems and modern AI demands.
IBM says the z17 will be generally available on June 18, potentially marking a key moment for businesses looking to infuse AI into mission-critical systems — without reinventing their tech stacks from scratch.