
Earlier this week, I had the opportunity to meet with NetApp’s executive team to explore their new AFX product line and the AI enhancements built into it. What stood out immediately was how distinct their approach is from that of other storage vendors currently touting AI capabilities. While many competitors aim to own every facet of the AI pipeline, NetApp is charting a different course. Rather than building a monolithic solution that handles all AI processing internally, NetApp is focusing on what they do best: providing a robust, scalable data catalog and storage infrastructure, while leaving the development of AI models and workflows to their customers and strategic partners. It’s a refreshing example of a company staying in its lane and excelling at it.
One particularly notable aspect is NetApp’s commitment to openness in their AI platforms. During the sessions, it was clear that the roadmap is designed with extensibility in mind. When asked about specific capabilities, the response was often “not at launch, but it’s coming,” which signals a thoughtful, phased rollout. A prime example is the AI Data Engine (AIDE), which is covered starting at 1:32:30 in the video. We discussed how valuable it would be to process data stored on non-NetApp platforms, an important consideration for enterprises with diverse storage environments. NetApp acknowledged this and indicated that broader compatibility is on the horizon.
Another standout feature of the AFX platform is its modular scalability. Historically, scaling compute or storage often meant scaling both together, sometimes with frustrating limitations. NetApp’s new architecture breaks that mold. Need more compute power at the controller level? Simply add more controllers to the cluster; the system recognizes the new hardware and prompts you to integrate it seamlessly.
The same principle applies to storage expansion. If additional capacity is required without increasing compute, new shelves can be added directly to the array. Once racked and connected to the network switches, they boot up and are automatically incorporated into the existing storage pool.
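If AFX keeps the standard ONTAP REST API, which is my assumption here rather than something NetApp confirmed, this auto-discovery is easy to verify from the management network. The sketch below polls two collection endpoints that have been part of ONTAP’s REST API since 9.6, /api/cluster/nodes and /api/storage/shelves; the cluster address and credentials are placeholders.

```python
# Minimal sketch, assuming AFX exposes ONTAP's standard REST API.
# The management address and credentials below are placeholders.
import requests
from requests.auth import HTTPBasicAuth

CLUSTER = "https://cluster-mgmt.example.com"  # placeholder management LIF
AUTH = HTTPBasicAuth("admin", "password")     # placeholder credentials

def list_records(endpoint: str) -> list[dict]:
    """Fetch all records from an ONTAP REST collection endpoint."""
    resp = requests.get(f"{CLUSTER}{endpoint}", auth=AUTH, verify=False)
    resp.raise_for_status()
    return resp.json().get("records", [])

# Controllers currently in the cluster; a newly added controller
# should show up here once the cluster recognizes it.
for node in list_records("/api/cluster/nodes?fields=name,state"):
    print(f"node  {node['name']}: {node.get('state', 'unknown')}")

# Shelves the cluster has discovered; a newly cabled shelf should
# appear here once it boots and joins the storage pool.
for shelf in list_records("/api/storage/shelves?fields=id,state"):
    print(f"shelf {shelf['id']}: {shelf.get('state', 'unknown')}")
```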
The AI Data Engine components follow this modular philosophy as well. They are currently NetApp-exclusive devices, but the company plans to support third-party hardware after launch. As independent nodes in the AI cluster, they function like any other client with mounted storage, except that they run NetApp’s software to catalog unstructured data across the environment. That tagging system enables AI solutions to locate relevant data quickly and efficiently, streamlining training and inference.
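NetApp didn’t show AIDE’s actual interfaces, so to make the idea concrete here is a deliberately toy sketch of my own: an in-memory tag index in which every class and name is hypothetical. It only illustrates the access pattern, namely that training and inference jobs ask a metadata catalog for candidate files by tag instead of crawling mounted storage.

```python
# Toy illustration of tag-based cataloging; nothing here is NetApp's
# actual AIDE interface, and every name is hypothetical.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    path: str                        # where the file lives, on any mounted volume
    tags: set[str] = field(default_factory=set)

class TagCatalog:
    """Maps tags to file paths so consumers query metadata, not storage."""

    def __init__(self) -> None:
        self._by_tag: defaultdict[str, set[str]] = defaultdict(set)

    def index(self, entry: CatalogEntry) -> None:
        for tag in entry.tags:
            self._by_tag[tag].add(entry.path)

    def find(self, *tags: str) -> set[str]:
        """Return files carrying every requested tag (set intersection)."""
        matches = [self._by_tag[t] for t in tags]
        return set.intersection(*matches) if matches else set()

catalog = TagCatalog()
catalog.index(CatalogEntry("/mnt/vol1/contracts/q3.pdf", {"pdf", "legal", "2025"}))
catalog.index(CatalogEntry("/mnt/vol2/scans/invoice-981.tif", {"image", "finance", "2025"}))

# A training job asks the catalog, not the filesystem, for its inputs.
print(catalog.find("2025", "legal"))  # {'/mnt/vol1/contracts/q3.pdf'}
```

The design point is the indirection: because jobs consult the catalog rather than walking directory trees, the data’s physical location stops mattering, which is exactly why extending AIDE to non-NetApp platforms (as discussed above) would be so useful.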
I’m genuinely excited to see how organizations leverage this platform. NetApp’s approach—focused, flexible, and partner-friendly—could be a game-changer in how enterprises integrate AI into their infrastructure.
Denny