DDN Gooses AI Storage Pipelines with Infinia 2.0



(spainter_vfx/Shutterstock)

AI’s insatiable demand for data has exposed a growing problem: storage infrastructure isn’t keeping up. From training foundation models to running real-time inference, AI workloads require high-throughput, low-latency access to vast amounts of data spread across cloud, edge, and on-prem environments. Traditional storage systems have often struggled under the weight of these demands, creating bottlenecks that can drastically delay innovation in the AI space.

Today, DDN unveiled Infinia 2.0, a major update to its AI-focused, software-defined data storage platform designed to eliminate the inefficiencies in AI storage and data management. The company says Infinia 2.0 acts as a unified, intelligent data layer that dynamically optimizes AI workflows.

“Infinia 2.0 is not just an upgrade; it’s a paradigm shift in AI data management,” DDN CEO Alex Bouzari says, emphasizing how Infinia builds on the company’s deep-rooted expertise in HPC storage to power the next generation of AI-driven data services.

A rendering of a large-scale Infinia 2.0 configuration from DDN’s Beyond Artificial virtual event.

As AI adoption grows, the challenges of scale, speed, and efficiency become more apparent. LLMs, generative AI applications, and inference systems require not only massive datasets but the ability to access and process them faster than ever. Traditional storage solutions struggle with performance bottlenecks, making it difficult for GPUs to receive the data they need quickly enough and limiting overall training efficiency. At the same time, organizations must navigate the fragmentation of data across multiple locations, from structured databases to unstructured video and sensor data. Moving data between these environments creates inefficiencies, driving up operational costs and introducing latency issues that slow AI applications.

DDN claims Infinia 2.0 solves these challenges by integrating real-time AI data pipelines, dynamic metadata-driven automation, and multi-cloud unification, all optimized specifically for AI workloads. Rather than forcing enterprises to work with disconnected data lakes, Infinia 2.0 introduces a Data Ocean, a unified global view that eliminates redundant copies and lets organizations process and analyze their data wherever it resides. This is meant to reduce storage sprawl and to allow AI models to search and retrieve relevant data more efficiently using an advanced metadata tagging system. With virtually unlimited metadata capabilities, AI applications can associate vast amounts of metadata with each object, making search and retrieval operations dramatically faster.

Infinia 2.0 integrates with frameworks like TensorFlow and PyTorch, which the company says eliminates the need for complex format conversions, allowing AI execution engines to interact with data directly and significantly speed up processing times. The platform is also designed for extreme scalability, supporting deployments that range from a few terabytes to exabytes of storage, making it versatile enough to meet the needs of both startups and enterprise-scale AI operations.

Performance is another area where Infinia 2.0 could be a breakthrough. The platform boasts 100x faster metadata processing, reducing lookup times from over ten milliseconds to less than one. AI pipelines execute 25x faster, while the system can handle up to 600,000 object lists per second, surpassing the limits of even AWS S3. By leveraging these capabilities, DDN asserts, AI-driven organizations can ensure their models are trained, refined, and deployed with minimal lag and maximum efficiency.

(Source: DDN)

During a virtual launch event today called Beyond Artificial, DDN’s claims were reinforced by strong endorsements from industry leaders like Nvidia CEO Jensen Huang, who highlighted Infinia’s potential to redefine AI data management, emphasizing how metadata-driven architectures like Infinia transform raw data into actionable intelligence. Enterprise computing leader Lenovo also praised the platform, underscoring its ability to merge on-prem and cloud data for more efficient AI deployment.

Supermicro, another DDN partner, also endorses Infinia: “At Supermicro, we’re proud to partner with DDN to transform how organizations leverage data to drive business success,” said Charles Liang, founder, president, and CEO at Supermicro. “By combining Supermicro’s high-performance, energy-efficient hardware with DDN’s innovative Infinia platform, we empower customers to accelerate AI workloads, maximize operational efficiency, and reduce costs. Infinia’s seamless data unification across cloud, edge, and on-prem environments enables businesses to make faster, data-driven decisions and achieve measurable results, aligning perfectly with our commitment to delivering optimized, sustainable infrastructure solutions.”

At the Beyond Artificial event, Bouzari and Huang sat down for a fireside chat to reflect on how an earlier idea, born from a 2017 meeting with Nvidia, evolved into the Infinia platform.

DDN had been asked to help build a reference architecture for AI computing, but Bouzari saw a much bigger opportunity. If Huang’s vision for AI was going to materialize, the world would need a fundamentally new data architecture, one that could scale AI workloads, eliminate latency, and transform raw information into actionable intelligence.

At the Beyond Artificial event, Huang and Bouzari sit down for a fireside chat about the bigger picture of storage and AI.

Infinia is more than just storage, Bouzari says, and fuels AI systems the way energy fuels a brain. And according to Huang, that distinction is critical.

“One of the most important things people forget is the importance of data that’s critical during application, not just during training,” Huang notes. “You want to train on an enormous amount of data for pretraining, but during use, the AI has to access information, and AI wants to access information, not in raw data form, but in informational flow.”

This shift from traditional storage to AI-native data intelligence has profound implications, the CEOs say. Instead of treating storage as a passive repository, DDN and Nvidia are turning it into an active layer of intelligence, enabling AI to retrieve insights instantly.

“This is the reason why the reframing of storage of objects and raw data into data intelligence is this new opportunity for DDN, providing data intelligence for all the world’s enterprises as AIs run on top of this fabric of information,” Huang says, calling it “an extraordinary reframing of computing and storage.”

Reframing certainly seems necessary: as AI continues to evolve, the infrastructure supporting it must evolve as well. DDN’s Infinia 2.0 could represent a major shift in how enterprises approach AI storage, not as a passive archive, but as an active intelligence layer that fuels AI systems in real time. By eliminating traditional bottlenecks, unifying distributed data, and integrating seamlessly with AI frameworks, Infinia 2.0 aims to reshape how AI applications access, process, and act on information.

With endorsements from industry leaders like Nvidia, Supermicro, and Lenovo, and with its latest funding round of $300 million at a $5 billion valuation, DDN is positioning itself as a key player in the AI landscape. Whether Infinia 2.0 delivers on its ambitious promises remains to be seen, but one thing is clear: AI’s next frontier isn’t just about models and compute but about rethinking data itself. And with this launch, DDN is making the case that the future of AI hinges on new paradigms for data management.

Learn more about the technical aspects of Infinia 2.0 at this link, or watch a replay of Beyond Artificial here.

Related Items:

Feeding the Virtuous Cycle of Discovery: HPC, Big Data, and AI Acceleration

The AI Data Cycle: Understanding the Optimal Storage Mix for AI Workloads at Scale

DDN Cranks the Data Throughput with AI400X2 Turbo

Editor’s note: This article first appeared on AIWire.

 
