
Azure NetApp Files: Revolutionizing silicon design for high-performance computing


Learn how the Azure Hardware Systems and Interconnect team leverages Azure NetApp Files for chip development.

High-performance computing (HPC) workloads place significant demands on cloud infrastructure, requiring robust and scalable resources to handle complex and intensive computational tasks. These workloads often require high levels of parallel processing power, typically provided by clusters of central processing unit (CPU) or graphics processing unit (GPU)-based virtual machines. Additionally, HPC applications demand substantial data storage and fast access speeds, which exceed the capabilities of traditional cloud file systems. Specialized storage solutions are required to meet their low-latency, high-throughput input/output (I/O) needs.

Microsoft Azure NetApp Files is designed to deliver low latency, high performance, and enterprise-grade data management at scale. The unique capabilities of Azure NetApp Files make it suitable for several high-performance computing workloads such as Electronic Design Automation (EDA), seismic processing, reservoir simulation, and risk modeling. This blog highlights Azure NetApp Files' differentiated capabilities for EDA workloads and Microsoft's silicon design journey.

Infrastructure requirements of EDA workloads

EDA workloads have intensive computational and data processing requirements to handle complex tasks in simulation, physical design, and verification. Each design stage involves multiple simulations to improve accuracy, increase reliability, and detect design defects early, reducing debugging and redesign costs. Silicon development engineers can use additional simulations to test different design scenarios and optimize the chip's power, performance, and area (PPA).

EDA workloads are categorized into two primary types, frontend and backend, each with distinct requirements for the underlying storage and compute infrastructure. Frontend workloads cover logic design and the functional aspects of chip design and consist of thousands of short-duration parallel jobs with an I/O pattern characterized by frequent random reads and writes across millions of small files. Backend workloads focus on translating the logic design into the physical design for manufacturing and consist of hundreds of jobs involving sequential reads and writes of fewer, larger files.

Choosing a storage solution that satisfies this unique combination of frontend and backend workload patterns is non-trivial. The SPEC consortium has established the SPEC SFS benchmark to help benchmark the various storage solutions in the industry. For EDA workloads, the EDA_BLENDED benchmark models the characteristic patterns of the frontend and backend workloads. The composition of I/O operations is described in the following table.

EDA workload stage  I/O operation types
Frontend  Stat (39%), Access (15%), Read File (7%), Random Read (8%), Write File (10%), Random Write (15%), Other Ops (6%)
Backend  Read (50%), Write (50%)
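The frontend mix above can be approximated with a simple weighted sampler, which is handy for generating a synthetic operation stream when prototyping a load test. This is only an illustrative sketch; for real benchmarking, use the official SPEC SFS suite.

```python
import random

# Frontend I/O operation mix from the SPEC SFS EDA_BLENDED table (percentages).
FRONTEND_MIX = {
    "stat": 39, "access": 15, "read_file": 7, "random_read": 8,
    "write_file": 10, "random_write": 15, "other": 6,
}

def sample_ops(n, mix=FRONTEND_MIX, seed=None):
    """Draw n operation names with probability proportional to the mix weights."""
    rng = random.Random(seed)
    ops, weights = zip(*mix.items())
    return rng.choices(ops, weights=weights, k=n)

if __name__ == "__main__":
    stream = sample_ops(10_000, seed=42)
    print(stream[:5])
```

Each sampled name could then be dispatched to a corresponding file-system operation against a test volume.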

Azure NetApp Files supports regular volumes that are ideal for workloads like databases and general-purpose file systems. EDA workloads operate on large amounts of data and require very high throughput, which would otherwise call for multiple regular volumes. The introduction of large volumes to support greater quantities of data is advantageous for EDA workloads, as it simplifies data management and delivers superior performance compared to multiple regular volumes.
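For illustration, a large volume can be provisioned with the Azure CLI roughly as follows. All resource names here are placeholders, and the exact flags (notably `--is-large-volume`) depend on your `az` CLI version, so treat this as a hedged sketch rather than a definitive command.

```shell
# Sketch: creating an Azure NetApp Files large volume with the Azure CLI.
# Resource-group, account, pool, vnet, and subnet names are placeholders.
# Large volumes require a minimum capacity (50 TiB); --usage-threshold is in GiB.
az netappfiles volume create \
  --resource-group eda-rg \
  --account-name eda-anf-account \
  --pool-name eda-pool \
  --name eda-large-vol \
  --location eastus \
  --service-level Ultra \
  --usage-threshold 512000 \
  --file-path eda-large-vol \
  --vnet eda-vnet \
  --subnet anf-subnet \
  --protocol-types NFSv3 \
  --is-large-volume true
```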

Below is the output from performance testing with the SPEC SFS EDA_BLENDED benchmark, which demonstrates that Azure NetApp Files can deliver ~10 GiB/s of throughput at less than 2 ms of latency using large volumes.

[Figure: SPEC SFS EDA_BLENDED benchmark results showing throughput versus latency on large volumes]

Electronic Design Automation at Microsoft

Microsoft is committed to enabling AI on every workload and experience for the devices of today and tomorrow. It starts with the design and manufacturing of silicon. Microsoft is pushing scientific boundaries at an unprecedented pace for running EDA workflows, stretching the limits of Moore's Law by adopting Azure for our own chip design needs.

[Diagram: Microsoft's custom cloud silicon portfolio]

Using a best-practices model to optimize Azure for chip design across customers, partners, and suppliers has been crucial to the development of some of Microsoft's first fully custom cloud silicon chips:

  • The Azure Maia 100 AI Accelerator, optimized for AI tasks and generative AI.
  • The Azure Cobalt 100 CPU, an Arm-based processor tailored to run general-purpose compute workloads on Microsoft Azure.
  • The Azure Integrated Hardware Security Module, Microsoft's newest in-house security chip designed to harden key management.
  • The Azure Boost DPU, the company's first in-house data processing unit designed for data-centric workloads with high efficiency and low power.

The chips developed by the Azure cloud hardware team are deployed in Azure servers, delivering best-in-class compute capabilities for HPC workloads and further accelerating the pace of innovation, reliability, and operational efficiency used to develop Azure's production systems. By adopting Azure for EDA, the Azure cloud hardware team enjoys these benefits:

  • Rapid access to scalable, on-demand, cutting-edge processors.
  • Dynamic pairing of each EDA tool to a specific CPU architecture.
  • Leveraging Microsoft's innovations in AI-driven technologies for semiconductor workflows.

How Azure NetApp Files accelerates semiconductor development innovation

  • Superior performance: Azure NetApp Files can deliver up to 652,260 IOPS with less than 2 milliseconds of latency, while reaching 826,000 IOPS at the performance edge (~7 milliseconds of latency).
  • High scalability: As EDA projects advance, the data generated can grow exponentially. Azure NetApp Files provides large-capacity, high-performance single namespaces with large volumes of up to 2 PiB, seamlessly scaling to support compute clusters of up to 50,000 cores.
  • Operational simplicity: Azure NetApp Files is designed for simplicity, with a convenient user experience via the Azure Portal or automation APIs.
  • Cost efficiency: Azure NetApp Files offers cool access to transparently move cool data blocks to a managed Azure storage tier for reduced cost, and then automatically back to the hot tier on access. Additionally, Azure NetApp Files reserved capacity provides significant cost savings compared to pay-as-you-go pricing, further reducing the high costs associated with enterprise-grade storage solutions.
  • Security and reliability: Azure NetApp Files provides enterprise-grade data management, control-plane, and data-plane security features, ensuring that critical EDA data is protected and available, with key management and encryption for data at rest and data in transit.

The graphic below shows a production EDA cluster deployed in Azure by the Azure cloud hardware team, where Azure NetApp Files serves clients with over 50,000 cores per cluster.

[Image: production EDA cluster in Azure served by Azure NetApp Files]
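A back-of-envelope check shows how a single large volume can keep up with a cluster of this size. The per-core throughput figure below is a purely illustrative assumption (real EDA tools vary widely); the ~10 GiB/s volume figure comes from the benchmark result quoted earlier.

```python
# Back-of-envelope sizing: can one large volume feed a 50,000-core cluster?
# PER_CORE_MIB_S is an illustrative assumption, not a measured figure.
CORES = 50_000
PER_CORE_MIB_S = 0.2          # assumed average demand per core (MiB/s)
VOLUME_GIB_S = 10.0           # ~10 GiB/s observed in the EDA_BLENDED test

demand_gib_s = CORES * PER_CORE_MIB_S / 1024
print(f"aggregate demand: {demand_gib_s:.2f} GiB/s")
print("fits in one large volume:", demand_gib_s <= VOLUME_GIB_S)
```

Under these assumptions the aggregate demand lands just under the measured volume throughput, which is consistent with the single-namespace design described above.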

Azure NetApp Files provides the scalable performance and reliability that we need to facilitate seamless integration with Azure for a diverse set of Electronic Design Automation tools used in silicon development.

—Mike Lemus, Director, Silicon Development Compute Solutions at Microsoft.

In today’s fast-paced world of semiconductor design, Azure NetApp Files offers agility, performance, security, and stability—the keys to delivering silicon innovation for our Azure cloud.

—Silvian Goldenberg, Partner and General Manager for Design Methodology and Silicon Infrastructure at Microsoft.

Learn more about Azure NetApp Files

Azure NetApp Files has proven to be the storage solution of choice for the most demanding EDA workloads. By providing low latency, high throughput, and scalable performance, Azure NetApp Files supports the dynamic and complex nature of EDA tasks, ensuring rapid access to cutting-edge processors and seamless integration with Azure's HPC solution stack.

Check out the Azure Well-Architected Framework perspective on Azure NetApp Files for detailed information and guidance.

For further information related to Azure NetApp Files, check out the Azure NetApp Files documentation.


