Introduction
In May 2025, Microsoft released the Azure FXv2-series VMs – compute-optimized instances featuring the 5th Gen Intel® Xeon® Platinum 8573C (Emerald Rapids) CPU – now generally available across a growing list of regions.
Performance at a Glance
The FXv2-series delivers:
- Up to 50% CPU performance boost over FXv1, thanks to all-core turbo up to 4.0 GHz.
- Support for up to 96 vCPUs and 1,832 GiB of memory (up to 21 GiB per vCPU).
- NVMe-backed local & remote storage, doubling IOPS and offering up to 5× throughput improvements. Remote disk support enables up to 400K IOPS and 11.25 GB/s throughput.
- 70 Gbps maximum network bandwidth, leveraging Azure Boost and Microsoft Azure Network Adapter (MANA).
- Enhanced AI/ML and HPC readiness through Intel AMX and Total Memory Encryption (TME).
Who It’s For & Why It Matters
- Database & Analytics Workloads (e.g., SQL Server, Oracle RAC)
Ideal for large databases requiring high IOPS, low latency, and sustained CPU throughput. Benchmarks have demonstrated up to 1.2M IOPS and 33.75 GB/s throughput in high-performance scenarios. - Electronic Design Automation (EDA)
EDA customers benefit from improved IPC, large L3 caches, higher memory capacity, and faster storage, all contributing to faster chip design iterations and better license efficiency. - High-Performance Storage & Networking
With Azure Boost, remote storage capabilities scale up to 800K IOPS / 16 GB/s, and local storage can deliver up to 6.6M IOPS / 36 GB/s in certain configurations.
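To put those remote-storage numbers into practice, here is a minimal sketch of provisioning a Premium SSD v2 data disk with explicit IOPS and throughput targets using the Azure SDK for Python. The resource group, region, zone, capacity, and performance values are illustrative assumptions, not figures from the announcement; tune them to your subscription and VM-size limits.

```python
# pip install azure-identity azure-mgmt-compute
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # assumption: replace with your own
RESOURCE_GROUP = "fxv2-demo-rg"              # hypothetical, pre-created resource group
LOCATION = "eastus2"                         # one of the listed FXv2 regions

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Premium SSD v2 lets you set IOPS and MB/s independently of capacity.
# The values below are examples; attainable limits depend on disk size,
# the VM size you attach it to, and regional quotas.
disk = compute.disks.begin_create_or_update(
    RESOURCE_GROUP,
    "fxv2-data-disk",
    {
        "location": LOCATION,
        "zones": ["1"],                       # Premium SSD v2 is a zonal resource
        "sku": {"name": "PremiumV2_LRS"},
        "creation_data": {"create_option": "Empty"},
        "disk_size_gb": 1024,
        "disk_iops_read_write": 80000,        # example target IOPS
        "disk_m_bps_read_write": 1200,        # example target throughput in MB/s
    },
).result()
print(disk.name, disk.provisioning_state)
```

Attach the disk to the VM's storage profile afterwards; the effective per-VM IOPS and throughput are still capped by the remote-storage limits of the chosen FXv2 size.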
Technical Deep Dive
| Feature | Specifications |
|---|---|
| vCPU / Memory | Up to 96 vCPUs, 1,832 GiB RAM |
| CPU Performance | All-core turbo up to 4.0 GHz |
| Storage (remote) | Up to 400K IOPS, 11.25 GB/s with Premium SSD v2 / Ultra Disk |
| Storage (local) | NVMe-based with Azure Boost enhancements |
| Networking | Up to 70 Gbps via Azure Boost & MANA |
| Security & AI | Intel TME DDR encryption, Intel AMX support |
| Config Options | FXmdsv2 (with local disk), FXmsv2 (without), plus constrained-core SKUs |
Availability
Currently available in:
- Australia East, Canada Central, Central US, East US, East US 2, Germany West Central, Japan East, Korea Central, South Africa North, South Central US, Sweden Central, Switzerland North, West Europe, West US 3
More regions will be added throughout 2025.
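If you prefer to confirm availability programmatically rather than from the list above, the resource SKUs API can be filtered by location. This is a small sketch assuming the Azure SDK for Python; the region and the simple "FX" name match are illustrative.

```python
# pip install azure-identity azure-mgmt-compute
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # assumption: replace with your own
LOCATION = "swedencentral"                  # any region you are targeting

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# List every VM SKU offered in the region and keep the FX-family sizes,
# flagging any that are restricted for the current subscription.
for sku in compute.resource_skus.list(filter=f"location eq '{LOCATION}'"):
    if sku.resource_type == "virtualMachines" and "FX" in sku.name:
        restricted = any(r.reason_code for r in (sku.restrictions or []))
        print(f"{sku.name:<28} restricted={restricted}")
```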
What Sets FXv2 Apart
- CPU: up to 50% uplift vs. FXv1, with up to 96 vCPUs.
- Storage: NVMe-fast remote storage, 2–5× performance gains.
- Networking: 70 Gbps with Azure Boost.
- Memory: 21 GiB per vCPU for data-heavy workloads.
- AI & Security: Intel AMX for AI, Total Memory Encryption for protection.
- Flexibility: Wide SKU options including local disk and constrained-core variants.
Best Practices & When to Choose FXv2
- Databases & Analytics: Ideal where high compute, memory, and storage I/O converge.
- EDA Workloads: High-performance chip design scenarios requiring ultra-low latency.
- ML & AI Prep: Useful for CPU-bound inference pipelines with AMX acceleration.
How to Get Started
- Choose between FXmsv2 (no local NVMe) or FXmdsv2 (with local NVMe).
- Use constrained-core SKUs to optimize licensing costs.
- Deploy via the Azure Portal, CLI, or ARM templates (a Python SDK sketch follows this list).
- Combine with Premium SSD v2 or Ultra Disk for top-tier I/O.
- Monitor performance using Azure Monitor or your preferred observability tools.
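As a complement to the Portal/CLI/ARM options above, here is a minimal sketch of creating an FXv2 VM with the Azure SDK for Python. The size name Standard_FX48mds_v2, the resource names, the image reference, and the pre-existing NIC are all assumptions for illustration; check the FXv2 size documentation for the exact SKU names available in your region.

```python
# pip install azure-identity azure-mgmt-compute
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # assumption: replace with your own
RESOURCE_GROUP = "fxv2-demo-rg"              # hypothetical, pre-created
LOCATION = "eastus2"
NIC_ID = ("/subscriptions/<sub>/resourceGroups/fxv2-demo-rg/providers/"
          "Microsoft.Network/networkInterfaces/fxv2-demo-nic")  # pre-created NIC

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

vm = compute.virtual_machines.begin_create_or_update(
    RESOURCE_GROUP,
    "fxv2-demo-vm",
    {
        "location": LOCATION,
        # Illustrative FXv2 size; pick an FXmsv2 size if you do not need local NVMe.
        "hardware_profile": {"vm_size": "Standard_FX48mds_v2"},
        "storage_profile": {
            # Image reference is illustrative; substitute your preferred image.
            "image_reference": {
                "publisher": "Canonical",
                "offer": "ubuntu-24_04-lts",
                "sku": "server",
                "version": "latest",
            },
            "os_disk": {
                "create_option": "FromImage",
                "managed_disk": {"storage_account_type": "Premium_LRS"},
            },
        },
        "os_profile": {
            "computer_name": "fxv2-demo-vm",
            "admin_username": "azureuser",
            "linux_configuration": {
                "disable_password_authentication": True,
                "ssh": {"public_keys": [{
                    "path": "/home/azureuser/.ssh/authorized_keys",
                    "key_data": "<your-ssh-public-key>",
                }]},
            },
        },
        "network_profile": {"network_interfaces": [{"id": NIC_ID}]},
    },
).result()
print(vm.name, vm.provisioning_state)
```

Constrained-core SKUs are chosen the same way: pass the constrained size name as vm_size (the exact names vary, so confirm them against the size documentation before deploying).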
Final Thoughts
The Azure FXv2-series marks a significant leap in compute performance, memory scaling, and storage efficiency. Whether you’re running high-transaction databases, advanced analytics, or chip design simulations, FXv2 offers the horsepower to run demanding workloads faster and more efficiently – delivering up to 50% higher CPU performance and up to 5× the storage throughput of previous generations.
Now is the time to explore FXv2 and supercharge your enterprise-grade workloads on Azure. Thanks for reading.
P.S. A modern AI tool was used to create some of the content. Technical validation and proofreading were done by the author.