F5 on Intel Technology-Powered Platforms

Secure and optimize your apps and AI models by using F5 solutions on Intel processors and platforms. 

Accelerate AI Inference and App Performance at the Edge

F5 and Intel have partnered to provide secure AI inference that combines protection, performance, and availability. This validated solution lets you deploy AI-powered apps anywhere.

Protect AI Models and Apps

F5 NGINX One on Intel infrastructure processing units (IPUs) lets you create a security air gap and enforce mTLS certificate authentication to protect access.
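
As a minimal illustration (the hostname, certificate paths, and upstream name below are hypothetical and not taken from this page), an NGINX server block that enforces mTLS in front of an inference backend might look like this:

# Illustrative sketch only: require and verify client certificates (mTLS)
# before traffic reaches the AI inference backend.
server {
    listen 443 ssl;
    server_name inference.example.com;              # hypothetical hostname

    ssl_certificate        /etc/nginx/certs/server.crt;
    ssl_certificate_key    /etc/nginx/certs/server.key;

    # Clients must present a certificate signed by this CA.
    ssl_client_certificate /etc/nginx/certs/client-ca.crt;
    ssl_verify_client      on;

    location / {
        proxy_pass http://inference_backend;        # upstream defined elsewhere
    }
}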

Accelerate Performance

Running NGINX One on an Intel IPU improves security performance and offloads decryption tasks to the IPU, conserving CPU resources for AI models.

Ensure Availability

Active health checks, high availability, and intelligent load balancing from NGINX One keep inference accessible whenever it's needed.
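
For illustration only (the server addresses and health-check parameters are assumptions, not part of this solution's published configuration), an NGINX Plus-style upstream with least-connections load balancing and active health checks could look like this:

# Illustrative sketch only: balance two inference servers and probe them actively.
upstream inference_backend {
    least_conn;                     # route new requests to the least-busy server
    server 10.0.0.11:8000;
    server 10.0.0.12:8000;
}

server {
    listen 443 ssl;
    # TLS and mTLS settings as in the earlier sketch

    location / {
        proxy_pass http://inference_backend;
        health_check interval=5 fails=2 passes=2;   # active health checks (NGINX Plus)
    }
}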

How F5 Helps

Accelerate and protect AI inference for rapid and reliable access

F5 NGINX One, Intel IPUs, and the Intel OpenVINO toolkit deliver high-performance protection and simplified AI model optimization and deployment for secure inference at the edge.

Read the solution overview ›

Get faster performance with BIG-IP VE and Intel QuickAssist Technology (QAT)

By offloading and accelerating cryptographic processing, Intel QAT improves BIG-IP VE security and application performance while reducing CPU utilization.

Read the white paper ›

Mitigate attacks in cloud environments with F5 BIG-IP VE and Intel

BIG-IP VE for SmartNICs delivers up to a 300x increase in DDoS mitigation capacity by integrating with the Intel FPGA PAC N3000 to offload malicious traffic detection, free up CPU resources, and lower TCO.

Read the solution brief ›

Next Steps

Find out how F5 products and solutions can help you achieve your goals.

Contact F5