Accelink: Key Technologies for Energy-Efficient Pluggable Optics in AI/ML Applications

Original Article by SemiVision Research (Accelink)

SEMI VISION
May 24, 2025

This presentation is proudly delivered by Accelink on the topic of:

Key Technologies for Energy-Efficient Pluggable Optics in AI/ML Applications

As the demand for AI and machine learning computation continues to grow exponentially, the data center industry is facing unprecedented challenges in network bandwidth scalability and energy efficiency.

Whether for AI model training or inference, the ability to simultaneously enhance network performance and power efficiency has become a critical priority for future infrastructure planning.

Accelink, with deep expertise in optical communications and module technologies, believes that two key technology directions will be essential to overcoming these limitations:

  • Linear Pluggable Optics (LPO) – delivering lower power consumption and higher reliability to enable high-speed connectivity for AI clusters (see the power-budget sketch after this list).

  • Immersion Cooling – offering superior thermal management to significantly improve overall system energy efficiency.
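
To make the power argument concrete, here is a minimal back-of-envelope sketch of cluster-level savings from removing the module DSP. The per-module wattages and port count are illustrative assumptions chosen only for the calculation, not figures from Accelink's presentation.

  # Illustrative estimate of cluster-level optics power with and without LPO.
  # ASSUMED figures, not Accelink data: ~16 W for a DSP-based 800G pluggable,
  # ~8 W for an 800G LPO module, and 50,000 optical ports in the cluster.

  DSP_MODULE_W = 16.0   # assumed conventional 800G module power (W)
  LPO_MODULE_W = 8.0    # assumed 800G LPO module power (W)
  PORTS = 50_000        # assumed optical port count for a large AI cluster

  def cluster_optics_kw(ports: int, watts_per_module: float) -> float:
      """Total optical-module power for the cluster, in kilowatts."""
      return ports * watts_per_module / 1_000.0

  dsp_kw = cluster_optics_kw(PORTS, DSP_MODULE_W)
  lpo_kw = cluster_optics_kw(PORTS, LPO_MODULE_W)
  print(f"DSP-based optics: {dsp_kw:,.0f} kW")
  print(f"LPO optics:       {lpo_kw:,.0f} kW")
  print(f"Savings:          {dsp_kw - lpo_kw:,.0f} kW ({1 - lpo_kw / dsp_kw:.0%})")

Under these assumed numbers, halving the per-module power frees several hundred kilowatts of optics power per cluster, which is why LPO features so prominently in the energy-efficiency discussion.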

In the following session, SemiVision will present the latest advancements, validation results, and the role these technologies play in enabling higher performance and lower power AI data center architectures.

The accelerating demand for AI and data center infrastructure can be clearly observed from two critical perspectives: market growth and energy challenges.

Ethernet Optical Transceiver Market Growth (LightCounting Data)

The market data shows two key application segments:

  • Yellow: General Ethernet applications.

  • Blue: AI Clusters (dedicated AI compute clusters).

From 2024 onward, the share of AI Cluster deployments rises sharply. By 2028, the overall Ethernet optical transceiver market is expected to exceed $10 billion, with AI-related demand accounting for nearly half of the total.

Data Center Power Demand (Goldman Sachs Analysis)

Data center power demand in both the U.S. and worldwide shows steep growth, driven by AI and non-AI workloads alike.

  • By 2030, global data center electricity consumption is projected to surpass 1,000 TWh (a quick conversion follows this list).

  • The share of AI-related power consumption is expected to expand significantly.

  • Notably, the U.S. AI power demand shows the most aggressive growth, emerging as one of the major energy challenges for the industry.
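
For a sense of scale, the short conversion below turns that annual projection into an average continuous power draw; the only input is the 1,000 TWh figure quoted above.

  # Convert the projected annual consumption (>1,000 TWh by 2030, per the
  # Goldman Sachs figure cited above) into an average continuous power draw.

  ANNUAL_TWH = 1000.0        # projected global data center consumption (TWh/year)
  HOURS_PER_YEAR = 8760.0    # 365 days x 24 hours

  average_gw = ANNUAL_TWH * 1000.0 / HOURS_PER_YEAR   # TWh -> GWh, then GWh/h = GW
  print(f"~{average_gw:.0f} GW of continuous draw")   # roughly 114 GW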

More Fibers and More Power Required for AI/ML

AI/ML workloads demand not only higher bandwidth enabled by more optical fibers, but also substantially higher power to support compute-intensive tasks.

This dual challenge of scaling bandwidth and energy efficiency has become a critical consideration for the future of infrastructure expansion and sustainability.

For paid members, SemiVision will discuss the following topics:

  • LPO’s Importance and Its Integration with NIC

  • Observations on 800G LPO (Linear Pluggable Optics) Performance and System-Level Benefits

  • Innovations in Silicon Photonics and the Adoption of New Materials

  • LPO and Immersion Cooling — Key Technologies for Data Center Energy Efficiency and AI Compute Scaling
