January 26, 2026 - AscentOptics, a leading provider of optical transceiver solutions, today announced that its 400G and 800G AI optical modules have entered volume production and are being delivered ...
Finchetto CEO exclusively talks to us about using light to cut network latency, reduce power use, and remove network ...
Microsoft’s new Maia 200 inference accelerator enters this overheated market as a chip that aims to cut the price ...
Memory chips, prices and AI infrastructure are now tightly linked as the global semiconductor market enters what industry ...
Micron Technology (NASDAQ:MU) shares declined 2% on Monday after reports indicated that Samsung Electronics (KS:005930) is ...
Microsoft on Monday unveiled the second generation of its in-house artificial intelligence chip, along with software tools ...
AI factories meet the computational capacity and power requirements of today’s machine-learning and generative AI workloads.
After changing its requirements for high-bandwidth memory chips, Nvidia is close to certifying a new source for supplies.
Google researchers have revealed that memory and interconnect, not compute power, are the primary bottlenecks for LLM inference, with memory bandwidth lagging compute by 4.7x.
The U.S. government kicked off the year with significant trade updates affecting the advanced semiconductor industry. These new policy changes ...
Samsung Electronics Co. is nearing certification from Nvidia Corp. for its HBM4, the company’s sixth-generation ...
Riemann Computing is Seeking Funding to Build out a Data Center. We are looking to democratize how data is processed in ...