Microsoft has introduced Maia 200, its latest in-house AI accelerator, designed for large-scale inference deployments. The company announced that Azure's US central datacentre region is the first to receive the new artificial intelligence (AI) inference accelerator.
With over 100 billion transistors, Maia 200 offers "powerhouse" AI inferencing possibilities, Microsoft says.
What is the Maia 200 AI accelerator? Maia 200 is Microsoft's custom-designed chip, specifically an AI inference accelerator built for serving trained models at scale rather than training them.
The hyperscaler leverages a two-tier Ethernet-based topology, a custom AI Transport Layer and software tools to deliver a tightly integrated, low-latency platform.
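To illustrate why a two-tier Ethernet topology keeps latency low, here is a minimal sketch, not Microsoft's actual fabric design. It models the generic leaf-spine pattern that two-tier datacentre networks follow, where every leaf switch uplinks to every spine switch; the function names and hop counts are illustrative assumptions, not details from the announcement.

```python
# Hypothetical sketch of a generic two-tier (leaf-spine) fabric.
# In this topology class, any two endpoints are at most three switch
# hops apart regardless of cluster size, which is one reason such
# designs are described as "tightly integrated, low-latency".

def path_hops(src_leaf: int, dst_leaf: int) -> int:
    """Switch hops between two accelerators attached to leaf switches."""
    if src_leaf == dst_leaf:
        return 1  # same leaf switch: traffic never leaves the leaf
    return 3      # leaf -> spine -> leaf: the bounded worst case

# Worst-case path length stays constant as the fabric grows,
# because adding capacity means adding leaves and spines, not tiers.
print(path_hops(0, 0))  # same-leaf path
print(path_hops(0, 7))  # cross-leaf path via a spine
```

The constant hop bound is the key property: in a two-tier design, scaling out the cluster widens the fabric rather than deepening it, so latency between any two accelerators stays predictable.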