The Only Company That Can Challenge Nvidia Just Landed OpenAI
For years, AMD played second fiddle to Nvidia in AI chips. That changed in October 2025 when OpenAI signed a 6-gigawatt GPU agreement that could generate tens of billions in revenue for AMD over multiple years. The deal includes warrants for up to 160 million AMD shares—giving OpenAI the option to acquire roughly 10% of the company. Sam Altman didn’t just validate AMD’s roadmap. He bet on it.
Current price: ~$208 per share. Market cap: $343 billion. Forward P/E: 42x. AMD trades at a premium to the semiconductor sector average, but well below its recent highs and, bulls would argue, below what its Nvidia-challenging growth ambitions imply. The stock reached $267 in October 2025 before pulling back as investors debated whether execution can match the promise.
Watch AMD Live Chart – Real-Time Stock Price
Real-time TradingView chart with technical indicators:
Why AMD Matters to BusinessTech Readers
The OpenAI Partnership: AMD’s Biggest Win Ever
OpenAI’s 6-gigawatt GPU agreement with AMD represents the largest AI chip deployment deal in the company’s history. The first 1-gigawatt deployment of AMD Instinct MI450 GPUs begins in the second half of 2026, with subsequent tranches scaling to 6 gigawatts across multiple generations of hardware. AMD expects the partnership to deliver “tens of billions of dollars in revenue.”
The deal structure reveals how badly OpenAI wants supply chain diversification. AMD issued OpenAI warrants for up to 160 million shares—about 10% of the company—with vesting tied to deployment milestones and AMD stock price targets. The final tranche requires AMD to reach roughly $1 trillion market cap (about $600/share). If OpenAI holds through all milestones, the warrant position could be worth ~$100 billion.
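A quick back-of-the-envelope check on those figures (a sketch, not disclosed deal terms: the share count is inferred from the price and market cap quoted above, and the warrants' exercise cost is assumed to be negligible):

```python
# Rough sanity check on the warrant figures quoted above.
# Assumptions: share count inferred from ~$208 price and ~$343B market cap;
# exercise cost of the warrants treated as negligible.
price = 208.0               # current share price (USD)
market_cap = 343e9          # current market cap (USD)
warrant_shares = 160e6      # warrants issued to OpenAI
final_milestone_cap = 1e12  # final tranche tied to ~$1T market cap

shares_outstanding = market_cap / price                        # ~1.65B shares
warrant_stake = warrant_shares / shares_outstanding            # ~9.7%, i.e. "roughly 10%"
price_at_milestone = final_milestone_cap / shares_outstanding  # ~$606/share
warrant_value = warrant_shares * price_at_milestone            # ~$97B, i.e. "~$100 billion"

print(f"Implied shares outstanding: {shares_outstanding / 1e9:.2f}B")
print(f"OpenAI warrant stake: {warrant_stake:.1%}")
print(f"Share price at ~$1T market cap: ${price_at_milestone:,.0f}")
print(f"Warrant value at that price: ${warrant_value / 1e9:.0f}B")
```

The arithmetic lines up with the roughly 10% stake and the ~$600 milestone price cited above.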
UBS analyst Timothy Arcuri called it “AMD’s own stock paying for OpenAI’s chip purchases.” The financial engineering is creative, but the strategic validation is unmistakable: OpenAI believes AMD’s MI450 and future generations can handle frontier AI workloads at scale. That’s a statement Nvidia’s competitors have never been able to credibly make.
MI350 and Helios: Catching Up to Blackwell
AMD’s Instinct MI350 series launched in June 2025 with aggressive claims: 4x AI compute performance and 35x inference improvement over MI300. Built on CDNA 4 architecture with TSMC’s 3nm process, MI350 packs 288GB of HBM3e memory with 8TB/s bandwidth—significantly more memory than Nvidia’s B200 at 180GB.
The MI355X variant delivers 10 PFLOPs in Matrix FP6/FP4 computations, competitive with Nvidia’s Blackwell B200. AMD claims 40% more tokens per dollar than B200 in inference workloads, though independent benchmarks show more nuanced results. SemiAnalysis found MI350 competitive for small-to-medium LLM inference but acknowledged Nvidia’s GB200 NVL72 maintains advantages at frontier model scale.
The bigger story is Helios—AMD’s rack-scale AI system announced alongside MI350. HPE will offer Helios as a “single turnkey rack” capable of trillion-parameter training, with 260TB/s aggregate scale-up bandwidth and 2.9 AI exaflops FP4 performance. AMD is no longer just selling chips. It’s selling integrated systems that compete directly with Nvidia’s GB200 NVL72 racks.
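One way to sanity-check the rack-level claim: assuming the commonly cited configuration of 72 GPUs per Helios rack (an assumption, not a number from this article), the 2.9-exaflop figure implies roughly 40 PFLOPS of FP4 per accelerator.

```python
# Implied per-GPU FP4 throughput for a Helios rack.
# Assumption: 72 GPUs per rack (not stated in this article).
rack_fp4_pflops = 2.9 * 1000   # 2.9 AI exaflops expressed in PFLOPS
gpus_per_rack = 72             # assumed Helios configuration
print(f"~{rack_fp4_pflops / gpus_per_rack:.0f} PFLOPS FP4 per GPU")  # ~40
```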
Data Center Dominance: 122% Revenue Growth
AMD’s Data Center segment has become the company’s growth engine. Q3 2024 results showed 122% year-over-year revenue growth to $3.5 billion—more than half of AMD’s total revenue for the first time. EPYC server CPUs gained share against Intel while MI300X GPUs expanded adoption with Microsoft, Meta, Oracle, and major cloud providers.
Meta uses MI300X exclusively to serve all live traffic for its Llama 405B frontier model. Microsoft deploys MI300X across multiple Copilot services powered by GPT-4 models. These aren’t pilot programs—they’re production deployments at hyperscale. AMD raised its 2024 Data Center GPU revenue forecast to exceed $5 billion, up from $4.5 billion in July.
Beyond the current year, AMD targets a multi-year data center revenue CAGR above 60% and a company-wide revenue CAGR above 35%. The company's Financial Analyst Day laid out an even more ambitious goal: $100 billion in annual data center revenue by 2030. Whether execution matches ambition remains the central question for investors.
Q3 2024 Financial Performance
- Revenue: $6.8 billion (+18% YoY), record high
- Data Center Revenue: $3.5 billion (+122% YoY)
- Client Revenue: Record high, driven by Ryzen AI PCs
- Gross Margin: 54% non-GAAP (+250 bps YoY)
- EPS: $0.92 non-GAAP (+31% YoY)
- Q4 2024 Guidance: $7.5 billion (+22% YoY)
- Full-Year 2024 Data Center GPU Revenue: >$5 billion
CEO Lisa Su summarized the trajectory: “The growth of our business is accelerating, and our financial performance is exceeding expectations as we meet an unwavering demand for the most advanced artificial intelligence technologies.” Gaming and Embedded segments showed weakness, but Data Center strength more than compensated.
The Nvidia Gap: Real Competition or Marketing?
AMD bulls point to technical specs showing MI350 competitive with Blackwell. AMD bears note that Nvidia's projected revenue (roughly $206 billion for fiscal 2026) dwarfs AMD's (roughly $33 billion projected for 2025). The reality lies somewhere in between.
Where AMD competes effectively: MI300X and MI350 show compelling performance-per-dollar for inference workloads on small-to-medium models. The 288GB HBM3e memory advantage matters for large model deployments where memory capacity limits batch sizes. AMD’s EPYC CPUs continue taking share from Intel in x86 servers, now targeting 36% market share.
Where Nvidia maintains its lead: the CUDA ecosystem (5 million developers), GB200 NVL72 dominance at frontier training scale, production execution, and customer lock-in. ROCm has improved dramatically (AMD claims a 2.4x inference performance improvement since the MI300 launch), but the software ecosystem remains the gap that matters most.
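To make the memory-capacity point above concrete, here is a deliberately simplified sketch: it assumes a 405B-parameter model served at 8-bit precision (roughly 1GB of weights per billion parameters) and ignores KV cache growth, activations, and parallelism overheads, so treat it as an illustration rather than a sizing guide.

```python
import math

def serving_footprint(params_billions: float, gpu_mem_gb: float) -> tuple[int, float]:
    """GPUs needed just to hold the weights, and leftover memory for KV cache / batching."""
    weights_gb = params_billions  # ~1 GB per billion params at 8-bit precision
    gpus = math.ceil(weights_gb / gpu_mem_gb)
    leftover_gb = gpus * gpu_mem_gb - weights_gb
    return gpus, leftover_gb

# Hypothetical 405B-parameter deployment on the two memory configurations above.
for name, mem_gb in [("288 GB per GPU (MI350-class)", 288), ("180 GB per GPU (B200-class)", 180)]:
    gpus, leftover = serving_footprint(405, mem_gb)
    print(f"{name}: {gpus} GPUs for weights, ~{leftover:.0f} GB left for KV cache")
```

Fewer GPUs per model replica and more leftover memory for KV cache allow larger batch sizes per replica, which is one reason memory capacity shows up in performance-per-dollar comparisons.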
Barclays analysts framed the OpenAI deal correctly: “We realize there will be delays with these deals, and that the infrastructure required largely doesn’t exist today, but we would again highlight this as a proof point that the ecosystem is desperate for more compute.” AMD wins because demand exceeds Nvidia supply, not because AMD beats Nvidia technically.
AMD’s Product Roadmap: MI400, MI500, and Beyond
AMD has committed to an aggressive annual release cadence for AI accelerators:
- MI350 (2025): Shipping now. CDNA 4, 288GB HBM3e, drop-in upgrade for MI300/MI325 systems
- MI450/Helios (2026): Rack-scale systems for OpenAI deployment, 432GB HBM4, 300GB/s Ultra Ethernet scale-out bandwidth
- MI500 (2027): Next architecture, targeting frontier model training at scale
- EPYC Turin (Now): 5th-gen server CPUs with up to 192 cores
- EPYC Venice (2026): Next-gen server CPUs for exascale supercomputers
The roadmap is ambitious. AMD claims MI450 will outperform all Nvidia products including Rubin Ultra. That’s marketing—but it signals confidence in the trajectory. The bigger question: Can AMD execute on schedule while Nvidia continues accelerating its own cadence?
Key Catalysts to Watch
Q4 2025 / FY 2025 Earnings (February 3, 2026): Watch for data center revenue trajectory, MI350 ramp commentary, and 2026 guidance incorporating OpenAI deal visibility.
MI450 Production (H2 2026): The OpenAI deal depends on MI450 delivering at scale. Any slippage affects billions in expected revenue and damages the credibility AMD has built.
ROCm Ecosystem: Software remains AMD’s weakest link. Watch for developer adoption metrics, enterprise deployment ease, and major model framework support.
Hyperscaler Diversification: Oracle, Microsoft Azure, and Meta have committed to AMD deployments. Additional hyperscaler announcements would validate the multi-vendor AI infrastructure thesis.
Wall Street’s AMD Outlook
Analyst sentiment remains bullish with significant price target dispersion:
- UBS (Timothy Arcuri): $210 price target, Buy rating
- Benchmark: $170 price target, Buy rating
- Consensus: ~$185 average target, majority Buy ratings
- Bull Case: MI350/MI450 execution, ROCm adoption, $300+ targets by 2030
- Bear Case: Nvidia’s ecosystem moat, execution risk, custom silicon threats
AMD trades at ~42x forward P/E versus Nvidia’s ~29x, reflecting higher growth expectations but also higher execution risk. The stock’s 52-week range ($76.48-$267.08) demonstrates extreme volatility around AI narrative shifts.
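A quick sketch of what that multiple gap implies, using the figures quoted above (the implied EPS is a derived approximation, not a published consensus estimate):

```python
# What the forward-multiple gap implies, using the figures quoted above.
amd_price, amd_forward_pe = 208.0, 42.0
nvda_forward_pe = 29.0

implied_forward_eps = amd_price / amd_forward_pe                # ~$4.95 of expected EPS
price_at_nvda_multiple = implied_forward_eps * nvda_forward_pe  # ~$144
premium = amd_forward_pe / nvda_forward_pe - 1                  # ~45%

print(f"Implied forward EPS: ${implied_forward_eps:.2f}")
print(f"AMD price at Nvidia's ~29x multiple: ${price_at_nvda_multiple:.0f}")
print(f"Premium per dollar of expected earnings: {premium:.0%}")
```

Put differently, investors are paying roughly 45% more per dollar of expected earnings for AMD than for Nvidia, a premium that only holds up if AMD grows meaningfully faster.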
Technical Analysis: Support and Resistance
AMD reached an all-time high of $267.08 on October 29, 2025, before correcting to current levels around $208. Key technical levels:
- Support: $200 (psychological), $190 (prior resistance turned support)
- Resistance: $220 (recent range high), $267 (all-time high)
- 52-Week Range: $76.48 – $267.08
- Beta: 1.70 (amplified sensitivity to market swings)
The stock pulled back ~22% from October highs as investors digested the OpenAI deal structure and broader tech sector rotation. Volume patterns suggest accumulation near $200 support.
Related Stocks to Watch
- NVDA (Nvidia) – AMD’s primary competitor, AI chip market leader
- MSFT (Microsoft) – Major MI300X customer via Azure
- META (Meta) – Uses MI300X exclusively for Llama 405B inference
- GOOGL (Alphabet) – Custom TPU competitor, potential AMD hyperscaler customer
Track AMD with BusinessTech Context
Bookmark this page to monitor AMD alongside BusinessTech.News coverage of AI infrastructure, semiconductor competition, and data center spending trends. When MI350 benchmarks drop, when OpenAI deployment milestones hit, when quarterly earnings reveal GPU revenue trajectory—watch the chart above to see how markets price the company challenging Nvidia’s AI dominance.
AMD represents the market’s best hope for AI chip competition. Whether that hope translates to sustainable market share gains or remains perpetually “closing the gap” will determine whether today’s valuation looks cheap or expensive in hindsight. The OpenAI deal proves AMD belongs in the conversation. Execution over the next 24 months determines whether it wins the argument.