Intel Announces Arc Pro B70 and Arc Pro B65 GPUs
throwaway270925
144 points
97 comments
March 26, 2026
Related Discussions
Found 5 related stories in 48.0ms across 3,471 title embeddings via pgvector HNSW
- AMD Announces the Ryzen 9 9950X3D2 coobird · 49 pts · March 26, 2026 · 56% similar
- Intel's make-or-break 18A process node debuts for data center with 288-core Xeon vanburen · 270 pts · March 03, 2026 · 54% similar
- Testing Apple's 2026 16-inch MacBook Pro, M5 Max, and its new performance cores rbanffy · 12 pts · March 09, 2026 · 53% similar
- Arm's Cortex X925: Reaching Desktop Performance ingve · 264 pts · March 03, 2026 · 51% similar
- $500 GPU outperforms Claude Sonnet on coding benchmarks yogthos · 142 pts · March 26, 2026 · 51% similar
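The "% similar" scores above presumably come from cosine similarity over title embeddings (pgvector's `<=>` operator returns cosine *distance*, so similarity is one minus that). A minimal NumPy sketch of the same ranking idea, with hypothetical 4-dimensional embeddings standing in for real ones:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d "title embeddings" (hypothetical; real embeddings have hundreds of dims).
query = np.array([0.9, 0.1, 0.3, 0.2])
candidates = {
    "story A": np.array([0.8, 0.2, 0.4, 0.1]),
    "story B": np.array([0.1, 0.9, 0.0, 0.5]),
}

# Rank candidate titles by similarity to the query title, most similar first.
ranked = sorted(candidates.items(),
                key=lambda kv: cosine_similarity(query, kv[1]),
                reverse=True)
for title, emb in ranked:
    print(f"{title}: {cosine_similarity(query, emb):.0%} similar")
```

In production the HNSW index does the same ranking approximately, without scanning all 3,471 embeddings.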
Discussion Highlights (16 comments)
genpfault
600 GB/s of memory bandwidth isn't anything to sneeze at. ~$1000 for the Pro B70, if Microcenter is to be believed: https://www.microcenter.com/product/709007/intel-arc-pro-b70... https://www.microcenter.com/product/708790/asrock-intel-arc-...
WarmWash
Wake me when they wake up and release a middling card with 128GB memory.
nickthegreek
Both have 32GB of VRAM. Could be a pretty compelling choice.
vessenes
Not sure why you'd want this over an Apple setup. The M4 Max has 545GB/s of memory bandwidth, and $2k gets you an entire Mac Studio with 48GB of RAM vs 32GB for the B70.
whalesalad
Anyone running an Arc card for desktop Linux who can comment on the experience? I've had smooth sailing with AMD GPUs but have never tried Intel.
DiabloD3
They fired the entire Arc team (a lot of the senior engineers have already updated their LinkedIn profiles to reflect new positions at AMD, Nvidia, and others) and laid off most of their Linux driver team, GPU and non-GPU alike, so, uh... WTF?
pjmlp
New cards in 2026, and targeting Vulkan 1.3?!
tbyehl
Where's the A310 / A40 successor? Gimme some SR-IOV in a slot-powered, single-width, low-profile card.
SkyeCA
32GB of VRAM for a decent price? I wonder if these will work well for VR, because VRAM is my current main issue.
jmward01
I think this shows a shift in model architecture. MoE and similar designs need more memory per unit of compute than one big dense model with a lot of layers and weights. I think this trend is likely to accelerate: building that trade-off into the hardware encourages even more experts, which deepens the trade-off, which encourages more experts still...
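The memory-vs-compute trade-off in the comment above can be made concrete with some toy parameter accounting (all numbers hypothetical, chosen only for illustration):

```python
# Toy comparison of a dense model vs a mixture-of-experts (MoE) model
# with the same per-token compute. Parameter counts are hypothetical.

n_experts = 8          # experts stored in memory
active_experts = 2     # experts actually used per token
expert_params = 3.5e9  # parameters per expert

moe_total = n_experts * expert_params        # must all fit in VRAM
moe_active = active_experts * expert_params  # compute done per token

# Memory required per unit of compute: the ratio a dense model would have is 1.
ratio = moe_total / moe_active
print(ratio)  # 4.0 with these numbers
```

With these made-up numbers the MoE model needs 4x the memory of a dense model doing the same per-token work, which is exactly the kind of workload a cheap 32GB card targets.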
mikelitoris
Too little too late, classic Intel
SmellTheGlove
Any idea if it'll be possible to mix these with nvidia cards? Adding 32GB to a single 3090 setup would be pretty nice.
kadoban
The last go-around they looked good on paper, and then Intel just didn't make any of them to sell. Announce all you want; if you don't ever ship anything I could buy, who gives a shit.
cmxch
Good to see that Intel learned to release product to more than just resellers. Now can we have a 64GB B70 that's available worldwide and not marketed to unicorns like the Maxsun B60 Dual model has been?
lostmsu
Nothing like Crossfire/SLI? Not possible to efficiently connect multiple cards for one large model?
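For inference, no SLI/Crossfire-style link is actually required: frameworks split a model's layers across cards and pass only the small activation tensor between them over PCIe. A toy NumPy sketch of that layer-split (pipeline-parallel) idea, with hypothetical shapes:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 4-layer MLP: each "layer" is a weight matrix applied with ReLU.
layers = [rng.standard_normal((8, 8)) * 0.1 for _ in range(4)]

def run_layers(x, layer_subset):
    # Run a contiguous slice of the model's layers.
    for w in layer_subset:
        x = np.maximum(x @ w, 0.0)  # ReLU activation
    return x

x = rng.standard_normal(8)

# Single-card baseline: all layers on one device.
full = run_layers(x, layers)

# Layer-split across two "cards": card 0 holds layers 0-1, card 1 holds 2-3.
# Only the 8-element activation vector crosses between cards, not the weights.
half = run_layers(run_layers(x, layers[:2]), layers[2:])

assert np.allclose(full, half)  # same result either way
```

Training or tensor-parallel inference is where fast inter-GPU links start to matter, since much more data moves between cards per step.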
thefounder
Why don't they make a GPU optimised for inference/batch jobs with 1 TB of RAM? Everyone wants to run the biggest models locally.