Communities:
x/LocalLLaMA
| # | Tweet | Community | Topic | Views ▼ | Ratio | Engagement | Posted |
|---|---|---|---|---|---|---|---|
| 1 | [image] Just some numbers so you don’t get misled<br>RTX 3090 (7 years old)<br>> 24GB VRAM<br>> Bandwidth: 936.2 GB/s<br>> Bi-directional NVLink: 112GB/s<br>RTX PRO 4000<br>> 24GB VRAM<br>> Bandwidth: 672 GB/s<br>> No bi-directional NVLink; need 32 PCIe Gen 5 lanes to pool 2 at 64GB/s | x/LocalLLaMA | Artificial Intelligence | 42.1K | 0.8x | 286 | Apr 6 |
| 2 | [image] How I’m connecting the DGX Spark cluster<br>> Mikrotik CRS804-4DDQ 1.6Tbps switch<br>> (4) 400G QSFP-DD to 2x 200G QSFP56<br>Each DGX Spark has a ConnectX-7 supporting 200Gbps<br>Each cable out of the switch goes into 2 DGX Sparks<br>This allows an 8x DGX Spark cluster at the full 1.6Tbps | x/LocalLLaMA | Artificial Intelligence | 33.0K | 0.8x | 214 | Feb 19 |
| 3 | [text] Most people think VRAM = model size, and that’s why their runs crash.<br>GPU memory math is complex, and so are the implications.<br>Here’s how it actually works in a nutshell ↓ | x/LocalLLaMA | Artificial Intelligence | 23.7K | 0.5x | 239 | Apr 3 |
| 4 | [image] which one of you is this? | x/LocalLLaMA | Artificial Intelligence | 12.1K | 0.2x | 235 | Apr 2 |
| 5 | [image] RTX PRO 6000 (96GB VRAM, ~$15K) GIVEAWAY FAQ<br>Q: Cost to enter? A: $0. Free.<br>Q: Do I have to register for GTC? A: Yes, virtual attendance is COMPLETELY FREE<br>Q: Where do I enter? A: Tap the link in my bio, there’s a clear button on the page<br>Q: How do I increase my chances? A: | x/LocalLLaMA | Artificial Intelligence | 9.5K | 0.2x | 89 | Mar 7 |
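Tweet #1’s interconnect numbers check out with simple arithmetic. A PCIe 5.0 lane signals at 32 GT/s with 128b/130b encoding, so a x16 link carries roughly 63 GB/s in each direction; pooling two cards over PCIe therefore needs 32 lanes total and still runs slower than the 3090’s 112 GB/s bi-directional NVLink. A minimal sketch of that back-of-envelope math (the PCIe constants are from the spec; the NVLink figure is the tweet’s):

```python
# Back-of-envelope inter-GPU link bandwidth for the numbers in tweet #1.
# PCIe 5.0: 32 GT/s per lane, 128b/130b encoding -> ~3.94 GB/s per lane,
# per direction. Not a measurement; protocol overhead lowers real throughput.

PCIE5_GBPS_PER_LANE = 32 * 128 / 130 / 8  # ~3.94 GB/s, one direction

def pcie_link_gbps(lanes: int) -> float:
    """Raw one-direction bandwidth of a PCIe 5.0 link with `lanes` lanes."""
    return lanes * PCIE5_GBPS_PER_LANE

nvlink_3090 = 112.0            # GB/s bi-directional NVLink (from the tweet)
pcie_x16 = pcie_link_gbps(16)  # ~63 GB/s per card; two cards need 32 lanes

print(f"PCIe 5.0 x16: {pcie_x16:.0f} GB/s per direction")
print(f"3090 NVLink is {nvlink_3090 / pcie_x16:.1f}x the PCIe x16 figure")
```

This is why the tweet frames the older card favorably for multi-GPU setups: the bottleneck for tensor-parallel inference is the peer-to-peer link, not the card’s age.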
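Tweet #3’s point, that VRAM ≠ model size, comes down to two dominant terms: the weights themselves and the KV cache, which grows with context length and batch size. A rule-of-thumb sketch (the 7B model shape below is an illustrative assumption, not any specific model; real usage adds activations, framework overhead, and fragmentation):

```python
# Rough GPU memory estimate for serving an LLM: weights + KV cache.
# Rule of thumb only; activations and allocator overhead come on top.

def weights_gb(n_params_b: float, bytes_per_param: float) -> float:
    """Memory for model weights, in GB (n_params_b = billions of params)."""
    return n_params_b * 1e9 * bytes_per_param / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                seq_len: int, batch: int, bytes_per_elem: float) -> float:
    """KV cache: two tensors (K and V) per layer, per cached token."""
    return (2 * n_layers * n_kv_heads * head_dim
            * seq_len * batch * bytes_per_elem) / 1e9

# Hypothetical 7B model with a Llama-like shape: 32 layers, 8 KV heads
# (grouped-query attention), head_dim 128, FP16 weights and cache,
# 8K context, batch 1.
w = weights_gb(7.0, 2)
kv = kv_cache_gb(32, 8, 128, 8192, 1, 2)
print(f"weights ~{w:.1f} GB, KV cache ~{kv:.2f} GB, total ~{w + kv:.1f} GB")
```

With these assumed shapes the weights alone are ~14 GB, so a 24GB card fits the model, but long contexts or large batches push the KV cache term up until the run crashes, which is exactly the misconception the tweet is calling out.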