Update log
- 2022-09-28: personal workstation
- 2024-07-13: start of server cluster
Personal workstation
Hardware
- GPU: NVIDIA RTX 3090
    - We will design the build to accommodate a second graphics card, which will let us run two experiments in parallel and test multi-GPU code locally (see the sketch after this parts list).
- CPU: AMD Ryzen 7 5800X
- Motherboard: ASUS ROG Strix X570-E Gaming WiFi II
- CPU Cooler: NZXT Kraken X53 240mm
- Power Supply: EVGA Supernova 1600 G+ (80+ Gold)
- RAM: Corsair Vengeance LPX DDR4 3200 (4x32GB)
- Storage (SSD): Silicon Power P34A80 M.2 NVMe SSD (2TB)
- Case: Corsair Crystal Series 680X
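Once a second card is installed, it is worth confirming that both GPUs are actually visible before relying on them for parallel experiments. The sketch below is a minimal, hypothetical check assuming a PyTorch environment; the `train.py` entry point and config names are placeholders, not part of this build.

```python
# Minimal check that both GPUs are visible to PyTorch (assumes torch is installed).
import torch

print(f"CUDA available: {torch.cuda.is_available()}")
for i in range(torch.cuda.device_count()):
    print(f"cuda:{i} -> {torch.cuda.get_device_name(i)}")

# To run two experiments in parallel, pin each process to one card, e.g.:
#   CUDA_VISIBLE_DEVICES=0 python train.py --config exp_a.yaml
#   CUDA_VISIBLE_DEVICES=1 python train.py --config exp_b.yaml
# (train.py and the config files stand in for your own training script.)
```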
Justifications
- Power supply
    - Using the power supply calculator provided by Newegg, we need an estimated 1200 to 1500 W power supply. This makes sense: each RTX 3090 can draw up to 400 W under load, and the rest of the system will draw around another 350 W. Going with a 1600 W PSU is a safe bet (a worked version of this estimate follows this list).
- Motherboard
    - The PCIe slots on the ASUS X570-E support two 3-slot-wide graphics cards, which means the motherboard can hold two RTX 3090s. However, if you are going to run dual GPUs, Founders Edition graphics cards with blower-style fans are going to be necessary, since the space between the cards is tight.
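As a sanity check on the wattage figures above, the estimate works out as follows; the numbers are the rough assumptions from the bullet, not measured draw.

```python
# Back-of-the-envelope power budget using the assumed figures above.
GPU_PEAK_W = 400        # per RTX 3090 under load (rough upper bound)
NUM_GPUS = 2            # planned dual-GPU configuration
REST_OF_SYSTEM_W = 350  # CPU, motherboard, RAM, storage, fans

peak_draw_w = NUM_GPUS * GPU_PEAK_W + REST_OF_SYSTEM_W
psu_margin_w = 1600 - peak_draw_w

print(f"Estimated peak draw: {peak_draw_w} W")      # 1150 W
print(f"Margin on a 1600 W PSU: {psu_margin_w} W")  # 450 W
```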
Server cluster
- GPU: ASUS Phoenix NVIDIA GeForce RTX 3060 V2
- CPU: Intel i7-12700K LGA1700
- Motherboard: MSI MAG Z690 Tomahawk WiFi DDR4
- CPU Cooler: Noctua NH-D9L
- Power Supply: EVGA Supernova 1000 G7 (80+ Gold)
- RAM: Corsair Vengeance LPX DDR4 3200 (2x32GB)
- Storage (SSD)
    - Crucial MX500 2TB 3D NAND SATA 2.5 Inch Internal SSD
    - Kingston NV2 1TB M.2 2280 NVMe Internal SSD
- Case: Rosewill RSV-L4500U Server Case
Specs to know
GPUs
Setup process
1. PCIe x16 vs. x8 with 4x Titan V GPUs for Machine Learning