computers for running artificial intelligence
Introduction
- practically any computer can run cloud-based AI applications: the computation is done on servers on the internet, and your computer merely sends requests and receives the responses
- if, however, you wish to generate AI outputs on your own computer, you will need some high-end gear; depending on the demands of what you wish to output, this generally requires a high-end desktop computer, although some high-end laptops may suffice for lesser tasks
- AI calculations are currently done on either the CPU or the GPU - but not both at the same time (as yet)
- currently most AI is performed on the GPU, as this is usually around 10x faster than even an Intel i9 CPU for the vector operations involved; this means video RAM on the GPU (VRAM) becomes a critical bottleneck for your work - a bare minimum of 8GB is needed for lesser tasks, while at least 24-128GB is needed for training models, which becomes expensive and requires much larger power supplies of 600-1600W
- gaming laptop
- eg. the Asus ProArt Studiobook 16 has 64GB RAM, an Intel i9 CPU, an nVidia GeForce RTX 4070 GPU (8GB 128-bit GDDR6, 130W, 4608 CUDA cores, compute capability 8.9), a 2TB SSD, and costs ~$AU5,500 in 2023
- option of adding external GPUs to laptops
- these are not cheap, your benefits will be limited by the bandwidth of the Thunderbolt USB-C port, and this assumes your computer is compatible with eGFX (external graphics card) technologies
- the external GPU boxes are not cheap even without GPU cards, as they need a decent power supply (eg. 600W) and PCIe slots for GPU cards (preferably 3 full-sized slots compatible with high-end nVidia cards - not all allow this)
- high-end GPU cards are not cheap (2023 prices):
- an nVidia RTX A4500 20GB 200W GPU will cost around $AU2,500
- an nVidia RTX A5000 24GB 230W GPU will cost around $AU3,800
- an nVidia RTX A6000 48GB 300W GPU will cost around $AU8,000
- for data centres, an nVidia A100 80GB HBM2e (~2TB/s bandwidth) 300W GPU will cost around $AU26,000
- AI training desktop computer
- a computer with 2 x nVidia 48GB GPUs (96GB VRAM total) is likely to cost at least $AU22,000
- companies may wish to purchase GPU servers
- eg. a 10Gbps LAN-networked GPU server may have up to 8 x nVidia 80GB A100 or even H100 GPUs (the US has restricted overseas sales of H100 GPUs), up to 8TB of system RAM, dual CPUs, and a 2000-8000W power supply
- future technologies will develop:
- specialised AI-training-optimised computer chips - these are likely to be analog-based
- allow combining RAM and VRAM resources
- quantum computers for the large models
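The VRAM figures above follow from simple arithmetic: the weights alone take a fixed number of bytes per parameter, depending on numeric precision. A minimal sketch (the bytes-per-parameter values are standard for these formats; real usage adds activations, caches, and framework overhead on top):

```python
# Rough VRAM needed just to hold a model's weights, by numeric precision.
# Illustrative only: actual usage adds activations, KV cache, and overhead.

BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_vram_gb(n_params: float, precision: str) -> float:
    """Gigabytes needed to store n_params weights at the given precision."""
    return n_params * BYTES_PER_PARAM[precision] / 1e9

for prec in BYTES_PER_PARAM:
    print(f"7B model at {prec}: {weight_vram_gb(7e9, prec):.1f} GB")
# fp32: 28.0 GB, fp16: 14.0 GB, int8: 7.0 GB, int4: 3.5 GB
```

This is why a 7B model is comfortable on a 12GB gaming GPU only once quantised, and why training-scale work pushes into the 24-128GB range.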
Image generation
- a high-end “gaming” laptop may suffice if it has 32-64GB RAM, a fast CPU, and a CUDA-compatible nVidia GPU with at least 6-8GB of VRAM (12GB is better)
Training LoRAs for fine tuning image generation
- this will require a GPU with at least 18-24GB VRAM, hence a laptop is unlikely to cut it
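Note that LoRA training is VRAM-hungry despite training very few parameters: the frozen base model and activations still have to sit in VRAM. A hedged sketch of the LoRA parameter count (the hidden size, layer count, and rank here are illustrative assumptions for a 7B-class transformer, not any specific model):

```python
# Hedged sketch: counting LoRA trainable parameters for a hypothetical
# 7B-class transformer (dimensions and rank are illustrative assumptions).

def lora_params(hidden: int, rank: int, layers: int, matrices_per_layer: int) -> int:
    """LoRA adds two low-rank matrices, A (hidden x rank) and B (rank x hidden),
    per adapted weight matrix, so each contributes rank * 2 * hidden params."""
    per_matrix = rank * 2 * hidden
    return per_matrix * matrices_per_layer * layers

# e.g. rank-16 adapters on the q and v projections of a 32-layer, 4096-wide model
trainable = lora_params(hidden=4096, rank=16, layers=32, matrices_per_layer=2)
print(f"{trainable / 1e6:.1f}M trainable params")  # ~8.4M, vs ~7,000M frozen
```

The trainable adapters are tiny; the 18-24GB requirement comes from holding the frozen weights plus training activations alongside them.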
Running a small 5GB 7B LLM
- a high-end “gaming” laptop may suffice if it has 32-64GB RAM, a fast CPU, and a CUDA-compatible nVidia GPU with at least 6-8GB of VRAM (12GB is better)
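Why a "7B" model fits in a ~5GB file: quantisation. A minimal sketch, assuming a ~5.5 bits-per-weight average (a rough figure for common quantisation schemes, not an exact spec):

```python
# Model file size from parameter count and bits per weight.
# The 5.5-bit figure is an assumed average for a mid-range quantisation scheme.

def model_file_gb(n_params: float, bits_per_weight: float) -> float:
    return n_params * bits_per_weight / 8 / 1e9

print(f"{model_file_gb(7e9, 16):.1f} GB at fp16")                    # 14.0 GB
print(f"{model_file_gb(7e9, 5.5):.1f} GB at ~5.5-bit quantisation")  # ~4.8 GB
```

So a quantised 7B model lands near the 5GB mark and leaves headroom on an 8-12GB GPU, while the unquantised fp16 version would not fit.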
Running a small 5GB 7B LLM fine-tuned to 15GB
- this will require a GPU with at least 18-24GB VRAM, hence a laptop is unlikely to cut it
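The ~15GB figure is roughly what a 7B model takes at fp16, and inference adds a KV cache on top, which is why 18-24GB of VRAM is needed. A sketch, with model dimensions as illustrative assumptions for a 7B-class transformer:

```python
# Rough inference footprint for a 7B model held at fp16 (no quantisation).
# Layer count, context length, and hidden size are illustrative assumptions.

def kv_cache_gb(layers: int, context: int, hidden: int, bytes_per_val: int = 2) -> float:
    """KV cache stores one key and one value vector per layer per token."""
    return 2 * layers * context * hidden * bytes_per_val / 1e9

weights_gb = 7e9 * 2 / 1e9                                     # fp16 weights: 14.0 GB
cache_gb = kv_cache_gb(layers=32, context=4096, hidden=4096)   # ~2.1 GB at batch 1
print(f"~{weights_gb + cache_gb:.1f} GB before activations and overhead")
```

That lands around 16GB before activations and framework overhead, comfortably above a 12GB card but within a 24GB one.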
Fine-tuning a small 5GB 7B LLM
- this will require a GPU with at least 48GB VRAM; even most desktops will not suffice
- an AI training desktop computer with 2 x nVidia 48GB GPUs (96GB VRAM total) is likely to cost at least $AU22,000
- NB. it may take 8 hours on 8 x high-end nVidia A100 GPUs rented in the cloud
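The reason full fine-tuning needs so much more VRAM than inference is optimizer state. A hedged sketch of the common mixed-precision AdamW accounting (~16 bytes per parameter, before activations; this is a widely used rule of thumb, not a measurement of any specific setup):

```python
# Memory per parameter for full fine-tuning with mixed-precision AdamW.
# Standard accounting: weights + gradients + fp32 master copy + two Adam moments.

BYTES_PER_PARAM = {
    "fp16 weights": 2,
    "fp16 gradients": 2,
    "fp32 master weights": 4,
    "Adam momentum (fp32)": 4,
    "Adam variance (fp32)": 4,
}  # = 16 bytes/param total

n_params = 7e9
total_gb = sum(BYTES_PER_PARAM.values()) * n_params / 1e9
print(f"~{total_gb:.0f} GB of weight/optimizer state for a 7B model")  # ~112 GB
```

At ~112GB before activations, even a 2 x 48GB desktop is marginal and needs memory-saving tricks (sharding, offloading, or gradient checkpointing), which is why cloud multi-GPU rental is the practical route.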
Training a LLM
- this typically uses some 10,000 GPUs at the same time
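The scale follows from a back-of-envelope estimate using the common ~6 x N x D rule of thumb for training compute (N parameters, D training tokens); the model size, token count, and 40% utilisation below are illustrative assumptions:

```python
# Why pre-training needs thousands of GPUs: the ~6*N*D training-FLOPs
# rule of thumb, with illustrative model/token/utilisation assumptions.

def training_flops(n_params: float, n_tokens: float) -> float:
    return 6 * n_params * n_tokens

def gpu_days(flops: float, gpu_flops_per_s: float, utilisation: float = 0.4) -> float:
    return flops / (gpu_flops_per_s * utilisation) / 86400

# e.g. a 70B model on 2 trillion tokens, on A100s (~312 TFLOP/s bf16 peak)
flops = training_flops(70e9, 2e12)
days = gpu_days(flops, 312e12)
print(f"{flops:.2e} FLOPs ≈ {days:,.0f} A100-days at 40% utilisation")  # ~78,000
```

Roughly 78,000 A100-days: spread across ~10,000 GPUs running in parallel, that is on the order of a week of wall-clock time.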
it/ai_computers.txt · Last modified: 2023/12/19 00:32 by gary1