
AMD Instinct MI100 GPU

Integrated Memory (VRAM)
Capacity
32 GB (HBM2, 4096-bit)
Bandwidth
1200 GB/s
171 token/s
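The 171 token/s figure reads like a bandwidth-bound decode estimate: in single-batch LLM inference each generated token streams roughly all of the model weights from VRAM once, so throughput is about memory bandwidth divided by the weight footprint. The sketch below reproduces the figure under the assumption of a model whose weights occupy about 7 GB (for example, a 7B-parameter model at 8-bit); that model size is an assumption, not something stated on this page.

# Rough bandwidth-bound decode estimate (assumes single-batch inference where
# every token streams all model weights from VRAM; all other overheads ignored).
BANDWIDTH_GB_S = 1200     # MI100 memory bandwidth from the table above, GB/s
MODEL_WEIGHTS_GB = 7.0    # assumed weight footprint, e.g. ~7B parameters at 8-bit

tokens_per_second = BANDWIDTH_GB_S / MODEL_WEIGHTS_GB
print(f"~{tokens_per_second:.0f} token/s")   # prints ~171 token/s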

Vector Compute
FP64: 11.54 TFLOPS
FP32: 23.07 TFLOPS
FP16: X
BF16: X
INT32: —
INT8: X

AMD Instinct MI100 GPU General-Purpose Floating-Point Performance (Vector Performance / Scalar Performance)

FP64: 11.54 TFLOPS

FP32: 23.07 TFLOPS
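These vector numbers follow directly from the core count and boost clock listed under Hardware Specs below: each of the 7680 stream processors can retire one FP32 fused multiply-add (2 FLOPs) per clock, and FP64 runs at half the FP32 rate on MI100. A quick arithmetic check:

# Peak vector throughput check, using the core count and boost clock from this page.
CORES = 7680              # general-purpose (vector) cores
BOOST_GHZ = 1.502         # top of the 1000 ~ 1502 MHz range, in GHz

fp32_tflops = CORES * BOOST_GHZ * 2 / 1000   # 1 FMA per core per clock = 2 FLOPs
fp64_tflops = fp32_tflops / 2                # FP64 at half the FP32 rate
print(f"FP32 ~{fp32_tflops:.2f} TFLOPS, FP64 ~{fp64_tflops:.2f} TFLOPS")
# FP32 ~23.07 TFLOPS, FP64 ~11.54 TFLOPS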

Matrix Compute
FP64: X
FP32: 46.10 TFLOPS
FP16: 184.60 TFLOPS
FP8: X
TF32: X
BF16: 92.30 TFLOPS
INT16: X
INT8: 184.60 TOPS
INT4: X

AMD Instinct MI100 GPU AI performance (Tensor Performance / Matrix Performance)

FP32: 46.10 TFLOPS

FP16: 184.60 TFLOPS

BF16: 92.30 TFLOPS

INT8: 184.60 TOPS
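The matrix figures are consistent with MI100's 120 compute units (7680 cores / 64), each containing 4 matrix cores and running at the 1502 MHz boost clock. The per-CU per-clock rates below (1024 FP16 ops, 512 BF16 ops, 256 FP32 ops, 1024 INT8 ops) are taken from AMD's CDNA architecture material; treat this as a cross-check rather than an official derivation.

# Peak matrix (MFMA) throughput check: 120 CUs at 1502 MHz boost.
CUS = 7680 // 64          # 120 compute units, 4 matrix cores each
BOOST_GHZ = 1.502

ops_per_cu_per_clock = {  # matrix ops per compute unit per clock (CDNA figures)
    "FP32": 256,
    "FP16": 1024,
    "BF16": 512,
    "INT8": 1024,
}
for dtype, ops in ops_per_cu_per_clock.items():
    peak = CUS * ops * BOOST_GHZ / 1000   # TFLOPS (TOPS for INT8)
    print(f"{dtype}: ~{peak:.1f}")
# FP32 ~46.1, FP16 ~184.6, BF16 ~92.3, INT8 ~184.6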

Hardware Specs
The AMD Instinct MI100 is a 7 nm GPU launched by AMD in 2020. It has 32 GB of on-package HBM2 memory with bandwidth up to 1200 GB/s, 7680 general-purpose ALUs (the equivalent of CUDA/shader cores), and 480 matrix cores (the equivalent of Tensor cores).
Process Node
7 nm
Launch Year
2020

Vector (CUDA) Cores
7680
Matrix (Tensor) Cores
480
Core Frequency
1000 ~ 1502 MHz
Cache
8 MB
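On a machine that actually has an MI100, a few of the figures above (device name, memory capacity, compute-unit count) can be read back at runtime. A minimal sketch, assuming a ROCm build of PyTorch with the MI100 visible as device 0:

# Read back basic device properties; assumes a ROCm build of PyTorch
# and that the MI100 is device 0.
import torch

props = torch.cuda.get_device_properties(0)  # torch.cuda maps to ROCm/HIP on AMD
print(props.name)                            # e.g. "AMD Instinct MI100"
print(round(props.total_memory / 1024**3), "GiB VRAM")    # ~32 GiB HBM2
print(props.multi_processor_count, "compute units")       # 120 CUs (7680 / 64)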
