Can I run SD 2.1 on NVIDIA H100 PCIe?

Compatibility: Perfect
Yes, you can run this model!
GPU VRAM: 80.0GB
Required: 5.0GB
Headroom: +75.0GB

VRAM Usage

6% used (5.0GB of 80.0GB)

Performance Estimate

Tokens/sec: ~117.0
Batch size: 32

Technical Analysis

NVIDIA H100 PCIe provides excellent compatibility with SD 2.1. With 80.0GB of VRAM and only 5.0GB required, you have 75.0GB of headroom for comfortable inference. This leaves ample room for larger batch sizes, higher output resolutions, and smooth operation.
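
As a quick sanity check on those figures, here is a minimal sketch (plain Python, with the page's VRAM numbers hard-coded as assumptions) showing how the headroom and utilization values follow from the two quoted quantities:

# Headroom and utilization from the figures quoted above (assumed values).
gpu_vram_gb = 80.0    # NVIDIA H100 PCIe total VRAM
required_gb = 5.0     # estimated SD 2.1 (0.87B) inference footprint
headroom_gb = gpu_vram_gb - required_gb              # 75.0 GB
utilization_pct = required_gb / gpu_vram_gb * 100.0  # 6.25%, displayed as "6% used"
print(f"Headroom: +{headroom_gb:.1f}GB, utilization: {utilization_pct:.0f}%")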

Recommendation

You can run SD 2.1 on NVIDIA H100 PCIe without any compromises. Consider using larger batch sizes and the full 77-token prompt length for optimal throughput.

Recommended Settings

Batch size: 32
Context length: 77 tokens (CLIP text encoder limit)
Inference framework: Hugging Face Diffusers (PyTorch)
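
As a starting point, here is a minimal sketch of batched SD 2.1 inference with the settings above. It assumes the Hugging Face Diffusers library and the stabilityai/stable-diffusion-2-1 checkpoint (neither is specified on this page), loaded in fp16 to stay near the ~5.0GB footprint estimate:

import torch
from diffusers import StableDiffusionPipeline

# Load SD 2.1 in half precision; the checkpoint ID is an assumption,
# not something this page specifies.
pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,
).to("cuda")

# Batch size 32, as recommended above; prompts longer than the
# 77-token CLIP context are truncated by the pipeline.
prompts = ["a photograph of an astronaut riding a horse"] * 32
images = pipe(prompts, num_inference_steps=25).images
images[0].save("sample.png")

With 75.0GB of headroom, the same setup tolerates higher resolutions or more inference steps without running out of memory.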

Frequently Asked Questions

Can I run SD 2.1 on NVIDIA H100 PCIe?
NVIDIA H100 PCIe has 80.0GB of VRAM, which provides 75.0GB of headroom beyond the 5.0GB required by SD 2.1 (0.87B). This is ample room for comfortable inference, with headroom left over for larger batch sizes, higher output resolutions, and additional inference steps.
How much VRAM does SD 2.1 need?
SD 2.1 requires approximately 5.0GB of VRAM.
What performance can I expect?
Estimated 117 tokens per second.