CLI tool to check whether a model will run on your GPU without downloading it.

rictrlab/fitgpu

Check if a HuggingFace model will run on your GPU.

fitgpu takes a HuggingFace model ID and tells you whether the model's weights will fit in your GPU's available VRAM.

How?

  1. Fetches the model's file metadata from HuggingFace (weights are not downloaded)
  2. Sums up the sizes of all weight files (.safetensors / .bin)
  3. Queries your GPU's free VRAM using the NVIDIA driver
  4. Compares and shows the result
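
The steps above can be sketched in a few lines of Python. This is a minimal illustration of steps 2 and 4, not fitgpu's actual implementation; the file listing and the free-VRAM figure are hypothetical stand-ins for what the HuggingFace Hub API and the NVIDIA driver would report.

```python
# Sketch of fitgpu's core check. The file metadata and free-VRAM value
# below are hypothetical; the real tool queries the HuggingFace Hub and
# the NVIDIA driver for them.

WEIGHT_EXTENSIONS = (".safetensors", ".bin")

def weights_size_bytes(files: dict[str, int]) -> int:
    """Step 2: sum the sizes of all weight files, ignoring everything else."""
    return sum(size for name, size in files.items()
               if name.endswith(WEIGHT_EXTENSIONS))

def fits(weights_bytes: int, free_vram_bytes: int) -> bool:
    """Step 4: compare weights on disk against free VRAM."""
    return weights_bytes <= free_vram_bytes

# Hypothetical per-file metadata for a sharded model (name -> size in bytes).
files = {
    "model-00001-of-00002.safetensors": 4_650_000_000,
    "model-00002-of-00002.safetensors": 240_000_000,
    "config.json": 800,  # not a weight file, excluded from the sum
}
free_vram = 22_310_000_000  # hypothetical: 22.31 GB free

total = weights_size_bytes(files)
print(f"size: {total / 1e9:.2f} GB, fits: {fits(total, free_vram)}")
# → size: 4.89 GB, fits: True
```

Note that this is the same comparison the example output below reports: weights on disk versus free (not total) VRAM, since other processes may already hold part of the card's memory.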

Installation

pip install fitgpu

Usage

fitgpu <model_id> [--token TOKEN]
  • model_id — HuggingFace model ID (e.g. google/gemma-2-2b)
  • --token TOKEN — optional, HuggingFace API token for gated/private models

Public models

fitgpu google/gemma-2-2b

Gated models

fitgpu meta-llama/Llama-2-7b-hf --token hf_YOUR_TOKEN

Example

$ fitgpu google/gemma-2-2b
model : google/gemma-2-2b
size  : 4.89 GB (weights on disk)

GPU 0: NVIDIA RTX 4090
  VRAM : 24.00 GB total, 22.31 GB free
  result: fits
