NVIDIA RTX 3090 cannot run Mixtral 8x7B (46.70B) in this configuration. The model requires 46.7GB of VRAM but only 24.0GB is available, a shortfall of 22.7GB.
Consider a more aggressive quantization (e.g., Q4_K_M or Q3_K_M) to reduce VRAM requirements, or upgrade to a GPU with more VRAM. Cloud GPU services such as RunPod or Vast.ai offer affordable options.
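As a rough guide to how quantization changes the requirement, VRAM for the weights scales with bits per weight: parameters × bits ÷ 8. The sketch below estimates totals for a few common llama.cpp quantization formats; the bits-per-weight values and the fixed overhead allowance are approximations, not exact figures for any specific build.

```python
def estimate_vram_gb(num_params: float, bits_per_weight: float,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight storage plus a fixed allowance for
    KV cache and activations (the 1.5GB overhead is an assumption)."""
    weight_gb = num_params * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

PARAMS = 46.7e9  # Mixtral 8x7B total parameter count

# Ballpark bits per weight for common formats (approximate values)
for name, bpw in [("FP16", 16.0), ("Q8_0", 8.5),
                  ("Q4_K_M", 4.85), ("Q3_K_M", 3.9)]:
    fits = "fits" if estimate_vram_gb(PARAMS, bpw) <= 24.0 else "does not fit"
    print(f"{name:7s} ~{estimate_vram_gb(PARAMS, bpw):5.1f} GB  ({fits} in 24GB)")
```

Note that even Q4_K_M lands near 30GB for a model this size, so on a single 24GB card the lowest quantizations, partial CPU offload, or a larger GPU are the realistic options.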