Could you add a build for torch29-cxx11-cu130-x86_64-linux?
Thanks a lot!
Should be added, give it a try.
```
  File "/root/.cache/huggingface/hub/models--varunneal--flash-attention-3/snapshots/add01af002563fdeff03a8e5fb77ce497d202055/build/torch29-cxx11-cu130-x86_64-linux/flash_attention_3/__init__.py", line 1, in <module>
    from .flash_attn_interface import *
  File "/root/.cache/huggingface/hub/models--varunneal--flash-attention-3/snapshots/add01af002563fdeff03a8e5fb77ce497d202055/build/torch29-cxx11-cu130-x86_64-linux/flash_attention_3/flash_attn_interface.py", line 10, in <module>
    from . import _C  # Registers operators with PyTorch
    ^^^^^^^^^^^^^^^^
ImportError: libcudart.so.12: cannot open shared object file: No such file or directory
```
Fails on a CUDA 13 container :( The `libcudart.so.12` in the error suggests the extension was linked against the CUDA 12 runtime, which isn't present in the CUDA 13 image.
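For what it's worth, a quick way to confirm the mismatch is to ask the dynamic loader which CUDA runtime versions it can actually resolve (a diagnostic sketch for a Linux container; the exact soname versions present will depend on the image):

```shell
# List every libcudart the dynamic loader knows about. On a CUDA 13 image this
# typically shows only libcudart.so.13, which is why a cu12-linked _C extension
# fails with "libcudart.so.12: cannot open shared object file".
ldconfig -p | grep libcudart || echo "no libcudart registered with the loader"
```

If only `libcudart.so.13` shows up, the build really does need to be compiled (or relinked) against the CUDA 13 toolkit rather than CUDA 12.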
I'll try to take a look this week. Just respond in this thread if it takes me over a week or I forget about it.