This is the repository card of `kernels-community/gpt-oss-metal-kernels`, pushed to the Hugging Face Hub. It was built to be used with the `kernels` library. This card was automatically generated.
## How to use

```python
# Make sure `kernels` is installed: `pip install -U kernels`
from kernels import get_kernel

kernel_module = get_kernel("kernels-community/gpt-oss-metal-kernels")
f32_bf16w_matmul = kernel_module.f32_bf16w_matmul
f32_bf16w_matmul(...)
```
## Available functions

- `f32_bf16w_matmul`
- `bf16_f32_embeddings`
- `f32_bf16w_rmsnorm`
- `f32_bf16w_dense_matmul_qkv`
- `f32_bf16w_dense_matmul_attn_output`
- `f32_bf16w_dense_matmul_mlp_gate`
- `f32_rope`
- `f32_bf16w_matmul_qkv`
- `f32_sdpa`
- `f32_topk`
- `expert_routing_metadata`
- `f32_scatter`
- `f32_bf16w_matmul_add`
## Benchmarks

No benchmarks available yet.