| id | difficulty | nl | mlir | notes | dialect | source |
|---|---|---|---|---|---|---|
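The rows below form a regular sweep: two elementwise binary ops crossed with six element types and nine shapes, ordered op-slowest, shape-fastest. As a minimal sketch (not the dataset's actual generation script; all helper names here are hypothetical), such rows could be produced like this:

```python
from itertools import product

# Sweep axes as they appear in the table (assumed from the visible rows).
OPS = ["add", "subtract"]
DTYPES = ["f16", "f32", "f64", "i8", "i32", "i64"]
SHAPES = [[8], [16], [32], [64], [4, 4], [8, 8], [8, 16], [4, 4, 4], [2, 8, 8]]

def tensor_type(shape, dtype):
    """Render an MLIR tensor type, e.g. [2, 8, 8] + 'f16' -> 'tensor<2x8x8xf16>'."""
    return f"tensor<{'x'.join(str(d) for d in shape)}x{dtype}>"

def make_row(index, op, dtype, shape):
    """Build one (id, nl, mlir, notes) record in the sweep's naming scheme."""
    shape_str = "x".join(str(d) for d in shape)
    t = tensor_type(shape, dtype)
    row_id = f"{index:03d}_ew_bin_{index:03d}_{op}_{dtype}_{shape_str}"
    nl = (f"Write a function that applies stablehlo.{op} elementwise "
          f"to two {dtype} tensors of shape {shape_str}.")
    mlir = (
        "module {\n"
        f"  func.func @f(%a: {t}, %b: {t}) -> {t} {{\n"
        f"    %0 = stablehlo.{op} %a, %b : {t}\n"
        f"    return %0 : {t}\n"
        "  }\n"
        "}"
    )
    return row_id, nl, mlir, f"{op} {dtype} {shape_str}"

# Op varies slowest, dtype next, shape fastest -- matching the row order below.
rows = [make_row(i + 1, op, dt, sh)
        for i, (op, dt, sh) in enumerate(product(OPS, DTYPES, SHAPES))]
```

With these axes the sweep yields 108 rows (2 ops × 6 dtypes × 9 shapes), and row 55 rolls over from `add` to `subtract`, consistent with the ids in the table.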
001_ew_bin_001_add_f16_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf16>, %b: tensor<8xf16>) -> tensor<8xf16> {
%0 = stablehlo.add %a, %b : tensor<8xf16>
return %0 : tensor<8xf16>
}
} | add f16 8 | stablehlo+func | day_f8_wide_sweep |
002_ew_bin_002_add_f16_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf16>, %b: tensor<16xf16>) -> tensor<16xf16> {
%0 = stablehlo.add %a, %b : tensor<16xf16>
return %0 : tensor<16xf16>
}
} | add f16 16 | stablehlo+func | day_f8_wide_sweep |
003_ew_bin_003_add_f16_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf16>, %b: tensor<32xf16>) -> tensor<32xf16> {
%0 = stablehlo.add %a, %b : tensor<32xf16>
return %0 : tensor<32xf16>
}
} | add f16 32 | stablehlo+func | day_f8_wide_sweep |
004_ew_bin_004_add_f16_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf16>, %b: tensor<64xf16>) -> tensor<64xf16> {
%0 = stablehlo.add %a, %b : tensor<64xf16>
return %0 : tensor<64xf16>
}
} | add f16 64 | stablehlo+func | day_f8_wide_sweep |
005_ew_bin_005_add_f16_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf16>, %b: tensor<4x4xf16>) -> tensor<4x4xf16> {
%0 = stablehlo.add %a, %b : tensor<4x4xf16>
return %0 : tensor<4x4xf16>
}
} | add f16 4x4 | stablehlo+func | day_f8_wide_sweep |
006_ew_bin_006_add_f16_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf16>, %b: tensor<8x8xf16>) -> tensor<8x8xf16> {
%0 = stablehlo.add %a, %b : tensor<8x8xf16>
return %0 : tensor<8x8xf16>
}
} | add f16 8x8 | stablehlo+func | day_f8_wide_sweep |
007_ew_bin_007_add_f16_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf16>, %b: tensor<8x16xf16>) -> tensor<8x16xf16> {
%0 = stablehlo.add %a, %b : tensor<8x16xf16>
return %0 : tensor<8x16xf16>
}
} | add f16 8x16 | stablehlo+func | day_f8_wide_sweep |
008_ew_bin_008_add_f16_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf16>, %b: tensor<4x4x4xf16>) -> tensor<4x4x4xf16> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xf16>
return %0 : tensor<4x4x4xf16>
}
} | add f16 4x4x4 | stablehlo+func | day_f8_wide_sweep |
009_ew_bin_009_add_f16_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f16 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf16>, %b: tensor<2x8x8xf16>) -> tensor<2x8x8xf16> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xf16>
return %0 : tensor<2x8x8xf16>
}
} | add f16 2x8x8 | stablehlo+func | day_f8_wide_sweep |
010_ew_bin_010_add_f32_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf32>, %b: tensor<8xf32>) -> tensor<8xf32> {
%0 = stablehlo.add %a, %b : tensor<8xf32>
return %0 : tensor<8xf32>
}
} | add f32 8 | stablehlo+func | day_f8_wide_sweep |
011_ew_bin_011_add_f32_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf32>, %b: tensor<16xf32>) -> tensor<16xf32> {
%0 = stablehlo.add %a, %b : tensor<16xf32>
return %0 : tensor<16xf32>
}
} | add f32 16 | stablehlo+func | day_f8_wide_sweep |
012_ew_bin_012_add_f32_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf32>, %b: tensor<32xf32>) -> tensor<32xf32> {
%0 = stablehlo.add %a, %b : tensor<32xf32>
return %0 : tensor<32xf32>
}
} | add f32 32 | stablehlo+func | day_f8_wide_sweep |
013_ew_bin_013_add_f32_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf32>, %b: tensor<64xf32>) -> tensor<64xf32> {
%0 = stablehlo.add %a, %b : tensor<64xf32>
return %0 : tensor<64xf32>
}
} | add f32 64 | stablehlo+func | day_f8_wide_sweep |
014_ew_bin_014_add_f32_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf32>, %b: tensor<4x4xf32>) -> tensor<4x4xf32> {
%0 = stablehlo.add %a, %b : tensor<4x4xf32>
return %0 : tensor<4x4xf32>
}
} | add f32 4x4 | stablehlo+func | day_f8_wide_sweep |
015_ew_bin_015_add_f32_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf32>, %b: tensor<8x8xf32>) -> tensor<8x8xf32> {
%0 = stablehlo.add %a, %b : tensor<8x8xf32>
return %0 : tensor<8x8xf32>
}
} | add f32 8x8 | stablehlo+func | day_f8_wide_sweep |
016_ew_bin_016_add_f32_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf32>, %b: tensor<8x16xf32>) -> tensor<8x16xf32> {
%0 = stablehlo.add %a, %b : tensor<8x16xf32>
return %0 : tensor<8x16xf32>
}
} | add f32 8x16 | stablehlo+func | day_f8_wide_sweep |
017_ew_bin_017_add_f32_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf32>, %b: tensor<4x4x4xf32>) -> tensor<4x4x4xf32> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xf32>
return %0 : tensor<4x4x4xf32>
}
} | add f32 4x4x4 | stablehlo+func | day_f8_wide_sweep |
018_ew_bin_018_add_f32_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f32 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf32>, %b: tensor<2x8x8xf32>) -> tensor<2x8x8xf32> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xf32>
return %0 : tensor<2x8x8xf32>
}
} | add f32 2x8x8 | stablehlo+func | day_f8_wide_sweep |
019_ew_bin_019_add_f64_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf64>, %b: tensor<8xf64>) -> tensor<8xf64> {
%0 = stablehlo.add %a, %b : tensor<8xf64>
return %0 : tensor<8xf64>
}
} | add f64 8 | stablehlo+func | day_f8_wide_sweep |
020_ew_bin_020_add_f64_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf64>, %b: tensor<16xf64>) -> tensor<16xf64> {
%0 = stablehlo.add %a, %b : tensor<16xf64>
return %0 : tensor<16xf64>
}
} | add f64 16 | stablehlo+func | day_f8_wide_sweep |
021_ew_bin_021_add_f64_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf64>, %b: tensor<32xf64>) -> tensor<32xf64> {
%0 = stablehlo.add %a, %b : tensor<32xf64>
return %0 : tensor<32xf64>
}
} | add f64 32 | stablehlo+func | day_f8_wide_sweep |
022_ew_bin_022_add_f64_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf64>, %b: tensor<64xf64>) -> tensor<64xf64> {
%0 = stablehlo.add %a, %b : tensor<64xf64>
return %0 : tensor<64xf64>
}
} | add f64 64 | stablehlo+func | day_f8_wide_sweep |
023_ew_bin_023_add_f64_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf64>, %b: tensor<4x4xf64>) -> tensor<4x4xf64> {
%0 = stablehlo.add %a, %b : tensor<4x4xf64>
return %0 : tensor<4x4xf64>
}
} | add f64 4x4 | stablehlo+func | day_f8_wide_sweep |
024_ew_bin_024_add_f64_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf64>, %b: tensor<8x8xf64>) -> tensor<8x8xf64> {
%0 = stablehlo.add %a, %b : tensor<8x8xf64>
return %0 : tensor<8x8xf64>
}
} | add f64 8x8 | stablehlo+func | day_f8_wide_sweep |
025_ew_bin_025_add_f64_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf64>, %b: tensor<8x16xf64>) -> tensor<8x16xf64> {
%0 = stablehlo.add %a, %b : tensor<8x16xf64>
return %0 : tensor<8x16xf64>
}
} | add f64 8x16 | stablehlo+func | day_f8_wide_sweep |
026_ew_bin_026_add_f64_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf64>, %b: tensor<4x4x4xf64>) -> tensor<4x4x4xf64> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xf64>
return %0 : tensor<4x4x4xf64>
}
} | add f64 4x4x4 | stablehlo+func | day_f8_wide_sweep |
027_ew_bin_027_add_f64_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two f64 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf64>, %b: tensor<2x8x8xf64>) -> tensor<2x8x8xf64> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xf64>
return %0 : tensor<2x8x8xf64>
}
} | add f64 2x8x8 | stablehlo+func | day_f8_wide_sweep |
028_ew_bin_028_add_i8_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi8>, %b: tensor<8xi8>) -> tensor<8xi8> {
%0 = stablehlo.add %a, %b : tensor<8xi8>
return %0 : tensor<8xi8>
}
} | add i8 8 | stablehlo+func | day_f8_wide_sweep |
029_ew_bin_029_add_i8_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 16. | module {
func.func @f(%a: tensor<16xi8>, %b: tensor<16xi8>) -> tensor<16xi8> {
%0 = stablehlo.add %a, %b : tensor<16xi8>
return %0 : tensor<16xi8>
}
} | add i8 16 | stablehlo+func | day_f8_wide_sweep |
030_ew_bin_030_add_i8_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 32. | module {
func.func @f(%a: tensor<32xi8>, %b: tensor<32xi8>) -> tensor<32xi8> {
%0 = stablehlo.add %a, %b : tensor<32xi8>
return %0 : tensor<32xi8>
}
} | add i8 32 | stablehlo+func | day_f8_wide_sweep |
031_ew_bin_031_add_i8_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 64. | module {
func.func @f(%a: tensor<64xi8>, %b: tensor<64xi8>) -> tensor<64xi8> {
%0 = stablehlo.add %a, %b : tensor<64xi8>
return %0 : tensor<64xi8>
}
} | add i8 64 | stablehlo+func | day_f8_wide_sweep |
032_ew_bin_032_add_i8_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xi8>, %b: tensor<4x4xi8>) -> tensor<4x4xi8> {
%0 = stablehlo.add %a, %b : tensor<4x4xi8>
return %0 : tensor<4x4xi8>
}
} | add i8 4x4 | stablehlo+func | day_f8_wide_sweep |
033_ew_bin_033_add_i8_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xi8>, %b: tensor<8x8xi8>) -> tensor<8x8xi8> {
%0 = stablehlo.add %a, %b : tensor<8x8xi8>
return %0 : tensor<8x8xi8>
}
} | add i8 8x8 | stablehlo+func | day_f8_wide_sweep |
034_ew_bin_034_add_i8_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xi8>, %b: tensor<8x16xi8>) -> tensor<8x16xi8> {
%0 = stablehlo.add %a, %b : tensor<8x16xi8>
return %0 : tensor<8x16xi8>
}
} | add i8 8x16 | stablehlo+func | day_f8_wide_sweep |
035_ew_bin_035_add_i8_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xi8>, %b: tensor<4x4x4xi8>) -> tensor<4x4x4xi8> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xi8>
return %0 : tensor<4x4x4xi8>
}
} | add i8 4x4x4 | stablehlo+func | day_f8_wide_sweep |
036_ew_bin_036_add_i8_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i8 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xi8>, %b: tensor<2x8x8xi8>) -> tensor<2x8x8xi8> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xi8>
return %0 : tensor<2x8x8xi8>
}
} | add i8 2x8x8 | stablehlo+func | day_f8_wide_sweep |
037_ew_bin_037_add_i32_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi32>, %b: tensor<8xi32>) -> tensor<8xi32> {
%0 = stablehlo.add %a, %b : tensor<8xi32>
return %0 : tensor<8xi32>
}
} | add i32 8 | stablehlo+func | day_f8_wide_sweep |
038_ew_bin_038_add_i32_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 16. | module {
func.func @f(%a: tensor<16xi32>, %b: tensor<16xi32>) -> tensor<16xi32> {
%0 = stablehlo.add %a, %b : tensor<16xi32>
return %0 : tensor<16xi32>
}
} | add i32 16 | stablehlo+func | day_f8_wide_sweep |
039_ew_bin_039_add_i32_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 32. | module {
func.func @f(%a: tensor<32xi32>, %b: tensor<32xi32>) -> tensor<32xi32> {
%0 = stablehlo.add %a, %b : tensor<32xi32>
return %0 : tensor<32xi32>
}
} | add i32 32 | stablehlo+func | day_f8_wide_sweep |
040_ew_bin_040_add_i32_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 64. | module {
func.func @f(%a: tensor<64xi32>, %b: tensor<64xi32>) -> tensor<64xi32> {
%0 = stablehlo.add %a, %b : tensor<64xi32>
return %0 : tensor<64xi32>
}
} | add i32 64 | stablehlo+func | day_f8_wide_sweep |
041_ew_bin_041_add_i32_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xi32>, %b: tensor<4x4xi32>) -> tensor<4x4xi32> {
%0 = stablehlo.add %a, %b : tensor<4x4xi32>
return %0 : tensor<4x4xi32>
}
} | add i32 4x4 | stablehlo+func | day_f8_wide_sweep |
042_ew_bin_042_add_i32_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xi32>, %b: tensor<8x8xi32>) -> tensor<8x8xi32> {
%0 = stablehlo.add %a, %b : tensor<8x8xi32>
return %0 : tensor<8x8xi32>
}
} | add i32 8x8 | stablehlo+func | day_f8_wide_sweep |
043_ew_bin_043_add_i32_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xi32>, %b: tensor<8x16xi32>) -> tensor<8x16xi32> {
%0 = stablehlo.add %a, %b : tensor<8x16xi32>
return %0 : tensor<8x16xi32>
}
} | add i32 8x16 | stablehlo+func | day_f8_wide_sweep |
044_ew_bin_044_add_i32_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xi32>, %b: tensor<4x4x4xi32>) -> tensor<4x4x4xi32> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xi32>
return %0 : tensor<4x4x4xi32>
}
} | add i32 4x4x4 | stablehlo+func | day_f8_wide_sweep |
045_ew_bin_045_add_i32_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i32 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xi32>, %b: tensor<2x8x8xi32>) -> tensor<2x8x8xi32> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xi32>
return %0 : tensor<2x8x8xi32>
}
} | add i32 2x8x8 | stablehlo+func | day_f8_wide_sweep |
046_ew_bin_046_add_i64_8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi64>, %b: tensor<8xi64>) -> tensor<8xi64> {
%0 = stablehlo.add %a, %b : tensor<8xi64>
return %0 : tensor<8xi64>
}
} | add i64 8 | stablehlo+func | day_f8_wide_sweep |
047_ew_bin_047_add_i64_16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 16. | module {
func.func @f(%a: tensor<16xi64>, %b: tensor<16xi64>) -> tensor<16xi64> {
%0 = stablehlo.add %a, %b : tensor<16xi64>
return %0 : tensor<16xi64>
}
} | add i64 16 | stablehlo+func | day_f8_wide_sweep |
048_ew_bin_048_add_i64_32 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 32. | module {
func.func @f(%a: tensor<32xi64>, %b: tensor<32xi64>) -> tensor<32xi64> {
%0 = stablehlo.add %a, %b : tensor<32xi64>
return %0 : tensor<32xi64>
}
} | add i64 32 | stablehlo+func | day_f8_wide_sweep |
049_ew_bin_049_add_i64_64 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 64. | module {
func.func @f(%a: tensor<64xi64>, %b: tensor<64xi64>) -> tensor<64xi64> {
%0 = stablehlo.add %a, %b : tensor<64xi64>
return %0 : tensor<64xi64>
}
} | add i64 64 | stablehlo+func | day_f8_wide_sweep |
050_ew_bin_050_add_i64_4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xi64>, %b: tensor<4x4xi64>) -> tensor<4x4xi64> {
%0 = stablehlo.add %a, %b : tensor<4x4xi64>
return %0 : tensor<4x4xi64>
}
} | add i64 4x4 | stablehlo+func | day_f8_wide_sweep |
051_ew_bin_051_add_i64_8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xi64>, %b: tensor<8x8xi64>) -> tensor<8x8xi64> {
%0 = stablehlo.add %a, %b : tensor<8x8xi64>
return %0 : tensor<8x8xi64>
}
} | add i64 8x8 | stablehlo+func | day_f8_wide_sweep |
052_ew_bin_052_add_i64_8x16 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xi64>, %b: tensor<8x16xi64>) -> tensor<8x16xi64> {
%0 = stablehlo.add %a, %b : tensor<8x16xi64>
return %0 : tensor<8x16xi64>
}
} | add i64 8x16 | stablehlo+func | day_f8_wide_sweep |
053_ew_bin_053_add_i64_4x4x4 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xi64>, %b: tensor<4x4x4xi64>) -> tensor<4x4x4xi64> {
%0 = stablehlo.add %a, %b : tensor<4x4x4xi64>
return %0 : tensor<4x4x4xi64>
}
} | add i64 4x4x4 | stablehlo+func | day_f8_wide_sweep |
054_ew_bin_054_add_i64_2x8x8 | programmatic-wide | Write a function that applies stablehlo.add elementwise to two i64 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xi64>, %b: tensor<2x8x8xi64>) -> tensor<2x8x8xi64> {
%0 = stablehlo.add %a, %b : tensor<2x8x8xi64>
return %0 : tensor<2x8x8xi64>
}
} | add i64 2x8x8 | stablehlo+func | day_f8_wide_sweep |
055_ew_bin_055_subtract_f16_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf16>, %b: tensor<8xf16>) -> tensor<8xf16> {
%0 = stablehlo.subtract %a, %b : tensor<8xf16>
return %0 : tensor<8xf16>
}
} | subtract f16 8 | stablehlo+func | day_f8_wide_sweep |
056_ew_bin_056_subtract_f16_16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf16>, %b: tensor<16xf16>) -> tensor<16xf16> {
%0 = stablehlo.subtract %a, %b : tensor<16xf16>
return %0 : tensor<16xf16>
}
} | subtract f16 16 | stablehlo+func | day_f8_wide_sweep |
057_ew_bin_057_subtract_f16_32 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf16>, %b: tensor<32xf16>) -> tensor<32xf16> {
%0 = stablehlo.subtract %a, %b : tensor<32xf16>
return %0 : tensor<32xf16>
}
} | subtract f16 32 | stablehlo+func | day_f8_wide_sweep |
058_ew_bin_058_subtract_f16_64 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf16>, %b: tensor<64xf16>) -> tensor<64xf16> {
%0 = stablehlo.subtract %a, %b : tensor<64xf16>
return %0 : tensor<64xf16>
}
} | subtract f16 64 | stablehlo+func | day_f8_wide_sweep |
059_ew_bin_059_subtract_f16_4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf16>, %b: tensor<4x4xf16>) -> tensor<4x4xf16> {
%0 = stablehlo.subtract %a, %b : tensor<4x4xf16>
return %0 : tensor<4x4xf16>
}
} | subtract f16 4x4 | stablehlo+func | day_f8_wide_sweep |
060_ew_bin_060_subtract_f16_8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf16>, %b: tensor<8x8xf16>) -> tensor<8x8xf16> {
%0 = stablehlo.subtract %a, %b : tensor<8x8xf16>
return %0 : tensor<8x8xf16>
}
} | subtract f16 8x8 | stablehlo+func | day_f8_wide_sweep |
061_ew_bin_061_subtract_f16_8x16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf16>, %b: tensor<8x16xf16>) -> tensor<8x16xf16> {
%0 = stablehlo.subtract %a, %b : tensor<8x16xf16>
return %0 : tensor<8x16xf16>
}
} | subtract f16 8x16 | stablehlo+func | day_f8_wide_sweep |
062_ew_bin_062_subtract_f16_4x4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf16>, %b: tensor<4x4x4xf16>) -> tensor<4x4x4xf16> {
%0 = stablehlo.subtract %a, %b : tensor<4x4x4xf16>
return %0 : tensor<4x4x4xf16>
}
} | subtract f16 4x4x4 | stablehlo+func | day_f8_wide_sweep |
063_ew_bin_063_subtract_f16_2x8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f16 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf16>, %b: tensor<2x8x8xf16>) -> tensor<2x8x8xf16> {
%0 = stablehlo.subtract %a, %b : tensor<2x8x8xf16>
return %0 : tensor<2x8x8xf16>
}
} | subtract f16 2x8x8 | stablehlo+func | day_f8_wide_sweep |
064_ew_bin_064_subtract_f32_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf32>, %b: tensor<8xf32>) -> tensor<8xf32> {
%0 = stablehlo.subtract %a, %b : tensor<8xf32>
return %0 : tensor<8xf32>
}
} | subtract f32 8 | stablehlo+func | day_f8_wide_sweep |
065_ew_bin_065_subtract_f32_16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf32>, %b: tensor<16xf32>) -> tensor<16xf32> {
%0 = stablehlo.subtract %a, %b : tensor<16xf32>
return %0 : tensor<16xf32>
}
} | subtract f32 16 | stablehlo+func | day_f8_wide_sweep |
066_ew_bin_066_subtract_f32_32 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf32>, %b: tensor<32xf32>) -> tensor<32xf32> {
%0 = stablehlo.subtract %a, %b : tensor<32xf32>
return %0 : tensor<32xf32>
}
} | subtract f32 32 | stablehlo+func | day_f8_wide_sweep |
067_ew_bin_067_subtract_f32_64 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf32>, %b: tensor<64xf32>) -> tensor<64xf32> {
%0 = stablehlo.subtract %a, %b : tensor<64xf32>
return %0 : tensor<64xf32>
}
} | subtract f32 64 | stablehlo+func | day_f8_wide_sweep |
068_ew_bin_068_subtract_f32_4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf32>, %b: tensor<4x4xf32>) -> tensor<4x4xf32> {
%0 = stablehlo.subtract %a, %b : tensor<4x4xf32>
return %0 : tensor<4x4xf32>
}
} | subtract f32 4x4 | stablehlo+func | day_f8_wide_sweep |
069_ew_bin_069_subtract_f32_8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf32>, %b: tensor<8x8xf32>) -> tensor<8x8xf32> {
%0 = stablehlo.subtract %a, %b : tensor<8x8xf32>
return %0 : tensor<8x8xf32>
}
} | subtract f32 8x8 | stablehlo+func | day_f8_wide_sweep |
070_ew_bin_070_subtract_f32_8x16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf32>, %b: tensor<8x16xf32>) -> tensor<8x16xf32> {
%0 = stablehlo.subtract %a, %b : tensor<8x16xf32>
return %0 : tensor<8x16xf32>
}
} | subtract f32 8x16 | stablehlo+func | day_f8_wide_sweep |
071_ew_bin_071_subtract_f32_4x4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf32>, %b: tensor<4x4x4xf32>) -> tensor<4x4x4xf32> {
%0 = stablehlo.subtract %a, %b : tensor<4x4x4xf32>
return %0 : tensor<4x4x4xf32>
}
} | subtract f32 4x4x4 | stablehlo+func | day_f8_wide_sweep |
072_ew_bin_072_subtract_f32_2x8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f32 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf32>, %b: tensor<2x8x8xf32>) -> tensor<2x8x8xf32> {
%0 = stablehlo.subtract %a, %b : tensor<2x8x8xf32>
return %0 : tensor<2x8x8xf32>
}
} | subtract f32 2x8x8 | stablehlo+func | day_f8_wide_sweep |
073_ew_bin_073_subtract_f64_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 8. | module {
func.func @f(%a: tensor<8xf64>, %b: tensor<8xf64>) -> tensor<8xf64> {
%0 = stablehlo.subtract %a, %b : tensor<8xf64>
return %0 : tensor<8xf64>
}
} | subtract f64 8 | stablehlo+func | day_f8_wide_sweep |
074_ew_bin_074_subtract_f64_16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 16. | module {
func.func @f(%a: tensor<16xf64>, %b: tensor<16xf64>) -> tensor<16xf64> {
%0 = stablehlo.subtract %a, %b : tensor<16xf64>
return %0 : tensor<16xf64>
}
} | subtract f64 16 | stablehlo+func | day_f8_wide_sweep |
075_ew_bin_075_subtract_f64_32 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 32. | module {
func.func @f(%a: tensor<32xf64>, %b: tensor<32xf64>) -> tensor<32xf64> {
%0 = stablehlo.subtract %a, %b : tensor<32xf64>
return %0 : tensor<32xf64>
}
} | subtract f64 32 | stablehlo+func | day_f8_wide_sweep |
076_ew_bin_076_subtract_f64_64 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 64. | module {
func.func @f(%a: tensor<64xf64>, %b: tensor<64xf64>) -> tensor<64xf64> {
%0 = stablehlo.subtract %a, %b : tensor<64xf64>
return %0 : tensor<64xf64>
}
} | subtract f64 64 | stablehlo+func | day_f8_wide_sweep |
077_ew_bin_077_subtract_f64_4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xf64>, %b: tensor<4x4xf64>) -> tensor<4x4xf64> {
%0 = stablehlo.subtract %a, %b : tensor<4x4xf64>
return %0 : tensor<4x4xf64>
}
} | subtract f64 4x4 | stablehlo+func | day_f8_wide_sweep |
078_ew_bin_078_subtract_f64_8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xf64>, %b: tensor<8x8xf64>) -> tensor<8x8xf64> {
%0 = stablehlo.subtract %a, %b : tensor<8x8xf64>
return %0 : tensor<8x8xf64>
}
} | subtract f64 8x8 | stablehlo+func | day_f8_wide_sweep |
079_ew_bin_079_subtract_f64_8x16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xf64>, %b: tensor<8x16xf64>) -> tensor<8x16xf64> {
%0 = stablehlo.subtract %a, %b : tensor<8x16xf64>
return %0 : tensor<8x16xf64>
}
} | subtract f64 8x16 | stablehlo+func | day_f8_wide_sweep |
080_ew_bin_080_subtract_f64_4x4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xf64>, %b: tensor<4x4x4xf64>) -> tensor<4x4x4xf64> {
%0 = stablehlo.subtract %a, %b : tensor<4x4x4xf64>
return %0 : tensor<4x4x4xf64>
}
} | subtract f64 4x4x4 | stablehlo+func | day_f8_wide_sweep |
081_ew_bin_081_subtract_f64_2x8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two f64 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xf64>, %b: tensor<2x8x8xf64>) -> tensor<2x8x8xf64> {
%0 = stablehlo.subtract %a, %b : tensor<2x8x8xf64>
return %0 : tensor<2x8x8xf64>
}
} | subtract f64 2x8x8 | stablehlo+func | day_f8_wide_sweep |
082_ew_bin_082_subtract_i8_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi8>, %b: tensor<8xi8>) -> tensor<8xi8> {
%0 = stablehlo.subtract %a, %b : tensor<8xi8>
return %0 : tensor<8xi8>
}
} | subtract i8 8 | stablehlo+func | day_f8_wide_sweep |
083_ew_bin_083_subtract_i8_16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 16. | module {
func.func @f(%a: tensor<16xi8>, %b: tensor<16xi8>) -> tensor<16xi8> {
%0 = stablehlo.subtract %a, %b : tensor<16xi8>
return %0 : tensor<16xi8>
}
} | subtract i8 16 | stablehlo+func | day_f8_wide_sweep |
084_ew_bin_084_subtract_i8_32 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 32. | module {
func.func @f(%a: tensor<32xi8>, %b: tensor<32xi8>) -> tensor<32xi8> {
%0 = stablehlo.subtract %a, %b : tensor<32xi8>
return %0 : tensor<32xi8>
}
} | subtract i8 32 | stablehlo+func | day_f8_wide_sweep |
085_ew_bin_085_subtract_i8_64 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 64. | module {
func.func @f(%a: tensor<64xi8>, %b: tensor<64xi8>) -> tensor<64xi8> {
%0 = stablehlo.subtract %a, %b : tensor<64xi8>
return %0 : tensor<64xi8>
}
} | subtract i8 64 | stablehlo+func | day_f8_wide_sweep |
086_ew_bin_086_subtract_i8_4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xi8>, %b: tensor<4x4xi8>) -> tensor<4x4xi8> {
%0 = stablehlo.subtract %a, %b : tensor<4x4xi8>
return %0 : tensor<4x4xi8>
}
} | subtract i8 4x4 | stablehlo+func | day_f8_wide_sweep |
087_ew_bin_087_subtract_i8_8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xi8>, %b: tensor<8x8xi8>) -> tensor<8x8xi8> {
%0 = stablehlo.subtract %a, %b : tensor<8x8xi8>
return %0 : tensor<8x8xi8>
}
} | subtract i8 8x8 | stablehlo+func | day_f8_wide_sweep |
088_ew_bin_088_subtract_i8_8x16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xi8>, %b: tensor<8x16xi8>) -> tensor<8x16xi8> {
%0 = stablehlo.subtract %a, %b : tensor<8x16xi8>
return %0 : tensor<8x16xi8>
}
} | subtract i8 8x16 | stablehlo+func | day_f8_wide_sweep |
089_ew_bin_089_subtract_i8_4x4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xi8>, %b: tensor<4x4x4xi8>) -> tensor<4x4x4xi8> {
%0 = stablehlo.subtract %a, %b : tensor<4x4x4xi8>
return %0 : tensor<4x4x4xi8>
}
} | subtract i8 4x4x4 | stablehlo+func | day_f8_wide_sweep |
090_ew_bin_090_subtract_i8_2x8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i8 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xi8>, %b: tensor<2x8x8xi8>) -> tensor<2x8x8xi8> {
%0 = stablehlo.subtract %a, %b : tensor<2x8x8xi8>
return %0 : tensor<2x8x8xi8>
}
} | subtract i8 2x8x8 | stablehlo+func | day_f8_wide_sweep |
091_ew_bin_091_subtract_i32_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi32>, %b: tensor<8xi32>) -> tensor<8xi32> {
%0 = stablehlo.subtract %a, %b : tensor<8xi32>
return %0 : tensor<8xi32>
}
} | subtract i32 8 | stablehlo+func | day_f8_wide_sweep |
092_ew_bin_092_subtract_i32_16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 16. | module {
func.func @f(%a: tensor<16xi32>, %b: tensor<16xi32>) -> tensor<16xi32> {
%0 = stablehlo.subtract %a, %b : tensor<16xi32>
return %0 : tensor<16xi32>
}
} | subtract i32 16 | stablehlo+func | day_f8_wide_sweep |
093_ew_bin_093_subtract_i32_32 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 32. | module {
func.func @f(%a: tensor<32xi32>, %b: tensor<32xi32>) -> tensor<32xi32> {
%0 = stablehlo.subtract %a, %b : tensor<32xi32>
return %0 : tensor<32xi32>
}
} | subtract i32 32 | stablehlo+func | day_f8_wide_sweep |
094_ew_bin_094_subtract_i32_64 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 64. | module {
func.func @f(%a: tensor<64xi32>, %b: tensor<64xi32>) -> tensor<64xi32> {
%0 = stablehlo.subtract %a, %b : tensor<64xi32>
return %0 : tensor<64xi32>
}
} | subtract i32 64 | stablehlo+func | day_f8_wide_sweep |
095_ew_bin_095_subtract_i32_4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 4x4. | module {
func.func @f(%a: tensor<4x4xi32>, %b: tensor<4x4xi32>) -> tensor<4x4xi32> {
%0 = stablehlo.subtract %a, %b : tensor<4x4xi32>
return %0 : tensor<4x4xi32>
}
} | subtract i32 4x4 | stablehlo+func | day_f8_wide_sweep |
096_ew_bin_096_subtract_i32_8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 8x8. | module {
func.func @f(%a: tensor<8x8xi32>, %b: tensor<8x8xi32>) -> tensor<8x8xi32> {
%0 = stablehlo.subtract %a, %b : tensor<8x8xi32>
return %0 : tensor<8x8xi32>
}
} | subtract i32 8x8 | stablehlo+func | day_f8_wide_sweep |
097_ew_bin_097_subtract_i32_8x16 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 8x16. | module {
func.func @f(%a: tensor<8x16xi32>, %b: tensor<8x16xi32>) -> tensor<8x16xi32> {
%0 = stablehlo.subtract %a, %b : tensor<8x16xi32>
return %0 : tensor<8x16xi32>
}
} | subtract i32 8x16 | stablehlo+func | day_f8_wide_sweep |
098_ew_bin_098_subtract_i32_4x4x4 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 4x4x4. | module {
func.func @f(%a: tensor<4x4x4xi32>, %b: tensor<4x4x4xi32>) -> tensor<4x4x4xi32> {
%0 = stablehlo.subtract %a, %b : tensor<4x4x4xi32>
return %0 : tensor<4x4x4xi32>
}
} | subtract i32 4x4x4 | stablehlo+func | day_f8_wide_sweep |
099_ew_bin_099_subtract_i32_2x8x8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i32 tensors of shape 2x8x8. | module {
func.func @f(%a: tensor<2x8x8xi32>, %b: tensor<2x8x8xi32>) -> tensor<2x8x8xi32> {
%0 = stablehlo.subtract %a, %b : tensor<2x8x8xi32>
return %0 : tensor<2x8x8xi32>
}
} | subtract i32 2x8x8 | stablehlo+func | day_f8_wide_sweep |
100_ew_bin_100_subtract_i64_8 | programmatic-wide | Write a function that applies stablehlo.subtract elementwise to two i64 tensors of shape 8. | module {
func.func @f(%a: tensor<8xi64>, %b: tensor<8xi64>) -> tensor<8xi64> {
%0 = stablehlo.subtract %a, %b : tensor<8xi64>
return %0 : tensor<8xi64>
}
} | subtract i64 8 | stablehlo+func | day_f8_wide_sweep |
StableHLO-Held-Out-200
Programmatic parametric sweep over 7 StableHLO op families × 6 dtypes × 3 shape ranks (n=200, verifier-clean).
This dataset is one of six NL→MLIR benchmarks released alongside the NeurIPS
2026 Evaluations & Datasets track paper Cross-Dialect Generalization Without
Retraining: Benchmarks and Evaluation of Schema-Derived Constrained Decoding
for MLIR (anonymous submission). The full suite — MLIR-Spec-150,
Linalg-Spec-30, StableHLO-Spec-30, StableHLO-Held-Out-200,
StableHLO-OutOfGrammar-25, and MLIR-Functional-Reference-30 — totals 465
instances across three MLIR dialects.
Composition
- Instances: 200
- Format: one JSON record per line in data/test.jsonl
- Schema: fields = dialect, difficulty, id, mlir, nl, notes, source
- Verifier: stablehlo-opt v1.4.0 (upstream truth) and iree-compile --compile-to=input (substitute; 50/50 concordant on a stratified n=50 sample)
- License: Apache-2.0 (SPDX: Apache-2.0). No third-party IP restrictions.
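The schema above can be checked mechanically. A minimal sketch, assuming the seven fields listed above and nothing else per record (the helper names here are illustrative, not part of the release):

```python
import json

# The seven documented fields of each record in data/test.jsonl.
EXPECTED_FIELDS = {"dialect", "difficulty", "id", "mlir", "nl", "notes", "source"}

def check_record(record: dict) -> bool:
    """A record is well-formed iff it carries exactly the documented fields."""
    return set(record) == EXPECTED_FIELDS

def check_jsonl(path: str) -> bool:
    """True iff every non-empty line of the JSONL file parses and is well-formed."""
    with open(path) as f:
        return all(check_record(json.loads(line)) for line in f if line.strip())
```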
Loading
from datasets import load_dataset
ds = load_dataset("plawanrath/StableHLO-Held-Out-200", split="test")
print(ds[0])
Each record is a self-contained natural-language→MLIR pair; the primary evaluation metric is the verify-valid pass rate under the dialect's verifier.
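As a sketch of how that metric can be computed, the snippet below shells out to a verifier binary and counts clean records. It assumes `stablehlo-opt` is on `PATH`; `verifier_cmd` is parameterized so the substitute verifier (`iree-compile --compile-to=input`) can be dropped in. The function names are illustrative:

```python
import os
import subprocess
import tempfile

def verifies(mlir_text: str, verifier_cmd=("stablehlo-opt",)) -> bool:
    """True iff the verifier accepts the module (exit status 0)."""
    with tempfile.NamedTemporaryFile("w", suffix=".mlir", delete=False) as f:
        f.write(mlir_text)
        path = f.name
    try:
        return subprocess.run([*verifier_cmd, path], capture_output=True).returncode == 0
    finally:
        os.remove(path)

def pass_rate(records, verifier_cmd=("stablehlo-opt",)) -> float:
    """Fraction of records whose `mlir` field is verifier-clean."""
    ok = sum(verifies(r["mlir"], verifier_cmd) for r in records)
    return ok / len(records)
```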
Source format
For paper reproducibility, individual per-record JSON files (the
examples/*.json layout used by the companion code repository) and the
MLCommons Croissant 1.0 metadata (croissant.json) ship together with the
release. The JSONL file at data/test.jsonl is the canonical HuggingFace
interface; it is generated 1-to-1 from the source records.
Datasheet
A full Gebru-style datasheet covering motivation, collection, preprocessing,
uses, distribution, and maintenance is included in the companion
reproducibility archive (docs/datasheets/datasheet.md). Key points:
- All reference MLIR programs are verifier-clean at the time of release.
- Hand-authored by a single author (no crowdsourcing, no LLM-authored references).
- Test-only — fine-tuning on these benchmarks contaminates future evaluation and is explicitly out of scope.
Companion artifacts
- Reproducibility archive (code + scripts): submission_artifact.tar.gz in the OpenReview attachment / Zenodo mirror.
- Companion code repository: .
Citation
@inproceedings{anonymous2026crossdialect,
title = {Cross-Dialect Generalization Without Retraining: Benchmarks and Evaluation of Schema-Derived Constrained Decoding for MLIR},
author = {Anonymous},
booktitle = {Advances in Neural Information Processing Systems (NeurIPS), Datasets and Benchmarks Track},
year = {2026},
note = {Anonymous submission under review.}
}
License
Apache-2.0. See LICENSE.