Question about the DRR pattern TableGen

Hi,
I define the op in the first .td file as below:
def Npu_ReshapeOp : Npu_Op<"Reshape", [xxx]> {
  let summary = "Reshape operation";
  let description = [{
    xx.
  }];
  let arguments = (ins
    AnyRankedTensor:$input,
    Optional:$buffer
  );
  let results = (outs AnyRankedTensor:$output);
}

and define one Pat in another .td file:

def ReshapeReshapeOpt2Pattern : Pat<(Npu_ReshapeOp(Npu_ReshapeOp $arg1, $arg2)), (Npu_ReshapeOp $arg1, $arg2)>;

when running tablegen, it reports the error below:
error: op 'npu.Reshape' argument number mismatch: 1 in pattern vs. 2 in definition
def ReshapeReshapeOpt2Pattern : Pat<(Npu_ReshapeOp(Npu_ReshapeOp $arg1, $arg2)),

thanks.

Your op has two arguments, and the TableGen DRR spec expects both of them in your pattern, where your pattern is a source → target pair (llvm-project/mlir/include/mlir/IR/PatternBase.td at f5ab0bb14855154b8ecaf24fc2a2797dd8e95d17 · llvm/llvm-project · GitHub). Here your source's outermost op only has one argument (where argument = operand or attribute).
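
For example, a source pattern that binds both operands of the outer op could look roughly like this (a sketch only; the names and the choice of which buffer ends up in the result are illustrative):

def FoldDoubleReshape : Pat<
  (Npu_ReshapeOp (Npu_ReshapeOp $input, $innerBuf), $outerBuf),
  (Npu_ReshapeOp $input, $innerBuf)>;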

Thanks for your quick reply. Sorry, this is my first time using DRR. Now I change the Pat as below:

def ReshapeReshapeOpt2Pattern : Pat<(Npu_ReshapeOp (Npu_ReshapeOp $arg1, $arg2), $arg3),
(Npu_ReshapeOp $arg1, $arg2)>;

it reports the error below:
error: no viable conversion from '::mlir::Operation::operand_range' (aka 'mlir::OperandRange') to 'llvm::SmallVectorTemplateBase<mlir::Value, true>::ValueParamT' (aka 'mlir::Value')
tblgen_values.push_back(arg2);

the generated .inc file is as below:
tblgen_values.push_back((*arg1.begin()));
tblgen_values.push_back(arg2);

Mmm, that's weird; it's complaining about a variadic input being passed where a single value is expected. I don't know what type Optional is here (that's the 2nd argument). Do you have an IR snippet of the input, too?

Sorry, the op is defined as below:
def AnyTensorOrNone: AnyTypeOf<[AnyTensor, NoneType]>;

def Npu_ReshapeOp : Npu_Op<"Reshape", [xxx]> {
  let summary = "Reshape operation";
  let description = [{
    xx.
  }];
  let arguments = (ins
    AnyRankedTensor:$input,
    Optional<AnyTensorOrNone>:$buffer
  );
  let results = (outs AnyRankedTensor:$output);
}

the IR is as below:
%114 = "npu.Reshape"(%113) : (tensor<1x29x200xf32, 4296728576 : i64>) -> tensor<1x1x29x200xf32, 4296728576 : i64> loc(#loc208)
%115 = "npu.Permute"(%114, %0) {order = [0, 1, 3, 2]} : (tensor<1x1x29x200xf32, 4296728576 : i64>, none) -> tensor<1x1x200x29xf32, 4296572928 : i64> loc(#loc209)
%116 = "npu.Slice"(%115, %0, %0, %0, %0) {axes = , ends = [1, 1, 200, 10], offset = [0, 0, 0, 0], steps = [1, 1, 1, 1]} : (tensor<1x1x200x29xf32, 4296572928 : i64>, none, none, none, none) -> tensor<1x1x200x10xf32, 4296597504 : i64> loc(#loc210)

(the buffer argument is Optional<AnyTensorOrNone>:$buffer)

the above Pat is just a toy, only to practice DRR.

Oh, I know what's going on here, and it's missing support in DRR. Shouldn't be hard to add though. The optional operand should be checked before being pushed, and the operand segment sizes adjusted based on whether it is there or not. (I vaguely remember at one point considering switching to invoking builders, but that's out of cache.)
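
Roughly, the generated code would need to guard the optional operand before pushing it, something like this sketch (not what tblgen emits today):

tblgen_values.push_back((*arg1.begin()));
// Hypothetical fix: only push the optional buffer if it is actually present,
// and adjust the op's operand segment sizes accordingly.
if (!arg2.empty())
  tblgen_values.push_back(*arg2.begin());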

For the short term, and it also helps with learning DRR, you could try the custom builder construct.
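
If that refers to NativeCodeCall, a rough sketch could look like the following. This is only an illustration: npu::ReshapeOp is a guess at the generated C++ class name, and it assumes the op's default builder accepts a null Value for the optional buffer.

// Hypothetical helper: build the reshape without the optional buffer,
// reusing the original result type.
def CreateReshapeNoBuffer : NativeCodeCall<
    "$_builder.create<npu::ReshapeOp>($_loc, $0.getType(), $1, /*buffer=*/mlir::Value())">;

def ReshapeReshapeOpt2Pattern : Pat<
    (Npu_ReshapeOp:$res (Npu_ReshapeOp $arg1, $arg2), $arg3),
    (CreateReshapeNoBuffer $res, $arg1)>;

Here $res binds the outer reshape's result so its type can be reused for the replacement.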

We've also discussed making variadic a more first-class concept (optional is implemented by way of variadic), and in that world pushing back an optional or an ArrayRef could adjust the segment size automatically, since it would then be a fixed concept.

Clear now. The above test case is just for practice; I can try another one. Thanks again.