How can I model an instruction that gets broken down into two µ-ops?
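To make the question concrete: here is a minimal sketch of the kind of thing I have in mind, assuming a simple software performance model in C++ (the names `MicroOp`, `decode`, and the example instruction are purely illustrative, not from any real framework). At decode time, one macro-instruction expands into a list of µ-ops:

```cpp
#include <string>
#include <vector>

// One µ-op produced by cracking a macro-instruction (illustrative fields).
struct MicroOp {
    std::string op;   // e.g. "load", "add"
    int latency;      // cycles this µ-op occupies its execution unit
};

// Decode step: a load-op instruction such as `add r1, [r2]` is cracked
// into a load µ-op followed by an ALU µ-op that consumes the loaded value;
// simple instructions map one-to-one.
std::vector<MicroOp> decode(const std::string& insn) {
    if (insn == "add r1, [r2]") {
        return { {"load", 4}, {"add", 1} };  // two µ-ops, issued in order
    }
    return { {insn, 1} };
}
```

Is this per-instruction expansion at decode the usual approach, or is there a more standard way to express the macro-op-to-µ-op mapping?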