This is misguided. For decades now, there has been no reason to assume that hand-unrolled code is faster than a for-loop. Compilers optimize this stuff, and they do it even better than mindlessly multiplying x by itself. For example, raising x to the power 6 needs only 3 multiplications; see for example: https://godbolt.org/z/Edz4jjqvv
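To make the multiplication count concrete, here is a minimal sketch in plain OCaml (names are purely illustrative) of the addition-chain rewrite a compiler can apply, computing x^6 with three multiplications instead of five:

    (* x^6 via the addition chain 2, 4, 6: three multiplications
       instead of the five a naive x*x*x*x*x*x would use. *)
    let pow6 x =
      let x2 = x * x in     (* 1st multiplication: x^2 *)
      let x4 = x2 * x2 in   (* 2nd multiplication: x^4 *)
      x4 * x2               (* 3rd multiplication: x^6 *)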
While there are definitely use cases for meta-programming, optimization is not one of them.
I want to stress that this is not true. Sure, sometimes it might work, but compilers can also uninline, as well as reorder the way things are evaluated. Compilers don't do a 1:1 mapping of lines of code to assembly instructions anymore; instead, they are designed to take your program as input and generate the best executable that has the same observable effect as your code. So whatever optimization you perform in the source code is also going to be very brittle with respect to seemingly harmless compiler changes (like changing compiler flags, updating the compiler to a new version, and so on).
While indeed nothing is guaranteed, at this point in time compilers are vastly better at optimizing code than humans are. If you want to make the point that multi-stage programming helps optimize code, you have to do much better than an example of raising x to some power.
Let's put it another way: do you think there is utility in macros at all? And do you think that type-safe code is better than untyped code? If you say yes to both, you must also think that staging is useful, since it basically gives you type-safe macros. Now lots more things can be macros instead of runtime functions, and you don't need to deal with the ergonomic issues that macros have in other languages. For a more real-world example, see Jeremy Yallop's work on fused lexing and parsing.
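For a flavour of what "type-safe macros" means in practice, here is a small sketch of the classic staged power function, written with BER MetaOCaml's bracket (.< >.) and escape (.~) syntax; assuming that extension is available, and purely to illustrate the idea. Specializing to n = 6 generates the unrolled multiplication at staging time, and the generated code is fully type-checked:

    (* Staged power: the code for x^n is generated before runtime,
       and the generated code is type-checked, unlike a textual macro. *)
    let rec spower (n : int) (x : int code) : int code =
      if n = 0 then .< 1 >.
      else .< .~x * .~(spower (n - 1) x) >.

    (* Specializing to n = 6 yields code equivalent to
       fun x -> x * (x * (x * (x * (x * (x * 1))))). *)
    let power6 : (int -> int) code = .< fun x -> .~(spower 6 .< x >.) >.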