Quantum computing / Chapter 9 - Deutsch-Jozsa algorithm: "Suppose that f(00) = f(01) = 0, f(10) = f(11) = 1. Apply the Deutsch-Jozsa algorithm and show that at least one of the first two qubits ends up as a 1." I found this in *Quantum Computing Explained* by David McMahon (Example 9.3), and I wasn't able to understand how he used $U_f$ to get his third solution. I understand the reason behind applying the Hadamard gate. Help will be appreciated. Thanks in advance!
To apply $U_f$, multiply each term $|x\rangle$ of the superposition by $(-1)^{f(x)}$, term by term. This is the phase-kickback effect: with the ancilla in $(|0\rangle-|1\rangle)/\sqrt{2}$, the oracle $U_f|x\rangle|y\rangle = |x\rangle|y\oplus f(x)\rangle$ leaves the ancilla unchanged and attaches the sign $(-1)^{f(x)}$ to $|x\rangle$. The values of $f(x)$ are given in the question, so the first two qubits go from

$$\frac{1}{2}\big(|00\rangle + |01\rangle + |10\rangle + |11\rangle\big) \;\longmapsto\; \frac{1}{2}\big(|00\rangle + |01\rangle - |10\rangle - |11\rangle\big) = \frac{|0\rangle-|1\rangle}{\sqrt{2}}\otimes\frac{|0\rangle+|1\rangle}{\sqrt{2}}.$$

Applying the final Hadamards to the first two qubits then gives $|1\rangle\otimes|0\rangle$, so the first qubit is measured as 1, as the exercise asks you to show. (Since $f$ is balanced, the outcome cannot be $|00\rangle$.)
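If it helps to check the arithmetic, here is a minimal NumPy sketch of the whole circuit for this particular $f$ (not McMahon's notation, just a direct state-vector simulation): it builds $U_f$ as a permutation matrix on three qubits, runs $H^{\otimes 3}$, $U_f$, then $H\otimes H\otimes I$, and prints the measurement probabilities of the first two qubits.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def kron(*ops):
    """Tensor product of a list of operators, first factor = leftmost qubit."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# f from the exercise: f(00) = f(01) = 0, f(10) = f(11) = 1 (balanced)
f = {0b00: 0, 0b01: 0, 0b10: 1, 0b11: 1}

# U_f |x>|y> = |x>|y XOR f(x)>, written as an 8x8 permutation matrix
Uf = np.zeros((8, 8))
for x in range(4):
    for y in range(2):
        Uf[(x << 1) | (y ^ f[x]), (x << 1) | y] = 1

# start in |0>|0>|1> (third qubit is the ancilla)
state = np.zeros(8)
state[0b001] = 1

state = kron(H, H, H) @ state   # Hadamards on all three qubits
state = Uf @ state              # oracle: phase kickback puts (-1)^f(x) on |x>
state = kron(H, H, I) @ state   # final Hadamards on the first two qubits only

# probabilities of the first two qubits (sum over the ancilla)
probs = (np.abs(state) ** 2).reshape(4, 2).sum(axis=1)
for x in range(4):
    print(f"P(first two qubits = {x:02b}) = {probs[x]:.3f}")
```

Running this shows all the probability on $|10\rangle$: the first qubit is 1 with certainty, consistent with the hand calculation above.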