The power dissipated in a resistor is given by P = V²/R, which means power decreases if resistance increases. Yet this power is also given by P = I²R, which means power increases if resistance increases. Explain.
Each formula hides a different assumption. P = I²R assumes the current through the resistor is constant (as in a series circuit, where the same current flows through every element); then P is directly proportional to R, so the dissipated power increases as the resistance increases.
P = V²/R assumes the voltage across the resistor (V) is constant (as in a parallel circuit, or a resistor connected directly across an ideal voltage source); then P is inversely proportional to R, so the dissipated power decreases as the resistance increases.
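To see the two trends side by side, here is a minimal Python sketch. The voltage, current, and resistance values are arbitrary, chosen only for illustration:

```python
# Minimal sketch (assumed values): compare how dissipated power changes with R
# under the two different assumptions discussed above.

resistances = [5.0, 10.0, 20.0, 40.0]  # ohms (illustrative values)

V = 12.0  # volts, held constant across the resistor (parallel / ideal-source case)
I = 0.5   # amperes, held constant through the resistor (series case)

print("R (ohm) | P = V^2/R (W) | P = I^2*R (W)")
for R in resistances:
    p_const_v = V**2 / R   # decreases as R grows
    p_const_i = I**2 * R   # increases as R grows
    print(f"{R:7.1f} | {p_const_v:13.2f} | {p_const_i:13.2f}")
```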
What you have here are two different scenarios: constant current corresponds to a series arrangement of resistors (which requires at least two resistors), and constant voltage corresponds to the parallel arrangement. If only one resistor is used in the circuit, it is effectively the parallel (constant-voltage) configuration, assuming an ideal voltage source with no internal resistance.
So as long as both formulas are applied to the same scenario (both series or both parallel), this contradiction won't arise:
Assuming there is only one resistor R (since you didn't mention any other) connected across an ideal voltage source, the voltage across it is fixed, so P = V²/R and the power always decreases as R increases. For example, with an ideal 12 V source, R = 10 Ω dissipates 14.4 W while R = 20 Ω dissipates only 7.2 W.
P.S.: If you try this out practically, you will not get exactly the ideal parallel-case result, because the source has its own internal resistance. So even with only one resistor, you are actually connecting it in series with the source resistance (which is usually about 20–30 ohms). Practically, then, P increases as R increases only while R is smaller than the internal resistance; once R exceeds it, P starts to fall again, with the maximum power delivered when R equals the internal resistance.
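Here is a short Python sketch of that non-ideal case. The source EMF and the internal resistance are assumptions chosen for illustration (the internal resistance is taken as 25 ohms, inside the range mentioned above); the power delivered to the external resistor follows from P = I²R with I = V/(R + r), i.e. P = V²R/(R + r)²:

```python
# Minimal sketch (assumed values): power delivered to a single resistor R when the
# source has internal resistance r. Here P = V**2 * R / (R + r)**2, which rises
# with R while R < r and falls once R exceeds r (maximum power transfer at R = r).

V = 12.0   # volts, source EMF (illustrative)
r = 25.0   # ohms, assumed internal resistance, within the 20-30 ohm range above

for R in [5.0, 15.0, 25.0, 50.0, 100.0]:  # ohms, load resistances to try
    P = V**2 * R / (R + r)**2  # power dissipated in the external resistor
    print(f"R = {R:6.1f} ohm -> P = {P:6.3f} W")
```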