The short answer is “don’t do it”.
The voltage dropped by the resistor is given by Ohm’s law: V = I R.
So if you know exactly how much current your device will draw, you can choose a resistor that drops exactly 7.5V at that current, leaving 4.5V for your device. But if the current through your device changes, or if you want to build more than one unit and the devices don't all draw exactly the same current, you can't consistently get 4.5V into the device using just a resistor.
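To see how badly this goes wrong, here is a small sketch of the arithmetic. The 12V battery is implied by the 7.5V + 4.5V split above; the 100mA nominal load current is an assumed number for illustration:

```python
V_IN = 12.0          # battery voltage (implied by the 7.5 V + 4.5 V split above)
V_OUT_TARGET = 4.5   # desired device voltage
I_NOMINAL = 0.100    # assumed nominal device current, 100 mA (hypothetical)

# Size the resistor for the nominal current: R = V_drop / I
R = (V_IN - V_OUT_TARGET) / I_NOMINAL   # 75 ohms

def v_out(i_load):
    """Voltage left for the device when it draws i_load amps through R."""
    return V_IN - i_load * R

# A +/-20% swing in load current moves the device voltage by +/-1.5 V:
for i in (0.080, 0.100, 0.120):
    print(f"I = {i * 1000:.0f} mA -> V_out = {v_out(i):.2f} V")
# I = 80 mA  -> V_out = 6.00 V
# I = 100 mA -> V_out = 4.50 V
# I = 120 mA -> V_out = 3.00 V
```

A 20% change in load current turns into a 33% change in supply voltage, which is exactly why a bare resistor can't regulate.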
Your other options include:
- Linear regulator. This is basically a variable resistor that adjusts its value to keep the output where you want it. This is probably only a good solution if your device consumes very little power (maybe up to 100mA).
- Shunt regulator. This means using a resistor to drop the voltage as you suggest, but adding an additional device in parallel with the load to control the voltage. The shunt regulator adjusts its own current (within limits) so that the current through the resistor maintains the desired output voltage.
- Conversion (switching) regulator. This uses some tricks to generate the required output voltage with much better power efficiency than a linear regulator. This is probably the best option if your device needs more than 10 or 20 mA of current.
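The efficiency gap between the linear and conversion options can be sketched numerically. The rails match the question (12V in, 4.5V out); the 100mA load and the 90% converter efficiency are assumptions, not guarantees:

```python
V_IN, V_OUT = 12.0, 4.5   # rails from the question above
I_LOAD = 0.100            # assumed 100 mA load (hypothetical)

# Linear regulator: everything above V_OUT is burned as heat in the pass element.
p_out = V_OUT * I_LOAD                   # 0.45 W delivered to the device
p_linear_loss = (V_IN - V_OUT) * I_LOAD  # 0.75 W wasted as heat
eff_linear = V_OUT / V_IN                # 37.5%, independent of load current

# Switching converter: assume ~90% efficiency (a typical figure, not a spec).
eff_switch = 0.90
p_switch_in = p_out / eff_switch
p_switch_loss = p_switch_in - p_out      # ~0.05 W wasted

print(f"linear:    {eff_linear:.1%} efficient, {p_linear_loss:.2f} W lost")
print(f"switching: {eff_switch:.1%} efficient, {p_switch_loss:.2f} W lost")
```

The linear regulator wastes more power than it delivers at this ratio of input to output voltage, which is why the answer steers higher-current loads toward a conversion regulator.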
If the following conditions are met, you can reduce the DC voltage with resistors (high-power aluminum-housed types, rated 50W or more):
- Your battery can supply at least 20 times (or more) the current your load draws.
- Power loss is not a problem.
- Overheating is not a problem, or there is a good cooling mechanism for the resistors.
- Even your lowest load resistance is much higher (20x or more) than the dropping resistor's.
Note: 20x is only a rough figure; the actual ratio depends on how much voltage variation your load can tolerate.
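A quick divider calculation shows where the 20x rule comes from. The series resistor value here is a hypothetical placeholder; only the ratio between it and the load matters:

```python
V_IN = 12.0   # source voltage (same battery as above)
R_S = 1.0     # series dropping resistor, hypothetical value

def v_out(r_load):
    """Output of the series divider formed by R_S and the load."""
    return V_IN * r_load / (R_S + r_load)

# As the load resistance varies from 20x R_S upward, the output stays
# within about 5% of V_IN:
for k in (20, 50, 1000):
    print(f"R_load = {k:4d} * R_S -> V_out = {v_out(k * R_S):.2f} V")
```

At the 20x ratio the output sags by V_IN/21, about 4.8%, and any further increase in load resistance only pushes it back toward V_IN. That bounded sag is what the note's "how much voltage variance your load can handle" refers to.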