Hello. I’m new to solar energy and would like to build my own solar-powered phone charger. I’ve tried to learn as much as I can from these forums and the wider internet, but please forgive any naivety. My main concern is cost, although I’ve set myself a target of charging a 3.7V, 1420mAh battery in under 10 hours; I’ll use this example battery throughout the post. First, I’m looking for confirmation that my calculation is correct: if I have a panel outputting 100mA at 3.7V, would it take 1420/100 = 14.2 hours to charge that battery, assuming the phone consumes no power during the charging period?
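To show exactly what arithmetic I'm doing, here's a quick sketch of my charge-time estimate (this is my own naive model — it assumes 100% charge efficiency and constant panel output, which I realise is optimistic):

```python
def charge_time_hours(capacity_mah, current_ma):
    """Naive charge-time estimate: capacity / current.
    Ignores conversion losses, charge-termination taper,
    and variation in sunlight over the day."""
    return capacity_mah / current_ma

# Example battery from the post: 1420 mAh at 100 mA
print(charge_time_hours(1420, 100))  # 14.2 hours
```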
I’ve purchased a Powerfilm mini panel which outputs 100mA at 3.6V, just to run some tests.
This is a 0.36W panel, but the voltage of the panel (3.6V) and the battery (3.7V) don’t match. Since P=IV, does this mean the panel would deliver a current of 0.36/3.7 = 97.3mA rather than 100mA? More fundamentally, I don’t understand why panels are usually given a voltage and current rating rather than just a power rating – if a panel is rated at 142mA and 37V (P=IV=5.25W), does this mean I could use it to charge the example battery in just 1 hour? I have a sneaking suspicion that the answer is no, but I couldn’t explain why.
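For reference, this is how I'm getting the numbers above (my own back-of-envelope calculation, not anything from the Powerfilm datasheet):

```python
# Panel rating: 100 mA at 3.6 V
p_panel = 0.100 * 3.6        # P = I * V = 0.36 W

# If that same 0.36 W were delivered at the battery's 3.7 V,
# the current would be slightly lower than the rated 100 mA:
i_at_3v7 = p_panel / 3.7     # I = P / V, in amps

print(round(p_panel, 2))            # 0.36 (watts)
print(round(i_at_3v7 * 1000, 1))    # 97.3 (milliamps)
```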
I’ve seen a number of solar products that take something like 8-12 hours to charge a phone, which seems rather slow compared to wall chargers. I understand there is a limit to how much current we can put in, since too much would generate excessive heat and damage the battery, but could I, assuming I had the world’s best heatsink, charge a battery quickly using 10A? How do I calculate the practical current limit for charging?
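From what I've read, Li-ion charge current is usually expressed as a C-rate (1C for a 1420mAh cell means 1.42A), and 0.5C to 1C is the range commonly quoted as safe for standard cells — though the real limit is whatever the cell's datasheet specifies. Here's my rough sanity check under that assumption:

```python
def charge_current_ma(capacity_mah, c_rate):
    """Charge current at a given C-rate. 0.5C-1C is the range
    commonly quoted for standard Li-ion cells; the actual
    limit is cell-specific (check the datasheet)."""
    return capacity_mah * c_rate

# Example 1420 mAh battery from the post:
print(charge_current_ma(1420, 0.5))  # 710.0 mA (gentle charge)
print(charge_current_ma(1420, 1.0))  # 1420.0 mA (typical fast limit)
```

By this measure 10A would be roughly 7C — far beyond what the cell chemistry tolerates regardless of heatsinking, since the limit comes from lithium plating inside the cell, not just external heat.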
Can I charge a phone battery directly from the solar panel, using an LM317T voltage regulator combined with a resistor to regulate current, and a Zener diode with resistor to act as an automatic cutoff switch when the battery is charged? For a low power application such as a phone, is MPPT necessary and are its efficiency gains significantly greater than the power it consumes?
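In case it helps clarify what I have in mind for the LM317T part: the datasheet gives a nominal 1.25V reference between the OUT and ADJ pins, so in the two-terminal constant-current configuration the current-set resistor is R = 1.25/I. This sketch just computes that value (the configuration choice is my assumption about how I'd wire it):

```python
V_REF = 1.25  # LM317 nominal OUT-to-ADJ reference voltage, volts

def current_set_resistor_ohms(i_limit_a):
    """Resistor for LM317 constant-current mode: R = Vref / I.
    The regulator holds ~1.25 V across R, fixing the current."""
    return V_REF / i_limit_a

# To limit charge current to 100 mA:
print(current_set_resistor_ohms(0.100))  # 12.5 ohms
```

(I gather the LM317 also needs a couple of volts of headroom above the battery voltage to regulate, which matters with a 3.6V panel — part of why I'm asking whether this approach is viable at all.)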
Again, I am a solar newbie, so thanks for reading and I look forward to hearing your answers.