This is the perspective of an electrician (me). The charger takes the 100-120 volts and steps it down by a ratio, let's say 12:1. That means if the charger gets 120 volts AC from your wall, it passes through a transformer and is stepped down by a factor of 12 (just an arbitrary number, but sufficient for this explanation) to give you 10 volts AC.
This 10 volts AC is rectified to give you 10 volts DC, which is now suitable for charging your battery (neglecting for now other electronic parts like voltage regulators, FETs, etc.).
If you input 110 volts to your charger, the step-down ratio stays the same (12:1), giving a charge voltage (no load) of 9.167 volts. A little lower, but no biggie.
If you input 100 volts, the new charge voltage will be 8.33 volts.
The ratio I used is arbitrary and not necessarily what Marui designed into their charger, but you can see how varying the input voltage affects your output charge voltage.
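If it helps to see the arithmetic laid out, here is a minimal sketch of that same calculation. The 12:1 figure is the same made-up ratio as above, not anything Marui actually uses, and it ignores rectification losses and regulation.

```python
# No-load output voltage from a simple step-down transformer.
# TURNS_RATIO is an arbitrary illustrative number, not Marui's design.
TURNS_RATIO = 12.0  # input volts : output volts

def output_voltage(input_v_ac: float, ratio: float = TURNS_RATIO) -> float:
    """Output voltage for a given AC input, ignoring losses and regulation."""
    return input_v_ac / ratio

for mains in (120, 110, 100):
    print(f"{mains} V in -> {output_voltage(mains):.2f} V out")
# 120 V in -> 10.00 V out
# 110 V in ->  9.17 V out
# 100 V in ->  8.33 V out
```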
The reverse, however, is also true. If this charger was designed to give 9.6 volts at a 100 volt input (roughly a 10.4:1 ratio), increasing the input voltage will also increase the transformer output voltage; at 120 volts in you would see something like 11.5 volts out. A smart charger will not usually be affected by this increase, as it usually has some voltage regulation to counter these changes.
Increasing the voltage to the battery during charging MAY cause problems such as extra heat generated in the pack. Under really extreme circumstances, like high ambient temperature or excessively long charging times, damage MAY occur to the battery.
The case that Will brought up may be the result of a dud battery, or other conditions not mentioned.