
The Inside Scoop on Battery Chargers

I can't take credit for writing the following article explaining how battery chargers work and how to select them for your needs. I can however take credit for recognizing a nice job explaining the topic. Hats off to Don Wilson, the Xantrex "Tech Doctor" for this presentation.


Tech Doctor Investigation into the Mysterious World of Battery Chargers

**By Don Wilson**


“Why is my battery not charging?” is a common question that plagues many. Unfortunately, there is no clear-cut, one-stop-shop answer, as the reasons vary widely. Many folks simply don’t understand how batteries work, not to mention the broader scope of battery technology, chargers and electricity.

So … how exactly does a charger work?

There are many different types of chargers with different technologies, algorithms, sizes and options, but the bottom line is that a charger works because its voltage is higher than the battery voltage, which causes current to flow to the battery. In the most simplistic terms, the voltage differential causes current to flow from the source (charger) to the load (battery). However, I’m the first to admit that the devil is in the details. For instance, a 12-volt lead-acid battery needs exposure to at least 14 volts in order to fully charge, but a higher voltage will cause it to gas out, drying the cells and, eventually, causing damage.
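To put some rough numbers on that voltage-differential idea, here is a small Python sketch. It simply applies Ohm’s law with an assumed internal resistance and an assumed charger current limit; the figures are illustrative, not specifications for any particular battery or charger.

```python
# Rough Ohm's-law illustration: current flows because the charger voltage is
# higher than the battery voltage. The 0.02-ohm internal resistance and 80 A
# charger limit below are illustrative assumptions, not real specifications.

def charge_current(charger_volts, battery_volts,
                   internal_resistance_ohms=0.02, charger_max_amps=80.0):
    """Estimate the charge current driven by the charger/battery voltage differential."""
    differential = max(charger_volts - battery_volts, 0.0)
    return min(differential / internal_resistance_ohms, charger_max_amps)

# As the battery voltage rises toward the charger voltage, the current tapers off.
for battery_v in (12.0, 12.6, 13.8, 14.4):
    print(f"battery at {battery_v:.1f} V -> about {charge_current(14.4, battery_v):.0f} A")
```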


Typical Battery Charger Phases

Keep in mind that every battery will have a slightly different profile.

What is multi-stage charging?


In the above example, the 14-volt threshold is not only critical but also potentially dangerous. So, the term “multi-stage charging” means that the voltage differential changes throughout the charging cycle. We’ll use the typical 12-volt liquid lead-acid battery as an example.

The first charge stage is the BULK stage, which puts as much current into the battery as fast as possible without damage. The charger will attempt to deliver 14.4 volts at its maximum current in order to achieve the charge. Anything higher can cause heat build-up; anything lower will slow the charge rate. With this in mind, once the voltage differential equalizes (the battery voltage meets the charger voltage, at approximately 85% charged), we enter the absorption stage.

In the absorption stage, the charger maintains the 14.4 volts, but the current will slowly drop as the battery’s resistance increases (caused by the rising charge level). The absorption stage tops off the battery’s state of charge. Once the battery is “full,” the charger drops its voltage to 13.4 volts and transitions to the float stage. The float voltage level is high enough to keep the battery “full,” even if DC loads are turned on, but low enough to prevent persistent gassing of the battery, which can cause long-term damage.
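The bulk/absorption/float logic above boils down to a simple state machine. The sketch below uses the 14.4-volt and 13.4-volt setpoints from the example; the current-taper cutoff used to decide the battery is “full” and the function names are my own assumptions.

```python
# Minimal sketch of the three-stage logic described above for a 12 V flooded
# lead-acid bank. The 14.4 V and 13.4 V setpoints come from the article; the
# 2%-of-capacity taper cutoff and the function names are my own assumptions.

BULK_ABSORB_VOLTS = 14.4   # setpoint during bulk and absorption
FLOAT_VOLTS = 13.4         # setpoint once the battery is "full"

def next_stage(stage, battery_volts, charge_amps, bank_capacity_ah):
    """Decide which charge stage to use on the next control cycle."""
    if stage == "bulk" and battery_volts >= BULK_ABSORB_VOLTS:
        # Battery voltage has met the charger voltage (roughly 85% charged).
        return "absorption"
    if stage == "absorption" and charge_amps <= 0.02 * bank_capacity_ah:
        # Accepted current has tapered off, so the battery is effectively full.
        return "float"
    return stage

def target_volts(stage):
    """Voltage the charger regulates to in each stage."""
    return FLOAT_VOLTS if stage == "float" else BULK_ABSORB_VOLTS

print(next_stage("bulk", 14.4, 80, 400))        # -> absorption
print(next_stage("absorption", 14.4, 6, 400))   # -> float
```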


Why do some chargers have a battery temperature sensor?

The examples I have used are for the absolute ideal scenario: a liquid battery, a properly sized charger, and a moderate temperature. However, the battery’s response to the voltage differential changes with temperature. When a battery is warmer, it has an easier time accepting current; when it’s colder, it has a higher resistance to current. So, more complex chargers utilize a battery temperature sensor to determine the battery’s ability to accept a charge and will adjust the voltage (higher voltage when cold, lower voltage when warm) to give an optimum charge and to regulate the temperature of the charging battery. The voltage difference is minimal (typically 0.03 volts for every degree of variance from a moderate temperature), but it makes a difference in the battery’s longevity.
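As a rough sketch of that compensation, the snippet below raises or lowers the voltage setpoint by 0.03 volts per degree away from a moderate reference temperature. The 25 °C reference point is an illustrative assumption; your charger and battery documentation will give the exact compensation to use.

```python
# Sketch of the temperature compensation described above: roughly 0.03 V per
# degree away from a moderate temperature, higher when cold and lower when
# warm. The 25 C reference temperature is an illustrative assumption.

REFERENCE_TEMP_C = 25.0
VOLTS_PER_DEGREE = 0.03

def compensated_voltage(base_volts, battery_temp_c):
    """Adjust the charge-voltage setpoint for the measured battery temperature."""
    return base_volts + (REFERENCE_TEMP_C - battery_temp_c) * VOLTS_PER_DEGREE

print(compensated_voltage(14.4, 10.0))   # cold battery -> about 14.85 V
print(compensated_voltage(14.4, 35.0))   # warm battery -> about 14.10 V
```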

How large of a charger should I have?

With limited knowledge of battery charging, one might believe that a 400Ah battery bank, charged by a 400-amp charger, should fully charge from a completely discharged state in about an hour. However, a charger that large would cause so much heat build-up in the battery that it would be completely destroyed before long. On the other side of the spectrum, a 5-amp charger would not damage the battery, but would take over three days to charge it! So … what’s the optimum charger? The general rule of thumb is C/5, or capacity (in amp-hours) divided by 5. So an 80A charger is the right size for a 400Ah battery bank (400/5=80). When rounding is necessary, always round down, because your battery bank will lose capacity as it ages, and the C/5 figure will eventually match its actual capacity.
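In code form, the C/5 rule of thumb is a one-liner; the round-down behavior mentioned above is included, and the function name is simply my own.

```python
# The C/5 rule of thumb in code: charger amps = bank capacity (Ah) divided by
# five, rounded down when it doesn't come out even. The function name is mine.
import math

def recommended_charger_amps(bank_capacity_ah):
    """Rule-of-thumb charger size for a lead-acid bank: C/5, rounded down."""
    return math.floor(bank_capacity_ah / 5)

print(recommended_charger_amps(400))   # -> 80, matching the example above
print(recommended_charger_amps(412))   # -> 82 (82.4 rounded down)
```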

How do I match my battery to my charger?

Actually, in a new installation, the battery should be specified first, before even considering the charger. Why? If the charger is determined first, it may limit your battery choices. On the other hand, there are so many charger types that you can always find one (or stackable charger units) to match your battery bank.

The first consideration is size; the next is battery chemistry. If you decide on a gel battery or an AGM, ensure your charger has the algorithm to match the battery type, and the temperature compensation to effectively charge the bank.

Another consideration is input voltage. If you plan on using the charger in a worldwide environment, ensure you select a charger that can operate on a worldwide voltage range. A US-only charger (120V 60Hz input) would certainly be damaged by plugging into European (230V 50Hz) power. However, there are some that can take a wide window of input voltages and still function as designed.

How do I know if my battery needs charging or not?

Most people use battery voltage as an indicator of the battery’s state of charge. This gives a broad indication, but it is far from accurate. For instance, a battery under little load that measures 11.5V would be considered heavily discharged, but a battery under heavy load measuring 11.5V would rebound to a much higher voltage when the load turns off. The only truly accurate way to confirm the state of charge is to measure the total amperage drawn from, and charged back into, the battery. The best device for this is a shunt-based battery monitor, which measures the amperage and uses voltage readings and complicated equations to accurately display the battery’s state of charge. Once the monitor shows the battery around 50% charged, it’s time to charge your battery bank (batteries should not be discharged below 50% state of charge).
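Here is a bare-bones sketch of the amp-hour counting a shunt-based monitor performs. Real monitors also correct for charging efficiency, temperature and voltage; those refinements are omitted here, and the class name is simply my own.

```python
# Bare-bones sketch of the amp-hour counting a shunt-based monitor performs.
# Real monitors also correct for charging efficiency and temperature; those
# refinements are omitted here, and the class name is purely illustrative.

class BatteryMonitor:
    def __init__(self, capacity_ah):
        self.capacity_ah = capacity_ah
        self.remaining_ah = capacity_ah          # assume the bank starts full

    def record(self, amps, hours):
        """Positive amps means charging; negative amps means discharging."""
        self.remaining_ah += amps * hours
        self.remaining_ah = min(self.capacity_ah, max(0.0, self.remaining_ah))

    def state_of_charge(self):
        return 100.0 * self.remaining_ah / self.capacity_ah

monitor = BatteryMonitor(capacity_ah=400)
monitor.record(amps=-25, hours=8)                # a night of DC loads
print(f"{monitor.state_of_charge():.0f}% charged - time to recharge")
```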

What should I consider when planning a charger installation?

The first and most important thing to consider is the location of the charger. Higher-voltage AC travels better over long distances, whereas low-voltage DC does not, so the charger should be mounted as close to the battery as possible. If the AC source is 30 feet from the battery, the voltage drop across 30 feet of AC wiring will be much less significant than the voltage drop across 30 feet of DC wiring.
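To see why the DC run hurts more, the quick comparison below computes the round-trip voltage drop for the same 30-foot run on each side of the charger. The wire size (4 AWG copper) and the current figures are assumptions chosen purely for illustration.

```python
# Why the charger belongs near the battery: the same 30-foot run costs far more,
# as a fraction of system voltage, at 12 V DC than at 120 V AC. The 4 AWG copper
# wire (about 0.25 ohms per 1000 ft) and the current figures are assumptions.

OHMS_PER_FOOT = 0.25 / 1000      # approximate resistance of 4 AWG copper

def percent_drop(amps, one_way_feet, system_volts):
    """Round-trip voltage drop as a percentage of the system voltage."""
    resistance = OHMS_PER_FOOT * one_way_feet * 2    # out and back
    return 100.0 * amps * resistance / system_volts

print(f"12 V DC side, 80 A:  {percent_drop(80, 30, 12):.1f}% drop")    # about 10%
print(f"120 V AC side, 10 A: {percent_drop(10, 30, 120):.2f}% drop")   # well under 1%
```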

Next on the priority list for consideration is the charger size. Your maximum charger amperage should be 20% of your battery bank size (in Amp Hours). If you have a large bank, you can use one of the chargers on the market that are ‘stackable.’ This means that you can install two 40 amp chargers to get 80 amps of charge. The most effective way is to have the chargers synchronize (or stack) with each other so the charge algorithm works efficiently. This prevents one charger from assuming a fully charged battery because it mistakenly ‘reads’ the voltage of the other charger.

Next you must consider the future of the system. If you plan to eventually add an inverter, you might consider the benefits of an inverter/charger combination unit. Since an inverter and a charger share many of the same components, installing a combination unit provides a cost savings by eliminating duplicate components. When using separate units, one will always sit idle while the other is in use, so for hardware weight and size efficiency, a combination unit is recommended.

What is Power Factor Correction, and how is it important?

While challenging to explain, Power Factor Correction (PFC) allows a charger to draw less incoming energy to provide the same output as its non-PFC counterpart. Power factor is a measure of how efficiently the AC sine wave is used. In a non-PFC charger, the circuitry has a delayed reaction to the alternating current in the incoming sine wave. When PFC is utilized, the circuitry ‘anticipates’ the rise in voltage, eliminating the delay and allowing the circuitry to use the incoming AC more effectively.

Here’s an example to help clarify the concept. When two 80A chargers were compared at full output, the non-PFC charger was drawing over 14A, while the PFC charger drew just over 9A. The end result of the PFC advantage is more amperage available for the other AC devices installed in the system.
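A back-of-the-envelope calculation shows where numbers like these come from: for a given charger output, the AC input current scales inversely with power factor (and efficiency). The efficiency and power-factor values below are assumed for illustration and won’t exactly reproduce the measured figures above.

```python
# Back-of-the-envelope input-current estimate: for the same DC output, input
# current scales inversely with power factor. The 90% efficiency and the two
# power-factor values are assumptions and won't exactly match measured figures.

def input_amps(output_volts, output_amps, input_volts=120.0,
               efficiency=0.90, power_factor=1.0):
    """Approximate AC input current for a given DC charging output."""
    output_watts = output_volts * output_amps
    return output_watts / (input_volts * efficiency * power_factor)

# Two 80 A chargers at a 14.4 V output, one with PFC and one without:
print(f"PFC charger (PF ~0.99):     about {input_amps(14.4, 80, power_factor=0.99):.1f} A")
print(f"non-PFC charger (PF ~0.65): about {input_amps(14.4, 80, power_factor=0.65):.1f} A")
```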


_TECH DOCTOR DON WILSON has worked in technical capacities in the automotive, RV and marine fields and for the military since 1989 and has extensive experience in designing and troubleshooting onboard electrical systems. A former customer service manager dealing with electronic issues, Wilson currently serves as a technical instructor for the RV industry’s RVIA Trouble Shooter Clinics and is a full-time sales application engineer for Xantrex Technology USA Inc._

 
