Maybe this is old news, but today I learned that the charger shipped with the YotaPhone 2 is actually Qualcomm Quick Charge 2.0 compatible. This means you can also use it to charge other Quick Charge 2.0 compatible phones, like my other phone, the Moto G4+. Works perfectly.
Yes, I've been using my QuickCharge 3.0 charger and it's charging at 9 V, ~1.3 A.
kbal said:
Yes, I've been using my QuickCharge 3.0 charger and it's charging at 9 V, ~1.3 A.
Click to expand...
Click to collapse
Don't - fast charging will greatly reduce your battery life.
Sent from my SM-N930F using Tapatalk
kingtiamath said:
Don't - fast charging will greatly reduce your battery life.
Sent from my SM-N930F using Tapatalk
Click to expand...
Click to collapse
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
VirtuaLeech said:
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
Click to expand...
Click to collapse
I'm afraid it does. I have done many experiments myself, and batteries that are frequently fast-charged can, in as little as 6 months, give you no more than 70% of their original capacity.
Sent from my SM-N930F using Tapatalk
..the same goes for wireless charging btw.
Amplificator said:
..the same goes for wireless charging btw.
Click to expand...
Click to collapse
Are you joking? What is your answer based on?
Wireless charging runs on a much lower amperage so it should be the best solution to charge your phone.
nonyhaha said:
Are you joking? What is your answer based on?
Wireless charging runs on a much lower amperage so it should be the best solution to charge your phone.
Click to expand...
Click to collapse
My answer is based on simple physics.
Just because the amps are lower doesn't mean it's not bad for the battery.
Wireless charging is way less efficient than any form of wired charging.
What happens to the loss? Well, it gets dissipated as heat - and what is the "big killer" of lithium batteries? ..heat.
For this single fact alone, denying that wireless charging causes more harm than cabled charging is simply.. well, silly.
The only ones denying this are either unaware of simple science or are lying to you, probably to sell you a charger.
Yes, every form of charging, even at a theoretical 100% efficiency, will heat up the battery due to the chemical reactions inside it, but the lower your efficiency, the more energy is converted into heat - thus you do more damage and get even less actual battery energy out of it.
Simply put: the best charging method is the one that produces the least amount of heat while maintaining a high efficiency - wireless charging is simply not that.
Charging using a cable at 90% means 10% is being converted into heat (not all 10%, but for argument's sake, play along), whereas wireless charging might be at.. 50% depending on the circumstances (probably a lot closer to 70% than 50%, but again, for argument's sake).
This means that the other 50% is just turned into wasted, unnecessary and unwanted heat.
The percentages obviously aren't correct in this example, but it's more to get the point across.
With wireless charging you do more damage to the battery (whether this matters to you is subjective) than you would by using a cable, simply because you create more excess heat, which does nothing but warm up the battery and the surrounding area instead of actually going into the battery as charge.
If we consider the 50% efficiency of the aforementioned example, this means that you would need to charge your device for almost twice as long as when you use a cable. Not only does inductive charging create more heat by its nature, but it will be doing so for, again, almost twice as long.
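To make that arithmetic concrete, here's a minimal back-of-the-envelope sketch. The 90% and 50% efficiencies are just the illustrative figures from this post, and the 10 Wh battery and 10 W wall draw are assumptions, not measurements:

```python
# Back-of-the-envelope comparison of wired vs. wireless charging, using the
# illustrative efficiencies from the post above (not measured values).

def charging_losses(battery_wh: float, efficiency: float, wall_watts: float):
    """Return (energy drawn from the wall in Wh, energy wasted as heat in Wh, hours to charge)."""
    energy_drawn = battery_wh / efficiency      # what you pull from the wall
    wasted_heat = energy_drawn - battery_wh     # the difference ends up as heat
    hours = energy_drawn / wall_watts           # ignoring the taper near 100%
    return energy_drawn, wasted_heat, hours

BATTERY_WH = 10.0   # assumed battery size, roughly 2600 mAh at 3.85 V

for name, eff in [("wired at 90%", 0.90), ("wireless at 50%", 0.50)]:
    drawn, heat, hours = charging_losses(BATTERY_WH, eff, wall_watts=10.0)
    print(f"{name}: draws {drawn:.1f} Wh, wastes {heat:.1f} Wh as heat, takes ~{hours:.1f} h")
```

With those made-up numbers the wireless case wastes roughly ten times the energy as heat and takes nearly twice as long, which is the point being made above.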
Efficiency also depends on things like distance - the less "perfect" your phone's placement on the charger, the less efficient, and thus more wasteful, it is.
Google something like "qi wireless charging overheating" and you will see plenty of people reporting on overheating problems when using wireless charging. This is because of all this wasted energy that is dissipated as heat - instead of "filling" the battery it simply heats it, and the surroundings, up.
Despite chargers being made to the same specs, this seems to differ from charger to charger, as this thread here on XDA indicates: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
If you look at the version specifications, you'll see that version 1.2 of the "low power" Qi branch, which phones are a part of, increased the power to up to 15 W.
Unless they also worked on the efficiency this would actually mean that version 1.2 does more damage to the battery than 1.0 and 1.1, but for that you would have to dive a bit deeper than the information given in that link.
But as always it's sort of subjective as to what point people will see wireless charging as being too wasteful and/or damaging.
Personally, I don't care, because in my opinion the convenience of wireless charging far outweighs the little damage it does to a battery, and the same goes for QuickCharge as well. By the time I would see a noticeable effect on battery life I will probably have bought a new phone anyway.
If we take Qualcomm's QuickCharge as an example, I think QC 3.0 is at the point where people shouldn't really care about the negative impact. If you read the spec sheet for QC 3.0, it's basically a tweaked version of QC 2.0 (well, duh) where the power delivery is controlled much better than it was in QC 2.0, bringing both the efficiency and therefore the speed to a much higher level, even though both are rated for 18 W.
Some reading for those who still doubt basic physics :
http://batteryuniversity.com/learn/article/charging_without_wires
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures
http://batteryuniversity.com/learn/article/all_about_chargers
http://batteryuniversity.com/learn/article/ultra_fast_chargers
..and the best of all: https://google.com/
But let me ask you the same question you asked me; and I quote:
nonyhaha said:
Are you joking? What is your answer based on?
Click to expand...
Click to collapse
..that probably sounded very condescending (which is not how it was intended, of course), but I'm curious as to where you've acquired this absurd idea that Qi wireless charging is the best method of all? It's very likely the worst of all, actually.
There is almost no heat dissipated with QC 3.0.
For me, quick charging is a big help and saves hours, especially if you have a large QC-capable battery pack or power bank. The YotaPhone battery charges especially quickly with a QC charger.
"..Wireless charging is way less efficient than any form of wired charging..."
Yes, because you first convert 110 V or 240 V AC to a lower voltage, e.g. 5 V DC, with an efficiency of maybe 85%.
Then this 5 V DC is chopped into a long-wave AC voltage (about 19 V / 110 to 205 kHz) and sent to a copper coil in the Qi transmitter.
There the energy passes via resonant inductive coupling (a magnetic field) through an air gap to the Qi receiver - again with an efficiency of perhaps 70%.
The magnetic induction in the receiver coil again gives us a long-wave AC voltage, which is converted into an adequate DC voltage (again with an efficiency of about 70%).
So frankly speaking, there may be a bit of truth regarding losses converted to heat - but this heat occurs everywhere except in the Li-Po battery. Heat reaches the battery only in the last step: the conversion of electrical energy into a chemical process inside the Li-Po.
Take a look at the label on your Qi charger and you will notice something like the following: Input 5 V / 2 A, Output 5 V / 1 A (a loss of 50%).
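Multiplying those rough per-stage figures together gives the ballpark end-to-end number; a quick sketch (the stage efficiencies are the guesses from this post, not measurements):

```python
# End-to-end efficiency of the Qi chain described above, multiplying the
# rough per-stage figures from this post (guesses, not measurements).

stages = {
    "AC adapter (mains -> 5 V DC)":      0.85,
    "transmitter coil -> receiver coil": 0.70,
    "receiver AC -> DC conversion":      0.70,
}

total = 1.0
for name, eff in stages.items():
    total *= eff
    print(f"{name}: {eff:.0%}  (cumulative: {total:.0%})")

print(f"End-to-end: about {total:.0%} - the same ballpark as the 50% loss the label implies")
```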
Almost all lithium batteries have their own charging controllers on board which take care of the correct charging parameters. Those controllers are designed to charge - and also quick charge - Li-Po batteries in the right manner.
Enough theory.
Just follow the electrical way:
in the case of a direct charger: USB connector -> copper wire -> smartphone -> copper wire -> LiPo
in the case of a Qi charger: USB connector -> copper coil -> air gap -> copper coil -> copper wire -> LiPo
So there's no basic difference in how the LiPo is connected to the power - in both cases by a copper wire.
In both cases you can charge with, let's say, 5 V / 1 A (of course the LiPo will be charged with its own characteristic voltage and current).
Modern LiPos are built for a life of 700 to 1000 charging cycles (about 2 years), and nobody knows whether a LiPo would live longer if charged slowly.
You can charge your smartphone in a fridge to prevent high temperatures.
USB devices are smart; they negotiate the charge current between themselves via a protocol. There is no danger in taking a smartphone capable of charging at 1.4 A and connecting it to a 2.1 A charger.
You should take more care with the USB cable - it has to be able to pass the required amps to the device.
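As a toy illustration of that weakest-link idea (the numbers are made up, and real USB charge-current selection is done by the charging protocol in hardware/firmware, not by code like this):

```python
# Toy model of the point above: the actual charge current is capped by the
# weakest link - what the device will accept, what the charger can supply,
# and what the cable can carry. Illustrative only; real USB negotiation is
# handled by the charging protocol, not application code.

def effective_charge_current(device_max_a: float, charger_max_a: float, cable_max_a: float) -> float:
    return min(device_max_a, charger_max_a, cable_max_a)

# A phone that accepts 1.4 A on a 2.1 A charger with a decent cable is fine...
print(effective_charge_current(device_max_a=1.4, charger_max_a=2.1, cable_max_a=2.0))  # 1.4

# ...but a flimsy cable good for only 0.5 A becomes the bottleneck.
print(effective_charge_current(device_max_a=1.4, charger_max_a=2.1, cable_max_a=0.5))  # 0.5
```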
Yes, it's less efficient and worse for the battery; maybe it takes a few minutes longer to charge and costs a little more per charge. But it's much more pleasant not to use cables, and very impressive too. I love wireless charging.
Amplificator said:
My answer is based on simple physics.
[...]
Click to expand...
Click to collapse
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter part, so there is no heating of the receiver coil and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said to me before, you should get your physics knowledge up to date. I am a graduate with a physics degree.
nonyhaha said:
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter part, so there is no heating of the receiver coil and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said to me before, you should get your physics knowledge up to date. I am a graduate with a physics degree.
Click to expand...
Click to collapse
Yes, I do think I know what I'm talking about - but luckily you came to the rescue and used your alleged physics degree to write a reply that proved me wrong with all of your facts, right?
Oh, no.. you didn't - you just doubled down instead, well done.
It doesn't matter (and is not important in this case) where the heat dissipation happens (and never did I claim it happened at the receiver - only that it happens) - the battery is still being heated up regardless, due to the energy loss.
If someone with an alleged physics degree keeps denying that the battery heats up as described in my previous post, then I doubt you finished at the top of your class, if at all, sorry. I would really like to see all the evidence you have against what I wrote in my previous post (and what tons of people are posting about on the interwebz).
Just give it a go on Google, such as this thread from XDA: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
You can even do a simple charging test of your own: just compare battery temperatures while using a Qi wireless charger, a QC 2.0 charger, and a regular 1 A charger.
Is everyone posting about high temperatures while using Qi chargers lying? Why would they do that? ..maybe the wired-charging mafia is paying people to discredit the WPC and other groups.. hm, maybe.
Yeah, it's getting a bit ridiculous, but silly claims require silly responses, sorry
If you can actually prove what I was saying in my previous post is wrong then I'll gladly accept it, but I do not take "na-ah, not true" with any degree of seriousness and neither do I give credit to claims of physics degrees. In that case I'm an ESA astronaut currently in space - see where this is going?
I go by what you actually write, not what you claim. The only reason for boasting about alleged degrees is to divert attention from the lack of any credible proof - disprove what my previous post said and I'll gladly accept it.
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201)
Am I missing something?
Any ideas/replies would be greatly appreciated!
The YotaPhone 2 charger should indicate active quick charging by lighting up "Yotaphone" with white LEDs on the charger. If it's charging at 5 V only, your charger doesn't light up.
I don't think there's anything wrong with your phone, just that the charging estimation is inaccurate (at the moment).
Well, my charger's Yotaphone logo is lighting up, so I guess it's working then. Thanks for the reply!
I've had to put the two-pin Yotaphone charger block into a three-pin UK adaptor to try it. Annoyingly and worryingly, it buzzes a lot, and quite loudly - is that the same for everyone?
zippyioa said:
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201)
Am I missing something?
Any ideas/replies would be greatly appreciated!
Click to expand...
Click to collapse
Perhaps something is wrong with the cable, not the charger. Something like that happened to me some time ago - when I used a different cable, QC worked again.
I had already tried three different chargers and two different cables.
If the earlier post about the Yotaphone charger lighting up is correct, I think the phone is quick charging OK.
I guess I was expecting something similar to my Note 4, where it actually stated "fast charging" in the battery menu if I was charging it with a QC charger.
That message would then change to "charging" if I used a standard charger instead.
I bought a new power bank; it seems to charge other phones OK, but NOT the YotaPhone. The power bank displays the charge percentage for about 10 seconds, then the display goes off - but so does the YotaPhone's charging. Other phones and gadgets don't cut off. Does anyone else have this?
The power bank is QC 3.0. I have tried using different cables; it's always the same.
Sometimes it charges OK. I thought my power bank was fake until I found that it charged other gadgets well.
I also noticed that the YotaPhone 2 sometimes doesn't want to charge. I just plug it in (original cable & charger), the YotaPhone logo lights up, but the phone just doesn't charge! I will try with my power bank and see what happens.
I haven't figured out the cause yet; maybe it's because mine has an unlocked bootloader, TWRP, root, and Xposed (YD206).
Related
Hello!
Just curious if there is an issue with using my new Nexus 10 2A charger with other phones, such as my HTC Sensation or Blackberry Torch?
The Sensation uses a 1A charger, but I assume the phones are smart enough to only draw the current necessary, so they won't be damaged by drawing too much?
I'd like to just use the Nexus 10 charger and not have to carry other ones.
yes it is fine
Cool thanks
EniGmA1987 said:
yes it is fine
Click to expand...
Click to collapse
I heard, though, that:
*first, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second (this is less science/engineering, but someone said it), the specific pins are made by different companies and the pins themselves can vary in impedance, thus changing the overall circuitry of the device in the long run
*third, Li-Ion can pull more current than the default charger provides and tends to do so to charge faster, albeit at the cost that the battery's overall life deteriorates, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
nutnub said:
I heard, though, that:
*first, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second (this is less science/engineering, but someone said it), the specific pins are made by different companies and the pins themselves can vary in impedance, thus changing the overall circuitry of the device in the long run
*third, Li-Ion can pull more current than the default charger provides and tends to do so to charge faster, albeit at the cost that the battery's overall life deteriorates, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
Click to expand...
Click to collapse
Wish I knew for sure too. Really, I don't care a lot about my HTC Sensation as I plan on getting a Nexus 4 LTE when it eventually comes out. Hopefully those come with 2A chargers!
Sure I could get a Nexus 4 and use LTE right now on Bell, but I'd rather wait for an official one.
nutnub said:
I heard, though, that:
*first, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second (this is less science/engineering, but someone said it), the specific pins are made by different companies and the pins themselves can vary in impedance, thus changing the overall circuitry of the device in the long run
*third, Li-Ion can pull more current than the default charger provides and tends to do so to charge faster, albeit at the cost that the battery's overall life deteriorates, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
Click to expand...
Click to collapse
Everybody seems to misunderstand LiPo charging, as it is different from previous battery technologies.
For general LiPo Information, you should look here. Charging information is about halfway down the page
http://www.rchelicopterfun.com/rc-lipo-batteries.html
I'll quote the important part:
Selecting the correct charge current is also critical when charging RC LiPo battery packs. The golden rule here use to be "never charge a LiPo or LiIon pack greater than 1 times its capacity (1C)."
For example a 2000 mAh pack, would be charged at a maximum charge current of 2000 mA or 2.0 amps. Never higher or the life of the pack would be greatly reduced. If you choose a charge rate significantly higher than the 1C value, the battery will heat up and could swell, vent, or catch fire.
Times are a changing...
Most LiPo experts now feel however you can safely charge at a 2C or even 3C rate on quality packs that have a discharge rating of at least 20C or more safely and low internal resistances, with little effect on the overall life expectancy of the pack as long as you have a good charger with a good balancing system. There are more and more LiPo packs showing up stating 2C and 3C charge rates, with even a couple manufactures indicating 5C rates. The day of the 10 minute charge is here (assuming you have a high power charger and power source capable of delivering that many watts and amps).
Click to expand...
Click to collapse
Pretty much all phones are right around 2000 mAh capacity nowadays, so even going by the "old" golden charging rule a 2A charger would be safe to use. My Galaxy Nexus came with (I think) a 1A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2A charger for both devices and never once had an issue. It has been 8 months now of using the 2A charger on my phone. Idle life can still reach a little over 3 days on a single charge, and I still get some of the best screen-on times of most people I know around the forums. So yes, from personal experience, a 2A tablet charger is completely fine to use on a phone.
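For anyone who wants to plug in their own pack's numbers, here's a small sketch of that C-rate rule of thumb (example capacities only; check your own pack's rating):

```python
# The "C rate" rule of thumb from the quote above: maximum charge current is
# the pack capacity multiplied by the allowed C rate. Example numbers only.

def max_charge_current_a(capacity_mah: float, c_rate: float = 1.0) -> float:
    """Capacity in mAh times the C rate, returned in amps."""
    return capacity_mah * c_rate / 1000.0

print(max_charge_current_a(2000, 1.0))   # 2.0 A - the old "never above 1C" rule
print(max_charge_current_a(2000, 2.0))   # 4.0 A - a quality pack rated for 2C charging
print(max_charge_current_a(2100, 1.0))   # 2.1 A - e.g. a 2100 mAh phone battery at 1C
```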
Charging circuitry is built into the device, not the "charger"
Nothing to worry about
EniGmA1987 said:
I'll quote the important part:
Pretty much all phones are right around 2000 mAh capacity nowadays, so even going by the "old" golden charging rule a 2A charger would be safe to use. My Galaxy Nexus came with (I think) a 1A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2A charger for both devices and never once had an issue. It has been 8 months now of using the 2A charger on my phone. Idle life can still reach a little over 3 days on a single charge, and I still get some of the best screen-on times of most people I know around the forums. So yes, from personal experience, a 2A tablet charger is completely fine to use on a phone.
Click to expand...
Click to collapse
Is it safe to assume that all chargers come default at 1C charging for their device? Because if that's the case, I figure most electronics we own can just be replaced with 10w chargers (which would make life much more convenient).
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Thanks for teaching me lots!
-newb, happily reading away
I bought one of those 2 A dual chargers from a seller on Amazon. It wasn't really cheap either (in cost anyway - I spent a bit more hoping it would be higher quality). After plugging in my MotoRAZR and the wife's Lumia, the charger popped and some plastic from the housing of the charger flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung etc). I'd happily mix and match them to the phones but I'd be wary of buying a no name Chinese jobby from Ebay or Amazon marketplace.
Sent from my XT910 using xda premium
nutnub said:
Is it safe to assume that all chargers come default at 1C charging for their device? Because if that's the case, I figure most electronics we own can just be replaced with 10w chargers (which would make life much more convenient).
Click to expand...
Click to collapse
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging, because if you charge fast they get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and expose nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery without letting it get too hot, and also the way it is filled (as with the fizzy drink: pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it overspilling).
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Click to expand...
Click to collapse
Unfortunately, industry is full of products made to a budget, usually by using cheaper components/designs(the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, feedback in reviews/forums.
Basically, it boils down to 'consumer testing'
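Since "constant current / constant voltage" came up in the question above, here's a very simplified sketch of what a CC/CV charger does. The cell parameters (2000 mAh, 4.2 V target, 0.1 ohm internal resistance, linear OCV curve) are made up for illustration; a real charger IC also watches temperature, timers and so on:

```python
# Very simplified CC/CV charging loop: constant current until the cell reaches
# the target voltage, then hold that voltage and let the current taper off.
# Idealised toy model with made-up cell parameters, not a real charger algorithm.

CAPACITY_MAH = 2000.0
CC_CURRENT_MA = 1000.0            # 0.5C constant-current phase
V_TARGET = 4.20                   # constant-voltage set point
CUTOFF_MA = CAPACITY_MAH / 20.0   # stop when the current tapers to ~C/20
R_INTERNAL = 0.1                  # ohms, assumed internal resistance

def open_circuit_voltage(soc: float) -> float:
    """Toy OCV curve: 3.5 V empty -> 4.2 V full."""
    return 3.5 + 0.7 * soc

soc, minutes, current = 0.2, 0, CC_CURRENT_MA    # start at 20% charge
while current > CUTOFF_MA and soc < 1.0:
    terminal_v = open_circuit_voltage(soc) + (current / 1000.0) * R_INTERNAL
    if terminal_v >= V_TARGET:
        # CV phase: back the current off so the terminal voltage stays at the target
        current = max((V_TARGET - open_circuit_voltage(soc)) / R_INTERNAL * 1000.0, 0.0)
    soc += (current / 60.0) / CAPACITY_MAH       # charge added in one minute
    minutes += 1

print(f"Charged to ~{soc:.0%} in about {minutes} minutes")
```

The taper in the CV phase is also why the last few percent of a phone charge always seem to take disproportionately long.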
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A
Hopefully, USB3.0 will quickly become a new standard for portable devices.
more questions!
First of all, let me please thank you for responding and being so thorough with your answers! There is so much information out there, and in my 22 years of existence, I cannot for the life of me sort through the sheer amount of data. I do greatly enjoy reading every little thing that is posted, especially in this thread because I think it's super important to understand the electronics that we interact with.
sonicfishcake said:
I bought one of those 2 A dual chargers from a seller on Amazon. It wasn't really cheap either (in cost anyway - I spent a bit more hoping it would be higher quality). After plugging in my MotoRAZR and the wife's Lumia, the charger popped and some plastic from the housing of the charger flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung etc). I'd happily mix and match them to the phones but I'd be wary of buying a no name Chinese jobby from Ebay or Amazon marketplace.
Sent from my XT910 using xda premium
Click to expand...
Click to collapse
My concern with this is that if Motorola or Samsung does put out a less-than-optimal product, would we all know? Another way of asking this is: how do we know that Apple/Motorola/Samsung/Lenovo really do produce superior products and it's not merely a matter of advertising or brand image? Do you think there is a way to know, as a consumer, whether even third-party products are becoming more competitive, given that smaller companies have a much harder time advertising and building a name/brand for themselves? (If you can't tell, I am rooting for the little guys, because I may one day work for the little guys.)
skally said:
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging, because if you charge fast they get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and expose nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery without letting it get too hot, and also the way it is filled (as with the fizzy drink: pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it overspilling).
Click to expand...
Click to collapse
Thank you for clarifying for us. Would you happen to know if there are specifics to the recharge specs, short of finding me published papers on the technology? What you said is definitely what I've been reading on the Internet and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
skally said:
...
Unfortunately, industry is full of products made to a budget, usually by using cheaper components/designs(the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, feedback in reviews/forums.
Basically, it boils down to 'consumer testing'
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A
Hopefully, USB3.0 will quickly become a new standard for portable devices.
Click to expand...
Click to collapse
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (which is stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
nutnub said:
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (which is stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
Click to expand...
Click to collapse
If the Nexus kernel says the limit is 2A then that's it. It can't use more power.
Have you seen the internal USB 3.0 cable?
It's at least twice as thick as a USB 2.0 cable. I got a new chassis for my computer last week, with a couple of USB 2.0 front ports and one USB 3.0 front port.
And if your motherboard's built for USB 3.0, I'm pretty sure it can take the current. Otherwise there would be no point in adding 3.0 support.
Sent from my Nexus 10 using xda app-developers app
If something is listed as a USB3 port, it must be up to USB3 certifications; otherwise the manufacturer of the device is liable for a huge lawsuit if issues arise. If something says USB3, that doesn't mean it IS drawing 25 W though, just that the port is capable of having 25 W pulled through it over the USB connector. Same with USB2 and its 9 W limit in the spec. Also, plugging a tablet such as this into a computer's USB3 port does not mean it will charge faster or get faster data transfers, since the cable being used and the device are still of the older specification.
nutnub said:
Thank you for clarifying for us. Would you happen to know if there are specifics to the recharge specs, short of finding me published papers on the technology? What you said is definitely what I've been reading on the Internet and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
Click to expand...
Click to collapse
Have a look here for info on the recharging process for Lithium based cells.
https://sites.google.com/site/tjinguytech/charging-how-tos/the-charging-process
It is worth noting the level of precautions taken while charging the cells aggressively. You really don't need a bucket of sand on standby when you plug your phone in to its charger.
nutnub said:
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (which is stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
Click to expand...
Click to collapse
There are actually 2 different current limits for each USB specification: USB2.0 has 0.5A and 1.8A, while USB3.0 has 1.5A and 5.0A
The lower of the current limits is what I would expect to get from a USB port on a computer, while the higher one I would expect to get from a dedicated charger.
I believe the higher current specification was added purely for charging mobile devices, as it is only achieved by adding a resistance across D+ and D-, removing the data transmission capabilities of the port. I don't know if that's practical, or possible with a computer USB port.
I do remember seeing motherboards with ports specifically designed for fast charging, but I haven't got any info on them as yet.
There are also kernels which enable "fast charging" on a PC. Basically it removes the data connection in software and treats any USB connection as if it were plugged into AC. You can charge just as fast on a computer as you can on a wall charger when this feature is enabled in the kernel.
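To put the limits quoted above side by side, here they are with the wattage they work out to at the nominal 5 V bus voltage (the current figures are the ones quoted in this thread):

```python
# The USB current limits quoted in this thread, and the wattage they work out
# to at the nominal 5 V bus voltage.

BUS_VOLTAGE = 5.0

limits_a = {
    "USB 2.0 data port":         0.5,
    "USB 2.0 dedicated charger": 1.8,
    "USB 3.0 data port":         1.5,
    "USB 3.0 dedicated charger": 5.0,
}

for port, amps in limits_a.items():
    print(f"{port}: {amps:.1f} A -> {BUS_VOLTAGE * amps:.1f} W")
```

That's where the 9 W and 25 W figures mentioned earlier in the thread come from.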
I am using the N10 charger for my Note 2 and it charges bloody fast. Charging is noticeably faster on the Note 2 than with the stock 1A charger that came with the Note.
The battery is not getting warm, and battery temps are similar to those on the 1A charger. Basically it's cutting the charging time almost in half.
Agreed. The Note 2 charger is awesome. I also bought a PowerGen 3.1 A car charger for the Note 2 after watching videos and reading up on proper car chargers for the phone. Guess I can use it for my Nexus 10 too.
Sent from my Nexus 10 using xda premium
I own RC cars with LiPo batteries, and the rule of thumb is: total mAh divided by 1000 = the maximum amps of the charger you can use. So a 2100 mAh battery can be charged with a 2.1 A charger.
On that note, I charge my Samsung S3, which has a 2100 mAh battery, with a 2.1 A car charger without any issue.
Sent from my SGH-T999 using Tapatalk 2
I used the N10's charger to charge my iPod Nano 3rd gen, no problem
Hello friends,
So I just got my Note 4 and I'm wondering how long I should charge it for the first time. And should I drain it on first use, or charge it when it's at, let's say, 20%?
Thanks in advance.
14 hrs, don't drain, battery should be between 20-80% before charging in normal use, fast charge off.
@zurkx
Thanks for the reply.
Are you sure about the 14 hours??? I thought Li-ion batteries don't need that long of a charging time !!!
XeroHertZ said:
@zurkx Are you sure about the 14 hours??? I thought Li-ion batteries don't need that long of a charging time!!!
Click to expand...
Click to collapse
Please happily ignore that "advice".
Use fast charge; charging takes exactly until the battery is full - that's about 1.5 hours for a full charge.
I don't see ANY sense in charging a Li-Ion battery "fuller than full" - that's just impossible nonsense.
Li-Ion batteries suffer from aging - slightly increased by the number of charges, highly (!) increased by overheating - not from any memory effects.
There is NO "breaking in" of the Note 4's battery; the amperage of fast charge doesn't come even near the safety limits and won't cause quick degradation or overheating.
So just don't listen to the immortal myths and "ancient wisdom" propagated by people not aware of the fact that battery technology has indeed changed over the decades.
Chefproll said:
Please happily ignore that "advice".
Use fast charge; charging takes exactly until the battery is full - that's about 1.5 hours for a full charge.
I don't see ANY sense in charging a Li-Ion battery "fuller than full" - that's just impossible nonsense.
Li-Ion batteries suffer from aging - slightly increased by the number of charges, highly (!) increased by overheating - not from any memory effects.
There is NO "breaking in" of the Note 4's battery; the amperage of fast charge doesn't come even near the safety limits and won't cause quick degradation or overheating.
So just don't listen to the immortal myths and "ancient wisdom" propagated by people not aware of the fact that battery technology has indeed changed over the decades.
Click to expand...
Click to collapse
Thanks, Chefproll. I have done some research on charging the battery and have come to the conclusion that once it's charged I can use it straight away, then drain it to 18 to 20% and charge it fully.
Chefproll said:
Please happily ignore that "advice".
Use fast charge; charging takes exactly until the battery is full - that's about 1.5 hours for a full charge.
I don't see ANY sense in charging a Li-Ion battery "fuller than full" - that's just impossible nonsense.
Li-Ion batteries suffer from aging - slightly increased by the number of charges, highly (!) increased by overheating - not from any memory effects.
There is NO "breaking in" of the Note 4's battery; the amperage of fast charge doesn't come even near the safety limits and won't cause quick degradation or overheating.
So just don't listen to the immortal myths and "ancient wisdom" propagated by people not aware of the fact that battery technology has indeed changed over the decades.
Click to expand...
Click to collapse
Thanks! I thought it would be an old myth to first drain the battery and then fully charge it, but as far as I know that only applies to old phones and MP3 players and such.
Hope I will get my Note 4 today! I've been waiting for it since Monday.
Fast Charge is not really a useful feature for me; it just hurts the battery more in the long run.
what about the thoughts on conditioning the battery?
Sent from my SM-N910C using XDA Free mobile app
There's no need to condition the battery; it's a lithium battery.
If you're having battery drain issues I would suggest you clear your data cache.
ddaharu said:
what about the thoughts on conditioning the battery?
Sent from my SM-N910C using XDA Free mobile app
Click to expand...
Click to collapse
this is the same guy making up stuff about the note 4 GPS being bad.
dont listen to fools.
First charge needs to be 14 hours to trickle charge the battery to full and make sure the meter is calibrated to a full battery.
fast charge does reduce battery life since it charges at higher voltage and amperage. any battery gets damaged a little by that. best is a slow charge (preferably Qi) at a normal charging voltage. Slower the better for longer battery life. if you want convenience over battery life then by all means fast charge and mess it up and replace after 2-3 years.
Whose post are you referring to?
zurkx said:
this is the same guy making up stuff about the note 4 GPS being bad.
dont listen to fools.
First charge needs to be 14 hours to trickle charge the battery to full and make sure the meter is calibrated to a full battery.
fast charge does reduce battery life since it charges at higher voltage and amperage. any battery gets damaged a little by that. best is a slow charge (preferably Qi) at a normal charging voltage. Slower the better for longer battery life. if you want convenience over battery life then by all means fast charge and mess it up and replace after 2-3 years.
Click to expand...
Click to collapse
arjun90 said:
Whose post are you referring to?
Click to expand...
Click to collapse
It's mine. That guy already bumped into me a while ago; now it's time for his revenge.
I'll take care of that now...
zurkx said:
this is the same guy making up stuff about the note 4 GPS being bad.
Click to expand...
Click to collapse
So here we go; you asked for it...
My criticism of the Note 4 refers to its GPS receiver, which is "deaf" compared to the competition and shows frequent signal drops.
More here: http://forum.xda-developers.com/note-4/general/gps-close-to-unusable-t2948602
dont listen to fools.
Click to expand...
Click to collapse
Indeed - have a look:
First charge needs to be 14 hours to trickle charge the battery to full and make sure the meter is calibrated to a full battery.
Click to expand...
Click to collapse
I already advised you to realize that this is 2014 battery technology, not the ancient batteries of the past.
In short: there is no "trickle charge" with lithium-ion batteries.
See this: http://batteryuniversity.com/learn/article/charging_lithium_ion_batteries - quote: "The difference lies in a higher voltage per cell, tighter voltage tolerance and the absence of trickle or float charge at full charge."
fast charge does reduce battery life since it charges at higher voltage and amperage. any battery gets damaged a little by that.
Click to expand...
Click to collapse
Quote: "The charge rate of a typical consumer Li-ion battery is between 0.5 and 1C in Stage 1, and the charge time is about three hours. Manufacturers recommend charging the 18650 cell at 0.8C or less."
"C" is the capacity, 3220 mAh with our Note 4's battery. So we're save to charge with a current (milliamperes, "mA") of up to 3220 mA - if we follow the manufacturer's advice for the older type of batteries of that kind (18650 is an old warrior in the field), there's still 2576 A left.
So what does our fast charge supply deliver ? Look at it's ratings: 5 V, 2 A (2000 mA).
So even fast charge is far below the limits - our real limit is 3220 mA, but fast charging just uses 2000 mA.
Sound and safe.
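A quick sanity check of those numbers (the 3220 mAh capacity and the 0.8C guideline are from the post above; the ~90% charge efficiency is an assumption, and the taper at the end of the charge is ignored):

```python
# Sanity check of the figures above: is a 2 A charger within the 0.8C
# guideline for a 3220 mAh pack, and roughly how long does a full charge take?

CAPACITY_MAH = 3220.0
CHARGER_MA = 2000.0
C_GUIDELINE = 0.8
EFFICIENCY = 0.9      # assumed charge efficiency

recommended_max_ma = CAPACITY_MAH * C_GUIDELINE
verdict = "within" if CHARGER_MA <= recommended_max_ma else "above"
print(f"0.8C limit: {recommended_max_ma:.0f} mA, charger supplies {CHARGER_MA:.0f} mA -> {verdict} the guideline")

hours = CAPACITY_MAH / (CHARGER_MA * EFFICIENCY)
print(f"Rough full-charge time: {hours:.1f} h (longer in practice because of the taper near 100%)")
```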
Wonder why I highlighted "higher voltage" in zurkx's highly elaborate statement in red? - The answer is above: the voltage does NOT change, it is NOT higher. Of course not!
The worst enemies of Li-Ion batteries are heat and age.
Heat is generated by a) placing the device in a hot spot (like behind the car's windscreen or in bright sunlight), b) using demanding features like 4K video recording or high-end games, c) charging.
a) Your call. Just don't let your Note get hot. Overheating destroys your battery in no time. We're lucky we've got an exchangeable battery - so nothing to really worry about.
b) Your call. See a).
c) Charging produces some heat, especially on the "last mile", when the battery is "almost full", because the battery is a bit reluctant to be charged right up to the brim. So more heat is generated in that last phase. It's not much and won't reach the safety limits. It just can't, because the built-in charging circuit limits the current if the temperature rises.
By the way: that integrated charging circuit is packed with safety measures, checking charge, condition, temperature and the like.
So even if you hook up a charger capable of providing a whopping 20 amperes, the circuit just won't let that happen.
There is no way of providing the battery too much current; it's automatically limited.
best is a slow charge (preferably Qi) at a normal charging voltage. Slower the better for longer battery life.
Click to expand...
Click to collapse
Again: welcome to the 21st century. We don't need any slow charging. It's the opposite.
Charging really slowly carries the danger that apps on the phone draw more power than the charger provides. That may drain your battery instead of filling it.
Plus: if you leave the charger hooked up for long, the battery will be recharged (the charge gets "topped off") frequently. And every new charging attempt has a slightly negative impact on the battery's life; it's like wearing it down a bit. Charge often, reduce your battery's life. That damage is tiny, by the way. But it is there, so hooking up your charger for many hours slowly kills your battery.
Now for the aging:
if you want convenience over battery life then by all means fast charge and mess it up and replace after 2-3 years.
Click to expand...
Click to collapse
Li-Ion batteries ARE aging, starting from the time of manufacture.
You all know that: you charge a device exactly as the instructions tell you - but after 1 to 3 years you notice a severe drop in usage time, a drop in capacity.
That's aging.
There is NOTHING you can do about that but buy a new battery.
So your battery will lose its capacity over time, whether you use it or not. You all know that; you've all experienced it.
With the Note 4, we can happily buy a new battery if the old one runs out; it's that simple. But as a normal Li-Ion battery reaches the end of its shelf life after 2 or 3 years anyway, there's NO (!) need to burden it and yourself with slow charging. The results are exactly the same, with the difference that you save precious time with fast charging.
And now allow me quoting again:
dont listen to fools.
Click to expand...
Click to collapse
Have a nice day, all of you except one.
You're completely wrong.
The QuickCharge tech charges at higher VOLTAGE and AMPERAGE.
http://www.androidauthority.com/quick-charge-explained-563838/
Quick Charge 1.0 vs. Quick Charge 2.0 (from the table there):
Voltages: 5 V (QC 1.0) vs. 5 V / 9 V / 12 V (QC 2.0)
Max current: 2 A (QC 1.0) vs. 3 A (QC 2.0)
Supported SoCs (QC 2.0): Snapdragon 200, 400, 410, 615, 800, 801, 805
The rest is just BS as usual. You have no idea what you're talking about. Dumping 9 V (Samsung Note 4 AFC) into a 5 V battery makes it charge hotter and faster and degrades it significantly. After two weeks of fast charge I lost a small chunk off the top of my brand new battery.
Just bad advice, as usual.
zurkx said:
You're completely wrong.
Click to expand...
Click to collapse
Yes, indeed. I was completely wrong to believe you'd understand some simple things.
In fact, I am not sure whether I should take your statements seriously or just as a joke.
The QuickCharge tech charges at higher VOLTAGE and AMPERAGE.
Voltages: 5 V (QC 1.0) vs. 5 V / 9 V / 12 V (QC 2.0)
Max current: 2 A (QC 1.0) vs. 3 A (QC 2.0)
Click to expand...
Click to collapse
So you REALLY believe that changes to the output voltage of the POWER SUPPLY lead to the BATTERY being charged with more volts?
You can't be serious. That's technically impossible.
Let's put it easy:
If you insert your power supply into a 110 V receptacle in the USA, you get 5 V output.
So according to your "logic", using the same power supply in Europe (230 V) increases the voltage to 10 V?
No. Just NO.
That higher POWER SUPPLY voltage is used for fulfilling the rule W = V * A (Watt = Volt * Ampere); just to be able to squeeze more power through the power supply's cable.
In the Note 4 and in the charging circuit, that voltage OF COURSE will be regulated down to the regular charging voltage - just with the benefit to carry more amperes.
So the CHARGING VOLTAGE stays the same; it does NOT follow the voltage supplied by the POWER SUPPLY. It never does.
So fast charging does NOT (read that: NOT !) increase the charging voltage. It cannot.
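The W = V * A point is easy to check against the adapter ratings mentioned in this thread: 5 V / 2 A for a standard charger, and 9 V / 1.67 A as one of the advertised Quick Charge 2.0 steps (it also comes up later in the thread). This is the power carried on the cable; the charging circuit in the phone then converts it down to the battery's own charging voltage:

```python
# Power carried on the cable at the adapter ratings mentioned in this thread.
# This is cable-side power; the phone's charging circuit converts it down to
# the battery's own charging voltage.

adapter_ratings = [
    ("standard 5 V / 2 A",        5.0, 2.00),
    ("QC 2.0 step, 9 V / 1.67 A", 9.0, 1.67),
]

for name, volts, amps in adapter_ratings:
    print(f"{name}: {volts * amps:.1f} W into the cable")
```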
Got that now? - Or do I need to put it even more simply?
It does not help to use swear words like "fool" or "bull****".
But it could help just saying: "Oh, sorry, I was wrong. - My apologies."
Make yourself familiar with the basics of lithium-ion and charging technology. THEN speak up.
Ah, overlooked something:
After two weeks of fast charge I lost a small chunk off the top of my brand new battery.
Click to expand...
Click to collapse
1) Hope that chunk fell somewhere you were able to pick it up again.
2) How do you KNOW that? I expect a detailed description of how you did the magic of finding out that your battery doesn't charge to 100%.
3) If you KNEW that fast charging would kill your battery, wise man - why did you allegedly use the feature? - Sorry, man... your statements are not very trustworthy. I guess you never used that feature and just say so to strengthen your shaky point of view. Please don't mess with a perception psychologist.
4) If your battery really suffered, that might be due to your highly acclaimed and absolutely pointless 14-hour charging marathons, causing a permanent charge-on/charge-off cycle and weakening your battery.
So please just stop bashing a real useful feature of the Note 4. If you just love waiting ages for batteries to charge - your preference. But please stop spreading false facts about things you very obviously are not at home with.
And a last thing which might stop that aimless harassing fire of yours: I am a HAM, a licensed amateur radio operator holding the highest German licence class. These are people who know a bit about volts and amperes.
How hard is it for you to understand that Quick Charge 2.0 outputs higher VOLTAGE and AMPERAGE to charge the battery? The charger charges the BATTERY AT 9 V / 1.67 A up to 50% and then switches over to the regular 5 V / 2 A charge rate. INPUT VOLTAGE (110 V or 230 V) has nothing to do with OUTPUT VOLTAGE. It charges the battery at 9 V REGARDLESS of INPUT VOLTAGE.
edit:
Also, it has nothing to do with the cable. You must be crazy if you think a cable issue exists whether you transfer 15 W or 10 W across it; the cable is rated for well beyond that. The reason for the higher voltage is that modern lithium-ion batteries can accept high-voltage charge rates with limited damage at low amperage. The reason they cut it off at 50% is that the battery would be severely damaged if you tried to charge it to 100% and overshot. So yes, Quick Charge 2.0 really does charge your battery at a higher voltage than it was designed to be charged at. And no, they don't have a magical transformer in your phone to go from 9 V to 5 V - otherwise they would be using it all the time and fast charging at 9 V to 100%. The wall plug is the only thing which has a transformer, and the phone uses what it gets from there. They aren't going to build half of another wall plug (a 9 V DC-DC converter) and stuff it into the phone; it would generate heat and add bulk. Instead the PMIC "spikes" the battery with higher voltage and keeps it roughly constant (load modulation) by communicating with the Quick Charge 2.0 AFC on the other end.
Hopeless.
I just love these battery threads; there's always some muppet who says the battery needs conditioning and must first be charged for a suitably ridiculous length of time. When it's charged, it's charged; lithium batteries have no memory effect, so the idea of conditioning them is moronic.
Sent from my SM-N910F using XDA Free mobile app
Yes, they have no memory effect. Why? Because you say so.
Other people believe otherwise, because they actually test things out for themselves:
http://www.psi.ch/media/memory-effect-now-also-found-in-lithium-ion-batteries
http://pocketnow.com/2013/05/03/li-ion-batteries-memory-effect
http://www.nature.com/nmat/journal/v12/n6/full/nmat3623.html
No need to keep it plugged in for 14 hours; as they said in the catalog, you only need to charge it till it's full, then unplug the charger.
Hello again !
After everything cooled down a bit, here's some more information about that dreaded HIGH VOLTAGE that fast charging uses, which seemingly makes some of you wet your pants.
First, there's an experiment you can do yourself. You don't need to do it - but it's quite impressive and gives you some proof of the things I say.
Get two 9 V batteries, the small rectangular ones we all know. Connect the positive contact of the first battery to the negative contact of the second. Thus you get an 18 volt DC power source.
Get a thin, insulated wire and short-circuit the open contacts with it. Wait.
Nothing special will happen; maybe the wire will get a little warm - and your batteries will eventually die.
(If you use a VERY thin wire, it might heat up.)
Now take a length of the same wire and do the same using your car's battery (12 – 13.8 V DC).
WARNING !
1) Take the battery out of the car, set it on solid ground with nothing combustible near !!! Do NOT try this with the battery still in the car !!!
2) Use pliers to connect the wire with the battery contacts !!!
3) Do that OUTDOORS !!!
Short-circuit the battery contacts using the pliers with the wire.
You don't need to wait. The cable will turn into a smoking, burning, white-hot thing in an instant.
Huh? - We've got 18 V with nothing happening at all, and just 12 V wreaking instant havoc and destruction!?
Amperage is the key !
Voltage alone does not cause the destruction, it's the amperage.
9 V batteries cannot provide sufficient amperes for killing the wire; 12 V car batteries do.
Short: High amperage kills wires, high voltage doesn't.
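If you want to put numbers on it, here's a rough sketch in Python (the internal resistances are assumptions I'm making for illustration, not measured values):
Code:
def short_circuit(source_v, internal_r, wire_r=0.05):
    current = source_v / (internal_r + wire_r)   # Ohm's law: I = V / R
    wire_heat = current ** 2 * wire_r            # power dissipated in the wire: I^2 * R
    return current, wire_heat

sources = [
    ("two 9 V blocks in series (18 V)", 18.0, 4.0),   # ~2 ohm each internally - assumed
    ("car battery (12.6 V)", 12.6, 0.01),             # ~10 milliohm internally - assumed
]
for name, volts, r_int in sources:
    amps, watts = short_circuit(volts, r_int)
    print(f"{name}: about {amps:.0f} A, roughly {watts:.0f} W heating the wire")
The 18 V blocks manage only a few amps and about a watt of heat; the 12 V car battery dumps hundreds of amps and a couple of kilowatts into that same thin wire.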
So back to our topic...
To fast charge our Note 4's battery, we need power - watts. But the tiny wires in the Note 4 can't withstand a high amperage; they would heat up like the wire connected to the 12 V car battery.
So Samsung uses a little trick based on the power formula: W = V * A, where W is watts, V is volts, A is amperes.
So we can achieve a high wattage by EITHER using a higher voltage OR a higher amperage.
Higher amperage does not work because it will kill the tiny wires in the Note.
So Samsung raised the voltage for carrying more watts from the power supply via the internal Note 4's cabling to the charging circuit.
That higher voltage gets transformed down to the normal charging voltage at the charging circuit.
Your battery is charged with the usual voltage, but with the benefits of a higher amperage.
That's all the magic: That higher voltage is used to carry more wattage to the charging circuit, but not beyond. Nothing else.
And that's why it does not harm your battery: the charging voltage does not change - your battery just gets charged faster, always monitored by the charging circuit, which will lower the charge rate if needed, so your battery is always safe. That's why the "last mile" (from about 92 % to 100 %) takes more time to charge - the charging circuit automatically lowers the charge rate to protect your battery.
So don't be afraid of that higher voltage; it never reaches your battery, it is just a means for transferring higher wattage via tiny wires.
Note: Ever wondered why Europeans use 230 V instead of 110 V? That's the reason - being able to carry more watts over regular power lines without the wires heating up too much. It's not a means of destruction, it's the opposite.
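To put some numbers on that trick (a minimal Python sketch; the 15 W target and the wire resistance are illustrative assumptions, not Samsung's exact figures):
Code:
target_watts = 15.0        # roughly what the fast-charge stage delivers - assumed for illustration
wire_resistance = 0.1      # ohms, assumed for the thin internal wiring

for volts in (5.0, 9.0):
    amps = target_watts / volts              # W = V * A, so A = W / V
    wire_loss = amps ** 2 * wire_resistance  # heat in the wire grows with the SQUARE of the current
    print(f"{volts} V -> {amps:.2f} A, ~{wire_loss:.2f} W lost as heat in the wiring")
At 5 V you need 3 A (about 0.9 W of wire heating); at 9 V you only need 1.67 A (about 0.28 W). Same wattage delivered, far less heat in the wires - which is the whole point of raising the voltage.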
I just installed a Qi charger for my new LG G2. I got a few Qi charging pads as well. I want to see how fast it is charging, i.e. the charge rate (500mA, 750mA? etc.), as the receiver is supposed to do the following charge: DC 5V/500mA-1000mA.
I've tried a few apps, but I can't find one that specifically says what the charging rate is. Anyone know the best way to figure that out?
Two questions:
1) When you say you "installed a QI charger for my new LG G2" exactly what do you mean there, and I do mean exactly: are you referring to getting a Qi charging pad (which you mention) or do you mean you got some kind of part that you physically installed in or on your G2 - the Verizon G2 is the only one that supports wireless charging out-of-the-box so, that's why I'm asking.
2) With respect to actual charging, the output of the Qi wireless charging pad is directly related to the amperage/current supplied by the actual AC adapter or USB charger you're using with it. If it's about 1A (the AC or USB charger) then you're going to lose quite a bit of power in the actual charging process because wireless charging is pretty severely inefficient most of the time - give or take, you'd get 400 to 500mA going into the actual device from the charging pad.
What I'm saying is if you have a Qi wireless charging pad, you'd be best served using as high an amperage/current charger for the pad itself as you can, so that the pad can then transfer as much as possible to the device itself. Anything less than a solid 2A charger attached to the Qi wireless charging pad and you're basically wasting a lot of it in the process, and might be better off actually just using the USB port on a computer or something (about 500-550mA max anyway).
Basic rule of thumb: the Qi wireless charging pad can use all the amperage/current it can get, with at least the factory LG 1.8A charger being what I'd call the bare minimum (and with that you'd probably be able to push about 900 to 1000mA (aka 1A) to the device). Qi hardware is roughly 40% efficient so you're going to lose a lot in the process as stated; the more you start with, the more that gets to the device even accounting for the inefficiency.
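If it helps, the back-of-the-envelope math is just this (Python; the 40% figure is the rough estimate above, not a measured value):
Code:
def qi_output_ma(charger_ma, efficiency=0.40):   # 0.40 = the rough estimate from this post
    return charger_ma * efficiency

for charger_ma in (1000, 1800, 2000):
    print(f"{charger_ma} mA in -> roughly {qi_output_ma(charger_ma):.0f} mA into the phone")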
As far as measuring the current, you can try CurrentWidget on the Play Market, it may provide you with some info in terms of the charging rate.
br0adband said:
Two questions:
1) When you say you "installed a QI charger for my new LG G2" exactly what do you mean there, and I do mean exactly: are you referring to getting a Qi charging pad (which you mention) or do you mean you got some kind of part that you physically installed in or on your G2 - the Verizon G2 is the only one that supports wireless charging out-of-the-box so, that's why I'm asking.
2) With respect to actual charging, the output of the Qi wireless charging pad is directly related to the amperage/current supplied by the actual AC adapter or USB charger you're using with it. If it's about 1A (the AC or USB charger) then you're going to lose quite a bit of power in the actual charging process because wireless charging is pretty severely inefficient most of the time - give or take, you'd get 400 to 500mA going into the actual device from the charging pad.
What I'm saying is if you have a Qi wireless charging pad, you'd be best served using as high an amperage/current charger for the pad itself as you can, so that the pad can then transfer as much as possible to the device itself. Anything less than a solid 2A charger attached to the Qi wireless charging pad and you're basically wasting a lot of it in the process, and might be better off actually just using the USB port on a computer or something (about 500-550mA max anyway).
Basic rule of thumb: the Qi wireless charging pad can use all the amperage/current it can get, with at least the factory LG 1.8A charger being what I'd call the bare minimum (and with that you'd probably be able to push about 900 to 1000mA (aka 1A) to the device). Qi hardware is roughly 40% efficient so you're going to lose a lot in the process as stated; the more you start with, the more that gets to the device even accounting for the inefficiency.
As far as measuring the current, you can try CurrentWidget on the Play Market, it may provide you with some info in terms of the charging rate.
Click to expand...
Click to collapse
Firstly, thank you for taking time to write such a great response. I really appreciate it!
1) Yes, I installed a universal sticker. I used this one http://www.amazon.com/gp/product/B00MN3RR7Q/ which is supposedly able to do 1000mA. People in the reviews seem to say they are getting good results. I wish the AT&T version had wireless out of the box, but then if it did I would be stuck with PMA charging. I installed an actual NFC/PMA sticker in my G3. PMA kind of sucks... anywho.
2) This is the pad I am using: http://www.amazon.com/gp/product/B00H9B7ALK/. 1.5A input and 1A output. On this one, I am averaging about 3% per 10 minutes, or roughly 30% an hour. So roughly 3 hours and 20 minutes to a full charge. I can try the stock LG. Oh, my Dell Venue 8 Pro charger is a 2A one. I can try that as well. Can the pad take the 2A in even though it was built for 1.5A in?
I will try CurrentWidget. I've been using Battery Monitor to log as well.
1) Neat, I didn't even know such a thing existed, I may have to give that a shot with my G2 at some point (if I decide to keep it, that is).
2) As stated before, using a higher amperage/current charger or power supply is preferred, sure. It should help get the charging done faster and again the device (meaning the charger) will pull what it requires and nothing more.
Basic electronics 101 here: two things that matter with respect to smartphone chargers (or most any device, to be honest) - amperage aka current and voltage.
Voltage is pushed from a power supply meaning it will always be the same amount, give or take micro-variations. If it's a 5VDC power supply (of any kind) it's designed to provide 5VDC constantly. If it's some other value, say 9VDC, 12VDC, and so on, that's how much it pushes - if you were to connect a 9VDC charger to a smartphone or other device that's designed for a 5VDC input, you'd fry the electrical circuits in the device because it would be flooded with more power than it's designed for.
Amperage aka current is pulled from a power supply and only what is required is what's actually taken. With respect to smartphones, most of the higher end devices these days can make use of roughly 1.2 to 1.8 A (read as Amps) when it comes to charging. This means if you had a charger that output 5VDC (from what I just said above that's the standard worldwide for such devices as smartphones) but could theoretically provide 5A of current, the smartphone technically would not be damaged because it would only pull roughly 1 to 1.8 Amps at most - if you do use CurrentWidget and you plug in the G2 and look at the reading while it's charging, you'll note that the level of amperage/current being pulled from the charger fluctuates like crazy - voltage stays constant (give or take a microvolt here and there) but the current will jump all over the place, especially if you enable the "Smart Charging" feature of the G2.
The reason this happens is because when a LiIon battery is pretty low on a charge, say down to 10-15%, it's "gone deep" as the saying goes and the charging circuit will pull the max amperage/current that the charger is capable of producing and that can be measured/seen using CurrentWidget. As the battery gets into the 90% full range, the amperage/current draw will reduce (again, especially with the Smart Charging enabled) as the battery gets towards being totally full. This is a good thing in most every respect and it keeps the LiIon battery in good shape too - if it pulled the max current till it was 100% it wouldn't necessarily be so good and would heat the battery up more than necessary and LiIon batteries are very sensitive to temperature variations.
Hence, phones get fried by "cheap Chinese chargers" a lot of times because of voltage issues and faulty voltage regulators, not from amperage/current problems. It's actually kind of difficult to kill a device with amperage/current, but screwing around with the voltage will destroy a device almost 100% of the time and quite fast too.
Also, this is the reason why you'll see a phone charge relatively quickly to the 99% point then it seems to take even longer to get that last 1% to finish it off at 100% - it's the way LiIon charging technology works and helps the battery lifespan (meaning how long the battery is useful for measured in years and not "battery life" in terms of how long it can run before you have to charge it again measured in hours). The charging process "slows down" as it gets close to being full which works great for this kind of technology.
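If it helps to visualize that taper, here's a toy sketch in Python - not any phone's real charging algorithm, just the general constant-current / constant-voltage shape:
Code:
def charge_current_ma(percent, max_ma=1800, taper_start=85):
    # Full current until ~85%, then taper toward a trickle. Thresholds are made up for illustration;
    # a real charge controller works from the measured cell voltage, not the displayed percentage.
    if percent < taper_start:
        return max_ma
    remaining = (100 - percent) / (100 - taper_start)   # 1.0 at taper start, 0.0 at full
    return max(max_ma * remaining, 50)

for pct in (20, 50, 85, 90, 95, 99):
    print(f"{pct:>3}% -> about {charge_current_ma(pct):.0f} mA")
That's why the last few percent crawl: the current keeps dropping as the battery approaches full.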
Hope this info helps...
br0adband said:
Basic rule of thumb: the Qi wireless charging pad can use all the amperage/current it can get, with at least the factory LG 1.8A charger being what I'd call the bare minimum (and with that you'd probably be able to push about 900 to 1000mA (aka 1A) to the device). Qi hardware is roughly 40% efficient so you're going to lose a lot in the process as stated; the more you start with, the more that gets to the device even accounting for the inefficiency.
Click to expand...
Click to collapse
Okay, let me see if I understand this correctly. The OUTPUT of the Qi charging pad could be 1000mA, but since the wireless Qi hardware is only 40% efficient, the actual charge rate will be more around 400mA to 500mA? I'm recording an actual charge rate of 500mA, and my phone states it is on AC power and not USB.
If the receiver on the phone states it can do up to 1000mA, what I need to find is a charger that outputs a lot more, like 2000mA, and at 40% efficiency I might be able to get around 1000mA?
That pretty much sums it up, yep - as long as you account for the inefficiency of the Qi charging technology, you can get faster charge times and still use it without having to plug in/unplug, etc the old fashioned way.
It works, it's just not nearly as fast or efficient as the old fashioned way so, give the Qi pad plenty of current and you'll be fine - since it will pull what it needs, using a 1.8A or 2A or even more won't hurt it, but it will make it pretty damned warm to the touch when it's charging so keep that in mind. As the G2 would be sitting on top of the Qi pad, if the pad gets warm or even hot then obviously the G2 will as well by heat transfer and heat/high temps are bad for LiIon batteries as I mentioned earlier.
It's a trade-off more than anything else but again, it does work as long as you're understanding the hows and whys to make the best of it.
br0adband said:
That pretty much sums it up, yep - as long as you account for the inefficiency of the Qi charging technology, you can get faster charge times and still use it without having to plug in/unplug, etc the old fashioned way.
It works, it's just not nearly as fast or efficient as the old fashioned way so, give the Qi pad plenty of current and you'll be fine - since it will pull what it needs, using a 1.8A or 2A or even more won't hurt it, but it will make it pretty damned warm to the touch when it's charging so keep that in mind. As the G2 would be sitting on top of the Qi pad, if the pad gets warm or even hot then obviously the G2 will as well by heat transfer and heat/high temps are bad for LiIon batteries as I mentioned earlier.
It's a trade-off more than anything else but again, it does work as long as you're understanding the hows and whys to make the best of it.
Click to expand...
Click to collapse
I ran the battery down to 70% and I have it on the charger with a 2A wall adapter. I will see how fast it charges. But it seems like I will get roughly 1/3rd the charging speed of a wall adapter, which means that in the car, using GPS with the screen on while Qi charging will probably mean a net power loss overall.
I'm also going to try a high-speed, charging-only cable like this http://www.amazon.com/gp/product/B009W34X5O/ between the wall adapter and the charging pad to see if there is any difference.
Don't waste your money, that thing is no better than a "Gold Plated 56K Modem Cord," seriously. Gold plating, "high speed," all that stuff is marketing BS and means absolutely nothing in the long run - it's a microUSB cable, nothing more.
In 20+ years of using USB cords of all kinds I've yet to see one that's corroded so, that gold plating is not gonna matter anyway.
Any microUSB cable you can find today is more than capable of handling ~2A without a single issue and it's well known that the G2 can max out at 1.6A draw for charging anyway so any cable is more than adequate for doing it.
br0adband said:
Don't waste your money, that thing is no better than a "Gold Plated 56K Modem Cord," seriously. Gold plating, "high speed," all that stuff is marketing BS and means absolutely nothing in the long run - it's a microUSB cable, nothing more.
In 20+ years of using USB cords of all kinds I've yet to see one that's corroded so, that gold plating is not gonna matter anyway.
Any microUSB cable you can find today is more than capable of handling ~2A without a single issue and it's well known that the G2 can max out at 1.6A draw for charging anyway so any cable is more than adequate for doing it.
Click to expand...
Click to collapse
Hah! I already have one I use in the car
shaxs said:
I ran the battery down to 70% and I have it on the charger with a 2A wall adapter. I will see how fast it charges. But it seems like I will get roughly 1/3rd the charging speed of a wall adapter, which means that in the car, using GPS with the screen on while Qi charging will probably mean a net power loss overall.
I'm also going to try a high-speed, charging-only cable like this http://www.amazon.com/gp/product/B009W34X5O/ between the wall adapter and the charging pad to see if there is any difference.
Click to expand...
Click to collapse
Okay, I was able to maintain neutral power with the screen on and running GPS. I let it go for almost 2 hours and it was at the same percentage as when I started. I'm good with that for car use.
I'm wondering if the heat I hear about being generated before the charging rate slows down after 50% would have any negative impact on battery life. Would there be any benefit in using my multi-port charger for overnight charges when I am not in a hurry to charge the battery?
Also, I assume that the battery doesn't have any memory, and that there's no reason to break it in, fully discharge periodically, etc. and that it's okay to charge a little or a lot regardless of the current charge state. Is that correct?
This is a question I would like to know the answer to as well.
I did a slow charge last night and the battery seemed to discharge a little slower this morning, FWIW, but that's not terribly scientific.
Sent from my XT1575 using XDA Free mobile app
There's already a thread for this. No, it does not harm battery life.
Darnell_Chat_TN said:
There's already a thread for this. No, it does not harm battery life.
Click to expand...
Click to collapse
Could you please point me towards that thread? I didn't locate it with a few search combinations. Thanks.
Mississip said:
I'm wondering if the heat I hear about being generated before the charging rate slows down after 50% would have any negative impact on battery life. Would there be any benefit in using my multi-port charger for overnight charges when I am not in a hurry to charge the battery?
Also, I assume that the battery doesn't have any memory, and that there's no reason to break it in, fully discharge periodically, etc. and that it's okay to charge a little or a lot regardless of the current charge state. Is that correct?
Click to expand...
Click to collapse
Fast Charging Lithium = Battery damage. It's basic chemistry. The cells take mechanical damage from expanding too quickly. So, for best longevity, charge her with like a .7 to 1 amp charger.
Locklear308 said:
Fast Charging Lithium = Battery damage. It's basic chemistry. The cells take mechanical damage from expanding too quickly. So, for best longevity, charge her with like a .7 to 1 amp charger.
Click to expand...
Click to collapse
Wrong. The only thing that damages cells is charging beyond the voltage specifications. How fast you dump electrons in has no negative effect; it's only when you put too many in that batteries get damaged.
Locklear308 said:
Fast Charging Lithium = Battery damage. It's basic chemistry. The cells take mechanical damage from expanding too quickly. So, for best longevity, charge her with like a .7 to 1 amp charger.
Click to expand...
Click to collapse
Thank you. I had thought the same thing. No one had the time to give me any detailed information, so I did some research. I can't post links, but the following articles are helpful and will show up first in a search for the title:
'Will speed chargers kill your battery?'
'BU-401a: Fast and Ultra-fast Chargers'
A conventional phone charger can only supply the current and voltage that is safe for a battery at all charge levels. In other words, it must use the least common denominator. Quick Charge makes this process much more active by monitoring max current, max voltage, and temperature so that it can supply more power when it is safe and less power when it is not. Quick Charge will always keep the current, voltage, and temperature within the battery's designed specifications.
In terms of battery memory effect, no, modern lithium based batteries do not have any sort of memory-like effect. This is mostly associated with older and cheaper NiCad type batteries. This is one of those things that people seem to have a really hard time moving past.
People worry far too much about babying their battery.
Assuming you are going to use the phone for ~2 years then a properly designed fast charger should have a negligible effect on battery life. After 2 years of continuous usage all bets are off whether you used a fast charger or not.
If you really want to worry about how to treat your battery then there are two things you should try not to do. Don't let the battery go all the way to 0% and let it sit like that for a year. Don't leave your phone on your dash in direct sunlight everyday. Outside of those two things there's not much you can do to change the lifetime of your battery so just use the damn thing. =P
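Just to illustrate the "more power when it's safe, less when it isn't" idea, here's a toy policy in Python - the thresholds are made up, and this is not Qualcomm's actual algorithm, just the general shape of it:
Code:
def pick_charge_watts(battery_temp_c, battery_percent):
    # Made-up thresholds; a real controller works from the cell's datasheet limits.
    if battery_temp_c >= 45 or battery_percent >= 95:
        return 2.5    # hot cell or nearly full: trickle
    if battery_temp_c >= 40 or battery_percent >= 80:
        return 7.5    # warm or getting full: back off
    return 15.0       # cool and well below full: allow the fast rate

print(pick_charge_watts(30, 40))   # 15.0 - fast charge is fine here
print(pick_charge_watts(42, 40))   # 7.5  - warm, so slow down
print(pick_charge_watts(30, 97))   # 2.5  - just topping off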
dalingrin said:
A conventional phone charger can only supply the current and voltage that is safe for a battery at all charge levels. In other words, it must use the least common denominator. Quick Charge makes this process much more active by monitoring max current, max voltage, and temperature so that it can supply more power when it is safe and less power when it is not. Quick Charge will always keep the current, voltage, and temperature within the battery's designed specifications.
In terms of battery memory effect, no, modern lithium based batteries do not have any sort of memory-like effect. This is mostly associated with older and cheaper NiCad type batteries. This is one of those things that people seem to have a really hard time moving past.
People worry far too much about babying their battery.
Assuming you are going to use the phone for ~2 years then a properly designed fast charger should have a negligible effect on battery life. After 2 years of continuous usage all bets are off whether you used a fast charger or not.
If you really want to worry about how to treat your battery then there are two things you should try not to do. Don't let the battery go all the way to 0% and let it sit like that for a year. Don't leave your phone on your dash in direct sunlight everyday. Outside of those two things there's not much you can do to change the lifetime of your battery so just use the damn thing. =P
Click to expand...
Click to collapse
Exactly.
dalingrin said:
A conventional phone charger can only supply the current and voltage that is safe for a battery at all charge levels. In other words, it must use the least common denominator. Quick Charge makes this process much more active by monitoring max current, max voltage, and temperature so that it can supply more power when it is safe and less power when it is not. Quick Charge will always keep the current, voltage, and temperature within the battery's designed specifications.
In terms of battery memory effect, no, modern lithium based batteries do not have any sort of memory-like effect. This is mostly associated with older and cheaper NiCad type batteries. This is one of those things that people seem to have a really hard time moving past.
People worry far too much about babying their battery.
Assuming you are going to use the phone for ~2 years then a properly designed fast charger should have a negligible effect on battery life. After 2 years of continuous usage all bets are off whether you used a fast charger or not.
If you really want to worry about how to treat your battery then there are two things you should try not to do. Don't let the battery go all the way to 0% and let it sit like that for a year. Don't leave your phone on your dash in direct sunlight everyday. Outside of those two things there's not much you can do to change the lifetime of your battery so just use the damn thing. =P
Click to expand...
Click to collapse
I had researched the topic and learned what you have stated, but I really appreciate you taking the time to write this fuller explanation. I wished to take every reasonable precaution to maximize battery life, given the battery is not easily replaced.
There have been references published claiming that charging faster (higher current) shortens overall Li-Ion battery life.
The mechanism may be related to heat.
One thing the Qualcomm Quick Charge 2.0 (used in the Snapdragon 808) does is increase voltage at the charger from standard USB 5V, to 9V and 12V, for higher charge rates (power) at still-moderate current (to keep heat dissipation down).
I measured 1.1 to 2.3 amps at ~9V with a QC 2.0 charger on the MXPE, with the higher current measurements at lower State of Charge (SoC). I have not seen 12V yet, but I only tried it down to 45% SoC (2.3 amps at 9V); I imagine it bumps up to 12V when the battery is discharged further, nearer to complete discharge.
This charger is rated for
5V, 4A
9V, 2.22A
12V, 2.5A
20V, 1.0A
So the most power the battery side would typically see is about 20W (9V * 2.22A); only the 12V step reaches 30W (12V * 2.5A).
(This is the Power Partners PEAW30-12-USB, supposedly a 30W charger - a figure you only get at the 12V step, which I haven't seen the phone request yet. So much for integrity in advertising.)
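Quick sanity check of the per-step wattage from the label (a couple of lines of Python, nothing more):
Code:
steps = [(5, 4.0), (9, 2.22), (12, 2.5), (20, 1.0)]   # rated volts/amps from the label above
for volts, amps in steps:
    print(f"{volts:>2} V x {amps} A = {volts * amps:.1f} W")
# 20.0 W, 20.0 W, 30.0 W, 20.0 W - only the 12 V step reaches the advertised 30 W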
So the current is kept to a manageable level to control heat dissipation (and therefore maximum temperatures) along the path from the charger to somewhere inside the phone. But I believe that at the battery itself, more rapid charging (higher power) would still require higher current, because the voltage at the battery has to be limited, so one would think heat dissipation (and higher temperatures) would still be a problem in the battery itself. Does that shorten battery life?
The answer is probably: who cares. Because Li-Ion batteries have a 2-3 year life in any case, regardless of how much service they see, or even if they are not used at all. They age and exhibit substantial capacity decline over time. Discharge/charge cycles hasten the capacity decline, but the battery is only good for 2-3 years, give or take, no matter what. And since aftermarket replacement batteries are inferior, unsafe, and stale, there is no reason to try to hang on to your phone for more than 2-3 years in any case. (Especially since the "non-user-replaceable" batteries can be a pain in the a** to R&R. The Moto X Pure 2015 battery is one of those. Some phones actually incur permanent damage to seals if the battery is removed/replaced - the Kyocera Hydro Wave is this way.)
You say "but you could replace the battery with an OEM battery". There are two types of OEM Li-Ion phone batteries on the market that an individual consumer can buy retail, when their phone is 2 years old or more: Used stale batteries (look up "reverse logistics"), and "new" (i.e. not put into service yet) stale batteries. Good luck finding a fresh, new OEM Li-Ion battery for your 2 year old or older phone (out of production for at least a year).
Been down this road before. Wasted lots of time and money replacing phone batteries after 2-3 years. From now on I'm going to stop coddling phone batteries, stop replacing them after 2-3 years, and just figure on a new phone every 2-3 years. It's the only way to get a fresh, new Li-Ion phone battery. (And get the phone right when it is released, like the MXPE this month. That way you are more sure the battery is fresh.)
I think everything in the wireless phone paradigm is increasingly heading that way anyway. Everything, and I mean everything, pushes the market to a 2 year product life cycle. Batteries last 2 years. Increasingly, batteries are not made to be replaceable. Carriers are changing networks so fast you need a new phone every 2 years for that alone. New OS/SW overloads hardware older than 2 years. Displays may fade over a couple of years. USB connectors wear out. Just relax and go with it. Marvel at Qualcomm Quick Charge 2.0 (I am). You'll be happier with a new phone every 2 years.
Sorry for the long rant.
Sorry for the kind of off topic, but it's kind of related... is it okay to use other devices with the included fast charger? I just hate having 2 micro usb chargers plugged in, when I could use just one
Sent from my XT1575 using XDA Free mobile app
crash613 said:
Sorry for the kind of off topic, but it's kind of related... is it okay to use other devices with the included fast charger? I just hate having 2 micro usb chargers plugged in, when I could use just one
Sent from my XT1575 using XDA Free mobile app
Click to expand...
Click to collapse
Yes, the Moto Turbo Charger can be used with any MicroUSB charging device. It will adjust charging as needed for the individual device. Moto made the Turbo Charger to be a single charger for all MicroUSB devices.
Turbo Charging also helps keep the battery well charged. That's better, to me, than more drain and slower chargers that leave the battery more drained overall. The batteries are supposed to last longer when kept fully charged more often.
crash613 said:
Sorry for the kind of off topic, but it's kind of related... is it okay to use other devices with the included fast charger? I just hate having 2 micro usb chargers plugged in, when I could use just one
Sent from my XT1575 using XDA Free mobile app
Click to expand...
Click to collapse
"...since Quick Charge 2.0 is compatible and interoperable, a certified adapter can be used with a non-Quick Charge 2.0 device, though the fast charging benefits of Quick Charge 2.0 will not be available. "
https://www.qualcomm.com/products/snapdragon/quick-charge/faq
By all appearances, Motorola's "TurboPower™ Charging" is nothing more than Qualcomm Quick Charge 2.0. (That's what Snapdragon 808 in the XT1575 uses.)
The third-party Qualcomm Quick Charge 2.0 chargers I bought are recognized as "Turbo" and function with the XT1575, just like the Motorola charger that came with the XT1575.
(There are a LOT of Qualcomm-certified QC 2.0 chargers for sale by third-party names. Qualcomm has been BUSY. )
To slow charge an S7, do we have to turn off fast charging in the settings and then charge via the charger that came with the phone (the so-called fast charger), or should we use a charger from an old phone, say an S III, etc.?
Does this hold true for Motorola phones as well, which have the turbo charging option?
Also, how do you measure battery cycles? Any credible app for that?
billubakra said:
To slow charge a S7, do we have to turn off fast charging from the settings and then charge via the charger that came with the phone(the so called fast charger) or should we use a charger from an old phone say SIII etc.?
Does this hold true for Motorola's phone also which have turbo charging option?
Also how to measure battery cycles? Any credible app for the same?
Click to expand...
Click to collapse
Moto doesn't have the option in settings; it uses the industry-standard Qualcomm Quick Charge rather than an OS hack like Samsung's (no offense)... If it is connected to a QC 2.0 charger it will negotiate the appropriate charge rate; if it is connected to a "standard" charger it will charge normally.
I don't think you can accurately measure battery/charge cycles... and even if you could, it would be extremely deceiving - what would be considered a cycle? Charging at 50%, 30%, 10%, and to what point: 75%, 80%, 100%? Too much room for interpretation here that could be swayed either way depending on the point of view of the person/app doing the counting.
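That said, the usual bookkeeping trick - not what any particular app actually does, just the common convention - is to count "equivalent full cycles" by adding up partial charges until they sum to 100%:
Code:
# Four 25% top-ups = one equivalent full cycle. The session data below is made up;
# an app would have to infer these deltas from logged battery percentages.
charge_sessions = [30, 25, 45, 10, 60, 30]   # percent added in each charging session
equivalent_full_cycles = sum(charge_sessions) / 100
print(f"{equivalent_full_cycles:.1f} equivalent full cycles")   # 2.0
Even then, a shallow 30% top-up doesn't wear the cell the same way a 0-100% charge does, so the number is only a rough proxy.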
acejavelin said:
Moto doesn't have the option in settings, it uses industry standard Qualcomm Quick Charge standard rather than a OS hack like Samsung (no offense)... If it is connected to a QC 2.0 charger, it will negotiate the appropriate charge rate, if it is connected to a "standard" charger it will charge normally.
I don't think you can accurately measure battery/charge cycles... even if you could it would be extremely deceiving, what would be considered a cycle? Charging at 50%, 30%, 10%, and to what point 75%, 80%, 100%? Too much room for interpretation here that could be swayed either way depending on the person/app counting it's point of view.
Click to expand...
Click to collapse
Thanks for replying, dear. So, for the S7, I have turned off fast charge - should I now charge via the charger that came with the phone (the so-called fast charger), or should I use a charger from an old phone, say an S III, etc.?
For the Moto G, the question is the same as above.
From the little I have understood from various threads here, you should charge the battery when it is between 20-40% and stop at 80-90% if you want good battery life. I used to do the complete opposite: charge when the battery is at, say, 6-7% and charge it till it is maxed. I used to do the same for my laptop. Any other tips for the battery?
And I have signed your petition, brother. I hope they listen to the users.
billubakra said:
Thanks for replying dear. So, for S7 I have turned off fast charge, should I now charge via the charger that came with the phone(the so called fast charger) or should we use a charger from an old phone say SIII etc.?
For Moto G, the question is the same as above.
Of the little what I have understood from various threads here is to charge the battery when it is between 20-40% to 80-90% if you want to have a good battery life. I used to do the complete opposite charge, when the battery is at say 6-7% and charge it till it is maxed. I used to do the same for my laptop, any other tip for the battery?
And I have signed your petition Brother. I hope they listen to the users.
Click to expand...
Click to collapse
Does Fast Charge hurt battery life? No, at least not directly... heat does. Using an older-style charger will avoid Quick Charging, but I think forgoing that benefit for a few more days of battery longevity is hardly worth it. I frequently have 30-60 minutes to charge, not 3-5 hours, so quick charge is nice; if it takes a few days off the longevity of the battery, so be it. Those who think it cuts battery life by 20, 30, even 50% are wrong - that simply isn't the case because of Fast Charge itself.
The Moto G isn't an issue here, it doesn't support Quick Charge until the 4th generation, but why give up the feature?
I don't think the "rules" of charging apply as much as people think they do... I charge mine overnight and whenever it needs it during the day, if it does. There is no need to do anything special.
acejavelin said:
Does Fast Charge hurt the battery life, no, at least not directly... heat does. Using an older style charger will avoid Quick Charging but I think that foregoing that benefit for a few more days of battery life is hardly worth it. I frequently have 30-60 minutes to charge, not 3-5 hours, so quick charge is nice, if it takes few days off the longevity of the battery so be it. Those who think it cuts the battery life by 20, 30, even 50% are wrong, that simply isn't the case because of Fast Charge itself.
The Moto G isn't an issue here, it doesn't support Quick Charge until the 4th generation, but why give up the feature?
I don't think the "rules" of charging apply as much as people think they do... I charge mine overnight and whenever it needs it during the day, if it does. There is no need to do anything special.
Click to expand...
Click to collapse
Thanks for the wonderful and detailed reply. I am going to try, not stick to, slow charging to see the difference in heating of the battery. My S III charger's input is 150-300VAC, 50-60Hz 0.15A, output 5.0V-1.0A, and the S7's details are input 100-240V 50-60Hz 0.5A, output 9.0V=1.67A or 5.0V=2.0A. Can I use the S3's charger to charge the S7 after turning off fast charge, or is there a voltage difference or something? The G4 is at home, so I don't know its details. Also, in my country the battery and replacement parts are way too expensive.
Method:
First I used my phone until the battery was below 15% in order to get a better picture of what the charging would look like over almost a full battery cycle. I did not start at the same battery percentage for each test because I did not find any benefit to doing so. I originally did this for uniformity, but it did not make a difference after trying it using the more accurate equipment.
I then cleared my history in the Battery Monitor Pro Widget (BMW Pro) recording app which was used to log the battery [mV], battery temperature [F], time, and battery percentage changes. Once this was done I plugged in my USB Power Monitor, turned airplane mode on, removed the case, and let the phone charge. I started logging the data via my power monitor once the phone showed it was charging. From this point onward I let the phone charge without interrupting it until it reached 100%, then I let it charge for another 10-60 minutes to see if it was still drawing power from each charger. Once all of this was done, I exported my data collected from BMW Pro, emailed it to myself, and pasted it along with the USB Power Monitor data into an Excel spreadsheet. All of the data was then delimited to separate the clusters of data due to the way they were recorded, and subsequently graphed. The USB Power Monitor recorded data points every 0.36 seconds, while the BMW Pro took recordings every 5 seconds because I was having issues with the “real-time” recording option in the app working correctly.
All of the data was then graphed into the nice figures you will see below; each color reflects the same variable across all of the graphs to make reading them easier. I included a legend at the top of each set of graphs which should also help make it easier to read the data.
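For anyone who wants to reproduce the merge-and-plot step without Excel, it can be done in a few lines of Python. The file and column names below are assumptions for illustration - they won't match the BMW Pro or power-monitor exports exactly, so adjust them to your own logs:
Code:
import pandas as pd
import matplotlib.pyplot as plt

# Assumed exports: adjust file/column names to whatever your logs actually contain.
battery = pd.read_csv("bmw_pro_log.csv", parse_dates=["time"])     # e.g. time, percent, mv, temp_f
power = pd.read_csv("usb_monitor_log.csv", parse_dates=["time"])   # e.g. time, volts, amps
power["watts"] = power["volts"] * power["amps"]

# Align the fast (0.36 s) power samples with the slower (5 s) battery samples.
merged = pd.merge_asof(power.sort_values("time"), battery.sort_values("time"), on="time")

fig, ax1 = plt.subplots()
ax1.plot(merged["time"], merged["watts"])
ax1.set_xlabel("time")
ax1.set_ylabel("charger output (W)")
ax2 = ax1.twinx()
ax2.plot(merged["time"], merged["percent"], color="gray")
ax2.set_ylabel("battery %")
plt.show()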
The most interesting part of this test is how cool the S7 Edge stays while charging, and the very marginal difference in overall charging time between QC 2.0/1.0. A 15-minute gap is marginal at best given the ‘big improvements’ Qualcomm claimed when launching the newer standards.
When conducting the wireless charging tests I think there is some error in the Samsung Fast Wireless charging data, so I plan on redoing it at some point. I already redid the Choetech one because it had a similarly strange, overly long result, but now it seems more in line with what I initially found before using the newer testing equipment.
I wanted to also quickly point out that both my HTC 10 and S7 Edge keep pulling current even after the phones show they are 100% charged. I'm not talking about a tiny amount; they both pulled ~1-5W+ after hitting 100% battery, which is A LOT considering they are reporting to be fully charged. I verified this using 3 multimeters just to be sure. It appears as if Qualcomm, or the OEMs, are falsely reporting when the phone is actually charged, or there are some other shady things going on here.
Another thing I wanted to mention is how the S7 Edge is so consistent in the way it charges the battery. It could be due to the lower rates Samsung uses (9V/1.67A max which is 15.03W) vs the HTC 10’s up to 18W that I’ve seen it pull. Just take a look at how the S7 Edge charges using QC 2.0 compared to the HTC 10 with lower temperatures, similar times, and a much more consistent overall charging curve.
If you look at the Tronsmart & Choetech QC 2.0 tests, you might notice the large difference between the two. The Tronsmart charger has a harder time holding the proper voltage, so it bounces around more, from ~8.92V to 9.03V (a 0.11V swing), while the Choetech one ranges from 9.077V to 9.092V, which is a significantly smaller 0.015V range. The power control chip is responsible for regulating these voltages, and clearly the Choetech one has a better chip in it. This is especially important for external battery packs, where efficiency really matters due to the limited amount of power they can store.
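The ripple comparison is easy to quantify from the logs, too - a small sketch, with placeholder samples standing in for the real logged voltages:
Code:
# Replace these lists with the logged voltage columns from the USB power monitor.
tronsmart_v = [8.92, 8.97, 9.03, 8.95, 9.01]      # placeholder samples
choetech_v = [9.077, 9.080, 9.092, 9.085, 9.079]  # placeholder samples

for name, samples in (("Tronsmart", tronsmart_v), ("Choetech", choetech_v)):
    spread = max(samples) - min(samples)
    print(f"{name}: {min(samples):.3f}-{max(samples):.3f} V, spread {spread * 1000:.0f} mV")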
Equipment:
These tests were conducted using a series of different chargers. The same brand was used for both the Quick Charge 2.0 and 3.0 tests to minimize experimental error; the same was done for the wireless charging tests.
Wall Chargers:
Quick Charge 2.0: Tronsmart 18W charger 5V/2A, 9V/2A, 12V/1.5A
Quick Charge 3.0: Tronsmart 18W charger 3.6-6.5V/3A, 6.5-9V/2A, 9-12V/1.5A
USB inline Power Monitor:
XYZ Studio 0-24V, 0-3A USB Power Monitor
Tronsmart 5-12V USB multimeter (not used in this test, but was used in the older version)
Software/App(s):
Battery Monitor Widget Pro
Excel
Notepad++
Realterm (for the USB power monitor logging)
Legend
QC 2.0 Tronsmart S7
QC 2.0 Choetech
QC 1.0 Samsung
Choetech Fast Wireless Charger
Samsung Fast Wireless Charger
Samsung Wireless Charger
Normalized data Table
Full sized downloadable pictures of everything (data wise) you see above.
very good stuff!
Maybe you could also record the heat at the hottest spot of the phone during charging? I think QC 3.0 has the same charge rate, but it's able to adjust the voltage to reduce waste heat compared to QC 2.0.
My main concern with the S7 is the battery life. I know it won't last me a full 18-hour day, so I really need a portable, pocketable fast QC 2.0 charger - maybe 5000mAh - but I have not seen such a small QC charger though.
Excellent post and well-made graphs. Thanks for your efforts.
well done. good info here.
sonhy said:
very good stuff!
maybe you could also record the heat at the hottest spot of the phone during charging? I think qc3 has the same charge rate but its able to change voltage to reduce creating waste heat compared to qc2.0.
my main concern with the s7 is the battery life, i know it won't last me a full 18hr day so i really need a portable fast qc2 charger that is pocketable, so maybe 5000mah, but have not seen such a small qc charger tho
Click to expand...
Click to collapse
I don't have a thermal camera or another way to do that, otherwise I gladly would. I can recommend a small 6000mAh external battery pack if you want; I'll do a quick write-up too (if you need one). The Samsung charger stayed at 9V the whole time per my multimeter's readings; it just dropped to 0.5A near the end.
CLARiiON said:
Excellent post and well-made graphs. Thanks for your efforts.
Click to expand...
Click to collapse
ISperfection said:
well done. good info here.
Click to expand...
Click to collapse
Thank you. I will add in a standard wireless charger test (since my free Samsung one is en route), and I can also get their fast charger too. I believe Samsung's fast wireless charger is only 7W, so it would be slower than the Choetech one, but it never hurts to see how fast it is.
Sent from my Nexus 6P using XDA Labs
@Pilz yes, please let me know of a small portable QC 2.0 charger. I think a quick 30-minute charge to add 50% battery mid-day will be the best ease-of-use solution for me.
As for the heat measurement, I'll be happy with the commercial-grade temperature sensor the great designer created for you: your fingers, or better yet, the inside of your wrist.
Preferably touching the same area on the phone every time and grading it on something like a 1-5 hot/comfort scale, maybe? Just suggesting, no pressure.
sonhy said:
@Pilz yes pls let me know of a qc2.0 small portable charger I think a quick 30mins charge to add 50% battery life mid day will be the best ease-of-use solution for me.
with the heat measurement, I'll be happy with your commercial grade temperature sensor that the great designer created for you, your fingers or better yet, the inside of your wrist.
preferably touching the same area on the phone every time and grading something like 1-5 hot/comfort levels maybe? just suggesting, no pressure
Click to expand...
Click to collapse
I'll look into some methods to measure the heat easily while they're charging. I'm conducting the standard wireless charger test using my free Samsung wireless charger right now. QC 2.0 charges the fastest when you start at a very low battery percentage, so ideally you can achieve the results posted, but ambient temperature, starting %, etc. contribute to whether or not that's attainable. It's still a good estimate for 30 minutes of charging, +/- 5% for other factors. The phone also charges slower when the screen is on. The rate would go from 9V/1.67A to 9V/1.10A with the screen on. It was very consistent when I turned the screen on and off during the test.
Sent from my Nexus 6P using XDA Labs
I actually won't care about heat issues while charging this time round. It'll be like my Moto Defy - I'd just run it under cold water after a fast, hot charge. I won't be using the S7 for many years, so I'm not worried about moisture build-up.
I have ordered a magnetic micro USB cable that says it's rated for 2.4A charging, so hopefully it'll allow easy QC 2.0 charging - no need to plug in, it magnetically snaps on and off.
I think the best charging setup would be a 30-minute quick charge (magnet) at the office desk rather than a 60-minute wireless Qi charge in a carried-in-your-pocket type situation.
sonhy said:
i actually won't care about heat issues while charging this time round, it'll be like my moto defy, i just run it under cold water after a fast hot charge, wont be using the s7 for many years so not worried about moisture build up.
i have ordered a magnet micro usb cable that says its rated for 2.4A charging so hopefully it'll allow easy qc2.0 charging, no need to plug in, it magnetically snaps on and off.
i think the best charge setup would be a 30mins quick charge (magnet) on the office desk than a 60mins wireless qi charge, carried in your pocket type situation.
Click to expand...
Click to collapse
Did you by chance order the Znaps? I backed them ages ago for both the Type-C and micro USB connectors. If I'm lucky I might eventually, maybe, sometime before I die, receive them. I don't expect much from a Kickstarter campaign that's been delayed this much. I'm finishing up the standard Qi/PMA charging test. The standard it's using shouldn't matter, but in case it's important, the Samsung wireless charger is actually PMA.
Sent from my Nexus 6P using XDA Labs
OP Updated
-Standard wireless charging test added
-All figures updated to reflect the new test
No, it's from AliExpress, $10 or so; I've seen cheaper, so I would say the poor Kickstarter folks had their design stolen and made cheaper... I'm not sure, I just buy what's available and easy. Just search for "magnet USB cable", you'll find heaps; the more expensive ones claim a 2.4A current rating.
sonhy said:
no, its from aliexpress, $10 or so, ive seen cheaper so I would say the poor Kickstarters had their designs stolen and made cheaper... im not sure, i just buy what's available and easy. just search magnet usb cable, you'll find heaps, the more exy ones claim 2.4A current rating.
Click to expand...
Click to collapse
Let me know how it works. I rarely use cables to charge my phone because I hate micro USB ports, plus I'm used to the Type-C on the Nexus.
Sent from my Nexus 6P using XDA Labs
Have you tried charging with the 18W charger (not wireless) rather than the Samsung one?
peachpuff said:
Have you tried charging with 18w charger(not wireless) rather than the samsung one?
Click to expand...
Click to collapse
Yes - it doesn't matter, because the phone is only rated for 15.03W, so it can't use more than that no matter the charging method. See the screenshot below:
Sent from my Nexus 6P using XDA Labs
@Pilz yeah sure, it should arrive in a couple of weeks. I hate plugging in as well; even with USB Type-C, it's reversible but finding the port isn't always easy. They should have made the port surface like a cone or funnel so you're guided into the port more easily.
The use of the magnet is awesome - Sony's external side charging pins have been around for ages. It's really the charging current and the quality of the copper that I'm worried about.
sonhy said:
@Pilz yeah sure, it should arrive in a couple of weeks. i hate plugging in as well, even with the usb type c, its reversible but finding the port isn't always easy, they should have made the port surface like a cone or funnel so your guided into the port more easily.
the use of the magnet is awesome, Sony's external side charging pins have been around for ages, its really the charging current and quality of the copper that im worried about.
Click to expand...
Click to collapse
I just wish they had Type-C, because it's so much better, especially after using it for a while now.
Sent from my Nexus 6P using XDA Labs
I never knew that plugging in a micro USB cable was so difficult. It could be one of those things that, once you try a better alternative (Type-C maybe?), makes you ask how you lived without it, but I don't see what the fuss is about just yet.
I've used wireless chargers for years (way back in the Nexus 5 days even), including in the car. Any word on fast wireless charging and heat? I'm worried about it pumping a ton of heat onto the back of the phone, especially for extended periods such as overnight.
xxaarraa said:
I never knew that plugging in a micro USB cable was so difficult. It could be one of those things that once you try a better alternative (type c maybe?) makes you ask how you lived without it, but I don't see what the fuss is about just yet.
I've used wireless chargers for years (way back in the NExus 5 days even) including in the car. Any word on fast wireless charging and heat? I'm worried about it pumping a ton of heat on to the back of the phone.
Click to expand...
Click to collapse
Micro USB is just more of a hassle because usually you need to angle it while inserting it into the phone. Type-C is nice because there's no worrying about how I need to orient the cable when I'm half awake plugging my phone in. It's hard to understand why it's nice until you use it every day.
I haven't been able to measure the heat yet, but the phone is cooler using the 10W wireless fast charger than it is using QC 2.0. The phone isn't hot to the touch but it is warm using the fast wireless charger. I'll try to download a battery monitoring app that measures battery temp while it's charging. This method won't be as accurate as physically measuring it, but it should still give a good indication of the temperature.
Edit: I tested the temperature using GSAM Battery Monitor with the fast wireless charger for ~6% of charge (28-34%) and the temperature rose 6°F; then I let the phone cool and tested QC 2.0. The phone was charged for 6% to keep things consistent, with a temperature change of 5°F. I would need to find a way to more accurately measure these values, because that quick test doesn't really mean anything at this point.
Sent from my Nexus 6P using XDA Labs
Does the Adaptive Fast Charging by Samsung work with QC 2.0 compatible devices, or is it exclusive to Samsung?
ahrion said:
Does the Adaptive Fast Charging by Samsung work with QC 2.0 compatible devices, or is it exclusive to Samsung?
Click to expand...
Click to collapse
It's just a QC 2.0 charger from what I can tell. I have a battery pack that will charge using QC 2.0, so I can test it using my multimeter.
Sent from my Nexus 6P using XDA Labs