[Q] Is it safe to charge with Nexus 7 charger? - Nexus 4 Q&A, Help & Troubleshooting

I have a Nexus 7 factory-included charger and I was wondering if I could safely charge my Nexus 4 with it. Reading from the specs printed on the chargers, the N7 charger outputs 2.0A at 5.0V, while the N4 charger outputs 1.2A at 5.0V. If it's safe, would it affect the N4's battery longevity in the long term? Considering it would be charged with a higher current.

Yes, the battery's lifespan will degrade more quickly if you frequently charge it at a higher current.
Sent from my Nexus 4

It won't be charged with a higher current. The charging circuit will only draw what it needs regardless of what the supply can output.

Yes, you can. The charge controller is in the phone. This is perfectly acceptable and will not harm your device in the long run.
I've been using my Nexus 10 charger to charge my Nexus 4, which has the same output as the charger you just described.

Higher amperage will generate more heat, which in turn can damage circuitry and the battery. At the end of the day, it's your call.
Sent from my Nexus 4

I don't suggest you charge it with the Nexus 7 charger all the time; use it only when you really need a quick charge.

harmohn said:
Higher amperage will generate more heat, which in turn can damage circuitry and the battery. At the end of the day, it's your call.
Yes, higher amperage will generate more heat, but the battery management system (BMS) is sized for the battery. The BMS will not allow more current into the battery than it can handle. This is independent of how much current the power supply can provide.
<http://en.wikipedia.org/wiki/Battery_management_system>
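The "draws only what it needs" point above can be pictured as a simple negotiation, sketched below in Python. The function name and the 1.2 A limit are illustrative assumptions, not taken from any real BMS firmware.

```python
# Minimal sketch: the charge controller in the phone caps the current,
# so the actual charge current is the lesser of what the supply can
# deliver and what the controller allows. All values are illustrative.

def negotiated_charge_current(supply_max_a, device_limit_a):
    """Return the current (in amps) the device actually draws."""
    return min(supply_max_a, device_limit_a)

n4_limit_a = 1.2  # hypothetical Nexus 4 charge-controller limit
print(negotiated_charge_current(1.2, n4_limit_a))  # stock N4 charger -> 1.2
print(negotiated_charge_current(2.0, n4_limit_a))  # N7 charger       -> 1.2
```

Either way the phone charges at its own limit; the bigger supply just has headroom to spare.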

Not really sure which side to believe

Shimakaze said:
Not really sure which side to believe
Haha me neither
I was told it's fine just like it's fine to charge an iPhone with an iPad charger. I've been doing it with my Nexus 4 but who knows.

No, don't do this; otherwise it damages your battery life.

Using the Nexus 7 charger (or any other higher amperage charger) WILL NOT
Damage your phone
Damage your battery
Cause your phone to draw an unsafe amount of current
Cause your phone to get hotter than normal
Make your battery charge faster
This is 2012, not 1831. We have industry standards and the microUSB standard was designed with the idea in mind that users may have both chargers with varying amperage outputs and devices with varying amperage inputs. People much smarter than us designed the standard to be compatible, a novel idea right?
Please do not spread FUD and misinformation, especially without bringing any evidence to the table. You are not doing people who come across this thread a service (like the OP) by creating the illusion that there are two sides to this discussion.

quentin0 said:
Using the Nexus 7 charger (or any other higher amperage charger) WILL NOT
Damage your phone
Damage your battery
Cause your phone to draw an unsafe amount of current
Cause your phone to get hotter than normal
Make your battery charge faster
This is 2012, not 1831. We have industry standards and the microUSB standard was designed with the idea in mind that users may have both chargers with varying amperage outputs and devices with varying amperage inputs. People much smarter than us designed the standard to be compatible, a novel idea right?
Please do not spread FUD and misinformation, especially without bringing any evidence to the table. You are not doing people who come across this thread a service (like the OP) by creating the illusion that there are two sides to this discussion.
This sounds like the most sensible post in this thread. :good:

3bs11 said:
I was told it's fine just like it's fine to charge an iPhone with an iPad charger. I've been doing it with my Nexus 4 but who knows.
ptt1404 said:
No, don't do this; otherwise it damages your battery life.
It does work fine like the iPhone/iPad charger dilemma. The BMS protects the battery from any damage, so there is no need to worry about abnormal decrease in battery life.
Sorry for the delay:
To prevent spam to the forums, new users must wait five minutes between posts. All new user accounts will be verified by moderators before this restriction is removed.

quentin0 said:
Using the Nexus 7 charger (or any other higher amperage charger) WILL NOT
Damage your phone
Damage your battery
Cause your phone to draw an unsafe amount of current
Cause your phone to get hotter than normal
Make your battery charge faster
This is 2012, not 1831. We have industry standards and the microUSB standard was designed with the idea in mind that users may have both chargers with varying amperage outputs and devices with varying amperage inputs. People much smarter than us designed the standard to be compatible, a novel idea right?
Please do not spread FUD and misinformation, especially without bringing any evidence to the table. You are not doing people who come across this thread a service (like the OP) by creating the illusion that there are two sides to this discussion.
Complains about people not bringing any evidence to the table...
Brings no evidence to the table.
Sent from my Nexus 4 using Tapatalk 2

droidmakespwn said:
Complains about people not bringing any evidence to the table...
Brings no evidence to the table.
Sent from my Nexus 4 using Tapatalk 2
http://yourlogicalfallacyis.com/burden-of-proof

quentin0 said:
http://yourlogicalfallacyis.com/burden-of-proof
Where is your proof discrediting the others who said it is damaging? Check this link.
http://yourlogicalfallacyis.com/burden-of-proof
Sent from my Nexus 4 using Tapatalk 2

I wish I could find my post I did about this in another thread.
Yes, it is perfectly safe, if not safer, to use a higher-amp charger.
It is NOT safe to use a higher VOLTAGE charger.
Why? Ohms law.
The phone has 2 resistance settings triggered by the data lines of the port. If they are open (computer USB port) it sets itself to 10 ohms. Thus drawing 0.5A (500mA) @ 5V.
If the data lines are closed (wall or car charger) the phone sets itself to 5 ohms. Thus drawing 1A @ 5V.
Amperage is determined by the resistance of the phone.
The ratings on the chargers are the MAX amps you can draw from them before they melt down or break, NOT what they push out to your phone. Amps are pulled/drawn out; voltage is pushed out.
Don't believe me? Get a 5 Ohm resistor. Attach it in series with a multimeter on amps to the outputs of each charger. Then check the Amps of both chargers. It will be nearly equal (about 1A). NOT 1.2A and 2A!
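The resistor experiment above can be worked through with Ohm's law directly; the values below are the ones quoted in this post.

```python
# Ohm's law: the current through a fixed load is I = V / R, regardless
# of how many amps the supply is *rated* to deliver.

def current_drawn(volts, load_ohms):
    return volts / load_ohms

for rating_a in (1.2, 2.0):          # the N4 and N7 charger ratings
    i = current_drawn(5.0, 5.0)      # 5 V supply into the 5-ohm load
    print(f"{rating_a} A-rated charger: load draws {i:.1f} A")
# Both chargers deliver ~1 A into the same load; the rating never
# enters the calculation.
```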
I should make a video demonstration about this, because I see this question a lot!
Sent from my Nexus 4 using Tapatalk 2

Shimakaze said:
Not really sure which side to believe
Sorry, but there are a number of folks here who do not understand basic electricity, yet feel qualified to post advice about it.
A typical electrical outlet in your home is capable of delivering 15 to 20 amps. If you plug a device into one that can only handle one amp, does it explode or is it damaged in any way?
Electrical circuits present a "load" to the supply they are connected to. This load determines how much current it will draw.
This is middle-school electric shop, folks.

Umm, not really sure if this helps...
But I've been charging my N4 from my N7 charger since I got it. The N4 charger/cable are still in the box.
I have not seen any side effects as of yet, but no clue if there could be months from now...
Would this be the same "questioning" between charging the phone from the provided outlet adapter vs a computer USB port?

UberSlackr said:
Would this be the same "questioning" between charging the phone from the provided outlet adapter vs a computer USB port?
A USB port is only capable of supplying 500mA, and that is what the n4 will charge at when plugged into one. This is actually handled by the phone as it knows it is a computer based on terminations on pins 2 and 3 (data signal). Power is delivered via pins 1 and 4.
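The pin-2/pin-3 detection described above boils down to a simple decision. The sketch below uses illustrative limits (500 mA for a data-enabled port, 1 A for a dedicated charger); real devices vary.

```python
# Sketch of charge-source detection: the phone checks whether the USB
# data lines (pins 2 and 3) are shorted together, as dedicated chargers
# do, and picks its draw accordingly. Limits are illustrative.

def charge_current_ma(data_pins_shorted):
    if data_pins_shorted:
        return 1000  # dedicated wall/car charger: draw up to ~1 A
    return 500       # computer USB port: stay within the 500 mA spec

print(charge_current_ma(False))  # plugged into a PC -> 500
print(charge_current_ma(True))   # plugged into a wall charger -> 1000
```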

Related

[Q] Charging with a 2.1 amp charger?

Do you think it would be safe to charge the Evo with a 2.1 amp car charger? I found one on eBay that says it's made for the iPad, but I would love a faster charge on my Epic.
In general, the slower you charge the battery, the longer it will last. The effect is pretty significant.
I don't know if that much current is safe.
http://www.batteryuniversity.com/parttwo-34.htm
Sent from my SPH-D700 using XDA App
It's more complicated than that. Just because the charger is able to provide 2.1A doesn't mean the phone will actually draw that much current.
The charge control circuitry is built into the phone. You are just providing a +5V rail as the charging power source via a standard USB connection. There is no charge control inherent in USB itself.
Sent from Samsung Vibrant
It will only pull as much as it needs. I use higher amp output chargers and it's not a problem. It will charge faster, regardless of what you use, if you turn the phone off.
jnadke said:
In general, the slower you charge the battery, the longer it will last.
Bingo.
Being an Aerospace Electrical Engineer I approve of this message.
jnadke said:
In general, the slower you charge the battery, the longer it will last. The effect is pretty significant.
I don't know if that much current is safe.
http://www.batteryuniversity.com/parttwo-34.htm
Sent from my SPH-D700 using XDA App
This is true, but the effect is not that significant. Coming from using lithium packs in RC helicopters and cars, the battery will likely be obsolete before you kill it, and the batteries aren't that expensive.
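The "slower is gentler" point above is usually expressed as a C-rate: charge current divided by pack capacity, where 1C nominally fills the pack in one hour. A quick illustration with made-up pack sizes:

```python
# C-rate = charge current / capacity. Higher C-rates stress the cells
# more; the figures here are illustrative, not from any datasheet.

def c_rate(charge_current_ma, capacity_mah):
    return charge_current_ma / capacity_mah

capacity = 1500  # hypothetical ~1500 mAh phone pack
print(round(c_rate(1000, capacity), 2))  # 1 A charger -> 0.67C
print(round(c_rate(2000, capacity), 2))  # a full 2 A draw -> 1.33C
```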
kerms said:
Do you think it would be safe to charge the Evo with a 2.1 amp car charger? I found one on eBay that says it's made for the iPad, but I would love a faster charge on my Epic.
Do you have a link, and is there a wall charger too? I run a remote desktop app and it destroys the battery; even with the 1 amp charger going, the battery just gets lower and lower.
robl45 said:
Do you have a link, and is there a wall charger too? I run a remote desktop app and it destroys the battery; even with the 1 amp charger going, the battery just gets lower and lower.
It could be due to the usb micro charging standards... Many chargers do not adhere to the standard, and this may cause some of the newer phones (droid X, galaxy S phones) not to charge at full power. Most older phones simply did not care, and would use all the amperage they could get their hands on.
Basically, if the D+ and D- pins of the USB cable are not shorted, then the device will draw minimal power from the +5V rail. It is probably drawing <500 mA, and could even be drawing as little as 100 mA from the charger.
Getting a proper 1A charger could fix this, but I'd like to test it out myself when I get the chance..
http://www.griffintechnology.com/products/powerduo-for-ipad
These work well. I use the AC one. The only thing is, like some have stated, even when charging with the phone on and in use, the battery will still go down. Maybe 2.2 or a patch will fix this.
I'm not going to debate fast vs. slow charging. This isn't like debating the best charger for rechargeable AA NiMH cells, fast or slow, or charging method.

Using Nexus 7 charger to charge mobile phones?

Hi,
Nexus 7 charger outputs 2A.
Is it safe and OK to use it to charge a Galaxy Nexus, Nexus S, and HTC Desire?
I am thinking more of the long run, whether it will break the phones.
Thanks
It should not hurt anything. My daughter charges her Droid 4 with my charger every day.
It'll charge really slowly, since phone chargers are 1A. Everyone will have their own opinion on this, but mine is that I wouldn't use a phone charger on the N7... it could overheat due to charging slowly. But I'm sure you'll hear others saying a slow charge is better, so...
Sent from my Nexus 7 using xda premium
dirtyhamster73 said:
It'll charge really slowly, since phone chargers are 1A. Everyone will have their own opinion on this, but mine is that I wouldn't use a phone charger on the N7... it could overheat due to charging slowly. But I'm sure you'll hear others saying a slow charge is better, so...
Sent from my Nexus 7 using xda premium
I was asking the other way around: using the Nexus 7 charger to charge my phones. But the previous user just answered; thanks, James.
When traveling, I want to carry just one charger for all my devices.
Sent from my Galaxy Nexus using xda premium
gogol said:
Hi,
Nexus 7 charger outputs 2A.
Is it safe and OK to use it to charge a Galaxy Nexus, Nexus S, and HTC Desire?
I am thinking more of the long run, whether it will break the phones.
Thanks
Yup, it's fine, because a standard charger is, or used to be, 500 mA at 5 volts.
Some chargers supply more, like 700 mA, and some are even 1 A.
If a charger is 2A and your phone only draws 500 mA, that is perfectly fine, because it's only drawing a quarter of what the charger can produce. In this case, the charger probably won't even get warm.
Neither my Sensation nor my wife's Sensation XL has died yet from using the Nexus charger
What mvmacd says is correct - just because the charger can supply 2A, it is the device that decides how much current it draws from the charger.
dirtyhamster73 said:
It'll charge really slowly, since phone chargers are 1A. Everyone will have their own opinion on this, but mine is that I wouldn't use a phone charger on the N7... it could overheat due to charging slowly. But I'm sure you'll hear others saying a slow charge is better, so...
Sent from my Nexus 7 using xda premium
I actually find the charger that came with my RAZR does the job fine, and it's rated at 850mA. Other lower-power chargers I have are slow, though.
I doubt a slow charge would lead to overheating, or else connecting to a PC would cause this too.
I think for chargers it's a case of trying them to see how well they work.
Sent from my Nexus 7 using Tapatalk 2
gbroon said:
I actually find the charger that came with my RAZR does the job fine, and it's rated at 850mA. Other lower-power chargers I have are slow, though.
I doubt a slow charge would lead to overheating, or else connecting to a PC would cause this too.
I think for chargers it's a case of trying them to see how well they work.
Sent from my Nexus 7 using Tapatalk 2
Science proves otherwise. A charger with too low or too high a maximum voltage or amperage can and will lead to overheating and a severe reduction in battery life, and can destroy the adapter as well.
MrSchroeder said:
Science proves otherwise. A charger with too low or too high a maximum voltage or amperage can and will lead to overheating and a severe reduction in battery life, and can destroy the adapter as well.
Care to explain why Google says you can charge your device with a 500 mA charger [standard USB port]? ["with the screen off"]
Won't it severely reduce battery life and burn out the USB port? Oh, really? Google just forgot about that part when they were writing the instruction manual?
:silly:
MrSchroeder said:
Science proves otherwise. A charger with too low or too high a maximum voltage or amperage can and will lead to overheating and a severe reduction in battery life, and can destroy the adapter as well.
Science generally proves things with facts and figures. From a forum point of view, a link is your minimum effort here
MrSchroeder said:
Science proves otherwise. A charger with too low or too high a maximum voltage or amperage can and will lead to overheating and a severe reduction in battery life, and can destroy the adapter as well.
Modern devices and chargers shouldn't have this problem because of built-in regulators. A smartphone won't try to draw more than it can handle and chargers won't try to supply more than they can handle (unless they're very cheap).
I have been using the N7 charger on my phone with no problem so far. I wonder about the statement about the phone not drawing more than it needs, though. I replaced the battery in my TB after 9 months due to low life and swelling. I'm pretty sure the swelling came from leaving the phone on a car charger all day, even after the battery was full. If my phone had the ability to stop taking the charge it didn't need, this wouldn't happen...
Sent from my Nexus 7 using xda premium
My opinion still stands... I don't trust using anything other than the charger that came with the device. The 6th post down makes perfect sense to me.
http://forum.xda-developers.com/archive/index.php/t-1370215.html
Your battery was likely defective. My phone literally stays on the charger all day when I'm not out.
gogol said:
Hi,
Nexus 7 charger outputs 2A.
Is it safe and OK to use it to charge a Galaxy Nexus, Nexus S, and HTC Desire?
I am thinking more of the long run, whether it will break the phones.
Thanks
If the phones also charge at 2A then you should be fine. If the phones charge at lower amps (say 1A or 1.5A) then I wouldn't recommend using it everyday as it may reduce the battery efficiency. If it's an emergency go ahead and use it.
There's no harm in using a higher current charger with a lower current phone because the charger is not what's actually charging the battery, it's the phone, and the phone will limit the charging current. You can confirm this with a multimeter. The charger can't force the phone to draw more current than it was designed for. This would be different if you were charging the battery directly with a dedicated charger because then the charger itself is directly controlling the charging current.
MrSchroeder said:
Science proves otherwise. A charger with too low or too high a maximum voltage or amperage can and will lead to overheating and a severe reduction in battery life, and can destroy the adapter as well.
Nope, just nope.
Sincerely, an electrical engineering student.
Sent from my Nexus 7 using Tapatalk 2

2A charger used with other phones

Hello!
Just curious if there is an issue with using my new Nexus 10 2A charger with other phones, such as my HTC Sensation or Blackberry Torch?
The Sensation uses a 1A charger, but I assume the phones are smart enough to only draw the current necessary, so they won't be damaged by drawing too much?
I'd like to just use the Nexus 10 charger and not have to carry other ones.
yes it is fine
Cool thanks
EniGmA1987 said:
yes it is fine
I heard, though, that:
*First, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat.
*Second, this is less science/engineering, but someone said that the specific pins are made by different companies and can vary in impedance, thus changing the overall circuitry of the device in the long run.
*Third, Li-Ion can pull more current than the default charger supplies, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this)
nutnub said:
I heard, though, that:
*First, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat.
*Second, this is less science/engineering, but someone said that the specific pins are made by different companies and can vary in impedance, thus changing the overall circuitry of the device in the long run.
*Third, Li-Ion can pull more current than the default charger supplies, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this)
Wish I knew for sure too. Really, I don't care a lot about my HTC Sensation, as I plan on getting a Nexus 4 LTE when it eventually comes out. Hopefully those come with 2A chargers!
Sure I could get a Nexus 4 and use LTE right now on Bell, but I'd rather wait for an official one.
nutnub said:
I heard, though, that:
*First, it creates unnecessary heat, because the extra current drawn by the circuitry of a lower-amperage device has to be dissipated as heat.
*Second, this is less science/engineering, but someone said that the specific pins are made by different companies and can vary in impedance, thus changing the overall circuitry of the device in the long run.
*Third, Li-Ion can pull more current than the default charger supplies, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this)
Everybody seems to misunderstand LiPo charging, as it is different from previous battery technologies.
For general LiPo information, you should look here. Charging information is about halfway down the page:
http://www.rchelicopterfun.com/rc-lipo-batteries.html
I'll quote the important part:
Selecting the correct charge current is also critical when charging RC LiPo battery packs. The golden rule here used to be "never charge a LiPo or LiIon pack at greater than 1 times its capacity (1C)."
For example a 2000 mAh pack, would be charged at a maximum charge current of 2000 mA or 2.0 amps. Never higher or the life of the pack would be greatly reduced. If you choose a charge rate significantly higher than the 1C value, the battery will heat up and could swell, vent, or catch fire.
Times are a changing...
Most LiPo experts now feel, however, that you can safely charge at a 2C or even 3C rate on quality packs that have a discharge rating of at least 20C and low internal resistance, with little effect on the overall life expectancy of the pack, as long as you have a good charger with a good balancing system. There are more and more LiPo packs showing up stating 2C and 3C charge rates, with even a couple of manufacturers indicating 5C rates. The day of the 10-minute charge is here (assuming you have a high-power charger and a power source capable of delivering that many watts and amps).
Pretty much all phones are right around 2000mAh capacity nowadays, so even going by the "old" golden charging rule a 2A charger would be safe to use. My Galaxy Nexus came with (I think) a 1A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2A charger for both devices and never once had an issue. It has been 8 months now of using the 2A charger on my phone. Idle life can still reach a little over 3 days on a single charge, and I still get some of the best screen-on times of most people I know around the forums. So yes, from personal experience, a 2A tablet charger is completely fine to use on a phone.
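The quoted 1C golden rule is simple arithmetic: the maximum safe charge current in amps equals the pack capacity in amp-hours times the C rating. A quick sketch:

```python
# The "1C golden rule" from the quote above, as arithmetic. A pack's
# capacity in mAh divided by 1000 gives amp-hours; multiply by the C
# rating to get the maximum charge current in amps.

def max_charge_current_a(capacity_mah, c_rating=1):
    return capacity_mah / 1000 * c_rating

print(max_charge_current_a(2000))     # 2000 mAh pack at 1C -> 2.0 A
print(max_charge_current_a(2000, 3))  # same pack rated for 3C -> 6.0 A
```

So by the old rule, a ~2000 mAh phone pack and a 2A charger sit exactly at the limit.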
Charging circuitry is built into the device, not the "charger"
Nothing to worry about
EniGmA1987 said:
I'll quote the important part:
Pretty much all phones are right around 2000mAh capacity nowadays, so even going by the "old" golden charging rule a 2A charger would be safe to use. My Galaxy Nexus came with (I think) a 1A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2A charger for both devices and never once had an issue. It has been 8 months now of using the 2A charger on my phone. Idle life can still reach a little over 3 days on a single charge, and I still get some of the best screen-on times of most people I know around the forums. So yes, from personal experience, a 2A tablet charger is completely fine to use on a phone.
Is it safe to assume that all chargers default to 1C charging for their device? Because if that's the case, I figure most electronics we own could just be charged with 10W chargers (which would make life much more convenient).
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Thanks for teaching me lots!
-newb, happily reading away
I bought one of those 2-amp double chargers from a seller on Amazon. It wasn't really cheap either (in cost, anyway; I spent a bit more hoping it would be higher quality). After plugging in my Moto RAZR and the wife's Lumia, the charger popped and some plastic from the housing flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor-quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung, etc). I'd happily mix and match them to the phones, but I'd be wary of buying a no-name Chinese jobby from eBay or Amazon Marketplace.
Sent from my XT910 using xda premium
nutnub said:
Is it safe to assume that all chargers default to 1C charging for their device? Because if that's the case, I figure most electronics we own could just be charged with 10W chargers (which would make life much more convenient).
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging: if you charge fast they get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and expose nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery, without letting it get too hot, and also the way it is filled (as with the fizzy drink, pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it over-spilling).
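In lithium packs, the monitored charging described above is typically a constant-current / constant-voltage (CC/CV) scheme. A very rough sketch, with all thresholds illustrative:

```python
# CC/CV in caricature: charge at a fixed current until the cell reaches
# its target voltage, then hold the voltage while the current tapers.
# The 4.2 V target and the 10% taper placeholder are illustrative only.

def charge_phase(cell_voltage, target_v=4.2, cc_current_a=1.0):
    if cell_voltage < target_v:
        return ("constant-current", cc_current_a)
    return ("constant-voltage", cc_current_a * 0.1)  # tapering stand-in

print(charge_phase(3.7))  # ('constant-current', 1.0)
print(charge_phase(4.2))  # ('constant-voltage', 0.1)
```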
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Unfortunately, industry is full of products made to a budget, usually by using cheaper components/designs (the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, and feedback in reviews/forums.
Basically, it boils down to 'consumer testing'
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A.
Hopefully, USB3.0 will quickly become a new standard for portable devices.
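Those current limits translate to watts via P = V * I at USB's nominal 5 V bus voltage:

```python
# Wattage behind the USB current figures above (nominal 5 V bus).

def usb_power_w(volts, amps):
    return volts * amps

print(usb_power_w(5.0, 0.5))  # standard USB 2.0 port: 2.5 W
print(usb_power_w(5.0, 1.8))  # the 1.8 A figure above: 9 W
print(usb_power_w(5.0, 5.0))  # the 5 A figure above: 25 W
```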
more questions!
First of all, let me please thank you for responding and being so thorough with your answers! There is so much information out there, and in my 22 years of existence, I cannot for the life of me sort through the sheer amount of data. I do greatly enjoy reading every little thing that is posted, especially in this thread because I think it's super important to understand the electronics that we interact with.
sonicfishcake said:
I bought one of those 2-amp double chargers from a seller on Amazon. It wasn't really cheap either (in cost, anyway; I spent a bit more hoping it would be higher quality). After plugging in my Moto RAZR and the wife's Lumia, the charger popped and some plastic from the housing flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor-quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung, etc). I'd happily mix and match them to the phones, but I'd be wary of buying a no-name Chinese jobby from eBay or Amazon Marketplace.
Sent from my XT910 using xda premium
My concern with this is that if Motorola or Samsung does put out a less-than-optimal product, would we all know? Another way of asking this is: how do we know that Apple/Motorola/Samsung/Lenovo actually produce superior products, and that it's not merely a matter of advertisement or brand image? Do you think there is a way to know, as a consumer, whether third-party products are becoming more competitive, given that smaller companies have a much harder time advertising and building a name/brand for themselves? (If you can't tell, I am rooting for the little guys, because I may one day work for the little guys.)
skally said:
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging: if you charge fast they get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and expose nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery, without letting it get too hot, and also the way it is filled (as with the fizzy drink, pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it over-spilling).
Thank you for clarifying for us. Would you happen to know if there are specifics on recharge specs, short of finding me published papers on the technology? What you said is definitely what I've been reading on the Internet, and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
skally said:
...
Unfortunately, industry is full of products made to a budget, usually by using cheaper components/designs (the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, and feedback in reviews/forums.
Basically, it boils down to 'consumer testing'
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A.
Hopefully, USB3.0 will quickly become a new standard for portable devices.
A quick question, just because USB3.0 should allow up to 25W, that doesn't mean that it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10W, even if my computer (which, although stated to be USB3.0) may not have the circuitry behind it to allow for such a draw? I'm a little iffy on the whole implementation of USB standards. Because if USB2.0 allows a draw of up to 9W, I haven't seen this from my laptop or any devices claiming to have USB2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
nutnub said:
A quick question, just because USB3.0 should allow up to 25W, that doesn't mean that it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10W, even if my computer (which, although stated to be USB3.0) may not have the circuitry behind it to allow for such a draw? I'm a little iffy on the whole implementation of USB standards. Because if USB2.0 allows a draw of up to 9W, I haven't seen this from my laptop or any devices claiming to have USB2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
If the Nexus kernel says the limit is 2A, then that's it. It can't use more power.
Have you seen the internal USB 3.0 cable?
It's at least twice as thick as a USB 2.0 cable. I got a new chassis for my computer last week, with a couple of USB 2.0 front ports and a USB 3.0 one.
And if your motherboard is built for USB 3.0, I'm pretty sure it can take the current. Otherwise there would be no point in adding 3.0 support.
Sent from my Nexus 10 using xda app-developers app
If something is listed as a USB3 port, it must meet USB3 certification; otherwise the manufacturer of the device is liable for a huge lawsuit if issues arise. If something says USB3, that doesn't mean it IS drawing 25W though, just that the port is capable of having 25W pulled through it over the USB connector. The same goes for USB2 and its 9W limit in the spec. Also, plugging a tablet such as this into a computer's USB3 port does not mean it will charge faster or get faster data transfers, since the cable being used and the device are still of the older specification.
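As an aside, the wattage figures being thrown around in these posts are just volts times amps. A throwaway sketch (the current figures are the ones quoted in this thread, not authoritative spec values, and `usb_power_w` is a made-up helper name):

```python
def usb_power_w(volts: float, amps: float) -> float:
    """Electrical power: P = V * I (watts)."""
    return volts * amps

print(usb_power_w(5.0, 1.8))  # 9.0  - the USB2 figure quoted above
print(usb_power_w(5.0, 5.0))  # 25.0 - the USB3 figure
```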
nutnub said:
Thank you for clarifying for us. Would you happen to know if there are specifics on recharge specs, short of finding me published papers on the technology? What you said is definitely in line with what I've been reading on the Internet, and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
Have a look here for info on the recharging process for Lithium based cells.
https://sites.google.com/site/tjinguytech/charging-how-tos/the-charging-process
It is worth noting the level of precautions taken while charging the cells aggressively. You really don't need a bucket of sand on standby when you plug your phone into its charger.
nutnub said:
A quick question, just because USB3.0 should allow up to 25W, that doesn't mean that it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10W, even if my computer (which, although stated to be USB3.0) may not have the circuitry behind it to allow for such a draw? I'm a little iffy on the whole implementation of USB standards. Because if USB2.0 allows a draw of up to 9W, I haven't seen this from my laptop or any devices claiming to have USB2.0 ports...
There are actually 2 different current limits for each USB specification: USB2.0 has 0.5A and 1.8A, while USB3.0 has 1.5A and 5.0A
The lower of the current limits is what I would expect to get from a USB port on a computer, while the higher one I would expect to get from a dedicated charger.
I believe the higher current specification was added purely for charging mobile devices, as it is only achieved by adding a resistance across D+ and D-, removing the data transmission capabilities of the port. I don't know if that's practical, or possible with a computer USB port.
I do remember seeing motherboards with ports specifically designed for fast charging, but I haven't got any info on them as yet.
There are also kernels which enable "fast charging" on a PC. Basically, they remove the data connection in software and treat any USB connection as if it were plugged into AC. You can charge just as fast on a computer as on a wall charger when this feature is enabled in the kernel.
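The limits described in the last few posts boil down to "the device draws the lesser of its own cap and what the port offers". A toy sketch of that logic - the numbers are the ones quoted in this thread, not official spec values, and `max_draw` is a hypothetical helper:

```python
# Current limits as quoted in this thread (toy model, not official spec values).
PORT_LIMIT_A = {
    "usb2_data": 0.5,     # ordinary USB 2.0 data port
    "usb2_charger": 1.8,  # dedicated charging port (D+/D- tied together)
    "usb3_data": 1.5,     # ordinary USB 3.0 data port
    "usb3_charger": 5.0,  # dedicated USB 3.0 charging port
}

def max_draw(port: str, device_limit_a: float) -> float:
    """The device draws the lesser of its own cap and what the port offers."""
    return min(PORT_LIMIT_A[port], device_limit_a)

# A phone capped at 2 A in its kernel still pulls only 1.8 A from a
# dedicated USB 2.0 charger, and only 0.5 A from a plain data port.
print(max_draw("usb2_charger", 2.0))  # 1.8
print(max_draw("usb2_data", 2.0))     # 0.5
```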
I am using the N10 charger for my Note 2 and it charges bloody fast. Charging is noticeably faster on the Note 2 than with the stock 1A charger that came with it.
The battery is not getting warm, and battery temps are similar to those on the 1A charger. Basically it's cutting the charging time almost in half.
Agreed. Note 2 charger is awesome. Bought a powergen 3.1 amp car charger for the note 2 also after watching videos and reading up on proper car chargers for the phone. Guess I can use it for my nexus 10 too.
Sent from my Nexus 10 using xda premium
I own RC cars with LiPo batteries, and the rule of thumb is total mAh divided by 1000 = the max amp charger you can use. So a 2100mAh battery can be charged with a 2.1A charger.
On that note, I charge my Samsung S3, which has a 2100mAh battery, with a 2.1A car charger without any issue.
Sent from my SGH-T999 using Tapatalk 2
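That "capacity divided by 1000" rule of thumb is just charging at a 1C rate. A one-liner sketch (the helper name is made up, and 1C is a hobbyist rule of thumb, not a guarantee for any given pack):

```python
def one_c_charge_current_a(capacity_mah: float) -> float:
    """Hobbyist '1C' rule: max charge current (A) = capacity (mAh) / 1000."""
    return capacity_mah / 1000.0

print(one_c_charge_current_a(2100))  # 2.1 -> a 2.1 A charger for a 2100 mAh pack
```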
I used the N10's charger to charge my iPod Nano 3rd gen, no problem

Question about the 5.3v charger

I don't know much about voltage and stuff, but regarding the 5.3V charger: I assume it's okay to use a 5V charger to charge the Note 3, it just might be a bit slower, correct?
Now what about using the 5.3V charger on the iPad Air? I can't find any info on its charger, or online, about what voltage it uses. Will 5.3V be too much for it or anything?
guttertrash said:
I don't know much about voltage and stuff, but regarding the 5.3V charger: I assume it's okay to use a 5V charger to charge the Note 3, it just might be a bit slower, correct?
Now what about using the 5.3V charger on the iPad Air? I can't find any info on its charger, or online, about what voltage it uses. Will 5.3V be too much for it or anything?
It's perfectly fine to use 5V; that's standard. The voltage the wall adapter puts out is not what determines your charge speed, it's the amperage your device draws - which I believe is capped at 1800 mA for the Note 3. It doesn't matter what the charger's amperage rating is (it can be higher); the Note 3 will only draw 1800 mA max, and less if the source provides less. When it comes to the 5.3V charger, there's a bit of mixed information from what I've gathered. Plenty of people have used it with no problems, considering 0.3V is a fairly small difference, but there will be some who tell you otherwise. I personally avoid 5.3V just because I have plenty of alternative chargers.
If the source is lower, it will increase the time it takes to charge. But 5.3 volts as an output will actually drop to 5 once it is loaded by your device. Just like 5 drops to about 4.7 when loaded by a device. The unloaded output of a charger is always higher than when it is connected.
It's the device that determines the amount of charge drawn; the numbers on the charger are just the maximum it can provide. (Think of it like a car: your engine can go up to 200 km/h, but the actual speed is determined by how much you tell it to give you.)
I charge my Note 3 with my 15V Asus TF700 charger. It still only charges at the max the Note 3 can draw.
Essentially a 15V charger can safely provide the 5V, but a 5V charger can never provide the charge for a device which requires 15V.
Besides, a USB 2.0 cable can't go over 5 volts anyway, so your iPad is safe.
ShadowLea said:
It's the device that determines the amount of charge drawn; the numbers on the charger are just the maximum it can provide. (Think of it like a car: your engine can go up to 200 km/h, but the actual speed is determined by how much you tell it to give you.)
I charge my Note 3 with my 15V Asus TF700 charger. It still only charges at the max the Note 3 can draw.
Essentially a 15V charger can safely provide the 5V, but a 5V charger can never provide the charge for a device which requires 15V.
Besides, a USB 2.0 cable can't go over 5 volts anyway, so your iPad is safe.
Now I'm not trying to be rude or anything, but I think you have a few facts astray here. If you can prove me wrong I'll be happy to learn, since it has been a while since I took circuits. First off, you're right about the device being a big factor in the rate of charge; in fact, technically the charger is in the device, and the wall adapter is just the source. The source will force a certain voltage, i.e. 5V or 15V, and can provide up to a certain current (amperes). The charger in the device then determines how much of this current to draw.
With all that being said, the reason you can use your Asus Transformer charger is because it only forces 5V unless the Transformer is plugged in, at which point it kicks up to 15V. So you're not actually plugging a 15V source into your N3; if that were possible, you'd be on thin ice. Also, I'm pretty sure you'd be able to charge your Asus via, say, the Note 3 charger (really slowly though) - probably not while it's turned on, but turn it off, leave it plugged in for a few hours, and you should see a change.
A USB cable is just a wire: if you put 5.3V on it, then your device receives 5.3V (minus the voltage drop due to the resistance of the cable, but that's negligible unless you're using a cheap, thin cable). That said, 5.3V should be within tolerance for pretty much every USB charging device out there, including the iPad (which I think uses a 5.3V charger itself).
Sent from my SM-N900T using xda app-developers app
Cool, good to know I won't fry either device. Thanks for all the help.
Solarenemy68 said:
If the source is lower, it will increase the time it takes to charge. But 5.3 volts as an output will actually drop to 5 once it is loaded by your device. Just like 5 drops to about 4.7 when loaded by a device. The unloaded output of a charger is always higher than when it is connected.
The Galaxy Note 3 has a digital charger, so I don't think undervolting is the issue.
Sent from Note 3
MILJANN said:
The Galaxy Note 3 has a digital charger, so I don't think undervolting is the issue.
Sent from Note 3
It's not undervoltage. The resistance R of the USB cable is fixed (it varies from cable to cable, but is constant for any single cable), so as the current load I increases, the voltage V lost across the cable also increases, according to V = I*R.
Given a 5V source loaded to 1800 mA, you would expect to see 4.7V at the phone's end of the USB cable. The resistance of the USB cable itself comes out to around 0.167 ohms in that scenario, which is a perfectly reasonable value for the wire gauge used in those cords.
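That back-of-the-envelope figure follows directly from Ohm's law; a quick sketch reproducing the numbers above (illustrative values only, and `cable_drop_v` is a made-up helper):

```python
def cable_drop_v(current_a: float, resistance_ohm: float) -> float:
    """Ohm's law: voltage lost along the cable, V = I * R."""
    return current_a * resistance_ohm

# ~0.167 ohm of cable carrying 1.8 A drops ~0.3 V,
# so a 5.0 V source shows up at the phone as roughly 4.7 V.
drop = cable_drop_v(1.8, 0.3 / 1.8)
print(round(5.0 - drop, 1))  # 4.7
```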
CalcProgrammer1 said:
A USB cable is just a wire: if you put 5.3V on it, then your device receives 5.3V (minus the voltage drop due to the resistance of the cable, but that's negligible unless you're using a cheap, thin cable).
Or using a long cable. With my (S5) 5.3V charger, I can finally use those 3m/10ft cables, which is really convenient for me.
pizzaman79 said:
Or using a long cable. With my (S5) 5.3V charger, I can finally use those 3m/10ft cables, which is really convenient for me.
In lieu of using higher voltage chargers to use with long cables, you can also use chargers that provide more current @ 5V or, of course, lower resistance cables.
I'm afraid not. The voltage will drop too far below 5.0V and the device will not accept the offered power, no matter how high the available current. That's based both on physics and on personal experience. Resistance in cables is an issue at 10 metres, even in those with thick copper cores.
Edit: Delete.
pizzaman79 said:
I'm afraid not. The voltage will drop too far below 5.0V and the device will not accept the offered power, no matter how high the available current. That's based both on physics and on personal experience. Resistance in cables is an issue at 10 metres, even in those with thick copper cores.
You are absolutely correct; however, that isn't exactly what I was trying to share. I was trying to share a solution for situations where the charger cannot keep up and thus the voltage sags. I did not expect a 10-metre situation; the longest I have is 5m. On my 5-metre cable I don't notice any appreciable voltage drop from any of my chargers, so I generally consider it negligible. How much do you drop across 10 metres, and at how many amps?
On a side note, I think the phone has some kind of safety mechanism when it comes to voltages. I just plugged the 5.3V charger from my S5 into the S3 and it seemed to work fine. However, I also recently - and I only just realised this - plugged a 7.2V supply into my S3, and though it did not charge, nothing bad seemed to happen. It was an off-brand 5V USB charger, and when it died, it cranked up the voltage.
fusionstream said:
You are absolutely correct; however, that isn't exactly what I was trying to share. I was trying to share a solution for situations where the charger cannot keep up and thus the voltage sags. I did not expect a 10-metre situation; the longest I have is 5m. On my 5-metre cable I don't notice any appreciable voltage drop from any of my chargers, so I generally consider it negligible. How much do you drop across 10 metres, and at how many amps?
On a side note, I think the phone has some kind of safety mechanism when it comes to voltages. I just plugged the 5.3V charger from my S5 into the S3 and it seemed to work fine. However, I also recently - and I only just realised this - plugged a 7.2V supply into my S3, and though it did not charge, nothing bad seemed to happen. It was an off-brand 5V USB charger, and when it died, it cranked up the voltage.
The voltage drop is dependent on gauge, copper quality and length. I never measured it at both ends, but I estimate 0.3-0.5 volts based on which chargers do and don't charge which phones at 10ft. Note that I mixed up ft and metres, lol - my cables are 3m. At that length, voltage drop is an issue for me.
7V would perhaps be fine for a 10m cable. I wish we knew the tolerance range of voltages (at the USB input) that common smartphones accept.
I think the reason the Galaxy Tab S tablets have a 5.3-volt charger is to compensate for the bigger voltage drop through the longer charging cable provided.
The voltage must be the correct voltage or near to it. I think, from what others have said, 5.3 is near enough to 5.0 volts.

YotaPhone 2 charger is Qualcomm Quickcharge 2.0 compatible

Maybe this is old news but today I learned that the YotaPhone 2 charger shipped with the phone is actually Qualcomm Quickcharge 2.0 compatible. This means you can also charge other Qualcomm Quickcharge 2.0 compatible phones with it, like my other phone the Moto G4+. Works perfectly.
Yes, I've been using my Quick Charge 3.0 charger and it's charging at 9V, ~1.3A.
kbal said:
Yes, I've been using my Quick Charge 3.0 charger and it's charging at 9V, ~1.3A.
Don't - fast charging will greatly reduce your battery life.
Sent from my SM-N930F via Tapatalk
kingtiamath said:
Don't - fast charging will greatly reduce your battery life.
Sent from my SM-N930F via Tapatalk
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
VirtuaLeech said:
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
I'm afraid it does. I have done many experiments myself, and batteries frequently charged with fast charge can, in as little as 6 months, give you no more than 70% of their original capacity.
Sent from my SM-N930F via Tapatalk
..the same goes for wireless charging btw.
Amplificator said:
..the same goes for wireless charging btw.
Are you joking? What is your answer based on?
Wireless charging runs at a much lower amperage, so it should be the best solution for charging your phone.
nonyhaha said:
Are you joking? What is your answer based on?
Wireless charging runs at a much lower amperage, so it should be the best solution for charging your phone.
My answer is based on simple physics.
Just because the amps are lower doesn't mean it's not bad for the battery.
Wireless charging is way less efficient than any form of wired charging.
What happens to the loss? Well, it gets dissipated as heat - and what is the "big killer" of lithium batteries? ..heat.
For this single fact alone, denying that wireless charging causes more harm than cabled charging is simply.. well, silly.
The only ones denying this are either unaware of simple science or are lying to you, probably to sell you a charger
Yes, every form of charging, even at a theoretical 100% efficiency will heat up the battery due to chemical reactions inside the battery, but the lower efficiency you have the more energy is converted into heat - thus you do more damage and getting even less actual battery-energy out of it.
Simply put: the best charging method is the one that produces the least amount of heat while maintaining a high efficiency - wireless charging is simply not that.
Charging over a cable at 90% efficiency means 10% is being converted into heat (not all 10%, but for argument's sake, play along), whereas wireless charging might be at.. 50%, depending on circumstances (probably a lot closer to 70% than 50%, but again, for argument's sake).
This means that the other 50% is just turned into wasted, unnecessary and unwanted heat.
The percentages obviously aren't correct in this example, but it's more to get the point across.
With wireless charging you do more damage (it is subjective as to whether this matters to you) to the battery than you would by using a cable, simply because you create more excess heat whose only purpose is to heat up the battery and the surrounding area rather than actually going into the battery itself.
If we consider the 50% efficiency of the aforementioned example, this means that you would need to charge your device for almost twice as long as when you use a cable. Not only does it create more heat by virtue of being inductive charging, but it will be doing so for, again, almost twice the length of time.
Efficiency also depends on things like distance - the less "perfect" your phone is placed on the charger the less efficient, and thus more wasteful it is.
Google something like "qi wireless charging overheating" and you will see plenty of people reporting on overheating problems when using wireless charging. This is because of all this wasted energy that is dissipated as heat - instead of "filling" the battery it simply heats it, and the surroundings, up.
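The heat-and-time argument above can be put into rough numbers. This is a toy model using the post's own for-argument's-sake efficiencies (90% wired vs 50% wireless), not measured data, and `charge_stats` is a made-up helper:

```python
def charge_stats(battery_wh: float, input_w: float, efficiency: float):
    """Return (hours to full, Wh wasted as heat) for a given input power."""
    delivered_w = input_w * efficiency          # power actually stored
    hours = battery_wh / delivered_w            # time to fill the battery
    wasted_wh = input_w * hours - battery_wh    # everything else becomes heat
    return hours, wasted_wh

# Illustrative only: a 10 Wh battery fed 5 W of input power.
wired = charge_stats(10, 5, 0.90)     # ~2.2 h, ~1.1 Wh lost as heat
wireless = charge_stats(10, 5, 0.50)  # 4.0 h, 10.0 Wh lost as heat
```

The toy numbers illustrate the post's point: halving the efficiency nearly doubles the charge time and multiplies the waste heat many times over.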
Despite being made to the same specs, this seems to differ from charger to charger, as this thread here on XDA would indicate: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
If you look at the version specifications, you see that version 1.2 of the "low power" Qi charging branch, which phones are a part of, increased the power to up to 15W.
Unless they also worked on the efficiency this would actually mean that version 1.2 does more damage to the battery than 1.0 and 1.1, but for that you would have to dive a bit deeper than the information given in that link.
But as always it's sort of subjective as to what point people will see wireless charging as being too wasteful and/or damaging.
Personally, I don't care, because the convenience of wireless charging by far outweighs the little damage it does to a battery, in my opinion, and the same goes for Quick Charge as well. By the time I would see a noticeable effect on battery life, I will probably have bought a new phone anyway.
If we take Qualcomm's Quick Charge for example, I think QC 3.0 is at the point where people shouldn't really care about the negative impact. If you read the spec sheet for QC 3.0, it's basically a tweaked version of QC 2.0 (well, duh) where the power delivery is controlled much better than in QC 2.0, bringing both the efficiency and therefore the speed to a much higher level, even though both are rated for 18W.
Some reading for those who still doubt basic physics :
http://batteryuniversity.com/learn/article/charging_without_wires
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures
http://batteryuniversity.com/learn/article/all_about_chargers
http://batteryuniversity.com/learn/article/ultra_fast_chargers
..and the best of all: https://google.com/
But let me ask you the same question you asked me; and I quote:
nonyhaha said:
Are you joking? What is your answer based on?
..that probably sounded very condescending (which is not how it was intended, of course), but I'm curious as to where you've acquired this absurd idea that Qi wireless charging is the best method of all? It's very likely the worst of all, actually.
There is almost no heat dissipated with QC 3.0.
For me, quick charging is a big help - it saves hours, especially if you have a large QC battery or power bank. The Yotaphone battery charges especially quickly with a QC charger.
"..Wireless charging is way less efficient than any form of wired charging..."
Yes, because you first convert AC 110V or 240V to a lower voltage, e.g. DC 5V, with an efficiency of maybe 85%.
Then this 5V DC is chopped into a long-wave AC voltage (about 19V / 110 to 205 kHz) and sent to a copper coil in the Qi transmitter.
There the energy passes via resonant inductive coupling (a magnetic field) through an air gap to the Qi receiver - again with an efficiency of perhaps 70%.
The magnetic induction in the receiver coil again delivers a long-wave AC voltage, which is converted into an adequate DC voltage (again with an efficiency of about 70%).
So frankly speaking, you may be telling a bit of truth regarding losses converted to heat - but this heat occurs everywhere except inside the Li-Po battery, where it occurs only in the last step: the conversion of electrical energy into a chemical process inside the Li-Po.
Take a look at the label on your Qi charger and you will notice something like the following: Input 5V/2A, Output 5V/1A (a loss of 50%).
Almost all lithium batteries have their own charging controllers on board, which take care of the correct charging parameters. Those controllers are designed to charge, and also quick charge, Li-Po batteries in the right manner.
Enough theory.
Just follow the electrical path:
in the case of a direct charger: USB connector -> copper wire -> smartphone -> copper wire -> LiPo
in the case of a Qi charger: USB connector -> copper coil -> air gap -> copper coil -> copper wire -> LiPo
so there's no basic difference in how the LiPo is connected to the power - in both cases it's by a copper wire
in both cases you can charge with, let's say, 5V/1A (of course the LiPo will be charged with its own characteristic voltages and currents)
modern LiPos are built for a life of 700 to 1000 charging cycles (about 2 years), and nobody knows if a LiPo would live longer by charging it slowly.
You can charge your smartphone in a fridge to prevent high temperatures.
USB devices are smart; they negotiate the charge load between themselves via a protocol. There is no danger in taking a smartphone that can be charged at 1.4 amps and connecting it to a 2.1 amp charger.
You should take more care over the USB cable - it should be able to pass the required amps to the devices.
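The conversion chain sketched in this post multiplies out stage by stage. A quick sketch using the illustrative efficiencies quoted above (85%, 70%, 70% are the post's rough guesses, not measurements; `chain_efficiency` is a hypothetical helper):

```python
def chain_efficiency(*stages: float) -> float:
    """Overall efficiency of cascaded conversion stages is their product."""
    eff = 1.0
    for stage in stages:
        eff *= stage
    return eff

# AC -> 5V DC (~85%), inductive link (~70%), receiver conversion (~70%).
overall = chain_efficiency(0.85, 0.70, 0.70)  # ~0.42: less than half arrives
```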
Yes, it's less efficient and worse for the battery; maybe it takes a few minutes more to charge and costs a little more to charge. But it's much more pleasing not to use cables, and very impressive too. I love wireless charging.
Amplificator said:
My answer is based on simple physics.
...
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter side. So no heating at the receiver coil, and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said to me before, you should get your physics knowledge up to date. I am a graduate with a physics degree.
nonyhaha said:
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter side. So no heating at the receiver coil, and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said to me before, you should get your physics knowledge up to date. I am a graduate with a physics degree.
Yes, I do think I know what I'm talking about - but luckily you came to the rescue and used your alleged physics degree to write a reply that proved me wrong with all of your facts, right?
Oh, no.. you didn't - you just doubled down instead, well done.
It doesn't matter (and is not important in this case) where the heat dissipation happens (and never did I claim it happened at the receiver - only that it happens) - the battery is still being heated up regardless, due to the energy loss.
If someone with an alleged physics degree keeps denying that the battery is heated up accordingly to my previous post then I doubt that you finished at the top of your class, if at all, sorry. I would really like to see all your evidence you have against what I wrote in my previous post (and that tons of people are posting about on the interwebz).
Just give it a go on Google, such as this thread from XDA: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
You can even do a simple charging test of your own, just compare battery temperatures while using a Qi wireless charger, QC2.0 and another at 1A.
Is everyone posting about high temperatures while using Qi chargers lying? Why would they do that? ..maybe the wired-charging mafia are paying people to discredit the WPC and other groups.. hm, maybe.
Yeah, it's getting a bit ridiculous, but silly claims require silly responses, sorry
If you can actually prove what I was saying in my previous post is wrong then I'll gladly accept it, but I do not take "na-ah, not true" with any degree of seriousness and neither do I give credit to claims of physics degrees. In that case I'm an ESA astronaut currently in space - see where this is going?
I go by what you actually write, not what you claim. The only reason for boasting about alleged degrees is to divert attention from the lack of any credible proof - disprove what my previous post said and I'll gladly accept it.
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest-supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201).
Am I missing something?
Any ideas/replies would be greatly appreciated!
The Yotaphone 2 charger should indicate active quick charging by lighting up "Yotaphone" with white LEDs on the charger. If it's charging at 5V only, the charger doesn't light up.
I don't think there's something wrong with your phone - just that the charging estimation is inaccurate (at the moment).
Well, my charger's Yotaphone logo is lighting up, so I guess it's working then. Thanks for the reply!
I've had to put the two-pin Yotaphone charger block into a three-pin UK adaptor to try it. Annoyingly & worryingly it buzzes a lot & quite loudly - is that the same for everyone?
zippyioa said:
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201)
Am I missing something?
Any ideas/replies would be greatly appreciated!
Perhaps something is wrong with the cable, not the charger. Something like that happened to me some time ago - when I used a different cable, QC worked again.
I had already tried three different chargers and two different cables.
If the earlier post about the Yotaphone charger lighting up is correct, I think the phone is quick charging OK.
I guess I was expecting something similar to my Note 4, where it actually stated "fast charging" in the battery menu when charging with a QC charger.
That message would then change to "charging" if I used a standard charger instead.
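For what it's worth, the 55-minute estimate quoted earlier is roughly what simple arithmetic predicts. The figures below are assumptions for illustration - a nominal 2500 mAh pack (the commonly quoted Yotaphone 2 capacity) charging at an average of about 1 A:

```python
# Naive time-to-full estimate: remaining capacity / average charge current.
# Battery capacity and charge current below are assumptions for illustration.

def minutes_to_full(capacity_mah, soc_percent, avg_current_ma):
    remaining_mah = capacity_mah * (1 - soc_percent / 100)
    return remaining_mah / avg_current_ma * 60

# Assumed: 2500 mAh pack at 63% charged, averaging ~1000 mA.
est = minutes_to_full(2500, 63, 1000)
print(f"~{est:.0f} minutes to full")  # ~56 minutes
```

In practice phones taper the current as the battery fills, so the real average current keeps changing - which is one reason on-screen estimates drift and can look identical across different chargers.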
I bought a new power bank; it seems to charge other phones OK but NOT the Yotaphone. The power bank displays the percentage charge for about 10 seconds, then the display goes off - and the Yotaphone stops charging too. Other phones and gadgets don't cut off. Anyone else have this?
The power bank is QC3.0. I have tried using different cables - always the same.
Sometimes it charges OK. I thought my power bank was fake until I found it charged other gadgets well.
I also noticed that my YotaPhone 2 sometimes doesn't want to charge. I just plug it in (cable & charger original), the YotaPhone logo lights up, but the phone just doesn't charge! I will try with my power bank and see what happens.
I haven't figured out the cause yet; maybe it's because mine has an unlocked bootloader, TWRP, root and Xposed. (YD206)