Anyone ever use one of these?
http://cgi.ebay.com/NEW-EMERGENCY-A...ryZ48492QQssPageNameZWDVWQQrdZ1QQcmdZViewItem
Please comment on:
1) how fast they charge
2) how much they charge the phone
thanks!
The Kaiser battery is rated for 3.7 volts and the battery that device uses is 1.5 volts. It would charge at the same rate, but drain the AA very quickly. The cost of the batteries it would take to power that thing would quickly outweigh the benefit. If it had two AAs wired to it, it might be remotely useful. I mean, if you absolutely can't survive if your phone dies, then yes, it would give you enough battery life to keep your phone going for a few hours.
Yeah... well, I bought two cheapo spare 1600 mAh batteries off eBay with the thought of carrying one around with me to swap out if needed... but I was considering getting one of the 2-piece rubberized cases for my phone. In that case, I believe the battery compartment will be inaccessible, and I don't want to snap off the back cover constantly and wear down its integrity. Therefore I thought something like this might be a good option.
Even with free access to the battery door, I would think opening it up on a regular basis to swap batteries isn't good for it (would get loose, etc...).
Motorola P790
Well, I used the Motorola P790 and it is pretty good, and rechargeable. I can recharge it together with my phone with one charger. It gives about 30% recovery of your battery in about 2 hours.
rovanesyan said:
The Kaiser battery is rated for 3.7 volts and the battery that device uses is 1.5 volts. It would charge at the same rate, but drain the AA very quickly. The cost of the batteries it would take to power that thing would quickly outweigh the benefit. If it had two AAs wired to it, it might be remotely useful. I mean, if you absolutely can't survive if your phone dies, then yes, it would give you enough battery life to keep your phone going for a few hours.
Click to expand...
Click to collapse
Not really; these units have a small switching power supply which boosts the voltage to be able to charge the phone (if you look at the picture you can see the copper inductor coil for the oscillator circuit inside the top clear piece). A Kaiser won't charge with an input voltage under 5 volts.
I don't know how long it takes to charge, but I doubt that you can get more than 1 good charge from a fresh battery.
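As a rough sanity check on that (a back-of-the-envelope sketch with assumed figures, not measurements of this particular gadget), here is the arithmetic in Python:

```python
# Rough estimate of how much a single AA can recharge a phone battery
# through a boost converter. All figures below are assumptions for
# illustration, not measured values for the eBay gadget in question.

aa_capacity_mah = 2500          # typical alkaline AA at low drain; much less at high drain
aa_voltage = 1.5                # nominal AA voltage
boost_efficiency = 0.80         # assumed efficiency of the little boost circuit
phone_batt_voltage = 3.7        # nominal Li-ion cell voltage
phone_batt_capacity_mah = 1350  # roughly a stock Kaiser battery

# Energy available from the AA after converter losses (in mWh)
energy_in_mwh = aa_capacity_mah * aa_voltage * boost_efficiency

# Charge that energy can push into the 3.7 V cell (in mAh)
charge_delivered_mah = energy_in_mwh / phone_batt_voltage

print(f"Energy from one AA after losses: {energy_in_mwh:.0f} mWh")
print(f"Charge delivered into the phone battery: {charge_delivered_mah:.0f} mAh")
print(f"That is about {100 * charge_delivered_mah / phone_batt_capacity_mah:.0f}% of a full charge")
```

Which is consistent with the "no more than one good charge per fresh battery" guess above.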
I've had all sorts of charging problems recently. With Wi-Fi, GPS & phone on it would use about 500 mA of power. But if I was running SatNav it would actually use about 850 mA.
What I've done is twofold. I've bought a 2700 mAh internal battery for the phone and I've got a 12-24 V charger that puts out 2 A (2000 mA), and so far it has kept the device fully charged, even with Bluetooth running!
I'm not sure how to gauge the 2700 mAh battery as I've only been running it a day. I think it needs a bit more testing before I use it full time without the spare, but it's also good to have the spare....
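For what it's worth, the arithmetic behind why charger current matters here is simple; a small sketch using the figures from the post above (real draw and charger efficiency will vary):

```python
# Net charge rate = what the charger can supply minus what the running
# device draws. Figures come from the post above; real numbers vary.

device_draw_ma = 850         # Wi-Fi + GPS + phone + SatNav, per the post
battery_capacity_mah = 2700  # the extended battery mentioned above

for charger_ma in (500, 1000, 2000):
    net_ma = charger_ma - device_draw_ma
    if net_ma <= 0:
        print(f"{charger_ma} mA charger: battery still drains at {-net_ma} mA")
    else:
        hours = battery_capacity_mah / net_ma
        print(f"{charger_ma} mA charger: charges at ~{net_ma} mA "
              f"(~{hours:.1f} h for a full 2700 mAh, ignoring losses)")
```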
Is it possible/safe to connect an external battery of the same voltage but different mAh directly to the internal battery of a tablet to raise its capacity?
I am thinking about modding a tablet case to have a built-in battery pack for my Iconia A500 tablet. I can easily squeeze four cell phone batteries in between the folds and make two 3.6 V 3000 mAh (or higher) battery packs without adding much bulk or weight to it. I can probably squeeze a USB hub in there too while I am at it.
I looked at the internal battery and it looks like I can splice its wires and add a connector to hook up the two 3.6 V packs for easy connection.
I don't know much about electronics, but from my limited knowledge and layman's point of view, as far as the tablet is concerned, it will just have a higher-capacity battery, won't it?
Can it be done safely without any other electronic components?
I'm no expert in this, but I think it's a Li-ion battery, and AFAIK it may confuse your fuel gauge IC and battery charging IC, as these are probably calibrated to work with that particular battery type and capacity. But I don't believe it would damage anything if you connect it in parallel.
I would be careful with that if I were you. While it seems simple enough, batteries are much more complex than just voltage and capacity.
Most of the time, external batteries just go through the charging connection of the device so it can handle the power itself. If you connect it directly, you may bypass some part of the charging circuit (often times there's a circuit in the battery itself). Basically, I'm thinking the batteries may not fully charge or one will overcharge. It really depends on how it's set up. I'm not an expert with the design of rechargeable batteries though, so it may be safe.
In short, I would suggest connecting the batteries to the charger to extend your battery life. I'm sure others can offer additional insight.
A good read about batteries: http://batteryuniversity.com/learn/article/serial_and_parallel_battery_configurations
I can't comment on if it will work or if it is safe..
Cellphone batteries have built-in protection to prevent them from overcharging or over-discharging, don't they?
The problem I have with connecting to the charging port is that it needs 12 V. Connecting 3 or 4 batteries will give me either too little or too much voltage; I can't get 12 V out of 3.7 V batteries without some kind of additional electronics to regulate the voltage, and I don't have sufficient knowledge to do that. This is why I figured I'll do it directly. And charging would be simpler too if I can use the charger to charge both the internal and the external batteries at the same time.
I was hoping to hear from someone who has done that already. I guess no one is brave/dumb enough to try it.
Sounds interesting, looking forward to it.
In a "perfect" environment this would be possible.
The internal resistance and voltage differences of the batteries would even out the capacity differences while charging.
You can avoid having these currents between the batteries by using diodes to separate the charging currents.
However, my recommendation in terms of reliability, safety and efficiency is to use two or more batteries with the same capacity and charge/discharge curve.
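To put a rough number on the cost of that diode trick (values below are typical assumptions, not measurements), a diode-OR looks like this:

```python
# Diode-ORing two packs stops them from back-feeding each other, but each
# diode costs a forward-voltage drop. Assumed values for illustration only.

schottky_drop_v = 0.35   # typical Schottky forward drop at moderate current
cell_voltage_v = 3.7     # nominal Li-ion cell voltage
load_current_a = 1.0     # assumed average load current

voltage_at_load = cell_voltage_v - schottky_drop_v
power_lost_per_diode_w = schottky_drop_v * load_current_a

print(f"Voltage seen by the device: {voltage_at_load:.2f} V instead of {cell_voltage_v} V")
print(f"Heat dissipated in the conducting diode: {power_lost_per_diode_w:.2f} W")
print(f"Roughly {100 * schottky_drop_v / cell_voltage_v:.0f}% of the pack energy is lost in the diode")
```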
is it the phone or the battery that cuts current when the battery is fully charged?
ridethisbike said:
is it the phone or the battery that cuts current when the battery is fully charged?
Click to expand...
Click to collapse
On most phones, tablets, etc. it's the device that cuts off the charging current when the battery is fully charged.
However, nearly every battery also has a protection circuit that cuts off the battery in case of
under-voltage, over-voltage and over-current, sometimes even over-temperature.
samotronta05 said:
On most phones, tablets, etc. it's the device that cuts off the charging current when the battery is fully charged.
However, nearly every battery also has a protection circuit that cuts off the battery in case of
under-voltage, over-voltage and over-current, sometimes even over-temperature.
Click to expand...
Click to collapse
Ah, well, it seems to me that if the battery has to take action, then you're doing it wrong.
I'm not sure exactly how diodes work so I can't comment on that, but definitely use like batteries (two 3000 mAh batteries as opposed to one 3000 mAh and one 3600 mAh) so as not to confuse the device about the charge, causing the 3600 not to get fully charged. Just make sure they get connected properly (in parallel) and it should work just fine.
In reality it's no different than someone using four car batteries to power their car audio system. As long as it's done properly (in parallel), there is really no harm.
I wouldn't recommend it. While technically it is something that you can do, as other posters have said, the internal power system is configured in a particular manner. It is likely that the regulation and protection are minimal if they only expect the internal battery. There is also the issue of charging: if you did as you say, you would be discharging the batteries into each other if they weren't at exactly the same level. So this isn't the recommended method.
What I would recommend you do is put them together so you get something over six volts (two 3.6 V packs in series) and put a 5 V regulator on it. Then connect it to a USB cable and charge through there. That way you utilize the system's built-in protection, aren't messing with the battery connections, and can swap it in much easier.
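A quick sketch of what a plain linear 5 V regulator would waste in that setup (the charge current is an assumed figure, not something taken from the thread):

```python
# Two 3.6 V cells in series give roughly 7.2-8.4 V. A linear 5 V regulator
# simply burns the difference as heat. Assumed charge current for illustration.

pack_voltage_v = 7.4       # two Li-ion cells in series, nominal
usb_voltage_v = 5.0
charge_current_a = 0.8     # assumed USB charge current drawn by the tablet

power_delivered_w = usb_voltage_v * charge_current_a
power_burned_w = (pack_voltage_v - usb_voltage_v) * charge_current_a
efficiency = power_delivered_w / (power_delivered_w + power_burned_w)

print(f"Delivered to the tablet: {power_delivered_w:.1f} W")
print(f"Burned in the regulator: {power_burned_w:.1f} W")
print(f"Efficiency of the linear stage: {efficiency:.0%}")
```

That roughly two-thirds efficiency is what the later reply means by "even more inefficient".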
You can... as long as you meet these requirements:
1) Note your original battery's voltage.
2) Buy more of the same kind, or you may use others, provided the voltage is the same. If you can find a higher watt-hour (capacity) battery of the same voltage, go for it.
3) Never mix batteries with different capacity/voltage ratings, i.e. if you have a 3.7 V 3200 mAh battery and are planning to add one more in parallel, go for another 3.7 V 3200 mAh, not a 3.7 V 1500 mAh or a 3.5 V 3200 mAh battery.
4) If you can't find a higher-capacity battery of the same voltage, go for parallel, i.e. join together the positives of two or more batteries, and also join together their negatives, then connect that to the battery terminals of the tab.
5) No problems will be caused by using a higher-capacity battery, provided you have matched the correct voltage. In your case you only have to ensure the proper voltage; the capacity can increase to any value, as the system takes only what it needs.
6) The sad thing is it will take more time to charge. But never try to increase the charging current; it will damage the power section. Just increase your patience. Most tabs come with a 1-1.5 A charger. Never exceed those limits.
You may go for it, if you can manage these...
Sent from my HD2 using XDA App
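Worked numbers for the parallel case described above (cell capacity and charger current are assumed example values):

```python
# Parallel packs of the same voltage add capacity; at a fixed charger
# current the charge time grows in proportion. Assumed figures below.

cell_voltage_v = 3.7
cell_capacity_mah = 3200
cells_in_parallel = 2
charger_current_ma = 1000   # many tabs ship a 1-1.5 A charger, per the post above

pack_capacity_mah = cells_in_parallel * cell_capacity_mah
charge_time_h = pack_capacity_mah / charger_current_ma  # idealised, ignores the CV taper

print(f"Pack: {cell_voltage_v} V, {pack_capacity_mah} mAh")
print(f"Rough charge time at {charger_current_ma} mA: {charge_time_h:.1f} h "
      "(real time is longer because of the constant-voltage taper)")
```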
giritrobbins said:
I wouldn't recommend it. While technically it is something that you can do, as other posters have said, the internal power system is configured in a particular manner.
Click to expand...
Click to collapse
Yes, to charge with a constant current until a certain voltage is reached.
giritrobbins said:
There is also the issue of charging: if you did as you say, you would be discharging the batteries into each other if they weren't at exactly the same level.
Click to expand...
Click to collapse
Yes, this is the main problem and it's also the reason I recommended using two batteries with the same capacity. Otherwise the constant charging and discharging between the batteries, depending on the size of the difference, makes the system very inefficient.
giritrobbins said:
What I would recommend you do is put them together so you get something over six volts (two 3.6 V packs in series) and put a 5 V regulator on it. Then connect it to a USB cable and charge through there. That way you utilize the system's built-in protection, aren't messing with the battery connections, and can swap it in much easier.
Click to expand...
Click to collapse
Which would make it even more inefficient.
showlyshah said:
3) Never mix batteries with different capacity/voltage ratings, i.e. if you have a 3.7 V 3200 mAh battery and are planning to add one more in parallel, go for another 3.7 V 3200 mAh, not a 3.7 V 1500 mAh or a 3.5 V 3200 mAh battery.
Click to expand...
Click to collapse
You can mix different batteries when the voltage is the same; however, this is not the recommended way, since you will not gain much capacity.
The smaller-capacity battery will be charged by the bigger one all the time.
However, if you say you don't care about the efficiency, then you can mix, for example, a 3.7 V 3200 mAh and a 3.7 V 3000 mAh battery.
samotronta05 said:
Yes, to charge with a constant current until a certain voltage is reached.
Yes, this is the main problem and it's also the reason I recommended using two batteries with the same capacity. Otherwise the constant charging and discharging between the batteries, depending on the size of the difference, makes the system very inefficient.
Which would make it even more inefficient.
You can mix different batteries when the voltage is the same; however, this is not the recommended way, since you will not gain much capacity.
The smaller-capacity battery will be charged by the bigger one all the time.
However, if you say you don't care about the efficiency, then you can mix, for example, a 3.7 V 3200 mAh and a 3.7 V 3000 mAh battery.
Click to expand...
Click to collapse
The issue when you connect them like that will always occur when they are in parallel (you need to add some logic or some diodes in there). Using batteries of the same capacity will mitigate it to a point, but the internal resistance of a battery will dictate how fast it drains down.
As for the charging, that isn't what I am saying. The internal circuitry that he cannot change is expecting a certain battery with certain characteristics. Putting a second battery in there in parallel will potentially mess with that. If it doesn't realize the size of all the attached batteries it will register a fault, because it won't be charging correctly.
He isn't going for efficiency here. And a linear regulator is easy; if we want efficiency we can do that too. I was just suggesting an alternative that could be easily done.
And the most important point is not about voltages but about chemistry. Different chemistries have different voltages and different knees where the voltage drops suddenly.
The original battery is 7.4 V 3350 mAh, and the charger is 12 V 1.5 A.
So matching or exceeding that with batteries I can fit in the space I want is a problem.
I was going to use four i9000 1500 mAh batteries, which would fit perfectly, to make a 7.4 V 3000 mAh pack, but that would be lower than the original. I guess I can try to find higher-capacity batteries that fit in the case.
But if I understand it correctly, the current will keep flowing from the larger-capacity battery to the lower one; it will keep them at 100% until the original drops to, let's say, 3000 mAh, and then they will all drain at the same rate until they are all empty. And when I charge them, the charging circuitry will get its information only from the original battery, so it will keep charging until the original is full. It may not be as efficient, but it should still get me a significant boost, wouldn't it?
I suppose I could connect them all to make a 14.8 V 1500 mAh pack with a 12 V regulator and use the charging port, but that would leave me with the problem of charging the external pack. No idea how to do that; how do I cheaply and safely charge a 14.8 V battery pack?
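For reference, the series/parallel arithmetic for the two layouts mentioned above, assuming four identical 3.7 V / 1500 mAh cells:

```python
# Series adds voltage, parallel adds capacity. For four identical
# 3.7 V / 1500 mAh cells (as proposed above) the two layouts work out to:

cell_v, cell_mah = 3.7, 1500

configs = {
    "2S2P (two series pairs in parallel)": (2 * cell_v, 2 * cell_mah),
    "4S (all four in series)":             (4 * cell_v, 1 * cell_mah),
}

for name, (volts, mah) in configs.items():
    print(f"{name}: {volts:.1f} V, {mah} mAh, {volts * mah / 1000:.1f} Wh")

# Note: the total stored energy (Wh) is the same either way; what changes
# is whether you then need a regulator to match the 12 V charging port.
```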
ronkoni said:
The original battery is 7.4 V 3350 mAh, and the charger is 12 V 1.5 A.
So matching or exceeding that with batteries I can fit in the space I want is a problem.
I was going to use four i9000 1500 mAh batteries, which would fit perfectly, to make a 7.4 V 3000 mAh pack, but that would be lower than the original. I guess I can try to find higher-capacity batteries that fit in the case.
But if I understand it correctly, the current will keep flowing from the larger-capacity battery to the lower one; it will keep them at 100% until the original drops to, let's say, 3000 mAh, and then they will all drain at the same rate until they are all empty. And when I charge them, the charging circuitry will get its information only from the original battery, so it will keep charging until the original is full. It may not be as efficient, but it should still get me a significant boost, wouldn't it?
I suppose I could connect them all to make a 14.8 V 1500 mAh pack with a 12 V regulator and use the charging port, but that would leave me with the problem of charging the external pack. No idea how to do that; how do I cheaply and safely charge a 14.8 V battery pack?
Click to expand...
Click to collapse
Hey, why are you going to make a 14.8 V battery pack when your original one is 7.4 V? I don't really understand the logic. If you have extra batteries, then make the pack like this:
Two 7.4 V battery packs connected in parallel, that's all. No need to modify the charging circuit, no need to regulate the voltage, and no loss in voltage/current.
But if you make a higher-voltage pack, you have to regulate the output with some IC, and in turn you will suffer some voltage/current drop.
Sent from my HD2 using XDA App
I'd like to connect directly to the battery, but some seem to believe it is not wise if the mAh of the original is higher than that of the external pack.
The 14.8 V pack with a 12 V regulator option is for use with the tablet's charging connector, which requires 12 V. A safer option, but it leaves me with the charging problem.
ronkoni said:
I'd like to connect directly to the battery, but some seem to believe it is not wise if the mAh of the original is higher than that of the external pack.
The 14.8 V pack with a 12 V regulator option is for use with the tablet's charging connector, which requires 12 V. A safer option, but it leaves me with the charging problem.
Click to expand...
Click to collapse
http://www.ebay.com/itm/USB-2-0-fem...ultDomain_0&hash=item3f0cd836f3#ht_2792wt_902
that actually took me a lot longer to find than I had originally planned... lol. of course.... you're going to have to find the rest of your charger. but that'll be easy with that adapter
So it's about making a battery pack which you can connect to the tab via the charging port, and use as a backup when you are out of power?
If that's your idea, tell me about the size of the pack and how long you need to use it (i.e. backup time). If you do, I will work out a solution; also post your current battery and charger specs.
And I hope I can help with some circuits too.
Sent from my HD2 using XDA App
The idea is to have a battery pack that can fit in between the layers of the stand part of this tablet's case. The available space I want to use is about 2"x9", so since the space is limited, the batteries must be the flat type, I must be able to get 12 V out of them, and I must be able to charge them externally.
This is the original tablet battery, and the charger is 12 V 1.5 A.
Four Samsung i9000 1500 mAh batteries fit perfectly, and since I already have two extra ones lying around, I might as well go with these batteries, unless it is possible to get 12 V out of three batteries; then I could probably use three Samsung Note 2500 mAh batteries instead.
So, what circuitry do I need to make this thing power the tablet and get charged?
I thought about getting one of these battery packs, taking it apart and replacing its batteries with mine. Since it already has the necessary circuitry for regulating the voltage and for charging, it would cut down on the guesswork and probably be cheaper to build. I think they use three batteries in there, which would be better.
Hi,
Any way to accomplish this? To be clear, the battery will get a shorter life from the work regime I put it through. Unfortunately, the USB data cable of most phones also acts as a charger. I am using the phone for development, so this USB data cable is always attached to the phone and to the dev machine, forcing the battery to charge constantly, even at the slightest 1% discharge. It would be really good if I could take out the battery and still be able to run the phone.
Thank you!
kelogs said:
Hi,
Any way to accomplish this? To be clear, the battery will get a shorter life from the work regime I put it through. Unfortunately, the USB data cable of most phones also acts as a charger. I am using the phone for development, so this USB data cable is always attached to the phone and to the dev machine, forcing the battery to charge constantly, even at the slightest 1% discharge. It would be really good if I could take out the battery and still be able to run the phone.
Thank you!
Click to expand...
Click to collapse
Depends on the phone, I know on the moto defy there was a cable mod that would bypass the battery
Sent from my GT-I9300T using xda app-developers app
adamo3957 said:
Depends on the phone, I know on the moto defy there was a cable mod that would bypass the battery
Sent from my GT-I9300T using xda app-developers app
Click to expand...
Click to collapse
That would be a Samsung Galaxy Note GT-N7000
I didn't know that was possible!
Some (most?) battery circuits are designed to deal with a dead or shorted battery.
The circuit is not arranged in a direct line between charger, battery, load.
Disconnecting a battery connector also disconnects the temperature-measuring thermistor.
With an NTC thermistor, it would think that the battery is ice cold.
A resistor of the correct value would fool it into believing it's a reasonable temperature.
I tried disconnecting the battery on my Nook and it wouldn't power up.
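If anyone wants to estimate that "correct value", here is a hedged sketch using the standard NTC beta model; the 10 kΩ / B=3435 part is an assumption, not the spec of any particular phone's pack:

```python
import math

# Many packs use a 10 kohm NTC referenced to 25 C; the exact part differs
# per phone, so treat these constants as placeholders, not specifications.
R25_OHM = 10_000     # assumed resistance at 25 C
BETA = 3435          # assumed beta constant of the NTC

def ntc_resistance(temp_c: float) -> float:
    """Resistance of the assumed NTC at a given temperature (beta model)."""
    t_k = temp_c + 273.15
    t25_k = 25.0 + 273.15
    return R25_OHM * math.exp(BETA * (1.0 / t_k - 1.0 / t25_k))

# A fixed resistor equal to the NTC's value at "room temperature" should
# make the charger believe the (absent) battery is at a sane temperature.
for t in (0, 25, 40):
    print(f"{t:>3} C -> {ntc_resistance(t) / 1000:.1f} kohm")
```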
3.7 V supply circuit as battery
You can supply the device with 3.7 V (like the battery) from an external source. The only bad thing is that you have to attach wires to the gold-plated battery contacts on the device, or you can do it with small crocodile clips to avoid soldering (better).
If you are OK with this, here is how you do it.
Take off your battery and measure the DC voltage between the battery leads with a multimeter or voltmeter. Now you know what your battery gives to the device. Example for a 3-lead battery: you have 2 positive leads with the ground as reference (one slightly lower than the other) and the actual ground. So you have to supply 3.7 V to the same leads that the battery was supplying. You can check while inserting the battery back into the device.
How to make the 3.7 V supply:
You will need two resistors, some capacitors, an LM317 regulator, a heat sink for it, and a higher-voltage DC power supply (6-12 V).
Get an Android device and go to the Play Store. Install the "ElectroDroid" application; this will help you with sizing the LM317 regulator. Keep in mind that this is an adjustable regulator, so you need two resistors to set the output voltage to 3.7 V. The LM317 is a linear voltage regulator, so it will act as an uninterrupted 3.7 V battery. Be careful to get a big enough heat sink, depending on the current you will be supplying and the input voltage; you can also read the device's datasheet online.
You can build this circuit on a breadboard if you are familiar with electronics, or you can solder the parts point-to-point, or make a PCB if you can.
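For those who would rather skip the app, the LM317 datasheet formula can be worked out directly; the resistor pair below is just one plausible choice, not the only one:

```python
# LM317 output voltage (from the datasheet formula):
#   Vout = 1.25 * (1 + R2/R1) + I_adj * R2
# I_adj is typically ~50 uA and is nearly negligible with small resistors.
# R1 = 240 ohm and R2 = 470 ohm is one workable pair for ~3.7 V.

V_REF = 1.25
I_ADJ = 50e-6

def lm317_vout(r1_ohm: float, r2_ohm: float) -> float:
    return V_REF * (1 + r2_ohm / r1_ohm) + I_ADJ * r2_ohm

r1, r2 = 240, 470
print(f"R1={r1} ohm, R2={r2} ohm -> Vout = {lm317_vout(r1, r2):.2f} V")

# Dissipation check: with a 9 V input and, say, 0.5 A of load the LM317
# burns (Vin - Vout) * I, which is why the big heat sink matters.
v_in, i_load = 9.0, 0.5
print(f"Regulator dissipation at {v_in} V in, {i_load} A out: "
      f"{(v_in - lm317_vout(r1, r2)) * i_load:.2f} W")
```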
You're lucky it's an N7000.
The full schematics are there: http://forum.xda-developers.com/showthread.php?t=1813315
The PMIC's "_DETBAT" pin is connected to the battery connector's "VF" pin. Perhaps if you pull it high it'll boot up.
And btw, usually the HW is fully capable of starting off USB power. The thing is that the bootloader checks whether a battery is present and, if not, turns off the phone. This is because the phone, especially during bootup, can peak at much more than 500 mA, and the battery is there to compensate for the "missing" power.
//edit:
However, if you don't provide any power into the battery pins, it might try to charge them, and U607 (the switching charger) might not really like working without a load. This can generate a lot of noise AFAIK, so modding the kernel somehow to disable charging would be a good choice.
Rebellos said:
You're lucky it's an N7000.
The full schematics are there: http://forum.xda-developers.com/showthread.php?t=1813315
The PMIC's "_DETBAT" pin is connected to the battery connector's "VF" pin. Perhaps if you pull it high it'll boot up.
And btw, usually the HW is fully capable of starting off USB power. The thing is that the bootloader checks whether a battery is present and, if not, turns off the phone. This is because the phone, especially during bootup, can peak at much more than 500 mA, and the battery is there to compensate for the "missing" power.
//edit:
However, if you don't provide any power into the battery pins, it might try to charge them, and U607 (the switching charger) might not really like working without a load. This can generate a lot of noise AFAIK, so modding the kernel somehow to disable charging would be a good choice.
Click to expand...
Click to collapse
The system is powered off of Vbat - As a result, the charger MUST be active. Also, the N7000 is EASILY capable of drawing more than the maximum input current limit from Vbus, mandating extraction of power from the battery in some operating regimes.
The only way to achieve what the OP wants (total battery removal) would be with a dummy battery that had an external 4.0 volt power supply. Bad Things could happen if the device is connected to USB in this state.
However, to satisfy the OP's stated reasons for removing the battery (lots of time on USB), the likely best solution would be to disable the charging circuitry in the kernel at high states of charge. For example, one could set it up so that the charger would only be enabled when Vbat was below 4.0 volts, or when the fuel gauge SoC is below X per cent. See Ezekeel's "BLX" implementation for the Galaxy Nexus as one example of this.
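Not Ezekeel's actual BLX code, just a minimal sketch of the hysteresis idea described above (the 75/80% thresholds are arbitrary examples):

```python
# Minimal sketch of "only enable the charger below a chosen state of
# charge", with a little hysteresis so it doesn't toggle constantly.
# Threshold values are arbitrary examples, not anyone's actual defaults.

CHARGE_START_SOC = 75   # start charging when SoC drops below this
CHARGE_STOP_SOC = 80    # stop charging once SoC reaches this

def update_charger(soc_percent: int, charger_enabled: bool) -> bool:
    """Return the new charger-enable state for the given state of charge."""
    if charger_enabled and soc_percent >= CHARGE_STOP_SOC:
        return False          # battery is "full enough" - rest it
    if not charger_enabled and soc_percent < CHARGE_START_SOC:
        return True           # dropped below the floor - top it back up
    return charger_enabled    # otherwise keep the current state

# Example: plugged into USB all day, SoC drifting around the band
state = False
for soc in (82, 79, 76, 74, 73, 76, 79, 80, 81):
    state = update_charger(soc, state)
    print(f"SoC {soc:>3}% -> charger {'ON' if state else 'OFF'}")
```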
thanks!
Rebellos said:
You're lucky it's an N7000.
The full schematics are there: http://forum.xda-developers.com/showthread.php?t=1813315
The PMIC's "_DETBAT" pin is connected to the battery connector's "VF" pin. Perhaps if you pull it high it'll boot up.
And btw, usually the HW is fully capable of starting off USB power. The thing is that the bootloader checks whether a battery is present and, if not, turns off the phone. This is because the phone, especially during bootup, can peak at much more than 500 mA, and the battery is there to compensate for the "missing" power.
//edit:
However, if you don't provide any power into the battery pins, it might try to charge them, and U607 (the switching charger) might not really like working without a load. This can generate a lot of noise AFAIK, so modding the kernel somehow to disable charging would be a good choice.
Click to expand...
Click to collapse
Thanks for the link!
Thank you all for sharing knowledge and experiences. I have decided to just go for some cheap eBay replacement batteries due to some advice I got from a friend, which I am sharing below:
Do not fiddle with such a fine piece of hardware (i.e. a smartphone) by attaching exposed wiring to it. The gadget could easily slip from your hands and cause the loosely hanging wires to short-circuit upon landing on the floor. Definitely not a good prospect.
kelogs said:
Thank you all for sharing knowledge and experiences. I have decided to just go for some cheap eBay replacement batteries due to some advice I got from a friend, which I am sharing below:
Do not fiddle with such a fine piece of hardware (i.e. a smartphone) by attaching exposed wiring to it. The gadget could easily slip from your hands and cause the loosely hanging wires to short-circuit upon landing on the floor. Definitely not a good prospect.
Click to expand...
Click to collapse
Oh, give me a f***ing break! It's a phone, not an eggshell. And short-circuiting the wires would at worst damage the power supply, not the phone.
Entropy512 said:
The system is powered off of Vbat - As a result, the charger MUST be active.
Click to expand...
Click to collapse
This just isn't true. Modern PMICs not only have the option to stop charging the battery, but can also power the phone only from AC and do things like send power to the USB OTG port. So it is a matter of the PMIC knowing what to do. What I wish was possible was a nice app that told the PMIC to stop charging at 80% and then go to trickle mode. This would extend your battery's life by a lot; instead, it appears to charge to 100% and then back off the voltage a little.
bypass battery on unibody phone and run directly from charger
Entropy512 said:
The system is powered off of Vbat - As a result, the charger MUST be active. Also, the N7000 is EASILY capable of drawing more than the maximum input current limit from Vbus, mandating extraction of power from the battery in some operating regimes.
The only way to achieve what the OP wants (total battery removal) would be with a dummy battery that had an external 4.0 volt power supply. Bad Things could happen if the device is connected to USB in this state.
However, to satisfy the OP's stated reasons for removing the battery (lots of time on USB), the likely best solution would be to disable the charging circuitry in the kernel at high states of charge. For example, one could set it up so that the charger would only be enabled when Vbat was below 4.0 volts, or when the fuel gauge SoC is below X per cent. See Ezekeel's "BLX" implementation for the Galaxy Nexus as one example of this.
Click to expand...
Click to collapse
This would just disable the charging until the SoC dropped to the level you set though, correct? I.e. the phone is still running from the battery? If so, you still end up with more or less the same issue (although with some potential benefit from cycling at a lower SoC).
I have an HTC One X, so removing the battery and adding some circuit trickery isn't an option. But because of this damn unibody design there is even more motivation to run the phone directly from the charger, because I can't feed it with replacement batteries (which, he is right, is the best option for the OP).
Anyone know if this is even possible with software mods, based on the design of the phone's charging system? Or any sources for literature on this? I really want to save this battery if it's the only one I've got!
I am currently experimenting with this. I suppose I can only get 3 volts from the 5 V input of a USB charger. Going to need to hook up to a 12 V power source, I suppose. I built a power supply with a linear variable voltage regulator. I still want data transfer, though. I am using diodes to make sure no power flows the wrong way into my electrolytic capacitors. I will try to post a thread if it works, because I have not seen one yet.
Hello, sorry to post in an old thread, but this is the closest problem to mine that I could find.
So I'm using a 4G mi-fi modem on my PC. It's plugged in constantly, 24/7, through a USB cable to my PC. You can imagine the effect on the battery. I threw out two batteries because they went bad. Sadly, the modem won't turn on if it's not detecting a battery.
So I'm considering the capacitor route, just to fool the modem into thinking that a battery is installed; the real power comes from the USB data cable anyway. It's a Huawei E5577, and the battery has 4 terminals on it. The outermost terminals are (+) and (-), while I'm guessing the middle two are used to read the battery status (voltage, etc.). So what's the simplest schematic to achieve this, using the simplest capacitor circuit to fool the modem into thinking the battery is installed and working well?
Thank you
Hello!
Just curious if there is an issue with using my new Nexus 10 2A charger with other phones, such as my HTC Sensation or Blackberry Torch?
The Sensation uses a 1A charger, but I assume the phones are smart enough to only draw the current necessary, so they won't be damaged by drawing too much?
I'd like to just use the Nexus 10 charger and not have to carry other ones.
yes it is fine
Cool thanks
EniGmA1987 said:
yes it is fine
Click to expand...
Click to collapse
I heard though that:
*first, it creates unnecessary heat, because the current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second, this is less science/engineering, but someone said that the specific pins are made by different companies and the pins themselves can vary in terms of impedance, thus changing the overall circuitry of the device in the long run
*third, a Li-ion battery can pull more current than the default charger provides, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
nutnub said:
I heard though that:
*first, it creates unnecessary heat, because the current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second, this is less science/engineering, but someone said that the specific pins are made by different companies and the pins themselves can vary in terms of impedance, thus changing the overall circuitry of the device in the long run
*third, a Li-ion battery can pull more current than the default charger provides, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
Click to expand...
Click to collapse
Wish I knew for sure too. Really, I don't care a lot about my HTC Sensation as I plan on getting a Nexus 4 LTE when it eventually comes out. Hopefully those come with 2 A chargers!
Sure I could get a Nexus 4 and use LTE right now on Bell, but I'd rather wait for an official one.
nutnub said:
I heard though that:
*first, it creates unnecessary heat, because the current drawn by the circuitry of a lower-amperage device has to be dissipated as heat
*second, this is less science/engineering, but someone said that the specific pins are made by different companies and the pins themselves can vary in terms of impedance, thus changing the overall circuitry of the device in the long run
*third, a Li-ion battery can pull more current than the default charger provides, and it tends to do so to charge faster, albeit at the cost of overall battery life, because higher charging rates also lead to faster breakdown of the cells?
I wish I had sources, but this is what I pulled off the Internet when I was younger... can you please assist and advise? I would greatly appreciate it (even if we start a new thread from this).
Click to expand...
Click to collapse
Everybody seems to misunderstand LiPo charging, as it is different from previous battery technologies.
For general LiPo Information, you should look here. Charging information is about halfway down the page
http://www.rchelicopterfun.com/rc-lipo-batteries.html
I'll quote the important part:
Selecting the correct charge current is also critical when charging RC LiPo battery packs. The golden rule here used to be "never charge a LiPo or LiIon pack greater than 1 times its capacity (1C)."
For example, a 2000 mAh pack would be charged at a maximum charge current of 2000 mA or 2.0 amps. Never higher or the life of the pack would be greatly reduced. If you choose a charge rate significantly higher than the 1C value, the battery will heat up and could swell, vent, or catch fire.
Times are a changing...
Most LiPo experts now feel however you can safely charge at a 2C or even 3C rate on quality packs that have a discharge rating of at least 20C or more safely and low internal resistances, with little effect on the overall life expectancy of the pack as long as you have a good charger with a good balancing system. There are more and more LiPo packs showing up stating 2C and 3C charge rates, with even a couple manufactures indicating 5C rates. The day of the 10 minute charge is here (assuming you have a high power charger and power source capable of delivering that many watts and amps).
Click to expand...
Click to collapse
Pretty much all phones are right around 2000 mAh capacity nowadays, so even going by the "old" golden charging rule a 2 A charger would be safe to use. My Galaxy Nexus came with (I think) a 1 A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2 A charger for both devices and never once had an issue. It has been 8 months now of using the 2 A charger on my phone. Idle life can still reach a little over 3 days on a single charge and I still get one of the best screen-on times of most people I know around the forums. So yes, from personal experience a 2 A tablet charger is completely fine to use on a phone.
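The C-rate arithmetic from the quoted article, applied to a few typical phone capacities (assumed figures, not specs of any particular device):

```python
# C-rate arithmetic from the quoted article: charging at "1C" means a
# current numerically equal to the capacity. Example capacities below are
# just typical phone figures, not specs for any particular device.

def max_charge_current_a(capacity_mah: float, c_rate: float) -> float:
    return capacity_mah / 1000.0 * c_rate

for capacity in (1750, 2000, 2100):
    for c in (1, 2):
        print(f"{capacity} mAh pack at {c}C -> up to "
              f"{max_charge_current_a(capacity, c):.1f} A")

# So a 2 A wall charger sits at or below 1C for essentially every modern
# phone battery, which is the point being made in the post above.
```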
Charging circuitry is built into the device, not the "charger"
Nothing to worry about
EniGmA1987 said:
I'll quote the important part:
Pretty much all phones are right around 2000 mAh capacity nowadays, so even going by the "old" golden charging rule a 2 A charger would be safe to use. My Galaxy Nexus came with (I think) a 1 A charger, but ever since I got my tablet shortly thereafter I have just used the tablet's 2 A charger for both devices and never once had an issue. It has been 8 months now of using the 2 A charger on my phone. Idle life can still reach a little over 3 days on a single charge and I still get one of the best screen-on times of most people I know around the forums. So yes, from personal experience a 2 A tablet charger is completely fine to use on a phone.
Click to expand...
Click to collapse
Is it safe to assume that all chargers come default at 1C charging for their device? Because if that's the case, I figure most electronics we own can just be replaced with 10w chargers (which would make life much more convenient).
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Thanks for teaching me lots!
-newb, happily reading away
I bought one of those 2amp double chargers from a seller on Amazon. It wasn't really cheap either (in cost anyway- I spent a bit more hoping it would be higher quality). After plugging in my MotoRAZR and the wife's lumia the charger popped and some plastic from the housing of the charger flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung etc). I'd happily mix and match them to the phones but I'd be wary of buying a no name Chinese jobby from Ebay or Amazon marketplace.
Sent from my XT910 using xda premium
nutnub said:
Is it safe to assume that all chargers come default at 1C charging for their device? Because if that's the case, I figure most electronics we own can just be replaced with 10w chargers (which would make life much more convenient).
Click to expand...
Click to collapse
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging, as if you charge fast they would get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and exposing nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery without letting it get too hot, and also the way it is filled (as with the fizzy drink, pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it over-spilling).
This is slightly related/unrelated, but how do you know whether a charger is "high quality" or will only provide "constant current / constant voltage"? It seems strange to me that these days, you can't find the circuitry of many devices we own publicly available so you can't check if the design is good (let alone how they chose the components in their design?). Do you (and other veterans) have any thoughts on this?
Click to expand...
Click to collapse
Unfortunately, the industry is full of products made to a budget, usually by using cheaper components/designs (the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, and feedback in reviews/forums.
Basically, it boils down to 'consumer testing'.
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A
Hopefully, USB3.0 will quickly become a new standard for portable devices.
more questions!
First of all, let me please thank you for responding and being so thorough with your answers! There is so much information out there, and in my 22 years of existence, I cannot for the life of me sort through the sheer amount of data. I do greatly enjoy reading every little thing that is posted, especially in this thread because I think it's super important to understand the electronics that we interact with.
sonicfishcake said:
I bought one of those 2amp double chargers from a seller on Amazon. It wasn't really cheap either (in cost anyway- I spent a bit more hoping it would be higher quality). After plugging in my MotoRAZR and the wife's lumia the charger popped and some plastic from the housing of the charger flew across the room! Thankfully both phones were fine.
I wondered whether both phones tried to pull more than the charger could handle and the charger had poor quality circuitry.
Since then, I've only ever bought branded official replacement chargers (Motorola, Samsung etc). I'd happily mix and match them to the phones but I'd be wary of buying a no name Chinese jobby from Ebay or Amazon marketplace.
Sent from my XT910 using xda premium
Click to expand...
Click to collapse
My concern with this is that if Motorola or Samsung does put out a product less than optimal, would we all know? Another way of asking this is how do we know that Apple/Motorola/Samsung/Lenovo does produce superior products and it's not merely a matter of advertisement or brand image? Do you think there is a way to know, as a consumer, that even third party products are becoming more competitive, given that smaller companies have much harder time advertising and building a name/brand for themselves? (if you can't tell, I am rooting for the little guys because I may one day work for the little guys)
skally said:
Most batteries can discharge a lot faster than they can recharge, but with LiPo, the difference is getting smaller.
Batteries used to need trickle charging, as if you charge fast they would get hot, which causes the chemicals inside to expand (think of a fizzy drink: pour it fast and it will overflow), causing the battery to burst and exposing nasty chemicals.
New technology means the charger can accurately monitor how fast we fill the battery without letting it get too hot, and also the way it is filled (as with the fizzy drink, pour down the side of the glass rather than straight to the bottom and you will fill the glass faster, with less chance of it over-spilling).
Click to expand...
Click to collapse
Thank you for clarifying for us. Would you happen to know if there are specifics to the recharge specs, short of finding me published papers on the technology? What you said is definitely what I've been reading on the Internet and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
skally said:
...
Unfortunately, the industry is full of products made to a budget, usually by using cheaper components/designs (the charger for the ASUS TF101 was renowned for failing), so there is no foolproof way of determining 'quality' apart from word of mouth, looking at quantities sold, and feedback in reviews/forums.
Basically, it boils down to 'consumer testing'.
Here's a bit more related information found buried deep in documents here: http://www.usb.org/developers/devclass_docs
The USB2.0 specifications for current output say the maximum current is limited to 1.8A, while USB3.0 has a maximum current limit of 5A
Hopefully, USB3.0 will quickly become a new standard for portable devices.
Click to expand...
Click to collapse
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (although stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
nutnub said:
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (although stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
but then again, I may be paranoid. Just trying to line up my experience with theory!
Thank you all for so much support and enthusiasm. Any chance we'll see this on a top thread somewhere?
Click to expand...
Click to collapse
If the Nexus kernel says the limit is 2 A then that's it. It can't use more power.
Have you seen the internal USB 3.0 cable?
It's at least twice as thick as a USB 2.0 cable. I got a new chassis for my computer last week, with a couple of USB 2.0 ports and a USB 3.0 front port.
And if your motherboard is built for USB 3.0, I'm pretty sure it can take the current. Otherwise there would be no point in adding 3.0 support.
Sent from my Nexus 10 using xda app-developers app
If something is listed as a USB 3 port, it must be up to USB 3 certification; otherwise the manufacturer of the device is liable for a huge lawsuit if issues arise. If something says USB 3 that doesn't mean it IS drawing 25 W though, just that the port is capable of having 25 W pulled through it over the USB connector. Same with USB 2 and its 9 W limit in the spec. Also, plugging a tablet such as this into a computer's USB 3 port does not mean it will charge faster or get faster data transfers, since the cable being used and the device are still of the older specification.
nutnub said:
Thank you for clarifying for us. Would you happen to know if there are specifics to the recharge specs, short of finding me published papers on the technology? What you said is definitely what I've been reading on the Internet and I do trust you; it would just help me have greater peace of mind with my nice and shiny devices...
Click to expand...
Click to collapse
Have a look here for info on the recharging process for Lithium based cells.
https://sites.google.com/site/tjinguytech/charging-how-tos/the-charging-process
It is worth noting the level of precautions taken while charging the cells aggressively. You really don't need a bucket of sand on standby when you plug your phone into its charger.
nutnub said:
A quick question: just because USB 3.0 should allow up to 25 W, that doesn't mean it's the standard for devices, does it? As in, the Nexus 10 probably can only draw 10 W, and my computer (although stated to be USB 3.0) may not have the circuitry behind it to allow such a draw? I'm a little iffy on the whole implementation of USB standards, because if USB 2.0 allows a draw of up to 9 W, I haven't seen this from my laptop or any devices claiming to have USB 2.0 ports...
Click to expand...
Click to collapse
There are actually 2 different current limits for each USB specification: USB2.0 has 0.5A and 1.8A, while USB3.0 has 1.5A and 5.0A
The lower of the current limits is what I would expect to get from a USB port on a computer, while the higher one I would expect to get from a dedicated charger.
I believe the higher current specification was added purely for charging mobile devices, as it is only achieved by adding a resistance across D+ and D-, removing the data transmission capabilities of the port. I don't know if that's practical, or possible with a computer USB port.
I do remember seeing motherboards with ports specifically designed for fast charging, but I haven't got any info on them as yet.
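For reference, the power implied by the current limits quoted above, assuming a 5 V bus (these are the thread's figures; the official specs have changed since):

```python
# Power implied by the current limits mentioned in the post above,
# assuming a 5 V bus. The amp values are the ones quoted in this thread,
# and the port/charger split follows the poster's own interpretation.

BUS_VOLTAGE_V = 5.0
limits_a = {
    "USB 2.0 standard port":       0.5,
    "USB 2.0 dedicated charging":  1.8,
    "USB 3.0 standard port":       1.5,
    "USB 3.0 dedicated charging":  5.0,
}

for name, amps in limits_a.items():
    print(f"{name:<28} {amps:>4.1f} A -> {BUS_VOLTAGE_V * amps:.1f} W")
```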
There are also kernels which enable "fast charging" on a PC. Basically it removes the data connection in software and treats any USB connection as if it were plugged into AC. You can charge just as fast on a computer as you can on a wall charger when this feature is enabled in the kernel.
I am using the N10 charger for my Note 2 and it charges bloody fast using this charger. Charging is noticeably faster on the Note 2 than with the stock 1 A charger that came with the Note.
The battery is not getting warm and battery temps are similar to those on the 1 A charger. Basically it's cutting the charging time almost in half.
Agreed. Note 2 charger is awesome. Bought a powergen 3.1 amp car charger for the note 2 also after watching videos and reading up on proper car chargers for the phone. Guess I can use it for my nexus 10 too.
Sent from my Nexus 10 using xda premium
I own RC cars with LiPo batteries, and the rule of thumb is total mAh divided by 1000 = the max amp charger you can use. So a 2100 mAh battery can be charged with a 2.1 A charger.
On that note, I charge my Samsung S3, which has a 2100 mAh battery, with a 2.1 A car charger without any issue.
Sent from my SGH-T999 using Tapatalk 2
I used the N10's charger to charge my iPod Nano 3rd gen, no problem
Maybe this is old news but today I learned that the YotaPhone 2 charger shipped with the phone is actually Qualcomm Quickcharge 2.0 compatible. This means you can also charge other Qualcomm Quickcharge 2.0 compatible phones with it, like my other phone the Moto G4+. Works perfectly.
Yes, I've been using my QuickCharge 3.0 charger and it's charging at 9 V, ~1.3 A.
kbal said:
Yes, I've been using my QuickCharge 3.0 charger and it's charging at 9 V, ~1.3 A.
Click to expand...
Click to collapse
Don't; fast charging will greatly reduce your battery life.
Enviado desde mi SM-N930F mediante Tapatalk
kingtiamath said:
Don't; fast charging will greatly reduce your battery life.
Enviado desde mi SM-N930F mediante Tapatalk
Click to expand...
Click to collapse
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
VirtuaLeech said:
Although it is true that it reduces your battery life, it is only by a small margin, not greatly.
Click to expand...
Click to collapse
I'm afraid it does. I have done many experiments myself, and batteries regularly fast-charged often give you no more than 70% of their original capacity after as little as 6 months.
Enviado desde mi SM-N930F mediante Tapatalk
..the same goes for wireless charging btw.
Amplificator said:
..the same goes for wireless charging btw.
Click to expand...
Click to collapse
Are you joking? What is your answer based on?
Wireless charging runs on a much lower amperage so it should be the best solution to charge your phone.
nonyhaha said:
Are you joking? What is your answer based on?
Wireless charging runs on a much lower amperage so it should be the best solution to charge your phone.
Click to expand...
Click to collapse
My answer is based on simple physics.
Just because the amps are lower doesn't mean it's not bad for the battery.
Wireless charging is way less efficient than any form of wired charging.
What happens to the loss? Well, it gets dissipated as heat - and what is the "big killer" of lithium batteries? ..heat.
For this single fact alone, denying that wireless charging causes more harm than a cabled charging is simply.. well, silly.
The only ones denying this are either unaware of simple science or are lying to you, probably to sell you a charger
Yes, every form of charging, even at a theoretical 100% efficiency, will heat up the battery due to chemical reactions inside the battery, but the lower the efficiency, the more energy is converted into heat - thus you do more damage and get even less actual battery energy out of it.
Simply put: the best charging method is the one that produces the least amount of heat while maintaining a high efficiency - wireless charging is simply not that.
Charging using a cable at 90% efficiency means 10% is being converted into heat (not all 10%, but for argument's sake, play along), whereas wireless charging might be at... 50% depending on circumstances (probably a lot closer to 70% than 50%, but again, for argument's sake).
This means that the other 50% is just turned into wasted, unnecessary and unwanted heat.
The percentages obviously aren't correct in this example, but it's more to get the point across.
With wireless charging you do more damage (it is subjective as to whether this matters to you) to the battery than you would by using a cable, simply because you create more excessive heat which only purpose is to heat up the battery and surrounding area than actually going into the battery itself.
If we consider the 50% efficiency of the aforementioned example, this means that you would need to charge your device for almost twice as long as when you use a cable. Not only does it create more heat by virtue of being inductive charging, but it will be doing so for, again, almost twice as long.
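Putting the poster's own for-the-sake-of-argument efficiencies into numbers (battery size is an assumed example):

```python
# Putting numbers on the argument above, using the same illustrative
# efficiencies (90% wired vs 50% wireless) given for argument's sake.

battery_wh = 3.7 * 2.1          # ~2100 mAh cell at 3.7 V, an assumed example
efficiencies = {"wired": 0.90, "wireless": 0.50}

for method, eff in efficiencies.items():
    energy_drawn_wh = battery_wh / eff
    wasted_wh = energy_drawn_wh - battery_wh
    print(f"{method:<9} eff {eff:.0%}: draw {energy_drawn_wh:.1f} Wh "
          f"to store {battery_wh:.1f} Wh, ~{wasted_wh:.1f} Wh lost as heat")
```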
Efficiency also depends on things like distance - the less "perfect" your phone is placed on the charger the less efficient, and thus more wasteful it is.
Google something like "qi wireless charging overheating" and you will see plenty of people reporting on overheating problems when using wireless charging. This is because of all this wasted energy that is dissipated as heat - instead of "filling" the battery it simply heats it, and the surroundings, up.
Despite being made to the same specs, this seem to differ from charger to charger, such as this thread here on XDA would indicate: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
If you look at the version specifications you see that version 1.2 of the "low power" Qi charging branch which phones are a part of increased the power to up to 15W.
Unless they also worked on the efficiency this would actually mean that version 1.2 does more damage to the battery than 1.0 and 1.1, but for that you would have to dive a bit deeper than the information given in that link.
But as always it's sort of subjective as to what point people will see wireless charging as being too wasteful and/or damaging.
Personally, I don't care, because the convenience of wireless charging by far outweighs the little damage it does to a battery, in my opinion, and the same goes for QuickCharge as well. By the time I would see a noticeable effect on battery life I have probably already bought a new phone anyway.
If we take Qualcomm's QuickCharge for example, I think QC 3.0 is at the point where people shouldn't really care about the negative impact. If you read the spec sheet for QC 3.0 it's basically a tweaked version of QC 2.0 (well, duh) where the power delivery is controlled much better than in QC 2.0, bringing both the efficiency and therefore the speed to a much higher level, even though both are rated for 18 W.
Some reading for those who still doubt basic physics :
http://batteryuniversity.com/learn/article/charging_without_wires
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures
http://batteryuniversity.com/learn/article/all_about_chargers
http://batteryuniversity.com/learn/article/ultra_fast_chargers
..and the best of all: https://google.com/
But let me ask you the same question you asked me; and I quote:
nonyhaha said:
Are you joking? What is your answer based on?
Click to expand...
Click to collapse
..that probably sounded very condescending (which is not how it was intended, of course), but I'm curious as to where you've acquired this absurd idea that Qi wireless charging is the best method of all? It's very likely the worst of all, actually.
There is almost no heat dissipated for QC3.0
For me quick charging is a big help and saves hours, especially if you have a large QC battery or power bank. The Yotaphone battery charges especially quickly with a QC charger.
"..Wireless charging is way less efficient than any form of wired charging..."
Yes, because you first convert 110 V or 240 V AC to a lower voltage, e.g. 5 V DC, with an efficiency of maybe 85%.
Then this 5 V DC is chopped into a long-wave AC voltage (about 19 V / 110 to 205 kHz) and sent to a copper coil in the Qi transmitter.
There the energy passes via resonant inductive coupling (a magnetic field) through an air gap to the Qi receiver, again with an efficiency of perhaps 70%.
The magnetic induction in the receiver coil again delivers a long-wave AC voltage, which is converted into an adequate DC voltage (again with an efficiency of about 70%).
So frankly speaking, you may be telling a bit of truth regarding losses converted to heat, but this heat occurs everywhere except in the Li-Po battery itself. It only does so in the last step: the conversion of electrical energy into a chemical process inside the Li-Po.
Take a look at the label on your Qi charger and you will notice something like the following: Input 5 V/2 A, Output 5 V/1 A (a loss of 50%).
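Multiplying the stage efficiencies listed above gives roughly the same ballpark as that label:

```python
# Multiplying the stage efficiencies listed in the post above gives an
# end-to-end figure in the same ballpark as the ~50% implied by the
# charger label (5 V/2 A in, 5 V/1 A out).

stage_efficiency = {
    "AC adapter (mains -> 5 V DC)":       0.85,
    "coil-to-coil inductive link":        0.70,
    "receiver rectification/regulation":  0.70,
}

overall = 1.0
for stage, eff in stage_efficiency.items():
    overall *= eff
    print(f"{stage:<36} {eff:.0%}")

print(f"End-to-end efficiency: about {overall:.0%}")
```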
Almost all lithium batteries have their own charging controller on board, which takes care of the correct charging parameters. These controllers are designed to charge (and quick charge) Li-Po batteries in the right manner.
Enough theory.
Just follow the electrical path:
in the case of a direct charger: USB connector -> copper wire -> smartphone -> copper wire -> LiPo
in the case of a Qi charger: USB connector -> copper coil -> air gap -> copper coil -> copper wire -> LiPo
so there's no fundamental difference in how the LiPo is connected to the power - in both cases it's by a copper wire
in both cases you can charge with, let's say, 5 V / 1 A (of course the LiPo will be charged with its own characteristic voltage and current)
modern LiPos are built for a life of 700 to 1000 charging cycles (about two years), and nobody knows whether a LiPo would live longer if charged more slowly.
You can charge your smartphone in a fridge to prevent high temperatures.
USB devices are smart; they negotiate the charge load between themselves via a protocol. There is no danger in taking a smartphone that can be charged at 1.4 A and connecting it to a 2.1 A charger.
You should pay more attention to the USB cable - it has to be able to carry the required current to the device.
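As a rough illustration of why the cable matters (the per-cable resistances below are assumed, typical-ish values, not measurements of any specific cable):

```python
# Voltage drop across a USB cable: V_drop = I * R, where R is the round-trip
# resistance of both conductors plus the connectors. Resistance values are
# assumed for illustration - measure your own cable for real numbers.
def voltage_drop(current_a: float, resistance_ohm: float) -> float:
    return current_a * resistance_ohm

CABLES = [("thin charging cable, ~0.5 ohm round trip", 0.5),
          ("thicker cable, ~0.15 ohm round trip", 0.15)]

for label, r_ohm in CABLES:
    drop = voltage_drop(2.1, r_ohm)
    print(f"{label}: {drop:.2f} V lost -> ~{5.0 - drop:.2f} V left at the phone")

# With roughly a volt lost in a poor cable, the phone may throttle the current
# it draws and charge slowly, even though the charger itself is perfectly fine.
```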
Yes, it's less efficient and worse for the battery, maybe takes a few minutes more to charge and costs a little more per charge. But it's much more pleasing not to use cables, and very impressive too. I love wireless charging.
Amplificator said:
My answer is based on simple physics.
Just because the amps are lower doesn't mean it's not bad for the battery.
Wireless charging is way less efficient than any form of wired charging.
What happens to the loss? Well, it gets dissipated as heat - and what is the "big killer" of lithium batteries? ..heat.
For this single fact alone, denying that wireless charging causes more harm than cabled charging is simply... well, silly.
The only ones denying this are either unaware of simple science or are lying to you, probably to sell you a charger.
Yes, every form of charging, even at a theoretical 100% efficiency, will heat up the battery due to the chemical reactions inside it, but the lower your efficiency, the more energy is converted into heat - thus you do more damage and get even less actual battery energy out of it.
Simply put: the best charging method is the one that produces the least amount of heat while maintaining a high efficiency - wireless charging is simply not that.
Charging using a cable at 90% efficiency means 10% is being converted into heat (not all of that 10%, but for argument's sake, play along), whereas wireless charging might be at... 50%, depending on the circumstances (probably a lot closer to 70% than 50%, but again, for argument's sake).
This means that the other 50% is just turned into wasted, unnecessary and unwanted heat.
The percentages obviously aren't correct in this example, but it's more to get the point across.
With wireless charging you do more damage to the battery (whether this matters to you is subjective) than you would by using a cable, simply because you create more excess heat, whose only purpose is to warm up the battery and the surrounding area rather than actually going into the battery itself.
If we consider the 50% efficiency from the aforementioned example, this means you would need to charge your device for almost twice as long as when using a cable. Not only does it create more heat by virtue of being inductive charging, it will also be doing so for, again, almost twice as long.
Efficiency also depends on things like distance - the less "perfect" your phone is placed on the charger the less efficient, and thus more wasteful it is.
Google something like "qi wireless charging overheating" and you will see plenty of people reporting overheating problems when using wireless charging. This is because of all that wasted energy being dissipated as heat - instead of "filling" the battery, it simply heats it, and its surroundings, up.
Despite being made to the same specs, this seems to differ from charger to charger, as this thread here on XDA indicates: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
If you look at the version specifications, you'll see that version 1.2 of the "low power" Qi charging branch, which phones are part of, increased the power to up to 15 W.
Unless they also improved the efficiency, this would actually mean that version 1.2 does more damage to the battery than 1.0 and 1.1, but to confirm that you would have to dive a bit deeper than the information given in that link.
But as always, it's somewhat subjective at what point people will consider wireless charging too wasteful and/or damaging.
Personally, I don't care, because in my opinion the convenience of wireless charging far outweighs the little damage it does to a battery, and the same goes for Quick Charge. By the time I would see a noticeable effect on battery life, I will probably have bought a new phone anyway.
If we take Qualcomm's Quick Charge for example, I think QC 3.0 is at the point where people shouldn't really care about the negative impact. If you read the spec sheet for QC 3.0, it's basically a tweaked version of QC 2.0 (well, duh) where the power delivery is controlled much better than in QC 2.0, bringing both the efficiency and therefore the speed to a much higher level, even though both are rated for 18 W.
Some reading for those who still doubt basic physics:
http://batteryuniversity.com/learn/article/charging_without_wires
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures
http://batteryuniversity.com/learn/article/all_about_chargers
http://batteryuniversity.com/learn/article/ultra_fast_chargers
..and the best of all: https://google.com/
But let me ask you the same question you asked me; and I quote:
..that probably sounded very condescending (which is not how it was intended, of course), but I'm curious as to where you've acquired this absurd idea that Qi wireless charging is the best method of all? It's very likely the worst of all, actually.
Click to expand...
Click to collapse
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter part, so there is no heating in the receiver coil and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said before me, you should get your physics knowledge up to date. I am already a graduate with a physics degree.
nonyhaha said:
So you really think you know what you are saying there...
Heat dissipation will happen ONLY on the emitter part, so there is no heating in the receiver coil and no heating in the phone. I think you have to get your facts straight.
Because wireless charging coils run at such low amperage, this will never become an overheating problem.
As you said before me, you should get your physics knowledge up to date. I am already a graduate with a physics degree.
Click to expand...
Click to collapse
Yes, I do think I know what I'm talking about - but luckily you came to the rescue and used your alleged physics degree to write a reply that proved me wrong with all of your facts, right?
Oh, no.. you didn't - you just doubled down instead, well done.
It doesn't matter (and is not important in this case) where the heat dissipation happens (and never did I claim it happened at the receiver - only that it happens) - the battery is still being heated up regardless, due to the energy loss.
If someone with an alleged physics degree keeps denying that the battery is heated up as described in my previous post, then I doubt that you finished at the top of your class, if at all, sorry. I would really like to see all the evidence you have against what I wrote in my previous post (and that tons of people are posting about on the interwebz).
Just give it a go on Google, such as this thread from XDA: http://forum.xda-developers.com/google-nexus-5/help/post-qi-charging-battery-temp-t2544768
You can even do a simple charging test of your own: just compare battery temperatures while using a Qi wireless charger, a QC 2.0 charger and a regular 1 A charger.
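If anyone actually wants to run that comparison, something along these lines will log the battery temperature while charging (it assumes adb is installed and USB debugging is enabled; for the Qi test you'd connect adb over Wi-Fi since the USB port is free anyway, and most devices report the temperature in tenths of a degree Celsius in dumpsys):

```python
# Minimal battery temperature logger for comparing Qi, QC 2.0 and plain 1 A
# charging. Assumes adb is on the PATH and the device is reachable (use
# "adb tcpip 5555" + "adb connect <ip>:5555" when the USB port is occupied
# or when testing a wireless pad). Most devices report the temperature in
# tenths of a degree Celsius in "dumpsys battery".
import re
import subprocess
import time

def battery_temp_c() -> float:
    out = subprocess.run(["adb", "shell", "dumpsys", "battery"],
                         capture_output=True, text=True, check=True).stdout
    match = re.search(r"temperature:\s*(-?\d+)", out)
    assert match, "no temperature line in dumpsys battery output"
    return int(match.group(1)) / 10.0

while True:
    print(f"{time.strftime('%H:%M:%S')}  {battery_temp_c():.1f} °C")
    time.sleep(30)  # one sample every 30 seconds is plenty for this test
```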
Is everyone posting about high temperatures while using Qi chargers lying? Why would they do that? ..maybe the wired-charging mafia are paying people to discredit the WPC and other groups.. hm, maybe.
Yeah, it's getting a bit ridiculous, but silly claims require silly responses, sorry.
If you can actually prove what I was saying in my previous post is wrong then I'll gladly accept it, but I do not take "na-ah, not true" with any degree of seriousness and neither do I give credit to claims of physics degrees. In that case I'm an ESA astronaut currently in space - see where this is going?
I go by what you actually write, not what you claim. The only reason for boasting about alleged degrees is to divert attention from the lack of any credible proof - disprove what my previous post said and I'll gladly accept it.
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201)
Am I missing something?
Any ideas/replies would be greatly appreciated!
The Yotaphone 2 charger should indicate active quick charging by lighting up "Yotaphone" with white LEDs on the charger. If it's only charging at 5 V, the charger doesn't light up.
I don't think there's anything wrong with your phone, just that the charging estimate is inaccurate (at the moment).
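If you want a second opinion beyond the LED, you can also peek at the actual charge current over adb (the sysfs node and its unit vary between devices and kernels, so treat the path and the microamp assumption below as things to verify on your own phone):

```python
# Read the instantaneous charge current reported by the kernel. The node name,
# its unit (micro- vs milliamps) and the sign convention all differ between
# devices, so this is a sketch to adapt, not a universal recipe.
import subprocess

NODE = "/sys/class/power_supply/battery/current_now"  # common, but not guaranteed

raw = subprocess.run(["adb", "shell", "cat", NODE],
                     capture_output=True, text=True, check=True).stdout.strip()
print(f"reported charge current: {abs(int(raw)) / 1000:.0f} mA (assuming microamps)")
# Compare the reading on a plain 5 V charger against the quick charger -
# an active quick charge should sit noticeably higher.
```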
Well, my charger's Yotaphone logo is lighting up, so I guess it's working then. Thanks for the reply!
I've had to put the two-pin Yotaphone charger block into a three-pin UK adaptor to try it. Annoyingly and worryingly, it buzzes a lot and quite loudly - is that the same for everyone?
zippyioa said:
Ok, so my Yotaphone 2 charger has quick charge ability, as does my Samsung Galaxy Note 4 charger and car charger.
Despite all of these chargers having fast charging ability and my Samsung Note 4 fast charging perfectly with all of them, none of them appear to fast charge my Yotaphone 2......
It's at 63% charged right now and whether I plug it into a non QC charger or any of my quick chargers, it's saying 55 minutes until fully charged.
I've looked through the settings pages and can't find a way to enable quick charge on my Yotaphone like I could on my Note 4 battery page.
I'm running a Gearbest supplied YD206 which I flashed to the RU 134 ROM (so it's now showing as a YD201)
Am I missing something?
Any ideas/replies would be greatly appreciated!
Click to expand...
Click to collapse
Perhaps something is wrong with the cable, not the charger. Something like that happened to me some time ago - when I used a different cable, QC worked again.
I had already tried three different chargers and two different cables
If the earlier post about the Yotaphone charger lighting up is correct, I think the phone is quick charging ok.
I guess I was expecting something similar to my Note 4, where it actually stated "fast charging" in the battery menu if I was charging it with a QC charger.
That message would then change to "charging" if I used a standard charger instead.
I bought a new power bank; it seems to charge other phones OK but NOT the Yotaphone. The power bank displays the percentage charge for about 10 seconds, then the display goes off - but so does the Yotaphone's charging. Other phones and gadgets don't cut off. Anyone else have this?
The power bank is QC 3.0. I have tried using different cables, always the same.
Sometimes it charges OK. I thought my power bank was fake until I found it charged other gadgets well.
I also noticed that the YotaPhone 2 sometimes doesn't want to charge. I just plug it in (original cable & charger), the YotaPhone logo lights up, but the phone just doesn't charge! I will try with my power bank and see what happens.
I haven't figured out the cause yet; maybe it's because mine has an unlocked bootloader, TWRP, root and Xposed (YD206).