CPU or GPU when it comes to emulating? - Galaxy Note 3 Q&A, Help & Troubleshooting

I am wondering which one is more important when you play games through emulators on your phone. I may be wrong, but if you asked me I would answer with some confidence that the CPU is far more important than the GPU. Can anyone here answer this question precisely?
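One way to settle it for a specific emulator is to measure instead of guessing: if the CPU is pegged while the game runs, the CPU is the bottleneck; if it isn't, the GPU (or the emulator itself) is. A minimal sketch, assuming Python is available on the device (e.g. via Termux); /proc/stat is the standard Linux interface, the rest is just arithmetic:

Code:
# Sample overall CPU utilisation while an emulator is running.
import time

def cpu_times():
    # First line of /proc/stat is the aggregate "cpu" line:
    # user nice system idle iowait irq softirq ...
    with open("/proc/stat") as f:
        values = list(map(int, f.readline().split()[1:]))
    idle = values[3] + values[4]          # idle + iowait
    return idle, sum(values)

idle1, total1 = cpu_times()
time.sleep(10)                            # play the game during this window
idle2, total2 = cpu_times()

busy = 1.0 - (idle2 - idle1) / float(total2 - total1)
print("CPU utilisation over the sample: {:.0%}".format(busy))
# Close to 100% (or one fully loaded core on a single-threaded emulator,
# i.e. ~25% overall on a quad-core) points to a CPU bottleneck.

Keep in mind most of the classic emulators are single-threaded, so one maxed-out core already means a CPU limit even if the overall figure looks low.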

Related

[Q] GPU overclocking?

Is there any work being done on overclocking the GPU? Since this is a gaming-oriented phone it seems the obvious thing to try, but I've had a look and been unable to find anything about it. GPU overclocks on mobile devices are rather large from what I've seen so far.
Or has any work on this been completed for phones with the same chipset that we could build on in the future?
cheers
Mael5trom said:
Is there any work being done on overclocking the GPU? Since this is a gaming-oriented phone it seems the obvious thing to try, but I've had a look and been unable to find anything about it. GPU overclocks on mobile devices are rather large from what I've seen so far.
Or has any work on this been completed for phones with the same chipset that we could build on in the future?
cheers
The Adreno is part of the Snapdragon chipset (alongside the Scorpion CPU core); it's not a dedicated GPU and thus cannot be overclocked independently. Overclocking the CPU will obviously help somewhat with games, but only on the CPU side, not graphics.
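For what it's worth, the two clocks are exposed as separate sysfs nodes, so you can see them for yourself. A rough sketch in Python (run as root, e.g. from Termux or an adb shell); the cpufreq path is standard Linux, but the kgsl paths are assumptions that differ between kernels, so treat them as examples rather than gospel:

Code:
# Read the current CPU and Adreno (kgsl) clocks, if the nodes exist.
CPU_CLK = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"     # kHz
GPU_CLK_CANDIDATES = [
    "/sys/class/kgsl/kgsl-3d0/gpuclk",                                # common on newer kgsl drivers
    "/sys/devices/platform/kgsl-3d0.0/kgsl/kgsl-3d0/gpuclk",          # seen on older kernels
]

def read_node(path):
    try:
        with open(path) as f:
            return f.read().strip()
    except OSError:
        return None

print("CPU0 clock: %s kHz" % read_node(CPU_CLK))
for path in GPU_CLK_CANDIDATES:
    value = read_node(path)
    if value is not None:
        print("GPU clock: %s Hz (%s)" % (value, path))
        break
else:
    print("No kgsl gpuclk node found on this kernel")

Whether tweaking cpufreq moves the GPU number at all is exactly what this check will tell you.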
Ah, an unfortunate limitation of this chipset. Yep, CPU overclocking looks like the way to go. Thanks for the reply.

[Q] Overclock Adreno 205 (the GPU)

Hi, I was wondering if it's possible to overclock the GPU in our phone.
Has anyone heard about this being possible, or about a project being worked on for it?
As I understand it, since the MSM8255 is a "system on a chip" design, when the CPU is overclocked, the GPU is as well.
TeeJay3800 said:
As I understand it, since the MSM8255 is a "system on a chip" design, when the CPU is overclocked, the GPU is as well.
Would there be any way to simply OC the GPU and not the CPU?
Most probably not.
I once did the analysis for the Adreno 200 on the QSD8250; you can look it up in the Nexus One Android Development forum. I explained in depth why it isn't possible, in as much detail as I could without disclosing the actual clock diagram. I believe the same applies to the MSM8255, though I didn't check to make sure.
Actually, in the Desire HD forums, shaky153 said he was working on a kernel to overclock the GPU as of December 2011. He was able to overclock the GPU to 245 MHz, as stated at the beginning of the thread, but it wouldn't stick and would revert back to 192 MHz. As of now he hasn't updated his progress, the project is most likely abandoned, and his account is a guest account. If he is able to overclock the GPU, it should be easily ported to the myTouch 4G, both being HTC phones with the same processor.
Judging, again, by the work I once did, the fact that he "thought" he overclocked the GPU doesn't mean a thing. Executing a function that says "set GPU clock to X" doesn't mean the GPU clock will actually be X. In fact, it might affect nothing at all.
To overclock part of an SoC, one needs to know the SoC's clock diagram. An SoC isn't a PC, where each function is governed by its own controllable PLL; it's different.
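To put some (entirely made-up) numbers on the divider point, since it keeps coming up: in this kind of arrangement the GPU clock is a fixed PLL output divided by an integer from a short hard-coded list, so a request only "works" if it lands on one of those steps. A toy model, not taken from any Qualcomm clock diagram:

Code:
# Toy model of a divider-fed clock: out = pll / div, div limited to a fixed set.
PLL_HZ = 768_000_000
ALLOWED_DIVIDERS = (8, 6, 4, 3)           # hard-coded in silicon in this toy example

def achievable_clocks():
    return sorted(PLL_HZ // d for d in ALLOWED_DIVIDERS)

def request_clock(target_hz):
    """Return what you actually get: the highest achievable step not above the target."""
    steps = [c for c in achievable_clocks() if c <= target_hz]
    return max(steps) if steps else min(achievable_clocks())

print([c // 1_000_000 for c in achievable_clocks()])    # [96, 128, 192, 256] MHz
print(request_clock(245_000_000) // 1_000_000)          # ask for 245, get 192 MHz

Which is roughly why a "245 MHz overclock" can quietly sit at 192 MHz: if there is no divider value in between, the request either snaps to an existing step or does nothing at all.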
I still think it's odd that the Adreno 200 and 220 GPUs can be overclocked but not the 205.
Sent from my Desire HD using Tapatalk 2
RoboWarriorSr said:
I still think it's odd that the Adreno 200 and 220 GPUs can be overclocked but not the 205.
Sent from my Desire HD using Tapatalk 2
The 220, being more advanced, most likely has more granularity in its clock divider; that follows from the frequency steps it can do.
The 200 can't be overclocked on its own, only together with the CPU.
The 205 is most likely in the same situation as the 200. Not necessarily, but most likely.
Again, some things about clocking should be understood before talking about "overclocking". The clocks don't come from the sky, and this is not a PC.
Actually, there is a thread on overclocking the Adreno 200 GPU on the HTC Desire from something like 96 MHz to at least 200 MHz, which gave an enormous improvement. They were trying to get FIFA 12 to work. LINK: http://forum.xda-developers.com/showthread.php?t=698940f. And just because I ask doesn't mean I don't know. I know how to overclock a PC GPU and I know the difference between a PC and a mobile device. And I definitely know that cores aren't everything; a Tegra 3 is never going to beat an Intel Core Duo just because it has more cores. And no poop, clocks don't come from the sky, that's common knowledge. THIS is XDA... http://forum.xda-developers.com/showthread.php?t=1264960&page=35 and this is the Desire HD thread where it all started.
Arguing with Jack_R1 is a terrible idea. Just telling you now.
estallings15 said:
Arguing with Jack_R1 is a terrible idea. Just telling you now.
I actually like watching flame wars. Especially on the internet
Sent from my Sense 4.0 Glacier using XDA Premium
RoboWarriorSr said:
Actually, there is a thread on overclocking the Adreno 200 GPU on the HTC Desire from something like 96 MHz to at least 200 MHz, which gave an enormous improvement. They were trying to get FIFA 12 to work. LINK: http://forum.xda-developers.com/showthread.php?t=698940f. And just because I ask doesn't mean I don't know. I know how to overclock a PC GPU and I know the difference between a PC and a mobile device. And I definitely know that cores aren't everything; a Tegra 3 is never going to beat an Intel Core Duo just because it has more cores. And no poop, clocks don't come from the sky, that's common knowledge. THIS is XDA... http://forum.xda-developers.com/showthread.php?t=1264960&page=35 and this is the Desire HD thread where it all started.
Look at the CPU frequency with the overclocked GPU.
Then think about why the hell they went that low on CPU frequency.
Then you're welcome to dig into history and read my thread in the N1 forums dealing with GPU overclocking on the QSD8250. If you have some brains, that will tell you why it hasn't advanced anywhere since 2010. Let me give you a hint: dividers aren't PLLs, and their capabilities are hard-coded and can't be changed to anything but their allowed values. But let me guess: writing that you know something doesn't mean you actually have a tiny bit of a clue what you're talking about, and my previous sentence remains a black hole to you.
Now, as I said, I didn't look at the clock diagram of the 8255, so I don't know whether the same limit applies here. If I have some free time and can lay my hands on it, I'll have a look. But as I wrote in the old N1 thread: the likely answer is the one you're not going to like.
Having said that, since I'm tired of trying to explain how stuff REALLY works, I won't return to this unless I find the clock diagram and it says something positive.
Thanks for that, Jack. Some people need a reality/ego check.
Jack_R1 said:
Look at the CPU frequency with the overclocked GPU.
Then think about why the hell they went that low on CPU frequency.
Then you're welcome to dig into history and read my thread in the N1 forums dealing with GPU overclocking on the QSD8250. If you have some brains, that will tell you why it hasn't advanced anywhere since 2010. Let me give you a hint: dividers aren't PLLs, and their capabilities are hard-coded and can't be changed to anything but their allowed values. But let me guess: writing that you know something doesn't mean you actually have a tiny bit of a clue what you're talking about, and my previous sentence remains a black hole to you.
Now, as I said, I didn't look at the clock diagram of the 8255, so I don't know whether the same limit applies here. If I have some free time and can lay my hands on it, I'll have a look. But as I wrote in the old N1 thread: the likely answer is the one you're not going to like.
Having said that, since I'm tired of trying to explain how stuff REALLY works, I won't return to this unless I find the clock diagram and it says something positive.
I didn't really understand what you just said, but I know Robo's ass just got kicked.
Sent from my HTC Glacier using XDA
I'm probably sticking with this metal brick of a phone until it dies on me
Jack_R1 said:
I'm probably sticking with this metal brick of a phone until it dies on me
Me too...I think. Lol
Sent from my Sense 4.0 Glacier using XDA Premium
WTF XOXO said:
@THEindian
How long have you owned the MT4G, boy? Check this thread out below:
forum.xda-developers.com/showthread.php?t=1468698
Why do people always talk about things they don't understand? Do you understand that the source for the AMD Z460 for GB 2.3.x may never be released? Only the ICS driver and kernel support is still WIP, I think. I haven't kept up to date yet, but I'll look into it now that I am back.
Sup @ invasion2, Jack_R1
Good 2 see you folks are still with us
Over a year; invasion2 and Jack know me.
Sent from my HTC Glacier using XDA
I wasn't trying to be rude or anything, let alone start a flame war; if you thought I was, I apologize. Nor was I trying to disprove anyone, knowing XDA. I just really wanted to get the ball rolling on this, especially with that anonymous user's claimed ability to semi-overclock. Is there any way to check whether the guy who posted that he was able to overclock the GPU is legit? Pulling up his account comes up with invalid or something.
As a side note, weren't some Xperia devices with the Adreno 205 GPU overclocked? If I heard/read correctly, is it possible to use a similar method and overclock the Desire HD GPU, or do the frameworks and whatnot get in the way so it wouldn't be possible? I would like to milk this device to its limit if possible.
UPDATE: Someone just overclocked the Adreno 205 GPU on the Desire HD. Here is the link: http://forum.xda-developers.com/showthread.php?t=1264960&page=36 Beta testing seems to be in the works, and for ICS.
RoboWarriorSr said:
UPDATE: Someone just overclocked the Adreno 205 GPU on the Desire HD. Here is the link: http://forum.xda-developers.com/showthread.php?t=1381426&page=2 Beta testing seems to be in the works, and for ICS.
That is a link to this thread.
Sent from my myTouch 4g using xda app-developers app
You mean this?
http://forum.xda-developers.com/showthread.php?t=1264960
Look at the dates, and look at the end of the thread. "Just overclocked" half a year ago, and no progress since then? That's because the settings most probably never kicked in to begin with; otherwise it would have already been done.
Or post a proper link...
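For anyone who wants to check whether a claimed overclock ever kicked in, reading the clock back under load is more telling than trusting whatever tool "set" it. Same caveats as before: Python on a rooted device, and the gpuclk path is a guess that differs per kernel:

Code:
# Poll the reported GPU clock while a game or benchmark runs and record the peak.
# If the peak never goes above the stock maximum, the "overclock" never took effect.
import time

GPUCLK = "/sys/class/kgsl/kgsl-3d0/gpuclk"    # assumption - adjust for your kernel

peak = 0
for _ in range(120):                          # roughly one minute of sampling
    try:
        with open(GPUCLK) as f:
            peak = max(peak, int(f.read().strip()))
    except OSError:
        print("gpuclk node not found - the path differs on this kernel")
        break
    time.sleep(0.5)
else:
    print("Highest GPU clock observed: %d MHz" % (peak // 1000000))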

[Q] Disable and oc?

OK, so I know most games aren't multithreaded, and with my N7 en route I was thinking about this. Has anyone disabled 2 cores and OCed the live ones and the GPU further? Since most games wouldn't use them, it could help cut down on heat and battery consumption, provided the games aren't locking themselves to specific threads. So I was wondering if anyone could try it, or post the results if they already have.
Sent from my Nexus 4 using Tapatalk 2
nagle3092 said:
OK, so I know most games aren't multithreaded, and with my N7 en route I was thinking about this. Has anyone disabled 2 cores and OCed the live ones and the GPU further? Since most games wouldn't use them, it could help cut down on heat and battery consumption, provided the games aren't locking themselves to specific threads. So I was wondering if anyone could try it, or post the results if they already have.
Sent from my Nexus 4 using Tapatalk 2
Android should automatically disable cores when they are not being used. So if you were to overclock the CPU and then go into a game which doesn't use all the cores, the extra cores should be disabled. I am not 100% sure on this, but this is how I understand it should work.
The Linux kernel that Android runs on top of controls the CPU; I do not think apps have to be CPU-core aware. This is due to the memory and power management of Android and ARM processors. Yes, I know it's much deeper than this, but basically the system handles this, NOT THE APPS THEMSELVES. If apps had to be aware of the CPU/cores to use them, there would be so many broken apps across devices.
If I'm wrong, please explain.
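For anyone who still wants to test the idea by hand, the stock Linux hotplug interface does let root take cores offline, so you can benchmark a game with and without the extra cores and see whether heat or battery actually improve. A sketch using the standard /sys/devices/system/cpu nodes (root required; cpu0 usually can't be offlined, and some kernels or hotplug daemons will re-enable cores on their own):

Code:
# Take CPU cores offline/online via the standard Linux hotplug interface.
import glob

def set_core(cpu_index, online):
    path = "/sys/devices/system/cpu/cpu%d/online" % cpu_index
    with open(path, "w") as f:
        f.write("1" if online else "0")

def online_count():
    count = 0
    for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/online"):
        with open(path) as f:
            count += (f.read().strip() == "1")
    return count

# Example: park cores 2 and 3 before running a mostly single-threaded game.
for idx in (2, 3):
    set_core(idx, online=False)
print("Hot-pluggable cores still online: %d" % online_count())
# Call set_core(idx, online=True) afterwards, or just reboot, to restore them.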

GPU frequency

Hi guys,
I need a little information to write an article specifically about FP16 vs FP32. The X1 GPU... what frequency does it run at? I've read all over AnandTech and elsewhere that it is 1 GHz, but that is an estimate. Has anyone here verified it with root and the right tools? Thanks
Klaus88 said:
Hi guys,
I need a little information to write an article specifically about FP16 vs FP32. The X1 GPU... what frequency does it run at? I've read all over AnandTech and elsewhere that it is 1 GHz, but that is an estimate. Has anyone here verified it with root and the right tools? Thanks
I started a thread on the Dolphin forums a while back that touched on it: https://forums.dolphin-emu.org/Thread-build-variation-and-benchmark which implies it can run at 1 GHz but scales back to 300 MHz. I never did get to the bottom of how to stop it scaling back, though. The devs seem reluctant to give out that info, and I can't say I blame them; they probably don't want the backlash from fried machines.
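For the FP16/FP32 article itself, the clock mostly matters as an input to the peak-throughput arithmetic, so here it is spelled out. The 256 Maxwell cores, the 2x FP16 rate and the 1000 MHz figure are the usual published specs for the X1 rather than anything I've measured, and the 300 MHz case is just the throttled clock mentioned above:

Code:
# Peak theoretical throughput: 2 FLOPs per core per cycle (fused multiply-add),
# with FP16 packed at twice the FP32 rate on the X1.
def peak_gflops(cores, clock_mhz, flops_per_cycle=2):
    return cores * clock_mhz * flops_per_cycle / 1000.0

X1_CORES = 256
for label, mhz in (("peak 1000 MHz", 1000), ("throttled ~300 MHz", 300)):
    fp32 = peak_gflops(X1_CORES, mhz)
    print("%s: %4.0f GFLOPS FP32, %4.0f GFLOPS FP16" % (label, fp32, fp32 * 2))
# 1000 MHz -> 512 FP32 / 1024 FP16; at 300 MHz it is closer to 154 / 307.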

Screen refresh rate

Can we change it to 75 Hz?
You can if you want your device dead.
Dairymilk said:
You can if you want your device dead.
The Redmi Note 7 Pro can be overclocked.
Somebody tried it; the screen was flickering and unresponsive.
I would be really curious: why would you want to use such a frequency? For some now "well known" game? Why do you think it would improve graphics performance? Well, I don't know your reason, but I know that even if, and I say if, it were possible, it would be useless and there would be no benefit (or maybe some, but at the expense of the rest of the device). Of course there are kernels that increase the processor clock (at your own risk, of course), but the CPU and GPU are not the same thing, and above all this is a smartphone, not a PC...
Skake said:
I would be really curious: why would you want to use such a frequency? For some now "well known" game? Why do you think it would improve graphics performance? Well, I don't know your reason, but I know that even if, and I say if, it were possible, it would be useless and there would be no benefit (or maybe some, but at the expense of the rest of the device). Of course there are kernels that increase the processor clock (at your own risk, of course), but the CPU and GPU are not the same thing, and above all this is a smartphone, not a PC...
Thank you dude
