I'm still not decided, help D: - Galaxy S 4 Q&A, Help & Troubleshooting

Since the price of the S4 is dropping fast, I want to buy one... but I play a lot, and I want a phone that is faster and future-proof for gaming. As I understand it, the Octa's GPU is older than the quad-core model's Adreno 320, and the Adreno is the more future-proof part. I'm coming from an HTC DROID DNA, so I know the Adreno 320 can overclock to 550 MHz or even 600 MHz, and I know that in time someone will put out a kernel that can overclock the GPU or CPU. To repeat, what I want in the phone is:
Hard gaming
Speed...
and future-proofing on the GPU... and I hope it's the BEST GPU.
Or should I wait to see what comes out in a few months? I know the ZTE with the Tegra 4i is coming in June or July, I don't remember which...

Well, I got 29k in the benchmark with the Adam kernel. The device is good except for the smearing issues, which I hope Samsung will fix.

darkhelio said:
Since the price of the S4 is dropping fast, I want to buy one... but I play a lot, and I want a phone that is faster and future-proof for gaming. As I understand it, the Octa's GPU is older than the quad-core model's Adreno 320, and the Adreno is the more future-proof part. I'm coming from an HTC DROID DNA, so I know the Adreno 320 can overclock to 550 MHz or even 600 MHz, and I know that in time someone will put out a kernel that can overclock the GPU or CPU. To repeat, what I want in the phone is:
Hard gaming
Speed...
and future-proofing on the GPU... and I hope it's the BEST GPU.
Or should I wait to see what comes out in a few months? I know the ZTE with the Tegra 4i is coming in June or July, I don't remember which...
LOL? So everyone thinks that the i9500's GPU is worse than the Adreno 320... check out this:
http://www.youtube.com/watch?v=ZegbcFRsMsA
As far as I'm concerned, the difference between the two GPUs is much less significant than the difference between the two CPUs. Since you talk about future-proofing, you should know that games with a lot of AI are very CPU-hungry, and the power of the A15 cores in the i9500 will crush the A9-class Qualcomm Snapdragon.
You said yourself that you need a faster phone; well, in every domain the i9500 is faster than the i9505, except for an insignificant edge in GPU performance.
Talking about overclocking: you know the Galaxy S4 suffers from overheating, whether i9500 or i9505, and trying to overclock the GPU while the machine already has overheating issues will destroy your device for sure. OK, let's imagine you can do it. And who knows, Samsung might release a patch or software update so the i9500 can activate all 8 cores, gaining huge performance while keeping the battery efficient.
After all, people buy what they like. I'm just giving you my own opinion; I hope it helps you decide.

Related

Galaxy S SGX540 GPU. Any details up 'till now?

Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
From one side I see declarations that the SGX540 can do 90 million polygons per second, and from the other side I see stuff like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGx535. The only data I can find seems like these two GPU's are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than a Snapdragon with the Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's a modified Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary given all the memory efficiencies between the Cortex-A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that is the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much better fps, though there's a problem: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later they changed places, and the 360 held the PS3 back (don't start on which is better lol), so the PS3 has to make do with 360 ports, but when it gets a game made just for it you really see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I can't wait to see what it can do in the future, or what someone can port onto it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips): something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this out.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
smart move sammy
voodoochild2008-
I wouldn't say we'll have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/s).
Even the Motorola Droid (SGX530 at 110 MHz: about 9 Mpolys/s and 280 Mpixels/s at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being stressed, but weaker devices should be hitting the limit soon.
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at lightspeed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now, as firms are heading more towards Android; I did read about one big firm in the USA dropping marketing for Apple and heading to Android. Well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone took 1 MP photos when others were on 3 MP, then it had no video recording when most others did, then they hyped it when it moved to a 3 MP camera and did video... OMG. OK, I'm going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started counting the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but as opposed to the original Droid, it comes at the stock 200 MHz (or at least 180).
At that speed it does 12-14 Mpolys/s and can push out 400-500 Mpixels/s.
Not too shabby.
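For rough back-of-the-envelope purposes, figures like these scale approximately linearly with GPU clock. A minimal sketch (the 110 MHz / 280 Mpixels/s / 9 Mpolys/s SGX530 baseline comes from the figures quoted in this thread; real throughput is also limited by memory bandwidth, so treat the result as an upper bound):

```python
def scale_throughput(base_value, base_mhz, target_mhz):
    """First-order estimate: throughput scales linearly with GPU clock.
    Ignores memory bandwidth and driver overhead, so it is an upper bound."""
    return base_value * target_mhz / base_mhz

# SGX530 figures quoted in this thread: ~280 Mpixels/s and ~9 Mpolys/s at 110 MHz.
print(round(scale_throughput(280, 110, 200)))     # -> 509 (Mpixels/s at 200 MHz)
print(round(scale_throughput(9, 110, 200), 1))    # -> 16.4 (Mpolys/s at 200 MHz)
```

The linear estimate sits just above the 400-500 Mpixels/s and 12-14 Mpolys/s figures quoted for the Droid X, which is about what you'd expect once bandwidth starts to bite.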
The 535 is a downgrade from the 540; the 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D performance!
This worries me as well... it seems like it might not be as great as we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. The hardware is obviously stellar, so something must be holding it back.
Pika007 said:
The Droid X still uses the SGX530, but as opposed to the original Droid, it comes at the stock 200 MHz (or at least 180).
At that speed it does 12-14 Mpolys/s and can push out 400-500 Mpixels/s.
Not too shabby.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But why did the Droid X, even with the "inferior" GPU, outscore the Galaxy S?
gdfnr123 said:
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip in the original Droid could overclock to 1.2 GHz from what, 550 MHz?
How about the CPUs in the Droid X and the Galaxy S? Has anyone compared those chips? Which can overclock to a higher clock, and which is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex-A8 running at 1 GHz. The Samsung Hummingbird is a specialized version of the Cortex-A8, designed by Intrinsity, running at 1 GHz.
Qualcomm likewise did a complete redesign of the Cortex-A8 for the 1 GHz Snapdragon CPU. While the original A8 could only be clocked at around 600 MHz with a reasonable power drain, the reworked versions of the A8 could be clocked higher while maintaining better power.
An untouched Cortex-A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does its 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device shake off its teething problems, and test again, with more than one benchmark.
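The point about composite benchmarks can be made concrete: a single Quadrant-style number averages subscores, so a device with a weaker GPU can still come out ahead overall. A toy sketch (the subscores and the plain-average weighting are illustrative assumptions, not Quadrant's actual formula):

```python
# Toy model of a composite benchmark: one number that averages subscores,
# so a win on I/O and 2D can mask a large loss on 3D. All numbers made up.
def composite(subscores):
    """Quadrant-style single score: here, a plain average of the subscores."""
    return sum(subscores.values()) / len(subscores)

droid_x  = {"cpu": 1100, "mem": 900, "io": 1000, "2d": 1400, "3d": 700}
galaxy_s = {"cpu": 1100, "mem": 900, "io": 400,  "2d": 600,  "3d": 1600}

print(composite(droid_x) > composite(galaxy_s))  # -> True  (wins the composite)
print(galaxy_s["3d"] > droid_x["3d"])            # -> True  (but loses on 3D)
```

This is exactly the Droid X vs. Galaxy S situation described above: slow internal storage and CPU-side 2D drag the SGS composite down even though its GPU is stronger.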
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does its 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device shake off its teething problems, and test again, with more than one benchmark.
The SGS might be falling behind in I/O speeds... it is well known that all the app data is stored on a slower internal SD-card partition. Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... it's such a shame that the GPU only gets used in games.
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy, but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it stays at 1500 mAh or even less.
A dual-core CPU can be fast, but power-hungry as well.

Tegra 2 overclocking?

Any info out there about overclocking this baby? Will standard overclocking tools work, or does new software need to be developed?
To overclock the CPU I think you'd need a custom kernel that allows it first, but if the bootloader is locked then custom kernels can't be flashed.
You won't have to worry about performance issues with the Tegra 2 for a while, though.
As if you needed to run Crysis on it?
Tough crowd this morning!
This site is here for getting the most out of devices. Rooting and removing bloatware increases performance; customized ROMs improve performance and the user experience. I merely asked about another tool for optimizing a device.
bee55 said:
To overclock the CPU I think you'd need a custom kernel that allows it first, but if the bootloader is locked then custom kernels can't be flashed.
You won't have to worry about performance issues with the Tegra 2 for a while, though.
Haha, don't underestimate the people who hang out at XDA and other dev sites; we find ways to work these phones to the bone. I know I'll probably have 100 apps downloaded and installed in the first 24 hours, and I will be testing its limits.
You have the best CPU ever put in a phone and you want to overclock. Wow. Why?
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Snapdragon was the best at one time, and most ROMs had overclocking built in!
Snapdragon is the worst CPU at 1 GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty performance in phones compared to Samsung, TI, and now Nvidia.
Recon Freak said:
Snapdragon was the best at one time, and most ROMs had overclocking built in!
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Hence why he said 'at one time'.
Sent from my SGH-I897 using XDA App
AllTheWay said:
Snapdragon is the worst CPU at 1 GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty performance in phones compared to Samsung, TI, and now Nvidia.
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Snapdragon is far from being the worst CPU, clock for clock. First of all, Snapdragon is not a CPU, it's an SoC (System on a Chip), and the CPU core inside Snapdragon is called Scorpion. Scorpion is neither a standard ARM Cortex-A8 nor an A9 core, unlike the CPU cores inside the Hummingbird/TI OMAP/Nvidia Tegra, but it can be thought of as being in the same class as Cortex-A8 CPUs. Scorpion has a big advantage over the standard Cortex-A8 core in some areas (e.g. floating point). The reason many found the first generation (in the Nexus One and HTC Desire) to be "slow" is that they looked only at composite benchmarks like Quadrant and/or 3D games. The first-generation Snapdragon has a rather dated GPU (the Adreno 200) in it, and the Adreno 200's 3D performance is, honestly, bad. The second-generation Snapdragon (Desire Z/G2, Desire HD) uses a much faster GPU, the Adreno 205, making Snapdragon 3D performance on par with the Hummingbird and other current-generation SoCs.
So before you go saying again that Snapdragon is the slowest "CPU", do some reading, and think before posting. Here is some good reading for you:
http://www.anandtech.com/show/4144/...gra-2-review-the-first-dual-core-smartphone/4
http://www.anandtech.com/show/4165/the-motorola-atrix-4g-preview/5
AllTheWay said:
Snapdragon is the worst CPU at 1 GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty performance in phones compared to Samsung, TI, and now Nvidia.
Sent from my SAMSUNG-SGH-I897 using Tapatalk
If you blindly trust benchmarks, the Scorpion CPU in the 2nd-gen Snapdragons is quite fast... my G2 benchmarks at...
Quadrant: 2,700ish
Linpack: 52.69
Sunspider:2,257
Neocore:57
In fact, all of those benchmarks either match or surpass the Atrix 4G.
No problems here with my 1 GHz Snapdragon. Linpack is a constant 42+.
Now that the phone is rooted, can we use SetCPU to underclock it to save battery?
Or does SetCPU not support dual core?
Also, is what I said above true: if we have root, can we underclock without flashing custom kernels?
The Nvidia Tegra 2 kernel does not have a simple method to modify the CPU frequency table. The dev working on the gTablet kernel, Pershoot, would be a good resource to ask. From my understanding he would have to backport the original ARM scaling code, which is not trivial in the least.
Maybe someone can figure out another way.
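For what it's worth, once a kernel does expose a frequency table, a SetCPU-style tool only has to pick an entry from sysfs and write it to `scaling_max_freq`. A minimal sketch of that selection logic (the sysfs paths are the standard Linux cpufreq locations, the frequency table in the example is made up for illustration, and writing the file requires root):

```python
# Sketch of how a SetCPU-style tool caps the CPU frequency: read the kernel's
# table from sysfs, pick the highest entry at or below the cap, and write it
# back. A locked-down kernel may not expose these files at all.
CPUFREQ = "/sys/devices/system/cpu/cpu0/cpufreq"

def pick_max_freq(available_khz, cap_khz):
    """Return the highest available frequency not exceeding cap_khz."""
    eligible = [f for f in available_khz if f <= cap_khz]
    if not eligible:
        raise ValueError("no frequency at or below the cap")
    return max(eligible)

def apply_cap(cap_khz):
    """Write the chosen cap to sysfs (requires root on a real device)."""
    with open(CPUFREQ + "/scaling_available_frequencies") as fh:
        table = [int(tok) for tok in fh.read().split()]
    with open(CPUFREQ + "/scaling_max_freq", "w") as fh:
        fh.write(str(pick_max_freq(table, cap_khz)))

# Example with an illustrative frequency table (kHz), capped at 800 MHz:
table = [216000, 312000, 456000, 608000, 760000, 816000, 912000, 1000000]
print(pick_max_freq(table, 800000))  # -> 760000
```

The hard part on the Tegra 2, as described above, is that the kernel doesn't expose frequencies beyond the stock table in the first place; adding entries is kernel work, not userspace work.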
tsekh501 said:
As if you needed to run Crysis on it?
Actually yeah, and who wouldn't? That's probably enough to get you instantly laid in some countries.
Arkasai said:
Actually yeah, and who wouldn't? That's probably enough to get you instantly laid in some countries.
Serious bragging rights right there.
Guy 1: "Damnit, I just got Crysis 2, and I can't even run Crysis 1 on my computer."
Guy 2: "Yeah well I can run it on my cell phone...look."
Guy 1's Girlfriend: "Take me, now, Guy 2!."
You get the picture.
Sorry to go off-topic there, but I do have a question: isn't the Tegra 2 ARM9-based? And there's nothing wrong with wanting to push a device to its limits. Overclocking is fun.
dandmcd said:
Haha, don't underestimate the people who hang out at XDA and other dev sites; we find ways to work these phones to the bone. I know I'll probably have 100 apps downloaded and installed in the first 24 hours, and I will be testing its limits.
lol, same here. I have about 45 apps installed on my Galaxy Tab, and all of them will be installed on the Atrix immediately and tested. I plan on testing every single game I can find on the Market, lol, the biggest being Dungeon Defenders for now... it runs a bit slow on the Galaxy Tab, and I've heard it runs *GREAT* on Tegra 2.
AllTheWay said:
You have the best CPU ever put in a phone and you want to overclock. Wow. Why?
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Because you can make it better. Why settle for less? My Captivate is fast and does everything I need it to do at 1 GHz, but I have it at 1.3 now, and undervolted.
Why? Because it is better.
Captivate 2.2.1 Paragon
Is there a simple way to backup all the apps installed on my phone so I can just dump them instantly into a new phone? Preferably without having to hit "install" for every app on the market.
Wow, it's a dual-core processor and you want to OC... ugh, get out... lol

SGSIII Mali 400 Drivers on the note!

The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same as the Adreno 220 GPU but at double the frequency. The frequency has nothing to do with it here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
We know the Mali 400 GPU in the SGSIII is clocked at 400 MHz, but even if you clocked the Mali 400 in your Note (which has the same resolution) to match, you wouldn't reach that performance, which tells me it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali 400 drivers from the SGSIII so we can use them on our phones?
This is not a question, so I think it belongs here rather than in the Q&A section, as it's just a discussion of whether this is going to work or not.
Same driver, bigger screen = performance loss.
That is why Sammy set the CPU 200 MHz faster on the Note than on the S2.
The screen size has NOTHING to do with anything; the resolution does, and that is the same on the SGSIII and the Note.
Also, that's why I said that even if you overclock the GPU to 400 MHz you still won't reach that performance, so it has to be down to the drivers.
The Note and SGSIII do indeed have different screen resolutions: the Note is 1280x800 while the SGSIII is 1280x720. Not much of a difference though, basically 16:10 vs. 16:9, respectively. I believe the new Mali 400 drivers will be in the next ROM update anyway.
Hell Guardian said:
The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same as the Adreno 220 GPU but at double the frequency. The frequency has nothing to do with it here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
We know the Mali 400 GPU in the SGSIII is clocked at 400 MHz, but even if you clocked the Mali 400 in your Note (which has the same resolution) to match, you wouldn't reach that performance, which tells me it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali 400 drivers from the SGSIII so we can use them on our phones?
This is not a question, so I think it belongs here rather than in the Q&A section, as it's just a discussion of whether this is going to work or not.
Well, if they are exactly the same apart from clock speed, then I would think they should indeed work.
This is interesting and I certainly hope it works. Not that the GPU is lacking at 400 MHz or even less, but who doesn't like more performance for free?
Muskie said:
The Note and SGSIII do indeed have different screen resolutions: the Note is 1280x800 while the SGSIII is 1280x720. Not much of a difference though, basically 16:10 vs. 16:9, respectively. I believe the new Mali 400 drivers will be in the next ROM update anyway.
I know that, but that difference is not major by any means; it won't affect performance that much if they both run at the same frequency.
shaolin95 said:
Well, if they are exactly the same apart from clock speed, then I would think they should indeed work.
This is interesting and I certainly hope it works. Not that the GPU is lacking at 400 MHz or even less, but who doesn't like more performance for free?
My thoughts exactly. If the folks in the Sensation section did it, why can't we?
Link to the drivers that were extracted from the One S:
http://forum.xda-developers.com/showthread.php?t=1643472
Just check the replies to see the performance boost. This is the EXACT same situation as the Note and the SGSIII GPU.
Wow, that's a good boost.
nex7er said:
Wow, that's a good boost.
I think if Note users can get that kind of boost on their phones, it will eliminate ANY kind of lag in the UI and it will be amazingly smooth. It would also give a huge boost to SGSII users.
If this really happens and it does work, what about the battery life? It could be poorer, I think.
In theory, I see where you're going with this, and in theory it sounds plausible. However, something that I think has been overlooked is the process node of the new S3's chipset vs. the one found in the current-generation S2/Note (32nm vs. 45nm). It's entirely possible that the only reason Samsung is able to run the Mali-400 at 400 MHz is that the 32nm process is just that much more efficient, such that you can safely run at 400 MHz using the same power as you would at 266 MHz on the 45nm process.
I just get the feeling that trying to push the 45nm chip up to 400 MHz might simply melt the silicon (or at least gobble your battery life in one gulp!). Call me a defeatist if you have to, but I remain skeptical until I see evidence to the contrary.
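The process argument can be put in first-order numbers with the standard CMOS dynamic-power relation, P ≈ C·V²·f. A sketch with illustrative values (the voltages and the relative switched capacitance of the 32nm node are assumptions made for the example, not published figures):

```python
def dynamic_power(cap_rel, volts, mhz):
    """First-order CMOS dynamic power in arbitrary units: P ~ C * V^2 * f."""
    return cap_rel * volts ** 2 * mhz

# Illustrative, assumed values only (not measured):
p_note_45nm = dynamic_power(1.00, 1.10, 266)  # Mali-400 at 266 MHz on 45nm
p_s3_32nm   = dynamic_power(0.70, 1.00, 400)  # Mali-400 at 400 MHz on 32nm

# Under these assumptions, the 32nm part runs 400 MHz for *less* power than
# the 45nm part needs at 266 MHz -- the scenario described in the post above.
print(p_s3_32nm < p_note_45nm)  # -> True
```

The exact ratio depends entirely on the real voltages, but the shape of the argument holds: a shrink that lowers both voltage and capacitance buys clock headroom that the older node simply doesn't have.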
I run my Galaxy Nexus with the GPU clocked at 512 MHz (standard is 308 MHz), and that chip also uses the 45nm process.
I've been running it like that for the last 3 months with no issues, and game fps is greatly improved.
Are there any kernels at all that even support overclocking the Galaxy Note's GPU?
Very interesting. I'd like to see this investigated further, for sure!
The screen has nothing to do with it... on the Note we've got about 100k more pixels: 1280x800 - 1280x720 ≈ 100k.
And the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
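The pixel arithmetic above checks out; a quick sketch of what the resolution gap means for fill-rate demand:

```python
def pixels(width, height):
    """Pixels the GPU must fill per frame at a given resolution."""
    return width * height

note = pixels(1280, 800)  # Galaxy Note, 1280x800
s3   = pixels(1280, 720)  # Galaxy S III, 1280x720

print(note - s3)            # -> 102400 extra pixels per frame on the Note
print(round(note / s3, 3))  # -> 1.111, i.e. ~11% more fill work per frame
```

So at the same GPU clock and drivers, the Note has to push roughly 11% more pixels per frame than the S3, which is real but nowhere near enough to explain a large performance gap on its own.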
lyp9176 said:
If this really happens and it does work, what about the battery life? It could be poorer, I think.
The SGS3 seems to have decent battery life.
resistant said:
The screen has nothing to do with it... on the Note we've got about 100k more pixels: 1280x800 - 1280x720 ≈ 100k.
And the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
After some digging, I found that the GPU in the Exynos 4210 (SGS2/Note) and the 4412 (SGS3) is absolutely the same Mali 400MP4 (same number of GPU cores)! The only difference is that the 4412's GPU can go up to 400 MHz (which is doable on our GPU too, and has already been done on the SGS2). The main difference here is the four CPU cores helping the GPU. I'm skeptical that the new drivers will do much (if anything) in terms of performance! Oh, and let's not forget that the Adreno GPU drivers are written by Qualcomm, and they can't do anything right, so the updated drivers may just be better written (or at least less buggy) than the old ones!
Manya3084 said:
I run my Galaxy Nexus with the GPU clocked at 512 MHz (standard is 308 MHz), and that chip also uses the 45nm process.
I've been running it like that for the last 3 months with no issues, and game fps is greatly improved.
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
The game frame-rate gain is simply due to force-GPU rendering being active.
Sent from my GT-I9300 using xda premium
Mahoro.san said:
The SGS3 seems to have decent battery life.
That is due to the new processor voltages and the low idle drain of the CPU.
Sent from my GT-I9300 using xda premium
GR36 said:
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
The game frame-rate gain is simply due to force-GPU rendering being active.
Sent from my GT-I9300 using xda premium
Was this during kernel development in the Gingerbread days, or with the current ICS kernels?
Sent from my Galaxy Nexus using Tapatalk 2
Maybe...
Clocking the GPU at 400 MHz would give a boost in performance, but at the cost of battery life, and it would also make the phone really hot, which is not ideal. Just wait a little while and see how the S3 performs under those conditions...

Samsung Galaxy S III for Verizon keeps its design intact

Verizon is one of the five carriers starting to offer the Samsung Galaxy S III this month, and leaked pictures show that the device will be a virtually untouched version of its international GSM sibling. Android Central got their hands on some photos of the Galaxy S III for Big Red which, excluding the 4G Verizon logo on the back, is the same as the GSM model of the device. The only difference is that it runs on a dual-core 1.5 GHz Qualcomm Snapdragon S4 MSM8960 chipset with 2GB of RAM.
Samsung has decided not to alter the Galaxy S III as much as they did with the Galaxy S II lineup last year, launching the device with the same outfit as everywhere else. This seems to be the case with the US-bound T-Mobile version and the one sold by AT&T as well.
Speaking of the launch, it's not yet clear when Verizon will put the Galaxy S III on the shelves, but it will surely be sometime this month.
Dual core with 2 gigs of RAM? Isn't the S3 quad core with 1 gig? Hmmm
Sent from my Nexus S 4G using XDA Premium App
QBANBOY407 said:
Dual core with 2 gigs of RAM? Isn't the S3 quad core with 1 gig? Hmmm
Sent from my Nexus S 4G using XDA Premium App
2GB of RAM is nice, but I'd rather have a quad-core Exynos since I'm a gamer and that's a big selling point of the Galaxy line.
Product F(RED) said:
2GB of RAM is nice, but I'd rather have a quad-core Exynos since I'm a gamer and that's a big selling point of the Galaxy line.
Me too!
Sent from my Nexus S 4G using XDA Premium App
Snapdragon....so does that mean no Wolfson DAC for Verizon's phone?
alpha-niner64 said:
Snapdragon....so does that mean no Wolfson DAC for Verizon's phone?
Sent you a PM, can you please reply?
June 6th they are starting to take pre-orders is what I just saw.
Sent from my MB870 using xda premium
As the release of the Samsung Galaxy SIII looms, I am wondering what events will take place. Do you think Big Red will officially roll out the new data plans before launch? I doubt it.
When the S3 is released this month, presumably before the new data plans roll out, will I be able to keep my grandfathered unlimited plan?
Ready to leave Apple for Android, but is the GS3 good enough?
Hey guys, not sure if I should get this phone... I'm sad it won't have the overclocked 400 MHz Mali 400 GPU, but I know the S4 chip with the Adreno 225 is a beast. I held off on the Galaxy Nexus on Big Red because of its old PowerVR 540 GPU (WTF), but I know ICS is much better at using the GPU, whereas 3.2 and below mostly used the CPU for graphics processing. I'm a big gamer; that's why I use the iPhone 4S, I love the PowerVR 543MP2, it's badass... so what should I get? I kind of want to wait for the LG Eclipse, as I hear it comes with the Adreno 320, and that alone makes me giddy. Or does anyone know of any phones coming out with the Exynos 5250? I hear the Mali-T604 GPU can walk all over the PowerVR 544MP4 in the iPad 3. So anyone, please help: should I wait for phones with the next-gen GPUs, the Adreno 320 and Mali-T604, or will my gaming needs be met by the GS3 with the S4 chip running the Adreno 225 GPU? I'm ready to get rid of my iPhone 4S, but I still want the same graphics performance as its PowerVR 543MP2. I love the idea of Android and I can't wait to leave the dark side of Apple!!! FTW Android!!!
P.S. I know I'm a noob here, so sorry for the long post.
jfriend33 said:
As the release of the Samsung Galaxy S III looms, I am wondering what events will take place. Do you think big red will officially roll out the new data plans before launch? I doubt it.
When the S3 is released this month, presumably before the new data plans roll out, will I be able to keep my grandfathered unlimited plan?
If you go ahead and pre-order you will be able to keep your Unlimited. Tiered plans are supposed to begin July 1st, so anytime before then should be fine.
vader540is said:
Hey guys, not sure if I should get this phone. I'm sad it won't have the overclocked 400MHz Mali-400 GPU, but I know the S4 CPU with the Adreno 225 is a beast. I held off on the Galaxy Nexus on big red because of the old PowerVR SGX540 GPU, but I know ICS is much better at using the GPU, whereas 3.2 and below mostly used the CPU for graphics processing. I'm a big gamer; that's why I use the iPhone 4S, I love the PowerVR SGX543MP2, it's badass. So what should I get? I kinda want to wait for the LG Eclipse; I hear it comes with the Adreno 320, and that alone makes me giddy. Or does anyone know of any phones coming out with the Exynos 5250? I hear that Mali-T604 GPU can walk all over the PowerVR SGX543MP4 in the iPad 3. So please help: should I wait for phones with the next-gen GPUs (Adreno 320 and Mali-T604), or will my gaming needs be met by the GS3 with the S4 CPU and the Adreno 225 GPU? I'm ready to get rid of my iPhone 4S, but I still want at least the graphics performance of the SGX543MP2. I love the idea of Android and I can't wait to leave the dark side of Apple! FTW Android!
P.S. I know I'm a noob here, so sorry for the long post.
Developers almost never target the highest-end hardware when designing their games, for exactly the reason you're worried about. They always favor hardware that is developer-friendly and widely sourced over something completely different like the Mali GPUs (which, if theory becomes fact, are more reserved for tablets anyway). Mali is still unproven, whereas Adreno is easily sourced. I'll put money on developers favoring Adreno for some time to come.
alpha-niner64 said:
Developers almost never target the highest-end hardware when designing their games, for exactly the reason you're worried about. They always favor hardware that is developer-friendly and widely sourced over something completely different like the Mali GPUs (which, if theory becomes fact, are more reserved for tablets anyway). Mali is still unproven, whereas Adreno is easily sourced. I'll put money on developers favoring Adreno for some time to come.
The exception to your statement is of course the Tegra platform, which has versions of games optimized specifically for it. But in general you're correct. The Mali is significantly more powerful than the S4, although in real-world usage the difference would be negligible.
Does the VZW version with the Snapdragon MSM8960 have the LTE radio on the actual SoC, or is it on a separate chip like on the Bionic and Galaxy Nexus? Basically, is there any battery saving with this radio from having LTE on the SoC itself instead of a standalone chipset?
proxus01 said:
Does the VZW version with the Snapdragon MSM8960 have the LTE radio on the actual SoC, or is it on a separate chip like on the Bionic and Galaxy Nexus? Basically, is there any battery saving with this radio from having LTE on the SoC itself instead of a standalone chipset?
It's actually integrated into the SoC; you can see it in the block diagram of the CPU.
Sent from my ADR6400L using XDA
I found a Diagram
The Qualcomm Snapdragon S4 (MSM8960) is composed of two Krait CPUs clocked between 1.2 and 1.5 GHz, an Adreno 225 GPU, and a modem subsystem with LTE, GPS, WiFi, Bluetooth and FM support. It is manufactured on a 28nm process and provides much lower power consumption compared to previous generations.
Snapdragon S4 Block Diagram
Key features and improvements:
New CPU micro-architecture: The Krait CPU offers a 60% performance improvement over the Scorpion CPU used in previous generations.
CPU performance Roadmap
SIMD/VFP performance: Multimedia (SIMD) instructions and floating-point operations have also been improved, but no metrics have been provided.
Optimized memory subsystem: Krait includes dual-channel memory, which is critical for the processor to handle the large bandwidth requirements of multicore systems.
25-40% power improvement: Thanks to asynchronous multi-core processing, the MSM8960 consumes 25 to 40% less power.
Reduced complexity: Qualcomm explains that a companion core is not needed to achieve power savings because they use aSMP (asynchronous SMP) technology. This goes against Nvidia's choice to include a companion core in Tegra 3.
50% increase in GPU performance: The Adreno 225 GPU delivers 50% greater graphics processing power over the previous generation Adreno GPU, Adreno 220, and six times the processing power of Adreno 200.
Adreno GPU Power Improvements
Fully integrated 3G/4G world/multimode LTE modem: Supports all of the world's leading 2G, 3G and 4G LTE standards. It also includes integrated support for multiple satellite positioning networks (GPS and GLONASS) as well as short-range radios via Bluetooth, WiFi, FM and NFC.
Programmable Hexagon DSP architecture: Shown in the block diagram above; the DSPs contribute to the improved performance of the mobile processor. Custom DSP applications can also be written by OEMs and ISVs.
Read more: http://www.cnx-software.com/2011/10/08/qualcomm-snapdragon-s4-msm8960/
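As a quick sanity check, the marketing ratios quoted above are internally consistent; the tiny calculation below just derives the implied Adreno 220 figure and is arithmetic on Qualcomm's claims, not a benchmark:

```python
# Relative GPU throughput implied by Qualcomm's claims above,
# normalizing the Adreno 200 to 1.0.
adreno_200 = 1.0
adreno_225 = 6.0 * adreno_200   # "six times the processing power of Adreno 200"
adreno_220 = adreno_225 / 1.5   # 225 is "50% greater" than 220, so 220 = 225 / 1.5

# Taken together, the claims imply the Adreno 220 was 4x the Adreno 200.
print(f"Adreno 220 ~ {adreno_220:.1f}x Adreno 200")
```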
alpha-niner64 said:
Developers almost never target the highest-end hardware when designing their games, for exactly the reason you're worried about. They always favor hardware that is developer-friendly and widely sourced over something completely different like the Mali GPUs (which, if theory becomes fact, are more reserved for tablets anyway). Mali is still unproven, whereas Adreno is easily sourced. I'll put money on developers favoring Adreno for some time to come.
Very true, but with the growth of mobile gaming today, developers must use next-gen GPUs. For example, the Mali-T628 will have native support for OpenCL, hi-res 3D, multi-threading and 64-bit. Look at TV: there will be ultra definition, which will make 1080p look like my original Nintendo Game Boy from the 90s, and smartphones will follow suit. Look at LG's new superphone, the Eclipse: a 5-inch display at 440 ppi, and it has an Adreno 320 GPU. Apple knows how important a smooth graphical interface is; it has always used high-powered GPUs in its iPads and iPhones. And look at ICS: Google finally used integrated hardware graphics acceleration in the operating system, and you can tell the difference in how smooth 4.0 is compared to 2.3 and 3.0. The future looks good right about now... it's the waiting that is killing me, lol.
Sent from my ADR6400L using XDA
I am considering the Galaxy S III to keep my Verizon Unlimited Data plan, but I am wondering if it is rootable?
S. Prime said:
I am considering the Galaxy S III to keep my Verizon Unlimited Data plan, but I am wondering if it is rootable?
Samsung phones are ALWAYS rootable; they allow it. In fact, the bootloader just gives you a warning but lets you proceed.

GPU and benchmarks

Hey everyone.
I'm a bit lost and I don't know which to buy: the I9500 or the I9505.
So far I know that the Adreno 320 is fully OpenGL ES 3.0 compatible, while the PowerVR SGX544MP3 is not.
The Adreno 320 scores about 4 FPS more than the PowerVR in GLBenchmark 2.7.0 T-Rex.
The PowerVR scores 1-2 FPS more in GLBenchmark 2.5 Egypt.
Both GPUs score about the same in the Antutu and Quadrant video tests, with the PowerVR slightly better for a few seconds (the Adreno drops to 30 FPS for 1-2 seconds of the test while the PowerVR stays constant at 50-60).
In Antutu's third test (the one with the DNA code), the Adreno 320 stays at 30-40 FPS while the PowerVR holds a constant 60.
Both 3DMark and GLBenchmark show the PowerVR in the S4 as even weaker than the Nexus 4 and various Chinese phones.
What's the deal? What on earth is happening? Is the PowerVR that weak in the newer graphics technologies while scoring well in the older ones?
Also, is there any OpenGL ES 3.0 benchmark so we can compare the Adreno 320 (fully OpenGL ES 3.0) with the PowerVR SGX544MP3 (OpenGL ES 2.0, but with some 3.0 features exposed through an API) and see what the score and quality are? I really want to see what that 3.0 API can actually do, as Imagination doesn't really say. Will there be games or apps using only OpenGL ES 3.0 that we will have trouble running because of this old GPU?
I'm wondering: if an OpenGL ES 3.0-only game is released in a year, what will happen with the S4 Octa? It won't be able to play it, right? I have no idea how OpenGL versioning works, but I remember that a game requiring DirectX 10 will not work with DirectX 9.
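The DirectX analogy holds for Android too. Here is a rough sketch; the version tuples and names below are illustrative only, since on a real device the requirement is declared via android:glEsVersion in the app manifest and Google Play simply hides the app from devices that don't meet it:

```python
# Toy model of OpenGL ES version gating: a GPU can run a game only if
# its maximum supported ES version is at least the game's requirement.
def can_run(gpu_max_es, game_requires_es):
    return gpu_max_es >= game_requires_es  # tuples compare element-wise

ADRENO_320 = (3, 0)   # fully OpenGL ES 3.0
SGX544MP3 = (2, 0)    # OpenGL ES 2.0 (3.0 features only via a vendor API)

print(can_run(ADRENO_320, (3, 0)))  # True
print(can_run(SGX544MP3, (3, 0)))   # False: an ES 3.0-only game is out of reach
print(can_run(ADRENO_320, (2, 0)))  # True: ES 3.0 is backward compatible with 2.0
```

The last line is the saving grace: existing ES 2.0 games keep working on ES 3.0 hardware, so the question is only about future 3.0-only titles.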
The PowerVR really sucks; Samsung should have used the PowerVR Series6 "Rogue".
My opinion is that the Qualcomm scores very well. Even my S3 is enough to play every single game, but the phone is short on RAM, and that's why I'm replacing it now. Buying the Octa will cost me $150 more than the Qualcomm version, and I would need to ship it overseas if I ever have a problem and need warranty service. With those $150 I could instead buy 2 spare batteries and the Samsung S Band. I want the Octa, but does this phone really deserve such attention with that old, rubbish PowerVR GPU? I don't have 4G in my area, so I don't care about it, though it would be nice in case I travel somewhere with 4G; for me HSPA+ is enough and very fast. So the only things that count here are the CPU, the GPU and battery life. Battery life can be solved with an additional battery, which leaves the GPU and the CPU. A15 cores are very fast but can use a lot of energy, so I might get 2 days of battery life with texting and calling but only 2 hours playing games and watching 1080p videos, while with A9 I'd get something similar to the S3.
Can any developer or experienced member here answer these questions?
Nobody?
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S 4 and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, Antutu and more) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via an API. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and then some. I find the Snapdragon more tunable than the Exynos, but the PowerVR is still a good GPU.
Alberto96 said:
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S 4 and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, Antutu and more) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via an API. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and then some. I find the Snapdragon more tunable than the Exynos, but the PowerVR is still a good GPU.
Totally agree with you. I don't get why people say the PowerVR is better. I see that it scores better than the Adreno in the Antutu benchmark, but in GLBenchmark it's awful. This is my only worry right now: what happens if we put the two GPUs through a full OpenGL ES 3.0 test? Will the PowerVR throw an error, or pass it with a lower score? I don't care so much about the score, just whether it can pass the test. If it passes, I'm sold on the Octa.
Also, I found that the Octa supports LPDDR3 at 800MHz, which means 12.8GB/s of bandwidth, while the S600 uses LPDDR3 at only around 600MHz (roughly 9.6GB/s).
Sent from my GT-I9300 using Tapatalk 2
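The bandwidth figures in the post above check out with simple arithmetic, assuming a 2x32-bit (64-bit total) dual-channel LPDDR3 bus, which is an assumption on my part; DDR memory transfers data on both clock edges:

```python
# Peak theoretical LPDDR3 bandwidth: 2 transfers per clock (DDR)
# times the bus width in bytes.
def lpddr3_bandwidth_gbps(clock_mhz, bus_bits=64):
    transfers_per_sec = 2 * clock_mhz * 1e6  # double data rate
    return transfers_per_sec * (bus_bits / 8) / 1e9

print(lpddr3_bandwidth_gbps(800))  # 12.8 GB/s, the Octa figure quoted above
print(lpddr3_bandwidth_gbps(600))  # 9.6 GB/s at the S600's quoted clock
```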
I just read (on an Italian forum) that the Exynos will eventually be able to use all 8 cores together with kernel 3.8.
So... I think I will buy the Exynos. I'm just waiting for a reply from a friend who bought it on Expansys USA. If he receives it and everything is fine, I will buy it from that site. With Italian taxes (21%) and shipping costs it will come to about 730-740€.
Alberto96 said:
I just read (on an Italian forum) that the Exynos will eventually be able to use all 8 cores together with kernel 3.8.
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR SGX544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali-T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at certain tasks such as render-to-texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from its raw GFLOPS figures.
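To illustrate the scheduling concern raised above, here is a toy model of HMP-style dispatch on a big.LITTLE chip (the kernel 3.8 work mentioned earlier). The load threshold and domain labels are made up for illustration; the real scheduler tracks per-task load inside the kernel:

```python
# Toy HMP (heterogeneous multi-processing) dispatcher: heavy threads go
# to the A15 "big" domain, light threads to the A7 "LITTLE" domain, so
# all eight cores can be busy at once. Under the older cluster-migration
# scheme, only one domain could run at a time.
BIG, LITTLE = "A15", "A7"

def hmp_dispatch(thread_loads, heavy_threshold=0.6):
    """Assign each thread (load in 0..1) to a core domain."""
    return [(load, BIG if load >= heavy_threshold else LITTLE)
            for load in thread_loads]

loads = [0.9, 0.8, 0.3, 0.1]  # two heavy game threads, two light helpers
print(hmp_dispatch(loads))
# The risk raised above: two cooperating threads that straddle the two
# domains pay extra latency for cross-cluster cache and data traffic.
```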
Phobos Exp-Nord said:
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR SGX544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali-T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at certain tasks such as render-to-texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from its raw GFLOPS figures.
Well, when you play heavy games you need all the cores. It's also useful to run all the cores while the phone is charging, without killing the battery.
I need a new phone because my Galaxy S I9000 is slow with new apps and Android versions. If I buy this one, an S800 version is pointless for me. The CPU is fast; the GPU maybe isn't an Adreno 330, but with overclocking we can boost performance a lot.
Dude, using all eight cores will simply melt the phone in your hands, lol; you'll end up drinking an S4 cocktail. A quad-core CPU is enough, but a GPU never is. The same thing is happening with PCs. I don't need huge FPS in T-Rex, just some solid reviews and opinions from people who really know this stuff... but so far only you two have been able to answer (I won't claim yet that this forum is full of noobs, lol).
I want a new phone because of the lack of RAM in the S3, even though it's smooth for me. I was happy to hear about the Octa version because I wanted to try something new, but I'm kinda lost now.
Alberto96, please let me know when your friend gets that i9500. I want to get it from Expansys too (I think we already talked about this in other threads). If I buy the i9505 I will get it from Amazon Italy, as it's cheaper there than elsewhere.
I'm just comparing:
I9500: 1 year of warranty (overseas)
I9505: 2 years of warranty (local)
I9500 = I9505 + 3 additional S4 batteries with an external charger
That's because:
740€ = 625€ + 35€ x 3 batteries (and I will still have money left over for a Burger King and a Cola)
So... is it really worth the risk? Still nobody has answered me about OpenGL ES 3.0.
The S800 and Adreno 330 will not be in a Samsung device soon (maybe never), and 2.1-2.3GHz looks like too much for a mobile phone. We already have heating issues with the S4 (I even have them on the S3, with the phone getting warm). Also, my laptop is a dual-core 2.1GHz AMD, for God's sake.
@Alberto96, I beg you, when your friend gets the phone, please test it and let me know what you think.
demlasjr said:
2.1-2.3GHz looks like too much for a mobile phone
Not when playing Hi10P in software.
I do not know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application get dispatched to two different core domains, making it really hard to exchange data between them; each domain probably has its own cache subsystem, so performance would drop even more than with both threads running together on the A7 domain.
Phobos Exp-Nord said:
Not when playing Hi10P in software.
I do not know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application get dispatched to two different core domains, making it really hard to exchange data between them; each domain probably has its own cache subsystem, so performance would drop even more than with both threads running together on the A7 domain.
Yeah, you're right here. I don't have much knowledge of this profile, as I don't watch anime, but in the S4's case it seems to depend more on the GPU than the CPU. I'm fairly sure the Exynos Octa can handle it, but I'm not sure about the PowerVR. I've read that Hi10P plays at anywhere from 15-20 FPS (watchable, but still not great) on a Tegra 3 quad-core overclocked to 1.6GHz, so there is still hope.
demlasjr said:
I've read that Hi10P plays at anywhere from 15-20 FPS
That's for 720p. I just asked in another thread about 1080p: the S4 cannot play it smoothly enough with MX Player. It's not a question of resolution; the problem is being able to use a file from a 1080p home collection without any extra effort.
We'll see; maybe an update will be released later for such issues. I think the GPU and CPU of both variants are capable of playing such videos.
Hey guys,
http://withimagination.imgtec.com/i...or-todays-leading-platforms#comment-880303396
jumping directly from OpenGL ES 2.0 to 3.0 would create a situation where app compatibility would be severely broken across devices. But most people update their devices every two years; by that time, PowerVR Series6 would be the dominant OpenGL ES 3.0 GPU generation shipping in most devices.
It is also important to remember that the PowerVR Series5XT GPU family has been successfully holding its own against recently released competing graphics solutions despite being released almost four years ago, which in itself is an amazing feat.
So... should we trust alexvoica and go ahead with the PowerVR SGX544MP3 even though it lacks OpenGL ES 3.0? He said there was a long way to go before OpenGL ES 2.0 took over, but it wasn't as long a way as he claimed. Now every single game uses OpenGL ES 2.0, and I'm sure there will soon be OpenGL ES 3.0-only games, well before two years from now.
Have a look at this: http://gfxbench.com/compare.jsp?cols=2&D1=Samsung+GT-I9500+Galaxy+S4&D2=Samsung+GT-I9505+Galaxy+S4
