Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see claims that the SGX540 can do 90 million polygons per second, and on the other side I see stuff like "twice the performance of SGX530".
...but twice the performance of SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed some light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGX535. The only data I can find suggests these two GPUs are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than Snapdragon with the Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's a modified Xenos-derived core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary with all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that's the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much better fps, though there's a catch: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later they changed places, and the 360 held the PS3 back (don't start on which is better lol); the PS3 has to make do with 360 ports, but when it has a game made just for it you really get to see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I can't wait to see what it can do in the future or what someone can port to it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips), something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; guess the rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Smart move, Sammy.
voodoochild2008-
I wouldn't say we'd have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/sec).
Even the Motorola Droid (SGX530 at 110 MHz, about ~9 Mpolys and ~280 Mpixels at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being stressed, but weaker devices should be hitting their limits soon.
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at lightspeed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now, as firms are heading more towards Android. I did read about one big firm in the USA dropping marketing for Apple and heading to Android, and well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone did like a 1 MP photo when others were on 3 MP, then it had no video when most others did, then they hyped it when it moved to a 3 MP cam and did video. OMG, OK, I'm going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started to count the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise... but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but unlike in the original Droid, it runs at the stock 200 MHz (or at least 180).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
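These peak figures scale roughly linearly with the core clock, which is where numbers like these come from. A minimal Python sketch using the figures quoted in this thread (linear scaling is an idealized upper bound, so real-part quotes land a bit lower):

```python
# Idealized scaling of peak GPU throughput with core clock.
# Baseline: the SGX530 figures quoted above (~9 Mpolys/s, ~280 Mpixels/s at 110 MHz).
def scale_peak(rate_at_base, base_mhz, target_mhz):
    """Peak rates scale roughly linearly with core clock (an upper bound)."""
    return rate_at_base * target_mhz / base_mhz

for name, rate in [("Mpolys/s", 9), ("Mpixels/s", 280)]:
    print(f"SGX530 @ 200 MHz: ~{scale_peak(rate, 110, 200):.0f} {name}")
# ~16 Mpolys/s and ~509 Mpixels/s -- the same ballpark as the 12-14 Mpoly /
# 400-500 Mpixel figures quoted for the Droid X (real parts never hit the ideal).
```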
The 535 is a downgrade from the 540. The 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D performance / less power!
This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. Because the hardware is obviously stellar, there's something holding it back.
Pika007 said:
The Droid X still uses the SGX530, but unlike in the original Droid, it runs at the stock 200 MHz (or at least 180).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise... but the Droid X does have a dedicated GPU. Can anyone explain?
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and Galaxy S?
I know the OMAP chip on the original Droid can overclock to 1.2 GHz from, what, 550 MHz?
How about the CPUs in the Droid X and Galaxy S? Did anyone compare those chips? Which can overclock to a higher clock, and which one is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex A8 running at 1 GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, designed by Intrinsity, running at 1 GHz.
Qualcomm likewise does a complete redesign of the Cortex A8 in the Snapdragon CPU at 1 GHz. But while the original A8 could only be clocked at 600 MHz with a reasonable power drain, the reworked versions of the A8 could be clocked higher while maintaining better power.
An untouched Cortex A8 can do more at the same frequency compared to a specialized, stripped-down A8.
If anything the Samsung Galaxy S is better balanced, leveraging the SGX 540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smart phones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo for all devices, let all devices get past their teething problems, and test again, with more than one benchmark.
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark; plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo for all devices, let all devices get past their teething problems, and test again, with more than one benchmark.
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy, but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smart phones are moving at a blisteringly fast pace.
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it stays at 1500 mAh or even less.
Dual-core CPUs could be fast, but power-hungry as well.
Which is the better GPU, and why?
I'm not sure of the technical reasons why, and maybe people are just going off benchmarks, but the general consensus is that the Adreno 220 has better gaming performance.
However, unless you're planning some hardcore gaming, the Mali-400 MP or GeForce ULP will be just fine.
The Mali-400 MP is IMO a faster GPU, but it really lacks things needed to be a good GPU. At a low level, some of its major 3D scores are even lower than the Adreno 205's, so quality-wise it suffers. It's missing many texture compression formats, so compatibility is low. Most games will come up with a solution for that, but that takes time, and that time could eat into the life cycle of the GS2. The Mali-400 is slower than the Adreno 205 in geometric, common, and exponential tests. The Adreno 220 is slightly slower in synthetic tests, but comes with more compatibility, better quality at the low level, and more texture compression formats, and it will be compatible with all games from the start, since Adreno-targeted games are already abundant in the market. So it's more like Samsung delivered the fastest GPU, but with major flaws. Here the Adreno 220 is like ATI and Nvidia, and the Mali-400 is like a generic GPU from some other company. And the Galaxy S 2 also coming in a Tegra 2 variant would really hurt the Mali-400's compatibility story, since Mali will be missing device share; the Mali-400 could be left out here.
Right now the game here is a faster GPU (by a small margin) vs a more compatible GPU. If you can wait with no definite future, Mali; if you want everything now and in the future, it's Adreno.
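To make the compatibility argument concrete, here's a minimal Python sketch of the idea. The format sets are simplified illustrations (ETC1 is the baseline OpenGL ES 2.0 format; ATC and PVRTC are the Adreno- and PowerVR-specific families), not exhaustive spec data, and the "game" is hypothetical:

```python
# Illustrative sketch of the texture-format compatibility argument above.
# Simplified, non-exhaustive support sets: ETC1 is the GLES 2.0 baseline,
# ATC is Adreno-specific, PVRTC is PowerVR-specific.
GPU_FORMATS = {
    "Adreno 220":  {"ETC1", "ATC"},
    "Mali-400 MP": {"ETC1"},
    "SGX540":      {"ETC1", "PVRTC"},
}

def can_run(game_formats, gpu):
    """A game runs out of the box if the GPU supports at least one
    of the texture formats the game ships with."""
    return bool(game_formats & GPU_FORMATS[gpu])

game = {"ATC", "PVRTC"}  # hypothetical game shipping only proprietary formats
for gpu in GPU_FORMATS:
    print(f"{gpu}: {'OK' if can_run(game, gpu) else 'needs a repack/patch'}")
```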
With CF working on compatibility, I wouldn't be surprised if we're all playing Tegra Zone next month.
bilboa1 said:
With CF working on compatibility, I wouldn't be surprised if we're all playing Tegra Zone next month.
I agree. (10char)
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1 GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5 GHz Adreno 220)
Should give you a rough idea of what to expect.
_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1 GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5 GHz Adreno 220)
Should give you a rough idea of what to expect.
No, it's not even close.
My Galaxy S II scores 42.2 fps in the same benchmark; the Adreno scores an impressive 38 fps, but that's with the CPU at 1.5 GHz.
_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1 GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5 GHz Adreno 220)
Should give you a rough idea of what to expect.
No, this one is not accurate.
Just look at the firmware: the SGS2 was running Android 2.3.1 at the time; it was not a retail device.
A retail SGS2 outperforms anything currently in GLBenchmark.
"Originally Posted by iwantandroid
I cried when I lerned this phone i got from tmobile didnt have Android. Can sum1 help me get Android on my new G1 and then tel me how to jailbroke it please"
LOL OMG
_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1 GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5 GHz Adreno 220)
Should give you a rough idea of what to expect.
These tests are kinda misleading, between the non-final device/software and the capped framerate.
I'm a bit disappointed that it comes from AnandTech, since they usually try to have everything squared away on PCs ;-)
lol, you guys are very defensive about your phones, understandably.
What you should be able to ascertain, though, is that the 1 GHz Mali benchmarks are decent, and you can expect better performance with it clocked at 1.2 GHz.
Conversely, you should be able to see that the Adreno at 1.5 GHz, though impressive, will be less so clocked at 1 GHz as in the Sensation, which will also have a higher-resolution screen.
I only provided the links so that people could make up their own mind by using the same logic.
Are you sure the Mali-400 is clocked at 1.2 GHz?
Because when I overclocked my SGS2 to 1.5 GHz I saw a 25% increase in computing performance, but almost no increase at all in graphics performance (using GL Benchmark), so I figured the frequencies of the two were totally unrelated.
I don't know what the clock speeds of the GPU are, but CPU speed bumps will also help with 3D performance.
_dsk_ said:
I don't know what the clock speeds of the GPU are, but CPU speed bumps will also help with 3D performance.
Well, in my case it did not. I guess a dual-core 1.2 GHz CPU is not a bottleneck on a smartphone lol.
I've heard there are FPS caps on the Galaxy line; not sure how true this is. Usually benchmarks should see an increase when handsets are overclocked.
_dsk_ said:
I've heard there are FPS caps on the Galaxy line; not sure how true this is. Usually benchmarks should see an increase when handsets are overclocked.
It's true for the SGS1 and 2 at least; the frame rate is capped between 56 and 66 fps depending on kernel/version etc.
Many benchmarks hit the cap (like Quadrant).
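A tiny Python sketch of why a capped benchmark can hide overclock gains (the 60 fps cap and the fps numbers are made up for illustration; the thread above says the real cap sits between 56 and 66 fps):

```python
# Why a CPU/GPU overclock can show zero gain in a frame-capped benchmark.
FPS_CAP = 60  # hypothetical kernel cap for illustration

def measured_fps(uncapped_fps, cap=FPS_CAP):
    """The benchmark only ever reports up to the cap."""
    return min(uncapped_fps, cap)

print(measured_fps(58))  # 58 -> below the cap, overclock gains are visible
print(measured_fps(75))  # 60 -> capped: a big overclock reports no change
print(measured_fps(90))  # 60 -> still 60, the real headroom is hidden
```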
I could have sworn during the I/O conference it was stated that the Nexus 7 has a 12-core GPU that is separate from the 4-core CPU, and that someone said the Tegra 3 in the Prime has the CPU and GPU combined.
Am I incorrect?
Or are the Nexus 7 and Prime Tegra 3 CPU and GPU exactly the same?
As far as how the quad-core CPU and 12-core GPU are configured, it should be the same. But the CPU on the Nexus is actually clocked lower than the Prime's.
As far as I know they are the exact same silicon, containing 4 CPU cores and 12 GPU units, but with different clock rates. The T30 in the Prime has a 25% higher GPU clock than the T30L in the Nexus 7.
I don't think the difference is too significant. Much uglier is that the T33 in the Infinity has the same GPU speed as the T30 but is trying to push 2.25x as many pixels as the Prime or the N7, as the quick calculation below shows.
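For reference, the 2.25x comes straight from the panel resolutions (1920x1200 on the Infinity vs 1280x800 on the Prime and N7):

```python
# Where the 2.25x figure comes from: panel resolutions.
infinity = 1920 * 1200  # Transformer Infinity panel
prime_n7 = 1280 * 800   # Transformer Prime / Nexus 7 panel
print(infinity / prime_n7)  # 2.25 -- same GPU clock pushing 2.25x the pixels
```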
Hi there!
I am in the hunt for a 7"-8" Android 4/4.1 tablet. Currently my choices are the new Acer Iconia A110 (because of a microSD card slot), the Motorola Xoom 2 Media Edition (because of the bigger screen, excellent build and virtual surround sound), the Samsung Galaxy Tab 7.7 (again a slightly bigger screen, a microSD card slot and an excellent AMOLED screen) and the top dog Google Nexus 7. But I am most interested in the Nexus 7, in terms of "is it worth the investment" even with the smaller screen. I will be using the thing mainly for checking email/news/weather, the usual YouTube, Wikipedia and Twitter, watching movies, and also gaming. So, I'd like to ask:
1. Is the actual GPU dual or single channel? And what's the frequency? Does it matter?
2. Is 1.3 GHz the base CPU speed? Or is it underclocked, like what Apple is doing with its tabs?
3. Aside from connecting a mouse or keyboard, what other stuff can the Bluetooth 3.0 standard do?
4. Is it capable of wireless file transfer to & from a MacBook?
5. I'm aware that it doesn't have Flash, but can I still install it via Google Play?
6. Are they stereo speakers? Capable of surround sound? (I heard about sound issues in some models.)
Please advise. Thanks.
gino_76ph said:
Hi there!
I am in the hunt for a 7"-8" Android 4/4.1 tablet. Currently my choices are the new Acer Iconia A110 (because of a microSD card slot), the Motorola Xoom 2 Media Edition (because of the bigger screen, excellent build and virtual surround sound), the Samsung Galaxy Tab 7.7 (again a slightly bigger screen, a microSD card slot and an excellent AMOLED screen) and the top dog Google Nexus 7. But I am most interested in the Nexus 7, in terms of "is it worth the investment" even with the smaller screen. I will be using the thing mainly for checking email/news/weather, the usual YouTube, Wikipedia and Twitter, watching movies, and also gaming. So, I'd like to ask:
1. Is the actual GPU dual or single channel? And what's the frequency? Does it matter?
2. Is 1.3 GHz the base CPU speed? Or is it underclocked, like what Apple is doing with its tabs?
3. Aside from connecting a mouse or keyboard, what other stuff can the Bluetooth 3.0 standard do?
4. Is it capable of wireless file transfer to & from a MacBook?
5. I'm aware that it doesn't have Flash, but can I still install it via Google Play?
6. Are they stereo speakers? Capable of surround sound? (I heard about sound issues in some models.)
Please advise. Thanks.
1. It's either dual or quad, I think, clocked at 450 or something (can be overclocked)
2. Underclocked, I think (the Prime has the same CPU but at 1.5)
3. Don't know
4. There are a few apps that do this
5. No, you have to sideload
6. Stereo, and don't know about surround sound
Sent from my Jelly Nexus S
Would it matter if a tablet has a dual- or single-channel GPU? Does it matter if the WiFi is dual or single band? Will it actually help make the graphics "better" and surfing the net faster?
Would you trust Acer when it comes to build quality of its tablets, compared to say Samsung or Motorola?
1. Not sure (I think I heard about it being overclocked somewhere)
2. Default is 1.2 GHz, can be overclocked up to 1.5 GHz.
3. For example: file transfer. If you root, you can also use it as a PlayStation controller with BluePutDroid.
4. There are a number of ways to do this; I would recommend AirDroid.
5. To get Flash (no root required):
A. Go to Settings -> Security and enable unknown sources.
B. Download and install the Flash APK on your device from here: http://forum.xda-developers.com/showthread.php?t=1763805
C. Get a browser that supports Flash, like Boat Browser (from the Play Store).
6. Stereo, probably not surround sound.
(Second post)
Not sure what a dual-channel GPU means, to tell you the truth.
I believe the Nexus 7 has dual-band WiFi; using a speed test app, the speed reaches or goes above my maximum speed from the other end of the house.
gino_76ph said:
1. Is the actual GPU dual or single channel? And what's the frequency? Does it matter?
2. Is 1.3 GHz the base CPU speed? Or is it underclocked, like what Apple is doing with its tabs?
There is no such thing as a single- or dual-channel GPU. "Channel" refers to the RAM. It is a 12-core GPU.
1.3 GHz is the maximum clock speed of the specific CPU used, the T30L. It is not underclocked.
This is the truth, after reading some ****: there's no single or dual GPU; it has 12 cores. "Channel" applies only to the RAM. This is the lesser Tegra 3 out there, with a lower clock, but it has higher-clocked RAM and is not the same part as the T30. Beyond the fact that the CPU clock is overclockable without problems, the RAM in the N7 is IMHO better than, for example, the TF201's or the HTC One X's.
Are you guys certain there is no such thing as a single- or dual-channel GPU?
And if the GPU clock speed is 1.3 GHz, would that mean there is 1.3 GHz on each of the 12 cores?
gino_76ph said:
Are you guys certain there is no such thing as a single- or dual-channel GPU?
And if the GPU clock speed is 1.3 GHz, would that mean there is 1.3 GHz on each of the 12 cores?
No, you are wrong, man. The CPU (4 cores) is clocked at 1.3 GHz with all 4 cores running, and 1.5 (or 1.4, I don't remember) in single-core mode (1 core running).
The GPU (12 cores) is clocked at 416 MHz by default.
Apart from that, if you flash a custom kernel, this SoC can reach 1.8-2.0 GHz for the CPU (depending on the tab; they aren't exactly the same chips), and 484/520/600/650/700/750 MHz for the GPU (again depending on the tab).
I see. So, it is fast?
As a side question, would it be more practical to buy a new or latest tablet like the Nexus 7 than an older (and equally good in its own right) one, say the Galaxy Tab 7.7 or the Xoom 2 Media Edition? What I'm trying to ask here is whether there's a "problem" of compatibility with apps and games if a tab has an older GPU in it.
Would that be an issue or not?
Yes, sure it's fast! A little bit faster than others with the same chip. I'll give you an example regarding the last question.
There are people with old GPUs who keep playing HD games on them without problems (not all games work, but many of them!). An example is the Galaxy Nexus that I own; it's packed with a good CPU and an old GPU, the same one found in the Galaxy S, Nexus S, etc., but honestly I've never found a game that doesn't work because of the old GPU. I also have Tegra 2 devices, no problem with games; surely a Tegra 3 is more powerful and you can play games with full effects enabled without problems. All apps work regardless of GPU; it only depends on the OS version at most.
The Tegra 3 SoC only has single-channel memory. The spec is 1 GB of DDR3L-1333 RAM (low voltage), giving a total memory bandwidth of 5.3 GB/s. Is this super fast? No, but it is more than sufficient for the Nexus 7's display resolution; the quick calculation below shows where that figure comes from.
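As a quick sanity check of the 5.3 GB/s figure (a single 32-bit channel of DDR3L-1333, i.e. 1333 MT/s at 4 bytes per transfer):

```python
# Sanity check on the 5.3 GB/s figure: single-channel 32-bit DDR3L-1333.
transfers_per_sec = 1333e6   # 1333 megatransfers per second
bus_width_bytes = 32 // 8    # 32-bit single channel = 4 bytes per transfer
bandwidth = transfers_per_sec * bus_width_bytes
print(f"{bandwidth / 1e9:.2f} GB/s")  # ~5.33 GB/s
```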
To the OP, don't get stressed about specs, especially if you're not 100% sure what they actually mean. The important part is the user experience of the Nexus 7; due in part to Android Jelly Bean, it is smooth and enjoyable, it can play all the latest games well, and I also run PlayStation & N64 emulators on it without issue.
Finally, the Nexus 7 is fully unlockable, so it has great developer support on XDA and other forums, which is 50% of the device's appeal in my eyes. If you can wait a few weeks, the rumour is that a 32 GB model will replace the current 16 GB version.
If you can manage to find a used Nexus 7 on Craigslist or eBay, I would do it. I got my perfect-condition, barely used 16 GB for $160 from a buyer's-remorse user on Craigslist. For this price I find the tablet to be very good. I would have a harder time paying the $250 plus tax in-store for the same unit. Not that it's not worth the $250, but already owning a Galaxy S3 phone, it's too much of the same at the end of the day, much like I experienced when I had an iPhone and iPad together.
The Nexus 7 for me is a great grab-and-go device for quick browsing, game playing, weather checking, etc.
If you've got to have the latest and fastest specs, the Tegra 3 is getting dated already and you'd want to find something with a Qualcomm S4 chip (even that isn't really faster than Tegra 3). Supposedly the OMAP 4470 in the bigger Fire HD and the Nook HD+ might be a little faster, for more money.
I doubt the 4470 is faster than the Tegra 3 (any of the 3 variants). It's basically a 4460 with a bit more clock frequency, the same 45nm technology, and a faster GPU (with a dedicated 2D hardware block). They claim it's up to 2 times faster than the SGX540; if that's true, I think the Tegra 3 is still better (not just for the quad core). Anyway, I have to agree with all the things said in previous posts. OP, don't obsess about specs; a Nexus device is fast for many other reasons, as already said, and even if the Tegra 3 is becoming an "old" chipset compared to the new ones out these days, it performs very well with an optimized OS. Wait for the 32 GB version, and never ever think only about CPU/GPU specs :good:
sert00 said:
I doubt the 4470 is faster than the Tegra 3 (any of the 3 variants). It's basically a 4460 with a bit more clock frequency, the same 45nm technology, and a faster GPU (with a dedicated 2D hardware block). They claim it's up to 2 times faster than the SGX540; if that's true, I think the Tegra 3 is still better (not just for the quad core). Anyway, I have to agree with all the things said in previous posts. OP, don't obsess about specs; a Nexus device is fast for many other reasons, as already said, and even if the Tegra 3 is becoming an "old" chipset compared to the new ones out these days, it performs very well with an optimized OS. Wait for the 32 GB version, and never ever think only about CPU/GPU specs :good:
A full-fat OMAP 4470 is faster than the Tegra 3. I read a review of the Archos 101 XS, which runs an OMAP 4470 @ 1.5 GHz (GPU at 384 MHz).
In the ultra demanding GL Benchmark 2.5 - Egypt HD (Offscreen 1080p)
Nexus 7 = 8.9 FPS
Archos = 11 FPS
Transformer Infinity = 11 FPS
There is scope for the 4470 to run at 1.8 GHz, but that is probably only for larger devices like Windows RT tablets; Amazon has apparently clocked it at 1.5 GHz. Overall, in a tough benchmark the N7 is slower; however, the Transformer Infinity is the same speed, which is basically as fast as an easily overclocked Nexus. As the OMAP is a dual-core, in theory a game developed specially for our Nexus (Tegra Zone?) could be faster or more feature-packed in terms of physics etc., if it used all 4 cores.
Turbotab said:
A full-fat OMAP 4470 is faster than the Tegra 3. I read a review of the Archos 101 XS, which runs an OMAP 4470 @ 1.5 GHz (GPU at 384 MHz).
In the ultra demanding GL Benchmark 2.5 - Egypt HD (Offscreen 1080p)
Nexus 7 = 8.9 FPS
Archos = 11 FPS
Transformer Infinity = 11 FPS
There is scope for the 4470 to run at 1.8 GHz, but that is probably only for larger devices like Windows RT tablets; Amazon has apparently clocked it at 1.5 GHz. Overall, in a tough benchmark the N7 is slower; however, the Transformer Infinity is the same speed, which is basically as fast as an easily overclocked Nexus. As the OMAP is a dual-core, in theory a game developed specially for our Nexus (Tegra Zone?) could be faster or more feature-packed in terms of physics etc., if it used all 4 cores.
Months ago the 4470 was supposed to run at 1.7 GHz. I remember when I bought the GNex in November 2011 that the 4430 was listed at 1.2, the 4460 at 1.5, and the 4470 at 1.7; those numbers were on the official OMAP site and in its guideline documents. Only after the 4460 bug came out (the major part of them weren't capable of 1.5 GHz, and this SoC isn't one downclocked from 1.5 to 1.2 by Google; it's a 1.2 CPU) did they change some numbers on the site. Now the 4460 is listed at 1.2 and the 4470 at "1.3+", in the case of this Archos 1.5. What a strange thing from OMAP! I saw the same AnandTech review you mentioned some time ago, but honestly I think that across the benchmarks they regularly run, some favour the 4470 and some favour the Tegra 3, depending on whether they stress the CPU or the GPU. With the 4460 they did a good job, I really like it, but after looking at the 4460/70 documentation, it seems that in terms of CPU there aren't that many differences. If I clock my 4460 at 1.5/1.6, run a benchmark, and compare it with the same benchmark on a 4470, I think most of the differences are GPU-related. And when I compare my benchmarks with the N7 and GNex in terms of CPU, both ultra-tweaked, I see a big gap in scores. That's why I continue to think that in total user experience and benchmark scores the Tegra 3 remains more powerful, though the differences certainly aren't visible to the end user outside of benchmarks. In the end, what really counts is the user experience, not benchmarks.
sert00 said:
Months ago the 4470 was supposed to run at 1.7 GHz. I remember when I bought the GNex in November 2011 that the 4430 was listed at 1.2, the 4460 at 1.5, and the 4470 at 1.7; those numbers were on the official OMAP site and in its guideline documents. Only after the 4460 bug came out (the major part of them weren't capable of 1.5 GHz, and this SoC isn't one downclocked from 1.5 to 1.2 by Google; it's a 1.2 CPU) did they change some numbers on the site. Now the 4460 is listed at 1.2 and the 4470 at "1.3+", in the case of this Archos 1.5. What a strange thing from OMAP! I saw the same AnandTech review you mentioned some time ago, but honestly I think that across the benchmarks they regularly run, some favour the 4470 and some favour the Tegra 3, depending on whether they stress the CPU or the GPU. With the 4460 they did a good job, I really like it, but after looking at the 4460/70 documentation, it seems that in terms of CPU there aren't that many differences. If I clock my 4460 at 1.5/1.6, run a benchmark, and compare it with the same benchmark on a 4470, I think most of the differences are GPU-related. And when I compare my benchmarks with the N7 and GNex in terms of CPU, both ultra-tweaked, I see a big gap in scores. That's why I continue to think that in total user experience and benchmark scores the Tegra 3 remains more powerful, though the differences certainly aren't visible to the end user outside of benchmarks. In the end, what really counts is the user experience, not benchmarks.
An area where the 4470 does hold a significant advantage over the Tegra 3 is memory bandwidth, as it utilises dual-channel memory; hopefully Tegra 4 will sort out that deficiency. Ultimately the OMAP's GPU is not powerful enough to be bandwidth-limited anyway. Overall I like the Tegra 3 from a UX perspective, and I'm looking forward to a Tegra 4 in the next Nexus 7 v2 :good:
Using a Nexus 7 now. Very happy with the money I paid for it. In terms of specs, this beast will last you for a while. Even if they are already pushing specs to the next level, it'll be a long time until a quad-core, 1 GB RAM machine is considered slow.
Simply put, at this price and quality, anyone can buy it and everyone should.
Sent from my Nexus 7 using xda app-developers app
Turbotab said:
Finally, the Nexus 7 is fully unlockable, so it has great developer support on XDA and other forums, which is 50% of the device's appeal in my eyes. If you can wait a few weeks, the rumour is that a 32 GB model will replace the current 16 GB version.
The 32 gig will be replacing the 8 gig model. Two versions will be available by Christmas: a 16 gig model and a 32 gig model. The 16 will be priced at (or below) $200.00. The 32 will be at (or below) $250.00.
Posted via my Amiga 3000, EVO 3D , or Nexus 7
phillip1953 said:
The 32 gig will be replacing the 8 gig model. Two versions will be available by Christmas: a 16 gig model and a 32 gig model. The 16 will be priced at (or below) $200.00. The 32 will be at (or below) $250.00.
Posted via my Amiga 3000, EVO 3D , or Nexus 7
Do you have a link confirming that, or is that inside knowledge?
It's the logical step for Google. The 32 gig is already being sold and nobody really wants the 8 gig model. To compete with the "other" tablets and to make up for the lack of an SD card slot, it only makes sense.
IOW... my speculation from 40 years of computer use... starting with the Heathkit H8.
Posted via my Amiga 3000, EVO 3D , or Nexus 7
So the Qualcomm Snapdragon 820 is quad-core. Why did Qualcomm decide to go with a quad-core design over an octa-core or hexa-core one? How would that affect the GS7/GS7 Edge if it were octa-core or hexa-core? How much of a difference is there between the Exynos and the 820?
Indeed, the Snapdragon 820 is a quad-core SoC, unlike most recent SoCs, which have featured 8 cores (2 clusters of 4 cores). However, most big.LITTLE SoCs like the Exynos 8890 and 7420 use 4 slow, low-power cores and 4 high-performance but power-hungry cores; the two clusters are completely different architectures. This means that while the Exynos 8890, for instance, has 8 cores, only 4 of them are really designed for performance. The other 4 are designed to save power.

The 820 is different. It's also some sort of big.LITTLE setup, with 2 clusters of 2 cores, but both clusters are identical architectures. The difference is that one cluster is clocked lower and has a different L2 cache configuration in order to use less power. On top of that, the custom cores in the 820 are faster per core than the Exynos 8890's, so clock for clock the 820 would win against the high-power cluster of the Exynos. In heavily multithreaded situations, though, the Exynos can still tap into all 8 cores at the same time, which should give it an advantage in that scenario. For the rest of the time, I would imagine the 4 faster cores of the Snapdragon are better suited to everyday stuff. (You can actually see the cluster layout from the per-core max clocks; see the sketch below.)

As for why they only went with 4: my guess is cost and power efficiency. Kryo is a brand-new architecture; Krait went through many iterations. Kryo will probably see a noticeable reduction in its power envelope in the next iteration, which would make shoving more cores onto an SoC a more viable option.

As for the GPU, all signs point to the Snapdragon's Adreno GPU beating the Mali in the Exynos at the moment. Development will also be better on the Snapdragon variant, as Qualcomm releases the proprietary vendor binaries and Samsung does not. This means the likelihood of seeing CM or AOSP on an Exynos variant is slim. Hope this helps!
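If you want to inspect a cluster layout yourself, per-core maximum clocks are exposed through the standard Linux cpufreq sysfs nodes on Android. A minimal Python sketch (run on the device, e.g. via adb with Python available, or adapt the path glob to a plain `cat` in the shell; the expected splits in the comment are illustrative):

```python
# Peek at per-core max clocks to infer the cluster layout on a Linux/Android
# device, using the stock cpufreq sysfs nodes.
import glob

paths = glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq")
for path in sorted(paths):
    core = path.split("/")[5]  # e.g. 'cpu0'
    with open(path) as f:
        khz = int(f.read().strip())  # value is reported in kHz
    print(f"{core}: {khz / 1e6:.2f} GHz")
# On a Snapdragon 820 you'd expect a 2+2 split (two cores at a lower max clock,
# two at a higher one); on an Exynos 8890, a 4+4 split between the clusters.
```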
Actually, the Kryo cores are (slightly) better at running single-threaded tasks, while the Exynos cores are (slightly) better at running multi-threaded tasks. I doubt everyday users will notice.
The Adreno is also more powerful than the Mali GPU, though everyday users will mostly notice a performance improvement in applications using the Vulkan API vs regular applications, rather than any difference between these two GPUs.
Finally, memory management seems much better on the Exynos 8890 for some reason (about twice as fast); since the same memory chips are used, I wonder whether it's a software or a hardware implementation difference. Both units are plenty fast, though.
The real difference between these SoCs will be seen in power management efficiency. In fact, both variants are overpowered in every aspect as far as regular usage goes, so there is little point in comparing which one's the fastest. Instead, you need to ask which one is the most conservative with power consumption while achieving equivalent performance.
Both the GPUs on these SoCs support the Vulkan API. And whilst the Adreno is faster in terms of pure benchmark numbers, I very much doubt there will be a noticeable difference in any game or application, Vulkan or otherwise, released during the lifetime of these phones.
Yeah, that does help explain it. Thanks. I just hoped there wouldn't be a TSMC vs Samsung situation like with the iPhone 6S/6S Plus SoC.
I was thinking the 8-core Snapdragon 810 was overheating and thermal throttling, so they went with 4 cores instead on the Snapdragon 820.
Just my thoughts.